This four-day Apache Hadoop 2.0 training course is designed for administrators who deploy and manage Apache Hadoop 2.0 clusters. Through a combination of lecture and hands-on exercises, you will learn how to install, configure, maintain and scale your Hadoop 2.0 environment. By the end of this course you will have a solid understanding of how Hadoop works with Big Data and, through the hands-on exercises, will have completed the Hadoop deployment lifecycle for a multi-node cluster.
This course will help prepare delegates for the Hortonworks Apache Hadoop Administrator Certification program. Delegates wishing to sit the Administrator exam must book and sit their exam directly with a Kryterion testing centre.
This course utilises a Linux environment. Attendees should know how to navigate and modify files within a Linux environment. Existing knowledge of Hadoop is not required.
This course is designed for IT administrators and operators responsible for installing, configuring and supporting an Apache Hadoop 2.0 deployment in a Linux environment.
Delegates will learn how to:
- Size and deploy a cluster
- Deploy a cluster for the first time
- Configure Hadoop and the supporting frameworks
- Perform ongoing maintenance on nodes in the cluster
- Balance and performance-tune a cluster
- Move and manage data within a cluster
- Integrate status and health checks into your existing monitoring tools (single pane of glass)
- Add and remove DataNodes
- Implement a high-availability solution
- Apply best practices for deploying Hadoop clusters
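Several of these objectives, such as feeding cluster status into an existing monitoring tool, can be scripted against interfaces the course covers. As a hedged sketch only (the `namenode` hostname is a placeholder; Hadoop 2.0 NameNodes typically expose JMX metrics as JSON on their web port at `/jmx`), a monitoring probe might look like this:

```python
import json
import urllib.request

def capacity_summary(jmx_json):
    """Pull capacity and DataNode counts out of NameNode JMX output.

    The FSNamesystemState bean is part of the stock Hadoop NameNode
    metrics; returns None if the bean is not present.
    """
    for bean in jmx_json.get("beans", []):
        if bean.get("name", "").endswith("FSNamesystemState"):
            return {
                "capacity_total": bean["CapacityTotal"],
                "capacity_used": bean["CapacityUsed"],
                "live_datanodes": bean["NumLiveDataNodes"],
            }
    return None

def poll(jmx_url):
    """Fetch the JMX endpoint and summarise it (requires a live cluster)."""
    with urllib.request.urlopen(jmx_url) as resp:
        return capacity_summary(json.load(resp))
```

A monitoring agent could then call `poll("http://namenode:50070/jmx")` on a schedule and forward the summary to whatever dashboard your operations team already uses.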
- Introduction to Hortonworks Data Platform & Hadoop 2.0
- Hadoop Storage: HDFS Architecture
- Installation Prerequisites
- HDP Management: Ambari
- Ambari and the Command Line
- Hadoop Operating System (YARN) & MapReduce
- Configuring Services
- Configuring HDFS
- Configuring Hadoop Operating System (YARN) & MapReduce
- Configuring HBase
- Configuring ZooKeeper
- Configuring Schedulers
- Data Integrity
- Extract-Load-Transform (ELT) Data Movement
- Copying Data Between Clusters
- HDFS Web Services
- Apache Hive Data Warehouse
- Transferring data with Sqoop
- Moving Log Data with Flume
- Setting up the HDFS NFS Gateway
- Workflow Management: Oozie
- Data Lifecycle Management with Falcon
- Monitoring HDP 2.0 Services
- Commissioning and Decommissioning Nodes and Services
- Rack Awareness and Topology
- NameNode Federation Architecture
- NameNode High-Availability (HA) Architecture
- Backup & Recovery
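The rack awareness topic above centres on a topology script: HDFS invokes a script named in `net.topology.script.file.name` with one or more DataNode host/IP arguments and reads back one rack path per argument. The mapping below (third octet of the IP picks the rack) is invented purely for illustration; real deployments derive it from their own network layout:

```python
#!/usr/bin/env python
# Hedged sketch of a Hadoop rack-topology script. HDFS calls it with
# host/IP arguments and expects whitespace-separated rack paths on stdout.
import sys

def rack_for(host):
    """Map a host to a rack path; the scheme here is an invented example."""
    parts = host.split(".")
    if len(parts) == 4 and all(p.isdigit() for p in parts):
        return "/rack-" + parts[2]  # e.g. 10.1.3.17 -> /rack-3
    return "/default-rack"          # unknown hostnames fall back to a default

if __name__ == "__main__":
    print(" ".join(rack_for(h) for h in sys.argv[1:]))
```

The script must be executable by the NameNode user, and every host the NameNode cannot resolve should still receive a rack path, which is why the fallback branch exists.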