Course Outline
1: HDFS (17%)
- Explain the roles of HDFS Daemons
- Describe the standard operation of an Apache Hadoop cluster, covering both data storage and processing aspects.
- Identify the current computing system features that drive the need for systems like Apache Hadoop.
- Categorize the primary objectives of HDFS Design.
- Given a specific scenario, identify the appropriate use case for HDFS Federation.
- Identify the components and daemons of an HDFS HA-Quorum cluster.
- Analyze the role of HDFS security, specifically Kerberos.
- Select the most suitable data serialization method for a given scenario.
- Describe the pathways for file read and write operations.
- Identify the commands used to manipulate files in the Hadoop File System Shell.
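The File System Shell objectives above can be illustrated with a few common commands (a sketch only; the paths and filenames are placeholders, and the commands assume a running HDFS cluster):

```shell
# List the contents of an HDFS directory
hadoop fs -ls /user/alice

# Create a directory and copy a local file into HDFS
hadoop fs -mkdir -p /user/alice/input
hadoop fs -put data.txt /user/alice/input/

# Inspect the file, then copy it back to the local filesystem
hadoop fs -cat /user/alice/input/data.txt
hadoop fs -get /user/alice/input/data.txt ./data-copy.txt

# Remove the file and the (now empty) directory
hadoop fs -rm /user/alice/input/data.txt
hadoop fs -rmdir /user/alice/input
```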
2: YARN and MapReduce version 2 (MRv2) (17%)
- Understand the impact on cluster settings when upgrading a cluster from Hadoop 1 to Hadoop 2.
- Understand the deployment of MapReduce v2 (MRv2 / YARN), including all associated YARN daemons.
- Grasp the basic design strategy for MapReduce v2 (MRv2).
- Determine how YARN manages resource allocations.
- Identify the workflow of a MapReduce job executing on YARN.
- Determine which files need to be modified, and how, to migrate a cluster from MapReduce version 1 (MRv1) to MapReduce version 2 (MRv2) on YARN.
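As an illustration of the MRv1-to-MRv2 migration objective, these are the two core settings typically involved (a minimal sketch; the hostname is a placeholder, and a real migration touches additional properties):

```xml
<!-- mapred-site.xml: run MapReduce jobs on YARN rather than MRv1 -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

<!-- yarn-site.xml: enable the shuffle service NodeManagers provide to MR jobs -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>rm-host.example.com</value> <!-- placeholder hostname -->
</property>
```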
3: Hadoop Cluster Planning (16%)
- Identify the key points to consider when selecting hardware and operating systems to host an Apache Hadoop cluster.
- Analyze the options available when selecting an OS.
- Understand kernel tuning and disk swapping processes.
- Given a scenario and workload pattern, identify the hardware configuration that fits the scenario.
- Given a scenario, determine the necessary ecosystem components for your cluster to meet SLAs.
- Cluster sizing: Given a scenario and execution frequency, identify workload specifics, including CPU, memory, storage, and disk I/O.
- Disk Sizing and Configuration, including JBOD versus RAID, SANs, virtualization, and disk sizing requirements within a cluster.
- Network Topologies: Understand network usage in Hadoop (for both HDFS and MapReduce) and propose or identify key network design components for a given scenario.
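The cluster-sizing objective above usually reduces to simple arithmetic. The sketch below shows one common back-of-the-envelope storage estimate (the ingest rate, retention period, replication factor, and overhead figure are illustrative assumptions, not course-mandated values):

```python
def raw_storage_tb(daily_ingest_tb, retention_days, replication=3, overhead=0.25):
    """Estimate raw HDFS capacity needed for a workload.

    replication: HDFS block replication factor (3 is the HDFS default).
    overhead: headroom for intermediate/temporary data (25% assumed here).
    """
    logical = daily_ingest_tb * retention_days          # user data kept online
    return logical * replication * (1 + overhead)       # raw disk required

# Example: 2 TB/day ingested, retained for 90 days
print(raw_storage_tb(2, 90))  # 675.0 TB raw capacity
```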
4: Hadoop Cluster Installation and Administration (25%)
- Given a scenario, identify how the cluster handles disk and machine failures.
- Analyze logging configuration and the format of logging configuration files.
- Understand the fundamentals of Hadoop metrics and cluster health monitoring.
- Identify the function and purpose of available tools for cluster monitoring.
- Install all ecosystem components in CDH 5, including (but not limited to): Impala, Flume, Oozie, Hue, Cloudera Manager, Sqoop, Hive, and Pig.
- Identify the function and purpose of available tools for managing the Apache Hadoop file system.
5: Resource Management (10%)
- Understand the overall design goals of each of the Hadoop schedulers.
- Given a scenario, determine how the FIFO Scheduler allocates cluster resources.
- Given a scenario, determine how the Fair Scheduler allocates cluster resources under YARN.
- Given a scenario, determine how the Capacity Scheduler allocates cluster resources.
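To make the Fair Scheduler objective concrete, a minimal allocations file might look like this (the queue names, weights, and resource figures are illustrative; the element names follow the YARN Fair Scheduler allocation-file format):

```xml
<allocations>
  <queue name="production">
    <weight>3.0</weight>  <!-- receives roughly 3x the share of "research" -->
    <minResources>10240 mb, 8 vcores</minResources>
  </queue>
  <queue name="research">
    <weight>1.0</weight>
    <schedulingPolicy>fair</schedulingPolicy>
  </queue>
</allocations>
```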
6: Monitoring and Logging (15%)
- Understand the functions and features of Hadoop’s metric collection capabilities.
- Analyze the NameNode and JobTracker Web UIs.
- Understand how to monitor cluster Daemons.
- Identify and monitor CPU usage on master nodes.
- Describe how to monitor swap and memory allocation on all nodes.
- Identify how to view and manage Hadoop’s log files.
- Interpret a log file.
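Hadoop daemons log through log4j, so the "interpret a log file" objective largely means reading lines in a fixed pattern. Below is a minimal parsing sketch (the sample line is invented for illustration; the timestamp/level/class/message layout matches the default log4j conversion pattern):

```python
import re

# Default Hadoop log4j layout: timestamp, level, logging class, message
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<level>[A-Z]+) "
    r"(?P<source>\S+): "
    r"(?P<message>.*)"
)

def parse_log_line(line):
    """Return the fields of a Hadoop log line as a dict, or None if unmatched."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# Invented sample line in the standard format
sample = ("2014-05-06 10:15:30,123 WARN "
          "org.apache.hadoop.hdfs.server.namenode.NameNode: "
          "Low on available disk space")
fields = parse_log_line(sample)
print(fields["level"], "-", fields["message"])  # WARN - Low on available disk space
```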
Requirements
- Foundational skills in Linux administration
- Basic programming competence
35 Hours
Testimonials (3)
I genuinely enjoyed the many hands-on sessions.
Jacek Pieczatka
Course - Administrator Training for Apache Hadoop
I genuinely appreciated the trainer's broad expertise.
Grzegorz Gorski
Course - Administrator Training for Apache Hadoop
What I liked most was the trainer giving real-life examples.