This course covers the essentials of deploying and managing an Apache Hadoop cluster. The course is lab-intensive, with each participant creating their own Hadoop cluster using either the CDH (Cloudera's Distribution, including Apache Hadoop) or Hortonworks Data Platform stack. Core Hadoop services are explored in depth, with an emphasis on troubleshooting and recovering from common cluster failures. The fundamentals of related services such as Ambari, ZooKeeper, Pig, Hive, HBase, Sqoop, Flume, and Oozie are also covered. The course is approximately 60% lecture and 40% labs.
Qualified participants should be comfortable with Linux commands and have some systems administration experience, but no previous Hadoop experience is required.
3 Days/Lecture & Lab
Systems Administrators who will be responsible for managing and administering Hadoop clusters
"Big Data", the big picture::HDFS::MapReduce::Authentication and Authorization::MapReduce schedulers::Cluster monitoring and maintenance::Troubleshooting::Appendix