
Information Technology

Release Time: 6/30/2014
Course name: Hadoop Workshop
Course Code:
Nature and target: This 1-day workshop provides hands-on exercises in which participants set up an Apache Hadoop cluster from scratch. Participants learn how to run MapReduce programs on a cluster of machines for parallel computing, and how Apache Hadoop recovers from system failures. The workshop is particularly useful to those who are about to kick-start a Hadoop-based project or implement a proof-of-concept (POC) setup.
Target audience: Decision makers, IT managers, technical managers, software developers, and data analysts who want to learn the latest technologies for handling massive and ever-growing data assets.
Class size: 10
Language of instruction: English
Teaching methods: ILT (Instructor-Led Training)
Hours: 8
Teaching supervisor: tbc
Class Date: tbc
Venue: tbc
Tuition: MOP 4,126.00
Content:
   Installing Hadoop and HDFS operations
-Install from CDH
-Format the NameNode
-Start and stop Hadoop daemons
-List the Hadoop daemons
-Test the Hadoop installation
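The installation steps above can be sketched as a short command sequence. The package-style daemon names and the `hdfs` service user are typical of a CDH install but are assumptions; they vary by version, so treat this as illustrative rather than exact.

```shell
# One-time: format the NameNode metadata (CDH runs HDFS as the "hdfs" user)
sudo -u hdfs hdfs namenode -format

# Start the HDFS daemons (CDH-style service names; adjust for your install)
sudo service hadoop-hdfs-namenode start
sudo service hadoop-hdfs-datanode start

# List the running Hadoop daemons (jps as root sees all users' JVMs)
sudo jps

# Smoke-test the installation with a couple of HDFS operations
hdfs dfs -mkdir -p /user/$USER
hdfs dfs -ls /
```

Stopping the daemons is the mirror image (`sudo service hadoop-hdfs-namenode stop`, and so on).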
   Running a MapReduce Job in pseudo-distributed mode
-Upload sample data to HDFS
-Run a MapReduce job
-View the output from the MapReduce job
-View the output from the Hadoop console
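A minimal run of the steps above, using the stock WordCount example that ships with Hadoop. The examples-jar path is typical of a CDH layout but is an assumption; locate the jar on your own install.

```shell
# Upload sample data to HDFS (the local config XMLs make handy test input)
hdfs dfs -mkdir -p input
hdfs dfs -put /etc/hadoop/conf/*.xml input

# Run a MapReduce job (jar path is illustrative and distribution-dependent)
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar \
    wordcount input output

# View the output from the MapReduce job
hdfs dfs -cat output/part-r-00000 | head
```

Progress and counters for the same job can also be watched in the Hadoop web console while it runs.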
   Running a MapReduce Job in fully-distributed (cluster) mode
-Configure the IP addresses and host names
-Configure the core site properties
-Configure the HDFS properties
-Configure the MapReduce properties
-Create the required Hadoop directories
-Set the ownership of the Hadoop directories
-Start the daemons on the master node
-Start the daemons on the slave nodes
-Test the Hadoop installation
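A sketch of the three site files configured in the steps above. The hostname `master`, the IP addresses, and the port are assumptions standing in for your own cluster; the files are written to a scratch directory here, whereas on a real node they live in `/etc/hadoop/conf`.

```shell
# IP addresses and host names: every node's /etc/hosts would map them, e.g.
#   192.168.1.10  master
#   192.168.1.11  slave1   (addresses are illustrative)

CONF_DIR=${TMPDIR:-/tmp}/hadoop-conf-sketch
mkdir -p "$CONF_DIR"

# Core site properties: where every node finds the NameNode
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:8020</value>
  </property>
</configuration>
EOF

# HDFS properties: how many copies of each block to keep across DataNodes
cat > "$CONF_DIR/hdfs-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
EOF

# MapReduce properties: which framework runs the jobs (YARN on Hadoop 2)
cat > "$CONF_DIR/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
EOF

ls "$CONF_DIR"
```

The same files are copied to every node in the cluster; only host-specific items such as `/etc/hosts` entries differ per machine.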
   Managing Jobs
-List running jobs on the cluster
-Kill running jobs
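Both management tasks above are one-liners from any cluster node. The job ID below is illustrative; on older MRv1 installs the equivalent commands are `hadoop job -list` and `hadoop job -kill`.

```shell
# List running jobs on the cluster (shows job IDs, state, user)
mapred job -list

# Kill a running job by its ID (ID shown is a made-up example)
mapred job -kill job_201406300915_0007
```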
   Breaking the Cluster
-Cause failures on the DataNode
-Verify the cluster
   Self-Healing Features of a Cluster
-Cause failures on the NameNode
-Verify the cluster
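The failure-injection exercises above boil down to stopping a daemon and watching HDFS notice and repair the damage. The CDH-style service name is an assumption; the verification commands are standard HDFS administration tools.

```shell
# "Break" one worker by stopping its DataNode daemon
sudo service hadoop-hdfs-datanode stop

# Verify the cluster: the live-node count drops and the node is listed as dead
hdfs dfsadmin -report

# Check for under-replicated blocks left behind by the failure
hdfs fsck / -blocks

# Bring the node back; HDFS re-replicates blocks to restore the target count
sudo service hadoop-hdfs-datanode start
```

A NameNode failure is more severe (it stops the whole filesystem in a basic setup), which is exactly what the last exercise is designed to demonstrate.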
