Hadoop Online Training course - Hyderabad
Friday, 8 November, 2013
Item details
City:
Hyderabad, Andhra Pradesh
Offer type:
Offer
Item description
Hadoop Online Training
Introduction
The Hadoop DW course blends administration with hands-on coding using Hadoop
ecosystem components to demonstrate working with Big Data. Topics covered in
this course include Hive, Pig, ZooKeeper, Sqoop, and multi-node setup of a Hadoop
cluster on Amazon EC2 with CDH4. The Hadoop developer course focuses on training
participants to set up Hadoop infrastructure, write MapReduce programs and Hive
and Pig scripts, and work with HDFS, ZooKeeper, Sqoop, Flume, and Oozie.
Who should attend?
Java Developers / Architects, Data Warehouse Developers / SaaS
Professionals / Architects, Big Data Professionals
Prerequisites for attending the Training
Basic knowledge of Unix and SQL scripting
Objectives of the Training
Understanding distributed, parallel, and cloud computing, and NoSQL concepts
Setting up Hadoop infrastructure with single- and multi-node clusters on Amazon EC2 (CDH4)
Understanding the concepts of Map and Reduce and functional programming
Writing Map and Reduce programs and working with HDFS
Writing Hive and Pig scripts and working with ZooKeeper and Sqoop
Ability to design and develop applications involving large data using the Hadoop ecosystem
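The Map and Reduce steps named in the objectives come from functional programming. A minimal plain-Python sketch of the concept (just the idea, not Hadoop itself):

```python
from functools import reduce

# Functional-programming view of MapReduce: a map step transforms each
# record independently, and a reduce step folds the results together.
words = ["hadoop", "hive", "pig", "hadoop"]

# Map: emit a (word, 1) pair for each input record.
pairs = list(map(lambda w: (w, 1), words))

# Reduce: fold the pairs into per-word counts, as a reducer does per key.
def count(acc, pair):
    word, one = pair
    acc[word] = acc.get(word, 0) + one
    return acc

counts = reduce(count, pairs, {})
print(counts)  # {'hadoop': 2, 'hive': 1, 'pig': 1}
```

Hadoop applies the same two-step pattern, but distributes the map and reduce work across a cluster.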
Course Outline
Introduction to Hadoop
Distributed computing
Parallel computing
Concurrency
Cloud Computing
Data Past, Present and Future
Computing Past, Present and Future
Hadoop
NoSQL
Hadoop Streaming
Distributing Debug Scripts
Getting Started With Eclipse
Hadoop Stack
CAP Theorem
Databases: Key Value, Document, Graph
Hive and Pig
HDFS
Lab 1: Hadoop Hands-on
Installing a Hadoop single-node cluster (CDH4)
Understanding Hadoop configuration files
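The Hadoop Streaming topic listed above lets you write the mapper and reducer as plain scripts that read stdin and write stdout, with the framework sorting mapper output by key in between. A minimal word-count sketch of that protocol, with the sort simulated in-process (in a real job the two functions would be separate scripts passed to the streaming jar via -mapper and -reducer):

```python
from itertools import groupby

def mapper(lines):
    # Emit one "word<TAB>1" record per word, as a streaming mapper would.
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(sorted_records):
    # After the shuffle/sort phase, records arrive grouped by key;
    # sum the counts for each word.
    keyed = (rec.split("\t") for rec in sorted_records)
    for word, group in groupby(keyed, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(v) for _, v in group)}"

# Simulate the framework locally: map, sort, then reduce.
text = ["hadoop streaming demo", "hadoop demo"]
for out in reducer(sorted(mapper(text))):
    print(out)  # demo\t2, hadoop\t2, streaming\t1
```

The same two scripts, unchanged, would run on a cluster because they only depend on the stdin/stdout contract.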
HDFS Introduction
Architecture
File System
Data replication
Name Node
Data Node
Hive introduction
Installation and Configuration
Running Hive
Configuration management overview
Runtime configuration
Hive, Map-Reduce and Local-Mode
DDL Operations
Metadata Store
DML Operations
SQL Operations
queries
selects and filters
group by
join
multi-table insert and streaming
Exercise
MovieLens
Apache log
Hive Architecture
Data Store
Metastore
Architecture
Interface
HQL
Compiler and Optimizer
Pig Introduction
Pig and Dataflow
Pig Philosophy
Pig and Hadoop
Pig vs Hive
Why Pig
Installing and Configuring Pig
Download and Install from Apache
Running Pig
Local, Cluster, and Cloud
Command Line Options
Grunt
Understanding Grunt
Entering Pig Latin scripts in Grunt
HDFS commands in Grunt
Controlling Pig from Grunt
Pig Data Model
Problem Statement and Data Model
Input and Output
Load
Store
Dump
Relational Operators
Foreach
Filter
Group
OrderBy
Distinct
Join
Limit
Sample
Parallel
User Defined Functions
Registering UDF
Defining UDF
Calling Static Java Functions
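Pig user-defined functions can be written in Python and run under Jython. A minimal sketch of a string UDF; the script and alias names are illustrative only, and the pig_util outputSchema decorator a real UDF script would carry is noted in comments so the sketch runs as plain Python:

```python
# In a real Pig UDF script, the function would be decorated with
#   @outputSchema('word:chararray')   (from pig_util)
# and registered from Pig Latin with, e.g.:
#   REGISTER 'string_udfs.py' USING jython AS udfs;
# then called as udfs.to_upper(field).

def to_upper(value):
    # Guard against null fields, which Pig passes in as None.
    if value is None:
        return None
    return value.upper()

print(to_upper("hadoop"))  # HADOOP
```

Because a UDF is just a function, its logic can be unit-tested outside Pig before registering it.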
Flume
What is Flume?
How does it work?
An example
Sqoop
What is Sqoop?
How does it work?
An example
Oozie
What is Oozie?
How does it work?
An example
Load and Store Functions
Overview of Built-in Functions
Introduction to ZooKeeper
Cluster Planning and Cloudera Manager Setup
Hadoop Multi-Node Cluster Setup
Installation and Configuration
Running MapReduce Jobs on a Multi-Node Cluster
Working with Large Data Sets
Steps Involved in Analyzing Large Data
Lab Walkthrough
POC for Hadoop Connectivity with an ETL Tool
High Availability, Federation, YARN, and Security
If you require any further information, please do not hesitate to contact us.
Please feel free to mail us for a demo session, or call 9989754807.
Contact: trainings@keentechnologies.com
Website: http://www.keentechnologies.com