· 1 to 3 years of experience in the Hadoop ecosystem
· Strong working experience with the Hadoop distributed computing framework
· Expertise in HDFS, Pig, Hive & HBase
· Should have strong SQL skills
· Good working experience with Kafka
· Good understanding of how the various components/services of Hadoop work
· Expertise in building fault-tolerant and fully automated systems
· Should have working knowledge of Python programming
· Good working experience in object-oriented programming languages (Scala, with Spark)
· Experience working with complex data models, large databases, extensive reporting and data analysis
· Ability to evaluate code for performance, understand key code metrics and make design decisions
Regards,
Prathima