Description :
|
Job Details: Must-Have Skills
1. Snowflake, Airflow
2. Python, AWS
3. EC2, Cognos, Kinesis
Responsibilities:
1. Snowflake, Airflow: Design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem. Ability to design and implement end-to-end solutions.
2. Python, AWS: Build utilities, user-defined functions, and frameworks to better enable data flow patterns.
3. EC2, Cognos, Kinesis: Define and build data acquisition and consumption strategies.
|