Please find the job description below and let me know your comfort level.

Role: AWS | Data On Cloud - Platform
Location: Hillsboro, OR
Duration: 6 months plus

Description:

Must-Have Skills (top 3 technical skills only):
1. Snowflake, Airflow
2. Python, AWS
3. EC2, Cognos, Kinesis

Job Description:
- Good knowledge of Snowflake, Airflow, Python, AWS, EC2, Cognos, and Kinesis.
- Design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem. Ability to design and implement end-to-end solutions.
- Build utilities, user-defined functions, and frameworks to better enable data flow patterns.
- Research, evaluate, and utilize new technologies, tools, and frameworks centered around Hadoop and other elements of the Big Data ecosystem.

Minimum years of experience: 5+

Top 3 responsibilities you would expect the subcontractor to shoulder and execute:
- Snowflake, Airflow: design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem; design and implement end-to-end solutions.
- Python, AWS: build utilities, user-defined functions, and frameworks to better enable data flow patterns.
- EC2, Cognos, Kinesis: define and build data acquisition and consumption strategies.