• Bachelor’s degree in Computer Science, Computer Engineering, or a related software discipline; a Master’s degree in a related field is a plus
• 8+ years of experience in designing conceptual, logical and physical data models using major enterprise data modeling tools
• 5+ years of experience in strategic data planning, standards, procedures, and governance
• 5+ years of experience in data quality engineering, metadata consolidation and integration, metadata model development and maintenance, repository management, data warehouse design and data mining, and data security
• 5+ years of experience in leveraging enterprise data warehouse modeling constructs, methodologies and practices to ensure flexible, scalable, maintainable, and high-performing physical databases
• 2+ years of hands-on experience integrating OLTP and ERP systems with big data/data lake repositories using ELT/ETL tools, preferably Informatica and Oracle
• 1+ year of hands-on experience implementing and using CI/CD technologies such as Jenkins or Bamboo
• Experience with technologies and frameworks such as Hadoop, MapReduce, Pig, Hive, HBase, Flume, ZooKeeper, MongoDB, NoSQL and Cassandra
• Experience with data warehousing and data mining is a must
• Strong experience with programming languages and current technologies such as Java, JavaScript frameworks, RESTful services, Spark, Scala, Python, Linux, Hive, Kafka, Redis, and Hortonworks
• Proficiency in SQL is a must
• 3+ years of experience working as a member of an Agile team