Requirements:
- Writes complex SQL queries to perform the data acquisition and ingestion needed by data pipelines
- Builds data pipelines and performs data engineering activities using technologies such as Python, Hadoop, and Spark
- Ensures the upkeep of the Hadoop Data Lake platform by monitoring Hortonworks HDFS
- Monitors the Data Lake continuously and ensures that the appropriate support teams are engaged at the right times
- Works in an Agile/Scrum environment, interacting with the Scrum team as well as client stakeholders
- Understands client requirements from Agile user stories and develops the low-level design needed to implement them
- Results-oriented and able to match the pace of the program's work demands through continuous self-improvement
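To illustrate the SQL-driven acquisition and pipeline-building duties above, here is a minimal sketch of an extract-transform-load step in plain Python. It uses an in-memory SQLite database and invented table names (`customers`, `orders`, `revenue_report`) purely for illustration; a production pipeline on this stack would typically run comparable SQL through Spark or Hive against the Hadoop Data Lake instead.

```python
import sqlite3

# Illustrative source data; in practice this would live in the Data Lake.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
    """
)

# Acquisition: a SQL query that joins and aggregates the raw tables.
report = conn.execute(
    """
    SELECT c.name, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.name
    ORDER BY revenue DESC
    """
).fetchall()

# Load: write the aggregated result to a downstream reporting table.
conn.execute("CREATE TABLE IF NOT EXISTS revenue_report (name TEXT, revenue REAL)")
conn.executemany("INSERT INTO revenue_report VALUES (?, ?)", report)
conn.commit()

print(report)  # [('Acme', 150.0), ('Globex', 75.0)]
```

The same acquire-aggregate-load shape carries over to PySpark, where the query would run via `spark.sql(...)` and the load step would write to a Hive table or HDFS path.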