What You Will Do:
- Design, develop, and implement data pipelines (integration, transformation, data loading, data validation, and error handling) using Databricks and Snowflake.
- Deploy data pipelines to the production environment via CI/CD tools such as Jenkins and Terraform.
- Continuously monitor pipelines, address issues, and implement improvements.
- Collaborate with stakeholders to gather requirements, including data schemas, transformation logic, and data quality rules.
Required:
- 6 to 12 years of experience in IT and data pipeline development
- Spark, Databricks, Snowflake, Python, PostgreSQL, AWS, Terraform
- High attention to detail; able to propose improvements and risk assessments to stakeholders.
- Ability to work independently.
- Willingness to learn new things on a regular basis.
- Strong verbal and written communication skills
Desired:
- Experience with data modeling, schema design, and data warehousing concepts such as SCD (slowly changing dimension) patterns.
Job Type: Contract
Pay: $55.00 - $60.00 per hour
Experience level:
- 8 years
- 9 years
- 10 years
- 11+ years
Schedule:
- Monday to Friday
Experience:
- Data pipelines: 10 years (Required)
- Data warehouse: 10 years (Required)
- SCD: 10 years (Required)
Work Location: Remote