Job Description
One of our clients is an e-commerce platform looking to transform its data solutions team. As part of this effort, they are looking to bring a strong Data Engineer onto the team with skills in technologies such as Snowflake and dbt.
Position Summary:
The Data Engineer designs and develops enterprise data platforms. The position is responsible for building real-time and batch data pipelines that process and unify data from multiple sources into data marts, reporting tables, and application databases. The work includes analyzing, cleansing, and transforming data to support applications, reports/dashboards, and analytics.
The Data Engineer will leverage technologies including Python, PySpark, SQL, AWS Glue, Airflow, dbt, Meltano, Snowflake, Aurora MySQL, SQL Server, and notebook applications. The engineering activities will span all phases of the development lifecycle, including design, planning, development, testing, and operational support.
Job Responsibilities:
- Develop new data pipeline solutions to meet functional and non-functional requirements
- Monitor, maintain and tune existing pipelines, pipeline orchestration and query performance/efficiency
- Analyze data in OLTP, DWH and big data platforms
- Develop data quality/accuracy monitoring
- Respond to pipeline failure incidents, participate in alerting, escalations and/or recovery procedures necessary to ensure system functionality is restored in a timely manner
- Ensure adherence to development standards and release promotion procedures
- Participate in development and refinement of DevOps tools and policies to increase efficiency and system stability
Job Requirements:
- BA/BS degree in Information Technology, or an equivalent combination of education and experience
- 2-4 years of experience developing in Python
- 2-4 years of experience developing in SQL
- 2-4 years of experience building ELT solutions with tools such as dbt and Snowflake
- Experience deploying ELT processes to an enterprise environment
- Some experience with performance tuning and optimization
- Experience in implementing operational automation using scripts
- Good communication and documentation skills
- Experience with cloud-based database systems and environments (AWS and Azure)
- Strong analytical and problem-solving skills