Job Description
Data Operations and Reliability Engineer
SUMMARY:
We are on a journey to build a modern, scalable, cloud-first data and analytics ecosystem. As we build out our data program and delivery system, we have an ever-increasing need for DataOps support. As a Data Operations and Reliability Engineer, you will work with the broader Data Program Teams to ensure we have a reliable and efficient platform. You will be responsible for platform and application security, health, and performance.
Specific competencies include:
- Data Privacy and Security - Focuses on collecting, storing, and retaining data, as well as transferring data, in compliance with HIPAA policy. Protects data against unauthorized access, loss, or corruption throughout the data lifecycle.
- Data Operations - Oversees the end-to-end data pipeline, from initial data collection and preparation, through the development and deployment of data models, to the ongoing monitoring and optimization of the pipeline.
- Data Reliability - Ensures that data is consistently available, accurate, and timely for analysts and other stakeholders.
RESPONSIBILITIES:
- Work with Data Engineers to ensure that data privacy and security policies and practices are followed and functioning as intended.
- Coordinate the creation and dissemination of audit and reporting information demonstrating that data and processes remain in compliance.
- Facilitate the user provisioning process, regularly reconcile access, and report on findings.
- Monitor issues impacting Data Operations and coordinate their resolution.
- Maintain reporting and communication for situations where SLAs may not be met and remediation may be required.
- Assist Data Engineers in troubleshooting issues impacting Production.
- Take overall responsibility for monitoring cost and performance across Data Lake and Data Warehouse environments.
- Ensure that high availability (HA), disaster recovery (DR), and business continuity planning (BCP) plans and processes are documented and tested on a regular basis.
- Partner with Agile Delivery Teams to identify and implement process and automation improvements.
QUALIFICATIONS:
- A bachelor's degree in Data Analytics, MIS, Computer Science, or a related field, and significant experience with data analytics, business intelligence design, and development.
- 5+ years of experience engineering within a data warehouse environment, or related experience with dimensional data modeling.
- 5+ years of experience designing and developing ETL processes with tools such as Microsoft SSIS (SQL Server Integration Services), Databricks, or Python.
- Excellent written and verbal communication, organizational and time management, and proven problem-solving and decision-making skills.
- Experience working as part of an Agile Scrum team.