Job description
- 10+ years of experience as a Data Engineer applying standard data management principles.
- 10+ years of experience with the specified technical capabilities:
  - ETL
  - Data management standards
  - Cloud-based computing such as AWS or Azure
  - Data architecture
  - Big data technologies
  - Data warehousing, mining, analysis, self-service, or machine learning
  - Data governance
  - Structured and unstructured data
  - SQL and NoSQL database technologies
  - Data modeling
  - Python, SQL, Spark
  - Airflow
- Act as the technical authority on decisions, issues, and difficult technical problems.
- Design conceptual data architecture solutions for data-centric analytics projects.
- Use data mapping, data mining, and data transformation analysis tools to design and develop technical solution architectures.
- Understand the complexity of the data and design systems and models that handle different data sources and formats, including structured, semi-structured, and unstructured data, as well as stream processing.
- Contribute to developing and updating enterprise data architecture strategies, patterns, standards, processes, and tools as new or unique data sources, use cases, and requirements emerge.
- Prepare and maintain accurate solution design and architecture documentation for the delivery, platform, and operations teams.
- Address governance and security challenges associated with solutions.
- Analyze data sources, and design and evaluate feasible data pipeline solutions; these may include database modeling and design, relational database architecture, metadata and repository creation, and configuration management.
- Design and develop data pipelines or ETL processes.
- Support the development of, and adhere to, data management policies, standards, and procedures.
- Define integrated views that draw together data from across the enterprise, both in real time and as extracts.
- Package deployments and collaborate with IT to migrate code between Development, Quality Assurance (QA), and Production environments.
- Support test automation, troubleshoot ETL job functionality, validate data, and create test data and table structures using SQL.
- Must be able to work in an agile, rapid-delivery environment.
Experience:
- AWS: 4 years (Required)
- Python: 4 years (Required)
- PySpark: 2 years (Required)
- Power BI: 2 years (Required)
Job Type: Contract
Pay: $65.00 - $70.00 per hour
Expected hours: 40 per week
Benefits:
- 401(k)
- Dental insurance
- Health insurance
Schedule:
- 8-hour shift
Work Location: In person