Company

Ccube

Address: San Diego, CA
Form of work: Part-Time
Category: Information Technology

Job Description


Senior Data Warehouse Architect (AWS/Snowflake)

Remote

6-12 Months (Part-Time, 3-4 Hours a Day)

 

We are seeking a talented and experienced Data Architect to design, develop, and optimize data pipelines and ETL processes for cloud-based data warehouse stacks. In this role, you will collaborate closely with cross-functional engineering and data teams, leveraging your expertise in building data pipelines with AWS, Redshift, Snowflake, Airflow, Python scripting, and other pertinent AWS services to ensure the seamless ingestion, integration, transformation, and orchestration of data. Your experience with complex ETL pipelines, Change Data Capture (CDC), and Slowly Changing Dimension (SCD) strategies will be instrumental in creating a scalable, high-performance data environment that adheres to best practices and industry standards.
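
For context, the Slowly Changing Dimension (SCD) Type 2 strategy mentioned above can be sketched as Snowflake SQL driven from Python: expire the current dimension row when tracked attributes change, then insert a new current version. This is an illustrative sketch only; the table, column, and connection names (stg_customers, dim_customers, ETL_WH, and so on) are hypothetical placeholders, and it assumes the Snowflake Python connector.

```python
# Minimal sketch of an SCD Type 2 refresh against Snowflake, assuming a staging
# table stg_customers and a dimension dim_customers with valid_from / valid_to /
# is_current columns. All names and connection parameters are hypothetical.
import snowflake.connector

SCD2_EXPIRE = """
UPDATE dim_customers d
SET    is_current = FALSE,
       valid_to   = CURRENT_TIMESTAMP()
FROM   stg_customers s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current = TRUE
  AND  (d.email <> s.email OR d.segment <> s.segment)
"""

SCD2_INSERT = """
INSERT INTO dim_customers (customer_id, email, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM   stg_customers s
LEFT JOIN dim_customers d
  ON   d.customer_id = s.customer_id
  AND  d.is_current = TRUE
WHERE  d.customer_id IS NULL
"""


def apply_scd2(conn) -> None:
    """Expire changed rows, then insert new versions, in one transaction."""
    cur = conn.cursor()
    try:
        cur.execute("BEGIN")
        cur.execute(SCD2_EXPIRE)   # close out rows whose tracked attributes changed
        cur.execute(SCD2_INSERT)   # add a fresh current row for new or changed keys
        cur.execute("COMMIT")
    finally:
        cur.close()


if __name__ == "__main__":
    # Placeholder connection parameters.
    conn = snowflake.connector.connect(
        account="example_account", user="example_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
    )
    apply_scd2(conn)
```

The expire-then-insert ordering keeps exactly one current row per business key; in practice the same pattern is often expressed as a single MERGE statement or run as a task inside an Airflow DAG.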

 

Required Qualifications & Experience:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

  • Extensive hands-on experience designing, developing, and maintaining data pipelines and ETL processes on AWS Redshift/Snowflake, including data lakes and data warehouses.

  • Proficiency in SQL programming and Snowflake/Redshift stored procedures for efficient data manipulation and transformation.

  • Hands-on experience with AWS services such as AWS DMS, Amazon S3, AWS Glue, Redshift, Snowflake, Airflow, and other pertinent data technologies.

  • Strong understanding of ETL best practices, data integration, data modeling, and data transformation.

  • Experience with complex ETL scenarios, such as CDC and SCD logic, and integrating data from multiple source systems.

  • Demonstrated expertise in AWS Data Pipelines for seamless ingestion from RDS, EMR, and S3 to AWS Redshift/Snowflake.

  • Proficiency in Python programming with a focus on developing efficient Airflow DAGs and operators (see the DAG sketch after this list).

  • Familiarity with version control systems, particularly Git, for maintaining a structured code repository.

  • Proficiency in identifying and resolving performance bottlenecks and fine-tuning Snowflake/Redshift queries.

  • Strong coding and problem-solving skills, and attention to detail in data quality and accuracy.

  • Ability to work collaboratively in a fast-paced, agile environment and effectively communicate technical concepts to non-technical stakeholders.

  • Proven track record of delivering high-quality data solutions within designated timelines.

  • Experience working with large-scale, high-volume data environments.

  • 5+ years of hands-on experience working with Snowflake/Redshift and other AWS services, delivering high-performing, scalable data platforms and solutions in the AWS cloud.

  • AWS certifications related to data engineering or databases are a plus.
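
As a concrete reference for the Airflow and AWS orchestration skills listed above, here is a minimal sketch of a daily DAG that copies a day's Parquet partition from S3 into a Redshift staging table and then runs a placeholder data-quality check. It assumes Airflow 2.x with the Amazon provider installed; the DAG id, bucket, schema, table, and connection ids are hypothetical.

```python
# Minimal Airflow DAG sketch: stage S3 files into Redshift, then validate.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator


def validate_load(**context):
    # Placeholder data-quality check; a real DAG would compare row counts or
    # run assertions against the partition loaded for context["ds"].
    print(f"validating staging.orders load for {context['ds']}")


with DAG(
    dag_id="s3_to_redshift_daily",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    # COPY the day's Parquet partition from a (hypothetical) data lake bucket
    # into a Redshift staging table.
    load_orders = S3ToRedshiftOperator(
        task_id="load_orders",
        s3_bucket="example-data-lake",
        s3_key="orders/{{ ds }}/",
        schema="staging",
        table="orders",
        copy_options=["FORMAT AS PARQUET"],
        redshift_conn_id="redshift_default",
        aws_conn_id="aws_default",
    )

    check_quality = PythonOperator(
        task_id="check_quality",
        python_callable=validate_load,
    )

    load_orders >> check_quality
```

The templated s3_key ({{ ds }}) scopes each run to one execution date, which keeps the load idempotent and safe to backfill.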
