Job Description
Core Responsibilities:
- Develop and maintain a state-of-the-art data architecture across the organization.
- Assemble and manage comprehensive data sets that meet both functional and non-functional business requirements.
- Improve internal processes by:
  - Automating manual tasks.
  - Improving the efficiency of data delivery.
  - Re-architecting systems to scale with growing demand.
- Build robust systems that streamline the data lifecycle, from extraction through transformation to loading, using tools such as Python and SQL.
- Build analytics tools on top of the data pipeline that deliver actionable insights into customer growth, operational efficiency, and other key business metrics.
- Collaborate closely with leadership and with the Product, Operations, and Design teams to support the company's data-related technical and infrastructure needs.
Detailed Duties:
As a Data Engineer, your role is to support our data analysis, operations, and architecture teams. You'll be central to keeping our data delivery framework seamless, efficient, and robust across a range of projects.
Tasks you'll tackle include:
- Architecting data pipelines to cleanse, transform, and consolidate diverse data sets (see the Python sketch after this list for the flavor of this work).
- Applying software development best practices to refine our backend infrastructure.
- Creating comprehensive models of data streams, both incoming and outgoing, to enable in-depth organizational analysis.
- Formulating models that provide insight into business strategy questions.
- Facilitating interdepartmental communication to align on data requirements and usage.
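To give a flavor of the pipeline work above, here is a minimal extract-transform-load sketch in Python. The file name, column names, and target table are hypothetical, and SQLite stands in for the production warehouse:

```python
# Minimal ETL sketch: extract rows from a CSV export, clean them,
# and load them into a SQL table. All names here are illustrative;
# SQLite is used only as a stand-in for the real warehouse.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Read raw rows from a CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Drop incomplete records and normalize types."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id") or not row.get("amount"):
            continue  # skip records missing required fields
        cleaned.append((row["customer_id"].strip(), float(row["amount"])))
    return cleaned

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Write cleaned rows into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (customer_id TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("orders_export.csv")))
```

In production, the load step would write to our warehouse (BigQuery, for example via the google-cloud-bigquery client) rather than a local SQLite file, and the pipeline would be orchestrated and scheduled rather than run as a one-off script.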
Candidate Profile:
The ideal candidate will possess a blend of sharp problem-solving capabilities, solid technical acumen, and exceptional communication proficiency. Essential qualifications for success in this role include:
- At least 7 years of progressive experience with data analysis, data warehousing, and/or big data technologies.
- Proficiency with ETL tools and frameworks.
- Familiarity with relational and NewSQL database technologies (we use several Postgres-compatible databases, including Google Cloud SQL for PostgreSQL, AlloyDB, and CockroachDB).
- Expertise in managing data pipelines and workflows.
- Practical experience with cloud services, particularly GCP, and tools like Google BigQuery; our data warehouse runs on BigQuery.
- Skilled in using Python and/or R for data exploration and analysis.
- Proven track record in conducting thorough root cause analysis on data and processes, with the goal of driving business insights and improvements.
- Experience building processes that support data transformation, data structures, and dependency and workload management.
- Analytical prowess with both structured and unstructured data sets, with a history of deriving meaningful insights from complex and disparate data sources.
- Strong project management capabilities and organizational skills.
- Proven experience in collaborating with and supporting cross-functional teams.
- Outstanding problem-solving skills coupled with the ability to communicate effectively.
- Self-motivated, eager to learn, and committed to personal growth and development.
- Advocacy for best practices in the field and a dedication to continual learning.