Essential Functions (primary functions and/or reasons the job exists in order of importance)
1. Designs, builds, and maintains custom data pipelines and ETL processes to support business and Data Scientist needs and initiatives.
2. Partners with data analysts, data scientists, and other data consumers across IT and the business to optimize data availability for building, refining, and enhancing AI/ML models and algorithms.
3. Drives continuous improvement in data pipelines and partners with others across IT to ensure both timely availability and security of data.
4. Supports daily operations by analyzing and correcting incidents and defects in a timely fashion.
5. Enables data consumption via visualization tools or APIs.
6. Serves as a peer mentor to less experienced Data Engineers.
Desired Qualifications/Experience/Certification/Education (in order of importance)
1. 3+ years of work experience in data management disciplines, including data integration, data modeling, data security, and data quality, or in other areas directly relevant to Data Engineering responsibilities and tasks.
2. Highly skilled at systematically applying logical reasoning techniques to query, inspect, cleanse, profile, and study data in structured, semi-structured, and unstructured formats, both to derive knowledge that supports business decision-making and to map data between source and target systems for application integrations.
3. Highly skilled at SQL programming, with the ability to write highly complex queries against relational databases efficiently.
4. Skilled at core Python programming concepts and at using Python frameworks (e.g., Django, Flask) and libraries.
5. Skilled at creating data extracts or loads by selecting data to match consumer needs and creating data pipelines to feed required environments or AI/ML models with little to no transformation.
6. Skilled in developing secure Application Programming Interface (API) bridges between systems or applications, including the use of varying protocols (e.g., SOAP, REST). Experience with the MuleSoft platform is a plus.
7. Skilled at writing moderately to highly complex scripts to automate manual, repeatable tasks using common industry tools (e.g., Python, PowerShell, and Bash).
8. Knowledgeable at defining and implementing processes to automate code deployment through the development life cycle using widely accepted tools (e.g., Jenkins, Kubernetes, Docker).
9. Knowledgeable at building conceptual, logical, and physical data models, which are visual representations of an enterprise's business data. Experience with erwin or other modeling tools is a plus. Ability to build and modify simple data models.
10. Bachelor’s degree in Information Technology, Computer Science, Engineering, Mathematics or related field or commensurate experience.
About TEKsystems: We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information, or any characteristic protected by law.