Job Description
Title: Software Developer II
Location: Charlotte, NC (Hybrid)
Note: This role is NOT open to C2C candidates
The Software Developer will work in Lending, safeguarding against attrition and risk and ensuring delivery. They will support 2022-to-2023 carry-over initiatives, including Staged Funding (Growth), Fraud Monitoring (SII), Income Verification (Risk Exception), D2I Updates (Risk/AML ask), and BCDR (SII), as well as new initiatives for 2023, including SPOG, ACM, & Genesis IVR, Learning Management (SII), and Zoot Integration/3rd Party Risk (Finding). Additionally, the Software Developer will be responsible for:
- Designing and building enhancements to an ever-expanding data platform that supports business-process needs for internal and external integration via APIs, data models, self-serve reporting solutions, and interactive querying
- Defining, building, testing, documenting, and auditing data platform artifacts, including data models, data flow processes, integrations, etc.
- Developing standard methodologies and frameworks for unit, functional, and integration tests around data pipelines, and driving teams toward increased overall test coverage
- Designing continuous integration and deployment (CI/CD) processes and best practices for production data pipelines
- Creating data platform artifacts for data flow processes and integrations
- Working with current technologies in the Microsoft Azure stack, including Azure SQL, Synapse, Data Lake, Data Factory, Databricks, Azure Functions, Service Bus, etc.
- Collaborating with and influencing users, engineers, and product partners to ensure the data infrastructure meets constantly evolving requirements
- Finding opportunities to adopt innovative technologies that advance the company's vision
Position Requirements:
- 5+ years of relevant work experience required
- Strong experience with the Azure platform, especially with data services such as Azure SQL, Synapse, Data Lake, Data Factory, Databricks, Azure Functions, Service Bus, etc., required; experience with other cloud platforms, such as AWS, highly preferred
- Strong programming skills, especially in SQL, along with data modeling and related data-processing concepts, required
- Solid understanding of real-time data processing, data pipelines, transformation, and modeling processes using traditional and distributed systems
- Experience developing and architecting data ingestion models, ETL jobs, and alerting to maintain high availability and data integrity
- Experience developing test automation solutions
- Thorough knowledge of relational and multidimensional databases and NoSQL solutions
- Experience working with real-time data processing frameworks such as Kafka/Azure Event Hubs
- Experience building modern data lake architectures such as Lakehouse/Lambda
- Experience with PySpark preferred
- Bachelor's Degree in Computer Science, Information Systems, or other similar technical field of study, or equivalent professional experience, required