Alaffia Health has built the future of automated healthcare claims processing and is looking for the next addition to our backend API development team. Alaffia is a team of dedicated and passionate engineers who work as a unit to deliver best-in-industry software centered on algorithmic and AI processing for maximum automation.
The Alaffia technical team lives at the cutting edge of what's possible and is constantly pushing the boundaries of both data science and software development. Our engineering team develops the latest techniques and tools to deliver the most value and impact to the healthcare industry and medical claim reviews, with objective truth and accuracy as our guiding principles. We pride ourselves on quality, teamwork, and service. In this role, you'll have the opportunity to build a healthcare claim review platform with real social and economic impact. You'll be making a dent in the struggles of our nation's healthcare system from your first day.
Our Culture
At Alaffia, we fundamentally believe that the whole is more valuable than the sum of its individual parts. Further to that point, we believe a diverse team of individuals with various backgrounds, ideologies, and types of training generates the most value. Our people are entrepreneurial by nature, problem solvers, and are passionate about what they do - both inside and outside of the office.
About the Role & What You'll Be Doing
In this role, you'll be at the forefront of developing and enhancing our data infrastructure, driving the efficiency and effectiveness of our healthcare payment processing platform. You'll have the opportunity to design and implement robust data pipelines, integrate cutting-edge technologies, and collaborate with cross-functional teams to deliver high-quality solutions. Your responsibilities will include writing production-level code, architecting scalable systems, and ensuring data integrity throughout the pipeline. Additionally, you'll play a key role in innovating our data engineering practices and addressing security challenges to safeguard sensitive information.
Your Responsibilities
- Developing production-level code in Python for our data pipelines and adjacent services to ensure efficient data processing
- Designing and implementing new services to extract valuable insights from real-time data ingestion, enhancing the platform's capabilities
- Architecting robust data pipelines using Apache Airflow and Kubernetes, enabling the integration of human-in-the-loop machine learning models into our production environment
- Integrating technologies like GraphQL and PostgreSQL to develop a flexible and high-performing web application used by various stakeholders in the healthcare industry
- Driving innovation in data engineering by leveraging modern tools such as Apache Arrow, Postgres Foreign Data Wrappers, Delta Lake, and MLflow to optimize data processing workflows
- Making architectural decisions to support live updates for users and ensure scalability for continuous ingestion of medical claims
- Establishing comprehensive testing suites to validate data processing tasks and maintain data integrity throughout the pipeline
- Collaborating closely with backend and machine learning engineers to implement new APIs and services, enhancing the platform's functionality
- Working alongside the Payment Integrity team to incorporate their domain expertise into core development and data initiatives, ensuring alignment with industry best practices
- Partnering with the Product team to enable frontend features driven by a robust and efficient data pipeline
- Addressing security challenges related to private data at scale by implementing hardened data access policies, including Postgres Row-Level Security (RLS) policies, to safeguard sensitive information
What We're Looking For
- 5+ years of experience developing data engineering services and data processing pipelines
- Algorithmic, statistical, or ML-based service development
- Demonstrated experience working in a fast-paced, innovation-centric environment
- Deep experience with pipeline solutions and modern data-related technologies
- Dataframe frameworks such as Pandas, Polars, Nushell, or Spark
- Orchestrators such as Airflow or Kubeflow
- Production-level, object-oriented code written in Python
- Python tooling such as Pytest (testing), Poetry (dependency management), and Pydantic (data validation)
- PostgreSQL or similar SQL technology
- Data lakes and warehouses such as Snowflake, Delta Lake, Azure Synapse, or AWS Athena/Glue
- PDF/image extraction and normalization experience
- Object-oriented programming experience
- Large file system and data lake management experience
- Team development and mentorship
- Preferred experience with modern infrastructure platforms and patterns
- Kubernetes
- AWS
- Microservice architectures
- Message queues (Kafka, AMQP, NATS JetStream) and event-driven architectures
- Datadog or similar logging and monitoring solutions
- Container-based workflows with Docker
- Nice to Haves:
- Experience in the Healthcare, Insurance, or Healthcare Payments Industry
- Experience working with FHIR, EDI, or 837 data formats
- Experience with Software as a Service (SaaS) enterprise systems
- Templating languages such as Helm, Jinja, or Go Templates
- GitHub, CI/CD using GitHub Actions
- Large Language Model development
What Else Do You Get Working With Us?
- Hybrid work environment - work from the office and home
- Employer-sponsored Medical, Dental, and Vision benefits
- Flexible, paid vacation policy
- Flat organizational structure - direct access to leadership
*Please note: Alaffia Health does not provide employment sponsorship at this time.