Job Description
Embark on an exhilarating career with FreightVerify, a trailblazing leader in innovative software solutions for the logistics industry! We are on the lookout for a dynamic Big Data Engineer / Data Engineering Leader with a passion for modern cloud architecture and Data Lake technologies. This role offers you the chance to lead a talented team of engineers in crafting cutting-edge Business Insights products that redefine the landscape of freight and logistics.
Picture yourself at the heart of our newly renovated, state-of-the-art office nestled in Downtown Ann Arbor, where collaboration meets the latest and greatest technology. As we experience exponential growth, you have the unique opportunity to join us as a pivotal member of our Data Engineering Leadership team.
What makes this opportunity truly exciting?
- Out-of-the-Box Thinking: We're seeking visionaries who can think beyond conventions and deliver the big picture. Your innovative mindset is the driving force behind our success.
- Leadership Impact: Manage and mentor a team of 5+ members, with a clear path to overseeing a team of 10 as we continue to expand. Your leadership skills will be instrumental in shaping the future of our technology endeavors.
- Cutting-Edge Technologies: Immerse yourself in the world of modern cloud technologies (AWS/Azure/GCP), Micro-service APIs, and Data Lake technologies. Be at the forefront of the industry's advancements.
- Creative Freedom: As our Big Data Engineer, you have the autonomy to conceptualize, architect, and design Big Data pipelines, implementing ETL processing that pushes the boundaries of what's possible.
- Rapid Growth Opportunities: Bringing a minimum of 10 years of experience in software product development, you'll add a wealth of knowledge to our thriving environment. Join us on this exciting journey as we expand our team and take on new challenges.
- Impactful Decision-Making: Evaluate, onboard, and productionize modern data lake technologies, influencing critical decisions that drive the success of FreightVerify.
- Tech Stack Mastery: From AWS, Azure, or Google Cloud technology stack to Java, C#, Kafka, NoSQL, and more – you'll be at the helm of a diverse and exciting tech stack.
Technical Qualifications:
- Minimum of 10 years of experience in software product development with demonstrated leadership in managing teams.
- Minimum of 3-5 years of experience in Big Data/Analytics and expertise in Big Data frameworks.
- Proficiency in AWS, Azure, or Google Cloud technology stack.
- Familiarity with Data Lake technologies such as Delta Lake.
- Ability to conceptualize, architect, and design Big Data pipelines and implement ETL processing.
- Expertise in data modeling and architecture (critical).
- Knowledge of data security and privacy.
- Proficiency in Micro-service APIs.
- Mastery of RDBMS technologies such as PostgreSQL and Oracle.
- Expertise in Big Data processing using Spark, Databricks, etc.
- Competence in Java, C#, Kafka, and NoSQL technologies such as Cassandra, Elasticsearch, and Redis.
- Experience with automated testing frameworks, CI/CD pipeline implementations.
- Proficiency in deploying large, distributed Big Data applications.
- Knowledge of database structures and data mining.
- Excellent analytical and problem-solving skills.
- Experience in data analysis and management.
- Knowledge of underlying infrastructure for Big Data Solutions.
- Understanding of Enterprise BI and analytics.
At FreightVerify, we're not just building software; we're shaping the future of logistics. If you're ready to unleash your creativity, lead with passion, and make a lasting impact in a rapidly growing company, seize this exciting opportunity and join us on this thrilling journey!