Must be local - Hybrid role
Must have a valid LinkedIn profile
Scala Developer
Job Description:
· Designs, develops, and implements Big Data streaming applications with Scala to support business requirements.
· Follows approved life-cycle methodologies using standard software frameworks; performs coding, testing, and operational support.
· Resolves technical issues through debugging, research, and investigation. Relies on experience and judgment to plan and accomplish goals.
· Performs a variety of tasks. A degree of creativity and latitude is required.
· Codes software applications that adhere to designs supporting internal business requirements or external customers.
· Standardizes the quality assurance procedure for software. Oversees testing and develops fixes.
· Contributes to the design and development of high-quality software for large-scale Scala distributed systems using Databricks and the AWS Cloud.
· Ingests and processes streaming data sets using appropriate technologies, including but not limited to AWS services (Kinesis, S3, Lambda), Spark, and Kafka.
Skills:
· 5 years of programming experience in Scala, preferably in the Big Data space
· Good knowledge of standard concepts, practices, and procedures within the field.
· Strong communication skills.
· Experience with Databricks, Kafka, Spark, and AWS services such as S3, Kinesis, and Lambda
· Good understanding of Big Data concepts and experience with GitHub and CI/CD tools
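To give candidates a sense of the day-to-day work, the Kafka-to-S3 streaming stack listed above can be sketched in Scala with Spark Structured Streaming. This is a minimal illustrative sketch only, not this team's actual code: the broker address, topic name, and bucket paths are placeholder values, and running it requires Spark with the Kafka connector on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object KafkaToS3Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("kafka-to-s3-sketch")
      .getOrCreate()

    // Subscribe to a Kafka topic; Kafka delivers keys and values as binary,
    // so the payload is cast to a string column for downstream processing.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // placeholder broker
      .option("subscribe", "events")                    // placeholder topic
      .load()
      .select(col("value").cast("string").as("payload"))

    // Land micro-batches in S3 as Parquet; the checkpoint location lets the
    // stream recover exactly where it left off after a restart.
    events.writeStream
      .format("parquet")
      .option("path", "s3a://example-bucket/events/")            // placeholder
      .option("checkpointLocation", "s3a://example-bucket/chk/") // placeholder
      .start()
      .awaitTermination()
  }
}
```

In practice the same pattern runs on Databricks, where the S3 paths and checkpointing are typically managed through the workspace configuration.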