Do you thrive on building innovative software that powers smart IoT products and platforms? Want to support them at a scale larger than you've ever thought possible?
If you are fueled by the challenge of diving deep into IoT applications spanning a diverse technology stack, then the Noke team is the place for you. You will think innovatively about our scalability and latency needs, from custom hardware, mobile apps, and browsers to systems and storage, and you'll get the chance to work with the latest cloud technologies.
As a senior Software Developer, you will gather and analyze requirements for new enhancements, coordinate with other teams, and take full ownership of each enhancement through a smooth transition into production, including documentation. You will ensure that any reported problem or issue is understood, recreated, and resolved within the given time frame with minimal supervision. You are expected to follow all software development best practices and standards, continuously improve your coding skills with new technologies, and participate in project meetings and discussions.
JOB STATEMENT:
As a skilled Data Engineer at Noke, you will apply a solid understanding of data science principles while building and maintaining an efficient data architecture that supports the advanced data analysis and data science projects that enhance Noke's products and business operations.
- Collect, store, and process large volumes of data using AWS tools such as Kinesis, Glue, or Data Pipeline.
- Design and manage large datasets; perform exploratory data analysis (EDA) to find patterns and anomalies and assess data quality, using AWS services such as Amazon S3, RDS, and Redshift.
- Develop, train, and deploy machine learning models with MLOps practices (e.g., classification, regression, clustering, NLP, CNNs) using AWS SageMaker or related services.
- Implement analytical algorithms for predictive and prescriptive modeling.
- Create visualizations and dashboards using Amazon QuickSight, or integrate with tools such as Tableau and Power BI.
- Create database objects such as tables, stored procedures, triggers, and functions.
- Write complex SQL queries and perform query tuning and optimization.
- Extract data from SQL/NoSQL databases to generate ad hoc reports for analysis.
- Write scripts and programs in Python and R.
- Document procedures, workflows, and best practices related to data processing and modeling.
REQUIREMENTS:
- 5+ years of experience as a Data Engineer working with large datasets and data analysis, with an understanding of data science methodologies.
- 3+ years of experience with SQL-based database technologies (Oracle, SQL Server, MySQL, Postgres), including stored procedures, functions, and triggers.
- 3+ years of experience with data visualization tools such as Power BI and AWS QuickSight.
- 3+ years of experience with cloud services (preferably AWS), including SageMaker, S3, EC2, EMR, RDS, Airflow, and Redshift.
- Familiarity with big data tools like Hadoop, Spark, Kafka, etc.
- Strong programming skills in languages such as Python, Java, or Scala.
- Experience with machine learning frameworks (like TensorFlow or PyTorch) and ETL tools is a plus.
- Excellent communication and teamwork skills.
PHYSICAL DEMANDS:
- Regularly required to sit, stand, and walk.
- Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position.