# Job Description: Data Engineer
Skills: Azure, Databricks, GCP, Kafka, Event Hubs, Python, PySpark
Build a Databricks solution that ingests streaming data and delivers data tables, following a "data lake" philosophy. We need a true engineer who is hands-on with building and developing.
Interview Process:
- 30-minute screening (include the candidate's available times with the submission)
- 1-hour technical interview with an SME
- 30-minute final interview with the hiring manager