Company

Deloitte US

Address: Lake Mary, FL
Form of work: Other
Category: Information Technology

Job description

Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a daily collaborative environment with a think-tank feel, where you can share new ideas with your colleagues - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of the typical Delivery Center.

Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below ...

Work you'll do/Responsibilities

  • Work with the team to evaluate business needs and priorities, liaise with key business partners and address team needs related to data systems and management.
  • Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements.
  • Participate in project planning; identify milestones, deliverables, and resource requirements; track activities and task execution.
  • Generate design, development, test plans, detailed functional specifications documents, user interface design, and process flow charts for execution of programming.
  • Develop data pipelines and APIs using Python, SQL, and potentially Spark, along with AWS, Azure, or GCP services.
  • Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
  • Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or GCP.
  • Move data from on-premises systems to the cloud and perform cloud data conversions.

The Team

Artificial Intelligence & Data Engineering:

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The Artificial Intelligence & Data Engineering team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Artificial Intelligence & Data Engineering will work with our clients to:

Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.

Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.

Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Qualifications

Required

  • 3+ years of experience in Data Engineering with an emphasis on data analytics and reporting.
  • 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or others.
  • 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, Databricks, etc.).
  • 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines.
  • 3+ years of experience with one or more of the following scripting languages or tools: Python, SQL, Kafka, and/or others.
  • 3+ years of experience designing and building solutions utilizing various cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
  • Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve. This may include overnight travel.
  • Expected to co-locate in your designated office/USDC location up to 30% of the time.
  • Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or a related IT discipline; or equivalent experience.
  • Must live within a commutable distance of, or be willing to relocate to, one of the following Delivery locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.

Preferred

  • AWS, Azure and/or Google Cloud Platform Certification.
  • Master's degree or higher.
  • Expertise in one or more programming languages, preferably Scala, PySpark and/or Python.
  • Experience working with either a MapReduce or an MPP system at any size/scale.
  • Experience working with agile development methodologies, such as sprint-based Scrum.
  • Must be able to obtain the required level of security clearance for this role.

Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html



Education: Bachelor's Degree
Refer code: 8755780. Posted by Deloitte US on 2024-03-27 12:07.
