Company: Goli Tech

Address: San Antonio, TX
Form of work: Full-Time
Category: Information Technology

Job Description

Job Summary: As a Data Integration Engineer on our Data team, you will design intelligent, automated data systems, applying advanced data engineering knowledge in the data warehousing space to redefine best practices with a cloud-based approach to scalability and automation. In partnership with business analysts, you will work backwards from our business questions to build reliable, scalable data solutions that meet business needs. By scaling up our client's data ecosystem around cloud-based CRM, ERP, and data platforms, we will improve speed, lower total cost of ownership, and provide a unified view of entities.

This is an opportunity to revolutionize the industrial technology space by partnering with clients in the industrial, construction, heavy equipment, renewables, oil & gas, and manufacturing sectors. You will build new skills and strengthen your expertise through hands-on experience developing robust data pipelines and cloud-based data solutions for modern enterprise platforms such as Salesforce and Microsoft Dynamics 365.

The incumbent in this position is expected to model the following practices daily: 1) Demonstrate alignment with the company's mission and core business values; 2) Collaborate with key internal/external resources; 3) Participate in ongoing self-development.

Essential Functions:

  • Develop, evaluate, and influence effective and consistent productivity and teamwork to ensure the delivery of Legendary Customer Service (LCS).
  • Model, promote, reinforce, and reward the consistent use of HOLT's Values Based Leadership (VBL) tools, models, and processes to ensure alignment with our Vision, Values, and Mission.
  • Design, develop, implement, test, document, and operate large-scale, high-volume, high-performance data structures.
  • Create and propose technical design documentation, including current and future ETL functionality, affected database objects, specifications, and flows and diagrams that detail the proposed implementation.
  • Implement data structures using data modeling best practices to provide online reporting and analysis via business intelligence tools and a logical abstraction layer over large, multi-dimensional datasets and multiple sources.
  • Understand existing database and warehouse structures in order to determine how best to consolidate and aggregate data in an efficient and scalable way.
  • Design and code all aspects of data solutions using cloud-based tools to build out a data warehouse.
  • Design ETL/ELT processes and data pipelines to bring data from various sources into a central data repository.
  • Work closely with business teams, developers, application teams, and vendors to develop optimal solutions.
  • Analyze new/disparate data sources for integration with existing datasets to tell a comprehensive story.
  • Improve business process agility and outcomes, drive innovation, and reduce time to market for our innovative IT solutions.
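
The ETL/ELT responsibilities above can be sketched in miniature. Everything here is an illustrative assumption, not part of this role's actual stack: the source names, the fields, and SQLite standing in for a cloud data warehouse.

```python
# A minimal, hypothetical sketch of the ETL pattern described above:
# extract rows from two disparate "sources", transform them into one
# unified schema, and load them into a central repository (SQLite
# stands in for the warehouse). All names and fields are illustrative.
import sqlite3

def extract():
    # Two feeds with different field names for the same entity (made-up data)
    crm = [{"AccountName": "Acme", "City": "San Antonio"}]
    erp = [{"customer": "Acme", "location": "San Antonio"}]
    return crm, erp

def transform(crm, erp):
    # Consolidate both feeds into a single schema, tagging each row's source
    unified = []
    for row in crm:
        unified.append((row["AccountName"], row["City"], "crm"))
    for row in erp:
        unified.append((row["customer"], row["location"], "erp"))
    return unified

def load(rows):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE entities (name TEXT, city TEXT, source TEXT)")
    con.executemany("INSERT INTO entities VALUES (?, ?, ?)", rows)
    return con

con = load(transform(*extract()))
count = con.execute("SELECT COUNT(*) FROM entities").fetchone()[0]
```

The point of the sketch is the shape, not the tools: each stage is a separable function, so sources can be added or the load target swapped without touching the rest of the pipeline.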

Knowledge, Skills, and Abilities:

  • Capable of speaking articulately to a breadth of topics such as RDBMS, NoSQL, Azure data store technologies, ETL, data warehousing, data modeling, and role-based access.
  • Design and create integration/ETL processes as needed to meet business requirements and ensure successful data migration for project implementations.
  • Advanced SQL and query performance tuning skills.
  • Solid hands-on experience architecting and developing SOAP and REST web services, using ESB tools to deliver fast, reliable, and scalable integrations.
  • Experience designing Mule applications (API-first approach, application networks, RAML modeling and mocking, exception handling, logging, CloudHub deployment).
  • Knowledge of Mule 4 API development (JSON, XML, exception handling, logging).
  • Programming with Java, C++, Python, JavaScript, or similar languages.
  • Understanding of and experience with security implementations (SSL/mutual SSL, SAML, OAuth).
  • Ability to work with ambiguous requirements and drive clarity by collaborating with business groups.
  • Detail-oriented, with the ability to rapidly learn and take advantage of new concepts, tools, and technologies, and to quickly ramp up on new projects, their business needs, and support engagements.
  • Get-it-done mentality, with the ability to deliver against goals regardless of the challenges thrown their way.
  • Ability to manage workload and multiple priorities; excellent problem-solving and troubleshooting skills.
  • Comfortable working in a matrix environment and fostering motivation within the project team to meet tight deadlines.
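
Query performance tuning, mentioned in the skills above, can be illustrated in a few lines. This is a hedged sketch: SQLite stands in for the warehouse engine, and the table and column names are made up.

```python
# Hypothetical illustration of query tuning: the same lookup before and
# after adding an index, verified with SQLite's EXPLAIN QUERY PLAN.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
con.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                [(f"cust{i}", i * 10.0) for i in range(1000)])

query = "SELECT * FROM orders WHERE customer = 'cust7'"

# Without an index, the planner must scan the whole table
before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]

# With an index on the filter column, it can do a keyed search instead
con.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]
```

The same discipline applies on any engine: read the plan, find the full scans on large tables, and decide whether an index (or a rewritten predicate) removes them.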

Experience:

  • 5+ years of experience designing, developing, and deploying data solutions.
  • Experience with Informatica.
  • 4+ years of experience with ETL and web services-based integrations, with expert-level knowledge of developing APIs using SOAP and REST architectural styles and data interchange formats such as XML and JSON.
  • 2+ years of coding experience with a modern programming or scripting language (Python, Scala, Java, C#, etc.).
  • Implement common APIs based on architecture guidelines and frameworks.
  • Analyze, design, develop, and implement RESTful services and APIs.
  • Experience creating REST API documentation using Swagger, YAML, or similar tools desirable.
  • Knowledge of HTTP protocols and experience using them in conjunction with RESTful APIs.
  • Experience integrating with cloud/SaaS applications and with the APIs and SDKs of packaged and legacy applications.
  • Basic scripting skills such as JavaScript, shell, and PowerShell; experience with source control and build technologies (e.g., Azure DevOps, Git).
  • Experience developing/operating large-scale ETL/ELT processes on on-prem and cloud platforms; database technologies; data modeling.
  • Experience developing/operating highly available, distributed systems for extraction, ingestion, and processing of large data sets from multiple systems.
  • Active Snowflake, Informatica, or Azure certifications.
  • Experience working in an Agile environment.
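
The Swagger/YAML documentation requirement above can be sketched concretely. The endpoint, titles, and descriptions here are hypothetical, and JSON is used in place of YAML since both serialize the same document.

```python
# A minimal, hypothetical OpenAPI (Swagger) document built as a Python
# dict and serialized to JSON. All names are illustrative.
import json

spec = {
    "openapi": "3.0.0",
    "info": {"title": "Entity Integration API", "version": "1.0.0"},
    "paths": {
        "/entities": {
            "get": {
                "summary": "List unified entities",
                "responses": {"200": {"description": "A JSON array of entities"}},
            }
        }
    },
}

doc = json.dumps(spec, indent=2)   # what you would publish (or convert to YAML)
loaded = json.loads(doc)           # round-trip to confirm it is well-formed JSON
```

In practice such a spec is either written by hand or generated from code, then fed to tooling that renders browsable docs and mock servers.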

Required Skills: Data Warehouse
Additional Skills: Data Warehouse Engineer
Refer code: 3522888. Goli Tech - 2023-10-10
