Big Data Infrastructure Administrator
Merrimack, NH or Dallas, TX
12-month contract
Great pay
Phone and Skype Interview
Job Description:
This role provides an exciting opportunity to roll out a new strategic initiative within the firm: the Enterprise Infrastructure Big Data Service. The Big Data Infrastructure DevOps/Support Administrator serves as a development and support expert responsible for the design, development, automation, testing, support, and administration of the Enterprise Infrastructure Big Data Service. This will involve building and supporting a general-purpose data analytics platform used by Fidelity's data scientist community. The incumbent will be responsible for developing features, providing ongoing support and administration, and maintaining documentation for the service. The platform provides a data hub and a blueprint for integrating with existing upstream and downstream technology solutions.
This role combines DevOps and support administration. The position requires a strong background in computer architecture, software development, data management systems, and distributed computing, along with a solid understanding of the open-source technology ecosystem. The ideal candidate will have technical expertise, customer engagement skills, excellent communication skills, and a passion for organizing and analyzing data.
Primary Responsibilities
The incumbent will have the opportunity to work directly across the firm with developers, operations staff, data scientists, architects, and business constituents to develop and enhance the Big Data service. Key responsibilities include:
Development, support, and maintenance of the infrastructure platform and application lifecycle
Design, development, and implementation of automation innovations
Development of automated testing scripts
Building and nurturing relationships with Fidelity's developer and system administration communities
Contribution to all phases of the application lifecycle: requirements, development, testing, implementation, and support
Responding to and providing guidance for customers of the Big Data platform
Defining and implementing integration points with existing technology systems
Interacting with and participating in open-source software communities
Researching and remaining current on Big Data technology and industry trends and innovations
Participating in a 24x7 on-call support rotation
Education and Experience
B.S. in Computer Science or equivalent
Master's degree is a plus
5+ years of application development or systems administration experience
5+ years of development experience in one or more of the following languages: Java, C++, Perl, Python
Experience deploying or managing open-source software
Strong experience with any Linux distribution
Experience with Kafka and Cloudera
2+ years of Hadoop experience
Database administration experience a plus
Skills and Knowledge
Certification and experience working in Hadoop
Certification and working experience with a NoSQL database
Experience with Splunk/HUNK or Solr solutions and dashboards running on Big Data technologies such as Hadoop
Strong background in Linux/Unix Administration
Agile Scrum or Kanban experience
Global team experience
Experience with automation/configuration management using Chef, Puppet, or an equivalent tool
Experience programming with CI/CD technologies is a plus
Knowledge of designing scalable distributed systems
Awareness of both current and developing technologies
Strong desire to innovate and develop future technology
All your information will be kept confidential according to EEO guidelines.