Detailed Requirements:
- Must have 3+ years of hands-on experience architecting and implementing Big Data (Hortonworks/Cloudera Hadoop) solutions.
- Expert knowledge of Big Data and data warehouse concepts
- Must have 3+ years of hands-on experience in Python
- Hands-on experience with multiple Big Data tools (Hive, Beeline (HiveServer2), Flume, Pig, HBase, MapReduce, Oozie, Sqoop/Sqoop2, Spark, Kafka)
- Must have experience setting up security architectures, including Kerberos, firewalls, encryption, KMS, IAM, MFA, compliance (SOX/HIPAA/PII), data protection, etc.
- Experience with Unix/Linux systems, including shell scripting
- Excellent grasp of integrating multiple data sources into an enterprise data management platform, with the ability to lead data storage solution design
- Should be willing to take on the role of Big Data SME, advising teams on architecture, design, performance tuning, cost efficiency, and operations issues
- Ability to understand business requirements and build pragmatic, cost-effective solutions using agile project methodologies
- Experience with software development methodologies and structured approaches to system development
- Prior experience developing work-effort estimates, high-level project plans, budget estimates, etc. is a plus
- Strong leadership and communication skills - listening, verbal, written, and presentation
- Must demonstrate "out of the box" thinking and creative problem-solving skills
- Ability to work effectively across all levels of the organization, handle multiple tasks, and function in a team-oriented, fast-paced, matrix environment
- Demonstrated experience working effectively with virtual teams and third-party vendors