Job Description
Responsibilities:
- Building highly-available and secure authentication and API services.
- Maintaining and evolving mission-critical internal databases and services.
- Optimizing and operating high-volume, auto-scaling streaming data services.
- Instrumenting streaming data services for visibility into utilization per customer.
- Writing and maintaining documentation about internal and public services.
- Working with product and other stakeholders to define features.
Must haves:
- Expertise in one or more systems or high-level programming languages (e.g., Python, Go, Java, C++) and an eagerness to learn more.
- Experience running scalable (thousands of requests per second) and reliable (three nines, i.e., 99.9% availability) systems.
- Experience developing complex software systems that scale to substantial data volumes or millions of users, with production-quality deployment, monitoring, and reliability.
- Experience integrating with third-party APIs.
- Experience with large-scale distributed storage and database systems (SQL or NoSQL, e.g. MySQL, Cassandra).
- Ability to decompose complex business problems and lead a team in solving them.
- Data processing: experience building and maintaining large-scale and/or real-time complex data processing pipelines using tools such as Kafka, Hadoop, Hive, Storm, or ZooKeeper.
- 4+ years of relevant engineering experience.