You will work on frameworks for real-time and batch pipelines that ingest and transform events (on the order of 10^8 per day) from hundreds of applications. ML and software engineers consume these pipelines to build data products such as personalization and fraud detection. You will help optimize feature pipelines for fast execution and work with software engineers to build event-driven microservices.
- You have previously built large-scale data pipelines ingesting and transforming more than 10^6 events per minute and terabytes of data per day.
- You understand how microservices work and are familiar with data modelling concepts.
- You can connect different services and processes, even ones you have not worked with before, and follow the flow of data through various pipelines to debug data issues.
- You have worked with Spark and Kafka, have experimented with or at least read about Flink/Druid/Ignite/Presto/Athena, and understand when to use one over another.
- You are proficient in Java/Scala/Python/Spark.
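For context, the sketch below illustrates the kind of Kafka-to-Spark streaming pipeline described above. It is a minimal example only, assuming PySpark Structured Streaming, a hypothetical "app-events" topic, a hypothetical JSON event schema, and a placeholder broker address; it is not the team's actual code.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (SparkSession.builder
         .appName("event-ingest-sketch")
         .getOrCreate())

# Hypothetical schema of the incoming event payload.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("ts", TimestampType()),
])

# Read the raw event stream from Kafka (broker and topic are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "app-events")
       .load())

# Parse the JSON value into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", event_schema).alias("e"))
          .select("e.*"))

# Aggregate: count events per type in 1-minute windows, tolerating late data.
counts = (events
          .withWatermark("ts", "5 minutes")
          .groupBy(F.window("ts", "1 minute"), "event_type")
          .count())

# Write the aggregated feature stream out (console sink for this sketch;
# a real pipeline would write to a feature store or downstream topic).
query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .start())

query.awaitTermination()
```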
FinTech Innovation Lab
- Certified as a Great Place to Work!
- A collaborative, open work environment that fosters ownership, creativity, and urgency
- Enrolment in the Group Health Benefits plan right from Day 1, no waiting period
- Office snacks and refreshments
- Catered lunch and desserts on a monthly basis!