You will work on building frameworks for real-time and batch pipelines that ingest and transform events (10^8 scale) from hundreds of applications every day. ML and software engineers consume these events to build data products such as personalization and fraud detection. You will also help optimize the feature pipelines for fast execution and work with software engineers to build event-driven microservices. You will get to put cutting-edge tech into production, with the freedom to experiment with new frameworks, try new ways to optimize, and the resources to build the next big thing in fintech using data!

Requirements:
- BSc or MSc degree in Computer Science
- Lead development teams to define and build data pipelines
- Expert in building and operationalizing Big Data platforms on one of the public clouds, preferably MS Azure
- Hands-on experience with Big Data streaming frameworks and tools (Spark Streaming, Storm, Kafka, etc.)
- Expert in the Hadoop ecosystem and toolset: Sqoop, NiFi, Pig, Spark, HDFS, Hive, HBase, etc.
- Expert in automating data pipelines in a Big Data ecosystem, DevOps, and CI/CD
- Experience developing Hadoop integrations (batch or streaming) for data ingestion, data mapping, and data processing
- Experience programming in both compiled languages (Java, Scala) and scripting languages (Python or R)
- Expert in developing Big Data processes for data modeling, mining, and production

The client is a high-growth tech company with hundreds of millions of active users worldwide. They have an office in Toronto with a large engineering team that they are expanding due to increased demand and product releases.
- A collaborative, open work environment that fosters ownership, creativity, and urgency
- Enrolment in the Group Health Benefits plan from Day 1
- Weekly delivery of groceries and all types of snacks to our office
- All types of signature drinks, from coffee to lattes to cappuccinos
- Catered lunches and desserts every month!
- Ping Pong and Pool
- And so much more!