About the Role
In this exciting role, you will:
Help us build a state-of-the-art data processing framework for our security-oriented IoT platform
Architect a big data platform that is real-time, stable, and scalable to support data analytics, reporting, data visualization, and machine learning
The Data Engineer can also expect to:
- Design and develop ETL pipelines for the new big data platform using open-source technologies such as Kafka, Spark, and Presto
- Collaborate across teams to identify key datasets and implement ingestion processes to onboard new datasets
- Extract, compile, track, and analyze data to generate reports.
- Implement and support a scalable data pipeline using technologies such as Kafka, Spark, and Kinesis to support IoT data streaming efficiently
- Optimize data collection procedures and generate reports on a weekly, monthly, and quarterly basis
- Refactor the existing data model into an easy-to-maintain data solution across the organization
- Work with agile teams to perform code reviews and participate in planning and design sessions
- Work with the QC team to ensure product quality
- Analyze structured and unstructured datasets to identify important feature sets
- Implement data governance processes to manage the life cycle of datasets
- Work with the operations team to automate builds and deployments from DEV to PROD
Qualifications
- 3+ years of experience with data warehouse and data processing technologies such as Spark, Snowflake, Kafka, Presto, and Hive
- 3+ years of experience architecting and building scalable data platforms that process data at multi-terabyte scale
- 2+ years of experience with AWS services such as EMR, Athena, Lambda, SQS, API Gateway, and Kinesis
- A love for data and a talent for finding nuances and trends in large datasets
- Must have experience with data modeling, ETL, and database design
- In-depth knowledge of SQL concepts and experience writing SQL (T-SQL or PL/SQL)
- Understanding of modern data structures and business intelligence reporting tools, with a track record of applying them on the job
- Working knowledge of data structures, algorithms, and probabilistic data structures, and the ability to apply them to the problems at hand
- Familiarity with NoSQL databases such as Cassandra, DynamoDB or MongoDB
- Experience with at least one stream-processing framework such as Kafka, Kinesis, Samza, Flink, or Storm
- Experience with object-oriented design principles
- Experience with at least one programming language (Java, Python, or Scala)
- Ability to manage numerous requests concurrently, prioritize effectively, and deliver
- Strong communication skills and a dynamic team player
Brivo is the original innovator of cloud-based physical security solutions for commercial buildings. Our mission is to make the world a safer place by providing a subscription-based service for securing buildings using reliable, convenient, scalable, cyber-hardened technology. Every day this mission becomes more important as our world becomes more complex and more divided, and the nature of threats evolves faster than most observers can follow. Join a team that’s passionate about the business, its values, and building simply better security.
Brivo is an Equal Opportunity/Affirmative Action Employer