Our Core Values:
Users come first
Build a better product, not just different
Do less but get more done
Always be learning
We want to make education and work more efficient and enjoyable by providing the best digital paper solution possible. We plan to be the go-to tool for all forms of notes.
Are you keen on building more than just data pipelines? Shape GoodNotes' entire data strategy and tooling! You will contribute to a variety of initiatives and technologies, such as analytics, ML modeling, tooling, and services, within a team of ML experts passionate about knowledge sharing.
What you will achieve:
Help democratize data and make it broadly available, both within GoodNotes and to our users around the world
Design and build robust infrastructure and tools for collecting and processing data using batch and stream processing frameworks
Collaborate with ML, MLOps, and QA teams to identify opportunities to improve our data architectures
Help evangelize high-quality software engineering practices for building data infrastructure and pipelines at scale
Continuously improve the development practices through research, automation, documentation, and testing
Share your knowledge and experience with the rest of the team
What you need to be successful:
Experience building highly available and scalable data infrastructure on AWS or other cloud providers
Experience with performance tuning, such as storage planning, caching, index design, and data partitioning
Deep understanding of data architectures and schema design
Comfortable in writing SQL and/or DataFrame APIs
Mastery of at least two programming languages: one of Java/Scala/Kotlin, plus Python
Experience with data workflow management platforms such as Airflow, Dagster, or Prefect
Hands-on experience building data pipelines using one or more of the following big data frameworks or services: Spark, Kafka, Flink, Druid, ClickHouse, Pinot, or the ELK stack
Strong understanding of computer science fundamentals and a solid background in software engineering
What else would help you, but is not required:
Knowledge of machine learning and data science
Experience with Kubernetes, Docker, Terraform, or other infrastructure and cluster management solutions on AWS or other cloud providers
You'll receive competitive compensation and meaningful equity, along with a chance to make a significant contribution to a product people already love.
Most of our positions are eligible for remote work, provided you have at least 3 hours of overlap with the in-office team every weekday between 10 AM and 6 PM. Please indicate your preference in your application form.
You're also welcome to join us in our Hong Kong or London office; we can sponsor visas and support relocation.
We take care of you and your loved ones with medical insurance and flexible working hours.
Join our best company tradition, the annual off-site. Check out our pictures from team outings and more on our Instagram.