Where You'll Work
Andela is a network of technology leaders dedicated to advancing human potential. We help companies build high-performing distributed engineering teams by investing in the world's most talented software engineers.
What You'll Do
- Use your knowledge of your core technology to delight our clients around the world.
- Collaborate effectively with peers and stakeholders on fast-paced distributed teams, grounded in constructive feedback, dedication, and mutual respect.
Job Responsibilities
- Participate in cross-functional projects in an agile environment
- Build, deploy, and maintain your own code
- Design, build, and maintain big data processing pipelines for both real-time streaming and batch workloads (a minimal batch sketch follows this list)
- Configure, deploy, manage, and document data extraction, transformation, enrichment, and governance processes in cloud data platforms, including AWS and Microsoft Azure
- Implement and monitor analytics to ensure data health
- Support engineering and business analytics use cases
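For illustration, here is a minimal sketch of what one batch step of such a pipeline might look like in Python, loading cleaned records into Snowflake (both appear under the required skills below). The file path, table name, column names, and connection details are placeholder assumptions, not specifics of the role:

```python
# Minimal batch ETL sketch: extract a CSV, apply a simple transform, and
# load the result into Snowflake. All names below are illustrative.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def run_batch_step(csv_path: str) -> int:
    # Extract: read the raw batch file (placeholder path supplied by caller).
    df = pd.read_csv(csv_path)

    # Transform: drop rows missing a key and normalize a column (example logic).
    df = df.dropna(subset=["event_id"])
    df["event_type"] = df["event_type"].str.lower()

    # Load: append the cleaned frame to a Snowflake table.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="ANALYTICS",  # placeholder database
        schema="RAW",          # placeholder schema
    )
    try:
        _, _, nrows, _ = write_pandas(conn, df, table_name="EVENTS",
                                      auto_create_table=True)
    finally:
        conn.close()
    return nrows
```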
Job Requirements
- Bachelor's degree in Computer Science (or a related field), or 4 years of production experience
- Experience working with large datasets (terabyte scale and growing) and their tooling
- Experience developing complex ETL processes, including SLA definition and performance measurement
- Production experience building, maintaining, and improving big data processing pipelines
- Production experience with stream and batch data processing, data distribution optimization, and data monitoring (a minimal streaming sketch follows this list)
- Understanding of the data lifecycle
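To make the stream-processing side concrete, here is a minimal sketch using PySpark Structured Streaming (Spark is listed under bonus skills below); the input path, schema, and window sizes are illustrative assumptions:

```python
# Minimal streaming sketch: count events per type over 5-minute windows
# as JSON files land in a directory. All names below are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Extract: stream JSON event files from a placeholder landing directory.
events = (
    spark.readStream
    .schema("event_id STRING, event_type STRING, ts TIMESTAMP")
    .json("/data/incoming/")
)

# Transform: windowed counts per event type, tolerating 10 minutes of lateness.
counts = (
    events
    .withWatermark("ts", "10 minutes")
    .groupBy(F.window("ts", "5 minutes"), "event_type")
    .count()
)

# Load: write to the console sink here purely for illustration.
query = counts.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```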
Required Skills:
- Experience with Python
- Experience with Snowflake
- Experience with AWS or Azure
Bonus Skills:
- Experience with Spark
- Experience with Airflow (see the DAG sketch after this list)
- Experience with Snowpipe
- Experience with Scala
- Experience with Node.js
- Experience with Azure Function Apps or AWS Lambda
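As a concrete illustration of the Airflow bonus skill, here is a minimal sketch of scheduling a daily batch step as a DAG (Airflow 2.4+ syntax); the DAG id, schedule, and task body are placeholder assumptions:

```python
# Minimal Airflow DAG sketch: run one batch step once a day.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_batch_step() -> None:
    # Placeholder for the ETL step sketched earlier; a real DAG would
    # import it from the pipeline package.
    ...

with DAG(
    dag_id="daily_events_batch",   # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_events", python_callable=run_batch_step)
```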