Stefanini is looking for a Big Data Engineer (Remote for now)
Stefanini is looking for a Big Data Engineer to join the Advanced Data and Analytics Capabilities Team.
We are a team based out of San Francisco that partners with business lines to deliver big data and advanced analytics products and solutions.
In this role, you will have the opportunity to contribute to several high-quality data solutions and enhance your technical skills across many disciplines.
- Design, develop, and maintain end-to-end data solutions using open source, modern data lake, and enterprise data warehouse technologies (Hadoop, Spark, Cloud, etc.).
- Contribute to multiple data solutions throughout their entire lifecycle (conception to launch).
- Partner with business stakeholders to understand and meet their data requirements.
- Design, build, and maintain machine learning data pipelines.
- Maintain security in accordance with Bank security policies.
- Participate in an Agile development environment.
- Cloud/AWS experience is a nice-to-have.
- Bachelor’s degree in Computer Science, Engineering, or Information Management (or equivalent).
- 5+ years of relevant work experience.
- Professional experience optimizing machine learning workflows and maintaining data pipelines.
- Hands-on experience with a variety of big data (Hadoop / Cloudera, Cloud, etc.) and machine learning (Spark, AWS SageMaker, etc.) technologies.
- Experience with object-oriented programming languages: Java (required), Python (required), etc.
- Advanced knowledge of SQL and experience with relational databases.
- Experience with UNIX shell scripts and commands.
- Experience with version control (Git), issue tracking (Jira), and code reviews.
- Proficient in Agile development practices.
- Ability to clearly document operational procedures and solution designs.
- Ability to communicate effectively (both verbal and written).
- Ability to work collaboratively in a team environment.
- Ability to balance competing priorities and expectations.