Join us in building a data platform from the ground up for data products within SEB's Large Corporates & Financial Institutions area. As a new team, we're at the forefront of innovation and change. Our goal is to leverage data to drive value and insights in the banking and finance industry.
- Design and implement end-to-end data pipelines, emphasizing Python and/or Scala proficiency.
- Ensure seamless data integration, data structures, and data pipelines in cloud environments.
- Establish robust monitoring and alerting systems for the Data Warehouse infrastructure.
- Leverage Infrastructure as Code (e.g., Terraform) for effective cloud resource management.
- Optimize data processing and storage in platforms like GCP or Oracle.
- Utilize version control using Git for collaborative development.
- Apply Java knowledge to enhance specific components of the data engineering pipeline.
- Leverage Dataform for SQL automation and management.
- Demonstrate basic knowledge of Spark for enhanced data processing.
- Focus on Continuous Integration/Continuous Deployment (CI/CD) and automation practices.
- Enforce and champion DevSecOps practices for data security and compliance.
- Bachelor's or advanced degree in Computer Science, Data Science, or related field.
- Proven experience (5+ years) in data engineering roles, with expertise in the specified skills.
- Strong grasp of data warehouse principles and cloud-based data management.
- Familiarity with cloud data platforms such as GCP.
SEB is a solid employer. We have high ambitions to serve our clients and are dedicated to creating an inclusive, friendly, value-driven culture where employees feel appreciated, respected and involved. Our benefits under a contract of employment include health insurance for you and your family, life insurance, an individual pension fund, a sports package, and training, as well as a top-notch office and great coffee.