Data Engineer – Streaming and Real-Time Analytics

Location: Basel, BS

*** Mention DataYoshi when applying ***

We are looking for an expert in data engineering, with a focus on real-time and streaming analytics, to accelerate the implementation of our next-generation data and analytics systems.

Your contribution to our mission

We collect and analyse a large array of data and statistics on global financial systems and micro-economic activity. From this, we generate a variety of insights that are used and relied upon by central banks, financial regulatory institutions and academia around the world. The use of data and analytics is also vital to offering better banking services to central banks and to increasing profitability.

But we know that our full potential is yet to be achieved. Our aim is to collect a wider variety of data and harness new technology to generate more powerful insights to deliver better service, gain efficiency and agility. This is a chance to be involved in a step change in our capability.

A collaborative role that transforms our use of data

You’ll join ITS Data & Analytics, the team responsible for developing, maintaining and supporting data and analytics platforms and systems for, and with, lines of business. The team is tackling a range of complex software and data challenges, including big data, data warehousing, advanced analytics, real-time analytics, business intelligence and data governance.

You will help us improve the capabilities of the analytics platform in the real-time analytics space. You will design and implement the streaming platform that will become the bank’s enterprise-wide data backbone, and your expertise in stream processing will be leveraged to offer advanced real-time analytics capabilities to the various business areas.

Your qualifications and experience

You will have a graduate degree in computer science or another IT-related technical field, with at least five years of experience in software development in the data engineering domain, building and supporting large-scale data pipelines. Above all, you will have a passion for data and analytics. You will fulfil the following role in the Bank’s central IT unit (ITS):

Data Engineer – Streaming and Real-Time Analytics

You will focus on a variety of data management and analytics projects. The aim is to deliver data platforms and systems that will empower the Bank’s data and analytics transformation. For that, you will need a broad set of data engineering skills and experience with agile software development methodologies.

Typical responsibilities for the data and analytics engineer role include:

  • Design and implement frameworks to handle real-time and near-real-time data use cases in the bank.
  • Develop complementary features to support strict SLA guarantees for end users: optimised resource allocation, enhanced monitoring capabilities, resiliency, ease of support, governance and multitenancy.
  • Improve the observability and understandability of the various components.
  • Support the infrastructure for greater scalability, high availability, security and resiliency.
  • Identify, design and implement internal process improvements: automating manual processes, optimising data delivery and enhancing the overall platform capabilities.
  • Establish best practices and standards in the team around data modelling, code quality, TDD, DevOps and DataOps.
  • Work with other teams in the bank to increase adoption of the framework.

The roles span multiple projects that aim to modernise the Bank’s data management infrastructure: building a new modern data warehouse and data lake architecture, offering innovative data lab environments for self-service analytics, enhancing capabilities to support machine learning and AI use cases, and developing a data transfer backbone to support event processing and streaming. Based on your experience, skills and interests, you will be considered for an appropriate project role.

You will have the following qualifications to succeed in this role:

  • 8+ years of experience in designing and developing data management and analytical systems.
  • 4+ years of experience building and supporting streaming data pipelines, stream processing and event-driven architectures.
  • Advanced knowledge of the internals of Apache Kafka (or similar), Apache Spark Streaming (or similar stream processing engines), Kafka Connect and distributed systems (HDFS, MapReduce).
  • Working experience in developing metadata-driven frameworks for data ingestion, data transformation and data quality.
  • Good understanding of standard data engineering practices, data warehousing, SQL and ETL.
  • Proficiency in at least one major programming language (e.g. Java, Python) and the ability to pick up new technical skills independently.
  • Hands-on experience with schema design and data modelling, and an interest in elegant and intuitive dataset design.
  • Knowledge of service-oriented architecture and experience with API creation and management technologies (REST, SOAP, etc.).
  • Experience with on-premises big data distributions (Cloudera or Hortonworks).
  • Experience with agile methodologies and Scrum.
  • Excellent verbal and written communication skills, with the ability to explain complex technical concepts in layman’s terms.

The BIS is fully committed to equal opportunity employment and strives for diversity among its staff.
