Are you a Big Data Engineer with strong coding skills in Python and PySpark, Java/Scala, SQL and/or SAS? Are you passionate about building complex ETL tools and pipelines? Then this exciting role as a Big Data Engineer, based in Lausanne, might just be perfect for you.
On behalf of our client, a global investment bank with offices across Europe and the US, Swisslinx is currently searching for an experienced Big Data Engineer / Data Scientist with a focus on strong coding skills and data analysis. This contract role can start as soon as possible and is planned to run at least until the end of this year (31.12.2021), with the possibility of extension to a long-term contract.
As a Data Scientist / Backend Developer you will be assigned to the CRO Data & Reporting Foundation program, working with the rest of the team on implementing core infrastructure functions for data sourcing and processing (including DQ controls, data adjustments, etc.) on the strategic infrastructure, relying mostly on Palantir's Foundry platform as previously deployed at the bank for Compliance.
The role involves:
- Close engagement with CRO Change project teams and the RDM BAU team to collect requirements for the infrastructure capabilities needed to implement fully functional CRO reporting applications and enable digitalized BAU processes, while complying with the regulatory rule set for Risk Reporting
- Supporting a dedicated squad in specifying user stories and managing the implementation backlog; implementing solutions, as well as performing UAT of delivered solutions
- The opportunity to become part of our highly motivated Analytics team focusing on building and driving new technologies in various fields
- A challenging role as a Data Scientist in a dynamic, international and fast-paced environment using the latest innovations in predictive analytics and visualization techniques
- The chance to engage with senior Business users in automating regulation and compliance to build a state-of-the art regulatory reporting and analytics infrastructure
- You will design and develop algorithms, build prototypes, run multiple validations with business experts and help build products
Essential Skills and Qualifications:
- At least 8 years of overall experience, including at least 4 years of coding in at least one of the following: Python, Java/Scala, SQL and/or SAS
- Preferably a PhD or Master's degree in Computer Science, Data Science or a quantitative field (Statistics, Mathematics, Economics)
- Proficient in at least one of the following: Python, Java/Scala, SQL and/or SAS
- Experience with relational database programming and distributed data processing at scale using Spark and/or Hive
- A high level of confidentiality, integrity and responsibility when dealing with critical and confidential data
- You feel at ease assembling data sets from disparate sources and analysing them using appropriate quantitative methodologies, computational frameworks and systems
- Fluency in English (written and spoken); German or French would be a plus
Desired Skills and Qualifications:
- Excellent people management abilities and good communication skills that enable you to collaborate effectively and build professional relationships, even with senior-level executives
- As a demonstrated standout colleague, self-starter and independent thinker, you are willing to work in a highly collaborative environment and contribute to the team's success
- You are willing to assume additional responsibilities and become a respected knowledge carrier within the team
Does this role sound exciting? If so, send us your CV today! I am also happy to answer any open questions about the role.