Pru – Prudential Polska

Transaction Monitoring Data Analyst

Job description

The Role:

As a Data Scientist, you will be a member of the Fraud Monitoring and Transaction Monitoring team within our operational planning centre of excellence at M&G. In this role you will use advanced analytics and machine learning algorithms to develop solutions that are data-rich and use learning loops to improve. You will have the opportunity to shape and contribute to in-depth rule development across Fraud and Transaction Monitoring (TM) typologies.

This is a key role in the design and delivery of leading-edge applications and insights within an agile environment. You will be responsible for leading development while sharing your experience and approach to support blended client and third-party teams in delivery.


Working across the full project life cycle, you will be expected to be comfortable with everything from requirements gathering and design through development, implementation, and initial support.


Key Responsibilities:

  • Design and build solutions that enable the delivery of fraud monitoring capability across our business
  • Be the team expert on Azure Databricks, and support other team members in upskilling in this area
  • Be the team expert on Power BI, and support other team members in upskilling in this area
  • Support the team's wider activities, assisting in the investigation of suspicious transactions from a data standpoint
  • Understand how complex data sources connect, in order to build in automation and develop the capability to isolate suspicious transactions and behaviours
  • Seek out new sources of data, which can be linked to online/offline transactional or customer data to provide new or enhanced insight to the business
  • Understand and review how other parts of the business are using data and data tools, to see how we can learn, evolve and align
  • Build scalable and repeatable solutions that can be introduced in new areas/journeys/products, in line with our focus on building a centre of excellence
  • This is an individual contributor role in which the candidate is required to work in collaboration with the team
  • Design and build systems that mine massive datasets and structure/engineer the data to be usable
  • Assess data quality
  • Join data from multiple sources (data modelling)


Key knowledge, skills and experience:

  • Must be able to connect our data sets in a way that provides an end-to-end view of the customer journey and, in turn, helps to identify and mitigate fraudulent or suspicious activity.
  • Python skills; should be comfortable with command-line tools
  • Knowledge of the Azure ecosystem – DevOps, Databricks, Data Factory, etc.
  • Expertise in SQL
  • Working knowledge of Power BI/Tableau
  • Must be able to perform ETL tasks and logical/physical data modelling
  • Experience working with distributed systems (e.g. Spark clusters)
  • Experience of working closely with business stakeholders
