Job description

First Digital Finance Corporation (FDFC) is a fintech company building disruptive and innovative products in retail credit for Southeast Asia.


The company operates BillEase, the top-rated buy now, pay later (BNPL) platform for merchants and their customers in the Philippines, with more than 1 million downloads.


Vision: Build financial services that delight and enable consumers


Mission: Use software and AI to build financial products that are at least 10x better than the status quo for our customers


The Role


You Will Be Responsible For


  • Creating predictive and prescriptive models that have a positive impact on the business.
  • Driving all data analytics activities, from conceptualisation and visualisation to operationalisation.
  • Performing statistical analysis on historical data to validate models.
  • Working across stakeholders to identify and capitalise on opportunities to leverage data to drive business solutions.
  • Developing scripts to process structured and unstructured data.
  • Recommending, developing and implementing ways to improve data reliability, efficiency and quality.
  • Supporting translation of data business needs into technical system requirements.
  • Working with stakeholders to understand their needs with respect to data structure, availability, scalability and accessibility.
  • Communicating and supporting the use of the data architecture to all stakeholders.
  • Developing data architecture, strategy and governance.
  • Providing secure, stable, scalable and cost-effective solutions to facilitate storage, integration, usage, access, and delivery of data assets across the business.
  • Implementing real-time analytics use cases on the Hadoop ecosystem.
  • Providing expertise on the various components and features of the Hadoop ecosystem (such as Spark, MapReduce, YARN, Hive, Pig, Impala/Drill, etc.).
  • Designing and setting up Hadoop clusters in line with current and future needs.
  • Developing and troubleshooting on Hadoop technologies.
  • Monitoring the Hadoop cluster for performance and capacity planning.


Ideal Profile


Skills Required


  • You possess an advanced degree in Mathematics, Statistics, Computer Science, or a related field.
  • You have at least 1 year of experience, ideally in a Data Scientist, Data Architect, or Data Engineer role.
  • Ability to see the data picture from an organizational perspective and bridge the gap between the current state and future goals.
  • Demonstrated experience working with and analyzing large, complex data sets.
  • Exposure to or expertise in one or more emerging tools such as columnar and NoSQL databases, predictive analytics, data visualization, and unstructured data.
  • Ability to write and execute complex queries in SQL.
  • Strong expertise in data modeling & database design.
  • You possess strong knowledge of Django, Hadoop, Java, and Python.
  • Expertise in data design and the Hadoop ecosystem (Spark, MapReduce, YARN, Hive, Pig, Impala, Drill) would be highly valuable.
  • You are a strong team player who can manage multiple stakeholders.
  • You enjoy finding creative solutions to problems.
  • You pay strong attention to detail and deliver work that is of a high standard.


What's on Offer?


  • Excellent career development opportunities.
  • Work alongside and learn from best-in-class talent.
  • Work within a company with a solid track record of success.
