Pharo Management is a leading global macro hedge fund with a focus on emerging markets. Founded in 2000, the firm has offices in London, New York and Hong Kong and currently manages approximately $7 billion in assets across four funds. Pharo trades foreign exchange, sovereign and corporate credit, local market interest rates, commodities, and their derivatives. We trade in over 70 countries across Asia, Central and Eastern Europe, the Middle East and Africa, and Latin America, as well as in developed markets. Our investment approach combines macroeconomic fundamental research with quantitative analysis.
Pharo employs a diverse, dynamic team of 125 professionals representing over 20 nationalities and 30 languages. We have a strong corporate culture anchored in core values such as collaborative spirit, creativity, and respect. We are passionate about what we do and are committed to attracting the best and brightest talent.
This is a great opportunity to join a top-performing firm with a collaborative culture and contribute to our continued success.

Job Description
Pharo is seeking an experienced Data Engineer in our London office to expand our data platform, including data pipelines, data warehousing, consumption APIs, and business intelligence tools. You will participate in a development life cycle closely aligned with business objectives, enhancing our enterprise data platform to deliver strategic insights from our investment and risk data. Your work will directly support the front office teams.
Pharo is evolving into a cloud-first engineering organization, adopting new infrastructure on Azure, including Data Factory, SQL, Blob Storage, and other PaaS solutions. In this role, you will collaborate with business teams such as Portfolio Managers, Quants, Risk, Economic Research, and Operations. Your responsibilities will include scaling our data infrastructure, developing API and reporting solutions, and promoting broader data access. The ideal candidate will have a strong hands-on technical background, with expert knowledge of dimensional and relational data modeling, data pipelines, and BI tools. Excellent communication and presentation skills, along with effective time management, are essential.

Areas Of Responsibility

- Collaborate with technical and non-technical stakeholders to determine requirements and implement solutions for data analysis, data onboarding, and business intelligence projects across Portfolio Management, Risk, Quant, Economics, and Front Office teams.
- Develop data pipelines to extract, load, and transform data at enterprise scale across Azure cloud services, the Snowflake data warehouse, and the DBT framework.
- Contribute data architecture designs to support API consumption and metrics layer solutions, streaming data ingestion, and machine learning frameworks.
- Build subject matter expertise across the various business teams to gather requirements and support the investment process.
- Monitor and triage data processes to ensure continuity of the data platform and meet business requirements.

Skills And Experience Required

- 7+ years’ experience as a Data Engineer within financial services, handling datasets across batch files, APIs, and streaming from internal and external sources.
- Experience with Snowflake and DBT to onboard new datasets, develop data models, and build data transformation pipelines.
- Experience with data orchestration and extract/load pipelines using tools such as Azure Data Factory or Airflow.
- Demonstrated strength in data modeling, data warehousing concepts, and query optimization for large complex datasets.
- Strong SQL and Python development skills, along with software engineering best practices including agile methodologies, DevOps, Git, and CI/CD pipelines.
- Cloud experience with Azure services such as Functions, Key Vault, and Logic Apps is a big plus.
- Industry domain knowledge of Security Master, IBOR, and Portfolio Management is a big plus.
- Experience building APIs to serve data is a plus.