Corporate culture and guiding principles:
TP ICAP is a global firm of professional intermediaries that plays a pivotal role in the world’s financial, energy and commodities markets.
Operating through our core businesses, Tullett Prebon, ICAP, PVM, Mirexa Capital, Tullett Prebon Information, ICAP Information Services and PVM Data Services, we create strong networks in person and through technology. We provide comprehensive analysis and insight into market conditions and long-term trends, and we combine data, knowledge and intelligence into contextual insight and commercial guidance. By engaging with our clients and providing innovative products and services, we enable them to transact with confidence, facilitating the flow of capital and commodities around the world, enhancing investment and contributing to economic growth.
We are known in the market for our Honesty, Integrity, and Excellence in the provision of service to our clients. Above all else, we Respect our clients and each other, without bias. Employees are expected to uphold the values and principles of our cultural framework in the performance of their job duties.
Role responsibilities:
- Work with our Global and Regional Support teams to provide second-line support for existing Asia-based data applications and feeds.
- Work alongside Product, Data Science and Business Analyst colleagues to scope, build and maintain new data feeds and applications that cater to Asia business requirements.
- Create and maintain technical support documentation.
- Collaborate with Data Science and Development teams at a global level to help maintain and support application rollouts in the Asia time zone.
Requirements Gathering and Business Analysis:
- Produce clear business requirements and functional specification documentation when required.
- Define MVPs from a wish list of requirements.
- Strong written and verbal communication skills, including the ability to communicate effectively with both business and technical teams.
- Communicate regularly with the wider technology teams to coordinate interdependencies and resolve issues.
Requirements for the role:
- Bachelor's degree in computer science, engineering, mathematics, or a related technical discipline.
- 2+ years of industry experience in data engineering.
- Strong programming experience with C#, Python and SQL; other languages such as Java or C++ are also useful.
- Able to write clean, scalable and performant code.
- Experience with ETL and event streaming (e.g. Kafka).
- Ability to take ownership and build effective end-to-end solutions.
- Confident with Linux and the command line.
- Snowflake, Kubernetes and Airflow experience is desirable.
- Experience with Amazon Web Services (AWS) or Google Cloud Platform (GCP) would be beneficial.
- Experience with the FIX protocol is a plus.