Data Engineer - Python, Snowflake, Airflow - 700 per day - 6 Month Contract
Rate: 700 per day
Location: Remote
Duration: 6 months - Possible extension
IR35: Outside
The Role
A rewarding opportunity has arisen for a Data Engineer looking to get involved with a worldwide Financial Organisation based in London. This opportunity offers a fully remote initial 6-month contract with the possibility of an extension upon review.
The successful Data Engineer will support the data integrations into and from a new ERP/Finance system. The Data Engineer will have a strong background in Snowflake and be experienced in deploying pipelines using Python and Pandas.
Find below the experience requirements for this Data Engineer role:
Essential
- Degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer, with a strong focus on Python, Snowflake, and Airflow.
- Solid understanding of data modelling, ETL/ELT processes, and data warehousing concepts.
- Previous experience developing and maintaining scalable data transformation pipelines.
- Hands-on experience with Airflow or similar workflow management tools.
- Experience working with business product owners and collaborating with data analysts and data scientists.
- Strong problem-solving skills and the ability to work on multiple projects simultaneously.
Desirable
- Experience with other data engineering technologies such as Apache Spark, Kafka, or Hadoop.
- Familiarity with data visualization tools like Tableau, Power BI, or Looker.
If this Data Engineer position sounds like it could be of interest and you would like to apply, please send your updated CV.
THE DEADLINE FOR THIS APPLICATION IS 7th July 12pm