AWS Data Engineer

Job description

Department/project description:

Our Insights & Data practice delivers cutting-edge, data-centric solutions.

Most of our projects involve Cloud & Big Data engineering. We develop solutions that process large, including unstructured, datasets using dedicated Cloud Data services on AWS, Azure or GCP.

We are responsible for the full SDLC of the solution: apart from using data processing tools (e.g., ETL), we code a lot in Python, Scala or Java and apply DevOps tools and best practices. The data is either exposed to downstream systems via APIs and outbound interfaces, or visualized on reports and dashboards.

Within our AI CoE we deliver Data Science and Machine Learning projects with focus on NLP, Anomaly Detection and Computer Vision.

Additionally, we are exploring the area of Quantum Computing, searching for practical growth opportunities for both us and our clients.

Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for over 30 clients from different sectors (Financial Services, Logistics, Automotive, Telco and others).

Come on Board! :)

Your daily tasks:

  • design and implementation of solutions processing large and unstructured datasets (Data Mesh, Data Lake or Streaming Architecture),
  • implementation, optimization and testing of modern DWH/Big Data solutions based on AWS cloud platform and Continuous Delivery / Continuous Integration environment,
  • data processing efficiency improvement, migrations from on-prem to public cloud platforms.

Frequently used technologies:

  • AWS (Cloud Platform)
  • Python
  • PySpark
  • Snowflake
  • SQL
  • Terraform

Our expectations:

  • At least 3 years of experience in Big Data or Cloud projects involving the processing and visualization of large and unstructured datasets (across different phases of the Software Development Life Cycle),
  • practical knowledge of the AWS cloud in the Storage, Compute (including Serverless), Networking and DevOps areas, supported by commercial project experience,
  • theoretical AWS cloud knowledge supported with certificates (for example DVA-C01, SAA-C02, SAP-C01, VDS-C01, DAS-C01),
  • familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark,
  • at least basic knowledge of one of the following programming languages: Python, Scala, Java or Bash,
  • very good command of English (German language would be an advantage).

Our offer:

  • Permanent employment contract from the first day,
  • Hybrid, flexible working model,
  • The possibility of applying increased tax-deductible costs for creative work,
  • Co-financing to equip a workplace at home,
  • Development opportunities:
      • Substantive support from project leaders,
      • A wide range of internal and external trainings (technical, language, leadership),
      • Certification support in various areas,
      • Mentoring and a real impact on shaping your career path,
      • Access to a database of over 2,000 training courses on the Pluralsight, Coursera and Harvard platforms,
      • Internal communities (including Agile, IoT, Digital, Security, Women@Capgemini),
      • The opportunity to participate in conferences both as a listener and as an expert,
  • Relocation package,
  • Benefits as part of the social package (including Multisport card, medical care for the whole family, group insurance on preferential terms, cafeteria).

