Data Engineer

Location: Buenos Aires, Buenos Aires

*** Mention DataYoshi when applying ***

A Data Engineer at MightyHive, with a thorough understanding of cloud technologies, helps deploy Big Data solutions and build and manage data pipelines, ETL processes, and cloud data migrations in a secure, scalable, production-ready manner. The ideal candidate is a data specialist with strong experience in deploying and automating data pipelines, data development, data cleansing, data warehousing, and monitoring data processing systems, and in handling large and varied sets of structured and unstructured data. The data engineer will have experience ingesting real-time data from various sources and designing new data platforms.

Responsibilities
  • Design and build data pipelines on a cloud platform.
  • Manage and provision the cloud solution infrastructure.
  • Design for data security and compliance.
  • Manage and automate ETL and cloud deployment implementations.
  • Ensure solution and operations reliability.
  • Design and implement Big Data and Data Warehousing solutions with their corresponding Data Governance processes.
  • Manage cloud databases.
  • Provide domain expertise on public cloud and enterprise technology.

MightyHive is the leading data and digital media consultancy that helps marketers take control. MightyHive delivers sustained results from the ground up through advisory for business transformation, privacy-first data strategy, and digital media services. The company is headquartered in San Francisco, with a team of consultants, platform experts, data scientists, and marketing engineers in 19 countries and 24 cities around the world. In 2018, MightyHive merged with S4Capital plc (SFOR.L), a tech-led new age/new era digital advertising and marketing services company established by Sir Martin Sorrell.


Required skills and qualifications
  • Demonstrable deep knowledge and experience in cloud migration, cloud strategy and transformation, and cloud architecture and engineering.
  • Broad knowledge of the major cloud vendors, with deep knowledge of GCP.
  • One or more GCP certifications: Cloud Architect, Data Engineer, or Cloud Engineer.
  • Understanding of the high-level levers for cost-effective cloud delivery.
  • Deep hands-on experience with cloud orchestration tooling and infrastructure as code (Terraform, Chef, Ansible, or Puppet).
  • Deep hands-on experience building scalable ETL pipelines (Apache Beam, Airflow).
  • Programming experience in Python and JavaScript.
  • Technical depth and experience with SQL.
  • Strong problem-solving or analytics experience.

Preferred qualifications
  • Familiarity with the digital marketing context.
  • Knowledge of or exposure to Adobe Marketing Cloud or Adobe Analytics.
  • Certifications or work experience with another cloud vendor (AWS or Azure).
  • Technical depth and experience with Linux/Unix administration.
  • Knowledge of enterprise computer networks and VPC networks.
  • Knowledge of IT and cloud security solutions.

WHAT DO WE EXPECT FROM YOU?
  • That you show initiative and feel encouraged to make innovative proposals.
  • That you don't wait for orders and take charge of the situation.
  • That you prioritize team success over your ego.
  • That you can find explanations and answers to problems you are facing for the first time.


WHAT DO WE OFFER YOU?
  • A teamwork culture with many professional opportunities in a family-like environment.
  • Health insurance.
  • Training.
  • Home office.
  • An excellent, relaxed work environment, with drinks and snacks, fruit in the office, celebration of special dates, and more.

