Job description

Role Description:

Our client is looking for an enthusiastic Data Engineer to help build a data repository in the sustainability domain. The person hired for this role will build out and maintain the repository on Snowflake.

Daily and Monthly Responsibilities:

  • Demonstrate expertise in data modeling, physical and logical schema design, and ELT using Snowflake and SQL.

  • Deploy fully operational data warehouse solutions into production on Snowflake.

  • Support the design of data strategy and governance.

  • Solve problems by breaking complex issues into manageable pieces, and deliver successfully in a high-pressure, quick-turnaround environment.

  • Write fully functioning Python code to support various data-related workloads.

  • Apply a good understanding of data warehousing, ELT, ETL, OLTP, and OLAP.

  • Build ETL/ELT pipelines that take data from various operational and transactional systems and create a unified dimensional (star schema) data model for analytics, reporting, and delivery; see the sketch after this list.

  • Take an active part in all aspects of development, including reviewing work done by peers.

  • Collaborate within and across multiple teams, as necessary.

  • Be a strong cross-functional team player with a demonstrated ability to coordinate application changes in a diverse, fast-paced work environment.
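
To make the ELT responsibility above concrete, here is a minimal Python sketch of one load-and-transform step on Snowflake. It assumes the snowflake-connector-python package; the credentials, stage, and table names (RAW_STAGE, STG_EMISSIONS, DIM_DATE, FACT_EMISSIONS) are hypothetical placeholders, not part of the posting.

    # Minimal ELT sketch: bulk-load staged files, then transform into a
    # star-schema fact table. All object names are hypothetical placeholders.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="SUSTAINABILITY",
        schema="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        # EL step: copy raw CSV files from an external stage into a staging table.
        cur.execute(
            "COPY INTO STG_EMISSIONS FROM @RAW_STAGE/emissions/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # T step: upsert the fact table, keyed against a date dimension.
        cur.execute("""
            MERGE INTO FACT_EMISSIONS f
            USING (SELECT d.date_key, s.site_id, s.co2_tonnes
                   FROM STG_EMISSIONS s
                   JOIN DIM_DATE d ON d.calendar_date = s.reading_date) src
            ON f.date_key = src.date_key AND f.site_id = src.site_id
            WHEN MATCHED THEN UPDATE SET f.co2_tonnes = src.co2_tonnes
            WHEN NOT MATCHED THEN INSERT (date_key, site_id, co2_tonnes)
                 VALUES (src.date_key, src.site_id, src.co2_tonnes)
        """)
    finally:
        conn.close()

In practice a step like this would be parameterized and run by an orchestrator rather than hard-coded.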


Required Skills / Abilities:

  • 3-5 years of total experience, including 2-4 years of relevant experience working as a Data Engineer

  • Experience working on data warehouse technical architectures, ETL/ELT, reporting/analytical tools, and data security

  • Working knowledge of developing data pipelines using Python.

  • Experience working with different Snowflake offerings (SnowSQL, Snowpipe, bulk loading, Shares)

  • Experience working with structured and semi-structured data (illustrated in the sketch after this list)

  • Solid fundamental skills in data structures and algorithms

  • Exposure to cloud-based services (AWS preferred)

  • Excellent communication skills when collaborating with distributed development teams

  • Well versed in Scrum/Agile delivery methodology

  • Knowledge of a version control system (e.g., SVN, Git).
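
As a small, hypothetical illustration of the semi-structured data point above, the following Python sketch queries JSON stored in a Snowflake VARIANT column and flattens nested readings into rows. The table and column names (RAW_SENSOR_READINGS, payload, readings) are invented for the example.

    # Hypothetical sketch: query semi-structured (JSON) data in a VARIANT column.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="SUSTAINABILITY",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Extract typed fields from the VARIANT payload and flatten the nested array.
        cur.execute("""
            SELECT r.payload:site_id::STRING AS site_id,
                   f.value:metric::STRING    AS metric,
                   f.value:amount::FLOAT     AS amount
            FROM RAW_SENSOR_READINGS r,
                 LATERAL FLATTEN(input => r.payload:readings) f
        """)
        for site_id, metric, amount in cur:
            print(site_id, metric, amount)
    finally:
        conn.close()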


Required Education and Experience:

  • BE/B Tech/M Tech/MCA in Computer Science/Information Technology or equivalent from a reputed college, with 2-4 years of experience as a Data Engineer

  • Prior experience working with Snowflake is mandatory

  • Prior coding experience in Python is mandatory

  • Prior experience in data warehousing and ETL development is mandatory

  • Motivated team player who goes above and beyond what is asked

  • The attitude to thrive in a fun, fast-paced, start-up-like environment

  • You love writing and owning code and enjoy working with people who will keep challenging you at every stage.

  • You have strong problem-solving, analytical, and decision-making abilities, with excellent communication and interpersonal skills.

  • You are self-driven and motivated with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities
