Senior Data Engineer - Snowflake

Job description

This position is based in Bangalore, India.

Calix is undergoing a growth transformation, and we are looking for the best and brightest engineers for our Data Engineering team. Our team facilitates Calix’s transformation into a more data-centric enterprise, partnering with business operational leaders to identify key decision points. We create decision support tools that enable optimal, data-driven selection of business actions, and we build and maintain these tools on a modern data technology stack using DataOps processes. We are building the data foundations for the next phase of our growth journey. This is a great opportunity to join a rapidly scaling enterprise with significant room for personal growth.

The Data Engineering team is seeking a Senior Data Engineer who will be an extraordinary addition to our growing team. You will build and maintain our cloud-based enterprise data platform, owning key areas including data architecture, data modeling, data pipeline flow, data warehousing, security and governance protocols, data integrity processes, and data QA best practices. You will lead the buildout of the end-to-end ETL/ELT data environment, the integration of new technologies, and the development of new processes that support the creation and deployment of trusted, accurate, and secure decision support tools for Calix’s operational business units.

The ideal candidate will have outstanding communication skills, proven data infrastructure design and implementation capabilities, strong business acumen, and an innate drive to deliver results. They will be a self-starter, comfortable with ambiguity, who enjoys working in a fast-paced, dynamic environment.

Responsibilities and Duties:

  • Lead data modeling, data ingestion, ELT/ETL, and data integration development using our cloud-based tooling including Snowflake, AWS, Fivetran, dbt, Airflow and GitHub.
  • Establish and maintain a DataOps approach for our data pipeline infrastructure and processes.
  • Create automation systems and tools to configure, monitor, and orchestrate our data infrastructure and our data pipelines.
  • Deploy production machine learning pipelines into business operations analytic tooling.
  • Ensure data quality throughout all stages of acquisition and processing.
  • Create and maintain secure and governed access to the enterprise data warehouse and reporting tools.
  • Evaluate new technologies for continuous improvement in data engineering.
  • Participate in project meetings, providing input to project plans and providing status updates.

Required Qualifications

  • A desire to work in a collaborative, intellectually curious environment.
  • A highly motivated self-starter with a bias to action and a passion for delivering high-quality data solutions.
  • 5+ years of experience in a related field, preferably building and delivering data pipelines, data lakes, and ELT solutions at scale.
  • Expert knowledge of data architecture, data engineering, data modeling, data warehousing, and data platforms.
  • Experience with Snowflake, BigQuery, Redshift, AWS, and pipeline orchestration tools (Fivetran, Stitch, Airflow, etc.).
  • Coding proficiency in at least one modern programming language (Python, Java, Ruby, Scala, etc.).
  • Deep SQL expertise.
  • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, operations, and technical documentation.
  • Excellent verbal and written communication skills and technical writing skills.
  • Strong interpersonal skills and the ability to communicate complex technology solutions to senior leadership to gain alignment and drive progress.
  • Bachelor’s degree or equivalent experience in Computer Science, Engineering, Management Information Systems (MIS), or related field.

Preferred Qualifications

  • Experience with dbt SQL development environment.
  • Experience developing and deploying machine learning models in a production environment.
  • Experience with Power BI and/or SFDC Einstein/Tableau.
  • Experience with Oracle ERP and Oracle Data Cloud tools.

