Pleased to meet you, we are Zallpy.
We are much more than a technology company; we are a diverse, plural, and talented community. Our purpose is to lead digital transformation with excellence and agility, promoting mutual and genuine growth through ethical, long-lasting relationships. Flexibility is one of our trademarks: we operate across different team models and formats while maintaining our light, collaborative, and integrated culture, and we provide equitable opportunities in a space where everyone feels safe and heard.
What You Will Do
- Design, implement, and manage scalable data pipelines and ETL processes on GCP.
- Develop and maintain robust data architectures using GCP services such as BigQuery, Cloud Storage, and Dataflow.
- Set up and configure storage solutions for the data lake and data warehouse.
- Build ETL pipelines to engineer and wrangle data from various sources.
- Implement and manage workflow management systems for orchestrating complex data processes.
- Optimize and tune data processing and storage solutions for performance and cost-efficiency.
- Ensure compliance with data governance and regulatory requirements.
- Implement and enforce data security best practices.
- Ensure data quality and integrity through comprehensive testing and validation.
- Ensure that data solutions are reliable, scalable, and effective in production environments.
- Integrate data solutions with BI tools like Looker and Data Studio for visualization and reporting.
- Collaborate with data engineers, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
- Continuously expand data integration to additional sources and improve data processes and performance.
What We Are Looking For
- Advanced/fluent conversational English.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Extensive experience in data engineering with a focus on cloud technologies.
- Expertise in Google Cloud Platform (GCP) and its services, including BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Cloud Functions.
- Proficiency in SQL and programming languages such as Python or Java.
- Experience with data modeling, ETL/ELT processes, and data warehousing concepts.
- Strong understanding of data architecture and data governance principles.
- Ability to optimize and tune data processing and storage solutions for performance and cost-efficiency.
- Proven ability to ensure data quality and integrity through comprehensive testing and validation.
- Experience with workflow management tools (e.g., Apache Airflow, Cloud Composer).
- Knowledge of data security practices and compliance requirements.
- Experience integrating with BI tools like Looker and Data Studio.
- Experience in building scalable and reliable data solutions in production environments.
- Excellent problem-solving skills and attention to detail.
- Ability to work effectively in a collaborative, fast-paced environment.
Where You Will Work
- This is a 100% remote position.
Our Benefits Include
- 100% remote work;
- Meal and/or food allowance in a flexible model (EVA card);*
- Unimed health insurance for employees and dependents;*
- Uniodonto dental insurance for employees and dependents;*
- Agreements with educational institutions for discounts on undergraduate, postgraduate, and short courses;
- Totalpass to take care of physical health;
- Zenklub to take care of mental health;
- Life insurance;*
- Daycare assistance for zallpers who earn up to three times the category minimum wage and have children aged 4 months to 6 years;*
- Baby Zallpy: a gift to celebrate the birth of zallpers' babies;
- Communities: we support the operation of three volunteer zallper communities: Diversity, Equity & Inclusion; Sports & Movement; and Technology.
*Benefits marked with an asterisk are valid for CLT employment only.