Job description

About Félix

Félix is a chat-based platform that enables Latinos in the US to send money home, pioneering remittance services via WhatsApp. We combine blockchain and artificial intelligence to disrupt how remittances are done today and build the future of cross-border payments.

By joining Félix, you will be part of the most innovative company in the cross-border payment industry. We recently received investment from top VCs in Silicon Valley, Europe, and Latin America, and we won blockchain and AI innovation and application awards at the Wharton Business School. You will join us on the journey to build the financial platform that becomes the companion for all Latinos in the United States!

You will work closely with cross-functional teams to support data integration, transformation, and analytics, contributing to the success of our data-driven initiatives.

Responsibilities

  • Data Architecture: Design and implement robust, scalable, and cloud-based data architectures, leveraging Google Cloud Platform
  • Data Integration: Develop ETL (Extract, Transform, Load) pipelines to extract data from various sources, transform it for analysis, and load it into the data warehouse
  • Data Modeling: Create and maintain data models that support business intelligence, analytics, and reporting needs
  • Data Quality: Implement data cataloging, metadata management, and quality checks, and ensure data accuracy, consistency, and completeness using services like Dataplex
  • Data Security: Implement security and compliance measures to protect sensitive data and ensure data privacy, using techniques such as encryption (at rest, in transit, and in use) and data masking
  • Performance Optimization: Identify and resolve performance bottlenecks, ensuring optimal data processing and storage
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide data support for analytics and reporting
  • Documentation: Maintain comprehensive documentation of data pipelines, processes, and systems for knowledge sharing and troubleshooting
  • Automation: Implement automation and monitoring tools to streamline data operations and detect and resolve issues proactively

Requirements

  • Proven experience as a Data Engineer or similar role
  • Strong knowledge of GCP or AWS data solutions
  • Proficiency in programming languages (e.g., Python, Java, Scala) for ETL processes
  • Experience with data warehousing using Google BigQuery, and with data modeling
  • Excellent problem-solving skills and attention to detail
  • Strong communication and teamwork skills

What We Offer

  • Competitive salary
  • Initial stock options grant
  • Annual bonus based on performance
  • Remote work environment
  • Flexible PTO
  • Paid parental leave
  • Empowering opportunities for growth in a dynamic entrepreneurial environment
