Shopee

Data Engineer - Payments Team

Job description

Department: Operations
Level: Experienced (Individual Contributor)
Location: Brazil - São Paulo

The Operations team at Shopee covers the operational end-to-end process, from the moment a buyer searches for a product listed on the Shopee platform to the moment the buyer receives it. The team analyzes and monitors operational KPIs across the region and conducts root cause analysis when operational performance fluctuates. The Operations team comprises Customer Service, Payment, Listings, Warehouse, Logistics, Seller Operations, and Fraud. Browse our Operations team openings to see how you can make an impact with us.

Responsibilities:
  • Evaluate business needs and goals, and how company data relates to them.
  • Apply software engineering concepts and good practices regarding structured thinking, reusability, and modularization. Communicate ideas in an organized form using documentation and diagrams such as flowcharts, UML, and E-R models.
  • Collaborate with Data Analysts in a DataOps-like workflow, collecting feedback and developing ways to optimize value extraction from data.
  • Build data pipelines and process automation routines, integrating different data layers, using Python, Airflow and SQL.
  • Identify opportunities for data acquisition.
  • Analyze and organize raw data, combining raw information from different sources.
  • Interpret trends and patterns in collected data.
  • Explore ways to enhance data quality.
  • Execute basic data analysis and report on the results of ELT processes.
  • Prepare data for prescriptive and predictive modeling.
  • Know and apply Software Engineering best practices when developing data pipelines.
  • Have knowledge of analytical tools integrated with data pipelines (dbt and Metabase).

Requirements:
  • Degree in Computer Science, Computer Engineering, IT, or similar field
  • Python, SQL, and shell scripting; code versioning and change management using Git
  • Python libraries: pandas, pyarrow, jinja2, database connectors like psycopg2 and pyhive
  • File handling using Python: csv, Excel, Google Sheets, parquet
  • Python packaging strategies using virtual environments
  • Developing process automation and data pipelines (ELT) using Airflow
  • Knowledge of analytics workflows using dbt
  • Relational databases, E-R modeling
  • Linux and Docker containers
  • Communicating with data APIs using Python
  • Hadoop and Hive concepts and architecture
  • Data security concepts
  • Advanced English; Spanish is a bonus.
