Job description

Described as the Uber of Content, Social Native is a marketplace technology company that empowers marketers to create, source and optimize authentic visual content in the most efficient way possible. Leveraging the world's first AI-powered creative platform, brands such as Unilever, Adidas, L'Oréal, Crocs and Nestlé Waters partner with Social Native to improve the performance of their paid and organic social strategy with a combination of Influencer Marketing, Custom Content, and Content Editing solutions.

With our recent acquisition of Olapic, we're changing the way marketers evaluate, refine and optimize their visual content strategy. This move advances our goal of delivering an all-in-one platform that provides brands with data-driven insights, scales content creation, measures the impact of their work, and optimizes content and influencer strategy for even greater results.

As a Data Engineer, you will own our data pipelines and play a key role in ensuring that data is available, consistent, correct and easy to access.

Responsibilities:

  • Create and maintain data pipelines to ensure data availability in the data warehouse.
  • Identify, define and implement data transformations to fulfill functional and performance requirements.
  • Optimize the data infrastructure to satisfy business and technical needs.
  • Understand technical details about our products to implement transformational logic in the data pipelines.
  • Ensure data integrity and consistency in the data warehouse.
  • Communicate with other technical teams to obtain and share knowledge regarding operational impact on data.
  • Build Analytics tools based on the data delivered by the pipelines to provide insights to customers and internal stakeholders.
  • Build ad-hoc reports and dashboards using visualization tools.

Qualifications:

  • 5+ years of experience as a Data Engineer or other technical position centered around data.
  • Advanced SQL knowledge and experience working with relational databases.
  • Strong knowledge of a general-purpose programming language such as Python.
  • Experience building and optimizing data pipelines, infrastructure and architecture.
  • Experience working with scheduling tools like Airflow.
  • Experience with AWS, particularly Redshift.
  • Experience performing root cause analysis and diving into data to identify issues, bugs and improvement opportunities.
  • Experience working with ETL platforms.

Nice to Have:

  • Knowledge of machine learning algorithms and experience deploying models.
  • Experience working with Product and Engineering teams in a dynamic environment.
  • Start-up experience at a scaling organization.
