Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen.
We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!
Adobe Customer Solutions is looking for a full-time Data Engineer with experience building data integrations on the AWS technology stack, as part of the team's Data as a Service portfolio for Adobe’s Digital Experience enterprise customers.
Customer-facing engineers who enjoy tackling complex technical challenges, have a passion for delighting customers, and are self-motivated to push themselves in a team-oriented culture will thrive in our environment.
What You’ll Do
- Collaborate with data architects, enterprise architects, solution consultants, and product engineering teams to capture customer data integration requirements, conceptualize solutions, and build the required technology stack
- Collaborate with enterprise customers' engineering teams to identify data sources, profile and quantify the quality of those sources, develop tools to prepare data, and build data pipelines that integrate customer and third-party data sources with Adobe solutions
- Develop new features and improve existing data integrations with customer data ecosystem
- Encourage the team to think outside the box and overcome engineering obstacles while incorporating innovative design principles
- Collaborate with a Project Manager to bill and forecast time for customer solutions
What You Need to Succeed
- Experience as an enterprise Data Engineer from a consulting background
- 3+ years of experience building, operating, and maintaining fault-tolerant, scalable data processing integrations on AWS
- 3+ years of experience with Python, preferably including PySpark
- Software development experience with Apache Airflow, MongoDB, and MySQL
- Proven ability to manage multiple projects concurrently
- Experience using Docker or Kubernetes is a plus
- BS/MS degree in Computer Science or equivalent industry experience
- Ability to identify and resolve problems associated with production grade large scale data processing workflows
- Excellent communication skills (we’re a geographically distributed team)
- Experience creating and maintaining unit tests and continuous integration.
- Passion for creating intelligent data pipelines that customers love to use.
Special Consideration given for
- Experience with Web Analytics or Digital Marketing
- Experience with Customer Data Platforms (CDP) or Data Management Platforms (DMP)
- Experience with Adobe Experience Cloud solutions