Job description

Overview:
  • Who we are

Imagine working in a place where continuous improvement and innovation are celebrated and rewarded, and where fast-paced, high-impact teams come together to positively drive results for one of the largest and most iconic brands in the world.


As a rapidly growing retailer, we may already be your friendly neighborhood store. You probably know our familiar name, have seen our pervasive logo, and have tried our highly sought-after products, such as Slurpee® and Big Bite®. “Brain Freeze” is a 7-Eleven registered trademark for our 53-year-old Slurpee®, and with over 71,100 stores globally (more than any other retailer or food service provider), we sell over 14 million of them a month.


But there’s a lot more to our story and much more left to be written. We are transforming our business, ensuring we are customer obsessed and digitally enabled to seamlessly link our brick-and-mortar stores with digital products and services.


At 7-Eleven the entrepreneurial spirit is in our DNA and has been ever since our inception 90+ years ago. It’s what drove us to invent the convenience industry in 1927 by envisioning how a simple ice dock could provide household staples such as milk and eggs to better serve the needs of our customers.


Today we are redefining convenience and the customer experience in big ways. We are fundamentally changing our culture, and we want talented, innovative, customer-obsessed, and entrepreneurial people like you to come make history with us.


  • How we lead

At 7-Eleven we are guided by our Leadership Principles.


  • Be Customer Obsessed
  • Be Courageous with Your Point of View
  • Challenge the Status Quo
  • Act Like an Entrepreneur
  • Have an “It Can Be Done” Attitude
  • Do the Right Thing
  • Be Accountable

Each principle has a defined set of behaviors which help guide the 7-Eleven team to Serve Customers and Support Stores.


  • About This Opportunity
Responsibilities:
  • Design, develop, and maintain optimal data pipelines.
  • Analyze and organize raw data.
  • Work closely with product managers and engineering managers to translate their needs into ETL flows.
  • Assemble large, complex data sets, including legacy structured data warehouses, that meet functional and non-functional business requirements.
  • Work with various data platforms, databases, and systems, such as Oracle and MongoDB.
  • Participate in data modeling discussions and influence the data architecture to ensure the best performance for solutions required by various teams.
  • Manipulate, process and extract value from large, disconnected datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Implement best practices to enhance data quality and reliability.
Qualifications:
  • 5-10 years of experience working with data-driven applications.
  • Advanced SQL knowledge and experience working with relational databases, including query authoring, as well as working familiarity with a variety of databases.
  • Experience building and optimizing data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with 5+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
  • Relational SQL databases (e.g., Oracle) and NoSQL databases (e.g., MongoDB).
  • Data pipeline and workflow management tools (any one or more): Azkaban, Luigi, Airflow, etc.
  • Cloud services such as AWS: EC2, EMR, RDS, Redshift.
  • Object-oriented/functional scripting languages (any one or more): Python, Java, Spark, Scala, etc.
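
For illustration only, the pipeline and workflow-management work described above, if built with a tool such as Apache Airflow, might look roughly like the minimal sketch below. All names (DAG id, task ids, functions) are hypothetical and are not taken from this posting.

# Illustrative sketch: a minimal Apache Airflow 2.x DAG with the
# extract -> transform -> load shape described above. Task bodies are
# placeholders; a real pipeline would query the source, apply
# transformations, and load the warehouse table.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw rows from a source system (e.g., an Oracle table)."""
    pass  # placeholder


def transform():
    """Clean and reshape the raw rows to fit the warehouse schema."""
    pass  # placeholder


def load():
    """Write the transformed rows to the target data store."""
    pass  # placeholder


with DAG(
    dag_id="example_daily_etl",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps in order: extract -> transform -> load
    extract_task >> transform_task >> load_task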
