Job description

At Dollar General, our mission is Serving Others! We value each and every one of our employees. Whether you are looking to launch a new career in one of our many convenient Store locations, Distribution Centers, Store Support Center or with our Private Fleet Team, we are proud to provide a wide range of career opportunities. We are not just a retail company; we are a company that values the unique strengths and perspectives that each individual brings. Your difference truly makes a difference at Dollar General. How would you like to Serve? Join the Dollar General Journey and see how your career can thrive.

Dollar General Corporation has been delivering value to shoppers for more than 80 years. Dollar General helps shoppers Save time. Save money. Every day.® by offering products that are frequently used and replenished, such as food, snacks, health and beauty aids, cleaning supplies, basic apparel, housewares and seasonal items at everyday low prices in convenient neighborhood locations. Learn more about Dollar General at www.dollargeneral.com/about-us.html.

The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. This role will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

Duties & Responsibilities

  • Apply advanced working knowledge of SQL and experience with relational databases, including query authoring and working familiarity with a variety of database systems.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud technologies.
  • Build analytics tools that utilize the data pipelines to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.


Knowledge, Skills and Abilities (KSAs)

  • Knowledge of programming languages (e.g., Java and Python)
  • Hands-on experience with SQL database design
  • Strong numerical and analytical skills
  • Degree in Computer Science, IT, or similar field; a Master’s is a plus
  • Data engineering certification (e.g., IBM Certified Data Engineer) is a plus
  • Experience with big data tools such as Hadoop, Spark, and Kafka
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
  • Experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow
  • Experience with cloud data platforms such as Snowflake or Azure, or with AWS services such as EC2, EMR, RDS, and Redshift
  • Experience with stream-processing systems such as Storm or Spark Streaming
  • Experience with object-oriented and functional scripting languages such as Python, Java, C++, or Scala


Work Experience &/or Education

  • BS or MS degree in Computer Science, Information Technology, or a related technical field; additional vendor-specific certification is a plus
  • 4+ years of Python or Java development experience
  • 4+ years of SQL experience (NoSQL experience is a plus)
  • 4+ years of experience with schema design and dimensional data modeling
  • Ability to manage and communicate data warehouse plans to internal clients
  • Experience designing, building, and maintaining data processing systems
  • Experience working with a cloud platform such as Snowflake, Azure, or Databricks
