Georgia-Pacific’s Packaging & Cellulose and Building Products BI and Data Analytics group is looking for a Data Engineer to join our team. We are looking for entrepreneurial-minded innovators who can help us further develop this service of exceptionally high value to our business.
A successful candidate will bring advanced knowledge of best-in-class development methodologies, a passion for scalable, high-reliability data systems, and a working knowledge of public cloud services (preferably AWS). You must be enthusiastically collaborative, value-seeking, open to challenge, and comfortable with both new ideas and established approaches, with an appetite for learning and innovation.
LOCATION: REMOTE US
What You Will Do In Your Role
- Implement and support the AWS data lake environment, including all upstream data flows on the AWS stack and SQL Server
- Contribute to architectural discussions about new data system designs
- Develop enhancements to, and create, data feeds from various ERP and other source systems to the data lake
- Develop data solutions in direct support of business intelligence and analytics/reporting to meet business needs
- Work with users and the BI and business teams to develop business requirements and documentation; communicate complex solutions to stakeholders and other team members
- Troubleshoot, fix, and explain failures/errors in data flows, and take corrective and preventive action
- Prioritize, organize, and coordinate simultaneous tasks/projects
- Experiment with new technologies and solutions, identifying ways we can use technology to create superior value for our customers
- Show high initiative and a passion for driving rapid technical advancement
- Present to, and partner with, a non-technical user base; collaborate with a diverse IT team including business analysts, project managers, architects, developers, and vendors to create or optimize innovative technologies and solutions
- Apply strong conceptual, analytical, and problem-solving abilities
The Experience You Will Bring
- At least 5 years of data analytics, data lake, ETL, and data pipeline experience using AWS technologies such as S3, Redshift, Glue, Spark, Kinesis, and Lambda, and AWS serverless technologies such as Aurora; experience with Microsoft's BI stack (SQL Server and SSIS)
- Knowledge of data engineering concepts (ETL, near/real-time streaming, data structures, metadata, and workflow management)
- Ability to pull together complex and disparate data sources, warehouse those data sources, and architect a foundation to produce BI and analytical content, while operating in a fluid, rapidly changing data environment
- 3+ years of experience writing SQL queries against relational databases (SQL Server, PostgreSQL, etc.)
- At least 2 years of multidimensional database, cube, and data warehousing experience (Redshift, SQL)
- At least 2 years of programming/scripting experience (Python and Node.js preferred)
What Will Put You Ahead
- Bachelor's degree in Computer Science, Engineering, or Mathematics preferred
- Understanding of metadata and master data management concepts and practices
- Experience working with enterprise data stores, data lakes, external data sources and APIs
- Experience with a broad set of analytics use cases in one or more of: supply chain, transportation, sales and operations
- Experience with visualization tools such as Tableau or Power BI
Salary and Benefits Commensurate with Experience.
Equal Opportunity Employer.
Except where prohibited by state law, all offers of employment are conditioned upon successfully passing a drug test.
This employer uses E-Verify. Please visit the following website for additional information: www.kochcareers.com/doc/Everify.pdf