The AWS Sr. Data Engineer will work with the product engineering team, focusing on AWS-based data technologies. A key part of the role is championing and leading all things data. The Engineer will also work closely with the corporate technology team to build, manage, and automate our AWS infrastructure. The position demands someone who is highly technically competent, detail-oriented, and driven to stay current with evolving technologies.
All Sands Corporation Team Members are expected to conduct themselves in a professional manner at all times. Team Members are required to observe the Company’s standards, work requirements, and rules of conduct.
Essential Duties & Responsibilities
Lead the efforts of the AWS Data Engineers, take initiative, and create processes for the utilization of AWS data services and technologies
Design, create, manage, and facilitate business use of large datasets across a variety of data platforms
Design, build, and operationalize database solutions that are secure, scalable, and highly available on AWS
Work collaboratively with product & software engineering professionals to define solutions and deployment requirements
Provision, configure and maintain AWS data services and infrastructure defined as code
Design and build production data pipelines from ingestion to consumption within a data architecture, using Golang, Python, and/or Scala
Design and implement data engineering, ingestion, and curation functions on AWS using AWS-native services or custom programming
Perform detailed assessments of current-state data platforms and create an appropriate transition path to state-of-the-art AWS services as needed
Collaborate with architecture teams to identify optimal solutions
Manage a queue of work requests to meet service levels and KPI objectives
Provide recommendations regarding product/vendor selection, technology evolution, and design strategies
Collaborate with business stakeholders, IT peers, and leadership
Execute incident, problem and change management processes and reporting as needed
Minimum Qualifications
At least 21 years of age
Proof of authorization/eligibility to work in the United States
Bachelor’s Degree in Computer Science, Information Technology, or another relevant field
At least 5 years of experience designing and developing solutions using AWS services such as Lambda, Glue, SQS, SNS, Redshift, Athena, EMR (PySpark), DynamoDB, and Neptune
Experience with streaming technologies, both on-premises and cloud, including consuming from and producing to Kafka and Kinesis
Experience implementing batch processing using Glue/Lake Formation, Lambda, and AWS Data Pipeline
Experience optimizing the cost of the services being utilized
Experience developing code in Golang and Python, as well as PySpark, SQL, and other languages
Experience building pipelines and orchestration of workflows in an enterprise environment
Ability to perform ongoing monitoring, automation, and refinement of data engineering solutions
Experience working in an onshore/offshore model
Strong understanding of how to secure AWS environments and meet compliance requirements
Experience deploying and managing infrastructure with Terraform
Experience with Kubernetes, GitHub, Jenkins, ELK and deploying applications on AWS
Ability to learn/use a wide variety of open-source technologies and tools
Strong bias for action and ownership
Ability to professionally engage with stakeholders across all levels of the organization, including peers and executive leadership
Effective written and verbal communication skills in English
Must be able to:
Physically access assigned workspace areas with or without reasonable accommodation
Work remotely as necessary
Work indoors and be exposed to various environmental factors such as, but not limited to, CRT, noise, and dust
Utilize a laptop and standard keyboard to perform the essential functions of the job