AWS Cloud Data Engineer with Bachelor’s Degree in Computer Science, Computer Information Systems, Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.
Job Duties and Responsibilities:
Create AWS Lambda functions in Python using the Boto3 SDK to run code in response to HTTP requests through Amazon API Gateway.
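A handler for this duty, using API Gateway's Lambda proxy integration, might look like the following minimal sketch (the greeting logic and parameter name are purely illustrative):

```python
import json

def lambda_handler(event, context):
    """Handle an HTTP request proxied by Amazon API Gateway.

    With the Lambda proxy integration, `event` carries the HTTP request,
    and the returned dict must provide statusCode, headers, and a string body.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```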
Set up the CloudWatch agent on Linux machines to ship logs from EC2 to CloudWatch and push them to Splunk for monitoring and alerting.
Introduce and drive adoption of CI/CD framework within the team.
Build and deploy CI/CD pipelines.
Create Lambda functions in Python to route traffic to Kinesis Data Firehose and to process CloudWatch events before delivering them to Firehose.
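A minimal sketch of such a function is below; the delivery-stream name is an assumption, and the record format (newline-delimited JSON) is one common convention for Firehose delivery to S3:

```python
import json

DELIVERY_STREAM = "metrics-delivery-stream"  # hypothetical stream name

def format_record(event):
    """Serialize one CloudWatch event as a newline-delimited JSON Firehose record."""
    return {"Data": (json.dumps(event, sort_keys=True) + "\n").encode("utf-8")}

def lambda_handler(event, context):
    """Forward the incoming CloudWatch event to Kinesis Data Firehose."""
    import boto3  # provided by the Lambda runtime
    firehose = boto3.client("firehose")
    firehose.put_record(DeliveryStreamName=DELIVERY_STREAM, Record=format_record(event))
    return {"delivered": 1}
```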
Create and maintain Python data pipeline to update metrics database.
Create IAM roles from CloudFormation templates that establish a cross-account handshake role, granting access across accounts to support metadata collection.
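Once such a handshake role exists, a collector in the central account can assume it with STS to reach the other accounts. The role name below is hypothetical; only the ARN format and the `assume_role` call are standard:

```python
def handshake_role_arn(account_id, role_name="MetadataHandshakeRole"):
    """Build the ARN of the cross-account handshake role (role name is hypothetical)."""
    return f"arn:aws:iam::{account_id}:role/{role_name}"

def client_for_account(account_id, service="rds"):
    """Assume the handshake role in another account and return a client there."""
    import boto3  # provided by the Lambda runtime
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=handshake_role_arn(account_id),
        RoleSessionName="metadata-collection",
    )["Credentials"]
    return boto3.client(
        service,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```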
Collect metadata from RDS and Aurora databases compatible with MySQL, MariaDB, PostgreSQL, Oracle, and Microsoft SQL Server, using Boto3 to describe the DB clusters and DB instances.
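The describe calls might be used roughly as follows; the flattened (identifier, engine, endpoint) shape is an illustrative choice, not a prescribed schema:

```python
def summarize_db_instances(response):
    """Flatten one describe_db_instances page into (identifier, engine, endpoint) rows."""
    return [
        (db["DBInstanceIdentifier"], db["Engine"], db.get("Endpoint", {}).get("Address"))
        for db in response.get("DBInstances", [])
    ]

def collect_rds_metadata():
    """Page through all DB instances in the current account and region."""
    import boto3  # provided by the Lambda runtime
    rds = boto3.client("rds")
    rows = []
    for page in rds.get_paginator("describe_db_instances").paginate():
        rows.extend(summarize_db_instances(page))
    return rows
```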
Monitor and alert on application logging for EC2 instances; create a log file and include the logging directory in the instance user data so the application can write its logs.
Write KornShell scripts that pull S3 keys from an Oracle database, process them, and write them to a file for the AWS S3 automation process to consume.
Write CloudFormation templates that create CloudWatch event rules, DynamoDB tables, and Lambda functions, and configure CloudWatch and the event bus to automatically capture change events (ADD/DELETE/MODIFY) of AWS services from different accounts, targeting S3.
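The event rules from such a template can also be expressed programmatically. A hedged sketch, assuming the changes are captured as CloudTrail-recorded API calls (one common pattern for this kind of change auditing):

```python
import json

def change_event_pattern(services):
    """EventBridge pattern matching CloudTrail-recorded API calls for given services."""
    return {
        "source": [f"aws.{s}" for s in services],
        "detail-type": ["AWS API Call via CloudTrail"],
    }

def create_capture_rule(rule_name, services):
    """Create (or update) a rule that captures change events for the given services."""
    import boto3  # provided by the Lambda runtime
    events = boto3.client("events")
    events.put_rule(Name=rule_name, EventPattern=json.dumps(change_event_pattern(services)))
```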
Work on Amazon Web Services (AWS) for a multitude of applications, utilizing services such as EC2, VPC, Glacier, Route 53, S3, RDS, CloudWatch, CloudTrail, WAF, SNS, and IAM, with a focus on high availability, fault tolerance, load balancing, and auto scaling in design, deployment, and configuration.
Start and stop the WebLogic administration server and managed servers.
Keep applications available to users in all environments.
Send environment health-check reports confirming that all application servers are green, and troubleshoot various WebLogic issues.
Set up Splunk and develop reports, alerts, and dashboards that actively monitor all systems. Design the Splunk architecture and queries, and create Splunk applications to analyze data.
Skills / Knowledge required:
At least 3 years of experience in AWS cloud solution architecture design.
Strong SDLC knowledge and experience working in an Agile environment.
Experience working with Infrastructure as Code (IaC) tools such as HashiCorp Terraform or Sentinel.
Solid working knowledge of cloud technologies (AWS, Google Cloud, etc.).
Experience working as an Application Developer.
Comfortable building applications in two or more programming languages with an understanding of fundamental web/internet technologies.
Self-starter, able to work independently and as part of a team in a dynamic environment with competing priorities.
Work location is Portland, ME with required travel to client locations throughout USA.
Rite Pros is an equal opportunity employer (EOE).
Please Mail Resumes to: Rite Pros, Inc. 565 Congress St, Suite # 305 Portland, ME 04101.