Interesting Job Opportunity: AWS Data Engineer - S...

Job description

We are seeking a highly motivated and experienced AWS Engineer. This position requires an individual with AWS cloud experience and ambition to continually keep up with best practices when it comes to cloud development. The successful candidate must be able to seek out requirements and create best-in-class cloud-native solutions. The engineer must always create solutions that are repeatable, scalable and well-governed. They will deploy and rigorously test solutions to ensure they are robust and secure. The engineer will be responsible for creating and maintaining diagrams associated with any solutions that are deployed into production.

Must Have

  • Knowledge and experience in designing and developing RESTful services.
  • Experience building serverless applications in AWS.
  • Experience in building real-time/streaming data pipelines.
  • 3-4 years of SQL & Python programming experience.
  • 2-3 years of experience with the AWS tech stack (Glue, Redshift, Kinesis, Athena, CloudTrail, CloudWatch, Lambda, API Gateway, Step Functions, SQS, S3, IAM roles, Secrets Manager).
  • Experience working with ETL tools such as Glue, Fivetran, Talend, Matillion, etc.
  • 1-2 years of experience with dbt, including data modeling, SQL, Jinja templating, and packages/macros, to build robust, performant, and reliable data transformation and feature extraction pipelines.
  • 1-2 years of experience with Airbyte, building ingestion modules for streaming, batch, and API sources, including CDC mechanisms for database sources.
  • Experience building distributed architecture-based systems, especially handling large data volumes and real-time distribution.
  • Initiative and problem-solving skills when working independently.
  • Familiarity with Big Data Design Patterns, modeling, and architecture.
  • Exposure to NoSQL databases and cloud-based data transformation technologies.
  • Understanding of object-oriented design principles.
  • Knowledge of enterprise integration patterns.
  • Experience with messaging middleware, including queues, pub-sub channels, and streaming technologies.
  • Expertise in building high-performance, highly scalable, cloud-based applications.
  • Experience with SQL and No-SQL databases.
  • Strong collaboration and communication skills; highly self-driven, with a sense of ownership.
  • Experience writing well-documented, clean, and effective code is a must.

Good To Have

  • AWS Cloud Certifications.
  • Experience with Airflow and MWAA (Amazon Managed Workflows for Apache Airflow).
  • Knowledge of Jinja templating in Python.
  • Working knowledge of DevOps methodologies, including designing CI/CD pipelines.
  • Knowledge of PySpark and DevOps (intermediate proficiency is fine; PySpark is optional).
  • Proficient in SQL, Python, and PySpark.
  • Experience building real-time streaming data pipelines with Kafka, Kinesis, etc.
  • Good understanding of data warehousing and data lake concepts.
  • Good knowledge of Azure data engineering.

Responsibilities

  • Create and maintain scalable, robust AWS architecture.
  • Collaborate with technical teams on modern architectures (Microservices, REST APIs, DynamoDB, Lambda, API Gateway).
  • Develop API-based, CDC, batch, and real-time data pipelines for structured and unstructured datasets.
  • Enable integration with third-party systems as needed.
  • Ensure solutions are repeatable and scalable across the organization.
  • Work with development teams to gather requirements, develop solutions, and deploy them.
  • Provide robust solution documentation for a wide audience.
  • Collaborate with data professionals to bring applications to life, meeting business needs.
  • Prioritize data protection and cloud security in all solutions.

Qualifications

  • BE/B.Tech/MS/M.Tech/ME from a reputed institute.
  • Every individual brings a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply; there might be a suitable/unique role for you tomorrow!
