Requirements
- 5+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 3 years' experience in a data-focused role
- Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL)
- Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse platforms in production environments
- Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, Snowflake)
- Knowledge and experience with various relational databases and SQL
- Knowledge of software engineering and agile development best practices
Skills
- AWS-based data services (e.g., Kinesis, Glue, RDS, Athena, Snowflake)
- Java, Scala, Python, PySpark, SparkSQL, ETL tools
Location - Noida