While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and take pride in nurturing a culture built on transparency, diversity, integrity, learning and growth.
If working in an environment that encourages you to innovate and excel, not just professionally but personally, interests you, you would enjoy your career with Quantiphi!
Experience: 4 to 6 Years
Location: Mumbai/Pune (Hybrid)
Role: Senior Data Engineer
Required Skills:
4+ years of relevant experience building cloud-native, hybrid, or multi-cloud solutions.
In-depth knowledge of and hands-on experience with near real-time streaming use cases using Apache Kafka and Kinesis, including topics, partitions, brokers, producers, consumers, and Kafka Connect for data ingestion.
Ability to analyze data sources, including Kafka streams and core database engines, to evaluate data quality, completeness, and the transformations required for data ingestion.
Expertise in building production-grade solutions using AWS (MSK, Redshift, S3, Glue, Lambda, Airflow), Python, and data pipelines built with Glue, Airflow, and SageMaker.
Ability to design and implement complex data processing pipelines in PySpark.
Hands-on experience working on large cloud-based migration workloads involving SQL and NoSQL databases.
Data Warehousing - Hands-on experience designing and developing an enterprise-level data warehouse while leading a team of engineers on the same.
Experience with migrating databases to AWS. Strong ability to create roadmaps and architectures for executing migration workloads on AWS.
Experience in SQL and query optimisation. Proficiency in SQL-based technologies (MySQL, Oracle DB, SQL Server, etc.).
Experience in creation and maintenance of data dictionaries, metadata repositories, and data lineage documentation.
Experience building and supporting large-scale systems in a production environment.
Strong skills in mentoring and managing teams of junior and senior data engineers and leading end-to-end delivery of technical workloads. Ability to work in an agile environment and estimate workloads effectively.
Good to have skills:
Basic understanding of data visualization tools such as QuickSight, to help align the data ingestion pipeline with downstream reporting requirements.
Experience working in the Financial Services domain with knowledge of compliance regulations such as SOC2, PCI-DSS.
Prior experience working on AWS Migration Acceleration Program engagements for customers.
Knowledge of how to optimize data ingestion and data storage in Kafka and Redshift environments to enhance the solution's performance.
Experience using GitHub/CodeCommit for development activities.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!