Experience in handling and maintaining ETL and data pipelines
Working knowledge of Apache Sqoop, Flume, Spark, Airflow, Hive, etc.
Experience in configuring and managing monitoring and visualization tools such as Grafana, Kibana, Prometheus, and the ELK stack
Must have strong experience with system integration across cloud and on-premises vendors
Must have experience in designing and proposing data solutions
Exposure to NoSQL databases preferred
Good team player with experience managing teams and building client rapport
Must have exposure to cluster-based hyperscalers
Experience with Kafka-based streaming platforms
Cloud and containerization exposure is good to have
Analyze user requirements and envision system features and functionality
Design, build, and maintain efficient, reusable, and reliable software solutions by setting expectations and feature priorities throughout the development life cycle
Identify bottlenecks and bugs, and recommend solutions by comparing the advantages and disadvantages of custom development
Contribute to team meetings and troubleshoot development and production problems across multiple environments and operating platforms
Understand architecture requirements and ensure effective design, development, validation, and support activities
In addition to candidates' professional qualifications, we place great importance on various aspects of their personality profile. These include: