Kindly let me know if you have a suitable fit for the following position.
Location: Phoenix, AZ
Please send the resume to firstname.lastname@example.org or call 847-350-1008.
The data engineer is a critical role that will provide data engineering design, ETL development, and technical expertise to an ETL scrum team. They will work with other ETL developers, QA engineers, and analysts to support data streams for payment, claims, and customer service operations. The team is responsible for a large catalog of jobs that use a mixture of batch ETL architecture and real-time data streaming with API-integrated data services. The data engineer will spend their time doing hands-on development, designing future data processes, conducting data analysis, consulting with other teams, and participating throughout the Agile process in a stable scrum team environment.
Essential Functions / Principal Responsibilities
- Develops data pipelines in both batch ETL and real-time streaming architectures.
- Develops data models to define new or modify existing data structures in support of data integration initiatives.
- Provides expert technical knowledge of data solutions for business projects.
- Provides source system analysis, data discovery, complex transformation assessment and target system exploration to understand information data requirements and anticipate user needs.
- Contributes to data pipeline design, coding, and technical / functional reviews while collaborating with source system developers, data engineers and functional subject matter experts.
- Develops effective data pipeline solutions to deliver business features.
- Adheres to best practices for data movement, data quality, data profiling, data cleansing and other data pipeline related activities.
- Applies tuning and optimization for continuous improvement.
- Presents technical information in easily understood terms (written, verbal and visual).
- Communicates effectively within the Agile team and to external stakeholders and management.
- Follows Agile best practices and adheres to internal IT processes like change management and problem management.
Skills That Will Ensure Success:
- Specialist in ETL development with a demonstrated understanding of transactional data processing, streaming data and data pipeline best practices.
- Experience building, unit testing, and deploying Informatica ETL processes.
- Knowledgeable in making REST API calls within data processes.
- Familiar with real-time data pipeline platforms, preferably StreamSets, AWS Glue, or a similar platform.
- Hands-on experience with data streaming in Apache Kafka.
- Able to interpret business needs and turn them into a technical plan, weighing the pros and cons of the various data processing approaches.
- Demonstrates a solid understanding of technical standards and processes related to batch and real-time data pipeline development.
- Excellent team player, able to work with product owners, technical developers, DBAs, system administrators, BI professional services, data warehouse operations and functional experts.
- Expertise in SQL queries, transactions, and optimization, especially T-SQL.
- Understands nulls, cardinality, joins, and data types well enough to develop technical ETL specifications and technical metadata.
- Ability to integrate an application solution into the broader business and IT ecosystem in which it will operate.
- Firm understanding of quality assurance activities and automation in data pipeline and ETL processing.
- Desired: experience working with financial and/or claims data requiring compliance, balancing, and integrity checks, especially payment-related data, PCI-compliant data, and banking industry formats such as NACHA.
- Desired: a firm understanding of cloud data processing and data streaming architectures, especially in AWS.
Charan Kumar | IVY Tech Sols Inc.
3403 N Kennecott Avenue, Suite B&C, Arlington Heights, IL 60004
Direct: (847) 350-1008
email@example.com | Gtalk: charan.ivytech