Job description

The role is responsible for translating business and technical requirements into IT deliverables; developing data ingestion/integration pipelines and workflows with the associated transformation, harmonization, and conversion logic; executing data profiling; maintaining the quality of master and transactional data; creating data curation procedures; building analytics views for consumption; and supporting data science requirements (machine learning, regression, forecasting, etc.) by provisioning data.


Activities include:

Implement system specific data models (physical) with input from Solution/System/Data Architect
Implement data components for interfaces (data pipelines, APIs) in collaboration with Architects
Implement data publishing views to support consumers (business organizations, data scientists)
Execute data profiling and provide detailed data analysis for harmonization and contextualization of disparate data sets
Provide input into business data glossary / data catalog (Collibra/Data Dash) and logical data models (ER/Studio); help maintain and steward technical metadata
Provide input to address data quality issues and develop technical data quality remediation plans for systems (in partnership with Data Architects and Business SMEs)
Maintain technical configuration / ingestion processes for foundational/master/transactional data


Required Technical Competencies (Advanced Level):

Designing and creating databases, data dictionaries, and data maps using normalized and de-normalized data
Advanced database query and manipulation languages (SQL and its variants)
Data manipulation tools such as SSMS, SSIS, TOAD, Access, and Excel
Experience creating new databases, data lakes, data warehouses, etc. (on-premises or cloud)
Experience creating data ingestion pipelines using data integration/replication tools such as Azure Data Factory, Qlik Replicate, and Fivetran

Preferred Technical Competencies (Beginner to Expert):

Competencies in one or more of the following areas:
Creating or editing database models (physical and logical) using ERD tools such as ER/Studio
Experience in data governance and management concepts (MDM, data quality)
Data mapping and profiling
Knowledge of programming languages and some ability to write code (e.g., Python, VBA, XML, R)
API Development
Practical understanding of BI and data visualization tools (Tableau)

The preferred candidate should:
Have strong communication and interpersonal skills with strong English proficiency
Demonstrate critical thinking and analytical skills, and employ judgment to offer thoughtful, concise input toward the resolution of problems
Be able to translate business processes into data requirements
Understand DevOps and Agile development and their application to data-centric architectures and solutions
Hold a minimum of a Bachelor's degree in Engineering Technology, Computer Science, or a related field, with equivalent experience
Pay attention to detail and be meticulous in their work
Be able to work independently and in a team

COMPUTING ENVIRONMENT
ADO (Azure DevOps) for team work planning
Cloud-based tools (Azure, Snowflake, Cloudera)
MS Office software
