Job description

1. Key Outputs:

  • Collaborate with non-technical and technical team members from cross-functional business and IT teams to elicit and document data requirements and translate them into:
    • Data Models and Mapping Specifications
      • Develop and govern data modeling standards, best practices, and guidelines to ensure consistency and maintainability across data models.
      • Review and maintain up-to-date documentation for data models, including entity-relationship diagrams, data flow diagrams, and data catalogues/data lineage.
      • Ensure regular data profiling and analysis to validate and optimize data models for performance and efficiency.
      • Analyze source system data structures, schemas, and formats to identify data relationships, dependencies, and mapping requirements and understand data semantics and business rules.
      • Create and maintain data mapping specifications that document how data from one source is transformed and loaded into the target data model.
        • Define transformation rules and logic to transform data as needed.
      • Communicate effectively with stakeholders to ensure a shared understanding of data modeling and mapping processes.
      • Ensure data quality and integrity throughout the mapping process. Implement data quality checks and validation mechanisms to identify and resolve data quality issues.
      • Document and maintain the data flow, data dependencies, data dictionary, and data lineage for the data assets using tools such as ERwin or SqlDBM.
      • Perform data profiling, data validation, and data quality checks using tools such as SQL and Python to ensure that the data is accurate, complete, and consistent across the data sources and targets.
    • Data Pipeline Solutions
      • Drive automation using modern data and analytical tools to develop, test, automate and deploy repeatable data integration and data preparation flows from multiple source systems into our data lake and enterprise data warehouse systems.
      • Build quick prototype data management solutions required for proof of concept or proof of value.

  • Identify opportunities to optimize our data modeling, mapping, integration, and transformation processes and propose solutions.
  • Perform data analysis, visualization, and reporting using tools such as Excel and Power BI to find patterns and insights in the data and communicate them to business users.
  • Troubleshoot and resolve any data issues or anomalies that may arise in our data systems or processes.
  • Collaborate with business, D&A 3rd party service providers and broader IT colleagues to evangelize data management, act as a data guru and work across business silos to promote better understanding of data and analytics.
  • Any other assigned responsibilities
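
  The data quality checks and profiling described above can be sketched in a few lines of Python. This is an illustrative example only, not part of the role description; the "customer" feed, its field names, and the validation rules are all assumptions:

  ```python
  # Minimal sketch of row-level data quality checks and a crude data
  # profile, assuming a hypothetical customer feed with id, email,
  # and created_at fields.
  from datetime import datetime

  REQUIRED_FIELDS = ("id", "email", "created_at")

  def validate_row(row: dict) -> list[str]:
      """Return a list of data quality issues found in a single row."""
      issues = []
      # Completeness: every required field must be present and non-empty.
      for field in REQUIRED_FIELDS:
          if not row.get(field):
              issues.append(f"missing {field}")
      # Validity: a very rough email shape check.
      email = row.get("email", "")
      if email and "@" not in email:
          issues.append("malformed email")
      # Consistency: dates must match the expected ISO format.
      created = row.get("created_at")
      if created:
          try:
              datetime.strptime(created, "%Y-%m-%d")
          except ValueError:
              issues.append("bad created_at format")
      return issues

  def profile(rows: list[dict]) -> dict:
      """Aggregate issue counts across a batch of rows."""
      counts: dict[str, int] = {}
      for row in rows:
          for issue in validate_row(row):
              counts[issue] = counts.get(issue, 0) + 1
      return counts
  ```

  In practice checks like these would run inside the integration pipeline (or as SQL assertions against the target tables), with failing rows quarantined or reported rather than silently loaded.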

2. Selection Criteria (skills, experience, knowledge):

  • Bachelor's degree in Computer Science, Engineering, Business, or an equivalent discipline
  • 3+ years of experience in a Data Management discipline including experience in data warehousing concepts and data modeling
    • Experience designing and implementing data warehouses using Data Vault 2.0 principles is a strong plus
  • Strong experience working with heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using modern Data Integration Tools (e.g. Talend, Azure Data Factory) and Data Storage Solutions (e.g. Azure Data Lake, Azure Synapse, Snowflake Data Warehouse)
  • Proficient in SQL and PL/SQL
  • Experience working with data warehousing solutions (e.g. Snowflake) to build data warehouses / data marts
  • Experience working with data from SAP, manufacturing, or scientific research systems is a strong plus but not required.
  • Basic understanding of popular open-source and commercial data science tools and platforms such as Python and Azure ML is an advantage.
  • Experience working in a virtual team setting; self-driven, with the desire to take the lead and drive tasks to completion in a remote environment.
  • Detail-oriented, with strict attention to detail and the ability to quickly spot and fix problems.
  • Willingness and ability to stay abreast of Data Engineering industry trends.
  • Excellent communication, collaboration, and problem-solving skills.

