Creates data collection, extraction, and transformation frameworks for structured and unstructured data.
Develops and maintains infrastructure systems (e.g., data warehouses, data lakes), including data access points.
Prepares and manipulates data using MS SQL, Azure Synapse, Databricks, and other data pipeline tools.
Organizes data into formats and structures that optimize reuse and efficient delivery to business and analytics teams and system applications.
Integrates data across the data lake, data warehouse, and system applications to ensure the consistent delivery of information across the enterprise.
Implements backend APIs to enable access to datasets.
Accountable for efficient data architecture and systems design.
• Build and evolve the data service layer and engage the team to bring together components for a best-in-class customer offering. Highly skilled in assessing overall data architecture and integrations and making ongoing improvements to the solution.
• Engage critically with business stakeholders to establish clear needs and link them to solutions, including building prototypes and involving multiple parties in design sessions.
• Lead the architecture, design, and implementation of complex data architectures and integrations, including best practices for the full development life cycle: coding standards, code reviews, source control management, build processes, testing, and operations.
• Design, develop, and support back-end applications and programs (APIs) while ensuring all components adhere to a consistent, extensible, evolving architecture that meets business requirements.
• Perform database monitoring and collaborate with database administrators to optimize database performance.
• Lead the analysis of database entities, relationships, and attributes to determine efficient design solutions according to business needs.
• Collaborate with internal stakeholders to ensure adherence to standards for code, design, documentation, testing and deployment.
• Collaborate with data governance and strategy teams to ensure data lineage is well understood and constructed in a way that highlights data re-use and simplicity.
• Actively assess new opportunities to simplify data operations with new tools, technologies, file storage, management approaches, and processes. Use team context and experience to evaluate these opportunities and bring them forward to team members for assessment and implementation.
• Bachelor’s degree in Software Engineering or Computer Science required (Master’s an asset), or equivalent work experience in a technology or business environment.
• Minimum of 7 years of experience developing and following structured work processes in data engineering using Microsoft SQL Server or Oracle.
• Minimum of 1 year of backend development experience with Java, C#, Go, or Node.js, and with API microservices.
• Minimum of 1 year of experience working with Azure, AWS, or other cloud environments.
• Highly proficient in multiple programming languages, with an excellent ability to design and engineer moderately complex enterprise solutions.
• Highly proficient in data management, governance, data design and database architecture.
• Proven track record of manipulating, processing, and extracting value from large, disconnected datasets.
• Highly proficient in data modeling, data integrations, data orchestration, and supporting methodologies.
• Highly proficient in leading large scale projects or significant project steps and communicating progress/approach with technical/non-technical peers/clients and leaders.