Role: Data Engineer
Location: Mississauga, ON / Montreal, QC (Hybrid)
Need:
Databricks, PySpark, SQL
Responsibilities
Experience designing and implementing operational, production-grade, large-scale data solutions on Microsoft Azure, with some experience on Snowflake Data Warehouse.
This includes hands-on experience building productionized data ingestion and processing pipelines using Python, Databricks, and SnowSQL.
Excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies.
Excellent presentation and communication skills, both written and verbal, with the ability to problem-solve and design in an environment with unclear requirements.
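As a flavour of the ingestion work described above, here is a minimal, illustrative sketch of a validation step in a CSV ingestion pipeline. It uses only the Python standard library as a stand-in for the Databricks/PySpark stack; the function name, column names, and sample data are hypothetical.

```python
import csv
import io

def ingest_csv(text, required_columns):
    """Parse CSV text, normalize column names to lowercase,
    and drop rows missing any required column (hypothetical rule)."""
    reader = csv.DictReader(io.StringIO(text))
    rows = []
    for raw in reader:
        row = {k.strip().lower(): (v or "").strip() for k, v in raw.items()}
        if all(row.get(c) for c in required_columns):
            rows.append(row)
    return rows

# Hypothetical sample feed: row 2 is dropped for a missing name.
raw = "ID,Name,Amount\n1,alice,10\n2,,5\n3,carol,7\n"
clean = ingest_csv(raw, ["id", "name"])
# clean keeps the rows for IDs 1 and 3
```

In a real Databricks pipeline the same idea would typically be expressed with Auto Loader and DataFrame filters rather than the csv module.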
Skill set
- Databricks
- Architecture
- Unity Catalog
- Delta Lake tables
- Auto Loader
- Delta Live Tables
- Single/multiple CSV file data ingestion
- SCD Type 1 and Type 2 implementation
- Azure Services:
- ADLS Gen2
- Blob Storage configuration
- Key Vault
- Databricks
- ADF
- SQL:
- Analytical/window functions
- CTEs
- Subqueries
- Constraints
- Joins
- Union/Union All
- SCD Type 1 and Type 2 implementation
- Data warehouse:
- Dimensional modelling
- SCD Types
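To illustrate the CTE and window-function items in the SQL list above, here is a small runnable sketch using Python's built-in sqlite3 module (SQLite 3.25+ is needed for window functions). The staging table, column names, and data are hypothetical; the pattern shown (a CTE with ROW_NUMBER() partitioned per key) is a common way to pick the latest snapshot per entity.

```python
import sqlite3

# Hypothetical staging table of customer snapshots.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_customer (customer_id INTEGER, city TEXT, load_date TEXT);
INSERT INTO stg_customer VALUES
    (1, 'Toronto',     '2024-01-01'),
    (1, 'Mississauga', '2024-06-01'),
    (2, 'Montreal',    '2024-03-15');
""")

# CTE + ROW_NUMBER() window function: rank snapshots per customer by
# recency, then keep only the latest (rn = 1) for each.
current = conn.execute("""
WITH ranked AS (
    SELECT customer_id, city, load_date,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id
               ORDER BY load_date DESC
           ) AS rn
    FROM stg_customer
)
SELECT customer_id, city FROM ranked WHERE rn = 1
ORDER BY customer_id
""").fetchall()
# current -> [(1, 'Mississauga'), (2, 'Montreal')]
```

The same query shape works unchanged in Databricks SQL or Snowflake.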
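The SCD Type 2 items above can likewise be sketched in miniature. The sketch below, again using sqlite3 with a hypothetical customer dimension, shows the core Type 2 move: close out the current row (set its end date and current flag) and insert a new current version, so history is preserved.

```python
import sqlite3

# Hypothetical customer dimension with SCD Type 2 tracking columns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES (1, 'Toronto', '2023-01-01', NULL, 1);
""")

def scd2_update(conn, customer_id, new_city, as_of):
    """If the attribute changed, expire the current row and
    insert a new current version (SCD Type 2)."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    ).fetchone()
    if row and row[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id),
        )
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, as_of),
        )
    conn.commit()

scd2_update(conn, 1, "Mississauga", "2024-06-01")
history = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
# history -> [('Toronto', 0), ('Mississauga', 1)]
```

In Databricks the same logic is usually written as a single MERGE INTO against a Delta Lake table; a Type 1 variant would simply UPDATE the row in place instead of versioning it.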
Thanks & Regards
Lokesh Sharma
Team Lead - Recruitment
Cybertec, Inc.
lokesh@cy-tec.com
LinkedIn: https://www.linkedin.com/in/lokesh-sharma-1541a3162/
11710 Plaza America Drive Suite #2000, Reston, VA 20190