Exigent is transforming the way legal services are offered by leveraging the power of technology and the data locked in contracts and processes. While data is the cornerstone, it is the people of Exigent who bring this vision to life. We combine the best of technology with the smartest legal and financial brains to cultivate a revolution in the legal industry. Intelligent thinking underpins all of our solutions, from contract management to business analytics tools and legal services.
Come and learn more about the Exigent family and how we are re-imagining, transforming and revolutionizing the legal industry through technology, AI and data at https://www.exigent-group.com/
Responsibilities include:
- Develop and maintain data pipelines using Azure Data Factory and other relevant Azure technologies
- Optimize ETL processes for performance and scalability, leveraging Azure services such as Azure Data Factory Mapping Data Flows, Azure Data Lake Analytics, or Azure Synapse Pipelines
- Extract data from various sources, transform it into a usable format, and load it into Azure data storage solutions such as Azure SQL Database or Azure Synapse Analytics
- Collaborate with cross-functional teams to understand data requirements and design scalable and efficient ETL processes using Azure services
- Identify, design and implement ETL solutions for extraction and integration of data to and from data warehouses and data marts for the purposes of reporting, decision support and analysis
- Analyze business and technical requirements, as well as data warehouse data models, to understand the detailed data management requirements
- Administer, maintain, develop and recommend policies and procedures for ensuring the performance, availability, security and integrity of the company's databases and systems
- Architect solutions to help implement the physical data model for the development and/or production environments
- Perform regular deployments, scheduled maintenance and housekeeping
- Work with business analysts and solution architects to identify, design and implement custom user interface programs for data management and extraction purposes
- Manage job and activity schedules to promote optimal performance and maximum availability, and to minimize data latency and data load durations
- Implement data quality checks, data validation, and data cleansing techniques to ensure accuracy and reliability of the data
- Collaborate with internal users and data analysts to understand their data needs and provide them with clean, reliable, and well-structured data sets for analysis and modeling
- Ability to extract, transform, load, manage and troubleshoot daily transaction files from various processors
- Ability to identify and validate data needs and/or information from clients and internal staff
- Ability to extract and prepare data for analysis across multiple platforms
- Perform data quality analysis to ensure data integrity and accuracy of production data
- Ability to proactively identify areas for improvement in data performance, quality, and processing
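The data quality checks, validation and cleansing work described above can be illustrated with a short sketch. This is a minimal, hypothetical example, not Exigent's actual pipeline: the feed layout, field names and validation rules are all invented for illustration, and in practice these checks would run inside an Azure Data Factory or Databricks pipeline rather than a standalone script.

```python
import csv
import io
from datetime import datetime

# Hypothetical daily transaction feed from a payment processor;
# all field names and values are illustrative only.
RAW_FEED = """txn_id,processor,amount,posted_at
1001,stripe,19.99,2024-03-01
1002,adyen,,2024-03-01
1003,stripe,-5.00,not-a-date
1004,adyen,42.50,2024-03-02
"""

def validate_row(row):
    """Return (cleaned_row, None) on success or (None, reason) on failure."""
    if not row["txn_id"] or not row["processor"]:
        return None, "missing identifier"
    try:
        amount = float(row["amount"])
    except (TypeError, ValueError):
        return None, "amount not numeric"
    if amount < 0:
        return None, "negative amount"
    try:
        posted = datetime.strptime(row["posted_at"], "%Y-%m-%d").date()
    except ValueError:
        return None, "bad date"
    return {"txn_id": row["txn_id"], "processor": row["processor"],
            "amount": amount, "posted_at": posted}, None

def run_quality_checks(feed):
    """Split a raw feed into clean rows and rejected rows with reasons."""
    clean, rejects = [], []
    for row in csv.DictReader(io.StringIO(feed)):
        ok, reason = validate_row(row)
        if ok:
            clean.append(ok)
        else:
            rejects.append((row["txn_id"], reason))
    return clean, rejects

clean, rejects = run_quality_checks(RAW_FEED)
```

Keeping the rejects alongside the reason for rejection, rather than silently dropping bad rows, is what makes the downstream data quality analysis mentioned above possible.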
Requirements include:
- Proven experience being accountable for the data/analytics architecture solutions for a given business area: how all the components, subsystems, platforms and integrations fit together to meet business needs, and how the solution fits into the greater enterprise context
- Strong experience in extract, transform, load (ETL) processes using Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, or Azure Synapse Analytics
- Proficiency in SQL and scripting languages such as Python or PowerShell for data manipulation and transformation
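As a flavor of the SQL-plus-Python ETL work this role involves, here is a minimal sketch under stated assumptions: the source rows, table and column names are hypothetical, and an in-memory SQLite database stands in for a real target such as Azure SQL Database or Azure Synapse Analytics.

```python
import sqlite3

# Source rows as they might arrive from an upstream extract;
# table and column names are hypothetical.
SOURCE_ROWS = [
    ("2024-03-01", "contract_review", "120.00"),
    ("2024-03-01", "contract_review", "80.00"),
    ("2024-03-02", "due_diligence", "200.00"),
]

def etl(rows, conn):
    """Extract raw rows, transform amounts to integer cents, load a fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_billing "
        "(billed_on TEXT, service TEXT, amount_cents INTEGER)"
    )
    # Transform: parse decimal strings into integer cents to avoid float drift.
    transformed = [
        (billed_on, service, round(float(amount) * 100))
        for billed_on, service, amount in rows
    ]
    conn.executemany("INSERT INTO fact_billing VALUES (?, ?, ?)", transformed)
    conn.commit()

conn = sqlite3.connect(":memory:")
etl(SOURCE_ROWS, conn)
total_cents = conn.execute(
    "SELECT SUM(amount_cents) FROM fact_billing"
).fetchone()[0]
```

In production the same extract-transform-load shape would typically be expressed as an Azure Data Factory pipeline or a Databricks notebook, with parameterized connections in place of the in-memory database.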