Dice is the leading career destination for tech experts at every stage of their careers. Our client, Judge Group, Inc., is seeking the following. Apply via Dice today!
Location: Carrollton, TX
Description: This is a hybrid position requiring the candidate to report to and work from the office two days a week.
Permanent position paying in the $100k - $140k range, plus bonus.
No sponsorship opportunities available.
Cleans, prepares, and optimizes data for further analysis and modelling.
Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to data pipeline (i.e., ELT) principles and business goals.
ESSENTIAL JOB FUNCTIONS / PRINCIPAL ACCOUNTABILITIES:
Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to ELT principles and business goals.
Solves complex data problems to deliver insights that help the business achieve its goals.
Creates data products for engineering, analyst, and data science team members to accelerate their productivity.
Engineers effective features for modelling in close collaboration with data scientists and business partners.
Leads the evaluation, implementation, and deployment of emerging tools and processes for analytics data engineering to improve productivity and quality.
Partners with machine learning engineers, BI, and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
Fosters a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
Advises, consults, mentors, and coaches other data and analytics professionals on data standards and practices.
Develops and delivers communication and education plans on analytic data engineering capabilities, standards, and processes.
Learns about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics as necessary to carry out the role effectively.
MINIMUM SKILLS AND QUALIFICATION REQUIREMENTS:
Bachelor's degree in computer science, statistics, engineering, or a related field; 5-10 years of experience required.
Experience designing and maintaining data warehouses and/or data lakes with big data technologies such as Spark/Databricks, or distributed databases such as Redshift and Snowflake, and experience housing, accessing, and transforming data in a variety of relational databases.
Experience building data pipelines and deploying/maintaining them following modern DE best practices (e.g., dbt, Airflow, Spark, the Python OSS data ecosystem).
Knowledge of software engineering fundamentals and software development tooling (e.g., Git, CI/CD, JIRA), and familiarity with the Linux operating system and the Bash/Z shell.
Experience with cloud database technologies (e.g., Azure) and developing solutions on cloud computing services and infrastructure in the data and analytics space.
Basic familiarity with BI tools (e.g., Alteryx, Tableau, Power BI, Looker)
Expertise in ELT and data analysis, primarily using SQL
Conceptual knowledge of data and analytics, such as dimensional modelling, reporting tools, data governance, and structured and unstructured data
This job and many more are available through The Judge Group. Find us on the web at