
Data Engineer (ETL / Enterprise Data Warehouse)
  • Python
  • Spark
  • SQL
  • Java
  • SAS
  • Big Data
  • Data Analysis
  • Excel
  • Database
  • Data Mining
  • ETL
  • Modeling
  • Hadoop
  • NoSQL
  • PostgreSQL
  • Business Intelligence
  • Azure
Blue Cross Blue Shield of Arizona
Phoenix, AZ 85021
140 days ago
Purpose of the Job
Designs and implements business intelligence and extract, transform, and load (ETL) solutions using programming, performance tuning, and data modeling.
Creates databases optimized for performance, implements schema changes, and maintains data architecture standards across all of the business’s databases.
Serves as a liaison between the Database Administration department and development teams.

Essential Job Functions and Responsibilities
  • Learn the area’s direct flow and how it affects surrounding systems and operational areas.
  • Architect, design, construct, test, tune, deploy, and support data integration solutions for various data management systems.
  • Contribute to the team’s knowledge base with useful information such as adopted standards, procedure documentation, and problem-resolution advice.
  • Participate in the promotion of SQL Server best practices.
  • Collaborate with other technology teams and architects to define and develop solutions.
  • Research and experiment with emerging data integration technologies and tools.
  • Work with the team to ensure disciplined software development processes, standards, and error-recovery procedures are deployed, maintaining a high degree of data quality.
  • Support enterprise database clustering, mirroring, and replication, among other SQL Server technologies.
  • Develop, write, and implement processing requirements and conduct post-implementation reviews.
  • Facilitate and/or create new procedures and processes that support advancing technologies or capabilities.
  • Design and implement extract, transform, and load (ETL) solutions utilizing SSIS.
  • Apply data mining rules.
  • Create logic, system, and program flows for complex systems, including interfaces and metadata.
  • Write and execute unit test plans; track and resolve any processing issues.
  • Implement and maintain operational and disaster-recovery procedures.
  • Participate in the review of code and/or systems for proper design standards, content, and functionality.
  • Participate in all aspects of the systems development life cycle.
  • Analyze files and map data from one system to another.
  • Adhere to established source control versioning policies and procedures.
  • Meet timeliness and accuracy goals.
  • Communicate status of work assignments to stakeholders and management.
  • Responsible for technical and production support documentation in accordance with department standards and industry best practices.
  • Maintain current knowledge of new developments in technology-related industries.
  • Participate in corporate quality and data governance programs.

The position requires a full-time work schedule. Full-time is defined as working at least 40 hours per week, plus any additional hours as requested or as needed to meet business requirements.

Employment Requirements

Required Work Experience
4 years of experience in computer programming, query design, and databases

Required Education
High school diploma or GED in general field of study

Required Licenses
N/A

Required Certifications
Preferred Work Experience
4+ years of experience building and managing complex Data Integration solutions.
4+ years of experience in computer programming, query design, and databases
4+ years of experience in database administration with SQL Server
Preferred Education
Bachelor’s Degree in Information Technology or related field preferred

Preferred Licenses
N/A

Preferred Certifications
MS SQL certification or other certification in current programming-language competencies
Required Job Skills

Intermediate skill in use of office equipment, including copiers, fax machines, scanners, and telephones
Intermediate PC proficiency in spreadsheet, database and word processing software
Advanced knowledge of business intelligence, programming, and data analysis software
Intermediate knowledge of Microsoft SQL databases and database administration
Intermediate proficiency in T-SQL, NZ-SQL, PostgreSQL, NoSQL, Hadoop, data tuning, enterprise data modeling and schema change management.
Working technical knowledge of current software protocols and Internet standards to the extent that they apply to database administration.
Excellent database troubleshooting skills
Working technical knowledge of PowerShell
Strong object-oriented design and analysis skills
Experience consuming, organizing, and analyzing JSON and XML messages as data

Required Professional Competencies
Knowledge of agile development practices
Strong analytical skills to support independent and effective decisions
Ability to prioritize tasks and work with multiple priorities, sometimes under limited time constraints.
Perseverance in the face of resistance or setbacks.
Effective interpersonal skills and ability to maintain positive working relationships with others.
Verbal and written communication skills and the ability to interact professionally with a diverse group of executives, managers, and subject matter experts.
Systems research and analysis skills
Ability to write and present business intelligence documentation
Demonstrate the ability to stay current on global threats and vulnerabilities.
Maintain confidentiality and privacy

Required Leadership Experience and Competencies
Build synergy with a diverse team in an ever-changing environment.

Preferred Job Skills
Advanced knowledge of Data Integration
Advanced proficiency with relational technologies that supplement RDBMS tool sets
Advanced knowledge of Microsoft Applications and Suites, Windows Server, and Microsoft SQL databases.
Advanced knowledge of decision support systems
Advanced knowledge of Netezza administration
Advanced proficiency in Talend Open Studio or Profisee Maestro Enterprise Data Warehouse (EDW) tools
Minimum 1–2 years of experience with cloud computing, Azure preferred
Experience supporting Spark-R and R
Intermediate knowledge of Python scripting
Knowledge of any of the common Hadoop tools, such as NiFi, Hive, Pig, Oozie, HBase, Flume, Sqoop, YARN, MapReduce, Ambari, Spark, Java, and Python
Proficiency with agile development practices
Experience collecting and storing data from RESTful APIs

Preferred Professional Competencies
Advanced systems research and analysis expertise
Solid technical ability and problem-solving skills

Preferred Leadership Experience and Competencies
Experience participating in a highly collaborative scrum team.
Experience in developing, maintaining and enhancing big data solutions.
Demonstrated experience and patience in mentoring and sharing knowledge with peers.
