SUMMARY: This position creates, implements, and maintains designs for the storage and maintenance of various on-premises and off-premises database systems. This position is also responsible for participating in development and deployment activities for those systems and for creating scripts for data maintenance and bug fixes. Works with various teams to provide information, knowledge, coordination, and tools that support the growth and continued success of NEP. Relies on knowledge and professional discretion to achieve goals. Significant ingenuity and flexibility are expected.
ESSENTIAL FUNCTIONS:
Develop and maintain data extraction, transformation, and loading (ETL) processes from various sources using Python, SSIS, or other tools.
Implement appropriate security measures throughout the data pipeline.
Maintain metadata management and data quality activities so that data are accurate, reliable, and documented.
Provide support as needed to production systems, ensuring that both internal and external clients' needs are met.
Facilitate the transmission and understanding of data to various stakeholders, such as leadership, faculty, and staff, to enable fact-based decisions.
Provide reliable and timely data that supports strategic planning, student success initiatives, and educational and operational effectiveness.
Utilize various software tools and reporting services to deliver actionable data to end users in a digestible form.
Possess a positive and constructive attitude.
Maintain reasonable and consistent attendance to fulfill the requirements of the position.
KNOWLEDGE, SKILLS, & ABILITIES:
T-SQL or other SQL languages
Databases: MSSQL, PostgreSQL
Python, R, SSIS, SSRS, Tableau
Azure, Google Cloud Platform, or AWS/on-premises
Hard skills: build, test, and improve/maintain ETL processes
Proficient with Microsoft Word and other applications in the Microsoft Office Suite
Improve data availability, usability, integrity and security
Enjoys data wrangling and is eager to understand internal and external business reporting needs.
Ability to communicate effectively and work with business stakeholders to arrive at the appropriate solution.
Comfortable working across departments to gather/spread necessary knowledge to complete a project.
Willingness to learn new languages/technologies as needed for the job.
EDUCATION & EXPERIENCE:
Bachelor’s degree in a related discipline
One to three (1-3) years of recent professional experience in data analysis and/or data engineering or equivalent educational experience.
Experience using Azure, Google Cloud Platform, or AWS/on-premises environments to build, test, and improve/maintain ETL processes.
Experience in higher education preferred
Experience working in a technology-driven enterprise preferred
All skills, abilities, and education will be considered toward the minimum qualifications.
WORKING CONDITIONS: This position operates in a remote, home office environment. This role routinely uses standard office equipment such as computers, printers, and phones.
Good working environment with the absence of disagreeable conditions.
The noise level in the work environment is usually moderate.