Location: Hoboken, NJ OR Remote, USA
Our mission is to unlock human potential. We welcome you for who you are, the background you bring, and we embrace individuals who get excited about learning. Bring your experiences, your perspectives, and your passion; it’s in our differences that we empower the way the world learns.
About the Role:
We are seeking a professional Data Engineer who is passionate about data, analytics, and automation! You will work closely with data engineers and data architects to build and maintain data pipelines feeding the different zones of our Data Warehouse environment (Snowflake). You will identify the optimal approach for extracting, transforming, and loading data into the different zones of the Data Lake (Raw/Native, Processed/Transformed, Enriched, Archive). This includes designing and developing the artifacts that prepare data for movement, storage, and consumption: views, ELT processes, extracts, and other processes that manipulate, aggregate, clean, or enrich the data.
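As a rough illustration of the kind of zone-to-zone ELT step described above (a minimal sketch; the zone semantics, field names, and cleaning rules are invented assumptions, not Wiley's actual logic):

```python
# Hypothetical sketch of one ELT step: promoting records from the
# Raw/Native zone to the Processed/Transformed zone. The cleaning
# rules and field names below are illustrative assumptions only.

def promote_to_processed(raw_records):
    """Clean and standardize raw records before loading them into the
    Processed zone: normalize field names, trim whitespace from string
    values, and reject rows that are missing their key."""
    processed = []
    for record in raw_records:
        # Normalize keys to lowercase and strip stray whitespace
        cleaned = {
            key.strip().lower(): (value.strip() if isinstance(value, str) else value)
            for key, value in record.items()
        }
        if not cleaned.get("id"):  # skip rows missing the primary key
            continue
        processed.append(cleaned)
    return processed


raw = [
    {"ID ": " 101 ", "Name": " Ada "},
    {"id": None, "name": "orphan row"},  # rejected: no id
]
print(promote_to_processed(raw))
```

In a real pipeline this logic would typically live in SQL views or stored procedures inside Snowflake rather than in application code, but the shape of the transformation is the same.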
How you will make an impact:
Designing and implementing secure data pipelines into a Snowflake data warehouse from on-premises and cloud data sources
Designing and implementing high-performing data pipelines feeding downstream systems
Mentoring other team members in data engineering to ensure optimal utilization of Snowflake resources for various operational and project activities.
Collaborating with Quality Engineering and Solution Management teams for capacity planning
Collaborating with offshore data engineers who will be building or testing data applications in the Snowflake environment.
Working with Business Analysts and Users to translate functional specifications into technical requirements and designs.
Designing and implementing high-performing BI dashboard integrations (Cognos and QlikView), working with reporting and data visualization developers
Defining best practices and standards for data pipelines and integration with the Snowflake data lake and warehouses, in collaboration with the Data Architect and other data leads, ensuring enterprise security and access control policies are adhered to in the solution
What we look for:
B.S. in Computer Science or a related degree
Proven track record developing ETL, ELT, and Data Warehousing solutions, developing SQL scripts and stored procedures that process data from databases, and data modeling
Strong understanding of various data formats such as CSV, XML, JSON, etc.
Expert with batch job scheduling and identifying data/job dependencies
Proficient with information retrieval technologies such as Solr, Elasticsearch, and Lucene
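The data formats called out above (CSV, XML, JSON) can be sketched with a small standard-library example that reads the same logical record from each format (the field names and sample payloads are invented for demonstration):

```python
# Illustrative sketch: parsing the same record from CSV, JSON, and XML
# using only the Python standard library. Payloads are made up.
import csv
import io
import json
import xml.etree.ElementTree as ET

csv_payload = "id,title\n42,Data Pipelines\n"
json_payload = '{"id": "42", "title": "Data Pipelines"}'
xml_payload = "<book><id>42</id><title>Data Pipelines</title></book>"

# CSV: DictReader yields one dict per data row
csv_record = next(csv.DictReader(io.StringIO(csv_payload)))

# JSON: direct deserialization to a dict
json_record = json.loads(json_payload)

# XML: walk the element tree and collect tag/text pairs
root = ET.fromstring(xml_payload)
xml_record = {child.tag: child.text for child in root}

assert csv_record == json_record == xml_record
print(csv_record)
```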
Enabling Discovery, Powering Education, Shaping Workforces.
We clear the way for seekers of knowledge: illuminating the path forward for research and education, tearing down barriers to society’s advancement, and giving seekers the help they need to turn their steps into strides.
Wiley may have been founded over two centuries ago, but our secret to success remains the same: our people. We are willing to challenge the status quo, move the needle, and be innovative. Wiley’s headquarters are located in Hoboken, New Jersey, with operations across the globe in more than 40 countries.
Wiley is an equal opportunity/affirmative action employer. We evaluate all qualified applicants and treat all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, disability, protected veteran status, genetic information, or based on any individual’s status in any group or class protected by applicable federal, state or local laws. Wiley is also committed to providing reasonable accommodation to applicants and employees with disabilities. Applicants who require accommodation to participate in the job application process may contact email@example.com for assistance.
When applying, please attach your resume/CV to be considered