Job description

A career in Products and Technology is an opportunity to bring PwC's strategy to life by driving products and technology into everything we deliver. Our clients expect us to bring the right people and the right technology to solve their biggest problems; Products and Technology is here to help PwC meet that challenge and accelerate the growth of our business. We have skilled technologists, data scientists, product managers and business strategists who are using technology to accelerate change. Our team establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route appropriately and store using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. We develop the data sets and pipelines that will support analysis and model development including improving data quality, integration of disparate data sources, enhancement of data, transformation of data and development of performant infrastructure for access and reporting. As well, we design, build and oversee the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources.
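To make the pipeline work described above concrete, here is a minimal, illustrative sketch (not PwC code; all table and field names are hypothetical) of an extract-transform-load step that integrates a raw feed, applies a data-quality rule, and loads the result into a queryable store:

```python
# Minimal ETL sketch: ingest a raw CSV feed, improve its quality,
# and load it into a relational store for access and reporting.
# Field names (client_id, revenue, region) are made up for illustration.
import csv
import io
import sqlite3

RAW_FEED = """client_id,revenue,region
101,1200.50,NE
102,,SW
103,875.00,ne
"""

def transform(row):
    """Clean one record: drop rows missing revenue, normalize region."""
    if not row["revenue"]:
        return None  # data-quality rule: revenue is required
    return (int(row["client_id"]), float(row["revenue"]), row["region"].upper())

def run_pipeline(raw_text):
    """Load cleaned records into an in-memory SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE clients (client_id INTEGER, revenue REAL, region TEXT)")
    rows = csv.DictReader(io.StringIO(raw_text))
    cleaned = [t for t in (transform(r) for r in rows) if t is not None]
    conn.executemany("INSERT INTO clients VALUES (?, ?, ?)", cleaned)
    return conn

conn = run_pipeline(RAW_FEED)
print(conn.execute("SELECT COUNT(*), SUM(revenue) FROM clients").fetchone())
# Two rows survive the quality rule; their revenue sums to 2075.50
```

In a production setting the same shape scales up: the CSV source becomes a message stream or data-lake extract, and SQLite becomes a distributed warehouse.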


To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be an authentic and inclusive leader, at all grades/levels and in all lines of service. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

As a Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:

  • Use feedback and reflection to develop self-awareness, personal strengths and address development areas.
  • Delegate to others to provide stretch opportunities and coach to help deliver results.
  • Develop new ideas and propose innovative solutions to problems.
  • Use a broad range of tools and techniques to extract insights from current trends in the business area.
  • Review your work and that of others for quality, accuracy and relevance.
  • Share relevant thought leadership.
  • Use straightforward communication, in a structured way, when influencing others.
  • Read situations and modify behavior to build quality, diverse relationships.
  • Uphold the firm's code of ethics and business conduct.


Job Requirements and Preferences:

Basic Qualifications:

Minimum Degree Required:
Bachelor's Degree

Additional Educational Requirements:
Bachelor's degree; or, in lieu of a degree, demonstration of three years of specialized training and/or progressively responsible work experience in technology for each missing year of college, in addition to the minimum years of experience required for the role.

Minimum Years of Experience:
2 year(s)

Preferred Qualifications:

Degree Preferred:
Master's Degree

Preferred Fields of Study:
Business Analytics, Computer and Information Science, Mathematics

Certification(s) Preferred:
CCP Data Engineer Exam (DE575), CCA Spark and Hadoop Developer (CCA175), Oracle Certified Professional, Java SE 8 Programmer Certification Overview, Certified Professional in Python Programming Level 1 or 2

Preferred Knowledge/Skills:
Demonstrates thorough abilities and/or a proven record of success in the following areas:

  • Designing data integrations and data quality framework utilizing cloud computing platforms such as AWS, GCP and Azure;
  • Building data lakes and performing the data analysis required to troubleshoot and assist in the resolution of data-related issues;
  • Working within relational databases and writing SQL queries;
  • Developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity;
  • Leveraging big data machine learning toolkits such as SparkML, messaging systems (Kafka) and NoSQL databases (Cassandra, HBase, MongoDB);
  • Leveraging knowledge in computer science and being comfortable in programming in a variety of languages, including Java, Python, Scala;
  • Determining the appropriate software packages or modules to run, and how easily they can be modified;
  • Handling large scale structured and unstructured data from internal and third party sources;
  • Architecting highly scalable distributed data pipelines using open source tools and big data technologies such as Hadoop, Pig, Hive, Presto, Spark, Drill, Sqoop and ETL frameworks;
  • Utilizing Linux shell scripting and containerization technologies (Docker, Kubernetes); and,
  • Leading teams in a dynamic work environment while managing stakeholder expectations and scope.

Demonstrates thorough abilities and/or a proven record of success as a team leader, including:

  • Collaborating with analytics and business teams to improve data models that feed business intelligence tools, increase data accessibility and foster data-driven decision making across the organization;
  • Implementing processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it;
  • Managing large scale structured and unstructured data from internal and third-party sources;
  • Utilizing Linux shell scripting and containerization technologies (Docker, Kubernetes); and,
  • Documenting systems, refining requirements, self-identifying solutions and communicating to the team.
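The "monitor data quality" and "writing SQL queries" skills above can be sketched together in a few lines. This is an illustrative example only (the `orders` table and rule names are invented for the sketch, not taken from the posting); the same pattern generalizes to scheduled checks against a production warehouse:

```python
# Hedged sketch of a data-quality monitor over a relational store.
# Each rule is a SQL query that counts violating rows; a nonzero
# count signals a quality problem to surface to stakeholders.
import sqlite3

QUALITY_RULES = {
    "no_null_ids": "SELECT COUNT(*) FROM orders WHERE order_id IS NULL",
    "no_negative_amounts": "SELECT COUNT(*) FROM orders WHERE amount < 0",
}

def check_quality(conn):
    """Return {rule_name: violation_count} for each configured rule."""
    return {name: conn.execute(sql).fetchone()[0]
            for name, sql in QUALITY_RULES.items()}

# Demo data: one NULL id and one negative amount among three rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 19.99), (2, -5.00), (None, 42.00)])
print(check_quality(conn))  # {'no_null_ids': 1, 'no_negative_amounts': 1}
```

Expressing rules as plain SQL keeps them reviewable by analytics and business teams alike, which is the collaboration the bullets above describe.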

At PwC, our work model includes three ways of working: virtual, in-person, and flex (a hybrid of in-person and virtual). Visit the following link to learn more: https://pwc.to/ways-we-work.

PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: https://pwc.to/H-1B-Lottery-Policy.

All qualified applicants will receive consideration for employment at PwC without regard to race; creed; color; religion; national origin; sex; age; disability; sexual orientation; gender identity or expression; genetic predisposition or carrier status; veteran, marital, or citizenship status; or any other status protected by law. PwC is proud to be an affirmative action and equal opportunity employer.

For positions based in San Francisco, consideration of qualified candidates with arrest and conviction records will be in a manner consistent with the San Francisco Fair Chance Ordinance.

For positions in Colorado, visit the following link for information related to Colorado's Equal Pay for Equal Work Act: https://pwc.to/coloradoproductstechseniorassociate.

#LI-Remote
