Job Description

Stefanini Group is hiring!

Stefanini is looking for a Data Engineer. Location: Dearborn, MI.

For quick apply, please reach out to Rakesh Singh at 248-936-5005 / rakesh.singh@stefanini.com.

Open to W2 candidates only!

The successful candidate will be responsible for designing and developing the transformation and modernization of big data solutions on the GCP cloud, integrating native GCP services and third-party data technologies, and building new data products in GCP. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design and develop the right solutions with an appropriate combination of GCP and third-party technologies for deployment on the GCP cloud.

Responsibilities:

  • Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for the successful deployment of the Data Platform.
  • Implement methods to automate all parts of the pipeline to minimize labor in development and production.
  • Identify, develop, evaluate, and summarize Proof of Concepts to prove out solutions.
  • Test and compare competing solutions and report a point of view on the best one.
  • Apply experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP.
  • Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine (an illustrative sketch of this pipeline pattern follows this list).
  • Migrate existing Big Data pipelines into Google Cloud Platform. Build new data products in GCP.
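For illustration only (not part of the role description), a minimal sketch of the kind of streaming pipeline pattern referenced above, assuming a hypothetical Pub/Sub subscription, BigQuery table, and schema, could look like the following Apache Beam (Python SDK) job:

    # Illustrative Apache Beam streaming pipeline (Python SDK).
    # Project, subscription, table, and schema names are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # On Dataflow this would typically be launched with --runner=DataflowRunner
        # plus project, region, and temp_location flags.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Read raw messages from a Pub/Sub subscription.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/events-sub")
                # Decode and parse each message as JSON.
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                # Append rows to a BigQuery table.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )


    if __name__ == "__main__":
        run()

A pipeline like this would typically be orchestrated with Cloud Composer (Apache Airflow) and have its infrastructure provisioned with Terraform, in line with the tooling named in this posting.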

Job Requirements

Experience Required:

  • Minimum 3 years of in-depth experience in Java/Python.
  • Minimum 2 years of experience building data engineering pipelines and data warehouse systems, with the ability to understand ETL principles and write complex SQL queries.
  • Minimum 5 years of GCP experience working on GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, and Dataproc.
  • Minimum 2 years of development experience with data warehousing and the big data ecosystem: Hive (HQL) and the Oozie scheduler, plus ETL tools such as IBM DataStage and Informatica IICS with Teradata.
  • Minimum 1 year of experience deploying Google Cloud services using Terraform.

Skills Preferred:

  • Understands cloud as a way to operate, not just a place to host systems.
  • Understands data architectures and design independent of the technology.
  • Experience with Python and shell scripting preferred.
  • Exceptional problem-solving and communication skills, and management of multiple stakeholders.
  • Experience working with Agile and Lean methodologies.
  • Experience with test-driven development.

Education Required:

  • Bachelor's or master's degree in the relevant field.
  • Listed salary ranges may vary based on experience, qualifications, and local market. Some positions may also include bonuses or other incentives.

Stefanini takes pride in hiring top talent and developing relationships with our future employees. Our talent acquisition teams will never make an offer of employment without having a phone conversation with you. Those face-to-face conversations will involve a description of the job for which you have applied. We also speak with you about the process including interviews and job offers.

About Stefanini Group

The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. We have a presence across the Americas, Europe, Africa, and Asia, and serve more than four hundred clients across a broad spectrum of markets, including financial services, manufacturing, telecommunications, chemical services, technology, the public sector, and utilities. Stefanini is a CMM Level 5 IT consulting company with a global presence.

