YASH Technologies

Data Engineer_Python_PySpark Job

Job description

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.

At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real, positive change in an increasingly virtual world. That purpose drives us beyond generational gaps and the disruptions of the future.

We are looking to hire Python professionals in the following areas:


Our Digital Service Line is currently looking for seasoned Data Engineers with strong hands-on experience.

The shortlisted candidate should be able to analyse technical needs and work with customers to develop project scope-of-work documents and project plans.

The responsibilities are primarily technical, although they also require a strong functional understanding of the business process.

Data Engineering (DataEng)

  • Experience: 3 to 6 years
  • Degree in computer science, engineering, or similar fields
  • Skill Set: Python, PySpark

Primary Responsibilities

  • Design, develop, test, and support data pipelines and applications
  • Industrialize data feeds
  • Create data pipelines into existing systems
  • Improve data cleansing and facilitate connectivity between internal and external data sources and applied technologies
  • Collaborate with Data Scientists
  • Establish a continuous quality-improvement process to systematically optimize data quality
  • Translate data requirements from data users into ingestion activities

Requirements
  • B.Tech/B.Sc./M.Sc. in Computer Science or a related field and 3+ years of relevant industry experience
  • Agile mindset and a spirit of initiative
  • Interest in solving challenging technical problems
  • Experience with test driven development and CI/CD workflows
  • Knowledge of version control software such as Git and experience working with major hosting services (e.g., Azure DevOps, GitHub, Bitbucket, GitLab)
  • Experience working with cloud environments such as AWS, especially creating serverless architectures and using infrastructure-as-code tools such as CloudFormation/CDK, Terraform, and ARM
  • Hands-on experience with various frontend and backend languages (e.g., Python, R, Java, Scala, C/C++, Rust, TypeScript, …)
  • Knowledge of container technologies such as Docker and Kubernetes
  • Experience with and understanding of Apache Spark and the Hadoop ecosystem
  • Knowledge of workflow engines such as Apache Airflow, Oozie, Kubeflow
  • Knowledge of protocols for authentication such as SAML and OAuth2
  • Experience in creating and interfacing with RESTful APIs
  • Experience in creating productive and robust ETL pipelines for batch as well as streaming ingestion
  • Basic knowledge of statistics and machine learning is a plus

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.

Our Hyperlearning workplace is grounded upon four principles:

  • Flexible work arrangements, free spirit, and emotional positivity
  • Agile self-determination, trust, transparency, and open collaboration
  • All support needed for the realization of business goals
  • Stable employment with a great atmosphere and an ethical corporate culture
