Data Engineer

Location: New York, NY 10013

*** Mention DataYoshi when applying ***

About Dorilton Capital:


We prefer to create value over the long term by reinvesting cash flow while avoiding excessive leverage.


We work actively with existing management teams, recognizing that long-term business success is the result of a team effort. Dorilton views its role as providing additional capital for acquisitions and growth projects, as well as the support and expertise to take its companies to the next level.


We partner with companies that are led by strong management teams and have a successful history and culture. We firmly believe in our companies continuing with the elements that made them successful.

About the Data Engineer:

The Data Engineer will work within the BI/DW team and will be responsible for Informatica Cloud development. The candidate should have experience developing end-to-end data pipelines using Informatica Cloud (including parameters) and should have worked with multiple connector types, including SQL Server, Snowflake, Dynamics 365, and Azure Blob.


Responsibilities:

  • Understand the requirements and the larger initiatives each project is part of, and build solutions with the end state in mind
  • Create end-to-end data engineering solutions, including:
    • Analyzing source data
    • Identifying the optimal connector
    • Developing new frameworks or leveraging existing ones
    • Parameterizing and automating pipelines
  • Work directly with end users to understand requirements


Requirements:

  • 8+ years' experience with Informatica PowerCenter, including Informatica Intelligent Cloud Services (IICS)
  • 8+ years' experience working with relational databases and data warehouses, with the ability to write, analyze, and optimize complex SQL queries
  • 3+ years' experience with Snowflake, including SQL development and database administration
  • 2+ years' experience with Azure
  • Experience with IICS Application Integration components such as Processes, Service Connectors, and Process Objects
    • Must have experience building IICS pipelines using parameterization, REST APIs, and task flows
    • Should have experience automating and scheduling jobs (e.g., RunAJob)
  • Strong understanding of SQL and PL/SQL programming
    • Ability to write complex queries (joining multiple tables), optimize and tune them, and use analytical functions
  • Good understanding of Azure Blob Storage and Kubernetes
  • Well versed with all Informatica Client Components (PowerCenter Designer, Workflow Manager, Workflow Monitor, Repository Manager)
  • Experience with data model creation (able to adapt to any major modeling tool)
    • Should have created data models in 3NF and dimensional modeling
  • Experience with the Data Vault method of data warehousing
  • Experience using Python for data science, with knowledge of common Python libraries
  • Snowflake and Informatica certifications preferred
  • Good statistical and analytical knowledge
  • Excellent communication skills and a good team player
  • Communicates effectively with business and technical stakeholders as well as leadership
  • Experience gathering and refining requirements
  • Demonstrated ability to produce high-quality results with attention to detail
  • Excellent analytical, organization, and prioritization skills
  • Experience working with Agile methodology and JIRA
  • Positive, can-do attitude
  • Able to work in a fast-paced environment, with the ability to build systems

