Innovative Solutions

Data Engineer/Sr Data Engineer

Job description

As a Data Engineer on our Professional Services team, you will work with customers, understand their needs and requirements, and discuss with them the “art of the possible”. You will also design and implement solutions for data warehouses, data lakes, ETL jobs, and data pipelines using AWS services, and design and implement AI/ML solutions using AWS or IBM services such as Bedrock and WatsonX.

You will be responsible for:

  • Consulting with customers to:
      • Understand their data management strategy
      • Provide insights into optimizing their data strategy
      • Architect and design data management processes
      • Excite customers about how AI/ML services can enhance their business
  • Implementing and deploying machine learning models to enable advanced analytics
  • Documenting data architectures and data flows
  • Driving innovation internally and for customers
  • Contributing to R&D projects that may turn into new service offerings


How you will be successful:

  • Living and breathing the “cloud-first” approach
  • Thinking analytically to solve complex business problems
  • Obsessively delivering amazing customer experiences
  • Continually tracking new developments in one or more cloud platforms
  • Building trusting relationships with all team members
  • Being comfortable pushing boundaries and technical limits
  • Keeping up to date on industry trends
  • Always learning


We are hiring two engineers. To qualify for the Mid-level range:

  • Able to modify and improve existing data sets and structures
  • 5+ years of professional services experience, with customer-facing responsibilities
  • 2+ years of professional AWS and/or WatsonX experience
  • At least one AWS or Google certification
  • Proficient in one or more of these languages: Python, R, Java, Scala
  • Experience in:
      • SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, and Cassandra to store and query large datasets
      • Data modeling: designing conceptual, logical, and physical data models
      • ETL (Extract, Transform, Load) tools such as Informatica, Talend, and Pentaho to integrate and move data between systems
      • Big data frameworks such as Hadoop, Spark, or Kafka for distributed data processing and building data lakes
      • Machine learning frameworks such as TensorFlow, PyTorch, Keras, and Scikit-Learn for building ML models
      • Dimensional modeling, star schemas, and data warehouses
      • Data architecture patterns such as lambda and kappa architecture
  • Experience working within standard agile methodologies
  • Business Intelligence experience with Power BI and/or QuickSight is a great addition
  • The ability to design scalable and flexible data pipelines is also a great addition


To qualify for the Sr-level range:

  • 5+ years of consulting/professional services experience, with customer-facing responsibilities
  • 4+ years of professional AWS and/or WatsonX experience
  • At least one AWS Professional-level certification
  • Proficient in one or more of these languages: Python, R, Java, Scala
  • Experience in:
      • Designing and building data pipelines
      • Creating machine learning models or using LLMs
      • SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, and Cassandra to store and query large datasets
      • Data modeling: designing conceptual, logical, and physical data models
      • Dimensional modeling, star schemas, and data warehouses
      • ETL (Extract, Transform, Load) tools such as Glue, Informatica, Talend, and Pentaho to integrate and move data between systems
      • Big data frameworks such as Hadoop, Spark, or Kafka for distributed data processing and building data lakes
      • Machine learning frameworks such as TensorFlow, PyTorch, Keras, and Scikit-Learn for building ML models
      • Data architecture patterns such as lambda and kappa architecture
  • Experience working within standard agile methodologies
  • Business Intelligence experience with Power BI and/or QuickSight is a great addition
  • The ability to design scalable and flexible data pipelines is also a great addition


The salary range provided is a general guideline. When extending an offer, Innovative considers factors including, but not limited to, the responsibilities of the specific role, market conditions, geographic location, as well as the candidate’s professional experience, key skills, and education/training.
