Job description


This position will design, develop, enhance, debug, support, maintain, and test AWS Cloud solutions that support the Molex business. These solutions may involve diverse platforms, software, technologies, and tools, providing a challenging and exciting environment. The position will be involved in all aspects of delivering these solutions, from design and development through implementation, including data assessment and assisting with strategic decisions and execution for migration and new-system data integration. The position has no direct reports but is part of larger teams, requiring excellent communication and collaboration skills with other technical groups as well as business leaders. Individuals should be able to work with minimal supervision and general guidance as they deliver superior solutions for the Molex business.

A Day In The Life Could Include


  • Working with Global functional and technical teams.
  • Working with application architects to determine solution design, development and testing for data engineering activities
  • Applying data warehouse and data engineering concepts (ETL, near-real-time and real-time streaming, data structures, metadata, and workflow management)
  • Reformulating existing frameworks to optimize their functioning
  • Using your technical and process aptitude to come up to speed on new tools and concepts required for integration development and support
  • Challenging the status quo and focusing on long-term value when designing solutions
  • Adopting best practices in the implementation and execution of support processes
  • Collaborating with various IT teams such as infrastructure support, networking, database administrators, web development and business application developers
  • Participating as the data engineering resource on small to large projects using both waterfall and agile methodologies
  • Participating in data governance projects and solution design
  • Performing data assessments and assisting with strategic decisions on migration and new-system data integration
  • Utilizing strong problem-solving and analytical skills as well as good written and verbal communication skills
  • Completing key project work and support activities
  • Maintaining system support documentation
  • Identifying and implementing process improvements


What You Will Need To Bring With You:

  • Bachelor’s degree in technology related field
  • Minimum of 5 years IT experience
  • Minimum of 4 years of hands-on experience with ETL tools such as Talend, Informatica, or AWS Glue
  • Experience working with cloud-based environments: AWS storage and database services (S3, EC2, RDS, Redshift, Aurora) plus Snowflake, compute services (Lambda, EMR), and AWS serverless services (Glue, Athena)
  • Strong programming experience in PySpark to perform real-time, large-scale data processing in a distributed environment.
  • Experience migrating data to AWS. Data replication experience (HVR tool experience is preferred).
  • Experience or knowledge working with cloud-based database platforms like Redshift and Snowflake
  • Experience preparing plans for ETL procedures, with hands-on experience in ETL tools such as Talend, Informatica, and AWS Glue
  • Experience writing and troubleshooting scripts in Python and Bash on Linux; strong SQL-writing skills
  • Advanced knowledge of performance tuning related to ETL development
  • Strong Data Warehousing experience with an understanding of database structures, design, theories and SCD (slowly changing dimensions) principles.
  • Experience working with complex and potentially unstructured data sets (JSON, XML, Parquet)
  • Ability to develop repeatable development and administration processes with supporting documentation.
  • Experience working with users and architects to capture requirements, develop and test infrastructure.
  • Strong verbal and written communication
  • Strong analytical and problem-solving skills
  • Must be detail-oriented
  • Economic thinking

What Will Put You Ahead

  • AWS certification (Solutions Architect – Associate)
  • Experience with Talend Big Data
  • Experience with SQL scripting, Bash and Python
  • Experience with Redshift and Snowflake data model design and implementation
  • Experience with Athena tables over unstructured file formats
  • Experience in data replication
  • Experience using a CI/CD platform
  • Knowledge of architecture design and best practices.

Other Considerations


  • Self-motivated problem solver
  • Go-getter personality
  • Mentoring skills.

Koch is proud to be an equal opportunity workplace.
