Job description

Company Description

Beyondsoft International (Singapore) Pte. Ltd. was set up in 2007 and established as the regional headquarters for the Southeast Asia (SEA) and European markets in September 2015. Guided by our vision of "using technology to promote social progress and economic development and become a globally preferred partner for our customers," and our motto of "beyond your expectations," Beyondsoft is committed to providing customers in countries along the Belt and Road with comprehensive solutions and products, creating commercial value for customers, and realizing continuous business development.

Our Core Business Includes

  • IT development services: Providing customers with IT consulting, software research and development, software and hardware testing, system integration, operation and maintenance, data analysis, and other services;
  • New retail solutions and products: Helping small and medium-sized enterprises (SMEs) realize the digital transformation of their daily operations through intelligent products;
  • Internet of Things (IoT) platform and solutions: Making comprehensive use of IoT, artificial intelligence, big data, cloud computing, and other technologies to provide IoT solutions for intelligent upgrades in cities, parks, buildings, and industries, to create a smart future.

For more information, please visit www.beyondsoft.com.

Responsibilities

  • Collaborate with cross-functional teams, including software developers, data scientists, and business analysts, to understand data requirements and design optimal data solutions.
  • Design, build, and maintain scalable and high-performance data pipelines for extracting, transforming, and loading (ETL) data from various sources to data warehouses or databases.
  • Develop and manage data integration processes, ensuring data accuracy, consistency, and reliability.
  • Implement and maintain efficient data models and schemas in collaboration with database administrators and data architects.
  • Optimize database performance by tuning SQL queries, indexing strategies, and other database-related parameters.
  • Monitor and troubleshoot data pipelines, processes, and software deployments to identify and resolve issues in a timely manner.
  • Utilize Linux skills to manage and maintain overall systems and data infrastructure, including setting up servers, configuring services, and managing security settings.
  • Collaborate with DevOps teams to ensure smooth deployment and operation of data solutions in a cloud or on-premises environment.
  • Work closely with software developers to assist in software deployment, configuration, and troubleshooting tasks.
  • Create and maintain documentation for data engineering processes, data flows, infrastructure setup, and troubleshooting procedures.

Qualifications

  • Bachelor’s degree in Information Technology, Computer Science, or a related field.
  • At least 3 years of experience as a Data Engineer, developing and maintaining data pipelines and integration solutions, and contributing to software deployment and troubleshooting.
  • Strong skills in SQL, T-SQL, ETL processes, data modeling concepts, and metadata-driven ingestion frameworks.
  • Familiarity with data integration tools like Airflow, SQL Server Integration Services, AWS Glue, Azure Data Factory, or Informatica.
  • Solid understanding of database management systems, with hands-on experience in systems like PostgreSQL, MySQL, or Microsoft SQL Server.
  • Proficient in at least one programming language used for data manipulation: Python, Java, or Scala.
  • Knowledge of data warehouse design, including star schemas, snowflake schemas, and columnar storage.
  • Experience with data warehousing technologies like Amazon Redshift, Google BigQuery, Azure Synapse, or Snowflake.
  • Familiarity with Linux operating systems, with experience in managing Linux-based infrastructure, version control systems (e.g., Git), and CI/CD pipelines.
