As Acronis is dedicated not only to cyber protection but also to the general protection of its current and potential employees, the recruitment and onboarding processes are being held online during the current global COVID-19 situation.
Acronis leads the world in cyber protection - solving safety, accessibility, privacy, authenticity, and security (SAPAS) challenges with innovative backup, security, disaster recovery, and enterprise file sync and share solutions that run in hybrid cloud environments: on-premises, in the cloud, or at the edge. Enhanced by AI technologies and blockchain-based data authentication, Acronis protects all data, applications and systems in any environment, including physical, virtual, cloud, and mobile.
With dual headquarters in Switzerland and Singapore, Acronis protects the data of more than 5 million consumers and 500,000 businesses in over 150 countries and 20 languages.
ACEP DataWarehouse is a large project that serves product managers as a source of insights about product usage and other user metrics. It is also a data source for the CyberCube data warehouse. The Data Engineer on the Performance Engineering team will be responsible for overall data warehouse schema design, source management, and data flow control.
- Internal Data Warehouse business model ownership
- Data schema design and ownership, including comprehensive schema documentation
- Investigation of current product data flows and operational databases with respect to data semantics, lineage, etc.
- Mediation between the Services team, DWH developers, and the BI team to quickly resolve issues or introduce changes to the data model
- Data governance & Data quality
- Introducing automated data quality controls, including detection of anomalies and data loss
- Business support for troubleshooting data loss incidents and for data migrations
- Defining requirements for systems and services analytics
- Data analysis
- Manual and automated DWH data analysis and the search for insights
- Searching for hidden dependencies (correlations) in clients' behaviour, based on their cohorts, when interacting with the product
- Participation in the creation of business metrics and their visualisation (dashboards) for decision making
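As a minimal illustration of the anomaly-detection duty above, the sketch below flags days whose load volume deviates sharply from a rolling baseline — a common first check for silent data loss in a warehouse. It uses pandas (listed in the requirements); the function name, window size, and threshold are illustrative assumptions, not part of Acronis's actual pipeline.

```python
import pandas as pd

def flag_volume_anomalies(daily_counts: pd.Series,
                          window: int = 7,
                          z_threshold: float = 3.0) -> pd.Series:
    """Flag days whose row count deviates strongly from the prior rolling baseline.

    daily_counts: Series indexed by date, values = rows loaded per day.
    Returns a boolean Series marking anomalous days (possible data loss or spikes).
    """
    # Baseline built only from *previous* days, so today's outlier
    # does not dilute its own reference statistics.
    prior = daily_counts.shift(1)
    baseline_mean = prior.rolling(window, min_periods=window).mean()
    baseline_std = prior.rolling(window, min_periods=window).std()
    z = (daily_counts - baseline_mean) / baseline_std
    return z.abs() > z_threshold

# Hypothetical usage: thirteen normal days, then a sudden drop suggesting data loss.
counts = pd.Series(
    [1000, 1020, 980, 1010, 995, 1005, 990, 1015, 1000, 985, 1010, 1020, 995, 120],
    index=pd.date_range("2021-01-01", periods=14),
)
flags = flag_volume_anomalies(counts)
print(flags[flags].index.tolist())  # only the final, sharply lower day is flagged
```

In practice such checks would run per table after each ETL cycle, with thresholds tuned to each source's natural variance.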
SKILLS & EXPERIENCE:
- 3+ years of experience with Data Warehouse architecture and design, data flow design, and ETL implementation
- Deep knowledge of SQL; experience implementing, profiling, and optimising complex queries
- Python: experience with NumPy, Pandas, Scikit-learn, or similar

Would be a plus:
- Experience with the Hadoop stack, Hive, and PySpark
- Experience with TensorFlow (Keras) or PyTorch
- Experience with tools: Git, Confluence, Jira, DevOps