Job description

Role Proficiency

Independently provides expertise on data analysis techniques using software tools, streamlines business processes, and manages the team


  • Managing and designing the reporting environment, including data sources, security, and metadata.
  • Providing technical expertise on data storage structures, data mining, and data cleansing.
  • Supporting the data warehouse in identifying and revising reporting requirements.
  • Supporting initiatives for data integrity and normalization.
  • Assessing, testing, and implementing new or upgraded software, and assisting with strategic decisions on new systems.
  • Synthesizing both quantitative and qualitative data into insights.
  • Generating reports from single or multiple systems.
  • Troubleshooting the reporting database environment and reports.
  • Understanding business requirements and translating them into executable steps for team members.
  • Identifying and recommending new ways to streamline business processes.
  • Illustrating data graphically and translating complex findings into written text.
  • Locating results to help clients make better decisions; gathering client feedback and offering to build solutions based on it.
  • Reviewing the team's deliverables before sending final reports to stakeholders.
  • Supporting cross-functional teams with data reports and insights.
  • Training end users on new reports and dashboards.
  • Setting FAST goals and providing feedback on reportees' FAST goals.

Measures Of Outcomes

  • Quality - number of review comments on code written
  • Accountable for data consistency and data quality.
  • Number of medium to large custom application data models designed and implemented
  • Illustrates data graphically and translates complex findings into written text.
  • Number of results located to help clients make informed decisions.
  • Attention to detail and level of accuracy.
  • Number of business processes changed due to vital analysis.
  • Number of Business Intelligence dashboards developed
  • Number of productivity standards defined for project
  • Manage team members and review the tasks submitted by team members
  • Number of mandatory trainings completed

Outputs Expected

Determine Specific Data Needs

  • Work with departmental managers to outline the specific data needs for each business method analysis project

Management And Strategy

  • Oversees the activities of analyst personnel and ensures the efficient execution of their duties.

Critical Business Insights

  • Mines the business’s database in search of critical business insights and communicates findings to the relevant departments.


  • Creates efficient and reusable SQL code for the improvement, manipulation, and analysis of data.
  • Creates efficient and reusable code. Follows coding best practices.
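The reusable-SQL bullets above can be sketched as a single parameterized query shared by multiple reports. This is a hypothetical example using Python's built-in sqlite3 module; the table and column names are illustrative, not from the posting:

```python
import sqlite3

# A reusable, parameterized query: avoids string concatenation
# and can be shared across every report that needs this rollup.
MONTHLY_SALES_SQL = """
    SELECT region, SUM(amount) AS total
    FROM sales
    WHERE month = ?
    GROUP BY region
    ORDER BY total DESC
"""

def monthly_sales(conn, month):
    """Return (region, total) rows for the given month."""
    return conn.execute(MONTHLY_SALES_SQL, (month,)).fetchall()

# In-memory demo data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "2024-01", 120.0), ("South", "2024-01", 80.0),
     ("North", "2024-02", 50.0)],
)
print(monthly_sales(conn, "2024-01"))  # [('North', 120.0), ('South', 80.0)]
```

Passing the month as a bound parameter (the `?` placeholder) is what makes the query both reusable and safe to call with user-supplied values.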

Create/Validate Data Models

  • Builds statistical models; diagnoses, validates, and improves the performance of these models over time.

Predictive Analytics

  • Seeks to determine likely outcomes by detecting tendencies in descriptive and diagnostic analyses.

Prescriptive Analytics

  • Attempts to identify what business action to take

Code Versioning

  • Organize and manage changes and revisions to code. Use a version control tool such as Git or Bitbucket.

Create Reports

  • Create reports depicting the trends and behaviours from the analysed data


  • Create documentation for your own work and perform peer reviews of others' documentation.
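The trend reports described under Create Reports can be sketched in a few lines of Python. The order records below are hypothetical; a real report would pull from the warehouse:

```python
from collections import defaultdict

# Hypothetical order records: (month, amount).
orders = [("2024-01", 120.0), ("2024-01", 80.0),
          ("2024-02", 95.0), ("2024-03", 130.0), ("2024-03", 40.0)]

# Aggregate revenue per month to expose the trend over time.
revenue = defaultdict(float)
for month, amount in orders:
    revenue[month] += amount

for month in sorted(revenue):
    print(f"{month}: {revenue[month]:.1f}")
```

The same aggregate-then-sort pattern underlies most trend reporting, whether the grouping happens in Python, SQL, or a BI tool.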

Manage Knowledge

  • Consume and contribute to project-related documents, SharePoint libraries, and client universities

Status Reporting

  • Report status of tasks assigned
  • Comply with project-related reporting standards/processes

Skill Examples

  • Analytical Skills: Ability to work with large amounts of data: facts, figures, and number crunching.
  • Communication Skills: Communicate effectively with a diverse population at various organization levels with the right level of detail.
  • Critical Thinking: Data analysts must look at the numbers, trends, and data and come to new conclusions based on the findings.
  • Presentation Skills: reports and oral presentations to clients.
  • Strong meeting facilitation skills as well as presentation skills.
  • Attention to Detail: Making sure to be vigilant in the analysis to come to correct conclusions.
  • Mathematical Skills to estimate numerical data.
  • Work in a team environment
  • Proactively ask for and offer help

Knowledge Examples

  • Database languages such as SQL
  • Programming languages such as R or Python
  • Analytical tools and languages such as SAS & Mahout
  • Proficiency in MATLAB
  • Data visualization software such as Tableau, Qlik, or Power BI
  • Proficiency in mathematics and calculations
  • Spreadsheet tools such as Microsoft Excel or Google Sheets
  • Database management systems (DBMS)
  • Operating systems and software platforms
  • Knowledge of the customer domain and the sub-domain where the problem is solved
Additional Comments

Job Summary

As a Senior Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines on an AWS cloud platform to support our data-driven initiatives. Your expertise in ETL data ingestion frameworks/tools will play a critical role in ensuring efficient data processing and integration.

Accountabilities

  • Create and maintain data ingestion pipelines, models, and architectures required to support a growing Data Marketing business
  • Work with Product Management, business partners, and the Data Science team members to understand and create solutions to meet their needs
  • Work with the Quality Engineers to validate that solutions are meeting requirements.
  • Implement automation processes as opportunities present themselves.

Basic Qualifications:
  • Familiarity with Data Pipeline Management Frameworks on Cloud (AWS Preferred, Azure, Google): As a Senior Data Engineer, you should have a strong understanding of data pipeline management frameworks offered by major cloud providers like AWS, Azure, and Google. Your expertise in working with these platforms will enable you to design and implement robust data pipelines to extract, transform, and load data from various sources.
  • Familiarity with ETL Data Ingestion Framework/Tools. You should be well-versed in ETL (Extract, Transform, Load) data ingestion frameworks/tools, such as Azure Data Factory, Google Data Fusion, and SSIS. Your knowledge of these tools will facilitate seamless data integration and ensure data quality throughout the pipeline.
  • Hands-on Experience with Python: Proficiency in Python is essential for this role. You should have hands-on experience using Python to develop data processing scripts, data manipulation, and transformation tasks, as well as implementing data engineering solutions.
  • Knowledge of Source Control and Scrum Agile Software Development Methodologies: A strong foundation in source control practices, such as Bitbucket, is required. Moreover, you should be familiar with Scrum Agile software development methodologies to effectively collaborate with cross-functional teams and deliver high-quality data engineering solutions.
  • Familiarity with AWS Ecosystem: Having a deep understanding of the AWS ecosystem, including training jobs, processing jobs, and Sagemaker, will be a significant advantage. This knowledge will allow you to leverage AWS services efficiently and optimize data workflows.
  • Good exposure and hands-on experience with the following: Glue, Glue Catalog, Crawler, Lambda, Airflow, IAM, S3, Athena, Redshift, Python, PySpark, SQL, DynamoDB; extensive knowledge of applying data transformations; Git/Bitbucket.

Preferred Qualifications:
  • Experience in large-data solutions is highly desirable.
  • Excellent verbal, written, and interpersonal communication skills.
  • Experience with Scikit-learn, PyTorch, and Huggingface, and Building Transformer and Sentence Transformer Models: Your expertise in working with popular machine learning libraries like Scikit-learn, PyTorch, and Huggingface will be critical for developing and deploying transformer and sentence transformer models. Experience in building and fine-tuning these models will further enhance your role as a Senior Data Engineer.
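The extract-transform-load work the qualifications describe can be sketched at its smallest scale with only the standard library. This is a hypothetical example: the CSV payload, table name, and field names are illustrative, and a production pipeline would use the AWS/ETL tooling named above rather than sqlite3:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: messy names and string amounts.
RAW = "customer,amount\n alice ,10\nBOB,20.5\n"

def extract(text):
    """Extract: parse CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize names and cast amounts to float."""
    return [(r["customer"].strip().title(), float(r["amount"])) for r in rows]

def load(conn, records):
    """Load: write the cleaned records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW)))
print(conn.execute("SELECT * FROM payments").fetchall())
# [('Alice', 10.0), ('Bob', 20.5)]
```

Keeping the three stages as separate functions is what tools like Glue and Data Factory formalize: each stage can then be tested, retried, and monitored independently.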
