Within the Data & Analytics department (D&A) we need a strong center of excellence for data and data analytics: the driving force behind a data-driven culture, data governance and quality frameworks, our BI function, and a close connection with IT on a future-proof data infrastructure. The goal of the Data Engineering Team is to provide data pipelines that ensure the continuity, safety and integrity of the delivered data.
The Data Factories team is a team of data engineers within D&A whose primary purpose is to establish robust data pipelines that retrieve data from internal and external sources, store it as a raw archive (replayable), and eventually send it to the Core Data Platform, which is responsible for data storage. The team works closely with various departments within Schiphol, including AI, BI, IoT and Data Governance.
What will you do as a Senior Data Engineer?
At one of the world's premier airports, as a Senior Data Engineer you will play a crucial part in the engineering efforts of your Scrum team. To this end, you will guide and coach the other engineers in the team, set technical standards and best practices, ensure high performance and scalability, and create an optimal work climate where every team member can excel. You'll be responsible for the design, implementation, testing and maintenance of data products on Microsoft Azure. You will work closely with various internal business stakeholders and multiple data providers. In summary, you can think of the following responsibilities:
- Support and improve the software development lifecycle and our best practices;
- Participate in selecting potential new team members, and support the onboarding and training of new hires, since our team is currently growing;
- People leadership: coaching, mentoring and growing the members of your team, for example through pair programming, code reviews and general support;
- Support team happiness by keeping the right balance between team challenges and achievements;
- Collaborate with other Lead Engineers, Data Architects and Enterprise Architects to set best practices and standards and to design technical solutions;
- Identify the integration requirements of several data sources;
- Design, implement and test cloud-based (Azure) data-intensive applications;
- Contribute improvements to the Data Platform;
- Convert data product prototypes to data products.
What does a typical day look like for a senior engineer?
- In the morning you’ll join your team’s stand-up to discuss some technical details of one of the projects they are working on.
- After the stand-up you work on the implementation of the new data pipeline for one of our products.
- Next, you work together with one of our Lead Data Engineers on the design of a new CI/CD pipeline.
- After that you’ll have a coaching session with one of the more junior engineers in the team, providing input on a project you are working on together.
As a next-generation professional you have a clear vision, courage and focus. Innovation and agile working are an absolute priority. You improve yourself every day so that you can be the best. You are introspective and resilient. You believe that creating a good atmosphere is essential for a successful, respectful and welcoming workplace. We are still a small team, so we’ll be looking for a real ‘click’. Your natural approachability enables you to connect with people. You are passionate about your work, curious and open to new developments, with a mindset keyed to possibilities and opportunities. If you meet the requirements below, we look forward to receiving your application!
- You have a BSc or MSc in a relevant field (e.g. Computer Science, Mathematics);
- You have at least 4 years of experience running data-driven solutions, including the deployment and management of data pipelines in production;
- Strong Java/Scala experience; we prefer someone with 2+ years of hands-on experience with at least one of the two;
- Professional experience with Kafka (or Azure EventHub);
- Professional experience with Spark, implementing both batch and streaming pipelines;
- You recognize the importance of logging and monitoring (we currently use Datadog, Splunk and Sentry);
- You have at least one year of experience with Microsoft Azure (AWS and GCP are fine too);
- You are comfortable working with the latest DevOps technologies, e.g. Kubernetes and Docker;
- Professional experience with Scrum;
- You are fluent in English. Knowledge of the Dutch language is a plus.
At Schiphol you can look out of the window and see your data products take effect on our operation. We are the only employer with both its own police force and a cinema. Our team works on a range of projects, from image recognition of below-wing processes of aircraft sitting at the gate, to expected waiting times at security, and even dynamic routing of passengers through the terminal. Walking through the airport, you can see your work take effect.