Job Req Number:  70859

Data Engineer – Data Foundation Team (Hybrid)


At DSV, we are on a journey to renew the Global Data and Analytics Platform used by colleagues, customers and business partners across all business domains to get the most value out of the available data.


Our department members are based in locations across Europe, Asia and South Africa.

A significant part of our department is located in Lisbon, grouped into development and operations teams.

For the Data Foundation Team, we are looking for a Data Engineer to join a team that currently consists of three members plus the manager.

The position is part of the global organisation, and you will work in close collaboration with colleagues across the department's locations.


To achieve our goals, we are working on an ambitious plan and have chosen a vendor offering us a full stack of services. All of our work is done in the cloud, utilising the full stack of Microsoft's offering: Azure Data Lake, Azure SQL Database/Azure Synapse, Azure Analysis Services, Databricks, Power BI Service, Azure Data Factory, and Azure DevOps.

DSV will also embark on a multi-cloud strategy and introduce technologies to facilitate cloud-provider agnosticism.


The Data Foundation team acts as a technology spearhead, supporting the delivery and operations teams with tools, frameworks, processes, and coaching. We deliver in an agile way, with Jira as our management tool.



As a Data Engineer in the Data Foundation Team, your major responsibilities will be:



  • Manage and optimise cross-department tools.
  • Be a key member in cross-team Data Foundation implementation projects.
  • Analyse existing applications and optimise their performance.
  • Design and implement solutions that will support the development of our Operational Data Warehouse (ODW) in alignment with IT architectural principles, guidelines, and standards.
  • Rapidly build up an understanding of the existing architecture and underlying business processes.
  • Develop and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources.
  • Analyse requirements together with our different department teams.
  • Document and provide technical specifications for the implementations.
  • Refine and promote our agile delivery framework to help ensure high-quality, flexible deliverables on the data & BI platform.
  • Actively transfer knowledge to other team members as well as the Operations & Support team.



Skills & Competencies

To perform the job successfully you should demonstrate the following competencies:


  • Analytical Skills: Ability to analyse problems systematically and rationally to identify their root cause.
  • Delivery & Project Management: Experience with Agile methodology, and the ability to promote and facilitate deliveries accordingly.
  • Communication Skills: Comfortable in international business communication; able to communicate openly and confidently, listen effectively, and influence and convince others in a way that results in acceptance and agreement.
  • Personal Skills: Ability to work with different cultures and manage cross-functional interfaces; a high priority on teamwork; the capability to build strong, long-term team relationships, within and across functions and geographical boundaries, based on ethics and trust; and readiness to take personal accountability for achieving individual and shared goals.


Required Experience:

  • Bachelor's or Master's degree in Computer Science, Data Science, Information Technology or a related field.
  • 3–5 years of experience in BI/data engineering projects.


Technical requirements:


  • Expert knowledge of Microsoft's data stack: Azure Data Lake, Azure SQL Database/Azure Synapse, Azure Data Factory, Synapse integration, Azure DevOps, etc.
  • Ability to monitor, troubleshoot, and optimize data pipelines for performance and reliability.
  • Experience with profiling and tuning queries for efficiency.
  • Experience working with Git repositories.
  • Experience with Agile methodology and CI/CD practices.
  • Experience with task management tools (Jira, Confluence).
  • Experience with Databricks and PySpark/Scala/Spark SQL.
  • Expert knowledge of data governance principles to ensure data quality, lineage, and compliance.

  • Proficient level of English (spoken and written)


Our offer:

Your job location will be Portugal (Lisbon, Saldanha), and you will be part of DSV Group Finance, with peers working in remote teams across the globe.


  • Permanent contract, 35 hours/week (hybrid)
  • Health Insurance
  • Comprehensive onboarding program
  • Work-life balance
  • Comfortable office
  • Internal training catalogue
  • Culture of feedback

DSV – Global Transport and Logistics

DSV is a dynamic workplace that fosters inclusion and diversity. We conduct our business with integrity, respecting different cultures and the dignity and rights of individuals. By joining DSV, you will be working for one of the best-performing companies in the transport and logistics industry. You will join a talented team of more than 75,000 employees in over 80 countries, working with passion to deliver great customer experiences and high-quality services. DSV aspires to lead the way towards a more sustainable future for our industry and is committed to trading on nature's terms.

We promote collaboration and transparency, and we strive to attract, motivate and retain talented people in a culture of respect. If you are driven, talented and wish to be part of a progressive and versatile organisation, we will support your needs to reach your potential and advance your career.

Visit and follow us on LinkedIn, Facebook and Twitter.

Apply now