DP-750: Implement data engineering solutions using Azure Databricks

This course focuses on how to design and implement end-to-end data engineering solutions using Azure Databricks and Unity Catalog. You will learn how to build robust data pipelines, manage data securely, and operate scalable data platforms in the cloud.

The course covers how to configure and manage Databricks environments, implement data ingestion and transformation workflows, and apply governance and security practices across your data platform. You will also learn how to optimize workloads and maintain reliable data pipelines in production environments.

After completing the course, you will be able to build, deploy, and maintain enterprise-grade data engineering solutions that support modern analytics and AI workloads.

Course Objectives

After completing this course, participants will be able to:

  • Design and implement data engineering solutions using Azure Databricks
  • Build and manage data pipelines for ingestion, transformation, and loading
  • Apply data governance and security using Unity Catalog
  • Work with Delta Lake and lakehouse architecture
  • Optimize and monitor data workloads
  • Implement scalable and production-ready data platforms

Target audience

Data engineers and technical professionals working with data platforms

This course is intended for professionals who want to build and operate scalable data engineering solutions in Azure.

Prerequisites

Participants should have:

  • Basic understanding of data analytics concepts
  • Experience with SQL and preferably Python
  • Familiarity with cloud concepts and data storage

Course content

Azure Databricks fundamentals
Set up and configure Databricks environments and understand the architecture

Data ingestion and transformation
Build pipelines to ingest, clean, and transform data
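To give a flavor of the transformation logic covered in this module, the sketch below cleans raw ingested records before loading them, a typical bronze-to-silver step. It is plain Python so it runs anywhere; on Databricks you would normally express the same logic with PySpark DataFrames. All field names and validation rules are hypothetical.

```python
# Hypothetical bronze-to-silver cleaning step: validate and normalize
# raw order records before loading them into a curated table.
# Plain Python for illustration; on Databricks this would be a PySpark job.

def clean_orders(raw_rows):
    """Drop malformed rows and cast fields to proper types."""
    cleaned = []
    for row in raw_rows:
        try:
            order_id = int(row["order_id"])
            amount = float(row["amount"])
        except (KeyError, ValueError, TypeError):
            continue  # skip (or quarantine) malformed records
        if amount < 0:
            continue  # negative amounts treated as invalid in this sketch
        cleaned.append({
            "order_id": order_id,
            "amount": round(amount, 2),
            "country": str(row.get("country", "unknown")).upper(),
        })
    return cleaned

raw = [
    {"order_id": "1", "amount": "19.99", "country": "no"},
    {"order_id": "x", "amount": "5.00"},                 # bad id, dropped
    {"order_id": "2", "amount": "-3", "country": "se"},  # negative, dropped
    {"order_id": "3", "amount": "7.5"},
]
print(clean_orders(raw))
```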

Unity Catalog and governance
Manage access control, security, and data governance
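As a taste of what Unity Catalog governance looks like in practice, the fragment below uses the three-level namespace (catalog.schema.table) and grants read access to a group in Databricks SQL. Catalog, schema, and group names are hypothetical.

```sql
-- Hypothetical Unity Catalog objects and grants (Databricks SQL).
CREATE CATALOG IF NOT EXISTS sales;
CREATE SCHEMA IF NOT EXISTS sales.orders;

-- Grant read access on the schema to an analyst group,
-- plus the USE privileges needed to reach it.
GRANT USE CATALOG ON CATALOG sales TO `analysts`;
GRANT USE SCHEMA ON SCHEMA sales.orders TO `analysts`;
GRANT SELECT ON SCHEMA sales.orders TO `analysts`;
```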

Data modeling and processing
Design data structures and implement transformation logic

Delta Lake and lakehouse architecture
Work with modern data platforms using Delta Lake
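The Delta Lake module can be illustrated with a few Databricks SQL statements: creating a Delta table, upserting with MERGE, and querying an earlier version via time travel. Table and source names are hypothetical.

```sql
-- Hypothetical Delta table with an upsert and time travel (Databricks SQL).
CREATE TABLE IF NOT EXISTS sales.orders.daily_orders (
  order_id BIGINT,
  amount   DOUBLE
) USING DELTA;

-- Upsert newly arrived rows from a staging source.
MERGE INTO sales.orders.daily_orders AS t
USING updates AS s
ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET t.amount = s.amount
WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

-- Query the table as it was at an earlier version (time travel).
SELECT * FROM sales.orders.daily_orders VERSION AS OF 1;
```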

Pipeline deployment and operations
Deploy, monitor, and optimize data workflows in production
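Deployment and operations on Databricks typically revolve around job definitions; the JSON sketch below shows the general shape of a scheduled notebook job as used with the Databricks Jobs API. All names, paths, and cluster settings are hypothetical.

```json
{
  "name": "nightly-orders-pipeline",
  "tasks": [
    {
      "task_key": "transform_orders",
      "notebook_task": { "notebook_path": "/Repos/data/transform_orders" },
      "new_cluster": {
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2
      }
    }
  ],
  "schedule": {
    "quartz_cron_expression": "0 0 2 * * ?",
    "timezone_id": "Europe/Oslo"
  }
}
```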

Practical information

Duration: 4 days
Price: 26 500 NOK
Language: English
Format: Classroom or virtual training, open or company-specific

FAQ

How is the course delivered?
The course can be taken as an open course or as a company-internal course. You can attend either in a physical classroom or virtually.

Who is the course suitable for?
The course suits data engineers and technical staff who work with data pipelines, analytics, and data platforms.

What will I learn during the course?
You will learn how to build, operate, and optimize data pipelines and data platforms in Azure using Databricks.

Is the course hands-on?
Yes. The course includes practical exercises where you work with data processing, pipelines, and Databricks in realistic scenarios.

Which topics are covered in the course?
The course covers, among other things, Databricks, data pipelines, Delta Lake, data modeling, and governance.

Do I receive a certification after the course?
The course is relevant preparation for the Azure Databricks Data Engineer Associate certification.

Which prerequisites are recommended?
Basic knowledge of data analytics, SQL, and preferably Python is recommended.

What makes this course unique?
The course takes a practical approach to how modern data platforms are built and operated with lakehouse architecture and Azure Databricks.
