<?xml version="1.0" encoding="utf-8" ?>
<!DOCTYPE FL_Course SYSTEM "https://www.flane.de/dtd/fl_course095.dtd"><?xml-stylesheet type="text/xsl" href="https://portal.flane.ch/css/xml-course.xsl"?><course productid="37144" language="en" source="https://portal.flane.ch/swisscom/en/xml-course/microsoft-dp-750t00" lastchanged="2026-05-11T09:48:36+02:00" parent="https://portal.flane.ch/swisscom/en/xml-courses"><title>Implement data engineering solutions using Azure Databricks</title><productcode>DP-750T00</productcode><vendorcode>MS</vendorcode><vendorname>Microsoft</vendorname><fullproductcode>MS-DP-750T00</fullproductcode><version>1.0</version><audience>&lt;p&gt;The target audience is data engineers who have fundamental knowledge of data analytics concepts, a basic understanding of cloud storage, and familiarity with data organization principles. They should be comfortable working with SQL and have experience using Python, including notebooks, for data engineering tasks. Learners are expected to have a good understanding of Azure Databricks workspaces and Unity Catalog, along with familiarity with data access patterns and core data engineering and data warehouse concepts. In addition, they should have foundational knowledge of Azure security, including Microsoft Entra ID, and be familiar with Git version control fundamentals.&lt;/p&gt;</audience><contents>&lt;h5&gt;Set up and configure an Azure Databricks environment&lt;/h5&gt;&lt;ul&gt;
&lt;li&gt;Explore Azure Databricks&lt;/li&gt;&lt;li&gt;Understand Azure Databricks architecture&lt;/li&gt;&lt;li&gt;Understand Azure Databricks integrations&lt;/li&gt;&lt;li&gt;Select and configure compute in Azure Databricks&lt;/li&gt;&lt;li&gt;Create and organize objects in Unity Catalog&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Secure and govern Unity Catalog objects in Azure Databricks&lt;/h5&gt;&lt;ul&gt;
&lt;li&gt;Secure Unity Catalog objects&lt;/li&gt;&lt;li&gt;Govern Unity Catalog objects&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Prepare and process data with Azure Databricks&lt;/h5&gt;&lt;ul&gt;
&lt;li&gt;Design and implement data modeling with Azure Databricks&lt;/li&gt;&lt;li&gt;Ingest data into Unity Catalog&lt;/li&gt;&lt;li&gt;Cleanse, transform, and load data into Unity Catalog&lt;/li&gt;&lt;li&gt;Implement and manage data quality constraints with Azure Databricks&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Deploy and maintain data pipelines and workloads with Azure Databricks&lt;/h5&gt;&lt;ul&gt;
&lt;li&gt;Design and implement data pipelines with Azure Databricks&lt;/li&gt;&lt;li&gt;Implement Lakeflow Jobs with Azure Databricks&lt;/li&gt;&lt;li&gt;Implement development lifecycle processes in Azure Databricks&lt;/li&gt;&lt;li&gt;Monitor, troubleshoot, and optimize workloads in Azure Databricks&lt;/li&gt;&lt;/ul&gt;</contents><audience_plain>The target audience is data engineers who have fundamental knowledge of data analytics concepts, a basic understanding of cloud storage, and familiarity with data organization principles. They should be comfortable working with SQL and have experience using Python, including notebooks, for data engineering tasks. Learners are expected to have a good understanding of Azure Databricks workspaces and Unity Catalog, along with familiarity with data access patterns and core data engineering and data warehouse concepts. In addition, they should have foundational knowledge of Azure security, including Microsoft Entra ID, and be familiar with Git version control fundamentals.</audience_plain><contents_plain>Set up and configure an Azure Databricks environment
- Explore Azure Databricks
- Understand Azure Databricks architecture
- Understand Azure Databricks integrations
- Select and configure compute in Azure Databricks
- Create and organize objects in Unity Catalog

Secure and govern Unity Catalog objects in Azure Databricks

- Secure Unity Catalog objects
- Govern Unity Catalog objects

Prepare and process data with Azure Databricks

- Design and implement data modeling with Azure Databricks
- Ingest data into Unity Catalog
- Cleanse, transform, and load data into Unity Catalog
- Implement and manage data quality constraints with Azure Databricks

Deploy and maintain data pipelines and workloads with Azure Databricks

- Design and implement data pipelines with Azure Databricks
- Implement Lakeflow Jobs with Azure Databricks
- Implement development lifecycle processes in Azure Databricks
- Monitor, troubleshoot, and optimize workloads in Azure Databricks</contents_plain><duration unit="d" days="4">4 days</duration><pricelist><price country="US" currency="USD">2595.00</price><price country="CA" currency="CAD">2595.00</price><price country="DE" currency="EUR">2690.00</price><price country="GB" currency="GBP">2610.00</price><price country="CH" currency="CHF">2690.00</price><price country="AT" currency="EUR">2690.00</price><price country="SE" currency="EUR">2690.00</price><price country="SI" currency="EUR">2690.00</price><price country="IT" currency="EUR">1690.00</price></pricelist><miles/></course>