Fabric & Databricks Interoperability (4): Using Databricks Tables in Fabric for Viewing, Analysis, and Editing

Introduction

Is it possible to seamlessly reference and edit tables created in Databricks within Fabric?

Many people may have this question.

In this article, we will specifically explore the use case of:

  • Utilizing tables created in Databricks within Fabric.

For prerequisite settings and configurations, please refer to previous articles.

This article is part of a four-part series:

  1. Overview and Purpose of Interoperability
  2. Detailed Configuration of Hub Storage
  3. Using Tables Created in Fabric in Databricks
  4. Using Tables Created in Databricks in Fabric (this article)

Linking Tables Created in Databricks to Fabric

Creating a New Table in Databricks

Create a new empty external table from Databricks.

Specify the folder path of the hub storage as the Location.


-- <container> and <storage_account> are placeholders for your hub storage.
CREATE TABLE create_from_Databricks_sales
USING DELTA
LOCATION 'abfss://<container>@<storage_account>.dfs.core.windows.net/folder_name/create_from_Databricks_sales'
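Note that if the folder specified in Location is still empty, Databricks generally cannot infer a schema, so a brand-new empty external table needs an explicit column list. A minimal sketch with illustrative columns (only Item is confirmed later in this article; the others are assumptions):

-- Illustrative schema: Item appears in the UPDATE statement later on;
-- SalesOrderNumber and Quantity are assumptions for this sketch.
CREATE TABLE create_from_Databricks_sales (
  SalesOrderNumber STRING,
  Item             STRING,
  Quantity         INT
)
USING DELTA
LOCATION 'abfss://<container>@<storage_account>.dfs.core.windows.net/folder_name/create_from_Databricks_sales';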

Checking the Created Table

From the Catalog Explorer, you can verify that the created external table contains data.

A folder named create_from_Databricks_sales is created in the ext folder of the hub storage.
(This means that the newly created external table physically exists in the hub storage.)


It can also be confirmed that the table is in Delta format.
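On the Databricks side, DESCRIBE DETAIL is a quick way to confirm this: it reports the table's format, storage location, and file statistics.

-- format should read "delta", and location should point at the hub storage path.
DESCRIBE DETAIL create_from_Databricks_sales;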

At this point, the create_from_Databricks_sales table also becomes visible from Fabric's Lakehouse.
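For example, it can be queried immediately from a Fabric notebook (a sketch assuming the Lakehouse is named Fabric_Lakehouse and the table surfaces under the ext schema, matching the UPDATE statement later in this article):

-- Spark SQL in a Fabric notebook, using the three-part Lakehouse name.
SELECT *
FROM Fabric_Lakehouse.ext.create_from_Databricks_sales
LIMIT 10;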


Viewing and Analyzing Tables Created in Databricks in Fabric (Creating BI)

From the Semantic Model, select the create_from_Databricks_sales table (created in Databricks) and click [Confirm].


Now, the table created in Databricks can be analyzed in Fabric.

Editing (DML) Tables Created in Databricks from Fabric

Execute an UPDATE statement (a DML statement) from a Fabric notebook.


UPDATE Fabric_Lakehouse.ext.create_from_Databricks_sales
SET Item = 'No.1 Quantity Water Bottle - 30 oz.'
WHERE Item = 'Water Bottle - 30 oz.'

As expected, the change was confirmed to be reflected when viewed from Fabric.


Editing was performed from Fabric, and the changes were also reflected on the Databricks side.
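To double-check from Databricks, a simple query against the same table should return the updated value (assuming the catalog and schema context used when the table was created):

-- Run on the Databricks side; the renamed item should come back.
SELECT Item
FROM create_from_Databricks_sales
WHERE Item = 'No.1 Quantity Water Bottle - 30 oz.';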


Therefore, tables created in Databricks can be edited from Fabric using DML statements.

Issues and Specific Operational Methods

The method introduced here has the advantage that both Fabric and Databricks can edit the data. However, this cuts both ways: because either side can update the table so easily, unintended changes become harder to prevent.

Additionally, in this case, an external table in Databricks was used.

However, predictive optimization is currently available only for managed tables, so managed tables would be preferable to external ones.

Predictive optimization for Unity Catalog managed tables | Databricks on AWS (docs.databricks.com)
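For reference, predictive optimization is enabled at the catalog or schema level rather than per external table; a minimal sketch, with illustrative catalog and schema names:

-- Managed tables under this schema become eligible for predictive optimization.
ALTER SCHEMA main.sales ENABLE PREDICTIVE OPTIMIZATION;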

We will continue to examine these challenges and share specific operational methods in the future.

I also think that table cloning in Databricks might provide some useful hints.

Cloning a table in Databricks (#deltalake) - Qiita (qiita.com)
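For example, a deep clone would give Databricks an independent managed copy that predictive optimization can maintain, while the external table stays shared with Fabric. A minimal sketch (the clone's name is illustrative):

-- Copies data and metadata into a separate managed table; later changes
-- to the source external table are not reflected in the clone.
CREATE TABLE create_from_databricks_sales_managed
DEEP CLONE create_from_Databricks_sales;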

Conclusion

Based on the above, it was confirmed that "tables created in Databricks can be used in Fabric."

Once the hub storage is set up, achieving interoperability between Fabric and Databricks is relatively simple.

▽ Previous article

Fabric & Databricks Interoperability (3): Using Tables Created in Fabric in Databricks (viewable, analyzable, and editable in Databricks) #BI - Qiita (qiita.com)