Feb 16, 2025 - 11:03
Fabric & Databricks Interoperability (1): Purpose of Hub Storage for Table Sharing

Introduction

Although they each have their own characteristics, Microsoft Fabric and Databricks are broadly similar in what they can do.

Through Azure Databricks Unity Catalog mirroring, Databricks-managed data can now be referenced from Fabric, but the mirrored data is read-only: it cannot be edited from the Fabric side.

This brings up the following concerns:

  • Is it possible for our department to use tables managed by other departments in Databricks via our Fabric?
  • I want to make the tables I create open for modification and reference, regardless of the tool used!
  • We are currently using Databricks, but we might migrate to Fabric in the future... we want to maintain a vendor-free stance.
  • Business-side employees use Fabric, but engineers use Databricks; there are times when we need to reference the same table.

In this article, I will introduce use cases for seamlessly utilizing Fabric and Databricks:

  • Using tables created in Databricks in Fabric
  • Using tables created in Fabric in Databricks

The goal of this article:

(figure: sharing tables between Fabric and Databricks through hub storage)

:::note info
This article consists of four parts:

  1. Overview and purpose of interoperability (this article)
  2. Detailed setup of hub storage
  3. Using tables created in Fabric in Databricks
  4. Using tables created in Databricks in Fabric
:::

Background: Fabric and Databricks have similar functions... so which one should we actually use?

Fabric and Databricks are both attracting attention as Lakehouse platforms that handle data end-to-end.

As someone new to the industry, my first impression after using both was, "Can't they do roughly the same things?"
They both support ETL processes and AI model creation.

Fabric is appealing because of its beginner-friendly GUI, designed for intuitive operations.
On the other hand, Databricks is more code-centric, so it seems to require a somewhat higher skill level.
Additionally, Databricks offers more options for customizing compute resources, and if you stop clusters when they are idle, it can be more cost-effective than Fabric.

I believe the choice between these platforms depends on whether you prioritize ease of use or flexibility.

▽Reference
Azure Databricks と Microsoft Fabric の関係を考える (Considering the Relationship between Azure Databricks and Microsoft Fabric)