By SQLYARD
Introduction
Microsoft Fabric is redefining how data engineers and SQL developers work together. One of the most powerful new features is the ability to connect Python notebooks directly to SQL databases using a simple T-SQL Magic command.
For years, SQL developers relied on SQL Server Management Studio or Azure Data Studio for querying data, while data scientists used Jupyter or Databricks notebooks for analysis. Fabric now merges both worlds — you can query data, analyze results with Python, visualize findings, and even automate data refreshes, all from the same interactive workspace.
In this post, we’ll cover:
- Why combining SQL and Python inside Fabric is a game changer.
- How to set up your first connection with T-SQL Magic.
- Examples of how to query, visualize, and analyze results.
- Tips for security and collaboration.
- A full workshop at the end to help you practice step by step.
Why This Is Exciting
Microsoft Fabric’s Python Notebooks + SQL Database integration bridges the gap between database administrators, analysts, and data scientists. It lets you run T-SQL and Python in the same notebook cell stream, enabling richer, automated workflows without jumping between tools.
Key benefits:
- Unified Querying and Analysis – Mix SQL queries with Python code for seamless hand-offs between data retrieval and analytics.
- Built-in Visualization – Use Matplotlib, Seaborn, or Fabric’s native chart view to graph query results instantly.
- Automation & Scheduling – Schedule notebook jobs to refresh datasets, run validations, or trigger alert emails.
- Collaboration – Share notebooks inside your Fabric workspace for real-time co-editing and documentation.
- Enterprise Security – Leverages Fabric’s security model, Entra ID authentication, and governed workspaces.
📖 Reference: Run T-SQL code in Fabric Python notebooks — Microsoft Learn
Step-by-Step: Connect to Your SQL Database
1. Create a New Notebook
- In your Microsoft Fabric workspace, go to Data Engineering → Notebooks.
- Select New Notebook and set the default language to Python.
2. Import the T-SQL Magic Extension
Fabric notebooks use a special “cell magic” to switch between Python and T-SQL.
# Load the Fabric T-SQL Magic command
%load_ext sql
3. Run a T-SQL Query
Change the cell type to T-SQL and connect to your Fabric SQL Database using the following syntax:
%%tsql -artifact MyFabricDB -type SQLDatabase
SELECT TOP 10 name, create_date
FROM sys.databases;
💡 Explanation:
- %%tsql tells the notebook that this cell will execute T-SQL commands.
- -artifact specifies the Fabric SQL Database name.
- -type SQLDatabase defines the connection type (you can also use Lakehouse or Warehouse).
📖 Reference: T-SQL in Fabric Notebooks
Example: Query and Visualize Data
After connecting, you can query and analyze your data seamlessly:
%%tsql -artifact SalesDW -type SQLDatabase
SELECT TOP 100
    p.ProductKey,
    p.ProductName,
    s.SalesAmount,
    s.OrderDateKey
FROM dbo.FactSales AS s
JOIN dbo.DimProduct AS p
    ON s.ProductKey = p.ProductKey;
The results automatically appear in the Fabric notebook grid view.
To analyze further, switch to Python in the next cell:
# Convert the previous SQL results to a pandas DataFrame
df = _sqldata.to_pandas()
# Inspect and visualize
print(df.head())
df.plot(kind="bar", x="ProductName", y="SalesAmount", figsize=(10,4))
✅ Inline Insight:
This pattern — query with SQL, then visualize in Python — is a powerful replacement for the traditional export-to-Excel workflow. You stay within the governed Fabric workspace.
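To make the "replacement for export-to-Excel" concrete, here is a minimal sketch of summarizing query results in pandas instead of pivoting in a spreadsheet. The DataFrame below is fabricated sample data standing in for `_sqldata.to_pandas()`, and the column names simply mirror the query above.

```python
import pandas as pd

# Hypothetical sample data standing in for df = _sqldata.to_pandas();
# ProductName and SalesAmount mirror the columns from the query above.
df = pd.DataFrame({
    "ProductName": ["Helmet", "Helmet", "Gloves"],
    "SalesAmount": [25.0, 30.0, 10.0],
})

# The pivot you might otherwise build in Excel, done in one groupby
summary = df.groupby("ProductName", as_index=False)["SalesAmount"].sum()
summary = summary.sort_values("SalesAmount", ascending=False)
print(summary)
```

Because the summary lives in the notebook rather than a local spreadsheet, it stays inside the governed workspace and can feed the visualization cell directly.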
Security and Access
When you run notebooks in Fabric:
- Authentication is handled through Microsoft Entra ID (Azure AD).
- Permissions mirror what’s set in your workspace and SQL endpoint.
- Connections are encrypted using Fabric’s secure data layer.
Use least-privilege access — assign users Contributor or Viewer roles only where needed.
📖 Reference: Fabric Security Model Overview
Automation and Scheduling
You can schedule your notebook to run automatically:
- In Fabric, click the Schedule tab for your notebook.
- Choose a frequency (hourly, daily, weekly).
- Add output destinations — write back to a Fabric Warehouse, Lakehouse, or send results via Power Automate.
This is ideal for generating daily reports or refreshing machine learning inputs.
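A scheduled notebook often needs a lightweight sanity check before writing results downstream. Below is a sketch of a validation cell in plain Python; `validate_refresh` is a hypothetical helper, and the sample frame stands in for real query output.

```python
import pandas as pd

def validate_refresh(df: pd.DataFrame, min_rows: int = 1) -> str:
    """Return a status message a downstream alert step could act on."""
    if len(df) < min_rows:
        return f"ALERT: refresh returned {len(df)} row(s), expected >= {min_rows}"
    return f"OK: refresh returned {len(df)} row(s)"

# Hypothetical sample data standing in for the scheduled query's output
sample = pd.DataFrame({"Category": ["Bikes"], "TotalSales": [1200.0]})
print(validate_refresh(sample))
```

The returned string could be logged, written to a Lakehouse table, or routed through Power Automate to trigger an alert email.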
Collaboration and Sharing
Notebooks can be shared just like Power BI reports:
- Use the Share button to grant workspace access.
- Add markdown cells to explain logic, results, and next steps.
- Version control is available through Fabric's Git integration for enterprise development.
Example Workflow: SQL + Python Together
Here’s a complete mini-workflow showing how to mix both languages.
%%tsql -artifact SalesDB -type SQLDatabase
SELECT Category, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY Category;
# Convert to pandas and find top category
df = _sqldata.to_pandas()
top = df.sort_values(by="TotalSales", ascending=False).head(1)
print("Top Category:", top["Category"].iloc[0])
# Visualize the full result
import matplotlib.pyplot as plt
df.plot(kind='bar', x='Category', y='TotalSales', color='skyblue', legend=False)
plt.title("Total Sales by Category")
plt.ylabel("Sales ($)")
plt.show()
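The workflow above can be extended with one more derived metric: each category's share of total sales, computed entirely in pandas. The frame below is illustrative sample data standing in for `_sqldata.to_pandas()`.

```python
import pandas as pd

# Hypothetical results frame standing in for df = _sqldata.to_pandas()
df = pd.DataFrame({
    "Category": ["Bikes", "Accessories", "Clothing"],
    "TotalSales": [6000.0, 3000.0, 1000.0],
})

# Each category's share of total sales, as a fraction of 1.0
df["Share"] = df["TotalSales"] / df["TotalSales"].sum()
print(df)
```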
📊 You just ran an end-to-end query, transformation, and visualization entirely inside Fabric — no separate tools required.
Summary
Connecting Python notebooks to Fabric SQL Databases gives SQL professionals access to the full Python ecosystem (pandas, matplotlib, scikit-learn) while preserving the governance and security of Fabric.
Key takeaways:
- Use %%tsql cells for querying, Python cells for analysis and visuals.
- Combine both languages for dynamic workflows and automation.
- Use Fabric scheduling and version control for repeatable pipelines.
- Share notebooks for team collaboration and documentation.
Final Thoughts
This integration closes the gap between database development and modern data science. SQL developers can now experiment with Python analytics, and data scientists can query production data securely without leaving Fabric.
Start small: connect, query, visualize, and schedule. Over time, your notebooks can evolve into end-to-end automated reporting and analytics solutions.
References
- Run T-SQL Code in Fabric Python Notebooks — Microsoft Learn
- Microsoft Fabric Documentation
- Fabric Security Overview
- Fabric Community Discussions
- Fabric Ideas Portal — Feedback Hub
Workshop: SQL + Python Analysis in Fabric Python Notebooks
Objective:
Build a hands-on notebook that queries a Fabric SQL Database, transforms the data in Python, and visualizes results — combining both languages in one workflow.
Step 1: Set Up Your Fabric Environment
- Go to your Fabric workspace.
- Under Data Engineering → Notebooks, select New Notebook.
- Set the default language to Python.
Step 2: Connect to Your Fabric SQL Database
%%tsql -artifact SalesDW -type SQLDatabase
SELECT TOP 20 * FROM sys.tables;
✅ Confirm that results appear in the output grid.
Step 3: Run a Query on Business Data
%%tsql -artifact SalesDW -type SQLDatabase
SELECT
    ProductCategory,
    SUM(SalesAmount) AS TotalSales,
    COUNT(OrderID) AS Orders
FROM dbo.FactSales
GROUP BY ProductCategory;
Step 4: Analyze with Python
# Convert results to pandas
df = _sqldata.to_pandas()
# Sort and preview
df = df.sort_values("TotalSales", ascending=False)
df.head()
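Since the workshop query returns both TotalSales and Orders per category, a derived metric such as average order value is one more line of pandas. The frame below is hypothetical sample data with the same columns the query would return.

```python
import pandas as pd

# Hypothetical sample data matching the workshop query's columns
df = pd.DataFrame({
    "ProductCategory": ["Bikes", "Clothing"],
    "TotalSales": [50000.0, 8000.0],
    "Orders": [200, 400],
})

# Average order value per category
df["AvgOrderValue"] = df["TotalSales"] / df["Orders"]
print(df)
```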
Step 5: Visualize
import matplotlib.pyplot as plt
plt.figure(figsize=(8,4))
plt.bar(df["ProductCategory"], df["TotalSales"], color="royalblue")
plt.title("Total Sales by Product Category")
plt.xlabel("Category")
plt.ylabel("Sales ($)")
plt.xticks(rotation=45)
plt.tight_layout()
plt.show()
Step 6: Automate
- In Fabric, open your notebook’s Schedule tab.
- Set it to run daily at 7 AM.
- Save outputs to a Lakehouse table for downstream reporting.
Step 7: Collaborate
- Add markdown cells describing the notebook’s purpose.
- Share it with teammates for live editing.
- Use Fabric's Git integration to commit changes for version control.
Outcome:
By completing this workshop, you’ll be able to:
- Connect a Fabric SQL Database to a Python notebook.
- Execute T-SQL and Python together.
- Visualize data interactively inside Fabric.
- Automate the notebook for daily analytics.
This workflow demonstrates how Fabric unifies traditional SQL and modern analytics — making it a single, end-to-end platform for data professionals.