Testing Translytic Tasks
A slinky springing between Fabric and Power BI

Power BI predominantly consumes data. We connect it to a wide variety of data sources and read that data into semantic models, which we then use to build visualizations. But what if we wanted to use Power BI as a tool to write data back to these same data sources? One way we can do this is through the new translytic task flow functionality that connects Power BI and Fabric.

Making It Happen

Here's how it works, broken down into three consecutive Power BI Weekly videos that cover the three key steps for configuring these flows.

Step 1: Create the function

We first need to create a user data function, which we'll do in Fabric (where we also need to set up an account first if we don't already have one). The function is authored in Python, and we can reference additional Python packages, libraries, and modules to build out this custom user data function.
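As a rough illustration, here's what the body of such a function might look like (all names here are hypothetical; in Fabric the function would be registered via the decorator pattern from the fabric.functions package, which is omitted so this sketch runs anywhere):

```python
# Hypothetical sketch of a user data function body.
# In Fabric, this would be wrapped with the registration decorator
# from the fabric.functions package; plain Python is shown instead.
def add_rating(product: str, rating: int) -> str:
    """Validate an incoming rating before it is written downstream."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    return f"Recorded rating {rating} for {product}"
```

Keeping validation inside the function means every caller, including the Power BI action button we'll wire up later, goes through the same checks before anything touches the database.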


Step 2: Connect the function to something

By itself, the function can run, but it doesn't yet update anything downstream. If we want to connect the user data function to a SQL database, for example, we first need to set up that database with the table and fields we want to keep adding to in the future. We then leverage the user data function within Fabric to insert new values into the target SQL database table. Here's the documentation for the fabric.functions Python package that we can explore utilizing within this user data function: https://learn.microsoft.com/en-us/python/api/fabric-user-data-functions/fabric.functions
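A minimal sketch of the insert step, using Python's built-in sqlite3 as a stand-in for the Fabric SQL database (table and column names are hypothetical; inside a real user data function we'd use the connection helpers from the fabric.functions package instead):

```python
import sqlite3

# sqlite3 stands in for the Fabric SQL database in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS ratings (product TEXT, rating INTEGER)"
)

def insert_rating(conn, product: str, rating: int) -> int:
    """Insert one row into the target table and return the new row count."""
    with conn:  # commits on success, rolls back on error
        conn.execute(
            "INSERT INTO ratings (product, rating) VALUES (?, ?)",
            (product, rating),
        )
    return conn.execute("SELECT COUNT(*) FROM ratings").fetchone()[0]
```

The key design point carries over regardless of the database: parameterized inserts into a table whose schema was set up ahead of time, so the function only ever appends values in the shape the table expects.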


Step 3: Configure Power BI

The first two steps of this process occurred directly within Fabric. Finally, we need to connect it to Power BI. We can set up a connection to the Fabric SQL database directly within Power Query, then use it to build a summarized table visual in the report. This is the consumption side of Power BI, where we read this data into the semantic model and then into the report layer visuals. But what about writing data back to the data source? We'll use the text slicer visual to capture input values, with an action button connected to it that calls the user data function from the earlier steps. This writes data directly back to the same Fabric SQL database that also serves as the semantic model's data source.


Here's what putting all these steps together looks like as a combined Fabric and Power BI ecosystem diagram for development.

[Figure: Translytic diagram]

A word of warning, though. You might not want to write back to the data source you're using in Power BI, or you might not want everyone in your organization to have this capability. Write-back is a powerful, long-requested addition to our Power BI models, but we should use it carefully so that we don't unintentionally create new problems down the road.

Other Notes

Hope to see you at either (or both) events!

-HW


