📊Power BI
Publishing Reports
When you create a report in Power BI Desktop and then publish it, both the report and the semantic model will be published.
The semantic model is essentially the dataset — it includes the different tables, data sources, the credentials and access needed to refresh those sources, calculated measures and tables, and the relationships between tables, among other things.
Once published, you’ll be able to see the lineage view in a given workspace. For example, the image below shows a semantic model (in the middle) that connects to two different data sources, and then two separate reports use that semantic model.

It’s also important to note that a semantic model has a "main" report associated with it. If you want to modify a semantic model that is used by multiple reports and you choose to download it, what you actually download is the PBIX file of the original report that was used to publish that semantic model.
When you create additional reports in the Power BI Service that connect to this semantic model, those reports use a live connection to it (sometimes informally referred to as a DirectQuery-like connection, but technically it is a live connection to a Power BI dataset).
You can convert a live connection report into an import mode report by importing data into a new semantic model. However, when you publish that new report, it will create a new semantic model, and the new report will no longer be tied to the original shared semantic model.
Access Management
Say we have a large Excel file with data from all around the globe. Our report will be used by both global employees (who need to see all data) and regional employees (who should only see data for their region). How do we handle this?
This can be easily solved using Row-Level Security (RLS). There are two main types of RLS you should know about: Static RLS and Dynamic RLS.
Static RLS
To set up static RLS, start by going to the Modeling tab in Power BI Desktop, then click on Manage roles.
For example, if we want to create a role for Europe, we create a new role called Europe. In the next panel, we select the appropriate table (for example, the Region table), and then specify a DAX filter such as:
[RegionName] = "Europe"After publishing the report, go to the semantic model in the Power BI service, click on the three dots (…) next to it, choose Security, and assign users to the newly created role.
Dynamic RLS
Dynamic RLS is more flexible and powerful, especially for scenarios with many users and granular access needs (e.g., by business unit).
To implement dynamic RLS, you first need an access table in your data model. This could be a simple table (for example, an Excel file) containing columns such as Email and Business Unit (BU).
Once the access table is imported into the model, go back to Manage roles, create a new role (for example, Business Unit), and define a DAX filter like:
[Email] = USERPRINCIPALNAME()
This filter ensures that each user only sees the rows matching their own email, as defined in the access table.

However, this alone is not enough. You also need to create a relationship between the access table and the relevant column in your data model (for example, connect the BU column in your access table to the BU column in your Region or main fact table).
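If you prefer not to add a physical relationship, a common alternative is to look up the signed-in user's business unit directly in the role filter on the table you want to secure. A sketch, assuming an access table named 'Access' with Email and BU columns (names are placeholders):

```dax
-- Dynamic RLS filter without a model relationship:
-- keep only rows whose BU matches the current user's BU in the access table
[BU] = LOOKUPVALUE(
    'Access'[BU],
    'Access'[Email], USERPRINCIPALNAME()
)
```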

What will happen to people who have access to the report or workspace but are not assigned to any RLS role?
If you define RLS roles, any user who has only Viewer or Build permission and is not assigned to a role will get an RLS error and see no data at all. Only users with edit permissions (Contributor, Member, or Admin) bypass RLS by default and see all data, even if they are not assigned to any role.
This also means that RLS alone does not grant access: users still need access to the workspace or to the report itself.
Workspace permissions
Power BI Apps and Audiences
A Power BI app is a user-friendly way to bundle multiple reports and dashboards into a single, shareable experience. All content added to an app must come from the same workspace. If you need to include a report that uses a semantic model (dataset) from another workspace, you can create a thin report in the app’s workspace that connects to that external dataset. This is especially useful if the app's workspace has a lot of users and your semantic model has to stay private.
While you can technically embed reports from other workspaces using methods like "Publish to web" or secure embed, these do not integrate into the app’s sidebar navigation and feel less seamless for end users.
Access Management
To access a Power BI app, users only need to be added to the app’s audience and have the appropriate Row-Level Security (RLS) roles assigned if RLS is enabled. They do not need edit permissions on the workspace or reports. Additionally, users must have Power BI Pro licenses unless the app is hosted in a Premium capacity workspace, which allows free users to access it. No other permissions are required to view the app and its content.
Data Refresh
Power BI supports two main connection modes: DirectQuery and Import. (A live connection to a published semantic model behaves similarly to DirectQuery in that the report itself needs no scheduled refresh.)
In DirectQuery mode, data is queried live from the source whenever you interact with the report, so the dataset does not require scheduled refreshes. You only need to ensure that the credentials used to connect to the data source in the Power BI Service remain valid.
In Import mode, data is loaded into Power BI’s in-memory model, so the dataset must be refreshed to pick up new data. Refreshes can be triggered manually or scheduled to run automatically at regular intervals.
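Import-mode refreshes can also be triggered on demand through the Power BI REST API (covered in the Power BI REST API section below). A minimal sketch using only the Python standard library; the workspace and dataset IDs are placeholders:

```python
import urllib.request


def build_refresh_request(access_token, workspace_id, dataset_id):
    # POSTing to the /refreshes endpoint queues an on-demand dataset refresh
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
        f"/datasets/{dataset_id}/refreshes"
    )
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {access_token}"},
    )


# Sending the request is then a single call:
# urllib.request.urlopen(build_refresh_request(token, ws_id, ds_id))
```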
Sharing a Semantic Model
For Direct Access, there are several permission options to choose from.

If we want the user to only consume the semantic model in Power BI, with no way to go back through earlier transformation steps and see deleted columns, for example, then we should not check the first option, "Allow recipients to modify this dataset".
This forces them to connect via a live connection, which also prevents them from modifying existing measures. They can, however, create new ones.
Power BI REST API
There are two good ways to use the REST API.
Service Principal
You'd first have to create a service principal, grant it the permission flags you need, and then give it access to the datasets and reports you want it to interact with, as if it were a real user.
Once that's done, you'll be able to generate an access token and query the different endpoints of the API based on what permissions you have.
TENANT_ID = ""
CLIENT_ID = ""
CLIENT_SECRET = ""
SCOPE = "https://analysis.windows.net/powerbi/api/.default"
AUTHORITY = f'https://login.microsoftonline.com/{TENANT_ID}'
def generate_access_token():
token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
data = {
"client_id": CLIENT_ID,
"client_secret": CLIENT_SECRET,
"scope": SCOPE,
"grant_type": "client_credentials"
}
response = requests.post(token_url, data=data)
if response.status_code == 200:
access_token = response.json().get("access_token")
return access_token
else:
print(f"Failed to retrieve token: {response.status_code}")
print(response.json())
access_token = generate_access_token()You can find a full python script to backup reports here.
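Once you have a token, every endpoint call is a plain HTTPS request with a Bearer header. As an illustration (the workspace ID is a placeholder, and the helper names are my own), listing the reports in a workspace might look like:

```python
import requests

API_BASE = "https://api.powerbi.com/v1.0/myorg"


def auth_headers(access_token):
    # Every REST API call carries the token in an Authorization header
    return {"Authorization": f"Bearer {access_token}"}


def list_report_names(access_token, workspace_id):
    # GET /groups/{workspace_id}/reports returns the reports in that workspace
    url = f"{API_BASE}/groups/{workspace_id}/reports"
    response = requests.get(url, headers=auth_headers(access_token))
    response.raise_for_status()
    return [report["name"] for report in response.json()["value"]]
```

Which endpoints actually succeed depends on the permissions you granted the service principal.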
PowerShell CMDLETS
This is by far the preferred method.
Install-Module -Name MicrosoftPowerBIMgmt -Scope CurrentUser
Login-PowerBIServiceAccount
Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/<workspace id>/datasets/<dataset id>/executeQueries" -Method Post -Body '{ "queries": [ { "query": "EVALUATE { [Total Sales] }" } ] }'
This effectively lets you call the REST API with the permissions you already have, so there's no setup required.
However, common organizational security group policies might restrict you from executing PowerShell scripts directly, and for that Python comes to the rescue again.
import subprocess

ps_command = r'''
$env:PSModulePath += ";C:/Users/<username>/OneDrive/Documents/WindowsPowerShell/Modules";
Import-Module MicrosoftPowerBIMgmt;
Login-PowerBIServiceAccount;
Invoke-PowerBIRestMethod -Url "https://api.powerbi.com/v1.0/myorg/groups/<workspace_id>/datasets/<dataset_id>/executeQueries" -Method Post -Body '{ "queries": [ { "query": "EVALUATE { [Total Sales] }" } ] }';
'''

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command", ps_command],
    capture_output=True,
    text=True,
)
print(result.stdout)