Turn Power BI from a dashboard you open into a tool Claude can query. This tutorial walks through building a production-shape MCP server that lets an AI agent list datasets, inspect measures, run DAX, and respect Row-Level Security — without a single dashboard click.
What You Will Build
A TypeScript MCP server that exposes a Microsoft Power BI semantic model to Claude (or any MCP-compatible client) through four tools:
- `list_datasets` — list workspaces and datasets the agent can read
- `list_measures` — enumerate measures defined in a dataset
- `query_dataset` — execute a DAX query and return rows
- `get_measure` — fetch a single measure value with optional filters
By the end you will have a Claude Desktop session where you type "give me Q1 revenue by region with last year's comparison" and Claude composes the DAX, runs it against your tenant, and writes the narrative back.
This is the same shape Noqta deploys for MENA clients sitting on under-used Power BI Pro and Premium licences. The full AI activation pattern for Power BI in Saudi Arabia is covered in our Power BI activation pillar — this tutorial is the developer how-to.
Prerequisites
- Power BI Pro or Premium tenant with at least one published dataset
- Microsoft Entra (Azure AD) admin access (you will register an app)
- Node.js 20+ and TypeScript comfort
- Claude Desktop installed (for testing)
- Basic familiarity with DAX and Power BI workspaces
You do not need a Premium Per User (PPU) licence for this tutorial — the REST executeQueries endpoint works on Pro datasets with the workspace published to a Premium or Fabric capacity. For pure Pro workspaces, the XMLA endpoint requires Premium/PPU. We will note the difference where it matters.
Architecture in One Diagram
┌──────────────────────┐ JSON-RPC over stdio ┌──────────────────────┐
│ Claude Desktop │ ◄─────────────────────► │ MCP Server (Node) │
│ (MCP Host) │ │ │
└──────────────────────┘ │ ┌────────────────┐ │
│ │ Auth: MSAL │ │
│ │ Service Princ. │ │
│ └───────┬────────┘ │
│ │ │
│ ┌───────▼────────┐ │
│ │ Power BI REST │ │
│ │ executeQueries │ │
│ └───────┬────────┘ │
└──────────┼───────────┘
│
┌──────────▼───────────┐
│ Power BI Service │
│ (Semantic Model + │
│ RLS enforced) │
└───────────────────────┘
The MCP server is a thin shim. Heavy lifting — security, query planning, RLS — stays in the Power BI service.
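The left-hand arrow in the diagram is plain JSON-RPC 2.0 over stdio. As an illustration (the id and GUID placeholders are ours), a `tools/call` request from the host arrives on the server's stdin shaped like this:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "query_dataset",
    "arguments": {
      "workspaceId": "<workspace-guid>",
      "datasetId": "<dataset-guid>",
      "daxQuery": "EVALUATE TOPN(10, 'Sales')"
    }
  }
}
```

The MCP SDK parses this, validates the arguments, and invokes the matching tool handler — the server code never touches raw JSON-RPC.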
Step 1 — Register a Service Principal in Microsoft Entra
This is the boring part. Get it right once and you never touch it again.
1.1 Create the app registration
In the Azure portal:
- Microsoft Entra ID → App registrations → New registration
- Name: `noqta-mcp-powerbi` (or your convention)
- Supported account types: Accounts in this organizational directory only
- Redirect URI: leave blank
- Register
Note the Application (client) ID and Directory (tenant) ID — you will need both.
1.2 Create a client secret
In your new app → Certificates & secrets → New client secret:
- Description: `mcp-powerbi-prod`
- Expires: 6 months (rotate on a calendar — see the Security section)
- Copy the secret value immediately. You will not see it again.
1.3 Grant Power BI service permissions
Power BI does not use Microsoft Graph permissions for dataset queries — it uses service principal access on the workspace, gated by a tenant setting.
Open Power BI Admin Portal → Tenant settings → Developer settings → "Allow service principals to use Power BI APIs" and enable it for a security group that contains your app registration. (Create a group like pbi-service-principals in Entra and add the app as a member.)
Then in the Power BI workspace itself: Workspace → Manage access → Add → search for the app name → Member (or Viewer for read-only).
Common pitfall: if the tenant toggle is off, every API call returns 401 Unauthorized regardless of how the app is permissioned. Always verify the tenant setting first.
1.4 Capabilities you just unlocked
With the service principal added as a workspace Viewer, the MCP server can:
- List datasets in that workspace
- Read dataset metadata (tables, columns, measures, RLS roles)
- Execute DAX via `executeQueries` (subject to the workspace capacity rules)
It cannot modify the model, refresh data, or change RLS roles. That is the desired posture for a read-only agent.
Step 2 — Project Setup
mkdir noqta-mcp-powerbi && cd noqta-mcp-powerbi
npm init -y
npm install @modelcontextprotocol/sdk zod @azure/msal-node undici
npm install -D typescript @types/node tsx
npx tsc --init

Edit tsconfig.json:
{
"compilerOptions": {
"target": "ES2022",
"module": "Node16",
"moduleResolution": "Node16",
"outDir": "./build",
"rootDir": "./src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true
},
"include": ["src/**/*"]
}

Edit package.json to add ESM + scripts:
{
"type": "module",
"bin": { "noqta-mcp-powerbi": "./build/index.js" },
"scripts": {
"build": "tsc",
"start": "node build/index.js",
"dev": "tsx watch src/index.ts"
}
}

Create the source layout:
mkdir -p src/lib
touch src/index.ts src/lib/auth.ts src/lib/powerbi.ts

Step 3 — Token Acquisition with MSAL
Power BI accepts an OAuth2 bearer token in the Authorization header. With a service principal, the client credentials flow is the right grant — no user interaction, ideal for a daemon.
Create src/lib/auth.ts:
import { ConfidentialClientApplication } from "@azure/msal-node";
const tenantId = process.env.AZURE_TENANT_ID;
const clientId = process.env.AZURE_CLIENT_ID;
const clientSecret = process.env.AZURE_CLIENT_SECRET;
if (!tenantId || !clientId || !clientSecret) {
throw new Error(
"Missing AZURE_TENANT_ID / AZURE_CLIENT_ID / AZURE_CLIENT_SECRET"
);
}
const cca = new ConfidentialClientApplication({
auth: {
clientId,
authority: `https://login.microsoftonline.com/${tenantId}`,
clientSecret,
},
});
let cached: { token: string; expiresAt: number } | null = null;
export async function getPowerBiToken(): Promise<string> {
const now = Date.now();
if (cached && cached.expiresAt - now > 60_000) {
return cached.token;
}
const result = await cca.acquireTokenByClientCredential({
scopes: ["https://analysis.windows.net/powerbi/api/.default"],
});
if (!result?.accessToken) {
throw new Error("Failed to acquire Power BI token");
}
cached = {
token: result.accessToken,
expiresAt: result.expiresOn?.getTime() ?? now + 50 * 60 * 1000,
};
return cached.token;
}

Key choices:
- Scope is `https://analysis.windows.net/powerbi/api/.default` — the Power BI resource. Do not invent custom scopes.
- We cache the token for up to ~50 minutes (real lifetime ~1 hour). MSAL has its own cache too, but a local cache avoids a roundtrip per tool call.
- All secrets come from environment variables. Never hard-code them in the MCP config.
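The 60-second refresh margin in the cache check can be read as a pure predicate. Extracted here for illustration only — the helper name is ours, not part of the server:

```typescript
// Is a cached token still safe to reuse, given a safety margin before expiry?
function tokenIsFresh(
  expiresAtMs: number,
  nowMs: number,
  marginMs: number = 60_000
): boolean {
  // Reuse only while more than `marginMs` of lifetime remains,
  // so a token never expires mid-request.
  return expiresAtMs - nowMs > marginMs;
}
```

Keeping the margin well under the real token lifetime means at most one MSAL roundtrip per hour per process, regardless of how chatty the agent gets.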
Step 4 — Power BI REST Client
Create src/lib/powerbi.ts:
import { request } from "undici";
import { getPowerBiToken } from "./auth.js";
const BASE = "https://api.powerbi.com/v1.0/myorg";
async function pbiRequest<T>(
path: string,
init: { method?: string; body?: unknown } = {}
): Promise<T> {
const token = await getPowerBiToken();
const { statusCode, body } = await request(`${BASE}${path}`, {
method: init.method ?? "GET",
headers: {
authorization: `Bearer ${token}`,
"content-type": "application/json",
},
body: init.body ? JSON.stringify(init.body) : undefined,
});
const text = await body.text();
if (statusCode >= 400) {
throw new Error(`Power BI ${statusCode} on ${path}: ${text}`);
}
return text ? (JSON.parse(text) as T) : ({} as T);
}
export interface PbiDataset {
id: string;
name: string;
configuredBy: string;
isOnPremGatewayRequired: boolean;
}
export interface PbiWorkspace {
id: string;
name: string;
type: string;
}
export async function listWorkspaces(): Promise<PbiWorkspace[]> {
const res = await pbiRequest<{ value: PbiWorkspace[] }>("/groups");
return res.value;
}
export async function listDatasets(workspaceId: string): Promise<PbiDataset[]> {
const res = await pbiRequest<{ value: PbiDataset[] }>(
`/groups/${workspaceId}/datasets`
);
return res.value;
}
export async function executeDax(
workspaceId: string,
datasetId: string,
daxQuery: string
): Promise<{ rows: Record<string, unknown>[] }> {
const res = await pbiRequest<{
results: { tables: { rows: Record<string, unknown>[] }[] }[];
}>(`/groups/${workspaceId}/datasets/${datasetId}/executeQueries`, {
method: "POST",
body: {
queries: [{ query: daxQuery }],
serializerSettings: { includeNulls: true },
},
});
return { rows: res.results?.[0]?.tables?.[0]?.rows ?? [] };
}

A few notes:
- `/groups` is the legacy Power BI API name for workspaces. The endpoint name is stuck — get used to it.
- `executeQueries` runs DAX against the dataset's semantic model. It respects RLS and security roles, and applies an effective identity if you pass `impersonatedUserName` — which we are deliberately not doing here (service principal = its own identity).
- Errors come back as JSON with a `code` and `message`. We are propagating the raw body so the MCP layer can surface useful diagnostics to Claude.
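If you want friendlier diagnostics, a small helper (hypothetical — not part of the files created so far) can unwrap that error envelope before the message reaches Claude:

```typescript
// Shape of the Power BI REST error envelope described above.
interface PbiErrorBody {
  error?: { code?: string; message?: string };
}

// Turn a raw error body into a one-line diagnostic for the MCP layer.
function formatPbiError(statusCode: number, rawBody: string): string {
  try {
    const parsed = JSON.parse(rawBody) as PbiErrorBody;
    if (parsed.error?.code) {
      return `Power BI ${statusCode} (${parsed.error.code}): ${
        parsed.error.message ?? "no message"
      }`;
    }
  } catch {
    // body was not JSON; fall through and return it verbatim
  }
  return `Power BI ${statusCode}: ${rawBody}`;
}
```

Surfacing the error `code` verbatim matters: codes like `PowerBINotAuthorizedException` let Claude (or the human reading the transcript) self-diagnose instead of guessing.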
Step 5 — The MCP Server
Now wire the REST client into MCP tools. Create src/index.ts:
#!/usr/bin/env node
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import {
listWorkspaces,
listDatasets,
executeDax,
} from "./lib/powerbi.js";
const server = new McpServer({
name: "noqta-mcp-powerbi",
version: "0.1.0",
});
server.tool(
"list_datasets",
"List Power BI workspaces and the datasets the service principal can read. Use this first to discover what models are available.",
{},
async () => {
const workspaces = await listWorkspaces();
const enriched = await Promise.all(
workspaces.map(async (ws) => ({
workspace: { id: ws.id, name: ws.name },
datasets: await listDatasets(ws.id).catch(() => []),
}))
);
return {
content: [{ type: "text", text: JSON.stringify(enriched, null, 2) }],
};
}
);
server.tool(
"list_measures",
"List measures defined in a Power BI dataset. Returns measure name, table, and DAX expression.",
{
workspaceId: z.string().describe("Power BI workspace (group) ID"),
datasetId: z.string().describe("Power BI dataset ID"),
},
async ({ workspaceId, datasetId }) => {
    const dax = `
      // INFO.VIEW.MEASURES() returns measure name, home table, and
      // expression in one table (requires a recent semantic model)
      EVALUATE
      SELECTCOLUMNS(
        INFO.VIEW.MEASURES(),
        "Name", [Name],
        "Table", [Table],
        "Expression", [Expression]
      )
    `;
const result = await executeDax(workspaceId, datasetId, dax);
return {
content: [{ type: "text", text: JSON.stringify(result.rows, null, 2) }],
};
}
);
server.tool(
"query_dataset",
"Execute a DAX query against a Power BI dataset and return the rows. Prefer SUMMARIZECOLUMNS or EVALUATE TOPN to keep results small.",
{
workspaceId: z.string(),
datasetId: z.string(),
daxQuery: z
.string()
.describe(
"A DAX query starting with EVALUATE. Always use SUMMARIZECOLUMNS for grouped aggregates."
),
},
async ({ workspaceId, datasetId, daxQuery }) => {
const result = await executeDax(workspaceId, datasetId, daxQuery);
return {
content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
};
}
);
server.tool(
"get_measure",
"Compute a single measure value with optional filters. Use this for one-shot KPI lookups instead of writing DAX.",
{
workspaceId: z.string(),
datasetId: z.string(),
measureName: z.string().describe("Fully qualified: 'Table'[Measure]"),
filters: z
.array(
z.object({
column: z.string().describe("Fully qualified: 'Table'[Column]"),
value: z.union([z.string(), z.number()]),
})
)
.optional(),
},
async ({ workspaceId, datasetId, measureName, filters }) => {
    // DAX string literals escape embedded quotes by doubling them —
    // without this, a value containing " would break (or inject into) the query
    const escapeDax = (s: string) => s.replace(/"/g, '""');
    const filterExprs = (filters ?? [])
      .map(
        (f) =>
          `${f.column} = ${
            typeof f.value === "number" ? f.value : `"${escapeDax(f.value)}"`
          }`
      )
      .join(", ");
const dax = filterExprs
? `EVALUATE ROW("Value", CALCULATE(${measureName}, ${filterExprs}))`
: `EVALUATE ROW("Value", ${measureName})`;
const result = await executeDax(workspaceId, datasetId, dax);
return {
content: [{ type: "text", text: JSON.stringify(result.rows, null, 2) }],
};
}
);
async function main() {
const transport = new StdioServerTransport();
await server.connect(transport);
console.error("noqta-mcp-powerbi running on stdio");
}
main().catch((err) => {
console.error(err);
process.exit(1);
});

Build it:
npm run build

You now have an executable MCP server at build/index.js.
Step 6 — Connect to Claude Desktop
Edit your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
"mcpServers": {
"powerbi": {
"command": "node",
"args": ["/absolute/path/to/noqta-mcp-powerbi/build/index.js"],
"env": {
"AZURE_TENANT_ID": "your-tenant-id",
"AZURE_CLIENT_ID": "your-client-id",
"AZURE_CLIENT_SECRET": "your-client-secret"
}
}
}
}

Restart Claude Desktop. In a new conversation, the four tools appear in the connector menu.
Secret hygiene: the Claude Desktop config sits in plaintext on disk. For team distribution, point command to a wrapper script that pulls secrets from your OS keychain (security find-generic-password on macOS, secret-tool on Linux, Windows Credential Manager). Never commit claude_desktop_config.json to a repo.
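A minimal macOS launcher might look like the sketch below — it assumes the secret was stored under a keychain item named noqta-mcp-powerbi; swap the lookup for secret-tool or Credential Manager on other platforms:

```shell
#!/usr/bin/env sh
# Launch wrapper: resolve the client secret from the macOS keychain at
# start-up, so the plaintext Claude Desktop config never contains it.
AZURE_CLIENT_SECRET="$(security find-generic-password -s noqta-mcp-powerbi -w)" \
AZURE_TENANT_ID="your-tenant-id" \
AZURE_CLIENT_ID="your-client-id" \
exec node /absolute/path/to/noqta-mcp-powerbi/build/index.js
```

Point the `command` entry in claude_desktop_config.json at this script instead of node, and drop the `env` block entirely.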
Step 7 — A Real Demo Conversation
With the server connected, type:
You: What datasets do you have access to?
Claude calls list_datasets, gets back the workspaces, summarises them in prose.
You: Use the "Retail KSA" dataset. What are the top 5 measures by name?
Claude calls list_measures, ranks the response.
You: Give me total revenue for April 2026 by region, descending.
Claude composes a DAX query, calls query_dataset:
EVALUATE
TOPN(
100,
SUMMARIZECOLUMNS(
'Region'[RegionName],
FILTER(
ALL('Date'),
'Date'[Year] = 2026 && 'Date'[MonthNumber] = 4
),
"Revenue", [Total Revenue]
),
[Revenue], DESC
)

Returns rows, narrates: "Eastern Province led at SAR 4.2M, followed by Riyadh at SAR 3.8M..."
You: Same thing but in Arabic, formatted for a board email.
Claude does not need to call any tool. It already has the rows in context, and rephrases the narrative in Arabic. This is the Arabic narrative advantage that Power BI Copilot does not deliver natively — see the Power BI shelfware pillar for why this matters in MENA.
Step 8 — Security Hardening
This is the section a CISO will read. Get it right before shipping to a client.
Row-Level Security
Service principals do not bypass RLS. If your dataset has a role like [RegionManager] with 'Region'[Country] = USERNAME(), the service principal sees the role's filter applied to its own identity. To enforce per-user RLS through the agent, you must:
- Pass `impersonatedUserName` in the `executeQueries` request body (only on Premium/PPU/Fabric capacities)
- Pass the requesting user's UPN from the MCP host context
Add this option to executeDax when needed:
body: {
queries: [{ query: daxQuery }],
serializerSettings: { includeNulls: true },
impersonatedUserName: "user@tenant.com", // requires Premium/PPU
}

For mid-market clients on Power BI Pro, the simpler pattern is one service principal per security boundary — e.g., a separate app registration with workspace access only to the regions a given team should see. Less elegant, no Premium licence required.
Token rotation
Client secrets are bearer credentials. Rotate every 90 days minimum, every 30 days if you publish the server publicly. Automate:
- Calendar reminder 14 days before expiry
- Script that creates a new secret via Microsoft Graph API
- Atomic env var swap + Claude Desktop restart
Prefer federated credentials (workload identity federation) over secrets if your MCP server runs on Azure or GitHub Actions — no secret to rotate.
Scope minimization
The service principal should be a Viewer, never a Member or Admin. Viewer is sufficient for executeQueries and metadata calls. Member would let the agent modify the model, which an autonomous LLM should not be able to do.
Audit trail
executeQueries calls show up in the Power BI Activity Log with the service principal's identity. Forward this log to your SIEM (Sentinel, Splunk) and alert on:
- Queries returning more than N rows (data exfiltration signal)
- Queries hitting tables the agent should not touch (HR, payroll)
- Token requests outside business hours
DAX injection
Treat any DAX string from an LLM as untrusted. The query_dataset tool gives the model a free hand — that is by design — but for production-facing deployments, consider a DAX allow-list: a list of vetted query templates the model can fill in, instead of arbitrary DAX. The XMLA endpoint can run server-side DAX with audit logging, but it cannot validate semantic safety.
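One way to sketch that allow-list in TypeScript. The template ids and validation patterns here are illustrative, not a complete policy:

```typescript
// Vetted DAX templates the model may choose from; parameters are validated
// against strict patterns, so free-form DAX never reaches executeQueries.
const MEASURE = /^[A-Za-z0-9 _%-]+$/;                   // bare measure name
const COLUMN = /^'[A-Za-z0-9 _]+'\[[A-Za-z0-9 _]+\]$/;  // 'Table'[Column]

type TemplateId = "measure_total" | "measure_by_column";

function buildDax(
  id: TemplateId,
  p: { measure: string; column?: string }
): string {
  if (!MEASURE.test(p.measure)) {
    throw new Error(`Rejected measure: ${p.measure}`);
  }
  switch (id) {
    case "measure_total":
      return `EVALUATE ROW("Value", [${p.measure}])`;
    case "measure_by_column":
      if (!p.column || !COLUMN.test(p.column)) {
        throw new Error(`Rejected column: ${p.column}`);
      }
      return `EVALUATE SUMMARIZECOLUMNS(${p.column}, "Value", [${p.measure}])`;
  }
}
```

The model still chooses which template and which parameters, so the conversational experience survives — but a prompt-injected `EVALUATE` never reaches the tenant.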
Common Issues
401 Unauthorized on /groups. The tenant setting "Allow service principals to use Power BI APIs" is off, or the app is not in the allowed security group. Fix in Power BI Admin Portal.
403 Forbidden on dataset endpoints. The service principal is not added to the workspace. Workspace → Manage access → Add member.
executeQueries returns "Query exceeded result size limit". Power BI caps result sets (default 100K rows / 1M values). Use TOPN, SUMMARIZECOLUMNS, or push aggregation into the DAX.
PowerBINotAuthorizedException on Pro workspaces. executeQueries requires the workspace to be in a Premium or Fabric capacity, or the dataset to be in My Workspace of a PPU user. Pro-only workspaces cannot use the REST query endpoint — fall back to XMLA via adomd-client.
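The four cases above are mechanical enough to encode as a first-line triage helper in the server's error path. A sketch, with hint wording that is ours:

```typescript
// Map common Power BI failure modes to a first remediation step.
function triagePbiError(statusCode: number, body: string): string {
  if (statusCode === 401) {
    return "Verify tenant setting: 'Allow service principals to use Power BI APIs'.";
  }
  if (statusCode === 403) {
    return "Add the service principal to the workspace via Manage access.";
  }
  if (body.includes("result size limit")) {
    return "Shrink the result set with TOPN or SUMMARIZECOLUMNS.";
  }
  if (body.includes("PowerBINotAuthorizedException")) {
    return "executeQueries needs Premium/Fabric capacity (or PPU); consider the XMLA fallback.";
  }
  return "Unclassified error; inspect the raw body.";
}
```

Returning the hint alongside the raw error in the tool response lets Claude explain the failure to the user instead of just echoing a status code.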
Where to Go Next
This MCP server is the technical foundation of the activation pattern. To make it useful to a finance team, you typically add:
- A narrative agent that wraps the MCP server with a system prompt enforcing Arabic + executive tone + anomaly callouts
- Scheduled prompts (cron, Logic Apps, n8n) that push narratives to Teams or WhatsApp instead of waiting for a user
- Anomaly detection measures pre-baked in the model so the agent can call `get_measure` for things like `[Revenue YoY Anomaly Score]`
The end-to-end pattern, with cost numbers and three KSA sector examples (retail, banking, government), is documented in our Power BI activation pillar for Saudi Arabia.
If you are a developer evaluating MCP itself, start with How to Create an MCP Server in TypeScript — same SDK, simpler example.
Authoritative Sources
- Model Context Protocol specification
- Power BI REST API — Execute Queries
- Microsoft Entra service principal for Power BI
- DAX `INFO.MEASURES` and `INFO.TABLES`
- MSAL Node — Confidential Client
Need this deployed in your stack? Noqta builds and operates production MCP integrations for MENA enterprises sitting on under-used BI investment. Book an audit — we will tell you in 30 minutes whether your Power BI tenant is a good fit for the activation pattern.