Optimizing report calculation performance
In Part 1 and Part 2, we established the foundation for performance tuning and tackled semantic model refresh optimization. Now we turn to the second optimization area: report calculation performance—the responsiveness users experience with every click, filter, and interaction.
The importance of calculation performance
While refresh optimization happens behind the scenes, calculation performance is felt with every user interaction.
- Some calculation optimizations may actually increase semantic model refresh time—but that happens while nobody is waiting. Users experience calculation delays with literally every click, so the tradeoff is almost always worth it.
- Poor calculation performance kills user adoption. Most users won't wait 30 seconds for a visual to load, even if the data is accurate.
- Calculation performance compounds across interactions. A single dashboard session might involve dozens of clicks—small delays add up to frustrated users who abandon the solution.
Key performance metrics and how to capture them
Required software
- DAX Studio
- Tabular Editor
- Measure Killer
- Report Analyzer
Establishing your baseline metrics
You'll capture these metrics before making changes and after each optimization iteration. Without baseline measurements, you're just guessing whether your changes actually improved the user experience.
Data model size in memory
- Open DAX Studio and connect to your semantic model, or launch it directly from the External Tools ribbon in Power BI Desktop
- Navigate to the Advanced tab at the top and click View Metrics > OK
- On the VertiPaq Analyzer tab, click Summary and note the total memory footprint
- You can also export these metrics to VPAX if you'd like to document the difference in table count, row count, or column size before and after your changes.
Worst object calculation time and specific trouble objects
- In Power BI Desktop, open Performance Analyzer (View tab > Performance Analyzer)
- Click Start recording, then click through every page and wait for the visuals to calculate. If you must make a selection to drive an action, make note of it so you can repeat the process consistently when you re-measure after future iterations.
- Click Stop, then Export, and save the JSON file in the same directory as your PBIX
- Open Report Analyzer, point it to that folder, and view the performance results to see a wireframe of each report page with the slowest objects highlighted. This makes it easy to visualize what on a page is slowest, and you can also use the application to view interface-related recommendations, like reducing the number of objects on a page.
- Now open DAX Studio and load the performance JSON (Home tab > Load Perf Data)
- This generates a clear table showing the object-level times for query and rendering, which can be sorted to see the overall most troublesome objects.
- If desired, you can export this to a simple CSV (as opposed to JSON) to snapshot the performance of an iteration of your testing. The export contains the IDs of the visuals and can be used to sum the total calculation time across all objects. (Maybe create a Power BI report for this? 🙂)
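If you do build a small Power BI report on top of those CSV exports, a single measure can roll up total calculation time per testing iteration. This is only a sketch: the table name 'Perf Export' and the Duration column are placeholders for whatever your exported file actually contains.

```dax
-- Hypothetical measure over a table loaded from the exported Performance Analyzer CSV.
-- 'Perf Export'[Duration] is assumed to hold per-visual durations in milliseconds.
Total Calc Time (sec) =
DIVIDE ( SUM ( 'Perf Export'[Duration] ), 1000 )
```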
Total number of objects
- Open Measure Killer and refresh it against your report.
- Under Report Results, look at the Visuals column. You can also expand the pages to see how many objects are on each page.
Interface performance optimization checklist
This checklist covers optimizations that improve calculation performance, working from the back end through the front end. That order generally makes sense unless only one or a few visuals are slow, in which case you can jump straight to those.
Optimization | How to Identify the Opportunity | Implementation Notes
---|---|---
REPORT SETTINGS | ||
Enable query reduction settings | Check File > Options > Query reduction settings. | Consider adding "Apply" buttons to filters and/or slicers so visuals aren't updated until a user is finished selecting the desired values |
Disable auto date/time tables | In File > Options > Data Load > Time Intelligence, check if Auto date/time is enabled. Strange-looking date tables will also appear in the DAX Studio Advanced Metrics. | Disable auto date/time, which adds hidden date tables to the semantic model and is rarely the appropriate calendar to use in visuals and calculations |
STORAGE MODE | ||
Optimize table storage mode | Check current storage mode in Model view | Consider switching the model to Direct Lake or DirectQuery for large or frequently updated data. Consider DirectQuery or hybrid (import plus DirectQuery) tables for very large individual tables. |
Implement user-defined aggregations | Identify reports that frequently summarize large fact tables at higher grains (daily vs. hourly data). Note: the best candidates are DirectQuery or composite models, because user-defined aggregations currently require that the most detailed version of the fact table be in DirectQuery mode. (Technically you can work around that with a hack.) | Create aggregation tables for common summary views. You can define the aggregate level or levels based on the design of the report, so most visuals will be able to use aggregated data. This may necessitate creating more, smaller dimensions, e.g., separate Year, Month, and Date tables instead of a single calendar. |
DATA MODEL APPROACH | ||
Implement star schema | Review model structure in Model view. Look for snowflaked data or unconventional data model tables. | Denormalize dimension tables to reduce relationship complexity. Star schema enables more efficient query paths. |
RELATIONSHIP CHARACTERISTICS | ||
Set proper cardinality | Review relationships in Model view. Look for many-to-many relationships, one-to-one relationships, or bidirectional filtering. | |
Optimize join columns | Use DAX Studio to identify high-cardinality relationship columns, i.e., a dimension primary key with many, many distinct values. Check for text-based keys vs. integer keys. | |
TABLE COLUMNS | ||
Minimize column count | Use Measure Killer to identify unused columns. Check DAX Studio Advanced Metrics for tables with many columns, focusing on fact tables (because they have the most rows). Adage: you can always add columns, but taking them away is much harder. Note: users may say they need every field. If that is the case, consider separating out a "self-service" semantic model that is much larger, albeit slower-performing, using either DirectQuery or Direct Lake to accommodate infrequently used data without hurting response times for frequently used data. | |
Minimize remaining column footprints | Review column storage, distinct values, and data types in DAX Studio Advanced Metrics. Look for the presence of DAX calculated columns. | |
TABLE ROWS | ||
Reduce unnecessary fact rows | Review fact table data vs. actual reporting requirements. Run one-off queries to look for rows with 0s in all material fact fields (a query sketch follows this table). Review available time period, transaction types, etc. | Implement filters at the model level for inactive records, irrelevant date ranges, or zero-value transactions |
Reduce unnecessary dimension rows | Review relationship information in DAX Studio Advanced Metrics. Dimension tables usually contain values for inactive products, employees, dates in the distant past or future, etc. These all increase the size and hurt the performance of the solution, and they fill filters with irrelevant values, hurting the user experience. | |
Aggregate facts based on remaining columns/values | After removing unneeded columns, shrinking column footprints, and removing unneeded rows, either aggregate to the remaining dimensional values or run a distinct row count of the remaining dimensional values to see whether the row count can be reduced (a query sketch follows this table). | |
FILTER APPROACH | ||
Use filters over slicers by default | Review the report pages manually to see if slicers are used to filter reporting data. | Use Filter pane instead of slicers unless users need to see available values. Slicers add rendering time and calculation overhead, in addition to taking up limited space for visuals. The most appropriate time to use slicers is for field parameters or when one selected value is forced on a field. |
CALCULATION COMPLEXITY AND EFFICIENCY | ||
Push calculations to data model | Look for complex conditions in DAX (especially repeated across many measures) that could be materialized in the data model instead. | Move the logic to a static flag or category field in Power Query and write simpler DAX (a before/after sketch follows this table) |
Use DAX variables | Review complex measures with repeated expressions | Implement variables to store intermediate calculations and improve readability and performance (a sketch follows this table) |
Implement visual calculations | For simple calculations like running totals or percent of parent, check if visual calculations could replace complex DAX | Use visual calculations for visual-specific logic, especially if it doesn't need to be reused (a sketch follows this table) |
SLOW VISUALS | ||
Review slow DAX queries | Use Report Analyzer to identify visuals taking more than n seconds (where n depends on the data volume). Copy queries to DAX Studio for detailed analysis, where you can review the SQL-like representation of the query under Trace > Server Timings. | Optimize the slowest queries first; they have a disproportionate impact on user experience. Use LLMs or search DAX Patterns or SQLBI for ideas. There are entire books on optimizing DAX. Get your data model right first, then see what is still slow. |
Move measures to tooltips | Identify visuals with many measures where some are rarely viewed. Check if secondary metrics could be hidden and viewed on demand only. | Move less critical measures to tooltips to reduce visual complexity and calculation overhead. You may also be able to move dimensions that increase the grain of a visual to the tooltip, simulating drilling to detail. |
Implement field parameters to limit visible columns | For visuals with many similar measures or columns, check if users need to see all options simultaneously | Use field parameters to let users choose which measures/columns to display, reducing calculation load instead of defaulting everyone to all measures |
Apply default filters | Review slicers and filters that users typically set to specific values, or visuals that only make sense or are only relevant when filtered to a single value. | Set sensible defaults to reduce initial load time; this does users a favor, because when they open a page the charts aren't already spinning, trying to return more data than they care to see. Use "always one selected" for required filters. |
INTERFACE DENSITY | ||
Reduce objects per page | Count visuals per report page. Both Report Analyzer and Measure Killer > Report Results have this information, and both have quality flags that trigger when a page has a high number of visuals. | |
CUSTOM VISUALS | ||
Audit custom visual performance | Check Measure Killer > Report Results > Visuals Per Type to see if custom visuals are used. (This may already be readily apparent just from eyeballing the report.) | Replace poorly performing custom visuals with native alternatives when possible. "When possible" = when the custom visual is a souped-up version of a standard visual or the data can be communicated just as well using a standard visual. It is best not to create dependencies on custom visuals when it's not absolutely necessary anyway, because the object may break at some point in the future. |
ADVANCED OPTIMIZATIONS | ||
Configure IsAvailableInMDX | By default every column is set to TRUE, so there's nothing to check here unless you have already changed these settings. | Set IsAvailableInMDX to FALSE in Tabular Editor for fact columns (assuming you have created explicit measures for them) and hidden key columns |
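Several checklist rows above reference sketches; they follow here. First, for "Reduce unnecessary fact rows", a one-off DAX Studio query along these lines counts rows where every material fact field is zero. The 'Sales' table and its columns are placeholders; substitute your own fact table and measure-relevant columns.

```dax
-- Hypothetical check: how many fact rows carry no material values?
EVALUATE
ROW (
    "Zero-value rows",
    COUNTROWS (
        FILTER (
            'Sales',
            'Sales'[Quantity] = 0 && 'Sales'[Sales Amount] = 0
        )
    )
)
```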
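For "Aggregate facts based on remaining columns/values", a similar one-off query counts the distinct combinations of the dimension keys you still need, to see whether aggregating would actually shrink the table. Again, the table and key column names are placeholders.

```dax
-- Hypothetical check: row count if the fact table were aggregated to the remaining keys.
EVALUATE
ROW (
    "Rows at reduced grain",
    COUNTROWS (
        SUMMARIZE (
            'Sales',
            'Sales'[OrderDateKey],
            'Sales'[ProductKey],
            'Sales'[CustomerKey]
        )
    )
)
```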
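For "Push calculations to data model", the before/after below illustrates the idea with made-up tables and columns: a condition repeated across measures is replaced by a flag column created in Power Query, leaving the measure with a simple column filter.

```dax
-- Before (hypothetical): the "active customer" condition is evaluated in DAX every time.
Active Customer Sales =
CALCULATE (
    SUM ( 'Sales'[Sales Amount] ),
    FILTER (
        'Customer',
        'Customer'[Status] = "Active" && NOT ISBLANK ( 'Customer'[Email] )
    )
)

-- After (hypothetical): the same condition is materialized as a flag column
-- 'Customer'[Is Active Contact] in Power Query, so the measure stays simple.
Active Customer Sales =
CALCULATE (
    SUM ( 'Sales'[Sales Amount] ),
    'Customer'[Is Active Contact] = TRUE ()
)
```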
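For "Use DAX variables", a minimal sketch: each base aggregation is computed once, stored in a variable, and reused, instead of repeating the same expression within the measure. Table and column names are placeholders.

```dax
-- Hypothetical margin measure: each SUM is evaluated once and reused via variables.
Margin % =
VAR TotalSales = SUM ( 'Sales'[Sales Amount] )
VAR TotalCost  = SUM ( 'Sales'[Total Cost] )
RETURN
    DIVIDE ( TotalSales - TotalCost, TotalSales )
```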
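And for "Implement visual calculations", a running total is a typical case: a one-line visual calculation defined on the visual itself can replace a more complex model measure. The field name in brackets is a placeholder for a field already on the visual.

```dax
-- Hypothetical visual calculation: running total of a field already in the visual.
Running Total = RUNNINGSUM ( [Sales Amount] )
```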
Validating your improvements
After implementing optimizations, verify that reports still function as expected and that user experience has actually improved.
Test these areas systematically:
- User experience timing: Re-run Performance Analyzer on the same pages and interactions you recorded for your baseline and compare the object-level times to confirm responsiveness actually improved.
- Visual functionality: Confirm every visual still renders and that interactions, drill-downs, and tooltips behave as they did before your changes.
- Data accuracy: Spot-check key figures against the pre-optimization report; removing rows, trimming columns, or aggregating facts can change results if applied too aggressively.
- Cross-filter behavior: Verify that selecting values in one visual still filters or highlights the others as expected, especially after changing relationships, cardinality, or bidirectional filtering.
Document your results using the baseline metrics you established earlier.
Your performance tuning journey is complete, but it also creates a foundation for repeating the process with more reports and for ongoing optimization as solutions change over time.