Power BI Performance Tuning (3 of 3)

Optimizing report calculation performance

In Part 1 and Part 2, we established the foundation for performance tuning and tackled semantic model refresh optimization. Now we turn to the second optimization area: report calculation performance—the responsiveness users experience with every click, filter, and interaction.

The importance of calculation performance

While refresh optimization happens behind the scenes, calculation performance is felt with every user interaction.

  • Some calculation optimizations may actually increase semantic model refresh time—but that happens while nobody is waiting. Users experience calculation delays with literally every click, so the tradeoff is almost always worth it.
  • Poor calculation performance kills user adoption. Most users won't wait 30 seconds for a visual to load, even if the data is accurate.
  • Calculation performance compounds across interactions. A single dashboard session might involve dozens of clicks—small delays add up to frustrated users who abandon the solution.


Key performance metrics and how to capture them

Required software

DAX Studio
Tabular Editor
Measure Killer
Report Analyzer

Establishing your baseline metrics

You'll capture these metrics before making changes and after each optimization iteration. Without baseline measurements, you're just guessing whether your changes actually improved the user experience.

Data model size in memory
  1. Open DAX Studio and connect to your semantic model, or launch it directly from the External Tools ribbon in Power BI Desktop
  2. Navigate to the Advanced tab at the top and click View Metrics > OK
  3. On the VertiPaq Analyzer tab, click Summary and note the total memory footprint
  4. You can also export these metrics to VPAX if you'd like to document the difference in table count, row count, or column size before and after your changes.


Worst object calculation time and specific trouble objects
  1. In Power BI Desktop, open Performance Analyzer (View tab > Performance Analyzer)
  2. Click Start recording, then click through every page and wait for the visuals to calculate. If you must make a selection to drive an action, make note of it so you can repeat the process consistently when you re-measure after future iterations.
  3. Click Stop, then Export, and save the JSON file in the same directory as your PBIX.
  4. Open Report Analyzer, point it to that folder, and view the performance results to see a wireframe of the report pages with the slowest objects highlighted. This makes it easy to visualize which objects on a page are slowest, and the application also surfaces interface recommendations, such as reducing the number of objects on a page.
  5. Now open DAX Studio and load the performance JSON (Home tab > Load Perf Data)
  6. This generates a clear table showing the object-level times for query and rendering, which you can sort to find the most troublesome objects overall.
  7. If desired, you can export this to a simple CSV (as opposed to JSON) to snapshot the performance of an iteration of your testing. The export contains the IDs of the visuals and can be used to sum the total calculation time across all objects. (Maybe create a Power BI report for this? 🙂)
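Taking that last idea a step further: if you load the exported CSV into a small Power BI model, a measure along these lines sums total calculation time for a test iteration. This is only a minimal sketch; the 'Perf Export' table and [Duration (ms)] column are hypothetical names and should be adjusted to match the actual export.

```dax
-- Minimal sketch: total calculation time across all exported objects
-- ('Perf Export' and [Duration (ms)] are hypothetical names; match them to your CSV)
Total Calc Time (s) =
SUM ( 'Perf Export'[Duration (ms)] ) / 1000
```
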
Total number of objects
  1. Open Measure Killer and refresh it against your report.
  2. Under Report Results, look at the Visuals column. You can also expand the pages to see how many objects are on each page.


Interface performance optimization checklist

This checklist covers optimizations that improve calculation performance, working from the back end (storage and data model) through the front end (visuals and interface). That order generally makes sense unless you only have one or a few slow visuals to fix.

Each optimization below lists how to identify the opportunity (Identify) and implementation notes (Implement).

REPORT SETTINGS

Enable query reduction settings
Identify: Check File > Options > Query reduction settings.
Implement: Consider adding "Apply" buttons to filters and/or slicers so visuals aren't updated until a user has finished selecting the desired values.

Disable auto date/time tables
Identify: In File > Options > Data Load > Time Intelligence, check whether Auto date/time is enabled. The auto-generated date tables will also show up in the DAX Studio Advanced Metrics.
Implement: Disable auto date/time; it adds hidden tables to the semantic model and is rarely the appropriate calendar to use in visuals and calculations.

STORAGE MODE

Optimize table storage mode
Identify: Check the current storage mode in Model view.
Implement: Consider switching the model to Direct Lake or DirectQuery for large or frequently updated data, and consider DirectQuery or hybrid (dual) mode for very large individual tables.

Implement user-defined aggregations
Identify: Identify reports that frequently summarize large fact tables at higher grains (daily vs. hourly data). Note: the best candidates are DirectQuery or composite models, because user-defined aggregations currently require the most detailed version of the fact table to be in DirectQuery mode. (Technically, you can work around that with a hack.)
Implement: Create aggregation tables for common summary views. You can define the aggregate level or levels based on the design of the report, so most visuals will be able to use aggregated data. This may necessitate creating more, smaller dimensions, e.g., separate Year, Month, and Date tables instead of a single calendar table.

DATA MODEL APPROACH

Implement star schema
Identify: Review the model structure in Model view. Look for snowflaked data or unconventional data model tables.
Implement: Denormalize dimension tables to reduce relationship complexity. A star schema enables more efficient query paths.

RELATIONSHIP CHARACTERISTICS

Set proper cardinality
Identify: Review relationships in Model view. Look for many-to-many relationships, one-to-one relationships, or bidirectional filtering.
Implement:
  • Ensure many-to-many is valid and not the result of bad data
  • Avoid bidirectional filtering unless specifically required
  • Combine tables with one-to-one relationships

Optimize join columns
Identify: Use DAX Studio to identify high-cardinality relationship columns, i.e., a dimension primary key with a very large number of distinct values (a quick check is sketched below). Check for text-based keys vs. integer keys.
Implement:
  • Consider moving the most highly distinct fields from a high-cardinality dimension onto the fact table, so the remaining dimension can be summarized to far fewer rows and joined on a lower-cardinality key.
  • Replace text keys with integer keys where possible.

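As a quick way to size up a join column, a query like the following can be run in DAX Studio. The 'Sales' table and [CustomerKey] column are placeholders; substitute your own fact table and relationship key.

```dax
-- Compare the distinct values in a relationship key to the size of the fact table
-- ('Sales' and [CustomerKey] are placeholder names; adjust to your model)
EVALUATE
ROW (
    "Distinct key values", DISTINCTCOUNT ( 'Sales'[CustomerKey] ),
    "Fact rows", COUNTROWS ( 'Sales' )
)
```
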
TABLE COLUMNS

Minimize column count
Identify: Use Measure Killer to identify unused columns. Check DAX Studio Advanced Metrics for tables with many columns, focusing on fact tables (because they have the most rows). Adage: you can always add columns, but taking them away is much harder.
Note: Users may say they need every field. If that is the case, consider separating out a "self-service" semantic model that is much larger -- albeit slower-performing -- using either DirectQuery or Direct Lake to accommodate infrequently used data without hurting response times for frequently used data.
Implement:
  • Remove unused columns to reduce model complexity and memory footprint -- anything that is not required for the interface, self-service, or a functioning data model.
  • Some columns can be combined and the component fields dropped, e.g., concatenate Employee Name and Employee ID into one unique Employee field.
  • Some narrow tables can be joined, then the key field omitted from the final table.

Minimize remaining column footprints
Identify: Review column storage, distinct values, and data types in DAX Studio Advanced Metrics. Look for the presence of DAX calculated columns.
Implement:
  • Decimals and DateTime: round off insignificant digits and drop times or fractions of a second that aren't useful for analytical purposes (a quick way to estimate the benefit is sketched below).
  • Cleanse messy data to reduce the length and number of distinct values.
  • Null out or bucket uninteresting values as "Other", i.e., only retain the values you care about.
  • Push calculated columns back to Power Query to improve data compression.

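To estimate how much rounding would help before changing the source, a DAX Studio query along these lines compares the column's current distinct values to the distinct values after rounding. 'Sales'[Amount] and the precision are placeholders.

```dax
-- Estimate the compression benefit of rounding a decimal column
-- ('Sales'[Amount] is a placeholder; adjust the table, column, and precision)
EVALUATE
ROW (
    "Distinct values (raw)", DISTINCTCOUNT ( 'Sales'[Amount] ),
    "Distinct values (2 decimals)",
        COUNTROWS ( DISTINCT ( SELECTCOLUMNS ( 'Sales', "Rounded", ROUND ( 'Sales'[Amount], 2 ) ) ) )
)
```
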
TABLE ROWS

Reduce unnecessary fact rows
Identify: Review fact table data against actual reporting requirements. Run one-off queries to look for rows with 0s in all material fact fields (a sketch follows below). Review the available time period, transaction types, etc.
Implement: Apply filters at the model level for inactive records, irrelevant date ranges, or zero-value transactions.

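A one-off check like the following, run in DAX Studio, counts fact rows whose material fields are all zero. 'Sales', [Quantity], and [Amount] are placeholders for your fact table and its value columns.

```dax
-- Count fact rows that carry no material values and are candidates for filtering out
-- ('Sales', [Quantity], and [Amount] are placeholder names)
EVALUATE
ROW (
    "All-zero fact rows",
    COUNTROWS ( FILTER ( 'Sales', 'Sales'[Quantity] = 0 && 'Sales'[Amount] = 0 ) )
)
```
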
Reduce unnecessary dimension rows
Identify: Review relationship information in DAX Studio Advanced Metrics. Dimension tables usually contain values for inactive products, employees, dates in the distant past or future, etc. These all add to the size and hurt the performance of the solution, as well as filling filters with irrelevant values that degrade the user experience.
Implement:
  • Inner join dimension tables to the relevant fact rows in Power Query to filter out unnecessary rows (a query to size the opportunity is sketched below).
  • If you have a security table, you can also inner join it so it only retains rows that grant access to data that exists in the semantic model.

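To see how many dimension rows never appear in the fact table, a query like this one works in DAX Studio, assuming an existing relationship between the two tables. 'Product' and 'Sales' are placeholder names.

```dax
-- Count dimension members with no related fact rows
-- (assumes a relationship from 'Sales' to 'Product'; names are placeholders)
EVALUATE
ROW (
    "Unused dimension rows",
    COUNTROWS ( FILTER ( 'Product', ISEMPTY ( RELATEDTABLE ( 'Sales' ) ) ) )
)
```
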
Aggregate facts based on remaining columns/values
Identify: After trimming columns, column footprints, and unneeded rows, either aggregate to the remaining dimensional values or run a distinct row count over those values to see whether the fact row count can be reduced (a prototype query is sketched below).
Implement:
  • Implement a GROUP BY to summarize the fact table based on the required data. For some calculations, you may also need to add a row count column recording how many rows were rolled up into each summary row.
  • Don't overlook time granularity as a possible way to aggregate data. Perhaps weekly or monthly aggregates would be sufficient even when the source is daily.

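Before changing anything upstream, you can prototype the coarser grain in DAX Studio and compare row counts. 'Sales', 'Date'[Month], and 'Product'[Category] are placeholders for your fact table and whichever dimensional columns remain in use.

```dax
-- Compare the current fact row count to the row count at a coarser grain
-- ('Sales', 'Date'[Month], and 'Product'[Category] are placeholder names)
EVALUATE
ROW (
    "Current fact rows", COUNTROWS ( 'Sales' ),
    "Rows at monthly grain by category",
        COUNTROWS ( SUMMARIZE ( 'Sales', 'Date'[Month], 'Product'[Category] ) )
)
```
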
FILTER APPROACH

Use filters over slicers by default
Identify: Review the report pages manually to see whether slicers are used to filter reporting data.
Implement: Use the Filter pane instead of slicers unless users need to see the available values. Slicers add rendering time and calculation overhead, in addition to taking up limited space for visuals. The most appropriate time to use slicers is for field parameters or when one selected value is forced on a field.

CALCULATION COMPLEXITY AND EFFICIENCY

Push calculations to data model
Identify: Look for complex conditions in DAX (especially repeated across many measures) that could be materialized in the data model instead.
Implement: Move the logic to a static flag or category field in Power Query and write simpler DAX against it.

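As a minimal sketch of the idea, assume a hypothetical business rule ("online sales over 1,000") that was being evaluated inline in DAX. Once the rule is materialized as a flag column in Power Query, the measure collapses to a simple column filter. All names here are placeholders.

```dax
-- Before (hypothetical): the rule is evaluated row by row inside the measure
High Value Online Sales =
CALCULATE (
    SUM ( 'Sales'[Amount] ),
    FILTER ( 'Sales', 'Sales'[Amount] > 1000 && 'Sales'[Channel] = "Online" )
)

-- After: the rule is a static 'Sales'[Is High Value Online] column created in Power Query,
-- and the measure becomes a simple filter on that column
High Value Online Sales =
CALCULATE ( SUM ( 'Sales'[Amount] ), 'Sales'[Is High Value Online] = TRUE () )
```
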
Use DAX variables
Identify: Review complex measures with repeated expressions.
Implement: Use variables to store intermediate calculations and improve both readability and performance.

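A classic example, using placeholder names ([Total Sales] and 'Date'[Date]): without a variable, the prior-year expression is written (and potentially evaluated) twice; with a variable it is stated once and reused.

```dax
-- Without a variable: the prior-year expression is repeated
Sales YoY % =
DIVIDE (
    [Total Sales] - CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) ),
    CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
)

-- With a variable: the intermediate result is stored once and reused
Sales YoY % =
VAR SalesLastYear = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
RETURN
    DIVIDE ( [Total Sales] - SalesLastYear, SalesLastYear )
```
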
Implement visual calculations
Identify: For simple calculations like running totals or percent of parent, check whether visual calculations could replace complex DAX.
Implement: Use visual calculations for visual-specific logic, especially if it doesn't need to be reused elsewhere.

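For example, a running total that once required a dedicated measure can be added directly on the visual as a visual calculation; [Sales Amount] is a placeholder for whichever field is already in the visual.

```dax
-- Visual calculation added on the visual itself (not a model measure)
Running Total = RUNNINGSUM ( [Sales Amount] )
```
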
SLOW VISUALS

Review slow DAX queries
Identify: Use Report Analyzer to identify visuals taking more than n seconds (the threshold depends on your data volume). Copy their queries to DAX Studio for detailed analysis, where you can review the SQL-like representation of each query under Trace > Server Timings.
Implement: Optimize the slowest queries first; they have a disproportionate impact on user experience. Use LLMs or search DAX Patterns or SQLBI for ideas; there are entire books on optimizing DAX. Get your data model right first, then see what is still slow.

Move measures to tooltips
Identify: Identify visuals with many measures where some are rarely viewed. Check whether secondary metrics could be hidden and viewed on demand only.
Implement: Move less critical measures to tooltips to reduce visual complexity and calculation overhead. You may also be able to move dimensions that increase the grain of a visual to the tooltip, simulating drilling to detail.

Implement field parameters to limit visible columns
Identify: For visuals with many similar measures or columns, check whether users need to see all options simultaneously.
Implement: Use field parameters to let users choose which measures/columns to display, reducing calculation load instead of showing every measure to everyone by default.

Apply default filters
Identify: Review slicers and filters that users typically set to specific values, or visuals that only make sense when filtered to a single value.
Implement: Set sensible defaults to reduce initial load time -- this is a favor to users, so that when they open a page the charts are not already spinning, trying to return more data than they care to see. Use "always one selected" for required filters.

INTERFACE DENSITY

Reduce objects per page
Identify: Count visuals per report page. Both Report Analyzer and Measure Killer > Report Results have this information, and both have quality flags that will trigger if there is a high number of visuals on a page.
Implement:
  • Remove unnecessary objects
  • Move formatting objects to a background image
  • Combine related visuals (use the title, subtitle, etc.)
  • Use drill-through to separate aggregated and detailed views
  • Split content onto multiple pages or show/hide with bookmarks instead of dense or scrollable pages -- you often can't see all of the visuals at the same time, anyway
  • Make the drill-through information available in a tooltip and eliminate the detailed view

CUSTOM VISUALS

Audit custom visual performance
Identify: Check Measure Killer > Report Results > Visuals Per Type to see whether custom visuals are used. (This may already be readily apparent just from eyeballing the report.)
Implement: Replace poorly performing custom visuals with native alternatives when possible. "When possible" means when the custom visual is a souped-up version of a standard visual or the data can be communicated just as well using a standard one. It is best not to create dependencies on custom visuals unless absolutely necessary, because the object may break at some point in the future anyway.

ADVANCED OPTIMIZATIONS

Configure IsAvailableInMDX
Identify: By default everything is set to TRUE, so there's nothing to check here unless you have already changed these settings.
Implement: Set IsAvailableInMDX to FALSE in Tabular Editor for fact columns (assuming you have created explicit measures for them) and for hidden key columns.

Validating your improvements

After implementing optimizations, verify that reports still function as expected and that user experience has actually improved.

Test these areas systematically:

User experience timing: Re-run Performance Analyzer on the same pages and interactions you recorded for your baseline, and compare the object-level timings and total calculation time against your earlier export.

Visual functionality: Confirm that every visual still renders and behaves as intended, especially where you removed columns, moved measures to tooltips, or replaced custom visuals.

Data accuracy: Spot-check key figures against the pre-optimization report. Changes such as aggregating fact tables, filtering rows, or rounding values can silently change results if applied too aggressively.

Cross-filter behavior: Verify that slicers, filters, and visual interactions still propagate correctly, particularly where you changed relationships, cardinality, or bidirectional filtering.

Document your results using the baseline metrics you established earlier.

Your performance tuning journey is complete for now, but it creates a foundation for repeating the process with more reports and for ongoing optimization as solutions change over time.
