Tuesday, February 19, 2019 #

DAX Studio Parameters Dialog

I recently updated the documentation on daxstudio.org to include a page on parameter support, but I thought this might also benefit from a longer explanation.

It may surprise some people to hear this, but the DAX language has supported parameters in queries since it was first released. However, while you can use parameters in a DAX query, you cannot easily test parameterized DAX by supplying the parameter values as variables like you can in T-SQL. In DAX the parameters are sent in a separate part of the request, as part of the XMLA command’s parameters collection. Historically one of the biggest issues with parameters in DAX (and MDX) has been the client tool support. From what I have experienced, they are really only used in Reporting Services reports. However, with the addition of “paginated” reports to PowerBI.com we may start to see a resurgence of these types of reports.

Up until recently the tooling around parameters in SSRS using DAX queries was not that great and took a bit of manual hacking between MDX and DMX to get it working. However now you can add a parameter to your DAX query in SSRS with a simple checkbox:

image

This generates a query like the following with a parameter reference (@ProductColor). The pattern that the SSRS designer generates is relatively complex, but that’s because it’s building generic code that can also deal with multi-select parameters:

DEFINE
    VAR ProductColor1 =
        IF (
            PATHLENGTH ( @ProductColor ) = 1,
            IF ( @ProductColor <> "", @ProductColor, BLANK () ),
            IF (
                PATHITEM ( @ProductColor, 2 ) <> "",
                PATHITEM ( @ProductColor, 2 ),
                BLANK ()
            )
        )
    VAR ProductColor1ALL =
        PATHLENGTH ( @ProductColor ) > 1
            && PATHITEM ( @ProductColor, 1, 1 ) < 1
EVALUATE
SUMMARIZECOLUMNS (
    'Product'[Product Name],
    'Product'[Class],
    'Product'[Color],
    FILTER (
        VALUES ( 'Product'[Color] ),
        ( ( ProductColor1ALL
            || 'Product'[Color] = ProductColor1 ) )
    )
)

Then when you run your report the value for this parameter gets injected into the query as a string value.

This is all fine within the SSRS designer, but what happens when you need to do some performance tuning on the query? If you are using a tool like SSMS then you have no choice but to do a find and replace on the query text, replacing the @ProductColor reference with something like “Red” to test your query, and then you need to make sure to reverse this operation before pasting your final query back into SSRS. This is a bit of a hassle with one parameter, but it can be a real pain, and potentially error prone, if you have 10 parameters.

XMLA Parameter Blocks

Well, for a couple of years DAX Studio has supported XMLA parameter blocks, so you could run queries like the following. The XMLA parameter syntax is a bit arcane and not the easiest thing to remember, but if you have an example to start from it’s not too bad.

image
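As a rough illustration (this is not the exact query from the screenshot, and the parameter name and value are made up), a parameterized query with an XMLA parameter block appended after it looks something like this:

EVALUATE
FILTER ( VALUES ( 'Product'[Color] ), 'Product'[Color] = @ProductColor )

<Parameters xmlns="urn:schemas-microsoft-com:xml-analysis">
    <Parameter>
        <Name>ProductColor</Name>
        <Value>Red</Value>
    </Parameter>
</Parameters>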

Why did we choose the <Parameters> XMLA block as a format? Well, if you create a trace using SQL Profiler that listens for QueryBegin events you will see text exactly like the query in the screenshot above, with the text of the query followed by the XMLA parameters block. This made it possible to capture SSRS queries from a SQL Profiler trace and then just paste them straight into DAX Studio and execute them.

This was a good start, but if you want to capture slow running queries you would probably set up your trace to listen for the QueryEnd events, which contain the duration of the query but do not include the XMLA parameter block. While you could listen for both the QueryBegin and QueryEnd events in order to see both the durations and the XMLA parameters, you then need to scroll up and down to try and match the begin events to the end events.

The *New* Parameter Dialog

As of DAX Studio 2.8.0 we introduced a new Parameters dialog. This is triggered any time you run a query that has parameters in the query text but no XMLA parameters block.

image

You can then run the query multiple times, filling in the value of the parameters each time. Or you can click the “Write Parameter XML” button in the bottom left, which generates an XMLA parameter block and inserts it underneath the query; you can then re-run the query multiple times without having to enter the values into a dialog box each time.

image

Merging Parameters

In addition to the ability to run queries with XMLA parameter blocks we also have the “Merge Parameters” button on the ribbon which will take an existing query with a parameter block and merge the parameter values into the query.

merge-parameters-before

This will take a query like the one above and produce the text below:

merge-parameters-after
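As a rough, made-up example: a filter such as 'Product'[Color] = @ProductColor, combined with a parameter block supplying the value "Red", would be rewritten as 'Product'[Color] = "Red" directly in the query text, so the parameter block is no longer needed.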

This can be useful if you know that you just need to work on tuning a measure and you don’t want to bother with parameters. Or you could use this to paste the query into a unit testing framework like the excellent NBi.

Conclusion

So now with DAX Studio you have an easy way of working with parameters in DAX queries, either for working with queries generated by the SSRS query builder or just as a way of running your own queries with a range of different parameter values.

Posted On Tuesday, February 19, 2019 9:31 AM | Comments (0)

Friday, February 1, 2019 #

Power BI – Fixing Dates from SharePoint Lists (converting UTC Dates to Local Dates)

So the problem below relates to working with dates from a SharePoint 2013 list, but the same technique should be applicable any time you want to convert UTC/GMT datetime fields to a local time.

When SharePoint exposes datetime values from lists using its REST API it converts them to the UTC timezone. I suppose a lot of the time this makes sense, especially if you are dealing with data entry across multiple time zones. But if you are just working in a single timezone this can be frustrating to deal with. The problem with bringing this data into Power BI is that the column in Power BI does not get tagged as being UTC, which can lead you to think that something has gone wrong. If you are in any timezone other than GMT you will see dates/times being offset when you bring them into Power BI. I live in UTC+10, so this effect is quite marked, often shifting dates back to the prior day.

If you look at the data from the web service calls to SharePoint you can see that it’s returning the dates in UTC format, not as they were entered in the UI. The images below show an example of this.

1. This is the data as it appears in the SharePoint list

image

2. This is the data coming from the SharePoint web service API (the trailing ‘Z’ indicates that this date/time is in UTC) and you can see that 11 hours have been subtracted from the time (the date used above falls during daylight saving in my local timezone, which is 11 hours ahead of UTC)

image

3. This is how the data appears in Power BI, which is how it was sourced from the SharePoint REST API.

image

To fix this so that you can view the dates in the local timezone we can go into the query in Power BI and do the following:

1. Click on the date column and then in the “Transform” tab click on the Data Type setting and change this to “Date/Time/Timezone” – this will correctly tag the column as being in UTC+00:00

image

2. Then in the “Add Column” ribbon click on the “Custom Column” option, give this new column a name like “TxnDateLocal” and then enter the following formula, where [TxnDate] is the name of the column from SharePoint that we changed the type of in step 1

= DateTimeZone.ToLocal( [TxnDate] )

image

3. Repeat the above for each DateTime column and then use the “local” versions of these columns in your model. It’s probably also a good idea to go back and delete the original versions of these columns from the query after we have added the new local versions to prevent them being used accidentally. The sketch below shows roughly what the resulting steps look like in the Advanced Editor.
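A minimal sketch of the equivalent M query (the inline #table just stands in for the real SharePoint list, and the step and column names are illustrative, not taken from the screenshots):

let
    // illustrative source table standing in for the SharePoint list data
    Source = #table(
        type table [Title = text, TxnDate = text],
        { {"Example item", "2019-01-15T23:30:00Z"} }
    ),
    // step 1: tag the column as Date/Time/Timezone (i.e. UTC+00:00)
    ChangedType = Table.TransformColumnTypes(Source, {{"TxnDate", type datetimezone}}),
    // step 2: add a new column converted to the local timezone
    AddedLocal = Table.AddColumn(ChangedType, "TxnDateLocal", each DateTimeZone.ToLocal([TxnDate]), type datetimezone),
    // step 3: remove the original UTC column so it cannot be used accidentally
    RemovedUtc = Table.RemoveColumns(AddedLocal, {"TxnDate"})
in
    RemovedUtc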

Posted On Friday, February 1, 2019 2:22 PM | Comments (0)

Wednesday, January 23, 2019 #

DAX Studio 2.8.1 released

So thanks to a couple of early adopters we’ve found and hopefully squashed a few bugs that managed to sneak into the 2.8.0 release, and the 2.8.1 release is now live at https://daxstudio.org

v2.8.1 Includes all the features from 2.8.0 along with the following fixes:

  • Fixed a crash when launching DAX Studio by double-clicking on a .dax file (related to a race condition that only occurs in release builds) - thanks to @speschl for reporting the cause of this issue
  • Fixed a crash when opening a file after closing all other windows (related to the issue above)
  • Attempted to fix a random crash when clicking on various tabs - this appears to be related to an issue in AvalonDock 3.4, so we've rolled back to AvalonDock 3.3

Posted On Wednesday, January 23, 2019 12:01 PM | Comments (0)

Tuesday, January 22, 2019 #

DAX Studio 2.8.0 Released

So it’s been a little while coming, but the 2.8.0 release of DAX Studio is now available at https://daxstudio.org 

It contains a host of new features as well as a lot of stability improvements. Below is a copy of the release notes.

New Features:

  • Filter Dump Measure – right-clicking a table in the metadata pane can generate a measure that returns a string describing the active filter context in a table or in all the tables. The DAX measure generated should be copied in a Tabular model to display the filter context in a report tooltip for debugging purposes.
  • Define All Measures – right-clicking a table in the metadata pane can generate the definition of all the measures defined in a table or in all the tables.
  • Parameter UI - if your query contains @parameters you will be prompted for the parameter values
  • Auto-Save - if DAX Studio was not shut down cleanly it will offer to recover the files that were open
  • Goto Line - ctrl+G lets you jump to a specified line number
  • Updated Data Provider libraries for connecting to Azure AS, PowerBI.com and Power BI Desktop
  • Support for Power BI Aggregation events in ServerTimings and AllQueries traces
  • Digitally Signed - the installer, Excel Addin and Standalone executable are all now digitally signed
  • Option to zoom results grid with query text (useful for when presenting)
  • Optimized tracing of Direct Query for newer engine sources (which now support filtering these trace events by session id)
  • Added links to dax.guide into function tooltips in code completion window
  • Option to set results grid font size
  • Added hotkey Ctrl-Shift-N to open a new query with the same connection as the current window

Fixes:

  • connecting to PowerPivot models with a single quote characters (') in the file name
  • defaulting Find/Replace text from the current selected text in the editor
  • added VAR & RETURN to the code completion keywords
  • fixed default file format to use UTF-8 (but you can still open both UCS-2 and UTF-8 files)
  • fix for mis-reported line numbers for errors with DaxFormatter.com
  • fix to properly increment file names when exporting multiple resultsets to csv
  • numerous stability fixes as a result of reported issues and crash reports including:
    • fixed crash when resizing DAX Studio to a very narrow width
    • fixed crash when dragging metadata to editor while intellisense window was open
    • fixed crash when querying a table with the ^ character in a column name
    • fixed crash when clicking save when no document is open
    • fixed occasional crashes when copying & pasting (workaround for a .Net issue)
    • fixed numerous crashes when clicking ribbon buttons after closing all query windows

Posted On Tuesday, January 22, 2019 9:45 AM | Comments (0)

Sunday, January 20, 2019 #

DAX Studio – What do all those numbers mean in the server timing output?

When you run a query in DAX Studio with the Server Timings feature switched on you will see output like the following.

image

A number of the metrics that are gathered by the server timings have been abbreviated and you may not be aware of what they all mean. The topic of performance tuning DAX queries can get quite involved. This post is just designed to give a brief overview and a bit of background about how the different metrics are calculated.

Metric – Description
Total – This is the total query duration in milliseconds, taken from the Query End profiler event. This is the total time the server took to process the query (so it excludes any time the client took to process the result set).
SE CPU – This is the amount of CPU time that was spent on Storage Engine queries (note that this figure *may* not be 100% reliable, so don’t place a high amount of importance on it). It is calculated by adding up the CPU duration from the Storage Engine events. The blue ratio under SE CPU is the factor of SE CPU over SE and is a very rough indicator of the average number of parallel operations that the SE was running.
FE – This is the amount of time spent in the Formula Engine, calculated by subtracting the SE duration from the Total. The blue figure underneath is the percentage of FE / Total.
SE – This is the amount of time spent in the Storage Engine, calculated by adding up the duration of all the Storage Engine queries. The blue figure underneath is the percentage of SE / Total. As a rough rule of thumb you want to try to get your queries to spend more time in the Storage Engine, as it is multi-threaded and can do more operations in parallel, while the Formula Engine is single threaded and cannot make use of multiple CPU cores. Note that you cannot have a query that is 100% handled in the storage engine, as the FE sits over the top of the SE. The FE is what issues the requests to the SE and it also serializes the result set before it is sent back to the client.
SE Queries – This is the number of Storage Engine queries that were performed during the processing of the query.
SE Cache – This is the number of Storage Engine cache hits.
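As a made-up worked example: if Total is 100 ms and the Storage Engine events add up to SE = 60 ms with SE CPU = 180 ms, then FE = 100 - 60 = 40 ms (40% of the total), SE is 60% of the total, and SE CPU / SE = 3, suggesting that on average roughly 3 storage engine operations were running in parallel.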

You may also wonder what that “SQL like” query is that is captured by the scan event. This is called xmSQL and it is a textual representation of the requests that the Formula Engine sent to the Storage Engine. As far as I am aware there is no way of executing these queries; they are merely a textual representation of the requests sent to the Storage Engine to enable people to understand what operations the storage engine was performing.

Posted On Sunday, January 20, 2019 10:17 PM | Comments (0)

Tuesday, February 13, 2018 #

DAX Studio 2.7.2 Released

The latest update for DAX Studio is now live at http://daxstudio.org

This release includes a number of small enhancements and fixes including the following:

  • Enhancement: Allowing "Unlimited" Dataset sizes from PowerPivot – previously results were buffered through an internal memory structure that had a 2Gb limit which resulted in most typical queries failing at around the 2 million row mark. We've now implemented a new streaming interface which removes this limit and have run tests exporting over 6 million records and creating a 6Gb csv file.
  • Fix: Default Separators so that the option to run queries with non-US separators is used correctly when set in the Options screen
  • Fix: Tracing Query Plans for PowerPivot
  • Fix: Setting Focus to the Find box after typing Ctrl-F
  • Fix: "Linked Excel" output when connected to an Analysis Services Multi-Dimensional cube so that it always includes Cube=[CubeName] in the connection string
  • Fix: Crash in "Define and Expand Measure" when run against a Power BI model with a measure with the same name as one of the column names

In addition to the specific issues listed above, numerous stability enhancements have been added as a result of crash reports that have been logged. Thanks to those of you that have submitted these reports when the program crashes, especially those of you that have taken the extra time to note down some extra information about what you were doing when the crash occurred. With some of the crash reports it's easy to figure out what happened from the stack trace and screenshot, but in other cases it's quite difficult. We have also seen some reports that appear to be from .Net faults or issues in some of the third party libraries that we are using.

Posted On Tuesday, February 13, 2018 7:54 AM | Comments (0)

Tuesday, February 6, 2018 #

DAX Studio recent Win7 SP1 crashes

We've just found out that a recent security update to the .Net framework in January 2018 for Windows 7 SP1 has been causing crashes in DAX Studio when accessing the File menu. Unfortunately this issue is outside of our control and affects any WPF based Windows app which references the Windows font collection (which DAX Studio does in the Options window).

If this issue is affecting you the following link outlines the cause of the issue and some possible fixes https://github.com/dotnet/announcements/issues/53

This fault typically manifests as a fatal DAX Studio crash with a CrashReporter dialog which reports an "MS.Internal.FontFace.CompositeFontParser.Fail" exception and a message saying "No FontFamily element found in FontFamilyCollection that matches current OS or greater: Windows7SP1"

Hopefully a follow-up patch for this will be released soon that will remove the need for people to apply manual fixes for this.

Posted On Tuesday, February 6, 2018 7:08 AM | Comments (0)

Monday, October 2, 2017 #

DAX Studio 2.7.0 Released

The major change in this version is to the tracing engine. We’ve introduced a new trace type and made some changes to the way the tracing windows operate and incorporated some enhancements to crash reporting and enabling logging.

We've also finished moving off our old codeplex home onto http://daxstudio.org

Changes to the way trace windows work

Previously when you clicked on a trace button, the window opened and the trace was started and when you switched off the trace the window closed. The running of the trace and the visibility of the window was closely linked.

In v2.7 we have removed that tight linkage: when you click on a trace button the window opens and the trace still starts as it used to, but when you switch off the trace the window now remains open. The table below shows the 2 new states that trace windows now have.

v2.6 and Earlier                    V2.7 or later
Window Visible - Trace Running      Window Visible - Trace Running
N/A                                 Window Visible – Trace Paused **
N/A                                 Window Visible – Trace Stopped **
Window Closed – Trace Stopped       Window Closed – Trace Stopped

(** = states that are new in v2.7)


All trace windows now have a number of additional controls in their title area.

clip_image002  Starts a paused or stopped trace

clip_image003  Pauses a running trace


clip_image004  Stops a running trace


clip_image005   Clears any information captured in the current trace window


The tabs for the traces now also have an indicator to show their state so that you can see the state of a given trace at a glance. In the image below you can see that the All Queries trace is stopped, while the Query Plan trace is running and the Server Timings trace is paused. Note that while a trace is paused the server side trace is still active; it’s just the DAX Studio UI that is paused, so expensive trace events like Query Plans can still have an impact on the server.

clip_image006

The other side effect of this change is that if a .dax file is saved while a trace window is open, when that file is re-opened the trace window will also re-open with the saved trace information, but now the trace will be in a stopped state (previously the trace would open and re-start). This prevents accidentally overwriting the saved information and also means that the saved trace information will open even if you cancel the connection dialog (which would not happen in v2.6 or earlier, where cancelling the connection would cause the saved trace information not to open).

The “All Queries” trace

The new trace type is called “All Queries” – which captures all queries against the current connection, regardless of the client application. This is useful for capturing queries from other client tools so that you can examine them.

When the trace is active it will capture all events from any client tool. The screenshot below shows a capture session that was running against a Power BI Desktop file. When you hover over the queries the tooltip shows you a larger preview of the query, and double clicking on the query text copies it to the editor.

clip_image008

The “All Queries” trace has a few additional buttons in the title bar area.

The following button in the trace window title clip_image010 will copy all the queries matching the current filter to the editor pane.

The Filter button clip_image011 shows and hides the Filter controls, the clear filter button clip_image012 will clear any filter criteria from the filter controls.

Filters can be set for a specific type of query (DAX/MDX/SQL/DMX), for a duration range, username, database or query text. The filters all do a “contains” style search that matches if the text you type appears anywhere in the field.

clip_image014

Note: You cannot have the "All Queries" trace running with either the Server Timings or Query Plan traces as DAX Studio currently only runs one trace session per query window and these traces apply different filters to the trace events. We expect that the normal workflow would be to run the All Queries trace and collect a set of queries. Then you will stop the All Queries trace and look at the long running queries, maybe re-running some of the captured queries with one or both of the other trace types to help isolate performance issues.

Enhanced Tooltips

The amount of information in the tooltips for columns has been greatly extended. By default we now show the format string, min and max values, the number of distinct values and a sample of 10 values. There are now settings under File – Options that let you turn off either one or both of the basic statistics or the sample data information in the tooltip.

clip_image016 clip_image018

Crash Reporting

We’ve added the CrashReporter.Net component that will catch fatal crashes and give the user an option to submit a crash report containing the exception message and stack trace information that may help us resolve the issue.

Logging Improvements

We also now support a logging hotkey – holding down the left SHIFT key while starting up Excel or the DAX Studio standalone will start debug level logging. There is a link in the Help – About dialog that will open the log folder; alternatively, type Win+R, enter %APPDATA%\DaxStudio\logs and click OK, or put that address into the address bar in Windows Explorer and hit the Enter key.

Other Miscellaneous Fixes

  • Added support for opening .msdax files generated by SSMS
  • Fixing output of time portion of datetime columns when exporting to csv
  • Fixed a bug where table list sometimes did not update after changing the connection
  • Fixed missing define measure option for hidden measures
  • Fixed crash when right-clicking on the Query History window
  • Added installer support for the SQL 2017 versions of AMO and ADOMD – if you have both of these the installer should no longer insist on downloading the 2016 versions of these libraries.

Posted On Monday, October 2, 2017 7:57 PM | Comments (1)

Thursday, December 22, 2016 #

DAX Studio 2.6.0 downloading issues

UPDATE: It looks like the 2.6.0a release on codeplex is now being flagged by Chrome as malicious. I don't know if it's the file or codeplex.com that is the issue (our older releases with thousands of downloads appear to be flagged as "malicious" too now). So I have also made the setup file available as a release from our github repo too - https://github.com/DaxStudio/DaxStudio/releases/download/2.6.0/DaxStudio_2_6_0a_setup.exe

If you had trouble downloading the 2.6.0 release yesterday there is now a new 2.6.0a release up at https://daxstudio.codeplex.com/releases

Yesterday's release of DAX Studio 2.6.0 started off OK. We'd had over 400 downloads, but then at some point the browsers seemed to decide the installer was a malicious file. We're not sure why, but all of them have started throwing up warnings. Chrome seems to be the worst offender, saying that the file is malicious and only giving you the option to discard it.

image

And both Firefox and Edge give you a very scary warning, but will let you download the file.

The advice on the Chrome "learn more" link is very generic and there does not appear to be any way of submitting for a re-assessment of this judgement.

I checked both the original file and then downloaded the file from codeplex and checked again (just in case something had infected the file after it was uploaded to codeplex) using Google's VirusTotal.com site, and both times 0 out of 55 antivirus scanners reported any issues.

If you are curious, below is a link to the detailed VirusTotal.com results:
https://virustotal.com/en/file/df634b42c7ce6027aeab4e90d83594c637d6d6e73acc9e89027ed34de1655d37/analysis/

Posted On Thursday, December 22, 2016 10:06 AM | Comments (1)

Wednesday, December 21, 2016 #

DAX Studio 2.6.0 Release

We seem to somehow have gotten into the habit of doing pre-Christmas releases every year, so why break with tradition now :)

The latest release of DAX Studio has a couple of new features as well as a bunch of small fixes.

The biggest single feature is the support for multiple result sets. Both SSAS 2016 and Power BI support sending multiple EVALUATE statements in a single batch. You can now do the same thing in DAX Studio and we will generate a numbered tab for each result set.

image

We've changed the Connection dialog so that connection types that are unavailable are just disabled instead of being hidden. There is also a little help icon with a tooltip which indicates why the particular option is disabled to help those that are new to DAX Studio.

image

We've added the ability to connect to SSDT Integrated Workspaces.

image

There is now a setting under File – Options to allow you to opt-in for notifications of pre-release builds. So when you launch DAX Studio if there is a new pre-release version available you will get a toast notification. I don't think we will always do a pre-release build, but there have been a number of times where it would have been nice to get a few more people testing out new functionality before doing the final release.

image

When querying measures the formatting from the data model is now applied. Note that to do this we look for matches between the column names in the result set and the measures in your model, so if you use functions like ADDCOLUMNS or SUMMARIZE you need to make sure to give the output columns the same name as the underlying measure if you want formatting applied.
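For example, a rough sketch assuming your model has a measure called [Sales Amount]: naming the added column to match the measure lets its format string carry through to the results grid.

EVALUATE
ADDCOLUMNS (
    VALUES ( 'Product'[Color] ),
    "Sales Amount", [Sales Amount]
)

If the output column were given a different name, such as "Total Sales", the measure's formatting would not be matched and the raw value would be shown instead.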

And there have been a bunch of minor enhancements:

  • A link has been added to the download page from Help - About when a newer version is available.
  • Added parsing of record counts in Server Timings for SQL 2016 / Power BI Desktop.
  • Improved metadata search performance and the search box now stays open when it has keyboard focus.

There are also a number of fixes in this release, some minor, but some also address some annoying crashes:

  • Fixed an issue where Query Plans and Server Timings would not start when connected to PowerPivot.
  • Fixed an error when using the locale setting in the connection dialog
  • Fixed an issue with hidden animations running causing background CPU usage even when the app was idle.
  • Fixed crashes when refreshing metadata (also affects the automatic refresh when switching back to DAX Studio after editing your model).
  • Fixed PowerPivot connections so that they stay connected when you open another Excel file.
  • Fixed blank column headers in the results when running DMV queries
  • Fixed file outputs; the csv and tab-delimited outputs had been switched

Posted On Wednesday, December 21, 2016 10:02 AM | Comments (3)

Monday, November 14, 2016 #

ProcessAdd bug in AMO 2016

I saw the question below on the MSDN forum about processAdd not working in AMO 2016 and I thought it sounded strange so I did some investigation:

https://social.msdn.microsoft.com/Forums/sqlserver/en-US/91785c40-a03f-4f46-abc1-9a51ed5e4283/processadd-using-amo-sql-2016-systemnotimplementedexception?forum=sqlanalysisservices

When I ran Redgate Reflector over the Microsoft.AnalysisServices.Core.dll I came across this little gem:


It checks whether the object is of a type that implements IQueryBinding from the Microsoft.AnalysisServices.Core namespace, and as far as I can see nothing currently implements this interface. What this means is that if you pass any of the built-in binding classes to the Process method it will always throw a NotImplemented exception.

I've posted a bug here https://connect.microsoft.com/SQLServer/feedback/details/3110077 and apparently it's a known issue and a fix is already in the pipeline. However there is also a relatively simple workaround which involves creating a class which implements IQueryBinding in your own project.

The IQueryBinding interface is thankfully not that complicated and a full implementation is outlined below:

using System;
using System.ComponentModel;

// A minimal IQueryBinding implementation to work around the bug described above
public class MyBinding : Microsoft.AnalysisServices.Core.IQueryBinding
{
    public MyBinding(string dataSourceID, string queryDefinition)
    {
        DataSourceID = dataSourceID;
        QueryDefinition = queryDefinition;
    }
    public string DataSourceID { get; set; }
    public string QueryDefinition { get; set; }
    public ISite Site { get; set; }
    public event EventHandler Disposed;
    public void Dispose() { }
}

You can then simply create a MyBinding instance and pass this in to the Process method for your partition:

var qb = new MyBinding("Adventure Works DW", "SELECT * FROM table");
partition.Process(ProcessType.ProcessAdd, qb);

Posted On Monday, November 14, 2016 9:26 AM | Comments (0)

Thursday, October 20, 2016 #

DAX Studio 2.5.0 Release

The next version of DAX Studio has just been released. You can download this release and read the release notes here

Note: In this release we have updated the versions of the Microsoft ADOMD.Net and AMO libraries we reference to use the SQL 2016 versions. This gives us scope to access some of the new functionality in Power BI and SQL Server 2016, but may mean that you are prompted to download these when you upgrade from a previous version.

Some of the highlights of this release are:

New Features

Added an option to trace Direct Query events

There is now an option under File > Options where you can enable extra events in Server Timings for Direct Query based models. These events add extra overhead so you should only enable this option before you start tracing a Direct Query model and you should disable this option once you are finished.

image

Added Dynamic syntax highlighting

Earlier versions of DAX Studio contained a hard coded list of syntax highlighting keywords and functions. The lists of keywords and functions used for syntax highlighting are now dynamically discovered from the data source. This has advantages when dealing with Power BI in particular, which can get new functionality added from one month to the next.

Added rows and KB to server timings tab

Analysis Services 2016 and Power BI have added information to the server timing events about the number of rows and the size of the data returned by each of the timing events. If this information is found it is now surfaced in the server timings tab.

image

Optimized DaxFormatter calls

The old version of the DaxFormatter.com API required a second call if there was an error to find out the details of the error. The nice people at DaxFormatter.com have updated their API so that this is no longer necessary.

Added an option to specify the default separator style

In the 2.4 release we introduced an option where you could convert on demand between the 2 different separator styles. But all queries had to be executed using the UK/US style.

The UK/US style is where a comma (,) is used as the list and thousands separator and the period (.) is used as the decimal separator.

eg. EVALUATE FILTER( 'Product' , 'Product'[List Price] > 1.25 )

The European/Other style is where a semi-colon (;) is used as the list separator, the thousands separator is a period (.)  and the comma (,) is used as the decimal separator.

eg. EVALUATE FILTER( 'Product' ; 'Product'[List Price] > 1,25 )

Now you can choose which style you want to use as your default in File > Options menu.

image

Added an error message when you attempt to open a .dax file that no longer exists

Prior to this version if you clicked on an entry in your recent file list which pointed to a file that had been renamed or deleted you would just get a blank window with no idea what went wrong. Now there will be an error posted to the output window telling you what went wrong.

Bug Fixes

  • Fixed a bug where server timing traces were also listening for query plan events
  • Fixed incorrect removal of square brackets from MDX results
  • Fixed a race condition that happened sometimes when trying to capture both Query Plans and Server Timings

Posted On Thursday, October 20, 2016 5:46 AM | Comments (0)

Tuesday, July 26, 2016 #

DAX Studio 2.4.1 Released

The next version of DAX Studio has just been released. You can download this release and read the release notes here

Some of the highlights of this release are:

Toggle Delimiters

A toggle delimiter feature has been added to switch between regional settings styles (commas vs semi-colons). Currently queries still need to be executed using the "US" style comma delimiters, but if you have expressions using semi-colon delimiters then you can easily switch back and forth. This can also be handy if you find an example expression online that is using different delimiters than what you have in your model.

image

this will swap between using comma delimiters

image

and semi-colon style delimiters

image

Intellisense Improvements

Added a Function Insight tooltip to intellisense which will appear after the opening bracket for a function, showing the parameters for the function as well as its description

image

The loading and caching of intellisense data has also been vastly improved. The slight delay when starting to type a query against a large model should be gone and you should also not see issues when trying to edit a query in a window that is currently executing a query.

Posted On Tuesday, July 26, 2016 6:37 AM | Comments (3)

Saturday, June 4, 2016 #

DAX - Creating a measure for Top 5 and Others

Often when you have a requirement to show a measure for the TopN of a given attribute the users will want to see the subtotal of all the other items.

With multi-dimensional cubes and MDX you can create calculated members in a dimension, but you can’t do this in a tabular model using DAX. You could create a calculated column, but then your TopN would be fixed and could not respond dynamically to changes in filters, as calculated columns are calculated and stored at processing time.

So one way to make this work is to have a placeholder row (or rows) with "Others" in the table which you are doing the TopN over.

The first step in this technique is to create a Rank measure (this saves having to re-type the rank logic multiple times). In this example I’m doing a top 5 Brands.

Brand Rank:=RANKX ( ALL ( Brand[BrandName] ), [Sales Amount] )

Then we create a top 5 measure that returns the sales amount if the rank is less than or equal to 5, returns the sum of the sales for all brands ranked outside the top 5 if the current brand is "Others", and otherwise returns BLANK()

Top5 := 
IF ( 
    [Brand Rank] <= 5, 
    [Sales Amount], 
    IF ( 
        HASONEVALUE ( Brand[BrandName] ), 
        IF ( 
            VALUES ( Brand[BrandName] ) = "Others", 
            SUMX ( FILTER ( ALL ( Brand[BrandName] ), [Brand Rank] > 5 ), [Sales Amount] ) 
        ) 
    ) 
)
 

Then if we create a pivot table with this measure we get the following result. In the picture below I’m showing the raw Sales amount so you can see how the amounts get re-assigned in the Top5 calculation.

This also works as you change the pivot table. When we introduce the Segment on the rows we can show a figure for “Others” within each segment. (Note that for this to work I had to create 2 placeholder “Others” rows, one in each segment). There is a link at the bottom of this post to the file that these screenshots were taken from.

 

At this point, I’m pretty happy. It seems to be working well, although calculating ranks is rather expensive so I’m wondering how it will work against larger data sets. I know that there are definitely ways to make this faster if you are using the current generation of DAX engines (Excel 2016, Power BI or SQL 2016) as we have access to things like variables and the INTERSECT function which we could use in conjunction with TOPN. But I think there are also opportunities to make this faster with “v1” DAX.

Running this algorithm against the Adventure Works Product Name column results in a performance profile like the following (using the server timings feature of DAX Studio)

This shows 86% of the time being spent in the formula engine. I’m pretty sure that a fair bit of this time is spent calculating ranks greater than 5 for the “Others” calculation. So I thought it might be faster to use TOPN to get everything except for the top 5. The formula for that is the following

Top5v2 := 
IF ( 
    [Brand Rank] <= 5, 
    [Sales Amount], 
    IF ( 
        HASONEVALUE ( Brand[BrandName] ), 
        IF ( 
            VALUES ( Brand[BrandName] ) = "Others", 
            SUMX ( 
                TOPN ( 
                    COUNTROWS ( ALL ( Brand[BrandName] ) ) - 5, 
                    ALL ( Brand[BrandName] ), 
                    [Sales Amount], 1 
                ), 
                [Sales Amount] 
            ) 
        ) 
    ) 
)

The modified version of the calculation has a much better performance profile, taking about 1/3 of the time and with an almost 50/50 split between the Formula Engine and the Storage Engine.

You can download a sample workbook with these formulas from the link below:

 https://onedrive.live.com/redir?resid=98546E1B65A78A74!14166&authkey=!AK3SDGu-_aLAXHo&ithint=file%2cxlsx

Posted On Saturday, June 4, 2016 3:38 PM | Comments (2)

Tuesday, November 3, 2015 #

Dax Studio 2.3.2 released

This latest release includes a lot of UI polish features, and we also now have support for SQL 2016 and Excel 2016, which is mainly an update to the installer plus some changes to the Query Plan and Server Timings features to deal with some changes to the xmSQL that comes back in the trace events.

Following the theory that a picture is worth a thousand words – below are screenshots of the changes in this release.

The File –> Open menu now includes a list of recently opened files.

image

For performance testing you can now set the run button to always do a clear cache before executing a query. This mode is easily selectable using the new arrow menu on the run button.

image

The modal dialogs all have updated styling and now include a shaded overlay so that the active portions of the screen are clearly visible.

image

An options pane has been added to the File menu for setting global program options

image

A Query History pane has been added which records all of the queries run by Dax Studio. If you have the Server Timings feature enabled the Server duration and the FE / SE timing are also tracked. You can double click on an item in the query history to insert it back into the editor. This is great for performance tuning as you can easily see which variation of the query was fastest and returned the expected number of rows and then bring it back into the editor.

image

The metadata pane now loads asynchronously. In earlier versions the loading of the metadata pane was a blocking operation and the user interface could go unresponsive for a short time while loading large models. Now the metadata is loaded on a background thread so the interface remains responsive, and the pane that is updating is greyed out to indicate that the load is still in progress.

image

The new “Define Measure” feature, which is a right-click option on a calculated measure, is a great way to see how a measure was defined without opening up the whole model, or you can use it as a starting point to test some variations on the logic.

image

There are also a number of small bug fixes and tweaks, and several issues that were raised on codeplex have been fixed (we always tag closed issues with the release they were fixed in).

Posted On Tuesday, November 3, 2015 11:10 AM | Comments (1)