THE SQL Server Blog Spot on the Web


SQLBI - Marco Russo

SQLBI is a blog dedicated to building Business Intelligence solutions with SQL Server.
You can follow me on Twitter: @marcorus

  • Leverage INTERSECT to apply relationships in DAX

    If you are used to virtual relationships in DAX (see Handling Different Granularities in DAX), you probably use the following pattern relatively often:

    [Filtered Measure] :=
    CALCULATE (
        <target_measure>,
        FILTER (
            ALL ( <target_granularity_column> ),
            CONTAINS (
                VALUES ( <lookup_granularity_column> ),
                <lookup_granularity_column>,
                <target_granularity_column>
            )
        )
    )

    In the new DAX available in Excel 2016, Power BI Desktop, and Analysis Services 2016, you can use a simpler syntax, which offers a small performance improvement and is much more readable:

    [Filtered Measure] :=
    CALCULATE (
        <target_measure>,
        INTERSECT (
            ALL ( <target_granularity_column> ),
            VALUES ( <lookup_granularity_column> )
        )
    )

    You can find a longer explanation of this new pattern, and download some examples, in the new article Physical and Virtual Relationships in DAX on the SQLBI website.
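    As a concrete sketch of the INTERSECT pattern, assume a hypothetical model where a Budget table is stored at month granularity and a Date table carries the month key on the lookup side (all table and column names here are illustrative, not from the article):

    [Budget Filtered] :=
    CALCULATE (
        SUM ( Budget[Amount] ),
        INTERSECT (
            ALL ( Budget[MonthKey] ),          -- target granularity column
            VALUES ( 'Date'[MonthKey] )        -- lookup granularity column
        )
    )

    The INTERSECT call keeps only the Budget[MonthKey] values that are visible in the current filter context over 'Date'[MonthKey], which is exactly what the CONTAINS pattern achieves with more code.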

  • Happy Birthday Power BI #powerbi

    Power BI has been on the market for one year. My biggest concern, when the product was still in private beta, was the promise of monthly releases made by Microsoft. Today, I can say that the promise was kept. I still see a long road ahead in terms of features and possible improvements, but the fact is that Power BI enjoys growing user adoption, and every month it adds features that increase the number of companies and users that can consider adopting it.

    If you go back to the situation of Microsoft BI two years ago, you will remember the lack of a mobile story, the requirement for SharePoint, and the poor situation on the client side. A great server product (such as Analysis Services) was limited in its adoption because of the client options available. Today the trend is completely different, and in certain areas of the BI platform Microsoft has become the leader instead of a follower.

    I always try to find the missing parts, what can be improved, without spending too much time praising what is good (and there is plenty). But today I just want to join the choir you will see in the video:

    HAPPY BIRTHDAY POWER BI !

  • A new MemoryHeapType default in #ssas #tabular 2016 (please, fix your setting in production!)

    If you have already installed Analysis Services 2016, you should change the MemoryHeapType setting. There is a new default value (-1), an automatic choice that currently applies a new hybrid allocator (identified by the value 5). It should resolve the memory fragmentation problem causing the performance issues described in an article I wrote a few years ago. However, setup does not write the new default value as the current value: it still writes the old default (2), which is not good for Tabular. Thus, if you installed SSAS Tabular 2016, you probably have this setting (note that the Default Value differs from the Value!). The new setting is also the suggested one for Multidimensional.

    [screenshot: SSAS server properties showing MemoryHeapType with the old value 2]

    You should change the value to -1 and then restart the service. After that, reconnect to SSAS Properties and check that you have the following configuration:

    [screenshot: SSAS server properties showing MemoryHeapType set to -1]

    Of course, we hope future updates of SQL 2016 will fix this setup issue. In the meantime, fix the setting to avoid performance issues on a production server!

  • Free Introducing Power BI eBook and new DAX recorded video course #powerbi #dax

    Microsoft Press has released a free eBook you can already download, Introducing Microsoft Power BI, which Alberto Ferrari and I wrote in the last few months. Please note it is a very introductory book; do not expect an in-depth reference. As we wrote in the introduction:

    We wanted to write an introduction to Power BI that covers the basics of the tool and, at the same time, shows you what the main capabilities of Power BI are. […] At the beginning, we go for an easy introduction of the concepts along with an educational approach that lets you follow on your PC the same steps we show in the book. […] After the first chapters, we begin to run a bit faster, knowing that we are no longer guiding you step by step. […]

    This book is targeted at a variety of readers. There are information workers and people who are totally new to the BI world; for those readers, the book acts as a simple introduction to the concepts that are the foundation of BI. Another category of readers we wanted to target is IT professionals and database administrators who might need to drive their company's decisions in adopting Power BI, because their users are asking for it. If this is you, the book acts both as a simple introduction to the basic concepts, to help you understand why users are so interested in Power BI, and as an overview of the capabilities and tools available in Power BI, so that you can make educated choices in adopting it.

    As you can see in the side picture, we included some real-world reports, and we have an entire chapter titled “Improving Power BI reports” where you will find a number of useful examples and guidelines. They are, of course, included in the companion content, which is a separate download available here. We used some of the components available on okviz.com, although that website was not ready when we wrote the book, so we did not mention it there.

    The goal of the book is to be introductory. So, what's next? We are working on some new content for later this year, but in the last few months we also worked on a recorded version of our Mastering DAX course. So, if you cannot join us in one of the many classes we deliver around the world, you can now get a recorded version, complete with all the exercises. You will not have the same interaction that is possible in a classroom, but early adopters who also attended the live class told us that the recorded video works well as a review tool. The structure and the flow are the same, and we tried to compensate for the lack of interactivity with a physical presence on the screen.

    You can watch a number of segments for free, and until July 5 you can save 70 USD on access to the entire course.

  • Many-to-many relationships in DAX and CROSSFILTER #dax #powerbi #powerpivot #ssas #tabular

    I wrote many articles and white papers about implementing many-to-many relationships in Analysis Services, for both Multidimensional and Tabular models. For the latter, the same techniques are also useful in Power Pivot and Power BI. With the recent release of Excel 2016, Analysis Services 2016, and the engine available in Power BI Desktop, there are new tools and techniques we can use to implement these data models, which are more efficient and easier to manage in your DAX code.

    The article Many-to-many relationships in Power BI and Excel 2016 describes how to use bidirectional filters in Power BI and Analysis Services 2016, and also how to obtain the same performance in Excel 2016 by using the CROSSFILTER function. These approaches use a smart technique that applies the filter through the relationships in the many-to-one direction only when it is really necessary, that is, when there is an existing filter over the bridge table of a many-to-many relationship. This is possible in the "old" DAX with a more complex conditional statement, which makes the code less readable and also suffers from another performance issue related to the use of an IF statement in a measure (also solved in the "new" engine).

    I think a huge number of data modeling options are now possible thanks to these innovations. Many of them will have an impact on several DAX Patterns, which we will revisit in a few months using the new techniques.
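    As a minimal sketch of the CROSSFILTER approach, assume a hypothetical model where Customer and Account are connected through an AccountCustomer bridge table, and transactions are recorded per account (all table and column names are illustrative, not the code from the article):

    [Sales Amount M2M] :=
    CALCULATE (
        SUM ( Transactions[Amount] ),
        CROSSFILTER (
            AccountCustomer[AccountKey],    -- many side of the existing relationship
            Account[AccountKey],            -- one side
            BOTH                            -- enable bidirectional filtering for this measure only
        )
    )

    The filter on Customer reaches the bridge table, and CROSSFILTER propagates it from the bridge to Account (and from there to Transactions) for the duration of this calculation, without changing the relationship definition in the model.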

  • Licensing and upgrade options for Analysis Services Tabular in SQL Server 2016 #ssas #tabular

    With the release of Microsoft SQL Server 2016, there are several options to use Analysis Services Tabular.

    • SQL Server Standard: it includes Analysis Services Tabular, and it is available in two licensing models: Server + Client Access License (CAL) and Core-based.
    • SQL Server Enterprise: it has all the features, and is available only with per-core licensing.

    For a Tabular model, the limitations of the Standard edition are the memory available for an instance (16GB) and the lack of these features, available only in Enterprise: DirectQuery, partitions, and perspectives. The limit of 24 cores for Standard is fundamentally a non-issue, because with the right single-socket hardware you are unlikely to exceed that number of cores (and avoiding NUMA is still better for Tabular, at least in the current release of 2016). You can find more details in this detailed matrix of the features available by edition of SQL Server 2016.

    If you have a Tabular model in SSAS 2012 or SSAS 2014 today, you might want to upgrade immediately because of the big performance improvements available in SSAS 2016 (almost any data model and report should benefit). However, what are the options to upgrade? It depends on your current license and whether you have a software assurance (SA) in place or not.

    • SQL 2012/14 Enterprise with SA: you can upgrade today; you should already be covered.
    • SQL 2012/14 Business Intelligence with SA: for the duration of your current SA contract, you can upgrade to SSAS 2016 using a Server+CAL licensing model that is not otherwise available. At the end of your SA, you will have to renew under the current SQL Server Enterprise licensing terms. You can also acquire new licenses (of Business Intelligence, upgraded in this way) up to 25% of the number you had on May 1, 2016.
    • SQL 2012/14 Enterprise or Business Intelligence without SA: you have to buy a license for SQL 2016. You might consider Standard for SSAS Tabular if your model does not need the features available in Enterprise; otherwise you should get an Enterprise license.

    Please note this is a summary I created to recap the current situation. I suggest you consider all the details and terms explained in this PDF. You can also visit this Microsoft website with all the information about SQL Server licensing.

    Finally, very welcome news is that the Microsoft SQL Server 2016 Developer edition is now free. It had a minimal cost in the past, but now it will be much easier to install development environments even in companies with a very long and bureaucratic procurement process, as well as for all the consultants who want to study the new environment on their workstations without using a trial version that expires in 6 months.

  • New formula to compute new customers in #dax

    Two years ago, Alberto Ferrari and I spent time discussing techniques to calculate new and returning customers, and we wrote up the results of our study in the New and Returning Customers article on the DAX Patterns website and in the book.

    During one of the Mastering DAX courses, a student presented a better solution, and Alberto published an article describing this approach, which is much faster than the ones we used before. We also worked to backport this algorithm to the "legacy" DAX, without using the new features available only in Power BI Desktop, Excel 2016, and SSAS 2016 (we call the new version "DAX 2015" to quickly identify it). The performance is equally amazing, and you can read the full story (and the DAX code) in the Computing New Customers in DAX article on SQLBI.
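    For context, one classic formulation of the new customers measure counts the customers in the current period that have no purchases in any earlier period. This is a generic sketch with illustrative table and column names, not the optimized formula from the article:

    [New Customers] :=
    COUNTROWS (
        FILTER (
            VALUES ( Sales[CustomerKey] ),
            -- a customer is "new" if there are no sales rows before the current period
            CALCULATE (
                COUNTROWS ( Sales ),
                FILTER (
                    ALL ( 'Date' ),
                    'Date'[Date] < MIN ( 'Date'[Date] )
                )
            ) = 0
        )
    )

    The inner CALCULATE scans the whole Sales history for each visible customer, which is exactly the kind of work the faster algorithm described in the article avoids.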

    In the future we will also update the DAX Patterns website, but the idea is to wait for a larger effort in which we will update the existing patterns using the new syntax available in DAX 2015.

  • Leverage Analyze in Excel for your Power Pivot model

    Microsoft recently released the Analyze in Excel feature, which allows you to connect an Excel PivotTable to a data model published on Power BI. I think (and hope) that Microsoft will enhance this feature to remove its current limits; for example, it does not allow you to connect to a live model based on an on-premises SSAS Tabular server through the Power BI Enterprise gateway. However, this feature is also very interesting for any Power Pivot user.

    When you have a data model in Power Pivot, you have to create all the reports within the same Excel workbook. In the end, you have a single file containing everything: the data model, the data, and the reports. This is fine at the beginning, but as soon as you create more complex data models and increase the number of reports (in a word, when the model grows) you face a few problems. Any edit operation on a Power Pivot model could require time to refresh PivotTables and measures, so the editing experience becomes slower. You cannot separate the data from the reports into two different files to split maintenance responsibility. If you have created several reports based on the same Power Pivot data model, you already know this story. It would be nice to split the data model (and the data) from the reports. Now you can, using Power BI and the Analyze in Excel feature. Power BI is your best friend here.

    This is the list of the operations you have to do:

    1. Download and install Power BI Desktop. You have to be a local administrator of your machine to do this, but it is the only step requiring such a permission.
    2. Open Power BI Desktop.
    3. Import the Power Pivot data model into Power BI Desktop using File / Import / Excel Workbook Contents, and then schedule a refresh if you need one.
    4. Publish the Power BI Desktop data model to the Power BI service.
    5. Open the data model from Excel using the Analyze in Excel feature of the Power BI service.
    6. Create the same reports you had in the original Excel file using new PivotTables.

    In the end, you will have an Excel file connected to an external data model, which can refresh automatically on a schedule you control. Before Power BI, you had to use Power Pivot for SharePoint to do that.

    You might ask why I am not suggesting publishing the Power Pivot data model straight to the Power BI service, without creating a Power BI Desktop file in between. The reason is that it is much simpler to work with two separate files, and Power BI Desktop provides the ability to use bidirectional filters in relationships, a feature that is not available in Power Pivot for Excel. However, this approach requires you to plan in advance the moment you “detach” the data model from Excel if you have defined synonyms for Q&A.

    At the moment of writing, you cannot modify synonyms in Power BI, neither in Power BI Desktop nor in the Power BI service. However, if you created synonyms in Excel, they are kept in the data model in Power BI Desktop, even if you cannot modify them (for now, I hope!). If you want to modify the synonyms for Q&A in the Excel file, you have to repeat the cycle and import the Power Pivot data model into Power BI again, losing any changes you might have made before. I am not saying this way of working is for everyone, but you might be interested if you are willing to pay the maintenance cost of a solution that already provides the best of both worlds: define synonyms for Q&A in Power Pivot, and create relationships with bidirectional filters in Power BI Desktop. Once editing synonyms becomes possible in Power BI Desktop and/or the Power BI service, this article will immediately become obsolete.

    For now, consider that changing the synonyms in Power Pivot requires you to overwrite the changes you applied to the data model in Power BI Desktop. Thus, you might want to apply any change to the data model in Power Pivot (for example, adding or removing columns and measures) instead of in Power BI. You might apply only the bidirectional filters in Power BI, but you will have to repeat that every time you import the data model into Power BI Desktop from Excel again. And you would lose any report designed in Power BI Desktop by doing that, so you should edit reports in the Power BI service only. Editing synonyms in Power BI will definitely streamline the process, once it becomes possible.

    One feature I would like for this scenario is the ability to automatically change all the connections of the PivotTables based on a Power Pivot data model, replacing them with the corresponding connection string in the ODC file returned by Power BI when clicking Analyze in Excel. This is also a long-awaited feature for those who want to publish a Power Pivot data model to an on-premises Analysis Services Tabular server: it is not easy to manipulate the file so that you can refactor the connection string in all the queries of the dataset. I am sure this would improve the productivity of Excel users facing this issue, and it would increase the number of users of the Power BI service.

    Conclusion

    You can use synonyms in Q&A today, but you have to create and edit the model in Power Pivot, exporting it to Power BI only to edit relationships using types not available in Power Pivot (such as the bidirectional filter).

  • A warm welcome to Power BI Embedded #powerbi

    I wrote this blog post with Alberto Ferrari.

    A few days ago, during the Build conference, Microsoft announced the availability, although still in preview, of Power BI Embedded. In a few words, Power BI Embedded lets you embed Power BI reports in your application, taking advantage of the tools in Power BI to enrich the analytical power of your application. Users do not have to authenticate with the Power BI service; in fact, they do not even need a Power BI account. Your application performs the necessary steps to authenticate a user and then uses App Tokens to request report rendering. App Tokens are generated by the Power BI service when your application requests them, providing the necessary keys associated with your Azure subscription.

    Power BI Embedded is priced by renders. Currently, the price is 2.50 USD per 1,000 renders, and the first 1,000 renders in a month are free. A render is the production of a single visual in a report, so a single report requires one or more renders (a dense report will be more expensive).

    The first step in using Power BI Embedded is to activate the service in Azure by creating a Power BI Workspace Collection. A workspace collection, as its name implies, is a set of workspaces. A workspace can contain datasets, reports, and dashboards, just as with the Power BI workspaces you use today when connected to the Power BI service.

    You can add items to a Power BI Embedded workspace by uploading a PBIX file generated with Power BI Desktop. Connection strings, the data model, measures and calculated columns, visual interactions, and all other functionalities of Power BI are uploaded along with the model and available for your reports. This is, at least, the experience during the preview. As soon as the product reaches general availability, the development experience might be different and, as of today, there is no information about how it will work when released (hopefully you will be able to automate the creation of the model and its data refresh). Nevertheless, in this early phase, the option of using Power BI Desktop for development is awesome, as it leverages a well-established technology to author the reports. It does mean that, as of today, you cannot add a dashboard to a workspace collection, because Power BI Desktop does not have a dashboard authoring feature; it stops at reports. Another limitation during the preview is that you cannot refresh data unless you upload a new version of the PBIX file. If you need to work with live data, today the only option is to leverage DirectQuery connections, along with all the limitations that come with them.

    Once the workspace collection is in place, your application connects to the service and generates an App Token by providing the needed key (which you have to store in the application itself). Once the application has an App Token, it can interact with the user as needed and, when it is time to produce a dashboard, it provides the App Token to the service again, requesting the rendering of one or more reports. Power BI takes care of the report rendering and visual interactions.

    In order for the entire reporting system to work, the Power BI service needs to be able to connect to the data source. This is very easy if the data source is already in the cloud; it requires some additional refresh steps if the data source is on premises and not connected to the service. In that case, you have to publish an updated version of the PBIX file to the Power BI cloud service using the Power BI Embedded API.

    So far, so good: it looks like Microsoft created yet another library to show dashboards inside an app. Why do we believe this is huge, and not just yet another reporting app?

    • It uses the very same tools your users are probably already using in Power BI to gather insights.
    • It does not have a starting price: you pay for what you use. The more customers use your application, the more you pay. With this pay-per-use approach, you can limit the initial investment required to build a real analytical system into your application.
    • It is not a simple reporting tool. It provides a modeling tool where you build a data model through a Power BI Desktop file, along with measures, ETL steps with Power Query, calculated columns, and tables. Thus, in order to provide analytics in your application, you do not need to focus on how to build a given report. Instead, you focus on how to build an analytical model on top of which you will later build reports.

    This latter change is, in our opinion, the most important of all. A simple shift in the way you think about reporting inside an application might open up a whole set of new features and help democratize Business Intelligence. For example, it is easy to imagine standard reporting provided inside the application and, for advanced users who require more power, ISVs providing Power BI Desktop files that can be customized and later deployed on Power BI to perform custom analytics.

    It is important to note that this preview is not a complete solution for every ISV. Today there are no APIs to programmatically create a PBIX file. This is a strong limitation when automating the creation of a custom data model based on parameters defined by the ISV's application. An API to create a PBIX file, or to manipulate a data model published on the server, would be an important advancement toward a fully functional ecosystem for ISVs. We hope this is only the first step in that direction.

  • How to fix #dax time intelligence functions in #powerbi when date table uses a surrogate key

    I wrote an article about Time Intelligence in Power BI Desktop to explain how to fix measures using DAX time intelligence functions when you have a data model where the relationship uses surrogate keys. The short description is that all the time intelligence functions in DAX make two assumptions: you have a "full" date table (all days of each year you want to handle must be present), and you use the "Mark as Date Table" setting to identify the date column in the date table. However, the latter is not possible in Power BI Desktop, because such a setting is not available yet (as of the end of February 2016).

    There are a number of possible workarounds while we wait for such a feature in a future version of Power BI Desktop. The problem has a limited impact because, when you use the date column itself in the relationship, the "Mark as Date Table" setting is not strictly necessary. However, it is a good idea to understand why this behavior exists and how the setting affects the DAX engine by reading the article, even if you do not have this problem today.
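    To illustrate the kind of workaround involved (this is a generic sketch, not the exact code from the article, and the table and column names are hypothetical), a year-to-date measure can avoid the built-in DATESYTD function by filtering the date table explicitly, which does not rely on "Mark as Date Table":

    [Sales Amount YTD] :=
    CALCULATE (
        [Sales Amount],
        FILTER (
            ALL ( 'Date' ),
            -- same year as the last visible date, up to and including that date
            'Date'[Year] = MAX ( 'Date'[Year] )
                && 'Date'[Date] <= MAX ( 'Date'[Date] )
        )
    )

    The explicit FILTER replaces the automatic ALL over the date table that the time intelligence functions would otherwise apply through the "Mark as Date Table" mechanism.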

  • Possible #powerpivot issue removing Excel 2016 going back to Excel 2013

    Today I observed an issue on a notebook where Excel 2016 had been installed and then removed, going back to Excel 2013.

    After this operation, workbooks with a Power Pivot data model still worked correctly when navigating data in PivotTables, but any operation in the Power Pivot window (such as adding columns or tables, or editing measures) failed. The error was TYPE_E_CANTLOADLIBRARY, and it was caused by a type library not removed when Excel 2016 was uninstalled.

    I solved the problem by removing the following key from the registry:

    [HKEY_CLASSES_ROOT\TypeLib\{00020813-0000-0000-C000-000000000046}\1.9]

    (WARNING: do not touch the Windows registry if you do not know what you are doing. Use the following link to get more detailed instructions, and always make a backup first.)

    I found this article that describes the problem as a possible issue in all versions of Excel!

  • New features in Power Pivot Utils v 1.05: list unused columns #powerpivot #dax

    The latest version of Power Pivot Utils, v1.05, has a new feature that deserves an immediate upgrade: you can list unused columns!

    Power Pivot Utils Ribbon

    Thanks to Idan Cohen and Amir Hefetz for their contribution to the macros developed by Bertrand d'ARBONNEAU!

  • GROUPBY vs SUMMARIZE in #dax #powerbi #powerpivot

    If you are using Power BI Desktop or Power Pivot in Excel 2016, you should learn when and how you can use GROUPBY instead of SUMMARIZE. The SUMMARIZE function is very powerful and internally very complex, so it is easy to find scenarios where you get unexpected results or performance issues. The new GROUPBY function (also available in SSAS Tabular 2016) can be a better choice in a number of cases, even if it is not equivalent: for example, it does not “enforce” a join as SUMMARIZE does (see here for more details).

    I recently wrote an article about a situation where GROUPBY is absolutely the best choice: nested grouping. The example is pretty simple: you want to SUMMARIZE the result of another SUMMARIZE… well, that is not possible, but you can do it using GROUPBY.
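    A minimal sketch of the nested grouping scenario follows (the table and column names are hypothetical, not taken from the article). The outer GROUPBY consumes the result of the inner one, aggregating through CURRENTGROUP:

    EVALUATE
    GROUPBY (
        GROUPBY (
            Sales,
            Sales[Country],
            Sales[Product],
            "Amount", SUMX ( CURRENTGROUP (), Sales[Quantity] * Sales[Price] )
        ),
        Sales[Country],
        "Best Product Amount", MAXX ( CURRENTGROUP (), [Amount] )
    )

    The inner GROUPBY produces one row per country and product; the outer one groups that intermediate table by country, something SUMMARIZE cannot do over the result of another SUMMARIZE.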

    Once you get used to GROUPBY, I also suggest you test your skills with the DAX Puzzle about GROUPBY we published a few weeks ago. And if you already solved it, try the new puzzle published less than two weeks ago about “last date”: not related to GROUPBY behavior, but still good food for thought!

  • Conferences and Training in Australia Feb/Mar 2016 #dax #sqlsaturday

    In a couple of weeks, I will travel to Australia for 3 weeks, visiting Melbourne and Sydney for a number of events and training sessions. This is the list of topics in public events you can attend (with links to the registration pages; remember that SQL Saturday conferences are free events, but you still have to register first!):

    UPDATE 2016-02-08 - added a session on Feb 16 in Melbourne!

    You can follow the links to see more detailed info about each event. Please note that the early bird discounted price for the Mastering DAX and Optimizing DAX workshops will expire at the end of this week (Feb 6). If you are in Australia (or close enough) and you do not want to miss these unique dates for an immersive DAX training, hurry up and get the discounted price!

    I really look forward to coming back to Australia!

  • Create static tables in Power BI, Power Pivot, and Analysis Services #powerbi #powerpivot #ssas #tabular

    I recently wrote an article about how to Create Static Tables in DAX Using the DATATABLE Function. This new DAX feature (the DATATABLE function) will likely be used in the upcoming Analysis Services Tabular 2016, but it is used neither in Power BI nor in Power Pivot to create static tables. For this reason, I think it could be useful to do a quick recap of all the methods available if you need a table with fixed, static data in your data model.
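    For reference, a minimal DATATABLE call looks like the following sketch; the table name, column names, and values are purely illustrative:

    StaticColors =
    DATATABLE (
        "Color", STRING,
        "Code", INTEGER,
        {
            { "Red", 1 },
            { "Green", 2 },
            { "Blue", 3 }
        }
    )

    Each column is declared as a name/type pair, and the rows follow as a list of value lists in the same column order.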

    • Power Pivot
      • Linked Tables: you can create a table in Excel with static data and import it into a data model using the Linked Table feature. However, this is technically not a “static” table: it can be refreshed with new data, and the table in Excel can be dynamic, as shown in the article describing “Linkback Tables in Power Pivot for Excel 2013”.
      • Paste data from Clipboard: if you copy a table to the clipboard and paste it into Power Pivot, the data model gets a corresponding table that also contains the data definition in the model metadata. You cannot modify the content later, unless you use the Paste Replace feature of Power Pivot. Technically, the content is included in the XML defining the data model, but you do not have any access to it from Power Pivot.
    • Power BI Desktop:
      • Enter Data: the Enter Data feature in Power BI Desktop allows you to enter data manually in a table; you can also paste data from the clipboard using this user interface, which lets you modify the content of the static table later, after the paste operation. The content is stored in a string using a compressed and encoded JSON format. This string is extracted in an M transformation using nested calls of the form Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText(…))))
      • Calculated Table using DATATABLE function: you can also create a calculated table using DATATABLE, as explained in the article I previously mentioned.
    • Analysis Services 2012/2014/2016 (until model version 1103):
      • Paste data from Clipboard: similar to what you do in Excel Power Pivot. The only difference is that you can modify the static content defined in the data model by manipulating the .BIM file with a text editor. The content of the table is defined in an XML schema.
      • Linked Tables imported from an Excel data model: they are defined in the data model exactly as you would define a static table using the Paste Data from Clipboard feature.
    • Analysis Services 2016 (from model version 1200):
      • Paste data from Clipboard: this feature is not yet implemented (as of CTP 3.2), and you get the error message “TOM does not yet support pushed tables” if you try to use it. Because model version 1200 does not store data in XML, I suppose they will implement static tables as calculated tables using the DATATABLE function, just as you might already do manually in Power BI Desktop. I will update this blog post when a new CTP or the RTM supports this feature.

    I hope this recap will help you make a decision about which technique to use in this scenario.
