How-to: Run Power BI reports on-premise

BIGDATAPUMP is a cloud-born organization. We design and implement solutions for our customers using a hybrid cloud model when part of the solution needs to run on-premises.

Recently, I was asked to run a Power BI demo in an on-premises environment. As many of you are probably aware, Power BI isn't actually available on-premises yet. SQL Server Reporting Services (SSRS) will gain the capability of hosting Power BI reports shortly, but in the meantime I had to find a workaround. So I started off as most people working in the field probably would: by checking the facts.


It turned out that we still lacked software we could simply download and install on our own server. However, I found that there is a Technical Preview of Power BI reports in SQL Server Reporting Services, available only on Azure. I read through the page to figure out how it was actually implemented: "This Technical Preview is a pre-configured Virtual Machine in the Azure Marketplace. The VM includes everything you need, even sample reports and data, making it quick and easy to give it a try."

The next minute I was spinning up the VM from the Azure Marketplace to have a closer look. I logged into the VM, verified that everything necessary was there, and shut it down. On the Azure Portal, I navigated to the vhds container of the storage account I had created when provisioning the VM and found the two blobs shown in the following picture:

Based on the naming as well as the size of the VHD files, I figured that osdiskforwindowssimple.vhd was the one I needed. I also realized that I needed a proper tool to speed up the download; I wasn't going to spend all night waiting for 127 GB, so I decided to use AzCopy, as usual, to do the job. AzCopy is a command-line tool that I normally use to upload large files to Azure, but it can of course be used for downloads as well. The following command did the job:

.\AzCopy.exe /NC:20 /Source: /Dest:C:\SSRS /SourceKey:WDP6ty93th43g3p98g3po8iuhqiugh4iquhhiIHHhisdjt439g3hgi4go43gjoipslfdsifuh7f2iofWGFOoCA== /Pattern:"osdiskforwindowssimple.vhd"

The /NC:20 option tells AzCopy to open 20 concurrent connections, which lets me use my entire bandwidth and thus minimizes the download time. The rest of the command is probably pretty self-explanatory. (Please note that the access key in the example above is not the real key for the oskarissrs account. In fact, I have deleted the whole storage account, so don't waste your time trying to access it. :) If you haven't used AzCopy before, you need to add its path to your environment variables or run AzCopy.exe from the directory you installed it into (the default for a 32-bit installation is C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy).
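If you'd rather not add the path permanently, you can extend the PATH for the current session only; a minimal sketch, assuming the default 32-bit install location mentioned above:

```powershell
# Add AzCopy's install directory to PATH for this PowerShell session only
$env:Path += ";C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"

# Verify that AzCopy now resolves from any directory
AzCopy /?
```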

After downloading the Virtual Hard Disk (VHD), I shrank the file with Hyper-V from the original 127 GB down to 34 GB to save some disk space and make it more portable. To achieve this, I opened the "Action" menu in Hyper-V and chose "Edit Disk" from the drop-down list. I located the VHD on my hard drive and used the "Convert" action to convert it into the dynamically expanding VHDX format. After that, I proceeded to create a new VM with the new VHDX file. Azure currently supports only Gen1 VM images, so the VM I created in Hyper-V was a Generation 1 virtual machine. You may want to allocate as much RAM to your VM as possible, because there will be a SQL Server 2016 Analysis Services Tabular model installed and running (and that is an in-memory database).
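The same convert-and-create steps can also be scripted with the Hyper-V PowerShell module instead of clicking through Hyper-V Manager; a minimal sketch, where the file paths, VM name, and memory size are placeholders for illustration:

```powershell
# Convert the downloaded VHD into a dynamically expanding VHDX
# (requires the Hyper-V PowerShell module on a machine with the Hyper-V role)
Convert-VHD -Path "C:\SSRS\osdiskforwindowssimple.vhd" `
            -DestinationPath "C:\SSRS\ssrs-preview.vhdx" `
            -VHDType Dynamic

# Create a Generation 1 VM that boots from the converted disk;
# allocate as much startup memory as you can spare for the SSAS Tabular instance
New-VM -Name "SSRS-PowerBI-Preview" `
       -Generation 1 `
       -MemoryStartupBytes 8GB `
       -VHDPath "C:\SSRS\ssrs-preview.vhdx"
```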

After logging in to my local VM, I noticed a few more things:

  1. The Power BI Desktop version installed was not the latest one, so I couldn't just use one of my own PBIX files to deploy a new report quickly. On the other hand, the latest version of Power BI Desktop does not include the functionality for saving a report to SSRS, so I couldn't simply update Power BI Desktop and then deploy my sample reports either.
  2. Only the DirectQuery functionality of Power BI Desktop was currently supported.
  3. The environment contained the Adventure Works sample database, so I could save some time and use that data for my demo.
  4. A few sample reports were also included.

I decided to create a data model for SSAS Tabular using Visual Studio (the VM already had Visual Studio 2015 and SQL Server Data Tools installed) and to deploy it to the SSAS instance on the VM. I also created reports in Power BI Desktop using my SSAS Tabular model as the data source and deployed them successfully to SSRS. I played around with the sample reports as well. Here's a screenshot of my environment:

One last useful tip to share: you can use a URL access parameter with Power BI reports as well, to hide the unnecessary SQL Server Reporting Services toolbar. Adding the rc:Toolbar=false parameter to the URL will do the trick:
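For example, a report URL could look something like this (the server name and report path below are placeholders; URL access parameters go through the ReportServer endpoint):

```text
http://myserver/ReportServer?/Sales/MyPowerBIReport&rc:Toolbar=false
```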

All in all, it has been nice to gain hands-on experience with the upcoming capabilities. Running Power BI reports on-premises is functionality that our customers are constantly asking for. It has been nice to answer that need lately by showing off some preview demos, and the feedback from customers has been great. I thought I'd share this, as I'm sure I have a lot of colleagues with similar requests coming in.