THE SQL Server Blog Spot on the Web


John Paul Cook

  • FLAS

    The information technology field is full of obfuscation. There are many three letter acronyms, including the meta three letter acronym TLA. Now in the era of Big Data and Cloud, we have four letter acronyms, what I call FLAS, two of which are HTAP and ACID. People have asked me what these are, so here is an explanation.

    ACID has a new meaning. No longer is it just Atomicity, Consistency, Isolation, and Durability. It can also mean Algorithms, Cloud, IoT, and Data. HTAP is Hybrid Transactional Analytic Processing. Transactional processing systems are by default not optimized for analytical processing. Transactional processing works well when data is stored in indexed rows. That type of storage does not work well for analytic processing, which involves the aggregation of columns of data. By arranging and storing data in columns, aggregations can be done much faster. A tried and true technique for providing fast analytical processing and fast transactional processing is to have ETL or ELT processes to periodically build columnar OLAP storage from row oriented OLTP storage. The disadvantage is the latency introduced.
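The row versus column trade-off can be sketched in a few lines of Python; the tiny dataset and the two storage layouts below are invented purely for illustration:

```python
# Row-oriented storage (OLTP style): each record is kept together,
# which is ideal for looking up or updating one transaction at a time.
row_store = [
    {"id": 1, "region": "East", "amount": 100},
    {"id": 2, "region": "West", "amount": 250},
    {"id": 3, "region": "East", "amount": 175},
]

# Column-oriented storage (OLAP style): each column is kept together,
# so an aggregation reads only the one column it needs.
column_store = {
    "id": [1, 2, 3],
    "region": ["East", "West", "East"],
    "amount": [100, 250, 175],
}

# Aggregating over rows must visit every record and pick out one field...
total_from_rows = sum(record["amount"] for record in row_store)

# ...while the columnar layout hands us the amounts directly.
total_from_columns = sum(column_store["amount"])

assert total_from_rows == total_from_columns == 525
```

Both layouts hold the same data; at scale, reading one contiguous column is what makes columnar aggregation so much faster.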

    HTAP solutions provide both OLAP and OLTP storage allowing analytical processing on current transactional data without slowing down the transactional processing.

    Of course I’m writing about FLAS because my job title as of this writing is DPSA, Data Platform Solution Architect.

  • Role Based Access Control in Azure

    From talking to customers and people new to Azure SQL Database, it is apparent that many people are unaware of how Role Based Access Control (RBAC) in Azure affects Azure SQL Database. Last week I gave a presentation on this at SQL Saturday and discussed it with customers at the SQL PASS Summit. This post is a brief summary and provides links to the official documentation.


    RBAC - Role Based Access Control

    IAM - Identity and Access Management

    CLI - Command Line Interface

TDE - Transparent Data Encryption

NOTE: Actions described as being performed in the Azure portal can also be accomplished via scripting with PowerShell, the Azure CLI, or the Azure REST API. Actions described as being accomplished in SSMS can also be scripted with T-SQL.

Get started with access management in the Azure portal is where I suggest you start reading. Keep in mind that both Azure SQL Database and Azure SQL Data Warehouse have their Azure roles at the Azure database server level. An Azure database server is really a logical construct, a container of databases that exists for administrative purposes.


    Figure 1. Adding a role in the Azure portal.

Pay particular attention to Owner vs. Contributor. An Owner can grant permissions to other users; if you put a user in the Contributor role, the user cannot. That’s the difference between the two roles.

    Although you can administer Azure roles through the Azure portal, eventually for production use you’ll want to script everything so you can have repeatable processes. You’ll need Azure PowerShell (Windows users) or the Azure CLI (cross platform and Docker). You can also use the REST API. See Managing Role-Based Access Control with the REST API, which also includes links to PowerShell and Azure CLI instructions.
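As a sketch of what the REST route looks like, the Python below builds (but does not send) the PUT request for a role assignment scoped to an Azure SQL Database logical server. The function, its parameter names, and the api-version value are my own illustrative assumptions; check the linked documentation for the authoritative request format:

```python
import json
import uuid

def build_role_assignment_request(subscription_id, resource_group, server_name,
                                  role_definition_id, principal_id):
    """Assemble the URL and body of an RBAC role assignment PUT request,
    scoped to an Azure SQL Database logical server. Nothing is sent here;
    this only shows the shape of the call."""
    scope = (f"/subscriptions/{subscription_id}"
             f"/resourceGroups/{resource_group}"
             f"/providers/Microsoft.Sql/servers/{server_name}")
    assignment_name = str(uuid.uuid4())  # each assignment is identified by a new GUID
    url = (f"https://management.azure.com{scope}"
           f"/providers/Microsoft.Authorization/roleAssignments/{assignment_name}"
           f"?api-version=2015-07-01")  # api-version shown is an assumption
    body = json.dumps({
        "properties": {
            "roleDefinitionId": (f"{scope}/providers/Microsoft.Authorization"
                                 f"/roleDefinitions/{role_definition_id}"),
            "principalId": principal_id,
        }
    })
    return url, body
```

In practice you would send this with an Authorization bearer token obtained from Azure Active Directory; PowerShell and the Azure CLI wrap all of this for you.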

    Use role assignments to manage access to your Azure subscription resources

    Documentation on Azure roles can be found at RBAC: Built-in roles. Notice there are three roles that begin with SQL:

    · SQL DB Contributor

    · SQL Security Manager

    · SQL Server Contributor


    Figure 2. SQL roles in Azure portal.

    It’s important to understand that these are Azure roles. They are limited in scope to actions that can be performed in the Azure portal. The Storage Account Contributor role in the Azure portal is relevant to Azure SQL Database if you have auditing enabled.

The Azure portal is for administering Azure resources. SQL Server Management Studio (SSMS) or equivalent T-SQL scripts are used for administering the contents of Azure SQL Database. There is very little overlap between what can be done in the Azure portal and in SSMS. TDE can be enabled in either; see Transparent Data Encryption with Azure SQL Database. Dynamic Data Masking can be enabled in either, although it is more flexible in SSMS. You can create an Azure SQL Database from either the Azure portal or SSMS.

  • Big Data V’s

    Originally there were three V’s of Big Data. Doug Laney introduced Volume, Velocity, and Variety as the three V’s of data management in 2001 when he worked for the Meta Group, which was purchased by Gartner in 2005. He did not use the term Big Data in his 2001 publication.

IBM uses the same three V’s and added a fourth V of Veracity no later than 2013. The author Bernard Marr added a fifth V of Value. He describes his five V’s in these slides. Rob Livingstone added Validity and Visibility. In 2013, Doug Laney made the distinction between definitional and aspirational qualities of data. According to Doug, only Volume, Velocity, and Variety are definitional. All other V words are aspirational.

    Mike Gualtieri of Forrester in 2012 asserted that Doug’s three V’s are not actionable. Mike offered what he calls a pragmatic definition of Store, Process, and Access.

    Neil Biehn when still at PROS (now at Siemens) stated that the fourth and fifth V’s are Viability and Value.

    If you are willing to have more than three V’s, Visualization is an obvious V to add and multiple authors have written about it. It’s hard to make sense of Big Data and difficult to derive value or take action if you can’t see what the data is telling you.

    I think people should use as few or as many V’s as they find helpful. I will not go so far as to formally propose another V, but I will state that Vexing sometimes fits Big Data. It can be vexing because of the technical issues of Doug’s three V’s and it can be vexing to interpret and act upon.

  • Using SSMS in Customer Presentations

    When using SSMS with a projector or very large screen HDTV, there are several customizations to make things easier to read for the people in the back of the room. First let’s look at a screen capture of SSMS without any customizations.


    Figure 1. SSMS with default settings.

    Do you really need the Properties pane during a presentation? Probably not. You can close that out to obtain more SSMS real estate.


    Figure 2. Properties pane closed. Go to Tools and select Options.

    To increase the size of most fonts in SSMS including the Object Explorer, within Options, open Environment, select Fonts and Colors, then under Show settings for, select Environment Font. Pick a font type and size. The changes will not fully take effect until you close and reopen SSMS.


    Figure 3. Under Options, open Environment and select Fonts and Colors. Change the font settings for Environment Font.

    To make the grid results visible to the people in the back of the room, you can change the font settings for Grid Results.


    Figure 4. Change the font for the Grid Results.

    Notice how much larger the fonts are except for the query text.


    Figure 5. The grid results are larger, but the query text is still small.

    To increase the size of query text, you can use Fonts and Colors to change settings for the Text Editor. I prefer a different approach. You can change the size of query text by using the dropdown menu at the bottom of the query pane.


    Figure 6. Use the dropdown at the bottom of the query pane to increase the query font size.

    Another option for making SSMS more readable is to use the free Microsoft tool ZoomIt.

  • BI Beginner: Using Power BI Desktop Instead of Excel for Visualizing Big Data

Excel is a great tool, but Power BI offers more in the realm of visualization. The 64-bit version of Excel is better at processing large files than the 32-bit version, but even it has limitations. Power BI Desktop (BI as in Business Intelligence) is a free alternative to Excel for working with large files and Big Data. It is possible to enable the Power Pivot add-in for Excel and overcome the 1,048,576 row limit, and you can extend Excel’s visualization capabilities with Power Map and Power Query, but Power BI can do more. Power BI Desktop has powerful, easy to use visualization features of interest to anyone who needs to analyze and visualize data. As with many software packages, it takes longer to read the instructions than it does to actually do the work. Once you know how, this exercise can be completed in just a few minutes. You don’t have to be a geek or a power user to succeed with Power BI. When you’re done creating your visualizations, you can share them with your colleagues over the web.
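If you want to know ahead of time whether a csv file will fit in Excel, you can count its rows with a few lines of Python. This is just a convenience sketch; the helper name is mine, and only the standard library is used:

```python
import csv

EXCEL_ROW_LIMIT = 1_048_576  # maximum number of rows in an Excel worksheet

def exceeds_excel_limit(path):
    """Count the data rows in a csv file and report whether the file,
    including its header row, is too large to open fully in Excel."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        data_rows = sum(1 for _ in reader)
    return data_rows, data_rows + 1 > EXCEL_ROW_LIMIT
```

For a file like DeathRecords.csv, this tells you immediately that Excel would truncate it and Power BI Desktop is the better choice.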


    Figure 1. File is too large for Excel.

    Use Get Data to load Excel and csv files into Power BI Desktop.


    Figure 2. Use Get Data to load an Excel or csv file into Power BI Desktop. I selected CSV because that is the type of file I needed to analyze.

The file that had too many rows for Excel was obtained from Kaggle, which is a great source of large, real world, free datasets. I downloaded the Centers for Disease Control and Prevention (CDC) mortality dataset from Kaggle. The DeathRecords.csv file has 2,631,171 rows and took about a minute to load on my system, which has a 4 GHz processor, 32 GB of RAM, and an SSD.


    Figure 3. Select the file of interest.


    Figure 4. Click Load to load the selected file.


    Figure 5. DeathRecords.csv loaded into Power BI Desktop.

Now it is time to do some visualizations. We’ll start with a pie chart showing the ages at which people died. There is more than one way to accomplish this task. Since we know we want a pie chart, go to Visualizations and select the pie chart icon. Next, select Age as shown. By default, it isn’t what we want: it is a sum, and we need a count. After changing to a count, we need to set the Legend to Age.


    Figure 6. Select pie chart under Visualizations.


    Figure 7. Select Age.


    Figure 8. Go to the dropdown to the right of Age under Values and select Count.


    Figure 9. Under Fields, place the mouse over Age and hold the left mouse button down to drag it under Legend.


    Figure 10. Release the left mouse button to drop Age under Legend.


    Figure 11. Completed pie chart.

    The next step is to add a stacked column chart visualization.


    Figure 12. Adding a stacked column chart.


    Figure 13. Underneath Fields, check MaritalStatus to add it to the stacked column chart.


    Figure 14. Drag MaritalStatus underneath Value. It correctly defaults to a count.


    Figure 15. You can collapse Visualizations and Fields. You can also resize your two visualizations and reposition them.

At first it appears that these two visualizations are independent of each other, but they are interconnected. Move the mouse and place it over the far right column, the column for the number of widowed people who died. Click once. You will see that what you highlighted, the widowed data, is superimposed on the pie chart. The result is similar to the pie chart popularized by nurse and statistician Florence Nightingale. To undo the selection, click on the W column again.


    Figure 16. Completed pie chart and stacked column chart.


    Figure 17. Click the W for widowed column.


    Figure 18. After clicking the widowed column, the pie chart is updated to show how as people age, they are more likely to be widowed at the time of death.

    You can save your model and come back and work with it later. It is saved with a file extension of pbix. You can also publish the model to the Microsoft cloud and grant access to the model to your colleagues. It’s important to understand that you’re not sharing a static view with your colleagues. You are sharing the data and the dynamic visualization features. If the underlying data is updated, your colleagues will be able to see the visualization change after doing a refresh to get the latest data.


    Figure 19. Using the published version of the Power BI model from a browser. The M for married column is selected. The pie chart now shows that people who die as younger adults are more likely to be married than those who die as elderly adults.

    Power BI can read all types of data sources, not just files. It natively supports reading files, folders of files, relational databases, Big Data stores, and more as you can see in the screen capture below.


    Figure 20. Datasources supported by Power BI Desktop. If you have a csv file, select csv.

NOTE: Long time readers of my blog have seen numerous references in my posts to making accessibility accommodations. Since my last post, I found this research titled Good Fonts for Dyslexia by Luz Rello and Ricardo Baeza-Yates indicating that italics decrease the readability of text for people with dyslexia. In the past, the captions underneath my screen captures were italicized. I will avoid the use of italics going forward. Also notice that I provided a hyperlink instead of an APA formatted citation. I think MLA and APA formatting are not up with the times. Strict adherence to these legacy formatting rules, which originated long before the digital age, causes people to waste time on pedantic matters. Placing hyperlinks to primary sources directly in the text is more in keeping with the digital millennium, providing both attribution and convenience. As always, my opinions are my own and not those of my employer or graduate school.

  • Use AutoCorrect in Word to Prevent SQL Sever

    Too often I see SQL Server misspelled as SQL Sever. You can easily prevent this by using AutoCorrect in Word.

    Start by clicking on File.


    Figure 1. Click on File in Word.

    Next, click on Options.


    Figure 2. Click on Options.

    On the Word Options dialog box, select Proofing and then click the AutoCorrect Options button.


    Figure 3. Click the AutoCorrect Options button.

    Enter sql sever under Replace and SQL Server under With. Click the Add button.


    Figure 4. Click the Add button after entering the text strings.

    You may also wish to add the following automatic correction to ensure proper casing of SQL Server. This allows you to type sql server without using the shift key but ending up with SQL Server as the result.


    Figure 5. Ensure the SQL Server is always properly cased.
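Under the hood, AutoCorrect amounts to a lookup table of replace-with pairs applied as you type. A minimal Python analogue of the two entries above (the function is mine, purely to illustrate the mechanism):

```python
# AutoCorrect-style replacement pairs, mirroring the entries added in Word.
REPLACEMENTS = {
    "sql sever": "SQL Server",   # fix the common misspelling
    "sql server": "SQL Server",  # fix the casing without using the shift key
}

def autocorrect(text):
    """Apply each replacement pair to the text."""
    for wrong, right in REPLACEMENTS.items():
        text = text.replace(wrong, right)
    return text
```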

  • SQL Server 2016 Temporal Queries Sample Code

Temporal tables in SQL Server 2016 and Azure SQL Database allow you to have historical changes to data maintained for you automatically. You don’t have to write custom code to maintain historical values. New SQL syntax allows you to query and retrieve the data as it was in the past, or you can simply retrieve the most recent value. I encourage you to take advantage of the official documentation.

    The code posted below will help you understand the nuances and subtleties of temporal SELECT statements. I created a simple table structure for you to practice with. What you need to focus on is understanding the boundary conditions such as when a query is inclusive of a datetime boundary or not. You also need to understand what date values would cause overlaps or gaps. Figure out how the various temporal expressions work with datetime boundary conditions.
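Before running the T-SQL, it may help to see the four temporal predicates written as plain comparisons. The Python sketch below (the sample rows and function names are invented for illustration) mirrors the documented semantics of FOR SYSTEM_TIME:

```python
from datetime import date

# Two validity periods in the style of the alphabet_history rows:
# (letter, sysstart, sysend).
rows = [
    ("A", date(2015, 1, 1), date(2015, 2, 1)),
    ("B", date(2015, 2, 1), date(2015, 3, 1)),
]

def as_of(rows, t):
    # AS OF t: sysstart <= t AND t < sysend (end boundary is exclusive)
    return [r for r in rows if r[1] <= t < r[2]]

def between(rows, start, end):
    # BETWEEN start AND end: sysstart <= end AND sysend > start
    return [r for r in rows if r[1] <= end and r[2] > start]

def from_to(rows, start, end):
    # FROM start TO end: sysstart < end AND sysend > start
    return [r for r in rows if r[1] < end and r[2] > start]

def contained_in(rows, start, end):
    # CONTAINED IN (start, end): sysstart >= start AND sysend <= end
    return [r for r in rows if r[1] >= start and r[2] <= end]

# At the shared boundary 2015-02-01, only B is current: A's end is exclusive.
assert [r[0] for r in as_of(rows, date(2015, 2, 1))] == ["B"]
```

Note that BETWEEN differs from FROM...TO only at the end boundary: BETWEEN also returns a row whose period starts exactly at the end datetime.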

    Copy the code below and paste it into SQL Server Management Studio connected to SQL Server 2016. Make sure you are connected to a user database instead of a system database. Pay attention to the comments in the code.

    -- It is recommended that you execute all of the code in each section,
    -- each section in order, one section at a time. Understand each
    -- section before proceeding to the next one. Pay attention to boundary
    -- conditions. It's important to understand what is included and what
    -- isn't.

    -- The approach taken in section 1 is definitely not a pattern for
    -- production code. The purpose of these examples is to teach you how
    -- to run temporal SELECT statements. How the data was faked and inserted
    -- does not represent best practices for production code. The code in
    -- section 1 sets things up for you to learn how to query in sections 2-7.
    -- Refer to the official documentation for examples of how to create
    -- temporal tables.

    ---------- BEGIN SECTION 1 ----------

    -- Several tricks are used to load explicit data values. Do not
    -- use this section as an example for production code. This section
    -- was done the way it was for the sole purpose of providing data for
    -- the subsequent sections.

    -- create the simplest possible table for temporal queries
    CREATE TABLE dbo.alphabet
    (
         letter   CHAR(1)   NOT NULL PRIMARY KEY CLUSTERED  -- must have a primary key
        ,sysstart DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
        ,sysend   DATETIME2 NOT NULL DEFAULT CONVERT(DATETIME2, '9999-12-31 23:59:59.9999999')
    );

    INSERT INTO dbo.alphabet
    (letter, sysstart)
    VALUES
    ('X', '1-JAN-2016');

    ALTER TABLE dbo.alphabet ADD PERIOD FOR SYSTEM_TIME (sysstart, sysend);

    CREATE TABLE dbo.alphabet_history
    (
         letter   CHAR(1)   NOT NULL
        ,sysstart DATETIME2 NOT NULL
        ,sysend   DATETIME2 NOT NULL
    );

    INSERT INTO dbo.alphabet_history
    (letter, sysstart, sysend)
    VALUES
    ('A', '2015-01-01', '2015-02-01')  -- the boundary conditions are determined by the temporal expressions
    ,('B', '2015-02-01', '2015-03-01')
    ,('C', '2015-03-01', '2015-04-01')
    ,('D', '2015-04-01', '2015-05-01')
    ,('E', '2015-05-01', '2015-05-31')  -- gap between May 31 and June 1
    ,('F', '2015-06-01', '2015-06-30')  -- another gap, what happens at June 30 00:00:00?
    ,('G', '2015-07-01', '2015-08-01')
    ,('H', '2015-08-01', '2015-09-01')
    ,('I', '2015-09-01', '2015-10-01')
    ,('J', '2015-10-01', '2015-11-01')
    ,('K', '2015-11-01', '2015-12-01')
    ,('L', '2015-12-01', '2016-01-01')

    ALTER TABLE dbo.alphabet
    SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.alphabet_history));

    -- look at the contents of the tables
    SELECT * FROM dbo.alphabet;  -- 1 row
    SELECT * FROM dbo.alphabet_history;  -- 12 rows
    ---------- END SECTION 1 ----------

    -- run temporal queries

    ---------- BEGIN SECTION 2 ----------

    -- use the ALL expression

    SELECT * FROM dbo.alphabet  -- 13 rows
    FOR SYSTEM_TIME ALL;

    SELECT * FROM dbo.alphabet  -- 13 rows
    FOR SYSTEM_TIME ALL
    ORDER BY sysstart;

    SELECT * FROM dbo.alphabet  -- 13 rows
    FOR SYSTEM_TIME ALL
    ORDER BY sysend;

    ---------- END SECTION 2 ----------

    ---------- BEGIN SECTION 3 ----------

    -- AS OF datetime
    --    means
    --    datetime value >= starting datetime
    --    and
    --    datetime value < ending datetime

    DECLARE @tempus DATETIME2 = CAST('2015-01-15' AS DATETIME2);
    SELECT *,'2015-01-15' FROM dbo.alphabet  -- data is actually in alphabet_history
    FOR SYSTEM_TIME AS OF @tempus;

    SET @tempus = CAST('2015-01-31' AS DATETIME2);
    SELECT *,'2015-01-31' FROM dbo.alphabet  -- data is actually in alphabet_history
    FOR SYSTEM_TIME AS OF @tempus;

    SET @tempus = CAST('2015-01-31 12:34:56' AS DATETIME2);
    SELECT *,'2015-01-31 12:34:56' FROM dbo.alphabet  -- data is actually in alphabet_history
    FOR SYSTEM_TIME AS OF @tempus;

    SET @tempus = CAST('2015-02-01' AS DATETIME2);
    SELECT *,'2015-02-01' FROM dbo.alphabet  -- data is actually in alphabet_history
    FOR SYSTEM_TIME AS OF @tempus;

    SET @tempus = CAST('2015-05-31' AS DATETIME2);  -- this day is undefined when using AS OF
    SELECT *,'2015-05-31' FROM dbo.alphabet  -- no data because the 2015-05-31 isn't < end datetime
    FOR SYSTEM_TIME AS OF @tempus;

    SET @tempus = CAST('2015-05-31 12:34:56' AS DATETIME2);  -- also undefined when using AS OF
    SELECT *,'2015-05-31 12:34:56' FROM dbo.alphabet  -- no data because it isn't < end datetime
    FOR SYSTEM_TIME AS OF @tempus;

    SET @tempus = CAST('2015-06-01' AS DATETIME2);
    SELECT *,'2015-06-01' FROM dbo.alphabet  -- works because 2015-06-01 is >= start datetime
    FOR SYSTEM_TIME AS OF @tempus;

    ---------- END SECTION 3 ----------

    ---------- BEGIN SECTION 4 ----------

    -- use BETWEEN expression

    DECLARE @startBetween DATETIME2 = CAST('2015-01-01' AS DATETIME2);
    DECLARE @endBetween   DATETIME2 = CAST('2015-01-31' AS DATETIME2);
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME BETWEEN @startBetween AND @endBetween;

    SET @endBetween = CAST('2015-01-31 12:34:56' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME BETWEEN @startBetween AND @endBetween;

    SET @endBetween = CAST('2015-02-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME BETWEEN @startBetween AND @endBetween;

    SET @startBetween = CAST('2015-01-31' AS DATETIME2)
    SET @endBetween   = CAST('2015-02-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME BETWEEN @startBetween and @endBetween;

    SET @startBetween = CAST('2015-06-30' AS DATETIME2)
    SET @endBetween   = CAST('2015-07-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME BETWEEN @startBetween AND @endBetween;

    SET @startBetween = CAST('2015-04-15' AS DATETIME2)
    SET @endBetween   = CAST('2015-09-15' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME BETWEEN @startBetween AND @endBetween;

    ---------- END SECTION 4 ----------

    ---------- BEGIN SECTION 5 ----------

    -- use FROM expression

    DECLARE @startFrom DATETIME2 = CAST('2015-01-01' AS DATETIME2);
    DECLARE @endFrom   DATETIME2 = CAST('2015-01-31' AS DATETIME2);
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME FROM @startFrom TO @endFrom;

    SET @endFrom = CAST('2015-01-31 12:34:56' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME FROM @startFrom TO @endFrom;

    SET @endFrom = CAST('2015-02-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME FROM @startFrom TO @endFrom;  -- only returns one row

    SET @startFrom = CAST('2015-01-31' AS DATETIME2)
    SET @endFrom   = CAST('2015-02-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME FROM @startFrom TO @endFrom;

    SET @startFrom = CAST('2015-06-30' AS DATETIME2)
    SET @endFrom   = CAST('2015-07-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME FROM @startFrom TO @endFrom;  -- no rows because of the gap

    SET @startFrom = CAST('2015-04-15' AS DATETIME2)
    SET @endFrom   = CAST('2015-09-15' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME FROM @startFrom TO @endFrom;  -- returns rows D through I, whose periods overlap the range

    ---------- END SECTION 5 ----------

    ---------- BEGIN SECTION 6 ----------

    -- use CONTAINED IN expression

    DECLARE @startContained DATETIME2 = CAST('2015-01-01' AS DATETIME2);
    DECLARE @endContained   DATETIME2 = CAST('2015-01-31' AS DATETIME2);
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME CONTAINED IN (@startContained, @endContained);  -- no rows

    SET @startContained = CAST('2014-12-31' AS DATETIME2)
    SET @endContained   = CAST('2015-02-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME CONTAINED IN (@startContained, @endContained);

    SET @endContained   = CAST('2015-03-01' AS DATETIME2)
    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME CONTAINED IN (@startContained, @endContained);

    ---------- END SECTION 6 ----------

    ---------- BEGIN SECTION 7 ----------

    -- Even though the sysstart and sysend columns aren't referenced
    -- in the UPDATE, they are automatically updated. Also notice
    -- that the row containing the letter X is deleted from the
    -- alphabet table and inserted into the alphabet_history table.

    UPDATE dbo.alphabet
    SET letter = 'Y';

    SELECT * FROM dbo.alphabet;

    SELECT * FROM dbo.alphabet
    FOR SYSTEM_TIME ALL;  -- the current row plus all of the history rows

    ---------- END SECTION 7 ----------

    -- can't drop a table that is participating in temporal SYSTEM_VERSIONING
    -- turn off SYSTEM_VERSIONING to allow DROP TABLE to work
    ALTER TABLE dbo.alphabet SET (SYSTEM_VERSIONING = OFF);
    DROP TABLE dbo.alphabet;          -- tables are no longer associated and
    DROP TABLE dbo.alphabet_history;  -- can be dropped in either order

    -- it is possible to create the main table and the history table
    -- with a single CREATE TABLE statement.
    -- Notice that this CREATE TABLE also contains GENERATED ALWAYS which
    -- prevents the insertion of explicit values.
    -- After creating a table using this syntax, refresh SSMS and expand
    -- the table in the Object Explorer. You will see that the history table
    -- was automatically created for you.

    CREATE TABLE dbo.alphabet
    (
         letter   CHAR(1)   NOT NULL PRIMARY KEY CLUSTERED  -- must have a primary key
        ,sysstart DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL
        ,sysend   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL
        ,PERIOD FOR SYSTEM_TIME (sysstart, sysend)
    )
    WITH (SYSTEM_VERSIONING = ON);

  • SQL Server 2016 Documentation Survey

    The SQL Server product team wants to hear from you. Here’s your chance to say what you expect from documentation. Hurry – the survey closes on March 18.

  • SQL Server on Linux–sign up for the preview

    SQL Server runs on Linux now. Apply for the preview at

  • SQL Server, @@VERSION, and Hyper-V

    Yesterday my friend Kalen Delaney asked me about @@VERSION showing (HYPERVISOR) even though she wasn’t running inside a virtual machine. I was able to replicate the behavior on my machines. I asked my colleagues at Microsoft about this. It was confirmed that it is by design. The addition of (HYPERVISOR) to the output indicates only whether or not Hyper-V is currently enabled on the machine. It is not intended to indicate if SQL Server is running inside a Hyper-V virtual machine. Disable Hyper-V on the machine and the text goes away.

    I did not have access to any machines with hypervisors other than Hyper-V.

If you didn’t intentionally enable Hyper-V on your machine, installing Visual Studio may have enabled it, depending on the options you selected.


  • Protecting Credit Card Data

    Although the latter part of this post discusses Always Encrypted in SQL Server 2016, I’m starting with the topic of protecting your own credit cards. I have a certain credit card that will remain nameless that we use for as many purchases as possible because of the cash rebate it offers. It’s like electronic cash to us and always paid in full. Because of my card’s widespread use, it seems to get compromised about every 10 months. Fraudulent charges appear and I end up spending some amount of time on the phone affirming that a whole bunch of charges were fraudulent. The credit card issuer removes the fraudulent charges, cancels the old card, and sends a new card.

The problem with using one card for everything is when you have preauthorized payments for things like insurance and utilities. Having your card cancelled is very bad for preauthorized payments. You can end up with late fees or service disruptions when a scheduled payment is attempted against a cancelled card. My wife and I got tired of this problem and adopted a new strategy. I have a card and she has a card from the same issuer offering the same cash rebate. Her card is used exclusively for preauthorized payments. My card is used exclusively for everything except preauthorized payments. Her card has never left the house. When my card was last compromised, it was cancelled. Since our preauthorized payments were tied to her card, they weren’t affected. We avoided the inconvenience of logging on to a dozen websites and providing new credit card data.

SQL Server 2016 has a new encryption feature, Always Encrypted. It’s well suited for encrypting credit card numbers in a database. It’s perfect for when you want to store encrypted data and keep even the all-powerful DBA from seeing the actual unencrypted data values. Read more about Always Encrypted here.

  • Storage Health Check

Would a drive failure inconvenience you? Have you checked your old drives? Maybe it is time to run some diagnostics. Before installing a new, larger SSD and making a completely fresh installation of Windows, I thought it would be wise to check the status of my existing storage devices. Most of my drives are from Seagate, so I installed SeaTools for Windows. Other drive manufacturers have similar utilities. SeaTools works with drives from other manufacturers in case you can’t find a diagnostic utility from yours.


    Figure 1. SeaTools showing problems with the 3 TB drive. SeaTools detects the Plextor SSD.

It’s possible that the drive problems could be fixed. But why bother? Drives don’t last forever and it is an old drive. Repairing it would take time and might not succeed. Since I bought a new 6 TB drive a few days ago, I decided the sensible thing to do was attach the 6 TB drive and transfer the data from the 3 TB drive to it.


    Figure 2. Is repairing sectors on an old drive a wise use of resources?

    The new drive needed to be formatted. Should you use a quick format or a regular, slow format? I do both. First I do a quick format to see if the drive can be formatted at all. Then I run a drive utility to check the drive’s fitness. The new drive is an HGST, so the HGST drive fitness test was run.


    Figure 3. Notice that the HGST utility did not pick up the Plextor SSD boot drive.

    Once a drive has a clean bill of health, I do a regular, slow format. People disagree on this point. A slow format may detect drive problems that a quick format cannot detect. I intend to use the new drive for many years. I’ll gladly invest several hours up front checking my drive’s fitness before relying on it for years. The regular format took around 10 hours. Copying files from the old drive to the new drive took about 4 hours.

    Before you get rid of a drive, you need to erase it. Simply deleting the files doesn’t actually remove them. Microsoft has published guidance here. DP Wiper did not work for me. Eraser is another option. It can be obtained as a portable version. There are other ways to securely wipe a drive. Some people prefer using bootable media to erase using a utility such as Dban.


    Figure 4. Active KillDisk free version.

It takes a long time to zero out a large drive. The free version of Active KillDisk only does a single pass of writing zeros.


    Figure 5. Be patient when waiting for zeros to be written.

    I did a second pass of writing zeros. The third pass aborted. I’ll probably use Eraser next time.

    Hopefully you won’t discover a looming problem like I did, but if you do, it’s better than suddenly being faced with catastrophic failure.

  • Privacy Settings: The Good, The Bad, and The Ugly

    I’ve spent more time than I’ve wanted today looking for and fiddling with privacy settings. Social media users can’t have privacy, but there are at least some things we can do to reduce annoyances from having an online presence.

    Targeted advertising is something I find disturbing. I don’t want to shop online and later have social media present ads based on what I was doing on other web sites. I saw a targeted ad from Target on Facebook which caused me to investigate. Yes, it is unfortunate that when writing about targeted ads my only example is from a company called Target, but let’s stay on track. I took the Why am I seeing this? link and found out more about targeted ads.


    Figure 1. Targeted ad on Facebook.

    The link provides information about a data provider called Datalogix or DLX. Unfortunately, Facebook’s opt out link for DLX didn’t work, which doesn’t reassure me about using Facebook.


    Figure 2. Broken opt out link on Facebook.

    I went to the Datalogix website and found a privacy page which revealed that Datalogix is owned by Oracle. The Datalogix privacy page has a link to opt out of cookie based ads. The link was broken, so I reported it to Oracle. I’ll let you know what happens.

    The Digital Advertising Alliance has an opt out link for cookie based advertising. The National Advertising Initiative has a similar link. You’ll need to use both links in every browser you use. Additional opt out links were found on Facebook’s Cookies, Pixels & Similar Technologies page. Understand that opting out may install an opt out cookie for your browser.
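An opt out is typically recorded as an ordinary cookie in the browser, which is why it must be repeated in every browser you use and disappears if you later clear your cookies. Here is a minimal sketch of what such a cookie looks like, using Python’s standard library; the cookie name, value, and domain are hypothetical, not the ones the DAA or NAI actually set:

```python
from http.cookies import SimpleCookie

# Build a hypothetical opt-out cookie of the kind an ad network's
# opt-out page sets. Because it lives in the browser's cookie store,
# each browser needs its own copy, and clearing cookies removes the
# opt-out along with everything else.
cookie = SimpleCookie()
cookie["ad_opt_out"] = "1"                       # hypothetical name/value
cookie["ad_opt_out"]["domain"] = ".example.com"  # hypothetical ad domain
cookie["ad_opt_out"]["path"] = "/"
cookie["ad_opt_out"]["max-age"] = 5 * 365 * 24 * 3600  # long-lived

print(cookie["ad_opt_out"].OutputString())
```

Note the long expiration: the opt out only lasts as long as the cookie does.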

    Disabling third party cookies helps reduce targeted advertising. In the Windows 10 Edge browser, click the ellipsis (...) and select Settings.


    Figure 3. Edge browser settings.

    Go to the Advanced settings section and click View advanced settings.


    Figure 4. Blocking third party cookies.

    Search for information on changing cookies settings for other browsers.

    Today Facebook has a link for Remove All Imported Contacts. Do you really want your email contacts loaded and stored in Facebook? I don’t.

    LinkedIn has an import feature that you may want to undo. Go to Privacy & Settings and select Manage. Expect to be prompted for your login even if you are already logged in. Go to Groups, Companies & Applications. Next, click Turn on/off data sharing with 3rd party applications.


    Figure 5. LinkedIn third party applications.

  • Windows 10 Fixing Flash Video Playback

    After upgrading my desktop (which has always been quirky) to Windows 10, I noticed that many videos would play only audio. Instead of video, all I could see was a green background, as shown in Figure 1 below.


    Figure 1. Green background during video playback.

    I suspected that the video was using the Adobe Flash Player which I confirmed by right-clicking over where the video should be appearing.


    Figure 2. Confirmation that the video problem was with the Adobe Flash Player.

    I went into Global Settings and tried resetting the license files, a fix someone had suggested long before Windows 10. It didn’t help. A help page on the Adobe site made me wonder whether the Adobe Flash Player was enabled in the Edge browser. I clicked the ellipsis in the Edge browser toolbar to open the More actions menu.


    Figure 3. Select Settings on the More actions menu.

    I selected Settings, then scrolled to the bottom of the Settings pane and clicked View advanced settings.


    Figure 4. Click View advanced settings.

    I discovered that Adobe Flash Player was enabled.


    Figure 5. Adobe Flash Player enabled, but not working properly on my machine.

    Since the Adobe Flash Player wasn’t working for me, I turned it off.


    Figure 6. Turning off Adobe Flash Player fixed the video playback problem on my machine.


    Figure 7. Video playback on my machine working with Adobe Flash Player turned off.

    I am not advising anyone to change settings on Windows 10 or the Edge browser just because you can or because I did. If things are working, leave your settings alone. If your video playback isn’t working and you are seeing only a green background, you may want to try the change that worked for me. What works on my machine may not work on yours.

  • In The Cloud: Azure SDKs

    People have been asking me about three things recently:

    • Application development in Azure
    • Visual Studio 2015
    • Microsoft certification exam 70-534 Architecting Microsoft Azure Solutions

    These topics are interrelated. To do application development for Azure, you’ll need Visual Studio and SDKs. After installing VS 2015, visit the Azure downloads page to install the Microsoft Azure SDK for .NET (VS 2015). You’ll also find other SDKs that you’ll want to install.


    Figure 1. Installing Azure SDK for VS 2015

    After installing the Azure SDK, the Web Platform Installer will appear. Use it to install the Microsoft Azure cross-platform Command Line Tools.


    Figure 2. Azure Command Line Tools

    Other command-line tools you’ll need to familiarize yourself with for the 70-534 exam are Azure PowerShell and AzCopy. You’ll also need competency with the migration tools on the download page. I encourage you to take advantage of the documentation links on the page.
