SQLblog.com - The SQL Server Blog Spot on the Web

John Paul Cook

  • Using Data for Product Recalls to Keep People Alive

    I just received a phone call from Kroger with a recorded message informing me that someone in my household purchased Blue Bell ice cream that may be part of the Listeria recall. As both a nurse and a database technologist who is very concerned about health, privacy, and security, I found the call interesting on many levels. The call was possible because of Kroger’s customer loyalty program. Kroger knows both how to contact my family and what we purchase. Of course, this requires that we opt in to the customer loyalty program at the point of purchase. Sometimes we don’t. Until today, I never associated customer loyalty programs with public health announcements. This is a great use of data warehousing technology. Do you think you could do something good like this where you work?

    UPDATE: I went to Kroger to purchase produce and decided to provide my customer loyalty number. An alert appeared on the self-checkout terminal telling me to read the important message on my cash register receipt. The receipt mentioned that possibly affected products had been purchased and provided useful details.

  • Coping with Little Data

    With all of the hype about Big Data, Little Data is being overlooked. Not every business has zettabytes of data. Businesses that have a few bits here and maybe a few bytes there are being overlooked, but there is hope on the horizon. The most fundamental part of the Little Data ecosystem is Gnorm. Gnorm is named after the gnat in Jim Davis’s comic strip Gnorm Gnat. Jim wasn’t happy with a small success, so he abandoned Gnorm and created Garfield. But enough about Jim.

    Apache Gnorm is a set of algorithms for undistributed storage and undistributed processing of very small data sets on a single desktop computer. It was ported from an abacus. MapExpand is used to process the data into something large enough to see. Apache Hive is overkill for processing Little Data, so the developers created Apache Cell after extracting a single cell from Apache Hive to use as the data warehouse. Version 1 was a worker bee cell, but in version 2 it was adapted from a queen bee cell. Similarly, Apache Zookeeper is too large for coordination of Little Data tasks, so Apache Petri was created. Real-time analysis is done with Breeze and machine learning is done with 65.

    I spoke with Liz N. Knot of the IT recruiting firm Loosey and Guice. She said it is very difficult finding IT professionals for Little Data projects. She said her clients want to solve simple problems like did the business bring in enough money to cover expenses today, but so many applicants only want to do correlations using R or Python. She just can’t get them to switch over to Worm.

  • Microsoft is Hiring for HoloLens

    If you go to https://careers.microsoft.com/search.aspx and use HoloLens as your search term, you’ll see that Microsoft is serious about this product. As of today, I found 100 job openings for HoloLens in the United States alone. Don’t know what HoloLens is? Take a look here: http://www.microsoft.com/microsoft-hololens/en-us

    As augmented reality and virtual reality become more commonplace, we’re going to be dealing with lots of large video files such as mp4 files. This creates data management challenges similar to what we see in databases. We use metadata to help manage data in a relational database. We can add metadata to video files, both to aid in discoverability and search as well as to add context. For example, an audiovisual (AV) recording of surgery by itself lacks context. It’s possible to add a track to the AV file that includes the patient’s electrocardiograph captured in real time. The viewer could view the metadata at the same time as the AV recording.

  • Office Mix for PowerPoint

    Office Mix is a free add-in for PowerPoint 2013. It records the screen: either a PowerPoint presentation or a specific region of the screen that you choose. It can record video, audio, or both. It also allows you to annotate your presentations with digital ink.

    image

    Figure 1. Technically Mix is a beta product.

    After you install Mix, a new Mix menu bar is added. Notice that it even has a feature for creating quizzes.

    image

    Figure 2. Mix menu bar.

    Learn more about it and download Mix from here.

  • In The Cloud: Using InPrivate Browsing

    Periodically people post looking for an answer to the following error message, which is also shown in Figure 1. Frequently the responses offered are zebra diagnoses. In healthcare, a zebra is an exotic diagnosis reached by overlooking the simple and obvious one. If you hear hoofbeats, you should think of horses before you think of zebras.

    The Windows Azure Portal encountered an error while trying to access your subscription.

    Based on where the posts are found, it’s a fair guess that the posters are developers or otherwise geeky early adopters trying to develop and test new things in Azure. People like that are likely to have more than one Azure account. That is the easiest explanation for the error. The easy way to resolve it is to open an InPrivate browsing session and try again. You need to make sure that you have a clean slate when you connect to Azure.

    image

    Figure 1. Credentials problem when using multiple Azure accounts.

    There can be other causes. I’m just presenting you with the simplest, fastest first step before you spend a lot of time and effort on what could be a very simple problem.

    image

    Figure 2. Starting an InPrivate browsing session.

    If you still get the error message after attempting to connect to the Azure portal from an InPrivate browsing session, then start looking for zebras.

  • Backup Basics–Doing it Faster

    Several customers have told me that backing up their databases takes longer than they are comfortable with. Others have told me their backups take a long time but they don’t mind. Whatever the case, getting things done more quickly is usually worth at least a quick look.

    There are two basic approaches to speeding up a backup. One is to have smaller backups. The other is to split the work into multiple streams that are done in parallel. Each has its advantages and disadvantages, which aren’t the same for everybody. Some organizations may not have enough disks or LUNs to parallelize a backup operation. Others may not be able to spare the CPU cycles for compressing backups.

    SQL Server does have some built-in features that you can experiment with to find your optimal approach to database backups. All of my tests were conducted on my MIMIC II database that I migrated to SQL Server 2014. The database files are on a Plextor SSD and the database backups were written to 7200 rpm Seagate hard drives.

    image

    Figure 1. Baseline for backup with default settings.

    Native backup compression has been a feature of SQL Server since SQL Server 2008 Enterprise. I found differing advice on restoring a compressed SQL Server backup, so I asked SQL Server product team members for clarification: you can restore a compressed backup to an equal or higher version of SQL Server. Compression reduces the amount of disk activity at the cost of using more processor time to do the compression. If you have CPU cycles to spare, the cost of doing the compression should be more than offset by the sharply reduced amount of data written to disk.
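    If you prefer T-SQL to the SSMS dialog shown in Figure 2, a compressed backup can be taken with a statement along these lines. This is a minimal sketch; the backup path is only an example from my machine:

    -- Back up MIMIC2 with native backup compression
    BACKUP DATABASE MIMIC2
    TO DISK = N'E:\Backups\MIMIC2.bak'
    WITH COMPRESSION, STATS = 10;  -- STATS = 10 reports progress every 10 percent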

    image

    Figure 2. Enabling backup compression under Backup Options.

    image

    Figure 3. Smaller backup from enabling compression is written much faster.

    Another approach is to split the work of writing to disk into multiple streams. This is called striping. Since I have three spinning disks, I split the workload into three stripes.
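    In T-SQL, striping is just a matter of listing more than one destination. Here is a sketch using three hypothetical drive paths; compression and striping can also be combined in the same statement:

    -- Stripe the backup across three files on three different spindles
    BACKUP DATABASE MIMIC2
    TO DISK = N'E:\Backups\MIMIC2_stripe1.bak',
       DISK = N'F:\Backups\MIMIC2_stripe2.bak',
       DISK = N'G:\Backups\MIMIC2_stripe3.bak'
    WITH STATS = 10;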

    image

    Figure 4. Striping to parallelize the writes.

    You should conduct your own tests in your own environment. Expect your results to vary from mine. There are many fine third party solutions from various vendors that provide compression and striping. Some solutions are smart about not automatically doing a full backup if a previous one exists and few data changes have been made since the last full backup.

    Third party tools are particularly appealing to people who don’t like to write custom scripts. For those of you who want to script your own solution, Ola Hallengren published some great scripts here.

    Collect your baseline performance data, conduct some tests, and speed things up!

  • In The Cloud: Azure Operational Insights for Server Monitoring

    Microsoft is now offering a preview version of Azure Operational Insights, which is a cloud-based server monitoring solution and replacement for System Center Advisor. I decided to try it out on my desktop’s default installation of SQL Server 2014. I picked this SQL Server instance because it’s not configured how I would configure a production server. I wanted to see what Azure Operational Insights would find.

    I went to http://azure.microsoft.com/en-us/services/operational-insights/ to get started. After logging in to the Azure Portal, I created an Operational Insights workspace named jpcoperationalinsights.

    image

    Figure 1. Creation of Operational Insights workspace.

    Operational Insights has Intelligence Packs, which are similar to System Center Operations Manager (SCOM) management packs. I added the SQL Assessment Intelligence Pack so that I could monitor SQL Servers.

    image

    Figure 2. Add SQL Assessment Intelligence Pack to monitor SQL Server.

    On the Operational Insights Overview page, click Servers and Usage to advance to the Overview > Usage page.

    image

    Figure 3. Clicking Servers and Usage takes you to the Overview > Usage page.

    Operational Insights can connect to SCOM directly. Since I don’t have SCOM on my home desktop, I needed to install an agent on my machine to monitor SQL Server and report the results to Operational Insights running in Azure. This requires clicking Configure on the Overview > Usage page to advance to the Configure > Usage > Direct Agent Configuration page.

    image

    Figure 4. Click Configure.

    On the Configure > Usage > Direct Agent Configuration page, use the Agent (64-bit) link to install the agent on the server to be monitored. The Workspace ID and primary workspace key are needed to connect the agent to Azure.

    image

    Figure 5. You need the Workspace ID and primary workspace key to connect the agent to Operational Insights.

    When running the installer for the monitoring agent, specify if the agent will connect directly to Azure Operational Insights or to SCOM.

    image

    Figure 6. Connecting the agent directly to Operational Insights.

    You must enter the Workspace ID and primary workspace key during the installation of the monitoring agent if you are connecting directly to Operational Insights.

    image

    Figure 7. Specifying the connection details.

    After a few hours, I returned to Operational Insights to see what it found. The report categories are:

    • Security and Compliance
    • Availability and Business Continuity
    • Performance and Scalability
    • Upgrade, Migration and Deployment
    • Operations and Monitoring
    • Change and Configuration Management

    image

    Figure 8. Partial screen capture of SQL Assessment reports.

    The details of the Security and Compliance report indicated that Windows password policies weren’t being applied to SQL Server logins.
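    As a rough illustration of how that recommendation translates to T-SQL, the Windows password policy can be enforced on an existing SQL Server login like this (the login name is a placeholder):

    -- Enforce the Windows password policy and expiration rules for a SQL Server login
    ALTER LOGIN [app_login] WITH CHECK_POLICY = ON, CHECK_EXPIRATION = ON;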

    image

    Figure 9. Recommendation for applying Windows password policies to SQL Server logins.

    On the Availability and Business Continuity report, we see both high priority and low priority recommendations. There is a high priority recommendation to schedule full backups at least weekly.

    image

    Figure 10. Details of Availability and Business Continuity report.

    All of the recommendations were good. Operational Insights documentation is found here.

  • In The Cloud: Enhancements to Azure SQL Database

    Last week Microsoft released a preview of the next version of Azure SQL Database. The feature set is nearly complete when compared with standalone SQL Server 2014, and a clustered index is no longer required on every table. To help highlight the differences, I used the SQL Database Migration Wizard, available on CodePlex, in a side-by-side comparison.

    image

    Figure 1. SQL Database Migration Wizard start page.

    AdventureWorks2014 is used in the compatibility analyses.

    image

    Figure 2. Azure SQL Database version 1 on the left and Azure SQL Database Update V12 on the right.

    image

    Figure 3. Choose Objects.

    image

    Figure 4. Script Wizard Summary.

    Notice that the Results Summary shows very few incompatibilities between AdventureWorks2014 on SQL Server 2014 and on SQL Database Update V12. As the output shows, full text search is not supported in the preview version of SQL Database.

    image

    Figure 5. Notice the scroll bar on the left window. AdventureWorks2014 is almost a direct port to SQL Database Update V12.

    Since none of the actual databases I’m migrating to SQL Database use full text search features, my databases are completely compatible with SQL Database Update V12.

    The premium tier of the SQL Database Update V12 preview supports in-memory columnstore queries for greatly improved performance. T-SQL windowing functions are also supported. For more information about what is supported, look here.
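    As a quick illustration of the windowing support, a running-total query like this one against AdventureWorks2014 (the columns are from Sales.SalesOrderHeader) should run unchanged on V12:

    -- Running total of order amounts per customer using a window function
    SELECT CustomerID,
           OrderDate,
           TotalDue,
           SUM(TotalDue) OVER (PARTITION BY CustomerID
                               ORDER BY OrderDate
                               ROWS UNBOUNDED PRECEDING) AS RunningTotal
    FROM Sales.SalesOrderHeader;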

    You must use the preview Microsoft Azure management portal at http://portal.azure.com/ to access SQL Database V12. For more information on getting started, look here.

  • In The Cloud: Manually Migrating a Hyper-V VM to Azure

    Azure is great for sharing reproductions of difficult problems. I managed to reproduce an application problem in a virtual machine on my laptop. It’s really not reasonable to ask others to follow a long set of steps to build a matching test environment just to help me. Besides, something could go wrong. I had the perfect test case, but only in my local Hyper-V environment. I realized this is a perfect use case for migrating a vm to Azure. I learned a few things along the way that may be of interest. One of the most important things I learned is to make sure the vm is configured to allow remote desktop connections before migrating it to Azure. An Azure vm that can’t accept a Remote Desktop Connection is pretty close to useless.

    Azure requires fixed-size vhd files. I had dynamically expanding vhdx files, so I used the PowerShell Convert-VHD cmdlet to convert vhdx to vhd.

    Windows PowerShell
    Copyright (C) 2013 Microsoft Corporation. All rights reserved.

    PS C:\Users\j.NA> Convert-VHD

    cmdlet Convert-VHD at command pipeline position 1
    Supply values for the following parameters:
    Path: D:\Hyper-V\SQLServer2005\VHDs\SQLServer2005.vhdx
    DestinationPath: D:\Hyper-V\SQLServer2005\VHDs\SQLServer2005.vhd
    PS C:\Users\j.NA> Convert-VHD

    cmdlet Convert-VHD at command pipeline position 1
    Supply values for the following parameters:
    Path: D:\Hyper-V\SQLServer2005\VHDs\D.vhdx
    DestinationPath: D:\Hyper-V\SQLServer2005\VHDs\D.vhd
    PS C:\Users\j.NA>

    image

    Figure 1. PowerShell converting vhdx to vhd.

    Next I used the Microsoft Web Platform Installer to install Azure PowerShell. I used Add-AzureAccount to connect my Azure PowerShell session to my Azure account; it prompted me to enter my credentials. Add-AzureVhd takes care of the conversion from a dynamically expanding to a fixed-size vhd.

    image

    Figure 2. Authenticating to Azure in Azure PowerShell.

    Windows PowerShell
    Copyright (C) 2013 Microsoft Corporation. All rights reserved.

    For a list of all Azure cmdlets type 'help azure'.
    For a list of Windows Azure Pack cmdlets type 'Get-Command *wapack*'.
    For Node.js cmdlets type 'help node-dev'.
    For PHP cmdlets type 'help php-dev'.
    For Python cmdlets type 'help python-dev'.
    PS C:\> Add-AzureAccount
    VERBOSE: Account "microsoft@microsoft.com" has been added.
    VERBOSE: Subscription "Microsoft Azure" is selected as the default subscription.
    VERBOSE: To view all the subscriptions, please use Get-AzureSubscription.
    VERBOSE: To switch to a different subscription, please use Select-AzureSubscription.

    Id                             Type       Subscriptions                          Tenants
    --                             ----       -------------                          -------
    microsoft@microsoft.com        User       eefffffa-ffff-ffff-9fff-9fffffffffff   72ffffff-ffff-ffff-ffff-ffffffffffff


    PS C:\> Add-AzureVhd

    cmdlet Add-AzureVhd at command pipeline position 1
    Supply values for the following parameters:
    (Type !? for Help.)
    Destination:
    https://jstorage.blob.core.windows.net/vhds/SQLServer2005.vhd
    LocalFilePath:
    D:\Hyper-V\SQLServer2005\VHDs\SQLServer2005.vhd
    MD5 hash is being calculated for the file  D:\Hyper-V\SQLServer2005\VHDs\SQLServer2005.vhd.
    MD5 hash calculation is completed.
    Elapsed time for the operation: 00:06:03
    Creating new page blob of size 136365212160...
    Elapsed time for upload: 01:50:39

    LocalFilePath                                               DestinationUri
    -------------                                               --------------
    D:\Hyper-V\SQLServer2005\VHDs\SQLServer2005.vhd             https://jstorage.blob.core.windows.net/vhds/SQLServer...


    PS C:\> Add-AzureVhd

    cmdlet Add-AzureVhd at command pipeline position 1
    Supply values for the following parameters:
    (Type !? for Help.)
    Destination: https://jstorage.blob.core.windows.net/vhds/D.vhd
    LocalFilePath: D:\Hyper-V\SQLServer2005\VHDs\D.vhd
    MD5 hash is being calculated for the file  D:\Hyper-V\SQLServer2005\VHDs\D.vhd.
    MD5 hash calculation is completed.
    Elapsed time for the operation: 00:04:50
    Creating new page blob of size 136365212160...
    Elapsed time for upload: 00:00:54

    LocalFilePath                                               DestinationUri
    -------------                                               --------------
    D:\Hyper-V\SQLServer2005\VHDs\D.vhd                         https://jstorage.blob.core.windows.net/vhds/D.vhd


    PS C:\>

    Once the vhds are in an Azure blob storage container, it’s time to create disks for the virtual machine. Select CREATE A DISK to start the wizard.

    image

    Figure 3. Create a disk from a vhd.

    image

    Figure 4. Selecting a vhd from the vhds container.


    image

    Figure 5. After selecting the vhd where the OS is installed, check The VHD contains an operating system.

    The process was repeated for the virtual machine’s D drive, which is just a data disk.

    Next, an Azure virtual machine was created. Notice that the operating system disk created in the previous steps appears under MY DISKS.

    image

    Figure 6. Selecting the operating system disk for the virtual machine.

    After the virtual machine was created, but before it was started, the data disk was added by using the options at the bottom of the Azure portal page.

    image

    Figure 7. Adding the data disk to the vm.

    Once the virtual machine was migrated to Azure, I could easily share it with my colleagues so they could see the problem easily and quickly from wherever they are.

  • BI Beginner: Creating SSIS projects for 64-bit Office

    SQL Server Data Tools for Business Intelligence (SSDT-BI) is the replacement for Business Intelligence Development Studio (BIDS). SSDT-BI is a 32-bit application. If you have 64-bit Office installed instead of 32-bit Office, you will encounter problems when attempting to create SSIS packages that use Excel files. This is easily resolved by installing 32-bit drivers from here. Click the Download button and select AccessDatabaseEngine.exe.

    Here’s what appears in SSDT-BI when you don’t have 32-bit drivers.

    image

    Figure 1. SSDT-BI without 32-bit drivers for Excel.

    After installing the 32-bit drivers mentioned above, the worksheet can be selected.

    image

    Figure 2. Selecting the worksheet from the dropdown list.

    That solves the bitness problem and uncovers a permissions problem. We see “[SQL Server Destination [18]] Error: Unable to bulk copy data. You may need to run this package as an administrator.” along with error code 0xC0202071.

    image

    image

    Figure 3. Error importing data from Excel even though connections are valid.

    Running SSDT-BI (i.e., Visual Studio) as an administrator is necessary to elevate the permissions to allow access to SQL Server.

    image

    Figure 4. Running SSDT-BI as administrator – good use case for pinning it to the taskbar.

    Running as administrator allows you to rerun the job with elevated privileges to connect to SQL Server.

    image

    Figure 5. Success at last.

  • BI Beginner: Installing SSDT BI

    The Business Intelligence Development Studio (BIDS) does not come with SQL Server 2014 like it did in previous versions of SQL Server. Furthermore, it has been renamed SQL Server Data Tools for Business Intelligence (SSDT-BI). If you have Visual Studio 2012, download this. If you have Visual Studio 2013, download this. I recommend that you download and save the file and run the saved copy instead of running directly from the link.

    A few individuals have reported seeing a “Same architecture installation” error message. This is most likely caused by not taking the default of Perform a new installation of SQL Server 2014. Pick the default and you should be fine. It does not install a new instance of the SQL Server database engine.

    image

    Figure 1. Take the defaults for Installation Type.

    Expect to reboot. Open Visual Studio. Go to FILE | New | Project… to see the new templates that have been added to Visual Studio.

    image

    Figure 2. Select Project to see the new Business Intelligence Templates.

    Pick an appropriate template and get down to business.

    image

    Figure 3. Business Intelligence Templates in Visual Studio 2013.

  • In The Cloud: Creating Your First Azure SQL Database

    Creating your first SQL Database in Azure is quick and easy. There are a few things I’ve discovered that might help you be successful faster. Many people have asked me how to get started, so I’ve provided these steps to success. The steps I show you aren’t the only way to achieve success. This is an adaptation of one of my Azure demos.

    First, you need an Azure account. If you have an MSDN subscription, you should be able to use that. Your employer might have Azure credits. If those options aren’t available, you can get a one-month trial here.

    Go to http://azure.microsoft.com and log in.

    image

    Figure 1. Default appearance of SQL DATABASES in Azure.

    Sometimes I just click CREATE A SQL DATABASE, which is what the SQL DATABASES page in Azure shows by default. If you’re just starting out on Azure, I suggest that you take a look around to see what all of your options are. That will help you see the big picture. Click on SERVERS.

    image

    Figure 2. Click on SERVERS to see the server view and create a new SQL Database server.

    When working with a physical SQL Server, we create a SQL Server before we create a database. It’s a familiar model, so that’s what I’m showing you how to do in Azure. After you click CREATE A SQL DATABASE SERVER, you’ll be presented with a CREATE SERVER box.

    image

    Figure 3. Be sure to pick a nearby REGION for better response times.

    Once your SQL Database server has been created, you should click MANAGE at the bottom of the page to configure it.

    image

    Figure 4. Click MANAGE to configure your server.

    What happens next and what you should do depends on your network environment. If you have a simple, uncomplicated network, click YES to include your IP address as a permissible IP address allowed to go through the firewall.

    image

    Figure 5. Adding your IP address as a firewall rule.

    Depending on your network’s configuration, the IP address you are shown may not be your actual public IP address. If that’s the case, you’re not going to achieve success until you add your actual public IP address as a firewall rule. Thank you SQL Server MVP Denny Cherry for pointing that out to me. How do you know if the IP address you were shown is your actual public IP address? There are several websites that will show you what your actual, public IP address is. One such site is found here.

    If you discover that your public IP address is different (my public IP address at work differs in the third octet), you’ll need to click your server’s name to configure it.

    image

    Figure 6. Click on your server’s name to see the list of options available.

    On your server’s page, click on CONFIGURE to add an IP address to your firewall rules.

    image

    Figure 7. Click CONFIGURE to add an IP address.

    You can enter additional IP addresses on your server’s CONFIGURE page. Click SAVE after making changes.

    image

    Figure 8. Adding an IP address. Remember to click SAVE.

    You can create a database from within the Azure portal. It’s the simplest way and avoids surprises. If you choose to deploy an existing database to Azure as a SQL Database, be aware that not all SQL Server data types are implemented in Azure. Notice that within Azure, we have SQL databases, not SQL Server databases. Native Azure database servers and databases have a subset of what is available in SQL Server. There are other differences. In particular, you need to be aware that to migrate a SQL Server database to Azure SQL Database, every table must have a clustered index. If you have a small, unindexed table you want to migrate to Azure SQL Database, you’ll need to add a clustered index first.
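    If you run into that, a minimal sketch of the fix looks like this (the table and column names are hypothetical):

    -- Give a small heap a clustered index so it can be migrated to Azure SQL Database
    CREATE CLUSTERED INDEX CIX_LookupCodes_Code
    ON dbo.LookupCodes (Code);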

  • Migrating Open Source Databases to SQL Server

    In my recent migration of a PostgreSQL database to SQL Server, I made several generic observations that I hope you will find helpful. I used the official PostgreSQL drivers found here. They did work and I was eventually able to use the Import and Export Wizard to load the data into SQL Server. Drivers that work with the Import and Export Wizard will also work with SSIS.

    If you are dealing with large volumes of data, you may need to use the 64-bit version of the Import and Export Wizard instead of the default 32-bit version. It’s easy to find either version by using the Windows search feature. Three tables did not load because they were too large and the 32-bit version of the Import and Export Wizard ran out of memory. The 64-bit version of the Import/Export Wizard was able to load all of the tables, including the one with 182 million rows.

    image

    Figure 1. Windows 8 search feature showing both versions of the Import/Export Wizard.

    The bitness of the drivers must match the bitness of the tool. I installed both 32-bit and 64-bit versions of the PostgreSQL ODBC drivers. Once again, the Windows search feature comes in handy for selecting the correct version of the ODBC Data Source Administrator.

    image

    Figure 2. Windows 8 search feature showing both versions of the ODBC Data Source Administrator.

    I was not successful in connecting to the PostgreSQL database when I used the server name. Since this was a one-time migration, I took the expedient approach of using the server’s IP address instead of the server name. To obtain an IP address from a Windows server, open a command prompt, enter ipconfig, and hit return. For a Linux server, open a terminal console, enter ifconfig, and hit return.

    image

    Figure 3. Connecting to the remote database by IP address.

    Although basic connectivity was established, the default option for Specify Table Copy or Query was not exactly what was needed for success. The table names in the PostgreSQL database were all enumerated, but they were in the format TableName instead of the SchemaName.TableName format. It was necessary to select the Write a query to specify the data to transfer option so that the SchemaName.TableName format could be specified manually.
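    The query typed into the wizard was nothing more elaborate than a schema-qualified SELECT, along these lines (the schema and table names here are placeholders, not the actual MIMIC II names):

    -- Schema-qualified query entered in the Import and Export Wizard
    SELECT *
    FROM clinical.patients;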

    image

    Figure 4. Specify Table Copy or Query.

    Success was still elusive after this change. None of the tables could be created on the destination data source, which was SQL Server 2014.

    image

    Figure 5. Improper schema mappings.

    I found a fix to the schema mappings problems in this blog post on MSDN. Notice that the problem database in the linked page was a Pervasive database. That reinforces my original point that there are some generic issues to consider when migrating data.

    You may want to consider purchasing a third-party data migration utility. If you have recurring, mission-critical data migration needs, a third-party vendor can provide support. This is a particularly important consideration when your source or target database is open source, such as PostgreSQL. Some vendors write their own data access code, which gives them control (good for you because they can fix problems should any arise) and may also give you better performance. I did exchange emails with Damir Bulic at http://www.spectralcore.com/, who was very helpful in informing me about data migration in general. I particularly appreciate how he pointed out to me that although the tables were indexed and some had primary keys, the largest tables did not have a primary key. Next time, I’ll more carefully check for primary keys before starting a migration. As Damir correctly pointed out, having a primary key allows keyed access to tables so that the entire table doesn’t have to be loaded into memory. His product takes advantage of primary keys so that data can be read in chunks and thus avoid out-of-memory issues. If you have a 32-bit system, that could be very important. You may want to add a primary key to a large table before attempting a migration.
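    If you decide to add a primary key to a large source table before migrating it, a sketch of the statement follows; it works on both PostgreSQL and SQL Server, and the table and column names are hypothetical:

    -- Add a primary key so migration tools can read the table in keyed chunks
    -- (the key column must not contain NULLs)
    ALTER TABLE chart_events
    ADD CONSTRAINT pk_chart_events PRIMARY KEY (chart_event_id);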

    I also exported all of the tables into csv files. I knew that my 182 million row table was too big even for my 64-bit version of Excel on a machine with 32 GB of RAM, but I wanted to see what would happen. Excel gracefully informed me that it could not load the entire file. As you can see, Excel has a limit of 1,048,576 rows.

    image

    image

    Figure 6. Row limit in Excel.

  • SQL Server Backup to Azure

    Because so many people have asked me for specific details on backing up a SQL Server database to Azure, this post shows the necessary steps to accomplish that. The original database name is MIMIC II, which is a clinical database that is available to researchers. It will be used in subsequent posts on data migration and machine learning. The original copy of MIMIC II is in a PostgreSQL database in an Ubuntu virtual machine. I used the SQL Server Import/Export Wizard to import the data into SQL Server, which I find more convenient to work with. I gave the database the name MIMIC2 in SQL Server 2014. When I completed the migration to SQL Server, of course I wanted a backup. Azure was my first choice. By storing it in Azure, I know it is both secure and accessible to me should I ever need it. 

    SQL Server 2014 is shown in the screen captures below. SQL Server versions as far back as SQL Server 2005 can be easily backed up to Azure by using the free Microsoft SQL Server Backup to Microsoft Azure Tool.

    The first step is to create a storage account in Azure. On the left hand side of the page, select STORAGE. Next click either + NEW or CREATE A STORAGE ACCOUNT.

    image

    Figure 1. Pick either of the options to create a storage account.

    Specify the first part of the URL that will be used to access the storage account.

    image

    Figure 2. Entering mimic2 for the URL makes the entire URL mimic2.core.windows.net

    image

    Figure 3. Click on MANAGE ACCESS KEYS.

    image

    Figure 4. You must save the access keys. They are your passwords to the storage account.

    You’ll need to create a security credential to use in SSMS. Here is what your key maps to in T-SQL:

    image

    Figure 5. The storage account name maps to IDENTITY. The primary access key maps to SECRET.
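    In T-SQL, the mapping shown in Figure 5 becomes a CREATE CREDENTIAL statement roughly like this (the credential name is my own choice, and the secret is obviously a placeholder for the primary access key):

    -- Credential used for backing up to Azure; IDENTITY is the storage account name
    CREATE CREDENTIAL AzureBackupCredential
    WITH IDENTITY = 'mimic2',
         SECRET = '<primary access key from MANAGE ACCESS KEYS>';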

    image

    image

    Figure 6. You can use the SSMS gui to create the security credential instead of T-SQL. Notice the primary access key from Azure maps to Password on the New Credential box.

    Now that you have a storage account, the next step is to create a container. Click on CONTAINERS to change the view.

    image

    Figure 7. Click on CONTAINERS to create a container for your backup.

    To create the container, click either + ADD or CREATE A CONTAINER.

    image

    Figure 8. Pick either of the options to create a container within your storage account.

    image

    Figure 9. Enter a name for your new container.

    image

    Figure 10. Your backups will reside in the blob container you’ve created.

    Refer back to Figure 5 or Figure 6 to create a security credential. You’ll have to do that before proceeding to the next step.

    Once you have a security credential created, you are ready to back up to Azure.

    image

    Figure 11. In SQL Server 2014, backing up to Azure is accessed like any other backup method.

    To back up to Azure, select URL from the Back up to dropdown list and select your credential name from the SQL credential dropdown list.

    image

    Figure 12. Select URL and your credential name from the dropdown lists.

    You probably want to select Backup Options so that you can select Compress backup from the dropdown list. A compressed backup will take less time to upload to Azure and it will occupy less storage space.
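    For reference, the T-SQL equivalent of these dialog settings looks roughly like this. Treat it as a sketch: the container name and the credential name come from the earlier hypothetical examples, and the URL uses the storage account’s blob endpoint:

    -- Compressed backup written directly to the Azure blob container
    BACKUP DATABASE MIMIC2
    TO URL = N'https://mimic2.blob.core.windows.net/backups/MIMIC2.bak'
    WITH CREDENTIAL = 'AzureBackupCredential',
         COMPRESSION, STATS = 10;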

    image

    Figure 13. Selecting Compress backup will reduce the upload time and Azure storage space.

    You can also connect to Azure directly from SSMS and browse your Azure storage.

    image

    image

    Figure 14. Select Azure Storage to connect to Azure to look at your Azure storage from within SSMS. The Account key is the primary access key in Azure, the SECRET in T-SQL, or the Password on the New Credential box in SSMS.

    image

    Figure 15. View your Azure storage accounts, containers, and backups from within SSMS.

    image

    Figure 16. You can also examine the contents of your container from within the Azure portal.

  • Antivirus and SQL Server

    Antivirus protection and SQL Server don’t mix well. Some good tips on getting antivirus software to play nicely with SQL Server can be found here. Take a look; you might find something you haven’t considered.

