SQLblog.com – The SQL Server blog spot on the web

Damian Widera

  • Technical Overview: SQL Server 2016 Community Technology Preview 3.3

    Hi SQL Folks!

    According to the TechNet site (check here for more details) we can play with SQL Server 2016 CTP 3.3 now!

     

    Here is what has changed or was updated / fixed:

    New Stretch Database improvements in CTP 3.3 include:

    • Support to enable TDE on a remote DB if the local database has TDE enabled
    • Azure Stretch database edition preview with support for up to 60TB
    • Alter and drop index support for stretch tables
    • Add, alter and drop columns support for stretch tables
    • Point-in-time restore and geo-failover support
    • Query performance improvement

    SQL Server Management Studio improvements in this release include:

    • Additional Wizard features:
      • Added new SQL db credential management functionality
      • Integrated Table validation and selection updates to prevent stretch of unsupported datatypes at selection time
      • Table search functionality for table select page
      • Table selection column reordering
      • Support for temporal tables during table select
      • Integrated Azure sign in and SQL sign in credential
      • Add support for stretching using federated accounts
      • New firewall configuration and subnet detection functionality
      • Updated summary page details with pricing information
      • Improved SSMS visualization with StretchDB icons
    • Object Explorer:
      • Fly out menu updates to support disable and un-migration functionality
      • Un-migrate support functionality at database and table level

    Read the SSMS blog post to learn more.

    CTP 3.3 adds support in In-Memory OLTP for:

    • Automatic update of statistics on memory-optimized tables: The statistics for memory-optimized tables are now updated automatically, removing the need for running maintenance tasks that update statistics manually.
    • Sampled statistics for memory-optimized tables: Sampling of statistics for the data in memory-optimized tables is now supported, alongside the previously supported fullscan statistics. This reduces the time it takes to collect statistics for large tables.
    • Use of LOB types varchar(max), nvarchar(max), and varbinary(max) with built-in string functions (‘+’, len, ltrim, rtrim and substring) in natively compiled modules, and as return type of natively compiled scalar UDFs.
    • Memory-optimized tables with row size > 8060 bytes, using non-LOB types. CTPs 3.1 and 3.2 supported larger rows using LOB types; as of CTP 3.3, memory-optimized tables also support larger rows using the types varchar(n), nvarchar(n) and varbinary(n). See below for an example.
    • The OUTPUT clause can now be used with INSERT, UPDATE and DELETE statements in natively compiled stored procedures.
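
    Here is a minimal sketch of the last two points. The table and procedure names are mine (not taken from the CTP notes), and the script assumes a database that already has a MEMORY_OPTIMIZED_DATA filegroup:

    -- Memory-optimized table whose row can exceed 8060 bytes using only non-LOB types
    CREATE TABLE dbo.WideInMemProducts
    (
        ProductId INT IDENTITY NOT NULL PRIMARY KEY NONCLUSTERED,
        ShortDescription NVARCHAR(4000) NOT NULL,  -- up to 8000 bytes
        LongDescription  NVARCHAR(4000) NOT NULL   -- together well past the 8060-byte limit
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    GO

    -- Natively compiled procedure that uses the OUTPUT clause, allowed as of CTP 3.3
    CREATE PROCEDURE dbo.AddWideProduct
        @short NVARCHAR(4000), @long NVARCHAR(4000)
    WITH NATIVE_COMPILATION, SCHEMABINDING
    AS
    BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'English')
        INSERT INTO dbo.WideInMemProducts (ShortDescription, LongDescription)
        OUTPUT inserted.ProductId
        VALUES (@short, @long);
    END;
    GO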

    Autostats improvements in CTP 3.3

    Previously, statistics were automatically recalculated when the change exceeded a fixed threshold. As of CTP 3.3, we have refined the algorithm such that it is no longer a fixed threshold, but in general will be more aggressive in triggering statistics scans, resulting in more accurate query plans.
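
    If you want to watch this behaviour, one simple way (my own sketch, with a hypothetical table name) is to track the modification counter and the last update time of a statistics object before and after your data changes:

    -- Shows when each statistics object on the table was last updated and how many
    -- modifications have accumulated since then
    SELECT  s.name AS stats_name,
            sp.last_updated,
            sp.rows,
            sp.rows_sampled,
            sp.modification_counter
    FROM    sys.stats AS s
    CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
    WHERE   s.object_id = OBJECT_ID(N'dbo.YourTable');  -- replace with your table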

    Foreign Key Support enhancements in CTP 3.3

    SQL Server 2014 and earlier versions have limitations on the number of FOREIGN KEY references a table can contain, as well as the maximum number of incoming foreign key REFERENCES. The documented recommended maximum is 253, and when performing DML on tables with large numbers of incoming REFERENCES, statements time out with stack overflow error messages.

    This improvement increases the number of supported incoming foreign key REFERENCES to a table, while maintaining good performance for DML operations in both the referencing and the referenced table. The new maximum is 10,000. However, with the CTP 3.3 release, we have certain limitations on this feature:

    • Only the DELETE DML operation is supported on foreign key references that go beyond the current recommended maximum of 253; SQL Server validates that no referencing rows exist before deletion (see the sketch after this list).
    • Update and Merge operations are not supported with this release. Update will be available in RTM.
    • You will not see any change in behavior for cascading actions.
    • This is not available in ColumnStore, Hekaton or StretchDB.
    • This change is not applicable to a primary key table that is self-referencing (that is, if the table has a foreign key to itself). In this case, the behavior would remain the same as before.
    • This is not supported for partitioned foreign key tables for CTP 3.3. However, partitioned tables will be supported in RTM.
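
    As a rough sketch of what the new limit means in practice (the table names here are hypothetical, not from the CTP notes), you can now have far more than 253 tables referencing a single parent table and still run DELETE against it:

    -- One referenced table and 300 referencing tables, well beyond the old recommendation of 253
    CREATE TABLE dbo.Parent (ParentId INT NOT NULL PRIMARY KEY);

    DECLARE @i INT = 1, @sql NVARCHAR(MAX);
    WHILE @i <= 300
    BEGIN
        SET @sql = N'CREATE TABLE dbo.Child' + CAST(@i AS NVARCHAR(10)) + N'
            (ChildId INT NOT NULL PRIMARY KEY,
             ParentId INT NOT NULL REFERENCES dbo.Parent (ParentId));';
        EXEC sys.sp_executesql @sql;
        SET @i += 1;
    END;

    -- DELETE is the only DML supported beyond the old limit in CTP 3.3;
    -- SQL Server validates that no referencing rows exist before removing the parent row.
    DELETE FROM dbo.Parent WHERE ParentId = 42;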

    SQL Server Analysis Services (SSAS) includes multiple additions in this release. Read the SSAS CTP 3.3 blog post to learn more.

    SQL Server Reporting Services (SSRS) includes an updated preview of its brand-new web portal with additional functionality:

    • Add the KPIs and reports that matter most to you to your Favorites and view them all in one place.
    • Manage shared data sources for your KPIs and reports and perform other management tasks.

    Read the SSRS blog post to learn more.

    Master Data Services (MDS) improvements in this release include:

    • Business rule changes
      • New, easier-to-use web UI administration page
      • Support for NOT conditional operator
      • Support for ELSE section that contains a set of actions to execute if the IF condition is false
      • Removed management UI from Excel add-in
    • Added support for purging (hard-deleting soft-deleted members) of an entity version
    • Added to the web explorer page a button to open the current entity view in the Excel add-in

     

    Happy playing!

    Damian 

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 12

    Hello! Happy New Year to you all! I was really busy at the end of last year because I was trying to close all open projects. That turned out not to be possible, and now I know it!

    I am coming back to you with a list of posts that are a must-read!

    Hope you will have a lot of fun reading all of them.

      

    • Jack Li – What to do when you run out of disk space for In-Memory OLTP checkpoint files | CSS SQL Server Engineers – http://bit.ly/1OBAms9
    • Klaus Aschenbrenner – How to pollute your Plan Cache with parameterized SQL statements – http://bit.ly/1ntWgqK
    • Adam Machanic – SQLQueryStress: The Source Code – http://bit.ly/1RPJTCi
    • Mark Broadbent – Problem removing files from TempDB – http://bit.ly/1OsM1M9
    • Joe Celko – Declarative SQL: Using References – http://bit.ly/1JVwTYK
    • Michael Sorens – Documenting Your PowerShell Binary Cmdlets – http://bit.ly/1n6UL16

     

    Cheers

    Damian

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 11

    Hello! It is Tuesday already. I hope you had a great last week, especially since we have Christmas this week :). Maybe you will find some time to read the great posts I prepared? Hope you will have a lot of fun reading all of them.

      

    • Solomon Rutzky – Stairway to SQLCLR Level 1: What is SQLCLR? - SQLServerCentral – http://bit.ly/1mfXWmC
    • Louis Davidson – What Counts For a DBA – http://bit.ly/1MuyXkS
    • Merrill Aldrich – Automate ETL Testing: Cross-Server Data Comparison with PowerShell – http://bit.ly/1PkuebL
    • Gail Shaw – When naming transactions causes an error – http://bit.ly/1m6i2QR
    • Paul White – Temporary Tables in Stored Procedures – http://bit.ly/1QFLXw5
    • MSSQL Tips – Demystify TempDB Performance and Manageability – http://bit.ly/1OJsvZ7

     

    Cheers

    Damian

     

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 10

    Hello! It is Tuesday already but last week I found a lot of great resources.

    Hope you will have a lot of fun reading all of them! 

      

    • Nick Burns – Combining R and SQL Server to diagnose performance bottlenecks – http://bit.ly/1MfFi3r
    • Glenn Berry – Introduction to Storage Spaces Direct for SQL Server – http://bit.ly/1lN6aTI
    • SQL Server Database Engine Team – Built-in functions for compression/decompression in SQL Server 2016 – http://bit.ly/1Qr61C0
    • Greg Low – My Sessions from Ignite Australia on the Gold Coast now online – http://bit.ly/1YewA1M
    • Rob Farley – A new superpower for SQL query tuners – Number of Rows Read – http://bit.ly/1Y73SQe

     

    Cheers

    Damian

  • Review of LepideAuditor Suite for SQL Server


    Being an MVP gives me the chance to freely use and test some fancy tools, and this time I would like to share my thoughts about one of them.

    SQL Server Auditing is an important procedure to know the answers to questions such as who made a particular change, when, and from where was this change made. Unidentified changes can influence the whole IT environment, especially when other crucial servers are interconnected with SQL Server.

    LepideAuditor Suite audits every aspect of SQL Server. It provides a common platform to audit the server components such as Active Directory, Group Policy Objects, Exchange Server, SharePoint, and SQL Server. You can simultaneously audit multiple instances of any server component.

    System Requirements

    You can audit SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, and SQL Server 2014 using this software. The minimum system requirements to install this Lepide product are listed below.

    • Dual Core or higher processor
    • Minimum 4 GB RAM
    • Free space on hard disk
      • Minimum 1 GB
      • Recommended 2 GB
    • Any of the following 32-bit or 64-bit Windows operating systems:
      • Windows XP
      • Windows 7
      • Windows 8
      • Windows 8.1
      • Windows Server 2003
      • Windows Server 2003 R2
      • Windows Server 2008
      • Windows Server 2008 R2
      • Windows Server 2012
      • Windows Server 2012 R2
    • .NET Framework 4.0 or later

    Prerequisites

    You can save the auditing logs to the same or a different SQL Server. The following prerequisite software should be installed on the machine where SQL Server is installed.

    • Microsoft System CLR Types for SQL Server 2012
    • Microsoft SQL Server 2012 Management Objects Setup
    • .NET Framework 4.0

    Installation

    The installation of LepideAuditor Suite is quick and easy. You just have to download the setup file, execute it, and follow the onscreen steps. The setup files of the Web Console and the App Server come with the downloaded ZIP file. You can install the Web Console to access the audit reports from anywhere on the network using a browser.

    LepideAuditor Suite sends real-time notifications to the LepideAuditor App on your Android or Apple device through the default Lepide App Server. If you want to set up a custom App Server to send the notifications to the app, its installer file is available in the downloaded setup.

    Configuring the Software

    Once installed, you have to add the SQL Server that has to be audited. Before adding it, make sure the above prerequisites are installed on the server machine. At the welcome screen, you have to provide the login credentials of the local system or a domain administrator to run the software service.

    Welcome Screen of LepideAuditor Suite

     

    You can select an administrative user account and provide its password to run this service. Once configured, the following dialog box appears, where you have to select the component to be audited.

    Component Selection

    You have to select “SQL Server” and click “OK”. The following wizard will appear.

    Wizard to add a SQL Server option

    Two options are available here – Express Configuration and Advanced Configuration. The former adds SQL Server with minimal configuration, whereas the latter lets you configure every aspect of SQL Server auditing. Click “Next” to start adding the SQL Server.

    Details of SQL Server

    Enter the details of the SQL Server. You can also click the “Browse” button to select the SQL Server, and the “Test Connection” button to test the connection to the target SQL Server. You can select whether to add the server with or without an agent: in agent-based auditing an agent is installed on the machine where SQL Server is installed, whereas no agent is installed if you select “Without Agent” for agentless auditing. Click “Next” once you are done. The next step lets you configure the health monitoring of SQL Server.

    Health Monitoring Settings

    You have to provide the name or IP Address and login credentials of the administrator of the computer where SQL Server is installed. Click “Next”.

    Audit Settings

    Here, you have to select what you want to audit from the following options.

    1. Audit Everything: Everything on SQL Server will be audited.
    2. Audit Server: Only the server, excluding its databases, will be audited.
    3. Audit Server with selected objects: The server along with the selected databases will be audited. The available databases are listed when you select this option, and you can choose which ones to audit.

    Audit Server objects with the selected databases

    Once you have configured what to audit, you can click “Next”.

    Object Settings

    Here, you can select the desired server and database objects, along with their operations, to be audited. Check the objects that have to be audited and uncheck the others to exclude them from auditing. Click the operation cell for an object to select the object operations to be audited.

    Operations of an object

    You can check the objects that have to be monitored. Uncheck to exclude them from auditing. Once you have selected the objects and their operations to be audited, you can click “Next”.

    User Settings

    The available options are – All Users and Selected Users. You can select the latter to select the users to be audited.

    Once you have made your selection, click “Next” to proceed. The database settings appear onscreen.

    Database Settings

    You have to provide the SQL Server details and create a new database or select an existing database to store the auditing logs. There are small buttons to save the SQL Server as default for Auditor Suite or load the settings from an already saved default SQL Server.

    Click “Next” to proceed. The next step lets you enable the archiving of logs and schedule it.

    Archive Settings

    Enter the SQL Server details, select a database, and configure the schedule. The software will automatically archive the logs as per the provided schedule.

    Click “Finish” to add the SQL Server. A message box will appear onscreen asking you to restart the software. Once restarted, you will notice a new tab for the added SQL Server in the “Radar” tab. The “Settings” tab displays the settings for the listed SQL Servers, whereas the “Audit Reports” tab shows the audit reports for SQL Server.

    Glimpses of Configuration Changes

    The Radar tab shows summarized graph reports of all changes being made in SQL Server.

    SQL Server Tab

    In addition to the default tab, you can create multiple custom views. The “Radar” tab for SQL Server lets you keep an eye on the most critical changes, such as database modification trends, table modification trends, user modification trends, top 10 failed logins, top administrators, resource utilization, overall change trends, and LiveFeed updates.

    Audit Reports

    You can switch to the “Audit Reports” tab to view 50+ predefined reports that highlight every change made in the configuration of your SQL Servers.

    Audit Report

    Here, you can view the audit report in both text and graph views.

    Graph Report

    You can apply date range, working hours, and keyword filters to these reports. In addition, the reports can be grouped by column, and you can search for a particular event.

    Filtered table

    These reports can be saved on the disk and scheduled to be delivered automatically at predefined intervals.

    Create Schedule

    Alerts, Updates, and Notifications

    You can configure real-time alerts that can be delivered to the provided email addresses, displayed at LiveFeed widget of Radar Tab, and sent to the LepideAuditor App installed on your Android or Apple device.

    Create Alert

    These alerts can be used to show the live updates in LiveFeed widget in the Radar tab of SQL Server.

    LiveFeed Widget

    You can also receive the notifications in LepideAuditor App. This app is available in both Google Play Store and Apple App Store.

    App Notifications of SQL Server

    Search in Notifications

    Options in App

    Share Notifications

    In addition, you can install the LepideAuditor Suite Web Console to host a report server that lets selected users view selected audit reports in a web browser from anywhere on the network.

    All Server Object Modification Report in Web Console

    The best part is that LepideAuditor Suite also has a dedicated report on Console Auditing that shows the changes made to the configuration of the software itself.

    Conclusion

    Having seen LepideAuditor Suite at work, I can recommend installing this software for auditing one or more SQL Servers in your environment. It audits and tracks every change made to the configuration of SQL Server. The vast set of predefined audit reports, along with the options to customize, save, filter, and email the reports periodically, makes the task easier. Real-time alerts via email, the LiveFeed widget, and the LepideAuditor App keep you notified about critical changes.

    Product page: http://www.lepide.com/lepideauditor/sql-server.html

    Download trial: http://www.lepide.com/lepideauditor/download.html

     

    Cheers

    Damian 

     

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 9

    Hello! It is Tuesday already but last week I found a lot of great resources. Hope you will have a lot of fun reading all of them! 

      

    • Rob Farley – Anti-Harassment Policies and Codes of Conduct – http://bit.ly/1PWUVnZ
    • SQL Server Blog – SQL Server 2016 Community Technology Preview 3.1 is available – http://bit.ly/1OrySUU
    • SQL Server Engineering Team – Announcing SQL Server Management Studio - November 2015 Release – http://bit.ly/1IRwEYQ
    • Raul Gonzales – The Mysterious Case of the Missing Default Value – http://bit.ly/1OfHr0H
    • Joe Chang – Join and Query hints – serious DBAs need to learn this – http://bit.ly/1ONt9cf

     

    Cheers

    Damian

     

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 8

    Hello! It is Monday, so I am delivering new articles for you to read!

    Hope you will have a lot of fun reading all of them! 

      

    • Raj Kumar Beniwal – How to Import And Display Data From Web Page Using Power BI Desktop – http://bit.ly/1QQsic6
    • SSW TV – Asynchronous Programming with Xamarin | Filip Ekberg at Xamarin Hack Day Sydney – http://bit.ly/21pSKgG
    • Ajit Jaokar – Recurrent neural networks, Time series data and IoT – Part One – http://bit.ly/1Njnpl4
    • Ron Matchoro – Azure SQL Database Threat Detection now in Public Preview – http://bit.ly/1NExCz7
    • Andy Leonard – Biml 101 Recording and Files Are Now Available! – http://bit.ly/1XtcoZv

     

    Cheers

    Damian

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 7

    Hello! It is Monday, so I am delivering new articles for you to read!

    Hope you will have a lot of fun reading all of them!

     

      

    • CSS SQL Server Engineers – Are my statistics correct? – http://bit.ly/1X2OApU
    • Chris Webb – Power BI DirectQuery Mode: Not Just SSAS DirectQuery v2.0 – http://bit.ly/1SxnFkE
    • Brent Ozar – Is My SQL Server Too Big for Virtualization or the Cloud? – http://bit.ly/1HQ6mei
    • Doug Lane – The Five Stages of Dynamic SQL Grief – http://bit.ly/20Xjl4c
    • Greg Low – Data Tales #4: The Case of the Phantom Duplicate – http://bit.ly/1QqLrkV
    • Rob Farley – What’s driving your data model? – http://bit.ly/1lrDL5a
    • Itzik Ben-Gan – Packaging Intervals with Priorities – http://bit.ly/1NX80fj

     

     Cheers

    Damian

     


  • What you should have read :) during the weekend - weekly SQL Server blogs review part 6

    Hello! I have been really busy last week, so instead of publishing my weekly review on Monday morning I am doing it now. It is Wednesday, 11 November, and we have Independence Day here in Poland! Happy reading :)


     

     

    • Joe Chang – Computers without ECC memory are crap - no exceptions – http://bit.ly/1NJvy7i
    • Master Data Services Team – What's New in Master Data Services - SQL2016 CTP3 – http://bit.ly/1PofsBF
    • CSS SQL Server Engineers – The given network name is unusable because there was a failure trying to determine if the network name is valid for use by the clustered SQL instance – http://bit.ly/1PBQInU
    • Neeraj Tripathi – SQL Server Heaps, and Their Fragmentation – http://bit.ly/1PBQNId
    • Mine / R-bloggers – A two-hour introduction to data analysis in R – http://bit.ly/1LKRvjS

     

    Cheers

    Damian 

     

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 5

    Hello!

    Welcome to November! I missed writing to you last week due to a project, but here are my weekly reading tips for you :).

    Maybe you will find something interesting there, too!

    Enjoy the Sunday!

     

    • Dejan Sarka – Data Mining Algorithms – Decision Trees – http://bit.ly/1LKR7lx
    • Jason Brimhall – Tricks and Treats with XE – http://bit.ly/1l2pStT
    • Pankaj Manek – Freaky statistics feature – http://bit.ly/1ShkpdL
    • Erik Darling – SQL Server Features I’d Like To See, PostgreSQL Edition – http://bit.ly/1WpNnxM
    • Henk Van Der Valk – Introducing Microsoft SQL Server 2016 R Services – http://bit.ly/1LKRvjS

     

    Cheers

    Damian 

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 4

    Hello!

    Here are my weekly reading tips for you :). Maybe you will find something interesting there, too!

    Enjoy the Monday!

     

    • Michael Swart – Don’t abandon your transactions – http://bit.ly/1kjukEf
    • Thomas LaRock – Improve database performance without changing code – http://bit.ly/1jwZGXV
    • Robert Davis – Deciding between merge and peer-to-peer replication – http://bit.ly/1KjLu9H
    • Erik Darling – SQL Server Features I’d like to see, Oracle Edition – http://bit.ly/1ZQCg0j
    • Kendra Little – Training plan for a sysadmin to manage availability groups – http://bit.ly/1hNbKCv

     

    Cheers

    Damian 

     

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 3

    Hello!

    Here are my weekly reading tips for you :). Maybe you will find something interesting there, too!

    Enjoy the Monday!

     

    • Simon Liew – How efficient is your covered index? – http://bit.ly/1L7vknz
    • Nitin Gupta – Migrate existing DB Mail Exchange Configuration with New – http://bit.ly/1ZuFuWS
    • David Klee – Level 4 - Building the Hyper-V VM – http://bit.ly/1OwlZIy
    • Casimir Saternos – SQL and R – http://bit.ly/1OtoaOo
    • Marcin Policht – Azure SQL Database – Row-Level Security – http://bit.ly/1jfb76w
    • Sergey Gigoyan – Storing passwords in a secure way in a SQL Server database – http://bit.ly/1jr5TUy

     

     

     

    Cheers

    Damian 

  • SQL Server 2016 Insights by Polish MVP's - part 2

    Hello SQL Folks!

    In one of my latest posts I wrote about the SQL Server 2016 Channel 9 series that is being prepared by Polish MVPs.

    Now we have 4 recordings there but expect many more :). You can find more details about the series here: http://bit.ly/1MNlAAO

    The first session is dedicated to the security enhancements that will be available in SQL Server 2016. This session was prepared by Marcin Szeliga, our PLSSUG President. The link to the recording is here: http://bit.ly/1NJseZq

    The second session is about hybrid scenarios and especially about a new feature called Stretch Database. This recording was done by Bartek Graczyk. The link to the recording is here: http://bit.ly/1KmGNkv

    The third session was prepared by me! This session is about monitoring data changes using Temporal Tables. The link to the recording is here: http://bit.ly/1U7Vtdu

    The fourth session was also prepared by me! This time I was discussing the changes to the query optimization process. The main things I covered are LQS and query statistics, plus of course the new hints you can use in your queries! The link to the recording is here: http://bit.ly/1Gu5kPf

    Well, I think it is a good opportunity to learn about SQL Server 2016 before it is generally available :). Please stay tuned as more recordings will be posted soon!

    And stay tuned for part 4 of the temporal tables series, which I am going to blog about this week!

     

    Cheers

    Damian

     

     

  • What you should have read :) during the weekend - weekly SQL Server blogs review part 2

    Hello!

    Here are my weekly reading tips for you :). Maybe you will find something interesting there, too!

    Enjoy the Monday!

     

    • Sergio Govoni – 24 Hours of PASS (September 2015): Recordings Available! #24HOP – http://bit.ly/1QERUVY
    • Nakul Vachhrajani – SQL Server Stored Procedures and SET options – http://bit.ly/1L11qiH
    • Dwain Camps – How to Avoid Conditional JOINs in T-SQL – http://bit.ly/1PexxAF
    • Greg Low – Data Tales #3: The Case of the Stubborn Log File – http://bit.ly/1LsmiFe
    • Marco Russo – Synoptic Panel for Power BI Best Visual Contest #powerbi #contest – http://bit.ly/1WHs7k3

    Cheers

    Damian 

  • Temporal table behind the scenes - part 3


    Hello SQL Folks. Today I would like to show how to implement a temporal table solution when you already have an existing table in the database. But that will not be all you find in this article. At the end I would like to show you an example where a new column is added to a temporal table.

    The previous parts can be found here:

    Part 1

    Part 2

     

    I will not talk about a scenario where a new, empty temporal table is created first and then the data from the existing table is transferred into it. That is not the preferred way of dealing with temporal tables; in fact, it is the worst scenario I can think of.

    Let’s do it the right way. The table was created in part 1 of this series and has the following structure:

    CREATE TABLE dbo.ProductsV3
    (
        ProductId INT IDENTITY NOT NULL PRIMARY KEY,
        Name NVARCHAR(100) NOT NULL,
        Number INT NOT NULL,
        Price MONEY NOT NULL
    )

     

    As you can see, this is a regular table without the columns that are necessary to make it a temporal table. First of all, let’s load some data into this table. That will emulate the real environment we usually have:

    INSERT INTO [dbo].[ProductsV3]([Name],[Number],[Price])
    SELECT [Name],[Number],[Price]
    FROM [dbo].[ProductsV2]
    GO

     

    By the way, the table ProductsV2 is the temporal table we worked on in the last blog post. By definition, a temporal table consists of two tables: the table that holds the current data and the history table that holds the changed records. The statement above transfers only the current data, without touching the history table. That makes sense of course, but I was asked about it this week. Looking at the query execution plan of the INSERT ... SELECT makes it clear: only the current table is read.
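
    A quick sanity check (a small sketch of my own, not part of the original demo) is to compare the number of current rows with the number of rows returned by FOR SYSTEM_TIME ALL, which also includes the history:

    -- Current rows only – this is what the INSERT ... SELECT above copied
    SELECT COUNT(*) AS CurrentRows FROM dbo.ProductsV2;

    -- Current rows plus all historical versions
    SELECT COUNT(*) AS AllVersions FROM dbo.ProductsV2 FOR SYSTEM_TIME ALL;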

     

     

    According to Books Online, the process of converting a regular table to a temporal one should be done in a transaction and consists of the following steps:

    • Add two non-nullable columns of type datetime2 to the existing table. The precision of the columns is up to you.
      • The columns must have default constraints, as we already have data in the table.
    • Mark the two columns as PERIOD FOR SYSTEM_TIME.
    • Drop the defaults, as SQL Server will take care of managing the data in these columns.
    • Set SYSTEM_VERSIONING to ON and, optionally, provide a history table name; if you do not, SQL Server will generate one for you.

     

    Here is the code:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3 ADD
      SYSSTART DATETIME2(0) GENERATED ALWAYS AS ROW START NOT NULL
        CONSTRAINT DT_ProductsV3_sysstart DEFAULT('19000101'),
      SYSEND DATETIME2(0) GENERATED ALWAYS AS ROW END NOT NULL
        CONSTRAINT DT_ProductsV3_sysend DEFAULT('99991231 23:59:59'),
      PERIOD FOR SYSTEM_TIME (SYSSTART, SYSEND);

    ALTER TABLE dbo.ProductsV3
      DROP CONSTRAINT DT_ProductsV3_sysstart;

    ALTER TABLE dbo.ProductsV3
      DROP CONSTRAINT DT_ProductsV3_sysend;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist ) );

    COMMIT TRAN;

     

     

    And that is it! No sweat at all! Now let’s move on to the other situation, where the table is already a temporal one and we have a new column to add to it. I bet you can already imagine that this code will not work as long as ProductsV3 is a temporal table:

    ALTER TABLE dbo.ProductsV3 ADD Colour NVARCHAR(20) NULL

     

    The error message says the same:

    Msg 13550, Level 16, State 1, Line 156

    Add column operation failed on table 'TemporalDemo.dbo.ProductsV3' because it is not supported operation on system-versioned temporal tables.

     

    Let’s try another way then: remove system versioning from the ProductsV3 table, add the column, and turn system versioning on again. I will do the operation in a transaction, so that if anything goes wrong in between, the whole operation is rolled back:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = OFF );

    ALTER TABLE dbo.ProductsV3 ADD
      Colour NVARCHAR(20) NULL;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist, DATA_CONSISTENCY_CHECK = ON ) );

    COMMIT TRAN;

     

    The error message says:

    Msg 13523, Level 16, State 1, Line 168

    Setting SYSTEM_VERSIONING to ON failed because table 'TemporalDemo.dbo.ProductsV3' has 7 columns and table 'TemporalDemo.dbo.ProductsV3Hist' has 6 columns.

     

     

    That leads me to the conclusion that the new column must be added to both tables in the same transaction, since both tables must have exactly the same structure. I hope you noticed that I also included the clause DATA_CONSISTENCY_CHECK = ON along with HISTORY_TABLE. This data check simply verifies that the column defined as SysEndTime has values greater than or equal to those of the column defined as SysStartTime:

    SysEndTime >= SysStartTime

     

    By the way, when SYSTEM_VERSIONING is turned on, SQL Server checks the following (according to the official documentation):

    • The number of columns is the same in both the current and the history table.
    • The data types match for each column between the current and the history table.
    • The period columns are set to NOT NULL.
    • The current table has a primary key constraint and the history table does not have a primary key constraint.
    • No IDENTITY columns are defined in the history table.
    • No triggers are defined in the history table.
    • If the current table has period columns specified as HIDDEN, the matching columns in the history table are also hidden.
    • No foreign keys are defined in the history table.
    • No table or column constraints are defined on the history table. However, default column values on the history table are permitted.
    • The history table is not placed in a read-only filegroup.
    • The history table is not configured for change tracking or change data capture.

     

    You might guess that this code will not work either. I tried to add the column to both tables, but the data sizes of the columns are not equal:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = OFF );

    ALTER TABLE dbo.ProductsV3 ADD
      Colour NVARCHAR(20) NULL;

    ALTER TABLE dbo.ProductsV3Hist ADD
      Colour NVARCHAR(10) NULL;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist, DATA_CONSISTENCY_CHECK = ON ) );

    COMMIT TRAN;

     

    The error message says:

    Msg 13525, Level 16, State 1, Line 202

    Setting SYSTEM_VERSIONING to ON failed because column 'Colour' has data type nvarchar(10) in history table 'TemporalDemo.dbo.ProductsV3Hist' which is different from corresponding column type nvarchar(20) in table 'TemporalDemo.dbo.ProductsV3'.

     

     

    What I also tried but without success is:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = OFF );

    ALTER TABLE dbo.ProductsV3 ADD
      Colour NVARCHAR(20) NULL;

    ALTER TABLE dbo.ProductsV3Hist ADD
      Colour NVARCHAR(50) NULL;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist, DATA_CONSISTENCY_CHECK = ON ) );

    COMMIT TRAN;

     

    The error message is very similar to the one displayed last time:

    Msg 13525, Level 16, State 1, Line 220

    Setting SYSTEM_VERSIONING to ON failed because column 'Colour' has data type nvarchar(50) in history table 'TemporalDemo.dbo.ProductsV3Hist' which is different from corresponding column type nvarchar(20) in table 'TemporalDemo.dbo.ProductsV3'.

     

     

    In the end, we have to add the column in such a way that it is identical in both tables:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = OFF );

    ALTER TABLE dbo.ProductsV3 ADD
      Colour NVARCHAR(20) NULL;

    ALTER TABLE dbo.ProductsV3Hist ADD
      Colour NVARCHAR(20) NULL;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist, DATA_CONSISTENCY_CHECK = ON ) );

    COMMIT TRAN;

     

    What about removing a column from the temporal table? Well, you have to follow the same pattern:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = OFF );

    ALTER TABLE dbo.ProductsV3 DROP COLUMN Colour;

    ALTER TABLE dbo.ProductsV3Hist DROP COLUMN Colour;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist, DATA_CONSISTENCY_CHECK = ON ) );

    COMMIT TRAN;

     

    And the same pattern also applies when you would like to change the data type of a column, as in the sketch below.
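
    For illustration only (this example is mine, not part of the original demo scripts), widening the Name column would look like this:

    BEGIN TRAN;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = OFF );

    ALTER TABLE dbo.ProductsV3     ALTER COLUMN Name NVARCHAR(200) NOT NULL;
    ALTER TABLE dbo.ProductsV3Hist ALTER COLUMN Name NVARCHAR(200) NOT NULL;

    ALTER TABLE dbo.ProductsV3
      SET ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = dbo.ProductsV3Hist, DATA_CONSISTENCY_CHECK = ON ) );

    COMMIT TRAN;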

    That is all for today! Stay tuned – in the next blog post I will try to show what happens when you do a SELECT statement on the temporal table.

     

    Cheers

    Damian
