THE SQL Server Blog Spot on the Web


Damian Widera

  • How to rename a column when CDC is enabled

    I was recently asked whether Always On supports metadata operations on tables, namely adding, dropping, and renaming columns. I said it was possible. However, I was then told that renaming a column produced an error about “replication”:

    Caution: Changing any part of an object name could break scripts and stored procedures.
    Msg 4928, Level 16, State 1, Procedure sp_rename, Line 655
    Cannot alter column 'Name' because it is 'REPLICATED'.

    I knew that replication was not used in that scenario, so my next guess was that the error must be caused by Change Data Capture (CDC). And I was right. Let’s walk through a simple example to see how CDC works and what happens when you try to rename a column.

    I downloaded the AdventureWorks2014 database from the CodePlex page and enabled CDC in it. I also created a role that can be used to grant access to the CDC data, although the role is not essential to this example.

     USE AdventureWorks2014;

    GO

    CREATE ROLE [cdc_admin]

    GO

    EXEC sys.sp_cdc_enable_db

    GO

     

    Next, I configured a separate filegroup to store all CDC-related objects, for performance reasons:

    ALTER DATABASE AdventureWorks2014

    ADD FILEGROUP [CDC_Objects];

    GO

    ALTER DATABASE AdventureWorks2014

    ADD FILE (    NAME = N'CDC_Objects',

                         FILENAME = N'C:\Temp\Adwentureworks_cdc.ndf',

                         SIZE = 1024MB , FILEGROWTH = 1024MB )

    TO FILEGROUP [CDC_Objects];

    GO

    The last configuration step is to choose the table that will be under the control of CDC. At this point you can also specify the role, the capture instance name (which is important when a table has two capture instances), and the filegroup name.

    EXEC sys.sp_cdc_enable_table

           @source_schema = N'Sales',

           @source_name = N'Store',

           @role_name = N'cdc_admin',

           @capture_instance = N'SalesStore',

           @filegroup_name = N'CDC_Objects';

    GO

    With the CDC configuration done, I tried to rename one of the columns of the Sales.Store table:

    EXEC sys.sp_rename 'Sales.Store.Name' , 'SomeNewBetterName', 'COLUMN'

     

    I was immediately greeted by the error:

    Caution: Changing any part of an object name could break scripts and stored procedures.

    Msg 4928, Level 16, State 1, Procedure sp_rename, Line 655

    Cannot alter column 'Name' because it is 'REPLICATED'.

     

    I checked the sys.columns view and it was true: the is_replicated flag was set for all columns configured for CDC.

    This makes perfect sense to me, as CDC uses the log reader agent to detect changes.

    Change data capture and transactional replication always use the procedure sp_replcmds to read changes from the transaction log.

     

    SELECT is_replicated FROM sys.columns

    WHERE object_id = object_id(N'Sales.Store')

    AND name = N'Name'

     

    So what should we do when we need to rename a column? There is only one way I am aware of: CDC must first be disabled for the table. But be careful before you run the statement, because all related CDC capture tables will be dropped immediately and without any warning.

    EXEC sys.sp_cdc_disable_table

           @source_schema = N'Sales',

           @source_name = N'Store',

           @capture_instance = N'all';

    GO

     

    Now it is possible to rename the column:

    EXEC sys.sp_rename 'Sales.Store.Name' , 'SomeNewBetterName', 'COLUMN'

     

    Once all changes are done, CDC should be enabled again:

    EXEC sys.sp_cdc_enable_table

           @source_schema = N'Sales',

           @source_name = N'Store',

           @role_name = N'cdc_admin',

           @capture_instance = N'SalesStore',

           @filegroup_name = N'CDC_Objects';

    GO

     

    And that's all for today! 

    Cheers,

    Damian

     

  • Playing with the Filestream storage on Sunday

    Today I was asked to do a pretty easy thing: add FILESTREAM support to a table that already stores some documents. The customer wants to keep the data outside the database because he uses SQL Server 2014 Express edition. According to the MSDN page, that edition is limited to 10 GB per database, and FILESTREAM data does not count toward this limit.

    But let’s get back to work now. The first thing we have to do is enable FILESTREAM support at the server level. That can be done using SQL Server Configuration Manager or the sp_configure stored procedure. I prefer the latter. Please remember: you have to be a sysadmin to configure this.

    use demo_db

    go

     

    EXEC sys.sp_configure N'filestream access level', N'2'

    GO

    RECONFIGURE WITH OVERRIDE

    GO

     

    The second step is to make the database aware of FILESTREAM. To do so, you need to add a new filegroup to the database. This filegroup is a special one, as it will contain the FILESTREAM data. By the way, it is of course possible to have more than one such filegroup in a database, but only one of them can be marked as default.

    ALTER DATABASE demo_db ADD FILEGROUP demoFS CONTAINS FILESTREAM

    GO

     

    Once the filegroup is in place, it’s time to add a file to it. It is also possible to have more than one file in a FILESTREAM filegroup.

    ALTER DATABASE  demo_db

    ADD FILE

    (

        Name = demo_dbFS,

        FILENAME = 'C:\dbdemo',

        Maxsize = UNLIMITED

    ) TO FILEGROUP demoFS;

     

    Now it’s time to get to the table level. The customer has one table that should contain the FILESTREAM data. Currently the data is stored in a column called Data of type varbinary(MAX). To make the table FILESTREAM-aware, you have to do three things:

    • Add a column of type uniqueidentifier marked as ROWGUIDCOL.

    • Set the FILESTREAM_ON option on the table.

    • Add a new column of type varbinary(MAX) and mark it for FILESTREAM data.

     

    Here are the three statements:

    ALTER TABLE dbo.Documents

    ADD filestreamguidcol uniqueidentifier not null ROWGUIDCOL unique default newid()

    GO

     

    ALTER TABLE dbo.Documents

    SET (filestream_on=demoFS)

    GO

     

    ALTER TABLE dbo.Documents

    Add DataFS varbinary(max) FILESTREAM null

    GO

     

    With the necessary setup done, it is time to transfer the data from the existing column (Data) to the FILESTREAM column (DataFS). I did it by running a simple UPDATE:

    UPDATE dbo.Documents

    SET DataFS = Data

    GO

     

    Now it’s safe to drop the Data column:

    ALTER TABLE dbo.Documents

    DROP COLUMN [Data]

    GO

     

    The last step in this example is to rename the new column (DataFS) to the name that was dropped in the previous step (Data), so I do not have to change the existing application's data access layer. If you wish, you can refresh all dependent objects at the end.

    EXEC sp_rename

        @objname = 'Documents.DataFS',

        @newname = 'Data',

        @objtype = 'COLUMN'

    GO

    By the way, there are at least two great sources of information (including internals) and some myth-busting done by Paul Randal and Bob Beauchemin.

    Cheers,

    Damian

  • My MVA course about SQL Server on Azure - PaaS or Iaas?

    If you are tired today, maybe you could take your time and spend the whole weekend with SQL Server 2014 and Azure?

    The new MVA course is ready :) and waiting for you. Just visit the link http://bit.ly/1CXZT90 and enjoy. Watch out - it's in Polish :)

    The topics covered:

    • SQL Server as IaaS  - means SQL Server on the Azure VM
    • SQL Server as PaaS - means Azure SQL Database
    • IaaS vs PaaS - what to choose and when

    more coming....

     

    One of the next MVA courses I am going to prepare will be about... wait for it... Oracle & Azure ;)

     

    Cheers

    Damian 

  • SQLRally Nordic - after conference thoughts

    Hello All

    I was extremely honored to participate in SQLRally Nordic this year. The event took place this week in Copenhagen and I think it was a great success from every angle. Many thanks to the organizers and many thanks to the community for being there. There was a great atmosphere during the whole event and, what I liked most as a speaker, I had a great room for my talk!

    I hope that the event will take place in the coming years as well, but for now I would like to invite you to the annual PLSSUG conference in Wrocław. Let's meet in May. At least half of the tracks will be in English, and you can also meet many world-class speakers. Just visit the page http://sqlday.pl

     

    Cheers

    Damian 

  • Memory leak occurs when you run DBCC CHECKDB against a database in SQL Server 2014 - fixed in SQL Server 2014 CU6

    Great news today!

    At the end of January, a Connect item (https://connect.microsoft.com/SQLServer/Feedback/Details/1090203) described a possible memory leak in SQL Server 2014.

    The details are as follows:

    sys.dm_os_memory_clerks for type = MEMORYCLERK_SQLQUERYEXEC and sys.dm_os_memory_objects for type = MEMOBJ_COLUMNBATCH show ever increasing values when columnstore tables are integrity checked. For example, if you run DBCC CHECKDB on a database that contains columnstore tables or you run DBCC CHECKTABLE on a columnstore table you will find that the memory shown for the above mentioned clerk and object types is ever increasing.  
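    To watch the symptom yourself, you can poll the two DMVs named above before and after running the integrity check; this is just a diagnostic sketch, not part of the fix:

    ```sql
    -- Diagnostic sketch: memory reported for the clerk and object types
    -- mentioned in the Connect item. On an affected build these values
    -- keep growing across repeated DBCC CHECKDB / DBCC CHECKTABLE runs
    -- against columnstore tables.
    SELECT [type], SUM(pages_kb) AS total_pages_kb
    FROM sys.dm_os_memory_clerks
    WHERE [type] = N'MEMORYCLERK_SQLQUERYEXEC'
    GROUP BY [type];

    SELECT [type], COUNT(*) AS memory_object_count
    FROM sys.dm_os_memory_objects
    WHERE [type] = N'MEMOBJ_COLUMNBATCH'
    GROUP BY [type];
    ```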

    The good news is that the problem is fixed and the solution will be available in the SQL Server 2014 CU6. Find more details: http://support.microsoft.com/kb/3034615 

     

    Cheers

    Damian 

     

  • SQL Server 2012 SP2 Cumulative Update 4 Released - information from the SQL Server SE Operations Team

    As per information from Microsoft:

    "It is my pleasure to announce the release of SQL Server 2012 SP2 Cumulative Update 4 on behalf of the team.  SQL Server 2012 SP2 Cumulative Update 4 incorporates 42 issues.

    This CU will be available for download from the associated cumulative KB article that has also been published.  Customers are directed to contact CSS to get the CU build or obtain the hotfix package through the new self-service feature by clicking on the “Hotfix Download Available” button found at the top of the KB article."

    To me, the most important fix is the one regarding the possibility of errors 17066 or 17310 during SQL Server startup (immediately after database recovery completes and client connections are enabled). Check out the link http://support.microsoft.com/kb/3027860 for details.

    Link to the official site of this update: http://support.microsoft.com/kb/3007556/en-us

    Cheers

    Damian 

     

  • Kimberly Tripp is going to visit PLSSUG in January (online meeting)

    Paul Randal recently proposed that SQLskills could give sessions at user group meetings. I emailed him immediately and, to make a long story short, the PLSSUG (Polish SQL Server Users Group) will be hosting Kimberly Tripp on 29th January at 6 PM local time. Please visit the community webpage.

    Here are some details:

    Stored Procedure Optimization Techniques

    Presenter: Kimberly L. Tripp

    Abstract: Kimberly will discuss stored procedure caching and the potential for performance problems due to a problem known as parameter sniffing. There are some common solutions, and while they can drastically improve performance, they can also cause other problems. Sometimes a hybrid solution is needed. From this session, you’ll know the best way to write procedures so that stable requests are cached and unstable requests are not, giving you the most optimal performance without wasting resources.
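    As a small taste of the topic, here is a minimal sketch of two common mitigations the abstract alludes to; the table, column, and procedure names are hypothetical, invented for this example:

    ```sql
    -- Hypothetical procedure illustrating common parameter-sniffing mitigations.
    -- dbo.Orders, CustomerId, OrderId and OrderDate are assumed names.
    CREATE PROCEDURE dbo.GetOrdersByCustomer
        @CustomerId INT
    AS
    BEGIN
        -- Option 1: recompile on every execution - no cached plan, so no
        -- sniffing, at the cost of a compilation on each call
        SELECT OrderId, OrderDate
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId
        OPTION (RECOMPILE);

        -- Option 2 (alternative): build a "generic" plan from average density
        -- instead of the first sniffed value:
        --   ... OPTION (OPTIMIZE FOR (@CustomerId UNKNOWN));
    END
    GO
    ```

    Each option trades something away (CPU for compilations, or plan quality for specific values), which is exactly the kind of trade-off the session covers.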

     

    Bio: Kimberly L. Tripp, President / Founder, SQLskills.com

    Kimberly has been a Microsoft SQL Server MVP and a Microsoft Regional Director for over 10 years; she's an author, consultant, and top-rated speaker having spoken at PASS, TechEd, SQLintersection, and numerous other events around the world. She has over 20 years of experience designing and tuning SQL Server solutions with a focus on optimizing procedural code, indexing strategies, statistics, and partitioning. When she's not tuning SQL servers, she's diving and photographing all sorts of underwater critters such as frogfish with her husband, Paul S. Randal.

    I hope that this is great news.

    And I promise - this will not be the only meeting :).

    More details will follow on the community webpage (as well as information how to participate in that meeting) - http://plssug.org.pl/

    Cheers,

    Damian

      

     

     


  • SQLDay 2015 - should we plan it now? Yes - check out who will join us :)

    SQL Folks, SQL Geeks and others....

    Is it the right time to plan the next conference when the previous one finished just last week? YES!!!! It is the right time!!! PLSSUG is planning to have the biggest community conference ever. It will start on 11th May (Monday, preconference day) and run until Wednesday, 13th May (at least). The first great news is that Adam Machanic will be with us, so do not wait: register for his precon and sessions :). Adam will not be the only superstar next year, but more details will follow. Remember that this year Bob Ward, Klaus Aschenbrenner and Alberto Ferrari were the special guests. And do not forget about the local leaders - you should have seen them last week in Lublin. So you know what I am talking about.

    So stay tuned and visit the PLSSUG page frequently.

    Cheers

    Damian 

     

     

  • Conference - ITAD Bielsko Biała

    Good evening

    Today I gave a lecture at the academic conference ITAD (http://www.itadbb.pl/). It is amazing that 300 people came and took part in the event.

    I was talking about SQLCLR, which seems to be my main topic in recent months, because I use it over and over again in my projects. The idea was to share best practices with people who are probably just starting to use this feature. Many thanks to the organizers for the invitation, and congratulations once again - it was a really impressive event. Hope to see you next year, as I promised to show you some T-SQL stuff that makes SQL Server fly (not cry).

    If you want to see what I was talking about, you can download the materials from the drive. Many thanks to Adam Machanic for help in preparing the demos and for sharing knowledge about SQLCLR.

    Cheers,

    Damian 

  • SQL Day Lite 2014 Lublin - part II - photos

    Hello again

    I am pleased to announce that pictures taken during the conference are available. Just take a look here. Thanks to Kamil Nowiński for the photos.

    It is also a good moment to say "thanks" to the PLSSUG (Polish SQL Server Users Group) for the organization. Again, it was fabulous :) Over 100 people attended the Saturday conference, and over 45 were with us on Friday during the workshops (precon). Please visit the PLSSUG site for the recorded sessions (coming soon...).

    The agenda of the conference was as follows:

    Łukasz Grala - SQL Server Analysis Services - Multidimensional vs Tabular

    me - All you wanted to know about the DBCC 

    Grzegorz Stolecki - Reporting Services - the rendering report extensions

    Łukasz Grala - What everybody (not only DBA's) should know about the backup

    Marek Adamczuk & Paweł Potasiński - Windowing Functions

    Maciej Pilecki - Event Notification 

      

    Cheers

    Damian 

  • SQLDAY Lite 2014 - Lublin.

    I had the privilege of participating in the SQLDay Lite 2014 conference that was held in Lublin this week.
    I had the opportunity to run a one-day workshop (on Friday) and also gave a talk on Saturday.
    The workshop was not easy :) as it was dedicated to SQLCLR programming, so we (me & the participants) did a really deep dive into SQLCLR :). It was a really demo-intensive day for all of us :)
    On Saturday I was talking about DBCC and its usage. I hope people enjoyed the talk, especially since I showed how to corrupt a database (for training purposes).

    Cheers,
    Damian 
  • My MVA course about SQL Server 2014 & Azure

    If you are tired today, maybe you could take your time and spend the whole weekend with SQL Server 2014 and Azure?

    The new MVA course is ready :) and waiting for you. Just visit the link http://bit.ly/1tJCTts and enjoy. Watch out - it's in Polish :)

    The topics covered:

    • backup encryption
    • manual backup to Azure storage
    • smart backup :)
    • buffer pool extension
    • SQL Server files integration with Azure storage

    more coming....

  • SQL Server 2014 CU4 is available for downloading

    The SQL 2014 CU4 is available :)

    There are some interesting facts:

    • 66 issues were solved; the breakdown by area is as follows:

    BI\Analysis Services: 10
    BI\Collaborative Insights: 1
    BI\Reporting Services: 11
    Data Quality Services\Cleanse\Export Results: 1
    Integration Services: 1
    Integration Services\Tasks Components: 2
    Integration Services\Tools: 1
    Master Data Services\Client: 1
    Master Data Services\Server: 1
    Master Data Services\Setup: 1
    SQL Connectivity: 1
    SQL Engine: 1
    SQL Engine\Column Stores: 4
    SQL Engine\DB Management: 1
    SQL Engine\Extreme OLTP: 1
    SQL Engine\High Availability and Disaster Recovery: 2
    SQL Engine\Management Services: 1
    SQL Engine\Programmability: 4
    SQL Engine\Query Execution: 2
    SQL Engine\Query Optimizer: 4
    SQL Engine\Replication: 2
    SQL Engine\Search: 1
    SQL Engine\Service Broker: 1
    SQL Engine\Spatial: 1
    SQL Engine\SQLOS: 1
    SQL Engine\Unstructured Data: 1
    SQL Server Client Tools: 3
    SQL Server Client Tools\Database Performance Tools: 1
    SQL Server Client Tools\Powershell: 1
    SQL Server Client Tools\SSMS: 3

     

    • The release build is 12.0.2430.0
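    A quick way to confirm the CU was applied is to check the build number reported by the server:

    ```sql
    -- After installing CU4, the product version should report 12.0.2430.0
    SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion;
    ```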

    You can find more details on the official page: http://support.microsoft.com/kb/2999197/en-us

    Cheers

    Damian 

     

     

     

  • Transactional replication - how to initialize a subscription from a backup

    Today I had a chance to initialize a subscription in transactional replication from a backup. The database is quite big (120 GB), and I thought it would be better to use this method to start the replication process.

    Unfortunately, this is not possible from the replication wizard, so you have to change the publication and subscription definitions manually.

    Here are the steps if you are setting up a new replication:

    1) Create a publication script and manually update the sp_addpublication command, changing the @allow_initialize_from_backup parameter to "true". In my case it was set to "automatic", as I used the wizard to generate the script. I think it is useful to set another parameter, @immediate_sync, to "true" as well.

    2) Run the script on the publisher server and then create a backup of the database 

    3) Restore the database on the subscriber.

    4) Go back to the publisher and create a subscription script, but don't run it yet. Open it and add these parameters to the sp_addsubscription procedure:

    @sync_type = "initialize with backup" 

    @backupdevicetype = "disk"  (you could add "tape" or "logical" apart from "disk")

    @backupdevicename = "path to the backup"

    When you use a logical backup device, put the name of that device in @backupdevicename.
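    Put together, the modified call might look like the sketch below; the publication, subscriber, database, and backup path are hypothetical placeholders, not from a real environment:

    ```sql
    -- Sketch of an sp_addsubscription call initialized from a backup.
    -- All names and the path below are placeholders.
    EXEC sp_addsubscription
        @publication = N'MyPublication',
        @subscriber = N'SUBSCRIBERSERVER',
        @destination_db = N'MyDatabase',
        @sync_type = N'initialize with backup',
        @backupdevicetype = N'disk',
        @backupdevicename = N'C:\Backups\MyDatabase.bak';
    GO
    ```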

     

    I know that most of you are aware that replication can be started using this technique. I wrote this blog post so I don't forget about this "feature" in the future.

    Cheers

     

     

     

     

     

  • Swedish SQL Server User Group - my sessions in September

    I had the great privilege of giving two sessions in Sweden in September: the first in Malmo and the second in Stockholm.
    The talks were about wait statistics in SQL Server ("All you should know about waits"). I realized that in 90 minutes I am not able to cover just *ALL* of wait statistics, so I slightly modified the topic to "All IMPORTANT things you should know...".
    If you are interested in the presentation deck you can find it here
      
    Thanks to the Swedish MVPs, and especially to Johan Ahlen, Steinar Andersen and Lars Utterstrom, for inviting me to Sweden.