By: Ken Adams, Senior Business Intelligence Consultant at Key2 Consulting

 

Not Enough Space

On my current engagement, we are building a reporting platform that meets the needs of our client while adhering to best practices. One of the best practices we’ve established for the platform is saving space. Our client runs SQL Server 2017 and has a large number of databases, primarily because their customers do not allow multi-tenancy.

Inside the client databases, analysts can create any number of objects, and some of those objects get quite large. That has made the environment very difficult to maintain and manage. Space is an ongoing problem, since in some cases the data must stay online for a year or more.

The new reporting platform will streamline the flow of data from cradle to grave, and while we can’t fully eliminate the multi-tenancy requirement, we can certainly help reduce the amount of storage required. SQL Server 2008 introduced data compression for both tables and indexes. It was initially an Enterprise-only feature, but with SQL Server 2016 SP1 it was added to the lower editions, down to Standard, making compression widely available to most of the marketplace.

I don’t want to get into the minutiae of compression in this post, as so much has been written about it over the years. Instead, I want to share a simple script that can help in adding compression to existing tables and indexes.
 

Adding Compression to Existing Tables and Indexes

We started by developing a prototype for the reporting platform: a series of import, staging, and ultimately relational tables. Once we got through the requirements and prototyping, we were ready to establish proper guidelines for the way forward, and as previously mentioned, we decided compression was part of that. Please note that compression might not be right for your environment; much has been written on that topic as well.

We did some initial investigation on some of the larger tables to determine the effect of adding compression, decided that PAGE compression suited our needs best, and began testing the before and after. Again, you can find many articles on testing for compression and adding it to objects; I recommend this one.
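For reference, the before-and-after estimate for a single table can be pulled with sp_estimate_data_compression_savings. A minimal example, with dbo.FactSales standing in as a placeholder table name:

-- Estimate the effect of PAGE compression on one table
-- (dbo.FactSales is a placeholder name used for illustration).
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'FactSales',
    @index_id         = NULL,     -- NULL = all indexes on the table
    @partition_number = NULL,     -- NULL = all partitions
    @data_compression = 'PAGE';

The output shows the current size alongside the estimated size under the requested compression setting, which makes the before-and-after comparison easy.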

I really wanted to take it a step further: find the uncompressed objects and, for each one, generate both the logic for testing the object and the DDL for adding compression to it.

*Update 7/12/19: For those who might want to do this in batches, working from smallest to largest, I added a page count sort. Happy compressing!*
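The idea boils down to a query against the catalog views that flags anything not yet compressed and builds both the estimate call and the ALTER statement for each object. A rough sketch (the EstimateSQL and CompressionSQL aliases match the columns referenced below; the exact string building is just one way to do it):

-- Sketch: find uncompressed tables and indexes, then generate an
-- estimate call (tables only) and an ALTER statement for each one.
SELECT  s.name AS SchemaName,
        t.name AS TableName,
        i.name AS IndexName,
        SUM(au.total_pages) AS TotalPages,
        -- Estimate call, generated only for the table itself (heap or
        -- clustered index). CONCAT_WS skips real NULLs, so 'NULL' goes
        -- in as a string and the outer REPLACE strips the quotes back off.
        CASE WHEN i.index_id IN (0, 1)
             THEN REPLACE(CONCAT_WS(', ',
                      'EXEC sp_estimate_data_compression_savings ''' + s.name + '''',
                      '''' + t.name + '''',
                      '''NULL''',      -- @index_id
                      '''NULL''',      -- @partition_number
                      '''PAGE''') + ';',
                  '''NULL''', 'NULL')
        END AS EstimateSQL,
        -- ALTER statement: rebuild the table for heaps/clustered indexes,
        -- rebuild the individual index otherwise.
        CASE WHEN i.index_id IN (0, 1)
             THEN 'ALTER TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
                  + ' REBUILD WITH (DATA_COMPRESSION = PAGE);'
             ELSE 'ALTER INDEX ' + QUOTENAME(i.name) + ' ON '
                  + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
                  + ' REBUILD WITH (DATA_COMPRESSION = PAGE);'
        END AS CompressionSQL
FROM    sys.tables t
        JOIN sys.schemas s    ON s.schema_id = t.schema_id
        JOIN sys.indexes i    ON i.object_id = t.object_id
        JOIN sys.partitions p ON p.object_id = i.object_id
                             AND p.index_id  = i.index_id
        JOIN sys.allocation_units au ON au.container_id = p.partition_id
WHERE   p.data_compression_desc = 'NONE'
GROUP BY s.name, t.name, i.name, i.index_id
ORDER BY SUM(au.total_pages);   -- smallest to largest, per the update above

Each row gives you a copy-and-paste estimate call and the matching compression statement, sorted from the smallest object to the largest.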

Executing the generated EstimateSQL calls for the two candidate tables returns the 2 rows seen below:

[Screenshot: output of the two sp_estimate_data_compression_savings calls]

The estimates were a good enough indication that adding PAGE compression would give us the desired result and wouldn’t be a hindrance in our environment. I went back to the previous results, grabbed the rows under CompressionSQL, and copied and pasted them into a new query window.
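For our two tables that amounted to a pair of statements along these lines (the table names here are placeholders, not the actual objects):

-- Generated CompressionSQL for the two candidate tables (names are illustrative)
ALTER TABLE [dbo].[ImportSalesDetail]  REBUILD WITH (DATA_COMPRESSION = PAGE);
ALTER TABLE [dbo].[StagingSalesDetail] REBUILD WITH (DATA_COMPRESSION = PAGE);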

Running that SQL adds PAGE compression to the two tables and ultimately returns some space back to the database. If you run the compression check query again, those tables no longer show up.

As noted, this is not the most elegant solution, but it did the trick for us. You could take it even further and step through the rows, executing the SQL for each object along the way, as sketched below. We wanted the ability to take snippets of code and run them to evaluate the results, and if we felt an object was a good candidate, we then ran its ALTER TABLE statement.
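If you did want to automate that last step, a simple cursor over the generated statements would do it. A rough sketch, assuming the discovery query’s output has been dumped into a temp table named #CompressionCandidates (that table and its column names are assumptions for the example):

-- Rough sketch: execute each generated CompressionSQL statement in turn.
-- #CompressionCandidates is assumed to hold the output of the discovery
-- query above (CompressionSQL and TotalPages columns).
DECLARE @sql NVARCHAR(MAX);

DECLARE compress_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT CompressionSQL
    FROM   #CompressionCandidates
    ORDER BY TotalPages;            -- smallest objects first

OPEN compress_cur;
FETCH NEXT FROM compress_cur INTO @sql;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @sql;                     -- log what is about to run
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM compress_cur INTO @sql;
END

CLOSE compress_cur;
DEALLOCATE compress_cur;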

You’ll also note that the stored procedure call is only generated for the tables, not for the indexes; we were only running that estimate against the tables. There is also some NULL handling required on those stored procedure calls, because NULL arguments are simply ignored by the CONCAT_WS function. Hence passing NULL in as a quoted string and then stripping the quotes back out with the outer REPLACE.
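A quick illustration of that CONCAT_WS behavior and the REPLACE workaround:

-- CONCAT_WS skips NULL arguments entirely, so the literal word NULL
-- never makes it into the generated string:
SELECT CONCAT_WS(', ', 'a', NULL, 'b');                                     -- a, b

-- Workaround: pass NULL as a quoted string and strip the quotes back
-- out with an outer REPLACE:
SELECT REPLACE(CONCAT_WS(', ', 'a', '''NULL''', 'b'), '''NULL''', 'NULL');  -- a, NULL, b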

Anyway, I hope you found this useful and can add it to your SQL tool belt. If you run into any issues or would like to see more topics like this one, let us know! We’d be glad to assist. Thanks for reading!


Questions?

Thanks for reading! We hope you found this blog post useful. Do let us know if you have any questions or topic ideas related to BI, analytics, the cloud, machine learning, SQL Server, (Star Wars), or anything else along those lines that you’d like us to write about. Simply leave us a comment below, and we’ll see what we can do!
 

Key2 Consulting is a data warehousing and business intelligence company located in Atlanta, Georgia. We create and deliver custom data warehouse solutions, business intelligence solutions, and custom applications.