SQL University Compression Week: Lesson 2 Row Compression
Welcome to Lesson 2 on Compression, and welcome back Dear Reader. I want to thank the Chancellor of SQL University, Jorge Segarra (@SQLChicken | Blog), for the opportunity to be your Professor of Compression. Yesterday we discussed what kind of Compression SQL Server uses and why you would want to use it. Today let’s get down and dirty with the first level of Compression. But first let’s do a history lesson.
Compression was first introduced in SQL 2005 SP2 as Vardecimal Compression. In SQL 2005 you had to enable the database for Compression and then enable each table that you wanted to Compress. The compression only worked on Numeric and Decimal data types.
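On a 2005 SP2 instance that two-step enablement looked roughly like this (a minimal sketch; the database and table names here are just placeholders):
/*
Enable Vardecimal at the Database level, then at the Table level (SQL 2005 SP2+)
*/
EXEC sp_db_vardecimal_storage_format 'MyDatabase', 'ON';
GO
EXEC sp_tableoption 'dbo.MyTable', 'vardecimal storage format', 1;
GO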
If you enabled Vardecimal Compression it would store Decimal and Numeric data in a new storage format, Vardecimal. But to understand what that format does we need to talk about the Fixed Length portion of a Data Record.
LET’S START AT THE BEGINNING
When we talk about Compression we have to look at our data from the inside out, so I like to make a reverse pyramid. We understand that when we create a table there is a hierarchy to it: Tables have Columns and Rows, and Columns and Rows have fields. When you break it down to the smallest unit of storage, you would think it is a field, but it’s a little bit different.
Instead of Tables, Columns, Rows, and Fields we have IAM Chains and Allocation Units, Allocation Bitmaps, Extents, Pages, and Records. What we care about with Row Compression are the Records. If you look up above you’ll see what I like to call my Mall Map of SQL Internal Storage, and you are here in the Records section today. So let’s take a closer look at what those records look like.
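If you want to poke at that hierarchy yourself, the undocumented (but widely used) DBCC IND command will list the IAM and data pages behind a table. A quick sketch you can run once the demo table we build below exists:
/*
List the IAM and data pages behind the demo table (undocumented command)
*/
DBCC IND ('demoCOMPRESSION', 'dbo.myTable1', -1);
GO
--Grab a PagePID from that output and DBCC PAGE will show you the individual records:
--DBCC TRACEON (3604);
--DBCC PAGE ('demoCOMPRESSION', 1, <PagePID>, 3);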
This is from a slide I have in my Compression talks. If you look above you’ll find the structure of a regular record as it is stored on a page. This is courtesy of Paul Randal (@PaulRandal | Blog); go and look at his Data Structures MCM video, it will give you a deeper understanding of how SQL Storage works internally. A Record essentially equals a row of data in a table (the deeper you dive the trickier this gets, but stick with me, Row=Record). The part that we care about the most is the Fixed Length Columns. If you declare a Decimal or Numeric data type and insert data into it, that data is stored in the Fixed Length Column portion of the Record. If you declare a decimal(38,38) and put in a number, 0.12345678900123456789012345678902134567, it takes up the exact same amount of space as if you stored 0.1234. That is what a Fixed Length column means: regardless of how much of it you use, you take up all the space.
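You can see this for yourself with DATALENGTH; a quick sketch (17 bytes is what a decimal with precision 38 always requires):
/*
Fixed Length storage depends on the declared precision, not the value stored
Both of these return 17 bytes
*/
SELECT DATALENGTH(CAST(0.12345678901234567890123456789012345678 AS decimal(38,38))) AS FullPrecisionBytes
,DATALENGTH(CAST(0.1234 AS decimal(38,38))) AS ShortValueBytes;
GO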
When Vardecimal Compression occurs it changes where the value is stored within the record: it goes into the variable length storage instead of the fixed length storage. So it is altering how the storage of a particular data type occurs. Now, instead of 0.1234 always taking up the full 17 bytes a decimal(38,38) requires, it takes up only the handful of bytes it actually needs; in other words, we use only what we need.
SQL 2008 takes it a step further: not only does it change where the data is stored, it changes the physical storage of data at the Record level. With Row Compression we added a lot more data types: smallint, int, bigint, bit, smallmoney, money, float, real, datetime, datetime2, datetimeoffset, char, binary, and timestamp/rowversion. SQL 2008 R2 added Unicode Compression for nchar and nvarchar. So if it is fixed length and we Row Compress it, we remove all the unused space, and here is how we store it.
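As a quick aside (separate from today’s demo), in SQL 2008 you can also declare Row Compression right in the CREATE TABLE statement; here is a hypothetical table with a handful of those fixed length types:
/*
Hypothetical table just to show DATA_COMPRESSION = ROW at create time
*/
CREATE TABLE dbo.myRowCompressedTable(
myID int IDENTITY(1,1)
,myAmount money
,myWhen datetime2(7)
,myFlag bit
,myCode char(10)
,CONSTRAINT PK_myRowCompressedTable_myID
PRIMARY KEY CLUSTERED(myID)
)
WITH (DATA_COMPRESSION = ROW);
GO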
Paul Randal is a busy man; he has another MCM video on New Database Structures in SQL Server 2008 (go watch it). Compression wasn’t the only new data structure we got with 2008, so go have a watch and you’ll learn all about it.
So now that we know all about how Row Compression changes our storage, let’s play around with it.
DEMO
First we need to create the database that we will be using. I’m doing this in SQL 2008 R2, so to make sure your results match up I would suggest 2008 R2 as well; however, if you have 2008, this example will work just fine.
/*
Create our demo database
*/
USE master;
GO
IF EXISTS(SELECT name FROM sys.databases WHERE name = N'demoCOMPRESSION')
BEGIN
DROP DATABASE demoCOMPRESSION;
END
GO
CREATE DATABASE demoCOMPRESSION;
GO
USE demoCOMPRESSION;
GO
So we’ve got our database our next step is to create a table and populate it with some records.
CREATE TABLE dbo.myTable1(
myID int IDENTITY(1,1)
,myChar CHAR(8000)
,CONSTRAINT PK_myTable1_myID
PRIMARY KEY CLUSTERED(myID)
);
GO
/*
Insert Some Records
*/
DECLARE @i INT
SET @i=0
BEGIN TRAN
WHILE (@i<10000)
BEGIN
INSERT INTO dbo.myTable1(mychar)
VALUES('myData');
SET @i = @i +1
END
COMMIT TRAN
Now that we’ve got a table and data we have something to play with, but first we need a baseline. Remember: KNOW THY DATA! To know our data a little more I would suggest turning on STATISTICS IO & TIME and then running a select. Now this is a very contrived example; if you have a more targeted workload or set of stored procedures you would want to test those, gathering the statistics and query plans so you can see IF compression changes anything for the better or worse.
/*
Do a select to look
at the CPU time and I/Os
Before Compression
*/
SET STATISTICS IO ON
SET STATISTICS TIME ON
SELECT * FROM dbo.myTable1
At this point set those baselines to the side. Next I would use sp_estimate_data_compression_savings on the table, specifying ROW compression. This will take a 5% sample of the table, move it into tempdb, and apply the compression setting that you specify, in this case ROW. It will then return an estimate, based on that sample, of what it believes your compression savings will be.
EXEC sp_estimate_data_compression_savings 'dbo', 'myTable1', NULL, NULL, 'ROW';
GO
When I apply compression I like to use sp_spaceused to see the space the table takes up before and after.
/*
How Big is our Clustered Index
*/
sp_spaceused 'dbo.myTable1';
go
/*
Rebuild Our Table with Compression
*/
ALTER TABLE dbo.myTable1
REBUILD WITH (DATA_COMPRESSION = ROW);
go
/*
How Big is our Clustered Index
*/
sp_spaceused 'dbo.myTable1';
As you can see we reduced the table from around 80 MB to 288 KB. Now let’s run our query again to compare against our baseline and see how query performance was affected.
/*
Do a select to look
at the CPU time and I/Os
After Compression
*/
SET STATISTICS IO ON
SET STATISTICS TIME ON
SELECT * FROM dbo.myTable1
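If you want to double check that the rebuild actually stuck, sys.partitions will tell you what compression setting each partition is using (a quick sanity check I’m adding here, not part of the baseline steps above):
/*
Verify the compression setting on our table
*/
SELECT OBJECT_NAME(p.object_id) AS TableName
,p.index_id
,p.data_compression_desc
FROM sys.partitions AS p
WHERE p.object_id = OBJECT_ID('dbo.myTable1');
GO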
Homework: go watch Paul Randal’s MCM videos. Take the Row Compression demo, add some more columns and Fixed Length data types to it, and see what you can Compress. Happy Compressing!
Thanks,
Brad