Well...  The other day I put in my resignation at Quantum Digital, Inc., a place where I have spent three fantastic years.  During this time, I've made some lifelong friends.  Quantum Digital also gave me the luxury of learning about new things in the database world.  They gave me access to SQL Server 2008 R2, I got to play with geospatial data types, and I developed some great things for them.  I fixed the performance issues we were having and learned a great deal about SQL Server internals.  I also got the opportunity to learn MySQL.  Admittedly, I was very scared about that process, but I overcame the fear and designed a rather elegant high availability solution.  I may write a separate article about it;)

It will be a sad day come March 15th.  But I'm very excited about the new position that I will be taking.  I'm looking forward to learning a new business.  It looks like my reporting background will also come in handy as I'll be partially responsible for bringing up a data warehouse.  I can't wait!

Here's to a great year!

Posted by J.D. Gonzalez
Categories: Career

The other day I got into a conversation with a fellow SQL Server DBA about titles.  The issue at hand was whether you consider yourself an administrator or a developer when you're the sole database employee.  The answer is…  It depends.

I am the sole database administrator where I work, and while my 'title' is SQL Server administrator, I have to be well versed on both sides of SQL Server.  I've spent a considerable amount of time and money learning the underpinnings of SQL Server.  I won't call myself an expert, but I have gotten myself out of some scary situations because of the time I've spent learning SQL Server.

The other part of my job is doing ETL, process automation, and SQL Server development.  Every time I use SSIS to create an ETL process I try to incorporate something new that will improve performance or manageability.  Having DBA skills has definitely made me a better developer.  I now write queries and processes that are efficient and use minimal resources.  'select *' may be easier to write, but by returning only the columns I need I know I'm helping the DBA by reducing the amount of bandwidth needed to return data.
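
For example, here's the kind of thing I mean, against a hypothetical orders table:

-- Spell out only the columns the process needs instead of 'select *'
select orderID, orderDate, customerID
from dbo.orders
where orderDate >= '20120101'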

So what's my favorite position to play at work?  Honestly…  It's a draw.  They both help me be the best SQL Server professional I can be.

Posted by J.D. Gonzalez
Categories: SQL Server

As I mentioned in a previous post, this year I am going to get at least one certification in SQL Server. I know I've said similar things in the past, but I firmly believe it's going to be different. Why, you may ask? Well... This year I am going to do this with a partner. I am teaming up with a former coworker, and we've both made it our goal to complete at least one. Given that we both pushed each other hard when we worked together, I feel very confident that we can pull this off.

I also want to document the process of getting my certification to serve as inspiration to those who are hesitant too. If I've procrastinated on this topic, I know others have too. So sit back and let's enjoy the ride.

For those that are curious, I am starting with the developer certification.

 


Posted by J.D. Gonzalez
Categories: SQL Server

Well... Today I took the first steps toward completing my SQL Server certification. I created my first Windows domain using Rackspace's cloud offering. Admittedly, this was the first domain I have ever set up, and the process went quite well. The only issue I had was adding computers to the domain: I wasn't able to see the domain. Turns out I wasn't pointing my DNS at the DNS server I installed when I built the domain. Once I got that sorted out I was ready to go.

If you've ever used Amazon's EC2 service, Rackspace's cloud is pretty similar. There are a few key differences. Amazon charges you for storage and for the time the server is actually 'powered' on. Rackspace, on the other hand, bills you from the moment the server is created; it doesn't make a difference whether the server is on or off, you will get billed for it. Because of this I've had to change my architecture to reduce costs.

  1. Create the domain controller and get it working the way I want it to work.
  2. Once that is done, make an image of the working server. This will allow me to recreate my lab whenever I need to.
  3. Create a server image that will be used to install SQL Server.
  4. When I need to work on the lab, I fire up the domain controller from its image along with the SQL Server box from its image.
  5. Every time you do this, you will need to point the SQL Server box at the new DNS address on the DC, since the DC gets a new IP address every time you rebuild it (a sketch of the command follows this list).
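
A minimal sketch of that DNS change from the command line (the connection name and address are examples; substitute your DC's new IP):

netsh interface ip set dns name="Local Area Connection" source=static addr=10.0.0.2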

The reason I'm doing it this way instead of just keeping the servers active is so that I don't rack up an hourly bill. That being said, I set up two types of instances: a 1 GB RAM box with 40 GB of space at $0.08 per hour and a 2 GB RAM box with 80 GB of space at $0.16 per hour. That's really pretty inexpensive, but I don't see the need to pay for something if I'm not using it.

I hope you take some time this year to explore the power of the cloud. Personally, I think it is an extremely valuable tool for learning how to administer a new system. If you mess up, just fire up another one.

Good luck!

J.D.

 

Posted by J.D. Gonzalez
Categories: SQL Server

Today is January 1st and you know what that means. Yep... It's time to come up with resolutions for this year. Before I get into what my New Year's resolutions will be, I'd like to reflect on how I did on last year's resolutions. I'm happy to report that I actually did very well, although I wish I had done more studying on SQL Server internals. You can never know enough about that stuff! See what my goals were here.

Like last year, I really felt that having a set of personal and professional goals really helped me focus on what I wanted to accomplish. So this year, I will be doing the same. Here we go!

Professional

  1. Diversify my skill set. Currently, at work, we are using SQL Server and we are also beginning an initiative that will utilize MySQL. If I want to continue to be the 'go to' guy I need to learn some new skills.
  2. Become much more involved in the Austin SQL Server community. Wes Brown over at sqlserverio.com runs CACTUSS and is a fantastic guy. He's shown me that there are many people who want to learn from SQL Server professionals.
  3. Get my MCITP on SQL Server. It could come in handy when dealing in salary negotiations.

Personal

  1. Become more involved at my daughter's school. There's no better feeling than giving a few hours of my time. And I know my daughter really appreciates it.
  2. Finish the Miata. What I mean by that is it won't be completed until the seats are redone, shifter is working properly, suspension work is done and ground effects are installed.
  3. Continue to play the trumpet, and perhaps a euphonium if I can get my hands on one. It made my day watching my son smile as I was playing my horns. Hopefully he'll grow up to enjoy music as much as I do.

Looking back, I think 2011 went really well. Let's hope 2012 is even better.

Posted by J.D. Gonzalez

Well... 2011 is about to end and what a year it's been. I solved some major headaches at Quantum Digital by implementing some features in SQL Server. Probably the biggest issue we were having was intermittent blocking due to the way some of our processes were written. I implemented read committed snapshot isolation and 99.9% of our blocking issues disappeared. A big win for the database team.
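
If you're curious, enabling it is a one-liner. A sketch, with a hypothetical database name; the option needs exclusive access to the database to switch on, hence the rollback clause:

-- Hypothetical database name; kicks out other sessions so the option can flip
alter database myDatabase
set read_committed_snapshot on
with rollback immediate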

On the performance tuning front, I optimized a process that was maxing out our CPU. I reworked the query to remove some redundancies, and afterward it ran with minimal impact on the production environment.

If you've never worked with geospatial datatypes... You should. I began working with them late last year and it was the best move that I could have made. I took a query that would have been extremely difficult and complex and with the aid of geospatial datatypes rewrote that query to be extremely simple to understand and considerably quicker too. I wrote about that here.

At a personal level, I am getting more and more involved in the local SQL Server community by attending the Capitol Area, Central Texas Users of SQL Server, otherwise known as CACTUSS. I learned more in a few months of attending CACTUSS meetings than I did on the job. I attended SQLSaturday in Austin a few months ago. What a blast that was. I met some fantastic people and learned a great deal from folks I consider experts in the SQL Server community.

My wife and I also had our second child this year. Needless to say, that has kept me very busy, but I wouldn't give it up for the world. My other little 'baby' is now 6 years old. Woah! Life moves so fast; it's simply hard to believe how big they are getting.

I'm looking forward to what I hope will be a very exciting 2012. Now I have to come up with some resolutions for next year;)

J.D.

Posted by J.D. Gonzalez
Categories: SQL Server

For the last year or so I've been hearing people talking about PowerShell and how great it is.  I'm sure it's great or a ton of people wouldn't be talking about it.  My problem is I don't get it.  I don't understand why I'd want to use PS instead of just logging into a server.  So for that reason, I want to do a month long dive (I'll stop short of calling it deep) into PowerShell to see for myself what all the excitement is about.

For those who use PS, I'd really love to hear some real life examples where it has genuinely helped you, not just cases where you wanted a different way of doing a given task.  That being said, can anyone suggest a starting point for learning PS?  Wish me luck!

Thanks

J.D.

Posted by J.D. Gonzalez
Categories: SQL Server

Earlier today I was looking for SQL code formatters that I could use when writing for my blog.  Something I could just cut and paste into my Squarespace hosted blog.  Of course, Twitter came to the rescue.  Bill Fellows (twitter) led me here.  It worked fantastically.  Exactly what I was looking for.  Then Robert Davis (twitter) pointed me to a couple of other options: the Simple-Talk Code Prettifier and a blogging application called Writer.

Writer allows you to write your blog in a WYSIWYG fashion.  One of the plug-ins you can get for Writer is called Paste from Visual Studio.  What I like about this plug-in is that it lets you work on your SQL code in SQL Server Management Studio and then paste what you've copied to the clipboard into your blog.  And it keeps the correct colors and formatting.  See below.

select *
from someTable
where myID = 'Some filter'

The only issue I have is that in order to paste your code into your blog, you have to go to Insert > Paste from Visual Studio instead of just hitting Ctrl-V.  In the grand scheme of things, it's a pretty trivial issue to have to deal with.

I haven’t given Writer a thorough test but for someone that only blogs a few times a month it’ll work fine for me.  I hope it’s useful for you too;)

J.D.

Posted by J.D. Gonzalez

For its part, SQL Server is extremely reliable.  However, no matter how many precautions we take, software is always prone to failure.  Things that can go wrong will at some point go wrong, and SQL Server is no exception.  Probably the worst event that can happen to SQL Server is having it not start up at all.  If you're able to get SQL Server to start up, you have a chance at survival.  If you can't, you need to perform some tricks to reconstruct the master database files.  For those who don't know, the master database is the heart of SQL Server.  Let's get one thing out in the open: DO NOT MESS WITH THE DATA DIRECTLY IN THE MASTER DATABASE.  Got it?  Bad things.  Very bad things can happen.

Today we're going to explore how to bring SQL Server back up in the event something has happened to it.  I had the system administrators where I work create a VM for me, then installed the eval version of SQL Server 2008 R2 on the box.  The first thing we need to do is create a database so that we can later demonstrate that we really did restore the master database.  Open up SQL Server Management Studio and type the following:

Create database myTestDatabase
go

The next thing we want to do is back up all of our system databases.  This includes master, model, and msdb.  There's no need to back up tempdb, as it gets rebuilt every time SQL Server is restarted.  The code to back up the system databases is as follows:

backup database master
to disk = 'c:\master.bak'

backup database model
to disk = 'c:\model.bak'

backup database msdb
to disk = 'c:\msdb.bak' 

Now we need to start breaking stuff.  Let's first stop SQL Server by opening up a command window and typing:

net stop "MSSQLSERVER"

This assumes that your instance is the default instance.  If you have a named instance, use the following:

net stop "MSSQL$<<instancename>>"

Now that our instance is stopped, we can remove (or rename) the master.mdf and mastlog.ldf files.  In case you think that what we just did doesn't really hurt anything, let's try to restart SQL Server.  The code is similar to stopping the service.

net start "MSSQLSERVER"

or 

net start "MSSQL$<<instancename>>"

You'll notice that it errors out, and a quick look at the error log shows why: SQL Server can't find the master database files.

Now what?  In order to get SQL Server to a point where we can start it and log into it, we'll need the install media.  You still have that, right?  We'll use some special switches and parameters to get the install media to build us new system database files and log files.

C:\Documents and Settings\Administrator>"c:\SQLServer\setup.exe" /ACTION=REBUILDDATABASE /INSTANCENAME=MSSQLSERVER /SQLSYSADMINACCOUNTS=BUILTIN\Administrators /SAPWD=mySuperSecurePassword

Because we're using setup.exe, we don't really want to install the entire SQL Server application again, so we'll use some parameters.  These are taken from Microsoft's website.

/ACTION=REBUILDDATABASE

As the name describes, this specifies that Setup re-create the system databases.

/INSTANCENAME=MSSQLSERVER

This is the name of the instance of SQL Server. For the default instance, enter MSSQLSERVER.

/SQLSYSADMINACCOUNTS=BUILTIN\Administrators

Specifies the Windows groups or individual accounts to add to the sysadmin fixed server role. When specifying more than one account, separate the accounts with a blank space. For example, enter BUILTIN\Administrators MyDomain\MyUser. When you are specifying an account that contains a blank space within the account name, enclose the account in double quotation marks. For example, enter NT AUTHORITY\SYSTEM.

/SAPWD=mySuperSecurePassword

Specifies the password for the SQL Server sa account. This parameter is required if the instance uses Mixed Authentication (SQL Server and Windows Authentication) mode. 

If you look at the directory where the database files are kept, you'll notice we now have files for master, model, and msdb.  In my case that's E:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA.

Let's take a stab at getting SQL Server to start up.  Like before, we'll use the net start command.

net start "MSSQLSERVER"

or if you have a named instance:

net start "MSSQL$<<instancename>>" 

At this point you should be able to log in.  Let's check to see if our database is available by using the following code:

select * from sys.databases

Uh...  Oh!  What happened to our newly created database?  Because we had to rebuild the master database and log files, SQL Server doesn't know that our database even exists.  But as I mentioned earlier, if we can log on to SQL Server we can restore our backups.  You did back up your databases, right?  We now need to overwrite the master database with the one in our backup.  First, stop the SQL Server service as I showed you above, then start it again with the '/m' switch so that it runs in single-user mode.

net start "MSSQLSERVER" /m

or if you use a named instance:

net start "MSSQL$<<instancename>>" /m

While you have that command line window open, we'll use SQLCMD.exe to run our restore operation from the command line.  You'll find SQLCMD.exe in the directory C:\Program Files\Microsoft SQL Server\100\Tools\Binn\.  In the command shell window type:

sqlcmd.exe

You'll notice that your prompt has now changed to a line number and a '>'.  This means you're now in SQL Server.
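
Now restore the master backup we took earlier, using the path from above:

restore database master
from disk = 'c:\master.bak'
with replace
go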

We use 'with replace' to overwrite the current master database.  Once it gets restored, SQL Server will kick you out to a DOS prompt.  Now that we've restored the master database, we can start up the service like I demonstrated above.  Next we'll want to query sys.databases to make sure that SQL Server knows where our database is now.

select * from sys.databases

You'll notice that our database is listed.

At this point, we can restore the model database if you use model as a template to save your company specific settings for creating databases.  If you had SQL Agent jobs, you'll also want to restore the msdb database.  Unlike before, it is not necessary to use SQLCMD.exe to restore these databases.  They can be done from within SQL Server Management Studio.
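
For reference, those restores look just like the master one, using the backup paths from earlier:

restore database model
from disk = 'c:\model.bak'
with replace

restore database msdb
from disk = 'c:\msdb.bak'
with replace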

Congratulations!  You've recovered from a catastrophic SQL Server failure.  I hope this also demonstrates why backups (especially of system databases) are extremely important.  Take the time to practice this exercise.  Hopefully it's not something you'll ever have to use, but when you do, you'll be prepared.

Thanks

J.D.

Posted by J.D. Gonzalez
Categories: SQL Server

It looks like May will be turning out to be a very busy month for me.  Well...  At least Friday the 13th will be.  I'm planning on attending a conference in Orlando that week, I have a concert with the wind ensemble I perform in (I was going to guest conduct), and I need to attend my niece's baptism.  As much as I would love to attend the conference, it just doesn't seem right to skip my niece's baptism, especially since I'm the godfather.  It's a big deal for me and my family.  I know I would be upset (or at least a little sad) if the situation were reversed.  I'm considering cancelling my attendance at the conference if I can't get a flight back without missing too much of it.  As the title says, sometimes the right thing to do isn't always the easiest.

Either way, I'm going to be there at my niece's baptism.  I may need some coffee to get me up but I'll be there.  I'm going to see if I can get a different flight from Orlando so that I can fly directly to Laredo.  I have to wait to see what options our travel agent can provide.

Wish me luck!

J.D.

Posted by J.D. Gonzalez

You've got to love change.  It's all around us.  Change in our production environment is no different.  My company is currently in the process of evaluating new database technologies.  One of our more important database services is on SQL Server 2000 and it needs to be upgraded.  The hardware is old and the software is close to being a teenager.  As a SQL Server developer and administrator I'd like to see it moved to SQL Server 2008, but our executive team wants to evaluate open source alternatives.

Fair enough; several open source database technologies come to mind, PostgreSQL and MySQL, for example.  We've also considered NoSQL technologies, but those would represent a truly radical mindshift in our development process.  Not to mention that our environment is a Windows environment, and most of the NoSQL technologies are built with Linux in mind.

Probably my main concern with moving to an open source technology is the cost associated with moving our environment.  I'm sure you're thinking right now, 'what kind of costs do you have with an open source database?'  Here are a few that I thought of:

  • DBA training
  • Developer training
  • Software costs

DBA Training

Our current environment is run by two SQL Server DBAs with no experience in PostgreSQL or MySQL.  This means that we have to spend money on administration training.   Typical training costs are around $2000.  

But just taking this training does not make us experts, or even intermediate level DBAs.  There is a learning curve to get to the point where you're comfortable working with a system, and perhaps a longer one to get comfortable handling configuration changes or performance problems.  Remember how long it took you to solve that first issue you had?  The longer our site is down, the more money we don't make.

Developer Training

Administration training is only part of the cost.  Developer training also runs around the same amount (~$2,000).  But now we have another problem: time to implement new functionality will increase as developers learn a new development environment.  Earlier I mentioned that time is money.  The same applies in development, except here it's the time to market for new functionality that suffers.  When competitors beat you to market, that's business you could have picked up.  Lost revenue.  Short and simple.

Software

Hardcore administrator types like to say that the only way to interact with systems is through the command line.  Don't get me wrong, the command line has its place and I use it when I have to, but I prefer the simplicity of a GUI.  Most open source databases come with tools to interact with the database, but you'll see some costs in ETL tools, and those aren't typically free.  In our environment we work with flat files and XML and move data from other systems, so our ETL needs would likely require the purchase of a tool suite to handle all we do.  You could also develop your own tools, but now you're looking at time and effort to develop new software.  Something else to support, too.

The costs I mention above are one-time expenditures, so once we pay them we won't have to pay them again.  But they could represent an unexpected cost, depending on how experienced your staff is.  There are also intangible costs associated with a migration: the stress of the migration itself, work-life balance suffering, uncertainty about your job since you're no longer the expert.  These do add up; it's just hard to put a figure on them.

Now, I'm not suggesting that I don't want to consider open source database servers.  I enjoy learning new things, and this is no exception.  But to suggest that moving to open source alternatives is free is simply insane.  We just need to make sure that we understand, and communicate, that moving to open source will not be free even though the software that runs the database will be.

Personally, I think that if you're already on an established database technology you should stick with that technology and upgrade accordingly.  If this were a new project, I think SQL Server and open source would both be valid options that each carry the same amount of weight.  In my scenario, coming from a version of SQL Server that is over 10 years old, I would say that upgrading is the better bet.

What about you?  Have you been in a situation where you had to consider SQL Server and other open source technologies?

Posted by J.D. Gonzalez
Categories: SQL Server

Today I got the opportunity to work on an SSIS package.  I word it that way because I really haven't worked in SSIS in a few weeks.  Hopefully that speaks volumes as to how well I write my packages:)  During the course of development I disabled a few steps because I knew they worked and I didn't want to run them.

But when I tried to run the package, it immediately succeeded, yet the enabled task did not run.  Why is that?  If you look closely, the first step in the sequence container has an expression that needs to be evaluated.  In my example below, it evaluated as false, so the underlying steps did not process.

So that I could make it run and prove my point, I changed my variable so that the expression would evaluate as true and the rest of the package would continue.

This is probably something that we don't experience very often but at least now we know why that happens.  I don't know about you but it stumped me for a good hour.  I'm glad I cracked the code on this one.

I hope this saves you some time when you're debugging your packages.

Posted by J.D. Gonzalez
Categories: SQL Server
Tags: SSIS

Not a year goes by without my wife asking me what my New Year's resolutions are.  And every year I tell her pretty much the same thing: "I hadn't really thought of anything."  This year will be different (how many times have you heard that one?).  I'm breaking down my resolutions into two categories:

  • Professional -- These are goals that I will use to advance my professional career.
  • Personal -- Goals to make me a better person and make my life enjoyable.

Professional

  1. Learn more about geospatial datatypes in SQL Server.  I've played somewhat with them but I feel that there are many cool applications in which I could use them.
  2. Learn more about SQL Server internals.  I feel as if I know enough to get by but I'd like to be able to really understand what is going on 'under the covers' of SQL Server.  Conceptually, I understand indexes and their role in a relational database system.  But I'd like to get to the nuts and bolts to really understand how they work and not just to know that I need one.
  3. Continue to write high quality articles for SQL Server Central.

Personal

  1. Play more trumpet.  I really regret not playing trumpet in college.  It is a fabulous instrument and a blast to play.  No pun intended;)
  2. Enjoy life.  One of the things I learned from my father (albeit not the way he intended) was that things in life happen.  That's how it works.  From time to time, when things don't go my way or I feel like we're off schedule, I tend to freak out.  I need to sit back and deal with what's going on.  Not everything in life follows a schedule.  It's not the end of the world if I'm occasionally late.
  3. Do something with my Miata.  I don't know if I'm going to sell it and buy a new one or fix it up.  It's a fun car to drive.  For those that don't drive a convertible, it's IMPOSSIBLE to be mad or upset in one.  Try it you'll see;)

So long 2010.  Here's to 2011.  Let's make it a good one!

Posted by J.D. Gonzalez
Categories: SQL Server

I've been working with SQL Server for about 10 years now and it amazes me that I can still find little bits and pieces that I had no clue how they operated.  Today's tidbit...  The restore process.  More specifically, the events that occur when you do a restore.  Pradeep Adiga has a more detailed write up about it.

Admittedly, I thought that when you restored a database it would copy a snapshot of the database.  Turns out that's not the case.  I was about 33% correct.  There are three events that occur during a restore:

  1. The data copy -- This is what most people think of when you do a restore.
  2. The redo phase -- All committed  transactions in the log are rolled forward.  This includes any transactions that occurred up until the backup process has finished.
  3. The undo phase -- All uncommitted transactions in the log are rolled back (a sketch showing how to pull these phases apart follows this list).
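
If you ever want to see the phases separated, restoring WITH NORECOVERY performs the data copy and redo but defers the undo phase until you finish the restore.  A sketch, with hypothetical names and paths:

restore database myTestDatabase
from disk = 'c:\myTestDatabase.bak'
with norecovery

-- the undo phase runs here and the database comes online
restore database myTestDatabase
with recovery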

How have I gone this long and not known this?  That being said, I've made it my goal to take at least an hour or two each week and read blogs.  Not just scan them, but really understand how things work.

 

Posted by J.D. Gonzalez
Categories: SQL Server

Well....  Maybe not time travel;)

The second in my series of SQL Server adventures is the geospatial data type.  All I have to say about this is, WOW!  For someone who is responsible for returning geocoded data this is a godsend.  

Although I knew about geospatial datatypes, I'd never really messed with them and had no experience with them.  Pinal Dave has an excellent write up on how to get started with shape files.  I wanted my learning exercise to be relevant to me, so I downloaded the states by county shape files from VDS Technologies.

Shape files are not like the delimited text files that DBAs are used to; they require a tool to load them.  For this task I used SharpGIS's Shape2SQL.  Loading the shape file data into a database is extremely easy with this tool: point it at the file and select some parameters.  Here's what mine looked like below.

I ran a simple select statement to see what geospatial data looks like.  Turns out, it looks like a binary data type.

select *
from travisCountyEdges

The data I uploaded is a representation of streets and neighborhoods.  At this point I wasn't really impressed.  That is...  Until I noticed an extra tab in the results pane.  The one called 'Spatial Results'.  This now gave me an image representation of the shape file we uploaded.  In my case, it's a view of Travis County in Texas.  Now we're getting somewhere.

A quick search on the MSDN library led me to all of the available methods I can use with geospatial data types.  To keep things simple I gave myself a business problem to solve: "Give me a list of points within 1 mile of the Texas State Capitol."  If you're trying to find the geocodes for your specific area you'll need something to reverse geocode a given address.  I used this and it worked perfectly for what I needed.

Use the code below and plug in the appropriate lat and long values for your specific needs.

declare @radius float
set @radius = 1.00
declare @long float
declare @lat float
declare @multiplier float
set @long = -97.741132
set @lat = 30.272745
set @multiplier = 1609.344

-- Gets points within a radius
select * from travisCountyEdges
where geom.STDistance('POINT( ' + cast(@long as varchar(10)) + ' ' + cast(@lat as varchar(10)) + ')') < (@radius * @multiplier)

There are a myriad of methods you can use with a geometry or geography datatype.  I used the 'STDistance' method and passed in a point.  But it could very well be a line, a polygon, or any other type of geometry.
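
If building the point from a string bothers you, you can also construct it explicitly.  A sketch using the same coordinates (I'm assuming SRID 0 here, matching data loaded as plain geometry):

select *
from travisCountyEdges
where geom.STDistance(geometry::STGeomFromText('POINT(-97.741132 30.272745)', 0)) < (1.0 * 1609.344)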

A few things to note

  1. Case matters in your syntax when using geospatial methods.  .STDistance is not the same as .STdistance.  Ask me how I know that one;)
  2. The distance returned from the method is in meters so you'll need to convert the @radius (which was in miles) to meters as I did above.  1609.344 meters per mile.  Google is a great thing.

The wheels in my head are spinning away looking for ways I can implement this.  I'm extremely excited about this.  I hope you are too.

Posted by J.D. Gonzalez
Categories: SQL Server

Today is one of those days that only comes around once in a while.  It's what I like to call a low volume of work day.  It doesn't happen that often, so I'll take it when it comes around;)

I decided to take this opportunity to learn about some features of SQL Server 2008 that I've been very interested in but simply have not had the time to explore.  This will be the first in a series of brief postings.  This topic is about compression.  Backup compression, that is.

Backup compression is nothing new.  There are several companies that make some truly remarkable products.  Idera and Quest Software are two of several that exist.  These products allow you to take a backup and compress the size of the backup by 40% to 85%. 

Introduced with SQL Server 2008, native backup compression gives SQL Server DBAs the ability to compress backups using plain SQL code.  And it's a piece of cake too.  I have created a database on my machine called Spatial.  I'm learning about geospatial stuff too, but that's for a later post.  I've loaded about 40MB worth of data.  The bigger your database is, the more noticeable the compression is.

To take a backup I used the following code.  No compression.

backup database [spatial]
to disk = 'd:\spatial_no_compression.bak'
with stats = 1, init

Now for the good stuff.  Compression!  Compress, baby, compress!!!

backup database [spatial]
to disk = 'd:\spatial_with_compression.bak'
with stats = 1, init, compression

All you need to do is add the keyword 'compression' and just like that you're compressing your database backups.  Compressing my backup reduced its size by over 50%: it went from 40MB to a little over 17MB.  I'll take that any day.
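
If you want compression on for every backup by default, there's also a server-level setting (available since SQL Server 2008):

-- Make compression the default for all backups on this instance
exec sp_configure 'backup compression default', 1
reconfigure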

There are a few gotchas to be aware of.

  1. Compressed backups can only be restored on a SQL Server 2008 or later server.
  2. You will take a performance hit in CPU cycles when compressing the database.

 

Because this was a discovery process for me, I don't know if you will see an increase or decrease in time when using compression.  If anyone knows I'd love to hear your thoughts on this.

 

Posted by J.D. Gonzalez
Categories: SQL Server

Like the majority of my SQL Server posts, this one also revolves around a situation that occurred in real life.  For the last few months, yes months, I've had a job that would fail periodically but never show in the job monitor that it had failed.  I know what you're thinking: how did I know it was failing?  Well.  At first, I didn't.  But after stepping through the stored procedure routine I was able to find the statement before the failure (it contained data) and the statement after the failure (it was missing data).

Today, I can say with confidence that I cracked the code on this one.  This issue is actually something that has bothered me about SSIS but I never took the time to research.  The default values for the properties 'FailPackageOnFailure' and 'FailParentOnFailure' are false.  If you create a simple SSIS package and run it as a job with a task of 'getdate()' (note the lack of a select statement), it will fail even though the property is set to false.

But if you get fancy with your SSIS packages and add an event handler, the package will not return the error to SQL Server.  So my package, which had an event handler that would send out an email along with the error message, would now return 'Success' even though it had failed.

Fortunately, the fix is very easy to implement.  What you'll need to do is set 'FailPackageOnFailure' to true.  I prefer 'FailPackageOnFailure' over 'FailParentOnFailure' since I want the entire package to fail if there is a problem.  This property can be set at the package level or the task level.  The image below shows you what I mean.

  

Now my question is why the default is set to false instead of true.  If anyone knows why this is the case, please let me know.

Posted by J.D. Gonzalez
Categories: SQL Server
Tags: SSIS

I had a situation today where a developer came to me and asked why a stored procedure was running very poorly.  A few weeks ago it ran very quickly, and no changes had been made to the DB, so I knew this was an index problem or an optimizer problem.

Because it wasn't a complicated query, I copied the code and pasted it into a new window.  Because there were some parameters, I needed to create some local variables to make it run correctly.  Once I did that (there were only two), it ran in one minute.  Given the volume of data and the number of rows I was expecting back, this was very good.  I had dealt with this on several occasions.  The issue I ran into is called 'parameter sniffing'.  No kidding.  I did not just make that up;)  Microsoft defines parameter sniffing as:

"Parameter sniffing" refers to a process whereby SQL Server's execution environment "sniffs" the current parameter values during compilation or recompilation, and passes it along to the query optimizer so that they can be used to generate potentially faster query execution plans. The word "current" refers to the parameter values present in the statement call that caused a compilation or a recompilation.

In my scenario, the query I was using had an 'OR' statement in the where clause that would pull data whether the date passed was null or populated.  This caused a problem during compilation: the optimizer created the execution plan as if the date passed was null, which caused some excessive CPU usage when it received an actual value for the date.

The fix is surprisingly simple.  This is what I started with.

 

create proc testProc
@myID int,
@myDate datetime
as
begin
select IDToGet, myDateToGet
from someTableInMyDB
where IDToGet = @myID and
startDate = @myDate
end

 

To this:

create proc testProc
@myID int,
@myDate datetime
as
declare @passedID int
declare @passedDate datetime

set @passedID = @myID
set @passedDate = @myDate

begin

select IDToGet, myDateToGet
from someTableInMyDB
where IDToGet = @passedID and
startDate = @passedDate

end

See what I did?  I added two local variables and set them to the parameters that are being passed in.
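
The local-variable trick isn't the only fix.  Another option, if you don't mind paying the compile cost on every execution, is a recompile hint on the statement (same hypothetical table as above):

-- Recompile this statement each run so the plan fits the actual values
select IDToGet, myDateToGet
from someTableInMyDB
where IDToGet = @myID and
startDate = @myDate
option (recompile)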

Good luck!

Posted by J.D. Gonzalez
Categories: SQL Server

Yesterday I wrote about how important it is to look at the small details.  Today, as luck would have it, proved just how important it is to remember the small stuff.  I was going to move a database from the D drive over to the C drive.  I detached the database with no issues.  But when I tried to move the file, I received an error saying the disk was full, except it wasn't: there was at least 60 GB of space on the drive.

After banging my head for a few hours I decided to look at the filesystem.  And guess what I found.  Yep.  A FAT32 formatted drive.  For those of you who don't know, FAT32 can only handle files up to 4GB.  This explains why a smaller file moved just fine.  To confirm this, I tried to create a database larger than 4GB.  In my case, it was 5GB, and I received the error below.
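
Reproducing the test takes only something like this (the names and path are hypothetical; the create fails because the data file exceeds FAT32's 4GB limit):

create database fatTest
on primary
(name = 'fatTest_data',
filename = 'c:\fatTest.mdf',
size = 5GB)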

Now as to why this drive was formatted as a FAT32 drive.  Well.  That's another story;)

 

 

Posted by J.D. Gonzalez
Categories: SQL Server

Have you ever had one of those a-ha moments?  Well...  Today I did.

I was tasked with creating a process to update some records using an external application.  Fortunately, the application had an external API to use.  So far, so good.  I created a simple application that would access the external system and grab and update data.  How does this relate to databases?  I had to take this data and update some records in the database.

What I noticed was that there was a whole bunch of disk activity, and the process would run slower and slower when updating records.  I knew there had to be a way of making it perform better.  I looked at the indexes, and the appropriate indexes were in place.  I looked at the query used to update the records.  Again, no glaring problems.  I thought this was just something that I had to live with.  Then it occurred to me...  Let me look at where the data and log files reside.  Yep...  You guessed it.  They were on the same drive.
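
A quick way to check where a database's files live (SQL Server 2005 and later; the database name is hypothetical):

select name, physical_name
from sys.master_files
where database_id = db_id('myDatabase')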

A check on a different server that had much better performance confirmed my suspicions: the better performing server had the data and log on different drives.  Tomorrow, I will move the database files to separate drives.  I'm pretty sure I'll see some improvement.

Sometimes...  It's the small things.

Posted by J.D. Gonzalez
Categories: SQL Server