SQL Server Central just posted part 2 of an interesting article on applying the techniques of test-driven development to a SQL project. The author, Andy Leonard, goes from describing basic tests for setting up your databases to writing tests for inserting and updating data.
It's good stuff for a variety of reasons. He noted in the first article that some of the techniques are simply good coding practices (drop and recreate), but writing them up as test scripts, as well as building them into the SQL itself, makes the whole thing part of a regression test and heads off a lot of problems down the road. He also points out that "unit test" means different things to different people, so make sure your definition matches the rest of your team's.
I'll be the first to admit that I'm not following the full pattern of writing individual tests for each of my tables (the firing squad can resume shortly) - but the overall approach Andy describes is solid, and depending on how you code, you might find you're already doing a lot of what he describes.
"What’s really cool about all this is the fact that it’s re-executable. You can run the same script against a new instance of SQL Server where WeatherData has never been deployed, or you can run it where only version 1.0 has been deployed. I like that a lot."
Much of it is obvious to most db developers (before creating a stored proc, check whether it already exists; make sure you have the right database selected; etc.) - but there's still a lot of dirty code out there in the wild. I *love* re-executable SQL - on my current project, I'm seeing a lot of code that's written to run exactly once, just to clean something up. I keep sending it back to the developers with a note that when we do an interim release, we need to be able to re-run the code as many times as it takes until it's ready.
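The existence-check pattern behind all of this is short enough to sketch. Here's a minimal re-executable T-SQL example - the `WeatherData` table name comes from Andy's quote above, but the database name, columns, and procedure name are made up for illustration, not taken from his article:

```sql
-- Re-executable: safe to run against a fresh instance or one
-- where an earlier version has already been deployed.
USE WeatherDemo;  -- hypothetical database; confirms the right db is selected
GO

-- Create the table only if it isn't already there
IF OBJECT_ID(N'dbo.WeatherData', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.WeatherData (
        ReadingId   INT IDENTITY(1,1) PRIMARY KEY,
        ReadingDate DATETIME     NOT NULL,
        TempF       DECIMAL(5,2) NOT NULL
    );
END
GO

-- Drop and recreate the proc so the script always deploys the latest version
IF OBJECT_ID(N'dbo.InsertWeatherData', N'P') IS NOT NULL
    DROP PROCEDURE dbo.InsertWeatherData;
GO

CREATE PROCEDURE dbo.InsertWeatherData
    @ReadingDate DATETIME,
    @TempF       DECIMAL(5,2)
AS
BEGIN
    INSERT INTO dbo.WeatherData (ReadingDate, TempF)
    VALUES (@ReadingDate, @TempF);
END
GO
```

Run it twice and it behaves the same both times - which is exactly the property that makes it worth putting under a regression test.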
Definitely worth the read.