The Art of SQL Server Database Administration, Development, and Career Skills for the Technically Minded
One of my regrets about not having started blogging sooner is missing all the great T-SQL Tuesday blog parties that have been hosted in the past. That is something I plan to change and, if pardoned by the community, I'll use a few past topics as seeds for future posts. There is a Rube Goldberg device or two in my closet I just HAVE to let out.
To me, the coolest aspect of Machanic's blog-party brainchild is writing outside of your comfort zone. In that sense, this month's T-SQL Tuesday topic is a perfect place to start. The cloud is definitely not a topic I would have picked on my own.
My first genuine cloud experience was in 2009. Up to that point in my career, all I had heard about the cloud was that it would be the end of my career as a DBA. Every popular blog, newsletter, and periodical related to IT was making it clear: the cloud was coming for my job.
Like the imminent zombie apocalypse, I’m still waiting – partially relieved – partially disappointed that it hasn’t happened yet.
Back in 2009 I was designing a database for the state of Washington that would serve a website we anticipated might get A LOT of traffic. A very important aspect of the site was its internal search against the database. I had decided to forgo Microsoft's full-text search features and write my own term-map scheme, complete with rankings, synonyms, keys, etc., powered by a collection of nested stored procs, scalar functions, and table-valued functions. Remember the Rube Goldberg device I mentioned above? I made this decision for a couple of reasons, some valid and, hindsight being 20/20, some not so much.
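For the curious, the general shape of a hand-rolled term map looks something like this. This is a minimal sketch, not the actual scheme; every table, column, and function name here is hypothetical:

```sql
-- Hypothetical sketch of a hand-rolled term map (names are illustrative).
-- Each document is pre-tokenized into weighted terms at write time.
CREATE TABLE dbo.TermMap (
    Term       nvarchar(100) NOT NULL,
    DocumentId int           NOT NULL,
    TermWeight int           NOT NULL,  -- rank contribution of this term
    CONSTRAINT PK_TermMap PRIMARY KEY (Term, DocumentId)
);
GO

-- Inline table-valued function: score documents matching a single term.
CREATE FUNCTION dbo.SearchTerm (@Term nvarchar(100))
RETURNS TABLE
AS RETURN
    SELECT DocumentId, SUM(TermWeight) AS Score
    FROM dbo.TermMap
    WHERE Term = @Term
    GROUP BY DocumentId;
```

The real version layered synonym expansion and nested procs on top of pieces like this, which is exactly where the Rube Goldberg feel comes from.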
We were very confident in the ability of the web server to handle the raw requests coming in. Our load concern was more with the database. Running from a single box, processing a couple of searches at a time, it worked great. The real question? How would it perform under the anticipated stress of production? If this database feature was the camel, we needed to know how much straw it could carry. But how?
Enter the Cloud
Googling load-testing strategies, I stumbled upon a service offered by a (then) new company called LoadStorm. Using their service I could leverage the power of the cloud to release a torrent of activity against my database in a completely random, realistic way. It took me about an afternoon to record browsing patterns that the LoadStorm site stored as scripts. We could then ramp up the number of hits per minute we wanted to test, using those scripts in adjustable proportions. It worked great.
We went from testing simple worst-case scenarios to designing completely unrealistic torture sessions. We no longer wanted to test how much straw the camel could carry – we wanted to destroy the camel – beat it in every way imaginable. It was easy. It was fun. It was incredibly informative. I really learned a lot about performance tuning SQL Server that week. I got to see some wait stats I'd never heard of. I didn't have to put much thought into generating load after that initial setup. I could just focus on mitigation.
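If you want to take that same first look at your own waits, the standard starting point is the sys.dm_os_wait_stats DMV. The exclusion list below is a small, illustrative sample of benign wait types, not an exhaustive filter:

```sql
-- Top waits accumulated since the last restart (or since stats were cleared).
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    wait_time_ms / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN  -- a few benign "idle" waits; real filter lists are much longer
      (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'BROKER_TASK_STOP', N'SQLTRACE_BUFFER_FLUSH')
ORDER BY wait_time_ms DESC;
```

Comparing a snapshot of this before and after a load run shows where the engine actually spent its time waiting.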
That was, and still is to me, the greatest power of the cloud: the ability to ramp up a bunch of servers on short notice, for a limited period of time, cheaply and easily. That's something you simply can't do outside the cloud. Knowing the limits of the infrastructure we had designed before we revealed it to the world was incredibly valuable. It allowed us to gain real-world experience – the greatest teacher of all – without driving off actual users in disgust, healthcare.gov style.
That is, to date, my best and most valuable use of the cloud. My next cloud-related goal is SQL Azure. I'd really like to move the DBA Toolbox back-end to Azure. The only obstacle right now is how much I'm relying on the full-text search that was built into the original prototype. I'm not really all that excited to write my own internal Rube Goldberg search logic again.
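For contrast, this is roughly what the built-in feature buys you: a single predicate instead of a whole custom term map. The table and column names here are hypothetical, and the searched column must already be covered by a full-text index:

```sql
-- CONTAINS requires a full-text index on the searched column.
-- dbo.Tools / Description are placeholder names, not the real schema.
SELECT ToolId, Title
FROM dbo.Tools
WHERE CONTAINS(Description, N'"backup" OR "restore"');
```

Giving that up is what makes the move to SQL Azure a harder sell than it first appears.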