Thursday, February 19, 2009

Script it Script it Script it!

This has been said many times before, but I'm gonna say it again: anything worth doing twice is worth scripting!

In our environment the development databases are refreshed from production and sanitised weekly. This is scripted; it happens on the dot, every week, without fail (excepting catastrophes!).

All of our local builds are automated, type ant or mvn compile and magic happens.

All of our server builds are automated, svn ci causes a build to kick off automatically, and notifies you if something goes wrong.

The project I'm currently working on requires some manual configuration and then proceeds to turn the database inside out, and it's not easily backed out (I've tried, and it was more pain than it was worth!). So for every full integration test there are about eight separate configuration steps to run through first. They've all been scripted (SQL in this case). Now I know I can refresh my database from production with sanitised data, run one script to reload the config, and run a full 4-hour integration test, all with complete confidence that I'm running from a fully reproducible slate.
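To give a feel for what "one script to reload the config" can look like, here's a minimal Python sketch that chains numbered SQL steps into a single reproducible run. The config/ directory, the numbered file names and the sqlite3 connection are stand-ins for illustration, not my actual environment:

    # run_config.py -- a minimal sketch of rolling the separate SQL config steps
    # into one reproducible run. The config/ directory, the numbered file names
    # and the sqlite3 stand-in connection are illustrative assumptions only.
    import sqlite3
    from pathlib import Path

    CONFIG_STEPS = sorted(Path("config").glob("*.sql"))  # e.g. 01_env.sql ... 08_flags.sql

    def apply_config(db_path="dev.db"):
        conn = sqlite3.connect(db_path)
        try:
            for step in CONFIG_STEPS:
                print("applying", step.name)
                conn.executescript(step.read_text())  # run each step, in order
            conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        apply_config()

Once the whole setup is down to one command like that, there's no excuse not to run it from a clean slate every time.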

The first time you do something, fair enough, do it without regard to rigorous scripting (but save any commands you run). When you come to do it a second time, you have a hard decision to make. If there's even the remotest chance that you'll be called on to do this again, then script it. In fact, even if you think you probably won't have to, script it anyway. I've lost count of the number of times I've had to repeat the thing that was "surely just this once, never again!" and later wished I'd scripted it to start with.

I've found in my experience that, most of the time, scripting something takes only slightly longer than doing it ad hoc. When you script something you exercise the muscle between your ears, which is worth it if nothing else. Then as soon as you have to do it a second time, it pays for itself, and again every single time after that.

I once had to step in for a client and run some web statistics for them, as the usual staffer who performed the role was away. It normally took him the best part of a day, between copying log files from the server, setting up the config, waiting for the tool to process the logs, and then copying the results to the web directory. The first time I did it, it took me about two days, maybe three, and that time included figuring out the process as well as scripting it while I waited. The second time, it took me the five minutes needed to update the links on the stats page to point to the new stats, as everything had run automatically before I got to work. Time well spent.
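The script itself was nothing clever, just the same steps glued together. Roughly, in Python, it was something of this shape; the host name, the paths and the "stats-tool" command here are hypothetical placeholders, not the client's actual setup:

    # nightly_stats.py -- a rough sketch of the stats job described above. The
    # host name, paths and the "stats-tool" command are hypothetical placeholders,
    # not the client's actual setup.
    import subprocess
    from datetime import date

    LOG_HOST = "www.example.com"          # assumed web server holding the access logs
    WORK_DIR = "/home/stats/work"
    WEB_DIR = "/var/www/stats"

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)   # fail loudly so the cron mail tells you why

    def main():
        today = date.today().isoformat()
        report = f"{WORK_DIR}/report-{today}.html"
        # 1. pull the raw logs down from the web server
        run(["scp", f"{LOG_HOST}:/var/log/httpd/access_log", WORK_DIR])
        # 2. crunch them with the stats tool (placeholder command -- substitute your own)
        run(["stats-tool", "--input", f"{WORK_DIR}/access_log", "--output", report])
        # 3. publish the results where the stats page can link to them
        run(["cp", report, WEB_DIR])

    if __name__ == "__main__":
        main()

Drop something like that into cron and the day-long chore runs itself overnight.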

I'm sure I'm preaching to the choir here, and everyone reading this blog will be nodding their heads as if I'm ranting about how rocks fall down instead of up, but it's something we all too easily forget in the everyday rush: to take the time to keep our minds and our tools sharp.

Got a large log file to parse (1GB+)? Instead of wrestling with some editor, try writing a Perl/Python script to parse it for you, or even a shell script with a grep pipeline.
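For instance, here's the kind of throwaway Python I mean. The log format (Apache-style access log) and the question being asked (which URLs throw the most 5xx errors) are just assumptions for the example; adjust the fields to whatever your log actually looks like:

    # scan_log.py -- a quick sketch of letting a script do the digging. Assumes an
    # Apache-style access log and that you want the URLs with the most 5xx errors.
    import sys
    from collections import Counter

    def main(path):
        counts = Counter()
        with open(path, errors="replace") as log:  # stream it -- never load 1GB+ at once
            for line in log:
                parts = line.split()
                if len(parts) < 9:
                    continue                       # skip malformed lines
                url, status = parts[6], parts[8]
                if status.startswith("5"):
                    counts[url] += 1
        for url, count in counts.most_common(20):
            print(f"{count:8d}  {url}")

    if __name__ == "__main__":
        main(sys.argv[1])

Ten minutes of script beats an afternoon of scrolling, and you get to keep the script for next time.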

Let the machine work hard while you work smart. You'll save time, be more relaxed for it, and who knows, you might even enjoy it!
