Sunday
Feb 11, 2007

nullifying backups

While playing with Stickam, I decided to demonstrate how I cook dinner to my huge audience. This involved separating frozen meat patties by banging them on the desk. I stopped this maneuver before completion, because my giant external hard drive, also located on the desk, started to whine. Loudly. After cooking and eating my tasty dinner, I returned to the hard drive. I figured the heads had gotten misaligned. With the power off, I tried tapping it gently. I tried tapping it harder. The whine was still there, and prohibitively loud. I figure the medium is probably compromised, but I can still use it for ephemeral storage. (Damn it.) Today I turned it on again because iTunes is set up to download podcasts there, and I needed some fresh podcasts. Hey! Writing data to the disk stopped the whining!
So can I ever rely on this disk again?
[UPDATE] No, I can never rely on this disk again, and I should be punished for entertaining the thought that I could. For that matter, I should never rely on a single point of failure for critical data. Time for me to work on my strongspace setup.

Wednesday
Feb 07, 2007

handy shell tools for finding large files

Heard over and over in development shops everywhere: "We're out of disk space! Who is the spacehog?" "Not me! It must be your project!" "Let's delete temp files | log files | core dumps | stuff that looks old." I will spare you the lecture on the heartbreak of irreplaceable data loss, and instead provide a few one-line shell goodies to identify where the disk space is going, with human-readable text reports suitable for mailing to all your co-workers.
The classic command for analyzing disk usage is
$ du -k
which prints something like this:
32 Documents/Standards/sac-1.3/doc/org/w3c/css/sac/helpers
568 Documents/Standards/sac-1.3/doc/org/w3c/css/sac
568 Documents/Standards/sac-1.3/doc/org/w3c/css
568 Documents/Standards/sac-1.3/doc/org/w3c
568 Documents/Standards/sac-1.3/doc/org
That lists the size in kilobytes, followed by the directory name. Output like this quickly gets unreadable. We can apply some concepts of information visualization to improve it. Let's put the most important stuff at the end by adding a sort command:
$ du -k Documents | sort -n
The last few lines now list the biggest directories and their sizes in kilobytes:
80180 Documents/Reference/docs/api/java
82924 Documents/Reference/docs/api/javax
110788 Documents/Speed Download 4
205708 Documents/Reference/docs/api
251800 Documents/Reference/docs
254668 Documents/Reference
434216 Documents
Comparing six-digit numbers at a glance requires brain work. To make it easier, get human-readable output from du by replacing the -k flag with -h. Now a line of output looks like this:
4.1M Documents/Standards
That breaks our sort, though; sort -n is numeric, and 2M is less than 4K. Wrong. Let's just throw out any du output smaller than 1 MB. I do that by piping the output through sed, with a pattern anchored to the size column so it doesn't eat paths that happen to contain a K or a B. I also want to limit how deep we descend into directories, since a directory's size includes the sum of its children's sizes. On the Mac, pass in a -d depth flag; on Linux, use --max-depth=depth.
$ du -h -d 3 . | sed -E '/^[[:space:]]*[0-9.]+[KB][[:space:]]/d' | sort -n
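A quick way to sanity-check a sed filter like this is to feed it a few fabricated du lines (the sizes and paths below are invented for the demo); the anchored, quoted pattern drops anything measured in K or B and keeps the rest:

```shell
# fabricated du -h output; the filter deletes lines whose size is in K or B
printf '4.0K\t./doc\n512B\t./tmp\n4.1M\t./Standards\n' |
  sed -E '/^[[:space:]]*[0-9.]+[KB][[:space:]]/d'
# → 4.1M	./Standards
```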
Then to get just the highlights, pipe that through a tail command to select just the last 30 or so big guys:
$ du -h -d 3 . | sort -n | sed -E '/^[[:space:]]*[0-9.]+[KB][[:space:]]/d' | tail -30
But wait, this is kind of stupid; I'm asking sort to sort a whole lot of stuff, then promptly throwing out most of the sorted things. Let's switch the order of the sed and the sort, which will make the sort smaller and faster.
$ du -h -d 3 . | sed -E '/^[[:space:]]*[0-9.]+[KB][[:space:]]/d' | sort -n | tail -30
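One more shortcut, if your sort is new enough: GNU sort grew a -h flag that compares human-readable sizes directly (K before M before G), so both the sed filter and the suffix confusion go away. A sketch, assuming a GNU userland whose du also accepts -d:

```shell
# sort -h understands the K/M/G suffixes, so 999M correctly lands below 1.0G
du -h -d 3 . | sort -h | tail -30
```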
Props to Unix Power Tools and Jeffrey Friedl's Mastering Regular Expressions. We're just mortals, here, folks, but we're living in a well-documented world.
On the Mac, for an easier way to do this, try OmniDiskSweeper.
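du reports directory totals; when the culprit is a single bloated file, find can flush it out directly. A sketch (the 100 MB threshold is an arbitrary choice for the example):

```shell
# print every file over 100 MB under the current directory,
# with a human-readable size next to each
find . -type f -size +100M -exec du -h {} +
```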

Tuesday
Jan 23, 2007

practice makes perfect

To be productive as a software developer, or tester, or sys-admin, you have to know how to use your tools. Most development tools in the projects I run with are based on the old-skool unix model: a shell script, a build utility, obscure text configuration files, environment variables, and a source-code control system. Systems like that are exquisitely sensitive to errors, and even "vanilla" installs usually need some local tweaking and some setup rituals. All of this can be maddening when you want to get some work done and the tools won't let you. What can be done?
Certainly, build tools should be easier to use. But they're not, and that's the world we live in. The answer came to me last night watching a martial artist do forms on TV. In martial arts, forms are a detailed choreography that represents a hypothetical fight; the purpose of practicing forms is to get each detail of each action dialed in. I recognized the form being performed; it was the first form I learned, as a beginner in Ja Shin Do. "He must be a beginner," I thought, "to still be practicing the basics." Then I kicked myself: he was practicing the basics to perfect them. Go back to the beginning, and do it again... this is the practice of experts.
We get good at using the tools by using them, over and over. Start from a blank slate and build a development environment from nothing. Do this ten times in a row, and pay attention to exactly where you install the prerequisites. Build muscle memory for what goes where. Do it ten more times, and pay attention to the initial invocation rituals. Do it again, over rocky ground this time, or an OS you don't like. If you want to be an expert software tools user, then practice, practice, practice.

Monday
Jan 22, 2007

dynamically-typed interpreted languages

On type mismatches in dynamically-typed interpreted languages: Javascript gives you enough rope to shoot yourself in the foot.

Monday
Jan 15, 2007

love letter to health insurance

A few weeks ago, I had an encounter with a serrated knife and a loaf of pecan-raisin bread that resulted in five stitches in my thumb. I went to a nearby small emergency room, and was in and out in two hours. The doctor and the staff were all friendly and competent; the care was excellent, and my thumb healed well. I just got the explanation-of-benefits from my health insurance company; the bill total was $2587.85, of which I am responsible for $62.37.
This is, arguably, what the US health care system is best at; definitive care for an acute condition, with no follow-up and no prevention. I'll take it! Now if we could just expand that quality of care to chronic conditions and prevention, and provide it to everyone regardless of financial means or immigration status... that would be something. I wonder how Arnold's latest proposal for mandatory health insurance would cover my self-employed neighbor cutting her thumb at work, or an illegal immigrant cutting his thumb cooking breakfast for his family. Thank you, Aetna and Seton Medical Center, for taking care of me. Please take care of everyone else, too.
