make – thank you for the music
Wednesday 11th July 2007

Make doesn’t have to be just for complex code-building applications.

I have my own postgres database of the albums that I own and have ripped, mainly because I’m a bit picky about the formatting of tag metadata.

I have a script that you can point at a musicbrainz album entry; it downloads the entry in xml format and inserts it into my metadata repository (utf-8, of course), from where I can tweak it as I like in a perl-based webapp. That webapp has the one really cool killer feature: you can download a zip file containing a windows .cmd file and a bunch of metadata text files which, when unzipped in the same directory as a bunch of Track??.wav files, will losslessly compress all of the wav files into my music directory (under appropriately named artist and album sub-directories), tag them all with the correct metadata and finally run a “flac -t” test decompression over all the newly compressed files.
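That workflow can be sketched in POSIX sh (the real thing is a windows .cmd file, and the file names here are invented stand-ins for the metadata-derived ones):

```shell
# Hypothetical sketch of what the unzipped script does: one flac/metaflac
# pair per ripped track, then a test pass over everything it produced.
compress_album() {
    dest="$MYMUSIC/Artist/Album"        # real paths come from the tag metadata
    mkdir -p "$dest"
    for n in 01 02; do                  # one iteration per Track??.wav
        flac "Track$n.wav" -o "$dest/${n}_TrackTitle.flac"
        metaflac --remove-all-tags --no-utf8-convert \
            --import-tags-from="track$n.txt" "$dest/${n}_TrackTitle.flac"
    done
    flac -t "$dest"/*.flac              # verify every file decompresses cleanly
}
```

The real .cmd file lists each track explicitly, since every output file name is different; the loop here is only to keep the sketch short.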

The cool thing is that I can rip my newly arrived album and start listening to it in wav format in the temporary directory into which I’ve ripped it, sort out the metadata (spacing, capitalization, special characters) at my leisure, and compress, tag and store replaygain data when I’m ready.

The one thing that has been bugging me since I set up my dual core machine is that the compression process uses only one core, as flac runs as a single thread and the cmd file just runs commands sequentially. I’ve popped the lid off the flac libraries before, so I vaguely considered hacking in some multithread, multifile feature, but this seemed like hard work. I thought about getting the cmd script to spawn two separate command processors, each with half the files, but then one might finish early if the track lengths or complexity happened to be radically different. Perhaps I could spawn each compression instance in the background and run them all in parallel, but spawning a dozen processes all at once doesn’t seem very neat, and I’d have no neat way of waiting for them all to finish before running the test. Finally I realised that this is really a job for make.

So I tweaked the cmd file generating webapp to create me a makefile instead and now I run “make -j 2” from the working directory. Hey presto, both cores are used and the compression takes about half the time. Here’s a quick sample:

.PHONY: all test

all: test

$(MYMUSIC)/Artist/Album/01_TrackOneTitle.flac: Track01.wav
    flac $< -o $@
    metaflac --remove-all-tags --no-utf8-convert --import-tags-from=track01.txt $@

$(MYMUSIC)/Artist/Album/02_TrackTwoTitle.flac: Track02.wav
    flac $< -o $@
    metaflac --remove-all-tags --no-utf8-convert --import-tags-from=track02.txt $@

# etc, etc, ...

OUTFILES := $(MYMUSIC)/Artist/Album/01_TrackOneTitle.flac \
            $(MYMUSIC)/Artist/Album/02_TrackTwoTitle.flac
# etc, etc, ...

test: $(OUTFILES)
    flac -t $(OUTFILES)
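Nothing about the parallelism depends on flac; you can watch -j’s scheduling with a throwaway makefile (all the names here are made up):

```shell
# Two independent one-second targets: under "make -j 2" both sleeps run
# concurrently, so the build takes about one second instead of two.
printf 'all: a.out b.out\na.out:\n\tsleep 1; touch a.out\nb.out:\n\tsleep 1; touch b.out\n' > demo.mk
make -f demo.mk -j 2 all
```

Swap `sleep 1` for a flac invocation and that’s exactly what my generated makefile is doing.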

You can add tags with the flac commandline, but I use metaflac and text files because the commandline doesn't handle utf-8 so well; it's more reliable to generate tag import files and tell metaflac to import them straight into the vorbis comment block of the flac file without translation. It just works, and now at twice the speed.
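The import files are just the NAME=VALUE lines that metaflac's --import-tags-from expects, one vorbis comment per line. A hypothetical track01.txt (all field values invented) might look like:

    TITLE=Track One Title
    ARTIST=Some Artist
    ALBUM=Some Album
    TRACKNUMBER=1
    DATE=2007

Because the file is read byte-for-byte (and --no-utf8-convert skips any charset translation), the file must already be utf-8 — which is exactly what the webapp guarantees.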
