Binary Static

Comfortably ripping CDs in the Linux console

Posted in Linux by Chris on December 2, 2009

Since I’m still converting my CD collection to listen to it on my iPod, I spent some time figuring out the easiest way to do this — in the background, with reasonable audio quality, and of course on Linux. My workflow now comes down to this:

  1. using the abcde script to rip the CD, with some customizations
  2. using MusicBrainz through Picard to tune the folder hierarchy and tags
  3. using Amazon’s API to download the cover images

This might be a bit more work than other approaches, but I get kick-ass audio files, clean tags and file names. This is my ~/.abcde.conf:

# CD drive to rip from
DEVICE=/dev/sr0
# encode to MP3; the quoted options are handed on to lame
OUTPUTTYPE=mp3:" -V2 --vbr-new -q0 --lowpass 19.7 -b96"
# pad track numbers with a leading zero (01, 02, ...)
PADTRACKS=y
# iTunes-style Artist/Album/NN - Track hierarchy
OUTPUTFORMAT='${ARTISTFILE}/${ALBUMFILE}/${TRACKNUM} - ${TRACKFILE}'
# eject the disc when ripping is done
EJECTCD=Y
# Windows-style filenames: keep whitespace, turn ":" into " -",
# drop quotes, question marks and control characters
mungefilename ()
{
    echo "$@" | sed s,:,\ -,g | tr -d \'\"\?\[:cntrl:\]
}

Your device is probably named differently. The OUTPUTTYPE parameter passes the encoding settings to lame: I chose a variable bitrate and basically cut off everything the ear can’t hear. The OUTPUTFORMAT parameter names and sorts the files into an iTunes-style hierarchy. I tweaked the mungefilename() function to get Windows-style filenames, i.e. I want whitespace in my filenames.
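To see what the tweaked mungefilename() actually does, you can source the config in a shell and feed it a title (the title below is purely made up for illustration):

. ~/.abcde.conf
mungefilename "Track 7: What's Left?"
# prints: Track 7 - Whats Left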

I run abcde from a sandbox folder on my hard drive. It’s easy to put it in the background by starting it inside screen. Once I’ve collected new music, I run the whole batch through Picard to get top-notch MP3 tags. Whatever Picard can’t find I add to the MusicBrainz database myself from the original CD.
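For reference, kicking off a rip looks roughly like this (the session name and sandbox path are just placeholders):

cd ~/rip-sandbox                 # any scratch folder will do
screen -dmS ripping abcde        # rip in a detached screen session
screen -r ripping                # reattach later to check on progress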

As I use boxee to play and display my audio files on my media box, I want neat high-resolution cover images in each folder. I wrote a short Python script to retrieve the covers from Amazon via their store API. Currently, I’m trying to add a GUI which will allow scanning whole music collections and providing missing or wrong images in batch mode. Once I’m done, I’m going to roll this into an Ubuntu package and distribute it.
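Until that script is ready, copying already-downloaded images into place is simple enough. A minimal sketch, assuming the covers sit in ~/covers as <Album>.jpg and the music lives in an Artist/Album hierarchy (all paths here are made up):

for dir in ~/music/*/*/; do                          # Artist/Album/ folders
    album=$(basename "$dir")
    cover=~/covers/"$album".jpg
    [ -f "$cover" ] && cp "$cover" "$dir"folder.jpg  # boxee (XBMC-based) should pick up folder.jpg
done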


Killing the Internet

Posted in Politics, Web by Chris on November 23, 2009

It’s getting worse every day now. Cory Doctorow reported and commented on new restrictions on internet use which are about to become law in the UK. The proposal has the usual ingredients: extended control over ISPs serving private interests, draconian sanctions against copyright infringement, and forcing users off the net. First France, now Britain; who will be next? Western nations are obviously under enormous pressure from the content industry to end the so-called “misuse” of the internet.

This whole crusade against file-sharing is far more dangerous than it might seem. It is not only successful lobbying on behalf of the content industry, and it is not only about cutting off some cheap kids from their music supply. Old media, the content industry and politics might find a common interest in changing the nature of the internet forever. The old powers have reason to turn the internet into something like TV: a malleable means to distribute products and channel approved information.

The content industry used mass media to sell its products, politicians used them to control the masses – and the old media gained power and wealth from both ends. The internet, however, changed the rules of the game because it allowed individual communication on a massive and global scale. Precisely its peer-to-peer nature established a public infrastructure which could not be twisted and influenced as easily as the old media of the TV age. In peer-to-peer communication every client is equal – and every voice and opinion may be heard unfiltered. There is no need for mediation or content salesmen, and no easy access for politicians to public opinion. In this new age people can exchange digital goods and analog ideas without the direction and control of the establishment. This is why the content industry, TV-age media and old-school politicians all have a very good reason to hate the internet in its present form. And this is what worries me most in the power struggle that lies ahead. Sure, private broadband connections will survive and we will still have packet-based protocols – but what if all people may use them for is buying TV streams from established outlets, because everything else is effectively outlawed? Would you still call that “internet”?


Recent Chromium build for Ubuntu introduces extensions and bookmarks sync

Posted in Google, Linux, Web by Chris on November 17, 2009

… but kills essential development features (which didn’t work anyway). Chromium 4.0.250.0 (Ubuntu build 32056) was distributed via Launchpad the other day. There is no official repository for extensions yet, but you may find other resources to mess with your browser.

The syncing feature still seems somewhat immature, since your stuff actually syncs with a Google Docs document – not with Google Bookmarks or another well-established service like Delicious, which would seem more plausible. Anyway, syncing via Google Docs works OK if all you want is to work with instances of Chromium across different machines. Since I still need Firefox from time to time, I prefer the Delicious bookmarklet to Chrome’s new built-in sync.

And why would I still need Firefox? There is still no serious web development on (Ubuntu) Linux with Chrome – which is too bad, really. The scripting console didn’t work before, and now Google seems to have killed the relevant entry in the developer tools menu. This is no solution, of course. OK, so in a newer build I got the menu entries back – but there is still no chrome://devtools/devtools.html. Please, Chromium team: Firebug in all its glory is the one extension that ties me to Firefox (yeah, I know about Firebug Lite). I could easily dispense with the rest and work with javascript bookmarklets for other vital functionality.
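For reference, the daily builds come from a Launchpad PPA; something along these lines pulls them in (I believe the PPA is chromium-daily, but double-check on Launchpad before adding it):

sudo add-apt-repository ppa:chromium-daily/ppa
sudo apt-get update
sudo apt-get install chromium-browser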


New Copyright Law Database Effort by EFF

Posted in Politics by Chris on November 15, 2009

The Electronic Frontier Foundation just announced a new collaborative effort to collect international copyright laws and related acts and make them transparent. We need public awareness of these issues now more than ever, as horrifying details about the ACTA conspiracy against intellectual progress leaked some days ago. The global scale of the content industry’s attack on freedom is truly astonishing. This dying industry and its proponents seem willing to drag the whole of western society and culture into their grave.

The EFF’s project is an important first step in the right direction: we need to know the nature and range of the attack. But we also need to know the names of the proponents. Which corporations are behind this conspiracy? Which lobbyists are mind-twisting our elected officials? They need to be put in the spotlight.


Some minor issues with the Karmic Koala

Posted in Linux, SysAdmin by Chris on October 31, 2009

After upgrading from Jaunty to the Koala via a clean install I ran into some minor annoyances – some of which I’m still working around.

The onion router package tor is MIA from the official repositories. To make things worse, there is no Karmic build in the Tor project’s repository either. Luckily, the Debian sid build works. So adding

deb http://deb.torproject.org/torproject.org sid main

to /etc/apt/sources.list did the trick. Tor is up and running again.
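After adding the line, a refresh and (re)install is all that’s left; the repository is signed, so you may also need to import the Tor project’s signing key (check their site for the current key ID):

sudo apt-get update
sudo apt-get install tor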

DNS lookups were slow all over the system. And I mean really slow: it took Chromium up to 30 seconds to connect to a server, and OpenArena even timed out completely in multiplayer mode. I have no clue what the problem is here. Strangely enough, providing alternative nameserver addresses did the trick. I used OpenDNS by adding

prepend domain-name-servers 208.67.222.222,208.67.220.220;

to /etc/dhcp3/dhclient.conf. Now name resolution works fine again. But I don’t like this workaround.
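To make the change take effect, renew the lease and check the resolver configuration (eth0 is just an assumption; use whatever interface you are actually on):

sudo dhclient -r eth0 && sudo dhclient eth0
cat /etc/resolv.conf    # the OpenDNS servers should now be listed first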

Some strange messages show up during boot, which seem to be harmless:

One or more of the mounts listed in /etc/fstab cannot yet be mounted:
(ESC for recovery shell)

Everything is then mounted OK, but it still costs a couple of seconds of boot time. There is some discussion about this being a bug in mountall. For what it’s worth, I’m running version 1.0 of mountall.
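If you want to compare, the installed version is easy to check:

dpkg -s mountall | grep Version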

Installing apache2 with apt-get now pulls in the package apache2-mpm-prefork, which doesn’t work with libapache2-mod-php5; this Apache needs libapache2-mod-php5filter to parse PHP. Too bad I had simply copied all package selections from my previous install. This kept me busy for a while.
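The fix itself is quick. A sketch of the workaround, assuming the module name matches the package (it usually gets enabled automatically on install anyway):

sudo apt-get install libapache2-mod-php5filter
sudo a2enmod php5filter
sudo /etc/init.d/apache2 restart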
