
Download all your loved tracks in two simple steps

I’m a fan of online radio, and I have a habit of marking every good song that I hear as a “loved track”. Over the years I got quite a list, and so I decided to turn it into my jogging playlist. But for that, I need all the songs downloaded to my computer so I can put them on my mobile. While Last.fm does link to Amazon for downloading all the loved songs for pay, I’m going to walk the fine moral line here and suggest how you can download every song from existing free YouTube videos.
If it really bothers you, think of it as if I created a YouTube playlist and I’m now using my data plan to stream the songs off YT itself.
Moral issues resolved, we can move on to the scripting.
Update (4/27/12): youtube-dl has moved, and also added a very neat --extract-audio option so you can get the songs as audio right away (it basically does the conversion as a second step).

What you need to have:
A Linux-like system, youtube-dl, MPlayer, the Lame MP3 encoder, and some command-line experience, or at least adventurousness.
So first you’ll need to export your loved tracks from Last.fm in tab-separated format – a mere button press.

The “tsv” (tab-separated values) file has a simple format: <song name> <artist> <url>
And now for the script. First, since the loved tracks file is tab-separated, we use AWK to get the first two fields, which are the song name and the artist.
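To see that field extraction on its own, here is a minimal sketch (the sample track line below is made up):

```shell
# A hypothetical loved-track line: <song name> TAB <artist> TAB <url>
printf 'Karma Police\tRadiohead\thttp://example.com/track\n' \
  | awk -F'\t' '{print $1 " - " $2}'
# prints: Karma Police - Radiohead
```

Note the `-F'\t'` must be quoted, otherwise the shell eats the backslash and AWK ends up splitting on the letter “t”.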
Then we use youtube-dl, a neat command-line tool to download YT movies:

mkdir mylovedtracks
cd mylovedtracks
awk -F'\t' '{print "../youtube-dl -f 18 -t \"ytsearch:" $1 " " $2 "\""}' ../my_lovedtracks.tsv | csh

This one-liner will download all the loved tracks from the tsv file into the current directory, given that youtube-dl and my_lovedtracks.tsv exist in the parent directory. -f 18 means it will download only MP4s, and ytsearch means it will search YT for the term “song-name song-artist” and download the 1st result. The | csh pipes the AWK-formatted commands into a new shell process.
The saved MP4 will be named after the video’s title, with the YT video ID appended.
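Before piping into csh, it can help to print the generated commands and eyeball them first. A sketch, using a made-up sample track and assuming the downloader sits in the parent directory:

```shell
# Hypothetical sample tsv: song, artist, url separated by tabs
printf 'Karma Police\tRadiohead\thttp://example.com/track\n' > my_lovedtracks.tsv
# Same AWK as above, but only printed instead of piped into csh
awk -F'\t' '{print "../youtube-dl -f 18 -t \"ytsearch:" $1 " " $2 "\""}' my_lovedtracks.tsv
# prints: ../youtube-dl -f 18 -t "ytsearch:Karma Police Radiohead"
```

Once the printed commands look right, append `| csh` to actually run them.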
All the mp4s have been downloaded, so let’s batch convert them to mp3s:

mkdir sound
for f in *.mp4 ; do n="${f%.mp4}"; if [ ! -e "sound/$n.mp3" ]; then mplayer "$f" -vc dummy -vo null -ao pcm:file=sound/temp.wav; lame -V2 sound/temp.wav "sound/$n.mp3"; rm sound/temp.wav; fi ; done

This one-liner extracts the audio from each mp4 into a PCM temp.wav file using MPlayer, and then converts it to VBR MP3 using Lame.
You can run this command many times, since it checks whether each file has already been converted. So if you’re impatient (like me) and want to convert some of the MP4s before everything has finished downloading – just run it now, and run it again later.
Congrats, all your loved tracks were downloaded.
A few limitations of this method:
* Sometimes the downloaded songs are not exactly what you wanted, especially specific versions. The search is arbitrary, and can’t be controlled much.
* ID3 tags are nonexistent, although something could probably be done about that in the Lame encoding phase.
* There is a lot of unexploited potential for parallelization, mostly in the YT download phase: YT pushes the first ~15% of the video very fast (I saw 1200Kb/s at times), and then throttles to a steady rate so the video finishes downloading in about a minute (which can be as low as 50Kb/s). Downloading many videos at once could help.
* It’s still not a true one-liner, it’s a two-step thing. But that could be fixed by modifying the 2nd step a bit and putting it into the AWK print of the 1st step.
* MP3 volume normalization – very important! Otherwise every song sounds different and you have to do vol-up vol-down all the time…
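On the last two points: Lame can write ID3 tags at encode time (its --tt and --ta flags set title and artist), and a tool like mp3gain can normalize volume across the whole folder afterwards. A rough sketch, assuming both tools are installed, with placeholder tag values:

```shell
# Tag while encoding (placeholder title/artist - in the real loop these
# would come from the tsv fields rather than being hardcoded)
lame -V2 --tt "Song Title" --ta "Artist Name" sound/temp.wav "sound/$n.mp3"
# Normalize volume across all converted tracks
mp3gain -r sound/*.mp3
```

For the parallelization point, piping the AWK output into something like `xargs -P` instead of csh could run several downloads at once, though I haven’t tried that here.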
Still, it did a nice quick job for me…
