Yeah, What They Said 3/23

Yeah, What They Said is a new feature on tunequest. Some people call it “link sharing.” These links won’t necessarily be music or iTunes related, but I’ll try not to stray too far from the topics on this site. Mostly though, it’ll be stuff that’s cool, but that I don’t have time to write about.

Behind the Mario Maestro’s Music:
Koji Kondo was in his mid-20s when he wrote the iconic music for The Legend of Zelda and Super Mario Bros. for the NES, but he doesn't compose much these days. Wired takes a look at him.

10 Albums in 10 Minutes: Classic albums cut up and squeezed into 60 seconds of playing time.

Millions Dream of Megamillions: The Compete blog charts the rise in Internet activity as the recent Mega Millions jackpot grew.

If You Can Read This, You're Hired: How would you find a really good typographer? Would you use dingbats? If you can decipher these ads by 3/25, you could get a 1-year subscription to InDesign Magazine.

::

Bonus video: here’s Koji Kondo playing Super Mario Bros on the piano:

This recording is 105 years old

[image: wax cylinder]

Growing up, my parents had (and still have, actually) an old Victrola record player. It was entirely mechanical; no electronics whatsoever. To use it, you had to wind a handle, which tightened a spring. Flipping a switch let the spring unwind and set the disc spinning. A needle, of course, translated the record's grooves into sound. Volume was controlled by opening and closing two doors on the front.

Along with the Victrola itself, my parents had a nice collection of records for it. I always enjoyed exploring the various old pop, jazz and orchestral standards, using those recordings as a window to the past. Plus, there was a subtle aural appeal to the tinny, lo-fi sound quality of the music.

As much as I appreciated it, the machine was a bear to use. The records were heavy but delicate, the handle needed constant turning, and most records only had one song per side. As enjoyable as the time spent was, the effort kept my sessions rather short.

Since the mp3/digital music revolution hit full throttle, I’ve had a dream to start digitizing some of those old records before they deteriorate beyond recognition. Being able to drop them on an iPod would greatly enhance my ability to explore those recordings.

Unfortunately, I’ve traditionally lacked a suitable recording environment. Also, that Victrola now lives more than 850 miles away from me. So for the time being, it will remain a dream.

Good news on a related front though! The University of California, Santa Barbara has been digitizing the recordings in its wax cylinder collection. Some of those recordings are even older than the ones I listened to growing up. Some of the oldest in the collection date to the 1890s while the most recent is dated 1928. The project has been ongoing since 2002 and, as of this writing, the digital collection turns up 6824 individual recordings.

The collection isn't limited to music. It includes sermons, speeches, vaudeville and other spoken word (try the Humorous Recitations).

Each recording’s entry includes detailed information about the performer, the release title and the date (if known). Audio is downloadable as both mp3 and unedited .WAV files.

Explore the catalogue, catch the streaming audio of Cylinder Radio or subscribe to the site’s RSS feed.

Here is a taste to get you started. It's Johann Strauss's Blue Danube waltz, performed by the Edison Symphony Orchestra in 1902, when the piece was only 35 years old. You'll recognize the tune.

[audio:070320BlueDanube1902.mp3]

There is something awe-inspiring about listening to music that was probably recorded before my great grandparents were born.

What’s in a star rating?

Yesterday, I wrote a detailed article about the new formula I’m using to quantify the overall quality of albums in my iTunes library. It’s been working for me, but I realized that everyone rates their music differently. Webomatica, for example, explains in the comments that his song ratings are relative to other songs by the same artist.

So I’d like to explain the thought process that goes into my rating system. I’ve been using the same star rating criteria for years and that system has gone a long way toward helping me maintain control over my sprawling library. It allows me to quickly construct playlists of quality music, which is the single largest goal I have when managing and utilizing my library.

When thinking about a song's rating, I basically need it to answer one question: how likely am I to want to hear this song again? Ratings aren't designed to attribute greater cultural value to a song, though a song's general artistic worth plays a large role in the rating it receives. I'm more likely to enjoy a high-quality song and thus want to listen to it more often.

The rating is essentially a weighted vote for helping me determine how often a particular song gets played in the future. The breakdown looks like this:

  • Rating: ★★★★★ 5 stars: This song is excellent. It shows poise and craftsmanship and I’m pretty much guaranteed to enjoy this one the next time.
  • Rating: ★★★★☆ 4 stars: This song is very good. Well done and not off-putting, I’ll most likely enjoy this again, but it’s not brilliant enough to be a 5. The majority of songs in my library fall into this rating.
  • Rating: ★★★☆☆ 3 stars: This song is good. I’m not going to go out of my way to hear this one, but if I’m listening to an album beginning-to-end, I won’t skip it.
  • Rating: ★★☆☆☆ 2 stars: This song wasn’t very good. I’m fairly certain I’ll never want to hear it again. These songs are candidates for deletion. If any song stays at 2 stars for long enough, it is either upgraded to 3 stars or removed from the library.
  • Rating: ★☆☆☆☆ 1 star: Not used for rating purposes. Instead, songs that are marked with 1 star are taken out of circulation, usually because of encoding problems or bad ID3 tags. Its normal rating is returned when the problem is solved. Additionally, special audio such as comedy or spoken word is automatically given 1 star to keep it from mingling with music.
  • It is also worth noting that my ratings are not static. As my tastes fluctuate, I’ve been known to change them. It doesn’t happen often, but sometimes a 4 star song might become a 5. Or it could fall to a 3 if whatever aspect of the song I found appealing the last time I heard it is missing. In one extreme example, a song went from 5 to 2 stars and was subsequently deleted.

There you have it. That's where I'm coming from as I discuss song and album ratings on this site. I'd be interested to know how other people handle ratings in their iTunes libraries.

    In search of a definitive album rating formula

    When it comes to my iTunes library, I’m a regular statistics nut. Sure, my library exists primarily for my own enjoyment, but it contains so much organically-compiled data about my habits and tastes that I can’t help but want to take a look at it and find out what the data says about my interests.

But for a while now, I've struggled to quantify, tabulate and analyze the overall sense of my library. Which of my albums are truly the greatest? Which artists, when the sum of their parts are combined, are really my favorites? And by how much? I want numbers.

    None of the iTunes stats options available at the moment give me the type of results that I want. The Album Ranking AppleScript provides a simple average that skews toward albums with fewer tracks. SuperAnalyzer provides a top 10 list that is skewed toward albums with more tracks.

Most iTunes stats tools simply provide averages or totals of play counts and/or star ratings. Averages, while somewhat useful, can be misleading. An album could have a handful of awesome songs and a bunch of filler and still rank as well as an album that's consistently good but without much breakout material.

    And that can be frustrating to me, because, in terms of album or artist worth, I tend to value the ones with consistent performance.

    Take, for example, my recent run-down of Air’s discography, specifically the albums 10000 Hz Legend and The Virgin Suicides. After many years of listening, my artistic impression is that Virgin Suicides is ever so slightly the better of the two. The songs on Legend vary from excellent to clunkers. Suicides is overall pretty good, with only one exceptional track. However, averaging my ratings shows that Suicides is a 3.85 while Legend rates as an even 4.

    So, to reward albums that don’t veer wildly around the quality wheel, I’ve developed my own album rating formula that takes into account the consistency of all the star ratings on a given album.

    The Formula

    album rating = (mean of all songs + median of all songs) - standard deviation of the set

    The mean sums up the whole of the album. The median shows the state of the album at its core. The standard deviation indicates the variety of the individual ratings. The result is a number on a scale of 1 to 10. (Alternately, divide that number by 2 to return the result to a 5-star scale).
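For the programmatically inclined, here's a minimal sketch of the formula in Python. It assumes whole-star ratings from 1 to 5 and uses the sample standard deviation (the same flavor Excel's STDEV calculates); the album_score name and its 5-star-scale option are just my own conveniences, not anything official.

```python
# Rough sketch of the album rating formula: (mean + median) - standard deviation.
# Assumes per-song star ratings from 1 to 5 and uses the sample standard
# deviation to match Excel's STDEV function.
from statistics import mean, median, stdev

def album_score(ratings, five_star_scale=False):
    """Score an album from a list of its per-song star ratings."""
    if len(ratings) < 2:
        # The sample standard deviation is undefined for a single song
        # (this is the divide-by-zero problem discussed further down).
        raise ValueError("need at least two rated songs")
    score = mean(ratings) + median(ratings) - stdev(ratings)
    return score / 2 if five_star_scale else score
```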

    Let’s take a look at the formula in action. Suppose we have two albums with twelve songs each. The first is generally excellent, but varies in quality. The second is good stuff throughout.

          Ex. 1    Ex. 2
          5        4
          4        4
          5        4
          2        4
          4        4
          5        4
          5        4
          2        4
          5        4
          3        4
          5        4
          3        4
Mean      4        4
Median    4.5      4
Total     8.5      8
STDEV     1.21     0
Score     7.29     8

This table shows the individual star ratings for the two theoretical albums, along with the statistics as calculated by Excel. As you can see, both albums' average rating is the same (4), and Ex. 1 even has a higher median than Ex. 2. But because the quality of Ex. 1's songs varies a great deal, its standard deviation is substantial, so much so that its album rating drops to 7.29 (or 3.645 on a 5-star scale) when my formula is applied. Ex. 2 suffers no such penalty, so its score remains 8 (4). In effect, the standard deviation rewards Ex. 2 for its uniform quality.
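For what it's worth, feeding those two hypothetical track lists into the Python sketch above reproduces the table's numbers:

```python
ex1 = [5, 4, 5, 2, 4, 5, 5, 2, 5, 3, 5, 3]  # excellent but uneven
ex2 = [4] * 12                               # consistently good

print(round(album_score(ex1), 2))  # 7.29
print(round(album_score(ex2), 2))  # 8.0
```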

    Let’s take a real world example, the two Air albums I mentioned above.

          10 kHz Legend    Virgin Suicides
          4                4
          5                4
          4                4
          5                3
          5                3
          4                4
          3                5
          4                4
          3                4
          3                4
          4                4
          –                4
          –                3
Mean      4                3.84
Median    4                4
Total     8                7.84
STDEV     0.77             0.55
Score     7.23             7.29

When the formula is applied to my ratings for each, the scores for 10000 Hz Legend and The Virgin Suicides become 7.23 (3.62) and 7.29 (3.65), respectively. So factoring in the standard deviation results in a score that more closely reflects my impression of those two albums.

    So what does this mean? I’m not sure exactly. In practice, I could whip up some listy goodness and see which albums are truly my favorites. A comprehensive analysis would be cool. I’d love to see the distribution of my album ratings. However, that would require more programming skills than I have. Though that could be a good project to help me learn.

Out of curiosity though, I have picked a selection of albums, just to see how they rate. One provision, of course, is that every song on an album must have a rating before the album score can be calculated. These ratings are on a 5-star scale.

Album                                   AVG     My Score
Radiohead – OK Computer                 4.5     4.41
Air [french band] – Moon Safari         4.5     4.39
Nirvana – Nevermind                     4.5     4.24
Mouse on Mars – Radical Connector       4.33    4.23
Ratatat – Ratatat                       4.45    3.97
Nine Inch Nails – With Teeth            4.31    3.77
The Strokes – Is this it?               4.09    3.7
LCD Soundsystem – LCD Soundsystem       4       3.68
Basement Jaxx – Remedy                  3.73    3.51
Prefuse 73 – One Word Extinguisher      3.82    3.47
Weezer – Make Believe                   3.58    3.21

    This is by no means a top 10 list, but it is interesting to see where things ended up. It’s also interesting to see how minor fluctuations in star ratings can change the final score. For instance, if that Ratatat album had one more 5 star song in place of a 4 star song, its median number would become 5 and its album score would jump to 4.51. Lower a 5 star to a 4 star and the score only drops slightly to 3.93. I don’t know if this is a flaw in the formula or a reward for albums that have a lot of good songs.

    Problems and issues

Small data sets. These are troublesome in any statistical setting, and this formula is no different. An album with only one rated song has a mean and a median (the rating itself), but no sample standard deviation, which kills the formula with a divide-by-zero error. Also, because the formula uses the average rating as a component, albums with a low number of songs will tend to skew one way or the other.
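In the Python sketch above, the same issue surfaces as an error for single-song releases. One possible workaround, and this is purely my own guard rather than anything the Excel file does, is to skip anything below a minimum track count:

```python
MIN_TRACKS = 4  # arbitrary cutoff; adjust to taste

def safe_album_score(ratings):
    """Return an album score, or None if the album has too few rated tracks."""
    if len(ratings) < MIN_TRACKS:
        return None
    return album_score(ratings)
```

A cutoff is crude, but it matches the spirit of setting aside releases that are too small to judge fairly.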

    In my library, Boards of Canada’s EP In A Beautiful Place Out In The Country has four fantastic songs and ranks at 4.63, higher than anything on that list above. As a release, I’d say that’s accurate, but I’m sure it doesn’t surpass OK Computer. I would be interested to see a chart of how the album score changes as the number of tracks on an album increases.

    Additionally, I haven’t figured out a way to rank partial albums, i.e. albums where I either don’t own all the songs or albums where I’ve deleted songs I didn’t like. For now, I’m just excluding them altogether.

    Still, I’m fairly pleased with the results I’ve been getting as I run various albums through the formula. It’s working for me and my own song rating system, but I’m curious to see how it works with someone else’s.

    Fortunately, Webomatica has posted his song-by-song ratings for The Beatles’ Sgt. Pepper’s Lonely Hearts Club Band. Using his numbers, the average for the album is 4.38, while my formula renders a 4.28. I’d say that’s a consistently good album.

    ::

    Here’s a Microsoft Excel file you can download. Plug in your star ratings to find the album score. AlbumScore.zip

    Air – Pocket Symphony: A Little Side Step

For the past couple of weeks, tunequest has been counting down to Air's fifth full-length record, Pocket Symphony, which was finally released a few days ago. I've had it long enough to give it a handful of thorough listens, and I can tell you that this thing oozes craftsmanship. The numbers don't lie: after rating all the songs on the album, I can confirm that this is good stuff.

    Upfront, let me say that I like Pocket Symphony. It is quintessentially Air; there’s no doubt about that. Sensually cool, in that singularly French way, Pocket Symphony lives up to expectations. But… it all feels a little too familiar.

Don't get me wrong; I'm going out of my way to say that I really enjoy this record and that I don't think it's artistically disappointing in any way. It's just that there's nothing particularly ground-breaking at work here. Perhaps after ten years, the band has hit its stride and is confident in its sound. But for a group that has sounded just perceptibly different on each album, it was hard not to be eager to hear whatever new departure or tangent the duo had decided to explore. Pocket Symphony sounds like it could have been recorded at the same time as Talkie Walkie.

To be sure, the mood is different: more sombre and tense than the “mellow exuberance” that marked Talkie Walkie. Still, its form, if not its function, is similar to its predecessor's. Indeed, Pocket Symphony might well be called “Talkie Walkie After Dark,” but don't go searching for it in the Quartier Pigalle. With its precisely crafted yet restrained sound, this music would be more at home at a stiff, upper-crust soirée than in the back room of an after-party at a trendy nightclub.

But if courtly dress-up affairs aren't your thing, Pocket Symphony also makes for perfect wind-down music for a 3AM drive through the city.

    The album’s first single, Once upon a Time, features afrobeat pioneer Tony Allen on drums (to great effect). Watch the video:

    My Library

    Air: Pocket Symphony (2007)
    13 tracks (of 13)
    Average Rating: 4.25
    Median Rating: 4
    Mode Rating: 4
    Signature Track: Mer du Japon
    [audio:070308MerduJapon.mp3]

    Air – Talkie Walkie: Mellow Exuberance

After being somewhat let down by 10000 Hz Legend, I'd have thought my interest in Air would have waned, but when Talkie Walkie was released in 2004, I was surprised by how eager I was to get hold of it. When I did, that surprising anticipation was validated, in spades.

    I’ll just come out and say it: Talkie Walkie is a beautiful record. In direct contrast to its predecessor, the whole thing goes down smooth and is way easy on the ears. For 44 minutes, each song is like a tiny massage for your eardrums.

    It’s earnest. It’s serious. It’s playful. It’s compelling. It’s heartfelt.

But Air doesn't accomplish that by hearkening back to their earlier sound. There's no attempt here to recapture the feeling of Moon Safari or any sideways glances toward retropop. It's just an expansively rich aural canvas. While I don't think it quite surpasses Moon Safari (it comes damn close though), it does one-up it by having nothing but four and five star song ratings.

Whereas 10000 Hz Legend could be interpreted in a tongue-in-cheek manner, Talkie Walkie exudes earnestness. This thing has soul.

    • Cherry Blossom Girl’s sweet melody infects the brain and its minimalist chorus makes sure it stays there.
    • Surfing on a Rocket is social and political commentary that’s not only a new level of seriousness for the band but is also one of the best songs in the catalogue. I can’t get enough of that simple guitar riff.
    • Alpha Beta Gaga is positively one of the most happy-go-lucky songs I’ve ever heard. It also features one of the most effective uses of a banjo outside of bluegrass.
    • And don’t get me started about Universal Traveler; that thing just blows my mind.

    Talkie Walkie, without a doubt, is a masterpiece. If you don’t have this one in your collection, you need to go get it. now. If you need some convincing, here are some videos.

    ::

    Surfing on a Rocket

    Alpha Beta Gaga

    Cherry Blossom Girl

    Beautiful song. Explicit video. Seriously, don’t watch this one if you have any romantic illusions about the song.
    My Library

    Air: Talkie Walkie (2004)
    10 tracks (of 10)
    Average Rating: 4.3
    Median Rating: 4
    Mode Rating: 4
    Signature Track: Universal Traveller
    [audio:070303UniversalTraveller.mp3]


    Air – 10000 Hz Legend: Frustrating Brilliance

[image: Air promo photo]

Released in 2001, 10,000 Hz Legend is Air's first proper follow-up to their smash Moon Safari. Coming three years after the band took the world by storm, the record was much anticipated. Air had carved out a particular niche of upbeat, laid-back retro-electro-lounge, and the fans wanted more. MORE!

    Sadly, anyone who was expecting that was sorely disappointed. Including myself. I admit, it took me a long time to fully appreciate this album. I didn’t even pick up my own copy of it for months.

Gone is the light, airy feeling that made the earlier works so attractive. In its place is a decidedly denser, darker, more down-to-earth record. It is less electronic (though there's still plenty of electronics), more organic and human. Yet it is simultaneously both more conventionally pop and more experimental than the easily digestible tunes of Air's past releases. And that is the source of frustration with it.

Yes, there is a certain je ne sais quoi that brands this as distinctly “Air,” but at times it just proves hard to listen to. Don't Be Light, for example, has its moments, but it is so spastic, just all over the place, that it can't muster more than three stars. Wonder Milky Bitch plods along, like the soundtrack to a demented “Home on the Range,” and is just downright weird. Conversely, Radio #1 exudes pure cheese: an over-the-top, over-produced mélange of sound, but it really isn't that bad on the ears.

But for all its stubbornness, 10,000 Hz Legend is the kind of album that benefits from repeated listening. Layered and complex, it reveals new tangents every time. The more I listen to it, the more I want to listen to it. This stuff is ponderous; it gets stuck in your head.

    But it’s not all deep-thinking intellectualism and satire. Radian, the disc’s highlight, is pure pleasure. With a lofty flute melody, sensual strings, and a wonderful accompanying guitar, the song harkens back to the kinder, gentler Air from the past.

In retrospect, 10,000 Hz Legend was probably the best career move the band could have made at the time. It deftly avoided pigeonholing Air as a novelty lounge act and showed that they could work on a larger aural canvas and think big. It reminds me of how Nirvana, with In Utero, set out to make a record that would put off the new-found fans they had gained in the wake of Nevermind's success, yet ended up cementing their reputation as the leaders of rock. Likewise, 10,000 Hz Legend earned Air lasting artistic credibility.

    ::

    A fascinating video for Electronic Performers: