Machine Learning (W4771)
Eric Siegel, Columbia University
(a.k.a., E-Daddy-Iceberg)


A rap about machine induction

Eric Siegel wrote and performed this song when he was a computer science professor at Columbia University, 1997 - 2000.

Spoken rhythmically with an attitude, accompanied by a percussive beat. (tempo: 145)


  • Rapping pictures with Matt Miller (Spring 2000):
    The potential is awesome -- ya gotta realize it;
    the universal machine's got infinite horizons.
    Whatchya gonna do with your CPU?  Use your noggin' when you login!
    It's the best pet, you bet; don't fret, your wishes will be met.
    I get pet-peeved at dogs' fleas and stupidity,
    but my semantics-free gizmo is syntactically supreme.
    You can't teach an old Pavlovian dog new tricks.
    But I can Chekov Roddenberry and dig Kirk's hip jiggy flicks.
    It's an extension of myself, a fast, errorless me in the extreme, 
    It's a magical machine -- gives me a license to dream. 
    Y2K reactionary activists will scare you with their paranoia.
    They've gotta learn their lesson -- doesn't that annoy ya?  
    Our algorithm doers are under our control -- 
    a soul-less servant devoted solely to the duties that you told.
    Its analytical prowess makes me shiver;
    It's got a CPU right where I've got a liver.
    Its all-purpose architecture can achieve any algorithm,
    So it's music to my ears; got me like Al Gore's Tipper has him.
    The ultimate endeavor that'd be clever - never say never -
    would be if my automaton could self-improve by trial and error.
    If you make all your mistakes in a computer simulation
    you'll learn a lot without your wallet or body achin'.
    Our machine expands horizons with algorithms' reasons and rhyme;
    Michael Jackson's man in the mirror changes faces over time.
    The machine will be your best friend when it derives its own idears,
    'cause a supportive friend with a mind of its own helps you be
        intellectually freer.
    Inference, deduction, and logical application
    apply a priori rules to inputted facts for derivations.
    Modus Ponens is the tool by which rules fire
    which leads to new conclusions that are true or it's a liar.
    But where does all the knowledge and these fancy rules come from?
    How can I model reality when I'm so dumb?
    My logical inference is only good when rules leave no doubt;
    if you give it bad knowledge, garbage-in-garbage-out.
    What kind of thing do you want it to learn to do?
    Its action is a function: given input, what to map it to.
    The function that it learns is called "hypothesis";
    the function that you call is hyped as, "Learn this!"
    How do you represent your hypo-thesis?  Like this:
    Decision tree, function tree, neural network.
    Pick a good representation that is expressive enough to work.
    The problem is defined by a measure of performance.
    With an ample sample of examples: opinion, it will form it.
    Performance measure guides induction; it's more than a heuristic.
    You can select a friend who knows, but you can't pick it.
    By forming an hypothesis, it's forming an opinion
    on how the world works wonderfully, multi-layered like an onion.
    A zillion potential hypotheses -- that's a ton.
    To learn, simply put, is to find a good one.
    "Hee-hee, ha-ha!" cackle madly crazy scientist.
    My Frankenstein-creation will scare a few, but make a mint.
    Penultimate invention to uplift the human race.
    Reduced to a simple search through hypothesis space.
    Can I make it learn what I want it to learn with limited guidance?
    Can it do what I expect with robust high reliance?
    Count with me, count with me -- count the accuracy.
    But if you count on me for a learner, I need a search methodology.
    Rack yer brain, what a pain; I need a search.
    Insane in the membrane; search for a search.
    Heinous headache flame; search for search for search.
    Aspirin, Ibu-profane; meta-search verse.
    This verse is the worst verse I rehearsed for terseness; purse your lips.
    Self-reflection, meta-induction, musical lesson; shake your hips.
    My love for learning's not logical; I hope this rap's pedagogical.
    My true love will not toggle, but if you kiss a prince-turned-frog it will.
    Does the music help you learn?  Aren't the lyrics quite ironic?
    If you keep on learning learning, you'll be Hooked on Phonics.
    A B C D E F G,
    H I J K LMNOP.
    What makes you tick? Y'all kinda unpredictable;
    you're fickle, your process is intangible.
    Can I tickle your fancy and glean from you the reasons
    you emancipate your proclamations irrespective of the season?
    What mechanism manifests as your multitude of actions?
    You are a beautiful model in disguise with life's real-world complications.
    I want to model you, model, you're the Apple of my eye,
    And beauty is in the eye of the observing agent, I.
    You speak the machine language of love, 3PO's little R2. 
    You're made by Apple; you're clockwork, orange you? 
    I want to model your mind, you're a model of modern human kind,
    I give you a 99 on the accuracy scale --
    I love your binomial bell curvey tale.
    I'm biased since you remind me of my mom who made me what I am today;
    Nature versus nurture?  Eggs all the way!
    Don't spurn me, I'm the Ernie to your Bert, I'll burst your bubble;
    Big trouble in a little dinner where I eat, 
    Dinah won't you blow your own horn -- beep beep!
    You are more than just a model -- your actions are real.
    Bit, byte, bat your eye, validate my hypothetical zeal.
    Induction of an approximation of an elusive target function to model nature
    is doing science for which we're too lazy or immature.
    Nature taught me don't eat wood or stones;
    nurture taught me don't eat poprocks and Coke;
    learning taught me don't eat too many Doritos.
    Were my lessons learned?  By rote feedback.
    Was my wisdom learned?  By a spoonfed hack.
    Were my genetics learned?  Evolution's knack.
    For general numeric functions I gotta simple idear:
    a neural net of weighted sums with sigmoids for non-linear.
    The Froot Loops toucan follows his nose which always knows;
    with fruitful loops, you can follow the gradient to hit the min below.
    Gradient descent is guided via supervised feedback.
    It'll learn to feed itself -- Big Mac attack!
    The feedback at the outputs trickles backward through the network.
    This node-ticklin' 'll trick your hidden nodes to improve net worth.
    In particular, each particular arc adjusts its weight
    according to gradient, estimate error, magnitude of input and learning rate.
    Thus backprop allocates credit and blame where it is due --
    makes a hip-hop flip-flop dance of arc weight so they agree on what to do.
    Yes the magic sigmoid smooth squashing non-linear bias is the secret code
    for emergent co-evolution of mutually-interreliant concepts embodied by
        internal-layer nodes.
    My perceptron parity may seem odd, but my self-reflection never halts:
    "This statement is not not not not not not not not not false."
    Dynamically thinking new things you oughtta
    come outta that shell and yell a lot that I taught ya.
    Ideally we haven't simply reinvented the wheel,
    stealing heaps of metal for a mental revolution ideal.
    Hypothetically my hypothesis will remove the hype and lesson
    all the misconceptions of induction so you'll finally learn your lesson.
    This will stick in your mind, a new knowledge-making glue stick.
    It's time to grow up and face my music!
    New insights are gained by a syntactically defined procedure
    that freely roves mounds of data without a dictatorial teacher.
    Learn this lesson well, my friend -- know it, learn it and see it.
    Expand your notion of creativity and who or what would, can and should be it.
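    For readers who want the backprop verse in code: below is a minimal, hypothetical sketch (not part of the song) of gradient descent on a single sigmoid unit learning boolean OR. As the lyrics describe, each weight adjusts in proportion to the error signal, the magnitude of its input, and the learning rate.

    ```python
    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # Toy supervised data: learn boolean OR from labeled examples.
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
    w = [0.0, 0.0]   # arc weights
    b = 0.0          # bias weight
    lr = 0.5         # learning rate

    for epoch in range(2000):
        for x, target in data:
            out = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = target - out               # supervised feedback
            grad = err * out * (1 - out)     # error scaled by sigmoid's slope
            for i in range(2):
                # weight change ~ gradient * input magnitude * learning rate
                w[i] += lr * grad * x[i]
            b += lr * grad

    preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
    print(preds)  # → [0, 1, 1, 1]
    ```

    With more units, the same update runs at every arc, and hidden nodes receive their share of the credit and blame, which is the backpropagation the verse raps about.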

    © 1999 Eric V. Siegel