• mindbleach@lemmy.world

    I am a computer engineer. I get the math.

    This is not about the math.

    Trying to speed up a linear program after the fact means you’ve already failed. That’s not what parallelism is for. That’s the opposite of how it works.

    Parallel design has to be there from the start. But if you tell people adding more cores doesn’t help (unless!), they’re not hearing “unless.” They’re hearing “doesn’t.” So they build shitty programs, bemoan poor performance, and turn to parallelism to hurry things up - and wow, look at that, it doesn’t help.
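
    A toy sketch of that distinction, in Python (the example and names are mine, not anything from Amdahl):

    ```python
    # Serial by construction: each iteration needs the previous one's result.
    # No number of cores makes this loop itself go faster.
    def running_total(xs):
        total, out = 0, []
        for x in xs:
            total += x          # loop-carried dependency
            out.append(total)
        return out

    # Parallel by construction: every element is independent,
    # so the work splits across as many cores as you have.
    def squares(xs):
        return [x * x for x in xs]
    ```

    If a program is built out of loops like the first one, bolting parallelism on afterward gets you nothing - which is exactly the trap.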

    I am describing a bias.

    I am describing how a bias is reinforced.

    That’s not even a corruption of Amdahl’s law, because again, the actual dude named Amdahl was talking to people who wanted to build parallel machines to speed up their shitty linear code. He wasn’t telling them to code better. He was telling them to build different machines.

    Building different machines is what we did for thirty or forty years after that. Did we also teach people to make parallelism-friendly programs? Did we fuck. We’re still telling students about “linear portions” as if programs still get entered on a teletype and eventually halt. What should be a 300-level class about optimization is instead thrown at people barely past Hello World.

    We tell them a billion processors might get them a 10% speedup. I know what it means. You know what it means. They fucking don’t.
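
    For concreteness, that claim is just the standard Amdahl formula, speedup = 1 / (s + (1 - s)/N), run with an almost-entirely-serial program (the numbers here are illustrative):

    ```python
    def amdahl_speedup(serial_fraction, cores):
        """Maximum overall speedup when only the non-serial part parallelizes."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    # A program that is 91% serial caps out at ~1.1x - a billion cores
    # buy you a 10% speedup, and four cores barely register either.
    print(amdahl_speedup(0.91, 1_000_000_000))  # ~1.099
    print(amdahl_speedup(0.91, 4))              # ~1.072
    ```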

    Every student’s introduction to parallelism should be a case where parallelism works. Something graphical, why not. An edge-detect filter that crawls on a monster CPU and flies on a toy GPU. Not some archaic exercise in frustration. Not some how-to for turning two whole cores into a processor and a half. People should be thinking in workloads before they learn what a goddamn pointer is. We betray them by using a framing of technology that’s older than disco. Amdahl’s law as she is taught is a relic of the mainframe era.
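
    Something like this, maybe - a minimal sketch of that first exercise (NumPy plus a process pool; the band-splitting scheme and every name in it are mine):

    ```python
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def edge_band(band):
        """Gradient-magnitude edges for one horizontal band.
        The band carries one halo row on each side; only the interior is returned."""
        gy = band[2:, 1:-1] - band[:-2, 1:-1]   # vertical differences
        gx = band[1:-1, 2:] - band[1:-1, :-2]   # horizontal differences
        return np.hypot(gx, gy)

    def parallel_edges(img, workers=4):
        img = np.pad(img.astype(np.float64), 1, mode="edge")  # halo at the borders
        rows = img.shape[0] - 2
        bounds = np.linspace(0, rows, workers + 1, dtype=int)
        bands = [img[a : b + 2] for a, b in zip(bounds[:-1], bounds[1:])]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return np.vstack(list(pool.map(edge_band, bands)))

    if __name__ == "__main__":
        frame = np.random.rand(2000, 2000)
        print(parallel_edges(frame).shape)  # (2000, 2000)
    ```

    Every pixel is independent, so the same code scales from two cores to two thousand. The student sees the workload first, not the caveat.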

    Telling kids about the limits of parallelism before they’ve started relying on it has been an excellent way to ensure they won’t.