Weird Communities, Tech Dreams, and Effective Altruism


            New Orleans       Remember the old saw, “If you’re so smart, how come you aren’t rich?”  These days, we need to update that question and restate it as, “If you’re so rich, how come you aren’t smarter?”

When you read about all the tech bros who are heavy breathers over Ayn Rand and her semi-fascist philosophy of supermen made to rule by their own lights, you know that when the Silicon Valley types and their wannabes wave the libertarian flag, it means trouble for the rest of us, because we’re just leftovers on the hindside.  Those of us less gifted and less narcissistic have been crossing our fingers and hoping for the best for years, assuming eventually all of those guys would have enough money to buy a clue, so we could survive the worst of it.  But with every reveal of the real Elon Musk, that prospect has seemed more distant.

Then we hear and read about one of the latest fads sweeping that lot, called “effective altruism.”  Not wanting just to shoot from the hip and write it all off as more pop philosophy, some of us actually read philosopher Peter Singer’s arguments and, in the way of these flights of extreme rationality, found them interesting without being convincing in real-world terms.  The term “effective altruism” is clunky, but it has some appeal.  It is trying to do some good.  The biggest problem is that while we can grant that philosophy as a thought experiment can be pure, we have to quickly remind all of the philosophers and their disciples that people decidedly are not.  When philosophical utilitarianism argues that you should make as much money as you can and then spend it so that the most people benefit, you need to make sure your BS barometer is turned on, so that this isn’t just a convenient rationalization for a bunch of people to make a ton of money until they grow up, then get over it and dump the altruism part.

Maybe that’s just cynical me, but listening to Michael Lewis read half of his audiobook about Sam Bankman-Fried, as I drove from Arkansas to Louisiana, was très creepy.  The first time Sam drove a crypto firm into the wall, he had been hiring almost exclusively from the EA community, but when a dispute broke out, they all wanted to cash up and out big time, even though almost all of them, including Sam, donated to the same group of largely mainline charities.  I’m even reading now that effective altruism was a big driver of the conflict at OpenAI that has been this week’s Silicon Valley heartbeat.  I was rooting for the nonprofit board against the corporate takeover, partially engineered by Sam Altman after his firing and then return.  Behind the scenes before they surrendered, they seem to have won some critical skirmishes that left some protections the rest of us need from artificial intelligence gone wild, until the government finally gets its act together and puts together real regulation, not just flimsy guidelines.

Turns out the old board and its mission were heavily EA, although we have to be careful that we’re not just being force-fed spin, because the Bankman-Fried scandal so totally eviscerated the entire effective altruism movement and any of its pretenses that it’s now a tech billionaire whipping boy.  Nonetheless, one point Lewis keeps making about Bankman-Fried and his people is how much they claim to be looking out for all people, yet can’t actually deal with real people in a real human way.  I find that very unsettling, even if unsurprising.  Oh, and don’t let me forget to mention that the 700 or so OpenAI workers who demanded Altman’s return did so partly because they were afraid the board’s action would erase their stock options in the for-profit OpenAI arm.  They weren’t clamoring for Sam Altruism to come back, but Sam Altman.

All of which makes it hard to figure out who might be left with a white hat in this mess.  I’ll stand by the position that money is still the bottom line for the tech companies and most of their acolytes, no matter what they pretend.  When money is in the driver’s seat, we’re all in line to be roadkill.