How Did Silicon Valley Break Bad?
A conversation with Henry Farrell about how the land of hippie-technologists turned into a haven for right-wingers with grand and malevolent ambitions.
Thank you for reading The Cross Section, and if you find my work valuable and would like it to continue, consider becoming a paid subscriber. This site has no paywall, so I depend on the generosity of readers to sustain the work I present here. Thanks.
The increasing power of Silicon Valley in recent years has seemed to go hand-in-hand with its moral degradation, as it becomes dominated by not just conservatives but a particularly toxic kind of right-wing plutocrat. So how did it happen?
To explore the question, I interviewed Henry Farrell, a professor at the Johns Hopkins University School of Advanced International Studies who has thought a great deal about the people, ideas, and forces driving the tech industry. The audio is below, along with a transcript. You can also find it by searching The Cross Section wherever you get your podcasts.
Transcript:
Paul Waldman: All right, I want to start this discussion with something that I wrote about a couple of months ago that comes out of my own evolving feelings about Silicon Valley. Now, some of us thought maybe 20 years or so ago, when we looked at the people who were populating that newly ascendant economic engine of the country, that these Silicon Valley tech leaders could be an improvement over the earlier generations of oligarchs that we saw, those cigar-chomping railroad barons and oil tycoons who dominated the economy and bent government to their will for so long. These nerd overlords seemed like they were undoubtedly smart and visionary. They had liberal social values. They dressed casually. They created these nurturing workplaces, or so we thought at the time. And if some group of capitalists had to rule over us, we could do a lot worse.
But today, with Elon Musk ripping apart the federal government and other tech leaders lining up to pay tribute to Donald Trump and the tech industry increasingly an engine of surveillance and far right politics and ever crappier products, Silicon Valley seems to have broken bad. How did this happen? So to explore this issue, I'm talking today to Henry Farrell, who is a professor at the Johns Hopkins School of Advanced International Studies and one of the most trenchant observers of the ground where technology, politics, and commerce converge. His latest book is Underground Empire: How America Weaponized the World Economy with Abraham Newman. And you can find his newsletter, which is terrific, at programmablemutter.com.
So Henry, let me start with this question. You recently wrote a piece for Bloomberg about the Silicon Valley canon, which is a collection of books that the tech titans are either reading or telling each other to read or have on their shelves. What is it and what does it tell us about what these men, and they are almost all men, think about themselves and think about the government as they are now turning their gaze to it?
Henry Farrell: So I think that where you need to begin is to understand the role that this plays. There was a conversation that began in the middle of last year when Jasmine Sun, who's a really interesting thinker in Silicon Valley, talked about this book called Seeing Like a State – it's by James Scott, and it's kind of a weird book for tech people to be interested in, because it seems to argue that technology and the engineering mindset can be a curse. And Sun wanted to know why it is that this book was on so many people's bookshelves in Silicon Valley.
So this gets taken up by a guy called Tanner Greer. He is an interesting conservative thinker and writer who asks whether we can talk about a vague tech canon. And then Patrick Collison, who is the CEO of Stripe, one of the two founders of Stripe, comes out and says, well, here is the tech canon as I understand it, here are 43 different sources that I think shape thinking within Silicon Valley. Now, Collison is absolutely clear that he does not endorse all of these books. These are not the books that he himself necessarily would want to read. But these are the books that, in his view, actually shape and express how people in Silicon Valley think about the world. And the canon more or less is a few websites and a lot of books; some of them are science fiction books, a lot of them tend to be biographies or stories of teams of one sort or another. And the fundamental message that comes across in these books is this idea of the world as a place where you have striving individuals, perhaps small teams, who really are the heroes of the story. These are people with – well, they're men, not entirely but nearly all men – grand ambitions and grand flaws who set out to remake the world according to their values.
And so the argument that I make in the Bloomberg piece is that this gives a certain kind of mentality, a certain kind of sense to Silicon Valley people that they are an elite following in the footsteps of other elites like Robert Moses, the guy who made modern New York City what it is, like Teddy Roosevelt. And so you get these biographies rubbing shoulders with a biography of Elon Musk, effectively suggesting that Elon Musk is just the latest of a series of world-bestriding colossuses, these fantastically great men who really have grand ambitions who reshape the world in their ambitions.
And so the argument is, you can't really understand DOGE, you can't really understand why it is that people like Musk and indeed many other people in Silicon Valley want to reshape the world if you don't pay attention to this kind of self-image that they have, that these are the people who are prepared to break apart the systems that don't work and create a better world in their stead, albeit certainly cracking massive, massive volumes of eggs and hurting people's livelihoods when they do. But basically, that just goes along with the business of being a world-shattering, fantastic intellect and great man.
Paul Waldman: So this seems to run up against what we understood to be the Silicon Valley ethos, or at least there was an image of it. The Whole Earth Catalog was first published in Menlo Park in 1968; for those who don't know, it was this very influential kind of combination magazine/catalog of tools that you could use to start your own commune. And there was this kind of communal idea that people like Steve Jobs used to cite as part of the spirit of that place where they were creating these new machines and eventually software. And it had this image of a place where the workplaces were all really kind of open and creative and people were zooming around on scooters and bringing their dogs to work. And it was not so much about, we are going to reshape the world in our image, whether the rest of humanity likes it or not. It was like, wouldn't it be cool if we could, you know, create these neat tools that would enable people to do all kinds of things that they hadn't been able to do before. So I guess one question I have is, was it never really true that Silicon Valley had this kind of lefty communal spirit to it? Or was it inevitable that once it began to accumulate so much wealth, the leaders would turn into something more grandiose when it came to their vision of how they ought to stand atop society, bestriding it like a colossus?
Henry Farrell: So I think it was always sort of true, but only sort of true. Here I would think, as you say, there is this sense that a lot of Silicon Valley culture did come from places like the Whole Earth Catalog. There's a great book by a guy called Fred Turner, which talks about this cyber culture and counterculture and how the one emerged and evolved into the other. But there always has been, together with this, a different ethos of Silicon Valley, which has been important. Because you've got to remember, of course, Silicon Valley began as a kind of a military contracting business. The origins of this are semiconductors. And there's a great book by Margaret O'Mara talking about this. Semiconductors, their first customers are Uncle Sam and the Department of Defense, which needs missile guidance systems.
So the original Silicon Valley, it's really deeply and intensely connected into this military machine. You do get these more countercultural elements emerging in the 1970s and in the 1980s. But together with these elements, you do also get a significant right-wing culture emerging and being an important part at the same time. And there's a really interesting dissertation which was done by Becca Lewis – she is one of Fred Turner's students – pushing back in some very, very valuable ways at the Turner account of things and pointing out the ways in which back in the 1980s and 1990s, you get a bunch of activists who really, and here I'm quoting from her, “they tethered the ideal of Silicon Valley entrepreneurship to masculinity and the male breadwinner role in the traditional nuclear family and suggested that personal computers opened up a quasi-spiritual world specifically for entrepreneurs and used this to spread conservative ideas.” So you've got this other kind of counterculture to the counterculture effectively, which plays an important role in the debates and which also shapes some of the political views.
So there is work by Broockman and Malhotra, who did this survey, I think the only survey that really gets at the values of high-end people in Silicon Valley, the venture capitalists and the entrepreneurs. And this survey suggests that these people were, by and large, extremely liberal on social values. But when it came to notions of economics and in particular economic power, they really were very, very close to Republicans, even back before the change we've seen over the last couple of years actually took place, with people like Musk and others turning towards Trump.
So that there is a certain kind of, how can I put it, there always was a liberaltarian culture. This was probably the dominant culture. A lot of tolerance of different lifestyles, but going together with a very definite bias, for example, against unions. And so it's not entirely surprising that when you see a lot of people's economic interests come into perspective as really being very important during the Biden era, you do see a bunch of people breaking towards the right and breaking towards those elements around people like Peter Thiel, which had always been a significant undercurrent in Silicon Valley culture.
Paul Waldman: See, that's one of the interesting things, because if you remember in 2016, Peter Thiel, who is the head of Palantir, which is kind of a dystopian surveillance company, and also an old friend of Elon Musk's – they were two of the co-founders of PayPal, right? He spoke at the 2016 Republican convention. And I remember it being characterized as this kind of weird anomaly, because at the time we still thought of Silicon Valley as a place that was very tight with Barack Obama and his administration. It gave a lot of financial support to Obama, and there was also a lot of back and forth between Washington and Silicon Valley during those eight years that Obama was in office. And Thiel was seen as kind of a weirdo; I don't even know if you'd call him a Republican, he's kind of an extreme libertarian.
There's this famous quote from him in something he wrote in 2009 where he said, “I no longer believe that freedom and democracy are compatible.” And that's often mentioned as evidence of what a right-wing weirdo he is. But today that may be the dominant sentiment among Silicon Valley CEOs. And maybe you can look at that and say, well, should we be surprised that a bunch of centibillionaires are in fact hostile to unions and generally want to have low taxes and low regulation? Was there that kind of right-wing cohort even then and we just didn't notice it and Thiel was just one visible guy? Or was he really unusual at that time, even just eight years ago? What was the situation at that point and how much has it changed since then?
Henry Farrell: So I think that there always was more of this undercurrent than people realized. So Rob Reich, who is a political theorist at Stanford, he recounts in a jointly written book that he was invited at some point to give a talk to a bunch of very, very important Silicon Valley people. He doesn't name who they are, but given Stanford's connections, I suspect that these really were people who were right at the top of the pecking order. And they're talking about, what is their ideal society? How can you actually create an ideal society? And Rob starts talking about how it is that you want to have some degree of democracy, and he basically gets laughed out of the room. Democracy, even back a few years ago, is seen as being part of the problem, not as being part of the solution.
So I think that there is that latent layer of skepticism about democracy, skepticism about the East Coast, skepticism about all of these very dyed-in-the-wool ways of doing things that has been an important part of Silicon Valley. And indeed, when you see people talking about moving fast and breaking things, breaking laws has always been an important part of the Silicon Valley business model for many key entrepreneurs. If you look at Uber, for example, Uber's modus operandi was more or less to break the law and to create a fait accompli, which people then had to effectively work around, in order to break these taxi monopolies and whatever. So I think that this has always been an important part of the mix, albeit a somewhat subdued part of the mix.
So I think that two things have changed. I think, first of all, people who might have been rather shy about talking about these beliefs have become notably less shy over time. And second, I think that together with this, there is a real radicalization that has happened where many people, and I should stress, this is probably not the majority by any stretch of the imagination of people in Silicon Valley. What evidence we have suggests that, for example, the donations that come from that area still tend to trend very heavily towards the Democratic Party. But we have a world in which people who used to be perhaps kind of skeptical about democracy, interested in somewhat science fictional alternative societies and alternative ways of doing things have decided that both their interests and their ideals push them towards some quite radically anti-democratic moves and quite radically anti-democratic understandings of how policy ought to go forward.
And the accession of Donald Trump to power, of course, has been a radical accelerant for all of this. I think that Jeff Bezos is one of the perfect examples. So Jeff Bezos back in the day, he was one of the people, I understand, who got the Washington Post to adopt this “Democracy Dies in Darkness” slogan that it used to have during the first Trump administration. And now he's more or less saying that if you are an op-ed writer or an opinion writer for the Post, you talk about free markets, you talk about individual liberties, and democracy very emphatically does not get mentioned in that list of desired things that you're supposed to talk about. Democracy seems, perhaps through a combination of business interests and also ideological shifts, to be something that Bezos has basically thrown out: we don't particularly care about this, we don't want to talk about this. And he is, I think, probably on the left edge of a quite powerful clique of Silicon Valley people who are now, I think, turning in anti-democratic directions in some very, very important ways.
Paul Waldman: I wonder if the seeds of that are in this idea of disruption, which is so central to Silicon Valley ideology. I guess you could say they came by it honestly, because from the beginning, they were creating things that really were going to disrupt all kinds of industries and the way people live. But it became a cliche to the point where every kid who is trying to code an app to let you order a six-pack of beer on a Friday night has to say that he's going to disrupt the whole beer industry or whatever it is – everything had to be disruptive and was going to totally transform everything. And once you assimilate the idea that the whole point of the enterprise is that all of the old ways of doing things and the old systems have to be swept away because what we are doing is so dramatic and so visionary and so brilliant, then it's just a short hop to say, well, why don't we just go ahead and disrupt the government?
And that has its end point in Elon Musk coming through and laying waste to the way everything works, because the fact is that a lot of democracy is very hidebound and slow and has a lot of redundancies built in. And those are sometimes just dysfunctional and could be reformed, but sometimes they actually serve a purpose. One of the models that people have cited for what Musk is doing right now is what he did to Twitter, where he basically came in and just fired everybody and then hired back some small portion of the people he had fired. And of course, if you shut down Twitter for a couple of days or it just doesn't work well for a couple of weeks or a couple of months, that doesn't really affect anybody. But if you do that and people don't get their Social Security checks, that's something much more significant.
So I wonder if you feel like the seeds of that are in there – in how you get from this idea that disruption is really good and we should question how things are done and build new tools that can change things around and make things more efficient and work better and create new opportunities for people to do things that they've never been able to do before, and if old industries get swept away in that, they didn't deserve to live anyway – to thinking we ought to just dismantle the entire democratic system we've had for 250 years?
Henry Farrell: So I think that this gets back to something you said at the beginning, which is, you do want to recognize that there is a fair amount of value to what Silicon Valley has done. There are technologies that we simply cannot live our lives without. I'm one of the people, I guess you are as well, who remembers what life was like before Google search, before having this massive array of valuable information at your fingertips. So it's really, really easy to, I think, discount the value that, in fact, this disruptive attitude towards the world has had in the past, and to some extent still has going forward. Equally, and I think here is a fundamental problem, this is an approach to business. It is not an appropriate approach to political philosophy and a general theory of how the world ought to work.
Because as you say, there are a lot of systems out there which are both messy and complex, where you can see what appear to you to be inefficiencies but which actually turn out to be load-bearing structures of one sort or another, simply because you don't understand what they do. And so here, I think that if you look at the world through this particular mindset, it is really an optimizing mindset. In Silicon Valley, there's one weird trick to a lot of these disruptive strategies, which is: you see this complex structure, this complex way of doing things, which doesn't work particularly well. You figure out, as an engineer, some way of optimizing it, some way to turn whatever this complex thing is into some simple set of structures with a so-called objective function – a mathematical function of the relevant variables, which you maximize or minimize in order to approximate whatever it is that you want to achieve. Then, bingo, you have ended up with a fantastic business model that takes this clunky, inefficient system and turns it into something cleaner, simpler, cheaper. Also, very often, involving a lot less employment for people, because you rely much, much more on algorithms. But there is a certain logic and a certain value to that.
Where this doesn't work is when you begin to get into the infrastructures of society and in particular the kinds of social and political bargains that keep us held together. Because these do look incredibly inefficient in a lot of ways, because human beings – if you think about human society, even democratic societies like the United States – they do not have an objective function. They do not have a set of values which can be minimized and maximized. Instead, these are messy structures which are held together by duct tape and spit, but they are held together because they allow people with very different values, very different ideas of what they want to achieve or don't achieve, to live together in a society which offers a relative degree of peace.
And so when you start ripping the shit out of these systems, when you start thinking these are systems that can be optimized, these are systems that can be thrown away, you discover that this becomes extremely problematic, to put it mildly. And you may discover that many of the things you kind of assume away in your philosophy of the world – which effectively thinks about society as a set of problems to be optimized – a lot of the peace, a lot of the economic stability that is in effect the background condition for the work that you are setting out to do, can suddenly change drastically. And I worry very much that, if this doesn't stop, we may be in a world where a few missed Social Security payments are very much the beginning of the problem rather than the end, because society cannot be optimized. And I think we're in a moment now where we have some quite crazy ideas out there about how far you can go, how far you can push it. And the notion being, as with Twitter, you basically rip the hell out of things and you figure out from that what you need. And then perhaps you build back in some of what you need. That does not work if you're trying to do it at the level of society or the level of government, because you discover that a lot of the things you suddenly realize you need aren't actually available anymore for you to pick back up and put back into the system.
Paul Waldman: I want to ask you about science fiction, which you've mentioned. Going back to where I started this discussion, that was one of the things that that made me feel good about Silicon Valley, because when I was a kid, I was a nerd who read science fiction and a lot of these people who are now going to be running the world, they were nerds who read science fiction. And for many science fiction fans, it's a genre about imagination and thinking about how technology can make the world both better and worse and what the future is going to be like. But many of these people seem to be taking all the wrong lessons from what they read as kids. The classic articulation of this was a viral tweet from 2021 right after Mark Zuckerberg announced that Facebook was changing its name to Meta and they unveiled the first iteration of the Metaverse, which still many billions of dollars later is incredibly crappy. And there's a writer and game designer named Alex Blechman who did this tweet, where the first line is: “Sci-fi author: in my book I invented the Torment Nexus as a cautionary tale.” The second line is, “Tech company: at long last we have created the Torment Nexus from the classic sci-fi novel Don't Create the Torment Nexus.”
That resonated with people, I think, not just because Silicon Valley often creates terrible or potentially threatening things, but because whether what they're producing is horrifying or useful or just banal, it's always presented with this kind of grandiosity – that it's going to bring us to our utopian future and that that future is always going to be good. And every once in a while, you know, Sam Altman says, well, let's make sure that the AI doesn't kill us all. But mostly it's just, “Everything is going to be great, and I know this because I read some science fiction when I was a kid.” So I guess my question is – and I know this is something you've thought about – those tech leaders, were they reading the wrong science fiction? Were they misreading what they read? Were they just taking lessons that they shouldn't have from it? How do you see that playing into where we are today?
Henry Farrell: I think science fiction really has played a very important role in shaping the imaginations of people who think about the world. And you see this applying in a lot of places. You see it applying in AI, where I think a lot of the notions that we have around AI, the notions of AGI, or artificial general intelligence, really are taken from the ideas of science fiction writers like Vernor Vinge back in the 1990s. But the key science fiction writer, I think, for understanding how Silicon Valley thinks about the world is Neal Stephenson. And here I think there are two books by Neal Stephenson. One of them is cited by Collison, The Diamond Age. The other is Snow Crash, which is a much more dystopian vision of the future. And dystopian in the sense that it is a pretty horrible future.
But Stephenson is also showing, I think, his political beliefs to some extent. He's also being quite funny, and he's satirizing, and he is giving you a sense of this future, which is both terrifying and a little ridiculous at the same time. And I feel that many people actually have taken up these ideas and have taken them entirely seriously as being a kind of a business plan/prediction of the way that the world is going to work. So, for example, Snow Crash comes out, I think, back in 1992. And in 1997, there is a book by two guys, Davidson and Rees-Mogg. Rees-Mogg is the father of Jacob Rees-Mogg, the famous conservative pro-Brexit politician in the United Kingdom. But his father, this other guy, and Davidson come out with this book called The Sovereign Individual, which predicts that we are going to end up in a world in which technology is going to allow everybody, and in particular, powerful financial individuals, creators, entrepreneurs – it is going to allow them to exit the nation state. It is going to allow them to get out of this structure. It is going to allow them to get away from democracy, which is dominated by all of these worthless drones, and to create their own kind of glorious future.
And The Sovereign Individual is one of these books which very, very few people read, but among the people who read it is Peter Thiel. So when the book comes out a few years ago with a new edition, Thiel writes the introduction and more or less says, this is the book that you need to read if you want to understand the world. And so I think that if you look at many of the visions that are unfurling – the vision of Thiel, the vision of people like Balaji Srinivasan, who's another very important person in this world who used to work for Andreessen Horowitz – if you look at Marc Andreessen's own views, you will see that they are influenced by this nexus of ideas from science fiction, where The Sovereign Individual, which traces back to Snow Crash in some very important ways, really shapes their understanding of the world.
And the other set of ideas that I think is very important comes from a British philosopher called Nick Land, who is part of the Cybernetic Culture Research Unit at Warwick back in the 1990s, reads a whole lot of science fiction, incorporates this into his philosophy, has a kind of a breakdown following, I think, copious amounts of drugs, alcohol, and other things, but then reinvents himself after a few years as this radical anti-democratic philosopher who proposes, very similarly to the sovereign individual perspective, that we need to move towards a much more fragmented world without the same kinds of structures of authority.
And he then begins to rehabilitate a third figure who is important, this guy called Curtis Yarvin, who used to blog – for those of you who remember the blogosphere in its early days, he had this minor, extremely crazy blog. And so Land then effectively picks up Yarvin's ideas, gives them a much more sophisticated spin than Yarvin himself is capable of doing, and then helps get Yarvin launched into the Peter Thiel extended universe. He is somebody who is picked up by Thiel. And then these ideas begin to filter out to people like JD Vance, who are very, very deeply influenced – and say that they're influenced – by Yarvin's perspective of this radically anti-democratic world in which we would see traditional forms of government being replaced effectively by sovereign corporations of one sort or another. So there is this very complex process by which this somewhat joking, quasi-serious satire by Neal Stephenson, this dystopian view of the world, gets effectively transmogrified through other writers into a political philosophy that deeply influences the current vice president of the United States of America. And that's a rather nerve-racking thing.
Paul Waldman: And you can hear the echoes of Ayn Rand in there too, the great man who can't be held back by the constraints of the system and the ants who exist below him. And that I think is probably a good description of how Elon Musk looks at the world, that the rest of us are essentially NPCs, non-player characters. We don't really matter. Only he has the vision to take humanity out beyond the stars. And so the laws and the rules and democracy are just a bunch of constraints that can't be tolerated and have to be swept aside.
But maybe we could finish up this discussion on a slightly more cheery note. You have also written about some very interesting people in and around the Bay Area in San Francisco who are trying to kind of reimagine what Silicon Valley can be. Can you talk a little bit about what they're advocating for and maybe address the question of whether or not these really creative people who are trying to get back to something different or create something different can really compete against the guys who have all the money?
Henry Farrell: So I think that there are two ways in which you might be able to do this. One is a way that I don't really talk about in the Bloomberg piece, because I think that they don't fit into the Silicon Valley canon as such. These are people who are really, I think, critics of one sort or another, who are sharp critics of the way that technology actually works. And so I think here, there's a whole bunch of prominent people who think in this way, a lot of them around AI, a lot of them around other topics, who are really trying to push back against this set of assumptions that I think is probably pretty pernicious. But what I chose to focus on in the article was a slightly different and more politically heterogeneous crowd of people, by which I mean to say these are people who I suspect are probably not lefties in the way that I'm a lefty. There are some people I think who are very definitely on the classical liberal side, but who are interested, as far as I can tell, broadly speaking, in two things.
First of all, they're interested in the heterogeneity and the intellectual diversity that used to be a real, important part of Silicon Valley. It is hard to recognize now, given that Silicon Valley appears, from the outside at least, to be a kind of monolithic monoculture dominated by these white dudes and their immediate servitors who have a very, very single-minded view of the world. But Silicon Valley and the Bay Area used to be, as you say, a strange and countercultural place. And so I think the first thing that they are trying to do is to revive that sense of a counterculture, to revive that sense of a culture in which lots of different people with different perspectives, different views – people who don't necessarily fit with the outside world and also don't necessarily fit with each other in any very simple way – can come together in a milieu where they can come up with interesting new ideas, interesting new ways of living together. And so I think that that is really the first thing where I think that there is some hope.
And the second thing that I think is important is, if you look at some of the people in Silicon Valley who are coming from this, again, more right-wing perspective than I have, there is also a lot of value in some conservative or right-wing thinkers. Here I'm thinking about people like Ernest Gellner and Karl Popper, who is a very, very important touchstone for many of these people, like, for example, Patrick Collison. And what I think all of these people have is a much more skeptical view of grand projects of one sort or another, of grand efforts at social engineering, and much more interest in piecemeal social engineering, to use a phrase from Popper. That is, rather than trying to sort of recreate government in this grand historic step in the way that Elon Musk is trying to do, you try to reinvent things bit by bit.
And this also, I think, actually is manifested in the service which Elon Musk – think of him kind of like an alien facehugger – has colonized and is trying to burst out of the stomach of. Sorry, that's probably much too gross a metaphor, but there you go. That's what you get when you get an Irishman beginning to pontificate. But Elon Musk, in order to create DOGE, effectively took over this earlier thing called the US Digital Service, which was composed of a bunch of techies who had been invited into the government, who'd been tempted into the government, being paid far, far less than they ever had been paid in their lives, but having the opportunity to really make government better, not in any, as I say, grand scheme, but through figuring out bits and pieces here and there where you could actually connect government in a more practical way to people's lives, creating better interfaces, all of these kinds of things.
So if I were to hold out a hope, it would be for a Silicon Valley which goes back to that kind of vision, which goes back to a real understanding that if you're an engineer, you actually do have a lot of skills, you've got a lot of value to provide, but you also have to respect the problem. You have to understand the systems that you are looking to change. You have to be very, very careful about changing those systems, because there may be various forms of dependencies that you simply can't see that are going to be revealed when you begin to mess around. So I would love to see a world in which we saw Silicon Valley actually playing an incredibly valuable role, but without this kind of great man theory – instead being much more willing to think about small-scale things, specific interventions, and really trying to make people's lives better without assuming for those people that they ought to live their lives in this way, that way or the other way.
Paul Waldman: Well, perhaps out of the rubble of the destruction that Musk and Trump will cause to the government, we can actually begin to build something that really does work better. Henry, thank you so much for joining me. This has been a fascinating discussion.
Henry Farrell: Thank you so much, Paul. A lot of fun to talk to you.