Sherry Turkle wrote a scathing critique of the culture of Silicon Valley.
"Silicon Valley companies began life with the Fairy dust of 1960s dreams sprinkled on them. The revolution that 1960s activists dreamed of had failed, but the personal computer movement carried that dream onto the early personal computer industry. Hobbyist fairs, a communitarian language, and the very place of their birth encouraged this fantasy. Nevertheless, it soon became clear that, like all companies, what these companies wanted most of all, was to make money. Not to foster democracy, not to foster community and new thinking, but to make money."
"Making money with digital tools in neoliberal capitalism led to four practices that constituted a baseline ideology-in-practice."
Those are:

1. "The scraping and selling of user data"
2. "The normalization of lying to the public while wearing a public face of moral high-mindedness"
3. "Silicon Valley companies that have user-facing platforms want, most of all, to keep people at their screens"
4. "Avatars have politics."
That last one has to do with how people behave differently in online conversations versus face-to-face ones.
Commentary: Her critique isn't especially original, but it got me thinking about generational differences in how people relate to technology, and speculating about what the next shift will be.
Sherry Turkle is a longtime luminary in the human-computer interaction field from her work at MIT, and she has written several books, including The Second Self, Life on the Screen, and Alone Together.

Perhaps because she's such a luminary, the comment section here is more thoughtful than usual. She seems to pine for the days when kids would hang out after school, nothing was recorded, and conversations didn't spread beyond the people who were physically there, so people could say what they truly believed without "self-censoring". In the comments, people counter that it's not platforms like Facebook themselves (questionable morals or otherwise) that people are afraid of; it's their peers, future employers, and so on. Some argue that self-censorship is a good thing and an essential part of learning socialization. Others say young people simply accept all-pervasive surveillance as a fait accompli, because the war for privacy was irrevocably lost by previous generations before they even showed up.
Regarding the future, though, I've been thinking people might come to care more about what the platforms themselves think, because the platforms have the AI models. In the past, platforms couldn't watch every conversation, because the number of users massively dwarfed the number of employees any company could have. Then they built algorithms to monitor conversations and remove unacceptable content and people, but those algorithms have been crude -- evolving from simple keyword searches to more sophisticated but still very unreliable "sentiment analysis" classifiers. Now, however, language models are showing they can really, truly understand what people are saying, to the point where, for example, talking in "coded language" to avoid the keywords the platforms are looking for will no longer work. Very soon, the platforms will be able to understand the meaning of messages as well as a human employee reading them would.
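To make that shift concrete, here's a minimal sketch (all names, keywords, and euphemisms are invented for illustration) of why a keyword filter misses coded language, while a meaning-level check of the kind a language model enables does not:

```python
# Hypothetical sketch: keyword moderation vs. meaning-level moderation.
# All keywords and euphemisms here are invented for illustration.

BANNED_KEYWORDS = {"contraband"}  # hypothetical banned term

def keyword_filter(message: str) -> bool:
    """First-generation moderation: flag only literal keyword matches."""
    return any(word in BANNED_KEYWORDS for word in message.lower().split())

def meaning_filter(message: str) -> bool:
    """Stand-in for a language-model call that judges intent, not tokens.

    A real system would send the message to a model and parse its verdict;
    here the euphemism is hard-coded just to illustrate the idea.
    """
    euphemisms = {"garden supplies"}  # what a model might infer from context
    return keyword_filter(message) or any(
        phrase in message.lower() for phrase in euphemisms
    )

msg = "meet me tonight to trade some 'garden supplies'"
print(keyword_filter(msg))  # False -- coded language slips past keywords
print(meaning_filter(msg))  # True  -- a meaning-level check catches it
```

The hard-coded euphemism obviously stands in for what a real model would infer from context; the point is that moderation keyed to meaning rather than surface tokens closes the coded-language loophole.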
I could make the case that this is actually a positive development. People who are genuinely weird but harmless -- who currently might get banned, have their content deleted, or receive other punishments -- might be left alone, while people who are genuinely dangerous, who might otherwise slip under the radar, will stand out and be highly noticeable to the platform operators.
What I see as the potential downside is that the language models will also understand things like political philosophy just as well as human employees reading messages would. That means the people using a platform might be unable to hold political views that differ from those of the people running it. If you think that's a good thing because you agree with the political ideology of the people who run social networking platforms, remember that globally there are many places where the dominant political ideology is different from what it is here.