#trust

anonymiss@despora.de

Top #cop in Black man’s deadly #arrest withheld cam #video

source: https://apnews.com/article/arrests-death-of-ronald-greene-d2868b81b5af53a62301742d1ba4b825

But Clary, the highest-ranking #officer among at least six state troopers at the scene of Greene’s May 10, 2019, arrest, told investigators later that day that he had no body camera footage of the incident — a #statement proven to be untrue when his 30-minute body #camera video of the arrest emerged last month.

#crime #police #justice #lie #fail #violence #USA #news #trust

dredmorbius@joindiaspora.com

In the frame of Moralising Pathology, what went wrong at Boeing?

"Boeing’s 737 Max Is a Saga of Capitalism Gone Awry":

... What made the crashes so vexing is that it was impossible to pin the blame on one central villain. Instead, the whole company seemed to be at fault. Time and again, Boeing executives and engineers didn’t take warning signs seriously enough, opted against adopting additional precautions and made decisions for the sake of saving money or raising profits. ...

https://www.nytimes.com/2020/11/24/sunday-review/boeing-737-max.html

In this case, maybe, morality is to blame. There were individual technical, regulatory, design, and management errors, yes, but the common underlying cause is a fundamental misalignment of moral values: engineering excellence and responsibility to passengers versus short-term profits, bonuses, and above all, Milton Friedman's murderous bugbear, "shareholder value".

Or was it just short-sightedness, short-term thinking, denial, and scapegoating?

See previously: Treating systemic problems as moral failings … is why California is on fire.

#boeing #737max #MoralisingPathology #CorporateCulture #ShareholderValue #EngineeringDriven #ShortTermism #ShortSightedness #risk #trust #RegulatoryCapture #blame #denial #scapegoating #distributedResponsibility #BigProblems #ethics #morals #MiltonFriedman #JackWelch #DennisMuilenburg #HarryStonecipher #DaveCalhoun

dredmorbius@joindiaspora.com

An HN reader comments: "I'd like a world where everyone consistently applies critical thinking to all sources of information."

That specifically Does. Not. Scale.[1]
It fails two ways:

  1. Individuals suffer information overload, trust breakdown, and validation fatigue.

  2. Society finds itself with no common foundation of shared facts and mechanisms. All points of view are asserted to be equally valid, and expertise is dismissed entirely. Tribal beliefs are asserted as true (for Us) and invalid (for Them).

There is, I'll posit, a broad gulf between "verify everything" and "be prepared to question any belief". One variant of the latter is "strong opinions, weakly held". I'm not fully convinced this is a valid model, but it seems a good initial approach. It addresses both the need to act in the moment, based on partial information, and the realisation that this information, and conclusions based on it, may be faulty. The problem occurs when making decisions with no recourse --- betting the farm, burning the boats.

Otherwise, I lean strongly on Baconian, Pragmatic principles: our brains, both individually and socially, are sense-making organs, optimising for practical benefit. The challenge is that complexity and rigour, though affording greater accuracy and precision, carry enormous perception, processing, and model-generation costs.

A manifestly false assumption of rational-markets (and rational-behaviour) theory is that information is free. It's not --- it has extraordinary costs, and model formation and coordination --- getting everyone on the same page --- are among the highest. We constantly face a complexity-cost constraint (this is the essence of Gresham's law), under which a much simpler model is, where environmental selection is relaxed, often more useful, as it permits discarding expensive perception, processing, and model transmission (education of the population). Which works fine until environmental selection pressure increases.

What trust, not in authority but in expertise, has to offer, subject to sufficient checks, is an efficient distribution of information, processing, and model formation. This is the ultimate aim of Baconian Science, expressed in the motto of the Royal Society: Nullius in verba --- on the word of no one. Rather, it is justified trust in experiment, experience, integrity, and institutions that gets you this.

Mind, the usual problem is that power-serving institutions have a staggering tendency to become self-serving, and to select not for truth but for self-interest. Correcting for this tendency is the great problem of political, commercial, social, justice, and moral systems.[2]


Notes:

  1. As I was composing this reply, another HNer in a different thread made similar remarks: https://news.ycombinator.com/item?id=23450409

  2. This piece comes as I'm trying to wrestle another essay on truth and epistemology. Though it, so far, is winning.

#truth #belief #socialKnowledge #culturalKnowledge #FrancisBacon #trust #pragmatism #GreshamsLaw

erbenpeter@pluspora.com

#trust #knowledge #fakenews

The puzzles made visible through “fake news” are hard. They are socially and culturally hard. They force us to contend with how people construct knowledge and ideas, communicate with others and construct a society. They are also deeply messy, revealing divisions and fractures in beliefs and attitudes. And that means that they are not technically easy to build or implement. If we want technical solutions to complex socio-technical issues, we can’t simply throw it over the wall and tell companies to fix the broken parts of society that they made visible and helped magnify. We need to work together and build coalitions of groups who do not share the same political and social ideals to address the issues that we can all agree are broken.

Google and Facebook Can’t Just Make Fake News Disappear

dredmorbius@joindiaspora.com

Fostering Trust under Intermediate Agency

I wish I could home in further on what trust is, but the concept itself seems sufficiently fuzzy that I cannot.

Etymological search has proved interesting. It's not definitive -- past meanings, origins, and evolutions of meaning are neither present meaning nor psychological or philosophical demonstrations of fundamental principles. But they do illuminate usages and connections through time.

Reputation is related to credit (Latin credere, "to believe"; whence credo), and to trust, ultimately from the PIE root *deru-, "to be firm, solid, steadfast". So this is related to risk, probability, reliability, likelihood.

"Past performance is no guarantee of future returns."

I'm not even sure democratic means work, all told. Athens democratically condemned Socrates to death, only to regret the decision shortly afterward (and he wasn't the only one this happened to). I've been poking at philosophical and information-theoretic concepts of truth and epistemology, as an outsider, for a few years. I tend to side fairly strongly with the pragmatists and empiricists -- truth is that which gives reliable guidance to the behaviour or properties of the observed world, according to the models and sensations we have for predicting, reasoning about, and perceiving it.

So, trust, based on evidence, collective assessment, admission of error where that's due, and more, seems a fairly good basis.

There's another problem, which I've been noodling at as well, that information technology improvements seem, as a whole, to create breakdowns in social trust institutions. The more you can ensure that someone does what they're supposed to be doing by checking up on them frequently -- every month, week, day, hour, minute, second, microsecond, ... -- the less you have to trust that they're going to do what they said, because you can verify it instead.

Instead of "trust, but verify", we end up with "verify, and distrust".

And that very much cuts both ways.

Neal Stephenson's The Diamond Age features a vignette in which members of one of the phyles comprising a future, quasi-dystopian society go through trust rituals. Individuals, unknown to one another, are tasked with performing some action. The failure of one to do this, either when or as required, means the certain death of the other. That's an interesting concept.

Someone in 20th-century politics or the military --- possibly Henry L. Stimson: "The only way to make a man trustworthy is to trust him" --- noted that the best way to earn trust is to give it. Give someone control over a thing, with the understanding that they might succeed or fail, but that the outcome is almost entirely in their hands.

The question then -- and no, I don't have the answer -- is whether this can be extended, not to, but into, our mediated spaces, in ways that don't merely foster reliability, but social trust itself. (I think the term "online" is a badly dated misnomer: it misconstrues what is not a separate space, but a continuation of physical and immediate presence, though one with greater levels of intermediate agency imposed.)

That's where I'm at.


From a G+ thread by Akira Bergman exploring questions of identity, information, reliability, truth, and trust. I'd really like to entice him over here. If he'd only trust me....

#trust #media #intermediateAgency

dredmorbius@joindiaspora.com

Who Can You Trust

Giselle Minoli made the point at #googleplus that we can only trust what we make ourselves. I have to disagree entirely.

But she does get at an absolutely central point, and a concept I've been drilling into for a couple of years. Gideon Rosenblatt inspired a key insight as well.

Society is founded on trust. And how societies function is based on the level of trust within the society. There are high-trust, and low-trust societies, and this is a large area of study I've only just begun to scrape at (drop the terms into Google Scholar or Microsoft's academic database for starters).

We can trust in the institutions which found themselves on trust itself. That is, those which are accountable, open, and transparent, and which commit to what it is they will do. They may fail, but if they fail, they're open about that as well.

(Google are ... somewhat ... adhering to that last, though I wouldn't say they're doing a great job of this.)

Start-ups' Flipside

Some time back on Hacker News, I replied to a comment saying "Oh, I could start up a project to do that" concerning some need, with the observation that it isn't the starting up that's hard, but the not-shutting-down.

Silicon Valley, synecdoche for the information media technology industry as a whole, is a start-up culture, but that goes hand-in-hand with being a shut-down culture. The sector takes risks, seeks extreme financial rewards, dubs its winners after fantastical creatures that don't exist --- unicorns --- and, if a venture is only modestly rather than outrageously successful, kills the concept without blinking. Even if there are hundreds of thousands, or hundreds of millions, of people affected.

Governments are institutions criticised for their inefficiencies, but a side-effect of that is a strong resistance to shutting down.

(Sometimes too strong, but then, all features can be useful or harmful, swords cut two ways, etc., etc.)

And governments aren't the only means to these ends. Social organisations, educational institutions, cities, and nations -- as cultural constructs of common language, religion, beliefs, traditions, rituals, etc. -- are extremely durable. Not indestructible, but far more resilient than most commercial institutions.

(I don't have explanations of why this is yet, though I'm thinking of a few possibilities. One that stands out is an ability to survive in a very-low-activity state for a long time -- call it a hibernation, pupal, or spore state if you will.)

What commercial structures provide is an immense capacity to capture capital flows and formation (debt, equity), attention (via the very information media technologies that Silicon Valley develops), political power, social, cultural, and propaganda influence, and the like. But the brightest stars burn out the fastest, and the trajectory of many tech firms is high but brief. AT&T, IBM, Apple, Oracle, and Microsoft are among the oldest alive today. All are monopolies. (Something that goes hand in hand with networks, physical or logical.) All have reputations for playing rough, largely because there's very little other than aggression which creates a defensible space.

There are many things I've built myself which no longer work, or which I cannot maintain or keep up to date with current needs. Rugged individualism is another trope, and in fact the term itself is political -- Herbert Hoover used it in a 1928 campaign speech, and its modern use is established from there (check that yourself on Google's Ngram viewer). So, no, we do need to work together, but we need to do so in ways that are trust-generative in and of themselves.

Information Technology is a Trust Killer

One of my strongest insights -- the one Gideon prompted me to -- was that advances in information technologies themselves tend to reduce overall social trust, and trust-building institutions. I'm not completely convinced this is correct, but it smells right to me.

In a pre-technological world, you could not communicate rapidly in rich detail beyond what you could immediately see or hear. You might be able to light a signal fire, visible for miles, to send a few bits. Or you could dispatch a messenger with a memorised or small written message, capable of covering 20 miles on foot, or possibly 250 miles by relay on horseback, per day. Distant travel was impossible for months of the year -- in the Northern Hemisphere, November to May meant no seafaring, due to storms and short days.

Every major preindustrial empire had an associated religion. The forms varied tremendously, but each had at its core a set of trust-generating precepts: behaviours that must, or must not, be followed. Without the ability to check in on agents or counterparties at distant locations, you had to be able to rely on predictable behaviour. "A man's word is his bond." The concepts of name, that is, identity, and reputation were virtually synonymous.

Deception occurred, certainly; history and mythology are full of it. But it was pointedly remarked upon. Dante reserves his deepest pits of hell not for the perpetrators of violence, but for the betrayers of trust. This mattered tremendously to society.

Writing, printing, telegraphy, telephony, photography, cinema, radio, television, interactive communications systems, databases, the Internet, mobile connectivity -- all increased the capacity to see, at something approaching real time, or to receive detailed representations of, conditions at a distance. You no longer relied on belief, testimony, and trust, but on evidence.

(With profound implications for such trivial things as science and empiricism.)

A common observation of monitoring systems -- employee computer desktops, children's cellphones, professional review systems, etc. -- is that they break down trust. That's why you don't sneak a peek at your children's diaries or journals without a very good reason. (Generally: trust has already been broken.)

(This isn't an argument to stupidly deny evidence, but the line is a delicate one.)

We Need Trust-Generating Social Systems

Religion played this role before. I don't know that it can now, though I suspect something of roughly similar shape may be required. Note that the ancient imperial religions were not similarly structured; rather, they varied tremendously. Some were polytheistic, some pantheistic, some animist, some monotheistic. Some had no gods, but were founded on meditation and practice. Some on ancestor worship. Some on rituals and sacrifice. And no, I'm not a Religions major, and my very casual familiarity with the field can certainly be corrected and strengthened.

There are also clear exceptions to the general rules. The Mongol Empire was largely tolerant of multiple religions, so far as I'm aware, and there were others as well, though I believe each did have a central faith of some form.

I don't have good answers yet, though I'm finding the questions interesting. I recommend others consider these issues.


Adapted from a G+ comment.

#trust
#infotech
#religion
#society
#longform