#infotech

dredmorbius@joindiaspora.com

Bugs in Our Pockets

... Client-side scanning, as the agencies’ new wet dream is called, has a range of possible missions. While Apple and the FBI talked about finding still images of sex abuse, the EU was talking last year about videos and text too, and of targeting terrorism once the argument had been won on child protection. It can also use a number of possible technologies; in addition to the perceptual hash functions in the Apple proposal, there’s talk of machine-learning models. And, as a leaked EU internal report made clear, the preferred outcome for governments may be a mix of client-side and server-side scanning.

In our report, we provide a detailed analysis of scanning capabilities at both the client and the server, the trade-offs between false positives and false negatives, and the side effects – such as the ways in which adding scanning systems to citizens’ devices will open them up to new types of attack. ...

...

If device vendors are compelled to install remote surveillance, the demands will start to roll in. Who could possibly be so cold-hearted as to argue against the system being extended to search for missing children? Then President Xi will want to know who has photos of the Dalai Lama, or of men standing in front of tanks; and copyright lawyers will get court orders blocking whatever they claim infringes their clients’ rights. Our phones, which have grown into extensions of our intimate private space, will be ours no more; they will be private no more; and we will all be less secure. ...
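To make the mechanism concrete: a perceptual hash of the kind the excerpt mentions can be sketched as a toy "average hash" (a deliberately simplified stand-in of my own; Apple's actual proposal, NeuralHash, is a learned and far more robust function):

```python
# Toy "average hash": each pixel becomes 1 if brighter than the image
# mean, else 0.  Similar images yield bit strings at small Hamming
# distance; that's the property scanning systems match against.

def average_hash(pixels):
    """Hash a grayscale image (rows of 0-255 values) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Count of differing bits: small distance = perceptually close."""
    return sum(x != y for x, y in zip(a, b))

img    = [[10, 20, 200, 210],
          [15, 25, 190, 220],
          [12, 18, 205, 215],
          [11, 22, 195, 225]]
bright = [[p + 5 for p in row] for row in img]   # uniformly brightened copy
other  = [[200, 10, 20, 30],
          [210, 15, 25, 35],
          [190, 12, 18, 40],
          [220, 11, 22, 45]]

print(hamming(average_hash(img), average_hash(bright)))  # 0: same structure
print(hamming(average_hash(img), average_hash(other)))   # 12: very different
```

The trade-off the report analyses falls out directly: the match threshold on that distance sets the balance between false positives and false negatives.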

Authors are a who's who of cryptographic and security brilliance: Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Vanessa Teague, and Carmela Troncoso.

Full paper (PDF): https://arxiv.org/abs/2110.07450

https://www.lightbluetouchpaper.org/2021/10/15/bugs-in-our-pockets/

#privacy #infosec #infotech #cryptography #surveillance #smartphones #MobileComputing #HalAbelson #RossAnderson #StevenMBellovin #JoshBenaloh #MattBlaze #JonCallas #WhitfieldDiffie #SusanLandau #PeterGNeumann #RonaldRivest #JeffreyISchiller #BruceSchneier #VanessaTeague #CarmelaTroncoso

dredmorbius@joindiaspora.com

We've been thinking about it wrong: the norm has been insecurity by obscurity

The Crypto AG CIA-backdoor story (2020) clarifies for me much of the never-ending flood of "outlaw strong crypto" thinkpieces and "lawful access" (a/k/a mandated backdoors) proposals.

I realised today that the whole #SecurityByObscurity discussion was missing a major insight: for much of the Cold War period, the operational standard has instead been #InsecurityByObscurity.

Crypto AG was an allegedly secure system which was, obscure to the public, insecure. And that insecurity (along with fear, surprise, ruthless efficiency, and an almost fanatical devotion to the Pope) seems to have been a key element of US and #FiveEyes surveillance capabilities from the 1950s onward. (I'm aware Crypto AG's role under the CIA began ~1970.) More recent stories of package intercepts (where backdoors are installed on specific equipment) and zero-day hacks (such as are routinely purchased and exploited by Cellebrite, Palantir, the NSO Group, and others) are the logical extension of Crypto AG methods. As is putting a surveillance device in the pockets of the population, one which the surveillance targets themselves fight amongst themselves to buy.

Our information systems, technology, devices, and infrastructure are, obscure to us, insecure. And we fall for it again and again.

Because while the cryptanalytic capabilities of the NSA and Five Eyes, as well as of their counterparts worldwide, are no doubt prodigious, the cheapest way to break through a wall is to go around it. By far.

And virtually all the continuous whinging since the early 1990s about the hazards of emerging strong crypto makes vastly more sense in this context. The agencies know their own strengths, weaknesses, and secret weapons. And have been trying to preserve their advantage. (Even though this ultimately puts us all at vastly greater risk.) Their policy recommendations have been premised on this, even if they've been unwilling to admit this publicly.

But yeah, insecurity by obscurity as an operational norm. Describes much of the present Web as well.


Adapted from an earlier Mastodon thread: https://mastodon.social/@natecull/106112437055287730

#CryptoAG #security #surveillance #surveillanceCapitalism #surveillanceState #infosec #infotech

dredmorbius@joindiaspora.com

Data Facilitates Surveillance, Privacy Violation, and Manipulation Directly Through Increased Efficiencies

Digitisation and distribution of data (to multiple agencies, organisations, and firms) have been ongoing mostly since the 1960s, though accelerating greatly as disk storage costs fell to personal-budget levels in the late 1990s.

I'd been working with industry data in the early 1990s, when several analytic departments at a mid-sized firm might share a couple of gigabytes of minicomputer storage. At a conference during the 1990s, in an audience of several hundred data analysts, only a few hands went up for dealing with GB-scale datasets (the raised hands representing telecoms data, as I recall). I realised circa 2000 that storage capable of holding a few hundred bytes of data (plenty for a basic dossier) on every individual in a large country, or soon the world, would be within a modest household budget. Shortly afterward, the first news stories of data brokers started appearing, as well as of Total Information Awareness, which often contracted with those same data brokers.
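The back-of-envelope arithmetic is easy to reproduce (the figures here -- 300 bytes per dossier, world population, a couple of cents per GB of consumer disk -- are my own illustrative assumptions):

```python
# Cost of a minimal dossier on everyone on Earth, at rough consumer
# disk prices.  All figures are illustrative order-of-magnitude guesses.

BYTES_PER_DOSSIER = 300            # name, IDs, a handful of attributes
POPULATION = 8_000_000_000         # world population, order of magnitude
USD_PER_GB = 0.02                  # consumer hard-disk storage, roughly

total_gb = BYTES_PER_DOSSIER * POPULATION / 1e9
cost_usd = total_gb * USD_PER_GB

print(f"{total_gb:,.0f} GB, roughly ${cost_usd:,.0f}")  # 2,400 GB, roughly $48
```

A few terabytes, at pocket-money prices: that is the threshold the late 1990s crossed.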

Early social networking sites were beginning to apply collaborative-filtering moderation systems which, I quickly realised (having helped design several myself), were themselves prodigious collectors of reviewers' personal-preference data. Rating systems, like many swords, cut two ways: reviewers rate content, but ratings and content preferences also rate the reviewers. (An interesting twist on the Quis custodiet ipsos custodes question.)
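A toy sketch of the two-way cut (names and scores hypothetical): the same matrix that scores content also yields, for free, a preference profile of each reviewer:

```python
# A ratings matrix read both ways: scoring content also profiles reviewers.

ratings = {
    # reviewer -> {item: score, 1 (awful) to 5 (great)}
    "alice": {"cat-video": 5, "crypto-rant": 1},
    "bob":   {"cat-video": 2, "crypto-rant": 5, "gardening": 4},
}

def item_scores(ratings):
    """The intended reading: mean score per item."""
    totals = {}
    for scores in ratings.values():
        for item, s in scores.items():
            totals.setdefault(item, []).append(s)
    return {item: sum(v) / len(v) for item, v in totals.items()}

def reviewer_favourites(ratings):
    """The flip side: each reviewer's revealed top preference, for free."""
    return {who: max(scores, key=scores.get) for who, scores in ratings.items()}

print(item_scores(ratings))          # what the site wanted
print(reviewer_favourites(ratings))  # what it learned about the reviewers
```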

In 1900, the only routinely digitised mass citizen data were US Census tabulations, updated decadally and not generally accessible. By 1960, telephone, banking, and airline data (through SABRE) were digitised, largely, as with Census data, on punch cards. Tape, and further expansion to credit, insurance, and utility data, developed by the 1970s, though punch cards remained in heavy use through the 1980s. The first widespread data-privacy outcries came in the 1970s; see for example Newsweek's 1970 article, "The Assault on Privacy" (PDF), though early infotech pioneers such as packet-switched-networking pioneer Paul Baran were writing on data, surveillance, privacy, and ethical concerns in the 1960s. (An aside: those publications were made freely available online by RAND at my request.) Marketing and advertising were increasingly represented by the 1990s, as was healthcare data, though records there remained (and still remain) highly fragmented.

By the 1990s, previously offline court and legal documents began getting digitised in bulk (a practice begun years earlier), sometimes by local courts, more frequently by aggregation services such as LexisNexis, Westlaw, JustCite, HeinOnline, Bloomberg Law, VLex, LexEur, and others, which took advantage of public access to compile and store their own aggregations. Often literally by sending individuals to rural courthouses to record or duplicate records, one at a time, from clerks.

Access costs matter. And by costs I'm referring to all inputs, not just money: time, knowledge, rates of availability, periodic caps (e.g., 4 records/hr., but a daily cap of 8 records, 16/week, 32/month, effectively imposing an 8 hr/month access restriction), travel, parsing or interpretation, ability to compile independent archives (rather than relying on the source or origin archive), etc.
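The stacked caps in that parenthetical compose as a minimum over time windows; the monthly cap binds, yielding the effective 8 hr/month figure:

```python
# Stacked rate caps from the example: 4 records/hr, 8/day, 16/week, 32/month.
# Each window implies a monthly ceiling; the smallest ceiling binds.

RATE_PER_HOUR = 4
monthly_ceilings = {
    "hourly-only": 4 * 8 * 22,   # ~22 working days of 8 hours, uncapped
    "daily":       8 * 22,
    "weekly":      16 * 4,       # ~4 weeks per month
    "monthly":     32,
}
records_per_month = min(monthly_ceilings.values())
effective_hours = records_per_month / RATE_PER_HOUR

print(records_per_month, effective_hours)  # 32 8.0: an 8 hr/month restriction
```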

Aggregation itself is an invasion of privacy. Reduced search, parsing, and inference-drawing costs enable observation, surveillance, and manipulation.

Reduced costs don't simply facilitate existing uses; they facilitate new, lower-value activities. This is a rephrasing of the Jevons paradox: increased efficiency increases consumption. Trying to reduce consumption through greater efficiency is like fucking for virginity. Another characteristic is that many of these new uses are of very limited, or negative, social benefit: very often fraud, or predatory practices.
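The Jevons point can be made concrete with a constant-elasticity demand curve (the curve and numbers are purely illustrative): when demand is elastic, halving the per-use cost more than doubles use, and total consumption rises:

```python
# Jevons paradox with constant-elasticity demand: uses = k * cost**(-e).
# With elasticity e > 1, cutting the per-use cost in half (doubling
# efficiency) raises total resource consumption rather than lowering it.

def total_consumption(cost_per_use, elasticity, k=100.0):
    uses = k * cost_per_use ** (-elasticity)  # demand responds to price
    return uses * cost_per_use                # uses x resource-per-use

before = total_consumption(cost_per_use=1.0, elasticity=1.5)
after  = total_consumption(cost_per_use=0.5, elasticity=1.5)

print(before, after)  # 100.0, then ~141.4: cheaper per use, more consumed
```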

Technology is far less an equaliser than a power multiplier, amplifying inequalities. Information technology especially so.

Data corrupts. Absolute data corrupts absolutely.

#data #InfoTech #surveillance #SurveillanceState #SurveillanceCapitalism #privacy #manipulation #DataAggregation #JevonsParadox #ActonsLaw #PaulBaran

dredmorbius@joindiaspora.com

Who Can You Trust

Giselle Minoli made the point on #googleplus that we can only trust what we make ourselves. I have to disagree entirely.

But she does get at an absolutely central point, and a concept I've been drilling into for a couple of years. Gideon Rosenblatt inspired a key insight as well.

Society is founded on trust. And how societies function is based on the level of trust within the society. There are high-trust, and low-trust societies, and this is a large area of study I've only just begun to scrape at (drop the terms into Google Scholar or Microsoft's academic database for starters).

We can trust in the institutions which found themselves on trust itself. That is, those which are accountable, open, and transparent, and which commit to what they will do. They may fail, but if they fail, they're open about that as well.

(Google are ... somewhat ... adhering to that last, though I wouldn't say they're doing a great job of this.)

Start-ups' Flipside

Some time back on Hacker News, I replied to a comment saying "Oh, I could start up a project to do that" with the observation that it wasn't the starting up that was hard, but the not-shutting-down part.

Silicon Valley, synecdoche for the information media technology industry as a whole, is a start-up culture, but that goes hand in hand with being a shut-down culture. The sector takes risks, seeks extreme financial rewards, dubs its successes after fantastical creatures that don't exist (unicorns), and, if a venture is only modestly rather than outrageously successful, kills the concept without blinking. Even if there are hundreds of thousands, or hundreds of millions, of people affected.

Governments are institutions which are criticised for their inefficiencies, but a side-effect of that is a strong resistance to shutting down.

(Sometimes too strong, but then, all features can be useful or harmful, swords cut two ways, etc., etc.)

And governments aren't the only ends to these means. Social organisations, educational institutions, cities, and nations -- as cultural constructs of common language, religion, beliefs, traditions, rituals, etc. -- are extremely durable. Not indestructible, but far more resilient than most commercial institutions.

(I don't have explanations of why this is yet, though I'm thinking of a few possibilities. One that stands out is an ability to survive in a very-low-activity state for a long time -- call it a hibernation, pupal, or spore state if you will.)

What commercial structures provide is an immense capacity to capture capital flows and formation (debt, equity), attention (via the very information media technologies that Silicon Valley develops), political power, social, cultural, and propaganda influence, and the like. But the brightest stars burn out the fastest, and the trajectory of many tech firms is high but brief. AT&T, IBM, Apple, Oracle, and Microsoft are among the oldest alive today. All are monopolies. (Something that goes hand in hand with networks, physical or logical.) All have reputations for playing rough, largely because there's very little other than aggression which creates a defensible space.

There are many things I've built myself which no longer work, or which I cannot maintain or keep up to date with current needs. Rugged individualism is another trope, and in fact the term itself is political: Herbert Hoover used it in a 1928 campaign speech, and its modern use is established from there (check that yourself on Google's Ngram Viewer). So, no, we do need to work together, but we need to do so in ways that are trust-generative in and of themselves.

Information Technology is a Trust Killer

One of my strongest insights -- the one Gideon prompted -- was that advances in information technologies themselves tend to reduce overall social trust, and to undermine trust-building institutions. I'm not completely convinced this is correct, but it smells right to me.

In a pre-technological world, you could not communicate rapidly and in rich detail beyond what you could immediately see or hear. You might be able to light a signal fire, visible for miles, to send a few bits. Or you could dispatch a messenger with a memorised or small written message, capable of covering 20 miles, possibly 250 miles on horseback, per day. Distant travel was impossible for months of the year: in the Northern Hemisphere, November to May meant no seafaring due to storms and short days.

Every major preindustrial empire had an associated religion. The forms varied tremendously, but each had at its core a set of trust-generating precepts: behaviours that must, or must not, be followed. Without the ability to check in on agents or counterparties at distant locations, you had to be able to rely on predictable behaviour. "A man's word is his bond." The concepts of name (that is, identity) and reputation were virtually synonymous.

Deception occurred, certainly; history and mythology are full of it. But it is pointedly condemned: Dante reserves his deepest pits of hell not for the transactors of violence, but for the betrayers of trust. This mattered tremendously to society.

Writing, printing, telegraphy, telephony, photography, cinema, radio, television, interactive communications systems, databases, the Internet, mobile connectivity -- all increased the capacity to observe, at approaching real time, or to receive detailed representations of, conditions at a distance. You no longer relied on belief, testimony, and trust, but on evidence.

(With profound implications for such trivial things as science and empiricism.)

A common observation of monitoring systems -- employee computer desktops, children's cellphones, professional review systems, etc. -- is that they break down trust. That's why you don't sneak a peek at your children's diaries or journals without a very good reason. (Generally: that trust has already been broken.)

(This isn't an argument to stupidly deny evidence, but the line is a delicate one.)

We Need Trust-Generating Social Systems

Religion played this role before; I don't know that it can now, though I suspect something of roughly similar shape may be required. Note that the ancient imperial religions were not similarly structured; rather, they varied tremendously. Some were polytheistic, some pantheistic, some animist, some monotheistic. Some had no gods, but were founded on meditation and practice. Some on ancestor worship. Some on rituals and sacrifice. And no, I'm not a religions scholar, and my very casual familiarity with the field can certainly be corrected and strengthened.

There are also clear exceptions to the general rules. The Mongol Empire was largely tolerant of multiple religions, so far as I'm aware, and there were others as well, though I believe each did have a central faith of some form.

I don't have good answers yet, though I'm finding the questions interesting. I recommend others consider these issues.


Adapted from a G+ comment.

#trust
#infotech
#religion
#society
#longform