#icloud

magdoz@diaspora.psyco.fr

Denmark wants to protect its data and bans Google from its schools

https://www.journaldugeek.com/2022/07/19/le-danemark-veut-proteger-ses-donnees-et-bannit-google-de-ses-ecoles/

And since then, this has also hit the #Netherlands and #Germany!
In #France, the mainstream #media do not seem to be covering it. Not yet.
The news is circulating via messaging apps, though.

#Denmark bans #Chromebooks and #Google #Workspace in #schools over data transfer risks
https://linux.developpez.com/actu/335114/Le-Danemark-interdit-les-Chromebooks-et-Google-Workspace-dans-les-ecoles-en-raison-des-risques-de-transfert-de-donnees-et-ravive-le-debat-sur-les-possibilites-offertes-par-Linux-et-l-open-source/

A similar article:
Schools in Denmark, the Netherlands and Germany cannot use #Gmail or Google's #cloud,
https://securite.developpez.com/actu/335309/Les-ecoles-au-Danemark-aux-Pays-Bas-et-en-Allemagne-ne-peuvent-pas-utiliser-Gmail-ou-de-cloud-de-Google-en-raison-de-problemes-de-confidentialite/

Privacy Shield invalidated
This decision follows similar rulings by the Dutch and German authorities.
To be more precise, it was decided that German schools must not use cloud offerings such as Office 365, G Suite and #iCloud because of privacy violations. The Hessian Commissioner for #Data #Protection and #Freedom of Information published a statement saying that, given the lack of transparency around data protection and the potential for third-party access, no personal data of German schoolchildren may be stored on #Microsoft, Google or #Apple servers outside Germany.

Other sources:
https://fr.techtribune.net/google/le-danemark-interdit-les-chromebooks-et-google-workspace-dans-les-ecoles-en-raison-des-risques-de-transfert-de-donnees-techcrunch/375849/

⚠️ Denmark bans Google services in schools, after officials in the municipality of Helsingør were ordered last year to carry out a risk assessment of Google's processing of personal data
https://techcrunch.com/2022/07/18/denmark-bans-chromebooks-and-google-workspace-in-schools-over-gdpr/

In a verdict published last week ( https://www.datatilsynet.dk/afgoerelser/afgoerelser/2022/jul/datatilsynet-nedlaegger-behandlingsforbud-i-chromebook-sag- ), the Danish data protection agency, #Datatilsynet, found that the processing of data about #students who use Google's cloud-based Workspace suite (which includes Gmail, Google Docs, Calendar and Google Drive) "does not meet the requirements" of the European Union's data protection regulation, the #GDPR ( https://techcrunch.com/2018/01/20/wtf-is-gdpr/ ).

#Politique #UE #EU #RGPD #Enseignant #Enseignants #Prof #Profs #Professeur #Professeurs #École #élèves #Enfants #Éducation #Education #Logiciel #LogicielLibre

anonymiss@despora.de

#iCloud #crypto wallet #attack saw $650K stolen from trader within seconds; #MetaMask #vulnerability revealed

source: https://9to5mac.com/2022/04/19/icloud-crypto-wallet-attack-metamask/

An estimated $650,000-worth of cryptocurrencies and NFTs were gone in an instant.

...

The answer, as unearthed by a crypto #security expert who goes by Serpent, is that using the MetaMask app on #iPhone automatically stores a seed phrase file onto iCloud […]

#software #fail #news #finance #hack #hacker

petapixel@xn--y9azesw6bu.xn--y9a3aq

Is Apple Actually Going to Snoop on Your Photos?


Is Apple actually snooping on your photos? Jefferson Graham wrote an article last week warning of this, based on the company's child safety announcement. An attention-grabbing headline? Certainly. Accurate? It's complicated.

There has been much criticism from privacy advocates, notably from the EFF and Edward Snowden. This criticism is warranted; however, it should very much be based on technical elements rather than hyperbole.

So in layman's terms, what's going on?

No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.

They turned a trillion dollars of devices into iNarcs—without asking. https://t.co/wIMWijIjJk

-- Edward Snowden (@Snowden) August 6, 2021

Unbelievable: @Apple now circulating a propaganda letter describing the internet-wide opposition to their decision to start checking the private files on every iPhone against a secret government blacklist as "the screeching voices of the minority."

This has become a scandal. https://t.co/uoa4uuNTaP

-- Edward Snowden (@Snowden) August 6, 2021

1) Families enrolled in iCloud Family Sharing will get tools to counter the sharing of explicit content.

If you have Family Sharing enabled and Apple knows that a user is under the age of 13, the device will scan all messages, both sent and received, for sexually explicit content.

The key here is that this feature is only enabled for users under the age of 13 using the Messages app. Parents can also switch on a feature that allows them to get alerts if children ignore a warning about the message.

So is Apple snooping on your photos in this instance? In my eyes, the answer is no.

2) All users who use iCloud Photos will have their photos checked against a database of known codes (hashes) to identify Child Sexual Abuse Material (CSAM).

First, we need to understand what a hash is. Images connected to iCloud Photos are analyzed on the device and each is assigned a unique number. The technology is clever enough that if you edit a photo through cropping or filters, the same number is still assigned to it.

The National Center for Missing and Exploited Children provided Apple with a list of hashes of known CSAM photos. If your photo does not match any of those hashes, the system moves on. The actual photo isn't visible to anyone.

If a match is found, it is added to a database tied to your iCloud account. If that database grows past a certain threshold (the specifics of which are not publicly known), Apple disables your iCloud account and sends a report to the NCMEC.
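
To make that pipeline concrete, here is a minimal Python sketch of perceptual hashing plus threshold matching. It uses a simple "average hash" rather than Apple's actual NeuralHash, and the known-hash set, match distance, and report threshold below are all invented for illustration:

```python
# Minimal sketch: perceptual hashing + threshold matching.
# Illustrative "average hash" only, NOT Apple's NeuralHash. The hash set,
# distance tolerance, and threshold are hypothetical values.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to 8x8 grayscale and emit a 64-bit fingerprint: one bit per
    pixel, set if the pixel is brighter than the mean. Small edits such as
    crops or filters flip few bits, so edited copies stay 'close'."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

KNOWN_HASHES = {0x8F3C1A2B4D5E6F70}  # hypothetical known-CSAM fingerprints
MATCH_DISTANCE = 5     # bits of tolerance for edited copies (made up)
REPORT_THRESHOLD = 30  # matches before an account is flagged (not public)

def scan_library(paths: list[str]) -> bool:
    """Return True when enough photos match to cross the report threshold."""
    matches = sum(
        1 for path in paths
        if any(hamming(average_hash(path), k) <= MATCH_DISTANCE
               for k in KNOWN_HASHES)
    )
    return matches >= REPORT_THRESHOLD
```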

So is Apple snooping on your photos in this scenario? Maybe. It depends on what you consider snooping. Apple can't see your photos, only the hash, which it then checks against the known CSAM hashes.

Bear in mind that this is only enabled for those who use the Photos app attached to an iCloud account; you therefore have other options (like using Google Photos) if you aren't comfortable with this analysis of your photos.

It is worth remembering that all Android and Apple devices already analyze your photos to make them searchable. If you have a pet, type "pet" into the search box and pets appear. Analyzing photos is not a new technology; CSAM detection extends those capabilities for what Apple sees as the common good.

Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n

-- EFF (@EFF) August 5, 2021

3) Apple is adding guidance to Siri and Search related to CSAM.

This has nothing to do with scanning photos. If you search for CSAM content (using the iPhone search, not Safari) or ask Siri about it, you will be given links on how to report CSAM, or told that interest in the topic can be harmful or problematic.

This will have the least impact on users, as I’m not sure people ask Siri about CSAM anyway! You can read Apple’s full explanation of that in this document.

To Summarize

1) Explicit content checks take place on devices that iCloud Family Sharing tells Apple belong to a child under 13. If you are over 13, your photos aren't scanned.

2) Your iCloud-connected photo library will have a unique number (a hash) assigned to each photo. If that number matches a known CSAM hash, it will be added to a database within your iCloud account. If you have too many photos of this type, your account may be disabled and reported to the authorities.

3) You have a choice on whether or not you want this technology to run on your phone. You can decide not to use iCloud to store your photos or opt out of family sharing for your children.

Now that we have delved beyond the hyperbole, you are in a good position to make an informed decision about this technology. I encourage you to read both the criticism and praise for this method and make up your mind based on that.


_Disclosure: William Damien worked part-time at an Apple retail location seven years ago. The opinions expressed above are solely those of the author._


Image credits: Header photo licensed via Depositphotos.

#editorial #opinion #technology #ai #aimodel #apple #artificialintelligence #childabuse #government #icloud #iphone #neuralmatch #oped #privacy #storage

petapixel@xn--y9azesw6bu.xn--y9a3aq

WhatsApp’s Head Calls Apple’s Child Safety Update ‘Surveillance’


Just one day after Apple confirmed that it plans to roll out new software that will detect child abuse imagery on iPhones, WhatsApp's head took to Twitter to call out the move as a "surveillance system" that could be abused.

Apple recently announced that it would roll out a software update this fall that would target three main areas to help keep children safe from exploitation and abuse. The company has faced criticism for the decision. While the specific target of the move is generally praised, the implications of the artificial intelligence used have caused some to worry about what the limits would be. In short, many argue that private corporations should not be responsible for acting as watchdogs.

on the surface this is great news, but it’s also kind of like if Amazon announced Alexa will now call the police if it thinks it hears a murder. Private corporations without oversight should probably not be acting as extrajudicial watchdogs. https://t.co/bORBuOYdno

-- 𝚊𝚕𝚎𝚡 𝚌𝚛𝚊𝚗𝚣 (@alexhcranz) August 6, 2021

Will Cathcart, WhatsApp's head at Facebook, criticized the decision as one that could be used to scan private content for anything Apple or a government decided it wanted to control, and specifically referenced China as one place where such technology could be abused.

"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world," Cathart writes. "Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?"

We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.

-- Will Cathcart (@wcathcart) August 6, 2021

There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.

-- Will Cathcart (@wcathcart) August 6, 2021

…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.

-- Will Cathcart (@wcathcart) August 6, 2021

Apple is facing particular criticism because of its recent campaigns that tout the company as one that is focused on privacy.

"Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy," Cathart continues.

It should be noted that as a Facebook company, WhatsApp has much to gain from public statements against Apple policies. Facebook has repeatedly criticized Apple for interfering with Facebook's ability to track users and target ads and has called the tech giant one of its biggest competitors, going so far as to accuse Apple of using its dominant platform position to push its own apps over those like Facebook.

Apple recently updated iOS with new opt-in requirements that oblige developers to get express consent from device owners before their unique identifiers can be collected and shared. While this is a boon for individual privacy, it was a massive hit to Facebook's business model.

To fight it, Facebook went so far as to send notifications through its apps, including Instagram, to try to sway users into continuing to allow app tracking.

WhatsApp itself has not always had a shining history when it comes to privacy either. Facebook was hoping to monetize WhatsApp by allowing advertisers to contact customers directly via the app. A proposed change to its data-sharing policy in January exposed Facebook to a potential lawsuit and triggered a nationwide investigation into its operations.


Image credits: Header photo licensed via Depositphotos.

#culture #mobile #news #technology #ai #aimodel #apple #artificialintelligence #childabuse #facebook #government #icloud #instagram #iphone #neuralmatch #privacy #socialmedia #storage #whatsapp

petapixel@xn--y9azesw6bu.xn--y9a3aq

Apple Will Scan Photos Stored on iPhone, iCloud for Child Abuse: Report


Apple is reportedly planning to scan photos that are stored both on iPhones and in its iCloud service for child abuse imagery, which could help law enforcement but also may result in increased government demands for access to user data.

In a report by The Financial Times, summarized by The Verge, the new system will be called "neuralMatch" and will proactively alert a team of human reviewers if it believes that it has detected imagery that depicts violence or abuse towards children. As an artificial intelligence tool, neuralMatch has apparently been trained using 200,000 images from the National Center for Missing and Exploited Children to help identify problem images, and will be rolled out first in the United States.

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” The Financial Times reports. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
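
That "certain number" behaves like a cryptographic threshold. One standard way to enforce "decrypt only after N matches" is threshold secret sharing; the Python sketch below is a conceptual illustration under that assumption, not Apple's published protocol, and the threshold value is made up. It splits a decryption key into shares so the key can only be reconstructed once enough vouchers match:

```python
# Illustrative sketch: Shamir threshold secret sharing as one way a
# "decrypt only after N matches" rule can be enforced. NOT Apple's
# published protocol; THRESHOLD is a hypothetical value.
import random

PRIME = 2**127 - 1   # field modulus (a Mersenne prime)
THRESHOLD = 30       # shares needed to reconstruct; real value unknown

def split_secret(secret: int, n_shares: int, k: int = THRESHOLD):
    """Hide `secret` in a random degree-(k-1) polynomial; each share is a
    point on it. Fewer than k points reveal nothing about the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret from any k shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Each matching photo's "safety voucher" would carry one share; only once
# THRESHOLD matches exist can the decryption key be recovered.
key = 123456789
shares = split_secret(key, n_shares=100)
assert reconstruct(shares[:THRESHOLD]) == key
```

Below the threshold, the accumulated shares reveal nothing about the key, which is consistent with the report's claim that suspect photos only become decryptable once enough of them are marked.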

As noted by The Verge, Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the implementation.

"This is a really bad idea," he writes in a Twitter thread. "These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear. Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems."

This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3

-- Matthew Green (@matthew_d_green) August 5, 2021

Green argues that the way Apple is implementing this will start with photos that people have already shared with the cloud, so theoretically the initial implementation won't hurt anyone's privacy. The Verge notes that Apple already checks iCloud files against known child abuse imagery, like every other cloud provider, but what the company plans to do here goes beyond that and will allow access to local iPhone storage.

"But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal," he continues. "Even if you believe Apple won’t allow these tools to be misused, there’s still a lot to be concerned about. These systems rely on a database of 'problematic media hashes' that you, as a consumer, can’t review."

These images are from an investigation using much simpler hash function than the new one Apple’s developing. They show how machine learning can be used to find such collisions. https://t.co/N4sIGciGZj

-- Matthew Green (@matthew_d_green) August 5, 2021

"The idea that Apple is a 'privacy' company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them," Green concludes.


Image credits: Header photo licensed via Depositphotos.

#culture #law #news #ai #aimodel #apple #artificialintelligence #childabuse #government #icloud #iphone #neuralmatch #privacy #storage

zardoz@diaspora-fr.org

Apple handed the iCloud data of Alexandra #Elbakyan, founder of Sci-Hub, to the FBI

For everyone who thinks that being on #Apple or #iPhone protects your #privacy, #ethics, or citizens' rights...

According to the person who forwarded this discussion to me, it took place on a non-free #network. Here is a copy-and-paste:

« #AlexandraElbakyan, founder of #Sci-Hub, an activist site that pirates scientific papers from all over the world to make them accessible to everyone, has just been notified by Apple that the contents of her #iCloud had been handed over to the FBI.

There is nothing like a concrete case, in which one of the #GAFAM openly hands personal #data over to intelligence services, to remind us of the pact we sign with their #platforms.

It is public knowledge, but reminding ourselves of what we don't consciously register every day is never unwelcome: these giants hold in their hands our social, professional, civic and political ties, which are hard to give up overnight. In exchange, our lives are within reach of #surveillance. Commercial and governmental surveillance.

That is a dimension we should at least keep in mind, whatever our usage and whatever stage of the transition we are at.

For whatever it's worth.

Sources:
- https://www.01net.com/actualites/apple-a-livre-au-fbi-les-donnees-icloud-d-alexandra-elbakyan-fondatrice-de-sci-hub-2042934.html
- same topic: https://www.journaldugeek.com/2021/05/19/apple-livre-des-donnees-icloud-au-fbi-dans-le-cadre-dune-enquete/

Note that this discussion took place on a messaging network (Telegram) that is not #free... which goes to show that there is still work to do on awareness, even among people concerned about surveillance issues...

Choose a #messaging app that is #FreeSoftware #OpenSource #Encrypted #Decentralized, or even #Serverless #NoServer...