#childabuse

tpq1980@iviv.hu

My attempt to enhance the Epstein-Gates photograph as clearly as possible using the limited smartphone tools available to me. There may be better versions available, but I haven't seen them. It took about 90 minutes of work.

#gates #billgates #gatesepstein #usa #epstein #jeffreyepstein #childabuse #blackmail #blackmailhoneypot #cia #mi6 #mossad #ghislainemaxwell #maxwell #robertmaxwell #spies #uk #childrape #pedophillia #paedophilia #consent #ageofconsent #honeypot #blackmailcontrol #childtrafficking #imageediting #gatesepsteinphoto

posuwaegeh@diasp.org

Childhood Origins of the Holocaust

https://psychohistory.com/articles/the-childhood-origins-of-the-holocaust/

This speech is astounding. I think deMause's analysis is highly relevant to what we see happening in Russia and Ukraine right now, and even in the USA with the rise of authoritarianism.


The Childhood Origins of the Holocaust
Lloyd deMause

Over thirty years ago, my book The History of Childhood was published, opening with the following words:

“The history of childhood is a nightmare from which we have only recently begun to awaken. The further back in history one goes, the lower the level of child care, and the more likely children are to be killed, abandoned, beaten, terrorized, and sexually abused.” 1

In the three decades since then, a dozen books and over a hundred articles in The Journal of Psychohistory have been written by myself and fellow psychohistorians giving overwhelming evidence of the truth of this astonishing view of childrearing evolution. Furthermore, psychohistorians have written hundreds of additional books and articles showing the crucial effects of childrearing evolution on historical personalities — what we term “psycho-classes” — and on history itself, particularly on wars and genocides, which we have found to be caused by this routine child abuse, by the lack of love and care during the early years of life, by the severely fragile selves that are the results of extremely insecure early attachments.

Since I am giving this speech celebrating the release of my book Das emotionale Leben der Nationen in Austria — the home of Adolf Hitler and one of the countries that carried out the Holocaust — I feel I must address the emotional origins of the Holocaust in Austrian and German childrearing in order to understand why it occurred, so as to avoid a world-wide nuclear holocaust in the future. I want to do this because I hope you will be able to agree with me that Germans and Austrians were born innocent — as all humans are — but were made violent racists during their early years by childrearing practices that, I think, once I describe them to you, will allow you to better understand your ancestors. I do this not to “excuse” them but rather to understand the developmental causes of the Holocaust, so genocides and wars can everywhere be eliminated in the future.

I hope you will find that my psychohistorical view of the origins of the Holocaust makes better sense than views like that of Daniel Goldhagen, 2 who recently portrayed Germans and Austrians not only accurately as “Hitler’s Willing Executioners” but also as mysteriously containing seemingly inherited antisemitic personalities. I will furthermore show that the great reduction in German and Austrian antisemitism in the past half century is due to the vast improvement in childhood — that is, in the childrearing you in my audience actually experienced. Psychohistory’s main discovery is that war and genocide, like homicide and suicide, are psychopathic disorders that simply do not occur in the absence of widespread early abuse and neglect, and I hope to show you that Austrian childrearing today has advanced sufficiently so that similar genocides and racist wars have become impossible for Austria in the future.

#childabuse #holocaust #authoritarianism #Russia #Germany #Austria #Ukraine #war #violence #childrearing #Trump #USA

tpq1980@iviv.hu

Ukraine is, apparently, a major #global center for #child #rape, child #prostitution, child #pornography production & child rape #tourism.

Ukraine's capital #Kiev / #Kyiv is both the major center in #Ukraine for the sexual #exploitation of #children & home to #Jewish #Ukrainian president Volodymyr #Zelenskyy.

I find child rape, especially the #systematic & #commercialised versions of it to be the most vile & egregious of #crimes. Perhaps #Putin will bring an end to this abomination?

#ukrainecrisis #childtrafficking #nwo #childabuse #ukraineconflict #russia #volodymyrzelenskyy #kiev #kyiv #globalism #newworldorder #davos #criminals

https://www.google.co.uk/search?q=Ukraine+child+pornography+prostitution

https://en.wikipedia.org/wiki/Child_prostitution_in_Ukraine

garryknight@diasp.org

Encryption: UK data watchdog criticises government campaign | BBC News

A Home Office-backed campaign against the rollout of ultra-secure messaging apps by social media firms has been criticised by the UK data watchdog.

#technology #tech #security #privacy #hacking #malware #CSA #ChildAbuse #Government

https://www.bbc.co.uk/news/technology-60072191

petapixel@xn--y9azesw6bu.xn--y9a3aq

Is Apple Actually Going to Snoop on Your Photos?


Is Apple actually snooping on your photos? Jefferson Graham wrote an article last week warning of this based on the company's child safety announcement. An attention-grabbing headline? Certainly. Accurate? It’s complicated.

There has been much criticism from privacy advocates, notably the EFF and Edward Snowden. That criticism is warranted; however, it should very much be based on technical elements rather than hyperbole.

So in layman’s terms, what’s going on?

No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.

They turned a trillion dollars of devices into iNarcs—without asking. https://t.co/wIMWijIjJk

-- Edward Snowden (@Snowden) August 6, 2021

Unbelievable: @Apple now circulating a propaganda letter describing the internet-wide opposition to their decision to start checking the private files on every iPhone against a secret government blacklist as "the screeching voices of the minority."

This has become a scandal. https://t.co/uoa4uuNTaP

-- Edward Snowden (@Snowden) August 6, 2021

1) Families enrolled in iCloud Family Sharing will get tools to counter the sharing of explicit content.

If you have Family Sharing enabled and Apple knows that a user is under the age of 13, the device will scan all messages, both sent and received, for sexually explicit content.

The key here is that this feature is only enabled for users under the age of 13 using the Messages app. Parents can also switch on a feature that allows them to get alerts if children ignore a warning about the message.

So is Apple snooping on your photos in this instance? In my eyes, the answer is no.

2) All users who use iCloud Photos will have their photos checked against a database of known image fingerprints (hashes) to identify Child Sexual Abuse Material (CSAM).

First, we need to understand what a hash is. Each image connected to iCloud Photos is analyzed on the device and assigned a unique number. The technology is clever enough that if you edit a photo through cropping or filters, essentially the same number is still assigned to it.

The National Center for Missing and Exploited Children provided Apple with a list of hashes of known CSAM photos. If your photo’s hash does not match any on that list, the system moves on. The actual photo isn’t visible to anyone.
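To make the hashing idea concrete: Apple's actual system (NeuralHash) is a proprietary machine-learned perceptual hash, but the general principle can be sketched with a much simpler "average hash." Everything below — the toy images, the 4x4 size — is invented for illustration only; the point is that, unlike a cryptographic hash, a perceptual hash stays the same (or nearly the same) when the image is lightly edited.

```python
# Toy perceptual hash ("average hash") sketch. NOT Apple's NeuralHash;
# a minimal illustration of why an edited copy of a photo can still
# match the hash of the original.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) into bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # One bit per pixel: 1 if the pixel is brighter than the image mean.
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes (0 = identical)."""
    return sum(x != y for x, y in zip(a, b))

# A 4x4 "photo" and a slightly brightened copy of it.
photo = [[10, 200, 10, 200],
         [200, 10, 200, 10],
         [10, 200, 10, 200],
         [200, 10, 200, 10]]
edited = [[min(p + 20, 255) for p in row] for row in photo]

h1, h2 = average_hash(photo), average_hash(edited)
# The brightness shift preserves which pixels sit above the mean,
# so the edited copy produces the identical hash.
print(hamming(h1, h2))  # 0
```

A cryptographic hash (SHA-256, say) would change completely after that edit; the perceptual approach is what lets matching survive crops and filters, and it is also why researchers worry about deliberately engineered collisions.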

If a match is found, it is added to a database tied to your iCloud account. If that database grows past a certain threshold (the exact number is not publicly known), Apple disables your iCloud account and sends a report to the NCMEC.

So is Apple snooping on your photos in this scenario? Maybe. It depends on what you consider snooping. Apple can’t see your photographs, only the hash, which it then checks against the known CSAM hashes.

Bear in mind that this is only enabled for those who use the Photos app attached to an iCloud account, so you have other options (such as Google Photos) if you aren’t comfortable with the analysis of your photos.

It is worth remembering that Android and Apple devices already analyze your photos to make them searchable. If you have a pet, type pet into the search box and pets appear. Analyzing photos is not a new technology, but CSAM detection extends the capabilities for the purposes of what Apple sees as the common good.

Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n

-- EFF (@EFF) August 5, 2021

3) Apple is adding guidance to Siri and Search related to CSAM.

This has nothing to do with scanning photos. If you search (using the iPhone search, not Safari), or ask Siri about CSAM content, it will provide you with links on how to report CSAM or tell you that interest in the topic can be harmful or problematic.

This will have the least impact on users, as I’m not sure people ask Siri about CSAM anyway! You can read Apple’s full explanation of that in this document.

To Summarize

1) Explicit content checks take place on devices known to Apple to belong to a child under 13 through iCloud Family Sharing. If you are over 13, your photos aren’t scanned.

2) Your iCloud-connected photo library will have a unique number (a hash) assigned to each photo. If that number matches a known CSAM hash, it will be added to a database within your iCloud account. If you have too many photos of this type, your account may be disabled and reported to the authorities.

3) You have a choice on whether or not you want this technology to run on your phone. You can decide not to use iCloud to store your photos or opt out of family sharing for your children.

Now that we have delved beyond the hyperbole, you are in a good position to make an informed decision about this technology. I encourage you to read both the criticism and praise for this method and make up your mind based on that.


*Disclosure: William Damien worked part-time at an Apple retail location seven years ago. The opinions expressed above are solely those of the author.*


Image credits: Header photo licensed via Depositphotos.

#editorial #opinion #technology #ai #aimodel #apple #artificialintelligence #childabuse #government #icloud #iphone #neuralmatch #oped #privacy #storage

petapixel@xn--y9azesw6bu.xn--y9a3aq

WhatsApp’s Head Calls Apple’s Child Safety Update ‘Surveillance’


Just one day after Apple confirmed that it plans to roll out new software that will detect child abuse imagery on iPhones, WhatsApp's head took to Twitter to call out the move as a "surveillance system" that could be abused.

Apple recently announced that it would roll out a software update this fall that would target three main areas to help keep children safe from exploitation and abuse. The company has faced criticism for the decision. While the specific target of the move is generally praised, the implications of the artificial intelligence used have caused some to worry about what the limits would be. In short, many argue that private corporations should not be responsible for acting as watchdogs.

on the surface this is great news, but it’s also kind of like if Amazon announced Alexa will now call the police if it thinks it hears a murder. Private corporations without oversight should probably not be acting as extrajudicial watchdogs. https://t.co/bORBuOYdno

-- 𝚊𝚕𝚎𝚡 𝚌𝚛𝚊𝚗𝚣 (@alexhcranz) August 6, 2021

Will Cathcart, WhatsApp's head at Facebook, criticized the decision as one that could be used to scan private content for anything that Apple or a government decided that it wanted to control and specifically referenced China as one place where such technology could be abused.

"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world," Cathart writes. "Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?"

We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.

-- Will Cathcart (@wcathcart) August 6, 2021

There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.

-- Will Cathcart (@wcathcart) August 6, 2021

…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.

-- Will Cathcart (@wcathcart) August 6, 2021

Apple is facing particular criticism because of its recent campaigns that tout the company as one that is focused on privacy.

"Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy," Cathart continues.

It should be noted that as a Facebook company, WhatsApp has much to gain from public statements against Apple policies. Facebook has repeatedly criticized Apple for interfering with Facebook's ability to track users and target ads and has called the tech giant one of its biggest competitors, going so far as to accuse Apple of using its dominant platform position to push its own apps over those like Facebook.

Apple recently updated iOS with new opt-in requirements that oblige developers to get express consent from device owners before their unique identifiers can be collected and shared. While this is a boon for individual privacy, it was a massive hit to Facebook's business model.

To fight it, Facebook went so far as to send notifications through its apps, including Instagram, to try to sway users into continuing to allow app tracking.

WhatsApp itself has not always had a shining history when it comes to privacy either. Facebook was hoping to monetize WhatsApp by allowing advertisers to contact customers directly via the app. A proposed change to its data-sharing policy in January exposed Facebook to a potential lawsuit and triggered a nationwide investigation into its operations.


Image credits: Header photo licensed via Depositphotos.

#culture #mobile #news #technology #ai #aimodel #apple #artificialintelligence #childabuse #facebook #government #icloud #instagram #iphone #neuralmatch #privacy #socialmedia #storage #whatsapp

petapixel@xn--y9azesw6bu.xn--y9a3aq

Apple Will Scan Photos Stored on iPhone, iCloud for Child Abuse: Report


Apple is reportedly planning to scan photos that are stored both on iPhones and in its iCloud service for child abuse imagery, which could help law enforcement but also may result in increased government demands for access to user data.

In a report by The Financial Times and summarized by The Verge, the new system will be called "neuralMatch" and will proactively alert a team of human reviewers if it believes that it has detected imagery that depicts violence or abuse towards children. As an artificial intelligence tool, neuralMatch has apparently been trained using 200,000 images from the National Center for Missing and Exploited Children to help identify problem images and will be rolled out first in the United States.

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” The Financial Times reports. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
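The reported voucher-and-threshold flow can be sketched roughly as follows. This is a loose illustration of the mechanism as described in the FT report, not Apple's actual protocol (which reportedly uses private set intersection and threshold secret sharing so nothing is learned below the threshold); the hash names and the threshold value here are invented, since the real threshold is not public.

```python
# Hedged sketch of the reported "safety voucher" flow. All values
# (KNOWN_HASHES, THRESHOLD) are hypothetical stand-ins.

KNOWN_HASHES = {"hashA", "hashB"}   # stand-in for the NCMEC hash list
THRESHOLD = 3                       # the real threshold is not public

class Account:
    def __init__(self):
        self.match_count = 0
        self.flagged = False

    def upload(self, photo_hash):
        """Each upload carries a 'voucher'; only matches are counted."""
        if photo_hash in KNOWN_HASHES:
            self.match_count += 1
        # Nothing happens until the number of matching vouchers crosses
        # the threshold; only then is the account escalated for review.
        if self.match_count >= THRESHOLD:
            self.flagged = True

acct = Account()
for h in ["hashX", "hashA", "hashY", "hashB", "hashA"]:
    acct.upload(h)
print(acct.flagged)  # True: three matches reached the threshold
```

The threshold is the system's safeguard against a single false positive triggering a report; the criticism that follows is about who controls the hash list feeding that counter.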

As noted by The Verge, Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the implementation.

"This is a really bad idea," he writes in a Twitter thread. "These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear. Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems."

This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3

-- Matthew Green (@matthew_d_green) August 5, 2021

Green argues that the way Apple is implementing this will start with photos that people have already shared with the cloud, so theoretically the initial implementation won't hurt anyone's privacy. The Verge notes that Apple already checks iCloud files against known child abuse imagery, like every other cloud provider, but what the company plans to do here goes beyond that and will allow access to local iPhone storage.

"But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal," he continues. "Even if you believe Apple won’t allow these tools to be misused, there’s still a lot to be concerned about. These systems rely on a database of 'problematic media hashes' that you, as a consumer, can’t review."

These images are from an investigation using much simpler hash function than the new one Apple’s developing. They show how machine learning can be used to find such collisions. https://t.co/N4sIGciGZj

-- Matthew Green (@matthew_d_green) August 5, 2021

"The idea that Apple is a 'privacy' company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them," Green concludes.


Image credits: Header photo licensed via Depositphotos.

#culture #law #news #ai #aimodel #apple #artificialintelligence #childabuse #government #icloud #iphone #neuralmatch #privacy #storage