Put yourself into the mindset of your inner male child.

A nude photo now shows up in your iMessages feed, but it’s blurred and comes with a warning and three choices.

Don’t view. Message a grown-up. Or View.

Which way do you think the kid is going to go?

Exactly. Can you believe people at Apple get paid big bucks to come up with bonehead ideas like this?

The nude photo blur fix is part of the update to the iOS 15 operating software, released earlier this week. It attempts to clean up the mess Apple made earlier this year when it announced new “child safety” tools in iOS that would have seen Apple inspecting our photos to weed out child pornography. Fighting child abuse is a worthwhile effort, as many people noted at the time. But Apple, the company that prides itself on taking a stronger stance on privacy than competitors Facebook and Google, now wants to inspect the photos on our devices?

The first plan

In August 2021, Apple announced a Child Sexual Abuse Material (CSAM) detection tool that would have scanned photos uploaded to the iCloud Photo Library for potential child abuse material. Apple would then potentially report users to authorities and lock them out of the system.

The nudity detection tool for minors also originally had Apple notifying parents when nude photos turned up on their kids’ iPhones.

The revised plan

The iCloud plan was shelved, for the time being, in the wake of the backlash. Buy an iPhone and get reported to the police? Is there a jury out there weighing in? The nudity scan for minors has been revised, minus the reporting to parents.

The good news is that if parents want to keep nude photos away from their kids (and what parent wouldn’t?), they have to take action. They can enable the blurring tool by opting in via the Family Sharing plan, which is how parents share iMessage with their kids across multiple iPhones. The blurring and the warnings kick in when nude photos are sent or received, so yes, Apple is inspecting the content of photos on our iPhones.

Apple’s take

“Messages analyzes image attachments and determines if a photo contains nudity while maintaining the end-to-end encryption of the messages,” says Apple. “The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”

My bottom line is that I have no interest, zero, in taking nude photos on any device and sharing them. I get really queasy with some of the nearly nude stuff that pops up on my Instagram feed.

But clearly, I’m in the minority. (Just ask Roman Roy of “Succession,” who sent a picture of his private parts to his elderly father on a recent episode of the hit HBO series.)

What Apple should have done

Sending nude photos over the internet is a real issue, and a cause of bullying at schools and beyond. But come on. If Apple wanted to get rid of nude photos and the problems around them, it could have simply flipped a switch and made the iPhone a nude photo-free zone, period.

I’d be OK with that, whether it applied just to kids or to everyone. Want an Apple device? These are the rules, and we have to abide by them if we want to play in Apple’s garden.

But the way Apple designed the software update will only make kids want to see the nude photos more, and do everything in their power to skirt the system. There’s no getting around it: this whole thing makes kids more curious. I remember being told not to look at Playboy as a kid, and trust me, that didn’t stop me.

And it won’t stop today’s generation either.

Editor’s note: We welcome this post by Jefferson Graham, a former USA TODAY columnist, host of the Photowalks travel photography series on YouTube and the author of the Photowalks newsletter. The opinions expressed in this column are his alone and not that of Photofocus.