For a lot of new photographers, Photoshop can seem intimidating. The application has been around since 1990, and it has every tool you can think of. But recent efforts have started to make Photoshop more approachable, through AI tools that help speed up and simplify a workflow.

Stephen Nielson

I had the chance to chat with Stephen Nielson, Director of Product Management, Photoshop at Adobe, about the latest release. There's a lot to this year's updates, and there are definitely some exciting possibilities ahead.

Making a complex tool more approachable

In this year's Adobe MAX release of Photoshop, there were several instances where complex tools got easier to use. One of these was the Object Selection tool, which can now detect a subject and let users hover to see the mask it places on that subject. It seems simple, but it's something that can dramatically improve a workflow.

“The auto masking is pretty exciting. Because I just think it makes things so much faster for a longtime user of Photoshop, and also easier for somebody who’s just learning Photoshop,” said Nielson.

“It’s similar to what we’ve done with Select Subject [and] the Object Selection tool. It’s using the machine learning model to predict the selection that a professional user would make. And so we train a machine learning model on lots and lots of examples. This one was pretty tricky to teach the model, like, OK, when they hover over this, make a selection of this and not that. It’s a different application, and a different set of problems than Select Subject and the Object Selection tool.

“What I love about what we’ve done is combining it with the Object Selection tool. So if it missed a spot, just hold down Shift and draw a box.”

The future of Neural Filters

When Neural Filters were first introduced in 2020, I could see the possibilities, but knew that they needed some work. While some of them were fun (I edited a photo of my dad to make him look much, much older, for instance. He appreciated that.), others were practical enough to use on a regular basis.

And now, Neural Filters have continued to evolve. Upgrades have come to Depth Blur, for instance, which now looks more realistic and offers additional controls for blurring your background. And new tools like Landscape Mixer take the idea behind sky replacement and apply it to an entire image, altering the overall look and feel.

There's a lot to Neural Filters, but the goal is to create a one-click approach instead of one that requires multiple tools and steps.

“I think the Neural Filters updates are pretty fantastic. There’s just so much happening in the world of machine learning, and I love that we have this platform to explore and build multiple machine learning models on top of that.

“Thinking back to when I was first learning Photoshop, one of the first things I would play around with were Blend Modes. And you know, the Filter Gallery, because I was like ‘Oh, with one click, I can totally make my image look different.’ Blend Modes and the Filter Gallery were really good examples of one click, [and] everything looks different. To be honest, most of the time, it didn’t look great. Just randomly changing the Blend Mode doesn’t make it artistic.

“But it gave me lots of ideas of, ‘Oh, that’s kind of cool. That’s kind of a cool effect.’ And then I would follow that idea and try to create something. So I think that exploratory nature of Photoshop, that really captured a lot of people’s imaginations early on and spurred creativity was really important. I think Neural Filters does the exact same thing, where with just one click, you can totally transform your image. I think that it’s going to inspire a lot of creativity to our users.”

At the same time, Neural Filters almost feel like a plugin within Photoshop, even though they work as a standard Smart Filter. Nielson says that with technology continuing to evolve and get more powerful, a more native feel might not be far off.

“It can be computationally very expensive. Unlike an Adjustment Layer, it processes at one time, and caches that as a Smart Filter. That’s why you can’t paint on it. You have to go into it and allow that processing to happen, because it can’t happen in real time. So I think that model works well for what the current limitations of hardware are.

“We’ve always prided ourselves that you can do things in real time. [Neural Filters] are a much bigger challenge. It is something we’re looking into and obviously there’s lots of new hardware advances including like the new M1 Pro and M1 Max [chips] that were just announced. It’s not unrealistic that we could get there. But right now it’s just so much computation that it still needs to kind of be contained into its Smart Filter for now.”

Focus on the iPad

Photoshop on the iPad also saw a major upgrade with the addition of Camera Raw. Not only that — this release brought Dodge & Burn and Smart Object support to the iPad as well.

In terms of Camera Raw, it’s a bit different from the desktop version, but the results are exactly the same.

“All of Adobe Camera Raw, Lightroom and this new experience on iPad are all built on the same underlying technology. We call it Image Core, and internally, it’s all based on the same technology. So exposure, clarity, dehaze … whatever you use will always give you the same results.

“But the experience can change, right? We built it so that it would be familiar to existing customers who have been using ACR on desktop, but it is slightly different. It’s a smaller set of controls. It’s easier to navigate, and it matches more closely with Lightroom Mobile. And it’s also not exactly the same workflow. Whereas Camera Raw [on desktop] will open in a separate window, [iPad] is a little more tightly integrated. But it’s using the same underlying technology.”

In terms of features like Neural Filters and the new Hover Auto Mask tool, Nielson said the team definitely wants to bring them to the iPad as well.

“There’s a lot of requests for things on iPad that are in desktop. And we’re actively researching and considering which are the most important things we want to build next. It’s a question of when.”

What about the web?

Photoshop on the web certainly wasn't something I expected to see. It's something I can easily imagine teams using together, and there are some great possibilities for collaboration here.

“There’s a couple of ways we’re thinking about web. First of all, we first built Photoshop web to just make sure we could do it, and that it was possible to actually bring the rendering engine to the browser, and actually get the exact same results. We’ve proven to ourselves, yes we can do that.

“I think there’s a huge set of opportunities for what we do next with it. Our focus right now is on making a great experience for stakeholders and your co-workers that are collaborating. We’re really leaning into this idea of, you can open up my Photoshop document, you can comment on it, you can see all the layers, you can see it at 100% resolution … you know, it’s exactly the same as it is on the desktop.

“But we did add in some light editing tools. So if you’re working with a co-worker and want to make a slight change here, you can do that. We know that not all the tools are there. There’s a lot of things missing. But we wanted to put enough in there that it could be useful in some light editing scenarios.

“Right now, it’s really about collaboration and communication between your stakeholder or your co-workers.”

How does Photoshop continue to evolve?

Nielson referenced three main ideas when it came to the future of Photoshop — accessibility, machine learning and collaboration.

“I think we’re largely going to continue on the same kind of trends that we’re on now, which is making Photoshop available anywhere somebody wants to do creative work. We’re obviously doing that with iPad and now web.

“I think we’ll continue to invest significantly in machine learning, and better utilization of the hardware that’s available to us. What Apple announced was pretty exciting in terms of a neural engine. And the XDR display coming to the MacBook Pro line.”

“I think that this idea of collaboration is going to become even more important over the next five years. You know, what we’re finding in our research is that a lot of people will collaborate with their co-worker or their stakeholder today, but that even people who don’t collaborate a lot with somebody today are interested in it. Because there’s always somebody that you want to get feedback from, or you want to share with.

“The next generation of users really are kind of expecting, like, ‘Oh, yeah, why can’t I just give you access to the same document that I have.’ I think that aspect of collaboration, working together … certainly everybody’s realized during COVID-19 that it’s more challenging than it should be. And when there are great collaboration tools, they’re amazing. So we think that there’s still a lot of opportunity for Photoshop and for all of Creative Cloud, to build really amazing collaboration tools.”

Regardless, Photoshop continues to take giant steps each year in developing new, approachable technologies that will help creatives everywhere achieve their visions.