Without a doubt, the biggest update to Adobe’s Creative Cloud platform last week was for Photoshop. Featuring brand-new features like Sky Replacement and Neural Filters, the 30-year-old program is looking ahead to the future by embracing artificial intelligence technologies.
While Sky Replacement and Neural Filters were by far the biggest showstoppers, there were several other updates in Photoshop 2021. These include Intelligent Refine Edge, which lets artists get better selections of hair and objects, as well as a new Discover panel, a refined plugin experience and updates to Live Shapes.
For anyone wondering when Adobe would take the next step with its flagship program, Photoshop 2021 makes it clear that the company already has.
With the excitement swirling around the latest update, I sat down with Stephen Nielson, the Director of Product Management for Photoshop. We discussed what these new features mean and what photographers can expect from artificial intelligence in general.
Lifting up the sky
While Photoshop’s Sky Replacement tool isn’t the first sky replacement tool to market, it certainly holds its own. Featuring a library of skies that’s organized by thumbnail, Photoshop makes it easy for artists to switch out a sky in their photographs.
“I think a couple things are really unique about the solution that we’ve shipped. The detection of the sky and the creation of the mask … I’m quite pleased with the results that we’ve been able to create in Photoshop,” said Nielson. “We held the feature for quite a while to really make it great. The workflow that we’ve created is completely nondestructive. So even if there is a slight mistake in the algorithm, we output all of the masks and the color foreground adjustments as an adjustment layer. Everything is completely nondestructive and re-editable, at a fine grained level.
“Our expert Photoshop users have been using Sky Replacement, and then doing really creative and powerful things with them. Duplicating the sky or copying the mask of the sky and doing some additional blend modes or whatever it might be. It’s been very interesting to see how people have been able to build on the very extensible solution that we have with Sky Replacement.”
“I really love that we built something that is easy. Anybody can do it. But it’s also extremely powerful. You’re in Photoshop, so you can get any result you want. Because we’re working with layers and masks and all those transformation controls, we can get a result that no automated solution can get.”
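The nondestructive workflow Nielson describes — masks and foreground color adjustments delivered as separate, re-editable layers rather than baked-in pixels — can be illustrated with a toy layer stack. Everything below (the `Layer` class, the hand-made sky mask) is a hypothetical sketch for illustration, not Photoshop's actual data model or API:

```python
import numpy as np

class Layer:
    """A pixel layer with an optional mask; nothing is baked into the base image."""
    def __init__(self, pixels, mask=None):
        self.pixels = pixels.astype(float)  # H x W x 3
        self.mask = np.ones(pixels.shape[:2]) if mask is None else mask.astype(float)

def composite(base, layers):
    """Blend each layer over the base through its mask (normal blend mode)."""
    out = base.astype(float)
    for layer in layers:
        m = layer.mask[..., None]  # broadcast the mask over the RGB channels
        out = layer.pixels * m + out * (1.0 - m)
    return out

# A tiny 4x4 "photo" whose top half is sky (standing in for the detected mask).
photo = np.zeros((4, 4, 3))
sky_mask = np.zeros((4, 4))
sky_mask[:2, :] = 1.0

new_sky = Layer(np.full((4, 4, 3), 200.0), mask=sky_mask)  # replacement sky layer
foreground_tint = Layer(np.full((4, 4, 3), 30.0), mask=0.2 * (1 - sky_mask))  # color harmonization

result = composite(photo, [new_sky, foreground_tint])

# Re-editing is just changing a mask or layer and recompositing — the
# original photo pixels are never overwritten:
new_sky.mask[:2, :] = 0.5
softer = composite(photo, [new_sky, foreground_tint])
```

Because the sky and its mask live on their own layers, the duplication and blend-mode tricks Nielson mentions fall out naturally: power users can copy, weaken or restack any piece of the automated result.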
Neural Filters make complex edits easy to achieve
The most exciting feature for me personally was the development of Neural Filters. The possibilities here are endless; Nielson said the team is already working on opening up Neural Filters to third-party developers. Whether you're tweaking a portrait or bringing color to a black-and-white photograph, the technology behind Neural Filters is quite impressive, and it's clear that it will be a big part of Photoshop's future.
“Neural Filters is kind of a fascinating back story. We have a very vibrant Adobe research community, where some of the world’s best computer vision scientists work at Adobe. We work closely with that team to take technology and turn it into features in Photoshop,” said Nielson.
“The unique thing about Neural Filters is that it’s a platform that has a very clever model that’s just operating as a filter. It allows us very quickly to take research technology and bring it into a feature in Photoshop, way faster than we have been able to do in the past. All of the filters that we’re shipping with are things that we’ve known about and have been researching for years. But where it would take maybe two years to be perfected and made as a feature in Photoshop, we can, in a much shorter period of time, bring it in as a Neural Filter.
“Now that we’ve built this platform, we’re going to be able to expand it even more with totally new filters that are based on other research that is happening inside the company.”
The initial filters were chosen based on in-app feedback from beta testers. After seeing that feedback, the Adobe team decided to keep the feedback option in the public release, so that any Photoshop user can submit their thoughts on the filters.
“What I’m really excited about with this feedback mechanism is that there’s an option to add more details, and to include your image in your feedback. And that’s the key to massively improving machine learning models: having a lot of diverse images.”
But how exactly do Neural Filters work? Being able to bring a smile to someone’s face, adjust the tilt of their head and more … there’s a lot of work behind that.
“When we operate on something like smiles, expressions, eyes or even skin, most of the machine learning models work at a very low resolution, usually 1024×1024 [pixels]. Obviously that doesn’t work well to scale up to a high resolution image. So there’s a tremendous amount of work that we’ve had to do to take the low resolution output from the machine learning model that will generate a smile or change the direction of somebody’s head, and figure out how to generate high resolution data on top of that,” said Nielson.
“What has enabled it is all of the foundational capabilities of Photoshop. We’ve done an amazing amount of segmentation, blending work and using capabilities of Photoshop. So it’s not just a machine learning model. It’s a machine learning model that outputs something at low resolution, and then we do all sorts of transformations on it using technology that has existed in Photoshop previously. That’s why you can use Neural Filters on a 35-megapixel image. That’s extremely unique. I’m especially proud of the team for being able to do that.”
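Nielson's description suggests a pipeline along these lines: run the model on a downscaled copy of the photo, compute the low-resolution change it made, then upsample that change and apply it back onto the full-resolution image. The sketch below is a simplified guess at that idea — the `fake_model`, the nearest-neighbor resize and the residual trick are all stand-ins, not Adobe's actual implementation (which, per the quote, layers far more segmentation and blending work on top):

```python
import numpy as np

def resize_nearest(img, h, w):
    """Toy nearest-neighbor resize, standing in for a real resampler."""
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[ys][:, xs]

def fake_model(low_res):
    """Stand-in for a neural filter that only works at low resolution.
    Here it just brightens; a real model would e.g. generate a smile."""
    return np.clip(low_res + 40.0, 0.0, 255.0)

def apply_filter_high_res(image, model, model_size=8):
    """Run the model at low res, then upsample its *residual* onto the full image."""
    H, W = image.shape[:2]
    low = resize_nearest(image, model_size, model_size)
    edited = model(low)
    residual = edited - low                        # what the model changed, at low res
    residual_hi = resize_nearest(residual, H, W)   # upsample only the change
    return np.clip(image + residual_hi, 0.0, 255.0)

# A 32x32 flat-gray "photo"; the model itself never sees the full resolution.
photo = np.full((32, 32, 3), 100.0)
out = apply_filter_high_res(photo, fake_model)
```

Applying the upsampled *difference* rather than the model's raw output is one plausible way to keep the original image's high-frequency detail, which is the crux of the 1024×1024-to-35-megapixel problem Nielson describes.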
What AI means for photographers
Despite Adobe’s strong investment in artificial intelligence and machine learning across its Creative Cloud applications, Nielson sees the technology as a complementary tool rather than something that will completely remove the human touch. Long story short, robots won’t be running Photoshop anytime soon.
“I think our investments in machine learning have two major benefits. They make a lot of the technology and workflows more approachable. Any Photoshop user can figure out how to use Sky Replacement and get a really good result. Neural Filters, same thing. Just go in and move the sliders. But when we built these features, we didn’t have just beginners in mind. We also really focused on making them high quality for professionals.
“In our visits with customers and discussions and research, even expert users of Photoshop will see something and they’ll say, ‘Wow, I know how to do that by hand. And it’s kind of scary that you completely automated that. But now that I think about it, I hate doing that part! That’s not the fun part.’ In all our machine learning investments we very carefully try to focus on the tedious tasks. We’re not trying to use machine learning to replace people’s creativity. We’re trying to enable creativity by taking out the boring, tedious parts and making those more automatic.”
For Nielson and his team, machine learning is really about finding a better approach, even for tools that already exist in Photoshop.
“Machine learning has so much potential for Photoshop — you’re seeing just the beginnings of what is possible. But from the selections to things like Neural Filters and Sky Replacement, these are really crazy, powerful tools that before the discovery of machine learning would have been impossible,” he said.
“Internally we talk a lot about how do we embrace machine learning in a big way, and [how do we] make some of the old ways of doing things obsolete because there’s a machine learning approach to it that’s better. So we’re constantly thinking about new ideas and new applications for machine learning. I think in regard to Neural Filters specifically, we’ve gotten so many suggestions and ideas of what filters we should build next. The thing I’m most excited about is how quickly we will be able to add new filters. With all the work that we did to build the foundations, it will really unlock a lot of opportunities for us to build out lots of filters.”