What are hot pixels, and how can you remove them?

Hot pixels are defects commonly found in digital cameras. You can identify them because they show up in the same location in every frame — they do not move. They also look extra sharp because each one is just a single pixel. Below, I’ll discuss what they are and cover three ways to remove them.

What are hot pixels?

Hot pixels are caused by electrical charges which leak into the sensor wells. You usually begin noticing them when you go home and look at your images more closely during post-processing.

Hot pixels can become more visible at high ISOs. They can also become more visible when the sensor becomes hot. This can occur when you are photographing during a hot day or evening. They tend to be more visible in dark areas of your image. They tend to show up far more often in long exposure images. And of course, this includes dark night photography images!

Above, I used the Dust & Scratches filter in Photoshop to remove hot pixels. The image is shown at 200% zoom.

A pixel by any other name …

For the purposes of this article, I am not going to differentiate between hot pixels, dead pixels (pixels that no longer receive any information) and stuck pixels (pixels that receive erroneous information, often showing up as red, blue or green). I’m simply going to refer to them all as hot pixels for the sake of expediency.

Three ways to get rid of wayward pixels

1. Pixel mapping

We’ll start right at the source: your camera. Your camera may include a pixel mapping feature in its firmware. The general idea is that your camera takes a black reference frame and then maps out the detected stuck pixels. Easy. Pentax and Olympus, for example, have pixel mapping features.

2. Removing noise

I’ll use Photoshop as an example, mostly because, well, that’s what I use. If you don’t use Photoshop, look for a similar feature on your photo editor.

Photoshop already has an excellent method of getting rid of hot pixels without resorting to third party plug-ins. It’s quite easy.

Using the "Dust & Scratches" filter to eliminate something other than dust and scratches. Many programs other than Photoshop have the ability to remove dust and scratches. Give it a try!

Go to Filter > Noise > Dust & Scratches. This will open a dialog box.

Adjusting the threshold in the "Dust & Scratches" Filter. Look! No more hot pixels!

Set the pixel radius to 1-2 pixels. Then raise the threshold just enough to eliminate the offending pixels without softening the rest of the image.
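Under the hood, Dust & Scratches behaves much like a small-radius median filter with a threshold: a pixel is only replaced by its neighborhood median when the two differ by more than the threshold. Here is a minimal NumPy sketch of that idea for a grayscale image (the function name and defaults are mine, not Photoshop's):

```python
import numpy as np

def remove_hot_pixels(image, radius=1, threshold=30):
    """Replace any pixel that differs from its neighborhood median
    by more than `threshold` -- a rough Dust & Scratches analogue."""
    padded = np.pad(image.astype(np.int32), radius, mode="edge")
    h, w = image.shape
    size = 2 * radius + 1
    # Stack every shifted copy of the image so axis 0 holds each
    # pixel's full neighborhood, then take the median along it.
    windows = np.stack([
        padded[dy:dy + h, dx:dx + w]
        for dy in range(size) for dx in range(size)
    ])
    med = np.median(windows, axis=0)
    hot = np.abs(image.astype(np.int32) - med) > threshold
    out = image.copy()
    out[hot] = med[hot].astype(image.dtype)
    return out
```

With a low threshold, more pixels get replaced by their neighborhood median; with a high one, only extreme outliers (like a 255-value hot pixel in a dark sky) are touched, which matches how the filter's Threshold slider behaves.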

The long exposure night image after reducing noise with the Dust & Scratches filter.

What if the Dust & Scratches Filter messes up other parts of my image?

If this occurs, do one of the following:

Paint out the pixels with Layer Masks

Create a Layer Mask (Layer > Layer Mask > Reveal All). This should create a white mask in the Layers Window. Then click that white mask, select a brush with the color black and paint out all the areas where you feel the image has been adversely affected.

Use luminosity masks
Creating a luminosity mask. Notice that all the darkest areas are white?

Since offending pixels are far more visible in dark, shadowy or underexposed areas, you can target those areas with a luminosity mask. Create a luminosity mask that targets only the darker areas of the image. The pixels should be gone. If other pixels are still visible, you can paint those out on the mask itself with a white brush, brushing on more of the effect.

This is a screenshot of the Layers panel. The luminosity mask is the small black and white mask to the right of the top layer. It’s a Layer Mask, but is targeted to affect the darkest areas of the image. It’s easier and quicker to create a luminosity mask than it is to paint on a Layer Mask, and it can be extremely helpful if you know exactly what you wish to target.
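If you like to think of masks numerically, a darks-targeting luminosity mask simply weights the filtered layer by the inverse of each pixel's brightness: dark pixels take the denoised result, bright pixels keep the original. A rough grayscale sketch (the function name is my own; it assumes 8-bit values):

```python
import numpy as np

def blend_with_luminosity_mask(original, filtered):
    """Blend the filtered (denoised) layer into the original,
    weighted by darkness so only shadow areas are affected."""
    lum = original.astype(np.float64) / 255.0
    mask = 1.0 - lum                     # dark pixels -> weight near 1
    out = original * (1.0 - mask) + filtered * mask
    return np.clip(out, 0, 255).astype(np.uint8)
```

A pure black pixel takes the filtered value entirely, a pure white pixel is left untouched, and midtones get a proportional mix — the same behavior you get from a darks luminosity mask on a Dust & Scratches layer.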

3. Dark frame subtraction

There are two ways of doing dark frame subtraction — in-camera long exposure noise reduction and manual dark frame subtraction in post-processing.

Long exposure noise reduction

You can turn on long exposure noise reduction (LENR) in your camera’s settings.

Remember, though, that this will result in your long exposure taking twice as long. For example, if you do a 3-minute exposure, your camera will then employ LENR for an additional three minutes, so your wait time will be six minutes total. Because of this, you also should not have LENR on if you are doing stacking of any kind. For these reasons, I rarely use LENR.

Dark frame subtraction in post-processing

This is how I prefer to do dark frame subtraction. While you are out in the field taking long exposure photos, take a dark exposure image. 

Set up your camera with the settings you are using. However, keep the lens cap on. This may get puzzled looks from your family if they are not photographers. Take the photo. 

Remove the lens cap and go ahead and take the long exposure. 

When you do your post-processing, open the dark frame subtraction image and your regular image. Put the dark image as the top layer. Change the Blend Mode to Subtract. Boom! That gets rid of whatever errant pixels you may have had.
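The Subtract blend mode is simple per-channel arithmetic: each dark-frame value is subtracted from the corresponding exposure value and clipped at zero. Because the hot pixels sit in the same locations in both frames, they cancel out. A minimal sketch of that arithmetic, assuming 8-bit arrays:

```python
import numpy as np

def dark_frame_subtract(light, dark):
    """Subtract a lens-cap dark frame from the exposure, clipping
    at zero -- the arithmetic behind the Subtract blend mode."""
    # Widen the type first so the subtraction can go negative
    diff = light.astype(np.int16) - dark.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```

In the example below, a hot pixel adds roughly the same offset to both frames, so after subtraction it drops back to the level of the surrounding scene.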

It’s easy to deal with hot pixels

As I mentioned earlier, hot pixels mostly show up in long exposure photos, especially in the dark areas. This unfortunately makes them particularly noticeable, especially when viewed on a large screen. Thankfully, there are numerous ways to remove them. And most of them are easy.

There are other methods of removing them as well. You could use either the Spot Healing Brush or the Clone Stamp tool. However, you would generally need to remove them individually. I tend to use the Spot Healing Brush more for removing sensor dust than for hot pixels.

You might also be able to use a noise reduction program such as Topaz Labs DeNoise AI.

Do you have a preferred method of removing hot pixels? If so, please tell me in the comments!

Five features I want to see coming to Lightroom

Woman using Lightroom on computer

Lightroom has great cloud features. But for all its power, the glaring omissions continue to frustrate.

I’m a big fan of Lightroom for its cloud features and ability to edit on the go from my mobile phone, tablet or laptop. But when producing client work, I always head back to Lightroom Classic.

Here are five features I want to see coming to Lightroom to make it truly as useful as its desktop counterpart.

Better control over storage is missing from Lightroom

I’ve written extensively on how to dig yourself out of the “Help, my Lightroom storage is full!” hole. It’s such a convoluted process, and it should be straightforward. I’d love to see Adobe adding storage control to Lightroom to make it easier to manage what’s in the cloud and what’s local.

Virtual copies that work as actual virtual copies

This one comes up a lot on Adobe message boards: Why won’t Lightroom let us make virtual copies? The Versions feature — which is touted as the equivalent — frankly, isn’t. Versions require you to go into a menu and apply a “version” (i.e. another edit) in a similar way that you apply a preset.

This isn’t the same as a virtual copy which sits in the filmstrip alongside the original copy. I can’t export a monochrome and a color version of the same photo in the same export action with Versions. And that drives me nuts.

(Note: You can make proper virtual copies in the desktop version of Lightroom. But this is missing from Lightroom on mobile and tablet: Versions are your only option.)

Access to display settings when sharing albums

The Share feature of Lightroom is one of my favorite things: I can edit my son’s birthday photos and then share the edited album, with JPEG downloads enabled, to friends and family. This saves a lot of time: I don’t have to export and upload the finished JPEGs to share them.

However, it drives me nuts: For some reason, the default sort in online albums is “random.” I want my photos to display in the album in chronological order, but to set this up, I have to log into the web version of Lightroom (as opposed to the mobile, tablet or desktop versions) and change the display settings from there.

Screenshot Lightroom for web
The web version of Lightroom, accessed at lightroom.adobe.com, is the only place to control full display settings for shared albums.

It’s an extra step of double-handling that could be avoided if the display settings were built into the mobile/tablet/desktop sharing menu.

Greater consistency between the four variants of Lightroom

While we’re on the topic of double-handling, it’s frustrating that the mobile, tablet, desktop and web versions of Lightroom all have slightly different capabilities. I can create proper virtual copies on desktop, but not in the other three versions. I can control shared album display on web, but not in the other three versions.

What I want to do is generally possible somewhere, but having to hop and jump between four different interfaces (and devices!) to find it is annoying.

A sharing space for premium presets

The recent social additions to Lightroom have been interesting to see grow. It’s now possible to see people’s editing process demonstrated in real time and apply their edits to your own photos. I’d love to see this turned into a space where people can group presets and share them from account to account. A universal marketplace-style system with ratings and example photos from real users would be a great addition to the Adobe ecosystem.

Create a mood board for your next photoshoot with Excire Foto

Laptop with excire foto library

If you’re like me, then you love using photo prompts to inspire beautiful photos of your clients. And if you’re also like me, you love a quick way to spark ideas! Let’s check out how Excire Foto can be used with your own library of photos as an idea generation tool.

Excire Foto is an AI engine which automatically sorts your images according to content and the photography style detected in the image. It’s a powerful, fast piece of software that requires zero human input to tag images with keywords for quick searching.

There are so many ways that Excire Foto can be used. From culling photos based on expressions (it has the power to filter photos based on whether people are smiling or not, looking at the camera or not, and more), to searching your back catalog of photos to use in new projects, the applications are endless.

I write about photo prompts on my blog, the Promptographer Guide, and I loaded my library of inspiration images into Excire Foto to use its AI engine to quickly generate a mood board for clients using my own images.

How long does it take Excire Foto to process photos?

My photo prompt idea library currently contains 268 photos, and it took Excire Foto under a minute and a half to process and automatically keyword my images.

Excire Foto ingestion window
It took just under 90 seconds to ingest over 260 photos into my Excire Foto catalog

That’s pretty quick! I am not one of those amazing, meticulous people who keywords every image they edit (you glorious gods, you) so Excire Foto is something of a game-changer for me in this regard.

Create a mood board using the hierarchy in Excire Foto to browse by content

The main fun of Excire Foto happens in the right-hand sidebar, where the keyword hierarchy shows how many photos have been tagged with each keyword.

I’m shooting a wedding in a few days’ time, so let’s check out what ideas I already have in my back catalog of photos.

The hierarchy is divided into Content and Photography. Content keywords use AI to detect what the photo contains, whereas Photography detects the photographic qualities of the images.

Screenshot of Excire Foto keywords
The entire keyword hierarchy in Excire Foto is automatically generated, with keywords AI-detected and attached to photos on import.

I can jump into Event then Wedding and it brings up 21 photos in my catalog that Excire Foto has tagged Wedding. I can narrow that down to the Bridal Couple keyword, or just the Bride keyword.

By selecting and right-clicking I can add all my favorites to a custom Collection. I’ll use these to create a mood board for the job that I can share with my clients (more on sharing later).

Screenshot of Excire Foto add to collection
Create a custom collection in the left side bar and then photos can be easily added from searches or keyword browsing.

Under the keyword Other, then Romantic, there’s a heap of great photo inspiration to save for my mood board.

Screenshot of Excire Foto keywords
Browsing the content hierarchy brings up heaps of inspiration from my own back catalog of photos.

There is so much power in this Content hierarchy. Why not check out Person, Age, Baby / Toddler next time you have a family photoshoot scheduled? You could create a mood board for specific clients or events, or by type of session, all drawing on your own photos as inspiration.

Screenshot of Excire Foto keywords
Excire Foto uses AI to classify people in images by age.

Browsing photos using the Photography keywords in Excire Foto

The Photography half of the keyword hierarchy is another great source for inspiration.

Screenshot of Excire Foto keywords
The Bokeh keyword in the Photography hierarchy brings up anything with a shallow depth of field.

Low-key is extremely popular at the moment, so by checking out my photos tagged Dark, I can find heaps of great inspiration for after dark at the wedding reception.

Screenshot of Excire Foto keyword dark
The Dark keyword is a great source of low-key inspiration.

Another favorite of mine is the silhouette, so more of those go into my mood board collection. Remember, right-click and add to your custom collection to save those images for later use.

Screenshot of Excire Foto keyword silhouette
Excire Foto automatically detects if a photo contains a silhouette.

The final step to create a mood board: Sharing images in Excire Foto

Now that I have grabbed some inspiration for my wedding clients, I can bring up all my collected images by clicking the collection name. I want to share these with my clients as a mood board, and this is easily done with Excire Foto.

The first time you share, you’ll need to set up sharing in the Preferences menu. Hit File, then Preferences, then Sharing, and follow the prompts to connect your Dropbox or Google Drive account. You only need to do this once.

Screenshot of Excire Foto sharing preferences
Excire Foto works with Dropbox or Google Drive to share custom collections of photos.

Right-click the collection and select Share, then wait for the photos to upload.

Screenshot of Excire Foto share submenu
Sharing Collections is as easy as a right-click!

Once done, I have a link ready to share with my clients, and our mood board is ready to go in my Google Drive.

Screenshot of Excire Foto share confirmation
The share confirmation dialogue has the link ready to go, for sharing with clients, your second shooter, or whoever else needs to see your mood board.

Depending on your process, you could have clients add their own must-haves to the folder, or use it as a planning tool at your pre-wedding meeting.

Google Drive automatically sub-folders the upload by date when the Excire Foto integration is set up.

Excire Foto makes it easy to trawl your own photos for inspiration

I love how easily I can make a mood board out of my own photos using Excire Foto. All with no time-consuming effort on my part. Win-win!

Creating a time-lapse video in less than three minutes with GlueMotion

I had almost 100 photos of the starry sky. I had taken these in succession while waiting for the Milky Way to appear. I wanted to turn this into a short time-lapse video.

But there was one issue — the old time-lapse software I had previously used stopped working. After a couple of hours of trying to fix it and get it to recognize file destinations, I gave up.

This is the time-lapse video created with GlueMotion, with music and credits added afterward via DaVinci Resolve. The music is from my band The Mercury Seven. The second half of the video consists of cumulative star trails created by StarStax.

In search of a new time-lapse app

I began looking around for another time-lapse creator. GlueMotion (for macOS) caught my eye. This appealing-looking app had a deflickering feature and control over the frames per second. That sounded exactly like what I was looking for, all in an easy-to-use interface.

I downloaded the demo, and it worked beautifully. The interface was intuitive, all but walking you through the process. I decided to purchase it from the Mac App Store for $17.99.

Very quickly, I had a finished time-lapse video. After two hours of struggling with my previous software, I had created a video in less than three minutes. Here’s how it works.

Step 1: Load the files

GlueMotion will prompt you to browse for the folder you want to load. It loads by folder, not individual files. If there are other folders contained within that folder, it will not load them or any of their contents. However, you can load those separately afterward if you wish.

GlueMotion supports PNG, JPEG, TIFF and RAW files. For this example, I used full-sized JPEG files saved at the highest quality setting, thinking this would be more than enough resolution for a video that would ultimately be seen on YouTube.

Step 2: Frame editing

Here, you are simply specifying the crop, scale, rotation and more. It also offers some basic editing adjustments, such as saturation, exposure and brightness.

Step 3: Deflickering

When making a time-lapse, minute changes in brightness from frame to frame can create unwanted flashing, or flickering. GlueMotion allows you to remove this effect with two modes.

Uniform mode

If your time-lapse gradually changes brightness as the images progress, look no further: Uniform mode is for you. For example, with photos taken in succession as the sky grows gradually darker, Uniform mode would be the best choice.
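GlueMotion doesn't document its algorithm, but a typical uniform deflicker normalizes each frame's mean brightness toward a moving average of the whole sequence — frame-to-frame jumps get smoothed out while gradual trends (like a darkening sky) survive. A rough sketch of that approach (the function name and default window are my own choices):

```python
import numpy as np

def deflicker_uniform(frames, window=5):
    """Scale each frame so its mean brightness follows a moving
    average of the sequence, suppressing frame-to-frame flicker."""
    means = np.array([f.mean() for f in frames])
    kernel = np.ones(window) / window
    # Pad at the edges so the moving average has full support
    padded = np.pad(means, window // 2, mode="edge")
    target = np.convolve(padded, kernel, mode="valid")
    out = []
    for f, m, t in zip(frames, means, target):
        gain = t / m if m > 0 else 1.0
        out.append(np.clip(f.astype(np.float64) * gain, 0, 255).astype(np.uint8))
    return out
```

Because the target curve is a moving average rather than a single global constant, a sequence that legitimately darkens over time keeps its overall trend; only the high-frequency brightness jitter is removed.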

Experimental mode

Use this for basically everything else, such as when the light or exposure keeps changing randomly. I did not use this mode, so I cannot speak to how well it works.

Step 4: Encoding settings

You may choose from a variety of settings to export your video, including pixel size.

GlueMotion can export either a QuickTime Movie (MOV) or MPEG-4 (MP4, M4V). You can also choose the video codecs: HEVC, H264, JPEG, Apple Pro Res 422 and Apple Pro Res 4444. I wanted a reasonably high resolution, so I chose MOV and Apple Pro Res 422.

The frames per second (FPS) setting defaults to 24. I kept it there.

The time-lapse video as originally created by GlueMotion.

What I liked

GlueMotion held my hand and walked me through the steps. I prefer this approach to most video editors, which make you hunt through different menus. The layout was clean and easy to follow, and didn’t look like it was made by propeller-heads. It also worked quickly at loading, editing and exporting. The demo version works flawlessly, but imprints a logo in the bottom corner.

What I didn’t like

So far, nothing! I love this app. Be sure to check out GlueMotion and see how you can quickly develop a time-lapse video with your photos!

I made the time-lapse video from the same images that I used to create these star trails, allowing them to do “double-duty.”

Selective color in Premiere Pro

You can perform selective coloring in Premiere Pro. If you want to change the color of an object in your video, simply use the Lumetri Color panel. Select the clip and open the Hue Saturation Curves section of the panel.

You’ll find a number of available curves, but the Hue vs. Hue curve is what you’ll want to use to change one hue range to another. Use the eyedropper to select the hue you wish to change. Control points will appear on the curve graph. Adjust the middle point to the hue you want. You can add more control points to narrow, expand or fine-tune your selection.
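Conceptually, a Hue vs. Hue adjustment remaps pixels whose hue falls near the picked value to a new hue while leaving everything else alone. A toy per-pixel sketch using Python's standard colorsys module (the function and its parameters are illustrative, not Premiere Pro's actual implementation):

```python
import colorsys

def shift_hue(rgb, picked_hue, width, new_hue):
    """Move a pixel's hue to `new_hue` if it lies within `width`
    of `picked_hue` on the hue circle (all hues in [0, 1])."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    # Circular distance between this pixel's hue and the picked hue
    dist = min(abs(h - picked_hue), 1.0 - abs(h - picked_hue))
    if dist <= width:
        h = new_hue % 1.0
    return colorsys.hsv_to_rgb(h, s, v)
```

Narrowing or widening `width` is the rough equivalent of dragging the extra control points on the curve closer together or farther apart: a red object can be pushed to green while a blue sky outside the selected range stays untouched.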

Watch this short tutorial video to see the Hue vs Hue curve used for selective color in Premiere Pro:

A low light, high ISO noise challenge for Topaz DeNoise AI

Night photographers are always looking for good noise reduction software. After all, noise is the enemy of night photography. Extremely low light coupled with long exposures and high ISOs can make for rather noisy photos. How would Topaz DeNoise AI handle this?

Radio telescopes looking for signs of … artificial intelligence? Have they found it in Topaz Labs DeNoise AI?

The different noise reduction modes

I downloaded the latest version of Topaz DeNoise AI, which was version 3.2.0. With version 3, Topaz states that it has created a completely new AI engine and offers a new low-light model to bring out better details in dark areas. Also new is the Severe Noise mode.

They also renamed some of their other modes:

  • Standard mode was formerly DeNoise AI. This is what Topaz describes as well-rounded noise reduction coupled with detail preservation.
  • Clear was formerly AI Clear. This is for denoising smooth surfaces.
  • Severe Noise is, as the name implies, for addressing excessive noise from high ISOs.
  • The new Low Light mode is intended for challenging lighting conditions with lots of noise.

Selecting a low-light high-ISO image for testing

I chose to test DeNoise AI with a night photo of some radio telescopes I had shot at ISO 4000 for 20 seconds. Often with this sort of photo, I would photograph the foreground using a longer exposure, low ISO setting and blend that clean version with the sky. However, it was 2:30 a.m., and I forgot. It happens. Consequently, I had a lot of noise throughout the image.

Would DeNoise AI be able to denoise the foreground well? Would it keep fine details intact? And would it dull the sharpness of the stars? A tall order indeed.

Testing in Photoshop

Partially out of habit, I do a lot of my processing in Photoshop. Thankfully, DeNoise AI works as a plugin in Photoshop as well as Lightroom Classic. 

I immediately took a liking to the comparison view as well as the ability to quickly switch between different views. 

Comparison View in DeNoise AI. In this example, if you wish to see Clear View, you would select that at the right. Click on the image for a larger view.

I used the same settings in each mode, just for the sake of comparison. To me, the Low Light mode looked the most pleasing. It kept the sharpness of the stars intact. And it denoised the radio telescope.

Zooming in close for some pixel peeping

After applying the effect to a new layer in Photoshop, I wanted to compare before and after more carefully, so I zoomed in to 200%. Nobody except the person doing the processing ever looks at photos this closely. And no one would see this level of detail on any website, and certainly not on a phone. Nonetheless, I felt this would reveal more about how DeNoise AI did.

Above, you can see a zoomed-in, close-up 200% view of the radio telescope and the stars. Toggle back and forth to view the “before” and “after” images.

Shown above is a zoomed-in, close-up 200% view of the supports for the radio telescope, which were very dark and had considerable background and high ISO noise, partially from being dark.

As you compare the “before” and “after” images, you can see that DeNoise AI has fixed the noise quite well while keeping details intact. It even kept the faint stars sharp.

One word: Wow.

I could end my conclusion with one word: Wow. 

I’ll elaborate further. I’m deeply impressed with how well DeNoise AI is able to reduce noise while preserving details. Their AI engine is impressive at differentiating what is noise and what is detail that should be kept. It also preserves contrast beautifully. I have never seen any other noise reduction software that can effectively do both simultaneously.

To preserve this amount of detail, I’ve previously needed to fine-tune the adjustments using a combination of luminosity masks and “painting” on the layer mask to add or remove the noise reduction. It also has an excellent interface and is exceptionally easy to use.

Are there any cons? Sort of. The software has slightly higher graphics card requirements. And sometimes, the processing time can seem slow compared to other software.

So why do I say “sort of”? It comes back to its quality and preservation of detail while denoising. Sure, it takes DeNoise AI a few seconds to provide previews in comparison view. And sure, it may sometimes take as long as 40 seconds to render after applying the effects. I would submit, however, that DeNoise AI is reasonably fast when you consider the processing involved.

Further, DeNoise AI is considerably faster and less work than it would take me to fine-tune the adjustments. That’s something I’ll happily embrace every single time.

Topaz DeNoise AI is the best noise reduction software I have ever used. And coming from a night photographer, that is really saying something.

Photoshop adds new Depth Blur to its suite of Neural Filters

Adobe recently released a new Neural Filter in Photoshop called Depth Blur. This allows you to target the area and range of blur in photos, easily creating wide-aperture depth of field blur effects. You may choose different focal points. However, you may also remove the focal point and control the depth blur through manipulating the focal range slider.

In some ways, this is similar to how some smartphones use “Portrait Mode” to create depth and blur, only with more control.

A quick look at Depth Blur

You may access Depth Blur in the top menu bar under Filter > Neural Filters > Beta. Yes, Adobe considers this a work in progress, so bear that in mind. In fact, Adobe invites you to let them know whether you are satisfied with the results.

When using it for the first time, you will need to download the filter. You may do this by clicking on the cloud with the downward arrow. 

Blurring your photo

In the Depth Blur dialog box, you may control various parameters, including Blur Strength, Focal Range and Focal Distance. Focal Distance is disabled when you are choosing a Focal Point. Focal Range has a more pronounced effect when using a Focal Point.

Also included are controls for Haze, Warmth, Brightness and Saturation.

Additionally, you may output the depth map only to use for blur … or any other effect, for that matter.

Experimenting with Focal Point

Original photo of Natural Bridges State Beach, Santa Cruz, CA.

I began experimenting with my long exposure photo of Natural Bridges State Beach in Santa Cruz, CA. Using the dialog box, I chose the rock in the water as the focal point. I was able to easily blur the area surrounding the rock, including the foreground where the water is rushing up on the beach. This created a selective tilt-shift effect not unlike what a Lensbaby lens might do.

Photo after applying Depth Blur. I used a Focal Point on the rock. As you can see, it created a pleasing blur in the foreground. Note that the bird on the left side of the rock is blurred.

Experimenting with Focal Distance

I chose an abandoned truck at night for my next photo. I decided not to enable the Focal Point. Instead, I manipulated the blur largely through the Focal Distance Slider. 

Manipulating the Depth Blur sliders. The Focal Distance slider allowed me to push the blur effect farther into the distance.

As you can see, the photo generally looks good. However, you can see that there are some “mistakes.” For instance, it did not apply a distance blur to some of the streaking lights of a train or the bus in the background, apparently thinking this was part of the truck. It also blurred part of the bar in the front.

Remember, though, this is still beta, and is a cloud-based plugin. You may give feedback to help out Adobe’s machine learning if you wish.

The quality of the blur itself seemed pleasing, with a decent amount of “creaminess.” 

I experimented with cranking the blur strength to see how it looks here.

Thoughts

The Depth Blur filter obviously has issues. However, it shows strong promise. As machine learning becomes more proficient, this can only get better. This will also likely continue to evolve and change before it becomes an official release.

In the future, might this replace fast lenses? Do you feel Instagram will be flooded with images showcasing Lensbaby-type effects or ridiculously shallow depths of field? It’s possible. However, in the hands of someone who has taste and wants blur for specific reasons or effects, it could be another fantastic tool for photographers and graphic designers.


Sky replacement round two: LuminarAI vs. Photoshop

One of the most anticipated recent features in both Luminar and Photoshop is AI-powered sky replacement. Many of us night photographers would love to blend our “blue hour” photos with photos of the night sky taken later from the same camera position. This can be time-consuming. Which tool can quicken the process more?

Skylum LuminarAI SkyAI vs. Adobe Photoshop Sky Replacement

I decided to compare the most recent version of LuminarAI‘s SkyAI — which is receiving an update this month — with the most recent version of Photoshop 2021. I used a day photo with complex pine needles. Such fine details provide a challenge for sky replacement programs.

I also decided not to go through a detailed, step-by-step process. The idea here is to see what one could do right away using minimal fussing to provide an apples-to-apples comparison.

LuminarAI SkyAI

I loaded my photo of Mt. Pinos, above.

Above is the appearance after loading my custom sky and then using the Vertical Offset slider to adjust the sky’s position.

Above, I adjusted the brightness to help blend the sky with the foreground a little more. I then adjusted several sliders in Mask Refinement. I was able to fit the stars around the horizon, including the pine needles and branches of the trees, by adjusting these parameters.

Above is a zoomed-in view of the horizon at 220%. This is of course pixel peeping. No one would ever view your final version like this. However, it’s educational to see what is happening.

Adobe Photoshop Sky Replacement

I began with the same photo again. Above, I’ve actually already loaded in the replacement sky, but have it turned off to show the initial photo.

Above, I have loaded in my custom sky. This is how it appears after adjusting the position of the sky. This is easily done with the Move tool on the upper left column.

I noticed that there was a lighter halo around the horizon, including the trees. I adjusted some of this by painting on the Foreground Lighting Layer Mask, located on the lower right of the screenshot above. I then zoomed in to the same area as the LuminarAI image. Like before, I zoomed in at 220% for educational purposes.

Comparison

I immediately saw four noticeable differences:

  1. There is a curious color difference. Adobe has changed the color of the sky from the original night sky. This can easily be addressed. Photoshop is, after all, a powerful program. However, it’s curious that it occurred at all.
  2. The details in the trees, notably in the trees on the mountain, are preserved better in LuminarAI.
  3. The top of the foreground is darker in Adobe than it is in LuminarAI.
  4. Despite my addressing the lighter blue halo around the horizon and trees, it’s still noticeable. I could have taken more time and removed the halo. However, the fact remains that there is still a halo there, and none with LuminarAI.

See if you can find other differences. The stars might be off slightly from one to the other because I was “eyeballing” their relative positions. Try to see if you can spot differences in the quality of the blend and any artifacts.

What else puts LuminarAI ahead? Reflections

In addition to the adjustments Skylum has made to SkyAI that I mentioned above, there’s one much-anticipated feature that puts it over the edge — reflections. You can now swap out the sky and have it reflect in the water below. I used a photo below, by Bryan Esler, to illustrate this.

I spent approximately two minutes with each sky replacement program. I wanted to do this to make the comparison reasonably fair.

This is an updated version of LuminarAI, set to be released on March 16. I was able to get an early look at it for this article. The sky replacement is noticeably better than the previous version. I have no doubt that Adobe will continue to develop their sky replacement tool. The advantage that Adobe has is considerably more options and power in tweaking the initial sky replacement. I like options and power.

However, this is in contrast to Skylum’s philosophy of having AI do more and more of the heavy lifting and decision-making for you so you spend less time in post-processing. For now, in early March 2021, LuminarAI seems more effective and appealing for my workflow. However, the battle will go on, and in the end, we the consumers are the winners.

Four steps to Super Resolution in Camera Raw

Adobe recently introduced Super Resolution. This is part of the new Camera Raw 13.2, although it will soon appear in Lightroom and Lightroom Classic. It continues a set of features, including Raw Details, that produce greater detail while reducing artifacts.

What does Super Resolution do?

Super Resolution can create a quadruple enlargement! For example, you can take a 12-megapixel image and resize it to 48MP.

It gets better, though. Enlarging a photo often creates blurry details. That’s not the case with Super Resolution. It uses advanced machine learning to enlarge photos while retaining edges and details.

Why do we need this?

It’s extremely handy for making large prints. Not all cameras have super-high resolution. Most modern phones produce 12MP images, for example. It’s also great for when you crop a photo noticeably.

I tested this with an image from a Nikon D750 and was blown away by the results. It took a 24.2MP image and created a 96.6MP image that appeared to be at least as sharp as the original RAW image.

How do I use Super Resolution?

Follow these four steps:

1.) Open the RAW image in Camera Raw.

2.) Your image will appear in a filmstrip with smaller thumbnails. Right-click and select Enhance. This produces the Enhance Preview dialog box.

3.) Check the Super Resolution box. You can click and hold to see how the original image looked in comparison to the enhanced image. Also, the dialog box will give you an estimated completion time to enhance the image. It took seven seconds for mine on a 2017 iMac. 

4.) Click the Enhance button. 

This will produce a new RAW file in the DNG format which contains the enhanced photo. You may continue editing the file just like any other photo.

Seeing double

Super Resolution doubles the linear resolution of the photo. This means that your image will be twice the width and twice the height, which is four times the original number of pixels. The difference is noted at the bottom of the above comparison photos.
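The arithmetic above is easy to check for yourself. Here is a tiny Python sketch; the function name is my own, and the Nikon D750 dimensions are the commonly quoted sensor output of 6016 x 4016 pixels.

```python
def super_resolution_size(width, height):
    """Super Resolution doubles each linear dimension of a photo,
    which quadruples the total pixel count."""
    return width * 2, height * 2

# Nikon D750 output: 6016 x 4016 pixels, about 24.2MP.
w, h = super_resolution_size(6016, 4016)
print(w, h, round(w * h / 1e6, 1))  # 12032 8032 96.6
```

That matches the 96.6MP figure I saw in my test: twice the width, twice the height, four times the pixels.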

Tips for Super Resolution

The developers at Adobe recommend that you use RAW files whenever possible. This will provide an image that is less likely to have artifacts.

They also point out that you don’t really need to do this with all photos, simply the ones that really need it, particularly print projects from small files. When you do use it, though, the results are stunning.

Apple Photos: Is there already a great photo editor on your Mac?

You have a Mac. You have some photos you wish to bring up to their fullest potential, so you start looking around at photo editors. Before you do, check out Apple Photos. You might be surprised at how good it can be.

Is Apple Photos easy to use?

In a word, yeah. And it’s quick. Photos tucks complex editing tools into several simple controls by default. The layout is extremely intuitive but surprisingly powerful.

There are three groups that allow you to edit your photos: Adjust, Filters and Crop. We will have a look primarily at Adjust. This might give you an idea whether you want to fire up Apple Photos and look at it more. The version in these screenshots is Apple Photos 5.0 running on macOS Catalina 10.15.7 on a 2017 iMac.

For those using macOS 11 Big Sur, the interface and features are similar. There’s a new Vibrance adjustment, as well as some updates to the Retouch tool. You’re also able to import and perform basic edits on video files.

Having a look at Adjust

Open any photo by double-clicking. Then click on the “Edit” button located on the toolbar on the upper right corner. In the middle of the toolbar, click on Adjust, which also has a dial icon. We’ll cover some of the features, located on the right, in order of appearance.

Light

The Light slider auto-adjusts several different parameters as you slide it. Try this first. It might get you where you want to go, as it works surprisingly well.

Click on the Options arrow and you can see the different parameters that it automatically adjusted. Tweak them if you want to adjust further. See what happens. You can always move them back.

Have no fear, all of the tweaks you make can be reversed at any time!

Black and white

Are you into black and white photography? I always love to check out what my photos look like in black and white. Toggle back and forth between color and black and white by clicking the blue checkmark circle on and off for this or any other parameter.

Retouch

This is a surprisingly effective feature that really made me stand up and take notice of Apple Photos. The Retouch feature is designed to remove spots, blemishes and scratches. It does this very well. But I also found that I could remove things like tree branches that were hanging into the corner of my image! That’s impressive.

It’s so unbelievably easy to use that it might be dismissed. It should not be. Just brush the spot you want to disappear, and bam, it’s gone. You can adjust the size of the brush with the slider.

I decided to remove some of the dirt marks on the passenger window. Here is a screenshot of me in the middle of removing them.

Above is the photo after I removed a couple of the dirt marks.

White balance

White balance refers to the relative “warmth” or “coolness” of the color white in an image. It’s often referred to as a color temperature. The pulldown menu gives you neutral gray and two other scenarios. Adjust this to taste. I cooled this image very slightly by moving the slider to the left by a small amount.
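Conceptually, a temperature slider shifts the balance between the red and blue channels: warming boosts red and cuts blue, cooling does the reverse. Below is a crude illustration in Python with NumPy. This is an assumption for illustration only — the function and its parameters are my own, not how Apple Photos actually implements its slider.

```python
import numpy as np

def adjust_temperature(image, amount):
    """Crude white balance shift on an RGB image with values in [0, 1].

    amount > 0 warms the image (boosts red, cuts blue);
    amount < 0 cools it.
    """
    out = image.copy()
    out[..., 0] = np.clip(out[..., 0] * (1 + amount), 0, 1)  # red channel
    out[..., 2] = np.clip(out[..., 2] * (1 - amount), 0, 1)  # blue channel
    return out

# A neutral gray patch, warmed slightly: red rises, blue falls.
gray = np.full((2, 2, 3), 0.5)
warm = adjust_temperature(gray, 0.1)
```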

Sharpen

This is exactly what you think it is. I clicked on the Auto function initially, which is shown above.

The Auto feature over-sharpened the image for my taste. No matter. I clicked on the arrow to the left of Sharpen to open up the individual features. I dialed back the intensity quite a bit, as you can see in the above photo.

A quick look at Filters

As mentioned, there are three major sections at the top of the toolbar: Adjust, Filters and Crop. Filters is somewhat similar to the filters you might see when uploading photos to Instagram. With a click of a button, you can transform your photo to have a certain look. This is otherwise self-explanatory, but I did want to show that this feature exists.

A quick look at Crop

There are numerous useful features in Crop. Above, I decided to ever-so-slightly straighten the photo. I did this by adjusting the dial to the left of the image. This results in a handy grid that allows you to better see if your horizon, or in this case, the truck window, is straight.

Crop also allows you to crop according to common aspect ratios such as square, 16:9, 8:10 and so forth.

A creepy Halloween night photo edited by Apple Photos.

Do I have Apple Photos?

Apple Photos comes bundled with Mac computers running OS X Yosemite (10.10.3) and onward. It was released in April 2015. If you are running an earlier OS, you have iPhoto, which offers some similar tools. By the way, there is a similar version of Apple Photos on iPhones.

The power of vignettes: Directing the light

A good composition is about directing the viewer through the image. One of the many ways that effective photographers do this is by directing the light.

Vignettes are a powerful tool for doing this. And best of all, they’re easy to create!

What’s a vignette?

A vignette is simply a reduction of an image’s brightness or saturation around the edges when compared to the center of the image. A vignette might occur naturally through the lens you use, particularly if you photograph with a very wide aperture. Or we can add it easily through post-processing.
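The brightness reduction a vignette applies can be sketched as a radial falloff from the center. Below is a minimal illustration in Python with NumPy. The function name and parameters are my own inventions for this sketch; it is not how Lightroom or any particular editor implements its vignette.

```python
import numpy as np

def apply_vignette(image, amount=0.3, feather=2.0):
    """Darken an RGB image toward its edges.

    image: float array with values in [0, 1], shape (H, W, 3).
    amount: maximum darkening at the corners (0 = none).
    feather: higher values make the falloff more gradual.
    """
    h, w = image.shape[:2]
    # Normalized distance of each pixel from the image center,
    # 0.0 at the center and 1.0 at the corners.
    y, x = np.ogrid[:h, :w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    r = np.sqrt(((y - cy) / cy) ** 2 + ((x - cx) / cx) ** 2) / np.sqrt(2)
    # Brightness mask: 1.0 at the center, (1 - amount) at the corners.
    mask = 1.0 - amount * r ** feather
    return image * mask[..., None]
```

With the defaults, a uniform white frame keeps full brightness at its center and drops to 70% in the corners — the same subtle edge darkening the sliders control.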

It is easy to create vignettes!

In this example, I will use Adobe Lightroom Classic. However, you can use just about any program and achieve the same effect. I will demonstrate with a night photo, but you may apply vignettes to any kind of photo. It is up to you!

Above, there already appears to be a little bit of vignetting in the original photo. However, the main reason the subject is brighter is because I lit the car grille during the exposure. I let everything else become a little more underexposed.

The lights in the distance are more or less in the center, and also aid in creating interest near the center. I have placed the brightest part of the sky directly over the highest part of the car grille for maximum effect.

Creating a vignette using Lightroom Classic

Above, under the Effects panel, there are controls for Post-Crop Vignetting. You probably already know what to do! Mess around with the controls and get something you like.

I find that for most applications, a small amount of vignetting is all that is needed. Most of the time, you might not want to draw attention to the fact that there is vignetting. Subtlety is key. Here, the amount is just a little.

I have also increased the feathering. This controls how gradually the vignette darkens.

See how easy that was?

Heavy-handed vignetting and hard feathering

Just for fun, I thought I would create an extreme example of vignetting. As you can see, the Amount slider has been moved to the left considerably. And so has the Feathering slider. This is the opposite of a very gradual, subtle gradation from light to dark. For some photos, this might work. For most … probably not.

Vignette controls may already be on your phone!

You don’t need to have Lightroom, Photoshop, Luminar, Affinity or other programs to create vignettes. There’s a great chance that you have controls for this on your phone already. Most phones already have simple photo editing features.

See if you have one on your phone. The Photos app on iPhones, for instance, has the capability to create vignettes easily, similar to what I’ve shown here.

Directing the light to the subject

Subconsciously, the eyes of viewers tend to go toward the brighter, more colorful parts of an image. Vignettes are one more tool in a photographer’s bag of tricks for directing them there. They also have the subtle effect of almost cradling or framing the image.

What sort of photos do you think can benefit from vignettes? Portraits? Sports? Birds? Wedding? Fine art?

When you next look at photos, see if the photographer has used vignettes to direct your eye toward the subject.

Which scenes work best for sky replacement of night skies in LuminarAI?

Night photographers often like to use tracking or stacking to reduce noise in the night sky. One of the main challenges is to try to blend the night sky with a foreground.

Which scenes work best for sky replacement of night skies?

Not all night foregrounds work. Determining which foregrounds blend effectively with a night sky can save aggravation. Scenes with bright foregrounds and deep blue skies work best. But photos taken during the day are not always the best choices for making realistic blends. Blending blue hour photos generally looks more realistic than blending photos taken when the sun is out, due to the quality of the light. Here’s how to do this in LuminarAI.

Five steps to sky replacement of a night sky

1. Open your photo

Open your blue hour photo in Sky AI. With your selected image open in the Edit panel, click on the Creative tab and select the Sky AI tool.

If Sky AI is inaccurate with reading the blend, try brightening the blue hour photo first. Brightening your blue hour photo and creating contrast between the foreground and sky before loading Sky AI can sometimes help the blend. This is especially true when blending complex foregrounds such as trees.

2. Load your sky

Load your sky, either your own noise-reduced sky or any other sky you have on your hard drive. Click on the Sky Selection dropdown and select Load Custom Sky Image.

3. Maximize blend

Adjust the Sky AI parameters to maximize the blend. You’ll probably want to adjust the sliders for Horizon Blending and Horizon Position first to get the sky matching your foreground. After that, adjust the other parameters to make sure that you have closed the gaps and everything is looking good.

4. Relight Scene

After making the base adjustments for positioning and adjusting the gaps, adjust the Relight Scene slider to your liking. Along with this, if you so choose, you may also adjust the Sky Temperature, Sky Exposure and Atmospheric Haze under the Advanced Settings section.

5. Sharpen and adjust color

Let’s quickly make this look a little better. Spend a couple of minutes sharpening and adjusting the color and appearance even more by going to the Essentials tab and adjusting the sliders to your personal taste! Here, I’ve added a bit of glow, added some detail to the sky, and cooled the temperature of the whole scene slightly. I spent about two minutes with these adjustments.

Export

Export your file. Click on the Export tab. This will open a dialog box as shown above. Select the parameters and the location where you will export your file. Congratulations. You’ve just done a complex blend of a blue hour photo and a night sky in only several minutes without doing any complex masking. And almost all of the decisions were creative and aesthetic ones.

Not all night photos work. The best AI is still challenged by dark skies, although the technology is improving rapidly. For now, we can select scenes that have the greatest chance for a realistic, beautiful blend.