Take better photos of the smoky orange fire sky


[Photo: The Bay Area woke up to dark orange skies Wednesday due to fires burning across the West Coast. James Martin/CNET]

Much of the West Coast woke up to dark orange skies this morning, as wildfire smoke from hundreds of fires burning across California, Oregon and Washington blocked the sun. 

If, like many of us, you grabbed your phone to capture the creepy scene in a weird year that somehow continues to get weirder, you may have noticed that your phone’s camera wasn’t quite capturing the Blade Runner hue. But why?

While our eyes saw an eerie orange glow, the images from your phone likely appeared desaturated and gray. That’s because your smartphone’s camera is just too smart.

By design, the camera does all the hard photography work for you, making thousands of decisions and corrections without your knowledge. It makes everything “right,” but not always real.

The culprit is the software on your phone’s camera.

Apple’s Senior Vice President of Worldwide Marketing Phil Schiller referred to Apple’s new image-processing system, Deep Fusion, as “computational photography mad science.” Like Google’s camera on the Pixel, the feature uses machine learning to decipher images and produce better-looking shots.

A technique called HDR now merges multiple shots, even choosing individual pixels, to produce an overly real version of the world. Google’s Night Sight on Pixel phones and Apple’s Night Mode make dark nighttime shots almost impossibly better. The iPhone’s Portrait Mode applies made-up blur to backgrounds. Apps brighten eyes and smooth skin. It’s not real. But what is real?
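Curious what that merging looks like under the hood? Here’s a toy sketch in Python (using NumPy) of the core idea: weight each frame’s pixels by how well exposed they are, then blend. This isn’t Apple’s Smart HDR or Google’s HDR+ pipeline (those also align, denoise and tone-map the burst), and the function name and the simple weighting scheme are illustrative assumptions, not any phone’s actual code.

```python
import numpy as np

def merge_exposures(frames):
    """Toy multi-frame merge: weight each pixel by how well exposed it
    is (how close it sits to mid-gray), then take the weighted average
    across frames. A sketch of exposure fusion, not a real pipeline."""
    stack = np.stack([f.astype(np.float64) / 255.0 for f in frames])
    # Gaussian centered at 0.5: well-exposed pixels get high weight,
    # blown-out or crushed pixels get almost none.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    merged = (weights * stack).sum(axis=0)
    return (merged * 255).astype(np.uint8)
```

Give it a burst of dark, medium and bright frames of the same scene and it keeps the best-exposed parts of each, which is why HDR shots hold detail in both shadows and sky.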

For the first 20 years or so of digital photography, it was all about hardware. Bigger and better sensors brought more megapixels and more data. But over the past few years, the industry has shifted from being hardware-driven to software-driven. Most sensors are plenty big enough — with plenty of megapixels providing plenty of data — to make your pictures look good. Essentially, phone cameras are now taking all of that available data the hardware is providing and manipulating it to make your pictures better, without you having to do a thing.

[Photo: An image taken with the default camera app on an iPhone X, left, and an iPhone X image taken with the Halide camera app, with the white balance set to ‘daylight’, on the right. James Martin/CNET]

One of the simple (and usually useful) adjustments the stock camera app on most mobile devices makes is correcting the white balance of photos. The camera senses the color temperature of your light source and, ideally, removes unrealistic color casts, balancing everything out so that objects that actually are white appear white in your photos.

Our eyes, of course, very efficiently adjust to what is “white” in different lighting scenarios, but the auto white balance setting on your camera often has trouble getting it just right. That adjustment, which intends to make everything appear more “natural” and more like what your eyes see, makes it very hard for the camera to capture these smoke-filled wildfire skies the way your eyes do.
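To see why, consider the classic “gray world” trick, the simplest form of auto white balance: assume the scene averages out to neutral gray, and scale the color channels until it does. Your phone’s pipeline is far more sophisticated, but the failure mode is the same. Here’s a minimal Python/NumPy sketch (the function is hypothetical, not any phone’s actual API):

```python
import numpy as np

def gray_world_awb(img):
    """Gray-world auto white balance: scale each color channel so the
    image's average color becomes neutral gray."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)  # per-channel average
    # Push every channel's average toward the overall average.
    gains = channel_means.mean() / channel_means
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```

Feed that an image that really is bathed in orange light and it will happily boost the blue channel until the cast disappears: exactly the “correction” you don’t want here.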

The trick is to manually override the color correction by switching to a more advanced camera app that gives you more control over how the image is captured. By moving away from the auto settings, you’ll make your own choices and (hopefully) capture better images under these odd circumstances. Computers are great, but sometimes there’s no match for human free will.

Two of my favorite apps for more fine-grained adjustments are Halide (iPhone-only, $8.99) and Adobe’s Lightroom (iPhone and Android, free). Both are powerful, professional-level camera apps that offer extensive manual controls and RAW capture.

[Photo: An image taken with the default camera app on an iPhone X, left, and an iPhone X image taken with the Halide camera app, with the white balance set to ‘daylight’, on the right. James Martin/CNET]

Overriding the camera’s default auto control and setting the white balance color temperature to the ‘daylight’ setting will reveal a more true-to-your-eye depiction of the eerie orange glow. This white balance setting is typically designated by a sun icon in camera apps. Not far away on the white balance color temperature scale is the ‘cloudy’ setting, designated by a little cloud, which will also give you a more realistic, deeper orange fire sky.

If you’re shooting RAW in an app like Halide and have manual control over the white balance value in an editing app such as Lightroom, the daylight-to-cloudy sweet spot is around 5,500 to 7,500 Kelvin.
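For the curious, here’s roughly what a fixed Kelvin setting does to the pixels. The black-body color approximation below is Tanner Helland’s published curve fit; real RAW converters use calibrated camera matrices instead, and both function names are illustrative, so treat this as a sketch of the principle rather than what Halide or Lightroom actually runs.

```python
import numpy as np

def kelvin_to_rgb(kelvin):
    """Approximate RGB of black-body light at the given temperature
    (Tanner Helland's curve fit, roughly valid for 1,000-40,000 K)."""
    t = kelvin / 100.0
    r = 255.0 if t <= 66 else 329.698727446 * (t - 60) ** -0.1332047592
    g = (99.4708025861 * np.log(t) - 161.1195681661 if t <= 66
         else 288.1221695283 * (t - 60) ** -0.0755148492)
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * np.log(t - 10) - 305.0447927307
    return np.clip([r, g, b], 0, 255)

def fixed_white_balance(img, kelvin=5500):
    """Neutralize light of the chosen temperature and nothing else,
    leaving the scene's real color cast intact."""
    illuminant = kelvin_to_rgb(kelvin) / 255.0
    gains = illuminant.max() / illuminant  # make that illuminant render white
    return np.clip(img.astype(np.float64) * gains, 0, 255).astype(np.uint8)
```

Because the gains are pinned to 5,500 K daylight rather than computed from the scene, the smoke’s orange cast passes through untouched, which is exactly the point.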

So, remember: There’s more to using a phone’s camera than just pressing a button. While software tools like Smart HDR and Night Mode make the camera in your pocket an incredible tool, don’t always take them at face value. At the end of the day, the rules of photography are the same as they ever were. You just need to learn to use these grand new tools in the best way.


