
Your Phone Isn't Capturing the Reality of Wildfires. Here's How to Fix That.

Smartphone cameras use algorithms trained on normal landscapes, and aren't prepared to capture the colors of a climate crisis.
Left: a photo taken with an iPhone. Right: a photo taken with a DSLR. Photos courtesy Paris Martineau

Like many of my fellow East Coasters, I watched the light inside my apartment darken and turn orange around 2 p.m. on Wednesday, and pulled out my phone to get a photo. The air quality outside was in the “hazardous” range, worse than in any other city on the planet, due to wildfires in Canada that have already burned nearly 10 million acres of land.

But when I looked at the picture on my phone screen, it looked like the room usually does on a cloudy day: white walls, grayish sky outside, and none of the disturbing ocher hue that I could see with my bare eyes. What the fuck? 

The technology in our smartphone cameras isn’t ready for the devastating effects of climate change. In-camera algorithms that adjust for sharpness, white balance, and color correction mean that what we see on the screen isn’t always what we see with our own eyeballs. Smartphones made on Earth are not prepared to capture reality when the planet starts to look more like Mars.

My phone, a Google Pixel, corrects the white balance in photos, which under normal circumstances is fine and not noticeable. But on a day when a constantly worsening climate crisis was making itself impossible to ignore in my city, I wanted to preserve an accurate memory of it. So I got out my mini Instax camera and took some pics the old-fashioned way instead, and the results were more true to life:

In 2020, when much of the West Coast suffered unprecedented wildfires and was coated in a similar orange haze, people ran into the same problem with their photos. 

As Wired noted at the time, most smartphones use similar Sony-produced image sensors built around a Quad Bayer filter array, a grid of color filters sitting on top of the light-sensitive chip. That hardware works in combination with processing algorithms that differ between phone brands but are trained on millions of images. “Imaging processors are taught to detect the scene, subject or scenario and then select the necessary exposure settings to achieve the desired shot,” according to Wired. As for why the sky looked the way it did: “What people are essentially seeing is a combination of red light being trapped by particulate matter from the fire and bouncing back from the ionosphere in absence of the other wavelengths.”

Because those algorithms were trained on normal scenarios (sunny days, blue or gray skies, even nighttime shots), and because they can still recognize familiar objects like houses and cars even on an orange day, they try to produce the most normal-looking image possible, even if it isn’t accurate to reality.
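To make that concrete, here’s a toy Python sketch of the “gray-world” heuristic, one of the simplest auto white balance assumptions: that a scene should average out to neutral gray. Phone pipelines are far more sophisticated and machine-learned than this, so treat it as an illustration of the failure mode rather than what a Pixel or iPhone actually runs; the function and sample values are invented for the demo.

```python
# Toy "gray-world" auto white balance: assume the scene should average out
# to neutral gray and scale each channel until it does. On a wildfire day
# the whole scene genuinely is orange, so this "correction" strips the
# cast right out. Illustrative only, not any vendor's real pipeline.
import numpy as np

def gray_world_white_balance(img: np.ndarray) -> np.ndarray:
    """img: H x W x 3 float RGB array in [0, 1]."""
    channel_means = img.reshape(-1, 3).mean(axis=0)   # average R, G, B of the frame
    gains = channel_means.mean() / channel_means      # gains that push the averages to gray
    return np.clip(img * gains, 0.0, 1.0)

# A fake "orange sky" frame: heavy red, some green, very little blue.
orange_scene = np.ones((4, 4, 3)) * np.array([0.9, 0.5, 0.2])
print(gray_world_white_balance(orange_scene)[0, 0])
# -> roughly [0.53, 0.53, 0.53]: the ominous orange becomes a flat gray,
#    much like the disappointing photo on the phone screen.
```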

Last night, when the air quality index in NYC was breaching 400, I took an informal poll of friends on Instagram, where I kept seeing vivid, orange images in people’s stories. I asked if anyone was using special settings on their phones or editing them before posting to look more true to life.

“Is it EDITING if I’m making it look like the actual color of the sky,” technology journalist Megan Farokhmanesh replied. Valid point! I think it gets a pass. Another tech journalist, Paris Martineau, told me she broke out the DSLR to get an accurate photo; DSLR cameras also use processing algorithms, but it’s easier to manually adjust settings like aperture, shutter speed, and ISO to get a more natural shot before you take the photo.

Several people denied editing their photos at all: “no bc i’m stupid and it would look like picnik if i did,” VICE social manager Emily Lipstein said, referring to Picnik, the Myspace-era photo editor that made sepia-tone edits a trend on social media for a cursed while. Others copped to adjusting the white balance in edits to make their photos look more true to life.
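For the edit-it-afterward camp, the fix boils down to warming the white balance back up, which is essentially scaling the red and blue channels of the already-saved JPEG. Here’s a rough sketch using the Pillow and NumPy libraries; the filenames and gain values are made up for illustration, and photo editors expose the same idea as a temperature slider.

```python
# Warm an already-"corrected" JPEG back up by boosting red and cutting blue.
# Filenames and the 1.35 / 0.65 gains are placeholders for illustration.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("flat_gray_sky.jpg").convert("RGB"), dtype=np.float32) / 255.0
warm = np.clip(img * np.array([1.35, 1.0, 0.65]), 0.0, 1.0)   # per-channel R, G, B gains
Image.fromarray((warm * 255).astype(np.uint8)).save("orange_again.jpg")
```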

Designer Chris Vranos posted a video to Twitter showing the orange scene in NYC while holding up a laptop displaying a ColorChecker chart, the kind photographers use to calibrate colors, which forced the phone’s camera to “see” the surroundings more accurately:
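The chart works because a reference with known colors lets you solve for a correction instead of guessing. Here’s a simplified Python sketch of the principle: given a few patches whose true colors are known, fit a 3x3 matrix that maps what the camera recorded to what it should have recorded, then apply it to the whole frame. The patch values below are invented, and a real ColorChecker workflow fits against 24 standardized patches, but the idea is the same.

```python
# Fit a 3x3 color-correction matrix from a few reference patches (least
# squares), then apply it to every pixel. Patch values are invented for
# the sketch; a real chart has 24 standardized patches.
import numpy as np

measured = np.array([    # RGB the camera recorded under the orange haze
    [0.85, 0.25, 0.10],  # red patch
    [0.55, 0.60, 0.15],  # green patch
    [0.40, 0.30, 0.35],  # blue patch
])
reference = np.array([   # RGB those patches are supposed to be
    [0.70, 0.20, 0.15],
    [0.30, 0.60, 0.25],
    [0.20, 0.25, 0.60],
])

# Solve measured @ M ≈ reference for the correction matrix M.
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

def apply_correction(img: np.ndarray) -> np.ndarray:
    """img: H x W x 3 float RGB in [0, 1]; returns the color-corrected frame."""
    return np.clip(img.reshape(-1, 3) @ M, 0.0, 1.0).reshape(img.shape)
```

In Vranos’s video the phone appears to do something like this on its own: with patches it knows should be neutral in the frame, its white balance has an anchor instead of a guess.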

Some smartphones used to offer white balance adjustments in-camera, before the photo is taken, but Apple did away with the feature long ago and Google’s camera app hasn’t had it since 2019. Since every phone brand programs its algorithms slightly differently, there’s no one simple trick that works for all smartphone users, but here’s how to get a more accurate shot on some popular phones:

  • With the iPhone 11, Apple added an image-processing algorithm called Deep Fusion, which users can’t disable. But iPhone users can get around some of that processing, and get a more realistic apocalypse shot, by adjusting the HDR settings, or by downloading a third-party app like Halide and shooting in RAW instead of JPG; RAW preserves far more color data, so the white balance can be set accurately after the fact (see the sketch after this list).
  • On Samsung smartphones, users can turn off the “scene optimizer” and auto HDR settings in the camera app, which makes the phone do less processing and should result in more natural-looking images. Failing that, there are third-party apps like Lightroom for editing photos to look more realistic.
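To show what RAW actually buys you, here’s a minimal Python sketch using the third-party rawpy and imageio libraries; the filename is a placeholder and the white balance multipliers are illustrative, sensor-dependent numbers. The point is that RAW hands you the sensor data before auto white balance bakes in its “corrections,” so you decide how orange the sky stays.

```python
# Develop a RAW file two ways: with the white balance the camera recorded
# at capture time, and with fixed multipliers that leave the orange cast
# alone. Requires the third-party rawpy and imageio packages; the filename
# and multiplier values are placeholders.
import rawpy
import imageio.v3 as iio

def develop(path, **wb_options):
    """Decode a RAW file with the given white-balance options."""
    with rawpy.imread(path) as raw:
        return raw.postprocess(no_auto_bright=True, **wb_options)

# Option 1: the white balance stored by the camera at capture time
# (may already be neutralized if the camera was left on auto).
as_shot = develop("smoke_sky.dng", use_camera_wb=True)

# Option 2: fixed R, G, B, G multipliers, so the scene's real color cast survives.
kept_orange = develop("smoke_sky.dng", user_wb=[2.0, 1.0, 1.5, 1.0])

iio.imwrite("as_shot.jpg", as_shot)
iio.imwrite("kept_orange.jpg", kept_orange)
```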

As for me, with a Google Pixel, I’m out of luck unless I want to download another app or heavily edit photos after they’re taken, so I’ll stick to my Instax to get an accurate photo: a return to tradition that feels right in the apocalypse.