Too-Smart Phone Cameras

I thought this was a really interesting article, and it touches on something I’ve thought about a lot with the newer smartphone photos I’ve seen, both in my classes and out on the webs: what smartphones do by default becomes what we assume is normal. For years I’ve been trying to retrain students who assume all pictures should be brighter than they really should be (concrete walkways shouldn’t glow white, for example). What happens with this next evolution? This article makes an interesting point.

HAVE IPHONE CAMERAS BECOME TOO SMART?
by Kyle Chayka

Apple has reportedly sold more than a hundred million units of the iPhone 12 Pro, and more than forty million of the iPhone 13 Pro since it débuted, in September of last year. Both models are among the most popular consumer cameras ever made, and also among the most powerful. The lenses on our smartphones are tiny apertures, no bigger than a shirt button. Until recently, they had little chance of imitating the function of full-size professional camera lenses. Phone cameras achieved the standards of a basic digital point-and-shoot; many of us didn’t expect anything more. With the latest iPhone models, though, Apple is attempting to make its minuscule phone cameras perform as much like traditional cameras as possible, and to make every photo they take look like the work of a seasoned professional. (Hence the names 12 and 13 “Pro,” which are distinguished from the earlier iPhone 12 and 13 models mainly by their fancier cameras.) The iPhone 13 Pro takes twelve-megapixel images, includes three separate lenses, and uses machine learning to automatically adjust lighting and focus. Yet, for some users, all of those optimizing features have had an unwanted effect. Halide, a developer of camera apps, recently published a careful examination of the 13 Pro that noted visual glitches caused by the device’s intelligent photography, including the erasure of bridge cables in a landscape shot. “Its complex, interwoven set of ‘smart’ software components don’t fit together quite right,” the report stated.

In January, I traded my iPhone 7 for an iPhone 12 Pro, and I’ve been dismayed by the camera’s performance. On the 7, the slight roughness of the images I took seemed like a logical product of the camera’s limited capabilities. I didn’t mind imperfections like the “digital noise” that occurred when a subject was underlit or too far away, and I liked that any editing of photos was up to me. On the 12 Pro, by contrast, the digital manipulations are aggressive and unsolicited. One expects a person’s face in front of a sunlit window to appear darkened, for instance, since a traditional camera lens, like the human eye, can only let light in through a single aperture size in a given instant. But on my iPhone 12 Pro even a backlit face appears strangely illuminated. The editing might make for a theoretically improved photo—it’s nice to see faces—yet the effect is creepy. When I press the shutter button to take a picture, the image in the frame often appears for an instant as it did to my naked eye. Then it clarifies and brightens into something unrecognizable, and there’s no way of reversing the process. David Fitt, a professional photographer based in Paris, also went from an iPhone 7 to a 12 Pro, in 2020, and he still prefers the 7’s less powerful camera. On the 12 Pro, “I shoot it and it looks overprocessed,” he said. “They bring details back in the highlights and in the shadows that often are more than what you see in real life. It looks over-real.”

To read the rest of the article in the New Yorker, click this link.
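The article doesn’t describe Apple’s actual processing, but the “over-real” look Fitt mentions is roughly what aggressive shadow-lifting tone mapping produces: dark regions get brightened far more than bright ones, so a backlit face ends up lit in a way a single exposure never would. Here’s a toy sketch of that idea in Python (numpy only; the function name and constants are made up for illustration, and this is in no way Apple’s pipeline):

```python
# A toy illustration (not Apple's actual pipeline) of why "smart" tone mapping
# can make a backlit face look brighter than it did to the naked eye.
# Assumes a grayscale image stored as a float array with values in [0, 1].
import numpy as np

def lift_shadows(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Blend the image toward a gamma-lifted copy of itself.

    A gamma exponent below 1 brightens dark pixels far more than bright
    ones -- roughly the "details brought back in the shadows" effect.
    """
    lifted = np.power(img, 1.0 - strength * 0.6)  # gamma < 1 lifts shadows
    # Apply the lift most heavily where the image is darkest, so the
    # highlights are left nearly untouched instead of being blown out.
    weight = strength * (1.0 - img)
    return np.clip((1.0 - weight) * img + weight * lifted, 0.0, 1.0)

# A backlit scene: a dark face (0.15) against a bright window (0.9).
scene = np.array([0.15, 0.15, 0.9, 0.9])
print(lift_shadows(scene))  # face pixels rise ~33%; window barely moves
```

The asymmetry is the point: because the adjustment is weighted toward the darkest pixels, the face brightens dramatically while the window stays put, flattening the contrast the eye actually saw.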
