New Video Shows iPhone 14 Pro Camera Has a Problem – Here’s What You Need to Know

Apple and Google are in constant competition over who makes the best camera phone. After years of sticking with lower-megapixel sensors, Apple increased the megapixel count on its top-tier iPhone 14 models last year. The iPhone 14 Pro and iPhone 14 Pro Max have a 48MP main camera and now sit among the best camera phones on our list, followed by the Pixel 7 Pro. But some users, as well as popular YouTuber Marques Brownlee (MKBHD), have indicated that the iPhone 14 Pro may have camera issues.

MKBHD says that in the “scientific testing” of phone cameras he has done, the iPhone 14 Pro consistently landed in the middle of the pack and wasn’t close to the best performers. In fact, the Pixel 6a won that test ahead of Apple. That left him wondering where Apple went wrong: despite having one of the best camera systems on a phone, why couldn’t the iPhone produce the best-looking photos?

In a video titled “What’s Happening to the iPhone’s Camera,” he says his theory is that iPhone photos are spoiled by excessive post-processing.

Over-Processed iPhone 14 Pro Photos

These days, having a high-spec camera on a phone isn’t enough to guarantee a good camera phone. The sensor needs to be large enough to capture as much light and detail as possible, but it’s equally important that the phone has strong image-processing software.

Phones don’t have room for large camera sensors like those in DSLRs, so manufacturers compensate with software that processes and corrects the image after it’s captured.
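To make that concrete, here is a minimal, hypothetical sketch of the kind of post-capture correction pass phone software might apply. It is not Apple’s or Google’s actual pipeline; the toy_post_process function, its gamma lift, and its unsharp-mask sharpening are invented purely for illustration.

```python
# Toy post-capture correction pass (illustration only, not a real phone pipeline):
# brighten shadows with a gamma curve, then sharpen with an unsharp mask.
import numpy as np
from scipy.ndimage import gaussian_filter

def toy_post_process(image: np.ndarray,
                     gamma: float = 0.8,
                     sharpen_amount: float = 0.6) -> np.ndarray:
    """image: float RGB array in [0, 1] with shape (H, W, 3)."""
    # Gamma < 1 lifts shadows, simulating the brightening small sensors rely on.
    toned = np.clip(image, 0.0, 1.0) ** gamma

    # Unsharp mask: add back the difference between the image and a blurred copy
    # to boost local contrast (the kind of step that can look "over-sharpened").
    blurred = gaussian_filter(toned, sigma=(2, 2, 0))
    sharpened = toned + sharpen_amount * (toned - blurred)

    return np.clip(sharpened, 0.0, 1.0)

# Example: a dim, flat frame comes back noticeably brighter.
frame = np.full((480, 640, 3), 0.35)
out = toy_post_process(frame)
print(frame.mean(), out.mean())
```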

Many new camera features these days are purely software-based, since there’s only so much manufacturers can do with the hardware. MKBHD says manufacturers are increasingly relying on smart software to deliver good camera quality to consumers.

He also points out that Google “found gold” with its camera-and-software balance from the Pixel 3 onward. But once the company bumped up to a 50-megapixel sensor with the Pixel 6 Pro, things seemed to go wrong for them. The same thing now seems to have happened to Apple. The iPhone used a 12-megapixel sensor for years, but when the iPhone 14 Pro jumped to a 48-megapixel camera, it upset the balance between hardware and software.

The software appears to be doing more work than it needs to now that the phone has a better camera sensor, resulting in over-processed, artificial-looking photos.

Apple’s Smart HDR combines multiple shots taken with different settings, allowing the phone to pick the best parts of each and merge them into a single photo. The result can sometimes look unrealistic, and Apple appears to prioritize brightening the people in these images, which can leave the overall photo looking overly sharp.
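For illustration only, here is a simplified sketch of the general multi-frame merging idea behind features like Smart HDR. It is not Apple’s implementation; the merge_exposures function and its mid-gray weighting are assumptions, but they show how differently exposed frames can be blended per pixel according to how well-exposed each one is.

```python
# Simplified multi-exposure merge (exposure-fusion style), for illustration only.
import numpy as np

def merge_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """frames: float RGB arrays in [0, 1], all with the same (H, W, 3) shape."""
    stack = np.stack(frames)  # (N, H, W, 3)

    # Weight each pixel by how close it is to mid-gray: clipped highlights and
    # crushed shadows get near-zero weight, well-exposed pixels dominate.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-8

    return np.clip((weights * stack).sum(axis=0), 0.0, 1.0)

# Example: merge an underexposed frame with an overexposed copy of the same scene.
dark = np.random.rand(480, 640, 3) * 0.3
bright = np.clip(dark * 3.0, 0.0, 1.0)
merged = merge_exposures([dark, bright])
```

In a real phone pipeline the frames are aligned first and merged with far more sophisticated weighting; the point here is only the blending step the article describes.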

Apple’s new Photonic Engine on the iPhone 14 Pro, which improves the company’s computational photography in mid-to-low-light scenes, sometimes performs very well in favorable scenarios such as clear skies, grass, or good lighting. This can be seen in our iPhone 14 Pro Max and Pixel 7 Pro camera shootout. In the comparison image below, the iPhone 14 Pro Max delivers a brighter, warmer image than the Pixel 7 Pro.

But when a scene mixes different light sources, colors, and textures, the software seems unable to work out which settings suit all of those elements at once.

We found this to be the case in this Times Square shot, where the Pixel 7 Pro gave us a much sharper, brighter image with more detail in elements such as the slanted glass panels above the ESPN sign. To be clear, there’s nothing wrong with the iPhone 14 Pro Max image; it’s just not as good or as sharp as the Pixel’s.

MKBHD says the software is especially inconsistent with skin tones. Google’s Real Tone does a great job of capturing realistic skin tones in a variety of lighting conditions, whereas Apple seems to simply light faces evenly, without taking the scene’s white balance or exposure into account. Sometimes that’s fine, but more often than not it produces overly tweaked results, as in the example below from MKBHD.

Image of Marques Brownlee's face as captured on the iPhone 14 Pro and Pixel 7 Pro.

(Image credit: Brownlee Brands)
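As a rough illustration of the distinction MKBHD is drawing, the toy sketch below contrasts a fixed, uniform brightening of a detected face region with a lift scaled to the scene’s measured exposure. Both functions and the face box are hypothetical and are not a description of Apple’s or Google’s actual processing.

```python
# Toy contrast between uniform face brightening and a scene-aware lift
# (illustration only; the face box would come from a face detector in practice).
import numpy as np

def uniform_face_lift(image, face_box, boost=0.25):
    """Brighten the face crop by a fixed offset regardless of the scene."""
    y0, y1, x0, x1 = face_box
    out = image.copy()
    out[y0:y1, x0:x1] = np.clip(out[y0:y1, x0:x1] + boost, 0.0, 1.0)
    return out

def scene_aware_face_lift(image, face_box, strength=0.5):
    """Lift the face crop only in proportion to how dark it is versus the scene."""
    y0, y1, x0, x1 = face_box
    out = image.copy()
    face_mean = out[y0:y1, x0:x1].mean()
    scene_mean = out.mean()
    # A face that already matches the scene's exposure is left mostly alone.
    boost = strength * max(0.0, scene_mean - face_mean)
    out[y0:y1, x0:x1] = np.clip(out[y0:y1, x0:x1] + boost, 0.0, 1.0)
    return out

# Example: a dim scene with a slightly darker (hypothetical) face region.
scene = np.full((480, 640, 3), 0.30)
scene[180:300, 260:380] = 0.22
flat = uniform_face_lift(scene, (180, 300, 260, 380))
aware = scene_aware_face_lift(scene, (180, 300, 260, 380))
```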

Multiple Reddit users agree with MKBHD, saying they’ve noticed the same issue on the iPhone 14 Pro. Once an image is taken, the phone takes a second to apply its adjustments, after which users say the photo looks “completely different” or “blurred.”

Some users have also said that the drastic difference between the unprocessed and retouched versions is noticeable when a Live Photo is played back in the Photos library.

iPhone Camera Outlook

There’s no need to hit the panic button yet. The over-processing can likely be fixed with a few Apple software updates, and it doesn’t appear to be a major issue or glitch.

Apple will keep pushing forward with its smart camera features. The iPhone 15 is expected later this year, and there are already rumors that it could get a periscope camera for better long-distance photography and zoom. If true, that would be a major hardware upgrade, and we hope the company restores the balance between hardware and software this year.

A few years ago, many Chinese phone makers were known for the most artificially enhanced photos, while the iPhone, along with the Pixel, was praised for producing the most natural-looking images. Now it’s the iPhone that appears to be over-processing, and we’d like Apple to let us fully disable that processing in some cases on the upcoming iPhone 15.

For now, the iPhone 14 Pro Max still holds the crown on our best camera phones list, with the best overall camera system on a phone.
