This is a camera review. There are a number of appealing updates in the iPhone 8 and 8 Plus, but the one question most upgraders are going to be asking is: how good is the camera?
The camera system in the iPhone is becoming the central focus of its technological advancements. And it’s not just about pictures anymore. With augmented reality and computer vision emerging as contenders for the next major wave in platform development, the camera system is an input mechanism, a communications system and a statement of intent.
If the camera is a platform, the time to begin reviewing the iPhone as a camera is long overdue.
Two years ago, in my review of the iPhone 6s, I noted that the pace of iPhone performance improvements had quickened. The in-between years of the ’s’ models of iPhone were becoming the times when big silicon releases were being injected into the devices — huge leaps in processing power and efficiency arriving in versions that were being discounted as mere iterations on the previous year’s models.
The iPhone 8 proves to be an odd sort of proof for this hypothesis. Apple is launching it one year after the iPhone 7 while nearly simultaneously performing a table flip and announcing the iPhone X just minutes later. Gone is the ’s’ suffix, but continuing is the tradition of “ticking” the silicon improvements forward with crazy speed.
I’ll get this part out of the way right at the top. My recommendation remains unchanged from last year: With nicely enhanced cameras, more power and a fresh look, these are easy phones to recommend if you’re open to an upgrade.
But let’s dig a bit deeper.
Back that glass up
The return of the glass back is going to be a love-it-or-hate-it feature of the iPhone 8. But for me, it’s a love it. Since I gave up my iPhone 4s, I’ve been missing that feel you get from having both sides of the phone coated in glass.
There’s something about the texture of glass. It’s smoother than the aluminum, but typically less slippery and easier to grip. Glass also warms up to the temperature of your hand faster and stays at that temperature rather than getting hot. And when your phone is really chugging along, it distributes the heat from the processor better, making hot spots less pronounced.
But the main reason for the glass coating isn’t any of that; it’s that the back needs to be radio transparent to enable wireless charging. I don’t know the exact internal layout of the phone yet — we’ll have to wait for a teardown — but it’s likely to be a wire loop of some sort that acts as a power antenna. With an aluminum back, Apple would have had to resort to some sort of external antenna, and that was never likely to happen. As it is, you get a nice return to form in the service of function.
So many iPhone screens get broken every year that making both sides glass seems inadvisable at best, but Apple has done a bit to help here.
First, the company works directly with Corning to develop stronger glass. It then gets exclusive access to this glass for some time before any competitor can use it. This is where the ion exchange process Apple has been using in its glass came from. That layer of strengthening is now 50 percent deeper than before. Think of it as a layer of toughened skin, like a callus on your finger.
There’s also a bit more help in the shape of a substructure of copper and steel molecularly bonded together with a proprietary laser process and laid down in a lattice-style layer with gaps to allow things like wireless charging to take place through the back. Steel is strong and copper is a great heat sink. Together, hopefully, they make for a less breakable iPhone.
The aluminum of the body has also been strengthened again, just as it was in the second revision of this frame in the iPhone 6s. So, theoretically, you’re looking at the toughest possible phone made out of glass you can find. For what that’s worth.
The glass also allows for a beautifully translucent effect on the back of the iPhone 8 and 8 Plus. Light travels through the glass, bounces off the undercoat and back out, giving it a layered look. It makes the backs of the aluminum 7 models look positively pedestrian by comparison.
For those interested, the Gold finish has a ton of pink in it. The backside is almost spot on ‘Millennial Pink’, that suddenly ubiquitous shade. I think it’s safe to say Apple thinks Rose Gold is on the outs (the fashion world agrees) and pink is in.
In short it’s like this. From the front: real similar; from the side: real similar; from the back: new hotness.
The best reason to buy a new iPhone
The camera is the best reason to buy a new iPhone this year just as it has been several years running. The iPhone is the world’s most popular camera, by far, and Apple continues to take seriously the business of improving it.
As I’ve mentioned before in these reviews, I have a lengthy history in photography. I’ve been a working pro photographer, and I’ve sold cameras, performed maintenance and run a print lab. This has helped me to recognize the difference between how seriously Apple takes photography and the way that other companies approach “making the camera better.”
There are other smartphones that take excellent pictures; Samsung’s Galaxy S8+, Apple’s most direct competitor in terms of hardware, is among them. However, once you move beyond the basics of increasing resolution, basic optimization and adding catch-up computational features like faux blur, you begin to realize that there’s not a smartphone company on earth that takes it as far as Apple does. It’s just not comparable once you get into the nitty gritty. Here are a few examples you’ll find in the iPhone 8.
Sensor improvements. Same resolution, 12MP, but a bigger sensor overall. This is a great recipe for improved image quality. I’m a big fan of concentrating on the size of the individual photosites, which translates into larger individual pixels, and of making the walls between them deeper; Apple has done both here. Those deeper pixel wells give better isolation between capture elements, so you don’t get the speckle that results from color confusion between two pixels.
There’s also a new color filter. Given that digital camera color filters haven’t changed much in years, I was curious about the details here, but couldn’t learn much more than that it should result in improved dynamic range and color.
High dynamic range (HDR) shooting has also been massively improved, to the point where there is no longer even a toggle for it. You just shoot, and if the camera thinks your picture will benefit from an expansion of tones into the darks and lights, it will use it. And there’s such a tiny lag between the images used to composite an HDR shot that you’ll find very little of the ghosting that happened under the previous system. You’ll notice that you no longer get two shots — one HDR and one not. That’s how confident Apple is. It’s really well done — seamless even.
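For the curious, the core idea behind compositing bracketed exposures can be shown in a toy sketch. This is illustrative only — a real HDR pipeline also aligns and deghosts frames (which is exactly why the tiny inter-frame lag matters), and the function name and weighting here are my own, not Apple’s:

```python
def fuse_hdr(exposures):
    """Toy exposure fusion: merge bracketed shots of the same scene.
    Each exposure is a list of pixel values in [0, 1]; well-exposed
    pixels (near mid-tone 0.5) get the most weight, so dark frames
    contribute their highlights and bright frames their shadows."""
    def weight(v):
        # Peaks at 0.5, falls to ~0 at pure black or pure white.
        return max(1e-6, 1.0 - abs(v - 0.5) * 2)

    fused = []
    for pixels in zip(*exposures):  # walk the frames pixel by pixel
        total_w = sum(weight(p) for p in pixels)
        fused.append(sum(p * weight(p) for p in pixels) / total_w)
    return fused
```

Feed it an underexposed and an overexposed frame of the same scene and the result leans toward whichever frame rendered each pixel closest to a usable mid-tone — which is the intuition behind the expanded tonal range, minus all the hard parts.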
The wide and telephoto sensors in the 8 Plus have both been updated, and the system has 83 percent more throughput, which allows for more data to be passed through in a more power-efficient manner. This will help with rapid fire images, but more importantly it allows for the enormous amount of information that needs to be pushed through the pipeline to support 4K video at 60 frames per second and super slo-mo 1080p at 240 FPS.
Apple’s adoption of the HEVC video format, which is enormously effective at reducing file sizes, assists with this, but that’s still crazy impressive. Especially given that Apple is not playing tricks with video quality, and that this still leaves plenty of overhead for the computer vision smarts it uses on video to determine subject matter.
The results are better color with a wider range of tones across all kinds of shooting environments. Apple is particularly fond of its work capturing skies with the iPhone 8 and 8 Plus, because skies are chock full of light across the spectrum. That holds up in my testing — less banding, more gradations of tone that reproduce more accurately.
Textures in shooting cloth or any other fine detail up close are also improved, with less chance of muddy or moiré-ridden images when the patterns are super regular. This is reflected in the 4K video modes as well, which have better color rendition across dark and light scenes and less artifacting.
Though you don’t get it at a full 240 FPS yet, it may be interesting to some slow-motion videographers that you get continuous autofocus at 1080p 120 FPS now. This should help track subjects during a slo-mo shot as they move toward or away from you.
Similarly, skin tones have been improved, with less heavy-handed smoothing as seen in the iPhone 7. This is a result of sensor improvements, but also of Apple applying deep learning and intelligence to the process of determining subjects and optimizing exposure. More on that later.
Hardware accelerated noise reduction. I know this is going to be a bit in the weeds for some folks, but I’m really excited about this one. Noise reduction (NR) is the process that every digital camera system uses to remove the multi-colored speckle that’s a typical byproduct of a (relatively) tiny sensor, heat and the analog-to-digital conversion process. Most people just call this “grain.”
In previous iPhones this was done purely by software. Now it’s being done directly by the hardware. I’d always found Apple’s NR to be too “painterly” in its effect. The aggressive way that they chose to reduce noise created an overall “softening,” especially noticeable in photos with fine detail when cropped or zoomed.
Here’s the bit where I whined about the way Apple handled reducing noise in the iPhone 7:
I’m still not completely happy with how much noise reduction Apple’s image signal processor (ISP) applies to pictures, but I make this statement fully aware that this is not something most folks will notice.
It makes some sense that the NR would be more aggressive because most people want less ‘grain’ or pixel noise in their images. But it still results, I feel, in a little loss of sharpness in low-light situations. To be clear, this remains basically unchanged from the way that I feel about the way the ISP was tuned in the iPhone 6. Apple has made some insane improvements in the camera this time around, but I hope it does pay some attention to how they reduce noise and tweak that in the future.
Well, tweak it they have. Noise reduction is no longer a software-only feature. It’s hardware-accelerated, multi-band noise reduction done by the image signal processor (ISP) that Apple continues to improve. The result is reduced noise, but with a sharper, crisper feel that doesn’t feature the blotchy byproduct of the previous process. It’s a solid improvement everyone will benefit from, whether they realize it or not.
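To illustrate what “multi-band” means here, consider a toy one-dimensional version. This is a conceptual sketch, not Apple’s ISP: a real implementation works on 2-D images, splits into many frequency bands and tunes each one in hardware. The idea is that fine-grained noise lives in the high-frequency band, so you attenuate only that band instead of smearing the whole image:

```python
def smooth(signal, radius=1):
    # Simple box blur: extracts the low-frequency band of the signal.
    out = []
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def multiband_denoise(signal, strength=0.5):
    """Toy multi-band noise reduction on a 1-D signal.
    Split into low and high bands, damp only the high band
    (where speckle-like noise lives), then recombine."""
    low = smooth(signal)
    high = [s - l for s, l in zip(signal, low)]      # detail + noise
    high_nr = [h * (1.0 - strength) for h in high]   # attenuate, don't erase
    return [l + h for l, h in zip(low, high_nr)]
```

Because the low band passes through untouched, broad shapes and tones stay crisp — the “painterly” softening of the old all-or-nothing approach comes from blurring everything at once.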
‘Zero’ shutter lag
For a while now, iPhones have been keeping a few images in memory before you even press the shutter button. These are recorded but thrown away nearly instantly. In optimal conditions, when you press the shutter, the frame recorded just before your press is the picture you actually get.
This helps compensate for your normal human lag in pressing the button when you see something you want to take a shot of, and the lag of the system itself. It makes the picture-taking process feel instantaneous.
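The ring-buffer idea behind this can be sketched in a few lines. This is my own illustrative model, not Apple’s implementation — the class name, buffer size and timestamp scheme are all assumptions:

```python
from collections import deque

class ZeroLagBuffer:
    """Sketch of a zero-shutter-lag ring buffer. The camera streams
    frames in continuously; pressing the shutter picks the newest
    frame captured at or before the press, hiding human and system lag."""

    def __init__(self, capacity=4):
        # deque with maxlen: oldest frames fall off automatically.
        self.frames = deque(maxlen=capacity)

    def add_frame(self, timestamp, frame):
        self.frames.append((timestamp, frame))

    def capture(self, press_time):
        # Walk backward to the most recent frame at or before the press.
        for ts, frame in reversed(self.frames):
            if ts <= press_time:
                return frame
        return None
```

The key design point is that capture is a lookup, not an acquisition — the photons were already gathered before your thumb finished moving.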
That buffer has gotten a bump in size in the iPhone 8, and Apple is now applying deep learning to optimize the process for the right time to shoot, the subject matter and other bits of intelligence.
The results are difficult to determine without a huge set of examples, because there are a lot of variables on any given shot. But, anecdotally, it does feel like the pictures fire off faster. Verdict: I believe them, but I need more time to feel this one out.
One of the main reasons you hate flash pictures is that they tend to pop the subject with tons of light and reduce the background to a blackish-gray nothing. It kills ambiance and mood, and as phone cameras have gotten better at low-light photography, the flash has been reduced to documenting that weird growth you text your friends and, hopefully, a medical professional.
But pro photographers use flash all the time. Mainly because they have control over the shutter speed as well as the flash. This allows them to choose to leave the shutter open after the flash fires, filling in the background with more light and balancing the exposure.
Now, Apple does this automatically for you. If you take a flash picture of a person or thing and there’s enough light available to “fill” behind the subject, the iPhone will drag the shutter or leave it open automatically. It does this using intelligence around the subject matter, distance, ambient exposure and more in the time it takes to pop the shutter off.
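The decision logic can be caricatured as follows. This is a toy sketch under my own assumptions — the thresholds, function name and lux scale are invented, and the real camera weighs subject distance, motion and full scene analysis, not a single ambient reading:

```python
def choose_flash_exposure(ambient_lux, sync_shutter=1/60, max_drag=1/4):
    """Toy slow-sync decision: with a bright scene, fire the flash at
    the normal sync speed; in dimmer scenes with some ambient light,
    'drag' the shutter (leave it open longer after the flash fires)
    so the background fills in instead of going black."""
    if ambient_lux >= 50:
        return sync_shutter  # plenty of light: normal flash sync
    # Dimmer scenes get progressively longer shutters, capped so
    # handheld shots don't blur into mush.
    drag = sync_shutter * (50 / max(ambient_lux, 1))
    return min(drag, max_drag)
```

A longer shutter gathers ambient light for the background while the brief flash pop freezes the subject — the same trick pros do manually, chosen here automatically.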
The beauty of all of the examples I mentioned above? Every single bit of it is as accessible to the harried parent that wants the best shot of their kid on the first day of kindergarten as it is to a pro photographer. Yes, Apple has a silicon team that’s beyond ridiculous. Yes, it’s at the cutting edge of mobile computational photography and application of deep learning to photos. But you literally don’t have to give a flying animated poop emoji about it to get the benefit.
That is the key Apple innovation: It doesn’t matter whether you read, liked or understood what I just wrote above — you’re still going to benefit with incredible pictures.
(Comparison images: iPhone 7 flash vs. iPhone 8 flash)
Is the camera worth the upgrade?
Nearly every iPhone upgrade for the past several years has been driven by the camera. There have been impressive updates in hardware and feature additions, but anecdotally I cannot count the number of times people have cited the camera as the primary reason that they’re interested in updating their phone.
So, how does the camera in the iPhone 8 and 8 Plus stack up?
When people talk about Apple’s silicon team, they often concentrate on the A-series processors, which is fine. But that team also contributes heavily to the image pipeline in the iPhone’s camera systems.
This is the first year that I’m not saying ‘if you like bigger screens get the bigger one, otherwise get the smaller one’ about iPhones. I flat out recommend the iPhone 8 Plus if you’re in the market for an upgrade and can possibly stand using the larger phone. Why?
Most people aren’t aware of this, but Apple almost never starts a development process for a piece of hardware or a feature with a goal of adding hardware or adding a feature. They start with a question. Sometimes the answer to a question like “Why do people still buy SLR cameras?” is “Because they like the look of a blurry background that separates the subject” and you end up with Portrait Mode.
Now that Portrait Mode is out of beta and looking pretty good in most cases, we’re getting a big update.
The marquee feature of the iPhone 8 Plus is Portrait Lighting. Using deep learning and computer vision, this mode finds faces in an image, detects the planes and angles that need to be lit and applies a variety of different lighting styles that a user can choose from either before or after the picture is taken.
It works better than it has any right to.
Subjects are almost without fail painted appropriately with the different styles of light. A top-down studio style light, a contoured dramatic light that enhances cheekbones and two “stage” lighting modes that drop out the background transform a regular lame snapshot into something you’re proud to throw on the ‘gram or even print out.
The studio and contour options are going to be flooding social networks and phones internet-wide as soon as people get their hands on their iPhone 8 Pluses. The stage lighting takes a bit more effort, but when you nail it and the software is able to do its job by accurately detecting hair and head shapes, it really stuns. It can produce images that feel professional and would take dozens of lights and pieces of equipment to pull off.
That’s not a huge shock because that’s what Apple did to get this right. They took hundreds of thousands of shots using professional lighting rigs set up by master photographers and cinematographers. Then they distilled those down to a set of master characteristics and figured out how to reproduce them computationally.
The feature is still in beta. But this is a look into the future of photography. All photography, not just on smartphones — though that’s quickly becoming the primary way that most people shoot pictures.
And it’s also something else: augmented reality.
While you might not jump instantly to assigning the term AR to this kind of feature, this is how Apple thinks of it. And it’s how the people that I talk to who are really excited about AR are thinking about it.
It’s altering the fabric of reality to enhance, remove or augment it. And it’s going to spread to every aspect of computer vision, photography and imaging. AR isn’t just putting a virtual bird on it or dropping an Ikea couch into your living room. It’s going to be everywhere, and Apple is preparing itself to push hard in this area. All of that is enabled, of course, by custom hardware.
Because the feature is still in beta, there are obviously things it will have trouble with for now, like curly or fine hair. But it’s useful on skin tones from dark to light — it was clearly tested on a wide variety of people. It’s not perfect but it’s still very impressive.
What the hell does Bionic mean anyway?
Nothing. At least, not specifically. The A11 Bionic chip carries that name simply to differentiate it without having to start appending a bunch of digits to it. But that doesn’t mean it isn’t different from what came before.
Bionic is just marketing, but there are real machine learning advantages. It can run instructions at 16-bit instead of 32-bit, for instance, for calculations that don’t need the extra precision, which can massively speed up ML library calculations. It also has enhanced GPU support that lets ML applications better leverage the Apple-designed GPU — GPUs have obvious advantages for ML training and inference. And yes, it does have specific enhancements that Apple uses for its own ML features like Portrait Lighting, Face ID and more.
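The 16-bit point is easy to demonstrate. The sketch below round-trips numbers through IEEE 754 half precision using Python’s `struct` module (its `'e'` format) — this is an illustration of the general trade-off, not of Apple’s silicon: half the bits per number means half the memory and twice the values moved per transaction, at the cost of roughly three decimal digits of precision:

```python
import struct

def to_half(values):
    """Round-trip floats through IEEE 754 half precision.
    Returns the (slightly rounded) values and the packed byte count:
    2 bytes per number instead of 4 for single precision."""
    packed = struct.pack(f"<{len(values)}e", *values)
    return list(struct.unpack(f"<{len(values)}e", packed)), len(packed)

weights = [0.1234567, 1.5, -2.71828]
halves, nbytes = to_half(weights)
# Three numbers fit in 6 bytes; values like 1.5 survive exactly,
# others pick up small rounding error that most ML workloads tolerate.
```

For neural network inference — which is what features like Face ID and Portrait Lighting run — that lost precision is usually invisible, while the bandwidth and power savings are not.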
Apple is angling hard through these waters toward a future where the kinds of neural network processing that have to be done remotely can be done on device. This, of course, is a boon to privacy and security. It allows Apple to promise you that your sensitive data will never leave your hands while still being able to provide the advanced intelligence that modern features require.
Apple’s A11 chip has performance that’s on par with an i5 MacBook Pro — and that’s when it’s limited by having to sit on a battery-powered device. A custom designed performance controller — a CPU traffic cop — sits between six cores tuned to higher or lower performance tasks.
This is all opaque to you, the user, of course. But what it means is that you get more power but the same amount of battery life. It’s an even more impressive feat when you realize that the battery is actually physically smaller than last year’s models.
There is not a single team of engineers at Apple more responsible for the success of the iPhone than its group of chip makers.
Wireless, or contact, charging works, but you have to buy a third-party pad yourself. That’s about all there is to say about it, besides the facts that it works through a case and charges about as fast as the standard adapter.
I’ve had wireless charging on Android phones for a while, and it’s definitely handy. It’s nice to see it come to iPhones and the grab-and-go experience is lovely. We’ll see how good Apple thinks it can really make the experience when its own contact charging AirPad arrives.
Living in the shadow of X
All of this, of course, sits in the shadow of the iPhone X. Had Apple not announced the X when it did, the iPhone 8 would be an easy choice for upgraders, and would have drawn about the same amount of take-it-or-leave-it talk from early adopters that we saw at the launch of the iPhone 7. A bunch of internal upgrades and a nice new glass back, along with the new photography features, mean that Apple will sell plenty of iPhone 8s and iPhone 8 Pluses.
I don’t think Apple knows whether people will lean one way or the other for sure. Their guidance for this quarter certainly indicates that the matter is up in the air. But I do believe that they’re not overly concerned if people buy an iPhone 8 or an iPhone X. You’re still buying an iPhone.
For most consumers, however, the iPhone 8 is the easy traditional choice this year. It’s got nearly every technical enhancement that the iPhone X has outside of the TrueDepth camera and OLED screen. I think the mental calculus on this one is probably closer than it’s ever been, but the framework is roughly the same: if you’re the kind of person who buys the high-end iPhone every year, wait for the iPhone X. The one caveat: if the notch for the depth camera on the front of the X offends you, you still get most of the major tech in the iPhone 8.
I’ve been thinking about this one, and I think the best way to categorize the iPhone X is as a superset of the iPhone 8 series. I’ll talk about that more when it comes time to discuss the X, but for now the iPhone 8 is still going to get you most of the way to “the best” — especially when it comes to the camera.
The iPhone has been the world’s most popular camera for a while now, and it has become a huge reason, perhaps the primary reason, that iPhone users upgrade. Each year, improvements in silicon or design have also contributed to improvements in the iPhone as a camera. This year, Apple has done something really incredible with the Portrait Lighting mode, which is why if you’re in the market for a new iPhone, I recommend the 8 Plus.