Tick, tock. Tick, tock. This is the familiar sound of Apple’s iPhone.
Popularized by chip maker Intel, ‘tick tock development’ has come to refer to a major change in a product followed by a refinement of that product. For several years, Apple has focused on large design changes every two years, with healthy, though arguably less showy, upgrades in between. This has resulted in the numbered iPhones and the ’s’ iPhones being considered iterations of one another, rather than wholesale new models.
Though most people assume that those middle models are a simple refresh, Apple has been enacting a new strategy of using the mid-cycle models to set the stage for the power requirements (in terms of raw speed) for its next iteration.
This year, Apple’s iPhone 6s has bucked that trend with not only an enormous processing power upgrade but three marquee features that provide significant value to the customer — and they’ve backed it up with a new purchasing plan that removes a lot of the reasons many people have for not upgrading every year. I’m no analyst, but I believe that this is going to be a record ’s’ year in sales for the iPhone.
In last year’s review of the iPhone 6 and 6 Plus, I took them on a trip to Disneyland. This year, the family went to Disneyland and I went to San Francisco to throw our Disrupt conference, which I’m up early for as you’re reading this. So instead of a travelogue, we’ll have a talk about what it’s like to run a major tech blog almost entirely from the new iPhones and mix that in with some philosophical chit-chat.
In the process, we’ll get into what the big new features — 3D Touch, an upgraded 12MP camera with 4K video recording and Live Photos — mean for the industry. Hopefully, while we do that, you’ll get a decent picture of whether or not they’re appealing to you. Let’s get started.
iPhone 6s & 6s Plus | Review | 6:23
Performance In Between
As I mentioned above, the ’s’ years have grown increasingly important to Apple’s silicon roadmap. The A-series processors that they design internally and have built for them are growing ever more powerful, supporting features that are extremely rare in any device, much less a phone. Let’s paint a little picture of how Apple has been evolving its processors between the ‘tick’ and ‘tock’ cycles.
In our tests, there was a 56.5 percent increase in Geekbench benchmark scores from the iPhone 6 Plus to the iPhone 6s Plus. That follows a 97 percent increase from the iPhone 5 to the iPhone 5s.
In comparison, there was only a 24.9 percent increase from the iPhone 5s to the iPhone 6.
In terms of performance, the ‘tock’ years are really kicking the ‘tick’ years in the butt. There are a variety of possible reasons for this, but when you think about the fact that the chip team is able to turn and burn on new designs every two years — it’s very impressive. Especially when you remember that Apple only started shipping its own chip designs in 2013.
If you’re curious, the iPhone 6 Plus scored a 2716 in multi-core performance and 1517 in single-core. The iPhone 6s Plus notched a blistering 2515 single-core and 4367 multi-core. The iPhone 6s scored similarly.
What do these numbers mean for you? A buttery smooth iOS 9 experience, with none of the lag that some folks have been seeing on other devices. Whether that lag is down to early version issues or not, I don’t know. But on the iPhone 6s and 6s Plus, the app switcher, the new camera and anything else I threw at it reacted with speed and a feeling of fluidity. Third-party apps like games aren’t really stressing anything inside these phones to the max yet, but Apple’s standard apps are performing admirably, as are all of the third-party apps I’ve loaded.
In terms of performance, the ‘tock’ years are really kicking the ‘tick’ years in the butt.
Siri works better than ever, is more aggressive about offering you options, and feels very, very useful. The integration with HomeKit, which I haven’t been able to test so won’t opine on too long, has solid potential to be useful — especially in conjunction with the Apple Watch.
Battery life is only worth mentioning in that there has been some talk about a slight reduction in battery size for the iPhone 6s. I don’t know about that, but I can say that battery life is spec’d to be exactly the same by Apple and that’s the way it felt. Going from a heavy day of email, shooting pictures of my kid, meetings at companies like Facebook, video at a baseball game and late night Slacking all felt roughly identical on the iPhone 6 Plus and iPhone 6s Plus. No appreciable difference. To bed at around 20 percent with no charging during the day. If there’s any change in battery size, it’s likely more than made up for by the new power-saving features in iOS 9.
By the way, given that the vast majority of folks will restore their phones from an iCloud backup, I ditched the whole ‘let’s pretend this is a new iPhone’ testing methodology. I think it’s silly to test phones in a vacuum. So I loaded up my iCloud backup with all of my normal apps — nothing too crazy, not a lot of beta software, just a healthy mix of productivity, games, sports apps and the tools I need to run TechCrunch like Slack, Convo, Notefile, email accounts and messaging clients.
By ‘cloning’ my current iPhone, I’m able to see how they both perform on an equal real-world footing, not as lab test dummies. It’s not the only method, but it’s the only one that makes sense to me.
Panes Of Glass
iOS 7. The big redesign born of a corporate restructuring that saw the departure of Scott Forstall and the installation of Jony Ive as design lead on both hardware and software. At the time, there was a ton of confusion about why Apple would do such a thing, and there were plenty of complaints about the buttons that looked like text, the removal of the delightful texture-heavy designs that had on-boarded millions of new smartphone users so handily, and the design direction at large.
When iOS 7 dropped, I noted that it was a big break with the past (and later explained how it came to be). But I had my theories about the why of it:
“It’s as if Ive imagined the OS as a physical computer made up of sliding panes of frosted glass, with surfaces activated by touch and color. You slide those panes of glass around to access various functions and, as you do, they interact to provide context and a sense of place. Then, he took that and translated it to software.”
Basically, giving iOS the actual properties and physicality of a ‘real thing’ instead of the simulacrum of the real world projected onto glass. Physics and actual depth, rather than the appearance of depth. This is what Apple has been working toward. Not only was iOS 7 much more suited to multiple screen sizes (Apple TV, Apple Watch, dashboards, heads-up displays, virtual reality), it was also laying the groundwork for enabling a three-dimensional computer on a two-dimensional plane.
As is now well-documented, iOS 7 was a painful transition for many designers and app developers (and some users). The payoff for the pain of transition, the struggle with awkward affordances, and the bifurcation of a team at the height of its power? 3D Touch.
Context And Fear
Here’s one thing that I think is important to state: 3D Touch is not the new right-click.
I have a feeling that this is going to be the easy comparison, and the early chatter about it by people who haven’t even tried it is already leaning that way. I can’t stress enough that this is not accurate. Right-click is about adding actions and complexity; a 3D Touch shortcut is about taking away actions and reducing complexity.
The right way to think about this is to imagine a small, furry animal with sharp claws that curls up at the bottom of your subconscious. When you’re using your smartphone and you’re presented with a link or tappable item, that little furry fear nervously flexes its fingers, digging those claws in. Will I be able to get back to this page if I tap this link? Will I lose my place? Will all of the information I put into this form go away? Will my app reload, blasting any sense of context I had when I’m done with the link and want to come back?
This is one of the big things that 3D Touch does. It eases the fear of handling actionable items. It allows you to retain your context while adding something to your calendar, peeking at an email or sneaking a look at a link to see if you really want to read it.
3D Touch is not the new right-click.
Pressing lightly to ‘peek’ and pushing hard to ‘pop’ it into existence provides an escape hatch that eases your mind, and a new iOS 9 affordance injects a ‘back’ button at the top-left corner of any screen you jump to. iOS 9’s new task manager, accessed by a firm press on the edge of the screen (or the standard double-tap of the home button) is also arranged in a much more contextually rich card format — a timeline of your jumping around through apps.
Of course, there is also the simple time-savings factor. Jumping straight to a single component or action of an app from the home screen is lovely. Popping right into a selfie from the camera icon is going to be a popular one. These shortcuts remind me a lot of the scripted actions that you can put together with apps like Workflow or Launch Center Pro. Native support for developers to enable these quick actions is a boon, and I’m very interested in seeing how they support it.
Do not be fooled into thinking that it’s a toy. 3D Touch is a major innovation. The 96 embedded sensors underneath the screen measure the flexion of a new kind of glass at an incredibly small scale to determine pressure.
Here’s an interesting tidbit: the screen measures thresholds of pressure (peek, pop) as determined by Apple’s design team — but it can also continuously measure pressure all the way up and down its scale. The potential of 3D Touch goes way beyond peek and pop and into some very interesting territory.
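To make that threshold idea concrete, here’s a toy sketch in Python. It’s purely illustrative: the threshold values and function names are invented, not Apple’s, and the point is just that discrete ‘peek’ and ‘pop’ stages can sit on top of a fully continuous pressure scale.

```python
def classify_press(force: float, peek_threshold: float = 0.3,
                   pop_threshold: float = 0.7) -> str:
    """Map a normalized pressure reading (0.0 to 1.0) to a gesture stage.

    The thresholds here are made up for illustration; Apple's design
    team tunes the real ones, and an accessibility setting shifts them.
    """
    if force >= pop_threshold:
        return "pop"    # firm press: commit and open the full view
    if force >= peek_threshold:
        return "peek"   # light press: show a preview
    return "tap"        # below both thresholds: an ordinary touch


def line_width(force: float, max_width: float = 12.0) -> float:
    """Because the scale is continuous, analog uses are possible too,
    like pressure-sensitive drawing stroke width."""
    return max(1.0, force * max_width)
```

The two-threshold scheme above is just one way to discretize a continuous reading; a drawing app could ignore the thresholds entirely and use the raw value.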
When the iPhone was launched, the majority of the world, by far, still used their desktop computers to browse the web, shop for things, watch videos and more. Now, many, many common activities have made the leap to being done mostly on mobile devices. In some parts of the world, the smartphone is the primary computer, not an accessory at all.
In order to support that more complex and varied ecosystem, mobile hardware and software has to evolve. For now, all of the 3D Touch actions that Apple has built into iOS 9 are optional. They don’t need to worry so much about things like how people will discover the actions yet. But I wouldn’t be surprised to see apps that run primary features off of this mechanic.
3D Touch In Action
When it comes to using 3D Touch, I found myself utilizing the in-app ‘peek’ and ‘pop’ more than the home screen shortcuts, though those are still useful for many things. Firing up a new email without having to even see my inbox, for instance, was a surprise productivity booster. Getting a message sent without having to wade through messages is sweet.
As a tip, there is a setting inside Settings>General>Accessibility that will allow you to adjust the sensitivity thresholds of 3D Touch. This was doubtless added to help people with motor-skill or grip-strength issues use the feature. But I found that because I jump between apps and use my iPhone pretty ferociously, I wanted the actions to happen more quickly (with a lighter press), so I turned it all the way up to its ‘most sensitive’ setting. Play with this if you have trouble triggering it or trigger it too often.
Sending location and marking location from the Maps icon are also stellar examples of this. Both of those items take at least three taps normally: open>tap>tap. Using pressure, it’s press>tap. That seems minor, but if you’re a heavy user of your phone like I am (especially during conference time) then those multiple taps that you use for email, maps and other apps add up to hundreds or even thousands of interactions a day.
I did some admittedly very sketchy math based on just email and Twitter usage and came up with an average of 840 interactions taking an average of a second each to complete. That’s around 14 minutes a week I spend just tapping around. If you slice that into thirds then I’ll save just under 5 minutes a week by using 3D Touch — note, that’s only email and Twitter, and doesn’t take into account the cognitive load it takes to find a button on the screen once the app loads.
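For what it’s worth, the sketchy math above does check out; here’s the quick sanity check (the 840-interaction estimate is mine and very rough):

```python
interactions = 840        # rough estimate of email + Twitter taps
seconds_each = 1.0        # average time per interaction

total_minutes = interactions * seconds_each / 60
print(total_minutes)              # → 14.0 ("around 14 minutes")

# If 3D Touch shortcuts eliminate roughly a third of those taps:
saved_minutes = total_minutes / 3
print(round(saved_minutes, 1))    # → 4.7 ("just under 5 minutes")
```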
Doesn’t seem like a lot, but once third-party apps get on the bandwagon and you’re able to ‘deep link’ to explicit functions within apps, this has enormous time-savings potential. Slivers of time saved across trillions of interactions on hundreds of millions of devices.
And that doesn’t even take into account the savings inside apps. As I outlined above, peeking at a link without having to load another app and switch back and forth alone is massive.
3D Touch may, on the surface, appear to be a ‘neat trick,’ but it’s the first major step since Rich Corinthian Leather toward a future where our digital devices are as easy to comprehend and manipulate as the physical world is with our fingers.
A Live Photo is two files in one — a 12 megapixel still image and a 15 fps video file that are combined by iOS.
Sights And Sounds
Live Photos are not really a new format. The images, which are accompanied by 3 seconds of video (split before and after your shot) are stored as a .jpg file on your iPhone. The video is a .mov file containing 45 frames that play back at around 15fps when you press and hold on an image. The whole package takes up roughly the space of two regular 12 megapixel images. The appearance to the user is seamless, as iOS sees that they are connected, and presents them as one ‘Live Photo.’
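A minimal sketch of that pairing, in Python. The file names and the class itself are my own illustration, not Apple’s actual on-disk format or API; the frame arithmetic comes straight from the numbers above.

```python
from dataclasses import dataclass


@dataclass
class LivePhoto:
    """Illustrative model of a Live Photo: a still image plus a short
    companion movie that the OS presents as a single asset."""
    still_path: str            # e.g. "IMG_0042.JPG", the 12MP key frame
    movie_path: str            # e.g. "IMG_0042.MOV", the motion clip
    frame_count: int = 45      # frames stored in the companion movie
    fps: float = 15.0          # approximate playback rate

    @property
    def duration_seconds(self) -> float:
        # 45 frames at ~15 fps works out to the ~3 seconds of motion,
        # split before and after the still.
        return self.frame_count / self.fps


photo = LivePhoto("IMG_0042.JPG", "IMG_0042.MOV")
print(photo.duration_seconds)  # → 3.0
```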
In my experience, Live Photos work best when capturing ambience, not action. Because the frame rate is relatively low, moving the camera a ton while you shoot them or having a subject move will display a bit of jitter. If, however, you’re shooting a still image with some moving elements, the effect is extraordinary.
As a reformed professional photographer, I have spent countless hours waiting for conditions to line up to take a shot. When I look at my favorite ones, I can remember the sounds and sights that accompanied those shots: the sound of the surf hitting the rocks at midnight in Monterey Bay as I waited for the moon to be in the right position above the burned out cannery; the *thwack* as a bride in full wedding regalia hit a beautiful stroke off of the 9th tee and my remote flashes went off to capture it; the incredibly funny face and giggle that my daughter made right before she did her best ‘duck lips’ for a picture with her mother.
Until Live Photos, there was no easy way for any normal person to share both of those things at once — a crystal-clear still image, with a sense of place attached.
I could go on about Live Photos; I find them to be very, very powerful, but I think that the best uses of them are still undiscovered. I’ve gotten some lovely test shots, which you can see here and in our video above, and I’m going to leave the feature enabled. But there is some very, very interesting potential here for creative uses.
What possibilities are there, for instance, for telling a story in the moments before and after a photo — and juxtaposing that with the still image for effect? A good example of this is the photo Stephen Colbert shot when he interviewed Tim Cook. It’s a slapstick joke, presented in a way that only makes sense in the Live Photo format. It’s a small thing, but it will be very interesting to see what creators from Snapchat and Vine do with the new format. Especially once social networks and other feeds start supporting it.
If you’re curious about the security and privacy implications of Live Photos, I’d encourage you to check out our piece here, which describes how those photos are handled.
Practical Live Photos tips:
- Other people with iOS 9 can see them, regardless of phone. So can people on the Mac once Apple updates the software in a few weeks.
- It’s best to find a good still image first — this is a picture after all. Once you get decent at it, the before and after moments can be planned, but for most people it’s best as captured ambience.
- A ‘Live’ icon appears at the top of the camera screen as you shoot — hold the phone steady until it blinks out!
- A planned update coming in iOS 9.1 will automatically cut off your recording if you drop the camera, avoiding a lot of 1.5-second blurs of the floor.
These are some of the most impressive camera and image processing feats in a smartphone, ever. Not in raw pixel count (though that’s increased) but absolutely in image quality and finesse of processing. I’ve talked before about how important it is that Apple owns its whole image pipeline, rather than letting a third-party image processor mangle the iPhone’s photos. Never has that advantage been more apparent.
Even though the camera in both new iPhones has been increased to 12MP, there is no appreciable additional noise, even in crappy lighting conditions. Apple has resisted increasing pixel count because the truth of the matter is that more pixels in the same size sensor usually means more heat, more noise and worse detail. Now, they’ve increased it for the first time (both in the front and back cameras).
Because there are more pixels to fit, the pixel pitch has gone down from 1.5µ to 1.22µ (smaller pixels usually means less light), but I haven’t seen any impact on image quality. Apple says that it’s reduced noise by blocking light from bouncing back and forth between pixels, causing confusion that leads to those multicolored speckles you see in what should be a solid color. It also bathes the bottom of the pixel wells in more light by moving the capture lenses on top of them to the top of the shaft.
Basically, Apple used smaller pixels because they had to fit more of them, but countered that with some pretty hard engineering challenges to correct the negative effects. This results in better image quality with more pixels on the same size of sensor.
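As a rough sanity check on the ‘same size of sensor’ claim: pixel count times the square of the pixel pitch approximates a sensor’s light-gathering area, and the two generations come out nearly identical. The 8MP figure is the older camera’s published resolution; everything else here is back-of-the-envelope.

```python
def active_area_mm2(pixel_count: int, pitch_microns: float) -> float:
    """Approximate active sensor area: pixels x (pitch in mm) squared."""
    pitch_mm = pitch_microns / 1000.0
    return pixel_count * pitch_mm ** 2


old = active_area_mm2(8_000_000, 1.50)    # iPhone 6 Plus: 8MP at 1.5 micron pitch
new = active_area_mm2(12_000_000, 1.22)   # iPhone 6s Plus: 12MP at 1.22 micron pitch

print(round(old, 2), round(new, 2))  # → 18.0 17.86
# Within about one percent of each other: the extra pixels fit because
# each pixel shrank, not because the sensor grew.
```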
The detail is much improved, enabling better cropping and zooming of faraway subjects and crystal clear landscapes, portraits and up to 63 megapixel panoramic shots.
Apple has resisted increasing pixel count
I was incredibly impressed by the differences in camera quality between the iPhone 6 Plus and the iPhone 6s Plus. It’s very, very noticeable and very welcome. The images aren’t over-sharpened because they don’t need to be — the detail is already there. The shots I took at night are pleasantly grainy, not so noise-reduced that they’re muddy blobby messes. The optical stabilization in the iPhone 6s Plus is still a very good reason for iPhone photographers to choose it over the iPhone 6s — though both have ‘cinematic’ stabilization done in software.
The new 12MP stabilized time-lapses are gorgeous, and the 1080p slo-mo at 120 fps is a welcome addition to the standard 240 fps at 720p. The front camera has also gotten a nice little 5MP upgrade for vloggers and Snapchatters, and looks better than ever.
But the real crown jewel of Apple’s achievements with the camera is the 4K video mode. Before most TVs can even play 4K back, Apple has enabled it on their iPhones. The challenge with 4K is that it is an ocean’s worth of data bursting through the sensor and into the image processor. Four times the amount that 1080p pushes, which is a lot. That alone would be impressive, but other smartphones have actually shot 4K before, including Samsung’s Galaxy S6.
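The ‘four times the data’ figure is straightforward pixel arithmetic:

```python
def pixels(width: int, height: int) -> int:
    """Pixel count of one video frame."""
    return width * height


uhd = pixels(3840, 2160)   # one 4K (UHD) frame
fhd = pixels(1920, 1080)   # one 1080p frame

print(uhd // fhd)  # → 4: each 4K frame carries four times the pixels
# At 30 frames per second, that is roughly 249 million pixels per
# second streaming off the sensor before compression.
```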
Where Apple’s achievement with the A9 processor really starts to shine is when you realize that you can chop and edit these enormous video files in real-time right in iMovie. Or when you want to look at a bit of video closer and you pinch-to-zoom in and it’s playing back in crisp 1080p at a 4x zoom ratio right on the screen. This takes a mind-boggling amount of processing power, and Apple’s on-board chip is more than capable.
The cinematic stabilization also works in 4K, which is crazy. The phone is able to algorithmically determine how much you’re moving your hands while you’re shooting and dynamically adjust the digital crop to present a stabilized view. By comparison, competing systems arbitrarily crop x% of the image off to use as a stabilization buffer.
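A toy comparison of the two approaches, entirely my own sketch to illustrate the idea rather than how either system is actually implemented: a fixed buffer always sacrifices the same margin, while a dynamic crop can give resolution back when your hands are steady.

```python
def fixed_crop(frame_width: int, buffer_pct: float) -> int:
    """Fixed-buffer stabilization: always sacrifice the same margin."""
    return round(frame_width * (1 - buffer_pct))


def dynamic_crop(frame_width: int, max_shake_px: int) -> int:
    """Dynamic stabilization sketch: crop only as much as the measured
    hand motion requires for this particular clip."""
    return frame_width - 2 * max_shake_px


print(fixed_crop(3840, 0.10))  # → 3456, no matter how steady your hands are
print(dynamic_crop(3840, 40))  # → 3760 when the measured motion is small
```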
The video and still images coming out of the iPhone just got seriously good. Apple is taking its position at the top of the world’s most used cameras very, very seriously.
1080p time-lapse and slo-mo tests | 1:10
Apple says its new Touch ID sensor is twice the speed of the one in the iPhone 6/6 Plus. I’m sure someone will try to measure it, but I think this one metric is enough: The new fingerprint sensor is so fast that you can no longer tap the home button to wake your screen, because it will unlock instantly.
I pull my iPhone out of my pocket with my finger on the home button to tap it and check my notifications. That behavior is out the window now, because by the time it’s out of my pocket, it’s unlocked. It’s incredibly quick. So quick that I think some people will have issues adjusting. Eventually I had to switch to tapping the power button to wake it so I wouldn’t miss my notifications.
I’m 100% convinced that the reason Apple finally added a chronological listing mode to Notification Center is to present you with exactly what you would normally have seen on your lock screen. They knew the Touch ID was going to be so fast that many folks would miss that list.
Good thing they also make the software that runs on their devices.
A New Tempo
The iPhone 6s and 6s Plus represent a new tempo for Apple.
Before the recent event, some of us were wondering how they’d fit all of the rumored announcements into one event.
From what I’ve heard, they did have a lot of trouble with that. Some demos were cut (as does happen) to squeeze everything onstage. There was so much stuff that the iPhone itself became the ‘one more thing’ at the end of the presentation. Likely to nestle the Apple TV inside the program, to indicate its graduation from hobby, but still interesting and indicative of how much product Apple is producing this year.
New Apple Watch finishes, including its first fashion brand collab, new iPad Pro, new Apple TV, the Pencil and a bunch of accessories. This is the second year in a row that Apple has introduced a truckload of new products, and it doesn’t show any signs of stopping.
I have a theory, just between you and me, that Apple is never going to go back to the ‘tick/tock’ cycle for the iPhone. Competition is too fierce, the scope of Apple’s other ambitions is ever increasing, and with chip design in-house and clearly firing on all cylinders, the company has the resources and tools to introduce more rapid refinements to the iPhone than ever before.
We’ll see if that’s true next year. But for now, both the iPhone 6s and iPhone 6s Plus are worth your attention and, if it’s in your budget, a purchase. The camera alone is worth the price of admission — it’s truly great — and the glimpse into the future of three-dimensional computing is just a bonus.