Review: The iPhone X Goes To Disneyland

Four phones and three years ago, I took the first iPhones to “go big” to the “happiest place on earth” for a road test. It was a watershed year for iPhones, and I started thinking about how best to push them through my brain and out the other side in some way beneficial to the reader.

Most iPhone reviews are done by technically savvy writers who put the phones through their paces locally or in a lab. That just isn’t how most people actually use their phones, and it tends to put the focus on the wrong things.

Instead, what if you took an iPhone to a place where millions of people travel every year and use the absolute crap out of it for several days straight? That, I thought, would lend itself to a much more accurate through line between me fiddling with an iPhone for a few days and what the average buyer might be able to divine about how it might work for them.

The review struck a chord, and it remains the one people talk to me about the most. So I’m back on my…business…and did it again. I’ve had the iPhone X for a week and decided to put it back through the same gauntlet.

Disneyland is a vacation spot that has you using your iPhone like crazy every day. You take pictures, you look up directions, you use it for ticketing and FastPasses for rides. It’s hot and you’re distracted, and if you have kids you’re trying to keep them alive and in proximity while keeping them fed and hydrated enough to actually have fun on vacation. If a character walks by, you need to be able to flip your phone up, shoo your kid over to them so they pause for a minute and nail that in-focus shot. You’re bound to get that work call or need to reply to that email while the screen fights a battle against the nuclear fire of our star. Your fingers are slick with churro grease and sunblock, your battery is getting hammered and there’s (almost) no Wi-Fi. It’s hell week for your phone and still, it just needs to work.

Most importantly, it’s more about how the phone works in a practical setting and less about how two bars on a graph compare to one another.

Early last week, I went to Cupertino to pick up the iPhone X. While I was there, I participated in an interview with four Apple executives: SVP of Worldwide Marketing, Phil Schiller; SVP of Hardware Engineering, Dan Riccio; SVP of Software Engineering, Craig Federighi; and VP of User Interface Design, Alan Dye. I’ll be including some of their relevant answers to questions about the iPhone X’s technology here, but stay tuned later today for the full story of the iPhone X’s development.

Now, let’s answer the question everyone wants to know…

Does Face ID work?

“Arguably the toughest challenge that we had is to replace Touch ID,” Apple’s Dan Riccio says. “It was very, very hard. If we were going to replace it we wanted to replace it with something that was at the end of the day both better and more natural.”

Riccio also flatly counters the narrative that Apple was still trying to use Touch ID in the iPhone X this year.

“I heard some rumor [that] we couldn’t get Touch ID to work through the glass so we had to remove that,” Riccio says, answering a question about whether there were late design changes. “When we hit early line of sight on getting Face ID to be [as] good as it was, we knew that if we could be successful we could enable the product that we wanted to go off and do and if that’s true it could be something that we could burn the bridges and be all in with. This is assuming it was a better solution. And that’s what we did. So we spent no time looking at fingerprints on the back or through the glass or on the side because if we did those things, which would be a last-minute change, they would be a distraction relative to enabling the more important thing that we were trying to achieve, which was Face ID done in a high-quality way.”

Going into this review, my threshold for “success” was whether Face ID worked as well as or better than first-generation Touch ID. I didn’t expect it to nail the speed of the second-gen sensor, which is incredibly fast. As long as it landed between the two I would be happy.

Face ID works really well. First, it’s incredibly easy to set up. You choose to enable it and then rotate your nose around the points of a clock twice. That’s it. Second, it worked the vast majority of the times I tried it; it never once unlocked using a picture of me or another person’s face, and the failure rate seemed to be about the same as Touch ID — aka almost never. As hoped, it’s definitely faster than the first generation of Touch ID, though perhaps slightly slower than the second gen.

At several points, the unlock procedure worked so well in pitch black or at weird angles that I laughed out loud. You get over the amazement pretty quickly, but it feels wild the first few dozen times you do it.

It works so quickly and seamlessly that after a while, you forget it’s unlocking the device — you just raise and swipe. Every once in a while you’ll catch the Face ID animation as it unlocks. Most of the time, though, it just goes. This, coupled with the new “all swipe” interface, makes using the phone and apps feel smooth and interconnected.

And, more importantly, it enables a whole new set of use cases and behaviors that feel organic, natural and just plain cool.

Let’s break it down.

Face ID works by reading the size and presence of your face, as well as the individual contours and three-dimensional shape of your facial features. Apple was essentially screwed before launch on this one because the systems that have shipped so far under the name ‘facial recognition’ are completely full of crap in comparison. I’m talking about the offering from Samsung that can be fooled by a picture of a person, for instance.

This is not “image recognition.” It is a full 3D facial reconstruction and eye-detection system. The first of its kind to ever be deployed at scale, period.

When Face ID did fail for me, it was almost always a function of one of two things: I wasn’t looking at the phone when it made the attempt (I have attention detection toggled on) or it was at too steep an angle and couldn’t see my whole face. If it was pointed at me and I was looking, it opened. There were definitely a couple of failed tries, but no more than I’ve seen with a Touch ID finger placement not being good enough. A second swipe/try typically opened it.

I used it bare-headed, with a hat, with other hats, with glasses, without glasses, with glasses and hat — all of the basic permutations. The only times it wouldn’t work at all is if I had my nose and mouth covered — something that Apple has said from the beginning was a deal breaker. For those of you in cold climates who wear face coverings, start practicing pulling that scarf down to unlock. The nice bit, of course, is that you don’t have to worry about gloves that don’t allow you to unlock your phone.

I eventually found a pair of sunglasses of mine that it could not penetrate. Apple says that the IR spectrum around 940nm is crucial to Face ID’s ability to function, so sunglasses that block that wavelength are an issue. The light it uses is completely invisible to the naked eye, so if you unlock in the dark it works perfectly, but no one sees anything coming from the phone — just for the record.


Here’s the great thing, though: If your favorite pair of glasses happens to block this part of the spectrum, you can turn attention detection off and Face ID still works fine. I like it on, but if you’re a “no-look unlock” kind of person or want to have it work while you keep your eyes on the road then the toggle is there for you.

It’s worth noting now that toggling attention detection off for Face ID is also going to be good for accessibility reasons. Vision-impaired folks, especially, will benefit.

In fact, I believe strongly that Face ID is going to be an incredible boon to accessibility. Touch ID is difficult to operate for many people with motor-skill or mobility issues, forcing them to rely on a simple passcode or none at all. Face ID’s ability to passively know who you are and allow you to begin taking action right from the home screen with VoiceOver is going to be killer. Apple has had a massive lead in building accessibility into its products for some time now, and this is only going to widen the gap.

An interesting wrinkle in the way this works is that, for the first time, the iPhone can actually be truly hands free. You can look at your phone to unlock it and use any command that Siri supports, including launching apps, something that required you to unlock the phone yourself previously.

Some Face ID actions are better than their Touch ID equivalents and some are just different. Paying for an iTunes purchase, for instance, requires about the same amount of time and interaction, because you have to double-tap to pay, but entering a password automatically on a page feels like absolute magic.

Speaking of paying for things, Apple Pay feels faster, and also more natural. You double tap the power button while looking at the phone to prime it and then hold it to a terminal to pay. This means you don’t have to do that awkward “hold the phone by the butt while your thumb is still on the Touch ID sensor” thing I see people doing a bunch. You could always prime a payment by Touch ID, but I think most people figured you had to hold the finger there to confirm, so this should add up to an improved experience for most. The procedure was so quick and smooth that we had to shoot our example for the video review several times to illustrate the payment and confirmation parts of the procedure.

There are other interesting functions of Face ID that fall under a separate category.

Intent and identity

I’ve been interested in the larger field of contextual computing for a few years now. In 2013, Apple bought PrimeSense (the makers of many of the components that are now miniaturized inside the TrueDepth camera). I wrote about the possibilities:

That’s what adding a 3D sensor to a smartphone will give you, additional contextual information that can implement the next wave of ‘intent based computing’…Apple has filed patents that involve customizing a phone’s interface depending on the identity of the user, as detected by sensors that see in 3D. Recognizing the user could add an additional layer of security and personalization to your device.

It’s becoming clear that perceptive computing — devices that are aware of us and their surroundings — is going to be the next big thing in portables. The things we carry with us are getting more ways to gather and interpret data and being able to perceive and leverage 3D space is the hurdle that many major mobile companies have chosen to leap next.

Apple is now beginning to deliver on these possibilities. Slowly but surely, we will see unique and interesting interactions that are only possible because the phone has instant and near-continuous assurance that it is being operated by you.

One prominent example is that notifications for text messages and from apps are now set to “private” by default. This means you’ll see which app a notification came from, but not its content. This was always an option, but now you get the added benefit of notifications being private by default until you look at the device. Once it sees that it’s you looking, the messages expand to show you the private content. It’s super cool: a way to balance privacy and convenience based on context and identity.
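For the developers in the audience, there’s even an app-side hook for this: since iOS 11 an app can supply its own placeholder text to show while its notification content is hidden. A minimal sketch; the category identifier and function name are made up for the example:

```swift
import UserNotifications

// Minimal sketch: register a notification category with a custom placeholder
// that shows while previews are hidden (i.e. before Face ID recognizes you).
// "incoming-message" and registerNotificationCategories() are example names.
func registerNotificationCategories() {
    let messageCategory = UNNotificationCategory(
        identifier: "incoming-message",
        actions: [],
        intentIdentifiers: [],
        hiddenPreviewsBodyPlaceholder: "%u new messages", // %u becomes the count
        options: [])

    UNUserNotificationCenter.current().setNotificationCategories([messageCategory])
}
```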

Another is the password auto-fill function on websites. With Face ID on, simply open a website or an app for which you have a password saved in Keychain and it will pop the Face ID logo and auto-enter your information. Tap log in and go. Once you see this in action for the first time you’ll never ever want to be without it. If an app requires a password for entry it will even log you in automatically, creating a smooth transition between opening, authenticating and using an app. It’s huge for developers of banking apps, password keepers, financial apps and anything else that requires authentication to protect sensitive data.
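The Safari and Keychain piece requires nothing from developers, but apps that want to gate their own content behind Face ID go through the same LocalAuthentication framework that has handled Touch ID for years. A rough sketch of what that looks like; the function name and fallback handling are mine, not Apple sample code:

```swift
import LocalAuthentication

// Rough sketch: ask the system to authenticate the user biometrically.
// On iPhone X this presents Face ID; on older devices, Touch ID.
// Note: iOS 11 apps need an NSFaceIDUsageDescription entry in Info.plist.
func unlockSensitiveContent(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Make sure biometrics are available and enrolled before asking.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false) // fall back to a passcode or password field here
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```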

Intent is a part of Face ID, as well. If your phone notices that you’re not looking at it, it will dim the screen faster than normal in order to save battery. If an alert comes in and you look at your phone, it will tone down the alert sound because it “knows” it got your attention. This is just the tip of the iceberg when it comes to contextual computing, but the era is upon us for sure.

One additional interesting thing about a camera-based authentication system is that it could eventually be included on every computing device Apple makes. I think it makes more sense on, say, an iMac than it does on an iPad — but we’ll see how it goes.

I had an extensive interview previously with Apple’s Federighi about the security and privacy of Face ID which you can read here, so I won’t go over that portion of it again.

The camera

I’ve written about the iPhone 8’s camera extensively, so I’ll focus here on the things that are different about the iPhone X, namely some small optical differences, the TrueDepth camera and stabilization on the telephoto lens.

Due to slightly different optics, the iPhone X has a looser crop than the iPhone 8 Plus at the same distance, roughly a 52mm equivalent versus the iPhone 8 Plus’s 56mm equivalent, if you’re into that sort of thing. That means there is more room to work in Portrait Mode at normal distances, but most people probably won’t notice unless the two are side by side. I know it’s small but I actually like it — it reminds me of shooting a 40mm pancake lens, one of my favorites on Canon and Pentax.

The telephoto lens also gets a boost over the iPhone 8 Plus with a brighter f/2.4 aperture versus f/2.8, which works out to just under half a stop, or roughly a third more light. Not earth-shattering but a nice bump that gets augmented by the stabilization.
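For the photo nerds, those numbers come straight from standard exposure math using the two published f-numbers; nothing Apple-specific here:

```latex
% Light gathered scales with the inverse square of the f-number N.
\frac{E_{\mathrm{X}}}{E_{\mathrm{8\,Plus}}}
  = \left(\frac{N_{\mathrm{8\,Plus}}}{N_{\mathrm{X}}}\right)^{2}
  = \left(\frac{2.8}{2.4}\right)^{2} \approx 1.36
\qquad
\Delta_{\mathrm{stops}} = \log_{2} 1.36 \approx 0.44
```

So about a third more light per exposure, a bit under half a stop, which in dim ride interiors mostly cashes out as a slightly lower ISO or a slightly faster shutter.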

I really got a feel for how much the stabilization in the telephoto lens affected my shots when taking pictures of landmarks at night. These shots of the Guardians of the Galaxy tower really highlight the difference in sharpness that you see with a stabilized lens.

The second big way that a stabilized telephoto lens improves your images is in Portrait Mode, especially in anything but bright sunlight. The stabilized lens gives you more confidence to flip it into Portrait Mode in any light that supports the feature. Adding stabilization essentially allows you to shoot all the way down to the low-light cap on the portrait effect itself, which is great.

Shots inside ride buildings, in the twilight, even in open shade were all improved. Bright sunlight is actually pretty terrible for portraiture altogether because people squint and the hard shadows do no one any favors. But when you duck out of the sun your ability to shoot takes a hit, so it’s a catch-22. The iPhone X shows a big improvement here over even the recently released 8.

Another place where stabilization comes in very handy is when shooting close up. If you’re taking macro images of flowers or details or, say, bacon, the stabilized lens will help immensely with fine detail and preventing motion blur.

Similarly to a telephoto situation, any motion of your hands can be greatly amplified because of the distance and detail levels of what you’re shooting. As you can see with my breakfast, stabilization means less blurry bacon.

Conversely, of course, if your lighting conditions are good (bright and sunny), then you’re going to see almost zero difference between the two cameras. Aside from the stabilization addition, we’re looking at the same sensors and ISPs. There are slightly different optics in the iPhone X, but that did not seem to affect quality in any way for me. That stabilizer, though, really comes in handy in portraits, macro shots and telephoto shots if the lighting is anything less than ideal.

As I bummed around the park testing the iPhone X, I found myself defaulting to the 2x mode a lot. This allowed for some great sharp captures inside rides at a zoom that simply weren’t possible before. I’ve gotten lucky a handful of times with phones in the past, but never with a telephoto lens. The train vignettes, Pirates and other rides are so incredibly dark and dramatically lit that they’re a huge stress test for a zoom lens on a phone. The results were very impressive.

The telephoto also produces much more pleasant-looking results with a little bit of depth-of-field compression and good bokeh even on “non-portrait” images. It turns the X into a stabilized candid lens that feels great for ‘street photography’ style images. You realize just how much the very wide angle standard lens limits your creative choices once you start shooting more with the telephoto.

Given that the sensors are the same, the quality of the images shot with the standard lens should be essentially identical, but I’ll jump ahead here to say that the OLED screen in the iPhone X makes every image look better, with more depth and color information across a broader range. It makes new images and old images alike better to look at. More on that when we talk about the screen in depth.

The TrueDepth camera

The one other big addition to your camera bag on the iPhone X is the TrueDepth camera on the front. This selfie upgrade is what makes Face ID possible, but it also enables Portrait Mode for selfies. The effects are the same as on the rear dual-lens system, but with the accurate depth map provided by the dot projector, Apple is able to do it with a “single” camera.
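That depth map isn’t locked away in the camera app, either; third-party apps can pull the same depth stream through AVFoundation. A rough sketch of the capture setup, assuming iOS 11.1 or later; the function name and structure are mine, not Apple sample code:

```swift
import AVFoundation

// Rough sketch: a capture session that streams depth data from the front
// TrueDepth camera alongside video.
func makeTrueDepthSession(delegate: AVCaptureDepthDataOutputDelegate,
                          queue: DispatchQueue) -> AVCaptureSession? {
    guard let camera = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: camera) else {
        return nil // no TrueDepth hardware on this device
    }

    let session = AVCaptureSession()
    session.beginConfiguration()

    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    // Depth frames arrive on the delegate's
    // depthDataOutput(_:didOutput:timestamp:connection:) callback as AVDepthData.
    let depthOutput = AVCaptureDepthDataOutput()
    depthOutput.setDelegate(delegate, callbackQueue: queue)
    guard session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(depthOutput)

    session.commitConfiguration()
    return session
}
```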

The mode works, no doubt. The same rig that ensures it can see your face in 3D means that it has a bit more data to work with to determine how to create the “faux depth” look associated with portraits. Where it differs is how that effect is applied. I found in my testing that, due to the much closer distances that you’re shooting, you’re going to end up with a much shallower range of an “in focus” subject.

Basically, less of the subject will be in focus and more of it will be blurry.

This is to be expected and, in fact, is the exact same thing that would happen on an optical lens at a wide aperture placed that close to a face. Many photographers shoot portraits with wide aperture lenses up close specifically to make sure that the eyes are sharp and everything else from basically the ears back is out of focus. This puts the eyes, lashes, cheekbones and mouth into sharp relief and — since the first thing anyone does when looking at a picture of a person is look at the eyes (try it, you’ll see) — this is seen as a great way to create impact.
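If you want the optics behind that intuition, the standard thin-lens approximation (not anything Apple-specific) says total depth of field scales like this, for aperture N, circle of confusion c, focal length f and a subject distance u much larger than f:

```latex
\mathrm{DoF} \approx \frac{2\,N\,c\,u^{2}}{f^{2}}
```

Distance enters as a square, so moving in to arm’s-length selfie range cuts the in-focus band to a fraction of what it is at normal portrait distances, which is exactly the look the TrueDepth portraits are simulating.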

But it also really hurts when you’re trying to shoot a selfie with more than one person in the frame. Unless you’re perfectly parallel (unlikely) someone is going to be out of focus in Portrait Mode. In single-person shots, Portrait Mode works just fine.

I have no doubt that this technique will take over selfie posting on Instagram. It’s pleasant, helps hide blemishes and emphasizes what people consider to be their best features. Until the algorithm gets better at figuring out that there are two people in frame and understanding how to keep them both sharp, though, I’d recommend keeping it in regular mode for group shots.

Animoji, Snapchat and augmented reality

When Apple announced Animoji on stage at the iPhone X event, the snarky tweets started popping off immediately. It’s a fad, it will go the same way as the 3D emoji on the Apple Watch, etc.

After having used them a bunch over the past week I can honestly tell you that I still have zero clue whether these things will disappear or whether people will use them like crazy. Maybe both? Maybe regionally? I don’t know.

What I can tell you is that they are cute and super funny. The way that the camera is able to accurately track and map your face, the physics in the models and the fun factor of being able to “wear a mask” combine to make something that’s actually a ton of fun. I used my son in one of my first tests in a message to my wife and the sight of the little pig ears and nose wobbling around and saying “dada” did produce some actual LOLs.

When you send them, they come in as standard video files that can be saved and shared, so I’m sure we’re going to see a bunch of this stuff on other social networks besides iMessage.

It’s also a glimpse at the power of allowing people to make themselves over in another image. One of the most interesting and compelling cases I’ve seen for AR has been this transformative property. Why can’t my face look like this instead of that? Why can’t I have wings? Why can’t I be the Hulk or a piglet or Rey?

The Snapchat lenses that are launching soon, which I got to play with while testing the iPhone X, are another example of this. The set they’ve shown so far is small but I’m sure it will grow, and these are not run-of-the-mill Snapchat filters. They track facial movements and head shapes so well that it absolutely feels like you’re wearing another face. Your environment, your face, the objects around you will be things for you to play with and create with, rather than fixed in place.

Whether or not Animoji takes off, the paradigm is now out there in the public consciousness. It’s the hyper-accurate, hardware-enabled payoff to the Snapchat filter boom. Now that the hardware to truly support these experiences is in place, and soon to be at scale, prepare to see a ton more of it.
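Under the hood, masks and puppets like these ride on face-tracking data that Apple exposes to any developer through ARKit on the TrueDepth camera. A minimal sketch of reading those expression coefficients; the class name and which blend shapes to read are my choices, not anything prescribed:

```swift
import ARKit

// Minimal sketch: run ARKit face tracking on the TrueDepth camera and read
// per-frame expression coefficients, which a mask or Animoji-style puppet
// maps onto its own geometry.
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth hardware (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.first(where: { $0 is ARFaceAnchor }) as? ARFaceAnchor else { return }
        // Blend shapes are 0...1 coefficients describing the current expression.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let smileLeft = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        print("jawOpen: \(jawOpen), smileLeft: \(smileLeft)")
    }
}
```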

As an aside, I also think that claiming Animoji — picking the animal that is “you” — will be a thing among users. Dibs on the fox.

The screen (and the notch)

Another major feature, another big risk. Organic light-emitting diode (OLED) screens have been prized for their much better color and ability to “turn off” completely at the pixel level, leading to deeper blacks. But, especially in smartphones, they’ve also been plagued by poor off-axis viewing, rough color balance, issues with how they’re driven by onboard graphics hardware and latent screen images that get “burned in” over time.

The Google Pixel 2 XL is getting a drubbing currently for falling prey to a few of these issues. People who obsess over screens have been waiting to see whether the iPhone X is able to hurdle these issues and come up with a better implementation of OLED.

The answer is yes, mostly.

Apple’s version of an OLED screen is manufactured by Samsung, but is not an off-the-shelf Samsung part. It’s a custom-built, diamond-pattern OLED array that was built to Apple specifications and driven by an Apple display driver. This screen is not comparable to screens found in Samsung devices on a variety of levels. You can like those screens just fine, I’m not arguing that, but this is absolutely not an implementation of a standard Samsung part.

The colors are bright and saturated, without blocking up — a big problem with red and magenta colors, typically. The True Tone screen means that you’re getting more accurate balance indoors and out as well. I could tell how my images would and should look even if I was viewing them under artificial light.

I hate to say it, but it makes the iPhone 8 Plus LCD look kind of like butt. I love it, even though it is flawed in one noticeable way.

The one area where this display falls prey to standard OLED gripes is in off-axis viewing. Apple tells me that it has done work to counter the drop in saturation and shift to blue that affects OLED screens traditionally. I can tell you that, compared to other OLED screens, you have to get further “off of center” to see a real shift in color, holding the phone 30 degrees or more off of dead on. But it is still there. For people who share their phone’s screen or use it at odd angles a lot, it will be noticeable. On some phones, OLEDs go super blue. On the iPhone X it’s more of a slight blue shift with a reduction in saturation and dynamic range. It’s not terrible, but it definitely exists.

From the front-ish though? Wooof. It’s good. At a brightness of 640 nits, the view-ability is insane in the sun — much, much better than the iPhone 8 LCD. It’s hard to capture via photograph to be honest (though we tried), but in person you’ll be impressed by how easy it is to use in direct light. This helped a ton when walking back and forth from the interior of ride buildings to exterior walkways and looking at images while walking around under the molten, ever-present eye of a merciless star.

That, coupled with the True Tone tech, makes images look better and brighter on the iPhone X than any other Apple device, iMac included. It even makes pictures taken on other devices look better.

I haven’t been using the phone long enough to determine whether it is “burn-in proof” or whatever you want to call it, but Apple insists that it has done a ton of work to mitigate the problem. And I do use Twitter, with a static menu bar, a whole heck of a lot and see no burn-in so far. That’s the best info I can give you, aside from noting that the Pixel 2 XL’s burn-in started showing up pretty quickly.

Now, about that notch. It’s caused a lot of consternation and I completely get why people hate it. They view it as a compromise — and it is. Apple needed the camera and sensor package in there and this is how it chose to implement it. They explored other options, for sure.

Craig Federighi says that they looked at a bunch of different implementations when they were prototyping.

“On the prototype front, early in the project, we had all manner of makeshift hardware prototypes with crazy True Depth bolt-ons and things like that, but we also had the interface running on iPads. And so we’d have a big iPad with iPhone X in the middle and we could run a whole user experience on it before we had the hardware with this dimension [of] display and everything else and that enabled a lot of prototyping in parallel with the early hardware builds.”

But, ultimately, he says, they felt that this was a good way to go, and that it provided people with definition about where Control Center lived on the screen.

Alan Dye, who was responsible for leading the software design teams that had to decide how to handle the sensor package, says that it felt the most honest.

“We’ve got this amazing True Depth camera system packed into this space at the upper center of the display. And we thought a lot about how to design for that. And ultimately we felt really comfortable with this notion of being really honest about it and allowing for the content to push out into those beautiful rounded corners,” says Dye.

Dye says that Apple did consider using digital bezels. “We did look at various different design iterations and considered some things that kind of acted as digital bezels if you will. But ultimately we never really felt comfortable with this notion of cropping into the content. We really love the new display, we love that it’s edge-to-edge. We love the way that it fits. It feels so perfectly designed for the overall form and so we’re encouraging people just to kind of push the content right out to the corners.”

In use, I have to say, the notch is just zero problem for me. I don’t give a rat’s ass about it. I know I’ll probably catch heat but I’m not carrying water for Apple here. I think it is absolutely a compromise but, after using Face ID and the True Depth camera for other stuff, I am willing to deal with it.

And beyond “dealing with it” I can tell you that, as one of a few people outside of Apple to have used it for more than a day, you stop noticing it very, very quickly. It’s a part of the display; the areas to the sides are or aren’t used and that’s it. Major apps like Instagram and Facebook have already been updated for the iPhone X screen and they look fine. Apple had to do some serious engineering to get the display out to those corners, too, as the OLED panel itself is flexible.

Watching video in landscape defaults to a cropped-in view, and I largely forget to zoom it out. YouTube’s new app reminds you that you can pinch to fill the screen, and it looks cool in my opinion. I think it’s neat and a bit futuristic. I’ve been waiting for asymmetrical screens that are tailor-made for their use case forever. They’re in every sci-fi movie ever and we’ve all been stuck with rectangles since the iPhone hit. I’m okay with a change.

I (about half jokingly) called the TrueDepth area a “flap” back in September. Given that when you minimize an app you can see it’s a whole “card” sliding out from underneath, and that screenshots show the area filled in, I am technically correct about that. In the interface design world of the iPhone X it is a flap that covers that area, not a notch that cuts that area out. It means nothing, but if you like completely malleable digital content to conform to a definite physicality, this paragraph was for you.

If, however, you use your iPhone for data entry or browsing or whatever in landscape, the TrueDepth camera is going to be bang in your way, especially if it’s on the left. No getting around it. If that bothers you, don’t get an iPhone X. But even if you think it’s going to bother you I’m not sure it actually will once you spend a few days with it.

Which is sort of the mantra of the iPhone X: Give it a few days and it all gets a lot clearer.

Using iPhone X

Day one of using an iPhone X is profoundly strange and cumbersome in a lot of ways. If you’ve spent years whacking a home button you’re not going to be able to break those reflexes down in a couple of hours. I had to get used to swiping up, across, down and up again instead of tapping the button, double tapping the button or double tapping and swiping.

Day two is better. Some actions already feel super natural, like tapping the screen to wake it, or swiping across the home bar to switch directly from one app to the next rather than bringing up the heavy app switcher. Quick and light. Other actions, like quickly dropping out of an app, still result in a mash of a home button that doesn’t exist.

Day five is the turn. That’s when the hardest habit of all, tapping the home button to move from any other screen to your home screen, starts to fade.

Day six is when things started getting weird with my old iPhones. I started swiping the home button up and staring stupidly at the screen waiting for it to automatically unlock.

Anecdotally, I got the phone on a Monday and until Saturday I was still stabbing the home button to go home. Today, a week later as I write this, I swiped the home button on my iPhone 7 to try to unlock it. So give it a week or so to acclimate.

Once you do, it’s sweet. The faster 120Hz sample rate of the touch layer means that every action is buttery smooth and reacts immediately to your touch. If it didn’t, the whole thing would break down. You no longer have the affordance of the time it takes your finger to leave the home button and reach up to hit the screen before you take action on something. Everything has to happen immediately because your finger never leaves the screen. And that never leaving the screen is so key.

From opening the phone to flipping back and forth between apps to closing one and opening another, it’s all action start to finish. There is no more “out to the home button and back to the screen” bouncing. It’s super-fast and fluid and makes it feel like you’re getting more done more quickly.

The switching from app to app action is not an issue at all on the fingers or hand, by the way. I know there was some super awkward spy stuff out there but you just swipe along the bar left to right or right to left to swap apps. It’s easy and relaxed. If you want to access the switcher with the “swipe up and pause” action, you can, but I don’t see any major need for it.

Grabbing Control Center with your left hand is rough work, and I’m still not sold on its placement in the top-right corner or the fact that the controls are at the top.

When you’re walking around with a kid in one arm and trying to snag a FastPass for your next ride and you need to adjust brightness or toggle screen lock or anything like that it is damn near painful to do it the regular way. And it’s only slightly more pleasant using your right hand.

Which is why I am so glad that Reachability still exists. It is incredibly useful here. It’s also tied to a much more intuitive activation process: you can pull the whole top of the screen down with a slight “tug down” on the home bar. Then Control Center is easily reachable with your right hand and at least not impossible with your left. Reachability is now tucked away under Accessibility in Settings, if you’re reading this and looking for it.

The strongest recommendation I can make for the new “no home button” paradigm is that, after just a week of using it, regular home button actions like double tapping feel much too heavy. Ten years of the home button, it turns out, was enough to allow us to move on.

Another interface tidbit: I really like the new force-press to activate the camera on the lock screen. It feels much more definitive than the fumbly “swipe from right to left” that could go awry on a notification or not trigger because you didn’t quite hit the edge.

I took no special care to preserve battery beyond what I normally would, which is to try to stay off Twitter at Disneyland (you can see that I failed fairly miserably in this regard). The temperature was in the low 90s for the most part, which isn’t crazy for Southern California, but doesn’t do batteries any favors. The reception is still fairly poor in many areas of the park and the radio goes to seek a lot inside rides, leading to greater battery drain. Despite that, and despite the fact that I shot hundreds of photos, the battery lasted all day.

I started the day by unplugging the charger at around 8:24 AM and skated into our hotel room at about 9:11 PM at 6 percent, on power save mode. Not a bad 13 hours, 2 minutes on standby and 6 hours, 4 minutes of usage in such punishing conditions. This is far less than I’d expect to get on any typical day, but not at the parks, where batteries go to get tortured. My iPhone 7 did not make it the full day. The iPhone 8 Plus made it, but I didn’t use it as heavily when I wasn’t shooting comparison photos. And its battery is larger.

Physically, the iPhone X is great. Gorgeous, shiny, it looks just fine. It feels heftier and denser, like a piece of high-quality watchmaking. The chrome-like stainless steel ring around the phone is picking up some fine abrasions, but they look normal, and I tend to run without a case and scratch the junk out of my phones, so it’s not an alarm-bell issue. The glass back still looks great, with a multi-layer backing that has a very light pearlescent sheen below the top sheet of glass. I also like that they cut down on trying to “bevel” the camera bump. It is what it is and it looks just fine with as minimal a bezel as possible. From the front, well, you get the screen and you get the notch/flap/TrueDepth camera array.

Risk/Reward

As I mentioned at the top, Apple took on a ton of risk when it went for it with the iPhone X. It’s splitting the iPhone line, adding a new tier at the top and betting that people will buy in to a radical departure in look and feel.

Overall, using the iPhone X well takes some time. It’s a big change from a physical home button to a completely swipe-based interface. But it’s fast, fluid and a lot of fun once you get used to it. Before you know it, you’ll forget you ever had to whack a home button to get things done.

There are some rough edges here and there. The notch isn’t for everyone, and the screen does have some color issues at extreme viewing angles. But overall Apple bet big on a bunch of technologies all at once on the iPhone X and it delivered almost across the board. It really is like using the future of smartphones, today.