Reviewing the iPhone 16

Image: Alex Parkin / The Verge

You’ve spent the last couple of days tweaking the icons on your iOS 18 homescreen and luxuriating in your rest days with watchOS 11. Now Apple’s hoping you’ll upgrade even further. The company is getting ready to start selling the new iPhone 16 lineup, the new Apple Watch Series 10, and the new AirPods 4 headphones.
We’ve been using all the devices since they were announced last week, and we’re here to answer the question that matters most: is any of this new stuff worth the money?

On this episode of The Vergecast, it’s wall-to-wall Apple reviews. First, we start with the iPhones, which The Verge’s Allison Johnson and Nilay Patel have been reviewing. We talk mostly about the camera, because, well, the camera’s most of what there is to talk about. Allison and Nilay tell us about their experiences with the Camera Control and the new Photographic Styles, check the “what is a photo” temperature of the new devices, and try to differentiate the four phones from each other and from their predecessors.
After that, Victoria Song comes on to talk about the Apple Watch Series 10. We weren’t able to test one of the Watch’s key features — sleep apnea detection — ahead of our recording, but we did have plenty of time to talk about the new design, the new screen, and the new features. Oh, and the new speaker, about which we are all deeply conflicted.
Next, Chris Welch tells us about his time with the AirPods 4, and whether it’s worth spending $50 to get the model with Active Noise Cancellation. These AirPods — with a sound quality upgrade, a competitive price, and big promises about noise management — might be hugely compelling if Apple got it all right. So… did Apple get it all right?
Finally, Allison returns, and we take a break from talking about Apple to answer a question from the Vergecast Hotline (866-VERGE11, call us!) about the case for the Pixel lineup in 2024. It’s a pretty good year for phones, and we have a lot to talk about.
If you want to know more about everything we discuss in this episode, you should read all our reviews of Apple’s new gadgets:

Apple iPhone 16 and 16 Plus review: all caught up
Apple iPhone 16 Pro review: small camera update, big difference
iOS 18 is a smart upgrade, even without the AI
Apple Watch Series 10 review: an Ultra sleek package
Apple AirPods 4 review: defying expectations

Read More 

Apple iPhone 16 Pro review: small camera update, big difference

I’m not saying you should buy a new phone for a single camera setting… but I’m not not saying that, either.

The iPhone 16 Pro is one of the most unfinished products Apple has ever shipped. Almost all of its highlight features will arrive in future software updates that will stretch well into next year before they’re here. That’s big stuff, like the new Apple Intelligence AI features the company says will start slowly arriving in October, and little stuff, like the complete functionality of the new Camera Control button on the side.
Even really minor things, like that new Siri animation that inspired the tagline “It’s Glowtime” for the phone’s launch event? Not here yet. You get the same old Siri bubble as ever until Apple Intelligence arrives.

The hard rule of reviews at The Verge is that we always review what’s in the box — the thing you can buy right now. We never review products based on potential or the promise of software updates to come, even if a company is putting up billboards advertising those features, and even if people are playing with those features in developer betas right now. When Apple Intelligence ships to the public, we’ll review it, and we’ll see if it makes the iPhone 16 Pro a different kind of phone.
Until then, the iPhone 16 Pro we’re reviewing today is an incremental update — it’s mostly a set of very nice but ultimately minor changes to the iPhone 15 Pro. It’s hard to make the case for an upgrade right now: there is almost no reason to upgrade to the 16 Pro or 16 Pro Max from the 15 Pro or 15 Pro Max — especially since the 15 Pros are the only older iPhones that will get Apple Intelligence when it arrives. And if you have an older Pro phone, it’s worth waiting to see if Apple Intelligence is any good before you upgrade; there’s no reason to throw money at hardware just to support unproven software.
All that said, the iPhone 16 Pro does contain one extremely notable camera update, and it’s a good one — although it’s probably not what you think. So let’s start there.

The Camera Control button sits where the mmWave 5G antenna used to be — it’s now been integrated into the other antennas.

There are two big changes to the iPhone 16 and 16 Pro cameras: the new Camera Control button, and a new set of controls for how images are processed. The button itself is a hybrid: you can press it down all the way to take a photo, or give it a light press to trigger a haptic click and bring up a setting like zoom or exposure, which you can adjust with a swipe. A double light press lets you switch between those settings. (You can adjust the pressure sensitivity of the haptic press in the accessibility settings, which is nice, although I found the default to be just fine.)
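For the curious developer, here’s a minimal sketch of how an app might put its own settings on the Camera Control, using the capture-controls API Apple added to AVFoundation in iOS 18. The session wiring is simplified, the required controls delegate is omitted, and the “Warmth” slider is a made-up example, so treat this as a sketch rather than a recipe.

```swift
import AVFoundation

// Sketch: surfacing settings on the Camera Control (iOS 18+).
// Assumes `session` is already configured with `device` as its input;
// preview, permissions, and the controls delegate are omitted, and
// controls should be added during session configuration in real code.
func attachCameraControls(to session: AVCaptureSession, for device: AVCaptureDevice) {
    guard session.supportsControls else { return } // no Camera Control on this hardware

    // System-provided controls: a light press plus a swipe adjusts zoom or
    // exposure bias, matching the built-in Camera app's overlays.
    let zoom = AVCaptureSystemZoomSlider(device: device)
    let exposure = AVCaptureSystemExposureBiasSlider(device: device)

    // A custom, app-specific control ("Warmth" is hypothetical).
    let warmth = AVCaptureSlider("Warmth", symbolName: "thermometer.medium", in: 0...1)
    warmth.setActionQueue(.main) { value in
        // Feed the value into the app's own processing pipeline here.
        print("warmth: \(value)")
    }

    for control in [zoom, exposure, warmth] where session.canAddControl(control) {
        session.addControl(control)
    }
}
```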
By default, a single click opens the camera when the phone is unlocked, and another takes a photo. It’s pretty fun to flip the phone on its side and shoot with the button like a normal camera, although the physical button is a bit stiff — a few Verge staffers found themselves moving the phone slightly when pushing all the way down to take a photo, although I thought it was fine.
I found myself accidentally opening the camera a lot at first since I’m left-handed and the button is placed where my fingers tend to rest when I hold the phone. You can set it to require a double-click, and that solved the problem for me. You can also set the button to open third-party camera apps; it works well with the new version of Halide that’s been updated to support that functionality.

We’ll swipe down the surface of things.

The reason Apple calls it “Camera Control” and not just “shutter button” is the capacitive controls on the top, which should ideally let you adjust various settings with a quick swipe. I was really hoping I’d find myself using the capacitive controls to adjust things like exposure and focal length, but switching between everything with the light presses is fiddly, and it’s far too easy to end up changing things you weren’t intending to. The whole thing would be greatly improved if a second light press dismissed the control; once they’re open, they tend to stay open, leading to inadvertent changes when your finger slides along the button.
You can just tap on the screen to dismiss the control, which I found useful. You can also just swipe on the onscreen settings to adjust them, which allowed for more precise control than swiping along the button itself.
In a real theme for the iPhone this year, the Camera Control is shipping in an unfinished state. Apple says a software update later this year will allow the button to emulate a traditional two-stage shutter button, where a half-press focuses and a full press takes the shot. (I asked, but the company isn’t giving a firm date for this.) It’s hard to know how big a deal this will be until it arrives; I’ve had a lot of complaints about iPhone cameras over the years, but setting focus has never been one of them.

It’s still easier to use the on-screen shutter button when shooting vertically.

Apple is very proud of the faster camera sensor in the iPhone 16 Pro, which it claims offers zero shutter lag, and you can indeed click away pretty fast on the camera button while shooting in HEIF or JPG mode. You can definitely outrun it if you’re shooting in RAW, though — I clocked it around 4 frames per second, which is pretty great for a phone but not anywhere close to what a modern mirrorless camera with an electronic shutter can do.
Overall, the button is very nice to have, but that’s about it right now — as it exists today, it’s not a huge improvement over shooting photos with any other iPhone.
The actual photos, on the other hand? Well, it’s complicated.

Having camera controls right under your finger is nice, but it can be easy to accidentally change things while you’re shooting.

It’s safe to say that a lot of people did not love the cameras on the iPhone 15 and 15 Pro. Apple has gotten increasingly aggressive with its approach to computational photography over the past few years, and various forums and social platforms have been filled with complaints about that for a while now. The New Yorker published a piece about iPhone photos looking unrealistic two years ago — the sense that these cameras are starting to look a little weird has been building.
The iPhone 15 and 15 Pro hit a kind of tipping point — they produced photos so aggressively processed that all kinds of people started noticing and complaining about it. I have been reviewing phones and cameras for a long time, but I will never publish a review as efficiently devastating as Alix Earle asking her 7 million followers why her iPhone 15 camera sucks. If people who’ve built multimillion-dollar content businesses with their phone cameras aren’t loving the cameras on their new phones, something’s gone wrong.
If I had to offer a radically simplified diagnosis of what’s going on with all these complaints, it’s that the iPhone simply won’t leave shadows and highlights alone. You’re not just taking a photo when you press that shutter button — Apple’s fancy Photonic Engine HDR photography pipeline captures up to nine frames with each press, intelligently exposes things like the sky and faces in different ways, applies a great deal of sharpening and noise reduction, and drops a final processed image in your camera roll. The whole process allows iPhones to preserve a great deal of detail across an image, but one side effect is that it inevitably brightens the dark parts of an image and brings down the bright parts so you can actually see that detail.
The side effect is that images seem flat because they lack contrast between light and dark. I always think about this like dynamics in music: if every part of a song is loud, then nothing actually seems loud. That’s what’s been happening with the iPhone camera over time. Everything is getting so bright that nothing is bright, and the photos are starting to look flat, even gray.
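To make that mechanism concrete, here’s a toy version of the merge step: two frames instead of nine, and one crude weighting heuristic instead of semantic masks. It’s a sketch of exposure fusion in general, not Apple’s actual Photonic Engine, but it shows why a fused shadow comes out brighter than it looked in the scene.

```swift
import Foundation

// Toy two-frame exposure fusion. Values are luminance normalized to 0...1.
// A short exposure keeps highlight detail; a long exposure keeps shadow
// detail. Fusion weights each pixel toward whichever frame exposed it well.
func fuse(short: Double, long: Double) -> Double {
    // Well-exposed (mid-gray) pixels get the most weight.
    func weight(_ y: Double) -> Double { 1.0 - abs(y - 0.5) * 2.0 }
    let ws = weight(short), wl = weight(long)
    guard ws + wl > 0 else { return (short + long) / 2 }
    return (short * ws + long * wl) / (ws + wl)
}

// A deep shadow: nearly black in the short frame, lifted in the long frame.
// The fused result lands near the long frame, so the shadow gets brighter.
print(fuse(short: 0.02, long: 0.35)) // ~0.33
```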
This time around, I have good news and bad news.

The bad news is that by default, the iPhone 16 Pro camera is even more aggressive about evening out shadows and highlights than the iPhone 15 Pro. It’s subtle, but it’s there — you can see it with basic photos of plants, with pictures of people, with street scenes — it’s all just a little bit brighter, a little bit flatter.
Shadows in iPhone 16 Pro photos are dramatically boosted compared to the regular iPhone 16, although the 16 Pro offers much nicer depth of field, does less sharpening, and performs better in low light. (I actually found it hard to make the 16 Pro go into night mode, while the regular 16 drops to night mode pretty easily.) The larger sensor with bigger pixels on the 16 Pro can just capture more light than the sensor on the 16, and Apple’s default settings use all that extra light to wage absolute war on shadows. And while the 48-megapixel ultrawide camera on the iPhone 16 Pro produces 12-megapixel photos that look awfully similar to the iPhone 15 Pro, they are substantially better than the ultrawide photos from the iPhone 16.

We’re going to do a much deeper camera comparison in the weeks to come, so I won’t overdo the comparison to the Galaxy S24 Ultra and the Pixel 9 Pro XL, since that requires intense pixel peeping. Suffice it to say that the Pixel has the best zoom, while Samsung’s color handling remains aggressively chaotic. But time and again, all three cameras produced photos that were essentially small variations on the same ultraprocessed look that these companies have seemed intent on chasing for a while now.

The Pixel 9 Pro XL at 5x zoom is notably clearer than the iPhone 16 Pro Max 5x zoom.

The iPhone 16 Pro has a nice 5x telephoto lens, but you can see some artifacting in this medium-light shot.

But here’s the good news.
The iPhone 16 and 16 Pro allow you to exclude yourself from this narrative entirely with a huge upgrade to the Photographic Styles feature that allows you to adjust how the camera processes colors, skin tones, and shadows, even after you’ve shot a photo.

The iPhone 16 and 16 Pro let you pick “undertones” to help dial in your preferred skin tone.

You can pick between five “undertone” settings that are meant to adjust skin tones, and nine “mood” settings that feel a lot like high-quality Instagram filters. You can shoot with a live preview of any of the styles, and then you can tweak the settings or even switch styles entirely later on.
And all of these styles offer three new fine controls: there’s “color,” which is basically saturation, and “palette,” which is the range of colors being applied. Most importantly, there’s a new control called “tone,” which lets you add shadows back to your photos. It turns out Apple is using “tone” in this context to mean “tone mapping,” and in my tests, the tone control allowed me to reliably bring the iPhone’s image processing back to reality by turning it down.

The tone control is semantically aware — it will adjust things like faces and the sky differently, so it’s still doing some intense computational photography, but the goal is for you to be able to take photos that look a lot more like what a traditional camera would produce if you bring the slider all the way down. (You can also go all the way up for the most intense smartphone HDR photos you’ve ever seen, if that’s the sort of thing that makes you happy.)
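If “tone mapping” is too abstract, here’s the idea reduced to a cartoon: one global curve that lifts shadows, plus a slider that blends between that flattened rendering and the untouched image. Apple’s real control is semantically aware and works per-region, and the -1 to 1 range here is invented, so this is an illustration of the concept rather than the actual pipeline.

```swift
import Foundation

// Toy global tone mapping. `y` is scene luminance normalized to 0...1.

// An aggressive HDR-style curve: shadows come up, highlights compress,
// so detail survives everywhere but global contrast shrinks.
func flattened(_ y: Double) -> Double {
    pow(y, 1.0 / 2.2) // a deep shadow at 0.05 lands at ~0.26
}

// A hypothetical "tone" slider in -1...1: positive pushes the flat HDR
// look, negative blends back toward the untouched image.
func toneMapped(_ y: Double, tone: Double) -> Double {
    let t = (tone + 1) / 2          // remap -1...1 to a 0...1 blend weight
    return (1 - t) * y + t * flattened(y)
}

print(toneMapped(0.05, tone: 1.0))  // ~0.26: bright shadows, flat image
print(toneMapped(0.05, tone: -1.0)) // 0.05: shadows left alone
```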

Turning down the tone control felt like a sigh of relief — I prefer photos with less aggressive tone mapping to both the default iPhone 16 Pro look and the photos produced by the iPhone 15 Pro. It’s like a haze is being lifted; images are a little punchier, a little more present. You might feel differently, but I like shadows and highlights, and the addition of the tone control lets me have them on a phone camera without jumping through the hoops of shooting in RAW and processing the photos myself.
For me, the tone control offers such a meaningful improvement to iPhone photos that it’s possible to argue that this one single camera adjustment makes upgrading to an iPhone 16 or 16 Pro worth it. I am a huge photo nerd who cares a lot about these things, and even I don’t think that that argument is 100 percent convincing, but it is very possible to make that argument, which is wild.
Could you also just buy a camera app like Halide and use its very popular new Process Zero feature on any other iPhone to take less processed photos? You could. Could you just spend this money on a nice point-and-shoot, a category that’s staging a mini comeback? You definitely could, and you might change your relationship to photography in a deeply positive way by doing so. But if you take a lot of iPhone photos, and you’ve started to notice that they look a little weird, well, it’s possible to at least make the argument.

The 48MP ultrawide camera produces 12MP photos that look essentially the same as the iPhone 15 Pro’s ultrawide.

Allowing styles to be edited and changed after a photo is taken required Apple to reshuffle the Photonic Engine computational photography pipeline — it’s the same basic process on the iPhone 16 Pro as on the 15 Pro, but tone mapping is now one of the final steps. The idea here is for the edits to be “perceptually lossless”: when you take a photo in a style, that’s how the photo is saved, but the iPhone adds a little chunk of data to the image file that allows it to undo that style and revert the image to standard. This means you can tweak styles and even change them entirely whenever you want, and I had great fun making several different versions of the same shot.
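Here’s a loose sketch of that “bake it in, but keep enough data to undo it” structure. Every type below is invented for illustration, since Apple hasn’t documented the actual file format, but it captures why re-editing doesn’t require keeping the original capture around.

```swift
import Foundation

// Conceptual sketch of "perceptually lossless" styles; invented types.
struct Style: Codable {
    var tone: Double    // hypothetical -1...1 range
    var color: Double
    var palette: Double
}

struct StyledPhoto {
    var rendered: Data   // the photo as saved: the style is baked into the pixels
    var revertData: Data // small extra chunk that can undo the baked-in style
    var style: Style     // the style currently applied

    // Changing styles later means reverting to the standard rendering and
    // re-rendering with the new style; no original capture is kept around.
    mutating func apply(_ newStyle: Style) {
        let standard = revert(rendered, using: revertData)
        rendered = render(standard, with: newStyle)
        style = newStyle
    }
}

// Placeholders standing in for the actual image processing.
func revert(_ image: Data, using payload: Data) -> Data { image }
func render(_ image: Data, with style: Style) -> Data { image }
```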
That bit of extra data results in files that are about 25 percent bigger than before — around 3MB instead of 2.5MB, generally — and the vagaries of compression mean that styles only work if you shoot in HEIF, a format that continues to bedevil basically everything outside of Apple’s ecosystem. If you set the camera to shoot standard JPGs, you don’t get styles or the new tone control. (Apple’s also added the ability to shoot in the new JPEG-XL format in both lossy and lossless modes, but that’s a RAW format, and styles won’t work.) My suggestion to Apple would be to allow the use of the tone control as a permanent exposure-like adjustment when shooting JPGs, but I’m just a guy who likes shadows.

The D-pad control for color and tone in photographic styles is fun but hard to use precisely.

Styles overall aren’t really ready for professional workflows — the only way to adjust them after shooting is with a fiddly two-axis D-pad control that also controls color, which makes the whole thing feel woefully imprecise. You also can’t apply a style to a bunch of photos at once, and trying to keep track of which photos have which styles applied requires staring into the absolute abyss of iOS file management. The Photos app in macOS Sequoia will be able to adjust styles, but Apple won’t say if third-party apps will be able to support style editing in the future.
The entire vibe of all these new controls is very much “you figure it out.” The more you play with styles, especially the undertone styles meant for skin tones, the more it seems like Apple has simply given up on having a point of view for what this camera should look like. Google makes a lot of noise about its Real Tone project, which is supposed to allow the Pixel to capture accurate skin tones for all kinds of people, but Apple’s solution is to simply let people choose their own skin tone using the “undertones” styles (which works, although it often changed more than just skin tone in our test photos). Undertones also apply to everyone in an image, so if you take a photo of people with a range of skin tones, they’re all going to get the same effect. I get the idea behind undertones, but the execution feels like it needs a little more refinement.
I don’t have a lot to say about the “mood” styles, which are very fun and expressive. Verge supervising producer Vjeran Pavic basically fell in love with these while he was testing the video features — they reminded him of the very popular Fujifilm recipes for emulating different kinds of film. You should watch our video review for more on both. The one thing I will add is that the new spatial audio recording in video is surprisingly complicated, and doesn’t really result in spatial audio the way you’d expect when you play a video back.
Apple’s Alex Kirschner told me that spatial audio capture is primarily there to enable the (very cool!) new audio mix feature that allows you to remove background noise from videos of people talking; you’ll get headphone-based spatial audio when listening through AirPods, but Apple’s bizarrely chosen to have the Apple TV play these videos in 5.1 or 7.1 surround instead of something like Atmos, so you’ll lose any height effects. (Worse: if you AirPlay a video captured with spatial audio, it will only play back in stereo.)
Is it bananas that a smartphone can record 4K video in surround sound? It absolutely is. It is just also getting increasingly hard to understand what Apple means by “spatial audio,” and how it can be edited and played back across various audio devices.

No AI-enhanced existential crisis moon photos here.

It’s also notable what isn’t present on the iPhone this year: there’s no generative AI wackiness at all. There’s no video boost or face-swapping, no adding yourself to group photos, no drawing to add stuff with AI like on the Pixel or Galaxy phones — really, none of it. I asked Apple’s VP of camera software engineering Jon McCormack about Google’s view that the Pixel camera now captures “memories” instead of photos, and he told me that Apple has a strong point of view about what a photograph is — that it’s something that actually happened. It was a long and thoughtful answer, so I’m just going to print the whole thing:

Here’s our view of what a photograph is. The way we like to think of it is that it’s a personal celebration of something that really, actually happened.
Whether that’s a simple thing like a fancy cup of coffee that’s got some cool design on it, all the way through to my kid’s first steps, or my parents’ last breath, it’s something that really happened. It’s something that is a marker in my life, and it’s something that deserves to be celebrated.
And that is why when we think about evolving in the camera, we also rooted it very heavily in tradition. Photography is not a new thing. It’s been around for 198 years. People seem to like it. There’s a lot to learn from that. There’s a lot to rely on from that.
Think about stylization: the first example of stylization that we can find is Roger Fenton in 1854 — that’s 170 years ago. It’s a durable, long-term, lasting thing. We stand proudly on the shoulders of photographic history.

That’s a sharp and clear answer, but I’m curious how Apple contends with the relentless addition of AI editing to the iPhone’s competitors. The company is already taking small steps in that direction: a feature called “Clean Up” will arrive with Apple Intelligence, which will allow you to remove objects from photos like Google’s Magic Eraser. McCormack told me that feature will somehow mark the resulting images as having been generatively edited, although he didn’t say how.
I did ask if Apple would adopt an image verification standard like C2PA, which companies like Adobe, Microsoft, OpenAI, and now Amazon and Google have decided to support; McCormack told me Apple was waiting to see how things evolved before it made decisions. This is fair, as that standard is a bit of a mess right now, and it’s not clear it’ll even be that effective on social platforms. But being able to trust the images we see is going to get more and more complex and important, and the iPhone is the most popular camera in the world, so it’s clear that the industry will ultimately bend around Apple’s approach. We’ll see.

The 16 Pro Max is a big phone.

Everything else about the iPhone 16 Pro is incredibly incremental.
The displays now go down to 1 nit of brightness, which is very nice for not waking your partner while you doomscroll in bed. Those displays are also bigger now — the Pro is 6.3 inches, while the Pro Max is 6.9 inches, which is the largest ever on an iPhone. The regular Pro doesn’t feel that much bigger, since the bezels are smaller and the phones didn’t get any thicker. But the 16 Pro Max feels meaningfully larger than the 15 Pro Max. I have big hands and I’ve always picked the big phone, and the 16 Pro Max is definitely right on the line of too big to handle like a phone instead of a tablet.
Both phones have an A18 Pro chip inside, which Apple claims is faster by various impressive-sounding percentages than the A17 Pro. As with all iPhones, those performance numbers are mostly about headroom and longevity at this point — my iPhone 15 Pro doesn’t feel slow, and the 16 Pro doesn’t feel faster. I am very curious to see if the addition of Apple Intelligence changes this perception, but we’ll just have to wait and see.
I feel the same way about battery life. While the iPhone performance advantage means these phones will stay relevant for a long time, my experience with battery degradation is just the opposite. After about a year, my iPhone 15 Pro Max battery capacity has dropped to 93 percent, and it now struggles to make it through a day without enabling Low Power Mode.
Apple says the iPhone 16 Pro gets significantly better battery life than the iPhone 15 Pro, although the company won’t quote anything other than video playback times. The Pro Max is supposed to have the best battery life ever on an iPhone, and the batteries certainly held up for full days during my testing, which was very heavy on camera usage and screen-on time. But it’s unclear how Apple Intelligence running various places across iOS will affect battery life, and it’s similarly hard to know if the battery will stay strong after months and years of use. iPhone 16 Pro battery replacements from Apple cost more than before, so this is something to pay attention to over time.
Software-wise, my review units are running iOS 18.0, which allows you to radically customize the homescreen, lockscreen, and Control Center, and which supports RCS for better messaging with Android users.

The updated Qi2-enabled MagSafe puck can charge the iPhone 16 Pro at up to 25W.

You can more or less theme the homescreen any way you want, down to adjusting icon colors globally, and the lockscreen now allows you to change the quick access buttons to third-party apps. The preview build of Halide I was testing supported this, so I switched it in for the system camera app, which was nice. Once more camera apps support this, we’ll end up with a lot of ways to open camera apps from the lockscreen — you’ll be able to set the Action Button, the Camera Control button, and the lockscreen button all to different camera apps if you want, and still have the ability to open the system camera by swiping to the right. Pretty neat.
The revamped Control Center is a bit of an adjustment. If you’re like me and you’ve used it by pure muscle memory for years, even the switch from squarish icons to circles is a little disorienting. The whole thing is now organized into vertical sheets: favorites, media controls, home controls, and the various radio and network controls. You can move controls and groups of controls around at will, resize them as you like, and generally create a little freeform command center of your go-to settings. This will all come down to how much time you want to spend creating the perfect arrangement of controls — I’m a huge nerd, and I can’t wait to spend an hour or so getting it just right.
Apple also updated the MagSafe charging system, which can now charge at up to 25 watts using the new Qi2-compatible MagSafe puck and a 30W charger. I charged my review unit for quite a while at full speed, and it didn’t even get warm.
Price-wise, things are the same as last year: the iPhone 16 Pro starts at $999 for the model with 128GB of storage, while the larger Pro Max starts at $1,199 with 256GB of storage. You can get them in desert, natural, white, and black, which all look fine — I’m a little jealous that the regular iPhones get fun colors this year, but I just stick these things in cases anyway, so it doesn’t really matter.

Can Apple pull its AI software… into focus?

So that is the iPhone 16 Pro… so far. As it exists today, it’s a remarkably iterative update to the iPhone 15 Pro — it’s hard to find reasons to upgrade from last year’s model. And I’m not at all convinced that it’s worth upgrading to the 16 Pro from older Pro models just yet, either — the Camera Control and Action Button are nice, but not game changing, and unless you’re excited about dialing in the new Photographic Styles and the new tone control, you might find the even-brighter-and-flatter photos to actually be a step backward in photo processing. If you can’t tell, I am personally thrilled by the tone control, so this is an easy choice for me, but it feels like it’s worth waiting a tick for everyone else.
A lot of people have asked us if the extra money for the Pro phone is worth it this year, since the spec sheet of the iPhone 16 appears to be very close to the Pro’s. We’ve got a full review of the regular iPhone 16 here, but my short answer is that the Pro camera is meaningfully better, and that Apple shipping a 60Hz screen in 2024 is just silly, so I’m a Pro phone person all the way.
It really does feel like Apple intended to ship these things with Apple Intelligence, but it’s simply not here yet, and the complete feature set Apple’s announced with things like image generation and ChatGPT integration won’t be here until next year. And if you’re in the EU or China, you might be waiting for quite a while, as Apple navigates various regulatory hurdles in those regions to even launch this stuff at all.
That’s not to say the iPhone 16 Pro is a bad phone — it’s a great phone, with some fascinating ideas about smartphone photography embedded in it. But it’s also clearly unfinished, and I think it’s worth waiting to see if Apple Intelligence can complete some of these thoughts before spending the money on an upgrade.

Read More 

Apple iPhone 16 and 16 Plus review: all caught up

It’s a good year for Apple’s basic iPhone, even if its AI is MIA.

Over the past few years, Apple’s standard iPhone looked a little neglected. The Pro models got new chipsets, camera features, and a customizable Action Button, while the standard models made do with the leftovers.

But this year, things are different: the iPhone 16 and 16 Plus played catch-up, and the gap between these phones and the Pro models isn’t as wide as it once was.

That matters a lot, especially on the more basic models. If you’ve been holding on to an older iPhone for the past couple of generations wondering whether this is the year to upgrade, then I think there’s an easy answer this time around: go for it. It’s a good year for the basic iPhone, and it’s a good year to upgrade.

But this iPhone is still very much a work in progress. For starters, Apple Intelligence is supposedly a major component of this phone’s software, and it’s just not available at launch. It’s out in beta now, and some features will begin to ship next month with iOS 18.1 — still marked as “beta.” But it’s not on the phone I’m reviewing, which is running 18.0, and therefore, it’s not part of this review.

I also have mixed feelings about the Camera Control, a new button on both the Pro and regular iPhone 16 models that allows you to launch the camera, take photos, and adjust some settings. I appreciate that it isn’t a Pro-exclusive feature, and boy, do I love a button. But in practice, I find it hard to use and have largely been ignoring it.

But the thing is, the foundational stuff is good. This year’s chipset is in the same family as the one on the Pro models, which means they’ll likely be on roughly the same software update schedule. The camera itself is as capable as ever, and the hardware looks great — Apple’s using some saturated colors again, thank God. And it still starts at $799, so anything added this year just feels like a nice-to-have. Even if Apple Intelligence never ships, you’d still have a good iPhone in your hands.

The 6.7-inch screen on the iPhone 16 Plus (left) is one for the big screen fans.

There’s one particularly conspicuous hardware feature missing from this, a high-end phone in the year 2024: a high refresh rate screen. Only the Pro phones get smoother ProMotion displays that go up to 120Hz, while the 16 and 16 Plus are stuck in 60Hz. By now, it’s a standard feature on modern smartphones from the midrange on up, and the iPhone looks awfully dated without it.

On principle, it’s irritating that Apple doesn’t offer this on the basic models, but in reality, how much that bothers you is entirely personal. I use phones with 120Hz screens for most of the rest of the year, and it’s always jarring for the first few minutes when I switch back to a 60Hz screen. But I get used to it pretty quickly, and I only notice the more stuttered scrolling when I think about it. Some people will find this an inexcusable omission, and they’re probably right. Some people will be perfectly happy with a 60Hz screen, and they’re also right. Everyone else exists somewhere between the two.

The 16 and 16 Plus also miss out on the always-on display offered on the Pro models. I like being able to glance at my notifications and my wallpaper when the iPhone is idle, so I miss having the always-on display on the 16. Still, I know a lot of people who don’t like it, so on balance, it’s no great loss here.

The Camera Control is kind of a hybrid mechanical / capacitive button.

Now, if anyone is on the record as a full-fledged button supporter, it’s me. I can’t get enough of ‘em. So, imagine my delight at having two new buttons on this phone — the programmable Action Button from the 15 Pro and the new Camera Control. I use the Action Button to open the app I use to sign my kid out of daycare. Usually, I have to fumble around looking for the app while there’s another parent in a rush waiting behind me, so it soothes my anxious brain every time I press that button. You can program it to do all kinds of things if you’re willing to learn the ways of Shortcuts. But for the rest of us, it’s pretty straightforward to map it to open a specific app and leave it at that.
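
If you’re curious what “the ways of Shortcuts” looks like from the app side: developers expose Action Button-assignable actions as App Intents. A minimal sketch, with the daycare app and every name in it invented for illustration:

```swift
import AppIntents

// Hypothetical intent for a daycare app's sign-out flow.
struct SignOutChildIntent: AppIntent {
    static let title: LocalizedStringResource = "Sign Out Child"
    static let openAppWhenRun: Bool = true  // jump straight into the app's sign-out screen

    func perform() async throws -> some IntentResult {
        .result()
    }
}

// Registering the intent as an App Shortcut surfaces it in the Shortcuts app,
// where the Action Button can then be pointed at it.
struct DaycareShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SignOutChildIntent(),
            phrases: ["Sign out in \(.applicationName)"],
            shortTitle: "Sign Out",
            systemImageName: "figure.and.child.holdinghands"
        )
    }
}
```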

I wish I had better things to say about the Camera Control. Believe me, I wanted to like it. I’ve used it a bunch, and I plan to keep trying it, just in case I’m missing something. But so far, I’m not impressed. It’s an actual button, and fully pressing it will launch the camera app. Once you’re there, another full press will take a photo. It’s also a capacitive control with haptic feedback — lightly pressing it will bring up exposure settings that you can adjust by moving your finger along the control.

Surprisingly, that’s the action I’m most comfortable with. It’s pressing the actual button and firing the shutter I’m struggling with. The mechanism feels too stiff to me, and no matter how hard I try to support the phone, I end up shaking the whole device every time I take a picture. And if I linger on that light press too long, I end up changing the exposure compensation or some other setting inadvertently. I have to take my focus away from the moment and think about pressing a damn button, and at that point, what are we even doing here?

I do like using it to launch the camera, but once I’ve done that, I’ve mostly gone back to using the onscreen shutter. I’m also using the capacitive control for exposure compensation, but I can’t help feeling that I’m underutilizing one of this phone’s fancy new features. If this button had one job instead of two, it would be more intuitive. Still, I now have a dedicated button to launch the camera and a capacitive exposure comp dial for the camera, and I can’t complain about that at all. I just wish the dual functions of this button worked better together.
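
Third-party camera apps can hang their own functions off the button, too. With iOS 18’s AVFoundation additions, an app attaches controls to its capture session and the light-press overlay drives them. A rough sketch, assuming session and device are an already-configured AVCaptureSession and AVCaptureDevice; the exposure-compensation slider here is my illustration, not Apple’s implementation:

```swift
import AVFoundation

// Wire app functions to the Camera Control's light-press overlay.
func attachCameraControls(to session: AVCaptureSession,
                          device: AVCaptureDevice,
                          queue: DispatchQueue,
                          delegate: AVCaptureSessionControlsDelegate) {
    guard session.supportsControls else { return }  // only on hardware with a Camera Control
    session.beginConfiguration()

    // System-provided zoom control, adjusted by sliding a finger along the button.
    let zoom = AVCaptureSystemZoomSlider(device: device)
    if session.canAddControl(zoom) { session.addControl(zoom) }

    // A custom slider: a hypothetical exposure-compensation dial from -2 to +2 EV.
    let ev = AVCaptureSlider("Exposure", symbolName: "plusminus.circle", in: -2.0...2.0)
    ev.setActionQueue(queue) { value in
        try? device.lockForConfiguration()
        device.setExposureTargetBias(value, completionHandler: nil)
        device.unlockForConfiguration()
    }
    if session.canAddControl(ev) { session.addControl(ev) }

    // The delegate hears when the overlay becomes active, so the app can pause its own UI.
    session.setControlsDelegate(delegate, queue: queue)
    session.commitConfiguration()
}
```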

I have to take my focus away from the moment and think about pressing a damn button

And while we’re in the camera app, let’s talk about Photographic Styles. Remember those? They’re like filters for the iPhone camera, but they’re applied during capture. On the iPhone 16 series, you’ll have a whole new range of custom settings to help you dial in the photographic style you like. You can go into the weeds of how this works in our iPhone 16 Pro review, but at a high level, they let you adjust color cast — in service of warmer or cooler skin tones — as well as brightness and contrast. If you’re one of the many people who think that iPhone photos look overprocessed lately, then this is the feature for you.

I’ve landed on a style that I like, but it wasn’t easy getting there. To set a photographic style as your new default, you need to go into the system settings menu and go through a setup process where you audition four of your photos in the new style. If you just pick a new style in the camera app itself, it’ll reset to standard when you leave. This is a different behavior than on previous iPhones, and it confused the hell out of me at first.

And here’s the bad news: you need to shoot in HEIF to use the new styles. HEIF is a cursed file format that no other company loves as much as Apple. Most of the time, your HEIF images will be converted to JPEG when sending them outside of the Apple ecosystem, but inevitably, one day you’ll have the misfortune of trying to open a .heic file on a non-Apple device and be met with nothing but sadness. I usually avoid shooting in HEIF, but the new photographic styles are so good that I’m willing to put up with the stray compatibility issue.
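
If you do get stuck with a .heic file on a device that can read it, converting it yourself is quick. A minimal Core Image sketch (Apple platforms only; the file names are placeholders):

```swift
import Foundation
import CoreImage

// Convert a .heic file to JPEG data, keeping the source color space when possible.
func jpegData(fromHEICAt url: URL) -> Data? {
    guard let image = CIImage(contentsOf: url) else { return nil }
    let context = CIContext()
    let colorSpace = image.colorSpace ?? CGColorSpace(name: CGColorSpace.sRGB)!
    return context.jpegRepresentation(of: image, colorSpace: colorSpace)
}

// Usage: write the converted bytes next to the original.
// let data = jpegData(fromHEICAt: URL(fileURLWithPath: "IMG_0001.heic"))
// try data?.write(to: URL(fileURLWithPath: "IMG_0001.jpg"))
```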

This level of flexibility makes it kind of hard to evaluate the camera itself. I’ve been shooting with a contrastier photographic style, which dials up shadows in a way I like. Along with the brighter highlights preserved by HDR tone mapping, you get an image with actual highlights and shadows — not a bunch of gray mush scrunched into a standard dynamic range space. The camera will still go a little intense with blue skies in certain circumstances, but you could play around with the photographic style settings to dial that down. I like my version of the camera app, which may be different from your version.

Mostly, I’m grateful that the iPhone continues to deliver great photos in portrait mode, and I’m always impressed by the video quality in cinematic mode, too. The 2x crop zoom is fine in decent lighting, and it’s a handy focal length for portrait shots. Having macro focus on the ultrawide lens is nice for the occasional close-up shot, too.

But the iPhone 16 uses a smaller main image sensor than its Pro peers, and its low-light image quality isn’t quite as good. Image quality is fine if your subjects aren’t moving, but don’t expect to get away with a lot if you’re trying to shoot portraits or moving subjects in dim light. And out of curiosity, I compared the 5x digital zoom on the iPhone 16 with the 5x telephoto lens on the 15 Pro. Predictably, the 15 Pro blows it out of the water. There’s still no substitute for good ol’ optical zoom.

Photo: Allison Johnson / The Verge
The vertically stacked cameras enable spatial video. And did I mention that these colors rule?

I have nothing shocking to report about the iPhone 16’s overall performance. The A18 chipset (and some additional RAM — thanks AI!) handles daily tasks easily. I can fire off portrait mode photos about once a second; each one has a little bit of built-in buffering time, but I never had to wait longer than that for the buffer to clear. Even if you never use the AI features Apple is promising for this phone, getting the newer chipset is a win for the regular iPhones this year and should keep this phone running smoothly for the next four or five years.

Even on the smaller model, the battery keeps up all day. On a day of heavier use that included streaming KEXP with Strava using GPS in the background, I still had around 30 percent by bedtime. If you opt for the 16 Plus, with its larger battery, you can stretch a single charge well into a second day. The real question will be how it keeps up a year or two down the line; Apple’s recent track record here isn’t great.

Photo: Allison Johnson / The Verge
Neglected no more.

It’s a good year for the basic iPhones, and that hasn’t been the case over the past few generations. To be sure, there’s nothing groundbreaking here, and certainly nothing you should trade in your iPhone 15 for. But if you’ve been on the fence for a while about upgrading from an 11 or 12, then I think this is the year to go for it.

You’ll get a couple of new buttons to play with, and who knows, maybe you’ll get along better with the Camera Control than I did. And if Apple Intelligence arrives and proves to be the time-saving, stress-easing set of features Apple insists it will be, then this phone will be ready for it. But even if those features never arrive, you’re still getting some upgrades that matter in the long run. It’s a catch-up year for the regular iPhone, and that’s a good year to upgrade indeed.

Photography by Allison Johnson / The Verge

Read More 

Backup by BioLite is a beefy emergency battery for your big appliances

Backup by BioLite doesn’t require an electrician to install it. | Image: BioLite

Instead of wiring into a home’s breaker box, the Backup by BioLite home backup power solution relies on thin battery panels that can fit behind appliances to keep them running for days at a time. It’s built around two 1.5kWh lithium iron phosphate (LiFePO4) batteries, the Backup Core and Backup Extend. Like any uninterruptible power supply, the Backup Core keeps itself perpetually charged from a wall outlet while power is available, then switches to keep whatever’s plugged into it running during a power outage.

If you need more backup power, up to five Backup Extend units can be connected to a Backup Core unit to expand the total capacity to 9kWh, and you don’t need a contractor or electrician to install any of it.

BioLite’s system was designed to be a cheaper and easier alternative to whole-home backup power solutions that rely on a central battery or gas-powered generator to keep an entire house running during a power outage. A natural gas generator alone can cost well over $5,000, while a Tesla Powerwall installation can set you back over $10,000.

Image: BioLite
The Backup by BioLite batteries are 29 inches tall and weigh upwards of 40lbs each.

The Backup Core battery panel will sell for $1,999, but BioLite will also offer a $2,999 Backup Complete solution, pairing a Core with a single Extend battery, which is eligible for a 30 percent home energy tax credit. The company says the 3kWh Backup Complete has enough capacity to keep an 18 cubic foot fridge running for up to 60 hours (an average draw of about 50 watts) or a larger 26 cubic foot fridge powered for up to 30 hours while also allowing for the occasional use of lights or other smaller appliances.

BioLite is bringing its backup power solution to consumers through a Kickstarter crowdfunding campaign that launches today, with discounts for early backers or those who opt to put down a deposit and pay in full through an installment plan later. Although the company has been around since 2006, making products like a power-generating camp stove, this will be its most expensive offering to date, and the usual caveats and risks with crowdfunded products apply here.

Image: BioLite
The Backup by BioLite batteries don’t need to be installed on a wall. They can be placed on top of a refrigerator, or hidden behind other furniture.

The Backup by BioLite batteries measure 29 inches tall and weigh between 35 and 40 lbs. They’re each just 2.8 inches thick, however, allowing them to be mounted out of sight behind appliances or furniture using mounting hardware that the company claims takes about 30 minutes to install. But a permanent installation isn’t necessary, as BioLite says the batteries will work just as well placed atop a fridge or slid under a bed.

Read More 

Mickey 17’s first trailer is light in tone, heavy on dead Robert Pattinsons

Image: Warner Bros. Pictures

It may have been delayed until next year, but at least we now have our first look at Mickey 17. The debut trailer for the sci-fi film — which is helmed by Parasite director Bong Joon-ho and stars Robert Pattinson as Mickey Barnes — premiered Tuesday evening. According to the trailer’s description, Barnes “has found himself in the extraordinary circumstance of working for an employer who demands the ultimate commitment to the job… to die, for a living.”

The movie is based on the novel Mickey7 by Edward Ashton, about “a disposable employee on a human expedition sent to colonize the ice world Niflheim.” The book follows the seventh incarnation of the titular Mickey, but it appears that, in the movie, he dies quite a few more times. In addition to Pattinson, the film stars Steven Yeun, Naomi Ackie, Toni Collette, Mark Ruffalo, and Thomas Turgoose.

Mickey 17 will be Bong’s first film since 2019’s Parasite and is slated to hit theaters on January 31st, 2025.

Read More 

You’ll be able to use an iPhone to wirelessly restore an iPhone 16

Photo: Allison Johnson / The Verge

iOS 18 has a new feature that lets you wirelessly restore an iPhone 16 using another iPhone or an iPad, 9to5Mac reports.

9to5Mac says it was able to simulate the new recovery method. “Essentially, when the iPhone 16 enters Recovery Mode for some reason, users can simply place it next to another iPhone or iPad to start the firmware recovery,” according to 9to5Mac. “The other device will download a new iOS firmware and transfer it to the bricked device.”

Apparently, any device on iOS 18 can do the restoration, but only the iPhone 16 lineup of phones can actually be restored using this method. You can already use an iPhone to wirelessly restore an Apple Watch or an Apple TV, so it’s nice to see a similar feature come to the iPhone, too.

iOS 18 is out now, and the iPhone 16 lineup is set to launch on Friday.

Read More 

California governor signs rules limiting AI actor clones

Photo by Chip Somodevilla/Getty Images

California governor Gavin Newsom has signed two bills that will protect performers from having their likeness simulated by AI digital replicas.

The two SAG-AFTRA-supported bills, AB 2602 and AB 1836, were passed by the California legislature in August and are part of a slate of state-level AI regulations. AB 2602 bars contract provisions that would let companies use a digital version of a performer in a project instead of the real human actor, unless the performer knows exactly how their digital stand-in will be used and has a lawyer or union representative involved.

AB 1836 says that if a performer has died, entertainment companies must get permission from their family or estate before producing or distributing a “digital replica” of them. The law specifies that these replicas don’t fall under an exemption that lets works of art represent people’s likeness without permission, closing what The Hollywood Reporter characterizes as a potential loophole for AI companies.

“We’re making sure that no one turns over their name, image, and likeness to unscrupulous people without representation,” Newsom said in a video posted to his Instagram on Tuesday, where he’s seen alongside SAG-AFTRA president Fran Drescher.

The two bills’ signing may bode well for the fate of arguably the biggest legal disruption to the AI industry: California’s SB 1047, which currently sits on Newsom’s desk awaiting his decision. SAG-AFTRA has also publicly supported SB 1047. But the bill has drawn opposition from much of the AI industry — which has until the end of September to lobby for its veto.

Read More 

The Pixel Buds Pro no longer let you ‘touch and hold’ to hear notifications

Photo by Chris Welch / The Verge

Google is taking away a Pixel Buds Pro feature that lets users touch and hold an earbud to hear unread notifications. In an update on Tuesday, Google confirmed that Assistant “will no longer read unread notifications when using the press Assistant feature.”

Additionally, Google says Assistant will stop automatically alerting you to unread notifications on the Pixel Buds, and it will no longer let you reply to them. Instead, Google says you can hear notifications by activating Assistant and saying “Read my notifications.”

The “touch and hold” gesture on the Pixel Buds Pro currently lets wearers switch between active noise cancelation and Transparency mode by default. Before this change, users could configure this gesture to have Assistant read notifications aloud.

Last week, Pixel Buds Pro users noticed that they weren’t able to use the “touch and hold” gesture to hear their notifications, but Google didn’t confirm it was removing the feature until now.

The change, which Google says it’s making based on user feedback, comes about a week before the release of the Pixel Buds Pro 2. It seems like more tweaks are to come, as Google says it will “evaluate further changes to notifications.”

Read More 

The lone US manufacturer of boutique keycaps may have just been saved

Signature Plastics’ in-office gallery of various custom mechanical keyboards. | Image: Signature Plastics

Signature Plastics, the Custer, Washington-based company known for specialized mechanical keyboard keycaps, has announced it’s being acquired by a Portland-based investment group that intends to keep the company operating and retain all of its employees. This comes after majority owner Bob Guenser announced his plans to retire and find new ownership back in January.

Signature Plastics is well-regarded in the mechanical keyboard community for its high-quality PBT keycaps in vintage-looking SA and DSA profiles, such as DSA creamsicle and SA nuclear data, often selling runs of keycap sets in limited group buys with long lead times that can make them quite exclusive. It’s the only high-end, boutique keycap maker based in the US.

The investment group coming to Signature Plastics’ rescue is led by Will Clark, a self-proclaimed keyboard enthusiast who previously worked in e-commerce and cofounded software-as-a-service startups. While Clark has already joined SP’s leadership team, the acquisition will not be finalized until 2025.

Clark sat down for a brief video interview with Tae Ha Kim on his YouTube channel Taeha Types to discuss the acquisition, explain what brought him and his partners to Signature Plastics, and offer fans some reassurance that notable mainstays like minority owner Melissa Petersen are staying on.

It’s pretty common for acquisitions to get messy, with changes to products or staffing felt soon after. Fans of Signature Plastics in the community seem to see this as a win so far, and hopefully the brand’s quality keycaps will continue amid growing competition from dupes and copycats.

Read More 

SocialAI answers the question: what if Twitter actually needs more bots?

Image: SocialAI

Remember the last time you posted a salient take to social media and got zero engagement, or got trolled? Now you can avoid that with a new “social network” full of inane AI chatbots that will — your pick! — debate you, attack you, or even just say nice things if you want.

It’s called SocialAI, and the very first thing it invites you to do is pick the followers you want, like “supporters,” “nerds,” “skeptics,” “visionaries,” and “ideators.” Afterward, endless chatbots along those themes fill the replies to your posts — not unlike the bots and boosters you’ll already find on Elon Musk’s social network, but now under your control.

Does that mean it’s any better? Well, take a look:

Screenshot: SocialAI
I thought a Nintendo social network sounded pretty good.

Well, if it’s looking to emulate out-of-the-blue replies on social media, it’s doing a bang-up job here.

Image: @suobset

Above, the “interesting social dynamics” of chilling in a hot tub five feet away from bros.

Image: @fernbyfilms

I’m glad Dr. Eloise Hartmann respects opinions.

Screenshot: Threads

Surprisingly, the bots actually seem to have some concrete feelings on the PS5 Pro — I guess a $699 price tag will do that.

Image: SocialAI
You can summon a techbro whenever you like.

As alx1231 points out, the AI threads it serves up aren’t any worse than the least interesting things the algorithm sometimes serves you on Threads or X. The difference is that try as we might, we could not get the chatbots to be all that mean to us!

Image: SocialAI

The bots always reply in the same basic format, just a few brief retorts or quips, and even when we chose to max out trolling and sarcasm, we didn’t see any personal attacks.

When we tried to create a positive echo chamber instead, they had no problem calling hot dogs the “sparkly sandwiches of the world” or including out-of-place chart emoji.

Screenshot: SocialAI

And yes, let’s discuss the science of peanut butter and jelly and its impact on cognition and mood!

Image: SocialAI

They’ll even respond to boilerplate Lorem Ipsum text:

Image: SocialAI

So you get the idea. If you’ve used early chatbots, these kinds of replies should look familiar, and this isn’t even the first social networking app that has experimentally replaced all of the humans with generative AI.

SocialAI comes across as sort of a joke, or maybe some kind of meta-commentary on the concept of social media and cheap engagement, particularly after creator Michael Sayman helpfully explained: “now we can all know what Elon Musk feels like after acquiring Twitter for $44 billion, but without having to spend $44 billion.” He also says it’s “designed to help people feel heard,” though, and is ostensibly a way to help people avoid feeling isolated.

There’s no edit button, by the way.

Image: SocialAI

Read More 
