engadget-rss
Proton brings its VPN to Apple TV with new app
Proton announced the debut of an Apple TV app for its virtual private network. The new app, which was “among the most requested features from our community,” according to the company’s blog post, is available for download from the App Store on any Apple TV. It will allow customers with a paid Proton VPN plan to stream their media content from any location on Apple’s set-top box.
Proton VPN was our favorite virtual private network when we reviewed it in 2023, and it’s still our top pick this year. The service boasts excellent features for security, privacy and usability. Our only real complaint was that the free tier comes with a lot of limitations. But if you’re interested in the company’s platform, Proton is currently running an early Black Friday deal where you can snag one- or two-year plans at a steep discount.
This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/vpn/proton-brings-its-vpn-to-apple-tv-with-new-app-204549019.html?src=rss
X’s Community Notes feature has one job, and it’s failing to do it
It’s no secret that X has become an even bigger cesspool of misleading information, unchecked claims and flat-out falsities since Elon Musk took over. Two new reports, from The Center for Countering Digital Hate (CCDH) and The Washington Post, reveal that the safeguards Musk removed and replaced aren’t controlling X’s problems with misinformation.
The CCDH published a report on its investigation into X’s Community Notes feature, a user-driven reporting system in which anonymous users write and rate corrections for misleading posts. Researchers took a sample of 283 misleading election posts from the social media platform that received proposed Community Notes between January 1 and August 25. The report says that 209 of those misleading sample posts did not show the Community Notes correction to all X users. Even more alarming, the 209 misleading posts in question racked up 2.2 billion views.
The Washington Post followed the CCDH’s report with its own investigation into X’s Community Notes feature and found that X’s problems with misinformation go far beyond the election.
Former President Donald Trump made the bold claim during his only presidential debate with Vice President Kamala Harris that Haitian immigrants were eating people’s pets in Springfield, Ohio. Moderator and ABC News anchor David Muir corrected Trump’s statement as false because no such cases had been reported to local police or government entities. The fact-checking website PolitiFact gave Trump’s claim its lowest rating, “Pants on Fire.” That didn’t stop the falsehood from spreading across X among conservative-leaning users.
The Post found that an account called End Wokeness, with a following of 3.1 million X users, started disseminating the former president’s claim about Haitian immigrants. The post remained unchecked for four days until one Community Notes user flagged it as incorrect, citing five different articles to back up the correction. Unfortunately, the note failed to garner enough votes to label the post as false and it went uncorrected. As of Wednesday, the post is still up on @EndWokeness’ account without a Community Note, and it has racked up 4.9 million views.
Musk’s account hasn’t helped the problem. The Post reports that he’s become “one of the X users most often targeted with proposed Community Notes,” with one in 10 of his posts receiving a proposed correction note.
The publication cited a July post from @elonmusk containing a manipulated video of Harris spouting about President Joe Biden’s “senility” and how she became the nominee because she’s “the ultimate diversity hire.” You know where this is going. There’s no Community Note or correction, and the post is still on X even though thousands of replies from other X users point out that it’s a fake. The post has a whopping 136.6 million views.
The CCDH is one of Musk and X’s most vocal opponents. The British non-profit continually monitors Musk’s account for false posts that fail to earn a Community Note, particularly when it comes to the presidential election. Its CEO Imran Ahmed said in August that X “is failing woefully to contain the kind of algorithmically-boosted incitement that we all know can lead to real world violence.” X took the CCDH to court over claims the non-profit created a “scare campaign” to bring down its advertising revenue. A US district court judge dismissed the lawsuit in March.
This article originally appeared on Engadget at https://www.engadget.com/social-media/xs-community-notes-feature-has-one-job-and-its-failing-to-do-it-202645987.html?src=rss
AMD’s next-gen GPUs are set to arrive in early 2025, suggesting a CES reveal
AMD and NVIDIA could be on a collision course for CES. AMD CEO Lisa Su has confirmed for the first time that the company is set to release its next-gen PC GPUs early next year.
“In gaming graphics, revenue declined year-over-year as we prepare for a transition to our next-gen Radeon GPUs based on our RDNA 4 architecture,” Su told investors on AMD’s third-quarter earnings call. “In addition to a strong increase in gaming performance, RDNA 4 delivers significantly higher ray-tracing performance and adds new AI capabilities. We are on track to launch the first RDNA 4 GPUs in early 2025.”
The timing very much suggests that AMD will reveal those RDNA 4-based graphics cards at CES in early January. It’s rare for the company to unveil desktop GPUs at the trade show (laptop cards are generally the order of the day for AMD at that event). However, it’s widely expected that NVIDIA will use its CES keynote to debut its next-gen 50-series GeForce RTX GPUs. We might get a little more clarity on that front when NVIDIA announces its own Q3 earnings results on November 19.
As PCWorld notes, AMD’s first RDNA 4 GPUs are expected to deliver mid-range performance at an equivalent price point in a bid to increase its market share. AMD’s gaming business (which includes the company’s GPU division) saw a 69 percent year-over-year drop in revenue to $462 million in Q3.
This article originally appeared on Engadget at https://www.engadget.com/computing/amds-next-gen-gpus-are-set-to-arrive-in-early-2025-suggesting-a-ces-reveal-192630199.html?src=rss
The Daft Punk anime Interstella 5555 is coming to theaters for one night only
The Daft Punk anime Interstella 5555: The 5tory of the 5ecret 5tar 5ystem is coming to movie theaters on December 12 for one night only. It’ll be screened in over 800 cinemas in more than 40 countries around the globe. Tickets go on sale November 13, so bookmark this page to make sure you snag one before they sell out.
For the uninitiated, the film was first released back in 2003 as a collaboration between Daft Punk and manga legend Leiji Matsumoto, who passed away last year. The anime acts as a visual companion piece to Daft Punk’s album Discovery. There’s no dialogue and only minimal sound effects. It’s all about the music.
There is a plot, but it’s more a loose amalgamation of sci-fi ideas that act as a springboard to play Daft Punk songs. For instance, the band’s iconic “Harder, Better, Faster, Stronger” begins when the lead character gets a hold of some high-tech sunglasses. It is, however, a visually stunning affair.
To that end, this is a 4K remaster of the original. However, there has already been a bit of controversy surrounding this remaster. Distributor Trafalgar allegedly used AI to upscale some of the footage and, well, people don’t seem to be happy with the results. In any event, we don’t have too long to see how it all came together.
To commemorate this limited theatrical release, the band is releasing a whole bunch of affiliated merch. There’s a physical edition of the film, complete with the original Japanese artwork, stickers and a collectible Daft Club card. Fans can also purchase the soundtrack album in multiple formats, including gold vinyl and numbered CDs.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/the-daft-punk-anime-interstella-5555-is-coming-to-theaters-for-one-night-only-184501194.html?src=rss
Children with Android phones will be able to use Google Wallet’s tap-to-pay next year
Google Wallet for kids will roll out in 2025. “Following the positive response of tap-to-pay on Fitbit Ace LTE devices, we’re expanding tap-to-pay for kids to Google Wallet,” Google wrote in a statement to 9to5Google, which first reported on it. Parents could approve credit and debit cards added to children’s phones, and Google’s Family Link would let them view transactions and easily approve or remove cards.
The service would build on the tap-to-pay functionality in Google’s Fitbit Ace LTE kids’ activity tracker. The expansion would make the Google Wallet app available to children using Android phones whose parents have set up Family Link and approved access.
Any of the parents’ existing payment cards in Google Wallet could be used for the kids’ spinoff. When paying, children would have to approve tap-to-pay purchases using standard authentication options (fingerprint, facial recognition, PIN or password). At launch, the service is said to support gift cards and event tickets but not online purchases, identification or health cards.
Apple already has a similar take on children’s purchases. Families in the company’s ecosystem can let their kids use Apple Pay in stores and online or send money through Messages with Apple Cash Family.
9to5Google says Google’s kids’ payments feature will roll out next year for “some Google Wallet users in several countries,” including the US. A wider rollout is expected at some point after that.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/children-with-android-phones-will-be-able-to-use-google-wallets-tap-to-pay-next-year-182650364.html?src=rss
Google CEO says a quarter of the company’s new code is already AI generated
Google CEO Sundar Pichai just revealed that AI now generates more than a quarter of new code for its products, according to a company earnings call transcribed by Ars Technica. In other words, AI tools are already having an absolutely mammoth impact on the development of software.
Pichai did say that human programmers oversee the computer-generated code, which is something. The CEO noted that AI coding helps with “boosting productivity and efficiency,” ensuring that engineers “do more and move faster.”
There are no two ways about it: 25 percent is a lot, and Google is just one company relying on AI algorithms to perform complex coding tasks. According to Stack Overflow’s 2024 Developer Survey, over 75 percent of respondents are already using or are “planning to use” AI tools to assist with software development. Another survey by GitHub indicated that 92 percent of US-based developers are currently using AI coding tools.
This leads us to the rampaging elephant in the room. As AI continues to gobble up coding tasks, human experience starts to dwindle. This could eventually lead to a decreased knowledge base in which humans don’t know how to fix errors created by AI algorithms that were, in turn, created by other AI algorithms. We could be staring down an ouroboros of confusion where it’s nearly impossible to detect bugs amidst generations of AI code. Fun times!
We aren’t quite there yet, but AI-assisted coding shows no signs of slowing down. The process started its meteoric rise back in 2022 when GitHub widely launched its Copilot program. Since then, companies like Anthropic, Meta, Google and OpenAI have all released AI-coding software suites. GitHub recently announced that Copilot can now be used with models from Anthropic and Google, in addition to OpenAI.
This article originally appeared on Engadget at https://www.engadget.com/ai/google-ceo-says-a-quarter-of-the-companys-new-code-is-already-ai-generated-180038896.html?src=rss
November’s PS Plus Monthly Games include Ghostwire: Tokyo and Hot Wheels Unleashed 2
Sony has revealed the trio of games that all PlayStation Plus members can claim in November and keep in their library as long as they maintain their subscription. Arguably the most recognizable title of the bunch is Ghostwire: Tokyo (PS5), an action-adventure game from former Bethesda studio Tango Gameworks.
Ghostwire: Tokyo, which debuted in early 2022, is a fairly well-reviewed first-person game that sees you battling supernatural forces in Japan’s capital using an array of abilities. A sequel had been mooted before Bethesda owner Microsoft shut down Tango earlier this year. While Tango has found a second life after PUBG: Battlegrounds publisher Krafton snapped it up, it’s unclear whether the Ghostwire: Tokyo franchise will continue into another game.
It’s worth noting that the PC version of Ghostwire: Tokyo will be available to claim for free on the Epic Games Store starting Thursday as well. You’ll have until next Thursday morning (November 7) to snag that.
PS Plus members can also claim Hot Wheels Unleashed 2 – Turbocharged (PS4 and PS5) at no extra cost. It’s a racing game, as you might expect. It includes a track editor, so you can create your own courses.
Last but not least is Death Note Killer Within (PS4 and PS5). This is a brand-new social deduction game for up to 10 people in which you can play as characters from the manga. It looks like a Death Note-flavored spin on games like Among Us.
You can claim all three of these titles between November 5 and December 2, so they could help keep your mind occupied while the general election results become clear. If you haven’t yet snagged October’s PS Plus Monthly Games — WWE 2K24, Dead Space and Doki Doki Literature Club Plus! — you’ve got until November 4 to do so.
This article originally appeared on Engadget at https://www.engadget.com/gaming/playstation/novembers-ps-plus-monthly-games-include-ghostwire-tokyo-and-hot-wheels-unleashed-2-174051803.html?src=rss
How to take Apple’s hearing test with the AirPods Pro 2
When iOS 18.1 arrived earlier this week, Apple delivered the highly anticipated suite of hearing health features that it announced at the iPhone event in September. This includes hearing aid and hearing protection tools, as well as a “clinically-validated” hearing test in your pocket. With the combination of an iPhone and a pair of second-gen AirPods Pro, you can take a hearing test that’s similar to what you’d get at an audiologist’s office without leaving home. Most importantly, the whole thing takes about five minutes and gives you detailed results immediately. Here’s a step-by-step guide on how to use it.
Update your iPhone and AirPods Pro 2
Before you can access Apple’s hearing test, you’ll need to make sure your iPhone is updated to iOS 18.1 and your AirPods Pro 2 have the latest firmware (7B19). None of the new hearing health features will show up in the AirPods settings or in the Apple Health app if you don’t have both of those updates. What’s more, you won’t be able to run the hearing test or use any of the other new tools on the first-gen AirPods Pro (2019 model).
You can check your current iOS version from the iPhone Settings menu. Scroll down to General and tap Software Update. From here, you can see which version of iOS you’re running and if you’ve got a pending update that’s ready to download and install. Once again, you’re looking for iOS 18.1 here since this is the software version that delivers the suite of hearing health features.
To check the firmware on your AirPods Pro 2, connect the earbuds to your iPhone and navigate to the Settings menu. Here, your AirPods Pro 2 should appear near the top of the list and tapping that option will take you into the settings. You can also access AirPods Pro 2 details from the Bluetooth menu by tapping the “i” icon next to the device name.
Once you’re in the AirPods settings menu, scroll all the way down to the bottom of the main screen. One of the last things you’ll see is a bunch of firmware info, including the current version for the AirPods Pro 2. If you see 7B19, you’re good to go. If not, your earbuds haven’t updated yet, but you can try to force them to do so instead of waiting for the over-the-air process to take place on its own.
To do this, connect the AirPods Pro 2 to your iPhone for at least 30 seconds and play music to confirm the connection is stable. Then put the earbuds back in the charging case and close the lid, keeping the AirPods Pro 2 in range of the iPhone. Now check Bluetooth settings, and if you see the AirPods Pro 2 stay connected for more than 10 seconds while in the charging case with the lid closed, that should indicate the update is in progress.
Where to find Apple’s hearing test
Apple allows you to access its hearing test from two places, and both of them are easy to find. The first is in the AirPods menu, which you can get to from the main Settings menu or from the Bluetooth menu. The Hearing Health section is prominently displayed on the main screen, just under the Noise Control options. In this section, “Take a Hearing Test” will be the third item after Hearing Protection and Hearing Assistance, and it will appear in blue.
In the Health app, the fastest way to get to the hearing test is to tap Browse in the menu at the bottom of the main Summary screen. From there, select “Hearing” with the blue ear icon and scroll down to “Get More From Health.” Here, you’ll see the option to take the hearing test with the AirPods Pro 2.
How to take Apple’s hearing test
After you update your devices and find the hearing test, the hardest part is over. The software-based test guides you through the entire process, with detailed info on what you can expect and what the results mean for you. After you select “Take Hearing Test,” you’ll tap “Get Started” to begin the process. First, the software will ask you if you’re 18 or older, if you’re experiencing allergy or cold symptoms and if you’ve been in a loud environment (like a concert) in the last 24 hours. The latter two items could impact the accuracy of your test if the answer to either is yes.
On the next screen, the AirPods and iPhone tandem will make sure that your surroundings are quiet enough for the hearing test. Too much background noise will make it difficult for you to hear the more subtle tones during the screening. Next, the setup will make sure that the AirPods Pro 2 fit properly in your ears and that they provide an adequate seal for the test. You’ll be notified that Do Not Disturb will be active during the test to prevent distractions, and active noise cancellation (ANC) mode will be enabled at this point. The test will then offer some sample tones and let you know that each tone during the test will play three times.
When the test begins, you’ll simply tap the screen of your iPhone when you hear a tone (you only have to tap once for each tone). The test begins with your left ear before moving over to the right. Don’t worry if you miss one: the test will repeat any of the sounds it thinks you missed along the way. When the test is over, you’ll immediately get the results on your iPhone for each ear, including a detailed audiogram that shows which frequencies you struggle to hear (if any). Results are also viewable in the Health app at any time, and you can export a PDF to share with a doctor or for other purposes as needed.
What to do with your hearing test results
If you have little to no hearing loss, Apple’s tool will offer suggestions on how to keep your hearing healthy and inform you that no changes are needed to the tuning of your AirPods Pro 2. If you exhibit mild to moderate hearing loss, the software will ask if you want to set up Apple’s Hearing Assistance features, which include the hearing aid, Media Assist and Conversation Boost. Lastly, if the test determines that you have severe or profound hearing loss, Apple will recommend that you see a professional for further evaluation.
AirPods Pro 2 hearing aid features are only designed for users with mild to moderate hearing loss, and the hearing test can only measure hearing loss under 85 dBHL. Here’s how the hearing loss categories break down, according to the World Health Organization (a short illustrative sketch of the mapping follows the list):
Little to No Loss: Up to 25 dBHL
Mild Loss: 26 – 40 dBHL
Moderate Loss: 41 – 60 dBHL
Severe Loss: 61 – 80 dBHL
Profound Loss: Above 80 dBHL
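Purely as an illustration (this isn’t Apple’s code, and the function name is a hypothetical stand-in), here’s a minimal Python sketch of how a dBHL reading maps onto those WHO categories:

def who_hearing_category(dbhl: float) -> str:
    # Map a hearing level in dBHL to the WHO category listed above.
    # Thresholds mirror the list: 25, 40, 60 and 80 dBHL.
    if dbhl <= 25:
        return "Little to No Loss"
    elif dbhl <= 40:
        return "Mild Loss"
    elif dbhl <= 60:
        return "Moderate Loss"
    elif dbhl <= 80:
        return "Severe Loss"
    return "Profound Loss"

# Example: a 45 dBHL result lands in the "Moderate Loss" range, the kind of
# result the AirPods Pro 2 hearing aid feature is designed to address.
print(who_hearing_category(45))  # Moderate Loss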
This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/how-to-take-apples-hearing-test-with-the-airpods-pro-2-173014978.html?src=rss
Ubisoft stealth released an NFT game absolutely no one needs
A new tactical RPG for PC from Ubisoft requires NFTs to play it. Per IGN, Ubisoft Quartz, the publisher’s NFT platform, has released Champions Tactics: Grimoria Chronicles without much fanfare.
It’s not really billed as a Web3-based game in the trailer, probably because NFTs are about as popular an investment as BlackBerry phones or fax machines. However, they still play a major part in Champions Tactics. The game gives you a handful of free temporary figurines to start you off, but you’ll need to buy your own NFTs to compete, using in-game currency or cryptocurrency, at prices that can reach into the thousands. The most expensive figurine, called the “Swift Zealot,” will set you back $63,000.
Ubisoft launched its NFT platform Quartz a little under three years ago to a wave of backlash. The announcement video on YouTube attracted more than 35,000 dislikes in 24 hours and open criticism from consumers and employees who felt its environmental impact wasn’t worth the risk. The launch included a set of 15 NFTs in the form of skins and guns for Ghost Recon Breakpoint. The publisher only sold 18 NFTs in its first few weeks.
This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/ubisoft-stealth-released-an-nft-game-absolutely-no-one-needs-172145079.html?src=rss
Avride’s next-gen delivery robot ditches two wheels and adds NVIDIA AI brains
Autonomous delivery vehicle company Avride has a fresh design — and NVIDIA AI brains. The company’s engineers have swapped out the old six-wheel configuration for a more efficient four-wheel chassis. It can make 180-degree turns almost instantly, effortlessly park on inclines and move faster without compromising safety.
Avride has been working on autonomous delivery robots since 2019. It began as part of Russian tech company Yandex’s autonomous driving wing. But the spun-off company divested its Russian assets after Vladimir Putin ordered the invasion of Ukraine in 2022 and rebranded as Avride. It’s now owned by the Netherlands-based Nebius Group (formerly Yandex N.V.), is headquartered in Austin, TX, and is making deals with the likes of Uber.
The company’s latest delivery robot shakes up one of the few constants from previous iterations: They all had six wheels. The new four-wheel robo-buggy uses a “groundbreaking chassis design” that eliminates some of the rough spots from older generations. These included additional friction and tire wear caused by excessive braking required for turns, lower maneuverability and less precise trajectory execution. Avride says the new model dramatically improves on all of those counts.
The new vehicle’s wheels are mounted on movable arms attached to a pivoting axle. For turns, each wheel glides along a circular path stabilized by the central arm. “This design allows the wheels to rotate both inward and outward, reducing friction during turns,” the company wrote in its announcement blog post.
Central to the new design is ditching the traditional front and rear axles for mechanically connected wheel pairs on each side. Avride says this enables simultaneous turning angle adjustment, leading to more precise positioning and maneuvers.
Among the results of the fresh approach are almost instant 180-degree turns. Avride says this especially helps when navigating narrow sidewalks, where sudden adjustments could be necessary. Parking on slopes is also more energy efficient: It now sets its wheels in a cross pattern to park in place without careening downward. The tighter controls also let the company increase its maximum speed. “This means faster deliveries for our customers,” the company wrote. (And, presumably, more profit.)
Not only did the new generation of delivery bots get a new body, but it also got smarter. Powered by the NVIDIA Jetson Orin platform, essentially an “AI brain for robots,” the vehicles can now tap into neural networks as powerful as those in full-size autonomous cars. This lets them process “vast amounts” of sensor data like lidar inputs and camera feeds in real time.
Finally, it wouldn’t be a delivery buggy without a cargo compartment — and that got an upgrade, too. The new model has a fully detachable storage section, allowing for modular swap-outs for different purposes. Avride says its standard cargo hold is big enough to hold several large pizzas and drinks or multiple grocery bags. It also adds a sliding lid that only provides access to the correct section, helping to avoid delivering orders to the wrong customers.
Engineering and design nerds can read much more detail about the new robots in Avride’s Medium post.
This article originally appeared on Engadget at https://www.engadget.com/transportation/avrides-next-gen-delivery-robot-ditches-two-wheels-and-adds-nvidia-ai-brains-171053813.html?src=rss