Nintendo Direct June 2024: all the news and trailers

Illustration by Alex Castro / The Verge

No Switch 2, no problem.

As promised, the latest Nintendo Direct is taking place on June 18th, and the focus is on games for the second half of the year, which is a good thing because, aside from a Luigi’s Mansion 2 remake and a game about speedrunning, the lineup of upcoming Switch titles is very quiet at the moment.

There are some games we know are in the works, at least, which could potentially make an appearance. That includes the long-in-development Metroid Prime 4 and Pokémon Legends: Z-A, which isn’t due out until 2025. And, hey, maybe this is the time we finally learn when Hollow Knight: Silksong will be coming out. But given the way the year has been going, there’s a pretty good chance this Nintendo showcase will feature a number of rereleases and remasters.

One thing that won’t be there is the Switch 2 or whatever Nintendo’s next console turns out to be. Despite persistent rumors of an imminent reveal, Nintendo is adamant that “there will be no mention of the Nintendo Switch successor during this presentation.”

The Nintendo Direct kicks off at 7AM PT / 10AM ET, and you can keep up with all of the important reveals and trailers right here.

Google DeepMind’s new AI tool uses video pixels and text prompts to generate soundtracks

Illustration: Cath Virginia / The Verge | Photos: Getty Images

Google DeepMind has taken the wraps off of a new AI tool for generating video soundtracks. In addition to using a text prompt to generate audio, DeepMind’s tool also takes into account the contents of the video.

DeepMind says that, by combining the two, users can create scenes with “a drama score, realistic sound effects or dialogue that matches the characters and tone of a video.” You can see some of the examples posted on DeepMind’s website — and they sound pretty good.

For a video of a car driving through a cyberpunk-esque cityscape, Google used the prompt “cars skidding, car engine throttling, angelic electronic music” to generate audio. You can see how the sounds of skidding match up with the car’s movement. Another example creates an underwater soundscape using the prompt, “jellyfish pulsating under water, marine life, ocean.”

Even though users can include a text prompt, DeepMind says it’s optional. Users also don’t need to meticulously match up the generated audio with the appropriate scenes. According to DeepMind, the tool can also generate an “unlimited” number of soundtracks for videos, allowing users to come up with an endless stream of audio options.

That could help it stand out from other AI tools, like the sound effects generator from ElevenLabs, which uses text prompts to generate audio. It could also make it easier to pair audio with AI-generated video from tools like DeepMind’s Veo and Sora (the latter of which plans to eventually incorporate audio).

DeepMind says it trained its AI tool on video, audio, and annotations containing “detailed descriptions of sound and transcripts of spoken dialogue.” This allows the video-to-audio generator to match audio events with visual scenes.

The tool still has some limitations. For example, DeepMind is trying to improve its ability to synchronize lip movement with dialogue, as you can see in this video of a claymation family. DeepMind also notes that its video-to-audio system is dependent on video quality, so anything that’s grainy or distorted “can lead to a noticeable drop in audio quality.”

DeepMind’s tool isn’t generally available just yet, as it will still have to undergo “rigorous safety assessments and testing.” When it does become available, its audio output will include Google’s SynthID watermark to flag that it’s AI-generated.

Tesla’s big, epic, confusing future

Image: Alex Parkin / The Verge

Elon Musk got paid. A lot. Now comes the interesting part: Musk has to make Tesla so big, so successful, and so valuable that the tens of billions he’s getting seem cheap by comparison. You might think selling cars is the way to get there, but you might be wrong about that. Musk and, by extension, Tesla seem far more interested in some much bigger, more ambitious, and decidedly more complicated ideas.

On this episode of The Vergecast, The Verge’s Andrew Hawkins joins to discuss all the things Tesla is working on — and why it increasingly feels like cars are at the bottom of that list. We talk about the Optimus robots, the Tesla Network, the whole “AWS for AI” thing, and what the next three mystery vehicles might be. Finally, we look at how Musk’s relationship to Tesla, and our relationship to Musk, has changed since the last time Tesla shareholders voted to give Musk all that money. Do these two need each other more than ever, or is this a match doomed to fail?

After that, The Verge’s Victoria Song joins to share her excitement about Apple Watch rest days. Er, excuse me, “ring pauses.” It only took a decade for Apple to realize what even the most active people need sometimes, and we’re very excited about the possibility of a saner relationship with our exercise streaks. We also talk about Pixel Watch leaks, the new Samsung Galaxy Watch FE, and whether the Galaxy Ring can be a game-changer.

Finally, we answer a question from the Vergecast Hotline (866-VERGE11 or vergecast@theverge.com, send us all your questions!) all about weather apps. There’s no best weather app for everyone, we realize — but it’s totally possible to get the best one for you.

If you want to know more about everything we discuss in this episode, here are some links to get you started, beginning with Tesla:

Tesla’s 2024 shareholder meeting: all the news about Elon Musk’s $50 billion payday
Tesla shareholders approve Elon Musk’s massive pay package — was there ever any doubt?
Let’s speculate wildly about Tesla’s three mystery vehicles
Whatever Elon wants, Tesla gets

And on all things wearables:

Finally, the Apple Watch will let you rest
Apple announces watchOS 11 with new training features and Live Activities
The Pixel Watch 2 can now detect when you’ve been in a car crash
Leaked Google Pixel Watch 3 renders suggest it will get thicker but not bigger
Samsung’s Galaxy Watch FE is its new entry-level smartwatch
Samsung sues Oura preemptively to block smart ring patent claims
This walking app let me whack my co-workers with a baseball bat

And on weather apps:

Find the best forecast where you are: Forecast Advisor

Apple’s Weather chaos is restarting the weather app market

Blink and you won’t miss these Moments

Blink’s new Moments feature should make it easier to see relevant clips at a glance. | Image: Blink

Budget smart security camera company Blink is adding a neat trick to its app: Blink Moments automatically stitches together relevant clips from multiple cameras into a single video. The idea is to make it easier to see what’s been going on around your home and also to share it with friends and family — or the police — depending on what you captured.

Moments, which has been in beta testing, started rolling out on June 4th and will be available to all Blink users with multiple cameras in the next few weeks, according to the company. Besides multiple cameras, it requires a Blink Subscription Plus Plan ($10 a month or $100 a year) but doesn’t require Blink’s Sync Module hub.

According to Blink, Moments are generated when two or more cameras on the same system record motion-activated clips or saved Live View clips within 45 seconds of the start time of the last recorded clip. The clips are then stitched together into one video that can be shared from the Blink app.
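
To make the 45-second rule concrete, here is a purely illustrative Python sketch of how that kind of grouping could work. The Clip class, function name, and timestamps are assumptions for illustration only; Blink hasn’t published its actual implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: groups clips whose start times fall within 45
# seconds of the previous clip's start, then keeps groups spanning 2+ cameras.
WINDOW_SECONDS = 45

@dataclass
class Clip:
    camera: str
    start: float  # clip start time as a Unix timestamp

def group_into_moments(clips: list[Clip]) -> list[list[Clip]]:
    groups: list[list[Clip]] = []
    current: list[Clip] = []
    for clip in sorted(clips, key=lambda c: c.start):
        if current and clip.start - current[-1].start > WINDOW_SECONDS:
            groups.append(current)
            current = []
        current.append(clip)
    if current:
        groups.append(current)
    # A Moment needs clips from at least two different cameras.
    return [g for g in groups if len({c.camera for c in g}) >= 2]

# Example: front-door and driveway cameras trigger 30 seconds apart, so their
# clips form one Moment; the lone backyard clip 10 minutes later does not.
moments = group_into_moments([
    Clip("driveway", 0.0),
    Clip("front-door", 30.0),
    Clip("backyard", 600.0),
])
print(len(moments))  # 1
```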

Blink Moments stitches together clips from multiple cameras into a single video. This shows a delivery driver being filmed from various angles.

This could be a useful feature: if you have multiple cameras, it can be tricky to manually scroll through dozens of clips to see what’s been going on. Because Blink only offers motion and person detection, there are fewer ways to filter clips to find what you’re looking for, and the Moments feature could offer a glanceable way to view all the action.

Blink, which is owned by Amazon, isn’t the only camera company with this type of capability. Eufy’s cross-camera tracking is a similar concept and doesn’t require a subscription — although it does need a Eufy HomeBase 3, which costs $150. It will be interesting to see if a Moments-style feature comes to Blink’s sister brand Ring at some point, too.

Photo by Jennifer Pattison Tuohy / The Verge
The Blink Mini 2, a plug-in indoor / outdoor camera, will work with the new Moments feature.

According to Blink, Moments will work with all current and prior generations of Blink’s battery-powered, wired, and plug-in cameras, and if you add new cameras, they’ll automatically integrate with the feature.

Blink’s current flagship camera is its $90 Blink Outdoor 4 battery-powered camera ($120 with a required Sync Module). It can last up to an impressive two years on two AA batteries or up to four years with the expansion battery pack. The company also sells a $40 plug-in Blink Mini 2, a wired floodlight camera, and a wired / battery-powered video doorbell. All of these cameras support the Moments feature.

Update, June 18th: Added that a Blink Sync Module is not required for Moments.

Xreal’s new gadget is a phone-sized Android tablet just for your AR glasses

The Xreal Beam Pro looks like a smartphone — but it’s not. | Image: Xreal

Xreal has made a name for itself with some surprisingly nice-looking augmented reality glasses, which put a display in front of your eyes so you can do things like watch TV or play games on a giant screen projected just for you. But unlike, say, Apple’s Vision Pro or Meta’s Quest 3, Xreal has no built-in software or content. It’s just a screen in your glasses. This is good in that you can plug in lots of other devices, but it does restrict the sort of things those glasses can do.

The new Beam Pro is Xreal’s latest attempt to bridge that gap. It’s a handheld device with the rough dimensions of a smartphone, but Xreal thinks of it as more of a companion to your glasses. It runs a customized version of Android 14 — Xreal calls it NebulaOS — and should be able to load most apps onto your face screen. And on the back, there’s a dual-lens camera you can use to take spatial and 3D videos for viewing in your glasses. (Or your Vision Pro; Xreal says the Beam Pro’s footage will work in your Apple headset, too.)

The specs here are pure smartphone: the Beam Pro has a 6.5-inch, 2400 x 1800 screen, runs on a Qualcomm Snapdragon processor (though it’s not clear which one), and has either 6GB or 8GB of RAM and either 128GB or 256GB of storage. The cheapest model costs $199, though you probably won’t want one unless you’re also plunking down a few hundred bucks for a pair of Xreal glasses.

There are a couple of obvious tells that this is no ordinary Android phone, though. First are the dual 50-megapixel cameras, the likes of which we haven’t seen on smartphones much in recent years. The Beam Pro also has two USB-C ports, so you can charge the device and connect it to your glasses simultaneously. The NebulaOS tweaks to Android are all about AR, too; when you have your glasses plugged in, you can use the Beam’s screen as a touchpad, and the device is also designed to have two apps open side by side in your field of view. When you first plug in the glasses, it’ll pop up a homescreen of your apps, which you can open and control using the Beam Pro as a remote.

Image: Xreal
When you connect the Beam Pro to your glasses, your Android apps will appear in front of your eyes.

The Beam Pro looks like a big upgrade on the Beam, which was essentially just a remote control for your Xreal glasses. The Beam definitely solved a problem for Xreal owners, but it had some issues: a bunch of reviewers and users found it was fiddly and unreliable, and Xreal had a hard time explaining to users why it even existed in the first place. The screen should make the Pro much easier to use, and the camera makes it more than just a lesser smartphone replacement. You can, of course, still plug in your Steam Deck or smartphone and use Xreal’s glasses that way, but this feels like a more integrated approach.

Xreal’s approach is much less integrated than what we’re seeing from Apple and Meta, both of which are determined to put a whole computer on your face. But there’s something clever about Xreal’s way: it’s using a totally mature device category to do all the hard work — and doing as little on your face as possible. At least for now, it feels like a smart strategy.

Image: Xreal
Xreal’s glasses can still connect to your phone — but now they get their own device, too.

WhatsApp reunites Modern Family for ad about green bubble friends

Image: WhatsApp

Four members of ABC’s Modern Family cast have returned to their roles to star in a new commercial for WhatsApp. The ad is designed to promote WhatsApp to families that use a mix of iPhones and Android devices, so they can avoid green bubbles and the limitations of Apple’s group chat MMS solution.

If you’re an iPhone user in the US, you probably already avoid inviting Android friends into group chats, since doing so switches the conversation from iMessage to MMS and disables regular Messages features like read receipts, encryption, and full-resolution images. WhatsApp hits on this directly in its commercial, with Julie Bowen (Claire Dunphy) explaining to Jesse Tyler Ferguson (Mitchell Pritchett) that “blurry photos” and “weird likes” are why he hasn’t been invited to the group chat with his new Android phone.

“Don’t make this family’s mistake – switch to WhatsApp for seamless and private messaging across all phones” is the tagline for the commercial, which also stars Ty Burrell (Phil Dunphy) and Eric Stonestreet (Cameron Tucker).

The WhatsApp ad comes as Apple begins adding Rich Communication Services (RCS) support to iOS 18 to replace the aging SMS standard. Meta’s commercial plays on years of taunting and cajoling over green bubble friends, but it might only have months to make its point before RCS support arrives in iOS 18.

Apple barely acknowledged its RCS plans during its WWDC keynote earlier this month, and we still don’t know much about how exactly Apple will implement it. RCS support in iOS 18 is still a big deal though, and one that might make it even harder for WhatsApp to convince iPhone users to switch their group chats over.

Still, at least Modern Family fans get to see some of the original cast back together for a brief spot. The show ended four years ago, and The Hollywood Reporter notes there are no plans for a Modern Family reboot.

LG Electronics will supply EV chargers to ChargePoint as part of new deal

South Korea’s LG Electronics is teaming up with ChargePoint to install more electric vehicle charging stations in the US, the companies announced today.

As part of the deal, ChargePoint will provide software to operate LG’s EV chargers, and LG will supply ChargePoint with hardware to bolster its network of 306,000 charge ports. The companies say they plan to jointly install “commercial charging solutions,” with the first deliveries expected later this summer.

Other possibilities include ChargePoint tapping into LG’s energy storage installations, as well as connecting LG’s ThinQ smart home system with ChargePoint’s Home Flex residential charger.

LG Electronics, which makes TVs and home appliances, opened its first EV charger factory in Fort Worth, Texas, earlier this year. The 100,000-square-foot facility has the capacity to produce up to 12,000 chargers a year. At the time, the company said it would supply EV charging equipment to charging operators; the deal with ChargePoint is one of the first such supplier partnerships to be announced.

Both companies anticipate tapping into funding from the Biden administration’s National Electric Vehicle Infrastructure (NEVI) program, which is allocating millions of dollars to EV charging operators. Although the program was authorized in 2021, only seven NEVI-funded stations have opened so far, prompting questions from Republicans about its effectiveness.

How to watch the June 2024 Nintendo Direct

Photo by Amelia Holowaty Krales / The Verge

The onslaught of summer gaming news might finally be at an end — but not before Nintendo has its say. Following events from PlayStation and Xbox, along with Summer Game Fest and presentations focused on Ubisoft and EA games, Nintendo has announced its latest Direct presentation. It’s slated to last around 40 minutes and will be “focused on Nintendo Switch games coming in the second half of 2024,” according to the company. (There won’t be any news about the next Switch, however, Nintendo says.)

As always with Nintendo, it’s tough to predict what to expect, though it’s possible we’ll see a number of rereleases of classic games and, just maybe, finally get a glimpse of Metroid Prime 4. You’ll definitely have to tune in to see what the company has in store for the Switch this holiday season.

How and when to watch the June 2024 Nintendo Direct

The next Nintendo Direct will take place on June 18th at 7AM PT / 10AM ET. It’ll be streaming on YouTube; you can either watch it right here or in the embed at the top of this article.

Meta releases Threads API for developers to build ‘unique integrations’

Illustration: The Verge

The Threads API is now available, meeting a promised launch by the end of June. The free API will allow developers to build “unique integrations” into Threads, and potentially even result in third-party apps for Meta’s competitor to what was previously known as Twitter.

“People can now publish posts via the API, fetch their own content, and leverage our reply management capabilities to set reply and quote controls, retrieve replies to their posts, hide, unhide or respond to specific replies,” explains Jesse Chen, director of engineering at Threads.
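
To give a rough sense of what publishing looks like in practice, here is a short Python sketch of a text-post flow. It assumes the two-step container-then-publish pattern Meta uses across its Graph APIs; the exact endpoint paths, parameter names, and response fields shown here are assumptions, so treat Meta’s official Threads API documentation and sample app as the source of truth.

```python
import requests

# Hypothetical publish flow for a text post; endpoints and fields are assumed.
GRAPH = "https://graph.threads.net/v1.0"
USER_ID = "<threads-user-id>"    # placeholder
ACCESS_TOKEN = "<access-token>"  # placeholder

def publish_text_post(text: str) -> str:
    # Step 1: create a media container holding the post's content.
    container = requests.post(
        f"{GRAPH}/{USER_ID}/threads",
        data={"media_type": "TEXT", "text": text, "access_token": ACCESS_TOKEN},
    )
    container.raise_for_status()
    creation_id = container.json()["id"]

    # Step 2: publish the container as a live Threads post.
    published = requests.post(
        f"{GRAPH}/{USER_ID}/threads_publish",
        data={"creation_id": creation_id, "access_token": ACCESS_TOKEN},
    )
    published.raise_for_status()
    return published.json()["id"]  # ID of the published post

if __name__ == "__main__":
    print(publish_text_post("Hello from the Threads API"))
```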

Chen says that insights into Threads posts are “one of our top requested features for the API,” so Meta is allowing developers to see the number of views, likes, replies, reposts, and quotes on Threads posts through the API. Meta has published plenty of documentation about how developers can get started with the Threads API, and there’s even an open-source Threads API sample app on GitHub.

Meta has been testing the Threads API with a small number of developers: Grabyo, Hootsuite, Social News Desk, Sprinklr, Sprout Social, and Techmeme. These test integrations have allowed sites like Techmeme to automate posting to Threads, or Sprout and Hootsuite customers to feed Threads posts into the social media management platform.

We’re now waiting to see if developers will be able to easily build a third-party Threads app with this new API that isn’t connected to a social media management platform. The existing fediverse beta could help with that, allowing Threads users to access posts through Mastodon clients and share content to Mastodon servers. The current beta of the fediverse integration doesn’t let users view replies and follows from the fediverse, though, so it’s far from feature complete as an alternative to third-party Threads apps.

Sims competitor Life by You has been canceled

Image: Paradox Interactive

Life by You, Paradox Interactive’s in-development competitor to The Sims, has been canceled, the company announced on Monday.

The game, which was first revealed in 2023, sounded impressive: it was designed to simulate the entire town in real time, with no loading screens. However, based on a forum post by Paradox’s deputy CEO Mattias Lilja, the game had issues that may not have been easily fixable even with additional development time.

“A few weeks back, we decided to hold off on an Early Access release in order to re-evaluate Life by You, as we still felt that the game was lacking in some key areas,” Lilja says. “Though a time extension was an option, once we took that pause to get a wider view of the game, it became clear to us that the road leading to a release that we felt confident about was far too long and uncertain.”

Lilja says that the game “had a number of strengths,” but the company realized that “when we come to a point where we believe that more time will not get us close enough to a version we would be satisfied with, then we believe it is better to stop.”

The game’s initial early access launch had been set for September 2023, but it was pushed to March 2024, then June, and then delayed indefinitely in May before being officially canceled.

The Life by You team hasn’t been the only one trying to make a new take on The Sims: former XCOM developers recently launched Midsummer Studios to develop a new life sim game of their own. But EA is hard at work on more Sims as well, developing a new free-to-play Sims game codenamed Project Rene.
