engadget-rss
After a 15-year hiatus, a new Skate game is coming in 2025
Skateboarding games have been going through another golden age the past couple of years, with contributions like Devolver’s surrealist skater Skate Story and Phantom Coast Games’ roguelite shredder Helskate. And now a new entry in one of the most beloved skateboarding franchises has just moved one step closer to getting a release date.
EA announced plans on the game’s official X account for an early access release of the new Skate — simply titled skate. (lowercase, with a period) — next year. The update also includes some pre-alpha footage of the new game, which is currently being playtested for consoles as well as the franchise’s first PC release through Steam.
we’re incredibly stoked to announce that skate. will be launching in Early Access in 2025. we’ll share more details on what to expect in the coming months. pic.twitter.com/ZbFL3WycWu
— skate. (@skate) September 17, 2024
Fans of the long-running Skate franchise have been receiving a steady stream of teases and updates since EA first announced the new entry four years ago. Notably, last June EA informed fans that the new title would be a free-to-play live service game with microtransactions (allegedly without any “pay-to-win” elements, though). The game’s publisher has also released new details on its official game dev diary about how it’s rebuilding the “Flick-It” trick control system, expanding the game’s character customization and running playtests that incorporate feedback from the franchise’s fans.
The devs also revealed some interesting details about the game’s core narrative. The new Skate game takes place in the fictional city of San Vansterdam, which has been taken over by a corporate overlord named M-Corp. It seems that M-Corp’s misdeeds are finally catching up to it, meaning that the shreddable city of San Van is open to skaters once again. A teaser video shared an update on M-Corp’s crumbling empire starring I Think You Should Leave star and comedian Tim Robinson as corporate lackey Richard “Richie” Dandle.
This article originally appeared on Engadget at https://www.engadget.com/gaming/after-a-15-year-hiatus-a-new-skate-game-is-coming-in-2025-182125136.html?src=rss
Logitech drops an analog keyboard and new Pro Superlight mice
Logitech is revealing plenty of new gaming accessories and gear at Logi Play 2024, which is happening right now. Of the many new offerings from Logitech, two keyboards and two mice caught our eye.
Let’s start with the G Pro X TKL Rapid Wired Gaming Keyboard, which features magnetic analog switches, a first for the G Pro line. These switches have adjustable actuation points, rapid trigger functionality and key priority. In short, the keyboard lets you customize how deep a press needs to be to register, recognizes key presses quickly and can prioritize one key over another when you press two at once.
You can also use the multi-point feature in the G Hub keyboard customization software to assign more than one command to a key depending on how far it’s pressed down. As the name suggests, this is a tenkeyless model (no number pad), and you can get it for $170 in November. The three available colors are black, white and pink.
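To make that concrete, here’s a minimal sketch of how depth-based key mapping works in principle. It’s purely illustrative, with made-up travel thresholds and command names; it is not Logitech’s G Hub software or its API:

```python
# Illustrative only: maps a key's analog travel (0.0 mm = rest, ~4.0 mm = bottomed
# out) to different commands at different depths, similar in spirit to the
# multi-point feature described above. Thresholds and commands are invented.

def multi_point_actions(travel_mm: float) -> list[str]:
    """Return the commands triggered at a given key travel depth (in mm)."""
    bindings = [
        (1.0, "walk"),    # shallow press
        (3.5, "sprint"),  # a near-bottomed-out press layers on a second command
    ]
    return [command for threshold, command in bindings if travel_mm >= threshold]

print(multi_point_actions(1.2))  # ['walk']
print(multi_point_actions(3.8))  # ['walk', 'sprint']
```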
The next keyboard is the G915 X series, a trio of new members of the G915 family (we reviewed the G915 TKL back in 2020). The mechanical keyboards all have a height of 23mm and redesigned galvanic switches with a 1.3mm actuation point. They retain the original volume roller, G key and media buttons, but the Keycontrol feature allows for more macros, even letting users combine the G key with other keys.
The G915 X series includes the G915 X Lightspeed ($230), G915 X Lightspeed TKL ($200) and G915 X Wired Gaming Keyboard ($180). The G915 X Lightspeed TKL is a tenkeyless version of the G915 X Lightspeed, while the wired G915 X doesn’t support wireless connections but is otherwise nearly identical to the G915 X Lightspeed. The Lightspeed models come in black or white, but the wired model is only available in black. They’re all available right now.
Moving on to the mice, the G Pro X Superlight 2 Dex Lightspeed wireless gaming mouse is an upgrade of the Pro X Superlight and Pro X Superlight 2, both of which are favorites among current and former Engadget staffers. The new mouse was designed with the help of pro esports athletes and boasts up to 44K DPI, 888 IPS and a steady 8kHz polling rate.
The Superlight 2 Dex Lightspeed has five buttons and Lightforce switches while weighing only 60 grams. It’s also compatible with Logitech’s PowerPlay wireless charging system. If you’re interested, you can get it now for $160 in black, white or pink.
For those who like the original G Pro mouse, consider the Pro 2 Lightspeed wireless gaming mouse, an improvement over the old model. The Hero 2 sensor on this one is rated for 32K DPI and over 500 IPS. The highest polling rate for the Pro 2 Lightspeed is 1kHz.
Similar to the first G Pro, this one weighs 80 grams, perfect for gamers who prefer something heavier. It doesn’t support wireless charging, but it can work with the Pro Lightspeed receiver for 8kHz polling rates. That receiver will be sold separately for $30 in October. The mouse itself is available now for $140 in black, white and pink.
This article originally appeared on Engadget at https://www.engadget.com/computing/accessories/logitech-drops-an-analog-keyboard-and-new-pro-superlight-mice-180113818.html?src=rss
Snap’s fifth-generation Spectacles bring your hands into augmented reality
Snap’s latest augmented reality glasses have a completely new — but still very oversized — design, larger field of view and all-new software that supports full hand tracking abilities. But the company is only making the fifth-generation Spectacles available to approved developers willing to commit to a year-long $99/month subscription to start.
It’s an unusual strategy, but Snap says it’s taking that approach because developers are, for now, best positioned to understand the capabilities and limitations of augmented reality hardware. They are also the ones most willing to commit to a pricey $1,000+ subscription to get their hands on the tech.
Developers, explains Snap’s director of AR platform Sophia Dominguez, are the biggest AR enthusiasts. They’re also the ones who will build the kinds of experiences that will eventually make the rest of Snapchat’s users excited for them too. “This isn’t a prototype,” Dominguez tells Engadget. “We have all the components. We’re ready to scale when the market is there, but we want to do so in a thoughtful way and bring developers along with our journey.”
Snap gave me an early preview of the glasses ahead of its Partner Summit event, and the Spectacles don’t feel like a prototype the way its first AR-enabled Spectacles did in 2021. The hardware and software are considerably more powerful. The AR displays are sharper and more immersive, and they already support over two dozen AR experiences, including a few from big names like Lego and Niantic. (Star Wars studio Industrial Light & Magic also has a lens in the works, according to Snap.)
The glasses
To state the obvious, the glasses are massive. Almost comically large. They are significantly wider than my face, and the arms stuck out past the end of my head. A small adapter helped them fit around my ears more snugly, but they still felt like they might slip off my face if I jerked my head suddenly or leaned down.
Still, the new frames look slightly more like actual glasses than the fourth-generation Spectacles, which had a narrow, angular design with dark lenses. The new frames are made of thick black plastic and have clear lenses that are able to darken when you move outside, sort of like transition lenses.
The fifth-generation Spectacles are the first to have clear lenses. (Karissa Bell for Engadget)
The lenses house Snap’s waveguide tech that, along with “Liquid Crystal on Silicon micro-projectors,” enable their AR abilities. Each pair is also equipped with cameras, microphones and speakers.
Inside each arm is a Qualcomm Snapdragon processor. Snap says the dual processor setup has made the glasses more efficient and prevents the overheating issues that plagued their predecessor. The change seems to be an effective one. In my nearly hour-long demo, neither pair of Spectacles I tried got hot, though they were slightly warm to the touch after extended use. (The fifth-generation Spectacles have a battery life of about 45 minutes, up from 30 min with the fourth-gen model.)
Snap’s newest AR Spectacles are extremely thick. (Karissa Bell for Engadget)
Snap has also vastly improved Spectacles’ AR capabilities. The projected AR content was crisp and bright. When I walked outside into the sun, the lenses dimmed, but the content was very nearly as vivid as when I had been indoors. At a resolution of 37 pixels per degree, I wasn’t able to discern individual pixels or fuzzy borders like I have on some other AR hardware.
But the most noticeable improvement from Snap’s last AR glasses is the bigger field of view. Snap says it has almost tripled the field of view from its previous generation of Spectacles, increasing the window of visible content to 46 degrees. Snap claims this is equivalent to having a 100-inch display in the room with you, and my demo felt significantly more immersive than what I saw in 2021.
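As a quick back-of-the-envelope check on that comparison (my own math, assuming the 46 degrees describes the horizontal field of view): a virtual window subtending an angle θ at distance d is roughly 2 × d × tan(θ/2) wide, or about 0.85 × d for θ = 46 degrees. A 100-inch 16:9 screen is around 2.2 meters wide, so the claim works out to a virtual display floating somewhere in the neighborhood of 2.5 to 3 meters away.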
The fourth-generation Spectacles (above) were narrow and not nearly as oversized as the fifth-gen Spectacles (below). (Karissa Bell for Engadget)
It isn’t, however, fully immersive. I still found myself at times gazing around the room, looking for the AR effects I knew were around me. At other points, I had to physically move around my space in order to see the full AR effects. For example, when I tried out a human anatomy demo, which shows a life-sized model of the human body and its various systems, I wasn’t able to see the entire figure at once. I had to move my head up and down in order to view the upper and lower halves of the body.
Snap OS
The other big improvement to the latest Spectacles is the addition of full hand tracking abilities. Snap completely redesigned the underlying software powering Spectacles, now called Snap OS, so the entire user interface is controlled with hand gestures and voice commands.
You can pull up the main menu on the palm of one hand, sort of like Humane’s AI Pin, and you simply tap the corresponding icon to do things like close an app or head back to the lens explorer carousel. There are also pinch and tap gestures to launch and interact with lenses. While Snap still calls these experiences lenses, they look and feel more like full-fledged apps than the AR lens effects you’d find in the Snapchat app.
Lego has a game that allows you to pick up bricks with your hands and build objects. I also tried a mini golf game where you putt a golf ball over an AR course. Niantic created an AR version of its Tamagotchi-like character Peridot, which you can place among your surroundings.
The interface for Snapchat’s AI assistant, MyAI, on Spectacles. (Snap)
You can also interact with Snapchat’s generative AI assistant, MyAI, or “paint” the space around you with AR effects. Some experiences are collaborative, so if two people with Spectacles are in a room together, they can view and interact with the same AR content together. If you only have one pair of Spectacles, others around you can get a glimpse of what you’re seeing via the Spectacles mobile app. It allows you to stream your view to your phone, a bit like how you might cast VR content from a headset to a TV.
The new gesture-based interface felt surprisingly intuitive. I occasionally struggled with lenses that required more precise movements, like picking up and placing individual Lego bricks, but the software never felt buggy or unresponsive.
There are even more intriguing use cases in the works. Snap is again partnering with OpenAI so that developers can create multimodal experiences for Spectacles. “Very soon, developers will be able to bring their [OpenAI] models into the Spectacles experience, so that we can really lean into the more utilitarian, camera-based experiences,” Dominguez says. “These AI models can help give developers, and ultimately, their end customers more context about what’s in front of them, what they’re hearing, what they’re seeing.”
Is AR hardware about to have a moment?
CEO Evan Spiegel has spent years touting the promise of AR glasses, a vision that for so long has felt just out of reach. But if the company’s 2021 Spectacles showed AR glasses were finally possible, the fifth-generation Spectacles feel like Snap may finally be getting close to making AR hardware that’s not merely an experiment.
For now, there are still some significant limitations. The glasses are still large and somewhat unwieldy, for one. While the fifth-gen Spectacles passably resemble regular glasses, it’s hard to imagine walking around with them on in public.
Then again, that might not matter much to the people Snap most wants to reach. As virtual and mixed reality become more mainstream, people have been more willing to wear the necessary headgear in public. People wear their Apple Vision Pro headsets on airplanes, in coffee shops and other public spaces. As Snap points out, its Spectacles, at least, don’t cover your entire face or obscure your eyes. And Dominguez says the company expects its hardware to get smaller over time.
Snap’s fifth-generation Spectacles are its most advanced and ambitious yet. (Karissa Bell for Engadget)
But the company will also likely need to find a way to reduce Spectacles’ price. Each pair reportedly costs thousands of dollars to produce, which helps explain Snap’s current insistence on a subscription model, but it’s hard to imagine even hardcore AR enthusiasts shelling out more than a thousand dollars for glasses that have less than one hour of battery life.
Snap seems well aware of this too. The company has always been upfront with the fact that it’s playing the long game when it comes to AR, and that thinking hasn’t changed. Dominguez repeatedly said that the company is intentionally starting with developers because they are the ones “most ready” for a device like the fifth-gen Spectacles and that Snap intends to be prepared whenever the consumer market catches up.
The company also isn’t alone in finally realizing AR hardware. By all accounts, Meta is poised to show off the first version of its long-promised augmented reality glasses next week at its developer event. Its glasses, known as Orion, are also unlikely to go on sale anytime soon. But the attention Meta brings to the space could nonetheless benefit Snap as it tries to sell its vision for an AR-enabled world.
This article originally appeared on Engadget at https://www.engadget.com/social-media/snaps-fifth-generation-spectacles-bring-your-hands-into-into-augmented-reality-180026541.html?src=rss
California passes landmark regulation to require permission from actors for AI deepfakes
California has passed a landmark AI regulation bill to protect performers’ digital likenesses. On Tuesday, Governor Gavin Newsom signed Assembly Bill 2602, which will go into effect on January 1, 2025. The bill requires studios and other employers to get consent before using “digital replicas” of performers. Newsom also signed AB 1836, which grants similar rights to deceased performers, requiring their estate’s permission before using their AI likenesses.
AB 2602, introduced in April, covers film, TV, video games, commercials, audiobooks, and non-union performing jobs. Deadline notes its terms are similar to those in the contract that ended the 2023 actors’ strike against Hollywood studios. SAG-AFTRA, the film and TV actors’ union that held out for last year’s deal, strongly supported the bill. The Motion Picture Association first opposed the legislation but later switched to a neutral stance after revisions.
The bill mandates that employers can’t use an AI deepfake of an actor’s voice or likeness if it replaces work the performer could have done in person. It also prohibits digital replicas if the actor’s contract doesn’t explicitly state how the deepfake will be used, and it voids any such deals signed when the performer didn’t have legal or union representation.
The bill defines a digital replica as a “computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual that is embodied in a sound recording, image, audiovisual work, or transmission in which the actual individual either did not actually perform or appear, or the actual individual did perform or appear, but the fundamental character of the performance or appearance has been materially altered.”
Meanwhile, AB 1836 expands California’s postmortem right of publicity. Hollywood must now get permission from deceased performers’ estates before using their digital replicas. Deadline notes that exceptions were included for “satire, comment, criticism and parody, and for certain documentary, biographical or historical projects.”
“The bill, which protects not only SAG-AFTRA performers but all performers, is a huge step forward,” SAG-AFTRA chief negotiator Duncan Crabtree-Ireland told The LA Times in late August. “Voice and likeness rights, in an age of digital replication, must have strong guardrails around licensing to protect from abuse, this bill provides those guardrails.”
AB 2602 passed the California State Senate on August 27 with a 37-1 tally. (The lone no vote came from State Senator Brian Dahle, a Republican.) The bill then returned to the Assembly (which passed an earlier version in May) to formalize revisions made during Senate negotiations.
On Tuesday, SAG-AFTRA President Fran Drescher celebrated the passage, which the union fought for. “It is a momentous day for SAG-AFTRA members and everyone else, because the A.I. protections we fought so hard for last year are now expanded upon by California law thanks to the Legislature and Gov. Gavin Newsom,” Drescher said.
This article originally appeared on Engadget at https://www.engadget.com/ai/california-passes-landmark-regulation-to-require-permission-from-actors-for-ai-deepfakes-174234452.html?src=rss
Get one year of Dashlane Premium password manager for only $39
An annual membership to Dashlane’s premium password manager is on sale for just $39, which is a discount of more than $20 and a savings of 35 percent. Just use the code “SEPT35” at checkout. The company says this is in celebration of something called Cyber Security Awareness Month, which actually doesn’t start until October. It’s always nice to see a festive new holiday on the scene.
Anyways, this deal is for the premium plan, which includes unlimited password and passkey storage. Users also get phishing alerts to stay on top of attacks, dark web monitoring and more. The plan even comes with a VPN, which I’ve found particularly useful for watching my stories when in another country.
Dashlane found a place on our list of the best password managers. We admired the robust suite of features and noted that some of these tools, like password storage, are even available with the free plan. We also called out the secure sharing functionality, with password sharing baked right into the system. It’s also available for plenty of platforms, including macOS, iOS, Android, Chrome, Firefox, Safari, Opera and other browsers.
There’s one major platform missing, however, which could be a dealbreaker for some. Dashlane doesn’t support Linux. There’s also the issue of an annual subscription. There’s no telling how much it’ll be next year, though switching password managers isn’t as tough as it used to be. There’s a free plan if the price shoots up too high. The deal ends on September 22.
Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.
This article originally appeared on Engadget at https://www.engadget.com/deals/get-one-year-of-dashlane-premium-password-manager-for-only-39-172851487.html?src=rss
Snap is redesigning Snapchat and adding new AI powers
Since first introducing its generative AI assistant, Snap has been steadily ramping up the amount of AI in its app. Now, the company is adding a new slate of AI-powered features as it begins testing a larger redesign of the app.
Snap often brings new AI features to its Snapchat+ subscribers first, and the company is continuing the trend with a new feature called “My Selfie.” The feature uses selfies to create AI-generated images of users and their friends (if they also subscribe) in creative poses and situations. The company is also rolling out a new “grandparents lens” that uses AI to imagine what you might look like as a senior citizen.
Snapchat+ subscribers will also get access to a new AI feature in Memories, the section that houses users’ previously saved snaps. With the change, Memories will be able to surface photos and videos that have been edited with AI-generated captions or new AR lens effects.
Additionally, Snap is making its ChatGPT-powered MyAI assistant more powerful with the ability to “problem solve” based on photo snaps. The company says the assistant will be able to translate restaurant menus, identify plants and understand parking signs.
The new “simplified” Snapchat design. (Snap)
The new AI capabilities arrive as Snap is starting to test a larger redesign of its app that’s meant to make Snapchat, long criticized for a confusing interface, simpler and more intuitive. The new design will bring conversations between friends and Stories content into a single view, with Stories at the top of conversations. (Interestingly, Snap combined users’ chats and Stories into a single feed once before, in a widely unpopular 2018 redesign.) The redesign will also eliminate a separate tab for the Snap Map, placing it instead in the “chat” tab.
And instead of keeping separate sections for Spotlight and Stories, Snap will combine the two into a single “Watch” feed that will algorithmically recommend content. While the current iteration of Snapchat has five distinct sections, the “simplified” version will have just three, including the camera, which will still be the first screen users see upon launching the app.
Snap has struggled with major redesigns in the past, and the company says it intends to roll out the new look slowly, with only a small number of users getting the update to start.
This article originally appeared on Engadget at https://www.engadget.com/social-media/snap-is-redesigning-snapchat-and-adding-new-ai-powers-171552703.html?src=rss
Final Fantasy 16 players are encountering bugs after PS5 firmware update
PlayStation 5 users received a firmware update (24.06-10.00.00) last week, but there are lots of reports of bugs when playing Final Fantasy 16, such as game crashes while loading saves or fast traveling. The most spectacular bug of all, shared by a Reddit user, was the appearance of a tide of black squares approaching the player and covering much of the screen.
The official Final Fantasy 16 account has made a post on X about the issues. In the post, Square Enix says that it is cooperating with Sony Interactive Entertainment (SIE) to determine the causes. The company also asked gamers to be patient and wait for further updates.
Following the recent release of the PlayStation 5 system update, there have been reports of the game crashing and graphical issues.
We are currently working with SIE to investigate, and sincerely apologize for any inconvenience caused.
Please await our further updates. #FF16
— FINAL FANTASY XVI (@finalfantasyxvi) September 17, 2024
Sony hasn’t made any statements on the issues as of this writing. Players have also posted about running into similar bugs in other games. One example is this bug in the Resident Evil 4 remake. IGN also noted that games like Star Wars Outlaws, Death Stranding and No Man’s Sky had issues after the update.
This article originally appeared on Engadget at https://www.engadget.com/gaming/playstation/final-fantasy-16-players-are-encountering-bugs-after-ps5-firmware-update-163017275.html?src=rss
A Sims movie from Amazon MGM Studios is on the way
The Sims has been one of the biggest success stories in gaming over the last quarter century, with more than 500 million players trying to understand Simlish, learning what WooHoo-ing is and using the classic Rosebud cheat to gain more money. All of that could be coming to a big screen near you, as Electronic Arts has revealed that Amazon MGM Studios is working on a movie adaptation of the games.
Kate Herron (Loki, The Last of Us) will direct the film and co-write the screenplay with Briony Redman (Doctor Who). One of the production companies on board is Margot Robbie’s LuckyChap, which seems appropriate given that EA is looking “to make an impact the size of something like a Barbie movie,” EA vice president and Sims general manager Kate Gorman said. (For the record, Barbie is the 14th-highest-grossing film of all time.)
EA wants the movie to be an authentic experience for fans, particularly given that many people have “love and nostalgia” for the series. To that end, you can expect a lot of Sims lore and Easter eggs in the film.
“There will be Freezer Bunnies,” Gorman told Variety. “I’m sure a pool without a ladder is somewhere in there, but we haven’t finalized any of those details. But that’s the idea, is to say that it lives within this space. It’s a nod to all of the amazing play and creation and fun that people have had over the last 25 years within The Sims.”
Meanwhile, EA provided updates on The Sims franchise as a whole. The company doesn’t currently plan to release The Sims 5, instead opting to focus on updating The Sims 4 and releasing paid expansions for the 10-year-old game. The publisher is also spinning up a creator program and some players who create custom in-game items will be able to sell them as Creator Kits.
While The Sims 4 will remain the core of the series, EA is looking at expanding the franchise in other ways, including with Project Rene, a cross-platform multiplayer experience that the publisher has been talking up for a couple of years. An invite-only playtest is scheduled for this fall.
This article originally appeared on Engadget at https://www.engadget.com/gaming/a-sims-movie-from-amazon-mgm-studios-is-on-the-way-161159048.html?src=rss
Early Prime Day deals include the Echo Show 5 plus a smart light bulb for only $60
The next Amazon Prime Day sales event is set for October, but the early deals have already begun to trickle in. Case in point? There are some nifty discounts on Echo Show smart displays that ship with smart light bulbs.
The Echo Show 5 is available for just $60, which is a discount of $50. This is a great all-around device that easily found a spot on our list of the best smart displays. It’s bare bones, but gets the job done. We appreciated the compact design and the diminutive, yet useful, 5.5-inch screen. The compact size allows the Echo Show 5 to double as one heck of a smart alarm clock.
To that end, there’s an ambient light sensor that adjusts the screen’s brightness automatically, a tap-to-snooze function and a sunrise alarm that slowly brightens the screen for a gentle wake up call. There’s also a camera for video calls and the like, which is great, and privacy concerns are assuaged by the physical camera cover that ships with the display.
Since this is a smart display bundle, Amazon has packed in a Sengled smart light bulb. This is a decent way to learn the ins and outs of making smart home adjustments, but it’s just a colored light bulb. The only major downside with this display is that the speakers are tiny, to suit the rest of the device. This translates to reduced sound quality when compared to rival smart displays.
If you want larger speakers and a larger screen, there’s a similar offer for the Echo Show 8. The smart display is available for $105, which is a discount of $65. It also comes with the aforementioned light bulb.
Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice in the lead up to October Prime Day 2024.
This article originally appeared on Engadget at https://www.engadget.com/deals/early-prime-day-deals-include-the-echo-show-5-plus-a-smart-light-bulb-for-only-60-151010373.html?src=rss
Here’s how Google will start helping you figure out which images are AI generated
Google is trying to be more transparent about whether a piece of content was created or modified using generative AI (GAI) tools. After joining the Coalition for Content Provenance and Authenticity (C2PA) as a steering committee member earlier this year, Google has revealed how it will start implementing the group’s digital watermarking standard.
Alongside partners including Amazon, Meta, and OpenAI, Google has spent the past several months figuring out how to improve the tech used for watermarking GAI-created or modified content. The company says it helped to develop the latest version of Content Credentials, a technical standard used to protect metadata detailing how an asset was created, as well as information about what has been modified and how. Google says the current version of Content Credentials is more secure and tamperproof due to stricter validation methods.
In the coming months, Google will start to incorporate the current version of Content Credentials into some of its main products. In other words, it should soon be easier to tell whether an image was created or modified using GAI in Google Search results. If an image that pops up has C2PA metadata, you should be able to find out what impact GAI had on it via the About this image tool. This is also available in Google Images, Lens and Circle to Search.
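For a sense of what that lookup involves under the hood, here’s a simplified, illustrative sketch of checking a C2PA-style manifest for a generative AI signal. It uses a hand-written dictionary standing in for a real (cryptographically signed) manifest and the IPTC “trained algorithmic media” source type; it is not Google’s implementation or any particular C2PA library:

```python
# Illustrative only: scans a simplified C2PA-style manifest for evidence that an
# asset was created or edited with generative AI. Real Content Credentials are
# signed, embedded structures; this dictionary just mirrors the relevant fields.

GENAI_SOURCE_TYPE = (
    "http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"
)

def used_generative_ai(manifest: dict) -> bool:
    """Return True if any recorded action carries the generative AI source type."""
    for assertion in manifest.get("assertions", []):
        if assertion.get("label") != "c2pa.actions":
            continue
        for action in assertion.get("data", {}).get("actions", []):
            if action.get("digitalSourceType") == GENAI_SOURCE_TYPE:
                return True
    return False

sample_manifest = {
    "assertions": [{
        "label": "c2pa.actions",
        "data": {"actions": [
            {"action": "c2pa.created", "digitalSourceType": GENAI_SOURCE_TYPE},
        ]},
    }]
}
print(used_generative_ai(sample_manifest))  # True
```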
The company is looking into how to use C2PA to tell YouTube viewers when footage was captured with a camera. Expect to learn more about that later this year.
Google also plans to use C2PA metadata in its ads systems. It didn’t reveal many details about its plans there, other than to say it will use “C2PA signals to inform how we enforce key policies” and that it will do so gradually.
Of course, the effectiveness of this all depends on whether companies such as camera makers and the developers of GAI tools actually use the C2PA watermarking system. The approach isn’t going to stop someone from stripping out an image’s metadata either. That could make it harder for systems such as Google’s to detect any GAI usage.
Meanwhile, throughout this year, we’ve seen Meta wrangle over how to disclose whether images were created with GAI across Facebook, Instagram and Threads. The company just changed its policy to make labels less visible on images that were edited with AI tools. Starting this week, if C2PA metadata indicates that someone (for instance) used Photoshop’s GAI tools to tweak a genuine photo, the “AI info” label no longer appears front and center. Instead, it’s buried in the post’s menu.
This article originally appeared on Engadget at https://www.engadget.com/ai/heres-how-google-will-start-helping-you-figure-out-which-images-are-ai-generated-150219272.html?src=rss