Month: August 2024

Apple Immersive Series ‘Elevated’ Coming to Vision Pro on September 6

Apple’s latest immersive video series for the Vision Pro is set to launch in early September. “Elevated,” a new 3D experience, will be available starting on September 6. A teaser for the show, listing its upcoming release date, appears in the Apple Immersive Video section on the Vision Pro.

Apple first announced Elevated in July, but did not provide launch timing for the show. According to Apple, the aerial travel series “whisks viewers around iconic vistas from staggering heights” that include volcanoes, waterfalls, and more in Hawaii.

Gain an all-new perspective as you take aerial tours of the world’s most remarkable landscapes, led by well-known guides with a special connection to each place.

While the first episode is set in Hawaii, the second episode, coming at a later date, will feature an autumn scene in New England. Apple is working on multiple new immersive video experiences that are rolling out this year.

Boundless, a series that launched in July, features a first episode with a hot air balloon ride. A second upcoming episode will focus on Arctic Surfing. Apple is planning to release a new immersive performance from The Weeknd, a big wave surfing sports series, a behind-the-scenes view of the 2024 NBA All-Star Weekend, and its first immersive short film called “Submerged.”

Vision Pro users can watch the Apple Immersive video content from the Apple TV app in Australia, Canada, Hong Kong, France, Germany, Japan, Singapore, the U.K., and the U.S. Users in China can watch the content through the Migu Video and Tencent Video apps.

Related Roundups: visionOS, visionOS 2
Related Forum: Apple Vision Pro

This article, “Apple Immersive Series ‘Elevated’ Coming to Vision Pro on September 6” first appeared on MacRumors.com


D&D Publisher Walks Back Controversial Changes To Online Tools

The Verge’s Ash Parrish reports: Last week, as a part of the updates to Dungeons & Dragons Fifth Edition — collectively known as the 2024 revision — the publisher announced that it would update D&D Beyond, the tabletop RPG’s official digital toolkit that players use to reference content and create characters using a host of official and third-party sources. The update would add the new 2024 rulebooks to the toolkit, mark outdated content with a “legacy” badge, and change players’ character sheets to reflect all the new rules and features.

That last part is critical to understanding why some D&D players (including my own dungeon master) spent the last 72 hours in a state of panic. Though some of the 2024 revisions are essentially cosmetic in nature — for example, “races” will be updated to “species” — other updates like the ones to weapons, spells, and magic items fundamentally alter the game. Wizards of the Coast would have essentially overwritten every user’s character sheet with the new information whether they wanted it or not. “All entries for mundane and magical items, weapons, armor, and spells will also be updated to their 2024 version,” Wizards said in its initial announcement. The publisher did say that players would have the option to continue to use the 2014 version of spells and magic items. But doing so requires using the game’s homebrew rules, which aren’t known for being user-friendly.

Thankfully, Wizards of the Coast isn’t in the car business, and after a weekend of backlash on social media, the company will no longer force the new changes on players. “We misjudged the impact of this change, and we agree that you should be free to choose your own way to play,” Wizards said in its latest announcement. Current character sheets will only be updated with new terminology, while the older versions of spells, magic items, and weapons will be preserved. Also, players who have access to both the 2014 and 2024 digital versions will have the option to use both when creating new characters.
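The preserve-both-versions behavior Wizards describes can be sketched as a small migration step. This is a hypothetical illustration; the field names and structure are assumptions, not the real D&D Beyond data model.

```python
# Hypothetical sketch of the revised migration: apply the 2024 terminology
# in place, but keep the 2014 and 2024 versions of each spell side by side
# instead of overwriting the legacy entries.

def migrate_character(sheet: dict, revised_spells: dict) -> dict:
    migrated = dict(sheet)
    if "race" in migrated:
        # Cosmetic terminology change: "race" becomes "species".
        migrated["species"] = migrated.pop("race")
    # Preserve the 2014 entry alongside the 2024 revision (if one exists).
    migrated["spells"] = {
        name: {"2014": spell, "2024": revised_spells.get(name, spell)}
        for name, spell in sheet.get("spells", {}).items()
    }
    return migrated

sheet = {"race": "Elf", "spells": {"Shield": {"duration": "1 round"}}}
revised = {"Shield": {"duration": "until the start of your next turn"}}
migrated = migrate_character(sheet, revised)
```

The key design point is that the 2014 data is never destroyed, so players can keep using the older rules while new characters can opt into either version.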

Read more of this story at Slashdot.


Telegram’s CEO has taken a hands-off approach for years — now his luck might have run out


Image: Cath Virginia / The Verge, Getty Images

Telegram doesn’t have the moderation of most social networks or the privacy of a true encrypted messaging app. That could leave its operator in hot water.

Telegram CEO Pavel Durov’s arrest in France on Saturday took the tech world by surprise. The 39-year-old Russian-born billionaire was detained after touching down at an airport outside of Paris in his private plane. And with scant detail available, observers wondered what the unprecedented action meant for free speech, encryption, and the risks of running a platform that could be used for crime.

On Monday, French officials revealed that Durov is being questioned as part of a wide-ranging criminal investigation surrounding crimes that regularly happen on Telegram. While some of the accusations could still raise red flags, many seem to concern serious offenses — like child abuse and terrorism — that Durov would reasonably have been aware of. But many questions remain unanswered, including how worried other tech executives should be.

Crime happens on lots of platforms. Why does Telegram stand out?

Telegram is a messaging app that was founded in 2013 by brothers Pavel and Nikolai Durov. While it’s sometimes portrayed as an “encrypted chat app,” it’s mostly popular as a semi-public communication service like Discord, particularly in countries like Russia, Ukraine, Iran, and India.

It’s a massive platform that is used by millions of innocent people every day, but it’s also gained a reputation for being a safe haven for all sorts of criminals, from scammers to terrorists.

Pavel Durov has crafted a brash pro-privacy persona in public. In an interview with Tucker Carlson this year, Durov gave examples of times that Telegram has declined to hand over data to governments: when Russia asked for information on protesters, for instance, and when US lawmakers requested data on participants in the January 6th Capitol riot. Earlier, at a 2015 TechCrunch event, Durov said that Telegram’s commitment to privacy was “more important than our fear of bad things happening, like terrorism.”

That sentiment isn’t radically out of step with what many encryption proponents believe, since strong encryption must protect all users. A “backdoor” targeting one guilty party compromises everyone’s privacy. In apps like Signal or iMessage, which use end-to-end encryption, nobody but the sender and recipient can read a message’s contents. But as experts have pointed out, Telegram doesn’t implement this in any meaningful sense. End-to-end encryption has to be enabled manually for one-on-one messaging, and it doesn’t work for group chats or public channels where illegal activity occurs in plain view.
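A toy sketch of the end-to-end property described above, assuming a one-time pad for simplicity (real apps like Signal use far more sophisticated schemes such as the Double Ratchet): the relaying server sees only ciphertext, and only the two parties holding the shared key can recover the message.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte with a single-use random key. Applying
    # the same key twice recovers the message, so only the two key
    # holders can read it.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))  # shared out of band by both parties
ciphertext = xor_cipher(message, key)    # all an intermediary server sees
recovered = xor_cipher(ciphertext, key)  # only the recipient can do this
```

Production end-to-end protocols additionally handle key exchange, authentication, and forward secrecy; the point of the sketch is only that the platform in the middle never holds the key, which is exactly the position Telegram is not in for its default chats.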

“Telegram looks much more like a social network that is not end-to-end encrypted,” John Scott-Railton, senior researcher at Citizen Lab, told The Verge. “And because of that, Telegram could potentially moderate or have access to those things, or be compelled to.”

The ecosystem of extremist activity on the platform is so well-known that it even has a nickname: “terrorgram.” And much of it happens in the open where Telegram could identify or remove it.

Telegram does occasionally take action on illegal content. The platform has blocked extremist channels after reports from the media and revealed users’ IP addresses in response to government requests, and an official Telegram channel called “Stop Child Abuse” claims that the platform blocks more than 1,000 channels engaged in child abuse every day in response to user reports.

But there have been numerous reports of lax moderation on Telegram, with its general approach being frequently described as “hands off” compared to competitors like Facebook (which still struggles to effectively moderate its own massive platform). Even when Telegram does take action, reporters previously discovered that the service may only hide the offending channels rather than block them.

All of this puts Telegram in a unique position. It’s not taking a significantly active role in preventing use of its platforms by criminals, the way most big public social networks do. But it’s not disavowing its role as a moderator, either, the way a truly private platform could. “Because Telegram does have this access, it puts a target on Durov for governmental attention in a way that would not be true if it really were an encrypted messenger,” said Scott-Railton.

Why was Pavel Durov arrested? And why were other tech executives upset?

According to a statement by French prosecutor Laure Beccuau, Durov is being questioned as part of an investigation on Telegram-related crimes, which was opened on July 8th.

The listed charges include “complicity” in crimes ranging from possessing and distributing child sexual abuse material to selling narcotics and money laundering. Durov is also being investigated for refusing to comply with requests to enable “interceptions” from law enforcement and for importing and providing an encryption tool without declaring it. (While encrypted messaging is legal in France, anyone importing the tech has to register with the government.) He’s also accused of “criminal association with a view to committing a crime” punishable by more than five years in prison. The statement added that Durov’s detainment could last 96 hours, until Wednesday, August 28th.

When Durov was first taken into custody, though, these details weren’t available — and prominent tech executives immediately rallied to his defense. X owner Elon Musk posted “#FreePavel” and captioned a post referencing Durov’s detention with “dangerous times,” framing it as an attack on free speech. Chris Pavlovski, CEO of Rumble — a YouTube alternative popular with right-wingers — said on Sunday that he had “just safely departed from Europe” and that Durov’s arrest “crossed a red line.”

Durov’s arrest comes amid a heated debate over the European Commission’s power to hold tech platforms responsible for their users’ behavior. The Digital Services Act, which took effect last year, has led to investigations into how tech companies handle terrorism and disinformation. Musk has recently been sparring with EU Commissioner Thierry Breton over what Breton characterizes as a reckless failure to moderate X.

Over the weekend, the public response was strong enough that French President Emmanuel Macron issued a statement saying that the arrest took place as part of an ongoing investigation and was “in no way a political decision.” Meanwhile, Telegram insisted that it had “nothing to hide” and that it complied with EU laws. “It is absurd to claim that a platform or its owner are responsible for abuse of that platform,” the company’s statement said.

Is the panic around Durov’s arrest justified?

With the caveat that the situation is still evolving, it seems like free speech is not the core issue — Durov’s alleged awareness of crimes is.

In posts on X, University of Lorraine law professor Florence G’sell noted that the most serious charges against Durov are the ones alleging direct criminal conspiracy and a refusal to cooperate with the police. By contrast, the charges around declaring encryption tech for import seem like minor offenses. (Notably, in the United States, certain import / export controls on encryption have been found to be violations of the First Amendment.) G’sell noted that there are still unknowns surrounding which criminal codes Durov could be charged under but that the key issue seems to be knowingly providing tech to criminals.

Arguably, Telegram has long operated on a knife-edge by attracting privacy-minded users — including a subset of drug dealers, terrorists, and child abusers — without implementing the kind of robust, widespread encryption that would indiscriminately protect every user and the platform itself. If child abuse or terrorism is happening in clear view, platforms have a clear legal responsibility to moderate that content.

That’s true in the US as well as in Europe. Daphne Keller, platform regulation director of the Stanford Cyber Policy Center, called Durov’s arrest “unsurprising” in X posts and said it could happen under the US legal system, too. Failing to remove child abuse material or terrorist content “could make a platform liable in most legal systems, including ours,” she wrote. Section 230, which provides a broad shield for tech platforms, notably doesn’t immunize operators from federal criminal charges.

That said, there are still many unknowns with Durov’s arrest, and there may be further developments that justify some of the concern over implications for encryption tech. References to lawful “interceptions” — a term that typically refers to platforms facilitating surveillance of users’ communications — are particularly worrying here.

European and US police have increasingly targeted encrypted chat platforms used by criminals in recent years, hacking a platform called EncroChat and even going as far as to secretly run an encrypted phone company called Anom. Notably, those platforms were focused on serving criminals. Telegram, on the other hand, is aimed at the general public. In his interview with Carlson, Durov claimed that at one point, the FBI — which played a key role in the Anom operation — attempted to convince Telegram to include a surveillance backdoor.

“This case definitely illustrates — whatever you think about the quality of Telegram’s encryption — how many people care about the ability to communicate safely and privately with each other,” said Scott-Railton.

Durov’s arrest also raises the question of what should push a platform into legal liability. Serious crimes certainly occur on Facebook and nearly every other massive social network, and in at least some cases, somebody at the company was warned and failed to take action. It’s possible Durov was clearly, directly involved in a criminal conspiracy — but short of that, how ineffectual can a company’s moderation get before its CEO is detained the next time they set foot on European soil?


Apple Now Lets Users Transfer Apple Music Playlists to YouTube Music

Apple Music subscribers who want to transfer their playlists over to YouTube Music can now use Apple’s Data and Privacy page to do so, Apple says in a new support document.

After logging in on the Data and Privacy page, ‌Apple Music‌ users can select the “Transfer a copy of your data” option to move playlists from ‌Apple Music‌ to YouTube Music. YouTube Music is the only supported music service for playlist transfers, and Apple does not offer options to transfer to services like Spotify at this time.

To make a transfer, an active ‌Apple Music‌ subscription or iTunes Match subscription and an active YouTube Music account are required. Playlists that are transferred are not deleted from ‌Apple Music‌, and the transfer process can take anywhere from a few minutes to a few hours depending on how many playlists are being transferred.

Apple has details on what can be transferred and what can’t:

Only the playlists that you’ve created (including collaborative playlists that you own) are transferred
Music files aren’t transferred
Non-collaborative shared playlists and curated playlists aren’t transferred
Folders in which you’ve organized your ‌Apple Music‌ playlists aren’t transferred
Playlists can include only songs available on YouTube Music. If your playlist contains other audio files, such as podcasts, audiobooks, or user-uploaded audio files, they won’t be transferred.
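Those eligibility rules can be expressed as a small filter. The Playlist structure and field names below are assumptions for illustration, not Apple’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Playlist:
    name: str
    owned: bool                   # created by the user, or a collaborative playlist they own
    curated: bool = False         # Apple-curated playlists are excluded
    tracks: list = field(default_factory=list)  # (title, kind) pairs

def transferable(p: Playlist) -> bool:
    # Only user-owned, non-curated playlists are moved; folders are not.
    return p.owned and not p.curated

def transferable_tracks(p: Playlist) -> list:
    # Podcasts, audiobooks, and user-uploaded files are skipped.
    return [title for title, kind in p.tracks if kind == "song"]

mine = Playlist("Road Trip", owned=True,
                tracks=[("Song A", "song"), ("Episode 1", "podcast")])
apple_mix = Playlist("New Music Daily", owned=False, curated=True)
```

The practical takeaway matches Apple’s list: the transfer copies your own song playlists and silently drops everything else.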

After the transfer process has been completed, playlists from ‌Apple Music‌ will appear in the Library tab in YouTube Music. If a song is missing, it might not be available on YouTube Music.

This article, “Apple Now Lets Users Transfer Apple Music Playlists to YouTube Music” first appeared on MacRumors.com


Echelon raises $3.5 million in seed funding to advance DeFi lending on Move-based blockchains

Echelon, a decentralized lending protocol, has raised $3.5 million in seed funding to advance DeFi lending on Move-based blockchains. The funding round was led by Amber Group, with participation from Laser Digital, Saison Capital, Selini Capital, Interop Ventures, and Re7.
The post Echelon raises $3.5 million in seed funding to advance DeFi lending on Move-based blockchains first appeared on Tech Startups.


AT&T Hit With $950,000 Fine for 2023 911 Outage

AT&T will pay a $950,000 fine for failing to notify 911 call centers about a service outage that occurred in 2023, the Federal Communications Commission (FCC) said this week. There was an AT&T outage in Illinois, Kansas, Texas, and Wisconsin in August 2023, and AT&T was penalized both for failing to deliver 911 calls and for not notifying call centers in a timely manner.

The outage happened when AT&T was testing its 911 network. A technician accidentally disabled a portion of the network, and AT&T's system did not adjust to route around the disabled portion. The test was not conducted as part of planned maintenance and did not receive a stringent technical review. During the outage, which lasted a little over an hour, more than 400 911 calls failed to go through.

In addition to paying a $950,000 fine, AT&T has implemented a three-year compliance plan to make sure that it does not violate the FCC’s 911 and outage notification rules going forward.

Service providers like AT&T are required to let call centers know about an outage right away so that the public can be notified about alternative ways to get emergency assistance.

It's been a bad year for AT&T. In March, AT&T confirmed that a 2021 data leak included passcodes and sensitive information belonging to 7.6 million current and 65.4 million former AT&T customers. AT&T said the data was not obtained through unauthorized access to its systems, but hackers were nonetheless able to get customers' names, addresses, birth dates, phone numbers, Social Security numbers, and more.

In April, the company was fined over $57 million for illegally sharing customer data with third-party data aggregators, an issue that Verizon and T-Mobile also had to shell out money for.

In July, AT&T announced a second major data breach. Hackers were able to get into a cloud platform used by AT&T, and stole the records of “nearly all” of its cellular customers. The stolen data included the phone numbers of cellular and landline customers, as well as records of calls and text messages between May and October 2022.

This week, AT&T is also in mediation in hopes of resolving an ongoing Communications Workers strike in the southeast, which has impacted service in some areas and involves 17,000 employees.

Tag: AT&T

This article, "AT&T Hit With $950,000 Fine for 2023 911 Outage" first appeared on MacRumors.com


Self-storage rooftops will become a nationwide 100MW+ solar farm

Electrek reports that a solar energy company is renting 8.5 million square feet of roof space from the National Storage Affiliates Trust’s (NSA) buildings for its newest solar panel project.
The commercial and community solar developer Solar Landscape's new rooftop solar grid, spread across the NSA's 1,052 self-storage facilities and properties in 42 states and Puerto Rico, is expected to produce at least 100 megawatts of solar capacity. The NSA, headquartered in Greenwood Village, Colorado, is one of the nation's largest self-storage operators, with brands like iStorage, Move It, Northwest and SecurCare.
These panels won't just generate power for the NSA's facilities; they will also provide clean power to nearby businesses and homes at a discounted price.
One of the challenges of implementing solar energy is finding enough space for the panels. Large installations can require enormous footprints: the Noor Abu Dhabi solar plant, which set a world record in 2019, uses 3.2 million panels covering more than 3 square miles.
Solar Landscape and the NSA may have found an interesting solution to that space problem. If the partnership is successful, it could inspire similar deals for other communities looking to benefit from solar power technology.

This article originally appeared on Engadget at https://www.engadget.com/science/self-storage-rooftops-will-become-a-nationwide-100mw-solar-farm-223004138.html?src=rss


Kamala Harris and Tim Walz Interview to Air on CNN on Thursday

The joint interview, airing at 9 p.m. Eastern, is the first time the vice president will face sustained questions from a journalist since President Biden withdrew from the race.


Gen Z and Millennials Are Hung Up On Answering the Phone

A quarter of young adults aged 18-34 never answer phone calls, according to a recent Uswitch survey. The study reveals a generational shift in communication preferences, with 70% of respondents in this age group favoring text messages over voice calls. Experts attribute this trend to the rise of mobile technology and social media. While avoiding calls, younger generations maintain constant contact through group chats and social media platforms. Voice notes have emerged as a compromise, with 37% of 18-34 year-olds preferring them to traditional calls. This communication shift extends to the workplace, causing challenges for some employers.

Read more of this story at Slashdot.

