Author: abubakar
The Cycling Champion Who Doesn’t Have to Win to Be Satisfied
Ashton Lambie won a world championship in individual pursuit 16 months ago. Winning was great, he said, but now he just wants to explore.
M.L.S. Preview: St. Louis S.C., Apple TV+ and More
St. Louis City S.C. will hit the field as the league’s 29th franchise, but to watch it, and every other team, fans will have to get to know Apple TV+.
App Store, Apple Music, Apple TV+ and More Experiencing Outage [Update: Fixed]
Several Apple services appear to be experiencing issues at the current time, according to Apple’s System Status page. Outages are impacting the App Store, Apple Books, Apple Music, Apple TV+, Apple TV Channels, the Mac App Store, Podcasts, TestFlight, Messages, Apple Fitness+, Find My, Game Center, iCloud Mail, App Store Connect, and more.
According to Apple, the problem has been ongoing since 4:03 p.m. Eastern Time, and there is no word on when all of the services will be back up and running. Apple says that the impacted services may be slow or unavailable for some users.
Update: According to Apple’s System Status page, all services are once again operational.
Valve used secret memory access “honeypot” to detect 40K Dota 2 cheaters
Publisher is publicizing its methods to send a message to would-be exploit users.
The cat-and-mouse battle between game makers and cheat makers has seen plenty of inventive twists and turns over the years. Even amid that backdrop, though, Dota 2 stands out for a recently revealed “honeypot” trap hidden inside the game’s memory buffer.
In a blog post this week, Valve revealed the existence of this trap, which was released as part of an earlier update to the game. Valve says that update included “a section of data inside the game client that would never be read during normal gameplay.” But that memory could be read by third-party cheat tools that used exploits to sniff out (and share) internal data normally invisible to players.
To activate its honeypot trap, all Valve had to do was watch for any accounts that tried to read from that “secret” memory area, an event that would lead to “extremely high confidence that every ban was well-deserved,” according to Valve.
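Valve’s post doesn’t say how those reads were detected, so the sketch below is only a hypothetical illustration of the general honeypot idea, not Valve’s implementation: on Windows, a client could mark a decoy buffer with a guard page so that any read of memory normal gameplay never touches raises an exception it can log and report.

```cpp
// Hypothetical sketch only -- Valve has not published its detection code.
// Idea: protect a decoy page with PAGE_GUARD; any read of it faults, and a
// vectored exception handler records the access before execution resumes.
#include <windows.h>
#include <cstdio>

static char* g_decoy = nullptr;            // "secret" buffer no game code reads
static const SIZE_T kDecoySize = 4096;

LONG CALLBACK GuardHandler(EXCEPTION_POINTERS* info) {
    if (info->ExceptionRecord->ExceptionCode == STATUS_GUARD_PAGE_VIOLATION) {
        char* addr = reinterpret_cast<char*>(
            info->ExceptionRecord->ExceptionInformation[1]);
        if (addr >= g_decoy && addr < g_decoy + kDecoySize) {
            // Normal gameplay never touches this page, so any access is a
            // strong cheat signal; a real client would report it server-side.
            std::printf("decoy region read at %p\n", static_cast<void*>(addr));
        }
        // The fault consumes the guard bit; a real implementation would
        // re-arm it. Either way, let the faulting instruction continue.
        return EXCEPTION_CONTINUE_EXECUTION;
    }
    return EXCEPTION_CONTINUE_SEARCH;
}

int main() {
    g_decoy = static_cast<char*>(VirtualAlloc(
        nullptr, kDecoySize, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE));
    DWORD oldProtect;
    VirtualProtect(g_decoy, kDecoySize, PAGE_READWRITE | PAGE_GUARD, &oldProtect);
    AddVectoredExceptionHandler(1, GuardHandler);

    volatile char leaked = g_decoy[0];     // simulate an injected cheat reading it
    (void)leaked;
    return 0;
}
```

Note that a guard page only catches in-process reads from injected cheats; an external scanner using something like ReadProcessMemory wouldn’t trip it, which is why server-side checks of what data a client acts on are another common approach.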
‘Cocaine Bear’ review: A wild true story becomes a bonkers comedy, and yet…
There are few films that can deliver as succinctly and accurately on their title as Cocaine Bear does. There is a bear. She does cocaine. Hence, the Cocaine Bear of it all. But while the movie comes through on its gloriously stupid title, it still left me craving something extra. Sure, it might be strange to say that I wanted more from a movie called Cocaine Bear, but I expect a lot out of any film that promises a drug-fueled animal rampage — especially one directed by Elizabeth Banks and produced by Phil Lord and Chris Miller.
If all you need out of Cocaine Bear is the greatest hits from its trailer, then the movie is fun enough. Yes, a bear gets into a stash of cocaine, and yes, she proceeds to violently murder anyone who gets between her and her beloved drugs. The resulting film is goofy and gory, yet also strangely lacking, as Cocaine Bear spreads itself too thin with an ensemble cast of humans who can’t hold a candle to Cocaine Bear herself.
Cocaine Bear is a wild ride based on a true story.
As bananas as it sounds, Cocaine Bear is rooted in fact. In 1985, drug smuggler Andrew Thornton (Matthew Rhys) dropped millions of dollars worth of cocaine from a plane flying over Georgia. A black bear in Chattahoochee National Forest found said cocaine and ingested it, dying from an overdose. Cocaine Bear imagines what would have happened had that bear not died immediately but instead embarked upon a cocaine-fueled killing spree.
It wouldn’t be a killing spree without some people to terrorize, and Cocaine Bear introduces a slew of characters who most certainly did not expect their day at Chattahoochee National Forest to involve a bear on cocaine. Among them is concerned mother Sari (Keri Russell), who is looking for her daughter Dee Dee (Brooklynn Prince, The Florida Project) and Dee Dee’s friend Henry (Christian Convery) after learning they’ve skipped school. Sari teams up with Ranger Liz (Margo Martindale) and animal rights activist Peter (Jesse Tyler Ferguson) to help track them down. Then there are drug dealers Eddie (Alden Ehrenreich) and Daveed (O’Shea Jackson Jr.), sent by drug smuggler Syd Dentwood (the late, great Ray Liotta) to collect the lost cocaine. Hot on their trail is police officer Bob (Isiah Whitlock Jr.). It’s not long before they all cross paths with Cocaine Bear.
Cocaine Bear is a star.
Whether she’s ramming down doors or snorting a line off a severed leg, there’s no doubt that Cocaine Bear (aka Cokey) is a cinematic beast. Her luscious fur and cocaine-smeared muzzle position her right at the intersection of adorable forest creature and terrifying killing machine. At one point, after tearing a hiker to shreds and letting out a magnificent roar, Cokey takes a second to admire a passing butterfly. The duality of bear!
Speaking of duality, Banks nails the simultaneous ridiculousness and terror inherent to the concept of a “cocaine bear” in each of the film’s genre-bending attack scenes. A tense standoff in a park visitor center turns Cokey into the stuff of slasher nightmares. When charging after an ambulance, Cokey displays athletic skills right out of a high-octane action movie. Yet Cokey can get a little silly too, scooting along the ground on her back or cuddling up to a terrified human who should be honored to be within spitting distance of this bear icon. In fact, we should all be honored to be watching the birth of a new horror legend, as Cokey lights up the screen whenever she’s around. There’s just one problem…
Cocaine Bear doesn’t have enough Cocaine Bear.
Cocaine Bear is full of nicely committed performances, with special shout-outs to Ehrenreich and Jackson Jr. for their buddy dynamic, and to Convery for his wired turn as a kid who may have done some cocaine. However, there are so many disparate characters that Cocaine Bear ends up feeling more scattered than a stash of cocaine in the Georgia mountains.
Banks and writer Jimmy Warden do their best to ground their ensemble with emotionally resonant backstories, such as Eddie mourning his dead wife or Dee Dee and Sari arguing about Sari’s new boyfriend. However, these plotlines clash with the otherwise wild tone of Cocaine Bear, and in the film’s rush to resolve all of them neatly, they lose their potency.
But Cocaine Bear’s greatest sin of all is that in spending so much time with these stories, it robs us of precious time with its titular beast. After all, I did not come to the movies to see Cocaine Human. I came to the movies to see Cocaine Bear. Banks and Warden really only show us Cokey through the eyes of its human characters, a move that helps give her a wee bit of Jaws-esque mystique but squanders the concept’s comedic potential. Would you believe we never see Cokey discover cocaine for the first time? What kind of insanity would the film have been able to wring out of that first high? And what is the world like through the eyes of a coked-out bear? Cocaine Bear keeps its star at too much of a distance for us to truly tell, and that’s a shame.
This isn’t to say that Cocaine Bear doesn’t deliver its fair share of laughs and thrills. It has genuinely laugh-out-loud moments — Ehrenreich’s delivery of “A BEAR did COCAINE!” is immaculate — and its best kills will leave you hooting and hollering with glee. But as fun and stupid as Cocaine Bear is, you can’t help but think while watching that it could have been even more fun and even more stupid. Instead, the movie goes into cruise control between bear-centric set pieces, almost as if it got too high on its own premise to push itself to truly gonzo heights. Cocaine Bear herself is an instant legend; I just wish I could say the same about her star vehicle.
Cocaine Bear hits theaters Feb. 24.
Section 230, the internet law the Supreme Court could change, explained
The pillar of internet free speech seems to be everyone’s target.
You may have never heard of it, but Section 230 of the Communications Decency Act is the legal backbone of the internet. The law was created almost 30 years ago to protect internet platforms from liability for many of the things third parties say or do on them.
Decades later, it’s never been more controversial. People from both political parties and all three branches of government have threatened to reform or even repeal it. The debate centers on whether we should reconsider a law from the internet’s infancy that was meant to help struggling websites and internet-based companies grow. After all, these internet-based businesses are now some of the biggest and most powerful in the world, and users’ ability to speak freely on them carries much bigger consequences.
While President Biden pushes Congress to pass laws to reform Section 230, its fate may lie in the hands of the judicial branch, as the Supreme Court is considering two cases — one involving YouTube and Google, another targeting Twitter — that could significantly change the law and, therefore, the internet it helped create.
Section 230 says that internet platforms hosting third-party content are not liable for what those third parties post (with a few exceptions). That third-party content could include things like a news outlet’s reader comments, tweets on Twitter, posts on Facebook, photos on Instagram, or reviews on Yelp. If a Yelp reviewer were to post something defamatory about a business, for example, the business could sue the reviewer for libel, but thanks to Section 230, it couldn’t sue Yelp.
Without Section 230’s protections, the internet as we know it today would not exist. If the law were taken away, many websites driven by user-generated content would likely go dark. A repeal of Section 230 wouldn’t just affect the big platforms that seem to get all the negative attention, either. It could affect websites of all sizes and online discourse.
Section 230’s salacious origins
In the early ’90s, the internet was still in its relatively unregulated infancy. There was a lot of porn floating around, and anyone, including impressionable children, could easily find and see it. This alarmed some lawmakers. In an attempt to regulate this situation, in 1995 lawmakers introduced a bipartisan bill called the Communications Decency Act, which would extend laws governing obscene and indecent use of telephone services to the internet. This would also make websites and platforms responsible for any indecent or obscene things their users posted.
In the midst of this was a lawsuit between two companies you might recognize: Stratton Oakmont and Prodigy. The former is featured in The Wolf of Wall Street, and the latter was a pioneer of the early internet. But in 1994, Stratton Oakmont sued Prodigy for defamation after an anonymous user claimed on a Prodigy bulletin board that the financial company’s president engaged in fraudulent acts. The court ruled in Stratton Oakmont’s favor, saying that because Prodigy moderated posts on its forums, it exercised editorial control that made it just as liable for the speech on its platform as the people who actually made that speech. Meanwhile, Prodigy’s rival online service, CompuServe, was found not liable for a user’s speech in an earlier case because CompuServe didn’t moderate content.
Fearing that the Communications Decency Act would stop the burgeoning internet in its tracks, and mindful of the Prodigy decision, then-Rep. (now Sen.) Ron Wyden and Rep. Chris Cox authored an amendment to the CDA that said “interactive computer services” were not responsible for what their users posted, even if those services engaged in some moderation of that third-party content.
“What I was struck by then is that if somebody owned a website or a blog, they could be held personally liable for something posted on their site,” Wyden told Vox’s Emily Stewart in 2019. “And I said then — and it’s the heart of my concern now — if that’s the case, it will kill the little guy, the startup, the inventor, the person who is essential for a competitive marketplace. It will kill them in the crib.”
As the beginning of Section 230 says: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” These are considered by some to be the 26 words that created the internet, but the law says more than that.
Section 230 also allows those services to “restrict access” to any content they deem objectionable. In other words, the platforms themselves get to choose what is and what is not acceptable content, and they can decide to host it or moderate it accordingly. That means the free speech argument frequently employed by people who are suspended or banned from these platforms — that their Constitutional right to free speech has been violated — doesn’t apply. Wyden likens the dual nature of Section 230 to a sword and a shield for platforms: They’re shielded from liability for user content, and they have a sword to moderate it as they see fit.
The Communications Decency Act was signed into law in 1996. The indecency and obscenity provisions about transmitting porn to minors were immediately challenged by civil liberty groups and struck down by the Supreme Court, which said they were too restrictive of free speech. Section 230 stayed, and so a law that was initially meant to restrict free speech on the internet instead became the law that protected it.
This protection has allowed the internet to thrive. Think about it: Websites like Facebook, Reddit, and YouTube have millions and even billions of users. If these platforms had to monitor and approve every single thing every user posted, they simply wouldn’t be able to exist. No website or platform can moderate at such an incredible scale, and no one wants to open themselves up to the legal liability of doing so. On the other hand, a website that didn’t moderate anything at all would quickly become a spam-filled cesspool that few people would want to swim in.
That doesn’t mean Section 230 is perfect. Some argue that it gives platforms too little accountability, allowing some of the worst parts of the internet to flourish. Others say it allows platforms that have become hugely influential and important to suppress and censor speech based on their own whims or supposed political biases. Depending on who you talk to, internet platforms are either using the sword too much or not enough. Either way, they’re hiding behind the shield to protect themselves from lawsuits while they do it. Though it has been a law for nearly three decades, Section 230’s existence may have never been as precarious as it is now.
The Supreme Court might determine Section 230’s fate
Justice Clarence Thomas has made no secret of his desire for the court to consider Section 230, saying in multiple opinions that he believes lower courts have interpreted it to give too-broad protections to what have become very powerful companies. He got his wish in February 2023, when the court heard two related cases that involve it. In both, plaintiffs argued that their family members were killed by terrorists who posted content on those platforms. In the first, Gonzalez v. Google, the family of a woman killed in a 2015 terrorist attack in France said YouTube promoted ISIS videos and sold advertising on them, thereby materially supporting ISIS. In Twitter v. Taamneh, the family of a man killed in a 2017 ISIS attack in Turkey said the platform didn’t go far enough to identify and remove ISIS content, in violation of the Justice Against Sponsors of Terrorism Act — which could then mean that Section 230 doesn’t apply to such content.
These cases give the Supreme Court the chance to reshape, redefine, or even repeal the foundational law of the internet, which could fundamentally change it. And while the Supreme Court chose to take these cases on, it’s not certain that the justices will rule in favor of the plaintiffs. In oral arguments in late February, several justices didn’t seem convinced during the Gonzalez v. Google arguments that they could or should, especially considering the monumental possible consequences of such a decision. In Twitter v. Taamneh, the justices focused more on whether and how the Sponsors of Terrorism law applied to tweets than they did on Section 230. The rulings are expected in June.
In the meantime, don’t expect the original authors of Section 230 to go away quietly. Wyden and Cox submitted an amicus brief to the Supreme Court for the Gonzalez case, where they said: “The real-time transmission of user-generated content that Section 230 fosters has become a backbone of online activity, relied upon by innumerable Internet users and platforms alike. Given the enormous volume of content created by Internet users today, Section 230’s protection is even more important now than when the statute was enacted.”
Congress and presidents are getting sick of Section 230, too
In 2018, two bills — the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA) — were signed into law, which changed parts of Section 230. The updates mean that platforms can now be deemed responsible for prostitution ads posted by third parties. These changes were ostensibly meant to make it easier for authorities to go after websites that were used for sex trafficking, but they did so by carving out an exception to Section 230. That could open the door to even more exceptions in the future.
Amid all of this was a growing public sentiment that social media platforms like Twitter and Facebook were becoming too powerful. In the minds of many, Facebook even influenced the outcome of the 2016 presidential election by offering up its user data to shady outfits like Cambridge Analytica. There were also allegations of anti-conservative bias. Right-wing figures who once rode the internet’s relative lack of moderation to fame and fortune were being held accountable for various infringements of hateful content rules and kicked off the very platforms that helped create them. Alex Jones and his expulsion from Facebook and other social media platforms — even Twitter under Elon Musk won’t let him back — is perhaps the best example of this.
In a 2018 op-ed, Sen. Ted Cruz (R-TX) claimed that Section 230 required the internet platforms it was designed to protect to be “neutral public forums.” The law doesn’t actually say that, but many Republican lawmakers have introduced legislation that would fulfill that promise. On the other side, Democrats have introduced bills that would hold social media platforms accountable if they didn’t do more to prevent harmful content or if their algorithms promoted it.
There are some bipartisan efforts to change Section 230, too. The EARN IT Act from Sens. Lindsey Graham (R-SC) and Richard Blumenthal (D-CT), for example, would remove Section 230 immunity from platforms that didn’t follow a set of best practices to detect and remove child sexual abuse material. The partisan bills haven’t really gotten anywhere in Congress. But EARN IT, which was introduced in the last two sessions, was passed out of committee in the Senate and was ready for a Senate floor vote. That vote never came, but Blumenthal and Graham have already signaled that they plan to reintroduce EARN IT this session for a third try.
In the executive branch, former President Trump became a very vocal critic of Section 230 in 2020 after Twitter and Facebook started deleting and tagging his posts that contained inaccuracies about Covid-19 and mail-in voting. He issued an executive order that said Section 230 protections should only apply to platforms that have “good faith” moderation, and then called on the FCC to make rules about what constituted good faith. This didn’t happen, and President Biden revoked the executive order months after taking office.
But Biden isn’t a fan of Section 230, either. During his presidential campaign, he said he wanted it repealed. As president, Biden has said he wants it to be reformed by Congress. Until Congress can agree on what’s wrong with Section 230, however, it doesn’t look likely that lawmakers will pass a law that significantly changes it.
However, some Republican states have been making their own anti-Section 230 moves. In 2021, Florida passed the Stop Social Media Censorship Act, which prohibits certain social media platforms from banning politicians or media outlets. That same year, Texas passed HB 20, which forbids large platforms from removing or moderating content based on a user’s viewpoint.
Neither law is currently in effect. A federal judge blocked the Florida law in 2022 due to the possibility of it violating free speech laws as well as Section 230. The state has appealed to the Supreme Court. The Texas law has made a little more progress. A district court blocked the law last year, and then the Fifth Circuit controversially reversed that decision before deciding to stay the law in order to give the Supreme Court the chance to take the case. We’re still waiting to see if it does.
If Section 230 were to be repealed — or even significantly reformed — it really could change the internet as we know it. It remains to be seen if that’s for better or for worse.
Update, February 23, 2023, 3 pm ET: This story, originally published on May 28, 2020, has been updated several times, most recently with the latest news from the Supreme Court cases related to Section 230.
The Marvel Movies From Worst to Best—and Where to Stream Them
Here’s our definitive ranking of all 31 films (and counting) in the Marvel Cinematic Universe.
iPhone 15 Pro Could Come in Dark Red, With Pink and Light Blue Options for iPhone 15
With every iteration of the iPhone, Apple changes the available color options, often introducing a special color or set of colors that set new iPhones apart from the prior generation. With the iPhone 14 Pro, Apple introduced a dark purple, while the standard iPhone 14 was offered in a purple shade.
Apple’s iPhone 15 and 15 Pro models will also come in unique colors, and 9to5Mac says that an unnamed source indicates Apple is working on iPhone 15 Pro and Pro Max models in a dark red, which is close to a burgundy shade. The color hex code is #410D0D, described as a dark sienna.
The standard iPhone 15 models could be available in new dark pink and bright, light blue colors. The pink shade, with color hex code #CE3C6C, is a deeper pink described as “telemagenta.” The blue, color hex code #4DB1E2, is described as “Picton blue.”
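For a rough sense of what those rumored shades look like, here is a small, purely illustrative snippet (the helper and its names are ours, not from the report) that decodes each hex code into its red, green, and blue components.

```cpp
#include <cstdio>
#include <string>

struct Rgb { unsigned r, g, b; };

// Decode a "#RRGGBB" string into its 0-255 channel values.
Rgb fromHex(const std::string& hex) {
    unsigned value = static_cast<unsigned>(std::stoul(hex.substr(1), nullptr, 16));
    return { (value >> 16) & 0xFFu, (value >> 8) & 0xFFu, value & 0xFFu };
}

int main() {
    const char* rumored[] = { "#410D0D", "#CE3C6C", "#4DB1E2" };  // codes from the report
    for (const char* code : rumored) {
        Rgb c = fromHex(code);
        std::printf("%s -> R=%u G=%u B=%u\n", code, c.r, c.g, c.b);
    }
    return 0;
}
```

The decoded values (65/13/13, 206/60/108, and 77/177/226) correspond to a very dark red, a saturated magenta-pink, and a light sky blue, matching the descriptions above.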
The dark red shade that Apple may have planned for the iPhone 15 Pro models would likely be accompanied by standard (PRODUCT)RED devices in a brighter shade, along with more traditional shades close to silver/gold and space gray. iPhone 15 models could have more color options, including colors akin to red, black, and white.
Apple’s dark red color would presumably be offered for the titanium finish rumored for the iPhone 15 Pro and Pro Max. Apple so far has released titanium Apple Watch models in a standard silver titanium color and a darker titanium color, but red anodization would be entirely new.
According to 9to5Mac’s source, the information on the color options is “still early” and “could change” closer to the fall, so the colors shared could be off. As Apple plans devices well in advance of when they launch, design choices will likely be solidified in the near future. Apple is expected to begin the EVT (engineering validation test) phase of iPhone 15 production in March.
Tech Leaders in Israel Wonder if It’s Time to Leave
Ahead of a judicial overhaul that could transform the country and frighten away investors, the executives of Start-Up Nation are mulling an exodus.