slashdot-rss

The Theory That Volcanoes Killed the Dinosaurs Is Officially Extinct

“Sixty-six million years ago, all dinosaurs (except for birds) were wiped from the face of the Earth…” writes Gizmodo. “What’s indisputable about this pivotal moment in Earth’s history is that a 6.2 to 9.3-mile-wide (10 to 15-kilometer) asteroid struck what is now modern-day Mexico. Around the same time, however, volcanoes in what is now India experienced some of the largest eruptions in Earth’s history.”

Those volcanoes “have long been proposed as an alternative cause for the demise of the dinosaurs…” writes Phys.org. But “Now, climate scientists from Utrecht University and the University of Manchester show that while the volcanism caused a temporary cold period, the effects had already worn off thousands of years before the meteorite impacted.”

Earth scientists have fiercely debated for decades whether a massive outpouring of lava on the Indian continent, which occurred both prior to and after the meteorite impact, also contributed to the demise of dinosaur populations roaming Earth. These volcanic eruptions released vast amounts of CO2, dust, and sulfur, thereby significantly altering the climate on Earth — but in different ways and on different timescales to a meteorite impact. The new publication provides compelling evidence that while the volcanic eruptions in India had a clear impact on global climate, they likely had little to no effect on the mass extinction of the dinosaurs.

By analyzing fossil molecules in ancient peats from the United States of America, the scientific team reconstructed air temperatures for the time period covering both the volcanic eruptions and the meteorite impact. Using this method, the researchers show that a major volcanic eruption occurred about 30,000 years before the meteorite impact, coinciding with at least a 5 degrees Celsius cooling of the climate… Importantly, the scientists discovered that by around 20,000 years before the meteorite impact, temperatures on Earth had already stabilized and had climbed back to levels similar to those before the volcanic eruptions started.

The study is published in the journal Science Advances. And Gizmodo shares this quote from Bart van Dongen of The University of Manchester, who worked on the research.

“The study provides vital insights not only into the past but could also help us find ways for how we might prepare for future climate changes or natural disasters.”

Read more of this story at Slashdot.

Sea Levels are Already Rising in America’s Southeast. A Preview of the Future?

The Washington Post visits one of over 100 tide-tracking stations around the U.S. — Georgia’s Fort Pulaski tide gauge:

Since 2010, the sea level at the Fort Pulaski gauge has risen by more than 7 inches, one of the fastest rates in the country, according to a Washington Post analysis of National Oceanic and Atmospheric Administration data for 127 tide gauges. Similar spikes are affecting the entire U.S. Southeast — showing a glimpse of our climate future… [I]n the previous 30 years, the ocean rose about 3.7 inches. And the deluge stretches all across the South and the Gulf Coast; over the past 14 years, sea levels in the U.S. South have risen twice as fast as the global average…
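
The rate jump in those figures is easy to check with simple arithmetic. A quick sanity check in Python (the variable names are ours; the figures are the Post's):

```python
# Back-of-the-envelope check of the Fort Pulaski figures quoted above.
recent_rise_in = 7.0    # inches of rise since 2010
recent_years = 14       # 2010 through ~2024
earlier_rise_in = 3.7   # inches over the previous 30 years
earlier_years = 30

recent_rate = recent_rise_in / recent_years      # ~0.50 in/year
earlier_rate = earlier_rise_in / earlier_years   # ~0.12 in/year

print(f"since 2010: {recent_rate:.2f} in/yr")
print(f"before:     {earlier_rate:.2f} in/yr")
print(f"speed-up:   {recent_rate / earlier_rate:.1f}x")
```

By these numbers, the local rate of rise since 2010 is roughly four times the pre-2010 local rate.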

Scientists suspect part of that is because of the Gulf Stream — a long band of warm water that follows the coast up from the equator and then, near Cape Hatteras, turns out into the Atlantic Ocean. The waters of the Gulf Stream and the Gulf of Mexico are warming faster than other parts of the Atlantic, boosting sea levels. “The Gulf of Mexico has warmed exceptionally fast over the past decade and a half,” Piecuch said. “It’s uncontroversial.” But scientists have puzzled over where all that heat is coming from… [T]he current heat could be part of long-term variations in ocean currents, and not a clear signal of climate change. But the fact that the change is linked to heat — at the same time as the entire ocean is taking on excess heat from global warming — makes some experts suspicious. “This particular mechanism does not immediately suggest it’s just natural variability,” [said Ben Hamlington, a research scientist who leads NASA’s sea level change team].

For now, sea levels in the Southeast are surging — and they provide an early picture of what most of the United States, and the rest of the world, will experience as oceans rise… On Tybee Island — whose population of 4,000 swells to over 100,000 during the summer months — leaders have gotten used to the constant fight against the waves. Five or six times a year, high tides sweep over the one road that connects the island to the mainland, cutting residents off from services. By 2050, scientists estimate, those high tides will happen 70 days a year. With the help of the U.S. Army Corps of Engineers, the city has built dunes to protect vacation homes and local storefronts from the rising water; many homeowners have also raised their properties high up into the air. In Savannah, small businesses and city streets are washed in floods even on bright, sunny days — thanks to high tides that surge into the drainage system. The city estimates that it will cost $400 million to update the stormwater infrastructure over the next two decades. So far, it has raised $150 million…
Other states and cities will soon see the same effects. NASA projections show that in the coming decades, many cities in the Northeast will experience up to 100 more days of high-tide flooding each year.

“Some researchers think that the Southeast acceleration may be linked to long-term weather patterns in the Atlantic Ocean like the North Atlantic Oscillation.

“If so, the trend could switch in the coming decades — with areas of the Northeast seeing rapid sea level rise while the trend in the Southeast slows down.”

Read more of this story at Slashdot.

OpenAI’s Next Big AI Effort GPT-5 is Behind Schedule and Crazy Expensive

“From the moment GPT-4 came out in March 2023, OpenAI has been working on GPT-5…” reports the Wall Street Journal. [Alternate URL here.] But “OpenAI’s new artificial-intelligence project is behind schedule and running up huge bills. It isn’t clear when — or if — it’ll work.”

“There may not be enough data in the world to make it smart enough.”
OpenAI’s closest partner and largest investor, Microsoft, had expected to see the new model around mid-2024, say people with knowledge of the matter. OpenAI has conducted at least two large training runs, each of which entails months of crunching huge amounts of data, with the goal of making Orion smarter. Each time, new problems arose and the software fell short of the results researchers were hoping for, people close to the project say… [And each one costs around half a billion dollars in computing.]
The $157 billion valuation investors gave OpenAI in October is premised in large part on [CEO Sam] Altman’s prediction that GPT-5 will represent a “significant leap forward” in all kinds of subjects and tasks… It’s up to company executives to decide whether the model is smart enough to be called GPT-5 based in large part on gut feelings or, as many technologists say, “vibes.”

So far, the vibes are off…

OpenAI wants to use its new model to generate high-quality synthetic data for training, according to the article. But OpenAI’s researchers also “concluded they needed more diverse, high-quality data,” since “The public internet didn’t have enough, they felt.”

OpenAI’s solution was to create data from scratch. It is hiring people to write fresh software code or solve math problems for Orion to learn from. [And also theoretical physics experts.] The workers, some of whom are software engineers and mathematicians, also share explanations for their work with Orion… Having people explain their thinking deepens the value of the newly created data. It’s more language for the LLM to absorb; it’s also a map for how the model might solve similar problems in the future… The process is painfully slow. GPT-4 was trained on an estimated 13 trillion tokens. A thousand people writing 5,000 words a day would take months to produce a billion tokens.
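
The Journal's pacing claim checks out with simple arithmetic. A sketch in Python; the tokens-per-word ratio (roughly 0.75 words per token) is our assumption, not the article's:

```python
# Reproduce the article's "months to produce a billion tokens" estimate.
people = 1_000
words_per_person_per_day = 5_000
tokens_per_word = 4 / 3   # assumed: ~0.75 words per token

tokens_per_day = people * words_per_person_per_day * tokens_per_word
days_per_billion = 1e9 / tokens_per_day
print(f"~{days_per_billion:.0f} days "
      f"(~{days_per_billion / 30:.0f} months) per billion tokens")

# For scale: GPT-4's estimated 13 trillion training tokens at this pace.
years_for_gpt4 = 13e12 / tokens_per_day / 365
print(f"~{years_for_gpt4:,.0f} years to hand-write a GPT-4-sized corpus")
```

At this pace, hand-written data can only ever be a supplement: thousands of years of such work would be needed to approach pre-training scale, which is why it is aimed at high-value reasoning examples rather than bulk text.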

OpenAI’s already-difficult task has been complicated by internal turmoil and near-constant attempts by rivals to poach its top researchers, sometimes by offering them millions of dollars… More than two dozen key executives, researchers and longtime employees have left OpenAI this year, including co-founder and Chief Scientist Ilya Sutskever and Chief Technology Officer Mira Murati. This past Thursday, Alec Radford, a widely admired researcher who served as lead author on several of OpenAI’s scientific papers, announced his departure after about eight years at the company…

OpenAI isn’t the only company worrying that progress has hit a wall. Across the industry, a debate is raging over whether improvement in AIs is starting to plateau. Sutskever, who recently co-founded a new AI firm called Safe Superintelligence or SSI, declared at a recent AI conference that the age of maximum data is over. “Data is not growing because we have but one internet,” he told a crowd of researchers, policy experts and scientists. “You can even go as far as to say that data is the fossil fuel of AI.”

And that fuel was starting to run out.

Read more of this story at Slashdot.

Scientists Build a Nuclear-Diamond Battery That Could Power Devices for Thousands of Years

The world’s first nuclear-powered battery — a diamond with an embedded radioactive isotope — could power small devices for thousands of years, according to scientists at the UK’s University of Bristol.

Long-time Slashdot reader fahrbot-bot shared this report from LiveScience:

The diamond battery harvests fast-moving electrons excited by radiation, similar to how solar power uses photovoltaic cells to convert photons into electricity, the scientists said.

Scientists from the same university first demonstrated a prototype diamond battery — which used nickel-63 as the radioactive source — in 2017. In the new project, the team developed a battery made of carbon-14 radioactive isotopes embedded in manufactured diamonds. The researchers chose carbon-14 as the source material because it emits short-range radiation, which is quickly absorbed by any solid material — meaning there are no concerns about harm from the radiation. Although carbon-14 would be dangerous to ingest or touch with bare hands, the diamond that holds it prevents any short-range radiation from escaping. “Diamond is the hardest substance known to man; there is literally nothing we could use that could offer more protection,” Neil Fox, a professor of materials for energy at the University of Bristol, said in the statement…

A single nuclear-diamond battery containing 0.04 ounce (1 gram) of carbon-14 could deliver 15 joules of electricity per day. For comparison, a standard alkaline AA battery, which weighs about 0.7 ounces (20 grams), has an energy-storage rating of 700 joules per gram. The AA delivers more power than the nuclear-diamond battery would in the short term, but it would be exhausted within 24 hours. By contrast, the half-life of carbon-14 is 5,730 years, which means the battery would take that long to be depleted to 50% power…
[A] spacecraft powered by a carbon-14 diamond battery would reach Alpha Centauri — our nearest stellar neighbor, which is about 4.4 light-years from Earth — long before its power were significantly depleted.
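
The trade-off in those numbers can be made concrete. A short sketch in Python, using only figures from the article (reading the AA comparison as a total of roughly 700 J/g times 20 g):

```python
HALF_LIFE_YEARS = 5730        # carbon-14 half-life, per the article
J_PER_DAY = 15                # diamond-battery output for 1 g of C-14
AA_TOTAL_J = 700 * 20         # ~700 J/g times ~20 g alkaline AA

# Exponential decay: output halves every 5,730 years.
def output_after(years):
    return J_PER_DAY * 0.5 ** (years / HALF_LIFE_YEARS)

# Days for the trickle to match an AA's total stored energy
# (decay over a few years is negligible, so we ignore it here).
breakeven_days = AA_TOTAL_J / J_PER_DAY
print(f"matches one AA after ~{breakeven_days / 365:.1f} years")
print(f"output after one half-life: {output_after(5730):.1f} J/day")
```

So the diamond battery loses badly on instantaneous power but wins on endurance: after roughly two and a half years of continuous trickle it has delivered an AA's worth of energy, and it keeps going for millennia.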

The battery has no moving parts, according to the article. It “requires no maintenance, nor does it have any carbon emissions.”

Read more of this story at Slashdot.

Months After Its 20th Anniversary, OpenStreetMap Suffers an Extended Outage

On Monday, long-time Slashdot reader denelson83 wrote: The crowdsourced, widely used map database OpenStreetMap has had a hardware failure at its upstream ISP in Amsterdam and has been put into a protective read-only mode to avoid loss or corruption of data.

The outage started Sunday, December 15 at 4:00AM (GMT/UTC), but by Tuesday they’d posted a final update:
Our new ISP is up and running and we have started migrating our servers across to it. If all goes smoothly we hope to have all services back up and running this evening…

We have dual redundant links via separate physical hardware from our side to our Tier 1 ISP. We unexpectedly discovered their equipment is a single point of failure. Their extended outage is an extreme disappointment to us.

We are an extremely small team. The OSMF budget is tiny and we could definitely use more help. Real world experience… Ironically we signed a contract with a new ISP in the last few days. Install is on-going (fibre runs, modules & patching) and we expect to run old and new side-by-side for 6 months. Significantly better resilience (redundant ISP side equipment, VRRP both ways, multiple upstream peers… 2x diverse 10G fibre links).

OpenStreetMap celebrated its 20th anniversary in August, with a TechCrunch profile reminding readers that the site gives developers “geographic data and maps so they can rely a little less on the proprietary incumbents in the space,” adding “Yes, that mostly means Google.”

OpenStreetMap starts with “publicly available and donated aerial imagery and maps, sourced from governments and private organizations such as Microsoft” — then makes them better:

Today, OpenStreetMap claims more than 10 million contributors who map out and fine-tune everything from streets and buildings, to rivers, canyons and everything else that constitutes our built and natural environments… Contributors can manually add and edit data through OpenStreetMap’s editing tools, and they can even venture out into the wild and map a whole new area by themselves using GPS, which is useful if a new street crops up, for example…

OpenStreetMap’s Open Database License allows any third-party to use its data with the appropriate attribution (though this attribution doesn’t always happen). This includes big-name corporations such as Apple and VC-backed unicorns like MapBox, through a who’s who of tech companies, including Uber and Strava… More recently, the Overture Maps Foundation — an initiative backed by Microsoft, Amazon, Meta and TomTom — has leaned heavily on OpenStreetMap data as part of its own efforts to build a viable alternative to Google’s walled mapping garden.
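
Developers commonly pull this data through community services such as the Overpass API rather than from raw database dumps. A minimal, illustrative sketch (the bounding box, tag, and endpoint are example choices, not anything prescribed by OSM):

```python
# Build an Overpass QL query for OpenStreetMap data: all cafe nodes
# inside a small bounding box given as (south, west, north, east).
bbox = (51.50, -0.13, 51.52, -0.10)  # a patch of central London
query = f"""
[out:json][timeout:25];
node["amenity"="cafe"]({bbox[0]},{bbox[1]},{bbox[2]},{bbox[3]});
out body;
""".strip()
print(query)

# To actually run it, POST the query to a public Overpass instance
# (left commented out so this sketch stays network-free):
# import urllib.parse, urllib.request
# url = "https://overpass-api.de/api/interpreter"
# data = urllib.parse.urlencode({"data": query}).encode()
# with urllib.request.urlopen(url, data=data) as resp:
#     print(resp.read()[:300])
```

Note that the ODbL attribution requirement mentioned above applies to whatever is built from the response.
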
The article notes that OpenStreetMap is now overseen by the U.K.-based nonprofit OpenStreetMap Foundation (supported mainly by donations and memberships), with just one employee — a system engineer — “and a handful of contractors who provide administrative and accounting support.”

In August its original founder, Steve Coast, returned to the site for a special blog post on its 20th anniversary:

OpenStreetMap has grown exponentially or quadratically over the last twenty years depending on the metric you’re interested in… The story isn’t so much about the data and technology, and it never was. It’s the people… OpenStreetMap managed to map the world and give the data away for free for almost no money at all. It managed to sidestep almost all the problems that Wikipedia has by virtue of only representing facts not opinions. The project itself is remarkable. And it’s wonderful that so many are in love with it.

“Two decades ago, I knew that a wiki map of the world would work,” Coast writes. “It seemed obvious in light of the success of Wikipedia and Linux…”

Read more of this story at Slashdot.

Luigi Mangione’s Ghost Gun Was Only Partially 3D-Printed

“More than a decade after the advent of the 3D-printed gun as an icon of libertarianism and a gun control nightmare, police say one of those homemade plastic weapons has now been found in the hands of perhaps the world’s most high-profile alleged killer,” Wired wrote this month:

For the community of DIY gunsmiths who have spent years honing those printable firearm models, in fact, the handgun police claim Luigi Mangione used to fatally shoot UnitedHealthcare CEO Brian Thompson is as recognizable as the now-famous alleged shooter himself — and shows just how practical and lethal those weapons have become. In the 24 hours since police released a photo of what they say is Mangione’s gun following the 26-year-old’s arrest Monday, the online community devoted to 3D-printed firearms has been quick to identify the suspected murder weapon as a particular model of printable “ghost gun” — a homemade weapon with no serial number, created by assembling a mix of commercial and DIY parts. The gun appears to be a Chairmanwon V1, a tweak of a popular partially 3D-printed Glock-style design known as the FMDA 19.2 — an acronym that stands for the libertarian slogan “Free Men Don’t Ask.”

The FMDA 19.2, released in 2021, is a relatively old model by 3D-printed-gun standards, says one gunsmith who goes by the first name John and the online handle Mr. Snow Makes… Despite its simple description by law enforcement and others as a “3D-printed pistol,” the FMDA 19.2 is only partially 3D printed. That makes it fundamentally different from fully 3D-printed guns like the “Liberator,” the original one-shot, 3D-printed pistol [Cody] Wilson debuted in 2013. Instead, firearms built from designs like the FMDA 19.2 are assembled from a combination of commercially produced parts like barrels, slides, and magazines — sometimes sold in kits — and a homemade frame. Because that frame, often referred to as a “lower receiver” or “lower,” is the regulated body of the gun, 3D-printing that piece or otherwise creating it at home allows DIY gunmakers to skirt gun-control laws and build ghost guns with no serial number, obtained with no background check or waiting period.

Chairmanwon “instantly recognized the gun seized from the suspect…” reported USA Today.
As a photo of the fake New Jersey driver’s license and 3D-printed gun police found on Luigi Mangione circulated online, he spotted the tell-tale stippling pattern on the firearm’s grip. “It’s mine lol,” the man, known as “Chairmanwon,” quipped on X Dec. 9. Then he quickly deleted the post…

No federal laws ban 3D-printed or privately made firearms. But as police agencies have increasingly recovered untraceable homemade guns at crime scenes, some state legislatures have passed stricter rules… If authorities can prove Mangione downloaded and printed his firearm in Pennsylvania or New York, he could face additional gun charges. Fifteen states now require serial numbers on homemade parts or ban 3D printing them. Some even ban the distribution of 3D printing instructions.

President Biden and the Bureau of Alcohol, Tobacco, Firearms and Explosives added regulations in 2022 that say the ghost gun parts kits themselves qualify as “firearms” that should be regulated by the Gun Control Act. [“Commercial manufacturers of the kits will have to be licensed and must add serial numbers on the kits’ frame or receiver,” USA Today reported earlier.] Gunmakers challenged those rules at the Supreme Court. In October, the court heard oral arguments, but justices signaled they were leaning toward upholding the rules.

Rolling Stone tries to assess the results:

In recent years, crimes involving ghost guns seem to have abated across much of the United States. Ghost gun recoveries by police in New York City, Los Angeles, Philadelphia, Baltimore, and other major cities dropped by as much as 25 percent between 2022 and 2023, and the most prolific maker of the kits used to build the untraceable weapons closed its doors this year. The likely cause is a federal rule change requiring the kits to be serialized — a stipulation that forces sellers to conduct background checks on their customers.

On Monday, Luigi Mangione will appear in court for arraignment on state murder charges, reports USA Today:

Mangione had been expected to face arraignment on the state charges Thursday, but the proceedings were postponed after federal authorities announced they were also bringing charges, and he was whisked to a federal courthouse instead in a move that appeared to shock Mangione’s defense team… Federal authorities unsealed a criminal complaint against Mangione that included four separate charges: murder using a firearm, two counts of interstate stalking and a firearms count. The death penalty was abolished in New York state, but the federal charges could bring a death sentence if Mangione is convicted. The charge of murder using a firearm carries a maximum possible sentence of death or life in prison. The other federal charges have maximum sentences of life in prison, and the firearms charge has a mandatory minimum sentence of 30 years.

Read more of this story at Slashdot.

US Life Expectancy Rose to 78.4 Years in 2023 – Highest Level Since Pandemic

An anonymous reader shared this report from NBC News:

U.S. life expectancy rose last year, hitting its highest level since the beginning of the Covid pandemic, according to a report from the Centers for Disease Control and Prevention.

The report, released Thursday, found that life expectancy at birth was 78.4 years in 2023. That’s a significant rise — nearly a full year — from the life expectancy of 77.5 years in 2022. “The increase we had this year — the 0.9 year — that’s unheard of prior to the pandemic,” said Ken Kochanek, a statistician at the National Center for Health Statistics who co-authored the report. “Life expectancy in the United States never goes up or down any more than one- or two-tenths,” he said. “But then when Covid happened, you had this gigantic drop, and now we have a gigantic drop in Covid. So, you have this gigantic increase in life expectancy.”

From 2019 to 2021, U.S. life expectancy dropped from 78.8 years to 76.4. Covid deaths fell significantly last year: Whereas Covid was the fourth leading cause of death in 2022, it was the 10th in 2023, according to the new report. Last year, Covid was the underlying or contributing cause of more than 76,000 deaths, according to an August CDC report, compared with more than 350,000 such deaths in 2020.
The new findings are based on an analysis of death certificates from all 50 states and Washington, D.C. The results showed that the overall death rate for the U.S. population decreased by 6%.

“According to the new report, the top five causes of death in the U.S. last year were heart disease, cancer, unintentional injuries, stroke and chronic lower respiratory diseases. Death rates fell for nine of the top 10 causes in 2023, while the rate of cancer deaths remained fairly unchanged…”

The Atlantic shares other positive statistics: America’s traffic fatalities keep declining, drug-overdose deaths dropped 3% between 2022 and 2023, and murder rates saw a double-digit drop.

“America is suddenly getting healthier,” they write. “No one knows why.”

Read more of this story at Slashdot.

T2 Linux SDE 24.12 ‘Sky’s the Limit!’ Released With 37 ISOs For 25 CPU ISAs

Berlin-based T2 Linux developer René Rebe is also long-time Slashdot reader ReneR — and popped by with a special announcement for the holidays:

The T2 Linux team has unveiled T2 Linux SDE 24.12, codenamed “Sky’s the Limit!”, delivering a massive update for this highly portable source-based Linux distribution… With 3,280 package updates, 206 new features, and the ability to boot on systems with as little as 512MB RAM, this release further strengthens T2 Linux’s position as the ultimate tool for developers working across diverse hardware and embedded systems.

Some highlights from Rene’s announcement:

“The release includes 37 pre-compiled ISOs with Glibc, Musl, and uClibc, supporting 25 CPU architectures like ARM(64), RISCV(64), Loongarch64, SPARC(64), and vintage retro computing platforms such as M68k, Alpha, and even initial Nintendo Wii U support added.”
“The Cosmic Desktop, a modern Rust-based environment, debuts alongside expanded application support for non-mainstream RISC architectures, now featuring LibreOffice, OpenJDK, and QEMU.”

And T2sde.org gives this glimpse of the future:

“While initially created for the Linux kernel, T2 already has proof-of-concept support for building ‘home-brew’ pkg for Other OS, including: BSDs, macOS and Haiku. Work on alternative micro kernels, such as L4, Fuchsia, RedoxOS or integrating building ‘AOSP’ Android is being worked on as well.”

Read more of this story at Slashdot.

Voyager 1 Signals from Interstellar Space Detected by Amateur Astronomers on 1950s Telescope

“Voyager 1 is currently exploring interstellar space at a distance of 15.5 billion miles (24.9 billion kilometers) away from Earth,” writes Gizmodo.

And yet a team of amateur astronomers in the Netherlands was able to receive Voyager’s signals on a 1950s telescope designed to detect weak, low-frequency emissions from deep space:

NASA uses the [Earth-based] Deep Space Network (DSN) to communicate with its spacecraft, but the global array of giant radio antennas is optimized for higher frequency signals. Though NASA’s DSN antennas are capable of detecting S-band missives from Voyager — it can also communicate in X-band — the spacecraft’s signal can appear to drop due to how far Voyager is from Earth. The Dwingeloo telescope, on the other hand, is designed for observing at lower frequencies than the 8.4 gigahertz telemetry transmitted by Voyager 1, according to the C.A. Muller Radio Astronomy Station… [W]hen Voyager 1 switched to a lower frequency, its messages fell within Dwingeloo’s frequency band. Thus, the astronomers took advantage of the spacecraft’s communication glitch to listen in on its faint signals to NASA.

The astronomers used orbital predictions of Voyager 1’s position in space to correct for the Doppler shift in frequency caused by the motion of Earth, as well as the motion of the spacecraft through space. The weak signal was found live, and further analysis later confirmed that it corresponded to the position of Voyager 1.
Thankfully, the mission team at NASA turned Voyager 1’s X-band transmitter back on in November, and is currently carrying out a few remaining tasks to get the spacecraft back to its regular state. Fortunately, radio telescopes like Dwingeloo can help fill in the gaps while NASA’s communications array has trouble reaching its spacecraft.
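The Doppler bookkeeping described above amounts to a first-order frequency correction. A minimal sketch, using an assumed S-band carrier near 2.295 GHz and illustrative line-of-sight velocities (not the Dwingeloo team’s actual parameters):

```python
# Illustrative first-order Doppler correction. The carrier frequency and
# velocities below are nominal/assumed, not the actual tracking parameters.

C = 299_792_458.0  # speed of light, m/s

def doppler_shifted(f_emit_hz: float, v_radial_ms: float) -> float:
    """First-order Doppler shift; positive v_radial means a receding source."""
    return f_emit_hz * (1.0 - v_radial_ms / C)

# Voyager 1 recedes at roughly 17 km/s, while Earth's orbital motion adds
# up to about +/-30 km/s along the line of sight over the year.
f_carrier = 2.295e9  # assumed S-band carrier, Hz
for v in (17_000.0, 17_000.0 + 30_000.0, 17_000.0 - 30_000.0):
    f_obs = doppler_shifted(f_carrier, v)
    print(f"v = {v/1000:+6.1f} km/s  ->  shift = {f_obs - f_carrier:+.0f} Hz")
```

Even at these tiny fractions of light speed, the resulting shifts are on the order of 100 to 360 kHz — far wider than the narrowband carrier itself, which is why the correction has to track both Earth’s motion and the spacecraft’s recession to find the signal at all.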

Scientific American shares an interesting perspective on the Voyager probes:
[W]e everyday Earthlings may simplistically think of the sun as a compact distant ball of light, in part because our plush atmosphere protects us from our star’s worst hazards. But in reality the sun is a roiling mass of plasma and magnetism radiating itself across billions of miles in the form of the solar wind, which is a constant stream of charged plasma that flows off our star. The sun’s magnetic field travels with the solar wind and also influences the space between planets. The heliosphere grows and shrinks in response to changes in the sun’s activity levels over the course of an 11-year cycle… [Jamie Rankin, a space physicist at Princeton University and deputy project scientist of the Voyager mission] notes, astronomers of all stripes are trapped within that chaotic background in ways that may or may not affect their data and interpretations. “Every one of our measurements to date, until the Voyagers crossed the heliopause, has been filtered through all the different layers of the sun,” Rankin says.
On their trek to interstellar space, the Voyagers had to cross a set of boundaries: first a termination shock some seven billion or eight billion miles away from the sun, where the solar wind abruptly begins to slow, then the heliopause, where the outward pressure from the solar wind is equaled by the inward pressure of the interstellar medium. Between these two stark borders lies the heliosheath, a region where solar material continues to slow and even reverse direction. The trek through these boundaries took Voyager 1, the faster of the twin probes, nearly eight years; such is the vastness of the scale at play.

Beyond the heliopause is interstellar space, which Voyager 1 entered in 2012 and Voyager 2 reached in 2018. It’s a very different environment from the one inside our heliosphere — quieter but hardly quiescent. “It’s a relic of the environment the solar system was born out of,” Rankin says of the interstellar medium. Within it are energetic atomic fragments called galactic cosmic rays, as well as dust expelled by dying stars across the universe’s eons, among other ingredients.

Earlier this month Wired noted “The secret of the Voyagers lies in their atomic hearts: both are equipped with three radioisotope thermoelectric generators, or RTGs — small power generators that can produce power directly on board. Each RTG contains 24 plutonium-238 oxide spheres with a total mass of 4.5 kilograms…”

But as time passes, the plutonium on board is depleted, and so the RTGs produce less and less energy. The Voyagers are therefore slowly dying. Nuclear batteries have a maximum lifespan of 60 years. In order to conserve the probes’ remaining energy, the mission team is gradually shutting down the various instruments on the probes that are still active…

Four active instruments remain, including a magnetometer as well as other instruments used to study the galactic environment, with its cosmic rays and interstellar magnetic field. But these are in their last years. In the next decade — it’s hard to say exactly when — the batteries of both probes will be drained forever.
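The power decline Wired describes follows from plutonium-238’s roughly 87.7-year half-life: the thermal output decays exponentially, and the usable electrical output falls even faster because the thermoelectric converters degrade too. A minimal sketch of the thermal side (the launch-year arithmetic is illustrative):

```python
import math

# Rough sketch of RTG thermal-power decay driven by Pu-238's ~87.7-year
# half-life. Real electrical output declines faster than this, since the
# thermocouples that convert heat to electricity also degrade with age.

HALF_LIFE_YEARS = 87.7

def rtg_thermal_fraction(years: float) -> float:
    """Fraction of original thermal power remaining after `years`."""
    return math.exp(-math.log(2) * years / HALF_LIFE_YEARS)

# Voyager launched in 1977, so after roughly 47 years:
print(f"{rtg_thermal_fraction(47):.2f}")  # prints 0.69
```

That steady exponential loss — a percent or so of thermal power every couple of years, compounded by converter wear — is why the mission team keeps retiring instruments to stretch the remaining watts.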

Read more of this story at Slashdot.

US Drone Sightings Provoke Reactions From New Jersey Legislature, Federal Government

On Thursday New Jersey lawmakers passed a resolution “calling on the federal government to conduct a ‘rigorous and ongoing’ investigation into the drone sightings in the state,” reports the Associated Press:

Meanwhile, federal and local authorities are warning against pointing lasers at suspected drones, because aircraft pilots are being hit in the eyes more often. Authorities also said they are concerned people might fire weapons at manned aircraft that they have mistaken for drones…

White House national security spokesperson John Kirby said Monday that the federal government has yet to identify any public safety or national security risks. “There are more than 1 million drones that are lawfully registered with the Federal Aviation Administration here in the United States,” Kirby said. “And there are thousands of commercial, hobbyist and law enforcement drones that are lawfully in the sky on any given day. That is the ecosystem that we are dealing with.” The federal government has deployed personnel and advanced technology to investigate the reports in New Jersey and other states, and is evaluating each tip reported by citizens, he said. About 100 of the more than 5,000 drone sightings reported to the FBI in recent weeks were deemed credible enough to warrant more investigation, according to a joint statement by the Department of Homeland Security, FBI, Federal Aviation Administration and Department of Defense.

Speculation has raged online, with some expressing concerns the drones could be part of a nefarious plot by foreign agents or clandestine operations by the U.S. government. Pentagon spokesperson Maj. Gen. Pat Ryder said it’s unlikely the drones are engaged in intelligence gathering, given how loud and bright they are. He repeated Tuesday that the drones being reported are not being operated by the Department of Defense. Asked whether military contractors might be operating drones in the New Jersey area, Ryder rebuffed the notion, saying there are “no military operations, no military drone or experiment operations in this corridor.” Ryder said additional drone-detecting technology was being moved to some military installations, including the Picatinny Arsenal…

U.S. Sen. Andy Kim, a New Jersey Democrat, said he has heard nothing to support the notion that the government is hiding anything. He said a lack of faith in institutions is playing a key part in the saga.

Read more of this story at Slashdot.
