Month: August 2024

These robots move through the magic of mushrooms

Researchers at Cornell University tapped into fungal mycelia to power a pair of proof-of-concept robots. Mycelia, the underground fungal network that can sprout mushrooms as its above-ground fruit, can sense light and chemical reactions and communicate through electrical signals. This makes it a novel component in hybrid robotics that could someday detect crop conditions otherwise invisible to humans.

The Cornell researchers created two robots: a soft, spider-like one and a four-wheeled buggy. The researchers used mycelia’s light-sensing abilities to control the machines using ultraviolet light. The project required experts in mycology (the study of fungi), neurobiology, mechanical engineering, electronics and signal processing.

“If you think about a synthetic system — let’s say, any passive sensor — we just use it for one purpose,” lead author Anand Mishra said. “But living systems respond to touch, they respond to light, they respond to heat, they respond to even some unknowns, like signals. That’s why we think, OK, if you wanted to build future robots, how can they work in an unexpected environment? We can leverage these living systems, and any unknown input comes in, the robot will respond to that.”

The fungal robot uses an electrical interface that (after blocking out interference from vibrations and electromagnetic signals) records and processes the mycelia’s electrophysical activity in real time. A controller, mimicking a portion of animals’ central nervous systems, acted as “a kind of neural circuit.” The team designed the controller to read the fungi’s raw electrical signal, process it and translate it into digital controls. These were then sent to the machine’s actuators.
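The control loop described above – filter out interference, read the mycelia’s raw electrical activity, translate spikes into digital commands for the actuators – can be sketched in miniature. This is an illustrative sketch only, assuming a simple moving-average filter and a spike threshold as stand-ins for the team’s actual signal-processing stages; none of the names below come from the paper.

```python
def bandpass_smooth(samples, window=5):
    """Crude noise rejection: a moving average suppresses high-frequency
    artifacts, standing in for the interface's real filtering of
    vibration and electromagnetic interference."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)  # start of the averaging window
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out


def spikes_to_commands(samples, threshold=0.3):
    """Translate filtered electrophysiological activity into discrete
    actuator commands: suprathreshold activity triggers a gait step."""
    return ["STEP" if v > threshold else "HOLD"
            for v in bandpass_smooth(samples)]


# A burst of activity in a noisy trace becomes a run of step commands:
print(spikes_to_commands([0, 0, 1, 1, 0, 0]))
```

A real controller would run this loop continuously on the live signal, and the manual-override mode the researchers tested would amount to bypassing the spike-to-command step with operator input.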

Cornell University / Science Robotics

The pair of shroom-bots successfully completed three experiments, including walking and rolling in response to the mycelia’s signals and changing their gaits in response to UV light. The researchers also successfully overrode the mycelia’s signals to control the robots manually, a crucial component if later versions were to be deployed in the wild.

As for where this technology goes, it could spawn more advanced versions that tap into mycelia’s ability to sense chemical reactions. “In this case we used light as the input, but in the future it will be chemical,” according to Rob Shepherd, Cornell mechanical and aerospace engineering professor and the paper’s senior author. The researchers believe this could lead to future robots that sense soil chemistry in crops, deciding when to add more fertilizer, “perhaps mitigating downstream effects of agriculture like harmful algal blooms,” Shepherd said.

You can read the team’s research paper at Science Robotics and find out more about the project from the Cornell Chronicle.

This article originally appeared on Engadget at https://www.engadget.com/science/these-robots-move-through-the-magic-of-mushrooms-171612639.html?src=rss

Read More 

Best Back-to-School Fitness Essentials for a Healthy School Year

Check out these fitness gadgets that’ll help you stay healthy during the school year.

Read More 

SAVE Student Loan Forgiveness Plan Isn’t Dead Yet. What Experts Say Is Next

Experts didn’t expect the Supreme Court to reinstate SAVE before the lower courts’ rulings were finalized. Here’s what you can do in the meantime.

Read More 

Rumored Samsung XR headset appears on Geekbench with specs that could tease an AI-powered Apple Vision Pro rival

Geekbench scores for the Samsung XR headset have leaked, and it could be an AI-powered Apple Vision Pro rival.

At the recent foldables-focused Galaxy Unpacked event, Samsung and Google teased that we’d finally see the results of their XR partnership “coming this year” – and it looks like things are on track, as the Samsung XR headset’s performance scores have seemingly leaked on Geekbench. If the leak is correct, the device looks set to be a bona fide Apple Vision Pro rival that could get some AI integration.

While the Geekbench score isn’t explicitly labeled something like ‘Samsung XR headset’, it is for a device called the Samsung SM-I130 – a designation a leaker attached to the Samsung headset back in January. What’s more, the listed CPU – a 6-core processor running at 2.36GHz – matches the one known Samsung headset hardware detail: that it uses a Snapdragon XR2+ Gen 2.

The Samsung headset’s reliance on Qualcomm’s Snapdragon XR2+ Gen 2 was teased by Qualcomm itself, and that chipset is a 6-core processor whose cores can run at up to 2.4GHz.

Qualcomm’s announcement dropped big hints (Image credit: Qualcomm)

As for the new details, the leak says the device has 16GB of RAM. This is the same as the Apple Vision Pro, and twice the amount you’d find in the Meta Quest 3. Not only will this help apps and XR experiences run more smoothly; I suspect it’ll also facilitate AI integration.

RAM boosts were the standout upgrade for Google’s own Pixel phones at its recent Pixel 9 event – the Pixel 9 getting 12GB, and Pixel 9 Pro models receiving 16GB – and the reason behind the improvement was AI. More RAM means the phones can do more with Google’s Gemini AI on device, so results come back quicker and your data doesn’t have to be shared with an off-device server, which also makes it more private.

With 16GB of RAM of its own the Samsung XR headset would theoretically be able to pull off many of the same tricks we’ve seen from the Pixel 9 Pro.

Considering Apple and Meta aren’t yet leveraging their AI platforms in a meaningful way on their XR devices (save for the Ray-Ban Meta Smart Glasses’ Meta AI integration, though in fairness those aren’t XR glasses), Google and Samsung focusing on Gemini would be a clever way to stand out from the crowd and make the upgraded hardware feel worthwhile.

Will AI take next-gen wearables to the next level? (Image credit: Meta)

We’ve also got confirmation that the headset will run a version of Android 14. This again leans into my AI theory, as Gemini is a key part of the latest Android OS, but otherwise it’s not surprising or really of note. Meta’s own HorizonOS is a spinoff of Android, and given that Android is Google’s home-grown OS, it’s no shock that Google would adapt the latest version to run on VR headsets it’s helping to create.

As with all leaks, we should take these details with a pinch of salt; until Google or Samsung says anything official, there’s no guarantee what specs the Samsung XR headset will offer. Regardless of whether the leak is correct, we shouldn’t be left waiting much longer for the device to be revealed: the end of 2024 is fast approaching, putting a time limit on Samsung’s “this year” promise, and our bet is we’ll see something in just over a month at the Samsung Developer Conference, which is scheduled for October 3, 2024. Whenever it’s announced, you can be sure we’ll be ready to fill you in on all the important XR details.

You might also like

Samsung Galaxy S25 Ultra tipped to be thinner than the iPhone 16 Pro Max
The Samsung Galaxy Z Fold 6’s paint is peeling off because you’re charging it wrong
Unofficial Samsung Galaxy S25 Ultra renders make it look less like the Galaxy Note

Read More 

A new AI wearable to supplement your memory is happy to tread where other AI hardware failed

Plaud.AI has launched an AI-enhanced wearable recording device called the NotePin.

The current era of AI has produced plenty of impressive products, but it’s hard to point to any AI-centered hardware that can match the popularity of software like ChatGPT, despite occasional spikes in hype around one device or another. The new NotePin wearable from Plaud.AI seeks to fill that void by acting as a “memory capsule” to record, transcribe, and summarize your conversations and private monologues.

At $169, the NotePin does at least offer some flexibility in how it adorns your body. As seen in the image above, it can be worn not only as a pin clipped onto your jacket but also as a necklace or fitted into a wristband. Once activated, the NotePin records and transmits audio to the Plaud app to be transcribed, summarized, or even turned into a visual mind map if you choose. The transcriptions are composed using OpenAI’s Whisper tool, but you get to choose between OpenAI’s GPT-4o and Anthropic’s Claude 3.5 Sonnet for the summarization and analysis, should you have a preference. Plaud.AI also hinted at more AI model options to come.
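The capture flow described above – record audio, transcribe it, then summarize with the user’s chosen model – can be sketched in a few lines. This is a minimal illustrative sketch: the function names, model identifiers, and summary format are assumptions for illustration, not Plaud.AI’s actual API.

```python
def transcribe(audio_chunks):
    # Stand-in for the Whisper speech-to-text step.
    return " ".join(audio_chunks)


def summarize(transcript, model="gpt-4o"):
    # Stand-in for the user's choice of summarization backend.
    if model not in ("gpt-4o", "claude-3.5-sonnet"):
        raise ValueError(f"unsupported model: {model}")
    words = transcript.split()
    return f"[{model}] {len(words)}-word note: " + " ".join(words[:8])


def process_recording(audio_chunks, model="gpt-4o"):
    # Record -> transcribe -> summarize, mirroring how the NotePin
    # hands audio to the Plaud app for processing.
    return summarize(transcribe(audio_chunks), model=model)
```

The pluggable `model` parameter here mirrors the device’s choice between summarization backends; in the real product both steps run as cloud services rather than on the pin itself.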

The NotePin doesn’t record automatically. You can turn it on and off as you choose. You also have to pick the format of notes it will take for you, though Plaud.AI claims the device will adapt to your preferences over time. You get 300 minutes of transcription a month, though you can pay $79 a year for the Pro Plan and get 1,200 minutes per month and other features.

“Plaud NotePin is more than just an AI device,” said Plaud.AI CEO Nathan Hsu. “It’s your always-ready business partner, handling mundane, daily tasks so you can concentrate on what truly drives value in your life and career. This small but powerful device is reshaping the professional landscape, allowing users to optimize their day-to-day workflow and focus on what matters most.”

AI Hardware is Hard

If a pin recording your day and analyzing it with AI sounds familiar, that’s because it’s pretty much the pitch for the Humane AI Pin and pretty similar to the Rabbit R1 device that garnered a lot of excitement, if only for a brief time. As with those devices, the biggest question is whether AI hardware is worth it when you have a smartphone with AI apps available. Even the wearable element may not be that enticing if the next generation of smartwatches can perform the same recording and transmitting functions. 

Plus, the NotePin’s reliance on cloud services for AI processing is a potential drawback for those worried about data privacy. Plaud.AI boasted that its encryption and other security protocols protect user data, but considering the all-too-common story of data breaches, people may not want to commit their conversations, let alone inner musings, to the cloud. That’s not to say the NotePin might not beat the odds and become a popular accessory, but it might take more than a sleek design to convince people they can’t live without an AI-enabled microphone on their lapel.

You might also like…

Rabbit R1 is a beautiful mess that I’m not sure anyone needs
Survey says AI is more buzzkill than buzzword for marketing
Humane AI Pin review roundup: an undercooked flop that’s way ahead of its time

Read More 

Star Wars Outlaws: Change These Settings as Soon as You Start

The new open-world game allows for a lot of customization, but there are a handful of settings worth changing at the outset.

Read More 
