Amazon reportedly working on Echo Frames for delivery drivers

Amazon’s rumored smart glasses for delivery drivers would have an embedded display, unlike the audio-only consumer version. | Image: David Pierce / The Verge

Amazon is developing smart glasses for delivery drivers, Reuters reports. The aim is to give drivers turn-by-turn directions, shaving seconds off of each delivery.

Citing anonymous Amazon sources, Reuters says the project is part of Amazon’s efforts to increase efficiency in the last 100 yards of a delivery. Codenamed “Amelia,” the smart glasses are based on the existing Echo Frames platform. Unlike the current audio-only Frames, they would have an embedded display that could give a driver more precise directions — for example, turning left or right after getting off an elevator. Amazon is also exploring camera capabilities alongside that embedded display, which would let drivers take photos of packages as proof of delivery. Theoretically, the hands-free glasses would allow drivers to carry more packages, while all those saved seconds would let them squeeze more home deliveries into a single shift.

That said, it might be a good long while before Amazon drivers sport smart glasses — if they ever do. Adding displays to ordinary glasses or audio-based smart glasses like the current Echo Frames is a tough engineering challenge — one that many companies have failed at. Amazon is reportedly struggling to make glasses with a battery that can last a full eight-hour shift while remaining light enough to wear all day. Another problem is that many people already wear corrective lenses, and thus far, consumer smart glasses haven’t always been able to accommodate every prescription. Amazon will also have to convince its entire fleet of drivers — many of whom are third-party contractors — to adopt the technology. And it could take years for Amazon to gather enough data on that last 100 yards (e.g., building layouts, sidewalks, streets, and driveways) to make its vision a reality.
It’s not wholly surprising to see Amazon explore enterprise options for its smart glasses tech. Reuters’ sources also said the last-gen Echo Frames sold fewer than 10,000 units — disappointing compared to the success of the Ray-Ban Meta glasses. (Both were released around the same time last year.) Pivoting to enterprise has long been the playbook for underperforming smart glasses and AR headsets, including Google Glass, Magic Leap, and Microsoft HoloLens. It’s also unclear whether Amazon will keep this purely for its own delivery network or pursue third-party enterprise contracts. That said, it’s possible that some of this tech will make its way to consumers: the report notes that Amazon is working on an embedded screen for future Echo Frames glasses that could show up sometime in Q2 2026.
