University of Washington researchers developed a system called VueBuds that uses tiny cameras in off-the-shelf wireless earbuds to let users talk with an AI model about the scene in front of them. For instance, a user might look at a Korean food package and say, “Hey VueBuds, translate this for me.” They’d then hear an AI voice reply, “The visible text translates to ‘Cold Noodles’ in English.”
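The interaction described above follows a simple pipeline: a spoken request triggers a camera capture, the frame and request go to a multimodal model, and the model's answer is spoken back. The sketch below is purely illustrative; every function name is a hypothetical stub, not the actual VueBuds implementation, and the model call is mocked with the article's example response.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A single image from the earbud camera (stubbed here as a text label)."""
    description: str

def capture_frame() -> Frame:
    # Stub: a real system would read a frame from the earbud's tiny camera.
    return Frame(description="Korean food package")

def query_vision_language_model(frame: Frame, request: str) -> str:
    # Stub: a real system would send the image plus the spoken request to a
    # multimodal model; here we hard-code the article's example reply.
    if "translate" in request.lower():
        return "The visible text translates to 'Cold Noodles' in English."
    return "I can see: " + frame.description

def handle_voice_command(request: str) -> str:
    # Wake-word detection is assumed to have already fired.
    frame = capture_frame()
    return query_vision_language_model(frame, request)

print(handle_voice_command("Hey VueBuds, translate this for me."))
```

The key design point this mock captures is that the earbud itself only needs to capture and transmit; the scene understanding happens in the model, so the user can ask open-ended questions about whatever is in view.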