WhereIsIt? Here It Is!

Blind and severely visually impaired people depend in everyday life on systematically placing objects or remembering their locations. In the absence of visual control, everyday items such as keys, medication, documents, or technical aids are often misplaced or can only be found with considerable effort. This leads to lost time, stress, and unnecessary dependence on other people.

Existing solutions such as Microsoft's "Find My Things" often rely on visual object recognition or complex assistance systems. These are technically demanding, error-prone, energy-intensive, and not always acceptable from a privacy perspective. What is needed is a simple, robust, and practical solution for everyday use that does not require continuous camera use and can be operated intuitively.

A speech-based object reminder assistant called WhereIsIt is being developed on the initiative of Prof. Dr. Oliver Bendel. Users can record by voice input which object they have placed where (e.g., "I put my medication on the kitchen table"). The information is stored locally and given a timestamp. When asked later ("Where is my medication?"), the system announces the last known location via speech output. Optionally, inexpensive Bluetooth tags can be used that emit an additional acoustic signal, making the object physically easier to locate. The focus is on ease of use, low technical complexity, and high reliability.

Possible technical components include voice capture and speech recognition; extraction of object and location information; local data storage with a time reference; voice-based feedback; and optional integration of BLE tags. When AI components are used, it is a project within Inclusive AI. The kick-off meeting will take place on March 17, 2026 at the FHNW School of Business. Damian Huckele has been recruited to implement the project.
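The core loop described above (parse a statement, store object and location with a timestamp, answer a later question) could be sketched as follows. This is a minimal, rule-based illustration only, not the project's actual design: the class name, the regular expressions, and the in-memory dictionary are assumptions, and a real system would replace the text interface with speech recognition and speech output.

```python
import re
import time


class WhereIsIt:
    """Minimal sketch: store each object's last known location locally."""

    def __init__(self):
        # object name -> (location phrase, Unix timestamp)
        self._locations = {}

    def record(self, utterance):
        """Parse statements like 'I put my medication on the kitchen table'."""
        m = re.match(
            r"i put (?:my |the )?(.+?) ((?:on|in|under|next to) .+)",
            utterance, re.IGNORECASE,
        )
        if not m:
            return None
        obj, loc = m.group(1).lower(), m.group(2).lower()
        self._locations[obj] = (loc, time.time())
        return obj, loc

    def ask(self, utterance):
        """Answer questions like 'Where is my medication?'."""
        m = re.match(
            r"where (?:is|are) (?:my |the )?(.+?)\??$",
            utterance, re.IGNORECASE,
        )
        if not m:
            return "Sorry, I did not understand."
        obj = m.group(1).lower()
        if obj not in self._locations:
            return f"I have no record of your {obj}."
        loc, ts = self._locations[obj]
        when = time.strftime("%Y-%m-%d %H:%M", time.localtime(ts))
        return f"Your {obj} is {loc} (noted at {when})."
```

A usage example: after `record("I put my medication on the kitchen table")`, the call `ask("Where is my medication?")` returns a sentence containing "on the kitchen table". Keeping the store local, as in this sketch, reflects the privacy goal stated above.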

Fig.: A blind person