A wearable memory aid
The DejaView Concept
Human memory is not perfect. Dementia, injury, and the natural decline of mental abilities with age all further affect our ability to remember. Memory impairment has a huge impact on quality of life and, with an ageing western population, is becoming ever more of a problem. Current coping strategies range from simple aids such as post-it notes and calendars to, more recently, assistive devices which attempt to provide reminders at appropriate times, capture details of important events, or aid with performing complex tasks. Promising findings in the use of wearable camera-based memory aids such as the SenseCam have been widely reported. However, to date, there has been relatively little consideration of the potential for offering memory help in real-time during daily living. We suggest that such assistance, in the form of proactive visual prompts, could help people with memory problems to immediately orientate themselves in a situation - supplying details of where they are, or who they are with. Providing this form of immediate and 'in-the-moment' contextual feedback to the user represents the philosophy of the DejaView system.
The DejaView system is a three-tier architecture, comprising:
- a low-power wearable sensing device, the DejaView device, which autonomously captures photos based on inputs to its onboard sensors and transmits them (along with collected sensor data) wirelessly over a Bluetooth interface to a smartphone;
- an application running on a smartphone, which receives data from the DejaView device, appends additional sensor data, and transmits this information to a remote web service over its Internet connection. The application also receives contextual feedback from the web service and presents it to the user;
- a web service, which determines context from the uploaded data using the wealth of data and processing it has access to (for example, algorithms such as face or object recognition, and connections to the user's social networks, calendar, online photo albums, etc.).

In its simplest form, the architecture offers functionality similar to that of the SenseCam: photographs can be autonomously captured and stored for later review. The distinction is that, instead of being stored on the device itself, the images are immediately transmitted (via the smartphone) to the Internet, where they are stored; this enables further use-cases, such as real-time monitoring of a wearer. However, the real flexibility of the DejaView system comes when the Internet service instantly analyses uploaded photos and sensor data and feeds relevant contextual information back to the user via their smartphone. In the currently-implemented example, photos captured by the wearable device are compared against a database of faces stored on the remote computer, and the user then receives information about the people around them on their smartphone. More generally, the architecture permits a wide range of intelligent methods for selecting useful cues, based on the user's environment, to be integrated into the system, facilitating real-time help for memory problems.
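To make the feedback step concrete, the following is a minimal sketch of what the web service's face-matching stage might look like. It is illustrative only: the function names, the face "signature" representation (a small feature vector), and the distance threshold are all assumptions, not details of the actual DejaView implementation, which would use a proper face-recognition algorithm.

```python
# Hypothetical sketch of the DejaView web-service feedback step: an uploaded
# photo is reduced to a face "signature" and compared against a database of
# known faces; the closest match within a threshold is returned as a cue.
# The signature vectors and all names below are illustrative assumptions.
import math

# Toy database of known faces (name -> feature vector).
FACE_DB = {
    "Alice": [0.9, 0.1, 0.3],
    "Bob": [0.2, 0.8, 0.5],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(signature, threshold=0.5):
    """Return the name of the closest known face, or None if no match."""
    best_name, best_dist = None, float("inf")
    for name, ref in FACE_DB.items():
        d = distance(signature, ref)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# A signature close to Alice's reference vector is matched; a distant one is not.
print(identify([0.85, 0.15, 0.3]))  # Alice
print(identify([0.0, 0.0, 0.0]))    # None
```

In the real system, the smartphone application would receive the matched name from the service and present it to the wearer as an in-the-moment prompt.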
Below is a crude video showing the operation (as of March 2012) of the DejaView system, including the device, mobile phone/Android application, and web service/interface:
We are currently working to improve the operation and functionality of the system, and collaborating with memory experts to evaluate its clinical benefit. We are always happy to hear from interested parties with comments or potential collaborations by email. Further information can also be found at http://www.dejaview.ecs.soton.ac.uk.
Recent DejaView News
I Begin to trial DejaView! (17th July 2012)
Today I got my own DejaView device to start wearing, playing with, and improving (and to fix the bugs)! While we've had working devices and a working system for many months now, until today our devices have been used by the rest of the research team and our clinical collaborators in London.
Alex Presents DejaView Research at IET WSS (19th June 2012)
Today Alex presented his research at the IET conference on Wireless Sensing Systems (WSS). His paper was on the topic of "Adaptive Sampling in Context-Aware Systems: A Machine Learning Approach", and describes a method he is researching to attempt to maximise the usefulness of images captured by the DejaV...

Best Presentation Award at SenseCam 2012 (4th April 2012)
Our presentation entitled "DejaView: help with memory, when you need it" was awarded the best presentation prize at the SenseCam 2012 Symposium. The prize was awarded by Steve Hodges from Microsoft Research.
Abstract: Promising findings in the use of wearable memory aids such as SenseCam have bee...