Integrating Minimalistic Localization and Navigation for People with Visual Impairments
Advisor: Bekris, Kostas E.
Indoor localization and navigation systems for individuals with visual impairments (VI) typically rely on extensive augmentation of the physical space or on expensive sensors; consequently, few systems have been adopted. This work describes a system able to guide people with VI through buildings using inexpensive sensors, such as accelerometers, that are available in portable devices like smartphones. This approach introduces challenges due to the limited computational power of portable devices and the highly noisy sensors. The method takes advantage of feedback from the human user, who confirms the presence of landmarks. The system computes the location of the user in real time and uses it to provide audio instructions on how to reach the desired destination. A first set of experiments suggested that localization accuracy depends on the type of directions provided and on the availability of good transition and observation models that describe the user's behavior. Because the system was not executed in real time during these initial experiments, the approach had to be improved. Toward an improved version of the method, a significant amount of computation was moved offline to speed up the system's online execution. Inspired by results in multi-model estimation, this work employs multiple particle filters, each using a different assumption for the user's average step length, which allows this parameter to be estimated adaptively on the fly. The system thus estimates the user's step length during operation, as it varies between people, from path to path, and during the execution of a path. Experiments are presented that evaluate the accuracy of the location estimation process and of the integrated direction-provision method. Sighted participants who were blindfolded took part in these experiments.
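The multi-model idea described above, running several particle filters in parallel, each committed to a different average step length, and letting landmark confirmations arbitrate between them, can be illustrated with a small sketch. Everything here is a hypothetical simplification for exposition (a 1-D corridor, a single landmark confirmation, and made-up function names); it is not the thesis implementation:

```python
import math
import random

# Hypothetical sketch of multi-model step-length estimation.
# One simple 1-D particle filter is run per step-length hypothesis.
# When the user confirms a landmark at a known position, each filter
# is scored by the average likelihood of its particles, and the
# best-scoring step-length hypothesis is selected.

def propagate(particles, steps, step_length, noise=0.05):
    """Advance each particle by the assumed travel distance plus Gaussian noise."""
    return [p + steps * step_length + random.gauss(0.0, noise)
            for p in particles]

def likelihood(particles, landmark_pos, sigma=0.5):
    """Average Gaussian likelihood of the particles given a confirmed landmark."""
    return sum(math.exp(-((p - landmark_pos) ** 2) / (2 * sigma ** 2))
               for p in particles) / len(particles)

def best_step_length(step_hypotheses, steps_taken, landmark_pos,
                     n_particles=200):
    """Run one filter per hypothesis; return the hypothesis that best
    explains the confirmed landmark position."""
    scores = {}
    for h in step_hypotheses:
        particles = [0.0] * n_particles          # all start at the origin
        particles = propagate(particles, steps_taken, h)
        scores[h] = likelihood(particles, landmark_pos)
    return max(scores, key=scores.get)

if __name__ == "__main__":
    random.seed(0)
    # The user took 20 steps and then confirmed a landmark 14 m from the
    # start; a 0.7 m step-length hypothesis should explain this best.
    print(best_step_length([0.5, 0.7, 0.9], steps_taken=20,
                           landmark_pos=14.0))  # → 0.7
```

In a full system each filter would also keep its own posterior over position and resample on every confirmation, so the parameter estimate is refined continuously rather than decided once, matching the abstract's point that step length varies within a single path.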