Stephen Hicks, a research fellow in neuroscience at the University of Oxford, says that “even when someone is losing their sight, they still have a good brain that’s trying to understand and pick up clues from objects if given enough input”. With that in mind, OxSight is building a pair of AR glasses that make the physical world visible, even to the visually impaired.
The eyes pick up only specific details (like colour, contrast or dimensions), while the occipital and parietal lobes make sense of the overall picture. With this in mind, Hicks, along with computer-vision scientist Philip Torr, created OxSight, a spinout launched in March 2016. The pair designed augmented-reality glasses that let partially sighted people make sense of their surroundings by spotlighting specific visual cues and overlaying them on the lenses in real time.
OxSight’s glasses use cameras and computer-vision algorithms to increase contrast, highlight specific visual features or create cartoonish representations of reality, depending on the eye condition they are compensating for. The final product is scheduled for release at the end of this year and will run on Android.
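OxSight has not published its processing pipeline, but the kinds of transforms described above can be approximated with standard computer-vision operations. The sketch below is a minimal illustration, not OxSight’s actual method, using OpenCV in Python: CLAHE for contrast boosting, Canny edge detection for highlighting features, and bilateral filtering plus thresholding for a cartoonish effect. The function name enhance_frame and the mode labels are hypothetical.

```python
import cv2


def enhance_frame(frame, mode="contrast"):
    """Apply one of three illustrative enhancement modes to a BGR camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if mode == "contrast":
        # Boost local contrast on the lightness channel with CLAHE,
        # leaving colour information untouched.
        lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
        l, a, b = cv2.split(lab)
        clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
        lab = cv2.merge((clahe.apply(l), a, b))
        return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

    if mode == "edges":
        # Paint detected edges in a bright colour over the original frame,
        # spotlighting object outlines.
        edges = cv2.Canny(gray, 80, 160)
        out = frame.copy()
        out[edges > 0] = (0, 255, 255)  # BGR yellow
        return out

    if mode == "cartoon":
        # Flatten colour regions with a bilateral filter and mask them with
        # dark outlines for a simplified, cartoon-like view.
        colour = cv2.bilateralFilter(frame, 9, 250, 250)
        blurred = cv2.medianBlur(gray, 7)
        outlines = cv2.adaptiveThreshold(
            blurred, 255, cv2.ADAPTIVE_THRESH_MEAN_C, cv2.THRESH_BINARY, 9, 2
        )
        return cv2.bitwise_and(colour, colour, mask=outlines)

    return frame


if __name__ == "__main__":
    # A webcam stands in here for the glasses' own camera feed.
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("enhanced", enhance_frame(frame, mode="edges"))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

In a real device, the chosen mode would presumably be fixed per user to match their eye condition, rather than switched frame by frame as in this demo loop.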