Two Interesting Technologies People Without Sight Can Use to Read The Stock Market

Photo by Daniel Ali on Unsplash

Ever since I read this book by neuroscientist David Eagleman, I have taken an interest in how our senses (vision, hearing, smell, touch and taste) are formed and, in general, how brain plasticity works (at a high level, of course, because I’m far from an expert in the field).

I’m impressed by how one sense can make up for a lost one (e.g. hearing for lost vision) and how other parts of the body can serve as a means of receiving information, essentially creating a totally new sense.

Visually impaired people, totally blind or with low vision, have the same desires as everyone else when consuming or producing data — this is obvious, right? They are writers, readers, social media users and investors in the stock market.

Screen readers are a great assistive technology, widely used to accomplish tasks on a computer. These programs process the text on the screen and read it out loud.

However, screen readers are ineffective for consuming information from a graph. They may work for simple charts, but for more complex data they are not enough: they do a poor job of identifying categories of data and trends.

Two interesting solutions can help visually impaired users navigate the digital world better, not only for analysing financial charts but in other use cases as well.

Neosensory Buzz

It’s a wearable wristband that captures data or sound and translates it into vibration patterns felt on the skin. It’s built around the idea of sensory substitution, “the field that studies the transmission of information to the brain through unusual brain pathways”.

Image credit: Neosensory blog.

For example, the device translates a dog barking into a specific vibration on the skin. After you notice that vibration a few times, the brain learns what it means. The next time you feel it, you instantly know a dog is barking. The same logic applies to other types of sounds, and to data in general.

If you go to the website, you will notice that it is more focused on capturing sounds to assist deaf people. But it also has an API through which developers can transmit other kinds of information, such as stock market data or even infrared light.
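To make the idea concrete, here is a minimal sketch of how a scalar data stream (say, a stock price change) could be encoded as a vibration frame for a four-motor wristband. The frame format, motor count and intensity scale are assumptions for illustration, not the real Neosensory SDK.

```python
def encode_frame(value, lo, hi, motors=4, max_intensity=255):
    """Map a value in [lo, hi] to per-motor vibration intensities.

    The position of the strongest motor encodes where the value falls
    in the range; neighbouring motors get a weaker 'halo' so the
    sensation feels continuous as the value moves.
    """
    # Normalise to [0, 1], clamping out-of-range values.
    t = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    # Fractional position across the motor strip.
    pos = t * (motors - 1)
    frame = []
    for m in range(motors):
        # Intensity falls off linearly with distance from the value's position.
        weight = max(0.0, 1.0 - abs(m - pos))
        frame.append(round(weight * max_intensity))
    return frame

# A price change at the bottom of the range drives only the first motor;
# one in the middle of the range is felt between the middle motors.
print(encode_frame(-5.0, -5.0, 5.0))  # [255, 0, 0, 0]
print(encode_frame(0.0, -5.0, 5.0))   # [0, 128, 128, 0]
```

As with the dog-bark example, the point is that the mapping is consistent: feel the same pattern a few times and the brain starts reading it automatically.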

Such a wearable device is an interesting technology that gives us superpowers, even if we are not blind or deaf.

Human eyes cannot see infrared light, but the device can detect it and translate it into a specific vibration pattern. Every time you feel that vibration, it will be as if you are ‘seeing’ it.

Data Sonification

Data sonification, or auditory display, became a research field in 1992. It studies the process of transforming data into sound signals. For example, one could develop a data sonification program that reads stock market data and plays a high note for a high point in the graph and a low note for a low one.
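The high-note/low-note idea can be sketched in a few lines: map each price in a series to a pitch, so an upward trend is heard as a rising melody. The frequency range here is an arbitrary choice for illustration.

```python
def sonify(prices, f_min=220.0, f_max=880.0):
    """Map each price to a frequency (Hz) within [f_min, f_max].

    The lowest price in the series plays the lowest note and the
    highest price the highest; everything else is interpolated linearly.
    """
    lo, hi = min(prices), max(prices)
    if hi == lo:
        # A flat series plays a single middle tone.
        return [(f_min + f_max) / 2] * len(prices)
    return [f_min + (p - lo) / (hi - lo) * (f_max - f_min) for p in prices]

# A rising price series maps to rising frequencies.
print(sonify([100, 105, 110]))  # [220.0, 550.0, 880.0]
```

Feeding these frequencies to any tone generator would let a listener follow the shape of the chart by ear alone.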

As Thomas Hermann argued in 2008, one could claim that music is a type of data sonification, but it is not. As an analogy, music compares to data sonification as a painting compares to a chart of the stock market. Data sonification is all about conveying the underlying data; the sound must clearly allow a correct interpretation. Music and painting, by contrast, are more subjective, with more layers of interpretation, focusing on how the viewer or listener can be inspired.

Bloomberg is one company that invested time and resources in building a system that supports data sonification. Their goal was to make their application more accessible to people without sight. One of their projects was the development of an iPhone app called Sonify: as the user slides a finger across the graph, the app emits different sounds following the oscillation of the chart.

Such a technique is not only useful for visually impaired users. Imagine a highly critical system that requires 24/7 monitoring. It’s not practical to watch the logs in real time, and by the time you spot an issue there, it may already be too late. Alerts and alarms are an alternative, but they often arrive after the fact, generate false positives and flood the email inbox. A continuous audio rendering of system metrics could let an operator hear an anomaly as it develops, without staring at a dashboard.

Conclusion

Although I have emphasized visually impaired users consuming information from financial charts and the stock market, the scope of such technologies is bigger.

Sound or vibration on the skin can serve as an extra layer on top of seeing and reading. It can allow for quicker interpretation, and one can consume data at a distance or on the move.

It seems these research areas have not attracted much interest yet; at least I don’t see them discussed much. Perhaps that is because they mostly benefit a minority. Another reason may be that there is already too much content to see; adding two extra channels for consuming data can feel even more overwhelming.

Nevertheless, depending on the use case, these technologies are definitely useful. Thanks for reading.
