The ability to read minds may still be a long way off, but the ability to recognize and read emotions, especially if you have difficulty doing so due to autism, may have already arrived.

Google Glass, Google’s augmented-reality eyewear that failed to catch on with the tech crowd several years ago, has made a comeback with apps like Empower Me.

Facial recognition software can detect several emotions, including happiness, sadness, anger, fear, disgust, contempt and surprise. The wearer of the Google Glass is then shown an emoticon in the corner of the eyewear, letting them know what the person across from them is “feeling.”
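To make the idea concrete, here is a minimal sketch of that last step: take the per-emotion confidence scores a facial-recognition model might produce for a single frame and choose an emoticon to overlay in the wearer’s display. The score dictionary, threshold and emoji mapping are purely hypothetical illustrations, not Empower Me’s actual output or API.

```python
# Hypothetical mapping from detected emotion labels to display emoticons.
EMOTICONS = {
    "happiness": "🙂",
    "sadness": "🙁",
    "anger": "😠",
    "fear": "😨",
    "disgust": "🤢",
    "contempt": "😒",
    "surprise": "😮",
}


def pick_emoticon(scores: dict, threshold: float = 0.5) -> str:
    """Return the emoticon for the highest-scoring emotion,
    or a neutral face if nothing clears the confidence threshold."""
    emotion, confidence = max(scores.items(), key=lambda item: item[1])
    if confidence < threshold:
        return "😐"
    return EMOTICONS.get(emotion, "😐")


# Example: scores a classifier might report for one video frame.
print(pick_emoticon({"happiness": 0.72, "surprise": 0.18, "anger": 0.04}))  # 🙂
```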

So you can’t read her mind…but you can read her emotions.

In some cases, you probably wouldn’t want to do either.

Children who have taken part in studies using the Glass have been shown to be better at recognizing emotions. The conclusions so far are limited to small study populations, but more funding is being requested for future research.

More than likely, this is a great tool for teaching emotional recognition, and the empathy that accompanies it, in an instructional setting. But don’t expect a set of gear that will magically transform an emotionally unresponsive person into one who loves to communicate and cuddle. A child (or adult) can be taught empathy until the sun goes down, but that doesn’t mean he’ll translate it into the way he interacts with his peers on a daily basis. Why, the photograph above could have several meanings, including 1. I really understand what you feel, 2. I feel the same way, or 3. I think these glasses are telling me to give you the finger.

Combining tech and software that “teach” emotions with plenty of real interaction with people who share common interests is probably the best way to ingrain emotional reciprocity in a person’s brain.
