An AR brain-computer interface that shows the color, speed, and shape of our emotions

If we could see our emotions through a brain-computer interface (BCI) device based on Augmented Reality (AR), what color, shape, or velocity would they be?

“Is it really possible to tell someone else what one feels?” ― Leo Tolstoy, Anna Karenina

Through evolution, the human species has learnt to mask its true feelings. We smile in polite society even when we hurt or rage inside, so much so that at times even we don’t realize what we are actually feeling.

What if there were a BCI platform that could show us our true feelings on a screen? Would they shock us? Would we be relieved to finally be aware of what we’ve known all along?

Kammil Carranza, Creative Director and Project Manager at Augmented Island Studios, has come up with just such a device, one that uses Augmented Reality (AR) to display an individual’s true emotions by rendering what they perceive on a screen through color, shape, and velocity.

The Sociable spoke with Carranza about his project Daydreamers.

“Sometimes you feel angry, but you might wonder what your anger looks like or whether it’s different from another person’s anger. So I wanted to give these abstract concepts a representation,” he says.

Daydreamers builds a new perception of reality, proposing that the world around us be reconstructed as a direct response to our brainwaves, which can be interpreted as the emotions we are experiencing.

Inspiration

Carranza started the project as part of his master’s thesis at the Institute for Advanced Architecture of Catalonia (IAAC) under Professors Luis Fraguada and Elizabeth Bigger, within the advanced interaction research line, with a focus on how wearable technology can augment the senses.

He reveals that in the beginning, the project was about recording dreams, so that people could experience other people’s dreams as well as see them on an AR platform.

“My basic inspiration came from understanding something that you cannot see but you do feel. Something similar to ghosts,” he says.

How it Works

For Daydreamers to work, the user wears a Muse headband, an EEG headset that measures brain activity. Based on the data recorded by this sensor, the user sees custom digital content set in the real world, with AR as the visualization platform.
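For readers curious about what that data pipeline might look like, the sketch below is a minimal, hypothetical example, not Augmented Island Studios’ code, of pulling raw EEG samples from a Muse-style headband over Lab Streaming Layer. It assumes a bridge such as muse-lsl is already publishing an “EEG” stream on the local network.

```python
# Minimal sketch (not the Daydreamers implementation): read a short window
# of raw EEG from a Muse-style headband, assuming an LSL bridge such as
# muse-lsl is already streaming data of type "EEG".
from pylsl import StreamInlet, resolve_byprop


def read_eeg_window(n_samples=256):
    """Pull a short window of raw EEG samples (one row per sample)."""
    streams = resolve_byprop('type', 'EEG', timeout=10)
    if not streams:
        raise RuntimeError("No EEG stream found -- is the headband streaming?")
    inlet = StreamInlet(streams[0])
    window = []
    for _ in range(n_samples):
        sample, _timestamp = inlet.pull_sample()
        window.append(sample)  # e.g. 4 channels on a Muse headband
    return window


if __name__ == "__main__":
    print(len(read_eeg_window()), "samples captured")
```

In a setup like Carranza’s, a window of samples like this would then be processed and forwarded to the phone running the AR visualization.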

In simple terms, if you are wearing the headset and looking at a tree, the screen will show the tree in a shape and color that reflect your emotional state.

“I can pick the data and send it to the phone, transforming the content in real-time. So if you feel excitement or fear, it will change its shape, color and also the velocity of some elements,” Carranza explains.

The screen reflects the user’s emotions by changing the pattern, velocity, and colors of whatever the user perceives.

“So if you are depressed, it starts to show slow movement or goes purple and a bit faded. It converts the user data in a way to express his/her own emotions,” he says.
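To illustrate the kind of mapping Carranza describes, the hypothetical sketch below converts a rough arousal and valence estimate into color, speed, and opacity, with low, negative readings drifting toward slow, purple, faded visuals. The metric names, thresholds, and formulas here are assumptions for illustration, not the Daydreamers code.

```python
# Hypothetical mapping from emotion metrics to AR rendering parameters.
# Arousal and valence (both 0..1) are assumed to come from upstream EEG
# processing; the specific mapping below is illustrative only.
import colorsys


def emotion_to_visuals(arousal, valence):
    """Map arousal (calm -> excited) and valence (negative -> positive)
    to simple rendering parameters for the AR scene."""
    # Low arousal with negative valence drifts toward purple, slow motion,
    # and a faded look; excitement speeds things up and brightens the scene.
    hue = 0.75 - 0.5 * valence      # ~purple (0.75) toward warmer tones
    speed = 0.2 + 1.8 * arousal     # slow drift -> fast particle motion
    opacity = 0.4 + 0.6 * arousal   # faded -> vivid
    r, g, b = colorsys.hsv_to_rgb(hue % 1.0, 0.8, 0.9)
    return {"rgb": (r, g, b), "speed": speed, "opacity": opacity}


print(emotion_to_visuals(arousal=0.1, valence=0.2))  # slow, purple, faded
print(emotion_to_visuals(arousal=0.9, valence=0.8))  # fast, warm, vivid
```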

Carranza did not have a commercial application in mind while creating the project, but he believes it offers social insights. The research supports the benefits of creating a BCI that can control digital and physical content.

Even if the digital content lives in a virtual world, the device offers a reasonable understanding of how the physical and digital worlds can be mixed. In the process, not only can users delve into themselves for self-awareness, but they can also develop empathy by understanding another person’s emotions.

Navanwita Sachdev

An English literature graduate, Navanwita is a passionate writer of fiction and non-fiction as well as being a published author. She hopes her desire to be a nosy journalist will be satisfied at The Sociable.
