Thursday, 13 December 2018

Designing the Emotional Interfaces of the Future

“People will forget what you said, people will forget what you did, but people will never forget how you made them feel.” (Maya Angelou)

Today, when we think about which product we want to use, we have a lot of options to choose from. But there’s one thing that plays a crucial role in the decision-making process: emotions. Humans are an emotion-driven species. We evaluate products not only on their utility but also on the feelings they evoke, and we prefer products that create a sense of excitement over dull products that merely solve our problems.

A lot of articles have been written about how to design emotional interfaces. Most of them describe how to do it with fine microcopy, illustrations, animations, and visual effects. This article is different: here we’ll see how designers can follow different approaches to create genuinely innovative interfaces. The tips below will help you design the interfaces of the future.

Emotional Interfaces of the Future

People create bonds with the products they use. The emotions users feel (both positive and negative) stay with them even after they stop using a product. The peak–end rule states that people judge an experience largely by how they felt at its most intense point and at its end, and the effect holds regardless of whether the experience was pleasant or unpleasant.

It’s evident that positive emotional stimuli can build better engagement with your users—people will forgive a product’s shortcomings if you reward them with positive emotions.

In order to influence emotions, designers need a solid understanding of the general factors that influence users, such as:

  • Human cognition – the way people consume information, learn, or make decisions;
  • Human psychology – the factors that affect emotions, including colors, sounds, music, etc.;
  • Cultural references;
  • Contextual factors – how the user feels at the time of using a particular product. For example, when users buy a ticket from a ticket machine, they want to spend as little time as possible on the task, and the machine’s UI needs to reflect that desire for speed.

By focusing on those areas, it’s possible to create an experience that can change the way people perceive the world. Let’s see how that works and how we can design beyond the screen.

Designing a Voice Interface That Feels Real

I probably don’t need to prove that voice is the future. Gartner research predicts that by the end of 2018, 30% of our interactions with technology will be through “conversations.” Even today, many of us use Amazon Echo and Apple’s Siri for everyday activities such as setting an alarm or making an appointment. But the majority of voice-interaction systems share a natural limitation: they are narrow AI. When we interact with products like Google Now or Siri, we have a strong sense of communicating with a machine, not a real human being, mainly because the system responds predictably and its responses are too scripted. We can’t have a meaningful dialogue with such a system.

But there are some completely different systems on the market today. One of them is Xiaoice, an AI-driven system developed by Microsoft that is built on an emotional computing framework. When users interact with Xiaoice, they have a strong sense of chatting with a real human being; some users even say they consider the system a friend.

Xiaoice’s limitation is that it is a text-based chat, but it’s clear that a much stronger effect could be achieved with voice-based interaction, because voice can convey a powerful spectrum of emotions. Remember the film Her, in which the main character, played by Joaquin Phoenix, falls in love with Samantha, a sophisticated OS. The most interesting thing about the film is that Theodore never has a visual image of Samantha; he only has her voice.

It’s possible to bake a broad spectrum of emotions into voice and tone, and suddenly our routine daily tasks become less dull and more entertaining when we combine voice and visual input.


Voice interfaces for Brain.ai
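To make the idea of emotional tone a little more concrete, here is a minimal, hypothetical sketch in TypeScript using the browser’s Web Speech API. The “emotion” presets are illustrative assumptions rather than a real emotion model (systems like Xiaoice use far richer ones), but even crude changes to pitch and rate make the same sentence feel noticeably different.

    // Minimal sketch: varying the emotional tone of spoken output with the Web Speech API.
    // Assumes a browser that supports speech synthesis; the presets are invented for illustration.
    type Emotion = 'calm' | 'excited' | 'serious';

    const presets: Record<Emotion, { pitch: number; rate: number }> = {
      calm:    { pitch: 0.9, rate: 0.85 },
      excited: { pitch: 1.4, rate: 1.15 },
      serious: { pitch: 0.8, rate: 0.95 },
    };

    function speakWithEmotion(text: string, emotion: Emotion): void {
      const utterance = new SpeechSynthesisUtterance(text);
      utterance.pitch = presets[emotion].pitch; // 0–2, default 1
      utterance.rate = presets[emotion].rate;   // 0.1–10, default 1
      window.speechSynthesis.speak(utterance);
    }

    // The same reminder, delivered with different tones:
    speakWithEmotion('Your meeting starts in five minutes.', 'calm');
    speakWithEmotion('Your meeting starts in five minutes.', 'excited');

In a real assistant these presets would be chosen dynamically by the dialogue system based on context, not hard-coded.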

Evolution of AR Experience—From Mobile Screen to Glass

Augmented Reality (AR) is a digital overlay on top of the real world. The beauty of AR is that it adds an extra layer of information to the objects in our environment, transforming them into interactive digital experiences and making the environment more intelligent. The illusion of a ‘tangible object’ at the user’s fingertips creates a deeper connection between the user and the product or content.

Early AR work in the ’90s focused mainly on the technology; even when designers put content into their products, the goal was to demonstrate what AR was capable of. But the situation has changed. AR is no longer technology for the sake of technology. The success of Pokémon Go proved that AR can create a whole new level of engagement and that people are happy to adopt it, which frees designers to focus on content and the human experience.

AR can be used not just for entertainment; it can be a powerful problem-solving tool. Here are just a few things it can help with:

Improving the Online Shopping Experience

According to Retail Perceptions, a report that analyses the influence of AR on the retail sector, 61% of shoppers already prefer to shop at stores that offer AR. The survey says the most popular items to shop for with augmented reality are furniture, clothing, groceries, and shoes. People love AR because it lets them examine product details closely and makes the shopping experience fun.


Users can decide whether they like an item before buying it, which is especially important for clothing and furniture. As a result, AR can reduce product return rates and save retailers money on returns.

Creating a New Level of Experience

AR helps us see an enhanced view of the world. For example, it might offer a new level of in-flight experience.


AR in-flight experience for the Airbus A380

Or it can provide rich contextual hints about your current location. The technology known as SLAM (simultaneous localization and mapping) can be used for this: SLAM maps the environment in real time. It powers Google’s self-driving cars, but it can also be applied to AR to anchor multimedia content in the surrounding environment.


Provide additional information in context

Last but not least, you can completely reimagine existing concepts and use a new dimension (AR) to provide additional information.


The concept of interactive walls – a digital overlay on top of the real world

Today it’s becoming much easier to build AR experiences. Frameworks like ARKit and ARCore have made sophisticated computer-vision algorithms available for anyone to use.
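For web designers, the closest counterpart to those native SDKs is the WebXR Device API. Below is a minimal, hypothetical sketch (assuming a WebXR-capable browser and type definitions such as @types/webxr) that uses the hit-test feature, which exposes the device’s underlying world tracking, to find a real-world surface where content could be anchored; WebGL rendering is omitted to keep it short.

    // Minimal sketch: finding a real-world surface with WebXR hit testing.
    // Assumes a WebXR-capable browser (e.g. Chrome on Android) and @types/webxr typings.
    // A real app would also set up a WebGL layer via session.updateRenderState() and render content.
    async function startArPlacement(): Promise<void> {
      const xr = navigator.xr;
      if (!xr || !(await xr.isSessionSupported('immersive-ar'))) {
        console.log('Immersive AR is not supported on this device.');
        return;
      }

      // 'hit-test' exposes the world tracking (the SLAM layer) that ARKit/ARCore provide.
      const session = await xr.requestSession('immersive-ar', { requiredFeatures: ['hit-test'] });
      const viewerSpace = await session.requestReferenceSpace('viewer');
      const localSpace = await session.requestReferenceSpace('local');
      // Optional in the typings, but available because the 'hit-test' feature was requested above.
      const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });

      session.requestAnimationFrame(function onFrame(_time, frame) {
        const hits = frame.getHitTestResults(hitTestSource);
        const pose = hits[0]?.getPose(localSpace);
        if (pose) {
          // A detected surface; this is where a 3D model or info card would be placed.
          const { x, y, z } = pose.transform.position;
          console.log(`Surface found at (${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)})`);
        }
        session.requestAnimationFrame(onFrame);
      });
    }

Native ARKit and ARCore apps follow the same basic pattern: request world tracking, hit-test against detected surfaces, and anchor content at the resulting pose.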

When it comes to technology, the vast majority of AR experiences today are phone-based. The primary reason phone-based AR has become so powerful is obvious: 95% of the global population owns a smartphone. At the same time, AR on mobile devices has two natural limitations:

  • Field of view (the augmented view is still restricted to the physical bounds of your mobile screen);
  • Input (we still have to use touch and on-screen gestures to interact with the app).

It’s possible to create a much more engaging experience using glass technology. Just imagine not having to take your phone out of your pocket to get an AR experience.

When you hear the words “AR glasses,” you most probably think of Google Glass. It’s been nearly five years since the release of Google Glass, a promising concept for a standalone AR headset. Unfortunately, the first version of the product never reached retail stores, and its failure on the market led to countless discussions about whether the whole concept is stillborn.

Many people believe that Glass is a dumb idea. As for me, I strongly believe that everything visionary looks stupid at first.

Technological change isn’t a linear process; it comes in waves, and each new wave can completely change the way we think about technology.

The key to innovation is building something that nobody has built before, and we need to experiment to find a winning formula—the one that helps us create a product people love. I remember when people said touchscreen phones were stupid because of Palm’s and Microsoft’s lousy implementations. Then along came Apple, and now most of us use touchscreens. Once a product is done right, the technology makes people change their point of view.

One of the promising concepts for AR glasses is Rokid Glass. Rokid envisions its smart glasses as a kind of next-generation Google Glass: a standalone headset (meaning it won’t require you to plug into a smartphone or desktop) that will run on batteries and incorporate an internal processor to handle computing on its own. Rokid is just one part of a broader movement towards consumer AR glasses, bringing concepts we saw in science-fiction movies to life.

However, by the time AR glass technology is widely accepted, we might face the problems of augmented hyper-reality. Augmented reality may recontextualize the functions of consumerism and change the way we operate within it. The fear is that it might change it for the worse, making the environment overwhelming for users. As a result, a technology intended to bring only positive emotions could flip to the entirely negative end of the spectrum.

Moving From Augmented Reality Towards Virtual Reality to Create an Immersive Experience

AR has a natural limitation: as users, we perceive a clear line between us and the content. That line separates one world (AR) from another (the real world), and it creates the sense that the AR world is not real.

You can probably guess the answer to this AR design limitation: VR. Thanks to VR, we can have a truly immersive experience, one that removes the barrier between worlds so that the real world and the VR world blur together.

Recent devices like the Oculus Quest offer everything in one unit—positional tracking, controllers, and processing in a standalone headset, with no separate PC required. This makes a VR device a self-contained product, not just an accessory for your phone or desktop computer.

VR can work great both for entertainment—just imagine experiencing 360-degree movies or using natural gestural interactions in games—and in the office: imagine video calls in your favorite messenger evolving into VR calls where you can actually interact with the other person. This will help establish a much deeper emotional connection between people.
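As a hypothetical illustration of what “natural gestural interaction” looks like at the code level, here is a minimal WebXR sketch (again assuming a WebXR-capable headset and @types/webxr) that reads the position of each hand controller every frame. A real experience would map those poses onto gestures and avatars; rendering is omitted.

    // Minimal sketch: tracking hand controllers in an immersive VR session with WebXR.
    // Assumes a WebXR-capable browser/headset and @types/webxr; rendering and gesture
    // recognition are omitted—this only shows where the raw pose data comes from.
    async function trackControllers(): Promise<void> {
      const xr = navigator.xr;
      if (!xr || !(await xr.isSessionSupported('immersive-vr'))) {
        console.log('Immersive VR is not supported on this device.');
        return;
      }

      const session = await xr.requestSession('immersive-vr');
      const refSpace = await session.requestReferenceSpace('local');

      session.requestAnimationFrame(function onFrame(_time, frame) {
        for (const input of session.inputSources) {
          // gripSpace tracks where the controller (or hand) physically is.
          if (input.gripSpace) {
            const pose = frame.getPose(input.gripSpace, refSpace);
            if (pose) {
              const { x, y, z } = pose.transform.position;
              console.log(`${input.handedness} hand at (${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)})`);
            }
          }
        }
        session.requestAnimationFrame(onFrame);
      });
    }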

VR will invite designers to think about important questions such as:

  • Rethinking the process of creating digital products. It’s clear that the current state of VR is way too skeuomorphic, and we’re still defining how we want users to interact in virtual space. But that’s an excellent task for designers.
  • The ethics of removing the line between content and UI (“Where should the line between content and UI start and end?”).
  • The rise of VR addiction. Today we have the problem of “smartphone zombies”—people glued to their phones who don’t see the world around them. With VR we might face even more addictive behavior: people will hunt for the new level of experience and the powerful emotions VR delivers, and as a result they might go too deep into the VR experience and tune out reality.

Conclusion

When we think about the current state of product design, it becomes evident that we’ve only seen the tip of the iceberg. We’re witnessing a fundamental shift in human–computer interaction (HCI)—a rethinking of the whole concept of digital experience.

In the next decade, designers will break the glass (the era of mobile devices as we know them today will be over) and move on to the interfaces of the future.

 

Featured image via DepositPhotos.



from Webdesigner Depot https://www.webdesignerdepot.com/2018/12/designing-the-emotional-interfaces-of-the-future/
