The future of UX / UI design: AR glasses, EMG sensors, haptic suits, and artificial intelligence assistants

April 11, 2021

Augmented reality (AR) technology is finding wider use in almost every field, from e-commerce to education, while also changing human-computer interaction and user experience as we know it.

The immersive experience provided by AR applications lets users view and interact with all kinds of computer-generated audio and visual content as if it were part of the physical world around them.

The 3D, immersive experiences provided not only by AR applications but by all XR technologies (Extended Reality, the umbrella term covering AR / VR / MR) go far beyond the limited access to information offered by the hardware and software interfaces we use today. We still reach computer-sourced digital data through two-dimensional displays (computer monitors and mobile devices), electronic keyboards descended from physical typewriters, touch screens, and of course control interfaces such as the mouse, which entered our lives in the late 1960s. Looking at the rapidly developing content and applications of AR / VR technologies, it is obvious that these traditional hardware interfaces will fall short.

Facebook Reality Labs, which has been conducting research on AR / VR technologies in recent years, shared its vision of the user experiences that await us in the near future, far beyond 2D screens and mice. In 2019, Facebook acquired CTRL-Labs, a company developing hardware for human-machine interaction. CTRL-Labs had built a device that, with the help of electromyography (EMG) sensors in an electronic band worn on the wrist, converts the electrical signals passing through the body's neural pathways into digital inputs. This wrist-worn hardware, whose team joined Facebook Reality Labs after the acquisition, detects the motor neuron signals sent by the brain and central nervous system to the body's muscles and converts them into digital commands that computers can understand.

In other words, the device translates the body's bioelectrical activity into digital data. For example, when a person intends to move a finger even a millimeter, the signals the central nervous system transmits to the muscles are detected in transit, the electrical signal is analyzed, and the "finger movement" command is recognized. The objective is to control machines directly with the signals traveling through the body's peripheral nerves. An important point here is that this technology should not be confused with placing electrodes directly on the human brain, as Neuralink does. It is not a brain implant in any way, but simply a wristband that reads the electrical signals passing through the muscles.
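To make that pipeline concrete, here is a minimal sketch of how raw EMG samples from a wristband might become a discrete command. The channel count, sample rate, RMS features, and threshold rule are illustrative assumptions for this example, not details of the CTRL-Labs device.

```python
import numpy as np

# Hypothetical sketch: turning raw EMG samples from a wrist-worn band
# into a discrete command. All constants below are illustrative.

SAMPLE_RATE_HZ = 1000      # EMG is typically sampled around 1-2 kHz
WINDOW_MS = 50             # short windows keep latency low
CHANNELS = 8               # electrodes spaced around the wrist

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel, a common EMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(features: np.ndarray, threshold: float = 0.3) -> str:
    """Naive rule: the strongest channel above threshold decides
    which 'finger' moved; otherwise the hand is at rest."""
    if features.max() < threshold:
        return "rest"
    return f"finger_{int(features.argmax())}_move"

# Simulated 50 ms window: background noise plus a burst on channel 2,
# standing in for a millimeter-level finger movement.
samples = int(SAMPLE_RATE_HZ * WINDOW_MS / 1000)
window = np.random.normal(0, 0.05, (samples, CHANNELS))
window[:, 2] += np.random.normal(0, 0.5, samples)

print(classify(rms_features(window)))   # e.g. "finger_2_move"
```

A real system would replace the threshold rule with a trained classifier, but the shape of the loop, window the signal, extract features, emit a command, stays the same.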

Facebook's main objective is to ease access to information in Augmented Reality (AR) / XR experiences and, in the long term, to retire ill-suited interfaces such as the mouse and keyboard we currently use. This is not a kind of user experience we have encountered before, but a revolutionary step in human-computer interaction: whether you are sitting or walking, you will be able to control devices with finger movements of just a few millimeters.

Because less muscle movement and physical effort is needed, you can navigate between user interfaces much faster than usual, type on virtual keyboards faster than on physical ones (or even with no keyboard at all), and operate tools that demand very precise control. Remote control of surgical robots, in particular, will become far more practical, because no equipment will stand between your muscles and the computer any longer. Such hardware will not only perceive commands but also respond through haptic feedback, that is, vibration or movement that simulates the sense of touch.

Of course, converting the complex signals transmitted through the central nervous system into precise, meaningful digital commands is a very difficult problem. This is where artificial intelligence becomes crucial: as the computer perceives these commands, it has to weigh them against environmental factors and the user's context, otherwise unintentional control errors can occur. For example, while you are running, your AI personal assistant should detect that you are running and prevent arm contractions, far stronger than those in a normal sitting position, from being interpreted as individual commands, or it should restrict control over applications to avoid inadvertent input during movement. Visually, instead of showing you unnecessary screens while running, it should offer only things like the distance you have run, your location on the map, or playlist options for the music playing in the background.
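As a rough illustration of that gating idea, the sketch below drops or whitelists gesture commands according to a detected activity. The activity labels, confidence thresholds, and per-context allow-lists are invented for this example; no published Facebook Reality Labs design is implied.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical context-aware command gating. Which commands are
# allowed in which activity is an assumption made up for the sketch.
ALLOWED_WHILE = {
    "running": {"next_track", "show_map", "show_distance"},
    "sitting": None,  # None means every command is allowed
}

@dataclass
class GestureEvent:
    command: str
    confidence: float  # gesture classifier confidence, 0..1

def gate(event: GestureEvent, activity: str) -> Optional[str]:
    """Drop gestures that are implausible for the current activity:
    while running, demand higher confidence and a whitelisted command."""
    min_conf = 0.9 if activity == "running" else 0.6
    if event.confidence < min_conf:
        return None
    allowed = ALLOWED_WHILE.get(activity)
    if allowed is not None and event.command not in allowed:
        return None
    return event.command

print(gate(GestureEvent("open_mail", 0.95), "running"))   # None: blocked
print(gate(GestureEvent("next_track", 0.95), "running"))  # "next_track"
```

The design choice worth noting is that context raises the bar for acting on a signal rather than reinterpreting the signal itself, which is exactly the "limited control during movement" behavior described above.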

Conversely, when you sit at a table in your room or in a cafe, the AI assistant can surface many more screens through the AR glasses than you need while running, and, because you are not in a hurry, even the subtlest tensions in your arm muscles can be observed and converted into commands. In short, the AI assistant will enrich reality according to your habits in a continuous loop and will streamline the user experience to the highest degree. This enrichment will, of course, happen at the visual, auditory, and haptic levels. Instead of "Augmented Reality", we may come to speak of "Personalized Reality", where each person's experience is entirely their own. The services and interfaces that have so far been called customer-oriented will not even come close to the experience that awaits us. Imagine seeing and hearing a reality that no one else has seen. As mentioned above, an important part of this customization will be handled by the AI assistant, taking the user's situation and environment into account.

Not only visual user interfaces, but also haptic clothing that triggers the sense of touch will be part of the future user experience. Haptic suits are currently used mainly in virtual reality (VR) games and simulation applications. Mini motors embedded in the garment deliver vibration and movement to the wearer's body, mimicking the sense of touch in real life. For example, with the haptic suit developed by Teslasuit, when someone shoots you in a VR game, you feel a simulated sensation at the point of impact. Haptic equipment adds touch to visual and auditory communication, creating a whole new layer of human-computer interaction. For now it is mostly confined to VR games and professional simulations, but in the future we may well have the curious experience of sending haptic-based emojis instead of the emojis we currently send each other from our phones.
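A toy version of that motor-driving logic might look like the following. The motor layout, the linear intensity falloff, and the actuate() stub are all assumptions for illustration; a real suit such as the Teslasuit is driven through its own SDK.

```python
import math

# Hypothetical sketch: mapping a virtual hit to nearby vibration
# motors in a haptic suit. Coordinates and falloff are made up.

# A few motors on the torso, as (name, x, y) in suit coordinates.
MOTORS = [
    ("chest_left", -0.15, 0.40),
    ("chest_right", 0.15, 0.40),
    ("abdomen", 0.0, 0.10),
    ("back_center", 0.0, 0.35),
]

def actuate(motor: str, intensity: float) -> None:
    """Stand-in for a real SDK call that drives one motor."""
    print(f"{motor}: vibrate at {intensity:.2f}")

def on_hit(x: float, y: float, strength: float = 1.0) -> None:
    """Drive each motor with intensity falling off linearly with
    distance from the impact point, so the hit feels localized."""
    for name, mx, my in MOTORS:
        distance = math.hypot(x - mx, y - my)
        intensity = strength * max(0.0, 1.0 - distance / 0.3)
        if intensity > 0:
            actuate(name, intensity)

on_hit(-0.12, 0.38)  # a shot lands near the left chest
```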

In short, in the near future XR technologies will completely change the user experience and designs we are used to, especially as AR glasses become widespread. The physical keyboard, mouse, and touch screen we use today will give way to EMG and haptic wristbands that turn the muscles of the human body into an interface. An important part of this hardware will be the AI assistants that filter commands according to the context of the user's environment. Augmented Reality (AR) will evolve into more advanced "Personalized Reality" experiences powered by AI. In addition, wearable technologies will serve not only as equipment for controlling machines, but also in many other areas, such as monitoring the human body, remote diagnosis and treatment, and sports.

On the other hand, every new technology arrives with a dark side. It is quite dangerous that, when we use such wearable equipment, not only our written and visual information but also the signals of our muscles and nervous system can be measured and monitored, and could be seized by third parties in the event of a security breach. In addition, the heightened sense of immersion this hardware creates can expose users to unethical experiences. For example, a pop-up ad that suddenly appears while you are wearing AR glasses inside an app, with your haptic hardware simultaneously emitting a matching tactile alert, will be far more intrusive than anything we have encountered on a phone or computer screen. Although Facebook has published ethical and design guidelines for user safety in XR technologies, we will only see what actually awaits us once we start using these technologies more intensively.
