
Saturday, 02 July 2016

Future Technology: Natural User Interface


NUI stands for "Natural User Interface." An NUI is a type of user interface that is designed to feel as natural as possible to the user. The goal of an NUI is to create seamless interaction between the human and machine, making the interface itself seem to disappear.
A common example of a natural user interface is a touchscreen interface, which allows you to move and manipulate objects by tapping and dragging your finger(s) on the screen. The digital objects on the screen respond to your touch, much like physical objects would. This direct feedback provided by a touchscreen interface makes it seem more natural than using a keyboard and mouse to interact with the objects on the screen.
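To make the idea concrete, here is a minimal Python sketch of direct manipulation, where an on-screen object tracks a finger drag one-to-one. The Sprite class and the on_touch_move handler are invented for illustration and do not belong to any particular touch framework.

```python
# Minimal sketch of direct manipulation on a touchscreen.
# Sprite and on_touch_move are hypothetical names, not a real API.
from dataclasses import dataclass

@dataclass
class Sprite:
    x: float
    y: float

def on_touch_move(sprite: Sprite, dx: float, dy: float) -> None:
    # The object moves by exactly the finger's displacement, which is
    # what makes it feel like sliding a physical object around.
    sprite.x += dx
    sprite.y += dy

icon = Sprite(x=100, y=200)
on_touch_move(icon, dx=15, dy=-5)  # finger dragged right and slightly up
print(icon)                        # Sprite(x=115, y=195)
```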
Another modern example of an NUI is a motion-based video game. The Nintendo Wii, for instance, allows you to wave a controller in the air to perform actions on the screen. Microsoft's Xbox Kinect allows you to control your on-screen character by simply moving your body. Both of these motion-based interfaces are considered natural user interfaces since they respond to your natural motions.
While touchscreens and motion-based games are two of the most common types of NUIs, several others exist as well. For example, a voice recognition interface like Apple's Siri assistant on the iPhone is considered a natural user interface since it responds to naturally spoken commands and questions. Virtual reality devices are NUIs, since they emulate a real world experience. Even some robots are considered natural user interfaces since they respond to human motion and spoken commands.

A natural user interface (NUI) is a system for human-computer interaction that the user operates through intuitive actions related to natural, everyday human behavior.
An NUI may be operated in a number of different ways, depending on the purpose and user requirements. Some NUIs rely on intermediary devices for interaction, but more advanced NUIs are either invisible to the user or so unobtrusive that they quickly seem invisible.

Some examples and applications of natural user interfaces:
Touch screen interfaces let users interact with controls and applications more intuitively than a cursor-based interface because the interaction is more direct: instead of moving a cursor to select a file and clicking to open it, for example, the user touches a graphic representation of the file to open it. Smartphones and tablets typically enable touch input. Touch is being adapted for non-screen applications as well. For example, Microsoft is working on a touch interface called “Skinput” that allows users to interact by tapping their own skin.
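A rough sketch of the "touch a file icon to open it" idea above might look like the following; the icon layout and the open_file function are assumptions made purely for illustration.

```python
# Hit-testing a tap against on-screen file icons (illustrative only).
icons = {
    "report.pdf": (0, 0, 120, 120),    # name -> (left, top, right, bottom)
    "photo.jpg":  (140, 0, 260, 120),
}

def open_file(name: str) -> None:
    print(f"opening {name}")           # stand-in for launching the file

def on_tap(x: int, y: int) -> None:
    # The file under the finger is opened directly; no cursor is involved.
    for name, (left, top, right, bottom) in icons.items():
        if left <= x <= right and top <= y <= bottom:
            open_file(name)
            return

on_tap(60, 40)   # finger lands on report.pdf -> "opening report.pdf"
```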
Gesture recognition systems track user motions and translate those movements into instructions. The Nintendo Wii and PlayStation Move motion gaming systems work through controller-based accelerometers and gyroscopes that sense tilting, rotation and acceleration. A more intuitive type of NUI is outfitted with a camera and software in the device that recognizes specific gestures and translates them into actions. Microsoft’s Kinect, for example, is a motion sensor for the Xbox 360 gaming console that allows users to interact through body motions, gestures and spoken commands. Kinect recognizes individual players’ bodies and voices. Gesture recognition can also be used to interact with desktop computers.
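As a small illustration of the controller-based approach, the sketch below estimates a controller's tilt (roll and pitch) from a single 3-axis accelerometer reading, the kind of raw signal a Wii- or Move-style system translates into in-game actions. The sample values are made up.

```python
# Estimating tilt from one accelerometer sample (units of g).
import math

def tilt_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    # When the controller is roughly still, the accelerometer mostly measures
    # gravity, so the direction of that vector gives the controller's tilt.
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

roll, pitch = tilt_from_accel(ax=0.2, ay=0.5, az=0.87)
print(f"roll={roll:.1f} deg, pitch={pitch:.1f} deg")  # about 30 and -11 deg
```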
Speech recognition allows users to interact with a system through spoken commands. The system identifies spoken words and phrases and converts them to a machine-readable format for interaction. Speech recognition applications include call routing, speech-to-text, and hands-free computer and mobile phone operation. Speech recognition is also sometimes used to interact with embedded systems.
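The sketch below shows only the last step of that pipeline, mapping an already-recognized phrase to a machine-readable command; the recognize function is a placeholder for a real speech-to-text engine, and the command table is invented for the example.

```python
# Turning recognized speech into a command (illustrative sketch).
COMMANDS = {
    "call home": ("dial", "home"),
    "play music": ("media", "play"),
    "what time is it": ("query", "time"),
}

def recognize(audio: bytes) -> str:
    # Placeholder for an actual speech-to-text engine.
    return "call home"

def handle_utterance(audio: bytes) -> None:
    text = recognize(audio)
    action = COMMANDS.get(text)
    if action:
        print("executing", action)        # e.g. ('dial', 'home')
    else:
        print("unrecognized command:", text)

handle_utterance(b"<audio samples>")
```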
Gaze-tracking interfaces allow users to guide a system through eye movements. In March 2011, Lenovo announced that they had produced the first eye-controlled laptop. The Lenovo system combines an infrared light source with a camera to catch reflective glints from the user’s eyes. Software calculates the area of the screen being looked at and uses that information for input.
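A toy version of the "software calculates the area of the screen being looked at" step is sketched below: it maps the offset between the pupil center and the infrared glint to a screen coordinate with a linear calibration. The calibration constants are invented; a real system fits them during a calibration routine.

```python
# Mapping a normalized glint offset (-1..1 on each axis) to screen pixels.
SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution

def gaze_to_screen(offset_x: float, offset_y: float) -> tuple[int, int]:
    px = int((offset_x + 1) / 2 * SCREEN_W)
    py = int((offset_y + 1) / 2 * SCREEN_H)
    # Clamp so the estimate always lands on the visible screen.
    return (min(max(px, 0), SCREEN_W - 1), min(max(py, 0), SCREEN_H - 1))

print(gaze_to_screen(0.25, -0.4))   # -> (1200, 324)
```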
Brain-machine interfaces, also called brain-computer interfaces (BCIs), read neural signals and use programs to translate those signals into action. A BCI can make it possible for someone who is paralyzed to operate a computer, motorized wheelchair or prosthetic limb through thought alone.
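As a deliberately simplified picture of that signal-to-action translation, the sketch below thresholds a single made-up "motor band power" value to decide between two wheelchair commands; real BCIs use trained classifiers over EEG or implanted-electrode recordings.

```python
# Toy decoder: one neural feature -> one of two actions.
THRESHOLD = 0.7   # assumed decision boundary from a calibration session

def decode_intent(motor_band_power: float) -> str:
    # Above-threshold activity is read as an intent to move; otherwise stop.
    return "move_forward" if motor_band_power > THRESHOLD else "stop"

for power in (0.2, 0.9):
    print(power, "->", decode_intent(power))
```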
Speaking at a 2008 conference, Microsoft’s August de los Reyes described the NUI as the next evolutionary stage in computing after the graphical user interface (GUI), just as the GUI was the next stage after the command-line interface (CLI).


Source:
http://techterms.com/definition/nui 
http://whatis.techtarget.com/definition/natural-user-interface-NUI