
User control

The user interface (UI), in the industrial design field of human–computer interaction, is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, whilst the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to or involve such disciplines as ergonomics and psychology.

[Image gallery: historic HMI in the driver's cab of a German steam locomotive; modern HMI in the driver's cab of a German Intercity-Express high-speed train; the HMI of a toilet in Japan; voice user interface of a wearable computer (Google Glass); HMIs for audio mixing and video production; HMI of a sugar-industry machine with pushbuttons; older and newer HMIs for CNC machines; an emergency/panic switch.]

Generally, the goal of user interface design is to produce a user interface which makes it easy, efficient, and enjoyable (user-friendly) to operate a machine in the way which produces the desired result. This generally means that the operator needs to provide minimal input to achieve the desired output, and that the machine minimizes undesired outputs to the human.

User interfaces are composed of one or more layers, including a human–machine interface (HMI) that interfaces machines with physical input hardware such as keyboards, mice, and game pads, and output hardware such as computer monitors, speakers, and printers. A device that implements an HMI is called a human interface device (HID). Other terms for human–machine interface are man–machine interface (MMI) and, when the machine in question is a computer, human–computer interface. Additional UI layers may interact with one or more human senses, including: tactile UI (touch), visual UI (sight), auditory UI (sound), olfactory UI (smell), equilibrial UI (balance), and gustatory UI (taste).

Composite user interfaces (CUIs) are UIs that interact with two or more senses. The most common CUI is the graphical user interface (GUI), which is composed of a tactile UI and a visual UI capable of displaying graphics. When sound is added to a GUI, it becomes a multimedia user interface (MUI). There are three broad categories of CUI: standard, virtual, and augmented. Standard composite user interfaces use standard human interface devices such as keyboards, mice, and computer monitors. When the CUI blocks out the real world to create a virtual reality, the CUI is virtual and uses a virtual reality interface. When the CUI does not block out the real world and instead creates augmented reality, the CUI is augmented and uses an augmented reality interface.

When a UI interacts with all human senses, it is called a qualia interface, named after the theory of qualia. CUIs may also be classified by how many senses they interact with, as either an X-sense virtual reality interface or an X-sense augmented reality interface, where X is the number of senses interfaced with. For example, a Smell-O-Vision is a 3-sense (3S) standard CUI with visual display, sound, and smells; a virtual reality interface that also engages smell and touch is a 4-sense (4S) virtual reality interface, and an augmented reality interface that also engages smell and touch is a 4-sense (4S) augmented reality interface.
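This sense-count classification can be illustrated with a small data model. The sketch below is purely illustrative (the Sense and CompositeUI names are hypothetical choices for this example, not a standard API); it reproduces the Smell-O-Vision case from the text as a 3-sense standard CUI.

```python
# Illustrative model of the sense-based CUI classification described above.
from dataclasses import dataclass
from enum import Enum

class Sense(Enum):
    TACTILE = "touch"
    VISUAL = "sight"
    AUDITORY = "sound"
    OLFACTORY = "smell"
    EQUILIBRIAL = "balance"
    GUSTATORY = "taste"

@dataclass(frozen=True)
class CompositeUI:
    category: str              # "standard", "virtual" (blocks out reality) or "augmented"
    senses: frozenset          # the human senses the interface engages

    def label(self) -> str:
        # An X-sense label counts the senses the interface interacts with.
        n = len(self.senses)
        kind = {
            "standard": "standard CUI",
            "virtual": "virtual reality interface",
            "augmented": "augmented reality interface",
        }[self.category]
        return f"{n}-sense ({n}S) {kind}"

# The Smell-O-Vision example from the text: visual display, sound, and smells.
smell_o_vision = CompositeUI("standard",
                             frozenset({Sense.VISUAL, Sense.AUDITORY, Sense.OLFACTORY}))
print(smell_o_vision.label())   # -> "3-sense (3S) standard CUI"
```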
The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads, and touchscreens are examples of the physical part of the human–machine interface which we can see and touch. In complex systems, the human–machine interface is typically computerized; the term human–computer interface refers to this kind of system. In the context of computing, the term typically extends to the software dedicated to controlling the physical elements used for human–computer interaction.

The engineering of human–machine interfaces is enhanced by considering ergonomics (human factors). The corresponding disciplines are human factors engineering (HFE) and usability engineering (UE), which is part of systems engineering. Tools for incorporating human factors in interface design are developed based on knowledge of computer science, such as computer graphics, operating systems, and programming languages. Nowadays, the expression graphical user interface is commonly used for the human–machine interface on computers, as nearly all of them now use graphics.
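As a concrete illustration of such a graphical human–computer interface, here is a minimal sketch using Python's standard tkinter toolkit; the window, label, and callback names are purely illustrative. The mouse click is the tactile input layer, and the updated label text is the visual feedback the machine returns to the operator.

```python
# Minimal illustrative GUI: tactile input (mouse click) plus visual output (window text).
import tkinter as tk

def on_click():
    # Visual feedback: the machine reports its new state back to the operator.
    label.config(text="Button pressed")

root = tk.Tk()
root.title("Minimal GUI sketch")

label = tk.Label(root, text="Waiting for input...")
label.pack(padx=20, pady=10)

button = tk.Button(root, text="Press me", command=on_click)
button.pack(padx=20, pady=10)

root.mainloop()
```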

[ "Computer hardware", "Electrical engineering", "Artificial intelligence", "control" ]
Parent Topic
Child Topic
    No Parent Topic