System software: User interfaces
A user interface, also sometimes called a human-computer interface, comprises both hardware and software components. It handles the interaction between the user and the system.
There are different ways of interacting with computer systems which have evolved over the years. The main types of user interface are:
- command line
- graphical user interface (GUI)
- menu driven
- form based
- natural language
- gesture driven
Command Line Interface
A command line interface is the oldest of the interfaces discussed here. It involves the computer responding to commands typed by the operator. This type of interface has the drawback that it requires the operator to remember a range of different commands, and so it is not ideal for novice users.
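The idea can be sketched as a small command interpreter: the program waits for a typed command and responds to it. The command names here (`echo`, `upper`) are illustrative, not real operating-system commands.

```python
# Minimal sketch of a command line interface: the computer
# responds to commands typed by the operator.
def run_command(command):
    parts = command.split()
    if not parts:
        return ""
    name, args = parts[0], parts[1:]
    if name == "echo":
        return " ".join(args)          # repeat the arguments back
    elif name == "upper":
        return " ".join(args).upper()  # return the arguments in capitals
    else:
        return f"Unknown command: {name}"

print(run_command("echo hello world"))  # -> hello world
print(run_command("upper hello"))       # -> HELLO
```

Notice the drawback mentioned above: nothing on screen tells the user that `echo` and `upper` exist, so the commands must simply be remembered.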
Graphical User Interface
Graphical user interfaces (GUIs) are sometimes also referred to as WIMP because they use Windows, Icons, Menus and Pointers. Operators use a pointing device (such as a mouse, touchpad or trackball) to control a pointer on the screen which then interacts with other on-screen elements.
Menu Driven Interface
A menu driven interface is commonly used on cash machines (also known as automated teller machines, or ATMs), ticket machines and information kiosks (for example in a museum). It provides a simple and easy-to-use interface made up of a series of menus and sub-menus which the user navigates by pressing buttons, often on a touch-screen device.
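A menu driven interface can be sketched as a list of options and a button press: the user never types free text, only chooses a numbered option. The menu labels below are illustrative ATM-style examples.

```python
# Minimal sketch of a menu driven interface, as on an ATM:
# the user only ever picks one of the displayed options.
def show_menu(options):
    for number, label in enumerate(options, start=1):
        print(f"{number}. {label}")

def choose(options, pressed):
    """Return the label for the button the user pressed (1-based),
    or None if the press does not match any option."""
    if 1 <= pressed <= len(options):
        return options[pressed - 1]
    return None

main_menu = ["Withdraw cash", "Check balance", "Print statement"]
show_menu(main_menu)
print(choose(main_menu, 2))  # -> Check balance
```

Because every valid input is one of the displayed buttons, the user cannot type an unknown command, which is what makes this style suitable for novices.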
Form Based Interface
A form-based interface uses text boxes, drop-down menus, text areas, check boxes, radio buttons and buttons to create an electronic form which a user completes in order to enter data into a system. This is commonly used on websites to gather data from a user, or in call centres to allow operators to quickly enter information gathered over the phone.
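The essence of a form is a fixed set of named fields that must all be filled in correctly before the record is accepted. A sketch, with illustrative field names and choices:

```python
# Minimal sketch of a form based interface: named fields are
# completed and then submitted as one record. The fields and
# drop-down choices here are made-up examples.
fields = {
    "name":    {"type": "text"},
    "age":     {"type": "number"},
    "country": {"type": "dropdown", "choices": ["UK", "USA", "Other"]},
}

def submit(form, entries):
    """Accept the form only if every field is filled in correctly."""
    record = {}
    for field, spec in form.items():
        value = entries.get(field)
        if value is None:
            return None  # incomplete form rejected
        if spec["type"] == "dropdown" and value not in spec["choices"]:
            return None  # a drop-down value must come from its list
        record[field] = value
    return record

print(submit(fields, {"name": "Ada", "age": 36, "country": "UK"}))
```

The drop-down check shows why forms suit data entry: the interface constrains each answer, so a call-centre operator cannot submit a value the system does not recognise.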
Natural Language Interface
A natural language interface is a spoken interface where the user interacts with the computer by talking to it. Sometimes referred to as a 'conversational interface', this interface simulates having a conversation with a computer. Made famous by science fiction (such as in Star Trek), natural language systems are not yet advanced enough to be in widespread use. They are commonly used by telephone systems as an alternative to pressing numbered buttons: the user can speak their responses instead.
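The telephone-system example can be sketched by matching keywords in what the caller says. Real systems first perform speech recognition; here the speech is assumed to be already transcribed, and the phrases and actions are illustrative.

```python
# Minimal sketch of a natural language telephone menu: instead of
# pressing a numbered button, the caller speaks, and the system
# looks for keywords in the transcribed speech.
ACTIONS = {
    "balance":   "Connecting you to account balances.",
    "statement": "Posting a statement to your address.",
    "operator":  "Transferring you to an operator.",
}

def respond(transcript):
    words = transcript.lower().split()
    for keyword, reply in ACTIONS.items():
        if keyword in words:
            return reply
    return "Sorry, I didn't understand. Please repeat."

print(respond("I'd like to check my balance please"))
```

Keyword matching is far simpler than genuine language understanding, which is one reason conversational interfaces still struggle with unexpected phrasing.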
Gesture Driven Interface
A gesture driven interface was demonstrated at a TED conference by Pranav Mistry, who invented the SixthSense device. This uses cameras to detect human motion, which it uses to determine what the user wants to do.
Other examples include the increasing use on touch-screen devices (such as mobile phones and tablet computers) of gestures such as 'pinching' to zoom in and out. Some games consoles are starting to use gesture driven interfaces. The Wii was the first such console, using a hand-held controller to detect gestures. More recently, the Xbox's Kinect introduced a system similar to SixthSense which uses a camera to detect motion.
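The pinch gesture mentioned above can be sketched by tracking the distance between two finger positions: if the fingers move apart, zoom in; if they move together, zoom out. The coordinates and threshold below are illustrative.

```python
import math

# Minimal sketch of pinch-gesture recognition on a touch screen.
def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_pinch(start_touches, end_touches, threshold=10.0):
    """Classify a two-finger gesture as 'zoom in', 'zoom out' or None.
    Each argument is a pair of (x, y) finger positions."""
    change = distance(*end_touches) - distance(*start_touches)
    if change > threshold:
        return "zoom in"    # fingers moved apart
    if change < -threshold:
        return "zoom out"   # fingers moved together
    return None             # movement too small to count

print(classify_pinch(((100, 100), (120, 100)),
                     ((80, 100), (160, 100))))  # -> zoom in
```

The threshold stops tiny accidental finger movements from being treated as deliberate gestures, a common concern in gesture driven interfaces.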