System software: User interfaces

From Wikibooks, open books for an open world



User interface - The features of a computer system which allow the user to interact with it.

A user interface, also sometimes called a human-computer interface, comprises both hardware and software components. It handles the interaction between the user and the system.

There are different ways of interacting with computer systems, and these have evolved over the years. There are five main types of user interface:

  • command line interface (CLI)
  • graphical user interface (GUI)
  • menu driven interface (MDI)
  • form based interface (FBI)
  • natural language interface (NLI)

Command Line Interface

Command line interfaces are the oldest of the interfaces discussed here: the user operates the computer by typing commands, and the computer responds to each one. This type of interface has the drawback that the operator must remember a range of different commands, so it is not ideal for novice users.

Screenshot from the MS-DOS operating system, which uses a command line interface.
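The typed-command loop described above can be sketched in a few lines of Python. This is a minimal illustration, not a real shell: the command names (`dir`, `copy`) and their behaviour are invented stand-ins for the kinds of commands an operator would have to memorise.

```python
# Minimal sketch of a command-line interface: read a typed command,
# look it up in a dispatch table, and run the matching handler.
# The commands here (dir, copy) are illustrative only.

def cmd_dir(args):
    return "listing of " + (args[0] if args else "current directory")

def cmd_copy(args):
    if len(args) != 2:
        return "usage: copy <source> <destination>"
    return f"copied {args[0]} to {args[1]}"

COMMANDS = {"dir": cmd_dir, "copy": cmd_copy}

def run_command(line):
    """Parse one typed line and dispatch it, as a CLI shell would."""
    name, *args = line.split()
    handler = COMMANDS.get(name)
    if handler is None:
        return f"'{name}' is not recognised as a command"
    return handler(args)
```

Note how an unrecognised word produces an error rather than help: this is exactly why a CLI demands that the operator already knows the vocabulary.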

Graphical User Interface

Graphical user interfaces (GUIs) are sometimes also referred to as WIMP interfaces because they use Windows, Icons, Menus and Pointers. Operators use a pointing device (such as a mouse, touchpad or trackball) to control a pointer on the screen, which then interacts with other on-screen elements; the user works with the device through graphical icons and visual indicators. The term was coined in the 1970s to distinguish graphical interfaces from text-based ones, such as command-line interfaces. Today, however, nearly all digital interfaces are GUIs.

The first commercially available GUI was developed at Xerox's Palo Alto Research Center (PARC) and was used by the Xerox 8010 Information System, released in 1981. After Steve Jobs saw the interface during a tour of Xerox PARC, he had his team at Apple develop an operating system with a similar design. Apple's GUI-based OS shipped with the Macintosh, released in 1984. Microsoft released its first graphical environment, Windows 1.0, in 1985.

Screenshot of a graphical user interface from a software package.
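One core job of any WIMP system is deciding which on-screen element the pointer is over when the user clicks. The sketch below illustrates that hit-testing idea in plain Python; the widget names and coordinates are invented for the example, and a real GUI toolkit does far more (drawing, focus, event queues).

```python
# Toy sketch of GUI hit-testing: the pointer's (x, y) position is
# matched against widget rectangles to decide which on-screen
# element receives a click. Widget names here are made up.

class Widget:
    def __init__(self, name, x, y, width, height):
        self.name, self.x, self.y = name, x, y
        self.width, self.height = width, height

    def contains(self, px, py):
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

def dispatch_click(widgets, px, py):
    """Return the topmost widget under the pointer, or None."""
    for widget in reversed(widgets):  # last drawn is on top
        if widget.contains(px, py):
            return widget.name
    return None
```

Checking widgets in reverse drawing order means a button drawn on top of a window wins the click, which is the behaviour users expect from overlapping windows.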

Menu Driven

A menu driven interface is commonly used on cash machines (also known as automated teller machines, or ATMs), ticket machines and information kiosks (for example in a museum). They provide a simple, easy to use interface comprising a series of menus and sub-menus which the user navigates by pressing buttons, often on a touch-screen device. Because the sequence of menus and sub-menus is well defined, this kind of interface is also straightforward to model (for example in UML) when designing such a machine.

ATM with a menu driven interface.
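The menus-and-sub-menus structure is naturally a tree, with each button press moving one level down until a final action is reached. A minimal sketch, with ATM-style menu labels invented for illustration:

```python
# Sketch of a menu-driven flow like an ATM's: the interface is a
# tree of menus, and each button press either opens a sub-menu or
# selects a final action. Menu labels are illustrative only.

MENU = {
    "Withdraw cash": {"£10": "dispense 10", "£20": "dispense 20"},
    "Check balance": "show balance",
}

def navigate(menu, choices):
    """Follow a sequence of button presses down the menu tree."""
    node = menu
    for choice in choices:
        node = node[choice]   # each press narrows to a sub-menu or action
    return node
```

Because every reachable action is a fixed path of button presses, the user never has to remember commands, which is what makes this style suitable for public kiosks.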

Form Based

A form-based interface uses text boxes, drop-down menus, text areas, check boxes, radio buttons and buttons to create an electronic form which a user completes in order to enter data into a system. This is commonly used on websites to gather data from a user, or in call centres to allow operators to quickly enter information gathered over the phone.

An example of a form based user interface.
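Behind a form-based interface, each field typically carries a validation rule, and the form is only accepted when every field passes. The field names and rules below are invented assumptions for the sketch:

```python
# Sketch of form-based data entry: each field has a validator, and
# the form accepts a submission only when every field validates.
# Field names and rules are illustrative assumptions.

FIELDS = {
    "name": lambda v: len(v) > 0,                       # must not be blank
    "age": lambda v: v.isdigit() and 0 < int(v) < 130,  # plausible age
}

def submit(form_data):
    """Return (ok, errors) for a dict of field -> typed value."""
    errors = [field for field, valid in FIELDS.items()
              if not valid(form_data.get(field, ""))]
    return (len(errors) == 0, errors)
```

Reporting the failing fields back, rather than silently rejecting the form, is what lets a call-centre operator correct a single entry instead of retyping everything.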

Natural Language

A natural language interface is a spoken interface in which the user interacts with the computer by talking to it. Sometimes referred to as a 'conversational interface', it simulates having a conversation with the computer. Made famous by science fiction (such as Star Trek), natural language systems are not yet advanced enough for widespread use, but they are commonly employed by telephone systems as an alternative to pressing numbered buttons: the user can speak their responses instead. Voice recognition is an example of this type of interface.

This is the kind of interface used by Siri on Apple's iPhone and by Cortana on Windows.
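A telephone system of the kind described above often does no more than spot keywords in the recognised speech. The sketch below shows that idea at its crudest; the phrases and intent names are invented, and real systems like Siri use far more sophisticated language models.

```python
# Extremely simplified sketch of mapping a recognised utterance to
# an action by keyword spotting. This is not real natural-language
# understanding; keywords and intents are invented for illustration.

INTENTS = {
    "balance": "check_balance",
    "operator": "transfer_to_operator",
    "weather": "read_forecast",
}

def interpret(utterance):
    """Pick an intent by spotting a known keyword in the transcript."""
    words = utterance.lower().split()
    for keyword, intent in INTENTS.items():
        if keyword in words:
            return intent
    return "ask_again"
```

The fallback intent matters: when nothing is recognised, a phone system re-prompts the caller rather than guessing, which is why these interfaces can feel repetitive.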


Gesture Driven

Pranav Mistry demonstrating the SixthSense device
A recent innovation in user interfaces is the gesture-driven interface, which is controlled by physical gestures that the computer detects. The idea was discussed at the TED conference by Pranav Mistry, who invented the SixthSense device; it uses cameras to detect human motion and from that determines what the user wants to do.

Other examples include the growing use on touch-screen devices (such as mobile phones and tablet computers) of gestures such as 'pinching' to zoom in and out. Some games consoles also use gesture-driven interfaces: the Wii was the first such console, using a hand-held controller to detect gestures, and more recently the Xbox introduced Kinect, a camera-based motion-detection system similar to SixthSense.
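The pinch gesture mentioned above boils down to simple geometry: if the two fingers end up further apart than they started, the user is zooming in; closer together, zooming out. A minimal sketch, with the threshold value chosen arbitrarily for the example:

```python
# Sketch of pinch-gesture detection on a touch screen: compare the
# distance between two finger positions at the start and end of the
# gesture. Growing apart means zoom in; closing means zoom out.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_pinch(start, end, threshold=5.0):
    """start and end are each a pair of (x, y) finger positions."""
    change = distance(*end) - distance(*start)
    if change > threshold:
        return "zoom in"
    if change < -threshold:
        return "zoom out"
    return "no zoom"
```

The threshold filters out the tiny jitters that occur when two resting fingers are detected, so that only a deliberate pinch triggers a zoom.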

External links