System software: User interfaces
A user interface, also sometimes called a human-computer interface, comprises both hardware and software components. It handles the interaction between the user and the system.
There are different ways of interacting with computer systems, and these have evolved over the years. There are five main types of user interface:
- command line interface (CLI)
- graphical user interface (GUI)
- menu-driven interface
- form-based interface
- natural language interface
Command Line Interface
Command line interfaces are the oldest of the interfaces discussed here. The computer responds to commands typed by the operator. This type of interface has the drawback that it requires the operator to remember a range of different commands, so it is not ideal for novice users.
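The command/response loop can be sketched in a few lines. This is a minimal illustration, not a real shell: the command names ("list", "delete") and the in-memory file list are invented for the example, and the last call shows how an unrecognised (mistyped) command simply produces an error, which is why the operator must remember commands exactly.

```python
def run_command(command, files):
    """Return the response text for a single typed command."""
    if command == "list":
        return "\n".join(files)
    elif command.startswith("delete "):
        name = command[len("delete "):]
        if name in files:
            files.remove(name)
            return f"Deleted {name}"
        return f"No such file: {name}"
    else:
        return f"Unknown command: {command}"

files = ["notes.txt", "report.doc"]
print(run_command("list", files))
print(run_command("delete notes.txt", files))
print(run_command("lst", files))  # a typo: the system gives no help
```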
Graphical UI
Graphical user interfaces (GUIs) are sometimes also referred to as WIMP interfaces because they use Windows, Icons, Menus and Pointers. Operators use a pointing device (such as a mouse, touchpad or trackball) to control a pointer on the screen, which then interacts with other on-screen elements. A GUI allows the user to interact with devices through graphical icons and visual indicators such as secondary notation. The term was created in the 1970s to distinguish graphical interfaces from text-based ones, such as command-line interfaces. However, today nearly all digital interfaces are GUIs. The first commercially available GUI was developed at Xerox PARC and used by the Xerox 8010 Information System, which was released in 1981. After Steve Jobs saw the interface during a tour at Xerox, he had his team at Apple develop an operating system with a similar interface, which was included with the Macintosh, released in 1984. Microsoft released its first GUI-based OS, Windows 1.0, in 1985.
Menu Driven
A menu-driven interface is commonly used on cash machines (also known as automated teller machines, or ATMs), ticket machines and information kiosks (for example in a museum). Menu-driven interfaces provide a simple, easy-to-use interface composed of a series of menus and sub-menus which the user navigates by pressing buttons, often on a touch-screen device. Because each menu leads to a fixed set of further screens, a menu-driven interface can be modelled naturally as a state machine, for example with a UML state diagram, when designing the machine's software.
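The menu/sub-menu structure can be sketched as a table of screens, where each numbered button leads to the next screen. The menu names below are invented for a simplified cash machine; a real ATM would attach actions (dispensing cash, printing a balance) to each screen.

```python
# Each screen is a numbered menu; pressing a button selects the next screen.
MENUS = {
    "main":     {"title": "Main menu",     "options": {1: "withdraw", 2: "balance"}},
    "withdraw": {"title": "Withdraw cash", "options": {1: "main"}},
    "balance":  {"title": "Show balance",  "options": {1: "main"}},
}

def select(current, button):
    """Return the next screen reached by pressing a numbered button."""
    options = MENUS[current]["options"]
    return options.get(button, current)  # invalid button: stay on this screen

screen = "main"
screen = select(screen, 1)       # press button 1: withdraw
print(MENUS[screen]["title"])    # Withdraw cash
screen = select(screen, 1)       # press button 1: back to main menu
print(MENUS[screen]["title"])    # Main menu
```

Because every screen and transition is fixed in advance, the user never has to remember commands: the available choices are always visible.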
Form Based
A form-based interface lets you interact with an application by filling in a form. The form normally provides a limited set of choices for each field.
For example, a form interface for setting text characteristics in application software might offer the choices of font size, colour and style.
A form interface that allows you to interact with the system software might offer choices such as your screen resolution, default language, keyboard layout and so on.
A form interface can also be used to enter data into a system; for example, a database system will usually allow you to create a form for entering data into tables.
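The defining feature of a form is that each field only accepts values from a fixed set of choices. The sketch below, with field names borrowed from the text-formatting example above (and choice lists invented for illustration), shows how a form can validate what the user filled in before accepting it.

```python
# Allowed choices per field; anything else is rejected.
FORM_FIELDS = {
    "font_size": [10, 12, 14, 18],
    "colour": ["black", "red", "blue"],
    "style": ["regular", "bold", "italic"],
}

def submit(form_data):
    """Check a filled-in form against the allowed choices.

    Returns a dict of errors; an empty dict means the form was accepted.
    """
    errors = {}
    for field, choices in FORM_FIELDS.items():
        value = form_data.get(field)
        if value not in choices:
            errors[field] = f"{value!r} is not one of {choices}"
    return errors

print(submit({"font_size": 12, "colour": "red", "style": "bold"}))  # {}
print(submit({"font_size": 11, "colour": "red", "style": "bold"}))  # font_size error
```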
Natural language
A natural language interface is a spoken interface where the user interacts with the computer by talking to it. Sometimes referred to as a 'conversational interface', it simulates having a conversation with a computer. Made famous by science fiction (such as Star Trek), natural language systems are not yet advanced enough to be in widespread use. They are commonly used by telephone systems as an alternative to pressing numbered buttons: the user can speak their responses instead. Voice recognition is an example of this type of interface.
Gesture Driven
A recent innovation in user interfaces is the gesture-driven interface. This is an interface controlled by a human making physical gestures, which are then detected by the computer. The idea was discussed at the TED conference by Pranav Mistry, who invented the SixthSense device. SixthSense uses cameras to detect human motion, which it uses to determine what the user wants to do.
Other examples include the increasing use on touch-screen devices (such as mobile phones and tablet computers) of gestures such as 'pinching' to zoom in and out. Some games consoles have also adopted gesture-driven interfaces. The Wii was the first such console, using a hand-held controller to detect gestures. More recently, the Xbox introduced Kinect, a system similar to SixthSense which uses a camera to detect motion.