Introduction to Software Engineering/History

History

When the first modern digital computers appeared in the early 1940s,[1] the instructions to make them operate were wired into the machine. At this time, people working with computers were engineers, mostly electrical engineers. This hardware-centric design was not flexible and was quickly replaced with the "stored-program architecture" or von Neumann architecture. Thus began the first division between "hardware" and "software", with abstraction used to manage the complexity of computing.

Programming languages started to appear in the 1950s, another major step in abstraction. Major languages such as Fortran, ALGOL, and COBOL were released in the late 1950s to deal with scientific, algorithmic, and business problems, respectively. In 1967, the Simula language introduced the object-oriented programming paradigm. E. W. Dijkstra wrote his seminal paper, "Go To Statement Considered Harmful",[2] in 1968. Software for managing the hardware, called an operating system, was also introduced, most notably Unix in 1969. In 1972, David Parnas introduced the key concepts of modularity and information hiding[3] to help programmers deal with the ever-increasing complexity of software systems.
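
Parnas's idea of information hiding is easier to grasp with a concrete illustration. The sketch below is written in modern C++ rather than anything from the period, and the IntStack module and its operations are hypothetical names chosen for this example: client code depends only on the public interface, so the hidden representation can change without breaking callers.

    // A minimal sketch of information hiding in the spirit of Parnas (1972).
    // IntStack and its operations are illustrative, not taken from the paper.
    #include <stdexcept>
    #include <vector>

    class IntStack {
    public:
        void push(int value) { items.push_back(value); }
        int pop() {
            if (items.empty()) throw std::out_of_range("pop on empty stack");
            int top = items.back();
            items.pop_back();
            return top;
        }
        bool empty() const { return items.empty(); }

    private:
        // The representation is a hidden design decision: it could be swapped
        // for a linked list without touching any code that calls push/pop.
        std::vector<int> items;
    };

Parnas's criterion was to draw module boundaries around design decisions that are likely to change, rather than around steps in a processing flowchart, so that each such decision is hidden inside exactly one module.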

These advances in software were matched by further advances in computer hardware. In the mid-1970s, the microcomputer was introduced, making it economical for hobbyists to obtain a computer and write software for it. This in turn led to the now-famous personal computer (PC) and Microsoft Windows. The Software Development Life Cycle (SDLC) also began to emerge as a consensus approach to the centralized construction of software in the mid-1980s. The late 1970s and early 1980s saw the introduction of several new Simula-inspired object-oriented programming languages, including Smalltalk, Objective-C, and C++.
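
The object orientation these languages inherited from Simula bundles data together with the procedures that operate on it, and lets a program manipulate different kinds of objects through a common interface. The following is a minimal C++ sketch of that paradigm; the Shape, Circle, and Square classes are hypothetical names for this example, not historical code.

    // A minimal sketch of Simula-style object orientation in C++.
    #include <iostream>
    #include <memory>
    #include <vector>

    struct Shape {
        virtual double area() const = 0;  // overridable behavior, as in Simula's virtual procedures
        virtual ~Shape() = default;
    };

    struct Circle : Shape {
        explicit Circle(double r) : radius(r) {}
        double area() const override { return 3.141592653589793 * radius * radius; }
        double radius;
    };

    struct Square : Shape {
        explicit Square(double s) : side(s) {}
        double area() const override { return side * side; }
        double side;
    };

    int main() {
        std::vector<std::unique_ptr<Shape>> shapes;
        shapes.push_back(std::make_unique<Circle>(1.0));
        shapes.push_back(std::make_unique<Square>(2.0));
        for (const auto& s : shapes)
            std::cout << s->area() << '\n';  // each object supplies its own area()
        return 0;
    }

The loop never needs to know which concrete shape it holds; each object carries its own behavior, the dispatch mechanism Simula introduced and Smalltalk, Objective-C, and C++ later adopted.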

Open-source software started to appear in the early 1990s in the form of Linux and other projects, introducing the "bazaar" or decentralized style of constructing software.[4] The World Wide Web and the popularization of the Internet followed in the mid-1990s, changing the engineering of software once again. Distributed systems gained sway as a way to design systems, and the Java programming language, with its virtual machine, was introduced as another step in abstraction. In 2001, a group of programmers collaborated to write the Agile Manifesto, which favored more lightweight processes to create cheaper and more timely software.

The definition of software engineering is still debated by practitioners today as they struggle to come up with ways to produce software that is "cheaper, better, faster". Cost reduction has been a primary focus of the IT industry since the 1990s. Total cost of ownership (TCO) covers more than just the cost of acquisition: it includes productivity impediments, upkeep efforts, and the resources needed to support infrastructure. For example (with purely illustrative figures), a system acquired for $10,000 that costs $3,000 a year to maintain and support has a five-year TCO of $10,000 + 5 × $3,000 = $25,000.

References

  1. Leondes, Cornelius T. (2002). Intelligent Systems: Technology and Applications. CRC Press. ISBN 9780849311215.
  2. Dijkstra, E. W. (March 1968). "Go To Statement Considered Harmful". Communications of the ACM. 11 (3): 147–148. doi:10.1145/362929.362947.
  3. Parnas, David (December 1972). "On the Criteria To Be Used in Decomposing Systems into Modules". Communications of the ACM. 15 (12): 1053–1058. doi:10.1145/361598.361623.
  4. Raymond, Eric S. (2000). The Cathedral and the Bazaar (version 3.0).

Further Reading

History of software engineering