A-level Computing/AQA/Paper 1/Systematic approach to problem solving
Specification
| Specification Point | Content | Additional Information |
|---|---|---|
| 4.13.1 Analysis | Be aware that before a problem can be solved, it must be defined, the requirements of the system that solves the problem must be established and a data model created. The requirements of the system must be established by interaction with the intended users of the system. The process of clarifying requirements may involve a prototyping/agile approach. | Students should have experience of using abstraction to model aspects of the external world in a program. |
| | | Students should have sufficient experience of successfully structuring programs into modular parts with clear documented interfaces to enable them to design appropriate modular structures for solutions. |
| | | Students should have sufficient practice of writing, debugging and testing programs to enable them to develop the skills to articulate how programs work, arguing for their correctness and efficiency using logical reasoning, test data and user feedback. |
| 4.13.5 Evaluation | Know the criteria for evaluating a computer system. | |
Analysis
The system is analysed to identify the requirements and to define the problem being solved. For example, the analysis for the construction of a website could cover:
- Data - its origin, uses, volumes and characteristics
- Procedures - what is done, where, when and how, and how errors and exceptions are handled
- Future - development plans and expected growth rates
- Problems with any existing system
In the case of a different type of problem such as a simulation or game, the requirements will still need to cover a similar set of considerations.
Design
When designing the system, some or all of the following should be taken into account:
- Processing: documenting and creating the algorithms and an appropriate modular structure for the solution
- Data structures: how data will be held and how it will be accessed, for example in a dynamic structure such as a queue or tree, or in a file or database
- Output: content, format, sequence, frequency, medium, etc.
- Input: volume, frequency, documents used, input methods
- User interface: screens and dialogues, menus, special-purpose requirements
- Security: how the data is to be kept secure from accidental corruption or deliberate tampering or hacking
- Hardware: selection of an appropriate configuration
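As a sketch of the data-structures decision above, the short Python class below shows how a queue might be chosen and hidden behind a clear, documented interface. The print-queue scenario and all names (`PrintQueue`, `enqueue`, `dequeue`) are illustrative assumptions, not part of any prescribed design:

```python
from collections import deque

class PrintQueue:
    """FIFO queue of documents waiting to be printed.

    Design decision: a queue suits this problem because jobs must be
    processed in the order they arrive (first in, first out).
    """

    def __init__(self):
        self._jobs = deque()  # deque gives O(1) enqueue and dequeue

    def enqueue(self, job: str) -> None:
        """Add a job to the back of the queue."""
        self._jobs.append(job)

    def dequeue(self) -> str:
        """Remove and return the job at the front of the queue."""
        if not self._jobs:
            raise IndexError("dequeue from an empty queue")
        return self._jobs.popleft()

    def is_empty(self) -> bool:
        """Return True if no jobs are waiting."""
        return not self._jobs
```

Documenting the interface like this (rather than letting other modules touch the underlying `deque` directly) is exactly the kind of clear modular boundary the specification asks students to design.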
Implementation
Once the design has been agreed, the programs can be coded. A clear focus needs to be maintained on the ultimate goal of the project, without users or programmers being sidetracked into creating extra features which might be useful, or possible future requirements. Programmers will need to be flexible in accepting user feedback and making changes to their programs as problems or design flaws are detected. In even a moderately complex system it is hard to envision how everything will work together, so iterative changes at every stage are a normal part of a prototyping/agile approach.
Testing
Testing is carried out at each stage of the development process. Each program is tested with normal, boundary and erroneous data, and unit testing, module testing and system testing are also carried out. The system then needs to be tested by the user to ensure that it meets the specification. This is known as acceptance testing: it involves testing with data supplied by the end user rather than data designed especially for testing purposes, and it has the following objectives:
- to confirm that the system delivered meets the original customer specifications
- to find out whether any major changes in operating procedures will be needed
- to test the system in the environment in which it will run, with realistic volumes of data
Testing is an iterative process, with each stage in the test process being repeated when modifications have to be made owing to errors coming to light at a subsequent stage.
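The distinction between normal, boundary and erroneous test data can be illustrated with a small Python example. The validation routine `validate_age` and its 0–120 range are hypothetical, chosen only to make the three categories of test data concrete:

```python
def validate_age(age):
    """Accept an age only if it is an integer from 0 to 120 inclusive."""
    if not isinstance(age, int) or isinstance(age, bool):
        raise TypeError("age must be an integer")
    if age < 0 or age > 120:
        raise ValueError("age out of range")
    return age

# Normal data: typical values well inside the valid range.
assert validate_age(35) == 35

# Boundary data: values at the very edges of the valid range.
assert validate_age(0) == 0
assert validate_age(120) == 120

# Erroneous data: values the program must reject, not crash on.
for bad in (-1, 121, "forty"):
    try:
        validate_age(bad)
    except (TypeError, ValueError):
        pass  # rejection is the expected behaviour
    else:
        raise AssertionError(f"{bad!r} should have been rejected")
```

Note that erroneous data is tested just as deliberately as valid data: the test passes only when the program rejects the bad value in a controlled way.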
Evaluation
The evaluation may include a post-implementation review, which is a critical examination of the system three to six months after it has been put into operation. This waiting period allows users and technical staff to learn how to use the system, get used to new ways of working and understand the new procedures required. It gives management a chance to evaluate the usefulness of the reports and online queries that they can make, and to go through several 'month-end' periods when various routine reports will be produced. Shortcomings of the system, if there are any, will become apparent at all levels of the organisation, and users will want a chance to air their views and discuss improvements. The solution should be evaluated on the basis of effectiveness, usability and maintainability. The post-implementation review will focus on the following:
- a comparison of the system's actual performance with the anticipated performance objectives
- an assessment of each aspect of the system against preset criteria
- errors which were made during system development
- unexpected benefits and problems