# Unit 2.1 Elements of Computational Thinking

#### Keywords

• Abstraction – process of separating ideas from specific instances of those ideas at work. Computational structures are defined by their meanings while hiding away the details of how they work. Abstraction tries to factor out details from a common pattern so that programmers can work close to the level of human thoughts, leaving out details that matter in practice but not to the problem being solved.
• Abstract model – Any model of a system that has been taken from or based on (abstracted) a real life situation or problem.
• Recursion – an advanced programming construction in which a block of code (often a function) is able to call itself. Recursion algorithms must be designed carefully so that a terminating condition will be met.
• Functions – a block of code given a unique, identifiable name within a program. A function can take zero or more parameters when it is called and should return a value. The function should be designed and written to perform one task or action which is clearly indicated by its name.
• Parameters – values or data items passed into a procedure or function when it is called.
• By value – if a parameter is passed by value, a copy of the data is used within the subprogram, which is discarded when the subprogram exits.
• By reference – if a parameter is passed by reference, the location (in memory) of the variable is used. This means any changes affect the variable in the main program so are retained after the subprogram exits.
• Concurrent processing – any situation in the design or programming of a system when you would want more than one thing happening at the same time.
• Pipelining – concurrent decoding of two or more machine instructions. While part of one instruction (e.g. address field) is being decoded, another part of a second instruction (e.g. operation code) may also be decoded.
• Decomposition – the breaking down of a problem into smaller parts that are easier to solve. The smaller parts can sometimes be solved recursively → the same process is applied over and over again until the whole problem is solved.
• Object oriented programming – a program made up of objects that interact. E.g. Java.
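The Recursion entry above can be illustrated with a minimal Python sketch (`factorial` is a hypothetical example, not from the notes). The terminating condition is what guarantees the self-calls eventually stop:

```python
def factorial(n: int) -> int:
    # Terminating condition: without this, the function would call itself forever.
    if n <= 1:
        return 1
    # Recursive case: the function calls itself with a smaller argument.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```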
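The "by value" and "by reference" entries can also be sketched. Python does not let you choose the passing mechanism explicitly, but rebinding a local name behaves like passing by value (the caller's variable is unaffected), while mutating a shared object behaves like passing by reference (the change is retained after the subprogram exits). The function names here are hypothetical:

```python
def rebind(x):
    # Rebinding the local name has no effect on the caller: like "by value".
    x = x + 1

def mutate(items):
    # Mutating the shared list is visible to the caller: like "by reference".
    items.append(99)

n = 5
rebind(n)
nums = [1, 2]
mutate(nums)
print(n, nums)  # 5 [1, 2, 99]
```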

## Elements of Computational Thinking

• Computability – whether or not a problem can be solved using an algorithm.
• Speed and memory – limiting factors. As more processing power and memory become available, we can tackle more problems using a computer.
• Solving problems is joint between humans and computers.
• Some problems will never be solvable by computers.

Problem recognition

• It is important to first recognise a problem
• Using computational and intuitive methods it may be possible to come up with a solution

Backtracking

Algorithmic approach to a problem where partial solutions are built up as a pathway to follow → if the pathway fails at one point, the search resumes from the last potentially successful point.
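The idea above can be sketched with a classic backtracking example, subset sum (a hypothetical illustration, not from the notes): extend a partial solution while it could still succeed, and undo the last choice when a pathway fails.

```python
def subset_sum(values, target, chosen=None, start=0):
    """Find a subset of values summing to target, backtracking on failure."""
    if chosen is None:
        chosen = []
    if target == 0:
        return list(chosen)                  # pathway succeeded
    for i in range(start, len(values)):
        if values[i] <= target:
            chosen.append(values[i])         # extend the partial solution
            result = subset_sum(values, target - values[i], chosen, i + 1)
            if result is not None:
                return result
            chosen.pop()                     # backtrack: undo the last choice
    return None                              # no pathway from this point

print(subset_sum([3, 34, 4, 12, 5, 2], 9))  # [3, 4, 2]
```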

Data Mining

• Going through lots of data that probably has lots of sources
• Useful for searching for relationships and facts that are not immediately obvious to the casual observer.
• Used when data comes from data sets that are not structured in the same way.
• E.g. a supermarket loyalty card scheme.
• Perform searches in attempts to find patterns.
• Would show if certain products are bought together, by the same person or same demographic.
• Algorithms that help are known as ‘pattern matching’ or ‘anomaly detection’.

Data mining is possible because of:

• Big databases
• Faster Processing
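The supermarket loyalty-card example above can be sketched as a tiny co-occurrence count: which products appear together in the same basket. The baskets here are invented for illustration; real data mining draws on far larger, differently structured data sets.

```python
from collections import Counter
from itertools import combinations

# Hypothetical loyalty-card baskets (sets of products bought together).
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "tea"},
    {"bread", "butter", "tea"},
]

pair_counts = Counter()
for basket in baskets:
    # Count every pair of products that appears in the same basket.
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most common pair suggests products often bought together.
print(pair_counts.most_common(1))  # [(('bread', 'butter'), 3)]
```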

Pipelining

• Output of one process is the input to another
• Useful in RISC architectures.
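The "output of one process is the input to another" idea can be sketched with Python generators. This is a data pipeline rather than CPU instruction pipelining, but the principle is the same: each stage consumes the previous stage's output as its input, and all stages can be in progress at once.

```python
def read_values(raw):
    # Stage 1: clean up raw text lines.
    for line in raw:
        yield line.strip()

def to_int(items):
    # Stage 2: convert each cleaned line to an integer.
    for item in items:
        yield int(item)

def only_even(nums):
    # Stage 3: keep only even numbers.
    for n in nums:
        if n % 2 == 0:
            yield n

# Each stage's output feeds the next stage's input.
stages = only_even(to_int(read_values([" 1 ", "2", "3 ", "4"])))
print(list(stages))  # [2, 4]
```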

Visualisation to Solve Problems

• Present data in an easy to grasp form.
• Simplest example → presenting tabular data as a graph
• Can make previously unnoticed facts and trends apparent.
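As a minimal sketch of "table data as a graph", the dictionary below (invented sales figures) is rendered as a text bar chart; the relative sizes become obvious in a way the raw numbers are not:

```python
# Hypothetical table of daily sales figures.
sales = {"Mon": 3, "Tue": 7, "Wed": 5}

# Render each row of the table as a bar; trends stand out at a glance.
chart = [f"{day} {'#' * value}" for day, value in sales.items()]
print("\n".join(chart))
```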

## 2.1.1 Thinking Abstractly

• Abstraction is a simplified representation of reality.
• Commonly makes use of symbols to represent components of a problem so that the human mind or computing agent can process the problem.
• About finding out what does and does not matter in a scenario.
• Helps maximise our chances of solving a problem by letting us separate out the components and deciding which matter.

Abstraction and Real World Issues

• Using computers to solve real world problems would be impossible without it.
• Variables are abstraction. They represent real world values or intermediate values in a calculation.

Levels of Abstraction

• In a complex system it is often useful to construct an abstraction to represent the large problem and create lower-level abstractions to deal with its component parts.
• Power of this approach is that the details in each layer of abstraction can be hidden from the others.
• Frees up the solution process to concentrate on just one issue at a time or send the different sub-problems to different staff or different companies to work on.
• Layering is found widely in the construction of any large system.
• Layers are a way of dividing the functionality of a system into separate areas.
• Same principle can apply to physical items (e.g. a car).
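The layering idea above can be sketched as a stack of functions, each using the layer below it without knowing how that layer works internally. The example is hypothetical (the sensor is stubbed with a fixed reading):

```python
def read_sensor() -> int:
    # Lowest layer: hardware detail, stubbed here with a fixed raw reading.
    return 21

def temperature_celsius() -> float:
    # Middle layer: turns the raw reading into a value, hiding sensor details.
    return float(read_sensor())

def report() -> str:
    # Top layer: works only with the abstraction directly below it.
    return f"Temperature: {temperature_celsius():.1f} C"

print(report())  # Temperature: 21.0 C
```

Each layer could be given to different staff or a different company: changing how `read_sensor` works would not require changes to `report`.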

## 2.1.2 Thinking Ahead

A computer scientist, when programming a system, will:

• Determine the outputs required
• Determine necessary actions to achieve the outputs
• Consider the resources needed
• Consider user expectations

Anyone can make use of these strategies to:

• Decide what is to be achieved
• Determine prerequisites
• Determine what is possible

Computer Example: Caching

Human Example: someone wanting to be a computer professional would need to determine:

• Subjects to study at A level
• What aptitudes they have
• Amount of work they need to put in
• Likely demand for the skillset – now and in the future

Prefetching

• Anticipated instructions are fetched before they are needed and placed in a cache, from which they can be obtained quickly, avoiding the delay of accessing the slower RAM.

The Need for Reusable Program Components

• Important from a business point of view as well as a tool to support future developments
• Most systems are designed by composing existing components that have been used in other systems
• Gaining more popularity in software.

Benefits

• Increased dependability – reused software is more dependable than new software (its bugs and faults have already been identified and fixed).
• Reduced process risk – cost of existing software is already known whereas cost of development is a matter of judgement.
• Effective use of specialists – saves having specialists do the same work over and over. Can develop reusable software that encapsulates their knowledge.
• Standards compliance – some standards (e.g. user interface standards) can be implemented as a set of reusable components. Can be used so that all applications present the same menu format.
• Accelerated development – can speed up system production as development and validation time can be reduced.

Drawbacks

• Increased maintenance costs – can be higher due to possible incompatibilities of the reused component with system changes.
• Lack of tool support – some development tools do not support reuse. It may be difficult to integrate these tools with a component library system.
• Not-invented-here syndrome – some development teams prefer to rewrite components, as they believe they can improve on them.
• Creating, maintaining and using a component library – can be expensive to populate and ensure that developers know how to use a library. Processes have been developed to ensure that the library is used.
• Finding, understanding and adapting reusable components – have to be discovered in a library, understood and sometimes adapted for a new environment.

Example – How did NASA think ahead with the New Horizons mission?

• How they would maintain their equipment with nobody on board.
• Cost vs Reward
• How would they power it?
• How would they get information back to earth? How could they collect information?
• How long would it take? Speed
• Flyby or orbit mission?
• Potential unknowns
• What equipment would be needed
• Possible future missions

Caching

• Data that is input might be stored in RAM in case it is needed again before the process is shut down.
• If it is required, it does not need to be read again from the disk → creates a faster response time.
• Prefetching → an instruction is requested from the memory before it is required to speed up the process.

• Can reduce load on a web server because data required can be anticipated

• Can be complicated to implement
• If wrong data is cached it can be difficult to re-establish the correct sequence of data.

Inputs and Outputs

• One of the first things done by an analyst is to determine what outputs are needed.
• The designer of the system needs to ensure that there are correct inputs when needed to ensure that the output can be achieved.

## 2.1.3 Thinking Procedurally

About decomposition → used to make a problem more manageable.

• Any large problem is divided into smaller sub problems.
• Eventually they will equate to a program module or group of modules.
• Order of execution needs to be taken into account – may need data to be processed by one module before another can use it.
• Some need to be accessible in an unpredictable way.
• Large human projects benefit from the same approach (e.g. building a house)
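The points above can be sketched as a hypothetical decomposition of "produce a report" into sub-problems, where order of execution matters: the data must be cleaned by one module before another can summarise it.

```python
def clean_data(raw):
    # Sub-problem 1: remove missing values.
    return [x for x in raw if x is not None]

def summarise(data):
    # Sub-problem 2: compute the average (requires cleaned data).
    return sum(data) / len(data)

def produce_report(raw):
    # The sub-problems must run in this order:
    data = clean_data(raw)
    return f"average = {summarise(data):.1f}"

print(produce_report([4, None, 6, 10]))  # average = 6.7
```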

Order

• When planning a solution, the order may or may not be important.
• Event driven situations – order may be unpredictable.
• In other situations, order can be important, because one task cannot be completed without completing another one first.

## 2.1.4 Thinking Logically

When planning a program identifying the decision points is a crucial part of the program design. Can be planned using pseudocode, structured statements or flowcharts.
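A minimal sketch of decision points: each branch below corresponds to a decision box in a flowchart or an IF/ELSE in pseudocode. The grading thresholds are invented for illustration:

```python
def grade(mark: int) -> str:
    # Each branch is a decision point the program design must identify.
    if mark >= 70:
        return "distinction"
    elif mark >= 40:
        return "pass"
    else:
        return "fail"

print(grade(55))  # pass
```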

## 2.1.5 Thinking Concurrently

Most modern computers can process a number of instructions at the same time (thanks to multicore processors and pipelining).

• Means programs need to be specifically designed to take advantage of this.
• Modules processed at the same time should be independent.
• Well-designed programs can save a lot of processing time.
• Human activities also benefit.
• Gantt charts are commonly used to visualise projects in which tasks occur concurrently.
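The "modules processed at the same time should be independent" point can be sketched with Python's `concurrent.futures`: the tasks below share no state, so they are safe to run concurrently. The workload (`square`) is a hypothetical stand-in for a longer-running task:

```python
from concurrent.futures import ThreadPoolExecutor

def square(n: int) -> int:
    # Independent task: no shared state, so tasks can safely overlap in time.
    return n * n

# The tasks do not depend on one another, so they can be processed concurrently;
# map still returns the results in input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, [1, 2, 3, 4]))

print(results)  # [1, 4, 9, 16]
```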

Parallel Processors

• Enable different parts of a program to be executed simultaneously.
• Multicore processors are also becoming more common.
• Potential advantages → programs executed faster and savings made on energy.
• Programs have to be written specifically to take advantage of parallel processing which can make them longer and more complex.
• May not be much time saved if instructions have to be executed in sequence.