A-level Physics (Advancing Physics)/Finding the Distance of a Remote Object/Radar


1. What sort of wave does your system use? What is an approximate wavelength of this wave?

Radio waves, with wavelengths ranging from about 2.7 mm to 100 m.

2. What sort of distance is it usually used to measure? What sort of length would you expect the distance to be?

The distance to an object within the radio horizon. The radio horizon distance d (in kilometres), for a radar antenna at height h metres above the Earth's surface, is given approximately by:

d ≈ 3.57√h

So, a radar 10 m above the Earth's surface has a range of about 11.3 km.
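
A minimal sketch of this calculation in Python, assuming the usual d ≈ 3.57√h approximation for the radio horizon (the function name is illustrative, not part of the original):

    import math

    def radio_horizon_km(antenna_height_m):
        # Approximate line-of-sight radio horizon over a smooth Earth,
        # using the rule of thumb d ≈ 3.57·√h (h in metres, d in kilometres).
        return 3.57 * math.sqrt(antenna_height_m)

    print(radio_horizon_km(10))  # ≈ 11.3 km, matching the figure above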

3. Why is measuring this distance useful to society?

e.g. Radar is used at airports to locate aeroplanes and co-ordinate them so that they can land safely, avoiding collisions.

4. Draw a labelled diagram of your system.

5. Explain how the system works, and what data are collected.

The 'dish' rotates, and the transmitter on it transmits a pulse of radio waves. If the waves hit an aircraft, they are reflected by it. The dish focuses the reflected pulse onto a receiver (allowing for some variation in incoming angle due to the varying distances of aircraft). The time taken for the signal to travel to the aircraft and back is recorded, and the speed of the radio pulse (3 × 10⁸ m s⁻¹) is already known.

6. Explain how the distance to the object is calculated using the data collected.

s = v × t,

where s = distance, t = time and v = velocity. In this case, the time t taken to travel to the aircraft is half of the total time T taken to travel to the aircraft and back, so:

s = v × T / 2
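
A minimal sketch of this calculation in Python (the function name and the example round-trip time are illustrative assumptions; the speed is the 3 × 10⁸ m s⁻¹ quoted above):

    SPEED_OF_LIGHT = 3e8  # m/s, speed of the radio pulse

    def distance_from_round_trip(total_time_s):
        # The pulse travels out and back, so the one-way time is half of T.
        return SPEED_OF_LIGHT * total_time_s / 2

    print(distance_from_round_trip(3.33e-5))  # ≈ 5000 m, i.e. an aircraft about 5 km away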

7. What limitations does your system have? (e.g. accuracy, consistency)

  • Unwanted echoes and random noise, e.g. from birds or weather
  • Deliberate attempts to evade radar detection, e.g. radio-wave-absorbent paint
  • The aircraft are moving, so, in the time taken for the signal to return to the radar station and be processed, the aircraft will no longer be in the position calculated.

8. What percentage error would you expect these limitations to cause?

Assume a distance from station to aircraft of 5 km, a signal speed of 3 × 10⁸ m s⁻¹ and an aircraft speed of 500 km h⁻¹. The one-way travel time for the signal is then 1.67 × 10⁻⁵ s, and 500 km h⁻¹ = 500,000 m h⁻¹ ≈ 139 m s⁻¹, so the aircraft moves about 2.31 mm while the signal returns to the radar station. This corresponds to a potential error of only ±0.0000463% in the distance reading, depending on the direction of the aircraft's travel. This may seem insignificant, but readings are only taken once per rotation of the dish, so in practice the aircraft can move a long way between successive readings.
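
The same arithmetic, worked through as a Python sketch using the assumed values above:

    distance_m = 5_000            # assumed range from station to aircraft
    c = 3e8                       # m/s, speed of the radio pulse
    aircraft_speed = 500 / 3.6    # 500 km/h expressed in m/s, ≈ 139 m/s

    one_way_time = distance_m / c                # ≈ 1.67e-5 s
    drift = aircraft_speed * one_way_time        # ≈ 2.31e-3 m = 2.31 mm
    percent_error = 100 * drift / distance_m     # ≈ 4.63e-5 %

    print(one_way_time, drift, percent_error)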

9. How might these problems be solved?

Take multiple readings over a period of time, and use a computer to predict where the aircraft will be next, given its current trajectory.
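
A minimal sketch of such a prediction in Python, assuming a simple constant-velocity extrapolation between two successive sweeps (the 4 s rotation period and the positions are illustrative values, not from the original):

    def predict_position(p_prev, p_curr, dt_between, dt_ahead):
        # Constant-velocity extrapolation from two consecutive radar fixes.
        vx = (p_curr[0] - p_prev[0]) / dt_between
        vy = (p_curr[1] - p_prev[1]) / dt_between
        return (p_curr[0] + vx * dt_ahead, p_curr[1] + vy * dt_ahead)

    # Two fixes 4 s apart; 556 m in 4 s is about 139 m/s (500 km/h).
    print(predict_position((0, 0), (556, 0), 4.0, 4.0))  # ≈ (1112, 0) m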