Electronics/Analog to Digital Converters
Most information in the real world is analog (e.g. audio, temperature, and distance), but modern electronic processing is done in binary (a series of 1s and 0s). An analog-to-digital converter (ADC) is an IC that bridges this gap.
The step size is the voltage difference between one digital level (e.g. 0001) and the next (e.g. 0010 or 0000). For example, if a 4-bit ADC has a step size of 1 volt, an input of 1 volt produces an output of 0001; 0 volts always corresponds to 0000. Real-world ADCs and DACs would be nearly useless with such a large step size, and most modern devices have step sizes in the microvolt range.
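The quantization described above can be sketched as a short function. This is an illustrative model of an ideal ADC, not a real device driver; the function name and defaults are assumptions chosen to match the 4-bit, 1-volt example.

```python
def adc_convert(voltage, step_size=1.0, bits=4):
    """Quantize an analog voltage to an n-bit binary code (ideal ADC model)."""
    max_code = (1 << bits) - 1          # 4 bits -> highest code is 15 (1111)
    code = int(voltage / step_size)     # truncate to the level below the input
    code = max(0, min(code, max_code))  # clamp to the converter's valid range
    return format(code, f"0{bits}b")    # binary string, e.g. "0001"

print(adc_convert(1.0))   # 1 volt with a 1 V step -> "0001"
print(adc_convert(0.0))   # 0 volts is always "0000"
```

Inputs above the full-scale voltage simply saturate at the highest code, which is why the clamp is included.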
Step size is an important design consideration, along with the number of bits. A 4-bit converter with a step size of 1 volt can read a maximum of 15 volts at the input, but a complex sine wave will be heavily distorted by such coarse quantization. Decreasing the step size to 0.5 volts reduces the distortion but also lowers the maximum input reading. As smaller step sizes are used, converters with more bits are needed to keep the maximum voltage the same.
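The trade-off between step size and bit count follows from the full-scale formula: the maximum readable voltage is (2^n - 1) times the step size. A minimal sketch (the function name is an assumption) shows why halving the step size halves the range unless a bit is added:

```python
def full_scale(bits, step_size):
    """Maximum readable input voltage: (2**bits - 1) * step_size."""
    return ((1 << bits) - 1) * step_size

print(full_scale(4, 1.0))   # 15.0 V, as in the 4-bit, 1 V example
print(full_scale(4, 0.5))   # 7.5 V: finer steps, but a smaller range
print(full_scale(5, 0.5))   # 15.5 V: one extra bit restores the range
```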
This is what many consumer audio products mean when they advertise "X bit outputs".