Multimeters measure electrical voltage (volts), current (amps) and resistance (ohms). A large rotary knob in the center of the meter lets you choose which quantity to measure. One of the most noticeable differences between digital and analog multimeters is the readout: digital meters display the value as numbers on a small LCD screen -- much like that of a digital watch -- while analog meters use a needle that moves across a printed scale. There are other differences between the two as well.
Analog Multimeter Advantages
Analog meters are older, and many engineers still prefer them. One reason is that an analog meter responds continuously to what is happening in the circuit under test, while a digital multimeter samples the quantity at discrete instants and displays each sample. If there are slight changes in a DC voltage, the needle of an analog meter tracks them as they happen, while a digital meter often misses them between samples. This continuous tracking matters when testing capacitors or coils. A properly functioning capacitor draws a surge of current when voltage is first applied, and that current then decays slowly toward zero -- this "signature" is easy to watch on an analog meter but hard to see on a digital one. Testing a coil is similar, except the current starts small and rises.
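The capacitor and coil "signatures" described above follow the standard RC and RL charging curves. As a minimal sketch (the component values V, R, C and L below are assumed for illustration, not taken from the article):

```python
import math

def capacitor_current(t, V=9.0, R=1000.0, C=100e-6):
    """Charging current in an RC circuit: i(t) = (V/R) * e^(-t/(R*C)).
    Starts at its maximum V/R and decays toward zero."""
    return (V / R) * math.exp(-t / (R * C))

def inductor_current(t, V=9.0, R=1000.0, L=0.5):
    """Current in an RL circuit: i(t) = (V/R) * (1 - e^(-t*R/L)).
    Starts near zero and rises toward V/R."""
    return (V / R) * (1.0 - math.exp(-t * R / L))

# Print a few points of each curve: the capacitor current falls,
# the inductor current rises -- the two signatures the needle shows.
for t in (0.0, 0.1, 0.5):
    print(f"t={t:.1f}s  cap={capacitor_current(t)*1000:.3f} mA  "
          f"ind={inductor_current(t)*1000:.3f} mA")
```

A needle sweeping smoothly along these curves is what an analog meter shows; a digital meter only reports scattered points along them.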
Digital Multimeter Advantages
Digital multimeters are simpler to use and read, and more accurate than analog multimeters. For example, calibrating a digital multimeter is simply a matter of pressing a button. More expensive digital multimeters offer "automatic ranging": to measure a voltage you just select "V" and the meter picks the range itself. On an analog multimeter you must decide in advance whether the voltage is under 1 volt, under 10 volts, under 100 volts and so on, and choosing the wrong range can damage the meter. Digital multimeters are also more robust in general: drop an analog meter and its delicate needle movement is likely ruined, while a digital meter has no moving parts and is more likely to survive the fall.
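The idea behind automatic ranging can be sketched in a few lines: try the ranges from most sensitive to least sensitive until the reading fits. This is a hypothetical illustration of the logic, not any particular meter's firmware, and the range values are assumed:

```python
# Full-scale voltage ranges, most sensitive first (assumed values).
RANGES = [1.0, 10.0, 100.0, 1000.0]

def auto_range(voltage):
    """Return the smallest full-scale range that can display the voltage."""
    for full_scale in RANGES:
        if abs(voltage) <= full_scale:
            return full_scale
    raise ValueError("voltage exceeds highest range")

print(auto_range(3.3))   # -> 10.0
print(auto_range(0.75))  # -> 1.0
```

On an auto-ranging meter this selection happens automatically; on an analog meter the user performs the same decision by hand with the rotary knob, with damage as the penalty for guessing too low.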
Significant Differences to Weigh
The most significant differences to weigh are usability, cost, accuracy, sensitivity and battery dependence. Digital multimeters are easier to learn and use but more expensive. Analog multimeters are typically accurate to within about 3%, while digital meters are accurate to within about 0.5%. Analog meters respond continuously to what a circuit is doing; digital meters take "snapshots" of the circuit at particular instants. Analog meters are also more vulnerable to the shocks and bangs of rough handling, while digital meters are more durable. An analog meter draws its power from the circuit when measuring volts or amps, relying on a battery only for measuring ohms; a digital meter uses its internal battery for every measurement, so you cannot use it at all if the battery is dead or missing. The digital meter's battery can also influence readings by adding to or subtracting from the power in the circuit.
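The accuracy figures quoted above are easy to put in concrete terms. A small back-of-envelope sketch of the worst-case error window when measuring a true 10 V signal:

```python
def error_window(true_value, accuracy_pct):
    """Worst-case (low, high) reading for a meter with the given % accuracy."""
    delta = true_value * accuracy_pct / 100.0
    return (true_value - delta, true_value + delta)

analog = error_window(10.0, 3.0)   # +/-3%, roughly 9.7 to 10.3 V
digital = error_window(10.0, 0.5)  # +/-0.5%, roughly 9.95 to 10.05 V
print(f"analog reads between {analog[0]:.2f} and {analog[1]:.2f} V")
print(f"digital reads between {digital[0]:.2f} and {digital[1]:.2f} V")
```

On a 10 V measurement, the analog meter's window is about six times wider than the digital meter's, which is why the accuracy difference matters for precision work.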