How to Calculate Resistance in Ohms
- 1). Set the multimeter to measure DC voltage. Depending on the model, this should require nothing more than turning a dial or flipping a switch. Test the voltage of a new AA battery by touching the positive probe to the positive end of the battery and the negative probe to the negative end, holding the probes only by their insulated portions. If the multimeter reads close to the voltage printed on the battery, the meter is calibrated and working. Always verify a multimeter this way before using it on home outlets.
- 2). Switch the multimeter to AC voltage, insert one probe into each of the two terminals of a wall socket, and read the display. The battery tested in the previous step supplied direct current (DC), so the polarity of the probes mattered; home electrical circuits supply alternating current (AC), so either probe can go into either terminal. Record the voltage of the outlet. Home outlets in the United States typically provide around 120 volts.
- 3). Remove the probes without letting their metal tips touch anything, including each other; mishandling the probes can damage the device or cause electric shock. Always withdraw the probes before switching multimeter settings to prevent accidents. Set the multimeter to amperage, reinsert the probes, one into each terminal, and record the amperage of the circuit. Note that on the amperage setting the meter itself completes the circuit, so confirm your meter is fused and rated for mains current before reinserting the probes.
- 4). Divide the voltage by the amperage to calculate the resistance, which is measured in ohms. Ohm's law states that resistance (R, in ohms) equals voltage (V, in volts) divided by current (I, in amperes): R = V / I. The overall resistance of the circuit reflects anything plugged into it, the breakers and sockets, and the wiring carrying the electricity. A worked example of this calculation appears after this list.
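To make the final step concrete, here is a minimal Python sketch of the Ohm's law division. The readings used (120.4 V and 14.8 A) are placeholder values chosen for illustration, not measurements from the article.

```python
def resistance_ohms(voltage_v: float, current_a: float) -> float:
    """Apply Ohm's law: R = V / I."""
    if current_a == 0:
        raise ValueError("current must be nonzero to compute resistance")
    return voltage_v / current_a

# Placeholder readings (assumed for illustration, not from the article):
voltage = 120.4  # volts, from the AC voltage reading in step 2
current = 14.8   # amperes, from the amperage reading in step 3

r = resistance_ohms(voltage, current)
print(f"R = {voltage} V / {current} A = {r:.2f} ohms")  # R = 120.4 V / 14.8 A = 8.14 ohms
```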