Here's an excerpt from the Western Digital website that clarifies this issue:
---quote---
Decimal vs. Binary:
For simplicity and consistency, hard drive manufacturers define a megabyte as 1,000,000 bytes and a gigabyte as 1,000,000,000 bytes. This is a decimal (base 10) measurement and is the industry standard. However, certain system BIOSs, FDISK and Windows define a megabyte as 1,048,576 bytes and a gigabyte as 1,073,741,824 bytes. Mac systems also use these values. These are binary (base 2) measurements.
To Determine Decimal Capacity:
A decimal capacity is determined by dividing the total number of bytes by the number of bytes per gigabyte (1,000,000,000 using base 10).
To Determine Binary Capacity:
A binary capacity is determined by dividing the total number of bytes by the number of bytes per gigabyte (1,073,741,824 using base 2).
This is why different utilities will report different capacities for the same drive. The number of bytes is the same, but a different number of bytes is used to make a megabyte and a gigabyte. This is similar to the difference between 0 degrees Celsius and 32 degrees Fahrenheit. It is the same temperature, but will be reported differently depending on the scale you are using.
---end quote---
So, as you can see, you're not losing any capacity on your hard drive; it's only the way the capacity is being reported (i.e., 120,000,000,000 bytes is still 120,000,000,000 bytes). That same drive shows up as 120 GB in decimal terms but only about 111.8 GB in binary terms, and the small amount of space consumed by formatting itself is insignificant.
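If you'd like to check the arithmetic yourself, here's a minimal Python sketch; the 120 GB drive size is just an example figure, not tied to any particular model:

    # Same byte count, two reporting conventions.
    total_bytes = 120_000_000_000              # a drive labeled "120 GB" by the manufacturer

    decimal_gb = total_bytes / 1_000_000_000   # base-10 gigabyte: 10**9 bytes
    binary_gb = total_bytes / 1_073_741_824    # base-2 gigabyte: 2**30 bytes

    print(f"Decimal capacity: {decimal_gb:.2f} GB")  # prints 120.00 GB
    print(f"Binary capacity:  {binary_gb:.2f} GB")   # prints 111.76 GB

Both numbers describe the same drive; the only thing that changes is the size of the "gigabyte" used as the divisor.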