To round off a real number is to take the integer nearest to it. When the
fractional part of a positive number is less than a half, the number is rounded
down, and when the fractional part is greater than a half, the number is
rounded up [with just the opposite happening for negative numbers]. When the
real number is exactly halfway between two integers, the common practice is to
take the greater of the two. For example, rounding off 3 gives 3, rounding off
3.5 gives 4, rounding off 7.2 gives 7, rounding off -0.2 gives 0, and rounding
off -99.7 gives -100.
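This first rule can be sketched in a few lines of Python; the function name round_half_up is my own label for it, not something from the text:

```python
import math

def round_half_up(x):
    # Round to the nearest integer; a value exactly halfway
    # between two integers goes to the greater of the two.
    return math.floor(x + 0.5)
```

With this definition, round_half_up(3.5) gives 4, round_half_up(-0.2) gives 0, and round_half_up(-99.7) gives -100, matching the examples above.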
Sometimes, for an integer-and-a-half value, rounding off is done to the nearest
integer of greater magnitude. So, whereas the first definition would round -0.5
off to 0 and -7.5 to -7, this alternate definition rounds -0.5 off to -1 and -7.5 to
-8. The first definition has the advantage that it is easier to code: it is simply
floor(x + 0.5). The second definition has the advantage that it makes a
little more sense, when you think about it a particular way (for example, adding 0.5
[rounded off to 1] to -0.5 [rounded off to -1] gives 0, which matches the unrounded sum).