A second reason for differences in brightness is that some stars are larger than others. Our own sun is in the prime of its life - called the 'main sequence' of stellar evolution - but stars that are further advanced in their life cycles can grow to giant proportions. Not surprisingly, a giant star will be much brighter than a main sequence star seen at the same distance.
The brightness of stars is usually expressed in terms of a 'magnitude' value, and - confusingly - this number actually gets bigger for fainter stars. The system originated in the ancient world, when people called the very brightest stars 'first magnitude', the next brightest 'second magnitude', and so on. By the 19th century, astronomers needed a more precise scale, so they defined magnitude 2 as 2.5 times fainter than magnitude 1, magnitude 3 as 2.5 times fainter than that, and so on. This allowed the system to be extended to the fractional and negative numbers that we see today.
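To make the arithmetic concrete, here is a minimal sketch (not part of the original text) assuming the standard modern convention, in which a difference of five magnitudes corresponds to exactly a factor of 100 in brightness, so each magnitude step is a factor of roughly 2.512:

```python
def brightness_ratio(mag_faint, mag_bright):
    """How many times brighter the brighter star appears than the fainter one.

    Assumes the standard convention that a 5-magnitude difference is exactly
    a factor of 100, so one magnitude step is 100 ** (1/5), roughly 2.512.
    """
    return 100 ** ((mag_faint - mag_bright) / 5)

# A magnitude 3 star compared with a magnitude 1 star:
# two steps of roughly 2.5 each, i.e. about 6.3 times fainter.
print(round(brightness_ratio(3, 1), 1))    # 6.3

# Sirius (about magnitude -1.5, a negative value on the extended scale)
# compared with a magnitude 6 star near the limit of naked-eye visibility.
print(round(brightness_ratio(6, -1.5)))    # 1000
```

Running the same relation in reverse, with a logarithm, is what lets astronomers quote the fractional magnitudes, such as 3.2 or -1.5, mentioned above.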