Sorry Marcus, but we do not all know that it is better to use 72 or 96 pixels per inch (not dots, since dots per inch refers to printer screening, just as samples per inch refers to a scanner) than 300 for the web.

This makes no difference whatsoever on a monitor. A browser cannot change the dimensions of a pixel at all. Your video card setting can: it lets you set the desktop to, for example, 800x600 or 1600x1200. In the latter setting each pixel is only half as "large" (in each direction) as in the former. So when you look at the same picture at 1600x1200, it covers only a quarter of the area it does at 800x600: the width and height are still the same number of pixels, but each pixel is half the size.
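If you want to see the arithmetic, here is a rough Python sketch (the 14.8-inch visible monitor width and the 400x300-pixel image are made-up examples). Notice that the image's ppi tag never even enters the calculation:

# Rough sketch: how large does a fixed-pixel image appear on the same monitor
# at two different desktop resolutions? The 14.8-inch visible width is an
# assumed example; any monitor size will do.
MONITOR_WIDTH_INCHES = 14.8
IMAGE_PIXELS = (400, 300)   # example image: width and height in pixels

for desktop_width, desktop_height in [(800, 600), (1600, 1200)]:
    # physical size of one screen pixel at this desktop setting
    pixel_size = MONITOR_WIDTH_INCHES / desktop_width
    shown_w = IMAGE_PIXELS[0] * pixel_size
    shown_h = IMAGE_PIXELS[1] * pixel_size
    print(f"{desktop_width}x{desktop_height}: the image appears about "
          f"{shown_w:.1f} x {shown_h:.1f} inches on screen")

Half the width, half the height, a quarter of the area, exactly as described above.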
If you tag an image at 300 ppi, it needs only a quarter of the physical (inch) dimensions of one tagged at 75 ppi to hold the same pixels, so on screen the two appear exactly the same size. In print, though, the 300 ppi version comes out at roughly a quarter of the size you see on screen.
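The print side is just pixels divided by ppi. A quick sketch along the same lines (the 1200x900-pixel image is an arbitrary example):

# Minimal sketch: the ppi tag changes the print size, not the on-screen size.
image_pixels = (1200, 900)   # example image dimensions in pixels

for ppi in (75, 300):
    width_in = image_pixels[0] / ppi    # printed width in inches
    height_in = image_pixels[1] / ppi   # printed height in inches
    print(f"At {ppi} ppi the image prints at {width_in:.0f} x {height_in:.0f} inches")

Same pixels, same appearance in the browser, but 16 x 12 inches on paper at 75 ppi versus 4 x 3 inches at 300 ppi.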

So, no, this makes no difference at all: only the number of pixels counts, and the displayed size varies with the monitor settings.

And as far as I know, the 72 comes from the first Mac computers. I really would like to find a good site on the history of computers. I vaguely remember a huge machine and me typing in Fortran IV, punch cards coming out of the machine, and, after I had put them in a tray to be read and interpreted by the computer, the unavoidable message that there were errors in the program and it couldn't do what I had asked. Things have gotten easier and more powerful.