I wonder how the term "dpi" came to be such a source of confusion for so many people. I honestly blame Adobe for this, because even at the inception of PostScript in the mid-80s, displays were not all 72 dpi; Adobe defined that number for other, unrelated, and technically inaccurate reasons.
If it's a digital image, there is no "dpi." It stands for Dots Per Inch, and in most cases "dots" means pixels. Digital images all by themselves have no inches, only pixels. It's only when you know how many inches you're talking about that the term "dpi" could possibly be meaningful. The dpi number attached to a file has nothing to do with its quality at all.
If you have a 10x10 inch piece of paper, THEN you can consider how many pixels to put on it. 300 dpi works out to 3000x3000 pixels. Easy math. 150 dpi works out to 1500x1500 pixels. Easy math.
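The arithmetic really is that easy, and it fits in a few lines of code. Here's a minimal sketch of the calculation; the function name and example sizes are mine, not anything standard:

```python
def pixels_for_print(width_in, height_in, dpi):
    """Pixel dimensions needed to fill a paper size at a given dpi."""
    return round(width_in * dpi), round(height_in * dpi)

# A 10x10 inch print:
print(pixels_for_print(10, 10, 300))  # (3000, 3000)
print(pixels_for_print(10, 10, 150))  # (1500, 1500)
```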
If your target is the screen, forget about dpi entirely. Consider two copies of the same image: one saved with a 3 dpi tag, the other saved at 3000 dpi. See any difference in quality? The dpi number does not mean quality.
If I printed these images from a graphics program, and the graphics program didn't tell the printer any special sizing information (which is exceedingly rare), then the 3 dpi image would measure over 100" wide, and the 3000 dpi image would measure just over 1" wide. If the program told the printer what to do explicitly, as most applications do, then they'd print the same size, just as your web browser shows them.
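Going the other direction, the unscaled print size is just pixels divided by dpi. A minimal sketch, assuming a hypothetical image 3300 pixels wide (the actual pixel dimensions of the example images aren't stated):

```python
def print_width_inches(width_px, dpi):
    """Printed width, in inches, of an image sent to the printer unscaled."""
    return width_px / dpi

width_px = 3300  # hypothetical width; any pixel count works the same way
print(print_width_inches(width_px, 3))     # 1100.0 -- well over 100" wide
print(print_width_inches(width_px, 3000))  # 1.1    -- just over 1" wide
```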
Both were saved with the same JPEG compression settings, so even though the "dpi" values were saved differently, they both come out to the same file size too. The dpi does not indicate quality!
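You can verify this yourself. Here's a minimal sketch using Pillow (my choice of library, not anything from the original demonstration): save the same pixels twice with different dpi tags, then compare. The decoded image data comes back byte-for-byte identical, and since the dpi tag is just a couple of fixed-size fields in the JPEG header, the file sizes match too:

```python
import os
from PIL import Image

# Any image will do; a synthetic 400x400 gradient stands in here.
img = Image.new("RGB", (400, 400))
img.putdata([(x % 256, y % 256, 128) for y in range(400) for x in range(400)])

# Same pixels, same JPEG quality -- only the dpi metadata differs.
img.save("low.jpg", quality=90, dpi=(3, 3))
img.save("high.jpg", quality=90, dpi=(3000, 3000))

low, high = Image.open("low.jpg"), Image.open("high.jpg")
print(low.info.get("dpi"), high.info.get("dpi"))  # e.g. (3, 3) and (3000, 3000)
print(low.size == high.size)                      # True: same pixel dimensions
print(low.tobytes() == high.tobytes())            # True: identical image data
print(os.path.getsize("low.jpg"), os.path.getsize("high.jpg"))  # same size
```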