"My understanding was that 200x200 on a MAC is different to a 200x200 on a PC because MAC screens are 72dpi and PC screens are 96dpi."
Untrue and irrelevant. 200x200 is 200x200 on any computer in the universe.
K
Saying that 200 pixels are 200 pixels is about as relevant as debating how many angels can dance on the head of a pin. :-)
Pixels have no dimensions whatsoever; they're completely virtual. Of course you also have the monitor settings, but these only tell you how many pixels can be shown horizontally and vertically without having to scroll. They tell you nothing about the physical size of an image (and every image has one).
So here we have no reference and are stuck in a vacuum. Saying that every screen pixel corresponds to an image pixel is true, but it doesn't help us, because a pixel can be any size.
Therefore people agreed on a fixed standard. The Mac standard was 72 screen pixels for every inch; with this standard people knew what they were talking about, and the Mac stayed faithful to it. The PC nowadays shows 96 pixels for every inch. So even if two images have the same size in pixels, they differ in visible size on the monitor, and that's what this topic is all about. AND it is important for the web.
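To put some numbers on that, here's a rough sketch (Python, with a 400-pixel example width I picked myself) of how the same pixel dimensions come out at different physical sizes on a 72 PPI screen versus a 96 PPI one:

```python
# A minimal sketch: the same pixel count occupies different physical
# space depending on the screen's pixels-per-inch (example numbers only).

def visible_inches(pixels, screen_ppi):
    """Physical size of an image shown 1:1 (one image pixel per screen pixel)."""
    return pixels / screen_ppi

width_px = 400  # hypothetical image width
print(round(visible_inches(width_px, 72), 2))  # 5.56 inches on a 72 PPI (Mac) screen
print(round(visible_inches(width_px, 96), 2))  # 4.17 inches on a 96 PPI (PC) screen
```

Same 400 pixels, roughly a quarter difference in visible size, which is exactly the mismatch this thread is about.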
Of course you don't need to scan at 72 PPI for the web; 600 PPI is at least as good if you adapt the physical size of your image.
If you don't agree with this, then please correct me: tell me where I'm wrong, what 72 PPI or 96 PPI screen resolution really means, and why there is this difference between the two.
If you don't work against time, time often works for you.
http://www.macsecurity.org/mail/maco...msg00046.shtml
you might like to take a look at this page
If I'm being ignorant please enlighten me, but what's the point of keeping track of DPI if you're just going to display it on a monitor? As far as I'm concerned, DPI is only useful when printing/scanning.
I don't understand the point of 96 DPI on PCs and 72 DPI on Macs. The way I see it: a 33-inch monitor (hypothetical) at 640x480 resolution = fewer pixels in an inch/cm/mm/mile (whatever). The same 33-inch monitor at 4 trillion x 3 trillion resolution = more pixels in an inch.
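That's exactly the geometry. A quick sketch (Python, using the hypothetical 33-inch monitor from above plus a made-up high resolution) shows that pixels per inch comes from the monitor's physical size and resolution, not from anything stored in the image:

```python
import math

def screen_ppi(width_px, height_px, diagonal_inches):
    """Physical pixels per inch of a display at a given resolution."""
    diagonal_px = math.hypot(width_px, height_px)  # resolution along the diagonal
    return diagonal_px / diagonal_inches

# The hypothetical 33-inch monitor from the post:
print(round(screen_ppi(640, 480, 33), 1))    # ~24 PPI: very coarse
print(round(screen_ppi(3840, 2160, 33), 1))  # ~134 PPI: far more pixels per inch
```

Nothing in the image file changes between those two cases; only the screen does.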
The day when browsers do scale images according to DPI will just make things more complicated. According to Cascading Style Sheets, 20-pixel-tall text doesn't have to be 20 pixels (it can be 10 pixels, 18 pixels, or 2000 miles tall and it'll be totally OK according to the CSS guidelines). Details here:
http://www.macedition.com/cb/cb_20010604.shtml
My advice: when you see the word DPI, ignore it; take a piece of tape and cover it up on your monitor.
[This message was edited by Alex W. on June 05, 2001 at 17:32.]
Alex, you are totally right, and so you spared me from writing all that - thanks! :-)
"My advice: when you see the word DPI, ignore it; take a piece of tape and cover it up on your monitor."
I can only add: AMEN.
K
So, maybe this is just for printing purposes, but what is all this Mac v PC stuff then?
I did think (in my ignorance, having not had much contact with Macs) that Mac cathode-ray tubes were a different pitch to the standard PC monitor tubes, hence a different DPI for Macs than for PCs.
I've also read that for web work, saving at greater than 96 DPI is a waste of time, as you are just saving info for the printer and not the screen. Again, I thought this was to do with the physical pitch of the monitor.
Turan
Sorry if I'm just prolonging what you've already said above, but I've seen such info in more than one place, and now you seem to be saying it's rubbish. :-(
Sorry, reading Alex's post just now does not help :-( If I'm reading correctly, one pixel and one physical dot on the tube are two different things? I think. Maybe a diagram would help.
Reading it again, it seems to say the OS picks the DPI, not the physical device? More confused.
[This message was edited by Turan Mirza on June 06, 2001 at 03:04.]
[This message was edited by Turan Mirza on June 06, 2001 at 03:12.]
DPI can be important when you are only using a bitmap for on screen stuff. A pixel, as you probably know, is the smallest area of the screen a computer can control. The size of a pixel is dependent on the resolution and size of the screen you are using. Because different people have different screens running at different resolutions, a 640 x 480 pixel picture could fill the screen on one monitor and take up only a fraction of the display area on another. As the size in pixels can make such a big difference to the size of pictures, some bitmaps also contain a DPI value that indicates how many dots of that particular picture should be used for each inch of screen area; the idea is that applications will scale the bitmap so it is always the same physical size no matter what monitor a person is using.
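As a sketch of that scaling idea (Python, with a made-up 400-pixel example; the function name is my own, not any real API), a DPI-aware application converts the image to inches using the bitmap's DPI field, then back to pixels using the screen's PPI:

```python
# Illustrative sketch only: how a DPI-aware application could scale a
# bitmap so it keeps its intended physical size on any screen.

def display_size_px(image_px, image_dpi, screen_ppi):
    """Screen pixels needed for image_px dots to span their intended inches."""
    intended_inches = image_px / image_dpi       # size the author intended
    return round(intended_inches * screen_ppi)   # pixels on this screen

# A 400-pixel-wide image tagged 72 DPI is meant to be about 5.6 inches wide:
print(display_size_px(400, 72, 72))  # 400 pixels on a 72 PPI screen
print(display_size_px(400, 72, 96))  # 533 pixels on a 96 PPI screen
```

Of course this only works if the application can find out the screen's real PPI, which is exactly the problem described next.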
For all this to work, applications need to be able to find out the DPI of the screen they are displaying the bitmap on. Unfortunately, most versions of Windows assume a screen has a DPI of 96 no matter what screen size is being used. Windows 2000 allows more accurate setting of the DPI via the Font Size option in the monitor settings (accessible from the Settings page of the Display Properties dialog by pressing the Advanced button); in theory it should be possible in Windows 2000 to set the font size to Other and then enter an exact DPI. Other Windows operating systems allow some control over the screen's DPI by using the Large Fonts option, which sets the DPI to 120. The problem with both of these methods of setting the screen's DPI is that most applications don't work that well with DPIs other than 96 (largely due to the way Windows is designed).
Although some applications don't use the DPI field of bitmaps, it is good practice to set it correctly so your pictures will look good in applications that do use it.
Thanks Jonathan.
My 11 years of writing low-level embedded software and drivers for custom hardware and OSs shows I still have a lot to learn about software and high-level OSs.
Turan
Quote:"DPI can be important when you are only using a bitmap for on screen stuff." True, Jonathan.
I am not a Catholic, so I can't say "Amen".
This whole discussion began because there is a visual difference of some 25% between Xara and Photoshop or Flash.
When you make and save a file of 400x200 pixels in Photoshop, you cannot get around a DPI setting, even if you choose pixels as your units.
I chose 72 DPI. Now when I open this file in XaraX, it is some 25% bigger, even if I set the options to "DPI autogenerated" at 72 DPI, or if I scale from 96 to 72. Xara adapts the pixel size. (see image)
When you save as .swf and open it in Flash, it is some 25% smaller than in Xara. And this is caused by the difference between 72 and 96, as Jonathan explains.
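For what it's worth, the size jumps fall straight out of the 72/96 ratio; a quick sketch (Python, using the 400-pixel width from above as the example):

```python
# The mismatch is just the ratio of the two standards (illustration only).
mac_ppi, pc_ppi = 72, 96

factor_up = pc_ppi / mac_ppi    # 4/3: a 72 DPI image opened at 96 shows ~33% bigger
factor_down = mac_ppi / pc_ppi  # 3/4: going from 96 back to 72 shows 25% smaller

print(round(400 * factor_up))    # 533 pixels wide
print(round(400 * factor_down))  # 300 pixels wide
```

Note the asymmetry: going up is a third bigger, going down is a quarter smaller, which is why "some 25%" looks slightly different depending on the direction you compare in.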
I can't upload with Opera, so here is the attachment.
I opened an image of 400x200 pixels (made in PhotoshopLE) in Xara.
You can verify on the bottom line that I changed Xara's options and set them to 72 DPI, and you can see in the dimensions box that Xara changed the pixel dimensions.