-
For those of you who don't know, I teach graphics and multimedia classes for a living (Flash, PhotoShop, etc.). For several years now, I've been explaining to students how to optimize graphics for the web, and inevitably the subject of dpi/ppi comes up. I usually tell students to save files (or resample files) to 72dpi, because dpi stands for dots-per-inch and there are 72 dots (pixels) per inch on computer monitors. (We've discussed the PC/Mac difference here before: PCs display at 96dpi -- actually many Mac monitors do too nowadays. Xara is a PC app, so its default resolution is 96.)
Trouble is, this isn't true. Monitors DON'T display 72 pixels per inch. Nor do they display 96 pixels per inch. In fact, the concept of an "inch" doesn't make much sense when talking about monitors. For instance, if you design a 72 pixel by 72 pixel graphic at 72dpi, this should display at 1 inch x 1 inch, right? You should be able to hold a ruler up to your monitor and measure it at 1 inch x 1 inch, right? Okay, try displaying it on one of the monitors in Times Square. It will probably measure 1 yard x 1 yard.
Pixels have no set size! Monitors have no set size. So what, exactly, does the 72 (or 96) dpi setting (in Xara, PhotoShop, etc.) mean?
As best as I can understand it, 72dpi means "if you have a 13-15inch monitor, every inch of this image will contain 72 pixels. If you have a larger monitor, every UNIT will contain 72 pixels, and it's up to you to figure out what a UNIT is."
Does anyone here have a REALLY good gut understanding of dpi (a.k.a. ppi)? Say I'm designing a graphic that's going to appear during the title sequence of a major motion picture (shown in the cinema). What does 72dpi mean to me?
I've searched the web for details, but almost every graphics site says "set resolution to 72dpi, because monitors display at 72dpi."
Hope this doesn't seem wildly off-topic.
Marcus Geduld
{ email me } { visit me }
-
I know the score on this pretty well, Marcus - and I agree with you that what we hear all around is a lot of confused hogwash. You're right: "72dpi" for a movie or a web page or a Times Square screen means NOTHING. It's a mere nominal, arbitrary value, since there must be SOME value in a file's header about the "resolution". This file header info - the "DPI" - is only usable by printers, when the image is to be rendered at absolute physical dimensions.
The best simple advice I know is to forget ALL about DPI/PPI and only consider the number of horiz./vert. pixels in a file - e.g., 600x800, 2000x3000. Only this states the REAL amount of information in a digital image.
K
-
Use pixels for the web and forget the DPI issue.
The only time DPI comes into play is with print, because pixels are of varying sizes on a printer.
So this is the deal: DPI means nothing to your PC monitor, it only knows pixels. Pixels mean nothing when printing, because a printer's pixels will be different sizes. If you want a 1-inch image on your printer and you are printing at 600dpi then you need a 600 pixel image.
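That arithmetic can be sketched in a couple of lines (a toy illustration, not any particular program's API):

```python
def pixels_needed(inches, dpi):
    """Pixels required to print at a given physical size and
    printer resolution: pixels = inches x dots-per-inch."""
    return round(inches * dpi)

print(pixels_needed(1, 600))  # 600 - a 1-inch image at 600dpi needs 600 pixels
print(pixels_needed(8, 300))  # 2400 - pixels to span 8 inches at 300dpi
```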
Michael Ward
http://LeighCenturions.net
-
If dpi/ppi meant nothing, we could totally ignore it. Still, we all know it's better to make web images 72/96 dpi than 300 dpi. So the concept isn't meaningless. (A 1 inch by 1 inch file at 300 dpi will appear much bigger than 1 inch by 1 inch on 13-19 inch monitors set at some reasonable resolution, like 800x600).
Also, I haven't done a lot of film work, but I'm pretty sure that people making graphics for the big screen are also advised to make them 72dpi. Why? (This gets even more confusing when you think that almost all films wind up on television. So the graphic is going to be displayed on a huge screen AND a small screen).
Marcus Geduld
{ email me } { visit me }
-
Sorry Marcus, but we do not all know that it is better to use 72 or 96 pixels per inch (not dots, as those refer to screening, just like samples refer to a scanner) than 300 for the web.
This makes no difference whatsoever on a monitor. A browser cannot change the dimension of a pixel at all. Your video card setting can, and allows you to set it at, for example, 800x600 or 1600x1200. In the latter setting the pixels are only half as "large" as in the former one. But when you look at the same pic at 1600x1200 it will only be one quarter of the area (half the width, half the height) it has at 800x600, as height and width are still the same number of pixels but each pixel is only half the size.
If you use 300ppi, you only need a quarter of the physical (inch) dimensions to get the same number of pixels as an image set at 75ppi - so it appears just as large on screen. But in print, it will come out at a quarter of the size you see.
So, no, this makes no difference at all: only the number of pixels counts, and the size varies with the monitor settings.
And as far as I know, the 72 comes from the first Mac computers. I really would like to find a good site on the history of computers. I vaguely remember a huge machine and me typing in Fortran IV, cards coming out of the machine, and after putting them on a tray to be read and interpreted by the computer, there was the unavoidable message that there were faults in the programming and it couldn't do what I asked it to do. Things have gotten easier and more powerful.
-
Marcus,
Klaus and Michael are totally right!
FORGET dpi !! (when talking of screens)
Why ??
Create a 100 x 100 pixel graphic and save it once at 300dpi and once at 72dpi (as GIF or JPG or PNG).
View them both in your browser and - voilà - they will have the same size.
They will no longer look the same if you print them both. The 300dpi version will come out much smaller than the 72dpi version.
My clients sometimes ask me at how many dpi I need the pictures for their webpages. I always tell them that this does not matter and that I'd prefer this or that size in pixels. They rarely understand me ...
I guess this is because almost nobody actually understands the dpi concept. Well, most of us graphics people here may do, but outside this place the air gets thin.
Everybody who buys a consumer printer, ink or laser, asks one thing first: "How many dpi does it have?". Even though this number does not directly relate to the final quality, it's the only distinctive feature most people know. Or think they know.
Alas, nobody asks for dpi when they buy a monitor. The size in inches is the common known feature here.
And as long as you can have the same resolution on different sized monitors, dpi will not make any sense.
I can set my 18" TFT to 640x480 which will make a 100x100 pixel graphic really big. You can set the same resolution on a 14" CRT monitor and the 100x100 will look much smaller.
What will stay the same is the relation between the size of your graphic and the size of your screen. And that's what really counts here!
Wolfgang
-
I don't know exactly what DPI means, but I can tell you what I've learned from one little experience.
A month ago I was asked to design some flyers for print advertising 2 shows. I thought, no biggy, and got them done within a week. I then sent them to the company in bitmap form (by which I mean BMP, JPEG, TIFF, SWF), and when they printed out, all looked well except the text. It printed out really fuzzy and too anti-aliased. The guy said the dpi needed to be higher, really high! So I put it up to the max Xara would allow (300 - the guy wanted even more than that!) and it was still fuzzy!?!? It was really frustrating! So either Xara doesn't allow enough dpi on export for proper printing, or its anti-aliasing is too much!? Hope this helps
Steve Newport
-
I agree that once you have a bitmap, DPI is nonsense. Pixels are pixels.
But, try this:
1. In Xara X set your Page Options Units to INCH.
2. Create a 1 inch square, no outline.
3. Include a fill of your choice or leave it black.
4. Select just the square and export it as a PNG bitmap. Set the DPI to 96 on the Bitmap Size tab.
5. Now, export it as a PNG bitmap a second time. Choose a new filename and set its DPI to 300.
When you look at the two PNG bitmaps, you will find that they are different sizes: 96 and 300 pixels square respectively. A bitmap DPI specification has resulted in the correct translation between the inch-defined vector graphic and the pixel-defined bitmap graphic(s).
I suspect this also applies to many or all bitmap exports that support a DPI attribute.
I've long wondered if some bitmap formats internally carry the DPI setting used during their creation, or their real-world dimensions (in, cm), for later post-processing. That would allow a bitmap to be printed at the author's intended size. If anyone can speak to that aspect, please inform us.
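For what it's worth, some formats do carry this: PNG has an optional pHYs chunk storing pixels per metre, and JFIF JPEGs store a density field. Here's a rough sketch of digging the figure out of a PNG by hand (it trusts the file layout and skips CRC checking):

```python
import struct

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def png_dpi(data):
    """Walk the PNG chunk list looking for pHYs; if its unit flag
    says 'per metre', convert pixels-per-metre to dots-per-inch."""
    if data[:8] != PNG_SIG:
        raise ValueError("not a PNG")
    pos = 8
    while pos + 8 <= len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        if ctype == b"pHYs":
            xppm, yppm, unit = struct.unpack(">IIB", data[pos + 8:pos + 17])
            if unit == 1:                 # 1 = pixels per metre
                return round(xppm * 0.0254), round(yppm * 0.0254)
            return None                   # 0 = aspect ratio only, no DPI
        pos += 8 + length + 4             # skip header, data, CRC
    return None                           # no pHYs chunk at all

# Fake minimal stream: signature + a pHYs chunk claiming 3780 px/m (~96dpi)
fake = PNG_SIG + struct.pack(">I4sIIB", 9, b"pHYs", 3780, 3780, 1) + b"\0\0\0\0"
print(png_dpi(fake))  # (96, 96)
```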
Happy holidays everyone.
-
Marcus: "Still, we all know it's better to make web images 72/96 dpi than 300 dpi."
Nonsense. 300x300 pixels will look the same on your browser no matter what "dpi" you specify.
"So the concept isn't meaningless."
In the context of web and screen, it is. In the context of print it is hugely meaningful. So make up your mind whether you are dealing with web or print. If web, forget DPI. Or go on forever confusing yourself - but you have been warned! ;-)
K
-
Al, that's just what it is all about. The "one inch" is of no importance when you look at a file on your monitor, except that it specifies how many pixels there will be:
1 inch with 72 pixels per inch gives 72 pixels
1 inch with 300 pixels per inch gives 300 pixels
Nothing more, nothing less.
And if your monitor is set at 800x600, your one inch with 72 pixels in it will be approx. 1/11 of your monitor's width, and your one inch with 300 pixels in it will be 3/8 (nearly half).
If you don't work against time, time often works for you.
-
Steve
I know that Xara has a number of dpi settings in the export dialogue but you CAN save higher than that. I thought this at one time, but then I thought: let's try replacing the 300dpi shown with 400dpi - and it worked. Prior to this I had to export at 300dpi and then resample in Corel PhotoPaint, a right pain in the you know what!!! Now I simply delete 300 and replace it with 400 and I have no problem with fuzzy text (albeit on A4 printed material).
On a general point, I also get confused over the question of dpi/ppi and my partner gets very frustrated at my lack of understanding. I do try very hard but I do find it a difficult concept to get my head around. I'll get there though - and not before time as far as my partner is concerned, I'm sure.
Tracey
-
Understanding "resolution" is an old and ongoing issue. I find this thread particularly good because different people are saying the same thing ... it's just a matter of how they explain it and THE WAY THE READER UNDERSTANDS IT. Eric and Klaus are essentially saying the same thing; one reader may connect with what Eric has written while another connects with what Klaus has written. Bottom line, more readers MAY become better informed.
What I come away from this is:
The monitor (based on its size and setting) is a FIXED MATRIX of "screen units". One pixel is mapped to one screen unit. It's like a big piece of graph paper where ONE PIXEL of color is assigned TO ONE SQUARE on the graph paper.
If the monitor setting is increased from, say, 640 x 480 to 800 x 600, it would be similar to mapping a drawing onto 4-square/inch graph paper and then redrawing the same image (one pixel to one square) on 6-square/inch graph paper.
A printer, on the other hand, has the ability to interpret the number of colored "dots" per "inch" of paper space.
-
Given that we're speaking of physical printing, all text - given the razor-sharp nature of letters - needs to have a DPI of at least 600! If it's a big heading at 70-100pt you can get away with less, but if it's small body-copy text at 8-9 point you'll need 800 or higher to avoid fuzziness. Big numbers for potentially huge files!
But of course, text should NOT ever be bitmapped!!! That's why we have file formats like EPS, which allow for vector text and bitmap images in the same file, for both razor sharp text output AND a more sensible DPI for the images - around 200-300 DPI for ordinary halftone screens.
The well-known "rule" which states you should have a bitmap DPI twice that of the LPI of the halftone screen is another myth devoid of real knowledge: depending on the image, you can have far less or need far more. A totally soft-focus photo can have a DPI number the same as (or even less than) the LPI number without any adverse effects, whereas an image with razor-sharp information - such as text! - needs a much higher ratio.
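A trivial sketch of that point - the folklore "2x" is really just one value of an adjustable quality factor:

```python
def suggested_ppi(lpi, quality_factor=2.0):
    """Image resolution for halftone output. The folklore fixes
    quality_factor at 2, but it varies with how sharp the image
    content actually is."""
    return round(lpi * quality_factor)

print(suggested_ppi(150))       # 300 - the usual "2x lpi" rule of thumb
print(suggested_ppi(150, 1.0))  # 150 - can suffice for a soft-focus photo
print(suggested_ppi(150, 4.0))  # 600 - razor-sharp content such as text
```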
K
www.klausnordby.com/xara
[This message was edited by Klaus Nordby on December 08, 2001 at 13:44.]
-
Regarding XaraX and DPI output: again, forget about DPI! XX's bitmap output is only limited to this: 32,000 x 32,000 pixels. Printed at 300 DPI, that's about a wall-sized image. Printed at 3000 dpi, that's about a page-sized image. So take your pick, depending on your output needs! :-)
K
www.klausnordby.com/xara
-
Tracey, thanks a lot for the info, this will be VERY useful in the future. My Mom was getting very frustrated at Xara because of that reason and said I need to learn new programs (like Illustrator, I hate that program!!)?? I know all those are more widely used, but people need to see that Xara is sooo much better!!! It is really frustrating, the thought that before I start working in this field I'm going to have to learn that program, or some other crappy one :-(
Steve Newport
-
1 Attachment(s)
This is my graphical attempt to explain this (though I'm not sure how well I've done :-) )
BTW, all figures below are theoretical.
Michael Ward
http://LeighCenturions.net
-
Michael, it's a good graph to explain what's really very simple - except your graph is kinda wrong! Inches do not apply to screen images: they are not physical entities having fixed dimensions. To explain the 1:1 matching between an image and a particular screen "resolution" (a total misnomer and a major conceptual culprit in all this confusion) you should show how one 16x16 pixel image is still a 16x16 image on ANY screen "resolution".
K
www.klausnordby.com/xara
-
If you read what I've put, I say that the figures are simply relative.
If you have 800x600 on one particular monitor then you will fit half as much into one inch of real space as you would with 1600x1200 (which is more or less true).
Michael Ward
http://LeighCenturions.net
-
I may be misunderstanding (it's late, and I'll reread all your great posts tomorrow), but it seems like many of you are saying that dpi/ppi is totally unimportant on the web. If that's true, it doesn't matter whether I make web images 72ppi, 300ppi or 3000ppi.
This flies in the face of DOZENS of books I own (some of them very well respected, like "Real World PhotoShop"), all of which insist that web images should be saved at 72ppi. Can they all be wrong?
If so, why does this myth have such a lifespan? If not -- if it IS true -- what is magic about the number 72?
I DO realize that a 3000 or 300dpi file will have a much larger file size (more pixels = more bytes) than a 72dpi file -- which means it will take longer to download. But that STILL doesn't explain why 72 (or even 96) is such a major target.
Marcus Geduld
{ email me } { visit me }
-
Marcus: "I may be misunderstanding (it's late, and I'll reread all your great posts tomorrow), but it seems like many of you are saying that dpi/ppi is totally unimportant on the web. If that's true, it doesn't matter whether I make web images 72ppi, 300ppi or 3000ppi."
Right!
Marcus: "This flies in the face of DOZENS of books I own (some of them very well respected, like "Real World PhotoShop"), all of which insist that web images should be saved at 72ppi. Can they all be wrong?"
Yes! It's a stupid, Mac-biased myth.
Marcus: "If so, why does this myth have such a lifespan? If not -- if it IS true -- what is magic about the number 72?"
It's the default "screen resolution" of the Mac. And virtually everybody who uses a Mac - including tons of graphics people - is brain-dead, because the Mac does not encourage the use of one's brain.
Also, the number "72" is "magical" (hah!) because the PostScript standard for typographical points is 72 points in an inch (which is not *exactly* the traditional measure). This is - most likely - the historical origin of that figure.
K
www.klausnordby.com/xara
-
Great thread! Try this link: The Mad, Mad World of Pixels Per Inch
Another point worth mentioning is the difference between CRT monitors and TFT monitors. I don't pretend to understand the difference, but I believe a TFT monitor is set at a particular resolution, and whilst it can mimic other resolutions it does so differently and far worse than a CRT screen.
Egg
-
Klaus/Marcus
I would have thought that the lower figures are purely for the purpose of keeping file sizes down. After all, we all know that the higher the dpi the larger the file and the larger the file the longer the download time. For the web therefore, the lower the file size the better and hence the lower dpi for web graphics.
Apart from this, I'm fairly sure that whenever I've tried saving a GIF it doesn't matter what the dpi of the source file is, whether 3000dpi or 150dpi, it is always saved at 72dpi. As far as I am aware this does not apply to JPG or PNG.
.....or have I got it wrong - again.
Tracey
-
Hi,
Just thought I'd add my bit.
You can find the resolution of your screen quite easily. Try this:
Create a new image, draw a square at say, 100x100 pixels to start with. Now resize it so that when you hold a ruler up to your screen, it measures 1 inch by 1 inch.
Now count the pixels - on my screen I have a 95x95 square. So the 'dots/pixels per inch' of my screen is 95.
Of course this is only approximate as it's not easy to accurately measure an object on a screen.
You can do another test. Open/create a small picture, set its resolution to your screen resolution (in my case - 95 - found using the method above) and print it at that resolution. Hold the print up to your screen image - they should be pretty similar in size.
Obviously, if you change your screen settings from say, 800x600 to 1024x768, then you'll get a different result.
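The ruler experiment can also be approximated with arithmetic (assuming the quoted diagonal matches the actual viewable area - on CRTs it usually doesn't, so treat the results as rough):

```python
import math

def effective_ppi(width_px, height_px, viewable_diagonal_in):
    """Pixels per inch a particular monitor actually displays:
    diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / viewable_diagonal_in

print(round(effective_ppi(800, 600, 14), 1))    # 71.4 - near the mythical 72
print(round(effective_ppi(1024, 768, 17), 1))   # 75.3 - same myth, bigger CRT
print(round(effective_ppi(1600, 1200, 14), 1))  # 142.9 - and here it collapses
```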
-Richard
-
Tracey: "I would have thought that the lower figures are purely for the purpose of keeping file sizes down. After all, we all know that the higher the dpi the larger the file and the larger the file the longer the download time. For the web therefore, the lower the file size the better and hence the lower dpi for web graphics."
NO! NO! NO! You still don't get it. There is no "DPI" for screen/web images - there are ONLY the actual pixel dimensions: 300x300, 600x600, etc.
For anyone who finds this resolution-topic even slightly confusing, I repeat this simple - but totally correct - advice: WHEN IT COMES TO SCREEN/WEB IMAGES, FORGET YOU EVER HEARD ABOUT DPI. IT ONLY PERTAINS TO PRINTING.
Exasperatedly,
K
www.klausnordby.com/xara
-
Richard, what you're saying is of course mathematically correct. :-) But it's also TOTALLY useless - and only further serves to spread the confusion that screen/web images "really have" a DPI. If people think that your ruler idea is relevant, then they will start to create screen/web graphics with 92 and 98 and 88 and 101 DPI - ALL OF WHICH IS SENSELESS AND A COMPLETE WASTE OF TIME.
K
www.klausnordby.com/xara
-
Thanks Klaus & company!
Anyone here can also "get it." Just perform this simple test using Photoshop (or some other application that lets you choose image width & height in inches AND pixels and also image resolution in pixels per inch):
Create a new image (File > New). In the new image dialogue box, enter the following settings:
WIDTH: 100 pixels (NOT inches)
HEIGHT: 100 pixels (NOT inches)
RESOLUTION: 72 pixels per inch
DON'T click the OK button. Instead, look at the top of the dialogue box. Photoshop tells you how big this image will be: 30K
Now click the OK button and look at the physical dimensions of the graphic: it's 100pix by 100pix (as you specified), so it's a small square.
OK, now create ANOTHER new image. Use the following settings:
WIDTH: 100 pixels (NOT inches)
HEIGHT: 100 pixels (NOT inches)
RESOLUTION: 300 pixels per inch
NOTE: these are the same settings as in our first test, except here we're using 300 pixels per inch instead of 72 pixels per inch.
DON'T click the OK button. Instead, look at the top of the dialogue box. Photoshop tells you how big this image will be: 30K
Click OK and now compare the physical dimensions of the two images. They are EXACTLY the same size -- same width and height. Which shouldn't be surprising, because they are both 100px by 100px. The resolution had NO effect at all! (Of course, it would have a MAJOR effect if we printed these graphics).
Also, note the file sizes are the same: 30K and 30K. They are the same because BOTH images contain 10,000 pixels (100px by 100px). File size is determined by number of pixels -- NOT by dpi. So there will be no difference in download time either. Resolution is MEANINGLESS on the web.
Now, here's ANOTHER experiment:
Create ANOTHER new image using the following settings:
WIDTH: 1 inch
HEIGHT: 1 inch
RESOLUTION: 72 pixels per inch
This image is 16K in filesize and looks quite small on my screen.
Create a final image using the following settings:
WIDTH: 1 inch
HEIGHT: 1 inch
RESOLUTION: 300 pixels per inch
This image is 264K and looks much larger on my screen!
Both images are 1 square inch (in print), but whereas we've only packed 5184 pixels into the first one (72px * 72px = 5184px), we've packed 90000 pixels into the second one (300px * 300px = 90000px). More pixels = a bigger file = longer download time. Also, a 300 pixel by 300 pixel image appears larger -- on the screen -- than a 72 pixel by 72 pixel image.
Bottom line: all that matters online is the number of pixels in the width and the number of pixels in the height. The resolution (dpi/ppi) doesn't do anything for you, good or bad.
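The file-size side of those experiments reduces to one formula - note that DPI appears nowhere in it (24-bit colour assumed, uncompressed in-memory size):

```python
def uncompressed_bytes(width_px, height_px, bits_per_pixel=24):
    """In-memory bitmap size: pixel count times colour depth.
    No resolution (dpi/ppi) term anywhere."""
    return width_px * height_px * bits_per_pixel // 8

print(uncompressed_bytes(100, 100))  # 30000 -> ~30K at 72ppi AND at 300ppi
print(uncompressed_bytes(72, 72))    # 15552 -> the ~16K "1 inch at 72ppi" file
print(uncompressed_bytes(300, 300))  # 270000 -> the ~264K "1 inch at 300ppi" file
```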
Marcus Geduld
{ email me } { visit me }
-
1 Attachment(s)
-
1 Attachment(s)
-
1 Attachment(s)
-
1 Attachment(s)
-
1 Attachment(s)
-
and explained that really well!
Simple as:
Use pixels as a measurement for the web, and forget DPI.
Use inches as a measurement for print work, and set the appropriate DPI for your output (i.e. 600dpi if your printer is 600dpi)
Michael Ward
http://LeighCenturions.net
-
Guys, now you're finally getting what I have been saying all along!
No DPI for screen/web, lots of DPI for print.
Welcome to the club - but I fear it's still a small fraternity! I estimate that only 5% of all designers understand these things - and only 2% of all SB-people (who are even dumber than designers).
K
www.klausnordby.com/xara
-
Egg: "Another point worth mentioning is the difference between CRT monitors and TFT monitors. I don't pretend to understand the difference but I believe a TFT monitor is set a particular resolution, and whilst it can copy other resolutions it does it differently and far worse than a CRT screen"
Your observations are all correct. The difference between CRT (regular, old-style monitors) and TFT/LCD (notebooks and newfangled flat panels) is simple, and explains why the image on LCD monitors so often looks "far worse" when you change the screen's pixel area settings. (I use the term "screen's pixel area" instead of the term "screen resolution," since the latter is a total misnomer which only spreads confusion about the nature of "resolution.")

CRT tubes do not have a fixed screen pixel area at all, whereas LCD panels do. The screen pixel area of a CRT is wholly created by the graphics card; it is only limited upward by whatever is the maximum number of horiz./vert. pixels the tube can handle. So CRTs have a "fluid" nature, with no number of horiz./vert. pixels being more "right" than any other: a CRT accepts with no problems whatever screen pixel area the graphics card throws at it.

But LCDs have a built-in fixed grid of pixels - an actual physical screen pixel area, like 1024x768. So if you change your graphics card's screen pixel area from the factory-set default 1024x768 to, say, 800x600, there is no longer any 1:1 correspondence between the panel's rigidly locked physical characteristics and the information generated by the graphics card. The screen must therefore remap the screen pixel area to the physical pixels - and this must inevitably result in jagginess. Some single screen-area pixels will be spread from one to cover two LCD pixels, and others will be squeezed down from two pixels to one. In LCDs, there are no in-between "decimals", only "integers".
So for people who like to switch between different screen pixel area settings all the time - like we designers do a lot - LCDs are a very poor choice. Personally, I go crazy when I see remapped pixels on any screen! So if we want maximum monitor quality, we're stuck with huge, bulky CRTs for - God knows how long.
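That remapping can be shown with a toy nearest-neighbour scaler (real panels filter more cleverly, but the integer-grid problem is the same):

```python
def remap_row(src_width, panel_width):
    """For each physical panel pixel, pick the source pixel it shows.
    When the widths don't match, some source pixels land on two
    panel pixels and the image goes jaggy."""
    return [i * src_width // panel_width for i in range(panel_width)]

# 8 graphics-card pixels squeezed onto a 10-pixel strip of panel:
print(remap_row(8, 10))  # [0, 0, 1, 2, 3, 4, 4, 5, 6, 7]
# source pixels 0 and 4 get doubled - there are no "decimal" pixels
```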
K
www.klausnordby.com/xara
-
Marcus: "Filesize is determined by number of pixels -- NOT by dpi."
True - but it's not ONLY determined by the number of pixels, but by the number of pixels and the COLOR DEPTH of the image: 1-bit, 8-bit, 24-bit, 32-bit etc. These are of course global values: all pixels in an image must have the same color depth - the color depth can't vary between pixels. A small but important point to fully understand a file's size in working memory.
When saved on a disk or transmitted via the Net, there is of course no correspondence between a file's size in working memory and its stored/transmitted size. Hence, that small 60 Kb JPG file we download may well swell to 10Mb when viewed in a browser - and thus bring a system with little RAM to its knees! (Impossible, you say? I have such a file open right now.)
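A back-of-envelope check on that swelling (the photo dimensions here are my own hypothetical, since the actual file's aren't given):

```python
def decoded_bytes(width_px, height_px, bits_per_pixel):
    """Decompressed in-memory size - what the browser must hold,
    however small the JPG was on disk."""
    return width_px * height_px * bits_per_pixel // 8

disk_kb = 60                          # the small file we download
ram = decoded_bytes(2400, 1600, 24)   # hypothetical photo dimensions
print(ram // 1024)                    # 11250 KB decoded, ~187x the disk file
```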
Gee, there's something about this entire resolution/file size/color depth thing which really fascinates me! :-)
K
www.klausnordby.com/xara
-
I think we're now wandering away from the (hopefully resolved) issue at hand... DPI for the web or not.
I think that's where we should stop for this thread.
Full Stop.
Michael Ward
http://LeighCenturions.net
-
quote: "I think that's where we should stop for this thread."
Come on, Michael — stop trying to play God! It's just starting to get really interesting with Klaus's latest observations about CRTs, TFT/LCDs and working memory (and I mean that, Klaus)...
Peter
Peat Stack or Pete's Tack?
-
Well I should smite you down right where you sit!!
;-)
Michael Ward
http://LeighCenturions.net
-
Yes Klaus, what's this thing about small k images consuming huge amounts of system resources??
And someday we'll have to get you guys to work explaining optimal scanner settings. That subject really seems to be beyond comprehension to so many scanner owners.
Regards, Ross
<a href=http://www.designstop.com/>DesignStop.Com</a>
-
I hate scanners that interpolate, and it's nigh on impossible to find their optical resolution.
Michael Ward
http://LeighCenturions.net