Scan Resolution


DNT

In my store I print up to 13x19 with an Epson Stylus Photo 2200. I scan
negs or prints with an Epson Perfection 4870 Photo. A couple of days
ago I printed an 11x14 from a 35mm neg that I scanned at 2400 dpi. The
customer said the print was soft, so I scanned the neg again at 4800
dpi. Of course the image was much sharper. That makes sense, but here's
my question: When someone orders a 16x20 I have to outlab it. The tech
at the other lab told me to scan a 35mm neg at 300 dpi for a 16x20.
Does anyone know why 300 would work for them on a 16x20, but 2400
doesn't work for me on an 11x14? I learned a formula at one point which
is supposed to give you the resolution you need for any size print.
300 x the short side of the print. So if I want to print an 8x10 I
would multiply 8 x 300 and scan at 2400. Is anyone familiar with this
formula, or know of any others? Thanks.
 

Scott W

DNT said:
SNIP
The tech at the other lab told me to scan a 35mm neg at 300 dpi for a
16x20. Does anyone know why 300 would work for them on a 16x20, but
2400 doesn't work for me on an 11x14?

He must have meant to scan so that the final image would be 300 ppi
when printed.
There are limits to how much detail you can get off a negative, but if
you wanted to make a 16x20 print at 300 ppi, you would need to scan a
35mm negative at around 5080 ppi. The problem is that a scan at that
resolution will be soft. A 16x20 print from 35mm is going to be soft
any way you do it.
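
To put numbers on that, here is a minimal sketch in Python (assuming a
standard 24x36mm frame, about 0.945 inches on the short side; the
function name is just for illustration):

# Scanner resolution needed so the print's short side comes out at 300 ppi.
FILM_SHORT_IN = 24 / 25.4  # short side of a 35mm frame, ~0.945 in

def scan_ppi(print_short_in, print_ppi=300):
    pixels_needed = print_short_in * print_ppi  # pixels the print requires
    return pixels_needed / FILM_SHORT_IN        # spread over the film's short side

print(scan_ppi(16))  # ~5080 ppi for a 16x20
print(scan_ppi(11))  # ~3493 ppi for an 11x14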

Scott
 

Wayne

DNT said:
The tech
at the other lab told me to scan a 35mm neg at 300 dpi for a 16x20.
Does anyone know why 300 would work for them on a 16x20, but 2400
doesn't work for me on an 11x14? I learned a formula at one point which
is supposed to give you the resolution you need for any size print.
300 x the short side of the print. So if I want to print an 8x10 I
would multiply 8 x 300 and scan at 2400. Is anyone familiar with this
formula, or know of any others? Thanks.


Your formula is only correct if you are scanning one inch. The image
area of a 35mm frame is a bit less than an inch on the short side,
about 0.94 inches, so the formula is only an approximation - probably
close enough for 35mm film, but not exact.

The correct formula is that if you want to print an 8 inch dimension at
300 dpi, then you require 8 x 300 = 2400 pixels in that dimension.

If you are scanning one inch, then yes, 2400 dpi does create 1x2400
= 2400 pixels. But if you are scanning 2 inches, then 1200 dpi will
create 2x1200 = 2400 pixels. Or if scanning 1/2 inch, then 4800 dpi
creates 0.5x4800 = 2400 pixels. In all three cases, the resulting 2400
pixels will print 8 inches at 300 dpi.... 2400 pixels / 8 inches = 300
dpi.

The lab tech's instruction to scan for 300 dpi at 16x20 means 300 dpi
"at the output size", which is just another way to specify that you
need 16 inches x 300 dpi = 4800 pixels. The pixels are what is
important.
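
A few lines of Python make the same point (a sketch; the three scan
widths are just examples):

# Different scan widths and dpi settings can produce the same 2400 pixels,
# and 2400 pixels always print 8 inches at 300 dpi.
for width_in, scan_dpi in [(1.0, 2400), (2.0, 1200), (0.5, 4800)]:
    pixels = width_in * scan_dpi
    print(f"{width_in} in at {scan_dpi} dpi -> {pixels:.0f} px "
          f"-> {pixels / 8:.0f} dpi over 8 inches")

All three cases come out to 2400 pixels and 300 dpi; only the pixel
count matters.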
 

DNT

What's the difference between dpi and ppi?
Wayne said:
SNIP
you need 16 inches x 300 dpi = 4800 pixels. The pixels are what is
important.
 

Wayne

DNT said:
What's the difference between dpi and ppi?


No difference if referring to printing digital images (the only
difference is perhaps who says it). Two names for the same thing. If
about images, both terms mean pixels per inch - the spacing of the
image pixels on paper when printing. Regarding digital images, or image
files, this is the only possible meaning it can have.

There is also a very different use of dpi for printers, about how the
printer spaces its ink dots on paper trying to replicate each image
pixel. That is indeed a different concept. We might set the printer to
1440 dpi or 2400 dpi (a high quality setting) to print our 300 dpi
image (pixel spacing). If about images, dpi means pixels per inch. If
about printers, it doesn't.
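
As a rough illustration of the two numbers (a sketch; real printer
drivers dither, so "dots per image pixel" is only a nominal figure):

# Image dpi (pixel spacing on paper) vs printer dpi (ink-dot spacing).
image_ppi = 300      # how the image pixels are spaced on the paper
printer_dpi = 1440   # how the printer spaces its ink dots
print(printer_dpi / image_ppi)  # 4.8 ink dots per image pixel, per dimension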

It is easy to start a usenet fight about the proper term, and one might
appear here now from others. I might as well address that now. Lately
some want to demand that everyone say ppi for images and dpi for
printers, which is fine if you wish to say ppi, but such insistence for
others is just wasted time, because it ignores that the proper name has
always been dpi, and still is. For example, scanner manufacturers
typically say dpi. There are no ink dots in scanners, but scanners are
rated in dpi when it is always about pixels per inch. That is because
dpi has simply always been the name for it (regardless of what some
newbies may imagine when first grasping the concepts). We will always
hear dpi, so my point is that it seems better to understand its use
than to fight it.

Ppi is a fine term too, perhaps even better, but if one insists on
fighting the use of dpi, it simply means they won't be able to
understand half of what they might read. I'm one of the old geezers, so
there is no hope for me... I've said dpi for years, and will continue,
because to me, ppi just doesn't have the same ring to it. But ppi is
fine too, and is used too, and both mean pixels per inch - if about
images.

My own stand is NOT that dpi is right and ppi less so. I agree that ppi
might be the better term now, but dpi certainly has been in use years
longer. Both terms are fine and equal (if about images), so use whichever
you prefer to use. My own actual stand is just that we definitely ARE
going to hear it both ways everywhere, so like it or not, it is necessary
to always understand the meaning when we do hear it. The meaning comes
from the context. There is no ambiguity - if the context is about spacing
image pixels on paper, regardless if we prefer dpi or ppi, the only meaning
it can possibly have is pixels per inch.
 

CSM1

A square 120 negative is a 2.25 inch by 2.25 inch image.
So to scan for a 16 x 20 inch print, you calculate from the 20 inch
dimension. Assume printing at 300 DPI.

20 inches times 300 DPI = 6000 pixels.
You want 6000 x 6000 pixels (You will have to crop the image in the 16 inch
dimension).
To get 6000 pixels from a 2.25 inch negative, you scan at 6000/2.25 = 2667 ppi.

So the scanner resolution should be 2667 ppi. To have the scan
resolution evenly divisible by 8, scan at 2672 ppi.

An 11 x 14 print needs 300 x 14 = 4200 pixels in the long direction.
So scan at 4200/2.25 = 1867 ppi, or 1872 ppi to divide evenly by 8.

Again you crop the short dimension.
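
The same arithmetic as a small Python helper (a sketch; rounding up to
a multiple of 8 is just the convention above):

import math

# Scan resolution for a square 2.25 inch (120) negative, rounded up
# so the figure divides evenly by 8.
def scan_ppi_120(print_long_in, print_ppi=300, film_in=2.25):
    ppi = print_long_in * print_ppi / film_in
    return math.ceil(ppi / 8) * 8

print(scan_ppi_120(20))  # 2672 for a 16x20
print(scan_ppi_120(14))  # 1872 for an 11x14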
 

Surfer!

CSM1 said:
SNIP
Again you crop the short dimension.

Of course, if you want to crop more than just the shorter dimension,
you need to scan at a higher resolution...
 

Bart van der Wolf

Wayne said:
SNIP
Ppi is a fine term too, perhaps even better, but if one insists on
fighting the use of dpi, it simply means they won't be able to
understand half of what they might read. I'm one of the old geezers, so
there is no hope for me...

Sure there is, don't give up ;-)

Wayne said:
I've said dpi for years, and will continue, because to me, ppi just
doesn't have the same ring to it. But ppi is fine too, and is used too,
and both mean pixels per inch - if about images.

To me, also an old geezer (depending on the definition of old), the
dots concept is solidly embedded in the (process) printing industry,
where traditionally ink dots are used to simulate continuous tones
with halftones (e.g. black ink or nothing, i.e. paper color). Small dots
simulate lighter tones, and bigger dots simulate darker tones.

When discussing discrete sampled imaging I use the term PPI, both in
capture (camera or scan) and in output (mainly inkjet or DyeSub
printing and film recorders). Inkjet printers dither the pixels in
order to simulate continuous tones.

Bart
 

Wayne

Bart van der Wolf said:
When discussing discrete sampled imaging I use the term PPI, both in
capture (camera or scan) and in output (mainly inkjet or DyeSub
printing and film recorders). Inkjet printers dither the pixels in
order to simulate continuous tones.


To each his own, Bart; it is a fine preference, but not a rule.

I try to say ppi now and then, but after all these years, "72 ppi"
sounds strange to me. That is just my own problem perhaps, but ...
"72 dpi" gets 12,100,000 hits on Google.
"72 ppi" gets 124,000 hits.
An order of magnitude in popular usage merits considerable attention.

Scanner manufacturers do of course use dpi ratings - meaning pixels per
inch. Continuous tone printers use dpi ratings too (dye subs and Fuji
Frontier chemical type), again referring to the pixels they can print.
I am not speaking of you of course, but it seems suspect for some to
stand up and call all of them wrong. Seems much better to try to
explain actual real world usage, such as it is, and always has been.

My argument is NOT at all about which is more correct, even if there is
much obvious evidence to support dpi. Either one is quite fine with me,
simply doesn't matter to me. Preferences are fine, and I understand it
either way.

My only argument is that beginners had better get it into their heads
early on to understand it both ways, however they hear it, because we
will hear it both ways, and we need to understand it either way, even
if we choose to say only our own preference. It is just how things are.

My experience is that the "right" term does not help newbies to
understand. They still very much need the concept. And if they have
the concept, then the term is superficial and doesn't matter... the
context speaks for itself.
 

catfish

Wayne said:
I try to say ppi now and then, but after all these years, "72 ppi"
sounds strange to me. That is just my own problem perhaps, but ...
"72 dpi" gets 12,100,000 hits on Google.
"72 ppi" gets 124,000 hits.
An order of magnitude in popular usage merits considerable attention.

Not to quibble, but that looks like two orders of magnitude to me.
 
