Producing A Great 8X12 With a Scanner

CCDee

I want to produce a great 8X12 photo from a 35mm negative. Is it better to
use the "native" resolutions of the scanner (3200, 1600, etc., and even
multiples of those numbers) when scanning? Or does this matter? To produce an
8X12 I calculated that 2540 DPI on the scanner will yield a 300 DPI photo.
Should I also worry about printing in the "native" output resolutions of my
printer (1440, 720, 360, 240 etc.) or does this matter? Do the two affect
each other at all? Or is all this moot? Thx.
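The arithmetic behind that 2540 figure can be sketched in a few lines of Python (assuming a full-frame 35mm negative of 36 x 24 mm, about 1.417 x 0.945 inches):

```python
# Scan resolution needed so an 8x12 print comes out at 300 pixels per inch.
FRAME_LONG_IN = 36 / 25.4    # long edge of a 35mm frame, in inches (~1.417)
PRINT_LONG_IN = 12           # long edge of the target print, in inches
PRINT_PPI = 300              # desired image pixels per inch on paper

pixels_needed = PRINT_LONG_IN * PRINT_PPI    # 3600 pixels along the long edge
scan_dpi = pixels_needed / FRAME_LONG_IN     # samples per inch of film needed

print(round(scan_dpi))  # → 2540
```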
 
Jim

Whether you can make a great 8x12 photo from a 35mm negative depends more on
the quality of the negative than anything else.

I use the native resolution of the scanner because I can't say that I will
always be satisfied with only an 8x12 image.

As for the "native" resolution of the printer, you are comparing apples and
oranges. Each dot of an inkjet usually contains one color (there are some
which feature transparent inks such that more than one color can be applied
to an individual dot). Thus, it takes at least 3 dots to make a pixel.
More dots per pixel definitely help because the printer driver can make a
smoother transition in color between pixels.

The 300 pixels per inch you mention for photo reproduction is not a hard and
fast doctrine. You'll find that it is hard to tell the difference between 240
and 300.

Jim
 
CCDee

OK so, for example, if the scanner sensor is rated at 3200 DPI what happens
if I use a custom DPI or say an odd number like 3199 DPI? My printer is an
older Epson Photo 700 rated at 1440 DPI, using the 5 color (+black) system.
If I use 240 as my output, that divides into 1440 exactly six times. So
does that mean the Epson actually prints at 240 using 6 ink dots? Would I
get a better color space on the printer if I printed at 120 DPI? Thx.
 
Wayne Fulton

OK so, for example, if the scanner sensor is rated at 3200 DPI what happens
if I use a custom DPI or say an odd number like 3199 DPI?


The scanner CCD sensor is 3200 dpi regardless, so that is all it can do, and
software will have to resample it to the requested 3199 dpi size. That's
a loss, not good, but small, and you may or may not be able to detect it.
There is no reason to do that. If you do need to resample, the photo
editor can probably resample the final image better. 8x12 inches from 35
mm is a formidable feat on a flatbed anyway, so you will want to sharpen
with USM.

Your computed 2540 dpi for 8x12 is correct enough, except that it assumes
you will print full frame. However, we often want to crop it a little
tighter for artistic reasons; cropping usually makes a better picture.
This cropping reduces the film size used, requiring higher scan resolution
to compensate.
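The effect of cropping on the required scan resolution can be sketched the same way (the 15% crop here is an assumed example figure, not anything from the thread):

```python
# Required scan dpi rises when only part of the film edge survives the crop.
def scan_dpi_for_print(print_in, print_ppi, film_in, keep=1.0):
    """keep = fraction of the film edge actually used after cropping."""
    return print_in * print_ppi / (film_in * keep)

film_long = 36 / 25.4                                 # 35mm long edge, inches
full = scan_dpi_for_print(12, 300, film_long)         # ~2540 dpi, full frame
tight = scan_dpi_for_print(12, 300, film_long, 0.85)  # ~2988 dpi, 15% cropped
print(round(full), round(tight))
```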

My printer is an
older Epson Photo 700 rated at 1440 DPI. Using the 5 color (+black) system.
If I use 240 as my output that divides into 1440 by exactly six times. So
does that mean the Epson actually prints at 240 using 6 ink dots? Would I
get a better color space on the printer if I printed at 120 DPI? Thx.

The 240 dpi rating is about image pixels, meaning a spacing of 240 image
pixels per inch of paper.

The 1440 dpi rating is about carriage motor spacing of printhead ink drops,
which is how the printer tries to reproduce the color of those pixels.

Ink drops and image pixels are different things, not the same concept.

The printer has only 6 ink colors, so to print an image pixel to be one of
16.7 million colors, it must use several ink dots to simulate the color the
best it can. For example to print one pink pixel, it has no red ink at
all, so it must use some magenta ink dots, some yellow ink dots, and some
paper space left blank to be white. Lots of ink dots per pixel, depending
on pixel color, and often they won't all fit in the pixel area, so then
error diffusion techniques overcompensate the neighboring pixels in the
opposite direction of the error.
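That error-diffusion idea can be illustrated with a toy one-dimensional sketch. This is a simplification for intuition, not the Epson driver's actual dithering:

```python
# Each pixel becomes an ink dot (1) or blank paper (0); the rounding error
# is carried to the next pixel so the average tone comes out about right.
def diffuse(row):
    out, err = [], 0.0
    for v in row:              # v: desired tone, 0.0 (white) .. 1.0 (full ink)
        v += err
        dot = 1 if v >= 0.5 else 0
        err = v - dot          # over/undershoot, compensated on the neighbor
        out.append(dot)
    return out

print(diffuse([0.4] * 5))  # → [0, 1, 0, 1, 0]: 2 dots in 5 ≈ the 40% tone
```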

These more numerous ink dots per pixel are why we need 1440 dpi mode to
print 240 dpi pixels for example. We like to imagine the ink dots are this
size, but actually it is the motor stepping that is that size, and the ink
dots are the size they are. These two values are really not related...
you'd use 1440 dpi mode for 300 dpi images too, or a 279 dpi image... a
high quality setting for high image resolution. The color of every pixel is
likely different, there can be no relationship.

The way to think of it is that the 1440 dpi setting is a quality setting,
much better for photos than a 360 dpi fast draft mode (fine vs coarse), and
it is about ink dot spacing, not pixel spacing.

The 240 dpi image resolution is the resolution of the printed image, and it
is about image pixels.

A 120 dpi image has larger pixels on paper, and it is easier to fit the
many ink dots of 6 colors into that larger area, so yes, the pixel color
can be reproduced better. However the pixel is larger and the image is
lower resolution, and that is a disadvantage. The best tradeoff is likely
to be the 240 dpi image in 1440 dpi mode on the good photo paper.
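The tradeoff is easy to see numerically. A sketch using the numbers above:

```python
# More dot positions per pixel leave more room to dither the color,
# but only because the pixels themselves are larger (lower resolution).
for ppi in (120, 240, 300):
    positions = 1440 // ppi       # horizontal dot positions across one pixel
    print(f"{ppi} ppi image: {positions} dot positions per pixel")
```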
 
CCDee

OK I'm talking about two different things. Bear with me. When I change scan
resolutions from 3200 to 1600, 800, 400, 200...the scanner takes less and
less time to scan. Does this mean the scanner is ignoring or shutting off
parts of the sensor? Is the scanner in fact using 3200 to scan everything
from 3200-1601, then using 1600 to scan everything from 1600-801,
etc., shutting down another row on the sensor? Am I optimizing CPU processor
time by using 3200, 1600 etc. instead of scanning at 3000 or 600? (BTW I do
just judicious amounts of USM depending on the subject matter).

I realize image pixels are composed of many ink drops. Is the number of drops
per pixel finite, or is this a 0-255 computer thing? The stepper motor
"steps" 720 lines per inch to make the 1440; is this 720 number a product of
the print head "jets"? Or is the step a multiple of 720? I realize this may
vary between printer technologies, let's just use the Epsons as an example.
What I'm trying to get at is two things. Optimizing my scan times and seeing
if there is a "sweet spot" on my printer that relates to the output DPI.
Thx.
 
Wayne Fulton

OK I'm talking about two different things. Bear with me. When I change scan
resolutions from 3200 to 1600, 800, 400, 200...the scanner takes less and
less time to scan. Does this mean the scanner is ignoring or shutting off
parts of the sensor? Is the scanner in fact using 3200 to scan everything
from 3200-1601, then using 1600 to scan everything from 1600-801,
etc., shutting down another row on the sensor? Am I optimizing CPU processor
time by using 3200, 1600 etc. instead of scanning at 3000 or 600? (BTW I do
just judicious amounts of USM depending on the subject matter).


Generally lower resolution will be faster of course, if that is the size
image you need for the goal, but the horizontal CCD array itself isn't much
of a speed factor; the hardware does that. But the vertical motor steps are a
big factor (3200 sampling stops on 3200 rows per inch is slower than 1600
sampling stops on 1600 rows per inch). And of course the larger data size is
slower to move in the port cable; that's a big factor too.

Yes, the scanner can easily scan at divisions like 3200 or 1600 or 800 dpi,
but it cannot actually scan at 3100 or 1703 dpi - it must resample then.
However, it is not just horizontal resampling that is a problem, it is also
the possible steps available where the vertical stepping motor can stop too.
The motor probably claims 2x, or 1/6400 inch steps, so for example 3200 dpi
is two steps per row, or four steps per row for 1600 dpi, which is easy,
but most other numbers like 3100 dpi are probably not an even motor
stepping choice, so it must do whatever it can do, like 2 steps on most rows,
which is not enough, so 3 steps on a few other rows now and then, to average
it out to 1/3100 inch as requested. Not a big factor, the motor steps are
small (especially at large low resolutions), but it is one factor.
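How a stepper might average out an uneven request can be sketched with integer arithmetic. The 1/6400 inch figure is Wayne's assumption above, and this distribution scheme is a guess at one plausible approach, not any scanner's documented firmware:

```python
# Distribute 1/6400-inch motor steps so rows average 1/3100 inch apart.
# 6400/3100 ≈ 2.06 steps per row: mostly 2-step moves, an occasional 3.
def step_pattern(steps_per_inch, rows_per_inch, rows):
    moves, taken = [], 0
    for r in range(1, rows + 1):
        target = r * steps_per_inch // rows_per_inch  # ideal position in steps
        moves.append(target - taken)                  # steps taken for this row
        taken = target
    return moves

moves = step_pattern(6400, 3100, 31)
print(moves.count(2), moves.count(3), sum(moves))  # → 29 2 64
```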

We know it cannot do 3100 dpi horizontally either, so there are different
ways it might resample horizontally. Just using every second or every fourth
column is one quick way, and some do that and don't offer any other choices.
This will match the motor steps too. Or some do bilinear interpolation
horizontally, probably from the next higher integer divisor. They usually
don't specify, and overall, we can often do better resampling in our photo
editor. Like scan at 800 dpi and resample to 700 dpi yourself if necessary
(that 700 dpi would be an image size, not a printing resolution). You may or
may not see a difference; it's not a huge factor, and it may depend on how
fussy you might be, and how well it does it. Note this may give different
answers in two different resolution cases.
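The two horizontal resampling approaches described, column-picking versus interpolation, look like this in a toy one-dimensional sketch (not any scanner's actual code):

```python
# Shrink a row of samples, e.g. 8 columns (800 dpi) down to 7 (700 dpi).
def pick_columns(row, n):
    """Nearest-column picking: fast, but simply drops detail."""
    m = len(row)
    return [row[i * m // n] for i in range(n)]

def linear(row, n):
    """Linear interpolation in one dimension: smoother blend of neighbors."""
    m = len(row)
    out = []
    for i in range(n):
        x = i * (m - 1) / (n - 1)          # position in the source row
        lo, t = int(x), x - int(x)
        hi = min(lo + 1, m - 1)
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

row = [0, 10, 20, 30, 40, 50, 60, 70]
print(pick_columns(row, 7))   # → [0, 10, 20, 30, 40, 50, 60]
print(linear(row, 7)[3])      # → 35.0 (halfway between 30 and 40)
```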

The 2400/3200 dpi class of flatbed scanners when used at the highest
resolutions are simply not very sharp, and will normally need quite a bit
more USM sharpening than at lower resolutions. Do that sharpening last after
all other adjustments. This is going to be your biggest problem trying to
print 8x12 from 35 mm film on a flatbed. Again, some people are fussier than
others.

I realize image pixels are composed of many ink drops. Is the number of drops
per pixel finite, or is this a 0-255 computer thing? The stepper motor
"steps" 720 lines per inch to make the 1440; is this 720 number a product of
the print head "jets"? Or is the step a multiple of 720? I realize this may
vary between printer technologies, let's just use the Epsons as an example.
What I'm trying to get at is two things. Optimizing my scan times and seeing
if there is a "sweet spot" on my printer that relates to the output DPI.
Thx

Hopefully finite. <g> But not known nor constant. The 0..255 color values
are the possible colors of the pixel (in one channel of RGB). If the pixel
color is exactly black, you do have exactly black ink, and conceivably might
need only one ink drop (or enough to fill a pixel area, but color dithering
is no issue then). If the pixel color is not exactly one of the CMYK
colors, like red, green or blue pixels are not, then several ink drops will
be needed to simulate the color, even roughly. Pixels vary in color, and
there is no such relationship that will be useful to us, nor known to us.
I'm glad we don't have to worry about it. <g>

The printer has two motors, like 1440x720 dpi. The 1440 dpi rating is the
carriage motor moving the print head horizontally. The 720 dpi rating is the
paper stepping motor moving the paper vertically. These are the locations it
can center an ink drop. The ink drop may be larger than the smallest grid we
imagine, so again, nothing is known to us. The quality setting in your
printer driver properties, 360 dpi fast draft mode, 720 dpi, or 1440 dpi slow
high quality mode, is a combination of using these locations, as best it can
do it.

The general sweet spot for printing photo quality is easy. Use one of the
better quality printer settings like 1440 dpi mode, and sharpened images of
240 to 300 dpi (thereabouts; it is not at all critical that it be exactly
240 dpi, 261 dpi is fine too, and 180 dpi is often pretty fair). And of
course use the
recommended photo paper, with appropriate setting for that type of paper
(which controls the amount of ink used). It's not that hard.
 
Jim

CCDee said:
OK so, for example, if the scanner sensor is rated at 3200 DPI what happens
if I use a custom DPI or say an odd number like 3199 DPI?
An arbitrary number of pixels per inch requires added computation in either
the scanner or your computer. I never reduce the resolution when I am
changing print sizes.
My printer is an
older Epson Photo 700 rated at 1440 DPI, using the 5 color (+black) system.
If I use 240 as my output, that divides into 1440 exactly six times. So
does that mean the Epson actually prints at 240 using 6 ink dots?
Yes. It is actually a 6x3 matrix of dots per pixel.
Would I
get a better color space on the printer if I printed at 120 DPI? Thx.
No. You get a much worse print because the pixels are too big.

I have retired my old Photo 700 now that I own a Photo 1280. The Photo 1280
makes much better prints.

Jim
 
Jim

CCDee said:
I realize image pixels are composed of many ink drops. Is the number of drops
per pixel finite, or is this a 0-255 computer thing? The stepper motor
"steps" 720 lines per inch to make the 1440; is this 720 number a product of
the print head "jets"? Or is the step a multiple of 720? I realize this may
vary between printer technologies, let's just use the Epsons as an example.
What I'm trying to get at is two things. Optimizing my scan times and seeing
if there is a "sweet spot" on my printer that relates to the output DPI.
The number of drops per pixel is not related to the contents of the byte
which defines the shade of the color.

The drops per inch of the printer are specified as horizontal resolution x
vertical resolution. The horizontal resolution is controlled by the number
of ink jets. The vertical resolution is controlled by a stepper motor.
Thus, for the Epson Photo 700, the resolution can be specified as 1440x720.
That means that the printer uses enough of the jets to lay down a line of
dots at 1440 dots per inch. It then steps to the next line at the rate of
720 lines per inch.
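The 6x3 matrix Jim mentions earlier in the thread falls out of dividing these two resolutions by the image resolution. A quick arithmetic sketch:

```python
# Dot positions inside each 240 ppi image pixel on a 1440x720 dpi printer.
h = 1440 // 240   # 6 horizontal dot positions per pixel (carriage direction)
v = 720 // 240    # 3 vertical dot positions per pixel (paper-feed direction)
print(f"{h}x{v} = {h * v} dot positions per pixel")  # → 6x3 = 18 ...
```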

The sweet spot on the Epson Photo 700 (for the best print it can make) is
1440x720. Of course, if you specify a lower resolution, it prints faster at
the cost of worse appearance.

Jim
 
