Scanning Film


measekite

I have an Epson 4180 flatbed film scanner that I am happy with. I am
scanning Fuji negative film at 24-bit, 3200 dpi. I then edit and crop
the results in Photoshop. Sometimes I crop a horizontal photo to print
as a portrait. In that instance the cropping is severe.

I want to be able to print at least as large as 8x10 and occasionally 11x14.

The scanning takes a long time. I also have to reduce the pixel count
to print on my 1200 dpi Canon IP4000 printer.

QUESTION: Is it necessary to scan at 3200 dpi, or can I get the same
results with much lower dpi? If so, what is the best dpi to scan at without
losing any quality in the above situation?
 

Wayne Fulton

measekite said:
I have an Epson 4180 flatbed film scanner that I am happy with. I am
scanning Fuji negative film at 24-bit, 3200 dpi. I then edit and crop
the results in Photoshop. Sometimes I crop a horizontal photo to print
as a portrait. In that instance the cropping is severe.

I want to be able to print at least as large as 8x10 and occasionally 11x14.

The scanning takes a long time. I also have to reduce the pixel count
to print on my 1200 dpi Canon IP4000 printer.

QUESTION: Is it necessary to scan at 3200 dpi, or can I get the same
results with much lower dpi? If so, what is the best dpi to scan at without
losing any quality in the above situation?


Yes, you say you are printing 8x10 and 11x14 inches from extremely cropped 35
mm film, so yes, you need all of 3200 dpi. More would likely be even better.

Digital images are dimensioned in pixels, and the actual requirement to print
8x10 inches at say 300 dpi is that you need
(8 inches x 300 dpi) x (10 inches x 300 dpi) = 2400 x 3000 pixels.

Another way to look at it is that the ratio of
(scanning resolution / printing resolution) is the enlargement factor.

8x12 inches is about 9x the size of full frame 35 mm film,
meaning, scan at 2700 dpi, print at 300 dpi, for 2700/300 = 9x size.

But if you are cropping, then the enlargement is much greater than 9x.
Regardless, to print 8x10 inches at 300 dpi, you need 2400 x 3000 pixels.
No matter the starting point, you need to have 2400x3000 pixels left for this
goal of printing 8x10 at 300 dpi. If you don't have that much left, then you
can only print it at lower quality than 300 dpi.

And to print 6x4 inches at 300 dpi needs 1800x1200 pixels.

That is a smaller requirement, except that if the 6x4 inches is cropped from
half the film width and half the film length, then the enlargement is double
that needed to print 8x12 inches from the full frame, meaning you need the
same starting point (number of pixels) to be able to crop this extremely and
still end up with 1800x1200 pixels.

You didn't mention pixels, and I fear you may not be thinking in pixels,
but pixels are what it is about; pixels are all there is. When we print at say
300 dpi, it means that we need 300 pixels per inch of dimension.
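
(For anyone who wants to plug in their own numbers, here is a minimal sketch of that pixel arithmetic in Python. The print sizes, print resolution, and crop fraction are only example values, not part of the original advice.)

```python
# Minimal sketch of the pixel arithmetic above (example numbers only).

FRAME_LONG_IN = 36 / 25.4   # full-frame 35 mm film is nominally 36 x 24 mm

def pixels_needed(print_w_in, print_h_in, print_ppi=300):
    """Pixels required to print a given size at a given pixels per inch."""
    return round(print_w_in * print_ppi), round(print_h_in * print_ppi)

def scan_dpi_needed(print_len_in, film_len_in, print_ppi=300):
    """Scan resolution needed so the (cropped) film length still yields
    enough pixels: scan dpi = print ppi x enlargement factor."""
    return print_ppi * (print_len_in / film_len_in)

print(pixels_needed(8, 10))                    # (2400, 3000) for 8x10 at 300 dpi

# Full frame printed 12 inches long: ~2540 dpi by this nominal math
# (the thread rounds it to about 9x, i.e. scan 2700 dpi, print 300 dpi).
print(round(scan_dpi_needed(12, FRAME_LONG_IN)))

# Severe crop: only half the frame length used, still printed 12 inches long.
print(round(scan_dpi_needed(12, FRAME_LONG_IN / 2)))   # ~5080 dpi
```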
 

Brooks Moses

Wayne said:
You didn't mention pixels, and I fear you may not be thinking in pixels,
but pixels are what it is about; pixels are all there is. When we print at say
300 dpi, it means that we need 300 pixels per inch of dimension.

One could quibble about the distinction between pixels-per-inch (ppi)
and dots-per-inch (dpi), if one wanted to be really pedantic here. And
it may even be relevant; printing on a 300dpi printer means that we're
printing at much less than 300ppi, since the printer uses several dots
to print each pixel -- though I presume you meant printing at 300ppi.

- Brooks
 

Alan Gauld

measekite said:
I then edit and crop the results in Photoshop. Sometimes I crop a horizontal
photo to print as a portrait. In that instance the cropping is severe.

I want to be able to print at least as large as 8x10 and occasionally 11x14.
QUESTION: Is it necessary to scan at 3200 dpi, or can I get the same
results with much lower dpi? If so, what is the best dpi to scan at without
losing any quality in the above situation?

Let's do the math for your worst case...

Assume at least 200 dpi for printout.
11x14 = 2200x2800 pixels.
Assume you crop your portrait picture to the full height of the
35mm original.
You need at least 2800 pixels across that short (24 mm) side of the film.

So yep, 3200 looks like a safe setting.
Try it at 300 dpi for printing and 3200 isn't even close...
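
(A quick check of that worst case in Python, assuming the portrait crop spans the full 24 mm height of the 35 mm frame; the helper name and numbers are only illustrative.)

```python
# Worst case: portrait crop using the full 24 mm height of the 35 mm frame,
# printed 14 inches tall (11x14). Example numbers only.

FILM_SHORT_SIDE_IN = 24 / 25.4        # 24 mm is roughly 0.94 inch

def scan_dpi_for(print_len_in, film_len_in, print_dpi):
    """Scan resolution needed so film_len_in yields print_len_in * print_dpi pixels."""
    pixels_needed = print_len_in * print_dpi
    return pixels_needed / film_len_in

print(round(scan_dpi_for(14, FILM_SHORT_SIDE_IN, 200)))   # ~2963: 3200 is just enough
print(round(scan_dpi_for(14, FILM_SHORT_SIDE_IN, 300)))   # ~4445: 3200 isn't even close
```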

Alan G.

Author of the Learn to Program website
http://www.freenetpages.co.uk/hp/alan.gauld
 

measekite

I have been printing at 1200 dpi on my Canon IP4000. Are you saying
that I can get just as good results (after scanning at 3200dpi) by
printing at 300 dpi instead of 1200??
 

Wayne Fulton

measekite said:
I have been printing at 1200 dpi on my Canon IP4000. Are you saying
that I can get just as good results (after scanning at 3200dpi) by
printing at 300 dpi instead of 1200??

Absolutely. Try it and you'll see. However, scanning at 3200 dpi and
printing at 300 dpi will give 3200/300 = 10.7x enlargement, about 10x15 inches
from full frame 35 mm, which won't be very appropriate for a 6x4 inch
print, unless you crop the daylights out of it. You need more like 4.4x
enlargement for 6x4 from full frame 35 mm, and of course more if you crop
much.

Let's be sure we are on the same page however. When I say "print at 300 dpi",
I mean that to print 6 inches, you must have (6 inches x 300 dpi) = 1800
pixels in that image dimension. 1800 pixels will print 6 inches at 300 dpi.
This 300 dpi value is scaled (or set) in a photo editor program. That resize
dialog shows you have 1800 pixels, and you might specify either the 6 inches
or the 300 dpi there, and it does the division to compute the other. Then you
print it at 300 dpi. The image properties will then show this 300 dpi value,
which is how the printer gets 6 inches from 1800 pixels. That is the image
property I am referring to.

And very important: you must still have 1800 pixels after this resize if you
are scaling instead of resampling. Photo programs do vary, but many of them
today have a RESAMPLE check box in that resize dialog, which you turn off to
scale instead of resample. This is very basic and extremely important; be
certain that you understand the difference between scaling and resampling. It
is the one required fundamental fact about printing digital images.
My site below might help.
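
(If it helps to see the two operations side by side, here is a minimal sketch using the Python Pillow library; the file names and pixel counts are placeholders, and a photo editor's resize dialog does the same two things.)

```python
# Scaling vs resampling, sketched with Pillow. File names are placeholders.
from PIL import Image

img = Image.open("scan.tif")
print(img.size)                      # e.g. (1800, 1200) pixels

# SCALING: the pixel count is untouched; only the dpi number stored with the
# file changes, so 1800 pixels will print as 1800 / 300 = 6 inches.
img.save("scaled_for_6x4.tif", dpi=(300, 300))

# RESAMPLING: the pixel count itself is changed (here reduced to 900 x 600),
# which permanently throws detail away.
smaller = img.resize((900, 600), Image.LANCZOS)
smaller.save("resampled.tif", dpi=(300, 300))
```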

However, the printer driver Properties also has a box for dpi, which has a
different meaning of dpi. I am not speaking of that, but I was not sure if
you were or not, hence all of this. That printer dpi rating Property is about
ink dots instead of image pixels. The 1200 dpi is in that context, but the 300
dpi is in the first context above. In the printer driver Properties, you
might be able to set maybe 300 dpi to select a fast draft mode (but poor
quality), or a slower but better 1200 dpi photo quality mode. This setting is
about the print quality. So I don't mean that.

You do want to print a 300 dpi image at maybe a 1200 dpi printer
quality setting, assuming you want photo quality.

Because the 1200 dpi printer rating is not about pixels, but instead about
where the printer can position its ink dots of 3 or 5 colors of ink. 24-bit
pixels can have up to 16.7 million colors. One of those possible 16.7 million
colors for one pixel might be pink. You don't have any pink ink. You don't even
have any red ink. So pink must be simulated by several ink dots of only 3 or
5 colors of CMYK ink, so that the total appearance averages out near the
correct color. Because of this need for several ink dots per pixel, there is
no way a 1200 dpi printer can reproduce a 1200 dpi color image properly.
However (the one exception), it could print 1200 dpi line art images, because
line art is only two colors, black or white, and the printer has ink that is
exactly black, and the paper is exactly white, so that one ink dot can
represent one pixel in line art. But no way for color.
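
(To make the "several ink dots per pixel" idea concrete, here is a toy ordered-dither sketch in Python. It is not the Canon driver's actual halftoning, just an illustration of how one gray pixel value can become a 4x4 pattern of on/off dots.)

```python
# Toy illustration: one grayscale pixel is reproduced as a 4x4 cell of on/off
# dots (ordered dithering with a Bayer matrix). Real drivers use far more
# sophisticated halftoning, and repeat it per ink color.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_pixel(gray):
    """gray: 0 (black) .. 255 (white). Returns a 4x4 cell: '#' = ink, '.' = paper."""
    level = gray / 255 * 16          # how much of the cell should stay paper-white
    return [
        ["." if BAYER_4X4[y][x] < level else "#" for x in range(4)]
        for y in range(4)
    ]

# A mid-gray pixel: roughly half of the 16 dot positions get ink.
for row in dither_pixel(128):
    print("".join(row))
```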

Sorry to be wordy, and I'm not even sure if this was related to your question,
but the short of it is that the term dpi has two standard meanings:

If about image resolution, dpi means pixels per inch. This is a very old
term, and it has always been true, since long before ink jets had color ink to
worry about. Newcomers often don't know this, so recently dpi is also being
called ppi to aid understanding, and that's fine, it is very descriptive.
However, you will see dpi everywhere, and ppi in only relatively few places.
So we must understand it both ways. Dpi and ppi have exactly the same
meaning in this context (about images, not printers).

If about printer ratings, dpi means ink drops per inch, of one color of ink.
That is something entirely different, not about pixels at all.

One knows the difference by the context in which it is used. It is not
difficult, it always means the only thing it can mean, in context.

Hope that helps.
 

Kennedy McEwen

Wayne Fulton said:
Absolutely. Try it and you'll see.

Err - I don't think so Wayne!

I know that you know that printing at 300 "dots-per-inch" makes the ink
dots very visible indeed. ;-)

I know that you know that Measekite will get vastly superior results
printing at 1200dpi than printing at 300dpi. ;-)

However, he will see very little difference printing at 300ppi
(PIXELS-per-inch) compared to printing at 1200ppi and I think that is
what you mean, but it isn't what Measekite asked.

I know that you know the difference between the two, and Brooks already
raised the distinction in this thread, but his question clearly shows
that Measekite doesn't understand that distinction. It won't help his
understanding to further confuse the terms. He should continue to
print at 1200dpi on the printer because that is the minimum decent
setting for photo quality from his printer. However, there is generally
no value in him printing at more than 300ppi.
 

Wayne Fulton

Arguing the semantics of the term isn't going to get it done, Kennedy, since
the term has multiple definitions. I wasn't sure which way measekite meant
it, so that's why I tried to explain it both ways.
 

Kennedy McEwen

Wayne Fulton said:
Arguing the semantics of the term isn't going to get it done, Kennedy, since
the term has multiple definitions.

It only has one definition, but it is frequently misused, even by
manufacturers and software writers, and hence often misunderstood by
users. That is why it is important that we try to get it right when
explaining the issue to those who are new to the game and only have the
instruction manual for one particular piece of equipment to go on.

Dots are not pixels and pixels are not dots, it is as simple as that.
You can call it what you like, but avoiding those semantics is simply
promoting continued confusion - otherwise you might as well use german
terminology in one paragraph, greek in the next and swahili in a third.
What you call semantics is actually what makes a common language useful.
 

Bart van der Wolf

Kennedy McEwen said:
It only has one definition, but it is frequently misused, even by
manufacturers and software writers, and hence often misunderstood by
users. That is why it is important that we try to get it right when
explaining the issue to those who are new to the game and only have
the instruction manual for one particular piece of equipment to go
on.

Dots are not pixels and pixels are not dots, it is as simple as
that. You can call it what you like, but avoiding those semantics is
simply promoting continued confusion - otherwise you might as well
use german terminology in one paragraph, greek in the next and
swahili in a third. What you call semantics is actually what makes a
common language useful.

I fully agree with that. Pixels are not dots. Life will become so much
easier if terminology is used correctly. It will help newbies in
understanding, and it'll help the more experienced posters by not
having to explain it over and over again (at least not as often).

There are samples and pixels (SPI or PPI) on the one side, and on the
other side there are (multiple) printer (ink) dots (DPI) that are used
to simulate intermediate ink colors for each pixel (through a process
called dithering). The samples or pixels define the spatial resolution
limit, the dots define intermediate (ink) color accuracy.
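
(A quick bit of arithmetic to put numbers on that distinction, assuming a 1200 dpi printer fed a 300 ppi image; the values are only examples.)

```python
# How many printer dot positions fall inside one image pixel (illustrative only).
printer_dpi = 1200    # ink dot positions per inch, per ink channel
image_ppi = 300       # image pixels per inch sent to the printer

dots_per_pixel_side = printer_dpi // image_ppi
print(dots_per_pixel_side ** 2)   # 16 dot positions available to dither one pixel
```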

Bart
 

Wayne Fulton

It only has one definition, but it is frequently misused, even by
manufacturers and software writers, and hence often misunderstood by
users.

That is simply wrong Kennedy, or wishful thinking, or head in the sand,
something along those lines (if not all of them). I know you know better,
but sometimes you’re stubborn. :)

Dpi is technical jargon no doubt, but it is very well established jargon
from the prepress industry, which used to be the only people interested in
this. Yes, a pixel is conceptually a kind of dot. Not an ink dot of course,
but a pixel is certainly a conceptual colored dot, and nothing other than a
conceptual colored dot. Image resolution has always been called dpi, for
forever, for years before ink jets could print colored ink dots. Like it
or not, dpi is simply the long established name for the concept of image
resolution, meaning pixels per inch.

We used to NEVER hear the term ppi at all (not so very long back either). It
is true that ppi is more descriptive, true also that in the last few years
that photo editor software has generally switched from dpi to ppi, but my
point is that beginners simply MUST understand both terms, simply because
they are always going to see it both ways, simply because the name for the
term is dpi.

PPI is a fine term too, nothing at all wrong with it, but there are many
contexts where it just sounds so dumb ... a 2400 ppi scanner, or arguments
about 72 ppi for video, etc., when ppi is just blatantly out of place, at
least in any historical perspective.

Just search google for the terms: (two discrete terms, not phrases)

image dpi 2,220,000 links
image ppi 292,000 links

I'd call that a full order of magnitude of difference in preference for dpi.
Dpi seems amazingly well established to me. :) The only surprising thing is
the amount of ground that ppi has gained recently. The difference used to be
much greater than this. Like it or not, these numbers definitely indicate
that your view may not be the widely honored view.

And ppi is a fine term too, no argument about that. But there are two terms,
and two definitions, and to tell beginners it is one way when the rest of the
world is another way is what is wrong to me. I think they should instead be
told how things really are, that there are two terms, and about what they can
expect to see. The understanding should be taught.

Scanners scan only pixels, and of course ink dots are totally unknown to
scanners, and yet published scanner ratings are always dpi, meaning pixels
per inch. dpi means pixels per inch. All those scanner manufacturers are
wrong because Kennedy says so? The context in this group is scanners.

Dye sub printers and Fuji Frontier type printers don't print discrete ink
dots either, their surface area is continuous tone, but their published
ratings are always also dpi, meaning pixels per inch. dpi means pixels per
inch. All those manufacturers are wrong because Kennedy says so? Frankly,
it's the ink jets that have abused the established terms and definitions.

BTW, what is the generic name for the Fuji Frontier type of printer? The
type must have some generic identification.

Microsoft also calls it dpi for their logical inches in Windows (like 120 dpi
large fonts), and there are no ink dots there. That is about pixels per inch,
even if their inches are imaginary (the pixels are real). dpi means pixels
per inch. Microsoft is wrong because Kennedy says so? Frankly, you may be
right that time; I think they don't get it sometimes either. :)

That is four specific examples that show you are wrong Kennedy. For you to
imagine that your own personal preference is the law of the land seems
extremely pretentious.
 

RSD99

"Wayne Fulton" posted:
"...
For you to
imagine that your own personal preference is the law of the land seems
extremely pretentious.
...."


True ...

But "SOP"
 

Kennedy McEwen

Wayne Fulton said:
That is simply wrong Kennedy, or wishful thinking, or head in the sand,
something along those lines (if not all of them).

No it isn't - it is perfectly correct. Pixels are not dots and dots are
not pixels and arguing that they are on the basis of common misuse is
wishful thinking and keeping your head in the sand.

Dpi is technical jargon no doubt, but it is very well established jargon
from the prepress industry, which used to be the only people interested in
this. Yes, a pixel is conceptually a kind of dot.

No it isn't. Making this mistake is fundamental to your continuation of
confusion in others.

Pixels are not dots and dots are not pixels.

It has no relevance to your argument that the term dots-per-inch existed
before pixels-per-inch: half tone printing existed long before
discretely sampled images! In fact, it is only a few weeks ago that I
was looking at some images in a museum that were clearly halftone
processed and printed long before image sampling or pixelation was ever
practical.

That is four specific examples that show you are wrong Kennedy.

Not one of which is a definition, merely rough references to product
descriptions where the precise meaning is less important than its
reference to a context with which potential users will be familiar. The
convenient reference to familiar terms does not make them the correct
terms. However that is precisely the opposite of the issue at hand,
where you are misusing the term "dot" in a context in which THE
potential user has ALREADY demonstrated they have confused it with
"pixel"! I seriously believe that your, and others, use of what is
blatantly the wrong descriptive term is one of the main reasons why this
single point of confusion recurs so frequently in this and other groups.
Confusion encourages the myth that this is all some black art or
something which can only be understood by experts, if anyone, when in
fact it is all very simple if described precisely.

I know that you are aware that almost all scanner manufacturers misuse
terms in their product descriptions, for example quoting optical
resolution when the resolution of the scanner optics is grossly worse,
and we all know of Microsoft's reputation for adhering to well defined
standards, so resorting to such examples merely demeans your argument
and yourself.

Further, since you expressed concerns about teaching, it is better to
teach the correct term together with an awareness that these terms are
often misused - it is absurd to teach the wrong term just because others
who know little and care less misuse it! Promoting and encouraging
confusion is not teaching anything!

For you to
imagine that your own personal preference is the law of the land seems
extremely pretentious.

It isn't "my personal preference" - a pixel and a dot have completely
different definitions. To begin with, one is physical and the other
completely abstract. Dots can be used to create a physical
manifestation of abstract pixels, but they rarely map directly onto each
other.

Dots physically exist on your monitor screen, the output of your half
tone inkjet printer, your continuous tone printer and even the Fuji
Frontier photo printer (yes - examine the output under a microscope and
it has contiguous dots, which may, or may not, map exactly to the image
pixels). In all of those cases a pixel MAY be represented by a dot, but
it IS not a dot - which is why it can also be represented by many dots
(whether discrete or contiguous) even on continuous tone printers! In
fact, depending on the capabilities of your monitor, by suitable
selection of your screen resolution you may be able to demonstrate that
image pixels can even be represented by LESS than a single dot!
 

Wayne Fulton

Kennedy McEwen said:
Not one of which is a definition

You're saying that all these examples of industry-wide common usage and
definition are wrong? Dye sub specs (all of them), scanner specs (all of
them), and operating system logical inches, all of them are wrong? Only your
own preference is correct? I think I understand :)

These are the industry leaders that create the products and define the terms
by their usage. Dpi has always had the universal standard definition of
pixels per inch, but regardless, due to their clout, the term means whatever
they say it means. So we need to understand it that way too.
It isn't "my personal preference" - a pixel and a dot have completely
different definitions. To begin with, one is physical and the other
completely abstract.

Sorry, you're making up rules as you go, simply to define your own preference.
Your rules certainly don't negate existing universal standard usage. But yes,
a pixel is obviously an abstract dot of color. All printers would physically
print it that way too, if they were able.

Preferences are fine, and dreaming up better systems is fine too, and ppi is a
fine term, possibly even a better term, I'm not arguing otherwise.
But refusing to acknowledge the real world that exists, and declaring that
which does in fact already exist to be "wrong", seems rather suspect.

It seems better to understand and acknowledge the way things are in the real
world, so that we can understand the literature and specifications that we
see, which has always used the term that way. It's not at all difficult,
context makes it clear, same as all other English words. Understanding is
necessary, but that's always true.

Yes, there is also a second usage of dpi to refer to physical ink dots.
Only puny English words don't have several definitions. :)
 

Hecate

Wayne Fulton said:
That is simply wrong Kennedy, or wishful thinking, or head in the sand,
something along those lines (if not all of them). I know you know better,
but sometimes you’re stubborn. :)

Dpi is technical jargon no doubt, but it is very well established jargon
from the prepress industry, which used to be the only people interested in
this. Yes, a pixel is conceptually a kind of dot. Not an ink dot of course,
but a pixel is certainly a conceptual colored dot, and nothing other than a
conceptual colored dot. Image resolution has always been called dpi, for
forever, for years before ink jets could print colored ink dots. Like it
or not, dpi is simply the long established name for the concept of image
resolution, meaning pixels per inch.

No, Kennedy is right. How would you explain to someone that they
should print a 360dpi image at 1440 dpi? You can't, sensibly, because
what you should be explaining is how to print a 360 *ppi* image at
1440 dpi.
 

Wayne Fulton

Hecate said:
No, Kennedy is right. How would you explain to someone that they
should print a 360dpi image at 1440 dpi? You can't, sensibly, because
what you should be explaining is how to print a 360 *ppi* image at
1440 dpi.

If that's all they know, they're still in trouble. What is actually needed is
a few more words to give a little understanding about how things work. We are
still more likely to see image resolution referred to as dpi than ppi, so we
need to know to expect that, and what it means. If we understand, then
there's no problem.
 

Kennedy McEwen

Wayne Fulton said:
You're saying that all these examples of industry-wide common usage and
definition are wrong?

No, not all of them, some of them do use the term correctly, as you are
aware.

Dye sub specs (all of them),

When they refer to dots per inch, yes that is exactly what they mean -
and this is quite distinct from pixels per inch. They refer, quite
specifically, to the highest number of dots of dye that the printer
deposits every inch in each axis. There is no difference in this
instance from the dpi terminology used in laser printers, inkjets or
Fuji Frontier type printers. This dot resolution of the printer is
quite distinct from the pixel resolution of the image sent to it and
even those printer technologies which can represent a pixel as a single
dot are not confined to do so. Pixels are not dots and dots are not
pixels!

scanner specs (all of
them),

Yes, all of those that use this term do use it wrongly. However, not
all scanners do use this term and many of them are inconsistent, because
some of the documents are written by people who care less about what is
meant than others. Since as long as I have used them, all of the Nikon
range of film scanners, for example, refer to "pixels/inch" or
"pixels/cm" in both the software and handbook specifications, whilst
*some* of the advertising literature refers to "dpi". Which of those
would you suggest is more likely to be technically correct?

and operating system logical inches, all of them are wrong?

Since Operating Systems do not even define inches, this is an irrelevant
issue. As far as I recall, since Windows 3.0 at least, Microsoft has
only referred to *pixel* count in their OS settings - they don't care
what monitor size you use.

Only your
own preference is correct? I think I understand :)

I don't think you do - which is probably why you continue to sow seeds
of confusion with novices, which I acknowledge is quite at odds with
your aspirations and achievements in other aspects of the process.

These are the industry leaders that create the products and define the terms
by their usage. Dpi has always had the universal standard definition of
pixels per inch, but regardless, due to their clout, the term means whatever
they say it means.

Sorry, but that is "Gates-speak" - just because Microsoft calls black
white does not make it so, no matter how much clout they have.

So we need to understand it that way too.

We need to recognise what they mean, and realise that they may be
describing something wrongly. We should not be teaching those incorrect
terms as if they were correct, that is simply spreading misconceptions,
myth and confusion.

Sorry, you're making up rules as you go, simply to define your own preference.

No, those are the definitions - I have not made any of it up, although I
am having to explain the issue to you from several different
perspectives to demonstrate that your continual error has implications
on many levels.

Your rules certainly don't negate existing universal standard usage.

They aren't my rules, Wayne - I may know people who know people who
compile the OED, but I have had no influence that I am aware of on its
compilation or the definitions they use.

But yes,
a pixel is obviously an abstract dot of color. All printers would physically
print it that way too, if they were able.

I have no need of a printer that outputs only mental abstractions, and I
doubt that you have either.

Preferences are fine, and dreaming up better systems is fine too, and ppi is a
fine term, possibly even a better term, I'm not arguing otherwise.
But refusing to acknowledge the real world that exists, and declaring that
which does in fact already exist to be "wrong", seems rather suspect.

It seems better to understand and acknowledge the way things are in the real
world, so that we can understand the literature and specifications that we
see, which has always used the term that way. It's not at all difficult,
context makes it clear, same as all other English words. Understanding is
necessary, but that's always true.

As you will note from the origin of this discussion in the thread,
context certainly does NOT make it clear, otherwise Measekite would not
have thought you meant dots of ink instead of pixels when you said he
should print his images at 300dpi instead of 300ppi!

Which of your schizophrenic definitions you meant was clear enough in
your own mind but the resulting text you uttered was ambiguous at best
and in fact misleading and confusing to the new user, resulting in a
question which assumed the wrong interpretation.
 

Wayne Fulton

I must give you one point Kennedy. I had not noticed that the newest Nikon
models have changed the specs to "up to 4000 pixels per inch". This is new
with those current models. It wasn't true of the previous models, for example
the IV and 4000, which had ratings that said dpi, per the old school. So
Nikon is changing lately. This means I must say "almost all" now. :)

And that's fine. I am certainly not arguing that dpi should be used, not at
all. I am only arguing that dpi IS in fact used. Dpi has always been the
standard definition for image resolution, and I think that beginners need to
know that too. It is unrealistic to deny that the term dpi is in fact THE
term used for image resolution in most of the real world. That dpi usage was
virtually 100% only a very few years back. According to the referenced
previous Google search, use of dpi appears to have dropped today to only about
90%. 90% is still an overwhelming number, which would seem impossible to deny
or ignore, even with head in sand. Instead, it needs to be explained.

If one prefers to use ppi, that's fine. It's a fine term, very descriptive and
it means the same thing as dpi (when dpi is about images). My scanning book
uses the term ppi every place it can (but the web site has not been updated).
However, I still think in old school terms of dpi myself. One should use
whichever term they prefer, but they need to understand it both ways.

So the big point is that beginners definitely still must understand that they
will see the use of dpi in the majority of existing literature, and they need
to be taught what it means, and how to interpret it in context. Simply put,
both terms dpi and ppi mean pixels per inch, equivalent when and if it is
about image resolution. They really need to know that.

I am saying that it is extremely short sighted to stand up and shout Wrong at
any mention of dpi, and to insist that only ppi can be used, simply because it
is your preference. That is unproductive if not egotistic, because regardless
of your personal preference, the Real World simply ain't that way. The term
dpi has been used for years and years, and is still widely used today (the
90%). What's the beginner to think then when they are trying to decipher
writing (the 90%) that uses dpi, and they have incorrectly been told it can
only mean ink dots? You certainly did them no favor; one simply cannot
substitute ink dots in those many cases.

Beginners need the understanding that there are two equivalent terms used,
then it becomes a totally trivial issue.
 

Kennedy McEwen

Wayne Fulton said:
I must give you one point Kennedy. I had not noticed that the newest Nikon
models have changed the specs to "up to 4000 pixels per inch". This is new
with those current models. It wasn't true of the previous models, for example
the IV and 4000, which had ratings that said dpi, per the old school. So
Nikon is changing lately. This means I must say "almost all" now. :)

No, it isn't new at all - I made a point of checking my old LS-20 and
its software documentation before posting that. The LS-20 was the first
Nikon scanner I ever used and the specification in the User's Manual and
the terminology of the software (NS-1.60) is "pixels / inch" and "pixels
/ cm". As for the LS-4000, that is the latest Nikon I have bought and
it is certainly "pixels / inch" in all the technical documentation. I
assume that they have continued the use of the correct terminology in
documentation of the LS-V and the LS-5000, even though some of the
advertising literature gets it wrong.

And that's fine. I am certainly not arguing that dpi should be used, not at
all. I am only arguing that dpi IS in fact used.

So, as the owner of one of the most popular sites for new users, why
don't you teach them to use the correct terminology and explain that
marketeers in sharp suits who are only interested in getting their cash
often get the facts wrong? Instead of this common sense approach, you
continue to propagate misuse, confusion and make the whole topic appear
to be black magic.

Dpi has always been the
standard definition for image resolution, and I think that beginners need to
know that too.

But it hasn't! Cycles per millimetre, inch, milliradian or degree is
the measure of resolution. Pixels per inch is n0ot a measure of
resolution - it may determine the *limiting* resolution, but it
certainly is not a measure or definition of the resolution! Just look
at almost any flatbed imager on the market - no matter whether the
manufacturer uses dots or pixels per inch, it bears little relevance to
the image resolution achievable.
It is unrealistic to deny that the term dpi is in fact THE
term used for image resolution in most of the real world. That dpi usage was
virtually 100% only a very few years back. According to the referenced
previous Google search, use of dpi appears to have dropped today to only about
90%. 90% is still an overwhelming number, which would seem impossible to deny
or ignore, even with head in sand. Instead, it needs to be explained.

It needs to be explained that it is wrong to refer to pixels as dots -
they are completely different entities. It needs to be explained that
the term dot is often misused. When new users are taught that even
experts, such as yourself, misuse the terminology, they have much more
chance of understanding the subject.

We all use the wrong terminology from time to time and I am sure you can
find references where I have used the term dpi when I meant ppi. That
doesn't mean we should settle on the wrong term just because it is
commonly misused and propagate misunderstanding. You, more than most on
this forum, have a responsibility to get it right.

So the big point is that beginners definitely still must understand that they
will see the use of dpi in the majority of existing literature, and they need
to be taught what it means, and how to interpret it in context.

When I learned Natural Philosophy I was taught to use the SI system of
units. I learned how to calculate a result from a set of data, and that
by using the SI system I was assured that the resulting units would
be correct if all of my data was in SI units as well. I was also taught
that in the real world people still used non-SI units, the cgs system or
even imperial units. I was taught to expect this, to learn how to
convert the real world values into what they should be for my purposes.

People use the wrong values and definitions all over the place -
teachers use the correct definitions. I consider you to be one of the
teachers and it certainly irritates me when I see confusion arise
because you continue to propagate the wrong definition when it would be
so much simpler for you and the learners to use the right ones.

Pixels are not dots and dots are not pixels!

I am saying that it is extremely short sighted to stand up and shout Wrong at
any mention of dpi, and to insist that only ppi can be used, simply because it
is your preference.

That isn't what I am doing Wayne, and you know it!
You used 72 lines of text on this one occasion to explain that you
really meant pixels and not dots in your original reply to Measekite. 72
lines to explain the misuse of 2 words! The number of lines of text you
have used in the past year, based on the number of times this same issue
comes up, must range into the thousands - perhaps millions if every hit
on your website is counted! Yet still further explanation of a pretty
basic concept is required time and again, expansion on what you have
already written, because you consider that the continued use of the
wrong terminology is justified by existing misuse! Save yourself, and
the rest of us, a lot of problems by trying to get it right the first
time - an explanation that misuse is common is all that is necessary.

Beginners need the understanding that there are two equivalent terms used,
then it becomes a totally trivial issue.

Except, as demonstrated by the complete misunderstanding that initiated
this sub-thread, it is far from trivial to any new user - it simply
confuses them. Use the correct term and the difference is obvious.
Relying on context means that they must understand everything you write
in exactly the same thought processes that you write it. In short, you
and they must use the same paradigm but that is, by definition,
impossible between teacher and scholar, expert and novice. As a
consequence, correct terminology and nomenclature are extremely
important in the very field that you aspire to - and if you were
prepared to use them then much of your efforts could be directed to real
issues.
 

Wayne Fulton

Kennedy McEwen said:
As for the LS-4000, that is the latest Nikon I have bought and
it is certainly "pixels / inch" in all the technical documentation. I
assume that they have continued the use of the correct terminology in
documentation of the LS-V and the LS-5000, even though some of the
advertising literature gets it wrong.

Kennedy, check out
http://www.nikonusa.com/template.php?cat=1&grp=98&productNr=9238
for just one of many. I could post a jillion links, but you already know it.
This is pretty much universal usage. The definition of dpi in such usage is
"pixels per inch", if about images. Always has been, years and years.

That is simply how the real world actually is (I'd guess 90%). Whether you
like it or not is not the issue. Beginners certainly need to have this usage
explained, because they are going to see it everywhere. Telling them that it
can only mean something it can't possibly mean (in such context) is
counterproductive to their understanding. That's not good.
 
