Vuescan Review at Photo-i


Don

What if you were to use a film calibration target, i.e. a camera shot on
Kodachrome of a standard calibrated target? By using the reference values of
the original target (rather than those measured on the slide) you would
then correct the colour balance of the film and neutral would correspond to
'real' neutral. You could apply your analogue gain correction as well before
profiling. At least in theory :)

I like the "in theory" bit... ;o)

Seriously though, the "real" neutral I was referring to is not what's
on the film, but the desired balance in the image after editing.

This is because the balance on film is not necessarily the desired
balance in the end product. That's why we edit in the first place,
otherwise we would just calibrate and leave it at that.

So - as unusual as it sounds - trying to replicate what's on the
film verbatim may end up being counterproductive. If the end goal is
contrary, the profile will need to be "undone" first. If the profile
"doesn't go far enough" it will have to be enhanced. Either way,
additional editing takes place, and that's always bad as damage is
cumulative.

Of course, it all depends on the end goal. So, all that is "in
theory"... ;o)

Don.
 

Don

It's no more "pretend" than calibrating your monitor or printer.

Actually, it is. This is what I mean:

Monitor and printer calibration assure not only that the printer
output corresponds to what's on the monitor, but what's on the monitor
will correspond to everybody else's calibrated monitor. Not to mention
monitor and printer calibrations are non-destructive (if the profile
is merely tagged).

Now, scanner/film profiles not only change data irretrievably but
serve only to try and *emulate* what's on the film. However, since
what's on the film is almost never what's in the final product -
that's why we edit in the first place - this "correction" usually does
more "damage" than good.
True, but as I put forth in another post, correcting LED gain is not
linear and has insufficient range for the task.

Which is the reason for that big "if" in my original statement.
The neutral of a calibration target is a measured deviation of the film
from international color standards. That difference tells the software
where to "find" neutral.

Yes, but *film* neutral.

The problem is this film "neutral" has very little resemblance to the
real neutral in the final edit. That's my point exactly. (See below.)
I'm not sure what you mean by the "real neutral". When I say neutral I
mean an in-gamut color that is the same from the color chart, to the
film, through the scanner, and out of the printer.

What do you mean?

Let's say that the middle gray on film is 127, 127, 127.

The scanner changes this to 120, 127, 131 or whatever...

Applying the scanner/film profile changes this back to 127, 127, 127 -
as it should! - to correspond to what's on the film. So far, so good.

However, after importing the image into an editor we set the neutral
(click on concrete, shadows, whatever...) or simply apply some
arbitrary curves or let our artistic talents go wild, etc. all because
we want the image to look a certain way. Hey, that's the beauty of
digital image processing!

If we now go back and measure the same spot we'll see it has become
112, 136, 98 or whatever.

That's the "real neutral" I'm referring to. And that's quite different
from "film neutral".

So even though scanner/film calibration performed perfectly, the
change it made had no resemblance to the final product. That "neutral"
was not only irrelevant to the final product, but probably just made
matters worse - all things considered.
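To put rough numbers on the example above, here is a toy sketch (all values invented, matching the "127, 127, 127" illustration): the scanner shifts a mid-gray patch, the profile restores it, and a later creative edit moves it away from "film neutral" entirely.

```python
import numpy as np

film_neutral = np.array([127, 127, 127])     # mid-gray as recorded on film
scanner_raw  = np.array([120, 127, 131])     # what the scanner reports

# The scanner/film profile removes the scanner's bias, back to film neutral.
profiled = scanner_raw + (film_neutral - scanner_raw)

# A later arbitrary per-channel curve (the "artistic" edit) ignores film
# neutral; the same patch no longer measures 127, 127, 127.
gains  = np.array([0.88, 1.07, 0.77])        # made-up creative adjustment
edited = np.clip(profiled * gains, 0, 255).astype(int)

print(profiled.tolist(), edited.tolist())
```

The "real neutral" of the final edit is whatever the last step left behind, not what the profile restored.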
No. That's what I mean about you misunderstanding the nature of color
space and calibration.

No, that's what I mean about you misunderstanding what I say! ;o) But
seriously, it's not a calibration issue. Maybe the above example will
clear things up.
The question is what next? Or how do I make these bits in the scan file
look like the slide I started with? That's where color management comes
in. And yes, it always and inevitably involves altering the data.

The point is - even though, at face value, it may appear
counterintuitive - what's on the film is not necessarily what's in the
final product (usually isn't). That's the key.

Now, that's very unnerving and may appear counterintuitive but it's
true. That's the whole beauty of digital editing and what it enables
us to do.

And once we recognize that, then the question becomes how to get the
most out of this process, rather than trying to just mechanically
replicate what's on the film. Especially, since chasing this elusive
replication can be, and usually is, counterproductive once we take the
final end product into consideration.

But, as I mentioned earlier, every now and then it's also good to
stand back and look at everything from the "big picture" perspective.

So, playing devil's advocate to my own position... Sure, profiles may
change data, but by what amount? And how significant a "damage" is
that, really? If the amount is small then it may not matter i.e. even
though there may be "damage" in a literal sense, in the big scheme of
things it's not noticeable. In other words, chasing "data purity" to
an extreme is just as bad as chasing "perfect calibration" to an
extreme!

Which nicely goes back to usage. If one scans to take advantage of
digital editing, but the end goal is really printing, then such
"profile corruption" is totally irrelevant if it speeds up the process
or benefits in other ways. The same goes if the goal is to view images
on a monitor. Our 8-bit eyes looking at an 8-bit monitor will never be
able to see any "profile corruption" which happened at 16-bit!
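A quick sketch of that last claim: two 16-bit values that differ by less than one 8-bit step (256 counts) land on the same 8-bit display level, so a small high-bit-depth "profile corruption" is invisible on screen. The numbers are arbitrary.

```python
clean     = 30000            # some 16-bit pixel value
corrupted = 30000 + 200      # same pixel after a tiny profile round-trip error

# Simple truncating 16-bit -> 8-bit reduction, standing in for the display.
to_display = lambda v: v // 256

print(to_display(clean), to_display(corrupted))  # same 8-bit level
```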

The only place where it may matter is if we scan for archiving, in
which case the unedited "digital negative" - as raw scanning is also
called - should be as "pure" as possible. We can then work on a copy
and have the "digital negative" to go back to in the future if we
want, whether because we'd like to have another go at editing or, for
example, when monitor and printer technology improves.

Don.
 

Don

Ok. I'll bite. Let's say you get every single bit from your scanner. Now
what?

Now you have the most data, which translates into more elbow room
before artifacts start appearing.
Well, unless you are extremely lucky your image will need correction.

That's a given. That's precisely why it's important to start with the
"best" data you can. The difference is, you're starting with pure data
and there is nothing to "undo" (whether implicitly or explicitly)
before you start the real work.
You can apply it after the scan as a profile, as I have suggested.

In which case your data has already been modified once. And, as we
know, each change you apply during editing degrades data further.
That's why the first rule of image editing is to limit the number of
steps.

That's all very easy to see if you examine the histograms at maximum
scale. Don't look at Photoshop histograms because they show you a
scaled down version (at least my Version 6 does). Each "bin" in
Photoshop's 8-bit histogram represents 256 bins of 16-bit data. Even
the free "Wide histogram" with 12-bit depth doesn't go far enough.
That's why I wrote my own true 16-bit histogram.
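A minimal version of such a "true 16-bit histogram" can be sketched like this: 65,536 bins instead of Photoshop's 256. A simple gain (an assumed 1.3x stretch, re-quantized) leaves gaps - empty levels - that the full-depth histogram exposes but a 256-bin view completely hides.

```python
import numpy as np

img = np.arange(0, 65536, dtype=np.uint16)                     # every level once
stretched = np.clip(img.astype(np.float64) * 1.3, 0, 65535).astype(np.uint16)

hist16, _ = np.histogram(stretched, bins=65536, range=(0, 65536))
hist256, _ = np.histogram(stretched, bins=256, range=(0, 65536))

print("empty 16-bit bins:", int((hist16 == 0).sum()))   # thousands of gaps
print("empty 256-wide bins:", int((hist256 == 0).sum()))  # none visible
```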
And as you said, the data will be altered.

What is the alternative?

To keep data as pure as possible.

I mean, look at it this way. Would you apply contrast and curves etc. in
your scanner software? I certainly would not, for a number of reasons,
and I doubt you would either.
Well, you could attempt to correct during the scan. In the case of the
Nikon you have 3 LEDs to work with. In other scanners you have fewer.

Exactly! LEDs (i.e. exposure) and ICE (when applicable) are about the
only parameters I modify. Everything else is turned off, or set to
neutral.
Unfortunately, attempting precise color management by tweaking the
LEDs' exposure is nearly impossible. In the case of the Nikon, the
color response of each LED to being "dialed up" or "dialed down" is
non-linear across the color produced by the LED. So while you *might* get
correction to one color, all the others would be thrown off. Besides
which, the range of movement is very limited. Finally, the corrections
are such that you would lose overall white balance.

Actually, the LED response is linear; it's the film that's non-linear.
But that's not as big a problem as it may appear at first. What is much
more important is dynamic range. For example, in the case of those pesky
Kodachromes, oftentimes red barely reaches the middle of the
histogram while blue is already clipping. In such a case it helps to
boost red, thereby providing more dynamic range to play with later.

Granted, that may open up a different can of worms, from non-linear
film response, to color imbalance, etc. but - all things considered -
it's the lesser of two evils. At least there's sufficient dynamic
range in all channels to perform the corrections without invoking
artifacts.
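As a rough sketch of that Kodachrome situation, with invented channel maxima: red only reaches mid-histogram while blue is close to clipping, so red exposure can take a substantial boost before the scan while blue can take almost none.

```python
full_scale = 65535
channel_max = {"R": 32000, "G": 52000, "B": 64500}    # assumed raw channel peaks

# Headroom per channel: the largest exposure boost before that channel clips.
headroom = {ch: full_scale / peak for ch, peak in channel_max.items()}

print({ch: round(h, 2) for ch, h in headroom.items()})
# Red tolerates roughly a 2x boost; blue has almost no room.
```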

Yes, that's a lot of work and I would *not* recommend it to everyone,
but it really depends on usage. For example, if the purpose is to
print or to be used as a web graphic, then the whole hassle is not
worth it. But if the purpose is to archive then it's a must. IMHO at
least.
And if you attempt this correction to neutral, by altering the LED's or
their electronic filter, how would you do it? You would need to
calibrate to a standard. A color target - a calibration slide would do
the trick.

As I indicate elsewhere, it isn't necessary to start with a calibrated
image to benefit from calibration. The calibration really comes into
its own later when you edit and output the image (be it on the monitor
or to the printer).
And *if* you used the corrections directed by the slide and *if* you
could actually get the full range of required correction, what would it
look like?

Terrible! And that's the point!

Look at it this way. What would you rather have from your digicam? A
color calibrated jpeg, or an uncalibrated raw image?

And what does raw Bayer data look like? Terrible!

By comparison the jpeg will "look great". But that's only how a
"civilian" would react. Those in the know realize that the jpeg is not
only heavily "corrupt" but it lost most of its data, while Bayer data
is "pure". Does it require more work? Sure, but that's the price one
pays for quality and flexibility.

This is the same thing.
Well, the resulting histogram of the scan with the correction applied
via LEDs *during* the scan would look like the scan with the correction
applied *after* the scan - as long as the post-scan correction profile was
applied at sufficient bit depth, say a 32-bit correction against a 14-bit
image. (BTW, this is how SilverFast works.)

Actually it will not look the same because there are just far too many
variables. Again, one needs to examine the data at an appropriate
histogram depth. You'd notice spikes and gaps, and worse...

In order to apply a 32-bit correction they would need a 32-bit look-up
table (LUT), and I doubt very much SilverFast would use a 32-bit LUT to
correct a 14-bit image. I do *not* know this for a fact and I'm only
guessing, but it seems very unlikely for a number of reasons. They
could do it on-the-fly, but then you get rounding errors.

But regardless of all that, the point is the raw data has been
changed. And will have to be changed again (while editing).

Therefore, two changes will always introduce more data loss than a
single change.
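That cumulative loss is easy to demonstrate with a quantization sketch (arbitrary gamma values): two curves applied in sequence, re-quantizing to the 8-bit grid after each step as an editor does, versus the single mathematically equivalent curve applied once.

```python
import numpy as np

levels = np.arange(256) / 255.0               # every 8-bit level, normalized

def apply8(v, gamma):
    # apply a gamma curve, then snap the result back onto the 8-bit grid
    return np.round((v ** gamma) * 255.0) / 255.0

two_passes = apply8(apply8(levels, 0.7), 1.9)
one_pass   = apply8(levels, 0.7 * 1.9)        # same overall curve, one step

# Count how many distinct output levels survive each route.
survivors = lambda v: len(np.unique(np.round(v * 255)))
print(survivors(two_passes), survivors(one_pass))
```

The two-pass route merges more levels than the single pass, even though both implement the same overall curve.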

But we also need to step back and look at the big picture. Some of
those changes may be very small. How significant that is, will depend
on the task at hand and overall goal. In some cases the loss will be
irrelevant, in others significant.

However, looking at it purely from a factual stand, the data will
change.
Now, because of the nature of the corrections, some real-world colors
will still exceed the range of the scanner, and these will appear clipped
in the file no matter which method you use. The real-world colors the
scanner is able to render fully will either be shifted to a neutral
representation or unaffected.

Those are the limitations of the scanner color space. There's no way
around it.

Which is exactly why one should strive to get the most out of the
scanner and not damage data right from the start.
So, what are you left with? The same result either way.

As I say, the result is not the same.

How significant the differences are is a function of usage. For some
uses the difference may be irrelevant, for others "grave".
Unfortunately, in digital imaging, as with analog imaging, there is no
getting away from color space. Use it or ignore it, it's always there.
Omit color management in one link of the production chain and the whole
process suffers.

It does not! That's exactly my point. Take the example in the quote
above.

Does your process suffer if you start with an uncalibrated image
(assuming an unknown source)? No, it does not. The calibration comes
into its own only later when you start dealing with monitors and
output devices. The initial scanner calibration does not affect that.
If it did, you'd be unable to process images from an "unknown source".
And, of course, you can, thereby proving that initial scanner
calibration is irrelevant to the rest of the process.
Unfortunately "pure unadulterated data" has almost no meaning in the
context of scanning because you are crossing so many color spaces to get
from the film to your output.

If that were true you would scan at 10 dpi. The state of the initial data
is paramount. If something is missing from the initial data you just
can't get it back. You can interpolate it or "correct" it or apply
any other arbitrary "creative" enhancement, but once the data has been
lost it's gone. How important that is depends on the context. In some
cases such attention to data purity may be overkill, but it is a
fact nevertheless.
It appears then, that the difference between us is how we apply this
knowledge.

Very true! ;o)

Don.
 

UrbanVoyeur

Don said:
Actually, it is. This is what I mean:

Monitor and printer calibration assure not only that the printer
output corresponds to what's on the monitor, but what's on the monitor
will correspond to everybody else's calibrated monitor. Not to mention
monitor and printer calibrations are non-destructive (if the profile
is merely tagged).

What are they calibrated to? International standard color charts, same as
the file profile.


Now, scanner/film profiles not only change data irretrievably but
serve only to try and *emulate* what's on the film. However, since
what's on the film is almost never what's in the final product -
that's why we edit in the first place - this "correction" usually does
more "damage" than good.

More damage than good? LOL. That's funny. I can only think that either
- you've never properly used a calibration slide
OR
- you used a poorly made profile.

How are these target slides made?
Well, under controlled, measured lighting specified by the ISO, the
maker of the film, and the spectrometer maker, several standard color
targets are shot at different exposures.

This film is processed according to strict standards, and the slide is
measured on a spectrometer. Adjustments are made to color temp,
processing, and lighting, and the shots are done again, and again, and
again - until both the film manufacturer's specification for the film AND
the spectrometer manufacturer's specs for consistency and measurement are
met.

Then a final set of slides of the calibration target is made under
the optimized conditions.

These slides are measured and the difference between the colors on the
slide and the colors on the chart is recorded. This becomes the
"calibration profile" of that slide. It tells exactly how much each
color on the slide deviates in R, G & B from each standard color on the
chart. Every color on the chart is in the film's gamut, so there is no
clipping.

This is done in very large runs for consistency and economy.

When you use this information to calibrate a scanner, there is no pretend.

You scan the slide. You measure the RGB value of each square on the
slide. Each square on that slide has a real-world value that it is
supposed to be - taken from the measurement of the original color chart
used to shoot the slide.

The scanned image will have different values. Part of the deviation will
be the film. But earlier, every deviation of the slide from the original
chart was measured on an extremely accurate spectrometer.

So if we subtract the *known* film difference, we are left with the
difference introduced by the scanner. Each of those R,G,B differences
on each color patch is added to the film's measured difference to
produce a calibration profile for that scanner.

When that profile - which is nothing more than a carefully constructed
set of adjustments - is applied, real world colors recorded on the film
are then faithfully translated to the resulting scan.

There is no "pretend" involved.
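The arithmetic just described can be sketched for a single color patch (all numbers invented; a real target uses many patches and far more careful math):

```python
import numpy as np

chart = np.array([120, 100, 80])   # patch as measured on the original chart
slide = np.array([118, 104, 77])   # same patch as measured on the slide
scan  = np.array([114, 106, 82])   # same patch as this scanner reads it

film_deviation    = slide - chart          # known, supplied with the slide
scanner_deviation = scan - slide           # what this scanner adds on top

# The profile combines both deviations; applying it maps the scan back
# to the chart's real-world value.
profile    = film_deviation + scanner_deviation
calibrated = scan - profile

print(calibrated.tolist())                 # recovers the chart value
```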
Which is the reason for that big "if" in my original statement.






Yes, but *film* neutral.

No. Real world neutral.


Let's say that the middle gray on film is 127, 127, 127.

The scanner changes this to 120, 127, 131 or whatever...

Applying the scanner/film profile changes this back to 127, 127, 127 -
as it should! - to correspond to what's on the film. So far, so good.

However, after importing the image into an editor we set the neutral
(click on concrete, shadows, whatever...) or simply apply some
arbitrary curves or let our artistic talents go wild, etc. all because
we want the image to look a certain way. Hey, that's the beauty of
digital image processing!

If we now go back and measure the same spot we'll see it has become
112, 136, 98 or whatever.

That's the "real neutral" I'm referring to. And that's quite different
from "film neutral".

That's not neutral. That's just how you want to output your images. That
is completely arbitrary.

If you want to say that every image is tweaked according to whatever you
feel - sure you can do that.

Is that a controlled or consistent process? No.

Is it neutral compared to the color of the original object? No, but then
you didn't intend it to be.

The point is - even though, at face value, it may appear
counterintuitive - what's on the film is not necessarily what's in the
final product (usually isn't). That's the key.
Now, that's very unnerving and may appear counterintuitive but it's
true. That's the whole beauty of digital editing and what it enables
us to do.

It's not counterintuitive. It is undisciplined.

Yes, you can shoot and see what comes out, and then play around with it
until you are satisfied. And every photo will be a guess. And every one
of them an adventure to get an output that's pleasing to you. Some will
be easier to shoot than others.

And in this context, nothing that's on film really matters as long as
you can manipulate it into a pleasing final product.

In this context, anything that corrects to a "standard" would be an
obstacle - since you are not starting from, nor ending up in, a
standardized place.

Now, another way to shoot is to maximize the film as the medium:
understanding its limitations and crafting an image (in camera) that is
optimized for them. At that point, all else follows from the film -
because that is the "pre-visualized" target medium.

And once we recognize that, then the question becomes how to get the
most out of this process, rather than trying to just mechanically
replicate what's on the film. Especially, since chasing this elusive
replication can be, and usually is, counterproductive once we take the
final end product into consideration.

You may feel it is elusive - I suspect because of your shooting style.

For you, the film is a way station in the process. From what you say,
there is an image, in your mind's eye, that you want to represent in
output. Part of that vision exists on the film, but only in a raw form.
The rest must be brought out of it in post production.


For me, the slide is an end in itself - because if it isn't on the
slide, no amount of post-processing creativity is going to put it there.
The vision I have in my mind is precisely what is on the film - because
I made the slide conform to that vision when I made a whole host of
decisions about lighting, exposure and composition.

So I put all my energy into crafting the best slide - and a workflow
that extracts that information accurately from the slide. Accurate
based on the slide.

With print film the pre-vis process is the same, but the end medium is a
print.

Sure, I can go in any direction I want in digital processing. But that's
not how I shoot - I know what I want the film to look like when I'm
making the image.

If the film does not look like what I want as an end product, then I go
back and re-examine my technique. Film has a limited range. Every image
has to somehow be made to fit in the film's range - and use the entire
range - not be compressed into the lower (underexposure) or upper
(overexposure) end of that range. If the image doesn't, then I'm not
taking full advantage of the medium.

Since the scanner, monitor, and printer all have more limited range than
film, the next task is to squeeze the film into their ranges.


Also, while I'm not saying it is so in your case, "working it out in
post" can be a crutch for poor technique. This is particularly true of
slide film, which responds dramatically to over- and underexposure. Yes,
you can recover some data that has been compressed into dark shadows or
blown highlights, but not nearly as much as if those were properly
placed in the film's range at the start.

Exposure which is not well controlled can lead to slides where the only
course of action is to do heavy post work to bring the images out. The
best fix for that is better technique.

But, as I mentioned earlier, every now and then it's also good to
stand back and look at everything from the "big picture" perspective.

So, playing devil's advocate to my own position... Sure, profiles may
change data, but by what amount? And how significant a "damage" is
that, really? If the amount is small then it may not matter i.e. even
though there may be "damage" in a literal sense, in the big scheme of
things it's not noticeable. In other words, chasing "data purity" to
an extreme is just as bad as chasing "perfect calibration" to an
extreme!
True.

Which nicely goes back to usage. If one scans to take advantage of
digital editing, but the end goal is really printing, then such
"profile corruption" is totally irrelevant if it speeds up the process
or benefits in other ways. The same goes if the goal is to view images
on a monitor. Our 8-bit eyes looking at an 8-bit monitor will never be
able to see any "profile corruption" which happened at 16-bit!

Oh, how I long for one of those expensive 14 bit monitors. :)
The only place where it may matter is if we scan for archiving, in
which case the unedited "digital negative" - as raw scanning is also
called - should be as "pure" as possible. We can then work on a copy
and have the "digital negative" to go back to in the future if we
want, whether because we'd like to have another go at editing or, for
example, when monitor and printer technology improves.

Well for archival purposes, I would store a raw scan, no profiles. But
I'd rather put my energy into preserving the film - and wait until
scanning improves.
 

UrbanVoyeur

Don said:
To keep data as pure as possible.

<snip>


Does your process suffer if you start with an uncalibrated image
(assuming an unknown source)? No, it does not. The calibration comes
into its own only later when you start dealing with monitors and
output devices. The initial scanner calibration does not affect that.
If it did, you'd be unable to process images from an "unknown source".
And, of course, you can, thereby proving that initial scanner
calibration is irrelevant to the rest of the process.




If that were true you would scan at 10 dpi. The state of the initial data
is paramount. If something is missing from the initial data you just
can't get it back. You can interpolate it or "correct" it or apply
any other arbitrary "creative" enhancement, but once the data has been
lost it's gone. How important that is depends on the context. In some
cases such attention to data purity may be overkill, but it is a
fact nevertheless.


It seems that you have conflated the arguments of bit integrity with
those of color management, and while interrelated, they really are
different things.


Getting all the bits out of your scanner is important. But if your blues
look purple, it doesn't do you much good.

That's where color management comes in. Yes, it alters the data. Every
change to the image alters the data. That's a given. There are only so
many bits, and changing an image always involves moving some and
discarding others.

If you want to start from a place of wrong colors, contrast and exposure
for the sake of saying your bits are untouched - by all means, feel free.

That's not what's on the film, but it does have every bit the scanner
can give you.

I, on the other hand, prefer to start from where the slide left off, not
from the scanner's "opinion" of the slide. If the scanner thinks my
blues are purple, I don't care how many bits it uses to express that
purple; I shot blue and that's what I want the scanner to give me. If it
means throwing away a few "purple" bits so my blues are true, then so be
it.


I think you will find it easier, and that it requires less post work, if
you bring your images back to a color-managed neutral first.
 

John

UrbanVoyeur said:
This is done in very large runs for consistency and economy.

When you use this information to calibrate a scanner, there is no pretend.

You scan the slide. You measure the RGB value of each square on the
slide. Each square on that slide has a real-world value that it is
supposed to be - taken from the measurement of the original color chart
used to shoot the slide.

The scanned image will have different values. Part of the deviation will
be the film. But earlier, every deviation of the slide from the original
chart was measured on an extremely accurate spectrometer.

So if we subtract the *known* film difference, we are left with the
difference introduced by the scanner. Each of those R,G,B differences
on each color patch is added to the film's measured difference to
produce a calibration profile for that scanner.

When that profile - which is nothing more than a carefully constructed
set of adjustments - is applied, real world colors recorded on the film
are then faithfully translated to the resulting scan.

There is no "pretend" involved.

It's even more fundamental than that. RGB values from an uncalibrated
scanner are meaningless in absolute terms - they are values which are
relative to each other and to the exposure of the scanner. They do not
represent absolute colour in any shape or form. On the other hand, a
calibration slide has been measured absolutely, as you point out, using a
spectrophotometer. However, these absolute values are not RGB values but Lab
values - i.e. real-world colour. When we calibrate a scanner, we are
*defining* what the RGB values it produces actually correspond to in terms
of absolute colour (i.e. Lab).

Anyone who 'scans straight into Adobe RGB (1998)' or whatever is
*miscalibrating* their scanner, because the RGB values it produces are not
valid in that colour space. The manual tweaks that they then have to perform
to make the colours 'look right' are every bit as damaging to the data as
converting from a scanner profile to Adobe RGB using a CMS.

Of course, *calibrating* a scanner *does not* damage the data. To calibrate
a scanner, all one has to do is assign a correct scanner profile to the
RGB data. The RGB is then defined and the scanner is calibrated. Only when
we convert into a more suitable *working space* do we 'damage' the data. In
theory, one does not need to do this - one can edit the scanner data
directly in scanner colour space and merely use the CMS to view the result.
However, the practical problems of doing this are, of course, the
non-linearity and perceptual non-uniformity of the scanner's native colour
space, which would make this approach extremely difficult.
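The assign-versus-convert distinction can be sketched with a made-up 3x3 matrix standing in for a real ICC transform: assigning a profile only attaches meaning (metadata), while converting rewrites the pixel values.

```python
import numpy as np

pixels = np.array([[30000, 41000, 12000]], dtype=np.uint16)  # raw scanner RGB

# Assign: data untouched; we merely record which profile defines it.
assigned = {"data": pixels.copy(), "profile": "scanner.icc"}  # hypothetical tag

# Convert: push pixels through a transform into a working space and
# re-quantize -- this is the step that actually alters the data.
to_working = np.array([[0.90, 0.10, 0.00],
                       [0.05, 0.90, 0.05],
                       [0.00, 0.10, 0.90]])
converted = np.clip(pixels @ to_working.T, 0, 65535).astype(np.uint16)

print((assigned["data"] == pixels).all(), (converted == pixels).all())
```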
 
 

Don

What are they calibrated to? International standard color charts, same as
the file profile.

It doesn't matter. That's the "how", I'm talking about the "what".

The key point of calibration is that you see on your monitor what I
see on my monitor. And, what anyone sees on their monitors is what
they also see on their printouts. That's it!
More damage than good? LOL. That's funny. I can only think that either
- you've never properly used a calibration slide
OR
- you used a poorly made profile.

How are these target slides made?

Totally irrelevant! (See below.) Again, this is *not* about
calibration.
Well, under controlled, measured lighting specified by the ISO, the
maker of the film, and the spectrometer maker, several standard color
targets are shot at different exposures. ....
There is no "pretend" involved.

You applied "pretend" incorrectly. Here we go with misunderstandings
all over again... ;-)

"Pretend" does *not* refer to *how* the calibration slide was made or
*how* (the mechanics) it is applied. I have great confidence all that
is done with highest care and accuracy. But that's not the point.

"Pretend" refers to the effect or the consequences of scanner
calibration on the scan in the final analysis.

What that particular *part* of calibration (the scanner part, the
front end part) does is correct the image to remove any scanner bias.
It *may* correct it to look like the film, but that's meaningless in
regard to the final product, as already explained.

Therefore, with reference to the *final product*, any such corrections
by the profile are "pretend".

Why? Because what's on the film is not necessarily what's on the final
product. Here are at least two reasons:

1. The shot may be off, or not what one wants in the final product.

2. *Uncalibrated* images from an unknown source work just as well.
No. Real world neutral.

If that were the case you would never need to edit the image
afterwards.

Also, what about editing uncalibrated images from an unknown source?

What do you do then? How do you assign a "real world neutral" to an image
from an unknown source where you don't even know whether the image is
calibrated or not?
That's not neutral.

Yes it is! It's the actual neutral, the end result. Not the "absolute"
or "should-be" neutral which the profile unilaterally declares - no
matter how noble the profile's intentions are.
That's just how you want to output your images. That
is completely arbitrary.

Precisely! And that's the reason why trying to recapture that "film
neutral" is... well... pointless in relation to the final end product.

If scanner profiling were non-destructive, then it would be harmless.
However, if it does change the data it may do more damage than good.
If you want to say that every image is tweaked according to whatever you
feel - sure you can do that.

Is that a controlled or consistent process? no.

Is it neutral compared to the color of the original object? no, but then
you didn't intend it to be.

Exactly! So trying to recover the color of the original object as
represented on film is pointless. Furthermore, if in the process of
doing that you lose data it's also counterproductive.
It's not counterintuitive. It is undisciplined.

So, that means you never edit your images, then? You stick to what the
calibration gives you, because to deviate from the "calibration
neutral" would be undisciplined, right?
And in this context, nothing that's on film really matters as long as
you can manipulate it into a pleasing final product.

In this context, anything that corrects to a "standard" would be an
obstacle - since you are not starting from nor ending up in a
standardized place.

Here we go again... ;o)

Your above conclusion does not follow from anything I wrote. You
instantly jump to a generalization. Let me quote what I wrote before:

I say:

Without a profile the scanner produces pure data.

You hear:

Throw away *all* calibration and color space info!

That is not the case. Calibration is essential! However, input
calibration is of very limited use and can even be harmful if it
destroys data.
You may feel it is elusive - I suspect because of your shooting style.

No, you misunderstand again... Aaaarrrrrgggghhh! ;-) You're not doing
this on purpose, are you? ;o)

The "elusive" part refers to the fact that the end product is
different from the calibrated "input" into your editing process.

Once the image enters your editing process, do all the calibration you
can! But calibrating before entering this process is of very limited
use and indeed, may (and often does) end up doing more damage than
good.

Again, given an *uncalibrated* image from an *unknown source* -
according to you - we should give up! Of course, we don't! You can
take such an image and *within* your calibrated workflow produce
output just as perfect as from "calibrated" input.

And if you can do that, it proves beyond any shadow of a doubt, that
whether the input is calibrated or not is totally irrelevant to the
(*rest of*) color management workflow or to the end result!

And that (the rest of) color management workflow has absolutely no
relation to whether the input is calibrated or not!
For you, the film is a way station in the process. From what you say,
there is an image, in your mind's eye, that you want to represent in
output. Part of that vision exists on the film, but only in a raw form.
The rest must be brought out of it in post production.

No, that's not what I'm saying at all...

I'm saying that there are a number of factors *outside* of one's
control which result in what's on the film not being what's on the
final product.
For me, the slide is an end to itself - because if it isn't on the
slide, no amount of post processing creativity is going to put it there.
The vision I have in my mind is precisely what is on the film - because
I made the slide conform to that vision when I made a whole host of
decisions about lighting, exposure and composition.

So I put all my energy into crafting the best slide - and a workflow
that extracts that information accurately from the slide. Accurate,
that is, based on the slide.

We all (strive to) do that! But the real world does not cooperate...
Since the scanner, monitor, and printer all have more limited range than
film, the next task is to squeeze the film into their ranges.

And that's one example of the real world not cooperating!
Exposure which is not well controlled can lead to slides where the only
course of action is to do heavy post work to bring the images out. The
best fix for that is better technique.

Again, that's not what I'm saying. Those are all things under your
control. I'm talking about the (many) things outside of your control,
the end result of which is that what's on the film does not correspond
to the final product.

Finally, we agree! ;o)
Oh, how I long for one of those expensive 14 bit monitors. :)

I know! A while back I was surfing to check out HDR image processing
and I came across a report of a white-LED based monitor with "real
world" dynamic range for viewing HDR images. However, it was a custom
made one-off for some exhibition.

But even when that sort of hardware filters down (and I'm sure it's
coming) we still have a problem with our ability to only see 8-bit
color (some research even claims as little as 6-bit). But an
improvement in display dynamic range would certainly help a lot!
Well for archival purposes, I would store a raw scan, no profiles. But
I'd rather put my energy into preserving the film - and wait until
scanning improves.

The problem is that would be a never-ending wait. I mean, things
improve constantly. So I'd rather capture all I can at this time
before any further film deterioration takes place, because no matter
how well stored, film does deteriorate while bits, no matter how
imperfect current technology may be, do stay constant.

Don.
 
D

Don

Anyone who 'scans straight into Adobe RGB (1998)' or whatever is
*miscalibrating* their scanner, because the RGB values it produces are not
valid in that colour space. The manual tweaks that they then have to perform
to make the colours 'look right' are every bit as damaging to the data as
converting from a scanner profile to Adobe RGB using a CMS.

Of course, *calibrating* a scanner *does not* damage the data. To calibrate
a scanner, all one has to do is to assign a correct scanner profile to the ....
Only when
we convert into a more suitable *working space* do we 'damage' the data. In
theory, one does not need to do this - one can edit the scanner data
directly in scanner colour space and merely use the CMS to view the result.

Yeah! What he said!!!

Now, why couldn't I put it that simply and clearly! ;o)

Seriously though, that's exactly my point!

Of course, the catch is not all CMSes will let you merely assign a
profile, as some insist on converting instead. In that case, I (for
one) would rather take the raw data than the "damaged" profiled data.
However, the practical problems of doing this are, of course, the
non-linearity and perceptual non-uniformity of the scanner's native colour
space, which would make this approach extremely difficult.

I wouldn't say extremely, but it does require more work, that's for
sure. However, given the fact that some editing will have to be done
no matter what, and that editing damage is cumulative, it's better to
start with raw data IMHO. Or, let's just say, it's the lesser of two
evils.

Don.
 
D

Don

It seems that you have conflated the arguments of bit integrity with
color management and while interrelated, they really are different things.

I've actually spent over a week stating exactly that they *are*
different!! That's the key of our misunderstanding.

However, that specific *portion* of CMS (the front end, scanner part)
affects bit integrity. And that means, narrower and more limited
options down the CMS path i.e. there are fewer bits to "play with".

Yes, they may be the "right" bits *in theory* but not in practice. So
when the crunch comes we have fewer options (more limitations before
artifacts start appearing).
Getting all the bits out of your scanner is important. But if your blues
look purple it doesn't do you much good.

And neither does the loss of data which a profile will introduce. It
will "correct" the blues, but at what cost?

If, after reviewing the image, you then discover that the blues...
ahem... actually... as a matter of fact... erm... really need to be "a
tad" more purple, there's no way of getting that information back! Not
to mention cumulative and compound damage two edits will do.
That's where color management comes in. Yes, it alters the data. Every
change to the image alters the data. That's a given. There are only so
many bits, and changing an image always involves moving some and
discarding others.

That's exactly my point!
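That cumulative bit loss is easy to demonstrate (a numpy sketch; the gamma curve stands in for any tonal edit that an 8-bit editor re-quantises):

```python
import numpy as np

# A full 8-bit tonal ramp: 256 distinct levels going in.
ramp = np.arange(256, dtype=np.uint8)

def apply_gamma(img8, g):
    """Apply a gamma curve, re-quantising to 8 bits as an editor would."""
    x = img8.astype(np.float64) / 255.0
    return np.round((x ** g) * 255.0).astype(np.uint8)

# One edit, then a second edit that nominally undoes it.
edited = apply_gamma(ramp, 2.2)
undone = apply_gamma(edited, 1.0 / 2.2)

# Distinct levels only ever go down: the discarded bits never come back.
print(len(np.unique(ramp)), len(np.unique(edited)), len(np.unique(undone)))
```

The second edit cannot restore levels the first one merged, which is the "cumulative damage" point: every re-quantising step starts from whatever the previous one left behind.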
If you want to start from a place of wrong colors, contrast and exposure
for the sake of saying your bits are untouched - by all means, feel free.

It's not a willingness to start with wrong colors, but a willingness
to start with more data, because that translates into most flexibility
down the road i.e. both CMS and I have more data at our disposal, more
elbow room, more flexibility.

If there were a way to have the correction done automatically
*without* loss of data, I'd embrace that wholeheartedly.

But given the dilemma of "ease of use" vs "more data and flexibility"
I'd pick "more data and flexibility" every time.
I think you will find it easier, and requires less post work if you
bring your images back to a color managed neutral first.

It's a trade-off. It may be easier superficially, but at the cost of
data integrity, which means loss of data. And that means less elbow
room down the road. So any "gains" in terms of time and effort are at
the expense of flexibility and data integrity which directly
translates into fewer editing options later.

Don.
 
U

UrbanVoyeur

Don said:
And neither does the loss of data which a profile will introduce. It
will "correct" the blues, but at what cost?

If, after reviewing the image, you then discover that the blues...
ahem... actually... as a matter of fact... erm... really need to be "a
tad" more purple, there's no way of getting that information back! Not
to mention cumulative and compound damage two edits will do.

But that is a creative decision that you actively make - rather than a
mistake made by your scanner in rendering the color in the first place.
It's not a willingness to start with wrong colors, but a willingness
to start with more data, because that translates into most flexibility
down the road i.e. both CMS and I have more data at our disposal, more
elbow room, more flexibility.

Ok. I won't disabuse you of that thought. Though you will do far more
"damage" (as you call it) to your data in editing than you will ever do
in calibration. But as you have said, it's not about degree - you want a
scan with all the bits.

If there were a way to have the correction done automatically
*without* loss of data, I'd embrace that wholeheartedly.

But given the dilemma of "ease of use" vs "more data and flexibility"
I'd pick "more data and flexibility" every time.




It's a trade-off. It may be easier superficially, but at the cost of
data integrity, which means loss of data. And that means less elbow
room down the road. So any "gains" in terms of time and effort are at
the expense of flexibility and data integrity which directly
translates into fewer editing options later.

If you think that starting with incorrectly rendered colors and contrast
is a good thing, then by all means go ahead.
 
U

UrbanVoyeur

Don said:
It doesn't matter. That's the "how", I'm talking about the "what".
The key point of calibration is that you see on your monitor what I
see on my monitor. And, what anyone sees on their monitors is what
they also see on their printouts. That's it!

Not entirely.

The point I'm trying to make here is that you are correcting your
monitor and printer to the same color standards that I am talking about
applying to your scanner. Applying these standards allows your monitor
and printer to match their colors. But not applying them to your scanner
means that your scans don't match your slide.

So going in it doesn't look anything like your source material, meaning
that it is difficult if not impossible to get the image on your monitor
or from your printer to look like the original slide.


Totally irrelevant! (See below.) Again, This is *not* about
calibration.


You applied "pretend" incorrectly. Here we go with misunderstandings
all over again... ;-)

"Pretend" does *not* refer to *how* the calibration slide was made or
*how* (the mechanics) it is applied. I have great confidence all that
is done with highest care and accuracy. But that's not the point.

"Pretend" refers to the effect or the consequences of scanner
calibration on the scan in the final analysis.

What that particular *part* of calibration (the scanner part, the
front end part) does is correct the image to remove any scanner bias.
It *may* correct it to look like the film, but that's meaningless in
regard to the final product, as already explained.

Why is it irrelevant for an image on screen to look like the film? I
thought the purpose of a scanner was to *fully* and *accurately*
translate the image from the slide to a file.

If the resulting image doesn't look like the film, well then, it's just a
random interpretation of the image by the scanner.

Therefore, with reference to the *final product*, any such corrections
by the profile are "pretend".

Why? Because what's on the film is not necessarily what's on the final
product. Here are at least two reasons:

1. The shot may be off, or not what one wants in the final product.

That's creative. And you should be deciding where to go with the image
without the bias of the scanner.
Also, what about editing uncalibrated images from an unknown source?

Again, what is an "uncalibrated image from an unknown source"?

If you mean an image that I didn't scan, from a slide I never saw, on an
emulsion I don't know - well, that's called a "favor for a friend", since
I no longer print other people's work for hire.

In that case I'm working blind - and with someone else's image. All bets
are off. I make the best of it in photoshop. In the past five years I've
done that a handful of times.

All the rest has been my work, where I know scanner, image, film, etc.


What do you then? How do you assign a "real world neutral" to an image
from an unknown source where you don't even know whether the image is
calibrated or not?

No. I do the best I can in photoshop and call it a day. But that's not
my primary workflow.

Yes it is! It's the actual neutral, the end result. Not the "absolute"
or "should-be" neutral which the profile unilaterally declares - no
matter how noble the profile's intentions are.

Not in the way I see it.

"Real world neutral" is rendering a standard color so that it measures
the same RGB value on the real world object as it does on film, in the
image file, on the monitor, and in the final print. That is real world
neutral. An example would be the colors of a MacBeth chart.

It may not be what you desire - that's a creative decision.

It may not be what my scanner tries to pass off as an accurate image
file of the slide. That's a problem.


Precisely! And that's the reason why trying to recapture that "film
neutral" is... well... pointless in relation to the final end product.

If scanner profiling were non-destructive, then it would be harmless.
However, if it does change the data it may do more damage than good.




Exactly! So trying to recover the color of the original object as
represented on film is pointless.

Only to you. I actually want my scans and prints to look like my slides.


So, that means you never edit your images, then? You stick to what the
calibration gives you, because to deviate from the "calibration
neutral" would be undisciplined, right?

I don't have to is my point. And when I do edit, it is very slight
changes. The vast majority of what I shoot takes very little or no curve
editing.

When I shoot well the scans match the slides. I use the calibrated image
because it matches the film.

If I am printing, I may do some corrections to overcome limitations of
the printer & ink.

I may do heavy editing when I shoot beyond what I know to be the
limitations of the slide. Or when I make mistakes. But even then, I use
the calibrated image as my starting point - because I still want the
blue on the slide to be blue in the file, not the uncorrected purple the
scanner thinks it should be.


No, you misunderstand again... Aaaarrrrrgggghhh! ;-) You're not doing
this on purpose, are you? ;o)

The "elusive" part refers to the fact that the end product is
different from the calibrated "input" into your editing process.

Ok. We have very different working styles.

With me, the printed output looks just like the image on the monitor,
which looks just like the slide. The input (slide) looks just like the
output (print).

Yes the paper is big and matte and the slide is small and glossy. But
hold them up - print to monitor, slide to monitor, slide to print and
they are all very close. None has a different color cast than the
others. None renders a particular blue as a green or a purple. In all of
the images, the blacks are all black, not grey. The whites are all white
- not pink - in any of them.

That's predictable, accurate and painless.

Once the image enters your editing process, do all the calibration you
can! But calibrating before entering this process is of very limited
use and indeed, may (and often does) end up doing more damage than
good.

Again, given an *uncalibrated* image from an *unknown source* -
according to you - we should give up!

Not at all - I did not mean to imply that. But obviously we can work with
any image. But I would rather not work against the film and the
scanner's bias.


And if you can do that, it proves beyond any shadow of the doubt, that
whether the input is calibrated or not is totally irrelevant to the
(*rest of*) color management workflow or to the end result!

Only in your workflow. But I have had no success in convincing you of
the value of calibrating your scanner.

And that (the rest of) color management workflow has absolutely no
relation to whether the input is calibrated or not!

Your end point is only as good as your starting point. I want my output
to look like my slides, so I make sure I start with an image file that does.

For you it's not important, as long as the final image is pleasing. Ok.
So you can start wherever.

No, that's not what I'm saying at all...

I'm saying that there are a number of factors *outside* of one's
control which result in what's on the film not being what's on the
final product.

Such as?
We all (strive to) do that! But the real world does not cooperate...

How so? I choose the lens, camera, composition, aperture, film speed,
emulsion, shutter speed, and filtration. If I can't decide what goes on
that film who or what does? I'm not being facetious here.

Now sure there are always images that are poorly made or where the film
is used beyond its limitations that we want to "rescue" to make them
look right. They need work. Lots of it. But with these it helps to
remove the "damage" done by the scanner's bias.

And that's one example of the real world not cooperating!

Not to me. That's just the material I work with. Just like B&W wet
processing and printing. Every film and paper has limitations.

Again, that's not what I'm saying. Those are all things under your
control. I'm talking about the (many) things outside of your control,
the end result of which is that what's on the film does not correspond
to the final product.


Such as?
 
U

UrbanVoyeur

Don said:
Yeah! What he said!!!

Now, why couldn't I put it that simply and clearly! ;o)

Seriously though, that's exactly my point!

Of course, the catch is not all CMSes will let you merely assign a
profile, as some insist on converting instead. In that case, I (for
one) would rather take the raw data than the "damaged" profiled data.

Well assigning a profile vs converting to a color space are two entirely
different things.

I have *never* made a statement about which color space to assign an
image to. That process does subtract a significant quantity of data from
the image file that is irrevocable, and depending on the color space and
output, not always desirable.

I have made consistent statements about applying calibration profiles
and how necessary they are to correct the inaccuracies of scanners.


Agreed. This is why we (generally) use color spaces that make sense
perceptually within the limitations of our monitors.
 
D

Don

But that is a creative decision that you actively make - rather than a
mistake made by your scanner in rendering the color in the first place.

Exactly! That was my point all along: The final product is not single
dimensional (i.e. a simple replication of what's on the film) but the
result of the whole process (i.e. scanning and editing). And if that's
the case, then trying to replicate what's on the film mechanically is
futile if one knows that - whatever the result of this mechanical
conversion - it's bound to change further down in the chain once we
start editing.
Ok. I won't disabuse of of that thought. Though you will do far more
"damage" (as you call it) to your data in editing then you will ever do
in calibration. But as you have said, it's not about degree - you want a
scan with all the bits.

It's precisely because editing is so damaging (and cumulative!) that
it's imperative to start with as much, and as pure, data as possible!

If we know that editing is going to compromise data, then why compound
the damage by starting with compromised data in the first place?
If you think that starting with incorrectly rendered colors and contrast
is a good thing, then by all means go ahead.

As Reagan would say: There you go again... ;o)

I never said that starting with incorrectly rendered colors and
contrast is a good thing!

I said that starting with more of, and less corrupt, data is a good
thing!

If the raw scan has incorrectly rendered colors and contrast, the
profile will never be able to correct that. It will "pretend" to
"correct" it by modifying ("destroying") existing data, but it will
never be able to put in the missing data which was not there in the
first place. However, by "meddling" it only makes our job more, not
less, difficult.

Don.
 
D

Don

Not entirely.

The point I'm trying to make here is that you are correcting your
monitor and printer to the same color standards that I am talking about
applying to your scanner. Applying these standards allows your monitor
and printer to match their colors. But not applying them to your scanner
means that your scans don't match your slide.

And since we have shown that the end result will almost never look
like the slide anyway, that's moot!

And what about uncalibrated data from an unknown source?
Why is it irrelevant for an image on screen to look like the film? I
thought the purpose of a scanner was to *fully* and *accurately*
translate the image from the slide to a file.

Yes, but as we know scanners *fail* to do that.

I'd *love* if the scanners were able to replicate the film *natively*
without use of profiles!

But trying to "fudge" this imperfect scanner rendition with a profile
afterwards only increases the amount of *real* damage to the
underlying data, only to gain a *superficial* appearance of similarity
which - in the final analysis - is not what the end product will look
like anyway!
If the resulting image doesn't look like the film, well then, it's just a
random interpretation of the image by the scanner.

As disturbing as that sounds, that's what scanners do! (Except
it's not that random.)
That's creative. And you should be deciding where to go with the image
without the bias of the scanner.

Whatever the reason, it's a fact.
Again, what is an "uncalibrated image from an unknown source"?

I explained that already!

You get an image. You don't know what device the image comes from.
What's more, you don't even know if the image has been calibrated in
that initial device or not!

Can you process such an image? Of course, you can.
In that case I'm working blind - and with someone else's image. All bets
are off. I make the best of it in photoshop. In the past five years I've
done that a handful of times.

And - using well known methodologies - you were, no doubt, able to
produce results indistinguishable from a "known source" image.
No. I do the best I can in photoshop and call it a day. But that's not
my primary workflow.

We are not talking about your primary workflow, but the fact that even
when you start with an "unknown source" image you can still
incorporate it into your workflow without any problems and produce
results indistinguishable from "known source" images.
Not in the way I see it.

I know, and that's the problem. You are looking at it too rigidly.
Nothing wrong with that, of course, you're just following the
standardized calibrated workflow instructions.

What I'm saying is, apply some lateral thought, and "think out of the
box" by looking at what is the real goal here and what are the real
effects of alternative workflows.
"Real world neutral" is rendering a standard color so that it measures
the same RGB value on the real world object as it does on film, in the
image file, on the monitor, and in the final print. That is real world
neutral. An example would be the colors of a MacBeth chart.

No, it's not. That's the "dogma" I referred to earlier, and the fact
that you're locked into this "calibration-think".

The real world neutral is what's on the final product.

Yes, you can define this color in a gamut of your choosing, and it may
be different from "should be" neutral, but all that is beside the
point. That color in the image is what will be printed even if it is
different from "sacred neutral".

It is, however, *essential* to define the colors in the processed
image (even though, literally speaking, they are "heretical"). The
reason is so that others viewing it on their monitors see the same
thing and the printout comes out right.
It may not be what my scanner tries to pass off as an accurate image
file of the slide. That's a problem.

And - as we've shown - no matter how accurate you make it, it's going
to change the very femtosecond you import it into PS! So much for all
the effort of making it look like the film.
Only to you. I actually want my scans and prints to look like my slides.

In which case you don't do any editing afterwards, and then, yes,
applying a profile makes perfect sense.
I don't have to is my point. And when I do edit it is very slight
changes. The vast majority of what I shoot take very little or no curve
editing.

Well, you can't have it both ways. If you don't edit, then scanner
profiling makes perfect sense. If you do, the moment you import it
into PS, the scanner profile becomes meaningless.
Ok. We have very different working styles.

It's nothing to do with working style!

It's only an objective statement of fact, unrelated to any working
style.

I have never advocated any working style. All I'm saying is this is
what the reality is.
No at all - I did not mean to imply that. But obviously we can work with
any image. But I would rather not work against the film and the
scanner's bias.

The fact is you don't know that with an image from an "unknown
source". It may very well have been calibrated! You simply don't know.

That's my point! So you take it at face value, which is exactly how
you take a calibrated image. Your editing process does *not* change.

The only difference is that when you do know an image comes from a
calibrated source, it may give you a "warm, fuzzy feeling" of
security, but in realistic terms, it makes absolutely no difference to
your editing process i.e.:

You still clip the highlights and the shadows a little to boost
contrast the same way, you still click on a sidewalk to set the gray
the same way, you still apply a bit of an "S" curve to "pep the image
up" the same way, you still calibrate the monitor and the printer the
same way, and so on and so forth... No difference!
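Those standard moves can be sketched as plain array operations (a hypothetical numpy sketch; the function names and constants are illustrative, not any particular editor's implementation - the point is that none of them cares whether the input was calibrated):

```python
import numpy as np

def levels_clip(img8, black=5, white=250):
    """Clip a little off shadows and highlights, stretching the rest."""
    x = np.clip(img8.astype(np.float64), black, white)
    return np.round((x - black) / (white - black) * 255.0).astype(np.uint8)

def gray_point(img8, sampled_rgb):
    """Neutralise a sampled should-be-gray pixel by scaling each channel."""
    sample = np.asarray(sampled_rgb, dtype=np.float64)
    gains = sample.mean() / sample
    return np.clip(np.round(img8 * gains), 0, 255).astype(np.uint8)

def s_curve(img8, strength=0.15):
    """A gentle 'S' curve: darken shadows, lift highlights."""
    x = img8.astype(np.float64) / 255.0
    y = x - strength * np.sin(2.0 * np.pi * x)
    return np.clip(np.round(y * 255.0), 0, 255).astype(np.uint8)

# The same three edits apply to any 8-bit RGB array, calibrated or not.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8, 3), dtype=np.uint8)
out = s_curve(levels_clip(gray_point(img, sampled_rgb=(180, 170, 190))))
```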
Only in your workflow. But I have had no success in convincing you of
the value of calibrating your scanner.

Again, it's nothing to do with workflow - as I've just shown above.

First it was the black point, then the color space, and now it's the
workflow... ;o) We just can't seem to talk about the same thing... ;o)
Your end point is only as good as your starting point.

Yes!

And if your starting point is a compromised image ("damaged" data)
only so it *superficially* "pretends" to look like the slide, which
itself bears no resemblance to the final end product - then your
starting point is pretty bad. That's my point exactly.
I want my output
to look like my slides, so I make sure I start with an image file that does.

And in that case, profiling the scanner makes perfect sense -
providing you do no editing afterwards.
For you it's not important, as long as the final image is pleasing. Ok.
So you can start wherever.

No, I don't start "wherever". I start with the most data and therefore
have most flexibility. Is it easy? No, of course not. But do I have
more options, more elbow room, more flexibility? Absolutely!

We've been over this already. From unfavorable shooting conditions, to
"creative" reasons as you call them, to gamut limitations of different
output media down the line, etc. etc.
How so? I choose the lens, camera, composition, aperture, film speed,
emulsion, shutter speed, and filtration. If I can't decide what goes on
that film who or what does? I'm not being facetious here.

I know you're not. But just as I listed above there are reasons
outside of one's control.

See above.

Don.
 
U

UrbanVoyeur

Don said:
As Reagan would say: There you go again... ;o)

I never said that starting with incorrectly rendered colors and
contrast is a good thing!

I said that starting with more of, and less corrupt, data is a good
thing!

If the raw scan has incorrectly rendered colors and contrast, the
profile will never be able to correct that. It will "pretend" to
"correct" it by modifying ("destroying") existing data, but it will
never be able to put in the missing data which was not there in the
first place. However, by "meddling" it only makes our job more, not
less, difficult.

LOL. If you say so Don.

Bear in mind that the vast, overwhelming majority of the pro world
(photographers, labs, printers, magazines, printer manufacturers,
scanner manufacturers, camera makers, etc) rely on color calibration
charts and the profiles they produce because they
(1) work
(2) make our jobs easier
(3) produce better images
(4) don't "damage" the images.

If any of these were not true, people would not use calibration.

Believe whatever you like. :)
 
M

mlgh

I suspect that the differences arising between Don and J in this thread
have a lot to do with the materials they work with.

J is scanning very recently exposed (by himself) and processed (by well
trusted labs) film. It strikes me that in this situation calibration at
every level is the way to go, in fact it is plain common sense.

Don however is (I think...apologies if I've got this wrong) scanning
older material for the purpose of archiving. If my experience is
anything to go by, fading and colour shifting will be present, to
widely varying degrees. In this situation scanner calibration /
profiling will be of very little benefit, if any at all.

mlgh
 

UrbanVoyeur

Don said:
And since we have shown that the end result will almost never look
like the slide anyway, that's moot!

Only in your world. Just because you change mediums does not mean colors
lose their relationships to each other - or to physical examples of
those colors in the non-digital world.

If a color exists in all the color spaces we pass through, why is there
any reason to expect it to be rendered as anything but the original value?

An "in gamut" red of a flower should be the same RGB value from the
flower to the print.

Now if I want to change the red for creative reasons, fine.

But unless I actively choose to edit that color, the production chain
should be neutral with respect to that "in-gamut" color. That is to say,
it should not shift it blue or green or change the luminance in any
representation of the image from file to monitor to printer.
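
UrbanVoyeur's point can be put numerically. As a minimal sketch (not from the thread; the matrices are the standard published linear-sRGB/XYZ pair and the color value is made up), a neutral chain should hand an in-gamut color back unchanged:

```python
# Sketch: a "neutral" conversion chain is (numerically) the identity
# for an in-gamut color. We push a linear-RGB value through the
# standard sRGB -> XYZ matrix and back; the round trip returns it.

# Linear sRGB (D65) -> XYZ and its inverse, standard published values.
M = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
M_INV = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def apply(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

rgb = [0.8, 0.2, 0.1]          # an in-gamut "flower red", linear RGB
xyz = apply(M, rgb)            # to the connection space
back = apply(M_INV, xyz)       # and back again

print(all(abs(a - b) < 1e-3 for a, b in zip(rgb, back)))  # True
```

Any irreversible edit layered into that chain breaks the round trip - which is the neutrality being argued over here.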

It's like a ruler. When I measure a meter using a ruler, I don't want
my ruler to give me a longer or shorter meter depending on which day I
use it.
And what about uncalibrated data from an unknown source?
Asked and answered.
We are not talking about your primary workflow, but the fact that even
when you start with an "unknown source" image you can still
incorporate it into your workflow without any problems and produce
results indistinguishable from "known source" images.

Um, yes, we are. I know my film, lab and scanner; I rarely print for
other people.

Besides, the "unknown image" is a straw argument. It has no bearing on
whether calibration works under *known* conditions, which *is* what we
are talking about.
I know, and that's the problem. You are looking at it too rigidly.
Nothing wrong with that, of course, you're just following the
standardized calibrated workflow instructions.

What I'm saying is, apply some lateral thought, and "think out of the
box" by looking at what is the real goal here and what are the real
effects of alternative workflows.

ROFL. I could say the same of you. ;-)

But seriously, I am looking carefully at the intended result, the actual
result and the work it takes to get there. Your methods/ideas about
color management, correction and "damage" just don't hold up in the real
world and are counterproductive. But if they work for you, I can't
argue with that.
No, it's not. That's the "dogma" I referred to earlier, and the fact
that you're locked into this "calibration-think".

LOL. Clearly you've never had to shoot fashion, people or things for a
magazine.

Trust me, if the film doesn't look like the subject, heads roll. If the
scans don't look like the film, graphic designers have cows. Great big
mooing ones.

And if the output from the printer doesn't look like the film, art
directors start biting off heads - it's not pretty.

It's not that I'm locked into "calibration think"; it's that I use the
real world, not my computer output, as a reference.

The real world neutral is what's on the final product.

Yes, you can define this color in a gamut of your choosing, and it may
be different from "should be" neutral, but all that is beside the
point. That color in the image is what will be printed even if it is
different from "sacred neutral".


I'll agree that you can make a good print that looks nothing like the
slide. But that's not how I work and not what I'm trying to do.

A print showing a bowl of blue apples may look great.

But if I shot green apples, and my film shows green apples, then I want
green and only green apples in that bowl when I print it.

My scanner has no business telling me those are blue apples! And I will
correct its output with a profile.

You might shoot the same scene and be happy with the blue apples - maybe
happier than you are with a green apple print. I will not.

That's what I mean by differences in working style and working flow.

It is, however, *essential* to define the colors in the processed
image (even though, literally speaking, they are "heretical"). The
reason is so that others viewing it on their monitors see the same
thing and the printout comes out right.
For you, as long as the image on the monitor looks like the print and
the print looks good, you're happy. Fine.

I want the print to look like my slide. We differ in objectives, style
and workflow.


Well, you can't have it both ways. If you don't edit, then scanner
profiling makes perfect sense. If you do, the moment you import it
into PS, the scanner profile becomes meaningless.


Not at all. It becomes a starting point. But just as you say I hold on
dogmatically to calibration, you hold on to this idea of the
"undamaged" image file.


It's nothing to do with working style!

It's only an objective statement of fact, unrelated to any working
style.

ROFL. Objective only in your rather oddly defined sense of the word.

I have never advocated any working style. All I'm saying is this is
what the reality is.

Again, only, for *your* workflow. Again, I suspect because your images
work well in that conceptual framework and would not work well in a
calibration workflow. So what appears objective to you, is merely what
works for your shooting and editing style.

You could say the same about the calibration I am advocating - except
that almost the entire pro world uses it.

The fact is you don't know that with an image from an "unknown
source". It may very well have been calibrated! You simply don't know.

That's my point! So you take it at face value, which is exactly how
you take a calibrated image. Your editing process does *not* change.

That's a specious argument.
(1) You and I both know the emulsion and scanner used when we scan our
images.
(2) It has nothing to do with whether calibration works on known images.

It's unknown. You do the best you can.

What I *do* know is that the amount of correction and work it takes to
get a pleasing image from one of my *known* slides is far greater if I
don't apply a calibration profile before I start editing. And my own
editing does not begin to correct all the subtle issues a calibration
profile addresses. A good profile corrects along many simultaneous color
dimensions.
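
The "many simultaneous color dimensions" point is that a profile can mix channels, which no set of independent per-channel curves can. A toy sketch (all numbers hypothetical, not from any real scanner or profile):

```python
# Toy sketch of the simplest matrix-style correction a profile can
# encode. Each output channel depends on *all three* inputs, so it can
# undo a cross-channel cast in one step - here, a hypothetical scanner
# that leaks 10% of the red signal into the blue channel.

def scanner(rgb):
    """Hypothetical scanner error: blue picks up 10% of red."""
    r, g, b = rgb
    return [r, g, b + 0.1 * r]

# The correcting matrix is simply the inverse of that leak.
CORRECT = [
    [ 1.0, 0.0, 0.0],
    [ 0.0, 1.0, 0.0],
    [-0.1, 0.0, 1.0],
]

def apply(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

true_color = [0.6, 0.3, 0.2]
scanned = scanner(true_color)    # blue cast: [0.6, 0.3, 0.26]
fixed = apply(CORRECT, scanned)  # ~ [0.6, 0.3, 0.2] again
```

No adjustment of the blue curve alone could do this, because the size of the cast depends on how much red is in each pixel.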

The only difference is that when you do know an image comes from a
calibrated source, it may give you a "warm, fuzzy feeling" of
security, but in realistic terms, it makes absolutely no difference to
your editing process i.e.:

You still clip the highlights and the shadows a little to boost
contrast the same way, you still click on a sidewalk to set the gray
the same way, you still apply a bit of an "S" curve to "pep the image
up" the same way, you still calibrate the monitor and the printer the
same way, and so on and so forth... No difference!
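
The generic edits listed above can be sketched as ordinary curve operations on a 0..1 channel value (every function and number here is illustrative, not anyone's actual workflow):

```python
# Sketch of the generic edits described above, on 0..1 channel values.
# All thresholds and sample values are made up for illustration.

def levels(v, black=0.02, white=0.98):
    """Clip the endpoints and restretch (boosts contrast)."""
    v = min(max(v, black), white)
    return (v - black) / (white - black)

def s_curve(v):
    """A gentle 'S' (smoothstep): darkens shadows, lifts highlights."""
    return v * v * (3.0 - 2.0 * v)

def gray_balance(rgb, sampled_gray):
    """Scale each channel so a clicked 'should-be-neutral' sample
    (the sidewalk) comes out equal in R, G and B."""
    target = sum(sampled_gray) / 3.0
    return [c * target / s for c, s in zip(rgb, sampled_gray)]

# A slightly warm pixel, balanced against a warm sidewalk sample,
# then run through levels and the S curve:
pixel = gray_balance([0.55, 0.50, 0.45], sampled_gray=[0.52, 0.50, 0.48])
pixel = [s_curve(levels(c)) for c in pixel]
```

The disagreement in the thread is not over these steps themselves but over whether a profile applied *before* them helps or hurts.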

Um, no.
(1) I use far less correction or, most of the time, none if it's calibrated
(2) I don't have to tweak dozens of colors to get them in the right place.
Again, it's nothing to do with workflow - as I've just shown above.

No, you haven't shown me, but I guess you know that already.
First it was the black point, then the color space, and now it's the
workflow... ;o) We just can't seem to talk about the same thing... ;o)

LOL. At least not in the same way. They are all interrelated.
Yes!

And if your starting point is a compromised image ("damaged" data)
only so it *superficially* "pretends" to look like the slide, which
itself bears no resemblance to the final end product - then your
starting point is pretty bad. That's my point exactly.

Ok. Don.
We've been over this already. From unfavorable shooting conditions, to
"creative" reasons as you call them, to gamut limitations of different
output media down the line, etc. etc.

I don't consider limitations of my materials necessarily outside of my
control. They merely define the color spaces I work in; it's still up to
me to decide how my images fit in those spaces. In other words, aliens
are not secretly sneaking in and randomly altering my slides - that would
be out of my control.

Knowing the limitations of your tools is just good craft.
 

UrbanVoyeur

mlgh said:
I suspect that the differences arising between Don and J in this thread
have a lot to do with the materials they work with.

J is scanning very recently exposed (by himself) and processed (by well
trusted labs) film. It strikes me that in this situation calibration at
every level is the way to go, in fact it is plain common sense.

Don however is (I think...apologies if I've got this wrong) scanning
older material for the purpose of archiving. If my experience is
anything to go by, fading and colour shifting will be present, to
widely varying degrees. In this situation scanner calibration /
profiling will be of very little benefit, if any at all.

mlgh

You could very well be right.
 

rafe bustin

I suspect that the differences arising between Don and J in this thread
have a lot to do with the materials they work with.

J is scanning very recently exposed (by himself) and processed (by well
trusted labs) film. It strikes me that in this situation calibration at
every level is the way to go, in fact it is plain common sense.

Don however is (I think...apologies if I've got this wrong) scanning
older material for the purpose of archiving. If my experience is
anything to go by, fading and colour shifting will be present, to
widely varying degrees. In this situation scanner calibration /
profiling will be of very little benefit, if any at all.

mlgh


Applying calibration or profiles to old,
faded slides seems like an exercise in
futility.

I've been looking at quite a few of these
in the last few weeks. Slides dating back
to the late 1950s. I'd guess 75% or so have
serious fading or color shift issues.

Not surprisingly, the Kodachromes fare
the best.


rafe b.
http://www.terrapinphoto.com
 
