VueScan Review at Photo-i


Don

So if I understand you correctly, using calibration slides only gives you
general "hints" in the right direction?

When compared to specific calibration for any one specific setup, yes.
Here I think we get to the core of the problem. This "generic profile"
from a "distant lab" is in fact very carefully constructed to meet
international standards of color and accuracy. It is not "all things to
all people". It is a well constructed tool.

Nevertheless, it's still a compromise when applied to any one specific
environment.
I'm not sure what "higher accuracy" you are striving for. ....

Absolutely NOT. Unexposed film is the color of unexposed film.

Black is an internationally defined color that can be found on standard
color charts and chip sets everywhere.

First of all you overlook the "in theory" part. The basic problem is
we're talking about different things. You're talking about the
*representation* of black in any one specific context (monitor,
printer, etc). So for you color is always relative and simply a
function of a device. But, instead of starting another sub-thread...

In any case, as I said already, if you don't like the black point
example, use the Spyder example. That's not what the original
paragraph was about.

I think we're digressing again, probably due to our different aims.
Your prime aim appears to be maintaining color accuracy throughout the
chain, my prime aim is to get the most out of a scan - which was the gist
of the original paragraph which sprouted all this.

Therefore, if a CMS modifies my scan I will respectfully decline. For
you however, the fact that the scan is modified is a total non-issue
if the CMS addresses your main goal.

So, in fact, we are talking about different things.

Don.
 

UrbanVoyeur

Don said:
First of all you overlook the "in theory" part. The basic problem is
we're talking about different things. You're talking about the
*representation* of black in any one specific context (monitor,
printer, etc). So for you color is always relative and simply a
function of a device. But, instead of starting another sub-thread...

Please allow me a brief digression:

Before RGB monitors and CMYK printers, there were standard color charts
and standard color spaces which defined the visible spectrum of light.

Traditionally, in one of the most widely used systems, there were 6
Cardinal Colors: Red, Orange, Yellow, Green, Blue and Violet (3 primaries
Red, Yellow, Blue, and 3 secondaries Orange, Green & Violet).

Among these it was determined, largely through trial and error over the
ages, that people can easily discriminate about 3000 discrete shades -
roughly 500 of each color, though more subtle gradations can be perceived.

A linear Black to White scale of about 500 shades of gray was also
pulled from these 3000.

That's it in terms of the colors *we can see* - that is our color space.

More recently this color space has been mapped and codified.

Each of the discretely recognizable colors in this space is now well
defined internationally. Down to wavelengths. And yes, black is a color,
though it may be arrived at differently in different models.

Pantone, Kodak and GretagMacbeth among others have developed
standardized charts of these colors as well as translations of them into
the RGB and CMYK color models.

If you calibrate your monitor using a colorimeter you are calibrating it
to this international color standard.

If you calibrate your scanner with a measured slide of a Kodak chart, you
are calibrating to this international standard.

Same for all the printer calibration tools.

This is applied color theory. If you want to map your system colors to
the real world, this is how it's done.

If you map your system to some other standard, you cannot expect any
correlation between what you see on your slides and what shows up on
your monitor.
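
As a rough illustration of what calibrating against a measured target
means in practice - a deliberately simplified sketch, not how profiling
packages actually build ICC profiles, and with invented patch values -
one could fit a correction that maps the scanner's measured patch values
onto the target's published reference values:

```python
# Simplified sketch of target-based scanner calibration: fit a 3x3 matrix
# that maps the scanner's measured RGB patch values onto the target's
# published reference values. Real ICC profilers use many more patches
# and full lookup tables; the numbers below are invented for illustration.
import numpy as np

# Each row is one patch from a scanned calibration slide (0.0-1.0 RGB).
measured = np.array([
    [0.92, 0.89, 0.85],   # "white" patch as the scanner saw it
    [0.48, 0.22, 0.19],   # red patch
    [0.21, 0.45, 0.24],   # green patch
    [0.18, 0.20, 0.52],   # blue patch
    [0.50, 0.48, 0.44],   # mid gray
])

# Published reference values for the same patches (what they *should* be).
reference = np.array([
    [0.95, 0.95, 0.95],
    [0.55, 0.18, 0.15],
    [0.17, 0.50, 0.20],
    [0.15, 0.17, 0.58],
    [0.48, 0.48, 0.48],
])

# Least-squares fit: find M such that measured @ M ~= reference.
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

def apply_profile(scan_rgb: np.ndarray) -> np.ndarray:
    """Apply the fitted correction to an (H, W, 3) float scan."""
    corrected = scan_rgb.reshape(-1, 3) @ M
    return np.clip(corrected, 0.0, 1.0).reshape(scan_rgb.shape)
```

A real profile uses far more patches and a full lookup table rather than a
single matrix, but the principle - measured values pulled toward known
reference values - is the same.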
In any case, as I said already, if you don't like the black point
example, use the Spyder example. That's not what the original
paragraph was about.

I think we're digressing again, probably due to our different aims.
Your prime aim appears to be maintaining color accuracy throughout the
chain, my prime aim is to get the most out of a scan - which was the gist
of the original paragraph which sprouted all this.

I think the two go hand in hand. In shooting and processing, I get the
most I can out of my film. I expect my digital workflow to faithfully
represent that. Color accuracy is one dimension, though an important one
when shooting slides.

The other major dimension is tonal range and contrast. A good
calibration slide is crucial to accurately reproducing the film's full
tonal range.

For me, 95% of the work happens when I release the shutter -
composition, highlight and shadow placement, contrast slope, color
balance. I shoot with a particular film speed, processing push and lab
in mind.

When it comes time to scan & print, my manipulations are largely if not
entirely done. I want to represent what's on that film - so that the
image from the printer looks like the image on the monitor which looks
like the image on the slide.

Because the slide represents exactly what I wanted when I pressed
the shutter.
Therefore, if a CMS modifies my scan I will respectfully decline. For
you however, the fact that the scan is modified is a total non-issue
if the CMS addresses your main goal.

So, in fact, we are talking about different things.

Any Color Management System - or the absence of one - will modify your
scan no matter what you do. That is the nature of digitizing. The
question is controlling that modification.

But I'll agree with you that we have different goals, with one caveat:

I may be mistaken, but from what I've read you have taken exception to the
color & contrast performance of VueScan - with Kodachrome, but also with
other films.

I wonder, since you use a non-standard method of calibrating your
scanner, could this be the cause of some of your difficulties?

Especially since your monitor is calibrated to ISO color standards using
a colorimeter, but your scanner is calibrated to something else entirely.
 

Robert Feinman

Among these it was determined, largely through trial and error over the
ages, that people can easily discriminate about 3000 discrete shades -
roughly 500 of each color, though more subtle gradations can be perceived.

A linear Black to White scale of about 500 shades of gray was also
pulled from these 3000.

That's it in terms of the colors *we can see* - that is our color space.
Your analysis is basically correct as far as it goes, but the eye can
see many colors that can't be reproduced by phosphors, inks or film
dyes. That's where the artistry comes in, mapping the real out-of-gamut
colors to something which simulates the original scene.
The international standards don't cover this. That's why Photoshop, for
example, has three different rendering intents.
 

Don

Please allow me a brief digression:

That's what this whole sub-thread is. ;o) Digress away!

.... omitted for brevity...
If you map your system to some other standard, you cannot expect any
correlation between what you see on your slides and what shows up on
your monitor.

Yes, that's all fine, but that's not what the subject is. The question
(paraphrased) was:

"Is there a one-size-fits-all calibration?"

And the answer was:

"No, but if you want a generic solution then look for objective,
self-correcting references and methods."

*and*

"It's not going to be easy!"
I think the two go hand in hand. In shooting and processing, I get the
most I can out of my film. I expect my digital workflow to faithfully
represent that. Color accuracy is one dimension, though an important one
when shooting slides.

Yes, but by applying a profile you are *changing* data received from
the scanner. I understand that for you this is a Good Thing in order
to maintain color accuracy down the chain, etc, but strictly speaking,
the resulting data is *not* what you got from the scanner! That's
where the "raw scan sect" (of which I'm a card-carrying member)
shrieks in horror.

Now, you will probably say "Excellent! It's corrected data! That's
exactly what I *want*".

And the "raw scan sect" will say "Horrible! It's corrupted data!
That's exactly what we *don't* want".

So we end up talking about two different things.
The other major dimension is tonal range and contrast. A good
calibration slide is crucial to accurately reproducing the film's full
tonal range.

No, the actual tonal range is a function of a scanner's dynamic range
capability.

All (*post-scan!!*) calibration does is apply a particular
*software* correction to this raw, *hardware* scan.

One could do that in Photoshop afterwards! Yes, yes, I know... That
would be a "wild" correction outside of the CMS chain unlike "tamed"
correction of a calibrated profile, but that's not the point.

The trouble is all that happens *after* the actual scan! It does not
change anything during the scanning process.

And anything that happens after the scan can be done elsewhere.
Any Color Management System - or the absence of one - will modify your
scan no matter what you do. That is the nature of digitizing.

Yes, but that's a different subject. If we take that attitude we
should stop scanning altogether until the resolution gets (according
to one estimate) to ~10,000 dpi at which point even the smallest
particle in "grain clouds" is resolved without aliasing.
But I'll agree with you that we have different goals, with one caveat:

I may be mistaken, but from what I've read you have taken exception to the
color & contrast performance of VueScan - with Kodachrome, but also with
other films.

VueScan has nothing to do with this. The problems with VueScan are
much more mundane: amateur bugs and rampant unreliability.
I wonder, since you use a non-standard method of calibrating your
scanner, could this be the cause of some of your difficulties?

Especially since your monitor is calibrated to ISO color standards using
a colorimeter, but your scanner is calibrated to something else entirely.

No, no, my difficulties had nothing to do with any of the above. The
problems were way *below* that level. Much more basic.

In essence, the problem was twofold. One was Nikon scanner's inherent
inability to reproduce Kodachrome (KC) perfectly. But that's not that
hard to solve, although I would have preferred a word of advice from
Nikon instead of being taken for a ride by the incompetent so-called
"support". Having purchased a second Nikon scanner with KC mode, I was
very disappointed that KC mode doesn't do nearly enough. Proving once
again that "canned" or "generic" so-called "solutions" are anything
but, due to their "all things to all people" aspect. Which is to be
expected.

However, the biggest problem was insufficient bit-depth. Specifically,
the amount of noise still present in the image (again as related to
Kodachromes but evident in other film as well). Even though - in
theory - 14-bits of my current scanner should be more than enough with
about 1.5 bits of headroom to spare for noise, in reality it's not. My
tests showed that I really need about 17-18 bits to truly penetrate
those dense KCs and get rid of *all* noise.

From that, my efforts moved on to "contrast masking" but with a twist.
Instead of mixing images (and thus, in my view, "cross-polluting"
them) I wanted a "clean" merge where highlights scan *only* provides
the highlights and shadows scan *only* provides the shadows, with a
hard border (without any feathering or blending) between them. Of
course, different scanner exposures resulted in different color
balance and just bringing the shadow scan down wasn't enough. Which
then led to High Dynamic Range images which brought their own set
of problems... etc... etc...
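
Purely as an illustration of that hard-border merge - not the actual code
involved, and with the threshold value and array handling as assumptions -
the idea looks something like this:

```python
# Rough sketch of the "clean merge" described above: take the shadows from
# the shadow-penetrating scan and the highlights from the normal scan,
# split at a hard luminance threshold with no feathering or blending.
# The two scans are assumed to be pixel-aligned floats in 0..1.
import numpy as np

def hard_merge(highlight_scan: np.ndarray,
               shadow_scan: np.ndarray,
               threshold: float = 0.25) -> np.ndarray:
    """Merge two (H, W, 3) float scans with a hard border, no blending."""
    # Luminance of the normally exposed scan decides which source wins.
    luma = highlight_scan @ np.array([0.2126, 0.7152, 0.0722])
    dark = luma < threshold                      # boolean (H, W) mask
    merged = np.where(dark[..., None], shadow_scan, highlight_scan)
    return merged
```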

Anyway, as I write elsewhere, all that has now been resolved since
I've switched to custom software.

Don.
 

UrbanVoyeur

Don said:
Yes, that's all fine, but that's not what the subject is. The question
(paraphrased) was:

"Is there a one-size-fits-all calibration?"

And the answer was:

"No, but if you want a generic solution then look for objective,
self-correcting references and methods."

Except that from what you describe, it is not objective.

For you, this has been a discussion of "one size" vs "self correcting".

For me, your "self correcting" methodology for slide film is so flawed
that I have objected to it.

I have no problem with continuous self correction as a general approach.
And I do think we agree on the need for monitor calibration.

However we differ greatly on what calibration means, monitor or
otherwise, and from what I've seen, some fundamental misunderstandings
about color management have led you down a difficult and error filled
path with color slides.

Yes, but by applying a profile you are *changing* data received from
the scanner. I understand that for you this is a Good Thing in order
to maintain color accuracy down the chain, etc, but strictly speaking,
the resulting data is *not* what you got from the scanner! That's
where the "raw scan sect" (of which I'm a card-carrying member)
shrieks in horror.

Now, you will probably say "Excellent! It's corrected data! That's
exactly what I *want*".

And the "raw scan sect" will say "Horrible! It's corrupted data!
That's exactly what we *don't* want".

You say corrupted, I say corrected. :)

After all, what good is a scan if it does not match the slide?
So we end up talking about two different things.



No, the actual tonal range is a function of a scanner's dynamic range
capability.

It starts with the tonal range of the film. From there you decide where
and how to place the tonal range of the film into the tonal range of the
scanner. That's where profiles come in.

All (*post-scan!!*) calibration does is apply a particular
*software* correction to this raw, *hardware* scan.
Yes.

One could do that in Photoshop afterwards! Yes, yes, I know... That
would be a "wild" correction outside of the CMS chain unlike "tamed"
correction of a calibrated profile, but that's not the point.

Actually, that is the point. I am saving myself countless hours poking
around for just the right correction, which I may or may not be able to
find next time.

The trouble is all that happens *after* the actual scan! It does not
change anything during the scanning process.

Aside from very limited control over LED exposure, IR, and multi
sampling, there is nothing you can do to change the scan short of
building your own scanner.

No, no, my difficulties had nothing to do with any of the above. The
problems were way *below* that level. Much more basic.

In essence, the problem was twofold. One was Nikon scanner's inherent
inability to reproduce Kodachrome (KC) perfectly. But that's not that
hard to solve, although I would have preferred a word of advice from
Nikon instead of being taken for a ride by the incompetent so-called
"support". Having purchased a second Nikon scanner with KC mode, I was
very disappointed that KC mode doesn't do nearly enough. Proving once
again that "canned" or "generic" so-called "solutions" are anything
but, due to their "all things to all people" aspect. Which is to be
expected.

This brings up an important distinction. The "KC Mode" in NikonScan *is*
a very generic, canned profile, that is wildly inaccurate.

It bears no resemblance to the ICC profile created when you use a
calibration slide - which is neither "canned" nor "generic".

But you choose not to believe that and I clearly cannot dissuade you. I
would only encourage you to give the cal slides a try, especially now
that B&H has inexpensive Kodachrome calibration slides.

With the caveat that underexposed areas on high contrast films will
always present a problem. Your approach to contrast masking is one
common solution.

Anyway, as I write elsewhere, all that has now been resolved since
I've switched to custom software.


Best of luck to you. Though I sincerely believe you could have ended up
in the same place with a whole lot less work and custom software.
 

mlgh

Best of luck to you. Though I sincerely believe you could have ended up
in the same place with a whole lot less work and custom software.




The difficulty arises when you have to scan old faded / colour shifted
slides, or for that matter negatives, and if there is any way of doing
that other than "poking around" in Photoshop I would be pleased to hear
about it.

MLGH
 

UrbanVoyeur

mlgh said:
The difficulty arises when you have to scan old faded / colour shifted
slides, or for that matter negatives, and if there is any way of doing
that other than "poking around" in Photoshop I would be pleased to hear
about it.

MLGH

weeeell, that's a tough one.

Most old color emulsions were not as colorfast as today's, and in any
case they don't fade in a linear fashion - that is to say some colors
shift & fade faster than others.

If you can get a bunch of them from similar vintage/emulsion in a group,
you can tweak the process for one and make it an action.
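
Outside Photoshop, the same "fix one, batch the rest" idea could be
sketched along these lines - the file names, percentiles, and the 8-bit
handling here are illustrative assumptions only:

```python
# Work out a per-channel black/white point on one representative faded
# slide, then apply the same stretch to every scan in the folder.
import glob
import numpy as np
from PIL import Image

def measure_correction(sample_path: str):
    """Per-channel low/high points taken from one hand-picked reference scan."""
    img = np.asarray(Image.open(sample_path).convert("RGB"), dtype=np.float64) / 255.0
    lo = np.percentile(img, 0.5, axis=(0, 1))    # per-channel shadow point
    hi = np.percentile(img, 99.5, axis=(0, 1))   # per-channel highlight point
    return lo, hi

def apply_correction(path: str, lo, hi, out_path: str):
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    stretched = np.clip((img - lo) / (hi - lo), 0.0, 1.0)   # same stretch for all
    Image.fromarray((stretched * 255).astype(np.uint8)).save(out_path)

lo, hi = measure_correction("reference_slide.jpg")
for path in glob.glob("batch/*.jpg"):
    apply_correction(path, lo, hi, "corrected/" + path.split("/")[-1])
```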

Some software, like SilverFast, has pretty good restoration features
that will at least give you a decent starting point.

Also, Kodak has ROC restoration plug-ins for Photoshop on their
site. http://www.asf.com (Applied Science Fiction was bought by
Kodak a while back)

The basic plug-ins used to be free, but now they make you pay for the
regular and the pro.
 

Don

You say corrupted, I say corrected. :)

Exactly! ;o)
After all, what good is a scan if it does not match the slide?

But the point is the (raw) scan does *not* match the slide!

All a scanner profile does is *pretend* to match the slide by
modifying the scan *after the fact* and (misre)presenting this
contraption as a "true likeness". It's not! It's a profile's
*impression*, created by permanently changing raw data!

Like that weird saying goes, putting wheels on my grandma won't turn
her into a bicycle. Conversely, applying a scanner profile won't
really make the scan any closer to the slide.

It's slapping on a coat of paint to hide the cracks in the wall in the
truest sense of the word. All the problems you had before the profile
was applied *are still there* (!) only now it's even more difficult to
get rid of them because they're obscured (and the data irretrievably
destroyed!!!) by the scanner profile!
It starts with the tonal range of the film. From there you decide where
and how to place the tonal range of the film into the tonal range of the
scanner. That's where profiles come in.

But neither the film nor the profile can give you the range if it's
not in the scanner. And currently (in the commercial domain, at least)
the scanner is the bottleneck and also the crux of the matter.

As I like to say, scanning is taking a picture of a picture. Which
means second generation so there is loss to start with. I don't want
to add to this loss by having a scanner profile change it even more
before I get to it!
Actually, that is the point. I am saving myself countless hours poking
around for just the right correction, which I may or may not be able to
find next time.

That's a false conclusion I also made initially. I too started out
with the (natural) desire to have my scan reflect the slide perfectly
(the whole KC chase...). But after thinking about it I realized that's
not what I want after all (as I will explain below). What I really
want is an *unadulterated* scan!

The key is this: Even if you profile the scanner, you will still have
to do the editing afterwards! Only instead of starting with a "virgin"
or "raw" image with *maximum* amount of data, you're starting
handicapped with an image that has already been compromised once by
the profile.

And, as we all know, the more changes you make the more damage there
is to data!! The best editing - in theory - would be to figure out
what to do and then do it all in *one* step.

You're swayed by the notion that you always start from the "same
point" i.e. a calibrated state. But that's totally pointless! Each
image is *different* depending on the infinite range of lighting
conditions and responses and whatnot... So your "initial state" bears
absolutely no relation to the end goal you want to get to after
editing!

In other words, "common initial state" is really a mirage. And then on
top, that "common initial state" you start with has already been
compromised once by the profile!

Indeed, in many cases, the editing will involve *removing* the changes
made by the profile!!!

I can give you tons of examples of this. Let's say you have a KC shot
with Tungsten lighting. The slide looks yellow and the correction is
to apply blue. Guess what? Nikon adds blue! However, your profile will
remove it making the image yellow. And then your editing will put the
blue back in again!

How does this yo-yoing help your end result?
Aside from very limited control over LED exposure, IR, and multi
sampling, there is nothing you can do to change the scan short of
building your own scanner.

So, you get the best the scanner can do i.e. a raw, unadulterated
scan!

The *last* thing you want to do is further corrupt the scanner output
and add to scanner distortions by piling on profile distortions!
Best of luck to you. Though I sincerely believe you could have ended up
in the same place with a whole lot less work and custom software.

Oh, absolutely not! Exactly the opposite is true. Had I written my own
software to start with I would have saved a lot of time, a lot of
nerves, and got far better results instead of wasting time hoping for
a magic solution elsewhere.

The closest was HDR Shop but it has two major problems - no, make that
three! It only accepts 8-bit input and the composite image is also
exported as 8-bit (yes, you can export to various HDR file modes but
none of them can be edited outside of HDR Shop - PS exception
addressed below).

So, I wrote a floating point to 16-bit converter (you also have to
apply gamma, BTW, because floats are stored in linear gamma!). To get
the most out of it, I even added automatic contrast enhancement because
that's better done in "infinite" floating point rather than finite
16-bit. But it didn't make any difference...
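
For what that conversion step involves, a generic sketch - not the
converter described above, and with the gamma value of 2.2 as an
assumption - looks like this:

```python
# Linear floating-point HDR data has to be gamma encoded before it is
# quantized to 16 bits, otherwise the shadows are starved of levels.
import numpy as np

def linear_float_to_16bit(linear: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Convert linear floating-point image data to gamma-encoded uint16."""
    clipped = np.clip(linear, 0.0, 1.0)          # drop anything outside 0..1
    encoded = clipped ** (1.0 / gamma)           # gamma encode the linear values
    return np.round(encoded * 65535.0).astype(np.uint16)
```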

You just can't get around that 8-bit source! GIGO, as we old computer
fogies used to call it (Garbage In, Garbage Out). The resulting 16-bit
image had irregular histograms (inspected with a true 16-bit histogram
I also had to write).

And then to add insult to injury, HDR Shop has a "VueScan feature" ;o)
i.e. a bug, whereby anything over 246 (on the 8-bit scale) gets
clipped (~0.92 in floating point)!! That was the last straw...

And so we come to the latest PS which claims to do HDR. All well and
good but PS still uses 8-bit input. Even if you feed it 16-bit images
PS will convert them to 8-bit first, before blending them into HDR. So
much for PS...

Therefore - at least for me - the only way to go is to
"roll-your-own". I get to input full 16-bit, sub-pixel align, and then
merge without fudging into 16-bit output i.e. generate my digital
negative which is then archived! Now, is that too much to ask? Well,
apparently...

Don.
 

Don

The difficulty arises when you have to scan old faded / colour shifted
slides, or for that matter negatives, and if there is any way of doing
that other than "poking around" in Photoshop I would be pleased to hear
about it.

Precisely! And having the profile further "corrupt" the already
compromised data is not going to help any!

Indeed, more often than not, one first has to remove the "damage" done
by the profile, before even addressing the underlying problem. And the
same applies to images which are not difficult as well.

So, there's no way around it. For people who want quality there's no
way around post-processing. And if that's the case, then one is much
better off starting with as "pure" a scan as possible. At least IMHO.

Don.
 

UrbanVoyeur

Don said:
Precisely! And having the profile further "corrupt" the already
compromised data is not going to help any!
Indeed, more often than not, one first has to remove the "damage" done
by the profile, before even addressing the underlying problem. And the
same applies to images which are not difficult as well.

ROFL

Ok Don, if you say so.
 

UrbanVoyeur

Don said:
But the point is the (raw) scan does *not* match the slide!

True.

All a scanner profile does is *pretend* to match the slide by
modifying the scan *after the fact* and (misre)presenting this
contraption as a "true likeness". It's not! It's a profile's
*impression*, created by permanently changing raw data!

You have a fundamental misunderstanding of color space and calibration.

Scanners digitize film. The raw file contains a great deal of info, but
not in a form or balance that resembles the original slide when rendered
in any of the digital color spaces (LCD monitor, ink jet printer, etc).
On top of that it contains the bias of the scanner and the film.

A well constructed calibration profile gives you a consistent,
externally validated correction for all of these issues. That's just a fact.

But neither the film nor the profile can give you the range if it's
not in the scanner. And currently (in the commercial domain, at least)
the scanner is the bottleneck and also the crux of the matter.

Well yes, but at this point in technology only extremely expensive
scanners can render the full contrast range of film in a single pass
without resorting to manipulations like HDR.

HDR and contrast masking are fine, but they do nothing to address the
deviation from neutral inherent in the scanner or the slide.


That's a false conclusion I also made initially. I too started out
with the (natural) desire to have my scan reflect the slide perfectly
(the whole KC chase...). But after thinking about it I realized that's
not what I want after all (as I will explain below). What I really
want is an *unadulterated* scan!

The key is this: Even if you profile the scanner, you will still have
to do the editing afterwards! Only instead of starting with a "virgin"
or "raw" image with *maximum* amount of data, you're starting
handicapped with an image that has already been compromised once by
the profile.

Actually I do almost no editing on 90-95% of the slides I shoot. There are
some that I intentionally under- or overexposed to make a visual point or
to capture a moment outside the limitations of the film, and those need
work.

I shoot under varied and often extreme conditions of light and color as
well as plain old day light.

But this is important: *very* little work needs to be done to my images
after a calibrated scan.

Most of the adjustments (90%) I make are at the time of printing, and it
almost always is to overcome a limitation of the inks in my Epson 1270 -
bronzing, metamerism, blocking up, etc. I do plan to move to an
UltraChrome printer soon.
And, as we all know, the more changes you make the more damage there
is to data!! The best editing - in theory - would be to figure out
what to do and then do it all in *one* step.

Define damage.

If the on-screen image looks like the slide, and the inkjet output looks
like the slide without adjustment, what is damaged?

If the magazine pages, CD covers, posters and flyers people produce from
my scans only require CMYK conversion and little or no tweaking, what is
damaged?

It's not just me - people print from my scans all the time, and all I
hear is how little adjustment and how "on the money" they are.

What is damaged?


You're swayed by the notion that you always start from the "same
point" i.e. a calibrated state. But that's totally pointless! Each
image is *different* depending on the infinite range of lighting
conditions and responses and whatnot... So your "initial state" bears
absolutely no relation to the end goal you want to get to after
editing!

Not true. Film is film. Fuji Provia 400F is Fuji Provia 400F. It has a
characteristic response curve. That curve is *extremely* consistent from
batch to batch when you use fresh film. Ask Wolf Faust.

The light I shoot under doesn't matter.

Some of the light will be inside of the color space of the film and some
will not. But the response curve of the film *does not* change. It may
be non-linear at the extremes, it may exhibit metamerism, but it is for
all intents and purposes a fixed quantity.

I have shot thousands upon thousands of rolls of Provia 100 and 400 F in
the past 5 years. When processed at a good professional lab the curve is a
constant.

The *only* thing I have seen is that a couple of labs are a touch cooler
than I like and a couple are a tad (1/8-1/4 stop) faster than others.
That's it.

So, with my Nikon LS 4000 and FujiChrome, I do have a consistent
starting point.

Again, if you don't believe me contact Wolf Faust at
http://www.targets.coloraid.de
I can give you tons of examples of this. Let's say you have a KC shot
with Tungsten lighting. The slide looks yellow and the correction is
to apply blue. Guess what? Nikon adds blue! However, your profile will
remove it making the image yellow. And then your editing will put the
blue back in again!

How does this yo-yoing help your end result?

You misunderstand color calibration and color balance. If I shoot
daylight film in tungsten light it will have a tungsten (orange-brown)
balance.

*IF* I want the image to appear daylight balanced, I need to either
apply an 80b filter equivalent *or* find an area on the scan and call it
"white" and balance to daylight with that.

That has *nothing* to do with the inherent bias of the scanner or the
relationship of the film's response to a standard color chart.

The calibration profile will not remove any tungsten cast from the film.
It will give you a neutral point to start from so you won't be fighting
both the film's bias and the scanner's bias while trying to correct the
tungsten to daylight - if that's what you want to do.

I've shot thousands of daylight rolls under tungsten stage lighting - the
calibration works.

Oh, absolutely not! Exactly the opposite is true. Had I written my own
software to start with I would have saved a lot of time, a lot of
nerves, and got far better results instead of wasting time hoping for
a magic solution elsewhere.


What I've put forth is far from a magic solution. It is a tested,
validated method of controlling scans so that they match film.

It is based on the methods used for nearly 100 years to color balance
film for motion pictures and still photography. It is a method used by
every pro lab and scanner manufacturer I know. It is even the method
that Fuji and Kodak recommend for getting the most out of scans.
http://www.colorprofiling.com/,
http://www.kodak.com/global/en/professional/hub/labDig/plrc/workflow/basics.jhtml


Your method is ... well.... your method.

You undoubtedly have considerable technical skill - writing your own
scanner software demonstrates that.

However, you are missing several key points about color space. I would
recommend Bruce Fraser's book "Real World Color Management". It's
written for photographers who use Photoshop.
 

Don

A well constructed calibration profile gives you a consistent,
externally validated correction for all of these issues. That's just a fact.

Which is not at dispute if we're talking about what happens
downstream. The only catch is I'm *not* talking about downstream...

At the front end (which is what I *am* talking about) the real
question is, how useful is it to change raw data from the scanner only
to make the image superficially appear like the slide, and then have
it changed again once you start editing? Often times *undoing* what
the profile has introduced!
Well yes, but at this point in technology only extremely expensive
scanners can render the full contrast range of film in a single pass
without resorting to manipulations like HDR.

HDR and contrast masking are fine, but they do nothing to address the
deviation from neutral inherent in the scanner or the slide.

But they do address the dynamic range which was the subject in the
above paragraph.

As to deviation from neutral which the scanner introduces, well, the
profile doesn't address that either. If the balance is not there, the
profile can't add it. The profile may only *pretend* to add it by
modifying raw data and thereby damaging it.

Now, if there was a way to change *data acquisition* e.g. by modifying
AG (Analog Gain), I'm all for that! (Indeed, that "calibration" is exactly
what I was trying to do to correct the KC bias.) The reason is because
varying *hardware* settings is essential to getting the most data out
of the scan.

But any *software* scanner "calibration" which changes the scan after
the fact is essentially cosmetic - but destructive.
But this is important: *very* little work needs to be done to my images
after a calibrated scan.

Then it's fine. For you.

However, that does not change the fact that the profile introduces
irreversible changes to raw data. And, for me, that's paramount!
Most of the adjustments (90%) I make are at the time of printing, and it
almost always is to overcome a limitation of the inks in my Epson 1270 -
bronzing, metamerism, blocking up, etc. I do plan to move to an
UltraChrome printer soon.


Define damage.

Loss of data.

Take an image.
Save the histogram (better still dump image as raw).
Apply any change e.g. a curve.
Reverse the change e.g. invert the above curve.
Save the histogram (better still dump image as raw).

In theory, they should be the same. In practice, they are miles apart.

Compare the two (histograms or data).
Shriek in horror!
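
For anyone who wants to try it without a scan handy, a minimal numeric
version of the same round-trip test might look like this - the gamma
curve and the 8-bit depth are arbitrary choices for illustration:

```python
# Apply a curve, apply its inverse, and see how many distinct levels survive.
import numpy as np

original = np.arange(256, dtype=np.uint8)                 # every 8-bit level once

def apply_curve(values, gamma):
    out = ((values / 255.0) ** gamma) * 255.0
    return np.round(out).astype(np.uint8)

bent = apply_curve(original, 1.0 / 2.2)                   # apply a curve
restored = apply_curve(bent, 2.2)                         # apply the inverse curve

print("distinct levels before:", np.unique(original).size)   # 256
print("distinct levels after :", np.unique(restored).size)   # noticeably fewer
print("max per-level error   :", int(np.abs(restored.astype(int) - original).max()))
```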

Or not... If not, then you're not concerned about data loss at this
scale and the whole discussion is moot.

Nothing wrong with that, of course, but it doesn't change the fact
that scanner profiles damage raw data. Even if you don't care about
this loss.
If the on-screen image looks like the slide, and the inkjet output looks
like the slide without adjustment, what is damaged?

I'm not addressing printing here at all. Yes, calibration of a monitor
makes perfect sense, as does the calibration of the printer. I would
just add one proviso that the images are tagged rather than converted.

Actually, that's the problem. I'm talking about data purity at the
earliest stage (front end), while you always race ahead and talk about
what calibration does to the rest of the chain (back end). That's not
the subject! Those are two totally different things.

It's *essential* to calibrate the monitor and the printer to get
consistent results, but what I'm talking about is the data you're
*starting* with. And the less damage there is to data at that stage,
the better results you will get downstream.
Not true. Film is film. Fuji Provia 400F is Fuji Provia 400F. It has a
characteristic response curve. That curve is *extremely* consistent from
batch to batch when you use fresh film. Ask Wolf Faust.

Which is all beside the point, because each image is different and
requires different editing to produce the final result.

The key is, this final result is *not* necessarily an identical
reproduction of what's on the film! In fact it rarely is identical.
Your premise is that it's (nearly) identical, and that's where we
differ.

Again, you may indeed be satisfied with the results you're getting but
that doesn't change any of the above facts.
The light I shoot under doesn't matter.

Of course it does! It's night and day - pun intended.
You misunderstand color calibration and color balance. If I shoot
daylight film in tungsten light it will have a tungsten (orange-brown)
balance.

No, you misunderstand my point again.

How is changing the image twice (the second time removing the first
change) going to improve the final result?
*IF* I want the image to appear daylight balanced, I need to either
apply an 80b filter equivalent *or* find an area on the scan and call it
"white" and balance to daylight with that.

*That's* what I'm talking about!!! Inadvertently, you made my point.

You are therefore *deviating* from the profile e.g. changing the image
twice! First, by applying the profile, and then the second time by
going *against* the profile by arbitrarily declaring an area "white"!

The example I gave was extreme to illustrate the point, but such
adjustments (to some degree) are necessary to virtually all images.

Now, your mileage may vary, and that's perfectly fine. But that does
not change the fact that applying a scanner profile, damages data.
That's all there is to it.
The calibration profile will not remove any tungsten cast from the film.

And it's not supposed to!

But the profile doesn't know that. It will blindly "correct" the scan
even though your editing may end up reversing this change! That's my
point!
What I've put forth is far from a magic solution. It is a tested,
validated method of controlling scans so that they match film.

The trouble is the final image does not always match the film! So that
intermediate step is usually counter-productive.
It is based on the methods used for nearly 100 years to color balance
film for motion pictures and still photography. It is a method used by
every pro lab and scanner manufacturer I know.

Which is all fine and dandy but has nothing to do with what I'm
talking about.

The trouble is you're confusing monitor/printer calibration - which is
*essential* - with data corruption at the front end. Those are two
totally different things.

Just because I say that a scanner profile compromises raw data
integrity does *not* mean I'm against monitor/printer calibration
after data acquisition!
Your method is ... well.... your method.

There is no "method"! That's the basis of your misunderstanding.

You're confusing insistence on *data integrity* with opposition to
calibration. That's a non-sequitur! And I've repeatedly stated that.
There is no conflict.

If anything, insisting on data integrity at the front end actually
*improves* everything that happens downstream, *including* color
management. Raw data integrity is *not* contrary to color management.
You undoubtedly have considerable technical skill - writing your own
scanner software demonstrates that.

Actually scanner software is fairly pedestrian to write as was the
rest of it. (Which makes it even more mind boggling why VueScan has so
many problems!?)

The key was running tests and realizing what is actually happening
instead of just blindly following dogma.

That's why not only do I switch scanner "calibration" off, but I also
prefer to twin scan instead of multi-scanning. The latter being
academic because Nikon crippled LS-50 by turning multiscanning off.

But - as it turns out in the end - that's no big loss.

Don.
 

UrbanVoyeur

Don said:
The key was running tests and realizing what is actually happening
instead of just blindly following dogma.


There is no "blindly following dogma", though you may think otherwise.

I personally and many others before me have validated with experience
everything about color space and calibration I've put forth.

We could go round and round, (I say this without intending to offend)
but your misunderstanding about color space, gamut, calibration,
profiles and curves prevent us from really getting anywhere. The beliefs
about color you are holding on to set obstacles before you that you
don't even realize are there.

Space and this medium do not permit the full discussion of the topic.

Take a look at Bruce Fraser's "Real World Color Management". He
distills many sources down to a clear, concise discussion of color
management as it relates to the digital photographer.

If you read it and try the examples in the book, it will disabuse you of
many, if not all of your false premises and erroneous conclusions with
regards to color space.

Or look at it as a test of what you think about color and scanning. If
what you believe holds up after reading his book, then you are validated.
 

UrbanVoyeur

Don said:
But they do address the dynamic range which was the subject in the
above paragraph.

Only partially. Improperly rendered colors or incorrect translation of
out of gamut color will result in inconsistent contrast ranges.

As to deviation from neutral which the scanner introduces, well, the
profile doesn't address that either.

Actually it does. But you have to understand the color spaces you
are working with.


*That's* what I'm talking about!!! Inadvertently, you made my point.
You are therefore *deviating* from the profile e.g. changing the image
twice! First, by applying the profile, and then the second time by
going *against* the profile by arbitrarily declaring an area "white"!

You are not going against the calibrated profile. The profile contains
limited, very specific corrections that do not sway a particular image
toward or away from any color balance. It just brings you back to neutral.

The example I gave was extreme to illustrate the point, but such
adjustments (to some degree) are necessary to virtually all images.

Now, your mileage may vary, and that's perfectly fine. But that does
not change the fact that applying a scanner profile, damages data.
That's all there is to it.




And it's not supposed to!

But the profile doesn't know that.

Actually, it does. Its corrections are highly targeted within a given
color space.

Here's one way to think of it:

Think of color management in terms of concentric circles. The larger the
circle the greater the color range and dynamic range it encompasses.

The largest for us is "All the colors of light on Earth". For our
purposes the next smallest is "Human Vision" - all the colors we can see.

All Light > Human Vision > Slide Film > Photo Paper & Inkjets > Consumer
scanners > Adobe RGB 1998 > Monitors > sRGB

Colors and contrast ranges that exist in the color space are "in gamut",
color and contrast ranges that do not are "out of gamut".

At each transition point, there is a translation of color and contrast
values. That translation is the characteristic "curve" of the medium.

Different films have different curves and record identical scenes to
different color spaces. Some favor blue, others respond more heavily to
certain reds.

When going from a larger color space (a scene we like) to a smaller
color space (a photo of the scene on slide film) colors and contrast
ranges that exceed the limits of the medium are not recorded. That's
another way of saying we see more colors, more highlight and more shadow
detail when we take the photo than what appears on film.

The goal of calibration is to make sure that a given color and contrast
range that is contained (in gamut) in two color spaces is rendered the
same way in both. This includes the effects of metamerism.

The correction of calibration takes into account the response curve of
the medium. A given in-gamut blue will be the same blue when I see it,
the same blue on the film, the same blue in the scanned TIFF.

It may *look* slightly different because we are discarding color
information at each stage to go to a smaller space BUT the in gamut
color will remain true throughout.

This is important when we want to print and display images that look
like our slides. We have to make sure the "blues" match.

Without calibration as a guide, we are left with the curves of the
medium, which in the case of both film and scanners are complex, and
non-linear across colors and exposure values. At which point we guess.

So-called data accuracy in a scan doesn't mean much if your starting
point is off.

A good set of diagrams on this topic can be found in a recent luminous
landscape article.
http://www.luminous-landscape.com/tutorials/prophoto-rgb.shtml

One note: diagrams of color space in most books & articles describe the
spaces as concentric triangles rather than circles.
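
As a small numeric illustration of these nested spaces, the pure green
primary of Adobe RGB (1998) falls outside sRGB, which shows up as
negative components when the same color is expressed in linear sRGB. The
matrices below are the commonly published D65 values, rounded; treat the
exact figures as an assumption:

```python
# Convert the Adobe RGB (1998) green primary to linear sRGB via XYZ and
# observe that it cannot be represented (negative components = out of gamut).
import numpy as np

ADOBE_RGB_TO_XYZ = np.array([
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
])
XYZ_TO_SRGB = np.array([
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
])

adobe_green = np.array([0.0, 1.0, 0.0])           # linear Adobe RGB green primary
xyz = ADOBE_RGB_TO_XYZ @ adobe_green
srgb_linear = XYZ_TO_SRGB @ xyz

print(srgb_linear)        # roughly [-0.40, 1.00, -0.04]: outside sRGB's gamut
```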
 

Don

I personally and many others before me have validated with experience
everything about color space and calibration I've put forth.

The only catch is the subject is *not* color space and calibration
down the chain - as I keep repeating but it keeps getting ignored.

The subject is *data loss and data integrity* at the front end.

Applying a scanner profile has *in reality* - and that's what I meant
by tests rather than dogma - nothing to do with, and no effect on,
*the rest* of the calibration chain!

The reflex (i.e. "dogmatic") reaction to such a "heretical" statement
is to jump up and down and go: "But... but... but... One must profile
the scanner! All the scanning "scriptures" say so! See here...".

Well, one must not. Here's a simple proof. Take any image from an
*uncalibrated* source. Process and print that image using a calibrated
monitor and printer. Has the resulting image suffered any by starting
out "uncalibrated"? Of course, not!

Therefore, starting with an uncalibrated scan has absolutely *nothing*
to do with either color space or calibration *from that point on*!
And that's the key: "from that point on".

Having established that - and I hope it is established now - the
question, then, is: What is better - pure, unadulterated data or
"tainted" data? And the answer to that, again, is quite
straightforward: Pure data, of course.

Now please read that last paragraph again. You would have noticed *no
mention* of either color space or calibration anywhere!
We could go round and round, (I say this without intending to offend)
but your misunderstanding about color space, gamut, calibration,
profiles and curves prevent us from really getting anywhere. The beliefs
about color you are holding on to set obstacles before you that you
don't even realize are there.

No offence taken. Indeed, I think we are really on the same side of
the argument for the most part, if only we could actually talk about
the subject matter for a minute... ;o)

The problem, in a nutshell, is this:

I say:

Without a profile the scanner produces pure data.

You hear:

Throw away *all* color space and calibration!

That's a total non-sequitur.
If you read it and try the examples in the book, it will disabuse you of
many, if not all of your false premises and erroneous conclusions with
regards to color space.

No, the only false premises and erroneous conclusions are the ones you
made based on misunderstanding what I said.

We need go no further than your mistaken (and unwarranted) impression
about the level of my color space and calibration knowledge.

I intentionally do not want to get dragged into that discussion and
digress even further but we could have a *separate* discussion on
color space and calibration, if you wish.

However, I guess that would be a very short discussion, because - from
what you wrote - I gather we would probably agree on almost
everything.

Don.
 

Don

Actually it does. But you have to understand the color spaces you
are working with.

A profile does *not* affect the *hardware*. It only pretends to
correct the data *after* the scan.

If a profile modified Analog Gain, for example, then it *would* change
what the scanner *hardware* produces.

But it does not, and therefore, color space is totally and completely
irrelevant in this context. It's what comes out of CCD wells that's
the subject matter. And a profile has absolutely no effect on CCD
wells.
You are not going against the calibrated profile. The profile contains
limited, very specific corrections that do not sway a particular image
toward or away from any color balance. It just brings you back to neutral.

The problem is that "neutral" is based on what's on film, not on the
real neutral (end product)!

And - as we have established already - what's on the film is *not* the
same as the end product i.e., the *real* neutral.

Therefore, again, this profile so-called "neutral" is not the same as
the *actual* real neutral.

And to achieve the actual, real neutral the profile "work" will have
to be undone - at the very least, in part.

And even if it didn't, any subsequent editing (to amplify rather than
fight the profile) will only further compromise the data. And - as
already established - two edits damage data more than a single edit.
Think of color management in terms of concentric circles. The larger the
circle the greater the color range and dynamic range it encompasses.
....snip...

That's a very nice summary of color management... For those who need a
summary.

As it happens, I don't. Furthermore, it has nothing to do with the
subject at hand.

However - and I really mean this! - it is a very nice and concise
primer for anyone else reading along.

Don.
 

John

Don said:
It just brings you back to neutral.

The problem is that "neutral" is based on what's on film, not on the
real neutral (end product)!

And - as we have established already - what's on the film is *not* the
same as the end product i.e., the *real* neutral.

Therefore, again, this profile so-called "neutral" is not the same as
the *actual* real neutral.

And to achieve the actual, real neutral the profile "work" will have
to be undone - at the very least, in part.
What if you were to use a film calibration target, i.e. a camera shot on
Kodachrome of a standard calibrated target? By using the reference values of
the original target (rather than those measured on the slide) you would
then correct the colour balance of the film and neutral would correspond to
'real' neutral. You could apply your analogue gain correction as well before
profiling. At least in theory :)
 

UrbanVoyeur

Don said:
The only catch is the subject is *not* color space and calibration
down the chain - as I keep repeating but it keeps getting ignored.

The subject is *data loss and data integrity* at the front end.
Ok. I'll bite. Let's say you get every single bit from your scanner. Now
what?

Well, unless you are extremely lucky your image will need correction.

You can apply it after the scan as a profile, as I have suggested.

And as you said, the data will be altered.

What is the alternative?

Well, you could attempt to correct during the scan. In the case of
the Nikon you have 3 LEDs to work with. In other scanners you have fewer.

Unfortunately, attempting to control precise color management is nearly
impossible by tweaking the LEDs' exposure. In the case of Nikon, the
color response of each LED to being "dialed up" or "dialed down" is
non-linear across the color produced by the LED. So while you *might* get
correction to one color, all the others would be thrown off. Besides
which, the range of movement is very limited. Finally, the corrections
are such that you would lose overall white balance.

In the case of single LED scanners, like Minolta, all you can tweak is
an electronic filter, which is no different than applying a profile
after the scan.

Now, let's say your scanner is more advanced than most, with truly
linear LEDs and wide latitude of correction.

And if you attempt this correction to neutral, by altering the LEDs or
their electronic filter, how would you do it? You would need to
calibrate to a standard. A color target - a calibration slide would do
the trick.

And *if* you used the corrections directed by the slide and *if* you
could actually get the full range of required correction what would it
look like?

Well, the resulting histogram of the scan with the correction applied
via LEDs *during* the scan would look like the scan with the correction
applied *after* the scan - as long as the post-scan correction profile was
applied at sufficient bit depth, say a 32-bit correction against a 14-bit
image. (BTW, this is how SilverFast works.)
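
A small numeric sketch of that claim, with the signal, gain and bit depth
invented for illustration: applying a channel correction in floating
point after a 14-bit quantization lands essentially where applying it
before quantization would.

```python
# Compare "correct then quantize" against "quantize then correct in float".
import numpy as np

rng = np.random.default_rng(0)
signal = rng.random(100_000)                 # stand-in for the "analog" CCD values
gain = 0.83                                  # some correction toward neutral

def quantize(x, bits=14):
    levels = 2 ** bits - 1
    return np.round(np.clip(x, 0, 1) * levels) / levels

during = quantize(signal * gain)             # correction applied before quantizing
after = quantize(quantize(signal) * gain)    # correction applied to the 14-bit scan

print("max difference:", np.abs(during - after).max())   # about one 14-bit step
```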

Now, because of the nature of the corrections, the real world colors
will still exceed the range of the scanner and these will appear clipped
in the file no matter which method you use. The real world colors the
scanner is able to render fully will either be shifted to a neutral
representation or unaffected.

Those are the limitations of the scanner color space. There's no way
around it.

So, what are you left with? The same result either way.

Applying a scanner profile has *in reality* - and that's what I meant
by tests rather than dogma - nothing to do with, and no effect on,
*the rest* of the calibration chain!

The reflex (i.e. "dogmatic") reaction to such a "heretical" statement
is to jump up and down and go: "But... but... but... One must profile
the scanner! All the scanning "scriptures" say so! See here...".

Well, one must not. Here's a simple proof. Take any image from an
*uncalibrated* source. Process and print that image using a calibrated
monitor and printer. Has the resulting image suffered any by starting
out "uncalibrated"? Of course, not!
Therefore, starting with an uncalibrated scan has absolutely *nothing*
to do with either color space or calibration *from that point on*!
And that's the key: "from that point on".

Unfortunately, in digital imaging, as with analog imaging, there is no
getting away from color space. Use it or ignore it, it's always there.
Omit color management in one link in the production chain and the whole
process suffers.

Having established that - and I hope it is established now - the
question, then, is: What is better - pure, unadulterated data or
"tainted" data? And the answer to that, again, is quite
straightforward: Pure data, of course.

Unfortunately "pure unadulterated data" has almost no meaning in the
context of scanning because you are crossing so many color spaces to get
from the film to your output.
I intentionally do not want to get dragged into that discussion and
digress even further but we could have a *separate* discussion on
color space and calibration, if you wish.

However, I guess that would be a very short discussion, because - from
what you wrote - I gather we would probably agree on almost
everything.

It appears then, that the difference between us is how we apply this
knowledge.
 

UrbanVoyeur

Don said:
A profile does *not* affect the *hardware*. It only pretends to
correct the data *after* the scan.

It's no more "pretend" than calibrating your monitor or printer.

If a profile modified Analog Gain, for example, then it *would* change
what the scanner *hardware* produces.

But it does not, and therefore, color space is totally and completely
irrelevant in this context. It's what comes out of CCD wells that's
the subject matter. And a profile has absolutely no effect on CCD
wells.

True. But as I put forth in another post, correcting LED gain is not
linear and has insufficient range for the task.
The problem is that "neutral" is based on what's on film, not on the
real neutral (end product)!
The neutral of a calibration target is a measured deviation of the film
from international color standards. That difference tells the software
where to "find" neutral.
And - as we have established already - what's on the film is *not* the
same as the end product i.e., the *real* neutral.
I'm not sure what you mean by the "real neutral". When I say neutral I
mean an in gamut color is the same from the color chart, to the film,
through the scanner, and from the printer.

What do you mean?
Therefore, again, this profile so-called "neutral" is not the same as
the *actual* real neutral.

And to achieve the actual, real neutral the profile "work" will have
to be undone - at the very least, in part.

No. That's what I mean about you misunderstanding the nature of color
space and calibration.

And even if it didn't, any subsequent editing (to amplify rather than
fight the profile) will only further compromise the data. And - as
already established - two edits damage data more than a single edit.



...snip...

That's a very nice summary of color management... For those who need a
summary.

As it happens, I don't. Furthermore, it has nothing to do with the
subject at hand.

Thanks. It does have quite a bit to do with the subject at hand. It
gives us a context for phrases like neutral and "pure unadulterated
data". The is desirable - we all wan to get all the data we can out of
our scanners. And with any of the better scan software we do.

The question is what next? Or how do I make these bits in the scan file
look like the slide I started with? That's where color management comes
in. And yes, it always and inevitably involves altering the data.

The same thing used to happen "in the old days" when you corrected film.
You would process a small piece of film - a clip test - then use visual
comparisons to find the right filtration to correct the slide to the
standard you wanted.

You then gave these instructions +5 Yellow, +3 Magenta, etc to your
processor. They would process and dye the film to those corrections.

Inevitably, the film would lose some of its ability to discriminate
particular shades as a result of the dyeing. The differences were masked
by the dye. In digital terms these colors were "clipped".

That was part of the process - you could either have corrected images or
full range slides. The nature of color spaces does not allow both.

With one caveat: for very very small color corrections with *some*
emulsions (a few select Ektachromes and FujiChromes) at least one lab
- A&I - could adjust individual color dimensions by slightly altering pH
and temperature. Ish, one of the founders of A&I, worked this out in
the late 80's and early 90's.

The major downsides of this process were the limited range, and the
ability to move in only one color dimension at a time. However the full
dynamic range of the film was retained.
 
