A well constructed calibration profile gives you a consistent,
externally validated correction for all of these issues. That's just a fact,
and it's not in dispute if we're talking about what happens
downstream. The only catch is I'm *not* talking about downstream...
At the front end (which is what I *am* talking about) the real
question is: how useful is it to change the raw data from the scanner
just to make the image superficially resemble the slide, and then have
it changed again once you start editing - often *undoing* what
the profile has introduced?
Well yes, but with current technology only extremely expensive
scanners can render the full contrast range of film in a single pass
without resorting to manipulations like HDR.
HDR and contrast masking are fine, but they do nothing to address the
deviation from neutral inherent in the scanner or the slide.
But they do address the dynamic range, which was the subject of the
above paragraph.
As to deviation from neutral which the scanner introduces, well, the
profile doesn't address that either. If the balance is not there, the
profile can't add it. The profile may only *pretend* to add it by
modifying raw data and thereby damaging it.
Now, if there was a way to change *data acquisition* e.g. by modifying
AG (analog gain), I'm all for that! (Indeed, that "calibration" is exactly what I
was trying to do to correct the KC (Kodachrome) bias.) The reason is that
varying *hardware* settings is essential to getting the most data out
of the scan.
But any *software* scanner "calibration" which changes the scan after
the fact is essentially cosmetic - but destructive.
But this is important: *very* little work needs to be done to my images
after a calibrated scan.
Then it's fine. For you.
However, that does not change the fact that the profile introduces
irreversible changes to raw data. And, for me, that's paramount!
Most of the adjustments (90%) I make are at the time of printing, and
almost always to overcome a limitation of the inks in my Epson 1270 -
bronzing, metamerism, blocking up, etc. I do plan to move to an
UltraChrome printer soon.
Define damage.
Loss of data.
Take an image.
Save the histogram (better still dump image as raw).
Apply any change e.g. a curve.
Reverse the change e.g. invert the above curve.
Save the histogram (better still dump image as raw).
Compare the two (histograms or data).
In theory, they should be the same. In practice, they are miles apart.
Shriek in horror!
Or not... If not, then you're not concerned about data loss at this
scale and the whole discussion is moot.
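The round-trip experiment above can be sketched in a few lines of Python. This is a minimal illustration, not anyone's actual workflow: the gamma value is a made-up stand-in for whatever curve an editor applies, but any 8-bit edit re-quantizes the same way.

```python
# Sketch of the round-trip test: push every 8-bit level through a
# curve, then through its inverse, and compare with the original.
# The gamma of 0.45 is hypothetical; any editor curve behaves alike.

def apply_curve(values, gamma):
    # Map each 8-bit level through a gamma curve, re-quantizing to 8 bits.
    return [min(255, round(255 * (v / 255) ** gamma)) for v in values]

original = list(range(256))                 # every possible 8-bit level
adjusted = apply_curve(original, 0.45)      # e.g. a brightening curve
restored = apply_curve(adjusted, 1 / 0.45)  # "undo" with the inverse curve

changed = sum(1 for a, b in zip(original, restored) if a != b)
merged = 256 - len(set(adjusted))           # distinct levels collapsed together
print(f"{changed} of 256 levels fail to round-trip; {merged} levels merged")
```

The histograms differ because re-quantizing after the first curve merges distinct levels, and the inverse curve cannot un-merge them.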
Nothing wrong with that, of course, but it doesn't change the fact
that scanner profiles damage raw data. Even if you don't care about
this loss.
If the on screen image looks like the slide, and the inkjet output looks
like the slide without adjustment, what is damaged?
I'm not addressing printing here at all. Yes, calibration of a monitor
makes perfect sense, as does the calibration of the printer. I would
just add one proviso that the images are tagged rather than converted.
Actually, that's the problem. I'm talking about data purity at the
earliest stage (front end), while you always race ahead and talk about
what calibration does to the rest of the chain (back end). That's not
the subject! Those are two totally different things.
It's *essential* to calibrate the monitor and the printer to get
consistent results, but what I'm talking about is the data you're
*starting* with. And the less damage there is to data at that stage,
the better results you will get downstream.
Not true. Film is film. Fuji Provia 400F is Fuji Provia 400F. It has a
characteristic response curve. That curve is *extremely* consistent from
batch to batch when you use fresh film. Ask Wolf Faust.
Which is all beside the point, because each image is different and
requires different editing to produce the final result.
The key is, this final result is *not* necessarily an identical
reproduction of what's on the film! In fact it rarely is identical.
Your premise is that it's (nearly) identical, and that's where we
differ.
Again, you may indeed be satisfied with the results you're getting but
that doesn't change any of the above facts.
The light I shoot under doesn't matter.
Of course it does! It's night and day - pun intended.
You misunderstand color calibration and color balance. If I shoot
daylight film in tungsten light it will have a tungsten (orange-brown)
balance.
No, you misunderstand my point again.
How is changing the image twice (the second time removing the first
change) going to improve the final result?
*IF* I want the image to appear daylight balanced, I need to either
apply an 80B filter equivalent *or* find an area on the scan, call it
"white", and balance to daylight with that.
*That's* what I'm talking about!!! Inadvertently, you made my point.
You are therefore *deviating* from the profile, i.e. changing the image
twice! First by applying the profile, and then a second time by
going *against* the profile by arbitrarily declaring an area "white"!
The example I gave was extreme to illustrate the point, but such
adjustments (to some degree) are necessary to virtually all images.
Now, your mileage may vary, and that's perfectly fine. But that does
not change the fact that applying a scanner profile damages data.
That's all there is to it.
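The "changing the image twice" objection can be put in numbers. A hypothetical sketch: the gain of 1.18 is an invented stand-in for a profile's correction, and 1/1.18 stands in for an edit (such as a white balance) that counters it. The round trip should be a no-op, but on 8-bit data it isn't.

```python
# Sketch of the double change: a profile shifts the raw scan one way,
# then editing pulls it back. Both gains are made-up illustrations.

def apply_gain(values, gain):
    # One 8-bit channel through a gain, clipped and re-quantized.
    return [min(255, round(v * gain)) for v in values]

raw = list(range(256))
profiled = apply_gain(raw, 1.18)            # profile "corrects" the channel
balanced = apply_gain(profiled, 1 / 1.18)   # editing reverses the correction

damaged = sum(1 for a, b in zip(raw, balanced) if a != b)
print(f"{damaged} of 256 raw levels altered by a correction that was undone")
```

Clipping and re-quantization in the first step destroy levels the second step can never recover, which is the whole point: the data ends up worse than if neither change had been applied.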
The calibration profile will not remove any tungsten cast from the film.
And it's not supposed to!
But the profile doesn't know that. It will blindly "correct" the scan
even though your editing may end up reversing this change! That's my
point!
What I've put forth is far from a magic solution. It is a tested,
validated method of controlling scans so that they match film.
The trouble is the final image does not always match the film! So that
intermediate step is usually counter-productive.
It is based on the methods used for nearly 100 years to color balance
film for motion pictures and still photography. It is a method used by
every pro lab and scanner manufacturer I know.
Which is all fine and dandy but has nothing to do with what I'm
talking about.
The trouble is you're confusing monitor/printer calibration - which is
*essential* - with data corruption at the front end. Those are two
totally different things.
Just because I say that a scanner profile compromises raw data
integrity does *not* mean I'm against monitor/printer calibration
after data acquisition!
Your method is ... well.... your method.
There is no "method"! That's the basis of your misunderstanding.
You're confusing insistence on *data integrity* with opposition to
calibration. That's a non-sequitur! And I've repeatedly stated that.
There is no conflict.
If anything, insisting on data integrity at the front end actually
*improves* everything that happens downstream, *including* color
management. Raw data integrity is *not* contrary to color management.
You undoubtedly have considerable technical skill - writing your own
scanner software demonstrates that.
Actually, scanner software is fairly pedestrian to write, as was the
rest of it. (Which makes it even more mind-boggling why VueScan has so
many problems!?)
The key was running tests and realizing what is actually happening
instead of just blindly following dogma.
That's why I not only switch scanner "calibration" off, but also
prefer to twin-scan instead of multi-scanning. The latter is academic
anyway, because Nikon crippled the LS-50 by disabling multi-scanning.
But - as it turns out in the end - that's no big loss.
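For what it's worth, the noise argument for twin scanning can be sketched numerically. This assumes "twin scanning" here means averaging two complete passes; the noise figures are invented purely for illustration.

```python
import random
import statistics

# Assumed model: each pass reads the true level plus independent
# Gaussian sensor noise. Averaging two passes cuts noise by ~sqrt(2).
random.seed(1)

TRUE_LEVEL = 128.0
NOISE = 4.0          # invented per-pass noise, in 8-bit levels

def scan_pass(n=10_000):
    # One simulated pass over n pixels of a uniform patch.
    return [TRUE_LEVEL + random.gauss(0, NOISE) for _ in range(n)]

first, second = scan_pass(), scan_pass()
twin = [(a + b) / 2 for a, b in zip(first, second)]

print(f"single pass noise: {statistics.pstdev(first):.2f}")
print(f"twin scan noise:   {statistics.pstdev(twin):.2f}")
```

Because the noise in the two passes is independent, the averaged scan's noise comes out close to the single-pass figure divided by the square root of two.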
Don.