Nikon Coolscan 4000 Calibration

Kennedy McEwen

coolscan.pl said:
Hi Kennedy,

my English will really not be so good now, because I temporarily
have no access to a dictionary :)
Nope - you are still coming across loud and clear! Perhaps this is just
a legacy from my primary school teacher who was Polish! ;-)
Gospel truth. My experiments with CCD cooling had no effect on the
stability of the high tones (around the white point). Only cooling the
chips (which control the LEDs) succeeded.

That confirms what I suspected! ;-)
When I cool the left side of the board, two
channels are stable, and when I cool the right side of the board, the
third channel and IR are stable.
That is because the chips which control the first pair of channels are
placed on the left side, and the rest on the right.

LEDs and LDs have a notoriously unstable output amplitude with
temperature. That is why the analogue gain is controlled by the
exposure time to the LED, rather than by attempting to adjust the
intensity.
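
In rough terms (a sketch with invented numbers, not Nikon's actual
firmware): since the integrated signal is roughly LED intensity
multiplied by exposure time, holding the signal constant as the output
drifts just means scaling the exposure time inversely:

    # Sketch of exposure-time gain control; target and intensities invented.
    TARGET_SIGNAL = 40000.0  # desired ADC counts at the white point

    def exposure_for_target(led_output_counts_per_ms: float) -> float:
        """Exposure time (ms) yielding TARGET_SIGNAL at the measured output."""
        return TARGET_SIGNAL / led_output_counts_per_ms

    # If the LED output sags ~5% as it warms, the controller simply
    # lengthens the exposure by ~5% instead of touching the drive current.
    print(exposure_for_target(2000.0))  # 20.0 ms at nominal output
    print(exposure_for_target(1900.0))  # ~21.1 ms after the sag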
Installing cooling in the 9K is no problem, because it is bigger
(and has more room inside) compared with the smaller Coolscans.
Additionally, in the small-format Coolscans the LEDs are packed in a
black box (not openable). I'm not sure whether this box holds only the
LEDs or the LEDs together with all the electronic control chips.
I haven't opened my LS-4000 up yet. I have opened every other Nikon I
have owned, usually to clean them, but I now have a pretty good working
procedure to keep the scanner free from dirt and dust accumulation so I
haven't had to resort to cleaning this one yet. No doubt that time will
come eventually and I will have to delve inside. ;-)
I know that small differences between channels are not a problem in
general, because I can correct them in software, but they are a problem
for my colorimetric experiments (I specialize in CM).
Additionally, these two approaches are not equivalent:
a) scan the original channel luminance
b) scan a modified channel luminance and correct it in software

because when I correct scans which have a small Dmin value in software,
levels corrections only restore the global equalization; they do not
restore the original contrast in small ranges in the high tones (e.g.
luminance between 89 and 90), which is partially lost.
To correct this I need a curve tool which can be set with very high
precision.
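
The loss described above is easy to reproduce numerically. A
hypothetical sketch (the 0.5x gain error is exaggerated to make the
effect obvious): once adjacent 8-bit codes have been merged by a gain
applied before quantisation, no later levels correction can separate
them again:

    import numpy as np

    tones = np.arange(85, 95, dtype=float)                 # fine highlight steps
    captured = np.round(tones * 0.5).astype(np.uint8)      # gain error + 8-bit quantisation
    corrected = np.round(captured / 0.5).astype(np.uint8)  # software levels "fix"

    print(tones.astype(int))  # ten distinct input tones
    print(captured)           # several adjacent tones now share one code
    print(corrected)          # global scale restored, local steps gone for good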
Yep, I understand your problem, but it is pretty specialised and not
typical of the average, or even of most specialised, users.

I suspect that you have had to develop most of this procedure yourself.
I'm not sure if I understand your black-point experiments correctly
(I must print your mails and take them home, where I have a
dictionary :), but remember that:

a) when you scan any material you should completely defocus the lens
to eliminate any structural fluctuation

I fully agree with you - but in this case I was scanning an opaque
image, so no defocus was necessary. For higher illuminations this would
be an absolute requirement.
b) when you analyze scans line by line you should be careful, because
the previous line affects the next line - this is an effect of:
b1) electrical behaviour of the CCD
b2) a lack of perfect precision in the positioning of each channel (in
Coolscans the RGBI channels are slightly shifted - by about 0.2-0.5 pixel)
Again I agree, but I don't think this is an issue with the tests I did,
because it was a uniform, perfectly black target.
PS. I'm happy that I'm not the only one who is a little crazy about precision ;-)

Oh lots of us are after precision, but by the sound of things, you are
looking for (and need) a lot more precision and stability than the rest
of us, so you are probably out there on your own.

I am glad you have found this group though, because it sounds like you
might be doing a lot of work that many of us can benefit from.
 
Kennedy McEwen

wim wiskerke said:
Could that be due to chromatic aberration?
Or: How do you rule out chromatic aberration?
I have a coolscan 5000 and always thought the slight offset of the
channels was caused by CA.
It may be *real* chromatic aberration, with the optics not bringing all
the colours to a coincident focus across the field.

However there is another issue that is common with scanners which is
often termed chromatic aberration as well. This is actually caused by
the movement of the RGB trilinear CCD together with the illumination
source, causing objects to exhibit colour distortion at edges - one
colour at the leading edge and the complementary colour at the trailing
edge, in the axis of the scan head movement. The design of the Nikon,
with a single polychromatic detector and separate RGBI illumination
sources, effectively prevents this latter problem from arising.
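
A small numerical sketch of that mechanism (hypothetical values, just
to show the geometry): if one colour row of a trilinear CCD samples the
image half a pixel ahead of the others along the scan axis, a neutral
bar picks up one colour on its leading edge and the complementary
colour on its trailing edge:

    import numpy as np

    i = np.arange(20)
    bar = ((i >= 8) & (i < 12)).astype(float)  # neutral white bar on black

    def sample_shifted(signal, shift_px):
        """Resample the signal at positions offset by shift_px (linear interp)."""
        x = np.clip(i + shift_px, 0, signal.size - 1)
        return np.interp(x, i, signal)

    r = sample_shifted(bar, +0.5)  # red row running 0.5 px ahead of the others
    g, b = bar, bar                # green and blue as the reference

    # Leading edge: r > g = b (red fringe); trailing edge: r < g = b (cyan).
    print(np.round(np.c_[r, g, b][6:14], 2))

With a single polychromatic sensor and sequential RGBI illumination,
all the colours are sampled at essentially the same position, so the
shifted-row term above never appears.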
 
Bart van der Wolf

Kennedy McEwen said:
I fully agree with you - but in this case I was scanning an opaque
image, so no defocus was necessary. For higher illuminations this
would be an absolute requirement.

Also, although the range of focus distances is limited, there is a
(small?) risk of the effective exposure changing with the focus
setting. On a truly opaque 'slide' there is no need to defocus,
although one should take care to avoid in-scanner stray light (by
reducing ambient light levels and using a slide with a black surface,
e.g. aluminium foil with a black sensor-side surface).

Bart
 
Bart van der Wolf

SNIP
Immediately before the data capture, I calibrated the scanner, so
there was at most 5 seconds between the end of calibration and the
start of the scan.

Maybe I missed something, but I assume that was to determine the
stabilization period needed. I usually allow the scanner, and the film
inside the scanner, to reach a more stable equilibrium by allowing a
period of 'heating up'. Frequent previews will not only allow it to
reach that state sooner, but will also reduce film movement *during*
the actual scan.

Bart
 
Kennedy McEwen

Bart van der Wolf said:
SNIP

Maybe I missed something, but I assume that was to determine the
stabilization period needed.

Not really - although I did the measurement soon after power up to
assess something like the worst case mean drift. I am sure that it will
improve if repeated after being powered and used for some time, as one
would normally do, and the shape of the curve towards the end of the
scan period supports this.

However the calibration was to reduce the non-uniformity to its minimum
level as close as possible to the start of the scan, so that its
degradation through the scan period from that optimum could be assessed,
and to determine if that degradation was similar to the mean black
drift.

Having examined the images in a little more detail this evening, I now
think I may be being somewhat overcritical in placing emphasis on the
pk-pk result.

Although this certainly is the pk-pk non-uniformity in each scan line,
there is nothing to show that the peaks and troughs fall on the same
cells in each scan line. By increasing the contrast in the original
image by about 1000x in Photoshop (levels taken to 0-15 twice then 0-63
or similar on the third step) there is some evidence of this line
structure, but at a level well below the pk-pk amplitude present even
with 16x multiscanning. So I need to do a bit of filtering on the data
and repeat the stats.
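
For reference, the "about 1000x" figure follows directly from the
Levels arithmetic. A quick sketch, assuming Photoshop's Levels simply
maps the chosen input range onto 0-255 and clips:

    import numpy as np

    def levels(img, lo, hi):
        """Map the input range [lo, hi] onto [0, 255] and clip."""
        return np.clip((np.asarray(img, float) - lo) * 255.0 / (hi - lo), 0, 255)

    # Two passes of 0-15 and one of 0-63:
    # (255/15) * (255/15) * (255/63) is roughly 1170, i.e. "about 1000x".
    faint = [0.05, 0.10, 0.20]  # invented faint dark-level values, in 8-bit counts
    print(np.round(levels(levels(levels(faint, 0, 15), 0, 15), 0, 63), 1))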
 
coolscan.pl

Hi Fernando,

I will answer you at (or after) the weekend, because right now I have
a lot of rush orders for scanning services (this is normal on a Friday).

Regards
Maciek
 
Kennedy McEwen

Kennedy McEwen said:
Having examined the images in a little more detail this evening, I now
think I may be being somewhat overcritical in placing emphasis on the
pk-pk result.

Although this certainly is the pk-pk non-uniformity in each scan line,
there is nothing to show that the peaks and troughs fall on the same
cells in each scan line. By increasing the contrast in the original
image by about 1000x in Photoshop (levels taken to 0-15 twice then 0-63
or similar on the third step) there is some evidence of this line
structure, but at a level well below the pk-pk amplitude present even
with 16x multiscanning. So I need to do a bit of filtering on the data
and repeat the stats.


OK, now I have had some time to examine the data and filter it to give a
better measure of the line to line variation immediately after
calibration and its degradation with time.

What I did to the original data was fairly trivial. I wanted to reduce
the random noise amplitude but retain any variation along each line, so
the original data was simply averaged over 30 pixels along the scan
direction. This reduces the variation between samples but retains any
linear nonuniformity across the CCD. Obviously the mean dark level
stays virtually the same, although the filtering smooths the curve a
little. 30 samples averaged out of a total of 5782 isn't a vast amount
of smoothing.
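
That filtering step can be reconstructed in a few lines of numpy (a
sketch, not the actual processing used; the cell count and noise
levels are invented):

    import numpy as np

    rng = np.random.default_rng(0)
    lines, cells = 5782, 4000  # scan lines x CCD cells (cell count assumed)

    # Synthetic dark frame: small fixed pattern per CCD cell, larger
    # random noise per sample.
    fixed_pattern = 0.1 * rng.standard_normal(cells)
    dark = fixed_pattern + 0.4 * rng.standard_normal((lines, cells))

    # 30-sample boxcar average along the scan direction (axis 0).
    kernel = np.ones(30) / 30.0
    filtered = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="valid"), 0, dark)

    # Random noise drops by ~sqrt(30) while the fixed pattern survives, so
    # the filtered per-line pk-pk approaches the true line structure.
    for name, data in (("raw", dark), ("filtered", filtered)):
        pkpk = (data.max(axis=1) - data.min(axis=1)).mean()
        print(f"{name:9s} mean per-line pk-pk {pkpk:.3f}  std {data.std():.3f}")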

The results are shown at
http://www.kennedym.demon.co.uk/results/filtered.jpg

For reference, the original unfiltered data is at
http://www.kennedym.demon.co.uk/results/gamma10.jpg

As expected, both the pk-pk non-uniformity (which will be most visible
as line structure) and the standard deviation are significantly lower
in the filtered plots, indicating that the line structure is actually
substantially less than the random dark noise - by about a factor of 4
or so. This isn't as low as is required for the line structure to be
completely invisible in the random noise if the image were subjected to
extreme level shifts, but at near normal luminance it is below visible
thresholds.

At the rate the pk-pk level is increasing, it would be expected to
become visible and objectionable after two or three scans but, as
explained previously, this data was captured pretty soon after power on
to capture the worst case. If I can remember, I'll try to capture
another set of data after the scanner has been on and working for an
hour or so.
 
coolscan.pl

Hi Maciek!

Hi Fernando again!
I'm planning to purchase precisely an LS-9000.

Brilliant decision. Congratulations.

Affection for the LS-9000 is frequently similar to affection for a woman:
you will probably both love it and hate it, but for sure it will shape
the stream of your life :)
Your considerations about calibration and CCD stability made me worry
a bit:

Don't worry. At this price you have no better alternative.
The next (slightly better) level of scanners begins at a 5-digit price.
The exception is the Imacon, but:
a. it scans only dry (with dust and scratches)
b. it is still a CCD (noise increases with increasing density)
c. I have not tested it ;-)
Basically, do you think this scanner can show appreciable
fluctuations in CCD response while heating up, and that those
fluctuations are not easily recovered by the NikonScan "calibration"
command?

Yes, but only when the scanner is very hot (after a long continuous job).
In general, when the scanner is not heavily loaded in a single session,
calibration is helpful.
About cooling the chips: which cooling method are you using?
Simple fans or Peltier devices?

Two flat fans. A better solution is to install a large flat passive
cooler on the plate (you will have to fit small heatsinks firmly onto
the control chips), because:

a. passive cooling needs no additional power
b. it produces no noise

The LS-9000 with the Kami wet feeder and 8x or 16x multisampling is an
excellent solution, but you should remember that this type of scanning
takes some time.

Regards
Maciek
 
