Announcement: James Photography seeks participants for 2nd Annual Film Scanner Bake-Off

  • Thread starter: Jim Hutchison

Jim Hutchison

Last year's scanner bake-off was a runaway success, with almost 18,000
hits on the results page alone. It was based solely on scanner
resolution, measured scientifically with Norman Koren's Imatest
software. This year a panel of judges will rate the results on
sharpness, highlight/shadow accuracy, noise, as well as contrast,
saturation, and overall realism as compared to the original slide.
The subject will be a still life composition photographed with 35mm
equipment on Fuji Provia-F.

Participants must register before the February 28 deadline; final
results will be published in April.


Please visit http://www.jamesphotography.ca/ for details under
"scanner bake-off".

Thank you,


Jim Hutchison
 
Jim,

How about including a colour negative and a black and white negative in
the competition?

W
 
Jim,

How about including a colour negative and a black and white negative in
the competition?

W


Interesting suggestion for sure... I know B+W negs are problematic,
as are colour negs with their less-than-perfect sharpness. The
majority of pros I know work solely with trannies, so that's why I've
geared it for that medium. Gotta keep it simple though, otherwise I'd
risk losing participants.

Thanks for the feedback...


Cheers,

jim
 
Jim said:
Last year's scanner bake-off was a runaway success, with almost 18,000
hits on the results page alone. It was based solely on scanner
resolution, measured scientifically with Norman Koren's Imatest
software. This year a panel of judges will rate the results on
sharpness, highlight/shadow accuracy, noise, as well as contrast,
saturation, and overall realism as compared to the original slide.
The subject will be a still life composition photographed with 35mm
equipment on Fuji Provia-F.

Participants must register before the February 28 deadline; final
results will be published in April.

Please visit http://www.jamesphotography.ca/ for details under
"scanner bake-off".

Thank you,

Jim Hutchison

In this kind of bake-off, it would be nice to be able to separate a
scanner's capability from a user's capability and the original source
material quality. For example, in sharpness evaluation it would be nice
to know how well a scanner can handle film curvature (i.e. depth of
field), and how
well point auto focus or point manual focus works. If a submitted test
scan came from a film with a lot of film curl, or if the operator did
not point focus at the most contrasty area, the judges should not
attribute a scan's lack of sharpness to the scanner.

The results of a bake-off are only meaningful if these things are under
control. But it is easier said than done.
 
In this kind of bake-off, it would be nice to be able to separate a
scanner's capability from a user's capability and the original source
material quality. For example, in sharpness evaluation it would be nice
to know how well a scanner can handle film curvature (i.e. depth of
field), and how
well point auto focus or point manual focus works. If a submitted test
scan came from a film with a lot of film curl, or if the operator did
not point focus at the most contrasty area, the judges should not
attribute a scan's lack of sharpness to the scanner.

The results of a bake-off are only meaningful if these things are under
control. But it is easier said than done.


Easier said than done - absolutely. Last year's bake-off was based
on sharpness alone, and yet there were *SO* many variables such as
white and black point, gamma, focusing, etc. that the test cannot be
declared scientific. So much is dependent on the operator that I
decided to go the other way... let the output be optimized by the
users themselves to bring out the scanner's best.

I had tons of feedback last year asking that a real-world scan be
used. That way, each user can tweak and play to make the image as
good as possible... but variance in output is still determined by
the scanner - you can't make a silk purse from a sow's ear.

Hopefully this year a large enough representation of each scanner
model will average out the results. Let's see!

jim h
 
Last year's scanner bake-off was a runaway success, with almost 18,000
hits on the results page alone. It was based solely on scanner
resolution, measured scientifically with Norman Koren's Imatest
software.

Jim, I enjoyed participating in last year's bake-off, and I already
submitted for this one. :-)

Just one thought: as you surely know already, the target slide we used
was not very appropriate for Imatest MTF analysis: the black/white
transition was too soft for a proper Slanted Edge Test run, and as a
result, the MTF figures were, in absolute terms, quite odd. Some of us
(for example Bart Van der Wolf and me) repeated the Imatest MTF tests
with a proper Slanted Edge target, and the results were far more
accurate and reliable.
Nonetheless, the test was very interesting and gathered a lot of
information.

Now, this year you seem to steer towards a more "subjective" analysis,
with a real-world scene slide, postprocessed images, and a jury.
A good idea, but still, I'd have liked to see, as a complement to this
subjective evaluation, a more scientific MTF-type test, this time with
a proper target slide (a 5-degree framed razor blade or something
similar, with an extremely uniform edge and an abrupt transition from
opaque to transparent, would do). For this kind of test, instructions
would be the same as last year's (proper focus, no sharpening, no
clipping).
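For readers unfamiliar with the technique, the slanted-edge idea can be sketched in a few lines of Python. This is a toy illustration only, not Imatest's actual algorithm, which adds careful sub-pixel edge refinement, windowing, and noise handling; the slant lets different rows sample the edge at different sub-pixel phases, yielding an oversampled edge profile:

```python
import numpy as np

def slanted_edge_mtf(img, oversample=4, window=16):
    """Toy slanted-edge MTF estimate (a sketch, not Imatest's method).
    `img` is a 2-D array with one near-vertical edge, dark left,
    bright right, well inside the crop."""
    h, w = img.shape
    img = img.astype(float)
    # Locate the edge in each row via the centroid of the row gradient.
    grads = np.abs(np.diff(img, axis=1))
    cols = np.arange(grads.shape[1])
    centers = (grads * cols).sum(axis=1) / grads.sum(axis=1)
    # Fit a straight line through the per-row edge positions (the slant).
    slope, intercept = np.polyfit(np.arange(h), centers, 1)
    # Distance of every pixel from the fitted edge, in pixels.
    dist = np.arange(w)[None, :] - (slope * np.arange(h) + intercept)[:, None]
    # Bin pixels into an oversampled edge spread function (ESF).
    rel = np.round(dist.ravel() * oversample).astype(int)
    keep = np.abs(rel) <= window * oversample   # stay near the edge
    bins = rel[keep] - rel[keep].min()
    esf = np.bincount(bins, weights=img.ravel()[keep]) / np.bincount(bins)
    # Differentiate to the line spread function, then FFT to the MTF.
    lsf = np.diff(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]
```

A soft edge (like last year's slide) shows up directly as an MTF curve that falls off early, which is exactly why the razor-blade target matters.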

Is it viable in your opinion?

Thanks, and keep on the good work!

Fernando
 
In this kind of bake-off, it would be nice to be able to separate a
scanner's capability from a user's capability and the original source
material quality. For example, in sharpness evaluation it would be nice
to know how well a scanner can handle film curvature (i.e. depth of
field), and how
well point auto focus or point manual focus works. If a submitted test
scan came from a film with a lot of film curl, or if the operator did
not point focus at the most contrasty area, the judges should not
attribute a scan's lack of sharpness to the scanner.

As Jim noted, hopefully there will be enough participants that there
will be several data points for each type of scanner. With averaging
over enough data points, the effect of any given user's ability will be
diminished, and so the results should be fairly representative of the
scanner's ability. Alternatively, if skill in this mostly amounts to
"not messing up", then it may be reasonable to presume that with enough
people there will be one operator for each scanner (at least, each
popular scanner) who hasn't messed up.
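Brooks's averaging argument is easy to make concrete. A small sketch with hypothetical judged scores (the numbers below are made up for illustration; the model names are just scanners mentioned in this thread):

```python
# Hypothetical judged scores (0-10 scale), several operators per model.
# The mean dampens individual operator skill; the per-model maximum
# approximates "at least one operator didn't mess up".
scores = {
    "Minolta SE 5400": [8.5, 7.9, 8.8, 6.2],
    "Epson 2450":      [5.9, 6.4, 5.1],
}

for model, vals in scores.items():
    mean = sum(vals) / len(vals)
    print(f"{model}: mean {mean:.2f}, best {max(vals):.1f}, n={len(vals)}")
```

With more entries per model, one careless operator (the 6.2 above) moves the mean less and the maximum not at all.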

Another thing we're likely to see is just how much difference the
individual operator makes; I suspect that someone who's been doing this
for years can do a much better job with my scanner than I can, but it
will be interesting to see how true that is.

- Brooks
 
Fernando said:
Now, this year you seem to steer towards a more "subjective" analysis,
with a real-world scene slide, postprocessed images, and a jury.
A good idea, but still, I'd have liked to see, as a complement to this
subjective evaluation, a more scientific MTF-type test, this time with
a proper target slide (a 5-degree framed razor blade or something
similar, with an extremely uniform edge and an abrupt transition from
opaque to transparent, would do). For this kind of test, instructions
would be the same as last year's (proper focus, no sharpening, no
clipping).

Is it viable in your opinion?

Speaking as a prospective participant, I'd be glad to also scan a target
slide of that sort along with the real-world slide. Thus, I suspect
it's perfectly viable if you're willing to send out the slides to
everyone. :)

- Brooks
 
it's perfectly viable if you're willing to send out the slides to
everyone. :)

It's not my bake-off, you see. I'd never have the organizational skill
(and the time to spend) that James showed last year.
But for sure I can provide technical explanations about how to build
and scan such a target. :)

Fernando
 
Brooks said:
Speaking as a prospective participant, I'd be glad to also scan a target
slide of that sort along with the real-world slide. Thus, I suspect
it's perfectly viable if you're willing to send out the slides to
everyone. :)

- Brooks

I second this suggestion. It would certainly help separate the
scanner's capability from the user's capability.
 
Jim said:
Last year's scanner bake-off was a runaway success, with almost 18,000
hits on the results page alone. It was based solely on scanner
resolution, measured scientifically with Norman Koren's Imatest
software. This year a panel of judges will rate the results on
sharpness, highlight/shadow accuracy, noise, as well as contrast,
saturation, and overall realism as compared to the original slide.
The subject will be a still life composition photographed with 35mm
equipment on Fuji Provia-F.

Participants must register before the February 28 deadline; final
results will be published in April.

Please visit http://www.jamesphotography.ca/ for details under
"scanner bake-off".

Thank you,

Jim Hutchison

It would also help to compare flare between scanners, as described
here:
http://www.vad1.com/photo/dirty-scanner/

This "feature" is not disclosed by the manufacturers, nor addressed in
reviews.
 
Jim, I enjoyed participating in last year's bake-off, and I already
submitted for this one. :-)

Just one thought: as you surely know already, the target slide we used
was not very appropriate for Imatest MTF analysis: the black/white
transition was too soft for a proper Slanted Edge Test run, and as a
result, the MTF figures were, in absolute terms, quite odd. Some of us
(for example Bart Van der Wolf and me) repeated the Imatest MTF tests
with a proper Slanted Edge target, and the results were far more
accurate and reliable.
Nonetheless, the test was very interesting and gathered a lot of
information.

Now, this year you seem to steer towards a more "subjective" analysis,
with a real-world scene slide, postprocessed images, and a jury.
A good idea, but still, I'd have liked to see, as a complement to this
subjective evaluation, a more scientific MTF-type test, this time with
a proper target slide (a 5-degree framed razor blade or something
similar, with an extremely uniform edge and an abrupt transition from
opaque to transparent, would do). For this kind of test, instructions
would be the same as last year's (proper focus, no sharpening, no
clipping).

Is it viable in your opinion?

Thanks, and keep on the good work!

Fernando


How would you propose a 5-degree framed razor blade be made? Also, I
agree that the test is more subjective, but something I will be doing
is instructing the participants to send me a before and after of the
scanned image, as well as a full-size 200x200-pixel clip of the
original embedded in the final image.

Regarding the test image I sent everyone - I disagree that it wasn't
detailed enough. Mathematical analysis of the image says so: read
this article:

http://www.jamesphotography.ca/bakeoff2004/therealdeal.html


Thanks everyone for your input and suggestions; you're helping me
formulate and refine the test...

Regards,

jim h
 
Regarding the test image I sent everyone - I disagree that it wasn't
detailed enough. Mathematical analysis of the image says so: read
this article:

http://www.jamesphotography.ca/bakeoff2004/therealdeal.html

I read it, but while the anonymous contributor said that the slide did
not limit the results of the test except for the very best scanners
(?), the reality is that the results were presented as some sort of
"cy/mm rating". So the average reader (it has already happened) tends
to believe that, for example, the best film scanner in the match
achieved something like 27.8 cy/mm at MTF 50.
Meanwhile the same scanner, when properly tested with a proper SET
target by two independent testers, achieved something like 65 cy/mm at
MTF 50.
If that's not a big difference, I don't know what is...
Moreover, the insufficient edge sharpness of the test slide compressed
the ratings, so that a flatbed like my 2450 achieved 12.85 cy/mm at
MTF 50 - actually the same rating I got with the SET target (the
scanner does not even have the resolving power to stress your target
slide) - that is, a figure that appears to be only half the linear
resolving power of the SE 5400.
Too bad that, with a proper test, the actual difference in linear
resolving power shows as more like 65 cy/mm vs. 13 cy/mm at MTF 50...
and 72 vs. 18 at MTF 30.
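To put those cy/mm figures in more familiar terms, a common rule of thumb is that one cycle needs at least two samples (Nyquist), so an MTF 50 of f cy/mm corresponds very roughly to 2 x f x 25.4 "effective" ppi. A quick sketch using the numbers above:

```python
MM_PER_INCH = 25.4

def mtf50_to_effective_ppi(cy_per_mm):
    # Nyquist rule of thumb: two samples per cycle, 25.4 mm per inch.
    # A rough translation only; MTF 50 and limiting resolution differ.
    return 2 * cy_per_mm * MM_PER_INCH

for label, f in [("SE 5400, proper SET target", 65.0),
                 ("SE 5400, bake-off slide", 27.8),
                 ("Epson 2450", 13.0)]:
    print(f"{label}: {f} cy/mm at MTF 50 ~ {mtf50_to_effective_ppi(f):.0f} ppi")
```

By this yardstick, 65 cy/mm is roughly 3300 effective ppi while 13 cy/mm is roughly 660, which makes the size of the discrepancy easy to see.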

That said, there's no "mathematical analysis" in that contribution,
just some practical considerations.

The mathematical analysis is correctly performed by Imatest when
supplied with proper slanted-edge test data, and gives the figures I
reported earlier.

This is not meant to dismiss your earlier, or present, efforts, which
I appreciate very much; just to put the numbers in the right
perspective...

Fernando
 
SNIP
So the average reader (it already happened) tends to
believe that, for example, the best film scanner in the
match achieved something like 27.8 cy/mm at MTF 50.
While the same scanner, when properly tested with a
proper SET target by two independent testers, achieved
something like 65 cy/mm at MTF 50.

Allow me to add a small but important nuance. The Slanted Edge Target
(SET) with a razor blade tests scanner resolution/MTF, whereas the
slanted edge on film tests the combined camera lens + film + scanner
MTF. Both give meaningful information, although there unfortunately
was a difference in the two batches of film targets that complicated
the film scan comparisons. The test also requires some honesty (i.e.
no sharpening), and the scan gamma should match the parameter value
entered in Imatest (linear gamma would be best, but not all scanner
drivers allow saving data that way). There is also a precaution
involved with the use of a high-contrast SET, and that is to avoid
clipping. That requires manual exposure control, which is not
available in all drivers (in particular for "flatbeds").
Moreover, the insufficient edge sharpness of the test slide
did compress the ratings, ...

Indeed, it didn't do full justice to the higher resolution scanners,
which could have extracted more info if it had been in the film in the
first place. Nevertheless I found it a worthwhile exercise, also
because many reported that my 5400 sample "looked" sharper than the
numbers seemed to indicate. Some people were also visually misled by a
contrast-enhanced scan. That, to me anyway, also indicates that the
numbers by themselves do not tell the whole story, although they (can)
help the interpretation.

Bart
 
SNIP
How would you propose a 5 degree framed razorblade be made?

I have made several.
Earlier attempts using folded alumin(i)um foil turned out to be too
delicate and hard to keep flat in use, despite the frame it was
mounted in.
Later versions were, and are, constructed by mounting a flat razor
blade (e.g. Gillette) or very sharp cutting blades (e.g. Logan model
270) in a 35mm slide mount. The correct blade size will still fit the
full 36mm dimension when slightly slanted.

I find it easiest to prepare a piece of rectangular paper with a
slanted line drawn on it. The angle of the line is not that critical,
but a slope of 1:10 produces an angle of 5.71 degrees, which is close
to the recommended slant angle for Imatest.
It is now easy to position the blade in the slide mount at
approximately the right angle of rotation: just align the mount with
the paper edge and align the blade with the line. Again, the exact
orientation is not super critical (besides, not all scanners are
exactly aligned either), but it does help to choose an Imatest ROI
crop as large as possible (< 600 pixels), because that provides a
large number of phase rotations, which benefits the statistical
accuracy of the measurement.
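For the curious, the 1:10 slope figure is just trigonometry, and the slant costs almost nothing in blade length across the frame. A quick check (36 mm being the standard 35mm frame width mentioned above):

```python
import math

slope = 1 / 10  # rise over run, as drawn on the paper guide
angle = math.degrees(math.atan(slope))
print(f"slant angle: {angle:.2f} degrees")  # close to the ~5 degrees Imatest likes

# A blade slanted by that angle still has to span the 36 mm frame
# opening; the extra length needed is tiny at small angles.
frame = 36.0
needed = frame / math.cos(math.radians(angle))
print(f"blade length to span {frame} mm: {needed:.2f} mm")
```

So a blade barely longer than the frame opening suffices, which is why a standard razor blade fits a 35mm mount when slightly slanted.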

I think it is important to use the same type of blade if one wants to
avoid slight differences due to edge quality (I've seen corrosion form
on some blades that were no longer covered in a layer of oil). Another
point of attention is that some cutting blades can be thicker than
film, Gillette razor blades are. That may cause issues with some slide
holders, so in those cases a thin Gepe slide mount might be preferable
(although I have only tested with regular Gepe mounts myself, which
hold blades well).

Bart
 
Allow me to add a small but important nuance. The Slanted Edge Target
(SET) with a razor blade tests scanner resolution/MTF, where the
slanted edge on film tests the combined camera lens+film+scanner MTF.

Yes, and this figure is quite different from the actual scanner MTF.
In the situation discussed here, lens and film more than halved the
scanner figure for the best scanners, while affecting the flatbeds
only marginally.
But since the ratings were presented as "scanner ratings" (it's a
scanner bake-off, after all), they could mislead the reader, who might
assume an SE 5400 gets only double the linear resolving power of an
Epson 2400, for example.
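Fernando's point can be made concrete with a common rule of thumb: for roughly Gaussian-shaped MTFs, component MTF 50 figures combine in quadrature. The 30 cy/mm lens-plus-film figure below is purely hypothetical, chosen for illustration:

```python
import math

def combined_mtf50(*components):
    # Quadrature combination of MTF 50 figures: 1/f^2 = sum(1/f_i^2).
    # Exact only for Gaussian-shaped MTFs; a rule of thumb otherwise.
    return 1 / math.sqrt(sum(1 / f ** 2 for f in components))

lens_film = 30.0  # hypothetical MTF 50 of camera lens + film, cy/mm
for name, scanner in [("SE 5400", 65.0), ("Epson 2450", 13.0)]:
    on_film = combined_mtf50(lens_film, scanner)
    print(f"{name}: {scanner} cy/mm alone -> {on_film:.1f} cy/mm through film")
```

With those assumed numbers, the good scanner's on-film figure drops to about 27 cy/mm while the flatbed's barely moves, matching the pattern Fernando describes: the weaker component dominates the chain.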
That's my only gripe; I am very keen on the bake-off per se. It was
an interesting experience and provided useful information.

I thank James for this; he did a great and thorough job.
I'm happy to participate in the 2005 edition. :-)

Fernando
 
SNIP

I have made several.
Earlier attempts using folded alumin(i)um foil turned out to be too
delicate and hard to keep flat in use, despite the frame it was
mounted in.
Later versions were, and are, constructed by mounting a flat razor
blade (e.g. Gillette) or very sharp cutting blades (e.g. Logan model
270) in a 35mm slide mount. The correct blade size will still fit the
full 36mm dimension when slightly slanted.

I find it easiest to prepare a piece of rectangular paper with a
slanted line drawn on it. The angle of the line is not that critical,
but a slope of 1:10 produces an angle of 5.71 degrees, which is close
to the recommended slant angle for Imatest.
It is now easy to position the blade in the slide mount at
approximately the right angle of rotation: just align the mount with
the paper edge and align the blade with the line. Again, the exact
orientation is not super critical (besides, not all scanners are
exactly aligned either), but it does help to choose an Imatest ROI
crop as large as possible (< 600 pixels), because that provides a
large number of phase rotations, which benefits the statistical
accuracy of the measurement.

I think it is important to use the same type of blade if one wants to
avoid slight differences due to edge quality (I've seen corrosion form
on some blades that were no longer covered in a layer of oil). Another
point of attention is that some cutting blades can be thicker than
film, Gillette razor blades are. That may cause issues with some slide
holders, so in those cases a thin Gepe slide mount might be preferable
(although I have only tested with regular Gepe mounts myself, which
hold blades well).

Bart


I think a razor blade scan provides valuable information, but making
tons of them for the bake-off is impractical. I only have so much
time! And although my slide perhaps didn't help differentiate the
high-end scanners, the results were "real-world" as described on the
web site. I'm not minimizing the value of a more controlled
scientific test, I'm just saying that in practice, the important
differences between various high-end scanners shouldn't be weighted by
sharpness alone.

Case in point: there was an Imacon in the test, which didn't fare
nearly as well as I had expected. But it produces clean, amazing scans
that, in the hands of a pro, yield images worthy of any stock
agency.

This year there will be only one set of slides, which I'll shoot; I
didn't anticipate the huge number of participants last year, so I had
to shoot in two sessions. Obviously my focusing wasn't dead-on in the
second shoot. That won't happen again!

Good discussion folks - thanks for the valuable input.
 
SNIP

A small typo/omission:
Also a point of attention is that some cutting blades can be thicker
than film, Gillette razor blades are.

I intended to say: Gillette razor blades are of about the same
thickness as film.

Bart
 
My question comes from someone who hasn't done this kind of testing. I
suppose the idea is to create a really sharp edge to be scanned. Instead
of using a razor blade, how about making a very sharp scratch on a piece
of film that is already mounted? (That would put all my reject slides to
good use!)
 
My questions are from someone who hasn't done this kind of testing. I
suppose the idea is to create a really sharp edge to be scanned. Instead
of using a razor blade, how about making a very sharp scratch on a piece
of film that is already mounted? (That will put all my reject slides to
good use!)

I was reasoning along the same lines (a sharp 5-degree cutaway on a
dark slide). I'll do some tests this weekend...

Fernando
 