Vuescan appears to be using "nearest neighbour" for downsampling

Mendel Leisk

Just a heads-up, Vuescan appears to be using "nearest neighbour" for
downsampling, when used in the scan-from-disk process. At least the
result is similar to that process as done through Photoshop.

During scan-from-disk within Vuescan, downsampling is accomplished by
setting "Input|Scan(Preview) Resolution" to "Custom", and then setting
a number. In my case, I worked with raw files having dpi set to 5400.
By setting the custom dpi to 4000 during scan-from-disk, I downsampled
by 4000/5400.

Unfortunately, I've gotten a fair way into my project now, when I
happened to try a bi-cubic downsample of the raw file, through
Photoshop. Zooming both to 200%, serious "jaggies" are evident in the
Vuescan result, while Photoshop's bicubic output is much smoother.

Groan!
 
Winfried

Mendel said:
Just a heads-up, Vuescan appears to be using "nearest neighbour" for
downsampling, when used in the scan-from-disk process. At least the
result is similar to that process as done through Photoshop.

During scan-from-disk within Vuescan, downsampling is accomplished by
setting "Input|Scan(Preview) Resolution" to "Custom", and then setting
a number. In my case, I worked with raw files having dpi set to 5400.
By setting the custom dpi to 4000 during scan-from-disk, I downsampled
by 4000/5400.

Unfortunately, I've gotten a fair way into my project now, when I
happened to try a bi-cubic downsample of the raw file, through
Photoshop. Zooming both to 200%, serious "jaggies" are evident in the
Vuescan result, while Photoshop's bicubic output is much smoother.

Groan!

That's not new. It is well documented:

"You can use this option to write files with a reduced number of
pixels. For instance, if size reduction is set to 3, then every 3x3
block of pixels in the image will be written as a single pixel, which
is the average of these 9 pixels."
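The averaging described there is plain block binning. A minimal sketch in Python (grayscale values; dimensions assumed to be exact multiples of the block size - an illustration of the documented behaviour, not VueScan's actual code):

```python
def bin_pixels(image, block):
    """Average each block x block tile into one output pixel.

    `image` is a list of rows of grayscale values; width and height are
    assumed to be exact multiples of `block`.
    """
    out = []
    for y in range(0, len(image), block):
        row = []
        for x in range(0, len(image[0]), block):
            tile = [image[y + dy][x + dx]
                    for dy in range(block) for dx in range(block)]
            row.append(sum(tile) / len(tile))
        out.append(row)
    return out

# As the help text says: a 3x3 block of pixels becomes their average.
img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
print(bin_pixels(img, 3))  # [[50.0]]
```

Note that binning only supports integer size reductions, which is why it cannot cover arbitrary ratios like 4000/5400 on its own.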

Winfried
 
Mendel Leisk

Well that is in the description of the "output" tab. My downsample
setting is set in:

Input|Preview Resolution (I've activated "scan from preview")

In that page of the help file the description is:

If set to "Custom", the resolution is selected by the "Input|Preview
dpi" option

Even if the help file did go into more detail, on the Input tab
setting, I'd still be in the dark. I really did not understand the
downsampling options, and their impact, and am just getting an inkling
now.

Just from reviewing the two results, I think Vuescan is not the place
to be downsampling. Your results will look much better under close
scrutiny if you use Photoshop's bi-cubic.
 
Winfried

Mendel said:
Well that is in the description of the "output" tab. My downsample
setting is set in:

Input|Preview Resolution (I've activated "scan from preview")

In that page of the help file the description is:

If set to "Custom", the resolution is selected by the "Input|Preview
dpi" option

Even if the help file did go into more detail, on the Input tab
setting, I'd still be in the dark. I really did not understand the
downsampling options, and their impact, and am just getting an inkling
now.

Just from reviewing the two results, I think Vuescan is not the place
to be downsampling. Your results will look much better under close
scrutiny if you use Photoshop's bi-cubic.


That's right.
But it is better to scan at 4000 dpi and downsample to 2000 dpi in
VueScan than to scan at 2000 dpi from the beginning.
Therefore I normally scan at full resolution (at least with
slides/negatives).
Of course, if I use a flatbed as a copy machine, I use 300 dpi.

Winfried
 
Bart van der Wolf

Mendel Leisk said:
Just a heads-up, Vuescan appears to be using "nearest neighbour"
for downsampling, when used in the scan-from-disk process. At
least the result is similar to that process as done through
Photoshop. SNIP
... , when I happened to try a bi-cubic downsample of the raw file,
through Photoshop. Zooming both to 200%, serious "jaggies" are
evident in the Vuescan result, while Photoshop's bicubic output is
much smoother.

It was answered by Ed Hamrick (long before he was driven away from
this group by a few VueScan bashers) that he uses a bi-linear kind of
interpolation:
<http://www.google.com/[email protected]>

Good down-sampling uses a better kind of method, but it is relatively
(computationally) expensive. If you want the best results, don't even
use Photoshop but "ImageMagick" for downsampling. I have not found a
better method than their implementation of a CatRom, Lanczos, or Sinc
filtered down-sampling.
<http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm>
It offers the best compromise between aliasing and resolution,
followed by Mitchell and Bessel but at the expense of some resolution,
while their Cubic filter suppresses virtually all aliasing risk, but
at further loss of resolution.
If you want to use photoshop because it better suits your workflow,
then don't follow Adobe's Bi-cubic "sharper" recommendation, but just
use regular Bi-cubic:
<http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/example1.htm>

The best results depend on the input signal, so the above mentioned
pre-filter methods should allow you to pick the best for your purpose.

Bart
 
Mendel Leisk

Thanks for the links/tips, Bart. Before I do ANY more downsampling,
I'll try some of these. The big hassle for me is 200+ meg files from my
Elite 5400. Taking them down to 4000ppi cuts the file size almost in
half.
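The "almost in half" matches the arithmetic: pixel count, and hence file size, scales with the square of the resolution ratio. A quick check with the figures from the post:

```python
# File size scales with pixel count, i.e. with the resolution ratio squared.
ratio = (4000 / 5400) ** 2
print(round(ratio, 3))  # 0.549: a 4000 ppi file is roughly half the 5400 ppi size
```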
From observation, I think the Vuescan downsample is closer to PS's
"nearest neighbour" than PS's "bi-linear". This is looking at well
defined diagonal edges, at 200% zoom.
 
Bart van der Wolf

Mendel Leisk said:
Thanks for the links/tips, Bart. Before I do ANY more
downsampling, I'll try some of these. The big hassle for
me is 200+ meg files from my Elite 5400.

You're welcome. Yes, the 5400 produces huge files (2 or 3 per CD,
depending on 48/64-bit).
Taking them down to 4000ppi cuts the file size almost
in half.

It all depends on the resolution in the original film image, and the
intended use.
If your original film image does resolve lots of detail (detailed
subject + good lens + low ISO film + good technique / tripod / mirror
lockup), the 5400 ppi scan will catch almost all there is in terms of
resolution. Down-sampling can lose real resolution, if it is there to
begin with in the film image.

If there is less detail, i.e. subject / lens / tripod limitations,
then down-sampling will not hurt available resolution much, but it
will benefit the S/N ratio. Both scanner noise and film graininess
will improve, but only if the down-sampling algorithm doesn't increase
grain-aliasing too much. That will require a good down-sampling
algorithm.
The file size reduction will also allow you to store more images, and
lower-noise images will also compress better.

Bart
 
Erik Krause

Bart van der Wolf said:
Good down-sampling uses a better kind of method, but it is relatively
(computationally) expensive. If you want the best results, don't even
use Photoshop but "ImageMagick" for downsampling. I have not found a
better method than their implementation of a CatRom, Lanczos, or Sinc
filtered down-sampling.

The Panorama Tools have the same or better interpolators, but the best
(to my knowledge) is QImage's - although I must admit that I never tried
it for downsampling. However, it does a superb upsampling. Here is a
comparison of Fred Miranda's Resize Pro to QImage:
http://www.ddisoftware.com/testpics/p-rp.jpg
 
Kennedy McEwen

Bart van der Wolf said:
It was answered by Ed Hamrick (long before he was driven away from this
group by a few VueScan bashers) that he uses a bi-linear kind of
interpolation:
<http://www.google.com/[email protected]>
So what is "approximately bi-linear"?

It either is bi-linear or it isn't bi-linear.

Ed's description of his algorithm is akin to being a little bit
pregnant!

There are many different interpolation algorithms, some better, some
worse, but "bi-linear" is a very specific algorithm that Vuescan either
is or is not compliant with.
 
Bart van der Wolf

Kennedy McEwen said:
So what is "approximately bi-linear"?

It either is bi-linear or it isn't bi-linear.

It is bi-linear, but with a twist ;-). To me it's obvious that Ed
didn't want to fully disclose his proprietary algorithm. That's fine
with me because I use my own preferences when repurposing the scanned
data. I usually scan at full native resolution, and I use Qimage's
Vector or Pyramid interpolation for significant up-sampling, and
ImageMagick's Lanczos or Sinc prefiltering for most (pictorial)
down-sampling.

Bart
 
Kennedy McEwen

Bart van der Wolf said:
It is bi-linear, but with a twist ;-).

But without telling anyone anything about what that "twist" is, we can
only base our conclusions on the results which, as Mendel pointed out,
are grossly inferior to bilinear.

Incidentally, from around version 1.4 up to version 7, PaintShop Pro
used a twist in their "bilinear" interpolation which meant it was much
faster to compute but the results were also grossly inferior to proper
bilinear interpolation. Despite constant bug reports at every new
release, Jasc would not even so much as acknowledge the error. Only
after continual demonstration of the shortfall in every post that Kris
Zaklika made on the subject of interpolation on the PSP newsgroup were
they finally forced to fix it - with far more mud-slinging from PSP
fanatics than the Vuescan aficionados throw at its critics here, I can
assure you.

That twist? Use nearest neighbour interpolation for downsampling
throughout and when upsampling use nearest neighbour for the integer
part and linear interpolation for the fractional residue of the scaling.
So upsampling from say 100ppi to 250ppi produced 2.5 pixels for each
original, or 5 pixels from every two. Four of these were exactly the
same as the original nearest neighbours, and only the fifth was actually
produced by linear interpolation. A lot less computation, and thus
faster to implement, but far closer to nearest neighbour scaling than
bilinear!
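The contrast Kennedy draws can be sketched in 1-D: true linear interpolation blends the two neighbouring samples, while nearest-neighbour simply replicates one of them, which is what produces the hard steps. A minimal pure-Python illustration (my own sketch, not Jasc's or VueScan's actual code):

```python
def resample_nearest(src, n_out):
    # Replicate the nearest source sample for each output position.
    n_in = len(src)
    return [src[min(n_in - 1, int(j * n_in / n_out + 0.5))]
            for j in range(n_out)]

def resample_linear(src, n_out):
    # True linear interpolation: blend the two neighbouring samples.
    n_in = len(src)
    out = []
    for j in range(n_out):
        s = j * (n_in - 1) / (n_out - 1)  # map output index into source coords
        i = int(s)
        f = s - i                          # fractional residue
        right = src[min(i + 1, n_in - 1)]
        out.append(src[i] * (1 - f) + right * f)
    return out

edge = [0, 0, 100, 100]            # a hard edge, like Mendel's test image
print(resample_nearest(edge, 7))   # only original values: hard, jagged steps
print(resample_linear(edge, 7))    # intermediate values smooth the transition
```

The nearest-neighbour output contains only the original 0 and 100 values, while the linear output introduces an in-between value at the edge; that difference is exactly what shows up as jaggies versus a smooth ramp at 200% zoom.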

Since they used proper bicubic interpolation when that was introduced,
presumably from a public software library rather than Jasc home baking,
the difference between the two was astounding and resulted in many PSP
users discounting bilinear as a waste of time - even when a proper
bilinear algorithm would have been perfectly adequate and much faster
than the bicubic they were forced to use.

I am not suggesting that this is the same twist as Ed has used, although
Mendel's results suggest it might be. However, misuse of these standard
algorithm terms can be very misleading - in Jasc's case deliberately so,
since they used very specific examples which concealed the deficiency
every time the topic was raised.

If it isn't bilinear then it shouldn't be called bilinear because,
whether deliberately or not, it is deceptive terminology. What Vuescan
implements is, for want of a better term, "Hamrick Interpolation", and
without reverse engineering it, we can't say precisely how close to
nearest neighbour or bilinear it actually is.
 
Bart van der Wolf

SNIP
I am not suggesting that this is the same twist as Ed has used,
although Mendel's results suggest it might be.

I believe Mendel said "Vuescan appears to be using "nearest neighbour"
for downsampling, when used in the scan-from-disk process. At least the
result is similar to that process as done through Photoshop".

That indicated to me that there is little difference, so indeed quite
different from the earlier PSP method you described. I have not tested
it because bi-linear is vastly inferior to many other solutions
available.

If I were Ed, I would have implemented a different resampling
algorithm, but it's his program and our choice to either use the
functionality or not. Who knows, he might still change it, due to
popular demand (which caused him to add interpolation in the first
place) or as a challenge. He could use
<http://astronomy.swin.edu.au/~pbourke/analysis/interpolation/> as
inspiration ..., and maybe even add another twist of his own, like
proper lowpass prefiltering.

Bart
 
Kennedy McEwen

Bart van der Wolf said:
SNIP

I believe Mendel said "Vuescan appears to be using "nearest neighbour"
for downsampling, when used in the scan-from-disk process. At least the
result is similar to that process as done through Photoshop".
Indeed - which is why I suggested that this might indeed be the same
process as Ed has used, but calls "approximately bilinear".
That indicated to me that there is little difference, so indeed quite
different from the earlier PSP method you described.

Read again my description of exactly what Jasc passed off as bi-linear
interpolation for at least 4 major version releases:
Use nearest neighbour interpolation for downsampling throughout

ie. *exactly* the same as Photoshop nearest neighbour downsampling.
when upsampling use nearest neighbour for the integer part and linear
interpolation for the fractional residue of the scaling.

ie. marginally distinguishable from Photoshop nearest neighbour for
scales of 2 and above.

I don't see how you can consider it to be quite different from what I
described, since it is exactly what I described for the condition that
Mendel referred to!
I have not tested it because bi-linear is vastly inferior to many other
solutions available.

But significantly superior to either nearest neighbour or the crap that
Jasc peddled for years under the lie of "bi-linear" and much faster
computationally than any of the more complex routines.
 
Mendel Leisk

You're correct in your interpretation of my statement. To confirm, the
test image had a bright edge of someone's face, against a dark
background, sloping slightly off vertical.

Viewing this area at 200%, with Photoshop's bi-linear and bi-cubic, the
results seemed identical, and quite smooth. With Photoshop's nearest
neighbour, though, I see the smooth edge breaking up into distinct
lines, creating jagged steps.

The appearance of similar area/zoom after Vuescan downsample by same
percent, looks exactly like Photoshop nearest neighbour.

This tanked about 3 months work for me :(
 
Bart van der Wolf

SNIP
The appearance of similar area/zoom after Vuescan
downsample by same percent, looks exactly like Photoshop
nearest neighbour.

Ed possibly didn't include proper pre-filtering before down-sampling.
In fact, it wouldn't surprise me if actually he uses binning (like
with "File size reduction" which is already available), topped off by
bilinear interpolation for the remainder to reach the desired output
size, but that's just my speculation.

Almost any interpolation scheme (including bi-cubic) will produce bad
aliased results without proper preparation. It was only as recent as
Photoshop CS that Adobe improved their bicubic algorithms, and they
still didn't get it right IMHO (edge artifacts, and halo or aliasing).

The best down-sampling implementation I use is still the free
ImageMagick one. It offers a choice of prefilters for all sorts of
image content, and proper handling of edge pixels. The UI (command line
or script) may be a bit daunting for some, but the results rule.

Ed may have been more concerned with the speed penalty of a better
interpolation, as scans can be quite large at native resolution. I
would recommend he take another look at it; although I personally
won't use it a lot, there may be others who do.

Bart
 
Mendel Leisk

Bart said:
SNIP

Ed possibly didn't include proper pre-filtering before down-sampling.
In fact, it wouldn't surprise me if actually he uses binning (like
with "File size reduction" which is already available), topped off by
bilinear interpolation for the remainder to reach the desired output
size, but that's just my speculation.

Almost any interpolation scheme (including bi-cubic) will produce bad
aliased results without proper preparation. It was only as recent as
Photoshop CS that Adobe improved their bicubic algorithms, and they
still didn't get it right IMHO (edge artifacts, and halo or aliasing).

The best down-sampling implementation I use is still the free
ImageMagick one. It offers a choice of prefilters for all sorts of
image content, and proper handling of edge pixels. The UI (command line
or script) may be a bit daunting for some, but the results rule.

Ed may have been more concerned with the speed penalty of a better
interpolation, as scans can be quite large at native resolution. I
would recommend he take another look at it; although I personally
won't use it a lot, there may be others who do.

Bart

I did look into the ImageMagick software you refer to. I had great
difficulty in the download/install process. What I gather is that
ImageMagick is a fairly esoteric program, requiring a fair bit of
savvy. For install on a Windows system, it might involve pre-installing
another program. I did download a package and unzip it, but could not
see where to go from there.

At this point I elected to use Photoshop's bi-cubic, which you're
saying is almost as good. It holds up to close (200~300% zoom) scrutiny;
to my eyes there is no apparent breaking up of diagonal edges.

My project is archiving family slide collections, relatively grainy,
hand held. 4000 dpi downsampling makes a tremendous reduction in
storage requirements.

With my Elite 5400 and the Minolta Scan Utility, I output a 16 bit
linear with ICE/GD enabled, at the full 5400dpi. Then inspect each file
in PS, cleaning what ICE has missed, with healing brush etc. Then
bicubic downsample to 4000. Then scan-from-disk from this file, with
Vuescan.
 
Don

The appearance of similar area/zoom after Vuescan downsample by same
percent, looks exactly like Photoshop nearest neighbour.

This tanked about 3 months work for me :(

I'm just curious... You've reported a number of serious problems with
VueScan over the months. Why do you still stick with it?

This is a genuine question. Are there any specific features which you
can't get elsewhere and, if so, what are they?

Don.

BTW, it's not the first time this has happened. I clearly remember a
German company a while back also with several months of work down the
tubes. Back then it was some noise VueScan introduced, I believe.
 
Mendel Leisk

I simply use Vuescan for the portions of my workflow where it is
convenient, efficient and working to my liking, and Minolta Scan
Utility and Photoshop for the areas where Vuescan is falling short. I
like the Vuescan interface, and the control. For me, its main
shortcoming at present is the quality of infrared cleaning, and now
this downsampling issue. I would hope these issues can be resolved. In
the interim, I use what works.
 
Mendel Leisk

Missed a portion of your question. Specific features I like:

scan-from-disk concept

simple, fairly efficient, crop and rotate

scanner calibration with targets (just considering this)

fairly good white balancing

color neg film advanced workflow (just considering this)

Note, I'm by NO means a pro. I'm sure there are other avenues for most
of the above, I'm just not that advanced/knowledgeable, and Vuescan, at
least some of it, works for me.
 
Bart van der Wolf

SNIP
I did look into the ImageMagick software you refer to. I
had great difficulty in the download/install process.

What I gather is that ImageMagick is a fairly esoteric
program, requiring a fair bit of savvy. For install on
Windows system, it might involve pre-install of another
program. I did download a package and unzip it, but
could not see where to go from there.

Assuming you do work on a Windows platform, you could download
<http://prdownloads.sourceforge.net/imagemagick/ImageMagick-6.2.1-Q16-windows-dll.exe?download>
to get the precompiled 16-bit/channel version installer (version
6.2.1).
After installation (check all features if in doubt) you can use the
command prompt (Start|Run...|cmd).

Then it'll require some old-fashioned typing-in of DOS commands, like CD
followed by a space to change to the directory with your images (e.g.
C:\temp). Then use the ImageMagick command "Convert" to downsample
(e.g. type: convert filename1.tif -filter sinc -resize 4000x4000
filename2.tif). That will use the "Sinc" pre-filter to resize
"filename1.tif" to "filename2.tif" and fit the longer dimension in a
4000x4000 pixel box. You can use your favorite sharpening
program/method on that and save a JPEG version.
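For anyone who would rather script that command than retype it per file, it can be wrapped, e.g. in Python. The wrapper itself is my own hypothetical sketch; it only assumes ImageMagick's convert (6.x) is installed and on the PATH:

```python
import subprocess

def build_convert_cmd(src, dst, box=4000, filt="sinc"):
    # Mirrors the command quoted above: apply the chosen pre-filter and
    # fit the longer image dimension within a box x box pixel bound.
    return ["convert", src, "-filter", filt, "-resize", f"{box}x{box}", dst]

def im_downsample(src, dst, box=4000, filt="sinc"):
    # Run ImageMagick; raises CalledProcessError if convert fails.
    subprocess.run(build_convert_cmd(src, dst, box, filt), check=True)

# Example (uncomment with real files present):
# im_downsample("filename1.tif", "filename2.tif")
```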

You can also convert from TIFF to other file types, e.g. JPEG, in the
same operation by changing the second file's extension to .jpg, and by
adjusting the compression quality and chroma subsampling parameters if
the defaults don't suit your goal.
At this point I elected to use Photoshop's bi-cubic, which
you're saying is almost as good. It holds up to close
(200~300% zoom) scrutiny; to my eyes there is no apparent
breaking up of diagonal edges.

Yes, it's close but less than optimal, although it may be good enough
for the purpose.
My project is archiving family slide collections, relatively
grainy, hand held. 4000 dpi downsampling makes a
tremendous reduction in storage requirements.

Indeed, especially if down-sampling from a 5400 ppi sampling density
(which is best to reduce graininess with the DSE-5400, with the Grain
Dissolver activated).
With my Elite 5400 and the Minolta Scan Utility, I output
a 16 bit linear with ICE/GD enabled, at the full 5400dpi.
Then inspect each file in PS, cleaning what ICE has missed,
with healing brush etc. Then bicubic downsample to 4000.
Then scan-from-disk from this file, with Vuescan.

Sounds like a good workflow. Going from 5400 to 4000 ppi may actually
lose very little real resolution, depending on the actual captured
film resolution, although graininess will benefit from 5400 ppi +
"perfect" downsampling.

Bart
 
