24-bit images vs. 32-bit video cards and OS?

T

Thierry

Hi,

A technical question about video boards.

The human eye detects 256 levels of gray. Applied to the RGB channels, it is able
to see 256x256x256 = 16.77 million colors, so as many as a 24-bit card
(2^24 = 16.77 million).

Question: why do we sell 32-bit cards if 24 bits are enough? (In another
field, 24-bit audio boards are common, but for other reasons.)
I don't think it is related to the bus being 32-bit, or to the processor
or the OS working in 32 bits.
The reason seems to be something else...
Or am I confusing the 24-bit image format with the 32-bit bus size?
In that case it is even worse, because I thought all software was
running in a 32-bit environment, if not higher.

Any idea ?

Thanks
Thierry
 
T

Thierry

Bob Myers said:
Not exactly. It would be more correct to say that, at
any given state of adaptation, the human eye can cover
a dynamic range somewhere in the neighborhood of the
low hundreds to the very low thousands to one. That
SHOULD translate to an equivalent "bits of gray level"
of something like 8-10, maybe 12 at the outside, bits -
however:

OK. You have given me one of the most complete explanations I have read so
far. Superb.
You know your subject well!
Also good to know that some non-linear 8-bit/color screens can compete
with 10-12-bit ones.

But one dark pixel remains ;-)

On one side I have 24 bits to manage 16 million colors, and on the other
side a 32-bit system.
You speak of bits per color, that's clear. But what becomes of my 8 additional
bits (from 24 to 32)?
Are these 32 bits required by the OS, or rather by the hardware (data bus)?
For example, could we manufacture 24-bit video cards? (They would be exactly
suited to the number of colors to manage, wouldn't they?)

Thanks for enlightening me.

Thierry
 
C

chrisv

Bob said:
Having said that - 30 bits/pixel (10 bits/color) and 36 bits/pixel
(12 bits/color) are certainly not unheard of in professional
graphics applications. An 8-bit/color (24 bits/pixel) system can
also be perceptually improved by employing a non-linear
encoding scheme for the RGB representations.

I'll say that whatever the theory is, in practice most people with
most (CRT) monitors under most ambient-lighting conditions would have
a tough time differentiating 24-bit color from anything "better".
I've a VB program that fills your screen with 256 bands, from 0 to 255
brightness, and I sure can't see the bands...
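
For illustration, here is a rough Python equivalent of the band test chrisv
describes (not his VB program; the Pillow imaging library is assumed to be
available):

# Draw 256 vertical gray bands, levels 0..255, and save them as an image.
# View the result full-screen: if no bands are visible, 8 bits/channel is
# already below the visibility threshold under those conditions.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1024, 256                 # 4 pixels per band at this width
img = Image.new("L", (WIDTH, HEIGHT))     # "L" = 8-bit grayscale
draw = ImageDraw.Draw(img)

band = WIDTH / 256
for level in range(256):
    draw.rectangle([int(level * band), 0,
                    int((level + 1) * band) - 1, HEIGHT - 1],
                   fill=level)

img.save("gray_ramp.png")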
 
T

Thierry

chrisv said:
I'll say that whatever the theory is, in practice most people with
most (CRT) monitors under most ambient-lighting conditions would have
a tough time differentiating 24-bit color from anything "better".
I've a VB program that fills your screen with 256 bands, from 0 to 255
brightness, and I sure can't see the bands...

I have already read that most people can see only about 200 bands.
Here is an example, the orange strip in the middle of the page:
http://www.arnaudfrichphoto.com/gestion-de-la-couleur/couleur-gamma.htm
(of course you also need an excellent screen and, if possible, a calibrated one).

Thierry
http://www.astrosurf.org/lombry
 
B

Bob Myers

Thierry said:
Hi,

A technical question about video boards.

The human eye detects 256 levels of gray.

Not exactly. It would be more correct to say that, at
any given state of adaptation, the human eye can cover
a dynamic range somewhere in the neighborhood of the
low hundreds to the very low thousands to one. That
SHOULD translate to an equivalent "bits of gray level"
of something like 8-10, maybe 12 at the outside, bits -
however:

1. The eye's response isn't linear. We are more sensitive
to changes at the low end of the current adaptation range,
and less so at the high end. (i.e., we see details in shadows
far better than we see details in very bright areas). The eye's
response curve is, by an odd coincidence, almost the exact
inverse of a CRT's "gamma" curve (which means that linear
changes in the input level to a CRT will be perceived as linear
changes in luminance, but that's a somewhat-separate issue...).

2. The above statement applies to luminance changes
(the perception of "brightness") only; how we perceive
changes in color (hue and saturation) is a good deal more
complex, and the individual RGB channels of the typical color
image do not contribute equally.

3. Variations in the response curve of the various display
technologies can also influence the number of bits/color you
need for a "good" image.
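
As a back-of-the-envelope check on the dynamic-range-to-bits translation
above (a sketch only, assuming roughly one code value per distinguishable
step):

import math

# log2 of the usable contrast ratio gives the equivalent number of
# gray-level bits, assuming (very roughly) one code per visible step.
for contrast in (256, 1000, 4000):   # "low hundreds" to "very low thousands" to one
    print(f"{contrast}:1  ->  about {math.log2(contrast):.1f} bits")
# 256:1 -> 8.0 bits, 1000:1 -> 10.0 bits, 4000:1 -> 12.0 bits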

Applied to the RGB channels, it is able
to see 256x256x256 = 16.77 million colors, so as many as a 24-bit card
(2^24 = 16.77 million).

Again, not exactly, for the reasons given above and others.

Question: why do we sell 32-bit cards if 24 bits are enough? (In another
field, 24-bit audio boards are common, but for other reasons.)
I don't think it is related to the bus being 32-bit, or to the processor
or the OS working in 32 bits.
The reason seems to be something else...
Or am I confusing the 24-bit image format with the 32-bit bus size?
In that case it is even worse, because I thought all software was
running in a 32-bit environment, if not higher.

"32 bit" graphics cards typically provide only 8 bits per primary;
the additional 8 bits per pixels does a couple of things. First,
it means that the pixels line up "nicer" within the memory space, but
also these additional bits can be used to carry other information
besides the basic RGB intensity values (such as "alpha," or
transparency, values).
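
As a sketch of how those 32 bits are typically carved up (Python; the
A-R-G-B byte order here is only an assumption for illustration, since real
frame buffers also use BGRA and other layouts):

# Pack four 8-bit channels into one 32-bit word and pull them back out.
# The top byte may hold real alpha (transparency) or just be unused padding.
def pack_argb(a, r, g, b):
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_argb(pixel):
    return ((pixel >> 24) & 0xFF,   # alpha / padding
            (pixel >> 16) & 0xFF,   # red
            (pixel >> 8) & 0xFF,    # green
            pixel & 0xFF)           # blue

pixel = pack_argb(0xFF, 200, 120, 30)
print(hex(pixel))                   # 0xffc8781e
print(unpack_argb(pixel))           # (255, 200, 120, 30)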

Having said that - 30 bits/pixel (10 bits/color) and 36 bits/pixel
(12 bits/color) are certainly not unheard of in professional
graphics applications. An 8-bit/color (24 bits/pixel) system can
also be perceptually improved by employing a non-linear
encoding scheme for the RGB representations.

Bob M.
 
T

Thierry

Bob Myers said:
Sure - but then, a CRT monitor under typical office/home
ambient doesn't cover the range of possible places where we might
be wondering about "24-bit color." I've also got some pretty
commonly-occurring examples where I will pretty much guarantee
you anyone will see the banding.

Does it mean that there are conditions in which a 12- or even 14-bit/color
screen (or graphics card) can be useful, and thus will display an image
different from (and of course finer than) the one displayed on an 8-bit/color
screen? (I am thinking of LaCie vs. Eizo screens.)
Can we really distinguish the difference with the naked eye (or is this only
theory, visible only with instruments)?

Thierry
 
B

Bob Myers

chrisv said:
I'll say that whatever the theory is, in practice most people with
most (CRT) monitors under most ambient-lighting conditions would have
a tough time differentiating 24-bit color from anything "better".

Sure - but then, a CRT monitor under typical office/home
ambient doesn't cover the range of possible places where we might
be wondering about "24-bit color." I've also got some pretty
commonly-occurring examples where I will pretty much guarantee
you anyone will see the banding.

Part of the problem with some systems is that you don't really
get much better than 8 bit/color performance anyway...note that
8-bit accuracy with a 0.7 V video signal (standard analog video)
requires that everything is held accurate/stable to 0.7/255 volts,
or about 2.7 mV. And best of luck with that...

Bob M.
 
D

David Phillip Oster

The common representation of a pixel is 8 bits red, 8 bits green, 8 bits
blue, with pixels aligned on 4-byte boundaries.

In the past, some video cards simply failed to populate the memory for
the bits that didn't correspond to color data. However, that extra byte
in a 32-bit word is often used for opacity, known as alpha, when
compositing images. Since textures (images) are often stored on the video
card for 3D work, it is simpler, in these days of large memory chips,
just to populate the entire memory space of the video card.

When you are doing photo editing, you often want many more than 8 bits
per channel since, as pointed out, when the eye is looking at a dark
portion of an image it perceives one range, and when looking at a bright
portion it sees a different range.

<http://www.openexr.com/> OpenEXR is a high dynamic-range (HDR) image
file format developed by Industrial Light & Magic for use in computer
imaging applications. It represents a pixel as 3 floating point numbers
to be able to handle editing that shifts the brightness and contrast.

Some advanced video cards natively support a subset of this format.

<http://www.cybergrain.com/tech/hdr/> is a must-read review article on
why you might want such a thing.
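
A tiny illustration of the point about floating-point pixels (plain Python
floats here, not the actual OpenEXR format or API): pushing the exposure up
and back down clips an 8-bit value but round-trips cleanly as a float.

# Raise exposure by 2 stops (x4), then lower it again.
def brighten_8bit(value, factor):
    return min(255, int(value * factor))        # anything over 255 is clipped away

def brighten_float(value, factor):
    return value * factor                        # no clipping, detail is preserved

v8, vf = 200, 200 / 255.0                        # same starting brightness
up8, upf = brighten_8bit(v8, 4), brighten_float(vf, 4)
print(brighten_8bit(up8, 0.25), "vs original", v8)   # 63 vs original 200 -> detail lost
print(round(brighten_float(upf, 0.25) * 255))        # 200 -> recovered exactly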
 
T

Thierry

David Phillip Oster said:
The common representation of a pixel is 8 bits red, 8 bits green, 8 bits
blue, with pixels aligned on 4-byte boundaries.

In the past, some video cards simply failed to populate the memory for
the bits that didn't correspond to color data. However, that extra byte
in a 32-bit word is often used for opacity, known as alpha, when
compositing images. Since textures (images) are often stored on the video
card for 3D work, it is simpler, in these days of large memory chips,
just to populate the entire memory space of the video card.

When you are doing photo editing, you often want many more than 8 bits
per channel since, as pointed out, when the eye is looking at a dark
portion of an image it perceives one range, and when looking at a bright
portion it sees a different range.

<http://www.openexr.com/> OpenEXR is a high dynamic-range (HDR) image
file format developed by Industrial Light & Magic for use in computer
imaging applications. It represents a pixel as 3 floating point numbers
to be able to handle editing that shifts the brightness and contrast.

Some advanced video cards natively support a subset of this format.

<http://www.cybergrain.com/tech/hdr/> is a must-read review article on
why you might want such a thing.

Thanks, David. This helps me too.
Thierry
 
B

Bob Myers

Thierry said:
Does it mean that there are conditions in which a 12- or even 14-bit/color
screen (or graphics card) can be useful,

12 bits/color, yes, but that's pretty much the limit - at least
I have not heard of anyone trying to get anything better than
that. It's difficult enough just to get an honest 12 bits, and
the applications that truly need this level of performance (and
the users and conditions where such will actually make a
difference) are pretty few and far between. The average user,
in the average environment, might in some cases spot the
difference between 8 and 10; the pro might be in the same
situation when comparing 10 and 12. Beyond 12, I think
anyone would be hard pressed to find a real need.

Bob M.
 
B

Bob Myers

Bob Myers said:
12 bits/color, yes, but that's pretty much the limit - at least
I have not heard of anyone trying to get anything better than
that. It's difficult enough just to get an honest 12 bits, and
the applications that truly need this level of performance (and
the users and conditions where such will actually make a
difference) are pretty few and far between. The average user,
in the average environment, might in some cases spot the
difference between 8 and 10; the pro might be in the same
situation when comparing 10 and 12. Beyond 12, I think
anyone would be hard pressed to find a real need.

Just occurred to me - I should probably add that the above
assumes the optimal encoding schemes in each case. Simple
linear encoding of N bits in an RGB system is NOT optimal.
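
A small sketch of what "not optimal" means in practice (Python; the 2.2
exponent is just a common approximation used here for illustration, not a
specific standard):

GAMMA = 2.2

def linear_lum(code):                 # luminance if the 8-bit code is stored linearly
    return code / 255.0

def gamma_lum(code):                  # luminance if the code is gamma-encoded
    return (code / 255.0) ** GAMMA    # decode back to linear light

for lo in (1, 200):                   # a shadow step and a highlight step
    lin = linear_lum(lo + 1) - linear_lum(lo)
    gam = gamma_lum(lo + 1) - gamma_lum(lo)
    print(f"codes {lo}->{lo + 1}: linear step {lin:.6f}, gamma-coded step {gam:.6f}")
# In the shadows the gamma-coded step is far smaller than the linear one,
# i.e. the non-linear coding spends its 256 codes where the eye can see them.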

Bob M.
 
T

Thierry

Bob Myers said:
Just occurred to me - I should probably add that the above
assumes the optimal encoding schemes in each case. Simple
linear encoding of N bits in an RGB system is NOT optimal.

Thanks. Of course the average user can be satisfied with 8 bits and low
contrast.
This is strange. I don't really see the difference between linear and non-linear
mode, except that the gradation is quite different (for a contrast ratio of 100:1,
the first shows 9900 shades vs. only 460). So the non-linear RGB
representation looks even worse. Of course I am wrong somewhere, but I
can't really find an article explaining the advantages of the second
representation over the first (or vice versa).

I ask you because I have to buy a new TFT for image processing, 3D and
rendering, 19" or maybe 21", 4/5 if not 16/9, supporting the highest contrast
and finest gradation possible (a LaCie 321 or maybe an Apple Cinema), and I
was surprised that Brightside was able to produce a 37" screen showing an ANSI
contrast of 60000:1 (knowing that a backlit scene is close to
50000:1) using 16 bits/color, while the LaCie reaches "only" 500:1 using 10
bits in linear mode (?, not sure), and its so-called Asian competitors (but
less capable) sometimes reach 1200-1500:1 (and up to 5000:1 for flat-screen
TVs).

So as photographers and other experts demand high contrast for
post-processing and zoom functions, and no doubt soon HDR support (already
available in Photoshop CS), this "low contrast" (below 1000:1) and 8- or even
10-bit linear encoding will soon look old-fashioned... (of course I exaggerate
a bit, but maybe not for experts like photographers or for space imaging).

Thierry
 
B

Bob Myers

Thierry said:
I ask you because I have to buy a new TFT for image processing, 3D and
rendering, 19" or maybe 21", 4/5 if not 16/9, supporting the highest contrast
and finest gradation possible (a LaCie 321 or maybe an Apple Cinema), and I
was surprised that Brightside was able to produce a 37" screen showing an ANSI
contrast of 60000:1 (knowing that a backlit scene is close to
50000:1) using 16 bits/color, while the LaCie reaches "only" 500:1 using 10
bits in linear mode (?, not sure), and its so-called Asian competitors (but
less capable) sometimes reach 1200-1500:1 (and up to 5000:1 for flat-screen
TVs).

The Brightside technology is a whole different animal; it doesn't
have a 16-bit LCD in it any more than any other LCD display
does, but it does some very interesting tricks elsewhere to create the
huge dynamic range. I've seen it, and it DOES look very impressive.

Extremely high contrast numbers in most displays, and especially some
of the things you see quoted for a lot of plasma TVs these days, are
what I like to call "science fiction specifications" - yes, you CAN
measure a pair of numbers that will give you the "X thousand to one"
sort of contrast numbers, but (a) that's not the same thing as the actual
dynamic range delivered by the panel, and (b) you don't actually see
anything remotely like that contrast under typical viewing conditions
(nor will you - it's simply not gonna happen).

Bob M.
 
