Industrial said:
This has happened on 3 cameras I bought in a row.
The equipment you speak of is a built-in Blu-ray drive in my computer, with decoding done by FFDShow. But I really don't see the relevance. This is digital. It either works or it doesn't. The only artifacts that do exist from improper decoding are corrupted frames, wild colors you normally need to drop acid to see, or slight brightness/contrast offsets. Blurriness is never a digital media artifact.
Btw, I welcome you to give me any screenshot straight from a Blu-ray with true 1080p detail. That would be a sight to see.
Well, now you're mixing topics.
Video is a different animal from still camera shots. Video uses
both spatial and temporal compression (for video formats that
involve compression). If you shoot video of a perfectly still scene,
then the I-frame collected should be reasonably equivalent to a
still camera picture. If there is motion in the scene, any
particular frame selected from the video may not look
very good as a "still".
http://en.wikipedia.org/wiki/Video_compression_picture_types
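If you're curious what the I-frames in your own footage look like,
here's a rough Python sketch that pulls just the I-frames out as
images. It assumes ffmpeg is installed and on the PATH, and
"clip.mp4" is only a placeholder file name:

  # Extract only the I-frames, so you can compare them against
  # in-between (P/B) frames used as "stills".
  # Assumes ffmpeg is on the PATH; "clip.mp4" is a placeholder.
  import subprocess

  subprocess.run([
      "ffmpeg", "-i", "clip.mp4",
      "-vf", "select=eq(pict_type\\,I)",   # keep intra-coded frames only
      "-vsync", "vfr",                     # one output image per kept frame
      "iframe_%04d.png",
  ], check=True)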
As for your camera, and the lies they wrote on the side of the
box, I'll give an example. My webcam's native format is 1280x1024.
Yet it promises to shoot 5 megapixel stills. In fact, I know I've
got all the detail it has to offer if I shoot at the maximum
"native" resolution. That's what you want to do with your camera:
use the native value, not any value which requires interpolation
to make up in-between pixels to pad out the image.
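To see for yourself that interpolation adds pixels but not detail,
here's a rough sketch using the Pillow library. The file name and
the 5 MP target size are just placeholders standing in for my
webcam example:

  # Upscale a native-resolution still to the advertised "5 megapixel"
  # size. The interpolation invents in-between pixels but no new
  # detail: scaling back down gives (nearly) the same picture.
  # "native.png" is a placeholder; needs the Pillow package.
  from PIL import Image

  native = Image.open("native.png")          # e.g. 1280x1024 off the sensor
  padded = native.resize((2592, 1944), Image.LANCZOS)   # "5 MP" by interpolation
  roundtrip = padded.resize(native.size, Image.LANCZOS)

  padded.save("padded_5mp.png")
  roundtrip.save("roundtrip.png")            # compare this to native.png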
Some cheap cameras only record in compressed formats such as JPEG.
They do this because the built-in flash chip is so small, and they
want to be able to claim the ability to store a large number of
photos. The users, on the other hand, would want the image recorded
at native resolution, in an uncompressed format (BMP or some particular
TIFF format). Some camera users will select RAW as a format, but
this has implications for aspects other than just the number of
pixels.
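Here's a rough sketch of the storage trade-off, again with Pillow,
if you want to see why the makers push JPEG ("shot.png" is just a
placeholder input):

  # Save the same picture uncompressed (BMP, TIFF) and as JPEG,
  # then compare file sizes. This is why cheap cameras only offer JPEG.
  # "shot.png" is a placeholder; needs the Pillow package.
  import os
  from PIL import Image

  img = Image.open("shot.png").convert("RGB")
  img.save("shot.bmp")                  # uncompressed
  img.save("shot.tif")                  # TIFF, uncompressed by default
  img.save("shot.jpg", quality=85)      # typical in-camera JPEG setting

  for name in ("shot.bmp", "shot.tif", "shot.jpg"):
      print(name, os.path.getsize(name), "bytes")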
As for compression methods, there are "lossy" and "lossless" methods.
If you have an 8-bit photo, you can use GIF as a "lossless" compression
format. No information should be lost with a lossless compressor, but
the level of compression to expect is relatively low (maybe 3:1, to
just make up a value). Things like JPEG, on the other hand, have a
"quality" factor, which goes hand in hand with compression ratio.
I could probably get a 100:1 compression ratio with JPEG, but the
resulting image would only be fit for the Trash Can.
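You can reproduce that quality-versus-ratio trade-off yourself.
A rough Pillow sketch (again "shot.png" is only a placeholder):

  # Save one photo at several JPEG "quality" settings and print the
  # compression ratio versus the raw RGB pixel data. At very low
  # quality the ratio is huge, but the picture is trash.
  # "shot.png" is a placeholder; needs the Pillow package.
  import os
  from PIL import Image

  img = Image.open("shot.png").convert("RGB")
  raw_bytes = img.width * img.height * 3        # uncompressed RGB size

  for q in (95, 75, 50, 25, 5):
      name = f"q{q}.jpg"
      img.save(name, quality=q)
      ratio = raw_bytes / os.path.getsize(name)
      print(f"quality {q}: about {ratio:.0f}:1")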
http://en.wikipedia.org/wiki/Jpeg
"Sample photographs" [ three quarters of the way down the web page ]
Some cameras will "soften" the image before storage, and this may be
an attempt at noise reduction. You can "sharpen" the image with
Photoshop or equivalent, to get back some of the detail. Oversharpening
is a form of digital mutilation, so don't "turn the knob too far".
After a few trials, you'll get some idea of what looks natural, and
what looks like "too much".
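If you don't have Photoshop, Pillow ships an unsharp-mask filter
that does the same kind of sharpening. The settings below are only
starting points to experiment with ("soft.png" is a placeholder):

  # Apply a mild unsharp mask to bring back some detail from a camera
  # that softens the image in-firmware. Turn "percent" up too far and
  # you get halos around edges -- the digital mutilation I mentioned.
  # "soft.png" is a placeholder; needs the Pillow package.
  from PIL import Image, ImageFilter

  soft = Image.open("soft.png")
  sharpened = soft.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
  sharpened.save("sharpened.png")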
If you're shooting stills with a camera that has an excessively noisy
sensor, and the scene is *perfectly still*, you can shoot two photos,
one after the other, with exactly the same lighting, then use a
Photoshop arithmetic operation to compute (A+B)/2, the "average" of
the two images. This helps reduce the noise to some extent, without
degrading the image. But it only works for things like indoor scenes,
where everything in the scene is under your control. I used that
technique when preparing photos for a "how-to" manual for something
constructed indoors. Every shot consisted of two pictures, with the
pictures averaged together to get rid of camera sensor noise. That's
what you get from a $100 camera. Even with halogen lighting of the
scene, there's still sensor noise present. Better quality sensors make
that less evident (until it gets a lot darker).
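If you'd rather not do the (A+B)/2 step by hand in Photoshop, here's
a rough sketch with Pillow and numpy ("a.png" and "b.png" are
placeholder file names for the two shots):

  # Average two identical shots of a perfectly still scene to knock
  # down sensor noise: the scene adds coherently, the random noise
  # partly cancels. "a.png" and "b.png" are placeholders.
  import numpy as np
  from PIL import Image

  a = np.asarray(Image.open("a.png"), dtype=np.uint16)
  b = np.asarray(Image.open("b.png"), dtype=np.uint16)

  avg = ((a + b) // 2).astype(np.uint8)    # (A+B)/2 without overflow
  Image.fromarray(avg).save("averaged.png")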
HTH,
Paul