Noise reduction in an image


anonymous

I am looking for information about how to reduce noise in an image quickly
and with low processing cost. I've seen a couple of examples of other
approaches at http://en.wikipedia.org/wiki/Median_filter (at the bottom of
the page) but I can't understand the formula.

Currently I am taking the image down to a byte array and running a standard
median filter against the image, where you take the current pixel and its
surrounding pixels and select the median value, i.e.:

for (int y = 1; y < height - 1; y++)
    for (int x = 1; x < width - 1; x++)
    {
        ArrayList red = new ArrayList(), green = new ArrayList(), blue = new ArrayList();

        // gather the 3x3 neighbourhood centred on (x, y)
        for (int yy = y - 1; yy <= y + 1; yy++)
            for (int xx = x - 1; xx <= x + 1; xx++)
            {
                // add the red, green and blue bytes at (xx, yy) to the lists
            }

        // sort red, green, blue
        red.Sort();
        green.Sort();
        blue.Sort();

        // each list holds 9 values, so the median is the 5th value
        // (index 4) after the sort
        redMedian = (byte)red[4];
        // ... likewise greenMedian and blueMedian
    }


This is working like a champ and giving the desired results, but at a huge
cost in processing and time. I'm running on a dual-core machine and it sits
at around 50% CPU and takes around 500 ms to complete.
 

Peter Morris

You don't show how you are retrieving / setting the pixels and that is where
you are most likely to incur your penalty.

Google for something like
c# fast bitmap pixel access
 

Dathan

You don't show how you are retrieving / setting the pixels and that is where
you are most likely to incur your penalty.

Google for something like
c# fast bitmap pixel access

This shouldn't be the case, as the OP stated he's reduced the image
down to a byte array, and access to a byte array (whether rectangular
or jagged) should be pretty quick.

The slowdown might just be the sort step. A mean or Gaussian
convolution should be faster than median filtering. Maybe try
implementing one of those? If it's still slow, then you can be
reasonably sure it's not an algorithmic problem, and profiling to
find the actual bottleneck would be a good opportunity to
learn about performance counters.
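A 3x3 Gaussian convolution of the kind suggested here could be sketched as
follows. This is an illustrative sketch only: it assumes a single-channel
byte[,] image, GaussianBlur3x3/src/dst are made-up names, and the same loop
would be run once per colour channel.

```csharp
static void GaussianBlur3x3(byte[,] src, byte[,] dst, int width, int height)
{
    // 3x3 Gaussian kernel; the weights sum to 16
    int[,] kernel =
    {
        { 1, 2, 1 },
        { 2, 4, 2 },
        { 1, 2, 1 }
    };

    for (int y = 1; y < height - 1; y++)
        for (int x = 1; x < width - 1; x++)
        {
            int sum = 0;
            for (int ky = -1; ky <= 1; ky++)
                for (int kx = -1; kx <= 1; kx++)
                    sum += kernel[ky + 1, kx + 1] * src[y + ky, x + kx];

            // dst is a separate buffer, so blurred pixels never feed back in
            dst[y, x] = (byte)(sum / 16);
        }
}
```

Because each output pixel is just a weighted sum, there is no per-pixel
allocation or sorting, which is why a convolution like this tends to be much
cheaper than the ArrayList-based median.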

~Dathan
 

Peter Morris

This shouldn't be the case, as the OP stated he's reduced the image
down to a byte array, and access to a byte array (whether rectangular
or jagged) should be pretty quick.

But how? :)

The slowdown might just be the sort step. A mean or Gaussian
convolution should be faster than median filtering. Maybe try
implementing one of those? If it's still slow, then you can be
reasonably sure it's not an algorithmic problem, and profiling to
find the actual bottleneck would be a good opportunity to
learn about performance counters.

I agree. I think the OP should first do nothing in the loop. Then add the
read pixel code, then the code to read surrounding pixels, then the code to
select the median, then the code to write the pixel. Should be pretty
obvious then where the bottleneck is.
 

anonymous

The slowdown, without a doubt, is in the sorting and the algorithm being
used to process the current pixel and the 8 surrounding pixels.

Reading/writing bytes as Dathan stated is working like a champ, and I can
compare frames (320x240, 24-bit) at 30fps without increasing processing time
or processor utilization. The problem now is that the images are coming from
a camera and as such have noise. This noise is causing issues with the
comparison. Using the described algorithm the comparison works perfectly,
but it has cost me hugely in time and processing.

Anywho.... thanks for your input.
 

anonymous

Thanks Dathan,

I will check out the other possible filtering methods. I have isolated the
slowdown, as you suggested, to the sort routines.
 

Peter Morris

Using the described algorithm the comparison works perfectly but this has
cost me hugely in time and processing.

What about using a mean average instead of a median? That way you don't need
a list or a sort:

for (int yy = y - 1; yy <= y + 1; yy++)
    for (int xx = x - 1; xx <= x + 1; xx++)
    {
        // accumulate each channel from its own byte (one source array per channel)
        red += redData[xx, yy];
        green += greenData[xx, yy];
        blue += blueData[xx, yy];
    }

// 9 pixels in the 3x3 neighbourhood
red /= 9;
green /= 9;
blue /= 9;
 

Peter Morris

PS: you would need to read from one array and write to another, otherwise
you will be averaging against already-averaged pixels.
 

JS

A median filter is quite often superior to a linear filter. It
wouldn't hurt to try a mean or Gaussian filter, but you might be best
off optimizing the median filter.
 

Dathan

A median filter is quite often superior to a linear filter. It
wouldn't hurt to try a mean or Gaussian filter, but you might be best
off optimizing the median filter.

A couple of things to think about:
You know you're always looking at a group of 9 pixels. So instead of
instantiating three ArrayLists per pixel (and eventually having to
garbage collect them), instantiate byte[] red = new byte[9]; (and the
same for green and blue) outside the outermost loop. That alone should
save you a huge amount of time on garbage collection. You'll also
save on array accesses, as access to an Array should be faster than
access to an ArrayList.

If that doesn't do the trick, you might try implementing your own
Sort() method. In some cases, for smaller result sets, it's actually
faster (due to overhead, recursion, etc.) to use a bubble sort than a
quicksort (which is what the MSDN documentation states that
Array.Sort() and ArrayList.Sort() use).
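Combining those two suggestions, the inner loop might look something like
this. This is a sketch only: how the pixel bytes are fetched is still left
out, and InsertionSort is a hypothetical helper, not a framework method.

```csharp
// Buffers allocated once, outside the pixel loops - no per-pixel garbage.
byte[] red = new byte[9];
byte[] green = new byte[9];
byte[] blue = new byte[9];

for (int y = 1; y < height - 1; y++)
    for (int x = 1; x < width - 1; x++)
    {
        int n = 0;
        for (int yy = y - 1; yy <= y + 1; yy++)
            for (int xx = x - 1; xx <= x + 1; xx++)
            {
                // fill red[n], green[n] and blue[n] from the pixel at (xx, yy)
                n++;
            }

        // A hand-rolled sort avoids the overhead of ArrayList.Sort()
        // and is very cheap for a fixed 9 elements.
        InsertionSort(red);
        InsertionSort(green);
        InsertionSort(blue);

        byte redMedian = red[4];   // 5th of the 9 sorted values
        // ... likewise for green and blue
    }

static void InsertionSort(byte[] a)
{
    for (int i = 1; i < a.Length; i++)
    {
        byte v = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > v)
        {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = v;
    }
}
```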

~Dathan
 
