Posts Tagged 'Mean Filter'

C# How to: Image Blur

Article Purpose

This article serves to provide an introduction and discussion relating to image blurring methods and techniques. The image blur filters covered in this article include: the Mean filter (Box blur), the Gaussian blur, the Median filter blur and Motion blur.

Daisy: Mean 9×9

Daisy Mean 9x9

Sample Source Code

This article is accompanied by a sample source code Visual Studio project which is available for download.

Using the Sample Application

This article is accompanied by a sample application, intended to provide a means of testing and replicating topics discussed in this article. The sample application is a Windows Forms based application of which the user interface enables the user to select an image blur type to implement.

When clicking the Load Image button users are able to browse the local file system in order to select a source/input image. In addition users are also able to save the blurred result image by clicking the Save Image button and browsing the local file system.

Daisy: Mean 7×7

Daisy Mean 7x7

The sample application provides the user with the ability to select the method of image blurring to implement. The dropdown located on the right-hand side of the user interface lists all of the supported methods of image blurring. When a user selects an item from the dropdown, the associated blur method will be implemented on the preview image.

The image below is a screenshot of the Image Blur Filter sample application in action:

Image Blur Filter Sample Application

Image Blur Overview

The process of image blurring can be regarded as reducing the sharpness or crispness defined by an image. Blurring an image results in image detail/edges being perceived as less distinct. Images are often blurred as a method of smoothing an image.

Images perceived as too crisp/sharp can be softened by applying a variety of blurring techniques and intensity levels. Often images are smoothed/blurred in order to remove/reduce image noise. In edge detection implementations better results are often achieved when first reducing image noise through smoothing/blurring. Blurring can even be implemented in a fashion where results reflect image sharpening, a method known as unsharp masking.

In this article and the accompanying sample source code all supported methods of image blurring have been implemented through image convolution, with the exception of the Median filter. Each of the supported convolution methods in essence only represents a different convolution kernel/matrix. The technique capable of achieving optimal results will to varying degrees be dependent on the features present in the specified source/input image. Each method provides a different set of desired properties and compromises. In the following sections an overview of each method will be discussed.
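For reference, the convolution operation implemented by the sample source code later in this article can be summarised per pixel and per colour channel as follows (image border pixels excluded):

result(x, y) = factor × Σ [ kernel(filterY + filterOffset, filterX + filterOffset) × source(x + filterX, y + filterY) ] + bias

where filterX and filterY both range from -filterOffset to +filterOffset, filterOffset being half the kernel size, and the result is clamped to the range 0 to 255.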

Daisy: Mean 9×9

Daisy Mean 9x9

Mean Filter/Box Blur

The Mean filter, also sometimes referred to as a Box blur, represents a fairly simplistic filter in implementation and definition. A definition can be found on Wikipedia as follows:

A box blur is an image filter in which each pixel in the resulting image has a value equal to the average value of its neighbouring pixels in the input image. It is a form of low-pass ("blurring") filter and is a convolution.

Due to its property of using equal weights it can be implemented using a much simpler accumulation algorithm which is significantly faster than using a sliding window algorithm.

Mean filter as a title relates to all weight values in a convolution kernel being equal, hence the alternate title of Box blur. In most cases a Mean filter convolution kernel will only contain the value one. When performing convolution implementing a Mean filter kernel, the factor value equates to 1 being divided by the sum of all kernel values.

The following is an example of a 5×5 convolution kernel:

Mean Filter Blur 5x5 Kernel

The kernel shown above consists of 25 elements, therefore the factor value equates to one divided by twenty five.
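As a quick illustration of how that factor is determined (a sketch, not part of the sample project), the factor for any mean kernel can be calculated by summing the kernel elements:

     // Sketch: the mean filter factor is one divided by the sum of all kernel elements.
     double[,] kernel = Matrix.Mean5x5;    // 25 elements, each with the value 1

     double sum = 0;

     foreach (double element in kernel)
     {
         sum += element;                   // sum equates to 25 for the 5x5 mean kernel
     }

     double factor = 1.0 / sum;            // 1.0 / 25.0, as passed to ConvolutionFilter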

The Mean filter blur does not result in the same level of smoothing achieved by other image blur methods. The Mean filter method can also be susceptible to directional artefacts.

Daisy Mean 5×5

Daisy Mean 5x5

Gaussian Blur

The Gaussian blur method of image blurring is a popular and often implemented filter. In contrast to the Mean filter method, the Gaussian blur produces resulting images appearing to contain a more uniform level of smoothing. When implementing image noise reduction a Gaussian blur is often applied to source/input images, resulting in noise reduction/removal. The Gaussian blur has a good level of edge preservation, hence being used in edge detection operations.

From Wikipedia we gain the following definition:

A Gaussian blur (also known as Gaussian smoothing) is the result of blurring an image by a Gaussian function. It is a widely used effect in graphics software, typically to reduce image noise and reduce detail. The visual effect of this blurring technique is a smooth blur resembling that of viewing the image through a translucent screen, distinctly different from the bokeh effect produced by an out-of-focus lens or the shadow of an object under usual illumination. Gaussian smoothing is also used as a pre-processing stage in computer vision algorithms in order to enhance image structures at different scales.

A potential drawback to implementing a Gaussian blur results from the filter being computationally intensive. The following matrix represents a 5×5 Gaussian blur convolution kernel. The sum total of all elements in the kernel equates to 159, therefore a factor value of 1.0 / 159.0 will be implemented.

Gaussian Blur 5x5 Kernel
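The kernel shown above is a fixed, integer-valued approximation. As a side note, comparable kernels can be generated from the Gaussian function itself. The sketch below is not part of the sample project and the sigma parameter is purely an assumption:

     // Sketch only: sample the Gaussian function
     // G(x, y) = (1 / (2 * Pi * sigma^2)) * e^(-(x^2 + y^2) / (2 * sigma^2))
     // around the kernel centre, then normalise so that all elements sum to one.
     // Normalising plays the same role as the 1.0 / 159.0 factor mentioned above.
     private static double[,] CreateGaussianKernel(int size, double sigma)
     {
         double[,] kernel = new double[size, size];
         int offset = (size - 1) / 2;
         double sum = 0;

         for (int y = -offset; y <= offset; y++)
         {
             for (int x = -offset; x <= offset; x++)
             {
                 double value = Math.Exp(-(x * x + y * y) / (2 * sigma * sigma)) /
                                (2 * Math.PI * sigma * sigma);

                 kernel[y + offset, x + offset] = value;
                 sum += value;
             }
         }

         for (int y = 0; y < size; y++)
         {
             for (int x = 0; x < size; x++)
             {
                 kernel[y, x] /= sum;
             }
         }

         return kernel;
     }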

Daisy: Gaussian 5×5

Daisy Gaussian 5x5

Median Filter Blur

The Median filter blur is classified as a non-linear filter. In contrast to the other methods of image blurring discussed in this article the Median filter implementation does not involve convolution or a predefined matrix kernel. The following definition can be found on Wikipedia:

In signal processing, it is often desirable to be able to perform some kind of noise reduction on an image or signal. The median filter is a nonlinear digital filtering technique, often used to remove noise. Such noise reduction is a typical pre-processing step to improve the results of later processing (for example, edge detection on an image). Median filtering is very widely used in digital image processing because, under certain conditions, it preserves edges while removing noise.

Daisy: Median 7×7

Daisy Median 7x7

As the name implies, the Median filter operates by calculating the median value of a pixel group, also referred to as a window. Calculating a median value involves a number of steps. The required steps are listed as follows, with a brief sketch following the list:

  1. Iterate each pixel that forms part of the source/input image.
  2. In relation to the pixel currently being iterated, determine the neighbouring pixels located within the bounds defined by the window size. The window location should be offset in order to align the window’s middle pixel and the pixel currently being iterated.
  3. Neighbouring pixels located within the bounds defined by the window should be added to a one dimensional neighbourhood array. Once all values have been added, the array should be sorted by value.
  4. The pixel value located at the middle of the sorted neighbourhood array qualifies as the median value. The newly determined median value should be assigned to the pixel currently being iterated.
  5. Repeat the steps listed above until all pixels within the source/input image have been iterated.
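The following is a minimal sketch of steps 3 and 4, using an assumed set of neighbourhood values rather than the sample project’s pixel buffer:

     // Minimal sketch of steps 3 and 4 (not the sample project's MedianFilter method):
     // the median is the middle element of the sorted neighbourhood collection.
     List<int> neighbourhood = new List<int> { 90, 12, 200, 45, 45, 78, 101, 33, 60 };

     neighbourhood.Sort();

     int median = neighbourhood[neighbourhood.Count / 2];   // 9 values: index 4, median 60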

Similar to the Gaussian blur filter, the Median filter has the ability to smooth image noise whilst providing edge preservation. Depending on the window size implemented and the physical dimensions of the input/source image, the Median filter can be computationally expensive.

Daisy: Median 9×9

Daisy Median 9x9

Motion Blur

The sample source code implements Motion blur filters. Motion blur in the traditional sense has been associated with photography and video capturing. Motion blur can often be observed in scenarios where rapid movements are being captured in photographs or video recordings. When recording a single frame, rapid movements could result in the scene changing before the frame being captured has completed.

Motion blur can be synthetically imitated through the implementation of digital Motion blur filters. The size of the kernel provided when implementing a Motion blur affects the filter intensity perceived in the result image. In regards to Motion blur filters, the size of the kernel specified in convolution influences the perception of how rapidly movement appears to have occurred in order to have blurred the resulting image. Larger kernels produce the appearance of more rapid motion, whereas smaller kernels result in less rapid motion being perceived.

Daisy: Motion Blur 7×7 135 Degrees

Daisy Motion Blur 7x7 135 Degrees

Depending on the kernel specified the ability exists to create the appearance of movement having occurred in a certain direction. The sample source code implements Motion blur filters at 45 degrees, at 135 degrees and in both directions simultaneously.

The kernel listed below represents a 5×5 Motion blur filter occurring at 45 degrees and 135 degrees:

MotionBlur5x5

Image Blur Implementation

The sample source code implements all of the concepts explored throughout this article. The source code definition can be grouped into 4 sections: ImageBlurFilter method, ConvolutionFilter method, MedianFilter method and the Matrix class. The following article sections relate to the 4 main source code sections.

The ImageBlurFilter method has the purpose of invoking the correct blur filter method with the relevant method parameters. This method acts as a wrapper, providing the technical implementation details required when performing a specified blur filter.

The definition of the ImageBlurFilter method is as follows:

 public static Bitmap ImageBlurFilter(this Bitmap sourceBitmap,  
                                             BlurType blurType) 
{  
     Bitmap resultBitmap = null; 

     switch (blurType)
     {
         case BlurType.Mean3x3:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.Mean3x3, 1.0 / 9.0, 0);
             break;
         case BlurType.Mean5x5:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.Mean5x5, 1.0 / 25.0, 0);
             break;
         case BlurType.Mean7x7:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.Mean7x7, 1.0 / 49.0, 0);
             break;
         case BlurType.Mean9x9:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.Mean9x9, 1.0 / 81.0, 0);
             break;
         case BlurType.GaussianBlur3x3:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.GaussianBlur3x3, 1.0 / 16.0, 0);
             break;
         case BlurType.GaussianBlur5x5:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.GaussianBlur5x5, 1.0 / 159.0, 0);
             break;
         case BlurType.MotionBlur5x5:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur5x5, 1.0 / 10.0, 0);
             break;
         case BlurType.MotionBlur5x5At45Degrees:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur5x5At45Degrees, 1.0 / 5.0, 0);
             break;
         case BlurType.MotionBlur5x5At135Degrees:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur5x5At135Degrees, 1.0 / 5.0, 0);
             break;
         case BlurType.MotionBlur7x7:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur7x7, 1.0 / 14.0, 0);
             break;
         case BlurType.MotionBlur7x7At45Degrees:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur7x7At45Degrees, 1.0 / 7.0, 0);
             break;
         case BlurType.MotionBlur7x7At135Degrees:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur7x7At135Degrees, 1.0 / 7.0, 0);
             break;
         case BlurType.MotionBlur9x9:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur9x9, 1.0 / 18.0, 0);
             break;
         case BlurType.MotionBlur9x9At45Degrees:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur9x9At45Degrees, 1.0 / 9.0, 0);
             break;
         case BlurType.MotionBlur9x9At135Degrees:
             resultBitmap = sourceBitmap.ConvolutionFilter(
                            Matrix.MotionBlur9x9At135Degrees, 1.0 / 9.0, 0);
             break;
         case BlurType.Median3x3:
             resultBitmap = sourceBitmap.MedianFilter(3);
             break;
         case BlurType.Median5x5:
             resultBitmap = sourceBitmap.MedianFilter(5);
             break;
         case BlurType.Median7x7:
             resultBitmap = sourceBitmap.MedianFilter(7);
             break;
         case BlurType.Median9x9:
             resultBitmap = sourceBitmap.MedianFilter(9);
             break;
         case BlurType.Median11x11:
             resultBitmap = sourceBitmap.MedianFilter(11);
             break;
     }

     return resultBitmap;
}
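As a usage sketch, the extension method reads naturally from a source Bitmap. The file names below are assumed purely for illustration and the static class containing the extension methods is assumed to be in scope:

     // The BlurType value determines which kernel and factor the switch statement selects.
     Bitmap sourceBitmap = new Bitmap("daisy.png");

     Bitmap meanBlurred = sourceBitmap.ImageBlurFilter(BlurType.Mean5x5);
     Bitmap motionBlurred = sourceBitmap.ImageBlurFilter(BlurType.MotionBlur7x7At45Degrees);

     meanBlurred.Save("daisy_mean_5x5.png");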

Daisy: Motion Blur 9×9

Daisy Motion Blur 9x9

The Matrix class serves as a collection of the various convolution kernel definitions. The Matrix class and all of its public properties are defined as static. The definition of the Matrix class is as follows:

     public static class Matrix 
    {  
         public static double[,] Mean3x3 
         {  
             get 
             {  
                 return new double[,]   
                { {  1, 1, 1, },  
                  {  1, 1, 1, },  
                  {  1, 1, 1, }, }; 
             }  
         }  

         public static double[,] Mean5x5
         {
             get
             {
                 return new double[,]
                { { 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1 }, };
             }
         }

         public static double[,] Mean7x7
         {
             get
             {
                 return new double[,]
                { { 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1 }, };
             }
         }

         public static double[,] Mean9x9
         {
             get
             {
                 return new double[,]
                { { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 },
                  { 1, 1, 1, 1, 1, 1, 1, 1, 1 }, };
             }
         }

         public static double[,] GaussianBlur3x3
         {
             get
             {
                 return new double[,]
                { { 1, 2, 1, },
                  { 2, 4, 2, },
                  { 1, 2, 1, }, };
             }
         }

         public static double[,] GaussianBlur5x5
         {
             get
             {
                 return new double[,]
                { { 2, 04, 05, 04, 2 },
                  { 4, 09, 12, 09, 4 },
                  { 5, 12, 15, 12, 5 },
                  { 4, 09, 12, 09, 4 },
                  { 2, 04, 05, 04, 2 }, };
             }
         }

         public static double[,] MotionBlur5x5
         {
             get
             {
                 return new double[,]
                { { 1, 0, 0, 0, 1 },
                  { 0, 1, 0, 1, 0 },
                  { 0, 0, 1, 0, 0 },
                  { 0, 1, 0, 1, 0 },
                  { 1, 0, 0, 0, 1 }, };
             }
         }

         public static double[,] MotionBlur5x5At45Degrees
         {
             get
             {
                 return new double[,]
                { { 0, 0, 0, 0, 1 },
                  { 0, 0, 0, 1, 0 },
                  { 0, 0, 1, 0, 0 },
                  { 0, 1, 0, 0, 0 },
                  { 1, 0, 0, 0, 0 }, };
             }
         }

         public static double[,] MotionBlur5x5At135Degrees
         {
             get
             {
                 return new double[,]
                { { 1, 0, 0, 0, 0 },
                  { 0, 1, 0, 0, 0 },
                  { 0, 0, 1, 0, 0 },
                  { 0, 0, 0, 1, 0 },
                  { 0, 0, 0, 0, 1 }, };
             }
         }

         public static double[,] MotionBlur7x7
         {
             get
             {
                 return new double[,]
                { { 1, 0, 0, 0, 0, 0, 1 },
                  { 0, 1, 0, 0, 0, 1, 0 },
                  { 0, 0, 1, 0, 1, 0, 0 },
                  { 0, 0, 0, 1, 0, 0, 0 },
                  { 0, 0, 1, 0, 1, 0, 0 },
                  { 0, 1, 0, 0, 0, 1, 0 },
                  { 1, 0, 0, 0, 0, 0, 1 }, };
             }
         }

         public static double[,] MotionBlur7x7At45Degrees
         {
             get
             {
                 return new double[,]
                { { 0, 0, 0, 0, 0, 0, 1 },
                  { 0, 0, 0, 0, 0, 1, 0 },
                  { 0, 0, 0, 0, 1, 0, 0 },
                  { 0, 0, 0, 1, 0, 0, 0 },
                  { 0, 0, 1, 0, 0, 0, 0 },
                  { 0, 1, 0, 0, 0, 0, 0 },
                  { 1, 0, 0, 0, 0, 0, 0 }, };
             }
         }

         public static double[,] MotionBlur7x7At135Degrees
         {
             get
             {
                 return new double[,]
                { { 1, 0, 0, 0, 0, 0, 0 },
                  { 0, 1, 0, 0, 0, 0, 0 },
                  { 0, 0, 1, 0, 0, 0, 0 },
                  { 0, 0, 0, 1, 0, 0, 0 },
                  { 0, 0, 0, 0, 1, 0, 0 },
                  { 0, 0, 0, 0, 0, 1, 0 },
                  { 0, 0, 0, 0, 0, 0, 1 }, };
             }
         }

         public static double[,] MotionBlur9x9
         {
             get
             {
                 return new double[,]
                { { 1, 0, 0, 0, 0, 0, 0, 0, 1 },
                  { 0, 1, 0, 0, 0, 0, 0, 1, 0 },
                  { 0, 0, 1, 0, 0, 0, 1, 0, 0 },
                  { 0, 0, 0, 1, 0, 1, 0, 0, 0 },
                  { 0, 0, 0, 0, 1, 0, 0, 0, 0 },
                  { 0, 0, 0, 1, 0, 1, 0, 0, 0 },
                  { 0, 0, 1, 0, 0, 0, 1, 0, 0 },
                  { 0, 1, 0, 0, 0, 0, 0, 1, 0 },
                  { 1, 0, 0, 0, 0, 0, 0, 0, 1 }, };
             }
         }

         public static double[,] MotionBlur9x9At45Degrees
         {
             get
             {
                 return new double[,]
                { { 0, 0, 0, 0, 0, 0, 0, 0, 1 },
                  { 0, 0, 0, 0, 0, 0, 0, 1, 0 },
                  { 0, 0, 0, 0, 0, 0, 1, 0, 0 },
                  { 0, 0, 0, 0, 0, 1, 0, 0, 0 },
                  { 0, 0, 0, 0, 1, 0, 0, 0, 0 },
                  { 0, 0, 0, 1, 0, 0, 0, 0, 0 },
                  { 0, 0, 1, 0, 0, 0, 0, 0, 0 },
                  { 0, 1, 0, 0, 0, 0, 0, 0, 0 },
                  { 1, 0, 0, 0, 0, 0, 0, 0, 0 }, };
             }
         }

         public static double[,] MotionBlur9x9At135Degrees
         {
             get
             {
                 return new double[,]
                { { 1, 0, 0, 0, 0, 0, 0, 0, 0 },
                  { 0, 1, 0, 0, 0, 0, 0, 0, 0 },
                  { 0, 0, 1, 0, 0, 0, 0, 0, 0 },
                  { 0, 0, 0, 1, 0, 0, 0, 0, 0 },
                  { 0, 0, 0, 0, 1, 0, 0, 0, 0 },
                  { 0, 0, 0, 0, 0, 1, 0, 0, 0 },
                  { 0, 0, 0, 0, 0, 0, 1, 0, 0 },
                  { 0, 0, 0, 0, 0, 0, 0, 1, 0 },
                  { 0, 0, 0, 0, 0, 0, 0, 0, 1 }, };
             }
         }
    }

Daisy: Median 7×7

Daisy Median 7x7

The MedianFilter extension method targets the Bitmap class. The MedianFilter method applies a median filter of the specified matrix size (window size) to the source Bitmap, returning a new Bitmap representing the filtered image.

The definition of the MedianFilter method is as follows:

 public static Bitmap MedianFilter(this Bitmap sourceBitmap, 
                                   int matrixSize) 
{ 
     BitmapData sourceData = 
                sourceBitmap.LockBits(new Rectangle(0, 0, 
                sourceBitmap.Width, sourceBitmap.Height), 
                ImageLockMode.ReadOnly, 
                PixelFormat.Format32bppArgb); 

     byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
     byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

     Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
     sourceBitmap.UnlockBits(sourceData);

     int filterOffset = (matrixSize - 1) / 2;
     int calcOffset = 0;
     int byteOffset = 0;

     List<int> neighbourPixels = new List<int>();
     byte[] middlePixel;

     for (int offsetY = filterOffset; offsetY <
         sourceBitmap.Height - filterOffset; offsetY++)
     {
         for (int offsetX = filterOffset; offsetX <
             sourceBitmap.Width - filterOffset; offsetX++)
         {
             byteOffset = offsetY * sourceData.Stride + offsetX * 4;

             neighbourPixels.Clear();

             for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
             {
                 for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                 {
                     calcOffset = byteOffset + (filterX * 4) +
                                  (filterY * sourceData.Stride);

                     // Each neighbour is stored as a packed 32-bit ARGB value.
                     neighbourPixels.Add(BitConverter.ToInt32(pixelBuffer, calcOffset));
                 }
             }

             neighbourPixels.Sort();

             // The median is the middle element of the sorted neighbourhood.
             middlePixel = BitConverter.GetBytes(
                           neighbourPixels[neighbourPixels.Count / 2]);

             resultBuffer[byteOffset] = middlePixel[0];
             resultBuffer[byteOffset + 1] = middlePixel[1];
             resultBuffer[byteOffset + 2] = middlePixel[2];
             resultBuffer[byteOffset + 3] = middlePixel[3];
         }
     }

     Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

     BitmapData resultData =
                resultBitmap.LockBits(new Rectangle(0, 0,
                resultBitmap.Width, resultBitmap.Height),
                ImageLockMode.WriteOnly,
                PixelFormat.Format32bppArgb);

     Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
     resultBitmap.UnlockBits(resultData);

     return resultBitmap;
}
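The MedianFilter method can be invoked in a similar fashion; the file name below is assumed purely for illustration. Note that the method determines the median from packed 32-bit ARGB values rather than per colour channel, which means every resulting pixel is a colour already present in the corresponding window.

     // Usage sketch: a 5x5 median filter blur, equivalent to BlurType.Median5x5.
     Bitmap sourceBitmap = new Bitmap("daisy.png");

     Bitmap medianBlurred = sourceBitmap.MedianFilter(5);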

Daisy: Motion Blur 9×9

Daisy Motion Blur 9x9

The sample source code performs image convolution by invoking the ConvolutionFilter method.

The definition of the ConvolutionFilter method is as follows:

private static Bitmap ConvolutionFilter(this Bitmap sourceBitmap, 
                                          double[,] filterMatrix, 
                                               double factor = 1, 
                                                    int bias = 0) 
{ 
    BitmapData sourceData = sourceBitmap.LockBits(new Rectangle(0, 0, 
                             sourceBitmap.Width, sourceBitmap.Height), 
                                               ImageLockMode.ReadOnly, 
                                         PixelFormat.Format32bppArgb); 

    byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
    byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

    Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
    sourceBitmap.UnlockBits(sourceData);

    double blue = 0.0;
    double green = 0.0;
    double red = 0.0;

    int filterWidth = filterMatrix.GetLength(1);
    int filterHeight = filterMatrix.GetLength(0);

    int filterOffset = (filterWidth - 1) / 2;
    int calcOffset = 0;
    int byteOffset = 0;

    for (int offsetY = filterOffset; offsetY <
        sourceBitmap.Height - filterOffset; offsetY++)
    {
        for (int offsetX = filterOffset; offsetX <
            sourceBitmap.Width - filterOffset; offsetX++)
        {
            blue = 0;
            green = 0;
            red = 0;

            byteOffset = offsetY * sourceData.Stride + offsetX * 4;

            for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
            {
                for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                {
                    calcOffset = byteOffset + (filterX * 4) +
                                 (filterY * sourceData.Stride);

                    blue += (double)(pixelBuffer[calcOffset]) *
                            filterMatrix[filterY + filterOffset,
                                         filterX + filterOffset];

                    green += (double)(pixelBuffer[calcOffset + 1]) *
                             filterMatrix[filterY + filterOffset,
                                          filterX + filterOffset];

                    red += (double)(pixelBuffer[calcOffset + 2]) *
                           filterMatrix[filterY + filterOffset,
                                        filterX + filterOffset];
                }
            }

            blue = factor * blue + bias;
            green = factor * green + bias;
            red = factor * red + bias;

            blue = (blue > 255 ? 255 : (blue < 0 ? 0 : blue));
            green = (green > 255 ? 255 : (green < 0 ? 0 : green));
            red = (red > 255 ? 255 : (red < 0 ? 0 : red));

            resultBuffer[byteOffset] = (byte)(blue);
            resultBuffer[byteOffset + 1] = (byte)(green);
            resultBuffer[byteOffset + 2] = (byte)(red);
            resultBuffer[byteOffset + 3] = 255;
        }
    }

    Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

    BitmapData resultData =
               resultBitmap.LockBits(new Rectangle(0, 0,
               resultBitmap.Width, resultBitmap.Height),
               ImageLockMode.WriteOnly,
               PixelFormat.Format32bppArgb);

    Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
    resultBitmap.UnlockBits(resultData);

    return resultBitmap;
}
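For reference, the BlurType.Mean3x3 case of the ImageBlurFilter method shown earlier maps onto the ConvolutionFilter parameters as follows: a 3×3 kernel of ones, a factor of one ninth and no bias:

    // As invoked from within ImageBlurFilter for BlurType.Mean3x3:
    // every kernel element is 1, the factor is 1.0 / 9.0 and the bias is 0.
    resultBitmap = sourceBitmap.ConvolutionFilter(Matrix.Mean3x3, 1.0 / 9.0, 0);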

Sample Images

This article features a number of sample images. All featured images have been licensed allowing for reproduction.

The sample images featuring an image of a yellow daisy are licensed under the Creative Commons Attribution-Share Alike 2.5 Generic license and can be downloaded from Wikimedia.org.

The sample images featuring an image of a white daisy are licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license and can be downloaded from Wikipedia.

The sample images featuring an image of a pink daisy are licensed under the Creative Commons Attribution-Share Alike 2.5 Generic license and can be downloaded from Wikipedia.

The sample images featuring an image of a purple daisy are licensed under the Creative Commons Attribution-ShareAlike 3.0 License and can be downloaded from Wikipedia.

The Original Image

Purple_osteospermum

Daisy: Gaussian 3×3

Daisy Gaussian 3x3

Daisy: Gaussian 5×5

Daisy Gaussian 5x5

Daisy: Mean 3×3

Daisy Mean 3x3

Daisy: Mean 5×5

Daisy Mean 5x5

Daisy: Mean 7×7

Daisy Mean 7x7

Daisy: Mean 9×9

Daisy Mean 9x9

Daisy: Median 3×3

Daisy Median 3x3

Daisy: Median 5×5

Daisy Median 5x5

Daisy: Median 7×7

Daisy Median 7x7

Daisy: Median 9×9

Daisy Median 9x9

Daisy: Median 11×11

Daisy Median 11x11

Daisy: Motion Blur 5×5

Daisy Motion Blur 5x5

Daisy: Motion Blur 5×5 45 Degrees

Daisy Motion Blur 5x5 45 Degrees

Daisy: Motion Blur 5×5 135 Degrees

Daisy Motion Blur 5x5 135 Degrees

Daisy: Motion Blur 7×7

Daisy Motion Blur 7x7

Daisy: Motion Blur 7×7 45 Degrees

Daisy Motion Blur 7x7 45 Degree

Daisy: Motion Blur 7×7 135 Degrees

Daisy Motion Blur 7x7 135 Degrees

Daisy: Motion Blur 9×9

Daisy Motion Blur 9x9

Daisy: Motion Blur 9×9 45 Degrees

Daisy Motion Blur 9x9 45 Degrees

Daisy: Motion Blur 9×9 135 Degrees

Daisy Motion Blur 9x9 135 Degrees

Related Articles and Feedback

Feedback and questions are always encouraged. If you know of an alternative implementation or have ideas on a more efficient implementation please share in the comments section.

I’ve published a number of articles related to imaging and images of which you can find URL links here:


C# How to: Image Unsharp Mask

Article purpose

The purpose of this article is to explore and illustrate the concept of image unsharp masking. This article implements unsharp masking in the form of a 3×3 Gaussian blur, a 5×5 Gaussian blur, a 3×3 Mean filter and a 5×5 Mean filter.

Sample Source code

This article is accompanied by a sample source code Visual Studio project which is available for download.

Using the Sample Application

The sample source code associated with this article includes a Windows Forms based sample application implementing the concepts explored throughout this article.

When using the Image Unsharp Mask sample application users can select a source/input image from the local file system by clicking the Load Image button. The dropdown at the bottom of the screen allows the user to select an unsharp masking variation. On the right hand side of the screen users can specify the level/intensity of sharpening applied to the resulting image.

Clicking the Save Image button allows a user to save the resulting image to the local file system. The image below is a screenshot of the Image Unsharp Mask sample application in action:

Image Unsharp Mask Sample Application

What is Image Unsharp Masking?

A good definition of unsharp masking can be found on Wikipedia:

Unsharp masking (USM) is an image manipulation technique, often available in digital image processing software.

The "unsharp" of the name derives from the fact that the technique uses a blurred, or "unsharp", positive image to create a "mask" of the original image. The unsharped mask is then combined with the negative image, creating an image that is less blurry than the original. The resulting image, although clearer, probably loses accuracy with respect to the image’s subject. In the context of signal processing, an unsharp mask is generally a linear or nonlinear filter that amplifies high-frequency components.

In this article we implement unsharp masking by first creating a blurred copy of a source/input image and then subtracting the blurred image from the original image, the result of which is known as the mask. Increased image sharpness is achieved by adding a factor of the mask to the original image.
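Expressed per colour channel, the approach described above and implemented by the SubtractAddFactorImage method later in this article amounts to the following, with every result value clamped to the range 0 to 255:

result = original + (original - blurred) × factor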

Applying a Convolution Matrix filter

The sample source code provides the definition for the ConvolutionFilter method, targeting the Bitmap class. This method is invoked when implementing the image blurring step of unsharp masking. The definition of the ConvolutionFilter method is as follows:

 private static Bitmap ConvolutionFilter(Bitmap sourceBitmap,  
                                     double[,] filterMatrix,  
                                          double factor = 1,  
                                               int bias = 0,  
                                     bool grayscale = false )  
{ 
     BitmapData sourceData = sourceBitmap.LockBits(new Rectangle (0, 0, 
                              sourceBitmap.Width, sourceBitmap.Height), 
                                                ImageLockMode.ReadOnly,  
                                          PixelFormat.Format32bppArgb); 

     byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
     byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

     Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
     sourceBitmap.UnlockBits(sourceData);

     if (grayscale == true)
     {
         float rgb = 0;

         for (int k = 0; k < pixelBuffer.Length; k += 4)
         {
             rgb = pixelBuffer[k] * 0.11f;
             rgb += pixelBuffer[k + 1] * 0.59f;
             rgb += pixelBuffer[k + 2] * 0.3f;

             pixelBuffer[k] = (byte)rgb;
             pixelBuffer[k + 1] = pixelBuffer[k];
             pixelBuffer[k + 2] = pixelBuffer[k];
             pixelBuffer[k + 3] = 255;
         }
     }

     double blue = 0.0;
     double green = 0.0;
     double red = 0.0;

     int filterWidth = filterMatrix.GetLength(1);
     int filterHeight = filterMatrix.GetLength(0);

     int filterOffset = (filterWidth - 1) / 2;
     int calcOffset = 0;
     int byteOffset = 0;

     for (int offsetY = filterOffset; offsetY <
         sourceBitmap.Height - filterOffset; offsetY++)
     {
         for (int offsetX = filterOffset; offsetX <
             sourceBitmap.Width - filterOffset; offsetX++)
         {
             blue = 0;
             green = 0;
             red = 0;

             byteOffset = offsetY * sourceData.Stride + offsetX * 4;

             for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
             {
                 for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                 {
                     calcOffset = byteOffset + (filterX * 4) +
                                  (filterY * sourceData.Stride);

                     blue += (double)(pixelBuffer[calcOffset]) *
                             filterMatrix[filterY + filterOffset,
                                          filterX + filterOffset];

                     green += (double)(pixelBuffer[calcOffset + 1]) *
                              filterMatrix[filterY + filterOffset,
                                           filterX + filterOffset];

                     red += (double)(pixelBuffer[calcOffset + 2]) *
                            filterMatrix[filterY + filterOffset,
                                         filterX + filterOffset];
                 }
             }

             blue = factor * blue + bias;
             green = factor * green + bias;
             red = factor * red + bias;

             if (blue > 255)
             { blue = 255; }
             else if (blue < 0)
             { blue = 0; }

             if (green > 255)
             { green = 255; }
             else if (green < 0)
             { green = 0; }

             if (red > 255)
             { red = 255; }
             else if (red < 0)
             { red = 0; }

             resultBuffer[byteOffset] = (byte)(blue);
             resultBuffer[byteOffset + 1] = (byte)(green);
             resultBuffer[byteOffset + 2] = (byte)(red);
             resultBuffer[byteOffset + 3] = 255;
         }
     }

     Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

     BitmapData resultData =
                resultBitmap.LockBits(new Rectangle(0, 0,
                resultBitmap.Width, resultBitmap.Height),
                ImageLockMode.WriteOnly,
                PixelFormat.Format32bppArgb);

     Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
     resultBitmap.UnlockBits(resultData);

     return resultBitmap;
}

Subtracting and Adding Images

An important step required when implementing unsharp masking comes in the form of creating a mask by subtracting a blurred copy from the original image and then adding a factor of the mask to the original image. In order to achieve increased performance the sample source code combines the process of creating the mask and adding the mask to the original image.

The SubtractAddFactorImage method iterates every pixel that forms part of an image. In a single step the blurred pixel is subtracted from the original pixel, multiplied by a user specified factor and then added to the original pixel. The definition of the SubtractAddFactorImage method is as follows:

private static Bitmap SubtractAddFactorImage( 
                              this Bitmap subtractFrom, 
                                  Bitmap subtractValue, 
                                   float factor = 1.0f) 
{ 
    BitmapData sourceData =  
               subtractFrom.LockBits(new Rectangle (0, 0, 
               subtractFrom.Width, subtractFrom.Height), 
               ImageLockMode.ReadOnly, 
               PixelFormat.Format32bppArgb); 

    byte[] sourceBuffer = new byte[sourceData.Stride * sourceData.Height];
    Marshal.Copy(sourceData.Scan0, sourceBuffer, 0, sourceBuffer.Length);

    byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

    BitmapData subtractData =
               subtractValue.LockBits(new Rectangle(0, 0,
               subtractValue.Width, subtractValue.Height),
               ImageLockMode.ReadOnly,
               PixelFormat.Format32bppArgb);

    byte[] subtractBuffer = new byte[subtractData.Stride * subtractData.Height];
    Marshal.Copy(subtractData.Scan0, subtractBuffer, 0, subtractBuffer.Length);

    subtractFrom.UnlockBits(sourceData);
    subtractValue.UnlockBits(subtractData);

    double blue = 0;
    double green = 0;
    double red = 0;

    for (int k = 0; k < resultBuffer.Length &&
                    k < subtractBuffer.Length; k += 4)
    {
        // Per colour channel: result = original + (original - blurred) * factor
        blue = sourceBuffer[k] +
               (sourceBuffer[k] - subtractBuffer[k]) * factor;

        green = sourceBuffer[k + 1] +
                (sourceBuffer[k + 1] - subtractBuffer[k + 1]) * factor;

        red = sourceBuffer[k + 2] +
              (sourceBuffer[k + 2] - subtractBuffer[k + 2]) * factor;

        blue = (blue < 0 ? 0 : (blue > 255 ? 255 : blue));
        green = (green < 0 ? 0 : (green > 255 ? 255 : green));
        red = (red < 0 ? 0 : (red > 255 ? 255 : red));

        resultBuffer[k] = (byte)blue;
        resultBuffer[k + 1] = (byte)green;
        resultBuffer[k + 2] = (byte)red;
        resultBuffer[k + 3] = 255;
    }

    Bitmap resultBitmap = new Bitmap(subtractFrom.Width, subtractFrom.Height);

    BitmapData resultData =
               resultBitmap.LockBits(new Rectangle(0, 0,
               resultBitmap.Width, resultBitmap.Height),
               ImageLockMode.WriteOnly,
               PixelFormat.Format32bppArgb);

    Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
    resultBitmap.UnlockBits(resultData);

    return resultBitmap;
}
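The following worked example illustrates the per-channel arithmetic performed inside the loop above, using assumed sample values:

    // Assumed sample values, purely for illustration.
    byte original = 100;     // value from sourceBuffer
    byte blurred = 90;       // value from subtractBuffer (the blurred copy)
    float factor = 2.0f;

    double sharpened = original + (original - blurred) * factor;   // 100 + 10 * 2 = 120
    sharpened = (sharpened < 0 ? 0 : (sharpened > 255 ? 255 : sharpened));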

Matrix Definition

The image blurring filters implemented by the sample source code rely on static convolution kernel/matrix values defined in the Matrix class. The variants of image blurring implemented are: 3×3 Gaussian, 5×5 Gaussian, 3×3 Mean and 5×5 Mean. The definition of the Matrix class is detailed by the following code snippet:

public static class Matrix
{
    public static double[,] Gaussian3x3
    {
        get
        {
            return new double[,]
            { { 1, 2, 1, }, 
              { 2, 4, 2, }, 
              { 1, 2, 1, }, };
        }
    }

    public static double[,] Gaussian5x5Type1
    {
        get
        {
            return new double[,]
            { { 2, 04, 05, 04, 2 },
              { 4, 09, 12, 09, 4 },
              { 5, 12, 15, 12, 5 },
              { 4, 09, 12, 09, 4 },
              { 2, 04, 05, 04, 2 }, };
        }
    }

    public static double[,] Mean3x3
    {
        get
        {
            return new double[,]
            { { 1, 1, 1, },
              { 1, 1, 1, },
              { 1, 1, 1, }, };
        }
    }

    public static double[,] Mean5x5
    {
        get
        {
            return new double[,]
            { { 1, 1, 1, 1, 1 },
              { 1, 1, 1, 1, 1 },
              { 1, 1, 1, 1, 1 },
              { 1, 1, 1, 1, 1 },
              { 1, 1, 1, 1, 1 }, };
        }
    }
}

Implementing Image Unsharpening

This article explores four variants of image unsharpening, relating to the four types of image blurring discussed in the previous section. The sample source code defines the following methods: UnsharpGaussian3x3, UnsharpGaussian5x5, UnsharpMean3x3 and UnsharpMean5x5. All four methods are defined as extension methods targeting the Bitmap class. When looking at the sample images in the following section you will notice the correlation between increased blurring and enhanced image sharpness. The definitions are as follows:

public static Bitmap UnsharpGaussian3x3( 
                                 this Bitmap sourceBitmap,  
                                 float factor = 1.0f) 
{
    Bitmap blurBitmap = ExtBitmap.ConvolutionFilter( 
                                  sourceBitmap,  
                                  Matrix.Gaussian3x3,  
                                  1.0 / 16.0); 

    Bitmap resultBitmap = sourceBitmap.SubtractAddFactorImage(
                                       blurBitmap, factor);

    return resultBitmap;
}

public static Bitmap UnsharpGaussian5x5(
                                 this Bitmap sourceBitmap,
                                 float factor = 1.0f)
{
    Bitmap blurBitmap = ExtBitmap.ConvolutionFilter(
                                  sourceBitmap,
                                  Matrix.Gaussian5x5Type1,
                                  1.0 / 159.0);

    Bitmap resultBitmap = sourceBitmap.SubtractAddFactorImage(
                                       blurBitmap, factor);

    return resultBitmap;
}

public static Bitmap UnsharpMean3x3(
                                 this Bitmap sourceBitmap,
                                 float factor = 1.0f)
{
    Bitmap blurBitmap = ExtBitmap.ConvolutionFilter(
                                  sourceBitmap,
                                  Matrix.Mean3x3,
                                  1.0 / 9.0);

    Bitmap resultBitmap = sourceBitmap.SubtractAddFactorImage(
                                       blurBitmap, factor);

    return resultBitmap;
}

public static Bitmap UnsharpMean5x5(
                                 this Bitmap sourceBitmap,
                                 float factor = 1.0f)
{
    Bitmap blurBitmap = ExtBitmap.ConvolutionFilter(
                                  sourceBitmap,
                                  Matrix.Mean5x5,
                                  1.0 / 25.0);

    Bitmap resultBitmap = sourceBitmap.SubtractAddFactorImage(
                                       blurBitmap, factor);

    return resultBitmap;
}
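A usage sketch of the extension methods defined above follows. The file names are assumed purely for illustration; a larger factor value results in a more pronounced sharpening effect:

    Bitmap sourceBitmap = new Bitmap("input.png");

    Bitmap sharpened = sourceBitmap.UnsharpGaussian3x3(2.0f);

    sharpened.Save("unsharp_gaussian_3x3.png");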

Sample Images

The original image used in rendering the sample images shown in this article is licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license and can be downloaded from Wikipedia:

The Original Image

W-A-S-D

Unsharp Gaussian 3×3

Unsharp Gaussian 3x3

Unsharp Gaussian 5×5

Unsharp Gaussian 5x5

Unsharp Mean 3×3

Unsharp Mean 3x3

Unsharp Mean 5×5

Unsharp Mean 5x5

Related Articles and Feedback

Feedback and questions are always encouraged. If you know of an alternative implementation or have ideas on a more efficient implementation please share in the comments section.

I’ve published a number of articles related to imaging and images of which you can find URL links here:


