Archive for the 'Learn Everyday' Category



C# How to: Boolean Edge Detection

Article purpose

The purpose of this article is to detail Boolean Function Based Edge Detection. The filtering implemented in this article occurs on a per-pixel basis and relies only on basic pixel arithmetic. No GDI+ or traditional drawing methods are required.

Sample Source Code

This article is accompanied by a sample source code Visual Studio project which is available for download.

Using the Sample Application

Implemented as part of this article’s sample source code is a Sample Application. The concepts detailed in this article have all been implemented and tested using the associated Sample Application.

The first task required in using the Sample Application comes in the form of having to specify a source/input image. Select an image file from the local system by clicking the Load Image button.

On the right-hand side of the Sample Application’s user interface the user will be presented with a set of controls which relate to various filter options. Users are able to specify implementation methods when adjusting filter options.

Filter type values are: None, Edge Detect and Sharpen. Selecting a filter type of None results in no filtering being implemented; the original input/source image will be displayed, reflecting no change. When users select the Edge Detect filter type the resulting output reflects a black and white image in which only the detected edges are visible. The Sharpen filter type implements Boolean Edge Detection, producing a sharpened image by highlighting detected edges within the original input/source image.

The TrackBar labelled Threshold allows users to reduce the amount of noise expressed as detected edges in the resulting image. The level of noise present differs depending on the input/source image specified, hence the option of applying a threshold.

The three remaining TrackBar controls are labelled Red, Green and Blue. These Colour Factor filter options allow the user to specify the extent to which detected edges are expressed when sharpening an image. When all three factor values are set to the same value, edges appear white if the factor value exceeds zero. Factor values set to zero result in detected edges appearing as darker/black edge lines. Factor values can be set per colour component, which has the effect of creating a coloured outline in the result image. The colour of the outlining effect can be controlled by adjusting individual colour factor values.

After having implemented image filtering the user has the option of saving the result image to the local file system by clicking the Save Image button. The image shown below is a screenshot of the Boolean Edge Detection sample application in action:

Boolean Edge Detection Sample Application

The Local Threshold and Boolean Function Based Edge Detection

Boolean Edge Detection is considered a subset of image edge detection. This method of edge detection employs both a local and a global threshold. Implementation of the Boolean Edge Detection algorithm can be achieved by completing the following steps (a simplified code sketch follows the edge mask illustration below):

  1. Iterate each pixel that forms part of the source/input image. Calculate a local threshold based on a 3×3 matrix/window, positioned so that the pixel currently being iterated is located in the middle of the matrix. Calculate a mean value using as input the 9 pixel values covered by the matrix. Create a new blank 3×3 matrix. Compare each pixel covered by the window to the calculated mean value: if a pixel's value exceeds the mean value, set the corresponding location in the blank matrix to one; otherwise set it to zero.
  2. Compare the newly created matrix to the set of 16 edge masks. If the new matrix is represented in the edge masks, the middle pixel being iterated should be set to indicate an edge.
  3. The first two steps have to be repeated for each pixel contained in the source image; in other words every pixel should be iterated. Edges should now be detected as you progress through the image pixels, although false edges will also be present as a result of noise.
  4. False edges that were detected can be removed by implementing a global threshold. First calculate the variance of each 3×3 matrix. If the pixel currently being iterated was detected as part of an edge in step 2 and the calculated variance exceeds the global threshold, the pixel can be considered part of an edge. If the calculated variance is less than the global threshold, the pixel does not form part of an edge even if the calculated matrix matches one of the 16 edge masks.

The following image illustrates the 16 edge masks:

Edge Masks
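Before looking at the full implementation, the following is a minimal sketch of steps 1 and 2 applied to a single pixel, assuming a greyscale image stored as a byte[,] array. The method name and array representation are illustrative only; the complete colour implementation follows in the next section.

private static bool IsEdgeCandidate(byte[,] grey, int x, int y,
                                    List<string> edgeMasks)
{
    // Step 1: calculate the mean of the 3x3 window centred on (x, y).
    int sum = 0;

    for (int dy = -1; dy <= 1; dy++)
    {
        for (int dx = -1; dx <= 1; dx++)
        {
            sum += grey[y + dy, x + dx];
        }
    }

    int mean = sum / 9;

    // Build the 3x3 binary pattern row by row: "1" where a pixel exceeds the mean.
    string pattern = String.Empty;

    for (int dy = -1; dy <= 1; dy++)
    {
        for (int dx = -1; dx <= 1; dx++)
        {
            pattern += (grey[y + dy, x + dx] > mean ? "1" : "0");
        }
    }

    // Step 2: the pixel is an edge candidate if the pattern matches an edge mask.
    return edgeMasks.Contains(pattern);
}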

Implementing Boolean Edge Detection

The sample source code implements the BooleanEdgeDetectionFilter extension method, targeting the Bitmap class. This method implements the Boolean Edge Detection theoretical steps discussed in the previous section.

In order to determine whether a newly calculated matrix, as described in step 1, matches any of the 16 pre-defined edge masks, the BooleanEdgeDetectionFilter implements string comparison. The reasoning behind string based edge mask comparison boils down to efficiency, both in terms of reducing code complexity and improving performance. The method defines a generic List of type string and then proceeds to add 16 strings, each representing an edge mask. Edge mask strings express an edge mask in row and column format. The following code snippet lists the 16 edge mask strings being defined:

List<string> edgeMasks = new List<string>();

edgeMasks.Add("011011011"); edgeMasks.Add("000111111");
edgeMasks.Add("110110110"); edgeMasks.Add("111111000");
edgeMasks.Add("011011001"); edgeMasks.Add("100110110");
edgeMasks.Add("111011000"); edgeMasks.Add("111110000");
edgeMasks.Add("111011001"); edgeMasks.Add("100110111");
edgeMasks.Add("001011111"); edgeMasks.Add("111110100");
edgeMasks.Add("000011111"); edgeMasks.Add("000110111");
edgeMasks.Add("001011011"); edgeMasks.Add("110110100");

The following code snippet lists the complete implementation of the BooleanEdgeDetectionFilter extension method:

public static Bitmap BooleanEdgeDetectionFilter( 
                                this Bitmap sourceBitmap, 
                                BooleanFilterType filterType, 
                                float redFactor = 1.0f, 
                                float greenFactor = 1.0f, 
                                float blueFactor = 1.0f, 
                                byte threshold = 0) 
{ 
    BitmapData sourceData = 
               sourceBitmap.LockBits(new Rectangle(0, 0, 
               sourceBitmap.Width, sourceBitmap.Height), 
               ImageLockMode.ReadOnly, 
               PixelFormat.Format32bppArgb); 

    byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
    byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

    Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
    sourceBitmap.UnlockBits(sourceData);

    List<string> edgeMasks = new List<string>();
    edgeMasks.Add("011011011"); edgeMasks.Add("000111111");
    edgeMasks.Add("110110110"); edgeMasks.Add("111111000");
    edgeMasks.Add("011011001"); edgeMasks.Add("100110110");
    edgeMasks.Add("111011000"); edgeMasks.Add("111110000");
    edgeMasks.Add("111011001"); edgeMasks.Add("100110111");
    edgeMasks.Add("001011111"); edgeMasks.Add("111110100");
    edgeMasks.Add("000011111"); edgeMasks.Add("000110111");
    edgeMasks.Add("001011011"); edgeMasks.Add("110110100");

    int filterOffset = 1;
    int calcOffset = 0;
    int byteOffset = 0;

    int matrixMean = 0;
    int matrixTotal = 0;
    double matrixVariance = 0;

    double blueValue = 0;
    double greenValue = 0;
    double redValue = 0;

    string matrixPattern = String.Empty;

    for (int offsetY = filterOffset; offsetY < sourceBitmap.Height - filterOffset; offsetY++)
    {
        for (int offsetX = filterOffset; offsetX < sourceBitmap.Width - filterOffset; offsetX++)
        {
            byteOffset = offsetY * sourceData.Stride + offsetX * 4;

            matrixMean = 0;
            matrixTotal = 0;
            matrixVariance = 0;
            matrixPattern = String.Empty;

            // Step 1: Calculate the local 3x3 mean
            for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
            {
                for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                {
                    calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                    matrixMean += pixelBuffer[calcOffset];
                    matrixMean += pixelBuffer[calcOffset + 1];
                    matrixMean += pixelBuffer[calcOffset + 2];
                }
            }

            matrixMean = matrixMean / 9;

            // Steps 1 & 4: Build the binary matrix pattern and calculate the variance
            for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
            {
                for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                {
                    calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                    matrixTotal = pixelBuffer[calcOffset];
                    matrixTotal += pixelBuffer[calcOffset + 1];
                    matrixTotal += pixelBuffer[calcOffset + 2];

                    matrixPattern += (matrixTotal > matrixMean ? "1" : "0");

                    matrixVariance += Math.Pow(matrixMean -
                                     (pixelBuffer[calcOffset] +
                                      pixelBuffer[calcOffset + 1] +
                                      pixelBuffer[calcOffset + 2]), 2);
                }
            }

            matrixVariance = matrixVariance / 9;

            if (filterType == BooleanFilterType.Sharpen)
            {
                blueValue = pixelBuffer[byteOffset];
                greenValue = pixelBuffer[byteOffset + 1];
                redValue = pixelBuffer[byteOffset + 2];

                // Step 4: Exclude noise using the global threshold
                if (matrixVariance > threshold)
                {
                    // Step 2: Compare the newly calculated matrix and the edge masks
                    if (edgeMasks.Contains(matrixPattern))
                    {
                        blueValue = (blueValue * blueFactor);
                        greenValue = (greenValue * greenFactor);
                        redValue = (redValue * redFactor);

                        blueValue = (blueValue > 255 ? 255 : (blueValue < 0 ? 0 : blueValue));
                        greenValue = (greenValue > 255 ? 255 : (greenValue < 0 ? 0 : greenValue));
                        redValue = (redValue > 255 ? 255 : (redValue < 0 ? 0 : redValue));
                    }
                }
            }
            // Steps 2 & 4: Edge detection - compare the calculated matrix against
            // the edge masks and exclude noise using the global threshold
            else if (matrixVariance > threshold && edgeMasks.Contains(matrixPattern))
            {
                blueValue = 255; greenValue = 255; redValue = 255;
            }
            else
            {
                blueValue = 0; greenValue = 0; redValue = 0;
            }

            resultBuffer[byteOffset] = (byte)blueValue;
            resultBuffer[byteOffset + 1] = (byte)greenValue;
            resultBuffer[byteOffset + 2] = (byte)redValue;
            resultBuffer[byteOffset + 3] = 255;
        }
    }

    Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

    BitmapData resultData =
               resultBitmap.LockBits(new Rectangle(0, 0,
               resultBitmap.Width, resultBitmap.Height),
               ImageLockMode.WriteOnly,
               PixelFormat.Format32bppArgb);

    Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
    resultBitmap.UnlockBits(resultData);

    return resultBitmap;
}

Sample Images

This article features a photograph of the Eiffel Tower used in generating the sample images. The original image has been licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license and can be downloaded from Wikipedia: Original Image.

The Original Image

Tour_Eiffel_Wikimedia_Commons

Edge Detection, Threshold 50

Boolean Edge Detection Threshold 50

Sharpen, Threshold 50, Blue

Boolean Edge Detection Threshold 50 Sharpen Blue

Sharpen, Threshold 50, Green

Boolean Edge Detection Threshold 50 Sharpen Green

Sharpen, Threshold 50, Green and Blue

Boolean Edge Detection Threshold 50 Sharpen Green Blue

Sharpen, Threshold 50, Red

Boolean Edge Detection Threshold 50 Sharpen Red

Sharpen, Threshold 50, Red and Blue

Boolean Edge Detection Threshold 50 Sharpen Red Blue

Sharpen, Threshold 50, Red and Green

Boolean Edge Detection Threshold 50 Sharpen Red Green

Sharpen, Threshold 50, White – Red, Green and Blue

Boolean Edge Detection Threshold 50 Sharpen White

Related Articles and Feedback

Feedback and questions are always encouraged. If you know of an alternative implementation or have ideas on a more efficient implementation please share in the comments section.

I’ve published a number of articles related to imaging and images; you can find links to them here:

C# How to: Morphological Edge Detection

Article purpose

The objective of this article is to explore edge detection implemented by means of image erosion and image dilation. In addition we explore the concept of implementing morphological image sharpening.

Sample source code

This article is accompanied by a sample source code Visual Studio project which is available for download.

Using the sample application

This article is accompanied by a Sample Application intended to implement all of the concepts illustrated throughout this article. Using the sample application users can easily test and replicate concepts.

Clicking the Load Image button allows users to select a source/input image from the local system. Filter option categories are: colours, morphology type, edge options and filter size.

This article and sample source code can process colour images as source images. The user can specify which colour components to include in the resulting image. The three options labelled Red, Green and Blue indicate whether the related colour component features in the result image.

The four options labelled Dilate, Erode, Open and Closed enable the user to select the type of morphological filter to apply.

Edge options include: None, Edge Detection and Image Sharpening. Selecting None results in only the selected morphological filter being applied.

Filter sizes range from 3×3 up to 17×17. The filter size specified determines the intensity of the morphological filter applied.

If desired users are able to save filter result images to the local file system by clicking the Save Image button. The image below is a screenshot of the Morphological Edge Detection sample application in action:

Morphological_Edge_Detection_Sample_Application

Morphology – Image Erosion and Dilation

Image erosion and image dilation are morphological operations, forming part of the field of mathematical morphology. In simpler terms, image dilation can be defined by the following quotation:

Dilation is one of the two basic operators in the area of mathematical morphology, the other being erosion. It is typically applied to binary images, but there are versions that work on grayscale images. The basic effect of the operator on a binary image is to gradually enlarge the boundaries of regions of foreground pixels (i.e. white pixels, typically). Thus areas of foreground pixels grow in size while holes within those regions become smaller.

Image erosion, being a related concept, is defined by this quotation:

Erosion is one of the two basic operators in the area of mathematical morphology, the other being dilation. It is typically applied to binary images, but there are versions that work on grayscale images. The basic effect of the operator on a binary image is to erode away the boundaries of regions of foreground pixels (i.e. white pixels, typically). Thus areas of foreground pixels shrink in size, and holes within those areas become larger.

From the definitions listed above we gather that image dilation increases the size of edges contained in an image. In contrast, image erosion decreases or shrinks the size of an image's edges.

Image Edge Detection

We gain a good definition of edge detection from Wikipedia's article on the subject:

Edge detection is the name for a set of mathematical methods which aim at identifying points in a digital image at which the image brightness changes sharply or, more formally, has discontinuities. The points at which image brightness changes sharply are typically organized into a set of curved line segments termed edges. The same problem of finding discontinuities in 1D signals is known as step detection and the problem of finding signal discontinuities over time is known as change detection. Edge detection is a fundamental tool in image processing, machine vision and computer vision, particularly in the areas of feature detection and feature extraction.

In this article we implement edge detection based on the type of morphological operation being performed. In the case of image erosion, the eroded image is subtracted from the original image, resulting in an image with pronounced edges. When implementing image dilation, edge detection is achieved by subtracting the original image from the dilated image.

Image Sharpening

Image sharpening is often referred to by the term edge enhancement; from Wikipedia we gain the following definition:

Edge enhancement is an image processing filter that enhances the edge contrast of an image or video in an attempt to improve its acutance (apparent sharpness).

The filter works by identifying sharp edge boundaries in the image, such as the edge between a subject and a background of a contrasting color, and increasing the image contrast in the area immediately around the edge. This has the effect of creating subtle bright and dark highlights on either side of any edges in the image, called overshoot and undershoot, leading the edge to look more defined when viewed from a typical viewing distance.

In this article we implement image sharpening by first creating an edge detection image which we then add to the original image, resulting in an image with enhanced edges.
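Expressed per colour channel, a minimal sketch of the two operations described above might look as follows. The method names and the single-channel representation are illustrative only; the full buffer-based implementation appears in the next section.

// Illustrative per-channel sketch of morphological edge detection and sharpening.
// 'original' and 'morphed' are single channel values (0 - 255); 'morphed' is the
// eroded or dilated value of the same pixel.
static int MorphologicalEdge(int original, int morphed, bool usedDilation)
{
    // Dilation based gradient: dilated - original.
    // Erosion based gradient:  original - eroded.
    int edge = usedDilation ? morphed - original : original - morphed;
    return Math.Max(0, Math.Min(255, edge));
}

static int MorphologicalSharpen(int original, int morphed, bool usedDilation)
{
    // Sharpening adds the edge value back onto the original channel value.
    int sharpened = original + MorphologicalEdge(original, morphed, usedDilation);
    return Math.Max(0, Math.Min(255, sharpened));
}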

Implementing Morphological Filters

The sample source code provides the definition of the DilateAndErodeFilter extension method, targeting the Bitmap class. The DilateAndErodeFilter, as a single method implementation, is capable of applying a specified morphological filter, edge detection and image sharpening. The following code snippet details the implementation of the DilateAndErodeFilter extension method:

public static Bitmap DilateAndErodeFilter(this Bitmap sourceBitmap,  
                                        int matrixSize, 
                                        MorphologyType morphType, 
                                        bool applyBlue = true, 
                                        bool applyGreen = true, 
                                        bool applyRed = true,
                                        MorphologyEdgeType edgeType = 
                                        MorphologyEdgeType.None)  
{ 
    BitmapData sourceData =  
               sourceBitmap.LockBits(new Rectangle (0, 0, 
               sourceBitmap.Width, sourceBitmap.Height), 
               ImageLockMode.ReadOnly,  
               PixelFormat.Format32bppArgb); 

    byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
    byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

    Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
    sourceBitmap.UnlockBits(sourceData);

    int filterOffset = (matrixSize - 1) / 2;
    int calcOffset = 0;
    int byteOffset = 0;

    int blue = 0;
    int green = 0;
    int red = 0;

    byte morphResetValue = 0;

    if (morphType == MorphologyType.Erosion)
    {
        morphResetValue = 255;
    }

    for (int offsetY = filterOffset; offsetY < sourceBitmap.Height - filterOffset; offsetY++)
    {
        for (int offsetX = filterOffset; offsetX < sourceBitmap.Width - filterOffset; offsetX++)
        {
            byteOffset = offsetY * sourceData.Stride + offsetX * 4;

            blue = morphResetValue;
            green = morphResetValue;
            red = morphResetValue;

            if (morphType == MorphologyType.Dilation)
            {
                // Dilation: take the maximum value found in the neighbourhood
                for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
                {
                    for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                    {
                        calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                        if (pixelBuffer[calcOffset] > blue)
                        { blue = pixelBuffer[calcOffset]; }

                        if (pixelBuffer[calcOffset + 1] > green)
                        { green = pixelBuffer[calcOffset + 1]; }

                        if (pixelBuffer[calcOffset + 2] > red)
                        { red = pixelBuffer[calcOffset + 2]; }
                    }
                }
            }
            else if (morphType == MorphologyType.Erosion)
            {
                // Erosion: take the minimum value found in the neighbourhood
                for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
                {
                    for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                    {
                        calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                        if (pixelBuffer[calcOffset] < blue)
                        { blue = pixelBuffer[calcOffset]; }

                        if (pixelBuffer[calcOffset + 1] < green)
                        { green = pixelBuffer[calcOffset + 1]; }

                        if (pixelBuffer[calcOffset + 2] < red)
                        { red = pixelBuffer[calcOffset + 2]; }
                    }
                }
            }

            if (applyBlue == false) { blue = pixelBuffer[byteOffset]; }
            if (applyGreen == false) { green = pixelBuffer[byteOffset + 1]; }
            if (applyRed == false) { red = pixelBuffer[byteOffset + 2]; }

            if (edgeType == MorphologyEdgeType.EdgeDetection ||
                edgeType == MorphologyEdgeType.SharpenEdgeDetection)
            {
                if (morphType == MorphologyType.Dilation)
                {
                    blue = blue - pixelBuffer[byteOffset];
                    green = green - pixelBuffer[byteOffset + 1];
                    red = red - pixelBuffer[byteOffset + 2];
                }
                else if (morphType == MorphologyType.Erosion)
                {
                    blue = pixelBuffer[byteOffset] - blue;
                    green = pixelBuffer[byteOffset + 1] - green;
                    red = pixelBuffer[byteOffset + 2] - red;
                }

                if (edgeType == MorphologyEdgeType.SharpenEdgeDetection)
                {
                    blue += pixelBuffer[byteOffset];
                    green += pixelBuffer[byteOffset + 1];
                    red += pixelBuffer[byteOffset + 2];
                }
            }

            blue = (blue > 255 ? 255 : (blue < 0 ? 0 : blue));
            green = (green > 255 ? 255 : (green < 0 ? 0 : green));
            red = (red > 255 ? 255 : (red < 0 ? 0 : red));

            resultBuffer[byteOffset] = (byte)blue;
            resultBuffer[byteOffset + 1] = (byte)green;
            resultBuffer[byteOffset + 2] = (byte)red;
            resultBuffer[byteOffset + 3] = 255;
        }
    }

    Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

    BitmapData resultData =
               resultBitmap.LockBits(new Rectangle(0, 0,
               resultBitmap.Width, resultBitmap.Height),
               ImageLockMode.WriteOnly,
               PixelFormat.Format32bppArgb);

    Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
    resultBitmap.UnlockBits(resultData);

    return resultBitmap;
}
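A brief usage sketch of the method above; the file paths are placeholders.

// Usage sketch: a 3x3 dilation based edge detection, all colour components included.
using (Bitmap source = new Bitmap("input.jpg"))
{
    Bitmap edges = source.DilateAndErodeFilter(
                          3, MorphologyType.Dilation,
                          edgeType: MorphologyEdgeType.EdgeDetection);

    edges.Save("morphological-edges.png", System.Drawing.Imaging.ImageFormat.Png);
}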

Sample Images

The source/input image used in this article is licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license and can be downloaded from Wikipedia: http://en.wikipedia.org/wiki/File:Bathroom_with_bathtube.jpg

Original Image

1280px-Bathroom_with_bathtube

Erosion 3×3, Edge Detect, Red, Green and Blue

Erosion 3x3 Edge Detect Red, Green and Blue

Erosion 3×3, Edge Detect, Blue

Erosion 3x3, Edge Detect, Blue

Erosion 3×3, Edge Detect, Green and Blue

Erosion 3x3, Edge Detect, Green and Blue

Erosion 3×3, Edge Detect, Red

Erosion 3x3, Edge Detect, Red

Erosion 3×3, Edge Detect, Red and Blue

Erosion 3x3, Edge Detect, Red and Blue

Erosion 3×3, Edge Detect, Red and Green

Erosion 3x3, Edge Detect, Red and Green

Erosion 7×7, Sharpen, Red, Green and Blue

Erosion 7x7, Sharpen, Red, Green and Blue

Erosion 7×7, Sharpen, Blue

Erosion 7x7, Sharpen, Blue

Erosion 7×7, Sharpen, Green

Erosion 7x7, Sharpen, Green

Erosion 7×7, Sharpen, Green and Blue

Erosion 7x7, Sharpen, Green and Blue

Erosion 7×7, Sharpen, Red

Erosion 7x7, Sharpen, Red

Erosion 7×7, Sharpen, Red and Blue

Erosion 7x7, Sharpen, Red and Blue

Erosion 7×7, Sharpen, Red and Green

Erosion 7x7, Sharpen, Red and Green

Related Articles and Feedback

Feedback and questions are always encouraged. If you know of an alternative implementation or have ideas on a more efficient implementation please share in the comments section.

I’ve published a number of articles related to imaging and images; you can find links to them here:

C# How to: Image Erosion and Dilation

Article purpose

The purpose of this article is to explore the concepts of image erosion, image dilation, open morphology and closed morphology. In addition this article extends conventional erosion and dilation implementations through partial colour variations of image erosion and image dilation.

Sample source code

This article is accompanied by a sample source code Visual Studio project which is available for download.

Using the sample application

Included in this article's sample source code you'll find a Windows Forms based sample application. The sample application can be used to test and replicate the concepts we explore in this article.

When executing the sample application, source/input images can be selected from the local system by clicking the Load Image button. On the right-hand side of the sample application's user interface users can adjust the provided controls in order to modify the method of filtering being implemented.

The three options labelled Red, Green and Blue relate to whether the relevant colour component will be regarded when implementing the configured filter.

Users are required to select a morphological filter type: dilation, erosion, open or closed morphology. The selection is expressed by means of four options respectively labelled Dilate, Erode, Open and Closed.

The only other input required from a user comes in the form of selecting the filter intensity/filter size. The dropdown indicated as Filter Size provides the user with several intensity levels ranging from 3×3 to 17×17. Note: larger filter sizes result in additional processing when implementing the filter; large images combined with large filter sizes may require significantly more processor cycles.

Resulting filtered images can be saved to the local file system by clicking the Save Image button. The screenshot below illustrates the Image Erosion and Dilation sample application in action:

Image_Erosion_Dilation_Sample_Application

Mathematical Morphology

A description of mathematical morphology as expressed on Wikipedia:

Mathematical morphology (MM) is a theory and technique for the analysis and processing of geometrical structures, based on set theory, lattice theory, topology, and random functions. MM is most commonly applied to digital images, but it can be employed as well on graphs, surface meshes, solids, and many other spatial structures.

Topological and geometrical continuous-space concepts such as size, shape, convexity, connectivity, and geodesic distance, were introduced by MM on both continuous and discrete spaces. MM is also the foundation of morphological image processing, which consists of a set of operators that transform images according to the above characterizations.

MM was originally developed for binary images, and was later extended to grayscale functions and images. The subsequent generalization to complete lattices is widely accepted today as MM's theoretical foundation.

In this article we explore image erosion and image dilation, as well as open and closed morphology. The implementation of these filters is significantly easier to grasp when compared to most formal definitions of mathematical morphology.

Image Erosion and Dilation

Image erosion and image dilation are morphological operations, forming part of the field of mathematical morphology. In simpler terms, image dilation can be defined by the following quotation:

Dilation is one of the two basic operators in the area of mathematical morphology, the other being erosion. It is typically applied to binary images, but there are versions that work on grayscale images. The basic effect of the operator on a binary image is to gradually enlarge the boundaries of regions of foreground pixels (i.e. white pixels, typically). Thus areas of foreground pixels grow in size while holes within those regions become smaller.

Image erosion, being a related concept, is defined by this quotation:

Erosion is one of the two basic operators in the area of mathematical morphology, the other being dilation. It is typically applied to binary images, but there are versions that work on grayscale images. The basic effect of the operator on a binary image is to erode away the boundaries of regions of foreground pixels (i.e. white pixels, typically). Thus areas of foreground pixels shrink in size, and holes within those areas become larger.

From the definitions listed above we gather that image dilation increases the size of edges contained in an image. In contrast, image erosion decreases or shrinks the size of an image's edges.

Open and Closed Morphology

Building upon the concepts of image erosion and image dilation, this section explores open and closed morphology. A good definition of open morphology can be expressed as:

The basic effect of an opening is somewhat like erosion in that it tends to remove some of the foreground (bright) pixels from the edges of regions of foreground pixels. However it is less destructive than erosion in general. As with other morphological operators, the exact operation is determined by a structuring element. The effect of the operator is to preserve foreground regions that have a similar shape to this structuring element, or that can completely contain the structuring element, while eliminating all other regions of foreground pixels.

In turn, closed morphology can be defined as:

Closing is similar in some ways to dilation in that it tends to enlarge the boundaries of foreground (bright) regions in an image (and shrink background colour holes in such regions), but it is less destructive of the original boundary shape. As with other morphological operators, the exact operation is determined by a structuring element. The effect of the operator is to preserve background regions that have a similar shape to this structuring element, or that can completely contain the structuring element, while eliminating all other regions of background pixels.

Implementing Image Erosion and Dilation

In this article we implement image erosion and image dilation by iterating each pixel contained within an image. The colour of each pixel is determined by taking its neighbouring pixels into account.

When implementing image dilation a pixel's value is determined by comparing neighbouring pixels' colour values and selecting the highest colour value expressed amongst neighbouring pixels.

In contrast, we implement image erosion by also inspecting neighbouring pixels' colour values, selecting the lowest colour value expressed amongst neighbouring pixels.

In addition to conventional erosion and dilation, the sample source code provides the ability to perform erosion and dilation targeting only specific colour components. Colour-specific erosion and dilation produces images which express the effects of erosion or dilation only in certain colours. Depending on the filter parameters specified, edges appear to have a coloured glow or shadow.

The sample source code provides the definition for the DilateAndErodeFilter extension method, targeting the Bitmap class. The following code snippet details the implementation of the DilateAndErodeFilter extension method:

public static Bitmap DilateAndErodeFilter(
                           this Bitmap sourceBitmap,  
                           int matrixSize, 
                           MorphologyType morphType, 
                           bool applyBlue = true, 
                           bool applyGreen = true, 
                           bool applyRed = true )  
{
    BitmapData sourceData =  
               sourceBitmap.LockBits(new Rectangle (0, 0, 
               sourceBitmap.Width, sourceBitmap.Height), 
               ImageLockMode.ReadOnly,  
               PixelFormat.Format32bppArgb); 

    byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
    byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

    Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
    sourceBitmap.UnlockBits(sourceData);

    int filterOffset = (matrixSize - 1) / 2;
    int calcOffset = 0;
    int byteOffset = 0;

    byte blue = 0;
    byte green = 0;
    byte red = 0;

    byte morphResetValue = 0;

    if (morphType == MorphologyType.Erosion)
    {
        morphResetValue = 255;
    }

    for (int offsetY = filterOffset; offsetY < sourceBitmap.Height - filterOffset; offsetY++)
    {
        for (int offsetX = filterOffset; offsetX < sourceBitmap.Width - filterOffset; offsetX++)
        {
            byteOffset = offsetY * sourceData.Stride + offsetX * 4;

            blue = morphResetValue;
            green = morphResetValue;
            red = morphResetValue;

            if (morphType == MorphologyType.Dilation)
            {
                // Dilation: take the maximum value found in the neighbourhood
                for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
                {
                    for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                    {
                        calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                        if (pixelBuffer[calcOffset] > blue)
                        { blue = pixelBuffer[calcOffset]; }

                        if (pixelBuffer[calcOffset + 1] > green)
                        { green = pixelBuffer[calcOffset + 1]; }

                        if (pixelBuffer[calcOffset + 2] > red)
                        { red = pixelBuffer[calcOffset + 2]; }
                    }
                }
            }
            else if (morphType == MorphologyType.Erosion)
            {
                // Erosion: take the minimum value found in the neighbourhood
                for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
                {
                    for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                    {
                        calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                        if (pixelBuffer[calcOffset] < blue)
                        { blue = pixelBuffer[calcOffset]; }

                        if (pixelBuffer[calcOffset + 1] < green)
                        { green = pixelBuffer[calcOffset + 1]; }

                        if (pixelBuffer[calcOffset + 2] < red)
                        { red = pixelBuffer[calcOffset + 2]; }
                    }
                }
            }

            if (applyBlue == false) { blue = pixelBuffer[byteOffset]; }
            if (applyGreen == false) { green = pixelBuffer[byteOffset + 1]; }
            if (applyRed == false) { red = pixelBuffer[byteOffset + 2]; }

            resultBuffer[byteOffset] = blue;
            resultBuffer[byteOffset + 1] = green;
            resultBuffer[byteOffset + 2] = red;
            resultBuffer[byteOffset + 3] = 255;
        }
    }

    Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

    BitmapData resultData =
               resultBitmap.LockBits(new Rectangle(0, 0,
               resultBitmap.Width, resultBitmap.Height),
               ImageLockMode.WriteOnly,
               PixelFormat.Format32bppArgb);

    Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
    resultBitmap.UnlockBits(resultData);

    return resultBitmap;
}
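A short usage sketch showing the partial colour variation described earlier; the file paths are placeholders.

// Usage sketch: a 3x3 dilation applied to the blue component only,
// producing the coloured glow effect discussed above.
using (Bitmap source = new Bitmap("input.jpg"))
{
    Bitmap dilatedBlue = source.DilateAndErodeFilter(
                                3, MorphologyType.Dilation,
                                applyGreen: false,
                                applyRed: false);

    dilatedBlue.Save("dilation-3x3-blue.png", System.Drawing.Imaging.ImageFormat.Png);
}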

Implementing Open and Closed Morphology

The sample source code implements open morphology by first applying image erosion to a source image; the resulting image is then filtered by applying image dilation.

In a reverse fashion, closed morphology is achieved by first applying image dilation to a source image, which is then further filtered by applying image erosion.

The sample source code defines the OpenMorphologyFilter and CloseMorphologyFilter extension methods, both targeting the Bitmap class. The implementation is as follows:

public static Bitmap OpenMorphologyFilter(
                            this Bitmap sourceBitmap, 
                            int matrixSize,
                            bool applyBlue = true, 
                            bool applyGreen = true, 
                            bool applyRed = true ) 
{ 
    Bitmap resultBitmap = 
           sourceBitmap.DilateAndErodeFilter(
                        matrixSize, MorphologyType.Erosion, 
                        applyBlue, applyGreen, applyRed); 

    resultBitmap = resultBitmap.DilateAndErodeFilter(
                                matrixSize, MorphologyType.Dilation,
                                applyBlue, applyGreen, applyRed);

    return resultBitmap;
}

public static Bitmap CloseMorphologyFilter(
                            this Bitmap sourceBitmap,
                            int matrixSize,
                            bool applyBlue = true,
                            bool applyGreen = true,
                            bool applyRed = true)
{
    Bitmap resultBitmap =
           sourceBitmap.DilateAndErodeFilter(
                        matrixSize, MorphologyType.Dilation,
                        applyBlue, applyGreen, applyRed);

    resultBitmap = resultBitmap.DilateAndErodeFilter(
                                matrixSize, MorphologyType.Erosion,
                                applyBlue, applyGreen, applyRed);

    return resultBitmap;
}
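A brief usage sketch, with placeholder file paths:

// Usage sketch: open and closed morphology with an 11x11 filter size.
using (Bitmap source = new Bitmap("input.jpg"))
{
    Bitmap opened = source.OpenMorphologyFilter(11);
    Bitmap closed = source.CloseMorphologyFilter(11);

    opened.Save("open-11x11.png", System.Drawing.Imaging.ImageFormat.Png);
    closed.Save("close-11x11.png", System.Drawing.Imaging.ImageFormat.Png);
}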

Sample Images

The original source image used to create all of the sample images in this article has been licensed under the Creative Commons Attribution-Share Alike 3.0 Unported, 2.5 Generic, 2.0 Generic and 1.0 Generic licenses. The original image is attributed to Kenneth Dwain Harrelson and can be downloaded from Wikipedia.

The Original Image

Monarch_In_May

Image Dilation 3×3 Blue

Image Dilation 3x3 Blue

Image Dilation 3×3 Blue, Green

Image Dilation 3x3 Blue, Green

Image Dilation 3×3 Green

Image Dilation 3x3 Green

Image Dilation 3×3 Red

Image Dilation 3x3 Red

Image Dilation 3×3 Red, Blue

Image Dilation 3x3 Red, Blue

Image Dilation 3×3 Red, Green, Blue

Image Dilation 3x3 Red, Green, Blue

Image Dilation 13×13 Blue

Image Dilation 13x13 Blue

Image Erosion 3×3 Green, Blue

Image Erosion 3x3 Green, Blue

Image Erosion 3×3 Green

Image Erosion 3x3 Green

Image Erosion 3×3 Red

Image Erosion 3x3 Red

Image Erosion 3×3 Red, Blue

Image Erosion 3x3 Red, Blue

Image Erosion 3×3 Red, Green

Image Erosion 3x3 Red, Green

Image Erosion 3×3 Red, Green, Blue

Image Erosion 3x3 Red, Green, Blue

Image Erosion 9×9 Green

Image Erosion 9x9 Green

Image Erosion 9×9 Red

Image Erosion 9x9 Red

Image Open Morphology 11×11 Green

Image Open Morphology 11x11 Green

Image Open Morphology 11×11 Green Blue

Image Open Morphology 11x11 Green Blue

Image Open Morphology 11×11 Red

Image Open Morphology 11x11 Red

Image Open Morphology 11×11 Red, Blue

Image Open Morphology 11x11 Red, Blue

Related Articles and Feedback

Feedback and questions are always encouraged. If you know of an alternative implementation or have ideas on a more efficient implementation please share in the comments section.

I’ve published a number of articles related to imaging and images; you can find links to them here:

C# How to: Image Colour Average

Article purpose

This article's intention is to provide a discussion of the tasks involved in implementing Image Colour Averaging. Pixel colour averages are calculated from neighbouring pixels.

Sample source code

This article is accompanied by a sample source code Visual Studio project which is available for download.

Using the Sample Application

The sample source code associated with this article includes a Windows Forms based sample application. The sample application is provided with the intention of illustrating the concepts explored in this article. In addition the sample application serves as a means of testing and replicating results.

By clicking the Load Image button users are able to select an input/source image from the local system. On the right-hand side of the screen various controls enable the user to control the implementation of colour averaging. The three options labelled Red, Green and Blue relate to whether an individual colour component is to be included when calculating colour averages.

The filter intensity can be specified by selecting a filter size from the dropdown; specifying higher values results in output images expressing a greater degree of colour averaging.

Additional image filter effects can be achieved by implementing colour component shifting/swapping; a short code sketch of the re-mapping follows the two lists below. When colour components are shifted left the result will be:

  • Blue is set to the original value of the Red component.
  • Red is set to the original value of the Green component.
  • Green is set to the original value of the Blue component.

When colour components are shifted right the result will be:

  • Red is set to the original value of the Blue component.
  • Blue is set to the original value of the Green component.
  • Green is set to the original value of the Red component.
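As an illustrative sketch, the left shift described in the first list above can be expressed on a single System.Drawing.Color value as shown below; the actual filter further on performs its re-ordering directly on the raw pixel buffer.

// Illustrative sketch: shift the colour components of a single colour to the left.
static Color ShiftComponentsLeft(Color original)
{
    // Color.FromArgb(alpha, red, green, blue)
    return Color.FromArgb(original.A,
                          original.G,  // new Red   = original Green
                          original.B,  // new Green = original Blue
                          original.R); // new Blue  = original Red
}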

Resulting images can be saved by the user to the local file system by clicking the Save Image button. The following image is a screenshot of the Image Colour Average sample application in action:

Image Colour Average Sample Application

Averaging Colours

In this article and the accompanying sample source code colour averaging is implemented on a per-pixel basis. An average colour value is calculated based on the colours of a pixel's neighbouring pixels. Determining neighbouring pixels in the sample source code has been implemented in much the same manner as image convolution. The major difference to convolution is the absence of a fixed matrix/kernel of weights.

Additional resulting visual effects can be achieved through various options/settings implemented whilst calculating colour averages. Additional options include being able to specify which colour component averages to implement. Furthermore colour components can be swapped/shifted around.

The sample source code implements the AverageColoursFilter extension method, targeting the Bitmap class. The extent or degree to which colour averaging will be evident in resulting images can be controlled by specifying different values for the matrixSize parameter. The matrixSize parameter in essence determines the number of neighbouring pixels involved in calculating an average colour.

The individual pixel colour components Red, Green and Blue can either be included in or excluded from calculating averages. The three boolean method parameters applyBlue, applyGreen and applyRed determine an individual colour component's inclusion in averaging calculations. If a colour component is excluded from averaging, the resulting image will instead express the original source/input image's colour component.

The intensity of a specific colour component average can be applied to another colour component by means of swapping/shifting colour components, which is indicated through the shiftType method parameter.

The following code snippet provides the implementation of the AverageColoursFilter extension method:

public static Bitmap AverageColoursFilter(
                            this Bitmap sourceBitmap,  
                            int matrixSize,   
                            bool applyBlue = true, 
                            bool applyGreen = true, 
                            bool applyRed = true, 
                            ColorShiftType shiftType = 
                            ColorShiftType.None)  
{ 
    BitmapData sourceData =  
               sourceBitmap.LockBits(new Rectangle(0, 0, 
               sourceBitmap.Width, sourceBitmap.Height), 
               ImageLockMode.ReadOnly,  
               PixelFormat.Format32bppArgb); 

    byte[] pixelBuffer = new byte[sourceData.Stride * sourceData.Height];
    byte[] resultBuffer = new byte[sourceData.Stride * sourceData.Height];

    Marshal.Copy(sourceData.Scan0, pixelBuffer, 0, pixelBuffer.Length);
    sourceBitmap.UnlockBits(sourceData);

    int filterOffset = (matrixSize - 1) / 2;
    int calcOffset = 0;
    int byteOffset = 0;

    int blue = 0;
    int green = 0;
    int red = 0;

    for (int offsetY = filterOffset; offsetY < sourceBitmap.Height - filterOffset; offsetY++)
    {
        for (int offsetX = filterOffset; offsetX < sourceBitmap.Width - filterOffset; offsetX++)
        {
            byteOffset = offsetY * sourceData.Stride + offsetX * 4;

            blue = 0; green = 0; red = 0;

            // Sum each colour component over the neighbourhood
            for (int filterY = -filterOffset; filterY <= filterOffset; filterY++)
            {
                for (int filterX = -filterOffset; filterX <= filterOffset; filterX++)
                {
                    calcOffset = byteOffset + (filterX * 4) + (filterY * sourceData.Stride);

                    blue += pixelBuffer[calcOffset];
                    green += pixelBuffer[calcOffset + 1];
                    red += pixelBuffer[calcOffset + 2];
                }
            }

            // Divide by the number of pixels sampled to obtain the mean
            blue = blue / (matrixSize * matrixSize);
            green = green / (matrixSize * matrixSize);
            red = red / (matrixSize * matrixSize);

            if (applyBlue == false) { blue = pixelBuffer[byteOffset]; }
            if (applyGreen == false) { green = pixelBuffer[byteOffset + 1]; }
            if (applyRed == false) { red = pixelBuffer[byteOffset + 2]; }

            if (shiftType == ColorShiftType.None)
            {
                resultBuffer[byteOffset] = (byte)blue;
                resultBuffer[byteOffset + 1] = (byte)green;
                resultBuffer[byteOffset + 2] = (byte)red;
                resultBuffer[byteOffset + 3] = 255;
            }
            else if (shiftType == ColorShiftType.ShiftLeft)
            {
                resultBuffer[byteOffset] = (byte)green;
                resultBuffer[byteOffset + 1] = (byte)red;
                resultBuffer[byteOffset + 2] = (byte)blue;
                resultBuffer[byteOffset + 3] = 255;
            }
            else if (shiftType == ColorShiftType.ShiftRight)
            {
                resultBuffer[byteOffset] = (byte)red;
                resultBuffer[byteOffset + 1] = (byte)blue;
                resultBuffer[byteOffset + 2] = (byte)green;
                resultBuffer[byteOffset + 3] = 255;
            }
        }
    }

    Bitmap resultBitmap = new Bitmap(sourceBitmap.Width, sourceBitmap.Height);

    BitmapData resultData =
               resultBitmap.LockBits(new Rectangle(0, 0,
               resultBitmap.Width, resultBitmap.Height),
               ImageLockMode.WriteOnly,
               PixelFormat.Format32bppArgb);

    Marshal.Copy(resultBuffer, 0, resultData.Scan0, resultBuffer.Length);
    resultBitmap.UnlockBits(resultData);

    return resultBitmap;
}

The definition of the ColorShiftType enum:

public enum ColorShiftType  
{
    None, 
    ShiftLeft, 
    ShiftRight 
}
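A short usage sketch of the filter and enum above; the file paths are placeholders.

// Usage sketch: an 11x11 colour average with the components shifted left.
using (Bitmap source = new Bitmap("input.jpg"))
{
    Bitmap averaged = source.AverageColoursFilter(
                             11, shiftType: ColorShiftType.ShiftLeft);

    averaged.Save("colour-average-11x11-shift-left.png",
                  System.Drawing.Imaging.ImageFormat.Png);
}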

Sample Images

The original image used in generating the sample images that form part of this article has been licensed under the Creative Commons Attribution-Share Alike 3.0 Unported, 2.5 Generic, 2.0 Generic and 1.0 Generic licenses. The original image can be downloaded from Wikipedia.

Original Image

Rose_Amber_Flush_20070601

Colour Average Blue Size 11

Colour Average Blue Size 11

Colour Average Blue Size 11 Shift Left

Colour Average Blue Size 11 Shift Left

Colour Average Blue Size 11 Shift Right

Colour Average Blue Size 11 Shift Right

Colour Average Green Size 11 Shift Right

Colour Average Green Size 11 Shift Right

Colour Average Green, Blue Size 11

Colour Average Green, Blue Size 11

Colour Average Green, Blue Size 11 Shift Left

Colour Average Green, Blue Size 11 Shift Left

Colour Average Green, Blue Size 11 Shift Right

Colour Average Green, Blue Size 11 Shift Right

Colour Average Red Size 11

Colour Average Red Size 11

Colour Average Red Size 11 Shift Left

Colour Average Red Size 11 Shift Left

Colour Average Red, Blue Size 11

Colour Average Red, Blue Size 11

Colour Average Red, Blue Size 11 Shift Left

Colour Average Red, Blue Size 11 Shift Left

Colour Average Red, Green Size 11

Colour Average Red, Green Size 11

Colour Average Red, Green Size 11 Shift Left

Colour Average Red, Green Size 11 Shift Left

Colour Average Red, Green Size 11 Shift Right

Colour Average Red, Green Size 11 Shift Right

Colour Average Red, Green, Blue Size 11

Colour Average Red, Green, Blue Size 11

Colour Average Red, Green, Blue Size 11 Shift Left

Colour Average Red, Green, Blue Size 11 Shift Left

Colour Average Red, Green, Blue Size 11 Shift Right

Colour Average Red, Green, Blue Size 11 Shift Right

Related Articles and Feedback

Feedback and questions are always encouraged. If you know of an alternative implementation or have ideas on a more efficient implementation please share in the comments section.

I’ve published a number of articles related to imaging and images; you can find links to them here:

C# How to: Linq to Bitmaps – Partial Colour Inversion

Article Purpose

In this follow-up article we further explore manipulating a Bitmap's underlying pixel data. This article is part 2 of the Linq to Bitmaps series; we'll be focussing on partial colour inversion using Linq queries.

Part 1: .

In my experience with previous articles I've written, articles seem to be better received by readers when accompanied by graphics/images. You will notice I've added thumbnail images throughout this article. All of the images originate from the same source image file and were created by the accompanying sample application.

Sunflower-Blue, Sunflower-Green, Sunflower-Invert-All-ShiftLeft, Sunflower-Invert-All-SwapBlueGreenFixRed200, Sunflower-Invert-All-SwapBlueRedFixGreen75

Sample source code

This article is accompanied by a sample source code Visual Studio project which is available for download here.

Sunflower-Invert-All-SwapRedGreen, Sunflower-Invert-All-SwapRedGreenFixBlue150, Sunflower-Invert-Blue, Sunflower-Invert-Blue-Green, Sunflower-Invert-BlueGreen-ShiftLeft

Using the sample Application

This article’s associated sample source code defines a sample application, detailing the concepts explored by this article. The sample application implements three types of image filters: Inverting Colours, Swapping each pixel’s colour components and Shifting pixels to different locations within the image data buffer. This article explores the Colour Inversion filter.

The image shown below is a screenshot of the Bitmap Pixel Manipulation application in action:

LinqToBitmaps_Screenshot

The sample application allows the user to specify an input source image which can then be modified by implementing an image filter. If desired the user has the option to save the new/result image to the file system.

Sunflower-Invert-BlueGreen-ShiftRight, Sunflower-Invert-BlueGreen-SwapBlueGreen, Sunflower-Invert-BlueGreen-SwapBlueGreenFixRed0, Sunflower-Invert-BlueGreen-SwapBlueGreenFixRed125, Sunflower-Invert-BlueGreen-SwapBlueRed

The Colour Inversion Filter

The Colour Inversion Filter can be implemented in various forms. The type of inversion is determined by the ColourInversionType enum, defined as follows:

public enum ColourInversionType
{
    All,
    Blue,
    Green,
    Red,
    BlueRed,
    BlueGreen,
    RedGreen
}

The following section provides an explanation of each Inversion Type:

  • All – Each Red, Green and Blue value will be subtracted from 255.
  • Blue – The value of Blue will be subtracted from 255, Green and Red values remain unchanged.
  • Green – The value of Green will be subtracted from 255, Blue and Red values remain unchanged.
  • Red – The value of Red will be subtracted from 255, Blue and Green values remain unchanged.
  • BlueRed – The values of Blue and Red will be subtracted from 255, the Green value remains unchanged.
  • BlueGreen – The values of Blue and Green will be subtracted from 255, the Red value remains unchanged.
  • RedGreen – The values of Red and Green will be subtracted from 255, the Blue value remains unchanged.

Sunflower-Invert-Blue-Red, Sunflower-Invert-BlueRed-SwapBlueGreenFixRed225, Sunflower-Invert-BlueRed-SwapBlueRedFixGreen35, Sunflower-Invert-BlueRed-SwapRedGreenFixBlue55, Sunflower-Invert-Blue-ShiftLeft

Applying Linq queries to Pixel Data

This article's sample source code implements Linq queries through the InvertColors extension method, which targets the Bitmap class. The definition is detailed in the following code snippet:

public static Bitmap InvertColors(this Bitmap sourceImage,
                                  ColourInversionType inversionType)
{
    List<ArgbPixel> pixelListSource = GetPixelListFromBitmap(sourceImage);
    List<ArgbPixel> pixelListResult = null;

    byte byte255 = 255;

    switch (inversionType)
    {
        case ColourInversionType.All:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = (byte)(byte255 - t.blue),
                                   red = (byte)(byte255 - t.red),
                                   green = (byte)(byte255 - t.green),
                                   alpha = t.alpha
                               }).ToList();
            break;
        case ColourInversionType.Blue:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = (byte)(byte255 - t.blue),
                                   red = t.red, green = t.green, alpha = t.alpha
                               }).ToList();
            break;
        case ColourInversionType.Green:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = t.blue, red = t.red,
                                   green = (byte)(byte255 - t.green),
                                   alpha = t.alpha
                               }).ToList();
            break;
        case ColourInversionType.Red:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = t.blue,
                                   red = (byte)(byte255 - t.red),
                                   green = t.green, alpha = t.alpha
                               }).ToList();
            break;
        case ColourInversionType.BlueRed:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = (byte)(byte255 - t.blue),
                                   red = (byte)(byte255 - t.red),
                                   green = t.green, alpha = t.alpha
                               }).ToList();
            break;
        case ColourInversionType.BlueGreen:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = (byte)(byte255 - t.blue),
                                   red = t.red,
                                   green = (byte)(byte255 - t.green),
                                   alpha = t.alpha
                               }).ToList();
            break;
        case ColourInversionType.RedGreen:
            pixelListResult = (from t in pixelListSource
                               select new ArgbPixel {
                                   blue = t.blue,
                                   red = (byte)(byte255 - t.red),
                                   green = (byte)(byte255 - t.green),
                                   alpha = t.alpha
                               }).ToList();
            break;
    }

    Bitmap resultBitmap = GetBitmapFromPixelList(pixelListResult,
                                                 sourceImage.Width,
                                                 sourceImage.Height);

    return resultBitmap;
}

The InvertColors extension method performs a simple select query, returning new ArgbPixel instances adjusted according to the value of the ColourInversionType parameter passed.
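The ArgbPixel class and the GetPixelListFromBitmap/GetBitmapFromPixelList helpers are defined elsewhere in the sample project and are not listed in this article. The following is only a rough sketch, under those assumptions, of what the pixel type and a call to InvertColors might look like; the file paths are placeholders.

// Assumed shape of the pixel type used by the Linq queries above;
// the real definition lives in the sample project.
public class ArgbPixel
{
    public byte blue;
    public byte green;
    public byte red;
    public byte alpha;
}

// Usage sketch with placeholder file paths.
using (Bitmap source = new Bitmap("sunflower.jpg"))
{
    Bitmap inverted = source.InvertColors(ColourInversionType.BlueGreen);
    inverted.Save("sunflower-invert-bluegreen.png",
                  System.Drawing.Imaging.ImageFormat.Png);
}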

Sunflower-Invert-Green-ShiftLeft, Sunflower-Invert-Green-SwapBlueGreen, Sunflower-Invert-Red-Green, Sunflower-Invert-RedGreen-SwapBlueRed, Sunflower-Invert-RedGreen-SwapRedGreenFixBlue110

Filter implementation examples

This section contains the eye candy of this article. The following set of images were created from a single input source image. The source image has been released into the public domain and can be downloaded from Wikipedia.

The Original Image

Sunflower_USFWS

Filtered Images

Sunflower-Blue, Sunflower-Green, Sunflower-Invert-All-ShiftLeft, Sunflower-Invert-All-SwapRedGreenFixBlue150, Sunflower-Invert-BlueGreen-SwapBlueRed, Sunflower-Invert-Blue-Red, Sunflower-Invert-Green, Sunflower-Invert-Red-Green, Sunflower-Invert-RedGreen-SwapBlueRed, Sunflower-Invert-RedGreen-SwapRedGreenFixBlue110


Dewald Esterhuizen
