Chapter 4

SEGMENTATION

4.1 Introduction

Radiology and medical imaging have witnessed revolutionary developments in diagnosis and treatment evaluation during the last two decades. Image segmentation partitions an image into homogeneous regions; identical objects, or parts of objects, can be observed within these regions. The homogeneity of the segmented regions is measured on the basis of certain image properties. In the clustering process, the pixels of an image are arranged into subgroups: pixels within a subgroup have similar properties, while pixels belonging to two different subgroups have minimal similarity. Segmentation helps in locating the boundaries and objects present in a digital image.

After segmentation, the representation of the image becomes more meaningful and easier to analyze. During the segmentation process, pixels with similar intensities are assigned the same label for ease of identification. Several segmentation methods and algorithms have been developed, and the problems they address are domain specific. Lung image segmentation has been applied to numerous clinical inspections of varying complexity. From a clinical point of view, the radiologist is the person responsible for attaching meaning to an image.

Major challenges that affect segmentation algorithms are intensity inhomogeneity, image noise, the partial volume effect and image artifacts. These challenges have been addressed by various algorithms. Although many algorithms and advanced methodologies have been developed for lung image segmentation, there is still a need for an efficient and fast segmentation technique. Computational complexity is another drawback of the majority of lung segmentation algorithms. Many algorithms and methods are tied together to achieve high computational accuracy, but combining several algorithms with a large number of iterative processes increases the computational complexity. The aim of the proposed work is to develop a robust algorithm that increases accuracy while reducing computational complexity. A K-means clustering based segmentation process is used to detect the lung tumor, and the performance of the MR image segmentation algorithm is evaluated in terms of accuracy, execution time, specificity and sensitivity. In the proposed method, watershed segmentation was implemented to authenticate the features. Compared with existing approaches, the proposed method shows a decrease in computational complexity and execution time.

Fig. 4.1: Implementation and evaluation of segmentation algorithm

A total of 200 tumor images are considered for segmentation, of which 100 belong to benign cases and 100 to malignant cases. The benign cases are subdivided into 5 sets and the malignant cases are subdivided into 5 sets. Both benign and malignant tumors are considered in this thesis work.
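The K-means intensity clustering step described above can be sketched as follows. This is a minimal illustration on a synthetic image, not the thesis implementation: the function name, the deterministic centroid initialisation, and the toy "MR slice" are assumptions, and the subsequent watershed refinement (typically done with a library routine such as OpenCV's cv2.watershed) is omitted.

```python
import numpy as np

def kmeans_intensity(image, k=3, iters=20):
    """Cluster pixel intensities into k groups (Lloyd's algorithm).

    Centroids are initialised evenly across the intensity range,
    a simple deterministic choice made for this sketch.
    """
    pixels = image.reshape(-1).astype(float)
    centroids = np.linspace(pixels.min(), pixels.max(), k)
    for _ in range(iters):
        # assign each pixel to its nearest centroid
        labels = np.abs(pixels[:, None] - centroids[None, :]).argmin(axis=1)
        # update centroids as cluster means (keep old value if a cluster empties)
        for j in range(k):
            members = pixels[labels == j]
            if members.size:
                centroids[j] = members.mean()
    return labels.reshape(image.shape), centroids

# synthetic 16x16 "slice": dark background with one bright blob (toy tumor)
img = np.full((16, 16), 20.0)
img[5:10, 5:10] = 200.0
labels, centroids = kmeans_intensity(img, k=2)
tumor_mask = labels == centroids.argmax()  # brightest cluster as tumor candidate
print(int(tumor_mask.sum()))  # 25 pixels, the 5x5 blob
```

On real MR data the clustering would run on a denoised slice, and the brightest-cluster heuristic used here would be replaced by the watershed-based validation step of the proposed method.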
The details of the data set used in this work are shown in Table 4.1.

Table 4.1: Original dataset used for segmentation

Input Image Type    Data Set Number    Test Images    Pixels per Image
Benign Tumor        1                  20             65536
Benign Tumor        2                  20             65536
Benign Tumor        3                  20             65536
Benign Tumor        4                  20             65536
Benign Tumor        5                  20             65536
Malignant Tumor     6                  20             65536
Malignant Tumor     7                  20             65536
Malignant Tumor     8                  20             65536
Malignant Tumor     9                  20             65536
Malignant Tumor     10                 20             65536

4.2 PERFORMANCE EVALUATION OF SEGMENTATION ALGORITHM

The popularity of image segmentation algorithms has increased in recent years because of their applications in pattern recognition and medical diagnosis. A large number of segmentation algorithms for lung MRI have been developed by the research community over the past decades. These algorithms have strengths as well as weaknesses, and some of them are designed for specific applications. It is therefore essential to evaluate segmentation performance when selecting a robust algorithm suited for automatic diagnosis systems. Performance metrics are application dependent, and a wrong selection of metrics leads to inaccurate results. In this section, the different metrics used for analyzing the segmentation methods implemented here are formulated and discussed. Five performance metrics are used for analyzing and comparing the robustness of the segmentation methods:

MSE
PSNR
Accuracy
Sensitivity
Specificity

4.2.1 MSE

MSE is an estimator that quantifies the deviation of the segmentation output from the expected output (the manually segmented image). This deviation is considered as error, and MSE calculates the quadratic loss with reference to the manually segmented output. MSE arises from the randomness of the selected segmentation method, and the computed value is always non-negative. The value of MSE should be as low as possible; a value closer to zero indicates better segmentation. Consider a manually segmented image I(x, y) of dimension m x n and a segmentation output K(x, y).
Then MSE is given by equation (4.1):

MSE = (1/mn) * SUM_{x=0}^{m-1} SUM_{y=0}^{n-1} [K(x, y) - I(x, y)]^2        (4.1)

4.2.2 Peak signal to noise ratio (PSNR)

PSNR is the ratio of the maximum power of the segmented image signal to the error (noise) that corrupts the image. Since many images have a wide range of intensity levels, it is expressed on a logarithmic scale as in equation (4.2), where I_max is the maximum pixel intensity in the image:

PSNR = 10 log10(I_max^2 / MSE)        (4.2)

4.2.3 Accuracy

Accuracy describes the similarity of the segmentation output image to the manually segmented image; it is the ratio of the detected tumor area to the manually segmented tumor area. It is computed from the counts of true positives (TP), true negatives (TN), false positives (FP) and false negatives (FN) as shown in equation (4.3):

Accuracy = (TP + TN)/(TP + FN + FP + TN) x 100%        (4.3)

4.2.4 Sensitivity

Sensitivity is the percentage of the actual lesion that has been truly detected by the automated method. It is calculated as shown in equation (4.4):

Sensitivity = TP/(TP + FN) x 100%        (4.4)

4.2.5 Specificity

Specificity is the percentage of the actual background that has been correctly identified. It is calculated as shown in equation (4.5):

Specificity = TN/(TN + FP) x 100%        (4.5)

4.2.6 Precision

Precision is the percentage of the detected border that belongs to the true lesion. It is calculated as shown in equation (4.6):

Precision = TP/(TP + FP) x 100%        (4.6)

4.2.7 Similarity

Similarity is the degree of agreement between the automatic border and the manual border. It is calculated as shown in equation (4.7):

Similarity = 2TP/(2TP + FN + FP) x 100%        (4.7)

4.2.8 Border Error

Border error measures the discrepancy between the two borders. It is calculated as shown in equation (4.8):

Border Error = (FP + FN)/(TP + FN) x 100%        (4.8)
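The metrics formulated above can be sketched directly from their definitions. The following is an illustrative implementation, not the evaluation code of the thesis: the function names and the toy 4x4 masks are assumptions introduced for demonstration.

```python
import numpy as np

def mse(K, I):
    """Equation (4.1): mean squared error between output K and manual I."""
    K = np.asarray(K, dtype=float)
    I = np.asarray(I, dtype=float)
    return np.mean((K - I) ** 2)

def psnr(K, I, i_max=255.0):
    """Equation (4.2): PSNR for maximum pixel intensity i_max."""
    return 10.0 * np.log10(i_max ** 2 / mse(K, I))

def confusion_metrics(pred, truth):
    """Equations (4.3)-(4.8), in percent, from binary masks pred and truth."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)    # lesion pixels correctly detected
    tn = np.sum(~pred & ~truth)  # background correctly rejected
    fp = np.sum(pred & ~truth)   # background marked as lesion
    fn = np.sum(~pred & truth)   # lesion pixels missed
    return {
        "accuracy":     100.0 * (tp + tn) / (tp + tn + fp + fn),
        "sensitivity":  100.0 * tp / (tp + fn),
        "specificity":  100.0 * tn / (tn + fp),
        "precision":    100.0 * tp / (tp + fp),
        "similarity":   100.0 * 2 * tp / (2 * tp + fn + fp),
        "border_error": 100.0 * (fp + fn) / (tp + fn),
    }

# toy 4x4 masks: ground truth has a 2x2 lesion, prediction misses one pixel
truth = np.zeros((4, 4), dtype=bool); truth[1:3, 1:3] = True
pred = truth.copy(); pred[2, 2] = False
m = confusion_metrics(pred, truth)
print(round(m["sensitivity"], 2))  # 75.0 (3 of 4 lesion pixels found)
```

In practice, identical masks make the MSE zero and the PSNR undefined, so evaluation code should guard that division before reporting results.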