1. Introduction
The stomach is one of the essential organs of the human gastro-intestinal tract and supports the smooth functioning of the digestive system. Risk factors such as smoking, alcohol consumption, and unhealthy eating habits can lead to stomach cancer. Early detection based on symptoms such as nausea, indigestion, weight loss, and poor appetite can help health experts diagnose and treat the disease (Lambert, 2002). The health industry suggests Magnetic Resonance Imaging (MRI), a radiology-based medical imaging technique, to capture such images. The proposed method applies a neural network to MRI images to detect early-stage gastric disorders with accurate results.
1.1 Support Vector Machine
Support Vector Machine (SVM) is a supervised learning algorithm that has been widely used for binary classification, regression analysis, and related tasks such as outlier detection. The SVM training algorithm constructs a model that assigns a new data point to one of two classes (positive or negative). The model itself represents the optimal hyperplane (or boundary) dividing the training data into two parts. The optimal hyperplane is the one that produces the greatest margin between itself and each class; hence the SVM is also regarded as a maximum-margin classifier. The training data points nearest to the hyperplane are called support vectors. Classifying any new data point depends on the side of the hyperplane on which it lies.
Figure 1 shows an example of a simple classification problem in 2-D space, with filled circles belonging to one class and empty circles to the other. Three possible hyperplanes are shown: H1 provides no separation between the two groups; H2 and H3 both separate the groups, but H3 does so with the maximum margin. An SVM model trained on the data points shown will yield line H3 as the optimal hyperplane.
Figure 1.
Example of Hyperplanes in the 2-D Space
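The behaviour described above can be sketched with a small example. This is a minimal illustration using scikit-learn's `SVC` on hypothetical 2-D data (the points and labels are invented for demonstration, analogous to the two groups in Figure 1):

```python
# Toy illustration of a maximum-margin linear SVM in 2-D space.
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters (hypothetical data).
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],   # class 0
              [4.0, 4.0], [4.5, 5.0], [5.0, 4.5]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear")  # fits the optimal separating hyperplane
clf.fit(X, y)

# The support vectors are the training points closest to the hyperplane.
print(clf.support_vectors_)

# A new point is classified by the side of the hyperplane it falls on.
print(clf.predict([[1.0, 2.0], [5.0, 5.0]]))  # → [0 1]
```

Only the points nearest the boundary appear in `support_vectors_`; the remaining training points do not affect the fitted hyperplane.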
The example above is a simple case in which the two groups are linearly separable. In practice, the input classes are often not linearly separable. However, the kernel trick still allows SVMs to construct non-linear decision boundaries. A kernel function implicitly maps the input data into a feature space, which may be of higher dimension than the original space. The SVM then operates in this transformed space, producing a hyperplane that can be non-linear in the original input space. Commonly used kernel functions include the linear, Gaussian (RBF), polynomial, and sigmoid kernels.
Figure 2 gives an example of a data set that is not linearly separable in the original input space. By applying an appropriate transformation to the input space, the data points become linearly separable in the new feature space. Projecting the optimal separating hyperplane from the new feature space back to the original input space yields a non-linear boundary.
Figure 2.
Using the Kernel Trick to Create Non-Linear Hyperplanes
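The kernel trick can be sketched on synthetic data of the kind shown in Figure 2. The example below uses scikit-learn's `make_circles` to generate two concentric rings (which no straight line can separate) and compares a linear kernel with a Gaussian (RBF) kernel; the data set is illustrative, not from the paper:

```python
# Sketch of the kernel trick: concentric circles are not linearly
# separable in the input space, but an RBF kernel separates them.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)  # implicit map to a higher-dim space

# Training accuracy: the linear kernel fails, the RBF kernel succeeds
# because its hyperplane is non-linear in the original input space.
print(linear.score(X, y))  # roughly chance level
print(rbf.score(X, y))     # close to 1.0
```

The RBF classifier still fits a single hyperplane, but in the implicitly transformed feature space; only its back-projection into the 2-D input space is curved.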
1.2 Logistic Regression
Logistic Regression is a statistical model that uses the logistic function to model a binary dependent variable. Mathematically, the binary logistic model has two indicator values, 0 and 1. Logistic Regression is used in several fields, including medicine and machine learning, where the technique is used to assess a patient's risk of developing a particular disease based on observed characteristics (Figure 3).
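A minimal sketch of this kind of risk model is shown below. The feature names, values, and outcomes are entirely hypothetical (two invented patient characteristics, e.g. age and weight loss in kg); it only illustrates how the logistic function maps observed characteristics to a probability of the 0/1 outcome:

```python
# Minimal binary logistic regression on hypothetical patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: age, weight loss (kg) -- invented for illustration.
X = np.array([[35, 0.0], [42, 1.0], [58, 4.5], [63, 6.0],
              [29, 0.5], [70, 7.5], [51, 3.0], [45, 2.0]])
y = np.array([0, 0, 1, 1, 0, 1, 1, 0])  # hypothetical disease status

model = LogisticRegression(max_iter=1000).fit(X, y)

# The logistic function yields P(y=0) and P(y=1) for a new patient.
print(model.predict_proba([[60, 5.0]]))
print(model.predict([[60, 5.0]]))  # → [1]
```

The model's output is a probability, which is then thresholded (at 0.5 by default) to produce the 0/1 prediction.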