A Novel Intelligent Method for Breast Cancer Diagnosis Based on the FNA1 Test
Mohammad Fiuzy
Electrical and Computer Engineering Faculty
Sabzevar Tarbiat Moallem University
Sabzevar, Iran
[email protected]
Javad Haddadnia
Electrical and Computer Engineering Faculty
Sabzevar Tarbiat Moallem University
Sabzevar, Iran
[email protected]
Abstract— Breast cancer is one of the most common diseases in women, and its correct diagnosis is one of the major problems in the medical field. Early diagnosis of breast cancer significantly reduces deaths among women in the community. The fine needle aspiration test, or FNA1, is a simple, inexpensive, noninvasive and accurate way of detecting this disease, and the final detection can now be attempted by intelligent systems and machine learning methods. The most important problem in diagnosing this disease in its early stages is extracting the suitable features from the FNA test results. In this study, we propose a new algorithm for accurate diagnosis of the disease by combining artificial intelligence approaches: an evolutionary algorithm (EA2, here a GA3) to explore the space of possible answers, an exact classifier system (here FCM4) to separate malignant5 samples from normal (benign6) ones, and a neural network (NN7) to identify the model and structure. The introduced algorithm was implemented on the Wisconsin Diagnostic Breast Cancer database (WDBC8), which consists of 569 FNA test samples (212 patient samples and 357 healthy samples), and achieved a high detection accuracy of about 96.579%. Because it is smart, cheap, noninvasive, rapid and accurate, such a system can be introduced as a useful diagnostic tool for the comprehensive treatment of patients.
Keywords- Breast Cancer; FNA; Artificial Intelligence; Data Processing
I. INTRODUCTION
Breast cancer is a cancer with a high mortality rate among women and is one of the most common causes of death in the community [1]. According to the National Cancer Institute's statistics in America, one of every eight women will be diagnosed with breast cancer [2]. Six percent of all deaths worldwide are caused by this disease. Cancer is the third cause of death in Iran, and among all cancers, breast cancer is the most common in Iranian women [3, 4]. It should be noted that the age of breast cancer onset in Iran is one decade lower than in developed countries [5]. One of the major factors in the treatment of this disease is early diagnosis [6]: early diagnosis of breast cancer (at most 5 years after the first cancer cell division) increases the chance of survival from 56% to 86% [3]. Thus, a precise and reliable system is essential for the timely diagnosis of benign or malignant breast tumors [3]. Conventionally, surgical biopsy of samples is done to help diagnose the cancer. This method has the highest recognition accuracy among the available methods (clinical examination and medical imaging), but it is invasive, time consuming and expensive [4]. The FNA test (the method of this study) does not share the drawbacks of the previous methods and, combined with machine learning techniques, can provide an efficient system for detecting breast cancer. A lot of research on machine learning techniques has been done for breast cancer diagnosis; here we discuss the studies on the WDBC database (background). Groups of researchers have attempted to solve the Wisconsin breast cancer problem using neural networks [10, 11, 12, 42], SVM9 classifiers with the RBF kernel [7, 8, 9] and fuzzy classifiers [10, 11, 12], reaching at best about 95% identification accuracy. A recent study shows that kernel-based classification is superior to other existing methods such as linear classification based on wavelets and neural networks [12]. In [16], 93% identification accuracy was achieved using a combination of a decision tree and a neural network algorithm, and 94.4% was achieved by combining GA and DT in [17]. In [18], an expert system identified 93.6% accurately. In [19], genetic algorithms and an artificial immune algorithm identified 94% accurately, and in [20], 92% accuracy was achieved using the FRPCA algorithm. The references [21, 22, 23, 24] attempted to identify the disease using expert systems, artificial neural networks and fuzzy systems, even using principal component analysis, dimension reduction methods or combinations of these, with noticeably poorer accuracy than the methods above. Many machine learning systems have low accuracy because they are unable to remove waste and undesirable features. Therefore, in order to select the best features and achieve the highest accuracy, this work proposes a new and efficient algorithm for feature selection, classification of samples, identification and modeling. First, the general method is described; then the outline in Flowchart (1) is discussed. We hope the results of this study make a small but significant contribution to improved methods of treatment and diagnosis.
II. PROPOSED ALGORITHM AND METHOD
At first, data are extracted from the database and delivered to a binary GA whose cost function contains a neural network. Based on selecting optimal features from the database, the binary genetic algorithm generates candidate answers (0 or 1 per feature) according to the cost function. The network output is delivered to the classification algorithm (FCM) to reduce the feature set, estimate the accuracy for the binary genetic algorithm, and correctly separate patients from healthy subjects. The algorithm's output is returned to the network. This process is repeated inside the cost function of the binary genetic algorithm until optimal features are found that increase the feature selection accuracy and yield the desired result, namely simple detection. Flowchart (1) shows all steps of this research.
[Flowchart 1: Data -> Binary GA algorithm (evaluation & fitness on subject function) -> Neural network (estimation & simulation) -> Fuzzy C-means algorithm (data mining & dimension reduction) -> Diagnosis by correct features]
Flowchart 1: Proposed Algorithm
III. DATABASE
The FNA test involves extracting fluid from the breast tissue and visually examining the samples under a microscope [8]. In smart detection, the image is first converted to gray level (color information is not required). Then software defines the cell nucleus boundary using image-processing techniques and calculates features for each nucleus such as radius, texture, perimeter, area, compactness (perimeter squared divided by area), smoothness (mean difference in length of radially adjacent lines), concavity, symmetry and fractal dimension ("coastline approximation" of the nucleus border [27]). Finally, the mean, the standard error and the mean of the three largest values are computed for each feature, giving 30 real-valued features per sample. The WDBC database is a data set consisting of FNA test results as described above. The test was performed on 569 patients in Wisconsin hospitals [28]; ultimately 212 malignant and 357 benign cases were identified [28].
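As a quick sketch of the data described above: the WDBC set also ships with scikit-learn as `load_breast_cancer` (an assumption about the reader's environment, not a tool used in this paper), which makes the 569-by-30 layout and the 212/357 class split easy to verify.

```python
# Hedged sketch: inspect the WDBC data via scikit-learn's bundled copy.
from sklearn.datasets import load_breast_cancer

data = load_breast_cancer()
X, y = data.data, data.target        # 569 samples, 30 real-valued features
print(X.shape)                       # (569, 30)
# In this encoding, 0 = malignant, 1 = benign:
print((y == 0).sum(), (y == 1).sum())   # 212 357
```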
IV. PATTERN RECOGNITION AND FEATURE SELECTION
The feature selection problem is, in fact, to select the features that have maximum power in predicting the output [29]. The definition of the optimal subset depends on the problem we want to solve [26]. Feature selection algorithms are divided into two major categories depending on their evaluation process. If feature selection is done independently of any learning algorithm (as a completely separate preprocessor), it is called a filter method; in that case the selected features are obtained before processing. Feature selection is called a wrapper, or closed-loop, method if the assessment process is coupled with a classification algorithm. The second category usually gives better results. A common pattern recognition system consists of four sections: feature extraction, feature selection, designing and training a classifier, and finally testing. In this study, feature extraction and testing are done by neural networks, the best features are selected by the GA, and the FCM classifier is used for sample classification.
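The wrapper idea above can be sketched in a few lines: score a candidate feature subset by the accuracy of a classifier trained on just those columns. The nearest-centroid classifier and the synthetic data below are stand-ins for illustration, not the paper's NN/FCM setup.

```python
# Hedged sketch of wrapper-style subset scoring (stand-in classifier and data).
import numpy as np

def subset_accuracy(X, y, mask):
    """Accuracy of a nearest-centroid classifier using only the masked features."""
    Xs = X[:, mask.astype(bool)]
    c0 = Xs[y == 0].mean(axis=0)          # centroid of class 0
    c1 = Xs[y == 1].mean(axis=0)          # centroid of class 1
    pred = (np.linalg.norm(Xs - c1, axis=1)
            < np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 30))
y = (X[:, 2] + X[:, 5] > 0).astype(int)   # only features 2 and 5 matter here
good = np.zeros(30); good[[2, 5]] = 1
bad = np.zeros(30); bad[[7, 9]] = 1
print(subset_accuracy(X, y, good) > subset_accuracy(X, y, bad))   # True
```

The wrapper loop simply repeats this scoring for each candidate subset the search algorithm proposes.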
V. GENETIC ALGORITHM
Genetic algorithms were first developed during the 1960s and 1970s by John Holland [30]. The pseudo-code below expresses the genetic algorithm [31].
Procedure: Applied Genetic Algorithm
Initialization:
    t = 0;
    Set population size pop_size, number of generations max_gen,
    probability of crossover pc and probability of mutation pm;
    Initialize parent population P(t);
    Evaluate P(t) and select the best solution s* with the
    optimum objective value among P(t);
While (no termination criterion) do
    Regenerate C(t) from P(t) by applying the crossover
    and mutation operations;
    Evaluate C(t) and select the current best solution s
    with the minimum objective value among C(t);
    Update the best solution, i.e. if s < s*, then s* = s;
    Select P(t+1) from P(t) and C(t);
    t = t + 1;
End while;
End procedure
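The pseudo-code above can be sketched concretely for binary chromosomes. This is a maximizing variant with a toy count-of-ones fitness; the crossover here is simple one-point rather than PMX, and in the paper's setup the fitness would be the NN/FCM-based cost described later. All parameter values below are illustrative assumptions.

```python
# Minimal GA sketch over binary chromosomes (toy fitness, illustrative parameters).
import random

def genetic_algorithm(n_genes=30, pop_size=20, max_gen=50, pc=0.8, pm=0.1,
                      fitness=lambda ch: sum(ch)):
    pop = [[random.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    best = max(pop, key=fitness)                        # best solution s*
    for _ in range(max_gen):
        children = []
        while len(children) < pop_size:
            a, b = random.sample(pop, 2)
            if random.random() < pc:                    # one-point crossover
                cut = random.randrange(1, n_genes)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            # per-gene bit-flip mutation with probability pm
            children += [[g ^ (random.random() < pm) for g in c] for c in (a, b)]
        cand = max(children, key=fitness)
        if fitness(cand) > fitness(best):               # update best solution
            best = cand
        # elitist survivor selection: keep the fittest pop_size individuals
        pop = sorted(pop + children, key=fitness, reverse=True)[:pop_size]
    return best
```

With the toy fitness, the returned chromosome converges toward all ones.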
The source of this idea is the mimicking of the natural phenomena of genetics and Darwinian evolution. Today, the genetic algorithm is used for finding approximate answers in optimization and search problems. The algorithm is inspired by natural concepts such as heredity, mutation, selection and recombination. Its main objective is to increase the chance of good samples continuing to live in the next generation, optimizing generation by generation until an acceptable answer is obtained. Flowchart 2 illustrates the procedure of this algorithm [31]. In the genetic algorithm used in this paper, n-gene chromosomes represent the problem space. For example, if n = 30 is the number of features, a random chromosome (for 30 features) looks like the vector below.
0 1 0 1 1 1 0 1 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 1 0 1 0 1
Each chromosome represents an acceptable solution; each gene corresponds to one of the 30 features in the database. A gene is "1" if the corresponding feature is effective (selected) and "0" if it is not. In this paper, roulette-wheel selection is used to choose chromosomes for combination and generation of the next generation. In this method the chromosomes are mapped onto contiguous sections of a line, so that each chromosome's share is proportional to its fitness; to make a selection, a random number is generated and the chromosome whose section contains that number is selected [32]. The method applied for mutation is the inversion method: a sub-sequence of a chromosome's genes is selected and replaced by its reverse. To combine chromosomes with the above structure, the partially mapped crossover (PMX) method is used [33]. The combination operator keeps the talented genes and relocates the other genes to generate offspring; a gene is talented when it leads to a maximum value of the objective function. The objective function is the equation that scores the merit of each generation. Identification accuracy seems more important than the size of the selected subset, although between two subsets that lead to the same accuracy, the smaller one is preferred. We therefore suggest the following fitness function.
Fitness = α · Accuracy + β · (|n| − |s|) / |n|
Here |n| is the total number of features and |s| is the number of selected features. The first term weights identification accuracy and the second weights the rate of feature reduction. The sum of the α and β coefficients is fixed at 100, and they are set according to the relative importance of accurate identification versus using fewer features in the diagnosis. Detection accuracy is certainly the more important in this problem, so α (here 99) is much larger than β (here 1) [26]. The genetic algorithm parameters are set according to Table (1). Clearly, the GA starts with a random set and, in effect, tries to introduce subsets carrying the most useful information; the reduced feature vectors introduced by the GA are used as the FCM classifier's input.
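The suggested fitness can be written down directly. This sketch assumes the classifier's accuracy is supplied as a value in [0, 1] and uses the α = 99, β = 1 weighting from the text.

```python
# Sketch of the suggested fitness: reward accuracy, lightly reward fewer features.
def fitness(accuracy, chromosome, alpha=99.0, beta=1.0):
    n = len(chromosome)                # total number of features |n|
    s = sum(chromosome)                # number of selected features |s|
    return alpha * accuracy + beta * (n - s) / n

# At equal accuracy, the smaller subset scores higher, as the text requires:
small = [1] * 7 + [0] * 23
large = [1] * 20 + [0] * 10
print(fitness(0.96, small) > fitness(0.96, large))   # True
```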
Table 1: GA Properties
  Algorithm:             Binary GA
  Max iterations:        250
  Population size:       200
  Var high / Var low:    +10 / -10
  Max generation:        600
  Recombination percent: 0.1
  Crossover percent:     0.5
  Mutation percent:      0.4
  Selection:             Roulette wheel
Each subset of features is evaluated based on two criteria: first, the accuracy of the classifier (FCM), and second, the main value of each original feature vector (the selected features). The GA thus searches among 2^30 subsets with a high possibility of finding the optimal solution. The maximum number of iterations of the suggested algorithm is used as the termination condition. Flowchart (2) shows the cost function of the proposed algorithm.
[Flowchart 2: Feature vector -> Binary GA cost function (neural network -> FCM) -> subset score]
Flowchart 2: GA Cost Function
VI. MULTI-LAYER PROGRESSIVE (FEED-FORWARD) NEURAL NETWORK
Progressive (feed-forward) multi-layer neural networks are composed of many interconnected processing elements called neurons [34]. In general, the inputs (p) of a neuron are multiplied by its weights (w) and summed, a bias (b), or intercept, is applied, and the result is passed to an activation function f; the result, or output (a), is then transferred to the next layer, as shown in Fig (1). The feed-forward multi-layer neural network is one of the most important types of static neural networks and is used in many applications such as system identification problems, control, etc. [34]. The error back-propagation algorithm, or BP, was introduced together with this model as a training algorithm for the network.
Figure 1: MLP Neural Network [Matlab Help]
This algorithm provides a gradient-descent solution to minimize the errors of the network. In this project we used a 3-layer feed-forward network with a linear activation function in the input layer (i) and sigmoid functions in the middle and output layers (j and k).
Figure 2: Proposed NN Adapted From [34]
Based on the references used in [26] and on the results considered for selecting the best structure, we chose 30 neurons for the network's input layer (full model) and 15 neurons for the middle layer. Selecting the number of middle-layer neurons is very important: if it is too small, the network can face a shortage of learning capacity for solving complex nonlinear problems, and if it is too large, two problems arise; first, training time grows, and second, the network may merely memorize the training data and perform weakly on the actual problem [35, 41, 42]. For network training in this project we used the back-propagation algorithm, which changes the network's weight matrices until the mean squared error (MSE) of the network falls below a specified amount (0.02). In this method, the error is propagated back layer by layer, and in each layer the necessary corrections are made to the network weights. The process runs until the total output error converges to a minimum, the global error converges, or the pre-determined maximum number of iterations (maxiter) is reached. This value should be set so as to prevent the network from over-learning; the project's maxiter was set to 250 and the global error value to 0.02. Table (2) shows the details of the neural network in the proposed algorithm.
Table 2: NN Properties
  Neurons:                  30/15/1
  Input / Output:           30/1
  Learning algorithm:       BP
  MSE:                      0.02
  Global error:             0.02
  Maxiter:                  1000
  1st activation function:  Sigmoid
  2nd activation function:  Linear
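A 30-15-1 network with sigmoid units and one BP step on the squared error can be sketched in NumPy. This is an illustrative reimplementation, not the paper's MATLAB code; the learning rate, initialization and the use of sigmoid in both layers are assumptions made for the sketch.

```python
# Hedged sketch: 30-15-1 feed-forward network with one back-propagation step.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.1, (15, 30)), np.zeros(15)   # input -> hidden
W2, b2 = rng.normal(0, 0.1, (1, 15)), np.zeros(1)     # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W1 @ x + b1)
    return h, sigmoid(W2 @ h + b2)

def bp_step(x, t, lr=0.5):
    """One back-propagation step on input x, target t; returns the squared
    error before the update."""
    global W1, b1, W2, b2
    h, y = forward(x)
    err = y - t
    d2 = err * y * (1 - y)                 # output delta (sigmoid derivative)
    d1 = (W2.T @ d2) * h * (1 - h)         # hidden delta, error pushed back a layer
    W2 -= lr * np.outer(d2, h); b2 -= lr * d2
    W1 -= lr * np.outer(d1, x); b1 -= lr * d1
    return float(err[0]) ** 2
```

Repeated calls to `bp_step` drive the squared error toward the stopping threshold (0.02 in the paper's setting).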
VII. FCM OR FUZZY C-MEANS ALGORITHM
In 1969, Ruspini proposed the first clustering model with a fuzzy idea [36]. In this approach, the degree of membership of each data point in each cluster is recorded in the membership matrix U = [u_ij]_{c×n} = [u_1, u_2, ..., u_n], where c is the number of clusters and n is the number of data points. Two main constraints are applied. First, no cluster may be empty:

    Σ_{j=1}^{n} u_ij > 0   for all i ∈ {1, ..., c}

The second, called the normalization constraint, requires the total membership of each data point over all clusters to equal one:

    Σ_{i=1}^{c} u_ij = 1   for all j ∈ {1, ..., n}

For any data set, FCM tries to find a partition that minimizes the following cost, or objective, function:

    J_f(X, U_f, C) = Σ_{i=1}^{c} Σ_{j=1}^{n} u_ij^m d_ij^2        (1)

where d_ij is the distance between data point X_j and the i-th cluster center, and m ∈ [1, ∞) is the degree of fuzziness. If m goes to one, clustering becomes harder (crisp); conversely, if m goes to infinity, clustering becomes fuzzier. Assume, as in Fig (3), that we want to cluster 1000 random data points.
Figure 3: 1000 rand data before Clustering
1- Since the cost function cannot be minimized directly, an iterative algorithm with the following alternating optimization scheme is used. Select proper values for m and c and a small positive number ε; fill the matrix C (the cluster centers) randomly; set t = 0.
2- At t = 0 compute the membership matrix, and at t > 0 update it. The optimized degree of membership for fixed cluster parameters is given by Equation (2):

    u_ij^(t+1) = 1 / Σ_{l=1}^{c} (d_ij / d_lj)^{2/(m−1)},   for i = 1, ..., c and j = 1, ..., N        (2)
Figure 4: two Cluster By 1000 rand data
In practice, this method has been shown not to get stuck at local optima, and today FCM is used as a primer in many clustering methods. More details on this method can be found in [36, 37, 38, 39, 40].
VIII. TEST ACCURACY AND FINAL ASSESSMENT
In general, the whole feature selection process leading to the diagnosis can be stated as follows. For each patient, a string filled with "0"s and "1"s is generated. The string is multiplied element-wise with the patient's profile, so that, according to whether each feature is "0" or "1", the resulting array enters the neural network (the neural network sits inside the binary GA's cost function). The network then produces a diagnostic output for each patient. For correct classification, each patient's diagnostic output is entered into the FCM algorithm and returned again to the network. Doing this inside the cost function of the binary GA and repeating the algorithm leads to selecting the best possible answer based on patients' diagnoses with the fewest, most effective features. To demonstrate the capabilities of the proposed system, 10-fold cross-validation (adapted from [26]) was implemented on the database of 569 samples, divided into 10 parts: 9 parts with 57 samples and 1 part with 56 samples, each section containing a mixture of sick and healthy subjects. In 10-fold cross-validation, 9 parts of the data are used for training and the remainder for testing; the training and testing process is performed 10 times in a row, so that each time a different part is held out for testing.
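The split described above (nine folds of 57 samples, one of 56) can be sketched directly; the shuffle here is an assumption standing in for whatever mixing of sick and healthy samples the authors used.

```python
# Sketch of the 10-fold split: 569 samples into nine parts of 57 and one of 56.
import random

def ten_fold_indices(n=569, seed=0):
    idx = list(range(n))
    random.Random(seed).shuffle(idx)      # mix sick and healthy samples
    sizes = [57] * 9 + [56]               # 9 * 57 + 56 = 569
    folds, start = [], 0
    for size in sizes:
        folds.append(idx[start:start + size])
        start += size
    return folds

folds = ten_fold_indices()
print([len(f) for f in folds])            # [57, 57, 57, 57, 57, 57, 57, 57, 57, 56]
# Fold k is the test set; the other nine folds form the training set:
train_0 = [i for f in folds[1:] for i in f]
print(len(train_0), len(folds[0]))        # 512 57
```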
IX. SIMULATION AND EVALUATION OF PROPOSED ALGORITHM
3- In the final step, the cluster centers are updated using the optimized membership matrix. Besides these parameters, the way distance is measured also has a strong effect on the result.
4- Repeat steps 2 and 3 until ||C^(t+1) − C^(t)|| < ε or ||U^(t+1) − U^(t)|| < ε. Applying all the above steps to the random test data of Fig (3) creates two categories; the results can be seen in Fig (4).
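Steps 1-4 of the FCM scheme can be sketched in NumPy; this is an illustrative reimplementation of the standard algorithm (Equation (2) for memberships, the weighted mean for centers), not the paper's MATLAB code, and the two-blob data here stands in for the 1000 random points of Fig (3).

```python
# Hedged numpy sketch of FCM steps 1-4.
import numpy as np

def fcm(X, c=2, m=2.0, eps=1e-4, max_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), c, replace=False)]   # step 1: random centers
    for _ in range(max_iter):
        # distances d_ij from each center i to each point j, shape (c, n)
        d = np.linalg.norm(X[None, :, :] - C[:, None, :], axis=2)
        d = np.maximum(d, 1e-12)                  # avoid division by zero
        # step 2, Eq. (2): u_ij = 1 / sum_l (d_ij / d_lj)^(2/(m-1))
        U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)
        Um = U ** m
        C_new = (Um @ X) / Um.sum(axis=1, keepdims=True)   # step 3: new centers
        if np.linalg.norm(C_new - C) < eps:       # step 4: convergence check
            return U, C_new
        C = C_new
    return U, C

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (500, 2)), rng.normal(5, 0.5, (500, 2))])
U, C = fcm(X)        # two clusters from 1000 points, as in Figs (3) and (4)
```

Each column of U sums to one, satisfying the normalization constraint of Section VII.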
MATLAB was used for the simulation in this implementation; the neural networks were trained one sample at a time, and for each input (one row of the data collection) all the weights change. The results are shown in Table (3). The evaluation in [7] identified 202 patients, the evaluations in [10, 11, 12] 202 patients, a second evaluation in [16] 198 patients, the evaluation in [17] 201 patients, the evaluation in [18] approximately 194 patients, the evaluation in [19] 199, the evaluation in [20] 195 patients, and approximately 204 patients were identified by the proposed method.
Table 3: Compare and result
  Method                        Patients identified   Best answer     Accuracy test
  SVM [7]                       202                   Not available   95.61%
  RBF+FCM or SNM [10, 11, 12]   202                   ---             > 95%
  DT + NN [16]                  ~198                  ---             > 93%
  GA + DT [17]                  ~201                  ---             94.4%
  Exp Sys [18]                  ~194                  ---             > 93.6%
  GA + AI [19]                  ~195                  ---             > 94%
  FPRCA [20]                    ~195                  ---             > 92%
  Proposed Alg                  ~205                  96.579%         96%
The best and most effective features chosen in this way are given by the following vector: it represents the seven effective features, out of all features, that are involved in disease diagnosis. They are features 3, 6, 12, 18, 21, 25 and 27. Using these features, the database and the proposed algorithm, one can decide whether a patient's tumor is benign or malignant. Figure (5 A and B) shows the regression graphs of the results in the final assessment and conclusion.
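Applying the reported subset is a simple column selection: keep features 3, 6, 12, 18, 21, 25 and 27 (1-indexed, as in the text) of each 30-feature sample. The dummy matrix below is only for illustration.

```python
# Illustrative use of the reported best subset on a 30-feature sample matrix.
import numpy as np

SELECTED = [3, 6, 12, 18, 21, 25, 27]          # 1-indexed feature numbers
cols = [f - 1 for f in SELECTED]               # convert to 0-indexed columns

X = np.arange(60, dtype=float).reshape(2, 30)  # two dummy 30-feature samples
X_sel = X[:, cols]                             # reduced 7-feature samples
print(X_sel.shape)                             # (2, 7)
```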
Figure 5: Regression plot for output and target in
validation and final result (train, test, valid)
X. CONCLUSION
Breast cancer is among the most common diseases leading to the deaths of women around the world. In this research, given the necessity of early and timely disease diagnosis, a new method based on a combination of intelligent systems was presented. Many researchers are interested in using these tools, but the challenge of training neural networks is their most important concern. Combining evolutionary algorithms and data mining systems can be a good way to train these networks (to identify and select properties), finally leading to correct performance and data classification. Simulation results show that using a data mining process such as a binary genetic algorithm can increase the accuracy and efficiency of network training. Of course, the main goal of this research is to achieve 100% accuracy, so as to validate artificial intelligence systems and give them practical application. With deep knowledge and adequate control and understanding of intelligent systems, data mining and the disease, achieving such a goal is possible in the not-too-distant future, God willing.
REFERENCES
[1] Media Centre for World Health Organization, Fact sheet No. 297: Cancer, WHO: 2007.
[2] Wun LM, Merril RM, Feuer EJ, "Estimating lifetime and age-conditional probabilities of developing cancer", Lifetime Data Anal 1998; 4(2): 169-86.
[3] Listgarten J, "Predictive models for breast cancer susceptibility from multiple single nucleotide polymorphisms", J Clin Cancer Res 2004; 10: 2725-37.
[4] Center of Disease Control, Department of non-contagious of cancer organization, Report of registered cancer cases, 1383.
[5] Harirchi I, Karbakhsh M, Kashefi A, Momtahen AJ, "Breast Cancer in Iran: results of a multicenter study", Asian Pac J Cancer Prev 2004; 5(1): 24-70.
[6] Kazerooni Iman, Ghayumizade Hossein, Haddadnia Javad, "Introduction of an intelligent system for the accurate determination of masses in mammographic images", IJBD Journal, 2nd year, Nos. 3 and 4, Autumn and Winter 2009.
[7] Mandelbrot B, The Fractal Geometry of Nature, Freeman and Company: New York, 1997.
[8] Maglogiannis I, Zafiropoulos E, Anagnostopoulos I, "An intelligent system for automated breast cancer diagnosis and prognosis using SVM based classifiers", J Appl Intell 2007; 30(1): 24-36.
[9] Wang Y, Wan F, "Breast cancer diagnosis via support vector machines", Proc. 25th Chinese Control Conf 2006: 1853-6.
[10] Yang Z, Lu W, Yu D, Yu R, "Detecting false benign in breast cancer diagnosis", Proc. IEEE INNS-ENNS Int. Joint Conf. Neural Networks 2000; 3: 3655-8.
[11] Mu T, Nandi A, "Breast cancer detection from FNA using SVM with different parameter tuning systems and SOM-RBF classifier", J Franklin Institute 2007; 344: 285-311.
[12] Fuentes-Uriarte J, Garcia M, Castillo O, "Comparative study of fuzzy methods in breast cancer diagnosis", Annual Meeting of the NAFIPS 2008: 1-5.
[13] Yang Lwei Y, Nishikawa R, Jiang Y, "A study on several machine learning methods for classification of malignant and benign clustered microcalcifications", IEEE Trans Med Imaging 2005; 24(3): 371-80.
[14] Sewak M, Vaidya P, Chan C, Duan Z, "SVM approach to breast cancer classification", Proc. 2nd Int. Multi-Symp. Computer and Computational Sciences 2007: 32-7.
[15] Anagnostopoulos I, Anagnostopoulos C, Vergados D, Rouskas A, Kormentzas G, "The Wisconsin Breast Cancer Problem: Diagnosis and TTR/DFS time prognosis using probabilistic and generalized regression information classifiers", J Oncol Reports 2006; 15: 975-81.
[16] José M. Jerez-Aragonés, José A. Gómez-Ruiz, Gonzalo Ramos-Jiménez, José Muñoz-Pérez, "A combined neural network and decision trees model for prognosis of breast cancer relapse", Artificial Intelligence in Medicine 27 (2003) 45-63.
[17] Chin-Yuan Fan, Pei-Chann Chang, Jyun-Jie Lin, J.C. Hsieh, "A hybrid model combining case-based reasoning and fuzzy decision tree for medical data classification", Applied Soft Computing 11 (2011) 632-644.
[18] Murat Karabatak, M. Cevdet Ince, "An expert system for detection of breast cancer based on association rules and neural network", Expert Systems with Applications 36 (2009) 3465-3469.
[19] Seral Şahan, Kemal Polat, Halife Kodaz, Salih Güneş, "A new hybrid method based on fuzzy-artificial immune system and k-nn algorithm for breast cancer diagnosis", Computers in Biology and Medicine 37 (2007) 415-423.
[20] Pasi Luukka, "Classification based on fuzzy robust PCA algorithms and similarity classifier", Expert Systems with Applications 36 (2009) 7463-7468.
[21] Boris Kovalerchuk, Evangelos Triantaphyllou, James F. Ruiz, Jane Clayton, "Fuzzy logic in computer-aided breast cancer diagnosis: analysis of lobulation", Artificial Intelligence in Medicine 11 (1997) 75-85.
[22] P.J.G. Lisboa, H. Wong, P. Harris, R. Swindell, "A Bayesian neural network approach for modelling censored data with an application to prognosis after surgery for breast cancer", Artificial Intelligence in Medicine 28 (2003) 1-25.
[23] E.Y.K. Ng, U. Rajendra Acharya, Louis G. Keith, Susan Lockwood, "Detection and differentiation of breast cancer using neural classifiers with first warning thermal sensors", Information Sciences 177 (2007) 4526-4538.
[24] David B. Fogel, Eugene C. Wasson III, Edward M. Boughton, "Evolving neural networks for detecting breast cancer", Cancer Letters 96 (1995) 49-53.
[25] Khooei AR, Mehrabi Bahar M, Ghaemi M, Mirshahi M, "Sensitivity and specificity of CNB in diagnosis of breast masses", Iranian Journal of Basic Medical Sciences 1384; 8(2): 6-100.
[26] Alipoor Mohammad, Haddadnia Javad, "Introduction of an intelligent system for accurate diagnosis of breast cancer", IJBD, 2nd year, No. 2, Summer 2009.
[27] Mandelbrot B, The Fractal Geometry of Nature, Freeman and Company: New York, 1997.
[28] http://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+(Diagnostic)
[29] Jensen R, "Combining rough and fuzzy sets for feature selection", Ph.D. Thesis, School of Informatics, Univ. Edinburgh, 2005.
[30] Joaquín Abellán, Andrés R. Masegosa, "An ensemble method using credal decision trees", European Journal of Operational Research 205 (2010) 218-226.
[31] Chiung Moon, Jongsoo Kim, Gyunghyun Choi, Yoonho Seo, "An efficient genetic algorithm for the traveling salesman problem with precedence constraints", European Journal of Operational Research 140 (2002) 606-617.
[32] Ron Shelly (1998), "Roulette Wheel Study", http://en.wikipedia.org/wiki/Roulette#History
[33] Galiasso, Pablo & Wainwright, Roger L., 2001, "A Hybrid Genetic Algorithm for the Point to Multipoint Routing Problem with Single Split Paths", Proceedings of ACM/SIGAPP Symposium on Applied Computing (SAC'01), March 11-14, 2001, pp. 327-332.
[34] Madan M. Gupta, Liang Jin, and Noriyasu Homma, "Static and Dynamic Neural Networks: From Fundamentals to Advanced Theory", foreword by Lotfi A. Zadeh, IEEE Press, 2003.
Nirooe Mehyar, Parviz A.b, Giti M, "Simulating a Hybrid Model By GA and NN for Segregation patterns of benign and malignant breast cancer in mammography", Biophysics Iranian Journal, Winter 2006, No. 13, 3rd year.
Subtitles:
1: Fine Needle Aspiration
2: Evolutionary Algorithm
3: Genetic Algorithm
4: Fuzzy C-Means
5: Malignant
6: Benign
7: Neural Network
8: Wisconsin Database for Breast Cancer
9: Support Vector Machine
[35] A. Baraldi and P. Blonda, "A Survey of Fuzzy Clustering Algorithms for Pattern Recognition, Part I and II", IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 6, 1999.
[36] A.K. Jain, M.N. Murty, and P.J. Flynn, "Data Clustering: A Review", ACM Computing Surveys, vol. 31, no. 3, pp. 264-323, 1999.
[37] Rui Xu and Donald Wunsch, "Survey of Clustering Algorithms", IEEE Transactions on Neural Networks, vol. 16, no. 3, 2005.
[38] C. Döring, M.J. Lesot, and R. Kruse, "Data analysis with fuzzy clustering methods", Computational Statistics & Data Analysis, Elsevier B.V., 2006.
[39] Asgarian Ehsan, Memarzadeh Hasan, Mohesen Saryani, Jafar H, "A New View On Clustering By GA", 14th CSICC 2009.
[40] Naim Abdi, Amirahmadi Noshaz, Tahami Ehsan and Rabbani, "Diabetes Diagnosis By SVM", 14th ISCEE, summer 2011.
[41] Zhaohui L, Xiaoming W, Shengwen G, Binggang Y, "Diagnosis of breast cancer tumor based on manifold learning and support vector machine", Proc. IEEE Int. Conf. Information and Automation 2008: 703-7.