2082-4599-1-SP - Majlesi Journal of Electrical Engineering
... sensitive items, which are then sorted in descending order of sensitivity and item length. Until all of the sensitive rules are hidden, the LHS element is deleted from the transactions that fully cover the sensitive rule, and the process continues from the next transaction ...
Learning Approximate Sequential Patterns for Classification
... propose a two-step process to discover such patterns. Using locality sensitive hashing (LSH), we first estimate the frequency of all subsequences and their approximate matches within a given Hamming radius in labeled examples. The discriminative ability of each pattern is then assessed from the esti ...
A Discretization Algorithm Based on Extended Gini Criterion
... continuous values according to the cut point, stopping when a stopping criterion is met and otherwise repeating the second step. Splitting methods fall into four categories: binning, entropy, dependency, and accuracy. To name a few, some of the algorithms developed under binning are Equ ...
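As a concrete illustration of the binning category mentioned above, the following is a minimal sketch of equal-width discretization (the function name and interface are illustrative, not taken from the paper):

```python
def equal_width_bins(values, k):
    """Discretize continuous values into k equal-width intervals.

    Returns, for each value, the index (0..k-1) of the bin it falls into.
    """
    lo, hi = min(values), max(values)
    width = (hi - lo) / k
    # The maximum value would land in bin k, so clamp it into the last bin.
    return [min(int((v - lo) / width), k - 1) for v in values]
```

Equal-width binning is unsupervised: the cut points depend only on the range of the attribute, not on class labels, which is what distinguishes it from the entropy- and dependency-based splitting methods.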
Ensemble of Classifiers to Improve Accuracy of the CLIP4 Machine
... Columns of this matrix correspond to variables of the optimized function (attributes). Rows correspond to function constraints (examples). The solution is obtained by selecting a minimal number of matrix columns such that for every row there is at least one matrix cell with the value of 1 ...
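The column-selection step described above is an instance of the set-cover problem, which is commonly approximated greedily. The sketch below shows one such greedy selection; it is an illustrative reconstruction under that assumption, not the paper's actual procedure:

```python
def greedy_column_cover(matrix):
    """Pick a small set of columns so every row has a 1 in some chosen column.

    matrix: list of rows, each a list of 0/1 entries.
    Returns the indices of the chosen columns.
    """
    n_rows, n_cols = len(matrix), len(matrix[0])
    uncovered = set(range(n_rows))
    chosen = []
    while uncovered:
        # Greedily take the column that covers the most still-uncovered rows.
        best = max(range(n_cols),
                   key=lambda c: sum(1 for r in uncovered if matrix[r][c]))
        covered = {r for r in uncovered if matrix[r][best]}
        if not covered:
            raise ValueError("some row has no 1 in any column")
        chosen.append(best)
        uncovered -= covered
    return chosen
```

The greedy heuristic does not guarantee a minimum cover, but it achieves a logarithmic approximation factor, which is why it is the usual choice when an exact minimal column selection is too expensive.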
paper - AET Papers Repository
... Cameron (1997) indicates that clustering methods are an important tool when analyzing traffic accidents as these methods are able to identify groups of road users, vehicles and road segments which would be suitable targets for countermeasures. More specifically, cluster analysis is a statistical tec ...
Finding Highly Correlated Pairs Efficiently with Powerful Pruning
... address here: We want to consider the supports of the pairs in a pruning rule without actually counting these supports. In this paper, we show that this can be done. We propose a pruning rule that involves the supports of the pairs. Meanwhile, we give a pruning method for this rule that does not req ...
Nearest-neighbor chain algorithm
In the theory of cluster analysis, the nearest-neighbor chain algorithm is a method that can be used to perform several types of agglomerative hierarchical clustering, using an amount of memory that is linear in the number of points to be clustered and an amount of time linear in the number of distinct distances between pairs of points. The main idea of the algorithm is to find pairs of clusters to merge by following paths in the nearest neighbor graph of the clusters until the paths terminate in pairs of mutual nearest neighbors. The algorithm was developed and implemented in 1982 by J. P. Benzécri and J. Juan, based on earlier methods that constructed hierarchical clusterings using mutual nearest neighbor pairs without taking advantage of nearest neighbor chains.
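The chain-following idea described above can be sketched in a few lines. The version below uses single linkage on one-dimensional points for brevity (the algorithm applies to any reducible linkage, such as Ward's); all names and the return format are illustrative:

```python
def nn_chain(points):
    """Agglomerative clustering via the nearest-neighbor chain algorithm.

    points: a list of numbers (1-D points, single linkage).
    Returns the sequence of merges, each as the sorted member indices
    of the newly formed cluster.
    """
    clusters = {i: {i} for i in range(len(points))}
    next_id = len(points)

    def dist(a, b):
        # Single-linkage distance between clusters a and b.
        return min(abs(points[i] - points[j])
                   for i in clusters[a] for j in clusters[b])

    merges, chain = [], []
    while len(clusters) > 1:
        if not chain:
            chain.append(next(iter(clusters)))   # start a fresh chain anywhere
        tip = chain[-1]
        prev = chain[-2] if len(chain) > 1 else None
        # Nearest neighbor of the tip; ties broken toward the previous
        # chain element so the chain is guaranteed to terminate.
        nn = min((c for c in clusters if c != tip),
                 key=lambda c: (dist(tip, c), c != prev))
        if nn == prev:
            # Tip and its predecessor are mutual nearest neighbors: merge.
            a, b = chain.pop(), chain.pop()
            merged = clusters.pop(a) | clusters.pop(b)
            clusters[next_id] = merged
            merges.append(sorted(merged))
            next_id += 1
        else:
            chain.append(nn)
    return merges
```

Because merging a mutual nearest-neighbor pair cannot invalidate the rest of the chain (for reducible linkages), the algorithm never rebuilds it from scratch, which is what gives the linear memory bound stated above.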