Publications

2011
Hassanien, A. E., "Heart Sound Feature Reduction Approach for Improving the Heart Valve Diseases Identification", International Conference on Signal Processing, Image Processing and Pattern Recognition - , Jeju Island, Korea, 8-10 December, 2011. Abstract

Recently, heart sound signals have been used to detect the status of heart valves and to identify heart valve diseases. Accordingly, two feature reduction techniques are proposed in this paper and applied prior to data classification. The first is the chi-square technique, which measures the lack of independence between each heart sound feature and the target class; the second is the deep belief network, which is used to generate a new data set with a reduced number of features according to the partition of the heart signals. Applying feature reduction before classification is important not only to improve classification accuracy and to enhance training and testing performance, but also to detect which stages of the heart sound matter for distinguishing sick subjects from normal ones, and which period matters for classifying heart murmurs. Different classification algorithms, including the naive Bayesian tree classifier and sequential minimal optimization, were applied to three data sets of 100 features extracted from the heart sound. Extensive experimental results on the heart sound signal data sets demonstrate that the proposed approach outperforms other classifiers, providing the highest classification accuracy with a minimized number of features.
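A minimal sketch of the chi-square scoring step described above, using scikit-learn on synthetic data that stands in for the 100 extracted heart sound features; the data, labels, and choice of k are illustrative assumptions, not the paper's setup.

```python
# Chi-square feature scoring sketch: rank features by their dependence on the
# class label and keep the top-k, as a stand-in for the paper's first reduction step.
# The random data below is purely illustrative.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

rng = np.random.default_rng(0)
X = rng.random((200, 100))          # 200 records x 100 features (chi2 needs non-negative values)
y = rng.integers(0, 2, size=200)    # 0 = normal, 1 = heart valve disease (synthetic labels)

selector = SelectKBest(score_func=chi2, k=20)   # keep the 20 highest-scoring features
X_reduced = selector.fit_transform(X, y)

print("chi-square scores of first 5 features:", selector.scores_[:5])
print("reduced shape:", X_reduced.shape)        # (200, 20)
```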

Heba, E., M. Salama, A. E. Hassanien, and T.-H. Kim, "Bi-Layer Behavioral-Based Feature Selection Approach for Network Intrusion Classification", Security Technology - International Conference, SecTech 2011, pp. 195-203, Jeju Island, Korea, 8-10 December, 2011.

To satisfy the ever-growing need for effective screening and diagnostic tests, medical practitioners have turned their attention to high-resolution, high-throughput methods. One approach is to use mass spectrometry based methods for disease diagnosis. Effective diagnosis is achieved by classifying the mass spectra as belonging to healthy or diseased individuals. Unfortunately, high-resolution mass spectrometry data contains a large amount of noisy, redundant, and irrelevant information, making accurate classification difficult. To overcome these obstacles, feature extraction methods are used to select or create small sets of relevant features. This paper compares existing feature selection methods to a novel wrapper-based feature selection and centroid-based classification method. A key contribution is the exposition of different feature extraction techniques, which encompass dimensionality reduction and feature selection methods. The experiments, on two cancer data sets, indicate that feature selection algorithms tend to both reduce data dimensionality and increase classification accuracy, while the dimensionality reduction techniques sacrifice performance as a result of lowering the number of features. To evaluate the dimensionality reduction and feature selection techniques, we use a simple classifier, thereby making the approach tractable. In relation to previous research, the proposed algorithm is very competitive in terms of (i) classification accuracy, (ii) size of feature sets, and (iii) usage of computational resources during both the training and classification phases.

Hassanien, A. E., "A Fast and Secure One-Way Hash Function", Security Technology - International Conference, SecTech 2011, Jeju Island, Korea, 8-10 December, 2011. Abstract

One-way hash functions play a fundamental role in data integrity, message authentication, and digital signatures in modern information security. In this paper we propose a fast one-way hash function that reduces computation time while providing strong collision resistance, good compression, and one-way resistance. It is based on the standard secure hash function (SHA-1) algorithm. The analysis indicates that the proposed algorithm, which we call fSHA-1, is collision resistant and assures good compression and pre-image resistance. In addition, its execution time is much shorter than that of the standard secure hash function.
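The fSHA-1 construction itself is not reproduced here; purely as a reference point, a minimal sketch of hashing and timing the standard SHA-1 baseline with Python's hashlib, i.e. the kind of measurement against which an optimized variant's execution time would be compared. Message count and size are arbitrary.

```python
# Time the standard SHA-1 baseline over a batch of random messages.
# This only illustrates the measurement; it does not implement fSHA-1.
import hashlib, os, time

messages = [os.urandom(1024) for _ in range(10_000)]   # 10k random 1 KiB messages

start = time.perf_counter()
digests = [hashlib.sha1(m).hexdigest() for m in messages]
elapsed = time.perf_counter() - start

print(f"SHA-1 over {len(messages)} messages: {elapsed:.3f} s "
      f"({len(messages) / elapsed:.0f} hashes/s)")
```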

Hassanien, A. E., "Feature evaluation based Fuzzy C-Mean classification", Fuzzy Systems (FUZZ), 2011 IEEE International Conference on , 27-30 June 2011 . Abstract

Fuzzy C-Means clustering (FCM) is an iterative algorithm whose aim is to find the centers, or centroids, of data clusters that minimize an assigned dissimilarity function. The degree of membership in a certain cluster can be defined in terms of the distance to the cluster centroid, and domain knowledge is used to formulate an appropriate measure. In practice, however, the Euclidean distance is used as a general-purpose measure, and its calculation does not take into account the degree of relevance of each feature to the classification model. In this paper, scoring methods such as ChiMerge and mutual information are used in the FCM model to improve the calculation of the Euclidean distance. Experimental results on UCI benchmark data sets demonstrate that the improved FCM performs better than the ordinary FCM: the ordinary FCM classifies using either all features or only the most important ones, whereas the improved FCM uses all features but computes the Euclidean distance according to the relevance degree of each feature.
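A minimal sketch of this idea, assuming per-feature relevance weights (standing in for ChiMerge or mutual information scores) folded into the Euclidean distance inside an otherwise ordinary FCM update loop; the data and weight values are illustrative.

```python
# Fuzzy C-Means with a feature-weighted Euclidean distance.
# `weights` stands in for relevance scores from ChiMerge / mutual information.
import numpy as np

def weighted_fcm(X, n_clusters, weights, m=2.0, n_iter=100, eps=1e-6):
    n, d = X.shape
    rng = np.random.default_rng(0)
    U = rng.random((n_clusters, n))
    U /= U.sum(axis=0)                                    # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # weighted squared Euclidean distance between samples and centers
        diff = X[None, :, :] - centers[:, None, :]        # (c, n, d)
        dist = np.sqrt(np.maximum((diff ** 2 * weights).sum(axis=2), eps))
        # standard FCM membership update based on the weighted distances
        ratio = dist[:, None, :] / dist[None, :, :]       # (c, c, n)
        U_new = 1.0 / (ratio ** (2.0 / (m - 1))).sum(axis=1)
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return centers, U

X = np.random.default_rng(1).random((150, 4))
weights = np.array([0.5, 0.1, 0.3, 0.1])                  # illustrative relevance degrees
centers, U = weighted_fcm(X, n_clusters=3, weights=weights)
print("cluster assignment of first 5 samples:", U.argmax(axis=0)[:5])
```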

2009
Kudelka, M., V. Snásel, Z. Horak, and A. E. Hassanien, "From Web Pages to Web Communities", Annual International Workshop on DAtabases, TExts, Specifications and Objects, Spindleruv Mlyn, Czech Republic, April 15-17, 2009.

In this paper we look for a relationship between the intent of Web pages, their architecture, and the communities who take part in their usage and creation. From our point of view, a Web page is an entity carrying information about these communities, and this paper describes techniques that can be used to extract this information, as well as tools for analyzing it. Thanks to our approach, information about communities can be used in several ways. Finally, we present an experiment that illustrates the benefits of our approach.

Grosan, C., A. Abraham, and A.-E. Hassanien, "Designing resilient networks using multicriteria metaheuristics", Telecommunication Systems, vol. 40, issue 1-2, pp. 75-88, 2009.

The paper deals with the design of resilient networks that are fault tolerant against link failures. Usually, fault tolerance is achieved by providing backup paths, which are used in case of an edge failure on a primary path. We consider this task as a multiobjective optimization problem: to provide resilience in networks while minimizing the cost subject to a capacity constraint. We propose a stochastic approach, which can generate multiple Pareto solutions in a single run. The feasibility of the proposed method is illustrated by considering several network design problems using a single weighted average of objectives and a direct multiobjective optimization approach using the Pareto dominance concept.
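A minimal sketch of the Pareto-dominance filter used in the direct multiobjective approach, assuming two minimization objectives (say, cost and a resilience penalty) evaluated for a set of candidate network designs; the objective values below are made up for illustration.

```python
# Keep only Pareto-optimal (non-dominated) candidates under minimization.
import numpy as np

def dominates(a, b):
    """True if solution a dominates b: no worse in all objectives, better in at least one."""
    return np.all(a <= b) and np.any(a < b)

def pareto_front(objectives):
    """Return indices of non-dominated rows in an (n_solutions, n_objectives) array."""
    front = []
    for i, a in enumerate(objectives):
        if not any(dominates(b, a) for j, b in enumerate(objectives) if j != i):
            front.append(i)
    return front

# Illustrative (cost, resilience-penalty) pairs for candidate network designs.
candidates = np.array([[10.0, 0.8], [12.0, 0.3], [9.0, 0.9], [15.0, 0.2], [13.0, 0.85]])
print("Pareto-optimal candidates:", pareto_front(candidates))   # the last design is dominated
```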

2008
Hassanien, A. E., and A. Abraham, "Rough Morphology Hybrid Approach for Mammography Image Classification and Prediction", International Journal of Computational Intelligence and Applications, vol. 7, issue 1, pp. 17-42, 2008.

The objective of this research is to illustrate how rough sets can be successfully integrated with mathematical morphology to provide a more effective hybrid approach for resolving medical imaging problems. Hybridization of rough sets and mathematical morphology techniques has been applied to demonstrate its ability to improve the classification of breast cancer images into two outcomes: malignant and benign cancer. Algorithms based on mathematical morphology are first applied to enhance the contrast of the whole original image, to extract the region of interest (ROI), and to enhance the edges surrounding that region. Then, features characterizing the underlying texture of the ROI are extracted using the gray-level co-occurrence matrix. The rough set approach to attribute reduction and rule generation is then presented. Finally, rough morphology is designed to discriminate different ROIs and test whether they represent malignant or benign cancer. To evaluate the performance of the presented rough morphology approach, we tested it on different mammogram images. The experimental results illustrate that the overall performance in locating the optimal orientation offered by the proposed approach is high compared with other hybrid systems such as rough-neural and rough-fuzzy systems.
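A minimal sketch of the texture-feature step, assuming scikit-image's gray-level co-occurrence matrix utilities (named graycomatrix/graycoprops in recent releases); the random 8-bit patch stands in for an extracted mammogram ROI, and the chosen distances, angles, and properties are illustrative.

```python
# Gray-level co-occurrence matrix (GLCM) texture features for a region of interest.
# The random 8-bit ROI below is only a placeholder for a real mammogram ROI.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # greycomatrix/greycoprops before skimage 0.19

rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop).mean()         # average over the four orientations
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```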

Hassanien, A. E., M. E. Abdelhafez, and H. S. Own, "Rough Sets Data Analysis in Knowledge Discovery: A Case of Kuwaiti Diabetic Children Patients", Advances in Fuzzy Systems, vol. 2008, issue 1, pp. 13, 2008.

The main goal of this study is to investigate the relationship between psychosocial variables and diabetic child patients and to obtain a classifier function with which the patients can be classified on the basis of their assessed adherence level. Rough set theory is used to identify the most important attributes and to induce decision rules from 302 samples of Kuwaiti diabetic child patients aged 7–13 years. To increase the efficiency of the classification process, a rough sets with Boolean reasoning discretization algorithm is introduced to discretize the data; the rough set reduction technique is then applied to find all reducts of the data, which contain the minimal subset of attributes associated with a class label for classification. Finally, the rough set dependency rules are generated directly from all generated reducts. A rough confusion matrix is used to evaluate the performance of the predicted reducts and classes. A comparison has been made between the obtained results using rough sets and those of decision tree, neural network, and statistical discriminant analysis classifier algorithms. Rough sets show higher overall accuracy rates and generate more compact rules.

Xiao, K., S. H. Ho, and A. E. Hassanien, "Automatic Unsupervised Segmentation Methods for MRI Based on Modified Fuzzy C-Means", Fundamenta Informaticae, vol. 87, issue 3-4, pp. 465-481, 2008.
Own, H. S., and A. E. Hassanien, "Rough Wavelet Hybrid Image Classification Scheme", Journal of Convergence Information Technology, vol. 3, issue 4, pp. 65-75, 2008.

This paper introduces a new computer-aided classification system for the detection of prostate cancer in Transrectal Ultrasound (TRUS) images. To increase the efficiency of the computer-aided classification process, an intensity adjustment is applied first, based on the Pulse Coupled Neural Network (PCNN) with a median filter. This is followed by a PCNN-based segmentation algorithm to detect the boundary of the prostate image. Combining the adjustment and segmentation eliminates the PCNN's sensitivity to the settings of its various parameters, whose optimal selection can be difficult and can vary even for the same problem. Then, wavelet-based features are extracted and normalized, followed by a rough set analysis to discover the dependency between the attributes and to generate a set of reducts containing a minimal number of attributes. Finally, a rough confusion matrix is designed that contains information about actual and predicted classifications made by the classification system. Experimental results show that the introduced system is very successful and has high detection accuracy.
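A minimal sketch of one plausible reading of the wavelet feature step: subband-energy features from a two-level DWT computed with PyWavelets, followed by simple min-max normalization. The wavelet family, decomposition level, normalization, and the random stand-in image are illustrative assumptions, not the paper's exact choices.

```python
# Wavelet-based texture features: energy of each detail subband of a 2-level DWT,
# followed by min-max normalization. The random image replaces a real TRUS ROI.
import numpy as np
import pywt

rng = np.random.default_rng(0)
roi = rng.random((128, 128))

coeffs = pywt.wavedec2(roi, wavelet="db2", level=2)      # [cA2, (cH2, cV2, cD2), (cH1, cV1, cD1)]
energies = []
for detail_level in coeffs[1:]:
    for band in detail_level:                            # horizontal, vertical, diagonal details
        energies.append(np.sum(band ** 2) / band.size)

energies = np.array(energies)
normalized = (energies - energies.min()) / (energies.max() - energies.min())
print("normalized wavelet energy features:", np.round(normalized, 3))
```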

2012
Salama, M., Data Mining for Medical Informatics, Cairo, Cairo University, 2012.

The work presented in this thesis investigates the nature of real-life data, mainly in the medical field, and the problems conventional data mining techniques have in handling such data. Accordingly, a set of alternative techniques is proposed in this thesis to handle medical data in the three stages of the data mining process. In the first stage, preprocessing, an interval-based feature evaluation technique is proposed; it depends on the hypothesis that the smaller the overlap between the intervals of values observed for each class label, the more important the attribute (a brief sketch of this idea follows the abstract). This technique handles continuous data attributes without requiring discretization of the input, which is demonstrated by comparing its results with those of other attribute evaluation and selection techniques. Also in the preprocessing stage, the negative effect of applying a normalization algorithm before conventional PCA is investigated, along with how avoiding it enhances the resulting classification accuracy. Finally in the preprocessing stage, an experimental analysis shows the ability of the rough set methodology to successfully classify data without applying a feature reduction technique: the overall classification accuracy offered by the employed rough set approach is high compared with other machine learning techniques, including the Support Vector Machine, Hidden Naive Bayesian network, Bayesian network, and others.
In the machine learning stage, a frequent-pattern-based classification technique is proposed; it depends on detecting the variation of attributes among objects of the same class. Preprocessing of the data, such as standardization, normalization, discretization, or feature reduction, is not required by this technique, which improves running time and keeps the original data undistorted. Another contribution in the machine learning stage involves the support vector machine and fuzzy c-means clustering techniques: the Euclidean distance calculations are enhanced by applying fuzzy logic to them, using the ChiMerge feature evaluation technique to apply fuzzification at the level of features. These enhanced techniques are compared with classical data mining techniques, and the results show that the classical models suffer from low classification accuracy due to their dependence on assumptions that do not hold.
Finally, in the visualization stage, a technique is proposed to visualize continuous data using Formal Concept Analysis, avoiding the complications that result from scaling algorithms.
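A minimal sketch of the interval-overlap hypothesis from the preprocessing stage: score each continuous attribute by how much the value intervals of the classes overlap, with smaller overlap meaning higher importance. The scoring formula below is an illustrative interpretation of that hypothesis, not the thesis's exact definition.

```python
# Score continuous attributes by the overlap of their per-class value intervals:
# the smaller the overlap, the more discriminative the attribute is assumed to be.
# This is an illustrative interpretation of the interval-based evaluation idea.
import numpy as np

def interval_overlap_scores(X, y):
    scores = []
    for f in range(X.shape[1]):
        intervals = [(X[y == c, f].min(), X[y == c, f].max()) for c in np.unique(y)]
        lo = max(iv[0] for iv in intervals)              # start of the overlapped region
        hi = min(iv[1] for iv in intervals)              # end of the overlapped region
        total = max(iv[1] for iv in intervals) - min(iv[0] for iv in intervals)
        overlap = max(0.0, hi - lo) / total if total > 0 else 1.0
        scores.append(1.0 - overlap)                     # higher score = less overlap
    return np.array(scores)

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = rng.normal(size=(200, 3))
X[:, 0] += 3.0 * y                                       # feature 0 separates the classes well
print("importance scores:", np.round(interval_overlap_scores(X, y), 3))
```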

Zawbaa, H. M., and A. E. Hassanien, Automatic Soccer Video Summarization, Cairo, Cairo University, 2012.

This thesis presents an automatic soccer video summarization system using machine learning (ML) techniques. The proposed system is composed of five phases. In the pre-processing phase, the system segments the whole video stream into small video shots. Then, in the shot processing phase, it applies two types of classification (shot type classification and play/break classification) to the video shots resulting from the pre-processing phase. Afterwards, in the replay detection phase, the proposed system applies two machine learning algorithms, namely the support vector machine (SVM) and the artificial neural network (ANN), for emphasizing important segments marked by the appearance of the championship logo. In the excitement event detection phase, the proposed system uses both machine learning algorithms for detecting the scoreboard, which contains information about the score of the game. The proposed system also uses the k-means algorithm and the Hough line transform for detecting vertical goal posts, and a Gabor filter for detecting the goal net. Finally, in the event detection and summarization phase, the proposed system highlights the most important events during the match. Experiments on real soccer videos demonstrate encouraging results: event detection and summarization attained a recall of 94% and a precision of 97.3% on soccer match videos from five international soccer championships.
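A minimal sketch of the goal-post idea with OpenCV: edge detection followed by a probabilistic Hough line transform, keeping only near-vertical lines. The thresholds and the synthetic frame are illustrative assumptions, not the thesis's tuned parameters.

```python
# Detect near-vertical line candidates (goal posts) in a frame with Canny + Hough.
# The synthetic frame below just contains two white vertical bars as stand-in posts.
import cv2
import numpy as np

frame = np.zeros((360, 640), dtype=np.uint8)
cv2.rectangle(frame, (200, 60), (205, 300), 255, -1)      # left "post"
cv2.rectangle(frame, (430, 60), (435, 300), 255, -1)      # right "post"

edges = cv2.Canny(frame, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                        minLineLength=100, maxLineGap=10)

vertical = []
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if 80 <= angle <= 100:                             # keep lines close to vertical
            vertical.append((x1, y1, x2, y2))
print(f"{len(vertical)} near-vertical goal-post candidates found")
```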

2006
Hassanien, A. E., "A Copyright Protection using Watermarking Algorithm", Informatica, vol. 17 , issue 2, pp. 187-198, 2006. AbstractWebsite

In this paper, a digital watermarking algorithm for copyright protection is presented, based on embedding a digital watermark by modifying frequency coefficients in the discrete wavelet transform (DWT) domain. We embed the watermark into the detail wavelet coefficients of the original image with the use of a key. This key is randomly generated and is used to select the exact locations in the wavelet domain in which to embed the watermark. The corresponding watermark detection algorithm is also presented. A new metric that measures the objective quality of the image based on the detected watermark bits is introduced, and the original unmarked image is not required for watermark extraction. The proposed watermarking algorithm is robust to a variety of signal distortions, such as JPEG compression, image cropping, geometric transformations, and noise.
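A minimal sketch of the key-driven embedding idea with PyWavelets: a seeded key selects locations in a detail subband and the watermark bits additively perturb those coefficients. The strength parameter, subband choice, host image, and the simplified (non-blind) check at the end are illustrative assumptions rather than the paper's exact scheme, whose detector works without the unmarked image.

```python
# Embed watermark bits into DWT detail coefficients at key-selected locations.
# The host image, key, and embedding strength below are purely illustrative.
import numpy as np
import pywt

rng_img = np.random.default_rng(1)
host = rng_img.random((256, 256))                 # stand-in for the original image
watermark_bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
key, alpha = 42, 0.1                              # secret key and embedding strength

cA, (cH, cV, cD) = pywt.dwt2(host, "haar")

# The key seeds a generator that picks which diagonal-detail coefficients to modify.
picker = np.random.default_rng(key)
flat_idx = picker.choice(cD.size, size=watermark_bits.size, replace=False)
cD_marked = cD.copy()
cD_marked.flat[flat_idx] += alpha * (2 * watermark_bits - 1)   # bit 1 -> +alpha, bit 0 -> -alpha

watermarked = pywt.idwt2((cA, (cH, cV, cD_marked)), "haar")

# Simplified check (compares against the original coefficients for illustration only).
_, (_, _, cD_rx) = pywt.dwt2(watermarked, "haar")
recovered = (cD_rx.flat[flat_idx] - cD.flat[flat_idx] > 0).astype(int)
print("embedded bits:", watermark_bits, "recovered bits:", recovered)
```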

Hassanien, A. E., and D. Slezak, "Rough neural intelligent approach for image classification: A case of patients with suspected breast cancer", International Journal of Hybrid Intelligent Systems, vol. 3, issue 4, pp. 205-218, 2006.

The objective of this paper is to introduce a rough neural intelligent approach for rule generation and image classification. Hybridization of intelligent computing techniques has been applied to assess its ability and accuracy in classifying breast cancer images into two outcomes: malignant or benign cancer. Algorithms based on fuzzy image processing are first applied to enhance the contrast of the whole original image, to extract the region of interest, and to enhance the edges surrounding that region. We then extract features characterizing the underlying texture of the regions of interest using the gray-level co-occurrence matrix, and the rough set approach to attribute reduction and rule generation is presented. Finally, a rough neural network is designed to discriminate different regions of interest and test whether they represent malignant or benign cancer. The rough neural network is built from rough neurons, each of which can be viewed as a pair of sub-neurons corresponding to the lower and upper bounds. To evaluate the performance of the presented rough neural approach, we ran tests on different mammogram images. The experimental results show that the overall classification accuracy offered by the rough neural approach is high compared with other intelligent techniques.

2004
Hassanien, A. E., "Rough set approach for attribute reduction and rule generation: A case of patients with suspected breast cancer", Journal of the American Society for Information Science and Technology , vol. 55, issue 11, pp. 954-962 , 2004. AbstractWebsite

Rough set theory is a relatively new intelligent technique used in the discovery of data dependencies; it evaluates the importance of attributes, discovers the patterns of data, reduces all redundant objects and attributes, and seeks the minimum subset of attributes. Moreover, it is being used for the extraction of rules from databases. In this paper, we present a rough set approach to attribute reduction and generation of classification rules from a set of medical datasets. For this purpose, we first introduce a rough set reduction technique to find all reducts of the data that contain the minimal subset of attributes associated with a class label for classification. To evaluate the validity of the rules based on the approximation quality of the attributes, we introduce a statistical test to evaluate the significance of the rules. Experimental results from applying the rough set approach to the set of data samples are given and evaluated. In addition, the rough set classification accuracy is also compared to the well-known ID3 classifier algorithm. The study showed that the theory of rough sets is a useful tool for inductive learning and a valuable aid for building expert systems.
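A minimal sketch of the reduct idea on a toy decision table: the dependency of the decision on an attribute subset is the fraction of objects in the positive region (equivalence classes with a consistent decision), and a greedy search keeps adding the attribute that raises dependency most. This is a simplified illustration, not the paper's full reduct algorithm or its statistical rule test.

```python
# Greedy rough-set attribute reduction on a small symbolic decision table.
# dependency(B) = |positive region of B| / |U|; attributes are added greedily
# until the dependency of the full attribute set is reached.
from collections import defaultdict

def dependency(rows, decisions, attrs):
    groups = defaultdict(list)
    for row, d in zip(rows, decisions):
        groups[tuple(row[a] for a in attrs)].append(d)
    pos = sum(len(ds) for ds in groups.values() if len(set(ds)) == 1)
    return pos / len(rows)

def greedy_reduct(rows, decisions):
    all_attrs = list(range(len(rows[0])))
    target = dependency(rows, decisions, all_attrs)
    reduct = []
    while dependency(rows, decisions, reduct) < target:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(rows, decisions, reduct + [a]))
        reduct.append(best)
    return reduct

# Toy table: 3 symbolic condition attributes per object, plus a decision label.
rows = [("high", "yes", "a"), ("high", "no", "a"), ("low", "yes", "b"),
        ("low", "no", "b"), ("high", "yes", "b"), ("low", "no", "a")]
decisions = ["sick", "sick", "healthy", "healthy", "sick", "healthy"]
print("greedy reduct (attribute indices):", greedy_reduct(rows, decisions))
```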

Hassanien, A. E., J. M. H. Ali, and H. Nobuhara, "Detection of Spiculated Masses in Mammograms Based on Fuzzy Image Processing", Artificial Intelligence and Soft Computing - ICAISC 2004, 7th International Conference, Zakopane, Poland, vol. 3070/2004, pp. 1002-1007, June 7-11, 2004.

This paper presents an efficient technique for the detection of spiculated masses in digitized mammograms to assist the attending radiologist in making decisions. The presented technique consists of two stages: enhancement of spiculated masses followed by a segmentation process. The Fuzzy Histogram Hyperbolization (FHH) algorithm is first used to improve the quality of the digitized mammogram images. The Fuzzy C-Mean (FCM) algorithm is then applied to the preprocessed image to initialize the segmentation. Four measures for quantifying enhancement have been developed in this work, each based on the statistical information obtained from the labelled region of interest and a border area surrounding it. The methodology is based on the assumption that target and background areas are accurately specified. We have tested the algorithms on digitized mammograms obtained from the Mammographic Image Analysis Society (MIAS) digital mammogram database.
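A minimal sketch of the enhancement step under a common formulation of Fuzzy Histogram Hyperbolization (membership by min-max normalization, then a hyperbolic mapping with fuzzifier beta); the membership function, beta value, and the random low-contrast test image are illustrative assumptions, not the paper's exact settings.

```python
# Fuzzy Histogram Hyperbolization (FHH) contrast enhancement, common formulation:
#   mu(g) = (g - g_min) / (g_max - g_min)
#   g_out = (L - 1) / (exp(-1) - 1) * (exp(-mu(g)**beta) - 1)
import numpy as np

def fuzzy_histogram_hyperbolization(img, beta=0.8, levels=256):
    g = img.astype(float)
    mu = (g - g.min()) / (g.max() - g.min() + 1e-12)      # fuzzify gray levels to [0, 1]
    out = (levels - 1) / (np.exp(-1.0) - 1.0) * (np.exp(-mu ** beta) - 1.0)
    return out.astype(np.uint8)

rng = np.random.default_rng(0)
mammogram = rng.integers(90, 160, size=(128, 128), dtype=np.uint8)   # low-contrast stand-in
enhanced = fuzzy_histogram_hyperbolization(mammogram)
print("input range:", mammogram.min(), mammogram.max(),
      "-> enhanced range:", enhanced.min(), enhanced.max())
```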

2003
Hassanien, A. E., and J. M. H. Ali, "An Efficient Classification and Image Retrieval Algorithm Based on Rough Set Theory", Proceedings of the 5th International Conference on Enterprise Information Systems, Angers, France, April 22-26, 2003.
2012
Soliman, M. M., A. E. Hassanien, N. I. Ghali, and H. M. Onsi, "An adaptive Watermarking Approach for Medical Imaging Using Swarm Intelligent", International Journal of Smart Home (ISSN: 1975-4094), vol. 6, issue 1, pp. 37-45, 2012.

In this paper we present a secure patient medical image authentication scheme that enhances the security, confidentiality, and integrity of medical images transmitted through the Internet. The paper proposes a watermarking approach that invokes the particle swarm optimization (PSO) technique in adaptive quantization index modulation and singular value decomposition, in conjunction with the discrete wavelet transform (DWT) and the discrete cosine transform (DCT). The proposed approach improves robustness and watermarked image quality. The experimental results show that the proposed algorithm yields a watermark which is invisible to the human eye, robust against a wide variety of common attacks, and reliable enough for tracing colluders.

2008
Hassanien, A. E., A. Abraham, J. F. Peters, and G. Schaefer, "An overview of rough-hybrid approaches in image processing", IEEE International Conference on Fuzzy Systems (ISBN 978-1-4244-1818-3), Hong Kong, China, pp. 2135-2142, 1-6 June, 2008.

Rough set theory offers a novel approach to manage uncertainty that has been used for the discovery of data dependencies, importance of features, patterns in sample data, feature space dimensionality reduction, and the classification of objects. Consequently, rough sets have been successfully employed for various image processing tasks including image segmentation, enhancement and classification. Nevertheless, while rough sets on their own provide a powerful technique, it is often the combination with other computational intelligence techniques that results in a truly effective approach. In this paper we show how rough sets have been combined with various other methodologies such as neural networks, wavelets, mathematical morphology, fuzzy sets, genetic algorithms, Bayesian approaches, swarm optimization, and support vector machines in the image processing domain.

Al-Qaheri, H., S. Zamoon, A. E. Hassanien, and A. Abraham, "Rough Set Generating Prediction Rules for Stock Price Movement", The Second IEEE UKSIM European Symposium on Computer Modeling and Simulation, Liverpool, England, UK, pp. 111-116, 8-10 September, 2008.

This paper presents a rough set scheme for generating prediction rules for stock price movement. The scheme is able to extract knowledge in the form of rules from daily stock movements; these rules can then be used to guide investors on whether to buy, sell, or hold a stock. To increase the efficiency of the prediction process, a rough sets with Boolean reasoning discretization algorithm is used to discretize the data. The rough set reduction technique is applied to find all the reducts of the data. Finally, rough set dependency rules are generated directly from all generated reducts. A rough confusion matrix is used to evaluate the performance of the predicted reducts and classes. A comparison between the obtained results using rough sets and those of decision tree and neural network algorithms has been made. Rough sets show higher overall accuracy rates, reaching over 97%, and generate more compact rules.

El-Hosseini, M. A., A. E. Hassanien, A. Abraham, and H. Al-Qaheri, "Genetic Annealing Optimization: Design and Real World Applications", Eighth International Conference on Intelligent Systems Design and Applications, ISDA 2008, Kaohsiung, Taiwan, 26-28 November, 2008.

Both simulated annealing (SA) and genetic algorithms (GA) are stochastic, derivative-free optimization techniques. SA operates on one solution at a time, while the GA maintains a large population of solutions that are optimized simultaneously. Thus, the genetic algorithm takes advantage of the experience gained in past exploration of the solution space, whereas SA, operating on one solution at a time, has very little history to use in learning from past trials. SA has the ability to escape from local optima; it is even a global optimization technique. On the other hand, there is no guarantee that the GA will succeed in escaping from local minima, so it makes sense to hybridize the genetic algorithm and the simulated annealing technique. In this paper, a novel genetically annealed algorithm is proposed and tested on multidimensional and highly nonlinear cases: a fed-batch fermentor for penicillin production and an isothermal continuous stirred tank reactor (CSTR). It is evident from the results that the proposed algorithm gives good performance.
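A minimal sketch of the hybridization idea (not the paper's exact genetic annealing algorithm): a plain GA whose offspring replace their parents only if they improve or pass a simulated-annealing acceptance test under a cooling temperature, demonstrated on a generic nonlinear test function rather than the fermentor or CSTR models.

```python
# Hybrid GA + SA sketch: GA variation operators, but each child replaces its parent
# only if it is better or passes a Metropolis acceptance test at temperature T.
import numpy as np

def rastrigin(x):                                    # generic nonlinear test objective
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def genetic_annealing(dim=5, pop_size=30, generations=200, T0=5.0, cooling=0.97):
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5.12, 5.12, size=(pop_size, dim))
    fitness = np.array([rastrigin(ind) for ind in pop])
    T = T0
    for _ in range(generations):
        for i in range(pop_size):
            # uniform crossover with a random mate, then Gaussian mutation
            mate = pop[rng.integers(pop_size)]
            mask = rng.random(dim) < 0.5
            child = np.where(mask, pop[i], mate) + rng.normal(0, 0.3, dim)
            child = np.clip(child, -5.12, 5.12)
            f_child = rastrigin(child)
            # SA-style acceptance: always accept improvements, sometimes accept worse
            if f_child < fitness[i] or rng.random() < np.exp(-(f_child - fitness[i]) / T):
                pop[i], fitness[i] = child, f_child
        T *= cooling                                 # cool the temperature each generation
    best = fitness.argmin()
    return pop[best], fitness[best]

solution, value = genetic_annealing()
print("best objective value found:", round(value, 4))
```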

El-Hosseini, M. A., A. E. Hassanien, A. Abraham, and H. Al-Qaheri, "Cultural-Based Genetic Algorithm: Design and Real World Applications", Eighth International Conference on Intelligent Systems Design and Applications, ISDA 2008, Kaohsiung, Taiwan, pp. 488-493, 26-28 November, 2008.

Due to their excellent performance in solving combinatorial optimization problems, metaheuristic algorithms such as Genetic Algorithms (GA) [35], [18], [5], Simulated Annealing (SA) [34], [13], and Tabu Search (TS) make up another class of search methods that has been adopted to efficiently solve dynamic optimization problems. Most of these methods are confined to the population space, and solving nonlinear problems becomes quite difficult, especially when they are heavily constrained. They do not make full use of historical information and lack prediction about the search space. Besides the "genetic code" that individuals inherit from their ancestors, there is another component, called culture. In this paper, a novel culture-based GA algorithm is proposed and tested on multidimensional and highly nonlinear real-world applications.

Kudelka, M., V. Snásel, Z. Horak, and A. E. Hassanien, "Web Communities Defined by Web Page Content", IEEE/WIC/ACM International Conference on Web Intelligence and International Conference on Intelligent Agent Technology, Sydney, NSW, Australia, pp. 385-389, 9-12 December, 2008.

In this paper we look for a relationship between the intent of Web pages, their architecture, and the communities who take part in their usage and creation. For us, a Web page is an entity carrying information about these communities. Our paper describes techniques that can be used to extract this information, as well as tools for analyzing it. Thanks to our approach, information about communities can be used in several ways. Finally, we present an experiment which demonstrates the feasibility of our approach.

2011
Heba, T., E.-B. Nashwa, H. AboulElla, B. Yehia, and S. Vaclav, "Retinal Feature-Based Registration Schema", Informatics Engineering and Information Science, Communications in Computer and Information Science, vol. 252, pp. 26-36, Ostrava, Czech Republic, 7-9 July, 2011.

This paper presents a feature-based retinal image registration schema. A structural feature, namely the bifurcation structure, is used for the proposed feature-based registration schema. The bifurcation structure is composed of a master bifurcation point and its three connected neighbors. The characteristic vector of each bifurcation structure consists of the normalized branching angles and lengths, which are invariant against translation, rotation, scaling, and even modest distortion. The proposed schema is composed of five fundamental phases: pre-processing of the input retinal images, vascular network detection, noise removal, detection of bifurcation points in the vascular networks, and matching of bifurcation points in pairs of retinal images. The effectiveness of the proposed schema is demonstrated by experiments with 12 pairs of retinal images collected from clinical patients. The registration is carried out by optimizing a certain similarity function, namely the normalized correlation of images. It has been observed that the proposed schema achieves good performance accuracy.
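A minimal sketch of the characteristic vector described above: given a master bifurcation point and its three connected neighbors, compute branch lengths and branching angles and normalize them so the vector is insensitive to translation, rotation, and scaling. The exact normalization and the coordinates below are illustrative assumptions, not the paper's precise definition.

```python
# Characteristic vector of a bifurcation structure: a master bifurcation point and
# its three connected neighbors, described by normalized branch lengths and the
# angles between consecutive branches (invariant to translation, rotation, scale).
import numpy as np

def bifurcation_vector(master, neighbors):
    master = np.asarray(master, dtype=float)
    branches = np.asarray(neighbors, dtype=float) - master      # three branch vectors
    lengths = np.linalg.norm(branches, axis=1)
    angles = np.arctan2(branches[:, 1], branches[:, 0])
    order = np.argsort(angles)                                   # consistent branch ordering
    lengths, angles = lengths[order], angles[order]
    # angles between consecutive branches, normalized to sum to 1 (rotation invariant)
    gaps = np.diff(np.concatenate([angles, [angles[0] + 2 * np.pi]]))
    return np.concatenate([lengths / lengths.sum(),             # scale-invariant lengths
                           gaps / (2 * np.pi)])                  # normalized branching angles

master = (120.0, 85.0)
neighbors = [(140.0, 90.0), (110.0, 105.0), (112.0, 70.0)]       # illustrative coordinates
print("characteristic vector:", np.round(bifurcation_vector(master, neighbors), 3))
```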