Applicability of feed-forward and recurrent neural networks to Boolean function complexity modeling

Azam Beg, P. W. Chandana Prasad, Ajmal Beg

    Research output: Contribution to journal › Article › peer-review

    7 Citations (Scopus)


    In this paper, we present feed-forward neural network (FFNN) and recurrent neural network (RNN) models for predicting Boolean function complexity (BFC). To acquire training data for the neural networks (NNs), we conducted experiments on a large number of randomly generated single-output Boolean functions (BFs) and derived simulated graphs of the number of min-terms against the BFC for different numbers of variables. For NN model (NNM) development, we examined three data-transformation techniques for pre-processing the NN training and validation data. The trained NNMs are used to estimate the complexity of Boolean logic expressions with a given number of variables and sum-of-products (SOP) terms. Both FFNNs and RNNs were evaluated against the ISCAS benchmark results, predicting the BFC with correlations of 0.811 and 0.629 with the benchmarks, respectively.
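To make the abstract's setup concrete, the sketch below trains a small feed-forward network to map (number of variables, number of min-terms) to an estimated complexity value. This is a minimal illustration, not the paper's model: the training targets here are a synthetic placeholder relation, whereas the paper derives its targets from simulations of random Boolean functions, and the input scaling is only one assumed example of the data transformations the paper compares.

```python
# Sketch of the paper's idea: an FFNN that maps (variables, min-terms)
# to an estimated Boolean function complexity (BFC).
# NOTE: the data below is SYNTHETIC; the paper's targets come from
# simulations of random single-output Boolean functions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set, inputs scaled to [0, 1].
n_vars = rng.integers(2, 15, size=200)           # variables per function
n_minterms = rng.integers(1, 1024, size=200)     # SOP (min-term) count
X = np.column_stack([n_vars / 15.0, n_minterms / 1024.0])
# Placeholder target: an assumed monotone relation, NOT the paper's data.
y = np.sqrt(X[:, 0] * X[:, 1])

# One hidden layer, trained with plain batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5

def forward(X):
    h = np.tanh(X @ W1 + b1)      # hidden-layer activations
    return h, (h @ W2 + b2).ravel()

_, pred = forward(X)
loss0 = np.mean((pred - y) ** 2)  # MSE before training

for _ in range(500):
    h, pred = forward(X)
    grad = (pred - y)[:, None] / len(y)    # d(loss)/d(output), up to a constant
    gh = (grad @ W2.T) * (1 - h ** 2)      # back-prop through tanh
    W2 -= lr * (h.T @ grad); b2 -= lr * grad.sum(0)
    W1 -= lr * (X.T @ gh);   b1 -= lr * gh.sum(0)

_, pred = forward(X)
loss1 = np.mean((pred - y) ** 2)  # MSE after training
print(f"MSE before: {loss0:.4f}  after: {loss1:.4f}")
```

In the paper, a network like this (and an RNN counterpart) would instead be fit to simulated BFC measurements, then queried with a new (variables, SOP terms) pair to predict complexity.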

    Original language: English
    Pages (from-to): 2436-2443
    Number of pages: 8
    Journal: Expert Systems with Applications
    Issue number: 4
    Publication status: Published - May 2008


    Keywords

    • Bias
    • Biological sequence analysis
    • Classifier design
    • Feed-forward neural network
    • Machine learning
    • Motif
    • Pattern recognition
    • Recurrent neural network
    • Sub-cellular localization

    ASJC Scopus subject areas

    • General Engineering
    • Computer Science Applications
    • Artificial Intelligence


