Investigating the performance of MLP classifiers where limited training data are available for some classes

CR Parikh, MJ Pont, Yuhua Li, NB Jones

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    The standard implementation of the back-propagation training algorithm for multi-layer Perceptron (MLP) neural networks assumes that equal numbers of samples are available for training each of the required classes. Where limited training data are available for one (or more) classes, sub-optimal performance may be obtained. We demonstrated in a previous study [Parikh et al., 1999. Proceedings of Condition Monitoring 1999, Swansea, UK] that, where unequal training class sizes cannot be avoided, the performance of the classifier may be substantially improved by duplicating the available patterns in the smaller class. In this study, we investigate whether the addition of random noise to the 'duplicated' training patterns further improves classification performance. We conclude that the addition of noise does not give a consistent improvement in performance.
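
    The paper itself contains no code, but the class-balancing scheme the abstract describes is easy to sketch. Below is a minimal Python illustration (the function name, the Gaussian noise model, and all parameter values are assumptions for illustration, not the authors' implementation): minority-class patterns are duplicated until every class matches the largest one, and the duplicates are optionally jittered with random noise.

        import numpy as np

        def balance_by_duplication(X, y, noise_std=0.0, seed=None):
            # Oversample each under-represented class by duplicating randomly
            # chosen patterns until every class matches the largest class;
            # optionally jitter the duplicates with zero-mean Gaussian noise
            # (noise_std=0.0 reproduces plain duplication).
            rng = np.random.default_rng(seed)
            classes, counts = np.unique(y, return_counts=True)
            target = counts.max()
            X_parts, y_parts = [X.astype(float)], [y]
            for cls, count in zip(classes, counts):
                deficit = target - count
                if deficit == 0:
                    continue
                idx = rng.choice(np.flatnonzero(y == cls), size=deficit, replace=True)
                dup = X[idx].astype(float)
                if noise_std > 0.0:
                    dup += rng.normal(0.0, noise_std, size=dup.shape)
                X_parts.append(dup)
                y_parts.append(np.full(deficit, cls, dtype=y.dtype))
            return np.concatenate(X_parts), np.concatenate(y_parts)

        # Example: 100 patterns for a 'healthy' class, only 10 for a 'faulty' one.
        X = np.vstack([np.random.randn(100, 4), np.random.randn(10, 4) + 3.0])
        y = np.array([0] * 100 + [1] * 10)
        X_dup, y_dup = balance_by_duplication(X, y)                 # duplication only
        X_jit, y_jit = balance_by_duplication(X, y, noise_std=0.1)  # duplication + noise
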
    Language: English
    Title of host publication: Unknown Host Publication
    Pages: 22-27
    Number of pages: 6
    Publication status: Published - 2000
    Event: SOFT COMPUTING TECHNIQUES AND APPLICATIONS
    Duration: 1 Jan 2000 → …

    Publication series

    Name: ADVANCES IN SOFT COMPUTING

    Conference

    Conference: SOFT COMPUTING TECHNIQUES AND APPLICATIONS
    Period: 1/01/00 → …

    Keywords

    • Multi-layer Perceptron
    • training algorithm
    • condition monitoring
    • fault diagnosis

    Cite this

    Parikh, C. R., Pont, M. J., Li, Y., & Jones, N. B. (2000). Investigating the performance of MLP classifiers where limited training data are available for some classes. In Unknown Host Publication (pp. 22-27). (Advances in Soft Computing).
    Note: Workshop 99 on Recent Advances in Soft Computing, Leicester, England, 1-2 July 1999.