
How to make sigma-pi neural networks perform perfectly on regular training sets

Fast facts

  • Internal authorship

  • Publication year

    • 1994
  • Anthology

    How to make sigma-pi neural networks perform perfectly on regular training sets (7)

  • Journal

    Neural Networks (8)

  • Organizational unit

  • Subjects

    • Applied computer science
  • Publication format

    Journal article (Article)

Quote

Lenze, Burkhard 1994. How to make sigma-pi neural networks perform perfectly on regular training sets. Neural Networks 7, 8, 1285-1293.

Content

In this paper, we show how to design three-layer feedforward neural networks with sigma-pi units in the hidden layer so that they perform perfectly on regular training sets. We obtain real-time design schemes based on massively parallel sampling and induced by so-called hyperbolic cardinal translation-type interpolation operators. The real-time nature of our strategy is due to the fact that, in neural network terms, our approach is nothing other than a very general and efficient one-shot learning scheme. Moreover, because of the special hyperbolic structure of our sigma-pi units, we avoid the dramatic increase in the number of parameters and weights that generally occurs in higher-order networks. The final networks are of manageable complexity and may be applied to multigroup discriminant problems, pattern recognition, and image processing. In particular, the XOR problem and a special multigroup discriminant problem are discussed at the end of the paper.
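For illustration only, the following Python sketch shows the general sigma-pi idea behind the XOR example mentioned in the abstract: a single unit that forms a weighted sum of products of its inputs can reproduce XOR exactly. The monomials and weights below are hand-picked for this sketch and are not the hyperbolic cardinal translation-type interpolation operators constructed in the paper.

    from itertools import product

    def sigma_pi_unit(x, terms):
        # A sigma-pi unit: sum_j w_j * prod_{i in S_j} x[i],
        # where terms is a list of (weight, index_list) pairs.
        total = 0.0
        for weight, indices in terms:
            monomial = 1.0
            for i in indices:
                monomial *= x[i]
            total += weight * monomial
        return total

    # On {0, 1}^2, XOR(x1, x2) = x1 + x2 - 2*x1*x2, so a single
    # second-order product term suffices; weights chosen by hand.
    xor_terms = [(1.0, [0]), (1.0, [1]), (-2.0, [0, 1])]

    for x in product([0, 1], repeat=2):
        print(x, "->", sigma_pi_unit(x, xor_terms))

The networks constructed in the paper place such product-forming units in the hidden layer and determine their weights in one shot from the training samples, rather than by hand as in this toy example.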

