An invariants based architecture for combining small and large data sets in neural networks

Open access

Abstract

We present a novel architecture for an AI system that allows a priori knowledge to be combined with deep learning. In traditional neural networks, all available data is pooled at the input layer. Our alternative neural network is constructed so that partial representations (invariants) are learned in the intermediate layers, which can then be combined with a priori knowledge or with other predictive analyses of the same data. This makes learning more efficient and therefore reduces the size of the required training datasets. In addition, because this architecture allows the inclusion of a priori knowledge and interpretable predictive models, the interpretability of the entire system increases, while the data can still be used in a black-box neural network. Our system makes use of networks of neurons rather than single neurons to enable the representation of approximations (invariants) of the output.
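
The abstract describes the architecture only at a high level. As a rough illustration of the general idea (not the authors' implementation), the sketch below builds a small PyTorch model in which an intermediate sub-network learns a compact invariant representation from the raw data, and that representation is then concatenated with a priori knowledge features before the output layer. All names, layer sizes, and the choice of PyTorch are assumptions made purely for illustration.

# Illustrative sketch (assumed design, not the published architecture):
# an intermediate sub-network produces a compact "invariant" representation
# that is combined with a priori knowledge features before the final layer.
import torch
import torch.nn as nn

class InvariantCombiner(nn.Module):
    def __init__(self, n_raw: int, n_prior: int, n_invariant: int = 8):
        super().__init__()
        # Sub-network that learns a partial representation (invariant)
        # from the raw input data.
        self.invariant_net = nn.Sequential(
            nn.Linear(n_raw, 32),
            nn.ReLU(),
            nn.Linear(32, n_invariant),
        )
        # Head that combines the learned invariant with a priori knowledge
        # (e.g. the output of an interpretable statistical or physical model).
        self.head = nn.Sequential(
            nn.Linear(n_invariant + n_prior, 16),
            nn.ReLU(),
            nn.Linear(16, 1),
        )

    def forward(self, raw: torch.Tensor, prior: torch.Tensor) -> torch.Tensor:
        z = self.invariant_net(raw)                      # learned invariant
        return self.head(torch.cat([z, prior], dim=-1))  # combine with prior knowledge

# Usage with dummy data: 10 raw input features, 3 a priori knowledge features.
model = InvariantCombiner(n_raw=10, n_prior=3)
y_hat = model(torch.randn(4, 10), torch.randn(4, 3))
print(y_hat.shape)  # torch.Size([4, 1])

Because the prior-knowledge features bypass the learned part of the network, they remain directly inspectable, which is the interpretability benefit the abstract refers to.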

Organisation: Hogeschool Utrecht
Department: Kenniscentrum Digital Business & Media
Research group: Artificial Intelligence
Published in: Proceedings of BNAIC/BeneLearn 2021, BnL, Luxembourg, pages 748-749
Date: 2021-11-10
Type: Conference contribution
ISBN: 0-2799-2527-X
Language: English
