Flexible model composition in machine learning and its implementation in MLJ

dc.contributor.author Blaom, Anthony D
dc.contributor.author Vollmer, Sebastian J
dc.date.accessioned 2021-07-07T21:24:54Z
dc.date.available 2021-07-07T21:24:54Z
dc.identifier.uri https://hdl.handle.net/2292/55476
dc.description.abstract A graph-based protocol called "learning networks", which combines assorted machine learning models into meta-models, is described. Learning networks are shown to overcome several limitations of model composition as implemented in the dominant machine learning platforms. After illustrating the protocol in simple examples, a concise syntax for specifying a learning network, implemented in the MLJ framework, is presented. Using the syntax, it is shown that learning networks are sufficiently flexible to include Wolpert's model stacking, with out-of-sample predictions for the base learners.
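
As a rough illustration of the learning-network syntax the abstract refers to, the sketch below composes a standardizer with a ridge regressor in MLJ. The toy data, the Standardizer/RidgeRegressor pairing, and the MLJLinearModels package are illustrative assumptions, not details taken from this record.

using MLJ

# illustrative toy data (not from the paper)
X = (x1 = rand(100), x2 = rand(100))
y = rand(100)

Xs = source(X)    # source node wrapping the input features
ys = source(y)    # source node wrapping the target

# node delivering standardized features
stand = machine(Standardizer(), Xs)
W = transform(stand, Xs)

# node delivering ridge predictions from the standardized features
RidgeRegressor = @load RidgeRegressor pkg=MLJLinearModels
rgs = machine(RidgeRegressor(), W, ys)
yhat = predict(rgs, W)

fit!(yhat)    # trains every machine in the network
yhat()        # predictions on the training features; yhat(Xnew) predicts on new data

Operations applied to machines bound to nodes return new nodes, so the composite is a directed acyclic graph that is trained lazily by calling fit! on its terminal node.
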
dc.rights Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated. Previously published items are made available in accordance with the copyright policy of the publisher.
dc.rights.uri https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.subject cs.LG
dc.subject I.2.6
dc.title Flexible model composition in machine learning and its implementation in MLJ
dc.type Journal Article
dc.date.updated 2021-06-08T23:01:00Z
dc.rights.holder Copyright: The authors
pubs.author-url http://arxiv.org/abs/2012.15505v1
dc.rights.accessrights http://purl.org/eprint/accessRights/OpenAccess
pubs.elements-id 833886