Backpropagation through structure

Backpropagation through structure (BPTS) is a gradient-based technique for training recursive neural networks, a class of models that generalizes recurrent neural networks from sequences to structured inputs such as trees. The technique is described in detail in a 1996 paper by Christoph Goller and Andreas Küchler.[1]

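The following is a minimal sketch of the idea in Python with NumPy. The tanh composition function, the squared-error loss, the binary-tree encoding with fixed leaf vectors, and the names W, b and d are illustrative assumptions, not details taken from the Goller and Küchler paper. Each internal node combines its two children with a shared weight matrix, and the backward pass pushes the loss gradient from the root back down the tree, accumulating gradients for the shared parameters at every node, much as backpropagation through time accumulates them at every step of a chain.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 4                                          # size of every node representation (assumed)
    W = rng.normal(scale=0.1, size=(d, 2 * d))     # composition weights shared by all nodes
    b = np.zeros(d)

    def forward(node):
        """Post-order pass: a leaf is a fixed vector; an internal node (left, right)
        computes h = tanh(W [h_left; h_right] + b) from its children."""
        if isinstance(node, np.ndarray):
            return {"h": node, "leaf": True}
        left, right = forward(node[0]), forward(node[1])
        z = np.concatenate([left["h"], right["h"]])
        h = np.tanh(W @ z + b)
        return {"h": h, "z": z, "left": left, "right": right, "leaf": False}

    def backward(cell, grad_h, grads):
        """Backpropagation through structure: split the incoming gradient between the
        two children and recurse, accumulating dL/dW and dL/db over every node."""
        if cell["leaf"]:
            return                                 # leaves are fixed inputs in this sketch
        da = grad_h * (1.0 - cell["h"] ** 2)       # gradient through tanh
        grads["W"] += np.outer(da, cell["z"])
        grads["b"] += da
        grad_z = W.T @ da                          # gradient w.r.t. the concatenated children
        backward(cell["left"], grad_z[:d], grads)
        backward(cell["right"], grad_z[d:], grads)

    # Toy usage: the tree ((x1, x2), x3) with a squared-error loss at the root.
    x1, x2, x3 = (rng.normal(size=d) for _ in range(3))
    root = forward(((x1, x2), x3))
    target = np.ones(d)
    loss = 0.5 * np.sum((root["h"] - target) ** 2)
    grads = {"W": np.zeros_like(W), "b": np.zeros_like(b)}
    backward(root, root["h"] - target, grads)      # dL/dh at the root
    W -= 0.1 * grads["W"]                          # one gradient-descent step
    b -= 0.1 * grads["b"]
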
References

  1. ^ Goller, Christoph; Küchler, Andreas (1996). "Learning Task-Dependent Distributed Representations by Backpropagation Through Structure". CiteSeerX 10.1.1.49.1968.