Differentiable programming: Difference between revisions

From Wikipedia, the free encyclopedia

Revision as of 03:46, 15 November 2022

Differentiable programming is a programming paradigm in which a numeric computer program can be differentiated throughout via automatic differentiation.[1][2][3][4][5] This allows for gradient-based optimization of parameters in the program, often via gradient descent, as well as other learning approaches that are based on higher-order derivative information. Differentiable programming has found use in a wide variety of areas, particularly scientific computing and artificial intelligence.[5] One of the early proposals to adopt such a framework in a systematic fashion to improve upon learning algorithms was made by the Advanced Concepts Team at the European Space Agency in early 2016.[6]
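The core mechanism, automatic differentiation applied throughout an ordinary numeric program, can be sketched in a few lines. The sketch below is illustrative rather than taken from any cited framework: a minimal forward-mode implementation using dual numbers, driving a short run of gradient descent.

```python
# Illustrative sketch (not from any cited framework): forward-mode
# automatic differentiation with dual numbers, used for gradient descent.

class Dual:
    """A number carrying its value and its derivative (a 'dual number')."""
    def __init__(self, val, deriv=0.0):
        self.val, self.deriv = val, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.deriv + other.deriv)
    __radd__ = __add__

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.deriv - other.deriv)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.deriv + self.deriv * other.val)
    __rmul__ = __mul__

def grad(f):
    """Return df/dx, evaluated by running f on a dual number with deriv 1."""
    return lambda x: f(Dual(x, 1.0)).deriv

# Any numeric program built from the overloaded operations is differentiable:
f = lambda x: x * x - 4 * x + 3        # minimum at x = 2

df = grad(f)
x = 0.0
for _ in range(100):                   # plain gradient descent
    x -= 0.1 * df(x)
print(round(x, 3))                     # converges to 2.0
```

Here the program itself (`f`) is ordinary arithmetic; differentiation falls out of running it on values that track derivatives alongside results.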

Approaches

Most differentiable programming frameworks work by constructing a graph containing the control flow and data structures in the program.[7] Attempts generally fall into two groups:

  • Static, compiled graph-based approaches such as TensorFlow,[note 1] Theano, and MXNet. They tend to allow for good compiler optimization and easier scaling to large systems, but their static nature limits interactivity and the types of programs that can be created easily (e.g. those involving loops or recursion), as well as making it harder for users to reason effectively about their programs.[7] A proof-of-concept compiler toolchain called Myia uses a subset of Python as a front end and supports higher-order functions, recursion, and higher-order derivatives.[8][9][10]

  • Operator-overloading, dynamic graph-based approaches such as PyTorch and NumPy's autograd package. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they introduce interpreter overhead (particularly when composing many small operations), scale less well, and benefit less from compiler optimization.[7]
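The static-graph style can be illustrated with a toy sketch (hypothetical code, not any framework's actual API): the program is first recorded as a fixed expression graph, and both evaluation and a symbolic derivative graph are derived from it before any numbers flow, which is what enables ahead-of-time optimization.

```python
# Hypothetical sketch of the static-graph idea: the program is recorded
# as a fixed expression graph, which can then be optimized, symbolically
# differentiated, and executed many times.

class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value
    def __add__(self, other): return Node("add", (self, other))
    def __mul__(self, other): return Node("mul", (self, other))

def evaluate(node, env):
    """Run the recorded graph on concrete inputs."""
    if node.op == "input":
        return env[node.value]
    if node.op == "const":
        return node.value
    a, b = (evaluate(i, env) for i in node.inputs)
    return a + b if node.op == "add" else a * b

def gradient(node, wrt):
    """Symbolically build the derivative graph d(node)/d(wrt)."""
    if node is wrt:
        return Node("const", value=1.0)
    if node.op in ("input", "const"):
        return Node("const", value=0.0)
    a, b = node.inputs
    da, db = gradient(a, wrt), gradient(b, wrt)
    if node.op == "add":
        return Node("add", (da, db))
    # product rule for "mul"
    return Node("add", (Node("mul", (da, b)), Node("mul", (a, db))))

# Build the graph once (no numbers flow yet), in the spirit of a
# TensorFlow 1 style session:
x = Node("input", value="x")
y = x * x + x          # y = x^2 + x
dy = gradient(y, x)    # derivative graph, built ahead of execution

print(evaluate(y, {"x": 3.0}), evaluate(dy, {"x": 3.0}))   # 12.0 7.0
```

Because the whole graph exists before execution, a real framework can fuse operations or parallelize it; the trade-off, as noted above, is that host-language control flow is not visible to the graph.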

A limitation of earlier approaches is that they can only differentiate code written in a manner suited to the framework, limiting their interoperability with other programs. Newer approaches resolve this issue by constructing the graph from the language's syntax or intermediate representation (IR), allowing arbitrary code to be differentiated.[7][9]
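A compiler-level IR transformation is hard to show briefly, but its user-visible effect, that ordinary host-language control flow becomes differentiable as written, can be sketched with a small reverse-mode "tape". The code below is a dynamic, operator-overloading stand-in for that idea, not an IR transformation itself, and every name in it is illustrative.

```python
# Illustrative stand-in only: IR-level approaches rewrite the compiler's
# intermediate representation, but the visible effect -- plain loops and
# branches become differentiable -- can be sketched with a reverse-mode tape.

class Var:
    def __init__(self, val, tape=None):
        self.val, self.grad = val, 0.0
        self.tape = tape if tape is not None else []
    def _lift(self, other):
        return other if isinstance(other, Var) else Var(other, self.tape)
    def __mul__(self, other):
        other = self._lift(other)
        out = Var(self.val * other.val, self.tape)
        # record how to push gradients back through this multiplication
        self.tape.append(lambda: (
            setattr(self, "grad", self.grad + other.val * out.grad),
            setattr(other, "grad", other.grad + self.val * out.grad)))
        return out
    __rmul__ = __mul__

def grad(f, x0):
    x = Var(x0)
    out = f(x)
    out.grad = 1.0
    for step in reversed(x.tape):   # replay the tape backwards
        step()
    return x.grad

# Ordinary host-language control flow is differentiated as written:
def power(x, n):
    y = 1.0
    for _ in range(n):   # a plain Python loop, not a graph primitive
        y = y * x
    return y

print(grad(lambda x: power(x, 5), 2.0))   # d(x^5)/dx at x=2 -> 80.0
```

Tools that operate on syntax or IR achieve the same effect once at compile time instead of re-recording a tape on every run.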

Applications

Differentiable programming has been applied in areas such as combining deep learning with physics engines in robotics,[12] solving electronic structure problems with differentiable density functional theory,[13] differentiable ray tracing,[14] image processing,[15] and probabilistic programming.[5]
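A toy instance of the differentiable-physics idea (illustrative only, not drawn from any cited paper): because gradients flow through a numerically integrated simulation, a simulation parameter, here an initial velocity, can be fitted by gradient descent.

```python
# Illustrative only (not from any cited paper): differentiating through a
# Euler-integrated simulation so the initial velocity can be fit to a target.

class Dual:
    """Forward-mode dual number: value plus derivative."""
    def __init__(self, val, deriv=0.0):
        self.val, self.deriv = val, deriv
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.deriv + o.deriv)
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.deriv - o.deriv)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.deriv + self.deriv * o.val)

def simulate_height(v0, steps=100, dt=0.01, g=9.81):
    """Height after 1 s of Euler integration, starting at velocity v0."""
    h, v = Dual(0.0), v0
    for _ in range(steps):
        h = h + v * dt            # gradients flow through every step
        v = v - Dual(g * dt)
    return h

# Fit v0 so the simulated height after 1 s equals 5 m:
target, v0 = 5.0, 0.0
for _ in range(100):
    h = simulate_height(Dual(v0, 1.0))
    loss_grad = 2.0 * (h.val - target) * h.deriv   # d/dv0 of (h - target)^2
    v0 -= 0.1 * loss_grad
print(round(v0, 2))   # launch velocity that reaches the 5 m target
```

The same pattern, with a richer simulator in place of this toy integrator, is what lets physics engines be trained jointly with neural networks.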

See also

Notes

  1. ^ TensorFlow 1 uses the static graph approach, whereas TensorFlow 2 uses the dynamic graph approach by default.

References

  1. ^ Izzo, Dario; Biscani, Francesco; Mereta, Alessio (2017). "Differentiable genetic programming". European Conference on Genetic Programming (EuroGP). Lecture Notes in Computer Science. 18: 35–51. arXiv:1611.04766. doi:10.1007/978-3-319-55696-3_3. ISBN 978-3-319-55695-6. S2CID 17786263.
  2. ^ Baydin, Atilim Gunes; Pearlmutter, Barak; Radul, Alexey Andreyevich; Siskind, Jeffrey (2018). "Automatic differentiation in machine learning: a survey". Journal of Machine Learning Research. 18: 1–43.
  3. ^ Wang, Fei; Decker, James; Wu, Xilun; Essertel, Gregory; Rompf, Tiark (2018), Bengio, S.; Wallach, H.; Larochelle, H.; Grauman, K. (eds.), "Backpropagation with Callbacks: Foundations for Efficient and Expressive Differentiable Programming" (PDF), Advances in Neural Information Processing Systems 31, Curran Associates, Inc., pp. 10201–10212, retrieved 2019-02-13
  4. ^ Innes, Mike (2018). "On Machine Learning and Programming Languages" (PDF). SysML Conference 2018.
  5. ^ a b c d Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Shah, Viral B.; Tebbutt, Will (2019), ∂P: A Differentiable Programming System to Bridge Machine Learning and Scientific Computing, arXiv:1907.07587
  6. ^ "Differential Intelligence". October 2016. Retrieved 2022-10-19.
  7. ^ a b c d Innes, Michael; Saba, Elliot; Fischer, Keno; Gandhi, Dhairya; Rudilosso, Marco Concetto; Joy, Neethu Mariya; Karmali, Tejan; Pal, Avik; Shah, Viral (2018-10-31). "Fashionable Modelling with Flux". arXiv:1811.01457 [cs.PL].
  8. ^ Merriënboer, Bart van; Breuleux, Olivier; Bergeron, Arnaud; Lamblin, Pascal (3 December 2018). "Automatic differentiation in ML: where we are and where we should be going". Proceedings of the 32nd International Conference on Neural Information Processing Systems. 31: 8771–8781.
  9. ^ a b c "Automatic Differentiation in Myia" (PDF). Retrieved 2019-06-24.
  10. ^ a b "TensorFlow: Static Graphs". Retrieved 2019-03-04.
  11. ^ Innes, Michael (2018-10-18). "Don't Unroll Adjoint: Differentiating SSA-Form Programs". arXiv:1810.07951 [cs.PL].
  12. ^ Degrave, Jonas; Hermans, Michiel; Dambre, Joni; wyffels, Francis (2016-11-05). "A Differentiable Physics Engine for Deep Learning in Robotics". arXiv:1611.01652 [cs.NE].
  13. ^ Li, Li; Hoyer, Stephan; Pederson, Ryan; Sun, Ruoxi; Cubuk, Ekin D.; Riley, Patrick; Burke, Kieron (2021). "Kohn-Sham Equations as Regularizer: Building Prior Knowledge into Machine-Learned Physics". Physical Review Letters. 126 (3): 036401. arXiv:2009.08551. Bibcode:2021PhRvL.126c6401L. doi:10.1103/PhysRevLett.126.036401. PMID 33543980.
  14. ^ Li, Tzu-Mao; Aittala, Miika; Durand, Frédo; Lehtinen, Jaakko (2018). "Differentiable Monte Carlo Ray Tracing through Edge Sampling". ACM Transactions on Graphics. 37 (6): 222:1–11. doi:10.1145/3272127.3275109. S2CID 52839714.
  15. ^ Li, Tzu-Mao; Gharbi, Michaël; Adams, Andrew; Durand, Frédo; Ragan-Kelley, Jonathan (August 2018). "Differentiable Programming for Image Processing and Deep Learning in Halide". ACM Transactions on Graphics. 37 (4): 139:1–13. doi:10.1145/3197517.3201383. S2CID 46927588.