Talk:Types of artificial neural networks
This article has not yet been rated on Wikipedia's content assessment scale.
One-shot associative memories don't require parallel processing
It is stated in the article that the one-shot associative memory type network "however requires parallel processing". This cannot be correct, since any computation that is performed in parallel can be serialized, unless some computation is required to run at a speed that could not be attained without parallelization. That, however, is dubious, and I don't see what the sentence is actually trying to say. To me, it only seems like an advantage that the algorithm can actually be parallelized (the sketch after this thread illustrates the serialization point). —Kri (talk) 14:21, 4 October 2014 (UTC)
- I see now that "it" was supposed to refer to real-time pattern recognition and high scalability, while my initial interpretation was that it referred to the one-shot associative memory itself. I changed "it" to "this" and hope that will resolve the ambiguity. —Kri (talk) 10:08, 14 October 2014 (UTC)
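A minimal sketch of the serialization argument above, assuming a plain Hebbian outer-product associative memory (chosen only for illustration; it is not necessarily the exact model the article section describes). The one-shot store and the recall can each be computed either vectorized ("in parallel") or element by element, with identical results, so parallelism affects speed only, not what can be computed:

```python
# Sketch under the assumption of a Hebbian outer-product associative memory.
# It shows that the "parallel" (vectorized) one-shot weight update and an
# explicitly serialized loop produce the same weights and the same recall.
import numpy as np

rng = np.random.default_rng(0)
key = rng.choice([-1.0, 1.0], size=16)     # input pattern
value = rng.choice([-1.0, 1.0], size=16)   # pattern stored in one shot

# "Parallel" (vectorized) one-shot store: a single outer product.
W_parallel = np.outer(value, key)

# The same computation serialized, one weight at a time.
W_serial = np.zeros_like(W_parallel)
for i in range(W_serial.shape[0]):
    for j in range(W_serial.shape[1]):
        W_serial[i, j] = value[i] * key[j]

assert np.array_equal(W_parallel, W_serial)

# Recall: the serially built weights retrieve the stored value from the key.
recalled = np.sign(W_serial @ key)
assert np.array_equal(recalled, value)
```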
Section "How RBF networks work" copied from external page
In this edit, content was added that seems to have been copied from this external page. Did the author of that page give his/her permission for the text to be copied into Wikipedia? Otherwise, this could be a copyright violation. Do we need to handle this in any way? —Kri (talk) 10:48, 14 October 2014 (UTC)
Reconciliation with ANN
Artificial Neural Networks#Types uses a different outline from this page. Neither article is properly sourced. I am removing the details from ANN so that we have a single source of truth. I would appreciate an expert's eye on the results. Generally, I am using this article's outline while adding elements that are in ANN but not in this article. Lfstevens (talk) 16:51, 18 June 2017 (UTC)
External links modified
Hello fellow Wikipedians,
I have just modified 2 external links on Types of artificial neural networks. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
- Added archive https://web.archive.org/web/20101218121158/http://herselfsai.com/2007/03/probabilistic-neural-networks.html to http://herselfsai.com/2007/03/probabilistic-neural-networks.html
- Added archive https://web.archive.org/web/20120131053940/http://www.psi.toronto.edu/~vincent/research/presentations/PNN.pdf to http://www.psi.toronto.edu/~vincent/research/presentations/PNN.pdf
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—InternetArchiveBot (Report bug) 18:44, 14 September 2017 (UTC)
Closed-form continuous-time neural networks?
I don't understand this well enough to add to the article, but I just read about the concept here:
- MIT solved a century-old differential equation to break 'liquid' AI's computational bottleneck - Engadget, Nov. 15, 2022
- Closed-form continuous-time neural networks - Nature (Machine Intelligence), Nov. 15, 2022
Cheers! 98.155.8.5 (talk) 03:58, 19 November 2022 (UTC)