Talk:BERT (language model): Difference between revisions
Edit summary: I'm no expert, but I noticed it because it's very popular. It is clearly at least C-Class if not B

Line 1 changed: {{WikiProject banner shell|class=}} → {{WikiProject banner shell|class=C}}; the {{WikiProject Google}} and {{WikiProject Computer science}} banners are unchanged.
Revision as of 22:06, 5 July 2024
This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Title
This page should probably be moved to BERT (language representation model) rather than "language model". A language model has a specific meaning: it models the joint probability distribution over word sequences. BERT does not do that; although it can predict a masked word, it cannot give you the probability of a whole sequence.
This would also be consistent with Wikipedia's own definition of a language model.
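The distinction can be sketched with a toy example (all names and probabilities here are hypothetical, chosen only for illustration): an autoregressive language model assigns a joint probability to a sentence via the chain rule, whereas a masked-word predictor in the style of BERT's masked-LM head only scores candidates for one blank given both-side context, and defines no distribution over whole sequences.

```python
# Toy bigram "language model": conditional probabilities P(w | prev).
# These values are made up purely for illustration.
bigram = {
    ("<s>", "the"): 0.5,
    ("the", "cat"): 0.4,
    ("cat", "sat"): 0.6,
}

def joint_probability(words):
    """Chain rule: P(w1..wn) = prod_i P(wi | w_{i-1}).
    This is what makes it a language model in the strict sense."""
    p = 1.0
    prev = "<s>"
    for w in words:
        p *= bigram.get((prev, w), 0.0)
        prev = w
    return p

def fill_mask(left, right, candidates):
    """A masked-word predictor: rank candidates for one blank given
    the context on both sides. Note it returns a single best word,
    not a probability for the full sequence."""
    def score(w):
        return bigram.get((left, w), 0.0) * bigram.get((w, right), 0.0)
    return max(candidates, key=score)

print(joint_probability(["the", "cat", "sat"]))   # 0.12
print(fill_mask("the", "sat", ["cat", "dog"]))    # cat
```

The point of the contrast: `joint_probability` answers "how likely is this sentence?", which BERT's masked objective was never trained to answer directly, while `fill_mask` answers "which word fits this blank?", which is exactly what masked language modeling provides.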
- I agree. Let us move it unless we see substantial protests. Trondtr (talk) 14:03, 1 October 2021 (UTC).
Categories:
- C-Class Google articles
- Unknown-importance Google articles
- WikiProject Google articles
- C-Class Computer science articles
- Unknown-importance Computer science articles
- WikiProject Computer science articles
- C-Class Linguistics articles
- Unknown-importance Linguistics articles
- C-Class applied linguistics articles
- Applied Linguistics Task Force articles
- WikiProject Linguistics articles