WikiProject Academic Journals (Rated C-class)
This article is within the scope of WikiProject Academic Journals, a collaborative effort to improve the coverage of Academic Journals on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
 C  This article has been rated as C-Class on the project's quality scale.
See WikiProject Academic Journals' writing guide for tips on how to improve this article.
WikiProject Chemistry (Rated C-class, Low-importance)
This article is within the scope of WikiProject Chemistry, a collaborative effort to improve the coverage of chemistry on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
 C  This article has been rated as C-Class on the project's quality scale.
 Low  This article has been rated as Low-importance on the project's importance scale.
WikiProject Open Access (Rated C-class)
MDPI is part of WikiProject Open Access, a collaborative effort to improve the coverage of topics related to Open Access and to improve other articles with the help of materials from Open Access sources. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
 C  This article has been rated as C-Class on the project's quality scale.
 ???  This article has not yet received a rating on the project's importance scale.
Norwegian criticisms

I wonder whether we need to include every single thing any Norwegian has ever said about MDPI? It distorts the page, in my opinion. It seems academics in Norway are quarreling about the legitimacy of MDPI, and one particular user is now spilling it over to the English wiki of MDPI. I leave it here for you all to discuss. Kenji1987 (talk) 07:31, 11 July 2021 (UTC)

I'd favor tagging with {{globalize|section}} and expanding the section title, unless the criticism really is unique to Nordic countries, in which case I think the current length is OK but the section should not expand much more, if at all. Banedon (talk) 00:40, 12 July 2021 (UTC)
It is true that Norwegian academics actively evaluate and discuss MDPI – as far as I can tell more than others (at least as far as public debate goes), probably because Norway is an international leader in this area, i.e. the evaluation of academic journals and presses and the building of a comprehensive government-owned database of publication channels worldwide. I'm not aware of any quarrel because I've not really seen any positive assessments of MDPI from academics. --Bjerrebæk (talk) 15:27, 12 July 2021 (UTC)
Many sentences in the section indicate the comments are in a personal capacity, though. I'm also surprised you've not seen positive assessments of MDPI from academics because there are many, e.g. sources cited in [1] and [2], just usually less vocal. It's very much an international topic. Banedon (talk) 21:10, 12 July 2021 (UTC)
Yes, some of the comments by academics were in an individual capacity, which is usually true for all forms of science and scholarship, and which was also true for Beall's criticisms of MDPI. However there is also some institutional weight behind some of the criticism, for example when the world's most comprehensive database of academic journals and presses announced a new level for dubious publishers/journals and linked it specifically to expressions of concern regarding MDPI. It seems to me that the academics who have defended MDPI are usually ones with some form of conflict of interest because they are editors of MDPI journals or otherwise involved with MDPI. In fact Beall has criticized Murray-Rust for exactly that. I was in fact referring to the allegation that there is a "quarrel" between Norwegian academics over MDPI when I said I've not seen any Norwegian academics publicly defend MDPI and thus not seen any such quarrel. --Bjerrebæk (talk) 10:04, 13 July 2021 (UTC)
If we accept comments by academics in an individual capacity, then things like this would be worth including. Beall is different because he was a specialist in OA. The named scientists in the section are not. I still favor a globalize tag. Also I have not seen any quarrel involving Norwegian academics either so that is kind of moot; and the conflict of interest argument seems like an unsolvable catch-22: if one has edited a MDPI journal before then one has a COI, but if one hasn't, then one doesn't have firsthand knowledge. Banedon (talk) 11:05, 13 July 2021 (UTC)
The quarrel is mentioned in the above-mentioned source. I don't know of any university making use of the "world's most comprehensive database". Italy's own database (ANVUR) has labelled MDPI's Sustainability a class A journal, the highest rating possible, but I see little point in adding this. What is also important to mention is that the Norwegian scientists' wiki pages were either created or edited by Bjerrebæk. Kenji1987 (talk) 11:21, 13 July 2021 (UTC)
Yes, I did make a small edit[3] to Simen Andreas Ådnøy Ellingsen's article when I included his assessment of MDPI here. There must be something sinister going on, surely – according to MDPI, at least. No doubt it will be included in their "Response to MDPI Wikipedia Article" in due time. --Bjerrebæk (talk) 13:24, 13 July 2021 (UTC)
The following discussion has been closed. Please do not modify it.
Lol, they have a response article? Nothing sinister here, but it's interesting and important to point out. When will you add that the CEO supports Trump? In the Norwegian version it's in the lead. Kenji1987 (talk) 13:51, 13 July 2021 (UTC)
This is the talk page of the English Wikipedia article about MDPI. When you have learned Norwegian, I'm happy to have a conversation with you about the Norwegian article, in Norwegian, on the Norwegian Wikipedia project. --Bjerrebæk (talk) 13:58, 13 July 2021 (UTC)
This is exactly the problem. Kenji1987 (talk) 14:05, 13 July 2021 (UTC)

I have never heard of Anders Skyrud Danielsen or Lars Mølgaard Saxhaug, nor of any of the Norwegian sources you cite. Most of us can't assess whether the things you add here really improve the quality of the article (nor do we care; I care just as much about Norway as I do about Ecuador, Papua New Guinea, or Sri Lanka, but you don't see me digging into their academic discussions of MDPI). Kenji1987 (talk) 15:36, 12 July 2021 (UTC)

Given your absolute hatred for anything remotely critical of MDPI, I can't say I particularly care who you have heard or not heard of in your life. WP:IRS does not mean "Sources and scholars Kenji1987 has personally heard of before". Norwegian sources are just as fine to use as American ones, British ones, French ones, German ones, etc... Headbomb {t · c · p · b} 17:20, 12 July 2021 (UTC)
Thanks for your insights! Kenji1987 (talk) 22:04, 12 July 2021 (UTC)
Instead of discussing obscure scholars debating MDPI, this source: is really the best analysis out there. I am probably not able to add it, but anyone else can. Feel free to do so. Kenji1987 (talk) 11:29, 13 July 2021 (UTC)

Is there any reason to believe that this source rises above our usual prohibition against blog sources? Crosetto appears to be an economist, with no special expertise in bibliometrics or academic publishing beyond (like all academics) as a participant, so I don't think the "established subject-matter expert" escape clause of WP:SPS applies. —David Eppstein (talk) 18:54, 21 August 2021 (UTC)
Yes, I think we'd really have to stretch the "established subject-matter expert" escape clause to allow that here. XOR'easter (talk) 22:55, 21 August 2021 (UTC)

Relevant new reference

We are not currently citing this, but maybe we should:

  • Oviedo-García, M. Ángeles (August 2021). "Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI)". Research Evaluation. Oxford University Press. doi:10.1093/reseval/rvab020.

David Eppstein (talk) 17:59, 21 August 2021 (UTC)

Does anyone understand that paper's methodology in calculating self-citation rates? I tried duplicating it for Cells / Nature Reviews Molecular Cell Biology and didn't get the same numbers. Cells published 1,660 articles in 2019. These were cited 21,099 times in total by 18,845 citing articles. Without self-citations, there were 20,767 citations by 18,652 articles. Comparatively, Nature Reviews Molecular Cell Biology published 115 articles in 2019. These were cited 8,005 times by 7,597 articles, and 7,992 times by 7,586 articles when omitting self-citations. The "standard" way of calculating self-citation rate is the number of journal self-citations expressed as a percentage of the total citations to the journal. That gives Cells a self-citation rate of 1.5% (=[21099-20767]/21099 * 100%) and Nature Reviews Molecular Cell Biology a self-citation rate of 0.16%. Oviedo-Garcia gives 2.54% and 0.85%. How were these numbers calculated? Banedon (talk) 02:13, 25 August 2021 (UTC)
In the same way I'm getting that Sustainability has a self-citation rate of 6.16%, while Oviedo-Garcia gives 27.69%. If Oviedo-Garcia is right, this exceptionally high self-citation rate should trigger an investigation by Clarivate. Has anyone seen any news that Clarivate is investigating Sustainability? Banedon (talk) 02:18, 25 August 2021 (UTC)
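The "standard" calculation described above reduces to a few lines of arithmetic; a minimal sketch (the figures are the Web of Science numbers quoted in this thread, and the function name is just illustrative):

```python
def self_citation_rate(total_citations, citations_without_self):
    """Journal self-citations as a percentage of total citations to the journal."""
    self_citations = total_citations - citations_without_self
    return self_citations / total_citations * 100

# Cells (2019): 21,099 total citations, 20,767 without self-citations
print(round(self_citation_rate(21099, 20767), 2))  # 1.57

# Nature Reviews Molecular Cell Biology (2019): 8,005 total, 7,992 without
print(round(self_citation_rate(8005, 7992), 2))    # 0.16
```

Neither result matches the 2.54% and 0.85% reported in the paper, which is exactly the discrepancy being discussed.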
I haven't had time yet to read this article in detail (beyond the abstract), but could it be that they call it "self-citation" when a journal is cited by another journal from the same publisher? That would explain the higher numbers they get, and would indicate that MDPI journals form a kind of citation ring. I don't expect to see any news about Clarivate examinations; as far as I can tell, they never say anything about a journal or group of journals being investigated unless they find something amiss. --Randykitty (talk) 08:11, 25 August 2021 (UTC)
Effectively a citation ring, yes. Intra-MDPI citations are abnormally high, combined with the insane amount of special issues (presumably not subject to the EiC's review, given no one has time to review over a thousand issues per journal per year). Headbomb {t · c · p · b} 12:13, 25 August 2021 (UTC)
The closest-related paragraph I can see from that paper is this:

Data on each selected journal were gathered from the following sections of the MDPI-journal web pages: Home, Editorial Board, Special Issues, APC, and Journal Statistics. Besides, data were collected from JCR (2018) on the Journal Impact Factor and the Impact Factor Without Self Cites. Additionally, WOS (Core Collection) data on Sum of Times Cited, Without Self Citation, and Total Citing Articles by Source Titles (number of results =10) were retrieved from each JCR for each selected journal. Exceptionally, data on the MDPI-journal self-citation rates were collected on 3 June 2020, to assure data accuracy in relation to the 2019 self-citation rates.

It's not clear to me what Oviedo-Garcia did. If Oviedo-Garcia indeed sorted by citations from publisher, then that should inflate self-citation rates from big publishers. But I think at that point the objections are academic and should be answered in the literature. I'll just add that it is possible to tell if a journal or group of journals is being investigated by Clarivate - [4] [5] [6] are examples. I think we should keep an eye on it - if Clarivate investigates the results are probably worth including, if they don't then the paper might be faulty and not worth referencing. Banedon (talk) 13:33, 25 August 2021 (UTC)
Banedon, those 3 references are about measures taken by Clarivate after investigations are finished. Whether they're currently investigating any (MDPI or other) journals is unknown. --Randykitty (talk) 14:20, 25 August 2021 (UTC)
Clarivate already banned self-citing journals in 2020. That is the time frame this study refers to. It's here: – not a single MDPI journal among them. Kenji1987 (talk) 14:25, 25 August 2021 (UTC)
Check this also: If MDPI really were rigging the system, then I'm surprised a tourism scholar had to find it out, and not Clarivate. Kenji1987 (talk) 14:38, 25 August 2021 (UTC)
Currently investigating or after investigating, I don't think it's a big difference. The point is that if Oviedo-Garcia's analysis is correct, Clarivate should be investigating, and with 53 journals in JCR chances are at least some of MDPI's journals are going to be delisted. If that happens it should be visible. Oviedo-Garcia did use 2019 data, meaning that if MDPI's journals are going to be delisted, it should be happening pretty soon. If it doesn't happen, Oviedo-Garcia's analysis might be faulty. The link by Kenji1987 indicates that it hasn't happened, which is concerning. A few other things to point out as well - firstly, Oviedo-Garcia called MDPI's 1000-2000 CHF publication charges "high", yet published in a journal that charges 2678 EUR. Secondly, Oviedo-Garcia compared the journals against the leading journals in the field. It stands to reason that the best journals have better citation metrics than the not-so-good journals, so the analysis that should have been conducted is a comparison against other similarly-ranked journals. Finally, MDPI have posted a response that includes a chart showing where MDPI's self-citation rate (where they indeed sum across all journals by the same publisher) stands relative to other publishers. Several major publishers are higher, just as several are lower. I'm getting the feeling that Oviedo-Garcia's paper is rather seriously flawed, which would be really ironic, since it would imply OUP is a predatory publisher. The paper is very new, only two weeks old as of the time of writing. It might be too new to include. If kept (I don't feel strongly either way) I would expect rebuttals soon, at which point its inclusion should be re-evaluated. Banedon (talk) 23:55, 25 August 2021 (UTC)
It's very clear. Take a look at Table 3: "Table 3. Intra-MDPI citation rate 2018 and 2019 (top 10 citing journals)" – she only included the top 10 citing journals, not all citations, which makes little sense. Kenji1987 (talk) 13:48, 25 August 2021 (UTC)

The analysis is a bit silly in comparing MDPI journals with Nature journals, but it is a source that needs to be cited here. The author classifies MDPI as a predatory publisher because self-citation rates among all the journals are higher than those of the comparison journals, except for one. The intra-MDPI journal self-citation rate is also high (no data given on the non-MDPI publishers). Review times are too fast, and there are too many special issues. Kenji1987 (talk) 00:39, 22 August 2021 (UTC)

@Banedon: you wanna team up and write a commentary on the article? It claims that some journals have 0 editorial board members (nonsense). The table on intra-MDPI citations also doesn't make sense (no idea how she calculated it), and no comparable data are given on other publishers. The article claims that it is problematic that you can't set a limit on having editorial board members from developing countries (which she states is a characteristic of predatory publishers, sic!). The article is full of typos, and the graphs are not only low resolution but also plainly wrong (how can nominal variables be line graphs?); you would expect this journal to have copy editors. Also, Clarivate does not set a limit of 15% self-citations; they ban journals every year with self-citation rates of 50% or higher, but there is no arbitrary limit. The author also cites an MDPI journal, but she doesn't want to cite it properly, so she just copy-pastes the link (again, how can journals allow such citations?). Last but not least, I think it's hilarious to compare MDPI journals with mostly Nature review journals and then conclude that MDPI must be predatory. Of course you can, but... well... it's comparing a Porsche with a Lada. Both drive, and that's it. Are you interested? Drop me a message and I'll contact you. In the meanwhile I am adamant that this article be cited here. 100% supportive. Kenji1987 (talk) 12:49, 25 August 2021 (UTC)

Thanks for the offer. I suspect I lack the expertise to write a comment myself, but I'll be OK participating if the comment focuses on academic issues (so things like "full of typos" should not matter, since those things happen at virtually all journals). Banedon (talk) 23:55, 25 August 2021 (UTC)

Being full of typos is a problem, as it shows there is no decent copyediting from the journal, one of the characteristics of a predatory journal. Well, I'm not saying this particular journal is one, but it is rather ironic. Kenji1987 (talk) 01:06, 26 August 2021 (UTC) - and as expected, an expression of concern. Perhaps the study got cited here too fast. Kenji1987 (talk) 12:07, 3 September 2021 (UTC)
Possibly. Could very well be MDPI lawyers trying to suppress research. Could be problems with methodology too. Wait and see for now, but in the meantime we shouldn't be citing this. Headbomb {t · c · p · b} 16:38, 3 September 2021 (UTC)
MDPI lawyers threatened Beall, beyond what could be published in his paper about them. MDPI appears to be a bad actor working to suppress criticism with the threat of lawsuits. Given the suppression efforts, I think it makes sense to responsibly cite the analysis until there is evidence of problematic data being published. Remember, MDPI refused to follow COPE when directly told they should retract a paper in Behavioral Sciences. AllMyChillen (talk) 15:11, 27 September 2021 (UTC)

Perhaps we shouldn't be citing the study at all until there is clarity? Kenji1987 (talk) 23:04, 3 September 2021 (UTC)

I've gone ahead and removed the section. Banedon (talk) 00:24, 4 September 2021 (UTC)

Good! And once there is more clarity this study can be included or remain removed. That MDPI lawyers would have a hand in this is extremely unlikely, if you've actually read the paper. It's full of serious flaws. But no one reads papers anymore. Kenji1987 (talk) 00:49, 4 September 2021 (UTC)

I don't see why you think the existence of flaws (assuming they exist) would make lawyer action less likely. —David Eppstein (talk) 00:55, 4 September 2021 (UTC)

OUP is an academic juggernaut. I don't think they have much to fear from MDPI. I'd say just read the article; the MDPI-lawyer hypothesis would then become less likely. Kenji1987 (talk) 00:59, 4 September 2021 (UTC)

I don't see why an expression of concern makes lawyer action more likely either. Neither does it make lawyer action less likely. It's just an unrelated piece of news. If anyone is jumping to conclusions about how this might be due to MDPI lawyer action, they might want to examine internal biases, because that's what it looks like to me. Banedon (talk) 02:58, 4 September 2021 (UTC)
Internal biases, or not forgetting the past history of MDPI harassing bearers of bad news to the point that they were forced to go away, potato, potahto. Nevertheless, whatever conclusions we might jump to, the only actionable information is the paper and the expression of concern, not our speculation. —David Eppstein (talk) 06:09, 4 September 2021 (UTC)

"Official Norwegian List" in the lead[edit]

Now that 5 (out of the 350 or so) MDPI journals have been rated level X, does this warrant mention in the lead? If yes, why aren't the 150+ journals with level 1, the 7 or so journals with level 0, and MDPI having a publisher level of 1 on the same index mentioned in the lead? If no, then it should be deleted. In the lead we should at least note that MDPI has level 1, or not mention the Norwegian index at all. Kenji1987 (talk) 13:34, 8 September 2021 (UTC)

The way the Norwegian Scientific Index works, each journal is assessed individually, ultimately by committees in the relevant field. (The publisher-level rating only applies to books) Perhaps this doesn't work well when dealing with a predatory publisher like MDPI that churns out journals and special issues. The system was clearly designed for the old publishing world of publishers and journals that behaved quite differently, and where journals were typically established by academics, rather than being motivated purely by profit.
The committee said they added five MDPI journals initially, in consultation with the discipline-specific subcommittees, to assess the response from the academic community, and that more journals may be added later. They also said explicitly that the entire Level X of dubious or possibly predatory journals was created in response to expressions of concern regarding MDPI[7]. They have also said MDPI is continually under review and that they may make further adjustments to the whole system in response to the problems posed by publishers like MDPI (as they did when they created Level X). MDPI is the largest publisher of Level X-journals, with about 40% of the list consisting of MDPI journals. MDPI is the only publisher mentioned specifically in connection with the establishment of the list. Level X shares many similarities with (and is clearly inspired by) Beall's List, but is published by a government agency and has direct funding and other implications. Bjerrebæk (talk) 13:53, 8 September 2021 (UTC)
I don't see any evidence that it only concerns books, and if it did, that would also be relevant information. It also raises the question of why we don't mention the 160+ journals that are level 1. Anyhow, I don't intend to make this a long discussion, and I look forward to hearing everyone's views on this. Kenji1987 (talk) 14:01, 8 September 2021 (UTC)
Institutional-level ratings only apply directly to books. This was explained by the committee to MDPI when they complained about it, according to minutes of their meeting of 13 September 2019, so it surprises me that you aren't aware of it. According to the minutes MDPI told the committee that they now planned to publish books. The committee hadn't yet figured out what to do with MDPI as a whole (although they said they were reviewing MDPI) and reinstated MDPI as a level 1 book publisher after they had designated MDPI as a level 0-publisher on the institutional level. The new test case where five MDPI journals are designated as possibly predatory is a significant new development in the way the whole index works. I agree that it would be best if MDPI journals were treated in a more consistent way, by designating all MDPI journals as predatory/non-academic. --Bjerrebæk (talk) 14:12, 8 September 2021 (UTC)

To redirect back to the discussion, what should be in the lead? a) only the 5 journals rated level X by the "official Norwegian list"; b) a) plus MDPI being a level 1 publisher; c) a) + b) plus the number of journals at each level – 1 (academic), 0 (non-academic/predatory), X (grey zone); d) none of the above; e) a different combination, namely... Kenji1987 (talk) 14:28, 8 September 2021 (UTC)

What happened to this? --JBL (talk) 17:35, 8 September 2021 (UTC)

The page is currently being edited; after the edits are done, I'll add that MDPI is a level 1 publisher in the lead. Kenji1987 (talk) 03:22, 1 October 2021 (UTC) I was wondering whether we should keep level X in the lead, as the Norwegian page states the following: "Keeps level X for now [the MDPI journal]. Will be reconsidered at the decision meeting 14 January 2022." – perhaps better to wait until January 2022? Kenji1987 (talk) 03:18, 19 November 2021 (UTC)

Semi-protected edit request on 27 September 2021

There is a new peer-reviewed paper about MDPI that should be included. I suggest the text and citation be added as below, at the end of the third paragraph concerning general criticisms of MDPI.

A quantitative analysis of MDPI's citation patterns supported that it appeared to be a predatory publisher. Oviedo-García, M. Ángeles (2021). "Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI)". Research Evaluation. doi:10.1093/reseval/rvab020. Specifically, the analysis identified an extremely high rate of self-citation, commonly observed in predatory journals. AllMyChillen (talk) 15:06, 27 September 2021 (UTC)

This was discussed a couple of sections above. I'm of opinion it should not be added as long as this expression of concern remains. Banedon (talk) 01:17, 28 September 2021 (UTC)
What Banedon said. Headbomb {t · c · p · b} 01:54, 28 September 2021 (UTC)
Crawdaunt You might be interested in the above. Banedon (talk) 01:58, 26 November 2021 (UTC)
Crawdaunt's addition here contains many questionable elements. First of all, primary sources and original research should not be here ( & ). Second, the expression of concern is not due to complaints from MDPI, as is incorrectly stated on the current page ( ). Take a look at the article yourself; it's hilarious, and that is probably what caused the expression of concern. Banedon, may I ask you to re-edit or remove this particular section until the current study is in the clear? Otherwise I'll do it, but probably in a few days, to get more feedback first. Kenji1987 (talk) 04:46, 26 November 2021 (UTC)
I'm waiting too, don't think it's urgent. Banedon (talk) 08:16, 26 November 2021 (UTC)

I'm happy to remove or re-edit. I felt some justification for scope was necessary to warrant inclusion, but I can take out the altmetric and soften the wording. I have read the article and the comment by MDPI. While the article indeed goes off on a few... questionable tangents... the undisputed fact remains that MDPI has one of the highest self-citation networks in publishing. They address this in their comment, but their interpretation of their data is rather rosy compared to how it appears without their insistence that it's normal/fine.

I do think it is a major controversy. I don’t see how the expression of concern changes that? Crawdaunt (talk) 06:48, 26 November 2021 (UTC)

The problem with an expression of concern is that it could indicate a poor paper that should not have been accepted. See the section above for some analysis. I still haven't seen evidence that Clarivate is investigating for excessively high self-citation rates, either. I am in favor of removing all mention of the paper as long as the expression of concern remains. The other article you added, for the Nutrients journal, is in keeping with the spirit of the rest of the section, although the more I think about it the more dubious I become about the section as well. Lots of controversial articles are published every year by virtually every publisher (e.g. this was published in Nature Astronomy, this was published in A&A). I would prefer to see some kind of allegation of peer review failure on MDPI's part, or the section could well become longer than the rest of the article in the future. Banedon (talk) 08:16, 26 November 2021 (UTC)
Expression of Concern is kind of expected to be paired with controversy. I think you can look at the Expression of Concern from both sides. This is a controversy surrounding MDPI, but it does not have to assume MDPI is somehow in the wrong. Nevertheless it is an ongoing controversy worth mentioning (in my opinion).
Re: the controversy section: I think the frequency and rate of controversy is the reason it is featured so prominently on MDPI's Wiki page. Of course major journals have bunk slip through, but the rate of controversy per article published is very different. There is an overwhelming sentiment in the research community that MDPI is a controversial publisher for reasons of rigour and quality. This is distinct from other publishers where controversy typically stems from paywalls and article processing charges. There is another recent and fairly professional and respectful discussion of MDPI publication practices here at the following link, where MDPI also got to comment in its defence on the analysis later: The core message of this blog post hits similar tones as the peer-reviewed (and Expression of Concern) article listed, but with a much more respectful and less inflammatory discussion.
I say all this in defence of the controversy section featuring so prominently. It is an earned distinction that other Open Access mega-publishers (like BioMed Central) do not necessarily have to suffer. I do agree that it could be presented/organized differently to avoid it just sprawling into a mass of snippets. Perhaps a section on sting operations and Beall's list, and a separate (less prominent) section listing controversial articles.
edit: I see the PaoloCrosetto article was discussed above as well. Not proposing this meets Wiki standards, but I think it's relevant that some of the major conclusions of this good analysis and the questionable Research Evaluation Article overlap - as support that the Research Evaluation article isn't total bunk. Crawdaunt (talk) 13:56, 26 November 2021 (UTC)
Crawdaunt if we measure "controversy" by whether an article is retracted, then there has actually been analysis of whether "the rate of controversy per article published [by MDPI] is very different [from other publishers]". See [8]:

No matter how MDPI achieves a speedy publication process, it is clear that the research community has not rejected their approach so far. MDPI’s content has been growing, it has become increasingly citable, and it is not retracted at an alarming rate.

The publisher reported 19 retractions in 2019, equivalent to 0.5 retractions per 1,000 papers (assuming that retractions refer to year t-2). As a point of contrast, I could locate 352 papers on Elsevier’s ScienceDirect that included the phrase ‘this article has been retracted’ in 2019, implying 0.5 retractions per 1,000 papers (again, assuming that retractions refer to year t-2).

I would suggest being careful here, because we seem to be straying into the WP:RIGHTGREATWRONGS territory where because MDPI is a predatory publisher all criticism of them is true and must be included in the article, even when its reliability is in doubt. Banedon (talk) 14:11, 26 November 2021 (UTC)
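The per-1,000 comparison in the passage quoted above is straightforward to reproduce; a small sketch (the total paper count is back-calculated from the quoted figures, not stated explicitly in this thread):

```python
def retractions_per_1000(retractions, papers):
    """Retractions expressed per 1,000 papers published."""
    return retractions / papers * 1000

# MDPI reported 19 retractions at 0.5 per 1,000 papers, which implies
# roughly 19 / 0.5 * 1000 = 38,000 papers in the reference year (t-2).
implied_papers = 19 / 0.5 * 1000
print(implied_papers)                   # 38000.0
print(retractions_per_1000(19, 38000))  # 0.5
```

The same function applied to the Elsevier figures in the quote (352 retractions at the same 0.5 per 1,000) implies roughly 700,000 papers, which is how the per-1,000 normalization makes the two publishers comparable despite their different sizes.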
Banedon This is fair. I hate the Scholarly Kitchen post by the way. They imply that a journal having an indexed impact factor alone is somehow proof of legitimacy. One of the core issues being addressed in the Research Evaluation article is how high self-citation rate artificially inflates Impact Factor. MDPI's own public comment confirms that they are an extreme outlier in having high self-citation particularly when you factor in total publications (though they claim it is within reasonable bounds). The contrast with Elsevier is cherry-picking. Elsevier publishes an order of magnitude more articles than MDPI, and so its rate of total controversies and self-citation is justifiably higher: there are more total articles citing and to be cited in the Elsevier profile. And more total articles that might receive attention for retraction.
I would likewise take care not to assume retraction is somehow the endpoint justification of controversy. Even the Science article on arsenic being used as a backbone for DNA was never retracted. Instead it was later contradicted by a follow-up article. The willingness of a publisher to retract articles is implicit in whether articles will get retracted. Having lots of controversial articles that are not retracted is itself a sign that the publisher is constantly letting in research of questionable quality, and is both unwilling to change its practice, and unwilling to admit fault.
Banedon "No matter how MDPI achieves a speedy publication process, it is clear that the research community has not rejected their approach so far." - I don't think this is true... There are numerous public instances calling for a boycott of MDPI journals. And I would add that being increasingly cited is not the same as being more citeable. Something like SciMago Journal Rank is a measure of citability, while Impact Factor is only a measure of total citations (regardless of where those citations come from).
If you didn't know, impact factors are provided by Clarivate. It is very hard to get one - the journal needs to be indexed by Web of Science, and fulfilling the criteria [9] is very difficult, so difficult that most journals never get there. Accordingly, having an indexed impact factor is a proof of legitimacy (this assumes a legitimate impact factor). Furthermore, the comparison with Elsevier is per 1000 papers hence it accounts for the fact that Elsevier is the larger publisher. I don't know why you bring up the Science arsenic article as well. Are you implying Science (or the publisher publishing it) is "constantly letting in research of questionable quality"? They didn't retract the controversial article either (neither did Nature for the water memory paper, for that matter). You mention "the willingness of a publisher to retract articles", but retractions are typically initiated by the editors of the journal or the authors of the article [10]. There are many people publicly boycotting MDPI journals, but there are also many people publicly supporting their journals (see how many editors & authors they have). They are definitely controversial, but it's not like the article claims MDPI aren't controversial, and there is a large controversies section. Finally are you aware that the impact factor as calculated by Clarivate only uses citations from other articles published in Web of Science-indexed journals? There is no "regardless of where those citations come from" - they necessarily come from other Web of Science journals, and as mentioned above these are all legitimate journals.
Please try to be objective. It is really looking to me like you have already decided that MDPI is predatory, which is why you are drawing all the worst-case inferences, even when the evidence is not there. Banedon (talk) 15:34, 26 November 2021 (UTC)
Banedon Woah now, I think you are assuming a lot and reading into things in a strange way.
1) I am not implying Science has predatory behaviour. I brought up the Science article as a point to emphasize that retraction is not sufficient to demonstrate an article had an underlying flaw. It's a famous example. That is all. A journal's willingness to retract an article is in many cases a sign of quality. Genuinely obvious predatory publishers never retract papers, so you cannot rely on the journal to be its own judge and jury by using retraction as a litmus test. Retraction count is not a good metric of predatory behaviour.
2) Yes, I know what an Impact Factor is. Obtaining one is the very starting line to being a reputable academic publisher, not a gold star that forever assures quality. This is underlined by the abuse of publication metrics like the IF or h-index, which can be inflated by self-citations.[1] A very high self-citation rate is brought up in the Research Evaluation article, and confirmed by MDPI's own internal statistics. Having indexed journals with IFs is indeed a strong point in favour of legitimacy. But it does not settle the question of whether MDPI is becoming more citable. Again, the SCImago Journal Rank or an equivalent citation metric is a better reflection of citability, as the SJR algorithm controls for the diversity of citing journals (and thus for self-citation networks). Journals that excessively cite themselves have a poor IF/SJR ratio. See Falagas et al.[2]
3) I don't want to make this a tit-for-tat, but you accuse me of not being objective, and then use a weird equivalence of how despite many calls to boycott MDPI, some people also support MDPI. Of course. But the reason for the boycott is scientific rigour, and there are not similar calls to boycott other mega publishers like e.g. BioMed Central. Even Frontiers Media seems to have a better reputation than MDPI (despite many parallels). E.g. here: or here: . These are obviously small samples restricted to the Twitter community, but you would never find such numbers from polls asking "is BMC a predatory publisher?" despite both being models of Open Access Mega Publishing. There is indisputably a public debate specifically about MDPI being predatory (also evidenced by the number of articles published on the topic, discussed on this web page, and the edit history of this page). This debate on scientific rigour does not exist for the most reputable publishers. This is evidence that MDPI is perceived as a predatory publisher by a not-insignificant group, and it is a bit bizarre to suggest that by recognizing this, I am not being objective.
I don't know about this. "retraction is not sufficient to demonstrate an article had an underlying flaw" Really? If an article is retracted, it must have some underlying flaw. Can you find a retracted article anywhere that doesn't have an underlying flaw? "Obtaining one is the very starting line to being a reputable academic publisher, not a gold star that forever assures quality." Are you aware that Clarivate regularly curates Web of Science, that journals get delisted all the time, and that once delisted a journal ceases to have an impact factor? "which can be inflated by self-citations" Are you also aware that excessive self-citations get a journal investigated by Clarivate [11]? 3) misses the point entirely. Nobody denies that MDPI is a controversial publisher (and I know some who think Frontiers is worse than MDPI). How is that related to the fact that you have inserted a disputed paper into this article? Focus on what you inserted, please. Are you claiming that Oviedo-García's article is not disputed? Are you claiming that disputed articles should be inserted into Wikipedia articles? Or are you claiming that disputed articles should not be inserted into Wikipedia articles, unless they are about MDPI, in which case they should? Banedon (talk) 01:16, 27 November 2021 (UTC)
Thanks for de-escalating. 1) Retracted articles do almost always have significant flaws. My point is that the number of retractions depends both on flaws in the articles and on the publisher's willingness to recognize them. A publisher that doesn't retract flawed articles will not have a high number of retractions, but is nevertheless dubious. 2) I am. That is why the Bentham Science and OMICS publishing groups have both been delisted by the likes of Clarivate and SCImago. I think it's fair to say that MDPI has not yet been arbitrated as a predatory publisher, and so Wikipedia should not outright call it one. But I do think there is a distinction between the label "predatory publisher" and controversies around exhibiting predatory publishing behaviour (or, as Paolo Crosetto put it, "aggressive rent extractors"). I think that term is apt, but unfortunately the conversation above has already agreed that Crosetto's economist background is not sufficient for him to be deemed an expert source on publishing ethics (for fair reasons of consistent editing policy). Admittedly... I'm not sure everyone has the same definition of predatory publishing in the modern era, and that might be behind some of the disconnect. 3) My point is that there is an ongoing controversy surrounding MDPI. If you would say that MDPI articles given a notice of concern should not be added to the page until arbitration, then I would agree. But this leaves the publisher as its own judge and jury, which is not a sound process. In the Controversies section, the result of the "Who's Afraid of Peer Review" sting operation is listed despite the MDPI journal at the time rejecting the paper. Not all controversy entries assume MDPI is at fault, but they are nonetheless controversies surrounding MDPI. Crawdaunt (talk) 11:38, 27 November 2021 (UTC)
Edit: I'd add that MDPI has issued a public comment on the Research Evaluation article, so this is a further argument that they themselves view it as a controversy they are embroiled in. Crawdaunt (talk) 11:43, 27 November 2021 (UTC)
1) Are you implying that MDPI refuses to retract articles? If yes - do you have any evidence beyond the one incident already in the article from 5 years ago? How do you explain the retractions that were mentioned above? 2) Are you aware that Clarivate does not index publishers? 3) There are lots of "ongoing controversies". You could, for example, argue that there is an ongoing controversy over whether or not dark energy exists (see [12] for a popular-level writeup). That doesn't mean it should be included in our article on dark energy - the study is disputed, after all. It is also concerning that you've said that an MDPI article with an expression of concern should not be added to the page, but an OUP article with the same is OK, since that looks like a clear double standard. Unless another editor says not to do it (I know several are watching the page), I will remove this section. Banedon (talk) 15:34, 27 November 2021 (UTC)
1) No. You said: "Crawdaunt if we measure "controversy" by whether an article is retracted, then there has actually been analysis of whether "the rate of controversy per article published [by MDPI] is very different [from other publishers]"". I have explained repeatedly why article retraction is not a good metric for judging controversy. No more to say on this. 2) Please avoid semantics. I acknowledge the IF as a point of legitimacy. I have repeatedly attempted to explain how the IF can be abused, in response to your earlier quoted statement: "No matter how MDPI achieves a speedy publication process, it is clear that the research community has not rejected their approach so far." MDPI itself commented on the Research Evaluation article in question, and presented internal data showing they are amongst the highest self-citers of the publishers included in their analysis (Fig 1 of: 3) You have grossly misunderstood my question. I made no such assertion. To phrase it another way: I asked whether an MDPI article currently under a notice of concern is prevented from being added to the page. Crawdaunt (talk) 20:20, 27 November 2021 (UTC)
I think we're at an impasse and will not be making progress, so let's ping the other people who have either participated on this page or have edited the article recently and see what they think: David Eppstein Headbomb Randykitty Bjerrebæk. Not pinging Kenji1987 since they've already said above they're against it. The dispute is over this edit, with me thinking it shouldn't be in the article as long as this expression of concern remains, and Crawdaunt apparently thinking it should be included anyway. Banedon (talk) 02:38, 28 November 2021 (UTC)

My position is unchanged from last time. If there's an expression of concern, we don't rely on it, regardless of why there's an expression of concern. Headbomb {t · c · p · b} 02:42, 28 November 2021 (UTC)

Although I supported including this source originally, I think we can wait until the expression of concern is resolved before including it, per Wikipedia:There is no deadline. —David Eppstein (talk) 07:05, 28 November 2021 (UTC)
I agree we are at an impasse. User:Banedon I would appreciate an answer to my question: would an expression of concern on an MDPI article prevent it from being added to the controversy section? I ask in good faith, to know whether it is standard practice that an unresolved controversy is inappropriate for a section called "controversies." My position is that this is an ongoing controversy, and I do not think our judgements of which side is justified should factor into the fact that controversy nevertheless exists, and is centred around MDPI. The controversial article in Research Evaluation has garnered enough attention that the publisher felt it important to release a public rebuttal. Surely that is relevant to the controversy section? Crawdaunt (talk) 10:49, 28 November 2021 (UTC)
Consensus is clearly against inclusion of this source until the expression of concern is satisfactorily resolved. Headbomb {t · c · p · b} 11:05, 28 November 2021 (UTC)
Alright. 3 to 1 is pretty damning. I am nonetheless curious why this consensus was reached? Not all entries in the controversies section imply the publisher is at fault (e.g. the "Who's Afraid of Peer Review" entry). Is it standard practice not to list ongoing controversies? Crawdaunt (talk) 11:12, 28 November 2021 (UTC)
Make that 4 to 1... As long as there is an expression of concern, this is not a reliable source and should not be used anywhere. At this point the controversy is not about MDPI, but about whether that article is correct, needs changes, or is hopelessly flawed and should be retracted. Regarding the rest of the controversies section, I think that it should only mention controversies that concern the publisher as a company (such as being placed on/removed from Beall's list). Controversies concerning a single article should, at best, be mentioned in our articles about the journal. Discretion is needed, though. Publishers, editors, and reviewers are only human, so screw-ups will happen from time to time with even the best of them. The larger the journal, and the more journals a publisher has, the more retractions they will have. Unless there's a clear pattern of abuse or a larger-than-normal scandal, most retracted articles are trivial and don't deserve even a mention in the article on the journal in question. If we listed every problematic article in a publisher's article, it would become impossibly large for huge publishers like Springer or Elsevier. --Randykitty (talk) 11:22, 28 November 2021 (UTC)
Fair enough. I guess Research Evaluation will sort itself out eventually. I am massively outvoted, but still believe that the publisher's public rebuttal means this is not simply about Research Evaluation: MDPI has engaged in the controversy. On your point about retraction: I really do want to emphasize the specific flaw in using retraction as a metric of a controversy being validated. Nature and Science have a practice of not retracting even flawed articles (e.g. Nature's memory-of-water paper in 1988, or Science's arsenic DNA backbone paper), and there is reasoned debate on the merit of retraction vs. allowing the literature to sort itself out (which happened in both of those cases). On the other hand, flawed articles might remain up, but with an expression of concern. Retraction is correlated with, but distinct from, whether an article is flawed, and also distinct from whether it is controversial. Crawdaunt (talk) 11:40, 28 November 2021 (UTC)

In my comment above, for "retraction" read "controversial article". --Randykitty (talk) 11:59, 28 November 2021 (UTC)


  1. ^ "Hundreds of extreme self-citing scientists revealed in new database". Nature. Retrieved 26 November 2021.
  2. ^ Falagas, ME; Kouranos, VD; Arencibia-Jorge, R; Karageorgopoulos, DE (August 2008). "Comparison of SCImago journal rank indicator with journal impact factor". The FASEB Journal. 22 (8): 2623–8. doi:10.1096/fj.08-107938. PMID 18408168.

Article restructure[edit]

Banedon: "I would prefer to see some kind of allegation of peer review failure on MDPI's part..."

I think this is a great suggestion that would also focus the article towards objectivity. It seems a bit like the structure of the article itself is currently inviting the debates about appropriate content. If the article doesn't invite the idea that controversies in general are content deserving front-and-centre attention, then the tone overall will shift to a more neutral stance.

To re-organize the article with the current text, I might propose the following restructure:

  • 1 History
  • 1.1 Molecular Diversity Preservation International
  • 1.2 MDPI (Multidisciplinary Digital Publishing Institute)
  • 2 Accusations of predatory publishing behaviour
  • 2.1 Who's Afraid of Peer Review?
  • 2.2 Inclusion in Beall's list
  • 2.3 2014 OASPA evaluation
  • 2.4 2018 Resignation of Nutrients editors
  • 2.5 Assessments in the Nordic countries
  • 2.6 Journal citation reports and the definition of a predatory journal
  • 3 MDPI in the media
  • 3.1 List of controversial articles
  • 3.2 2016 Data breach
  • 3.3 Preferential treatment of authors from developed countries
  • 4 See also
  • 5 References

As a side note… the section on Assessments in the Nordic countries seems a bit long/overly-detailed relative to the rest of the article. Just a comment. Crawdaunt (talk) 19:47, 26 November 2021 (UTC)

Not a big fan, I'd rather have a structure similar to those of the other major publishers (though the Elsevier wiki is one big controversies article) Kenji1987 (talk) 01:13, 28 November 2021 (UTC)
Kenji1987 The Wiki of many controversial publishers includes a sprawling list of controversies (e.g. Frontiers Media, Bentham Science Publishers). It's true that other publishing groups don't have such lists, but their pages are also padded out by listing journals (e.g. PLOS, Science (journal)); MDPI has its own page for this. Perhaps the See also could be moved up to below the History section, so it is presented before the sprawling discussion of controversies? — Preceding unsigned comment added by Crawdaunt (talkcontribs)
See also sections go at the bottom, see MOS:ORDER. Headbomb {t · c · p · b} 17:48, 28 November 2021 (UTC)
Kenji1987, just returned to this. For now, why not move "2.1 Controversial Articles" to the bottom of section 2, rather than having it lead the section? It is effectively an expanding list, so moving it to the bottom avoids potential undue weight (WP:DUE) and allows it to continue in its current function with its current standards for controversial article inclusion. Thoughts? Crawdaunt (talk) 08:29, 21 December 2021 (UTC)
Yes, I am in favor. This makes more sense. Kenji1987 (talk) 08:40, 21 December 2021 (UTC)
Edit made. Crawdaunt (talk) 09:01, 21 December 2021 (UTC)

Single events[edit]

@David Eppstein, the question isn't whether we believe the single events add up to a pattern. The question is whether we have reliable sources that say that there is a pattern of discriminating against researchers in LMICs. Without sources that say this is a pattern, then labeling "One time, at one special issue of one journal, one e-mail message said" as a general problem of "Preferential treatment of authors from developed countries" would violate WP:OR. WhatamIdoing (talk) 23:00, 18 December 2021 (UTC)

If our article does not say "MDPI have a pattern of repeating this same behavior" then we do not need sources showing that they have a pattern. In many situations in life, a single incident can be so extreme that it becomes significant even if there is no evidence of it being repeated. This incident, of MDPI corporate directly interfering with editorial decisions in a journal, rises to that level for me. But more broadly, if we used this reasoning to discount every incidence of wrongdoing by every wrongdoer as "it was only that one time" even when the same wrongdoer has done many other wrong things, one time each, then we would have no wrongdoing left in the world. It's bad reasoning to say that because that precise variation of misbehavior was only documented once, it was therefore unimportant. —David Eppstein (talk) 23:05, 18 December 2021 (UTC)
It seems to me that the section heading indicates that this is a general situation, rather than a one-time event. ==Preferential treatment of authors from developed countries== sounds like a general problem, no? Something like ==Interference in planned 2020 IJERPH special issue== sounds like what we can actually document from these sources.
I don't think the slippery slope argument is appropriate. Some instances of wrongdoing by some wrongdoers get books written about them. Sometimes they get multiple independent sources writing that a given instance of wrongdoing is part of a larger pattern. This particular instance appears to have earned only a blog post by the involved parties and a blog post by RetractionWatch, which makes me wonder whether it is UNDUE per policy (even if it feels egregious per personal views of individual editors). WhatamIdoing (talk) 02:01, 20 December 2021 (UTC)
Where do you get this idea that the only thing that could possibly be problematic enough to mention are patterns of repeated behavior? Sometimes single instances of problematic behavior are so problematic that they are worth mentioning. "Preferential treatment of authors from developed countries" is merely an accurate description of what happened in this single instance. It says nothing about being repeated. The idea that that title must somehow imply that the same thing happened repeatedly is something that comes from your fixation on repetition but is not in the actual text. —David Eppstein (talk) 02:39, 20 December 2021 (UTC)
If a single event were worth mentioning in a Wikipedia article, then multiple independent reliable sources would be mentioning it in the real world. That's how WP:DUE works. WhatamIdoing (talk) 16:26, 20 December 2021 (UTC)
Disconnected from everything else, the text "Preferential treatment of authors from developed countries" might sound like it's talking about a general trend, but in this article, it's a subsection heading at the end of a long list of specific incidents, e.g., "2018 Resignation of Nutrients editors". I don't think there's an implication of repetition — certainly not one strong enough to be confusing. XOR'easter (talk) 15:38, 20 December 2021 (UTC)
I'm doubtful that this should be mentioned in the article at all (because a single INDY source generally means an UNDUE problem), but I think that if we keep it, it should have a section heading that clearly marks it as a single event.
"Preferential treatment of authors from developed countries" is probably a criticism that almost all open-access peer-reviewed journals deserve. Even before this event happened, and in all of their journals, MDPI probably deserved that criticism. But that's not because a staffer tried to meddle in this special issue. It's because researchers from poor countries and poor institutions will have a much harder time paying the article processing charges. So if you are glancing through a list of complaints, and you see a section heading that sounds like it involves widespread, systematic discrimination, you will be surprised to get to this section and discover that it is a paragraph basically about a single e-mail message. ==2018 resignation of Nutrients editors== is a good section heading for that event; I suggest that ==2020 resignation of IJERPH special issue editors== would be a better section heading for this event. WhatamIdoing (talk) 16:33, 20 December 2021 (UTC)
I think that a single source from Retraction Watch, a go-to place for coverage of sketchy behavior in academic publishing, is enough to support a single-sentence summary here. I'm open to adjusting the subsection heading; e.g., inserting the specific year would make it better parallel the preceding items. Dropping in an opaque acronym like "IJERPH" seems suboptimal to me, though. XOR'easter (talk) 16:39, 20 December 2021 (UTC)
Given the current standard of this Wiki page for inclusion, it does feel significant enough to mention (rivalling an entry in the Controversial Articles section at a minimum). There is the section on "Who's Afraid of Peer Review" which... wasn't even a controversy? And only has a single sentence. I agree with WhatamIdoing that giving it an entire subsection strains WP:DUE, given the single instance as evidence. On the flipside, MDPI's "Water" is a journal with a 2000 CHF APC (amongst the highest APCs of the MDPI journals), and in MDPI's general description of discounts and waivers MDPI says:

"For journals in fields with low levels of funding, where authors typically do not have funds available, APCs are typically waived and cross-subsidized from fields for which more APC funding is available. For authors from low- and middle-income countries, waivers or discounts may be granted on a case-by-case basis."

The fact that Water, a journal with amongst the highest APCs within MDPI, could ever have such a policy for a Special Issue is noteworthy. It also speaks to what chief editorial oversight means by "case-by-case basis", i.e. that individual journals are apparently free to run in complete opposition to the spirit of waiving charges for LMICs. It's not like MDPI headquarters spoke out or punished the chief editors over this (that I'm aware of). That does make them complicit, IMO. So that kinda tips the scales for me towards it being appropriate to have a section, re: WP:DUE. Crawdaunt (talk) 08:51, 21 December 2021 (UTC)
The corporate response, which is reported in RetractionWatch with two quotations repudiating both the individual action and the general idea, indicates that it wasn't "policy" so much as one message from one staff person, who has since been informed what the actual policy was. (I wonder what it is about the internet that makes us want to see people be publicly "punished" for making mistakes, instead of having their employees treated the way that we would want to be treated ourselves in similar situations?)
If we keep this, it needs a clearer/more specific section heading. WhatamIdoing (talk) 16:28, 21 December 2021 (UTC)
Genuinely hadn't seen the corporate response. I can't find the response or the RetractionWatch report... but that sounds like enough to say it doesn't deserve its own subsection. Can you include a link? Crawdaunt (talk) 20:15, 21 December 2021 (UTC)
Oh, I see that this is the ref added on the main page. That tweet was made in July 2020. Was there any action taken? Anything more than a tweet? I'm not sure I would count a tweet as an official response... Crawdaunt (talk) 22:15, 21 December 2021 (UTC)
Who knows? That's the kind of information we would expect to find discussed in the independent secondary sources that would prove that it's DUE, but AFAICT no such sources exist. What we have, at best, is one WP:PRIMARYNEWS source reporting, in a breaking-news fashion, that the editors announced their resignation via blog post yesterday, and that both the corporate office and the then-CEO said something today about the incident not complying with the corporate policies.
Apparently, nobody in the real world has considered this incident worth writing about since then. 100% of the known sources appeared online within the space of about 24 hours last year. In any other subject area, even if there were dozens of independent sources rather than just one, we'd call that a flash in the pan and omit it entirely. UNDUE says articles should "represent all significant viewpoints that have been published by reliable sources, in proportion to the prominence of each viewpoint in the published, reliable sources". The number of independent reliable sources that have presented information about this incident is: one. The number of sources talking about MDPI in general that have chosen to mention this incident is: zero (AFAICT). Therefore I suggest that the correct balance, "in proportion to the prominence in the published, reliable sources", is: omission.
@XOR'easter, I appreciate including the year in the subsection heading, but it still makes it sound like this was a problem affecting all of MDPI's journals, rather than one. What do you think about something like ===Fee waivers in a 2020 special issue===? WhatamIdoing (talk) 00:00, 22 December 2021 (UTC)
I'm torn overall re: WP:DUE. One thing I definitely agree on with WhatamIdoing is that the current subsection title is inappropriately general. And I'm convinced a bit by WhatamIdoing's argument that, as I also can't find any subsequent articles (even blogposts) discussing it as the main topic, it doesn't seem like more than a flash point. If it were a controversial article, I think there'd be a simple case to add it to that section. But as it's an editorial issue, I'm not really sure how to include it without violating the spirit of WP:DUE, which probably means it shouldn't be there at all. Crawdaunt (talk) 09:56, 22 December 2021 (UTC)
I have no strong feelings about the subsection heading. I thought yesterday about merging it with the previous subsection and calling them "Resignations of editors" or something like that, or of working it into one of the later, longer subsections somehow. This isn't really a WP:DUE concern for me (although someone else might find that a heading puts emphasis on a topic, and I wouldn't dispute that reaction). Really, I'm just not fond of ultra-short subsections, which read choppily to me. XOR'easter (talk) 15:24, 22 December 2021 (UTC)
Putting multiple incidents in one subsection could have multiple advantages. I'm still not convinced that this particular incident should be in the article at all, but if it's in the article, it will both read better and eliminate the original problem completely. WhatamIdoing (talk) 02:35, 26 December 2021 (UTC)
I agree with WhatamIdoing. Crawdaunt (talk) 08:54, 1 January 2022 (UTC)

Czech University of South Bohemia stance against MDPI (Jan 1st 2022)[edit]

Not making a section, but it might be relevant to the page to watch the unfolding situation at a Czech university: the University of South Bohemia Faculty of Science plans to 1) stop financial support for publishing in MDPI journals from Jan 1st 2022, 2) officially recommend against publishing in, or otherwise devoting time to reviewing for, MDPI, and 3) not guarantee that publications in MDPI journals will be taken into account in evaluations of employees and departments.

There was a huge Twitter blowup about this (many thousands of retweets), but the original post was eventually deleted by the user ("@CzSam00", a prof at the University of South Bohemia who suddenly found herself the centre of unexpected attention). Edit: official .pdf release by USB available at: Crawdaunt (talk) 20:29, 21 December 2021 (UTC)

Perhaps a pattern in Czechia, as Charles University's vice dean also has a statement warning its researchers against use of MDPI journals (see: But this is not as strong a statement as the one South Bohemia seems to be making. Crawdaunt (talk) 20:37, 21 December 2021 (UTC)
I added the statement by the University of South Bohemia - it feels borderline (since it's but one faculty of one university) but different enough from all the other controversies to include anyway. They did after all single out MDPI. The second source, by Charles University's vice dean, I did not add because it seems like they are only 'considering' further action. Banedon (talk) 03:40, 22 December 2021 (UTC)

Delete: Who's Afraid of Peer Review?[edit]

Just curious why this entry is present at all? Re: MDPI, it wasn't a controversy - MDPI rejected the fake paper. The title "Who's Afraid of Peer Review?" in a "Controversies" section might imply to a casual reader that there is something scandalous to report when there isn't... On the flipside, discussions on this Talk page have made a point of saying that a single journal cannot be extrapolated to a pattern. This subsection currently refers to one unnamed MDPI journal, which reads as if MDPI as a whole rejected the paper, rather than the lone journal.

From either perspective, this seems misleading in its current form. Should this section just be deleted? — Preceding unsigned comment added by Crawdaunt (talkcontribs)

It seems to be based entirely on a single line in the supplemental spreadsheet, so if there are any WP:DUE concerns, they'd be with that subsection IMO. XOR'easter (talk) 15:38, 22 December 2021 (UTC)
Who's Afraid was a basic test, and MDPI passed it. Omitting it makes MDPI look worse than it is. Headbomb {t · c · p · b} 15:54, 22 December 2021 (UTC)
So basically its purpose on the page, in the controversies section, is to act as counter-balance? Crawdaunt (talk) 19:16, 22 December 2021 (UTC)
Well, let's be honest. There is no room for MDPI articles that were in the news in a positive spotlight here. The article is already far from balanced. Of the 300,000+ articles MDPI publishes every year, hardly 10-20 have had some issues, and they are all mentioned here (I even added one or two examples). That is very comparable with other publishers. Thus, if you want to argue that MDPI has some "predatory" practices, also add information proving the opposite. It passed a test that more established publishers failed, most of its journals are indexed by the relevant databases, it is a member of COPE/DOAJ, and some of its journals are leaders in their respective fields, etc. Kenji1987 (talk) 04:42, 30 December 2021 (UTC)
"There is no room for MDPI articles that were in the news in a positive spotlight here." Wrong. All positive coverage requires is that it is independent of MDPI. Headbomb {t · c · p · b} 05:44, 30 December 2021 (UTC)
So if we were to add, let's say, CNN articles reporting on something published in MDPI, would that be allowed here? That's what I mean by "were in the news in a positive spotlight". Kenji1987 (talk) 06:28, 30 December 2021 (UTC)
No, unless they were commenting on MDPI themselves. Headbomb {t · c · p · b} 06:54, 30 December 2021 (UTC)
So I wasn't wrong. Besides, it would make little sense to praise an article as a result of MDPI practices, as they are expected to function like any other scientific publisher. So this section will always be biased; that's the way news works. Kenji1987 (talk) 07:10, 30 December 2021 (UTC)
Stepping aside from our guidelines and the Through the Looking Glass world of science commentary, I think it is good that we have this coverage. Only reporting bad news skews coverage and traditional publishers prefer feeding their readership an imbalanced diet of mostly bad news. Mind, I don't want a rule that we report on all such hoaxes, rather this is an area where I apply our sourcing guidelines more leniently, because this kind of information at present tends to be valuable and rare. — Charles Stewart (talk) 06:10, 30 December 2021 (UTC)
There's of course a skew to reporting towards negative results, but that's because the mere statement of "this is an academic publisher indexed by..." already implies that they do their job properly most of the time. It's therefore not noteworthy to say "in general the MDPI journal Life does not publish papers causing mass ridicule." It is however noteworthy to say "In 2011, the MDPI journal Life published a paper claiming to have solved the theory of life, the universe, and everything. This was met with mass ridicule."
I would say that if the intent is to provide counter-balance to the article, then it seems exaggerated to put in a single entry for a single journal meeting the minimum bar of competence by rejecting a completely bogus paper. The source (Who's Afraid of Peer Review?) doesn't even mention the journal or MDPI in the text, and the mentality that 'one journal doesn't reflect the whole' is argued here regularly. If one wants counter-balance, emphasizing something like MDPI's membership in COPE (by adding an introductory sentence noting that MDPI is a member of COPE, how one qualifies for COPE, etc.) is a more objective entry than "here's an entire subsection dedicated to how one journal once rejected a paper that made no sense and was generated by an algorithm." Crawdaunt (talk) 07:51, 1 January 2022 (UTC)

Chinese Academy of Sciences lists multiple MDPI journals in Warning List of International Journals[edit]

Similar to the Norwegian controversies section, the Chinese Academy of Sciences has now listed various MDPI journals in its new early warning list of international journals (by my count, 16 of the 65 journals included in 2020 were MDPI, and 6 of 41 in 2021). Crudely translated: "The journal warning is not an evaluation of papers, nor is it a negative warning against every publication by the journal. Early warning journals are chosen to remind scientific researchers to carefully select publication platforms and remind publishing institutions to strengthen journal quality management." Links below, including an English-language discussion remarking on the number of MDPI, IEEE, and Hindawi journals in the list:


This seems like the Chinese parallel to "Level X" of the Norwegian Scientific Index. As this is a recommendation by a major scientific body in China, does it seem appropriate to mention on the MDPI page?

Thoughts? Crawdaunt (talk) 10:34, 1 January 2022 (UTC)

Yep, I am in favour. Can you draft up a text? Kenji1987 (talk) 13:16, 1 January 2022 (UTC)
I was actually wondering if, in an effort to reduce sprawl, it might be appropriate to create a separate section from Controversies, like "MDPI in the International Community", or to create it as a subsection within Controversies. As its own section, I'm imagining a home for, say, "MDPI as a member of international bodies like COPE", but it could also host the Norwegian critiques, this Chinese Academy of Sciences critique, and the University of South Bohemia (Czech) material; perhaps also the statement from Charles University's vice dean (Czech) explicitly recommending against publishing in MDPI journals.
Having a separate section that could be expanded in a neutral light would make more sense to me than a broad Controversies section (implicit negative connotation) that hosts various controversies; note also, in the above discussion on "Who's Afraid of Peer Review", that the Controversies section hosts non-controversies apparently included solely to help balance the article (?). I feel like much of the debate around what is and isn't appropriate re: WP:DUE arises in part because the page's current organization shunts everything into the Controversies section, and each entry is then weighed on whether it's appropriate for that section... That's my pitch for a proper separate section on "MDPI in the International Community." Crawdaunt (talk) 18:11, 1 January 2022 (UTC)

I think there is a difference between some random Swedish researcher criticizing MDPI in a student journal and a Chinese governmental body asking its researchers to slow down a bit when it comes to publishing in certain outlets. The link you provided lists the 6 MDPI journals as low risk - Hindawi's Complexity, on the other hand... (I assume you'll also modify the Hindawi wiki???). So please go ahead and add this information, along the same lines as University of South Bohemia, as a separate subsection in the controversies section. It deserves it. Kenji1987 (talk) 00:42, 2 January 2022 (UTC)

Not sure what you're referring to with Swedish? I was comparing the Norwegian Scientific Index to the Chinese Academy of Sciences. I will look to see what can/should be added to the Hindawi page, but I think a bit more research on my part is needed to present that fairly. Thanks for drawing attention to it. Sounds good re: separate subsection. I leave the idea of a separate top-level section open to discussion should anyone wish to comment. Will draft something and post it to the page (feel free to edit after). Crawdaunt (talk) 07:32, 2 January 2022 (UTC)
Addition: The Scholarly Kitchen post notes that 22 of the original 65 (2020) journals were MDPI; I previously and incorrectly counted 16. The post also notes that only four of the 65 were Hindawi. I'll have a look at the 2021 list that was published just 2 days ago, but I do think it's a false equivalence to suggest the two groups' inclusion on the list is comparable. The inclusion of a number of Frontiers Media journals on the new list is also notable, as there were none in 2020, but again only 3 of 41 (and like MDPI, their names are easy to spot, biasing my scan). I think the part that merits inclusion in the MDPI controversies section is just how prevalent its journals were in the initial list (22 of 65!), and indeed MDPI's Chinese Comms Department responded (ref now added: ). MDPI still makes up 15% of the list in 2021 (not even all the same journals), so there is a history of controversy specifically and most prominently affecting MDPI regarding this list. I hope this makes the specific motivation for inclusion in the MDPI article clear. Crawdaunt (talk) 08:17, 2 January 2022 (UTC)
Great addition! I look forward to your additions to the Hindawi and Frontiers pages as well. Perhaps you could add here that the MDPI journals were in the low-risk classification. Kenji1987 (talk) 08:29, 2 January 2022 (UTC)
Thanks :) In fact, I missed one... There are 7 MDPI journals in the 2021 list (listed in the most recent edit for posterity). That compares to 6 Hindawi and 3 Frontiers. I think I will make a note on the Hindawi page. Adding to the Frontiers page seems undue per WP:UNDUE, given they make up only 3 of 41 and were not part of the initial list. Hindawi's presence is consistent across years and grew, so that does seem warranted. Crawdaunt (talk) 08:48, 2 January 2022 (UTC)