Scientists are subject to scams just as much as anyone else. Predatory journals take advantage of the authors who submit papers and the readers who access them; predatory conferences exploit speakers and attendees. But an innovative tool called Aletheia-Probe (named after the Greek word for truth) offers a simple way to check the legitimacy of journals and conferences, so users can better assess which ones to trust.
Scientists often receive flattering e-mails inviting them to submit their work to journals and conferences that are happy to take their money in return for a shoddy service. The publications might skimp on the peer-review process or disappear after a few months; the conferences might consist of empty meeting rooms or involve off-topic talks. Researchers who are deciding which journal to submit their papers to, or whether to include a specific paper in a systematic review, can check various lists of legitimate journals (such as the Directory of Open Access Journals) or predatory ones (such as Predatory Journals), as well as databases such as CrossRef that provide more nuanced information about journals. But the information is scattered across sources, and the lists don’t always agree. Aletheia-Probe is a one-stop shop.
Andreas Florath, a cloud-computing architect at the German telecommunications company Deutsche Telekom who is based in Raeren, Belgium, created Aletheia-Probe while helping his colleagues to write a review paper on energy management. He needed to evaluate more than 350 papers. “I will not do this manually,” he recalls thinking. Like any good software developer, he let laziness prompt hard work, and he built Aletheia-Probe to automate the process.
The software collects data from a dozen databases and applies an algorithm to integrate the information. Users can download Aletheia-Probe from the code-hosting platform GitHub and run it in a text-based command-line interface (for instance, the macOS Terminal or MobaXterm on Windows). The command ‘aletheia-probe journal “Nature”’ directs the software to scrutinize that journal. Its conclusion, “Result: LEGITIMATE (confidence: 0.95)”, is offered alongside a summary of its reasoning. For other journals, it might reply “PREDATORY (confidence: 0.90)” or “INSUFFICIENT_DATA (confidence: 0.45)”. “My idea of this is something like having a virus scanner,” Florath says.
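For readers who want a feel for the workflow, a terminal session might look something like the sketch below. This is illustrative only: the second journal title is hypothetical, and everything beyond the result strings quoted above is an assumption rather than taken from the tool's documentation.

    $ aletheia-probe journal "Nature"
    Result: LEGITIMATE (confidence: 0.95)
    [summary of the evidence drawn from the databases consulted]

    $ aletheia-probe journal "Journal of Advanced Everything"    (a hypothetical title)
    Result: PREDATORY (confidence: 0.90)
    [summary of the blacklist entries and other signals behind the verdict]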
Florath uploaded a paper about the tool¹ to the preprint server arXiv on 15 January. In an ironic twist, a predatory publisher subsequently e-mailed him asking if he would publish it with them, he says.
Manoj Lalu, an anaesthesiologist and scientist at The Ottawa Hospital, is helping to develop the Publication Facts Label, a summary of the integrity of research papers and journals modelled on the nutrition-facts label found on food. He says Aletheia-Probe is “really nice in some ways, because it gives a very straightforward answer”. He also likes the fact that the software is open source. “You can see what it’s using to make its decision — that’s excellent.” But he cautions that because the tool relies in part on blacklists that don’t necessarily explain their reasoning, its output can be opaque. And not everyone is comfortable using a command-line interface, he adds.
Florath says that predatory publishing is a problem even in his original field of mathematics, in which errors can creep in even though correctness should be easy to check. “You stand on the shoulders of giants,” he says, “but currently you don’t even know if these shoulders are sound.” His biggest surprise when developing Aletheia-Probe was how many people were unaware of the problem. “They don’t need to use this [particular] tool,” he says, but they should use something. Instead of automatically assuming all journals are legitimate, Florath says, “it would be really good if people even think about checking”.
Shared from the Nature Careers website; the original article and more content are available at https://doi.org/10.1038/d41586-026-00223-6
