Image Integrity | Working Group

In scholarly publishing, we encounter both image alteration and image duplication.

Whatever the reason behind the submission of altered and/or duplicated images to a journal, they should be identified early in the article evaluation process, so that journals can take appropriate action prior to publication and, in the best case, before peer review. Unlike text plagiarism, which primarily constitutes a violation of the research process, image alteration and/or duplication can be far more damaging: it corrupts actual research results, wastes research funding on invalid leads, undermines society’s trust in research, and can even endanger the society in which those “results” are used. In addition, the rise of generative AI is expected to make image manipulation and fabrication much easier, and much harder to detect.

STM has appointed a working group to answer questions around the automatic detection of image manipulation. It addresses topics such as the minimum requirements for detection tools, their current quality, how that quality can be measured, and how these tools can be applied widely, consistently, and effectively by scholarly publishers. It also works toward a standard classification of the types and severity of image-related issues and proposes guidelines on which types of image alteration are allowable under which conditions. The working group currently operates as part of the STM Integrity Hub initiative.

For more information, contact Joris van Rossum at joris@stm-solutions.org.

Resource Center

An evolving hub of videos, tools, and guidelines for publishers working to automatically detect image alteration and/or duplication.


Meet the members of the working group