
Table of Contents

Publications, Volume 2, Issue 3 (September 2014), Pages 61-82

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on its "PDF Full-text" link and open it with the free Adobe Reader.

Research

Open Access Article: Measuring Scientific Misconduct—Lessons from Criminology
Publications 2014, 2(3), 61-70; doi:10.3390/publications2030061
Received: 28 February 2014 / Revised: 26 June 2014 / Accepted: 27 June 2014 / Published: 3 July 2014
PDF Full-text (191 KB) | HTML Full-text | XML Full-text
Abstract
This article draws on research traditions and insights from Criminology to elaborate on the problems associated with current practices of measuring scientific misconduct. Analyses of the number of retracted articles are shown to suffer from the fact that the distinct processes of misconduct, detection, punishment, and publication of a retraction notice all contribute to the number of retractions and, hence, will result in biased estimates. Self-report measures, as well as analyses of retractions, are additionally affected by the absence of a consistent definition of misconduct. This problem of definition is addressed further as stemming from a lack of generally valid definitions both on the level of measuring misconduct and on the level of scientific practice itself. Because science is an innovative and ever-changing endeavor, the meaning of misbehavior is permanently shifting and frequently readdressed and renegotiated within the scientific community. Quantitative approaches (i.e., statistics) alone, thus, are hardly able to accurately portray this dynamic phenomenon. It is argued that more research on the different processes and definitions associated with misconduct and its detection and sanctions is needed. The existing quantitative approaches need to be supported by qualitative research better suited to address and uncover processes of negotiation and definition.
(This article belongs to the Special Issue Misconduct in Scientific Publishing)
Open Access Article: Failure to Replicate: A Sign of Scientific Misconduct?
Publications 2014, 2(3), 71-82; doi:10.3390/publications2030071
Received: 28 February 2014 / Revised: 12 August 2014 / Accepted: 18 August 2014 / Published: 1 September 2014
PDF Full-text (235 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
Repeated failures to replicate reported experimental results could indicate scientific misconduct or simply result from unintended error. Experiments performed by one individual involving tritiated thymidine, published in two papers in Radiation Research, showed exponential killing of V79 Chinese hamster cells. Two other members of the same laboratory were unable to replicate the published results in 15 subsequent attempts to do so, finding, instead, at least 100-fold less killing and biphasic survival curves. These replication failures (which could have been anticipated based on earlier radiobiological literature) raise questions regarding the reliability of the two reports. Two unusual numerical patterns appear in the questioned individual’s data, but do not appear in control data sets from the two other laboratory members, even though the two key protocols followed by all three were identical or nearly so. This report emphasizes the importance of: (1) access to raw data that form the background of reports and grant applications; (2) knowledge of the literature in the field; and (3) the application of statistical methods to detect anomalous numerical behaviors in raw data. Furthermore, journals and granting agencies should require that authors report failures to reproduce their published results.
(This article belongs to the Special Issue Misconduct in Scientific Publishing)
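The abstract's third recommendation, applying statistical methods to detect anomalous numerical behavior in raw data, can be illustrated with one classic forensic check: terminal (last) digits of genuinely measured values tend to be approximately uniform, so a chi-square test against uniformity can flag data sets that merit closer scrutiny. The sketch below is a generic illustration of that idea, not the specific test used in the paper, and a large statistic is a prompt for further review, never proof of misconduct.

```python
from collections import Counter

def terminal_digit_chi_square(values):
    """Chi-square statistic for uniformity of terminal (last) digits.

    Terminal digits of genuinely measured data are expected to be
    roughly uniform across 0-9; strong deviations can flag numbers
    worth a closer look (they are not evidence of misconduct on
    their own).
    """
    digits = [str(v)[-1] for v in values]
    n = len(digits)
    expected = n / 10  # uniform expectation for each of the 10 digits
    counts = Counter(digits)
    return sum((counts.get(str(d), 0) - expected) ** 2 / expected
               for d in range(10))

# The critical value for df = 9 at alpha = 0.05 is about 16.92.
uniform = [i % 10 for i in range(100)]   # every digit appears 10 times
skewed = [7] * 90 + [3] * 10             # one digit heavily repeated
print(terminal_digit_chi_square(uniform))  # 0.0
print(terminal_digit_chi_square(skewed))   # 720.0, far above 16.92
```

A real analysis would also need to decide which digits are informative (leading digits follow Benford-like rather than uniform distributions, and rounded or instrument-limited values violate the uniformity assumption), so the choice of test must match how the data were generated.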

Journal Contact

MDPI AG
Publications Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
publications@mdpi.com
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18
Editorial Board
Contact Details
Submit to Publications