Case Report
Peer-Review Record

Enabling A Conversation Across Scholarly Monographs through Open Annotation

Publications 2019, 7(2), 41; https://doi.org/10.3390/publications7020041
by Andrea C. Bertino 1,* and Heather Staines 2
Reviewer 1:
Reviewer 2:
Submission received: 7 May 2019 / Revised: 22 May 2019 / Accepted: 28 May 2019 / Published: 8 June 2019
(This article belongs to the Special Issue New Frontiers for Openness in Scholarly Publishing)

Round 1

Reviewer 1 Report

SUMMARY

The reviewed case report discusses current examples of how open annotation of scholarly monographs enables a new kind of scholarly communication in the digital era. Annotation is definitely among the methods that have the potential to advance current models of scholarly publishing beyond mere digital mirroring of traditional publication formats. Combined with the focus on open annotation, as described in the case report, this makes the topic of this manuscript highly relevant for the theme of the journal issue – “New Frontiers for Openness in Scholarly Publishing”.


BROAD COMMENTS

The case report is well organised and clearly structured. It is well-written and largely easy to understand. Some sections could profit from some more detailed explanation for non-experts; see specific comments below.


My only major criticism is that the report could have devoted more space to discussing the main topic. Much of section 4, pages 4 to 8, is about issues other than annotation. All three case studies discussed in section 4 are interesting examples of how annotation can be used in scholarly work. However, the first two case studies (4.3.1 and 4.3.2) are more relevant for the topic (scholarly conversation across monographs) than the last case study (4.3.3), which – as I understand it – is more about technical aspects of monograph publishing. In addition to these case studies, it would be interesting to know more about any other experiences from using annotation frameworks/methods and tools like the ones discussed in the case report. This discussion could, e.g., be integrated into section 4.3.


SPECIFIC COMMENTS

Clarification/presentation issues

The following parts would profit from additional explanation:

Line 61-63: The meaning of this sentence is not quite clear to me.

Line 80; “Hypothesis”: Has this been explained previously?

Section 4.2.1: Here, I would have expected a short introduction before the bullet point list starts.

Line 251; “stable”: also “sustainable”?

Line 288-294: I think some screenshots could make this more accessible for the reader.

Line 344; “The tool, provided by Hypothesis, …”: Is the tool Hypothes.is, or based on Hypothes.is?

Line 348; “… by a non-profit organization”: Is this Hypothes.is?

Table after line 400: Not so easy to read; choose another format, e.g. plain text organised in headings and sub-headings?

Line 490: The meaning of the clause “so that became apparent emerged” is not clear to me.

Line 502; “proposed”: rather “described” or “discussed”?

Line 511-513: In addition to storage, discoverability is also an important issue, and should be mentioned here (cf. section 2.2).


References

References to further information about some of the terms and concepts mentioned in the report may be helpful for non-expert readers. Here are some examples:

Line 71: FAIR

Line 97: CLOCKS, Portico

Line 169: HSS (expand first time; cf. line 198)

Line 279: CENDARI

Line 282: DARIAH, HumaNum


Content issues

Finally some remarks about the issues and examples discussed in the case report. These remarks are probably not so much relevant for the report itself, but may be useful feedback for your work (or the work of associated partners).

Lines 85-86: What are the consequences of MDPI having their own Hypothes.is instance when it comes to interoperability with the generic instance run by Hypothes.is?

Lines 113-118: You mention DOI, but what about other PIDs?

Section 4.1: It is striking to me that all these publishing platforms use their own (proprietary?) software. Is there any discussion or efforts to collaborate on software development other than common metadata standards? Cf. e.g. the collaboration between service providers such as DMPonline and DMPTool.

Lines 267-268: Are there any other features that could be improved? I’m thinking here e.g. of associated research data.

Lines 332-340: How does this relate to standards and recommendations such as COUNTER and the Leiden Manifesto for Research Metrics?

Lines 433-435: On the other hand, wouldn’t we like the students – at least in an initial phase – to contribute with annotations that are not based on or biased by previous annotations?


Author Response

Response to Reviewer 1 Comments

The case report is well organised and clearly structured. It is well-written and largely easy to understand. Some sections could profit from some more detailed explanation for non-experts; see specific comments below.

Response: We are very grateful for your positive review and for the valuable feedback you have given us. We have improved the text, and we hope that it will now fully satisfy you.

Point 1: Much of section 4, pages 4 to 8, is about other issues than annotation. All three case studies discussed in section 4 are interesting examples of how annotation can be used in scholarly work. However, the first two case studies (4.3.1 and 4.3.2) are more relevant for the topic (scholarly conversation across monographs) than the last case study (4.3.3), which – as I understand it – is more about technical aspects of monograph publishing.

Response 1: We wanted to describe the other activities of the project to explain the technical and organizational context in which these annotation activities were conducted. As these activities are still in progress, we expect a second case report to be produced after they have been completed. We have added the following sentence: “Encouraging the annotation of Open Access monographs in order to foster conversation between scholars, between teachers and students, and between the academic community and a non-academic audience is one of the aims of the HIRMEOS project.”

Point 2: In addition to these case studies, it would be interesting to know more about any other experiences from using annotation framework/methods and tools like the ones discussed in the case report. This discussion could, e.g., be integrated into section 4.3.

Response 2: In this article we focus on the activities of the HIRMEOS project. However, there are several other experiences with the annotation tool that could be considered.


Point 3: Line 61-63: The meaning of this sentence is not quite clear to me.

Response 3: The sentence was clarified.

Point 4: Line 80; “Hypothesis”: Has this been explained previously?

Response 4: It was not; we have now added an explanation.

Point 5: Section 4.2.1: Here, I would have expected a short introduction before the bullet point list starts.

Response 5: We added: “As part of the project, a workflow was established to add the following metadata to each monograph published on the participating platforms:”

Point 6: Line 251; “stable”: also “sustainable”?

Response 6: We changed ‘practical’ to ‘sustainable.’

Point 7: Line 288-294: I think some screenshots could make this more accessible for the reader.

Response 7: We have added a reference to an article concerning the implementation of this service. The article includes many screenshots.

Point 8: Line 344; “The tool, provided by Hypothesis, …”: Is the tool Hypothes.is, or based on Hypothes.is?

Response 8: Changed to: “The Hypothesis annotation tool”.

Point 9: Line 348; “… by a non-profit organization”: Is this Hypothes.is?

Response 9: W3C. We reformulated this to read: “[...] by the Web Annotation Working Group created by the World Wide Web Consortium (W3C), a non-profit, in 2014.”

Point 10: Table after line 400: Not so easy to read; choose another format, e.g. plain text organised in headings and sub-headings?

Response 10: We removed the table.

Point 11: Line 490: The meaning of the clause “so that became apparent emerged” is not clear to me.

Response 11: The text was clarified.

Point 12: Line 502; “proposed”: rather “described” or “discussed”?

Response 12: described.

Point 13: Line 511-513: In addition to storage, discoverability is also an important issue, and should be mentioned here (cf. section 2.2).

Response 13: We added: “and are easily discoverable”.

Point 14: References to further information about some of the terms and concepts mentioned in the report may be helpful for non-expert readers. Here are some examples:

Line 71: FAIR

Line 97: CLOCKS, Portico

Line 169: HSS (expand first time; cf. line 198)

Line 279: CENDARI

Line 282: DARIAH, HumaNum   

Response 14:

We modified Line 71 as follows: “[...] mechanism to help annotations comply with the FAIR Data Principles.”

Footnote added: “CLOCKSS, or Controlled LOCKSS (Lots of Copies Keep Stuff Safe), is a shared dark archive that runs on LOCKSS technology (https://clockss.org/); PORTICO is a digital preservation service funded by libraries and publishers (https://www.portico.org/).”

Footnote added: “Collaborative European Digital Archival Research Infrastructure (www.cendari.eu)”.

Footnote added: “DARIAH, Digital Research Infrastructure for the Arts and Humanities (www.dariah.eu). HumaNum, a research infrastructure aimed at facilitating digital change in the SSH (social sciences and humanities) (www.huma-num.fr/).”

Line 169: changed HSS to SSH and clarified.

Point 15: Finally some remarks about the issues and examples discussed in the case report. These remarks are probably not so much relevant for the report itself, but may be useful feedback for your work (or the work of associated partners).

Lines 85-86: What are the consequences of MDPI having their own Hypothes.is instance when it comes to interoperability with the generic instance run by Hypothes.is?

Lines 113-118: You mention DOI, but what about other PIDs?

Section 4.1: It is striking to me that all these publishing platforms use their own (proprietary?) software. Is there any discussion or efforts to collaborate on software development other than common metadata standards? Cf. e.g. the collaboration between service providers such as DMPonline and DMPTool.

Lines 267-268: Are there any other features that could be improved? I’m thinking here e.g. of associated research data.

Lines 332-340: How does this relate to standards and recommendations such as COUNTER and the Leiden Manifesto for Research Metrics?

Lines 433-435: On the other hand, wouldn’t we like the students – at least in an initial phase – to contribute with annotations that are not based on or biased by previous annotations?

Response 15: Thank you very much for these thoughts that we will take into account in the follow-up to this report. Here are a few brief answers to some of your observations:

The platforms also take the ISBN into account for the metrics service.

The platforms run on open source software, not proprietary software.

COUNTER (amongst other standards) is included in the metrics service developed in the framework of HIRMEOS.

If data can be selected in the browser with a stable URL, then it can be annotated.

We intend to collect feedback from the students involved in the activity in order to understand the best way to use this tool for commenting on monographs in a seminar. In this context, we plan to publish a second report.


Reviewer 2 Report

This paper proposes that the entire monograph publishing process can evolve by opening up the processes by which audiences and reviewers engage with a text. It is an interesting proposition, and the authors make their point well by sharing the experiments that HIRMEOS has completed with the Hypothesis open annotation tool. The report’s preliminary findings are important for anyone interested in scholarly publishing and for those involved in monograph publishing in particular. 

The main strength of the paper is its grasp of the importance of peer review as a tool for engaging with a text and its applications in the modern publishing landscape. Their section on needs and expectations for annotating monographs was particularly insightful and clearly outlined what open annotation software needs to succeed.

I was also impressed by the amount of detail that the paper was able to portray in its case studies of Hypothesis’ use in three academic settings: in a classroom, for managing peer review, and as a quality checking tool. These sections were truly the highlight of the paper and displayed how open annotation can improve the traditional publishing process in many ways.

While there were several things I enjoyed about the paper, I also had some concerns about its content.

On page 2, line 49, the authors begin to explain where open annotation could lead in the future before any discussion of current trends or tools used for this purpose. I would recommend introducing Hypothesis here and perhaps one or two examples of other web annotation tools. This will give the readers a good understanding of the current landscape of open annotation before the authors delve into where open annotation could go in the future. It would also be useful to add this here since discussions of Hypothesis begin in section 2.1 without a proper introduction to the software before that point. The paragraph introducing Hypothesis on page 8, line 343, could be moved to page 2, line 49 for this purpose.

Besides this, I only had a few minor concerns about the paper:

On page 2, line 49, the authors begin a sentence with “In coming years, browsers will…” This statement has no citations to back up its claim that the future it is envisioning is certain. The authors could address this problem by either starting the sentence with “It is likely that…” or by adding references to papers that also make this claim.

Sections 4.2.1-4.2.4 do not seem essential to the argument being presented and could be edited down for length. Their content is interesting but not necessarily vital to the report.

The table on page 10 is difficult to read and is better represented by the bulleted lists that follow it. I would recommend either editing the table to better reflect the content within the bulleted lists or removing the table altogether.

Despite these concerns, I believe that this paper is an excellent piece of work on an often-disregarded aspect of scholarly publishing and a joy to read.

Author Response

Response to Reviewer 2 Comments

The report’s preliminary findings are important for anyone interested in scholarly publishing and for those involved in monograph publishing in particular. The main strength of the paper is its grasp of the importance of peer review as a tool for engaging with a text and its applications in the modern publishing landscape. Their section on needs and expectations for annotating monographs was particularly insightful and clearly outlined what open annotation software needs to succeed. I was also impressed by the amount of detail that the paper was able to portray in its case studies of Hypothesis’ use in three academic settings: in a classroom, for managing peer review, and as a quality checking tool. These sections were truly the highlight of the paper and displayed how open annotation can improve the traditional publishing process in many ways.

Response: We are very grateful for your positive review and for the valuable feedback you have given us. We have improved the text, and we hope that it will now fully satisfy you.


Point 1: On page 2, line 49, the authors begin to explain where open annotation could lead in the future before any discussion of current trends or tools used for this purpose. I would recommend introducing Hypothesis here and perhaps one or two examples of other web annotation tools. This will give the readers a good understanding of the current landscape of open annotation before the authors delve into where open annotation could go in the future. It would also be useful to add this here since discussions of Hypothesis begin in section 2.1 without a proper introduction to the software before that point. The paragraph introducing Hypothesis on page 8, line 343, could be moved to page 2, line 49 for this purpose.

Response 1: We added some information about other annotation tools and now introduce the Hypothesis tool earlier.


Point 2: On page 2, line 49, the authors begin a sentence with “In coming years, browsers will…” This statement has no citations to back up its claim that the future it is envisioning is certain. The authors could address this problem by either starting the sentence with “It is likely that…” or by adding references to papers that also make this claim.

Response 2: The sentence was reformulated.

Point 3: Sections 4.2.1-4.2.4 do not seem essential to the argument being presented and could be edited down for length. Their content is interesting but not necessarily vital to the report.

Response 3: We wanted to describe the other activities of the project in order to better clarify the technical and organizational framework in which these annotation activities were conducted. As these activities are still in progress, we expect a second case report to be produced after they have been completed. We have added the following sentence: “Encouraging the annotation of Open Access monographs in order to foster conversation between scholars, between teachers and students, and between the academic community and a non-academic audience is one of the aims of the HIRMEOS project.”

Point 4: The table on page 10 is difficult to read and is better represented by the bulleted lists that follow it. I would recommend either editing the table to better reflect the content within the bulleted lists or removing the table altogether.

Response 4: We removed the table.

Point 5: Despite these concerns, I believe that this paper is an excellent piece of work on an often-disregarded aspect of scholarly publishing and a joy to read.

Response: Thank you very much!

