Article

Reviewer Experience vs. Expertise: Which Matters More for Good Course Reviews in Online Learning?

1 Business School of Sport, Beijing Sport University, Beijing 100084, China
2 The Key Laboratory of Rich-Media Knowledge Organization and Service of Digital Publishing Content, Institute of Scientific and Technical Information of China, Beijing 100036, China
3 Lazaridis School of Business & Economics, Wilfrid Laurier University, Waterloo, ON N2L 3C5, Canada
4 Department of Finance and Management Science, University of Saskatchewan, Saskatoon, SK S7N 2A5, Canada
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(21), 12230; https://doi.org/10.3390/su132112230
Submission received: 5 October 2021 / Revised: 2 November 2021 / Accepted: 3 November 2021 / Published: 5 November 2021
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

With a surging number of online courses on MOOC (Massive Open Online Course) platforms, online learners face increasing difficulties in choosing which courses to take. Online course reviews posted by previous learners provide valuable information for prospective learners to make informed course selections. This research investigates the effects of reviewer experience and expertise on reviewer competence in contributing high-quality and helpful reviews for online courses. The empirical study of 39,114 online reviews from 3276 online courses on a leading MOOC platform in China reveals that both reviewer experience and expertise positively affect reviewer competence in contributing helpful reviews. In particular, the effect of reviewer expertise on reviewer competence in contributing helpful reviews is much more prominent than that of reviewer experience. Reviewer experience and expertise do not interact in enhancing reviewer competence. The analysis also reveals distinct groups of reviewers. Specifically, reviewers with low expertise and low experience contribute the majority of the reviews; reviewers with high expertise and high experience are rare, accounting for a small portion of the reviews; the rest of the reviews are from reviewers with high expertise, but low experience, or those with low expertise, but high experience. Our work offers a new analytical approach to online learning and online review literature by considering reviewer experience and expertise as reviewer competence dimensions. The results suggest the necessity of focusing on reviewer expertise, instead of reviewer experience, in choosing and recommending reviewers for online courses.

1. Introduction

The proliferation of online learning in the past decade and the surge of Massive Open Online Courses (MOOCs) worldwide since 2012 have significantly expanded the accessibility of professional education and higher education to the general population [1,2]. Online learning offers non-formal and informal learning opportunities to learners outside the formal education settings [3,4]. Compared with traditional courses in face-to-face classroom environments, online courses provide a structured learning format with the advantages of convenience, flexibility, and economic value. The acceptance and popularity of online courses have increased significantly in recent years [2].
However, with a large number of online courses available on MOOC platforms and the flexibility to take courses according to learners’ interests without a set curriculum, prospective learners face difficulties in selecting courses. This difficulty is further amplified by the varying quality of online courses, the complexity of assessing that quality, and the limited information and expertise of prospective learners in selecting courses [5]. To provide prospective learners with useful information and to enable the sharing of course experience among learners, MOOC platforms commonly maintain a course review system that allows learners to post course reviews [6,7,8]. Learner reviews usually consist of a numeric rating and a short free-style textual comment expressing learner opinions on a course.
Learners’ course reviews on MOOC platforms are a new and unique type of learner feedback, distinct from the learning feedback, such as peer feedback and course evaluations, traditionally studied in the education literature [9,10]. Peer feedback is provided among learners during the learning process, with the clear goal of improving learner performance [10,11]. Course evaluations are solicited confidentially by educational institutions for instructor and course assessment and follow a structured information format. In contrast, learners’ course reviews on MOOC platforms are voluntarily contributed by learners for opinion sharing. They are publicly available in a prominent section of a course webpage to complement the course description.
Online course reviews are a valuable source of information for prospective learners to gain insights into courses and make informed course selections. Therefore, identifying and soliciting high-quality and helpful course reviews is important for online course providers and online learning platforms. To identify and recommend helpful reviews to prospective learners, course review systems on MOOC platforms incorporate a helpfulness voting mechanism that allows review readers to vote on the helpfulness of a course review [12]. The total number of helpfulness votes is used to rank the reviews of a course and facilitate readers’ access to helpful information [12]. Although effective, this approach is passive and time-consuming: because it takes time for a review to accrue helpfulness votes, recommending helpful reviews by their helpfulness votes is prone to delay and bias [13,14].
A more proactive approach is to solicit and recommend reviews from competent reviewers. Compared with the helpfulness voting mechanism, this approach provides an effective and efficient way of building a timely depository of high-quality course reviews for prospective learners. As recent course experience is deemed more relevant for prospective learners making course selection decisions, soliciting and recommending reviews from competent reviewers can help avoid delays in information sharing and improve the information-seeking experience and course selection decisions of prospective learners.
However, it is unclear what attributes make a reviewer competent in contributing high-quality and helpful reviews. Reviewer competence is an understudied topic in online education and online review literature. Without understanding key reviewer features that are associated with reviewer competence, it is hard to identify reviewers from a large number of learners on MOOC platforms to solicit and recommend course reviews.
To address this research gap, we draw from the task performance literature [15,16,17] to examine reviewer experience and expertise as distinct reviewer competence dimensions. Specifically, this research asks the following questions: (1) whether and how reviewer experience and expertise affect reviewer competence in contributing helpful course reviews in the context of online learning, and (2) whether reviewer experience and expertise interact in affecting reviewer competence.
An empirical study using a large-scale proprietary dataset from a leading MOOC platform in China reveals that both reviewer experience and expertise have significant positive impacts on reviewer competence in contributing helpful course reviews. In particular, the effect of reviewer expertise on reviewer competence is more prominent than that of reviewer experience. In addition, reviewer experience and expertise do not interact in enhancing reviewer competence. Our analysis also reveals distinct groups of reviewers. Reviewers with high experience and high expertise are scarce and contribute only a very small portion of the reviews. The majority of the reviews are posted by reviewers with low experience and low expertise. The rest are posted by reviewers with high expertise but low experience, or with high experience but low expertise.
This research makes several contributions to the literature. First, it contributes to the online learning literature by examining online course reviews as a unique type of learner feedback and studying reviewer competence in contributing helpful reviews. Online course reviews provide useful information to prospective learners in selecting online courses from a variety of courses available on online course platforms. Studying online course reviews and approaches to improve their quality and helpfulness is an important, but largely neglected component in online learning research. In addition, this study contributes to the online review literature by taking an analytical approach of studying reviewer experience and expertise as two essential dimensions of reviewer competence. This approach differs from the common approach in the online review literature that considers reviewer information as source cues [18,19,20]. Moreover, this study enriches the task performance literature [15,16,17] by studying the performance effect of experience and expertise in the context of online course reviews. The results provide evidence for the pivotal role of expertise in determining reviewer performance on online course platforms.
Our results offer practical suggestions to online course providers and online learning platforms for soliciting and recommending reviews for online courses: rather than focusing on experienced reviewers who frequently contribute reviews, priority should be placed on attracting and retaining expert reviewers, who do not contribute often.
The rest of the paper is organized as follows. Section 2 offers a brief overview of the relevant literature, covering online learning and learning feedback, the roles of experience and expertise in task performance, and online customer reviews. Section 3 presents the research framework and develops three research hypotheses. Section 4 discusses the research methodology, including the data, variables, and empirical model. Section 5 provides a descriptive data analysis. Section 6 reports the hypothesis testing results and robustness checks. Section 7 discusses theoretical implications, practical implications, limitations, and future work. Section 8 concludes.

2. Literature Review

2.1. Online Learning and Learning Feedback

Learning takes place in many different formats, such as formal, non-formal, and informal learning [3,21]. Formal learning is organized and structured, and has learning objectives [21]. Typical examples include learning through traditional educational institutions. Informal learning is not organized or intentional, and has no clear objective in its learning outcomes. It is often referred to as learning by experience [21]. Non-formal learning is an intermediate type that is often organized and has learning objectives, but is flexible in time, location, and curriculum [3,21]. MOOCs are commonly considered as non-formal learning [4] with opportunities to integrate formal, non-formal, and informal learning [3]. Traditional education research has largely focused on formal learning. Research within the informal and non-formal settings in online education is relatively new [22].
Feedback is an important component of learning and education processes. The education literature discusses several types of feedback according to their intended recipients. Feedback to learners consists of direct feedback and vicarious feedback [10]. Direct feedback refers to feedback from instructors or employers on learners’ performance, whereas vicarious feedback comes from more experienced learners and concerns their experience and the consequences of their actions [10,11]. A pedagogical technique derived from feedback is feedforward, in which learners “interpret and apply … feedback to close the performance gap and to improve their demonstration of mastery of learning objectives” ([23], p. 587). That is, while feedback addresses learners’ actual performance, feedforward addresses possible directions or strategies for learners to attain desired goals [24,25].
Feedback can also be directed from learners to instructors and education institutions through course evaluation in both traditional classroom and online learning settings [9,26]. Course evaluations are implemented to assess the quality of courses and the effectiveness of instructors. They are initiated and administrated by educational institutions and take a structured format.
Online course reviews are a unique type of learner feedback that has not been well studied in the education literature. They differ from peer feedback and peer feedforward [24] in that they are not directed to specific learning peers regarding a particular performance or learning task. They also differ from course evaluations [9,26] in that they are unstructured, free-styled, and accessible to all online users. Contributors of online course reviews may have varying motivations and purposes. Nevertheless, prospective learners can gain useful information from these reviews when assessing their course choices.

2.2. The Roles of Experience vs. Expertise in Task Performance

Expertise and experience are two major task-oriented dimensions that have been widely investigated in the literature of individual performance, including task performance and decision making [15]. Experience refers to the degree of familiarity with a subject area. It is usually obtained through repeated exposure [16]. Expertise refers to the degree of skill in and knowledge of a subject area. It can lead to task-specific superior performance [16,17].
Expertise is a complicated concept with varying definitions. From a cognitive perspective, expertise refers to the “possession of an organized body of conceptual and procedural knowledge that can be readily accessed and used with superior monitoring and self-regulation skills” ([27], p. 21). From a performance-based perspective, it refers to the optimal level at which a person is able and/or expected to perform within a specialized realm of human activity [28]. Despite the different focuses of studying expertise, it is generally agreed that expertise is a dynamic state and domain-specific [29]. An expert in one field is not, in general, able to transfer their expertise to other fields [30,31].
The effects of experience and expertise on task performance have been studied in a wide range of scenarios, with varying results reported [15,32,33,34,35]. For example, management research stresses the importance of a business owner’s experience for firm success [33], whereas accounting research reveals mixed results on the relationship between the experience and performance of accounting professionals [17]. In software development teams, expertise coordination, rather than expertise itself, has been found to affect team performance, and experience affects team efficiency but not effectiveness [34]. In the context of assurance reports, both the expertise and experience of assurance providers enhance assurance quality [35]. These varying results across task contexts and performance measures point to the context-dependent and task-specific nature of expertise and experience in determining performance outcomes.
Writing online course reviews is the specific task addressed in this research. How reviewer experience and expertise affect reviewers’ competence in contributing helpful reviews has not been analyzed in prior research and therefore warrants study.

2.3. Online Customer Reviews

Online course reviews share similarities with online customer reviews on e-commerce platforms in format and purpose; we therefore review relevant research on online customer reviews. Online customer reviews are customer voices relating their experience with products and services on digital platforms. With their proliferation, online customer reviews now serve as an important and indispensable information source for a wide range of products and services. Online platforms, such as those for e-commerce [36], hotels [37], restaurants [38], tourist destinations [39], movies [40], and online education [6], commonly maintain an online review system for users to share their opinions. Online customer reviews play crucial roles in alleviating information asymmetry, reducing uncertainty, and shaping customers’ informed purchase decisions [41]. A recent industry report indicates that 92.4% of customers use online reviews to guide most of their purchasing decisions [42].
Research on online customer reviews revolves around the central question of which factors affect perceived review helpfulness. Most research takes the perspective of readers’ information processing and applies dual-process theories, such as the Elaboration Likelihood Model (ELM), to guide the studies [43]. The ELM distinguishes central and peripheral factors that affect readers’ information evaluation. Central factors are considered in readers’ message elaboration and relate to the argument quality of a review; they include review depth/length, content sentiment, and linguistic style [43,44,45,46]. Peripheral factors are not considered in message elaboration but are used for simplified inferences on the value of the message; they include source cues available to review readers, such as reviewer gender [47], identity [48], profile image [49], experience [18,19], and expertise [20,50,51].
Little research has been conducted to profile and understand reviewers and their behavior and performance. Exceptions include Mathwick and Mosteller [52], who identified three types of reviewers, namely indifferent independents, challenge seekers, and community collaborators. These three types of reviewers share similar altruistic motives, but have different egoistic motives of contributing reviews. Indifferent independents contribute online reviews for self-expression; challenge seekers approach review contribution as a game to master and an enjoyable, solitary hobby; and community collaborators perceive reviewing as an enjoyable, socially embedded experience. To the best of our knowledge, no study has examined reviewer features tied with their competence in contributing reviews.

3. Hypotheses

Reviewer experience and expertise can affect the helpfulness of reviews reviewers contribute in two ways. First, reviewers with high experience and expertise may be more competent in generating high-quality and helpful review content. Second, when information on reviewer experience and/or expertise is presented to review readers along with review content, this information is used as source cues that affect readers’ perception of review helpfulness.
Table 1 summarizes the primary literature on the effects of reviewer experience and reviewer expertise and compares it with this study. Prior studies have focused on the source cue effects of reviewer features [19] and have not considered the review quality effects of reviewer experience and expertise. In these studies, reviewer experience is usually operationalized as the number of prior reviews by a reviewer. Reviewer expertise, depending on reviewer information presented to review readers, is measured by expertise indicators [50], the total number of helpfulness votes a reviewer has received [20,53,54], or reviewer badges [55,56].
This research studies the review quality effects of reviewer experience and expertise. Figure 1 presents the research framework of the study. This research posits the effects of reviewer experience and reviewer expertise on reviewer competence in contributing helpful reviews and also considers the interactive effect of the two.
Experience refers to the degree of familiarity with a subject area obtained through exposure [16]. Experience enables situated learning. Individuals learn a wide range of job-related contextual knowledge and form a personal working approach to address issues and complete the job through repeated exposures to an environment and performing a job [33]. Although experience does not always enhance task performance [17,35], the effect of experience on task performance is monotonic when it does [58]. Furthermore, the effect of experience on task performance tends to be more prominent when an individual is relatively new and still learning the job. The effect may decline with the tenure of experience [58].
Writing online reviews is a specific task that may benefit from reviewer experience. In the online review context, reviewer experience refers to a reviewer’s experience of producing online reviews on a specific platform [19]. As reviewers continue to write reviews, they become more familiar with review writing and with readers’ expectations and responses. The reviews written and the reader responses received accumulate into a knowledge depository that reviewers can refer back to and compare against when deliberating on a new course experience and composing a new review. Additionally, as reviewers accumulate review experience, they become more familiar with the associated cognitive process and more aware of and committed to the task. Therefore, they will be more active in reflecting on their course experience for review material. In these ways, review experience adds to reviewer competence in contributing helpful online reviews. We hypothesize:
Hypothesis 1 (H1).
Reviewer experience has a positive impact on reviewer competence in contributing helpful online reviews.
Expertise consists of multiple components, including knowledge structure and problem-solving strategies [17]. Expertise theory suggests that expertise develops through the acquisition of skills and knowledge, which is a cognitively effortful activity [58,59]. Consequently, experts command a great amount of knowledge in a specific domain, which enables superior performance. Expert knowledge includes static knowledge and task knowledge. Static knowledge can be acquired through learning, whereas task knowledge is compiled in an ongoing search for better ways to do things, such as problem solving. Expertise is field-dependent, and its components vary across contexts of interest. For example, the expertise of wine tasting includes an analytical sensory element [30], which is not a part of the expertise of professional writing [60].
Reviewer expertise describes a reviewer’s knowledge and ability to provide high-quality and helpful opinions on products/services to review readers [20,57]. Such expertise is composed of complex knowledge components and skills [61]. To produce high-quality and interesting reviews that draw attention and are helpful to readers, online course reviewers need domain knowledge of the course contents. They also need knowledge about the comparative quality of online courses, contextual knowledge for understanding and identifying important and interesting issues and perspectives, communication/writing skills with rhetorical knowledge, and social knowledge for communication [62]. Because reviewers have varying knowledge bases and skillsets, they possess different levels of expertise for the task. Reviewers with a high level of expertise will be more competent in writing valuable reviews that effectively communicate useful information to readers (potential learners) and help them make course decisions. We hypothesize:
Hypothesis 2 (H2).
Reviewer expertise has a positive impact on reviewer competence in contributing helpful online reviews.
In addition to the individual effects of experience and expertise on reviewer competence, we posit that reviewer expertise interacts with reviewer experience in enhancing reviewers’ competence in contributing helpful online reviews. Experience facilitates the cognitive simplification of job-related routines and behaviors [63], helping experts maximize their performance. In the context of online course reviews, experienced reviewers who have written many reviews have gone through the process of writing course reviews, reading peer reviews, and receiving reader responses multiple times. Compared with reviewers with little experience, experienced reviewers are familiar with the process, have previous experience to refer to, and are more aware of reader expectations and the likely responses of the target audience on the platform. This familiarity helps them better leverage their expertise in composing and communicating their opinions by identifying unique aspects and tailoring their content to the audience. We hypothesize:
Hypothesis 3 (H3).
Reviewer expertise interacts with reviewer experience in enhancing their competence in contributing helpful online reviews.

4. Methodology

4.1. Data

We obtained a proprietary dataset from a leading MOOC platform in China. As of February 2021, the platform had hosted more than 22,055 sessions of 5821 MOOCs since its launch in May 2014. Courses offered on the platform cover a wide range of topics, spanning most subjects in post-secondary education. Our dataset contains all course reviews (a total of 1,355,280) generated by learners on the platform; on average, each course received 233 course reviews. Each course review consists of a review rating, a textual comment, the reviewer’s user name, the time of posting, the session of the course taken, and the number of helpfulness votes the review has received. Figure 2 shows a screenshot of course reviews posted on the MOOC platform.
To test the effects of reviewer experience and reviewer expertise on reviewer competence in contributing helpful reviews, we constructed a dataset for analysis using the following inclusion and exclusion criteria.
Inclusion criteria:
  • All course reviews that have received at least one helpfulness vote [53].
Exclusion criteria:
  • The first course review of each reviewer, because reviewer expertise cannot be measured for it (there are no prior reviews on which to base the measure).
  • New course reviews posted within 60 days of data retrieval, to ensure a good measure of review helpfulness [64] (it takes time for an online course review to accrue helpfulness votes).
The final dataset for analysis contained 39,114 course reviews from 5216 sessions of 3276 MOOCs.
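To make the sampling procedure concrete, the following is a minimal sketch of how these criteria could be applied to a review-level table. The original analysis was conducted in Stata; this Python/pandas version, with hypothetical column names (reviewer_id, post_date, helpful_votes) and an assumed retrieval date, is an illustration only, not the authors' code.

```python
import pandas as pd

# Hypothetical review-level table; column names are illustrative only.
reviews = pd.read_csv("course_reviews.csv", parse_dates=["post_date"])
retrieval_date = pd.Timestamp("2021-02-28")  # assumed data retrieval date

# Sort each reviewer's reviews chronologically so the first one can be identified.
reviews = reviews.sort_values(["reviewer_id", "post_date"])

# Exclusion 1: drop each reviewer's first review (no prior reviews to measure expertise).
reviews = reviews[reviews.groupby("reviewer_id").cumcount() > 0]

# Exclusion 2: drop reviews posted within 60 days of data retrieval.
reviews = reviews[(retrieval_date - reviews["post_date"]).dt.days >= 60]

# Inclusion: keep reviews that received at least one helpfulness vote.
sample = reviews[reviews["helpful_votes"] >= 1]
print(len(sample))  # the paper's final sample contains 39,114 reviews
```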

4.2. Variables

Table 2 presents the definitions of the dependent variable, independent variables, and control variables in this study. The choice and operationalization of these variables are in accordance with prior studies in the online review literature [49,53,65,66,67].
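For illustration, the sketch below shows how the two key independent variables defined in Table 2 could be derived from each reviewer's posting history. It continues from the hypothetical review table loaded in the sketch of Section 4.1, computes the measures over the reviewer's full history before applying the sample restrictions, and is not the authors' actual code.

```python
import pandas as pd

# reviews: one row per review with (illustrative) columns reviewer_id, post_date, helpful_votes.
reviews = reviews.sort_values(["reviewer_id", "post_date"])

# Experience: number of reviews the reviewer posted before the focal review.
reviews["Experience"] = reviews.groupby("reviewer_id").cumcount()

# Expertise: average helpfulness votes per review over the reviewer's prior reviews.
reviews["Expertise"] = reviews.groupby("reviewer_id")["helpful_votes"].transform(
    lambda s: s.shift().expanding().mean()
)
```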

4.3. Empirical Model

The dependent variable, the helpfulness of a course review (RevHelp_i), is an over-dispersed count variable; that is, its variance is much larger than its mean. Therefore, we used a negative binomial model to investigate the impacts of reviewer experience and expertise on reviewer competence in contributing helpful course reviews [68]. The regression equation is specified in Equation (1). A natural logarithm transformation is applied to variables that are highly skewed.
Log#RevHelp_i = β0 + β1 Experience_i + β2 Expertise_i + β3 Experience_i × Expertise_i + γ α_i + δ φ_i + θ ω_i + ε_i    (1)
where the subscript i indexes a course review; Log#RevHelp_i denotes the dependent variable, i.e., the natural logarithm of review helpfulness; and reviewer experience (Experience_i) and reviewer expertise (Expertise_i) are the two key variables under study in this research.
For the control variables, we included a set of review characteristics α_i, a set of reviewer characteristics φ_i, and a set of course characteristics ω_i. The review characteristics α_i include the extremity of a course review (RevExtremity_i), the positivity of a course review (RevPositivity_i), the natural logarithm of review length (Log#RevLength_i), review score inconsistency (RevInconsist_i), and the natural logarithm of review age (Log#RevAge_i). The reviewer characteristics φ_i include the course review diversity of a reviewer (CrsDiversity_i). The course characteristics ω_i include the natural logarithm of course popularity (Log#CrsPopul_i) and learners’ satisfaction with a course (CrsSatisf_i). To capture potential effects of review posting time, we included dummy variables for the year, month, and day of the week of the review posting date. To capture heterogeneity across courses of various types and categories and across instructors from different education institutions, we included dummy variables for course types, course categories, and the college affiliations of instructors.
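As an illustration of how Equation (1) could be estimated, the sketch below fits a negative binomial regression with statsmodels. The authors used Stata 15.1, so this Python version is only a hedged sketch; the data frame df and its column names are assumptions, and the log link of the negative binomial model plays the role of the logarithm on the left-hand side of Equation (1).

```python
import numpy as np
import statsmodels.formula.api as smf

# Over-dispersion check: the variance of the count outcome far exceeds its mean.
print(df["RevHelp"].mean(), df["RevHelp"].var())

# Negative binomial regression for Equation (1). Review timing and course fixed
# effects enter as categorical dummies; skewed variables enter in logs.
formula = (
    "RevHelp ~ Experience * Expertise"
    " + RevExtremity + RevPositivity + np.log(RevLength) + RevInconsist"
    " + np.log(RevAge) + CrsDiversity + np.log(CrsPopul) + CrsSatisf"
    " + C(RevYear) + C(RevMonth) + C(RevDoM) + C(RevDoW) + C(RevHour)"
    " + C(CrsType) + C(CrsCategory) + C(CrsProvider)"
)
result = smf.negativebinomial(formula, data=df).fit()
print(result.summary())
```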

5. Descriptive Data Analysis

The data were analyzed using Stata 15.1. Table 3 presents the descriptive statistics of the dependent variable, independent variables, and control variables.
The mean of review helpfulness is 2.986, indicating that course reviews in our sample received about 3 helpfulness votes per review on average. The means of reviewer experience and reviewer expertise are 7.192 and 1.242, respectively, indicating that reviewers had, on average, posted about 7 prior reviews and received about 1 helpfulness vote per review posted. The mean of course diversity is 4.611, indicating that reviewers had, on average, posted reviews for courses in more than four categories. The means of review extremity and positivity are 0.901 and 0.945, respectively, indicating that most course reviews carried positive ratings. The mean of review length is 29.720, i.e., the average review was about 30 Chinese characters long. The mean of review score inconsistency is 0.391, indicating that course reviews were largely consistent with the average of prior review ratings. The mean of review age is 524.551 days, i.e., about 17.5 months. The mean of course popularity is 777.247; that is, each course in our sample received about 777 reviews on average. The average course satisfaction is 4.760, indicating overall high course satisfaction.
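For completeness, descriptive statistics of this kind can be reproduced in one line from the analysis data frame (assumed here to be df with the variable names above); this is an illustrative sketch, not the authors' Stata output.

```python
# Descriptive statistics in the style of Table 3 (count, mean, std, min, max).
cols = ["RevHelp", "Experience", "Expertise", "RevExtremity", "RevPositivity",
        "RevLength", "RevInconsist", "RevAge", "CrsDiversity", "CrsPopul", "CrsSatisf"]
print(df[cols].describe().T[["count", "mean", "std", "min", "max"]].round(3))
```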
To examine the distributions of reviewer experience and expertise, we plotted the two variables against each other in Figure 3. The plot shows an interesting inverse pattern: reviewers with high experience often had low expertise, and vice versa. Reviewers with both high experience and high expertise were rare. The majority of reviews were produced by reviewers with both low experience and low expertise.

6. Results and Analysis

6.1. The Effects of Reviewer Experience and Reviewer Expertise on Review Helpfulness

Table 4 presents estimation results of the negative binomial regression specified in Equation (1). Model (1) reports the estimation results with independent variables and control variables. Model (2) reports the estimation results with independent variables, the interaction term, and control variables. Models (3) and (4) are for robustness checks.
As shown in Model (1), both reviewer experience and reviewer expertise have significant positive effects on reviewer competence in contributing helpful reviews. H1 and H2 are thus supported. A closer look at the parameter estimates of the two factors indicates that the effect of reviewer expertise (β = 0.011) is much more prominent than that of reviewer experience (β = 0.001); reviewer expertise matters more than reviewer experience for writing helpful online course reviews. As the estimate of the interaction term of reviewer experience and reviewer expertise is not significant, H3 is not supported. That is, contrary to our expectation, experience does not help reviewers leverage their expertise to enhance their competence in contributing helpful online course reviews. Figure 4 shows the research model with the estimated coefficients of the relationships.

6.2. Robustness Checks

We examined the robustness of our major findings with a series of tests. First, the main sample excluded reviews with no helpfulness vote; for a robustness check, we included reviews with no vote and reran the test. Second, the main analysis controlled for numerical features of reviews, such as review extremity, review positivity, and review length. These review features could be considered an integral part of review quality, whose overall effect may be predicted by reviewer characteristics. Thus, for a robustness check, we removed the review feature variables and reran the test; the result is presented in Model (3) of Table 4. The estimation results of these tests are consistent with those of the main analysis.
Additionally, we performed an analysis using an alternative operationalization of reviewer experience and reviewer expertise. We classified reviewers into four groups using the 80th quantiles of reviewer experience and reviewer expertise. The four groups of high–high (i.e., experience–expertise), high–low, low–high, and low–low constituted 3.30%, 8.39%, 18.04%, and 70.27% of the reviews in our sample, respectively. Dummy variables representing these groups were used to rerun the analysis, and the estimation results are reported in Model (4) of Table 4. The estimates for the high–low, low–high, and high–high groups were significantly positive, with the groups involving high reviewer expertise (i.e., low–high and high–high) being more significant (lower p values) than the group with high reviewer experience only. These results are consistent with those of the main analysis: reviewer experience and expertise both enhance reviewer competence in contributing helpful online course reviews, but the effect of reviewer expertise is more pronounced than that of reviewer experience.
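A sketch of this alternative grouping is shown below, reading the cutoff as the 80th percentile of each dimension; the data frame df, the column names, and the exact threshold are assumptions for illustration rather than the authors' procedure.

```python
import numpy as np

# Split reviews into four groups at the (assumed) 80th percentile of each dimension.
exp_cut = df["Experience"].quantile(0.80)
expt_cut = df["Expertise"].quantile(0.80)

high_exp = df["Experience"] > exp_cut
high_expt = df["Expertise"] > expt_cut

df["Group"] = np.select(
    [high_exp & high_expt, high_exp & ~high_expt, ~high_exp & high_expt],
    ["high-high", "high-low", "low-high"],
    default="low-low",
)
# Share of reviews per group; compare with 3.30%, 8.39%, 18.04%, and 70.27%.
print(df["Group"].value_counts(normalize=True))
```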

7. Discussion

7.1. Theoretical Implications

This research makes several theoretical contributions to the literature. First, it contributes to the online learning literature by examining online course reviews as a unique type of feedback information that helps prospective learners to gain insights into online courses. Online course reviews are different from peer feedback and course evaluations traditionally studied in the education literature. While the online course review mechanism is widely implemented on MOOC platforms and online reviews have become an important information source for prospective learners, little research has been conducted on this unique type of learner feedback. Investigating and understanding online course reviews can significantly enhance our understanding of learners’ information needs and decision making in the searching and selection of a course.
Second, this research contributes to the online review literature by taking an analytical approach to study a fundamental question of reviewer competence through their experience and expertise. Reviewer experience and expertise can affect review helpfulness in two ways. On the one hand, they can serve as dimensions of reviewer competence in generating high-quality and helpful review content. On the other hand, when provided to readers along with review content, information on reviewer experience and expertise can act as source cues for simplified inferences on the value of a review. Prior research on online consumer reviews has focused on the source cue effects of reviewer features [19]. This research focuses on the review quality effects of reviewer features. The results reveal the pivotal role of expertise in determining reviewer performance.
Furthermore, this research enriches the task performance literature by studying the performance effects of experience and expertise in the online course review context. Previous studies on experience and expertise report varying effects across contexts, which suggests the need to study specific task scenarios for insights. Writing reviews for online courses is a specific task that has not been investigated in the previous literature. By linking the online review literature and the task performance literature, we studied reviewer experience and expertise in the context of online course reviews. The finding that the two do not interact is a novel addition to the literature.

7.2. Practical Implications

Our results provide useful implications for online course platforms in soliciting and recommending reviews for online courses. Identifying the right reviewers from whom to solicit and recommend online course reviews can be an efficient and effective way to build a timely course review depository that satisfies the information needs of prospective learners. This proactive approach can avoid the delay associated with helpfulness votes and promote quality information. In particular, the results of this study suggest that, rather than targeting experienced reviewers who frequently contribute reviews, priority should be placed on attracting and retaining expert reviewers, who do not contribute often.

7.3. Limitations and Further Research

Three limitations of this study could inform further research. First, expertise is a complex construct with multiple dimensions. In this study, reviewer expertise is operationalized as a review performance measure, namely the average number of helpfulness votes per review that a reviewer received before posting the review under study. Additional dimensions of expertise, such as learners’/reviewers’ course performance (as a discipline knowledge component of expertise), could be considered, and a composite index could be constructed for analysis. Second, the effects of reviewer experience and expertise on their competence in contributing helpful reviews may be heterogeneous across reviewers with different characteristics. Because limited reviewer information is available, such potential variations could not be explored in this research; future research may examine them further using experimental and/or survey methods. Third, online course reviews are a unique type of learner feedback. This study examines online course reviews in terms of their helpfulness to readers and reviewer competence in contributing helpful reviews; it does not study them from a learner feedback perspective within non-formal learning situations and processes. Relating online course reviews to other types of learner feedback and examining them as an integral part of the entire learning process may yield further insights into review contribution motivations and quality.

8. Conclusions

As the number of online courses available on MOOC platforms and other online learning platforms has surged in the past decade, prospective learners face increasing difficulties in assessing course quality and suitability and in making informed course selection decisions. Online course reviews are a valuable information source for prospective learners to gain insights into online courses. While a large body of literature has studied online product/service reviews, online course reviews have received scarce attention.
This research sheds light on how reviewer experience and expertise affect reviewer competence in contributing helpful online course reviews, a crucial question that has not been previously examined. The empirical study of 39,114 online reviews of 3276 online courses reveals that both reviewer experience and expertise positively affect reviewer competence in contributing helpful reviews. Specifically, reviewer expertise is a more significant factor than reviewer experience in influencing review performance. In addition, the two dimensions do not interact in enhancing reviewer competence.
Our analysis also reveals distinct groups of reviewers. Reviewers with both high experience and high expertise are scarce and contribute a small portion of the reviews. The majority of the reviews are posted by reviewers with low experience and low expertise. The rest are from reviewers with high expertise but low experience, or with high experience but low expertise. Overall, our work pioneers the study of the antecedents of the quality and helpfulness of online course reviews, especially from the perspective of reviewer competence.

Author Contributions

Conceptualization, Z.D., F.W. and S.W.; methodology, Z.D.; software, Z.D.; validation, Z.D.; formal analysis, Z.D.; investigation, Z.D.; resources, Z.D.; data curation, Z.D.; writing—original draft preparation, F.W. and Z.D.; writing—review and editing, F.W., Z.D. and S.W.; visualization, Z.D.; supervision, Z.D. and F.W.; project administration, Z.D. and F.W.; funding acquisition, Z.D. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was funded by the National Natural Science Foundation of China (71901030), the Fund of the Key Laboratory of Rich-Media Knowledge Organization and Service of Digital Publishing Content (ZD2020/09-07), and the Fundamental Research Funds for the Beijing Sport University (2020048).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Almatrafi, O.; Johri, A.; Rangwala, H. Needle in a haystack: Identifying learner posts that require urgent response in MOOC discussion forums. Comput. Educ. 2018, 118, 1–9. [Google Scholar] [CrossRef]
  2. Mandernach, B.J.; Dailey-Hebert, A.; Donnelli-Sallee, E. Frequency and time investment of instructors’ participation in threaded discussions in the online classroom. J. Interact. Online Learn. 2007, 6, 1–9. [Google Scholar]
  3. Cha, H.; So, H.J. Integration of formal, non-formal and informal learning through MOOCs. In Radical Solutions and Open Science: An Open Approach to Boost Higher Education; Burgos, D., Ed.; Springer: Singapore, 2020; pp. 135–158. [Google Scholar]
  4. Gutiérrez-Santiuste, E.; Gámiz-Sánchez, V.M.; Gutiérrez-Pérez, J. MOOC & B-learning: Students’ Barriers and Satisfaction in Formal and Non-formal Learning Environments. J. Interact. Online Learn. 2015, 13, 88–111. [Google Scholar]
  5. Harvey, L.; Green, D. Defining quality. Assess. Eval. High. Educ. 1993, 18, 9–34. [Google Scholar] [CrossRef]
  6. Geng, S.; Niu, B.; Feng, Y.; Huang, M. Understanding the focal points and sentiment of learners in MOOC reviews: A machine learning and SC-LIWC-based approach. Br. J. Educ. Technol. 2020, 51, 1785–1803. [Google Scholar] [CrossRef]
  7. Fan, J.; Jiang, Y.; Liu, Y.; Zhou, Y. Interpretable MOOC recommendation: A multi-attention network for personalized learning behavior analysis. Internet Res. 2021, in press. [Google Scholar] [CrossRef]
  8. Li, L.; Johnson, J.; Aarhus, W.; Shah, D. Key factors in MOOC pedagogy based on NLP sentiment analysis of learner reviews: What makes a hit. Comput. Educ. 2021, 176, 104354. [Google Scholar] [CrossRef]
  9. Cunningham-Nelson, S.; Laundon, M.; Cathcart, A. Beyond satisfaction scores: Visualising student comments for whole-of-course evaluation. Assess. Eval. High. Educ. 2021, 46, 685–700. [Google Scholar] [CrossRef]
  10. Decius, J.; Schaper, N.; Seifert, A. Informal workplace learning: Development and validation of a measure. Hum. Resour. Dev. Q. 2019, 30, 495–535. [Google Scholar] [CrossRef] [Green Version]
  11. Morris, R.; Perry, T.; Wardle, L. Formative assessment and feedback for learning in higher education: A systematic review. Rev. Educ. 2021, 9, e3292. [Google Scholar] [CrossRef]
  12. Deng, W.; Yi, M.; Lu, Y. Vote or not? How various information cues affect helpfulness voting of online reviews. Online Inf. Rev. 2020, 44, 787–803. [Google Scholar] [CrossRef]
  13. Yin, D.; Mitra, S.; Zhang, H. When Do Consumers value positive vs. negative reviews? An empirical investigation of confirmation bias in online word of mouth. Inf. Syst. Res. 2016, 27, 131–144. [Google Scholar] [CrossRef]
  14. Wu, P.F. In search of negativity bias: An empirical study of perceived helpfulness of online reviews. Psychol. Mark. 2013, 30, 971–984. [Google Scholar] [CrossRef] [Green Version]
  15. Jacoby, J.; Troutman, T.; Kuss, A.; Mazursky, D. Experience and expertise in complex decision making. ACR North Am. Adv. 1986, 13, 469–472. [Google Scholar]
  16. Braunsberger, K.; Munch, J.M. Source expertise versus experience effects in hospital advertising. J. Serv. Mark. 1998, 12, 23–38. [Google Scholar] [CrossRef]
  17. Bonner, S.E.; Lewis, B.L. Determinants of auditor expertise. J. Account. Res. 1990, 28, 1–20. [Google Scholar] [CrossRef]
  18. Banerjee, S.; Bhattacharyya, S.; Bose, I. Whose online reviews to trust? Understanding reviewer trustworthiness and its impact on business. Decis. Support Syst. 2017, 96, 17–26. [Google Scholar] [CrossRef]
  19. Huang, A.H.; Chen, K.; Yen, D.C.; Tran, T.P. A study of factors that contribute to online review helpfulness. Comput. Hum. Behav. 2015, 48, 17–27. [Google Scholar] [CrossRef]
  20. Han, M. Examining the Effect of Reviewer Expertise and Personality on Reviewer Satisfaction: An Empirical Study of TripAdvisor. Comput. Hum. Behav. 2021, 114, 106567. [Google Scholar] [CrossRef]
  21. Werquin, P. Recognising Non-Formal and Informal Learning: Outcomes, Policies and Practices; OECD Publishing: Berlin, Germany, 2010. [Google Scholar]
  22. O’Riordan, T.; Millard, D.E.; Schulz, J. Is critical thinking happening? Testing content analysis schemes applied to MOOC discussion forums. Comput. Appl. Eng. Educ. 2021, 29, 690–709. [Google Scholar] [CrossRef]
  23. Faulconer, E.; Griffith, J.C.; Frank, H. If at first you do not succeed: Student behavior when provided feedforward with multiple trials for online summative assessments. Teach. High. Educ. 2021, 26, 586–601. [Google Scholar] [CrossRef]
  24. Latifi, S.; Gierl, M. Automated scoring of junior and senior high essays using Coh-Metrix features: Implications for large-scale language testing. Language Testing 2021, 38, 62–85. [Google Scholar] [CrossRef]
  25. Noroozi, O.; Hatami, J.; Latifi, S.; Fardanesh, H. The effects of argumentation training in online peer feedback environment on process and outcomes of learning. J. Educ. Sci. 2019, 26, 71–88. [Google Scholar]
  26. Alturkistani, A.; Lam, C.; Foley, K.; Stenfors, T.; Blum, E.R.; Van Velthoven, M.H.; Meinert, E. Massive open online course evaluation methods: Systematic review. J. Med. Internet Res. 2020, 22, e13851. [Google Scholar] [CrossRef]
  27. Glaser, R.; Chi, M.T.; Farr, M.J. (Eds.) The Nature of Expertise; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 1988. [Google Scholar]
  28. Torraco, R.J.; Swanson, R.A. The Strategic Roles of Human Resource Development. Hum. Resour. Plan. 1995, 18, 10–21. [Google Scholar]
  29. Herling, R.W. Operational definitions of expertise and competence. Adv. Dev. Hum. Resour. 2000, 2, 8–21. [Google Scholar] [CrossRef]
  30. Honoré-Chedozeau, C.; Desmas, M.; Ballester, J.; Parr, W.V.; Chollet, S. Representation of wine and beer: Influence of expertise. Curr. Opin. Food Sci. 2019, 27, 104–114. [Google Scholar] [CrossRef]
  31. Leone, L.; Desimoni, M.; Chirumbolo, A. Interest and expertise moderate the relationship between right–wing attitudes, ideological self–placement and voting. Eur. J. Personal. 2014, 28, 2–13. [Google Scholar] [CrossRef]
  32. Reuber, A.R.; Fischer, E.M. Entrepreneurs’ experience, expertise, and the performance of technology-based firms. IEEE Trans. Eng. Manag. 1994, 41, 365–374. [Google Scholar] [CrossRef]
  33. Reuber, R. Management experience and management expertise. Decis. Support Syst. 1997, 21, 51–60. [Google Scholar] [CrossRef]
  34. Faraj, S.; Sproull, L. Coordinating expertise in software development teams. Manag. Sci. 2000, 46, 1554–1568. [Google Scholar] [CrossRef]
  35. Martínez-Ferrero, J.; García-Sánchez, I.M.; Ruiz-Barbadillo, E. The quality of sustainability assurance reports: The expertise and experience of assurance providers as determinants. Bus. Strategy Environ. 2018, 27, 1181–1196. [Google Scholar] [CrossRef]
  36. Craciun, G.; Zhou, W.; Shan, Z. Discrete emotions effects on electronic word-of-mouth helpfulness: The moderating role of reviewer gender and contextual emotional tone. Decis. Support Syst. 2020, 130, 113226. [Google Scholar] [CrossRef]
  37. Shin, S.; Du, Q.; Ma, Y.; Fan, W.; Xiang, Z. Moderating Effects of Rating on Text and Helpfulness in Online Hotel Reviews: An Analytical Approach. J. Hosp. Mark. Manag. 2021, 30, 159–177. [Google Scholar]
  38. Li, H.; Qi, R.; Liu, H.; Meng, F.; Zhang, Z. Can time soften your opinion? The influence of consumer experience: Valence and review device type on restaurant evaluation. Int. J. Hosp. Manag. 2021, 92, 102729. [Google Scholar] [CrossRef]
  39. Bigné, E.; Zanfardini, M.; Andreu, L. How online reviews of destination responsibility influence tourists’ evaluations: An exploratory study of mountain tourism. J. Sustain. Tour. 2020, 28, 686–704. [Google Scholar] [CrossRef]
  40. Deng, T. Investigating the effects of textual reviews from consumers and critics on movie sales. Online Inf. Rev. 2020, 44, 1245–1265. [Google Scholar] [CrossRef]
  41. Ismagilova, E.; Dwivedi, Y.K.; Slade, E.; Williams, M.D. Electronic Word of Mouth (eWoM) in The Marketing Context; Springer: New York, NY, USA, 2017. [Google Scholar]
  42. Stats Proving the Value of Customer Reviews, According to ReviewTrackers. Available online: https://www.reviewtrackers.com/reports/customer-reviews-stats (accessed on 2 November 2021).
  43. Wang, F.; Karimi, S. This product works well (for me): The impact of first-person singular pronouns on online review helpfulness. J. Bus. Res. 2019, 104, 283–294. [Google Scholar] [CrossRef]
  44. Berger, J.A.; Katherine, K.L. What makes online content viral? J. Mark. Res. 2012, 49, 192–205. [Google Scholar] [CrossRef] [Green Version]
  45. Mudambi, S.M.; Schuff, D. What makes a helpful online review? A study of customer reviews on Amazon.com. MIS Q. 2010, 34, 185–200. [Google Scholar] [CrossRef] [Green Version]
  46. Salehan, M.; Kim, D.J. Predicting the performance of online consumer reviews: A sentiment mining approach to big data analytics. Decis. Support Syst. 2016, 81, 30–40. [Google Scholar] [CrossRef]
  47. Craciun, G.; Moore, K. Credibility of negative online product reviews: Reviewer gender, reputation and emotion effects. Comput. Hum. Behav. 2019, 97, 104–115. [Google Scholar] [CrossRef]
  48. Forman, C.; Ghose, A.; Wiesenfeld, B. Examining the relationship between reviews and sales: The role of reviewer identity disclosure in electronic markets. Inf. Syst. Res. 2008, 19, 291–313. [Google Scholar] [CrossRef]
  49. Karimi, S.; Wang, F. Online review helpfulness: Impact of reviewer profile image. Decis. Support Syst. 2017, 96, 39–48. [Google Scholar] [CrossRef]
  50. Naujoks, A.; Benkenstein, M. Who is behind the message? The power of expert reviews on eWoM platforms. Electron. Commer. Res. Appl. 2020, 44, 101015. [Google Scholar] [CrossRef]
  51. Wu, X.; Jin, L.; Xu, Q. Expertise makes perfect: How the variance of a reviewer’s historical ratings influences the persuasiveness of online reviews. J. Retail. 2021, 97, 238–250. [Google Scholar] [CrossRef]
  52. Mathwick, C.; Mosteller, J. Online reviewer engagement: A typology based on reviewer motivations. J. Serv. Res. 2017, 20, 204–218. [Google Scholar] [CrossRef]
  53. Choi, H.S.; Leon, S. An empirical investigation of online review helpfulness: A big data perspective. Decis. Support Syst. 2020, 139, 113403. [Google Scholar] [CrossRef]
  54. Lee, M.; Jeong, M.; Lee, J. Roles of negative emotions in customers’ perceived helpfulness of hotel reviews on a user-generated review website: A text mining approach. Int. J. Contemp. Hosp. Manag. 2017, 29, 762–783. [Google Scholar] [CrossRef]
  55. Baek, H.; Ahn, J.H.; Choi, Y. Helpfulness of online consumer reviews: Readers’ objectives and review cues. Int. J. Electron. Commer. 2012, 17, 99–126. [Google Scholar] [CrossRef]
  56. Zhu, L.; Yin, G.; He, W. Is this opinion leader’s review useful? Peripheral cues for online review helpfulness. J. Electron. Commer. Res. 2014, 15, 267–280. [Google Scholar]
  57. Racherla, P.; Friske, W. Perceived ‘usefulness’ of online consumer reviews: An exploratory investigation across three services categories. Electron. Commer. Res. Appl. 2012, 11, 548–559. [Google Scholar] [CrossRef]
  58. Schmidt, F.L.; Hunter, J. General mental ability in the world of work: Occupational attainment and job performance. J. Personal. Soc. Psychol. 2004, 86, 162–173. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  59. Boshuizen, H.P.; Gruber, H.; Strasser, J. Knowledge restructuring through case processing: The key to generalise expertise development theory across domains? Educ. Res. Rev. 2020, 29, 100310. [Google Scholar] [CrossRef]
  60. Corrigan, J.A.; Slomp, D.H. Articulating a sociocognitive construct of writing expertise for the digital age. J. Writ. Anal. 2021, 5, 142–195. [Google Scholar]
  61. Schriver, K. What We Know about Expertise in Professional Communication. In Past, Present, and Future Contributions of Cognitive Writing Research to Cognitive Psychology; Berninger, V.W., Ed.; Psychology Press: New York, NY, USA, 2012; pp. 275–312. [Google Scholar]
  62. Kellogg, R.T. Professional writing expertise. In The Cambridge Handbook of Expertise and Expert Performance; Ericsson, K.A., Charness, N., Feltovich, P.J., Hoffman, R.R., Eds.; Cambridge University Press: Cambridge, UK, 2006; pp. 389–402. [Google Scholar]
  63. Earley, P.C.; Lee, C.; Hanson, L.A. Joint moderating effects of job experience and task component complexity: Relations among goal setting, task strategies, and performance. J. Organ. Behav. 1990, 11, 3–15. [Google Scholar] [CrossRef]
  64. Kuan, K.K.; Hui, K.L.; Prasarnphanich, P.; Lai, H.Y. What makes a review voted? An empirical investigation of review voting in online review systems. J. Assoc. Inf. Syst. 2015, 16, 48–71. [Google Scholar] [CrossRef] [Green Version]
  65. Fink, L.; Rosenfeld, L.; Ravid, G. Longer online reviews are not necessarily better. Int. J. Inf. Manag. 2018, 39, 30–37. [Google Scholar] [CrossRef]
  66. Guo, D.; Zhao, Y.; Zhang, L.; Wen, X.; Yin, C. Conformity feedback in an online review helpfulness evaluation task leads to less negative feedback-related negativity amplitudes and more positive P300 amplitudes. J. Neurosci. Psychol. Econ. 2019, 12, 73–87. [Google Scholar] [CrossRef]
  67. Liang, S.; Schuckert, M.; Law, R. How to improve the stated helpfulness of hotel reviews? A multilevel approach. Int. J. Contemp. Hosp. Manag. 2019, 31, 953–977. [Google Scholar] [CrossRef]
  68. Hausman, J.; Hall, B.H.; Griliches, Z. Econometric models for count data with an application to the patents-R&D relationship. Econometrica 1984, 52, 909–938. [Google Scholar]
Figure 1. Research framework.
Figure 2. A screenshot of course reviews on the MOOC platform.
Figure 3. Distribution of reviewer experience and reviewer expertise.
Figure 4. The research model with empirical results.
Table 1. Dual effects of reviewer experience and expertise.

| Effects | Reviewer Experience | Reviewer Expertise | Interaction of the Two |
|---|---|---|---|
| Review quality effects (experience and expertise as reviewer competence dimensions) | Hypothesized positive (this research) | Hypothesized positive (this research) | Hypothesized positive (this research) |
| Source cue effects (affecting readers’ perception of review helpfulness) | Mixed effects [18,19,57] | Positive effect [20,50] | None |
Table 2. Variable definitions.

| Variable | Definition |
|---|---|
| Dependent variable | |
| RevHelp | Helpfulness of a course review to review readers, operationalized as the number of helpfulness votes that a course review has received. |
| Independent variables | |
| Experience | Reviewer experience in writing course reviews, operationalized as the total number of course reviews that a reviewer has posted before the one under study. |
| Expertise | Reviewer expertise in writing helpful reviews, operationalized as the average number of helpfulness votes per review received by a reviewer on reviews posted before the one under study. |
| Control variables | |
| RevExtremity | A dummy variable indicating the extremity of a course review; 1 for a review rating of 1 or 5, and 0 otherwise. |
| RevPositivity | A dummy variable indicating the positivity of a course review; 1 for a review rating of 4 or 5, and 0 otherwise. |
| RevLength | Length of a course review, operationalized as the number of Chinese characters in the textual comment. |
| RevInconsist | Review score inconsistency: the extent to which a review rating differs from the average rating of a course, operationalized as the absolute difference between a review rating and the average rating of all previous reviews of the course. |
| RevAge | Review age, operationalized as the number of days between the posting date of a review and the data retrieval date. |
| RevHour | Hour of the day at which a course review was posted. |
| RevDoM | Day of the month on which a course review was posted. |
| RevDoW | Day of the week on which a course review was posted. |
| RevMonth | Month in which a course review was posted. |
| RevYear | Year in which a course review was posted. |
| CrsDiversity | Diversity of course categories for which a reviewer has posted reviews, operationalized as the number of course categories for which the reviewer has posted course reviews. |
| CrsPopul | Popularity of a course, operationalized as the number of reviews for the course. |
| CrsSatisf | Learners’ overall satisfaction with a course, operationalized as the average rating of all reviews for the course. |
| CrsType | A set of dummy variables indicating the type of a course as specified by the MOOC platform (general course, general basic course, special basic course, special course, etc.). |
| CrsCategory | A set of dummy variables indicating the category of a course as specified by the MOOC platform (agriculture, medicine, history, philosophy, engineering, pedagogy, literature, law, science, management, economics, art, etc.). |
| CrsProvider | A set of dummy variables indicating the college or university that provides a course. |
Table 3. Descriptive statistics.

| Variable | No. of Obs. | Mean | Std. Dev. | Min. | Max. |
|---|---|---|---|---|---|
| RevHelp | 39,114 | 2.986 | 13.581 | 1 | 1445 |
| Experience | 39,114 | 7.192 | 35.886 | 1 | 807 |
| Expertise | 39,114 | 1.242 | 10.685 | 0 | 1195 |
| RevExtremity | 39,114 | 0.901 | 0.299 | 0 | 1 |
| RevPositivity | 39,114 | 0.945 | 0.228 | 0 | 1 |
| RevLength | 39,114 | 29.720 | 43.675 | 5 | 500 |
| RevInconsist | 39,114 | 0.391 | 0.608 | 0 | 3.950 |
| RevAge | 39,114 | 524.551 | 254.394 | 60 | 1105 |
| CrsDiversity | 39,114 | 4.611 | 8.095 | 1 | 72 |
| CrsPopul | 39,114 | 777.247 | 1932.504 | 1 | 28,063 |
| CrsSatisf | 39,114 | 4.760 | 0.160 | 2.538 | 5 |
Table 4. Estimation results.

| Variables | (1) | (2) | (3) | (4) |
|---|---|---|---|---|
| Experience | 0.001 ** (0.000) | 0.001 * (0.000) | 0.001 * (0.000) | |
| Expertise | 0.011 *** (0.001) | 0.011 *** (0.001) | 0.011 *** (0.001) | |
| Experience × Expertise | | 0.000 (0.000) | | |
| High–Low | | | | 0.036 ** (0.016) |
| Low–High | | | | 0.273 *** (0.018) |
| High–High | | | | 0.333 *** (0.030) |
| CrsDiversity | −0.002 * (0.001) | −0.002 * (0.001) | −0.002 * (0.001) | −0.002 *** (0.001) |
| RevExtremity | 0.147 *** (0.019) | 0.145 *** (0.019) | | 0.137 *** (0.019) |
| RevPositivity | −0.368 *** (0.049) | −0.368 *** (0.049) | | −0.408 *** (0.049) |
| Log#RevLength | 0.380 *** (0.006) | 0.380 *** (0.006) | | 0.375 *** (0.006) |
| RevInconsist | 0.113 *** (0.019) | 0.111 *** (0.019) | | 0.086 *** (0.019) |
| Log#RevAge | −0.210 *** (0.038) | −0.211 *** (0.038) | −0.210 *** (0.038) | −0.234 *** (0.038) |
| Log#CrsPopul | 0.123 *** (0.006) | 0.123 *** (0.006) | 0.123 *** (0.006) | 0.123 *** (0.006) |
| CrsSatisf | −0.082 ** (0.040) | −0.082 ** (0.040) | −0.082 ** (0.040) | −0.097 ** (0.039) |
| lnalpha | −0.536 *** (0.010) | −0.536 *** (0.010) | −0.536 *** (0.010) | −0.531 *** (0.010) |
| Review timing FE a | Y | Y | Y | Y |
| Course FE b | Y | Y | Y | Y |
| # observations | 39,114 | 39,114 | 39,114 | 39,114 |
| Log likelihood | −78,160 | −78,159 | −78,160 | −78,232 |
| AIC | 157,286 | 157,287 | 157,286 | 157,431 |
| BIC | 161,427 | 161,437 | 161,427 | 161,581 |

Note: * p < 0.1; ** p < 0.05; *** p < 0.01; FE = fixed effects. a Review posting-time fixed effects include review year, month, day of month, day of week, and hour of day. b Course-related fixed effects include course type, course category, and course provider affiliation.