New Methods and Assessment Approaches in Intelligence Research

A special issue of Journal of Intelligence (ISSN 2079-3200).

Deadline for manuscript submissions: closed (4 March 2020) | Viewed by 24059

Special Issue Editors


Prof. Dr. Ulrich Schroeders
Guest Editor
Psychological Assessment, Institute of Psychology, University of Kassel, 34125 Kassel, Germany
Interests: psychometrics; test development; fluid intelligence; crystallized intelligence; educational achievement

Prof. Dr. Gizem Hülür
Guest Editor
Department of Psychology, University of Bonn, Kaiser-Karl-Ring 9, 53111 Bonn, Germany
Interests: cognitive development across the lifespan; cognition in daily life; metacognition; micro- and macro-longitudinal research methods

Prof. Dr. Andrea Hildebrandt
Guest Editor
Department of Psychology, School of Medicine and Health Sciences, Carl von Ossietzky University Oldenburg, Ammerländer Heerstr. 114-118, 26129 Oldenburg, Germany
Interests: human cognitive abilities; interpersonal abilities; person perception; biological bases of individual differences; age-related cognitive differentiation and dedifferentiation; construction and evaluation of ability tests; multivariate methods; structural equation modeling

Prof. Dr. Philipp Doebler
Guest Editor
Statistical Methods in the Social Sciences, Department of Statistics, TU Dortmund, 44227 Dortmund, Germany
Interests: longitudinal data modeling; test development; meta-analysis; item response theory; psychometrics

Special Issue Information

Dear Colleagues,

Our understanding of intelligence has been, and still is, shaped considerably by the development and application of new computational and statistical methods, as well as by novel testing procedures. In most sciences, methodological developments follow new theoretical ideas. Several major breakthroughs in intelligence research, however, took the reverse order: the then-novel factor analytic tools, for example, preceded and facilitated new theoretical ideas such as the theory of multiple group factors of intelligence. The way we assess and analyze intelligent behavior therefore also shapes the way we think about intelligence.

We want to bring together recent and ongoing methodological advances that inspire intelligence research and facilitate thinking about new theoretical perspectives. This Special Issue will include contributions that:

  1. take advantage of auxiliary data usually assessed in technology-based assessment (e.g., reaction times, GPS data) or take a mobile sensing approach to enriching traditional intelligence assessment;
  2. study change or development in (intensive) longitudinal data with time series analysis, refined factor analytic methods, continuous time modeling, dynamic structural equation models, or other advanced methods of longitudinal data modeling;
  3. examine the structure of and change in cognitive abilities with network analysis and similar recently popularized tools; and
  4. use supervised and unsupervised machine learning methods to analyze so-called Big Data in the field of intelligence research (a minimal sketch follows this list).
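
As a flavor of topic (4), here is a minimal, self-contained Python sketch of a supervised machine learning analysis. The simulated features (e.g., response times or sensor-derived indicators) and the criterion score are purely hypothetical and serve only to illustrate the kind of reusable, commented syntax we appreciate as online material.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # Hypothetical example: predict an intelligence score from log-file and
    # sensor features. All data below are simulated purely for illustration.
    rng = np.random.default_rng(seed=1)
    X = rng.normal(size=(500, 20))                                # 500 persons, 20 process/sensor features
    y = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=500)    # toy criterion score

    model = RandomForestRegressor(n_estimators=200, random_state=1)
    r2_scores = cross_val_score(model, X, y, cv=5, scoring="r2")  # 5-fold cross-validation
    print(f"Mean cross-validated R^2: {r2_scores.mean():.2f}")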

We invite original research articles and tutorials that use and explain the aforementioned and other innovative methods to familiarize readers with new ways to study intelligence. To this end, we appreciate reusable and commented syntax provided as online material. We especially welcome contributions from other disciplines, such as computer science and statistics. For your convenience, we have also compiled a list of freely accessible intelligence data sets: https://goo.gl/PGFtv3.

Prof. Dr. Ulrich Schroeders
Prof. Dr. Gizem Hülür
Prof. Dr. Andrea Hildebrandt
Prof. Dr. Philipp Doebler
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Intelligence is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • psychometrics
  • modeling
  • methods
  • technology-based assessment
  • Big Data

Published Papers (4 papers)


Research

15 pages, 4105 KiB  
Article
Tracking with (Un)Certainty
by Abe D. Hofman, Matthieu J. S. Brinkhuis, Maria Bolsinova, Jonathan Klaiber, Gunter Maris and Han L. J. van der Maas
J. Intell. 2020, 8(1), 10; https://doi.org/10.3390/jintelligence8010010 - 03 Mar 2020
Cited by 3 | Viewed by 4250
Abstract
One of the highest ambitions in educational technology is the move towards personalized learning. To this end, computerized adaptive learning (CAL) systems are developed. A popular method to track the development of student ability and item difficulty in CAL systems is the Elo Rating System (ERS). The ERS allows for dynamic model parameters by updating key parameters after every response. However, drawbacks of the ERS are that it does not provide standard errors and that it results in rating variance inflation. We identify three statistical issues responsible for both of these drawbacks. To solve these issues, we introduce a new tracking system based on urns, where every person and item is represented by an urn filled with a combination of green and red marbles. Urns are updated, by an exchange of marbles after each response, such that the proportions of green marbles represent estimates of person ability or item difficulty. A main advantage of this approach is that the standard errors are known; hence, the method allows for statistical inference, such as testing for learning effects. We highlight features of the Urnings algorithm and compare it to the popular ERS in a simulation study and in an empirical data example from a large-scale CAL application.
(This article belongs to the Special Issue New Methods and Assessment Approaches in Intelligence Research)
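
For readers unfamiliar with the ERS baseline that the Urnings algorithm is compared against, the following Python sketch shows a generic Elo-style update of person ability and item difficulty. The logistic expected score and the step size k are conventional choices and are not taken from the paper.

    import math

    def elo_update(theta, beta, correct, k=0.4):
        """One Elo Rating System step: update person ability (theta) and item
        difficulty (beta) after a scored response (1 = correct, 0 = incorrect)."""
        expected = 1.0 / (1.0 + math.exp(-(theta - beta)))  # Rasch-type expected score
        theta += k * (correct - expected)  # ability moves toward the observed result
        beta -= k * (correct - expected)   # difficulty moves the opposite way
        return theta, beta

    # Toy usage: one person answers three items of increasing difficulty.
    theta = 0.0
    for beta, correct in [(-0.5, 1), (0.3, 1), (1.2, 0)]:
        theta, beta = elo_update(theta, beta, correct)
        print(round(theta, 2), round(beta, 2))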

18 pages, 440 KiB  
Article
Ergodic Subspace Analysis
by Timo von Oertzen, Florian Schmiedek and Manuel C. Voelkle
J. Intell. 2020, 8(1), 3; https://doi.org/10.3390/jintelligence8010003 - 06 Jan 2020
Cited by 3 | Viewed by 4536
Abstract
Properties of psychological variables at the mean or variance level can differ between persons and within persons across multiple time points. For example, cross-sectional findings between persons of different ages do not necessarily reflect the development of a single person over time. Recently, there has been an increased interest in the difference between covariance structures, expressed by covariance matrices, that evolve between persons and within a single person over multiple time points. If these structures are identical at the population level, the structure is called ergodic. However, recent data show that ergodicity does not hold in general, particularly not for cognitive variables. For example, the g factor that is dominant for cognitive abilities between persons seems to explain far less variance when concentrating on a single person's data. However, other subdimensions of cognitive abilities seem to appear both between and within persons; that is, there seems to be a lower-dimensional subspace of cognitive abilities in which cognitive abilities are in fact ergodic. In this article, we present ergodic subspace analysis (ESA), a mathematical method to identify, for a given set of variables, which subspace is most important within persons, which is most important between persons, and which is ergodic. Similar to the common spatial patterns method, the ESA method first whitens a joint distribution from both the between and the within variance structure and then performs a principal component analysis (PCA) on the between distribution, which then automatically acts as an inverse PCA on the within distribution. The difference of the eigenvalues allows a separation of the rotated dimensions into the three subspaces corresponding to within, between, and ergodic substructures. We apply the method to simulated data and to data from the COGITO study to exemplify its usage.
(This article belongs to the Special Issue New Methods and Assessment Approaches in Intelligence Research)
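
To make the whitening-plus-rotation idea more concrete, here is a small NumPy sketch in the spirit of common spatial patterns. The composite whitening and the eigenvalue cut-offs are illustrative assumptions, not the authors' implementation of ESA.

    import numpy as np

    def esa_sketch(between_cov, within_cov, tol=0.15):
        """Whiten with the composite (between + within) covariance, then
        eigendecompose the whitened between-person covariance. Eigenvalues near 1
        flag between-person directions, values near 0 within-person directions,
        and values around 0.5 roughly ergodic directions (cut-offs are illustrative)."""
        joint = between_cov + within_cov
        evals, evecs = np.linalg.eigh(joint)
        whitener = evecs @ np.diag(evals ** -0.5) @ evecs.T   # inverse square root of the joint covariance
        b_white = whitener @ between_cov @ whitener.T
        lam, rot = np.linalg.eigh(b_white)                    # lam lies in [0, 1] by construction
        filters = rot.T @ whitener                            # rows project raw scores onto the rotated dimensions
        labels = np.where(lam > 0.5 + tol, "between",
                          np.where(lam < 0.5 - tol, "within", "ergodic"))
        return lam, filters, labels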

27 pages, 2861 KiB  
Article
The Impact of Situational Test Anxiety on Retest Effects in Cognitive Ability Testing: A Structural Equation Modeling Approach
by David Jendryczko, Jana Scharfen and Heinz Holling
J. Intell. 2019, 7(4), 22; https://doi.org/10.3390/jintelligence7040022 - 23 Sep 2019
Cited by 7 | Viewed by 6276
Abstract
When a cognitive ability is assessed repeatedly, test scores and ability estimates are often observed to increase across test sessions. This phenomenon is known as the retest (or practice) effect. One explanation for retest effects is that situational test anxiety interferes with a test taker's performance during earlier test sessions, thereby creating systematic measurement bias on the test items, and that this influence of anxiety diminishes with test repetitions (interference hypothesis). This explanation is controversial, since the presence of measurement bias during earlier measurement occasions cannot always be confirmed. It has instead been argued that people from the lower end of the ability spectrum become aware of their deficits in test situations and therefore report higher anxiety (deficit hypothesis). In 2014, a structural equation model was proposed that allows these two hypotheses to be compared with regard to their explanatory power for the negative anxiety–ability correlation found in cross-sectional assessments. We extended this model for use in longitudinal studies to investigate the impact of test anxiety on test performance and on retest effects. A latent neighbor-change growth curve was implemented in the model that enables an estimation of retest effects between all pairs of successive test sessions. Systematic restrictions on model parameters allow testing the hypothesized reduction in anxiety interference over the test sessions, which can be compared to retest effect sizes. In an empirical study with seven measurement occasions, we found that a substantial reduction in interference upon the second test session was associated with the largest retest effect in a figural matrices test, which served as a proxy measure for general intelligence. However, smaller retest effects occurred up to the fourth test administration, whereas evidence for anxiety-induced measurement bias was found only for the first two test sessions. Anxiety and ability were not negatively correlated at any time when the interference effects were controlled for. Implications, limitations, and suggestions for future research are discussed.
(This article belongs to the Special Issue New Methods and Assessment Approaches in Intelligence Research)
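
To illustrate the neighbor-change parameterization mentioned in the abstract, the following NumPy sketch builds the implied loading pattern, in which each latent change factor carries the gain between two adjacent sessions. This is the generic parameterization only, not the authors' full model with anxiety interference terms.

    import numpy as np

    def neighbor_change_loadings(n_sessions):
        """Loading matrix of a neighbor-change growth curve: the score at session t
        equals the latent intercept plus the sum of all changes between earlier
        adjacent sessions, so each change factor captures one session-to-session gain."""
        loadings = np.zeros((n_sessions, n_sessions))
        loadings[:, 0] = 1.0            # intercept factor loads on every session
        for j in range(1, n_sessions):
            loadings[j:, j] = 1.0       # change from session j to j+1 affects all later sessions
        return loadings

    # Toy usage with four sessions (the empirical study used seven):
    print(neighbor_change_loadings(4))
    # [[1. 0. 0. 0.]
    #  [1. 1. 0. 0.]
    #  [1. 1. 1. 0.]
    #  [1. 1. 1. 1.]]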

13 pages, 1858 KiB  
Article
Psychometric Network Analysis of the Hungarian WAIS
by Christopher J. Schmank, Sara Anne Goring, Kristof Kovacs and Andrew R. A. Conway
J. Intell. 2019, 7(3), 21; https://doi.org/10.3390/jintelligence7030021 - 09 Sep 2019
Cited by 26 | Viewed by 8152
Abstract
The positive manifold—the finding that cognitive ability measures demonstrate positive correlations with one another—has led to models of intelligence that include a general cognitive ability or general intelligence (g). This view has been reinforced using factor analysis and reflective, higher-order latent variable models. However, a new theory of intelligence, Process Overlap Theory (POT), posits that g is not a psychological attribute but an index of cognitive abilities that results from an interconnected network of cognitive processes. These competing theories of intelligence are compared using two different statistical modeling techniques: (a) latent variable modeling and (b) psychometric network analysis. Network models display partial correlations between pairs of observed variables, which represent the direct relationships among the variables. Secondary data analysis was conducted using the Hungarian Wechsler Adult Intelligence Scale Fourth Edition (H-WAIS-IV). The underlying structure of the H-WAIS-IV was first assessed using confirmatory factor analysis assuming a reflective, higher-order model and then reanalyzed using psychometric network analysis. The compatibility (or lack thereof) of these theoretical accounts of intelligence with the data is discussed.
(This article belongs to the Special Issue New Methods and Assessment Approaches in Intelligence Research)
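
For readers new to psychometric network models, the following NumPy sketch derives a partial-correlation network from a hypothetical correlation matrix of subtests via the precision matrix. Published network analyses typically add regularization and dedicated estimation software, which this sketch omits.

    import numpy as np

    def partial_correlation_network(corr, threshold=0.05):
        """Convert a correlation matrix into a partial-correlation network:
        each edge is the association between two variables that remains after
        conditioning on all other variables (computed from the precision matrix)."""
        precision = np.linalg.inv(corr)
        d = np.sqrt(np.diag(precision))
        pcor = -precision / np.outer(d, d)       # partial correlations
        np.fill_diagonal(pcor, 0.0)
        edges = np.abs(pcor) >= threshold        # simple cut-off instead of regularization
        return pcor, edges

    # Toy usage with a hypothetical correlation matrix of three subtests:
    corr = np.array([[1.0, 0.5, 0.4],
                     [0.5, 1.0, 0.3],
                     [0.4, 0.3, 1.0]])
    pcor, edges = partial_correlation_network(corr)
    print(np.round(pcor, 2))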
