Article

Usability of the G7 Open Government Data Portals and Lessons Learned

1 Department of Information Science, College of Arts, King Saud University, Riyadh 11451, Saudi Arabia
2 School of Informatics, The University of Edinburgh, 10 Crichton St., Edinburgh EH8 9AB, UK
3 Department of Management, Coggin College of Business, University of North Florida, 1 UNF Drive, Building 42, Jacksonville, FL 32224, USA
4 Deanship of Development and Quality, King Saud University, Riyadh 11451, Saudi Arabia
5 Deanship of Skills and Development, King Saud University, Riyadh 11451, Saudi Arabia
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(24), 13740; https://doi.org/10.3390/su132413740
Submission received: 16 October 2021 / Revised: 19 November 2021 / Accepted: 8 December 2021 / Published: 13 December 2021
(This article belongs to the Special Issue Digital Governance and Digital Economy: Are We There Yet?)

Abstract

Recent advances in technology have made truly open and accessible government significantly more realisable. One of the ways in which governments are using this technology is in the implementation of online portals that allow open (i.e., public and unrestricted) access to, and use of, data. Such portals can be used by citizens and professionals to facilitate improved decision-making across a wide range of areas, from car-parking to promoting entrepreneurialism. However, the existence of portals per se is not enough. To maximise their potential, users must also feel that the portals are both accessible and usable. To gain insights into the current state of usability of open government data (OGD) portals for professionals working in data-related areas, a comparative study of the portals of the G7 group was carried out, using a mixed methodology. This is the first specific comparison of these portals for such users, as well as the first study to add a user-centred qualitative dimension to the research. The study’s findings showed that the G7 countries are not maximising the potential of their portals or collaborating effectively. Addressing these issues, and building better cross-national consistency, would help to improve the value delivered by investment in OGD portals. The study also further supported an existing user-centred, heuristic evaluation framework for application to a more specific user group, as well as more generally.

1. Introduction

More than ever before, public bodies are producing and commissioning huge quantities of information and data, and there are real and significant benefits to making these datasets easily accessible. By encouraging the use and free distribution of datasets, governments can develop and implement more evidence-based and inclusive policies, promote business creation and innovation, and empower citizens to make more highly informed decisions [1]. Furthermore, easy access to open data can not only encourage and support better decision-making by individuals, but can help to develop a “healthier society” by providing “spaces through which people can investigate community problems, generate solutions, create media, and organise together” [2] (p. 5). To facilitate the use and reuse of datasets, the concept of the open government data (OGD) portal has emerged. This is a relatively recent development that has been enabled by the increasingly socially embedded nature of digital technology, and which is now being implemented by a rapidly growing number of countries. According to Statista [3], 79% of UN Member States had an OGD portal in 2020, up from 47% in 2018.
Despite this upward trend in OGD portal availability, however, there are legitimate questions about their usage. To promote the widespread use of OGD portals, it is not sufficient to merely publish data; the data must also have real value to private and public bodies [4]. This consideration, in turn, raises the questions of which datasets should be released in order to maximise public value, and who (given that it is for public use) should pay for this data to be updated and maintained. Another key question, and the theme of this paper, is how usable are OGD portals? To deliver on their objectives, in terms of engaging a wide range of citizens at both individual and organisational levels, and to facilitate the resolution of problems and create opportunity, portals must be simple to use, without the need for high levels of technical knowledge. This reduces barriers to participation and leads to increased usage and dissemination of data. Nevertheless, while the use of OGD portals is becoming significantly more widespread and they are acknowledged as the main medium of interaction between data providers in the public sector and private data users, the usability of these portals has been heavily criticised. For instance, Osagie et al. [5] assert that users lacking technical knowledge have difficulty utilising OGD portals, while other researchers note that the portals are not user-friendly [6,7]. To address such usability problems, some researchers have begun to acknowledge the ways in which user-centred design can help, as this approach involves users in the design and development processes, ultimately ensuring that more people are aware that OGD portals exist [1]. Conversely, some researchers argue against the inclusion of users in the processes of development and delivery of user interfaces [8,9].
An interface being easy to use is not the only aspect that has an impact on individuals’ and organisations’ willingness to use OGD portals. Indeed, research shows that factors such as data quality issues and trust in government data also have significant effects on levels of engagement with OGD [10]. If the quality of open government data is poor, there will be less demand for its use [11,12]. However, it has been shown that if users have a positive experience when utilising OGD portals, their willingness to continue using them increases, which encourages providers to release more data [13].
This study, therefore, seeks to answer two main questions related to how the usability of OGD portals across the world can be compared, and the barriers most commonly encountered in terms of use. While many organisations such as the World Wide Web Foundation [14] and Open Data in Europe [15] have made comparisons between portals, these investigations mainly focused on issues such as legal implications and the amount of published data rather than on usability as a discrete factor. Others, such as Máchová et al. [16] and Nikiforova [11], have explicitly addressed the usability of OGD portals and have carried out extensive studies of many portals. These studies, however, are largely concerned with a somewhat abstract notion of usability. They are “user-centred” in the sense that they focus on the activity and experiences of the users in the study, but the users are in many ways not typical portal users; they are often students of computing or IT specialists. An important question such work leaves open is how usable OGD portals are for professionals who work with government and similar data but are not closely familiar with the mechanisms and designs of the systems that provide and underlie the portals. With this in mind, the present study helps to fill in the missing knowledge by providing an expert-driven comparative analysis of the usability of a subset of important OGD portals—those of the G7 countries, namely Japan, the UK, Germany, France, Italy, Canada, and the USA. In particular, the study will seek to provide meaningful feedback on two key questions:
  • How can we effectively assess the usability of OGD portals for professional users, and how do portals compare in terms of perceived usability?
  • What issues are most frequently identified by professional users as factors that determine portal usability?
The selection of the G7 as a sample is based on the fact that these countries not only represent a range of political, population, and economic contexts, but are widely acknowledged to be committed to open data provision, as well as being politically and philosophically aligned with the principles of democratic and transparent government. Therefore, this research may help to validate the applicability of the selected heuristic evaluation technique.
All previous studies of portal usability that we are aware of are based on a quantitative approach (questionnaires) only, which may oversimplify the results and limit their value in terms of understanding the user perspective. Our study uniquely focusses on professional users and adds a qualitative dimension to its methodology in order to more fully explore the user experience and understand in greater depth the factors that drive the usability of portals for this critical group.
One key element in studying usability is the characteristics of the users. Usability for arbitrary users is limited to abstract and very general (e.g., cognitive) considerations. OGD portals may be relatively usable for one group but not another. Another key element is the users’ tasks; an interface may facilitate some tasks more than others, and the tasks of “citizens” in general may be unclear. Tasks for use in a study should be consistent with tasks in the users’ real world, while also supporting a certain degree of standardisation for comparability across studies. In the present study, we adopt the framework proposed by Máchová et al. [16], focusing on three specific aspects of OGD portals, which we feel achieves a good balance and can be interpreted in relation to specific user groups. We use this to explore the questions above.
The paper is structured as follows. Section 2 presents the literature review. Section 3 describes the methodology. Section 4 reports the results of the study. Section 5 provides the discussion, and Section 6 concludes the study.

2. Literature Review

The concept of OGD is of little or no value if the portals used to provide access to data are not used, and the key to their use is that they be user-friendly and simple [17,18]. Of course, any meaningful definition of terms such as ‘usability’ and ‘user-friendly’ will depend critically on the social and technical profile of the users under consideration and what these users aim to achieve by engaging with the OGD portal. In this research area, the term ‘user’ may refer to a wide range of individuals and organisations, covering a broad spectrum of social, political, and entrepreneurial needs, as described above. While some of these users will have sophisticated technical knowledge, many more will be unfamiliar with all but the most basic of technical processes. To be effective across most user groups, therefore, the portal should be designed to cater for knowledge at the ‘lowest common denominator’ level, making its principal features (identification of, and access to, relevant datasets; finding relevant and important information; etc.) easy to use for all users, even the least technical. However, care must be taken to ensure that oversimplification does not occur to the extent where the true nature of data (which can be complex) is obscured. “Usability” remains a relative notion, representing a trade-off between simplicity of use and the preservation of data values, which may be, as suggested above, somewhat specific to given combinations of users and tasks, even though some of these may also have much in common.
Such user-friendly portals have been developed by private companies and are implemented by a number of governments. In the USA, for instance, many state-level governments have entered into partnership with a company called Socrata in order to simplify access to data [19]. At an international level, many OGD portals use CKAN, a system that has a customisable front-end based on an essentially standard back-end [20]. This standardisation of the back-end provides a high level of technical interoperability among OGD portals, as metadata standards and tools are common to all users and portals.
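To make this concrete, CKAN’s standardised back-end exposes a public ‘Action API’ through which any CKAN-based portal can be queried. The following minimal Python sketch illustrates a dataset search against such an endpoint; the portal and query string are illustrative, and the sketch assumes the portal runs CKAN with its default API path.

```python
# A minimal sketch of querying a CKAN-based OGD portal through CKAN's
# public Action API. The portal URL and query are illustrative; any
# portal running CKAN with the default API path should respond similarly.
import requests

def search_datasets(portal_base: str, query: str, rows: int = 5) -> list:
    """Search a CKAN-backed portal and return basic dataset metadata."""
    resp = requests.get(
        f"{portal_base}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    payload = resp.json()
    # CKAN wraps results as {"success": true, "result": {"results": [...]}}
    return [
        {
            "name": ds["name"],
            "title": ds["title"],
            "formats": [r.get("format") for r in ds.get("resources", [])],
        }
        for ds in payload["result"]["results"]
    ]

# Example (assumes the UK portal, data.gov.uk, which runs CKAN):
for ds in search_datasets("https://data.gov.uk", "air quality"):
    print(ds["title"], "-", ds["formats"])
```

Because the same action-based API is shared across CKAN deployments, a tool written against one portal can, in principle, be pointed at another simply by changing the base URL, which is the interoperability benefit described above.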
The ‘user-friendliness’ factor does much more than encourage portal usage—it is critical to the success of OGD as a political concept. This is because a portal that is easy to use creates more value than its less user-friendly counterpart. In other words, it can be reasonably surmised that so long as all independent and legitimate data publishers are free to publish and that all data are provably authentic, portals with fewer datasets but that are easy to use and are highly relevant will create more value than those with more datasets but that are more difficult to use [21,22]. With this in mind, portal design and implementation should prioritise helping users find what they need, whether social, political, or entrepreneurial, over providing large volumes of data. However, as has already been mentioned, it is also important to remember that ease of use is not a ‘one size fits all’ concept. There are many stakeholders involved in OGD retrieval (citizens, businesses, NGOs, other governments, etc.), and each stakeholder has slightly different requirements in terms of a portal’s functionality. OGD portals need to be easy to use for a range of audiences, which will vary widely in the volumes of data they require.
The rapidly increasing adoption by governments of an OGD policy has led to a considerable body of literature on the subject. Much of the existing literature on the topic of OGD examines it from a certain viewpoint, such as compliance with, and adherence to, policy obligations, or whether published data correlate with a specified definition of OGD. Other studies examine types of published data or how well the OGD portals perform in terms of quantitative metrics such as data downloads and visitor numbers. Several studies, including those by Lourenço [23] and Zuiderwijk and Janssen [24], have also developed frameworks for use in assessing certain aspects of OGD, such as higher levels of accountability or transparency.
There is also a large amount of research that examines OGD implementation at all governmental levels, although most of this research is focused on initiatives in the researcher’s own country [25]. OGD case studies, such as those by van der Waal et al. [26] and Marienfeld et al. [27], have been carried out across the world, including in Germany, South America, Central and Eastern Europe, France, Taiwan, Latvia, and Central America. However, while portal usability may form part of this type of study, it is rarely the focus of discussion.
This kind of empirical research is undoubtedly of significant interest. To advance the understanding of OGD, however, a wider and more comparative approach is required. Moreover, much of the existing literature utilises varying approaches for examining and analysing the topic, meaning there is a lack of a standardised framework or methodology for comparing between and among different countries. Indeed, among the studies that look specifically at OGD portals, most analyse only one feature in isolation. Nikiforova [28], for example, looks at the machine-readability and timeliness of datasets, while Kim [29] documents the standard terms used in South Korean public data. Several studies explore the compliance of data to the 5-star model [18] and their breakdown into relevant categories [30,31]. While a focus on a single specific aspect of portals can deliver valuable insights, it is important to look at usability from a broader perspective.
Arguably, one component of usability is the abstract functionality of the interface; this tends to be the focus of studies in HCI and is dominant in technical design. Another component, however, is the extent to which the interface supports the tasks of specific user groups, who may engage with the interface in different ways and find it more or less satisfactory. It is certainly not true to say that the issue of usability is never discussed in the existing literature. In fact, there is much previous research that has examined ways in which data providers have made access and use easier. Yet these studies invite further exploration of the perceptions and opinions of the users who will make day-to-day use of portals in their professional work [5]. Nikiforova and McBride [32] undertake a complex and detailed analysis of the usability of 41 OGD portals, but through entirely quantitative techniques and focussing on users with a specifically computing or IT background. User perspectives are touched upon in some papers, for example in studies by Zuiderwijk et al. [33] and Welle-Donker and van Loenen [7], which deal with the drivers and barriers of the use of OGD portals, focusing on attitudes and beliefs, but leave scope for a more specific focus on the capabilities of users. Ultimately, the knowledge and capabilities of users, and their ability to relate the use of the portal to a context such as their professional work, are the factors that impact the most on their relationship with the portal itself; thus, for the user, these factors determine the usability of the OGD portal.
There are studies that explore how OGD can help facilitate better transparency and that point to usability as playing a key role. In one such paper, the authors argue that portal usability is “the degree to which OGD portals are able to be used, or are fit for use by citizens” and that ‘higher usability tends to promote higher levels of transparency’ [34] (p. 516). In a similar vein, Attard et al. [25] carry out a systematic review of the literature surrounding OGD initiatives, and briefly touch upon usability as a concept. In this case, however, the researchers examine the usability of the actual data rather than the portal through which they are obtained. It is important to recognise that while these two concepts are similar, they are not the same, and cannot be compared effectively. The usability of datasets has also been investigated by other researchers, such as Dawes et al. [35], who looked at usability in the context of four main dimensions: data format, metadata, means of access, and dataset quality. One of the latest studies, by Lněnička et al. [36], identified portal features, observed through the interactions of stakeholders, that were able to increase the transparency of datasets. Recommendations were also provided by the researchers about ways to include these features in the design and development of OGD portals [37]. Finally, a study into the usability of OGD [38] emphasised that further investigation of the topic is needed; despite this acknowledgement, however, it too focuses on the data rather than the actual portal used to acquire them.

3. Methodology

This study employs a mixed methods approach. According to Johnson and Onwuegbuzie [39], such an approach “mixes or combines quantitative and qualitative research techniques, methods, approaches, concepts, or language into a single study”. A mixed approach was chosen because, by drawing on the strengths of each individual method (quantitative and qualitative), it is possible to derive insights that would not be available through a focus on a single method alone. As described below, the quantitative and qualitative stages of the research were carried out separately.

3.1. Stage 1: The OGD Portals Assessed

The national portals of the G7 group of countries were compared for usability, for the reasons described in the introduction to this paper. The G7 portals used are shown in Table 1.

3.1.1. Assessment Criteria

The framework devised by Máchová et al. [16] was selected as the most suitable for this research. This was for a number of reasons. It is, for example, a methodology currently used in many respected studies, demonstrating its robustness [32]. Additionally, it crucially facilitates a user-based analysis by breaking usability down into well-defined and measurable aspects. Máchová et al. provide an extensive comparative discussion of frameworks, indicating that their own proposal meets criteria that are aligned with our aims; hence, we conclude that their proposal fulfils the essential requirement of the current study. Another important advantage of this framework is that its three main dimensions have been successfully embedded in other academic research. We feel that it is valuable to establish this as a framework that can be used to standardise, to some extent, an approach for potential use in other studies comparing different user groups working with a similar task structure. It includes a wide range of typical tasks carried out by the user when interacting with a portal. Such tasks include, but are not limited to, identifying and accessing datasets, re-using data, establishing the data publisher, and requesting further information. The three dimensions are: (1) dataset specification: how relevant and usable is the dataset provided in terms of the user’s goals; (2) dataset feedback: to what extent are users encouraged and supported in providing feedback on datasets; (3) dataset requests: the extent to which users can request new or different datasets. In many respects, these criteria are related to the provision and usability of metadata. This is appropriate in the terms developed by Máchová et al. [16], and also in the terms of our own focus on usability as oriented towards supporting the tasks of particular, especially professional, user groups. Clearly, the use of a portal will depend heavily on the user’s ability to find and work with information about the datasets that their task requires, and again the quality of this information is distinct from the quality of the data in the dataset itself.
The three dimensions are broken down into the 14 subcriteria presented in Table 2.
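Although Table 2 itself is not reproduced in this excerpt, the shape of the framework can be illustrated with a simple, hypothetical sketch. The aspect names below are paraphrased from the Results section (Section 4); aspect 6 of the first dimension is not named in the text and is left as a placeholder.

```python
# A hypothetical sketch of the evaluation framework as a data structure.
# Aspect names are paraphrased from Section 4; aspect 6 of the first
# dimension is not named in this excerpt and is left as a placeholder.
FRAMEWORK = {
    "open dataset specification": [
        "dataset description",               # aspect 1
        "publisher information",             # aspect 2
        "thematic categories and tags",      # aspect 3
        "publication/modification dates",    # aspect 4
        "machine-readable formats",          # aspect 5
        "(aspect 6, not named in this excerpt)",
        "analytic and visualisation tools",  # aspect 7
    ],
    "open dataset feedback": [
        "supporting documentation",
        "feedback mechanism",
        "social media links",
        "user rating and comments",
    ],
    "open dataset request": [
        "data request form",
        "list of requests",
        "user involvement",
    ],
}

# Sanity check: 14 subcriteria across the three dimensions.
assert sum(len(aspects) for aspects in FRAMEWORK.values()) == 14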

3.1.2. Evaluator Recruitment

This study is a heuristic evaluation. In heuristic evaluations, each evaluator inspects each OGD portal. The evaluators were recruited as professional data users with expertise deriving from the fields of library science and computer science. In order to reflect ‘real life’ and capture the user experience as authentically as possible, evaluators were selected with a variety of web experience, ranging from those with limited skills and no familiarity with OGD portals to those with more advanced technical knowledge who had previous experience of portal usage. Using a process of email invitation, 51 experts were selected, 29 of whom were from a library science background, while the remaining 22 were from a computer science background.
For an investigation of this exploratory nature, it was considered that 51 evaluators would be an adequate number [40]. Moreover, due to the very specific criteria set for each principle, the evaluation process was essentially subjective and interpretive rather than objective and quantitative.

3.1.3. Data Collection and Analysis

Before starting the heuristic evaluation process, each of the evaluators was briefed on the aim of the study. In order to evaluate the aspects of portal usability shown in Table 2, evaluators were asked to replicate, as far as possible, the typical ‘user experience’ by imagining a hypothetical context for the use of a dataset, which was specified for all of them. They would then use the portal to establish the ease of access, relevance, and authenticity of that dataset, along with other key factors, and to prepare it for re-use. The same exercise would be completed on each portal. To avoid the potential introduction of evaluator bias, no other training or preparation was provided. Another possible source of bias is the order in which portals are assessed by evaluators. There are studies that show that a randomised assessment sequence can reduce the possibility of introducing effects such as familiarity bias. However, this study did not employ randomisation. Instead, it asked each evaluator to follow an identical process, as described by [41,42]. Each evaluator first familiarised themselves with a portal’s layout and presentation for as long as required (up to 15 min), then attempted to complete the identification and retrieval of a dataset as described above. Following this, the portal aspects, as shown in Table 2, were assessed. This procedure of course allows individual variation among the experts, as there would be among users in practice, which was seen in the qualitative assessment phase described below. Evaluations took place between 1 February 2021 and 1 May 2021. As described below (Section 3.2.1), each evaluator recorded their results separately in an electronic table, which was collected via email.
A 3-point Likert scale was employed, with scores of 1 (unfulfilled), 2 (partially fulfilled) and 3 (fulfilled), following the approach of [16,32]. If a portal scored 1 for a particular criterion, evaluators were asked to note the usability issues they experienced. This process was repeated with each portal. Communication between evaluators was not permitted and the findings were not aggregated until the evaluations of all OGD portals had been completed, as described by [41,43]. By combining the scores across categories and criteria, it is possible to rank the usability of OGD portals, given the relevant statistical analysis, and to explore and orient the discussion and analysis at different levels (e.g., the categorical and criterial levels).
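As a purely illustrative sketch of this aggregation step, the following Python fragment computes per-aspect means and an overall portal ranking from flat (evaluator, portal, aspect, score) records on the 3-point scale; the values shown are invented and are not the study’s data.

```python
# Illustrative aggregation: per-aspect means and an overall portal ranking
# from flat (evaluator, portal, aspect, score) records. Example values only.
from collections import defaultdict
from statistics import mean

scores = [
    (1, "UK", "dataset description", 2),
    (1, "Japan", "dataset description", 1),
    (2, "UK", "dataset description", 3),
    (2, "Japan", "dataset description", 2),
]

by_portal_aspect = defaultdict(list)
for _evaluator, portal, aspect, score in scores:
    by_portal_aspect[(portal, aspect)].append(score)

# Mean score per (portal, aspect), then an overall mean per portal.
aspect_means = {key: mean(vals) for key, vals in by_portal_aspect.items()}
portal_means = defaultdict(list)
for (portal, _aspect), m in aspect_means.items():
    portal_means[portal].append(m)

ranking = sorted(portal_means, key=lambda p: mean(portal_means[p]), reverse=True)
print(aspect_means)   # e.g. {("UK", "dataset description"): 2.5, ...}
print(ranking)        # portals ordered by overall mean score
```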

3.2. Stage 2: Qualitative Assessment

After completion of the first stage, the study moved on to the qualitative assessment phase. This set out to enhance and clarify any insights derived from stage 1 through the use of in-depth interviews. This technique, described as ‘conversation with purpose’ [44,45], is commonly deployed in qualitative research as an effective means of gaining a more meaningful understanding of participant experiences. According to Alrasbi [46], ‘interviews enable the detailed follow-up of points arising from the analysis of quantitative data of complex topics, [to discover] thoughts and feelings that cannot be directly observed.’

3.2.1. Evaluator Interviews

Not all evaluators who assessed the portals were invited to participate in a follow-up interview. The sample size for interviews was determined using a saturation approach, as recommended by Gorard [47] and Seyyedamiri and Khosravani [48]. In the current study, the sample size used was 10, with a balance of 6 from a library science background and 4 from computing science. This is significantly higher than the 6–7 interviews required to deliver 80% saturation (i.e., to identify 80% of recurring themes), although less than the 11–12 interviews required to achieve 95% saturation [49]. The default interview type was a face-to-face appointment, and all participants consented to the recording of interviews. All voice recordings were then transcribed to text on the same day to ensure no loss of accuracy. Interviews lasted between 45 and 60 min and were conducted in Arabic (i.e., the native language of both interviewees and interviewer). Transcripts were then translated into English and back again to ensure transcription accuracy [50], and all interviewees were asked to comment on the accuracy of the final transcription [24]. As a result of the feedback from this process, some clarifications to transcripts were made.

3.2.2. Data Analysis

The analysis of the interview transcripts was carried out in accordance with the six-phase thematic analysis technique described by Braun and Clarke [51]. In broad terms, this involves three principal steps. First, open coding is carried out to allow ‘units of meaning’ to be identified from text segments, such as sentences and paragraphs [52]. These text segments are then assigned a code to describe their content. Finally, the coded segments are listed under broader categories (themes) to identify patterns or relationships between them.
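As a toy illustration of the later steps of this process, the fragment below groups hand-coded text segments under broader themes; the segments, codes, and themes are invented for the example and do not come from the study’s transcripts.

```python
# A toy illustration of the coding-to-themes step: coded text segments are
# grouped under broader themes. Segments, codes, and themes are invented
# examples, not material from the study's transcripts.
from collections import defaultdict

coded_segments = [
    ("dataset descriptions were just titles", "shallow metadata"),
    ("no way to rate dataset quality", "missing feedback channels"),
    ("the contact form made feedback easy", "missing feedback channels"),
]

code_to_theme = {
    "shallow metadata": "dataset specification",
    "missing feedback channels": "user engagement",
}

segments_by_theme = defaultdict(list)
for segment, code in coded_segments:
    segments_by_theme[code_to_theme[code]].append(segment)

for theme, segments in segments_by_theme.items():
    print(f"{theme}: {len(segments)} segment(s)")
```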

4. Results

Here, we report the outcomes of the scoring exercise, as augmented by comments that the evaluators gave during the interviews, selected in the light of identified themes. These comments help us to understand more clearly the usability issues underlying the scores given, as seen from the perspective of the evaluators as users themselves.

4.1. Category 1: Open Dataset Specification (7 Aspects)

Although all portals provided a description of the dataset (aspect 1), the mean score for all portals for this criterion was relatively low. This was principally because of the superficiality of the dataset description, as highlighted by several evaluators. One evaluator, for example, commented:
Although all the portals included descriptions of published datasets, this consisted of just the title of the dataset. If portals are to achieve their specified goals effectively and efficiently, they need to provide more information and context.
Another evaluator pointed out that:
As you know, not all users of portals are experts, so they need guidance in understanding the relevance and currency of datasets. Most portals provided very little support in this respect.
Another important usability criterion is the publisher information (aspect 2). In fact, this proved to be the highest scoring aspect in the category (aspect mean = 2.88), as all portals scored reasonably highly for this criterion and provided a publisher email address (anonymised via the portal) for questions. However, the information supplied was not always comprehensive and could prevent users from identifying relevant datasets or understanding their provenance. Comments by evaluators included:
Basic information, such as the publishing organisation, was always provided, but several portals failed to provide information such as data source(s) and dataset version.
Full publisher information is an important element of portal usability, not just to provide user information, but also to give users confidence that the dataset is relevant, current, and reliable. This level of information was not always available from the portals evaluated.
While France was rated very highly (with a maximum score of 3) in the use of thematic categories and tags (aspect 3), other portals varied in their implementation of this aspect. Some portals had designed specific tags, while others used a more generalised approach, thereby reducing the effectiveness of the searching process. For instance, the French portal used a four-level hierarchy (keywords, origination, licenses, and format) to distinguish datasets. According to evaluators:
The use of categories and tags can help to make the use of portals very easy and straightforward, especially when datasets are split into clear sections or themes. I felt that some portals could have made better use of more advanced features such as type-ahead tagging, while others could have used graphical techniques such as colour and underlining to improve clarity further still.
The importance of the presentation of portals is easy to overlook, but it is a key element of usability. Although some portals did this well, especially France, others could make the process of searching for and identifying datasets much easier and clearer.
Although information about data publication or modification dates (aspect 4) is also a part of aspect 2 (publisher information), it is important enough to be considered as a separate and independent aspect. Most portals scored relatively highly in this respect, displaying the relevant information clearly at the bottom of the page, although Italy and Japan presented it on the side of the page. From the user experience perspective, this is not a trivial point, as metadata typically consist of many fields, so visual presentation is an important part of the clarity and ease of use. Fields with less importance can be located at the bottom of the page or hidden from view altogether. This point was highlighted by several evaluators. For example:
From my point of view, OGD portals should be user-centred; in other words, focused on the needs of the user, not the provider. However, many government sites do not seem to take this approach.
Working with open data portals should be user-friendly, and this can only be achieved through feedback from end users, who should be asked to evaluate portal structure and design. For example, navigation should be made as easy as possible, ensuring that important information is clearly visible. Unless this is the case, users can easily get lost.
Datasets were downloadable from all portals in a range of machine-readable formats (aspect 5). The most commonly used formats were JavaScript Object Notation (JSON), comma-separated values (CSV), and Excel files. However, not all portals offered all formats, and some were limited in the formats provided.
The availability of public data in machine-readable format is an important part of service delivery and can play a key role in the perception of usefulness and usability by the public. However, it is an aspect which is sometimes underrated by portal designers.
Most of the common machine-readable formats were provided by most portals, but most could still be improved further by including some of the more common quasi-machine-readable formats, such as PDF, which is familiar to most users.
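As a brief illustration of why machine-readable downloads matter for re-use, the sketch below loads a CSV file directly into pandas for analysis; the URL is a hypothetical placeholder, not one of the evaluated portals.

```python
# A brief illustration of why machine-readable downloads matter for re-use:
# a CSV file can be loaded directly into an analysis tool, whereas a PDF
# generally cannot. The URL is a hypothetical placeholder.
import pandas as pd

CSV_URL = "https://example-portal.gov/dataset/air-quality.csv"  # hypothetical

df = pd.read_csv(CSV_URL)  # pandas can read remote CSV files directly
print(df.head())           # inspect the first rows before analysis
print(df.describe())       # quick summary statistics of numeric columns
```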
The most poorly-rated aspect in this category (aspect mean = 1.08) was analytic and visualisation tools (aspect 7), which were either basic or missing in all portals evaluated. This uniformly low rating, however, might be explained by the fact that there is no standard framework or guideline on how such tools should be implemented. Several evaluators emphasised the importance of including advanced features such as APIs and visualisation features to increase the usability of the open data portals. In their words:
I found that only the French, Italian, Canadian, and UK portals provided an API. However, this is a feature which I believe should be included in the portal to facilitate integration, allowing data to be integrated into third-party applications easily and seamlessly. It might only be of interest to the more technical users, but it’s very important for increasing usability.
The inclusion of a visualisation feature to support user experience is clearly very important. It is particularly important now that the availability of tools such as Microsoft Power BI and Tableau has increased user expectations.

4.2. Category 2: Open Dataset Feedback (4 Aspects)

This element of the study (involving 4 aspects) is concerned with the mechanisms used and support provided to encourage active engagement with the G7 portals by stakeholders. This appears to be a significant weakness of current OGD portals, as illustrated by the following comments from evaluators:
It seems that for some time, there has been a lack of active engagement between end users and OGD portal providers. In my view, it is crucial to provide clear instructions and documentation for use of the portal, as well as a mechanism for users to ask questions. This will build active engagement between all stakeholders.
Making government data openly available should be just the start of a conversation between those inside and those outside of government. To that end, portals should be designed to facilitate and deepen that discussion, but I have seen little evidence of such a design philosophy in most portals.
In this study, most evaluators found the UK and Canada to be the best in terms of the provision of supporting documentation (aspect 1), although none of the portals scored particularly well. The following evaluator comments are representative of others:
Most of the portals evaluated did not supply a huge amount of detailed documentation, except the UK and Canada. This is disappointing, as it is a good way to increase the usability of open data portals. Although it’s true that some users don’t need much (they just need a summary or basic guide), many other people are less technical and need much more comprehensive documentation, such as that provided by the UK and Canada.
In one way at least, OGD portals are much like any software or technology product … they need to be supported by clear documentation. A technology product that doesn’t have a user manual would not be very popular and would generate lots of complaints. The same is true with portals … if governments are serious about encouraging engagement, they must provide good documentation. With a couple of exceptions … Canada and UK … this generally wasn’t the case for the portals evaluated.
Aspect 2 (a mechanism for providing feedback) was also generally poor. While all portals provided contact details for giving feedback, only three portals (USA, UK, and Germany) made this easy by providing a simple contact form (subject, email address, message, and reason for contacting) for easy submission. The portals of Canada and Germany also provided space for comments to be left by users at the bottom of the page, and the USA, UK, and Canadian portals allowed users to make recommendations for new features and improvements. As one evaluator put it:
Most development environments rely heavily on user feedback as a key element in product improvement. Governments should follow suit and improve their feedback systems if they want to improve the usage and performance of their portals.
Another evaluator felt similarly:
While some countries, such as the UK and Germany, seem to fully appreciate the value of user feedback, others seem uninterested in user experiences and needs. This can only result in suboptimal portals which fail to engage users over the longer term.
The ability to link to, and share information on, external sites such as social media platforms (aspect 3), is an obvious way to encourage engagement and to develop awareness of the benefit of portals. Most of the portals evaluated were moderate to good in terms of this, making it the highest scoring aspect of the category (aspect mean = 2.86), although there is clearly room for more development. According to one evaluator:
Creating a distribution channel for open data by linking portals to social media platforms is a powerful way of enhancing their applications and usefulness. Most of the portals studied have clearly realised this and provide reasonable mechanisms for linking the two environments.
Another evaluator pointed to the missed opportunity of not providing social media links:
The integration of government data, which is objective, and social media data, which is subjective, has the potential to improve decision-making by combining complementary points of view of the same problem. It’s, therefore, in a government’s interests to enable such integration.
According to Nikiforova [11,12], there will be little desire to utilise OGD if the quality of the data is poor. It is, therefore, in the interests of any government to ensure that dataset quality is high. This requires user feedback on dataset quality. However, the scores for the user rating and comment element (aspect 4) of the portals were very low in four of the seven evaluated, making this the lowest-rated aspect in the category (aspect mean = 1.72). Only France, Italy, and Germany provided a clear and easy-to-use mechanism for rating the quality of datasets. One evaluator commented:
Data quality is at the very heart of portal success. Without data quality, even the most sophisticated portal is worthless, so it is rather surprising that the issue seems to be so low on the priority list for several countries.
Another supported this view:
I was really shocked to find that only three countries gave users a way of feeding back on dataset quality. It’s almost as if they don’t want to know how good or bad the data is, which calls into question the reason for providing a portal in the first place. Is it just a cosmetic exercise to make the government look accountable and transparent?

4.3. Category 3: Open Dataset Request (3 Aspects)

The analysis of this category showed that all portals evaluated in the study included a data request form (aspect 1) without asking for registration, supporting the basic philosophy behind OGD. The following comments illustrate the general view:
In my view, asking for user registration contradicts the very idea of open data, which is that data should be freely available. It is hard to see why datasets involving subjects such as traffic, weather, and education should not be used without restrictions, in which case registration is unnecessary and limiting.
The process of requesting more or new data is supported by all the portals without registration, which carries no threat that the user may be, in some way, penalised for the way they use the data obtained through the portal. This is likely to act as encouragement for some, if not many, potential users.
The ability to provide a list of requests (aspect 2) was completely unfulfilled by all portals, making it the lowest-scoring aspect in the study (aspect mean = 1). This fact was considered unfortunate by all evaluators. As one evaluator phrased it:
Providing a list of requests concerning user submissions is a basic requirement, and it seems a gross oversight that this ability isn’t provided by any of the portals.
Another evaluator commented:
A request facility serves to give users confidence that the system is being used productively, and the failure to provide such a facility can only degrade user engagement.
User involvement (aspect 3), which is defined as allowing users to be directly and actively involved with the providers of datasets, is another important issue. However, all of the portals in the study scored badly in this respect. This is perhaps unsurprising, given that it is known from the literature that OGD users and providers tend to have little interaction. One evaluator commented:
Although there are recognised practical issues, as well as ideological ones, involved in improving user–provider interaction, there are some clear benefits in doing so, and it is regrettable that the portals involved haven’t yet resolved some of the issues and moved forward.
Another evaluator said:
Providing a means by which users can play an active role in shaping all aspects of portal use, not just data, is key to the future of OGD portals. Unless users feel that they are valued stakeholders, engagement is likely to be lower than it would otherwise be.

4.4. Summary

The outcomes in terms of comparisons between the national portals for each of the categories are shown in Table 3, Table 4 and Table 5, where we have highlighted the best and worst scores for each aspect. One can see that the picture is complex and does not invite arrangement of the portals on a simple linear scale. Some portals are better on some aspects, some on others. On the face of it, the US comes out best and Japan poorest, but this is not uniform in either case and is especially due to their divergent results on the various aspects of open dataset specification. However, our objective is not to derive a league table; rather, we aim to highlight the experiences, both good and bad, and the principal concerns of users in ways that will generally apply across portals and help to indicate directions for improvement of all.

5. Discussion

The context of this study is the increasingly common implementation of OGD portals across the world [53,54,55] and associated debate about the true potential of open data itself [4,33,56]. Some argue that governments that are more open with data will automatically become more transparent in other ways; others, however, contend that true openness is an illusion, and most portals only show data in ways determined by the government to suit political objectives [24,57]. This may reduce the interest of some users in accessing data. The current study examined the usability of OGD portals implemented by governments, specifically those of the G7, who are aware of the advantages of open use of data and are looking to make their portals more accessible and usable. It is vital to take a perspective on ‘usability’ that ensures the research findings can be interpreted appropriately. Our approach is discussed in Section 3 of this paper; in brief, we characterise ‘usability’ as the ability to facilitate easy portal deployment for given users and tasks. We focus on professional users and argue for a standardised set of tasks.
Interpretation of the study’s findings must acknowledge that the quality of a portal, including its usability, is not the same as the quality of the data included within it. These are distinct concepts; for instance, an OGD portal could have low usability but provide high-quality data, and likewise the converse. This study only examined usability with no focus on the quality of the data, as previous studies have covered the latter in some detail. Table 2 includes items concerned with the ease and accuracy of access to data, including arguably the quality of metadata, treated as aspects of usability that are distinct from the quality of data in itself. Similarly, the opportunity for portal users to offer feedback on the quality of the data provided (among other things) is an aspect of portal usability; hence, the benchmarking framework of the study did include an evaluation of feedback mechanisms for reporting data quality problems.
The study findings show that while some of the suggestions made in prior studies have been implemented, there is a wide range of variation among OGD portals, and many weaknesses remain even in the stronger portals. A lesson perhaps is that portal designers need to more clearly consider the full range of users and the possible tasks that they might have. Portals may have emerged through a piecemeal development of specific functionalities and could benefit from a more holistic approach.
The study also highlighted the fact that much more could be done to encourage user engagement with portals. However, it should be recognised that this is a complex issue, and at least some of the problem lies with embedded cultural issues rather than failings of portal design and implementation. The fact that datasets must be accessed digitally, for example, can be a barrier for some people who may not have the devices or technical skills necessary [58]. Providing adequate support for those who need to develop such skills can be a major challenge for public sector organisations [59].
On the other hand, portals that lack usability are unlikely to increase user engagement, and there are possible approaches to developing enhanced usability features that may improve engagement with portals. Attard et al. [25], for instance, highlighted the potential for using virtual community channels for a number of purposes, such as advertising the availability of new datasets, visualising datasets in the form of maps, standardising the metadata vocabularies used to describe datasets, and promoting tools and documentation that can help in the reuse of data.
It is also worth mentioning that open data can be accessed from places other than just portals [19,25]. However, data portals are generally the main tool used in open data initiatives, indicating that other approaches to providing open data are less effective or rely on technology that does not yet meet users’ needs and requirements. That said, this is not to suggest that other approaches cannot be useful as well. Indeed, Neumaier et al. [19] posit the idea of metadata repositories; such repositories would be capable of storing dataset metadata on a variety of portals at different administrative levels, thereby increasing the availability and accessibility of open data. This notion drove van der Waal et al. [26] to recommend the harvesting, aggregation, and publication of the metadata found on OGD portals into these repositories. The discovery of data can be increased as long as specific vocabularies are developed and links are provided to other sources of data [26].
Another key aspect related to accessibility and discoverability is the interoperability of a portal, as well as the portability of its associated datasets, which can be supported by the W3C DCAT Data Catalogue Vocabulary [60]. The study showed that in this respect, most G7 portals were making strenuous efforts to deliver high levels of discoverability and interoperability. Finally, it should not be forgotten that providing access to data can compromise security, leading to trust issues among users [56]. One study [61] showed that stakeholders strongly prefer to remain anonymous when utilising open data. Again, this study found that the evaluated portals allowed anonymity.
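As an illustration of the kind of interoperability DCAT provides, the sketch below builds a minimal DCAT-style metadata record with the rdflib library; the dataset details are invented, while the DCAT and Dublin Core terms are the real W3C vocabularies referenced above.

```python
# A minimal sketch of a DCAT-style metadata record built with rdflib.
# The dataset details are invented; the DCAT and Dublin Core terms are
# the real W3C vocabularies referenced above.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, DCTERMS

DCAT = Namespace("http://www.w3.org/ns/dcat#")

g = Graph()
g.bind("dcat", DCAT)
g.bind("dct", DCTERMS)

ds = URIRef("https://example-portal.gov/dataset/road-traffic")       # hypothetical
dist = URIRef("https://example-portal.gov/dataset/road-traffic/csv")  # hypothetical

g.add((ds, RDF.type, DCAT.Dataset))
g.add((ds, DCTERMS.title, Literal("Road traffic counts (example)")))
g.add((ds, DCTERMS.publisher, Literal("Example Transport Agency")))
g.add((ds, DCAT.distribution, dist))
g.add((dist, RDF.type, DCAT.Distribution))
g.add((dist, DCAT.downloadURL, URIRef("https://example-portal.gov/files/traffic.csv")))
g.add((dist, DCTERMS.format, Literal("text/csv")))

print(g.serialize(format="turtle"))  # portable, machine-readable metadata
```

Because such records use shared vocabularies rather than portal-specific schemas, catalogue metadata published this way can be harvested and aggregated across portals, which is exactly the discoverability benefit discussed above.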

6. Conclusions

This study used a heuristic, expert-driven framework to (a) analyse and compare the usability of the OGD portals of the G7 group of countries and (b) determine which aspects of these portals are commonly perceived as weak or missing. The findings showed that although all portals were rated by users as good in some aspects of their design, particularly in terms of basic features, user ratings for most portals dropped significantly when it came to more advanced features, such as analytic and visualisation tools. All portals were also lacking in efforts to encourage active engagement by users through features such as user feedback forms and user involvement mechanisms.
It is clear from the study’s findings that not all portals provided by the G7 governments offer the same opportunities for stakeholder engagement. However, although the study highlighted some key aspects of portal design and functionality that could be improved to increase engagement, it did not attempt to assess or evaluate data quality or value, which is a separate issue from portal usability. Moreover, this study did not focus on government data initiatives, nor on the privacy or security limitations that could have an impact on the accessibility or availability of certain datasets. There is an opportunity for future studies to further explore the various thematic categories of the data provided by portals and the needs of the stakeholders that use them. Furthermore, although Máchová and Lnenicka [54] have shown that many portals already provide statistics related to datasets, further studies would help to define the data categories and themes that would best complement usability and help to enhance engagement. While Kucera [62] provided an overview of this issue, arguing that their methodology would increase dataset discoverability, they did not evaluate or discuss how this might affect usability.
While it is believed that the current study provides many useful insights on OGD portal usability, it also has some clear limitations. The type and number of participants in the study, for example, and the sample selection process may not have facilitated a full understanding of portal usage, and may limit the generalisability of the study’s findings and conclusions. In order for findings to be more generalisable, future studies should employ a more diverse pool of participants. Another possible limitation of the study lies in the evaluator assessment process. It should also be noted that the intention of the current study was not to develop a framework for improving the usability of OGD portals; rather, it sought to make a contribution to the debate surrounding the importance of evaluating their usability. It was considered that the use of an expert-driven evaluation creates a foundation for future study, particularly with regard to a list of heuristics.
There is an acknowledgement that some of the usability differences, as well as the perceived weaknesses, of the OGD portals evaluated in this study may be due to social, economic, and political differences among the countries in which they are implemented. This could be clarified through further studies utilising Máchová et al.’s [16] heuristic framework, for which the current study provided further validation. However, this study demonstrates that although the G7 countries may understand the need for transparency and that OGD portals can be a tool through which this can be achieved, their potential is not being maximised through the use of available technologies and mutual learning. Hence, this study’s findings and recommendations for practice may be of considerable use to a range of actors, such as open data portal designers, data providers, and government authorities.

Author Contributions

Conceptualization, I.M., A.A. (Abdullah Almuqrin), J.L. and J.Z.Z.; methodology, I.M., A.A. (Abdullah Almuqrin) and J.L.; validation, J.Z.Z., A.A. (Abdulaziz Alomran), T.O. and A.H.; formal analysis, T.O., A.F. and A.H.; writing—original draft preparation, A.H. and A.F.; writing—review and editing, I.M., A.A. (Abdullah Almuqrin), J.L. and J.Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Deanship of Scientific Research at King Saud University, research group number RG-1441-527.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board (Human and Social Researches) of King Saud University (Ref. No.: KSU-HE-18-242 and date of approval 27 November 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available on request due to privacy restrictions.

Acknowledgments

This research was funded by the Deanship of Scientific Research at King Saud University, research group number RG-1441-527.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. OECD. Open Government Data. Available online: https://www.oecd.org/gov/digital-government/open-government-data.htm (accessed on 2 September 2021).
2. Allied Media Projects. The Opening Data Zine is Here! Available online: https://alliedmedia.org/wp-content/uploads/2020/10/ddjc_zine-final-rgb.pdf (accessed on 2 September 2021).
3. Statista. Share of Countries with Associated Features of Open Government Data (OGD) Portals in 2020. Available online: https://www.statista.com/statistics/421880/open-government-data-sector/ (accessed on 2 September 2021).
4. Janssen, M.; Charalabidis, Y.; Zuiderwijk, A. Benefits, adoption barriers and myths of open data and open government. Inf. Syst. Manag. 2012, 29, 258–268.
5. Osagie, E.; Waqar, M.; Adebayo, S.; Stasiewicz, A.; Porwol, L.; Ojo, A. Usability evaluation of an open data platform. In Proceedings of the 18th Annual International Conference on Digital Government Research, New York, NY, USA, 7–9 June 2017; ACM: New York, NY, USA, 2017.
6. Boychuk, M.; Cousins, M.; Lloyd, A.; MacKeigan, C. Do we need data literacy? Public perceptions regarding Canada's open data initiative. Dalhous. J. Interdiscip. Manag. 2016, 12, 1–26.
7. Welle Donker, F.; van Loenen, B. How to assess the success of the open data ecosystem? Int. J. Digit. Earth 2017, 10, 284–306.
8. Meijer, A. E-Governance innovation: Barriers and strategies. Gov. Inf. Q. 2015, 32, 198–206.
9. Ruijer, E.; Meijer, A. Open government data as an innovation process: Lessons from a living lab experiment. Public Perform. Manag. Rev. 2020, 43, 613–635.
10. Young, M.; Yan, A. Civic Hackers' User Experiences and Expectations of Seattle's Open Municipal Data Program. Available online: https://scholarspace.manoa.hawaii.edu/bitstream/10125/41480/1/paper0331.pdf (accessed on 2 September 2021).
11. Nikiforova, A. Open data quality evaluation: A comparative analysis of open data in Latvia. Balt. J. Mod. Comput. 2018, 6, 363–386.
12. Nikiforova, A. Definition and evaluation of data quality: User-oriented data object-driven approach to data quality assessment. Balt. J. Mod. Comput. 2020, 8, 391–432.
13. McBride, K.; Aavik, G.; Toots, M.; Kalvet, T.; Krimmer, R. How does open government data driven co-creation occur? Six factors and a 'perfect storm'; insights from Chicago's food inspection forecasting model. Gov. Inf. Q. 2019, 36, 88–97.
14. World Wide Web Foundation. Open Data Barometer: Leaders Edition. Available online: https://opendatabarometer.org/doc/leadersEdition/ODB-leadersEdition-Report.pdf (accessed on 2 September 2021).
15. Open Data in Europe. Open Data Maturity. Available online: https://data.europa.eu/sites/default/files/open_data_maturity_report_2019.pdf (accessed on 2 September 2021).
16. Máchová, R.; Hub, M.; Lnenicka, M. Usability evaluation of open data portals: Evaluating data discoverability, accessibility, and reusability from a stakeholders' perspective. Aslib J. Inf. Manag. 2018, 70, 252–268.
17. Tang, R.; Gregg, W.; Hirsh, S.; Hall, E. U.S. state and state capital open government data (OGD): A content examination and heuristic evaluation of data processing capabilities of OGD sites. Proc. Assoc. Inf. Sci. Technol. 2019, 56, 255–264.
18. Martin, S.; Foulonneau, M.; Turki, S. 1–5 stars: Metadata on the openness level of open data sets in Europe. In Metadata and Semantics Research; Garoufallou, E., Greenberg, J., Eds.; Springer International Publishing: New York, NY, USA, 2013; pp. 234–245.
19. Neumaier, S.; Umbrich, J.; Polleres, A. Automated quality assessment of metadata across open data portals. J. Data Inf. Qual. 2016, 8, 1–29.
20. European Data Portal. Recommendations for Open Data Portals: From Setup to Sustainability. 2020. Available online: https://data.europa.eu/sites/default/files/edp_s3wp4_sustainability_recommendations.pdf (accessed on 2 September 2021).
21. Luthfi, A.; Janssen, M.; Crompvoets, J. Stakeholder tensions in decision-making for opening government data. In Proceedings of the Business Modeling and Software Design 10th International Symposium, Berlin, Germany, 6–8 July 2020; Shishkov, B., Ed.; Springer International Publishing: Cham, Switzerland, 2020; pp. 331–340.
22. McBride, K.; Olesk, M.; Kütt, A.; Shysh, D. Systemic change, open data ecosystem performance improvements, and empirical insights from Estonia: A country-level action research study. Inf. Polity 2020, 25, 377–402.
23. Lourenço, R.P. An analysis of open government portals: A perspective of transparency for accountability. Gov. Inf. Q. 2015, 32, 323–332.
24. Zuiderwijk, A.; Janssen, M. Open data policies, their implementation and impact: A framework for comparison. Gov. Inf. Q. 2014, 31, 17–29.
25. Attard, J.; Orlandi, F.; Scerri, S.; Auer, S. A systematic review of open government data initiatives. Gov. Inf. Q. 2015, 32, 399–418.
26. Van der Waal, S.; Węcel, K.; Ermilov, I.; Janev, V.; Milošević, U.; Wainwright, M. Lifting open data portals to the data web. In Linked Open Data: Creating Knowledge out of Interlinked Data; Auer, S., Bryl, V., Tramp, S., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 175–195.
27. Marienfeld, F.; Schieferdecker, I.; Lapi, E.; Tcholtchev, N. Metadata aggregation at GovData.de. In Proceedings of the 9th International Symposium on Open Collaboration, Hong Kong, China, 5–7 August 2013; Association for Computing Machinery: New York, NY, USA, 2013; pp. 1–5.
28. Nikiforova, A. Timeliness of open data in open government data portals through pandemic-related data: A long data way from the publisher to the user. In Proceedings of the Fourth International Conference on Multimedia Computing, Networking and Applications, Valencia, Spain, 19–22 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 131–138.
29. Kim, H. Analysis of standard vocabulary use of the open government data: The case of the public data portal of Korea. Qual. Quant. 2019, 53, 1611–1622.
30. Matheus, R.; Ribeiro, M.M.; Vaz, J.C. New perspectives for electronic government in Brazil: The adoption of open government data in national and subnational governments of Brazil. In Proceedings of the 6th International Conference on Theory and Practice of Electronic Governance, Albany, NY, USA, 22–25 October 2012; Association for Computing Machinery: New York, NY, USA, 2012.
31. Matheus, R.; Ribeiro, M.M.; Vaz, J.C.; de Souza, C.A. Anti-corruption online monitoring systems in Brazil. In Proceedings of the 6th International Conference on Theory and Practice of Electronic Governance, Albany, NY, USA, 22–25 October 2012; Association for Computing Machinery: New York, NY, USA, 2012.
32. Nikiforova, A.; McBride, K. Open government data portal usability: A user-centred usability analysis of 41 open government data portals. Telemat. Inform. 2021, 58, 101539.
33. Zuiderwijk, A.; Janssen, M.; Choenni, S.; Meijer, R.; Alibaks, R.S. Socio-technical impediments of open data. Electron. J. E-Gov. 2012, 10, 156–172.
34. Matheus, R.; Janssen, M. A systematic literature study to unravel transparency enabled by open government data: The window theory. Public Perform. Manag. Rev. 2020, 43, 503–534.
35. Dawes, S.S.; Vidiasova, L.; Parkhimovich, O. Planning and designing open government data programs: An ecosystem approach. Gov. Inf. Q. 2016, 33, 15–27.
36. Lnenicka, M.; Machova, R.; Volejníková, J.; Linhartová, V.; Knezackova, R.; Hub, M. Enhancing transparency through open government data: The case of data portals and their features and capabilities. Online Inf. Rev. 2021, in press.
37. Lnenicka, M.; Nikiforova, A. Transparency-by-design: What is the role of open data portals? Telemat. Inform. 2021, 61, 101605.
38. Weerakkody, V.; Irani, Z.; Kapoor, K.; Sivarajah, U.; Dwivedi, Y.K. Open data and its usability: An empirical view from the citizen's perspective. Inf. Syst. Front. 2017, 19, 285–300.
39. Johnson, R.B.; Onwuegbuzie, A.J. Mixed methods research: A research paradigm whose time has come. Educ. Res. 2004, 33, 14–26.
40. Huang, Z.; Benyoucef, M. Usability and credibility of e-government websites. Gov. Inf. Q. 2014, 31, 584–595.
41. Nielsen, J. Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Celebrating Interdependence, Boston, MA, USA, 24–28 April 1994; ACM Press: New York, NY, USA, 1994; pp. 152–158.
42. Nielsen, J.; Molich, R. Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Empowering People, Seattle, WA, USA, 1–5 April 1990; ACM Press: New York, NY, USA, 1990; pp. 249–256.
43. Tan, W.; Liu, D.; Bishu, R. Web evaluation: Heuristic evaluation vs. user testing. Int. J. Ind. Ergon. 2009, 39, 621–627.
44. Maykut, P.; Morehouse, R. Beginning Qualitative Research: A Philosophic and Practical Guide; Falmer Press: London, UK, 1994.
45. Bryman, A. Social Research Methods; Oxford University Press: Oxford, UK, 2004.
46. Alrasbi, H. Motivation of Omani Schoolteachers. Ph.D. Thesis, The University of Edinburgh, Edinburgh, UK, 2013.
47. Gorard, S. Quantitative Methods in Educational Research: The Role of Numbers Made Easy; Continuum: London, UK, 2001.
48. Seyyedamiri, N.; Khosravani, A. Identification of the effective e-promotional tools on improving destination brand image. J. Glob. Inf. Manag. 2020, 28, 169–183.
49. Guest, G.; Namey, E.; Chen, M. A simple method to assess and report thematic saturation in qualitative research. PLoS ONE 2020, 15, e0232076.
50. Brislin, R.W. Back-translation for cross-cultural research. J. Cross-Cult. Psychol. 1970, 1, 185–216.
51. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101.
52. Hachicha, Z.S.; Mezghani, K. Understanding intentions to switch toward cloud computing at firms' level. J. Glob. Inf. Manag. 2018, 26, 136–165.
53. Kubler, S.; Robert, J.; Le Traon, Y.; Umbrich, J.; Neumaier, S. Open data portal quality comparison using AHP. In Proceedings of the 17th International Digital Government Research Conference on Digital Government Research, Shanghai, China, 8–10 June 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 397–407.
54. Máchová, R.; Lnenicka, M. Evaluating the quality of open data portals on the national level. J. Theor. Appl. Electron. Commer. Res. 2017, 12, 21–41.
55. Umbrich, J.; Neumaier, S.; Polleres, A. Quality assessment and evolution of open data portals. In Proceedings of the 3rd International Conference on Future Internet of Things and Cloud, Rome, Italy, 24–25 August 2015; IEEE: Los Alamitos, CA, USA, 2015; pp. 404–411.
56. Barry, E.; Bannister, F. Barriers to open data release: A view from the top. Inf. Polity 2014, 19, 129–152.
57. Sieber, R.E.; Johnson, P.A. Civic open data at a crossroads: Dominant models and current challenges. Gov. Inf. Q. 2015, 32, 308–315.
58. Millette, C.; Hosein, P. A consumer focused open data platform. In Proceedings of the 3rd MEC International Conference on Big Data and Smart City, Muscat, Oman, 15–16 March 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–6.
59. Susha, I.; Grönlund, Å.; Janssen, M. Organizational measures to stimulate user engagement with open data. Transform. Gov. People Process Policy 2015, 9, 181–206.
60. W3C. Data Catalog Vocabulary (DCAT)—Version 2. Available online: https://www.w3.org/TR/vocab-dcat-2/ (accessed on 19 November 2021).
61. Ojo, A.; Porwol, L.; Waqar, M.; Stasiewicz, A.; Osagie, E.; Hogan, M.; Harney, O.; Zeleti, F.A. Realizing the innovation potentials from open data: Stakeholders' perspectives on the desired affordances of open data environment. In Collaboration in a Hyperconnected World, Proceedings of the 17th Working Conference on Virtual Enterprises, Porto, Portugal, 3–5 October 2016; Afsarmanesh, H., Camarinha-Matos, L., Lucas Soares, A., Eds.; Springer: Berlin, Germany, 2016; pp. 48–59.
62. Kucera, J. Open government data publication methodology. J. Syst. Integr. 2015, 6, 52–61.
Table 1. The G7 portals used in the study.

Country | OGD Portal URL | Accessed Date
United States | https://www.data.gov/ | 3 September 2021
United Kingdom | https://www.data.gov.uk/ | 3 September 2021
France | https://www.data.gouv.fr/ | 3 September 2021
Germany | https://www.govdata.de/ | 3 September 2021
Italy | https://www.dati.gov.it/ | 3 September 2021
Japan | http://www.data.go.jp/ | 3 September 2021
Canada | http://www.open.canada.ca/ | 3 September 2021
Table 2. Portal assessment criteria, as originally published by Máchová et al. [16] (p. 256).

Category | Aspect | Description
Open dataset specification | (a) Description of dataset | Portal provides datasets together with their description and how and for what purpose they were collected.
 | (b) Publisher of dataset | Portal provides information about the organization that published the datasets.
 | (c) Thematic categories and tags | Portal provides thematic categories of datasets to address the main topics covered. It distinguishes categories (themes) from tags (keywords).
 | (d) Release date and up to date | Datasets are associated with a time or period tag; that is, date published, date updated, and frequency.
 | (e) Machine-readable formats | Portal provides dataset formats that are machine-readable and allow easy re-use.
 | (f) Open data licence | Portal provides license information related to the use of the published datasets.
 | (g) Visualization and statistics | Portal provides visualization and analytics capabilities to gain information about a dataset, e.g., in charts or visualizations in maps.
Open dataset feedback | (a) Documentation and tutorials | Portal provides high-quality documentation and tutorials to help users.
 | (b) Forum and contact form | Portal provides an opportunity to submit feedback on a dataset from the users to providers, and a forum to discuss and exchange ideas among the users.
 | (c) User rating and comments | Portal provides capabilities allowing the collection of user ratings and comments.
 | (d) Social media and sharing | Portal provides integration with social media technologies to create a distribution channel for open data and sharing feedback.
Open dataset request | (a) Request form | Portal provides a form to request or suggest a new type or format type of open data.
 | (b) List of requests | Portal provides a list of requests received from users, including the current state of request processing.
 | (c) Involvement in the process | Portal provides capabilities allowing involvement in the process of opening the same dataset.
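
For readers who wish to apply the Table 2 framework programmatically, the following minimal Python sketch (our own illustration, not code from the study) encodes the categories and aspects as a checklist; the variable names are ours, and the 1–3 rating scale is an assumption inferred from the range of mean scores (1.00–3.00) reported in Tables 3–5 below.

```python
# A minimal sketch (assumed structure, not the study's instrument code)
# encoding the Table 2 evaluation framework as a fillable scorecard.
# Aspect names follow Table 2; the 1-3 scale is an assumption.

FRAMEWORK = {
    "Open dataset specification": [
        "Description of dataset", "Publisher of dataset",
        "Thematic categories and tags", "Release date and up to date",
        "Machine-readable formats", "Open data licence",
        "Visualization and statistics",
    ],
    "Open dataset feedback": [
        "Documentation and tutorials", "Forum and contact form",
        "User rating and comments", "Social media and sharing",
    ],
    "Open dataset request": [
        "Request form", "List of requests", "Involvement in the process",
    ],
}

def blank_scorecard():
    """Return one empty slot per (category, aspect) pair, to be rated 1-3."""
    return {(category, aspect): None
            for category, aspects in FRAMEWORK.items()
            for aspect in aspects}

card = blank_scorecard()
card[("Open dataset request", "Request form")] = 3  # example rating
```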
Table 3. Results of the usability evaluation, open dataset specification (mean values).

Open Dataset Specification | UK | US | France | Germany | Italy | Japan | Canada | Aspect Mean
(1) Description of dataset | 2.59 | 2.51 | 2.62 | 2.50 | 2.65 | 2.49 | 2.57 | 2.56
(2) Publisher of dataset | 2.91 | 2.93 | 2.87 | 2.85 | 2.92 | 2.81 | 2.86 | 2.88
(3) Thematic categories and tags | 2.87 | 2.90 | 3.00 | 2.81 | 2.84 | 2.71 | 2.72 | 2.84
(4) Release date and up to date | 2.89 | 2.91 | 2.85 | 2.83 | 2.90 | 2.78 | 2.83 | 2.45
(5) Machine-readable formats | 2.78 | 2.81 | 2.90 | 2.80 | 2.93 | 2.79 | 2.79 | 2.81
(6) Open data licence | 2.90 | 2.32 | 2.50 | 2.00 | 2.32 | 2.84 | 2.51 | 2.48
(7) Visualization and statistics | 1.00 | 1.60 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.08
Table 4. Results of the usability evaluation, open dataset feedback (mean values).

Open Dataset Feedback | UK | US | France | Germany | Italy | Japan | Canada | Aspect Mean
(1) Documentation and tutorials | 2.75 | 2.58 | 2.64 | 2.62 | 2.52 | 2.55 | 2.68 | 2.62
(2) Forum and contact form | 2.60 | 2.40 | 2.00 | 2.62 | 2.00 | 2.04 | 2.00 | 2.24
(3) Social media and sharing | 2.89 | 2.91 | 2.85 | 2.83 | 2.90 | 2.78 | 2.83 | 2.86
(4) User rating and comments | 1.00 | 1.00 | 2.80 | 2.50 | 2.75 | 1.00 | 1.00 | 1.72
Table 5. Results of the usability evaluation, open dataset request (mean values).

Open Dataset Request | UK | US | France | Germany | Italy | Japan | Canada | Aspect Mean
(1) Request form | 2.60 | 2.40 | 2.00 | 2.62 | 2.00 | 2.04 | 2.00 | 2.24
(2) List of requests | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
(3) Involvement in the process | 1.82 | 1.92 | 1.00 | 1.80 | 1.84 | 1.70 | 1.72 | 1.69
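
The "Aspect Mean" column in Tables 3–5 is the arithmetic mean of the seven country scores, rounded to two decimal places. A minimal sketch of that calculation follows, using the Table 5 rows as input; the values are copied from the table, while the variable names are our own illustration rather than the study's code.

```python
# Minimal sketch (not from the original study): reproducing the
# "Aspect Mean" column of Table 5 as the arithmetic mean of the
# seven G7 country scores, rounded to two decimal places.
from statistics import mean

# Scores copied from Table 5; country order is
# UK, US, France, Germany, Italy, Japan, Canada.
scores = {
    "Request form":               [2.60, 2.40, 2.00, 2.62, 2.00, 2.04, 2.00],
    "List of requests":           [1.00, 1.00, 1.00, 1.00, 1.00, 1.00, 1.00],
    "Involvement in the process": [1.82, 1.92, 1.00, 1.80, 1.84, 1.70, 1.72],
}

for aspect, values in scores.items():
    print(f"{aspect}: {mean(values):.2f}")
# Prints 2.24, 1.00 and 1.69, matching the Aspect Mean column above.
```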
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
