1. Introduction
The first quarter of the twenty-first century has seen an evolution of privacy legislation, with the Data Protection Act (1998), Freedom of Information Act (2000), EU ePrivacy Directive (2002), General Data Protection Regulation (2018), and UK Data Protection Act (2018). This evolution reflects a concerted effort to adapt legal frameworks to the challenges stemming from technological developments, while also increasing the complexity of the procedures surrounding personal data. This study examines data protection regulations and the challenges of implementing these frameworks. To address these issues, we developed a privacy framework to assist organisations in taking the necessary steps to comply with the UK GDPR, drawing on current best-practice privacy frameworks and the requirements set by the Information Commissioner’s Office.
The UK is a generous nation with a long philanthropic tradition, donating a total of GBP 83 bn to more than 169 k registered charities in 2021. However, this generosity also increases the potential for deliberate harm [
1]. To demonstrate, in 2022, Verizon found that data breaches, both accidental and deliberate, made up approximately 30% of cyber incidents within EMEA (Europe, the Middle East, and Africa) [
2].
The Department for Culture, Media and Sport (DCMS) has repeatedly found that charities are less cyber proficient than their private sector counterparts, a fact noted across several of its Breach Survey Reports. The reasons cited range from a lack of awareness, to an ad hoc approach to cyber security based on informal advice, to charities believing they are not worth attackers’ efforts [
3].
Like many smaller organisations in the public sector, charities tend to focus on spending their income to deliver the services they were set up to provide. This is not surprising, as donors are likely more supportive if the funding they donate is spent on delivering services, rather than on the administration of the charitable organisation [
4]. Larger charities, which tend to be managed in a similar fashion to large private businesses, are more likely to have teams of cyber experts as part of their staffing. Smaller charities, however, are often unable to afford this type of expertise on the payroll, leaving them exposed to cyber threats, including relatively simple cyber-attacks [
5]. As a result, while awareness of both the ethical and legal responsibilities surrounding data privacy is present, there is a lack of knowledge of how to effectively defend electronic data once collected. Many volunteers in charitable organisations are of an older generation [
6], perhaps as they have more time to give in their retirement years, and the older generation is not as technically well versed as the younger generations. According to [
7], it is the fear of vulnerabilities (viruses, phishing, fraud, etc.) associated with using technology that makes the older generation more averse to adopting it.
Another aspect that charities must consider is the additional roles or types of data subject whose data they process, adding complexity to an already challenged sector. Whereas traditional businesses typically deal with data from a few data subject roles or types (e.g., staff, customers and suppliers), charities must also account for donors (who give funds), volunteers (unpaid workers) and beneficiaries (recipients of the service or benefit the charity provides), adding extra layers of consideration to their data-processing practices.
In addition, how data are handled by the charity can also be problematic; for example, once data have been centralised, e.g., by organising them and storing them in a database or in a customer relationship management (CRM) application, they can be described as structured data [
8] and should be relatively easy to manage. Data collected, created, or stored outside of this controlled format may be described as unstructured data [
9], and handling this type of data can be daunting for organisations, as it ideally requires a managed approach in the form of taxonomy, indexing or classification [
10].
Work to update frameworks to accommodate the protection of personal data has already started; for instance, ISO 27001, the ISO standard for information security management [
11], has been enhanced with ISO 27701:2019, providing security techniques for privacy information management [
12]. However, these frameworks are generic to organisations as opposed to catering specifically for a particular industry or sector.
A further problem is the volume of reading required for data privacy compliance, as reflected by DCMS in their 2018 Breach Survey Report, where one charity commented that “short and snappy” documentation would be more desirable for charities to review [
13]. Small to medium-sized enterprises (SMEs) may not have an obligation to appoint a Data Protection Officer (DPO), depending on the volume of records that are processed and how sensitive the personal data being processed are [
14]. Therefore, smaller charities and SMEs may not possess the necessary skills and experience in-house to implement effective data privacy practices.
All organisations have a duty of care to safely handle our personal data and protect our right to a private life. While this right is assured through legislation in the United Kingdom (UK), we all have aspects of our personal lives that we would rather keep more private than others, particularly when things are not going as well as we might like. It is at times like these that we may seek or need the support of charitable organisations, and we therefore argue that not-for-profit organisations and charities have a duty of care above and beyond their legal obligations when managing our personal data.
Thus, in this paper, we present a data privacy framework called “Privacy Essentials!”, designed to enable not-for-profit and charitable organisations to better understand their obligations, from regulatory and legislative perspectives as well as societal expectations, when managing the privacy of stakeholders and digital information. We argue that a vibrant information security culture will benefit an organisation more than adding technical controls. To this end, it is vital for any organisation, including charities, to establish a solid foundation of procedures, processes and policies that complements the charity’s mission statement, thereby improving its security posture. This paper seeks to achieve this by building a framework, Privacy Essentials!, tailored specifically to the charity sector, providing a baseline of documentation that any charity can implement within their business. The idea is that Privacy Essentials! will benefit the charity by demystifying some of the perceived complexity of handling personal and/or sensitive data [
15].
2. Background
As described in
Section 1, within the charitable sector, when it comes to effectively managing data privacy, particularly for smaller charities, there are several problem domains that may arise, including the perception that achieving data protection is problematic [
16]. This can be due to any number of the many factors that influence it as depicted in
Figure 1. Larger organisations, which have processes and skilled staff in place, may well have a high level of security awareness, and thus the issue is not considered problematic. However, for smaller charities, which may not be as robust in their security awareness, it has been suggested that the regulations may prove difficult to implement [
17], and thus, we contend this framework’s outcomes will benefit any charitable organisation in trying to overcome these perceived problems, while allowing the charity to realise further benefits by being responsible custodians of personal data.
Uchendu et al. [
18] noted that organisations can enhance a cyber security culture when management drives this and supports it with appropriate policy, procedures, and awareness. In addition, having a visible security culture will aid charities both in soliciting donors (customers) and in retaining them, which, over time, will lead to increased brand trust and loyalty [
19]. Moreover, charities that view the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA) as having a positive impact on their operations will arguably reap these benefits, as well as managing personal data more efficiently.
2.1. The Right to Privacy
Some industry sectors have historically taken a robust position regarding confidentiality; for example, bankers consider customer privacy fundamental to their industry, born from principles of morality and professionalism [
20]. Likewise, a GP considers patient confidentiality to be an integral part of dealing with medical matters, a value that can be traced back to Hippocrates and remains a principle that is practised today [
21].
In the UK, there was no specifically recognised legal right to privacy until the late 1990s, notably with the Protection from Harassment Act 1997, enacted to meet the UK’s obligations under the European Convention on Human Rights, and, as regards digital data, the Data Protection Act 1998 [
22]. Following on from these initial statutory provisions, in 2018, the General Data Protection Regulation (GDPR) came into effect across the European Union (EU) [
8], and while the UK has now left the EU, the Data Protection Act 2018 (DPA 2018) has been enacted to incorporate the provisions of GDPR into UK law [
23].
2.2. DPA 2018 and UK GDPR
There are other laws and regulations designed to protect UK subjects; however, for the purposes of this study, we elected to concentrate on DPA 2018, as this legislation sets out the privacy obligations for managing data that organisations must abide by. As part of this, Gov.uk sets out six data protection principles as outlined in [
24] that align with GDPR Article 5(1). For the purposes of Privacy Essentials!, we have also included the seventh GDPR principle of accountability (GDPR Article 5(2)) for completeness (
Table 1).
The Information Commissioner’s Office (ICO) also states that the categories of personal data outlined in
Table 2 require stronger protection under UK GDPR Article 9.1.
Furthermore, personal data relating to criminal convictions and offences as outlined in UK GDPR Article 10 also require additional safeguards to be in place.
2.2.1. DPA Individual Rights and Processing of the Records
The UK GDPR also confers rights onto data subjects (the individuals whose data the organisation handles); these rights are outlined in
Table 3.
When processing personal data, UK GDPR Article 30.1 requires controllers to maintain a record of how personal records are processed, requiring the organisation to document the items set out in
Table 4.
2.2.2. Data Protection Impact Assessment (DPIA)
As regards privacy risk, the ICO advises that a DPIA should be undertaken as part of an organisation’s accountability obligations at the start of any new project, or when changes are made to existing processes [
26]. This requirement can be found within DPA 2018 Chapter 4, Sections 64(1) and 64(4).
The definition for high risk has been adopted from the European Data Protection Board (EDPB), covering a wide range of processing operations [
27] including the following:
Large-scale profiling (although no specific number to define “large scale” is presented);
The use of innovative technology (including artificial intelligence);
Matching data obtained from multiple sources (e.g., in 2017, the ICO fined 11 charities GBP 138 k for profiling and matching potential donors [
28]);
Denial of service (based on automated profiling);
Risk of physical harm;
Collection of genetic or biometric data;
Invisible processing (typically related to third party collection of data);
Tracking an individual’s behaviour or geolocation;
Targeting vulnerable persons or children for marketing or automated decisions.
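To make the screening step concrete, the criteria above can be expressed as a simple check: if a proposed processing activity matches any of the EDPB high-risk criteria, a DPIA is indicated. The sketch below is purely illustrative; the criterion names and the one-match-triggers-a-DPIA rule are assumptions for demonstration, not ICO or EDPB logic:

```python
# Illustrative DPIA screening sketch. Criterion names are hypothetical labels
# for the EDPB high-risk operations listed above; a single match flags a DPIA.

HIGH_RISK_CRITERIA = {
    "large_scale_profiling",
    "innovative_technology",
    "data_matching",
    "automated_denial_of_service",
    "risk_of_physical_harm",
    "genetic_or_biometric_data",
    "invisible_processing",
    "behaviour_or_geolocation_tracking",
    "targeting_vulnerable_persons",
}

def dpia_required(activity_flags: set) -> bool:
    """Return True if the activity matches any high-risk criterion."""
    return bool(activity_flags & HIGH_RISK_CRITERIA)

# Example: a campaign that matches donor data from multiple sources.
print(dpia_required({"data_matching"}))    # True
print(dpia_required({"routine_payroll"}))  # False
```

In practice, the ICO recommends documenting the screening decision either way, so a real implementation would record the rationale rather than return a bare boolean.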
Further, the ICO cautions that while some might consider the definitions of “high risk” and “large scale” too vague [
29], there is still no definitive template to follow in conducting a DPIA [
26]. This is because while personal data types will share some common characteristics across the charity sector, e.g., name, e-mail, phone number, etc., how each organisation processes them will vary too widely for a framework to consider effectively. However, certain activities could arguably produce a partly filled template for a charity to complete, such as hiring employees, accepting donations, or delivering benefits.
Any breach, or suspected breach, of personal data, provided that the breach poses a risk to peoples’ rights and freedoms, must be reported to the ICO within 72 h of the charity becoming aware that a breach has, or may have, occurred [
30]. The ICO will then consider, based on how severe the breach is and what measures and technical controls are in place within the charity to protect personal data, what action to take. The ICO has the power to take action against the charity, including imposing significant fines dependent on the outcome of their investigation [
31].
Where it is found that harm has potentially been done to an individual whose data were breached, the organisation may be liable to compensate the individual for that harm, which may prove costly, depending on the severity of the harm [
32]. Recovery expenses, compensation, legal fees, reputational damage, regulatory fines, etc., are just some of the potential costs. However, it is worth bearing in mind that, in terms of how a charity reacts to a data breach, any loss of trust and/or reputational damage can be mitigated by being prepared. This means that having an effective incident response plan in place is important. Thus, a plan that incorporates sincere and apologetic strategies can help an organisation recover positively from a breach [
33].
2.3. Data Protection by Design and Default
DPA 2018 Section 57 requires organisations to implement data protection by design and default (DPPbDD, s. 25), which requires that appropriate organisational and technical measures are implemented to both protect personal data (s. 57(1a)), and to ensure the data are being processed for the specific purpose they were received (s. 57(3)).
The intention is for the framework developed through this study to incorporate these principles, and those others discussed that are relevant (e.g., see
Section 2.2), as an integral part of the development of the Privacy Essentials! framework.
2.4. Data Protection and Digital Information Bill
The Data Protection and Digital Information Bill is a new piece of legislation, currently being proposed by the UK government, devised to make changes to both UK GDPR and PECR [
34]. The initial proposal did not pass the first reading, in favour of an amended second bill for parliament to consider. Whilst this second bill was, at the time of undertaking this study, in the early stages of proposal, it would be remiss of this literature review to guess at the impact of the final draft; however, it is important to note that the law changes continuously, adding further complexity for organisations seeking to keep their personal data management up to date. At the time of writing, this bill has reached the committee stage, so it looks likely to progress through to the final stages and enactment [
34].
In addition to the aforementioned regulations, the Fundraising Regulator (FR) works within the confines of the law and, as part of their remit, provides standards to which charities are obliged to adhere [
35]. Upon reviewing this, while we acknowledge this is an excellent resource and a code of conduct for charities, it does not provide anything to assist in developing policy or procedures for privacy.
2.5. Existing Frameworks
Several frameworks were identified that could provide guidance for best practice in managing privacy and cybersecurity. While some may consider these frameworks cumbersome, they do provide a useful reference point and checklist which can be used by charities to ensure their procedures or processes are appropriate and well founded, and therefore, these frameworks were reviewed as part of this study.
2.5.1. National Institute of Standards and Technology (NIST)
The National Institute of Standards and Technology (NIST) was founded in 1901 and is part of the United States (US) Department of Commerce. One of the core competencies of NIST is the development and use of standards [
36], including the NIST Privacy Framework that proposes five functions devised to complement the NIST Cybersecurity Framework as set out in
Table 5 [
37].
In the US, privacy laws differ between states, with only a few universal across the country, and those privacy regulations have tended to focus on particular industries, such as the Health Insurance Portability and Accountability Act (HIPAA). In Europe, by contrast, privacy is dealt with through a single regulation, the GDPR, regardless of industry sector [
38]. As a result, the NIST privacy framework has been devised to address a large variety of use cases, rather than any unified set of laws, which is one of the advantages of NIST, as it provides appropriate guidance that can be easily adapted for our purposes.
2.5.2. International Organisation for Standardisation (ISO)
The International Organisation for Standardisation (ISO) provides ‘global standards’ for all manner of subjects that are agreed upon internationally by experts in each field [
39], including privacy and security. For privacy, ISO/IEC 27701:2019 provides a privacy extension to ISO/IEC 27001, the ISO security management standard developed for the international community. ISO 27701 incorporates many of the security controls from the main standard into the privacy framework to help organisations improve their privacy information management [
12].
ISO 27701 formalises the process of a Privacy Information Management System (PIMS), as well as assisting with the auditing of the processes and management reviews [
40]. As part of this standard, ISO recommends controls to protect and manage personal data. The regulations themselves (GDPR and DPA) deliberately do not specify controls; rather, they leave the details of how to protect personal data for the organisation to define [
41].
2.5.3. Cyber Essentials
The National Cyber Security Centre (NCSC) introduced Cyber Essentials, a UK government scheme devised to assist organisations in defending themselves from common cyber-attacks [
42]. There are two Cyber Essentials programmes as part of this scheme, Cyber Essentials and Cyber Essentials Plus, both of which reference technical controls to maintain a secure network [
43]. The intention for this study is to use this framework to support and cross-reference the process documentation.
2.6. Data Taxonomy, Classification, and Access Control Frameworks
In managing data, it is useful to classify the data so that only authorised staff and/or approved applications can view or access them, based on appropriate user permissions, thereby allowing the organisation to adopt either Role-Based Access Control (RBAC), i.e., access based on user role within the business, or Mandatory Access Control (MAC), i.e., access based on the policy for objects [
44]. These access control frameworks help in understanding, planning and managing user roles for both structured and unstructured data.
However, it is important to keep data classification relatively simple and the naming conventions obvious, to make it easier for the user. For instance, Public > Personal > Personal Sensitive > Confidential will likely meet most needs, especially given that the UK government itself uses only three levels [
45].
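As an illustration of how such a classification hierarchy pairs with Role-Based Access Control, the sketch below orders the four suggested levels and grants each role a maximum clearance. The role names and clearance assignments are hypothetical examples, not prescribed by any framework:

```python
# Minimal RBAC sketch over an ordered classification hierarchy.
# Roles and their clearances are hypothetical examples.
from enum import IntEnum

class Level(IntEnum):
    PUBLIC = 0
    PERSONAL = 1
    PERSONAL_SENSITIVE = 2
    CONFIDENTIAL = 3

ROLE_CLEARANCE = {
    "volunteer": Level.PUBLIC,
    "fundraiser": Level.PERSONAL,
    "caseworker": Level.PERSONAL_SENSITIVE,
    "trustee": Level.CONFIDENTIAL,
}

def can_access(role: str, document_level: Level) -> bool:
    """A role may read any document at or below its clearance level.
    Unknown roles default to the lowest (Public) clearance."""
    return ROLE_CLEARANCE.get(role, Level.PUBLIC) >= document_level

print(can_access("caseworker", Level.PERSONAL))     # True
print(can_access("volunteer", Level.CONFIDENTIAL))  # False
```

Defaulting unknown roles to the lowest clearance reflects the fail-safe principle: access is denied unless explicitly granted.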
The challenge of classifying data lies in ensuring that users classify documents in accordance with the agreed conventions. In the case of charity organisations, using pre-classified templates and delivering proper training to new employees or volunteers upon joining the charity will help ensure they fully understand how to classify documentation correctly [
46]. Metadata can also assist with data classification; the author’s name, tags, last saved date, etc., can all potentially assist in searching for documentation, as well as in reviewing data retention considerations.
6. Evaluation
To evaluate usability, a meeting was set up with the client to demonstrate Privacy Essentials! (PE). This meeting took place via a 2.5 h video conference call, with feedback recorded as it was received. The other charities were sent copies of Privacy Essentials! and asked to provide feedback via a questionnaire, as well as providing any ad hoc critique (
Section 6.1).
The final evaluation was conducted with a qualified DPO for a large charitable organisation to affirm that the functionality meets the intended markets’ requirements. This DPO is also a Trustee for a charity with 140 locations across the UK, and therefore has an additional interest in the capabilities of Privacy Essentials!.
6.1. System Usability Survey
The System Usability Scale (SUS) [
48] provides a useful insight into the usability of an application. It contains ten questions, with users responding on a 5-point Likert scale (ranging from Strongly disagree to Strongly agree).
This questionnaire was adapted for PE, and evaluators were asked to complete the ten questions (
Figure 16), via an online survey (hosted by Jisc Online Surveys [
60]); eight responses were received.
We plotted the individual scores on a spider graph, with the ‘ideal score’ shown in orange and the results from the PE SUS questionnaire shown in blue (
Figure 17).
From this, the results suggest that users found PE intuitive to use. According to Brooke [
48], a score of 68% is an average score, and anything above this is deemed acceptable. As regards the number of respondents, the optimal number is thought to be between 8 and 12 [
61]. Thus, we had eight evaluators completing the survey, who gave PE an average score of 82.2%; when compared against the SUS Adjective table shown in
Figure 18, PE achieves a rating of “Good”, approaching “Excellent”.
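For reference, the SUS percentage reported above follows Brooke’s standard scoring: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 to yield a score from 0 to 100. The sketch below implements this; the sample responses are hypothetical, not the actual evaluator data:

```python
# Standard SUS scoring (Brooke): ten 1-5 Likert responses mapped to 0-100.
def sus_score(responses: list) -> float:
    """Compute a SUS score from ten 1-5 Likert responses."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical respondents:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0 (best possible)
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

A study-level score, such as the 82.2% reported here, is the mean of the individual respondents’ scores.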
6.2. Evaluator Feedback and Improvements
One area of potential concern was whether the needs of the other two evaluated charities could be assumed to be similar to those of the primary client. This concern was alleviated when evaluators clarified that the identities and workflows in PE were equally appropriate to their operations. One point raised by Participant 5 (P5) was that the use of the word “Beneficiary” within PE could cause confusion if PE were rolled out to other industries or sectors. Charities use the term “Beneficiary” to indicate the recipient of the charity’s services, which could be misconstrued in other sectors. This term could, however, be changed to “Client” to be more meaningful across other sectors.
Evaluators were also asked if there was any additional feedback for Privacy Essentials!; the responses, suggestions and changes applied in response to the feedback are depicted in
Table 12.
The data risk register (DRR) required further information from the Workflow page to enhance Article 30 documentation as previously shown in
Table 6 in
Section 2.2. Thus, we updated the questions for each identity as depicted in
Figure 19 and
Figure 20.
Looking at
Figure 19, Ref. W.2.b was changed to be more meaningful, accounting for the relevance of the location of the stored personal data, with the response choices now showing as “Within the UK/EU”, “Outside the UK/EU”, or “Not Sure”. Similarly, for Ref. W.2.e, the responses, while similar, concern the location of the application processing the data as opposed to the location of the data. Finally, W.2.f asks for the strongest security control; although it could be argued that this question would be better served by a combo box, because PE is macro free, fields are limited to one response, and therefore this was left unchanged.
A caution was incorporated within the instructions to explain that the DRR can be manually enhanced to allow for scenarios where multiple applications are used to manage employee data. A feature that will permit more than one application to be recorded and automatically update the DRR may be incorporated in future iterations of PE.
6.3. UX Honeycomb Survey
After the amendments outlined in
Table 12 had been implemented, a final evaluation of Privacy Essentials! v2.0 took place with the DPO introduced in
Section 6 and the Client’s CFO. For this, we used the UX Honeycomb survey [
62] to obtain a qualitative assessment of Privacy Essentials! against seven variables as depicted in
Figure 21.
The UX Honeycomb survey was used to determine the appropriateness of the outcomes from PE v.2.0 using the seven questions listed in
Table 13.
The responses received from the evaluators are shown in
Table 14.
Thus, the results from the Honeycomb evaluation survey and our subsequent conversations with the evaluators were highly encouraging, and the comment of the DPO, “I have had access to similar commercial off-the-shelf examples before, this is able to compete, if not beat those in terms of competition”, would indicate that PE more than meets the expected outcomes.
6.4. Privacy Essentials! v2.0 Improvements
Following feedback received from the v.2.0 reviews (
Section 6.3), some minor improvements will need to be considered. For example, workflows could benefit from additional processes, such as drop-down menus that gather information about data locations, the likely lawful basis for processing, data retention schedules, and security treatments. Adding these would introduce further complexity to the framework but, in turn, would allow the DRR to capture richer data. However, keeping PE relatively simple in the first instance will build confidence for the novice user, and therefore there is a fine balance to be struck.
Another enhancement will be Subject Access Requests (SARs), where data will be added to the DRR following the logic that, if either the SAR policy or procedure is selected by the user, the DRR will automatically add the relevant fields. Similarly, if a Data Transfer is required, the DRR will highlight those requirements in a similar fashion to the way DPIA obligations are shown in
Figure 22.
7. Conclusions
This paper has presented the Privacy Essentials! framework, a data protection assessment tool that will allow charitable organisations to begin creating and implementing an effective data privacy programme.
Identifying a gap in the market, our research found that charities, particularly smaller ones, tend to struggle more than other sectors to be privacy and cyber savvy in their processes and practices (
Section 1 and
Section 2). Addressing this gap is one of our main contributions.
We achieved this through action research via working in close collaboration with three charities and two data privacy experts. This research has resulted in the main contribution of this paper, Privacy Essentials!, a step-by-step framework that charities can use to assess their privacy posture, and identify the steps they need to implement to establish a comprehensive data privacy programme.
Privacy Essentials! leverages existing privacy standards and guidance, such as the NIST Privacy Framework (
Section 2.5.1), ISO 27701 (
Section 2.5.2) and Cyber Essentials (
Section 2.5.3) and the legal obligations brought by the UK GDPR (
Section 2.2), incorporating these to ensure charities cover all considerations around how best to manage privacy and handle data appropriately within the organisation. We coupled the insights gained from analysing these documents with primary data collected from the charities themselves through interviews and observation (
Section 4) to create a series of Personas (
Section 4.6.2) and requirements (
Section 4.7).
From this, the logic and flow of Privacy Essentials was designed (
Section 5), before using MS Excel to programme and create the actual framework itself (
Section 5.4). Once we were satisfied that Privacy Essentials! worked as expected (
Section 5.5), we went back to our collaborators and had them evaluate both v1.0 and v2.0, and provide feedback on their findings (
Section 6). To the best of our knowledge, Privacy Essentials! presents a valuable opportunity for charities to achieve privacy compliance.
Discussion and Future Work
Privacy Essentials! demonstrates how, with a little guidance and direction, an organisation can enhance its privacy posture. By implementing the Privacy Essentials! recommendations and building out the required policies, procedures, and processes, organisations can also build a solid foundation for their security programme.
The resulting privacy framework has been designed to guide practitioners through how to become privacy compliant, through a step-by-step decision tree framework that will output a pack of required documentation needed to satisfy UK GDPR compliance. One of the contributions of this paper is to make Privacy Essentials! freely available to the charity sector, and, should its adoption become possible within either the ICO or the Charity Commission, then their reach would allow the framework to be used by any of the 169,000 charities in the UK. For this aim, Privacy Essentials! can be accessed via the link provided here:
https://eprints.bournemouth.ac.uk/39523/ (accessed on 1 January 2020).
Privacy Essentials! attempts to consider all personal data types and be as comprehensive as possible. One limitation is that certain unique data types (such as driving licence or passport number) have been excluded; however, these can be accommodated within the framework under the heading “other”. Similarly, Privacy Essentials! is limited to IT-related matters, meaning the management of personal data on other media, such as social media, paper copies, video and audio, is not facilitated in this version of the framework. These may, however, be added in a future iteration.
A second limitation is that Privacy Essentials! limits its process outcomes to Microsoft O365 licences, due to time constraints around the study; thus, the Google Suite was deliberately omitted. However, the processes described within the framework will be equally applicable to Google, and the intention is that Privacy Essentials! will be updated to accommodate the Google Suite as part of future development.
Future work will seek to leverage the framework to be applicable to more industries and sectors. The disciplines required when protecting and managing personal data are readily transferable to other data assets, such as intellectual property, confidential information or any manner of sensitive documentation. We will also explore the opportunity offered by the DPO introduced in
Section 6, who offered to trial and further evaluate future iterations of Privacy Essentials! within the charity of which he is a Trustee. This would give Privacy Essentials! the opportunity to reach a wider audience to evaluate the framework and gather invaluable feedback. To this end, further discussions are in progress at the time of writing.