
Public Interest, Health Research and Data Protection Law: Establishing a Legitimate Trade-Off between Individual Control and Research Access to Health Data

Melbourne Law School, University of Melbourne, Parkville VIC 3010, Australia
* Author to whom correspondence should be addressed.
Submission received: 13 January 2020 / Revised: 7 February 2020 / Accepted: 8 February 2020 / Published: 14 February 2020
(This article belongs to the Section Health Law Issues)

Abstract:
The United Kingdom’s Data Protection Act 2018 introduces a new public interest test applicable to the research processing of personal health data. The need for interpretation and application of this new safeguard creates a further opportunity to craft a health data governance landscape deserving of public trust and confidence. At the minimum, to constitute a positive contribution, the new test must be capable of distinguishing instances of health research that are in the public interest from those that are not, in a meaningful, predictable and reproducible manner. In this article, we derive from the literature on theories of public interest a concept of public interest capable of supporting such a test. Its application can defend the position under data protection law that allows a legal route through to processing personal health data for research purposes that does not require individual consent. However, its adoption would also entail that the public interest test in the 2018 Act could only be met if all practicable steps are taken to maximise preservation of individual control over the use of personal health data for research purposes. This would require that consent is sought where practicable and that objection is respected in almost all circumstances. Importantly, we suggest that an advantage of relying upon this concept of the public interest, to ground the test introduced by the 2018 Act, is that it may work to promote the social legitimacy of data protection legislation and the research processing that it authorises without individual consent (and occasionally in the face of explicit objection).

1. Introduction

The United Kingdom’s (UK) data protection law requires that research processing of health data is in the “public interest”. There are no explanatory notes and little guidance to explain what this means.1 However, if research processing passes that test, and meets the public interest threshold, then there is no data protection law requirement for a data subject to consent to the use of his or her personal data for research purposes. That is not all. Ordinarily, if consent is not the legal basis for processing, then data subjects have the right to object to processing. As well as providing an alternative to consent as a lawful basis for processing, the public interest also qualifies that right to object where processing is necessary for research purposes. Thus, under UK (and European Union) data protection law, the public interest permits researchers to operate with relatively little deference to individual preferences: research use of health data is permissible without individual consent (opt in) or the opportunity to object (opt out). There is a risk that diluting individual control may allow uses of data that are unacceptable to individual data subjects. This could foment discontent, undermine confidence in effective governance, and discredit health research more generally.2
In recent years, health data sharing has rarely been long out of the press. Although the purposes of sharing have not always been stated to be research purposes, controversy surrounded flows of GP data to the Health and Social Care Information Centre3 under the care.data scheme (Presser et al. 2015; Taylor 2014) and the original data sharing agreement between Google Deepmind and the Royal Free London hospital (Powles and Hudson 2017). These events in the UK have cast a long shadow. More recently, research was explicitly the aim of the flow of health data out of the UK to the USA, which generated fresh newspaper headlines (Helm 2019). Few would likely dispute the value of a robust governance framework capable of both promoting the public interest in access to health data for appropriate research purposes and protecting and promoting public trust and confidence in a confidential health care service.4 However, the capacity of the law to strike a balance between competing interests that is capable of promoting and protecting public support cannot be assumed. Legal compliance alone will not necessarily ensure social legitimacy (Carter et al. 2015).
The introduction into the Data Protection Act 2018 of a public interest test, applicable to the research processing of health data, creates an opportunity to further contribute toward crafting a framework that will support only appropriate trade-offs between competing interests. Much will depend on how this new public interest test is interpreted and applied. If the safeguard introduced by the 2018 Act is to have any meaning, then it must be possible for research proposals to fail the test. However, without a workable concept of the public interest it is impossible to consistently apply a public interest test capable of distinguishing between instances of health research that are in the public interest from those that are not. It is also impossible to explain why the public interest test might be satisfied in one part of data protection legislation, e.g., in relation to the need for a lawful basis, but not in another, e.g., in relation to setting aside a right to object under the European Union (EU) General Data Protection Regulation (GDPR).5 The question of the substantive content of the public interest test in UK data protection law is important not only for those conducting research subject to its provisions. It can inform a public interest critique of UK and EU data protection law’s protection of health research and individual control of the use and disclosure of health data. To what extent does the introduction of the ‘public interest’ test into UK law supplement or shift the protection of, and trade-off between, health research and privacy protection in data protection law? In order to answer this question, we require a fuller concept of public interest than is apparent on the face of UK data protection legislation. As will be seen, our central contention is that the public interest requires acceptable reasons for any diminution of individual control over the use and disclosure of personal health data. This proposition provides a platform for a public interest critique of the legal position, under both UK and EU law, that data protection legislation allows health researchers to process sensitive personal data with relatively little deference to individual preferences.
In this paper, we construct, explain, and then apply, a particular theoretical concept of public interest to the question of the appropriate trade-off between individual patient/consumer control and research access to personal health data.6 While grounded in public interest literature, the concept we offer is capable of defending a legal route through to processing personal health data for research purposes that does not require individual consent. As we shall see though, the defence offered is more qualified than an initial reading of data protection legislation might suggest. The concept we prefer holds it to be in the public interest to proceed without individual consent only where there are acceptable reasons for doing so. In this particular case, for example, it would suggest the public interest test in the 2018 Act could not be satisfied if it were practicable to conduct the research and also clear the high GDPR threshold of valid consent. This is a novel interpretation, not put forward in the existing literature or professional guidance we explore. However, the advantages of unpacking the concept of public interest in the way we describe are, we suggest, at least three-fold.
First, the concept of public interest we construct provides a workable test and gives theoretical depth and substance to an otherwise notoriously abstract idea.7 In short form the test is relatively simple: we propose that the public interest requires any (unavoidable) trade-off between “common interests” (i.e., those universally held by members of a community) such as the interests in privacy and the benefits of health research, to be justified in terms that those affected have reason to accept and which respect individual objection in almost all circumstances. If it is required that a justification for any trade-off is articulated in ways that meet this test in the circumstances, then we have a practicable but meaningful test capable of discerning whether research processing is “in the public interest”.8
The second advantage attached to unpacking the concept of public interest in the way we describe is that the test can be applied in different parts of data protection legislation, to yield different answers, without inconsistency. For example, it may yield different answers when assessing if it is in the public interest to process personal health data for research purposes without consent, as compared to when assessing if it is in the public interest to continue to process in the face of an objection. It can do so, however, in predictable and repeatable ways. It is sufficiently nuanced to respond to context but sufficiently resilient to provide a consistent yardstick against which people may evaluate their own and others’ actions.
Perhaps most significantly, we suggest that the third reason our preferred concept of public interest is compelling lies in its promotion of the social legitimacy of data protection legislation and, by association, the processing that takes place in accordance with its requirements.9 More particularly, our argument formulates the public interest safeguard contained in the UK Data Protection Act 2018 in a way that promotes the social legitimacy of health research without patient consent. In light of the kind of contemporary controversy referred to above, this should not be underestimated. Our argument demonstrates how a robustly constructed and applied concept of the public interest may point the way toward a governance framework capable of supporting and protecting public confidence that personal health data is only being used in ways that people have reason to accept and which will normally respect their expressed preferences (with objection only overridden where this is also justifiable as acceptable).
The generalisability of our approach to constructing a public interest test means that our argument has implications beyond data protection within the UK (including formulations of public interest used by bodies such as the Public Benefit and Privacy Panel in Scotland;10 the Confidentiality Advisory Group in England and Wales;11 and the Privacy Advisory Committee in Northern Ireland12). It has relevance outside of the UK where public interest is a relevant feature of regulatory frameworks governing research access to personal health data, e.g., the Irish Health Research Regulations 2018, the South African Protection of Personal Information Act 2013 (“POPI”), and the Australian My Health Records Act 2012 (Cth), under which there may only be secondary use of health data where that is “in the public interest”. What is more, the argument has implications outside of health research, including in relation to Freedom of Information and National Security, where what might be considered competing public interests (e.g., confidential health service; health improvement; national security) are traded off against both one another and against private interests (including in privacy protection) commonly held by individuals in personal data. Indeed, the argument we offer may have relevance any time there is a need to trade off common interests in the name of ‘the public interest’. We choose as our particular object of concern, however, the public interest safeguard introduced into UK data protection legislation by the Data Protection Act 2018 and consider the implications of adopting our concept of the public interest in its interpretation and application.
The paper is structured in the following way. Part I describes the relevant data protection legislation. It explains the “public interest” safeguard introduced by the UK Data Protection Act 2018 and puts that safeguard in the context of other references to public interest in UK and EU data protection legislation, with particular attention paid to the right to object. Part II considers the literature on theories of public interest and sets out and explains our particular conception of public interest. This conception of public interest is applied to demonstrate that, though a default route through to processing personal health data without consent can be defended, it is only in the public interest to trade off common interests in ways that can be justified as acceptable to those affected. This requires preservation of individual privacy, and control over whether personal health data is used for research purposes, to the maximum extent possible. In Part III we consider more closely the claim that our concept of public interest may work to promote the social legitimacy of data protection legislation and the research processing that it authorises without individual consent (and even in the face of explicit objection). Throughout the paper, we seek to offer a coherent concept of public interest, capable of making sense of an otherwise vague and potentially confusing regulatory test, consistent with the promotion and protection of public confidence in appropriate processing of personal data for health research purposes. It is a concept that can be used not only to interpret and unpack the public interest test introduced into UK data protection law but also to critique the trade-off, between privacy protection and health research, that UK and EU data protection legislation otherwise represents.

2. UK Data Protection Legislation

The EU General Data Protection Regulation (GDPR) (2016) repealed and replaced the European Data Protection Directive (95/46/EC). It came into force on 25 May 2018 and was intended not only to update European data protection law but also, as a Regulation (rather than a Directive), to achieve higher levels of harmonisation across Europe. There remain areas where member states can either derogate from the Regulation or specify how it is to apply within the domestic context. Indeed, with regard to the level of variation permitted across Europe,
research occupies a privileged position within the GDPR. This flexibility afforded to Member States …, means that the full extent of this special regime is not precisely delineated.
In the UK, any processing13 of personal data for research purposes, carried out in the context of an establishment of a controller or processor in the UK,14 must comply with data protection legislation,15 including the Data Protection Act 2018 and the GDPR as applied in the UK context.
The term “personal data” is defined very broadly by data protection legislation to include any information relating to an identified or identifiable person.16 Those subject to the requirements of data protection legislation must process personal data in compliance with a set of data protection principles which relate to “lawfulness, fairness, and transparency”, “purpose limitation”, “data minimisation”, “accuracy”, “storage limitation”, “integrity and confidentiality”, and “accountability”. The lawfulness of processing is determined, in part, by Article 6 of the GDPR.
It is necessary (but not sufficient) for lawful processing to meet one of the conditions set out in Article 6(1) GDPR. The conditions most likely to be appropriate to processing for research purposes are (i) processing is with the data subject’s consent (Article 6(1)(a)), (ii) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (Article 6(1)(e)), or (iii) processing is necessary for the purposes of a data controller’s legitimate interests (Article 6(1)(f)). Only one condition needs to be satisfied. A data subject’s consent is not required if an alternative ground is available. Controllers should select the most appropriate ground available for the processing intended.
If research17 is carried out by a public body, such as a UK university,18 then legitimate interests is not available as a legal basis for processing (Article 6(1)). Guidance from the UK regulator, the Information Commissioner’s Office (ICO), recognises that Article 6(1)(e) is likely to apply to processing in universities for research purposes: “the public task is likely to apply to much of their processing, depending on the detail of their constitutions and legal powers” (Information Commissioner’s Office n.d.). A guide to GDPR compliance for researchers in UK universities, issued by UK Research and Innovation (UKRI),19 similarly recognises:
The most likely lawful basis for research in UKRI Institutes and in universities (as public authorities) is ‘task in the public interest’.
It is not clear from guidance whether UKRI institutes and universities are advised to apply the first or second limb of Article 6(1)(e). That is to say, it is not clear whether they should rely upon the necessity of processing for research “in the public interest” or in the “exercise of official authority”. In any case, where personal health data is processed for research purposes, an additional and specific public interest requirement is introduced through domestic UK law. It is this requirement to which we now turn.

2.1. Special Category Data: Default Prohibition

Special categories of data qualify for additional protections under data protection law. Special categories are defined, through Article 9 GDPR, to include genetic data and data concerning health. Processing of special category data is prohibited unless one of a number of exceptions applies. The exception most relevant to research is Article 9(2)(j): the “research exception”. This provides an exception, and permits processing, where it is necessary for research purposes. The important point here is that the research exception is only available where processing is in accordance with Article 89(1) and is based on Union or member state law. The relevant UK member state law, the Data Protection Act 2018, establishes specific conditions for processing in this case. It is here that the 2018 Act introduces a new public interest requirement.

2.2. Public Interest Requirement

The Data Protection Act 2018 establishes that processing will only meet the requirement under the research exception for a basis in UK law if the processing:
(a) is necessary for archiving purposes, scientific or historical research purposes or statistical purposes,
(b) is carried out in accordance with Article 89(1) of the GDPR (as supplemented by section 19), and
(c) is in the public interest (emphasis added) (Section 10(2); Schedule 1, Part 1, s 4).
This requirement will apply to any research processing of health data reliant upon the research exception, irrespective of the lawful basis for processing (under Article 6). It will apply whether the research is by a public or private body and it will even apply to organisations outside of the EU if research involves monitoring behaviour in the UK. Indeed, following Brexit, UK law20 amends the GDPR applied in the UK (the “UK GDPR”) to bring into scope controllers and processors who are in the EU and would not previously have been subject to the Data Protection Act 2018 (Maynard 2019). For all of these reasons, it is important that we have a clear understanding of what this public interest requirement introduced by the Data Protection Act 2018 demands. Such clarity is also a prerequisite for understanding how this public interest test relates to others peppered throughout data protection law, including in relation to the qualification on the “right to object”, and the extent of any adjustment being made to the position under the GDPR.21 Before moving on to consider such things, we must briefly pause to consider why a researcher might not choose to circumvent the public interest test introduced by the Data Protection Act 2018 by relying upon a research participant’s consent to processing. This consideration goes not only to the relevance of our argument overall but also to specific claims that we make later in the paper. This is because the availability of reasons for not seeking consent will, we suggest later, be relevant to its acceptability from the perspective of a data subject.

2.3. Why Not Consent?

The requirement that processing be “in the public interest”, established by the Data Protection Act 2018 as a requirement of domestic law relevant to Article 89(1), is only engaged where Article 9(2)(j)—the “research exception”—is relied upon (to rebut the general prohibition against the processing of special category data). As our argument concerns how this public interest test is to be interpreted, and the implications of the test’s introduction, we must consider whether the test might easily be avoided by relying upon an alternative exception. The first alternative exception under Article 9 is that “the data subject has given explicit consent to the processing” (Article 9(2)(a)). Why would a controller not seek a data subject’s explicit consent to the processing and, thereby, avoid the public interest test requirement contained in the 2018 Act?
Here, it is important to remember that “consent” is both an available lawful basis for processing (under Article 6) and “explicit consent” an available exception to the prohibition on processing special category data (under Article 9). If there are good reasons for not relying upon consent to provide a lawful basis, then explicit consent is unlikely to be the most appropriate exception to the prohibition. This may be revealed by considering just two reasons for which consent may not be considered the most appropriate lawful basis. The first relates to the consequences of a withdrawal of consent. The European Data Protection Supervisor notes that:
If consent is the lawful ground for processing, the data subject must be able to withdraw that consent at any time; there is no exception to this requirement for scientific research. As a general rule, if consent is withdrawn, the controller is required to stop the processing actions concerned and, unless there is another lawful basis for the retention of those data for further processing, the data should be deleted by the controller.
Guidance from the Health Research Authority in the UK encourages data controllers intending to process personal data for research purposes to avoid this consequence of withdrawal where possible. The guidance states that:
If consent is the legal basis for processing the data involved and the data subject withdraws this consent, the data will need to be erased even if this is likely to render impossible or seriously impair the achievement of the objectives of that processing. This underlines the importance of ensuring that, when specifying the legal basis upon which data is processed, consent is only used where there are no more suitable alternatives. (emphasis added)
If explicit consent is relied upon to provide an exception to the processing of special category (e.g., health) data, then a withdrawal of consent would have the same consequence: the data could no longer be used even if this rendered impossible or seriously impaired the research purpose.
The second reason a data controller might rely upon an alternative to consent, as either the lawful basis for processing or to establish the necessary exception, relates to the difficulty of satisfying the threshold requirements of a valid consent under the GDPR. Consent is defined by the GDPR to mean:
any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.
(Article 4(11) GDPR)
Satisfying each of the adjectives attached to a valid consent—“freely given”, “specific”, “informed” and “unambiguous”—can be challenging for a researcher seeking to use personal health data (Prictor et al. 2019). In particular, the quality of “free” consent can be difficult to establish in the context of research, e.g., a clinical trial, where it may be questioned whether a data subject really has a “free choice” due to health needs and there is a clear power imbalance between the data subject and controller (European Data Protection Board 2019; European Data Protection Supervisor 2020). Also, the ability to provide a sufficiently “specific” consent where future research uses are unclear, and not directly related to the primary reasons for collection, has been the subject of much academic discussion.22 These uncertainties must be successfully navigated in order for a researcher to be confident of a valid consent according to the high threshold requirements of the GDPR. There are also challenges that any secondary user of data will face if health data is originally collected either by a health care professional providing a health care service or through previous research (Tassé et al. 2010). Not only can it be impossible for a secondary user to effectively control, or evidence, the actions of those who collect data for the primary purpose, but data may have been collected historically without the secondary use in mind and outside the contemplation of any original consent. Obtaining consent subsequently (particularly to secondary use) can be difficult for a host of reasons. These include but are not limited to the accuracy and availability of current contact details, the willingness of those with lawful access to data to contact participants on behalf of researchers, and the relatively low response rates associated with such approaches (Nelson et al. 2001).23 For any of these reasons, and more, a researcher may be well advised to rely upon an alternative to consent where one is available. It may be genuinely impracticable to complete the health research if consent is an absolute requirement. This is one of the reasons that the research exception exists.
If a researcher is relying upon the “research exception”, thus engaging the public interest test introduced by the Data Protection Act 2018, this then raises the question of how the “public interest” test introduced by the 2018 Act relates to other “public interest” tests in data protection legislation. We consider, in particular, the relationship between the test contained within domestic legislation and the public interest qualification on the right to object contained in the GDPR.

2.4. Right to Object

One of the merits of the concept of public interest that we propose is that it not only offers a practicable interpretation of an otherwise indeterminate concept, but that it can help explain how a consistent concept of public interest may be employed across data protection law with (predictable) differences in outcome. This is important if, for example, the right to object is not to be automatically disapplied whenever the public interest is the legal basis for processing (under Article 6(1)) or the research exception is relied upon (under Article 9(2)(j)) and processing satisfies the public interest safeguard introduced by the 2018 Act.
The GDPR establishes, through Article 21(1), that a data subject shall have the right to object,
on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) [public interest or official authority] or (f) [legitimate interests] of Article 6(1).24
However, Article 21(6) qualifies this right in the following way:
Where personal data are processed for scientific or historical research purposes pursuant to Article 89(1), the data subject, on grounds relating to his or her particular situation, shall have the right to object to processing of personal data concerning him or her, unless the processing is necessary for the performance of a task carried out for reasons of public interest.
At first reading, this might appear straightforward. As a general rule, data subjects have the right to object to processing based on point (e) of Article 6(1) [public interest or official authority] but the right is restricted in the case of research processing necessary in the public interest. However, it is significant that Article 21(6) includes the text ‘necessary for the performance of a task carried out for reasons of public interest’ rather than simply referencing point (e) of Article 6(1). Why is it significant? First, because it distinguishes between the two limbs of Article 6(1)(e): the right to object is only restricted where processing is necessary in the public interest (rather than the exercise of official authority). Second, it allows the public interest test to function independently in the different parts: it need not be sufficient for processing to be based on Article 6(1)(e) to disapply the right to object. This opens the door to the necessity of processing in the public interest being assessed in light of the objection. The concept of public interest we prefer supports, and indeed requires, this possibility. We will hold that it is not in the public interest to assume that satisfying a public interest test in one part of data protection law necessarily entails that it is cleared in all parts. Instead, the public interest requires a contextual assessment of the acceptability of the reasons for trading off interests engaged in the circumstances. There is a qualitative difference in the degree of interference with relevant interests (notably the interest in privacy) between a circumstance where the views of the data subject are unknown (and processing has taken place without consent) and a circumstance where it is known that they object. Of course, defence (or refutation) of this view requires a fuller understanding of the concept of public interest than can be read off the face of the 2018 Act.

3. The Public Interest

A relatively high-level view of the public interest might see it as the “pursuit of objectives valued by the community.” (Selznick cited by Feintuck 2004). The fact that such a view offers little more than a silhouette of the idea has its advantages but, while revealing little that is contentious, it also prompts many questions: Which community must value the objectives? Who determines which objectives are valued by that community? What happens when different objectives come into conflict? There is a vacuum that must be filled. The concept of public interest that we propose to fill that vacuum is rooted in an extensive literature.

3.1. Theories of Public Interest

When McHarg reviewed different approaches toward unpacking the term “public interest” she described how different theories might be associated with three distinct groupings: “Preponderance (or aggregative)”, “Unitary”, and “Common interest” theories (McHarg 1999).25 When Sorauf earlier composed a similar list he combined common interests with values and gave the grouping the title “Commonly-Held Value” (Sorauf 1957, p. 624). We add this as a separate grouping and will suggest that a useful distinction can be drawn between values and interests. To better describe how the concept of public interest we prefer represents a hybrid approach, we briefly describe these different groupings.

3.1.1. Common Interest Theories

Common interest theories employ various concepts of interest “which all members of the public have in common, hence comprising a category of interests distinct from those of particular individuals or groups.” (McHarg 1999, p. 676).26 The basic idea is that something is “in the public interest if it serves the ends of the whole public rather than those of some sector of the public.” (Meyerson and Banfield, cited by Sorauf 1957). Despite the intuitive appeal, further consideration reveals a couple of difficulties with this idea. First, it is not self-evident within a pluralistic society which interests truly are common to all or even to most persons. Second, even if particular interests could be said to be common, this idea of the public interest does not explain how they should be traded off in cases of conflict. It does not explain or support any normative judgement seeking to resolve a conflict between “the public interest”, “individual rights”, or any other public or private interest or value (McHarg 1999, p. 679).27

3.1.2. Preponderance Theories

According to a preponderance theory of public interest
[t]he public interest has no independent content, but is discovered simply by aggregating individual interests; that which is in the interest of a preponderance of individuals is also in the public interest.
This position adopts a utilitarian perspective on the determination of value. Preponderance theories are thus distinguishable from common interest theories as it is not necessary for the relevant interests to be common to all persons. They need only to be sufficiently prevalent for their aggregation to indicate a path toward utility maximisation. Such a concept could not be adopted and applied within the context of data protection law as the values at play are incompatible with a rights-based, liberal democracy. In the case of Young, James and Webster the European Court of Human Rights (ECtHR) said that
Pluralism, tolerance and broadmindedness are hallmarks of a “democratic society”. Although individual interests must on occasion be subordinated to those of a group, democracy does not simply mean that the view of the majority must always prevail: a balance must be achieved which ensures the fair and proper treatment of minorities and avoids any abuse of a dominant position.
(Young, James and Webster v United Kingdom (1981) 4 EHRR 38, para. 63)
If we seek a concept of public interest capable of providing a normative explanation for trade-offs that is compatible with the values inherent in the European Convention on Human Rights and UK law,28 then we must look elsewhere.

3.1.3. Unitary Theories

Unitary theories treat the public interest as an overriding interest which transcends and reconciles apparently conflicting individual or sectional interests (McHarg 1999, p. 675). If we could identify a single overriding interest capable of determining what the ECtHR described as a “fair and proper” balance between individual and group interests, then we might describe an operative unitary theory of the public interest. However, we are aware of no consensus regarding such an overriding interest. Indeed, it is the lack of consensus that might be understood to underlie much of the uncertainty associated with the concept of “public interest” and it is the search for such content that motivates our argument here.29 To maximise the possibility of agreement in practice, in a potentially controversial area, we will suggest an approach that employs transcendent reason—and the impartiality it promises—alongside other admittedly contingent but widely shared premises.30 We will suggest that it is through identifying widely shared common interests, and what may be logically entailed by commitment to them and a commonly held value, that we may construct a reasoned position on a robust concept of public interest. Robustness, we will indicate, is related not only to theoretical coherence but also to the capacity of the concept of public interest to engender and protect the social legitimacy of the use of personal data for health research purposes without individual consent.

3.1.4. Common Value Theories and Public Reason

Mike Feintuck is keen to ensure that the concept of the public interest is understood in terms that will establish and reflect “objectives valued by the community” (Feintuck 2004, p. 201) as a whole. In the context of the UK, this is to maintain the “legitimacy of the polity via ensuring the delivery of democratic expectations” including “the democratic expectation of equality of citizenship” (Feintuck 2004, p. 211). John Bell considers that the “public interest emerges as a set of fundamental values in society” (Bell 1993 cited in Feintuck 2004, p. 186). Such values, according to Bell, “characterise the basic structure of society” (Bell 1993 cited in Feintuck 2004, p. 34). If there is a core common value to characterise (liberal) democratic society, then we suggest it is tied to equality of citizenship. We posit acceptance of the idea that we are each to be valued as free and equal members of society.31 If accepted, then this has implications for the nature of a public interest argument. For those willing to accept a commitment to the principle of equality, or at least to concede its significance to liberal democracy, the idea of “public reason” has considerable relevance. “Public reason” is bound to the idea that any interference with an individual’s freedom should be defensible in terms of reasons that those affected can access and have reason to endorse (Gaus 2011, p. 19).32 We use this idea of public reason to help construct our proposed concept of public interest. Our concept might be considered a hybrid between common interest and common value theories.
At root, any claim that an individual should sacrifice some element of control over her or his own life, for the sake of a preference determined by others, demands justification if it is to be more than either an empty claim (devoid of any authority and to be avoided whenever possible) or a claim effectively enforced only by threat of force. That does not mean those affected must agree that a particular activity is preferred. There is a distinction between that which you would prefer to happen and that which you may accept as justified in the circumstances. People are capable of accepting reasonable things that are contrary to their own preferences.33 The ideal of public reason places only a justificatory burden on those claiming “the public interest”: to be able to produce reasons for any trade-off that those affected have reason to accept.
If the ideal of public reason is coupled with the idea that the relevant interests engaged are common interests, then we suggest that we have enough to fashion (via transcendent reason) a normative yardstick capable of determining what the public interest safeguard requires (i.e., that which the public have reason to accept is “in the public interest”). If, consistent with the common value, any trade-off between common interests may only be made for reasons that affected publics have reason to accept (and discussion on these reasons is held according to the principles of public reasoning),34 then we have the foundations of a test capable not only of determining what the public interest requires but of doing so in a manner consistent with a commitment to promote social legitimacy.
In short, our aim is to recognise that respect for persons as free and equal members of society requires that the adoption of any trade-off between common interests is justified in terms that are both accessible and acceptable to them. If this can be achieved, then the justification for the adoption of a particular trade-off between common interests should “enhance the rationality of political discourse” (Harden and Lewis 1986, p. 44 cited in Feintuck 2004, p. 58) and go some way toward responding to Brownsword’s call that “governance in the public interest must be conceived of as a quest and a claim for legitimate governance” (Brownsword 1993).

3.2. Applying a Concept of Public Interest to Health Research Processing of Personal Health Data: Common Interests in Privacy and the Benefits of Health Research

Both the promotion of health research and the protection of privacy engage common interests. These may not be the only common interests engaged by processing health data for research purposes. We select these two interests because they are likely to be engaged and because they illustrate the approach that must be taken to public interest decision-making in cases where (i) common interests must unavoidably be traded off and (ii) common value is attached to the idea that we are free and equal members of society, entitled to acceptable reasons for such a trade-off.35 They are also both prominent in debates regarding appropriate governance models to enable research access to health data, and in the discussions while the research exemption in the GDPR was being debated (Nyrén et al. 2014).

3.2.1. The Common Interest in Health

Brian Barry notes that a “person’s interests are (roughly) advanced when his [or her] opportunities to get what he [or she] wants are increased” (Barry 2011, p. 216). A person’s state of health is one of the things that affects those opportunities, as good health is likely to support the achievement of any life goals. The importance of health as a “generic feature of action” has led to claims that individuals can place moral demands upon others to support their health in certain circumstances (Gewirth 1978). When the State advances certain “goods” such as “health, food and housing; and by installing public institutions that bear positive duties to society in providing these goods” (Capps et al. 2008, p. 14), it promotes goods that are “recognisably public in the sense that they are generic features of everyone’s wellbeing and freedom” (Capps et al. 2008, p. 15).36
If we each have an interest in achieving and maintaining a state of health that is at least sufficient for us to pursue self-selected life goals, then we have an interest in things, such as health research, that are capable of protecting and improving that state of health. Of course, one cannot move from this to attributing universal value to all health research. We might get what we want out of life without the knowledge brought by specific research projects.37 There are, however, three points that suggest there is a common interest in health research even if we may never require health services or enjoy the benefits of health research. The first is that one’s own future is uncertain. This limits the range of research activity that we can be confident will never have any relevance to us personally. The uncertainty means, as Lowrance puts it, “we are vulnerable together” (Lowrance 2012, p. 2). The second is that even if we could be certain that a particular research project will not benefit our own health, it is not necessarily the case that we only want good health for ourselves. We might increase the opportunities to “get what we want” by improving the health care of friends, family, and even complete strangers.38 Finally, we might want good health for others for reasons other than altruistic or moral ones. Self-interest may motivate where poor health in others might burden us, particularly if we share a responsibility for supporting them, e.g., through a national health service. All this is to say that, at root, we typically want not only good health for ourselves, but we also want others to be healthy. We may accept those things that are directly or indirectly supportive of effective health care to be in our common interest. While demonstrating a common interest in health research might provide a pro tanto reason to will the means to its achievement, it does not follow that unfettered access to personal data for health research purposes is in the public interest. It is not the only common interest at stake.

3.2.2. The Common Interest in Privacy

Privacy is not an easy idea to capture succinctly but, briefly put, informational privacy can be understood in terms of “norms of exclusivity”: Privacy concerns include normative expectations and preferences regarding access to and use of personal data (Taylor 2012, pp. 25–34). Privacy can establish appropriate access and use as well as positively support data exchange and enable relations within a society. For example, while it is the norm in many European countries to respect the confidentiality of health data, this is at least in part because such respect supports the disclosure of personal data in a health care context (M.S. v Sweden 1997, para. 41). One of the things that we seek to protect through informational privacy in a health care context is the ability to confide sensitive personal data to a health care professional. This also facilitates sharing between health care professionals in delivering health care as each professional similarly recognises and respects confidentiality. In the UK, when considering the norms of exclusivity in relation to personal health data, there is evidence that the public expects consent to be sought before personal health data is used for research purposes (Department of Health 2009, pp. 5–6; The Information Governance Review 2013). We freely acknowledge that individual control does not exhaust the interests in privacy protection. Indeed, we have separately considered how the law might better protect group privacy interests (Taylor and Whitton forthcoming). However, so long as privacy is associated with individual control over access to information relating to identifiable individuals, any interference with an individual’s ability to control third party access will engage a privacy interest.

3.2.3. The Common Value: Equality and Freedom

Fundamental to our concept is the common value attributed to equality and the attendant idea that one cannot justify imposing preferences upon other persons, and undermining (common) interests they hold, without asserting some superiority over them. This is exacerbated if the preference regarding the trade-off is not a preference they share. If one seeks to adopt a position that will negatively impact upon others, either because their preferences regarding access to data (to promote the common interest in health research) would not be followed, or because the research they value would not take place (due to respect for the common interest in privacy), or simply because the respective trade-off is imposed on rather than chosen by the individual, then one should be prepared to defend that position in terms of public reason: In terms of reasons that those affected can both access and have reason to endorse (Gaus 2011, p. 19). It is intolerable39 to do otherwise because ultimately it must prove to be either ineffectual or tyrannous.
The greater the level of individual control exercised in relation to personal health data, the harder it is to assure research access. An insistence upon explicit individual consent before researchers have access to identifiable information can frustrate or delay research in circumstances where such consent cannot be readily sought or would not be given (The Academy of Medical Sciences 2006, pp. 58–61; Huang et al. 2007, p. 18).40 This can create a tension between the protection of privacy (in particular the norm in relation to being asked before identifiable data is disclosed for purposes beyond direct care) and the facilitation of research access (The Academy of Medical Sciences 2006, pp. 58–61; Huang et al. 2007, p. 18).41 Often research access and privacy norms can both be respected and even mutually reinforced. However, occasionally they do come into unavoidable conflict and, on these occasions, it is not possible to promote one without undermining the other.

3.3. Implications and Acceptability

3.3.1. Acceptability of Processing for Health Research Purposes without Consent

The confidence level with which one can assert that an individual has reason to accept a trade-off between different interests is directly linked to how far one’s concept of the relevant person is idealised, how abstractly one conceptualises his or her individual interests, and the ideas to which initial commitment is assumed. If everyone has a different idea of what the word “privacy” means, then it will be difficult to demonstrate a particular trade-off with health improvement that ought (rationally) to be consistently accepted by all members of the public. For the purposes of this analysis, at least elements of the common interests in “privacy” and “health” must be held constant. An acceptable trade-off can only be reliably assessed, in principle, if one abstracts from the contingent circumstances of an individual and places certain normative and epistemological constraints around his or her reasoning. If one adopts a particularly idealised concept of “the reasonable person” (Rawls 1996), then one can deduce, with relative certainty, the trade-off(s) that could be rationally endorsed in principle.42 For the sake of our analysis, we rely upon the perspective of an idealised person who holds steady our three “relevant concerns”.43
A foundational principle, grounded in the concern with equality we have described, is that a person should be compelled to suffer an interference with a common interest only in so far as necessary and consistent with public reason. People are not compelled to suffer any interference with either common interest if they provide consent to the use or disclosure of personal health data for research purposes. For this reason, where it is practicable to seek an individual’s consent to any processing of personal health data for research purposes, his or her consent to processing must normally be sought. Otherwise, his or her interest in privacy44 is interfered with minimally to the extent he or she is denied the opportunity to exercise control (even though they would have agreed if asked) and maximally to the extent that the processing is inconsistent with his or her preference. Where it is not practicable to seek consent ab initio, then structural privileging is unavoidable. Either access without consent is permitted (potentially inconsistent with an individual’s preference) or it is not permitted (also potentially inconsistent with an individual’s preference). In so far as structural privileging is unavoidable, then (i) it is ideally reversible by the individual (so as to avoid unnecessary compulsion), and (ii) any compulsion, such as through imposition of priority, must be for accessible and acceptable reasons.
To permit access without consent, in circumstances where it is not practicable to seek consent, but it is practicable to allow individuals to express an objection to processing, is to permit access in circumstances where the structural priority prima facie accorded to health research is reversible by the individual. The fact that this option preserves the common interest in an individual being able to exercise control in relation to use of personal health data, as well as the common interest in health research, means that an individual has reason to accept it in preference to a structural privileging that he or she cannot reverse (and which would unavoidably frustrate the common interest in health research in the circumstances). We reiterate that one can only offer persons reasons to accept this trade-off as that which best protects the common interests in both privacy and health research where it is unavoidable to impose prima facie priority of one over the other: where an individual could be asked ab initio what they prefer with no interference with either interest, then they have no reason to accept anything other than a model which seeks his or her consent before processing.
It is more difficult in a scenario where an individual is not able to express an objection. What reason would they have to accept structural priority being accorded to health research or privacy in these circumstances? Our analysis, based on the perspective of an idealised person who holds steady our three relevant concerns, allows us only to say that they would require reasons for processing that were acceptable to them in the circumstances. This would extend beyond reasons for not seeking consent, and reasons for not allowing objection, to include reasons for privileging the interests in access over non-access that he or she has reason to endorse. If this seems a rather disappointing conclusion, then we would emphasise that we have already travelled to a point where we can make five substantive comments about existing data protection law and the potential implications of the introduction of a public interest test into the UK Data Protection Act 2018.
The first substantive comment is that it is appropriate for data protection law to establish a default route through to processing personal health data for research purposes without consent. This supports the position under data protection legislation that recognises there to be alternative legal bases for processing other than an individual’s consent. The second substantive comment, however, is that it is only in the public interest to rely upon that route consistent with a commitment to avoid imposing any interference with either privacy or health research where practicable. The only way that both interests are fully protected is if research is done with an individual’s consent. Therefore, if the research can be done with consent, then it should be done with consent. If it cannot be done with a consent that would satisfy the strict requirements of the GDPR, then it should be done with something less than that but which nonetheless represents an authentic commitment to enable individual control to the maximum extent practicable in the circumstances. The third substantive point is that this might amount to respecting objection in the circumstances (rather than seeking consent) but there would need to be acceptable reasons for this dilution of individual control. The kinds of reason for not seeking consent rehearsed earlier may be relevant here. However, this leads to the fourth substantive point: acceptable reasons for not seeking consent may be (in principle and practice) qualitatively different from acceptable reasons for not giving somebody the opportunity to object to processing or for overriding an expressed objection. That it is in the public interest to process without consent does not, therefore, entail that it is in the public interest to process without respecting objection and one should not adopt a crude understanding of public interest that implies otherwise. This suggests that the introduction of the public interest test into the UK Data Protection Act 2018 has implications for an understanding of the right to object.

3.3.2. The Right to Object

One implication of our argument is that the effect of introducing the public interest requirement into UK data protection law is to make access to health data for research purposes conditional upon there being acceptable reasons for overriding what may otherwise be a preference for consent.45 This may be more readily acceptable, where consent is not sought, if persons are provided with sufficient information about intended use, have adequate time to reflect upon the information provided, which includes details about why consent is not relied upon (as lawful basis or to provide the relevant exception), and are then (at least as a default) able to object to use if that is their considered preference, for any reason at all.46 In this way individual preferences regarding access and use can be effectively protected. The interest in individual control of third-party access to health data for research purposes can be preserved with consultation and the associated opportunity to have individual objection respected.47
An important point to draw from this analysis is that even if one is able to offer an acceptable reason for research use of personal health data without consent in the circumstances, it does not follow that one can offer an acceptable reason for not respecting an objection, or for processing in circumstances where individuals have no opportunity to object. In this way our preferred concept of public interest supports an interpretation of data protection legislation under which the public interest test functions independently in the different parts: it should not be sufficient for processing to be based on Article 6(1)(e), including that limb relevant to processing necessary in performance of a task in the public interest, to disapply the right to object. Indeed, the public interest requires acceptable reasons for not only processing without consent but specifically also acceptable reasons for processing without the opportunity to object or processing in the face of an explicit objection.

3.3.3. Overriding Objection and Processing without Opportunity to Object

In the most extreme case, where express objection is overridden, acceptability may rely upon appeal to some second-order, or universalisable, principle, such as the protection of other persons from more fundamental harm: reasoning that an individual has reason to endorse, even if it is subjectively rejected. For example, in cases where research might be intimately connected with maintaining the conditions for others to act as free and equal members of society (e.g., identifying the signs of abuse and helping to prevent it) and objection to research access (e.g., objection to access by abusers) represents a failure to act with due respect for others’ fundamental rights and freedoms, then the imposition of a trade-off that places a lower value upon individual control may be justified.48 In such cases, engaging individuals in the decision-making process still has an advantage in terms of “reflexive governance”: objectors should have the opportunity to be heard, and to receive an explanation of why, although their objections were conscientiously taken into account, they were nevertheless rejected (Mullen et al. 2011; Laurie 2011). This may require abstraction from their personal and contingent circumstances (they are unlikely to perceive access to be in their interests). However, as rational persons engaged in a social enterprise, the reasons should still be “mutually intelligible” (Gaus 2011, p. 283) to the person, even if only qua rational agent. To lose this requirement entirely would be to give up on the idea of “public interest” altogether and to lose any authority in reliance upon it (Gaus 2011, p. 264).
In circumstances where individuals have no opportunity to object, or where express objection is overridden, the claim that persons have reason to endorse the trade-off can be tested empirically. This leads to our fifth, and perhaps most significant, substantive comment on data protection legislation: our preferred concept of public interest provides a connection between the legal concept of public interest and public engagement work that may serve to protect the legitimacy of data protection law in operation.

4. Social Legitimacy

While, ultimately, the demands of public reason are satisfied if people can access reasons for a particular trade-off that they “have reason to accept”, what people are willing to accept in practice remains an important consideration. The legitimacy and viability of a particular trade-off depend upon public acceptance of its reasonableness: they are undermined if people do not accept it. For this reason, public reasoning will also be an important part of the process.

Public Reasoning

To maximise the opportunities for reason(s) to be tested, and the conclusions accepted, any demands of public reason must be open to verification through public reasoning.
As the Nuffield Council on Bioethics puts it, “[t]he virtue of public reasoning is the cultivation of clear and explicit reasoning orientated towards the discovery of common grounds rather than in the service of sectional interests, and the impartial interpretation of all relevant available evidence” (Nuffield Council on Bioethics 2012, p. 69).
People must be able to see that the reasons for a particular trade-off are accessible and therefore open to challenge and correction. Individuals must also be able to reject the reasoning, and adopt their own preferences, in all but the most exceptional circumstances. Unpacking the rationale in this way can thicken the concept of public interest in meaningful ways. Substantive restraint is placed upon those who would seek to invoke public interest justifications if they are not able transparently to provide acceptable reasons for trading off common interests, including the interest in consent rather than consultation. The further advantage of unpacking the concept of the public interest in this way is the implication it has for promoting social legitimacy.
As we argue that acceptance is grounded in common interests and values, tested in light of public reasoning, it is by regulating in a way that is consistent with the deeply held values of individuals (evidenced by acceptance in fact) that this construction of public interest is likely to increase the social legitimacy of regulation in this area. Here, in particular, we see resonance between our approach to unpacking the concept of public interest and the processual approach put forward by Annie Sorbie. Sorbie notes that while the concept of public interest she advocates is
not predicated upon a necessary link with actual publics’ views, the processual analysis draws attention to the ways that, in these contexts, publics’ views (and indeed those of other stakeholders—most particularly in the health research community) on the acceptability of data sharing and linkage do bear on the law as the public interest is operationalised and disseminated in particular contexts. (Sorbie 2019)
What people can be shown empirically to be willing to accept will thus have evidential value to the application of public interest rooted in the idea that any particular trade-off can be defended in terms that persons have reason to accept. If nobody accepts the trade-off in fact, then this challenges the claim that they have reason to accept it.49 What people are willing to accept in fact will be influenced by how particular ideas (such as privacy or health) are given substance within a particular community and on how these ideas then come to “bear on the law as the public interest is operationalised”.
Although there has been considerable research into what people think about access to their health records for research purposes, it remains unclear what level of trade-off between the common interests in privacy and health improvement the public consider to be acceptable. One may concede that, if asked, people will typically say that they would prefer that their data is used in anonymous form and would prefer to be asked before it is used for purposes beyond their direct care,50 without giving ground on the claim that people might acknowledge that it is acceptable for there to be access with lower levels of individual control in some circumstances.51 In fact, the research that has been done to consider patient preferences for different models of consent paints a picture of very different individual preferences with regard to levels of individual control.52 The challenge of reliably assessing public attitudes in this area was noted after an extensive meta-survey of the literature on public and professional attitudes to privacy in 2007:
Assessment of public attitudes is dependent on how the topic is framed. People will express concerns if questioned about ‘concerns’, but will readily trade these ‘concerns’ for health or other benefits, even altruistic ones. ‘Real world’ choices can be very different (and constrained) from those offered in opinion surveys where costs and trade-offs may not appear. (Singleton et al. 2007)
We can say with certainty that the empirical research that has been done supports the position that anonymity and consent are not the only things that people consider to be important.53 There is evidence to support the claim that people value and support health research and would not want to see it unnecessarily impeded.54 It is currently hard to determine the extent to which people appreciate that there is, at times, a necessary trade-off between the common interests in privacy and health improvement through research. We propose only that the concept of public interest that we prefer may promote social legitimacy in the processing of personal health data for research purposes without consent, as it foregrounds the need to ensure that processing only occurs when individuals have reason to accept the trade-off that it represents in the circumstances.

5. Conclusions

A systematic review conducted by Hill et al. (2013) found that low levels of current understanding about research uses of patient data were consistently reported. With low levels of understanding about research use of patient data in general, one must wonder how accessible the reasons are for research access to identifiable data in particular, and how widespread is understanding of the implications for research if explicit consent were to be a requirement for access in all cases.
It seems reasonable that the level of individual control that individuals are willing to accept could be affected by (a) their understanding of the reasons for access without explicit consent and (b) their trust that the other things they consider important about the governance of research processes and safeguards (the broader “norms of exclusivity”) are robustly and transparently protected. The public may be more willing to accept levels of individual control lower than they would prefer if norms of exclusivity are protected independently of expectations for individual control (Taylor and Taylor 2014).
Although ideally each individual might be able to specify his or her preferred level of control (Willison et al. 2007),55 in practice implementation must adopt a default until an individual preference is expressed. Either access would be allowed prior to a contrary preference being registered (a de facto opt-out) or access would be prohibited until individuals expressed a willingness for their data to be used (de facto requiring explicit consent). If we cannot give everyone, as the default, the level of control that they would elect in practice, then we must instead adopt the default that we have reason to think they have reason to accept in principle.
Our argument is that one can, and should, adopt a default on the basis of what is entailed by transcendental reason, given a commitment to minimal interference with common interests and to justification in terms of public reason. The argument suggests a default similar to the approach currently taken by data protection law. There is a route through to lawful use of personal health data for research purposes without consent. Our argument also supports the position taken within data protection law that objection itself might be qualified where the public interest requires it. However, our argument unpacks and explains these positions in ways that introduce important qualifications and clarifications. The qualification is that it is only in the public interest to allow research processing without consent when individuals can be provided with reasons to accept this use without consent. If the public interest test introduced by the 2018 Act is understood in this way, then the 2018 Act has introduced an important substantive restraint on research use without consent. While the default is appropriate, there is a responsibility to continue to protect an individual’s ability to express a preference on the research processing so far as is practicable in the circumstances. The clarification that further explains the implication of this position is that the fact that it may be in the public interest to process without consent does not necessarily entail that it is in the public interest to continue to process in the face of an explicit objection. Individual control needs to continue to be protected and respected to the maximum extent practicable. Ensuring trade-offs are acceptable to those affected is an important check to protect and promote the social legitimacy of the processing.
Currently, UK researchers seeking to use personal health data for research purposes are advised not to rely upon consent as the lawful basis for processing but are encouraged to recognise that consent may nonetheless be a requirement of ethical research or legal compliance, e.g., with the law of confidence. Our argument not only supports that position but may serve to further consolidate health data governance by providing a coherent explanation and justification for the requirement that researchers continue to seek consent even if it is not an apparent requirement of data protection law. That is, even if consent is not relied upon to satisfy the requirement of data protection legislation for a lawful basis, or to provide an exception to the prohibition on processing special category data, a responsibility to seek it may follow from the 2018 Act’s introduction of a public interest requirement. This would complement, and may be understood to parallel, other legal and ethical responsibilities to, at least by default, use personal health data for specific purposes only with the consent of the data subject.

Author Contributions

Conceptualization, M.J.T.; writing—original draft preparation, M.J.T.; writing—review and editing, T.W. All authors have read and agreed to the published version of the manuscript.

Funding

Some of the background research used to support the argument in this paper was originally carried out with the support of funding provided by The British Academy, Mid-Career Fellowship Award (2012). That support is gratefully acknowledged.

Acknowledgments

This paper has had a long gestation. Many people have commented on and discussed various of the ideas contained within it as they have been presented at conferences, circulated in draft papers, and raised in friendly conversation. We are grateful to everyone but would like to thank in particular Roger Brownsword, Richard Kirkham, Graeme Laurie, Pete Mills, Aurora Plomer, Annie Sorbie, and Don Willison for the challenge and support that have helped shape the ideas that constitute the paper’s foundations. None have read this paper in final form but each has been instrumental in progressing our thinking in relation to the concept of the public interest. We are also very grateful to the anonymous reviewers for helpful and insightful suggestions on this paper.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Baker, Richard, Christopher Shiels, Keith Stevenson, Robin Fraser, and Margaret Stone. 2000. What proportion of patients refuse consent to data collection from their records for research purposes? British Journal of General Practice 50: 655–56. [Google Scholar]
  2. Barry, Brian. 2011. Political Argument. London: Routledge. [Google Scholar]
  3. Bell, John. 1993. Public Interest: Policy or Principle? In Law and the Public Interest: Proceedings of the 1992 ALSP Conference. Edited by Roger Brownsword. Stuttgart: Franz Steiner. [Google Scholar]
  4. Brownsword, Roger. 1993. Law and the Public Interest. In Law and the Public Interest: Proceedings of the 1992 ALSP Conference. Edited by Roger Brownsword. Stuttgart: Franz Steiner, p. 11. [Google Scholar]
  5. Caldicott, Dame Fiona. 2013. Information: To Share or Not to Share? The Information Governance Review. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf (accessed on 1 January 2020).
  6. Capps, Benjamin, Alastair V. Campbell, and Ruud ter Meulen. 2008. Access to the UK Biobank Resource: Concepts of the Public Interest and the Public Good. Available online: https://www.researchgate.net/publication/279191310_Access_to_the_UK_Biobank_Resource_Concepts_of_the_Public_Interest_and_the_Public_Good (accessed on 1 January 2020).
  7. Carter, Pam, Graeme T. Laurie, and Mary Dixon-Woods. 2015. The social licence for research: Why care.data ran into trouble. Journal of Medical Ethics 41: 404–9. [Google Scholar] [CrossRef] [Green Version]
  8. Curtin, Deirdre, and Albert Jacob Meijer. 2006. Does Transparency Strengthen Legitimacy? Information Polity 11: 112. [Google Scholar] [CrossRef] [Green Version]
  9. Data Protection Act 2018 Explanatory Notes. 2018. Available online: http://www.legislation.gov.uk/ukpga/2018/12/notes (accessed on 13 February 2020).
  10. Department of Health. 2009. Summary of Responses to Consultation on Additional Uses of Health Data. pp. 5–6. Available online: https://webarchive.nationalarchives.gov.uk/20100411140307/http://www.dh.gov.uk/prod_consum_dh/groups/dh_digitalassets/documents/digitalasset/dh_110715.pdf (accessed on 1 January 2020).
  11. Department of Health. 2010. Confidentiality: NHS Code of Practice—Supplementary Guidance: Public Interest Disclosures . Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/216476/dh_122031.pdf (accessed on 13 February 2020).
  12. Donnelly, Mary, and Maeve McDonagh. 2019. Health Research, Consent and the GDPR Exemption. European Journal of Health Law 26: 97–119. [Google Scholar] [CrossRef] [PubMed]
  13. European Data Protection Board. 2019. Opinion 3/2019 Concerning the Questions and Answers on the Interplay between the Clinical Trials Regulation (CTR) and the General Data Protection Regulation (GDPR) (art.70.1.b)). pp. 18–20. Available online: https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_opinionctrq_a_final_en.pdf (accessed on 1 January 2020).
  14. European Data Protection Supervisor. 2020. A Preliminary Opinion on Data Protection and Scientific Research. Brussels: European Data Protection Supervisor. [Google Scholar]
  15. Feintuck, Mike. 2004. The Public Interest in Regulation. Oxford: OUP. [Google Scholar]
  16. Gaus, Gerald. 2011. The Order of Public Reason: A Theory of Freedom and Morality in A Diverse and Bounded World. Cambridge: CUP. [Google Scholar]
  17. Gershon, Andrea S., and Jack V. Tu. 2008. The Effect of Privacy Legislation on Observational Research. CMAJ 178: 871–73. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. General Data Protection Regulation (GDPR). 2016. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Official Journal of the European Union 119: 1–88. [Google Scholar]
  19. Gewirth, Alan. 1978. Reason and Morality. Chicago: University of Chicago Press. [Google Scholar]
  20. Grace, Jamie, and Mark J. Taylor. 2013. Disclosure of Confidential Patient Information and the Duty to Consult: The Role of the Health and Social Care Information Centre. Medical Law Review 21: 415–47. [Google Scholar] [CrossRef] [PubMed]
  21. Harden, Ian, and Norman Lewis. 1986. The Noble Lie: The British Constitution and the Rule of Law. London: Hutchinson. [Google Scholar]
  22. Health Research Authority. 2018a. Safeguards. Available online: https://www.hra.nhs.uk/planning-and-improving-research/policies-standards-legislation/data-protection-and-information-governance/gdpr-detailed-guidance/safeguards/ (accessed on 13 February 2020).
  23. Health Research Authority. 2018b. Data Subject Rights and Research Exemptions. Available online: https://www.hra.nhs.uk/planning-and-improving-research/policies-standards-legislation/data-protection-and-information-governance/gdpr-detailed-guidance/data-subject-rights-and-research-exemptions/ (accessed on 1 January 2020).
  24. Held, Virginia. 1970. The Public Interest and Individual Interest. New York: Basic Books. [Google Scholar]
  25. Helm Toby. 2019. Patient data from GP surgeries sold to US companies. The Guardian Newspaper. December 8. Available online: https://www.theguardian.com/politics/2019/dec/07/nhs-medical-data-sales-american-pharma-lack-transparency (accessed on 1 January 2020).
  26. Hill, Elizabeth M., Emma L. Turner, Richard M. Martin, and Jenny L. Donovan. 2013. “Let’s get the best quality research we can”: Public awareness and acceptance of consent to use existing data in health research: A systematic review and qualitative study. BMC Medical Research Methodology 13: 5. [Google Scholar] [CrossRef] [Green Version]
  27. Hobbes, Thomas. 1651. Leviathan. London: Andrew Crooke. [Google Scholar]
  28. Hoeyer, Klaus, Bert-Ove Olofsson, Tom Mjörndal, and Niels Lynöe. 2004. Informed Consent and Biobanks: A Population-Based Study of Attitudes towards Tissue Donation for Genetic Research. Scandinavian Journal of Public Health 32: 224–29. [Google Scholar] [CrossRef]
  29. Huang, Nicole, Shu-Fang Shih, Hsing-Yi Chang, and Yiing-Jenq Chou. 2007. Record Linkage research and informed consent: Who consents? BMC Health Services Research 7: 18. [Google Scholar] [CrossRef] [Green Version]
  30. Information Commissioner’s Office. n.d. Lawful Basis for Processing. Available online: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/lawful-basis-for-processing/ (accessed on 1 January 2020).
  31. Information Governance Review. 2013. Information: To share or not to share? p. 62. Available online: http://caldicott2.dh.gov.uk/ (accessed on 1 January 2020).
  32. Kaye, Jane, Edgar A. Whitley, David Lund, Michael Morrison, Harriet Teare, and Karen Melham. 2015. Dynamic Consent: A Patient Interface for Twenty-First Century Research Networks. European Journal of Human Genetics 23: 141–46. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  33. Kielmann, Tara, Alison Tierney, Rosemary Porteous, Guro Huby, Aziz Sheikh, and Hilary Pinnock. 2007. The Department of Health’s Research Governance Framework Remains an Impediment to Multi-Centre Studies: Findings from a National Descriptive Study. Journal of the Royal Society of Medicine 100: 234–38. [Google Scholar] [CrossRef] [Green Version]
  34. Krousel-Wood, Marie, Paul Muntner, Ann Jannu, Amanda Hyre, and Joseph Breault. 2006. Does Waiver of Written Informed Consent from the Institutional Review Board Affect Response Rate in a Low-Risk Research Study? Journal of Investigative Medicine 54: 174–79. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Laurie, Graeme, Pierre Mallia, David A. Frenkel, Atina Krajewska, Helena Moniz, Salvor Nordal, Claudia Pitz, and Judit Sandor. 2010. Managing Access to Biobanks: How Can We Reconcile Individual Privacy and Public Interests in Genetic Research? Medical Law International 10: 315–37. [Google Scholar] [CrossRef] [Green Version]
  36. Laurie, Graeme. 2011. Reflexive Governance in Biobanking: On the Value of Policy Led Approaches and the Need to Recognise the Limits of Law. Human Genetics 130: 347–56. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Lipset, Seymour Martin. 1981. Political Man: The Social Bases of Politics. Baltimore: Johns Hopkins University Press, p. 64. [Google Scholar]
  38. Lowrance, William. 2012. Privacy, Confidentiality and Health Research. Cambridge: CUP. [Google Scholar]
  39. M.S. v Sweden. 27 August 1997. 74/1996/693/885. Available online: http://echr.ketse.com/doc/20837.92-en-19970827/view/ (accessed on 1 January 2020).
  40. Maynard, Paul. 2019. Dark Side of the Moon: Extraterritorial Applicability of the UK Data Protection Act 2018 after Brexit. HL Chronicle of Data Protection. Available online: https://www.hldataprotection.com/2019/03/articles/international-eu-privacy/dark-side-of-the-moon-extraterritorial-applicability-of-the-uk-data-protection-act-2018-after-brexit/ (accessed on 1 January 2020).
  41. McHarg, Aileen. 1999. Reconciling Human Rights and the Public Interest: Conceptual Problems and Doctrinal Uncertainty in the Jurisprudence of the European Court of Human Rights. The Modern Law Review 62: 674. [Google Scholar] [CrossRef]
  42. Medical Research Council. 2007. The Use of Personal Health Information in Medical Research: General Public Consultation (Final Report). p. 6. Available online: https://mrc.ukri.org/documents/pdf/the-use-of-personal-health-information-in-medical-research-june-2007/ (accessed on 1 January 2020).
  43. Mullen, Caroline, David Hughes, and Peter Vincent-Jones. 2011. The Democratic Potential of Public Participation: Healthcare Governance in England. Social & Legal Studies 20: 21–38. [Google Scholar]
  44. Nelson, Karin, Rosa Elena Garcia, Julie Brown, Carol M. Mangione, Thomas A. Louis, Emmett Keeler, and Shan Cretin. 2001. Do Patient Consent Procedures Affect Participation Rates in Health Services Research. Medical Care 40: 283–88. [Google Scholar] [CrossRef]
  45. Nuffield Council on Bioethics. 2012. Public Ethics and the Governance of Emerging Biotechnologies. London: Nuffield Council on Bioethics, p. 69. Available online: https://nuffieldbioethics.org/publications/emerging-biotechnologies/guide-to-the-report/public-ethics-and-the-governance-of-emerging-biotechnologies (accessed on 1 January 2020).
  46. Nyrén, Olof, Magnus Stenbeck, and Henrik Grönberg. 2014. The European Parliament proposal for the new EU General Data Protection Regulation may severely restrict European epidemiological research. European Journal of Epidemiology 29: 227–30. [Google Scholar] [CrossRef] [Green Version]
  47. Peto, Julian, Olivia Fletcher, and Clare Gilham. 2004. Data Protection, Informed Consent, and Research. BMJ 328: 1029–30. [Google Scholar] [CrossRef]
  48. Powles, Julia, and Hal Hodson. 2017. Google DeepMind and healthcare in an age of algorithms. Health and Technology 7: 351–67. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  49. Presser, Lizzie, Maia Hruskova, Helen Rowbottom, and Jesse Kancir. 2015. Care.data and access to UK health records: Patient privacy and public trust. Technology Science. August 11. Available online: https://techscience.org/a/2015081103 (accessed on 13 February 2020).
  50. Prictor, Megan, Harriet J. A. Teare, Jessica Bell, Mark Taylor, and Jane Kaye. 2019. Consent for data processing under the General Data Protection Regulation: Could ‘dynamic consent’ be a useful tool for researchers? Journal of Data Protection and Privacy 3: 93–112. [Google Scholar]
  51. Rawls, John. 1996. Political Liberalism. New York: Columbia University Press, pp. 49, 56, 217. [Google Scholar]
  52. Singleton, Peter, Nathan Lea, Archana Tapuria, and Dipak Kalra. 2007. Public and Professional Attitudes to Privacy of Healthcare Data: A Survey of the Literature. Available online: https://pdfs.semanticscholar.org/7cf6/7bdb85ec3abe95eb463b8c6d0b887afec748.pdf (accessed on 1 January 2020).
  53. Solum, Lawrence B. 2006. Public Legal Reason. Virginia Law Review 92: 1472. [Google Scholar]
  54. Sorauf, Frank. 1957. The Public Interest Reconsidered. The Journal of Politics 19: 619. [Google Scholar] [CrossRef]
  55. Sorbie, Annie. 2016. Conference Report: Liminal Spaces Symposium at IAB 2016: What Does It Mean to Regulate in the Public Interest? SCRIPTed 13: 375–81. [Google Scholar] [CrossRef] [Green Version]
  56. Sorbie, Annie. 2019. Sharing confidential health data for research purposes in the UK: Where are the publics in the public interest? Evidence and Policy, 1–17. [Google Scholar] [CrossRef] [Green Version]
  57. Tassé, Anne Marie, Isabelle Budin-Ljøsne, Bartha Maria Knoppers, and Jennifer R. Harris. 2010. Retrospective access to data: The ENGAGE consent experience. European Journal of Human Genetics 18: 741–45. [Google Scholar] [CrossRef] [Green Version]
  58. Taylor, Mark, and Tess Whitton. Forthcoming. Health Research and Privacy through the Lens of Public Interest: A Monocle for the Myopic? In Handbook on Health Research Regulation. Edited by Graeme Laurie. Cambridge: CUP.
  59. Taylor, Mark J., and Natasha Taylor. 2014. Health Research Access to Personal Confidential Data in England and Wales: Assessing Any Gap in Public Attitude between Preferable and Acceptable Models of Consent. Life Sciences, Society and Policy 10: 1–24. [Google Scholar] [CrossRef] [Green Version]
  60. Taylor, Mark. 2011. Health Research, Data Protection, and the Public Interest in Notification. Medical Law Review 19. [Google Scholar] [CrossRef]
  61. Taylor, Mark. 2012. Genetic Data and the Law: A Critical Perspective on Privacy Protection. Cambridge: CUP. [Google Scholar]
  62. Taylor, Mark. 2014. Information Governance as a Force for Good? Lessons to be learnt from Care.data. SCRIPTed 11. Available online: http://script-ed.org/?p=1377 (accessed on 1 January 2020).
  63. The Academy of Medical Sciences. 2006. Personal Data for Public Good: Using Health Information in Medical Research. Available online: http://www.acmedsci.ac.uk/p99puid62.html (accessed on 1 January 2020).
  64. The Academy of Medical Sciences. 2011. A New Pathway for the Regulation and Governance of Health Research. Available online: http://www.acmedsci.ac.uk/p99puid209.html (accessed on 1 January 2020).
  65. The EU Special Barometer 340: Science and Technology. 2014. Available online: https://data.europa.eu/euodp/en/data/dataset/S806_73_1_EBS340 (accessed on 1 January 2020).
  66. Townend, David Matthew Roy. 2012. The Politeness of Data Protection: Exploring a Legal Instrument to Regulate Medical Research Using Genetic Information and Biobanking. Maastricht: Universitaire Pers Maastricht, p. 164. [Google Scholar]
  67. UK Research and Innovation. n.d. GDPR and Research—An Overview for Researchers. Available online: https://www.ukri.org/files/about/policy/ukri-gdpr-faqs-pdf/ (accessed on 1 January 2020).
  68. Williams, Hawys, Karen Spencer, Caroline Sanders, David Lund, Edgar A. Whitley, Jane Kaye, and William G. Dixon. 2015. Dynamic Consent: A Possible Solution to Improve Patient Confidence and Trust in How Electronic Patient Records Are Used in Medical Research. JMIR Medical Informatics. [Google Scholar] [CrossRef]
  69. Willison, Donald J., Lisa Schwartz, Julia Abelson, Cathy Charles, Marilyn Swinton, David Northrup, and Lehana Thabane. 2007. Alternatives to Project-specific Consent for Access to Personal Information for Health Research: What Is the Opinion of the Canadian Public? Journal of the American Medical Informatics Association 14: 711. [Google Scholar] [CrossRef] [Green Version]
  70. World Health Organisation. 1946 (as amended). Constitution of the World Health Organisation. Available online: https://www.who.int/about/who-we-are/constitution (accessed on 13 February 2020).
1
The explanatory notes to the legislation simply note that the section of the legislation containing the requirement “specifies the conditions that must be met in order for special categories of data […] to be processed for employment, health, archiving and research purposes”. (Data Protection Act 2018 Explanatory Notes 2018, p. 80). Guidance issued by the Health Research Authority on the public interest test notes that it is “good practice to document decisions” and that “relevant considerations could include that the processing is subject to a governance framework which operates with public interest as a criterion, assessed independently of the data controller. This could be by peer review from a public funder, research ethics committee review, Confidentiality Advisory Group (CAG) recommendation for support in England and Wales or support by the Public Benefit and Privacy panel for Health and Social Care in Scotland.” (Health Research Authority 2018a). There is guidance provided by the UK Information Commissioner’s Office on the meaning of “public interest” and operation of a public interest in the Freedom of Information Act 2000 (UK) (“FOIA”). While this indicates how the regulator might think about the concept of public interest more generally, it does not constitute guidance on how to interpret the public interest test now contained within the Data Protection Act 2018.
2
We recognise that data protection law is not the only source of governance requirement and that professional ethics as well as other legal duties, e.g., under the law of confidence, supplement data protection legislation. While our focus is on data protection law, we hope that the concept of public interest we develop here will be recognised to have relevance more broadly. It may serve to assist in aligning regulatory requirements across the breadth of health data governance and consistency itself may help to promote confidence in effective governance.
3
Now known as NHS Digital https://digital.nhs.uk accessed on 1 January 2020.
4
It was, for example, an explicit aim of the second information governance review, conducted by Dame Fiona Caldicott, “to ensure that there is an appropriate balance between the protection of the patient or user’s information, and the use and sharing of such information to improve care” (Caldicott 2013, p. 6).
5
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC.
6
We draw upon and develop aspects of the “triple test” for public interest decision making introduced in (Taylor 2014, pp. 1–8) and (Taylor in Sorbie 2016, pp. 375–81).
7
Our approach shares features with the processual approach described by (Sorbie 2019, pp. 1–17). If applied consistently across the EU, then it might help address some of the challenges described by (Donnelly and McDonagh 2019, pp. 97–119). For example, they note that the result of Article 89(1) leaving the detail of the framework to be developed by member states is “continued fragmentation in data protection standards across the EU”.
8
The triple test that one of us has previously proposed would require that data be used only for purposes that persons have reason to expect, to accept, and which normally respect expressed preferences (Taylor 2014, pp. 1–8; Taylor in Sorbie 2016, pp. 375–81). Here we foreground just “accept” and “respect” as they are most relevant to our analysis of the test contained within the Data Protection Act 2018. One might also consider whether data protection legislation does enough to ensure that data is used in ways that persons have reason to “expect” but that would be to extend an already lengthy analysis. It may be considered separately.
9
We, here, associate legitimacy with “the capacity of the system to engender and maintain the belief that the existing political institutions are the most appropriate ones for the society” (Lipset 1981, p. 64). This is consistent with recognition that the “liberal principle of legitimacy states that the exercise of political power is justifiable only when it is exercised in accordance with constitutional essentials that all citizens may reasonably be expected to endorse in the light of principles and ideals acceptable to them as reasonable and rational” (Solum 2006, p. 1472). See also (Curtin and Meijer 2006; Taylor 2011).
10
A patient advocacy panel which scrutinises applications for NHS Scotland health data purposes beyond individual care including, for example, research: https://www.informationgovernance.scot.nhs.uk/pbpphsc/ accessed on 1 January 2020.
11
Guidance issued by the Department of Health recommends advice is sought from the CAG before making a disclosure “in the public interest” where the public interest is relied upon as a justification for disclosure in circumstances that would otherwise constitute a breach of the common law duty of confidentiality: (Department of Health 2010, p. 10). The CAG advises on the use of legal powers, provided under s 251 NHS Act 2006 and the Health Service (Control of Patient Information) Regulations 2002, which may provide a lawful basis for disclosure of confidential patient information for medical purposes, including health research.
12
The Privacy Advisory Committee advise health and social care bodies in Northern Ireland about the use of patient and client information including for research purposes.
13
Broadly defined by Article 4(2) to include any operation or set of operations performed on personal data or on sets of personal data whether or not by automated means.
14
Section 207(2) Data Protection Act 2018. In fact, the territorial application of the 2018 Act extends beyond this. This is a point we pick up later as it has some significance for researchers in member states targeting research participants in the UK in the case of Brexit.
15
Section 3(9) Data Protection Act 2018 provides a definition of data protection legislation. To be amended, in case of Brexit by Schedule 21, Part 2, Para 2(1) of the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019.
16
Article 4(1) GDPR defines “personal data” as “any information relating to an identified or identifiable natural person (“data subject”); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”.
17
Research itself is broadly defined in Recital 159 of the GDPR to include “for example technological development and demonstration, fundamental research, applied research and privately funded research”. Recital 159 also notes that “[s]cientific research purposes should also include studies conducted in the public interest in the area of public health”. Research may be conducted by a private or a public body.
18
Section 3(1) and Schedule 1, Part IV, Para 53 Freedom of Information Act 2000.
19
A non-departmental public body sponsored by the government department for Business, Energy and Industrial Strategy (BEIS) and established by the Higher Education Research Act 2017.
20
The Data Protection, Privacy and Electronic Communications (Amendments etc.) (EU Exit) Regulations 2019.
21
The GDPR refers to “public interest” 43 times through the recitals and a further 26 times in the text of articles.
22
Some of the concerns expressed during the drafting of the GDPR have been ameliorated by recital 33 but the extent to which a broad consent is permissible in a research context remains the subject of debate. See, for example, the resolution passed by the Conference of German Data Protection Authorities: https://www.datenschutzkonferenz-online.de/media/dskb/20190405_auslegung_bestimmte_bereiche_wiss_forschung.pdf accessed on 1 January 2020. See also (European Data Protection Supervisor 2020, pp. 18–19).
23
Response rates are also likely to vary across demographic characteristics and so requiring recontact for the purposes of seeking a consent may introduce additional bias into a dataset (Krousel-Wood et al. 2006).
24
If explicit consent were to be relied upon as the basis of an exception to the processing of special category data, then a data subject would be able to withdraw consent. An objection (rather than withdrawal) is only relevant where consent has not been given to the processing by the data subject.
25
McHarg cites (Held 1970) as the original source of the categorisation.
26
It should be highlighted that this is different from an attempt to distinguish those interests possessed as a group from interests possessed by members of that group as individuals. This would attempt to identify emergent collective or community interests and could be associated with some communitarian ideas.
27
One response might be to accept that the public interest is a purely descriptive concept. Any value associated with the idea, capable of such adjudication, has to be independently established. This may reduce the idea to something that is conceptually coherent but, by definition, also restricts its value. Nevertheless, if there were no better alternative, then the idea may have analytic merit. We suggest, however, that there is an alternative that is normatively as well as descriptively credible.
28
So far as it is possible to do so, UK data protection law must be read and given effect in a way which is compatible with the rights enshrined in the European Convention on Human Rights. Section 3(1) Human Rights Act 1998, UK.
29
Barry’s work on public interest does identify interests that might be described as universally common and potentially overriding—the interest that all persons have in achieving what he or she wants. (Barry 2011, p. 216). However, this does not, without more, provide a workable concept of public interest in the context of a data protection safeguard. Invariably, there is likely to be potential conflict between what individuals want in a scenario where privacy protection is brought into tension with access to data for research purposes.
30
There are sufficient points of disagreement regarding universal moral and political truth to undermine efforts to build a substantive concept of public interest through transcendent reason alone. That is why we adopt what might be described as a dialectically contingent rather than a dialectically necessary approach.
31
We could consider attempts made to defend the claim to equal rights to individual freedom in moral and political philosophy—and the associated philosophical defence of the idea of public reason. However, we seek an idea of public interest that is acceptable in fact. An idea of public interest that can only be defended relative to ideas that are not shared, as a matter of fact, by members of the public is fundamentally limited in its ability to protect social licence to operate. For that reason, rather than seek to offer a dialectically necessary argument for a particular concept of public interest, we are candid that our argument only has traction in relation to those willing to accept there is common value attributed to the recognition of individuals as free and equal members of society—at least when compared with the practicable alternatives—without further defending the idea. If people are not already willing to accept the value of equality, then we are sceptical of the power of philosophical argument to persuade them.
32
Note the distinction Gaus draws here between the Restricted and the Expansive view of Freedom and Equality.
33
This capacity might be characterised as “moral personhood” (Gaus 2011, p. 19).
34
The definition of public reasoning used by the Nuffield Council on Bioethics is adopted here: “The virtue of public reasoning is the cultivation of clear and explicit reasoning orientated towards the discovery of common grounds rather than in the service of sectional interests, and the impartial interpretation of all relevant available evidence.” (Nuffield Council on Bioethics 2012, p. 69).
35
The argument is that three contingent commitments may establish a platform upon which one might construct (via transcendental reason) a modest idea of the substance of public community in this context. The claim is admittedly and deliberately modest. We seek only to articulate the substance of public community in the context of research access to patient confidential data. If accepted, however, it shows how a public interest might be constructed in other contexts and in relation to other common interests (recognising that data protection is not exclusively concerned with the kind of control over data implied by informational privacy). The aim is to demonstrate how a public interest position, once constructed, might justify establishing policy and legal defaults in a particular way, and moving people toward particular choices, without subjugating them to principles that they do not have reason to accept.
36
In this context one might also note the constitution of the World Health Organisation and the assertion of the highest attainable standard of health as a fundamental human right. (World Health Organisation 1946).
37
If one considers public health and preventative measures based on previous studies, such as vaccination, it may be considered unlikely that good health across an individual’s entire lifespan has not been supported at all by any previous research, but it is still possible.
38
Of course, if Gewirth’s argument is successful (Gewirth 1978), then we have reason to respect the right of others to health care necessary to enjoy the generic features of agency: we have reason to want relevant health research even if we do not recognise it.
39
As Hobbes observed, “when men that think themselves wiser than all others clamour and demand right reason for judge, yet seek no more but that things should be determined by no other men’s reason but their own, it is … intolerable in the society of men.” (Hobbes 1651, p. 10).
40
Although for the view that seeking consent did not bias the sample, see Baker et al. (2000), pp. 655–56.
41
42
When developing his idea of public reason Gaus relies upon an idea of Members of the Public where each “deliberates well and judges only on the relevant and intelligible values, reasons, and concerns of the real agent she represents and always seeks to legislate impartially for all other Members of the Public”. See (Gaus 2011, p. 26).
43
The “relevant concerns” are, as previously described (1) Privacy: Maintaining norms of exclusivity in respect of PCD (including preferred levels of individual control over access), (2) Health improvement: Facilitating research to improve health, and, (3) Respect for the common value of freedom and equality.
44
At least in so far as there is a relationship between individual control over access and use of personal health information and privacy.
45
One might reflect upon whether this is, and has always been, the case under the common law duty of confidentiality. Discussion of this point would take us too far from our central argument.
46
These are adapted from the classic requirements for consultation as articulated by Sedley QC and adopted by Hodgson J in R v Brent London Borough Council ex parte Gunning (1985) 84 LGR 168, p. 169. See also (Grace and Taylor 2013).
47
In fact, it is possible that their expectations (and therefore also their privacy) in relation to access that is permitted may be enhanced. There may be greater oversight and “bargaining power” over the conditions of access applied generically to default access than could be assured through an individual mechanism of specific consent. This argument is, we think, consistent with and potentially supplemented by Lowrance’s own argument that consent might best be understood as “entrusting”: emphasising “the correlate responsibilities of researchers and their institutions.” (Lowrance 2012, p. 86).
48
Here, we agree with Townend that the “objective claim to an intervention that legitimately overrides the subjective claim of the individual is the function of the public interest in society” but we would suggest that the “objective claim” is to an intervention that the individual themselves can be shown to have reason to accept (Townend 2012).
49
Support for the claim that understanding the public interest in this context requires input from the public can be found in (Laurie et al. 2010).
50
The Department of Health (2009) found that a majority of people state that identifiable data should never be used for research purposes without explicit patient consent. Research conducted by IPSOS MORI concluded “the two key pillars of anonymity and consent feature highly in the debate over what data should be available, to whom, and in what circumstances. These two themes are central to building trust.” (Medical Research Council 2007).
51
Summary Report of Qualitative Research into Public Attitudes to Personal Data and Linking Personal Data (Wellcome Trust, July 2013) found that “superficially no/very few objections to medical data being used for “the general good” (perceived as helping finding cures and causes), provided that commercial gain is not the priority”, p. 12. Available online: http://www.wellcome.ac.uk/stellent/groups/corporatesite/@msh_grants/documents/web_document/wtp053205.pdf accessed on 1 January 2020.
52
See, for example, the findings of Hill et al. (2013) that “there was no consensus on a preferred model [of consent] either within or across studies, although participants often considered the balance of obtaining consent against the public benefit incurred by unrestricted research”.
53
Klaus Hoeyer’s work has usefully established that these are not even always considered to be the most important considerations. See Hoeyer et al. (2004).
54
The EU Special Barometer 340: Science and Technology (2014), p. 113, found that when asked which area of research should be tackled as a priority by researchers in the European Union, respondents most often mention health issues (40%), with energy issues at 21% and environment issues at 18%. A systematic review of the literature concerning public acceptance of consent to use patient data in health research found that in a clear majority of previous qualitative studies (9 out of 11) “participants were reported as recognising the benefit of research for the population” (Hill et al. 2013).
55
Also supported by the idea of Dynamic Consent (Kaye et al. 2015); Williams et al. (2015).
