Some measures for reducing harms in genomic research

By Haleh Asgarinia & Beatriz Esteves


Last Thursday, September 17th, Karlin Lillington wrote an article in The Irish Times about how ‘Ireland is quietly allowing private companies to collect and monetize Irish DNA’, once again highlighting the country’s need for a national public genomics program.

In this particular case, Beaumont Hospital and Genuity Science are conducting a research study to create a Brain Tumor Information System for adults, using a large database of DNA samples collected from patients with brain tumors treated at Beaumont between 29 November 1987 and 7 August 2018.

Patients, family members, and advocates of deceased patients were alarmed to receive an unexpected Monday deadline to opt out of this study, rather than being asked to opt in, especially since the General Data Protection Regulation (GDPR) grants special status to genetic and health-related data.

Like other health research, genomic research requires (informed) consent. But is consent enough to ensure that patients are respected and protected against the harms genomic research can cause?

Genomic data differ from other types of biological and medical data in two ways: first, parts of genetic information are shared among family members and ethnic communities; second, DNA can be read like a structured text. Both features suggest that research communities need additional measures to respect and protect patients.

Even when patients consent to the use of their data for research purposes, they cannot consent on behalf of their family members or ethnic communities: they do not know everyone affected, and obtaining consent from an entire community is impractical.

Moreover, because DNA has the character of a text, it can be read and interpreted. An interpretation may, for example, indicate the probability of disease. On the basis of such information, people are often classified deterministically, even though the interpretation itself yields only probabilistic findings. The use of such generalized data can harm a person: it labels and judges them without giving them any opportunity to contest the interpretation or classification. Employers or health insurance companies may then rely on these labels without regard to how the person wishes to be identified.

These considerations show that additional measures are needed to respect and protect patients against such potential harms, for example by ensuring that data are properly interpreted and used. Can you assure your patients that these measures will be in place by Monday? Or have such measures already been taken?

Read more about group privacy and the harms of targeting people:

Taylor, L., Floridi, L., & van der Sloot, B. (2017). Introduction: A New Perspective on Privacy. In L. Taylor, L. Floridi, & B. van der Sloot (Eds.), Group Privacy: New Challenges of Data Technologies (pp. 10-22). Dordrecht: Springer.

Bruynseels, K., & van den Hoven, J. (2015). How to Do Things with Personal Big Biodata? In B. Roessler & D. Mokrosinska (Eds.), Social Dimensions of Privacy: Interdisciplinary Perspectives (pp. 122-140). Cambridge: Cambridge University Press.
