The aim of the project is to establish EU-wide standards for the use of ‘in silico modelling’ (including software developed through artificial intelligence) in healthcare. ‘Personalised medicine’ is treatment tailored to a particular patient. This ‘personalisation’ can be achieved by running their medical records through a computer ‘model’, such as software that helps doctors diagnose patients or recommend treatment.

The legal and ethical Work Package focuses on data protection, patients’ rights and research ethics. The objectives of the work package are to: 

1. Map and analyse European and international regulation in the areas of data protection, biomedical research (including clinical trials), and patients’ rights relevant to collaborative research for heterogeneous data integration and data-driven in silico models.

2. Evaluate legal and ethical challenges and explore possibilities for collaborative research for heterogeneous data integration and data-driven in silico models.

3. In collaboration with the other WPs and external stakeholders, develop recommendations which are compliant with the current legal and ethical framework and provide a sustainable backbone for collaborative research for heterogeneous data integration and data-driven in silico models.

The project began in 2019 and is due to complete its recommendations in 2021–22. An international stakeholder workshop will be held in late 2021/early 2022, at which point this page will be updated.

  • M Mourby, ‘“Leading by Science” through Covid-19: The GDPR & Automated Decision-Making’ (2021) 5 International Journal of Population Data Science
    The UK government announced in March 2020 that it would create an NHS Covid-19 ‘Data Store’ from information routinely collected as part of the health service. This ‘Store’ would use a number of sources of population data to provide a ‘single source of truth’ about the spread of the coronavirus in England. The initiative illustrates the difficulty of relying on automated processing when making healthcare decisions under the General Data Protection Regulation (GDPR). The end-product of the store, a number of ‘dashboards’ for decision-makers, was intended to include models and simulations developed through artificial intelligence. Decisions made on the basis of these dashboards would be significant, even (it was suggested) to the point of diverting patients and critical resources between hospitals based on their predictions. How these models will be developed, and externally validated, remains unclear. This is an issue if they are intended to be used for decisions which will affect patients so directly and acutely. We have (by default) a right under the GDPR not to be subject to significant decisions based solely on automated processing. It is not obvious, at present, whether resource allocation within the NHS could take place in reliance on this automated modelling. The recent A-level debacle illustrates, in the context of education, the risks of basing life-changing decisions on the national application of a single equation. It is worth considering the potential consequences for the health service if the NHS Data Store is used for resource planning as part of the Covid-19 response.