Welcome to the Law and Technology Research Group

We are a large and diverse community of University postholders, researchers, research associates, academic affiliates, academic visitors and students with a shared interest in the legal, regulatory and governance issues raised by new technologies. Our current interests range from CRISPR and assisted reproductive technologies to COVID-19 track and trace systems, Big Data analytics, AI and machine learning, and renewable energy sources. We are concerned with the ways in which new technologies affect every aspect of our daily lives, the communities we live in and our future societies. A wide range of events, student opportunities, research and other activities related to Law and Technology are undertaken across the Oxford Law Faculty.

If you would like to learn more or get involved with our Research Group, please contact us.

Professor Jane Kaye and Professor Justine Pila, Co-Chairs



  • M Mourby, H Gowans, S Aidinlis and H Smith, 'Governance of academic research data under the GDPR—lessons from the UK' (2019) 9 International Data Privacy Law 192
    DOI: 10.1093/idpl/ipz010
    The General Data Protection Regulation (GDPR) includes a new power for Member States to pass exemptions for the purpose of ‘academic expression’. This may appear to provide greater freedom to researchers working under the new EU data protection regime. Using the UK as a case study, however, it is evident that even a full exercise of the academic derogation is likely to be limited by the GDPR’s requirement of necessity, and by privacy rights wherever they are engaged. Ultimately, the GDPR provisions applicable to universities as public authorities are likely to have greater impact on academic data processing in public institutions; a shift which is not conducive to greater freedom in research data processing.
  • J Bell, S Aidinlis, H Smith and H Gowans, 'Balancing data subjects' rights and public interest research: Examining the interplay between UK Law, EU Human Rights Law and the GDPR' (2019) 5 European Data Protection Law Review 43
  • J Pila, 'Adapting the ordre public and morality exclusion of European patent law to accommodate emerging technologies' (2020) 38 Nature Biotechnology 559
    European patent law prohibits the grant of a patent for any invention the commercial exploitation of which would be contrary to morality or ordre public (public policy). Since the introduction of the EU’s Biotech Patenting Directive in 1998, the “morality” aspect of this prohibition has been given new life with assistance from the Charter of Fundamental Rights and the Court of Justice of the European Union. In contrast, the “public policy” aspect remains largely dormant owing to its restrictive interpretation over many years by the European Patent Office. If the patent system is to have democratic and social legitimacy in an era of accelerating technological change and uncertainty, that interpretation needs to be revisited and a new method introduced for assessing the moral and public policy implications of patent applications. This should not be difficult to achieve, since such a method already exists and has been deployed in other areas of technology regulation.
  • M Mourby, '‘Leading by Science’ through Covid-19: the GDPR & Automated Decision-Making' (2021) 5 International Journal of Population Data Science
    DOI: 10.23889/ijpds.v5i4.1402
    The UK government announced in March 2020 that it would create an NHS Covid-19 ‘Data Store’ from information routinely collected as part of the health service. This ‘Store’ would use a number of sources of population data to provide a ‘single source of truth’ about the spread of the coronavirus in England. The initiative illustrates the difficulty of relying on automated processing when making healthcare decisions under the General Data Protection Regulation (GDPR). The end-product of the store, a number of ‘dashboards’ for decision-makers, was intended to include models and simulations developed through artificial intelligence. Decisions made on the basis of these dashboards would be significant, even (it was suggested) to the point of diverting patients and critical resources between hospitals based on their predictions. How these models will be developed, and externally validated, remains unclear. This is an issue if they are intended to be used for decisions which will affect patients so directly and acutely. We have (by default) a right under the GDPR not to be subject to significant decisions based solely on automated decision-making. It is not obvious, at present, whether resource allocation within the NHS could take place in reliance on this automated modelling. The recent A Level debacle illustrates, in the context of education, the risks of basing life-changing decisions on the national application of a single equation. It is worth considering the potential consequences for the health service if the NHS Data Store is used for resource planning as part of the Covid-19 response.