Let me begin with a disclaimer: I am no expert in Australian law. However, it seems to me that the Federal Court of Australia (FCA) has made a series of missteps in the DABUS case (Thaler v Commissioner of Patents [2021] FCA 879) that led it to conclude that an AI-driven system can be an inventor. As a similar DABUS case is currently pending before the European Patent Office, another was decided by the UK Court of Appeal last week (Thaler v Comptroller General [2021] EWCA Civ 1374), and a number of companies employ AI in their inventive endeavours, I find it relevant to discuss the FCA’s argumentation.

DABUS is an AI-driven system which is currently on a world tour of courts and patent offices with the claim to have invented the following subject matter: a food container, and devices and methods for attracting enhanced attention. More specifically, the Australian DABUS case dealt with the questions of whether an AI-driven system can be an inventor under patent law and how, if at all, its owner, programmer and operator can be granted a patent for the invention it generated. On 30 July 2021, the FCA concluded that a ‘patent can be granted to a legal person for an invention with an artificial intelligence system or device as the inventor’ ([200]).

First, the FCA reasoned that an AI-driven system can be an inventor because no provision in the Australian Patents Act (PA) or international Patent Cooperation Treaty (PCT) ‘expressly refutes the proposition that an artificial intelligence system can be an inventor’ ([118] and [72]). This line of reasoning suggests that the concept of inventor should then include cats, dragons and aliens, as they are not expressly excluded under the PA or PCT either.

Second, the FCA relies on the absence of a definition of ‘inventor’ in the PA and PCT ([59] and [71]). However, no conclusions on the eligibility of an AI-driven system for inventorship can be drawn from such an absence. The PCT is a purely procedural instrument. It establishes a Union for cooperation in the filing, searching, and examination of applications, with no aim of harmonising substantive patent law. It is therefore not surprising that it does not define concepts of substantive law such as ‘inventor.’

As far as the concept of inventor under the PA is concerned, the FCA finds that as a matter of ordinary language, ‘the word “inventor” is an agent noun. […] Accordingly, if an artificial intelligence system is the agent which invents, it can be described as an “inventor”’ ([120]). The FCA finds its inclusion of an AI-driven system in the concept of inventor consistent with the Australian High Court’s ‘flexible and evolutionary’ interpretation of the concept of invention (manner of manufacture) in D'Arcy v Myriad Genetics Inc [2015] HCA 35 ([121]). The FCA’s linguistic argument is problematic because merely having the suffix ‘-or,’ typical of an agent noun, does not preclude a concept from being a purely legal one, such as ‘licensor’ or ‘trustor.’ Construing the meaning of a legal term by reference to its understanding in ordinary language, when that understanding itself comes from law, would be circular. If ‘inventor’ is indeed a term of legal art, then it should be interpreted within the context of patent law. (For instance, ‘invention’ is a legal term and is defined in the Dictionary which is Schedule 1 to the PA.)

Similarly, it is curious that the Australian case law discussing inventorship (ie what it means to be an inventor) relied upon by the Commissioner of Patents was discarded by the FCA on the ground that the cases did not deal with the issue of artificial intelligence ([91]). Quite naturally they did not, if one appreciates that the DABUS cases are historically the first cases involving a claim for AI inventorship. Is the emergence of AI a reason for ignoring a body of established law? (Just a rhetorical question.)

Moreover, according to the FCA, there is ‘no specific aspect of patent law, unlike copyright law involving the requirement for a human author or the existence of moral rights, that would drive a construction of the Act as excluding non-human inventors’ ([119]). Contrary to the FCA’s statement, moral rights of human inventors in respect of their inventions are internationally recognised: Article 4ter of the Paris Convention, read in conjunction with the WIPO Guide to the Application of the Paris Convention, recognises their right to be named as such in the patent. Australia has been a party to the Paris Convention since 1925.

The FCA also touched lightly on the issue of patentability of objects generated by AI-driven systems. It opined that no mental state or act of the inventor is required to meet the patentability requirements ([141]). However, this conflicts with the very authority on which the FCA relied to support its flexible and evolutionary reading of inventorship. In D’Arcy v Myriad Genetics Inc, the High Court held that patents cannot be granted for isolated gene sequences on the ground that such sequences are not human creations (inventions). ‘[A]n invention,’ it held, ‘is something which involves “making”. It must reside in something. It may be a product. It may be a process. […] Whatever it is, it must be something brought about by human action.’ (D’Arcy [6]) Moreover, according to the High Court, subject matter that does not ‘involve human intervention’ lacks ‘the necessary quality of inventiveness to qualify as a manner of manufacture’ (D’Arcy [136]). By treating a human act or intervention as a precondition for patentability, the High Court aligned Australian patent law with European and US patent law.

Finally, the FCA also fails to explain how, assuming that DABUS is an inventor, its owner, programmer and operator (Mr Thaler) might derive the title to be granted the patent. The FCA relied here on the ordinary meaning of ‘derive,’ as including ‘to receive or obtain from a source or origin, to get, gain or obtain, and emanating or arising from’ ([177]-[179]). However, the statutory text in PA, s 15(1)(c) provides ‘derives title to the invention,’ not merely ‘derives.’ Since an AI-driven system does not have legal personhood and thus cannot hold legal title to anything, no title can be derived from it in any of the above senses.

To conclude, the FCA’s argumentation seems to have surrendered to the tempting question: why not? (eg [121]). But is it sound to base the admissibility of an AI inventor on the absence of reasons why not, rather than on reasons why yes? What is the cost-benefit analysis for granting patents for AI-generated objects? For centuries, human inventors have been challenged to give good reasons why yes and why they merit protection. As a result, the current patent system is built around the acceptable answers as to why yes. Unfortunately for AI, however, the indispensable justifications, such as the reward and incentive theories, rest on the human nature of inventors.

Eva Janečková (née Stanková) is a DPhil in Law candidate at the University of Oxford.