[KEYPOINT]: A historic Federal Court decision says an artificial intelligence system is capable of being named as an “inventor” under the Patents Act 1990, with potentially significant ramifications for technological innovation and the patent system in Australia.

In the first judicial determination in the world of its type, the Australian Federal Court has held that artificial intelligence systems or devices can be “inventors” for the purpose of the Patents Act 1990 (Cth) (Thaler v Commissioner of Patents [2021] FCA 879).

Corresponding patent applications naming artificial intelligence system DABUS as the inventor and Dr Thaler as the owner of the DABUS inventions have been rejected by Patent Offices and Courts in other jurisdictions including the US, UK and Europe.

Although a patent application may name an inventor as an artificial intelligence system, only a legal person can be the applicant for a patent or be granted a patent. The owner and controller of the artificial intelligence system may derive title to a patent from an AI “inventor”.

This decision opens up the possibility of patent applications for inventions created by AI systems/devices in various scientific fields and industries.

The decision may be appealed to the Full Federal Court.

Patent application, DABUS and decision of Deputy Commissioner

An Australian patent application filed by Dr Stephen Thaler named an artificial intelligence system, described as a Device for the Autonomous Bootstrapping of Unified Sentience (DABUS), as the inventor.

DABUS is a form of neuro-computing that causes a machine to create new concepts, which are then “encoded as chained associative memories within the artificial neural networks”. It uses multiple generator artificial neural networks said to mimic the neural networks of the human brain, and is trained through both supervised and unsupervised learning. In this case, the alleged invention was the output of DABUS’ processes. The patent application includes claims to various products and methods relating to food containers, and to devices and methods for attracting enhanced attention, both employing convex and concave fractal elements. According to Dr Thaler, the invention was autonomously generated by DABUS.
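The generate-and-select idea described above can be loosely sketched in code. This is a purely illustrative toy with no relation to DABUS’ actual architecture: every function name, the novelty measure, and all parameters are invented for the example.

```python
import random

# Toy generate-and-select loop: a "generator" proposes candidate
# concept vectors, and the candidate furthest from everything already
# stored is retained as a new "memory". Purely illustrative.

def generate_candidates(n, dim, rng):
    """Propose n random concept vectors of the given dimension."""
    return [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]

def novelty(candidate, memory):
    """Squared distance of a candidate from its nearest stored memory."""
    if not memory:
        return float("inf")
    return min(sum((a - b) ** 2 for a, b in zip(candidate, m))
               for m in memory)

def run(steps=5, pool=10, dim=4, seed=0):
    rng = random.Random(seed)
    memory = []
    for _ in range(steps):
        candidates = generate_candidates(pool, dim, rng)
        best = max(candidates, key=lambda c: novelty(c, memory))
        memory.append(best)  # retain the most novel candidate
    return memory

if __name__ == "__main__":
    print(len(run()))
```

The point of the sketch is only that such a loop produces outputs no human directly specified, which is the feature on which the inventorship question turns.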

The Deputy Commissioner of Patents determined that the patent application did not comply with Reg 3.2C(2)(aa) of the Patents Regulations 1991, which requires the name of the inventor to be provided. Dr Thaler sought judicial review of the Commissioner’s decision by way of appeal to the Federal Court arguing that the Patents Act and Regulations do not preclude an AI system being treated as an inventor.

Beach J set aside the decision of the Deputy Commissioner and remitted the matter to the Deputy Commissioner for determination in accordance with the Court’s judgment.

Who (or more importantly, what) can be an “inventor”?

The key question was whether an AI system is capable of being an “inventor” under the Patents Act. While Beach J accepted that patents can only be granted to persons, and an applicant for a patent must be a person, he found no basis to preclude an AI machine from being named as an “inventor”.

There is no definition of “inventor” in the Patents Act or Regulations. The judge considered that “inventor”, as an agent noun, describes the agent which invents: if an AI system is the agent which invents, it can be described as the inventor. He also indicated that the word “invents” may historically have described only humans, as they were the only ones who could invent; now that artificial intelligence machines may perform the same function, the word may describe them as well. In deciding that the natural definition of “inventor” can extend to AI, Beach J commented on the historical development of words and dictionaries:

“dictionaries are by their nature developed from historical usage. Accordingly, there would be no call for a definition directed to “something that invents” until that became possible. Similarly, prior to the development of the electronic computer, computations were made only by humans and the term “computer” referred to a person who made computations. But the term was correctly applied to a thing that made computations when such machines became available; this is now the dominant usage.”

Further, he compared the widening of the term “inventor” to the widening concept of “manner of manufacture” in light of new technologies. He stated that both terms derive from section 6 of the Statute of Monopolies and as a result “it makes little sense to be flexible about one but not the other”.

AI as “inventor” consistent with objects clause of Patents Act

The judge also considered that the broadening of the definition of inventor was consistent with the recently introduced “objects” clause in section 2A of the Patents Act: “to provide a patent system in Australia that promotes economic wellbeing through technological innovation and the transfer and dissemination of technology. In doing so, the patent system balances over time the interests of producers, owners and users of technology and the public”.

Rewarding innovations, his Honour held, irrespective of whether the innovator is human, would promote technological innovation and its public dissemination. In contrast, not recognising this reality could stifle innovation and lead owners of AI systems to protect patentable inventions as trade secrets. However, Justice Beach did not specifically engage with whether recognising AI systems and devices as inventors will promote “economic wellbeing”.

The judge held that recognising an AI system or device as an inventor was not inconsistent with the Patents Act or Regulations. He considered a number of provisions, in particular section 15 of the Patents Act, which provides that a patent may be granted only to a person. The Commissioner had argued that an AI system or device could not be an inventor because an AI inventor can have no title in the invention which it could transfer to the patent applicant.

But Beach J said this argument confused the question of ownership or control of a patentable invention, including who can be patentee, with the question of who can be an inventor.

Under section 15(1):

“a patent for an invention may only be granted to a person who:

(a) is the inventor; or

(b) would, on the grant of a patent for the invention, be entitled to have the patent assigned to the person; or

(c) derives title to the invention from the inventor or a person mentioned in paragraph (b); or

(d) is the legal representative of a deceased person mentioned in paragraph (b).”

One could argue from the language that the section assumes an inventor to be a person. However, Beach J concluded that, in principle, Dr Thaler could be granted a patent in relation to an invention made by an AI system such as DABUS under at least section 15(1)(c), and potentially also under section 15(1)(b).

In relation to section 15(1)(b), on established principles of property law, Beach J stated that Dr Thaler is the owner of the invention as the owner, programmer and operator of DABUS. Drawing an analogy with ownership of the progeny of animals, or with fruit or crops produced by the labour and expense of the occupier of land, he held it was incorrect to presuppose an earlier vesting of title in the inventor.

Somewhat controversially, the judge stated that section 15(1)(b) does not require the existence of an inventor at all. It requires no more than that the applicant is entitled to have the patent assigned to them in the event there is a grant – examples were given of circumstances where the invention was the subject of contract or had been misappropriated, giving rise to a legal or equitable right of assignment.

For Dr Thaler to derive title to the invention under section 15(1)(c), an assignment from the inventor was not necessarily required at all:

“if the party claiming an interest has an interest in the invention even if that interest has not been conferred by means of an assignment, that party can be said to derive the invention from the inventor.”

Beach J was of the view that Dr Thaler, as the owner and controller of DABUS, would own any inventions made by DABUS when they came into his possession. By deriving possession of the invention from DABUS, Dr Thaler prima facie derived title. Title can therefore be derived from an inventor “notwithstanding that it vests ab initio other than in the inventor. That is, there is no need for the inventor ever to have owned the invention, and there is no need for title to be derived by an assignment”.

It should be noted that the Court acknowledged that, in other circumstances involving inventions that are the output of an AI system, there could be various possibilities as to who would be the owner: e.g. the software programmer or developer of the AI system, the person who selected and provided the data used to train the AI system, or the owner of the AI system who invested capital to produce the output.

Ramifications of the decision in Thaler

The Thaler decision has generated some controversy. It also raises a number of related possibilities, including:

  • To be patentable, an invention must involve an inventive step when compared with the prior art base. A logical extension of this case is that the requisite person skilled in the art may now be considered to be a person assisted by AI. Will this decision therefore make it more difficult for patents by human inventors to be granted?
  • Will the common general knowledge now expand to include developments made by AI? If not, will it be harder to invalidate patents where the “inventor” is AI?
  • Complex ownership/entitlement issues may arise in the context of inventions developed by AI systems.
  • How will compliance with the clarity and support requirements in section 40 of the Patents Act be evaluated for these types of patent applications/patents? Will the support requirements in section 40(3) require the disclosure of the AI algorithm in the complete specification or something more?
  • Are these changes that should be instigated at the legislative level rather than being shoehorned into existing legislation?
  • Will there now be an influx of patent applications in Australia naming AI systems as “inventors”? AI is used extensively in a number of industries, in particular the pharmaceutical industry, which has used AI, for example, in the development of vaccines, drug repurposing, and determining 3D protein structures. We will have to wait and see…

_____________________________



Kluwer IP Law

8 comments

  1. A good PR stunt for Mr Thaler.
    The decision should stay down under.
    I still have strong doubts whether an AI system can be in the position to invent anything.
    In principle it can only do what it has been told to do.
    The beverage can is probably not inventive and the flashing light is at the limit of sufficiency.
    Both inventions have to do with fractals.
    AI should remain a playground for legal scholars and by no means be subject to sui generis legislation.
    The only legislation on AI that should be adopted concerns transparency. Without knowing the correlation algorithm and the training data, it will not be possible to trust the result.

  2. So is Section 15 of the AU Patent Statute better suited than Art.60, EPC, to the situation where an AI devises a solution to an objective technical problem? It seems so, for under Art. 60, EPC the default is that the inventor enjoys “title” to the invention and I don’t see how an AI can be the real owner of property. Should we start to think about amending the EPC to accommodate AI inventors?

    Of course, the world’s premier jurisdiction for guarding jealously the suite of legal rights enjoyed by “the inventor” is the home of the individual, the USA. Regardless of what the Appeal Court of Australia decides (if ever it is tasked with the question), we shall have to wait till the courts of the USA have addressed the issue before we know whether owners of AIs are going to be able, routinely, to name their machines as inventors.

    But don’t dismiss the possibility. Some corporate employers would very much like to be relieved of the burden of nurturing and motivating emotional and irrational human employee inventors. How convenient, instead, to be able every time to designate as inventor a compliant, obedient, predictable programmable machine.

    Remember the apocryphal tale from long ago, that everything that can be invented has been invented? Are we on the brink of its becoming true, with a first AI that spits out every solution to every technical problem, a second one (of ordinary skill) that declares each solution enabled and a third AI that finds every solution obvious?

  3. Yeah, plain and simple nonsense. Instead of making the Australian judicial system look capable of sensibly dealing with software matters, it makes a joke out of it.

    As one EPO BoA member once told me: there is always this one guy who wants the decision to be different just for the heck of it.

  4. Interesting case, interesting article.
    Some comments regarding the further possibilities raised:
    “A logical extension of this case is that the requisite person skilled in the art may now be considered to be a person assisted by AI”. I do not see how this extension follows in any logical manner from the case. The skilled person has always been a legal fiction, not to be confused with a real person or persons. There is, imho, no need for extending the legal fiction to an AI-assisted legal fiction, nor, indeed, should such an extension have any effect on the abilities of the skilled person.
    “Will the common general knowledge now expand to include developments made by AI”, also this statement seems somewhat off. It is immaterial whether an AI has contributed to developing common general knowledge or not; the original ‘authors’ of most of our common general knowledge are unknown (or forgotten), yet this does not change the value of the common general knowledge itself.
    Anyway, otherwise an interesting read, thanks again for sharing.

  5. The problem is that we cannot trust AI for its result as long as we do not know what the algorithm is about and what the training data were.
    It is thus very doubtful that common general knowledge can be supplemented by AI!

  6. I have got quite a few questions here…

    So, an “AI” can be an inventor – could then an animal (think of primates, crows, whales, and some others, which are probably far more intelligent than any “AI” built so far) be an inventor, too? In the US they cannot even have a copyright on a selfie…

    To take some steps back… CAD is now established in many (most?) disciplines. Thinking back more than 20 years, I heard lectures on some aspects of CAD algorithms for digital circuits (e.g. “simulated annealing”). Assuming such an optimization or design algorithm to be used, would its product be patentable? What claim would this be? product-by-design-process? Would the product of a patented CAD-program be the “product” and as such be protected?

    Back to “AI”: as far as I understand, an “AI” is simply a specialized program/algorithm. You give it some defined input and will get a certain output (maybe add a random number generator and the output will be less clearly defined). So what distinguishes an “AI” from a more usual program/algorithm, so that one of the products is patentable and the other isn’t?

    To me, this all doesn’t make sense. I hope PETA, the Humane Society or similar will sue Mr. Thaler when he switches off DABUS. 😉

  7. Among the issues cited in the post, one relates to existing AI tools: if AI tools can be easily developed as an efficient substitute for very lengthy and costly real-life experiments, e.g. to predict the biomedical performance of a plurality of molecules, could this be taken into account by judges to shift their assessment of the « undue burden to the skilled person »?
    Regarding the DABUS cases, there is no question that this has been a highly successful publicity stunt. It is interesting on this subject to read Stephen Thaler’s answers to FAQ 2 in his presentation « What is DABUS », available on his website:
    “FAQ2. How come Thaler hasn’t written a ‘landmark’ paper on DABUS?
    In short, Thaler has written a landmark patent on DABUS and submitted it to totally unbiased subject matter experts (a.k.a., patent examiners) for approval. Besides, IEI is a business and not formally a part of academia where professors are paid to spend most of their time writing papers.
    Then again, our founder has written peer-reviewed papers on DABUS that are purposely a bit cryptic considering the related patent that was then in prosecution.”
    This makes it clear that Mr Thaler has not been eager to submit his claims that DABUS works without human input to peer review by the academic community of AI experts. As to the submission of the DABUS cases to patent examiners, Mr Thaler’s answer misses a very relevant point: patent examiners are not competent to assess the accuracy of the inventor designation (Rule 19(2) EPC) and can only take the applicant’s designation at face value.
    As to the validity of this claim and the decision of the Australian judge, there is a confusion regarding what is an « invention ». The output of an AI tool is not in itself an invention. An invention exists when an AI output is recognised as valuable in view of an objective. This requires intent – the ability to define an objective and the means toward this objective – and consciousness, without which an output cannot be identified and confirmed as valuable for the objective. Both intent and consciousness are purely human attributes.

Comments are closed.