The facts of the case
On 21 November 2023, Sir Anthony Mann, sitting in the High Court in London, handed down a judgment in the case of Emotional Perception AI Ltd v Comptroller-General of Patents, Designs and Trade Marks [2023] EWHC 2948 (Ch). This judgment addressed the question of whether artificial neural networks (ANNs) are excluded subject matter under the ‘program for a computer’ exclusion of section 1(2)(c) of the UK Patents Act 1977.
The claimed invention concerns a system for providing media file recommendations (eg, music tracks) to an end user, including sending a file and a message in accordance with the recommendation. The advantage over existing systems lies in offering suggestions of music that is similar in terms of human perception and emotion, irrespective of the genre of the music or the similar tastes of other users.
The invention uses two neural networks. The first analyses music files accompanied by natural language descriptions (eg, happy/sad song) to create a semantic space of co-ordinates in which semantically similar songs lie closer to each other. The second analyses tracks via human-set parameters (eg, tone, timbre, speed) and makes the distances between pairs of property co-ordinates converge or diverge in alignment with their separation in the semantic space.
The trained network is used in applications where music tracks are passed through it to produce outputs. These outputs are compared to a database from which recommendations about similar tracks are generated and provided to the end user by sending a message and a file.
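By way of illustration only, the operation stage described above amounts to a nearest-neighbour comparison between the vector the trained ANN produces for a track and the vectors stored in a database. The following toy Python sketch is not taken from the patent; all track names, vectors, and the choice of Euclidean distance are invented for illustration:

```python
import math

# Hypothetical illustration of the recommendation step: a query track's
# output vector (as produced by the trained ANN) is compared against a
# database of stored vectors, and the closest tracks are recommended.

def euclidean(a, b):
    """Distance between two co-ordinate vectors in the property space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recommend(query_vec, database, k=2):
    """Return the k tracks whose stored vectors lie closest to the query."""
    ranked = sorted(database.items(), key=lambda item: euclidean(query_vec, item[1]))
    return [track for track, _ in ranked[:k]]

database = {
    "track_a": [0.9, 0.1],   # eg, an upbeat song
    "track_b": [0.8, 0.2],
    "track_c": [0.1, 0.9],   # eg, a melancholic song
}

print(recommend([0.88, 0.12], database))  # the two nearest neighbours
```

On this simplified picture, the ‘recommendation’ is nothing more than ranking stored co-ordinates by distance from the query; the substance of the invention lies in how the training aligns those co-ordinates with human perception.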
The Hearing Officer at the UKIPO found that the use of an ANN engages the ‘program for a computer’ exclusion and refused grant (BL O/542/22). On appeal, the applicant challenged the decision on two main bases: (1) that there is no computer program in the claimed invention, and (2) that even if there is one, the exclusion does not apply to it because the program is not claimed ‘as such’ and the claim reveals a technical contribution.
The findings of the court
To determine whether the exclusion applies, the High Court considered two key questions: (1) what is a computer? and (2) where is the program that engages the exclusion?
The court recognised that ANNs can have both hardware and software implementations. Indeed, neural networks can be implemented in hardware, which contains the physical nodes and layers, or in software which ‘emulates’ them. The court found these implementations to be functionally equivalent.
The court first considered whether a hardware ANN is a ‘computer’ and whether ANNs emulated in software run on a ‘computer’. Taking a functional approach, the court held that a ‘computer’ should not be defined by the fact that it runs programs (§41) but by reference to its functions and activities, which consist of processing data (§40). The court thus found that both hardware and ‘emulated’ ANNs run on what is normally understood as computers.
Subsequently, the court assessed whether there was a program to which the exclusion applies. The analysis focused on whether there was any activity taking place which might be called programming in relation to either the ANN or the data. The court looked at two phases where there might be a computer program: the training stage and the operation stage, where the trained ANN is used.
Regarding the training stage, the court found that it involved the computer program that initiates the training (eg, a training script). This stage clearly involved programming, but the court deemed it ‘not correct to view the whole thing as some sort of overall programming activity for the purposes of the exclusion’ (§59). In other words, the programming was limited to setting the training objectives, the choice of a learning algorithm, hyperparameter optimisation etc., and there was no programming beyond this stage. Since this program was only a ‘subsidiary part’ of a claim that extended beyond it, the exclusion did not apply.
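The boundary the court drew can be pictured with a toy training script. The example below is purely illustrative (it is not from the judgment or the application, and the data and hyperparameters are invented): the programmer deliberately writes the objective, the update rule and the hyperparameters, while the final weight value emerges from the automated loop rather than from any further act of programming.

```python
# Toy, invented example: the human-written part is the script itself
# (loss function, update rule, hyperparameters); the learned weight
# is produced by the automated loop, not by further programming.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs

learning_rate = 0.01   # hyperparameter chosen by the programmer
epochs = 500           # hyperparameter chosen by the programmer
w = 0.0                # weight; its final value is 'learned', not coded

for _ in range(epochs):
    for x, y in data:
        pred = w * x                 # forward pass
        grad = 2 * (pred - y) * x    # gradient of the squared error
        w -= learning_rate * grad    # automatic weight adjustment

print(round(w, 2))  # converges towards 2.0, the slope implicit in the data
```

The court's point, in effect, was that only the lines above the loop reflect deliberate programming; the commentary below questions whether that boundary can bear the legal weight placed on it.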
The more intricate question was whether the learning (ie, the automatic adjustment of the weights and biases) and the subsequent operation of the trained ANN was a computer program. To answer this question, the court looked at whether the ANN was following deliberate programming, or if it had ‘trained itself’ to apply its weights and biases to new data and to generate vector co-ordinates. Drawing on the alleged equivalence between hardware and emulated ANNs, the court took the view that an emulated ANN does not implement code given to it by a programmer but implements nodes and layers ‘created by the ANN itself’ (§54). An emulated ANN, therefore, could be decoupled from the underlying software on a computer in the same way a hardware ANN can be decoupled from other software that may be running on the same hardware. In other words, if the hardware ANN is ‘not operating a program then neither is the emulation’ (§56).
In light of these observations, the court found that the claim was not to a computer program at all, so the exclusion was not invoked.
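What the trained ANN actually does at the operation stage can be sketched in a few lines. The toy network below is invented (its weights stand in for values fixed during training, and the architecture is arbitrary); it shows that, once trained, the ‘emulated’ network simply applies stored weights and biases to new data to generate a co-ordinate, which is the activity the court declined to characterise as a program.

```python
import math

# Invented toy network: two inputs -> two hidden nodes -> a 2-D co-ordinate.
# The weights and biases below stand in for values fixed during training;
# at the operation stage the code merely applies them to new inputs.

WEIGHTS_HIDDEN = [[0.5, -0.2], [0.1, 0.4]]   # learned, not hand-written
BIAS_HIDDEN = [0.0, 0.1]
WEIGHTS_OUT = [[0.3, 0.7], [-0.6, 0.2]]
BIAS_OUT = [0.05, -0.05]

def layer(inputs, weights, biases):
    """One dense layer with a tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def coordinates(track_features):
    """Map a track's features to a point in the property space."""
    hidden = layer(track_features, WEIGHTS_HIDDEN, BIAS_HIDDEN)
    return layer(hidden, WEIGHTS_OUT, BIAS_OUT)

print(coordinates([0.9, 0.1]))  # a 2-D co-ordinate for the track
```

Whether one regards this fixed sequence of arithmetic operations as ‘created by the ANN itself’ or as the output of a conventional programming exercise is precisely the point on which the commentary below takes issue with the court.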
The court also entertained the possibility that its conclusion might be wrong and discussed the possible technical contribution of a hypothetical computer program claim. Referring to the case law in Halliburton and Protecting Kids, the court found that producing and moving files could meet the relevant criteria, even though such steps may not be technical in themselves. Nevertheless, the ANN was found to have ‘gone about its analysis and selection in a technical way’. In particular, identifying a file as semantically similar to another by applying technical criteria which the system has worked out for itself was found to be a technical effect outside the computer (§76). This technical effect was said to preclude the application of the exclusion.
Finally, the court addressed, obiter dictum, the mathematical method exclusion under section 1(2)(a) of the Patents Act 1977. The court seemingly subscribed to the view that while ANNs and their training methods per se are no more than an abstract mathematical algorithm, the specific application (eg, a file recommendation engine) was enough to dispense with the mathematical method objection (§§81-82).
Commentary
This decision was welcomed as the dawning of a new era for AI-related inventions. However, the court’s take on the nature of neural networks and the technical contribution of the invention, if followed, could have adverse implications in future cases.
First, the functional view of a computer is problematic because it departs from established concepts in the field of computer science (e.g., the stored-program computer or the universal algorithm). The court’s finding that the hardware implementation of an ANN is a computer is correct, but the reasoning, which focuses mostly on the fact that the machine is processing data (§40), is not. Whether the neural network is implemented on hardware or emulated in software does not change the nature of what it does, i.e., computations based on mathematical operations. The interaction between a general-purpose computer and a program running on it sets the standard of ‘normal’ physical interactions. This standard is used as a benchmark under both the examination approach of the EPO and the AT&T signposts in the UK to determine whether the execution of a computer program produces further technical effects. Precise use of these concepts is therefore essential to properly assess a computer program’s technical contribution.
Second, the argument that ANNs implemented in software are ‘emulated’ hardware ANNs is not convincing. The difference between hardware and software-implemented ANNs lies in the underlying architectures used for the computations. In T 0702/20, the EPO Board of Appeal underscored that ANNs relate to both computer programs and mathematical methods and that they operate according to the programming of their structure and learning scheme (r.10). To reason about the nature of ANNs solely by analogy with hardware implementations is risky. By way of example, one may rely on this to argue that any computer-implemented simulation of a technical system is also technical – a view explicitly rejected by the EPO in G1/19 (r.120).
Third, limiting the activity of programming to only those steps where a person gives a set of instructions to a computer to perform a task is not supported by actual practice. As the Hearing Officer noted, an ANN operates within the boundaries of the problem and the training approach defined by the programmer (§61-62 Decision, cited in §45-46 judgment). The learning process involves fully automated steps, but this is simply a feature of the algorithm that is still ultimately executed on a computer as a result of deliberate programming. The nature of the activity is not changed by the fact that the output is a trained neural network and not directly executable architecture-specific object code.
Conclusion
In the wake of the judgment, the UKIPO was quick to adapt its statutory guidance on the examination of patent applications involving ANNs. Under this new guidance, examiners should no longer object to inventions involving ANNs under the ‘program for a computer’ exclusion. This practice is likely to be at odds with the EPO approach, which treats neural networks as involving both computer programs and mathematical methods. Finally, it is hard to imagine an EPO examiner recognising a technical effect in moving semantically similar files. Although it is too early to say, this judgment seemingly departs from the ‘problem-solution’ approach in assessing inventive step, thereby increasing the risk of fragmentation to the detriment of applicants and patent offices alike.