Executive Summary

On 21 November 2023, the English High Court handed down its decision in Emotional Perception’s Application [CH 2022-000144/GB 1904713.3]. In this complex matter, Bruce Dearling of Hepworth Browne instructed Mark Chacksfield KC and Henry Edwards of 8 New Square.

This decision is of importance to:

  1. The patentability of artificial neural networks “ANNs”;
  2. The distinction between ANNs and a “program for a computer” under the UK Patents Act and, particularly, the nature of programming;
  3. How ANNs should be viewed in relation to the statutory exclusion of section 1(2)(c) of the UK Patents Act 1977; and
  4. How technical contribution needs to be assessed for inventions producing some form of semantic result.

5th Dec 2023

A Change in UK Law and Practice under the Section 1(2) Exclusions: UK High Court

CH-2022-000144 Emotional Perception AI Limited vs Comptroller General of Patents

Artificial Neural Networks, Computer Programs & Patentability.

Specifically, the decision establishes that:

  1. An ANN is a ‘thing’ performing a function. Whether an ANN is implemented in hardware or software is irrelevant since the invention is agnostic of the implementation detail. Significantly, a claim to an ANN itself, or a claim to a method of training an ANN, has been held to be viable under UK patent legislation. A trained ANN, or a claim to a method of training an ANN, would therefore not be excluded under section 1(2) of the Act since the ANN can be viewed as possessing an external technical effect. This prevents the exclusion from patentability from applying [78].
  2. In assessing technical contribution, it is critical to identify the invention at the right level of generality. The approach previously adopted by the UKIPO was heavily criticised. Focusing on physical effects, rather than technical effects and how these are achieved, has previously led to the inappropriate finding of statutory exclusion following a superficial dismissal of the actual technical contribution made by the invention.
  3. An emulated ANN is, in substance, operating at a different level from the underlying program on the computer. A computer program to train an ANN is different to a computer program used to implement the trained ANN. ANNs are therefore not computer programs for the purposes of the Act since there is a decoupling of functions [56], [58].
  4. Since hardware and software implemented ANNs are equivalent, an emulated ANN does not implement code given to it by a human. Whilst the structure given to the ANN may well be a result of the programming, it is equivalent to any hardware realisation. The structure is self-created by the emulated ANN [54], [61].
  5. In relation to the training stage and whether a computer program “as such” exists at this point, the “programming” is based on the setting of training or learning objectives which support material development of a workable structure of the ANN. This involves some nuanced understanding of the form of a program, and reflects that aspects of ANNs are computer-implemented inventions operating at a different level of generality and so not [necessarily] excluded. A product-by-process claim reciting learning objectives is therefore not a computer program at all, not least because the network parameters, i.e., the weights and biases of individual neurons, determined during training are not part of the computer program.
  6. A technical contribution should not be precluded for the reason that the possible subjective effect is within a user’s own non-artificial neural network. 
  7. The sending of a message over a network overcomes the notion of the invention being a computer program “as such” [76].


This decision is widely applicable to the AI industry and is likely to apply to other technology industries in which the contribution is tied to the effect delivered, especially by Computer Implemented Inventions (CIIs). 

The decision may yet be appealed by the UKIPO. Regardless, the decision should force an immediate change in both the treatment of and practice relating to the patentability of ANNs, their set-up during training and, likely also, the patentability of technically non-trivial innovations making use of Artificial Intelligence “AI.” A change in the UKIPO’s Examination Guidelines on AI and the patent protection afforded to such technology may therefore be expected, not least because the now overturned Hearing Officer’s decision was cited in the most recent guidelines “Examining patent applications relating to artificial intelligence (AI) inventions: The Guidance - GOV.UK (www.gov.uk).”

The arguments made during the appeal appear to be relevant to debunking similar preliminary objections, should they arise, under the corresponding provisions of Art.52(2) EPC. This decision arguably further demonstrates that (a) formerly applied approaches of the UKIPO in its assessment of technical contributions were superficial and inappropriate and (b) the UKIPO approach to assessing statutory exclusion may have led to materially different results on patentability in view of the level of detail at which exclusion is assessed. Those are issues for another day.

The Invention and Observations on ANNs

Emotional Perception AI Limited, a U.K. AI innovation company controlled under an umbrella organisation Time Machine Capital Squared “TMC2,” developed an ANN-implemented mechanism for recommending to an end user a semantically relevant file, such as a music track or a video. The recommendation was selected based on an assessment of semantic closeness relative to reference criteria extracted from a query and as defined by the user. This assessment was determined during a forward pass through a trained ANN. This forward pass was operative solely to extract physical properties from the query and to correlate those in a continuous multi-dimensional property embedding space formed to retain semantic information. The training made use of a “learning objective” concept which converged a pairwise separation distance for a pair of files in property embedding space towards the pairwise separation distance for the same pair of files in semantic embedding space. The generated property vectors and semantic vectors were related but obtained from independent sources, namely a human produced literal expression converted by Natural Language Processing “NLP” into each semantic vector for each file, and the property vector obtained directly from the extraction of identified properties from each file. The independent claims of Emotional Perception’s application were structured as “product-by-process” claims, i.e., a new ANN self-formed by computer-implemented processes embodying learning objectives.
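The “learning objective” described above can be sketched in a few lines. This is an illustrative sketch only, not code from the application: the function names, the vectors and the squared-error form are assumptions, but they capture the idea of converging the pairwise separation of a pair of files in property embedding space towards their pairwise separation in semantic embedding space.

```python
# Hypothetical sketch of the pairwise learning objective described above.
# All names and values are assumptions for illustration.

def distance(u, v):
    """Euclidean separation between two embedding vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def pairwise_objective(prop_a, prop_b, sem_a, sem_b):
    """Squared error between property-space and semantic-space distances.

    Training drives this value towards zero, converging the separation of
    the pair in property space towards their separation in semantic space."""
    return (distance(prop_a, prop_b) - distance(sem_a, sem_b)) ** 2

# Files that are semantically close but currently far apart in property
# space yield a large loss, pulling them together during training.
loss = pairwise_objective(
    [0.2, 0.9], [0.8, 0.1],   # property vectors (extracted by the ANN)
    [0.5, 0.5], [0.5, 0.6],   # semantic vectors (NLP-derived, independent)
)
```

The two distances come from independent sources, mirroring the description above: the semantic vectors are fixed targets produced by NLP from human descriptions, while the property vectors are what the ANN learns to produce.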

The strength of ANN technology is that ANNs can be structured to address intractable problems that cannot be simply coded.

In a program for a computer, especially those which are excluded from patentability, one generally knows the result that is to be achieved, i.e., there is a known prior output or expected condition. For example, following coding (whether in an ANN solving a technically trivial problem or otherwise), such known results may include (i) a binary “yes/no” decision on something like a financial loan, (ii) a set of identified prevailing logic conditions determining whether a player in a computer game should lose a life, (iii) the answer to a square root in mathematics (such as in Gale’s Application [1991] RPC 305), or even (iv) whether an individual has broken an obligation within some form of defined code of morality.

Excluded programs for computers are often technically banal and typically represent the automation of a non-technical process that has no ancillary or saving technical outcome. The excluded computer programs solve no meaningful problem that is technical in nature, e.g., merely calculating a mathematical number or permitting the playing of a basic game, such as noughts and crosses or the like. Computer programs of this ilk are a manual coding of an articulated non-technical function that, by its nature, is a game, business method or mental act. 

In contrast, patentable CIIs are based on processes that are either unknown at the outset or are otherwise technically innovative processes that deliver a meaningful technical output having some underlying technical synergy. This latter point is, to some extent, reflected in the IBM computer program products decision T1173/97-Computer Program Product.

In relation to Emotional Perception’s claimed ANN-based system, the reason that the ANN’s output was unknowable at the outset is that the output is not a binary ‘yes/no,’ but was instead highly complex and variable and related to an intractable problem. The issue of “unknowing” is not unique to Emotional Perception’s ANN and applies to other ANNs dealing with intractable problems. It is therefore not a simple question of just programming/coding to train the ANN, but rather an appreciation that setting the stage requires inventive conceptualization of suitable learning objectives that can support the definition of a loss or error function against which the accuracy of a prediction made by the ANN during training can be assessed. Such a definition of technical learning objectives is the job of a research scientist and not a programmer [who essentially codes and debugs]. In fact, the job of a computer programmer is to translate an algorithm into computer code, not to come up with the conceptual algorithm, i.e., an effective set of learning objectives, in the first place.

If an ANN merely implements a set of well-known rules with a known output, then selection of the learning objectives can essentially be taken off-the-shelf. In such situations (as occurs with, for example, a decision on a bank loan based on the applicant’s employment status, health, age, credit ratings, adverse court judgements “CCJs” and other personal parameters), the inputs are themselves known and already used in existing commercial environments so, whilst the resultant ANN is new, it contributes nothing to human endeavour. Conversely, in Emotional Perception’s exemplary audio solution, the trained ANN took an unknown audio track selected by a person having unknown preferences and then extracted data from that audio track to provide, ultimately, an output that was unknown at the point of training the network. Learning objectives in all patentable instances provide a mechanism by which a viable ANN can be assembled to address such an intractable problem; they actually define a new computer-implemented invention process that accommodates the unknowable. The learning objectives do not represent simple or trivial concepts but clearly do require, at some point, coding as a computer-implemented invention. The point is that the learning objectives, in Emotional Perception’s application, were complex and technically non-trivial in nature. It is patentably immaterial that these learning objectives require some form of coding since the coding is immaterial to the functional technical result that they bring to the resulting ANN. 

The selection of a processor-based system, rather than a hardware solution, to realise the ANN is merely a design choice. Both function to achieve the same outcome.


As will be understood, training of the ANN further requires an assessment of vectorial closeness and adaptation of the weights and biases of neurons within the ANN through a backpropagation mechanism. During training, backpropagation closes the gap between the desired output/prediction and the current output from the ANN, with backpropagation therefore resolving the defined loss function of the network. Backpropagation is therefore a different process, at a different level of granularity, to the definition of learning objectives. Backpropagation is essentially just a simple but secondary mathematical operation within training epochs for the ANN.
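The relationship described above, in which backpropagation mechanically minimises a loss that the learning objectives have already fixed, can be illustrated with a deliberately minimal sketch. A one-weight “network” stands in for the ANN; the names, values and learning rate are hypothetical.

```python
# Minimal, hypothetical sketch of the training loop described above.
# The loss (squared error) is fixed by the learning objectives; the
# update rule below is the "secondary mathematical operation" repeated
# within each training epoch.

def train(x, target, w=0.0, lr=0.1, epochs=100):
    for _ in range(epochs):
        pred = w * x            # forward pass of a one-weight "network"
        error = pred - target   # gap between current and desired output
        grad = 2 * error * x    # gradient of the squared-error loss w.r.t. w
        w -= lr * grad          # backpropagation step (gradient descent)
    return w                    # the weight is then fixed for deployment

w_final = train(x=2.0, target=6.0)   # converges towards w = 3
```

Once the prediction is consistently acceptable, the loop stops and the learned weight is frozen, mirroring the permanent fixing of weights and biases noted below.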

Once training of the ANN generates a prediction that is consistently acceptable, the network of neurons and their associated weights and biases are permanently fixed.

The Original UKIPO Decision and Reasoning

The present appeal arose from a decision of the Comptroller General of Patents to refuse the application under section 1(2)(c) of the UK Patents Act on the grounds that (a) the claimed invention was a program for a computer “as such,” and (b) a recommendation of a semantically similar file was not a technical effect.

The Hearing Officer concluded that, “The programmer defines the problem and the training approach, and the ANN operates within those boundaries to build a suitable model. This is still no more than a computer program in my opinion… even if this is so, key to the contribution is to specify the training method (pairwise comparison) and objective (converging distances), and this is no more than a computer programming activity,” thereby treating the entire system of both training and forward pass in the trained ANN as being a programming activity.

The Hearing Officer also rejected the suggestion that, following generation of an output from the ANN, the subsequent provision of that output (as a file or the like) over a communications network was a relevant technical effect. This contrasted with the decision in Protecting Kids the World Over (“PKTWO”) Ltd’s Application [2011] EWHC 2720 (Pat), in which the act of network transfer was a relevant external effect sufficient to overcome the statutory exclusion. In the decision under appeal, the Hearing Officer concluded at [69] that the communicated semantically close file was only a “beneficial effect… of subjective and cognitive nature and does not suggest there is any technical effect over and above the running of a program on a computer” and that an “ANN-based system for providing semantically similar file recommendations is not technical in nature.” The Hearing Officer justified his position by stating that the sending of the recommendation was achieved in a standard fashion within a conventional computer network, and distinguished PKTWO and T208/84-Vicom on the basis that, in Vicom, the file (a photograph) was changed by the process. In Emotional Perception’s invention, the Hearing Officer contended, the output file is not altered in any way, and the assessment of the emotional nature of the file “is not a technical process.”

In the UKIPO’s initial decision, the mathematical exclusion was, according to the Hearing Officer, not engaged by the claims. This point of exclusion was belatedly raised by the UKIPO post-trial, and only in response to queries later raised by the judge, which he deemed necessary to better understand the UKIPO’s confused case. The mathematical exclusion is, however, limited under case law, and evidently not engaged by the need first to conceptualize appropriate learning objectives.

The Decision

At the initial hearing, the construction of the claim and its contribution was agreed, especially in terms of what advantages were said to exist at the output of the ANN. This was based on the application of the test in Aerotel Ltd v Telco Holdings Ltd [2007] RPC 7, as then followed by the enquiry into technical effects under the “Signpost” guidelines of AT&T Knowledge Ventures LP's Patent Application [2009] EWHC 343 (Pat), [2009] FSR 19. The “better computer” questions were ignored since they were not relevant.

The Appellant’s position, from the outset, was that:

  1. The computer program exclusion is not engaged at all; one does not get as far as finding a relevant computer program. 
  2. The reasoning of the Hearing Officer failed to acknowledge a line of cases described as the “patentable ignoring a computer program” line of cases, e.g., Vicom and CFPH's Application [2005] EWHC 1589 (Pat), [2006]. 
  3. If there is a computer program and the exclusion is prima facie engaged, it does not apply because the claim reveals a technical contribution in the sending of the file recommendation. Consequently, the claim is not to a program for a computer “as such.”

During the trial, the UKIPO conceded that implementation of a neural network in hardware was not likely to be excluded from patentability. This concession was not immaterial.

The judgement stated that:

“Ms Edwards-Stuart’s concession about the operation of a hardware ANN was not accompanied by reasons, presumably it is because the hardware is not implementing a series of instructions pre-ordained by a human. It is operating according to something that it has learned itself… I do not see why the same should not apply to the emulated ANN. It is not implementing code given to it by a human. The structure, in terms of the emulation of uneducated nodes and layers, may well be the result of programming, but that is just the equivalent of the hardware ANN. The actual operation of those nodes and layers inter se is not given to those elements by a human. It is created by the ANN itself.”

Whether this concession is sound, or was made per incuriam, is open to question. Time will tell.

The judgement also found that the setting of the learning objectives, whilst computer-implemented, is at a level that makes the computer program a subsidiary part of the claimed invention. The computer program exclusion was therefore not engaged.

In covering off the possibility that the learning objectives amounted to a program for a computer, the judgement went further to consider the technical contribution, and therefore steps (3) and (4) of Aerotel. This requires that any technical effect must be one that falls outside all of the statutory exclusions. Citing Halliburton Energy Services Inc’s Patent Application [2012] RPC 12, in which the invention involved a computer designing a drill bit by a process of simulation and alteration of a drill bit parameter, the judgement drew on the salient parts of Birss J’s reasoning:

“32. Thus when confronted by an invention which is implemented in computer software, the mere fact that it works that way does not normally answer the question of patentability. The question is decided by considering what task it is that the program (or the programmed computer) actually performs. A computer programmed to perform a task which makes a contribution to the art which is technical in nature, is a patentable invention and may be claimed as such. Indeed (see Astron Clinica [2008] RPC 14) in those circumstances the patentee is perfectly entitled to claim the computer program itself.  

33. If the task the system performs itself falls within the excluded matter and there is no more to it, then the invention is not patentable … 

38. What if the task performed by the program represents something specific and external to the computer and does not fall within one of the excluded areas? Although it is clear that that is not the end of the enquiry, in my judgment that circumstance is likely to indicate that the invention is patentable. Put in other language, when the task carried out by the computer program is not itself something within the excluded categories then it is likely that the technical contribution has been revealed and thus the invention is patentable. I emphasise the word "likely" rather than "necessarily" because there are no doubt cases in which the task carried out is not within the excluded areas but nevertheless there is no technical contribution at all.” 

Within the context of the claim, Emotional Perception also argued that the sending of an improved recommendation message was sufficient to establish a technical contribution and pleaded that its case was effectively on all fours with PKTWO, saying that it was wrong for the Hearing Officer initially to hold that the external end result of the invention was not a “relevant technical effect”. 

The judgement contradicted the Hearing Officer’s finding, saying that the sending of the recommendation, regardless of its nature, was a relevant technical effect and thus prevented the exclusion from applying: “It is not a disqualification that the result may be to facilitate user enjoyment.” This appears to be a significant statement for the assessment of technical contribution, for assessing the invention at the correct level of granularity, and for how a high-level perspective appears inappropriately superficial and dismissive.

More specifically, concerning what amounts to a technical contribution, the decision was extremely critical of the conclusions and subjectivity applied by the Hearing Officer:

“A decision of an expert tribunal such as the IPO Hearing Officer in this case is entitled to respect in relation to technical matters, and in respect of judgments such as were made in relation to technical effect, but I am afraid that in this instance I consider the judgment to be flawed and disagree with the assessment. The Hearing Officer seemed to consider that a subjective appreciation of the output of the system was just that, subjective and in the user, and therefore not a technical effect. I do not consider that to be the correct analysis. The Hearing Officer was right to acknowledge that the result of the invention was an effect external to the computer in the transmission of a chosen file. That is usefully analogous to the file that was moved in the third Gemstar patent. The correct view of what happened, for these purposes, is that a file has been identified, and then moved, because it fulfilled certain criteria. True it is that those criteria are not technical criteria in the sense that they can be described in purely technical terms, but they are criteria nonetheless, and the ANN has certainly gone about its analysis and selection in a technical way. It is not just any old file; it is a file identified as being semantically similar by the application of technical criteria which the system has worked out for itself. So the output is of a file that would not otherwise be selected. That seems to me to be a technical effect outside the computer for these purposes, and when coupled with the purpose and method of selection it fulfils the requirement of technical effect in order to escape the exclusion. I do not see why the possible subjective effect within a user’s own non-artificial neural network should disqualify it for these purposes. To adapt the wording of Floyd J in Protecting Kids, the invention is not just one depending on the effect of the computerised process on the user. There is more than that.
There is a produced file with (it is said) certain attributes. The file produced then goes on to have an effect on the user (if the thing works at all) but one cannot ignore the fact that a technical thing is actually produced. It would not matter if the user never listened to the file. The file, with its similarity characteristics, is still produced via the system which has set up the identification system and then implemented it. This effect is qualitatively different from the first two instances in Gemstar, and qualitatively similar to the effect in Protecting Kids.”

Conclusions & Remarks

Contrary to widely held conventional beliefs and guidelines, ANNs and their training regimes can support patentable invention and are not excluded under section 1(2) of the UK Patents Act. There is, however, a general need for appropriate learning objectives to be technically non-trivial.

ANNs do not make use of conventional programs for computers but are self-taught to address intractable problems that cannot be coded by a human. A distinction is therefore established and provides a limitation as to what is meant by the term “program for a computer” of section 1 of the UK Patents Act. ANNs are therefore not computer programs for the purposes of the Act. A software emulated ANN is not implementing code given to it by a human. It learns from experience without being told how to do it by a human being. In the specific case of the invention, the ANN learnt how to discern semantic similarity from physical properties. It did not do so because any human (programmer) told it how to do it. And, furthermore, in all ANNs the state of each neuron is not determined by any human being programming those individual neurons or nodes. The structure, in terms of the emulation of uneducated nodes and layers, may well be the result of programming, but that is just the equivalent of the hardware ANN. The actual operation of those nodes and layers is not given to those elements by a human. It is created by the ANN itself.

The characteristics of the output of the ANN, and by analogy other systems, are not particularly relevant to the assessment of the technical contribution made by the new ANN (or system) in its entirety and within technical context. The output is likely to be a technical effect outside of the computer associated with an underlying technical process, and therefore not engaged by the investigations within the Signpost Test. This reflects the need for the assessment to be framed objectively at the right level of granularity. In other words, a technical contribution should not be precluded for reasons that the possible subjective effect is within a user’s own mind, i.e., their own non-artificial neural network.

Technically non-trivial learning objectives conceptualized for the definition of a suitable loss function operate at a different level from the underlying implementing program. 

The sending of a message over a network defeats the notion of the invention being a computer program “as such.” This reinforces the decision in PKTWO.

The decision’s criticism of the UKIPO’s stance and the direction of this case are both very welcome. They provide further hope that, in the future, the UKIPO will adopt a more pragmatic and reasonable approach to protecting “all fields of technology,” thereby reflecting the revised language of Art.52(1) EPC and the requirements of Art.27 TRIPS.


Bruce Dearling represents Emotional Perception AI, is a shareholder at Hepworth Browne and is responsible for CIIs, AI, communications, and business technologies. He has over 30 years of private practice and industrial experience both in the UK and internationally, securing leading decisions in litigation in Germany and other European countries.