As Douglas Adams once put it: [1]
There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable.
There is another theory which states that this has already happened.
The interplay between law and technology is similar: the moment we think we have it figured out, it is replaced by something even more bizarre.
Artificial intelligence (AI) is no exception, and as a society we need to start asking how it fits in with our existing laws, or if we need new bespoke ones.
AI and intellectual property
One area of concern is whether AI can create something that might attract Intellectual Property (IP) rights. The concept of computer-generated works is not new to New Zealand copyright law, but the concept of AI-generated works is, to some extent, new.
With the development of computing power over the last few decades, AI has become more powerful and more multi-faceted. AI is now used to create new works such as paintings in a particular style, like the Next Rembrandt.
Who owns copyright in these new creations? In New Zealand, a computer-generated work is considered to be authored by the person who “makes the necessary arrangements for the creation of the work”. Often this is the programmer. Generally speaking, the author is the owner of copyright, unless it has been on-sold to another person or entity.
AI throws a spanner in those works because there may not be an author – and therefore no owner – of any AI-generated works. Broadly speaking, AI refers to algorithms that emulate human intelligence in machines. There are different methods for doing this, and one of the more common variants is based on generalising from examples (called “machine learning”), often built by imitating the neurons in a brain (called “neural networks”). As AI systems are currently not recognised as legal or natural persons (I leave arguments around this for another writer), the AI itself cannot be an author of its own works.
Because these algorithms “learn” for themselves, they are often left to decide the best way forward – or the best output – on their own; the previously mentioned Next Rembrandt is an example of this. The algorithm had to decide what key features made up Rembrandt’s style, and what impact each had. It then emulated that style and applied it to a completely new painting. The programmers did not know what the painting would look like and left it to the algorithm to work that out. Because of this, crucial creative decisions shaping the final painting were made by the algorithm and not the programmers.
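To make the idea of “learning from examples” concrete, here is a minimal, hypothetical sketch in Python. It is emphatically not the Next Rembrandt system, whose methods are far more sophisticated; the feature names and numbers are invented purely for illustration. The point is only that the programmer supplies example works and a learning rule, while the values that shape the new output are derived by the algorithm itself.

```python
# Toy illustration only -- not the Next Rembrandt method.
# The programmer supplies examples and a learning rule; the numbers
# that shape the new "work" are worked out by the algorithm.

# Each example work is reduced to a few hypothetical style features
# (brightness, contrast, brush_size) -- invented values for illustration.
examples = [
    {"brightness": 0.32, "contrast": 0.71, "brush_size": 4.0},
    {"brightness": 0.28, "contrast": 0.75, "brush_size": 3.5},
    {"brightness": 0.35, "contrast": 0.68, "brush_size": 4.2},
]

def learn_style(works):
    """Generalise from examples: average each feature across the corpus."""
    features = works[0].keys()
    return {f: sum(w[f] for w in works) / len(works) for f in features}

def generate(style, subject):
    """Apply the learned style to a new subject the programmer never painted."""
    return {"subject": subject, **style}

learned = learn_style(examples)           # decided by the algorithm, not the programmer
new_work = generate(learned, "portrait")  # an output the programmer could not predict exactly
print(new_work)
```

Even in this toy case, the programmer never chose the final feature values; they only chose the examples and the rule for generalising from them. Real systems make far more, and far less predictable, decisions of this kind.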
Bringing this back to the legal context, the programmers of the Next Rembrandt algorithm most likely did not make the ‘necessary arrangements’ required to qualify as authors of the painting for copyright purposes.
While it is true that the programmers chose the style (that of Rembrandt), they did not dictate the colours, the subject, the brush strokes, the placement of the eyes on the face or the other sorts of decisions artists make in creating a work. Instead, they trained an algorithm to make those decisions for them. Decide it did. The algorithm made crucial creative decisions on the final form of the painting itself.
Copyright protects the expression of a person’s ideas, for example the choices made in the creation of a painting. But if a person did not make those choices and instead they were AI-generated, there is a strong argument that there is no author for works in those circumstances. With no author, there is no-one to own copyright.
Therefore, works created by AI may not be protected by copyright, and other methods of protection may be needed to guard investments in those works. Further, where AI-generated works are protected, it is vital to keep accurate records of each contributor’s involvement and creative control over the output, in order to prove authorship and therefore ownership of the resulting work.