Product Liability and Artificial Intelligence

Christopher Hooley
The law always has difficulty keeping pace with advances in technology – product liability is no exception.

There is much established law on product liability for traditional products, but how does that law apply to the increasing use of software, apps and related artificial intelligence?

These laws are founded in contract, for a direct supplier; in statute, under the Sale of Goods Ordinance, for other products; or in negligence, for direct and indirect supply, where a manufacturer is liable if it supplies a defective product in breach of its duty of care and that breach then causes foreseeable damage.

This is the chain of causation. But what happens if that chain of causation is broken?

If a manufacturer produces a product and provides specific directions as to its use, and that product is then used contrary to those directions, it is arguable, but not inevitable, that the chain of causation has been broken. That may also result in a change of liability.

The same is true of software. Provided that it is used in the form delivered, whether “vanilla” or “customised”, liability will rest with the actual developer. However, as soon as that software is altered by the user, performance issues may no longer be the sole responsibility of the developer and, as a consequence, liability is also likely to shift.

The situation is somewhat different where software is developed with the intention that the user will change it by having it learn to perform a specific task.

As soon as that software learning commences, the chain of causation is arguably broken, causing a change in liability.
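By way of illustration only, the short Python sketch below is a simplified, hypothetical example (not drawn from the article or any real system) of why user-driven learning muddies the chain of causation: identical developer-supplied learning code can reach different decisions once each user trains it on their own data.

```python
# Hypothetical sketch: the developer ships the same learning code to every
# customer, but each customer trains it on its own historical decisions.
# Identical software, different training data, different outcomes, which is
# why pinning a fault on the developer, the data or the training is hard.

from statistics import mean

class ThresholdClassifier:
    """Learns a single cut-off score from labelled examples supplied by the user."""

    def __init__(self):
        self.threshold = None

    def train(self, scores, labels):
        # Place the threshold midway between the average "accept" score
        # and the average "reject" score seen in the user's own data.
        accepted = [s for s, ok in zip(scores, labels) if ok]
        rejected = [s for s, ok in zip(scores, labels) if not ok]
        self.threshold = (mean(accepted) + mean(rejected)) / 2

    def decide(self, score):
        return score >= self.threshold

# Same developer-supplied code, two different users' training data.
model_a = ThresholdClassifier()
model_a.train(scores=[40, 45, 70, 80], labels=[False, False, True, True])

model_b = ThresholdClassifier()
model_b.train(scores=[40, 45, 50, 80], labels=[False, False, True, True])  # skewed data

candidate_score = 55
print(model_a.decide(candidate_score))  # False: this user's model rejects
print(model_b.decide(candidate_score))  # True: the other user's model accepts
```

In this sketch the developer wrote every line that runs, yet the decision that causes the loss is determined by the data the user chose for training, which is precisely the point at which the chain of causation becomes contestable.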

Impact on AI

So questions will inevitably arise that bear on liability for artificial intelligence:

  • Could there be a fault in the relevant algorithm?
  • Could the data set have been corrupt?
  • Was there insufficient data for the AI to make informed decisions?
  • Could the user have misguided the AI?
  • What if there was more than one entity that had developed the initial code?

These are questions which may not be easy to answer and which could well result in legal action if the end user has suffered actionable loss. Even if there are informed factual answers, the process of identifying the cause of the problem will be time-consuming and costly.

Given the inevitable complexities of determining liability, and while the law catches up with the technology, contract terms may need to be drafted to apportion blame. This also means that insurance contracts have to be revisited, to focus on resulting third party loss and damage.

Vicarious Liability

This arises where an employee, acting in the course of his/her employment, commits a tortious act that causes foreseeable loss or damage. This is well established in common law and also, in some cases, under statute, e.g. discrimination laws.

Currently, “robots” are not classified as employees and indeed do not have any express legal status at all.

As such, there can be no vicarious liability for their actions. One possible consequence is a broadening of the definition of an employee to include “robots”.

However, if the robot has been taught by employees of a company, that may still lead the relevant employer to be held liable in a more indirect way.

An extreme example might be where a female candidate is refused a job because a robot assesses she is pregnant and so not suitable for the position, and that candidate then sues.

The employer could argue that there is no vicarious liability as the act was done by a robot, and a robot is not an “employee”.

The female candidate however might argue that the employer should be liable, since the robot had only been trained by the employer’s staff.

If that argument were to be successful, we would come back to the question of who is responsible for the fundamental fault.

If it can clearly be shown that the software developer was at fault, liability may rest there. If, however, the error was caused by or in the training, the employer might become vicariously liable.

This could lead to discrimination claims being made, where an employee sues the employer and the employer joins the manufacturer/software developer as a third party.

Legal Issues

All the above indicates that the law is a long way behind the advanced state of robotics technology and raises further questions:

  • Could a robot become a legally required alternative for a disabled worker?
  • What happens if you do not introduce robots to do hazardous work? Would that mean you are falling short of your obligation to provide a safe place of work for an employee?
  • Discrimination – a robot cannot harass in the conventional sense, but it may create or contribute to a hostile working environment. Technology magnifies our leverage and increases the importance of expertise, judgment and creativity.

All the above indicates that there will be change, but how and when are still being discussed. Some of the issues being considered are:

  • compulsory insurance
  • rigorous safety standards and certification processes
  • strict liability for manufacturers and developers
  • liability exemptions for users and/or manufacturers?
  • automatic compensation funds, publicly or privately funded

Christopher Hooley

Chris advises on a wide range of corporate commercial, corporate finance, mergers and acquisitions, and information technology matters, from strategising on tech-driven start-ups to drafting the documentation required for complex cross-border transactions.
