As noted previously by Agriland, in their enthusiasm for autonomous machines, the creators tend to overlook the troublesome hurdle of EU legislation, especially when it comes to self-learning software.
Up until now, it has been far from clear just how the issue of robot safety is to be regulated. However, the EU legislative mill has recently started grinding away on the issue, and a framework for machinery approval is now taking shape.
Health and safety a priority
Presently, before placing machinery on the market in the EU, manufacturers need to ensure that essential health and safety requirements are met and conformity checks carried out.
Up until now, this has largely been the responsibility of the manufacturers themselves, who have been able to self-certify their products, a process which has taken up a good deal of engineers' time.
However, this situation is set to change, with the EU dividing machines into two broad categories, listed under Part A and Part B of Annex I.
It is proposed that machinery with self-evolving behaviour based on machine learning will be subject to stricter conformity assessment procedures which will need to be carried out by a third party.
These fall under Part A, and so, before they can go on sale within the EU, there will need to be independent oversight of the machine and, presumably, the software that directs it.
For those products that come under Part B, the manufacturer can still carry out the conformity assessment itself.
New requirements
The European Parliament has listed a series of requirements to be met by autonomous machines that fall into Part A before they can be sold.
Several concerns are addressed; the two most likely to affect the viability of installing autonomy within a machine are that it should not attempt actions outside of its designed task(s), nor operate outside of a defined zone.
Geo-boundaries are nothing new, so the second is easily accomplished, although the reliability of the system will most probably need to be proven.
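The kind of check a geo-boundary involves is well understood in software terms: testing whether the machine's position lies inside a polygon describing the field. The sketch below is purely illustrative, not drawn from any manufacturer's system; the field coordinates and function names are hypothetical.

```python
# Illustrative geofence check: a ray-casting point-in-polygon test.
# All coordinates here are hypothetical examples in local metres.

def inside_boundary(point, polygon):
    """Return True if the (x, y) point lies inside the polygon,
    given as an ordered list of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count how many polygon edges a ray from the point crosses;
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

field = [(0, 0), (100, 0), (100, 60), (0, 60)]  # hypothetical field corners
print(inside_boundary((50, 30), field))   # machine well inside the field: True
print(inside_boundary((120, 30), field))  # machine beyond the boundary: False
```

In practice, a certified system would also have to handle GPS error margins and fail safe when the position fix is lost, which is where the proof-of-reliability demand would bite.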
The first is a little more involved, for it will require a strict definition of a machine's task.
How is that to be achieved and how are the limits to be communicated to the machine which, if self-learning is involved, may attempt to modify a task for the sake of efficiency?
Farmers may be opening themselves up to all sorts of potential liabilities if they cannot show that the machine was set up correctly, in which case it may become a job for a qualified specialist.
Other features demanded by the legislation will be that the machine communicates its intentions to the operator and that its interface is user-friendly.
The need to be hackproof
It is likely that the emerging robotics industry will overcome these challenges, yet there still remains one obstacle which could prove the killer punch to autonomy, and that is cybersecurity.
The draft text of the new legislation clearly sets out the requirement that "manufacturers must design machinery products so that connection to them by another device does not lead to a hazardous situation".
That is to say they must be incapable of being hacked.
This is probably impossible, as there is growing evidence that vehicles of all sorts may be vulnerable to hacking, be it directly or via the software they communicate with back at base.
Building to legislation
Here the legislation may be the author of its own undoing, for if it is required that a remote operator and an autonomous machine can communicate while it is in operation, then a radio link must be established, and that then becomes a point of entry for hackers.
None of these issues will be resolved overnight; the robotics industry will need to come to terms with the idea that safety, and a machine's interaction with humans, are going to be overseen by legislation instead of being left to the rather haphazard approach that exists at the moment.
The sight of cabless tractors roaming the fields merrily beeping to themselves may be a long way off yet.