9 November 2021
Background
Flying Forward 2020 (FF2020) is a three-year project funded by the European Union that is developing a state-of-the-art geospatial Urban Air Mobility (UAM) ecosystem. By deploying and testing this ecosystem in five living labs, FF2020 will be able to deliver best-in-class drone infrastructure, autonomous monitoring, and last-mile delivery.
Maastricht University is one of the 12 organisations comprising the FF2020 consortium. The work carried out by the team at its Law and Tech Lab will enable the integration of the regulatory framework into the project’s Digital Toolbox, making it replicable and scalable in terms of laws and regulations. We interviewed Gijs van Dijck – Professor of Private Law at Maastricht University, director of the Law and Tech Lab, Principal Investigator at Brightlands Institute for Smart Society (BISS), and researcher at M-EPLI – to find out what the Lab’s role is in FF2020 and how its work will contribute to the success of the project.
About Gijs van Dijck
Gijs van Dijck integrates legal, empirical, and computational analysis in order to improve the description, application, understanding, and evaluation of the law. He has taught courses on tort law, contract law, property law, empirical legal research, and computational legal research. Gijs van Dijck has published in top journals, including the Journal of Empirical Legal Studies and the Oxford Journal of Legal Studies. He has spoken at various conferences, including at Oxford, Harvard, Yale, Duke, and Cornell, and was a visiting scholar at Stanford University in 2011.
How did Maastricht University become involved with FF2020?
We have a Law and Tech Lab that operates in something of a niche: we conduct research at the intersection of law, data science, and artificial intelligence (AI). What makes us unique is that we apply computer science methods to law. The aim is to better understand whether and how these methods can contribute to the description, interpretation, application, and evaluation of the law. FF2020 approached us for the work we are now carrying out: mapping legislation and transforming it into machine-readable and executable code. This transformation will make us less reliant on human intervention and input when conducting drone operations.
What captured your personal interest in FF2020?
For me, it was the fact that we were presented with a complex problem that we were unsure of how to solve. Because of our expertise, we came up with ideas on how to have machines capture legal information, and that made the possibility of joining very exciting. What made it even more appealing to us was that it wasn't only a theoretical case. FF2020 presented an opportunity to collaborate with strong consortium partners on putting our technology into practice.
Can you tell us a bit more about your team and the work you do in the Law and Tech Lab?
We started three years ago with a three-person team. Today we have around 15 researchers, so we operate much like an academic startup. Our team is highly interdisciplinary: almost half of it consists of computer scientists. I believe we might be the largest group in the world in which lawyers and computer scientists work together on applying computer science methods to law. One of our flagship projects is software for law students and legal researchers – who are usually not very technical – that helps them apply network analysis. This software will improve their understanding of which cases are critical within a large network of case law.
Other topics include automatically detecting whether European legislation has been properly implemented in the Member States, finding illegal activity on the dark web, and cross-border mergers – mergers in which the companies involved are subject to the laws of different countries – where we apply data science to track developments. The sources we work with range from traditional legal materials such as case law to social media. We apply Natural Language Processing (NLP), machine learning, and network analysis, which we then combine with legal analysis.
Why are machine-readable and machine-executable regulations and legislation crucial to the FF2020 project?
Rules and legislation are typically drafted by humans, for humans. Moreover, the law includes many open norms that require interpretation. Some see that as a limitation, but I think it's also a virtue. Answers to legal questions depend on the facts. There is always an interaction between facts and legal rules, and the rules are always in development – they're never static. They must always be interpreted at a particular time and in a specific context of application. This is challenging for machines. Ideally, you want compliance during a drone's autonomous flight to be enforced automatically, from takeoff to landing. You could program a drone's software to recognise whether it's allowed to fly over an assembly of people and to act accordingly. But what is an assembly of people? What we consider an assembly of people during a lockdown may not normally be considered an assembly. Circumstances change. The question is how to deal with that when automating compliance with the law.
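To make the challenge concrete, here is a minimal sketch – not the project's actual implementation – of how an open norm such as "do not fly over an assembly of people" might be encoded so that its interpretation depends on context. The `Context` fields and the thresholds are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Hypothetical facts a drone's software might know at decision time."""
    people_below: int          # estimated number of people under the flight path
    lockdown_in_force: bool    # public-health measures change what counts as an "assembly"

def is_assembly(ctx: Context) -> bool:
    """Interpret the open norm 'assembly of people' for the current context.

    The thresholds are illustrative only: during a lockdown even a small group
    may count as an assembly, while normally a larger crowd would be required.
    """
    threshold = 3 if ctx.lockdown_in_force else 12
    return ctx.people_below >= threshold

def may_overfly(ctx: Context) -> bool:
    """Machine-executable version of 'do not fly over an assembly of people'."""
    return not is_assembly(ctx)

print(may_overfly(Context(people_below=5, lockdown_in_force=True)))   # False
print(may_overfly(Context(people_below=5, lockdown_in_force=False)))  # True
```

The point of the sketch is that the legal concept itself does not change, but the facts that trigger it do, which is exactly what makes automated compliance hard.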
Another challenge is identifying quantifiable rules. Such rules make it easier for machines to understand and apply them. We also identify non-quantified rules that could be transformed into quantified rules, and work out how to achieve this. To give an example: during an operation a drone must fly safely, but what does that entail exactly? What are the probabilities of collision, and do we find those probabilities acceptable? It is remarkable how much human input is still required to produce machine-executable rules. Ultimately, our goal is to automate the process and enhance compliance, reducing the possibility of human error while maintaining or increasing trust.
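As a rough illustration of what a quantified rule could look like, the sketch below fixes a maximum acceptable collision probability and checks an estimate against it. The threshold value is an assumption for illustration, not an official figure.

```python
# Illustrative only: turning the open norm "fly safely" into a quantified rule
# by fixing a maximum acceptable collision probability for the operation.
MAX_COLLISION_PROBABILITY = 0.000001  # 0.0001%, an assumed value, not an official standard

def flight_is_safe(estimated_collision_probability: float) -> bool:
    """A quantified stand-in for the open 'fly safe' norm."""
    return estimated_collision_probability <= MAX_COLLISION_PROBABILITY

print(flight_is_safe(0.0000005))  # True: within the assumed tolerance
print(flight_is_safe(0.0001))     # False: permission to fly would be denied
```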
If a drone software programme is fully automated, will it still be possible for humans to intervene?
There are different ways humans could intervene – for instance, through a checklist that humans use to verify legislative requirements. This process could be automated to prevent takeoff if the checklist is not completed. A similar approach is used in aviation.
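A minimal sketch of that checklist idea might look as follows; the checklist items are invented for illustration, and the automated gate simply refuses takeoff while any item is unconfirmed.

```python
# Hypothetical pre-flight checklist: takeoff is blocked until every item is confirmed.
PRE_FLIGHT_CHECKLIST = {
    "operator_registered": False,
    "geo_awareness_data_loaded": False,
    "flight_authorisation_received": False,
}

def confirm(item: str) -> None:
    """A human (or another system) ticks off one checklist item."""
    PRE_FLIGHT_CHECKLIST[item] = True

def clear_for_takeoff() -> bool:
    """Automated gate: refuse takeoff while any item remains unchecked."""
    return all(PRE_FLIGHT_CHECKLIST.values())

confirm("operator_registered")
print(clear_for_takeoff())  # False: the remaining items still require confirmation
```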
There's another factor to consider. In flight, for example, a pilot must prevent dangerous situations from becoming harmful to other aircraft, people, or property. Making this process machine-readable and executable presents some challenges. Say that a drone's sensors can detect potential risks in flight; we then need to ask ourselves what level of risk is acceptable, because it's never going to be zero. Regarding the probability of collision, for example: is 0.001% acceptable, 0.1%, or perhaps 1%? I believe this is a decision a legislator will never make. These decisions will be defined in standards, by private regulators, or made by operators and pilots. How do we make it transparent who made such a decision in an automated system? That is one of the exciting questions we will tackle in this project.
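One way such transparency could be achieved – sketched here purely as an assumption about how it might be done, not as the project's approach – is to store the threshold together with a record of who set it and on what basis, so every automated authorisation can be traced back to a human choice.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskThreshold:
    """A risk limit together with a record of who decided it and on what basis."""
    max_collision_probability: float
    set_by: str        # e.g. "standards body", "operator", "pilot in command"
    source: str        # reference to the standard, manual, or decision log

# Hypothetical example: the threshold and its provenance travel together.
THRESHOLD = RiskThreshold(
    max_collision_probability=0.00001,
    set_by="operator",
    source="internal operations manual v1.2 (assumed example)",
)

def authorise(estimated_risk: float, threshold: RiskThreshold) -> dict:
    decision = estimated_risk <= threshold.max_collision_probability
    # The returned record makes transparent which limit was applied and who set it.
    return {"authorised": decision, "threshold": threshold}

print(authorise(0.000001, THRESHOLD))
```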
How will Maastricht University’s involvement in the project contribute to the success of the living labs?
The living labs are vital for FF2020 because they will put the technology we are developing into practice. We keep close contact with them to map their journey. If there are issues along the way, we will assist them in making the right decisions.
To give an idea of how we can support the living labs: there are multiple levels of regulation involved in these processes. The rules for flying drones are defined at the European level, but the Member States determine whether to give clearance for takeoff, landing, and other operations. A requirement for authorisation is that you fly safely. Part of the ‘fly safe’ rule is that you avoid, for instance, military zones and airport control zones. During operations, pilots interact with national and local authorities to get permission to fly and to avoid violating the applicable rules.
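As a rough sketch of how such zone restrictions could be checked in software, the example below tests a planned route against simplified no-fly zones. The zones, coordinates, and bounding-box representation are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    """A simplified restricted zone described as a bounding box (illustrative only)."""
    name: str
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def contains(self, lat: float, lon: float) -> bool:
        return self.min_lat <= lat <= self.max_lat and self.min_lon <= lon <= self.max_lon

# Invented zones standing in for, e.g., a military base and an airport control zone.
NO_FLY_ZONES = [
    Zone("military base", 51.44, 51.46, 5.38, 5.41),
    Zone("airport control zone", 51.43, 51.47, 5.35, 5.40),
]

def route_is_clear(waypoints: list[tuple[float, float]]) -> bool:
    """Part of a 'fly safe' check: no waypoint may fall inside a restricted zone."""
    return not any(zone.contains(lat, lon) for lat, lon in waypoints for zone in NO_FLY_ZONES)

print(route_is_clear([(51.448, 5.49), (51.450, 5.50)]))  # True: outside the invented zones
print(route_is_clear([(51.450, 5.39)]))                  # False: inside a restricted zone
```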
We are working together with High Tech Campus Eindhoven – one of FF2020's living labs – on testing this process. Because the campus is close to a military base and an airport, its operations are limited. We keep an eye on their journey so we can support them with any regulatory issues they encounter along the way. We are researching whether we can ensure safe operations at High Tech Campus Eindhoven by making these rules machine-executable.
In your opinion, what are FF2020's most significant hurdles in realising its vision, and how will Maastricht University help overcome them?
The first challenge our project faces is creating a digital infrastructure and platform that supports drone use, simulations, and so on. Second, we must consider technical feasibility. For our team and the activities we are carrying out, the challenge is to meaningfully translate human decisions into a machine-executable process. Do we assign probabilities to a rule, such as the ‘fly safe’ rule, or do we do it differently? Our team must also take conflicting values or rules into account. Humans are good at balancing these against one another; machines are not. So the question is: how do we teach machines to do this purposefully? In short, I believe our main challenges are reducing human error when interpreting and applying legislation, and dealing with conflicting rules. It may be that, at first, machines will perform worse than humans. The challenge is to improve their performance, and I believe this will happen. AI is more likely to progress than human intelligence.
How complex is it to regulate the drone infrastructure, and how do you think this will affect public acceptance of drone operations?
It is rather difficult for the public to find out which legislation is applicable. We're working with a team of legal experts on this, and even we sometimes struggle to figure out which rules apply and how to map them systematically. It's almost impossible for those without legal knowledge to understand what drones are or are not authorised to do. It's a great example of why our project is so relevant. I think we'll make strides if we create a tool that helps non-legal experts.
And that brings us to the other part of your question, which concerns trust among citizens. Drones fly at much lower altitudes than aeroplanes or helicopters. The safety of drone operations directly depends on compliance with rules and regulations. By enhancing legal compliance, we can increase trust in drone operations.
How are developments in legal technology changing the legal sector, and what is the role of Maastricht University's Law and Tech Lab in these advances?
Lawyers can often be doomsday preppers: we try to mitigate problems and assess risks, and even when we don't find any risks, we keep looking for them. Because of this, lawyers can be unpopular – we mostly see problems, not opportunities. People think we are against developments in this field. However, what we try to do in our lab is adapt technology to regulation and legislation without relinquishing the virtues of the law. It remains essential to rely on human interpretation to make good decisions, assess risks, and mitigate those risks. In the lab, we try to bridge the current gaps in these developments by coming up with ideas on how to regulate these matters soundly, so that technology can keep advancing.
Do you think it's necessary to transition to a new regulatory framework in order to meet the required standards for drone operations?
Establishing legislation and regulation should be viewed in a multi-level context. At a very high level, there are value-based rules that are open-ended and require interpretation; the ‘fly safe’ rule is an example. It is then left to others to make those norms more concrete. When it comes to standardisation, I don't think European regulators are well placed to determine how high a drone should fly or what risk is acceptable in terms of probabilities. The regulator is, however, capable of determining the values, which can then be made concrete at a lower level. So standards can be developed, perhaps even by private parties, to operationalise those higher-level norms. This also provides flexibility, since such norms can be adapted and expanded more easily than high-level regulation.
I believe maintaining different regulatory levels is the right way to move forward. The lower the level, the more concrete the norms. Industry standards can be specified through trial and error. To put my earlier example of probabilities in context: a standard could require that the risk of collision not exceed a 0.0001% probability, with permission to fly denied if the percentage is higher. That can be a perfectly good standard. As technology improves, this probability can be reduced to even lower risk levels.
What kind of impact do you wish FF2020 to have on the lives of European citizens?
My hope is that this project will make some technological strides to showcase the advantages of autonomous drone operations. Drones could potentially make life easier and better for citizens, but I also see potential risk factors. People don't want drones constantly flying around and over them, blocking their view of the sky. So, we must include public values to make sure the solutions we develop in FF2020 are not only technologically feasible but can also be implemented safely. For this reason, we have rules that take into account societal and public values.
This is our task in this project, and the subject also holds my personal interest. As a legal researcher, I think of the many things that can go wrong, and of why we shouldn't proceed unless the data we collect can improve operations or ensure that the technology advances sustainably and safely.
You mentioned earlier that the consortium has strong partners, and we know you work closely with Nalantis. Can you tell us about that collaboration?
Nalantis is a Belgium-based company that is very good at developing ontologies, which basically means putting a skeleton over text to provide semantic meaning that a machine can understand. To a computer, text is just zeros and ones. By providing a skeleton – essentially a table of contents – the machine can understand that the concept ‘age’ is part of the (higher) concept ‘remote pilot’, and that ‘remote pilot’ is part of the (higher) concept ‘persons’.
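A very small sketch of what such a "skeleton" might look like as a data structure, using the concepts just mentioned; the format is invented for illustration and is not Nalantis's actual ontology representation.

```python
# A toy concept hierarchy: each concept points to its broader (parent) concept.
ONTOLOGY = {
    "age": "remote pilot",
    "remote pilot": "persons",
    "persons": None,  # top-level concept
}

def broader_concepts(concept: str) -> list[str]:
    """Walk up the hierarchy so a machine can relate a term to its broader concepts."""
    chain = []
    parent = ONTOLOGY.get(concept)
    while parent is not None:
        chain.append(parent)
        parent = ONTOLOGY.get(parent)
    return chain

print(broader_concepts("age"))  # ['remote pilot', 'persons']
```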
Nalantis has developed a search interface for the municipality of Ghent where a citizen can ask a question such as: ‘Where can I park my bike?’ The answer returned is the applicable local regulation on bike parking. Our role in this process is to provide Nalantis with the text of drone regulations. In return, they give us the skeleton so that the machine can better understand how words and sentences relate to one another. We feed those inputs into an interpretation framework to ensure the machine understands the semantics and other crucial information. Lastly, we make this machine-executable, allowing us to test whether the rule is satisfied based on the entered values.
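To make that last step concrete, here is a minimal sketch of testing whether a rule is satisfied once values have been entered. The minimum-age rule and the numbers are used purely as an illustration, not as the project's actual framework.

```python
# Hypothetical machine-executable rule: a remote pilot must be at least 16 years old.
# (The age limit is used only to illustrate testing a rule against entered values.)
def rule_minimum_pilot_age(values: dict) -> bool:
    return values.get("remote_pilot_age", 0) >= 16

entered_values = {"remote_pilot_age": 17}
print(rule_minimum_pilot_age(entered_values))  # True: the rule is satisfied for these values
```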
What do you wish to achieve for your lab and Maastricht University through FF2020?
We definitely want to get a better grasp of how we can translate human knowledge into machine-readable knowledge and pass it on to machines. That will help us not only in the context of drones, but also in automated decision-making processes more broadly. In other projects, we have been trying to make court decisions understandable for machines, and the issues we face there are similar to those we encounter in FF2020. That's why we believe automated compliance should become a requirement, and that it must be infused with legal rules and public values.
We also hope to transfer the knowledge and experience we gain in these projects to other domains we're working on. By doing so, we can increase the body of academic knowledge of how machines and humans can interact meaningfully.
Is there anything else you would like to address regarding the work Maastricht University's Law and Tech Lab is doing in FF2020?
We are thoroughly enjoying our collaboration with other consortium partners. And although we encounter the usual struggles in projects such as this one, FF2020 is a special project with wonderful partners that are not only collaborating on research and development, but also on achieving tangible outcomes. It's a mission we are all committed to. I am convinced that the experiences we gain in FF2020 are extremely valuable and meaningful, which is why it's important we share those with the public.