Tilt: What role does NEC play in regulatory discussions in the countries where it has contracts?
Michael O'Connell: That is the most important question you have asked, and I say that in all sincerity, because it goes to the reason I stopped working for government and the policing industry, particularly in the UK, and joined a company like NEC. It is one of the "technology houses" that is seeking to change its role and its responsibility in society. It is starting to understand that it has created all these sophisticated things and now needs to build safeguards around them. We try to promote social values and trust around these technologies.
As you pointed out, these technologies can be misused – we don't want that to happen. We want them to be used proportionately, for the purposes they were created for, so that they have a positive impact on society. In the past 18 months, we have established the corporation's social values and developed a higher-level interpretation of the leadership role we play globally.
Tilt: What do you mean by that, in practice?
Michael O'Connell: We brought in a digital trust team that ensures that the corporation, all its entities and its technologies operate according to a model of proportionality and transparency in the way the technologies are used, while obviously respecting certain commercial interests around intellectual property. [Transparency] about whom we will consider doing business with, how we will educate potential customers, and how we will ensure that our technologies provide the highest possible level of accuracy – that is, that the error rate is as low as possible when these technologies are used in ways that affect people's privacy and identification. No one wants to be misidentified and subjected to checks or disruption of any kind. That is why NEC has spent so much time working to such a high standard of specification, both in its detection and identification capabilities, such as facial and fingerprint recognition, and in the artificial intelligence, algorithms and machine learning that work in the background, connecting the dots, learning and improving operational functionality.
Tilt: What is your role in this process?
Michael O'Connell: I am a member of the board of directors of the Biometrics Institute [an international forum whose mission is to promote the responsible use of biometrics, independently and impartially], and I have colleagues who support us on the legal and digital privacy team. We work as a community to try to promote ethics around the use of these types of technologies, because we are still operating in an almost unregulated context. We are trying to foster, instead of a race for less regulation, a race to the top in terms of standards, performance and behavior.
We work to guide governments and international organizations, such as the UN, the WTO [World Trade Organization] and the World Economic Forum, on the rules and procedures we believe they should aspire to create. We actively support initiatives such as the GDPR [General Data Protection Regulation]. We don't run away from that.
Tilt: Are there any GDPR articles or other existing regulatory texts that NEC disagrees with because they are too restrictive? Do you believe that the GDPR has set good standards for what privacy protection should look like?
Michael O'Connell: If you take the GDPR, it is, at this stage, an untested regulatory model. I say this because it was prescribed and released into a live context, but there will be episodes in which it will need to be challenged in court. It follows a common-law approach in which case law will emerge and attempt to calibrate it. Some would say it is quite arbitrary, that it is crude, and that it gets in the way – which is true. And there are cost burdens, both in compliance and in the event of perceived or real violations.
In the absence of anything else, it both allows and forces us to look deeply into the operating environment of these technologies – not only biometrics, but data acquisition, internet use, the Wild West of exploiting individuals, their lifestyles, behaviors and religions, and every attempt to derive commercial gain or advantage from them.
Tilt: Where do you see flaws in the GDPR?
Michael O'Connell: What we don't want, of course, is to end up in a position of over-regulation, because that becomes debilitating for everyone – a step backwards from the opportunities these technologies can offer. In health, for example, there are mechanisms to ensure that we de-personalize information while still being able to collect and use it. If we go overboard with regulation, some of these opportunities to create new drugs and solutions for ourselves and future generations could be undermined.
We have to be intuitive on this journey and maintain critical rigor in perpetuity, while evolving towards a regulatory model that is agile. It is a huge challenge for regulatory authorities, because they need to be faster. They need to stay up to date.