Black Hat Europe 2023: Should AI be regulated?


ChatGPT would probably say “Definitely not!”, but will we learn any lessons from the past rush to regulate IoT?


It’s difficult for all of us to keep up with the accelerating pace of technological advancement, especially for public sector decision-makers, who traditionally follow rather than lead. Last week, the Black Hat Europe conference in London provided an opportunity to hear directly from several UK government employees and others responsible for advising the UK government on cybersecurity policy.

Late regulation and bolted horses

All governments seem to suffer from being reactive; closing the stable door after the horse has bolted is a fitting description of most policy decisions. Take the current conversations about artificial intelligence (AI) as an example: politicians are speaking out about the need for regulation and legislation to ensure that AI is used ethically and in the interests of society. But this comes after AI has already existed for many years and been used in many technologies in one form or another. So why wait until it becomes widely accessible to the general public to start a debate on ethical standards? Shouldn’t we have done this before?

Another, perhaps better, example is the legislation surrounding consumer-focused Internet of Things (IoT) devices. The UK government published regulation in 2023 that sets out specific cybersecurity requirements that device manufacturers must adhere to; similar laws are emerging in the European Union, and California implemented requirements for manufacturers in 2020. Establishing standards and guidance for IoT device manufacturers probably should have happened in 2010, when there were fewer than a billion devices connected to the IoT. Waiting until there were 10 billion devices in 2020, or worse, almost 20 billion in 2023, makes it impossible to apply those standards to what is already on the market.

Lessons learned or mistakes repeated?

Notably, the UK government team’s discussion at Black Hat indicated that they are now focusing on the standards needed for enterprise IoT devices. I am certain that most businesses have already made significant investments in connected devices classified as IoT, and that any standards adopted now will be impossible to impose retrospectively and will have little or no effect on the billions of devices already in use.

Standards and policies do serve a purpose, and an important element is educating the public on the correct use and adoption of technology. Continuing the previous example of consumer IoT, I’m sure most consumers now understand that they need to set a unique password on each device and that devices may need frequent software updates to remain secure. Whether they actually follow that advice is another matter!

The political problem with the horse that has already bolted may be that voters would not understand why their government is focusing on things they have never heard of. Imagine if policymakers had started legislating on IoT or connected devices in 2008, before most of us had even considered that we might fill our homes with internet-connected devices. The media and voters would have seen lawmakers as wasting taxpayer money on something we had never heard of. In a perfect world, though, 2008 would have been the ideal time to establish standards for IoT devices. Likewise, the ethical use of AI should have been discussed when technology companies began developing solutions leveraging the technology, not when they began marketing products and services built on it.

Closing thoughts

The conference session was divided into two parts: the first half was used to explain the policies and areas the UK government is focusing on, while the second half was an open question-and-answer session with attendees. This latter half was deemed to be “in the room”, allowing policymakers to have open discussions with attendees without the threat of what was discussed entering the public domain. Therefore, respecting the wishes of the speakers and other attendees, I will refrain from commenting on what was discussed after the “in the room” statement.

However, for the record, and since I did not voice it in the room, I do not agree with implementing an encryption backdoor.
