Focus July 2020

In Colombia, these assets happen to be bikes. Here, they are working with local government on a smart city initiative to address air and plastic pollution by incentivising young people to collect plastic bottles. These bottles are crushed to create colourful bicycle frames which are fitted with Dr Mobley’s patented technology and environmental sensors. “In this way, young, inner-city people get access to bicycles in exchange for cleaning up plastic waste, and the data from the sensors helps parents and local government securely track the location of the bikes, along with CO2 levels, speeds and the state of local roads,” says Dr Mobley.

But the asset could also be an autonomous car. “If we think about the Artificial Intelligence (AI) that we’re training, like those in autonomous cars, we assume that we’re operating that car in a benign environment, but this isn’t necessarily true. There are people who are also training AI to do bad things, who may get hold of an unsecured device with the intention of altering its base programming. Dangerous situations like these need to be avoided.

“For the AIs in autonomous cars to function correctly, they must upload the information they’ve gathered during the day into a database to be shared with other AI cars. This is then rolled back out overnight so that all the cars can benefit from what that one AI has learnt that day, allowing a car that is operational in sunny California to learn how to drive in snowy weather conditions from one that’s operational in Austria, for example. It knows because the other cars know!

“But this key process of AI training is open to abuse from malicious actors if the data exchange is not secured. For these systems to be safe and reliable, their integrity is critical. We need to be able to trust these systems and know that the updates we are getting from other AI systems and environmental sensors are also trustworthy.
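The overnight cycle Dr Mobley describes, each car uploading what it learnt and the merged result being rolled back out to the whole fleet, resembles what machine-learning practitioners call federated averaging. The sketch below is purely illustrative: the names (`fleet_average`, the parameter dictionaries) and the simple averaging step are assumptions for clarity, not Blueskytec’s or any carmaker’s actual system.

```python
# Minimal sketch of the overnight update cycle: each car uploads its
# daily learning as a set of parameters, the fleet averages them, and
# the merged model is distributed back to every car.

def fleet_average(updates):
    """Average per-parameter updates uploaded by each car."""
    n = len(updates)
    return {key: sum(u[key] for u in updates) / n for key in updates[0]}

# Each car's daily learning, reduced here to toy parameter dictionaries.
california_car = {"snow_handling": 0.1, "lane_keeping": 0.9}
austria_car = {"snow_handling": 0.8, "lane_keeping": 0.7}

merged = fleet_average([california_car, austria_car])
# After the overnight merge, the California car benefits from the
# Austrian car's snow experience -- "it knows because the other cars know".
```

The security point in the article follows directly from this structure: if an attacker can inject a poisoned update into the pool being averaged, every car in the fleet inherits it, which is why the exchange itself must be trusted.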
“Training AI algorithms on untrusted or infected data could cause them to make bad decisions, which would be catastrophic in the case of a car. And if humans start to see that autonomous cars are showing signs of not behaving properly, then they will cease to trust them,” says Dr Mobley. “By using a ‘trust anchor’, an autonomous car can drive off round the countryside and we can digitally ask it questions. If the right answers come back, then we know we are speaking to the right car and can exchange information we can trust. Equally, that car will know not to respond if it gets questions from a device that is asking it the wrong questions. This simple exchange methodology can be used to create trust,” says Mobley.

Securing data between physical industrial systems and IT systems is fundamentally different. “When people think about cyber security, they are often thinking about securing just IT systems, where confidentiality, integrity and availability (CIA) of the system are what matters most to the user,” says Mobley. “But with digital twins and industrial instrumentation systems and sensors, where data interacts with humans, this paradigm is flipped and the most important factor is that the device and the data are safe to use.

“Blueskytec’s digital anchors are manufactured in the UK through a trusted supply chain and ensure that IoT/IIoT devices talk to their endpoint server first, before talking to the cloud. This guarantees data security and addresses the trusted systems paradox.

“As we move towards autonomy and hyper-connectivity and embrace AI at scale in our future cities, we must elevate our thinking about the security and integrity of the systems we are developing. We must understand the social implications and work to build people’s trust.
“Without a solid foundation that encompasses an individual’s, company’s or society’s Security, Privacy, Safety and Ethics, these latest advances and applications in IoT/IIoT technologies will be significantly undermined and will result in the proliferation of new threats and vulnerabilities.”
