What Do Iran’s Theocrats and Tesla Vehicles Have in Common? The Use of High-Tech Surveillance Cameras
Digital technology is a double-edged sword. On the one hand, it makes vital tools like VPNs possible, providing strong and affordable privacy protection for personal data in transit across the Internet. On the other hand, the increasing affordability of digital technology means it can be deployed routinely, anywhere, and with little effort, potentially causing serious harm to privacy.
Two recent, highly contrasting incidents show how governments and companies around the world are building a surveillance society before our eyes. In the Middle East, Iran is using surveillance tech to enforce the hijab law. Meanwhile, in the United States, Tesla's surveillance has become a problem of its own, with the company's vehicle cameras shown to capture highly personal footage both inside and outside its cars.
The incidents are related in a deeper way than just the use of cameras.
Surveillance Will Soon Be Everywhere
Iran’s theocratic government has been one of the most repressive in the world for decades, but its crackdown on Iranians has intensified in the wake of the death of Mahsa Amini at the hands of the Iranian police in 2022. One of the most potent symbols of the subsequent protests has been a refusal by many Iranian women to wear the traditional hijab head covering. Now the Iranian authorities are using surveillance technology in an attempt to enforce the hijab rules:
Iranian authorities have begun installing cameras in public places to identify unveiled women, the police have announced.
Women seen not covering their hair would receive “warning text messages as to the consequences”, police said.
BBC News explains that the system uses so-called “smart” cameras and other tools to identify and send “documents and warning messages to the violators of the hijab law”. As the PIA blog has reported, such “smart” cameras are now a well-established technology. They are also relatively cheap, which means they can be rolled out on a massive scale, as is also happening in the West, for example in London.
What is significant here is that a regime hardly known for its embrace of advanced technology is turning to digital surveillance to control its people.
The other example of pervasive surveillance could hardly be more different. It concerns the electric vehicle company Tesla, often regarded as a technology leader. Its vehicles feature multiple cameras on both the outside and inside of the vehicle. According to a new report from Reuters, highly personal footage from these cameras was shared among Tesla employees:
Some of the recordings caught Tesla customers in embarrassing situations. One ex-employee described a video of a man approaching a vehicle completely naked.
Also shared: crashes and road-rage incidents. One crash video in 2021 showed a Tesla driving at high speed in a residential area hitting a child riding a bike, according to another ex-employee. The child flew in one direction, the bike in another. The video spread around a Tesla office in San Mateo, California, via private one-on-one chats, “like wildfire,” the ex-employee said.
Other images were more mundane, such as pictures of dogs and funny road signs that employees made into memes by embellishing them with amusing captions or commentary, before posting them in private group chats. While some postings were only shared between two employees, others could be seen by scores of them, according to several ex-employees.
Not Everything Smart Is Better
I’ve been warning on this very blog since 2019 about the risks to privacy that today’s “smart” vehicles represent. The revelations about the abuse of Tesla’s surveillance systems add another element. And it arises from the fact that video cameras are now so cheap that they can be added at multiple points on a vehicle to help with driving and security.
Such cameras are general purpose; they can be installed anywhere, and in any device, without significantly adding to the overall size or cost. The roll-out of 5G technology means that cameras can even be installed in the most unlikely and remote locations with the help of wireless connections. As a result, low-cost but powerful video cameras will soon be present in the everyday digital devices that surround us.
The Tesla surveillance issue shows that any camera could become a surveillance system capturing intimate moments from our lives. The Reuters article underlines another key element in the story, one that will become increasingly important:
The sharing of sensitive videos illustrates one of the less-noted features of artificial intelligence systems: They often require armies of human beings to help train machines to learn automated tasks such as driving.
Since about 2016, Tesla has employed hundreds of people in Africa and later the United States to label images to help its cars learn how to recognize pedestrians, street signs, construction vehicles, garage doors and other objects encountered on the road or at customers’ houses. To accomplish that, data labelers were given access to thousands of videos or images recorded by car cameras that they would view and identify objects.
What Happens When AI No Longer Needs Training?
It is not just text that today’s AI systems are ingesting as part of their training data, with all the privacy dangers that implies. Increasingly, they are able to work with videos too, as Tesla has been doing for some years. The fact that such videos, often containing images of identifiable individuals, are readily available for this kind of training is problematic.
Abuses by employees with access to sensitive footage will always be a risk, because sharing these kinds of videos is human nature, as the dominant forms of social media demonstrate every second of the day.
But there is a second, more subtle threat posed by the use of real-life videos for machine learning. The whole point is to train AI systems to recognize humans and the things around them. That means today’s advanced AI systems will be extracting from videos all kinds of highly personal information about what we do, where, and with whom. That data will then become part of the huge and completely inscrutable generative AI systems being developed today.
Who knows what unexpected information may emerge when these essentially uncontrolled systems are interrogated by users, perhaps innocently, perhaps with malign intentions? We will probably find out soon enough.
Featured image by Darafsh.