In the past six months, we have had quite a few chances to visit the US and meet new clients in Washington (WA). In our meetings, we talked about consumer privacy and the role of technology in data protection. The General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), which comes into effect on January 1, 2020, have indeed raised many questions. With the anticipated further fragmentation of privacy regulations, organisations of all sizes have been struggling to find answers in their own business context.

In October, we took the opportunity to sponsor the IT Summit in Denver and meet like-minded professionals in the information security and privacy domains, something we are more than passionate about. And we brought the topic to the stage.

We talked about:

  • The challenges that regulations pose to organisations;
  • GDPR and CCPA – how the two regulations differ and what traits they share;
  • Upcoming privacy regulations such as Mind Your Own Business and ePrivacy;
  • The personal side of privacy;
  • Artificial Intelligence and its potential impact.

Fragmentation of the regulatory landscape – GDPR, CCPA, MYOB

Marko Simeonov, AMATAS’ Legal and Compliance Officer, focused on cybersecurity and the regulatory landscape. Not surprisingly, the World Economic Forum named cyberattacks among the largest global threats in 2019. By 2021, damages caused by cybercrime are expected to reach $6 trillion, and the average cost of a data breach is $3.68 million, as calculated by the Ponemon Institute LLC.


GDPR and CCPA are the regulatory countermove. The requirements they set aim to guide organisations in adopting best practices and in protecting the rights and data of consumers. Here is a brief comparison between the two.


As evident, GDPR and CCPA are similar in their main intentions, but each regulation is adapted to its local context and legal environment. Further, they are not a recipe organisations are expected to follow. They define the legal and technical frame within which organisations must fit, and leave the technological, procedural and business decisions to the organisations themselves. In our experience, this results in misalignment between business strategy, on the one hand, and technology adoption, on the other. In many cases, we have seen negative effects on marketing and sales departments within businesses. Misguided, they lean toward one of two extremes: severely limiting their activities or breaking the law. In either case, the business is at risk.

The ePrivacy Regulation, which addresses online tracking and is expected to set even stricter limits on direct marketing activities, will likely soon join GDPR and CCPA on stage. Meanwhile, “Mind Your Own Business”, the privacy bill introduced by US Senator Wyden in October 2019, demonstrates the true magnitude of privacy as an issue. Although the bill targets mostly the largest technology companies, its effects will likely cascade down to even the smallest players on the market. We have yet to see a US federal data protection law.


The technological and personal perspective on cybersecurity and privacy

When talking about privacy, we often forget about its implications for our personal lives. In his presentation, Boris Goncharov, VP of Strategy at AMATAS, took a more personal, even philosophical, perspective: technological advancements have enabled experiences that, until recently, we had not even dreamed about. Yet, as some have been warning us, these very advancements pave the road to technological singularity.

The first step is Hyperreality: the point at which technology makes it impossible for us to distinguish reality from a simulation of reality. Virtual Reality (VR) and Mixed Reality (MR) have already let us not just learn about the laws of physics and the processes in the human body, but experience them with our minds and senses.

Rather intentionally, we have been gradually building relationships with machines, eagerly making them smarter and letting them relate to our emotions. Because we do not want just to have them by our side. We want them to be like us: to feel our emotions, to understand our thoughts and to respond to them. We call this intelligence artificial, but our ultimate goal is to see it act naturally.

Biochemical profiling has now enabled this: analysis of human behaviour with a precision and clarity well beyond the capabilities of most humans. It is analogue interfaces, with their inherently low bandwidth, that prevent us from enjoying the full capacity of our relationship with machines. We still rely on keyboards and our voice to input data, and on our eyes and ears to catch the response.

A direct interface connecting our consciousness with machines will overcome these limitations. It will make our thoughts and feelings, the mental images we create to understand the world, readable and modifiable, even easy to create.

Driven by our intrinsic curiosity and our desire to create things and breathe life and meaning into them, we will open ourselves to Artificial Intelligence, revealing our most intimate feelings and thoughts. And as it starts learning all by itself, unrestricted by our models and algorithms, we can only guess what will follow.

In the end, together we may find the answers to the most plaguing challenges we have ever faced. But there is another possibility: to see the world go in a direction no one expects or understands.

Curiosity has taken us a long way. At every step along the way, security and protection have been a low priority. This time we should ask ourselves a simple question: do we really know how and why artificial intelligence, in any of its forms, thinks the way it does? We must build the capabilities to understand that, by letting it explain its decisions or by auditing its processes. For if we fail, we risk losing a battle we started as an exploration, driven by our intrinsic desire to experiment, build and make things easier. That would be a shame for all humanity.