Zoom is not the worst, just getting the attention software deserves

The rise of scrutiny of Zoom is welcome evidence that privacy and security are valued and essential as our lives and interactions become increasingly virtual.

Key points
  • Zoom already had security challenges before the Coronavirus-caused lockdowns.
  • Massive adoption of Zoom led to more scrutiny and exposure of privacy and security issues.
  • Any software under the same scrutiny and without a strong focus on these features would have shown similar failures.
  • Communications apps and services deserve this level of scrutiny.
  • Investors, boards, and now customers must demand more from the entire industry.

A few weeks ago, its name would probably have been unknown to you. Amid the Covid-19 crisis and the lockdown it caused, Zoom has suddenly become the go-to tool for video chat and conference calling, whether it’s a business meeting, a drink with friends, or a much-needed moment with your family. This intense rise in use has been financially good for the company, but it has also come with a hefty toll on its image and serious scrutiny of its privacy and security practices.

While Zoom already had a history of security failures (one of which forced Apple to take global action to keep its customers from remaining highly vulnerable), that history is nothing compared to what the company is going through right now. From security vulnerabilities to privacy issues to questionable ethical decisions, the quantity and seriousness of the problems exposed in the past few weeks have rarely been matched.

Issues revealed about Zoom in the past 4 weeks:

  • Privacy policy that allowed data collected during meetings to be sold
  • iOS app using Facebook’s SDK and sending data to Facebook (an issue PI has highlighted previously on other apps)
  • Attendee tracking feature (flagged and removed)
  • macOS install process was opaque about the modules it initialised (fixed)
  • Zoom 0-day allowing remote execution on Windows (fixed)
  • Data mining to display LinkedIn profile of users (feature disabled)
  • Data leak of thousands of email addresses
  • Easy-to-guess meeting IDs (following a predictable pattern), leading to Zoombombing
  • Claiming to use end-to-end encryption when it did not
  • Use of poorly implemented home-made encryption
  • Questionable call routing policies, for example via China
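
To see why the easy-to-guess meeting IDs in the list above matter, here is a minimal back-of-the-envelope sketch in Python. The numbers are assumptions chosen for illustration, not Zoom's real figures; the point is how cheap "war-dialling" a short numeric ID space is compared with guessing a high-entropy token:

```python
import secrets

# Illustrative sketch of why short numeric meeting IDs invite "war-dialling".
# All numbers below are assumptions for illustration, not Zoom's real figures.
ID_SPACE = 10 ** 9          # a 9-digit numeric meeting ID space
ACTIVE_MEETINGS = 200_000   # assumed number of meetings live at any moment

# Probability that a single random guess lands on a live meeting:
hit_prob = ACTIVE_MEETINGS / ID_SPACE

# Expected number of guesses before the first hit (geometric distribution):
expected_guesses = 1 / hit_prob

print(f"hit probability per guess: {hit_prob}")                            # 0.0002
print(f"expected guesses to find a live meeting: {expected_guesses:.0f}")  # 5000

# By contrast, a random 128-bit token makes enumeration hopeless:
unguessable_id = secrets.token_urlsafe(16)  # ~128 bits of entropy
print(f"example high-entropy meeting ID: {unguessable_id}")
```

A few thousand guesses is trivial for an automated scanner, which is exactly what made Zoombombing practical; a long random identifier closes that door entirely.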

A long list of flaws… but not a surprising one

Yet, is this enough to say that Zoom is terrible software and that you, your company or your community should absolutely stay away from it? The answer is not that obvious.

Zoom certainly displayed questionable ethical choices and poor security practices. Most of the issues that have been reported highlight how the company focused on making easy-to-use software and dedicated few resources to making it secure or privacy-friendly (1). But the reality is that almost any software that hasn't paid crucial attention to privacy and security before being massively adopted would have run into similar issues.
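
The "home-made encryption" flagged above refers to reports (notably from Citizen Lab) that Zoom encrypted media with AES in ECB mode. The toy sketch below uses a hypothetical hash-based stand-in rather than a real cipher, purely to illustrate ECB's defining weakness: every block is encrypted independently, so identical plaintext blocks always produce identical ciphertext blocks, and patterns in the data leak straight through:

```python
import hashlib

# Toy illustration of ECB mode's defining flaw. The "cipher" below is a
# hash-based stand-in (not a real, decryptable cipher): what matters is that,
# as in ECB, every block is encrypted independently with the same key.
BLOCK = 16

def toy_ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    out = b""
    for i in range(0, len(plaintext), BLOCK):
        chunk = plaintext[i:i + BLOCK].ljust(BLOCK, b"\0")
        # Each output block depends only on (key, block) -- no chaining, no nonce.
        out += hashlib.sha256(key + chunk).digest()[:BLOCK]
    return out

message = b"ATTACK AT DAWN!!" * 2           # two identical 16-byte blocks
ciphertext = toy_ecb_encrypt(b"secret key", message)

# Identical plaintext blocks yield identical ciphertext blocks,
# so structure in the plaintext survives encryption:
print(ciphertext[:BLOCK] == ciphertext[BLOCK:2 * BLOCK])  # True
```

This is why modern designs use chained or counter-based modes with unique nonces, where repeating plaintext never repeats in the ciphertext.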

Most private software is developed for a specific audience with a specific goal. And while security might be a key selling point in some contexts, it is not always the case. The same goes for privacy, which is too often treated as a niche concern.

Houseparty page on the App Store with screenshots of the app
Houseparty, another newly successful communication app with a questionable privacy policy

While Zoom reportedly had issues that would have been critical for any software, chances are that any videoconference system which suddenly multiplies its user base by a factor of 20 (Zoom went from 10 to 200 million users in 3 months) would have had flaws of its own revealed. It’s also important to flag that there is a great variety of use cases, from online classes to talking to your lawyer, and that one-size-fits-all solutions don’t exist.

Software isn’t the problem, our approach to developing it is

So what’s wrong with Zoom then? How can a company which already had millions of users fail on so many levels? To put it simply, Zoom was developed as business-to-business software with the wrong priorities in mind, and the attention it is now receiving is revealing poor development choices.

Prioritising frictionless use over privacy and security might seem a good idea when trying to enter a market and win new customers. In itself, focusing on ease of use is not bad, but it should never be done at the expense of security and privacy.

Considerable financial investment has gone into Zoom, and it is alarming that neither the board nor investors brought these issues to the fore sooner. We hope that investment will encourage Zoom to make sound security and privacy choices in the future.

Private software development is also guided by the rapid release of features and bug fixes, to maintain user satisfaction and maximise profits. These were likely additional reasons to overlook security and privacy. Businesses often value user-facing features more than core security design or company ethos, making it tempting for companies to neglect these fundamentals.

Privacy and security: core supporters of our humanity

In today’s landscape, privacy and security cannot be considered optional. Today everyone is using services like Zoom: governments, schools, medical practices, law firms, journalists, families, friends, and justice and rights groups like our own. No trustworthy service should place these people at risk. Somehow, UK Government Cabinet meetings still occur over Zoom despite concerns from the UK’s Ministry of Defence, even as schools in New York and Singapore have moved away from the platform. Google and SpaceX staff are no longer allowed to use it, and neither is the German foreign ministry.

Boris Johnson, UK Prime Minister, in front of a screen with Zoom open
Boris Johnson, UK Prime Minister, using Zoom for a Cabinet meeting

All these problems could have and should have been caught earlier. In the EU, under GDPR, companies have to produce a Data Protection Impact Assessment (DPIA) to justify and review their data collection, processing and protection practices. Although Zoom is an American company, its user base is international (including European citizens), which makes these sorts of assessments necessary.

On the security side, a business-oriented company that provides communication channels should place a strong focus on security and undergo regular security audits. When your customers include high-value organisations that are likely targets of foreign states and other actors, good security should be a fundamental promise. Some governments have already acknowledged this, and mass adoption of business-ready software should not be a problem if proper attention is given to security and privacy.

These things should not be wishful thinking: they are critical elements in creating strong and reliable communications technology that protects users. In Zoom’s case, they should matter even more given its initial focus on businesses. Now under fire, Zoom has fixed and improved some critical issues, but there is still a lot to do to make the software secure by default, starting with changes to the default settings and more transparency about the software’s real capabilities.

There is a lesson for all of us here: if this crisis is an opportunity to develop or improve our communication technologies, both security and privacy must be given the attention they deserve – that we deserve. The past weeks have highlighted how vital these technologies can become at times when contact with each other is paramount to our mental health. If anything, Zoom’s case has made it more obvious that security and privacy are not optional but what people truly need. They are the fundamental layers that give us the peace of mind to reach out to others and let our humanity express itself.

Footnotes

(1): To their credit, Zoom has been tackling many of the issues spotted in a fairly transparent and responsive way. It had to scale up quickly, which came at a price: managing a surge in the number of users, as well as in the variety of uses, forced it to adapt its model.