Avast Team, Jul 23, 2020

A leading researcher in the field of online anonymity, Roger was recognized as one of the ‘Top 100 Global Thinkers’ by Foreign Policy magazine and was previously named one of MIT Technology Review’s ‘Top 35 Innovators Under 35’. Ahead of his appearance at CyberSec&AI Connected this October, we caught up with Roger to discuss his presentation, as well as a range of privacy and security topics. 

Your presentation for CyberSec&AI Connected is entitled “Surfing the Web Securely and Privately, Using Tor Anonymizing Network”. Could you give us any insight into the kinds of things you’ll be addressing?

I want to give people a better intuition about tracking, surveillance, and censorship online, as part of explaining why Tor’s “distributed trust” and transparency are important building blocks for strong privacy. Tor is a free-software anonymizing network that helps people around the world use the internet in safety. Tor’s 7,500 volunteer relays carry traffic for millions of daily users, including ordinary citizens who want protection from identity theft and prying corporations, corporations who want to look at a competitor’s website in private, people around the world whose internet connections are censored, and even governments and law enforcement.

One example topic is the “private browsing” and “incognito” modes in today’s mainstream browsers. Research by DuckDuckGo shows that people think these browser modes protect them from snooping by their ISP or by the websites they visit, when actually that’s not what they do at all. So we’ll cover topics like decentralized “privacy by design” compared to centralized “privacy by promise”.

I plan to talk about online surveillance and how to avoid it, and about the difference between network-level security (“where your internet traffic goes”) and application-level security (e.g. “what your browser gives away about you”). I’ll also bring in internet censorship: how it isn’t just a problem for far-away countries, and how the technical mechanics of surveillance and censorship are more similar than people realize.

This year’s conference will examine critical issues around AI for privacy and security. What aspects of this theme are you looking forward to discussing with other speakers and panelists at the conference?

There are so many! I will pick two to highlight here:

(1) Privacy by design (decentralization): Tor is designed so that no single piece of the system is in a position to learn what users do on the Tor network. This “distributed trust” architecture is fundamentally safer than a centralized privacy-by-promise approach, where somebody has all the data but they promise not to use it, sell it, or otherwise lose control of it.

(2) Transparency and openness: all of the Tor software is open source (also known as free software). But just being open source isn’t enough: we also publish specifications that describe *what* we think the software does, plus design documents that describe *why* we built it that way. In addition, we publicly identify ourselves as developers, and do presentations and training around the world, in order to build community trust. Some people are surprised because they think privacy and transparency are opposites, but there is no contradiction: privacy is about choice. We choose to be transparent in order to build strong privacy tools and have a strong global community.

One of the critical tech developments we are seeing around the COVID-19 pandemic has been the use of tracing apps, whether on people’s phones or in physical wearable tags, to help monitor the spread of the virus and recognize patterns of infection. This obviously has both positive and potentially negative consequences in relation to AI, cybersecurity, and data. Is the speed at which these solutions are being pushed out a major concern?

I’m always worried when big companies see another chance to push their own agenda on us, yes. Done poorly, this is yet another opportunity for companies and governments to build comprehensive databases about our friends, our social graphs, our movement habits. Over and over the trend is that some crisis emerges, and some group uses it as an excuse to try to strengthen their controls over society, whether it’s for political gain or financial gain (if you can even draw a line between the two anymore).

The part where I’m optimistic here is that several groups of scientists have shown that we can build privacy-preserving COVID tracing apps, and they’re actually getting adopted in some European countries. The intuition behind these designs is that, rather than having every phone tell the central database everywhere it has been, instead, each phone broadcasts a sequence of random numbers. If you later test positive, you can publish the random numbers your phone saw so then everyone else can privately, on their own phone, check to see if any of the numbers match the ones their phone sent out.
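The exchange Roger describes can be sketched in a few lines of Python. This is a toy illustration of the decentralized idea only, not the protocol of any specific app; the `Phone` class and its method names are invented for the example:

```python
import secrets

class Phone:
    """Toy model of a phone in a decentralized exposure-notification scheme."""

    def __init__(self):
        self.sent = []   # random ephemeral IDs this phone has broadcast
        self.heard = []  # ephemeral IDs received from nearby phones

    def broadcast(self):
        # Each beacon is a fresh random value, unlinkable to the phone's owner.
        eph_id = secrets.token_hex(16)
        self.sent.append(eph_id)
        return eph_id

    def receive(self, eph_id):
        self.heard.append(eph_id)

    def report_positive(self):
        # On a positive test, the user publishes only the random numbers
        # their phone saw -- never a location history or identity.
        return list(self.heard)

    def check_exposure(self, published_ids):
        # Matching happens locally, on the user's own phone; no central
        # database ever learns who met whom.
        return bool(set(self.sent) & set(published_ids))

# Two phones meet and exchange beacons.
alice, bob = Phone(), Phone()
alice.receive(bob.broadcast())
bob.receive(alice.broadcast())

# Bob tests positive and publishes the random numbers his phone saw.
published = bob.report_positive()
print(alice.check_exposure(published))  # True: one of Alice's beacons is in the list
```

The key privacy property is that the published list contains only unlinkable random values, so the matching step reveals an exposure to the exposed person alone.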

The part where I’m pessimistic is that, especially in the US, our government and law enforcement have a history of not actually being trustworthy (that is, worthy of our trust) when it comes to “protecting and serving” us. Whether it’s undercover infiltration into the Black Lives Matter movements, or illegal overcollection of phone and internet records, we’ve seen too many examples where they use their power to get and keep more power rather than to keep us safe.

It’s actually because of that lack of trust that I don’t expect the COVID tracing apps to see much adoption in the US. We already know that nothing good is going to come from answering questions about our movement patterns and associations.

In June, I moderated an online panel of experts looking at the intersection of COVID tracing apps and the international protests against systemic racism. You can watch that panel here:


What are some of the more recent trends and developments around privacy that have caught your eye?

I think the continued trend toward end-to-end encryption is critical for future society to be safe. We made a giant leap ten years ago when we succeeded at getting the world to think of https as an ordinary security layer that everybody should expect, rather than thinking of web encryption as something scary that only bad people would want. But we’re still fighting that same fight today, with Signal and WhatsApp trying to offer safety to their users while governments drum up fear about how civilization will collapse if they can’t decrypt anything and everything.

The reality is that giving people real encryption makes society safer, not less safe. That’s because taking encryption away from the masses hurts ordinary users without slowing down the bad people. Security and privacy go together; they’re not opposites.

We’re seeing this same fight on the horizon for Tor onion services, which are a way for Tor users to safely communicate with each other. Onion services provide end-to-end encryption (so it’s hard to break through the encryption and read what they’re saying), but also they provide another critical security property, which is that they protect the communications metadata (so it’s hard to learn who is talking to whom).

Today, websites like Facebook, the BBC, and the New York Times use onion services to offer safer ways for Tor users to reach their websites. Projects like SecureDrop and Globaleaks use onion services to safely connect whistleblowers to journalists. And a new generation of instant messaging apps aim to provide not just end-to-end encryption but also protection for the social graphs and communication patterns of users.

As onion services gain in popularity though, we’re seeing the same cycle repeat itself — where governments try to regain their ability to undermine safe communication tools by spreading fear, by trying to paint them as fringe rather than mainstream, and by trying to pass laws that don’t seem to take into account either mathematics or the fact that other countries exist. It’s great that large companies like Apple have chosen the correct side of the fight for strong encryption, but we as a society have a lot more work ahead of us.

Do you think the general public and/or businesses are far more educated around privacy than before? Or, for most people, is it still too abstract a concept to worry about, with the convenience of digital communications outweighing any potential concerns?

When we first started working on Tor in the early 2000s, one of the challenges was getting people to care about privacy. But after the Snowden revelations and the Cambridge Analytica scandal, and more recently the outrage about Silicon Valley pushing facial recognition technologies, I think we’ve turned a corner.

The general public now understands that privacy is about who you are as a human being: it’s all the data that defines you, and how dangerous that can be if abused.

The challenge now is to build real solutions, that scale to help everybody around the world, that protect us from both over-reaching companies and over-reaching governments, and that align with user incentives and user flows so they aren’t fighting against what users are trying to do. Technology alone cannot solve this problem: we need social and political change too.

Due to current world events, this year’s conference will be done a bit differently. CyberSec&AI will be going virtual, connecting attendees wherever they are in the world. What excites you about this format and the opportunities it brings?

One of the really compelling opportunities for moving conferences online is that we can include more people from parts of the world who wouldn’t otherwise be able to make it to the physical conference.

Some of the most fascinating and memorable conversations I’ve had have been with activists doing key work at home in Hong Kong, South Africa, the Philippines, Colombia, Taiwan, and elsewhere. There are so many more people out there with amazing stories about the challenges and successes they see in their world, and bringing conferences online gives us the opportunity to bring more diverse perspectives to every discussion.

Secure your place at CyberSec&AI Connected and visit our booking page now to take advantage of our Summer Rate or 3 for 2 access offer.

This article features

Roger Dingledine

President, Researcher, and Co-founder, Tor Project