Exploring the Tech Risk Zones: Surveillance

By Kacie Harold, Omidyar Network 

Matt Mitchell is a hacker and Tech Fellow at The Ford Foundation, working with the BUILD and Technology and Society teams to develop digital security strategy, technical assistance offerings, and safety and security measures for the foundation’s grantee partners. Matt has also worked as the Director of Digital Safety & Privacy at Tactical Tech, and he founded CryptoHarlem, which teaches basic cryptography tools to the predominantly African American community in upper Manhattan.

Matt, why should small and midsize tech companies want to address issues of surveillance and think about data privacy and security for their users?

I recently spoke with the founders of a blockchain cryptocurrency social media startup that values “humans first”. Privacy came up briefly in the conversation. As a small team going through their first round of funding, they are motivated to build quickly, get people to use the product, and then find a way to monetize it. I suggested they create a transparency report and a plain-language privacy policy, because these would give them a competitive advantage, and that speaks to the motivations of that team. When you are building a new product, existing companies and competitors might not have these things, so focusing on privacy is really easy, low-hanging fruit when it comes to feature development. You can go a long way toward earning the trust of your users and building engagement when people know that using your product isn’t going to compromise their security in the future.

Are there any common surveillance-related problems companies run into when they build new products or features?

When you’re making a product, there’s a temptation to gather as much data as possible, because in the worst-case scenario that data becomes an asset: maybe you’re VC-funded and burning through your seed funding. The money you have to play with every month is going down and you’re not really meeting your KPIs, but you do know your users. If you reach a place where you may have to lose some staff, it can be tempting to sell user information, or what you know about user behavior.

Monetizing user data usually seems like a good idea at the time. But it always turns out to be something that hurts you, because it hurts your relationship with your users. When your users can’t trust you anymore, they reduce you to the lowest part of what you provide. You are no longer delighting them, they lose the reason they’re there, and it becomes easy for someone to replace you.

You may be approached by a company that is interested in just a small part of what you do, for instance, something related to user behavior. This is where you should say “no”. You are still empowered to say “no” at that moment. But as soon as you say “yes”, even if it’s just to sell a little bit of information, only to trusted partners, under certain conditions, those criteria start sliding really quickly, especially if you are not the only one making decisions or you have funders or VCs you report to. Once you make that decision, you can’t undo it. You can’t unbuild a surveillance apparatus.

Another common problem is that teams work on tight timelines, and it can be hard to find the time to make sure they are doing things right; without guidance, they don’t know when they are doing something wrong. Particularly when it comes to surveillance, people don’t have a good mental map of what counts as surveillance in their industry and in their products. Engineers aren’t thinking they want to add surveillance to something; they just want to build a tool. They don’t realize when the different elements of what they’ve built, and the data they are collecting, can be used to monitor and harm users.

What can teams do to prevent surveillance issues from creeping up on them?

I think harm reduction on a micro-intervention level is a helpful practice, because it’s just adding a few minutes into a workday that is full of loose minutes. When you’re trying to fix a broken app or a broken world, it can take years, and you won’t necessarily have any wins. This is why it is important to invest those minutes and prevent these harms.

Everyone on the team needs to be equipped with tools and information to identify and prevent surveillance-related harms. For engineers, QA (quality assurance), and the debugging team, using basic checklists on a regular basis can help prevent problems and identify moments where the team should slow down and evaluate whether a surveillance issue may be developing. Product managers and UX should create user personas that include information about how that user could be harmed if your tool were used for surveillance.
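
To make this concrete, here is a minimal sketch of what such a checklist and persona might look like if a team kept them in code alongside the product. Everything here is illustrative: the field names, checklist questions, and example persona are assumptions for the sketch, not prescriptions from the interview.

    from dataclasses import dataclass

    # Hypothetical sketch (not from the interview): a user persona that
    # carries surveillance-harm information alongside the usual fields.
    @dataclass
    class UserPersona:
        name: str
        context: str                   # who they are and how they use the product
        data_collected: list[str]      # what the product learns about them
        surveillance_harms: list[str]  # how that data could be used against them

    # Checklist questions engineers and QA might run through each sprint;
    # the items are illustrative, not a prescribed standard.
    SURVEILLANCE_CHECKLIST = [
        "Does this feature collect data beyond what it needs to function?",
        "Could the data reveal a user's location, identity, or associations?",
        "Who can access this data internally, and is that access logged?",
        "Could a buyer, government, or abusive third party misuse it?",
        "Can the user see, export, and delete what we hold about them?",
    ]

    # A persona for a high-risk user, of the kind the interview suggests.
    organizer = UserPersona(
        name="Community organizer",
        context="Coordinates local events and protests through the app",
        data_collected=["location history", "contact graph", "message metadata"],
        surveillance_harms=[
            "Location history could place them at a protest",
            "Contact graph could expose fellow organizers",
        ],
    )

Running through a list like this takes only a few minutes per feature, which is exactly the kind of micro-intervention described above.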

Finally, give your team an “emergency brake” that anyone can pull anonymously if they see an emerging harm, or something that violates the values your team or company has agreed upon. Make it clear ahead of time that if the emergency brake is pulled, the team will dedicate a sprint to fixing the issue.

What advice would you give tech builders who are just starting to think about surveillance?

Reading doesn’t seem like the first thing you want to do when you’re starting a company and focused on finding funding, hiring engineers, and building a prototype. But reading doesn’t take long, and the value it delivers in protecting you from liability, enhancing your ability to compete, and building trust with your users pays dividends.

I recommend reading about Black [people] using technology, because those use cases open up a set of harms that you can apply to almost everything. Two books I like are Dark Matters by Simone Browne, an amazing book on the surveillance of Black folks, and Algorithms of Oppression by Safiya Umoja Noble. When you know better, you can do better.

You can learn more about Matt’s work and watch his talks and security training videos on Medium, or follow him on Twitter @geminiimatt.
