Additional Resources for Ethical Explorers

By Kacie Harold, Omidyar Network

We hope you’ve enjoyed playing with the Ethical Explorer Pack. If you’ve gotten excited about moving from conversation to action in building safer, healthier, more responsible technology, here are some great resources.

If you’re still making the case:

Ledger of Harms: A broad assortment of compelling and current studies and articles compiled by the Center for Humane Technology that show the clear effects of harmful products and features, documented by relatively unbiased researchers and writers. This may be a helpful resource for Ethical Explorers who need data and stories to make a case for prioritizing responsible design.

Parable of the Polygons: A quick interactive game by Vi Hart and Nicky Case about how our history can lead to both biased data and a biased present. This may be useful for Ethical Explorers leading conversations about bias with their teams.

Ethical Litmus Test: A deck of 66 short questions and activities by Debias AI to help build ethics capacity and vocabulary, both as individuals and as part of a team. The website includes links to reading groups and toolkits focused on bias in machine learning.

Classes you can take:

Data Science Ethics: Learn how to think through the ethics surrounding privacy, data sharing, and algorithmic decision-making. A course from the University of Michigan on edX. Twelve hours over four weeks; free.

Ethics, Technology and Engineering: Focuses on the concrete moral problems that engineers encounter in their professional practice. An 18-hour course offered by Eindhoven University of Technology on Coursera; a certificate is available.

Philosophy of Technology: Learn about the impact of technology on society, and explore the philosophy of technology and mediation theory with a focus on design. A course by the University of Twente hosted on FutureLearn. Three weeks, four hours per week; free.

Ethics of Technological Disruption: Popular class from Stanford University with guest speakers from top tech companies. Six two-hour video lectures available on YouTube.

HmntyCntrd: Interactive, cohort-based online course and community for UX professionals who want to learn how to design and advocate for equitable and inclusive user experiences. Created by UX Researcher and Humanity in Tech Advocate Vivianne Castillo.

Responsible design practices to try:

Design Ethically Toolkit: A toolkit for design strategists and product designers with several 30-minute to one-hour small-group exercises to help teams evaluate the ethical implications of product ideas, think through the consequences of unintended user behaviors, and create checklists of ethical issues to monitor after shipping a product. Created by Kat Zhou, Product Designer at Spotify.

Judgment Call: Team-based game for cultivating stakeholder empathy through scenario-imagining. Participants write product reviews from the perspective of a particular stakeholder, describing the kinds of impact and harm the technology could produce from that point of view. Created by Microsoft’s Office of Responsible AI.

Harms Modeling: Framework for product teams, grounded in four core pillars that examine how people’s lives can be negatively impacted by technology: injuries, denial of consequential services, infringement on human rights, and erosion of democratic and societal structures. Similar to security threat modeling. Created by Microsoft’s Office of Responsible AI.

Community Jury: An adaptation of the citizen jury, a technique in which diverse stakeholders impacted by a technology are given an opportunity to learn about a project, deliberate together, and give feedback on use cases and product design. It helps project teams understand the perceptions and concerns of impacted stakeholders. Created by Microsoft’s Office of Responsible AI.

Consequence Scanning: Lightweight agile practice for the vision, roadmap-planning, and iteration stages of product or feature development, focused on identifying the potential positive and negative consequences of a new technology. Developed by Doteveryone.

Tools to help you manage Tech Risk Zones:

Addiction:
Calm Design Quiz: Set of scorecards to evaluate whether UX is optimized for healthy user engagement. Created by Amber Case, author of Calm Technology: Principles and Patterns for Non-Intrusive Design.

Algorithmic Bias:
AI Blindspot: A discovery process for spotting unconscious biases and structural inequalities in AI systems, from the MIT Media Lab and Harvard University’s Berkman Klein Center for Internet & Society Assembly program. Includes resources on considering ethics when determining performance metrics, assessing security risks, and setting goals for your AI system.

People + AI Guidebook: Developed using data and insights from Google product teams, experts, and academics to help UX professionals and product managers take a human-centered approach to AI. Includes guidance and worksheets on six topics, including trust and explainability, designing feedback mechanisms, and identifying errors and failures.

Data Control:
Data Ethics Canvas: Framework from the Open Data Institute to help anyone who collects, shares, or uses data identify and manage data ethics issues.

Exclusion:
Mismatch: Collection of inclusive design resources for making products accessible to users with disabilities. Includes links to accessibility checklists and tools, classes, and stories about inclusivity driving design innovation. Created by Kat Holmes, inclusive UX and Product Design expert and author of Mismatch: How Inclusion Shapes Design.

Universal Barriers: Framework for evaluating where an existing or changing service might exclude users. Created by the United Kingdom’s Government Digital Service.

Surveillance:
Digital Security and Privacy Protection UX Checklist: Suggestions for promoting privacy when designing and developing tools for targeted communities.

Discuss your values with funders and partners:

Conscious Scaling: Framework for dialogue between founders and investors or board members, focused on identifying and mitigating the long-term risks a business model or technology poses to society, the environment, and all stakeholders. Created by Atomico, whose angel program includes Omidyar Network’s Sarah Drinkwater.

Ethical Intake Framework: Open source framework to assess mission and values alignment when evaluating potential partners, funders, investees or projects. Created by Partners & Partners.


Want to tell us what you think of Ethical Explorer, and how you used it? Email us at [email protected].

And don’t forget to show your support for responsible tech by using the #EthicalExplorer hashtag!
