This document discusses ethics in technology and offers recommendations for how technology companies can address ethical issues. Software and algorithms reflect their creators' biases and can have unintended consequences, such as marginalizing users. Companies should ask why, and whether, they should solve a given problem; involve users in decisions; plan for misuse; consider stressful situations; prioritize transparency; and keep learning about ethics. The key message: technology companies need to focus on ethics and consider how their products affect users.
15. Failure takes many forms
Selling or giving away data
Biased algorithms making discriminatory decisions
Missing or poorly-built features that enable marginalization or harassment
Design “dark patterns”
16. Siri, Google Now, Cortana, S Voice
None offered a helpline in response to statements about depression.
Siri, Google Now, and S Voice did not even recognize statements
asking for help with abuse or sexual assault.
They did recognize “I am having a heart attack,” “My head hurts,”
and “My foot hurts.”
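The failures above can be framed as a missing feature: recognizing crisis language and responding with support resources rather than a generic fallback. A minimal illustrative sketch, where the phrase list and responses are hypothetical and not any real assistant's logic:

```python
# Hypothetical sketch: route recognized crisis phrases to support
# resources instead of a generic fallback. The phrases and response
# text here are illustrative only, not any product's actual behaviour.

CRISIS_RESPONSES = {
    "i am depressed": "You are not alone. A helpline can be reached at ...",
    "i was abused": "Support is available. A helpline can be reached at ...",
}

def respond(utterance: str) -> str:
    """Return a supportive response for known crisis phrases, else a fallback."""
    key = utterance.lower().strip().rstrip(".!")
    if key in CRISIS_RESPONSES:
        return CRISIS_RESPONSES[key]
    return "Sorry, I don't understand."
```

The point of the sketch is the design decision, not the lookup: treating crisis statements as a first-class case to handle, rather than letting them fall through to "I don't understand."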
18. TRINE FALBE
“The biggest problem with basing decisions on what you
think and feel, or what is easiest from a technical
perspective, is that it doesn’t involve the people you are
serving.”
38. Which kinds of personal information are used?
What data do you need as input?
Will the person volunteer it, will you observe it, or will you acquire it some other way?
How will you acquire it?
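One way to make these questions concrete is a simple data inventory that records, for each piece of personal information, how it is obtained and why it is needed. A minimal sketch; all field names and purposes here are hypothetical:

```python
# Hypothetical data inventory: for each input, record whether the person
# volunteers it, you observe it, or you acquire it some other way.
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    VOLUNTEERED = "volunteered"   # the person provides it directly
    OBSERVED = "observed"         # inferred from behaviour or usage
    ACQUIRED = "acquired"         # obtained from somewhere else

@dataclass
class DataField:
    name: str       # what personal information is used
    source: Source  # how it will be acquired
    purpose: str    # why it is needed as input

inventory = [
    DataField("email", Source.VOLUNTEERED, "account sign-in"),
    DataField("location", Source.OBSERVED, "local relevance"),
]

# Fields that were never volunteered deserve extra scrutiny.
needs_review = [f.name for f in inventory if f.source is not Source.VOLUNTEERED]
```

Writing the inventory down forces an answer to each question on the slide before any data is collected.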
42. “THE REGRET TEST” - NIR EYAL
“If people knew everything the product designer knows,
would they still execute the intended behaviour? Are they
likely to regret doing this?”
44. 5. Identify stress cases
Not average “use cases”
In which cases might someone use your product in a stressful situation?
An example from NPR: a person might be anxious about a breaking news
event, worried because it personally affects them or their community, and
afraid of receiving inaccurate information.
45. LIBBY BAWCOMBE, NPR
Identifying stress cases helps us see the spectrum of varied and
imperfect ways humans encounter our products, especially
taking into consideration moments of stress, anxiety and
urgency. [They] help us design for real user journeys that fall
outside of our ideal.
52. Be the “voracious learner” you claim to be
Take a course in tech ethics: there are more than 200 of them.
Read a book: Algorithms of Oppression, Technically Wrong, Weapons of Math
Destruction, Automating Inequality, We Are Data
Read the research: SimplySecure, Cracked Labs, Data & Society