Mastering the demons of our own design

My talk about lessons for government from high tech algorithmic systems, given as part of the Harvard Science and Democracy lecture series on April 21, 2021. Download ppt for speaker's notes.

  1. Mastering the Demons of Our Own Design. Tim O’Reilly, Founder and CEO, O’Reilly Media (@timoreilly). Science and Democracy Lecture, Harvard, April 21, 2021
  2. Giving credit where credit is due “Stunning as such crises are, we tend to see them as inevitable…. We take comfort in ascribing the potential for fantastic losses to the forces of nature and unavoidable economic uncertainty. But that is not the case. More often than not, crises aren’t the result of sudden economic downturns or natural disasters. Virtually all mishaps over the past decades had their roots in the complex structure of the markets themselves.” Richard Bookstaber, A Demon of Our Own Design
  3. Markets are human creations Tax policy, laws, and regulations shape the economy in much the same way as the algorithmic systems at Google, Amazon, and Facebook shape their marketplaces. The market is a designed artifact, not a natural phenomenon. When Facebook’s algorithms have gone wrong, we demand that they be changed. But we throw up our hands about many self-inflicted economic wounds, as if the rules of the market were unchangeable.
  4. Stability vs. risk “An ecosystem is stable not because it is secure and protected but because it contains such diversity that some of its many types of members are bound to survive despite drastic changes.” Herbert adds, however, that the effort of civilization to create and maintain security for its individual members “necessarily creates the conditions of crisis because it fails to deal with change.”
  5. “Gradually, then suddenly” Ernest Hemingway
  6. Gradually, then suddenly Artificial Intelligence and algorithmic systems are everywhere, in new kinds of partnerships with humans
  7. We are all living and working inside a machine
  8. It’s no longer just in the digital realm
  9. An Amazon warehouse is a human-machine hybrid
  10. Our financial markets are cut from the same cloth
  11. Collective Intelligence and “Hybrid AI” “The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” - J.C.R. Licklider, Man-Computer Symbiosis, 1960
  12. Gradually, then suddenly Large segments of the economy are governed not by free markets but by centrally managed platforms
  13. Algorithms decide “who gets what – and why” By placement on the screen and algorithmic priority, Google, Amazon, and app stores shape which pages users click on and which products they decide to buy. Facebook shapes which ideas get attention. Uber and Lyft – not a free market of competing drivers – decide what to charge passengers, and thus the allocation of value between drivers and riders.
  14. Algorithms decide “who gets what – and why” A better designed marketplace can have better outcomes.
  15. Price signaling is no longer the primary coordinator
  16. Managing an algorithmic marketplace
  17. Real-Time Digital Regulatory Systems: Google search quality, social media feed organization, email spam filtering, credit card fraud detection, risk management and hedging
  18. Governance in the age of algorithms Must focus on outcomes, not on rules. Must operate at the speed and scale of the systems it is trying to regulate. Must incorporate real-time data feedback loops. Must be robust in the face of failure and hostile attacks. Must address the incentives that lead to misbehavior. Must be constantly refined to meet ever-changing conditions. (A minimal sketch of such an outcome-driven feedback loop appears after the slide list below.)
  19. It’s a hard problem Users post 7 billion pieces of content to Facebook a day. Expecting human fact checkers to catch fake news is like asking workers to build a modern city with only picks and shovels. At internet scale, we now rely increasingly on algorithms to manage what we see and believe.
  20. Algorithms have become a battleground The battle against bad actors crosses platform boundaries. Policing platforms becomes a major activity, which is also carried out by algorithmic systems.
  21. The incentives are wrong
  22. Why haven’t these problems been solved yet? Is it just that they are hard? Is it that our political system gives mixed messages about what to do? Is it that the leaders of the companies are bad people, concerned with profit above all else? Or is there something more at work?
  23. Algorithmic systems have an “objective function” Google: relevance. Facebook: engagement. Uber: passenger pickup time. Scheduling software used by McDonald’s, The Gap, or Walmart: reduce employee costs and benefits. Central banks: control inflation? Employment? Interest rates? (A toy illustration of competing objective functions appears after the slide list below.)
  24. Like the djinn of Arabian mythology, our digital djinni do exactly what we tell them to do
  25. “The art of debugging is figuring out what you really told your program to do rather than what you thought you told it to do.” Andrew Singer
  26. The runaway objective function “Even robots with a seemingly benign task could indifferently harm us. ‘Let’s say you create a self-improving A.I. to pick strawberries,’ Musk said, ‘and it gets better and better at picking strawberries and picks more and more and it is self-improving, so all it really wants to do is pick strawberries. So then it would have all the world be strawberry fields. Strawberry fields forever.’ No room for human beings.” Elon Musk, quoted in Vanity Fair https://www.vanityfair.com/news/2017/03/elon-musk-billion-dollar-crusade-to-stop-ai-space-x
  27. What is the objective function of our financial markets? “The Social Responsibility of Business Is to Increase Its Profits” Milton Friedman, 1970
  28. A system that turns idealists into monopolists
  29. The “Don’t Be Evil” Age of Internet Idealism “We want you to come to Google and quickly find what you want. Then we’re happy to send you to the other sites. In fact, that’s the point. The portal strategy tries to own all of the information…. Most portals show their own content above content elsewhere on the web. We feel that’s a conflict of interest, analogous to taking money for search results. Their search engine doesn’t necessarily provide the best results; it provides the portal’s results. Google conscientiously tries to stay away from that. We want to get you out of Google and to the right place as fast as possible. It’s a very different model.” (Larry Page, 2004) “Our goal is to be Earth’s most customer-centric company.” (Jeff Bezos, 1998)
  30. Advertising and Mixed Motives
  31. The Shift to Mobile The shift to mobile and the rise of social media were an existential threat to Google.
  32. The pressure to grow is built into the system “The relentless pressure to maintain Google’s growth, he said, had come at a heavy cost to the company’s users. Useful search results were pushed down the page to squeeze in more advertisements, and privacy was sacrificed for online tracking tools to keep tabs on what ads people were seeing.”
  33. Here’s Google Search Up Till 2010
  34. But that began to change
  35. Google search today
  36. What happened to TripAdvisor Google introduced its “Destinations” travel search feature on mobile in March 2016 and expanded it to desktop search thereafter. In March 2018, Google retired the “Don’t be evil” statement from its corporate values.
  37. Google’s share of ad revenue over time O’Reilly Research
  38. When there is no money to be made… Google has added “answer box” features that serve user interests, and mostly sends the traffic onwards as before. Only about 6% of Google search results pages contain advertising.
  39. An Amazon Search Result from 2004 “Most popular” was the default search order. This distinguished Amazon from B&N and Borders, which featured sponsored products or their own competing products. No more.
  40. Amazon today All but one of the items shown is sponsored. Publishers must advertise their own products to be visible. “Featured” is now the default. The old concept of customer collective intelligence picking the top products is mostly gone.
  41. Amazon Ad Revenue, 2014–2020 (US$ billions): 2014: $1.32; 2015: $1.71; 2016: $2.95; 2017: $4.65; 2018: $10.11; 2019: $14.09; 2020: $21.45
  42. In this category, Google has no mixed motive but Amazon does…
  43. Algorithmic rents Platforms use their power to decide who gets what and why to allocate an additional share of the value created to themselves.
  44. “Rents [accrue] from a mismatch between value creation and value appropriation” “The classical economists…[define] economic rent as income extracted from the ownership of a scarce asset (such as land or other natural resources) or control over an activity required for economic production in excess of the costs required to maintain the asset or activity. This income accrues without the creation of any additional value — what the classicals called ‘unearned income’ — so it can be viewed as ‘value extraction’, since it reduces the income available for productive investment, spending or innovation.” Mariana Mazzucato, UCL Institute for Innovation and Public Purpose
  45. In the long run, rent extraction is bad for the platforms themselves as well as their users
  46. Nations fail for the same reason as tech platforms Inclusive economies outperform extractive economies. When inclusive economies fall prey to extractive elites, everyone is worse off.
  47. Are the government’s economic “algorithms” having the intended effect?
  48. Divergence of productivity and real median family income in the US To paraphrase Bookstaber, “We take comfort in ascribing the problem to the unavoidable forces of ‘the market.’ But that is not the case."
  49. Goodhart’s Law “When a measure becomes a target, it ceases to be a good measure,” as restated by Marilyn Strathern. (A small simulation of this effect appears after the slide list below.)
  50. Housing
  51. Tax incentives as algorithmic economics
  52. “Algorithmic” interventions can spur innovation
  53. AI is a mirror, not a master
  54. We have new tools “The opportunity for AI is to help humans model and manage complex interacting systems.” Paul R. Cohen
  55. What Might Mission-Driven Algorithms Optimize For? • Dealing with climate change • Preparing for future pandemics • Rebuilding our infrastructure • Feeding the world • Ending disease and providing healthcare for all • Resettling refugees • Educating the next generation • Helping people to care for one another and to enjoy the fruits of shared prosperity
  56. Doughnut Economics Kate Raworth
  57. The great opportunity of the 21st century is to use our newfound cognitive tools to build sustainable businesses and economies
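
A minimal sketch of the outcome-driven feedback loop referenced from slide 18. It is not from the talk: the metric names, thresholds, and gain below are hypothetical, and any real regulatory system would be far richer. What it illustrates is structural: the system measures an outcome and continuously adjusts an enforcement parameter toward a target, rather than encoding fixed rules.

```python
# A minimal sketch of outcome-based regulation as a real-time feedback loop.
# All names, metrics, and thresholds here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class OutcomeRegulator:
    """Nudges an enforcement parameter toward a target outcome instead of applying fixed rules."""
    target_harm_rate: float          # the outcome we want, e.g. share of fraudulent listings
    enforcement_level: float = 0.5   # 0 = hands-off, 1 = maximum scrutiny
    gain: float = 0.3                # how strongly to react when the outcome misses the target

    def update(self, observed_harm_rate: float) -> float:
        # Compare the freshly measured outcome with the target and adjust enforcement.
        error = observed_harm_rate - self.target_harm_rate
        self.enforcement_level = min(1.0, max(0.0, self.enforcement_level + self.gain * error))
        return self.enforcement_level

# Each reporting period, new outcome data (a stand-in for a real-time feed) drives the next adjustment.
regulator = OutcomeRegulator(target_harm_rate=0.01)
for observed in [0.05, 0.04, 0.02, 0.01]:
    level = regulator.update(observed)
    print(f"observed harm rate {observed:.2f} -> enforcement level {level:.2f}")
```

The data feed and the adjustment loop, not the particular numbers, are what would let such a system operate at the speed and scale of the platforms it oversees.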
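
A toy illustration of slide 23's point, using entirely made-up items and scores: the ranking machinery is identical in both runs, and the objective function alone determines what the platform surfaces.

```python
# A toy illustration, with entirely hypothetical items and scores, of how the
# objective function (not the ranking machinery) decides what gets surfaced.

items = [
    {"title": "balanced news report",   "relevance": 0.9, "engagement": 0.4},
    {"title": "outrage-bait post",      "relevance": 0.3, "engagement": 0.95},
    {"title": "sponsored product page", "relevance": 0.5, "engagement": 0.7},
]

def rank(items, objective):
    """Same sorting algorithm either way; only the objective function changes."""
    return sorted(items, key=objective, reverse=True)

# Optimizing for relevance vs. optimizing for engagement yields different feeds.
print([i["title"] for i in rank(items, objective=lambda i: i["relevance"])])
print([i["title"] for i in rank(items, objective=lambda i: i["engagement"])])
```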
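
A small, purely illustrative simulation of Goodhart's Law as stated on slide 49. The numbers are hypothetical: agents split a fixed budget between real work and gaming the metric, and as the metric becomes more of a target, the measured score keeps rising while the outcome it was meant to track falls.

```python
# A tiny, purely illustrative simulation of Goodhart's Law: once agents optimize
# the measure directly, the measure keeps improving while the outcome it was
# meant to track gets worse. All numbers are hypothetical.

def measured_score(effort: float, gaming: float) -> float:
    # The metric responds both to real work and to work aimed only at the metric.
    return effort + 2.0 * gaming

def true_outcome(effort: float) -> float:
    # The outcome we actually care about responds only to real work.
    return effort

budget = 1.0  # each agent splits a fixed budget between real work and gaming the metric

for gaming_share in (0.0, 0.4, 0.8):  # the measure becomes more and more of a target
    effort = budget * (1 - gaming_share)
    gaming = budget * gaming_share
    print(f"gaming share {gaming_share:.1f}: "
          f"measured score {measured_score(effort, gaming):.2f}, "
          f"true outcome {true_outcome(effort):.2f}")
```

In the printed output the measured score climbs from 1.00 to 1.80 while the true outcome falls from 1.00 to 0.20: the measure has stopped being a good measure.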
