View notes here: http://www.slideshare.net/JasonDonaldRowley/notes-on-46898401
This presentation was prepared for a 5-minute O'Reilly Ignite-style talk delivered for Professor James Evans's Internet and Society course at the University of Chicago. It details the current search landscape, some of the challenges facing incumbents like Google, and some of the products innovating on the core search experience.
1. The Future of Search
jrowley@uchicago.edu & danya@uchicago.edu
Organizing Principles
Challenges
2 Potential Ways Forward:
- Google’s Knowledge Graph
- Microsoft Cortana & Google Now
4. The Billy Sunday Problem
• Simply searching for “Billy Sunday” returns a mess of results
• Google returns what it “thinks” is most relevant to me, even if the content isn’t relevant to me
• This is illustrative of a bigger point
5. The problem with strings
• String-based search is old.
• Results in increased interactional friction
6. More Problems with String-based Search
• The previous approach leaves users with two choices:
• Parse the hodgepodge
• Append the query with modifying terms
• Both choices increase interactional friction == bad UX
7. Things > Strings
• Across web platforms, we’ve seen a paradigm shift from “string thinking” to a graph-based approach
• See, for example: Facebook’s Social Graph & OGP, Google’s Knowledge Graph, LinkedIn’s Professional Graph, Wolfram’s Computational Knowledge Engine, &c.
8. Unpacking Google’s Knowledge Graph
• Google’s attempt to structure its data the way humans structure knowledge internally
• Humans interpret the world in terms of things, not strings
• The Knowledge Graph is a more “humane” way of interfacing with information
9. Back to Billy Sunday
• Again, the string-based search approach yields messy results
• “Billy Sunday” belongs to many different concepts
• Ex. Billy Sunday is in the set of “Swanky Chicago Cocktail Bars”, the set of “Temperance Preachers”, and the set of “Professional Baseball Players”
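The set-membership idea above can be sketched in a few lines of Python. The entity names and categories here are illustrative, not Google’s actual data:

```python
# Hypothetical sketch: "Billy Sunday" resolves to multiple entities, each
# belonging to different sets; a context hint picks the right one.

ENTITY_SETS = {
    "Billy Sunday (bar)": {"Swanky Chicago Cocktail Bars"},
    "Billy Sunday (person)": {"Temperance Preachers", "Professional Baseball Players"},
}

def disambiguate(query: str, context_set: str) -> list:
    """Return the candidate entities whose set membership matches the context."""
    return [
        entity
        for entity, sets in ENTITY_SETS.items()
        if context_set in sets
    ]

print(disambiguate("Billy Sunday", "Temperance Preachers"))
# -> ['Billy Sunday (person)']
```

Without the context hint, a string matcher would return both entities, which is exactly the hodgepodge described above.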
10.
11. Attacking the Context Problem
• Google is attempting to address this context problem with the semantic Knowledge Graph
• The Knowledge Graph is a growing, morphing database that can measure relationships between entities in 100-dimensional space
• This is the core of Google’s search strategy going forward
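A knowledge graph of this kind is commonly modeled as subject-predicate-object triples. A minimal sketch with illustrative entities and relations (not Google’s actual schema):

```python
# Minimal sketch of a knowledge graph as subject-predicate-object triples.
# Entities and relation names are illustrative.

triples = [
    ("Billy Sunday (person)", "instance_of", "Baseball Player"),
    ("Billy Sunday (person)", "instance_of", "Preacher"),
    ("Billy Sunday (person)", "born_in", "Ames, Iowa"),
    ("Billy Sunday (bar)", "instance_of", "Cocktail Bar"),
    ("Billy Sunday (bar)", "located_in", "Chicago"),
]

def neighbors(entity):
    """All (predicate, object) pairs for an entity -- its outgoing edges."""
    return [(p, o) for s, p, o in triples if s == entity]

print(neighbors("Billy Sunday (bar)"))
```

Traversing edges like these is what lets an engine answer in terms of entities and their relationships rather than keyword matches.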
12. Search in a multiplatform ecosystem
• The arrival of third, fourth, and fifth screens raises a number of questions about the future of search
• People behave differently on mobile devices
• Desktop search UX does not map well onto small screens
13. Search as Background Process
• Search is morphing from an active behavior to a passive one
• Relevant information is represented in new ways and delivered through novel channels
14.
15. Structured Data
• Search engines use metadata to interpret data and give users relevant results
• We’ve already seen some examples of structured data at work in search results
• Some platforms (like Facebook and Wolfram|Alpha) are more structured than others
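As a hedged sketch, a typed record with schema.org-style field names (values illustrative) shows how structured data lets an engine assemble a rich result instead of a bare link:

```python
# Sketch: a structured-data record with schema.org-style field names
# (values illustrative) that a search engine can render as a rich result.

record = {
    "@type": "BarOrPub",
    "name": "Billy Sunday",
    "address": "Logan Square, Chicago",
    "opens": "17:00",
}

def rich_snippet(rec):
    """Assemble a display snippet from typed fields rather than raw page text."""
    return f'{rec["name"]} ({rec["@type"]}): {rec["address"]}, opens {rec["opens"]}'

print(rich_snippet(record))
# -> Billy Sunday (BarOrPub): Logan Square, Chicago, opens 17:00
```

Because the fields are typed, the engine knows which value is an address and which is an opening time, with no text parsing required.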
16. Delivering on Ambient Location Awareness
• Passive streaming of your location helps search engines better infer intent
• Some search engines use location data to trigger information discovery events for the user
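One way such a trigger could work, sketched under stated assumptions: fire a discovery event when the streamed location falls within a radius of a point of interest. Coordinates and radius here are illustrative:

```python
import math

# Sketch: trigger an information-discovery event when the user's streamed
# location comes within a radius of a point of interest.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_notify(user, poi, radius_km=0.5):
    """True when the user's (lat, lon) is within radius_km of the POI."""
    return haversine_km(user[0], user[1], poi[0], poi[1]) <= radius_km

bar = (41.9290, -87.7073)  # illustrative coordinates in Logan Square
print(should_notify((41.9292, -87.7070), bar))  # nearby -> True
```

A production system would debounce these triggers and weight them by inferred intent, but the geofence check is the core mechanism.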
19. Search as Conversation with Future AI Systems
• The chat window may replace the search bar
• Friendly voices may replace graphical user interfaces on keyboard-free devices
• We already see this happening with “AI” services like DigitalGenius
• And with human-powered ones like Magic
20.
21. Discussion Questions
• What are the major defining features of the Knowledge Graph Optimization described in this presentation, and how do they relate to previous attempts to organize information that we have covered in the lecture and the readings?
• How does the structure of information influence the user’s process of gathering information? How does it affect the production of future knowledge, and/or human action? Is someone searching for Billy Sunday who finds a different result than they were looking for likely to change their mind or behavior?
• To what extent is search determined by structure, and to what extent is it determined by the user?
• Is “friction” entirely bad in search? Do we lose anything when we automate the suggestion process instead of leaving room for surprise?
Editor's notes
Hi there
This talk has three broad “movements”
Organizing principles of Search UX
Challenges facing Search
2 possible ways forward: Google’s Knowledge Graph (Contextual search) & Microsoft Cortana / Google Now (Seamless, background search)
General IxD principle: Reducing “interactional friction” between the user and the thing being used.
In search, the guiding principle is connecting a user with an intent to an action.
Google is designed to reduce this interactional friction by giving me actionable results.
Example: searching for “Flights from Chicago to New York” returns a little module through which I can book my flights
Searching “UChicago to Billy Sunday” returns transit directions from UChicago to my favorite swanky cocktail bar
What if I wanted to learn more about the baseball player turned evangelical preacher who vociferously argued against consuming alcohol?
Simply typing “Billy Sunday” into the search bar gives me a hodgepodge of results, mostly favoring the bar.
Google is returning what it thinks is most relevant to me. Given my love of swanky cocktail bars, they’d likely be right, but not in this case.
This conflict is illustrative of a major problem facing search today.
Strictly string-based search analyzes the concordance between a given search query and an index of content gathered on the web.
This approach left users with one of two choices:
Parse through the mixed results to find the contextually correct information
Append the query with modifiers, like “Billy Sunday preacher”
Both of these approaches violate the rule of reducing friction.
Paradigm shift in web platforms from strings to things
Network analysis and structured data are the new reigning frameworks for understanding information on the web
Knowledge Graph is Google’s attempt to mimic the way humans structure information
We think in terms of things, not strings.
Knowledge Graph is a more “humane” way of understanding information.
The string based approach returns many kinds of results for the query “Billy Sunday”
“Billy Sunday” belongs to many different sets.
The set of ‘Swanky Chicago bars’ and set of ‘Temperance preachers’
Google is attempting to divine structured relationships between entities from unstructured text.
This allows Google to deliver users more relevant search results
At least in theory
We now see how Google is attempting to address the context problem by implementing the semantic Knowledge Graph
But there’s another problem looming on the horizon: search as we know it is changing
What does a search experience look like across 2-5 screens in simultaneous use?
What happens when the keyboard shrinks or disappears?
People’s search behavior is different on mobile
Mobile platforms which rely on search as their core interface mechanism are often dead on arrival
Search is morphing from an active behavior to a background process that surfaces information when it’s most pertinent
This information is delivered in “cards” and mobile push notifications, not a set of links
This new mode of search interaction is enabled by two factors:
Use of structured data (like calendar events with locations) built atop existing information infra
Mobile devices which stream location data
Shifts search from reactive to anticipatory, from an active process to a passive one
Search engines use structured data and metadata to deliver contextually relevant, highly actionable results
Examples include Google’s Flights and Directions results.
Newer platforms like Facebook were built on incredibly well-structured data from the start, which gives them a potential search advantage in the future
Passive streaming of your location will help search engines better infer intent.
For example, the swanky cocktail bar may show up as the top result for searches made in Chicago, but the preacher may be the top result in Ames, IA where he was born
“In platforms like Microsoft Cortana and Google Now, information is surfaced based not on a change in keyword, but a change in state.” - Stefan Weitz (link)
Quick example: I receive notifications for when to leave my house based on the location of my calendar events and Google’s persistent traffic updates.
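That notification logic can be sketched as: departure time equals the event start minus the estimated travel time minus a small buffer. The traffic estimate here is a stub, not Google’s system:

```python
from datetime import datetime, timedelta

# Sketch of the "when to leave" notification: subtract the current travel
# estimate (a stub here; a real system queries live traffic) plus a buffer
# from the calendar event's start time.

def departure_time(event_start, travel_minutes, buffer_minutes=5):
    """Latest time to leave in order to arrive buffer_minutes early."""
    return event_start - timedelta(minutes=travel_minutes + buffer_minutes)

start = datetime(2015, 5, 1, 18, 30)
print(departure_time(start, travel_minutes=40))
# -> 2015-05-01 17:45:00
```

The keyword never changes; only the state (location, traffic, time until the event) does, which is exactly the shift from reactive to anticipatory search described above.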
As the kinds of devices we use change, new input methods are required
Voice-to-text is an increasingly popular way of performing queries, but it carries some problems with accuracy and computability (e.g. NLP is hard)
Image and gesture-based search are other possibilities for the future
The chat window may replace the search box
We see this happening at small scale with narrowly scoped AI systems like my friend’s company, DigitalGenius
And with human-powered digital assistants and concierge services like Magic.
Search applications have migrated from our desks, onto our laps, into our pockets, onto our wrists and now our faces with platforms like Google Glass.
Direct neuronal interface is without question on the horizon, but the timeline is uncertain.
What happens when we have frictionless total informational awareness?
Is it a good or bad thing that we’re outsourcing so much of our cognitive function to search companies?