3. Human acts on Machine
• Interaction examples
• GUI of PC, Smart Phone
• Commanding
• Discrete: start/stop, select, confirm
• Analog: quantity setting
• Texting
• Pointing (position, movement)
• Handle, Pedal in a car
• Button, Knob on consumer electronics
• …
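The distinction above between discrete commands (start/stop, select, confirm) and analog settings (continuous quantities) can be sketched in code. This is a minimal illustrative model, not any real device API; the class and method names are assumptions.

```python
from enum import Enum

class DiscreteCommand(Enum):
    # Discrete commands: single events with no magnitude.
    START = "start"
    STOP = "stop"
    SELECT = "select"
    CONFIRM = "confirm"

class Machine:
    """Hypothetical machine accepting both command styles."""

    def __init__(self):
        self.running = False
        self.volume = 0.5  # analog setting, normalized to [0.0, 1.0]

    def command(self, cmd: DiscreteCommand) -> None:
        # Discrete: the machine reacts to the event itself.
        if cmd is DiscreteCommand.START:
            self.running = True
        elif cmd is DiscreteCommand.STOP:
            self.running = False

    def set_volume(self, level: float) -> None:
        # Analog: the machine stores a continuous quantity, clamped to range.
        self.volume = max(0.0, min(1.0, level))

m = Machine()
m.command(DiscreteCommand.START)
m.set_volume(1.7)  # out-of-range analog input is clamped to 1.0
```

The point of the sketch is that the two command styles call for different handling: discrete input maps to state transitions, analog input to bounded continuous values.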
• Observation
• Fingers and hands are the dominant organs for operating machines, while human actions on other humans or on physical objects in the real world are multi-modal in nature.
4. Machine receives information from Human
• Interaction examples
• Health-care monitor
• Security camera
• Lock/unlock of car door
• Wind direction of air-conditioner
• …
• Observation
• These are one-directional sensing channels rather than true interactions; there is much room to assist humans through richer interactions.
5. Machine acts on Human
• Interaction examples
• Game story
• Alarm
• GUI - popup window
• sound, lighting, vibration
• Traffic signal
• …
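The alarm example above acts on a human through several modalities at once (sound, lighting, vibration). A minimal sketch of such fan-out dispatch, with channel names and the registration API chosen purely for illustration:

```python
class Alarm:
    """Hypothetical multi-modal alarm: one event, many output channels."""

    def __init__(self):
        self.channels = {}  # modality name -> output function

    def register(self, modality, fn):
        # Attach an output function for one modality.
        self.channels[modality] = fn

    def trigger(self, message):
        # Fan the same event out to every registered modality.
        return [fn(message) for fn in self.channels.values()]

alarm = Alarm()
alarm.register("sound", lambda msg: f"BEEP: {msg}")
alarm.register("lighting", lambda msg: f"FLASH: {msg}")
alarm.register("vibration", lambda msg: f"BUZZ: {msg}")
outputs = alarm.trigger("wake up")
```

The design point matches the observation that follows: today each channel carries the same primitive signal; a richer system would choose modalities per situation.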
• Observation
• Besides visuals, multiple modalities are used, but only in primitive forms. Robots, such as nursing-care robots, may change the situation radically.
6. Human receives information from Machine
• Interaction examples
• GUI of PC, Smart Phone
• Display
• Augmented reality with Google Glass
• Roomba position
• …
• Observation
• Visuals are dominant.
7. Human vs Human
• Interaction channels
• language
• text
• voice
• visuals
• sounds
• body gesture
• Observation
• They are multi-modal interactions.
8. Human vs Atom
• Interaction channels
• Human-to-atom: triggered by human mechanical actions
• Atom-to-human: human 5 senses
• Observation
• It is uni-modal from human to atom, while multi-modal in the opposite direction.
• If human multi-modal effectors such as voice, gaze, or brain state were linked with machines to act on atoms, human power over atoms would expand dramatically.