How user interfaces work
Posted: Thu Mar 22, 2007 10:48 am
I've seen two videos of a lecture by Alan Kay, the inventor of the desktop, which describe how he came to develop the desktop user interface we are so accustomed to. It was a result of experimenting and of understanding how the mind works.
See these streaming RealMedia files of the videos:
- http://webcast.berkeley.edu/stream.php? ... stid=14424
- http://webcast.berkeley.edu/stream.php? ... stid=14425
The idea is that humans have different input channels for their observations of the world. If you can combine these channels, you have a working user interface. The videos also mention how humans learn in different stages of child development, and that grown-ups still have these abilities, which a user interface can put to use: very young children learn mostly by doing, somewhat older children by seeing, and yet older children by thinking (abstraction).
See how the mouse, mouse pointer and display works:
Code: Select all
 hand moving         eyes seeing              brain combining
 mouse device --->   mouse pointer move --->  mouse pointer
      ^                                            |
      +--------------------------------------------+
 (motor sense)       (visual sense)          (abstraction by brain)
Your hands send motor sense impulses back to the brain, so your hands determine where the mouse pointer is on the screen. Your eyes see the mouse pointer and send visual sense impulses back to the brain about how the pointer moves relative to other objects on the screen. The brain combines both input channels and can move the mouse pointer from one object on the screen to another.
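To make the machine's half of that loop concrete, here is a minimal sketch (my own illustration, not from the videos): the mouse reports relative motion deltas, the computer accumulates them into an absolute pointer position clamped to the screen, and the display draws the pointer there so the eyes can close the loop. The screen size and delta values are made up.
Code: Select all
/* Hypothetical sketch of the machine's half of the loop: relative
 * mouse deltas accumulate into an absolute pointer position, which
 * the display then draws so the eyes can close the feedback loop. */
#include <stdio.h>

#define SCREEN_W 1024
#define SCREEN_H  768

struct pointer { int x, y; };

/* Apply one relative motion report from the mouse device. */
static void move_pointer(struct pointer *p, int dx, int dy)
{
    p->x += dx;
    p->y += dy;
    /* Clamp so the pointer never leaves the visible screen. */
    if (p->x < 0) p->x = 0;
    if (p->x > SCREEN_W - 1) p->x = SCREEN_W - 1;
    if (p->y < 0) p->y = 0;
    if (p->y > SCREEN_H - 1) p->y = SCREEN_H - 1;
}

int main(void)
{
    struct pointer p = { SCREEN_W / 2, SCREEN_H / 2 };

    /* Pretend the hand nudged the mouse three times; in a real
     * system these deltas would come from the mouse driver. */
    int deltas[3][2] = { {5, -3}, {12, 0}, {-2000, 40} };

    for (int i = 0; i < 3; i++) {
        move_pointer(&p, deltas[i][0], deltas[i][1]);
        printf("pointer at (%d, %d)\n", p.x, p.y);
    }
    return 0;
}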
Mouse buttons are simply a way to avoid having to move your hand back and forth between the mouse and the keyboard. In that sense, mouse buttons are a simplified version of the keyboard.
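As a hypothetical sketch of that point: in software, a mouse button press and a key press can travel through the same event structure, differing only in the source device and the code. All the names and codes below are invented for illustration.
Code: Select all
/* Hypothetical sketch: to the software, a mouse button and a
 * keyboard key can be the same kind of event; only the source
 * device and the code differ. */
#include <stdio.h>

enum device { DEV_KEYBOARD, DEV_MOUSE };

struct input_event {
    enum device source;   /* which device produced the event */
    int         code;     /* key code or button number */
    int         pressed;  /* 1 = down, 0 = up */
};

static void handle(struct input_event e)
{
    printf("%s %d %s\n",
           e.source == DEV_MOUSE ? "button" : "key",
           e.code,
           e.pressed ? "down" : "up");
}

int main(void)
{
    /* A left mouse button press and an 'a' key press travel
     * through exactly the same handler. */
    struct input_event btn = { DEV_MOUSE, 1, 1 };
    struct input_event key = { DEV_KEYBOARD, 'a', 1 };
    handle(btn);
    handle(key);
    return 0;
}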
Now I'm curious whether it is possible to combine other input channels to create new input devices. I'm sure a lot of research is taking place to give us the input devices of the future, because the keyboard-mouse-display interface is not very useful for tasks like creating animation, and we really need a better interface between human and computer for more productivity.