- A digital wonderland at our fingertips
- Devices became portable
- But they require our whole attention
- Either IN the screen or OUT in the world
Source: Antoine Geiger's SUR-FAKE
A Humane Representation of Thought
Victor, B., 2014. Humane representation of thought: A trail map for the 21st century, in: UIST’14.
- Technology constrains our bodies
- Only eyes and fingers are working
- Bodies have been neglected
- Fingertip-only interaction is limiting. Not humane.
- Evolution made it so that we think with all our senses
Real World
We know how it works
Tangible User Interfaces
Ishii, H., Ullmer, B., 1997. Tangible bits: Towards seamless interfaces between people, bits and atoms, in: CHI’97.
Ishii, H. et al., 2012. Radical atoms: Beyond tangible bits, toward transformable materials. Interactions.
Augmented Reality
- The traditional way is to use video see-through
- Can also use a head-mounted display
- However: requires dedicated hardware for the user
Spatial Augmented Reality
Uses projectors or screens in the environment to display information spatially registered with that environment
Raskar, R. et al., 1998. Spatially augmented reality, in: IWAR’98.
Raskar, R. et al., 2001. Shader lamps: Animating real objects with image-based illumination, in: Eurographics’01.
- SAR instead uses projectors or screens in the environment
- Link with Ubicomp
- Example mug from before
- Normal mug + projector... (a minimal sketch of this projector-to-surface mapping follows these notes)
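The core mechanism behind projector-based SAR is a calibrated projector acting as an inverse camera: any 3D point on a physical surface can be mapped to the projector pixel that illuminates it. Below is a minimal numpy sketch of that mapping using the standard pinhole model; the calibration values, function name and coordinates are hypothetical and not taken from the cited systems.

```python
import numpy as np

# Hypothetical calibration; a real setup obtains K, R, t from a
# projector calibration procedure.
K = np.array([[1400.0,    0.0, 960.0],   # projector intrinsics
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                            # rotation: world frame -> projector frame
t = np.array([0.0, 0.0, 1.5])            # translation (metres)

def world_point_to_projector_pixel(p_world):
    """Map a 3D point on a physical surface (e.g., the mug) to the
    projector pixel that lights it, using the pinhole model."""
    p_proj = R @ p_world + t             # point in the projector frame
    u, v, w = K @ p_proj                 # perspective projection
    return np.array([u / w, v / w])      # projector pixel coordinates

# Example: the pixel that illuminates a point on the mug's handle.
print(world_point_to_projector_pixel(np.array([0.05, 0.10, 0.0])))
```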
Augmented Objects
- For example, we can augment a normal mug with different functions:
- Displaying remaining steeping time
- Displaying temperature of liquid inside
- Handle turns green when everything is OK
Augmented Objects
- It can also be used to create augmented objects
- In these two pictures, white physical objects are augmented with projected content
Augmented Objects
My objective is to move towards a real-world experience without having to remove technology. This is twofold:
- By including real-world elements in our use of technology
- By using technology as a way to reflect on ourselves
Interaction
How can we interact with digital content hosted on physical objects?
Direct Interaction
Bandyopadhyay, D. et al., 2001. Dynamic shader lamps: Painting on movable objects, in: ISAR’01.
Direct Interaction
Marner, M.R. et al., 2009. Physical-virtual tools for spatial augmented reality user interfaces, in: ISMAR’09.
Benko, H. et al., 2012. MirageTable: Freehand interaction on a projected augmented reality tabletop, in: CHI’12.
- Shows pointing on a real object using direct touch
- Tangible interaction is sometimes not enough
- So we want to use digital tools
- Such tools are often operated on screen
- What works great on screen is indirect interaction such as the mouse
Interaction
Gervais, R. et al., 2015. Pointing in spatial augmented reality from 2D pointing devices, in: INTERACT’15.
Gervais, R. et al., 2016. Tangible Viewports: Getting out of flatland in desktop environments, in: TEI’16.
The evaluation of the use of 2D pointing devices – mouse and graphics tablet – in a pointing task in a SAR context compared to a screen condition.
The design, implementation and evaluation of a system enabling the interaction between a typical desktop computer environment – traditional screens, mouse and keyboard – and tangible augmented objects, considering an object design scenario as a main thread.
Introspection
How can we use augmented objects to reveal hidden information about ourselves?
Introspection
Mercier-Ganady, J. et al., 2014. The mind-mirror: See your brain in action in your head using eeg and augmented reality, in: VR’14.
Introspection
Norooz, L. et al., 2015. BodyVis: A new approach to body learning through wearable sensing and visualization, in: CHI’15.
Introspection
Frey, J., Gervais, R. et al., 2014. Teegi: Tangible EEG interface, in: UIST’14.
Gervais, R., Frey, J. et al., 2016. TOBE: Tangible Out-of-Body Experience, in: TEI’16.
Standard way of pointing
Now what happens if...
Removing the screen
Does pointing still work without a screen?
Questions
Differences between SCREEN and SAR conditions for pointing?
Does pointing in SAR follow Fitts' law?
Apparatus
- A: Circle-shaped cursor that follows the geometry of the real world
- B: Plane onto which cursor is mapped
- In SAR, plane is virtual
- In the SCREEN condition, a wooden panel placed at that location acts as the screen
- C: Guide displayed on the table to help locate the cursor
- D: Position of the user is known
- E: Projector
- Augments the real cube in the SAR condition
- Projects a virtual cube in the SCREEN condition (a sketch of the cursor mapping follows this list)
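To make the apparatus concrete, here is one plausible way the circle-shaped cursor can follow the real-world geometry, given the known user position (D) and the mapping plane (B): the 2D device moves the cursor on the plane, and a ray from the user's viewpoint through that plane position is intersected with the scene. This is a hedged reconstruction from the notes above, not the published implementation; `intersect_scene` and all names are assumptions.

```python
import numpy as np

def cursor_on_geometry(head_pos, cursor_on_plane, intersect_scene):
    """Place the SAR cursor on the physical scene: cast a ray from the
    user's tracked head position through the cursor's 3D location on the
    mapping plane, and return the first hit on the scene geometry.

    `intersect_scene(origin, direction)` is assumed to return the hit
    point on the augmented objects, or None if the ray misses them."""
    direction = cursor_on_plane - head_pos
    direction = direction / np.linalg.norm(direction)
    hit = intersect_scene(head_pos, direction)
    return hit if hit is not None else cursor_on_plane  # stay on the plane otherwise
```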
SCREEN vs SAR
- The left image is a simulated version of what is seen on the right
- Comparison of the view in both conditions
- The view of the cube is the same
- In the SCREEN condition, note that the virtual table is aligned with the real table
Scene
- The scene changed between trials
- Cube alone, in different orientations
- Cube and a more complex shape
Procedures
MacKenzie, I.S., 1992. Movement time prediction in human-computer interfaces.
Position cursor in starting zone
Zone changes from red to green
Target appears
The user goes and clicks on the target
Then comes back to the starting zone (this trial loop is sketched below)
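The trial loop above can be summarised as a small state machine; this is an illustrative sketch of the procedure as described, with hypothetical names, not the actual experiment code.

```python
from enum import Enum, auto

class TrialState(Enum):
    WAIT_IN_START_ZONE = auto()  # starting zone shown in red
    READY = auto()               # zone turned green, target displayed
    DONE = auto()                # target clicked, trial over

def step(state, cursor_in_start_zone, clicked_target):
    """Advance the pointing-task trial by one event."""
    if state is TrialState.WAIT_IN_START_ZONE and cursor_in_start_zone:
        return TrialState.READY      # zone changes from red to green, target appears
    if state is TrialState.READY and clicked_target:
        return TrialState.DONE       # target acquired; movement time would typically be measured up to this click
    if state is TrialState.DONE and cursor_in_start_zone:
        return TrialState.WAIT_IN_START_ZONE  # back in the starting zone: next trial
    return state
```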
Design
- Inefficiency: deviation from the optimal path (a sketch of the computation follows)
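A common way to quantify this metric (assumed here; the study may use a slightly different formula) is the extra path length relative to the straight line between start and target:

```python
import numpy as np

def inefficiency(cursor_samples):
    """Deviation of the travelled cursor path from the straight line
    between its first and last points (0 means a perfectly straight path).
    `cursor_samples` is an (N, 2) array of cursor positions for one trial."""
    pts = np.asarray(cursor_samples, dtype=float)
    path_length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    straight_line = np.linalg.norm(pts[-1] - pts[0])
    return path_length / straight_line - 1.0
```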
Participants
- 16 participants
- Familiar with mice
- Little experience with graphics tablets
- No experience with SAR systems
Time
Users were 11% faster using a screen vs SAR
- Screen faster than SAR by 11%
- Drop in performance is not dramatic: SAR remains usable
- The screen probably provides context for interaction
- No dead space in mid-air in the SCREEN condition
Inefficiency
- Input modality had a significant effect
- The tablet is less efficient than the mouse
- Explained by experience with mice vs. graphics tablets
- See the heatmap figure for an example of this
Fitts' law
MacKenzie, I.S., 1992. Movement time prediction in human-computer interfaces.
- MT: Movement Time
- ID: Index of Difficulty
- D: Projected target distance in virtual screen
- W: Perceived target size
- We modeled the movement time with a linear regression (written out below)
Note: R² = 0.8479
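For reference, the fitted model is the usual linear Fitts' law relation, assuming the Shannon formulation of the index of difficulty from the cited MacKenzie paper, with D and W as defined above:

```latex
\mathrm{MT} = a + b \cdot \mathrm{ID}, \qquad \mathrm{ID} = \log_2\!\left(\frac{D}{W} + 1\right)
```

The reported R² = 0.8479 is the goodness of fit of this linear regression of MT against ID.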
- We continued our investigation by varying the ways to leverage the screen context
- Here, we keep the screen context for interaction and interact with a physical object when it is located in front of the screen
- Focus on hybrid environment
Metaphor
Working on a physical object becomes the same thing as working on a virtual version in a window
Pointing technique
- Same as with CurSAR
- Screen is tracked
- The cursor's behavior is geometrically coherent only for the operator
- The cursor is visible to everyone (a sketch of the object-side interaction follows)
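To make the metaphor concrete, here is an assumed sketch of how a desktop tool could act on the augmented object once the operator's cursor ray (computed as in the sketch for the pointing study) hits it: the hit point is converted to the object's texture coordinates, and the tool, here a simple paint brush, modifies the texture that the projector displays back onto the object. Names and behaviour are hypothetical, not the Tangible Viewports implementation.

```python
import numpy as np

def apply_desktop_tool(hit_uv, object_texture, color, radius=3):
    """Paint at the texel of the augmented object hit by the cursor
    (hit_uv in [0, 1]^2); the projector then shows the updated texture
    on the physical object, so desktop tools act on it as they would
    on its virtual counterpart in a window."""
    h, w, _ = object_texture.shape
    cx, cy = int(hit_uv[0] * (w - 1)), int(hit_uv[1] * (h - 1))
    ys, xs = np.ogrid[:h, :w]
    brush = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
    object_texture[brush] = color
    return object_texture
```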
Complementary Representations
Jansen, Y. et al., 2013. Evaluating the efficiency of physical visualizations, in: CHI’13.
Towards the Desk as a Working Space
Direct-Indirect Combination
Improvised Working Surfaces
I am very much interested in continuing to bring technology closer to the real world
Because technology has had this tendency to focus on doing, while the human experience is not only about achieving but also about being.
But I know that the times in my life when I felt most alive are when I actually leave technology behind, like
when I go on a self-supported bike tour with Dominique for a few weeks, just pedaling all day
and stopping at night to cook and sleep in a tent. When you are out there, you never miss technology.
But when you come back, even if you're still on holidays, technology grabs you back in such a way that you lose this sense of calm you had.
Designing for Calmness and Wellbeing
I think we should not have to choose between calm and technology. We should strive for technology that not only creates engagement and helps us complete tasks, but also enables us to feel calm at times.