Liljedahl, Mats and Delsing, Katarina (2012) Sound for enhanced experiences in mobile applications. SMC Sweden 2012, Sound and Music Computing, Understanding and Practicing in Sweden. pp. 10-12.
PDF (Excerpt from the conference proceedings)
Official URL: http://smcsweden.se/proceedings/SMCSweden2012_Proc...
When visiting new places you want information about restaurants, shopping, places of historic interest etc. Smartphones are perfect tools for delivering such location-based information, but the risk is that users get absorbed by texts, maps, videos etc. on the device screen and get a second-hand experience of the environment they are visiting rather than the sought-after first-hand experience. One problem is that the users' eyes are often directed to the device screen rather than to the surrounding environment. Another problem is that interpreting more or less abstract information in maps, texts, images etc. may take up a significant share of the users' overall cognitive resources. The work presented here tried to overcome these two problems by studying design for human-computer interaction based on the users' everyday abilities, such as directional hearing and point-and-sweep gestures. Today's smartphones know where you are and in what direction you are pointing the device, and they have systems for rendering spatial audio. These readily available technologies hold the potential to make information easier to interpret and use, demand fewer cognitive resources and free the users from having to look more or less constantly at a device screen.
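The abstract does not include an implementation, but the interaction it describes (using the device's position and heading to render a point of interest as spatial audio) can be sketched in a few lines. The following is a hypothetical illustration, not the authors' code: it computes the great-circle bearing from the user to a point of interest and maps the bearing, relative to the device heading, onto left/right equal-power stereo gains. The function names and the simple stereo panning model are assumptions for illustration; a real system would use a proper spatialization engine such as the platform's 3D audio API.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def stereo_gains(device_heading_deg, poi_bearing_deg):
    """Map the POI's bearing relative to the device heading to (left, right)
    equal-power panning gains: -90 deg = full left, +90 deg = full right.
    Sources behind the listener are clamped to the nearest side (an
    assumption of this sketch, not of the paper)."""
    # Signed relative angle in [-180, 180)
    rel = (poi_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    pan = max(-90.0, min(90.0, rel)) / 90.0   # clamp and normalize to [-1, 1]
    theta = (pan + 1.0) * math.pi / 4.0       # 0 .. pi/2
    return math.cos(theta), math.sin(theta)   # equal power: L^2 + R^2 = 1

# Example: the user points the device due north (heading 0); a restaurant
# lies due east of them, so the sound should come entirely from the right.
poi_bearing = bearing_deg(59.33, 18.06, 59.33, 18.10)  # roughly due east
left, right = stereo_gains(0.0, poi_bearing)
```

Sweeping the device toward the point of interest drives the relative angle toward zero, which centers the sound, so the user can locate the target by ear without looking at the screen.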
Deposited By: Mats Liljedahl
Deposited On: 15 Aug 2012 10:54
Last Modified: 13 Sep 2012 09:27