A Case Study on Non-Visual Pen-Based Interaction with the Numerical Data
Tatiana G. Evreinova, Grigori Evreinov, Roope Raisamo
Pages: 70-87  |  Revised: 15-05-2013  |  Published: 30-06-2013
Volume: 4, Issue: 2  |  Publication Date: May/June 2013
KEYWORDS
Numerical data, Pen-based interaction, Kinesthetic display.
ABSTRACT
The widespread tabular format still poses great problems for screen readers because of the diversity and complexity of cell content. How can numerical data presented in tabular form be accessed quickly and intuitively in the absence of visual feedback? We implemented and assessed an algorithm that supports the exploration of tabular data without visual feedback. The algorithm addresses the most commonly encountered tasks: retrieving the position of extreme values and of a target value, which can also be linked to specific content of the virtual table. The performance of 11 blindfolded subjects was evaluated when they used the StickGrip kinesthetic display and when they relied on the Wacom pen and auditory signals. The results of the comparative study are reported.
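As a rough illustration of the exploration task described in the abstract, the following Python sketch maps the value in a probed table cell either to its position between the table minimum and maximum or to its closeness to a sought target value, and converts that level to a pitch such as an auditory or kinesthetic cue might use. The function and parameter names, the pitch range, and the linear mapping are illustrative assumptions only; they do not reproduce the authors' StickGrip implementation.

# Hypothetical sketch (not the authors' implementation): derives a feedback
# magnitude for one probed cell of a virtual numerical table, as a kinesthetic
# display or an auditory pitch cue might present it during non-visual exploration.

def cell_feedback(table, row, col, target=None, pitch_range=(200.0, 1000.0)):
    """Return a normalized cue level in [0, 1] and a pitch in Hz for one cell.

    table  : list of lists of numbers (the virtual table being explored)
    target : if given, the cue encodes closeness to this value; otherwise it
             encodes where the cell lies between the table minimum and maximum.
    """
    values = [v for r in table for v in r]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0                      # avoid division by zero
    v = table[row][col]

    if target is None:
        level = (v - lo) / span                  # 0 at the minimum, 1 at the maximum
    else:
        level = 1.0 - min(abs(v - target) / span, 1.0)   # 1 when the target is hit

    f_lo, f_hi = pitch_range
    pitch_hz = f_lo + level * (f_hi - f_lo)      # simple linear pitch mapping
    return level, pitch_hz


if __name__ == "__main__":
    demo = [[3, 8, 1],
            [7, 2, 9],
            [4, 6, 5]]
    # Probing the cell that holds the maximum gives the strongest cue.
    print(cell_feedback(demo, 1, 2))             # -> (1.0, 1000.0)
    # Searching for a target value of 5:
    print(cell_feedback(demo, 2, 2, target=5))   # -> (1.0, 1000.0)

In practice the returned level could drive either the penholder displacement of a kinesthetic device or the frequency of a non-speech sound; the sketch only shows the value-to-cue mapping, not the device interface.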
CITED BY (2)  
1 Evreinova, T. V., Evreinov, G., & Raisamo, R. (2014). An Exploration of Volumetric Data in Auditory Space. Journal of the Audio Engineering Society, 62(3), 172-187.
2 Evreinova, T. V., Evreinov, G., & Raisamo, R. From kinesthetic sense to new interaction concepts: feasibility and constraints.
Miss Tatiana G. Evreinova
School of Information Sciences, University of Tampere, Tampere 33014, Finland
Mr. Grigori Evreinov
School of Information Sciences, University of Tampere, Tampere 33014, Finland
Grigori.Evreinov@uta.fi
Mr. Roope Raisamo
School of Information Sciences, University of Tampere, Tampere 33014, Finland