DOI: 10.1145/1294211.1294259

Lucid touch: a see-through mobile device

Published: 07 October 2007

    Abstract

    Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because the user's fingers occlude the graphical elements they wish to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion that the mobile device itself is semi-transparent. This pseudo-transparency allows users to acquire targets accurately without occluding the screen with their fingers and hands. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all ten fingers. We present initial study results indicating that many users found touching the back of the device preferable to touching the front, due to reduced occlusion, higher precision, and the ability to provide multi-finger input.
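
    The pseudo-transparency described above amounts to compositing a captured image of the user's hands over the current screen contents. A minimal sketch of that compositing step, assuming simple constant-alpha blending (the paper does not specify its exact blending method, and the opacity value here is a hypothetical choice):

    ```python
    import numpy as np

    def pseudo_transparent_overlay(screen, hand, alpha=0.4):
        """Blend a captured hand image over the screen buffer, creating
        the illusion that the device is see-through.

        screen, hand: HxWx3 uint8 arrays (UI framebuffer, rear hand image).
        alpha: opacity of the hand layer (assumed value, not from the paper).
        """
        screen_f = screen.astype(np.float32)
        hand_f = hand.astype(np.float32)
        # Standard convex combination of the two layers per pixel.
        blended = (1.0 - alpha) * screen_f + alpha * hand_f
        return np.clip(np.rint(blended), 0, 255).astype(np.uint8)

    # Example: a white screen with a dark "finger" layer shows the
    # finger dimly through the UI rather than occluding it.
    screen = np.full((4, 4, 3), 255, dtype=np.uint8)
    hand = np.zeros((4, 4, 3), dtype=np.uint8)
    out = pseudo_transparent_overlay(screen, hand, alpha=0.4)
    ```

    In the actual device the hand layer would come from a sensor behind the device and be warped to align with the touch points; the blend itself is the part sketched here.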

    Supplementary Material

    AVI File (p269-wigdor.avi)



    Published In

    UIST '07: Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology
    October 2007, 306 pages
    ISBN: 9781595936790
    DOI: 10.1145/1294211
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. augmented reality
    2. bimanual input
    3. direct touch
    4. lucid touch
    5. multi-touch
    6. portable multi-touch
    7. pseudo-transparency
    8. transparent devices


    Conference

    UIST '07

    Acceptance Rates

    Overall Acceptance Rate 842 of 3,967 submissions, 21%


    Cited By

    • (2024) Dual-Thumb pointing and command selection techniques for tablets. International Journal of Human-Computer Studies 184:C. DOI: 10.1016/j.ijhcs.2023.103203. Online publication date: 1-Apr-2024.
    • (2023) Optimized design and application research of smart interactive screen for wireless networks based on federated learning. EURASIP Journal on Wireless Communications and Networking 2023:1. DOI: 10.1186/s13638-023-02315-7. Online publication date: 18-Oct-2023.
    • (2023) Mobile Fighter Plane Game Manipulation: Touch-based Method vs. Camera-Based Method. 2023 9th International Conference on Systems and Informatics (ICSAI), 1-5. DOI: 10.1109/ICSAI61474.2023.10423365. Online publication date: 16-Dec-2023.
    • (2023) Mixed Reality Interaction Techniques. Springer Handbook of Augmented Reality, 109-129. DOI: 10.1007/978-3-030-67822-7_5. Online publication date: 1-Jan-2023.
    • (2022) Augmented Chironomia for Presenting Data to Remote Audiences. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1-14. DOI: 10.1145/3526113.3545614. Online publication date: 29-Oct-2022.
    • (2022) Iteratively Designing Gesture Vocabularies: A Survey and Analysis of Best Practices in the HCI Literature. ACM Transactions on Computer-Human Interaction 29:4, 1-54. DOI: 10.1145/3503537. Online publication date: 5-May-2022.
    • (2022) Select or Suggest? Reinforcement Learning-based Method for High-Accuracy Target Selection on Touchscreens. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3491102.3517472. Online publication date: 29-Apr-2022.
    • (2022) Touching The Droid: Understanding and Improving Touch Precision With Mobile Devices in Virtual Reality. 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 807-816. DOI: 10.1109/ISMAR55827.2022.00099. Online publication date: Oct-2022.
    • (2021) A Comparison Study on Camera-Based Pointing Techniques for Handheld Displays. IEICE Transactions on Electronics E104.C:2, 73-80. DOI: 10.1587/transele.2020DIP0003. Online publication date: 1-Feb-2021.
    • (2021) Watching Your Phone's Back. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5:2, 1-26. DOI: 10.1145/3463522. Online publication date: 24-Jun-2021.
