DOI: 10.1145/3531706.3536457
Short paper

Gestural-Vocal Coordinated Interaction on Large Displays

Published: 21 June 2022

    Abstract

    On large displays, keyboard and mouse input is challenging because small mouse movements do not scale well to the size of the display and of individual on-screen elements. We present “Large User Interface” (LUI), which coordinates gestural and vocal interaction to expand the range of interactions possible on large displays. The interface leverages real-time continuous feedback of free-hand gestures and voice to control a set of applications such as photos, videos, 3D models, maps, and a gesture keyboard. Requiring only a single stereo camera and a voice assistant, LUI operates without calibration or an array of sensors and can be easily installed and deployed. In user studies, participants found LUI efficient, learnable with minimal instruction, and preferable to point-and-click interfaces.
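    The paper does not specify how gesture and voice events are coordinated, but the kind of fusion the abstract describes — resolving a spoken command against the continuous gesture stream — can be sketched minimally as below. All names (`GestureEvent`, `VoiceCommand`, `fuse`, the 0.5 s window) are hypothetical illustrations, not LUI's actual design.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GestureEvent:
        """One pointing sample from the hand tracker (screen coordinates)."""
        t: float  # timestamp in seconds
        x: float
        y: float

    @dataclass
    class VoiceCommand:
        """A recognized utterance with its timestamp."""
        t: float
        text: str

    def fuse(command: VoiceCommand, gestures: list[GestureEvent],
             window: float = 0.5) -> Optional[tuple[float, float]]:
        """Resolve a deictic command ("put it there") to the screen position
        pointed at when it was spoken: pick the gesture sample closest in
        time to the command, or None if none falls within the window."""
        candidates = [g for g in gestures if abs(g.t - command.t) <= window]
        if not candidates:
            return None
        nearest = min(candidates, key=lambda g: abs(g.t - command.t))
        return (nearest.x, nearest.y)

    # The user says "there" at t=2.0 while pointing near (640, 360).
    samples = [GestureEvent(1.8, 630, 350), GestureEvent(2.1, 640, 360),
               GestureEvent(3.5, 100, 100)]
    print(fuse(VoiceCommand(2.0, "put it there"), samples))  # → (640, 360)
    ```

    Windowed nearest-neighbor matching is only one policy; a real system would also need to handle recognizer latency and commands with no accompanying gesture.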

    Supplementary Material

    MP4 File (LargeUserInterface.mp4)
    Supplemental video


    Cited By

    • A Type System for Flexible User Interactions Handling. Proceedings of the ACM on Human-Computer Interaction 8, EICS (2024), 1–27. https://doi.org/10.1145/3660248
    • Evaluating gesture user interfaces. International Journal of Human-Computer Studies 185, C (2024). https://doi.org/10.1016/j.ijhcs.2024.103242
    • Engineering User Interfaces with Beat Gestures. Companion Proceedings of the 2023 ACM SIGCHI Symposium on Engineering Interactive Computing Systems (2023), 76–78. https://doi.org/10.1145/3596454.3597187
    • Grab It, While You Can: A VR Gesture Evaluation of a Co-Designed Traditional Narrative by Indigenous People. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (2023), 1–13. https://doi.org/10.1145/3544548.3580894

    Published In

    EICS '22 Companion: Companion of the 2022 ACM SIGCHI Symposium on Engineering Interactive Computing Systems
    June 2022
    69 pages
    ISBN: 9781450390316
    DOI: 10.1145/3531706

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. large displays
    2. mid-air gestures
    3. multimedia objects
    4. vocal input

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Acceptance Rates

    Overall acceptance rate: 73 of 299 submissions (24%)
