Andrew Kerne


Biography

Texas A&M University College Station - Computer Science

Program Director at National Science Foundation (NSF)
Online Publishing
Andruid Kerne
Bryan/College Station, Texas Area
I direct The Interface Ecology Lab [http://ecologylab.net], investigating human-centered computing support for expression, creativity, and social engagement by rigorously connecting expressive methods of art and design with powerful pattern recognition and generation algorithms to build sensory interface systems.

Our research considers situated practices, designs interfaces, and builds software components and distributed systems. We develop semantics, collections, platforms, tools, social networks, installations, and performances that open the role of computation in human experience, promoting creativity, discovery, communication, play, contemplation, satisfaction, and survival. Areas include creativity support tools, location-aware games, embodied sensory interfaces, and emergency response.

Specialties: human computer interaction, digital media, creativity support tools, social interaction, information semantics, distributed computing, games


Experience

  • Texas A&M University

    Associate Professor

    Develop Interface Ecology Lab research program.

    Investigate human-centered computing support for expression, creativity, social engagement, and education. Build sensory interface systems that engage human participation. Connect expressive methods of art and design with powerful pattern recognition and generation algorithms.

    Develop tools, games, social networks, collections, installations, and performances that open the role of computation in human experience by promoting creativity, discovery, learning, communication, play, and satisfaction. Invoke conceptual frameworks: interface ecosystems, information discovery, situated semantics, and collection sensemaking.

    Creative and expressive systems application areas include creativity support tools, location-aware games, and emergency response.

    Raise over $2,000,000 in funding from sources including the National Science Foundation.

  • Texas A&M University

    Professor

    Professor of Computer Science and Engineering. Principal Investigator and Director of the Interface Ecology Lab.

  • Creating Media

    Principal and Director

    • Develop web strategies, architectures, sites, navigation, tools, & infrastructure.
    • Clients: Modem Media, AT&T, Procter & Gamble, Mitsui, Ru4, Discovery Channel, Darwin Digital.
    • Function as Director of Technology and Creative Director.
    • Manage a team of six.
    • Develop creative strategy for brand identity on web and in print.
    • Hire, manage, and collaborate with staff and external vendors.
    • Manage customer accounts.
    • Develop proposals for new business.
    • Develop intellectual property, including trademarks; evaluate patent opportunities.
    • Collaborate with lawyers to develop contracts.
    • System architecture for AT&T Personal Solutions, a database-driven customer care site.
    • Develop interactive game banner ads for Tide and The Discovery Channel.
    • Develop first AT&T United Kingdom web site.

  • Mixed Reality Lab, University of Nottingham

    Sabbatical Fellow


  • ACM Multimedia

    Interactive Art Program Co-Chair

    Co-curate the internationally renowned interactive art program. Co-direct a unique and highly competitive scholarly research papers track, mixing art and science methodologies.

  • National Science Foundation (NSF)

    Program Director


Education

  • Wesleyan University

    M.A.

    Music / Composition

  • New York University

    Ph.D.

    Computer Science

  • Harvard University

    M.A.

    Applied Mathematics / Electronic Media

Publications

  • ZeroTouch: a zero-thickness optical multi-touch force field

    CHI 2011

    We present zero-thickness optical multi-touch sensing, a technique that simplifies sensor-display integration and enables new forms of interaction not previously possible with other multi-touch sensing techniques. Using low-cost modulated infrared sensors to quickly determine the visual hull of an interactive area, we enable robust real-time sensing of fingers and hands, even in the presence of strong ambient lighting. Our technology allows for 20+ fingers to be detected, many more than through prior visual hull techniques, and our use of wide-angle optoelectronics allows for excellent touch resolution, even in the corners of the sensor. With the ability to track objects in free space, as well as its use as a traditional multi-touch sensor, ZeroTouch opens up a new world of interaction possibilities.
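The abstract's core idea, localizing touches by intersecting occluded sensor beams around the perimeter of the frame, can be illustrated with a small geometric sketch. This is not the lab's implementation; the frame geometry, touch radius, and function names here are illustrative assumptions.

```python
# Sketch of visual-hull touch localization via perimeter beam occlusion:
# emitters and detectors ring the frame, a touch blocks the line-of-sight
# beams that cross it, and intersecting blocked beams localizes the touch.
from itertools import combinations

def segment_intersection(p1, p2, p3, p4):
    """Intersection point of segments p1-p2 and p3-p4, or None."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:  # parallel beams never cross
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def beam_blocked(emitter, detector, touch, radius=0.5):
    """True if a circular touch of given radius occludes the beam segment."""
    (x1, y1), (x2, y2), (px, py) = emitter, detector, touch
    dx, dy = x2 - x1, y2 - y1
    # Project the touch center onto the beam segment, clamped to [0, 1].
    t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
    cx, cy = x1 + t * dx, y1 + t * dy
    return (px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2

def localize(blocked_beams):
    """Estimate a touch as the centroid of pairwise blocked-beam crossings."""
    points = []
    for (e1, d1), (e2, d2) in combinations(blocked_beams, 2):
        p = segment_intersection(e1, d1, e2, d2)
        if p:
            points.append(p)
    if not points:
        return None
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)
```

For example, on a 10×10 frame, a touch at (5, 5) blocks both the horizontal beam (0, 5)–(10, 5) and the vertical beam (5, 0)–(5, 10), and `localize` recovers (5.0, 5.0) from their crossing.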

  • Pen-in-hand command: NUI for real-time strategy esports

    CHI 2012

    Electronic Sports (eSports) is the professional play and spectating of digital games. Real-time strategy games are a form of eSport that require particularly high-performance and precise interaction. Prior eSports HCI has been keyboard and mouse based. We investigate the real-time strategy eSports context to design novel interactions with embodied modalities, because of its rigorous needs and requirements, and the centrality of the human-computer interface as the medium of game mechanics. To sense pen + multi-touch interaction, we augment a Wacom Cintiq with a ZeroTouch multi-finger sensor [1]. We used this modality to design new pen + touch interaction for play in real-time strategy eSports.

  • Peripheral Array of Tangible NFC Tags: Positioning Portals for Embodied Trans-Surface Interaction

    Proc. Interactive Tabletops and Surfaces 2013

    Trans-surface interaction addresses moving information objects across multi-display environments that support sensory interaction modalities such as touch, pen, and free-air. Embodiment means using spatial relationships among surfaces and human bodies to facilitate users' understanding of interaction. In the present embodied trans-surface interaction technique, a peripheral NFC tag array provides tangible affordances for connecting mobile devices to positions on a collaborative surface. Touching a tag initiates a trans-surface portal. Each portal visually associates a mobile device and its user with a place on the collaborative surface. The portal's manifestation at the top of the mobile device supports 'flicking over' interaction, like playing cards. The technique is simple, inexpensive, reliable, scalable, and generally applicable for co-located collaboration. We developed a co-located collaborative rich information prototype to demonstrate the embodied trans-surface interaction technique and support imagining and planning tasks.
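The binding the abstract describes, touching a tag to open a portal that associates a device with a position on the shared surface, reduces to a small registry. This is a minimal sketch under my own assumptions, not the lab's code; the tag IDs, coordinates, and method names are hypothetical.

```python
# Sketch of the peripheral NFC tag array's core mapping: each tag ID
# anchors a portal position on the collaborative surface; a device's tap
# binds that device to the portal; "flicking over" routes an item from the
# device through its portal onto the surface.

class PortalRegistry:
    def __init__(self, tag_positions):
        # tag_positions: tag ID -> (x, y) anchor on the collaborative surface
        self.tag_positions = dict(tag_positions)
        self.portals = {}  # tag ID -> device currently bound there

    def tap(self, tag_id, device_id):
        """A device touches a tag: open (or take over) the portal there."""
        if tag_id not in self.tag_positions:
            raise KeyError(f"unknown tag {tag_id!r}")
        self.portals[tag_id] = device_id
        return self.tag_positions[tag_id]

    def flick_over(self, device_id, item):
        """Send an item from a device through its portal onto the surface."""
        for tag_id, bound in self.portals.items():
            if bound == device_id:
                return {"item": item, "at": self.tag_positions[tag_id]}
        return None  # device has no open portal
```

Because each tag is a fixed physical location on the table edge, the spatial relationship between a user's seat, the tag they touch, and the resulting portal position is what makes the interaction embodied.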

  • Multi-Tap Sliders: Advancing Touch Interaction for Parameter Adjustment

    Intelligent User Interfaces (IUI) 2013

    Research in multi-touch interaction has typically focused on direct spatial manipulation; techniques have been created to produce the most intuitive mapping between the movement of the hand and the resultant change in the virtual object. However, as we attempt to design for more complex operations, the expectation of spatial manipulation becomes infeasible. We introduce Multi-tap Sliders for operation in what we call abstract parametric spaces that do not have an obvious literal spatial representation, such as exposure, brightness, contrast, and saturation for image editing. This new widget design promotes multi-touch interaction for prolonged use in scenarios that require adjustment of multiple parameters as part of an operation. The multi-tap sliders encourage the user to keep her visual focus on the target, instead of requiring her to look back at the interface.
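The interaction logic the abstract describes, a tap selecting which parameter subsequent drags adjust relatively, can be sketched in a few lines. This is an illustrative reconstruction under my own assumptions, not the published widget; parameter names and the drag-to-value mapping are hypothetical.

```python
# Sketch of a multi-tap slider: tapping selects a parameter slot, then
# relative drag displacement adjusts that parameter's value, so the user's
# eyes can stay on the target image rather than on the widget.

class MultiTapSliders:
    def __init__(self, params):
        self.values = {name: 0.5 for name in params}  # normalized [0, 1]
        self.active = None  # parameter selected by the last tap

    def tap(self, name):
        """Select which parameter subsequent drags adjust."""
        self.active = name

    def drag(self, delta):
        """Relative adjustment: drag displacement maps to a value change,
        clamped to the normalized range."""
        if self.active is None:
            return None
        v = min(1.0, max(0.0, self.values[self.active] + delta))
        self.values[self.active] = v
        return v
```

For example, tapping "exposure" and dragging by +0.25 moves only that parameter from 0.5 to 0.75, while "contrast" stays untouched; the relative mapping is what frees the user from visually reacquiring the slider.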

  • ZeroTouch: an optical multi-touch and free-air interaction architecture

    CHI 2012

    ZeroTouch (ZT) is a unique optical sensing technique and architecture that allows precision sensing of hands, fingers, and other objects within a constrained 2-dimensional plane. ZeroTouch provides tracking at 80 Hz, and up to 30 concurrent touch points. Integration with LCDs is trivial. While designed for multi-touch sensing, ZT enables other new modalities, such as pen+touch and free-air interaction. In this paper, we contextualize ZT innovations with a review of other flat-panel sensing technologies. We present the modular sensing architecture behind ZT, and examine early diverse uses of ZT sensing.

  • Using Metrics of Curation to Evaluate Information-based Ideation

    Transactions on Computer-Human Interaction (TOCHI)

  • ZeroTouch: a zero-thickness optical multi-touch force field

    CHI 2011

    We present zero-thickness optical multi-touch sensing, a technique that simplifies sensordisplay integration, and enables new forms of interaction not previously possible with other multi-touch sensing techniques. Using low-cost modulated infrared sensors to quickly determine the visual hull of an interactive area, we enable robust real-time sensing of fingers and hands, even in the presence of strong ambient lighting. Our technology allows for 20+ fingers to be detected, many more than through prior visual hull techniques, and our use of wide-angle optoelectonics allows for excellent touch resolution, even in the corners of the sensor. With the ability to track objects in free space, as well as its use as a traditional multi-touch sensor, ZeroTouch opens up a new world of interaction possibilities.

  • Pen-in-hand command: NUI for real-time strategy esports

    CHI 2012

    Electronic Sports (eSports) is the professional play and spectating of digital games. Real-time strategy games are a form of eSport that require particularly high-performance and precise interaction. Prior eSports HCI has been keyboard and mouse based. We investigate the real-time strategy eSports context to design novel interactions with embodied modalities, because of its rigorous needs and requirements, and the centrality of the human-computer interface as the medium of game mechanics. To sense pen + multi-touch interaction, we augment a Wacom Cintiq with a ZeroTouch multi-finger sensor [1]. We used this modality to design new pen + touch interaction for play in real-time strategy eSports.

  • Peripheral Array of Tangible NFC Tags: Positioning Portals for Embodied Trans-Surface Interaction

    Proc. Interactive Tabletops and Surfaces 2013

    Trans-surface interaction addresses moving information objects across multi-display environments that support sensory interaction modalities such as touch, pen, and free-air. Embodiment means using spatial relationships among surfaces and human bodies to facilitate users' understanding of interaction. In the present embodied trans-surface interaction technique, a peripheral NFC tag array provides tangible affordances for connecting mobile devices to positions on a collaborative surface. Touching a tag initiates a trans-surface portal. Each portal visually associates a mobile device and its user with a place on the collaborative surface. The portal's manifestation at the top of the mobile device supports 'flicking over' interaction, like playing cards. The technique is simple, inexpensive, reliable, scalable, and generally applicable for co-located collaboration. We developed a co-located collaborative rich information prototype to demonstrate the embodied trans-surface interaction technique and support imagining and planning tasks.
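The tag-to-portal mapping described above can be sketched in a few lines: each NFC tag in the peripheral array is registered at a position along the collaborative surface's edge, and touching a tag with a mobile device opens a portal anchored there. The tag IDs, class names, and one-dimensional edge coordinate are illustrative assumptions, not details from the paper.

```python
from dataclasses import dataclass


@dataclass
class Portal:
    device: str  # mobile device that touched the tag
    x: float     # anchor position along the shared surface's edge


class TagArray:
    """Sketch of a peripheral NFC tag array for trans-surface portals."""

    def __init__(self, tag_positions):
        self._pos = dict(tag_positions)  # tag_id -> edge position
        self.portals = {}                # device -> its open Portal

    def touch(self, tag_id, device):
        # Touching a tag binds the device to that tag's surface position,
        # so 'flicking over' an object on the phone sends it to the portal.
        portal = Portal(device, self._pos[tag_id])
        self.portals[device] = portal
        return portal


array = TagArray({"tag-A": 0.2, "tag-B": 0.8})
p = array.touch("tag-A", "alice-phone")
print(p.x)  # 0.2
```

The tangibility comes for free: because each tag is a physical object at a fixed place on the table's periphery, the spatial relationship between a user's phone and their portal is embodied rather than configured in software.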

  • Scanning FTIR: unobtrusive optoelectronic multi-touch sensing through waveguide transmissivity imaging

    TEI 2010

    We describe a new method of multi-touch sensing which can be unobtrusively added to existing displays. By coupling individually controlled optoelectronics to the edge of a planar waveguide, our scanning approach overcomes prior disadvantages of optoelectronic multi-touch sensing. Our approach allows for a completely transparent touch surface and easy integration with existing LCD displays.
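The scanning principle can be reduced to a toy sketch: edge-mounted emitters are lit one at a time while the received transmissivity is sampled, and a touch that frustrates total internal reflection shows up as a dip against the untouched baseline. The readings and threshold below are invented for illustration.

```python
# Per-emitter transmissivity readings across one scan of the waveguide.
baseline = [1.00, 1.00, 1.00, 1.00, 1.00]  # calibration scan, no touch
frame    = [0.99, 0.98, 0.41, 0.97, 1.00]  # current scan
THRESHOLD = 0.5  # fraction of baseline below which we report a touch

# A touch attenuates the light crossing the waveguide, so any emitter
# whose reading drops well below its baseline marks an occluded path.
touched = [i for i, (b, f) in enumerate(zip(baseline, frame))
           if f < b * THRESHOLD]
print(touched)  # [2]
```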

Positions

  • ACM

    Papers Chair, Creativity and Cognition 2015
