Roel Vertegaal

[Photo: Roel Vertegaal with a 1920s Remington USB Keyboard]
Born: July 13, 1968, Hazerswoude-Rijndijk, Netherlands
Residence: Canada
Nationality: Dutch
Fields: Computer Science and Design
Institutions: Utrecht School of the Arts; Twente University; Queen's University; Xuuk, Inc.
Alma mater: Utrecht School of the Arts; Utrecht University; Bradford University; Twente University
Known for: Attentive User Interface; Organic User Interface; Eye Contact Sensor; Flexible Computer; Paper Computer

Roeland "Roel" Vertegaal (born July 13, 1968) is a Dutch-Canadian interaction designer, scientist, musician and entrepreneur working in the area of Human-Computer Interaction. He is the director of the Human Media Lab and Professor at Queen's University's School of Computing. He is best known for his pioneering work on flexible and paper computers, with systems such as PaperWindows (2004),[1] PaperPhone (2010)[2] and PaperTab (2012).[3] He is also known for inventing ubiquitous eye input, such as Samsung's Smart Pause technologies, and BitDrones, one of the first programmable matter user interfaces.

Early life

Vertegaal grew up near Leiden in the Netherlands, where he attended Bonaventura grammar school. At age 10, he started experimenting with electronic music on a Yamaha CS-30. At age 15, he programmed his first computer games in BASIC on an Apple IIe. At age 17, he directed his first video clip, studying film with Frans Zwartjes at the Free Academy in The Hague. In 1987, he enrolled at the Utrecht Conservatory, becoming a Mac OS coder and professional pianist. He graduated in 1992 with a major in Electronic Music and a minor in Jazz Piano, and was awarded the prestigious VSB Scholarship by Dutch author Boudewijn Büch.

In 1992, supported by Apple's Advanced Technology Group, Vertegaal obtained an MPhil in Computer Science at Bradford University, studying with Prof. Eaglestone. There, he designed a simple user interface for FM synthesis using two perceptually arranged 2D sound maps, discovering that the perceptual integrality of sound parameters could be measured through browsing actions.[4] His intuitive sound editing interface was deployed in concerts in Vienna in the mid-nineties by computer music pioneer Tamas Ungvary as part of his SensOrg instrument, and is now a common feature of commercial sound synthesis software. In 1994, Vertegaal developed one of the first inline PC webcams, FrameServer, which was deployed by Apple co-founder Steve Wozniak. In 1998, Vertegaal obtained a PhD in Human-Computer Interaction from Twente University, with a committee that included Profs. Van der Veer and Bill Buxton. There, he studied the effects of eye contact on multiparty videoconferencing, pioneering eye input with the GAZE Groupware System,[5] which premiered at the Association for Computing Machinery's 50th anniversary ACM Expo in 1997. This demo inspired the Blue Eyes project at IBM Almaden, and led Vertegaal to coin the term Attention-based Groupware,[6] now known as Attentive User Interfaces. During his PhD, Vertegaal demonstrated the power of eye contact in conversation: he showed for the first time that speakers look more than listeners in multiparty conversations to indicate conversational attention, attributed the opposite pattern in dyadic conversations to the need to break prolonged eye contact, and showed that the presence of eye contact accounts for as much as 49% of whether a person speaks.[7] Vertegaal drives a Tesla Model S and is an advocate for the Tesla Motors brand.[8]

Professional career

Attentive User Interface, 2000–2005

In 2000, Vertegaal became a Professor in Human-Computer Interaction at Queen's University in Canada. There he founded the Human Media Lab, a research facility in Human-Computer Interaction, and developed eyeBox, the first eye contact sensor, premiered in 2003 by Diane Sawyer on ABC's Good Morning America.[9] He edited a special issue of Communications of the ACM on Attentive User Interfaces,[10] showing how groups of computers could use human social cues for considerate notification.[11] Amongst these were an early attentive cell phone that used eye-tracking electronic glasses to determine whether users were in a conversation,[9] an attentive television that paused content when viewers looked away and resumed playing when they looked back, mobile Smart Pause and Smart Scroll (later adopted in Samsung's Galaxy S4),[12] and calibration-free eye tracking. In 2004, Vertegaal founded the alt.chi sessions at the ACM SIGCHI conference for presenting disruptive early research.

Organic User Interface, 2005–present

In 2004, Vertegaal and his student Holman built the first bendable paper computer, PaperWindows,[1] which premiered at CHI 2005. It simulated multiple flexible, high-resolution, colour, wireless, thin-film multitouch displays using real-time depth-camera 3D spatial augmented reality. In May 2007, in a lecture at Xerox PARC, Vertegaal coined the term Organic User Interfaces to describe a class of user interfaces with non-flat, optionally flexible displays. In 2008, Vertegaal co-edited a special issue of Communications of the ACM on Organic User Interfaces,[13] showcasing work on flexible and rigid computers of various shapes. It describes the first multi-touch spherical display[14] and Dynacan, an interactive pop can: early examples of everyday computational things with interactive digital skins.[15][16] Vertegaal and Poupyrev organized the first International Workshop on Organic User Interfaces at ACM CHI 2009; subsequent workshops were held at ACM TEI 2011 in Madeira, at Mobile HCI 2012, and at CHI 2013 in Paris, France. In 2012, Vertegaal chaired the Tangible, Embedded and Embodied Interaction (TEI 2012) conference in Kingston, Ontario, with OUI as a theme. In 2010, the Human Media Lab, with Arizona State University, developed the world's first functional flexible smartphone, PaperPhone, which pioneered bend interactions and premiered to critical acclaim at ACM CHI 2011 in Vancouver.[2] In 2012, the Human Media Lab introduced the world's first pseudo-holographic, life-size 3D videoconferencing system,[17] TeleHuman.[18] In 2013, Vertegaal unveiled PaperTab,[3] the world's first flexible tablet PC, at CES 2013 in Las Vegas, in collaboration with Plastic Logic and Intel. Also in 2012, the new Human Media Lab boutique headquarters, designed by industrial designer Karim Rashid, opened in Jackson Hall at Queen's University.

Teaching

The Human Media Lab features a graduate program in HCI, headed by Vertegaal, which has graduated over 25 MSc and PhD students. In 2001, Vertegaal created the HCI courses at Queen's University's School of Computing: CISC 325 "Introduction to HCI" and CISC 425 "Advanced User Interfaces". In 2008, Vertegaal co-founded the Computing and the Creative Arts (COCA) degree program at the School of Computing, the first such program in North America; it allows arts students to study computer science, and computer science students to study the arts. Its signature course is COCA 201, a DIY/maker course taught by Vertegaal, in which second-year students learn to create and reflect on interactive arts exhibits by building interactive software and hardware technologies.[19]

Startups

In 2004, Vertegaal co-founded Xuuk, Inc. with his students. Xuuk unveiled the world's first $999 long-range, calibration-free eye tracker at Google's headquarters in 2007, and was the first to introduce computer vision metrics for digital signage analytics.[20] Based on an infrared megapixel imaging camera, its eyebox2 is capable of tracking the eyes and face orientation of multiple users from a distance of up to 10 meters without calibration. Other Human Media Lab startups include Kameraflage, Synbiota, and Mark One, Inc., known for the Vessyl smart beverage container.

References

  1. Holman, D., Vertegaal, R. and Troje, N. (2005). PaperWindows: Interaction Techniques for Digital Paper. In Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems. ACM Press, 591-599.
  2. Lahey, B., Girouard, A., Burleson, W. and Vertegaal, R. (2011). PaperPhone: Understanding the Use of Bend Gestures in Mobile Devices with Flexible Electronic Paper Displays. In Proceedings of ACM CHI'11 Conference on Human Factors in Computing Systems. ACM Press, 1303-1312.
  3. Warner, B. (2013). PaperTab: A Fold-Up, Roll-Up Tablet Computer. Bloomberg Businessweek, May 2013.
  4. Vertegaal, R. and Bonis, E. (1994). ISEE: An Intuitive Sound Editing Environment. Computer Music Journal 18(2) (Summer 1994), 21-29.
  5. Vertegaal, R. (1999). The GAZE Groupware System: Mediating Joint Attention in Multiparty Communication and Collaboration. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '99). ACM Press, 294-301.
  6. Vertegaal, R., Velichkovsky, B. and Van der Veer, G. (1997). Catching the Eye: Management of Joint Attention in Cooperative Work. SIGCHI Bulletin 29(4).
  7. Vertegaal, R., Slagter, R., Van der Veer, G. and Nijholt, A. (2001). Eye Gaze Patterns in Conversations: There Is More to Conversational Agents than Meets the Eyes. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '01). ACM Press, 301-308.
  8. "roelvertegaal on Twitter". Twitter. Retrieved 2016-02-10.
  9. Vertegaal, R., Dickie, C., Sohn, C. and Flickner, M. (2002). Designing Attentive Cell Phone Using Wearable Eye-Contact Sensors. In CHI '02 Extended Abstracts on Human Factors in Computing Systems. ACM Press, 646-647.
  10. Vertegaal, R. (2003). Attentive User Interfaces. Editorial, Special Issue on Attentive User Interfaces, Communications of the ACM 46(3). ACM Press, 30-33.
  11. Gibbs, W. (2005). Considerate Computing. Scientific American 292, 54-61.
  12. Dickie, C., Vertegaal, R., Sohn, C. and Cheng, D. (2005). eyeLook: Using Attention to Facilitate Mobile Media Consumption. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05). ACM Press, 103-106.
  13. Vertegaal, R. and Poupyrev, I. (2008). Introduction to Organic User Interfaces. Special Issue on Organic User Interfaces, Communications of the ACM 51(6), 5-6.
  14. Holman, D. and Vertegaal, R. (2008). Organic User Interfaces: Designing Computers in Any Way, Shape, or Form. Special Issue on Organic User Interfaces, Communications of the ACM 51(6), 48-55.
  15. Akaoka, E., Ginn, T. and Vertegaal, R. (2010). DisplayObjects: Prototyping Functional Physical Interfaces on 3D Styrofoam, Paper or Cardboard Models. In Proceedings of TEI'10 Conference on Tangible, Embedded and Embodied Interaction. ACM Press, 49-56.
  16. Vertegaal, R. (2011). The (Re)Usability of Everyday Computational Things. ACM Interactions Magazine, Jan/Feb 2011, 39-41.
  17. Kingsley, J. with will.i.am (2013). Use Your Illusion. Wired UK, August 2013, 140-141.
  18. Kim, K., Bolton, J., Girouard, A., Cooperstock, J. and Vertegaal, R. (2012). TeleHuman: Effects of 3D Perspective on Gaze and Pose Estimation with a Life-size Cylindrical Telepresence Pod. In Proceedings of CHI'12 Conference on Human Factors in Computing Systems. ACM Press, 2531-2540.
  19. Strohmeier, P., Swensen, K. V., Lapp, C., Girouard, A. and Vertegaal, R. (2012). A Flock of Birds: Bringing Paper to Life. In Proceedings of TEI'12 Conference on Tangible, Embedded and Embodied Interaction. ACM Press, 333-334.
  20. Sorrel, C. (2007). Google Eyes Up Billboard Ads. Wired.com.
