Karon MacLean

Professor

Research Classification

Information Systems

Research Interests

human-computer interaction
haptic interfaces
human-robot interaction
design of user interfaces


Recruitment

Complete these steps before you reach out to a faculty member!

Check requirements
  • Familiarize yourself with program requirements. You want to learn as much as possible from the information available to you before you reach out to a faculty member. Be sure to visit the graduate degree program listing and program-specific websites.
  • Check whether the program requires you to seek commitment from a supervisor prior to submitting an application. For some programs this is an essential step, while others match successful applicants with faculty members within the first year of study. This is indicated either in the program profile under "Requirements" or on the program website.
Focus your search
  • Identify the specific faculty members who are conducting research in your area of interest.
  • Establish that your research interests align with the faculty member’s research interests.
    • Read up on the faculty members in the program and the research being conducted in the department.
    • Familiarize yourself with their work: read their recent publications and the past theses/dissertations they supervised. Be certain that their research is indeed what you are hoping to study.
Make a good impression
  • Compose an error-free, grammatically correct email addressed to the specific faculty member you want to work with, and remember to use their correct title.
    • Do not send non-specific, mass emails to everyone in the department hoping for a match.
    • Address the faculty members by name. Your contact should be genuine rather than generic.
  • Include a brief outline of your academic background, why you are interested in working with the faculty member, and what experience you could bring to the department. The supervision enquiry form guides you with targeted questions; be sure to craft compelling answers to them.
  • Highlight your achievements and why you are a top student. Faculty members receive dozens of requests from prospective students, and you may have less than 30 seconds to pique someone’s interest.
  • Demonstrate that you are familiar with their research:
    • Convey the specific ways you are a good fit for the program.
    • Convey the specific ways the program/lab/faculty member is a good fit for the research you are interested in/already conducting.
  • Be enthusiastic, but don’t overdo it.
Attend an information session

G+PS regularly offers virtual information sessions covering admission requirements and procedures, along with tips on how to improve your application.

 

Open to supervising (as of 2019): Master's students, Doctoral students, and Postdoctoral Fellows.

Recruitment research areas: human-robot interaction, user interface design, interactive sensing.

I support experiential learning opportunities, such as internships and work placements, for my graduate students and postdocs.

Graduate Student Supervision

Doctoral Student Supervision (Jan 2008 - May 2019)
Haptic experience design : tools, techniques, and process (2017)

Haptic technology, which engages the sense of touch, offers promising benefits for a variety of interactions including low-attention displays, emotionally-aware interfaces, and augmented media experiences. Despite an increasing presence of physical devices in commercial and research applications, there is still little support for the design of engaging haptic sensations. Previous literature has focused on the significant challenges of technological capabilities or physical realism rather than on supporting experience design. In this dissertation, we study how to design, build, and evaluate interactive software to support haptic experience design (HaXD). We define HaXD and iteratively design three vibrotactile effect authoring tools, each a case study covering a different user population, vibrotactile device, and design challenge, and use them to observe specific aspects of HaXD with their target users. We make these in-depth findings more robust in two ways: generalizing results to a breadth of use cases with focused design projects, and grounding them with expert haptic designers through interviews and a workshop. Our findings 1) describe HaXD, including processes, strategies, and challenges; and 2) present guidelines on designing, building, and evaluating interactive software that facilitates HaXD. When characterizing HaXD processes, strategies, and challenges, we show that experience design is already practiced with haptic technology, but faces unique considerations compared to other modalities. We identify four design activities that must be explicitly supported: sketching, refining, browsing, and sharing. We find and develop strategies to accommodate the wide variety of haptic devices. We articulate approaches for designing meaning with haptic experiences, and finally, highlight a need for supporting adaptable interfaces. When informing the design, implementation, and evaluation of HaXD tools, we discover critical features, including a need for improved online deployment and community support. We present steps to develop both existing and future research software into a mature suite of HaXD tools, and reflect upon evaluation methods. By characterizing HaXD and informing supportive tools, we take a first step towards establishing HaXD as its own field, akin to graphic and sound design.

Personalizing haptics : from individuals’ sense-making schemas to end-user haptic tools (2017)

Synthetic haptic sensations will soon proliferate throughout many aspects of our lives, well beyond the simple buzz we get from our mobile devices. This view is widely held, as evidenced by the growing list of use cases and industry's increasing investment in haptics. However, we argue that taking haptics to the crowds will require haptic design practices to go beyond a one-size-fits-all approach, common in the field, to satisfy users' diverse perceptual, functional, and hedonic needs and preferences reported in the literature. In this thesis, we tackle end-user personalization to leverage the utility and aesthetics of haptic signals for individuals. Specifically, we develop effective haptic personalization mechanisms, grounded in our synthesis of users' sense-making schemas for haptics. First, we propose a design space and three distinct mechanisms for personalization tools: choosing, tuning, and chaining. Then, we develop the first two mechanisms into: 1) an efficient interface for choosing from a large vibration library, and 2) three emotion controls for tuning vibrations. In developing these, we devise five haptic facets that capture users' cognitive schemas for haptic stimuli, and derive their semantic dimensions and between-facet linkages by collecting and analyzing users' annotations for a 120-item vibration library. Our studies verify the utility of the facets as a theoretical model for personalization tools. In collecting users' perceptions, we note a lack of scalable haptic evaluation methodologies and develop two methodologies for large-scale in-lab evaluation and online crowdsourcing of haptics. Our studies focus on vibrotactile sensations as the most mature and accessible haptic technology, but our contributions extend beyond vibrations and inform other categories of haptics.

Periodic vibrotactile guidance (2014)

The emergence of mobile technologies, with their ever-increasing computing power, embedded sensors, and Internet connectivity, has created many new applications such as navigational guidance systems. Unfortunately, these devices can become problematic through inappropriate usage or overloading of the audiovisual channels. Wearable haptics has come to the rescue with the promise of offloading some of the communication from the audiovisual channels. The main goal of our research is to develop a spatiotemporal guidance system based on the potentials and limitations of the sense of touch. Our proposed guidance method, Periodic Vibrotactile Guidance (PVG), guides movement frequency through periodic vibrations to help the user achieve a desired speed and/or finish a task in a desired time. We identify three requirements for a successful PVG system: accurate measurement of the user's movement frequency, successful delivery of vibrotactile cues, and the user's ability to follow the cues at different rates and during auditory multitasking. In Phase 1, we study the sensitivity of different body locations to vibrotactile cues with/without visual workload and under different movement conditions, and examine the effect of expectation of location and gender differences. We create a set of design guidelines for wearable haptics. In Phase 2, we develop the Robust Realtime Algorithm for Cadence Estimation (RRACE), which measures momentary step frequency/interval via frequency-domain analysis of the accelerometer signals available in smartphones. Our results show that, with a 95% accuracy, RRACE is more accurate than the published state-of-the-art time-based algorithm. In Phase 3, we use the guidelines from Phase 1 and the RRACE algorithm to study PVG. First, we examine walkers' susceptibility to PVG, finding that most walkers can follow the cues with 95% accuracy. Then we examine the effect of auditory multitasking on users' performance and workload, finding that PVG can successfully guide the walker's speed during multitasking. Our research expands the reach of wearable haptics and guidance technologies by providing design guidelines, a robust cadence detection algorithm, and Periodic Vibrotactile Guidance -- an intuitive method of communicating spatiotemporal information in a continuous manner -- which can successfully guide movement speed with little to no learning required.
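
For readers curious what frequency-domain cadence estimation looks like in practice, here is a minimal sketch of the general idea (not the published RRACE algorithm): find the dominant spectral peak of the accelerometer magnitude within a plausible walking band. The function name, sample rate, and band limits are illustrative assumptions.

```python
# Illustrative sketch only; not the published RRACE implementation.
# Estimates walking cadence (steps/s) from smartphone accelerometer data
# by locating the dominant spectral peak in an assumed gait band.
import numpy as np

def estimate_cadence(accel_xyz, fs=50.0, band=(1.0, 3.5)):
    """accel_xyz: (N, 3) accelerometer samples; fs: sample rate (Hz).
    Returns estimated step frequency in Hz (steps per second)."""
    mag = np.linalg.norm(accel_xyz, axis=1)       # orientation-free magnitude
    mag = mag - mag.mean()                        # remove gravity/DC offset
    window = np.hanning(len(mag))                 # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(mag * window))
    freqs = np.fft.rfftfreq(len(mag), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])  # typical step rates
    return freqs[in_band][np.argmax(spectrum[in_band])]

# Example: recover a synthetic 2 Hz "step" signal from 4 s of samples.
t = np.arange(0, 4, 1 / 50.0)
fake = np.column_stack([np.zeros_like(t), np.zeros_like(t),
                        9.8 + np.sin(2 * np.pi * 2.0 * t)])
print(estimate_cadence(fake))   # ~2.0
```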

The Haptic Creature : social human-robot interaction through affective touch (2012)

Emotion communication is an important aspect of social interaction. Affect display research from psychology as well as social human-robot interaction has focused primarily on facial or vocal behaviors, as these are the predominant means of expression for humans. Much less attention, however, has been on emotion communication through touch, which, though unique among the senses, can be methodologically and technologically difficult to study. Our thesis investigated the role of affective touch in the social interaction between human and robot. Through a process of design and controlled user evaluation, we examined the display, recognition, and emotional influence of affective touch. To mitigate issues inherent in touch research, we drew from interaction models not between humans but between human and animal, whereby the robot assumes the role of companion animal. We developed the Haptic Creature, a small, zoomorphic robot novel in its sole focus on touch for both affect sensing and display. The robot perceives movement and touch, and it expresses emotions through ear stiffness, modulated breathing, and vibrotactile purring. The Haptic Creature was employed in three user studies, each exploring a different aspect of affective touch interaction. Our first study examined emotion display from the robot. We detail the design of the Haptic Creature's affect display, which originated from animal models, then was enhanced through successive piloting. A formal study demonstrated the robot was more successful communicating arousal than valence. Our second study investigated affect display from the human. We compiled a touch dictionary from psychology and human-animal interaction research. Participants first rated the likelihood of using these touch gestures when expressing a variety of emotions, then performed likely gestures communicating specific emotions for the Haptic Creature. Results provided properties of human affect display through touch and high-level categorization of intent. Our final study explored the influence of affective touch. Results empirically demonstrated that the human's emotional state was directly influenced by affective touch interactions with the robot. Our research has direct significance to the field of socially interactive robotics and, further, any domain interested in human use of affective touch: psychology, mediated social touch, human-animal interaction.
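
As a hypothetical illustration of how a valence-arousal emotional state could drive the Haptic Creature's display channels (breathing, ear stiffness, vibrotactile purring), consider the sketch below. The specific mappings and parameter ranges are assumptions for illustration, not the thesis's control scheme.

```python
# Hypothetical sketch: one plausible way a zoomorphic robot's actuators could
# be driven from a valence-arousal state, in the spirit of the Haptic
# Creature's three display channels. Mappings and ranges are assumptions.
from dataclasses import dataclass

@dataclass
class ActuatorCommand:
    breath_rate_hz: float   # breathing cycles per second
    breath_depth: float     # 0..1 amplitude of chest movement
    ear_stiffness: float    # 0..1 servo stiffness
    purr_intensity: float   # 0..1 vibrotactile purr strength

def render_emotion(valence: float, arousal: float) -> ActuatorCommand:
    """valence, arousal in [-1, 1]. Arousal chiefly drives rate and
    stiffness; valence chiefly drives purring (when positive)."""
    a = (arousal + 1) / 2                         # normalize to 0..1
    return ActuatorCommand(
        breath_rate_hz=0.25 + 0.35 * a,           # ~15-36 breaths/min
        breath_depth=0.4 + 0.3 * a,
        ear_stiffness=a,
        purr_intensity=max(0.0, valence) * 0.8,
    )

print(render_emotion(valence=0.8, arousal=-0.3))  # content: slow breath, purr
```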

Design of haptic signals for information communication in everyday environments (2009)

Multi-function interfaces have become increasingly pervasive and are frequently used in contexts which pose multiple demands on a single sensory modality. Assuming some degree of modularity in attentional processing and that using a different sensory channel for communication can reduce interference with critical visual tasks, one possibility is to divert some information through the touch sense. The goal of this Thesis is to advance our knowledge of relevant human capabilities and embed this knowledge into haptic communication design tools and procedures, in the interest of creating haptically supported interfaces that decrease rather than add to their users’ sensory and cognitive load. In short, we wanted to create tools and methods that would allow the creation of haptic signals (accomplished via display of either forces or vibrations) extending beyond the one bit of communication offered by current pagers and cellular phone buzzers. In our quest to create information-rich haptic signals we need to learn how to create signals that are differentiable. We also need to study ways to assign meanings to these signals and make sure that they can be perceived clearly when presented one after another, even in environments where their recipient might be involved with other tasks. These needs frame the specific research goals of this thesis. Most of the results described here were obtained through the study of tactile (in the skin) rather than proprioceptive (force feedback) stimuli. We begin by presenting several methods to create, validate and contrast tactile stimulus dissimilarity data and investigate the design of a waveform intended to be a tactile perceptual intermediate between a square waveform and a triangle waveform. Next, we explore methods to create and test tactile signal-meaning associations and document a surprising ability of participants to exhibit high recall of quickly learned associations at two weeks, in a first examination of longitudinal recall of tactile stimuli. We then present methods to measure tactile stimulus masking and identify crucial perceptual thresholds relating to stimulus temporal spacing in an exploration into the masking effects of common-onset vibrotactile stimuli. Finally, we present methods to test haptic and multimodal perception in simulated scenarios, including a method to simulate and control cognitive workload; and provide evidence that the commonly-used device of multimodal signal reinforcement can adversely impact performance in an ongoing primary task. The research presented in this Thesis has implications for the design of signals to be used in displays that are emerging in embedded computing environments such as cars, games, cellular phones, and medical devices.
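
One concrete design element mentioned above is a waveform intended as a tactile intermediate between square and triangle waves. The sketch below shows one plausible way to parameterize such a morph, as a simple crossfade; this parameterization is an assumption for illustration, not the waveform design studied in the thesis.

```python
# Illustrative sketch: a waveform family that morphs between a triangle wave
# (shape=0) and a square wave (shape=1) via a linear crossfade. This is an
# assumed parameterization, not the thesis's perceptual-intermediate design.
import numpy as np

def morph_wave(t, freq_hz, shape):
    """shape in [0, 1]: 0 -> triangle, 1 -> square, in between -> crossfade."""
    phase = (t * freq_hz) % 1.0
    tri = 4.0 * np.abs(phase - 0.5) - 1.0       # triangle in [-1, 1]
    square = np.where(tri >= 0, 1.0, -1.0)      # square with matching phase
    return (1.0 - shape) * tri + shape * square

t = np.linspace(0, 0.01, 500)                   # 10 ms of samples
for s in (0.0, 0.5, 1.0):
    print(s, morph_wave(t, 250.0, s)[:3])       # 250 Hz, a common tactile freq
```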

Master's Student Supervision (2010 - 2018)
Building believable robots : an exploration of how to make simple robots look, move, and feel right (2017)

Humans have an amazing ability to see a 'spark of life' in almost anything that moves. There is a natural urge to imbue objects with agency. It's easy to imagine a child pretending that a toy is alive. Adults do it, too, even when presented with evidence to the contrary. Leveraging this instinct is key to building believable robots, i.e. robots that act, look, and feel like they are social agents with personalities, motives, and emotions. Although it is relatively easy to initiate a feeling of agency, it is difficult to control, consistently produce, and maintain an emotional connection with a robot. Designing a believable interaction requires balancing form, function and context: you have to get the story right. In this thesis, we discuss (1) strategies for designing the bodies and behaviours of simple robot pets; (2) how these robots can communicate emotion; and (3) how people develop narratives that imbue the robots with agency. For (1), we developed a series of four robot design systems to create and rapidly iterate on robot form factors, as well as tools for improvising and refining expressive robot behaviours. For (2), we ran three studies wherein participants rated robot behaviours in terms of arousal and valence under different display conditions. For (3), we ran a study wherein expert performers improvised emotional 'stories' with the robots; one of the studies in (2) also included soliciting narratives for the robot and its behaviours.

Designing zooming interactions for small displays with a proximity sensor (2017)

Small, high-resolution touchscreens open new possibilities for wearable and embedded applications, but are a mismatch for interactions requiring appreciable movement on the screen surface. For example, multi-touch or large-scroll zooming actions suffer from occlusion and difficulties in accessing or resolving large zoom ranges or selecting small targets. Meanwhile, emerging technologies have the potential to combine many capabilities, e.g., touch- and proximity-sensitivity, flexibility and transparency. A current challenge is to develop interaction techniques that can exploit these new materials to solve interaction problems presented by trends such as miniaturization and wearability, for instance tiny screens with room for only a single fingertip. To this end, zed-zooming exploits the capabilities of emerging near-proximity sensors to address these problems, by mapping finger height above a control surface to image size. The EZ-Zoom technique adds the pseudohaptic illusion of an elastic finger-screen connection, by exploiting non-linear scaling functions to provide a usage metaphor. In a two-part user study, we compared EZ-Zoom to the touchscreen-standard pinch-to-zoom on smartphone and smartwatch screens, and found (a) a significant improvement in task time and preference for the smallest screen (equivalent task time for the smartphone); and (b) that the illusion improved users' reported sense of control, provided cues about the interaction's spatial extent and dynamics, and made the interaction more natural. From our experience with the study, we identify requirements for the development of proximity sensors to afford such interactions. Our work goes on to reflect on how zed-zooming can be incorporated into seamless interaction tasks, identifying characteristics that must be considered when designing a complete and usable zooming interaction.
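
The core zed-zooming idea, mapping finger height to image scale through a non-linear 'elastic' transfer function, can be sketched as follows. The power-law shape and all parameter values are illustrative assumptions, not the actual EZ-Zoom implementation.

```python
# Illustrative sketch of zed-zooming's core mapping: finger height above the
# screen (from a proximity sensor) drives zoom scale. The power-law transfer
# function below is an assumed example of the kind of non-linear scaling the
# pseudohaptic illusion relies on, not the thesis's actual EZ-Zoom function.

def zoom_scale(height_mm, max_height_mm=40.0, max_zoom=8.0, elasticity=2.0):
    """Returns a magnification factor for a finger hovering at height_mm.
    A power-law response starts slowly and accelerates, which can feel
    like stretching against an elastic finger-screen connection."""
    h = min(max(height_mm, 0.0), max_height_mm) / max_height_mm  # 0..1
    return 1.0 + (max_zoom - 1.0) * h ** elasticity

for h in (0, 10, 20, 30, 40):
    print(h, round(zoom_scale(h), 2))   # 1.0, 1.44, 2.75, 4.94, 8.0
```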

Towards an emotionally communicative robot : feature analysis for multimodal support of affective touch recognition (2016)

Human affective state extracted from touch interaction takes advantage of natural communication of emotion through physical contact, enabling applications like robot therapy, intelligent tutoring systems, emotionally-reactive smart tech, and more. This work focused on the emotionally aware robot pet context and produced a custom, low-cost piezoresistive fabric touch sensor at 1-inch taxel resolution that accommodates the flex and stretch of the robot in motion. Using established machine learning techniques, we built classification models of social and emotional touch data. We present an iteration of the human-robot interaction loop for an emotionally aware robot through two distinct studies and demonstrate gesture recognition at roughly 85% accuracy (chance 14%). The first study collected social touch gesture data (N=26) to assess data quality of our custom sensor under noisy conditions: mounted on a robot skeleton simulating regular breathing, obscured under fur casings, placed over deformable surfaces. Our second study targeted affect with the same sensor, wherein participants (N=30) relived emotionally intense memories while interacting with a smaller stationary robot, generating touch data imbued with the following: Stressed, Excited, Relaxed, or Depressed. A feature space analysis triangulating touch, gaze, and physiological data highlighted the dimensions of touch that suggest affective state. To close the interactive loop, we had participants (N=20) evaluate researcher-designed breathing behaviours on 1-DOF robots for emotional content. Results demonstrate that these behaviours can display human-recognizable emotion as perceptual affective qualities across the valence-arousal emotion model. Finally, we discuss the potential impact of a system capable of emotional “conversation” with human users, referencing specific applications.

Scrolling in radiology image stacks : multimodal annotations and diversifying control mobility (2014)

Advances in image acquisition technology mean that radiologists today must examine thousands of images to make a diagnosis. However, the physical interactions performed to view these images are repetitive and not specialized to the task. Additionally, automatic and/or radiologist-generated annotations may impact how radiologists scroll through image stacks as they review areas of interest. We analyzed manual aspects of this work by observing and/or interviewing 19 radiologists; stack scrolling dominated the resulting task examples. We used a simplified stack seeded with correct or incorrect annotations in our experiment on lay users. The experiment investigated the impact of four scrolling techniques: traditional scrollwheel, click+drag, sliding-touch, and tilting to access rate control. We also examined the effect of visual vs. haptic annotation cues on scrolling dynamics, detection accuracy, and subjective factors. Scrollwheel was the fastest scrolling technique overall for our lay participants. Combined visual and haptic annotation highlights increased the speed of target-finding in comparison to either modality alone. Multimodal annotations may be useful in radiology image interpretation: users are heavily visually loaded, and there is background noise in the hospital environment. From interviews with radiologists, we also see that they are receptive to an alternative to the standard workstation mouse that would let them map different movements to interactions with images.
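
Of the four techniques compared, tilt-based rate control differs most from a scrollwheel: tilt angle sets scrolling velocity rather than position. A minimal sketch of such a mapping follows, with an assumed dead zone and gain rather than the study's actual parameters.

```python
# Hypothetical sketch of a "tilt for rate control" scrolling mapping: tilt
# angle sets scroll velocity (rate control), unlike a scrollwheel, where
# displacement sets position. Dead zone and gain values are assumptions.

def scroll_rate(tilt_deg, dead_zone_deg=3.0, gain=1.5, max_rate=60.0):
    """Maps device tilt (degrees) to images scrolled per second (signed)."""
    if abs(tilt_deg) < dead_zone_deg:
        return 0.0                                   # ignore hand tremor
    sign = 1.0 if tilt_deg > 0 else -1.0
    rate = gain * (abs(tilt_deg) - dead_zone_deg)    # linear past dead zone
    return sign * min(rate, max_rate)                # clamp to a usable max

# Each frame: slice_index += scroll_rate(current_tilt) * dt
for angle in (0, 2, 5, 15, 45):
    print(angle, scroll_rate(angle))
```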

Participatory design of a biometrically-driven portable audio player (2012)

Music listening assumes a number of different forms and purposes for many people living in a highly digitalized world. Where, how and what songs are listened to can be a highly personalized activity, as unique musical preferences and individual tastes play an important role in choice of music. Today’s portable media devices are high-capacity and easy to carry around, supporting quick access to a nearly unlimited library of media in many use contexts, in nearly any time or place. But these advantages come at a cost: operating the music player while doing other things can involve a physical and mental demand that ranges from inconvenient to dangerous. The Haptic-Affect Loop (HALO) paradigm was introduced by Hazelton et al. (2010) to help users control portable media players by continuously inferring the user’s affective state and the player behaviour they desired through physiological signals. They proposed using the haptic modality to deliver feedback and gathered initial requirements from a single user. In this thesis, we present a qualitative participatory design study that broadens Hazelton’s single-user study to six participants. A more efficient means of obtaining information about a user is developed to support scaling to multiple participants. We then examined these users’ expectations for user-device communication and the functionality of the HALO paradigm, with the objective of identifying clusters of preferred uses for HALO. In this regard, we identified the behaviours of a proposed system that these users would find most useful and would like to interact with. We collectively explored a set of exemplar implicit and explicit interaction scenarios for HALO, finding greater confidence in mechanisms that did not relinquish user control, but openness to trying more implicit control approaches where the priority of control over music listening was lower than that of secondary tasks; this willingness depends on the reliability of the technology. Finally, we generated a set of interaction design guidelines for the next stage of HALO prototyping.

Sensing and recognizing affective touch in a furry zoomorphic object (2012)

Over the last decade, the surprising fact has emerged that machines can possess therapeutic power. Due to the many healing qualities of touch, one route to such power is in haptic emotional interaction, which in turn requires sophisticated touch sensing and interpretation. We explore the development of affective touch gesture recognition technologies in the context of a furry artificial lap-pet, with the ultimate goal of creating therapeutic interactions by sensing human emotion through touch. We design, construct, and evaluate a low-cost, low-tech furry lap creature prototype equipped with two types of touch-sensing hardware. The first of these is our own design for a new type of touch sensor built with conductive fur, invented and developed as part of this research. The second is an existing design for a piezoresistive fabric pressure sensor, adapted to three dimensions for our robot-body context. Combining features extracted from the time-series data output of these sensors, we perform machine learning analysis to recognize touch gestures. In a study of 16 participants and 9 key affective gestures, our model averages 94% gesture recognition accuracy when trained on individuals, and 86% accuracy when applied generally across the entire set of participants. The model can also recognize which of the 16 participants is touching the prototype, with an accuracy of 79%. These results promise a new generation of emotionally intelligent machines, enabled by affective touch gesture recognition.
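
The recognition pipeline described here, summary features extracted from pressure-frame time series feeding a machine-learning classifier, follows a standard recipe that can be sketched as below. The feature set, the random-forest model, and the placeholder data are illustrative assumptions; the thesis's actual features and classifier may differ.

```python
# Illustrative sketch of the general recipe: extract summary features from
# touch-sensor time series, then train a classifier. Features, model, and
# the random placeholder data are assumptions, not the thesis's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def touch_features(frames):
    """frames: (T, H, W) array of pressure frames from a fabric/fur sensor.
    Returns a small per-recording feature vector."""
    per_frame = frames.reshape(len(frames), -1)
    total = per_frame.sum(axis=1)                    # overall pressure trace
    return np.array([
        total.mean(), total.std(), total.max(),      # intensity statistics
        per_frame.max(axis=1).mean(),                # typical peak pressure
        (per_frame > 0.1).sum(axis=1).mean(),        # mean contact area (taxels)
        np.abs(np.diff(total)).mean(),               # temporal variability
    ])

# Placeholder demo with random data standing in for labelled recordings.
X = np.stack([touch_features(np.random.rand(40, 8, 8)) for _ in range(20)])
y = np.random.randint(0, 3, size=20)                 # 3 dummy gesture classes
clf = RandomForestClassifier(n_estimators=50).fit(X, y)
print(clf.predict(X[:2]))
```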

Sensing Gait on Smartphones to Support Mobile Exercise Applications and Games (2012)

No abstract available.

The design and field observation of a haptic notification system for timing awareness during oral presentations (2012)

Conference session chairs must manage time, usually by reminding speakers of the time remaining through a variety of means (e.g., visual signs with “X” minutes left). But speakers often miss reminders, chairs cannot confirm reminder receipt, and the broken dialogue can be a sideshow for the audience. The experience of speaking in front of an audience, such as at a conference or even during a classroom lecture, can also be cognitively demanding and overwhelming, further causing speakers to miss reminders from personal timing tools (e.g., a cellphone timer). To address these and other concerns, this thesis describes the design and evaluation of HaNS, a novel wireless wrist-worn chair-speaker Haptic Notification System that delivers tactile timing alerts to unintrusively aid speakers and session chairs in time-managing oral presentations. Iterative deployment and observation in realistic settings were used to optimize the attentional characteristics and the chair-speaker interaction. HaNS’s use was then observed through four field observations in three settings: two mid-sized academic conferences (55 speakers, 16 session chairs, 50 audience members), five university research seminars (11 speakers, 5 session chairs, 15 audience members), and four large university lectures (23 by 3 instructors). Through observation and self-reports, existing speaker and session chair timing practices and difficulties are documented. Results demonstrate that HaNS can improve a user’s awareness of time; it automatically delivers salient notifications unintrusively, privately, and remotely. HaNS also facilitates chair-speaker coordination and reduces distraction of speaker and audience through its private communication channel. Eliminating overruns will require improvement in speakers’ ‘internal’ control, which our results suggest HaNS can also support given practice. This thesis concludes with design guidelines for both conference-deployed and personal timing tools, supported by haptics or other notification modalities.
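
At its simplest, the timing-alert behaviour of a system like HaNS reduces to firing tactile notifications at preset remaining-time milestones. The sketch below is hypothetical: the milestones, vibration patterns, and vibrate() stub are assumptions, and the real HaNS is wireless and supports chair-triggered alerts.

```python
# Hypothetical sketch of the scheduling idea behind a presentation-timing
# notification system: fire tactile alerts at remaining-time milestones.
# All names and values here are illustrative assumptions, not HaNS itself.
import time

def vibrate(pattern):
    print(f"[buzz] {pattern}")          # stand-in for a wristband actuator

def run_talk_timer(talk_minutes, milestones=(5, 2, 0)):
    """Blocks for the talk's duration, alerting at each remaining-time mark."""
    end = time.monotonic() + talk_minutes * 60
    pending = sorted(milestones, reverse=True)   # e.g. 5, 2, then 0 min left
    while pending:
        remaining_min = (end - time.monotonic()) / 60
        if remaining_min <= pending[0]:
            vibrate("long" if pending[0] == 0 else "short")
            pending.pop(0)
        time.sleep(0.5)

# run_talk_timer(20)   # 20-min talk: buzz at 5 and 2 min left, and at time
```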

A First and Second Longitudinal Study of Haptic Icon Learnability (2010)

No abstract available.

Investigating, designing, and validating a haptic-affect interaction loop using three experimental methods (2010)

Computer interfaces commonly make large demands on our visual and auditory attention, which can make multi-tasking with multiple systems difficult. In cases where a primary task demands constant, unbroken attention from the user, it is often implausible for such a user to employ a system for a secondary task, even when desirable. The haptic modality has been suggested as a conduit for the appropriately-intrusive delivery of information from computer systems. Furthermore, physiological signals can be used to infer the affective state of a user without requiring attention. Combining these underexplored channels for implicit system command, control and display, we envision an automated, intelligent and emotionally aware interaction paradigm. We call this paradigm the Haptic-Affect Loop (HALO). This work investigates the potential for the HALO paradigm in a specific use case (portable audio consumption). It uses three experimental techniques to gather requirements for the paradigm, validate its technological feasibility, and develop the feedback-supported language of interaction with a HALO-enabled portable audio system. A focus group is first conducted to identify the perceived utility of the paradigm with a diverse – albeit technologically conservative – group of portable audio users, and to narrow its scope. Results of this focus group indicate that participants are sceptical of its technological feasibility (in particular, context resolution) and are unwilling to relinquish control over their players. This scepticism was alleviated somewhat by the conclusion of the sessions. Next, technological validation of online affect classification is undertaken via an exploratory, but formally controlled, experiment. Galvanic skin response measures provided a means to make initial measurements of interruption and, in some cases, musical engagement. A richer signal array is necessary to make the full range of required affect identifications for this paradigm, and is under development. The final phase of work involves an iterative participatory design process with a single participant, enthusiastic but practical about technology, to better define system requirements and to evaluate input and output mechanisms using a variety of devices and signals. The outcome of this design effort was a functioning prototype, a set of initial system requirements, and an exemplar interaction language for HALO.
