Karon MacLean

Professor

Research Classification

Information Systems

Research Interests

design of user interfaces
haptic interfaces
human-computer interaction
human-robot interaction

Relevant Thesis-Based Degree Programs

Recruitment

Master's students
Doctoral students
Postdoctoral Fellows

Human-robot interaction, user interface design, interactive sensing

I support experiential learning opportunities, such as internships and work placements, for my graduate students and postdocs.

Complete these steps before you reach out to a faculty member!

Check requirements
  • Familiarize yourself with program requirements. You want to learn as much as possible from the information available to you before you reach out to a faculty member. Be sure to visit the graduate degree program listing and program-specific websites.
  • Check whether the program requires you to seek commitment from a supervisor prior to submitting an application. For some programs this is an essential step while others match successful applicants with faculty members within the first year of study. This is either indicated in the program profile under "Admission Information & Requirements" - "Prepare Application" - "Supervision" or on the program website.
Focus your search
  • Identify faculty members who are conducting research in your specific area of interest.
  • Establish that your research interests align with the faculty member’s research interests.
    • Read up on the faculty members in the program and the research being conducted in the department.
    • Familiarize yourself with their work, read their recent publications and past theses/dissertations that they supervised. Be certain that their research is indeed what you are hoping to study.
Make a good impression
  • Compose an error-free and grammatically correct email addressed to your specifically targeted faculty member, and remember to use their correct titles.
    • Do not send non-specific, mass emails to everyone in the department hoping for a match.
    • Address the faculty members by name. Your contact should be genuine rather than generic.
  • Include a brief outline of your academic background, why you are interested in working with the faculty member, and what experience you could bring to the department. The supervision enquiry form guides you with targeted questions. Be sure to craft compelling answers to these questions.
  • Highlight your achievements and why you are a top student. Faculty members receive dozens of requests from prospective students and you may have less than 30 seconds to pique someone’s interest.
  • Demonstrate that you are familiar with their research:
    • Convey the specific ways you are a good fit for the program.
    • Convey the specific ways the program/lab/faculty member is a good fit for the research you are interested in/already conducting.
  • Be enthusiastic, but don’t overdo it.
Attend an information session

G+PS regularly offers virtual sessions that focus on admission requirements and procedures, with tips on how to improve your application.

ADVICE AND INSIGHTS FROM UBC FACULTY ON REACHING OUT TO SUPERVISORS

These videos contain some general advice from faculty across UBC on finding and reaching out to a potential thesis supervisor.

Graduate Student Supervision

Doctoral Student Supervision

Dissertations completed in 2010 or later are listed below. Please note that there is a 6-12 month delay in adding the latest dissertations.

Magic pen: a versatile digital manipulative for learning (2022)

Digital manipulatives such as robots are an opportunity for interactive and engaging learning activities. The addition of haptic and specifically force feedback to digital manipulatives can enrich the learning of science-related concepts by building physical intuition. As a result, learners can design experiments and physically explore them to solve problems they have posed.

In my thesis, I present the evolution of the design and evaluation of a versatile digital manipulative – called MagicPen – in a human-centered design context. First, I investigate how force feedback can enable learners to fluidly express their ideas. I identify three core interactions as bases for physically assisted sketching (phasking). Then, I show how using these interactions improves the accuracy of users’ drawings as well as their authority in creative works. In the next phase, I demonstrate the potential benefits of using force feedback in a collaborative learning framework, in a manner that is generalizable beyond the device we invented and lends insight into how haptics can empower digital manipulatives to express advanced concepts by means of the behaviour of a virtual avatar and the respective feeling of force feedback. This informs our device’s capability for learning advanced concepts in classroom settings and further considerations for the next iterations of the MagicPen.

Based on the findings of how haptic feedback could assist with design and exploration in learning, in the last phase of my thesis I propose a framework for physically assisted learning (PAL) which links the expression and exploration of an idea. Furthermore, I explain how to instantiate the PAL framework using available technologies and discuss a path forward to a larger vision of physically assisted learning. PAL highlights the role of haptics in future "objects-to-think-with".

View record

Haptic experience design: tools, techniques, and process (2017)

Haptic technology, which engages the sense of touch, offers promising benefits for a variety of interactions including low-attention displays, emotionally-aware interfaces, and augmented media experiences. Despite an increasing presence of physical devices in commercial and research applications, there is still little support for the design of engaging haptic sensations. Previous literature has focused on the significant challenges of technological capabilities or physical realism rather than on supporting experience design. In this dissertation, we study how to design, build, and evaluate interactive software to support haptic experience design (HaXD). We define HaXD and iteratively design three vibrotactile effect authoring tools, each a case study covering a different user population, vibrotactile device, and design challenge, and use them to observe specific aspects of HaXD with their target users. We make these in-depth findings more robust in two ways: generalizing results to a breadth of use cases with focused design projects, and grounding them with expert haptic designers through interviews and a workshop. Our findings 1) describe HaXD, including processes, strategies, and challenges; and 2) present guidelines on designing, building, and evaluating interactive software that facilitates HaXD. When characterizing HaXD processes, strategies, and challenges, we show that experience design is already practiced with haptic technology, but faces unique considerations compared to other modalities. We identify four design activities that must be explicitly supported: sketching, refining, browsing, and sharing. We find and develop strategies to accommodate the wide variety of haptic devices. We articulate approaches for designing meaning with haptic experiences, and finally, highlight a need for supporting adaptable interfaces. When informing the design, implementation, and evaluation of HaXD tools, we discover critical features, including a need for improved online deployment and community support. We present steps to develop both existing and future research software into a mature suite of HaXD tools, and reflect upon evaluation methods. By characterizing HaXD and informing supportive tools, we make a first step towards establishing HaXD as its own field, akin to graphic and sound design.

View record

Personalizing haptics: from individuals' sense-making schemas to end-user haptic tools (2017)

Synthetic haptic sensations will soon proliferate throughout many aspects of our lives, well beyond the simple buzz we get from our mobile devices. This view is widely held, as evidenced by the growing list of use cases and industry's increasing investment in haptics. However, we argue that taking haptics to the crowds will require haptic design practices to go beyond a one-size-fits-all approach, common in the field, to satisfy users' diverse perceptual, functional, and hedonic needs and preferences reported in the literature. In this thesis, we tackle end-user personalization to leverage the utility and aesthetics of haptic signals for individuals. Specifically, we develop effective haptic personalization mechanisms, grounded in our synthesis of users' sense-making schemas for haptics. First, we propose a design space and three distinct mechanisms for personalization tools: choosing, tuning, and chaining. Then, we develop the first two mechanisms into: 1) an efficient interface for choosing from a large vibration library, and 2) three emotion controls for tuning vibrations. In developing these, we devise five haptic facets that capture users' cognitive schemas for haptic stimuli, and derive their semantic dimensions and between-facet linkages by collecting and analyzing users' annotations for a 120-item vibration library. Our studies verify the utility of the facets as a theoretical model for personalization tools. In collecting users' perception, we note a lack of scalable haptic evaluation methodologies and develop two methodologies for large-scale in-lab evaluation and online crowdsourcing of haptics.

Our studies focus on vibrotactile sensations as the most mature and accessible haptic technology, but our contributions extend beyond vibrations and inform other categories of haptics.

View record

Periodic vibrotactile guidance (2014)

The emergence of mobile technologies, with their ever increasing computing power, embedded sensors, and connectivity to the Internet, has created many new applications such as navigational guidance systems. Unfortunately, these devices can become problematic by inappropriate usage or overloading of the audiovisual channels. Wearable haptics has come to the rescue with the promise of offloading some of the communication from the audiovisual channels.

The main goal of our research is to develop a spatiotemporal guidance system based on the potentials and limitations of the sense of touch. Our proposed guidance method, Periodic Vibrotactile Guidance (PVG), guides movement frequency through periodic vibrations to help the user achieve a desired speed and/or finish a task in a desired time. We identify three requirements for a successful PVG system: accurate measurement of the user's movement frequency, successful delivery of vibrotactile cues, and the user's ability to follow the cues at different rates and during auditory multitasking.

In Phase 1, we study the sensitivity of different body locations to vibrotactile cues with/without visual workload and under different movement conditions, and examine the effect of expectation of location and gender differences. We create a set of design guidelines for wearable haptics.

In Phase 2, we develop the Robust Realtime Algorithm for Cadence Estimation (RRACE), which measures momentary step frequency/interval via frequency-domain analysis of accelerometer signals available in smartphones. Our results show that, with a 95% accuracy, RRACE is more accurate than the published state-of-the-art time-based algorithm.

In Phase 3, we use the guidelines from Phase 1 and the RRACE algorithm to study PVG. First we examine walkers' susceptibility to PVG, which shows most walkers can follow the cues with 95% accuracy. Then we examine the effect of auditory multitasking on users' performance and workload, which shows that PVG can successfully guide the walker's speed during multitasking.

Our research expands the reach of wearable haptics and guidance technologies by providing design guidelines, a robust cadence detection algorithm, and Periodic Vibrotactile Guidance -- an intuitive method of communicating spatiotemporal information in a continuous manner -- which can successfully guide movement speed with little to no learning required.
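As a rough illustration of the frequency-domain cadence estimation described above, here is a minimal sketch in Python; the walking-frequency band, windowing, and function names are illustrative assumptions, not details taken from the thesis.

    import numpy as np

    def estimate_cadence_hz(accel_xyz, fs, min_hz=0.5, max_hz=4.0):
        """Estimate step frequency (Hz) from one window of 3-axis
        accelerometer samples via frequency-domain analysis.
        accel_xyz: array of shape (n_samples, 3); fs: sample rate in Hz."""
        mag = np.linalg.norm(accel_xyz, axis=1)   # orientation-independent
        mag = mag - mag.mean()                    # drop gravity/DC component
        spectrum = np.abs(np.fft.rfft(mag * np.hanning(len(mag))))
        freqs = np.fft.rfftfreq(len(mag), d=1.0 / fs)
        band = (freqs >= min_hz) & (freqs <= max_hz)   # plausible step rates
        return freqs[band][np.argmax(spectrum[band])]  # dominant frequency

Run over a sliding window of a few seconds to track momentary cadence.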

View record

The Haptic Creature: Social Human-Robot Interaction through Affective Touch (2012)

Emotion communication is an important aspect of social interaction. Affect display research from psychology as well as social human-robot interaction has focused primarily on facial or vocal behaviors, as these are the predominant means of expression for humans. Much less attention, however, has been on emotion communication through touch, which, though unique among the senses, can be methodologically and technologically difficult to study.

Our thesis investigated the role of affective touch in the social interaction between human and robot. Through a process of design and controlled user evaluation, we examined the display, recognition, and emotional influence of affective touch. To mitigate issues inherent in touch research, we drew from interaction models not between humans but between human and animal, whereby the robot assumes the role of companion animal.

We developed the Haptic Creature, a small, zoomorphic robot novel in its sole focus on touch for both affect sensing and display. The robot perceives movement and touch, and it expresses emotions through ear stiffness, modulated breathing, and vibrotactile purring. The Haptic Creature was employed in three user studies, each exploring a different aspect of affective touch interaction.

Our first study examined emotion display from the robot. We detail the design of the Haptic Creature's affect display, which originated from animal models, then was enhanced through successive piloting. A formal study demonstrated the robot was more successful communicating arousal than valence.

Our second study investigated affect display from the human. We compiled a touch dictionary from psychology and human-animal interaction research. Participants first rated the likelihood of using these touch gestures when expressing a variety of emotions, then performed likely gestures communicating specific emotions for the Haptic Creature. Results provided properties of human affect display through touch and high-level categorization of intent.

Our final study explored the influence of affective touch. Results empirically demonstrated that the human's emotional state was directly influenced by affective touch interactions with the robot.

Our research has direct significance to the field of socially interactive robotics and, further, any domain interested in human use of affective touch: psychology, mediated social touch, human-animal interaction.

View record

Design of haptic signals for information communication in everyday environments (2009)

Multi-function interfaces have become increasingly pervasive and are frequently used in contexts which pose multiple demands on a single sensory modality. Assuming some degree of modularity in attentional processing, and that using a different sensory channel for communication can reduce interference with critical visual tasks, one possibility is to divert some information through the touch sense.

The goal of this Thesis is to advance our knowledge of relevant human capabilities and embed this knowledge into haptic communication design tools and procedures, in the interest of creating haptically supported interfaces that decrease rather than add to their users’ sensory and cognitive load. In short, we wanted to create tools and methods that would allow the creation of haptic signals (accomplished via display of either forces or vibrations) extending beyond the one bit of communication offered by current pagers and cellular phone buzzers.

In our quest to create information-rich haptic signals we need to learn how to create signals that are differentiable. We also need to study ways to assign meanings to these signals and make sure that they can be perceived clearly when presented one after another, even in environments where their recipient might be involved with other tasks. These needs frame the specific research goals of this thesis.

Most of the results described here were obtained through the study of tactile (in the skin) rather than proprioceptive (force feedback) stimuli. We begin by presenting several methods to create, validate and contrast tactile stimulus dissimilarity data and investigate the design of a waveform intended to be a tactile perceptual intermediate between a square waveform and a triangle waveform. Next, we explore methods to create and test tactile signal-meaning associations and document a surprising ability of participants to exhibit high recall of quickly learned associations at two weeks, in a first examination of longitudinal recall of tactile stimuli. We then present methods to measure tactile stimulus masking and identify crucial perceptual thresholds relating to stimulus temporal spacing in an exploration into the masking effects of common-onset vibrotactile stimuli. Finally, we present methods to test haptic and multimodal perception in simulated scenarios, including a method to simulate and control cognitive workload; and provide evidence that the commonly-used device of multimodal signal reinforcement can adversely impact performance in an ongoing primary task.

The research presented in this Thesis has implications for the design of signals to be used in displays that are emerging in embedded computing environments such as cars, games, cellular phones, and medical devices.

View record

Master's Student Supervision

Theses completed in 2010 or later are listed below. Please note that there is a 6-12 month delay in adding the latest theses.

How teens can build remote social connection through an emotionally supportive robot swarm (2023)

Social connection plays an important role in early adolescent development, yet teenagers are increasingly turning to remote communication technologies like social media in order to fulfill their social needs. Unfortunately, these technologies are often missing several important elements of in-person interaction, such as non-verbal emotional cues and affective social touch. To address these shortcomings, this thesis explores the interaction design for ESSbots, one implementation of a proposed new kind of social medium that focuses on the shareable, expressive behaviours of an “emotionally supportive swarm” of small mobile robots. We grounded our initial design framework in the cognitive science theories of participatory sensemaking, embodied, embedded, enactive cognition, and actor network theory, and explored how the swarm robot properties of tangible embodiment, multiplicity and coordination, and animacy, agency, identification, and roleplay support remote group communication and connection through a series of iterative participatory design workshops. Based on participant feedback, we developed an interaction prototype and interface to support accessible swarm robot behaviour authoring and remote sharing between friends via atomic behaviours, i.e., basic actions that can be combined and modified to create complex, expressive behaviours with one or several robots. Our workshop findings revealed that teenagers wanted to use the swarm to communicate in creative and playful ways, share expressive emotions, and reflect their own personalities through the robots as proxies. They viewed the embodied aspect of ESSbots as a unique and important element of remote communication, and felt that mediated touch in particular would help them feel closer to their friends. We also found that our design was generalizable to both a new and returning group of participants. It was consistently easy to use and engaging, although participants felt the clarity of remote communication works best with friend groups who already know each other well.

Finally, we highlight important design recommendations for ESSbots. Most notably, participants wanted a high level of autonomy over their own robots, including mechanisms to support consent when others use the swarm, as well as a mix of possible control methods including visual scripting, pre-made buttons, and direct manipulation to support expressive affect sharing and playful interaction.
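To make the atomic-behaviours idea concrete, here is a minimal sketch of how basic actions might be represented, modified, and chained; every name and field below is a hypothetical illustration, not the ESSbots implementation.

    from dataclasses import dataclass, replace
    from typing import List

    @dataclass(frozen=True)
    class AtomicBehaviour:
        """One basic swarm-robot action; fields are illustrative."""
        name: str          # e.g. "huddle", "spin", "pulse"
        duration_s: float  # how long the action runs
        intensity: float   # 0..1; could map to speed or LED brightness

    def slowed(b: AtomicBehaviour, factor: float = 2.0) -> AtomicBehaviour:
        """Modify an atomic behaviour by stretching it in time."""
        return replace(b, duration_s=b.duration_s * factor)

    def compose(*steps: AtomicBehaviour) -> List[AtomicBehaviour]:
        """Chain atomic behaviours into a complex, shareable expression."""
        return list(steps)

    # Hypothetical usage: a gentle expression to send to a friend's swarm.
    comfort = compose(AtomicBehaviour("huddle", 2.0, 0.3),
                      slowed(AtomicBehaviour("pulse", 1.0, 0.5)))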

View record

Beyond the bulging binder: family-centered design of an information management system for caregivers of children living with health complexity (2022)

Children Living with Health Complexity (CLHC) require continuity of health and community care to improve their quality of life and decrease family care burden. Due to medical complexity and numerous chronic conditions, these children rely heavily on multiple care providers. However, a fragmented health system and the communication challenges between stakeholders pose many obstacles for their caregivers, and the result is non-optimal care in both hospital and community. This leads to an immense burden on families who take on dual roles and become responsible for care coordination. Parent caregivers must continually manage and share masses of paper documents and repeat their child’s story for different stakeholders. A digital information management and care coordination solution to support these caregivers is long overdue.

Our goal was to engage with parent caregivers of CLHC through the user-centered design process to understand their needs in a digital solution. Twelve caregivers participated in three rounds of user studies which were followed by design phases. In the first phase of the study, we aimed to understand the caregiver challenges, pain-points and strategies for dealing with masses of paper and electronic data. By utilizing thematic analysis, we found a set of caregiver challenges which closely aligned with previous research: access to health records, navigating the care system, organizing and managing information, finding resources, repeating their story, and managing finances. We mapped these challenges to the caregiver strategies and devised a set of design principles to address these challenges. We also collected data on caregivers’ feature preferences in a digital solution. Our emergent design principles are: providing a holistic view of patient care, allowing customizability and flexibility, personalizing and humanizing, facilitating communication and collaboration with care providers, avoiding jargon, capturing health history and providing insight, sharing and accessing confidentially, and integrating information. Finally, we developed an extensive prototype blueprint through an iterative process of feedback and design to serve as an example for implementing these design principles in a caregiver-centered interface. The design principles and the prototype are intended to be a stepping stone for developing the content and features of a caregiver-centered information management system.

View record

Feeling (key)pressed: comparing the ways in which force and self-reports reveal emotion (2022)

Interactive human-computer systems can be enriched to interpret and respond to users’ affective states using computational emotion models, which necessitates the collection of authentic and spontaneous emotion data. Popular emotion modelling frameworks rely on convenient, yet static abstractions of emotion (e.g., Ekman's basic emotions and Russell's circumplex). These abstractions often oversimplify complex emotional experiences into single emotion categories. In turn, emotion models guided by such emotion annotations leave out significant aspects of the user's true, spontaneous emotional experience. Richer representations of emotion, negotiated and understood between participants and researchers, can be created using mixed-methods labelling approaches, i.e., assigning an emotion descriptor to a recorded segment of experience. However, the resulting emotion annotations are often not ready-to-use in computational models. In this thesis, we investigate (1) ways to improve the meaningfulness of self-reported emotion annotations, and (2) ways to understand the implicit expression of emotion in touch pressure. For the first, we propose three strategies to interpret multiple versions of self-annotated dynamic emotion through combining (multi-label classification), extracting (of alignment metrics), and resolving (of conflicts between) emotion labels. We evaluate our label-resolution strategies using the FSR EEG Emotion-Labelled (FEEL) dataset (N=16). The FEEL dataset includes brain activity and keypress force data captured from a 10-minute video of user gameplay experience, annotated with two methods of self-reporting emotion: a continuous annotation and an interview. By featuring multi-pass self-report and user-calibrated scales, the data collection protocol prioritized the capture of genuine emotion evolution. We triangulate multiple self-annotated emotion reports and evaluate classification accuracy of our three proposed label resolution strategies. For our second research question, we compare models built on keypress force and brain activity data in an effort to understand the implicit expression of emotion in touch pressure. Finally, we reflect on the trade-offs of each strategy for developing computational models of emotion. Our findings suggest that touch-based models outperform those built on brain activity, and mixed-methods emotion annotations increase self-report meaningfulness.
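As a toy sketch of the combining and extracting strategies, assume each self-report pass assigns one emotion label per time segment (a simplification of the thesis's richer annotations); the labels and function names are invented for illustration.

    def combine_labels(pass_a, pass_b):
        """Merge two annotation passes over the same segments into
        multi-label sets rather than forcing a single category."""
        return [sorted({a, b}) for a, b in zip(pass_a, pass_b)]

    def alignment(pass_a, pass_b):
        """A simple alignment metric: the fraction of segments on
        which the two passes agree."""
        return sum(a == b for a, b in zip(pass_a, pass_b)) / len(pass_a)

    # Hypothetical per-segment labels from two self-report methods.
    continuous = ["calm", "excited", "excited", "stressed"]
    interview  = ["calm", "excited", "stressed", "stressed"]
    print(combine_labels(continuous, interview))  # multi-label annotations
    print(alignment(continuous, interview))       # 0.75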

View record

myWeekInSight: visualizing personal data for chronic pain management through youth-centered design (2022)

Chronic pain is a common and costly condition in youth. 20-30% report recurrent pain that is not disabling, but still interferes with academic, social, and recreational functioning, and has significant effects on mental health (e.g. higher rates of anxiety, depression, and post-traumatic stress symptoms). Existing digital applications to help patients self-manage chronic pain often report low engagement and typically focus narrowly on either symptom tracking (emphasis on user providing data), or intervention delivery (translating an in-person intervention into a digital format).

We hypothesize that if youth with chronic pain could actively explore data from their lived experiences, they could better relate their symptoms to other areas of their lives and improve their general functioning. Our approach novelly uses interactive visualization of self-reported data as an intervention. We contribute design principles for engaging youth-centered visualizations of personal health data, and discuss metrics that can be used to measure their efficacy. Both were derived through the development of myWeekInSight, a visualization-based web application for teens to interactively explore personal health data, using data collected via thrice-daily surveys (Ecological Momentary Assessments) to capture in-the-moment a youth’s everyday circumstances, symptoms and experiences. We developed these visualizations iteratively with guidance from pediatric chronic pain clinicians, a patient partner, and experts in information visualization and human-computer interaction. We evaluated them in two phases, both with members of the target population: (1) design evaluation (N=10): assessment of comprehensibility, usability, and engagement through semi-structured interviews and questionnaires, followed by a qualitative analysis using affinity diagramming; and (2) utility evaluation (N=50): through a 2-week clinical deployment of a fully-functioning prototype developed in collaboration with a health tech firm, followed by semi-structured interviews and questionnaires with a subset of the participants (N=10) analyzed using affinity diagramming. Youth found the visualizations to be reflective of their experiences, interesting and useful, and were able to extract actionable insights; they also confirmed their interest in using the application in their daily lives, and described possible usage scenarios. We close by discussing our learnings from the evaluation studies, and implications for next steps.

View record

What does it take to be a haptician? how community can empower designers and expose the many ways of being an expert (2022)

Haptic design practices have grown from an engineering sub-field in the '90s to encompass areas of robotics, human-computer interaction, the creative arts, and more. Yet designing in the haptic medium remains complex and difficult to learn regardless of one’s training, in part because access to specific knowledge, skill, and tools is currently limited outside academia and certain industries. Within academia, there has been haptic design and knowledge sharing, but these efforts are often accessible only to designers in the STEM-aligned, technical sphere. Technological feats have enabled the field of haptics to grow; we are hearing it discussed in our everyday devices, courses, and projects. With the barrier to entry in the field lowering, challenges of haptic design are also shifting.

We explore the opportunity opening at this crux, one where we want to enable and empower hapticians to create and understand touch sensations by expanding the contexts of haptic design. We do so through a design justice framework and a feminist, participatory qualitative approach. Individuals remain experts in their own lived experiences, whether that be topical, experiential, or technical, but is there a way to embolden this specialized haptics knowledge for larger collaboration and knowledge sharing? We hypothesize that a suitably structured community resource could provide an empowering, inclusive, and reflexive design ecosystem for hapticians of diverse backgrounds. Our research took two parallel paths: understanding the perspectives of “peripheral” hapticians and designing an online resource for community building for haptic design (N=6). In our understanding path, we learned that underrepresented hapticians need support in their interest areas, specifically through a welcoming community space. Additionally, we described obstacles still faced in the field and presented eight social principles for haptic design. In our designing path, we applied our findings to create a haptic design resource (Haptics Commons), which we evaluated in a pilot study (N=6). We found that representing prospective hapticians as both practitioners (people with specialized skill) and explorers (people looking to learn) on a community platform gives promise to inclusivity and empowerment.

View record

Comparing haptic application design communities: characterizing differences and similarities for future design knowledge sharing (2020)

Haptic technology has increasingly blended digital and physical world elements to create intuitive interactions in areas such as affective computing, VR/AR, video games, education and various other domains. However, we posit that the emergence of best processes for designing impactful haptic applications has been hindered by a lack of shared understanding of the technical and conceptual design knowledge involved in developing meaningful haptic experiences. With over 27 years of diverse haptic literature, we have an opportunity to verify our supposition by characterizing communities' design practices and their similarities/differences, which can highlight areas of design expertise as well as gaps that other communities can help fill. In future work, these characterizations can be further analyzed and integrated to help formulate effective haptic application design processes, which could lead towards new or improved haptic application experiences. We conducted a scoping literature review that provided initial characterizations of community haptic application design practices, in order to lay the foundations for future cross-fertilization of design knowledge.

View record

Rapid mold prototyping: creating complex 3D castables from 2D cuts (2020)

Designers, makers, and artists prototype physical products by iteratively ideating, modeling, and realizing them in a fast, exploratory manner. A popular method of bringing 3D designs to life is through casting. Casting is the process of pouring a material into a mold, such that once the material sets, the target object is created. Currently, the process of turning a digital design into a tangible product can be difficult. One reason for this is that building the mold - for example by 3D printing it - can take hours, slowing down the prototyping process. This can be particularly true when prototyping molds for casting interactive (sensate and actuated) or geometrically complex (curvy) objects.

To this end, we developed two mold-making techniques intended to facilitate different, complementary needs for rapid prototyping. The first technique we introduce is Silicone I/O, a making method based on Computer Numerical Control (CNC) that enables the molding of sensate, actuated silicone devices. This method uses stacked laser-cut slices of wood bound together with molten wax in order to create cheap, accessible, one-time-use molds that are quick and easy to assemble. The Silicone I/O devices are pneumatically actuated using air channels created through lost-wax casting, and made sensate by mixing carbon fibre with silicone. The second technique that we describe is FoldMold, which allows curvy molds to be rapidly built out of paper and wax. This approach is based on “unfolding” a 3D object, cutting the 2D layout, and using papercraft techniques to reassemble the mold.

Beyond the physical challenges of rapid mold-making, digitally designing mold patterns from 3D objects poses a bottleneck in the design process. We contribute the FoldMold Blender Add-on, a computational tool that turns 3D positives into CNC-ready papercraft mold patterns. This thesis contributes two broad approaches to increasing speed in mold prototyping. The first is creating flat, laser-cuttable mold patterns, significantly speeding up the actual mold creation process. The second is automating mold design, off-loading much of the tedious design work to computer software that can help a maker design a mold very quickly.

View record

Building believable robots: an exploration of how to make simple robots look, move, and feel right (2017)

Humans have an amazing ability to see a 'spark of life' in almost anything that moves. There is a natural urge to imbue objects with agency. It's easy to imagine a child pretending that a toy is alive. Adults do it, too, even when presented with evidence to the contrary. Leveraging this instinct is key to building believable robots, i.e. robots that act, look, and feel like they are social agents with personalities, motives, and emotions. Although it is relatively easy to initiate a feeling of agency, it is difficult to control, consistently produce, and maintain an emotional connection with a robot. Designing a believable interaction requires balancing form, function and context: you have to get the story right. In this thesis, we discuss (1) strategies for designing the bodies and behaviours of simple robot pets; (2) how these robots can communicate emotion; and (3) how people develop narratives that imbue the robots with agency. For (1), we developed a series of four robot design systems to create and rapidly iterate on robot form factors, as well as tools for improvising and refining expressive robot behaviours. For (2), we ran three studies wherein participants rated robot behaviours in terms of arousal and valence under different display conditions. For (3), we ran a study wherein expert performers improvised emotional 'stories' with the robots; also, one of the studies in (2) included soliciting narratives for the robot and its behaviours.

View record

Designing zooming interactions for small displays with a proximity sensor (2017)

Small, high resolution touchscreens open new possibilities for wearable and embedded applications, but are a mismatch for interactions requiring appreciable movement on the screen surface. For example, multi-touch or large-scroll zooming actions suffer from occlusion and difficulties in accessing or resolving large zoom ranges or selecting small targets.

Meanwhile, emerging technologies have the potential to combine many capabilities, e.g., touch- and proximity-sensitivity, flexibility and transparency. A current challenge is to develop interaction techniques that can exploit the capabilities of these new materials to solve interaction challenges presented by trends such as miniaturization and wearability, for example tiny screens that only one finger of one hand can fit on.

To this end, Zed-zooming exploits the capabilities of emerging near-proximity sensors to address these problems, by mapping finger height above a control surface to image size. The EZ-Zoom technique adds the pseudohaptic illusion of an elastic finger-screen connection, by exploiting non-linear scaling functions to provide a usage metaphor. In a two-part user study, we compared EZ-Zoom to the touchscreen-standard pinch-to-zoom on smartphone and smartwatch screens, and found (a) a significant improvement in task time and preference for the smallest screen (equivalent task time for the smartphone); and (b) that the illusion improved users' reported sense of control, provided cues about the interaction's spatial extent and dynamics, and made the interaction more natural. From our experience with the study, we conclude requirements for the development of proximity sensors in order to afford such interactions.

Our work goes on to reflect on how zed-zooming can be incorporated into seamless interaction tasks. We aim to identify some characteristics of a zooming interaction that would need to be considered when designing a complete one, and explore how these characteristics play into a complete and usable zooming interaction.
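The height-to-size mapping reduces to a single function; the power-law curve and the constants below are illustrative guesses at a non-linear scaling function, not values from the thesis.

    def zoom_factor(height_mm, max_height_mm=40.0, gamma=2.0, max_zoom=8.0):
        """Map finger height above the screen to an image zoom factor.
        A non-linear (here power-law) curve changes slowly near the
        surface, supporting the elastic finger-screen metaphor, then
        accelerates to cover a large zoom range as the finger rises."""
        h = min(max(height_mm, 0.0), max_height_mm) / max_height_mm  # 0..1
        return 1.0 + (max_zoom - 1.0) * h ** gamma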

View record

Towards an Emotionally Communicative Robot: Feature Analysis for Multimodal Support of Affective Touch Recognition (2016)

Human affective state extracted from touch interaction takes advantage of natural communication of emotion through physical contact, enabling applications like robot therapy, intelligent tutoring systems, emotionally-reactive smart tech, and more. This work focused on the emotionally aware robot pet context and produced a custom, low-cost piezoresistive fabric touch sensor at 1-inch taxel resolution that accommodates the flex and stretch of the robot in motion. Using established machine learning techniques, we built classification models of social and emotional touch data. We present an iteration of the human-robot interaction loop for an emotionally aware robot through two distinct studies and demonstrate gesture recognition at roughly 85% accuracy (chance 14%).

The first study collected social touch gesture data (N=26) to assess data quality of our custom sensor under noisy conditions: mounted on a robot skeleton simulating regular breathing, obscured under fur casings, placed over deformable surfaces. Our second study targeted affect with the same sensor, wherein participants (N=30) relived emotionally intense memories while interacting with a smaller stationary robot, generating touch data imbued with the following: Stressed, Excited, Relaxed, or Depressed. A feature space analysis triangulating touch, gaze, and physiological data highlighted the dimensions of touch that suggest affective state.

To close the interactive loop, we had participants (N=20) evaluate researcher-designed breathing behaviours on 1-DOF robots for emotional content. Results demonstrate that these behaviours can display human-recognizable emotion as perceptual affective qualities across the valence-arousal emotion model. Finally, we discuss the potential impact of a system capable of emotional “conversation” with human users, referencing specific applications.

View record

Scrolling in Radiology Image Stacks: Multimodal Annotations and Diversifying Control Mobility (2014)

Advances in image acquisition technology mean that radiologists today must examine thousands of images to make a diagnosis. However, the physical interactions performed to view these images are repetitive and not specialized to the task. Additionally, automatic and/or radiologist-generated annotations may impact how radiologists scroll through image stacks as they review areas of interest. We analyzed manual aspects of this work by observing and/or interviewing 19 radiologists; stack scrolling dominated the resulting task examples.

We used a simplified stack seeded with correct or incorrect annotations in our experiment on lay users. The experiment investigated the impact of four scrolling techniques: traditional scrollwheel, click+drag, sliding-touch and tilting to access rate control. We also examined the effect of visual vs. haptic annotation cues on scrolling dynamics, detection accuracy and subjective factors. Scrollwheel was the fastest scrolling technique overall for our lay participants. Combined visual and haptic annotation highlights increased the speed of target-finding in comparison to either modality alone.

Multimodal annotations may be useful in radiology image interpretation; users are heavily visually loaded, and there is background noise in the hospital environment. From interviews with radiologists, we see that they are receptive to a mouse that they can use to map different movements to interactions with images as an alternative to the standard mouse usually provided with their workstation.

View record

Participatory design of a biometrically-driven portable audio player (2012)

Music listening assumes a number of different forms and purposes for many people who live in a highly digitalized world. Where, how and what songs are listened to can be a highly personalized activity, as unique musical preferences and individual tastes play an important role in choice of music. Today’s portable media devices are high-capacity and easy to carry around, supporting quick access to a nearly unlimited library of media in many use contexts, at nearly any time or place. But these advantages come at a cost. Operating the music player while doing other things can involve a physical and mental demand that ranges from inconvenient to dangerous. The Haptic-Affect Loop (HALO) paradigm was introduced by Hazelton et al. (2010) to help users control portable media players by continuously inferring the user’s affective state and the player behaviour they desired through physiological signals. They proposed using the haptic modality to deliver feedback and gathered initial requirements from a single user. In this thesis, we present a qualitative participatory design study which broadens Hazelton’s single-user participatory design study to include six participants. A more efficient means of obtaining information about a user is developed to support scaling to multiple participants. We then examined these users’ expectations for user-device communication and the functionality of the HALO paradigm, with the objective of identifying clusters of preferred uses for HALO. In this regard, we identified the behaviours of a proposed system that these users would find most useful, and that they would like to interact with. We collectively explored a set of exemplar implicit and explicit interaction scenarios for HALO, finding greater confidence in mechanisms that did not relinquish user control, but openness to trying more implicit control approaches where the priority of control in listening to music was lower than secondary tasks. The willingness to try more implicit control approaches depends on the reliability of the technology. Finally, we generated a set of interaction design guidelines for the next stage of HALO prototyping.

View record

Sensing and Recognizing Affective Touch in a Furry Zoomorphic Object (2012)

Over the last decade, the surprising fact has emerged that machines can possess therapeutic power. Due to the many healing qualities of touch, one route to such power is in haptic emotional interaction, which in turn requires sophisticated touch sensing and interpretation. We explore the development of affective touch gesture recognition technologies in the context of a furry artificial lap-pet, with the ultimate goal of creating therapeutic interactions by sensing human emotion through touch. We design, construct, and evaluate a low-cost, low-tech furry lap creature prototype equipped with 2 types of touch-sensing hardware. The first of these hardware types is our own design for a new type of touch sensor built with conductive fur, invented and developed as part of this research. The second is an existing design for a piezoresistive fabric pressure sensor, adapted to three dimensions for our robot-body context. Combining features extracted from the time-series data output of these sensors, we perform machine learning analysis to recognize touch gestures. In a study of 16 participants and 9 key affective gestures, our model averages 94% gesture recognition accuracy when trained on individuals, and 86% accuracy when applied generally across the entire set of participants. The model can also recognize who out of the 16 participants is touching the prototype with an accuracy of 79%. These results promise a new generation of emotionally intelligent machines, enabled by affective touch gesture recognition.
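A minimal sketch of the sensing-to-recognition pipeline, assuming fixed-length windows of taxel pressure data and standard scikit-learn tooling; the features, gesture names, and classifier are illustrative stand-ins for the thesis's actual feature set and models.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def touch_features(window):
        """Summarize a (n_frames, n_taxels) pressure window with
        simple time-series statistics."""
        total = window.sum(axis=1)               # overall pressure per frame
        return np.array([
            total.mean(), total.std(), total.max(),
            (window > window.mean()).mean(),     # fraction of active readings
            np.abs(np.diff(total)).mean(),       # frame-to-frame variability
        ])

    def train_gesture_model(windows, labels):
        """Fit a classifier on labelled recordings; labels might be
        gesture names such as 'stroke', 'pat', 'scratch'."""
        X = np.stack([touch_features(w) for w in windows])
        return RandomForestClassifier(n_estimators=100).fit(X, labels)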

View record

Sensing Gait on Smartphones to Support Mobile Exercise Applications and Games (2012)

To encourage and support physical activity in increasingly sedentary lifestyles, many are turning to mobile technology. Modern smartphones are equipped with a wealth of sensors, including Global Positioning Systems (GPS) and accelerometers, suggesting great potential to be integrated with fitness and exercise applications. So far, GPS-enabled devices have been used to support running, cycling, or even exercise games that encourage people to be physically active, but GPS-enabled devices lack fine-grained information about the user’s activity. Accelerometers have been used to some effect to detect step count and walking cadence (step rate), and even to classify activity (distinguishing walking from cycling, for example), but require a known carrying location and orientation. In this work, we examine the role of location in two application areas - real-time cadence estimation and gait classification - and develop algorithms to accommodate diverse carrying locations.

In the first application area, real-time cadence estimation, our algorithm (Robust Real-time Algorithm for Cadence Estimation, or RRACE) uses a frequency-domain analysis to perform well without training or tuning, and is robust to changes in carrying locations. We demonstrate RRACE’s performance and robustness to be an improvement over existing algorithms with data collected from a user study.

In the second application area, gait classification, we present a novel set of 15 gaits suitable for exercise games and other fitness applications. Using a minimal amount of training for participants, we can achieve a mean of 78.1% classification for all 15 gaits and all locations, an accuracy which may be usable now in some applications and warrants further investigation of this approach. We present findings of how our classification scheme confuses these gaits, and encapsulate insights in guidelines for designers. We also demonstrate that our classification performance varies dramatically for each individual even when trained and tested on that individual, suggesting strong individual differences in performing gaits. Our innovative methodology for simple and quick collection of accelerometer data is also described in detail. Future work includes planned improvements to both algorithms, further investigation of individual differences, and extension of this work to other application areas.

View record

TAMER: touch-guided anxiety management via engagement with a robotic pet: efficacy evaluation and the first steps of the interaction design (2012)

Anxiety disorders are widespread among children and adolescents, yet the existing treatments are effective for only a small proportion of the affected young population. We propose a novel idea for improving the efficacy of anxiety treatments that relies on affective touch as a therapeutic medium. Building upon the wealth of evidence for therapeutic benefits of animals, our approach utilizes an animatronic pet, the Haptic Creature, as a tool to deliver calming effects. We ground our idea in the framework of social cognitive theory as used in human-animal interaction. We first model the interaction design as a search in a broadly defined interaction space, and then introduce a novel and systematic approach to the interaction design process. We describe our iterative design of a human-Creature interaction that was measurably calming, and share the methodology and results of our two most significant evaluation cycles. Our principal results, from the second study, showed that the interaction with the Haptic Creature, while it is breathing slowly and constantly, produces calming effects as indicated by decreased heart rate and breathing rate as well as the subjective reports.

View record

The design and field observation of a haptic notification system for timing awareness during oral presentations (2012)

Conference session chairs must manage time, usually by reminding speakers of the time remaining through a variety of means (e.g., visual signs with “X” minutes left). But speakers often miss reminders, chairs cannot confirm reminder receipt, and the broken dialogue can be a sideshow for the audience. The experience of speaking in front of an audience, such as at a conference or even during a classroom lecture, can also be cognitively demanding and overwhelming. This further causes speakers to miss reminders from personal timing tools (e.g., cellphone timer). To address these and other concerns, this thesis describes the design and evaluation of HaNS, a novel wireless wrist-worn chair-speaker Haptic Notification System, which delivers tactile timing alerts to unintrusively aid speakers and session chairs in time-managing oral presentations. Iterative deployment and observation in realistic settings was used to optimize the attentional characteristics and the chair-speaker interaction. HaNS’s use was then observed through four field observations in three settings: two mid-sized academic conferences (55 speakers, 16 session chairs, 50 audience members), five university research seminars (11 speakers, 5 session chairs, 15 audience members), and four large university lectures (23 by 3 instructors). Through observation and self-reports, existing speaker and session chair timing practices and difficulties are documented. Results demonstrate that HaNS can improve a user’s awareness of time; it automatically delivers salient notifications, unintrusively, privately, and remotely. HaNS also facilitates chair-speaker coordination and reduces distraction of speaker and audience through its private communication channel. Eliminating overruns will require improvement in speaker ‘internal’ control, which our results suggest HaNS can also support given practice. This thesis concludes with design guidelines for both conference-deployed and personal timing tools, supported by haptics or other notification modalities.

View record

A first and second longitudinal study of haptic icon learnability: the impact of rhythm and melody (2010)

The design and evaluation of haptic icons -- brief, meaningful tactile stimuli -- have been studied extensively in the research community. Haptic icons are designed to support communication of information through the often-underutilized haptic modality. However, the learnability of haptic icons has not been evaluated in an ecologically plausible, longitudinal deployment scenario. This thesis endeavours to evaluate the learnability of haptic icons in a realistic context. We assign abstract meanings based on a realistic context to a large, previously developed set of rhythmic haptic stimuli. Then, during a period of 12 sessions over 4 weeks, we train users to recognize these icons and observe identification performance under workload using a Tetris game interruption task. Icons are presented to users in sets of 7. Upon the mastery of their current 7 icons, the user graduates to a new set, but must remember previously learned icons. We discover that perceptual discriminability dominates learnability -- the semantics of the icons have very little effect. We also find evidence that design based on multidimensional scaling (MDS) is adequate for developing haptic stimulus sets, but can be quite conservative in its identification performance predictions during deployment. Haptic icon learning is characterized by a peak in difficulty after learning progresses past a single group of 7 icons, which may be explained by cognitive long-term encoding and an increase in perceptual sensitivity. In addition, we present a series of heuristics for designing rhythmic haptic icons, as well as guidelines for haptic icon training and advice for hardware designers.

In an attempt to increase the expressiveness and learnability of rhythmic haptic icons, we explore the addition of melody. We iteratively develop a second set of 30 melodic haptic icons using an MDS methodology. We discover that rhythm dominates user categorization of melodies. This work also results in a set of heuristics for designing melodic icons. Finally, we evaluate the learnability of this new melodic set using our previous longitudinal methodology. Our results indicate that purely rhythmic haptic icons are easier to learn than melodic haptic icons that are grouped by rhythm, and are thus more viable for deployment.
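For readers unfamiliar with the MDS methodology mentioned above, here is a minimal sketch, assuming a symmetric matrix of user-rated pairwise dissimilarities between stimuli; the parameters and function name are illustrative.

    from sklearn.manifold import MDS

    def embed_stimuli(dissim, n_dims=2):
        """Place haptic stimuli in a low-dimensional perceptual space
        from an (n, n) dissimilarity matrix; stimuli that land far
        apart are candidates for a discriminable icon set."""
        mds = MDS(n_components=n_dims, dissimilarity='precomputed',
                  random_state=0)
        return mds.fit_transform(dissim)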

View record

Investigating, designing, and validating a haptic-affect interaction loop using three experimental methods (2010)

Computer interfaces commonly make large demands on our visual and auditory attention, which can make multi-tasking with multiple systems difficult. In cases where a primary task demands constant, unbroken attention from the user, it is often implausible for such a user to employ a system for a secondary task, even when desirable. The haptic modality has been suggested as a conduit for the appropriately-intrusive delivery of information from computer systems. Furthermore, physiological signals can be used to infer the affective state of a user without requiring attention. Combining these underexplored channels for implicit system command, control and display, we envision an automated, intelligent and emotionally aware interaction paradigm. We call this paradigm the Haptic-Affect Loop (HALO).

This work investigates the potential for the HALO paradigm in a specific use case (portable audio consumption). It uses three experimental techniques to gather requirements for the paradigm, validate its technological feasibility, and develop the feedback-supported language of interaction with a HALO-enabled portable audio system.

A focus group is first conducted to identify the perceived utility of the paradigm with a diverse – albeit technologically conservative – group of portable audio users, and to narrow its scope. Results of this focus group indicate that participants are sceptical of its technological feasibility (in particular, context resolution) and are unwilling to relinquish control over their players. This scepticism was alleviated somewhat by the conclusion of the sessions.

Next, technological validation of online affect classification is undertaken via an exploratory, but formally controlled, experiment. Galvanic skin response measures provided a means to make introductory measures of interruption and, in some cases, musical engagement. A richer signal array is necessary to make the full array of required affect identifications for this paradigm, and is under development.

The final phase of work involves an iterative participatory design process with a single participant who was enthusiastic but practical about technology, to better define system requirements and to evaluate input and output mechanisms using a variety of devices and signals. The outcome of this design effort was a functioning prototype, a set of initial system requirements and an exemplar interaction language for HALO.

View record