Janet Werker

Professor

Research Classification

Language Acquisition and Development
Speech and Language Development Disorders
Language and Cognitive Processes
Bilingualism and Multilingualism
Psychology - Biological Aspects

Research Interests

Language Acquisition
Speech Perception
Multisensory Processing
Critical Periods
Plasticity
Psycholinguistics

Research Methodology

Functional Near-Infrared Spectroscopy (fNIRS)
Electroencephalography (EEG) & Event-Related Potential (ERP)
Eye-tracking
Habituation of Looking Time Procedure
Head-turn Preference Procedure (HPP)

Recruitment

Complete these steps before you reach out to a faculty member!

Check requirements
  • Familiarize yourself with program requirements. You want to learn as much as possible from the information available to you before you reach out to a faculty member. Be sure to visit the graduate degree program listing and program-specific websites.
  • Check whether the program requires you to seek commitment from a supervisor prior to submitting an application. For some programs this is an essential step while others match successful applicants with faculty members within the first year of study. This is either indicated in the program profile under "Requirements" or on the program website.
Focus your search
  • Identify faculty members who are conducting research in your specific area of interest.
  • Establish that your research interests align with the faculty member’s research interests.
    • Read up on the faculty members in the program and the research being conducted in the department.
    • Familiarize yourself with their work, read their recent publications and past theses/dissertations that they supervised. Be certain that their research is indeed what you are hoping to study.
Make a good impression
  • Compose an error-free and grammatically correct email addressed to your specifically targeted faculty member, and remember to use their correct titles.
    • Do not send non-specific, mass emails to everyone in the department hoping for a match.
    • Address the faculty members by name. Your contact should be genuine rather than generic.
  • Include a brief outline of your academic background, why you are interested in working with the faculty member, and what experience you could bring to the department. The supervision enquiry form guides you with targeted questions. Be sure to craft compelling answers to these questions.
  • Highlight your achievements and explain why you are a strong candidate. Faculty members receive dozens of requests from prospective students, and you may have less than 30 seconds to pique someone’s interest.
  • Demonstrate that you are familiar with their research:
    • Convey the specific ways you are a good fit for the program.
    • Convey the specific ways the program/lab/faculty member is a good fit for the research you are interested in/already conducting.
  • Be enthusiastic, but don’t overdo it.
Attend an information session

G+PS regularly provides virtual sessions that focus on admission requirements and procedures, with tips on how to improve your application.

 

Doctoral students
Postdoctoral Fellows
Any time / year round
I support public scholarship, e.g. through the Public Scholars Initiative, and am available to supervise students and Postdocs interested in collaborating with external partners as part of their research.
I support experiential learning experiences, such as internships and work placements, for my graduate students and Postdocs.
I am open to hosting Visiting International Research Students (non-degree, up to 12 months).
I am interested in hiring Co-op students for research placements.

Graduate Student Supervision

Doctoral Student Supervision (Jan 2008 - May 2019)
Sex differences in the development of visual processing in infancy (2017)

The first months of life are a sensitive period for the development of visual processing, and face processing in particular. The main goal of this thesis is to examine the influence of infant sex on the development of visual processing. The overarching hypothesis was that 5-month-olds would differ in performance on tasks related to the higher levels of the ventral processing stream, with females showing more advanced ventral visual processing. To begin tracing the developmental trajectory of these differences, another group was tested at 7 to 8 months, after major changes in face processing abilities occur. An exploratory look was taken throughout at two factors that may influence face processing development – the size of the social environment, and locomotion level. In Chapter 2, 5-month-olds were tested on detection of an eye expression change, from smiling to neutral, following infant-controlled habituation. As predicted, females outperformed males in evidencing a novelty preference. In Chapter 3, 7- to 8-month-olds were tested on the same task. For females, a developmental change from novelty to familiarity preference was found. For males no indication of eye expression discrimination at either age was found. In Chapter 4, both age groups were tested on discriminating a featural change in internal features (eyes, nose, mouth). A female advantage was found in 5-month-olds, but disappeared by 7 to 8 months. Chapter 5 replicated the Chapter 2 findings of female superiority in eye expression discrimination at 5 months. Contrary to prediction, females did not show greater mirror image confusion. Laterality effects for both eye expression and mirror image discrimination were found in females, and a negative relation between mirror image and eye expression discrimination was found in males. Finally, effects of the social environment on male face processing and of locomotion level on female face processing were found. 
The results support the hypothesis of a sex difference in the development of ventral stream processing. They inform the fields of visual/face processing development and of sex differences, showing a sex difference in infant development of processing of internal facial features and identifying additional factors involved, and have implications for studies of autism.

Language as a special signal : infants' neurological and social perception of native language, non-native language, and language-like stimuli (2016)

The capacity to acquire language is believed to be deeply embedded in our biology. As such, it has been proposed that humans have evolved to respond specially to language from the first days and months of life. The present thesis explores this hypothesis, examining the early neural and social processing of speech in young infants. In Experiments 1-4, Near-Infrared Spectroscopy is used to measure neural activation in classic “language areas” of the cortex to the native language, to a rhythmically distinct unfamiliar language, and to a non-speech whistled surrogate language in newborn infants (Experiments 1 & 2) as well as in infants at 4 months of age (Experiments 3 & 4). Results revealed that at birth, the brain responds specially to speech: bilateral anterior areas are activated to both familiar and unfamiliar spoken language, but not to the whistled surrogate form. Different patterns were observed in 4-month-old infants, demonstrating how language experience influences the brain response to speech and non-speech signals. Experiments 5-7 then turn to infants’ perception of language as a marker of social group, asking whether infants at 6 and 11 months of age associate speakers of familiar and unfamiliar languages with individuals of different ethnicities. Infants at 11 months—but not at 6 months—are found to look more to Asian versus Caucasian faces when paired with Cantonese versus English language (Experiments 5, 7). However, infants at the same age did not show any difference in looking to Asian versus Caucasian faces when paired with English versus Spanish (Experiment 6). Together, these results suggest that the 11-month-old infants tested have learned a specific association between Asian individuals and Cantonese language. The experiments presented in this thesis thus demonstrate that from early in development, infants are tuned to language.
Such sensitivity is argued to be of critical importance, as it may serve to direct young learners to potential communicative partners.

Visual influences on speech perception in infancy (2016)

The perception of speech involves the integration of both heard and seen signals. Increasing evidence indicates that even young infants are sensitive to the correspondence between these sensory signals, and adding visual information to the auditory speech signal can change infants’ perception. Nonetheless, important questions remain regarding the nature of and limits to early audiovisual speech perception. In the first set of experiments in this thesis, I use a novel eyetracking method to investigate whether English-learning six-, nine-, and 11-month-olds detect content correspondence in auditory and visual information when perceiving non-native speech. Six- and nine-month-olds, prior to and in the midst of perceptual attunement, switch their face-scanning patterns in response to incongruent speech, evidence that infants at these ages detect audiovisual incongruence even in non-native speech. I then probe whether this familiarization, to congruent or incongruent speech, affects infants’ perception such that auditory-only phonetic discrimination of the non-native sounds is changed. I find that familiarization to incongruent speech changes—but does not entirely disrupt—six-month-olds’ auditory discrimination. Nine- and 11-month-olds, in the midst and at the end of perceptual attunement, do not discriminate the non-native sounds regardless of familiarization condition. In the second set of experiments, I test how temporal information and phonetic content information may both contribute to an infant’s use of auditory and visual information in the perception of speech. I familiarize six-month-olds to audiovisual Hindi speech sounds in which the auditory and visual signals of the speech are incongruent in content and, in two conditions, are also temporally asynchronous. 
I hypothesize that, when presented with temporally synchronous, incongruent stimuli, infants rely on either the auditory or the visual information in the signal and use that information to categorize the speech event. Further, I predict that the addition of a temporal offset to this incongruent speech changes infants’ use of the auditory and visual information. Although the main results of this latter study are inconclusive, post-hoc analyses suggest that when visual information is presented first or synchronously with auditory information, as is the case in the environment, infants exhibit a moderate matching preference for auditory information at test.

Sensorimotor influences on speech perception in infancy (2014)

The multisensory nature of speech, and in particular, the modulatory influence of one’s own articulators during speech processing, is well established in adults. However, the origins of the sensorimotor influence on auditory speech perception are largely unknown, and require the examination of a population in which a link between speech perception and speech production is not well-defined; by studying preverbal infant speech perception, such early links can be characterized. Across three experimental chapters, I provide evidence that articulatory information selectively affects the perception of speech sounds in preverbal infants, using both neuroimaging and behavioral measures. In Chapter 2, I use a looking time procedure to show that in 6-month-old infants, articulatory information can impede the perception of a consonant contrast when the related articulator is selectively impaired. In Chapter 3, I use the high-amplitude suck (HAS) procedure to show that neonates are able to discriminate and exhibit memory for the vowels /u/ and /i/; however, the information from the infants’ articulators (a rounded lip shape) seems to only marginally affect behavior during the learning of these vowel sounds. In Chapter 4, I co-register HAS with a neuroimaging technique – Near Infrared Spectroscopy (NIRS) – and identify underlying neural networks in newborn infants that are sensitive to the sensorimotor-auditory match, in that the vowel which matches the lip shape (/u/) is processed differently than the vowel that is not related to the lip shape (/i/). Together, the experiments reported in this dissertation suggest that even before infants gain control over their articulators and speak their first words, their sensorimotor systems are interacting with their perceptual systems as they process auditory speech information.

Bilingualism in infancy : a window on language acquisition (2010)

To rise to the challenge of acquiring their native language, infants must deploy tools to support their learning. This thesis compared infants growing up in two very different language environments, monolingual and bilingual, to better understand these tools and how their development and use changes with the context of language acquisition. The first set of studies − Chapter 2 − showed that infants adapt very early-developing tools to the context of their prenatal experience. Newborns born to bilingual mothers directed their attention to both of their native languages, while monolinguals preferred listening to their single native language. However, prenatal bilingual experience did not result in language confusion, as language discrimination was robustly maintained in both monolinguals and bilinguals. Thus, learning mechanisms allow experience-based listening preferences, while enduring perceptual sensitivities support language discrimination even in challenging language environments. Chapter 3 investigated a fundamental word learning tool: the ability to associate word and object. Monolinguals and bilinguals showed an identical developmental trajectory, suggesting that, unlike some aspects of word learning, this associative ability is equivalent across different types of early language environments. Chapters 4 and 5 explored the development of a heuristic for learning novel words. Disambiguation is the strategy of associating a novel word with a novel object, rather than a familiar one. In Chapter 4, disambiguation was robustly demonstrated by 18-month-old monolinguals, but not by age-matched bilinguals and trilinguals. The results supported the “lexicon structure hypothesis”, that disambiguation develops with mounting evidence for a one-to-one mapping between words and their referents, as is typical for monolinguals. For bilinguals, translation equivalents (cross-language synonyms) represent a departure from one-to-one mapping.
Chapter 5 directly tested the lexicon structure hypothesis, by comparing subgroups of bilinguals who knew few translation equivalents to bilinguals who knew many. Only the former group showed disambiguation, supporting the lexicon structure hypothesis. The series of studies presented in this thesis provides a window into language acquisition across all infants. Whether growing up monolingual or bilingual, infants harmonize their development and use of the tools of language acquisition to the particular challenges mounted by their language environment.

The origins of articulatory-motor influences on speech perception (2010)

Myriad factors influence perceptual processing, but “embodied” approaches assert that sensorimotor information about bodily movements plays an especially critical role. This view has precedent in speech research, where it has often been assumed that the movements of one’s articulators (i.e., the tongue, lips, jaw, etc.) are closely related to perceiving speech. Indeed, previous work has shown that speech perception is influenced by concurrent stimulation of speech motor cortex or by silently making articulatory motions (e.g., mouthing “pa”) when hearing speech sounds. Critics of embodied approaches claim instead that so-called articulatory influences are attributable to other processes (e.g., auditory imagery or feedback from phonological categories), which are also activated when making speech articulations. This dissertation explores the embodied basis of speech perception, and further investigates its ontogenetic development. Chapter 2 reports a study where adults made silent and synchronous speech-like articulations while listening to and identifying speech sounds. Results show that sensorimotor aspects of these movements (i.e., articulatory-motor information) are a robust source of perceptual modulation, independent from auditory imagery or phonological activation. Chapter 3 reports that even low-level, non-speech articulatory-motor information (i.e., holding one’s breath at a particular position in the vocal tract) can exert a subtle influence on adults’ perception of related speech sounds. Chapter 4 investigates the developmental origins of these influences, showing that low-level articulatory information can influence 4.5-month-old infants’ audiovisual speech perception. Specifically, achieving lip-shapes related to /i/ and /u/ vowels (while chewing or sucking, respectively) is shown to disrupt infants’ ability to match auditory speech information about these vowels to visual displays of talking faces.
Together, these chapters show that aspects of speech processing are embodied and follow a pattern of differentiation in development. Before infants produce clear speech, links between low-level articulatory representations and speech perception are already in place. As adults, these links become more specific to sensorimotor information in dynamically coordinated articulations, but vestigial links to low-level articulatory-motor information remain from infancy.

Plasticity in infants' speech perception : a role for attention? (2009)

Phonetic perception becomes native-like by 10 months of age. A potential mechanism of change, distributional learning, affects the perception of 6-8-month-old infants (Maye et al., 2002). However, it was anticipated that perception may be more difficult to change by 10 months of age, after native categories have developed. In fact, some evidence suggests that by this age, the presence of social interaction may be an important element in infants’ phonetic change (Kuhl et al., 2003). The current work advances the hypothesis that infants’ level of attention, which tends to be higher with social interaction, may be a salient factor facilitating phonetic change. Three experiments were designed to test infants’ phonetic plasticity at 10 months, after phonetic categories have formed. A non-social distributional learning paradigm was chosen, and infants’ attention was monitored to probe whether a facilitating role would be revealed. In Experiment 1, 10-month-old English-learning infants heard tokens from a continuum that is no longer discriminated at this age; the tokens formed a distribution suggestive of a category boundary (useful distinction). The results failed to reveal evidence of discrimination, suggesting that the distributional information did not have any effect. A second experiment used slightly different sound tokens, ones that are farther from the typical English pronunciation and are heard less frequently in the language environment. Infants still failed to discriminate the sounds following the learning period. However, a median split revealed that the high attending infants evinced learning. Experiment 3 increased the length of the learning phase to allow all infants to become sufficiently high attending, and revealed phonetic change. Thus, after phonetic categories have formed, attention appears to be important in learning.

Master's Student Supervision (2010 - 2018)
The roots of categorization : 4-month-old infants extract feature correlations to form audio-visual categories (2010)

Is information from vision and audition mutually facilitative to categorization in infants? Ten-month-old infants can detect categories on the basis of correlations of five attributes of visual stimuli; four- and seven-month-olds are sensitive only to the specific attributes, rather than the correlations. If younger infants can detect specific attributes of visual stimuli, is there a way to facilitate the perception of these attributes as a meaningful correlation, and hence, as a category? The current studies investigate whether integrating information from two domains—speech within the auditory system together with shapes in the visual domain—could facilitate categorization. I hypothesized that 4-month-old infants could categorize audio-visual information by pairing correlation-based stimuli in the auditory domain (monosyllables) with correlation-based stimuli in the visual domain (line-drawn animals). In Experiment 1, infants were exposed to a series of line-drawn animals whose features were correlated to form two animal categories. During test, infants experienced three test trials: a novel member of a previously-shown category, a non-member of the categories (that shared similar features), and a completely novel animal. Experiment 2 used the same animals and paradigm, but each animal was presented with a speech stimulus (a repeating monosyllable) whose auditory features were correlated in order to form two categories. In Experiment 3, categorization of the auditory stimuli was investigated in the absence of the correlated visual information. Experiment 4 addressed some potential confounds of the findings from Experiment 2. Results from this series of studies show that 4-month-olds fail to categorize in both visual-only and auditory-only conditions. However, when each visual exemplar is paired with a corresponding, correlated speech exemplar, infants can categorize; they look longer at a new, within-category exemplar than a new, category violator.
These findings provide evidence that infants extract correlated information from two domains, enabling cross-modal categorization at a very young age. Infants’ sensitivity to correlated attributes across two domains and the implications for categorization are discussed.

What is a word : understanding developmental changes in the sounds infants accept as possible labels (2010)

Language is a conventional system: the use of words is shared within a language community. Even further, each language community has conventions regarding what “forms” may serve as words. A form (the phonological sounds or hand movements that make up a word) used in one community may not be proper in another. It is therefore important that when young language learners acquire a language, they adhere to both the general conventionality of language and the word-form conventions of their particular language(s). Previous research has demonstrated a developmental narrowing in the word-forms that infants are willing to accept as conventional labels. Younger word-learning infants view a wider range of symbols as potential labels than do older infants. The present study takes this research further, and specifies the nature of this developmental narrowing. Two potential word-learning constraints are explored: a Linguistic word-learning constraint, in which infants limit the symbols they view as potential labels according to whether the label-form consists of components that occur in at least one of the world’s languages, versus a more restrictive Native Language Assimilation constraint, in which infants limit symbols according to whether the components within the label-forms assimilate into native language speech categories. In addition, this research probed whether the development of such constraints is related to infants’ vocabulary acquisition. In the present study, I explored infants’ ability to learn unassimilable yet linguistic click words as object labels. In Experiment 1, I first established the effectiveness of the novel two-object Referential Switch paradigm, demonstrating that 14-month-old infants succeed in learning unassimilable click words as object labels in this task. In Experiment 2, I then tested 20-month-old infants to investigate the development of a Linguistic versus a Native Language Assimilation constraint.
I found that while 20-month-old infants with smaller vocabularies were able to learn the unassimilable click words as labels, infants with larger vocabularies were not. These results suggest that the narrowing that occurs between 14 and 20 months of age in infants’ awareness of word-form conventions is best explained by the development of a Native Language Assimilation word-learning constraint.
