Abstract

It is commonly believed that, in right-handed individuals, words and faces are processed by distinct neural systems: one in the left hemisphere (LH) for words and the other in the right hemisphere (RH) for faces. Emerging evidence suggests, however, that hemispheric selectivity for words and for faces may not be independent of each other. One recent account suggests that words become lateralized to the LH to interact more effectively with language regions, and subsequently, as a result of competition with words for representational space, faces become lateralized to the RH. On this interactive account, left-handed individuals, who as a group show greater variability with respect to hemispheric language dominance, might be expected to show greater variability in their degree of RH lateralization of faces as well. The current study uses behavioral measures and ERPs to compare the hemispheric specialization for both words and faces in right- and left-handed adult individuals. Although both right- and left-handed groups demonstrated LH over RH superiority in discrimination accuracy for words, only the right-handed group demonstrated RH over LH advantage in discrimination accuracy for faces. Consistent with this, increased right-handedness was related to an increase in RH superiority for face processing, as measured by the strength of the N170 ERP component. Interestingly, the degree of RH behavioral superiority for face processing and the amplitude of the RH N170 for faces could be predicted by the magnitude of the N170 ERP response to words in the LH. These results are discussed in terms of a theoretical account in which the typical RH face lateralization fails to emerge in individuals with atypical language lateralization because of weakened competition from the LH representation of words.

INTRODUCTION

Behavioral, neuropsychological, and neuroimaging investigations in adults have identified a region of the ventral occipito-temporal cortex (vOT) that is selective for faces to a greater degree in the right hemisphere (RH) than in the left hemisphere (LH) and, conversely, a region that is selective for words to a greater degree in the LH than in the RH. However, almost all of the data establishing this complementarity come from right-handed participants, most of whom have LH language dominance. Recent evidence suggests that these patterns of hemispheric selectivity do not emerge independently of each other (Behrmann & Plaut, 2013a, 2013b; Dundas, Plaut, & Behrmann, 2013; Cantlon, Pinel, Dehaene, & Pelphrey, 2011; Dehaene & Cohen, 2011; Plaut & Behrmann, 2011; Dehaene et al., 2010). The key idea is that word selectivity becomes instantiated in the left vOT because of its proximity to (hence greater connectivity with) language regions. Consequently, as a result of competition for representational space over the course of development, face representations become instantiated more strongly in the right vOT. On this account, one might predict a reduction in the RH bias for face processing when language representations (hence word processing) are less robust in the LH, such as in individuals who are ambidextrous or left-handed. Although some ambidextrous and left-handed individuals continue to evince LH language lateralization, as a group, the lateralization is not as strong as in right-handed individuals (Gonzalez & Goodale, 2009; Knecht, Deppe, et al., 2000; Knecht, Drager, et al., 2000).

The current study tests the prediction outlined above by examining the hemispheric specialization for words and for faces in a sample of individuals with differing handedness. Although there is some controversy in the literature on the best way to define the trait of handedness, with some studies using, for example, measures of grip strength (Clerke & Clerke, 2001), the use of self-report inventories assessing hand choice across various manual activities, such as the Edinburgh Handedness Inventory (EHI; Oldfield, 1971), is considered robust. Importantly, for the current purposes, this EHI measure, which is adopted here, is reliably correlated with language dominance (Hunter & Brysbaert, 2008; Knecht, Drager, et al., 2000), and we explore the behavioral and ERP profiles of individuals who vary along this handedness measure.

In addition to testing the prediction concerning the lateralization of face processing, the current study is also relevant in light of the recent call to include left-handed individuals in research (Willems, Van der Haegen, Fisher, & Francks, 2014). Specifically, individuals who are not right-handed (hence oppose the modal pattern) are often excluded from hemispheric studies, and our understanding of cortical organization in such individuals is less established. This study is pertinent, then, to elucidate the full range of diversity in behavior and cortical organization.

Below, we first review the existing literature concerning the lateralization of word and face processing and their possible relationship. We then describe our methods and our empirical findings and, last, evaluate the implications of our findings in light of theories of hemispheric organization.

Lateralization of Words in Right-handers

The dominance of the LH over the RH for visual word processing is well established in right-handed adults (for a review, see Hellige, Laeng, & Michimata, 2010; Grüsser & Landis, 1991) as illustrated by the advantage for identifying orthographic stimuli shown in the right visual field (RVF)/LH over those presented in the left visual field (LVF)/RH. Consistently, studies using ERPs reveal a stronger N170 component in the LH over the RH in response to printed words (for recent examples, see Maurer, Rossion, & McCandliss, 2008; Mercure, Dick, Halit, Kaufman, & Johnson, 2008), and neuroimaging studies have identified a region of the inferior temporal cortex, the Visual Word Form Area (VWFA), that shows greater selectivity for words over other visual stimuli, especially in the LH (Price & Devlin, 2011; Cohen et al., 2000; Puce, Allison, Asgari, Gore, & McCarthy, 1996). Finally, individuals with LH vOT lesions are impaired in word reading (“pure alexia”) to a greater degree than is the case after a lesion to the homologous RH region (e.g., Behrmann & Plaut, 2013a; Kleinschmidt & Cohen, 2006).

One explanation for the emergence of this hemispheric specialization is that, in right-handers, the left vOT region becomes tuned for processing orthography; by virtue of its location, it is ideally situated to integrate bottom–up visual input and top–down information from the left-lateralized language system (Bouhali et al., 2014; Kherif, Josse, & Price, 2011; Price & Devlin, 2011; Twomey, Kawabata Duncan, Price, & Devlin, 2011; Devlin, Jamison, Gonnerman, & Matthews, 2006). This proposal is supported by developmental studies showing that the left lateralization of word processing is directly tied to students' experience in attaching phonemes to their written graphemes (Shaywitz et al., 2002; Marcel, Katz, & Smith, 1974) and that the LH lateralization can be predicted by overall reading competence (Dundas, Plaut, & Behrmann, 2014). Moreover, brain sensitivity to print begins to emerge when children learn correspondences between speech sounds and letters (Brem et al., 2010).

Lateralization of Faces in Right-handers

In right-handed adults, hemispheric specialization for face processing is the mirror opposite of that for word processing, with superior performance for faces presented in the LVF/RH over those presented in the RVF/LH (Rhodes, 1985; Levy, Heller, Banich, & Burton, 1983b; Heller & Levy, 1981), and ERP studies reveal a stronger face-related N170 component in the RH than in the LH (Rossion, Joyce, Cottrell, & Tarr, 2003; Allison, Puce, Spencer, & McCarthy, 1999; Bentin, Allison, Puce, Perez, & McCarthy, 1996). In addition, neuroimaging studies have identified a face-selective region in the inferior temporal cortex, the fusiform face area (FFA), whose response is greater in the RH than in the LH (Spiridon, Fischl, & Kanwisher, 2006; Yovel & Kanwisher, 2005; Kanwisher, 2000; Kanwisher, McDermott, & Chun, 1997; Sergent, Ohta, & MacDonald, 1992; Sergent & Signoret, 1992). Finally, neuropsychological investigations have observed that a lesion to the right vOT cortex results in prosopagnosia with greater frequency and severity than after a lesion to the left vOT (Behrmann & Plaut, 2013a; Kleinschmidt & Cohen, 2006; Sergent & Poncet, 1990).

The basis for the emergence of the RH superiority for face processing remains unclear. Some proposals have focused on intrinsic properties of the RH that are assumed to make it better suited than the LH for processing faces, such as a predisposition for low spatial frequency visual information (Robertson & Ivry, 2000) or a bias toward categorical or holistic information processing (Farah, 1999; Kosslyn et al., 1989), but one might query how those intrinsic biases emerge in the first place. A recent alternative view claims that the RH face superiority arises from competition for higher-order visual representation with words in the LH (Behrmann & Plaut, 2013a, 2013b; Plaut & Behrmann, 2011). Whereas the left vOT becomes tuned for word recognition because of proximity to language regions (Bouhali et al., 2014), face representations are shifted to the right vOT because of the competition for representational space in the LH. This competitive account is supported by studies showing that the degree of right lateralization for face perception is dependent on reading experience in children and preliterate adults (Dundas et al., 2013, 2014; Cantlon et al., 2011; Dehaene et al., 2010). Moreover, although children aged 7–11 years exhibit the adult hemispheric pattern for words (LH advantage), they show neither a behavioral nor a neural hemispheric superiority for faces (Dundas et al., 2014). Of particular interest, in this same developmental study, the magnitude of the N170 ERP component for faces in the RH was related to the N170 amplitude for words in the LH: the stronger the LH word lateralization, the stronger the RH face lateralization. In addition, the mean gamma-band power in the N170 time range observed in response to faces in the RH for the children was correlated with the gamma-band power observed in response to words in their LH. Along similar lines, Golarai et al. (2007) found, with fMRI, a strong hemispheric asymmetry in FFA volume (RH > LH) in adults but little asymmetry (and overall smaller volumes) in children and adolescents (see also Scherf, Behrmann, Humphreys, & Luna, 2007). Taken together, these findings suggest that the hemispheric organization of face and word recognition does not develop independently and that LH word lateralization may precede and drive later RH face lateralization.

Lateralization of Words and Faces in Left-handed Individuals

The complementary lateralization profiles for word and face recognition are well established in right-handed individuals. The lateralization of language per se is also well established, with approximately 96% of right-handed adults showing LH language dominance. By contrast, there is an increased incidence of bilateral and RH language lateralization among left-handers, compared with right-handers, although most left/mixed handers (75%) still show LH language dominance (Van der Haegen, Cai, & Brysbaert, 2012; Knecht, Deppe, et al., 2000; Knecht, Drager, et al., 2000). Thus, assessing the performance of a group of left-handers, who show more variability in the distribution of their language lateralization, provides an opportunity to examine hemispheric asymmetries for word and face perception (Willems et al., 2014; Van der Haegen et al., 2012).

As noted above, compared with right-handers, rather less is known about left-handers with respect to the lateralization of either word or face selectivity. Some studies have reported that the superiority for discriminating faces in the LVF over the RVF is reduced in left-handed adults (Luh, Redl, & Levy, 1994; Levy, Heller, Banich, & Burton, 1983a; Gilbert & Bakan, 1973), and some have argued that left-handers have more bilateral representation of faces than do right-handers (Willems, Peelen, & Hagoort, 2010; Luh et al., 1994; Heller & Levy, 1981). One recent study found bilateral activation of the FFA when left-handed adults viewed faces (Willems et al., 2010), whereas another found that left-handers showed less right FFA activation to faces than right-handers (Badzakova-Trajkov, Haberling, Roberts, & Corballis, 2010). Finally, a recent study reported that the FFA was atypically lateralized in left-handers (Bukowski, Dricot, Hanseeuw, & Rossion, 2013).

Our Approach: ERP Components of Word and Face Processing

In the current study, we characterize the hemispheric specialization for words and for faces in a large group of adults, and we explore the relationship between the lateralization of these two visual domains as well as their relationship with handedness. In the extreme, if left-handers show reverse lateralization for cognitive/perceptual functioning, as they do for motor functioning, we would expect to see LH dominance for faces and RH dominance for words. However, this total reversal is unlikely to be the case because language is left-lateralized in most left-handers and only approximately 27% of left-handers have RH language dominance (Knecht, Drager, et al., 2000). A more likely outcome, then, is a reduction in LH superiority for word processing in the left-handed group, based on the greater percentage of left-handers than right-handers who do not show LH language lateralization, and a commensurate reduction of RH superiority for face processing. We also adopt a converging analytic approach: Given that handedness is only an indirect proxy for language lateralization, we also examine face lateralization in those individuals whose data indicate a strong versus weak LH bias for word recognition, based on some of the measures we collect.

METHODS

Participants

All participants were monolingual, native English-speaking adults with normal or corrected-to-normal vision. All completed the EHI, a questionnaire that surveys the hand used by an individual for a variety of activities including writing, throwing, striking a match, and opening a box. Possible scores range from 100 (extreme right-handed) to −100 (extreme left-handed). The mean EHI score for the participants was 8.42, with a large standard deviation reflecting the wide distribution of handedness (SD = 82.08). Participants were divided into two groups, with individuals with EHI scores greater than zero classified as “right-handed” and those with scores less than zero classified as “left-handed.” In the right-handed group, there were 24 individuals (15 men, 9 women) whose ages ranged from 18 to 31 years (mean = 23.05 years, SD = 4.07 years), and the mean on the EHI was 87.4 (SD = 17.1). In the left-handed group, there were 24 individuals (13 men, 11 women) whose ages ranged from 19 to 59 years (mean = 26.4 years, SD = 10.05 years), and the mean on the EHI was −69.05 (SD = 35.2). There was no age difference across the two groups (F(1, 48) = 2.1, p = .15). The participants were recruited from the subject pools maintained by Carnegie Mellon University, provided informed consent to participate, and were compensated $25 an hour or given course credit. The protocol was approved by the institutional review board of Carnegie Mellon University.

Stimuli

Twenty-four male and 24 female face images obtained from the Face-Place Database Project (2008, Dr. M. Tarr, wiki.cnbc.cmu.edu/Face_Place) were used in this experiment. All faces were forward-facing with neutral expression (see example in Figure 1A). The faces were cropped to remove hair cues and were presented in grayscale against a black background. Stimuli were 1.5 in. in height and 1 in. in width, yielding visual angles of 4.8° and 3.2°, respectively. On each trial, the faces in a pair were matched on gender to increase the difficulty of discrimination.

Figure 1. 

(A) Examples of a pair of face stimuli and a pair of word stimuli used in the experiment. (B) Procedure and timing of a single trial in which a central face or word is presented and a second face or word (of the same stimulus category) is then briefly shown in either the LVF or RVF for a same/different judgment.

The word stimuli consisted of 48 four-letter words (24 pairs), presented in 35% gray, Arial, 18-point font against a black background. Stimuli were approximately 0.5 in. in height and 1 in. in width, yielding visual angles of 1.6° and 3.2°, respectively. Pairs were matched so that the words differed by one of the interior letters; half of the pairs differed in the second letter, and the other half differed in the third letter (see Figure 1A).

These face and word stimuli have previously been shown to reveal the RVF superiority for words and the LVF superiority for faces and to be matched on difficulty of discrimination (Dundas et al., 2013).

Procedure

The experiment was run on a Dell Dimension 4700 computer using E-Prime software (Psychology Software Tools, Inc., Pittsburgh, PA), and participants sat approximately 24 in. from an Iiyama Vision Master 1415 monitor. Participants viewed a central fixation cross whose duration was jittered between 1500 and 2500 msec. After the offset of the fixation cross, a centrally presented stimulus (word or face) appeared for 750 msec and was followed immediately by a second stimulus of the same type presented for 150 msec in either the LVF or RVF (see Figure 1B). The center of the lateralized stimulus was 5.3° from fixation. Participants were instructed to keep their gaze fixated centrally throughout the experiment and to respond by pressing one of two buttons to indicate whether the second stimulus was identical to the first (same/different judgment). The fixation cross appeared after the button press and indicated the start of the next trial. The presentation of stimuli in the LVF or RVF was randomized per participant, with equiprobable presentation in each field within a block. For each class of stimuli, there were 192 trials, which were split into six mini-blocks to allow participants time to rest between blocks. Each stimulus pair was used only once per block.
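For concreteness, the trial structure just described can be summarized in the short Python sketch below. It is only an illustration of the protocol parameters (the experiment itself was run in E-Prime), and all names in the sketch are placeholders.

```python
import random

# Timing parameters taken from the Procedure section (durations in msec).
TRIAL_TIMING = {
    "fixation_ms": (1500, 2500),    # jittered central fixation cross
    "central_stimulus_ms": 750,     # first, centrally presented word or face
    "probe_ms": 150,                # second stimulus, lateralized to LVF or RVF
    "probe_eccentricity_deg": 5.3,  # center of lateralized stimulus from fixation
}

def next_trial(rng=random):
    """Return the timing and visual field for a single same/different trial."""
    return {
        "fixation_ms": rng.uniform(*TRIAL_TIMING["fixation_ms"]),
        "central_ms": TRIAL_TIMING["central_stimulus_ms"],
        "probe_ms": TRIAL_TIMING["probe_ms"],
        "field": rng.choice(["LVF", "RVF"]),  # equiprobable within a block
    }
```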

EEG Recording

EEG scalp recordings were made from 64 Ag–AgCl sintered electrodes embedded in a fiber Quik-Cap (Charlotte, NC), arranged according to the 10–20 naming system. Ocular artifacts were monitored by four additional electrodes: one above and one below the left eye and one on the outer canthus of each eye. Electrodes were also placed on the right and left mastoids with the left serving as the online reference during data acquisition, and impedances were kept below 10 kΩ. The electrical signal was recorded continuously and amplified with a band-pass filter of 0.01–200 Hz and digitized at a sampling rate of 1000 Hz.

EEG Analysis

The signal was high-pass filtered at 0.1 Hz, low-pass filtered at 30 Hz, and rereferenced to the vertex (Cz) electrode. Trials were rejected if there was an eye blink within the interval from −100 to 300 msec around stimulus onset or if the participant answered incorrectly. Eye blinks were identified by a change in voltage in the subtraction of the eye channels that surpassed 100 μV within a 200-msec sliding window. Epochs were baseline corrected over a 200-msec prestimulus interval. ERP waveforms for each individual were averaged over the included trials, separately for words and for faces.
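The blink-rejection and baseline-correction criteria described above could be implemented along the following lines. This is a minimal NumPy sketch assuming already-epoched data; the array layout, variable names, and function name are assumptions for illustration and do not correspond to the software actually used.

```python
import numpy as np

def reject_blinks_and_baseline(epochs, veog_diff, times,
                               blink_thresh_uv=100.0, win_ms=200):
    """epochs    : (n_trials, n_channels, n_samples) EEG in microvolts
       veog_diff : (n_trials, n_samples) difference of the two vertical eye channels
       times     : (n_samples,) sample times in msec relative to stimulus onset"""
    srate = 1000.0                          # sampling rate reported above (Hz)
    win = int(win_ms * srate / 1000)        # 200-msec sliding window, in samples

    # Search for blinks only within -100 to 300 msec around stimulus onset.
    search = (times >= -100) & (times <= 300)

    keep = np.ones(epochs.shape[0], dtype=bool)
    for trial in range(epochs.shape[0]):
        seg = veog_diff[trial, search]
        # A blink is a peak-to-peak change exceeding 100 microvolts within
        # any 200-msec window of the search interval.
        for start in range(max(1, seg.size - win)):
            if np.ptp(seg[start:start + win]) > blink_thresh_uv:
                keep[trial] = False
                break

    # Baseline-correct every epoch over the 200-msec prestimulus interval.
    baseline = (times >= -200) & (times < 0)
    corrected = epochs - epochs[:, :, baseline].mean(axis=-1, keepdims=True)
    return corrected[keep], keep
```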

To examine hemispheric effects, for each individual participant, the LH electrodes, P7, P5, and PO7, were averaged to create an average ERP waveform, and this was done separately for each stimulus category. The same procedure was undertaken using the corresponding RH electrodes, P8, P6, and PO8. The N170 component was analyzed by taking the mean amplitude in each hemisphere for each individual, between 160 and 180 msec after stimulus onset. As is frequently the case in the N170 literature, we plot negative values of the signal downward in the figures. Note that the ERP signals analyzed are those elicited in response to the presentation of the initial, central stimulus rather than to the "probe" lateralized stimulus to which the behavioral response is made. This allows us to examine hemispheric differences purely in response to the visual encoding of the face/word and in the absence of task demands (which, in themselves, might differentially engage the hemispheres).

As with the behavioral data, we generated difference scores of lateralization by subtracting the mean amplitude of the N170 component in the nonpreferred hemisphere from the amplitude in the preferred hemisphere.1 The difference scores were multiplied by −1, so that positive scores reflected a greater difference in amplitude in the preferred direction (larger N170 for faces in the RH and for words in the LH).
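Expressed as code, the N170 measurement and the signed lateralization score work as follows. This is a sketch only; the electrode indices and variable names are assumptions for illustration, not the analysis scripts used here.

```python
import numpy as np

def n170_lateralization(erp, times, lh_idx, rh_idx, preferred="RH"):
    """erp      : (n_channels, n_samples) participant-average waveform (microvolts)
       lh_idx   : indices of the LH cluster (P7, P5, PO7)
       rh_idx   : indices of the RH cluster (P8, P6, PO8)
       preferred: "RH" for faces, "LH" for words."""
    window = (times >= 160) & (times <= 180)      # N170 window, msec post-onset
    lh = erp[lh_idx][:, window].mean()            # mean over electrodes and time
    rh = erp[rh_idx][:, window].mean()

    # Preferred minus nonpreferred amplitude, multiplied by -1 so that a
    # positive score indicates a larger (more negative) N170 in the preferred
    # hemisphere (RH for faces, LH for words).
    score = -(rh - lh) if preferred == "RH" else -(lh - rh)
    return lh, rh, score
```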

RESULTS

Behavioral Lateralization

First, we investigate the pattern of visual field lateralization using accuracy for words and for faces as the dependent measure. Because of the limited exposure duration and the fact that encoding is data-limited, accuracy rather than RT is the more appropriate dependent measure, but we do analyze the RT data as well.1 A 2 × 2 × 2 (Word/face stimulus × LVF/RVF × Right-/left-hander group) ANOVA did not reveal a significant three-way interaction (F(1, 46) = 0.28, p = .60; see Figure 2A). There was a significant Stimulus × Visual field interaction (F(1, 46) = 24.82, p < .001), as predicted, with higher accuracy for words in the RVF over the LVF (t(47) = 5.45, p < .001) and higher accuracy for faces in the LVF over the RVF (t(33) = 2.5, p = .02). There were no other significant interactions or main effects.

Figure 2. 

(A) Mean accuracy for the right- and left-handed groups for faces and words as a function of visual field. (B) Mean RT for the right- and left-handed groups for faces and words as a function of visual field.

Because of our specific predictions and the Stimulus × Visual field interaction, we performed 2 × 2 (Word/face stimulus, LVF/RVF) ANOVAs on the right- and left-handed group data separately. A significant interaction between Field and Stimulus was observed for the right-handed group (F(1, 23) = 12.57, p = .002), with higher accuracy for words in the RVF over the LVF (t(23) = 4.01, p < .001) and for faces in the LVF over the RVF (t(32) = 2.36, p = .03; see Figure 2A, left). The analysis of the data from the left-handed group revealed a significant interaction as well (F(1, 32) = 12.52, p = .002), with significantly higher accuracy for words in the RVF over the LVF (t(23) = 3.73, p = .001), but no difference in accuracy for faces in the two fields (t(16) = 1.04, p = .31; see Figure 2A, right).

We performed the same analysis using RT for matching words and for faces as a function of visual field as the dependent measure. A 2 × 2 × 2 (Word/face stimulus × LVF/RVF × Right-/left-hander group) ANOVA did not reveal a significant three-way interaction (F(1, 46) = 0.89, p = .35; see Figure 2B). Consistent with the accuracy data, there was a significant Stimulus × Visual field interaction (F(1, 46) = 20.10, p < .001) with shorter latency for words in the RVF over the LVF (t(47) = 5.45, p < .001) and shorter latency for faces in the LVF over the RVF (t(33) = 2.5, p = .02). There were no other significant interactions or main effects.

Because of our specific predictions and the Stimulus × Visual field interaction, we again performed 2 × 2 (Word/face stimulus, LVF/RVF) ANOVAs on the right- and left-handed group data separately. A significant interaction between Field and Stimulus was observed for the right-handed group (F(1, 23) = 15.35, p = .001), with shorter latency for words in the RVF over the LVF (t(23) = 2.70, p = .013) and for faces in the LVF over the RVF (t(32) = 2.48, p = .021; see Figure 2B, left). The data for the left-handed group also revealed a significant interaction (F(1, 32) = 6.03, p = .022). However, these data were not consistent with the accuracy findings and revealed no difference in latency for words in the two fields (t(16) = 1.34, p = .19; see Figure 2B) but a trend toward shorter latency for faces in the LVF over the RVF (t(23) = 2.06, p = .052).

As evident from the above, the right-handed group exhibited the expected superiority effects in both accuracy and RT. The left-handed group, however, yielded somewhat less stable results, with inconsistency between the accuracy and RT results. To ensure that these latter findings did not result from a speed–accuracy trade-off, we performed the same ANOVA for the two groups separately, using inverse efficiency (IE) as the dependent measure. The IE score (expressed in milliseconds) is equal to the mean RT divided by the proportion of correct responses, calculated separately for each condition and each participant. Lower values on this measure indicate better performance (Townsend & Ashby, 1983). In the right-handed group, we replicate the established Stimulus × Visual field interaction (F(1, 23) = 33.7, p < .001), with a significant advantage for words in the RVF over the LVF and the converse for faces (both ps < .05). In the left-handed group, we also observe a Stimulus × Visual field interaction (F(1, 23) = 18.8, p < .001), but a breakdown of this interaction reveals only a significant IE advantage for words in the RVF over the LVF and no significant difference for faces between the two hemifields (p > .05). These findings confirm the absence of a hemifield difference for face processing in the left-handed individuals.
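As a worked example of the IE measure (mean RT on correct trials divided by the proportion of correct responses, per condition and participant), the following pandas sketch assumes a simple trial-level data frame; the column names are illustrative, not those of the actual data files.

```python
import pandas as pd

def inverse_efficiency(trials: pd.DataFrame) -> pd.Series:
    """trials: one row per trial with columns 'subject', 'stimulus' (word/face),
    'field' (LVF/RVF), 'rt' (msec), and 'correct' (0/1). Returns IE in msec per
    subject x stimulus x field; lower values indicate better performance."""
    keys = ["subject", "stimulus", "field"]
    mean_rt = (trials[trials["correct"] == 1]
               .groupby(keys)["rt"].mean())            # mean RT, correct trials only
    prop_correct = trials.groupby(keys)["correct"].mean()
    return mean_rt / prop_correct                      # IE = RT / proportion correct
```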

N170 Lateralization

Next, we examined the pattern of electrophysiological lateralization of the ERP component related to word and face recognition, the N170. Right- and left-handed group grand-averaged waveforms are illustrated in Figure 3A (negative plotted downward). The N170 amplitude as a function of hemisphere (not visual field) and stimulus is shown in Figure 3B, with negative now plotted upward. Using the amplitude of the N170 component as the dependent measure, a 2 × 2 × 2 (Word/face stimulus, RH/LH, Right-/left-handed group) ANOVA did not reveal a significant three-way interaction (F(1, 46) < 0.01, p = .9). There was a significant Stimulus × Hemisphere interaction (F(1, 46) = 16.1, p < .001), with more negative amplitude for words in the LH over the RH (t(47) = 3.97, p < .001) but no difference in amplitude for faces between the two hemispheres (t(47) = 1.44, p = .16). There was also a significant Group × Hemisphere interaction (F(1, 46) = 5.69, p = .022), with right-handers showing no overall difference in amplitude between the two hemispheres (t(23) = 0.62, p = .54) and left-handers revealing a more negative amplitude in the LH over the RH (t(23) = 2.36, p = .03). There was also a main effect of Stimulus type (F(1, 46) = 10.47, p = .002), with a more negative amplitude for faces than for words (faces: M = −4.04, words: M = −2.75).

Figure 3. 

(A) Grand-averaged waveform for faces and words as a function of hemisphere for the right- and left-handed groups. (B) Mean N170 amplitude for the right- and left-handed groups for faces and words as a function of hemisphere. Note that negative is plotted upward on the y axis.

To test our a priori predictions that hemispheric differences in word and face representation vary as a function of handedness, we examined the Stimulus × Hemisphere interaction in a 2 × 2 (Word/face stimulus, RH/LH) ANOVA on the N170 data from the right- and left-handed groups separately. The right-handed group demonstrated a significant two-way interaction between Stimulus and Hemisphere (F(1, 23) = 7.85, p = .01); consistent with the behavioral data, there was a significantly greater negative amplitude for words in the LH over the RH (t(23) = 2.15, p = .04) and a greater negative amplitude for faces in the RH over the LH (t(23) = 2.46, p = .02; see Figure 3B, left). The left-handed group demonstrated a significant two-way interaction between Stimulus and Hemisphere as well (F(1, 23) = 8.24, p = .009); consistent with the behavioral data of the left-handers, there was a significantly greater negative amplitude for words in the LH over the RH (t(23) = 3.43, p = .002) but no difference in amplitude for faces across the two hemispheres (t(23) = −0.23, p = .87; see Figure 3B, right).

To explore the profile further, we also examined the group difference within each stimulus type. For words, there was no significant Group × Hemisphere interaction (p > .05), whereas this group difference was significant for faces (p = .05). This within-stimulus category distinction confirms that the word N170 lateralization is shared across groups but that this is not the case for face lateralization (amplitudes right-handers: RH = −4.823, LH = −3.909; left-handers: RH = −0.3675, LH = −3.747).

To determine whether the Stimulus × Hemisphere interaction was specific to the N170 component, we also conducted a 2 × 2 × 2 (Word/face stimulus, RH/LH, Right-/left-handed group) ANOVA using the amplitude of the P100 as the dependent measure. This ANOVA did not uncover a three-way interaction (F(1, 46) = 2.32, p = .14). There was a significant main effect of Stimulus (F(1, 46) = 35.4, p < .001), with a more positive amplitude for faces than for words, perhaps because faces cover a larger portion of the visual field than do words (faces: M = 4.05, words: M = 2.33). There was also a main effect of Hemisphere (F(1, 46) = 74.6, p < .001), with a more positive amplitude in the RH over the LH (RH: M = 3.69, LH: M = 2.68; perhaps reflecting the bias for faces as well). There was also a Group × Stimulus interaction (F(1, 46) = 7.87, p = .007), with a greater amplitude for faces in the right-handed group than the left-handed group and a greater amplitude for words in the left-handed group over the right-handed group. Although this interaction was significant because of the opposing trends, neither of the pairwise comparisons was significant. There was no Stimulus × Hemisphere interaction, no Hemisphere × Group interaction, and no main effect of Group.

Handedness

Having established that, as a group, left-handers do not demonstrate hemispheric lateralization for faces in accuracy, in IE, or in the N170 ERP component (although there was a marginal effect in RT), we next examined the relationship between handedness, treated as a continuous measure on the EHI (given the large variability within the right- and left-handed groups, especially the latter), and hemispheric lateralization for words and faces, first with the behavioral and then with the electrophysiological data.

To examine these correlations, we created a difference score by subtracting the accuracy in the typically nonpreferred field from the accuracy in the other (preferred) field (words: RVF–LVF, faces: LVF–RVF). A linear regression with the degree of lateralization for accuracy (difference score between two visual fields) as the dependent measure showed no significant effect of Handedness for either word lateralization (r2 = .02, t(1, 46) = −0.14, p = .36) or for face lateralization (r2 = .05, t(1, 46) = 1.52, p = .14).
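The behavioral lateralization score and its regression on the EHI can be sketched as follows, assuming one accuracy value per participant and visual field; the function and variable names are illustrative rather than the actual analysis code.

```python
import numpy as np
from scipy import stats

def accuracy_lateralization(acc_rvf, acc_lvf, stimulus):
    """Per-participant accuracy difference score in the typically preferred
    direction: RVF - LVF for words, LVF - RVF for faces."""
    return (acc_rvf - acc_lvf) if stimulus == "word" else (acc_lvf - acc_rvf)

def regress_on_handedness(ehi, lateralization):
    """Linear regression of a lateralization score on EHI (-100 to 100).
    Returns slope, intercept, r, p, and the standard error of the slope."""
    return stats.linregress(np.asarray(ehi), np.asarray(lateralization))
```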

A regression analysis of the amplitude of the N170 component against Handedness on the EHI also did not reveal a significant effect of Handedness on the lateralization for words (R2 = .001, t(1, 45) = −1.08, p = .29; see Figure 4). There was, however, a significant effect of Handedness on the lateralization for faces (r2 = .08, t(1, 45) = 2.0, p = .056), revealing that the more right-handed a participant, the more lateralized the amplitude of the N170 face component to the RH (see Figure 4). The difference between correlation coefficients was not significant (Z = 0.38, p = .70).

Figure 4. 

Correlation between degree of handedness on the EHI and degree of N170 lateralization for faces [N170 amplitude: (RH − LH)/(RH + LH)*−1] and for words [N170 amplitude: (LH − RH)/(LH + RH)*−1].

Correlation between Face/Word Behavior and N170 Lateralization

To examine the consistency between our measures of hemispheric lateralization for words and for faces, we next examined the correlation between the lateralization of accuracy scores for word and face processing and the hemispheric lateralization of the N170 component. There was no reliable relationship between the RVF accuracy advantage for words and the degree of LH N170 lateralization for words (r2 = .015, p = .41). However, there was a significant relationship between the degree of the LVF accuracy advantage for faces and the degree of RH N170 lateralization for faces (r2 = .18, p = .003; see Figure 5).

Figure 5. 

Correlation between degree of face matching accuracy (LVF–RVF) and degree of N170 face lateralization.

Exploration of the N170 across Hemispheres

Thus far, we have used handedness as a proxy for language dominance and for superiority in orthographic processing (with analyses at the group and at the individual participant level). Although we did not obtain three-way interactions (Group × Hemisphere or Visual field × Stimulus category), we did observe reliable patterns of difference between the two groups. The absence of the three-way interaction, however, might arise from the fact that handedness is not the ideal measure for assessing LH dominance for word processing. To assess this more directly, we here analyze the relationship between the electrophysiological response properties of the two hemispheres, with the idea that the LH N170 amplitude in response to words may serve as a better marker of LH specialization. The question, then, is what the relationship is between this amplitude and the N170 amplitude for face processing.

To address this issue, we used the degree of N170 lateralization for faces as the dependent measure and performed a stepwise multiple regression with the predictive factors of the mean N170 amplitude for words in both hemispheres2 and handedness (to explore whether this factor contributes at all over and above the word N170). This analysis indicated that the degree of RH N170 lateralization for faces was significantly predicted by the mean amplitude of the N170 for words in the LH (r2 = .18, F(1, 45) = 9.48, p = .004) and that handedness did not contribute any unique variance. The relationship between RH N170 lateralization for faces and the N170 for words in the LH was such that the more negative the N170 response to words in the LH (β = −.42, t(1, 45) = −3.08, p = .004), where a more negative value reflects a stronger response, the greater the lateralization of the N170 response to faces (see Table 1). To further illustrate this relationship, we plotted grand-averaged waveforms, for both words and faces, averaged across the 10 participants who had the strongest LH N170 for words and the 10 participants who had the weakest LH N170 for words (see Figure 6). We also performed the same stepwise regression with the degree of N170 lateralization for words as the dependent measure, using handedness and the mean N170 amplitude for faces in both hemispheres as predictive variables. This analysis did not reveal any predictive factors.
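A minimal forward-selection sketch of the kind of stepwise regression reported above is given below, assuming per-participant predictors in a data frame; it is an illustration of the general procedure, not necessarily the exact algorithm of the statistics package used.

```python
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(y, X, alpha_enter=0.05):
    """y: Series of face N170 lateralization scores; X: DataFrame of candidate
    predictors (e.g., LH word N170, RH word N170, EHI score). At each step the
    predictor with the smallest p-value below alpha_enter is added."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return sm.OLS(y, sm.add_constant(X[selected])).fit() if selected else None
```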

Table 1. 

Stepwise Multiple Regression for N170 Amplitude Lateralization for Faces

Step 1              B       Std. Error B    β
(Constant)          .155    .189
Left N170 words     −.123   .04             −.421*

*p < .01.

Figure 6. 

(A) Grand-averaged ERP waveforms in response to face and word stimuli in the LH and RH in the participants with the strongest LH N170 for words. (B) Grand-averaged ERP waveforms in response to face and word stimuli in the LH and RH in the participants with the weakest LH N170 for words.

To evaluate further the unidirectional predictive relationship between the LH N170 for words and the lateralization of the N170 for faces, we split the sample at the median value of the LH N170 amplitude for words and performed further analyses on the median-split data. It is important to note that there was no difference in handedness between the two groups (strong LH N170: M = 9.99, SD = 16.8; weak LH N170: M = 8.80, SD = 17.39). We then examined the Stimulus × Hemisphere interaction in a 2 × 2 × 2 (Strong/weak N170 LH words, Word/face stimulus, RH/LH) ANOVA. This analysis revealed, unsurprisingly, a main effect of Group (F(1, 46) = 33.9, p < .001), a marginal effect of Hemisphere (F(1, 46) = 2.88, p = .096), and a Stimulus × Hemisphere interaction (F(1, 46) = 30.3, p < .001). Most important for the current purpose, however, there was a marginally significant three-way interaction (F(1, 46) = 2.89, p = .09).

To explore this interaction further, we conducted a 2 × 2 ANOVA (Word/face stimulus, RH/LH) on the data from the strong LH for word and weak LH for word groups separately. The strong LH word group demonstrated a significant two-way interaction between Stimulus and Hemisphere (F(1, 23) = 14.73, p = .001), with a significantly greater negative amplitude for words in the LH over the RH (t(23) = 3.54, p = .002) and a significantly greater negative amplitude for faces in the RH over the LH (t(23) = 2.09, p = .05). The weak LH word group demonstrated only a trend toward a two-way interaction between Stimulus and Hemisphere (F(1, 23) = 1.79, p = .19), but there was a significantly greater negative amplitude for words in the LH over the RH (t(23) = 2.22, p = .03) and no difference in amplitude for faces across the two hemispheres (t(23) = 0.25, p = .81). This last result complements the previous findings in which handedness defined the groups (or was used as a continuous measure) and shows that a greater N170 amplitude for words in the LH is associated with stronger lateralization of faces to the RH.

DISCUSSION

Considerable evidence from behavioral, neuropsychological, and neuroimaging studies would seem to indicate that there are separate, specialized mechanisms in the adult brain for processing faces (in the RH) and for processing words (in the LH). Most of these studies have been conducted with right-handed individuals, almost all of whom have left-hemisphere language lateralization. Emerging evidence, however, supports the alternative claim that the hemispheric specialization for faces is not entirely independent of the specialization for words and that the latter is, in turn, dependent on the lateralization of language processing. On this account, one would predict that individuals whose language dominance is less strongly represented in the LH would evince reduced RH lateralization for faces as well.

The current study explored the pattern of hemispheric specialization for both word and face processing in a large sample of adults varying in handedness. Previous work has shown that, as a group, fewer left-handers show LH dominance for language than right-handers (Knecht, Drager, et al., 2000). Although the degree of lateralization of the VWFA in left-handers has not been examined extensively, research has shown that it is closely related to language lateralization (Bouhali et al., 2014; Cai, Lavidor, Brysbaert, Paulignan, & Nazir, 2008; Hunter & Brysbaert, 2008), and this suggests that there is a relationship between handedness and hemispheric specialization for words (and we used both of these factors, handedness and hemispheric specialization for words, in our analyses). Counter to the claim that the face processing system develops independently of language-related systems, a few studies have shown a reduction in hemispheric lateralization for faces in left-handed populations (Willems et al., 2010; Luh et al., 1994; Heller & Levy, 1981) as well as reduced neural activation in region FFA for faces (Bukowski et al., 2013).

In the current work, we examine the hemispheric specialization for both word and face processing systems using behavioral and electrophysiological measures. Although we were unable to establish (using, e.g., neuroimaging) whether, for each individual participant, language was lateralized to the LH or RH (and this is more crucial for the left-handers than right-handers), we adopted the assumption, as is true in most other studies of left-handers, that their language lateralization would be more variable than that of right-handers. We also included a full range of right- and left-handed participants (see above for variability on handedness index) to add greater diversity to the hemispheric profiles in our sample. As noted above, although handedness is used as a proxy for hemispheric specialization of language, it is a rather coarse measure, and so we also examined more directly the relationship between the LH ERP measure of word processing (N170) and the lateralization of face processing.

To ensure the validity of our approach, we first replicated the standard finding of hemispheric specialization in right-handed participants (Iaccino, 1993), demonstrating more accurate and faster word processing in the RVF than LVF and, conversely, more accurate and faster face processing in the LVF than the RVF. Consistent with these data, we also found the expected pattern in the lateralization of the N170 amplitude, with the response potential to words being more negative (i.e., stronger) in the LH than RH and the response potential to faces being more negative in the RH than LH.

Having replicated the modal profile in the right-handers, we then explored the profile of left-handers. With respect to behavior, the left-handers, as a group, evinced greater accuracy for word discrimination when the stimuli were presented in the RVF than LVF, but they did not evince robust lateralization of face processing as evident from examining the analyses using accuracy and IE measures (see Figure 2A and B for left-handers). Consistent with the behavioral findings, the left-handed group also demonstrated a more negative ERP N170 component to words in the LH than RH but no difference in N170 for face processing between the two hemispheres.

Having established a difference between the groups of right- and left-handers in lateralization profiles, especially with respect to face processing, we then explored how handedness, as a continuous measure, is associated with the lateralization of word and face processing. Regressing the degree of lateralization of the N170 against a measure of handedness (the EHI; Oldfield, 1971), we observed that the more right-handed the individual, the more RH lateralized the amplitude of the N170 response to faces. As expected, based on the group data, we did not find a relationship between handedness and the degree of lateralization of the amplitude of the N170 response to words per se. A closer examination of the lateralization of the N170 amplitude for faces revealed, however, that the N170 for words was indeed related to that for faces: Greater RH lateralization was correlated with a more negative mean amplitude of the N170 for words in the LH, over and above the contribution of handedness. Moreover, grouping participants based on the LH N170 amplitude for words revealed a group difference in the hemispheric lateralization of faces: This final analysis split the sample based on the amplitude of the LH N170 ERP component, with the prediction that participants with a more negative amplitude should show a stronger RH N170 for faces than those with a less negative amplitude (thus setting handedness aside). This prediction was upheld, suggesting that the strength of the N170 in the LH for words (which serves as a rough measure of word and language processing in the LH) is associated with the lateralization of face processing.

Given that several studies have shown a strong tie between the lateralization of word processing and the lateralization of language (Cai et al., 2008; Hunter & Brysbaert, 2008) and that we found no difference in word lateralization as it relates to handedness, it is likely that most of our sample of left-handed individuals were, in fact, LH dominant for language. This in itself is not a surprising result as studies indicate that only approximately 27% of left-handers show RH language dominance and that these are the most strongly left-handed individuals (Knecht, Drager, et al., 2000). This conclusion notwithstanding, we still see a difference in hemispheric specialization for face representations in our left-handers compared with right-handers. Consistent with reported data, left-handers showed a reduction in hemispheric specialization for faces (Willems et al., 2010; Luh et al., 1994; Heller & Levy, 1981).

Given that we did not have a case-by-case determination of language lateralization, we also used the strength of the word N170 in LH as a measure of language lateralization, and this result was consistent with the handedness measure: The more left-lateralized for word reading, the more right lateralized for face processing.

How can we reconcile the absence of lateralization differences for word processing (in both behavioral accuracy and the N170) between right- and left-handers with the reduction in face lateralization as a participant is increasingly left-handed? One possible explanation is that, although left-handers evince greater accuracy and a more negative N170 for words in the LH than in the RH, this organization may be less coherent than is the case for the right-handers, with the result that face lateralization to the RH is not as marked in the left- as in the right-handed individuals. This possibility is supported by the data showing a unidirectional dependency, with a predictive relationship between the N170 amplitude for words in the LH and RH face lateralization but not vice versa. The unidirectional dependency suggests that the hemispheres are in competition for representation and that, if one set of computations is not firmly instantiated (e.g., those having to do with word perception), neither will its competitive counterpart be (e.g., those having to do with face perception).

The pattern of data we have obtained supports the claim that face and word recognition mechanisms are not independent. These findings are not easily reconcilable with other proposals that focus on fundamental or intrinsic differences between the hemispheres. For example, it has been suggested that the two hemispheres are differentially sensitive to different spatial frequencies (RH tuned to low spatial frequencies and LH tuned to high spatial frequencies [Robertson & Ivry, 2000; Ivry & Robertson, 1998]) or that the hemispheres have a differential predisposition to process inputs by categorical (LH) versus by coordinate (RH) relations (Kosslyn et al., 1989). A further possibility is that the RH mediates more configural or holistic processing, whereas the LH undertakes more analytical processing (see also Farah, 1999, for discussion of a two-stream system, one for faces and one for words). Our data do not fit with these accounts because we did not find a direct relationship between the degrees of lateralization for the two stimulus categories, which would be expected if underlying computational properties of the hemispheres determined lateralization. Moreover, these proposals cannot accommodate the specifics of our data (e.g., no handedness effects on the lateralization of word processing but a significant effect on the lateralization of face perception skills).

A Computational Account of Lateralization for Both Face and Word Processing

The pattern of findings obtained from the behavioral and electrophysiological investigation fits well with a view that postulates that the hemispheric specialization for processing words and processing faces are related (Behrmann & Plaut, 2013b). On this account, because both words and faces place distinctive demands on high-acuity vision, words and faces compete for representational space in both hemispheres, and this competition takes place specifically in that cortical subarea adjacent to regions of retinotopic cortex encoding information from central vision with maximal discriminability (Woodhead, Wise, Sereno, & Leech, 2011; Hasson, Levy, Behrmann, Hendler, & Malach, 2002; Levy, Hasson, Avidan, Hendler, & Malach, 2001), notably the VWFA and the FFA. To minimize connection length (and the opportunity for errors to arise as signal propagation distance increases or interhemispheric engagement is necessary), orthographic representations are further constrained to be proximal to language-related information, which is left-lateralized in most individuals. As a result, words (and, presumably, letters before that) gradually come to rely most heavily on the left fusiform region (VWFA) as an intermediate cortical region bridging between early vision and language. Because of the competition of face representations with word representations, face representations consequently become mostly lateralized to the right fusiform region (FFA).

Plaut and Behrmann (2011) offered support for this view by demonstrating, within the context of a computational simulation, the acquired anatomic localization and the evolving hemispheric specialization of both words and faces. This view is also consistent with developmental data showing that face lateralization emerges after word lateralization and is related to reading ability in children and adolescents (Dundas et al., 2013). Cantlon et al. (2011) have also demonstrated that young children show decreasing responses to faces, but not to other classes of stimuli, in the left fusiform (VWFA) with increasing letter knowledge. In addition, Dehaene et al. (2010) found that the left fusiform response to faces diminished as a product of reading experience in preliterate adults.

Our results fit with this perspective because, although we did not find a clear relationship between handedness per se and the lateralization of the N170 response to words, we found that a greater (more negative) N170 response to words in the LH was related to greater RH lateralization of the N170 response to faces. Again, we know that, for faces and objects, a more negative N170 response reflects greater visual expertise within a given domain (Rossion, Curran, & Gauthier, 2002), and the N170 has also been shown to become more negative with increased orthographic expertise (Maurer & McCandliss, 2008). Therefore, these results suggest that, although the more left-handed participants demonstrated hemispheric lateralization for words, the degree of their hemispheric lateralization for faces could be predicted by a more finely tuned perceptual representation (a more negative N170 response) of words in the LH. The interpretation we offer is that many left-handed individuals fail to show lateralization for faces because the LH does not develop a sufficiently robust visual representation of words to compete with faces and thereby drive their lateralization. Further work is clearly needed to characterize the word processing networks in left-handers, including distinguishing between those who have "reversed" lateralization with RH dominance for speech/language (Verma, Van der Haegen, & Brysbaert, 2013) and left-handers with LH language dominance. Spectral analysis of EEG signals and/or charting the development of the system in left-handed children may elucidate hemispheric organization and shed further light on the nonindependent relationship between the hemispheric specialization for words and the hemispheric specialization for faces.

Acknowledgments

This research was supported by a training grant from the National Institutes of Health (B-squared) to E. D. (T32GM081760), a grant from the National Science Foundation to M. B. and D. P. (BCS-1354350), and a grant to M. B. from the NSF Science of Learning Center (SMA-1041755 to the Temporal Dynamics of Learning Center; PI: G. Cottrell). The authors thank Ryan Egan for his assistance with data collection and processing.

Reprint requests should be sent to Marlene Behrmann, Department of Psychology, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213, or via e-mail: Behrmann@cmu.edu.

Notes

1. 

We also analyzed difference scores (RH − LH) for faces and (LH − RH) for words. Last, index scores were created by calculating (LH − RH)/(LH + RH) for words and (RH − LH)/(RH + LH) for faces. The difference score and index score were highly correlated for words (r2 = .748, p < .001) and for faces (r2 = .646, p < .001).

2. 

Two outliers were removed from this regression for having lateralization difference scores greater than 3 SDs from the mean.

REFERENCES

Allison, T., Puce, A., Spencer, D. D., & McCarthy, G. (1999). Electrophysiological studies of human face perception. I: Potentials generated in occipitotemporal cortex by face and non-face stimuli. Cerebral Cortex, 9, 415–430.

Badzakova-Trajkov, G., Haberling, I. S., Roberts, R. P., & Corballis, M. C. (2010). Cerebral asymmetries: Complementary and independent processes. PLoS One, 5, e9682.

Behrmann, M., & Plaut, D. C. (2013a). Bilateral hemispheric processing of words and faces: Evidence from word impairments in prosopagnosia and face impairments in pure alexia. Cerebral Cortex, 24, 1102–1118.

Behrmann, M., & Plaut, D. C. (2013b). Distributed circuits, not circumscribed centers, mediate visual recognition. Trends in Cognitive Sciences, 17, 210–219.

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551–565.

Bouhali, F., Thiebaut de Schotten, M., Pinel, P., Poupon, C., Mangin, J. F., Dehaene, S., et al. (2014). Anatomical connections of the visual word form area. Journal of Neuroscience, 34, 15402–15414.

Brem, S., Bach, S., Kucian, K., Guttorm, T. K., Martin, E., Lyytinen, H., et al. (2010). Brain sensitivity to print emerges when children learn letter–speech sound correspondences. Proceedings of the National Academy of Sciences, U.S.A., 107, 7939–7944.

Bukowski, H., Dricot, L., Hanseeuw, B., & Rossion, B. (2013). Cerebral lateralization of face-sensitive areas in left-handers: Only the FFA does not get it right. Cortex, 49, 2583–2589.

Cai, Q., Lavidor, M., Brysbaert, M., Paulignan, Y., & Nazir, T. A. (2008). Cerebral lateralization of frontal lobe language processes and lateralization of the posterior visual word processing system. Journal of Cognitive Neuroscience, 20, 672–681.

Cantlon, J. F., Pinel, P., Dehaene, S., & Pelphrey, K. A. (2011). Cortical representations of symbols, objects, and faces are pruned back during early childhood. Cerebral Cortex, 21, 191–199.

Clerke, A., & Clerke, J. (2001). A literature review of the effect of handedness on isometric grip strength differences of the left and right hands. American Journal of Occupational Therapy, 55, 206–211.

Cohen, L., Dehaene, S., Naccache, L., Lehericy, S., Dehaene-Lambertz, G., Henaff, M. A., et al. (2000). The visual word form area: Spatial and temporal characterization of an initial stage of reading in normal subjects and posterior split-brain patients. Brain, 123, 291–307.

Dehaene, S., & Cohen, L. (2011). The unique role of the visual word form area in reading. Trends in Cognitive Sciences, 15, 254–262.

Dehaene, S., Pegado, F., Braga, L. W., Ventura, P., Nunes Filho, G., Jobert, A., et al. (2010). How learning to read changes the cortical networks for vision and language. Science, 330, 1359–1364.

Devlin, J., Jamison, H., Gonnerman, L., & Matthews, P. (2006). The role of the posterior fusiform gyrus in reading. Journal of Cognitive Neuroscience, 18, 911–922.

Dundas, E. M., Plaut, D. C., & Behrmann, M. (2013). The joint development of hemispheric lateralization for words and faces. Journal of Experimental Psychology: General, 142, 348–358.

Dundas, E. M., Plaut, D. C., & Behrmann, M. (2014). An ERP investigation of the co-development of hemispheric lateralization of face and word recognition. Neuropsychologia, 61C, 315–323.

Farah, M. J. (1999). The cognitive neuroscience of vision. Oxford: Blackwell Publishing.

Gilbert, C., & Bakan, P. (1973). Visual asymmetry in perception of faces. Neuropsychologia, 11, 355–362.

Golarai, G., Ghahremani, D. G., Whitfield-Gabrieli, S., Reiss, A., Eberhardt, J. L., Gabrieli, J. D., et al. (2007). Differential development of high-level visual cortex correlates with category-specific recognition memory. Nature Neuroscience, 10, 512–522.

Gonzalez, C. L., & Goodale, M. A. (2009). Hand preference for precision grasping predicts language lateralization. Neuropsychologia, 47, 3182–3189.

Grüsser, O. J., & Landis, T. (1991). Visual agnosias and other disturbances of visual perception and cognition. London: Macmillan.

Hasson, U., Levy, I., Behrmann, M., Hendler, T., & Malach, R. (2002). Center-biased representation for characters in the human ventral visual stream. Neuron, 34, 479–490.

Heller, W., & Levy, J. (1981). Perception and expression of emotion in right-handers and left-handers. Neuropsychologia, 19, 263–272.

Hellige, J. B., Laeng, B., & Michimata, C. (2010). Processing asymmetries in the visual system. In K. Hugdahl & R. Westerhausen (Eds.), The two halves of the brain: Information processing in the cerebral hemispheres (pp. 379–415). Cambridge, MA: MIT Press.

Hunter, Z. R., & Brysbaert, M. (2008). Visual half-field experiments are a good measure of cerebral language dominance if used properly: Evidence from fMRI. Neuropsychologia, 46, 316–325.

Iaccino, J. F. (1993). Left brain–right brain differences: Inquiries, evidence, and new approaches. Hillsdale, NJ: Lawrence Erlbaum Associates.

Ivry, R., & Robertson, L. C. (1998). The two sides of perception. Cambridge, MA: MIT Press.

Kanwisher, N. (2000). Domain specificity in face perception. Nature Neuroscience, 3, 759–763.

Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17, 4302–4311.

Kherif, F., Josse, G., & Price, C. J. (2011). Automatic top-down processing explains common left occipito-temporal responses to visual words and objects. Cerebral Cortex, 21, 103–114.

Kleinschmidt, A., & Cohen, L. (2006). The neural bases of prosopagnosia and pure alexia: Recent insights from functional neuroimaging. Current Opinion in Neurology, 19, 386–391.

Knecht, S., Deppe, M., Drager, B., Bobe, L., Lohmann, H., Ringelstein, E., et al. (2000). Language lateralization in healthy right-handers. Brain, 123, 74–81.

Knecht, S., Drager, B., Deppe, M., Bobe, L., Lohmann, H., Floel, A., et al. (2000). Handedness and hemispheric language dominance in healthy humans. Brain, 123, 2512–2518.

Kosslyn, S. M.
,
Koenig
,
O.
,
Barrett
,
A.
,
Cave
,
C. B.
,
Tang
,
J.
, &
Gabrieli
,
J. D. E.
(
1989
).
Evidence for two types of spatial representations: Hemispheric specialization for categorical and coordinate relations.
Journal of Experimental Psychology: Human Perception and Performance
,
15
,
723
735
.
Levy
,
I.
,
Hasson
,
U.
,
Avidan
,
G.
,
Hendler
,
T.
, &
Malach
,
R.
(
2001
).
Center-periphery organization of human object areas.
Nature Neuroscience
,
4
,
533
539
.
Levy
,
J.
,
Heller
,
W.
,
Banich
,
M. T.
, &
Burton
,
L. A.
(
1983a
).
Are variations among right-handed individuals in perceptual asymmetries caused by characteristic arousal differences between hemispheres?
Journal of Experimental Psychology: Human Perception and Performance
,
9
,
329
359
.
Levy
,
J.
,
Heller
,
W.
,
Banich
,
M. T.
, &
Burton
,
L. A.
(
1983b
).
Asymmetry of perception in free viewing of chimeric faces.
Brain and Cognition
,
2
,
404
419
.
Luh
,
K. E.
,
Redl
,
J.
, &
Levy
,
J.
(
1994
).
Left- and right-handers see people differently: Free-vision perceptual asymmetries for chimeric stimuli.
Brain and Cognition
,
25
,
141
160
.
Marcel
,
T.
,
Katz
,
L.
, &
Smith
,
M.
(
1974
).
Laterality and reading proficiency.
Neuropsychologia
,
12
,
131
139
.
Maurer
,
U.
, &
McCandliss
,
B. D.
(
2008
).
The development of visual expertise for words: The contribution of electrophysiology.
In
E. L.
Grigorenko
&
A. J.
Naples
(Eds.),
Single word reading: Cognitive, behavioral and biological perspectives
(pp.
43
64
).
Mahwah, NJ
:
Lawrence Erlbaum Associates
.
Maurer
,
U.
,
Rossion
,
B.
, &
McCandliss
,
B. D.
(
2008
).
Category specificity in early perception: Face and word n170 responses differ in both lateralization and habituation properties.
Frontiers in Human Neuroscience
,
2
,
18
.
Mercure
,
E.
,
Dick
,
F.
,
Halit
,
H.
,
Kaufman
,
J.
, &
Johnson
,
M. H.
(
2008
).
Differential lateralization for words and faces: Category or psychophysics?
Journal of Cognitive Neuroscience
,
20
,
2070
2087
.
Oldfield
,
R. C.
(
1971
).
The assessment and analysis of handedness: The Edinburgh inventory.
Neuropsychologia
,
9
,
97
113
.
Plaut
,
D. C.
, &
Behrmann
,
M.
(
2011
).
Complementary neural representations for faces and words: A computational exploration.
Cognitive Neuropsychology
,
28
,
251
275
.
Price
,
C. J.
, &
Devlin
,
J. T.
(
2011
).
The interactive account of ventral occipitotemporal contributions to reading.
, Trends in Cognitive Sciences,
15
,
246
253
.
Puce
,
A.
,
Allison
,
T.
,
Asgari
,
M.
,
Gore
,
J. C.
, &
McCarthy
,
G.
(
1996
).
Differential sensitivity of human visual cortex to faces, letterstrings, and textures: A functional magnetic resonance imaging study.
The Journal of Neuroscience
,
16
,
5205
5215
.
Rhodes
,
G.
(
1985
).
Lateralized processes in face recognition.
British Journal of Psychology
,
76
,
249
271
.
Robertson
,
L. C.
, &
Ivry
,
R.
(
2000
).
Hemispheric asymmetries: Attention to visual and auditory primitives.
Current Directions in Psychological Science
,
9
,
59
64
.
Rossion
,
B.
,
Curran
,
T.
, &
Gauthier
,
I.
(
2002
).
A defense of the subordinate-level expertise account for the N170 component.
Cognition
,
85
,
189
196
.
Rossion
,
B.
,
Joyce
,
C. A.
,
Cottrell
,
G. W.
, &
Tarr
,
M. J.
(
2003
).
Early lateralization and orientation tuning for face, word, and object processing in the visual cortex.
Neuroimage
,
20
,
1609
1624
.
Scherf
,
K. S.
,
Behrmann
,
M.
,
Humphreys
,
K.
, &
Luna
,
B.
(
2007
).
Visual category-selectivity for faces, places and objects emerges along different developmental trajectories.
Developmental Science
,
10
,
F15
F30
.
Sergent
,
J.
,
Ohta
,
S.
, &
MacDonald
,
B.
(
1992
).
Functional neuroanatomy of face and object processing.
Brain
,
115
,
15
36
.
Sergent
,
J.
, &
Poncet
,
M.
(
1990
).
From covert to overt recognition of faces in a prosopagnosia patient.
Brain
,
113
,
989
1094
.
Sergent
,
J.
, &
Signoret
,
J.-L.
(
1992
).
Functional and anatomical decomposition of face processing: Evidence from prosopagnosia and PET study of normal subjects.
Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences
,
335
,
55
62
.
Shaywitz
,
B. A.
,
Shaywitz
,
S. E.
,
Pugh
,
K. R.
,
Mencl
,
W. E.
,
Fulbright
,
R. K.
,
Skudlarski
,
P.
,
et al
(
2002
).
Disruption of posterior brain systems for reading in children with developmental dyslexia.
Biological Psychiatry
,
52
,
101
110
.
Spiridon
,
M.
,
Fischl
,
B.
, &
Kanwisher
,
N.
(
2006
).
Location and spatial profile of category-specific regions in human extrastriate cortex.
Human Brain Mapping
,
27
,
77
89
.
Townsend
,
J.
, &
Ashby
,
F.
(
1983
).
The stochastic modelling of elementary psychological processes
.
Cambridge
:
Cambridge University Press
.
Twomey
,
T.
,
Kawabata Duncan
,
K. J.
,
Price
,
C. J.
, &
Devlin
,
J. T.
(
2011
).
Top-down modulation of ventral occipito-temporal responses during visual word recognition.
Neuroimage
,
55
,
1242
1251
.
Van der Haegen
,
L.
,
Cai
,
Q.
, &
Brysbaert
,
M.
(
2012
).
Colateralization of Broca's area and the visual word form area in left-handers: fMRI evidence.
Brain and Language
,
122
,
171
178
.
Verma
,
A.
,
Van der Haegen
,
L.
, &
Brysbaert
,
M.
(
2013
).
Symmetry detection in typically and atypically speech lateralized individuals: A visual half-field study.
Neuropsychologia
,
51
,
2611
2619
.
Willems
,
R. M.
,
Peelen
,
M. V.
, &
Hagoort
,
P.
(
2010
).
Cerebral lateralization of face-selective and body-selective visual areas depends on handedness.
Cerebral Cortex
,
20
,
1719
1725
.
Willems
,
R. M.
,
Van der Haegen
,
L.
,
Fisher
,
S. E.
, &
Francks
,
C.
(
2014
).
On the other hand: Including left-handers in cognitive neuroscience and neurogenetics.
Nature Reviews Neuroscience
,
15
,
193
201
.
Woodhead
,
Z. V.
,
Wise
,
R. J.
,
Sereno
,
M.
, &
Leech
,
R.
(
2011
).
Dissociation of sensitivity to spatial frequency in word and face preferential areas of the fusiform gyrus.
Cerebral Cortex
,
21
,
2307
2312
.
Yovel
,
G.
, &
Kanwisher
,
N.
(
2005
).
The neural basis of the behavioral face-inversion effect.
Current Biology
,
15
,
2256
2262
.