The auditory-visual integration of anger is impaired in alcoholism: an event-related potentials study
======================================================================================================

Pierre Maurage, Pierre Philippot, Frédéric Joassin, Laurie Pauwels, Thierry Pham, Esther Alonso Prieto, Ernesto Palmero-Soler, Franck Zanow, Salvatore Campanella

## Abstract

**Objectives:** Chronic alcoholism leads to impaired visual and auditory processing of emotions, but the cross-modal (auditory-visual) processing of emotional stimuli has not yet been explored. Our objectives were to describe the electrophysiological correlates of unimodal (visual and auditory) impairments in emotion processing in people suffering from alcoholism, to determine whether this deficit is general or emotion-specific, and to explore potential deterioration of the specific cross-modal integration processes in alcoholism.

**Methods:** We used an emotion-detection task, with recording of event-related potentials (ERPs), in which 15 patients suffering from alcoholism and 15 matched healthy control subjects were asked to detect the emotion (angry, happy or neutral) displayed by auditory, visual or auditory-visual stimuli. Behavioural performance and ERP data recorded between June 2005 and April 2006 were analyzed.

**Results:** ERPs demonstrated that the deficit in alcoholism originates earlier in the cognitive stream than has previously been described (mainly P300), namely, at the level of specific face (N170) and voice (N2) perceptive processing. Moreover, while patients with alcoholism did not show impaired processing of happy and neutral auditory-visual stimuli, they did have a specific impairment in the cross-modal processing of anger. A source location analysis was used to confirm and illustrate these results.

**Conclusion:** These results suggest that the specific deficit that people with alcoholism demonstrate in processing anger stimuli, widely described in clinical situations but not clearly identified in earlier studies (which used unimodal stimuli), is particularly obvious during cross-modal processing, which is more common than unimodal processing in everyday life.

## Introduction

In everyday life, sensory events are not experienced in isolation. Indeed, human beings are constantly confronted with multiple stimuli that are integrated into a unitary perception of the environment. Cross-modal interactions at behavioural and cerebral levels are therefore crucial for daily adaptive behaviours. Nevertheless, because the sensory modalities have usually been explored separately in the fields of psychology and neuroscience, the mechanisms leading to cross-modal integration have only been explored during the last decade.1 At a behavioural level, cross-modal effects are mainly indexed by faster reaction times (RTs) in cross-modal, compared with unimodal, conditions.2 Among the several models proposed to explain this “facilitation effect,” the most validated are the coactivation models,3 which suggest an interaction between modalities (the stimulus in one modality influencing the processing of the other stimulus). Recent neuroimaging results highlight specific integrative processes: electrophysiological studies based on dipole modelling4 and gamma coherence5 showed early parieto-occipital interactions and specific cross-modal gamma-band activities, and functional magnetic resonance imaging (fMRI)6 studies identified brain integration areas such as the superior temporal sulcus and the right insula. Thus, although the locus of cross-modal interaction is still a matter of debate, these results clearly support the interactive coactivation models, and a mechanism linking the products of the unimodal processes has been proposed. In keeping with this suggestion, the cross-modal mechanism is most commonly defined as the construction of a unified and coherent representation on the basis of different stimulations coming from different sensory modalities but concerning the same object or situation.7
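As an aside for readers unfamiliar with the coactivation framework cited above, its behavioural signature is classically tested with Miller's race-model inequality:3 if P(RT_AV ≤ t) ever exceeds P(RT_A ≤ t) + P(RT_V ≤ t), a race between independent unimodal processes cannot explain the facilitation. The following is a minimal Python sketch on simulated RTs (all names and values are illustrative assumptions, not data from the present study):

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Return time points where P(RT_AV <= t) exceeds the race-model
    bound P(RT_A <= t) + P(RT_V <= t) (Miller, 1982)."""
    cdf = lambda rts, t: np.mean(rts[:, None] <= t, axis=0)
    bound = np.minimum(cdf(rt_a, t_grid) + cdf(rt_v, t_grid), 1.0)
    return t_grid[cdf(rt_av, t_grid) > bound]

# Simulated RTs (ms): the bimodal condition is faster than either
# unimodal condition, as in the facilitation effect described above.
rng = np.random.default_rng(0)
rt_a = rng.normal(650, 80, 200)   # auditory
rt_v = rng.normal(560, 80, 200)   # visual
rt_av = rng.normal(500, 70, 200)  # auditory-visual
viol = race_model_violation(rt_a, rt_v, rt_av, np.arange(300, 900, 10))
print("race-model violations at:", viol)  # non-empty -> coactivation
```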
The ubiquity of cross-modal interactions is particularly patent in the field of emotion processing because the perception and production of emotions are always based on several sensory aspects, for example, emotional facial expression (EFE), emotional prosody and postures. Unimodal exploration of emotion is thus often insufficient to comprehend the complexity of emotion processing. A few studies have explored the cross-modal integration of complex emotional stimuli in normal healthy subjects,8,9 and these have mainly demonstrated a particular involvement of the amygdala in cross-modal emotion processing. Moreover, it has been suggested that these cross-modal emotional processes could be impaired in psychopathological populations (particularly in schizophrenia sufferers10). Because chronic alcoholism is a common psychopathological state, it is important to explore the cross-modal processing of emotional information in this population.

### Alcoholism and cross-modal processing of emotions

Chronic alcoholism leads to dysfunction of various social and interpersonal behaviours, notably, in the decoding of emotions. Indeed, it has been shown that alcoholism leads to deficits in the decoding of visual11,12 and auditory13,14 emotional stimuli when presented alone and that this impairment has deleterious effects on social interactions.15 However, it is unclear whether this deficit is maintained, reduced or increased when people with alcoholism are confronted simultaneously with visual and auditory stimuli. With this in mind, we used auditory-visual stimuli to determine to what extent the unimodal emotional deficit described in alcoholism is present in cross-modal situations. Moreover, the intensity of the EFE deficit varies across emotions: alcoholism is mainly associated with an impairment in negative emotion processing, especially for anger. This overestimation of anger11,16 has been associated with aggressive behaviour,17 which could have clinical implications. To further explore this differential impairment, we compared anger stimuli with happy (considered as preserved in alcoholism) and neutral stimuli.

### Alcoholism and ERPs

Chronic alcoholism leads to brain atrophy18 and lesions (particularly in the prefrontal and frontal cortex19). This cerebral deterioration is associated with impaired performance in cognitive and neuropsychological tasks20 but also with decoding deficits in regard to visual11 and auditory13 stimuli at a behavioural level. Nevertheless, it is unclear at which stage of cognitive processing this deficit originates. ERPs monitor brain electrical activity during cognitive tasks with a high temporal resolution, which allows the electrophysiological component representing the onset of a dysfunction to be identified and the impaired cognitive stages to be inferred.21 ERPs have been used for decades with subjects suffering from alcoholism.
Most studies focused on the P3b, a long-lasting positive deflection appearing at parietal sites between 300 and 800 milliseconds after stimulus onset and functionally associated with the closure of cognitive processing before the start of the motor response.22 Alcoholism is associated with a reduced amplitude and a delayed latency of P3b (for a review, see Hansenne23). This P3b impairment was considered to be an electrophysiological marker of earlier behavioural results showing that alcoholism is linked to a deficit in higher cognitive processing, namely, decision, inhibition24 or memory/attention.23 However, other studies have described a deficit in earlier visual ERP components, for example, P100,25 N17026 or N2.27 These deficits for P100 (only observed with basic nonemotional stimuli such as flashes or bursts) and, more importantly, for N170 (respectively linked to early visual processing and specific processing of faces) suggest that the impairment in alcoholism could begin before the decisional level (P3b), namely, at the visuospatial level of cognitive processing.26 On the basis of these findings, we used ERPs to explore the visuospatial processing of emotional stimuli in alcoholism. This procedure allows us to replicate (with visual stimuli) the impairment for P3b but also for earlier components, to explore these deficits in auditory processing, to investigate the ERP correlates of the potential cross-modal deficit, to compare unimodal and cross-modal deficits and to localize the origin of this impairment in the information-processing stream.

### Hypotheses

The present study used an emotion-detection task based on EFE and emotional prosody to explore the following hypotheses:

1. With regard to confirmation of the unimodal deficit, we hypothesized that people with alcoholism would present behavioural (i.e., higher error level and longer RTs) and electrophysiological (i.e., delayed latencies and reduced amplitudes) deficits in unimodal conditions. Because alcoholism is linked to visuospatial impairments, we hypothesized that the deficit would begin at the N170 (visual) and N2 (auditory) stage.
2. With regard to specificity of the emotional deficit, we hypothesized that the deficit would not be identical across emotions (angry, happy and neutral) but, rather, would mainly be present for anger (with a relatively preserved performance for happy and neutral stimuli).
3. This study is the first attempt to explore cross-modal processing in chronic alcoholism, and we hypothesized that alcoholism would lead to a cross-modal deficit. Complementary analysis based on the “subtraction” technique (Teder-Sälejärvi and colleagues28) would isolate cross-modal activations and show whether alcoholism is linked to a specific deterioration of integration processes.

## Methods

### Participants

We recruited 15 inpatients (10 men and 5 women) diagnosed with alcohol dependence according to DSM-IV criteria during the third week of their treatment in a detoxification centre (Les Marronniers Psychiatric Hospital, Tournai, Belgium). They had all abstained from alcohol for at least 2 weeks, were free of medication and of any other psychiatric diagnosis and were all right-handed. The mean alcohol consumption among patients just before detoxification was 15.4 units daily (standard deviation [SD] 4.61), and the mean number of previous detoxification treatments was 4.5 (SD 2.7).
Patients were matched for age, sex and education with a control group of 15 volunteers who had no personal or familial history of psychiatric disorder or drug/substance abuse and whose personal alcohol consumption was lower than 5 units weekly. Exclusion criteria for both groups included major medical problems, central nervous system disease (including epilepsy), visual or auditory impairment and polysubstance abuse. Each participant had normal or corrected-to-normal vision and normal hearing. Education level was assessed according to the number of years of education completed since starting primary school. Patients and control participants were assessed on several psychological control measures to evaluate the presence of comorbid psychopathologies and deficits. The following variables were evaluated by means of validated self-completion questionnaires: state and trait anxiety (State-Trait Anxiety Inventory, Forms A and B29), depression (Beck Depression Inventory-Short Form30), interpersonal problems (Inventory of Interpersonal Problems,31 which evaluates the quantity and quality of social interactions, integration in the family and relationship background) and alexithymia (20-item Toronto Alexithymia Scale32). Participants were provided with full details regarding the aims of the study and the procedure to be followed. After receiving this information, all participants gave their informed consent. The study was approved by the ethical committee of the medical school.

### Task and procedure

We used an emotion-detection task in which participants were confronted with faces and voices presented separately (unimodal conditions) or simultaneously (cross-modal condition). Three categories of faces and voices that varied in terms of emotional content (angry, happy or neutral) were used. The visual stimuli (EFEs) were selected from the standardized set of Ekman and Friesen pictures.33 Four faces (2 men) were chosen, and 3 pictures were used for each category (angry, happy and neutral facial expression) so that there were 12 visual stimuli. The auditory stimuli were audiotapes consisting of the enunciation of a semantically neutral word (“paper”) with an emotional prosody. On the basis of a pilot study34 conducted on 70 participants (mean age 18.74, SD 0.89 y), we selected the 12 auditory stimuli that best expressed the emotions of interest (as shown in Table 1); these included 4 voices (2 men and 2 women) and 3 audiotapes for each (angry, happy and neutral prosody). We also created 12 auditory-visual (cross-modal) stimuli, based on the combination of a visual and an auditory stimulus (congruent for emotion and sex). The study thus comprised 36 stimuli and 9 experimental conditions (3 categories of stimuli × 3 emotions).

[Table 1](http://jpn.ca/content/33/2/111/T1) Table 1: Categorization of emotional stimuli by 70 pilot-study participants (see Maurage et al34)*

Participants were confronted with a total of 15 blocks, each consisting of 60 stimuli, so that the study comprised 900 stimuli (100 per condition). To facilitate the task, only 2 emotions were displayed in each block, and the study contained 5 blocks for each pair of emotions (happy–angry, happy–neutral and angry–neutral), with 60 stimuli (20 visual, 20 auditory and 20 auditory-visual) randomly displayed. The order of the 15 blocks varied across participants.
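As a minimal illustration of this block structure (a sketch only; the helper name and randomization routine are assumptions, not the software used in the study), the 15 blocks could be assembled as follows:

```python
import itertools
import random

EMOTIONS = ["happy", "angry", "neutral"]
MODALITIES = ["visual", "auditory", "auditory-visual"]

def make_blocks(seed=0):
    """Build 15 blocks of 60 trials: 5 blocks per emotion pair,
    each with 20 visual, 20 auditory and 20 auditory-visual trials
    (10 per emotion of the pair), randomly ordered within a block."""
    rng = random.Random(seed)
    blocks = []
    for pair in itertools.combinations(EMOTIONS, 2):   # 3 emotion pairs
        for _ in range(5):                             # 5 blocks per pair
            trials = [{"emotion": emo, "modality": mod}
                      for emo in pair
                      for mod in MODALITIES
                      for _ in range(10)]              # 2*3*10 = 60 trials
            rng.shuffle(trials)
            blocks.append(trials)
    rng.shuffle(blocks)  # block order varies across participants
    return blocks

blocks = make_blocks()
print(len(blocks), len(blocks[0]))  # 15 blocks of 60 trials -> 900 total
```

Each modality–emotion condition ends up with 100 trials (2 pairs × 5 blocks × 10 trials), matching the counts given above.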
During the ERP recordings, participants sat in a dark room on a chair placed at 1 m from the screen with their head restrained in a chin rest. Visual stimuli subtended a visual angle of 3° × 4°. Auditory stimuli were presented via binaural headphones. At the beginning of each trial, a fixation cross was presented for 300 milliseconds, and then the stimulus (face, voice or both) was presented for 700 milliseconds. A black screen was displayed between stimuli for a random duration of between 800 and 1300 milliseconds. From the stimulus onset, participants had 1500 milliseconds to answer. The experimental design is illustrated in Figure 1. At the beginning of each block, participants were told which pair of emotions would be presented in that particular block (for example, happy–angry), and they had to decide as quickly as possible which emotion was displayed by pressing the button corresponding to that emotion with their right forefinger. Response time and error rate were recorded. Participants were told that speed was important, but not at the cost of accuracy. Only correct responses were considered for the analysis of RTs and ERPs.

[Fig. 1](http://jpn.ca/content/33/2/111/F1) Fig. 1: Illustration of the experimental design, with the successive arrival of (1) a fixation cross, (2) the stimulus and (3) an interstimulus black screen. The 3 categories of stimuli are illustrated in an angry–happy block: (**A**) visual, (**B**) auditory and (**C**) cross-modal.

### EEG recording and analysis

The electroencephalogram (EEG) was recorded with the use of 32 electrodes mounted in a Quick-Cap. Electrode positions included the standard 10–20 system locations and intermediate positions. Recordings were taken with a linked mastoid physical reference but were rereferenced according to a common average. The EEG was amplified by battery-operated ANT amplifiers (ANT Software Ltd., Enschede, The Netherlands) with a gain of 30 000 and a band-pass of 0.01–100 Hz. The impedance of all electrodes was kept below 10 kΩ at all times. The EEG was recorded continuously (sampling rate 500 Hz; ANT Eeprobe 3.2.4, ANT Software, 2004), and the vertical electrooculogram (VEOG) was recorded bipolarly from electrodes placed on the supraorbital and infraorbital ridges of the left and right eyes. Trials contaminated by VEOG artifacts (mean of 8%) were manually eliminated offline. Epochs were created starting 200 milliseconds before stimulus onset and lasting for 1300 milliseconds. Data were filtered with a 30-Hz low-pass filter. To compute different averages of ERP target stimuli for each subject individually, 3 parameters were coded for each stimulus: the stimulus type (visual, auditory or auditory-visual), the emotion type (angry, happy or neutral) and the response type (correct or incorrect). For each participant and each component of interest (namely, P100, N170–N2 and P3b), individual peak amplitudes and maximum peak latencies were obtained from several electrodes: Oz, O1, O2, T5 and T6 for P100; T5 and T6 for N170–N2;35 and Pz, P3 and P4 for P3b.22 These values were tested with repeated-measures analysis of variance (ANOVA), and a Greenhouse-Geisser correction was applied when appropriate.
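The following is a minimal sketch of this kind of peak measurement in plain NumPy (the epoch length and sampling rate follow the text; the function name, the placeholder data and any window other than the 300–800 millisecond P3b range are illustrative assumptions):

```python
import numpy as np

FS = 500      # sampling rate (Hz)
T0 = -200.0   # epoch start relative to stimulus onset (ms)

def peak(erp, t_min, t_max, polarity):
    """Peak amplitude and latency of an averaged ERP trace within
    [t_min, t_max] ms; polarity is +1 for P100/P3b, -1 for N170-N2."""
    times = np.arange(erp.size) * 1000.0 / FS + T0
    win = (times >= t_min) & (times <= t_max)
    i = np.argmax(erp[win] * polarity)      # most extreme point in window
    return erp[win][i], times[win][i]

# erp_pz: 650-sample averaged trace at Pz (-200 to 1100 ms) for one
# subject and condition; random placeholder data for illustration.
erp_pz = np.random.default_rng(0).standard_normal(650)
amp, lat = peak(erp_pz, 300, 800, +1)       # P3b window from the text
print(f"P3b peak: {amp:.2f} uV at {lat:.0f} ms")
```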
For each component of interest, 2 × 3 × 3 × 5 (or 2 or 3, according to the number of locations considered for that component) ANOVAs were computed separately for latencies and amplitudes, with group (alcoholism patients and control subjects) as between-factor and emotion (angry, happy, neutral), modality (visual, auditory, auditory-visual) and location (Oz, O1, O2, T5 and T6 for P100; T5 and T6 for N170; Pz, P3 and P4 for P3b) as within-factors.

The statistical analysis of the cross-modal interactions was based on the subtraction of the auditory and visual unimodal conditions from the auditory-visual bimodal condition (AV − [A + V]), a method frequently used to investigate the electrophysiological correlates of cross-modal processes.28,35 As illustrated in Figure 2 (for anger stimuli), this subtraction was first done for each subject individually, for each emotional condition (angry, happy, neutral) and on frontal (F3, Fz, F4), temporal (T5, T6), central (C3, Cz, C4), parietal (P3, Pz, P4) and occipital (O1, Oz, O2) sites. This subtraction was performed only on correct trials. Mean electrophysiological activity resulting from this subtraction was calculated on each electrode for successive 10-millisecond intervals from 0 to 800 milliseconds. We then calculated significant effects at the group level for each interval and each electrode, using Student’s *t* tests (amplitude of the subtraction wave compared with 0). The spatiotemporal patterns that had a significant amplitude (*p* < 0.01) on at least 1 electrode for 3 consecutive intervals were considered significant.36

[Fig. 2](http://jpn.ca/content/33/2/111/F2) Fig. 2: Illustration of the successive stages to obtain the specific integrative waves on Fz for anger among control subjects (above) and alcoholism patients (below). **Left**: First stage represents the averaging of ERP components elicited by the 3 modalities (auditory, visual and auditory-visual). **Middle**: Second stage shows the ERPs elicited by the subtraction (AV − [A + V]). **Right**: Third stage is the illustration of the *t* values for the subtraction (AV − [A + V]). The activities are considered significant at least at *p* < 0.05, when *t* values exceed 2.91 (i.e., when they are above the line). These significant activities index the specific integrative waves associated with cross-modal processing. ERP = event-related potential; A = auditory; AV = auditory-visual; V = visual.
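A minimal sketch of this subtraction analysis in Python (NumPy/SciPy; the array shapes and the helper name are illustrative assumptions, not the authors' code):

```python
import numpy as np
from scipy import stats

FS = 500               # sampling rate (Hz)
BIN = int(FS * 0.010)  # samples per 10-ms interval

def integration_windows(av, a, v, n_bins=80, alpha=0.01, run=3):
    """Find significant cross-modal activity from single-subject averages
    (arrays shaped subjects x electrodes x samples, covering 0-800 ms).
    Builds the AV - (A + V) wave, averages it over 10-ms intervals,
    tests each interval against 0 across subjects, and keeps spans where
    p < alpha on at least 1 electrode for `run` consecutive intervals."""
    diff = av - (a + v)                          # subtraction wave
    n_sub, n_el, _ = diff.shape
    bins = diff[:, :, : n_bins * BIN].reshape(
        n_sub, n_el, n_bins, BIN).mean(-1)       # 10-ms interval means
    _, p = stats.ttest_1samp(bins, 0.0, axis=0)  # (electrodes, bins)
    sig = (p < alpha).any(axis=0)                # significant on any electrode
    starts = [b for b in range(n_bins - run + 1) if sig[b : b + run].all()]
    return [(b * 10, (b + run) * 10) for b in starts]  # windows (ms)
```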
Finally, a source reconstruction was conducted to detect the brain generators of the scalp signal, that is, to compute the intracortical distribution of the primary currents from the surface EEG data. We used ASA software (ASA 2.3, ANT Software Ltd., Enschede, The Netherlands, 2006) and a variation of the standardized low resolution brain electromagnetic tomography algorithm (sLORETA) — the swLORETA method — which has been shown to accurately reconstruct nearby current sources in the presence of noise in simulated data. The main difference between sLORETA and swLORETA is that swLORETA introduces an additional normalization to the lead field matrix that compensates for the variation in sensitivity of the EEG sensors to current sources at differing depths. It is well known that the inverse problem tends to bias the solution toward current sources near the surface of the brain. This is a direct result of trying to minimize the norm of the solution: if 2 different current source density distributions can both produce the same electrical field signals, the one in which the current sources are deeper within the brain will require stronger sources to do so, and the solution with sources closer to the surface will therefore have a smaller norm.37 The normalization of the lead field matrix introduced by swLORETA implies that the matrix is adjusted to provide a uniform sensitivity to voxels regardless of depth.38 Each voxel consists of an x, y and z component, and the lead field matrix has 3 corresponding columns for each voxel that specify the contribution of that voxel to the measured field. The normalization works by computing the eigenvalues of each such 3-column submatrix; these eigenvalues directly determine the sensitivity of the electrical field to the current source at that voxel. The lead field matrix can then be normalized by dividing each triplet of columns by factors chosen so that all of the 3-column submatrices have the same eigenvalues (and thus the same sensitivity). For more specific details, see Palmero-Soler and colleagues.38 For our data, we computed the solution by restricting the points of the grid space to the grey matter, which was defined according to the probabilistic maps from the Montreal Neurological Institute. We took into account 2030 points as possible generators of the EEG, with an interspace of 10 mm.
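A schematic NumPy sketch of this depth normalization (an illustration only; for simplicity it equalizes just the largest singular value of each voxel's 3-column block, whereas swLORETA equalizes all of them; see Palmero-Soler and colleagues38 for the actual method):

```python
import numpy as np

def normalize_lead_field(L):
    """Depth-weight a lead field L (n_sensors x 3*n_voxels) so that each
    voxel's 3-column submatrix has unit largest singular value, giving
    all voxels comparable sensitivity regardless of depth (cf. swLORETA)."""
    L = L.copy()
    n_voxels = L.shape[1] // 3
    for i in range(n_voxels):
        block = L[:, 3 * i : 3 * i + 3]                # x, y, z columns
        s = np.linalg.svd(block, compute_uv=False)[0]  # largest singular value
        L[:, 3 * i : 3 * i + 3] = block / s            # equalize sensitivity
    return L

# Example: a weak "deep" voxel and a strong "superficial" voxel
rng = np.random.default_rng(1)
L = np.hstack([0.1 * rng.standard_normal((32, 3)),   # deep voxel
               5.0 * rng.standard_normal((32, 3))])  # superficial voxel
Ln = normalize_lead_field(L)
print(np.linalg.svd(Ln[:, :3], compute_uv=False)[0],
      np.linalg.svd(Ln[:, 3:], compute_uv=False)[0])  # both 1.0
```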
## Results

### Control measures

As shown in Table 2, alcoholism patients and control subjects were similar in terms of age (*F*1,28 = 0.91, not significant) and education (*F*1,28 = 2.07, not significant). The 2 groups did not differ in the global result of the Inventory of Interpersonal Problems (*F*1,28 = 1.63, not significant), but the score for self-control problems was higher in the alcoholism group (*F*1,28 = 4.82, *p* < 0.05). Significant between-group differences were observed for depression (*F*1,28 = 24.48, *p* < 0.01), trait anxiety (*F*1,28 = 8.83, *p* < 0.01), state anxiety (*F*1,28 = 23.18, *p* < 0.01) and alexithymia (*F*1,28 = 5.45, *p* < 0.05). The between-group difference for alexithymia was only significant for factor I, difficulties in identifying feelings (*F*1,28 = 9.44, *p* < 0.01). Finally, in the alcoholism patients’ group, there was a significant correlation between self-control problems and difficulties in identifying feelings (*r* = 0.687, *p* < 0.01). However, as illustrated in Table 3, the differences between groups are unlikely to have influenced the experimental results because no significant Pearson’s correlations were found between these control measures and any behavioural or electrophysiological data (*p* > 0.05 for each correlation). This lack of influence may be explained by the fact that the scores observed among alcoholism patients were all below the clinical level for depression and anxiety. On the basis of these results, it seems unlikely that our behavioural and ERP data were biased by interfering factors such as depression, anxiety, interpersonal problems or alexithymia.

[Table 2](http://jpn.ca/content/33/2/111/T2) Table 2: Patient and control group characteristics

[Table 3](http://jpn.ca/content/33/2/111/T3) Table 3: Pearson’s correlations (for the 30 subjects) between significant control measures and mean behavioural-electrophysiological data*

### Behavioural data

#### Performance

We carried out a 3 × 3 × 2 ANOVA with emotion (angry, happy and neutral) and stimulation modality (auditory, visual and auditory-visual) as within-factors and group (alcoholism patients, control subjects) as between-factor. As shown in Table 4, there was a main group effect (*F*1,28 = 4.64, *p* < 0.05): the alcoholism patients made more errors than the control subjects. There were also main effects of emotion (*F*2,56 = 6.36, *p* < 0.05) and modality (*F*2,56 = 11.93, *p* < 0.01), with more errors for happy than for angry (*t*29 = 3.09, *p* < 0.01) and neutral (*t*29 = 2.60, *p* < 0.05) stimuli and more errors for visual than for auditory (*t*29 = 3.37, *p* < 0.01) and auditory-visual (*t*29 = 3.21, *p* < 0.01) stimuli. Finally, an interaction was found between group and modality (*F*2,56 = 4.66, *p* < 0.05), wherein alcoholism patients made more errors than control subjects only for visual stimuli (*t*14 = 2.21, *p* < 0.05).

[Table 4](http://jpn.ca/content/33/2/111/T4) Table 4: Behavioural results

#### Reaction times

We computed a 3 × 3 × 2 ANOVA with emotion and modality as within-factors and group as between-factor; the results are shown in Table 4. Three main effects were found. With regard to group (*F*1,28 = 5.91, *p* < 0.05), there were longer RTs in the alcoholism group than in the control group. With regard to emotion (*F*2,56 = 12.15, *p* < 0.01), there were longer RTs for neutral than for happy (*t*29 = 4.99, *p* < 0.001) and angry (*t*29 = 4.76, *p* < 0.001) stimuli. With regard to modality (*F*2,56 = 37.73, *p* < 0.001), there were longer RTs for auditory than for visual (*t*29 = 8.12, *p* < 0.001) and auditory-visual (*t*29 = 10.76, *p* < 0.001) stimuli.
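A minimal sketch of such a mixed-design ANOVA (using the pingouin package as an assumed tool; pingouin's `mixed_anova` handles a single within-subjects factor, so this sketch covers only the group × modality design, on simulated data):

```python
import numpy as np
import pandas as pd
import pingouin as pg  # assumed available

# One row per subject x modality cell, holding that cell's mean RT.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "subject":  np.repeat(np.arange(30), 3),
    "group":    np.repeat(["patient"] * 15 + ["control"] * 15, 3),
    "modality": np.tile(["auditory", "visual", "auditory-visual"], 30),
    "rt":       rng.normal(700, 80, 90),   # placeholder mean RTs (ms)
})
aov = pg.mixed_anova(data=df, dv="rt", within="modality",
                     subject="subject", between="group")
print(aov[["Source", "F", "p-unc"]])
```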
### ERP data

These electrophysiological results are illustrated in Figure 3 and Figure 4. This section presents only the statistically significant results.

[Fig. 3](http://jpn.ca/content/33/2/111/F3) Fig. 3: Electroencephalographic results for alcoholism patients. This figure represents the averaged wave for the 3 emotions (angry, happy and neutral) in each modality (auditory, visual and auditory-visual). The T5 electrode (**left**) shows the P100, N170 and N2 components; the Pz electrode (**middle, above**) shows the P3b component; the Oz electrode (**middle, below**) shows the P100 component. Finally, the T6 electrode (**right**) shows the P100, N170 and N2 components. AV = auditory-visual; V = visual; A = auditory.

[Fig. 4](http://jpn.ca/content/33/2/111/F4) Fig. 4: Electroencephalographic results for control subjects. This figure represents the averaged wave for the 3 emotions (angry, happy and neutral) in each modality (auditory, visual and auditory-visual). The T5 electrode (**left**) shows the P100, N170 and N2 components; the Pz electrode (**middle, above**) shows the P3b component; the Oz electrode (**middle, below**) shows the P100 component. Finally, the T6 electrode (**right**) shows the P100, N170 and N2 components. Note that, as compared with the alcoholism group (Fig. 3), control subjects display higher amplitudes and shorter latencies for the N170–N2 and P3b components. V = visual; A = auditory; AV = auditory-visual.

#### P100

With regard to latencies, the only effect concerned emotion (*F*2,56 = 7.99, *p* < 0.01). In both the alcoholism group and the control group, latencies were globally shorter for happy than for angry (*t*29 = 3.05, *p* < 0.01) and neutral (*t*29 = 3.15, *p* < 0.01) stimuli, regardless of the modality (auditory, visual or auditory-visual). With regard to amplitudes, there were main effects of emotion (*F*2,56 = 26.20, *p* < 0.01) and modality (*F*2,56 = 6.70, *p* < 0.05). P100 amplitudes were higher for happy than for angry (*t*29 = 7.25, *p* < 0.001) or neutral (*t*29 = 7.53, *p* < 0.001) stimuli and lower for auditory stimuli, compared with visual (*t*29 = 3.06, *p* < 0.01) and auditory-visual (*t*29 = 3.67, *p* < 0.01) stimuli.

#### N170–N2

With regard to latencies, main effects were found for group (*F*1,28 = 5.38, *p* < 0.05), emotion (*F*2,56 = 19.71, *p* < 0.001) and modality (*F*2,56 = 84.12, *p* < 0.001). N170–N2 latencies were shorter for control subjects than for alcoholism patients, shorter for happy than for angry (*t*29 = 9.41, *p* < 0.001) and neutral (*t*29 = 3.09, *p* < 0.05) stimuli, and longer for auditory than for visual (*t*29 = 8.73, *p* < 0.01) and auditory-visual (*t*29 = 8.35, *p* < 0.01) stimuli. These effects were mediated by an interaction between group and modality (*F*2,56 = 6.66, *p* < 0.05): alcoholism patients had longer latencies than control subjects only for the auditory stimuli (*t*14 = 3.66, *p* < 0.01). With regard to amplitudes, there were main effects for group (*F*1,28 = 7.80, *p* < 0.05), emotion (*F*2,56 = 13.10, *p* < 0.001) and modality (*F*2,56 = 57.43, *p* < 0.001). N170–N2 amplitudes were higher for control subjects than for alcoholism patients, higher for happy than for angry (*t*29 = 3.72, *p* < 0.01) and neutral (*t*29 = 4.11, *p* < 0.01) stimuli, and lower for auditory than for visual (*t*29 = 7.59, *p* < 0.001) and auditory-visual (*t*29 = 7.56, *p* < 0.001) stimuli. Two interactions mediated these results. In the first, between group and modality (*F*2,56 = 3.71, *p* < 0.05), the N170–N2 amplitude was higher among control subjects only for auditory (*t*14 = 2.57, *p* < 0.05) and auditory-visual (*t*14 = 2.46, *p* < 0.05) stimuli. In the second interaction, between group and emotion (*F*2,56 = 3.32, *p* < 0.05), control subjects had higher N170–N2 amplitudes than alcoholism patients only for happy (*t*14 = 2.69, *p* < 0.05) and neutral (*t*14 = 2.37, *p* < 0.05) stimuli.

#### P3b

With regard to latencies, there were main effects for group (*F*1,28 = 179.49, *p* < 0.001), emotion (*F*2,56 = 12.29, *p* < 0.01) and modality (*F*2,56 = 5.71, *p* < 0.05). P3b latencies were shorter for control subjects than for alcoholism patients, longer for angry than for happy (*t*29 = 6.12, *p* < 0.001) and neutral (*t*29 = 2.52, *p* < 0.05) stimuli, and shorter for visual than for auditory (*t*29 = 2.71, *p* < 0.05) and auditory-visual (*t*29 = 2.21, *p* < 0.05) stimuli. Two interactions were found. The first was between group and modality (*F*2,56 = 15.10, *p* < 0.001). Among the alcoholism patients, P3b latency was shorter for visual than for auditory-visual stimuli (*t*14 = 2.48, *p* < 0.05), whereas in the control group, auditory stimuli had longer latencies than visual (*t*14 = 2.48, *p* < 0.05) and auditory-visual (*t*14 = 2.48, *p* < 0.05) stimuli.
The second interaction was between group and emotion (*F*2,56 = 15.10, *p* < 0.01). In the alcoholism patient group, P3b latency was longer for angry than for happy (*t*14 = 4.87, *p* < 0.001) and neutral (*t*14 = 3.80, *p* < 0.01) stimuli, while among control subjects, P3b latency was shorter for happy than for angry (*t*14 = 3.76, *p* < 0.01) and neutral (*t*14 = 4.23, *p* < 0.01) stimuli. With regard to amplitudes, there were main effects for group (*F*1,28 = 4.83, *p* < 0.05), emotion (*F*2,56 = 6.01, *p* < 0.05) and modality (*F*2,56 = 74.79, *p* < 0.01). P3b amplitudes were higher for control subjects than for alcoholism patients, higher for happy than for angry (*t*29 = 3.25, *p* < 0.01) and neutral (*t*29 = 2.88, *p* < 0.01) stimuli, higher for auditory-visual than for visual (*t*29 = 2.82, *p* < 0.01) and auditory (*t*29 = 10.11, *p* < 0.001) stimuli and higher for visual than for auditory (*t*29 = 7.67, *p* < 0.001) stimuli. There were 2 interactions. The first was between group and modality (*F*2,56 = 3.72, *p* < 0.05). In the alcoholism group, P3b amplitude was higher for auditory-visual than for visual (*t*14 = 2.27, *p* < 0.05) and auditory (*t*14 = 5.71, *p* < 0.05) stimuli and higher for visual than for auditory (*t*14 = 3.52, *p* < 0.05) stimuli. In the control group, auditory stimuli had lower amplitudes than visual (*t*14 = 9.70, *p* < 0.05) and auditory-visual (*t*14 = 9.53, *p* < 0.05) stimuli. The second interaction was between group, modality and emotion (*F*2,56 = 3.93, *p* < 0.05). Control subjects had higher P3b amplitudes than alcoholism patients only for auditory-visual happy (*t*14 = 2.41, *p* < 0.05), visual angry (*t*14 = 2.19, *p* < 0.05) and visual neutral (*t*14 = 2.63, *p* < 0.05) stimuli.

### Cross-modal interactions

Table 5 shows the significant cross-modal activities revealed by the subtraction technique, which indexed the electrophysiological components specifically associated with integrative processing.

[Table 5](http://jpn.ca/content/33/2/111/T5) Table 5: Significant cross-modal activities observed among control (*n* = 15) and alcoholism (*n* = 15) groups in each emotional condition

To explore the between-group differences in cross-modal processing, we computed a group comparison of the subtraction waves obtained in each emotional condition. Significant differences between groups were computed at each electrode by using paired-sample *t* tests (for successive 10-ms intervals from 0 to 800 ms). The significant differences between groups for cross-modal activities are described in Table 6.

[Table 6](http://jpn.ca/content/33/2/111/T6) Table 6: Significant differences between groups for the subtraction waves in each emotional condition
### Source location

To test the anatomic correlates of the results obtained in the electrophysiological data, we computed a source analysis for both groups on the anger and happiness subtraction waves during the 100–150 milliseconds after stimulus onset. For anger as well as for happiness, neural generators were identified in the occipital and temporal regions for both groups. Nevertheless, an additional generator located in the frontal region was observed for the anger stimuli among control subjects but not in the alcoholism group. These results are illustrated in Figure 5.

[Fig. 5](http://jpn.ca/content/33/2/111/F5) Fig. 5: Source reconstruction analysis of the cerebral generators in the control (**left**) and alcoholism (**right**) groups, for happy (**above**) and angry (**below**) subtraction waves during the time period 100–150 milliseconds after stimulus onset. Observe the absence of frontal activation in the alcoholism group as compared with control subjects, but only in the angry condition.

Globally, the most significant results are as follows:

* At the behavioural level (see Table 4), alcoholism patients made more errors in the visual condition and had globally longer RTs in all the conditions.
* At the ERP level (see Fig. 3 and Fig. 4), alcoholism patients had no deficit on P100 but presented an impairment on N170–N2 (in latencies for auditory stimuli and in amplitudes for auditory and cross-modal stimuli) and on P3b (in latencies mainly for cross-modal anger stimuli and in amplitudes for visual and cross-modal stimuli).
* At the level of cross-modal interactions and source location (see Fig. 5, Table 5 and Table 6), alcoholism patients showed impaired integrative processing for angry stimuli, indexed by a reduction in frontal activity.

## Discussion

This study mainly shows that the electrophysiological deficit observed in chronic alcoholism takes its origin at an earlier level than the P300, namely, at specific face (N170) and voice (N2) perceptive processing, and that subjects with alcoholism appear to be particularly deficient in the cross-modal processing of angry stimuli, compared with happy and neutral stimuli. This specific auditory-visual impairment for anger was confirmed by a source location analysis showing reduced frontal activity in alcoholism patients (compared with control subjects) during the processing of cross-modal angry stimuli. The implications of these results will now be discussed.

### Control measures

Subjects with alcoholism were significantly more depressed and anxious than control subjects, which is in line with previous studies,39 but these differences are unlikely to have influenced the experimental results (as shown by the nonsignificant Pearson’s correlations between control measures and data). Interestingly, 2 results confirmed the relational and emotional deficit in alcoholism: alcoholism patients reported more self-control problems than control subjects (which confirms the link between alcoholism, lack of self-control and impulsivity40), and alcoholism was associated with difficulties in identifying feelings (which confirms earlier studies linking alcoholism and alexithymia41 and underlines the difficulties in emotion processing among alcoholism sufferers). Moreover, the significant correlation observed between self-control problems and difficulties in identifying feelings reinforces the hypothesis that interpersonal problems (particularly those linked to aggressiveness) and impaired emotion processing may interact in chronic alcoholism and contribute to a vicious cycle resulting in increased alcohol consumption.

### Behavioural data

Whereas a ceiling effect was observed among control subjects (95% correct responses on average), alcoholism patients made more errors than control subjects, confirming the general impairment in emotion identification that is present in people with alcoholism.
Moreover, this impairment was particularly present for visual stimuli, confirming the EFE decoding deficit in alcoholism.11 Regarding the RTs, participants with alcoholism were globally slower, an expected finding because it has been extensively described that recently detoxified alcoholism patients present a general deficit in motor abilities.42 Moreover, auditory stimuli were associated with slower RTs than visual and auditory-visual stimuli. This was also expected because complex auditory stimuli are classically processed more slowly than visual stimuli.28,35 Finally, a facilitating cross-modal interaction was observed, in that the auditory-visual stimuli were associated with faster RTs than the auditory stimuli, suggesting that visual stimuli speed up the processing of voices. This facilitation effect has already been described among control subjects with simple43 and complex8 emotional stimuli. In summary, the behavioural results confirm the impaired performance of subjects with alcoholism in identifying complex emotional stimuli, suggest that this deficit is greater for visual stimuli than for auditory and auditory-visual stimuli, and show that the cross-modal interaction between emotional faces and voices is associated with a facilitation effect.

### Electrophysiological data

First, there was a global effect of emotion: happy stimuli elicited shorter latencies and higher amplitudes of P100, N170 and P3b than angry and neutral stimuli. This is in line with several earlier studies showing higher amplitudes or shorter latencies44,45 elicited by happy stimuli as compared with angry stimuli. Moreover, these results strongly support the hypothesis of an early modulation of ERPs by emotional content,46 starting as soon as the P100.

### N170 deficit in alcoholism

For the P100, there were no differences between groups, and the only significant effect was a modality effect (auditory stimuli were associated with lower amplitudes than visual and auditory-visual stimuli), which confirms the higher ambiguity of auditory stimuli at the beginning of processing. The more complex processing of voices was confirmed for the N170 (in latency and amplitude), but the main result is that there were significant between-group differences for the N170, with alcoholism patients having longer latencies and lower amplitudes (particularly for happy and neutral stimuli) than control subjects. This result confirms earlier observations26 suggesting that the deficit in EFE decoding in alcoholism starts as early as the visuospatial level, particularly at the N170. Moreover, the same effect was observed for the auditory stimuli, and the deficit was larger for voices than for faces and bimodal stimuli. In conclusion, it seems that the impairment in emotion processing seen in alcoholism starts at the perceptive level, specifically, at the face and voice processing stage (namely, N170 and N2) of the cognitive stream. These data are of considerable importance because the study of N170 has recently generated interesting results in psychopathology (e.g., in schizophrenia47) and because N170–N2 are linked to a crucial social ability — the processing of faces and voices — which could be impaired in alcoholism.11,12

### Generalization of the P3b deficit in alcoholism

The electrophysiological impairment was confirmed at the decision level because the subjects with alcoholism had reduced amplitude and delayed latency for P3b.
Many studies have focused on this component with simple visual and auditory stimuli, and our results are in complete agreement with previous data.48,49 However, the present study generalizes the observation of the P3b deficit in alcoholism to complex emotional visual and auditory stimuli (namely, faces and voices) and provides the first observation of an impairment in P3b for cross-modal stimuli. Moreover, while participants with alcoholism were not specifically impaired for the emotion of anger in earlier ERP components (P100 and N170), it appears that alcoholism leads to longer P3b latencies for angry stimuli, compared with neutral stimuli, whereas in the control group, latencies for angry and neutral stimuli did not differ (happy stimuli leading to shorter latencies in both groups), suggesting a decisional impairment in the processing of anger among people with alcoholism. Finally, the modality effect for amplitude confirms the cross-modal facilitation effect because auditory-visual stimuli were associated with higher amplitudes than auditory and visual stimuli. Indeed, although the simultaneous presentation of auditory and visual information did not lead to a faster P3b latency, it appears that the amount of information extracted from the event (indexed by the amplitude), and thus the intensity of processing, is higher when faces and voices are presented together. This constitutes the electrophysiological marker of the cross-modal facilitation effect, which seems stronger among subjects with alcoholism (as shown by the group–modality interaction effect). Nevertheless, it should be noted that the behavioural results only partially confirmed this electrophysiological correlate of the cross-modal facilitation, showing globally higher performance (compared with visual, but not with auditory, stimuli) and shorter RTs (compared with auditory, but not with visual, stimuli) for cross-modal stimuli. Exploration of the specific cross-modal ERP components will provide further information concerning the integrity of these processes in alcoholism.

### Cross-modal effects: a specific deficit for anger in alcoholism

The central aim of this study was to explore cross-modal processing in control subjects and alcoholism patients. First, control subjects presented several ERP complexes specific to integration processes in each emotional condition. Indeed, an early complex (around 100 milliseconds) with frontocentral positivity and occipitotemporal negativity was present in all emotional conditions. The presence of this complex is in line with previous results showing this anterior positivity–posterior negativity complex with simple28 and nonemotional high-level35 stimuli, and it has been considered to reflect the influence of auditory information on visual processing in the fusiform gyrus. This specific cross-modal wave shows that control participants consistently integrated auditory-visual stimuli. Moreover, for the happy stimuli, this first integration complex was followed by a second (around 150 milliseconds, frontocentral negativity and occipitotemporal positivity) and a third (around 250 milliseconds, frontocentral positivity and occipitotemporal negativity) complex. These complexes are parallel to those described by Joassin and colleagues35 and have been interpreted as reflecting, respectively, the influence of visual information on auditory processing in the associative auditory area and the interaction between unimodal, cross-modal and semantic areas in the frontal gyri.
This confirms the ERP results for the classic components, showing faster and stronger processing of happy stimuli and better auditory-visual integration of these stimuli when compared with angry and neutral stimuli. What is more, a late parietal negative integrative complex was found for each condition. This could be due to the fact that the late parietal activities of the integration wave result from the arithmetical subtraction of 2 P3b components (visual and auditory) from 1 (auditory-visual), explaining this late negativity. The first result of the cross-modal analysis is thus to confirm, for the control participants, the existence of the cross-modal ERP complexes described in earlier studies and to extend this observation to emotional faces and voices, particularly to happy stimuli. Second, and more important, it appears that alcoholism is associated with a global cross-modal deficit: alcoholism patients had fewer, smaller and more diffuse integration activities than control subjects. Indeed, the positive–negative complex (around 100 milliseconds) observed in each condition among control subjects was reduced or absent in the subjects with alcoholism, particularly for the angry stimuli. Concerning the happy stimuli, alcoholism patients seemed to have preserved integrative processing except for the third positive–negative complex (250–300 milliseconds), which was absent in this group. To statistically confirm these differences, we conducted a group comparison of the subtraction waves. This comparison showed, first, that subjects with alcoholism did not have significant impairment for neutral stimuli (except for a reduction in the late parietal negativity). Cross-modal processing would, therefore, appear to be preserved for nonemotional (neutral) stimuli. Second, for the happy stimuli, alcoholism patients had preserved cross-modal processing, except for the third positive–negative complex (250–300 milliseconds). Because this third complex has been interpreted as reflecting the interaction between unimodal, cross-modal and semantic areas in the frontal gyri,35,50 this result confirms earlier results showing impaired semantic processing51 and frontal lobe dysfunction19 in people with alcoholism. Third, the differences between groups were particularly present for the angry stimuli, for which alcoholism patients had impaired cross-modal processing at frontal, central, temporal and parietal sites, mainly between 90 and 160 milliseconds. Interestingly, the source reconstruction analysis reinforced these data: as shown in Figure 5, there were no differences between control subjects and alcoholism patients for the happy stimuli in terms of source location during early processing (100–150 milliseconds). On the contrary, for the angry stimuli, subjects with alcoholism had significantly reduced generators, first at the occipital sites, where the activity was lower than for control subjects, and second and more importantly, at the frontal sites, where control subjects had marked activity between 100 and 150 milliseconds, whereas no activity was detected in the alcoholism group. These results clearly support the ERP data at the anatomic level: cross-modal processing is significantly reduced in alcoholism for angry stimuli, and this deficit is particularly salient at frontal and occipital sites around 100–150 milliseconds after stimulus onset.
### Implications and conclusion

Because these deficits are not found with neutral and happy stimuli, we assume that, in alcoholism, early cross-modal processing is impaired specifically for anger. This constitutes the central result of our study and offers an explanation for the hiatus between clinical observations and previous studies exploring emotional decoding deficits in alcoholism. Indeed, many clinical studies52,53 have stressed that people with alcoholism have considerable difficulty managing their anger and correctly reacting to anger from others and that this impairment increases interpersonal problems. However, although some studies have suggested that alcoholism is associated with a relatively specific deficit in anger EFE decoding,11 other studies have not replicated these results,14,54 and this deficit has not been described for other stimuli (notably auditory prosody). This difference between a patent clinical deficit and contrasting experimental results could be explained by the fact that previous studies only used unimodal stimuli (mainly EFE). These stimuli are artificial because, in everyday social interactions, multimodal stimuli, and mainly auditory-visual stimuli, are more common. Using more ecologic stimuli, the present study established, at the electrophysiological level, the specific cross-modal deficit for anger in chronic alcoholism that has been repeatedly described at the clinical level. Because this impairment did not appear at the behavioural level, it could be that people with alcoholism develop alternative strategies in this experimental situation (notably, by focusing on 1 sensory modality) to compensate for this deficit and that these strategies are not efficient in real situations. Further studies are needed to better understand the extent of the deficit. Nevertheless, this study constitutes a first step in the understanding of cross-modal processing in alcoholism. These results may encourage future studies to use more ecologic stimuli (notably, faces and voices) in the exploration of emotional impairments in chronic alcoholism. Moreover, the electrophysiological data were confirmed by the anatomic investigations because the source reconstruction analysis showed reduced activity (occipital and mainly frontal) in alcoholism patients during the early processing of anger. This study thus demonstrates the utility of source reconstruction for determining which cerebral areas are implicated in the cross-modal integration of complex emotional stimuli. Nevertheless, these results should be complemented by the application of electrophysiological (particularly gamma coherence) and neuroanatomical (mainly functional connectivity) techniques to the exploration of cross-modal processing in chronic alcoholism.

## Acknowledgements

Dr. Joassin is a Research Associate at the National Fund for Scientific Research (FNRS, Belgium), and Dr. Maurage is a Research Assistant at the FNRS. This study was conducted with the help of the medical team of the unit “Les Pins,” directed by Dr. Jonniaux at the Hospital “Les Marronniers.”

## Footnotes

* Medical subject headings: alcoholism; event-related potentials; facial expression; emotion.
* **Competing interests:** Drs. Alonso Prieto, Zanow and Palmero-Soler are employees of Eemagine.
* **Contributors:** Drs. Maurage, Philippot, Joassin, Pham and Campanella, and Ms. Pauwels designed the study. Dr. Maurage and Ms. Pauwels acquired the data, which Drs. Maurage, Joassin, Alonso Prieto, Palmero-Soler, Zanow and Campanella, and Ms. Pauwels analyzed.
Drs. Maurage and Campanella wrote the article, and Drs. Philippot, Joassin, Pham, Alonso Prieto, Palmero-Soler, Zanow and Campanella, and Ms. Pauwels revised it. All authors gave final approval for the article to be published.

* Received March 5, 2007.
* Revision received June 4, 2007.
* Revision received July 23, 2007.
* Revision received August 27, 2007.
* Accepted September 4, 2007.

## References

1. Calvert GA. Cross-modal processing in the human brain: insights from functional neuroimaging studies. Cereb Cortex 2001;11:1110–23.
2. Fort A, Delpuech C, Pernier J, et al. Early auditory-visual interactions in human cortex during nonredundant target identification. Brain Res Cogn Brain Res 2002;14:20–30.
3. Miller J. Divided attention: evidence for coactivation with redundant signals. Cognit Psychol 1982;14:247–79.
4. Molholm S, Ritter W, Murray MM, et al. Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res Cogn Brain Res 2002;14:115–28.
5. Miltner WH, Braun C, Arnold M, et al. Coherence of gamma-band EEG activity as a basis for associative learning. Nature 1999;397:434–6.
6. Calvert GA, Campbell R, Brammer MJ. Evidence from functional magnetic resonance imaging of cross-modal binding in the human heteromodal cortex. Curr Biol 2000;10:649–57.
7. Driver J, Spence C. Multisensory perception: beyond modularity and convergence. Curr Biol 2000;10:R731–5.
8. Dolan RJ, Morris JS, de Gelder B. Cross-modal binding of fear in voice and face. Proc Natl Acad Sci U S A 2001;98:10006–10.
9. Pourtois G, de Gelder B, Bol A, et al. Perception of facial expressions and voices and of their combination in the human brain. Cortex 2005;41:49–59.
10. Surguladze SA, Calvert GA, Brammer MJ, et al. Auditory-visual speech perception in schizophrenia: an fMRI study. Psychiatry Res 2001;106:1–14.
11. Philippot P, Kornreich C, Blairy S, et al. Alcoholics’ deficits in the decoding of emotional facial expression. Alcohol Clin Exp Res 1999;23:1031–8.
12. Townshend JM, Duka T. Mixed emotions: alcoholics’ impairments in the recognition of specific emotional facial expressions. Neuropsychologia 2003;41:773–82.
13. Monnot M, Nixon S, Lovallo W, et al. Altered emotional perception in alcoholics: deficits in affective prosody comprehension. Alcohol Clin Exp Res 2001;25:362–9.
14. Uekermann J, Daum I, Schlebusch P, et al. Processing of affective stimuli in alcoholism. Cortex 2005;41:189–94.
15. Nixon SJ, Tivis R, Parsons OA. Interpersonal problem-solving in male and female alcoholics. Alcohol Clin Exp Res 1992;16:684–7.
Alcohol use and problem drinking: a cognitive behavioral analysis. In: Kendall PC, Hollon SD, editors. Cognitive-behavioral interventions: theory, research, and practice. New York: Academic Press; 1979. p 139–61. 17. Bushman BJ, Cooper HM. Effects of alcohol on human aggression: An integrative research review. Psychol Bull 1990;107:341–54. [CrossRef](http://jpn.ca/lookup/external-ref?access_num=10.1037/0033-2909.107.3.341&link_type=DOI) [PubMed](http://jpn.ca/lookup/external-ref?access_num=2140902&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) [Web of Science](http://jpn.ca/lookup/external-ref?access_num=A1990DC58200005&link_type=ISI) 18. Pfefferbaum A, Sullivan EV, Mathalon DH, et al. Frontal lobe volume loss observed with magnetic resonance imaging in older chronic alcoholics. Alcohol Clin Exp Res 1997;21:521–9. [CrossRef](http://jpn.ca/lookup/external-ref?access_num=10.1111/j.1530-0277.1997.tb03798.x&link_type=DOI) [PubMed](http://jpn.ca/lookup/external-ref?access_num=9161613&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) [Web of Science](http://jpn.ca/lookup/external-ref?access_num=A1997WZ64500020&link_type=ISI) 19. Moselhy HF, Georgiou G, Kahn A. Frontal lobe changes in alcoholism: a review of the literature. Alcohol Alcohol 2001;36:357–68. [CrossRef](http://jpn.ca/lookup/external-ref?access_num=10.1093/alcalc/36.5.357&link_type=DOI) [PubMed](http://jpn.ca/lookup/external-ref?access_num=11524299&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) [Web of Science](http://jpn.ca/lookup/external-ref?access_num=000171644900001&link_type=ISI) 20. Wegner AJ, Gunthner A, Fahle M. Visual performance and recovery in recently detoxified alcoholics. Alcohol Alcohol 2001;36:171–9. [CrossRef](http://jpn.ca/lookup/external-ref?access_num=10.1093/alcalc/36.2.171&link_type=DOI) [PubMed](http://jpn.ca/lookup/external-ref?access_num=11259215&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) [Web of Science](http://jpn.ca/lookup/external-ref?access_num=000168130800012&link_type=ISI) 21. Campanella S, Philippot P. Insights from ERPs into emotional disorders: An affective neuroscience perspective. Psychol Belg 2006;46: 37–53. 22. Polich J. Clinical application of the P300 event-related brain potential. Phys Med Rehabil Clin N Am 2004;15:133–61. [CrossRef](http://jpn.ca/lookup/external-ref?access_num=10.1016/S1047-9651(03)00109-8&link_type=DOI) [PubMed](http://jpn.ca/lookup/external-ref?access_num=15029903&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) 23. Hansenne M. Event-related brain potentials in psychopathology: Clinical and cognitive perspectives. Psychol Belg 2006;46:5–36. [CrossRef](http://jpn.ca/lookup/external-ref?access_num=10.5334/pb-46-1-2-5&link_type=DOI) 24. Porjesz B, Begleiter H. Alcoholism and human electrophysiology. Alcohol Res Health 2003;27:153–60. [PubMed](http://jpn.ca/lookup/external-ref?access_num=15303626&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) [Web of Science](http://jpn.ca/lookup/external-ref?access_num=000223800500005&link_type=ISI) 25. Ogura C, Miyazato Y. Cognitive dysfunctions of alcohol dependence using event related potentials. Arukoru Kenkyuto Yakubutsu Ison 1991;26:331–40. [PubMed](http://jpn.ca/lookup/external-ref?access_num=1772373&link_type=MED&atom=%2Fjpn%2F33%2F2%2F111.atom) 26. Maurage P, Philippot P, Verbanck P, et al. Is the P300 deficit in alcoholism associated with early visual impairments P100, N170? An oddball paradigm. Clin Neurophysiol 2007;118:633–44. 
27. Kathmann N, Soyka M, Bickel R, et al. ERP changes in alcoholics with and without alcohol psychosis. Biol Psychiatry 1996;39:873–81.
28. Teder-Sälejärvi WA, McDonald JJ, Di Russo F, et al. An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. Brain Res Cogn Brain Res 2002;14:106–14.
29. Spielberger CD, Gorsuch RL, Lushene R, et al. Manual for the State-Trait Anxiety Inventory. 1st ed. Palo Alto (CA): Consulting Psychologists Press; 1983.
30. Beck AT, Steer RA. Beck Depression Inventory manual. 1st ed. San Antonio (TX): Psychological Corporation; 1987.
31. Horowitz LM, Rosenberg SE, Baer BA, et al. Inventory of interpersonal problems: psychometric properties and clinical applications. J Consult Clin Psychol 1988;56:885–92.
32. Bagby RM, Taylor GJ, Parker JD. The twenty-item Toronto Alexithymia Scale–II. Convergent, discriminant, and concurrent validity. J Psychosom Res 1994;38:33–40.
33. Ekman P, Friesen WV. Pictures of facial affect. 1st ed. Palo Alto (CA): Consulting Psychologists Press; 1976.
34. Maurage P, Joassin F, Philippot P, et al. A validated battery of vocal emotional expressions. Neuropsychological Trends. In press.
35. Joassin F, Maurage P, Bruyer R, et al. When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices. Neurosci Lett 2004;369:132–7.
36. Rugg MD, Doyle MC, Wells T. Word and non-word repetition within- and across-modality: an event-related potential study. J Cogn Neurosci 1995;7:209–27.
37. Fuchs M, Wagner M, Kohler T, et al. Linear and nonlinear current density reconstructions. J Clin Neurophysiol 1999;16:267–95.
38. Palmero-Soler E, Dolan K, Hadamschek V, et al. swLORETA: a novel approach to robust source localization and synchronization tomography. Phys Med Biol 2007;52:1783–800.
39. Driessen M, Meier S, Hill A, et al. The course of anxiety, depression and drinking behaviours after completed detoxification in alcoholics with and without comorbid anxiety and depressive disorders. Alcohol Alcohol 2001;36:249–55.
40. Mitchell JM, Fields HL, D'Esposito M, et al. Impulsive responding in alcoholics. Alcohol Clin Exp Res 2005;29:2158–69.
41. Taieb O, Corcos M, Loas G, et al. [Alexithymia and alcohol dependence] [article in French]. Ann Med Interne (Paris) 2002;153:1S51–60.
42. Sullivan EV, Rosenbloom MJ, Pfefferbaum A. Pattern of motor and cognitive deficits in detoxified alcoholic men. Alcohol Clin Exp Res 2000;24:611–21.
43. Calvert GA, Hansen PC, Iversen SD, et al. Detection of auditory-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect. Neuroimage 2001;14:427–38.
44. Esslen M, Pascual-Marqui RD, Hell D, et al. Brain areas and time course of emotional processing. Neuroimage 2004;21:1189–203.
45. Kestenbaum R, Nelson CA. Neural and behavioral correlates of emotion recognition in children and adults. J Exp Child Psychol 1992;54:1–18.
46. Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia 2007;45:174–94.
47. Herrmann MJ, Ellgring H, Fallgatter AJ. Early-stage face processing dysfunction in patients with schizophrenia. Am J Psychiatry 2004;161:915–7.
48. Cohen HL, Wang W, Porjesz B, et al. Auditory P300 in young alcoholics: regional response characteristics. Alcohol Clin Exp Res 1995;19:469–75.
49. Porjesz B, Begleiter H. Human evoked brain potentials and alcohol. Alcohol Clin Exp Res 1981;5:304–17.
50. Poldrack RA, Wagner AD, Prull MW, et al. Functional specialization for semantic and phonological processing in the left inferior prefrontal cortex. Neuroimage 1999;10:15–35.
51. Ceballos NA, Houston RJ, Smith ND, et al. N400 as an index of semantic expectancies: differential effects of alcohol and cocaine dependence. Prog Neuropsychopharmacol Biol Psychiatry 2005;29:936–43.
52. Bartek JK, Lindeman M, Hawks JH. Clinical validation of characteristics of the alcoholic family. Nurs Diagn 1999;10:158–68.
53. Karno MP, Longabaugh R. An examination of how therapist directiveness interacts with patient anger and reactance to predict alcohol use. J Stud Alcohol 2005;66:825–32.
54. Foisy ML, Philippot P, Verbanck P, et al. Emotional facial expression decoding impairment in persons dependent on multiple substances: impact of a history of alcohol dependence. J Stud Alcohol 2005;66:673–81.