A Unitary Account of Conceptual Representations of Animate / Inanimate Categories

In this paper we present an ERP study examining the underlying nature of the semantic representation of animate and inanimate objects. Time-locking ERP signatures to the onset of auditory stimuli, we found topographical similarities in animate and inanimate object processing. Moreover, we found no difference between animates and inanimates in N400 amplitude when mapping a more specific onto a more general representation (visual to auditory stimuli). These findings provide further evidence for the theory of unitary semantic organization, but no support for the feature-based prediction of segregated conceptual organization. Further comparisons of animate vs. inanimate matches and within- vs. between-category mismatches revealed the following results: processing of animate matches elicited more positivity than processing of inanimate matches within the N400 time-window, and inanimate mismatches elicited a stronger N400 than did animate mismatches. Based on these findings, we argue that one possible explanation for the different and sometimes contradictory results in the literature regarding the processing and representation of animates and inanimates in the brain could lie in the variability of the items selected within each category, that is, in the homogeneity of the categories.

Theoretical approaches to understanding the conceptual structure of semantic knowledge have mainly been developed by studying the system when normal processing has broken down, for example following traumatic brain injury (Warrington & Shallice, 1984; Caramazza & Mahon, 2003; Moss & Tyler, 2000). Recently, however, methods such as positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) have been applied to advance our understanding of the conceptual organisation of semantic knowledge in healthy participants (Martin, 2001; Martin & Chao, 2001; Martin, 2007). Brain damage leading to selective category deficits was often interpreted as crucial evidence for a discrete conceptual structure of semantic knowledge.
Corresponding author: vkovic@ff.uns.ac.rs

However, these studies lacked consistency in relating brain locations to semantic impairment data. This is why some researchers (Devlin, Gonnerman, Andersen & Seidenberg, 1998; Tyler & Moss, 2000; Tyler et al., 2003a; Tyler et al., 2003b) claimed that semantic knowledge is organised within a unitary, distributed system whereby various patterns of activation correspond to different semantic properties.
A recent study by Kovic, Plunkett & Westermann (2009) provided further evidence for a unitary account of conceptual organisation. Time-locking ERP signatures to the onset of visual stimuli, we found topographical similarities in animate and inanimate object processing. Moreover, when mapping a more general onto a more specific representation (auditory to visual stimuli), we found no difference between animates and inanimates in N400 amplitude either.
However, under the assumption that objects from the same category tend to share more features than objects from different categories, one would expect an enhanced N400 to be elicited by between-category (animate-inanimate) mismatches relative to within-category (animate-animate) mismatches. In fact, a greater N400 amplitude for between-category mismatches in comparison to within-category mismatches has been reported in the context of sentence processing (Kutas & Federmeier, 2001). However, the results of the Kovic et al. (2009) study revealed no systematic differences between within- and between-category mismatches for either animate or inanimate objects. The interpretation of this finding was that these differences could have been masked by the sequence of stimulus presentation, that is, mapping from auditory to visual stimuli. Thus, the twist in stimulus sequence in the present study could provide a way of controlling for the abstractness or generality of a mental representation, whereby auditorily presented labels would be considered more abstract and general (i.e., mental types), and visual stimuli more concrete and specific (i.e., mental tokens). Hence, in the current experiment we built upon our previous results by reversing the order of the auditory and visual presentation of the stimuli and time-locking the ERP signatures to the onset of the auditory stimuli.
According to Holcomb et al. (1999), anterior negativities evoked by visual stimuli reflect the activation of semantic representations of visual features, whereas posterior negativities evoked by auditory stimuli possibly reflect the activation of auditory representations. Thus, besides distinguishing between the signatures elicited by animate and inanimate stimuli and by within- and between-category mismatches, this study will provide insights into previously reported differences in scalp distribution between auditory and visual stimuli. Crucially, this study was expected to reveal a different pattern of results from those reported by Kovic et al. (2009), given that in the previous study participants matched a label to a particular visual object, that is, a more general mental representation to a more specific one, or a mental type to a mental token.

Method
Participants: Fifteen healthy, right-handed, native speakers of English were recruited for this ERP study. They all had normal hearing and normal or corrected-to-normal vision. All of the participants were Oxford University undergraduates, some of whom were given course credit for their participation and some of whom were paid five pounds for taking part in the experiment. None of the participants were excluded from the study.
Stimuli: Visual and auditory stimuli were exactly the same as the ones used in Kovic et al. (2009).
Experimental design: The experimental conditions were the same as described in Kovic et al. (2009), except that the visual stimuli in this study were presented prior to the auditory stimuli.
In this study, the time sequence of each trial was as follows: a fixation cross was displayed at the start of the trial for approximately 1000 ms, followed by a picture which appeared on the screen at the offset of the fixation cross and remained there for approximately 3000 ms. During the picture presentation an auditory stimulus was played, 1500 ms from the onset of the visual stimulus. Approximately 4000 ms from the onset of the trial, and exactly at the offset of the visual stimulus, a question mark appeared in the centre of the screen, after which participants pressed either the 'match' or the 'mismatch' button on the keyboard. The presentation of the next trial automatically followed participants' responses (see Figure 1).
The presentation of the picture-label pairs was randomised for each subject. A jitter of ±200 ms was introduced prior to the presentation of the auditory stimuli, visual stimuli and question mark in order to avoid preparatory movement potentials during the task (Luck, 2005). Preparatory movement potentials are well known to appear as contingent negative variations (CNV), a low-frequency negative wave preceding an expected stimulus (Luck, 2005).
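As a rough illustration, the jitter described above could be implemented as in the following sketch. The exact event onsets and the uniform distribution of the jitter are our assumptions for illustration; the paper does not specify how the jitter was drawn.

```python
import random

# Approximate within-trial event onsets in ms (assumed from the trial description)
BASE_ONSETS_MS = {"picture": 1000, "auditory_label": 2500, "question_mark": 4000}
JITTER_MS = 200  # +/- 200 ms around each nominal onset

def jittered_onsets(rng=random):
    """Shift each event onset by a random amount in [-200, +200] ms so that
    stimulus timing is not fully predictable, avoiding CNV-like preparatory
    potentials building up before an expected stimulus."""
    return {event: onset + rng.uniform(-JITTER_MS, JITTER_MS)
            for event, onset in BASE_ONSETS_MS.items()}
```

Calling `jittered_onsets()` afresh on every trial ensures that no two trials share exactly the same timing.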

Results
EEG signatures in this study were time-locked to the onset of the auditory stimuli. Mean amplitude measurements were extracted from the continuous EEG signal into 20 ms bins for each participant across all of the experimental conditions. Only differences between conditions for which neighbouring 20 ms bins were also significant at the p < .05 level (Eddy, Schmid, & Holcomb, 2006) are reported here.
The data analysis revealed significant differences between the experimental conditions across the central, parietal and occipital regions.Significant differences across the conditions were observed between 340 ms and 480 ms in the parietal-occipital region and between 380 ms and 540 ms in the central region of the scalp.
Mean amplitudes from the sites at the parietal-occipital region (P3, PZ, P4, O1 and O2) were averaged together.
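The binning and electrode-pooling steps described above can be sketched as follows. The function names, the 500 Hz sampling rate, and the plain-list representation of an epoch are our assumptions for illustration; an actual analysis pipeline would use a dedicated EEG toolbox.

```python
def mean_amplitude_bins(samples, sampling_rate_hz=500, bin_ms=20):
    """Collapse one channel's epoch into mean amplitudes per 20 ms bin.
    At 500 Hz, each 20 ms bin spans 10 consecutive samples."""
    per_bin = int(sampling_rate_hz * bin_ms / 1000)
    return [sum(samples[i:i + per_bin]) / per_bin
            for i in range(0, len(samples) - per_bin + 1, per_bin)]

def pool_sites(binned_by_site):
    """Average binned amplitudes across electrodes (e.g. P3, PZ, P4, O1, O2)
    to obtain a single regional value per bin."""
    return [sum(vals) / len(vals) for vals in zip(*binned_by_site)]
```

For example, a 40 ms epoch at 500 Hz yields two 20 ms bins, and pooling the five parietal-occipital sites reduces five binned series to one regional series on which the condition comparisons are run.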

Discussion
The current study investigated differences in the ERP signatures elicited by animate vs. inanimate auditory stimuli, as well as differences between within-category and between-category mismatches, time-locking ERP responses to the onset of the auditory stimuli.
Regarding the topographical distribution of the N400 elicited by auditory stimuli, earlier studies have reported its distribution mainly over the parietal-occipital brain region, due to increased activation of auditory representations (Friederici et al., 1993; Hagoort & Brown, 2000; Holcomb et al., 1999; Kutas & Van Petten, 1994; Van Berkum et al., 1999). In the present study, clear-cut mismatch effects were found in the parietal-occipital region as well, with no topographical differences in the processing of animates and inanimates, suggesting similar or closely located neural generators, contrary to theories proposing a discrete neuroanatomical organisation of semantic representations in the brain (Warrington & Shallice, 1984; Caramazza & Mahon, 2003; Moss & Tyler, 2000).
Given the sensitivity of the N400 component to semantic processing, and in accordance with previous studies (Kutas & Hillyard, 1980; Friedrich & Friederici, 2004; Kutas & Federmeier, 2001), we predicted that the mismatch conditions would give rise to an enhanced negativity in this time-window. As expected, systematic differences in N400 amplitude between ERP responses to picture-label matches and picture-label mismatches were observed from 340 ms in the parietal-occipital scalp region and from 380 ms in the central region, lasting about 140 ms. During this interval, picture-label mismatches elicited more enhanced ERP responses than picture-label matches.
Furthermore, given that animates have more strongly correlated and less distinct features than inanimates (Tyler & Moss, 2001; Tyler et al., 2003a, 2003b; Devlin et al., 1998) and higher within-category similarity in comparison to inanimates (Gerlach, 2001; Lag, 2005; Lag, Hveem, Ruud, & Laeng, 2006), one would expect inanimates to elicit a stronger N400 effect. The same prediction was made for between-category mismatches in comparison to within-category mismatches, since within-category mismatches should share more featural similarities than between-category mismatches. However, ERP signatures across animate vs. inanimate matches and within- vs. between-category mismatches did not systematically differ from each other in the parietal-occipital region.
Nonetheless, the central brain region revealed a more interesting and more complex pattern of results. To start with, processing of animate matches differed from that of inanimate matches in that animates elicited more positivity than inanimates within the N400 time-window. These results are in agreement with those recently reported by Proverbio, Del Zotto, & Zani (2007), demonstrating a greater N400 effect elicited by "artifacts than animals" in the centro-parietal brain region. More importantly, this result is also in agreement with the unitary account of semantic organization, which would predict a greater N400 amplitude for inanimates than for animates, due to the greater number of distinct features and the smaller number of similar and semantically correlated features in inanimates.
Furthermore, animate and inanimate within-category mismatch signatures did not differ from each other, whereas one would expect within-category mismatches to elicit a stronger N400 for inanimates than for animates, given the higher within-category similarities of animate objects. However, between-category mismatches did differ from one another, in that inanimate mismatches elicited a stronger N400 than did animate mismatches. Thus, presenting an animate stimulus prior to an inanimate stimulus elicited a stronger N400 than presenting an inanimate prior to an animate stimulus. Taking into account the results regarding within-category similarity across animates and inanimates, it could be argued that processing an object from a group with higher variability, namely inanimates, led to an enhanced N400 in comparison to processing a mismatch from the group with smaller within-category variability, namely animates.
In summary, the results of this experiment suggest that one possible explanation for the different and sometimes contradictory results in the literature regarding the processing and representation of animates and inanimates in the brain could lie in the variability of the items selected within each category. This is, clearly, a question of the homogeneity of the two categories. In order to further understand the role of within-category variability, it would be necessary to design experiments that directly contrast the processing of familiar items of similar structure and low within-item variability with the processing of structurally different familiar items. Thus, one way of exploring this further would be to contrast within-sub-category mismatches (sheep-bear) with within-category mismatches (sheep-butterfly) and with between-category mismatches (sheep-table). If the N400 is sensitive to featural overlap, such graded featural overlap should become apparent in the amplitude of the N400. The small number of sub-category items in the present research was not sufficient to run such a detailed analysis.
The other way to control for object variability is to design a set of novel items with a similar structure and to vary feature properties systematically across task-defined categories. By constructing such a set of items and training participants to learn them, it would be possible to examine the processing of object features on-line and to explore the strategies participants use to establish novel concepts for the newly learnt categories. Crucially, such a design would allow examination of the role of labels in forming novel category concepts.

Figure 1. The time sequence of the visual/auditory stimuli presentation

Figure 3. ERP signatures across the experimental conditions at the PZ site

Figure 5. ERP signatures across the experimental conditions at the C4 site