fMRI experiment for written words and pictures probed with visual and nonvisual properties
The data files come from an event-related fMRI experiment in which we investigated the role of the left perirhinal cortex in knowledge retrieval from written words and pictures. During fMRI, eighteen participants performed a property verification task: they were asked to decide whether a property was applicable to a concrete entity presented as a written word or a picture. Each concrete entity - animate or inanimate - was probed with visual and nonvisual properties. The data show that the left perirhinal cortex coded for the similarity in meaning between written words. The semantic similarity between stimuli was determined by means of a word association-based model.
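As an illustration of how a word association-based similarity model can be computed, the sketch below derives pairwise semantic similarity as the cosine between association vectors. The cue words, associate counts, and the choice of cosine as the similarity measure are all assumptions for illustration, not the actual model or data used in the experiment.

```python
import math

# Hypothetical word-association counts: for each cue word, how often
# participants produced each associate (toy data, for illustration only).
associations = {
    "dog": {"bark": 30, "pet": 25, "cat": 20, "fur": 10},
    "cat": {"meow": 28, "pet": 24, "dog": 18, "fur": 12},
    "car": {"drive": 35, "road": 22, "wheel": 15},
}

def cosine_similarity(a, b):
    """Cosine of the angle between two sparse association vectors."""
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

sim_dog_cat = cosine_similarity(associations["dog"], associations["cat"])
sim_dog_car = cosine_similarity(associations["dog"], associations["car"])
# Semantically related concepts share associates, so dog-cat exceeds dog-car.
```

In a model of this kind, two concepts count as similar in meaning when they evoke overlapping sets of associates, independently of whether the stimulus was shown as a word or a picture.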