fMRI experiment for written words and pictures probed with visual and nonvisual properties

Published: 31 January 2019 | Version 1 | DOI: 10.17632/gr4mb4cf8v.1
Contributors:
Antonietta Gabriella Liuzzi,
Patrick Dupont,
Rik Vandenberghe

Description

The data files refer to an event-related fMRI experiment in which we investigated the role of the left perirhinal cortex in knowledge retrieval from written words and pictures. During fMRI, eighteen participants performed a property verification task: they were asked to decide whether a property was applicable to a concrete entity presented as a written word or a picture. Each concrete entity, animate or inanimate, was probed with visual and nonvisual properties. The data show that the left perirhinal cortex coded for the similarity in meaning between written words. The semantic similarity between stimuli was determined by means of a word-association-based model.
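The dataset description does not specify how the word-association-based similarity was computed; a common approach is to represent each concept as a vector of association counts and compare concepts by cosine similarity. The sketch below is an illustrative assumption, not the authors' procedure, and the concepts and counts are invented toy values.

```python
import math

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy association profiles (hypothetical): how often each of four
# associate words was produced for each concept in a norming study.
associations = {
    "dog":    [9, 1, 0, 4],
    "cat":    [8, 2, 0, 3],
    "hammer": [0, 0, 7, 1],
}

concepts = list(associations)
similarity = {
    (a, b): cosine(associations[a], associations[b])
    for i, a in enumerate(concepts)
    for b in concepts[i + 1:]
}
```

Under this toy model, semantically related concepts (e.g. "dog" and "cat") yield a higher cosine similarity than unrelated ones (e.g. "dog" and "hammer"); a matrix of such pairwise similarities is the kind of model that can then be compared against neural similarity patterns.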

Files

Institutions

Katholieke Universiteit Leuven

Categories

Semantics, Functional Magnetic Resonance Imaging

License