- Data for: Psychological and neural responses to architectural interiors. This is the behavioral data from the experiments outlined in our study, "Psychological and neural responses to architectural interiors."
- Data for: The Role of the Basolateral Amygdala in Dreaming. This dataset contains the data collected for the following paper: Blake, Y., Terburg, D., Balchin, R., van Honk, J., & Solms, M. The Role of the Basolateral Amygdala in Dreaming. Cortex, under revision. The data consist of scores for 23 dream reports collected from 8 patients with bilateral basolateral amygdala lesions due to Urbach-Wiethe disease, and 52 dream reports collected from 17 matched healthy controls. These dream reports were scored on seven measures by three independent research assistants. For information on the measures used, please see the descriptions in the paper. Brief description of each variable:
  - PATIENT: classifies the research participant as either a UWD patient or a control.
  - PARTICIPANT: unique number assigned to each participant.
  - DREAM: unique number assigned to each dream report collected from the participants.
  - POS_AFFECT: the dream report's positive affect score (as measured by the PANDAS).
  - NEG_AFFECT: the dream report's negative affect score (as measured by the PANDAS).
  - WISH: the dream report's wish-fulfilment score.
  - NIGHTMARE: a score of 1 indicates that the dream report was classified as a nightmare; a score of 0 indicates that it was not.
  - ANGER, LUST, PLAY, SEEKING, CARE, FEAR, LOSS: the dream report's score on each of Panksepp's (1998) basic emotions, as measured by the ADS.
  - APP_AV: a score of 1 indicates that the dream report was classified as predominantly consisting of approach behaviour; a score of 0 indicates predominantly avoidance behaviour.
  - THREAT: a score of 1 indicates that the dream report was rated as containing a realistic physical threat to the dreamer; a score of 0 indicates that it was not.
  - LIFE*: if the dream contained a realistic physical threat, was that threat rated as being life-threatening? 1 = Yes, 0 = No.
  - ANCESTRAL*: if the dream contained a realistic physical threat, was that threat rated as being ancestral or modern? 1 = Ancestral, 0 = Modern.
  - ESCAPE*: if the dream contained a realistic physical threat, does the dreamer escape the threat? 1 = Yes, 0 = No.
  - REALISTIC**: if the dreamer was able to escape the threat in their dream, is this escape realistic? 1 = Yes, 0 = No.
  - WORD_COUNT: the number of words counted in the dream report.
  - N_ITEM_COUNT: the number of narrative items counted in the dream report.
  - FACTOR 1: the dream report's score on the first component extracted by the PCA.
  - FACTOR 2: the dream report's score on the second component extracted by the PCA.
  - FACTOR 3: the dream report's score on the third component extracted by the PCA.

  *NOTE: The LIFE, ANCESTRAL, and ESCAPE variables are only relevant if THREAT = 1.
  **NOTE: The REALISTIC variable is only relevant if ESCAPE = 1.
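The codebook's conditional rules (LIFE, ANCESTRAL, and ESCAPE apply only when THREAT = 1; REALISTIC applies only when ESCAPE = 1) can be expressed as a small validity check. A minimal Python sketch, assuming the scores are loaded as dictionaries keyed by the variable names above; the example rows are invented, not taken from the dataset:

```python
# Hypothetical dream-report rows using the codebook's column names.
# None stands for "not scored because the condition did not apply".
dream_reports = [
    {"DREAM": 1, "THREAT": 1, "LIFE": 1, "ESCAPE": 1, "REALISTIC": 0},
    {"DREAM": 2, "THREAT": 0, "LIFE": None, "ESCAPE": None, "REALISTIC": None},
    {"DREAM": 3, "THREAT": 1, "LIFE": 0, "ESCAPE": 0, "REALISTIC": None},
]

def conditionals_consistent(report):
    """Check that conditional variables are scored only when applicable."""
    if report["THREAT"] == 0:
        # No realistic physical threat: threat-related variables stay unscored.
        return all(report[k] is None for k in ("LIFE", "ESCAPE", "REALISTIC"))
    if report["ESCAPE"] == 0:
        # Threat present but no escape: REALISTIC stays unscored.
        return report["REALISTIC"] is None
    return True

print(all(conditionals_consistent(r) for r in dream_reports))  # True
```

A check like this is useful before analysis, since a nonmissing REALISTIC score on a no-escape report would indicate a coding error rather than a real observation.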
- Data for: The Co-occurrence of Pitch and Rhythm Disorders in Congenital Amusia. Perceptual and tapping data.
- fMRI data for word-pair presentation. There are two general views regarding the organization of object knowledge. The feature-based view assumes that object knowledge is grounded in a widely distributed neural network in terms of sensory/function features (e.g., Warrington & Shallice, 1984), while the category-based view assumes in addition that object knowledge is organized by taxonomic and thematic categories (e.g., Schwartz et al., 2011). Using an fMRI adaptation paradigm, we compare predictions from the feature- and category-based views by examining the neural substrates recruited as subjects read word pairs that are identical, taxonomically related, thematically related, or unrelated, while controlling for the function features involved across the two categories. The feature-based view predicts that adaptation in function regions (i.e., left posterior middle temporal lobe, left premotor cortex) should be observed for related word pairs regardless of the taxonomic/thematic categories. In contrast, the category-based view generates the prediction that adaptation in the bilateral anterior temporal lobes should be observed for taxonomically related word pairs and adaptation in the left temporo-parietal junction should be observed for thematically related word pairs. By improving upon previous study designs and employing the fMRI adaptation task, this study has the potential to clarify the role of semantic categories and features in the organization of object knowledge.
- Data for: Correlates of Anomia in Non-semantic Variants of Primary Progressive Aphasia Converge over Time. Study data and labels mapped onto the standard subject (fsaverage, FreeSurfer).
- The functional subdivision of the visual brain: Is there a real illusion effect on action? A multi-lab replication study (data for Cortex RR). Data and analyses for the paper 'The functional subdivision of the visual brain: Is there a real illusion effect on action? A multi-lab replication study', published as a Registered Report in Cortex. Authors: Karl K. Kopiske, Nicola Bruno, Constanze Hesse, Thomas Schenk, Volker H. Franz. ABSTRACT: It has often been suggested that visual illusions affect perception but not actions such as grasping, as predicted by the "two-visual-systems" hypothesis of Milner & Goodale (1995, The Visual Brain in Action, MIT Press). However, at least for the Ebbinghaus illusion, relevant studies seem to reveal a consistent illusion effect on grasping (Franz & Gegenfurtner, 2008. Grasping visual illusions: Consistent data and no dissociation. Cognitive Neuropsychology). Two interpretations are possible: either grasping is not immune to illusions (arguing against dissociable processing mechanisms for vision-for-perception and vision-for-action), or some other factors modulate grasping in ways that mimic a vision-for-perception effect in actions. It has been suggested that one such factor may be obstacle avoidance (Haffenden, Schiff & Goodale, 2001. The dissociation between perception and action in the Ebbinghaus illusion: Nonillusory effects of pictorial cues on grasp. Current Biology, 11, 177-181). In four different labs (total N=144), we conducted an exact replication of previous studies suggesting obstacle avoidance mechanisms, implementing conditions that tested grasping as well as multiple perceptual tasks. This replication was supplemented by additional conditions to obtain more conclusive results. Our results confirm that grasping is affected by the Ebbinghaus illusion and demonstrate that this effect cannot be explained by obstacle avoidance.
- Data for: 'Differences between morphological and repetition priming in auditory lexical decision'. In this data set, we include the data for two auditory lexical decision experiments which are presented in the paper entitled "Differences between morphological and repetition priming in auditory lexical decision: Implications for decompositional models". Included with the data are the minimally processed raw data files, the R scripts used to prepare, trim, visualize, and model the data (using linear mixed-effects models), the model outputs themselves, and the graphs generated from the data. Please refer to the file_descriptions.pdf file for more details on the data itself. To start examining the data, please refer to the scripts labeled 1_exp1_data_prep.R and 1_exp2_data_prep.R. Following through to the other R scripts in the /scripts/ folder takes the reader through the analyses presented in the paper. These analyses read in CSV files from the /data/input/ folder and process them to create exp1_prep.csv, exp2_prep.csv, exp1_trim.csv, and exp2_trim.csv. These trimmed data sets are then used in the later scripts in the /scripts/ folder to generate the models found in the /scripts/data/ folder (the HTML output of which is in the /scripts/models/ folder) and the graphs found in the /scripts/graphs/ folder. The final scripts, entitled exp1_models.R and exp2_models.R, generate the models found in /scripts/data/. Please direct comments about the dataset to the corresponding authors of the associated paper, Robert J. Wilder (rwilder@sas.upenn.edu) and Amy Goodwin Davies (amygood@sas.upenn.edu).
- Data for: Beyond decomposition: processing zero-derivations in English visual word recognition. RT data from masked priming studies (Experiments 1-3). Data from the delayed priming study (Experiment 4). Key to experimental parameters.
- Data for: Altered neural dynamics in people who report spontaneous out-of-body experiences. These files are the raw EEG recordings (decimated to a sampling rate of 512 Hz). The raw BDF files have been imported into EEGLAB and saved as .set and .fdt files. No other processing has been applied. For further information about these data, please contact the first author, Dr Elizabeth Milne, E.Milne@sheffield.ac.uk.