Abstract
Planning for power systems with high penetrations of variable renewable energy requires higher spatial and temporal granularity. However, most publicly available test systems are of insufficient fidelity for developing methods and tools for high-resolution planning. This paper presents methods to construct open-access test systems with high spatial granularity, to more accurately represent current infrastructure, and high temporal granularity, to represent the variability of demand and renewable resources. To demonstrate, a high-resolution test system representing the United States is created using only publicly available data. The test system is exercised in a production cost model, and the results are validated against historical generation to ensure that they are representative. The resulting open-source test system can support power system transition planning and aid in the development of tools to answer questions about how best to reach decarbonization goals using the most effective combinations of transmission expansion, renewable generation, and energy storage.
Documentation of dataset development
A paper describing the process of developing the dataset is available at https://arxiv.org/abs/2002.06155. Please cite as: Y. Xu, Nathan Myhrvold, Dhileep Sivam, Kaspar Mueller, Daniel J. Olsen, Bainan Xia, Daniel Livengood, Victoria Hunt, Benjamin Rouillé d'Orfeuil, Daniel Muldrew, Merrielle Ondreicka, Megan Bettilyon, "U.S. Test System with High Spatial and Temporal Resolution for Renewable Integration Studies," 2020 IEEE PES General Meeting, Montreal, Canada, 2020.
Dataset version history
  • 0.1, January 31, 2020: initial data upload.
  • 0.2, March 10, 2020: addition of Tabular Data Package metadata; modifications to cost curves and transmission capacities aimed at more closely matching optimization results to historical data.
  • 0.2.1, March 25, 2020: corrected a bug in the wind profile generation process that was pulling the wrong locations for wind farms outside the Western Interconnection.
Data Types:
  • Dataset
  • File Set
Abstract
Planning for power systems with high penetrations of variable renewable energy requires higher spatial and temporal granularity. However, most publicly available test systems are of insufficient fidelity for developing methods and tools for high-resolution planning. This paper presents methods to construct open-access test systems with high spatial granularity, to more accurately represent current infrastructure, and high temporal granularity, to represent the variability of demand and renewable resources. To demonstrate, a high-resolution test system representing the United States is created using only publicly available data. The test system is exercised in a production cost model, and the results are validated against historical generation to ensure that they are representative. The resulting open-source test system can support power system transition planning and aid in the development of tools to answer questions about how best to reach decarbonization goals using the most effective combinations of transmission expansion, renewable generation, and energy storage.
Documentation of dataset development
A paper describing the process of developing the dataset is available at https://arxiv.org/abs/2002.06155. Please cite as: Y. Xu, Nathan Myhrvold, Dhileep Sivam, Kaspar Mueller, Daniel J. Olsen, Bainan Xia, Daniel Livengood, Victoria Hunt, Benjamin Rouillé d'Orfeuil, Daniel Muldrew, Merrielle Ondreicka, Megan Bettilyon, "U.S. Test System with High Spatial and Temporal Resolution for Renewable Integration Studies," 2020 IEEE PES General Meeting, Montreal, Canada, 2020.
Dataset version history
  • 0.1, January 31, 2020: initial data upload.
  • 0.2, March 10, 2020: addition of Tabular Data Package metadata; modifications to cost curves and transmission capacities aimed at more closely matching optimization results to historical data.
  • 0.2.1, March 25, 2020: [erroneous upload]
  • 0.2.2, March 26, 2020: [erroneous upload]
Data Types:
  • Dataset
  • File Set
Planning for power systems with high penetrations of variable renewable energy requires higher spatial and temporal granularity. However, most publicly available test systems are of insufficient fidelity for developing methods and tools for high-resolution planning. This paper presents methods to construct open-access test systems with high spatial granularity, to more accurately represent current infrastructure, and high temporal granularity, to represent the variability of demand and renewable resources. To demonstrate, a high-resolution test system representing the United States is created using only publicly available data. The test system is exercised in a production cost model, and the results are validated against historical generation to ensure that they are representative. The resulting open-source test system can support power system transition planning and aid in the development of tools to answer questions about how best to reach decarbonization goals using the most effective combinations of transmission expansion, renewable generation, and energy storage.
A paper describing the process of developing the dataset is available at https://arxiv.org/abs/2002.06155.
Version history
  • 0.1, January 31, 2020: initial data upload.
  • 0.2, March 10, 2020: addition of Tabular Data Package metadata; modifications to cost curves and transmission capacities aimed at more closely matching optimization results to historical data.
Data Types:
  • Dataset
  • File Set
Abstract
Motivation: Antibodies are widely used experimental reagents to test the expression of proteins. However, they might not always provide the intended tests, because they do not always bind specifically to the target proteins their providers designed them for, leading to unreliable and irreproducible research results. While many proposals have been developed to deal with the problem of antibody specificity, they may not scale well to the millions of antibodies that have ever been designed and used in research. In this study, we investigate the feasibility of automatically extracting statements about antibody specificity reported in the literature by text mining, and of generating reports to alert scientist users to problematic antibodies.
Results: We developed a deep neural network system called Antibody Watch and tested its performance on a corpus of more than two thousand articles that report uses of antibodies. We leveraged Research Resource Identifiers (RRIDs) to precisely identify the antibodies mentioned in an input article, used the BERT language model to classify whether an antibody is reported as nonspecific, and thus problematic, and inferred coreference to link statements of specificity to the antibodies they refer to. Our evaluation shows that Antibody Watch can accurately perform both classification and linking with F-scores over 0.8, given only thousands of annotated training examples. The result suggests that with more training, Antibody Watch will provide useful reports about antibody specificity to scientists.
Data Types:
  • Dataset
  • File Set
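As a rough illustration of the sentence-level classification step described in the Antibody Watch entry above, the sketch below sets up a BERT sequence classifier with the Hugging Face transformers library. This is not the authors' code: the model name, label set, and example sentence (including its RRID) are placeholders, and a real system would first fine-tune the classifier on the annotated corpus.

    # Illustrative sketch only; assumes PyTorch and Hugging Face transformers are installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    MODEL_NAME = "bert-base-uncased"  # placeholder; not the authors' fine-tuned weights

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

    def classify_specificity(sentence: str) -> str:
        """Label a sentence mentioning an antibody as 'nonspecific' or 'other'."""
        inputs = tokenizer(sentence, return_tensors="pt", truncation=True, max_length=128)
        with torch.no_grad():
            logits = model(**inputs).logits
        return ["other", "nonspecific"][int(torch.argmax(logits, dim=-1))]

    # Hypothetical example sentence; the RRID below is not a real identifier.
    print(classify_specificity(
        "The anti-XYZ antibody (RRID:AB_0000000) produced bands at unexpected molecular weights."
    ))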
Compressed FASTQ files of raw sequences from clinical Escherichia coli isolates collected from infections in Toronto, Canada, in 2018 (Dataset 2). Sequencing details are outlined in the associated publication. Sequencing was performed on the Illumina NextSeq platform.
Data Types:
  • Document
  • File Set
Geosoup is a Python package for geospatial data manipulation using GDAL and the GDAL Python bindings. The package is a minimalistic software distribution for limited manipulation of common geospatial data types such as rasters, vectors, and samples. All the heavy lifting is done by GDAL, numpy, and scipy. Install using: pip install geosoup. Note: GDAL >= 2.1.3 must be installed and working before installing this package.
Data Types:
  • Software/Code
  • File Set
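The Geosoup entry above notes that the heavy lifting is done by GDAL; the sketch below shows the kind of raster access that implies, using the GDAL Python bindings directly rather than Geosoup's own API, which is not documented in this listing. The file name is a placeholder.

    # Minimal GDAL raster access, of the sort Geosoup wraps; the file name is hypothetical.
    from osgeo import gdal

    gdal.UseExceptions()

    ds = gdal.Open("example.tif")   # open a raster dataset
    band = ds.GetRasterBand(1)      # first band
    arr = band.ReadAsArray()        # pixel values as a numpy array

    print("Size:", ds.RasterXSize, "x", ds.RasterYSize)
    print("Projection:", ds.GetProjection())
    print("Mean pixel value:", float(arr.mean()))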
Open Authorization for MyProxy.
Data Types:
  • Software/Code
  • File Set
  • fix color settings
  • improve robustness via micro-image mismatch
  • enable continuous integration
Data Types:
  • Software/Code
  • File Set
A BIDS-App to preprocess anatomical (T1w, T2w, and FLAIR) data, providing workflows for reuse in NIPreps (NeuroImaging PREProcessing tools) such as fMRIPrep.
Data Types:
  • Software/Code
  • File Set
Several changes, most important:
Theoretical framework:
  • Corrected likelihood: forgot the non-constant K term when K = K(theta): c8c0046153b1eb269d00280b572f742b1a3cf4d7
Parameters and choices:
  • Unfolding parameters: 0fcafe2ff7770be8c2bb107256201af79739cdb3
  • Unfolder and fg method use remove-negatives only, no fill: 9edb48537cca1f88c3120a73fa8eb92f6ebb5177
  • Randomize p0 for decomposition: 77dec9db9a3a34d5fd6195752c84cfbca0c26c39
Implementation and convenience:
  • Different save/load for vectors: e5f7e52ce13cff04e8b23f50a00902be1d098bfc and parent commits
  • Enable pickling of normalizer instances via dill: 896b352686594a8c7dbe52904645cc5b900ba800
Data Types:
  • Software/Code
  • File Set