A new method for analyzing sustainability performance of global supply chains and its application to material resources
Contributors: Livia Cabernard, Stephan Pfister, Stefanie Hellweg
... VERSION 2
Compared to version 1, version 2 runs without MATLAB. The file “Instructions.pdf” attached below explains how to install and use this application. The file “Explanation_method_examples.pptx” attached below illustrates the principle of the method and shows how to use the application through several examples.
OVERALL DESCRIPTION (same as for version 1):
We share here the data compiled to calculate the results presented in the study «A new method for analyzing sustainability performance of global supply chains and its application to material resources». To allow all results of interest to be compiled, we provide an application. This application is based on the multi-regional input-output database EXIOBASE3 (version 3.4) and on data for assessing the potential environmental impacts of emissions and resource use. While the data for assessing environmental impacts are provided here, the user must download the EXIOBASE data from the EXIOBASE website (due to copyright restrictions). The application makes it possible to assess the cumulative upstream impacts of any sector or region worldwide without double-counting and to track these impacts upstream and downstream along the global value chain. The application covers a broad set of environmental and socio-economic indicators and the timespan from 1995 to 2011. The methodology is comprehensively explained in the study «A new method for analyzing sustainability performance of global supply chains and its application to material resources» (e.g., Section 2.1 gives a broad overview of the principle of the method). Link to the study: https://doi.org/10.1016/j.scitotenv.2019.04.434
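The core of this kind of MRIO-based footprint accounting can be sketched with a toy Leontief calculation. This is an illustrative example only, not the application's code: the coefficient matrix, final demand, and impact intensities below are invented, and the real analysis uses the full EXIOBASE3 tables.

```python
import numpy as np

# Toy example: 3 sector-regions with technical coefficient matrix A,
# final demand y, and direct impact intensities per unit output f.
A = np.array([[0.1, 0.2, 0.0],
              [0.0, 0.1, 0.3],
              [0.2, 0.0, 0.1]])
y = np.array([100.0, 50.0, 80.0])   # final demand per sector-region
f = np.array([0.5, 1.2, 0.8])       # direct impact per unit output

# Total output required to satisfy final demand: x = (I - A)^-1 y
L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse
x = L @ y

# Cumulative upstream (footprint) impact embodied in each unit of
# final demand: summing over final-demand categories gives the global
# total exactly once, i.e. without double-counting.
footprint = (f @ L) * y             # impact embodied in each final demand
total_direct = f @ x                # production-based total for comparison
print(footprint, footprint.sum(), total_direct)
```

By construction the consumption-based total (`footprint.sum()`) equals the production-based total (`f @ x`), which is the no-double-counting property the description refers to.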
Contributors: Alon Poleg-Polsky
... Simulation of neuronal processing of noise-corrupted inputs. NEURON simulation (http://neuron.yale.edu/neuron).
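As a rough illustration of what "processing of noise-corrupted inputs" can mean, here is a minimal leaky integrate-and-fire neuron driven by a noisy input current. This is not the NEURON model from the dataset; the model and all parameters are arbitrary placeholders.

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron with a noise-corrupted input
# current (illustrative only; the dataset uses NEURON, not this model).
rng = np.random.default_rng(0)
dt, T = 0.1, 1000.0          # time step and duration (ms)
tau, v_rest, v_th, v_reset = 20.0, -65.0, -50.0, -65.0  # mV, ms
i_mean, i_noise = 20.0, 5.0  # mean drive and noise amplitude (mV)

v = v_rest
spike_times = []
for step in range(int(T / dt)):
    i_t = i_mean + i_noise * rng.standard_normal()   # noisy input
    v += dt / tau * (-(v - v_rest) + i_t)            # membrane update
    if v >= v_th:                                    # threshold crossing
        spike_times.append(step * dt)
        v = v_reset
print(len(spike_times), "spikes in", T, "ms")
```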
Contributors: Raúl Roberto Poppiel, Jose Alexandre Dematte, Marilusa Pinto Coelho Lacerda, José Lucas Safanelli, Rodnei Rizzo, Manuel Pereira de Oliveira Junior, Jean Jesus Novais
... Maps of clay, silt and sand contents (g kg-1) predicted at 0-20 cm, 20-60 cm and 60-100 cm depth intervals (3D), obtained by random forest regression in Google Earth Engine. The gridded soil information covers Midwest Brazil, from 12° S to 20° S and from 45° W to 54° W, and is available at 250 m resolution. The maps were cross-validated, with coefficients of determination (R²) ranging from 0.64 to 0.85 across all depth intervals.
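The modeling approach (random forest regression evaluated by cross-validated R²) can be sketched in Python with scikit-learn. Everything below is a placeholder: the actual maps were fitted in Google Earth Engine on remote-sensing covariates, and the synthetic features and target here merely stand in for those.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for the covariates (e.g. spectral bands, terrain
# attributes) and for a clay-content target in g kg-1. Invented data,
# for illustration of the fit-and-cross-validate workflow only.
rng = np.random.default_rng(42)
X = rng.random((200, 5))
y_clay = 300 + 400 * X[:, 0] - 150 * X[:, 1] + 20 * rng.standard_normal(200)

model = RandomForestRegressor(n_estimators=100, random_state=0)
r2_scores = cross_val_score(model, X, y_clay, cv=5, scoring="r2")
print("cross-validated R^2:", r2_scores.mean())
```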
Contributors: Massimo Salvi
... This repository contains the FAST algorithm graphical user interface and some sample images used in the following work: - Salvi M., Cerrato V., Buffo A., and Molinari F., "Automated Segmentation of Brain Cells for Clonal Analyses in Fluorescence Microscopy Images", J Neurosci Methods 2019 (DOI: 10.1016/j.jneumeth.2019.108348) ABSTRACT The understanding of how cell diversity within and across distinct brain regions is ontogenetically achieved is a pivotal topic in neuroscience. Clonal analyses based on multicolor cell labeling represent a powerful tool to tackle this issue and disclose lineage relationships, but they produce enormous sets of fluorescence images, leading to time-consuming analyses that may be biased by the operator’s subjectivity. Thus, time-efficient automated software is needed to analyze images easily, accurately and without subjective bias. In this paper, we present a fully automated method, named FAST (‘Fluorescent cell Analysis Segmentation Tool’), for the segmentation of neural cells labeled by multicolor combinations of fluorophores and for their classification into clones. The proposed method was tested on 77 high-magnification fluorescence images of adult mouse cerebellar tissues acquired using a confocal microscope. Automatic results were compared with manual annotations and with two open-source software packages designed for cell detection in microscopic imaging. The algorithm showed very good performance in cellular detection and in the assignment of clonal identity. To the best of our knowledge, FAST is the first fully automated technique for the analysis of cellular clones based on combinatorial expression of fluorescent proteins. The proposed approach makes it possible to perform clonal analyses easily, accurately and objectively, overcoming the biases and errors that may result from manual annotations. Moreover, it can be broadly applied to the quantification and colocalization of fluorescent markers within cells, therefore representing a versatile and powerful tool for automated quantitative analyses in fluorescence microscopy.
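The two steps the abstract describes, segmentation of labeled cells followed by classification into clones by fluorophore combination, can be illustrated with a minimal sketch. This is not the FAST algorithm itself: the toy image, thresholds, and channel logic below are invented for illustration.

```python
import numpy as np
from scipy import ndimage

# Toy 3-channel "fluorescence image" with two synthetic cells.
rng = np.random.default_rng(1)
img = np.zeros((64, 64, 3))
img[10:18, 10:18, 0] = 1.0             # cell expressing channel 0 only
img[30:38, 30:38, [0, 2]] = 1.0        # cell expressing channels 0 and 2
img += 0.05 * rng.random(img.shape)    # background noise

# Step 1: segmentation by thresholding + connected-component labeling.
mask = img.max(axis=2) > 0.5
labels, n_cells = ndimage.label(mask)

# Step 2: classify each cell by the combination of channels it
# expresses above threshold; cells sharing a combination form a clone.
clones = {}
for cell_id in range(1, n_cells + 1):
    cell_px = labels == cell_id
    combo = tuple(img[..., c][cell_px].mean() > 0.5 for c in range(3))
    clones.setdefault(combo, []).append(cell_id)
print(n_cells, clones)
```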
Contributors: M. Sana Ullah Sahar
... This archive contains the firmware of the nerve stretcher as well as the files necessary to develop a PCB for the device. This project is actively maintained on GitHub. The updated versions of these scripts can be downloaded at any time from our GitHub repository (https://github.com/msanaullahsahar/nestv2). If you find any issue/problem/error in these scripts, please do not hesitate to report it at https://github.com/msanaullahsahar/nestv2/issues.
Contributors: Phil Symonds
... These data were used to quantify the impacts of air pollution policies on population health and health inequalities within a microsimulation model, MicroEnv. They provide a basis for comparing results from similar models and allow researchers to integrate additional model components. P. Symonds, E. Hutchinson, A. Ibbetson, J. Taylor, J. Milner, Z. Chalabi, M. Davies, P. Wilkinson, MicroEnv: A microsimulation model for quantifying the impacts of environmental policies on population health and health inequalities, Science of The Total Environment, Volume 697 (2019), 134105, ISSN 0048-9697, https://doi.org/10.1016/j.scitotenv.2019.134105.
Contributors: Qiankun Liu, Jingang Jiang, Changwei Jing, Zhong Liu, Jiaguo Qi
... In this paper, a new, alternative, multi-scale, multi-pollution-source waste load allocation (WLA) system was developed, with the goal of producing optimal, fair quota allocations at multiple scales. The new WLA system integrates multi-constrained environmental Gini coefficients (EGCs) and Delphi-analytic hierarchy process (Delphi-AHP) optimization models to achieve this goal. This dataset consists of the raw data and the source code of the models (the multi-constrained EGC and Delphi-AHP optimization models). The source code was used to run the program in the MATLAB environment to allocate waste load reduction quotas at both the regional scale and the site-specific scale with multiple pollution sources. The raw data consist of two parts: (1) the shp files of various geographic information data, which were used to depict the administrative divisions, pollution source distribution, geographical characteristics and patterns of the Xian-jiang watershed; (2) the basic data, including the statistical yearbook data of villages and towns in Ningbo city, the various indicator data used to calculate the weights at the criteria and decision-making levels, the contribution coefficients, and the EGC values of the three pollutants. On the basis of these data, a new, alternative, multi-scale, multi-sector optimal WLA framework was developed. The new scheme provides decision-makers with critical information (i.e., the best compromise solutions of WLA) and practical guidance as they address water pollution control. The results, in comparison with existing practices of the local governments, suggested that the pollution discharge quota at the regional scale is much fairer than the existing WLA and even yields some environmental-economic benefits at the pollutant-source scale after optimal WLA.
Several important conclusions were drawn: 1) Reductions and proportions of pollutants at the regional scale are significantly associated with the region's actual socioeconomic development modes. 2) There are certain characteristics that highly reduced pollution sources tend to share (listed in the article); sources with these features should be the top priorities for load reduction. 3) Most previous studies reported primarily on the WLA of load reductions among point pollution sources. Conversely, we found that industrial pollution sources should be the last option for reduction from an environmental-economic benefit perspective. Instead, often-overlooked types, such as agricultural non-point sources and domestic sources, deserve more attention, especially in extensive rural areas.
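The central fairness metric, the environmental Gini coefficient, can be sketched as a Lorenz-curve calculation over regional pollutant loads and a socioeconomic indicator. The regional figures below are invented for illustration; the dataset's MATLAB models add multiple constraints and the Delphi-AHP weighting on top of this basic idea.

```python
import numpy as np

def environmental_gini(load, indicator):
    """Gini coefficient of pollutant load shares vs. an indicator
    (e.g. GDP): 0 = perfectly proportional, 1 = maximally unequal."""
    # Sort regions by pollution intensity (load per unit indicator)
    order = np.argsort(load / indicator)
    x = np.concatenate([[0.0], np.cumsum(indicator[order]) / indicator.sum()])
    y = np.concatenate([[0.0], np.cumsum(load[order]) / load.sum()])
    # Gini = 1 - twice the trapezoidal area under the Lorenz curve
    return 1.0 - np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]))

# Invented regional COD loads (t/yr) and GDP shares for illustration.
load = np.array([40.0, 25.0, 20.0, 15.0])
gdp = np.array([10.0, 30.0, 25.0, 35.0])
print(environmental_gini(load, gdp))
```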
Data for: Integral transform method solution for heat transfer in polymer melt flow in a rectangular single screw channel with periodic inlet temperature
Contributors: YVES BEREAUX, Jean-Yves CHARMEAU, Yao AGBESSI, Jordan BIGLIONE, Liangxiao BU
... Octave programs for performing the integral transform of periodic heat transfer in drag- and pressure-driven fluid flow.
Contributors: Ryan Watkins
... Version 1.0, September 9, 2019 Purpose: Created as part of a project funded by NASA’s Lunar Data Analysis Program (LDAP), the purpose of this dataset is to provide locations and diameters of boulders around small, young impact craters on the Moon. These boulder counts were conducted as part of a study aimed at determining regolith production rates and assessing landing site hazards, as discussed in the associated publications. Researchers are encouraged to read the publications and the data description document to understand how the data were acquired and used. This database contains boulder distributions around small (< 1 km), young (< 200 Ma) lunar impact craters located near spacecraft landing sites. The most up-to-date database contains boulder diameters and coordinates for counts around Surveyor (Apollo 12), Cone (Apollo 14), North Ray (Apollo 16), South Ray (Apollo 16), Camelot (Apollo 17), and Zi Wei (Chang’e-3) craters. Boulders were manually identified and measured on Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images (Robinson et al., 2010) at scales of ~0.5-2 m/pixel. LROC NAC images allow boulders ~1-2 m in size and larger to be identified and measured. The tools for measuring boulders were CraterTools (Kneissl et al., 2011) and Crater Helper Tools (Nava, 2011), both developed for the ArcMap GIS platform. These boulder distributions are being used to understand boulder degradation rates on the lunar surface and to assess landing site hazards for future surface missions to the Moon. This dataset is being archived in Mendeley Data and at the Planetary Data System (PDS) Cartography and Imaging Node for use in future boulder distribution and landing hazard studies. Future boulder counts and any refinements to existing measurements will be uploaded into subsequent versions of this dataset here and at the PDS IMG Annex: https://astrogeology.usgs.gov/search/map/Moon/Research/Regolith/lunar_boulder_data_bundle
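A typical downstream use of such boulder counts is a cumulative size-frequency distribution: the number of boulders at or above a given diameter per unit area. The sketch below computes one from invented diameters and an invented count area; real values would come from the archived per-crater count files.

```python
import numpy as np

# Hypothetical boulder diameters (m) and counted area (km^2); the
# actual data are the per-crater diameter/coordinate tables in the
# archive. Invented for illustration only.
diameters_m = np.array([1.2, 1.5, 1.8, 2.0, 2.4, 3.1, 3.5, 4.2, 5.0, 6.3])
area_km2 = 0.25

# Cumulative size-frequency distribution: N(>= D) per km^2.
bins = np.array([1.0, 2.0, 4.0, 8.0])
csfd = np.array([(diameters_m >= d).sum() for d in bins]) / area_km2
for d, n in zip(bins, csfd):
    print(f"N(>= {d} m) = {n:.1f} per km^2")
```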