Non-Thermal Atmospheric Pressure Plasma Induces Epigenetic Modifications that Activate the Expression of Various Cytokines and Growth Factors in Human Adipose Tissue-Derived Stem Cells
Contributors: Jeongyeon Park, Kiwon Song
... In this study, by analyzing the whole-genome expression profiles of NTAPP-activated ASCs with RNA-seq, we investigated the mechanism by which NTAPP increases the proliferation of ASCs, in order to deduce the common mechanism underlying the physiological effects of NTAPP. We demonstrated that NTAPP increases the expression of genes associated with the activity of various cytokines and growth factors, and downregulates genes involved in the intrinsic apoptotic pathway, by inducing epigenetic modifications. We also showed that nitric oxide (NO) generated by NTAPP is a main contributor to NTAPP-induced epigenetic modification. The physiological mechanism of NTAPP demonstrated in this study supports the development of NTAPP as an efficient tool for regenerative medicine.
Contributors: Raúl Roberto Poppiel, Jose Alexandre Dematte, Marilusa Pinto Coelho Lacerda, José Lucas Safanelli, Rodnei Rizzo, Manuel Pereira de Oliveira Junior, Jean Jesus Novais
... Maps of clay, silt and sand contents (g kg⁻¹) predicted at 0-20 cm, 20-60 cm and 60-100 cm depth intervals (3D), obtained by random forest regression in Google Earth Engine. The gridded soil information covers Midwest Brazil, from 12° S to 20° S and from 45° W to 54° W, and is available at 250 m resolution. The maps were cross-validated, with coefficients of determination (R²) ranging from 0.64 to 0.85 across all depth intervals.
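The study above trained its random forest regressions inside Google Earth Engine; a minimal local sketch of the same idea, here using scikit-learn with entirely synthetic covariates and a fake clay target (the predictor names and values are illustrative, not from the dataset), might look like:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for per-pixel spectral covariates; the real study
# used remote-sensing predictors over Midwest Brazil at 250 m resolution.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 6))              # e.g. 6 reflectance bands
clay = 200 + 400 * X[:, 0] + 30 * rng.normal(size=500)  # fake clay content, g/kg

# Cross-validated random forest regression, analogous in spirit to the
# GEE workflow (which would use ee.Classifier/Regressor equivalents).
model = RandomForestRegressor(n_estimators=200, random_state=0)
r2 = cross_val_score(model, X, clay, cv=5, scoring="r2")
print(r2.mean())
```

The cross-validated R² plays the same role as the 0.64-0.85 coefficients of determination reported for the soil texture maps.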
Data for: The WC-Co/Fe-Ni interface: effect of holding time on the microstructure, grain size and grain growth mechanism
Contributors: Peiquan Xu
... The enclosed data provide the X-ray diffraction patterns of samples W2, W3, W4, and the Invar alloy for comparison.
Contributors: Massimo Salvi
... This repository contains the FAST algorithm graphical user interface and some sample images used in the following work: - Salvi M., Cerrato V., Buffo A., and Molinari F., "Automated Segmentation of Brain Cells for Clonal Analyses in Fluorescence Microscopy Images", J Neurosci Methods 2019 (DOI: 10.1016/j.jneumeth.2019.108348) ABSTRACT Understanding how cell diversity within and across distinct brain regions is ontogenetically achieved is a pivotal topic in neuroscience. Clonal analyses based on multicolor cell labeling are a powerful tool to tackle this issue and disclose lineage relationships, but they produce enormous sets of fluorescence images, leading to time-consuming analyses that may be biased by the operator's subjectivity. Thus, time-efficient automated software is needed to analyze images easily, accurately and without subjective bias. In this paper, we present a fully automated method, named FAST ('Fluorescent cell Analysis Segmentation Tool'), for the segmentation of neural cells labeled by multicolor combinations of fluorophores and for their classification into clones. The proposed method was tested on 77 high-magnification fluorescence images of adult mouse cerebellar tissue acquired using a confocal microscope. Automatic results were compared with manual annotations and with two open-source software packages designed for cell detection in microscopic imaging. The algorithm showed very good performance in cellular detection and in the assignment of clonal identity. To the best of our knowledge, FAST is the first fully automated technique for the analysis of cellular clones based on combinatorial expression of fluorescent proteins. The proposed approach allows clonal analyses to be performed easily, accurately and objectively, overcoming the biases and errors that may result from manual annotations.
Moreover, it can be broadly applied to the quantification and colocalization of fluorescent markers within cells, therefore representing a versatile and powerful tool for automated quantitative analyses in fluorescence microscopy.
Contributors: Qiankun Liu, Jingang Jiang, Changwei Jing, Zhong Liu, Jiaguo Qi
... In this paper, a new, alternative, multi-scale, multi-pollution-source waste load allocation (WLA) system was developed, with the goal of producing optimal, fair quota allocations at multiple scales. The new WLA system integrates multi-constrained environmental Gini coefficients (EGCs) and Delphi-analytic hierarchy process (Delphi-AHP) optimization models to achieve this goal. This dataset consists of the raw data and the source code of the models (the multi-constrained EGC and Delphi-AHP optimization models). The source code was used to run the program in the MATLAB environment to allocate waste load reduction quotas at both the regional scale and the site-specific scale with multiple pollution sources. The raw data consist of two main parts: (1) shp files of various geographic information data, which were used to depict the administrative divisions, pollution source distribution, and geographical characteristics and patterns of the Xian-jiang watershed; (2) basic data, including the statistical yearbook data of villages and towns in Ningbo city, the indicator data used to calculate the weights at the criteria and decision-making levels, the contribution coefficients, and the EGC values of the three pollutants. On the basis of these data, a new, alternative, multi-scale, multi-sector optimal WLA framework was developed. The new scheme provides decision-makers with critical information (i.e., the best compromise solutions of WLA) and practical guidance as they address water pollution control. The results, in comparison with the existing practices of local governments, suggested that the pollution discharge quotas at the regional scale are much fairer than under the existing WLA and even yield some environmental-economic benefits at the pollutant-source scale after optimal WLA.
Several important conclusions were drawn: 1) Reductions and proportions of pollutants at the regional scale are significantly associated with a region's actual socioeconomic development mode. 2) There are certain characteristics that heavily reduced pollution sources tend to share (listed in the article); sources with these features should be the top priorities for reduction. 3) Most previous studies reported primarily on the WLA of reductions among point pollution sources. Conversely, we found that industrial pollution sources should be the last option for reduction from an environmental-economic benefit perspective. Instead, often-overlooked source types, such as agricultural non-point sources and domestic sources, deserve more attention, especially in extensive rural areas.
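The environmental Gini coefficient at the core of the allocation scheme measures inequality in pollution loads across sources. As a point of reference, a minimal sketch of the standard (unconstrained) Gini computation via the Lorenz curve follows; the dataset's MATLAB models add the multi-constraint machinery on top of this basic idea:

```python
import numpy as np

def gini(loads: np.ndarray) -> float:
    """Gini coefficient of a 1-D array of non-negative loads.

    0 means all sources carry equal shares; values approaching 1 mean
    one source carries almost the entire load.
    """
    x = np.sort(np.asarray(loads, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    # Standard population formula derived from the Lorenz curve.
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

print(gini(np.array([10.0, 10.0, 10.0])))  # → 0.0 (perfectly equal shares)
print(gini(np.array([0.0, 0.0, 30.0])))    # one source carries everything
```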
Contributors: Louise Logsdon
... These files make up the "Instrumental de apoio ao projeto de moradias sociais" (a support toolkit for the design of social housing) that feeds the design process model proposed by the author in her doctoral thesis: LOGSDON, Louise. Instrumental de apoio ao projeto de moradias sociais. 2019. Tese (Doutorado) - Arquitetura, Urbanismo e Tecnologia, Instituto de Arquitetura e Urbanismo, Universidade de São Paulo, São Carlos, 2019.
Extending the Ability of Near-Infrared Images to Monitor Small River Discharge on the Northeastern Tibetan Plateau (2018WR023808)
Contributors: Li Haojie, Hongyi Li
... Data access (2018WR023808). Note: all data used in this study are provided in this repository, including daily gauged discharge data, figure data, and calculation code. Please refer to the paper for details on how to use the data. 1. All figures of this paper are provided. The vector data used in Figure 1 and Figure 4 are uploaded to separate folders (shp). 2. The data used for the figures in this study are shared in the data attachment file (Excel). 3. Gauged discharge data: the daily gauged discharge data used in this study are shared in the data attachment file (Excel). 4. Landsat images and code: the Landsat image sets and calculation code used in this study are available on the Google Earth Engine platform. Links to the image sets and sample code are as follows. TM https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LT05_C01_T1 ETM https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LE07_C01_T1 OLI https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LC8_L1T Calculation code https://code.earthengine.google.com/2254187574dedb2c0b2e9a7fc1832eb2
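Estimating discharge from near-infrared imagery rests on the fact that open water absorbs strongly in the NIR, so a water mask can be thresholded directly from the NIR band and converted into an effective river width. A toy sketch of that step (the reflectance values and threshold are illustrative, not taken from the paper or its GEE code):

```python
import numpy as np

# Hypothetical 2-D NIR reflectance array for a small Landsat scene subset;
# low NIR reflectance is a common (simplified) open-water criterion.
nir = np.array([
    [0.30, 0.05, 0.04, 0.31],
    [0.29, 0.04, 0.05, 0.30],
    [0.28, 0.05, 0.04, 0.29],
])

WATER_THRESHOLD = 0.10   # assumed cutoff; scene-dependent in practice
water_mask = nir < WATER_THRESHOLD
pixel_size_m = 30.0      # Landsat TM/ETM+/OLI ground resolution

# Effective river width per image row: water pixels × pixel size.
widths = water_mask.sum(axis=1) * pixel_size_m
print(widths)  # each row has 2 water pixels → [60. 60. 60.]
```

A width-discharge rating (calibrated against the gauged discharge data in this repository) would then turn these widths into flow estimates.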
UAV imagery and in-situ measurements for structure-from-motion snow depth mapping over the Laurichard rock glacier, France - surveyed in 2017
Contributors: Jason Goetz, Marco Marcer, Alexander Brenning, Xavier Bodin
... Unmanned aerial vehicle (UAV) imagery and in-situ field measurements at the Combe de Laurichard, France (45.01ºN, 6.37ºE, 2500 m a.s.l.) were collected to explore uncertainties in mapping snow depth with structure-from-motion and multi-view stereo 3D reconstruction in an alpine area. Repeat UAV surveys were flown on each survey date to create multiple elevation models to determine the precision (i.e., repeatability) of SFM-MVS elevation models. This measure of uncertainty can be used to determine the precision of SFM-MVS snow depths using a model of error propagation. It can also illustrate how uncertainty in the SFM-MVS snow depths and elevation models varies spatially. The repeated snow-cover (snow-on) elevation models (6 in total) were acquired on June 1, 2017, and the snow-free (snow-off) elevation models (7 in total) on October 5, 2017. These elevation models were derived by performing SFM-MVS reconstruction using Agisoft PhotoScan. The UAV imagery was acquired using a DJI Phantom 4, which flew pre-programmed parallel flight paths with 75% side and top image overlap. The flying height of the UAV was approximately 60 m above ground level. Artificial targets were used for SFM camera calibration and georeferencing using the RGF93 / Lambert-93 projection and the NGF-IGN69 vertical datum (EPSG::5698). Validation data were collected by measuring topographic heights (i.e., check points) using a real-time-kinematic (RTK) global navigation satellite system (GNSS) survey with an accuracy < 2 cm (at 1 σ). At each check point location, snow depths were measured using an avalanche probe to a maximum depth of 3 m. This RTK-GNSS survey was also used for collecting ground control points (GCPs). The position of the base station was corrected using the PUYA reference station, which is located approximately 19 km from the study area.
Additionally, to compare the uncertainty of SFM-MVS snow depths in stable and active deforming terrain (i.e., rock glacier creep), a mask of the rock glacier area was mapped using the UAV derived imagery and elevation models.
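The error-propagation model mentioned above combines the repeatability of the snow-on and snow-off elevation models into a snow-depth uncertainty: for a depth h = z_on − z_off with independent elevation errors, σ_h = √(σ_on² + σ_off²). A minimal sketch with illustrative (not the study's) per-cell precisions:

```python
import math

def snow_depth_sigma(sigma_on: float, sigma_off: float) -> float:
    """Propagated 1-sigma uncertainty of snow depth h = z_on - z_off,
    assuming independent snow-on and snow-off elevation-model errors."""
    return math.sqrt(sigma_on**2 + sigma_off**2)

# Illustrative DEM repeatabilities (m) estimated from the repeat surveys.
print(snow_depth_sigma(0.03, 0.04))  # → 0.05
```

Applying this per grid cell yields a spatially varying snow-depth uncertainty map, which is what allows stable and actively deforming (rock glacier) terrain to be compared.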
Data for the calculation of an indicator of the comprehensiveness of conservation of useful wild plants
Contributors: Colin K. Khoury, Daniel Amariles, Jonatan Soto, Maria Victoria Diaz, Steven Sotelo, Chrystian C. Sosa, Julian Ramírez-Villegas, Harold Achicanoy, Nora P. Castañeda-Álvarez, Blanca León
... The datasets presented here are related to the research article entitled “Comprehensiveness of conservation of useful wild plants: an operational indicator for biodiversity and sustainable development targets” (Khoury et al., 2019). The indicator methodology includes five main steps, each requiring and producing data, which are fully described and available here. These data include: species taxonomy, uses, and general geographic information (dataset 1); species occurrence data (dataset 2); global administrative areas data (dataset 3); eco-geographic predictors used in species distribution modeling (dataset 4); a world map raster file (dataset 5); species spatial distribution modeling outputs (dataset 6); ecoregion spatial data used in conservation analyses (dataset 7); protected area spatial data used in conservation analyses (dataset 8); and countries, sub-regions, and regions classifications data (dataset 9). These data are available at http://dx.doi.org/10.17632/2jxj4k32m2.1. In combination with the openly accessible methodology code (https://github.com/CIAT-DAPA/UsefulPlants-Indicator), these data facilitate indicator assessments and serve as a baseline against which future calculations of the indicator can be measured. The data can also contribute to other species distribution modeling, ecological research, and conservation analysis purposes. Khoury CK, Amariles D, Soto JS, Diaz MV, Sotelo S, Sosa CC, Ramírez-Villegas J, Achicanoy HA, Velásquez-Tibatá J, Guarino L, León B, Navarro-Racines C, Castañeda-Álvarez NP, Dempewolf H, Wiersema JH, and Jarvis A (2019) Comprehensiveness of conservation of useful wild plants: an operational indicator for biodiversity and sustainable development targets. Ecological Indicators 98: 420-429. doi: 10.1016/j.ecolind.2018.11.016. 
Available online at: https://doi.org/10.1016/j.ecolind.2018.11.016 Khoury CK, Amariles D, Soto JS, Diaz MV, Sotelo S, Sosa CC, Ramírez-Villegas J, Achicanoy HA, Castañeda-Álvarez NP, León B, and Wiersema JH (2019) Data for the calculation of an indicator of the comprehensiveness of conservation of useful wild plants. Data in Brief. doi: 10.1016/j.dib.2018.11.125.
Data for: Classification methods for point clouds in rock slope monitoring: a novel machine learning approach and comparative analysis
Contributors: Luke Weidner, Gabriel Walton, Ryan Kromer
... Data for the following submission: Title: Classification methods for point clouds in rock slope monitoring: a novel machine learning approach and comparative analysis Weidner, L.1*, Walton, G.1, Kromer, R.1 1Colorado School of Mines, Golden, USA *Corresponding author email: email@example.com In the event the manuscript is unavailable, please reach out to us for a copy. The main contents of this file are as follows: -Supplementary figures referenced in the manuscript -All processed point clouds used in the through-time analysis (~9.3 GB) -Scripts used to calculate the results shown in Figures 11, 12 and 13 (~1.6 GB) -Numeric data in other tables, graphs, and figures. Due to the nature of the research, many large point clouds are created, more than can all be uploaded to this repository. If you are looking for data that is not provided in this dataset, please reach out to the authors; we would be happy to provide any additional data. Scripts labeled "RUNME" are found in the main file directory for creating the ML method results ('tests_RUNME.m'), and for the hybrid and masking results. For the most part, the scripts can be run without modification and should produce results, assuming the required MATLAB toolboxes are installed. Note that for hybrid and masking, multiple runs of the script are required, changing the filenames at the beginning of the script for each of the four dates calculated. The Random Forest TreeBagger object ('tb_t14_jun16dec18') is also included, along with all the feature sets used for training and validation ('date_struct.mat').
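The repository's MATLAB scripts train a Random Forest (the TreeBagger object above) on per-point feature sets to label rock slope point clouds. An equivalent sketch in Python with scikit-learn, using made-up geometric features and class labels as stand-ins for the 'date_struct.mat' training data, could be:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Synthetic per-point features (e.g. roughness, verticality, intensity);
# labels 0 = rock, 1 = vegetation, defined by a simple rule for illustration.
n = 600
features = rng.normal(size=(n, 3))
labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)

# RandomForestClassifier plays the role of MATLAB's TreeBagger here.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[:400], labels[:400])          # training split
accuracy = clf.score(features[400:], labels[400:])  # held-out validation split
print(round(accuracy, 2))
```

The trained classifier would then be applied to each epoch's point cloud so that vegetation can be masked out before change detection.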