- Data for: A novel algorithm for calculating transition potential in cellular automata models of land-use/cover change. Land-use change dataset of Ahwaz, Iran.
- Data for: Moving to 3-D flood hazard maps for enhancing risk communication. The .txt file is a Python script that imports a .2dm triangular grid as a Blender mesh. To run it in Blender: (1) open a Text Editor view, (2) go to Text > Open Text Block and open the .txt file, (3) press Run Script. Further comments are included in the file.
- Data for: Building a landslide climate indicator with machine learning and land surface models. This is the model dump file provided by XGBoost. This model has not been validated for use in regions other than the Pacific Northwest of the United States, nor with input data other than NCA-LDAS. This model was created to describe the seasonality of landslides over a broad geographic region, and it is not suitable for site-specific hazard calculations.
- Data for: Why So Many Published Sensitivity Analyses Are False. A Systematic Review of Sensitivity Analysis Practices. This is the initial query database.
- Data for: Pathline creation using TOUGH simulation results and fully unstructured 3D Voronoi grids. The dataset contains all example files for visualizing the results of TOUGH2Path pathline computation for 2D and 3D case studies.
- Data for: Development of an automated and open source GIS tool for reproducing the HAND terrain model. The HAND tool was developed in the Python programming language and uses functionality of a commercial geographic information system to construct the HAND model and terrain map. The tool can be used in ArcGIS 10.2.
- Data for: Improving development efficiency through decision analysis: reservoir protection in Burkina Faso. A participatory conceptual model has been coded as a Monte Carlo simulation using the decisionSupport() function, which is part of the decisionSupport package (Luedeling and Göhring, 2017) for the R programming environment (R Core Team, 2017). The vignette introduces the decision model and details its implementation in R. Data table input is also provided.
- Data for: Accelerating Bayesian inference in hydrological modelling with a mechanistic emulator. The software used to generate this dataset can be found in the repository https://github.com/machacd/mechemu .
- Data for: Communicating physics-based wave model predictions of coral reefs using Bayesian Belief Networks. Bayesian belief network files for beach toe significant wave conditions on coral reefs, developed using wave predictions from Baldock et al. (2015). There is one network (Hs_toe_*.neta, Netica v5.18 files) that has been trained using the case file Hs_toe.cas with three different learning algorithms: counting (Hs_toe_C.neta), expectation-maximization (Hs_toe_EM.neta), and gradient descent (Hs_toe_GA.neta). Reference: Baldock, T.E., Golshani, A., Atkinson, A., Shimamoto, T., Wu, S., Callaghan, D.P. and Mumby, P.J., 2015. Impact of sea-level rise on cross-shore sediment transport on fetch-limited barrier reef island beaches under modal and cyclonic conditions. Marine Pollution Bulletin.
- Data for: Inside the Black Box: Understanding Key Drivers of Global Emission Scenarios. These two files contain the data and analysis for the submitted article "Inside the Black Box", by Jonathan Koomey et al. The PFU file contains historical data used to create Figures 1 and 2 in the main text of the article, while the MESSAGE file contains the projections and data needed to create Figures 3 through 8 in the main text. There are many additional tabs in the workbooks that have historical value but are not directly relevant to the article itself. After the article is accepted we'll create tidier versions of these files that eliminate extraneous material, but we don't want to do that until we get final word from the editor and reviewers and make whatever additional changes they require in the analysis.
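The Blender-import entry above lists the manual steps for running the script; the core of any such importer is reading the .2dm grid into vertex and face lists before handing them to Blender's mesh API. A minimal, hypothetical sketch of that parsing step, assuming standard SMS .2dm records ("ND" node lines and "E3T" triangle lines) — this is not the authors' script:

```python
def parse_2dm(text):
    """Parse an SMS .2dm triangular grid into (vertices, faces).

    Assumed record layout:
      ND  id x y z        -- a node with 3-D coordinates
      E3T id n1 n2 n3 ... -- a triangle referencing three node ids
    The output is the shape a Blender importer would pass to
    Mesh.from_pydata(): a dense 0-based vertex list and index triples.
    """
    nodes, faces = {}, []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "ND":    # node record
            nodes[int(parts[1])] = tuple(map(float, parts[2:5]))
        elif parts[0] == "E3T":  # triangular element record
            faces.append(tuple(int(p) for p in parts[2:5]))
    # Re-index node ids (which may be sparse) to dense 0-based indices.
    order = sorted(nodes)
    index = {nid: i for i, nid in enumerate(order)}
    verts = [nodes[n] for n in order]
    tris = [tuple(index[n] for n in f) for f in faces]
    return verts, tris
```

Inside Blender, the returned lists would then be fed to `bpy.data.meshes.new(...).from_pydata(verts, [], tris)`; the parser itself has no Blender dependency, so it can be tested standalone.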
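The reservoir-protection entry describes coding a participatory conceptual model as a Monte Carlo simulation with the decisionSupport() function in R. The same pattern — draw each uncertain input from an estimated range, evaluate the outcome model many times, and inspect the resulting distribution — can be sketched in Python. All names and ranges below are illustrative placeholders, not values from the dataset:

```python
import random

def mc_decision(model, estimates, n=10000, seed=42):
    """Toy Monte Carlo decision analysis.

    estimates maps each uncertain input name to a (low, high) range;
    each run draws every input uniformly from its range (decisionSupport
    supports richer distributions) and evaluates the outcome model.
    Returns the list of simulated outcomes.
    """
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        draw = {name: rng.uniform(lo, hi)
                for name, (lo, hi) in estimates.items()}
        outcomes.append(model(draw))
    return outcomes

# Hypothetical inputs: annual benefit and cost of a protection measure.
estimates = {"benefit": (50.0, 150.0), "cost": (20.0, 60.0)}
net = mc_decision(lambda x: x["benefit"] - x["cost"], estimates)
```

In decisionSupport the analogous pieces are the estimate table (read from the provided data table input) and the welfare function passed to decisionSupport(); the sketch above only mirrors the simulation loop, not the package's plotting or variable-importance output.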