Mankind is faced with many ongoing environmental challenges including climate change, losses in biodiversity, deforestation, increased soil erosion, and air and water pollution, to name but a few. As the world’s population continues to rise, there is increasing competition for land, more people are migrating to urban areas and a greater demand is placed on the Earth’s finite supply of natural resources. The Sustainable Development Goals (SDGs), which will be adopted by member states at the end of September 2015, represent a set of actions supported by numerous targets that all countries should strive to meet. A number of the SDGs cover key environmental issues, but they require science, technology and innovation to help reach these goals [1]. Spatial analysis represents a set of powerful methods that can support the development of solutions to the many pressing environmental problems that we face. It also has the potential to be a transformative technology in an age where information is increasingly geotagged, particularly with the rise of smartphones and citizens as sensors [2]. Thus, the focus of this Special Issue has been to examine what types of approaches are currently being used to solve different environmental problems.
The original expectation was that submissions to this Special Issue would exemplify cutting-edge research on advances in spatial analytical methods. Instead, the result has been a much stronger emphasis on the environmental problem and the emergence of rich environmental insights, made possible through the application of spatial analytical tools. Moreover, these insights have considerable value to many stakeholders, from urban planners to conservation managers to reservoir operators. All the papers emphasize how the resulting guidance can be used for decision-making. Thus, the environmental problem has driven the choice of tools from the spatial analytical toolkit. Many of the applied methods may once have been innovative, but they are now robust approaches that can be incorporated in complex workflows to solve many different problems. For example, the papers by Simpson and Wu [3] and Curtarelli et al. [4] both compare different methods of spatial interpolation, for sediment volume estimation and bathymetric mapping, respectively. Spatial interpolation is a commonly used method to estimate values across a surface, yet the wide variety of methods and parameter settings available means that the choice of method is not always straightforward. One paper finds that spline interpolation provides the best solution, while kriging is favored by the other. Trying to find a single best interpolation method for all problems is not realistic; the best choice is a function of sample size, sample spacing and other site-specific factors, which the authors acknowledge. With the current computational power of personal computers and the availability of cloud computing, comparing multiple methods to find the one that performs best at a specific location is an entirely feasible approach. The bathymetric map in Curtarelli et al. [4] is then used to derive area and volume curves as a function of water level, a variable that is easily measured, allowing reservoir managers to monitor hydropower generation capacity.
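As an illustration of how such a comparison can be automated, the short Python sketch below performs leave-one-out cross-validation of a thin-plate spline (via scipy) against ordinary kriging (via pykrige) on synthetic depth samples. It is not the authors' workflow; the sample points, variogram model and error metric are placeholder assumptions chosen only to show the mechanics.

```python
# Minimal sketch: leave-one-out cross-validation to compare two interpolators
# (thin-plate spline vs. ordinary kriging) on synthetic depth samples.
# Illustrative only -- not the published workflow or data.
import numpy as np
from scipy.interpolate import Rbf
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(42)
n = 120
x, y = rng.uniform(0, 1000, n), rng.uniform(0, 1000, n)
depth = 5 + 0.01 * x + 3 * np.sin(y / 150) + rng.normal(0, 0.3, n)  # synthetic bathymetry

def loocv_rmse(predict):
    """Leave-one-out RMSE for a callable predict(xt, yt, zt, xv, yv) -> zv_hat."""
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        errs.append(predict(x[mask], y[mask], depth[mask], x[i], y[i]) - depth[i])
    return float(np.sqrt(np.mean(np.square(errs))))

def spline_predict(xt, yt, zt, xv, yv):
    return float(Rbf(xt, yt, zt, function="thin_plate")(xv, yv))

def kriging_predict(xt, yt, zt, xv, yv):
    ok = OrdinaryKriging(xt, yt, zt, variogram_model="spherical")
    z_hat, _ = ok.execute("points", np.array([xv]), np.array([yv]))
    return float(z_hat[0])

print("thin-plate spline LOOCV RMSE:", loocv_rmse(spline_predict))
print("ordinary kriging  LOOCV RMSE:", loocv_rmse(kriging_predict))
```

The same loop can be repeated with different variogram models, spline settings or sample spacings, which is one way the best performer for a specific site can be identified empirically.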
Spatial interpolation is also used by Mobaied et al. [5] to create a soil depth map, which, together with soil type, slope and aspect, is used to investigate the effects of these factors on heathland stability. Using transition matrices to trace landscape change over time, and spatial methods to analyze the association between the aforementioned factors and the dynamics of heathland, has helped to unpack the complicated changes that have occurred in a section of the Fontainebleau Forest in France over the last 60 years. In particular, the study provides advice for land managers on selecting those heathland areas with the greatest potential for stability for subsequent conservation and restoration.
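For readers unfamiliar with transition matrices, the following sketch shows the basic bookkeeping on two hypothetical co-registered land-cover rasters; the class codes and arrays are illustrative and are not taken from the study.

```python
# Sketch: building a land-cover transition matrix from two co-registered
# categorical rasters (e.g., a historical vs. a recent classification).
import numpy as np

classes = {0: "heathland", 1: "forest", 2: "bare soil"}  # placeholder classes
k = len(classes)

# Two hypothetical classified rasters of the same extent and resolution.
rng = np.random.default_rng(0)
lc_t0 = rng.integers(0, k, size=(200, 200))
lc_t1 = rng.integers(0, k, size=(200, 200))

# Count cells moving from class i at time t0 to class j at time t1.
counts = np.zeros((k, k), dtype=int)
np.add.at(counts, (lc_t0.ravel(), lc_t1.ravel()), 1)

# Row-normalise to obtain transition probabilities (rows sum to 1).
probs = counts / counts.sum(axis=1, keepdims=True)
print(np.round(probs, 3))
```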
In the area of conservation, but focusing this time on birds, spatial statistics and geographically weighted regression are employed by Holloway and Miller [6] to examine how the scale at which an analysis is undertaken affects the results. The authors examine how hotspots and coldspots of species richness change as the data are aggregated to increasingly larger grid sizes, as well as how relationships between species richness and precipitation, temperature, elevation and a remotely sensed vegetation index change over increasing scales. The results show that, in some areas, the direction of the relationship changes entirely as scale changes, while other areas show stability across increasing scales. This has implications for management strategies that are applied based on these results and draws attention to the need to study ecological relationships at multiple scales.
The final two papers are focused on urban environmental issues. The paper by Moreno et al. [7] considers tree canopy cover, impervious/pervious surfaces and buildings in more than 500 schools in Los Angeles, which has implications for the development of sun safety policies. Digitizing this information from very high resolution imagery and applying some exploratory statistics, the authors showed that fewer trees and less favorable siting occurred more frequently in more deprived areas. The data were then used to create school site reports using a storytelling approach as a way of cleverly communicating the results and providing an early warning system that can be used in future sun safety planning. The paper by Mitsova [8] tackles a different urban problem: the likely impact of climate change on a watershed near Cincinnati, Ohio, using a coupling of models. A cellular automata-based urban growth model was used to predict change in urban area to 2030. Two different climate change scenarios were chosen, and the data were downscaled and then used in a hydrological model for the basin, examining the effects on the 100-year flood and on low flows. Hydrologic response probabilities were derived using Monte Carlo simulation to account for uncertainties related to the climate and hydrological models. The results showed an increasing likelihood of exceeding the 100-year flood and the potential for droughts in the summer, with lower river flows predicted as a result of climate change. The paper also considers the use of low impact development measures such as rainwater harvesting and porous pavements, and shows which measures can help to mitigate the impacts of climate change, providing potentially useful advice for planners.
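The exceedance calculation at the heart of such a Monte Carlo analysis can be sketched in a few lines, assuming placeholder distributions for the uncertain inputs; none of the numbers below come from the study.

```python
# Sketch: Monte Carlo estimate of the probability that a simulated peak flow
# exceeds the current 100-year flood, propagating assumed input uncertainty.
# All distributions and values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(7)
n_runs = 10_000

baseline_q100 = 850.0                              # hypothetical 100-year peak flow (m^3/s)
precip_scaling = rng.normal(1.10, 0.08, n_runs)    # uncertain future precipitation multiplier
model_error = rng.lognormal(0.0, 0.10, n_runs)     # uncertain rainfall-runoff response

# Toy response: peak flow scales more than linearly with precipitation.
simulated_peaks = baseline_q100 * precip_scaling ** 1.5 * model_error

p_exceed = np.mean(simulated_peaks > baseline_q100)
print(f"Estimated probability of exceeding the current 100-year flood: {p_exceed:.2f}")
```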
Another expectation was that the submissions to this Special Issue would use, or at least refer to, environmental “big data”. This term has become ubiquitous, and for environmental applications, one of the most relevant sources of big data is Earth Observation, particularly with the opening up of the Landsat archive [9] and the recently launched Sentinel-1 and Sentinel-2 satellites [10]. Yet what is really lacking are the in-situ data needed for calibration, validation and environmental modelling more generally [11,12]. Professional data collection in the field is costly, which limits the amount of data that can be collected. The paper by Simpson and Wu [3] tackles this issue head on by examining the effect of decreasing sample sizes on the accuracy of sediment volume estimates for a reservoir in South Dakota. They showed that there were few gains in accuracy to be realized above 50% of their collected sample size, which provides valuable guidance for future surveying in an environment of decreasing funding.
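A sketch of this kind of sample-thinning experiment is shown below, using synthetic survey points, repeated random subsampling and a simple gridded interpolation; it is not the authors' analysis, and the fractions and error figures it prints are illustrative only.

```python
# Sketch: how an estimated volume changes as the field sample is thinned,
# using repeated random subsampling and linear gridded interpolation.
# Synthetic data; the study used real reservoir survey points.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)
n = 400
x, y = rng.uniform(0, 500, n), rng.uniform(0, 500, n)
depth = 2 + 0.004 * x + 2 * np.exp(-((x - 250) ** 2 + (y - 250) ** 2) / 2e4)

gx, gy = np.meshgrid(np.linspace(0, 500, 100), np.linspace(0, 500, 100))
cell_area = (500 / 99) ** 2  # area of one interpolation grid cell

def volume(idx):
    """Interpolate depths from the sampled points and integrate over the grid."""
    grid = griddata((x[idx], y[idx]), depth[idx], (gx, gy), method="linear")
    return np.nansum(grid) * cell_area

full = volume(np.arange(n))
for frac in (0.25, 0.5, 0.75, 1.0):
    vols = [volume(rng.choice(n, int(frac * n), replace=False)) for _ in range(20)]
    err = 100 * abs(np.mean(vols) - full) / full
    print(f"{int(frac * 100):3d}% of samples: mean |error| vs full survey = {err:.1f}%")
```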
Spatial analysis will continue to play an important role in helping to solve many environmental problems. What these papers have shown is the benefits of these approaches in terms of gaining a deeper understanding of the problem space and how this knowledge can be translated into valuable guidance for decision-makers. Transforming this from valuable research to actual decision-making remains the real future challenge.