Results 1–7 of 7
A blackboard-based approach towards predictive analytics
in Proceedings of the AAAI Spring Symposium on Technosocial Predictive Analytics, 2009
Abstract

Cited by 2 (0 self)
Significant increase in collected data for analysis and the increased complexity of the reasoning process itself have made investigative analytical tasks more challenging. These tasks are time-critical and typically involve identifying and tracking multiple hypotheses: gathering evidence to validate the correct hypotheses and to eliminate the incorrect ones. In this paper we specifically address predictive tasks that are concerned with predicting future trends. We describe RESIN, an AI blackboard-based agent that leverages interactive visualization and mixed-initiative problem solving to enable analysts to explore and preprocess large amounts of data in order to perform predictive analysis.
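The blackboard architecture the abstract refers to can be sketched minimally as a shared store of hypotheses that independent knowledge sources post evidence to, with a control loop ranking hypotheses by support. The class and function names below (`Blackboard`, `post`, `run`) are illustrative assumptions, not RESIN's actual API.

```python
class Blackboard:
    """Shared workspace mapping each hypothesis to its supporting evidence.
    Names here are hypothetical; this only illustrates the pattern."""

    def __init__(self):
        self.hypotheses = {}  # hypothesis -> list of evidence items

    def post(self, hypothesis, evidence):
        self.hypotheses.setdefault(hypothesis, []).append(evidence)

    def best(self):
        # rank hypotheses by the amount of supporting evidence
        return max(self.hypotheses, key=lambda h: len(self.hypotheses[h]))


def run(blackboard, knowledge_sources, rounds=2):
    # control loop: each knowledge source reads the shared blackboard
    # and may contribute evidence toward one or more hypotheses
    for _ in range(rounds):
        for source in knowledge_sources:
            source(blackboard)
    return blackboard.best()


sources = [
    lambda b: b.post("H1", "sensor trend"),
    lambda b: b.post("H2", "news report"),
    lambda b: b.post("H1", "analyst note"),
]
winner = run(Blackboard(), sources)  # -> "H1" (4 evidence items vs. 2)
```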
unknown title
Abstract
Flood flows having given recurrence intervals or probabilities of exceedance, i.e. the design flows, are basic hydrological information required in designing culverts, bridges, and other hydraulic structures, as well as in assessing flood risk management. The latter is especially important because of the current implementation of the
Flood frequency analysis using Mathematica
© TÜBİTAK, doi:10.3906/muh10022
Abstract
This study analyzes flood frequencies using discharge data from 6 gaging stations in the Aji River basin in Iran. Eighteen different distributions are fitted to the maximum annual discharges from each of these stations, and parameters of these distributions are estimated using the method of maximum likelihood and the method of moments. Calculations are performed with Mathematica, a computer algebra system developed by Wolfram Research. The advantage of using this software is that symbolic, numerical, and graphical computations can be combined and all quantities can be accurately calculated; in particular, there is no need to resort to any approximate methods for the calculation of quantiles. There is a ready-to-use command for calculating quantiles from distributions that are built into Mathematica, while for other distributions they can be easily and accurately calculated by inverting the cumulative distribution functions or by solving nonlinear equations where the inversion is not possible. The best distribution is selected based on the root mean square error (RMSE), the coefficient of determination (R²), and the probability plot correlation coefficient (PPCC). Relations between the distributions' parameters and the area, average discharge, and time of concentration are explored. The complete Mathematica code and sample data files are included in
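The quantile-by-inversion idea the abstract describes can be sketched in a few lines: when a distribution has no closed-form inverse CDF, the design flow for a given non-exceedance probability is found by numerically solving CDF(x) = p. The sketch below uses the Gumbel (EV1) distribution, which is common in flood frequency work and, conveniently, also has a closed-form inverse for cross-checking; the location/scale values are illustrative, not fitted to real data.

```python
import math

def gumbel_cdf(x, mu, beta):
    """CDF of the Gumbel (EV1) distribution, widely used for annual maxima."""
    return math.exp(-math.exp(-(x - mu) / beta))

def quantile_by_inversion(cdf, p, lo, hi, tol=1e-10):
    """Invert a monotone CDF by bisection: find x such that cdf(x) = p.
    This mirrors the inversion approach described in the abstract."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# 100-year design flow: exceedance probability 0.01, i.e. non-exceedance 0.99
# mu and beta are illustrative values, not estimates from gauge data
mu, beta = 100.0, 20.0
q100 = quantile_by_inversion(lambda x: gumbel_cdf(x, mu, beta), 0.99, 0.0, 1000.0)

# cross-check against the closed-form Gumbel quantile: mu - beta * ln(-ln p)
exact = mu - beta * math.log(-math.log(0.99))
```

The same bisection works for any distribution whose CDF can be evaluated, which is the situation the abstract addresses for distributions lacking a built-in quantile command.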
Model complexity control for hydrologic prediction
Abstract
[1] A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore needed. We compare three model complexity control methods for hydrologic prediction, namely, cross-validation (CV), Akaike's information criterion (AIC), and structural risk minimization (SRM). Results show that simulation of water flow using non-physically-based models (polynomials in this case) leads to increasingly better calibration fits as the model complexity (polynomial order) increases. However, prediction uncertainty worsens for complex non-physically-based models because of overfitting of noisy data. Incorporation of physically based constraints into the model (e.g., a storage-discharge relationship) effectively bounds prediction uncertainty, even as the number of parameters increases. The conclusion is that overparameterization and equifinality do not lead to a continued increase in prediction uncertainty, as long as models are constrained by such physical principles. Complexity control of hydrologic models reduces parameter equifinality and identifies the simplest model that adequately explains the data, thereby providing a means of hydrologic generalization and classification. SRM is a promising technique for this purpose, as it (1) provides analytic upper bounds on prediction uncertainty, hence avoiding the computational burden of CV, and (2) extends the applicability of classic methods such as AIC to finite data. The main hurdle in applying SRM is the need for an a priori estimation of the complexity of the hydrologic model, as measured by its Vapnik-Chervonenkis (VC) dimension. Further research is needed in this area.
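The trade-off the abstract describes, calibration fit improving with polynomial order while the complexity penalty eventually dominates, can be made concrete with the least-squares form of AIC. The residual sums of squares below are made-up numbers chosen to exhibit that pattern, not results from the paper.

```python
import math

# Illustrative residual sums of squares (RSS) for polynomial models of
# increasing order fitted to n = 50 noisy observations. The values are
# hypothetical: fit keeps improving, but with diminishing returns.
n = 50
rss_by_order = {1: 40.0, 2: 18.0, 3: 16.5, 4: 16.2, 5: 16.1}

def aic(rss, n, k):
    """Akaike's information criterion for least-squares fits:
    n * ln(RSS / n) + 2k, where k is the number of free parameters."""
    return n * math.log(rss / n) + 2 * k

# a polynomial of order d has d + 1 coefficients (including the intercept)
scores = {d: aic(rss, n, d + 1) for d, rss in rss_by_order.items()}
best_order = min(scores, key=scores.get)  # order 3 wins despite worse RSS above it
```

Even though RSS decreases monotonically with order, the 2k penalty makes orders 4 and 5 score worse than order 3, which is exactly the overfitting control the abstract compares against CV and SRM.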
Analysis of Hydrological Drought Events in the Upper Tana Basin of Kenya
Abstract
Drought is a major environmental hazard which has serious implications for water management and environmental protection. This is especially so when unsustainable water management, as well as predicted climate change effects on droughts, could result in severe impacts on nature and society. Inefficient management of drought and water resources could put aquatic ecosystems under severe stress. The lack of adequate water availability in rivers during drought episodes leads to heavy overexploitation of the rivers and reservoirs, which significantly affects the survival of associated biological diversity. It is therefore essential to know the occurrence of drought events in river basins with a view to establishing and developing measures to minimize the socioeconomic and environmental impacts of drought in these areas. In this paper, drought duration and severity were examined in four homogeneous regions of the upper Tana basin. The homogeneous regions were established using principal component analysis of discharge data from twenty-two river gauge stations in the basin. The runs analysis technique was then applied to examine the drought duration and severity in the homogeneous regions of the basin. Results indicated that the mean drought duration varied from 4 to 11 months across the regions, whilst the standardized mean severity ranged from 0.63 to 3.89. Two of the regions experienced nearly the same standardized mean severity. Drought events occurred at times when the basin
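The runs analysis technique mentioned in the abstract can be sketched directly: a drought event is a maximal run of consecutive below-threshold values in the discharge series, its duration is the run length, and its severity is the cumulative deficit below the threshold. The flow values below are illustrative, not data from the Tana basin.

```python
def drought_runs(flows, threshold):
    """Runs analysis of a discharge series: return (duration, severity)
    for each maximal run of consecutive values below the threshold."""
    events = []
    duration, deficit = 0, 0.0
    for q in flows:
        if q < threshold:
            duration += 1
            deficit += threshold - q  # accumulate the deficit below threshold
        elif duration:
            events.append((duration, deficit))
            duration, deficit = 0, 0.0
    if duration:  # the series may end inside a drought
        events.append((duration, deficit))
    return events

# illustrative monthly flows and truncation threshold
flows = [5, 3, 2, 6, 7, 1, 2, 3, 8]
events = drought_runs(flows, threshold=4)
# -> [(2, 3.0), (3, 6.0)]: a 2-month event and a 3-month event
```

In practice the threshold is usually a flow percentile per region, and the mean duration and standardized severity reported in the abstract would be summary statistics over the events this routine extracts.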