Results 1 - 10 of 434,995
Using Loops in Decision-Theoretic Refinement Planners
 In Proc. 3rd Intl. Conf. on A.I. Planning Systems
, 1996
"... Classical AI planners use loops over subgoals to move a stack of blocks by repeatedly moving the top block. Probabilistic planners and reactive systems repeatedly try to pick up a block to increase the probability of success in an uncertain environment. These planners terminate a loop only when the ..."
Abstract

Cited by 7 (0 self)
the goal is achieved or when the probability of success has reached some threshold. The tradeoff between the cost of repeating a loop and the expected benefit is ignored. Decision-theoretic refinement planners take this tradeoff into account, but to date, have been limited to considering only finite length
Decision-Theoretic Refinement Planning: Principles and Application
, 1995
"... We present a general theory of action abstraction for reducing the complexity of decision-theoretic planning. We develop projection rules for abstract actions and prove our abstraction techniques to be correct. We present a planning algorithm that uses the abstraction theory to efficiently explore t ..."
Abstract

Cited by 12 (5 self)
the space of possible plans by eliminating suboptimal classes of plans without explicitly examining all plans in those classes. An instance of the algorithm has been implemented as the DRIPS decision-theoretic refinement planning system. We apply the planner to the problem of selecting the optimal test
Assessing coping strategies: A theoretically based approach
 Journal of Personality and Social Psychology
, 1989
"... We developed a multidimensional coping inventory to assess the different ways in which people respond to stress. Five scales (of four items each) measure conceptually distinct aspects of problem-focused coping (active coping, planning, suppression of competing activities, restraint coping, seeking ..."
Abstract

Cited by 610 (5 self)
of emotions, behavioral disengagement, mental disengagement). Study 1 reports the development of scale items. Study 2 reports correlations between the various coping scales and several theoretically relevant personality measures in an effort to provide preliminary information about the inventory
Theoretical improvements in algorithmic efficiency for network flow problems

, 1972
"... This paper presents new algorithms for the maximum flow problem, the Hitchcock transportation problem, and the general minimum-cost flow problem. Upper bounds on ... the numbers of steps in these algorithms are derived, and are shown to compare favorably with upper bounds on the numbers of steps req ..."
Abstract

Cited by 565 (0 self)
This paper presents new algorithms for the maximum flow problem, the Hitchcock transportation problem, and the general minimum-cost flow problem. Upper bounds on ... the numbers of steps in these algorithms are derived, and are shown to compare favorably with upper bounds on the numbers of steps required by earlier algorithms. First, the paper states the maximum flow problem, gives the Ford-Fulkerson labeling method for its solution, and points out that an improper choice of flow augmenting paths can lead to severe computational difficulties. Then rules of choice that avoid these difficulties are given. We show that, if each flow augmentation is made along an augmenting path having a minimum number of arcs, then a maximum flow in an n-node network will be obtained after no more than (n^3 - n)/4 augmentations; and then we show that if each flow change is chosen to produce a maximum increase in the flow value then, provided the capacities are integral, a maximum flow will be determined within at most 1 + log_{M/(M-1)} f*(t, s) augmentations, where f*(t, s) is the value of the maximum flow and M is the maximum number of arcs across a cut. Next a new algorithm is given for the minimum-cost flow problem, in which all shortest-path computations are performed on networks with all weights nonnegative. In particular, this
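The shortest-augmenting-path rule described in this abstract (the variant of the Ford-Fulkerson labeling method later known as Edmonds-Karp) can be sketched as follows. The adjacency-dict representation and the function name are illustrative assumptions for the sketch, not taken from the paper:

```python
from collections import deque

def max_flow_bfs(capacity, s, t):
    """Max flow by always augmenting along a path with a minimum
    number of arcs, found by BFS in the residual network.
    capacity: dict of dicts of residual capacities; mutated in place."""
    total = 0
    while True:
        # BFS from s: the first path that reaches t has fewest arcs.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in capacity.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:          # no augmenting path left: flow is maximum
            return total
        # Recover the path and its bottleneck capacity.
        path, v = [], t
        while parent[v] is not None:
            u = parent[v]
            path.append((u, v))
            v = u
        bottleneck = min(capacity[u][v] for u, v in path)
        # Push the bottleneck along the path; add reverse residual arcs.
        for u, v in path:
            capacity[u][v] -= bottleneck
            rev = capacity.setdefault(v, {})
            rev[u] = rev.get(u, 0) + bottleneck
        total += bottleneck
```

With integral capacities each augmentation strictly increases the flow, and choosing fewest-arc paths yields the polynomial bound on the number of augmentations stated above.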
Decision-Theoretic Planning: Structural Assumptions and Computational Leverage
 JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
, 1999
"... Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory and economics. While the assumptions and perspectives ..."
Abstract

Cited by 510 (4 self)
Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory and economics. While the assumptions and perspectives adopted in these areas often differ in substantial ways, many planning problems of interest to researchers in these fields can be modeled as Markov decision processes (MDPs) and analyzed using the techniques of decision theory. This paper presents an overview and synthesis of MDP-related methods, showing how they provide a unifying framework for modeling many classes of planning problems studied in AI. It also describes structural properties of MDPs that, when exhibited by particular classes of problems, can be exploited in the construction of optimal or approximately optimal policies or plans. Planning problems commonly possess structure in the reward and value functions used to de...
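As a minimal illustration of the MDP machinery this survey synthesizes, here is a sketch of value iteration on a finite MDP. The dict-based encoding of transitions and rewards is an assumption made for the example, not the paper's notation:

```python
def value_iteration(P, R, gamma=0.9, eps=1e-6):
    """Compute the optimal state values of a finite MDP.
    P[s][a]: list of (probability, next_state) pairs for action a in state s.
    R[s][a]: immediate reward for taking action a in state s."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            # Bellman backup: best one-step lookahead over actions.
            best = max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                       for a in P[s])
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:   # values have converged (to within eps)
            return V
```

A two-state example: in s0 the action 'go' moves to the absorbing state s1 with reward 1, so V(s0) converges to 1 and V(s1) to 0.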
Universals in the content and structure of values: theoretical advances and empirical tests in 20 countries
 ADVANCES IN EXPERIMENTAL SOCIAL PSYCHOLOGY
, 1992
"... ..."
Decision-Theoretic Refinement Planning Using Inheritance Abstraction
"... Peter Haddawy, Meliani Suwandi, Department of Electrical Engineering and Computer Science, University of Wisconsin-Milwaukee, PO Box 784, Milwaukee, WI 53201. 1 Introduction. Given a probabilistic model of the world and of available actions and a utility function representing the planner's ob ..."
Abstract
Peter Haddawy, Meliani Suwandi, Department of Electrical Engineering and Computer Science, University of Wisconsin-Milwaukee, PO Box 784, Milwaukee, WI 53201. 1 Introduction. Given a probabilistic model of the world and of available actions and a utility function representing the planner's objectives, we wish to find the plan that maximizes expected utility. Finding the optimal plan requires comparing the expected utilities of all possible plans. Doing this explicitly would be computationally prohibitive in all but the smallest of domains. Thus we must find a way to compare partial plans in such a way that we can eliminate some partial plans before fully elaborating them. Such comparisons require a definition of partial plan that allows us to determine that all completions of one plan are less preferred than all completions of another. We achieve such a definition of partial plan by structuring actions into an abstraction hierarchy and by restricting the planner to using only refineme...
Bundle Adjustment - A Modern Synthesis
 VISION ALGORITHMS: THEORY AND PRACTICE, LNCS
, 2000
"... This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics c ..."
Abstract

Cited by 555 (12 self)
This paper is a survey of the theory and methods of photogrammetric bundle adjustment, aimed at potential implementors in the computer vision community. Bundle adjustment is the problem of refining a visual reconstruction to produce jointly optimal structure and viewing parameter estimates. Topics
Approximate Signal Processing
, 1997
"... It is increasingly important to structure signal processing algorithms and systems to allow for trading off between the accuracy of results and the utilization of resources in their implementation. In any particular context, there are typically a variety of heuristic approaches to managing these tra ..."
Abstract

Cited by 516 (2 self)
these tradeoffs. One of the objectives of this paper is to suggest that there is the potential for developing a more formal approach, including utilizing current research in Computer Science on Approximate Processing and one of its central concepts, Incremental Refinement. Toward this end, we first summarize a
Centrality in social networks conceptual clarification
 Social Networks
, 1978
"... The intuitive background for measures of structural centrality in social networks is reviewed and existing measures are evaluated in terms of their consistency with intuitions and their interpretability. Three distinct intuitive conceptions of centrality are uncovered and existing measures are refi ..."
Abstract

Cited by 1035 (2 self)
are refined to embody these conceptions. Three measures are developed for each concept, one absolute and one relative measure of the centrality of positions in a network, and one reflecting the degree of centralization of the entire network. The implications of these measures for the experimental study
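The three kinds of measure this abstract describes (an absolute measure, a relative measure, and a network-level centralization index) can be illustrated for degree centrality. The function name and graph encoding are illustrative assumptions; the normalizations follow the standard Freeman-style definitions:

```python
def degree_measures(nodes, edges):
    """Degree centrality of an undirected graph: absolute degree,
    relative (normalized) degree, and whole-network centralization."""
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    # Relative centrality: degree divided by the maximum possible (n - 1).
    relative = {v: d / (n - 1) for v, d in deg.items()}
    # Centralization: how far the network departs from the most
    # centralized structure on n nodes (a star), which attains the
    # maximum possible sum of differences, (n - 1)(n - 2).
    dmax = max(deg.values())
    centralization = sum(dmax - d for d in deg.values()) / ((n - 1) * (n - 2))
    return deg, relative, centralization
```

On a star graph the center has relative centrality 1 and the centralization index is exactly 1, its maximum.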