Experimental Analysis of Heuristics for the STSP
 Local Search in Combinatorial Optimization
, 2001
Abstract

Cited by 66 (1 self)
In this and the following chapter, we consider what approaches one should take when one is confronted with a real-world application of the TSP. What algorithms should be used under which circumstances? We ...
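The chapter surveys local-search heuristics for the TSP. As an illustration only (not the chapter's own code; all names are hypothetical), a minimal sketch of the classical 2-opt move, which reverses a tour segment whenever doing so shortens the tour:

```python
import math
import random

def tour_length(points, tour):
    """Total length of the closed tour visiting `points` in `tour` order."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    """2-opt local search: replace edges (a,b),(c,d) by (a,c),(b,d) and
    reverse the segment between them whenever that shortens the tour."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # reversing the whole interior changes nothing
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                delta = (math.dist(points[a], points[c])
                         + math.dist(points[b], points[d])
                         - math.dist(points[a], points[b])
                         - math.dist(points[c], points[d]))
                if delta < -1e-12:  # strictly improving move
                    tour[i + 1:j + 1] = tour[i + 1:j + 1][::-1]
                    improved = True
    return tour

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(25)]
improved_tour = two_opt(pts, list(range(len(pts))))
```

The run stops at a 2-opt local optimum, so the result is never longer than the identity tour it starts from.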
Salient Closed Boundary Extraction with Ratio Contour
 IEEE Trans. on Pattern Analysis and Machine Intelligence
, 2005
Abstract

Cited by 60 (15 self)
We present ratio contour, a novel graph-based method for extracting salient closed boundaries from noisy images. This method operates on a set of boundary fragments that are produced by edge detection. Boundary extraction identifies a subset of these fragments and connects them sequentially to form a closed boundary with the largest saliency. We encode the Gestalt laws of proximity and continuity in a novel boundary-saliency measure based on the relative gap length and average curvature when connecting fragments to form a closed boundary. This new measure attempts to remove a possible bias toward short boundaries. We present a polynomial-time algorithm for finding the most-salient closed boundary. We also present supplementary preprocessing steps that facilitate the application of ratio contour to real images. We compare ratio contour to two closely related methods for extracting closed boundaries: Elder and Zucker's method based on the shortest-path algorithm and Williams and Thornber's method based on spectral analysis and a strongly-connected-components algorithm. This comparison involves both theoretical analysis and experimental evaluation on both synthesized data and real images.
Multilevel Refinement for Combinatorial Optimisation Problems
 SE10 9LS
, 2001
Abstract

Cited by 54 (5 self)
We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found (sometimes for the original problem, sometimes the coarsest) and then iteratively refined at each level. As a general solution strategy, the multilevel paradigm has been in use for many years and has been applied to many problem areas (most notably in the form of multigrid techniques). However, with the exception of the graph partitioning problem, multilevel techniques have not been widely applied to combinatorial optimisation problems. In this paper we address the issue of multilevel refinement for such problems and, with the aid of examples and results in graph partitioning, graph colouring and the travelling salesman problem, make a case for its use as a metaheuristic. The results provide compelling evidence that, although the multilevel framework cannot be considered as a panacea for combinatorial problems, it can provide an extremely useful addition to the combinatorial optimisation toolkit. We also give a possible explanation for the underlying process and extract some generic guidelines for its future use on other combinatorial problems.
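The paradigm described above (recursively coarsen, solve the coarsest level, then extend and refine back up the hierarchy) can be sketched on a toy number-partitioning problem. This is an illustration only, not the paper's implementation, and all names are hypothetical:

```python
import random

def coarsen(items):
    """Merge consecutive pairs of items; record which children each
    coarse item came from, so solutions can be extended back later."""
    merged, mapping = [], []
    for i in range(0, len(items) - 1, 2):
        merged.append(items[i] + items[i + 1])
        mapping.append((i, i + 1))
    if len(items) % 2:
        merged.append(items[-1])
        mapping.append((len(items) - 1,))
    return merged, mapping

def refine(items, assign):
    """Local search: flip an item's bin while that reduces imbalance."""
    def imbalance(a):
        s = [0, 0]
        for x, b in zip(items, a):
            s[b] += x
        return abs(s[0] - s[1])
    improved = True
    while improved:
        improved = False
        for i in range(len(items)):
            flipped = assign[:]
            flipped[i] ^= 1
            if imbalance(flipped) < imbalance(assign):
                assign, improved = flipped, True
    return assign

def multilevel_partition(items, min_size=2):
    """Multilevel scheme: coarsen until small, solve, extend, refine."""
    if len(items) <= min_size:
        return refine(items, [i % 2 for i in range(len(items))])
    merged, mapping = coarsen(items)
    coarse_assign = multilevel_partition(merged, min_size)
    assign = [0] * len(items)
    for parent, children in enumerate(mapping):  # extend: inherit parent's bin
        for c in children:
            assign[c] = coarse_assign[parent]
    return refine(items, assign)                 # refine at this level

random.seed(1)
data = [random.randint(1, 100) for _ in range(16)]
assign = multilevel_partition(data)
```

At a single-flip local optimum the imbalance is at most the largest item, since otherwise moving some item from the heavier bin would reduce it.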
Sensor Placement in Municipal Water Networks
 J. Water
, 2003
Abstract

Cited by 50 (11 self)
We present a mixed-integer programming (MIP) formulation for sensor placement optimization in municipal water distribution systems that includes the temporal characteristics of contamination events and their impacts. Typical network water quality simulations track contaminant concentration and movement over time, computing contaminant concentration time-series for each junction. Given this information, we can compute the impact of a contamination event over time and determine affected locations. This process quantifies the benefits of sensing contamination at different junctions in the network. Ours is the first MIP model to base sensor placement decisions on such data, compromising over many individual contamination events. The MIP formulation is mathematically equivalent to the well-known p-median facility location ...
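The abstract notes the MIP is equivalent to p-median facility location. As an illustration only (a greedy p-median heuristic, not the paper's exact MIP; the impact matrix below is hypothetical):

```python
def greedy_p_median(dist, p):
    """Greedy p-median heuristic: repeatedly open the facility (sensor
    location) that most reduces total assignment cost, where dist[i][j]
    is the cost of serving demand point i from a facility at j."""
    n = len(dist)
    big = max(max(row) for row in dist) + 1.0  # "unserved" sentinel cost
    open_set = []
    nearest = [big] * n                        # cost of nearest open facility
    for _ in range(p):
        best_j, best_gain = None, -1.0
        for j in range(n):
            if j in open_set:
                continue
            # total cost reduction if we open facility j
            gain = sum(max(0.0, nearest[i] - dist[i][j]) for i in range(n))
            if gain > best_gain:
                best_j, best_gain = j, gain
        open_set.append(best_j)
        nearest = [min(nearest[i], dist[i][best_j]) for i in range(n)]
    return open_set, sum(nearest)

# Hypothetical 4-junction impact matrix (symmetric for simplicity)
impact = [[0, 2, 9, 5],
          [2, 0, 4, 7],
          [9, 4, 0, 1],
          [5, 7, 1, 0]]
sensors, cost = greedy_p_median(impact, 2)
```

Greedy is only a heuristic; the paper's MIP solves the placement to optimality over many contamination scenarios.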
Analysis and Approximation of Optimal Co-Scheduling on Chip Multiprocessors
, 2008
Abstract

Cited by 45 (11 self)
Cache sharing among processors is important for Chip Multiprocessors to reduce inter-thread latency, but also brings cache contention, degrading program performance considerably. Recent studies have shown that job co-scheduling can effectively alleviate the contention, but it remains an open question how to efficiently find optimal co-schedules. Solving the question is critical for determining the potential of a co-scheduling system. This paper presents a theoretical analysis of the complexity of co-scheduling, proving its NP-completeness. Furthermore, for a special case when there are two sharers per chip, we propose an algorithm that finds the optimal co-schedules in polynomial time. For more complex cases, we design and evaluate a sequence of approximation algorithms, among which, the hierarchical matching algorithm produces near-optimal schedules and shows good scalability. This study facilitates the evaluation of co-scheduling systems, as well as offers some techniques directly usable in proactive job co-scheduling.
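For the two-sharers-per-chip case, finding an optimal co-schedule amounts to pairing jobs so that total degradation is minimal. The paper achieves this in polynomial time; the brute-force enumeration below is only a naive baseline for small job counts, with a hypothetical degradation matrix:

```python
def optimal_pairs(degradation):
    """Exhaustively enumerate all pairings of jobs onto two-sharer chips
    and return the pairing minimising total degradation, where
    degradation[i][j] is the combined slowdown when jobs i and j share
    a chip. Exponential in the number of jobs; small inputs only."""
    best = [float("inf"), None]

    def recurse(remaining, pairs, cost):
        if not remaining:
            if cost < best[0]:
                best[0], best[1] = cost, list(pairs)
            return
        i = remaining[0]
        for j in remaining[1:]:           # pair job i with each candidate j
            rest = [k for k in remaining if k not in (i, j)]
            recurse(rest, pairs + [(i, j)], cost + degradation[i][j])

    recurse(list(range(len(degradation))), [], 0.0)
    return best[1], best[0]

# Hypothetical symmetric slowdown matrix for 4 jobs
slowdown = [[0, 3, 1, 4],
            [3, 0, 2, 5],
            [1, 2, 0, 6],
            [4, 5, 6, 0]]
pairs, total = optimal_pairs(slowdown)
```

A polynomial-time alternative, as the abstract suggests, is to cast the pairing as a minimum-weight perfect matching problem.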
Energy-aware secure multicast communication in ad hoc networks using geographic location information
 IN PROCEEDINGS OF INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP’03
, 2003
Abstract

Cited by 44 (9 self)
The problem of securing multicast communications in an energy-constrained ad hoc network requires the efficient management of cryptographic quantities. We show that existing efficient key distribution techniques for wired networks that rely on logical hierarchies are extremely energy inefficient. We also show that the consideration of the physical location of the members is critical for developing energy-efficient key distribution schemes. By exploiting the spatial correlation between the members of the multicast group, we construct an energy-aware key distribution scheme. We present simulation results to illustrate the improvements achieved by our proposed algorithm.
Dynamic programming approximations for a stochastic inventory routing problem
 Transportation Science
, 2004
Abstract

Cited by 28 (3 self)
This work is motivated by the need to solve the inventory routing problem when implementing a business practice called vendor managed inventory replenishment (VMI). With VMI, vendors monitor their customers' inventories, and decide when and how much inventory should be replenished at each customer. The inventory routing problem attempts to coordinate inventory replenishment and transportation in such a way that the cost is minimized over the long run. We formulate a Markov decision process model of the stochastic inventory routing problem, and propose approximation methods to find good solutions with reasonable computational effort. We indicate how the proposed approach can be used for other Markov decision processes involving the control of multiple resources. (Supported by the National Science Foundation under grant DMI-9875400.)
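The abstract casts the problem as a Markov decision process. As a toy illustration only (not the paper's model; all parameters are hypothetical), value iteration on a single-location, lost-sales inventory MDP:

```python
def value_iteration(max_inv, demand_probs, order_cost, hold_cost,
                    shortage_cost, gamma=0.9, tol=1e-8):
    """Value iteration for a toy inventory MDP. State = inventory on hand;
    action = order-up-to level; demand_probs[d] = probability of demand d;
    unmet demand is lost and penalised. Returns the optimal cost-to-go."""
    V = [0.0] * (max_inv + 1)
    while True:
        newV = []
        for s in range(max_inv + 1):
            best = float("inf")
            for target in range(s, max_inv + 1):     # order up to `target`
                cost = order_cost if target > s else 0.0
                for d, p in enumerate(demand_probs):
                    left = max(target - d, 0)        # inventory carried over
                    unmet = max(d - target, 0)       # lost sales
                    cost += p * (hold_cost * left + shortage_cost * unmet
                                 + gamma * V[left])
                best = min(best, cost)
            newV.append(best)
        if max(abs(a - b) for a, b in zip(newV, V)) < tol:
            return newV                              # converged
        V = newV

# Hypothetical parameters: capacity 3, demand 0 or 1 with equal probability
V = value_iteration(3, [0.5, 0.5], order_cost=2.0, hold_cost=1.0,
                    shortage_cost=10.0)
```

The paper's contribution is precisely that such exact dynamic programming does not scale to many customers plus routing, motivating its approximation methods.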
A Linear Time Approximation Algorithm for Weighted Matchings in Graphs
, 2003
Abstract

Cited by 25 (3 self)
Approximation algorithms have so far mainly been studied for problems that are not known to have polynomial-time algorithms for solving them exactly. Here we propose an approximation algorithm for the weighted matching problem in graphs, which can be solved exactly in polynomial time. The weighted matching problem is to find a matching in an edge-weighted graph that has maximum weight. The first polynomial-time algorithm for this problem was given by Edmonds in 1965. The fastest known algorithm for the weighted matching problem has a running time of O(nm + n² log n). Many real-world problems require graphs of such large size that this running time is too costly. Therefore there is considerable need for faster approximation algorithms for the weighted matching problem. We present a linear-time approximation algorithm for the weighted matching problem with a performance ratio arbitrarily close to 2/3.
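The paper's linear-time 2/3-approximation is more involved; shown below, purely as a point of comparison, is the classical greedy 1/2-approximation it improves upon (a sketch, not the paper's algorithm; note that sorting makes this O(m log m), not linear):

```python
def greedy_matching(edges):
    """Classical greedy 1/2-approximation for maximum-weight matching:
    scan edges in order of decreasing weight and keep an edge whenever
    both of its endpoints are still unmatched. `edges` is a list of
    (weight, u, v) tuples."""
    matched, matching, weight = set(), [], 0.0
    for w, u, v in sorted(edges, reverse=True):  # heaviest edge first
        if u not in matched and v not in matched:
            matched.update((u, v))
            matching.append((u, v))
            weight += w
    return matching, weight

# Path a-b-c-d: greedy takes the heavy middle edge and blocks the rest,
# finding weight 3 where the optimum {(a,b),(c,d)} has weight 4.
edges = [(2, 'a', 'b'), (3, 'b', 'c'), (2, 'c', 'd')]
matching, weight = greedy_matching(edges)
```

Every edge the greedy algorithm picks blocks at most two optimal edges of no greater weight, which yields the 1/2 guarantee.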
Efficient Exact Inference in Planar Ising Models
Abstract

Cited by 23 (0 self)
We give polynomial-time algorithms for the exact computation of lowest-energy states, worst margin violators, partition functions, and marginals in certain binary undirected graphical models. Our approach provides an interesting alternative to the well-known graph cut paradigm in that it does not impose any submodularity constraints; instead we require planarity to establish a correspondence with perfect matchings in an expanded dual graph. Maximum-margin parameter estimation for a boundary detection task shows our approach to be efficient and effective.