Results 1-10 of 1,211,044
Random minimum length spanning trees in regular graphs
"... Consider a connected r-regular n-vertex graph G with random independent edge lengths, each uniformly distributed on (0,1). Let mst(G) be the expected length of a minimum spanning tree. We show that mst(G) can be estimated quite accurately under two distinct circumstances. Firstly, if r is large and ..."
Abstract
Cited by 21 (9 self)
A Note on Random Minimum Length Spanning Trees
 JOURNAL OF COMBINATORICS
, 2000
"... Consider a connected r-regular n-vertex graph G with random independent edge lengths, each uniformly distributed on [0, 1]. Let mst(G) be the expected length of a minimum spanning tree. We show in this paper that if G is sufficiently highly edge-connected then the expected length of a minimum sp ..."
Abstract
Cited by 10 (4 self)
A note on random minimum length spanning trees
 The Electronic Journal of Combinatorics
, 2000
"... Consider a connected r-regular n-vertex graph G with random independent edge lengths, each uniformly distributed on [0,1]. Let mst(G) be the expected length of a minimum spanning tree. We show in this paper that if G is sufficiently highly edge-connected then the expected length of a minimum spannin ..."
Abstract
On Random Minimum Length Spanning Trees
 COMBINATORICA 9 (4) (1989) 363-374, Akadémiai Kiadó / Springer-Verlag
, 1988
"... We extend and strengthen the result that, in the complete graph K_n with independent random edge lengths uniformly distributed on [0, 1], the expected length of the minimum spanning tree tends to ζ(3) as n → ∞. In particular, if K_n is replaced by the complete bipartite graph K_{n,n} then there is a correspo ..."
Abstract
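The limiting constant in this classical result, ζ(3) ≈ 1.202 by Frieze's theorem for the complete graph, is easy to check empirically. The sketch below is an illustration added here, not code from any of the listed papers: it runs a dense Prim's algorithm on K_n with freshly drawn Uniform(0,1) edge weights and compares the average MST length to ζ(3).

```python
import random

def mst_length_complete(n, rng):
    """Length of the MST of K_n with i.i.d. Uniform(0,1) edge weights.

    Dense Prim's algorithm; each edge weight is drawn lazily the first
    time its edge crosses the tree cut, which for a complete graph is
    equivalent to fixing all weights up front.
    """
    in_tree = [False] * n
    dist = [float("inf")] * n
    dist[0] = 0.0
    total = 0.0
    for _ in range(n):
        # cheapest vertex outside the tree joins next
        u = min((v for v in range(n) if not in_tree[v]), key=dist.__getitem__)
        in_tree[u] = True
        total += dist[u]
        for v in range(n):
            if not in_tree[v]:
                w = rng.random()  # weight of edge (u, v), drawn exactly once
                if w < dist[v]:
                    dist[v] = w
    return total

rng = random.Random(0)
n, trials = 200, 20
est = sum(mst_length_complete(n, rng) for _ in range(trials)) / trials
zeta3 = sum(k ** -3 for k in range(1, 10_000))  # ζ(3) ≈ 1.2021
print(f"average MST length on K_{n}: {est:.3f}, ζ(3) ≈ {zeta3:.4f}")
```

Already at n = 200 the simulated average sits close to ζ(3); the papers above quantify how fast this convergence happens and how it changes for other graph families.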
A distributed algorithm for minimum-weight spanning trees
, 1983
"... A distributed algorithm is presented that constructs the minimum-weight spanning tree in a connected undirected graph with distinct edge weights. A processor exists at each node of the graph, knowing initially only the weights of the adjacent edges. The processors obey the same algorithm and exchange ..."
Abstract
Cited by 443 (3 self)
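The fragment-merging principle behind this distributed algorithm, in which each fragment repeatedly joins along its minimum-weight outgoing edge (safe whenever edge weights are distinct), is the same idea as Borůvka's algorithm. A sequential sketch of that principle on a hypothetical toy edge list follows; this is not the distributed protocol itself, only the merging rule it is built on.

```python
class DSU:
    """Union-find structure tracking which fragment each vertex belongs to."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        self.parent[ra] = rb
        return True

def boruvka_mst(n, edges):
    """Each round, every fragment selects its minimum-weight outgoing edge;
    selected edges merge fragments. With distinct weights this yields the
    unique MST, and the number of fragments at least halves per round."""
    dsu, mst = DSU(n), []
    fragments = n
    while fragments > 1:
        best = {}  # fragment root -> cheapest edge leaving that fragment
        for w, u, v in edges:
            ru, rv = dsu.find(u), dsu.find(v)
            if ru != rv:
                for r in (ru, rv):
                    if r not in best or w < best[r][0]:
                        best[r] = (w, u, v)
        for w, u, v in best.values():
            if dsu.union(u, v):
                mst.append((w, u, v))
                fragments -= 1
    return mst

# Hypothetical 5-vertex graph with distinct edge weights.
edges = [(1.0, 0, 1), (2.0, 1, 2), (3.0, 0, 2), (4.0, 2, 3), (5.0, 3, 4), (6.0, 1, 4)]
mst = boruvka_mst(5, edges)
print(sorted(mst))  # 4 edges, total weight 12.0
```

Distinct weights matter: they make the minimum outgoing edge of every fragment unique, so concurrent merge decisions by independent fragments can never form a cycle.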
Inducing Features of Random Fields
 IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
, 1997
"... We present a technique for constructing random fields from a set of training samples. The learning paradigm builds increasingly complex fields by allowing potential functions, or features, that are supported by increasingly large subgraphs. Each feature has a weight that is trained by minimizing the ..."
Abstract

Cited by 664 (14 self)
introduced in this paper differ from those common to much of the computer vision literature in that the underlying random fields are non-Markovian and have a large number of parameters that must be estimated. Relations to other learning approaches, including decision trees, are given. As a demonstration
Z-Tree: Zurich Toolbox for Ready-made Economic Experiments, Working paper No
, 1999
"... 2.2.2 Startup of the Experimenter PC ... 9; 2.2.3 Startup of the Subject PCs ... 9 ..."
Abstract
Cited by 1956 (33 self)
Minimum Error Rate Training in Statistical Machine Translation
, 2003
"... Often, the training procedure for statistical machine translation models is based on maximum likelihood or related criteria. A general problem of this approach is that there is only a loose relation to the final translation quality on unseen text. In this paper, we analyze various training criteria which directly optimize translation quality. ..."
Abstract
Cited by 663 (7 self)
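The criterion this abstract describes, choosing model weights so that the highest-scoring candidate per sentence minimizes corpus-level error, can be illustrated with a single-weight grid search over an n-best list. The data below is hypothetical, and grid search is a simplification: Och's actual method performs an exact line search exploiting that the error is piecewise constant in the weight.

```python
# Each source sentence has candidate translations, each a tuple of two
# model feature scores (f1, f2) and a per-candidate error count.
nbest = [
    [(0.9, 0.1, 2), (0.1, 0.9, 0)],   # sentence 1: the high-f2 candidate is correct
    [(0.8, 0.2, 0), (0.3, 0.6, 3)],   # sentence 2: the high-f1 candidate is correct
]

def corpus_error(lam):
    """Decode with score = lam*f1 + (1-lam)*f2; sum errors of the argmax picks."""
    total = 0
    for cands in nbest:
        best = max(cands, key=lambda c: lam * c[0] + (1 - lam) * c[1])
        total += best[2]
    return total

# Grid search over the interpolation weight: the MERT idea in miniature.
best_lam = min((i / 100 for i in range(101)), key=corpus_error)
print(best_lam, corpus_error(best_lam))
```

Note that neither feature alone gets both sentences right; only an intermediate weight drives the corpus error to zero, which is exactly the "loose relation" between per-feature likelihood and final translation quality that the paper targets.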