Results 1–10 of 3,741
Wireless Communications, 2005
Cited by 1129 (32 self)
Abstract: Copyright © 2005 by Cambridge University Press. This material is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University ...

Synchronization and linearity: an algebra for discrete event systems, 2001
Cited by 369 (11 self)
Abstract: The first edition of this book was published in 1992 by Wiley (ISBN 0 471 93609 X). Since this book is now out of print, and to answer the request of several colleagues, the authors have decided to make it available freely on the Web, while retaining the copyright, for the benefit of the scientific community. Copyright Statement: This electronic document is in PDF format. One needs Acrobat Reader (available freely for most platforms from the Adobe web site) to benefit from the full interactive machinery: using the package hyperref by Sebastian Rahtz, the table of contents and all LaTeX cross-references are automatically converted into clickable hyperlinks, bookmarks are generated automatically, etc. So, do not hesitate to click on references to equation or section numbers, on items of the table of contents and of the index, etc. One may freely use and print this document for one's own purpose or even distribute it freely, but not commercially, provided it is distributed in its entirety and without modifications, including this preface and copyright statement. Any use of the contents should be acknowledged according to the standard scientific practice.

Codes and Decoding on General Graphs, 1996
Cited by 359 (1 self)
Abstract: Iterative decoding techniques have become a viable alternative for constructing high-performance coding systems. In particular, the recent success of turbo codes indicates that performance close to the Shannon limit may be achieved. In this thesis, it is shown that many iterative decoding algorithms are special cases of two generic algorithms, the min-sum and sum-product algorithms, which also include non-iterative algorithms such as Viterbi decoding. The min-sum and sum-product algorithms are developed and presented as generalized trellis algorithms, where the time axis of the trellis is replaced by an arbitrary graph, the "Tanner graph". With cycle-free Tanner graphs, the resulting decoding algorithms (e.g., Viterbi decoding) are maximum-likelihood but suffer from exponentially increasing complexity. Iterative decoding occurs when the Tanner graph has cycles (e.g., turbo codes); the resulting algorithms are in general suboptimal, but significant complexity reductions are possible compared to the cycle-free case. Several performance estimates for iterative decoding are developed, including a generalization of the union bound used with Viterbi decoding and a characterization of errors that are uncorrectable after infinitely many decoding iterations.

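On a cycle-free, chain-structured graph the min-sum algorithm reduces to Viterbi decoding, as the abstract notes. A minimal sketch of that special case, with invented state and cost tables (the thesis itself develops the general graph formulation, not this toy):

```python
# Minimal Viterbi decoder as an instance of the min-sum algorithm on a
# chain (cycle-free) graph. The cost tables are illustrative inventions:
# obs_costs[t][s] is the local cost of state s at time t, and
# trans_costs[p][s] is the cost of moving from state p to state s.

def viterbi_min_sum(obs_costs, trans_costs):
    """Return the minimum-total-cost state sequence (maximum-likelihood
    path when costs are negative log-likelihoods)."""
    n_states = len(obs_costs[0])
    cost = list(obs_costs[0])  # best cumulative cost ending in each state
    back = []                  # back[t][s] = best predecessor of s at time t+1
    for t in range(1, len(obs_costs)):
        new_cost, ptr = [], []
        for s in range(n_states):
            # min-sum message: minimize over predecessors, add local costs
            best_prev = min(range(n_states),
                            key=lambda p: cost[p] + trans_costs[p][s])
            new_cost.append(cost[best_prev] + trans_costs[best_prev][s]
                            + obs_costs[t][s])
            ptr.append(best_prev)
        cost = new_cost
        back.append(ptr)
    # Trace back from the cheapest final state
    s = min(range(n_states), key=lambda x: cost[x])
    path = [s]
    for ptr in reversed(back):
        s = ptr[s]
        path.append(s)
    return path[::-1]
```

Replacing `min`/`+` with `+`/`×` of probabilities in the same structure would give the sum-product (forward) recursion, which is exactly the relationship between the two generic algorithms the thesis describes.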
Software unit test coverage and adequacy, ACM Computing Surveys, 1997
Cited by 351 (8 self)
Abstract: Objective measurement of test quality is one of the key issues in software testing. It has been a major research focus for the last two decades. Many test criteria have been proposed and studied for this purpose. Various kinds of rationales have been presented in support of one criterion or another. We survey the research work in ...

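One of the simplest adequacy criteria covered by such surveys is statement coverage: which lines of the unit under test a test suite actually executes. A small sketch of measuring it with Python's standard tracing hook; `classify` and the helper are invented examples, not taken from the survey:

```python
import sys

# Sketch of statement-coverage measurement via sys.settrace.
# `classify` is a hypothetical unit under test with two branches.

def classify(x):
    if x < 0:
        return "negative"
    return "non-negative"

def executed_lines(func, test_inputs):
    """Run func over each argument tuple in test_inputs and return the
    set of line numbers of func that were executed."""
    code = func.__code__
    hit = set()

    def tracer(frame, event, arg):
        # Record 'line' events only for frames running the target function.
        if event == "line" and frame.f_code is code:
            hit.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        for args in test_inputs:
            func(*args)
    finally:
        sys.settrace(None)
    return hit
```

A suite containing only non-negative inputs leaves the `"negative"` branch uncovered; adding a negative input strictly enlarges the executed-line set, which is the kind of inadequacy a coverage criterion is meant to expose.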
The price of stability for network design with fair cost allocation, In Proceedings of the 45th Annual Symposium on Foundations of Computer Science (FOCS), 2004
Cited by 279 (27 self)
Abstract: Network design is a fundamental problem for which it is important to understand the effects of strategic behavior. Given a collection of self-interested agents who want to form a network connecting certain endpoints, the set of stable solutions (the Nash equilibria) may look quite different from the centrally enforced optimum. We study the quality of the best Nash equilibrium, and refer to the ratio of its cost to the optimum network cost as the price of stability. The best Nash equilibrium solution has a natural meaning of stability in this context: it is the optimal solution that can be proposed from which no user will defect. We consider the price of stability for network design with respect to one of the most widely studied protocols for network cost allocation, in which the cost of each edge is divided equally between users whose connections make use of it; this fair-division scheme can be derived from the Shapley value, and has a number of basic economic motivations. We show that the price of stability for network design with respect to this fair cost allocation is O(log k), where k is the number of users, and that a good Nash equilibrium can be achieved via best-response dynamics in which users iteratively defect from a starting solution. This establishes that the fair cost allocation protocol is in fact a useful mechanism for inducing strategic behavior to form near-optimal equilibria. We discuss connections to the class of potential games defined by Monderer and Shapley, and extend our results to cases in which users are seeking to balance network design costs with latencies in the constructed network, with stronger results when the network has only delays and no construction costs. We also present bounds on the convergence time of best-response dynamics, and discuss extensions to a weighted game.

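The two ingredients in the abstract, equal-split (Shapley) edge-cost sharing and best-response dynamics, fit in a short simulation. The network below is an invented two-player toy, not an instance from the paper: each player chooses between a private edge and a shared edge whose cost is split among its users.

```python
# Sketch of best-response dynamics under fair (equal-split) cost allocation.
# A strategy is a set of edge names; an edge's cost is divided equally
# among all players whose chosen path uses it. The toy network is invented.

def player_cost(choice, others, edge_cost):
    """Cost to a player choosing `choice` while the others' choices are fixed."""
    total = 0.0
    for e in choice:
        users = 1 + sum(1 for o in others if e in o)  # this player + sharers
        total += edge_cost[e] / users
    return total

def best_response_dynamics(strategies, edge_cost, options, max_rounds=100):
    """Let each player in turn switch to a cheapest available path given the
    others, until no player wants to deviate (a Nash equilibrium; this is a
    potential game, so the dynamics terminate)."""
    strategies = list(strategies)
    for _ in range(max_rounds):
        changed = False
        for i, opts in enumerate(options):
            others = strategies[:i] + strategies[i + 1:]
            current = player_cost(strategies[i], others, edge_cost)
            best = min(opts, key=lambda p: player_cost(p, others, edge_cost))
            if player_cost(best, others, edge_cost) < current - 1e-12:
                strategies[i] = best
                changed = True
        if not changed:
            break
    return strategies
```

With a shared edge of cost 4 and private edges of cost 3 each, both "everyone shares" (total cost 4, the optimum) and "everyone goes private" (total cost 6) are equilibria, which is exactly why the paper distinguishes the best equilibrium (price of stability) from the worst.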
The origins of telicity, 1998
Cited by 206 (4 self)
Abstract: The distinction between telic and atelic predicates has been described in terms of the algebraic properties of their meaning since the early days of model-theoretic semantics. This perspective was inspired by Aristotle's discussion of types of actions that do or do not take time to be completed, which was taken up and turned into a linguistic discussion of action-denoting predicates by Vendler ...

Recognition of Shapes by Editing Their Shock Graphs, Proc. Int'l Conf. Computer Vision, 2001
Cited by 204 (8 self)
Abstract: This paper presents a novel framework for the recognition of objects based on their silhouettes. The main idea is to measure the distance between two shapes as the minimum extent of deformation necessary for one shape to match the other. Since the space of deformations is very high-dimensional, three steps are taken to make the search practical: 1) define an equivalence class for shapes based on shock-graph topology, 2) define an equivalence class for deformation paths based on shock-graph transitions, and 3) avoid complexity-increasing deformation paths by moving toward shock-graph degeneracy. Despite these steps, which tremendously reduce the search requirement, there still remain numerous deformation paths to consider. To that end, we employ an edit-distance algorithm for shock graphs that finds the optimal deformation path in polynomial time. The proposed approach gives intuitive correspondences for a variety of shapes and is robust in the presence of a wide range of visual transformations. The recognition rates on two distinct databases of 99 and 216 shapes each indicate highly successful within-category matches (100 percent in top three matches), which render the framework potentially usable in a range of shape-based recognition applications. Index Terms: shape deformation, shock graphs, graph matching, edit distance, shape matching, object recognition, dynamic programming.

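The edit-distance idea underlying the paper's polynomial-time matcher is easiest to see on strings: the minimum number of edit operations turning one sequence into another, found by dynamic programming. This string version only illustrates the recurrence; the paper's algorithm applies the same principle to sequences of shock-graph edit operations, which this sketch does not attempt to reproduce.

```python
# Classic dynamic-programming edit distance (Levenshtein) on strings,
# shown as an analogy for the shock-graph edit distance.

def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions
    turning string a into string b."""
    # d[i][j] = edit distance between a[:i] and b[:j]
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i          # delete all of a[:i]
    for j in range(len(b) + 1):
        d[0][j] = j          # insert all of b[:j]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # delete a[i-1]
                          d[i][j - 1] + 1,      # insert b[j-1]
                          d[i - 1][j - 1] + sub)  # substitute or match
    return d[-1][-1]
```

The polynomial cost (here O(|a|·|b|)) is what makes exhaustive search over deformation paths unnecessary once the problem is cast as edit distance.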
Supporting Text 1. Algorithm Details
Abstract: Consider a corpus of m sentences (sequences) of variable length, each expressed in terms of a lexicon of finite size N. The sentences in the corpus correspond to m different paths in a pseudograph (a non-simple graph in which both loops and multiple edges are permitted) whose vertices are the unique ... in flux of paths at e_{j-1}, starting at e_i and moving along the subpath e_i → e_{i+1} → e_{i+2} → ... → e_{j-1}:

    PR(e_i; e_j) = p(e_j | e_i e_{i+1} e_{i+2} ... e_{j-1}) = l(e_i; e_j) / l(e_i; e_{j-1}),   [1]

where l(e_i; e_j) is the number of occurrences of the subpath (e_i; e_j) in the graph. Proceeding in the opposite direction, from the right ...

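Equation [1] is just a ratio of subpath counts over the corpus, which can be computed directly. The toy corpus below is invented for illustration:

```python
# Sketch of the right-moving path probability PR(e_i; e_j) of Eq. [1]:
# l(e_i; e_j) counts contiguous occurrences of the subpath e_i ... e_j,
# and PR is the fraction of paths through e_i ... e_{j-1} that continue
# with e_j. Sentences are lists of lexicon items.

def subpath_count(corpus, subpath):
    """l(.): number of occurrences of `subpath` as a contiguous run."""
    k = len(subpath)
    return sum(1
               for sentence in corpus
               for t in range(len(sentence) - k + 1)
               if sentence[t:t + k] == subpath)

def pr_right(corpus, subpath):
    """PR(e_i; e_j) = l(e_i; e_j) / l(e_i; e_{j-1})."""
    return subpath_count(corpus, subpath) / subpath_count(corpus, subpath[:-1])
```

For instance, if "the cat" follows two of the three occurrences of "the" in a corpus, then PR for that two-word subpath is 2/3, matching the flux interpretation in the text.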
Knowledge Discovery from Users Web-Page Navigation, in Proceedings of the Workshop on Research Issues in Data Engineering, 1997
Cited by 177 (10 self)
Abstract: We propose to detect users' navigation paths to the advantage of web-site owners. First, we explain the design and implementation of a profiler which captures clients' selected links and page order, accurate page viewing time, and cache references, using a Java-based remote agent. The information captured by the profiler is then utilized by a knowledge discovery technique to cluster users with similar interests. We introduce a novel path-clustering method based on the similarity of the history of user navigation. This approach is capable of capturing the interests of the user, which could persist through several subsequent hypertext link selections. Finally, we evaluate our path-clustering technique via a simulation study on a sample WWW site. We show that depending on the level of inserted noise, we can recover the correct clusters with an average error margin of 10%-27%.

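The idea of clustering users by the similarity of their navigation histories can be sketched in a few lines. The similarity measure and grouping rule below (Jaccard overlap of traversed links, greedy assignment with a threshold) are invented stand-ins, not the paper's actual method:

```python
# Sketch of path clustering: group navigation paths whose sets of traversed
# links overlap. Similarity measure and threshold are illustrative choices.

def link_set(path):
    """The set of directed page-to-page transitions in a navigation path."""
    return {(a, b) for a, b in zip(path, path[1:])}

def similarity(p, q):
    """Jaccard similarity of the two paths' link sets."""
    s, t = link_set(p), link_set(q)
    return len(s & t) / len(s | t) if s | t else 0.0

def greedy_path_clusters(paths, threshold=0.3):
    """Assign each path to the first cluster whose representative is similar
    enough; otherwise start a new cluster."""
    clusters = []  # list of (representative_path, member_paths)
    for p in paths:
        for rep, members in clusters:
            if similarity(rep, p) >= threshold:
                members.append(p)
                break
        else:
            clusters.append((p, [p]))
    return [members for _, members in clusters]
```

Comparing link sets rather than raw page sets keeps the order information that the paper argues matters: two users visiting the same pages in different orders traverse different links.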