Results 1 – 10 of 872
Proof verification and hardness of approximation problems
 In Proc. 33rd Ann. IEEE Symp. on Found. of Comp. Sci.
, 1992
Cited by 822 (39 self)
We show that every language in NP has a probabilistic verifier that checks membership proofs for it using a logarithmic number of random bits and by examining a constant number of bits in the proof. If a string is in the language, then there exists a proof such that the verifier accepts with probability ...
The Application of Petri Nets to Workflow Management
, 1998
"... Workflow management promises a new solution to an age-old problem: controlling, monitoring, optimizing and supporting business processes. What is new about workflow management is the explicit representation of the business process logic which allows for computerized support. This paper discusses the ..."
Cited by 522 (61 self)
techniques which can be used to verify the correctness of workflow procedures. This paper introduces workflow management as an application domain for Petri nets, presents state-of-the-art results with respect to the verification of workflows, and highlights some Petri-net-based workflow tools.
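The mapping from workflows to Petri nets described above can be illustrated with a minimal token-game sketch (an assumption-laden toy, not any of the paper's actual tools): workflow states become places holding tokens, tasks become transitions that fire when enabled.

```python
def fire(marking, transition):
    """Fire a Petri net transition if enabled.

    marking: {place: token_count}; transition: (inputs, outputs),
    each a list of place names.  A transition is enabled when every
    input place holds at least one token; firing consumes one token
    per input place and produces one per output place.
    """
    inputs, outputs = transition
    if any(marking.get(p, 0) < 1 for p in inputs):
        return None                      # transition not enabled
    new = dict(marking)
    for p in inputs:
        new[p] -= 1
    for p in outputs:
        new[p] = new.get(p, 0) + 1
    return new

# Hypothetical "approve" task: moves a case from 'received' to 'approved'
m = fire({'received': 1}, (['received'], ['approved']))
print(m)  # {'received': 0, 'approved': 1}
```

Verification questions about a workflow (e.g. can every case reach completion?) then become reachability questions over markings like these.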
A scheduling model for reduced CPU energy
 Annual Symposium on Foundations of Computer Science
, 1995
Cited by 550 (3 self)
The energy usage of computer systems is becoming an important consideration, especially for battery-operated systems. Various methods for reducing energy consumption have been investigated, both at the circuit level and at the operating systems level. In this paper, we propose a simple model of job scheduling aimed at capturing some key aspects of energy minimization. In this model, each job is to be executed between its arrival time and deadline by a single processor with variable speed, under the assumption that energy usage per unit time, P, is a convex function of the processor speed s. We give an offline algorithm that computes, for any set of jobs, a minimum-energy schedule. We then consider some online algorithms and their competitive performance for the power function P(s) = s^p where p ≥ 2. It is shown that one natural heuristic, called the Average Rate heuristic, uses at most a constant times the minimum energy required. The analysis involves bounding the largest eigenvalue in matrices of a special type.
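The Average Rate heuristic mentioned in the abstract admits a very short sketch (my reading of the heuristic, with an assumed job format): each job's work is spread evenly over its interval as a "density", and the processor runs at the sum of the densities of the jobs currently alive.

```python
def avr_speed(jobs, t):
    """Speed chosen by the Average Rate heuristic at time t.

    jobs: list of (arrival, deadline, work) tuples.  Each job
    contributes density work / (deadline - arrival) throughout its
    interval; the processor speed is the sum of live densities.
    """
    return sum(work / (deadline - arrival)
               for arrival, deadline, work in jobs
               if arrival <= t < deadline)

jobs = [(0.0, 4.0, 8.0),   # density 2.0 on [0, 4)
        (1.0, 3.0, 4.0)]   # density 2.0 on [1, 3)
print(avr_speed(jobs, 0.5))  # 2.0: only the first job is alive
print(avr_speed(jobs, 2.0))  # 4.0: both densities overlap here
```

Jobs are then executed at this speed in earliest-deadline-first order; the paper's result bounds the energy of this schedule against the offline optimum.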
Bugs as Deviant Behavior: A General Approach to Inferring Errors in Systems Code
, 2001
Cited by 387 (12 self)
A major obstacle to finding program errors in a real system is knowing what correctness rules the system must obey. These rules are often undocumented or specified in an ad hoc manner. This paper demonstrates techniques that automatically extract such checking information from the source code itself, rather than the programmer, thereby avoiding the need for a priori knowledge of system rules. The cornerstone of our approach is inferring programmer "beliefs" that we then cross-check for contradictions. Beliefs are facts implied by code: a dereference of a pointer, p, implies a belief that p is non-null, a call to "unlock(l)" implies that l was locked, etc. For beliefs we know the programmer must hold, such as the pointer dereference above, we immediately flag contradictions ...
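The belief cross-checking idea can be sketched in a few lines (a toy over an assumed event stream, not the paper's actual static analysis): a dereference of a variable implies the belief that it is non-null, so a later null check of that same variable contradicts an existing belief and is flagged.

```python
def find_contradictions(events):
    """Flag belief contradictions in a linear event stream.

    events: list of (action, var) pairs in program order, where
    action is 'deref' or 'null_check'.  A 'deref' of var records the
    belief "var is non-null"; a subsequent 'null_check' of the same
    var contradicts that belief, so one of the two sites is a bug.
    """
    believed_nonnull = set()
    bugs = []
    for action, var in events:
        if action == 'deref':
            believed_nonnull.add(var)        # code believes var != NULL
        elif action == 'null_check' and var in believed_nonnull:
            bugs.append(var)                 # contradiction: flag it
    return bugs

print(find_contradictions([('deref', 'p'), ('null_check', 'p'),
                           ('null_check', 'q')]))  # ['p']
```

Checking `q` against null raises no flag because no prior belief about `q` exists; only contradicted beliefs are reported.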
Nonparametric Permutation Tests for Functional Neuroimaging: A Primer with Examples. Human Brain Mapping
, 2001
Cited by 376 (10 self)
The statistical analysis of functional mapping experiments usually proceeds at the voxel level, involving the formation and assessment of a statistic image: at each voxel a statistic indicating evidence of the experimental effect of interest, at that voxel, is computed, giving an image of statistics ...
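The permutation-test machinery behind the paper can be sketched for a single statistic (a minimal two-sample illustration, not the voxel-wise neuroimaging pipeline itself): under the null hypothesis the group labels are exchangeable, so recomputing the statistic over random relabellings builds its null distribution directly from the data.

```python
import random

def permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sided permutation p-value for a difference in means."""
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                    # relabel under the null
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        stat = sum(perm_a) / len(a) - sum(perm_b) / len(b)
        if abs(stat) >= abs(observed):         # as-or-more extreme
            count += 1
    return (count + 1) / (n_perm + 1)          # add-one avoids p = 0

p = permutation_test([5.1, 4.9, 5.3, 5.2], [3.9, 4.1, 4.0, 4.2])
print(p)  # small p: the observed labelling is rare under the null
```

In the neuroimaging setting the same relabelling is applied image-wide and the maximal statistic over voxels is tracked, which is what gives the method its multiple-comparisons control.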
Mining Specifications
, 2002
Cited by 360 (6 self)
Program verification is a promising approach to improving program quality, because it can search all possible program executions for specific errors. However, the need to formally describe correct behavior or errors is a major barrier to the widespread adoption of program verification, since programmers historically have been reluctant to write formal specifications. Automating the process of formulating specifications would remove a barrier to program verification and enhance its practicality.
Checking Computations in Polylogarithmic Time
, 1991
Cited by 274 (11 self)
Motivated by Manuel Blum's concept of instance checking, we consider new, very fast and generic mechanisms of checking computations. Our results exploit recent advances in interactive proof protocols [LFKN92], [Sha92], and especially the MIP = NEXP protocol from [BFL91]. We show that every nondeterministic computational task S(x, y), defined as a polynomial time relation between the instance x, representing the input and output combined, and the witness y, can be modified to a task S' such that: (i) the same instances remain accepted; (ii) each instance/witness pair becomes checkable in polylogarithmic Monte Carlo time; and (iii) a witness satisfying S' can be computed in polynomial time from a witness satisfying S. Here the instance and the description of S have to be provided in error-correcting code (since the checker will not notice slight changes). A modification of the MIP proof was required to achieve polynomial time in (iii); the earlier technique yields N^O(log log N) ...
Probabilistic Object and Viewpoint Models for Active Object Recognition
For mobile robots to perform certain tasks in human environments, fast and accurate object verification and recognition is essential. Bayesian approaches to active object recognition have proved effective in a number of cases, allowing information across views to be integrated in a principled manner, and permitting a principled approach to data acquisition. Existing approaches however mostly rely on probabilistic models which make simplifying assumptions, such as that features may be treated independently and that objects will appear without clutter at test time. We develop a number of probabilistic object and viewpoint models which are explicitly designed to cope with situations in which these assumptions fail, and show these to perform well in a Bayesian active recognition setting using test data in which objects appear in cluttered environments with significant occlusion.
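The Bayesian view-integration that this line of work builds on can be sketched as a posterior update over object hypotheses (an illustrative toy assuming given per-view likelihoods and conditional independence of views given the object, exactly the kind of simplifying assumption the paper's models are designed to relax):

```python
def integrate_views(prior, view_likelihoods):
    """Fold per-view likelihoods into a posterior over objects.

    prior: {object: probability}; view_likelihoods: list of
    {object: p(observation_v | object)} dicts, one per view.
    Each view multiplies in its likelihood, then we renormalise.
    """
    posterior = dict(prior)
    for lik in view_likelihoods:
        posterior = {obj: posterior[obj] * lik[obj] for obj in posterior}
        z = sum(posterior.values())
        posterior = {obj: p / z for obj, p in posterior.items()}
    return posterior

prior = {'mug': 0.5, 'bowl': 0.5}
views = [{'mug': 0.6, 'bowl': 0.4},   # ambiguous frontal view
         {'mug': 0.9, 'bowl': 0.1}]   # informative side view
post = integrate_views(prior, views)
print(post['mug'])  # mug is much more probable after two views
```

An active recognizer would additionally choose the *next* view to maximise expected information gain over this posterior, which is the data-acquisition side the abstract refers to.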