
CiteSeerX
Results 1 - 10 of 1,636

KLEE: Unassisted and Automatic Generation of High-Coverage Tests for Complex Systems Programs

by Cristian Cadar, Daniel Dunbar, Dawson Engler
"... We present a new symbolic execution tool, KLEE, capable of automatically generating tests that achieve high coverage on a diverse set of complex and environmentally-intensive programs. We used KLEE to thoroughly check all 89 stand-alone programs in the GNU COREUTILS utility suite, which form the cor ..."
Abstract - Cited by 557 (15 self) - Add to MetaCart

Iterative decoding of binary block and convolutional codes

by Joachim Hagenauer, Elke Offer, Lutz Papke - IEEE TRANS. INFORM. THEORY , 1996
"... Iterative decoding of two-dimensional systematic convolutional codes has been termed “turbo” (de)coding. Using log-likelihood algebra, we show that any decoder can be used which accepts soft inputs, including a priori values, and delivers soft outputs that can be split into three terms: the soft chann ..."
Abstract - Cited by 610 (43 self) - Add to MetaCart

Iterative (turbo) soft interference cancellation and decoding for coded CDMA

by Xiaodong Wang, H. Vincent Poor - IEEE Trans. Commun , 1999
"... Abstract — The presence of both multiple-access interference (MAI) and intersymbol interference (ISI) constitutes a major impediment to reliable communications in multipath code-division multiple-access (CDMA) channels. In this paper, an iterative receiver structure is proposed for decoding multiuse ..."
Abstract - Cited by 456 (18 self) - Add to MetaCart
computational complexity. A low-complexity SISO multiuser detector is developed based on a novel nonlinear interference suppression technique, which makes use of both soft interference cancellation and instantaneous linear minimum mean-square error filtering. The properties of such a nonlinear interference

Shoestring: Probabilistic Soft Error Reliability on the Cheap

by Shuguang Feng, Shantanu Gupta, Amin Ansari, Scott Mahlke
"... Aggressive technology scaling provides designers with an ever increasing budget of cheaper and faster transistors. Unfortunately, this trend is accompanied by a decline in individual device reliability as transistors become increasingly susceptible to soft errors. We are quickly approaching a new er ..."
Abstract - Cited by 24 (3 self) - Add to MetaCart
come at little or no cost. This paper presents Shoestring, a minimally invasive software solution that provides high soft error coverage with very little overhead, enabling its deployment even in commodity processors with “shoestring” reliability budgets. Leveraging intelligent analysis at compile

Fingerprinting: Bounding Soft-Error Detection Latency and Bandwidth

by Jared C. Smolens, Brian T. Gold, Jangwoo Kim, Babak Falsafi, James C. Hoe, Andreas G. Nowatzyk - In Proc. of the Symposium on Architectural Support for Programming Languages and Operating Systems (ASPLOS , 2004
"... Recent studies have suggested that the soft-error rate in microprocessor logic will become a reliability concern by 2010. This paper proposes an efficient error detection technique, called fingerprinting, that detects differences in execution across a dual modular redundant (DMR) processor pair. Finger ..."
Abstract - Cited by 80 (8 self) - Add to MetaCart
This paper presents a study that evaluates fingerprinting against a range of current approaches to error detection. The result of this study shows that fingerprinting is the only error detection mechanism that simultaneously allows high-error coverage, low error detection bandwidth, and high I/O performance.

Reducing Overhead for Soft Error Coverage in High Availability Systems

by Nidhi Aggarwal, James E. Smith, Parthasarathy Ranganathan, Kewal K. Saluja
"... Abstract—High reliability/availability systems typically use redundant computation and components to achieve detection, isolation and recovery from faults. Chip multiprocessors (CMPs) incorporate multiple identical components on a chip to provide high performance/watt. These identical components can ..."
Abstract - Add to MetaCart
the components of a system in low cost commodity CMP-based high availability system. In this paper, we focus on low overhead techniques to detect logic soft errors in high availability systems. We specifically focus on reducing the overhead of replicating 1) memory because it is the most expensive component of a

Failure Mode Assumptions and Assumption Coverage

by David Powell , 1995
"... . A method is proposed for the formal analysis of failure mode assumptions and for the evaluation of the dependability of systems whose design correctness is conditioned on the validity of such assumptions. Formal definitions are given for the types of errors that can affect items of service deliver ..."
Abstract - Cited by 143 (4 self) - Add to MetaCart
delivered by a system or component. Failure mode assumptions are then formalized as assertions on the types of errors that a component may induce in its enclosing system. The concept of assumption coverage is introduced to relate the notion of partially-ordered assumption assertions to the quantification

Techniques to reduce the soft error rate of a high-performance microprocessor

by Christopher Weaver, Joel Emer, Shubhendu S. Mukherjee, Steven K. Reinhardt - In Proceedings of the 31st annual International Symposium on Computer Architecture , 2004
"... Transient faults due to neutron and alpha particle strikes pose a significant obstacle to increasing processor transistor counts in future technologies. Although fault rates of individual transistors may not rise significantly, incorporating more transistors into a device makes that device more like ..."
Abstract - Cited by 104 (5 self) - Add to MetaCart
likely to encounter a fault. Hence, maintaining processor error rates at acceptable levels will require increasing design effort. This paper proposes two simple approaches to reduce error rates and evaluates their application to a microprocessor instruction queue. The first technique reduces the time

Index Terms—Formal Verification, Soft Error Injection, Error Detection and Correction, Fault/Error Coverage

by unknown authors
"... Abstract—In this paper we describe a methodology to measure exactly the quality of fault-tolerant designs by combining fault-injection in high level design (HLD) descriptions with a formal verification approach. We utilize BDD based symbolic simulation to determine the coverage of online error-detec ..."
Abstract - Add to MetaCart

Evaluating coverage of error detection logic for soft errors using formal methods

by U. Krautz, M. Pflanz, C. Jacobi, H. W. Tast, K. Weber, H. T. Vierhaus - in Design, Automation and Test in Europe, 2006
"... Abstract—In this paper we describe a methodology to measure exactly the quality of fault-tolerant designs by combining fault-injection in high level design (HLD) descriptions with a formal verification approach. We utilize BDD based symbolic simulation to determine the coverage of online error-detect ..."
Abstract - Cited by 12 (0 self) - Add to MetaCart

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University