Results 1–10 of 18
Realism in Statistical Analysis of Worst Case Execution Times
 In Proc. of WCET’10
, 2010
Cited by 8 (1 self)
Abstract This paper considers the use of Extreme Value Theory (EVT) to model worst-case execution times. In particular, it considers the sacrifice that statistical methods make in the realism of their models in order to provide generality and precision, and whether that sacrifice of realism can impact the safety of the model. The Gumbel distribution is assessed in terms of its assumption of continuous behaviour and its need for independent and identically distributed data. To ensure that predictions made by EVT estimations are safe, additional restrictions on their use are proposed and justified.
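The Gumbel-based estimation discussed above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes synthetic exponential-tailed timing data, a block-maxima approach, and a simple method-of-moments fit of the Gumbel distribution.

```python
import math
import random
import statistics

def gumbel_fit(block_maxima):
    # Method-of-moments fit of a Gumbel (EVT type I) distribution:
    # scale = sqrt(6) * stdev / pi, loc = mean - gamma * scale.
    mu = statistics.mean(block_maxima)
    sigma = statistics.stdev(block_maxima)
    scale = sigma * math.sqrt(6) / math.pi
    loc = mu - 0.5772156649 * scale  # Euler-Mascheroni constant
    return loc, scale

def pwcet(loc, scale, p):
    # Invert the Gumbel CDF F(x) = exp(-exp(-(x - loc)/scale)) at 1 - p:
    # the returned bound is exceeded with probability p per block.
    return loc - scale * math.log(-math.log(1.0 - p))

rng = random.Random(0)
# Hypothetical measured execution times (microseconds), grouped in blocks of 50.
times = [100.0 + rng.expovariate(0.2) for _ in range(5000)]
maxima = [max(times[i:i + 50]) for i in range(0, 5000, 50)]
loc, scale = gumbel_fit(maxima)
bound = pwcet(loc, scale, 1e-6)  # bound exceeded with probability 10^-6
```

Note that this sketch simply assumes the i.i.d. and continuity conditions the paper scrutinizes hold; in practice those assumptions must be checked before the fit is trusted.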
Re-Sampling for Statistical Timing Analysis of Real-Time Systems
Cited by 4 (2 self)
Guaranteeing timing constraints is the main purpose of analyses for real-time systems. The satisfaction of these constraints may be verified with probabilistic methods (relying on statistical estimations of certain task parameters) offering both hard and soft guarantees. In this paper, we address the problem of sampling applied to the distributions of worst-case execution times. The pessimism of the presented sampling techniques is then evaluated at the level of response times.
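One way to reduce an execution-time distribution pessimistically, in the spirit described above, is sketched below. The selection rule (keep the k largest support points and push all dropped probability mass onto the smallest kept value) is a hypothetical simplification, not the paper's exact technique; its point is that the reduced distribution upper-bounds the original.

```python
def resample_up(dist, k):
    # dist: list of (execution_time, probability) pairs.
    # Reduce the support to k values: the probability mass of every dropped
    # (smaller) value is moved onto the smallest KEPT value, so the reduced
    # distribution stochastically dominates the original (safe pessimism).
    dist = sorted(dist)
    if len(dist) <= k:
        return dist
    kept = dist[-k:]
    dropped_mass = sum(p for _, p in dist[:-k])
    return [(kept[0][0], kept[0][1] + dropped_mass)] + kept[1:]

# Hypothetical discrete WCET distribution (time units, probability).
wcet_dist = [(10, 0.5), (12, 0.3), (15, 0.15), (20, 0.05)]
reduced = resample_up(wcet_dist, 2)
```

Because every sample value only moves upward, response-time analyses fed with the reduced distribution stay on the safe side, at the cost of the pessimism the paper quantifies.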
Reasoning about the reliability of multi-version, diverse real-time systems
 In Proceedings of IEEE Real-Time Systems Symposium (RTSS)
, 2010
Cited by 4 (3 self)
Abstract—This paper is concerned with the development of reliable real-time systems for use in high-integrity applications. It advocates the use of diverse replicated channels, but does not require the dependencies between the channels to be evaluated. Rather, it develops and extends the approach of Littlewood and Rushby (for general systems) by investigating a two-channel system in which one channel, A, is produced to a high level of reliability (i.e. has a very low failure rate), while the other, B, employs various forms of static analysis to sustain an argument that it is perfect (i.e. it will never miss a deadline). The first channel is fully functional; the second employs a more restricted computational model and contains only the critical computations. Potential dependencies between the channels (and their verification) are evaluated in terms of aleatory and epistemic uncertainty. At the aleatory level the events “A fails” and “B is imperfect” are independent. Moreover, unlike the general case, independence at the epistemic level is also proposed for common forms of implementation and analysis for real-time systems and their temporal requirements (deadlines). As a result, a systematic approach is advocated that can be applied in a real engineering context to produce highly reliable real-time systems, and to support numerical claims about the level of reliability achieved.
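The numerical claim enabled by the independence argument above can be illustrated with a small calculation. The figures here are purely hypothetical placeholders, not values from the paper.

```python
def system_failure_bound(p_a_fails, p_b_imperfect):
    # Under aleatory independence of "A fails on a demand" and
    # "B is imperfect", the per-demand failure probability of the
    # two-channel system is bounded by the product of the two terms.
    return p_a_fails * p_b_imperfect

# Hypothetical figures: channel A misses a deadline on 1 in 10^4 demands,
# and the (epistemic) probability that B's static analysis is flawed is 10^-3.
bound = system_failure_bound(1e-4, 1e-3)
```

The multiplicative bound is what makes the architecture attractive: a modestly reliable functional channel combined with a plausibly perfect restricted channel yields a claimed failure probability far below either figure alone.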
On the comparison of deterministic and probabilistic WCET estimation techniques
 In 26th Euromicro Conference on Real-Time Systems (ECRTS)
, 2014
Cited by 3 (0 self)
Abstract—Timing validation is a critical step in the design of real-time systems that requires the estimation of Worst-Case Execution Times (WCET) for tasks. A number of different methods have been proposed, such as Static Deterministic Timing Analysis (SDTA). The advent of Probabilistic Timing Analysis, both Measurement-Based (MBPTA) and Static Probabilistic Timing Analysis (SPTA), offers different design points in terms of the tightness of WCET estimates, the hardware that can be analysed, and the information needed from the user to carry out the analysis. The lack of comparison figures among these techniques makes the selection of the most appropriate one difficult. This paper makes a first attempt towards comparing SDTA, SPTA and MBPTA comprehensively, both qualitatively and quantitatively, under different cache configurations implementing LRU and random replacement. We identify strengths and limitations of each technique depending on the characteristics of the program under analysis and the hardware platform, thus providing users with guidance on which approach to choose depending on their target application and hardware platform.
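To make the SPTA side of this comparison concrete, the sketch below uses a commonly cited conservative bound on the hit probability of an access in a fully associative, evict-on-miss random-replacement cache, and convolves per-access latency distributions into a total execution-time distribution. The latencies, associativity, and reuse distances are illustrative assumptions, not figures from the paper.

```python
def hit_prob(n_ways, reuse_distance):
    # Conservative SPTA lower bound for random replacement (evict-on-miss):
    # ((N-1)/N)^k for reuse distance k, taken as 0 once k >= N.
    if reuse_distance >= n_ways:
        return 0.0
    return ((n_ways - 1) / n_ways) ** reuse_distance

def convolve(d1, d2):
    # Discrete convolution of two independent latency distributions,
    # each represented as a dict {latency: probability}.
    out = {}
    for v1, p1 in d1.items():
        for v2, p2 in d2.items():
            out[v1 + v2] = out.get(v1 + v2, 0.0) + p1 * p2
    return out

HIT, MISS = 1, 10          # hypothetical hit/miss latencies in cycles
total = {0: 1.0}
for rd in [0, 1, 2, 5]:    # reuse distances of four consecutive accesses
    ph = hit_prob(4, rd)   # 4-way cache
    total = convolve(total, {HIT: ph, MISS: 1.0 - ph})
# `total` now maps each cumulative latency to its probability; reading off
# its 1 - p quantile gives a pWCET estimate at exceedance level p.
```

A deterministic LRU analysis would instead classify each access as a guaranteed hit or miss, which is the qualitative contrast the paper quantifies.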
Multi-level unified caches for probabilistically time analysable real-time systems
 In RTSS ’13
, 2013
Cited by 2 (1 self)
Abstract—Caches are key resources in high-end processor architectures to increase performance. In fact, most high-performance processors come equipped with a multi-level cache hierarchy. In terms of guaranteed performance, however, cache hierarchies severely challenge the computation of tight worst-case execution time (WCET) estimates. On the one hand, the analysis of the timing behaviour of a single level of cache is already challenging, particularly for data accesses. On the other hand, unifying data and instructions in each level makes the problem of cache analysis significantly more complex, requiring data and instruction accesses to cache to be tracked simultaneously. In this paper we prove that multi-level cache hierarchies can be used in the context of Probabilistic Timing Analysis and that tight WCET estimates can be obtained. Our detailed analysis (1) covers unified data and instruction caches, (2) covers different cache write policies (write-through and write-back), write allocation policies (write-allocate and non-write-allocate) and several inclusion mechanisms (inclusive, non-inclusive and exclusive caches), and (3) scales to an arbitrary number of cache levels. Our results show that the probabilistic WCET (pWCET) estimates provided by our analysis technique effectively benefit from having multi-level caches. For a two-level cache configuration and for EEMBC benchmarks, pWCET reductions are 55% on average (and up to 90%) with respect to a processor with a single level of cache.
A Trace-Based Statistical Worst-Case Execution Time Analysis of Component-Based Real-Time Embedded Systems
Cited by 1 (1 self)
Abstract—This paper describes the tool support for a framework for performing statistical WCET analysis of real-time embedded systems using bootstrapping sampling and Extreme Value Theory (EVT). To be specific, bootstrapping sampling is used to generate timing traces which not only fulfil the requirements given by statistics and probability theory, but are also robust to use in the context of estimating the WCET of programs. Next, our proposed statistical inference uses EVT to analyze such timing traces and computes a WCET estimate of the target program pertaining to a given predictable probability. The evaluation results show that our proposed method has the potential to provide a tighter upper bound on the WCET estimate of the programs under analysis, when compared to the estimates given by the referenced WCET analysis methods.
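A bootstrap step of the kind described above can be sketched in a few lines. The trace data, block size and number of resampled blocks here are hypothetical; the paper's framework would feed the resulting maxima into an EVT fit.

```python
import random

def bootstrap_block_maxima(trace, block_size, n_blocks, rng):
    # Bootstrap re-sampling: draw synthetic traces with replacement from
    # the measured trace, and keep each synthetic trace's maximum. The
    # resulting block maxima serve as input to an EVT (e.g. Gumbel) fit.
    return [max(rng.choice(trace) for _ in range(block_size))
            for _ in range(n_blocks)]

rng = random.Random(42)
# Hypothetical end-to-end execution-time measurements (microseconds).
trace = [200.0 + rng.gauss(0.0, 10.0) for _ in range(1000)]
maxima = bootstrap_block_maxima(trace, 100, 50, rng)
```

Resampling with replacement breaks ordering-induced dependence in the original trace, which is one way to move measured data closer to the i.i.d. requirement of the subsequent EVT inference.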
A new way about using statistical analysis of worst-case execution times
 In the session of the Euromicro Conference on Real-Time Systems
, 2011
Cited by 1 (0 self)
Abstract—In this paper, we revisit the problem of using Extreme Value Theory (EVT) in the Worst-Case Execution Time (WCET) analysis of programs running on a single processor. Our proposed statistical WCET analysis method consists of a novel sampling mechanism, tackling some problems that have hindered the application of EVT in this context, and a statistical inference for computing a WCET estimate of the target program. To be specific, the presented sampling mechanism takes analysis samples from the target program based on end-to-end measurements. Next, the statistical inference, using EVT together with other statistical techniques, analyzes such timing traces, which contain the execution time data of the program, to compute a WCET estimate with a certain predictable probability of being exceeded.
Probabilistic Deadline Miss Analysis of Real-Time Systems Using Regenerative Transient Analysis
Cited by 1 (0 self)
Quantitative evaluation of real-time systems demands analysis frameworks that go beyond worst-case assumptions, since some parameters may be better characterized by a random variable than by a deterministic value. On the one hand, this raises notable issues concerning the safe estimation of probabilistic parameters starting from real measurements. On the other hand, it also requires modeling formalisms and solution techniques that can encompass stochastic temporal parameters with a non-Markovian distribution, thus breaking the limits of Markovian approaches. We propose a framework for modeling and evaluating periodic real-time tasks that may have a probabilistic Worst-Case Execution Time (pWCET) and are scheduled by a fixed-priority non-preemptive policy. The methodology leverages Extreme Value Theory (EVT) for the derivation of the pWCET of tasks by means of Erlang distributions. Evaluation is performed through regenerative transient analysis based on the method of stochastic state classes, supporting the derivation of quantitative measures on the time by which a deadline is missed. The approach is demonstrated on a case study including tasks with a pWCET derived from benchmarks and real system execution.
Continuous Constant-Memory Monitoring of Embedded Software Timing
Abstract—A method is presented for generating statistical models of timing data continuously over very long monitoring sessions. This method is intended for memory-efficient runtime modeling of timing properties in embedded software systems, such as execution times or inter-arrival times, but is generic enough to be applicable to other purposes and domains as well. Specifically, we intend to use this method as a component in the automatic generation of simulation models for probabilistic timing analysis of complex embedded software systems. Given a stream of data as input, the method gradually builds up a statistical model capturing the approximate distribution of the data. The method uses a modest and fixed amount of on-target RAM, decided by the desired accuracy of the model, and allows for long monitoring sessions covering billions of data points. The paper presents the motivation, the algorithm, a prototype implementation and an evaluation using real execution time data from an ARM7 microcontroller.
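A minimal constant-memory model in this spirit is a fixed-bin streaming histogram, sketched below. This is a simplification under assumed parameters (a known value range and 64 equal-width bins), not the paper's algorithm, but it shows the key property: RAM use is set by the desired resolution, not by the number of samples.

```python
class StreamingHistogram:
    # Fixed-memory approximation of a timing distribution: a constant
    # number of equal-width bins over a configured range, so memory does
    # not grow with the monitoring session length.
    def __init__(self, lo, hi, n_bins):
        self.lo, self.hi, self.n = lo, hi, n_bins
        self.bins = [0] * n_bins
        self.count = 0

    def add(self, x):
        i = int((x - self.lo) * self.n / (self.hi - self.lo))
        self.bins[min(max(i, 0), self.n - 1)] += 1  # clamp out-of-range samples
        self.count += 1

    def quantile(self, q):
        # Approximate q-quantile: upper edge of the bin where the running
        # count first reaches q * count (rounds upward, i.e. conservatively).
        target = q * self.count
        cum = 0
        for i, c in enumerate(self.bins):
            cum += c
            if cum >= target:
                return self.lo + (i + 1) * (self.hi - self.lo) / self.n
        return self.hi

h = StreamingHistogram(0.0, 1000.0, 64)
for t in range(500):
    h.add(100.0 + (t * 37) % 300)  # deterministic stand-in for timing samples
```

With 64 counters the accuracy of any derived quantile is limited to one bin width, which is exactly the accuracy-versus-RAM trade-off the abstract describes.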