Results 1–10 of 205,553
The Power of Amnesia: Learning Probabilistic Automata with Variable Memory Length
 Machine Learning
, 1996
"... . We propose and analyze a distribution learning algorithm for variable memory length Markov processes. These processes can be described by a subclass of probabilistic finite automata which we name Probabilistic Suffix Automata (PSA). Though hardness results are known for learning distributions gene ..."
Abstract

Cited by 226 (17 self)
 Add to MetaCart
. We propose and analyze a distribution learning algorithm for variable memory length Markov processes. These processes can be described by a subclass of probabilistic finite automata which we name Probabilistic Suffix Automata (PSA). Though hardness results are known for learning distributions
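The longest-matching-suffix prediction that a PSA (equivalently, a probabilistic suffix tree) performs can be sketched as follows; the corpus, the depth bound, and all function names are illustrative, not taken from the paper:

```python
from collections import defaultdict

def train_counts(seq, max_depth):
    """Count next-symbol occurrences after every context of length <= max_depth."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq)):
        for d in range(max_depth + 1):
            if i - d < 0:
                break
            ctx = seq[i - d:i]          # the d symbols preceding position i
            counts[ctx][seq[i]] += 1
    return counts

def predict(counts, history, max_depth):
    """Predict the next-symbol distribution using the longest suffix of
    `history` that was observed in training (variable memory length)."""
    for d in range(min(max_depth, len(history)), -1, -1):
        ctx = history[len(history) - d:]
        if ctx in counts:
            total = sum(counts[ctx].values())
            return {sym: n / total for sym, n in counts[ctx].items()}
    return {}

counts = train_counts("abracadabra", max_depth=3)
dist = predict(counts, "abr", max_depth=3)   # "abr" is always followed by 'a'
```

The point of the variable memory is visible in `predict`: deep contexts are used only where the data supports them, and unseen histories fall back to shorter suffixes, ultimately the unigram distribution.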
Learning Probabilistic Automata with Variable Memory Length
, 1994
"... We propose and analyze a distribution learning algorithm for variable memory length Markov processes. These processescanbedescribedbya subclassofprobabilisticniteautomatawhichwe nameProbabilisticFiniteSuxAutomata. The learningalgorithmismotivatedbyrealapplicationsinmanmachineinteractionsuchashandwr ..."
Abstract

Cited by 47 (5 self)
 Add to MetaCart
We propose and analyze a distribution learning algorithm for variable memory length Markov processes. These processescanbedescribedbya subclassofprobabilisticniteautomatawhichwe nameProbabilisticFiniteSuxAutomata. The learningalgorithmismotivatedbyrealapplicationsinman
Class-based variable memory length Markov model
 in Proceedings of the European Conference on Speech Communication and Technology (INTERSPEECH)
, 2005
"... In this paper, we present a classbased variable memory length Markov model and its learning algorithm. This is an extension of a variable memory length Markov model. Our model is based on a classbased probabilistic suffix tree, whose nodes have an automatically acquired wordclass relation. We expe ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
In this paper, we present a classbased variable memory length Markov model and its learning algorithm. This is an extension of a variable memory length Markov model. Our model is based on a classbased probabilistic suffix tree, whose nodes have an automatically acquired wordclass relation. We
Memory Length in Hyperheuristics: An Empirical Study
 CISCHED 2007
, 2007
"... Hyperheuristics are an emergent optimisation methodology which aims to give a higher level of flexibility and domainindependence than is currently possible. Hyperheuristics are able to adapt to the different problems or problem instances by dynamically choosing between heuristics during the sear ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
the search. This paper is concerned with the issues of memory length on the performance of hyperheuristics. We focus on a recently proposed simulated annealing hyperheuristic and choose a set of hard university course timetabling problems as the test bed for this empirical study. The experimental results
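As a rough illustration of how a memory length enters such a method (the paper's actual algorithm is not reproduced here), one can score each low-level heuristic by its average improvement over a fixed-length window of its recent applications, bias selection toward recently successful heuristics, and accept worsening moves with a simulated-annealing rule. All parameter names and defaults below are illustrative:

```python
import math
import random
from collections import deque

def sa_hyperheuristic(solution, cost, heuristics, memory_length, iters,
                      t0=1.0, alpha=0.995):
    """Sketch of a simulated annealing hyperheuristic: each low-level
    heuristic keeps a fixed-length memory of its recent improvements."""
    memories = [deque(maxlen=memory_length) for _ in heuristics]
    cur, cur_cost = solution, cost(solution)
    best, best_cost, t = cur, cur_cost, t0
    for _ in range(iters):
        # Score each heuristic by its mean recorded improvement; an empty
        # memory gets an optimistic default so every heuristic is tried.
        scores = [sum(m) / len(m) if m else 1.0 for m in memories]
        i = random.choices(range(len(heuristics)),
                           weights=[s + 1e-6 for s in scores])[0]
        cand = heuristics[i](cur)
        new_cost = cost(cand)
        delta = cur_cost - new_cost          # positive = improvement
        memories[i].append(max(delta, 0.0))  # only improvements are credited
        if delta > 0 or random.random() < math.exp(delta / t):
            cur, cur_cost = cand, new_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        t *= alpha                           # geometric cooling
    return best, best_cost
```

The `maxlen` on each deque is the memory length being studied: a short memory reacts quickly to a heuristic's recent form, a long one averages over more history.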
Near Shannon limit error-correcting coding and decoding
, 1993
"... Abstract This paper deals with a new class of convolutional codes called Turbocodes, whose performances in terms of Bit Error Rate (BER) are close to the SHANNON limit. The TurboCode encoder is built using a parallel concatenation of two Recursive Systematic Convolutional codes and the associated ..."
Abstract

Cited by 1776 (6 self)
 Add to MetaCart
and the associated decoder, using a feedback decoding rule, is implemented as P pipelined identical elementary decoders. Consider a binary rate R=1/2 convolutional encoder with constraint length K and memory M=K1. The input to the encoder at time k is a bit dk and the corresponding codeword
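The encoder described in the snippet can be sketched as a rate-1/2 recursive systematic convolutional (RSC) encoder: the first output is the input bit itself, the second is a parity bit computed from a feedback shift register of M = K − 1 bits. The generator polynomials below are illustrative defaults for K = 3, not the ones used in the paper:

```python
def rsc_encode(bits, g_fb=0b111, g_fw=0b101):
    """Rate-1/2 recursive systematic convolutional encoder (sketch).
    g_fb: feedback polynomial (bit j <-> D^j), g_fw: forward polynomial.
    With g_fb = 0b111 the constraint length is K = 3, memory M = K - 1 = 2.
    Returns (systematic, parity) bit lists."""
    m = g_fb.bit_length() - 1          # memory M
    state = 0                          # bit j of state holds a_{k-1-j}
    sys_out, par_out = [], []
    for d in bits:
        # recursive (feedback) bit: a_k = d_k XOR taps of g_fb on past a's
        a = d
        for j in range(1, m + 1):
            if (g_fb >> j) & 1:
                a ^= (state >> (j - 1)) & 1
        reg = (state << 1) | a         # bit j of reg holds a_{k-j}
        # parity from the forward polynomial over the register
        p = 0
        for j in range(m + 1):
            if (g_fw >> j) & 1:
                p ^= (reg >> j) & 1
        sys_out.append(d)              # systematic output: the input bit
        par_out.append(p)
        state = reg & ((1 << m) - 1)   # shift in a_k, drop the oldest bit
    return sys_out, par_out
```

Because the register is driven by the fed-back bit a_k rather than d_k directly, a single input 1 produces an infinite parity response; this recursiveness is what makes the systematic code suitable for parallel concatenation.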
Design of a Linguistic Postprocessor using Variable Memory Length Markov Models
 In International Conference on Document Analysis and Recognition
, 1995
"... We present the design of a linguistic postprocessor for character recognizers. The central module of our system is a trainable variable memory length Markov model (VLMM) which predicts the next character given a variable length window of past characters. The overall system is composed of several fin ..."
Abstract

Cited by 53 (1 self)
 Add to MetaCart
We present the design of a linguistic postprocessor for character recognizers. The central module of our system is a trainable variable memory length Markov model (VLMM) which predicts the next character given a variable length window of past characters. The overall system is composed of several
Long memory relationships and the aggregation of dynamic models
 Journal of Econometrics
, 1980
"... By aggregating simple. possibly dependent, dynamic microrelationships, it is shown that the aggregate series may have univariate longmemory models and obey integrated, or infinite length transfer function relationships. A longmemory time series model is one having spectrum or order 6 ” for small ..."
Abstract

Cited by 356 (2 self)
 Add to MetaCart
By aggregating simple. possibly dependent, dynamic microrelationships, it is shown that the aggregate series may have univariate longmemory models and obey integrated, or infinite length transfer function relationships. A longmemory time series model is one having spectrum or order 6 ” for small
The Grünwald–Letnikov fractional-order derivative with fixed memory length
"... Contrary to integer order derivative, the fractionalorder derivative of a nonconstant periodic function is not a periodic function with the same period, as a consequence of this property the timeinvariant fractional order system does not have any nonconstant periodic solution unless the lower te ..."
Abstract
 Add to MetaCart
. In this paper we attempt to give a solution for the above problem by imposing a simple modification on the GrünwaldLetnikov definition of fractional derivative, this modification consists of fixing the memory length and varying the lower terminal of the derivative. It is shown that the new proposed
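The modification described, truncating the Grünwald–Letnikov sum to a fixed memory length L so that the lower terminal t − L moves with t, can be sketched numerically. The standard GL weights (−1)^j · C(α, j) satisfy the recursion w_j = w_{j−1} · (1 − (α + 1)/j); step size and test functions here are illustrative:

```python
def gl_derivative(f, t, alpha, h=1e-3, memory_length=None):
    """Grünwald–Letnikov fractional derivative of order `alpha` at time t,
    with the sum truncated to a fixed memory length L (sketch of the
    fixed-memory modification: the lower terminal is t - L, not a fixed a)."""
    L = memory_length if memory_length is not None else t
    n = int(L / h)                      # number of past samples kept
    w, total = 1.0, f(t)                # w_0 = 1
    for j in range(1, n + 1):
        w *= 1 - (alpha + 1) / j        # w_j = (-1)^j * C(alpha, j)
        total += w * f(t - j * h)
    return total / h**alpha
```

For integer α the weights vanish after α + 1 terms and the formula reduces to an ordinary finite difference, which gives a quick sanity check.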
Judgments of frequency and recognition memory in a multiple-trace memory model (Tech
 University of Oregon Cognitive Science Program
, 1986
"... The multipletrace simulation model, MINERVA 2, was applied to a number of phenomena found in experiments on relative and absolute judgments of frequency, and forcedchoice and yesno recognition memory. How the basic model deals with effects of repetition, forgetting, list length, orientation task, ..."
Abstract

Cited by 300 (3 self)
 Add to MetaCart
The multipletrace simulation model, MINERVA 2, was applied to a number of phenomena found in experiments on relative and absolute judgments of frequency, and forcedchoice and yesno recognition memory. How the basic model deals with effects of repetition, forgetting, list length, orientation task
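A common statement of the MINERVA 2 computation, sketched here from standard descriptions of the model (the feature coding and the example vectors are illustrative): every experience is stored as a separate trace, each trace is activated as its similarity to the probe cubed, and echo intensity, the basis for frequency and recognition judgments, is the sum of activations:

```python
def similarity(probe, trace):
    """Match between probe and trace: dot product normalized by the number
    of features that are nonzero in either vector."""
    n_r = sum(1 for p, t in zip(probe, trace) if p != 0 or t != 0)
    return sum(p * t for p, t in zip(probe, trace)) / n_r if n_r else 0.0

def echo_intensity(probe, memory):
    """Sum of per-trace activations; cubing the similarity preserves its
    sign while strongly favoring near-matching traces."""
    return sum(similarity(probe, t) ** 3 for t in memory)

# Features coded as +1 / -1, with 0 for absent features (illustrative).
memory = [[1, -1, 1, 0], [1, -1, 1, 0], [-1, 1, 0, 1]]
probe = [1, -1, 1, 0]
```

Because each repetition adds another matching trace, echo intensity grows with presentation frequency, which is how the single mechanism covers both frequency judgments and recognition.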
Learning variable memory length Markov chains from noisy output
, 1996
"... ... In this report we consider the case when the output of the target machine is seen only through a noisy channel. Among others, Lugosi [7] considered learning from noisy data and showed that consistent learning is possible in case of the binary symmetric channel even if the noise rate is not known ..."
Abstract
 Add to MetaCart
... In this report we consider the case when the output of the target machine is seen only through a noisy channel. Among others, Lugosi [7] considered learning from noisy data and showed that consistent learning is possible in case of the binary symmetric channel even if the noise rate is not known. In particular, he proved that the use of a nearest neighbor or a histogram classification method is consistent, that is, the error rate converges to its minimal value (Bayeserror). We will proceed as follows. In Section 2 we will present the framework and algorithm of Ron et al. for learning PFSA. In Section 3 we will modify the algorithm for learning from noisy data with exact knowledge of the noise structure. In Section 4 we will show that the same algorithm also works if we have a very good estimate of the noise.
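The noise model in question, a binary symmetric channel, is easy to sketch; nothing below comes from the report's learning algorithm itself, it only illustrates the channel through which the target machine's output is observed:

```python
import random

def bsc(bits, noise_rate, rng=random):
    """Binary symmetric channel: each bit is flipped independently with
    probability `noise_rate`."""
    return [b ^ (rng.random() < noise_rate) for b in bits]

random.seed(1)
clean = [random.randint(0, 1) for _ in range(10_000)]
noisy = bsc(clean, noise_rate=0.1)
flips = sum(c != n for c, n in zip(clean, noisy))
# the empirical flip rate concentrates around the channel's noise rate
```

The symmetry (0→1 and 1→0 flips are equally likely) is what lets the noisy next-symbol statistics be related back to the clean ones when the noise rate is known or well estimated.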