Results 1-10 of 11,493
[Table: REGULAR FORM vs. SUBBASE FORM glyph comparison]
"... In my Grantha proposal L2/09372 I had requested the distinct encoding of subbase vowel signs for Vocalic L/LL at 1137E and 1137F while the regular vowel signs for Vocalic L/LL are to be encoded at 11362 and 11363 isomorphically with the other major Indic blocks. The regular vowel signs are “regula ..."
are “regular” in that they are the written forms seen in most (contemporary) printings and writings. These regular forms are placed to the right of their base. However, the very same glyphs used for these regular forms are also attested, archaically, below their base:
Understanding Normal and Impaired Word Reading: Computational Principles in Quasi-Regular Domains
 PSYCHOLOGICAL REVIEW, 1996
"... We develop a connectionist approach to processing in quasi-regular domains, as exemplified by English word reading. A consideration of the shortcomings of a previous implementation (Seidenberg & McClelland, 1989, Psych. Rev.) in reading nonwords leads to the development of orthographic and phono ..."
Cited by 613 (94 self)
and phonological representations that capture better the relevant structure among the written and spoken forms of words. In a number of simulation experiments, networks using the new representations learn to read both regular and exception words, including low-frequency exception words, and yet are still able
Manifold regularization: A geometric framework for learning from labeled and unlabeled examples
 JOURNAL OF MACHINE LEARNING RESEARCH, 2006
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning al ..."
Cited by 578 (16 self)
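The abstract describes regularization that exploits the geometry of the data via labeled and unlabeled points together. A minimal sketch of that idea, using a graph-Laplacian smoothness penalty over a Gaussian affinity graph (a simplified cousin of the paper's manifold regularization, not its LapRLS/LapSVM algorithms; all names and parameter values here are illustrative):

```python
import numpy as np

def graph_laplacian_fit(X, y, labeled, sigma=0.5, lam=1.0):
    """Fit labeled points while penalizing f^T L f, so predictions
    vary smoothly over the affinity graph built from ALL points
    (labeled and unlabeled) -- the semi-supervised ingredient."""
    n = len(X)
    d2 = (X[:, None] - X[None, :]) ** 2
    W = np.exp(-d2 / (2 * sigma**2))        # Gaussian affinities
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    J = np.zeros((n, n))                    # diagonal indicator of labeled points
    J[labeled, labeled] = 1.0
    # minimize ||J(f - y)||^2 + lam * f^T L f  =>  (J + lam*L) f = J y
    return np.linalg.solve(J + lam * L, J @ y)

# two clusters, one label each; unlabeled points inherit their cluster's label
X = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
y = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0])   # only indices 0 and 3 are labeled
f = graph_laplacian_fit(X, y, labeled=[0, 3])
```

Because cross-cluster affinities are negligible, the single label in each cluster propagates to its unlabeled neighbors.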
Bias toward regular form in mental shape spaces
 Journal of Experimental Psychology: Human Perception & Performance, 2000
"... The distribution of figural "goodness" in 2 mental shape spaces, the space of triangles and the space of quadrilaterals, was examined. In Experiment 1, participants were asked to rate the typicality of visually presented triangles and quadrilaterals (perceptual task). In Experiment 2, par ..."
Cited by 17 (2 self)
goodness in shape space. Compared with neutral distributions of random shapes in the same shape spaces, these distributions showed a marked bias toward regular forms (equilateral triangles and squares). Such psychologically modal shapes apparently represent ideal forms that maximize the perceptual
A Model of Investor Sentiment
 Journal of Financial Economics, 1998
"... Recent empirical research in finance has uncovered two families of pervasive regularities: underreaction of stock prices to news such as earnings announcements, and overreaction of stock prices to a series of good or bad news. In this paper, we present a parsimonious model of investor sentiment, or ..."
Cited by 777 (32 self)
Locally weighted learning
 ARTIFICIAL INTELLIGENCE REVIEW, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Cited by 599 (51 self)
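The survey's core method, locally weighted linear regression, fits a separate weighted least-squares line around each query point, with a distance-decaying weighting function. A minimal one-dimensional sketch (the Gaussian kernel and bandwidth `tau` are one common choice among the many the survey discusses):

```python
import numpy as np

def lwr_predict(x_query, X, y, tau=1.0):
    """Locally weighted linear regression at a single query point:
    Gaussian weights downweight distant training points, then a
    weighted least-squares line is fit and evaluated at the query."""
    w = np.exp(-((X - x_query) ** 2) / (2 * tau**2))  # smoothing weights
    A = np.column_stack([np.ones_like(X), X])          # local linear model [1, x]
    WA = A * w[:, None]
    # weighted normal equations: (A^T W A) beta = A^T W y
    beta = np.linalg.solve(A.T @ WA, WA.T @ y)
    return beta[0] + beta[1] * x_query

X = np.linspace(0.0, 3.0, 20)
y = 2.0 * X                        # exactly linear data
pred = lwr_predict(1.5, X, y, tau=0.5)   # recovers the line: pred == 3.0
```

On exactly linear data the local fit reproduces the global line regardless of bandwidth; the locality only matters once the data curve.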
Convolution Kernels on Discrete Structures
1999
"... We introduce a new method of constructing kernels on sets whose elements are discrete structures like strings, trees and graphs. The method can be applied iteratively to build a kernel on an infinite set from kernels involving generators of the set. The family of kernels generated generalizes the fa ..."
Cited by 506 (0 self)
the family of radial basis kernels. It can also be used to define kernels in the form of joint Gibbs probability distributions. Kernels can be built from hidden Markov random fields, generalized regular expressions, pair-HMMs, or ANOVA decompositions. Uses of the method lead to open problems involving
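One very simple instance of a kernel on strings, in the spirit of this line of work, is the k-spectrum kernel: the inner product of substring-count feature vectors. This toy sketch is far narrower than the paper's general iterative construction; it only illustrates what "a kernel on discrete structures" means concretely:

```python
from collections import Counter

def spectrum_kernel(s, t, k=2):
    """k-spectrum string kernel: count the length-k substrings of
    each string and return the inner product of the two count
    vectors. Positive semidefinite because it is an explicit
    feature-space dot product."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[sub] * ct[sub] for sub in cs)

k_ab = spectrum_kernel("abab", "abba")   # shared 2-grams: "ab", "ba"
k_aa = spectrum_kernel("abab", "abab")   # self-similarity is maximal here
```

Symmetry and k(s,s) >= k(s,t) on these inputs follow directly from the dot-product form.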
Estimating the Support of a High-Dimensional Distribution
1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified value between 0 and 1. We propo ..."
Cited by 783 (29 self)
propose a method to approach this problem by trying to estimate a function f which is positive on S and negative on the complement. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length
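The abstract's decision function f, positive on the estimated support S and negative outside it, can be caricatured with a kernel score thresholded at a quantile of the training scores. This is only a crude level-set sketch; the paper instead solves a regularized one-class SVM optimization, which this code does not reproduce, and `nu`, `sigma`, and the function names are illustrative:

```python
import math

def kernel_score(x, train, sigma=1.0):
    """Average RBF similarity of x to the training sample."""
    return sum(math.exp(-((x - xi) ** 2) / (2 * sigma**2))
               for xi in train) / len(train)

def support_decision(train, nu=0.2, sigma=1.0):
    """Return f with f(x) > 0 inside the estimated support and
    f(x) < 0 outside, by thresholding the kernel score at the
    nu-quantile of the scores of the training points themselves
    (so roughly a nu fraction of training points fall outside)."""
    scores = sorted(kernel_score(xi, train, sigma) for xi in train)
    rho = scores[int(nu * len(train))]
    return lambda x: kernel_score(x, train, sigma) - rho

train = [-1.0, -0.5, 0.0, 0.5, 1.0, 0.2, -0.3]
f = support_decision(train, nu=0.2)
```

A point near the data mass scores above the threshold; a far-away point scores essentially zero and lands outside S.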
Quantization Index Modulation: A Class of Provably Good Methods for Digital Watermarking and Information Embedding
 IEEE TRANS. ON INFORMATION THEORY, 1999
"... We consider the problem of embedding one signal (e.g., a digital watermark) within another "host" signal to form a third, "composite" signal. The embedding is designed to achieve efficient tradeoffs among the three conflicting goals of maximizing information-embedding rate, mini ..."
Cited by 496 (14 self)
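The scalar form of quantization index modulation can be sketched in a few lines: each embedded bit selects one of two interleaved quantization lattices (offset by half a step), and the decoder recovers the bit by asking which lattice the received sample is closer to. A minimal sketch of that scheme, with an illustrative step size `delta`:

```python
def qim_embed(x, bit, delta=1.0):
    """Embed one bit in a host sample by quantizing onto one of two
    lattices offset by delta/2 -- the quantizer choice IS the message."""
    offset = bit * delta / 2.0
    return round((x - offset) / delta) * delta + offset

def qim_extract(x, delta=1.0):
    """Decode by picking the lattice whose nearest point is closer;
    correct as long as noise stays below delta/4."""
    d0 = abs(x - qim_embed(x, 0, delta))
    d1 = abs(x - qim_embed(x, 1, delta))
    return 0 if d0 <= d1 else 1

host = [0.37, 1.92, -0.6, 3.14]
bits = [1, 0, 1, 1]
marked = [qim_embed(x, b) for x, b in zip(host, bits)]
recovered = [qim_extract(x + 0.1) for x in marked]  # noise 0.1 < delta/4
```

The step size `delta` is exactly the rate/distortion/robustness knob the abstract describes: larger steps distort the host more but tolerate more noise.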
On Language and Connectionism: Analysis of a Parallel Distributed Processing Model of Language Acquisition
 COGNITION, 1988
"... Does knowledge of language consist of mentally represented rules? Rumelhart and McClelland have described a connectionist (parallel distributed processing) model of the acquisition of the past tense in English which successfully maps many stems onto their past tense forms, both regular (walk/walked) ..."
Cited by 415 (13 self)