Results 1 – 10 of 9,636
Input Statistics and Hebbian Crosstalk Effects
, 2014
"... As an extension of prior work, we study inspecific Hebbian learning using the classical Oja model. We use a combination of analytical tools and numerical simulations to investigate how the effects of inspecificity (or synaptic “crosstalk”) depend on the input statistics. We investigated a variety o ..."
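The classical Oja model named in this entry is a Hebbian rule with a built-in decay term that keeps the weight vector normalized; a minimal sketch (learning rate, epoch count, and toy data are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_rule(inputs, eta=0.005, epochs=3):
    """Train a single linear neuron with Oja's rule.

    The weight vector converges (up to sign) to the principal
    eigenvector of the input covariance matrix, with unit norm.
    """
    w = rng.normal(size=inputs.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in inputs:
            y = w @ x                   # neuron output
            w += eta * y * (x - y * w)  # Hebbian term minus Oja decay
    return w

# Toy data whose variance is concentrated along the first axis.
X = rng.normal(size=(2000, 2)) * np.array([3.0, 0.3])
w = oja_rule(X)
# w should align (up to sign) with the dominant direction [1, 0].
```

Crosstalk studies like the one above perturb this update so that a synapse partially receives its neighbors' updates; the clean rule is the baseline they compare against.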
Additive Logistic Regression: a Statistical View of Boosting
 Annals of Statistics
, 1998
"... Boosting (Freund & Schapire 1996, Schapire & Singer 1998) is one of the most important recent developments in classification methodology. The performance of many classification algorithms can often be dramatically improved by sequentially applying them to reweighted versions of the input dat ..."
Cited by 1750 (25 self)
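The reweighting scheme the snippet describes can be sketched with decision stumps as the base classifier (a toy 1-D illustration of AdaBoost-style boosting, not the authors' statistical formulation):

```python
import numpy as np

def stump_predict(x, thresh, sign):
    """A decision stump: label +sign above the threshold, -sign below."""
    return sign * np.where(x > thresh, 1.0, -1.0)

def fit_stump(x, y, w):
    """Return the (error, threshold, sign) minimizing weighted error."""
    best = None
    for thresh in np.unique(x):
        for sign in (1.0, -1.0):
            err = np.sum(w[stump_predict(x, thresh, sign) != y])
            if best is None or err < best[0]:
                best = (err, thresh, sign)
    return best

def adaboost(x, y, rounds=10):
    n = len(x)
    w = np.full(n, 1.0 / n)          # uniform example weights
    ensemble = []
    for _ in range(rounds):
        err, thresh, sign = fit_stump(x, y, w)
        err = max(err, 1e-12)        # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(x, thresh, sign)
        w *= np.exp(-alpha * y * pred)   # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, thresh, sign))
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, t, s) for a, t, s in ensemble)
    return np.sign(score)

X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
ensemble = adaboost(X, y)
```

The paper's contribution is precisely to reinterpret this exponential reweighting as stagewise additive logistic regression.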
A block-sorting lossless data compression algorithm
, 1994
"... We describe a block-sorting, lossless data compression algorithm, and our implementation of that algorithm. We compare the performance of our implementation with widely available data compressors running on the same hardware. The algorithm works by applying a reversible transformation to a block o ..."
Cited by 809 (5 self)
"... statistical modelling techniques. The size of the input block must be large (a few kilobytes) to achieve good compression. ..."
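The reversible block transformation here is the Burrows–Wheeler transform; a naive sketch using full rotation sorting (real implementations use suffix arrays and follow the transform with move-to-front and entropy coding):

```python
def bwt(s):
    """Burrows-Wheeler transform of s, using a '\\0' sentinel
    assumed to be smaller than every character in s."""
    s += "\0"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def ibwt(last):
    """Invert the transform by repeatedly prepending and re-sorting."""
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    return next(row for row in table if row.endswith("\0"))[:-1]
```

The transformed block tends to group equal characters into runs, which is what makes the output easy for simple statistical coders to compress.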
Estimating the Support of a High-Dimensional Distribution
, 1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified value between 0 and 1. We propo ..."
Cited by 783 (29 self)
"... of the weight vector in an associated feature space. The expansion coefficients are found by solving a quadratic programming problem, which we do by carrying out sequential optimization over pairs of input patterns. We also provide a preliminary theoretical analysis of the statistical performance of our ..."
Cognitive Radio: Brain-Empowered Wireless Communications
, 2005
"... Cognitive radio is viewed as a novel approach for improving the utilization of a precious natural resource: the radio electromagnetic spectrum. The cognitive radio, built on a software-defined radio, is defined as an intelligent wireless communication system that is aware of its environment and use ..."
Cited by 1541 (4 self)
"... and uses the methodology of understanding-by-building to learn from the environment and adapt to statistical variations in the input stimuli, with two primary objectives in mind: highly reliable communication whenever and wherever needed, and efficient utilization of the radio spectrum. Following ..."
Unsupervised texture segmentation using Gabor filters
 Pattern Recognition
"... We present a texture segmentation algorithm inspired by the multichannel filtering theory for visual information processing in the early stages of the human visual system. The channels are characterized by a bank of Gabor filters that nearly uniformly covers the spatial-frequency domain. We propose a s ..."
Cited by 616 (20 self)
"... systematic filter selection scheme which is based on reconstruction of the input image from the filtered images. Texture features are obtained by subjecting each (selected) filtered image to a nonlinear transformation and computing a measure of “energy” in a window around each pixel. An unsupervised square ..."
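The energy features this snippet describes can be sketched as: filter with a Gabor kernel, pass the response through a saturating nonlinearity, and average its magnitude over a local window (the kernel size, σ, and tanh gain below are illustrative choices, not the paper's tuned values):

```python
import numpy as np

def gabor_kernel(freq, theta, sigma=2.0, size=9):
    """Even-symmetric Gabor: Gaussian envelope times a cosine
    at spatial frequency `freq` (cycles/pixel) in direction `theta`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def correlate_same(img, kern):
    """Naive same-size cross-correlation with zero padding."""
    kh, kw = kern.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.empty(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kern)
    return out

def gabor_energy(img, freq, theta, win=5, alpha=0.25):
    """Filter, saturate with tanh, then average |.| over a local window."""
    response = correlate_same(img, gabor_kernel(freq, theta))
    avg = np.ones((win, win)) / win**2
    return correlate_same(np.abs(np.tanh(alpha * response)), avg)

# Left half: vertical stripes at 0.25 cycles/pixel; right half: flat.
img = np.zeros((32, 64))
img[:, :32] = np.sin(2 * np.pi * 0.25 * np.arange(32))
feat = gabor_energy(img, freq=0.25, theta=0.0)
# The textured half carries far more "energy" than the flat half,
# so clustering these features separates the two regions.
```

Segmentation then reduces to clustering the per-pixel feature vectors produced by the selected filters.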
Analysis of Recommendation Algorithms for E-Commerce
, 2000
"... Recommender systems apply statistical and knowledge discovery techniques to the problem of making product recommendations during a live customer interaction, and they are achieving widespread success in E-Commerce nowadays. In this paper, we investigate several techniques for analyzing large-scale pu ..."
Cited by 523 (22 self)
Data Preparation for Mining World Wide Web Browsing Patterns
 Knowledge and Information Systems
, 1999
"... The World Wide Web (WWW) continues to grow at an astounding rate in both the sheer volume of traffic and the size and complexity of Web sites. The complexity of tasks such as Web site design, Web server design, and of simply navigating through a Web site has increased along with this growth. An i ..."
Cited by 567 (43 self)
"... An important input to these design tasks is the analysis of how a Web site is being used. Usage analysis includes straightforward statistics, such as page access frequency, as well as more sophisticated forms of analysis, such as finding the common traversal paths through a Web site. Web Usage Mining ..."
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
"... A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-var ..."
Cited by 469 (38 self)
"... varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory, and can ..."
Paradox lost? Firm-level evidence on the returns to information systems.
 Management Science
, 1996
"... The "productivity paradox" of information systems (IS) is that, despite enormous improvements in the underlying technology, the benefits of IS spending have not been found in aggregate output statistics. One explanation is that IS spending may lead to increases in product quality or varie ..."
Cited by 465 (23 self)