CiteSeerX
Results 1 - 10 of 6,498

Good News and Bad News: Representation Theorems and Applications

by Paul R. Milgrom - Bell Journal of Economics
Abstract - Cited by 700 (3 self)

Good Representations and Solvable Groups

by Dan Edidin, William Graham , 2000
Abstract
The purpose of this paper is to provide a characterization of solvable linear algebraic groups in terms of a geometric property of representations. Representations with a related property played an important role in the proof of the equivariant Riemann-Roch theorem [EG2]. In …

Finding a good representation...

by Peter C. -h. Cheng , 2002
Abstract
Six characteristics of effective representational systems for conceptual learning in complex domains have been identified. Such representations should: (1) integrate levels of abstraction; (2) combine globally homogeneous with locally heterogeneous representation of concepts; (3) integrate alternati …

Good Representations and Homogeneous Spaces

by M. Jablonski
Abstract
Let G be a complex reductive affine algebraic group. Let F, H be algebraic reductive subgroups. The homogeneous space G/F has a natural, transitive left action of G on it. We will consider the induced action of H on G/F. Hereafter a property of a space will be called generic if it occurs on a nonempty Zariski open set. The following theorem and its corollaries are true for both real and complex algebraic groups. Moreover, the results actually hold for real semi-algebraic groups by passing to finite index subgroups and finite covers of manifolds. We omit the details of the proofs for semi-algebraic groups.

Theorem 1. Consider the induced action of H on G/F; then generic H-orbits are closed in G/F; that is, there is a nonempty Zariski open set of G/F such that the H-orbit of any point in this open set is closed.

Corollary 2. Let G, H, F be as above. If H is normal in G, then all orbits of H are closed in G/F. Consequently, if G acts on V and the orbit Gv is closed, then Hv is also closed.

Corollary 3. Let G be a reductive algebraic group. If H, F are generic reductive subgroups, then H ∩ F is also reductive. More precisely, take any two reductive subgroups H, F of G. Then H ∩ gFg⁻¹ is reductive for generic g ∈ G.

Greed is Good: Algorithmic Results for Sparse Approximation

by Joel A. Tropp , 2004
Abstract - Cited by 916 (9 self)
This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries. It provides a sufficient condition under which both OMP and Donoho’s basis pursuit (BP) paradigm can recover the optimal representation of an exactly sparse signal. It leverages this theory to show that both OMP and BP succeed for every sparse input signal from a wide class of dictionaries. These quasi-incoherent dictionaries offer a natural generalization of incoherent dictionaries, and the cumulative coherence function …
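The greedy loop the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration of OMP over a redundant dictionary, not Tropp's implementation; the toy dictionary and variable names are my own:

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit: greedily select up to k atoms from a
    dictionary D (unit-norm columns), refitting coefficients by least squares."""
    residual = x.astype(float).copy()
    support = []
    coeffs = np.array([])
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit all selected coefficients on the support (this refit is
        # what distinguishes OMP from plain matching pursuit)
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    return support, coeffs

# Toy redundant dictionary: the standard basis plus one extra unit-norm atom.
D = np.hstack([np.eye(4), np.full((4, 1), 0.5)])
x = 3.0 * D[:, 0] - 2.0 * D[:, 2]        # exactly 2-sparse signal
support, coeffs = omp(D, x, 2)
# support == [0, 2], coeffs ≈ [3, -2], residual is zero
```

Because the coefficients are refit by least squares at every step, the residual stays orthogonal to all selected atoms, so no atom is picked twice.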

Machine Learning in Automated Text Categorization

by Fabrizio Sebastiani - ACM COMPUTING SURVEYS , 2002
Abstract - Cited by 1734 (22 self)
The automated categorization (or classification) of texts into predefined categories has witnessed a booming interest in the last ten years, due to the increased availability of documents in digital form and the ensuing need to organize them. In the research community the dominant approach to this p … definition of a classifier by domain experts) are a very good effectiveness, considerable savings in terms of expert labor power, and straightforward portability to different domains. This survey discusses the main approaches to text categorization that fall within the machine learning paradigm. …

Robot Motion Planning: A Distributed Representation Approach

by Jérôme Barraquand, Jean-Claude Latombe , 1991
Abstract - Cited by 402 (26 self)
We propose a new approach to robot path planning that consists of building and searching a graph connecting the local minima of a potential function defined over the robot’s configuration space. A planner based on this approach has been implemented. This planner is considerably faster than previous path planners and solves problems for robots with many more degrees of freedom (DOFs). The power of the planner derives both from the "good" properties of the potential function and from the efficiency of the techniques used to escape the local minima of this function. The most powerful …
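As a toy counterpart to the "good" potential functions discussed above, one can build a potential with no spurious local minima on a grid as the BFS distance to the goal, then follow it downhill. This is my own minimal sketch, not the paper's randomized planner; the grid and names are illustrative:

```python
from collections import deque

def wavefront(grid, goal):
    """Numeric potential: BFS distance from every free cell to the goal.
    On the reachable free space this potential has no local minima."""
    rows, cols = len(grid), len(grid[0])
    dist = {goal: 0}
    q = deque([goal])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (r + dr, c + dc)
            if (0 <= nb[0] < rows and 0 <= nb[1] < cols
                    and grid[nb[0]][nb[1]] == 0 and nb not in dist):
                dist[nb] = dist[(r, c)] + 1
                q.append(nb)
    return dist

def descend(dist, start):
    """Greedy gradient descent on the potential; reaches the goal because
    every free cell has a neighbour with strictly smaller potential."""
    path = [start]
    while dist[path[-1]] > 0:
        r, c = path[-1]
        path.append(min(((r + dr, c + dc)
                         for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if (r + dr, c + dc) in dist), key=dist.get))
    return path

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]          # 1 = obstacle
dist = wavefront(grid, goal=(2, 3))
path = descend(dist, start=(0, 0))
# the path reaches (2, 3) without entering an obstacle cell
```

With an arbitrary potential (e.g. sums of attractive/repulsive terms), greedy descent can get trapped, which is exactly why the paper's planner adds escape techniques; the BFS potential sidesteps the problem only because the grid is small and fully known.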

Locality-constrained linear coding for image classification

by Jinjun Wang, Jianchao Yang, Kai Yu, Fengjun Lv, Thomas Huang, Yihong Gong - IN: IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION , 2010
Abstract - Cited by 443 (20 self)
The traditional SPM approach based on bag-of-features (BoF) requires nonlinear classifiers to achieve good image classification performance. This paper presents a simple but effective coding scheme called Locality-constrained Linear Coding (LLC) in place of the VQ coding in traditional SPM. LLC util …
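The locality idea can be sketched as a constrained least-squares code over a descriptor's nearest codebook atoms. This is my own simplified approximation in the spirit of LLC, not the paper's exact formulation; the codebook, k, and regularizer are assumptions:

```python
import numpy as np

def llc_code(x, codebook, k=2):
    """Code descriptor x using only its k nearest codebook atoms,
    with the code constrained to sum to one (shift invariance)."""
    d2 = np.sum((codebook - x) ** 2, axis=1)   # squared distances to atoms
    nn = np.argsort(d2)[:k]                    # indices of k nearest atoms
    B = codebook[nn]                           # local basis, shape (k, dim)
    # solve min ||x - c^T B||^2  s.t.  sum(c) = 1
    C = (B - x) @ (B - x).T                    # local covariance
    C += 1e-8 * np.trace(C) * np.eye(k)        # small regularizer
    c = np.linalg.solve(C, np.ones(k))
    c /= c.sum()
    code = np.zeros(len(codebook))
    code[nn] = c                               # sparse code over full codebook
    return code

codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
x = np.array([0.4, 0.1])
code = llc_code(x, codebook, k=2)
# code is supported on the two nearest atoms and sums to one
```

Unlike VQ, which assigns the descriptor to a single atom, the code spreads weight over a small local neighbourhood, which is what lets linear classifiers work well downstream.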

Policy gradient methods for reinforcement learning with function approximation.

by Richard S Sutton , David McAllester , Satinder Singh , Yishay Mansour - In NIPS , 1999
Abstract - Cited by 439 (20 self)
Function approximation is essential to reinforcement learning, but the standard approach of approximating a value function and determining a policy from it has so far proven theoretically intractable. In this paper we explore an alternative approach in which the policy is explicitly repres … approximating a value function and using that to compute a deterministic policy, we approximate a stochastic policy directly using an independent function approximator with its own parameters. For example, the policy might be represented by a neural network whose input is a representation of the state, whose …
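The direct stochastic-policy update can be sketched with REINFORCE on a two-armed bandit. This is a minimal score-function illustration under assumed arm rewards, not the paper's algorithms or convergence setting:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.zeros(2)               # softmax preferences for two actions
rewards = np.array([0.2, 1.0])    # assumed deterministic arm rewards
alpha = 0.1                       # step size

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

for _ in range(2000):
    p = softmax(theta)
    a = rng.choice(2, p=p)           # sample an action from the policy
    r = rewards[a]
    grad_log = -p
    grad_log[a] += 1.0               # ∇_θ log π(a|θ) for a softmax policy
    theta += alpha * r * grad_log    # unbiased policy-gradient step

p = softmax(theta)
# the policy concentrates on the higher-reward arm
```

No value function is learned here: the gradient of expected reward is estimated directly from sampled actions, which is the core of the policy-gradient approach the abstract contrasts with value-function methods.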

Good Models and Good Representations are a Support for Learners' Risk Assessment

by Laura Martignon
Abstract
When learners have to make sense of risky situations, they can use mathematical models and representations which facilitate successful risk assessment. Based on theoretical considerations on the benefits of specific models and specific representations in such contexts, we present empirical …

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2016 The Pennsylvania State University