Results 1 - 10 of 1,453,227
Are Tensor Decomposition Solutions Unique?
, 902
"... For tensor decompositions such as HOSVD and ParaFac, the objective functions are nonconvex. This implies, theoretically, there exists a large number of local optimas: starting from different starting point, the iteratively improved solution will converge to different local solutions. This non-unique ..."
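A minimal numpy sketch of this sensitivity to initialization (a hypothetical example, not the paper's code): a rank-2 CP/PARAFAC fit by alternating least squares, run from several random seeds, may converge to factorizations with different reconstruction errors.

import numpy as np

def cp_als(X, rank, seed, n_iter=200):
    """Rank-`rank` CP/PARAFAC fit of a 3-way tensor by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each update solves a linear least-squares problem for one factor, holding the others fixed.
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
    return np.linalg.norm(X - X_hat) / np.linalg.norm(X)

X = np.random.default_rng(0).standard_normal((8, 8, 8))  # generic tensor with no exact low-rank structure
for seed in (1, 2, 3):
    print(f"seed {seed}: relative error {cp_als(X, rank=2, seed=seed):.4f}")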
The Nash Bargaining Solution in Economic Modeling
- Rand Journal of Economics, 1986
"... This article establishes the relationship between the static axiomatic theory of bargaining and the sequential strategic approach to bargaining. We consider two strategic models of alternating offers. The models differ in the source of the incentive of the bargaining parties to reach agreement: the ..."
Abstract
-
Cited by 556 (1 self)
- Add to MetaCart
: the bargainers ' time preference and the risk of breakdown of negotiation. Each of the models has a unique perfect equilibrium. When the motivation to reach agreement is made negligible, in each model the unique perfect equilibrium outcome approaches the Nash bargaining solution, with utilities that reflect
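For context, the Nash bargaining solution picks the feasible agreement that maximizes the product of the parties' gains over their disagreement payoffs. A small illustrative sketch (hypothetical utilities and disagreement points, not taken from the article): splitting a unit surplus between two bargainers.

from scipy.optimize import minimize_scalar

# Hypothetical setup: bargainer 1 gets share s of a unit surplus, bargainer 2 gets 1 - s.
d1, d2 = 0.1, 0.0             # disagreement payoffs (assumed values)
u1 = lambda s: s              # bargainer 1's utility from share s
u2 = lambda s: (1 - s)**0.5   # bargainer 2 is risk-averse (assumed functional form)

# Nash bargaining solution: maximize the Nash product (u1 - d1)(u2 - d2) over feasible splits.
res = minimize_scalar(lambda s: -(u1(s) - d1) * (u2(s) - d2), bounds=(d1, 1.0), method='bounded')
print(f"Nash bargaining share for bargainer 1: {res.x:.3f}")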
For Most Large Underdetermined Systems of Linear Equations the Minimal ℓ1-norm Solution is also the Sparsest Solution
- Comm. Pure Appl. Math, 2004
"... We consider linear equations y = Φα where y is a given vector in R n, Φ is a given n by m matrix with n < m ≤ An, and we wish to solve for α ∈ R m. We suppose that the columns of Φ are normalized to unit ℓ 2 norm 1 and we place uniform measure on such Φ. We prove the existence of ρ = ρ(A) so that ..."
Abstract
-
Cited by 560 (10 self)
- Add to MetaCart
that for large n, and for all Φ’s except a negligible fraction, the following property holds: For every y having a representation y = Φα0 by a coefficient vector α0 ∈ R m with fewer than ρ · n nonzeros, the solution α1 of the ℓ 1 minimization problem min �x�1 subject to Φα = y is unique and equal to α0
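A small illustrative sketch of the recovery problem described here (hypothetical sizes and data, using a generic LP solver rather than anything from the paper): basis pursuit, min ‖x‖1 subject to Φx = y, recast as a linear program via x = u − v with u, v ≥ 0.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 40, 100, 5                      # underdetermined system with a k-sparse ground truth (assumed sizes)
Phi = rng.standard_normal((n, m))
Phi /= np.linalg.norm(Phi, axis=0)        # normalize columns to unit ℓ2 norm
alpha0 = np.zeros(m)
alpha0[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
y = Phi @ alpha0

# Basis pursuit as an LP: write x = u - v with u, v >= 0 and minimize sum(u) + sum(v)
# subject to Phi @ (u - v) = y.
c = np.ones(2 * m)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * m), method='highs')
alpha1 = res.x[:m] - res.x[m:]
print("recovered the sparse ground truth:", np.allclose(alpha1, alpha0, atol=1e-6))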
USER’S GUIDE TO VISCOSITY SOLUTIONS OF SECOND ORDER PARTIAL DIFFERENTIAL EQUATIONS
, 1992
"... The notion of viscosity solutions of scalar fully nonlinear partial differential equations of second order provides a framework in which startling comparison and uniqueness theorems, existence theorems, and theorems about continuous dependence may now be proved by very efficient and striking argume ..."
Abstract
-
Cited by 1403 (15 self)
- Add to MetaCart
The notion of viscosity solutions of scalar fully nonlinear partial differential equations of second order provides a framework in which startling comparison and uniqueness theorems, existence theorems, and theorems about continuous dependence may now be proved by very efficient and striking
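For reference, a standard form of the definition used in this framework (one common sign convention; details and variants are in the User's Guide itself): for a degenerate elliptic equation F(x, u, Du, D²u) = 0, sub- and supersolutions are defined by testing against smooth functions.

% One common statement of the definition (sign conventions vary by source).
% An upper semicontinuous $u$ is a viscosity subsolution of $F(x,u,Du,D^2u)=0$ if,
% whenever $\varphi \in C^2$ and $u - \varphi$ has a local maximum at $x_0$,
\[
  F\bigl(x_0,\, u(x_0),\, D\varphi(x_0),\, D^2\varphi(x_0)\bigr) \le 0 .
\]
% A lower semicontinuous $u$ is a viscosity supersolution if, at local minima of $u - \varphi$,
\[
  F\bigl(x_0,\, u(x_0),\, D\varphi(x_0),\, D^2\varphi(x_0)\bigr) \ge 0 ,
\]
% and a viscosity solution is a function that is both.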
Gravity with Gravitas: a Solution to the Border Puzzle
, 2001
"... Gravity equations have been widely used to infer trade ow effects of various institutional arrangements. We show that estimated gravity equations do not have a theoretical foundation. This implies both that estimation suffers from omitted variables bias and that comparative statics analysis is unfo ..."
Abstract
-
Cited by 610 (3 self)
- Add to MetaCart
Gravity equations have been widely used to infer trade ow effects of various institutional arrangements. We show that estimated gravity equations do not have a theoretical foundation. This implies both that estimation suffers from omitted variables bias and that comparative statics analysis is unfounded. We develop a method that (i) consistently and ef ciently estimates a theoretical gravity equation and (ii) correctly calculates the comparative statics of trade frictions. We apply the method to solve the famous McCallum border puzzle. Applying our method, we nd that national borders reduce trade between industrialized countries by moderate amounts of 20–50 percent.
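For orientation, the conventional empirical gravity specification that this line of work scrutinizes is a log-linear regression of bilateral trade on the partners' incomes, distance, and a border indicator. A minimal sketch on synthetic data (illustrative only; this is the naive regression, not the authors' estimator, which adds multilateral resistance terms):

import numpy as np

rng = np.random.default_rng(0)
n_pairs = 500
log_gdp_i = rng.normal(10, 1, n_pairs)          # synthetic exporter incomes
log_gdp_j = rng.normal(10, 1, n_pairs)          # synthetic importer incomes
log_dist  = rng.normal(7, 0.5, n_pairs)         # synthetic log distances
border    = rng.integers(0, 2, n_pairs)         # 1 if the pair is separated by a national border
log_trade = 1.0 * log_gdp_i + 1.0 * log_gdp_j - 1.0 * log_dist - 0.8 * border \
            + rng.normal(0, 0.3, n_pairs)       # assumed data-generating process

# Naive log-linear gravity regression: log(trade) ~ incomes + distance + border.
X = np.column_stack([np.ones(n_pairs), log_gdp_i, log_gdp_j, log_dist, border])
beta, *_ = np.linalg.lstsq(X, log_trade, rcond=None)
print("estimated border coefficient:", round(beta[4], 3))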
Closed-form solution of absolute orientation using unit quaternions
- J. Opt. Soc. Am. A, 1987
"... Finding the relationship between two coordinate systems using pairs of measurements of the coordinates of a number of points in both systems is a classic photogrammetric task. It finds applications in stereophotogrammetry and in robotics. I present here a closed-form solution to the least-squares pr ..."
Abstract
-
Cited by 973 (4 self)
- Add to MetaCart
Finding the relationship between two coordinate systems using pairs of measurements of the coordinates of a number of points in both systems is a classic photogrammetric task. It finds applications in stereophotogrammetry and in robotics. I present here a closed-form solution to the least
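A compact numpy sketch of the closed-form idea (rotation-and-translation case only, on synthetic points; scale and the paper's full treatment are omitted): the optimal rotation comes from the dominant eigenvector of a symmetric 4×4 matrix built from the cross-covariance of the centered point sets, read off as a unit quaternion.

import numpy as np

def absolute_orientation(P, Q):
    """Closed-form rotation R and translation t minimizing ||Q - (R P + t)|| over point pairs.
    P, Q: arrays of shape (n, 3) holding corresponding points in the two coordinate systems."""
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    S = (P - p_bar).T @ (Q - q_bar)                # 3x3 cross-covariance of the centered points
    Sxx, Sxy, Sxz = S[0]; Syx, Syy, Syz = S[1]; Szx, Szy, Szz = S[2]
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx       ],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz       ],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy       ],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz ]])
    w, x, y, z = np.linalg.eigh(N)[1][:, -1]       # unit quaternion = eigenvector of the largest eigenvalue
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])
    return R, q_bar - R @ p_bar

# Quick check on synthetic data (hypothetical points).
rng = np.random.default_rng(0)
P = rng.standard_normal((10, 3))
R_true, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R_true *= np.linalg.det(R_true)                    # force a proper rotation (det = +1)
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R_est, t_est = absolute_orientation(P, Q)
print("rotation recovered:", np.allclose(R_est, R_true, atol=1e-8))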
A solution to Plato’s problem: The latent semantic analysis theory of acquisition, induction, and representation of knowledge
- PSYCHOLOGICAL REVIEW, 1997
"... How do people know as much as they do with as little information as they get? The problem takes many forms; learning vocabulary from text is an especially dramatic and convenient case for research. A new general theory of acquired similarity and knowledge representation, latent semantic analysis (LS ..."
Abstract
-
Cited by 1772 (10 self)
- Add to MetaCart
How do people know as much as they do with as little information as they get? The problem takes many forms; learning vocabulary from text is an especially dramatic and convenient case for research. A new general theory of acquired similarity and knowledge representation, latent semantic analysis (LSA), is presented and used to successfully simulate such learning and several other psycholinguistic phenomena. By inducing global knowledge indirectly from local co-occurrence data in a large body of representative text, LSA acquired knowledge about the full vocabulary of English at a comparable rate to schoolchildren. LSA uses no prior linguistic or perceptual similarity knowledge; it is based solely on a general mathematical learning method that achieves powerful inductive effects by extracting the right number of dimensions (e.g., 300) to represent objects and contexts. Relations to other theories, phenomena, and problems are sketched.
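The core computation behind LSA is a truncated singular value decomposition of a term-by-document co-occurrence matrix, after which words are compared in the reduced space. A toy sketch (tiny hypothetical corpus; real applications use large corpora and on the order of 300 dimensions):

import numpy as np

docs = ["the cat sat on the mat",
        "the dog sat on the log",
        "dogs and cats are pets",
        "stocks fell on the market"]             # tiny hypothetical corpus
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)  # term x document counts

# LSA step: a truncated SVD keeps only the k leading dimensions of the term-document matrix.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
word_vecs = U[:, :k] * s[:k]                     # word representations in the reduced space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

i, j = vocab.index("cat"), vocab.index("dog")
print("cat~dog similarity in LSA space:", round(cosine(word_vecs[i], word_vecs[j]), 3))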
Inverse Acoustic and Electromagnetic Scattering Theory, Second Edition
, 1998
"... Abstract. This paper is a survey of the inverse scattering problem for time-harmonic acoustic and electromagnetic waves at fixed frequency. We begin by a discussion of “weak scattering ” and Newton-type methods for solving the inverse scattering problem for acoustic waves, including a brief discussi ..."
Abstract
-
Cited by 1072 (45 self)
- Add to MetaCart
discussion of Tikhonov’s method for the numerical solution of ill-posed problems. We then proceed to prove a uniqueness theorem for the inverse obstacle problems for acoustic waves and the linear sampling method for reconstructing the shape of a scattering obstacle from far field data. Included in our
Symmetry and Related Properties via the Maximum Principle
, 1979
"... We prove symmetry, and some related properties, of positive solutions of second order elliptic equations. Our methods employ various forms of the maximum principle, and a device of moving parallel planes to a critical position, and then showing that the solution is symmetric about the limiting plan ..."
Abstract
-
Cited by 539 (4 self)
- Add to MetaCart
We prove symmetry, and some related properties, of positive solutions of second order elliptic equations. Our methods employ various forms of the maximum principle, and a device of moving parallel planes to a critical position, and then showing that the solution is symmetric about the limiting
A tutorial on support vector machines for pattern recognition
- Data Mining and Knowledge Discovery, 1998
"... The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and non-separable data, working through a non-trivial example in detail. We describe a mechanical analogy, and discuss when SV ..."
Abstract
-
Cited by 3319 (12 self)
- Add to MetaCart
SVM solutions are unique and when they are global. We describe how support vector training can be practically implemented, and discuss in detail the kernel mapping technique which is used to construct SVM solutions which are nonlinear in the data. We show how Support Vector machines can have very
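As a pointer to practice, the kernel mapping the tutorial describes is what an off-the-shelf SVM implementation applies when a nonlinear kernel is chosen. A short illustrative sketch (toy data and scikit-learn's SVC, not code from the tutorial):

import numpy as np
from sklearn.svm import SVC

# Toy, non-linearly-separable data: label is 1 inside a disc, 0 outside (hypothetical example).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 2))
y = (np.linalg.norm(X, axis=1) < 0.5).astype(int)

# An RBF kernel maps the data implicitly into a higher-dimensional space where a separating
# hyperplane exists; the resulting decision function is nonlinear in the original inputs.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
print("training accuracy:", clf.score(X, y))
print("support vectors per class:", clf.n_support_)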