Results 1 – 10 of 8,816
SCHUR-CONVEX FUNCTIONS AND ISOPERIMETRIC INEQUALITIES
"... In this paper, we establish some analytic inequalities for Schur-convex functions that are composed of solutions of a second-order nonlinear differential equation. We apply these analytic inequalities to obtain some geometric inequalities. ..."
Cited by 4 (0 self)
Schur-convexity of the complete elementary symmetric function
 Article ID 67624
"... We prove that the complete elementary symmetric function c_r = c_r(x) = C_n^[r](x) = ∑_{i_1+···+i_n = r} x_1^{i_1} ··· x_n^{i_n} and the function φ_r(x) = c_r(x)/c_{r−1}(x) are Schur-convex functions in R^n_+ = {(x_1, x_2, ..., x_n) : x_i > 0, i = 1, 2, ..., n}, where i_1, i_2, ..., i_n are nonnegative integers and r ∈ N = {1, 2, ...}. ..."
Cited by 7 (0 self)
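The function c_r(x) in the entry above is concrete enough to compute directly: it is the sum of all degree-r monomials in x_1, ..., x_n. The following sketch is illustrative only (brute-force enumeration, names not taken from the cited paper) and includes a numeric sanity check of Schur-convexity using the fact that (3, 1) majorizes (2, 2).

```python
from itertools import combinations_with_replacement
from math import prod

def complete_symmetric(xs, r):
    """c_r(x): sum of all degree-r monomials x_1^{i_1} ... x_n^{i_n}
    with i_1 + ... + i_n = r. Enumerated as multisets of size r."""
    if r == 0:
        return 1.0
    return sum(prod(combo) for combo in combinations_with_replacement(xs, r))

def phi(xs, r):
    """phi_r(x) = c_r(x) / c_{r-1}(x), as in the abstract above."""
    return complete_symmetric(xs, r) / complete_symmetric(xs, r - 1)

# Schur-convexity check: (3, 1) majorizes (2, 2), so both c_r and phi_r
# should be at least as large at the more "spread out" point.
# c_2(3, 1) = 9 + 3 + 1 = 13,  c_2(2, 2) = 4 + 4 + 4 = 12
assert complete_symmetric([3.0, 1.0], 2) >= complete_symmetric([2.0, 2.0], 2)
assert phi([3.0, 1.0], 2) >= phi([2.0, 2.0], 2)
```

A single numeric comparison is of course not a proof of Schur-convexity; it merely illustrates the inequality the paper establishes in general.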
On Schur-convexity of expectation of weighted sum of random variables with applications
 Journal of Inequalities in Pure and Applied Mathematics
"... We show that the expectation of a class of functions of the sum of weighted independent identically distributed positive random variables is Schur-concave with respect to the weights. Furthermore, we optimise the expectation by choosing extra weights with a sum constraint. We show that under this optimisation the expectation becomes Schur-convex with respect to the weights. Finally, we explain the connection to the ergodic capacity of some multiple-antenna wireless communication systems with and without adaptive power allocation. ..."
Cited by 2 (0 self)
Joint Tx-Rx beamforming design for multicarrier MIMO channels: a unified framework for convex optimization
 IEEE TRANS. SIGNAL PROCESSING
, 2003
"... This paper addresses the joint design of transmit and receive beamforming or linear processing (commonly termed linear precoding at the transmitter and equalization at the receiver) for multicarrier multiple-input multiple-output (MIMO) channels under a variety of design criteria. Instead of considering each design criterion in a separate way, we generalize the existing results by developing a unified framework based on considering two families of objective functions that embrace most reasonable criteria to design a communication system: Schur-concave and Schur-convex functions. Once the optimal ..."
Cited by 289 (20 self)
UPPER BOUNDS ON CERTAIN FUNCTIONALS DEFINED ON GROUPS OF LINEAR OPERATORS
"... The problem of estimating certain functionals defined on a group of linear operators generating a group induced cone (GIC) ordering is studied. A result of Berman and Plemmons [Math. Inequal. Appl., 2(1):149–152, 1998] is extended from the sum function to Schur-convex functions. It is shown ..."
MSE-BASED OPTIMIZATION OF MULTIUSER MIMO MAC WITH PARTIAL CSI
"... This paper studies the optimal transmission design for multiple-antenna multiple access channels with a linear MMSE receiver at the base station and partial channel state information at the mobiles. The performance criterion for optimization is based on a Schur-concave function that operates on the average individual MSEs of all users. The optimal beamforming matrix of each user corresponds to the eigenvector matrix of his channel correlation matrix. The remaining power allocation problem can be solved at reduced complexity. It is also straightforward to extend the results to Schur-convex functions. ..."
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
, 2002
"... Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives ..."
Cited by 582 (23 self)
Graphical models, exponential families, and variational inference
, 2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields ..."
Cited by 800 (26 self)
"... of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing ..."
Learning the Kernel Matrix with Semidefinite Programming
, 2002
"... Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive definite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space – classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied ..."
Cited by 780 (22 self)
A unified framework for optimizing linear non-regenerative multicarrier MIMO relay communication systems
 IEEE TRANS. SIGNAL PROCESS
, 2009
"... In this paper, we develop a unified framework for linear non-regenerative multicarrier multiple-input multiple-output (MIMO) relay communications in the absence of the direct source–destination link. This unified framework classifies most commonly used design objectives, such as the minimal mean-square error and the maximal mutual information, into two categories: Schur-concave and Schur-convex functions. We prove that for Schur-concave objective functions, the optimal source precoding matrix and relay amplifying matrix jointly diagonalize the source–relay–destination channel matrix and convert ..."
Cited by 93 (50 self)