Results 1 – 10 of 11,371
Statistical Analysis of Cointegrated Vectors
 Journal of Economic Dynamics and Control, 1988
Cited by 2749 (12 self)
"... We consider a nonstationary vector autoregressive process which is integrated of order 1 and generated by i.i.d. Gaussian errors. We then derive the maximum likelihood estimator of the space of cointegration vectors and the likelihood ratio test of the hypothesis that it has a given number of dimensions. Further, we test linear hypotheses about the cointegration vectors. The asymptotic distributions of these test statistics are found; the first is described by a natural multivariate version of the usual test for a unit root in an autoregressive process, and the other is a χ² test. ..."
LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares
 ACM Trans. Math. Software, 1982
Cited by 653 (21 self)
"... An iterative method is given for solving Ax ≈ b and min ‖Ax − b‖₂, where the matrix A is large and sparse. The method is based on the bidiagonalization procedure of Golub and Kahan. It is analytically equivalent to the standard method of conjugate gradients, but possesses more favorable numerical properties. Reliable stopping criteria are derived, along with estimates of standard errors for x and the condition number of A. These are used in the FORTRAN implementation of the method, subroutine LSQR. Numerical tests are described comparing LSQR with several other conjugate ..."
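As a minimal sketch of the least-squares problem this entry describes, one can call an existing LSQR implementation (here SciPy's `scipy.sparse.linalg.lsqr`, not the original FORTRAN subroutine) on a small sparse system:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsqr

# Small overdetermined sparse system: minimize ||Ax - b||_2.
A = csr_matrix(np.array([[1.0, 0.0],
                         [0.0, 2.0],
                         [1.0, 1.0]]))
x_true = np.array([3.0, -1.0])
b = A @ x_true  # consistent right-hand side, so the residual should vanish

result = lsqr(A, b, atol=1e-12, btol=1e-12)
x = result[0]      # least-squares solution
istop = result[1]  # which of LSQR's built-in stopping criteria fired
print(x, istop)
```

The `atol`/`btol` arguments correspond to the reliable stopping criteria the abstract mentions; `result` also carries estimated norms and a condition-number estimate for A.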
Loopy belief propagation for approximate inference: An empirical study
 Proceedings of Uncertainty in AI, 1999
Cited by 676 (15 self)
"... Recently, researchers have demonstrated that "loopy belief propagation" (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near-Shannon-limit performance ... the ("belief revision") version, Weiss ... For the case of networks with multiple loops, Richardson ... To summarize, what is currently known about loopy propagation is that (1) it works very well in an error-correcting-code setting and (2) there are conditions for a single-loop network for which it can be guaranteed ..."
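To ground the loopy-propagation setting described above, here is a toy sum-product loopy BP on a three-node binary cycle (a hypothetical example, not the paper's code), with exact marginals from brute-force enumeration for comparison:

```python
import itertools
import numpy as np

# Pairwise MRF on a 3-cycle of binary variables:
# p(x) ∝ Π_i phi[i][x_i] * Π_(i,j) psi[x_i][x_j]
phi = np.array([[1.0, 2.0], [1.5, 1.0], [1.0, 1.0]])  # unary potentials
psi = np.array([[1.2, 1.0], [1.0, 1.2]])              # weak attractive coupling
edges = [(0, 1), (1, 2), (2, 0)]

# msgs[(i, j)]: message from node i to node j, initialized uniform.
msgs = {(i, j): np.ones(2) for a, b in edges for i, j in [(a, b), (b, a)]}

def neighbors(i):
    return [j for (a, b) in edges for (i2, j) in [(a, b), (b, a)] if i2 == i]

for _ in range(50):  # iterate message updates to (near) convergence
    new = {}
    for (i, j) in msgs:
        prod = phi[i].copy()
        for k in neighbors(i):
            if k != j:
                prod *= msgs[(k, i)]
        m = psi.T @ prod           # m(x_j) = Σ_{x_i} psi(x_i, x_j) * prod(x_i)
        new[(i, j)] = m / m.sum()  # normalize for numerical stability
    msgs = new

# Beliefs: b_i(x_i) ∝ phi_i(x_i) * Π_{k in N(i)} m_{k→i}(x_i)
beliefs = []
for i in range(3):
    b = phi[i].copy()
    for k in neighbors(i):
        b *= msgs[(k, i)]
    beliefs.append(b / b.sum())

# Exact marginals by brute force over all 8 configurations.
exact = np.zeros((3, 2))
for x in itertools.product([0, 1], repeat=3):
    w = np.prod([phi[i][x[i]] for i in range(3)])
    w *= np.prod([psi[x[a]][x[b]] for (a, b) in edges])
    for i in range(3):
        exact[i][x[i]] += w
exact /= exact.sum(axis=1, keepdims=True)
```

With such weak couplings the loopy beliefs land close to the exact marginals; the paper's point is an empirical study of when and how well this holds on larger loopy networks.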
Self-Testing/Correcting with Applications to Numerical Problems
 1990
Cited by 361 (27 self)
"... Suppose someone gives us an extremely fast program P that we can call as a black box to compute a function f. Should we trust that P works correctly? A self-testing/correcting pair allows us to: (1) estimate the probability that P(x) ≠ f(x) when x is randomly chosen; (2) on any input x, compute ..."
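As an illustration of the self-correcting idea (a hypothetical sketch, not the paper's construction), take f(x) = a·x mod p, which satisfies f(x + y) = f(x) + f(y) mod p; a program P that is wrong on a small fraction of inputs can then be corrected at any point by randomly splitting the query:

```python
import random
from collections import Counter

p, a = 101, 7

def f(x):
    return (a * x) % p  # the true linear function

BAD = 5
def P(x):
    # A "fast but untrusted" program: correct everywhere except one input.
    return 0 if x == BAD else f(x)

def self_correct(prog, x, trials=15, seed=0):
    # Linearity gives f(x) = f(x + r) - f(r) (mod p) for random r;
    # take a majority vote so rare errors of prog are outvoted.
    rng = random.Random(seed)
    votes = Counter()
    for _ in range(trials):
        r = rng.randrange(p)
        votes[(prog((x + r) % p) - prog(r)) % p] += 1
    return votes.most_common(1)[0][0]

print(P(BAD), self_correct(P, BAD))  # P is wrong at BAD; the corrector recovers f(BAD)
```

Each vote is wrong only when a random query happens to hit the bad input, so the majority is correct with overwhelming probability, which is the sense in which the pair lets us use P without trusting it.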
χ² Tests for the Choice of the Regularization Parameter in Nonlinear Inverse Problems
"... We address discrete nonlinear inverse problems with weighted least squares and Tikhonov regularization. Regularization is a way to add more information to the problem when it is ill-posed or ill-conditioned. However, it is still an open question as to how to weight this information. The discrepancy principle considers the residual norm to determine the regularization weight or parameter, while the χ² method [J. Mead, J. Inverse Ill-Posed Probl., 16 (2008), pp. 175– ..."
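For context, a minimal numpy sketch of linear Tikhonov regularization, where the open question above is how to pick the weight λ (the discrepancy principle and the χ² method are two answers):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.1 * rng.standard_normal(20)  # noisy data

def tikhonov(A, b, lam):
    # x_lam = argmin ||Ax - b||^2 + lam^2 ||x||^2, via the normal equations
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Larger lam shrinks the solution and grows the residual; parameter-choice
# rules (discrepancy principle, chi-square test, L-curve, ...) trade these off.
for lam in (0.01, 0.1, 1.0, 10.0):
    x = tikhonov(A, b, lam)
    print(lam, np.linalg.norm(x), np.linalg.norm(A @ x - b))
```

This is the linear special case only; the paper's setting is nonlinear and weighted, but the residual-versus-solution-norm tradeoff it regulates is the same.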
CoNLL-X shared task on multilingual dependency parsing
 In Proc. of CoNLL, 2006
Cited by 344 (2 self)
"... Each year the Conference on Computational Natural Language Learning (CoNLL) features a shared task, in which participants train and test their systems on exactly the same data sets in order to better compare systems. The tenth CoNLL (CoNLL-X) saw a shared task on Multilingual Dependency Parsing. ..."
The Octagon Abstract Domain
 2007
Cited by 321 (24 self)
"... ... domain for static analysis by abstract interpretation. It extends a former numerical abstract domain based on Difference-Bound Matrices and allows us to represent invariants of the form (±x ± y ≤ c), where x and y are program variables and c is a real constant. We focus on giving an efficient representation based on Difference-Bound Matrices (O(n²) memory cost, where n is the number of variables) and graph-based algorithms for all common abstract operators (O(n³) time cost). This includes a normal-form algorithm to test equivalence of representations and a widening operator to compute the least fixpoint ..."
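A toy sketch of the DBM encoding behind these bounds (an illustrative fragment, not the paper's implementation): each variable v gets two nodes, +v and −v, constraints become weighted edges, and the O(n³) normal form is shortest-path closure.

```python
import numpy as np

INF = float('inf')
n = 3                    # program variables x0, x1, x2
N = 2 * n                # each v gets two DBM nodes: +v (node 2i) and -v (node 2i+1)
m = np.full((N, N), INF)
np.fill_diagonal(m, 0.0)

def add_diff(i, j, c, dbm):
    # encode x_i - x_j <= c, plus its coherent mirror (-x_j) - (-x_i) <= c
    dbm[2*j][2*i] = min(dbm[2*j][2*i], c)
    dbm[2*i+1][2*j+1] = min(dbm[2*i+1][2*j+1], c)

add_diff(0, 1, 1.0, m)   # x0 - x1 <= 1
add_diff(1, 2, 2.0, m)   # x1 - x2 <= 2

# Shortest-path closure (Floyd-Warshall): the O(n^3) normal form the
# snippet refers to (the full octagon closure adds one tightening step).
for k in range(N):
    for i in range(N):
        for j in range(N):
            m[i][j] = min(m[i][j], m[i][k] + m[k][j])

print(m[2*2][2*0])       # implied bound on x0 - x2
```

Closure makes implicit consequences explicit: from x0 − x1 ≤ 1 and x1 − x2 ≤ 2 the matrix now records x0 − x2 ≤ 3, which is what makes equivalence testing on the normal form possible.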
The robustness of test statistics to nonnormality and specification error in confirmatory factor analysis
 Psychological Methods, 1996
Cited by 230 (8 self)
"... Monte Carlo computer simulations were used to investigate the performance of three χ² test statistics in confirmatory factor analysis (CFA). Normal-theory maximum likelihood χ² (ML), Browne's asymptotic distribution-free χ² (ADF), and the Satorra-Bentler rescaled χ² (SB) were examined under ..."
Evolutionary Algorithms for Constrained Parameter Optimization Problems
 Evolutionary Computation, 1996
Cited by 315 (18 self)
"... Evolutionary computation techniques have received a lot of attention regarding their potential as optimization techniques for complex numerical functions. However, they have not produced a significant breakthrough in the area of nonlinear programming due to the fact that they have not addressed the ... disappointing. In this paper we (1) discuss difficulties connected with solving the general nonlinear programming problem, (2) survey several approaches which have emerged in the evolutionary computation community, and (3) provide a set of eleven interesting test cases, which may serve as a handy reference ..."
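One of the simplest approaches in this family is a static penalty method, sketched here on a made-up toy problem (not one of the paper's eleven test cases): minimize x0² + x1² subject to x0 + x1 ≥ 1, with a (1+1) evolution strategy.

```python
import random

random.seed(0)

def objective(x):
    return x[0]**2 + x[1]**2

def violation(x):
    # constraint x0 + x1 >= 1; positive value = amount of violation
    return max(0.0, 1.0 - (x[0] + x[1]))

def fitness(x, penalty=100.0):
    # static penalty method: infeasibility is added onto the objective
    return objective(x) + penalty * violation(x)**2

# (1+1) evolution strategy: keep the mutant only if it is no worse
x = [random.uniform(-5, 5), random.uniform(-5, 5)]
for _ in range(20000):
    child = [xi + random.gauss(0, 0.1) for xi in x]
    if fitness(child) <= fitness(x):
        x = child

print(x, objective(x), violation(x))
```

The search settles near the constrained optimum (0.5, 0.5); how to set the penalty weight, and whether to make it dynamic, is exactly the kind of design question the surveyed approaches differ on.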
Why do Internet services fail, and what can be done about it?
 2003
Cited by 313 (10 self)
"... In 1986 Jim Gray published his landmark study of the causes of failures of Tandem systems and the techniques Tandem used to prevent such failures [6]. Seventeen years later, Internet services have replaced fault-tolerant servers as the new kid on the 24x7-availability block. Using data from three ..."