### Table 2: Test matrices for the singular value decomposition

1997

"... In PAGE 25: ...14 Tests for the ScaLAPACK SVD routines The following tests will be performed on PDGESVD. A number of matrix "types" are specified, as denoted in Table 2. For each type of matrix, and for the minimal workspace as well as for larger-than-minimal workspace, an M-by-N matrix "A" with known singular... In PAGE 26: ....14.1 Test Matrices for the Singular Value Decomposition Routines Six different types of test matrices may be generated for the singular value decomposition routines. Table 2 shows the types available, along with the numbers used to refer to the matrix types. Except as noted, all matrix types other than the random bidiagonal matrices have O(1) entries.... ..."
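The testing idea the snippet describes, generating a matrix with known singular values and checking that an SVD routine recovers them, can be sketched in NumPy (a minimal sketch, not ScaLAPACK's actual driver; the sizes, seed, and values in `s` are illustrative choices):

```python
import numpy as np

# Sketch of the test idea (not ScaLAPACK's PDGESVD test code): build an
# M-by-N matrix A = U diag(s) V^T with prescribed singular values s,
# then check that a general SVD routine recovers them.
rng = np.random.default_rng(0)
M, N = 6, 4
s = np.array([4.0, 2.0, 1.0, 0.5])           # prescribed singular values

# Random orthonormal factors from QR of Gaussian matrices.
U, _ = np.linalg.qr(rng.standard_normal((M, M)))
V, _ = np.linalg.qr(rng.standard_normal((N, N)))
A = U[:, :N] @ np.diag(s) @ V.T

recovered = np.linalg.svd(A, compute_uv=False)   # returned in descending order
print(np.allclose(recovered, s))                  # True
```

The same construction, repeated over the matrix "types" of Table 2 (varied spectra and scalings), is what drives this style of SVD testing.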

### (Table 2) by a singular value decomposition. For the comparison of the singular

### Table 6.1 Comparison of centroid decomposition and singular value decomposition.

2002

Cited by 10

### Table 1. Singular Value Decomposition of global design matrix as given by their singular eigenvalues.


### Table 5.2 Example 5.2: Approximate solutions xj determined by truncated singular value and generalized singular value decompositions.

### Table 5: Comparison for the methods M4 (Cayley invariants) and M5 (the cross ratios) with and without the use of singular value decomposition for averaging.

1995

"... In PAGE 29: ... The structure results reported in Tables 3 and 4 for the two methods used measures based on the construction of a linear constraint system on the projective coordinates of the points of the form BX = 0, and then the solution of the system by singular value decomposition. For a different data set we show in Table 5 that computing an estimate based on the singular value decomposition is typically far more stable than if we just used a simple averaging process (computation of the mean). We have also found the singular value decomposition approach to perform better than various forms of the α-trimmed mean [23], even though these would be expected to be more capable of handling outliers.... ..."
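The SVD-based estimate the snippet prefers over simple averaging rests on a standard construction: the unit vector x minimizing ||Bx|| for a homogeneous constraint system Bx = 0 is the right singular vector of B belonging to its smallest singular value. A minimal NumPy sketch (B, x_true, and the noise level here are illustrative, not the paper's data):

```python
import numpy as np

# Minimal sketch (not the paper's code): recover a direction x from a
# noisy homogeneous constraint system B x = 0 via the SVD.
rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0, 0.5])
x_true /= np.linalg.norm(x_true)

# Build constraint rows approximately orthogonal to x_true.
B = rng.standard_normal((20, 3))
B -= np.outer(B @ x_true, x_true)            # now B @ x_true == 0 exactly
B += 1e-3 * rng.standard_normal(B.shape)     # simulated measurement noise

# The minimizer of ||B x|| with ||x|| = 1 is the right singular
# vector for the smallest singular value.
_, _, Vt = np.linalg.svd(B)
x_est = Vt[-1]
if x_est @ x_true < 0:                       # resolve the sign ambiguity
    x_est = -x_est
```

Unlike averaging individual row-wise estimates, this uses all constraints jointly, which is the stability advantage the snippet reports.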

Cited by 33


### TABLE X SINGULAR VALUE DECOMPOSITION RESULTS FROM HIGHEST CONFIDENCE AND SUPPORT QUERY PROBES. A VALUE OF 1.0 INDICATES AN EXACT MATCH.

### Table 1: Exact ideal average mean square error for the estimation of various test functions using the singular value decomposition (SVD), wavelet-vaguelette decomposition (WVD), and vaguelette-wavelet decomposition (VWD) approaches, for various levels of the signal-to-noise ratio SNR.

1998

"... In PAGE 10: ... The mother wavelets D4 and D8 were used in the wavelet-vaguelette decomposition and D5 and D9 in the vaguelette-wavelet decomposition. The various average mean square errors yielded by the application of the exact risk formulae are given in Table 1. To make a comparison with singular value decomposition, we calculated the minimal average mean square error for a truncated singular value decomposition estimator with optimally chosen cut-off point M.... In PAGE 10: ... To make a comparison with singular value decomposition, we calculated the minimal average mean square error for a truncated singular value decomposition estimator with optimally chosen cut-off point M. The results are also given in Table 1. Table 2 gives the optimal values of the thresholds in terms of σ, each found by a grid search at grid interval 0.05σ, where σ is the standard deviation of the noise.... In PAGE 10: ... 4.3 Analysis of the results Table 1 shows that the choice of whether to use wavelet-vaguelette decomposition or vaguelette-wavelet decomposition does not make a strong difference in terms of average mean square error, nor does the choice between the different wavelet functions. The largest discrepancy between the two wavelet-based methods is the improvement afforded by using vaguelette-wavelet decomposition for the HeaviSine function.... In PAGE 11: ... Thus, we can examine the convergence for different methods by studying their performance as a function of signal-to-noise ratio. Table 1 clearly indicates that the rates of convergence for the wavelet-based methods are much faster than that of the singular value decomposition approach, especially for the 'blocks' and 'Doppler' functions. The theoretical ground for this phenomenon is given in Section 5.... ..."
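The truncated SVD estimator with cut-off point M that the snippet compares against can be sketched for a generic linear inverse problem y = Kf + noise (the operator K, test signal, and noise level below are illustrative assumptions, not the paper's setup):

```python
import numpy as np

# Sketch of a truncated SVD estimator with cut-off point M for a
# generic inverse problem y = K f + noise (illustrative K, f, noise).
rng = np.random.default_rng(2)
n = 64
K = np.tril(np.ones((n, n))) / n             # discrete integration operator
f = np.sin(2 * np.pi * np.arange(n) / n)     # smooth test signal
y = K @ f + 0.001 * rng.standard_normal(n)   # noisy indirect observation

U, s, Vt = np.linalg.svd(K)

def tsvd_estimate(y, M):
    """Invert only the M largest singular components; drop the rest."""
    coeffs = (U.T @ y)[:M] / s[:M]
    return Vt[:M].T @ coeffs

# Small M truncates the signal; large M amplifies noise through the
# tiny singular values, so an intermediate cut-off does best.
mse = {M: np.mean((tsvd_estimate(y, M) - f) ** 2) for M in (2, 8, 32, 64)}
```

The "optimally chosen cut-off point M" in the snippet corresponds to minimizing this mean square error over M.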

Cited by 45

### Table 1: Degree of Legendre expansions N and number of terms in the singular value decomposition k required to approximate the operators O and S to the indicated precision. k(O) N(O) k(S) N(S) 10^-3

"... In PAGE 11: ... It remains also to determine N and k so that ‖S(·) − W_{N(k)} Σ_{N(k)} Y_{N(k)}^T P_N‖ < ε. (27) This is a rather complicated matter to handle analytically [8, 13], but straightforward to determine computationally. One can simply increase N and k until the desired level of accuracy is achieved, and we summarize the results in Table 1. The generalized FMM then proceeds as above with the following changes.... ..."
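The computational recipe in the snippet, increasing the truncation rank until the desired accuracy is reached, can be sketched for a generic matrix (the matrix A below is an illustrative stand-in for the operators O and S, built with an artificially decaying spectrum so that truncation pays off):

```python
import numpy as np

# Sketch of the "increase k until accurate" recipe from the snippet;
# A is an illustrative stand-in for the operators O and S.
rng = np.random.default_rng(3)
D = np.diag(0.5 ** np.arange(40))            # enforce a rapidly decaying spectrum
A = rng.standard_normal((40, 40)) @ D @ rng.standard_normal((40, 40))

U, s, Vt = np.linalg.svd(A)
eps = 1e-3
for k in range(1, len(s) + 1):
    A_k = (U[:, :k] * s[:k]) @ Vt[:k]        # rank-k truncated SVD
    if np.linalg.norm(A - A_k, 2) < eps:     # spectral error = next singular value
        break
# k is now the smallest truncation rank meeting the tolerance.
```

In practice one can skip forming A_k: since the spectral-norm error of the rank-k truncation equals the (k+1)-th singular value, k can be read directly off the sorted spectrum.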