### Table 5. All possible derivations from path-consistency

"... In PAGE 8: ... 3. Temporal configuration with the matrix representation of its consistency net- work Table5 shows the possible derivations from path consistency that result in crisp results. For example, the derivation of the relation between intervals I1 and I2 (r12) is achieved by the intersection of the sets of relations that result from the compositions r13; r32, r14; r42, and r15; r52.... In PAGE 8: ... Cases when the intersections of composition paths do not result in crisp results are not included in Table 5. Table5 shows that there are seven possible derivations from path consistency; however, it is not possible to eliminate all of these relations. The derivation of r12 requires the relation r14 and, vice versa, the derivation of r14 requires the relation r12.... In PAGE 8: ...elation r12. So, we cannot eliminate both relations. The same situation occurs with relations r14 and r34, r12 and r15, and r24 and r34. Note in Table5 that to derive relation r35, one just needs the composition between relations r31 and r15, since only this composition gives a crisp result. A minimal subgraph will eliminate the maximum number of relations that are derivable, while keeping the minimum number of relations that are needed to completely determine the original consistency network.... ..."

### Table 3. All possible derivations from path-consistency

### Table III. Related paths in consistency analysis

### Table 6. Runtimes (seconds) of our method and the IA model on path consistency. Columns: Model, 1st network, 2nd network, 3rd network, 4th network.

2003

Cited by 1

### Table 2: Comparison of the space and time complexity for MUSE path consistency on a MUSE CSP to path consistency on multiple CSPs representing a node splitting problem (e.g., lexical ambiguity in parsing).

### Table 5: Post-announcement revenue path consistent with the stock price decline

### Table 1: The four algorithms on three representative types of random problems: under-, over-, and middle-constrained (500 instances per type). 208 among 500 instances of (50/0.5/0.38) were found to have a path-consistent closure. These networks are then in the transition range, where much propagation is needed to find the result. 500 among 500 instances of (50/0.5/0.1), and 0 among 500 instances of (50/0.5/0.6), were found to have a path-consistent closure.

"... In PAGE 3: ... procedure PC1IA; 1 repeat 2 CHANGE False; 3 for k; i; j 1 to n do if i 6 = j 6 = k 6 = i then 4 if REVISE(i; k; j) then 5 if Rij = ; then exit \inconsistency quot;; 6 CHANGE True; 7 until not CHANGE; procedure PC-VK; 1 Q f(i; j)=i lt; jg; 2 while Q 6 = ; do 3 select and delete an arc (i; j) from Q; 4 for k 6 = i; k 6 = j do 5 if REVISE(i; j; k) then 6 if Rik = ; then exit \inconsistency quot;; 7 else Append(Q, f(i; k)g); 8 if REVISE(k; i; j) then 9 if Rkj = ; then exit \inconsistency quot;; 10 else Append(Q, f(k; j)g); Figure 1: procedures PC1IA and PC-VK. The results given Table1 lead us to a few comments on the four algorithms tested. First, it is clear that the number of table look-ups is strongly correlated to the number of times the function REVISE is called (since all the algorithms use the same function RE- VISE).... In PAGE 4: ... However, when path con- sistency is performed on very easy networks (under- constrained or over-constrained), where no propaga- 6For PC2IA it is equivalent to \Q f(i; k; j)=i lt; j; k 6 = i; k 6 = jg quot;. tion is necessary (the network is already path consis- tent or trivially inconsistent), the time needed by PC- vB and PC2IA to initialize the O(n3) set of length-2 paths can degrade their overall performance to such an extent that the time they need to achieve path consis- tency can exceed the time needed by PC1IA or PC-VK (see Table1 ). Fortunately, when path consistency is used at each step of a search procedure, this expensive initialization is performed only once at the root of the search tree, and then its cost is widely outweighed by the savings it implies during the search.... ..."

### Table 1: Optimum combination of 3 channels for each water vapor column. The total uncertainty in the excess path consists of independent contributions from the listed specific error terms (see §2 for details). Passband variations and dry fluctuations have not been included.

"... In PAGE 9: ... In practice, the optimum combination of frequencies will also depend on the relative importance of the di erent error terms. The numbers in Table1 clearly demonstrate that estimating the excess path accurately becomes more di cult as the water vapor column increases, primarily because of the reduced signal (Fig.... In PAGE 9: ... The vertical structure of the atmosphere is likely to deviate from the US Standard Atmosphere used in these calculations. To investigate this, the optimum channels listed in Table1 were used with di erent atmospheric models to calculate new values for the conversion factor C . It was found that neither the surface pressure nor the temperature pro le have much impact, but that the scale height of the water vapor distribution is more important.... In PAGE 12: ...5 GHz of bandwidth. This is not su cient to measure the values of T given in Table1 for 4 mm or 8 mm of water vapor. An integration time of 10 s would reduce the noise by a factor of 3, but it is likely that some fraction of the time would have to be spent on calibration against reference loads.... ..."

### Table 5: The tables show the number of sequences s a neutral path consists of, the number of structures n(s) found along the path, and the fraction of frequent structures. The table on the right-hand side additionally shows the characteristic number of sequences obtained by fitting, and the corresponding number of structures. (Computed for mapping sequences in Q30 A.)

1998

### Table 1: Comparison of Spike min-conflicts with other approaches on the set of 60 job-shop scheduling constraint satisfaction problems defined by Sadeh: the table lists the number of problems of each type solved by each method. The problem instances are designated by the values of the parameters for due and release date range (RG) and by the number of bottleneck resources (BK). Only Spike (column 1) found solutions to all 60 problem instances. Conflicts are counted (Spike uses an arc- and path-consistent form of the temporal constraints, and counts conflicts for inferred as well as explicit constraints). Columns (5) through (8) are the results from [36], which includes detailed references to the various algorithms, where: DSR = dynamic search rearrangement, ABT = advised backtracking, ORR = operations resource reliance, FSS = filtered survivable schedules, and SMU = Southern Methodist University heuristics. While it is not possible to compare run-time performance in any detailed way, it is interesting to note that the median times required for Spike to find solutions to these problems was only a few seconds each (on a Sparcstation 2, running Allegro Common Lisp); much faster than the characteristic times of minutes

1994

"... In PAGE 10: ... While several initial guess methods were tried, the most e ective was a variant of \most-constrained rst quot; which e ectively focused on one job at a time. The results are shown in Table1 which lists the number of problems of each type that were solved by each of a variety of methods. The Spike results are in column (1): Spike was the only method to solve all 60 problems.... ..."

Cited by 23
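The min-conflicts repair strategy that the Spike entries above refer to can be illustrated with a small, self-contained sketch. The n-queens constraint below is only a stand-in: Spike's actual conflict counts come from arc- and path-consistent temporal constraints, which are not reproduced here.

```python
import random

# Min-conflicts repair loop: start from a full (possibly conflicting)
# assignment and repeatedly move a conflicted variable to the value
# that minimizes its conflicts, with random tie-breaking.

def conflicts(cols, row, col):
    """Number of queens attacking a queen placed at (row, col)."""
    return sum(1 for r, c in enumerate(cols)
               if r != row and (c == col or abs(c - col) == abs(r - row)))

def min_conflicts(n, max_steps=100_000, seed=0):
    rng = random.Random(seed)
    cols = [rng.randrange(n) for _ in range(n)]   # initial guess
    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(cols, r, cols[r])]
        if not conflicted:
            return cols                           # solution found
        row = rng.choice(conflicted)
        # move to a value with the fewest conflicts (ties broken randomly)
        best = min(conflicts(cols, row, c) for c in range(n))
        cols[row] = rng.choice([c for c in range(n)
                                if conflicts(cols, row, c) == best])
    return None  # gave up; min-conflicts is incomplete

sol = min_conflicts(8)
assert sol is not None
assert all(conflicts(sol, r, sol[r]) == 0 for r in range(8))
```

As the table caption notes, a good initial guess matters: the quoted "most-constrained first" variant replaces the random initial assignment used here.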