### Table 6. Best-fit parameters of an absorbed power-law model for X-ray bright unidentified X-ray sources in the Cha I XMM-Newton field.

"... In PAGE 15: ... We fitted these spectra alternatively with an absorbed power law. The power-law indices lie in the range given in Table 6, which is typical for extragalactic sources (Tozzi et al. 2001). ..."
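As context for the snippet above, here is a minimal sketch of recovering a power-law photon index from binned flux data. The function name, the energy grid, and the index value 1.7 are illustrative assumptions, not values from the paper, and interstellar absorption is deliberately omitted (a real X-ray fit would fold in an absorption model, e.g. in XSPEC):

```python
import numpy as np
from scipy.optimize import curve_fit

# Photon power law: N(E) = K * E**(-gamma). Absorption is omitted
# here for simplicity; the paper fits an *absorbed* power law.
def power_law(energy, norm, gamma):
    return norm * energy ** (-gamma)

rng = np.random.default_rng(0)
energy = np.linspace(0.5, 10.0, 50)       # keV (hypothetical band)
true_norm, true_gamma = 1.0, 1.7          # hypothetical values
flux = power_law(energy, true_norm, true_gamma)
noisy = flux * (1 + 0.05 * rng.standard_normal(energy.size))

popt, _ = curve_fit(power_law, energy, noisy, p0=[1.0, 1.5])
fit_norm, fit_gamma = popt
print(f"fitted photon index: {fit_gamma:.2f}")
```

With only 5% multiplicative noise the fitted index lands close to the input value, which is the basic sanity check one would run before trusting a spectral fit.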

### Table 3. Average number of GMRES steps per Newton step for the full-potential Newton-Krylov-Schwarz solver

1994

"... In PAGE 9: ... We restart GMRES every 20 iterations. Table 3 shows convergence performance for a fixed-size problem of 128 × 128 uniform cells with a fixed number of subdomains in an 8 × 8 array as the density of the unnested uniform coarse grid varies. Key observations from this example are: (1) even a modest coarse grid makes a significant improvement in an additive Schwarz preconditioner; (2) a law of diminishing returns sets in at roughly one point per subdomain; and (3) matrix-free "matvecs" degrade convergence by as much as 15-20% in the less well-conditioned cases. ..."

Cited by 8
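The restarted-GMRES-plus-preconditioner pattern in the snippet can be sketched with SciPy. This is not the paper's solver: the 1-D Laplacian stands in for the linearized full-potential operator, and an incomplete-LU factorization plays the role of the (coarse-grid enhanced) Schwarz preconditioner, purely to show how preconditioning cuts the inner GMRES iteration count:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres, spilu, LinearOperator

# Toy stand-in operator (NOT the paper's 2-D full-potential Jacobian).
n = 200
A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Unpreconditioned GMRES, restarted every 20 iterations as in the paper.
iters = []
x, info = gmres(A, b, restart=20, maxiter=200,
                callback=lambda pr: iters.append(pr),
                callback_type="pr_norm")
print("unpreconditioned inner iterations:", len(iters))

# ILU preconditioner as a cheap surrogate for the Schwarz preconditioner.
ilu = spilu(A)
M = LinearOperator((n, n), ilu.solve)
iters_pc = []
x_pc, info_pc = gmres(A, b, M=M, restart=20, maxiter=200,
                      callback=lambda pr: iters_pc.append(pr),
                      callback_type="pr_norm")
print("preconditioned inner iterations:", len(iters_pc))
```

The qualitative behavior mirrors the snippet's observation (1): a good preconditioner reduces the inner iteration count per solve by orders of magnitude.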

### Table 1: Conditional probability table for a Proposition node viewed by the student

This relationship is represented in the conditional probability table for Proposition nodes whose content a student has viewed in the example solution (see Table 1). If the parent Rule-application node is TRUE (i.e., the student explained the corresponding derivation), the Proposition node is by definition TRUE (i.e., known by the student). Otherwise, the numbers in the table indicate that the probability of knowing the Proposition node depends on reading time: it is small (see p1 in Table 1) if the reading time is LOW, high (see p2 in Table 1) when the time is OK, and even higher (see p3) when the reading time is LONG. Some Proposition nodes in the Bayesian network may not be connected to a Read node, even after the student has viewed all the elements in the example solution. This happens when the worked-out solution omits some of the solution steps. In Figure 11, for instance, the nodes G-try-Newton-2law and G-goal-choose-body cannot have any Read node pointing to them, because these goals are not explicitly mentioned in the example solution. A student can know ...

2002

"... In PAGE 44: ... Figure 11: Segment of student model for the part of example shown to the left. ... Table 1: Conditional probability table for a Proposition node viewed by the student ..."

Cited by 59
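The conditional probability table described above can be written down directly. The values p1 < p2 < p3 below are hypothetical placeholders (the paper's actual Table 1 entries are not reproduced here); only the structure, deterministic TRUE when the parent rule was explained, otherwise a reading-time-dependent probability, comes from the text:

```python
# Hypothetical placeholder probabilities: p1 < p2 < p3.
P1, P2, P3 = 0.1, 0.6, 0.8

def p_proposition_known(rule_application_true, reading_time):
    """P(Proposition = TRUE | Rule-application, Read)."""
    if rule_application_true:
        # The student explained the derivation, so by definition
        # the proposition is known.
        return 1.0
    # Otherwise knowledge depends only on reading time.
    return {"LOW": P1, "OK": P2, "LONG": P3}[reading_time]

print(p_proposition_known(True, "LOW"))    # 1.0 regardless of time
print(p_proposition_known(False, "LONG"))  # the highest of p1..p3
```

A Proposition node with no Read parent would simply drop the reading-time branch, which is exactly the situation of G-try-Newton-2law and G-goal-choose-body in Figure 11.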

### Table 1. Benchmark Parameters

The parallelization strategy implemented in DMMD so far is a particle-decomposition algorithm. In this method particles are distributed uniformly over the processors, where they remain resident during the simulation. Each processor thus stores the coordinates, velocities, and forces of its particles without replication. In the force loop Newton's 3rd law is applied, and communication of the coordinates and forces between all processors is realized by two counter-rotating systolic loops. The performance is compared with results for the benchmark program of Steve Plimpton (PD2). However, this comparison should not be considered decisive, since some algorithmic details are implemented very differently than in DMMD. The main difference is the use of a replicated-data algorithm, i.e., each PE stores the coordinates and forces of all particles in the system. This leads to a much smaller communication overhead when small systems are simulated on many PEs, but also limits the system size to a much smaller number of particles.
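The "Newton's 3rd law" trick in the force loop means each pair of particles is visited once and the computed force is applied with opposite signs to both partners, halving the pair work. A toy serial sketch (Lennard-Jones forces are an illustrative choice; the particle-decomposition distribution and systolic communication are omitted):

```python
import numpy as np

def lennard_jones_forces(pos, epsilon=1.0, sigma=1.0):
    """Pairwise LJ forces, exploiting Newton's 3rd law."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):        # each pair counted once
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            inv6 = (sigma * sigma / d2) ** 3
            # -dV/dr expressed as a scalar factor multiplying r
            f = 24 * epsilon * (2 * inv6 * inv6 - inv6) / d2 * r
            forces[i] += f               # action ...
            forces[j] -= f               # ... and reaction
    return forces

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.2, 0.0]])
f = lennard_jones_forces(pos)
print(f.sum(axis=0))   # total force vanishes by Newton's 3rd law
```

The zero net force is the invariant that the pairwise formulation guarantees by construction, and it is what the DMMD force loop preserves while distributing the pairs across PEs.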

### Table: Damped Newton method

2006
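The entry above preserves only a table caption. As context, a textbook damped Newton iteration with simple residual-based backtracking looks as follows; this is a generic sketch, not the damping strategy of the cited work:

```python
import numpy as np

def damped_newton(f, jac, x0, tol=1e-10, max_iter=50):
    """Newton's method with step halving until the residual decreases."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            break
        step = np.linalg.solve(jac(x), -fx)
        lam = 1.0
        # Damping: halve the step until ||f|| actually decreases.
        while np.linalg.norm(f(x + lam * step)) >= np.linalg.norm(fx):
            lam *= 0.5
            if lam < 1e-8:
                break
        x = x + lam * step
    return x

# Solve x**2 = 2 from a deliberately poor starting point.
root = damped_newton(lambda x: np.array([x[0] ** 2 - 2.0]),
                     lambda x: np.array([[2.0 * x[0]]]),
                     [10.0])
print(root)
```

The damping factor protects the iteration when the full Newton step overshoots, which is the usual motivation for tabulating damped versus undamped step counts.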

### Table 2: Structural laws

1996

"... In PAGE 8: ... Table 2: Structural laws. Certain laws on processes have been recognized as having merely structural content; they are valid with respect to all different kinds of behavioral congruences, equivalences, and preorders, including strong bisimulation (the finest "reasonable" equivalence). In this paper, we use a few structural laws (indicated by a dedicated symbol) listed in Table 2 in order to simplify the presentation of some derivation sequences of transitions. Fact 2. ..."

Cited by 90
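To illustrate the flavor of such structural laws: parallel composition is typically commutative and associative with the nil process as identity, so two terms can be checked for structural congruence by flattening and sorting their parallel components. This sketch only mirrors that idea in miniature; the concrete laws of the paper's Table 2 are not reproduced:

```python
# Process terms: "0" is nil, ("par", t1, t2, ...) is parallel composition,
# any other string is an atomic process name.
def flatten(term):
    """Flatten nested parallel compositions, dropping the nil process."""
    if isinstance(term, tuple) and term[0] == "par":
        parts = []
        for sub in term[1:]:
            parts.extend(flatten(sub))
        return parts
    return [] if term == "0" else [term]

def structurally_congruent(p, q):
    # Sorting absorbs commutativity/associativity; flattening absorbs
    # the identity law P | 0 = P.
    return sorted(flatten(p)) == sorted(flatten(q))

p = ("par", "a", ("par", "b", "0"))
q = ("par", ("par", "b", "a"), "0")
print(structurally_congruent(p, q))   # True: a|(b|0) vs (b|a)|0
```

This is exactly the kind of "merely structural" rewriting that lets a derivation sequence be presented in a normalized form without changing its behavioral meaning.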

### Table 2: Structural laws

1996

"... In PAGE 10: ... Fact 2.3.2 ... is a congruence on P [Hon92, ACS98]. Structural laws: Certain laws on processes have been recognized as having merely "structural" content; they are valid with respect to all different kinds of behavioral congruences, equivalences, and preorders, including strong bisimulation (the finest "reasonable" equivalence). In this paper, we use a few structural laws (indicated by a dedicated symbol) listed in Table 2 in order to simplify the presentation of some derivation sequences of transitions in the proofs of Section 5. Fact 2. ..."

Cited by 90

### Table 2: Structural laws

1996

"... In PAGE 8: ... In reduction semantics [MS92, HY93], a structural congruence relation is adopted a priori in order to allow for simplified presentations of the operational rules. In this paper, we use the structural laws listed in Table 2 only in order to simplify the presentation of some derivation sequences of transitions. Fact 2. ..."

Cited by 90