### Table 3: Local Search Algorithms

2000

"... In PAGE 34: ...Table 3: Local Search Algorithms The average number of labels placed by our local search algorithms and the corresponding running times are reported in Table 3. The SA and DS columns refer to simulated annealing and diversified neighbourhood search, respectively, applied to a map labelling formulation.... ..."

Cited by 13
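
The snippet above names simulated annealing as one of the local search algorithms compared. As a point of reference, a generic simulated annealing skeleton for a maximization problem looks like the sketch below; the temperature, cooling rate, and step count are illustrative defaults, not parameters from the cited paper, and this is not the paper's map-labelling implementation.

```python
import math
import random

def simulated_annealing(objective, neighbor, start, t0=1.0, cooling=0.995, steps=5000):
    """Generic simulated annealing for a maximization problem.

    Worsening moves are accepted with probability exp(delta / t),
    which lets the search escape local maxima while t is still high.
    """
    current = best = start
    f_cur = f_best = objective(start)
    t = t0
    for _ in range(steps):
        cand = neighbor(current)
        f_cand = objective(cand)
        delta = f_cand - f_cur
        # Always accept improvements; accept worsenings with Boltzmann probability.
        if delta >= 0 or random.random() < math.exp(delta / t):
            current, f_cur = cand, f_cand
            if f_cur > f_best:
                best, f_best = current, f_cur
        t *= cooling  # geometric cooling schedule
    return best, f_best
```

For example, maximizing the toy objective `-(x - 3)**2` over the integers with a `x ± 1` neighbor move converges to `x = 3`.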

### Table 1: Results for Local Search

2005

"... In PAGE 13: ... The local search algorithm starts at a random binary vector and reaches a local maximum in the binary neighborhood by successively moving to the first improving neighbor found. Table 1 presents the results that were obtained, where the average and maximum objective function values obtained starting from ten random binary vectors are shown for formulations (2) and (6). The Matlab function fmincon uses a sequential quadratic programming approach for solving medium-scale constrained optimization problems.... ..."
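
The procedure described in this snippet (start from a random binary vector, repeatedly move to the first improving single-bit-flip neighbor, stop at a local maximum) can be sketched as follows; the function name and the OneMax objective in the usage note are illustrative choices, not taken from the paper.

```python
import random

def first_improvement_local_search(objective, n, seed=0):
    """First-improvement local search over binary vectors of length n.

    Starts at a random 0/1 vector and repeatedly moves to the first
    bit-flip neighbor that improves the objective, terminating at a
    local maximum of the binary (Hamming-distance-1) neighborhood.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    f_x = objective(x)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            x[i] ^= 1            # flip bit i
            f_new = objective(x)
            if f_new > f_x:      # first improving neighbor: keep the move
                f_x = f_new
                improved = True
                break
            x[i] ^= 1            # undo the flip and try the next bit
    return x, f_x
```

With the OneMax objective (`sum`), every zero bit is an improving flip, so the search always terminates at the all-ones vector.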

### Table 4: Results without local search heuristics

"... In PAGE 17: ... In order to isolate the effects of the local search heuristics LAH and ITH on heuristic LSLSH, the algorithm SubG was run without the local improvements in primal feasible solutions. The results are compiled in Table 4. The gaps increased up to 5.... ..."

Cited by 1

### Table 3: Local search for the general shop model

"... In PAGE 24: ... Again, we can state that the neighbor selection process has no great influence on the quality of the solutions; however, for first-fit the corresponding computation times are shorter. Comparing the results of Table 3 with those of the columns CONV of Table 2, we see that the tabu search approach for the general shop model again reduces the objective values significantly. For the two relevant practical instances the best found value is below or close to the (not good) lower bound of the time-lag model (which is used in practice).... ..."

Cited by 1

### Table 8.3 Local search strategies

2001

Cited by 21

### Table 1: Experimental Results for the Local Search Algorithm.

2004

"... In PAGE 10: ... Table 2: Experimental Results of the Genetic Algorithm in [12]. 4 Experimental Results Table 1 depicts the experimental results on the standard OR Library benchmarks for uncapacitated warehouse location, as well as the M* instances generated according to the scheme specified in [12]. Recall that the M* instances, which capture classes of real UWLPs [12], are very challenging for mathematical programming approaches because they have a large number of suboptimal solutions.... ..."

Cited by 17

### Table 1: Experimental Results for the Local Search Algorithm.

2004

"... In PAGE 12: ... We discuss the impact of this parameter in the next section. Table 1 depicts the experimental results on the standard OR Library benchmarks for uncapacitated warehouse location, as well as the M* instances from [21]. Recall that the M* instances, which capture classes of real UWLPs [21], are very challenging for mathematical programming approaches because they have a large number of suboptimal solutions.... In PAGE 12: ...to note that the algorithm has no prior knowledge of the optimal solution, i.e., it cannot terminate early when the optimum solution is found. As can be seen from Table 1, the algorithm is very robust. It finds optimal solutions with very high frequencies on all benchmarks.... ..."

Cited by 17