Next: 4. On Schreyer's method Up: Standard bases, syzygies and Previous: 2. The standard basis

# 3. Examples and comparisons

Let

These three series stem from our attempts to disprove Zariski's conjecture.

Examples:

Options:
The ordering of the variables is t, x, y, z, w; the computations were done in characteristic 32003.
"Lazard" stands for the algorithm "Lazard" and "Mora" for the algorithm "Standard basis" with NFMora, cf. Chapter 1.
The columns have the following meaning:

1. Lazard with MACAULAY, elimination order for the homogenizing variable (MAC)
2. Lazard with SINGULAR, elimination order for the homogenizing variable (SING)
3. Mora with degrevlex ordering (-)
4. Mora with morePairs, degrevlex ordering (mP)
5. Mora with sugar and morePairs, degrevlex ordering (su/mP)
6. Mora with sugar, sugarCrit and morePairs, degrevlex ordering (suC/mP)
7. Mora with sugar, degrevlex ordering (su)
8. Mora with fast HCtest, degrevlex ordering (fHC)
9. Mora with weighted ecartMethod and sugar, but without HCtest, degrevlex ordering (ecartM). w denotes the ecart vector; the ordering is the same as in 3. An asterisk in front of the ecart vector indicates that it was set by hand, otherwise it was set automatically by the method described in [Po].
10. Mora with lex ordering (lex)
11. Mora with degrevlex ordering applied to the standard basis computed in 10 (d(lex)).

We used

- the test version 8.6 of SINGULAR (January 1994) and
- the version 4.0 alpha of MACAULAY (15th October, 1993)

for the tests, on an HP 9000/735 under HP-UX with 80 megabytes of main memory.

The symbol "#" indicates usage of more than 120 megabytes and "-" indicates that this option makes no sense.

Times are given in seconds (to one decimal place); an entry of 0 indicates less than 0.05 seconds.

Compare the following table.

| Ex. | MAC (1) | SING (2) | - (3) | mP (4) | su/mP (5) | suC/mP (6) | su (7) | fHC (8) | ecartM (9) | w | lex (10) | d(lex) (11) |
|----|------|--------|-----|--------|--------|--------|--------|------|------|-----------------|------|-------|
| 1  | 53    | 35.7    | 27.9   | 28.2    | 27.6    | #      | 27.9   | -     | 0    | (5, 2, 2, 1)    | 0     | 0     |
| 2  | 21    | 12.9    | 419.8  | 9.9     | 9.3     | 9.5    | 289.6  | 0     | 1.0  | (95, 94, 63)    | 0.5   | 1.1   |
| 3  | 20    | 13.6    | 4.3    | 8.6     | 8.4     | 6.2    | 4.4    | 0.2   | 0.9  | (86, 67, 51)    | 0.4   | 1.18  |
| 4  | 2     | 1.6     | 6.9    | 3.5     | 3.4     | 1.4    | 6.6    | 0     | 1.4  | *(4, 3, 3)      | 0.9   | 0.4   |
| 5  | 2     | 0.8     | 2.8    | 2.8     | 2.7     | 2.5    | 2.7    | 0.5   | 1.4  | *(5, 3, 3)      | 2.1   | 1.3   |
| 6  | 12    | 7.3     | 26.5   | 4.5     | 4.5     | 4.7    | 26.9   | 0     | 1.0  | (16, 8, 5)      | 0     | 0.1   |
| 7  | 6     | 3.5     | 21.1   | 2.2     | 2.2     | 2.5    | 20.6   | 0.4   | 0    | (16, 8, 5)      | 0.1   | 1.7   |
| 8  | 3     | 1.9     | 0      | 0       | 0       | 0      | 0      | 0     | 0.1  | (60, 55, 64)    | 0.1   | 0.4   |
| 9  | 7     | 4.1     | 6.8    | 3.0     | 3.1     | 2.2    | 7.4    | 0     | 6.2  | (10, 11, 7)     | 0.2   | 0.3   |
| 10 | 2     | 1.3     | 4.3    | 4.2     | 4.2     | 2.0    | 6.3    | 0     | 3.0  | *(5, 4, 4)      | 0.3   | 0.2   |
| 11 | 1     | 0.7     | 4.9    | 4.6     | 4.4     | 2.2    | 4.9    | 0.3   | 3.5  | *(5, 4, 4)      | 0.3   | 0.6   |
| 12 | 4     | 1.2     | #      | #       | 559.5   | 0.3    | #      | -     | 0    | (20, 20, 2, 1)  | 0     | 0     |
| 13 | 237   | 84.2    | #      | 43.4    | 44.1    | 87.4   |        | 0     | 1.0  | *(4, 3)         | 0     | 0     |
| 14 | 33    | 14.2    | 2520.8 | 56.7    | 53.4    | 44.3   | 2438.7 | 0     | 7.7  | *(4, 3)         | 0.2   | 0.1   |
| 15 | 2405  | 1480.4  | 0      | 0       | 0       | 0      | 0      | -     | 0    | (47, 64, 62)    | 0     | 0     |
| 16 | 6156  | 4113.7  | #      | #       | #       | #      | #      | -     | #    | (35, 38, 61)    | 0     | 0     |
| 17 | 5     | 3.3     | 14.7   | 1.6     | 1.6     | 1.3    | 15.0   | 0     | 1.2  | (2, 3, 2)       | 0.1   | 0     |
| 18 | 4383  | 3838.4  | 0.6    | 0.1     | 0.1     | #      | 0.5    | -     | 0.3  | (32, 12, 19, 13)| 0     | #     |
| 19 | 17641 | 9190.2  | #      | 2205.2  | 2033.6  | 2253.6 | #      | 2.2   | 92.9 | *(4, 3, 2)      | 6.3   | 8.7   |
| 20 | 83965 | 50265.5 | #      | 32421.8 | 33447.3 | 8855.7 | #      | 315.0 | #    | (76, 65, 47)    | 135.9 | 816.6 |

Conclusions (for the case Loc_< K[x] = K[x]_(x)):
The above examples, and many further tests not documented here, show the following pattern:

1. The HCtest is essential for 0-dimensional ideals. SINGULAR automatically checks whether every axis contains a leading monomial, regardless of whether the ideal is 0-dimensional or not (this check is not very expensive). If so, the highest corner is computed and used as described in Chapter 2. In almost all cases this is the fastest option. Of course, it may be combined with the ecartMethod (and is in SINGULAR if one chooses the ecartMethod). Moreover, if it is known in advance that the ideal is 0-dimensional, there is the option fast HCtest (fHC), which does the following: as soon as all but one axis contain a leading monomial, the search for a leading monomial on the remaining axis is given priority. This (as for other options) is achieved by sorting the sets appropriately.

2. The ecartMethod applies to any ideal and can be extremely fast if one has a good feeling for the weights. In practice this option is most useful if the system automatically proposes an ecart vector w. In SINGULAR this is implemented by choosing w such that the sum of the weighted ecarts over all generators of the ideal (normalized in a certain way) is minimal. Other variants are under experimentation.
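A toy stand-in for this heuristic (without the normalization mentioned above, and with made-up names): the weighted ecart of a generator is the difference between the maximal w-degree of its terms and the w-degree of its leading term, and a small brute-force search picks the weight vector minimizing the total.

```python
from itertools import product

def wdeg(w, e):
    """w-degree of a monomial with exponent tuple e."""
    return sum(wi * ei for wi, ei in zip(w, e))

def weighted_ecart(w, monomials):
    """monomials: exponent tuples of one generator, leading term first.
    ecart_w(f) = max w-degree of a term of f minus w-degree of the lead."""
    degs = [wdeg(w, e) for e in monomials]
    return max(degs) - degs[0]

def choose_ecart_vector(gens, max_weight=5):
    """Brute-force search over small positive integer weight vectors for
    one minimizing the total weighted ecart over all generators."""
    n = len(gens[0][0])
    best, best_w = None, None
    for w in product(range(1, max_weight + 1), repeat=n):
        total = sum(weighted_ecart(w, g) for g in gens)
        if best is None or total < best:
            best, best_w = total, w
    return best_w, best

# generators x + y^3 and y^2 + x^3 (leading terms first)
gens = [[(1, 0), (0, 3)], [(0, 2), (3, 0)]]
print(choose_ecart_vector(gens))  # ((1, 1), 3)
```

The real implementation cannot afford an exhaustive search in many variables; this only illustrates what quantity the choice of w is meant to minimize.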

3. If the ecartMethod is not successful, it is usually recommended to use the option morePairs.

4. If the above options are not successful, one should use morePairs together with sugarCrit.

5. The sugar option is generally good and is a default option in SINGULAR.

6. Lazard's method is in general slower than the other options, although it is sometimes surprisingly fast. In general it seems to be at least a safe option (there are no # entries in its columns).

7. As the above timings show, the lex ordering is usually the fastest. Since it is not an elimination ordering in the local case, its use is very limited (e.g. the dimension is computed correctly, but not the multiplicity). But it can be used as preprocessing: we first compute a standard basis with respect to lex and then transform this standard basis into one for the desired ordering, either by linear algebra methods or simply by using it as input to another standard basis computation. The sum of the timings in columns 10 and 11 shows that this is, in general, a very good strategy. There are examples where this method, combined with the ecartMethod, is the only way to compute a standard basis for degrevlex.
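The two-step preprocessing can be tried out in the global (Buchberger) case with SymPy; this is only an analogue of the local situation discussed above, and the example ideal is made up for illustration.

```python
from sympy import groebner, symbols

x, y, z = symbols('x y z')
F = [x**2 + y*z, x*y - z**2, y**3 - x*z]  # made-up example ideal

# Step 1: compute a Groebner basis with respect to lex.
G_lex = groebner(F, x, y, z, order='lex')

# Step 2: use that basis as input for the computation with the
# target ordering; the input is already close to a basis.
G_grevlex = groebner(list(G_lex.exprs), x, y, z, order='grevlex')

# Every original generator reduces to zero modulo the new basis,
# so G_grevlex is a Groebner basis of the same ideal.
assert all(G_grevlex.reduce(f)[1] == 0 for f in F)
```

Whether the detour pays off depends on the example, exactly as the columns 10 and 11 of the table indicate.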

These are general principles which have proved useful. Moreover, in SINGULAR the options can be combined by the user. Of course, there are always special examples with different behaviour. For instance, in example 12 sugarCrit + morePairs is very good, but in example 1 it is the worst option. No option is universally the best. The standard basis algorithm is extremely sensitive to the choice of strategy, in the local case (Loc_< K[x] = K[x]_(x)) even more so than in the inhomogeneous Buchberger case. The homogeneous Buchberger case, on the contrary, is much less sensitive to the choice of strategy and is very stable. This explains why Lazard's method always succeeded, but is on average much slower. The ecartMethod (in connection with HC) seems to be by far the best. But we did not succeed in finding a good ecart vector for example 16; here lex together with d(lex) is the best (less than 0.1 seconds). The conclusion is that, at the moment, a system has to offer several strategies: a default one which is good in most cases, but also the possibility for the user to make another choice.

Moreover, for computations in characteristic zero or over algebraic extensions the growth of the coefficients has to be taken into account in the strategies.

We also tried to compare the timings with Faugère's GB, but GB offers neither the ordering used in columns 1 and 2 nor the tangent cone orderings used in the remaining columns. Testing several homogeneous examples in positive characteristic with degrevlex ordering showed, however, that SINGULAR is about 1.4 times faster than GB (on a Sparc2 and on an IBM RS/6000).
