Results 1-10 of 10
Factoring polynomials with rational coefficients
 Math. Ann.
, 1982
Abstract

Cited by 982 (11 self)
In this paper we present a polynomial-time algorithm to solve the following problem: given a nonzero polynomial f ∈ Q[X] in one variable with rational coefficients, find the decomposition of f into irreducible factors in Q[X]. It is well known that this is equivalent to factoring primitive polynomials f ∈ Z[X] into irreducible factors in Z[X]. Here we call f ∈ Z[X] primitive if the greatest common divisor of its coefficients (the content of f) is 1. Our algorithm performs well in practice, cf. [8]. Its running time, measured in bit operations, is O(n^12 + n^9 (log|f|)^3). Here f ∈ Z[X] is the polynomial to be factored, n = deg(f) is the degree of f, and |f| = (Σ_i a_i^2)^(1/2) for a polynomial Σ_i a_i X^i with real coefficients a_i. An outline of the algorithm is as follows. First we find, for a suitable small prime number p, a p-adic irreducible factor h of f, to a certain precision. This is done with Berlekamp's algorithm for factoring polynomials over small finite fields, combined with Hensel's lemma. Next we look for the irreducible factor h_0 of f in ...
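The abstract above specifies the input/output contract of the factorization problem. A minimal sketch of that contract, using SymPy's `factor_list` (a library routine, not the paper's lattice-based implementation) to factor a rational polynomial into irreducible factors over Q:

```python
# Factoring a nonzero univariate polynomial with rational
# coefficients into irreducible factors over Q -- the problem
# the paper solves in polynomial time. SymPy's factor_list is
# used here only to illustrate the input/output contract.
from sympy import symbols, factor_list

x = symbols('x')
f = x**4 - 1
content, factors = factor_list(f)
# factors is a list of (irreducible_factor, multiplicity) pairs,
# e.g. (x - 1), (x + 1), (x**2 + 1), each with multiplicity 1.
for g, mult in factors:
    print(g, mult)
```

Note that x^2 + 1 stays unfactored: it is irreducible over Q even though it splits over C.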
Algorithmic Geometry of Numbers
 Annual Review of Comp. Sci
, 1987
Abstract

Cited by 53 (0 self)
This article surveys the Algorithmic Geometry of Numbers. The fundamental basis reduction algorithm of Lovász, which first appeared in Lenstra, Lenstra, Lovász [46], was used in Lenstra's algorithm for integer programming and has since been applied in myriad contexts, starting with the factorization of polynomials (A. K. Lenstra, [45]). Classical Geometry of Numbers has a special feature in that it studies geometric properties of (convex) sets, like volume and width, which come from the realm of continuous mathematics, in relation to lattices, which are discrete objects. This makes it ideal for applications to integer programming and other discrete optimization problems, which seem inherently harder than their "continuous" counterparts like linear programming.
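The basis reduction idea the survey centers on is easiest to see in two dimensions, where it reduces to classical Lagrange-Gauss reduction; the Lovász algorithm generalizes this to n dimensions with an approximate size condition. A minimal sketch (the function name and tuple representation are illustrative choices, not from the article):

```python
def gauss_reduce(u, v):
    # Lagrange-Gauss reduction of a 2D integer lattice basis:
    # repeatedly subtract the nearest-integer multiple of the
    # shorter vector from the longer one, swapping when the new
    # vector becomes shorter -- a 2D analogue of Euclid's algorithm.
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]
    if dot(u, u) > dot(v, v):
        u, v = v, u
    while True:
        m = round(dot(u, v) / dot(u, u))
        v = (v[0] - m * u[0], v[1] - m * u[1])
        if dot(v, v) >= dot(u, u):
            return u, v  # u is now a shortest nonzero lattice vector
        u, v = v, u
```

For example, `gauss_reduce((3, 1), (5, 2))` returns a basis of the same lattice (determinant ±1 here) whose first vector has squared length 1.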
Universal Lattice Decoding: Principle and Recent Advances
 Wireless Communications and Mobile Computing
, 2003
On the Chor-Rivest Knapsack Cryptosystem
, 1991
Abstract

Cited by 8 (0 self)
Among all public-key cryptosystems that depend on the knapsack problem, the system proposed by Chor and Rivest (IEEE Trans. Inform. Theory 34 (1988), 901-909) is one of the few that have not been broken. The main difficulty in implementing their system is the computation of discrete logarithms in large finite fields. In this note we describe the "powerline system," which is a modification of the Chor-Rivest system that does not have this shortcoming. The powerline system, which is not a knapsack system, is at least as secure as the original Chor-Rivest system.
Factorization of Polynomials
 Computing, Suppl. 4
, 1982
Abstract

Cited by 6 (0 self)
Algorithms for factoring polynomials in one or more variables over various coefficient domains are discussed. Special emphasis is given to finite fields, the integers, or algebraic extensions of the rationals, and to multivariate polynomials with integral coefficients. In particular, various square-free decomposition algorithms and Hensel lifting techniques are analyzed. An attempt is made to establish a complete historic trace for today's methods. The exponential worst-case complexity of these algorithms receives attention. Appears in Computer Algebra, second edition, B. Buchberger, R. Loos, G. Collins, editors, Springer-Verlag, Vienna, Austria, pp. 9511 (1982). 1. Introduction. The problem of factoring polynomials has a long and distinguished history. D. Knuth traces the first attempts back to Isaac Newton's Arithmetica Universalis (1707) and to the astronomer Friedrich T. v. Schubert, who in 1793 presented a finite-step algorithm to compute the factors...
Removing Randomness From Computational Number Theory
, 1989
Abstract

Cited by 3 (1 self)
In recent years, many probabilistic algorithms (i.e., algorithms that can toss coins) that run in polynomial time have been discovered for problems with no known deterministic polynomial time algorithms. Perhaps the most famous example is the problem of testing large (say, 100 digit) numbers for primality. Even for problems which are known to have deterministic polynomial time algorithms, these algorithms are often not as fast as some probabilistic algorithms for the same problem. Even though probabilistic algorithms are useful in practice, we would like to know, for both theoretical and practical reasons, if randomization is really necessary to obtain the most efficient algorithms for certain problems. That is, we would like to know for which problems there is an inherent gap between the deterministic and probabilistic complexities of these problems. In this research, we consider two problems of a number theoretic nature: factoring polynomials over finite fields and constructing irred...
Maximum Likelihood Sequence Estimation From The Lattice Viewpoint
, 1991
Abstract
It is well known that the use of the Viterbi algorithm to implement a sequence estimator is an optimal way to remove the effect of intersymbol interference in digital transmission systems. However, such an implementation usually results in a very complicated receiver. In this thesis, we transform the problem of maximum likelihood sequence estimation into the problem of finding the closest lattice point. Some related lattice algorithms, such as the basis reduction algorithms and the enumeration algorithms, are analyzed and some improved versions are suggested. Then efficient algorithms for finding the nearest lattice point are derived. Based on these lattice algorithms, simple but effective sequence estimators are proposed for PAM systems and their complexities are analyzed. Under some mild assumptions, our algorithms have both polynomial space and time complexities, and are therefore much superior to the conventional Viterbi detectors. Simulation results on three different channels show that the performance of the new sequence estimators depends on the distance spectrum of the channel. But, generally speaking, the performance approaches the optimum as the size of the signal set and the signal-to-noise ratio increase. Finally, the extensions to other lattice-type modulation schemes and the impacts of the lattice viewpoint on the design of band-limited transmission systems are discussed.
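The closest-lattice-point formulation above is commonly approximated by Babai's rounding heuristic: write the received point in lattice coordinates, then round to integers. A rough sketch, assuming NumPy (the thesis itself develops exact enumeration and reduction algorithms; this heuristic only illustrates the lattice viewpoint):

```python
import numpy as np

def babai_rounding(B, y):
    # Approximate the lattice point closest to y in the lattice
    # spanned by the columns of basis matrix B: solve B z = y for
    # real coordinates z, then round each coordinate to the
    # nearest integer and map back into the lattice.
    z = np.linalg.solve(B, y)
    return B @ np.rint(z)
```

With the identity basis this recovers ordinary symbol-by-symbol rounding; the quality of the approximation for a general channel basis improves when B is first reduced.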
Factoring Integers with the Number Field Sieve
, 1992 (version of 1992-05-07)