Results 1–10 of 7,786
Discovery of Grounded Theory, 1967
"... Abstract: This paper outlines my concerns with Qualitative Data Analysis' (QDA) numerous remodelings of Grounded Theory (GT) and the subsequent eroding impact. I cite several examples of the erosion and summarize essential elements of classic GT methodology. It is hoped that the article will clarify ..."
Cited by 2637 (13 self)
The Ant System: Optimization by a colony of cooperating agents
IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS, PART B, 1996
"... An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call Ant System. We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation ..."
Cited by 1300 (46 self)
"... methodology to the classical Traveling Salesman Problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the Ant System (AS) ..."
A review of image denoising algorithms, with a new one
SIMUL, 2005
"... The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of the recently proposed methods, most algorithms have not yet attained a desirable level of applicability. All show an outstanding performance when the image model corresponds to the algorithm assumptions but fail in general and create artifacts or remove image fine structures. The main focus of this paper is, first, to define a general mathematical and experimental methodology to compare and classify classical image denoising algorithms ..."
Cited by 508 (6 self)
Decoding by Linear Programming, 2004
"... This paper considers the classical error correcting problem which is frequently discussed in coding theory. We wish to recover an input vector f ∈ R^n from corrupted measurements y = Af + e. Here, A is an m by n (coding) matrix and e is an arbitrary and unknown vector of errors. Is it possible to recover ..."
Cited by 1399 (16 self)
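The recovery question quoted above is attacked in this line of work by ℓ1 minimization of the residual. As a sketch of the standard reformulation (standard notation, not necessarily the paper's exact presentation), the ℓ1 decoder is a linear program in the unknown f with slack variables t:

```latex
% \ell_1 decoding: choose f to minimize the \ell_1 norm of the residual y - Af
\min_{f \in \mathbb{R}^n} \; \| y - Af \|_{\ell_1}
% Equivalent linear program with slacks t \in \mathbb{R}^m:
\begin{aligned}
\min_{f,\, t} \quad & \sum_{i=1}^{m} t_i \\
\text{s.t.} \quad & -t \le y - Af \le t \quad \text{(componentwise)}
\end{aligned}
```

At the optimum each t_i equals |(y - Af)_i|, so the LP objective equals the ℓ1 norm being minimized.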
The irreducibility of the space of curves of given genus
Publ. Math. IHES, 1969
"... Fix an algebraically closed field k. Let Mg be the moduli space of curves of genus g over k. The main result of this note is that Mg is irreducible for every k. Of course, whether or not Mg is irreducible depends only on the characteristic of k. When the characteristic is 0, we can assume that k = C, and then the result is classical. A simple proof appears in Enriques-Chisini [E, vol. 3, chap. 3], based on analyzing the totality of coverings of P^1 of degree n, with a fixed number d of ordinary branch points. This method has been extended to char. p by William Fulton [F], using specializations ..."
Cited by 506 (2 self)
Elephants don't play chess
Robotics and Autonomous Systems, 1990
"... Engineering and Computer Science at M.I.T. and a member of the Artificial Intelligence Laboratory where he leads the mobile robot group. He has authored two books, numerous scientific papers, and is the editor of the International Journal of Computer Vision. There is an alternative route to Artificial ..."
Cited by 403 (4 self)
"... ongoing physical interaction with the environment as the primary source of constraint on the design of intelligent systems. We show how this methodology has recently had significant successes on a par with the most successful classical efforts. We outline plausible future work along these lines which can ..."
A New Efficient Algorithm for Computing Gröbner Bases (F4)
ISSAC '02: PROCEEDINGS OF THE 2002 INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND ALGEBRAIC COMPUTATION, 2002
"... This paper introduces a new efficient algorithm for computing Gröbner bases. To avoid as much as possible intermediate computation, the algorithm computes successive truncated Gröbner bases and it replaces the classical polynomial reduction found in the Buchberger algorithm by the simultaneous reduction ..."
Cited by 365 (57 self)
Applying the Rasch model: Fundamental measurement in the human sciences, 2001
"... I guess I just grew sick and tired of the same old request after almost every presentation I made at conferences involving developmental psychologists: “Trevor, could you just give me a simple ten minute explanation of what Rasch analysis is all about?” After a dozen or so inquiries of this nature, ..."
Cited by 319 (4 self)
"... or quantitative research methodology — we realized that we could not produce the goods. That's not to deny in any way the importance of the classic Rasch texts — Wright and Stone, Andrich, Wright and Masters, and that of Georg Rasch himself. It highlighted, however, the absence of an introductory text — a text ..."
Living with CLASSIC: When and How to Use a KL-ONE-Like Language
Principles of Semantic Networks, 1991
"... CLASSIC is a recently developed knowledge representation system that follows the paradigm originally set out in the KL-ONE system: it concentrates on the definition of structured concepts, their organization into taxonomies, the creation and manipulation of individual instances of such concepts, ..."
Cited by 257 (18 self)
"... and weaknesses of CLASSIC, we consider the circumstances under which it is most appropriate to use (or not use) it. We elaborate a knowledge-engineering methodology for building KL-ONE-style knowledge bases, with emphasis on the modeling choices that arise in the process of describing a domain. We also ..."
Opportunistic Data Structures with Applications, 2000
"... In this paper we address the issue of compressing and indexing data. We devise a data structure whose space occupancy is a function of the entropy of the underlying data set. We call the data structure opportunistic since its space occupancy is decreased when the input is compressible and this space ..."
Cited by 296 (11 self)
"... entropy of T (the bound holds for any fixed k). Given an arbitrary string P[1, p], the opportunistic data structure allows to search for the occ occurrences of P in T in O(p + occ log^ε u) time (for any fixed ε > 0). If data are uncompressible we achieve the best space bound currently known [12] ..."