Results 1 – 10 of 278,305
The Vector Field Histogram – Fast Obstacle Avoidance for Mobile Robots
IEEE Journal of Robotics and Automation, 1991
"... A new realtime obstacle avoidance method for mobile robots has been developed and implemented. This method, named the vector field histogram(VFH), permits the detection of unknown obstacles and avoids collisions while simultaneously steering the mobile robot toward the target. The VFH method uses a ..."
Cited by 484 (24 self)
a two-dimensional Cartesian histogram grid as a world model. This world model is updated continuously with range data sampled by on-board range sensors. The VFH method subsequently employs a two-stage data-reduction process in order to compute the desired control commands for the vehicle
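The two-stage reduction described in this abstract can be illustrated with a toy Python sketch. This is not the paper's implementation: the sector count, cell size, certainty weighting, and density threshold below are all made-up parameters chosen for the example.

```python
import math

def polar_histogram(grid, robot_cell, cell_size=0.1, n_sectors=72):
    """Stage 1: collapse the 2-D Cartesian histogram grid into a 1-D
    polar histogram of obstacle density around the robot.

    grid maps (i, j) cell indices to certainty values."""
    hist = [0.0] * n_sectors
    rx, ry = robot_cell
    for (i, j), certainty in grid.items():
        dx, dy = (i - rx) * cell_size, (j - ry) * cell_size
        dist = math.hypot(dx, dy)
        if certainty == 0 or dist == 0:
            continue
        angle = math.degrees(math.atan2(dy, dx)) % 360
        sector = int(angle * n_sectors / 360) % n_sectors
        # Nearer, more certain cells contribute more obstacle density.
        hist[sector] += certainty ** 2 / dist
    return hist

def steer(hist, target_sector, threshold=1.0):
    """Stage 2: among sectors whose density is below the threshold,
    pick the one closest to the target direction."""
    n = len(hist)
    free = [s for s in range(n) if hist[s] < threshold]
    if not free:
        return None  # completely blocked
    return min(free, key=lambda s: min((s - target_sector) % n,
                                       (target_sector - s) % n))
```

With an unobstructed histogram, `steer` simply returns the target sector; with an obstacle dead ahead, it returns an adjacent free sector.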
Stable Distributions, Pseudorandom Generators, Embeddings and Data Stream Computation
, 2000
"... In this paper we show several results obtained by combining the use of stable distributions with pseudorandom generators for bounded space. In particular: ffl we show how to maintain (using only O(log n=ffl 2 ) words of storage) a sketch C(p) of a point p 2 l n 1 under dynamic updates of its coo ..."
Cited by 324 (13 self)
(n) is much smaller than n; to our knowledge this is the first dimensionality reduction lemma for the l_1 norm • we give an explicit embedding of l_2^n into l_1^{n^{O(log n)}} with distortion (1 + 1/n^{Θ(1)}) and a non-constructive embedding of l_2^n into l_1^{O(n)} with distortion (1 + ε
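The stable-distribution sketch mentioned in the snippet can be illustrated for the l_1 norm: the Cauchy distribution is 1-stable, so each inner product ⟨a, p⟩ with a Cauchy vector a is itself Cauchy with scale ‖p‖_1, and the median of the absolute sketch entries estimates that scale. A rough Python sketch under those assumptions (the sketch size k and the seed are arbitrary choices for the example):

```python
import math
import random
import statistics

def cauchy_vectors(dim, k, seed=0):
    """k random projection vectors with i.i.d. standard Cauchy entries.
    A standard Cauchy variate is tan(pi * (U - 1/2)) for uniform U."""
    rng = random.Random(seed)
    return [[math.tan(math.pi * (rng.random() - 0.5)) for _ in range(dim)]
            for _ in range(k)]

def update(state, vectors, coord, delta):
    """Stream update p[coord] += delta: the sketch is linear, so each
    entry is adjusted by the matching projection coefficient."""
    for s, a in enumerate(vectors):
        state[s] += a[coord] * delta

def estimate_l1(state):
    """The median of |Cauchy(0, g)| equals g, so the sample median of
    the absolute sketch entries estimates ||p||_1."""
    return statistics.median(abs(x) for x in state)
```

Note the storage is k numbers regardless of how many coordinate updates arrive, which is the point of the sketch.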
Dimensionality Reduction for Fast Similarity Search in Large Time Series Databases
, 2000
"... The problem of similarity search in large time series databases has attracted much attention recently. It is a nontrivial problem because of the inherent high dimensionality of the data. The most promising solutions involve first performing dimensionality reduction on the data, and then indexing th ..."
Cited by 240 (21 self)
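The snippet does not name the paper's specific reduction, so as one common example of the reduce-then-index strategy for time series, here is a minimal segment-mean (piecewise aggregate) sketch. The key property for indexing is that the scaled distance in the reduced space lower-bounds the true Euclidean distance, so pruning on the index never causes false dismissals.

```python
import math

def paa(series, n_segments):
    """Reduce a time series to the means of n_segments equal slices."""
    n = len(series)
    out = []
    for i in range(n_segments):
        lo, hi = i * n // n_segments, (i + 1) * n // n_segments
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

def paa_dist(a, b, seg_len):
    """Distance between reduced series, scaled by the segment length so
    that it lower-bounds the Euclidean distance on the raw series."""
    return math.sqrt(seg_len * sum((x - y) ** 2 for x, y in zip(a, b)))
```

A query is answered by ranking candidates with `paa_dist` on the short reduced vectors and computing the exact distance only for the survivors.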
Isoperimetric Problems for Convex Bodies and a Localization Lemma
, 1995
"... We study the smallest number /(K) such that a given convex body K in IR n can be cut into two parts K 1 and K 2 by a surface with an (n \Gamma 1)dimensional measure /(K)vol(K 1 ) \Delta vol(K 2 )=vol(K). Let M 1 (K) be the average distance of a point of K from its center of gravity. We prove for ..."
Cited by 133 (9 self)
for the "isoperimetric coefficient" that /(K) ln 2 M 1 (K) ; and give other upper and lower bounds. We conjecture that our upper bound is best possible up to a constant. Our main tool is a general "Localization Lemma" that reduces integral inequalities over the ndimensional space
Dimensional reduction of . . .
, 2008
"... (Anti)selfdual instantons are only defined for gauge theories on fourdimensional base spaces. The notion of a gauge instanton can be generalized to dimensions greater than four in the presence of additional geometric structure like special holonomy. Conversely, one can consider the dimensional ..."
of the generalization scheme for instanton equations, following the original work of Reyes Carrión, the dimensional reduction of Spin(7)-instanton equations is carried out in two instances: First, on the product space Z × R with Z being a G2-manifold, connections in temporal gauge are considered. Second, for a K3
Local Dimensionality Reduction: A New Approach to Indexing High Dimensional Spaces
, 2000
"... Many emerging application domains require database systems to support efficient access over highly multidimensional datasets. The current stateoftheart technique to indexing high dimensional data is to first reduce the dimensionality of the data using Principal Component Analysis and then in ..."
Cited by 119 (2 self)
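The reduce-then-index step this abstract starts from can be sketched with a single global PCA (the local, cluster-wise variant the title refers to is more elaborate). As an illustration, the leading principal component is found by power iteration on the covariance matrix, a stand-in for a full eigendecomposition; the iteration count and seed are arbitrary.

```python
import random

def top_component(data, iters=200, seed=0):
    """Leading principal component of a list of equal-length tuples,
    via power iteration on the (unnormalised) covariance matrix."""
    dim = len(data[0])
    mean = [sum(col) / len(data) for col in zip(*data)]
    centered = [[x - m for x, m in zip(row, mean)] for row in data]
    rng = random.Random(seed)
    v = [rng.random() for _ in range(dim)]  # random start vector
    for _ in range(iters):
        # w = C v, accumulated row by row as (row . v) * row
        w = [0.0] * dim
        for row in centered:
            proj = sum(r * vi for r, vi in zip(row, v))
            for i, r in enumerate(row):
                w[i] += proj * r
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v

def reduce_1d(point, mean, v):
    """Index key: the point's coordinate along the leading component."""
    return sum((p - m) * vi for p, m, vi in zip(point, mean, v))
```

A one-dimensional (or low-dimensional) index such as a B-tree over `reduce_1d` keys then replaces a full high-dimensional index.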
Stable Conjugacy: Definitions and Lemmas*
"... The purpose of the present note is to introduce some notions useful for applications of the trace formula to the study of the principle of functoriality, including base change, and to the study of zetafunctions of Shimura varieties. In order to avoid disconcerting technical digressions I shall work ..."
work with reductive groups over fields of characteristic zero, but the second assumption is only a matter of convenience, for the problems caused by inseparability are not serious. The difficulties with which the trace formula confronts us are manifold. Most of them arise from the non
A combinatorial consistency lemma . . .
, 1999
"... The current proof of the PCP Theorem (i.e., N P = PCP(log, O(1))) is very complicated. One source of difficulty is the technically involved analysis of lowdegree tests. Here, we refer to the difficulty of obtaining strong results regarding lowdegree tests; namely, results of the type obtained and ..."
simpler analysis of low-degree tests (which yields weaker bounds). In other words, we replace the strong algebraic analysis of low-degree tests presented by Arora and Safra and Arora et al. by a combinatorial lemma (which does not refer to low-degree tests or polynomials).