CiteSeerX search results 11 - 20 of 2,221,751

String theory and noncommutative geometry

by Nathan Seiberg, Edward Witten - JHEP , 1999
"... We extend earlier ideas about the appearance of noncommutative geometry in string theory with a nonzero B-field. We identify a limit in which the entire string dynamics is described by a minimally coupled (supersymmetric) gauge theory on a noncommutative space, and discuss the corrections away from ..."
Abstract - Cited by 801 (8 self) - Add to MetaCart
We extend earlier ideas about the appearance of noncommutative geometry in string theory with a nonzero B-field. We identify a limit in which the entire string dynamics is described by a minimally coupled (supersymmetric) gauge theory on a noncommutative space, and discuss the corrections away from
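A note on the key structure here, since "noncommutative space" is doing the work: in this setting the coordinates themselves fail to commute, and ordinary products of fields are replaced by the Moyal star product. A minimal sketch of the standard definitions (a reader's gloss, not a quote from the paper):

    [x^i, x^j] = i\,\theta^{ij}

    (f \star g)(x) = \exp\!\Big(\tfrac{i}{2}\,\theta^{ij}\,\partial_{\xi^i}\,\partial_{\zeta^j}\Big)\, f(x+\xi)\, g(x+\zeta)\,\Big|_{\xi=\zeta=0}
                   = f g + \tfrac{i}{2}\,\theta^{ij}\,\partial_i f\,\partial_j g + O(\theta^2)

The gauge theory in the abstract is an ordinary gauge theory with all products replaced by the star product; theta is determined by the B-field in the limit the authors identify.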

The cascade-correlation learning architecture

by Scott E. Fahlman, Christian Lebiere - Advances in Neural Information Processing Systems 2 , 1990
"... Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creatin ..."
Abstract - Cited by 796 (6 self) - Add to MetaCart
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one
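For readers skimming: the algorithm's distinguishing step is how each new hidden unit is trained before being frozen into the network. A sketch of the candidate objective as described in the paper (notation reconstructed here, so treat as a summary): the candidate maximizes the magnitude of the covariance between its output V and the residual network error E at each output o, summed over training patterns p:

    S = \sum_{o} \Big| \sum_{p} (V_p - \bar V)\,(E_{p,o} - \bar E_o) \Big|

When S stops improving, the candidate's input weights are frozen and it becomes a permanent feature detector in the growing network.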

Consistent hashing and random trees: Distributed caching protocols for relieving hot spots on the World Wide Web

by David Karger, Eric Lehman, Tom Leighton, Matthew Levine, Daniel Lewin, Rina Panigrahy - IN PROC. 29TH ACM SYMPOSIUM ON THEORY OF COMPUTING (STOC , 1997
"... We describe a family of caching protocols for distrib-uted networks that can be used to decrease or eliminate the occurrence of hot spots in the network. Our protocols are particularly designed for use with very large networks such as the Internet, where delays caused by hot spots can be severe, and ..."
Abstract - Cited by 701 (11 self) - Add to MetaCart
of existing resources, and scale gracefully as the network grows. Our caching protocols are based on a special kind of hashing that we call consistent hashing. Roughly speaking, a consistent hash function is one which changes minimally as the range of the function changes. Through the development of good
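The "changes minimally" property is concrete enough to sketch in a few lines. The following Python hash ring is illustrative rather than the paper's exact construction (class and server names are invented here): keys and servers hash onto a circle, each key is owned by the first server clockwise from it, and adding or removing one server only remaps the keys in that server's arc.

    import bisect
    import hashlib

    def _h(s: str) -> int:
        # Stable hash onto a large integer ring.
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    class HashRing:
        def __init__(self, servers, replicas=100):
            # Each server gets many virtual points, smoothing the load.
            self._ring = sorted(
                (_h(f"{srv}#{i}"), srv) for srv in servers for i in range(replicas)
            )
            self._points = [p for p, _ in self._ring]

        def owner(self, key: str) -> str:
            # First ring point clockwise from the key's hash (wrapping around).
            i = bisect.bisect(self._points, _h(key)) % len(self._points)
            return self._ring[i][1]

    ring = HashRing(["cache-a", "cache-b", "cache-c"])
    print(ring.owner("/some/url"))

Removing "cache-b" leaves every key owned by "cache-a" or "cache-c" untouched, which is exactly the minimal-change property the abstract describes.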

Some Evidence on the Importance of Sticky Prices

by Mark Bils, Peter J. Klenow - JOURNAL OF POLITICAL ECONOMY , 2004
"... We examine the frequency of price changes for 350 categories of goods and services covering about 70 % of consumer spending, based on unpublished data from the BLS for 1995 to 1997. Compared with previous studies we find much more frequent price changes, with half of goods' prices lasting less ..."
Abstract - Cited by 734 (15 self) - Add to MetaCart
We examine the frequency of price changes for 350 categories of goods and services covering about 70 % of consumer spending, based on unpublished data from the BLS for 1995 to 1997. Compared with previous studies we find much more frequent price changes, with half of goods' prices lasting less
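For readers outside this literature, the link between "frequency of price changes" and price stickiness is a one-line calculation. Assuming a constant per-month probability f that a price changes (the standard constant-hazard assumption; the numbers below are illustrative, not from the paper), the implied price duration in months is

    d = -\frac{1}{\ln(1 - f)}, \qquad \text{e.g. } f = 0.21 \;\Rightarrow\; d \approx 4.2

so more frequent price changes translate directly into shorter implied durations, which is why measured frequency is the headline statistic here.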

Detection and Tracking of Point Features

by Carlo Tomasi, Takeo Kanade - International Journal of Computer Vision , 1991
"... The factorization method described in this series of reports requires an algorithm to track the motion of features in an image stream. Given the small inter-frame displacement made possible by the factorization approach, the best tracking method turns out to be the one proposed by Lucas and Kanade i ..."
Abstract - Cited by 622 (2 self) - Add to MetaCart
in 1981. The method defines the measure of match between fixed-size feature windows in the past and current frame as the sum of squared intensity differences over the windows. The displacement is then defined as the one that minimizes this sum. For small motions, a linearization of the image intensities
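The abstract's last sentence is the crux, so here is the standard derivation it points at (a sketch in common notation, not a quote from the report). With feature window W, previous frame I, current frame J, and displacement d, the match error is

    \epsilon(\mathbf d) = \sum_{\mathbf x \in W} \big[ J(\mathbf x + \mathbf d) - I(\mathbf x) \big]^2 .

Linearizing J(x + d) \approx J(x) + g(x)^\top d with g = \nabla J and setting the derivative to zero yields a 2x2 linear system for the displacement:

    \Big( \sum_{W} \mathbf g\,\mathbf g^{\top} \Big)\, \mathbf d = \sum_{W} \big[ I(\mathbf x) - J(\mathbf x) \big]\, \mathbf g ,

which is solved iteratively when the motion exceeds what one linearization step can capture.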

Electronic Markets and Electronic Hierarchies

by Robert I. Benjamin, Thomas W. Malone, Joanne Yates - Communications of the ACM , 1987
"... This paper analyzes the fundamental changes in market structures that may result from the increasing use of information technology. First, an analytic framework is presented and its usefulness is demonstrated in explaining several major historical changes in American business structures. Then, the f ..."
Abstract - Cited by 684 (11 self) - Add to MetaCart
This paper analyzes the fundamental changes in market structures that may result from the increasing use of information technology. First, an analytic framework is presented and its usefulness is demonstrated in explaining several major historical changes in American business structures. Then

The modern industrial revolution, exit, and the failure of internal control systems

by Michael C. Jensen - JOURNAL OF FINANCE , 1993
"... Since 1973 technological, political, regulatory, and economic forces have been changing the worldwide economy in a fashion comparable to the changes experienced during the nineteenth century Industrial Revolution. As in the nineteenth century, we are experiencing declining costs, increaing average ( ..."
Abstract - Cited by 932 (7 self) - Add to MetaCart
Since 1973 technological, political, regulatory, and economic forces have been changing the worldwide economy in a fashion comparable to the changes experienced during the nineteenth century Industrial Revolution. As in the nineteenth century, we are experiencing declining costs, increaing average

A Bayesian Framework for the Analysis of Microarray Expression Data: Regularized t-Test and Statistical Inferences of Gene Changes

by Pierre Baldi, Anthony D. Long - Bioinformatics , 2001
"... Motivation: DNA microarrays are now capable of providing genome-wide patterns of gene expression across many different conditions. The first level of analysis of these patterns requires determining whether observed differences in expression are significant or not. Current methods are unsatisfactory ..."
Abstract - Cited by 485 (6 self) - Add to MetaCart
Motivation: DNA microarrays are now capable of providing genome-wide patterns of gene expression across many different conditions. The first level of analysis of these patterns requires determining whether observed differences in expression are significant or not. Current methods are unsatisfactory due to the lack of a systematic framework that can accommodate noise, variability, and low replication often typical of microarray data. Results: We develop a Bayesian probabilistic framework for microarray data analysis. At the simplest level, we model log-expression values by independent normal distributions, parameterized by corresponding means and variances with hierarchical prior distributions. We derive point estimates for both parameters and hyperparameters, and regularized expressions for the variance of each gene by combining the empirical variance with a local background variance associated with neighboring genes. An additional hyperparameter, inversely related to the number of empirical observations, determines the strength of the background variance. Simulations show that these point estimates, combined with a t-test, provide a systematic inference approach that compares favorably with simple t-test or fold methods, and partly compensate for the lack of replication. Availability: The approach is implemented in a software called Cyber-T accessible through a Web interface at www.genomics.uci.edu/software.html. The code is available as Open Source and is written in the freely available statistical language R. and Department of Biological Chemistry, College of Medicine, University of California, Irvine. To whom all correspondence should be addressed. Contact: pfbaldi@ics.uci.edu, tdlong@uci.edu. 1
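The regularized variance is the part that is easy to get wrong, so a small sketch may help. The function below is illustrative (the names, the default nu0, and the degrees-of-freedom choice are assumptions of this sketch, not the authors' exact implementation, which is the R package Cyber-T): each group's empirical variance is blended with a background variance sigma0_sq estimated from similarly expressed genes, with nu0 acting as a prior pseudo-count.

    import numpy as np
    from scipy import stats

    def regularized_t(x_ctrl, x_treat, sigma0_sq, nu0=10):
        # Blend the empirical variance s^2 (n - 1 df) with the background
        # variance sigma0_sq, weighted by the prior pseudo-count nu0.
        def reg_var(x):
            n, s_sq = len(x), np.var(x, ddof=1)
            return (nu0 * sigma0_sq + (n - 1) * s_sq) / (nu0 + n - 2)

        n1, n2 = len(x_ctrl), len(x_treat)
        se = np.sqrt(reg_var(x_ctrl) / n1 + reg_var(x_treat) / n2)
        t = (np.mean(x_treat) - np.mean(x_ctrl)) / se
        df = n1 + n2 - 2  # simple choice for this sketch
        return t, 2 * stats.t.sf(abs(t), df)

    rng = np.random.default_rng(0)
    ctrl = rng.normal(8.0, 0.5, size=3)    # 3 replicates of log-expression
    treat = rng.normal(9.0, 0.5, size=3)
    print(regularized_t(ctrl, treat, sigma0_sq=0.25))

With only three replicates the raw sample variance is very noisy; the nu0-weighted blend is what "partly compensates for the lack of replication" in the abstract.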

Versatile Low Power Media Access for Wireless Sensor Networks

by Joseph Polastre, Jason Hill, David Culler , 2004
"... We propose B-MAC, a carrier sense media access protocol for wireless sensor networks that provides a flexible interface to obtain ultra low power operation, effective collision avoidance, and high channel utilization. To achieve low power operation, B-MAC employs an adaptive preamble sampling scheme ..."
Abstract - Cited by 1077 (19 self) - Add to MetaCart
scheme to reduce duty cycle and minimize idle listening. B-MAC supports on-the-fly reconfiguration and provides bidirectional interfaces for system services to optimize performance, whether it be for throughput, latency, or power conservation. We build an analytical model of a class of sensor network
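The duty-cycle arithmetic behind preamble sampling is simple enough to sketch (the numbers below are made up for illustration; B-MAC exposes such parameters as reconfigurable). The receiver wakes briefly every check interval to sample the channel, and a sender prepends a preamble at least as long as that interval, so any wakeup that overlaps a transmission detects it.

    def rx_idle_duty_cycle(sample_s: float, check_interval_s: float) -> float:
        # Fraction of time the radio is on while no traffic is present.
        return sample_s / check_interval_s

    check_interval = 0.100          # wake every 100 ms
    channel_sample = 0.002          # 2 ms channel sample per wakeup
    min_preamble = check_interval   # preamble must span a full check interval
    print(f"idle duty cycle ~ {rx_idle_duty_cycle(channel_sample, check_interval):.1%}")

The trade-off is visible even in this toy version: lengthening the check interval lowers the idle duty cycle but forces longer preambles, shifting energy cost from receivers to senders and adding latency.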

Nonrigid registration using free-form deformations: Application to breast MR images

by D. Rueckert, L. I. Sonoda, C. Hayes, D. L. G. Hill, M. O. Leach, D. J. Hawkes - IEEE Transactions on Medical Imaging , 1999
"... Abstract — In this paper we present a new approach for the nonrigid registration of contrast-enhanced breast MRI. A hierarchical transformation model of the motion of the breast has been developed. The global motion of the breast is modeled by an affine transformation while the local breast motion i ..."
Abstract - Cited by 685 (36 self) - Add to MetaCart
is described by a free-form deformation (FFD) based on B-splines. Normalized mutual information is used as a voxel-based similarity measure which is insensitive to intensity changes as a result of the contrast enhancement. Registration is achieved by minimizing a cost function, which represents a combination
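For readers who want the shape of the method: the local deformation in this family of approaches is a cubic B-spline FFD over a lattice of control points phi, and the cost combines the similarity term with a smoothness penalty. A sketch in the standard notation (the weight lambda and the exact smoothness term vary by formulation):

    T_{local}(x, y, z) = \sum_{l=0}^{3} \sum_{m=0}^{3} \sum_{n=0}^{3}
        B_l(u)\, B_m(v)\, B_n(w)\; \phi_{i+l,\, j+m,\, k+n}

    C = -C_{similarity}\big(\mathrm{NMI}(I_{ref},\, T(I_{float}))\big) + \lambda\, C_{smooth}

where B_0 through B_3 are the cubic B-spline basis functions and (u, v, w) are the coordinates of the point relative to its enclosing lattice cell.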