Testing Continuous-Time Models of the Spot Interest Rate
 Review of Financial Studies
, 1996
Abstract

Cited by 310 (9 self)
Different continuous-time models for interest rates coexist in the literature. We test parametric models by comparing their implied parametric density to the same density estimated nonparametrically. We do not replace the continuous-time model by discrete approximations, even though the data are recorded at discrete intervals. The principal source of rejection of existing models is the strong nonlinearity of the drift. Around its mean, where the drift is essentially zero, the spot rate behaves like a random walk. The drift then mean-reverts strongly when far away from the mean. The volatility is higher when away from the mean. Continuous-time financial theory has developed extensive tools to price derivative securities when the underlying traded asset(s) or nontraded factor(s) follow stochastic differential equations [see Merton (1990) for examples]. However, as a practical matter, how to specify an appropriate stochastic differential equation is for the most part an unanswered question.
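Under the null that the parametric model is correctly specified, its implied stationary density and a nonparametric density estimate of the same data should nearly coincide; the test compares the two. Below is a minimal sketch of that idea only, using a hypothetical Vasicek model (whose stationary density is known in closed form) and i.i.d. draws in place of an observed rate series; the paper's actual test statistic and asymptotics differ.

```python
import math
import random

def implied_density(x, kappa=0.5, theta=0.06, sigma=0.02):
    """Stationary density implied by a (hypothetical) Vasicek model
    dr = kappa*(theta - r)dt + sigma dW:  N(theta, sigma^2 / (2*kappa))."""
    v = sigma ** 2 / (2 * kappa)
    return math.exp(-((x - theta) ** 2) / (2 * v)) / math.sqrt(2 * math.pi * v)

def kde(samples, x, h):
    """Gaussian kernel density estimate at x (the nonparametric side)."""
    c = 1.0 / (len(samples) * h * math.sqrt(2 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

# Data drawn under the null (model correctly specified), so the two
# densities should nearly coincide; rejection corresponds to a large gap.
rng = random.Random(0)
sd = 0.02 / math.sqrt(2 * 0.5)                 # stationary std dev
rates = [rng.gauss(0.06, sd) for _ in range(5000)]
h = 1.06 * sd * len(rates) ** -0.2             # Silverman's rule of thumb
grid = [0.06 + sd * (k / 10.0 - 3.0) for k in range(61)]
gap = max(abs(implied_density(x) - kde(rates, x, h)) for x in grid)
```

With a misspecified parametric family, the same `gap` stays bounded away from zero as the sample grows, which is the intuition behind the paper's rejections.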
A Nonparametric Model of Term Structure Dynamics and the Market Price of Interest Rate Risk
, 1997
Abstract

Cited by 208 (5 self)
This article presents a technique for nonparametrically estimating continuous-time diffusion processes which are observed at discrete intervals. We illustrate the methodology by using daily three- and six-month Treasury Bill data, from January 1965 to July 1995, to estimate the drift and diffusion of the short rate, and the market price of interest rate risk. While the estimated diffusion is similar to that estimated by Chan, Karolyi, Longstaff and Sanders (1992), there is evidence of substantial nonlinearity in the drift. This is close to zero for low and medium interest rates, but mean reversion increases sharply at higher interest rates.
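The drift estimation can be sketched as a kernel regression of discrete-interval rate changes on the rate level. The sketch below is not the authors' exact estimator; it simulates a mean-reverting rate with a known drift (all parameter values made up for illustration) and recovers the drift's sign on either side of the mean.

```python
import math
import random

def nw_drift(levels, increments, r, h, dt):
    """Nadaraya-Watson kernel regression of rate changes on the rate level:
    an estimate of E[dR | R = r] / dt, i.e. the drift at r."""
    w = [math.exp(-0.5 * ((r - x) / h) ** 2) for x in levels]
    return sum(wi * d for wi, d in zip(w, increments)) / (sum(w) * dt)

# Stand-in "data": a simulated mean-reverting rate with known drift
# 0.5*(0.06 - r); real input would be the observed daily rate series.
rng = random.Random(1)
dt, r = 1.0 / 252, 0.06
levels, increments = [], []
for _ in range(50000):
    d = 0.5 * (0.06 - r) * dt + 0.02 * math.sqrt(dt) * rng.gauss(0, 1)
    levels.append(r)
    increments.append(d)
    r += d

drift_hi = nw_drift(levels, increments, 0.08, h=0.005, dt=dt)  # true: -0.01
drift_lo = nw_drift(levels, increments, 0.04, h=0.005, dt=dt)  # true: +0.01
```

The estimate is noisy pointwise (drift estimation needs long spans of data), but the sign pattern of mean reversion on either side of the mean is recoverable.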
Nettimer: A Tool for Measuring Bottleneck Link Bandwidth
 In Proceedings of the USENIX Symposium on Internet Technologies and Systems
, 2001
Abstract

Cited by 201 (1 self)
Measuring the bottleneck link bandwidth along a path is important for understanding the performance of many Internet applications. Existing tools to measure bottleneck bandwidth are relatively slow, can only measure bandwidth in one direction, and/or actively send probe packets. We present the nettimer bottleneck link bandwidth measurement tool, the libdpcap distributed packet capture library, and experiments quantifying their utility. We test nettimer across a variety of bottleneck network technologies ranging from 19.2 Kb/s to 100 Mb/s, wired and wireless, symmetric and asymmetric bandwidth, across local area and cross-country paths, while using both one and two packet capture hosts. In most cases nettimer has an error of less than 10%, and at worst an error of 40%, even on cross-country paths of 17 or more hops. It converges within 10 KB of the first large packet arrival while consuming less than 7% of the network traffic being measured.
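The measurement principle behind packet-pair tools like nettimer can be stated in one line: two packets sent back to back are spread out by the bottleneck link, and the receiver-side gap encodes its bandwidth. A minimal sketch of that relation (nettimer itself adds passive capture and sample filtering on top of it):

```python
def packet_pair_bandwidth(packet_size_bytes, arrival_gap_s):
    """Two packets sent back to back are spread out by the bottleneck link:
    the second packet queues behind the first, so the receiver-side gap
    equals the bottleneck's transmission time for one packet, giving
    bandwidth (bits/s) = packet bits / gap."""
    return packet_size_bytes * 8 / arrival_gap_s

# e.g. 1500-byte packets arriving 0.12 ms apart -> ~100 Mb/s
estimate = packet_pair_bandwidth(1500, 0.00012)
```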
Measuring Bandwidth
, 1999
Abstract

Cited by 199 (4 self)
Accurate network bandwidth measurement is important to a variety of network applications. Unfortunately, accurate bandwidth measurement is difficult. We describe some current bandwidth measurement techniques: using throughput, pathchar [8], and Packet Pair [2]. We explain some of the problems with these techniques, including poor accuracy, poor scalability, lack of statistical robustness, poor agility in adapting to bandwidth changes, lack of flexibility in deployment, and inaccuracy when used on a variety of traffic types. Our solutions to these problems include using a packet window to adapt quickly to bandwidth changes, Receiver Only Packet Pair to combine accuracy and ease of deployment, and Potential Bandwidth Filtering to increase accuracy. Our techniques are at least as accurate as previously used filtering algorithms, and in some situations more than 37% more accurate.
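The filtering idea can be illustrated by picking the mode of the sampled bandwidth distribution: undistorted packet-pair samples cluster near the true value, while cross-traffic scatters the rest. The sketch below is a simplified stand-in for that idea, not the paper's exact Potential Bandwidth Filtering algorithm.

```python
import math

def density_filter(samples_bps, h):
    """Return the sample lying at the highest Gaussian-kernel density.
    Rationale: undistorted packet-pair samples cluster near the true
    bandwidth while cross-traffic scatters the rest, so the density peak
    is a robust choice.  (A simplified stand-in for the paper's filtering,
    not the exact Potential Bandwidth Filtering algorithm.)"""
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples_bps)
    return max(samples_bps, key=density)

# Four samples cluster near 10 Mb/s; two are cross-traffic outliers.
samples = [9.8e6, 10.0e6, 10.1e6, 10.2e6, 3.0e6, 55.0e6]
chosen = density_filter(samples, h=1e6)
```

A plain mean or median would be dragged around by the outliers; the density peak is not.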
Nonparametric Estimation of Regression Functions in the Presence of Irrelevant Regressors
 The Review of Economics and Statistics
, 2007
Abstract

Cited by 175 (17 self)
In this paper we propose a method for nonparametric regression which admits continuous and categorical data in a natural manner using the method of kernels. A data-driven method of bandwidth selection is proposed, and we establish the asymptotic normality of the estimator. We also establish the rate of convergence of the cross-validated smoothing parameters to their benchmark optimal smoothing parameters. Simulations suggest that the new estimator performs much better than the conventional nonparametric estimator in the presence of mixed data. An empirical application to a widely used and publicly available dynamic panel of patent data demonstrates that the out-of-sample squared prediction error of our proposed estimator is only 14% to 20% of that obtained by some popular parametric approaches which have been used to model this dataset.
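The mixed-data handling can be sketched with a product kernel: Gaussian in the continuous regressor and a discrete kernel in the categorical one, where the categorical smoothing parameter approaching 1 smooths an irrelevant regressor away entirely. A hypothetical two-regressor example (the paper's contribution is the data-driven, cross-validated choice of these smoothing parameters, which this sketch does not perform):

```python
import math

def mixed_kernel_reg(X, Z, y, x, z, h, lam):
    """Nadaraya-Watson estimate with a product kernel: Gaussian in the
    continuous regressor X and a discrete kernel in the categorical
    regressor Z (weight 1 on a category match, lam on a mismatch).
    As lam -> 1 the categorical regressor is smoothed out entirely,
    which is how an irrelevant regressor gets removed."""
    w = [math.exp(-0.5 * ((x - xi) / h) ** 2) * (1.0 if zi == z else lam)
         for xi, zi in zip(X, Z)]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

# Hypothetical data over a continuous X and a binary category Z.
X, Z, y = [0.0, 0.0, 1.0, 1.0], [0, 1, 0, 1], [1.0, 2.0, 3.0, 4.0]
f_relevant   = mixed_kernel_reg(X, Z, y, 0.0, 0, h=0.1, lam=0.0)  # uses Z
f_irrelevant = mixed_kernel_reg(X, Z, y, 0.0, 0, h=0.1, lam=1.0)  # ignores Z
```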
SiZer for exploration of structures in curves
 Journal of the American Statistical Association
, 1997
Abstract

Cited by 151 (21 self)
In the use of smoothing methods in data analysis, an important question is often: which observed features are "really there", as opposed to being spurious sampling artifacts? An approach is described, based on scale-space ideas originally developed in the computer vision literature. Assessment of SIgnificant ZERo crossings of derivatives results in the SiZer map, a graphical device for displaying the significance of features with respect to both location and scale. Here "scale" means "level of resolution", i.e.
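The object SiZer examines can be written down directly: the derivative of a kernel density estimate, evaluated over a grid of locations x and scales (bandwidths) h. This sketch computes only the point estimate; SiZer's map additionally attaches simultaneous confidence limits at each (x, h) to decide which zero crossings are significant.

```python
import math

def kde_derivative(data, x, h):
    """Derivative of a Gaussian kernel density estimate at location x and
    scale (bandwidth) h.  A SiZer map asks, for each (x, h), whether this
    quantity is significantly positive, significantly negative, or neither,
    and colors the map accordingly."""
    n = len(data)
    c = 1.0 / (n * h ** 3 * math.sqrt(2 * math.pi))
    return c * sum((s - x) * math.exp(-0.5 * ((x - s) / h) ** 2) for s in data)

# For unimodal data the smoothed density rises, then falls: the derivative
# is positive left of the mode and negative right of it at any scale h.
data = [-1.0, -0.5, 0.0, 0.5, 1.0]
left = kde_derivative(data, -2.0, 0.5)
right = kde_derivative(data, 2.0, 0.5)
```

Sweeping h from small to large reveals which zero crossings (modes) persist across scales and which are resolution artifacts.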
Internet Tomography
 IEEE Signal Processing Magazine
, 2002
Abstract

Cited by 139 (11 self)
Today's Internet is a massive, distributed network which continues to explode in size as e-commerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service level verification, and detection of anomalous/malicious behavior increasingly challenging. The problem is compounded by the fact that one cannot rely on the cooperation of individual servers and routers to aid in the collection of network traffic measurements vital for these tasks. In many ways, network monitoring and inference problems bear a strong resemblance to other "inverse problems" in which key aspects of a system are not directly observable. Familiar signal processing problems such as tomographic image reconstruction, system identification, and array processing all have interesting interpretations in the networking context. This article introduces the new field of network tomography, a field which we believe will benefit greatly from the wealth of signal processing theory and algorithms.
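The "inverse problem" flavor can be made concrete with a toy example: per-link delays are not directly observable, but path-level measurements are linear combinations of them, so solving (or, with noisy data, least-squares fitting) A x = y recovers the link quantities. The three-link topology and all numbers below are made up for illustration.

```python
def solve3(A, y):
    """Gauss-Jordan elimination for a small linear system (illustration)."""
    n = len(A)
    M = [row[:] + [yi] for row, yi in zip(A, y)]   # augmented matrix
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))  # partial pivoting
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * b for a, b in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Three monitored paths over a three-link triangle.  Row k of the routing
# matrix A records which links path k traverses; y holds measured path
# delays; the unobservable per-link delays x satisfy A x = y.
A = [[1.0, 1.0, 0.0],   # path 1 = link 1 + link 2
     [1.0, 0.0, 1.0],   # path 2 = link 1 + link 3
     [0.0, 1.0, 1.0]]   # path 3 = link 2 + link 3
y = [5.0, 6.0, 7.0]
x = solve3(A, y)        # per-link delays: [2.0, 3.0, 4.0]
```

Real deployments have far more links than measured paths, which is why the statistical machinery (regularization, likelihood methods) the article surveys is needed.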
Network tomography: recent developments
 Statistical Science
, 2004
Abstract

Cited by 132 (4 self)
Today's Internet is a massive, distributed network which continues to explode in size as e-commerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service level verification, and detection of anomalous/malicious behavior extremely challenging. The problem is compounded by the fact that one cannot rely on the cooperation of individual servers and routers to aid in the collection of network traffic measurements vital for these tasks. In many ways, network monitoring and inference problems bear a strong resemblance to other "inverse problems" in which key aspects of a system are not directly observable. Familiar signal processing or statistics problems such as tomographic image reconstruction and phylogenetic tree identification have interesting connections to those arising in networking. This article introduces network tomography, a new field which we believe will benefit greatly from the wealth of statistics theory and algorithms. It focuses especially on recent developments in the field, including the application of pseudo-likelihood methods and tree estimation formulations. Keywords: network tomography, pseudo-likelihood, topology identification, tree estimation. 1 Introduction. No network is an island, entire of itself; every network is a piece of an internetwork, a part of the main. Although administrators of small-scale networks can monitor local traffic conditions and identify congestion points and performance bottlenecks, very few networks are completely... Rui Castro and Robert Nowak are with the Department of Electrical and Computer Engineering, Rice University, Houston, TX; Mark Coates is with the Department of Electrical and Computer Engineering, McGill University, Montreal, Quebec, Canada; Gang Liang and Bin Yu are with the Department of Statistics, ...
CapProbe: A simple and accurate capacity estimation technique
 In Proceedings of ACM SIGCOMM
, 2004
Abstract

Cited by 115 (25 self)
We present a new capacity estimation technique, called CapProbe. CapProbe combines delay as well as dispersion measurements of packet pairs to filter out samples distorted by cross-traffic. CapProbe algorithms include convergence tests and convergence speedup techniques that vary probing parameters. Our study of CapProbe includes a probability analysis to determine the time it takes CapProbe to converge on average. Through simulations and measurements, we found CapProbe to be quick and accurate across a wide range of traffic scenarios. We also compared CapProbe with two previous well-known techniques, pathchar and pathrate. We found CapProbe to be much more accurate than pathchar and similar in accuracy to pathrate, while providing faster estimation than both. Another advantage of CapProbe is its lower computation cost, since no statistical post-processing of probing data is required.
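CapProbe's combination of delay and dispersion can be sketched in a few lines: among many packet-pair samples, the pair whose summed one-way delays are minimal encountered no queuing, so its dispersion reflects the raw bottleneck capacity. The sketch applies only that minimum-delay-sum filter, omitting the paper's convergence tests and speedup techniques; all sample values are made up for illustration.

```python
def capprobe_estimate(packet_size_bytes, samples):
    """samples: one (delay_pkt1_s, delay_pkt2_s, dispersion_s) tuple per
    packet pair.  CapProbe's key filter: the pair whose summed delays are
    minimal encountered no queuing, so its dispersion reflects the raw
    bottleneck capacity."""
    _, _, disp = min(samples, key=lambda s: s[0] + s[1])
    return packet_size_bytes * 8 / disp

# Hypothetical samples: the middle pair saw no queuing (smallest delay sum),
# so its 0.12 ms dispersion, not the inflated ones, sets the estimate.
samples = [(0.021, 0.022, 0.0005),
           (0.020, 0.020, 0.00012),
           (0.030, 0.025, 0.0009)]
capacity = capprobe_estimate(1500, samples)   # ~100 Mb/s
```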