Results 1–10 of 67
Virtual Landmarks for the Internet
, 2003
"... Internet coordinate schemes have been proposed as a method for estimating minimum round trip time between hosts without direct measurement. In such a scheme, each host is assigned a set of coordinates, and Euclidean distance is used to form the desired estimate. Two key questions are: How accurate a ..."
Abstract

Cited by 187 (3 self)
Internet coordinate schemes have been proposed as a method for estimating minimum round trip time between hosts without direct measurement. In such a scheme, each host is assigned a set of coordinates, and Euclidean distance is used to form the desired estimate. Two key questions are: How accurate are coordinate schemes across the Internet as a whole? And: are coordinate assignment schemes fast enough, and scalable enough, for large scale use? In this paper we make contributions toward answering both those questions. Whereas the coordinate assignment problem has in the past been approached by nonlinear optimization, we develop a faster method based on dimensionality reduction of the Lipschitz embedding. We show that this method is reasonably accurate, even when applied to measurements spanning the Internet, and that it naturally leads to a scalable measurement strategy based on the notion of virtual landmarks.
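The two steps the abstract names (Lipschitz embedding, then dimensionality reduction) can be sketched as follows. This is a minimal illustration assuming PCA via the SVD as the reduction step; the paper's exact reduction and the function names here are not from the source.

```python
import numpy as np

def lipschitz_coordinates(rtt_to_landmarks, dim):
    """Reduce the Lipschitz embedding (rows = hosts, columns = RTTs to
    a landmark set) to `dim` dimensions via PCA. A sketch of the general
    technique only; the paper's exact reduction may differ."""
    X = np.asarray(rtt_to_landmarks, dtype=float)
    X_centered = X - X.mean(axis=0)
    # Principal directions from the SVD of the centered measurement matrix.
    _, _, vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ vt[:dim].T

def estimate_rtt(coords, i, j):
    """Estimated RTT = Euclidean distance between host coordinates."""
    return float(np.linalg.norm(coords[i] - coords[j]))
```

The estimate is symmetric by construction, which is one reason coordinate schemes target minimum round trip time rather than one-way delay.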
Sampling Biases in IP Topology Measurements
 In IEEE INFOCOM
, 2003
"... Considerable attention has been focused on the properties of graphs derived from Internet measurements. Routerlevel topologies collected via traceroutelike methods have led some to conclude that the router graph of the Internet is well modeled as a powerlaw random graph. In such a graph, the degr ..."
Abstract

Cited by 127 (2 self)
Considerable attention has been focused on the properties of graphs derived from Internet measurements. Router-level topologies collected via traceroute-like methods have led some to conclude that the router graph of the Internet is well modeled as a power-law random graph. In such a graph, the degree distribution of nodes follows a distribution with a power-law tail.
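The power-law-tail claim is usually checked by plotting the empirical complementary CDF of node degrees on log-log axes, where a power law appears as a straight line. A quick diagnostic of that kind (a least-squares slope fit, not the paper's method) might look like:

```python
import numpy as np

def fit_powerlaw_tail(degrees):
    """Least-squares slope of the empirical CCDF on log-log axes.
    For degrees with P(D > d) ~ d^(-alpha) the fitted slope is roughly
    -alpha. A rough diagnostic sketch; the paper's point is that
    traceroute-style sampling can make even non-power-law graphs
    appear power-law under tests like this."""
    d = np.sort(np.asarray(degrees, dtype=float))
    ccdf = 1.0 - np.arange(1, len(d) + 1) / len(d)
    mask = ccdf > 0  # drop the final point, where log(0) is undefined
    slope, _ = np.polyfit(np.log(d[mask]), np.log(ccdf[mask]), 1)
    return slope
```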
Spectral Analysis of Internet Topologies
, 2003
"... We perform spectral analysis of the Internet topology at the AS level, by adapting the standard spectral filtering method of examining the eigenvectors corresponding to the largest eigenvalues of matrices related to the adjacency matrix of the topology. We observe that the method suggests clusters o ..."
Abstract

Cited by 86 (6 self)
We perform spectral analysis of the Internet topology at the AS level, by adapting the standard spectral filtering method of examining the eigenvectors corresponding to the largest eigenvalues of matrices related to the adjacency matrix of the topology. We observe that the method suggests clusters of ASes with natural semantic proximity, such as geography or business interests. We examine how these clustering properties vary in the core and in the edge of the network, as well as across geographic areas, over time, and between real and synthetic data. We observe that these clustering properties may be suggestive of traffic patterns and thus have direct impact on the link stress of the network. Finally, we use the weights of the eigenvector corresponding to the first eigenvalue to obtain an alternative hierarchical ranking of the ASes.
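The standard spectral filtering step the abstract refers to can be sketched compactly: take the eigenvectors of the largest eigenvalues of the adjacency matrix, rank nodes by the leading eigenvector's weights, and read clusters off the next eigenvector. This is a simplified illustration, not the paper's full method.

```python
import numpy as np

def spectral_rank_and_cluster(adj, k=2):
    """Toy spectral filtering on an adjacency matrix: the leading
    eigenvector's weights give a hierarchical ranking (eigenvector
    centrality), and the sign pattern of the second eigenvector
    suggests a two-way cluster split."""
    A = np.asarray(adj, dtype=float)
    vals, vecs = np.linalg.eigh(A)                 # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:k]]      # k leading eigenvectors
    ranking = np.argsort(np.abs(top[:, 0]))[::-1]  # rank by leading weights
    clusters = (top[:, 1] >= 0).astype(int)        # sign split, 2nd vector
    return ranking, clusters
```

On a graph made of two dense groups joined by a weak link, the sign split recovers the two groups, which mirrors the "clusters of ASes with natural semantic proximity" observation.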
Negotiation-Based Routing Between Neighboring ISPs
 in Proc. NSDI
, 2005
"... Abstract We explore negotiation as the basis for cooperation between competing entities, for the specific case of routing between two neighboring ISPs. Interdomain routing is often driven by selfinterest and based on a limited view of the internetwork, which hurts the stability and efficiency of ..."
Abstract

Cited by 61 (2 self)
We explore negotiation as the basis for cooperation between competing entities, for the specific case of routing between two neighboring ISPs. Interdomain routing is often driven by self-interest and based on a limited view of the internetwork, which hurts the stability and efficiency of routing. We present a negotiation framework in which adjacent ISPs share information using coarse preferences and jointly decide the paths for the traffic flows they exchange. Our framework enables pairs of ISPs to agree on routing paths based on their specific relationship, even if they have different optimization criteria. We use simulation with over sixty measured ISP topologies to evaluate our framework. We find that the quality of negotiated routing is close to that of globally optimal routing that uses complete, detailed information about both ISPs. We also find that ISPs have an incentive to negotiate, because both of them benefit compared to routing independently based on local information.
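A deliberately toy reading of "share coarse preferences and jointly decide paths": each ISP assigns every candidate path a small integer rank, and the pair picks the path minimizing the rank sum. This is a hypothetical illustration of the idea, not the paper's actual protocol.

```python
def negotiate_path(paths, rank_a, rank_b):
    """Toy negotiation with coarse preferences: `rank_a` and `rank_b`
    map each candidate path to a coarse rank (lower is better) from
    ISPs A and B; the agreed path minimizes the combined rank.
    Illustrative only."""
    return min(paths, key=lambda p: rank_a[p] + rank_b[p])
```

Coarse ranks matter here: they let an ISP express preferences without revealing its detailed internal costs.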
Internet Connectivity at the AS-level: An Optimization-Driven Modeling Approach
, 2003
"... Two ASs are connected in the Internet AS graph only if they have a business "peering relationship." By focusing on the AS subgraph ASPC whose links represent providercustomer relationships, we develop a new optimizationdriven model for Internet growth at the ASPC level. The model's ..."
Abstract

Cited by 37 (7 self)
Two ASs are connected in the Internet AS graph only if they have a business "peering relationship." By focusing on the AS subgraph ASPC whose links represent provider-customer relationships, we develop a new optimization-driven model for Internet growth at the ASPC level. The model's defining feature is an explicit construction of a novel class of intuitive, multi-objective, local optimizations by which the different customer ASs determine in a fully distributed and decentralized fashion their "best" upstream provider AS. Key criteria that are explicitly accounted for in the formulation of these multi-objective optimization problems are (i) AS geography, i.e., locality and number of PoPs within individual ASs; (ii) AS-specific business models, abstract toy models that describe how individual ASs choose their "best" provider; and (iii) AS evolution, a historic account of the "lives" of individual ASs in a dynamic ISP market. We show that the resulting model is broadly robust, perforce yields graphs that match inferred AS connectivity with respect to a number of different metrics, and is ideal for exploring the impact of new peering incentives or policies on AS-level connectivity.
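One flavor of the multi-objective provider choice the abstract describes, reduced to a toy: a customer AS scores each candidate provider by distance to its nearest PoP (the geography criterion) plus a price term (the business-model criterion) and picks the minimum. The weights, fields, and distance metric are all hypothetical.

```python
def best_provider(customer_loc, providers, w_dist=1.0, w_price=1.0):
    """Toy two-objective provider selection. Each provider is a dict
    with hypothetical fields: "name", "pops" (list of (x, y) PoP
    coordinates), and "price". Returns the name of the provider
    minimizing a weighted score. Illustrative, not the paper's model."""
    def score(p):
        # Manhattan distance to the provider's nearest PoP
        d = min(abs(customer_loc[0] - x) + abs(customer_loc[1] - y)
                for x, y in p["pops"])
        return w_dist * d + w_price * p["price"]
    return min(providers, key=score)["name"]
```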
Assessing the Vulnerability of the Fiber Infrastructure to Disasters
"... Communication networks are vulnerable to natural disasters, such as earthquakes or floods, as well as to physical attacks, such as an Electromagnetic Pulse (EMP) attack. Such realworld events happen in specific geographical locations and disrupt specific parts of the network. Therefore, the geogra ..."
Abstract

Cited by 32 (7 self)
Communication networks are vulnerable to natural disasters, such as earthquakes or floods, as well as to physical attacks, such as an Electromagnetic Pulse (EMP) attack. Such real-world events happen in specific geographical locations and disrupt specific parts of the network. Therefore, the geographical layout of the network determines the impact of such events on the network's connectivity. In this paper, we focus on assessing the vulnerability of (geographical) networks to such disasters. In particular, we aim to identify the most vulnerable parts of the network, that is, the locations of disasters that would have the maximum disruptive effect on the network in terms of capacity and connectivity. We consider graph models in which nodes and links are geographically located on a plane, and model the disaster event as a line segment or a circular cut. We develop algorithms that find a worst-case line segment cut and a worst-case circular cut. Then, we obtain numerical results for a specific backbone network, thereby demonstrating the applicability of our algorithms to real-world networks. Our novel approach provides a promising new direction for network design to avert geographical disasters or attacks.
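The worst-case circular cut problem can be stated directly in code: find the disk center (here chosen from a candidate grid, by brute force) whose disk of a given radius intersects the most links. The paper develops proper algorithms for this; the sketch below is only the problem statement made executable, with illustrative names.

```python
import math

def point_seg_dist(p, a, b):
    """Euclidean distance from point p to the segment from a to b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def worst_circular_cut(nodes, links, radius, grid):
    """Brute-force worst-case circular cut: return the candidate center
    in `grid` whose disk of the given radius intersects the most links.
    `nodes` maps node id -> (x, y); `links` is a list of (u, v) pairs."""
    def hits(center):
        return sum(point_seg_dist(center, nodes[u], nodes[v]) <= radius
                   for u, v in links)
    return max(grid, key=hits)
```

A link is "cut" when the disk comes within `radius` of its segment, i.e. when the point-to-segment distance is at most the radius.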
Geometric exploration of the landmark selection problem
 In PAM
, 2004
"... Abstract. Internet coordinate systems appear promising as a method for estimating network distance without direct measurement, allowing scalable configuration of emerging applications such as content delivery networks, peer to peer systems, and overlay networks. However all such systems rely on land ..."
Abstract

Cited by 23 (0 self)
Internet coordinate systems appear promising as a method for estimating network distance without direct measurement, allowing scalable configuration of emerging applications such as content delivery networks, peer-to-peer systems, and overlay networks. However, all such systems rely on landmarks, and the choice of landmarks has a dramatic impact on their accuracy. Landmark selection is challenging because of the size of the Internet (leading to an immense space of candidate sets) and because insight into the properties of good landmark sets is lacking. In this paper we explore fast algorithms for landmark selection. Whereas the traditional approach to analyzing similar network-based configuration problems employs the graph structure of the Internet, we leverage work in coordinate systems to treat the problem geometrically, providing an opening for applying new algorithms. Our results suggest that when employing small numbers of landmarks (5-10), such geometric algorithms provide good landmarks, while for larger numbers of landmarks (20-30) even faster methods based on random selection are equally effective.
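A classic geometric heuristic for this kind of selection is greedy maxmin (farthest-point) placement in the coordinate space: repeatedly add the host farthest from the current landmark set, so landmarks spread out. This is a sketch of one such geometric algorithm, not necessarily the paper's exact one.

```python
import math

def farthest_point_landmarks(coords, k):
    """Greedy maxmin landmark selection over a list of coordinate
    tuples. Starts (arbitrarily) from host 0 and adds, k-1 times,
    the host whose minimum distance to the chosen set is largest.
    Returns the chosen indices."""
    chosen = [0]
    while len(chosen) < k:
        def min_dist(i):
            return min(math.dist(coords[i], coords[j]) for j in chosen)
        chosen.append(max(range(len(coords)), key=min_dist))
    return chosen
```

Its O(n*k) cost is one reason geometric methods are attractive at small k, while at larger k the abstract notes plain random selection does just as well.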
Network Reliability With Geographically Correlated Failures
"... Abstract—Fiberoptic networks are vulnerable to natural disasters, such as tornadoes or earthquakes, as well as to physical failures, such as an anchor cutting underwater fiber cables. Such realworld events occur in specific geographical locations and disrupt specific parts of the network. Therefor ..."
Abstract

Cited by 23 (1 self)
Fiber-optic networks are vulnerable to natural disasters, such as tornadoes or earthquakes, as well as to physical failures, such as an anchor cutting underwater fiber cables. Such real-world events occur in specific geographical locations and disrupt specific parts of the network. Therefore, the geography of the network determines the effect of physical events on the network's connectivity and capacity. In this paper, we develop tools to analyze network failures after a 'random' geographic disaster. The random location of the disaster allows us to model situations where the physical failures are not targeted attacks. In particular, we consider disasters that take the form of a 'random' line in a plane. Using results from geometric probability, we are able to calculate some network performance metrics under such a disaster in polynomial time. In particular, we can evaluate the average two-terminal reliability in polynomial time under 'random' line cuts. This is in contrast to the case of independent link failures, for which no polynomial-time algorithm is known for calculating this reliability metric. We also present numerical results to show the significance of geometry for the survivability of the network and discuss network design in the context of random line cuts. Our novel approach provides a promising new direction for modeling and designing networks to lessen the effects of geographical disasters or attacks.
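The quantity in question, two-terminal reliability under a random line cut, is easy to approximate by Monte Carlo even though the paper's contribution is computing it exactly in polynomial time via geometric probability. The sketch below samples random chords, removes every link each chord crosses, and tests s-t connectivity; the chord model and constants are assumptions for illustration.

```python
import math, random

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1p2 properly intersects segment p3p4."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

def st_reliability_linecut(nodes, links, s, t, trials=2000, seed=1):
    """Monte Carlo estimate of P(s and t stay connected) when a random
    line cut removes every link it crosses. Chords join two uniform
    points on a bounding circle (radius chosen to contain the network);
    this is an approximation, not the paper's exact computation."""
    rng = random.Random(seed)
    R = 3.0  # bounding-circle radius; assumes the network fits inside
    ok = 0
    for _ in range(trials):
        a1, a2 = rng.uniform(0, 2 * math.pi), rng.uniform(0, 2 * math.pi)
        p = (R * math.cos(a1), R * math.sin(a1))
        q = (R * math.cos(a2), R * math.sin(a2))
        alive = [(u, v) for u, v in links
                 if not segments_intersect(nodes[u], nodes[v], p, q)]
        seen, frontier = {s}, [s]       # graph search over surviving links
        while frontier:
            x = frontier.pop()
            for u, v in alive:
                y = v if u == x else u if v == x else None
                if y is not None and y not in seen:
                    seen.add(y)
                    frontier.append(y)
        ok += t in seen
    return ok / trials
```

Redundant geometry raises the estimate: a triangle joining s and t survives every cut that a lone s-t link survives, and more besides.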
A learning-based approach for IP geolocation
 In Proceedings of the Passive and Active Measurement Conference (PAM)
, 2010
"... Abstract. The ability to pinpoint the geographic location of IP hosts is compelling for applications such as online advertising and network attack diagnosis. While prior methods can accurately identify the location of hosts in some regions of the Internet, they produce erroneous results when the de ..."
Abstract

Cited by 22 (4 self)
The ability to pinpoint the geographic location of IP hosts is compelling for applications such as online advertising and network attack diagnosis. While prior methods can accurately identify the location of hosts in some regions of the Internet, they produce erroneous results when the delay or topology measurements on which they are based are limited. The hypothesis of our work is that the accuracy of IP geolocation can be improved through the creation of a flexible analytic framework that accommodates different types of geolocation information. In this paper, we describe a new framework that reduces IP geolocation to a machine-learning classification problem. Our methodology considers a set of lightweight measurements from a set of known monitors to a target, and then classifies the location of that target based on the most probable geographic region, given probability densities learned from a training set. For this study, we employ a Naive Bayes framework that has low computational complexity and enables additional environmental information to be easily added to enhance the classification process. To demonstrate the feasibility and accuracy of our approach, we test IP geolocation on over 16,000 routers given ping measurements from 78 monitors with known geographic placement. Our results show that the simple application of our method improves geolocation accuracy for over 96% of the nodes identified in our data set, with estimates on average 70 miles closer to the true geographic location than prior constraint-based geolocation. These results highlight the promise of our method and indicate how future expansion of the classifier can lead to further improvements in geolocation accuracy.
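The classification step can be sketched as a minimal Gaussian Naive Bayes over per-monitor delay vectors: learn a mean and variance per (region, monitor), then pick the region with the highest posterior. This shows the flavor of classifier the abstract describes; the paper's actual features and density estimates differ.

```python
import math
from collections import defaultdict

class DelayRegionClassifier:
    """Minimal Gaussian Naive Bayes over delay vectors. Each training
    example is (delays, region), where `delays` is one measurement per
    monitor. A sketch of the technique, not the paper's classifier."""
    def fit(self, examples):
        by_region = defaultdict(list)
        for delays, region in examples:
            by_region[region].append(delays)
        self.stats = {}
        for region, rows in by_region.items():
            n, m = len(rows), len(rows[0])
            mean = [sum(r[j] for r in rows) / n for j in range(m)]
            # small floor keeps the variance positive
            var = [sum((r[j] - mean[j]) ** 2 for r in rows) / n + 1e-6
                   for j in range(m)]
            self.stats[region] = (mean, var, n)
        self.total = sum(s[2] for s in self.stats.values())
        return self

    def predict(self, delays):
        def log_post(region):
            mean, var, n = self.stats[region]
            lp = math.log(n / self.total)  # class prior
            for x, mu, v in zip(delays, mean, var):
                lp += -0.5 * math.log(2 * math.pi * v) - (x - mu) ** 2 / (2 * v)
            return lp
        return max(self.stats, key=log_post)
```

Because each monitor contributes an independent likelihood term, adding extra environmental features is just more terms in the sum, which is the extensibility the abstract highlights.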
Toward a measurement-based geographic location service
 in Proc. of PAM 2004, Antibes Juan-les-Pins
, 2004
"... Abstract. Locationaware applications require a geographic location service of Internet hosts. We focus on a measurementbased service for the geographic location of Internet hosts. Host locations are inferred by comparing delay patterns of geographically distributed landmarks, which are hosts with ..."
Abstract

Cited by 20 (6 self)
Location-aware applications require a geographic location service for Internet hosts. We focus on a measurement-based service for the geographic location of Internet hosts. Host locations are inferred by comparing the delay patterns of geographically distributed landmarks, which are hosts with a known geographic location, with the delay pattern of the target host to be located. Results show a significant correlation between geographic distance and network delay that can be exploited for coarse-grained geographic location of Internet hosts.
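The delay-pattern comparison reduces to a nearest-neighbor match: probe the target from a fixed set of vantage points, then return the location of the landmark whose delay vector is closest. A minimal sketch; the dictionary fields and distance metric are illustrative, not from the paper.

```python
import math

def locate_by_delay_pattern(target_delays, landmarks):
    """Return the location of the landmark whose delay vector (one
    entry per vantage point, same order as `target_delays`) is closest
    in Euclidean distance to the target's. Each landmark is a dict
    with hypothetical fields "delays" and "location"."""
    return min(landmarks,
               key=lambda lm: math.dist(target_delays, lm["delays"]))["location"]
```

The result is only as fine-grained as the landmark set, which matches the abstract's claim of coarse-grained location.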