State of the Art in Traffic Classification: A Research Review (2009)

by M Zhang, W John, kc claffy, N Brownlee


Results 1 - 8 of 8

Analysis of UDP traffic usage on internet backbone links

by Min Zhang, Maurizio Dusi, Wolfgang John, Changjia Chen - in SAINT '09, Ninth Annual International Symposium on Applications and the Internet, 2009
Abstract - Cited by 9 (1 self)
It is still an accepted assumption that Internet traffic is dominated by TCP [1], [2]. However, the rise of new streaming applications [3] such as IPTV (PPStream, PPLive) and new P2P protocols (e.g. uTP [4]) that try to avoid traffic

Citation Context

...TV traffic is already common. Finally, we note that precise traffic classification requires methods beyond simple port classification. Most current traffic classification techniques focus on TCP [7], [8], with only preliminary examination of techniques for UDP traffic [9] (other than deep packet inspection). Given the growing evidence for the use of UDP for increasingly popular applications, includin...
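The "simple port classification" this context calls insufficient can be sketched in a few lines. The port table and flow tuples below are illustrative assumptions of ours, not data from the cited papers:

```python
# Minimal sketch of port-based classification, assuming a small
# illustrative table of well-known ports (not from the cited papers).
WELL_KNOWN_PORTS = {53: "dns", 80: "http", 123: "ntp", 443: "https"}

def classify_by_port(src_port: int, dst_port: int) -> str:
    """Guess the application from either endpoint port, else 'unknown'."""
    for port in (src_port, dst_port):
        if port in WELL_KNOWN_PORTS:
            return WELL_KNOWN_PORTS[port]
    return "unknown"

# P2P and streaming flows on ephemeral ports fall straight through,
# which is why the context calls for methods beyond port lookup:
flows = [(51413, 443), (34001, 38210), (1024, 53)]
print([classify_by_port(s, d) for s, d in flows])  # ['https', 'unknown', 'dns']
```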

Using traffic analysis to identify the second generation onion router.

by John Barker, Peter Hannay, Patryk Szewczyk - In Embedded and Ubiquitous Computing (EUC), 9th IFIP International Conference on, 2011
Abstract - Cited by 1 (0 self)
Abstract: Anonymous networks provide security for users by obfuscating messages with encryption and hiding communications amongst cover traffic provided by other network participants. The traditional goal of academic research into these networks has been attacks that aim to uncover the identity of network users. But the success of an anonymous network relies not only on its technical capabilities, but on adoption by a large enough user base to provide adequate cover traffic. If anonymous network nodes can be identified, the users can be harassed, discouraging participation. Tor is an example of a widely used anonymous network which uses a form of Onion Routing to provide low-latency anonymous communications. This paper demonstrates that traffic from a simulated Tor network can be distinguished from regular encrypted traffic, suggesting that real-world Tor users may be vulnerable to the same analysis.

Citation Context

...ture and not necessarily feasible in practice [14]. In certain circumstances they require compromise of large parts of the Tor network, supplying hostile data to Tor users, or intricate knowledge of usage patterns and an excess of patience. The technique demonstrated in this paper is a low-cost technique that does not require sophisticated equipment and can be carried out by a passive observer. V. TRAFFIC CLASSIFICATION When considering the use of traffic analysis for classification of Internet communications, three techniques are used: exact matching, heuristic matching and machine learning [15]. Since Tor employs strong encryption and can communicate on any port, it can easily evade an exact matching technique through simple configuration options. Heuristic-based techniques have been designed to classify encrypted communications, including the identification of P2P traffic [16–18] and viruses [19]. Machine learning algorithms have also been used successfully to classify encrypted traffic including Skype [20] and to identify application protocols tunnelled over SSH [21]. A previous attempt to classify Tor using Bayesian networks was made in Herrmann et al. [13] without great success....

Quantitative Research

by Heywood John - Europe: Barron's Educational Series, Inc. Available at http://www.answer.com/topic/quantitativeresearch. Accessed January 22, 2008
Abstract - Cited by 1 (0 self)
This paper describes the reactions of graduate secondary school student teachers to an experiment which required them to evaluate specified techniques and theories of teaching and learning as part of their classroom practice. The aims of the experiment were to: (1) improve the quality of their judgments about pupils; (2) acquire variety in teaching styles; (3) increase understanding of student learning; and (4) evaluate the merits of Kolb's theory of learning and its application to teaching practice. Kolb's cycle of learning has four stages: learners are actively involved in a specific experience; they reflect on this experience; they form abstract conceptualizations from this reflection; and they take action as a result of the conclusions. Kolb developed a Learning Styles Inventory to determine the disposition of learners within a framework of four learning styles.

An In-depth Understanding Of Internet Backbone Traffic : A Case Study Approach

by Madhusmita Panda
Abstract
ABSTRACT: The Internet is essential in the day-to-day life of the common citizen, and modeling Internet traffic is therefore an important issue, as is Internet Protocol network capacity planning. The Internet backbone comprises the principal data routes between large, strategically interconnected computer networks and core routers on the Internet. These data routes are hosted by commercial, government, academic and other high-capacity network centres, the Internet exchange points and network access points that interchange Internet traffic between countries, continents and across the oceans. Internet service providers, often Tier 1 networks, participate in Internet backbone traffic exchange through privately negotiated interconnection agreements, primarily governed by the principle of settlement-free peering. We study Internet traffic characteristics by measuring and analyzing modern Internet backbone data. We start the thesis with an overview of several important considerations for passive Internet traffic collection on large-scale network links. The lessons learned from a successful measurement project on academic Internet backbone links can serve as guidelines to others setting up and performing similar measurements. The data from these measurements are the basis for the analyses made in this thesis. As a first result we present a detailed characterization of packet headers. We propose a method and accompanying metrics to assess routing symmetry on a flow level based on passive measurements; this method helps to improve traffic analysis techniques. We confirm TCP as the dominant transport protocol in backbone traffic, but observe an increase of UDP traffic during the last few years, which we attribute to P2P signalling traffic. These results motivate a detailed classification of traffic according to network application. To accomplish this, we review state-of-the-art traffic classification approaches and subsequently propose two new methods.
The first method provides a payload-independent classification of aggregated traffic based on connection patterns; the second method provides fine-grained protocol identification by utilizing statistical packet and flow features, and is capable of accurate classification in a simple and efficient way. This thesis presents methods and results contributing additional perspectives on Internet traffic classification and characteristics at different levels of granularity.
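The second method described, statistical classification from packet and flow features, can be sketched as a toy nearest-centroid classifier. The feature choice and the centroid values below are invented for illustration and are not the thesis's trained model:

```python
from math import dist
from statistics import mean, pstdev

# Toy centroids in (mean packet size, packet-size std-dev) space;
# the labels and numbers are illustrative assumptions.
CENTROIDS = {
    "bulk-transfer": (1400.0, 100.0),
    "interactive": (80.0, 30.0),
}

def flow_features(packet_sizes: list[int]) -> tuple[float, float]:
    """Reduce a flow to two statistical features, payload-independently."""
    return (mean(packet_sizes), pstdev(packet_sizes))

def classify_flow(packet_sizes: list[int]) -> str:
    """Assign the flow to the class with the nearest feature centroid."""
    f = flow_features(packet_sizes)
    return min(CENTROIDS, key=lambda label: dist(f, CENTROIDS[label]))

print(classify_flow([1400] * 20))        # bulk-transfer (MTU-sized packets)
print(classify_flow([60, 70, 90, 100]))  # interactive (small, varied packets)
```

In practice the centroids would be learned from labelled training flows, but the pipeline shape (extract statistical features, then classify in feature space) is the same.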

Distributed

by Thomas Zink, Marcel Waldvogel
Abstract
Abstract: With the beginning of the 21st century, emerging peer-to-peer networks ushered in a new era of large-scale media exchange. Faced with ever increasing volumes of traffic, legal threats by copyright holders, and QoS demands of customers, network service providers are urged to apply traffic classification and shaping techniques. These systems are usually highly integrated to satisfy the harsh restrictions present in network infrastructure, and they require constant maintenance and updates. Additionally, they raise legal issues and violate both the net neutrality and end-to-end principles. On the other side, clients see their freedom and privacy attacked. As a result, users, application programmers, and even commercial service providers laboriously strive to hide their interests and circumvent classification techniques. In this user-vs-ISP war, the user side has a clear edge: while changing the network infrastructure is by nature very complex and only slowly reacts to new conditions, updating and distributing software between users is easy and practically instantaneous. In this paper we discuss how state-of-the-art traffic classification systems can be circumvented with little effort. We present a new obfuscation extension to the BitTorrent protocol that allows signature-free handshaking. The extension requires no changes to the infrastructure and is fully backwards compatible. With only a small change to client software, contemporary classification techniques are rendered ineffective. We argue that future traffic classification must not rely on restricted local syntax information but must instead exploit global communication patterns and protocol semantics in order to keep pace with rapid application and protocol changes.
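The signature that "signature-free handshaking" removes is concrete: a plaintext BitTorrent handshake begins with a fixed 20-byte header that any DPI box can match byte-for-byte. A sketch, with a trivial XOR mask standing in for the paper's actual extension:

```python
# The fixed prefix of a plaintext BitTorrent handshake: a length byte
# (0x13 = 19) followed by the protocol name.
BT_SIGNATURE = b"\x13BitTorrent protocol"

def dpi_flags_bittorrent(first_packet: bytes) -> bool:
    """Signature check a classifier might run on a flow's first bytes."""
    return first_packet.startswith(BT_SIGNATURE)

# 68-byte plaintext handshake: signature + reserved + info-hash + peer-id
# (the zero bytes are placeholders for this illustration).
handshake = BT_SIGNATURE + bytes(8) + bytes(20) + bytes(20)
print(dpi_flags_bittorrent(handshake))  # True

# Any signature-free encoding of the same bytes (here an assumed XOR mask,
# not the paper's scheme) defeats the byte-for-byte match:
masked = bytes(b ^ 0xAA for b in handshake)
print(dpi_flags_bittorrent(masked))  # False
```

This is why the abstract argues that classifiers relying on local syntax lose to client-side changes: the mask costs one line in the client but invalidates the deployed signature.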

Citation Context

...changes in traffic classification systems. II. RELATED WORK Traffic classification is a vast and heterogeneous field with many different methodologies, applications and granularities. A study by CAIDA [31] reviews numerous papers that span over a decade of research. In general, the goal is either to specifically identify the layer-7 protocol, or to perform a coarse-grain classification according to som...

Network Traffic Exposed and Concealed

by Thomas Zink
Abstract
Cyberspace: a world at war. Our privacy, freedom of speech, and with them the very foundations of democracy are under attack. In the virtual world, frontiers are not set by nations or states; they are set by those who control the flows of information. And control is what everybody wants. The Five Eyes are watching, storing, and evaluating every transmission. Internet corporations compete for our data and decide if, when, and how we gain access to that data and to their supposedly free services. Search engines control what information we are allowed, or want, to consume. Network access providers and carriers are fighting for control of larger networks and for better ways to shape the traffic. Interest groups and copyright holders struggle to limit access to specific content. Network operators try to keep their networks and their data safe from outside or inside adversaries. And users? Many of them just don't care. Trust in concepts and techniques is implicit.

Citation Context

...Recent studies [ZJCB09, DPC12] present trends and the evolution of traffic classification; a digest is shown in Figure 2.4 (The Traffic Classification and Obfuscation Arms Race, influenced by [ZJCB09]). Early classifiers relied on port numbers [ZJCB09, DPC12] to identify the application. RFC 739 [Pos77] first introduced the Assigned Numbers list that maps well-known network protocols to specific ports an...

P2P NETWORK CLASSIFICATION A BOTH PORT AND PAYLOAD AGNOSTIC APPROACH

by P. J. Molijn, Eleonore Eunice Molijn
Abstract not found

Acknowledgement

by Peter Dorfinger
Abstract not found

Citation Context

...have a trace file that represents a mixture of the traffic that would be present in reality. In particular, differences in access and backbone networks influence the quality of traffic classification [82]. Research is currently far from having a traffic classification approach that is applicable to a wide range of applications and a wide range of network links. Simple questions like "How much of ...


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University