Results 1 - 8 of 8
Analysis of UDP traffic usage on internet backbone links
In SAINT '09: Ninth Annual International Symposium on Applications and the Internet, 2009
"... It is still an accepted assumption that Internet traffic is dominated by TCP [1], [2]. However, the rise of new streaming applications [3] such as IPTV (PPStream, PPLive) and new P2P protocols (e.g. uTP [4]) that try to avoid traffic ..."
Abstract
-
Cited by 9 (1 self)
- Add to MetaCart
(Show Context)
Abstract: It is still an accepted assumption that Internet traffic is dominated by TCP [1], [2]. However, the rise of new streaming applications [3] such as IPTV (PPStream, PPLive) and new P2P protocols (e.g. uTP [4]) that try to avoid traffic ...
Using traffic analysis to identify the second generation onion router.
In Proceedings of the 2011 IFIP 9th International Conference on Embedded and Ubiquitous Computing (EUC), 2011
"... Abstract-Anonymous networks provide security for users by obfuscating messages with encryption and hiding communications amongst cover traffic provided by other network participants. The traditional goal of academic research into these networks has been attacks that aim to uncover the identity of n ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
(Show Context)
Abstract: Anonymous networks provide security for users by obfuscating messages with encryption and hiding communications amongst cover traffic provided by other network participants. The traditional goal of academic research into these networks has been attacks that aim to uncover the identity of network users. But the success of an anonymous network relies not only on its technical capabilities, but on adoption by a large enough user base to provide adequate cover traffic. If anonymous network nodes can be identified, the users can be harassed, discouraging participation. Tor is an example of a widely used anonymous network which uses a form of Onion Routing to provide low-latency anonymous communications. This paper demonstrates that traffic from a simulated Tor network can be distinguished from regular encrypted traffic, suggesting that real-world Tor users may be vulnerable to the same analysis.
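The listing does not include the paper's actual attack, but the general idea it points to, telling encrypted traffic classes apart from flow-level statistics rather than payload, can be sketched roughly as below. The feature set, the toy flows, and the decision-tree choice are illustrative assumptions only, not the authors' method.

```python
# Illustrative sketch only: separate "tor-like" flows from other encrypted flows
# using coarse per-flow statistics. Feature choices and data are assumptions,
# not the paper's actual method or dataset.
from statistics import mean, pstdev
from sklearn.tree import DecisionTreeClassifier

def flow_features(sizes, times):
    """Summarize one flow from packet sizes (bytes) and arrival timestamps (s)."""
    gaps = [b - a for a, b in zip(times, times[1:])] or [0.0]
    return [
        mean(sizes),    # average packet size
        pstdev(sizes),  # size variability (Tor's fixed-size cells tend to reduce this)
        mean(gaps),     # mean inter-arrival time
        len(sizes),     # flow length in packets
    ]

# Toy labelled flows (sizes, timestamps, label); real work would use captured traces.
training = [
    ([512, 512, 512, 512],  [0.0, 0.05, 0.11, 0.16], "tor-like"),
    ([1460, 620, 1460, 80], [0.0, 0.01, 0.02, 0.30], "other"),
    ([512, 512, 512],       [0.0, 0.07, 0.13],       "tor-like"),
    ([90, 1460, 1460, 300], [0.0, 0.02, 0.03, 0.04], "other"),
]
clf = DecisionTreeClassifier(max_depth=3)
clf.fit([flow_features(s, t) for s, t, _ in training],
        [label for _, _, label in training])

unknown = flow_features([512, 512, 512, 512, 512], [0.0, 0.04, 0.09, 0.15, 0.20])
print(clf.predict([unknown]))  # toy model guesses the class of the unseen flow
```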
Quantitative Research
Europe: Barron's Educational Series, Inc. Available at http://www.answer.com/topic/quantitativeresearch. Accessed January 22, 2008
"... This paper describes the reactions of graduate secondary school student teachers to an experiment which required them to evaluate specified techniques and theories of teaching and learning as part of their classroom practice. The aim of the experiment was to: (1) improve the quality of their judgmen ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
Abstract: This paper describes the reactions of graduate secondary school student teachers to an experiment which required them to evaluate specified techniques and theories of teaching and learning as part of their classroom practice. The aims of the experiment were to: (1) improve the quality of their judgments about pupils; (2) acquire variety in teaching styles; (3) increase understanding of student learning; and (4) evaluate the merits of Kolb's theory of learning and its application to teaching practice. Kolb's cycle of learning has four stages: learners are actively involved in a specific experience; they reflect on this experience; they form abstract concepts from that reflection (conceptualization); and they take action as a result of the conclusions. Kolb developed a Learning Styles Inventory to determine the disposition of learners within a framework of four learning ...
An In-depth Understanding of Internet Backbone Traffic: A Case Study Approach
"... ABSTRACT: Internet is very essential in day to day life of a common citizen. Modeling the Internet traffic is an important issue. Internet Protocol network capacity planning is a very important task. The Internet backbone is the principal data routes between large, strategically interconnected comp ..."
Abstract
- Add to MetaCart
Abstract: The Internet is essential to the daily life of the ordinary citizen, which makes modeling Internet traffic and planning Internet Protocol network capacity important tasks. The Internet backbone comprises the principal data routes between large, strategically interconnected computer networks and core routers on the Internet. These routes are hosted by commercial, government, academic and other high-capacity network centres, as well as the Internet exchange points and network access points that interchange Internet traffic between countries and continents and across the oceans. Internet service providers, often Tier 1 networks, participate in backbone traffic exchange through privately negotiated interconnection agreements, primarily governed by the principle of settlement-free peering. We study Internet traffic characteristics by measuring and analyzing modern Internet backbone data. We start the thesis with an overview of several important considerations for passive Internet traffic collection on large-scale network links; the lessons learned from a successful measurement project on academic Internet backbone links can serve as guidelines for others setting up and performing similar measurements. The data from these measurements are the basis for the analyses made in this thesis. As a first result we present a detailed characterization of packet headers. We then propose a method and accompanying metrics to assess routing symmetry on a flow level based on passive measurements; this method helps to improve traffic analysis techniques. We confirm TCP as the dominant transport protocol in backbone traffic, but observe an increase of UDP traffic during the last few years, which we attribute to P2P signalling traffic. We then provide a detailed classification of traffic according to network application. To accomplish this, we review state-of-the-art traffic classification approaches and subsequently propose two new methods: the first provides a payload-independent classification of aggregated traffic based on connection patterns; the second provides fine-grained protocol identification by utilizing statistical packet and flow features, and is capable of accurate classification in a simple and efficient way. This thesis presents methods and results contributing additional perspectives on Internet traffic classification and characteristics at different levels of granularity.
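The routing-symmetry metric is only named in the abstract, not defined. One plausible flow-level formulation, assumed here purely for illustration, counts a flow as symmetric when its reverse five-tuple is also observed on the same monitored link:

```python
# Hypothetical flow-level routing-symmetry check: a flow counts as symmetric if
# the reverse five-tuple is also observed on the same monitored link. This is an
# assumed formulation for illustration, not the thesis's exact metric.
from collections import namedtuple

Flow = namedtuple("Flow", "src dst sport dport proto packets")

def symmetry_ratio(flows):
    seen = {(f.src, f.dst, f.sport, f.dport, f.proto) for f in flows}
    sym_flows = sym_pkts = total_pkts = 0
    for f in flows:
        total_pkts += f.packets
        if (f.dst, f.src, f.dport, f.sport, f.proto) in seen:
            sym_flows += 1
            sym_pkts += f.packets
    # Share of symmetric traffic, by flow count and by packet count.
    return sym_flows / len(flows), sym_pkts / total_pkts

trace = [
    Flow("10.0.0.1", "192.0.2.7", 51000, 80, "tcp", 12),
    Flow("192.0.2.7", "10.0.0.1", 80, 51000, "tcp", 10),     # reverse direction seen
    Flow("10.0.0.2", "198.51.100.3", 44321, 443, "tcp", 8),  # only one direction seen
]
print(symmetry_ratio(trace))  # -> (0.666..., 0.733...)
```

Reporting the share both per flow and per packet gives the kind of flow-level view of symmetry that passive, single-link measurements allow.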
Distributed
"... Abstract—With the beginning of the 21st century emerging peer-to-peer networks ushered in a new era of large scale media exchange. Faced with ever increasing volumes of traffic, legal threats by copyright holders, and QoS demands of customers, network service providers are urged to apply traffic cla ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract: With the beginning of the 21st century, emerging peer-to-peer networks ushered in a new era of large-scale media exchange. Faced with ever increasing volumes of traffic, legal threats by copyright holders, and QoS demands of customers, network service providers are urged to apply traffic classification and shaping techniques. These systems are usually highly integrated to satisfy the harsh restrictions present in network infrastructure, and they require constant maintenance and updates. Additionally, they raise legal issues and violate both the net neutrality and end-to-end principles. On the other hand, clients see their freedom and privacy attacked. As a result, users, application programmers, and even commercial service providers laboriously strive to hide their interests and circumvent classification techniques. In this user vs. ISP war, the user side has a clear edge: while changing the network infrastructure is by nature very complex and only slowly reacts to new conditions, updating and distributing software among users is easy and practically instantaneous. In this paper we discuss how state-of-the-art traffic classification systems can be circumvented with little effort. We present a new obfuscation extension to the BitTorrent protocol that allows signature-free handshaking. The extension requires no changes to the infrastructure and is fully backwards compatible. With only a little change to client software, contemporary classification techniques are rendered ineffective. We argue that future traffic classification must not rely on restricted local syntax information but must instead exploit global communication patterns and protocol semantics in order to keep pace with rapid application and protocol changes.
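The circumvention argument rests on how payload-signature classifiers work: they match fixed byte patterns, and the plain BitTorrent handshake begins with the byte 0x13 followed by the string "BitTorrent protocol", which is trivial to detect and equally trivial to hide once the handshake is obfuscated. The sketch below shows such a matcher failing on an obfuscated stream; the "obfuscated" bytes are a random stand-in, not the extension proposed in the paper.

```python
# Toy payload-signature check of the kind the paper argues is easy to defeat.
# The plain BitTorrent handshake begins with 0x13 + b"BitTorrent protocol"; the
# "obfuscated" bytes below are a random stand-in, not the paper's extension.
import os

BT_SIGNATURE = b"\x13BitTorrent protocol"

def classify(payload: bytes) -> str:
    """Flag a connection as BitTorrent if the known handshake prefix is present."""
    return "bittorrent" if payload.startswith(BT_SIGNATURE) else "unknown"

# 8 reserved bytes, a 20-byte info hash, and a 20-byte peer id follow the prefix.
plain_handshake = BT_SIGNATURE + bytes(8) + os.urandom(20) + os.urandom(20)
obfuscated_handshake = os.urandom(len(plain_handshake))  # looks like random noise

print(classify(plain_handshake))       # -> "bittorrent"
print(classify(obfuscated_handshake))  # -> almost surely "unknown"
```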
Network Traffic Exposed and Concealed
"... You have enriched my life beyond belief. May the force be with you. Cyberspace: a world at war. Our privacy, freedom of speech, and with them the very foundations of democracy are under attack. In the virtual world frontiers are not set by nations or states, they are set by those, who control the fl ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract: You have enriched my life beyond belief. May the force be with you. Cyberspace: a world at war. Our privacy, freedom of speech, and with them the very foundations of democracy are under attack. In the virtual world, frontiers are not set by nations or states; they are set by those who control the flows of information. And control is what everybody wants. The Five Eyes are watching, storing, and evaluating every transmission. Internet corporations compete for our data and decide if, when, and how we gain access to that data and to their purportedly free services. Search engines control what information we are allowed, or want, to consume. Network access providers and carriers are fighting for control of larger networks and for better ways to shape the traffic. Interest groups and copyright holders struggle to limit access to specific content. Network operators try to keep their networks and their data safe from outside or inside adversaries. And users? Many of them just don’t care. Trust in concepts and techniques is implicit.