Results 1 - 10 of 43
A Cost-Sensitive Adaptation Engine for Server Consolidation of Multitier Applications
"... Abstract. Virtualization-based server consolidation requires runtime resource reconfiguration to ensure adequate application isolation and performance, especially for multitier services that have dynamic, rapidly changing workloads and responsiveness requirements. While virtualization makes reconfig ..."
Abstract
-
Cited by 22 (6 self)
Virtualization-based server consolidation requires runtime resource reconfiguration to ensure adequate application isolation and performance, especially for multitier services that have dynamic, rapidly changing workloads and responsiveness requirements. While virtualization makes reconfiguration easy, indiscriminate use of adaptations such as VM replication, VM migration, and capacity controls has performance implications. This paper demonstrates that ignoring these costs can have significant impacts on the ability to satisfy response-time-based SLAs, and proposes a solution in the form of a cost-sensitive adaptation engine that weighs the potential benefits of runtime reconfiguration decisions against their costs. Extensive experimental results based on live workload traces show that the technique is able to maximize SLA fulfillment under typical time-of-day workload variations as well as flash crowds, and that it exhibits significantly improved transient behavior compared to approaches that do not account for adaptation costs.
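The cost/benefit trade-off described in this abstract can be illustrated with a minimal sketch: a decision rule that accepts a reconfiguration action only when its estimated steady-state benefit over a decision horizon outweighs its transient cost. The class, field names, and numbers below are illustrative assumptions, not the paper's actual engine.

```python
# Minimal sketch of a cost-sensitive adaptation decision. The cost and
# benefit estimators are hypothetical placeholders, not the paper's model.
from dataclasses import dataclass

@dataclass
class Adaptation:
    name: str                 # e.g. "migrate_vm", "replicate_vm", "cap_cpu"
    benefit_ms: float         # estimated steady-state response-time reduction
    transient_cost_ms: float  # estimated response-time penalty while adapting
    duration_s: float         # how long the transient penalty lasts

def pick_adaptation(candidates, horizon_s):
    """Choose the action whose benefit over the decision horizon most
    outweighs its transient cost; return None if doing nothing is best."""
    best, best_score = None, 0.0
    for a in candidates:
        # Net gain = benefit accrued over the horizon minus the transient hit.
        score = a.benefit_ms * horizon_s - a.transient_cost_ms * a.duration_s
        if score > best_score:
            best, best_score = a, score
    return best

actions = [
    Adaptation("cap_cpu", benefit_ms=5, transient_cost_ms=1, duration_s=2),
    Adaptation("migrate_vm", benefit_ms=20, transient_cost_ms=150, duration_s=30),
]
print(pick_adaptation(actions, horizon_s=300))
```

Under these illustrative numbers the engine prefers migration only when the decision horizon is long enough to amortize its transient penalty, which is the behavior the abstract attributes to cost-sensitive adaptation.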
The impact of infusing social presence in the web interface: an investigation across different products
- International Journal of Electronic Commerce (IJEC)
, 2006
"... ABSTRACT: Many online stores tend to exhibit little emotional or social appeal, and may be viewed as lacking human-warmth. A recent study conducted by the authors showed that in an online apparel domain, increased levels of social presence through socially-rich descriptions and pictures positively i ..."
Abstract
-
Cited by 20 (2 self)
Many online stores tend to exhibit little emotional or social appeal, and may be viewed as lacking human warmth. A recent study conducted by the authors showed that in an online apparel domain, increased levels of social presence through socially rich descriptions and pictures positively impact attitudinal antecedents. However, the appropriateness of and need for human warmth and sociability may differ across the types of products or services being sought. In this paper, an empirical investigation was undertaken to compare our earlier findings in the apparel domain (a product for which consumers seek fun and entertaining shopping experiences) to a different type of product (headphones: a product for which consumers primarily seek detailed product information). It was found that websites selling headphones do not exhibit a similar positive effect on attitudinal antecedents from higher levels of social presence. Implications of these findings and directions for future research are outlined.
Two Types of Attitudes in ICT Acceptance and Use
- INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION
, 2008
"... ..."
Rating scales for collective intelligence in innovation communities: Why quick and easy decision making does not get it right.
- In International Conference on Information Systems (ICIS),
, 2010
"... Abstract The increasing popularity of open innovation approaches has lead to the rise of various innovation platforms on the ..."
Abstract
-
Cited by 10 (3 self)
The increasing popularity of open innovation approaches has led to the rise of various innovation platforms on the ...
Please Continue to Hold: An empirical study on user tolerance of security delays
"... We present the results of an experiment examining the extent to which individuals will tolerate delays when told that such delays are for security purposes. In our experiment, we asked 800 Amazon Mechanical Turk users to count the total number of times a certain term was repeated in a multipage docu ..."
Abstract
-
Cited by 10 (4 self)
We present the results of an experiment examining the extent to which individuals will tolerate delays when told that such delays are for security purposes. In our experiment, we asked 800 Amazon Mechanical Turk users to count the total number of times a certain term was repeated in a multipage document. The task was designed to be conducive to cheating. We assigned subjects to eight between-subjects conditions: one of these offered a concrete security reason (virus scanning) for the delay, another offered only a vague security explanation, while the remaining conditions either offered non-security explanations for the delay or, in the control condition, no delay at all. We found that subjects were significantly more likely to cheat or abandon the task when provided with non-security explanations or a vague security explanation for the delay. However, when subjects were provided more explanation about the threat model and the protection ensured by the delay, they were not more likely to cheat than subjects in the control condition who faced no such delay. Our results thus contribute to the nascent literature on soft paternalistic solutions to security and privacy problems by suggesting that, when security mitigations cannot be made "free" for users, designers may incentivize compliant user behavior by intentionally drawing attention to the mitigation itself.
Blackbox Prediction of the Impact of DVFS on End-to-End Performance of Multitier Systems
"... Dynamic voltage and frequency scaling (DVFS) is a wellknown technique for gaining energy savings on desktop and laptop computers. However, its use in server settings requires careful consideration of any potential impacts on endto-end service performance of hosted applications. In this paper, we dev ..."
Abstract
-
Cited by 7 (1 self)
Dynamic voltage and frequency scaling (DVFS) is a well-known technique for gaining energy savings on desktop and laptop computers. However, its use in server settings requires careful consideration of any potential impacts on the end-to-end service performance of hosted applications. In this paper, we develop a simple metric called the "frequency gradient" that allows prediction of the impact of changes in processor frequency on the end-to-end transaction response times of multitier applications. We show how frequency gradients can be measured on a running system in a push-button manner without any prior knowledge of application semantics, structure, or configuration settings. Using experimental results, we demonstrate that frequency gradients provide accurate predictions and enable end-to-end performance-aware DVFS for multitier applications.
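As a rough illustration of how a frequency-gradient-style metric could be used, the sketch below assumes a simple linear relationship between end-to-end response time and the reciprocal of CPU frequency. The measurement and prediction functions are assumptions for illustration, not the paper's actual procedure.

```python
# Minimal sketch: estimate a "frequency gradient" from two response-time
# measurements and use it to predict response time at a new CPU frequency.
# The linear model in 1/f is an assumption made for this illustration.

def frequency_gradient(rt_low_ms, rt_high_ms, f_low_ghz, f_high_ghz):
    """Slope of end-to-end response time with respect to 1/frequency,
    estimated from two measurements taken on the running system."""
    return (rt_low_ms - rt_high_ms) / (1.0 / f_low_ghz - 1.0 / f_high_ghz)

def predict_response_time(rt_ref_ms, f_ref_ghz, f_new_ghz, gradient):
    """Predict response time at f_new given a reference point and gradient."""
    return rt_ref_ms + gradient * (1.0 / f_new_ghz - 1.0 / f_ref_ghz)

# Example: 150 ms measured at 1.6 GHz and 120 ms at 2.4 GHz.
g = frequency_gradient(150.0, 120.0, 1.6, 2.4)
print(predict_response_time(120.0, 2.4, 2.0, g))  # estimate at 2.0 GHz
```

A DVFS controller could then pick the lowest frequency whose predicted response time still meets the application's SLA target.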
User preference and search engine latency
- In Proc. ASA Joint Statistical Meetings
, 2008
"... Industry research advocates a 4 second rule for web pages to load [7]. Usability engineers note that a response time over 1 second may interrupt a user’s flow of thought [6, 9]. There is a general belief that, all other factors equal, users will abandon a slow search engine in favor of a faster alte ..."
Abstract
-
Cited by 7 (0 self)
Industry research advocates a 4-second rule for web pages to load [7]. Usability engineers note that a response time over 1 second may interrupt a user's flow of thought [6, 9]. There is a general belief that, all other factors being equal, users will abandon a slow search engine in favor of a faster alternative. This study compares two mock search engines that differ only in branding (generic color scheme) and latency (fast vs. slow). The fast latency was fixed at 250 ms, while four different slow latencies were evaluated: 2 s, 3 s, 4 s, and 5 s. When the slower search engine's latency is 5 seconds, users state that they perceive the fast engine as faster. When the slower search engine's latency is 4 or 5 seconds, users choose to use the fast engine more often. Pooling the data for the 2 s and 3 s conditions, once slow latency exceeds 3 seconds users are 1.5 times more likely to choose the fast engine. KEYWORDS: web search, latency, response time
Privacy awareness about information leakage: Who knows what about me
- In Proceedings of the 12th ACM Workshop on Privacy in the Electronic Society, WPES '13
, 2013
"... The task of protecting users ’ privacy is made more difficult by their attitudes towards information disclosure without full awareness and the economics of the tracking and ad-vertising industry. Even after numerous press reports and widespread disclosure of leakages on the Web and on popu-lar Onlin ..."
Abstract
-
Cited by 5 (0 self)
The task of protecting users' privacy is made more difficult by their attitudes towards information disclosure without full awareness and by the economics of the tracking and advertising industry. Even after numerous press reports and widespread disclosure of leakages on the Web and on popular Online Social Networks, many users appear not to be fully aware of the fact that their information may be collected, aggregated, and linked with ambient information for a variety of purposes. Past attempts at alleviating this problem have addressed individual aspects of the user's data collection. In this paper we move towards a comprehensive and efficient client-side tool that maximizes users' awareness of the extent of their information leakage. We show that such a customizable tool can help users make informed decisions about controlling their privacy footprint.
Enabling the Transition to the Mobile Web with WebSieve
"... Web access on mobile platforms already constitutes a significant (> 20%) share of web traffic [3]. Furthermore, this share is projected to even surpass access from laptops and desktops [11]. In conjunction with this growth, user expectations for the performance ..."
Abstract
-
Cited by 4 (1 self)
Web access on mobile platforms already constitutes a significant (> 20%) share of web traffic [3]. Furthermore, this share is projected to surpass even access from laptops and desktops [11]. In conjunction with this growth, user expectations for the performance ...
Characterization of Workload and Resource Consumption for an Online Travel and Booking Site
"... Abstract—Online travel and ticket booking is one of the top E-Commerce industries. As they present a mix of products: flights, hotels, tickets, restaurants, activities and vacational packages, they rely on a wide range of technologies to support them: Javascript, AJAX, XML, B2B Web services, Caching ..."
Abstract
-
Cited by 3 (1 self)
Online travel and ticket booking is one of the top e-commerce industries. Because travel sites present a mix of products (flights, hotels, tickets, restaurants, activities, and vacation packages), they rely on a wide range of supporting technologies: JavaScript, AJAX, XML, B2B Web services, caching, search algorithms, and affiliation, resulting in a very rich and heterogeneous workload. Moreover, visits to travel sites vary greatly depending on time of day, season, promotions, events, and linking, creating bursty traffic and making capacity planning a challenge. It is therefore of great importance to understand how users and crawlers interact with travel sites and their effect on server resources, both for devising cost-effective infrastructures and for improving the quality of service for users. In this paper we present a detailed workload and resource consumption characterization of the web site of a top national Online Travel Agency. Characterization is performed on server logs, including both HTTP data and the resource consumption of requests, as well as the server load during execution. From the dataset we characterize user sessions, their patterns, and how response time is affected as load on the Web servers increases. We provide a fine-grained analysis by performing experiments that differentiate request types, time of day, products, and the resource requirements of each. Results show that the workload is bursty, as expected; that day and night traffic exhibit different properties in terms of request-type mix; that user session lengths cover a wide range of durations; that response time grows proportionally to server load; and that response times of external data providers also increase during peak hours, among other results. Such results can be useful for optimizing infrastructure costs, improving QoS for users, and developing realistic workload generators for similar applications.
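A minimal sketch of the kind of log-driven characterization this abstract describes is shown below: bucketing requests by hour and relating per-hour load to mean response time. The log format and column names are assumptions made for illustration, not the Online Travel Agency's actual logs.

```python
# Minimal sketch of an hourly workload characterization from a request log.
# Assumes a CSV with columns: timestamp (ISO 8601), url, response_time_ms.
import csv
from collections import defaultdict
from datetime import datetime

def characterize(log_path):
    """Return {hour: (request_count, mean_response_time_ms)}."""
    counts = defaultdict(int)
    total_rt = defaultdict(float)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            hour = datetime.fromisoformat(row["timestamp"]).hour
            counts[hour] += 1
            total_rt[hour] += float(row["response_time_ms"])
    return {h: (counts[h], total_rt[h] / counts[h]) for h in sorted(counts)}

if __name__ == "__main__":
    for hour, (load, rt) in characterize("access_log.csv").items():
        print(f"{hour:02d}h  load={load:6d} req/h  mean_rt={rt:7.1f} ms")
```

Comparing the per-hour load against the per-hour mean response time is one simple way to expose the day/night differences and the load-proportional response-time growth that the paper reports.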