Results 1 - 6 of 6
Approximation Schemes for Sequential Posted Pricing in Multi-Unit Auctions
2010
Abstract

Cited by 8 (4 self)
We design algorithms for computing approximately revenue-maximizing sequential posted-pricing mechanisms (SPM) in K-unit auctions, in a standard Bayesian model. A seller has K copies of an item to sell, and there are n buyers, each interested in only one copy and each with some value for the item. The seller posts a price for each buyer, the buyers arrive in a sequence chosen by the seller, and a buyer purchases a copy if its value exceeds the price posted to it. The seller does not know the buyers' values, but has Bayesian information about them. An SPM specifies the ordering of the buyers and the posted prices, and may be adaptive or non-adaptive in its behavior. The goal is to design, in polynomial time, an SPM that maximizes expected revenue. We compare against the expected revenue of the optimal SPM, and provide a polynomial-time approximation scheme (PTAS) for both non-adaptive and adaptive SPMs. This is achieved by two algorithms: an efficient algorithm that gives a (1 − 1/√(2πK))-approximation (and hence a PTAS for sufficiently large K), and another that is a PTAS for constant K. The first algorithm yields a non-adaptive SPM that achieves its approximation guarantee against an optimal adaptive SPM, which implies that the adaptivity gap in SPMs vanishes as K grows.
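The mechanism described in this abstract is easy to simulate. Below is a minimal sketch of running a non-adaptive SPM, not the paper's approximation algorithm; the fixed buyer ordering, the posted prices, and the uniform value distributions are illustrative assumptions.

```python
import random

def simulate_spm(prices, value_dists, k, trials=10000):
    """Estimate expected revenue of a non-adaptive SPM by simulation.

    prices[i] is the price posted to the i-th buyer in the seller's
    chosen order; value_dists[i]() samples that buyer's private value.
    A buyer accepts iff its value meets the posted price and stock remains.
    """
    total = 0.0
    for _ in range(trials):
        stock, revenue = k, 0.0
        for price, draw in zip(prices, value_dists):
            if stock == 0:
                break
            if draw() >= price:
                revenue += price
                stock -= 1
        total += revenue
    return total / trials

# Illustrative instance: K = 3 copies, 5 i.i.d. uniform-[0,1] buyers,
# a single posted price of 0.5 for everyone.
dists = [lambda: random.random()] * 5
rev = simulate_spm([0.5] * 5, dists, k=3)
```

An adaptive SPM would differ only in that the price posted to the next buyer may depend on which earlier buyers accepted; the paper's result says this extra power buys little as K grows.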
Optimal real-time bidding for display advertising
In KDD, 2014
Abstract

Cited by 2 (1 self)
In this paper we study bid optimisation for real-time bidding (RTB) based display advertising. RTB allows advertisers to bid on a display ad impression in real time, as the impression is being generated. It goes beyond contextual advertising by focusing the bidding on user data, and it differs from the sponsored search auction, where the bid price is associated with keywords. On the demand side, a fundamental technical challenge is to automate the bidding process based on the budget, the campaign objective, and the various information gathered at runtime and from history. In this paper, programmatic bidding is cast as a functional optimisation problem. Under certain dependency assumptions, we derive simple bidding functions that can be calculated in real time; our finding shows that the optimal bid has a non-linear relationship with impression-level evaluations such as the click-through rate and the conversion rate, which are estimated in real time from impression-level features. This differs from previous work, which mainly focused on linear bidding functions. Our mathematical derivation suggests that optimal bidding strategies should bid on more impressions rather than focus on a small set of high-valued impressions: according to current RTB market data, lower-valued impressions are more cost-effective than higher-valued ones, and the chances of winning them are relatively higher. Aside from the theoretical insights, offline experiments on a real dataset and online experiments on a production RTB system verify the effectiveness of our proposed optimal bidding strategies and the functional optimisation framework.
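The non-linear shape the abstract refers to can be illustrated with a hedged sketch: if one assumes a win-rate model w(b) = b/(b + c) (a common concave choice) and a budget Lagrange multiplier λ, maximizing expected clicks under the budget constraint yields a bid that is concave in the estimated CTR θ. The constants `c` and `lam` below are illustrative placeholders, not values from the paper.

```python
import math

def nonlinear_bid(theta, c=1.5, lam=5e-4):
    """Sketch of a non-linear bid as a function of estimated CTR theta,
    derived under the assumed win-rate model w(b) = b / (b + c).
    c (market parameter) and lam (budget multiplier) are illustrative
    and would be tuned per campaign."""
    return math.sqrt(c * theta / lam + c * c) - c

def linear_bid(theta, base=300.0):
    """Baseline for comparison: bid proportional to estimated CTR."""
    return base * theta
```

Because the bid-per-unit-CTR ratio of the non-linear form decreases as θ grows, it shifts budget toward many cheaper, lower-CTR impressions, matching the abstract's claim that the lower-valued impressions are the more cost-effective ones.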
Bargaining and Pricing in Networked Economic Systems
2011
Abstract

Cited by 1 (1 self)
Economic systems can often be modeled as games involving several agents or players who act according to their own individual interests. Our goal is to understand how various features of an economic system affect its outcomes, and what may be the best strategy for an individual agent. In this work, we model an economic system as a combination of many bilateral economic opportunities, such as that between a buyer and a seller. The transactions are complicated by the existence of many economic opportunities, and the influence they have on each other. For example, there may be several prospective sellers and buyers for the same item, with possibly differing costs and values. Such a system may be modeled by a network, where the nodes represent players and the edges represent opportunities. We study the effect of network structure on the outcome of bargaining among players, through theoretical
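The network model sketched in this abstract can be made concrete with a tiny example; the players, values, and costs below are invented for illustration and are not from the paper.

```python
# Minimal network of bilateral opportunities: nodes are players, edges
# are potential trades, and each edge carries the surplus (buyer value
# minus seller cost) that the pair would split if they transact.
opportunities = {
    ("seller_a", "buyer_x"): 1.0 - 0.4,  # value 1.0, cost 0.4
    ("seller_a", "buyer_y"): 0.8 - 0.4,
    ("seller_b", "buyer_y"): 0.9 - 0.6,
}

def best_opportunity(player):
    """A player's outside option: the largest-surplus edge it belongs to.
    This is the quantity bargaining outcomes typically hinge on."""
    edges = [(pair, s) for pair, s in opportunities.items() if player in pair]
    return max(edges, key=lambda e: e[1]) if edges else None
```

In bargaining terms, a player with several high-surplus edges (like buyer_y here) has a stronger outside option, and the network structure determines how the surplus on each traded edge is divided.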
Handling Forecast Errors While Bidding for Display Advertising
Abstract

Cited by 1 (0 self)
Most online advertising today is sold via auctions, which require the advertiser to respond with a valid bid within a fraction of a second. As such, most advertisers employ bidding agents to submit bids on their behalf. The architecture of such agents typically has (1) an offline optimization phase, which incorporates the bidder's knowledge about the market, and (2) an online bidding strategy, which simply executes the offline plan. The online strategy typically depends heavily on both supply and expected price distributions, both of which are forecast using traditional machine learning methods. In this work we investigate the optimal strategy for a bidding agent faced with incorrect forecasts. At a high level, the agent can invest resources in improving the forecasts, or can tighten the loop between successive offline optimization cycles in order to detect errors more quickly. We show analytically that the latter strategy, while simple, is extremely effective in dealing with forecast errors, and confirm this finding with experimental evaluations.
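The architecture this abstract describes, an offline optimizer whose plan is executed online, with a tighter re-optimization cycle to absorb forecast errors, can be sketched as a simple control loop. All function names and interfaces below are assumptions for illustration, not the paper's system.

```python
def run_campaign(forecast, observe, reoptimize, horizon, cycle):
    """Sketch of the feedback loop the abstract argues for: rather than
    trusting one offline forecast for the whole horizon, re-run the
    offline optimizer every `cycle` steps on the observed history, so
    forecast errors are detected and corrected quickly.

    observe(plan, t)            -- executes the online strategy at step t
    reoptimize(data, remaining) -- offline phase; returns a new plan
    """
    plan = reoptimize(forecast, remaining=horizon)  # initial offline phase
    history = []
    for t in range(horizon):
        history.append(observe(plan, t))
        if (t + 1) % cycle == 0 and t + 1 < horizon:
            # tighten the loop: refit on what actually happened so far
            plan = reoptimize(history, remaining=horizon - t - 1)
    return history
```

Shrinking `cycle` trades extra offline-optimization cost for faster correction of supply and price-forecast errors, which is the trade-off the paper analyzes.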
Research Statement
Abstract
My field of research is Theoretical Computer Science. My focus has been on the classical and quantum complexity of Boolean functions (including property testing, sensitivity and block sensitivity of Boolean functions, and quantum database search), on electronic commerce, on graph algorithms, and on coding theory. I have designed effective algorithms as well as proved lower bounds for the complexity of problems in these areas.

1 Combinatorial Complexity Measures of Boolean Functions

In my work in the field of combinatorial complexity measures of Boolean functions, I strive to obtain a better understanding of different measures of hardness for Boolean functions and their relation to the amount of resources needed to compute them in several combinatorial models. Previous results in the area have been obtained by insightful identification of the right measures of complexity and the choice of appropriate mathematical tools. Similarly, in my study I apply mathematical ideas and develop tools for analyzing the complexity of Boolean functions.

1.1 Property Testing

Many data sets that arise in fields such as biology, geology, astronomy, climatology, and artificial intelligence are massive. In fact, they are so huge that even reading the whole data set requires impossibly large resources.