Results 1–10 of 11
Simple and efficient local codes for distributed stable network construction
In Proceedings of the 33rd ACM Symposium on Principles of Distributed Computing (PODC), 2014
Abstract

Cited by 4 (4 self)
In this work, we study protocols by which populations of distributed processes can construct networks. In order to highlight the basic principles of distributed network construction, we keep the model minimal in all respects. In particular, we assume finite-state processes that all begin from the same initial state and all execute the same protocol. Moreover, we assume pairwise interactions between the processes that are scheduled by a fair adversary. In order to allow processes to construct networks, we let them activate and deactivate their pairwise connections. When two processes interact, the protocol takes as input the states of the processes and the state of their connection and updates all of them. Initially all connections are inactive, and the goal is for the processes, after interacting and activating/deactivating connections for a while, to end up with a desired stable network. We give protocols (optimal in some cases) and lower bounds for several basic network construction problems such as spanning line, spanning ring, spanning star, and regular network. The expected time to convergence of our protocols is analyzed under a uniform random scheduler. Finally, we prove several universality results by presenting generic protocols that are capable of simulating a Turing Machine (TM) and exploiting it in order to construct a large class of networks. We additionally show how to partition the population into k supernodes, each being a line of log k nodes, for the largest such k. This amount of local memory is sufficient for the supernodes to obtain unique names and exploit their names and their memory to realize nontrivial constructions.
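The pairwise-interaction mechanics in this abstract are easy to simulate. The sketch below is a hypothetical three-rule protocol in the spirit of the paper's spanning star construction (not its exact code): all processes start in the same state, a uniform random scheduler picks interacting pairs, and connections are activated/deactivated until a spanning star stabilizes.

```python
import itertools
import random

def simulate_star(n, steps, seed=0):
    """Toy population-protocol run: identical initial states,
    pairwise interactions chosen by a uniform random scheduler."""
    rng = random.Random(seed)
    state = {v: "leader" for v in range(n)}          # same initial state for all
    active = set()                                   # currently active connections
    pairs = list(itertools.combinations(range(n), 2))
    for _ in range(steps):
        u, v = rng.choice(pairs)
        e = frozenset((u, v))
        if state[u] == "leader" and state[v] == "leader":
            state[v] = "follower"                    # two leaders meet: one survives
            active.add(e)
        elif state[u] != state[v]:
            active.add(e)                            # leader-follower edge: keep it
        else:
            active.discard(e)                        # follower-follower edge: drop it
    return state, active

state, active = simulate_star(6, 50000)
print(sorted(state.values()), len(active))
```

With enough scheduled interactions the leaders collapse to one, follower-follower edges are pruned, and the active connections form a star centered on the surviving leader.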
Strategic Formation of Credit Networks
, 2012
Abstract

Cited by 2 (2 self)
Credit networks are an abstraction for modeling trust between agents in a network. Agents who do not directly trust each other can transact through exchange of IOUs (obligations) along a chain of trust in the network. Credit networks are robust to intrusion, can enable transactions between strangers in exchange economies, and have the liquidity to support a high rate of transactions. We study the formation of such networks when agents strategically decide how much credit to extend each other. When each agent trusts a fixed set of other agents, and transacts directly only with those it trusts, the formation game is a potential game and all Nash equilibria are social optima. Moreover, the Nash equilibria of this game are equivalent in a very strong sense: the sequences of transactions that can be supported from each equilibrium credit network are identical. When we allow transactions over longer paths, the game may not
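The IOU mechanism can be made concrete with one common formalization of credit networks (an illustrative sketch; the `pay` helper and the direction convention are assumptions, not the paper's notation): a payment routed along a trust chain consumes the credit each hop extends to the payer and frees the same amount in the reverse direction.

```python
def pay(credit, path, amount):
    """Route a payment along a trust chain.
    credit[(u, v)] = remaining credit u extends to v, i.e. the
    amount of v's IOUs that u is still willing to hold."""
    edges = list(zip(path, path[1:]))
    # Feasibility check first: every hop needs enough credit toward the payer.
    for payer, payee in edges:
        if credit.get((payee, payer), 0) < amount:
            raise ValueError(f"insufficient credit on edge ({payee}, {payer})")
    for payer, payee in edges:
        credit[(payee, payer)] -= amount                       # payee now holds payer's IOUs
        credit[(payer, payee)] = credit.get((payer, payee), 0) + amount  # reverse credit opens up

credit = {("b", "a"): 10, ("c", "b"): 10}   # b trusts a, c trusts b
pay(credit, ["a", "b", "c"], 4)             # a pays c through the intermediary b
print(credit)
```

Note that total credit on each edge pair is conserved; a later payment from "c" back to "a" could reuse the reverse credit just created, which is the liquidity property the abstract refers to.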
Intermediation and Voluntary Exposure to
, 2013
Abstract
I develop a model of the financial sector in which endogenous intermediation among debt-financed banks generates excessive systemic risk. The central idea is to explore the possibility that certain financial institutions are able to use their lending and borrowing decisions to tilt the division of surplus in their own favor by capturing intermediation spreads, even if the implied change in the structure of the financial system hurts the total surplus of the economy. The paper predicts that there is excessive connection among banks that make risky investments and too little connection among those that mainly provide funding. Inefficiency arises because the financial institutions that intermediate among other institutions are exposed to excessive counterparty risk: replacing them with certain other banks mitigates the extent of failure when it is inevitable, without hurting the optimal level of investment. In equilibrium, intermediators choose to overexpose themselves to other risky banks and suffer the cost of failure due to contagion if they absorb enough rents when they survive.
Local Bargaining and Market Fluctuations, Thành Nguyen
, 2012
Abstract
We study how local bargaining in a networked market can cause endogenous fluctuations, by a new approach that incorporates noncooperative bargaining into a large networked economy. In particular, we consider a networked bargaining game that captures trade with intermediaries and define its replications. We examine the agents' behavior in the limit as the population size goes to infinity: a limit stationary equilibrium exists if there is a converging sequence of semi-stationary equilibria in the finite replications. The existence of a limit stationary equilibrium captures the hypothesis that when the market gets large, the agents will behave myopically and the market will be stable. However, we prove that limit stationary equilibria need not exist even when market fundamentals are deterministic, agents are patient, and all share a common belief. This shows that in our setting the underlying network is the main friction that hinders stationary markets.
Keywords: Noncooperative Bargaining, Network Games.
Matrix Metrics: Network-Based Systemic Risk Scoring
, 2015
Abstract
I propose a novel framework for network-based systemic risk measurement and management. I define a new systemic risk score that depends on the level of individual risk at each financial institution and the interconnectedness across institutions, and is generally applicable irrespective of how interconnectedness is defined. This risk metric is decomposable into risk contributions from each entity, forming a basis for taxing each entity appropriately. We may calculate risk increments to assess the potential risk of each entity on the overall financial system. The paper develops other subsidiary risk measures such as system fragility and entity criticality. An assessment using a measure of spillover risk is obtained to determine the scale of externalities that one bank might impose on the system; the metric is robust to this cross risk, and does not induce predatory spillovers. The analysis suggests that splitting up too-big-to-fail banks does not lower systemic risk.
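The decomposability claim can be checked for one natural choice of score (an assumption for illustration, not necessarily the paper's exact definition): take S(c) = sqrt(c'Ac), where c holds the individual risk levels and A is a symmetric interconnectedness matrix. Since S is homogeneous of degree one, Euler's theorem gives S = Σᵢ cᵢ ∂S/∂cᵢ, so the per-entity contributions sum exactly to the total score.

```python
import math

def score(A, c):
    # S(c) = sqrt(c' A c): c = individual risk levels, A = interconnectedness.
    n = len(c)
    return math.sqrt(sum(c[i] * A[i][j] * c[j] for i in range(n) for j in range(n)))

def contributions(A, c):
    # For symmetric A, dS/dc_i = (A c)_i / S; Euler's theorem then yields
    # sum_i c_i * dS/dc_i = S, i.e. an exact additive decomposition.
    n = len(c)
    S = score(A, c)
    Ac = [sum(A[i][j] * c[j] for j in range(n)) for i in range(n)]
    return [c[i] * Ac[i] / S for i in range(n)]

A = [[1, 1, 0], [1, 1, 1], [0, 1, 1]]   # toy symmetric interconnectedness matrix
c = [2.0, 1.0, 3.0]                     # toy individual risk levels
print(score(A, c), sum(contributions(A, c)))
```

The decomposition is what makes per-entity "taxing" well defined: each entity's contribution reflects both its own risk level and its connections.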
Simple and Efficient Local Codes for Distributed Stable Network Construction (Distributed Computing, journal version)
Abstract
In this work, we study protocols by which populations of distributed processes can construct networks. In order to highlight the basic principles of distributed network construction, we keep the model minimal in all respects. In particular, we assume finite-state processes that all begin from the same initial state and all execute the same protocol. Moreover, we assume pairwise interactions between the processes that are scheduled by a fair adversary. In order to allow processes to construct networks, we let them activate and deactivate their pairwise connections. When two processes interact, the protocol takes as input the states of the processes and the state of their connection and updates all of them. Initially all connections are inactive and the goal is for the processes, after interacting and activating/deactivating connections for a while, to end up with a desired stable network. We give protocols (optimal in some cases) and lower bounds for several basic network construction problems such as spanning line, ...
On the Resilience of Bipartite Networks
, 2013
Abstract
Motivated by problems modeling the spread of infections in networks, in this paper we explore which bipartite graphs are most resilient to widespread infections under various parameter settings. Namely, we study bipartite networks with a requirement of a minimum degree d on one side under an independent-infection, independent-transmission model. We completely characterize the optimal graphs in the case d = 1, which already produces nontrivial behavior, and we give some extremal results for the more general cases. We also show that determining the subgraph of an arbitrary bipartite graph most resilient to infection is NP-hard for any one-sided minimum degree d ≥ 1.
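The independent-infection, independent-transmission model gives a closed form for the expected number of uninfected nodes on the unconstrained side: if each left node is infected independently with probability p and transmits along each incident edge independently with probability q, a right node of degree k stays healthy with probability (1 - pq)^k. A minimal sketch with hypothetical parameter values (the graphs and p, q below are illustrative, not from the paper):

```python
def expected_healthy(right_adj, p, q):
    """Expected number of right-side nodes escaping infection when each
    left node is infected independently w.p. p and transmits along each
    incident edge independently w.p. q."""
    # A right node stays healthy iff no neighbour both catches AND transmits:
    # probability (1 - p*q) per neighbour, independent across neighbours.
    return sum((1 - p * q) ** len(neigh) for neigh in right_adj)

p, q = 0.3, 0.5
matching = [[0], [1], [2], [3]]               # minimum degree d = 1: one edge each
doubled = [[0, 1], [1, 2], [2, 3], [3, 0]]    # right-side degree 2
print(expected_healthy(matching, p, q), expected_healthy(doubled, p, q))
```

Extra edges only lower the expected healthy count, which is why resilience under a minimum-degree requirement hinges on how the mandatory edges are arranged rather than on adding more of them.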
To lag or not to lag? How to compare indices of stock markets that operate at different times.
, 2013
Abstract
Financial markets worldwide do not have the same working hours. As a consequence, the study of correlation or causality between financial market indices becomes dependent on whether we should include in computations of correlation matrices all indices on the same day or lagged indices. The answer this article proposes is that we should consider both. In this work, we use 79 indices of a diversity of stock markets across the world in order to study their correlation structure, and discover that, by representing original and lagged indices in the same network, we obtain a better understanding of how indices that operate at different hours relate to each other.
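The point about same-day versus lagged correlation can be illustrated with synthetic data (hypothetical series, not the 79 real indices): if market B simply reacts to market A's previous day, the same-day correlation is near zero while the one-day-lagged correlation is strong, so a network built only from same-day pairs would miss the link entirely.

```python
import math
import random

def pearson(x, y):
    # Plain Pearson correlation of two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

rng = random.Random(42)
a = [rng.gauss(0, 1) for _ in range(500)]                        # returns of market A
b = [a[t - 1] + 0.3 * rng.gauss(0, 1) for t in range(1, 500)]    # B reacts one day late

same_day = pearson(a[1:], b)    # pair A_t with B_t
lagged = pearson(a[:-1], b)     # pair A_{t-1} with B_t
print(round(same_day, 2), round(lagged, 2))
```

Including both the original and the lagged version of each index in the same correlation network, as the article proposes, captures both kinds of relationship at once.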