Results 1–6 of 6
Modelling Reciprocating Relationships with Hawkes Processes
Abstract

Cited by 27 (3 self)
We present a Bayesian nonparametric model that discovers implicit social structure from interaction time-series data. Social groups are often formed implicitly, through actions among members of groups. Yet many models of social networks use explicitly declared relationships to infer social structure. We consider a particular class of Hawkes processes, a doubly stochastic point process, that is able to model reciprocity between groups of individuals. We then extend the Infinite Relational Model by using these reciprocating Hawkes processes to parameterise its edges, making events associated with edges codependent through time. Our model outperforms general, unstructured Hawkes processes as well as structured Poisson process-based models at predicting verbal and email turn-taking, and military conflicts among nations.
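The abstract's key mechanism is mutual excitation: an event by one party transiently raises the other party's event rate, which is how reciprocity (A emails B, so B is more likely to reply soon) can be captured. A minimal sketch of the conditional intensities involved, using an exponential kernel; the parameter names (mu, alpha, beta) are illustrative, not the paper's notation:

```python
import math

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of a univariate self-exciting Hawkes process:
    lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)).
    Each past event transiently raises the rate above the baseline mu."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

def reciprocal_intensity(t, own_events, partner_events,
                         mu, alpha_self, alpha_cross, beta):
    """Mutually exciting (bivariate) variant: events by the *partner*
    also raise this process's rate, modelling reciprocation."""
    lam = mu
    lam += sum(alpha_self * math.exp(-beta * (t - ti))
               for ti in own_events if ti < t)
    lam += sum(alpha_cross * math.exp(-beta * (t - ti))
               for ti in partner_events if ti < t)
    return lam
```

The exponential kernel makes the excitation decay geometrically with elapsed time, so recent interactions dominate the predicted rate.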
Infinite Dynamic Bayesian Networks
In: International Conference on Machine Learning (ICML), 2011.
Abstract

Cited by 10 (0 self)
We present the infinite dynamic Bayesian network model (iDBN), a nonparametric, factored state-space model that generalizes dynamic Bayesian networks (DBNs). The iDBN can infer every aspect of a DBN: the number of hidden factors, the number of values each factor can take, and (arbitrarily complex) connections and conditionals between factors and observations. In this way, the iDBN generalizes other nonparametric state-space models, which until now have generally focused on binary hidden nodes and more restricted connection structures. We show how this new prior allows us to find interesting structure in benchmark tests and on two real-world datasets involving weather data and neural information flow networks.
The Indian Buffet Process: Scalable Inference and Extensions
Abstract

Cited by 4 (0 self)
August 2009. This dissertation is the result of my own work and includes nothing which is the outcome of work done in collaboration except where specifically indicated in the text. © Copyright by Finale Doshi-Velez, 2009. Many unsupervised learning problems seek to identify hidden features from observations. In many real-world situations, the number of hidden features is unknown. To avoid specifying the number of hidden features a priori, one can use the Indian Buffet Process (IBP): a nonparametric latent feature model that does not bound the number of active features in a dataset. While elegant, the lack of efficient inference procedures for the IBP has prevented its application in large-scale problems. The core contributions of this thesis are three new inference procedures that allow inference in the IBP to be scaled from a few hundred to 100,000 observations. This thesis contains three parts:
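The IBP's generative "buffet" metaphor is simple to state in code: customer i takes each previously sampled dish k with probability m_k / i, then tries a Poisson(alpha / i) number of new dishes. A minimal sketch of the prior itself (not one of the thesis's scalable inference procedures); function and variable names are illustrative:

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's inversion method for a Poisson sample (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_ibp(num_customers, alpha, seed=0):
    """Draw a binary feature matrix from the Indian Buffet Process prior.
    Customer i takes each previously tasted dish k with probability
    m_k / i (m_k = number of earlier customers who took it), then takes
    Poisson(alpha / i) brand-new dishes; rows are padded to equal width."""
    rng = random.Random(seed)
    dish_counts = []   # m_k for each dish seen so far
    rows = []
    for i in range(1, num_customers + 1):
        row = []
        for k, m_k in enumerate(dish_counts):
            take = 1 if rng.random() < m_k / i else 0
            row.append(take)
            dish_counts[k] += take
        new = poisson_draw(rng, alpha / i)   # brand-new dishes for customer i
        row.extend([1] * new)
        dish_counts.extend([1] * new)
        rows.append(row)
    width = len(dish_counts)
    return [r + [0] * (width - len(r)) for r in rows]
```

Because the expected number of new dishes shrinks as alpha / i, the total number of active features grows only logarithmically with the number of observations, yet is never bounded in advance.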
MONDRIAN HIDDEN MARKOV MODEL FOR MUSIC SIGNAL PROCESSING
Abstract
This paper discusses a new extension of hidden Markov models that can capture clusters embedded in transitions between the hidden states. In our model, the state-transition matrices are viewed as representations of relational data reflecting a network structure between the hidden states. We specifically present a nonparametric Bayesian approach to the proposed state-space model whose network structure is represented by a Mondrian Process-based relational model. We show an application of the proposed model to music signal analysis through some experimental results. Index Terms — Bayesian nonparametrics, hidden Markov model, Mondrian process
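The Mondrian Process used here for relational structure is a recursive random partition of a box by axis-aligned cuts, stopped by a "budget". A minimal sampler sketch under that budget-based formulation; names and the 2D usage are illustrative:

```python
import random

def sample_mondrian(budget, box, rng):
    """Sample a Mondrian process partition of an axis-aligned box.
    box: list of (low, high) intervals, one per dimension. A cut arrives
    after an Exp(sum of side lengths) cost; if that exceeds the remaining
    budget the box is returned as a leaf block. Otherwise a dimension is
    chosen with probability proportional to its length, cut uniformly,
    and both halves are partitioned recursively with the leftover budget."""
    lengths = [hi - lo for lo, hi in box]
    total = sum(lengths)
    cost = rng.expovariate(total)
    if cost > budget:
        return [box]  # leaf: one block of the final partition
    # pick the dimension to cut, proportional to its side length
    u = rng.random() * total
    d, acc = 0, lengths[0]
    while u > acc:
        d += 1
        acc += lengths[d]
    lo, hi = box[d]
    cut = rng.uniform(lo, hi)
    left = box[:d] + [(lo, cut)] + box[d + 1:]
    right = box[:d] + [(cut, hi)] + box[d + 1:]
    remaining = budget - cost
    return (sample_mondrian(remaining, left, rng) +
            sample_mondrian(remaining, right, rng))
```

In the relational setting, the unit square is indexed by (row state, column state) and each leaf block groups a cluster of state-transition entries that share parameters.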
Research Experience: Research Scientist
, 2001
Abstract
Actively pursuing research into structured dynamical systems modeling with Bayesian nonparametrics, planning and model building for reinforcement learning, structured policy priors for policy learning, and universal inference for probabilistic programming languages. Current applied thrusts include reinforcement learning for multi-core systems, machine learning for oil discovery, and generative models of machine vision. Contributed to funding efforts for AFOSR and Shell Oil.