Continuous-Time Markov Chain

We define the model in terms of a continuous-time Markov chain, which follows naturally if you first consider a change to only one tie-variable. At time point t0, say, there is no tie between Bob and Fredric; at time point t1, there is. We know that the tie was created between t0 and t1, but not when, nor whether it was created, dissolved, and then re-created in the meantime. Continuous time is important here because we model all changes to all tie-variables, and conceptually it makes a difference whether another tie, say between Bob and Erica, was formed before or after the tie between Bob and Fredric. After all, we are interested in modeling the dynamics, that is, how one change of the network reshapes the conditions for other potential changes to the network.

The assumption that the network changes gradually between observations is translated mathematically into the specification of a time-dependent network X(t), where the time parameter t takes all real values from the first to the last observation moment: the time domain is the interval [t0, tM-1]. One could say that in any split second, some tie might change. These individual tie changes are not observed; only the total network at the observation moments t0 through tM-1 is observed. Between t0 and tM-1, there will be a finite number of times when the network changes, and at all other times the network stays constant. Thus, we have a continuous time parameter t, as well as both observations and changes at discrete moments, usually many more moments of change than moments of observation.
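Piecewise constancy means that X(t) at any time t is simply the state left behind by the most recent change at or before t. A minimal Python sketch of this lookup (the function name and data layout are illustrative assumptions, not notation from the text):

```python
import bisect

def network_at(t, change_times, states, initial):
    """Evaluate a piecewise-constant process X(t).

    change_times: sorted list of moments at which the network changed.
    states[k]: the network state holding from change_times[k] until the
    next change. Between change moments the network stays fixed, so
    X(t) is the state set by the last change at or before t.
    """
    k = bisect.bisect_right(change_times, t)
    return initial if k == 0 else states[k - 1]
```

For example, with changes at times 1.0 and 2.5, querying any t in [1.0, 2.5) returns the state produced by the first change, reflecting that the network is constant between change moments.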

A convenient second assumption was proposed by Holland and Leinhardt (1977): at any moment, when the network changes, only one tie-variable can change. Thus, ties do not appear or disappear together but rather one by one. This assumption not only decomposes the change process into its smallest constituents but also rules out coordination, swapping partners, instantaneous group formation, and the like.
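Under this assumption, the states reachable from a given network in a single change are exactly those differing from it in one tie-variable: for a directed graph on n nodes, there are n(n-1) of them. A small Python sketch (the function name and set-of-dyads representation are illustrative choices, not from the source):

```python
def one_step_neighbours(ties, n):
    """Enumerate the networks reachable by one change, under the
    Holland-Leinhardt assumption that exactly one tie-variable changes
    at a time. `ties` is a set of directed dyads (i, j) on n nodes;
    the result has n*(n-1) entries, one per tie-variable flipped."""
    out = []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            nxt = set(ties)
            nxt ^= {(i, j)}      # flip exactly one tie-variable
            out.append(frozenset(nxt))
    return out
```

From the empty network on 3 nodes, this yields 6 distinct neighbouring networks, each containing a single tie; simultaneous changes (such as two actors swapping partners in one instant) are not among them.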

The third assumption is that the change probabilities of the network depend on the current state of the network but not on previous states; there is no memory, as it were. Mathematically, this assumption is expressed by saying that the stochastic process X(t) is a continuous-time Markov process (further defined in textbooks such as Norris (1997)) on the set of all graphs on n nodes.
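The three assumptions together can be made concrete in a small simulation: exponential waiting times give memorylessness, and each event flips exactly one tie-variable. The sketch below is illustrative only; for simplicity it assumes a constant flip rate for every dyad, whereas the models discussed here make rates depend on the current network.

```python
import random

def simulate_ctmc(n, rate, t_end, seed=0):
    """Minimal sketch of a continuous-time Markov chain on directed
    graphs with n nodes. Each of the n*(n-1) tie-variables flips at a
    constant (hypothetical) rate; waiting times between events are
    exponential, so the future depends only on the current state."""
    rng = random.Random(seed)
    ties = set()                     # current network state X(t)
    dyads = [(i, j) for i in range(n) for j in range(n) if i != j]
    t, history = 0.0, []
    while True:
        # total rate of leaving the current state; memoryless waiting time
        total = rate * len(dyads)
        t += rng.expovariate(total)
        if t > t_end:
            break
        # exactly one tie-variable changes at each event
        i, j = rng.choice(dyads)
        if (i, j) in ties:
            ties.discard((i, j))
        else:
            ties.add((i, j))
        history.append((t, (i, j)))
    return ties, history
```

Replaying `history` reconstructs the full path of X(t): a finite number of single-tie changes at random moments, with the network constant in between. An observer who saw only X(0) and X(t_end) would miss any tie that was created and dissolved again in the interval, which is precisely the situation described above.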

132 Exponential Random Graph Models for Social Networks
