On a preferential attachment model
We study a general preferential attachment model.
Consider a random graph which evolves in time. At each step a new
vertex is introduced, which can be connected to at most one existing vertex.
If it arrives disconnected, it becomes a pioneer vertex. Otherwise, it
attaches to an existing pioneer vertex with probability proportional
to a function of the degree of that vertex.
This function is allowed to depend on the vertex, and is called the
reinforcement function. We assume only that these functions are
strictly positive. We prove that there can be at most three phases
in this model, depending on the behavior of the reinforcement functions.
Consider the set of vertices whose degree tends to infinity almost
surely. We prove that this set is either empty, contains exactly one
element, or contains all the pioneer vertices.
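As a concrete illustration of the dynamics, here is a minimal simulation sketch. The pioneer probability `p_pioneer` and the reinforcement function `f(v, k) = (k + 1)**1.5` are hypothetical choices made only for this example (the latter so that the function is strictly positive, as the model requires); they are not part of the stated results.

```python
import random

def simulate(T, p_pioneer=0.1, f=lambda v, k: (k + 1) ** 1.5, seed=0):
    """Run T steps of the preferential attachment dynamics.

    Hypothetical parameters: p_pioneer is the chance a new vertex
    arrives disconnected (becoming a pioneer); f(v, k) is the
    reinforcement function of pioneer vertex v at degree k, which
    must be strictly positive.
    """
    rng = random.Random(seed)
    degree = {0: 0}      # vertex 0 starts as the first pioneer
    pioneers = [0]
    for t in range(1, T + 1):
        if rng.random() < p_pioneer:
            # Disconnected arrival: the new vertex is a pioneer.
            degree[t] = 0
            pioneers.append(t)
        else:
            # Attach to a pioneer v with probability proportional
            # to f(v, degree(v)).
            weights = [f(v, degree[v]) for v in pioneers]
            r = rng.random() * sum(weights)
            for v, w in zip(pioneers, weights):
                r -= w
                if r <= 0:
                    break
            degree[v] += 1
            degree[t] = 1
    return degree, pioneers
```

Tracking which pioneer degrees diverge as `T` grows is what the three-phase result classifies: no vertex, exactly one vertex, or every pioneer vertex.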
Moreover, we describe the phase transition in the case where the
reinforcement function is the same for all vertices. Our results
are general; in particular, we do not assume monotonicity of
the reinforcement functions. Our proofs rely on a generalization of the Rubin construction for edge-reinforced random walks.
This is joint work with Codina Cotar and Marco Li Calzi.