Statistical graphical models are receiving growing attention from many research fields, including artificial intelligence, medical diagnosis, education, cognitive science, biology, data mining, and machine learning, among other applied sciences. Among the reasons are their intuitive interpretation and representation: the model structure is presented as a graph, with directed or undirected edges drawn between nodes (variables) according to the intrinsic relationship between the corresponding variables.
Necessary and sufficient conditions for collapsibility of a hierarchical log-linear model for a multidimensional contingency table were given by Asmussen and Edwards. We show that these conditions can be combined with recursive graphical models, yielding a necessary and sufficient condition for collapsibility of recursive graphical models.
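To make the flavor of such conditions concrete, the following is a minimal sketch, assuming the graphical special case of the Asmussen–Edwards criterion: an undirected graphical model is collapsible onto a subset of variables when the boundary of every connected component of the complement induces a complete subgraph. The function name and representation are illustrative, not from the paper.

```python
from itertools import combinations

def is_collapsible(edges, nodes, keep):
    """Check collapsibility of an undirected graphical model onto `keep`:
    the boundary of each connected component of the complement of `keep`
    must induce a complete subgraph (graphical case of Asmussen-Edwards)."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    rest = set(nodes) - set(keep)
    seen = set()
    for start in rest:
        if start in seen:
            continue
        # flood-fill one connected component of the complement
        comp, stack = set(), [start]
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend((adj[v] & rest) - comp)
        seen |= comp
        # boundary = neighbours of the component that lie inside `keep`
        bd = set().union(*(adj[v] for v in comp)) & set(keep)
        # the boundary must be complete (every pair adjacent)
        if any(b not in adj[a] for a, b in combinations(sorted(bd), 2)):
            return False
    return True
```

For example, a triangle a-b-c with a pendant node d attached to a is collapsible onto {a, b, c} (the boundary of {d} is the single node a), while a four-cycle a-b-c-d-a is not collapsible onto {a, b, c} (the boundary of {d} is {a, c}, which are not adjacent).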
We propose an ML estimation method for recursive models of categorical variables that are too large to handle as a single model. We first split the whole model into a set of submodels that can be arranged in the form of a tree. Two conditions are suggested under which the parameters of the whole model can be estimated while working within individual submodels. Theorems are proved showing that, when missing values are involved, the EM principle generalizes to the tree of submodels, so that ML estimation is possible for a recursive model of any size. For illustration, simulation experiments are carried out for recursive models of up to 158 binary variables, and the proposed method is applied successfully to real data involving 28 binary variables.
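The EM principle invoked above can be illustrated on the smallest possible recursive model. The sketch below, which assumes a hypothetical two-variable model X → Y with binary variables and X occasionally missing (it is not the paper's tree-of-submodels algorithm), alternates an E-step that distributes each incomplete record over the values of the missing parent and an M-step that re-estimates the conditional probability tables from the expected counts.

```python
def em_binary_pair(data, iters=50):
    """EM for a two-variable recursive model X -> Y (both binary).
    `data` is a list of (x, y) pairs where x may be None (missing).
    Returns (p_x, p_y_given_x) as nested lists."""
    # Initial parameters: any interior point of the simplex works.
    px = [0.5, 0.5]
    py = [[0.5, 0.5], [0.5, 0.5]]  # py[x][y]
    for _ in range(iters):
        cx = [1e-9, 1e-9]                    # expected counts of X
        cxy = [[1e-9] * 2 for _ in range(2)]  # expected joint counts of (X, Y)
        for x, y in data:
            if x is None:
                # E-step: posterior over the missing parent given y
                w = [px[v] * py[v][y] for v in (0, 1)]
                z = w[0] + w[1]
                for v in (0, 1):
                    cx[v] += w[v] / z
                    cxy[v][y] += w[v] / z
            else:
                cx[x] += 1.0
                cxy[x][y] += 1.0
        # M-step: re-estimate the tables from expected counts
        n = cx[0] + cx[1]
        px = [cx[v] / n for v in (0, 1)]
        py = [[cxy[v][y] / cx[v] for y in (0, 1)] for v in (0, 1)]
    return px, py
```

On fully observed data the procedure reduces to relative-frequency estimation; for instance, three records (0, 0) and one record (1, 1) give P(X=0) ≈ 0.75 and P(Y=0 | X=0) ≈ 1.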