To maintain a real-time integrated system, the many constraints that result from each module must be considered. The perceptual (vision) and output (graphics) systems must both run in real time to avoid lag and the complexities it would introduce. In addition, the learning system must generate its predictions in real time as well.

The constraints on the speed of the learning system limit the complexity it can have. If the pdf used is too complex, computation may be slow and the system will stall, which strongly degrades the quality of the interaction. Thus, the dimensionality of the representation must be chosen carefully. Because principal components analysis (PCA) was used, it is straightforward to reduce the dimensionality of the space by retaining fewer eigenvectors. Similarly, because a conditioned mixture model is used as the pdf, it is straightforward to reduce the learning system's complexity by discarding models.
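The eigenvector truncation described above can be sketched as follows. This is an illustrative NumPy fragment, not the thesis's implementation; the function name and data are ours. Shrinking `k` directly shrinks the input dimensionality seen by the learning system.

```python
import numpy as np

def pca_truncate(data, k):
    """Project data onto its top-k principal components.

    Using fewer eigenvectors (smaller k) trades representational
    accuracy for a cheaper, faster learning problem.
    """
    mean = data.mean(axis=0)
    centered = data - mean
    # Eigen-decomposition of the covariance; keep the k largest eigenvectors.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:k]
    basis = eigvecs[:, order]          # D x k projection basis
    return centered @ basis, basis

# 10-D observations reduced to a 3-D subspace.
data = np.random.default_rng(0).normal(size=(200, 10))
coeffs, basis = pca_truncate(data, 3)
```

Pruning mixture components is analogous: components with negligible mixing weight can simply be dropped and the remaining weights renormalized.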

On the other hand, if high accuracy is required and real-time interaction is not important, more dimensions and more models can be used. Thus, many possible configurations exist as the learning system's power is varied to obtain different solutions. For instance, a very simple interaction might be captured with fewer models and fewer dimensions, whereas an extremely involved human-to-human interaction lasting many minutes might call for a more complex model. The ease with which modeling resources can be increased or decreased allows for this range of operation.

Finally, we address the training of the learning algorithm with CEM. As is well known in the machine learning community, models that are too complex overfit and generalize poorly, while over-simplified models underfit and again generalize poorly. Thus, another critical issue arises as dimensionality and complexity are varied.
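The underfitting/overfitting trade-off can be made concrete with a held-out validation set: an over-simple model and an over-complex model both score worse on held-out data than a model of appropriate complexity. The sketch below illustrates this generic principle with polynomial regression rather than the mixture model used in the text; all names and data are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
# Noisy samples of a smooth function, split into train and validation sets.
x = rng.uniform(-1, 1, 60)
y = np.sin(3 * x) + rng.normal(0, 0.1, 60)
x_tr, y_tr, x_va, y_va = x[:40], y[:40], x[40:], y[40:]

def val_error(degree):
    """Fit a polynomial of the given degree; return held-out RMS error."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    resid = np.polyval(coeffs, x_va) - y_va
    return np.sqrt(np.mean(resid ** 2))

# Degree 1 underfits this curve; a moderate degree tracks it far better.
errors = {d: val_error(d) for d in (1, 5, 15)}
```

The same held-out comparison applies when choosing the number of eigenvectors and mixture components for the learning system.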

Given the annealing possibility presented earlier, some of the overfitting and underfitting problems can be avoided by using CEM to find a more global estimate. This is typically a good way to address these complexity issues. If properly annealed, the learning algorithm is more likely to avoid degenerate and overfit solutions. However, it is also important to initialize the learning algorithm well so that it converges to a desired solution.

To summarize, the integrated system is not a black box and still has some subtleties that need to be addressed. Several parameters influence its efficiency, complexity, effectiveness, and so on, and there are principled ways to set them (refer to [5, 49]).