In semi-blind channel estimation, the choice of the regularization parameter that weights the blind criterion when it is linearly combined with the training-based least-squares criterion strongly affects channel estimation performance. When a scalar regularization factor is used, it has been observed that its optimal value admits no closed-form expression. In a recent work, we showed that using a regularization matrix instead not only improves performance but also yields a closed-form expression for the optimal regularization matrix, namely the one that minimizes the asymptotic mean-square error of the channel estimate. In this paper, we generalize this result to multiple-input multiple-output orthogonal frequency-division multiplexing (MIMO-OFDM) systems. As an application, we compare the performance of linear-prediction and subspace semi-blind estimators. In particular, we assess the accuracy of the derived results through simulations and investigate the bit-error-rate performance as well as the impact of channel overmodeling.
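As a rough sketch of the setting (the notation below is generic and not necessarily the paper's own), the scalar-regularized semi-blind criterion combines the training-based least-squares term with a blind term weighted by a factor $\lambda$:

\[
\hat{\mathbf{h}} = \arg\min_{\mathbf{h}} \; \|\mathbf{y} - \mathbf{A}\mathbf{h}\|^2 + \lambda \, J_{\mathrm{blind}}(\mathbf{h}),
\]

whereas, for a quadratic blind criterion of the form $J_{\mathrm{blind}}(\mathbf{h}) = \mathbf{h}^{H}\mathbf{Q}\,\mathbf{h}$, the matrix-regularized variant replaces the scalar $\lambda$ with a weighting matrix $\boldsymbol{\Lambda}$:

\[
\hat{\mathbf{h}} = \arg\min_{\mathbf{h}} \; \|\mathbf{y} - \mathbf{A}\mathbf{h}\|^2 + \mathbf{h}^{H}\boldsymbol{\Lambda}\,\mathbf{Q}\,\mathbf{h}.
\]

Here $\mathbf{y}$, $\mathbf{A}$, $\mathbf{Q}$, and $\boldsymbol{\Lambda}$ are illustrative placeholders (received training data, training matrix, blind-criterion kernel, and regularization matrix); the extra degrees of freedom in $\boldsymbol{\Lambda}$ are what make a closed-form optimal choice tractable, as claimed above.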