The multiple measurement vector (MMV) problem is a generalization of the compressed sensing problem that addresses the recovery of a set of jointly sparse signal vectors. One of the important contributions of this paper is to show that two seemingly unrelated classes of state-of-the-art MMV joint sparse recovery algorithms, the M-SBL (multiple sparse Bayesian learning) algorithm and subspace-based hybrid greedy algorithms, are in fact closely linked. More specifically, we show that replacing the log det(·) term in the M-SBL cost function by a rank surrogate that exploits the spark reduction property discovered in the subspace-based joint sparse recovery algorithms provides significant performance improvements. In particular, if the Schatten-p quasi-norm is used as the rank surrogate, the global minimizer of the cost function in the proposed algorithm becomes identical to the true solution as p → 0. Furthermore, under regularity conditions, we show that an alternating minimization algorithm, in which each minimization step is convex and has a closed-form solution, is guaranteed to converge to a local minimizer. Numerical simulations under a variety of scenarios in terms of SNR and the condition number of the signal amplitude matrix show that the proposed algorithm consistently outperforms M-SBL and other state-of-the-art algorithms.
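As a quick numerical illustration of the rank-surrogate idea (a minimal sketch of the general mathematical fact, not the paper's algorithm; the matrix and helper name below are illustrative assumptions), the Schatten-p quasi-norm of a matrix, i.e. the sum of its singular values each raised to the power p, tends to the matrix rank as p → 0:

```python
import numpy as np

def schatten_p(X, p, tol=1e-10):
    """Schatten-p quasi-norm raised to the p-th power:
    sum of (nonzero) singular values to the power p, 0 < p <= 1.
    Numerically-zero singular values are thresholded out, since
    eps**p does not vanish for tiny p."""
    s = np.linalg.svd(X, compute_uv=False)
    s = s[s > tol * s.max()]
    return float(np.sum(s ** p))

# Illustrative rank-2 matrix (sum of two rank-1 outer products).
X = np.outer([1.0, 2.0, 3.0], [1.0, 0.0, 1.0]) \
    + np.outer([0.0, 1.0, 1.0], [2.0, 1.0, 0.0])

# As p shrinks, each nonzero singular value sigma**p -> 1,
# so the surrogate approaches rank(X) = 2.
for p in (1.0, 0.5, 0.1, 0.01):
    print(p, schatten_p(X, p))
```

This is the sense in which the Schatten-p quasi-norm serves as a smooth surrogate for rank minimization: for p = 1 it reduces to the (convex) nuclear norm, and decreasing p interpolates toward counting the nonzero singular values.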