This is similar to linear regression, but instead of a single dependent variable y we have multiple output variables. It may be written as,
Y = XB + U ,
where Y is a matrix of multivariate measurements (each column being a set of measurements on one of the dependent variables), X is a matrix of observations on the independent variables, possibly a design matrix (each column being a set of observations on one of the independent variables), B is a matrix of parameters that are usually to be estimated, and U is a matrix containing the errors (noise).
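As a minimal sketch of the model above (all numbers invented for illustration), B can be estimated by ordinary least squares; NumPy's `lstsq` accepts a multi-column Y and solves each outcome column at a time:

```python
import numpy as np

# Hedged sketch of Y = XB + U: estimate B by ordinary least squares.
rng = np.random.default_rng(0)
n, p, k = 50, 3, 2                      # observations, predictors, outcomes
X = rng.normal(size=(n, p))
B_true = np.array([[1.0, -2.0],
                   [0.5,  0.0],
                   [3.0,  1.5]])        # true parameter matrix, shape (p, k)
Y = X @ B_true + 0.01 * rng.normal(size=(n, k))   # U is small noise here

# B_hat has one column of coefficients per dependent variable.
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)     # shape (p, k)
```

With the small noise used here, `B_hat` recovers `B_true` closely.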

Representing it in the old form,

E(\beta_0, \beta_1, \beta_2, \dots, \beta_m) = \sum_{i=1}^{n} (h_\theta(x_i) - y_i)^2

Multivariate Linear Regression vs Multiple Linear Regression
In multivariate regression there is more than one dependent variable, each with its own variance (or distribution). There may be one or several predictor variables.
In multiple regression there is just one dependent variable, y, but multiple predictor variables.
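The distinction above is easiest to see in the shapes of the target arrays (toy data, invented purely for illustration):

```python
import numpy as np

# Multiple regression: several predictor columns, ONE outcome vector.
# Multivariate regression: the outcomes stack into a matrix, one column
# per dependent variable.
n = 10
X = np.ones((n, 3))                 # 3 predictor columns, n observations

y_multiple = np.zeros(n)            # multiple regression target: (n,)
Y_multivariate = np.zeros((n, 2))   # multivariate target: (n, 2)

print(y_multiple.shape, Y_multivariate.shape)   # (10,) (10, 2)
```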
Why use Multivariate Linear Regression?
Suppose that in a medical trial the predictors are weight, age, and race, and the outcome variables are blood pressure and cholesterol. We could, in theory, create two “multiple regression” models, one regressing blood pressure on weight, age, and race, and a second regressing cholesterol on those same factors. Alternatively, we could create a single multivariate regression model that predicts both blood pressure and cholesterol simultaneously from the three predictor variables. The idea is that the multivariate model may be better (more predictive) to the extent that it can learn from the correlation between blood pressure and cholesterol in patients.[1]
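A sketch of this trial setup (all data invented; race is crudely binary-coded purely for illustration). One point worth noting: under plain ordinary least squares, the joint multivariate fit reproduces the per-outcome fits coefficient for coefficient, so the gain from joint modelling comes from estimators that exploit the correlated errors, which this sketch does not implement:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
X = np.column_stack([
    np.ones(n),                # intercept
    rng.normal(70, 10, n),     # weight
    rng.normal(50, 12, n),     # age
    rng.integers(0, 2, n),     # race (binary here for simplicity)
])
Y = X @ rng.normal(size=(4, 2)) + rng.normal(size=(n, 2))  # BP, cholesterol

joint, *_ = np.linalg.lstsq(X, Y, rcond=None)          # both outcomes at once
bp_only, *_ = np.linalg.lstsq(X, Y[:, 0], rcond=None)  # blood pressure alone
print(np.allclose(joint[:, 0], bp_only))               # True
```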
Further reading:
1- Normal Equation
2- Normal Equation for Linear Regression
Reference:
[1] Difference between Multivariate Linear Regression and Multiple Linear Regression