This lecture compares Gaussian Mixture Regression (GMR) with Support Vector Regression (SVR) in the context of machine learning. GMR predicts trends away from the data points, while SVR computes a weighted combination of local predictors. The lecture discusses the similarities and differences between the two methods, highlighting that GMR, unlike SVR, can predict multi-dimensional outputs. It also covers the hyperparameters of each technique (the number of Gaussian components for GMR; the kernel parameters, the penalty C, and the ε-insensitive tube for SVR) and concludes that there is no straightforward way to determine which regression technique best fits a given problem.
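Since the summary notes that GMR, unlike SVR, can produce multi-dimensional outputs, a minimal sketch may help make this concrete: GMR fits a Gaussian mixture over the joint input-output space and then predicts the conditional mean, a responsibility-weighted combination of per-component linear predictors, which is naturally vector-valued. This is a sketch under stated assumptions, not the lecture's implementation: it uses scikit-learn's `GaussianMixture` for the joint density, and `gmr_predict` and the toy sine/cosine data are illustrative names and data introduced here.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def gmr_predict(gmm, x, d_in):
    """Conditional mean E[y | x] under a GMM fitted on joint [x, y] data."""
    K = gmm.n_components
    h = np.empty(K)
    cond_means = np.empty((K, gmm.means_.shape[1] - d_in))
    for k in range(K):
        mu_x = gmm.means_[k, :d_in]
        mu_y = gmm.means_[k, d_in:]
        S = gmm.covariances_[k]
        Sxx, Syx = S[:d_in, :d_in], S[d_in:, :d_in]
        # responsibility of component k for the query input x
        h[k] = gmm.weights_[k] * multivariate_normal.pdf(x, mu_x, Sxx)
        # local linear predictor of component k
        cond_means[k] = mu_y + Syx @ np.linalg.solve(Sxx, x - mu_x)
    h /= h.sum()
    # weighted combination of the local predictors (vector-valued output)
    return h @ cond_means

# Toy example: 1-D input, 2-D output y = (sin x, cos x) -- a single SVR
# model could only regress one of these output dimensions at a time.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2 * np.pi, size=(500, 1))
Y = np.hstack([np.sin(X), np.cos(X)]) + 0.05 * rng.standard_normal((500, 2))
gmm = GaussianMixture(n_components=8, covariance_type="full",
                      random_state=0).fit(np.hstack([X, Y]))
pred = gmr_predict(gmm, np.array([2.0]), d_in=1)  # 2-D prediction at x = 2
```

The number of components (`n_components=8` here) is the main GMR hyperparameter mentioned in the lecture; too few components underfit the trend, too many overfit the noise.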