Regression models are often transformed into alternative forms in statistical inference theory. In this paper, we assume that a general linear model (GLM) is transformed into two different forms, and our aim is to study some comparison problems for the two resulting transformed general linear models (TGLMs). We first construct a general vector composed of all the unknown parameters in the two TGLMs, derive exact expressions for the best linear minimum bias predictors (BLMBPs) by solving a constrained quadratic matrix-valued optimization problem in the Löwner partial ordering, and describe a variety of mathematical and statistical properties and performance features of the BLMBPs. We then address several algebraic characterization problems concerning the relationships between the BLMBPs under the two TGLMs. As applications, two specific cases are presented to illustrate the main contributions of the study.
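As a brief illustration of the setting described above (the symbols $\mathbf{y}$, $\mathbf{X}$, $\boldsymbol{\beta}$, $\mathbf{T}_1$, $\mathbf{T}_2$ below follow standard GLM conventions and are assumptions for this sketch, not notation taken from the paper itself), a GLM and two transformed versions of it can be written as:

```latex
% Original general linear model (GLM):
%   y is an observable random vector, X a known design matrix,
%   beta a vector of unknown parameters, epsilon a random error vector.
\begin{align*}
  \mathscr{M}:\quad & \mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon},
    \qquad \mathrm{E}(\boldsymbol{\varepsilon}) = \mathbf{0},\quad
    \mathrm{Cov}(\boldsymbol{\varepsilon}) = \sigma^{2}\boldsymbol{\Sigma}, \\[4pt]
% Two transformed general linear models (TGLMs), obtained by
% premultiplying the GLM with known transformation matrices T_1 and T_2:
  \mathscr{M}_1:\quad & \mathbf{T}_1\mathbf{y}
    = \mathbf{T}_1\mathbf{X}\boldsymbol{\beta} + \mathbf{T}_1\boldsymbol{\varepsilon}, \\
  \mathscr{M}_2:\quad & \mathbf{T}_2\mathbf{y}
    = \mathbf{T}_2\mathbf{X}\boldsymbol{\beta} + \mathbf{T}_2\boldsymbol{\varepsilon}.
\end{align*}
```

Comparison problems of the kind studied in the paper then concern how predictors built from $\mathscr{M}_1$ relate to those built from $\mathscr{M}_2$.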