We consider the one-dimensional problem of propagation of a narrow-band signal in a homogeneous dispersive medium. Within the framework of the method of moments, simple relations are obtained which allow the midpoint and r.m.s. duration of an arbitrary signal at an arbitrary point of the path to be found by integrating smooth (non-oscillatory) functions, without any additional approximations. It is shown that if the absorption dispersion is negligible, the square of the r.m.s. duration of the signal depends parabolically on the path length, i.e., it has a single minimum (the "focus" of the signal). The possibility of reducing the duration (with the corresponding increase in power) of linear frequency-modulated (LFM) signals during propagation in a dispersive medium is considered. For a dispersive medium of finite extent, using the example of a weakly collisional plasma, estimates are obtained for the maximum possible reduction of the signal duration and increase in its power for a given path length, as well as for the initial signal parameters at which these capabilities are realized.
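
The parabolic dependence of the squared duration on path length, and the resulting single focus, can be illustrated with the textbook case of a linearly chirped (LFM) Gaussian pulse in a medium with constant group-velocity dispersion. The sketch below is an illustration under these standard assumptions, not the paper's general moment-method derivation; the numerical values of the pulse and medium parameters are hypothetical.

```python
import numpy as np

# Illustrative sketch (not from the paper): duration of a linearly
# chirped Gaussian pulse under group-velocity dispersion (GVD) alone,
# with absorption dispersion neglected. The standard result is
#   T(z)^2 = T0^2 * [ (1 + C*beta2*z/T0^2)^2 + (beta2*z/T0^2)^2 ],
# which is quadratic (parabolic) in z and has a single minimum
# ("focus") when C*beta2 < 0, i.e. when the chirp opposes the
# dispersion. For a Gaussian pulse the r.m.s. duration is
# proportional to T0, so the compression factor is the same.

T0 = 10e-12     # initial duration measure, s (assumed value)
C = -5.0        # linear chirp parameter (assumed value)
beta2 = 20e-27  # GVD coefficient, s^2/m (assumed value)

def duration(z):
    """Pulse duration T(z) at propagation distance z."""
    a = beta2 * z / T0**2
    return T0 * np.sqrt((1.0 + C * a)**2 + a**2)

# Vertex of the parabola T(z)^2: z_min = -C*T0^2 / (beta2*(1 + C^2)),
# where the duration reaches T0 / sqrt(1 + C^2). For a fixed pulse
# energy, the peak power grows by roughly the compression factor.
z_min = -C * T0**2 / (beta2 * (1.0 + C**2))
T_min = duration(z_min)
print(f"focus at z = {z_min:.3e} m, duration {T_min:.3e} s "
      f"(compression factor {T0 / T_min:.2f})")
```

With these assumed parameters the pulse compresses by a factor of sqrt(1 + C^2) ≈ 5.1 at the focus, after which the duration grows again along the same parabola, consistent with the single-minimum behavior stated above.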