In federated learning, data heterogeneity leads to significant differences among the local models learned by the clients, which degrades the performance of the global model. To address this issue, contrastive federated learning algorithms compare positive and negative samples on the clients, pulling the local models closer to the global model. However, existing methods take the global model as the positive sample and only the previous round's local model as the negative sample, leaving historical local models underutilized. In this paper, we propose SWIM: Sliding-WIndow Model contrast method, which draws on local models from multiple historical rounds. First, we design a sliding-window mechanism that collects the representations produced by each client's historical local models. We then employ the cosine distance as a discriminator to separate these representations into positive and negative samples. In addition, we introduce a dynamic coefficient that balances the federated classification and feature learning tasks; by adjusting this coefficient across training rounds, the global model focuses more on feature learning in the early stages and on classification learning in the later stages. We compare SWIM with four state-of-the-art federated learning algorithms on three datasets, and the results show that it outperforms all four in accuracy. Source code is available at https://github.com/zhanghrswpu/SWIM.
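To make the mechanism concrete, the following is a minimal PyTorch sketch of how the sliding-window contrast and the dynamic coefficient could fit together. It is an illustration under stated assumptions, not the paper's exact formulation: the function names, the cosine-distance cut-off `dist_threshold`, the InfoNCE-style form of the loss, and the linear decay schedule are all hypothetical choices made here for clarity; the authoritative implementation is in the linked repository.

```python
import torch
import torch.nn.functional as F

def swim_contrastive_loss(z_local, z_global, window_reps,
                          tau=0.5, dist_threshold=0.5):
    """Sliding-window model-contrastive loss (sketch).

    z_local:     (B, D) representations from the current local model
    z_global:    (B, D) representations from the global model
    window_reps: list of (B, D) representations from the last W rounds'
                 local models (the sliding window)
    dist_threshold is a hypothetical cut-off for the cosine-distance
    discriminator; the abstract does not specify its value.
    """
    # Similarity to the global model is always treated as a positive pair.
    sims = [F.cosine_similarity(z_local, z_global, dim=-1)]      # list of (B,)
    pos_mask = [torch.ones_like(sims[0], dtype=torch.bool)]
    for z_hist in window_reps:
        sim = F.cosine_similarity(z_local, z_hist, dim=-1)
        sims.append(sim)
        # Cosine distance (1 - cosine similarity) acts as the discriminator:
        # nearby historical representations count as positives, distant
        # ones as negatives.
        pos_mask.append((1.0 - sim) < dist_threshold)
    sims = torch.stack(sims, dim=1) / tau                        # (B, W+1)
    pos_mask = torch.stack(pos_mask, dim=1)                      # (B, W+1)
    exp_sims = sims.exp()
    numer = (exp_sims * pos_mask).sum(dim=1)   # positives only
    denom = exp_sims.sum(dim=1)                # positives and negatives
    return -(numer / denom).log().mean()

def dynamic_coefficient(round_t, total_rounds, mu_max=1.0):
    # Hypothetical schedule: weight feature (contrastive) learning heavily
    # in early rounds and decay it so classification dominates later on.
    return mu_max * (1.0 - round_t / total_rounds)

# Per-client objective combining the two tasks, per the abstract's split:
#   loss = cross_entropy_loss + dynamic_coefficient(t, T) * contrastive_loss
```

In this sketch, the window holds at most W past representations per client; once full, the oldest entry is dropped as a new round's representation is appended, so memory stays bounded while still exposing more history than a single previous round.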