Humans can adapt to changing environments and tasks and consolidate prior knowledge by continually learning and practicing new knowledge; achieving the same capability remains extremely challenging for artificial intelligence. As the field of artificial intelligence develops, research on overcoming catastrophic forgetting in continuous learning faces three challenges: focusing on the parameters that matter most for the current task, retaining knowledge from previous tasks while adapting it to new ones, and making better use of knowledge from both previous and new tasks. To address these problems, this article proposes a reasonable forgetting method called selection of parameter importance levels for reasonable forgetting in continuous task adaptation (SPIRF-CTA). SPIRF-CTA enables the model to identify and focus on the parameters most important for the current task through a normalized parameter importance selection mechanism and a loss function with parameter importance penalties, and it adjusts parameter updates by incorporating Hessian matrix information to achieve reasonable forgetting and to prevent a new task from completely overwriting the knowledge of previous tasks. Moreover, we design a model alignment loss function and a multitask loss function to exploit the knowledge of both new and previous tasks. We evaluate SPIRF-CTA on the Split CIFAR-10, Split CIFAR-100, and Split mini-ImageNet datasets; the image classification accuracies of the proposed approach improve by 3.6%, 4.4%, and 3.36%, respectively, and SPIRF-CTA exhibits excellent control of the degree of forgetting, with a forgetting rate of only 3.54%. Code is available at https://github.com/ybyangjing/CTA.
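The abstract does not give the SPIRF-CTA formulas, so the following is only a minimal sketch of the general idea it describes: a normalized per-parameter importance estimate combined with a quadratic penalty that anchors the parameters deemed important for earlier tasks (in the spirit of EWC-style regularization). The function names, the squared-gradient importance proxy, and the penalty weight `lam` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normalized_importance(grads):
    """Illustrative importance estimate: squared gradients (a diagonal-Fisher
    proxy), normalized to sum to 1 so importances are comparable in scale."""
    imp = np.square(np.asarray(grads, dtype=float))
    total = imp.sum()
    return imp / total if total > 0 else imp

def importance_penalty_loss(task_loss, params, old_params, importance, lam=1.0):
    """Total loss = new-task loss + importance-weighted quadratic penalty
    that discourages moving parameters that were important previously."""
    params = np.asarray(params, dtype=float)
    old_params = np.asarray(old_params, dtype=float)
    penalty = np.sum(importance * np.square(params - old_params))
    return task_loss + lam * penalty

# Parameters with high normalized importance are penalized more for drifting:
imp = normalized_importance([1.0, 2.0])          # -> [0.2, 0.8]
loss = importance_penalty_loss(0.5, [1.0, 0.0], [0.0, 0.0], imp)  # -> 0.7
```

The normalization step mirrors the abstract's "normalized parameter importance selection mechanism"; the Hessian-based update adjustment and the alignment/multitask losses are not sketched here, since the abstract gives no detail about their form.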