Currently our algorithms support mmclassification, mmdetection and mmsegmentation. Before running them, you may need to prepare the datasets according to the instructions in the corresponding documentation.
Note: we use `${task}` to represent one of `mmcls`, `mmdet` and `mmseg`. We recommend using `--cfg-options` (e.g., `mutable_cfg` in NAS algorithms or `channel_cfg` in pruning algorithms) to avoid the need for a separate config for each subnet or checkpoint; if you want to specify a different subnet for retraining or testing, you only need to change this argument.
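How this works under the hood: the train/test scripts in the OpenMMLab family merge the `--cfg-options` key=value pairs into the loaded config before building the model. Below is a minimal sketch using `mmcv.Config`; the config contents and the YAML path are placeholders, not real files:

```python
from mmcv import Config

# Toy stand-in for a training config; real ones are loaded with
# Config.fromfile(CONFIG_FILE). All names here are placeholders.
cfg = Config(dict(algorithm=dict(type='SPOS', mutable_cfg=None)))

# --cfg-options key=value pairs arrive as a flat dict with dotted
# keys; merge_from_dict splits the dots and overrides nested fields.
cfg.merge_from_dict({'algorithm.mutable_cfg': 'mutable_cfg.yaml'})

print(cfg.algorithm.mutable_cfg)  # -> mutable_cfg.yaml
```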
There are three steps to run neural architecture search (NAS): supernet pre-training, searching for a subnet on the trained supernet, and subnet retraining.

First, pre-train the supernet:
```bash
python tools/${task}/train_${task}.py ${CONFIG_FILE} [optional arguments]
```
The usage of the optional arguments is the same as in the corresponding task, i.e., mmclassification, mmdetection or mmsegmentation.
Second, search for a subnet on the trained supernet:

```bash
python tools/${task}/search_${task}.py ${CONFIG_FILE} ${CHECKPOINT_PATH} [optional arguments]
```
Third, retrain the searched subnet:

```bash
python tools/${task}/train_${task}.py ${CONFIG_FILE} --cfg-options algorithm.mutable_cfg=${MUTABLE_CFG_PATH} [optional arguments]
```
`MUTABLE_CFG_PATH`: path of `mutable_cfg`. `mutable_cfg` is the config for the mutables of the searched-out subnet and is used to specify different subnets for retraining. An example of `mutable_cfg` can be found here, and its usage can be found here.
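The linked example is not reproduced here, so the sketch below only illustrates the general idea: `mutable_cfg` is a small file, written out by the search step, that records which candidate operation each mutable chose. Every key and value below is a hypothetical placeholder, not the real schema:

```python
import mmcv

# Hypothetical searched choices; the real keys and fields are
# defined by the NAS algorithm and produced by the search step.
searched_subnet = {
    'stage_0.block_0': dict(chosen='shuffle_3x3'),
    'stage_0.block_1': dict(chosen='shuffle_7x7'),
}

# mmcv.dump picks the serializer from the file suffix (YAML here).
mmcv.dump(searched_subnet, 'mutable_cfg.yaml')
# Then: --cfg-options algorithm.mutable_cfg=mutable_cfg.yaml
```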
Pruning has four steps: supernet pre-training, searching for a subnet on the trained supernet, subnet retraining, and checkpoint splitting. The commands for the first two steps are the same as in NAS, except that the `CONFIG_FILE` of a pruning algorithm is used. The commands for the other two steps are as follows.

Retrain the pruned subnet:
```bash
python tools/${task}/train_${task}.py ${CONFIG_FILE} --cfg-options algorithm.channel_cfg=${CHANNEL_CFG_PATH} [optional arguments]
```
Different from NAS, the argument to specify here is `channel_cfg` instead of `mutable_cfg`.

`CHANNEL_CFG_PATH`: path of `channel_cfg`. `channel_cfg` is the config for the channels of the searched-out subnet and is used to specify different subnets for testing. An example of `channel_cfg` can be found here, and its usage can be found here.
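Analogous to `mutable_cfg`, here is a hedged sketch of what `channel_cfg` records: how many channels each prunable layer keeps after the search. Again, the layer names and fields are hypothetical placeholders, not the real schema:

```python
import mmcv

# Hypothetical per-layer channel choices; the real schema depends
# on the pruning algorithm and is written out by the search step.
pruned_channels = {
    'backbone.layer1.conv1': dict(origin_channels=64, choice=32),
    'backbone.layer1.conv2': dict(origin_channels=64, choice=48),
}

mmcv.dump(pruned_channels, 'channel_cfg.yaml')
# Then: --cfg-options algorithm.channel_cfg=channel_cfg.yaml
```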
There is only one step to start knowledge distillation:
```bash
python tools/${task}/train_${task}.py ${CONFIG_FILE} --cfg-options algorithm.distiller.teacher.init_cfg.type=Pretrained algorithm.distiller.teacher.init_cfg.checkpoint=${TEACHER_CHECKPOINT_PATH} [optional arguments]
```
`TEACHER_CHECKPOINT_PATH`: path of `teacher_checkpoint`. `teacher_checkpoint` is the checkpoint of the teacher model and is used to specify different checkpoints for distillation.
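The two overrides above fill in the teacher's `init_cfg`, so that MMCV's standard `Pretrained` initializer loads the given checkpoint when the teacher is built. A minimal sketch of the merge; the model type and checkpoint path are placeholders:

```python
from mmcv import Config

# Toy stand-in for a distillation config; real ones come from file.
cfg = Config(dict(algorithm=dict(distiller=dict(
    teacher=dict(type='PlaceholderTeacher')))))

# Equivalent of the two --cfg-options pairs on the command line.
cfg.merge_from_dict({
    'algorithm.distiller.teacher.init_cfg.type': 'Pretrained',
    'algorithm.distiller.teacher.init_cfg.checkpoint': 'teacher.pth',
})

print(cfg.algorithm.distiller.teacher.init_cfg)
# -> {'type': 'Pretrained', 'checkpoint': 'teacher.pth'}
```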