deepfacelab中文网

Thread starter: acamrita

Newbie question: why does the iteration count reset to 0 after pretraining?

Posted on 2021-9-20 09:05:15
Hi,

Whenever you pretrain a model, always keep a backup of it. When you switch a pretrained model to normal training, the iteration counter restarts from 0 (the model still converges faster thanks to the pretraining), and if you pretrain that same model again later, it will also restart from 0. With a backup of the pretrained model, you can always go back and continue pretraining from where you left off.
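A minimal sketch of the backup step described above, as a shell snippet. The paths are assumptions (DeepFaceLab keeps model files under a `workspace/model` folder in a default install; adjust `SRC`/`DST` to your setup):

```shell
#!/bin/sh
# Back up the pretrained model files before starting normal training,
# so pretraining can be resumed later from this copy.
# NOTE: SRC/DST paths are assumptions -- adjust to your DeepFaceLab install.
SRC="workspace/model"
DST="workspace/model_pretrain_backup"

mkdir -p "$DST"          # create the backup folder if it does not exist
cp -r "$SRC"/. "$DST"/   # copy every model file into the backup
```

To continue pretraining later, copy the backup back over `workspace/model` (or point the trainer at the backup folder) instead of reusing the model that has already gone through normal training.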
