SAE: added option 'Pretrain the model?'

Pretrain the model with a large set of varied faces. This technique can help train the fake when the src/dst data have very different face shapes and lighting conditions; the resulting face will look somewhat morphed. To reduce the morph effect, some model files are initialized during pretraining but are not kept afterwards: LIAE: inter_AB.h5; DF: both decoder .h5 files. The longer you pretrain the model, the more morphed the face will look. After pretraining, save and run the training again.
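The message above describes which weight files are re-initialized rather than carried over once pretraining ends. A minimal sketch of that selection logic (hypothetical, not DeepFaceLab's actual code; the DF decoder file names are assumed):

```python
# Weight files that are reset (not loaded) after pretraining, per
# architecture, as described in the commit message. File names for
# the DF decoders are assumptions for illustration.
RESET_AFTER_PRETRAIN = {
    'LIAE': ['inter_AB.h5'],
    'DF':   ['decoder_src.h5', 'decoder_dst.h5'],  # both decoders
}

def files_to_load(architecture, all_weight_files, pretraining_finished):
    """Return the weight files to load from disk.

    While pretraining, everything is loaded; once pretraining is
    finished, files listed for the architecture are skipped so they
    start from fresh initialization, reducing the morph effect.
    """
    if not pretraining_finished:
        return list(all_weight_files)
    skip = set(RESET_AFTER_PRETRAIN.get(architecture, []))
    return [f for f in all_weight_files if f not in skip]

files = ['encoder.h5', 'inter_AB.h5', 'inter_B.h5']
print(files_to_load('LIAE', files, pretraining_finished=True))
# → ['encoder.h5', 'inter_B.h5']
```

The mapping is keyed by architecture so the same hook covers both LIAE and DF models.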
iperov 2019-05-01 19:55:27 +04:00
parent 659aa5705a
commit 2a8dd788dc
8 changed files with 78 additions and 44 deletions


@@ -19,6 +19,7 @@ def trainerThread (s2c, c2s, args, device_args):
     training_data_src_path = Path( args.get('training_data_src_dir', '') )
     training_data_dst_path = Path( args.get('training_data_dst_dir', '') )
+    pretraining_data_path = Path( args.get('pretraining_data_dir', '') )
     model_path = Path( args.get('model_path', '') )
     model_name = args.get('model_name', '')
     save_interval_min = 15
@@ -40,6 +41,7 @@ def trainerThread (s2c, c2s, args, device_args):
         model_path,
         training_data_src_path=training_data_src_path,
         training_data_dst_path=training_data_dst_path,
+        pretraining_data_path=pretraining_data_path,
         debug=debug,
         device_args=device_args)
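The two hunks show the new `pretraining_data_dir` key being read from the `args` dict and forwarded to the model constructor. A small standalone illustration of the `Path( args.get(..., '') )` pattern used above (the directory values are made-up examples):

```python
from pathlib import Path

# Hypothetical args dict as trainerThread would receive it; the key
# names match the diff, the directory values are invented examples.
args = {
    'training_data_src_dir': 'workspace/data_src/aligned',
    'training_data_dst_dir': 'workspace/data_dst/aligned',
    'pretraining_data_dir':  'workspace/pretrain_faces',
}

# A missing key falls back to '', which Path() normalizes to '.',
# so the lookup never raises even when no directory was supplied.
pretraining_data_path = Path(args.get('pretraining_data_dir', ''))
model_path = Path(args.get('model_path', ''))

print(pretraining_data_path)  # workspace/pretrain_faces
print(model_path)             # .
```

This is why the new option can be threaded through without extra validation: absent arguments degrade to the current directory instead of crashing the trainer thread at startup.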