path: root/models/model.py
Age         Commit message                                              Author
2021-03-25  Merge branch 'master' into data_parallel  (data_parallel)  Jordan Gong
    # Conflicts:
    #   models/model.py
2021-03-25  Bug fixes and refactoring  Jordan Gong
    1. Correct trained model signature
    2. Move `val_size` to system config
2021-03-23  Fix indexing bugs in validation dataset selector  Jordan Gong
2021-03-22  Add embedding visualization and validate on the test set  Jordan Gong
2021-03-16  Set *_iter to *_iters in defaults  Jordan Gong
2021-03-15  Remove redundant wrapper given by dataloader  Jordan Gong
2021-03-15  Fix redundant gallery_dataset_meta assignment  Jordan Gong
2021-03-15  Support transforms on training datasets  Jordan Gong
2021-03-12  Merge branch 'master' into data_parallel  Jordan Gong
2021-03-12  Fix a typo when recording non-zero counts  Jordan Gong
2021-03-12  Merge branch 'master' into data_parallel  Jordan Gong
    # Conflicts:
    #   models/auto_encoder.py
    #   models/model.py
    #   models/rgb_part_net.py
2021-03-12  Make evaluate method static  Jordan Gong
2021-03-12  Code refactoring  Jordan Gong
    1. Separate FCs and triplet losses for HPM and PartNet
    2. Remove FC-equivalent 1x1 conv layers in HPM
    3. Support adjustable learning rate schedulers (see the sketch below)
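For item 3, a minimal sketch of what config-driven scheduler selection could look like; the registry and config keys here are illustrative, not the repo's actual names:

    from torch import nn, optim
    from torch.optim import lr_scheduler

    # Hypothetical registry mapping config names to scheduler factories.
    SCHEDULERS = {
        'step': lambda opt, cfg: lr_scheduler.StepLR(
            opt, step_size=cfg['step_size'], gamma=cfg['gamma']),
        'exp': lambda opt, cfg: lr_scheduler.ExponentialLR(
            opt, gamma=cfg['gamma']),
    }

    optimizer = optim.Adam(nn.Linear(8, 8).parameters())
    scheduler = SCHEDULERS['step'](optimizer, {'step_size': 500, 'gamma': 0.9})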
2021-03-10  Merge branch 'master' into data_parallel  Jordan Gong
2021-03-10  Bug fixes  Jordan Gong
    1. Resolve reference problems when parsing dataset selectors
    2. Transform gallery using different models
2021-03-05  Calculate losses outside modules  Jordan Gong
2021-03-04  Merge branch 'master' into data_parallel  Jordan Gong
2021-03-04  Replace detach with no_grad in evaluation  Jordan Gong
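Wrapping evaluation in torch.no_grad() skips building the autograd graph altogether, instead of detaching tensors one by one after the fact. A minimal sketch with a stand-in model:

    import torch
    from torch import nn

    model = nn.Linear(8, 2)   # stand-in for the real network

    @torch.no_grad()          # nothing inside records an autograd graph,
    def transform(batch):     # so no per-tensor .detach() is required
        model.eval()
        return model(batch)

    features = transform(torch.randn(4, 8))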
2021-03-04  Set seed for reproducibility  Jordan Gong
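A typical PyTorch seeding routine covers Python, NumPy, and every CUDA device (relevant under DataParallel); the helper name is illustrative:

    import random
    import numpy as np
    import torch

    def set_seed(seed):
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)            # CPU and current GPU
        torch.cuda.manual_seed_all(seed)   # every GPU under DataParallel
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False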
2021-03-02  Merge branch 'master' into data_parallel  Jordan Gong
2021-03-02  Fix DataParallel-specific bugs  Jordan Gong
2021-03-02  Record learning rate every step  Jordan Gong
2021-03-02  Fix bugs in new scheduler  Jordan Gong
2021-03-01  Bug fixes  Jordan Gong
2021-03-01  Merge branch 'master' into data_parallel  Jordan Gong
    # Conflicts:
    #   models/model.py
2021-03-01  New scheduler and new config  Jordan Gong
2021-03-01  Move `pairs` variable to local scope  Jordan Gong
2021-02-28  Implement the sum-of-losses default from [1]  Jordan Gong
    [1] A. Hermans, L. Beyer, and B. Leibe, “In defense of the triplet loss for person re-identification,” arXiv preprint arXiv:1703.07737, 2017.
2021-02-28  Log n-ile embedding distance and norm  Jordan Gong
2021-02-27  Implement Batch Hard triplet loss and soft margin  Jordan Gong
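A compact sketch of the batch-hard strategy from [1], with the soft-margin softplus variant and the sum reduction that [1] uses by default; this is illustrative rather than the repo's exact implementation:

    import torch
    import torch.nn.functional as F

    def batch_hard_triplet_loss(emb, labels, soft_margin=True, margin=0.3):
        dist = torch.cdist(emb, emb)                       # pairwise distances
        same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-label mask
        hardest_pos = (dist * same.float()).max(dim=1).values
        hardest_neg = dist.masked_fill(same, float('inf')).min(dim=1).values
        if soft_margin:
            losses = F.softplus(hardest_pos - hardest_neg)  # ln(1 + e^x)
        else:
            losses = F.relu(hardest_pos - hardest_neg + margin)
        return losses.sum()   # sum over the batch, the default in [1]

    loss = batch_hard_triplet_loss(torch.randn(16, 128), torch.randint(4, (16,)))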
2021-02-26  Merge branch 'master' into data_parallel  Jordan Gong
    # Conflicts:
    #   models/model.py
2021-02-26  Fix predict function  Jordan Gong
2021-02-20  Separate triplet loss from model  Jordan Gong
2021-02-20  Merge branch 'master' into data_parallel  Jordan Gong
    # Conflicts:
    #   models/model.py
2021-02-20  Separate triplet loss from model  Jordan Gong
2021-02-19  Correct cross-reconstruction loss calculated in DataParallel  Jordan Gong
2021-02-19  Merge branch 'master' into data_parallel  Jordan Gong
2021-02-19  Allow evaluating an unfinished model  Jordan Gong
2021-02-18  Merge branch 'master' into data_parallel  Jordan Gong
2021-02-18  Implement adjustable input size and change some default configs  Jordan Gong
2021-02-18  Decode mean appearance feature  Jordan Gong
2021-02-16  Merge branch 'master' into data_parallel  Jordan Gong
2021-02-16  Split transform and evaluate methods  Jordan Gong
2021-02-15  Add DataParallel support on the new codebase  Jordan Gong
2021-02-15  Revert "Memory usage improvement"  Jordan Gong
    This reverts commit be508061.
2021-02-14  Memory usage improvement  Jordan Gong
    This update splits the input data into two batches, reducing memory usage by ~30%.
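The idea, sketched roughly (and reverted above): run the two halves through the network one after the other, so only half of the intermediate activations are alive at once. The helper is hypothetical:

    import torch
    from torch import nn

    model = nn.Linear(8, 2)   # stand-in for the real network

    def forward_in_halves(net, x):
        # Only one half's activations need to be materialized at a time.
        x1, x2 = x.chunk(2)
        return torch.cat([net(x1), net(x2)])

    out = forward_in_halves(model, torch.randn(16, 8))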
2021-02-14  Prepare for DataParallel  Jordan Gong
2021-02-10  Save scheduler state_dict  Jordan Gong
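Persisting the scheduler next to the model and optimizer lets a resumed run pick up the learning-rate schedule where it left off. A minimal sketch with stand-in modules and illustrative checkpoint keys:

    import torch
    from torch import nn, optim

    model = nn.Linear(4, 2)   # stand-in model
    optimizer = optim.Adam(model.parameters())
    scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=500, gamma=0.9)

    torch.save({
        'model': model.state_dict(),
        'optim': optimizer.state_dict(),
        'sched': scheduler.state_dict(),   # now persisted as well
    }, 'checkpoint.pt')

    state = torch.load('checkpoint.pt')    # on resume
    scheduler.load_state_dict(state['sched'])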
2021-02-09  Improve performance when disentangling  Jordan Gong
    This is a HUGE performance optimization, up to 2x faster than before, mainly due to replacing a randomized for-loop with a randomized index tensor.
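The general pattern behind that speed-up, under assumed tensor shapes (the real disentanglement code may differ): draw every random index in a single tensor and gather with advanced indexing, instead of indexing clip by clip in Python:

    import torch

    batch = torch.randn(8, 30, 3, 64, 32)   # (clips, frames, C, H, W), illustrative
    n, t = batch.shape[:2]

    # Loop version (slow): one randint and one indexing op per clip.
    picked_loop = torch.stack([clip[torch.randint(t, ())] for clip in batch])

    # Vectorized version: a single draw plus one gather.
    idx = torch.randint(t, (n,), device=batch.device)
    picked = batch[torch.arange(n), idx]     # (n, C, H, W)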
2021-02-09  Some optimizations  Jordan Gong
    1. Scheduler decays the learning rate of the auto-encoder only (see the sketch below)
    2. Write learning rate history to TensorBoard
    3. Reduce image log frequency
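For item 1: per-parameter-group learning rates make this straightforward in PyTorch, since LambdaLR accepts one lambda per group. A minimal sketch with stand-in sub-modules:

    from torch import nn, optim
    from torch.optim.lr_scheduler import LambdaLR

    auto_encoder = nn.Linear(8, 8)   # stand-ins for the real sub-modules
    part_net = nn.Linear(8, 8)

    optimizer = optim.Adam([
        {'params': auto_encoder.parameters()},  # group 0: decayed
        {'params': part_net.parameters()},      # group 1: kept constant
    ], lr=1e-4)

    scheduler = LambdaLR(optimizer, lr_lambda=[
        lambda step: 0.99 ** step,   # decay the auto-encoder lr only
        lambda step: 1.0,            # leave the other group untouched
    ])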