path: root/models/model.py
Commit log (date, commit message, author):
2021-04-04  Merge branch 'disentangling_only' into disentangling_only_py3.8  (Jordan Gong)
    Conflicts: models/model.py, utils/configuration.py, utils/sampler.py
2021-04-04  Remove triplet sampler  (Jordan Gong)
2021-04-03  Merge branch 'disentangling_only' into disentangling_only_py3.8  (Jordan Gong)
    Conflicts: models/model.py
2021-04-03  Merge branch 'master' into disentangling_only  (Jordan Gong)
    Conflicts: config.py, models/hpm.py, models/layers.py, models/model.py, models/part_net.py, models/rgb_part_net.py, test/part_net.py, utils/configuration.py, utils/triplet_loss.py
2021-04-03  Revert "Normalize triplet losses"  (Jordan Gong)
    This reverts commit 99d1b18a.
2021-03-27  Normalize triplet losses  (Jordan Gong)
2021-03-25  Bug fixes and refactoring  (Jordan Gong)
    1. Correct trained model signature
    2. Move `val_size` to system config
2021-03-23  Fix indexing bugs in validation dataset selector  (Jordan Gong)
2021-03-22  Add embedding visualization and validate on testing set  (Jordan Gong)
2021-03-16  Set *_iter as *_iters by default  (Jordan Gong)
2021-03-15  Remove redundant wrapper given by dataloader  (Jordan Gong)
2021-03-15  Fix redundant gallery_dataset_meta assignment  (Jordan Gong)
2021-03-15  Support transforms on training datasets  (Jordan Gong)
2021-03-12  Fix a typo when recording non-zero counts  (Jordan Gong)
2021-03-12  Make evaluate method static  (Jordan Gong)
2021-03-12  Code refactoring  (Jordan Gong)
    1. Separate FCs and triplet losses for HPM and PartNet
    2. Remove FC-equivalent 1x1 conv layers in HPM
    3. Support adjustable learning rate schedulers
2021-03-10  Bug fixes  (Jordan Gong)
    1. Resolve reference problems when parsing dataset selectors
    2. Transform gallery using different models
2021-03-04  Replace detach with no_grad in evaluation  (Jordan Gong)
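The change above swaps tensor detaching for a gradient-free context during evaluation. A minimal sketch of the pattern, with placeholder model and batch names rather than the actual ones in models/model.py:

    import torch

    def evaluate(model, batch):
        # torch.no_grad() stops autograd from recording the forward pass,
        # which saves memory compared to detaching the outputs afterwards.
        model.eval()
        with torch.no_grad():
            return model(batch)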
2021-03-04  Set seed for reproducibility  (Jordan Gong)
2021-03-02  Record learning rate every step  (Jordan Gong)
2021-03-02  Fix bugs in new scheduler  (Jordan Gong)
2021-03-01  New scheduler and new config  (Jordan Gong)
2021-03-01  Move the pairs variable to local scope  (Jordan Gong)
2021-02-28  Implement the sum-of-losses default from [1]  (Jordan Gong)
    [1] A. Hermans, L. Beyer, and B. Leibe, “In defense of the triplet loss for person re-identification,” arXiv preprint arXiv:1703.07737, 2017.
2021-02-28  Log n-ile embedding distance and norm  (Jordan Gong)
2021-02-27  Implement Batch Hard triplet loss and soft margin  (Jordan Gong)
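As a hedged illustration of the batch-hard mining and soft margin described in [1] (the actual implementation in utils/triplet_loss.py may differ in signature and details):

    import torch
    import torch.nn.functional as F

    def batch_hard_triplet_loss(embeddings, labels):
        # Pairwise Euclidean distances within the batch.
        dist = torch.cdist(embeddings, embeddings)
        same = labels.unsqueeze(0) == labels.unsqueeze(1)
        # Hardest positive: the farthest sample sharing the anchor's label.
        hardest_pos = (dist * same.float()).max(dim=1).values
        # Hardest negative: the closest sample with a different label.
        hardest_neg = dist.masked_fill(same, float('inf')).min(dim=1).values
        # Soft margin: softplus(d_ap - d_an) instead of a hinge with a fixed margin.
        return F.softplus(hardest_pos - hardest_neg).mean()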
2021-02-26  Fix predict function  (Jordan Gong)
2021-02-20  Separate triplet loss from model  (Jordan Gong)
2021-02-19  Merge branch 'python3.8' into disentangling_only_py3.8  (Jordan Gong)
    Conflicts: models/hpm.py, models/layers.py, models/model.py, models/part_net.py, models/rgb_part_net.py, utils/configuration.py
2021-02-19  New branch with auto-encoder only  (Jordan Gong)
2021-02-19  Merge branch 'master' into python3.8  (Jordan Gong)
2021-02-19  Allow evaluating an unfinished model  (Jordan Gong)
2021-02-18  Merge branch 'master' into python3.8  (Jordan Gong)
2021-02-18  Implement adjustable input size and change some default configs  (Jordan Gong)
2021-02-18  Decode mean appearance feature  (Jordan Gong)
2021-02-16  Merge branch 'master' into python3.8  (Jordan Gong)
    Conflicts: models/model.py
2021-02-16  Split transform and evaluate methods  (Jordan Gong)
2021-02-15  Revert "Memory usage improvement"  (Jordan Gong)
    This reverts commit be508061.
2021-02-15  Revert "Memory usage improvement"  (Jordan Gong)
    This reverts commit be508061.
2021-02-14  Merge branch 'master' into python3.8  (Jordan Gong)
2021-02-14  Memory usage improvement  (Jordan Gong)
    This update splits the input data into two batches, reducing memory usage by ~30%.
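The two-batch idea above (later reverted) can be sketched roughly as follows, with illustrative names rather than the real ones:

    import torch

    def forward_in_halves(model, x):
        # Run the model on two half-batches and concatenate the results,
        # trading a little throughput for a lower peak memory footprint.
        first, second = x.chunk(2, dim=0)
        return torch.cat([model(first), model(second)], dim=0)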
2021-02-14  Prepare for DataParallel  (Jordan Gong)
2021-02-10  Merge branch 'master' into python3.8  (Jordan Gong)
2021-02-10  Save scheduler state_dict  (Jordan Gong)
2021-02-09  Merge branch 'master' into python3.8  (Jordan Gong)
    Conflicts: models/rgb_part_net.py
2021-02-09  Improve performance when disentangling  (Jordan Gong)
    This is a major performance optimization, up to 2x faster than before, mainly from replacing a randomized for-loop with a randomized tensor.
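A small illustration of that kind of vectorization (the actual tensors and shapes in the disentangler differ): all random indices are drawn as one tensor and applied with advanced indexing instead of looping over the batch.

    import torch

    x = torch.randn(32, 10, 64)  # e.g. (batch, candidates, features)

    # Loop version: pick one random candidate per sample, one at a time.
    picked_loop = torch.stack(
        [x[i, torch.randint(10, (1,)).item()] for i in range(32)])

    # Vectorized version: draw every random index at once and index in one shot.
    idx = torch.randint(10, (32,))
    picked_vec = x[torch.arange(32), idx]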
2021-02-09  Some optimizations  (Jordan Gong)
    1. Scheduler will decay the learning rate of the auto-encoder only (see the sketch below)
    2. Write learning rate history to TensorBoard
    3. Reduce image log frequency
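One way to get auto-encoder-only decay, sketched with placeholder modules rather than the real ones in this repository: give the auto-encoder its own optimizer param group and point the scheduler's decay function only at that group.

    import torch
    from torch import nn

    # Placeholder modules standing in for the auto-encoder and the rest of the model.
    auto_encoder = nn.Linear(8, 8)
    other_parts = nn.Linear(8, 8)

    optimizer = torch.optim.Adam([
        {'params': auto_encoder.parameters()},  # group 0: decayed by the scheduler
        {'params': other_parts.parameters()},   # group 1: kept at its base LR
    ], lr=1e-4)

    # LambdaLR takes one multiplier function per param group; only group 0 decays.
    scheduler = torch.optim.lr_scheduler.LambdaLR(
        optimizer, lr_lambda=[lambda step: 0.99 ** step, lambda step: 1.0])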
2021-02-08  Merge branch 'master' into python3.8  (Jordan Gong)
    Conflicts: models/hpm.py, models/layers.py, models/model.py, models/rgb_part_net.py, utils/configuration.py
2021-02-08  Code refactoring, modifications and new features  (Jordan Gong)
    1. Decode features outside of auto-encoder
    2. Turn off HPM 1x1 conv by default
    3. Change canonical feature map size from `feature_channels * 8 x 4 x 2` to `feature_channels * 2 x 16 x 8`
    4. Use mean of canonical embeddings instead of mean of static features
    5. Calculate static and dynamic loss separately
    6. Calculate mean of parts in triplet loss instead of sum of parts
    7. Add switch to log disentangled images
    8. Change default configuration
2021-01-23  Remove the third term in canonical consistency loss  (Jordan Gong)