2021-01-07  Train different models in different conditions  (Jordan Gong)
2021-01-07  Add typical training script and some bug fixes  (Jordan Gong)
    1. Resolve the deprecated scheduler stepping issue
    2. Put the losses on the same scale (replace mean with sum in the separate triplet loss; scale the pose similarity loss up 10x)
    3. Apply ReLU when computing the distance in the triplet loss
    4. Remove all classes except Model from the `models` package init
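    A minimal sketch of the stepping-order fix and the rescaled triplet term, assuming a standard PyTorch loop (the model, optimizer and margin below are illustrative, not the project's actual ones):

```python
import torch
import torch.nn.functional as F

# Toy stand-ins; the real model, optimizer and scheduler live elsewhere in the project.
model = torch.nn.Linear(8, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=500, gamma=0.9)

def triplet_term(dist_pos, dist_neg, margin=0.2):
    # ReLU clamps negative margins; summing (instead of averaging) keeps the
    # triplet loss on a scale comparable to the other losses.
    return F.relu(dist_pos - dist_neg + margin).sum()

for x, d_pos, d_neg in [(torch.randn(4, 8), torch.rand(4), torch.rand(4))]:
    loss = model(x).norm() + triplet_term(d_pos, d_neg)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Since PyTorch 1.1, scheduler.step() must come after optimizer.step(),
    # otherwise a deprecation warning is raised and the first LR value is skipped.
    scheduler.step()
```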
2021-01-07  Change device config and enable multi-GPU computing  (Jordan Gong)
    1. Add a `disable_acc` switch for disabling the accelerator; when it is off, the system automatically chooses an accelerator.
    2. Enable multi-GPU training via torch.nn.DataParallel
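    A rough sketch of how such a switch might drive device selection and DataParallel wrapping (the surrounding config handling is assumed, not taken from the project):

```python
import torch

disable_acc = False  # the switch described above

# When the accelerator is not disabled, pick CUDA automatically if available.
if disable_acc:
    device = torch.device('cpu')
else:
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = torch.nn.Linear(8, 2)  # stand-in for the real model
if device.type == 'cuda' and torch.cuda.device_count() > 1:
    model = torch.nn.DataParallel(model)  # replicate across all visible GPUs
model = model.to(device)
```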
2021-01-06  Add CUDA support  (Jordan Gong)
2021-01-06  Add TensorBoard support  (Jordan Gong)
2021-01-05  Implement checkpoint mechanism  (Jordan Gong)
2021-01-05  Implement Batch All Triplet Loss  (Jordan Gong)
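    For reference, a self-contained sketch of the batch-all triplet formulation (this is the generic algorithm, not necessarily the project's exact implementation):

```python
import torch
import torch.nn.functional as F

def batch_all_triplet_loss(embeddings, labels, margin=0.2):
    # embeddings: (n, d), labels: (n,). Sums the hinge over every valid
    # (anchor, positive, negative) triplet in the batch.
    dist = torch.cdist(embeddings, embeddings)             # (n, n) pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)      # (n, n) same-class mask
    pos_mask = same & ~torch.eye(len(labels), dtype=torch.bool)
    neg_mask = ~same
    # loss[a, p, n] = d(a, p) - d(a, n) + margin
    loss = dist.unsqueeze(2) - dist.unsqueeze(1) + margin  # (n, n, n)
    valid = pos_mask.unsqueeze(2) & neg_mask.unsqueeze(1)
    return F.relu(loss[valid]).sum()

emb, lab = torch.randn(8, 16), torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
print(batch_all_triplet_loss(emb, lab))
```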
2021-01-05  Change and improve weight initialization  (Jordan Gong)
    1. Change initial weights for Conv layers
    2. Find a way to init the last fc in init_weights
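    A minimal sketch of what this might look like, assuming Xavier init for the convolutions and a raw fc matrix handled outside of apply() (layer names are illustrative):

```python
import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3)
        self.fc = nn.Parameter(torch.empty(16, 8))  # the "last fc" kept as a raw matrix

    def init_weights(self):
        # apply() recurses over every submodule, so only touch the layer types
        # we mean to re-initialise; the fc matrix is then handled explicitly.
        def _init(m):
            if isinstance(m, nn.Conv2d):
                nn.init.xavier_uniform_(m.weight)
                if m.bias is not None:
                    nn.init.zeros_(m.bias)
        self.apply(_init)
        nn.init.xavier_uniform_(self.fc)

Toy().init_weights()
```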
2021-01-03  Separate last fc matrix from weight init function  (Jordan Gong)
    A recursive apply() would override the other parameters too
2021-01-03  Delete dead training judge  (Jordan Gong)
2021-01-03  Implement weight initialization  (Jordan Gong)
2021-01-03  Update hyperparameter configuration, implement prototype fit function  (Jordan Gong)
2021-01-03  Add separate fully connected layers  (Jordan Gong)
2021-01-03  Unit testing on auto-encoder, HPM and PartNet  (Jordan Gong)
2021-01-03  Bump up version for pillow  (Jordan Gong)
2021-01-02  Separate training and evaluating  (Jordan Gong)
2021-01-02  Correct feature dims after disentanglement and HPM backbone removal  (Jordan Gong)
    1. The features used in HPM are the decoded canonical embedding, without transposed convolution
    2. Decode the pose embedding to an image for PartNet
    3. The backbone seems to be redundant; we can use the feature map given by the auto-encoder
2021-01-02  Change type of pose similarity loss to tensor  (Jordan Gong)
2020-12-31  Implement some parts of RGB-GaitPart wrapper  (Jordan Gong)
    1. The triplet loss function and weight init function haven't been implemented yet
    2. Tuplize features returned by the auto-encoder for later unpacking
    3. Correct a comment error in the auto-encoder
    4. Swap the batch_size dim and time dim in HPM and PartNet to avoid redundant transposes
    5. Find backbone problems in HPM and disable it temporarily
    6. Make the feature structure produced by HPM consistent with that of PartNet
    7. Fix an average pooling dimension issue and an incorrect view change in HP
2020-12-31  Make HPM capable of processing frames in all batches  (Jordan Gong)
2020-12-31  Make superclass constructor invocation consistent  (Jordan Gong)
2020-12-31  Bug fixes in HPM and PartNet  (Jordan Gong)
    1. Register the list of torch.nn.Module to the network using torch.nn.ModuleList
    2. Fix an operation error when squeezing a list of tensors
    3. Replace squeeze with view in HP in case the batch size is 1
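    A brief sketch of both points with a made-up module (not the project's actual HP implementation):

```python
import torch
import torch.nn as nn

class HorizontalPooling(nn.Module):
    def __init__(self, channels=128, parts=4):
        super().__init__()
        # nn.ModuleList (not a plain Python list) so the per-part layers are
        # registered and show up in .parameters() / .to(device).
        self.part_fcs = nn.ModuleList(nn.Linear(channels, channels) for _ in range(parts))

    def forward(self, x):
        # x: (batch, channels, parts, width)
        n, c, p, w = x.size()
        pooled = x.max(dim=3).values                  # (n, c, p)
        # squeeze() would also drop the batch dim when batch size is 1;
        # an explicit view keeps the shape predictable.
        feats = [fc(pooled[:, :, i].view(n, c)) for i, fc in enumerate(self.part_fcs)]
        return torch.stack(feats, dim=2)              # (n, c, p)

print(HorizontalPooling()(torch.randn(1, 128, 4, 11)).shape)  # torch.Size([1, 128, 4])
```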
2020-12-30  Correct and refine PartNet  (Jordan Gong)
    1. Let the FocalConv block be capable of processing frames from all batches
    2. Correct the input dims of TFA and the output dims of HP
    3. Change torch.unsqueeze and torch.cat to torch.stack
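    The stack change in point 3 amounts to the following equivalence (frame shapes here are arbitrary):

```python
import torch

frames = [torch.randn(3, 64, 32) for _ in range(30)]  # per-frame feature maps

# Before: add a new dim to each frame, then concatenate along it
stacked_old = torch.cat([f.unsqueeze(0) for f in frames], dim=0)

# After: torch.stack does the same thing in one call
stacked_new = torch.stack(frames, dim=0)              # (30, 3, 64, 32)
assert torch.equal(stacked_old, stacked_new)
```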
2020-12-30  Combine FPFE and TFA into PartNet  (Jordan Gong)
2020-12-30  Combine FPFE and TFA into GaitPart  (Jordan Gong)
2020-12-30  Add pooling options in HPM  (Jordan Gong)
    According to [1], we can use GAP and GMP together, or either one alone in an ablation study.
    [1] Y. Fu et al., “Horizontal pyramid matching for person re-identification,” in Proceedings of the AAAI Conference on Artificial Intelligence, 2019, vol. 33, pp. 8295–8302.
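    A sketch of per-strip pooling with both operators, roughly following the HPM idea (the function and argument names are mine, not the project's):

```python
import torch

def strip_pool(feature_map, scale, use_avg=True, use_max=True):
    # feature_map: (n, c, h, w); split into `scale` horizontal strips and pool
    # each strip with GAP and/or GMP, returning (n, c, scale).
    n, c, h, w = feature_map.size()
    strips = feature_map.view(n, c, scale, (h // scale) * w)
    pooled = torch.zeros(n, c, scale)
    if use_avg:
        pooled = pooled + strips.mean(dim=3)        # global average pooling per strip
    if use_max:
        pooled = pooled + strips.max(dim=3).values  # global max pooling per strip
    return pooled

print(strip_pool(torch.randn(2, 128, 16, 11), scale=4).shape)  # torch.Size([2, 128, 4])
```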
2020-12-29  Return canonical features at condition 1 for later aggregation  (Jordan Gong)
2020-12-29  Correct batch splitter  (Jordan Gong)
    We can disentangle features from different subjects, but cannot do it across different temporal orders
2020-12-29  Add type hint for new label (numpy.int64)  (Jordan Gong)
2020-12-29  Encode class names to labels and make some access improvements  (Jordan Gong)
    1. Encode class names using LabelEncoder from sklearn
    2. Remove unneeded class variables
    3. Protect some variables from being accessed in userspace
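    For reference, the sklearn encoder mentioned in point 1 works like this (the subject IDs below are made up):

```python
from sklearn.preprocessing import LabelEncoder

encoder = LabelEncoder()
# CASIA-B style subject names become integer labels of dtype numpy.int64,
# matching the new type hint added for labels.
labels = encoder.fit_transform(['001', '001', '002', '003'])
print(labels)                          # [0 0 1 2]
print(encoder.inverse_transform([2]))  # ['003']
```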
2020-12-28  Wrap the auto-encoder, return 3 losses at t2  (Jordan Gong)
2020-12-27  Try some unit tests on CASIA-B dataset  (Jordan Gong)
2020-12-27  Change default dataset directory  (Jordan Gong)
2020-12-27  Implement some parts of main model structure  (Jordan Gong)
    1. Configuration parsers
    2. Model signature generator
2020-12-27  Fix inconsistency and API deprecation issues in decoder  (Jordan Gong)
    1. Add default output channels of decoder
    2. Replace deprecated torch.nn.functional.sigmoid with torch.sigmoid
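    The replacement in point 2 is a drop-in change:

```python
import torch

x = torch.randn(4)
out = torch.sigmoid(x)  # torch.nn.functional.sigmoid(x) is deprecated and emits a warning
```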
2020-12-27  Refine auto-encoder  (Jordan Gong)
    1. Wrap fully connected layers
    2. Introduce hyperparameter tuning in constructor
2020-12-27  Prepare for FVG dataset  (Jordan Gong)
2020-12-27  Make naming scheme consistent  (Jordan Gong)
    Use `dir` instead of `path`
2020-12-27  Add dataset selector to config type hint, fix ClipLabels typo (ClipViews)  (Jordan Gong)
2020-12-27  Adopt type hinting generics in standard collections (PEP 585)  (Jordan Gong)
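    PEP 585 lets the built-in collection types be used as generics directly (the function below is a made-up example, not project code):

```python
# Python 3.9+: no typing.List / typing.Dict imports needed
def group_clips(clips: list[str]) -> dict[str, list[str]]:
    groups: dict[str, list[str]] = {}
    for clip in clips:
        groups.setdefault(clip.split('-')[0], []).append(clip)
    return groups

print(group_clips(['001-nm-01', '001-bg-01', '002-nm-02']))
```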
2020-12-26  Implement batch splitter to split sampled data  (Jordan Gong)
    Disentanglement cannot be processed on different subjects at the same time; we need to load the `pr` subjects one by one. The batch splitter will return a pr-length list of tuples (each with 2 dicts containing k-length lists of labels, conditions and views, plus a k-length tensor of clip data, representing condition 1 and condition 2 respectively).
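    A toy illustration of that structure (dict keys, clip shape and condition names are assumptions, not taken from the code):

```python
import torch

pr, k = 2, 4  # pr subjects per batch, k clips per condition

# pr-length list; each element is a (condition 1, condition 2) tuple of dicts
# holding k-length metadata lists plus a k-length tensor of clip data.
batch = []
for subject in range(pr):
    cond1 = {'label': [subject] * k, 'condition': ['nm'] * k,
             'view': ['000'] * k, 'clip': torch.randn(k, 30, 64, 44)}
    cond2 = {'label': [subject] * k, 'condition': ['cl'] * k,
             'view': ['090'] * k, 'clip': torch.randn(k, 30, 64, 44)}
    batch.append((cond1, cond2))

assert len(batch) == pr and isinstance(batch[0], tuple)
```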
2020-12-26  Sample k more clips for disentanglement  (Jordan Gong)
2020-12-26  Add config file and corresponding type hint  (Jordan Gong)
2020-12-26  Combine transformed height and width into `frame_size`  (Jordan Gong)
2020-12-26  Bump up version for tqdm  (Jordan Gong)
2020-12-24  Implement Horizontal Pyramid Matching (HPM)  (Jordan Gong)
2020-12-24  Optimize imports  (Jordan Gong)
2020-12-24  Change the usage of layers and reorganize layer relations  (Jordan Gong)
    1. Add batch normalization and activation to layers
    2. VGGConv2d and FocalConv2d inherit from BasicConv2d; DCGANConvTranspose2d inherits from BasicConvTranspose2d
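    A rough sketch of that layer hierarchy (constructor arguments and the exact BN/activation placement are assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicConv2d(nn.Module):
    # Conv + BatchNorm; the activation is added by subclasses.
    def __init__(self, in_channels, out_channels, kernel_size, **kwargs):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, bias=False, **kwargs)
        self.bn = nn.BatchNorm2d(out_channels)

    def forward(self, x):
        return self.bn(self.conv(x))

class VGGConv2d(BasicConv2d):
    def __init__(self, in_channels, out_channels, kernel_size=3, padding=1):
        super().__init__(in_channels, out_channels, kernel_size, padding=padding)

    def forward(self, x):
        return F.relu(super().forward(x), inplace=True)

print(VGGConv2d(3, 64)(torch.randn(2, 3, 64, 44)).shape)  # torch.Size([2, 64, 64, 44])
```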
2020-12-23  Make activation inplace  (Jordan Gong)
2020-12-23  Modify activation functions after conv or trans-conv in auto-encoder  (Jordan Gong)
    1. Make activation functions in-place ops
    2. Change Leaky ReLU to ReLU in the decoder
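    In a decoder block this amounts to something like the following (channel sizes are illustrative):

```python
import torch.nn as nn

decoder_block = nn.Sequential(
    nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),  # in-place, and ReLU rather than the previous LeakyReLU
)
```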