Age | Commit message | Author
---|---|---
2020-12-23 | Modify activation functions after conv or trans-conv in auto-encoder: 1. make activation functions in-place ops; 2. change Leaky ReLU to ReLU in the decoder | Jordan Gong
2020-12-23 | Refactor and refine auto-encoder: 1. wrap Conv2d 3x3-padding-1 as VGGConv2d; 2. wrap ConvTranspose2d 4x4-stride-4-padding-1 as DCGANConvTranspose2d; 3. turn off bias in conv layers since batch normalization is employed | Jordan Gong
2020-12-23 | Wrap Conv1d no-bias layer | Jordan Gong
2020-12-23 | Reshape feature before decode | Jordan Gong
2020-12-23 | Remove redundant Leaky ReLU in FocalConv2d | Jordan Gong
2020-12-23 | Split modules to different files | Jordan Gong
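Read together, these commits suggest wrappers along the following lines. This is only a hypothetical PyTorch sketch: the module internals, the BatchNorm placement, and the constructor arguments are assumptions, while the kernel/stride/padding values and the `bias=False` choice follow the commit messages.

```python
import torch
import torch.nn as nn


class VGGConv2d(nn.Module):
    """3x3 conv with padding 1; bias off because BatchNorm follows
    (BatchNorm's learned shift makes a conv bias redundant)."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3,
                              padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        # Plain ReLU as an in-place op, per the activation-function commit
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))


class DCGANConvTranspose2d(nn.Module):
    """4x4 transposed conv with stride 4 and padding 1, as stated in
    the commit message; bias off for the same BatchNorm reason."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4,
                                       stride=4, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))
```

With these parameters, `VGGConv2d` preserves spatial size, while `DCGANConvTranspose2d` maps an input of size `H` to `(H - 1) * 4 - 2 + 4 = 4H - 2`.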