gigl.src.common.models.layers.loss
Classes
Taken from THUwangcy/DirectAU, AlignmentLoss increases the similarity of representations between positive user-item pairs.
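The DirectAU alignment term is the mean squared L2 distance between L2-normalized embeddings of positive pairs. A minimal sketch of that formula follows; the function name and signature are illustrative, not the actual class in this module.

```python
import torch
import torch.nn.functional as F

def alignment_loss(user_emb: torch.Tensor, item_emb: torch.Tensor) -> torch.Tensor:
    """Sketch of the DirectAU alignment term: mean squared L2 distance
    between L2-normalized embeddings of positive user-item pairs."""
    user_emb = F.normalize(user_emb, dim=-1)
    item_emb = F.normalize(item_emb, dim=-1)
    return (user_emb - item_emb).norm(p=2, dim=1).pow(2).mean()
```

Identical user and item embeddings yield a loss of zero; the loss grows as positive pairs drift apart on the unit hypersphere.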
Leverages the BGRL loss from https://arxiv.org/pdf/2102.06514.pdf, using an offline and an online encoder to predict alternative augmentations of the input.
Computes SCE between the original feature and the reconstructed feature.
Computes the Barlow Twins loss on the two input matrices as an auxiliary loss.
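The Barlow Twins objective pushes the cross-correlation matrix of two augmented views toward the identity: the diagonal term enforces invariance across views, while the off-diagonal term reduces redundancy between feature dimensions. A minimal sketch, with an assumed `lambd` weighting hyperparameter (the module's actual defaults may differ):

```python
import torch

def barlow_twins_loss(z1: torch.Tensor, z2: torch.Tensor, lambd: float = 5e-3) -> torch.Tensor:
    """Sketch of the Barlow Twins loss on two (batch, dim) embedding matrices."""
    # Standardize each feature dimension across the batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-8)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-8)
    n, d = z1.shape
    c = (z1.T @ z2) / n  # cross-correlation matrix, shape (d, d)
    # Invariance: diagonal entries should be 1; redundancy: off-diagonals should be 0.
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lambd * off_diag
```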
A loss class that implements the GRACE (https://arxiv.org/pdf/2006.04131.pdf) contrastive loss approach.
Calculates the KL divergence between two sets of scores for the distribution loss.
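A KL-divergence distribution loss typically softmax-normalizes both score sets and measures how much one distribution diverges from the other. A minimal sketch, assuming a temperature-scaled softmax (the function name and temperature parameter are illustrative):

```python
import torch
import torch.nn.functional as F

def kl_distribution_loss(student_scores: torch.Tensor,
                         teacher_scores: torch.Tensor,
                         temperature: float = 1.0) -> torch.Tensor:
    """Sketch of a KL-divergence loss between two sets of scores.

    Both score tensors are normalized with a temperature-scaled softmax,
    then KL(teacher || student) is averaged over the batch.
    """
    log_p = F.log_softmax(student_scores / temperature, dim=-1)
    q = F.softmax(teacher_scores / temperature, dim=-1)
    return F.kl_div(log_p, q, reduction="batchmean")
```

When the two score sets are identical, the divergence is zero.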
Calculates a margin-based ranking loss between two sets of scores for the ranking loss in LLP.
A loss layer built on top of the PyTorch implementation of the margin ranking loss.
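PyTorch's margin ranking loss penalizes a pair only when the score ordering violates the target direction by more than the margin: for target `y = 1` it computes `max(0, -(x1 - x2) + margin)` per pair. A short usage sketch (the score values are illustrative):

```python
import torch
import torch.nn as nn

loss_fn = nn.MarginRankingLoss(margin=0.5)

pos = torch.tensor([0.9, 0.8])       # scores that should rank higher
neg = torch.tensor([0.1, 0.7])       # scores that should rank lower
target = torch.ones_like(pos)        # y = 1: first input should outrank the second

# Per pair: max(0, -(pos - neg) + margin) -> [0.0, 0.4]; mean reduction -> 0.2
loss = loss_fn(pos, neg, target)
```

The first pair is separated by more than the margin (0.8 > 0.5), so it contributes nothing; the second pair's gap of 0.1 falls short of the margin and contributes 0.4.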
An enumeration.
A loss layer built on top of the tensorflow_recommenders implementation.
A loss layer built on top of the PyTorch implementation of the softmax cross entropy loss.
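A common formulation of softmax cross entropy for retrieval treats each query's co-indexed candidate as the positive and every other candidate in the batch as a negative. A minimal sketch under that in-batch-negatives assumption; the actual layer's inputs and temperature handling may differ:

```python
import torch
import torch.nn.functional as F

def softmax_cross_entropy_loss(query_emb: torch.Tensor,
                               candidate_emb: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """Sketch of in-batch softmax cross entropy.

    The positive for query i is candidate i; all other candidates in the
    batch act as negatives via the softmax over similarity logits.
    """
    logits = query_emb @ candidate_emb.T / temperature  # (batch, batch) similarities
    labels = torch.arange(logits.size(0))               # diagonal entries are positives
    return F.cross_entropy(logits, labels)
```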
TBGRL (https://arxiv.org/pdf/2211.14394.pdf) improves over BGRL by generating a third augmented graph as a negative sample, providing a cheap corruption that improves the generalizability of the model in inductive settings.
Taken from THUwangcy/DirectAU, UniformityLoss measures how well the representations scatter on the hypersphere.
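The DirectAU uniformity term is the log of the mean pairwise Gaussian potential of the normalized embeddings; lower values mean the representations are spread more evenly over the unit hypersphere. A minimal sketch of that formula (the `t` scale parameter default follows the DirectAU paper; the module's actual default may differ):

```python
import torch
import torch.nn.functional as F

def uniformity_loss(x: torch.Tensor, t: float = 2.0) -> torch.Tensor:
    """Sketch of the DirectAU uniformity term:
    log( mean over pairs of exp(-t * ||x_i - x_j||^2) ) on normalized embeddings."""
    x = F.normalize(x, dim=-1)
    return torch.pdist(x, p=2).pow(2).mul(-t).exp().mean().log()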
Utilizes canonical correlation analysis to compute similarity between augmented graphs as an auxiliary loss. |