gigl.src.common.models.layers.loss#

Classes

AligmentLoss

Taken from THUwangcy/DirectAU, AlignmentLoss increases the similarity of representations between positive user-item pairs.
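The DirectAU alignment term is the expected (powered) distance between L2-normalized embeddings of positive user-item pairs. A minimal PyTorch sketch of that formulation (the function name `alignment_loss` and the `alpha` default are assumptions for illustration):

```python
import torch
import torch.nn.functional as F

def alignment_loss(user_emb: torch.Tensor, item_emb: torch.Tensor, alpha: float = 2.0) -> torch.Tensor:
    # L2-normalize so distances live on the unit hypersphere
    user_emb = F.normalize(user_emb, dim=-1)
    item_emb = F.normalize(item_emb, dim=-1)
    # Mean powered distance between matched (positive) user-item pairs
    return (user_emb - item_emb).norm(p=2, dim=1).pow(alpha).mean()
```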

BGRLLoss

Leverages the BGRL loss from https://arxiv.org/pdf/2102.06514.pdf, using an online encoder and an offline encoder to predict alternative augmentations of the input.
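BGRL's objective reduces to a negative cosine similarity between the online network's prediction and a stop-gradient target embedding from the offline (EMA) encoder. A hedged sketch of just the loss term (the name `bgrl_loss` is an assumption; in practice the term is symmetrized over both augmentations):

```python
import torch
import torch.nn.functional as F

def bgrl_loss(online_pred: torch.Tensor, target_emb: torch.Tensor) -> torch.Tensor:
    # Stop gradients through the offline/EMA target, as in BGRL/BYOL
    target_emb = F.normalize(target_emb.detach(), dim=-1)
    online_pred = F.normalize(online_pred, dim=-1)
    # 2 - 2*cos(pred, target): zero when the prediction matches the target
    return (2.0 - 2.0 * (online_pred * target_emb).sum(dim=-1)).mean()
```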

FeatureReconstructionLoss

Computes the scaled cosine error (SCE) between the original and reconstructed features.
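Assuming SCE here is the scaled cosine error of GraphMAE, the loss raises one minus the cosine similarity to a power γ to down-weight easy samples. A minimal sketch under that assumption (`gamma=2.0` is an illustrative default):

```python
import torch
import torch.nn.functional as F

def sce_loss(x: torch.Tensor, x_rec: torch.Tensor, gamma: float = 2.0) -> torch.Tensor:
    x = F.normalize(x, dim=-1)
    x_rec = F.normalize(x_rec, dim=-1)
    cos = (x * x_rec).sum(dim=-1)
    # (1 - cos)^gamma; the clamp guards against tiny negative fp error
    return (1.0 - cos).clamp(min=0.0).pow(gamma).mean()
```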

GBTLoss

Computes the Barlow Twins loss on the two input matrices as an auxiliary loss.
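Barlow Twins pushes the cross-correlation matrix of the two standardized embedding views toward the identity: diagonal entries toward 1 (invariance) and off-diagonal entries toward 0 (redundancy reduction). A sketch (the `lambda_off` weight is an illustrative value):

```python
import torch

def barlow_twins_loss(z_a: torch.Tensor, z_b: torch.Tensor, lambda_off: float = 5e-3) -> torch.Tensor:
    n, d = z_a.shape
    # Standardize each feature dimension across the batch
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-8)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-8)
    c = (z_a.T @ z_b) / n  # d x d cross-correlation matrix
    on_diag = (torch.diagonal(c) - 1.0).pow(2).sum()             # pull diagonal to 1
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # push off-diagonal to 0
    return on_diag + lambda_off * off_diag
```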

GRACELoss

A loss class that implements the GRACE (https://arxiv.org/pdf/2006.04131.pdf) contrastive loss approach.
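GRACE applies an NT-Xent-style objective per node, contrasting the same node across two augmented views against both intra-view and inter-view negatives. A hedged sketch of the symmetric objective (the `tau` default is illustrative):

```python
import torch
import torch.nn.functional as F

def grace_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)

    def semi_loss(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        intra = torch.exp(a @ a.T / tau)   # same-view similarities (negatives)
        inter = torch.exp(a @ b.T / tau)   # cross-view similarities
        pos = inter.diag()                 # the same node in the other view
        denom = intra.sum(1) - intra.diag() + inter.sum(1)
        return -torch.log(pos / denom)

    # Symmetrize over the two views
    return 0.5 * (semi_loss(z1, z2) + semi_loss(z2, z1)).mean()
```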

KLLoss

Calculates the KL divergence between two sets of scores for the distribution loss.
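A common way to realize such a distribution loss is the KL divergence between softmax distributions over the two score sets; a sketch under that assumption (the function name and the temperature `tau` are illustrative):

```python
import torch
import torch.nn.functional as F

def kl_distribution_loss(scores: torch.Tensor, target_scores: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    # KL(target || model) over softmax-normalized score distributions
    log_p = F.log_softmax(scores / tau, dim=-1)
    q = F.softmax(target_scores / tau, dim=-1)
    return F.kl_div(log_p, q, reduction="batchmean")
```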

LLPRankingLoss

Calculates a margin-based ranking loss between two sets of scores for the ranking loss in LLP.

MarginLoss

A loss layer built on top of the PyTorch implementation of the margin ranking loss.
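PyTorch's built-in `torch.nn.MarginRankingLoss` computes `max(0, -y * (x1 - x2) + margin)` per pair; with `y = 1` it penalizes positives that fail to outscore negatives by the margin. Usage (the scores below are made up for illustration):

```python
import torch

loss_fn = torch.nn.MarginRankingLoss(margin=0.5)
pos_scores = torch.tensor([0.9, 0.8])  # scores for positive pairs
neg_scores = torch.tensor([0.1, 0.7])  # scores for negative pairs
target = torch.ones_like(pos_scores)   # +1: pos_scores should rank higher

loss = loss_fn(pos_scores, neg_scores, target)
# First pair clears the margin (0.9 - 0.1 > 0.5), the second does not
# (0.8 - 0.7 < 0.5), so loss = (0 + 0.4) / 2 = 0.2
```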

ModelResultType

An enumeration.

RetrievalLoss

A loss layer built on top of the tensorflow_recommenders implementation of the retrieval task loss.
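tensorflow_recommenders' `tfrs.tasks.Retrieval` realizes an in-batch sampled softmax: each query's matching candidate sits on the diagonal of the query-candidate similarity matrix, and the other in-batch candidates act as negatives. A PyTorch sketch of that idea (not the tfrs API itself):

```python
import torch
import torch.nn.functional as F

def in_batch_retrieval_loss(query_emb: torch.Tensor, cand_emb: torch.Tensor) -> torch.Tensor:
    logits = query_emb @ cand_emb.T        # [B, B] similarity matrix
    labels = torch.arange(logits.size(0))  # positives sit on the diagonal
    return F.cross_entropy(logits, labels)
```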

SoftmaxLoss

A loss layer built on top of the PyTorch implementation of the softmax cross entropy loss.
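Softmax cross entropy over one positive and K negative scores treats the positive as the correct class. A sketch (`pos_scores` of shape `[B]` and `neg_scores` of shape `[B, K]` are assumed conventions):

```python
import torch
import torch.nn.functional as F

def softmax_ce_loss(pos_scores: torch.Tensor, neg_scores: torch.Tensor) -> torch.Tensor:
    # Positive score becomes class 0; negatives fill the remaining classes
    logits = torch.cat([pos_scores.unsqueeze(1), neg_scores], dim=1)
    labels = torch.zeros(logits.size(0), dtype=torch.long)
    return F.cross_entropy(logits, labels)
```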

TBGRLLoss

TBGRL (https://arxiv.org/pdf/2211.14394.pdf) improves on BGRL by generating a third augmented graph as a negative sample, providing a cheap corruption that improves the model's generalization in inductive settings.

UniformityLoss

Taken from THUwangcy/DirectAU, UniformityLoss measures how well the representations scatter on the hypersphere.
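DirectAU's uniformity term is the log of the mean Gaussian potential over all pairwise distances of the normalized embeddings; more negative means the representations scatter more uniformly. A sketch (the temperature `t=2` follows the DirectAU formulation):

```python
import torch
import torch.nn.functional as F

def uniformity_loss(emb: torch.Tensor, t: float = 2.0) -> torch.Tensor:
    emb = F.normalize(emb, dim=-1)
    sq_dists = torch.pdist(emb, p=2).pow(2)  # all pairwise squared distances
    return sq_dists.mul(-t).exp().mean().log()
```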

WhiteningDecorrelationLoss

Utilizes canonical correlation analysis to compute similarity between augmented graphs as an auxiliary loss.
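One realization consistent with this description is the CCA-SSG objective: an invariance term pulling the two standardized views together, plus decorrelation (whitening) terms pushing each view's feature covariance toward the identity. A sketch under that assumption (the `lam` weight is illustrative):

```python
import torch

def cca_ssg_loss(z1: torch.Tensor, z2: torch.Tensor, lam: float = 1e-3) -> torch.Tensor:
    n, d = z1.shape
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-8)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-8)
    invariance = (z1 - z2).pow(2).sum() / n  # pull the two views together
    eye = torch.eye(d)
    # Whitening/decorrelation: feature covariance should approach identity
    decorrelation = ((z1.T @ z1 / n - eye).pow(2).sum()
                     + (z2.T @ z2 / n - eye).pow(2).sum())
    return invariance + lam * decorrelation
```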