gt4sd.algorithms.generation.diffusion.geodiff.model.layers module¶
Summary¶
Classes:
CFConv | CFConv layer. |
GINEConv | GINEConv layer. |
GINEncoder | GIN encoder. |
InteractionBlock | Interaction block. |
MLPEdgeEncoder | MLP edge encoder. |
MoleculeGNNOutput | Hidden states output. |
MultiLayerPerceptron | Multi-layer Perceptron. |
SchNetEncoder | SchNet encoder. |
ShiftedSoftplus | Shifted softplus activation function. |
Reference¶
- class MoleculeGNNOutput(sample)[source]¶
Bases:
BaseOutput
Hidden states output. Output of last layer of model.
- sample: FloatTensor¶
- __init__(sample)¶
- class MultiLayerPerceptron(input_dim, hidden_dims, activation='relu', dropout=0)[source]¶
Bases:
Module
Multi-layer Perceptron. Note there is no activation or dropout in the last layer.
- __init__(input_dim, hidden_dims, activation='relu', dropout=0)[source]¶
Initialize multi-layer perceptron.
- Parameters
input_dim (int) – input dimension.
hidden_dims – hidden dimensions.
activation (str) – activation function.
dropout (float) – dropout rate.
- forward(x)[source]¶
Forward pass.
- Parameters
x (Tensor) – input tensor of shape (batch_size, input_dim).
- Return type
Tensor
- Returns
output of the MLP.
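The layer pattern described above (no activation or dropout after the last linear layer) can be sketched in plain PyTorch. This is an illustrative reimplementation, not the gt4sd code; the helper name `make_mlp` and the activation map are assumptions.

```python
import torch
from torch import nn

def make_mlp(input_dim, hidden_dims, activation="relu", dropout=0.0):
    """Sketch of the MultiLayerPerceptron pattern: Linear layers with
    activation/dropout after every layer except the last."""
    acts = {"relu": nn.ReLU, "tanh": nn.Tanh}
    dims = [input_dim] + list(hidden_dims)
    layers = []
    for i in range(len(dims) - 1):
        layers.append(nn.Linear(dims[i], dims[i + 1]))
        if i < len(dims) - 2:  # skip activation/dropout on the final layer
            layers.append(acts[activation]())
            if dropout > 0:
                layers.append(nn.Dropout(dropout))
    return nn.Sequential(*layers)

mlp = make_mlp(16, [32, 8])
out = mlp(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 8])
```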
- class ShiftedSoftplus[source]¶
Bases:
Module
Shifted softplus activation function.
- forward(x)[source]¶
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
- Return type
Tensor
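The standard SchNet-style shifted softplus is the softplus function translated so that f(0) = 0 (an assumption based on the usual definition; check the gt4sd source for the exact form):

```python
import math

def shifted_softplus(x: float) -> float:
    # softplus(x) - log(2), so that the activation passes through the origin
    return math.log(1.0 + math.exp(x)) - math.log(2.0)

print(shifted_softplus(0.0))  # 0.0
```

The shift keeps the activation smooth everywhere while behaving like a zero-centered ReLU for small inputs.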
- class CFConv(in_channels, out_channels, num_filters, mlp, cutoff, smooth)[source]¶
Bases:
MessagePassing
CFConv layer.
- __init__(in_channels, out_channels, num_filters, mlp, cutoff, smooth)[source]¶
Construct a CFConv layer.
- Parameters
in_channels (int) – size of each input.
out_channels (int) – size of each output.
num_filters (int) – number of filters.
mlp (Callable) – MLP used to generate the filter weights.
cutoff (float) – cutoff distance.
smooth (bool) – whether to use smooth cutoff.
- forward(x, edge_index, edge_length, edge_attr)[source]¶
Forward pass.
- Parameters
x (Tensor) – input tensor.
edge_index – edge indices.
edge_length – edge lengths.
edge_attr – edge attributes.
- Return type
Tensor
- Returns
output tensor.
- message(x_j, W)[source]¶
Constructs messages from node \(j\) to node \(i\) in analogy to \(\phi_{\mathbf{\Theta}}\) for each edge in edge_index. This function can take any argument as input which was initially passed to propagate(). Furthermore, tensors passed to propagate() can be mapped to the respective nodes \(i\) and \(j\) by appending _i or _j to the variable name, e.g. x_i and x_j.
- Return type
Tensor
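When `smooth=True`, SchNet-style CFConv layers typically damp filter weights with a cosine cutoff so that interactions vanish continuously at the cutoff distance. The exact form used here is an assumption based on the standard SchNet formulation:

```python
import math

def cosine_cutoff(r: float, cutoff: float) -> float:
    """C(r) = 0.5 * (cos(pi * r / cutoff) + 1) for r < cutoff, else 0."""
    if r >= cutoff:
        return 0.0
    return 0.5 * (math.cos(math.pi * r / cutoff) + 1.0)

print(cosine_cutoff(0.0, 10.0))   # 1.0
print(cosine_cutoff(10.0, 10.0))  # 0.0
```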
- class InteractionBlock(hidden_channels, num_gaussians, num_filters, cutoff, smooth)[source]¶
Bases:
Module
Interaction block.
- __init__(hidden_channels, num_gaussians, num_filters, cutoff, smooth)[source]¶
Construct an interaction block.
- Parameters
hidden_channels (int) – number of hidden channels.
num_gaussians (int) – number of gaussians.
num_filters (int) – number of filters.
cutoff (float) – cutoff distance.
smooth (bool) – whether to use smooth cutoff.
- forward(x, edge_index, edge_length, edge_attr)[source]¶
Forward pass.
- Parameters
x (Tensor) – input tensor.
edge_index – edge indices.
edge_length – edge lengths.
edge_attr – edge attributes.
- Return type
Tensor
- Returns
output tensor.
- class SchNetEncoder(hidden_channels=128, num_filters=128, num_interactions=6, edge_channels=100, cutoff=10.0, smooth=False)[source]¶
Bases:
Module
SchNet encoder.
- __init__(hidden_channels=128, num_filters=128, num_interactions=6, edge_channels=100, cutoff=10.0, smooth=False)[source]¶
Construct a SchNet encoder.
- Parameters
hidden_channels (int) – number of hidden channels.
num_filters (int) – number of filters.
num_interactions (int) – number of interactions.
edge_channels (int) – number of edge channels.
cutoff (float) – cutoff distance.
smooth (bool) – whether to use smooth cutoff.
- forward(z, edge_index, edge_length, edge_attr, embed_node=True)[source]¶
Forward pass.
- Parameters
z (Tensor) – input tensor.
edge_index (Tensor) – edge indices.
edge_length (Tensor) – edge lengths.
edge_attr (Tensor) – edge attributes.
embed_node (bool) – whether to embed the nodes.
- Return type
Tensor
- Returns
output tensor.
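SchNet-style encoders commonly expand each edge length into `num_gaussians` radial features via Gaussian smearing on evenly spaced centers in [0, cutoff]. The sketch below is an assumption about how such edge features are produced; see the gt4sd source for the exact scheme:

```python
import math

def gaussian_smearing(r, cutoff=10.0, num_gaussians=50):
    """Expand a distance r into Gaussian basis features exp(-gamma*(r - mu_k)^2)."""
    offsets = [cutoff * k / (num_gaussians - 1) for k in range(num_gaussians)]
    gamma = 1.0 / (offsets[1] - offsets[0]) ** 2
    return [math.exp(-gamma * (r - mu) ** 2) for mu in offsets]

feats = gaussian_smearing(2.5)
print(len(feats))  # 50
```

The feature vector peaks at the Gaussian center nearest to the edge length, giving the interaction blocks a smooth, localized encoding of distance.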
- class GINEConv(mlp, eps=0.0, train_eps=False, activation='softplus', **kwargs)[source]¶
Bases:
MessagePassing
GINEConv layer. Custom class of the graph isomorphism operator from the “How Powerful are Graph Neural Networks?” paper (https://arxiv.org/abs/1810.00826). Note that this implementation adds the option of a custom activation.
- __init__(mlp, eps=0.0, train_eps=False, activation='softplus', **kwargs)[source]¶
Construct a GINEConv layer.
- Parameters
mlp (Callable) – MLP.
eps (float) – epsilon.
train_eps (bool) – whether to train epsilon.
activation (str) – activation function.
- forward(x, edge_index, edge_attr=None, size=None)[source]¶
Forward pass.
- Parameters
x (Union[Tensor, Tuple[Tensor, Optional[Tensor]]]) – input tensor.
edge_index (Union[Tensor, SparseTensor]) – edge indices.
edge_attr (Optional[Tensor]) – edge attributes.
size (Optional[Tuple[int, int]]) – size.
- Return type
Tensor
- Returns
output tensor.
- message(x_j, edge_attr)[source]¶
Message function.
- Parameters
x_j (Tensor) – input tensor.
edge_attr (Tensor) – edge attributes.
- Return type
Tensor
- Returns
message passing aggregation.
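The GINE update rule can be checked numerically on a tiny graph. In the standard GINE formulation, each node computes x_i' = mlp((1 + eps) * x_i + Σ_j act(x_j + e_ji)); the scalar features and the identity "mlp" below are illustrative:

```python
def gine_update(x, edge_index, edge_attr, eps=0.0,
                act=lambda v: max(v, 0.0), mlp=lambda v: v):
    """Scalar-feature sketch of the GINE message-passing update."""
    out = [(1.0 + eps) * xi for xi in x]  # self term, weighted by (1 + eps)
    for (src, dst), e in zip(edge_index, edge_attr):
        out[dst] += act(x[src] + e)  # aggregate activated neighbor + edge term
    return [mlp(v) for v in out]

# Two nodes, one edge 0 -> 1 with attribute 0.5:
print(gine_update([1.0, 2.0], [(0, 1)], [0.5]))  # [1.0, 3.5]
```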
- class GINEncoder(hidden_dim, num_convs=3, activation='relu', short_cut=True, concat_hidden=False)[source]¶
Bases:
Module
GIN encoder.
- __init__(hidden_dim, num_convs=3, activation='relu', short_cut=True, concat_hidden=False)[source]¶
Construct a GIN encoder.
- Parameters
hidden_dim (int) – number of hidden channels.
num_convs (int) – number of convolutions.
activation (str) – activation function.
short_cut (bool) – whether to use shortcut connections.
concat_hidden (bool) – whether to concatenate hidden states.
- forward(z, edge_index, edge_attr)[source]¶
Forward pass.
- Parameters
z (Tensor) – input tensor.
edge_index (Tensor) – edge indices.
edge_attr (Tensor) – edge attributes.
- Return type
Tensor
- Returns
node features of the graph.
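The control flow implied by `short_cut` and `concat_hidden` can be sketched with scalar node features and a dummy doubling function standing in for the GINEConv layers; this is an assumption about the encoder's structure, not the gt4sd implementation:

```python
def encode(x, convs, short_cut=True, concat_hidden=False):
    """Sketch of a stacked-convolution encoder with optional residuals."""
    hiddens, h = [], x
    for conv in convs:
        out = [conv(v) for v in h]
        if short_cut:
            out = [o + v for o, v in zip(out, h)]  # residual connection
        hiddens.append(out)
        h = out
    if concat_hidden:
        # per-node concatenation of every layer's output
        return [[hs[i] for hs in hiddens] for i in range(len(x))]
    return hiddens[-1]

double = lambda v: 2 * v
print(encode([1.0], [double, double]))  # [9.0]
```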
- class MLPEdgeEncoder(hidden_dim=100, activation='relu')[source]¶
Bases:
Module
MLP edge encoder.
- __init__(hidden_dim=100, activation='relu')[source]¶
Construct an MLP edge encoder.
- Parameters
hidden_dim (int) – number of hidden channels.
activation (str) – activation function.
- property out_channels¶