configuration_utils

multimolecule.models.configuration_utils
HeadConfig (dataclass)
Bases: BaseHeadConfig
Configuration class for a prediction head.
Parameters:
Name | Type | Description | Default
---|---|---|---
num_labels | int | Number of labels to use in the last layer added to the model, typically for a classification task. Head should look for num_labels of the model config if it is None. | None
problem_type | str | Problem type of the prediction task. Head should look for problem_type of the model config if it is None. | None
hidden_size | int \| None | Dimensionality of the encoder layers and the pooler layer. Head should look for hidden_size of the model config if it is None. | None
dropout | float | The dropout ratio for the hidden states. | 0.0
transform | str \| None | The transform operation applied to hidden states. | None
transform_act | str \| None | The activation function of the transform applied to hidden states. | 'gelu'
bias | bool | Whether to apply bias to the final prediction layer. | True
act | str \| None | The activation function of the final prediction output. | None
layer_norm_eps | float | The epsilon used by the layer normalization layers. | 1e-12
output_name | str \| None | The name of the tensor required in model outputs. | None
Source code in multimolecule/module/heads/config.py
MaskedLMHeadConfig (dataclass)
Bases: BaseHeadConfig
Configuration class for a Masked Language Modeling head.
Parameters:
Name | Type | Description | Default
---|---|---|---
hidden_size | int \| None | Dimensionality of the encoder layers and the pooler layer. Head should look for hidden_size of the model config if it is None. | None
dropout | float | The dropout ratio for the hidden states. | 0.0
transform | str \| None | The transform operation applied to hidden states. | 'nonlinear'
transform_act | str \| None | The activation function of the transform applied to hidden states. | 'gelu'
bias | bool | Whether to apply bias to the final prediction layer. | True
act | str \| None | The activation function of the final prediction output. | None
layer_norm_eps | float | The epsilon used by the layer normalization layers. | 1e-12
output_name | str \| None | The name of the tensor required in model outputs. | None
Source code in multimolecule/module/heads/config.py
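The "Head should look for … if it is None" fallback described in both tables can be sketched as follows. The helper name and dict-based configs are hypothetical, chosen only to illustrate the lookup order, and are not multimolecule's actual API:

```python
# Hypothetical sketch of the documented fallback: when a head-config field
# is None, the head falls back to the same-named attribute on the model
# config. Dicts stand in for the real config objects.
def resolve(head_config: dict, model_config: dict, key: str):
    value = head_config.get(key)
    if value is None:
        value = model_config.get(key)
    return value


head = {"hidden_size": None, "dropout": 0.1}
model = {"hidden_size": 768, "num_labels": 2}
print(resolve(head, model, "hidden_size"))  # falls back to the model's 768
print(resolve(head, model, "dropout"))      # head's own 0.1 wins
```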
PreTrainedConfig
Bases: PretrainedConfig
Base class for all model configuration classes.