models package¶
Submodules¶
models.MLP module¶
- class models.MLP.MLPModel2(*args, **kwargs)[source]¶
Bases:
Model
A customizable Multi-Layer Perceptron (MLP) model for deep learning.
This class builds a flexible MLP architecture for either classification or regression tasks. It supports batch normalization, dropout for regularization, and learning-rate decay.
- batch_norm_layer¶
Optional batch normalization layer.
- Type:
tf.keras.layers.BatchNormalization
- batch_norm¶
Flag indicating whether batch normalization is to be used.
- Type:
bool
- mlp_model¶
Sequential model representing the MLP layers.
- Type:
tf.keras.Sequential
- mlp_predictor¶
Final dense layer for prediction.
- Type:
tf.keras.layers.Dense
- optimizer¶
Optimizer with a decaying learning rate.
- Type:
tf.keras.optimizers.Adam
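The attributes above can be sketched framework-agnostically. The NumPy snippet below illustrates what the pieces compute: optional batch normalization of the inputs, ReLU hidden layers with inverted dropout, a final dense prediction layer, and an exponentially decaying learning rate (the formula matches `tf.keras.optimizers.schedules.ExponentialDecay`). It is an illustrative sketch, not the library's implementation; the function names and defaults here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, weights, training=False, dropout_rate=0.5, batch_norm=True):
    """Forward pass through ReLU hidden layers plus a dense prediction layer.

    weights: list of (W, b) pairs; the last pair is the prediction layer.
    """
    h = x
    if batch_norm:
        # Normalize each feature over the batch (no learned scale/shift here).
        h = (h - h.mean(axis=0)) / (h.std(axis=0) + 1e-5)
    for W, b in weights[:-1]:
        h = np.maximum(h @ W + b, 0.0)  # ReLU hidden layer
        if training and dropout_rate > 0:
            # Inverted dropout: zero units at train time, rescale the rest.
            mask = rng.random(h.shape) >= dropout_rate
            h = h * mask / (1.0 - dropout_rate)
    W_out, b_out = weights[-1]  # final dense prediction layer
    return h @ W_out + b_out

def decayed_lr(initial_lr, step, decay_steps, decay_rate):
    """Exponential learning-rate decay: lr * decay_rate ** (step / decay_steps)."""
    return initial_lr * decay_rate ** (step / decay_steps)
```

In the real model these roles are filled by `tf.keras.Sequential` layers, a `tf.keras.layers.Dense` predictor, and an Adam optimizer wrapping the decay schedule.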
models.MLPEval module¶
models.MLPSequentialAttention module¶
Implements a machine learning model that combines a Multi-Layer Perceptron (MLP) with Sequential Attention for feature selection. This approach aims to enhance model performance by focusing on the most relevant input features over a series of training steps.
- class models.MLPSequentialAttention.SequentialAttentionModel(*args, **kwargs)[source]¶
Bases:
MLPModel2
Defines an MLP model enhanced with a Sequential Attention mechanism for dynamic feature selection during training.
Extends MLPModel2 with Sequential Attention, which selects a subset of input features based on their importance to the prediction task.
- call(inputs, training=False)[source]¶
Forward pass for the model. Applies batch normalization (if enabled) and Sequential Attention to the input features, then the MLP layers and the output prediction layer.
- Parameters:
inputs (Tensor) – Input features.
training (bool) – Whether the model is in training mode.
- Returns:
The output predictions of the model.
- Return type:
Tensor
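The order of operations in `call` can be sketched as follows. This NumPy snippet is only illustrative of the documented pipeline (normalize, gate inputs by attention weights, MLP, predict); the names, shapes, and the single hidden layer are assumptions, not the library's API.

```python
import numpy as np

def attention(logits):
    """Softmax over per-feature logits -> weights summing to 1."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def call(inputs, attn_logits, hidden, predictor, batch_norm=True, training=False):
    """Illustrative forward pass mirroring the documented order of operations.

    The `training` flag would switch batch-norm statistics and dropout in the
    real model; it is ignored in this sketch.
    """
    h = inputs
    if batch_norm:
        # Batch normalization over the batch dimension.
        h = (h - h.mean(axis=0)) / (h.std(axis=0) + 1e-5)
    h = h * attention(attn_logits)      # gate each input feature by its weight
    W_h, b_h = hidden
    h = np.maximum(h @ W_h + b_h, 0.0)  # MLP hidden layer (ReLU)
    W_o, b_o = predictor
    return h @ W_o + b_o                # output prediction layer
```

Gating inputs this way means a feature with near-zero attention weight contributes almost nothing downstream, which is what lets attention weights act as a feature-selection signal.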
models.SequentialAttention module¶
Sequential Attention for Feature Selection.
This module implements a Sequential Attention mechanism for feature selection as described in https://arxiv.org/abs/2209.14881. It progressively selects features based on their importance calculated through a trainable attention mechanism, optimizing for both feature relevance and compactness in the selected feature set.
- class models.SequentialAttention.SequentialAttention(num_candidates, num_candidates_to_select, num_candidates_to_select_per_step=1, start_percentage=0.1, stop_percentage=1.0, name='sequential_attention', reset_weights=True, **kwargs)[source]¶
Bases:
Module
Implements a Sequential Attention mechanism for feature selection.
This class defines a module that applies a trainable attention mechanism to sequentially select a subset of features from a larger set based on their relevance. The selection process is driven by the fraction of training completed, allowing feature importance to be adjusted dynamically over time.
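The two ideas in this description can be sketched in isolation: a schedule that maps the fraction of training completed (between `start_percentage` and `stop_percentage`) to a number of features to have fixed so far, and a greedy loop that, at each selection point, softmaxes the attention logits over the not-yet-selected candidates and fixes the highest-weight one. This is a simplified NumPy sketch of the mechanism described in the paper, with hypothetical names and one feature selected per step; it is not the module's actual implementation.

```python
import numpy as np

def num_selected(percent_trained, num_to_select, start=0.1, stop=1.0):
    """How many features should be fixed at this point in training."""
    frac = min(max((percent_trained - start) / (stop - start), 0.0), 1.0)
    return int(np.floor(frac * num_to_select))

def sequential_select(logits_per_step, num_to_select):
    """Greedy sequential selection over attention logits.

    At each step, softmax the logits over candidates not yet selected
    (selected ones are masked to -inf) and fix the highest-weight one.
    """
    selected = []
    for logits in logits_per_step[:num_to_select]:
        masked = np.where(
            np.isin(np.arange(len(logits)), selected), -np.inf, np.asarray(logits, float)
        )
        w = np.exp(masked - masked.max())
        w = w / w.sum()
        selected.append(int(np.argmax(w)))
    return selected
```

In the real module the logits are trainable variables updated jointly with the model, so the ranking at each selection point reflects how useful each remaining feature has been to the prediction task so far.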