# Model Architecture

## Network Design
The neural network was designed to balance capacity and generalization for a 13-feature tabular dataset.

| Layer | Neurons | Activation |
|---|---|---|
| Input | 13 | — |
| Hidden 1 | 64 | ReLU |
| Hidden 2 | 32 | ReLU |
| Hidden 3 | 16 | ReLU |
| Output | 1 | Sigmoid |
- Loss function: Binary Cross-Entropy
- Optimizer: Adam
- Regularization: Dropout after each hidden layer
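As a rough illustration of this design, here is a minimal numpy sketch of the forward pass: ReLU on the three hidden layers, sigmoid on the output, and a binary cross-entropy loss. The random weights and the helper names (`forward`, `bce`) are placeholders for illustration, not the trained model; dropout is omitted since it is only active during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer widths from the table: 13 -> 64 -> 32 -> 16 -> 1
sizes = [13, 64, 32, 16, 1]

# Hypothetical random weights/biases, for illustration only
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """ReLU through the hidden layers, sigmoid on the single output unit."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)
    return sigmoid(h @ weights[-1] + biases[-1])

def bce(y_true, y_pred, eps=1e-7):
    """Binary cross-entropy, clipped for numerical stability."""
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

x = rng.normal(size=(4, 13))  # batch of 4 samples, 13 features
probs = forward(x)
print(probs.shape)            # (4, 1) probabilities in (0, 1)
```

In a real training setup the same shapes would map directly onto a framework's dense layers, with dropout inserted after each hidden activation.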
## Why 3 Hidden Layers?
The choice is grounded in the bias-variance trade-off:

- Too few layers → underfitting (high bias): the model cannot capture meaningful patterns.
- Too many layers → overfitting (high variance): the model memorizes the training data.
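One concrete way to reason about capacity is to count trainable parameters. The short sketch below derives the count from the layer widths in the table above (each dense layer contributes `in × out` weights plus `out` biases); the total is a property of the architecture, not of any particular training run.

```python
# Layer widths from the architecture table: 13 -> 64 -> 32 -> 16 -> 1
sizes = [13, 64, 32, 16, 1]

# Each fully connected layer has (in * out) weights plus out biases
params = sum(m * n + n for m, n in zip(sizes[:-1], sizes[1:]))
print(params)  # 3521 trainable parameters
```

At a few thousand parameters, the network has enough capacity to model non-linear feature interactions without being so large that a small tabular dataset is trivially memorized; dropout then curbs the remaining variance.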