# siglip-wikiart-5ep
This model is a fine-tuned version of [google/siglip-base-patch16-256-multilingual](https://huggingface.co/google/siglip-base-patch16-256-multilingual) on an unspecified dataset; the model name suggests fine-tuning on WikiArt for 5 epochs. It achieves the following results on the evaluation set:
- Loss: 0.0290
## Model description
More information needed
## Intended uses & limitations
More information needed
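The intended task is not documented. Assuming the checkpoint loads as a standard `SiglipModel` usable for zero-shot image-text matching (an assumption, since the fine-tuning objective is not stated in this card), a minimal sketch:

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoProcessor

# Assumption: the checkpoint loads as a standard SiglipModel; the fine-tuning
# objective is not documented in this card.
model = AutoModel.from_pretrained("turing552/siglip-wikiart-5ep")
processor = AutoProcessor.from_pretrained("turing552/siglip-wikiart-5ep")

image = Image.open("painting.jpg")  # hypothetical local file
candidate_labels = ["an impressionist painting", "a cubist painting"]  # illustrative prompts

# SigLIP processors expect padding="max_length" for text inputs.
inputs = processor(
    text=candidate_labels, images=image, padding="max_length", return_tensors="pt"
)
with torch.no_grad():
    outputs = model(**inputs)

# SigLIP scores image-text pairs with a sigmoid rather than a softmax.
probs = torch.sigmoid(outputs.logits_per_image)
print(probs)
```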
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
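A minimal sketch mapping these settings onto `transformers.TrainingArguments`; `output_dir` and the 500-step evaluation cadence (inferred from the results table below) are assumptions, everything else comes from the list above:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="siglip-wikiart-5ep",  # assumed
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    eval_strategy="steps",
    eval_steps=500,  # matches the evaluation cadence in the results table
)
```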
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.0353 | 0.1913 | 500 | 0.0328 |
| 0.0292 | 0.3826 | 1000 | 0.0295 |
| 0.0289 | 0.5738 | 1500 | 0.0272 |
| 0.0244 | 0.7651 | 2000 | 0.0253 |
| 0.0237 | 0.9564 | 2500 | 0.0241 |
| 0.0179 | 1.1477 | 3000 | 0.0233 |
| 0.0169 | 1.3389 | 3500 | 0.0222 |
| 0.0171 | 1.5302 | 4000 | 0.0216 |
| 0.0171 | 1.7215 | 4500 | 0.0220 |
| 0.0162 | 1.9128 | 5000 | 0.0196 |
| 0.0099 | 2.1041 | 5500 | 0.0236 |
| 0.0089 | 2.2953 | 6000 | 0.0226 |
| 0.0098 | 2.4866 | 6500 | 0.0239 |
| 0.0100 | 2.6779 | 7000 | 0.0220 |
| 0.0087 | 2.8692 | 7500 | 0.0233 |
| 0.0059 | 3.0604 | 8000 | 0.0263 |
| 0.0063 | 3.2517 | 8500 | 0.0243 |
| 0.0059 | 3.4430 | 9000 | 0.0246 |
| 0.0063 | 3.6343 | 9500 | 0.0249 |
| 0.0063 | 3.8256 | 10000 | 0.0269 |
| 0.0054 | 4.0168 | 10500 | 0.0276 |
| 0.0054 | 4.2081 | 11000 | 0.0285 |
| 0.0039 | 4.3994 | 11500 | 0.0284 |
| 0.0044 | 4.5907 | 12000 | 0.0294 |
| 0.0049 | 4.7819 | 12500 | 0.0294 |
| 0.0035 | 4.9732 | 13000 | 0.0290 |
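Validation loss reaches its minimum of 0.0196 at step 5000 (epoch ~1.9) and drifts upward through the remaining epochs while training loss keeps falling, which is consistent with mild overfitting. If intermediate checkpoints were kept (Trainer saves them as `checkpoint-<step>` directories by default), the step-5000 one may generalize better than the final weights. A hypothetical sketch:

```python
from transformers import AutoModel

# Hypothetical: assumes the step-5000 checkpoint directory was preserved locally.
best_model = AutoModel.from_pretrained("siglip-wikiart-5ep/checkpoint-5000")
```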
### Framework versions
- Transformers 4.57.1
- PyTorch 2.6.0+cu124
- Datasets 4.4.1
- Tokenizers 0.22.1