---
library_name: transformers
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: DinoVdeauTest-large-2024_09_24-batch-size32_freeze
  results: []
---
# DinoVdeauTest-large-2024_09_24-batch-size32_freeze
This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1204
- F1 Micro: 0.8214
- F1 Macro: 0.7191
- Accuracy: 0.3135
- Learning Rate: 0.0000
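
The combination of micro/macro F1 with a much lower accuracy suggests a multi-label classification setup, where accuracy is exact-match (subset) accuracy. Below is a minimal inference sketch; the repo id is hypothetical (substitute this checkpoint's actual Hub location), and the sigmoid-plus-0.5-threshold decision rule is an assumption, not documented here.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id -- substitute the actual Hub location of this checkpoint.
repo_id = "your-org/DinoVdeauTest-large-2024_09_24-batch-size32_freeze"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assumed multi-label decision rule: independent sigmoid per class, 0.5 threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```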
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
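
As a reference, here is a minimal sketch of this configuration using the `transformers` `Trainer` API. The backbone freeze is inferred from the `_freeze` suffix in the model name and is an assumption; `num_labels` and the dataset itself are not documented above, so data loading is omitted.

```python
from transformers import AutoModelForImageClassification, TrainingArguments

# num_labels for the target dataset is not documented; 2 is a placeholder.
model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large", num_labels=2
)

# Assumption based on the "_freeze" suffix: freeze the DINOv2 backbone so
# only the classification head is trained.
for param in model.dinov2.parameters():
    param.requires_grad = False

training_args = TrainingArguments(
    output_dir="DinoVdeauTest-large-2024_09_24-batch-size32_freeze",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # Native AMP mixed-precision training
)
```

Adam with the listed betas and epsilon is the `Trainer` default, so it needs no explicit configuration.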
### Training results
Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy | Learning Rate |
---|---|---|---|---|---|---|---|
No log | 1.0 | 273 | 0.1776 | 0.7486 | 0.5520 | 0.2169 | 0.001 |
0.2736 | 2.0 | 546 | 0.1528 | 0.7693 | 0.5698 | 0.2405 | 0.001 |
0.2736 | 3.0 | 819 | 0.1483 | 0.7774 | 0.6217 | 0.2495 | 0.001 |
0.1699 | 4.0 | 1092 | 0.1467 | 0.7772 | 0.6272 | 0.2554 | 0.001 |
0.1699 | 5.0 | 1365 | 0.1453 | 0.7772 | 0.6281 | 0.2464 | 0.001 |
0.1622 | 6.0 | 1638 | 0.1437 | 0.7810 | 0.6191 | 0.2630 | 0.001 |
0.1622 | 7.0 | 1911 | 0.1428 | 0.7811 | 0.6172 | 0.2599 | 0.001 |
0.1593 | 8.0 | 2184 | 0.1433 | 0.7799 | 0.6253 | 0.2557 | 0.001 |
0.1593 | 9.0 | 2457 | 0.1420 | 0.7893 | 0.6530 | 0.2519 | 0.001 |
0.1569 | 10.0 | 2730 | 0.2462 | 0.7441 | 0.6022 | 0.2554 | 0.001 |
0.156 | 11.0 | 3003 | 0.1400 | 0.7883 | 0.6483 | 0.2703 | 0.001 |
0.156 | 12.0 | 3276 | 0.1400 | 0.7906 | 0.6590 | 0.2599 | 0.001 |
0.1547 | 13.0 | 3549 | 0.1394 | 0.7876 | 0.6467 | 0.2644 | 0.001 |
0.1547 | 14.0 | 3822 | 0.1399 | 0.7879 | 0.6469 | 0.2640 | 0.001 |
0.1543 | 15.0 | 4095 | 0.1391 | 0.7881 | 0.6413 | 0.2512 | 0.001 |
0.1543 | 16.0 | 4368 | 0.1414 | 0.7907 | 0.6348 | 0.2633 | 0.001 |
0.1536 | 17.0 | 4641 | 0.1403 | 0.7903 | 0.6445 | 0.2616 | 0.001 |
0.1536 | 18.0 | 4914 | 0.1404 | 0.7929 | 0.6464 | 0.2602 | 0.001 |
0.1556 | 19.0 | 5187 | 0.1404 | 0.7936 | 0.6526 | 0.2585 | 0.001 |
0.1556 | 20.0 | 5460 | 0.1391 | 0.7900 | 0.6491 | 0.2550 | 0.001 |
0.1534 | 21.0 | 5733 | 0.1383 | 0.7917 | 0.6508 | 0.2581 | 0.001 |
0.1533 | 22.0 | 6006 | 0.1389 | 0.7935 | 0.6489 | 0.2685 | 0.001 |
0.1533 | 23.0 | 6279 | 0.1385 | 0.7843 | 0.6550 | 0.2561 | 0.001 |
0.1531 | 24.0 | 6552 | 0.1367 | 0.7921 | 0.6507 | 0.2696 | 0.001 |
0.1531 | 25.0 | 6825 | 0.1379 | 0.7894 | 0.6408 | 0.2751 | 0.001 |
0.1533 | 26.0 | 7098 | 0.1375 | 0.7943 | 0.6469 | 0.2710 | 0.001 |
0.1533 | 27.0 | 7371 | 0.1392 | 0.7921 | 0.6516 | 0.2644 | 0.001 |
0.1529 | 28.0 | 7644 | 0.1385 | 0.7918 | 0.6425 | 0.2637 | 0.001 |
0.1529 | 29.0 | 7917 | 0.1402 | 0.7883 | 0.6497 | 0.2626 | 0.001 |
0.153 | 30.0 | 8190 | 0.1377 | 0.7887 | 0.6553 | 0.2668 | 0.001 |
0.153 | 31.0 | 8463 | 0.1313 | 0.8018 | 0.6718 | 0.2789 | 0.0001 |
0.1486 | 32.0 | 8736 | 0.1318 | 0.8061 | 0.6772 | 0.2807 | 0.0001 |
0.1415 | 33.0 | 9009 | 0.1309 | 0.8050 | 0.6792 | 0.2775 | 0.0001 |
0.1415 | 34.0 | 9282 | 0.1296 | 0.8049 | 0.6775 | 0.2821 | 0.0001 |
0.1395 | 35.0 | 9555 | 0.1282 | 0.8085 | 0.6865 | 0.2893 | 0.0001 |
0.1395 | 36.0 | 9828 | 0.1289 | 0.8055 | 0.6828 | 0.2831 | 0.0001 |
0.1387 | 37.0 | 10101 | 0.1277 | 0.8055 | 0.6775 | 0.2831 | 0.0001 |
0.1387 | 38.0 | 10374 | 0.1275 | 0.8084 | 0.6883 | 0.2890 | 0.0001 |
0.1354 | 39.0 | 10647 | 0.1266 | 0.8099 | 0.6854 | 0.2879 | 0.0001 |
0.1354 | 40.0 | 10920 | 0.1282 | 0.8117 | 0.6981 | 0.2886 | 0.0001 |
0.1355 | 41.0 | 11193 | 0.1267 | 0.8082 | 0.6851 | 0.2883 | 0.0001 |
0.1355 | 42.0 | 11466 | 0.1262 | 0.8112 | 0.6942 | 0.2907 | 0.0001 |
0.1347 | 43.0 | 11739 | 0.1259 | 0.8107 | 0.6908 | 0.2911 | 0.0001 |
0.1337 | 44.0 | 12012 | 0.1264 | 0.8115 | 0.6925 | 0.2931 | 0.0001 |
0.1337 | 45.0 | 12285 | 0.1258 | 0.8110 | 0.6975 | 0.2966 | 0.0001 |
0.1329 | 46.0 | 12558 | 0.1254 | 0.8109 | 0.6941 | 0.2987 | 0.0001 |
0.1329 | 47.0 | 12831 | 0.1257 | 0.8098 | 0.6937 | 0.2921 | 0.0001 |
0.1331 | 48.0 | 13104 | 0.1254 | 0.8107 | 0.6905 | 0.2914 | 0.0001 |
0.1331 | 49.0 | 13377 | 0.1252 | 0.8137 | 0.6974 | 0.2945 | 0.0001 |
0.1309 | 50.0 | 13650 | 0.1248 | 0.8150 | 0.7026 | 0.2983 | 0.0001 |
0.1309 | 51.0 | 13923 | 0.1246 | 0.8158 | 0.7067 | 0.2959 | 0.0001 |
0.1304 | 52.0 | 14196 | 0.1246 | 0.8121 | 0.7009 | 0.2952 | 0.0001 |
0.1304 | 53.0 | 14469 | 0.1242 | 0.8143 | 0.6974 | 0.2990 | 0.0001 |
0.1309 | 54.0 | 14742 | 0.1241 | 0.8135 | 0.7001 | 0.2966 | 0.0001 |
0.1289 | 55.0 | 15015 | 0.1242 | 0.8131 | 0.6997 | 0.2952 | 0.0001 |
0.1289 | 56.0 | 15288 | 0.1235 | 0.8179 | 0.7064 | 0.3021 | 0.0001 |
0.1286 | 57.0 | 15561 | 0.1235 | 0.8150 | 0.6963 | 0.2994 | 0.0001 |
0.1286 | 58.0 | 15834 | 0.1231 | 0.8145 | 0.7012 | 0.2983 | 0.0001 |
0.1282 | 59.0 | 16107 | 0.1234 | 0.8153 | 0.7022 | 0.3001 | 0.0001 |
0.1282 | 60.0 | 16380 | 0.1239 | 0.8122 | 0.6978 | 0.2973 | 0.0001 |
0.1282 | 61.0 | 16653 | 0.1236 | 0.8158 | 0.7114 | 0.3015 | 0.0001 |
0.1282 | 62.0 | 16926 | 0.1227 | 0.8168 | 0.7120 | 0.3032 | 0.0001 |
0.1265 | 63.0 | 17199 | 0.1231 | 0.8137 | 0.7077 | 0.2949 | 0.0001 |
0.1265 | 64.0 | 17472 | 0.1228 | 0.8172 | 0.7084 | 0.3056 | 0.0001 |
0.1273 | 65.0 | 17745 | 0.1232 | 0.8183 | 0.7103 | 0.3077 | 0.0001 |
0.1258 | 66.0 | 18018 | 0.1226 | 0.8179 | 0.7065 | 0.3035 | 0.0001 |
0.1258 | 67.0 | 18291 | 0.1228 | 0.8185 | 0.7105 | 0.3053 | 0.0001 |
0.125 | 68.0 | 18564 | 0.1228 | 0.8181 | 0.7128 | 0.3042 | 0.0001 |
0.125 | 69.0 | 18837 | 0.1228 | 0.8137 | 0.7038 | 0.3053 | 0.0001 |
0.125 | 70.0 | 19110 | 0.1232 | 0.8155 | 0.7080 | 0.3018 | 0.0001 |
0.125 | 71.0 | 19383 | 0.1231 | 0.8156 | 0.7111 | 0.2990 | 0.0001 |
0.1245 | 72.0 | 19656 | 0.1223 | 0.8162 | 0.7150 | 0.3008 | 0.0001 |
0.1245 | 73.0 | 19929 | 0.1223 | 0.8174 | 0.7042 | 0.3049 | 0.0001 |
0.1248 | 74.0 | 20202 | 0.1237 | 0.8125 | 0.7009 | 0.2963 | 0.0001 |
0.1248 | 75.0 | 20475 | 0.1225 | 0.8152 | 0.7045 | 0.3046 | 0.0001 |
0.1249 | 76.0 | 20748 | 0.1247 | 0.8160 | 0.7099 | 0.3008 | 0.0001 |
0.1238 | 77.0 | 21021 | 0.1225 | 0.8179 | 0.7139 | 0.2990 | 0.0001 |
0.1238 | 78.0 | 21294 | 0.1222 | 0.8188 | 0.7061 | 0.3046 | 0.0001 |
0.1233 | 79.0 | 21567 | 0.1246 | 0.8152 | 0.7101 | 0.3018 | 0.0001 |
0.1233 | 80.0 | 21840 | 0.1221 | 0.8180 | 0.7103 | 0.3039 | 0.0001 |
0.1225 | 81.0 | 22113 | 0.1212 | 0.8185 | 0.7157 | 0.3018 | 0.0001 |
0.1225 | 82.0 | 22386 | 0.1216 | 0.8152 | 0.7089 | 0.3080 | 0.0001 |
0.1216 | 83.0 | 22659 | 0.1214 | 0.8165 | 0.7090 | 0.3080 | 0.0001 |
0.1216 | 84.0 | 22932 | 0.1216 | 0.8169 | 0.7100 | 0.3070 | 0.0001 |
0.1232 | 85.0 | 23205 | 0.1216 | 0.8188 | 0.7109 | 0.3053 | 0.0001 |
0.1232 | 86.0 | 23478 | 0.1219 | 0.8191 | 0.7176 | 0.3070 | 0.0001 |
0.1221 | 87.0 | 23751 | 0.1220 | 0.8177 | 0.7080 | 0.3063 | 0.0001 |
0.1208 | 88.0 | 24024 | 0.1210 | 0.8211 | 0.7158 | 0.3049 | 1e-05 |
0.1208 | 89.0 | 24297 | 0.1210 | 0.8241 | 0.7312 | 0.3073 | 1e-05 |
0.1189 | 90.0 | 24570 | 0.1206 | 0.8235 | 0.7232 | 0.3070 | 1e-05 |
0.1189 | 91.0 | 24843 | 0.1204 | 0.8191 | 0.7148 | 0.3087 | 1e-05 |
0.1181 | 92.0 | 25116 | 0.1203 | 0.8194 | 0.7132 | 0.3087 | 1e-05 |
0.1181 | 93.0 | 25389 | 0.1204 | 0.8215 | 0.7184 | 0.3084 | 1e-05 |
0.1183 | 94.0 | 25662 | 0.1201 | 0.8207 | 0.7195 | 0.3067 | 1e-05 |
0.1183 | 95.0 | 25935 | 0.1201 | 0.8197 | 0.7158 | 0.3084 | 1e-05 |
0.117 | 96.0 | 26208 | 0.1198 | 0.8217 | 0.7193 | 0.3053 | 1e-05 |
0.117 | 97.0 | 26481 | 0.1201 | 0.8205 | 0.7211 | 0.3063 | 1e-05 |
0.1176 | 98.0 | 26754 | 0.1201 | 0.8227 | 0.7250 | 0.3080 | 1e-05 |
0.1176 | 99.0 | 27027 | 0.1200 | 0.8206 | 0.7226 | 0.3073 | 1e-05 |
0.1176 | 100.0 | 27300 | 0.1200 | 0.8216 | 0.7191 | 0.3080 | 1e-05 |
0.117 | 101.0 | 27573 | 0.1199 | 0.8228 | 0.7242 | 0.3098 | 1e-05 |
0.117 | 102.0 | 27846 | 0.1204 | 0.8192 | 0.7216 | 0.3073 | 1e-05 |
0.1159 | 103.0 | 28119 | 0.1200 | 0.8222 | 0.7232 | 0.3067 | 0.0000 |
0.1159 | 104.0 | 28392 | 0.1204 | 0.8218 | 0.7235 | 0.3101 | 0.0000 |
0.1151 | 105.0 | 28665 | 0.1199 | 0.8219 | 0.7206 | 0.3077 | 0.0000 |
0.1151 | 106.0 | 28938 | 0.1198 | 0.8228 | 0.7270 | 0.3105 | 0.0000 |
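
For reference, the sketch below shows how metrics like these could be computed with scikit-learn for a multi-label problem; the multi-label framing and the 0.5 threshold are assumptions rather than documented choices. On multi-label indicators, `accuracy_score` measures exact-match (subset) accuracy, which would explain why Accuracy sits far below F1 Micro.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Toy shapes: 4 samples, 3 labels. In practice y_true/logits come from the eval set.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0], [0, 0, 1]])
logits = np.array([[2.0, -1.0, 0.5], [-0.5, 1.5, -2.0],
                   [1.0, 0.2, -1.0], [-1.0, -0.3, 2.5]])

# Assumed decision rule: sigmoid per label, 0.5 threshold.
y_pred = (1 / (1 + np.exp(-logits)) > 0.5).astype(int)

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("Accuracy:", accuracy_score(y_true, y_pred))  # exact-match (subset) accuracy
```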
### Framework versions
- Transformers 4.44.2
- PyTorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1