# train_qqp_1744902594
This model is a fine-tuned version of google/gemma-3-1b-it on the qqp dataset. It achieves the following results on the evaluation set:
- Loss: 0.0566
- Num Input Tokens Seen: 51858816
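As a rough sanity check (not part of the card itself), the reported token count, step count, and effective batch size of 16 imply the average sequence length per training example:

```python
# Sanity-check arithmetic on the statistics reported in this card.
# The effective batch size is train_batch_size (4) * gradient_accumulation_steps (4) = 16.
total_tokens = 51_858_816   # Num Input Tokens Seen
total_steps = 40_000        # training_steps
effective_batch = 16        # total_train_batch_size

tokens_per_step = total_tokens / total_steps
tokens_per_example = tokens_per_step / effective_batch

print(f"tokens per optimizer step: {tokens_per_step:.1f}")    # ~1296.5
print(f"avg tokens per example:    {tokens_per_example:.1f}")  # ~81.0
```

An average of roughly 81 tokens per example is consistent with QQP, whose inputs are short question pairs.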
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
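For reference, the effective batch size above is the per-device batch size times the accumulation steps, and a plain cosine schedule over the 40,000 steps would decay the learning rate as sketched below. This is the standard cosine-annealing formula with no warmup, assumed for illustration; the card does not state the exact scheduler configuration.

```python
import math

# Effective batch size: per-device batch * gradient accumulation steps.
train_batch_size = 4
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 16

# Standard cosine annealing from lr_max down to 0 over total_steps.
# (Assumes no warmup; the actual scheduler setup is not given in the card.)
def cosine_lr(step, lr_max=5e-5, total_steps=40_000):
    return 0.5 * lr_max * (1 + math.cos(math.pi * step / total_steps))

print(total_train_batch_size)  # 16
print(cosine_lr(0))            # 5e-05 at the start
print(cosine_lr(40_000))       # ~0 at the end
```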
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
0.0965 | 0.0098 | 200 | 0.1272 | 260832 |
0.0729 | 0.0195 | 400 | 0.0973 | 518880 |
0.0898 | 0.0293 | 600 | 0.0868 | 780768 |
0.0984 | 0.0391 | 800 | 0.0845 | 1038304 |
0.0969 | 0.0489 | 1000 | 0.0859 | 1296288 |
0.0773 | 0.0586 | 1200 | 0.0883 | 1554400 |
0.0801 | 0.0684 | 1400 | 0.0830 | 1813856 |
0.0597 | 0.0782 | 1600 | 0.0799 | 2074816 |
0.0568 | 0.0879 | 1800 | 0.0772 | 2332544 |
0.0707 | 0.0977 | 2000 | 0.0857 | 2594720 |
0.0781 | 0.1075 | 2200 | 0.0854 | 2853856 |
0.0948 | 0.1173 | 2400 | 0.0763 | 3112928 |
0.0603 | 0.1270 | 2600 | 0.0749 | 3374048 |
0.0622 | 0.1368 | 2800 | 0.0734 | 3637152 |
0.069 | 0.1466 | 3000 | 0.0746 | 3896512 |
0.1062 | 0.1564 | 3200 | 0.0733 | 4157216 |
0.0852 | 0.1661 | 3400 | 0.0739 | 4418592 |
0.0503 | 0.1759 | 3600 | 0.0707 | 4677248 |
0.0948 | 0.1857 | 3800 | 0.0706 | 4934080 |
0.1081 | 0.1954 | 4000 | 0.0726 | 5191936 |
0.0799 | 0.2052 | 4200 | 0.0712 | 5451200 |
0.0771 | 0.2150 | 4400 | 0.0706 | 5711648 |
0.0651 | 0.2248 | 4600 | 0.0696 | 5970048 |
0.0774 | 0.2345 | 4800 | 0.0688 | 6226272 |
0.0382 | 0.2443 | 5000 | 0.0742 | 6486336 |
0.068 | 0.2541 | 5200 | 0.0693 | 6744864 |
0.0651 | 0.2638 | 5400 | 0.0681 | 7006944 |
0.0709 | 0.2736 | 5600 | 0.0723 | 7267584 |
0.0771 | 0.2834 | 5800 | 0.0685 | 7529280 |
0.0529 | 0.2932 | 6000 | 0.0668 | 7788992 |
0.0481 | 0.3029 | 6200 | 0.0664 | 8052736 |
0.0779 | 0.3127 | 6400 | 0.0670 | 8311808 |
0.0731 | 0.3225 | 6600 | 0.0680 | 8568416 |
0.0489 | 0.3323 | 6800 | 0.0667 | 8830400 |
0.0708 | 0.3420 | 7000 | 0.0674 | 9091040 |
0.0648 | 0.3518 | 7200 | 0.0652 | 9350272 |
0.073 | 0.3616 | 7400 | 0.0663 | 9609312 |
0.0746 | 0.3713 | 7600 | 0.0767 | 9867648 |
0.0582 | 0.3811 | 7800 | 0.0662 | 10127328 |
0.0498 | 0.3909 | 8000 | 0.0640 | 10383808 |
0.0471 | 0.4007 | 8200 | 0.0648 | 10643424 |
0.0792 | 0.4104 | 8400 | 0.0640 | 10901760 |
0.0808 | 0.4202 | 8600 | 0.0673 | 11159584 |
0.0651 | 0.4300 | 8800 | 0.0657 | 11420640 |
0.0603 | 0.4397 | 9000 | 0.0642 | 11683072 |
0.0593 | 0.4495 | 9200 | 0.0674 | 11941600 |
0.0452 | 0.4593 | 9400 | 0.0633 | 12198528 |
0.0637 | 0.4691 | 9600 | 0.0640 | 12455968 |
0.0691 | 0.4788 | 9800 | 0.0632 | 12716992 |
0.0527 | 0.4886 | 10000 | 0.0628 | 12974048 |
0.0584 | 0.4984 | 10200 | 0.0635 | 13231360 |
0.0447 | 0.5081 | 10400 | 0.0662 | 13489760 |
0.0657 | 0.5179 | 10600 | 0.0628 | 13750592 |
0.0526 | 0.5277 | 10800 | 0.0627 | 14009088 |
0.0755 | 0.5375 | 11000 | 0.0621 | 14268352 |
0.0665 | 0.5472 | 11200 | 0.0643 | 14527072 |
0.0744 | 0.5570 | 11400 | 0.0626 | 14787040 |
0.0451 | 0.5668 | 11600 | 0.0630 | 15045600 |
0.0283 | 0.5766 | 11800 | 0.0626 | 15306176 |
0.0531 | 0.5863 | 12000 | 0.0636 | 15565184 |
0.0703 | 0.5961 | 12200 | 0.0617 | 15824576 |
0.0821 | 0.6059 | 12400 | 0.0616 | 16083104 |
0.0478 | 0.6156 | 12600 | 0.0632 | 16342784 |
0.045 | 0.6254 | 12800 | 0.0620 | 16601824 |
0.0572 | 0.6352 | 13000 | 0.0610 | 16860320 |
0.0562 | 0.6450 | 13200 | 0.0622 | 17118528 |
0.0731 | 0.6547 | 13400 | 0.0632 | 17378528 |
0.0561 | 0.6645 | 13600 | 0.0645 | 17638400 |
0.0517 | 0.6743 | 13800 | 0.0634 | 17898336 |
0.0559 | 0.6840 | 14000 | 0.0661 | 18158528 |
0.0896 | 0.6938 | 14200 | 0.0607 | 18418528 |
0.0656 | 0.7036 | 14400 | 0.0605 | 18679264 |
0.0477 | 0.7134 | 14600 | 0.0606 | 18940320 |
0.0654 | 0.7231 | 14800 | 0.0641 | 19196416 |
0.0699 | 0.7329 | 15000 | 0.0605 | 19454912 |
0.0715 | 0.7427 | 15200 | 0.0601 | 19715616 |
0.0582 | 0.7524 | 15400 | 0.0601 | 19976768 |
0.0862 | 0.7622 | 15600 | 0.0602 | 20234592 |
0.0349 | 0.7720 | 15800 | 0.0614 | 20493056 |
0.0554 | 0.7818 | 16000 | 0.0600 | 20750368 |
0.059 | 0.7915 | 16200 | 0.0597 | 21010432 |
0.0727 | 0.8013 | 16400 | 0.0643 | 21270112 |
0.0965 | 0.8111 | 16600 | 0.0602 | 21531456 |
0.0453 | 0.8209 | 16800 | 0.0601 | 21788384 |
0.0398 | 0.8306 | 17000 | 0.0621 | 22045600 |
0.058 | 0.8404 | 17200 | 0.0594 | 22303808 |
0.0451 | 0.8502 | 17400 | 0.0599 | 22562496 |
0.0532 | 0.8599 | 17600 | 0.0593 | 22821376 |
0.0695 | 0.8697 | 17800 | 0.0664 | 23080448 |
0.0549 | 0.8795 | 18000 | 0.0608 | 23338016 |
0.0887 | 0.8893 | 18200 | 0.0595 | 23598208 |
0.0371 | 0.8990 | 18400 | 0.0593 | 23857824 |
0.0323 | 0.9088 | 18600 | 0.0604 | 24117056 |
0.0759 | 0.9186 | 18800 | 0.0591 | 24375456 |
0.0614 | 0.9283 | 19000 | 0.0595 | 24635712 |
0.0522 | 0.9381 | 19200 | 0.0588 | 24895360 |
0.0409 | 0.9479 | 19400 | 0.0596 | 25156480 |
0.0877 | 0.9577 | 19600 | 0.0585 | 25415936 |
0.0539 | 0.9674 | 19800 | 0.0590 | 25677472 |
0.026 | 0.9772 | 20000 | 0.0585 | 25934656 |
0.0741 | 0.9870 | 20200 | 0.0582 | 26193248 |
0.0491 | 0.9968 | 20400 | 0.0583 | 26449184 |
0.0499 | 1.0065 | 20600 | 0.0615 | 26710048 |
0.0828 | 1.0163 | 20800 | 0.0601 | 26968800 |
0.0404 | 1.0261 | 21000 | 0.0598 | 27230240 |
0.0358 | 1.0359 | 21200 | 0.0596 | 27489152 |
0.0649 | 1.0456 | 21400 | 0.0589 | 27746528 |
0.0475 | 1.0554 | 21600 | 0.0599 | 28009568 |
0.0478 | 1.0652 | 21800 | 0.0601 | 28270592 |
0.0307 | 1.0750 | 22000 | 0.0588 | 28533952 |
0.0506 | 1.0847 | 22200 | 0.0588 | 28788352 |
0.0457 | 1.0945 | 22400 | 0.0598 | 29047328 |
0.0696 | 1.1043 | 22600 | 0.0587 | 29306368 |
0.0922 | 1.1140 | 22800 | 0.0583 | 29567616 |
0.0573 | 1.1238 | 23000 | 0.0578 | 29829920 |
0.0297 | 1.1336 | 23200 | 0.0584 | 30092128 |
0.0478 | 1.1434 | 23400 | 0.0588 | 30349984 |
0.0543 | 1.1531 | 23600 | 0.0637 | 30605344 |
0.0492 | 1.1629 | 23800 | 0.0586 | 30867648 |
0.0718 | 1.1727 | 24000 | 0.0584 | 31127744 |
0.0476 | 1.1824 | 24200 | 0.0577 | 31383392 |
0.0605 | 1.1922 | 24400 | 0.0607 | 31641056 |
0.0742 | 1.2020 | 24600 | 0.0580 | 31900960 |
0.0468 | 1.2118 | 24800 | 0.0582 | 32158304 |
0.0536 | 1.2215 | 25000 | 0.0586 | 32419552 |
0.0399 | 1.2313 | 25200 | 0.0590 | 32677888 |
0.0411 | 1.2411 | 25400 | 0.0574 | 32936608 |
0.0438 | 1.2508 | 25600 | 0.0583 | 33195264 |
0.0513 | 1.2606 | 25800 | 0.0614 | 33454720 |
0.0791 | 1.2704 | 26000 | 0.0584 | 33714496 |
0.0506 | 1.2802 | 26200 | 0.0580 | 33972576 |
0.0551 | 1.2899 | 26400 | 0.0577 | 34231488 |
0.0753 | 1.2997 | 26600 | 0.0584 | 34491904 |
0.0336 | 1.3095 | 26800 | 0.0575 | 34751008 |
0.0489 | 1.3193 | 27000 | 0.0592 | 35006432 |
0.0348 | 1.3290 | 27200 | 0.0585 | 35264896 |
0.0368 | 1.3388 | 27400 | 0.0591 | 35523424 |
0.0421 | 1.3486 | 27600 | 0.0580 | 35781024 |
0.0474 | 1.3583 | 27800 | 0.0575 | 36040224 |
0.0516 | 1.3681 | 28000 | 0.0576 | 36297952 |
0.086 | 1.3779 | 28200 | 0.0572 | 36557056 |
0.0423 | 1.3877 | 28400 | 0.0575 | 36815904 |
0.0424 | 1.3974 | 28600 | 0.0570 | 37076064 |
0.0499 | 1.4072 | 28800 | 0.0568 | 37333536 |
0.0396 | 1.4170 | 29000 | 0.0575 | 37593216 |
0.0319 | 1.4267 | 29200 | 0.0580 | 37850816 |
0.037 | 1.4365 | 29400 | 0.0575 | 38111232 |
0.0424 | 1.4463 | 29600 | 0.0576 | 38370144 |
0.0473 | 1.4561 | 29800 | 0.0574 | 38629280 |
0.0402 | 1.4658 | 30000 | 0.0575 | 38887744 |
0.048 | 1.4756 | 30200 | 0.0588 | 39146016 |
0.0454 | 1.4854 | 30400 | 0.0573 | 39406240 |
0.0465 | 1.4952 | 30600 | 0.0572 | 39664736 |
0.0349 | 1.5049 | 30800 | 0.0575 | 39922240 |
0.0764 | 1.5147 | 31000 | 0.0583 | 40181504 |
0.0451 | 1.5245 | 31200 | 0.0573 | 40439712 |
0.0337 | 1.5342 | 31400 | 0.0577 | 40700736 |
0.0525 | 1.5440 | 31600 | 0.0571 | 40963072 |
0.0386 | 1.5538 | 31800 | 0.0569 | 41224800 |
0.0368 | 1.5636 | 32000 | 0.0573 | 41485536 |
0.0581 | 1.5733 | 32200 | 0.0568 | 41743456 |
0.0483 | 1.5831 | 32400 | 0.0570 | 42005696 |
0.0683 | 1.5929 | 32600 | 0.0574 | 42267520 |
0.034 | 1.6026 | 32800 | 0.0574 | 42528896 |
0.0661 | 1.6124 | 33000 | 0.0571 | 42786240 |
0.0345 | 1.6222 | 33200 | 0.0578 | 43043616 |
0.0483 | 1.6320 | 33400 | 0.0575 | 43300896 |
0.0343 | 1.6417 | 33600 | 0.0572 | 43559424 |
0.0514 | 1.6515 | 33800 | 0.0573 | 43815424 |
0.0347 | 1.6613 | 34000 | 0.0577 | 44074432 |
0.0319 | 1.6710 | 34200 | 0.0571 | 44334304 |
0.0452 | 1.6808 | 34400 | 0.0570 | 44594368 |
0.0432 | 1.6906 | 34600 | 0.0569 | 44852576 |
0.0422 | 1.7004 | 34800 | 0.0569 | 45109056 |
0.0287 | 1.7101 | 35000 | 0.0568 | 45367936 |
0.0504 | 1.7199 | 35200 | 0.0570 | 45627104 |
0.0701 | 1.7297 | 35400 | 0.0569 | 45885312 |
0.0321 | 1.7395 | 35600 | 0.0567 | 46145568 |
0.07 | 1.7492 | 35800 | 0.0570 | 46409504 |
0.0473 | 1.7590 | 36000 | 0.0569 | 46669472 |
0.0611 | 1.7688 | 36200 | 0.0569 | 46929280 |
0.0412 | 1.7785 | 36400 | 0.0569 | 47188416 |
0.0611 | 1.7883 | 36600 | 0.0569 | 47447328 |
0.051 | 1.7981 | 36800 | 0.0570 | 47707040 |
0.0619 | 1.8079 | 37000 | 0.0571 | 47966176 |
0.0467 | 1.8176 | 37200 | 0.0569 | 48227328 |
0.0412 | 1.8274 | 37400 | 0.0568 | 48485632 |
0.0349 | 1.8372 | 37600 | 0.0569 | 48744768 |
0.0525 | 1.8469 | 37800 | 0.0569 | 49002400 |
0.0625 | 1.8567 | 38000 | 0.0568 | 49259584 |
0.0544 | 1.8665 | 38200 | 0.0567 | 49518144 |
0.0493 | 1.8763 | 38400 | 0.0567 | 49775776 |
0.0727 | 1.8860 | 38600 | 0.0566 | 50036384 |
0.0317 | 1.8958 | 38800 | 0.0567 | 50298720 |
0.0349 | 1.9056 | 39000 | 0.0567 | 50560704 |
0.0275 | 1.9153 | 39200 | 0.0567 | 50820416 |
0.0323 | 1.9251 | 39400 | 0.0567 | 51080832 |
0.0473 | 1.9349 | 39600 | 0.0567 | 51339424 |
0.0399 | 1.9447 | 39800 | 0.0568 | 51597120 |
0.0382 | 1.9544 | 40000 | 0.0567 | 51858816 |
### Framework versions
- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
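To reproduce this environment, the version pins above can be captured in a requirements file; the CUDA wheel source noted in the comment is an assumption, not stated by the card:

```text
# requirements.txt matching the framework versions listed above.
# torch 2.6.0+cu124 may require the CUDA 12.4 wheel index
# (e.g. --extra-index-url https://download.pytorch.org/whl/cu124).
peft==0.15.1
transformers==4.51.3
torch==2.6.0
datasets==3.5.0
tokenizers==0.21.1
```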