Model save
README.md
ADDED
@@ -0,0 +1,180 @@
| 1 |
+
---
|
| 2 |
+
library_name: transformers
|
| 3 |
+
tags:
|
| 4 |
+
- generated_from_trainer
|
| 5 |
+
datasets:
|
| 6 |
+
- generator
|
| 7 |
+
metrics:
|
| 8 |
+
- accuracy
|
| 9 |
+
model-index:
|
| 10 |
+
- name: T5Laa2-Large-WeightedLoss
|
| 11 |
+
results:
|
| 12 |
+
- task:
|
| 13 |
+
name: Sequence-to-sequence Language Modeling
|
| 14 |
+
type: text2text-generation
|
| 15 |
+
dataset:
|
| 16 |
+
name: generator
|
| 17 |
+
type: generator
|
| 18 |
+
config: default
|
| 19 |
+
split: train
|
| 20 |
+
args: default
|
| 21 |
+
metrics:
|
| 22 |
+
- name: Accuracy
|
| 23 |
+
type: accuracy
|
| 24 |
+
value: 0.037367906066536206
|
| 25 |
+
---
|
| 26 |
+
|
| 27 |
+
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
|
| 28 |
+
should probably proofread and complete it, then remove this comment. -->
|
| 29 |
+
|
| 30 |
+
# T5Laa2-Large-WeightedLoss
|
| 31 |
+
|
| 32 |
+
This model is a fine-tuned version of [](https://huggingface.co/) on the generator dataset.
|
| 33 |
+
It achieves the following results on the evaluation set:
|
| 34 |
+
- Perplexity: 184.6505
|
| 35 |
+
- Loss: 5.2185
|
| 36 |
+
- Accuracy: 0.0374
|
| 37 |
+
- Lookahead Perplexity: 2090.0473
|
| 38 |
+
- Lookahead Loss: 7.6449
|
| 39 |
+
|
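Since perplexity is just the exponential of the cross-entropy loss, the loss/perplexity pairs above can be cross-checked directly; a minimal sketch using the reported evaluation numbers:

```python
import math

# Values copied from the evaluation summary above.
eval_loss = 5.2185
lookahead_loss = 7.6449

# Perplexity is exp(cross-entropy loss).
print(math.exp(eval_loss))       # close to the reported 184.6505
print(math.exp(lookahead_loss))  # close to the reported 2090.0473
```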

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 524288
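Assuming the `linear` scheduler decays from the peak learning rate to zero over the full 524,288 steps (warmup settings are not listed above, so `warmup_steps` below is a hypothetical parameter defaulting to zero), the per-step rate can be sketched as:

```python
def linear_lr(step: int, peak_lr: float = 5e-05, total_steps: int = 524288,
              warmup_steps: int = 0) -> float:
    """Linear warmup followed by linear decay to zero; a sketch of the
    `linear` lr_scheduler_type listed above, not the exact trainer code."""
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / max(1, total_steps - warmup_steps)

# Halfway through the 524,288-step run, the rate has decayed to half the peak.
print(linear_lr(262144))  # 2.5e-05
```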

### Training results

| Training Loss | Epoch | Step | Accuracy | Lookahead Loss | Lookahead Perplexity | Validation Loss | Perplexity |
|:-------------:|:------:|:------:|:--------:|:--------------:|:--------------------------:|:---------------:|:----------:|
| 6.7799 | 0.0095 | 5000 | 0.0277 | 48.3492 | 994884838669032751104.0000 | 6.6288 | 756.5394 |
| 6.3701 | 0.0191 | 10000 | 0.0298 | 29.9422 | 10086010098930.062 | 6.2896 | 538.9212 |
| 6.1926 | 0.0286 | 15000 | 0.0310 | 17.4292 | 37103420.1102 | 6.0969 | 444.4629 |
| 6.058 | 0.0381 | 20000 | 0.0312 | 11.2096 | 73837.0007 | 5.9705 | 391.7203 |
| 5.9483 | 0.0477 | 25000 | 0.0318 | 8.7609 | 6379.6106 | 5.8987 | 364.5553 |
| 5.8936 | 0.0572 | 30000 | 0.0317 | 8.6431 | 5671.1388 | 5.9102 | 368.7617 |
| 5.9237 | 0.0668 | 35000 | 0.0342 | 8.5313 | 5070.9207 | 5.8056 | 332.1394 |
| 5.8761 | 0.0763 | 40000 | 0.0346 | 8.3857 | 4383.7940 | 5.7683 | 319.9989 |
| 5.8407 | 0.0858 | 45000 | 0.0353 | 8.6143 | 5509.9856 | 5.7586 | 316.8908 |
| 5.9167 | 0.0954 | 50000 | 0.0354 | 8.6580 | 5755.8949 | 5.7596 | 317.2192 |
| 6.0003 | 0.1049 | 55000 | 0.0359 | 8.7358 | 6221.9875 | 5.7997 | 330.1962 |
| 5.9179 | 0.1144 | 60000 | 0.0379 | 8.6974 | 5987.3140 | 5.7904 | 327.1379 |
| 5.9174 | 0.1240 | 65000 | 0.0384 | 8.5412 | 5121.6778 | 5.8089 | 333.2610 |
| 5.9954 | 0.1335 | 70000 | 0.0393 | 8.5149 | 4988.5368 | 5.8409 | 344.0949 |
| 6.0362 | 0.1431 | 75000 | 0.0400 | 8.3793 | 4356.0051 | 5.8705 | 354.4363 |
| 5.8366 | 0.1526 | 80000 | 0.0393 | 8.2272 | 3741.2679 | 5.8130 | 334.6056 |
| 6.0205 | 0.1621 | 85000 | 0.0392 | 8.5779 | 5313.1989 | 5.7982 | 329.6925 |
| 5.973 | 0.1717 | 90000 | 0.0398 | 8.8638 | 7071.0320 | 5.8446 | 345.3771 |
| 5.9258 | 0.1812 | 95000 | 0.0400 | 8.1622 | 3505.8012 | 5.7733 | 321.5895 |
| 5.8746 | 0.1907 | 100000 | 0.0403 | 8.2517 | 3834.1854 | 5.7572 | 316.4552 |
| 5.9069 | 0.2003 | 105000 | 0.0402 | 8.2418 | 3796.3680 | 5.7586 | 316.9171 |
| 5.9402 | 0.2098 | 110000 | 0.0400 | 8.5282 | 5055.2648 | 5.7703 | 320.6236 |
| 5.8692 | 0.2193 | 115000 | 0.0405 | 8.3863 | 4386.4832 | 5.7466 | 313.1122 |
| 5.9973 | 0.2289 | 120000 | 0.0394 | 8.6531 | 5727.7346 | 5.7815 | 324.2362 |
| 5.8888 | 0.2384 | 125000 | 0.0402 | 8.1073 | 3318.6616 | 5.7300 | 307.9743 |
| 5.9601 | 0.2480 | 130000 | 0.0409 | 8.6942 | 5968.4347 | 5.7525 | 314.9845 |
| 5.8925 | 0.2575 | 135000 | 0.0405 | 8.1664 | 3520.7167 | 5.7142 | 303.1319 |
| 5.8557 | 0.2670 | 140000 | 0.0401 | 7.9957 | 2968.1186 | 5.6992 | 298.6369 |
| 5.8511 | 0.2766 | 145000 | 0.0402 | 7.9752 | 2907.9587 | 5.7048 | 300.3090 |
| 5.8921 | 0.2861 | 150000 | 0.0407 | 7.9627 | 2871.8322 | 5.6648 | 288.5226 |
| 5.8002 | 0.2956 | 155000 | 0.0400 | 7.8912 | 2673.5755 | 5.6494 | 284.1184 |
| 5.8017 | 0.3052 | 160000 | 0.0400 | 8.0297 | 3070.7274 | 5.6654 | 288.7003 |
| 5.8462 | 0.3147 | 165000 | 0.0405 | 7.9691 | 2890.2133 | 5.6639 | 288.2660 |
| 5.8635 | 0.3242 | 170000 | 0.0403 | 8.2405 | 3791.4329 | 5.6655 | 288.7322 |
| 5.7894 | 0.3338 | 175000 | 0.0399 | 8.0391 | 3099.9643 | 5.6634 | 288.1246 |
| 5.9122 | 0.3433 | 180000 | 0.0411 | 8.0436 | 3113.8276 | 5.6549 | 285.6828 |
| 5.8401 | 0.3529 | 185000 | 0.0409 | 8.2639 | 3881.1753 | 5.6554 | 285.8374 |
| 5.8252 | 0.3624 | 190000 | 0.0408 | 7.9751 | 2907.7520 | 5.6592 | 286.9267 |
| 5.8975 | 0.3719 | 195000 | 0.0405 | 7.9789 | 2918.8320 | 5.6414 | 281.8590 |
| 5.8008 | 0.3815 | 200000 | 0.0393 | 7.8772 | 2636.5364 | 5.6323 | 279.2986 |
| 5.776 | 0.3910 | 205000 | 0.0401 | 7.9352 | 2793.9517 | 5.6288 | 278.3158 |
| 5.8825 | 0.4005 | 210000 | 0.0401 | 7.9805 | 2923.3879 | 5.6192 | 275.6601 |
| 5.7651 | 0.4101 | 215000 | 0.0400 | 7.9989 | 2977.7573 | 5.5993 | 270.2366 |
| 5.7721 | 0.4196 | 220000 | 0.0406 | 7.8928 | 2677.9319 | 5.5979 | 269.8660 |
| 5.8312 | 0.4292 | 225000 | 0.0396 | 8.0192 | 3038.8659 | 5.6054 | 271.8775 |
| 5.7752 | 0.4387 | 230000 | 0.0405 | 7.8009 | 2442.8390 | 5.5823 | 265.6886 |
| 5.8101 | 0.4482 | 235000 | 0.0397 | 7.8881 | 2665.3761 | 5.5903 | 267.8042 |
| 5.7115 | 0.4578 | 240000 | 0.0400 | 7.9381 | 2802.0555 | 5.5694 | 262.2645 |
| 5.7196 | 0.4673 | 245000 | 0.0394 | 7.8143 | 2475.7568 | 5.5596 | 259.7104 |
| 5.6944 | 0.4768 | 250000 | 0.0409 | 7.8772 | 2636.5595 | 5.5478 | 256.6677 |
| 5.6823 | 0.4864 | 255000 | 0.0395 | 7.7952 | 2428.9769 | 5.5298 | 252.0854 |
| 5.674 | 0.4959 | 260000 | 0.0399 | 7.8926 | 2677.4051 | 5.5318 | 252.5954 |
| 5.6606 | 0.5054 | 265000 | 0.0400 | 7.8189 | 2487.2421 | 5.5178 | 249.0964 |
| 5.7097 | 0.5150 | 270000 | 0.0395 | 7.8465 | 2556.6945 | 5.5101 | 247.1704 |
| 5.7047 | 0.5245 | 275000 | 0.0402 | 7.7667 | 2360.7532 | 5.5015 | 245.0604 |
| 5.6797 | 0.5341 | 280000 | 0.0397 | 7.8969 | 2688.8862 | 5.4982 | 244.2633 |
| 5.6739 | 0.5436 | 285000 | 0.0398 | 8.0241 | 3053.8151 | 5.4930 | 242.9751 |
| 5.6826 | 0.5531 | 290000 | 0.0397 | 7.9106 | 2726.0737 | 5.4990 | 244.4371 |
| 5.7864 | 0.5627 | 295000 | 0.0397 | 7.8498 | 2565.2361 | 5.4800 | 239.8371 |
| 5.6506 | 0.5722 | 300000 | 0.0401 | 7.9694 | 2891.0478 | 5.4805 | 239.9755 |
| 5.6403 | 0.5817 | 305000 | 0.0390 | 7.8301 | 2515.0960 | 5.4738 | 238.3728 |
| 5.6538 | 0.5913 | 310000 | 0.0398 | 7.8934 | 2679.4140 | 5.4811 | 240.0990 |
| 5.6665 | 0.6008 | 315000 | 0.0399 | 7.8407 | 2541.9513 | 5.4566 | 234.3107 |
| 5.5755 | 0.6104 | 320000 | 0.0395 | 7.8608 | 2593.5901 | 5.4477 | 232.2292 |
| 5.641 | 0.6199 | 325000 | 0.0394 | 7.9812 | 2925.4813 | 5.4433 | 231.2019 |
| 5.6113 | 0.6294 | 330000 | 0.0391 | 7.8127 | 2471.8038 | 5.4369 | 229.7290 |
| 5.6697 | 0.6390 | 335000 | 0.0394 | 7.9044 | 2709.0417 | 5.4350 | 229.2882 |
| 5.6425 | 0.6485 | 340000 | 0.0397 | 7.8442 | 2550.9241 | 5.4301 | 228.1699 |
| 5.626 | 0.6580 | 345000 | 0.0391 | 7.8639 | 2601.7519 | 5.4225 | 226.4364 |
| 5.5888 | 0.6676 | 350000 | 0.0394 | 7.9824 | 2929.0955 | 5.4191 | 225.6680 |
| 5.5793 | 0.6771 | 355000 | 0.0389 | 8.0429 | 3111.5742 | 5.4146 | 224.6552 |
| 5.5751 | 0.6866 | 360000 | 0.0385 | 7.8269 | 2507.0953 | 5.4039 | 222.2638 |
| 5.5659 | 0.6962 | 365000 | 0.0388 | 7.8007 | 2442.3736 | 5.3930 | 219.8554 |
| 5.6128 | 0.7057 | 370000 | 0.0385 | 7.7689 | 2365.9076 | 5.3840 | 217.8869 |
| 5.5471 | 0.7153 | 375000 | 0.0380 | 7.7346 | 2286.1222 | 5.3762 | 216.1903 |
| 5.5468 | 0.7248 | 380000 | 0.0387 | 7.7374 | 2292.4794 | 5.3672 | 214.2540 |
| 5.5354 | 0.7343 | 385000 | 0.0383 | 7.7167 | 2245.4357 | 5.3563 | 211.9470 |
| 5.5659 | 0.7439 | 390000 | 0.0384 | 7.7421 | 2303.3757 | 5.3496 | 210.5267 |
| 5.5114 | 0.7534 | 395000 | 0.0382 | 7.7600 | 2344.8778 | 5.3431 | 209.1623 |
| 5.5024 | 0.7629 | 400000 | 0.0383 | 7.7779 | 2387.2802 | 5.3370 | 207.8823 |
| 5.5723 | 0.7725 | 405000 | 0.0381 | 7.7460 | 2312.2991 | 5.3307 | 206.5772 |
| 5.4679 | 0.7820 | 410000 | 0.0384 | 7.7078 | 2225.5773 | 5.3204 | 204.4713 |
| 5.5022 | 0.7915 | 415000 | 0.0379 | 7.7039 | 2216.8935 | 5.3115 | 202.6541 |
| 5.4582 | 0.8011 | 420000 | 0.0382 | 7.7201 | 2253.1685 | 5.3098 | 202.3064 |
| 5.4716 | 0.8106 | 425000 | 0.0379 | 7.6967 | 2201.0493 | 5.2979 | 199.9121 |
| 5.4742 | 0.8202 | 430000 | 0.0379 | 7.7058 | 2221.2136 | 5.2944 | 199.2166 |
| 5.456 | 0.8297 | 435000 | 0.0378 | 7.7298 | 2275.2305 | 5.2859 | 197.5286 |
| 5.4751 | 0.8392 | 440000 | 0.0380 | 7.7102 | 2230.8815 | 5.2794 | 196.2503 |
| 5.4628 | 0.8488 | 445000 | 0.0379 | 7.7326 | 2281.6319 | 5.2746 | 195.3121 |
| 5.3535 | 0.8583 | 450000 | 0.0377 | 7.6863 | 2178.3606 | 5.2732 | 195.0385 |
| 5.5193 | 0.8678 | 455000 | 0.0380 | 7.7062 | 2221.9979 | 5.2646 | 193.3610 |
| 5.4747 | 0.8774 | 460000 | 0.0374 | 7.6888 | 2183.6563 | 5.2601 | 192.5007 |
| 5.4077 | 0.8869 | 465000 | 0.0375 | 7.6672 | 2137.0722 | 5.2530 | 191.1322 |
| 5.4288 | 0.8965 | 470000 | 0.0377 | 7.6538 | 2108.7450 | 5.2485 | 190.2730 |
| 5.4653 | 0.9060 | 475000 | 0.0377 | 7.6650 | 2132.3223 | 5.2456 | 189.7327 |
| 5.3929 | 0.9155 | 480000 | 0.0376 | 7.6603 | 2122.3287 | 5.2409 | 188.8477 |
| 5.405 | 0.9251 | 485000 | 0.0374 | 7.6530 | 2107.0023 | 5.2353 | 187.7804 |
| 5.4504 | 0.9346 | 490000 | 0.0374 | 7.6552 | 2111.6347 | 5.2315 | 187.0694 |
| 5.4217 | 0.9441 | 495000 | 0.0374 | 7.6548 | 2110.7966 | 5.2306 | 186.9062 |
| 5.4109 | 0.9537 | 500000 | 0.0372 | 7.6497 | 2099.9909 | 5.2254 | 185.9346 |
| 5.3892 | 1.0095 | 505000 | 0.0374 | 7.6487 | 2097.9349 | 5.2234 | 185.5628 |
| 5.3806 | 1.0191 | 510000 | 0.0374 | 7.6463 | 2092.8939 | 5.2203 | 184.9853 |
| 5.4174 | 1.0286 | 515000 | 0.0375 | 7.6450 | 2090.1261 | 5.2194 | 184.8205 |
| 5.4017 | 1.0381 | 520000 | 0.0374 | 7.6449 | 2090.0473 | 5.2185 | 184.6505 |

### Framework versions

- Transformers 4.57.0.dev0
- Pytorch 2.8.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.1
emissions.csv
ADDED
@@ -0,0 +1,2 @@
+timestamp,project_name,run_id,experiment_id,duration,emissions,emissions_rate,cpu_power,gpu_power,ram_power,cpu_energy,gpu_energy,ram_energy,energy_consumed,country_name,country_iso_code,region,cloud_provider,cloud_region,os,python_version,codecarbon_version,cpu_count,cpu_model,gpu_count,gpu_model,longitude,latitude,ram_total_size,tracking_mode,on_cloud,pue
+2025-10-11T04:48:11,codecarbon,845f784b-bb7d-4445-b5b0-8ad1b23c837d,5b0fa12a-3dd7-45bb-9766-cc326314d9f1,165966.11377579905,0.9164712577735377,5.522038426540395e-06,185.42970251401186,305.72457674222454,50.0,8.034171731925548,12.073996771411885,2.412319462592872,22.5204879659304,Sweden,SWE,dalarna county,,,Linux-4.18.0-553.56.1.el8_10.x86_64-x86_64-with-glibc2.28,3.10.18,3.0.4,32,AMD EPYC 7413 24-Core Processor,1,1 x NVIDIA H100 PCIe,15.6326,60.6043,171.68,machine,N,1.0
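The codecarbon row above is internally consistent: the emissions rate equals total emissions divided by duration, and the reported total energy matches the sum of the CPU, GPU, and RAM energy columns. A quick check, with all values copied from the CSV:

```python
# All values copied from the emissions.csv row above.
duration_s = 165966.11377579905         # ~46 hours of tracked training
emissions_kg = 0.9164712577735377       # kg CO2eq
emissions_rate = 5.522038426540395e-06  # kg CO2eq per second
cpu_kwh = 8.034171731925548
gpu_kwh = 12.073996771411885
ram_kwh = 2.412319462592872
energy_consumed = 22.5204879659304      # kWh

assert abs(emissions_kg / duration_s - emissions_rate) < 1e-12
assert abs(cpu_kwh + gpu_kwh + ram_kwh - energy_consumed) < 1e-9
```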
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:81d6a4380ea0550fdb9f995805dea6ef2c37e6f191679f0da61eb69ebe60bec4
 size 3132668816
runs/Oct09_06-03-54_gpu22.viking2.yor.alces.network/events.out.tfevents.1759988460.gpu22.viking2.yor.alces.network.3252474.0
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:9a3fb6969000260c3dd917acb0ebf11853f38a38f08afe5b9709ce7e3748e217
+size 140840