Commit History
All commits by John Doe.

910ce0d  step=320 loss=0.5616167783737183
8e9af9e  step=288 loss=1.5417312383651733
2e35d85  step=256 loss=0.8266531825065613
39a4121  step=224 loss=1.5066956281661987
4b5f322  step=192 loss=1.1840404272079468
01bbac0  step=160 loss=0.9588625431060791
99609be  step=128 loss=1.2087572813034058
343b043  step=96 loss=1.340285301208496
1d1c1f0  step=64 loss=1.172095775604248
c64aa8c  step=32 loss=1.1201095581054688
07f051a  32 batches; loss=1.12
a2b4ee8  step=176 loss=1.1972198486328125
fb6a555  step=160 loss=1.2805689573287964
184fb44  step=144 loss=1.4531817436218262
a56b0ee  step=128 loss=1.2950183153152466
9e8338e  step=112 loss=1.7769852876663208
438a84a  step=96 loss=1.256285548210144
97ad246  step=80 loss=1.703926682472229
6ac6dd0  step=64 loss=1.73197340965271
46ccd76  step=48 loss=1.0865358114242554
e44409e  step=32 loss=1.59828782081604
e3b4d54  step=16 loss=1.092730164527893
4245347  step=16 loss=1.1869220733642578
f9b959e  batch=421 loss=2.16 (1.17)
36d32f2  training interrupted and continued with new epoch. batch=331 loss=2.32 (1.84)
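Most of the commit messages above follow a simple "step=N loss=X" convention. As a minimal sketch (the regex and helper name are assumptions based on that message format, not part of the repository), the (step, loss) pairs can be recovered like this:

```python
import re

# Matches commit messages of the form "step=320 loss=0.5616..."
PATTERN = re.compile(r"step=(\d+) loss=([\d.]+)")

def parse_history(messages):
    """Extract (step, loss) pairs from commit messages, in the order given."""
    pairs = []
    for msg in messages:
        m = PATTERN.search(msg)
        if m:
            pairs.append((int(m.group(1)), float(m.group(2))))
    return pairs

history = [
    "step=320 loss=0.5616167783737183",
    "step=288 loss=1.5417312383651733",
    "32 batches; loss=1.12",  # messages in other formats are skipped
]
print(parse_history(history))
# → [(320, 0.5616167783737183), (288, 1.5417312383651733)]
```

Messages that use other conventions (e.g. "batch=421 loss=2.16 (1.17)") are simply skipped by this pattern and would need their own handling.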