Using TensorFlow backend.
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
embedding_1 (Embedding) (16, 64, 512) 44032
_________________________________________________________________
lstm_1 (LSTM) (16, 64, 256) 787456
_________________________________________________________________
dropout_1 (Dropout) (16, 64, 256) 0
_________________________________________________________________
lstm_2 (LSTM) (16, 64, 256) 525312
_________________________________________________________________
dropout_2 (Dropout) (16, 64, 256) 0
_________________________________________________________________
lstm_3 (LSTM) (16, 64, 256) 525312
_________________________________________________________________
dropout_3 (Dropout) (16, 64, 256) 0
_________________________________________________________________
time_distributed_1 (TimeDist (16, 64, 86) 22102
_________________________________________________________________
activation_1 (Activation) (16, 64, 86) 0
=================================================================
Total params: 1,904,214
Trainable params: 1,904,214
Non-trainable params: 0
_________________________________________________________________
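The parameter counts in the summary above can be checked arithmetically. The sketch below is an assumption-based reconstruction from the summary alone: a vocabulary of 86 characters, a 512-dimensional embedding, three 256-unit LSTM layers, and a TimeDistributed Dense projection back to the vocabulary. It recomputes each layer's parameter count using the standard LSTM formula (four gates, each with an input kernel, a recurrent kernel, and a bias) and confirms the total.

```python
# Recompute the per-layer parameter counts shown in the Keras summary.
# Assumed from the summary: vocab size 86, embedding dim 512, three
# LSTM layers of 256 units, TimeDistributed Dense back to 86 classes.
vocab, emb_dim, units = 86, 512, 256

def lstm_params(input_dim: int, units: int) -> int:
    # 4 gates, each with kernel (input_dim x units),
    # recurrent kernel (units x units), and bias (units).
    return 4 * units * (input_dim + units + 1)

embedding = vocab * emb_dim                 # 44032
lstm_1 = lstm_params(emb_dim, units)        # 787456
lstm_2 = lstm_params(units, units)          # 525312
lstm_3 = lstm_params(units, units)          # 525312
dense = (units + 1) * vocab                 # 22102 (weights + bias)

total = embedding + lstm_1 + lstm_2 + lstm_3 + dense
print(total)  # 1904214, matching "Total params: 1,904,214"
```

Each computed value matches the corresponding `Param #` column entry, which supports the reading of the architecture given above.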
Epoch 1/100
2018-08-19 08:12:54.247972: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2018-08-19 08:12:54.247997: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2018-08-19 08:12:54.248002: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2018-08-19 08:12:54.248006: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
Batch 1: loss = 4.454390048980713, acc = 0.0107421875
Batch 2: loss = 4.436723232269287, acc = 0.1474609375
Batch 3: loss = 4.400537014007568, acc = 0.1142578125
Batch 4: loss = 4.232309341430664, acc = 0.1455078125
Batch 5: loss = 3.77453351020813, acc = 0.1611328125
Batch 6: loss = 3.9434356689453125, acc = 0.142578125
Batch 7: loss = 3.699286460876465, acc = 0.138671875
Batch 8: loss = 3.5910263061523438, acc = 0.1201171875
Batch 9: loss = 3.6818315982818604, acc = 0.0751953125
Batch 10: loss = 3.6594150066375732, acc = 0.08203125
Batch 11: loss = 3.567378044128418, acc = 0.0703125
Batch 12: loss = 3.629075527191162, acc = 0.046875
Batch 13: loss = 3.628910541534424, acc = 0.05078125
Batch 14: loss = 3.5983996391296387, acc = 0.0390625
Batch 15: loss = 3.5846943855285645, acc = 0.080078125
Batch 16: loss = 3.427335262298584, acc = 0.13671875
Batch 17: loss = 3.375317096710205, acc = 0.1591796875
Batch 18: loss = 3.3146743774414062, acc = 0.1611328125
Batch 19: loss = 3.5875778198242188, acc = 0.1259765625
Batch 20: loss = 3.530864715576172, acc = 0.123046875
Batch 21: loss = 3.5870752334594727, acc = 0.1279296875
Batch 22: loss = 3.6523892879486084, acc = 0.1259765625
Batch 23: loss = 3.296334743499756, acc = 0.181640625
Batch 24: loss = 3.480577230453491, acc = 0.1484375
Batch 25: loss = 3.492737293243408, acc = 0.1416015625
Batch 26: loss = 3.5193252563476562, acc = 0.130859375
Batch 27: loss = 3.24818754196167, acc = 0.1875
Batch 28: loss = 3.338367462158203, acc = 0.1689453125
Batch 29: loss = 3.488800048828125, acc = 0.1552734375
Batch 30: loss = 3.441650867462158, acc = 0.1455078125
Batch 31: loss = 3.304414749145508, acc = 0.166015625
Batch 32: loss = 3.3811538219451904, acc = 0.146484375
Batch 33: loss = 3.3684186935424805, acc = 0.162109375
Batch 34: loss = 3.3241796493530273, acc = 0.150390625
Batch 35: loss = 3.2710442543029785, acc = 0.1630859375
Batch 36: loss = 3.4312543869018555, acc = 0.150390625
Batch 37: loss = 3.375916004180908, acc = 0.12890625
Batch 38: loss = 3.313051462173462, acc = 0.142578125
Batch 39: loss = 3.307137966156006, acc = 0.1572265625
Batch 40: loss = 3.300055742263794, acc = 0.1669921875
Batch 41: loss = 3.3036417961120605, acc = 0.15234375
Batch 42: loss = 3.1704964637756348, acc = 0.169921875
Batch 43: loss = 3.183171272277832, acc = 0.1708984375
Batch 44: loss = 3.322441339492798, acc = 0.16015625
Batch 45: loss = 3.2925169467926025, acc = 0.14453125
Batch 46: loss = 3.20310115814209, acc = 0.162109375
Batch 47: loss = 3.2485883235931396, acc = 0.1640625
Batch 48: loss = 3.166637420654297, acc = 0.17578125
Batch 49: loss = 3.30924916267395, acc = 0.1484375
Batch 50: loss = 3.1999664306640625, acc = 0.1630859375
Batch 51: loss = 3.18353271484375, acc = 0.1591796875
Batch 52: loss = 3.1365394592285156, acc = 0.17578125
Batch 53: loss = 3.2167725563049316, acc = 0.150390625
Batch 54: loss = 3.28530216217041, acc = 0.1533203125
Batch 55: loss = 3.302375078201294, acc = 0.138671875
Batch 56: loss = 3.157515525817871, acc = 0.15234375
Batch 57: loss = 3.154728651046753, acc = 0.1611328125
Batch 58: loss = 3.2235875129699707, acc = 0.1396484375
Batch 59: loss = 3.274887800216675, acc = 0.13671875
Batch 60: loss = 3.1678736209869385, acc = 0.1435546875
Batch 61: loss = 3.1924760341644287, acc = 0.1533203125
Batch 62: loss = 3.1027002334594727, acc = 0.173828125
Batch 63: loss = 3.2482497692108154, acc = 0.1552734375
Batch 64: loss = 3.197383403778076, acc = 0.1533203125
Batch 65: loss = 3.1012067794799805, acc = 0.169921875
Batch 66: loss = 3.1290488243103027, acc = 0.1650390625
Batch 67: loss = 3.1854703426361084, acc = 0.1669921875
Batch 68: loss = 3.212329864501953, acc = 0.146484375
Batch 69: loss = 3.1458470821380615, acc = 0.1650390625
Batch 70: loss = 3.0480518341064453, acc = 0.1806640625
Batch 71: loss = 3.1526405811309814, acc = 0.16796875
Batch 72: loss = 3.155853271484375, acc = 0.1767578125
Batch 73: loss = 3.1185450553894043, acc = 0.1728515625
Batch 74: loss = 3.046165943145752, acc = 0.1806640625
Batch 75: loss = 3.039363145828247, acc = 0.1767578125
Batch 76: loss = 3.100555658340454, acc = 0.1689453125
Batch 77: loss = 3.021588087081909, acc = 0.2001953125
Batch 78: loss = 2.9985837936401367, acc = 0.1748046875
Batch 79: loss = 3.01607084274292, acc = 0.1884765625
Batch 80: loss = 2.9068760871887207, acc = 0.1943359375
Batch 81: loss = 3.0610411167144775, acc = 0.1904296875
Batch 82: loss = 3.1154165267944336, acc = 0.1787109375
Batch 83: loss = 3.021592855453491, acc = 0.181640625
Batch 84: loss = 2.965991973876953, acc = 0.19140625
Batch 85: loss = 2.885828733444214, acc = 0.2119140625
Batch 86: loss = 3.026918411254883, acc = 0.185546875
Batch 87: loss = 3.178969383239746, acc = 0.1640625
Batch 88: loss = 3.011878490447998, acc = 0.1943359375
Batch 89: loss = 3.0468716621398926, acc = 0.181640625
Batch 90: loss = 3.1029763221740723, acc = 0.193359375
Batch 91: loss = 2.964935064315796, acc = 0.20703125
Batch 92: loss = 2.9623889923095703, acc = 0.1806640625
Batch 93: loss = 3.003779172897339, acc = 0.1884765625
Batch 94: loss = 2.909271717071533, acc = 0.2021484375
Batch 95: loss = 2.873121738433838, acc = 0.21484375
Batch 96: loss = 2.8793997764587402, acc = 0.2021484375
Batch 97: loss = 2.9861719608306885, acc = 0.1748046875
Batch 98: loss = 2.9765210151672363, acc = 0.197265625
Batch 99: loss = 2.851529121398926, acc = 0.20703125
Batch 100: loss = 2.8618903160095215, acc = 0.21484375
Batch 101: loss = 2.976154327392578, acc = 0.203125
Batch 102: loss = 3.0254125595092773, acc = 0.18359375
Batch 103: loss = 2.8968610763549805, acc = 0.1875
Batch 104: loss = 2.8555829524993896, acc = 0.224609375
Batch 105: loss = 2.780703544616699, acc = 0.224609375
Batch 106: loss = 2.843506097793579, acc = 0.19140625
Batch 107: loss = 2.834918260574341, acc = 0.2275390625
Batch 108: loss = 2.767465353012085, acc = 0.244140625
Batch 109: loss = 2.7225265502929688, acc = 0.240234375
Batch 110: loss = 2.8358778953552246, acc = 0.2421875
Batch 111: loss = 2.8429794311523438, acc = 0.236328125
Batch 112: loss = 2.8061280250549316, acc = 0.224609375
Batch 113: loss = 2.7999496459960938, acc = 0.2138671875
Batch 114: loss = 2.8613545894622803, acc = 0.216796875
Batch 115: loss = 2.827665328979492, acc = 0.2265625
Batch 116: loss = 2.8445191383361816, acc = 0.2099609375
Batch 117: loss = 2.9224236011505127, acc = 0.2041015625
Batch 118: loss = 2.779392957687378, acc = 0.2353515625
Batch 119: loss = 2.7342352867126465, acc = 0.244140625
Batch 120: loss = 2.8388679027557373, acc = 0.2353515625
Batch 121: loss = 2.731635093688965, acc = 0.2412109375
Batch 122: loss = 2.833055257797241, acc = 0.21484375
Batch 123: loss = 2.78127384185791, acc = 0.2431640625
Batch 124: loss = 2.6567978858947754, acc = 0.2607421875
Batch 125: loss = 2.654332160949707, acc = 0.2861328125
Batch 126: loss = 2.783754348754883, acc = 0.2626953125
Epoch 2/100
Batch 1: loss = 3.035529613494873, acc = 0.2431640625
Batch 2: loss = 2.6527926921844482, acc = 0.2978515625
Batch 3: loss = 2.7760815620422363, acc = 0.2470703125
Batch 4: loss = 2.6269185543060303, acc = 0.2724609375
Batch 5: loss = 2.665935516357422, acc = 0.2783203125
Batch 6: loss = 2.7697596549987793, acc = 0.26953125
Batch 7: loss = 2.763477325439453, acc = 0.2578125
Batch 8: loss = 2.621147632598877, acc = 0.2626953125
Batch 9: loss = 2.6288437843322754, acc = 0.2685546875
Batch 10: loss = 2.527158260345459, acc = 0.3017578125
Batch 11: loss = 2.531536102294922, acc = 0.31640625
Batch 12: loss = 2.636348247528076, acc = 0.291015625
Batch 13: loss = 2.617934465408325, acc = 0.2783203125
Batch 14: loss = 2.5300230979919434, acc = 0.283203125
Batch 15: loss = 2.5671424865722656, acc = 0.27734375
Batch 16: loss = 2.500370502471924, acc = 0.30859375
Batch 17: loss = 2.4950320720672607, acc = 0.318359375
Batch 18: loss = 2.5213537216186523, acc = 0.30859375
Batch 19: loss = 2.578141212463379, acc = 0.296875
Batch 20: loss = 2.520859718322754, acc = 0.30859375
Batch 21: loss = 2.582458257675171, acc = 0.2861328125
Batch 22: loss = 2.4905521869659424, acc = 0.2998046875
Batch 23: loss = 2.342447280883789, acc = 0.341796875
Batch 24: loss = 2.4944207668304443, acc = 0.3076171875
Batch 25: loss = 2.4126648902893066, acc = 0.298828125
Batch 26: loss = 2.3795008659362793, acc = 0.33203125
Batch 27: loss = 2.3047196865081787, acc = 0.33203125
Batch 28: loss = 2.316915988922119, acc = 0.34765625
Batch 29: loss = 2.3974978923797607, acc = 0.31640625
Batch 30: loss = 2.392056703567505, acc = 0.3525390625
Batch 31: loss = 2.422626495361328, acc = 0.3349609375
Batch 32: loss = 2.5189547538757324, acc = 0.3095703125
Batch 33: loss = 2.374316453933716, acc = 0.345703125
Batch 34: loss = 2.3381474018096924, acc = 0.33203125
Batch 35: loss = 2.2939767837524414, acc = 0.3525390625
Batch 36: loss = 2.5395774841308594, acc = 0.2900390625
Batch 37: loss = 2.4815773963928223, acc = 0.3134765625
Batch 38: loss = 2.411848545074463, acc = 0.318359375
Batch 39: loss = 2.3354196548461914, acc = 0.3515625
Batch 40: loss = 2.250337839126587, acc = 0.3447265625
Batch 41: loss = 2.2737691402435303, acc = 0.35546875
Batch 42: loss = 2.1154227256774902, acc = 0.40625
Batch 43: loss = 2.1257755756378174, acc = 0.3759765625
Batch 44: loss = 2.3019466400146484, acc = 0.3359375
Batch 45: loss = 2.249648094177246, acc = 0.34765625
Batch 46: loss = 2.2227301597595215, acc = 0.3466796875
Batch 47: loss = 2.2312331199645996, acc = 0.3662109375
Batch 48: loss = 2.119194507598877, acc = 0.36328125
Batch 49: loss = 2.256570816040039, acc = 0.3466796875
Batch 50: loss = 2.183021306991577, acc = 0.353515625
Batch 51: loss = 2.1845574378967285, acc = 0.365234375
Batch 52: loss = 2.167276620864868, acc = 0.3828125
Batch 53: loss = 2.294023275375366, acc = 0.34375
Batch 54: loss = 2.3321142196655273, acc = 0.326171875
Batch 55: loss = 2.3665857315063477, acc = 0.3154296875
Batch 56: loss = 2.110642433166504, acc = 0.3798828125
Batch 57: loss = 2.109957218170166, acc = 0.396484375
Batch 58: loss = 2.237764596939087, acc = 0.345703125
Batch 59: loss = 2.2222180366516113, acc = 0.3408203125
Batch 60: loss = 2.1438279151916504, acc = 0.384765625
Batch 61: loss = 2.3372130393981934, acc = 0.3408203125
Batch 62: loss = 2.158538341522217, acc = 0.3759765625
Batch 63: loss = 2.365854263305664, acc = 0.3486328125
Batch 64: loss = 2.1124496459960938, acc = 0.38671875
Batch 65: loss = 2.065979480743408, acc = 0.4150390625
Batch 66: loss = 2.1161444187164307, acc = 0.3984375
Batch 67: loss = 2.227736473083496, acc = 0.3466796875
Batch 68: loss = 2.1691412925720215, acc = 0.37890625
Batch 69: loss = 2.0490870475769043, acc = 0.3994140625
Batch 70: loss = 2.1242687702178955, acc = 0.408203125
Batch 71: loss = 2.1080615520477295, acc = 0.4072265625
Batch 72: loss = 2.1949775218963623, acc = 0.3720703125
Batch 73: loss = 2.0792794227600098, acc = 0.396484375
Batch 74: loss = 1.9880890846252441, acc = 0.4140625
Batch 75: loss = 2.0539050102233887, acc = 0.390625
Batch 76: loss = 2.124222755432129, acc = 0.3916015625
Batch 77: loss = 2.0070576667785645, acc = 0.408203125
Batch 78: loss = 1.960235595703125, acc = 0.4404296875
Batch 79: loss = 1.9150278568267822, acc = 0.455078125
Batch 80: loss = 1.756925344467163, acc = 0.453125
Batch 81: loss = 2.0517184734344482, acc = 0.3955078125
Batch 82: loss = 2.0730862617492676, acc = 0.408203125
Batch 83: loss = 1.974690318107605, acc = 0.4169921875
Batch 84: loss = 1.927781581878662, acc = 0.4453125
Batch 85: loss = 1.8079042434692383, acc = 0.466796875
Batch 86: loss = 2.140885353088379, acc = 0.380859375
Batch 87: loss = 2.1705870628356934, acc = 0.39453125
Batch 88: loss = 2.121853828430176, acc = 0.4072265625
Batch 89: loss = 2.1970019340515137, acc = 0.3759765625
Batch 90: loss = 2.215484619140625, acc = 0.40234375
Batch 91: loss = 1.9512856006622314, acc = 0.421875
Batch 92: loss = 1.9551972150802612, acc = 0.4228515625
Batch 93: loss = 1.9607418775558472, acc = 0.4375
Batch 94: loss = 1.90357506275177, acc = 0.4404296875
Batch 95: loss = 1.8979930877685547, acc = 0.4208984375
Batch 96: loss = 1.9100072383880615, acc = 0.4345703125
Batch 97: loss = 2.0863187313079834, acc = 0.384765625
Batch 98: loss = 2.0357284545898438, acc = 0.416015625
Batch 99: loss = 1.8478232622146606, acc = 0.4541015625
Batch 100: loss = 1.810758113861084, acc = 0.447265625
Batch 101: loss = 2.0712060928344727, acc = 0.3984375
Batch 102: loss = 2.1583402156829834, acc = 0.3759765625
Batch 103: loss = 2.066755771636963, acc = 0.38671875
Batch 104: loss = 1.9659109115600586, acc = 0.400390625
Batch 105: loss = 1.9372284412384033, acc = 0.419921875
Batch 106: loss = 2.029855251312256, acc = 0.3984375
Batch 107: loss = 2.021092414855957, acc = 0.3955078125
Batch 108: loss = 1.920992374420166, acc = 0.427734375
Batch 109: loss = 1.8782713413238525, acc = 0.41796875
Batch 110: loss = 2.0092878341674805, acc = 0.41015625
Batch 111: loss = 2.115964889526367, acc = 0.400390625
Batch 112: loss = 2.0574092864990234, acc = 0.4130859375
Batch 113: loss = 1.9267797470092773, acc = 0.42578125
Batch 114: loss = 2.1179637908935547, acc = 0.3994140625
Batch 115: loss = 2.0716452598571777, acc = 0.4189453125
Batch 116: loss = 2.0397539138793945, acc = 0.4287109375
Batch 117: loss = 2.1219239234924316, acc = 0.3857421875
Batch 118: loss = 1.9701530933380127, acc = 0.427734375
Batch 119: loss = 1.9485059976577759, acc = 0.427734375
Batch 120: loss = 2.0442464351654053, acc = 0.4111328125
Batch 121: loss = 1.8664747476577759, acc = 0.447265625
Batch 122: loss = 2.068753242492676, acc = 0.3974609375
Batch 123: loss = 2.0040931701660156, acc = 0.41796875
Batch 124: loss = 1.933282494544983, acc = 0.421875
Batch 125: loss = 1.8580965995788574, acc = 0.4462890625
Batch 126: loss = 2.051638126373291, acc = 0.421875
Epoch 3/100
Batch 1: loss = 2.4864349365234375, acc = 0.3603515625
Batch 2: loss = 1.9282764196395874, acc = 0.4306640625
Batch 3: loss = 2.091183662414551, acc = 0.3955078125
Batch 4: loss = 1.8788573741912842, acc = 0.423828125
Batch 5: loss = 2.022368907928467, acc = 0.4248046875
Batch 6: loss = 2.1061437129974365, acc = 0.404296875
Batch 7: loss = 2.1353201866149902, acc = 0.3974609375
Batch 8: loss = 1.933554768562317, acc = 0.4287109375
Batch 9: loss = 1.986527442932129, acc = 0.42578125
Batch 10: loss = 1.8151978254318237, acc = 0.4658203125
Batch 11: loss = 1.90380859375, acc = 0.455078125
Batch 12: loss = 2.0145621299743652, acc = 0.4287109375
Batch 13: loss = 2.004103660583496, acc = 0.4169921875
Batch 14: loss = 1.8606159687042236, acc = 0.447265625
Batch 15: loss = 1.9784767627716064, acc = 0.4453125
Batch 16: loss = 2.0042788982391357, acc = 0.4248046875
Batch 17: loss = 1.9727903604507446, acc = 0.44140625
Batch 18: loss = 2.0280754566192627, acc = 0.4189453125
Batch 19: loss = 2.1030397415161133, acc = 0.4013671875
Batch 20: loss = 1.9459044933319092, acc = 0.4384765625
Batch 21: loss = 2.063318967819214, acc = 0.4130859375
Batch 22: loss = 1.9464104175567627, acc = 0.4267578125
Batch 23: loss = 1.8698973655700684, acc = 0.4453125
Batch 24: loss = 1.971191167831421, acc = 0.4287109375
Batch 25: loss = 1.8527791500091553, acc = 0.4443359375
Batch 26: loss = 1.8619911670684814, acc = 0.4560546875
Batch 27: loss = 1.7822284698486328, acc = 0.4765625
Batch 28: loss = 1.7763640880584717, acc = 0.4794921875
Batch 29: loss = 1.8359816074371338, acc = 0.4677734375
Batch 30: loss = 1.854383111000061, acc = 0.482421875
Batch 31: loss = 1.9640448093414307, acc = 0.4375
Batch 32: loss = 2.114806652069092, acc = 0.4140625
Batch 33: loss = 1.9769943952560425, acc = 0.4287109375
Batch 34: loss = 1.8760802745819092, acc = 0.4423828125
Batch 35: loss = 1.876638650894165, acc = 0.4521484375
Batch 36: loss = 2.077208995819092, acc = 0.4345703125
Batch 37: loss = 2.033595561981201, acc = 0.44140625
Batch 38: loss = 1.9413524866104126, acc = 0.4345703125
Batch 39: loss = 1.88388991355896, acc = 0.4638671875
Batch 40: loss = 1.832550287246704, acc = 0.4658203125
Batch 41: loss = 1.791978120803833, acc = 0.4873046875
Batch 42: loss = 1.7053086757659912, acc = 0.5029296875
Batch 43: loss = 1.7689096927642822, acc = 0.455078125
Batch 44: loss = 1.8696764707565308, acc = 0.4501953125
Batch 45: loss = 1.740330457687378, acc = 0.4853515625
Batch 46: loss = 1.8198007345199585, acc = 0.458984375
Batch 47: loss = 1.848508358001709, acc = 0.4501953125
Batch 48: loss = 1.718815803527832, acc = 0.4580078125
Batch 49: loss = 1.805159091949463, acc = 0.4912109375
Batch 50: loss = 1.730323314666748, acc = 0.5029296875
Batch 51: loss = 1.8272652626037598, acc = 0.4560546875
Batch 52: loss = 1.8240962028503418, acc = 0.47265625
Batch 53: loss = 1.8711615800857544, acc = 0.45703125
Batch 54: loss = 1.8804420232772827, acc = 0.4482421875
Batch 55: loss = 1.9133563041687012, acc = 0.4658203125
Batch 56: loss = 1.7307029962539673, acc = 0.4794921875
Batch 57: loss = 1.8204765319824219, acc = 0.455078125
Batch 58: loss = 1.848186731338501, acc = 0.4638671875
Batch 59: loss = 1.7199842929840088, acc = 0.513671875
Batch 60: loss = 1.7720987796783447, acc = 0.4755859375
Batch 61: loss = 1.9982962608337402, acc = 0.4287109375
Batch 62: loss = 1.9219627380371094, acc = 0.4140625
Batch 63: loss = 2.028214931488037, acc = 0.4189453125
Batch 64: loss = 1.653975486755371, acc = 0.51171875
Batch 65: loss = 1.7428345680236816, acc = 0.4931640625
Batch 66: loss = 1.7532994747161865, acc = 0.4931640625
Batch 67: loss = 1.8159581422805786, acc = 0.494140625
Batch 68: loss = 1.7796800136566162, acc = 0.4892578125
Batch 69: loss = 1.6641294956207275, acc = 0.5048828125
Batch 70: loss = 1.8424961566925049, acc = 0.47265625
Batch 71: loss = 1.7592437267303467, acc = 0.4833984375
Batch 72: loss = 1.7430033683776855, acc = 0.4873046875
Batch 73: loss = 1.7646540403366089, acc = 0.4658203125
Batch 74: loss = 1.7052150964736938, acc = 0.490234375
Batch 75: loss = 1.762937068939209, acc = 0.462890625
Batch 76: loss = 1.8073244094848633, acc = 0.4716796875
Batch 77: loss = 1.652730941772461, acc = 0.517578125
Batch 78: loss = 1.6041244268417358, acc = 0.5244140625
Batch 79: loss = 1.5498614311218262, acc = 0.53125
Batch 80: loss = 1.5305733680725098, acc = 0.5166015625
Batch 81: loss = 1.6710143089294434, acc = 0.5166015625
Batch 82: loss = 1.6873021125793457, acc = 0.51953125
Batch 83: loss = 1.6571204662322998, acc = 0.51953125
Batch 84: loss = 1.640306830406189, acc = 0.5244140625
Batch 85: loss = 1.6278698444366455, acc = 0.5087890625
Batch 86: loss = 1.772813320159912, acc = 0.482421875
Batch 87: loss = 1.7501507997512817, acc = 0.5048828125
Batch 88: loss = 1.885927677154541, acc = 0.4638671875
Batch 89: loss = 1.9273443222045898, acc = 0.44140625
Batch 90: loss = 1.945972204208374, acc = 0.4658203125
Batch 91: loss = 1.78849458694458, acc = 0.4775390625
Batch 92: loss = 1.6684720516204834, acc = 0.5
Batch 93: loss = 1.634171485900879, acc = 0.52734375
Batch 94: loss = 1.6244654655456543, acc = 0.5205078125
Batch 95: loss = 1.6611826419830322, acc = 0.4775390625
Batch 96: loss = 1.6761934757232666, acc = 0.484375
Batch 97: loss = 1.7611808776855469, acc = 0.47265625
Batch 98: loss = 1.711345911026001, acc = 0.50390625
Batch 99: loss = 1.6761798858642578, acc = 0.482421875
Batch 100: loss = 1.6257208585739136, acc = 0.4892578125
Batch 101: loss = 1.8532689809799194, acc = 0.4501953125
Batch 102: loss = 1.885880708694458, acc = 0.4560546875
Batch 103: loss = 1.797227144241333, acc = 0.482421875
Batch 104: loss = 1.6885825395584106, acc = 0.4921875
Batch 105: loss = 1.7128068208694458, acc = 0.48046875
Batch 106: loss = 1.7995485067367554, acc = 0.458984375
Batch 107: loss = 1.7133047580718994, acc = 0.482421875
Batch 108: loss = 1.7320176362991333, acc = 0.4755859375
Batch 109: loss = 1.6281497478485107, acc = 0.4833984375
Batch 110: loss = 1.694640874862671, acc = 0.4912109375
Batch 111: loss = 1.8366730213165283, acc = 0.451171875
Batch 112: loss = 1.7433979511260986, acc = 0.4765625
Batch 113: loss = 1.6867589950561523, acc = 0.4931640625
Batch 114: loss = 1.7841787338256836, acc = 0.4931640625
Batch 115: loss = 1.8074054718017578, acc = 0.4755859375
Batch 116: loss = 1.7643818855285645, acc = 0.470703125
Batch 117: loss = 1.7403641939163208, acc = 0.5078125
Batch 118: loss = 1.6794716119766235, acc = 0.498046875
Batch 119: loss = 1.740513563156128, acc = 0.46875
Batch 120: loss = 1.802060604095459, acc = 0.4541015625
Batch 121: loss = 1.6495702266693115, acc = 0.5
Batch 122: loss = 1.744360089302063, acc = 0.478515625
Batch 123: loss = 1.6899521350860596, acc = 0.50390625
Batch 124: loss = 1.7009353637695312, acc = 0.4765625
Batch 125: loss = 1.6735506057739258, acc = 0.494140625
Batch 126: loss = 1.8247368335723877, acc = 0.4794921875
Epoch 4/100
Batch 1: loss = 2.3326706886291504, acc = 0.4033203125
Batch 2: loss = 1.7301852703094482, acc = 0.4912109375
Batch 3: loss = 1.825676441192627, acc = 0.4765625
Batch 4: loss = 1.6541852951049805, acc = 0.5087890625
Batch 5: loss = 1.8442442417144775, acc = 0.453125
Batch 6: loss = 1.9040006399154663, acc = 0.458984375
Batch 7: loss = 1.8799805641174316, acc = 0.45703125
Batch 8: loss = 1.6720149517059326, acc = 0.5244140625
Batch 9: loss = 1.7459604740142822, acc = 0.47265625
Batch 10: loss = 1.5765156745910645, acc = 0.5390625
Batch 11: loss = 1.717172384262085, acc = 0.501953125
Batch 12: loss = 1.8837580680847168, acc = 0.4736328125
Batch 13: loss = 1.7482061386108398, acc = 0.4814453125
Batch 14: loss = 1.640287160873413, acc = 0.498046875
Batch 15: loss = 1.724438190460205, acc = 0.5
Batch 16: loss = 1.8189897537231445, acc = 0.4599609375
Batch 17: loss = 1.7548038959503174, acc = 0.4990234375
Batch 18: loss = 1.833411693572998, acc = 0.458984375
Batch 19: loss = 1.8519517183303833, acc = 0.458984375
Batch 20: loss = 1.6871449947357178, acc = 0.4970703125
Batch 21: loss = 1.8476972579956055, acc = 0.47265625
Batch 22: loss = 1.6739451885223389, acc = 0.49609375
Batch 23: loss = 1.7249739170074463, acc = 0.48046875
Batch 24: loss = 1.7761478424072266, acc = 0.4873046875
Batch 25: loss = 1.6415756940841675, acc = 0.5126953125
Batch 26: loss = 1.6599531173706055, acc = 0.513671875
Batch 27: loss = 1.6538031101226807, acc = 0.5234375
Batch 28: loss = 1.6181528568267822, acc = 0.5068359375
Batch 29: loss = 1.623913049697876, acc = 0.5224609375
Batch 30: loss = 1.6548986434936523, acc = 0.5166015625
Batch 31: loss = 1.7561100721359253, acc = 0.4921875
Batch 32: loss = 1.9369598627090454, acc = 0.4462890625
Batch 33: loss = 1.7883495092391968, acc = 0.4794921875
Batch 34: loss = 1.697330355644226, acc = 0.5029296875
Batch 35: loss = 1.7095690965652466, acc = 0.4873046875
Batch 36: loss = 1.8356661796569824, acc = 0.4814453125
Batch 37: loss = 1.7920982837677002, acc = 0.4990234375
Batch 38: loss = 1.7686851024627686, acc = 0.486328125
Batch 39: loss = 1.6966102123260498, acc = 0.4921875
Batch 40: loss = 1.6898393630981445, acc = 0.5078125
Batch 41: loss = 1.594961166381836, acc = 0.54296875
Batch 42: loss = 1.5764082670211792, acc = 0.5205078125
Batch 43: loss = 1.624817132949829, acc = 0.5009765625
Batch 44: loss = 1.6670315265655518, acc = 0.5146484375
Batch 45: loss = 1.5257494449615479, acc = 0.5576171875
Batch 46: loss = 1.634783148765564, acc = 0.5234375
Batch 47: loss = 1.6517177820205688, acc = 0.509765625
Batch 48: loss = 1.5467947721481323, acc = 0.513671875
Batch 49: loss = 1.5618327856063843, acc = 0.546875
Batch 50: loss = 1.5382195711135864, acc = 0.5654296875
Batch 51: loss = 1.6378127336502075, acc = 0.4931640625
Batch 52: loss = 1.7098462581634521, acc = 0.4775390625
Batch 53: loss = 1.7108516693115234, acc = 0.494140625
Batch 54: loss = 1.6368390321731567, acc = 0.537109375
Batch 55: loss = 1.6642111539840698, acc = 0.5146484375
Batch 56: loss = 1.577380657196045, acc = 0.521484375
Batch 57: loss = 1.6884639263153076, acc = 0.48828125
Batch 58: loss = 1.6496984958648682, acc = 0.5166015625
Batch 59: loss = 1.4773694276809692, acc = 0.572265625
Batch 60: loss = 1.597594976425171, acc = 0.5126953125
Batch 61: loss = 1.7655613422393799, acc = 0.515625
Batch 62: loss = 1.785996675491333, acc = 0.4384765625
Batch 63: loss = 1.8253412246704102, acc = 0.478515625
Batch 64: loss = 1.4267942905426025, acc = 0.5869140625
Batch 65: loss = 1.5791902542114258, acc = 0.5283203125
Batch 66: loss = 1.5997371673583984, acc = 0.5166015625
Batch 67: loss = 1.6144721508026123, acc = 0.5341796875
Batch 68: loss = 1.6002252101898193, acc = 0.544921875
Batch 69: loss = 1.5456883907318115, acc = 0.53125
Batch 70: loss = 1.702614665031433, acc = 0.498046875
Batch 71: loss = 1.561411738395691, acc = 0.53515625
Batch 72: loss = 1.5596723556518555, acc = 0.5390625
Batch 73: loss = 1.6199760437011719, acc = 0.509765625
Batch 74: loss = 1.5635534524917603, acc = 0.5126953125
Batch 75: loss = 1.629435658454895, acc = 0.498046875
Batch 76: loss = 1.6597676277160645, acc = 0.5048828125
Batch 77: loss = 1.4792728424072266, acc = 0.5654296875
Batch 78: loss = 1.45106840133667, acc = 0.5810546875
Batch 79: loss = 1.3860504627227783, acc = 0.5947265625
Batch 80: loss = 1.4341239929199219, acc = 0.5400390625
Batch 81: loss = 1.4763643741607666, acc = 0.564453125
Batch 82: loss = 1.4637075662612915, acc = 0.56640625
Batch 83: loss = 1.496018409729004, acc = 0.5517578125
Batch 84: loss = 1.5113822221755981, acc = 0.55078125
Batch 85: loss = 1.5445177555084229, acc = 0.5263671875
Batch 86: loss = 1.5820902585983276, acc = 0.5341796875
Batch 87: loss = 1.5392571687698364, acc = 0.5673828125
Batch 88: loss = 1.7532110214233398, acc = 0.4921875
Batch 89: loss = 1.7019399404525757, acc = 0.529296875
Batch 90: loss = 1.7841835021972656, acc = 0.48828125
Batch 91: loss = 1.6731864213943481, acc = 0.4970703125
Batch 92: loss = 1.4668543338775635, acc = 0.5498046875
Batch 93: loss = 1.4508464336395264, acc = 0.5751953125
Batch 94: loss = 1.478480577468872, acc = 0.560546875
Batch 95: loss = 1.5397164821624756, acc = 0.51953125
Batch 96: loss = 1.5484099388122559, acc = 0.5244140625
Batch 97: loss = 1.5663520097732544, acc = 0.541015625
Batch 98: loss = 1.564638614654541, acc = 0.5576171875
Batch 99: loss = 1.5364701747894287, acc = 0.52734375
Batch 100: loss = 1.4798271656036377, acc = 0.5419921875
Batch 101: loss = 1.6201667785644531, acc = 0.5302734375
Batch 102: loss = 1.720630168914795, acc = 0.4873046875
Batch 103: loss = 1.6205954551696777, acc = 0.513671875
Batch 104: loss = 1.5365924835205078, acc = 0.541015625
Batch 105: loss = 1.5693418979644775, acc = 0.5283203125
Batch 106: loss = 1.6408435106277466, acc = 0.529296875
Batch 107: loss = 1.546811819076538, acc = 0.53515625
Batch 108: loss = 1.5401549339294434, acc = 0.537109375
Batch 109: loss = 1.4911412000656128, acc = 0.533203125
Batch 110: loss = 1.4836015701293945, acc = 0.5595703125
Batch 111: loss = 1.6837384700775146, acc = 0.5
Batch 112: loss = 1.5329179763793945, acc = 0.5400390625
Batch 113: loss = 1.5347336530685425, acc = 0.53125
Batch 114: loss = 1.5913487672805786, acc = 0.5322265625
Batch 115: loss = 1.6589550971984863, acc = 0.5087890625
Batch 116: loss = 1.6319217681884766, acc = 0.5087890625
Batch 117: loss = 1.5192360877990723, acc = 0.560546875
Batch 118: loss = 1.5040736198425293, acc = 0.5419921875
Batch 119: loss = 1.6001052856445312, acc = 0.5185546875
Batch 120: loss = 1.6636027097702026, acc = 0.494140625
Batch 121: loss = 1.4687498807907104, acc = 0.5517578125
Batch 122: loss = 1.5237858295440674, acc = 0.541015625
Batch 123: loss = 1.5205799341201782, acc = 0.54296875
Batch 124: loss = 1.5533900260925293, acc = 0.513671875
Batch 125: loss = 1.5289340019226074, acc = 0.544921875
Batch 126: loss = 1.6443617343902588, acc = 0.515625
Epoch 5/100
Batch 1: loss = 2.1608481407165527, acc = 0.443359375
Batch 2: loss = 1.608494520187378, acc = 0.5126953125
Batch 3: loss = 1.6548470258712769, acc = 0.5087890625
Batch 4: loss = 1.5161596536636353, acc = 0.5341796875
Batch 5: loss = 1.7304306030273438, acc = 0.4951171875
Batch 6: loss = 1.7603676319122314, acc = 0.494140625
Batch 7: loss = 1.7303000688552856, acc = 0.4951171875
Batch 8: loss = 1.5352120399475098, acc = 0.5419921875
Batch 9: loss = 1.555720329284668, acc = 0.533203125
Batch 10: loss = 1.4347087144851685, acc = 0.58203125
Batch 11: loss = 1.5990538597106934, acc = 0.5380859375
Batch 12: loss = 1.7472333908081055, acc = 0.4853515625
Batch 13: loss = 1.6049778461456299, acc = 0.5234375
Batch 14: loss = 1.473071813583374, acc = 0.5400390625
Batch 15: loss = 1.5443103313446045, acc = 0.5498046875
Batch 16: loss = 1.639938473701477, acc = 0.5029296875
Batch 17: loss = 1.580937147140503, acc = 0.5419921875
Batch 18: loss = 1.6789898872375488, acc = 0.4921875
Batch 19: loss = 1.6796643733978271, acc = 0.505859375
Batch 20: loss = 1.4636597633361816, acc = 0.55078125
Batch 21: loss = 1.7084059715270996, acc = 0.4921875
Batch 22: loss = 1.5111571550369263, acc = 0.529296875
Batch 23: loss = 1.5585238933563232, acc = 0.5224609375
Batch 24: loss = 1.6123549938201904, acc = 0.525390625
Batch 25: loss = 1.4853136539459229, acc = 0.560546875
Batch 26: loss = 1.4858002662658691, acc = 0.5556640625
Batch 27: loss = 1.5188626050949097, acc = 0.5322265625
Batch 28: loss = 1.5111885070800781, acc = 0.5380859375
Batch 29: loss = 1.4778956174850464, acc = 0.5625
Batch 30: loss = 1.5123670101165771, acc = 0.55859375
Batch 31: loss = 1.586119532585144, acc = 0.5234375
Batch 32: loss = 1.7834978103637695, acc = 0.46484375
Batch 33: loss = 1.6079800128936768, acc = 0.5224609375
Batch 34: loss = 1.5680428743362427, acc = 0.5146484375
Batch 35: loss = 1.5563230514526367, acc = 0.533203125
Batch 36: loss = 1.6695468425750732, acc = 0.515625
Batch 37: loss = 1.6168240308761597, acc = 0.5400390625
Batch 38: loss = 1.6072347164154053, acc = 0.5185546875
Batch 39: loss = 1.5597343444824219, acc = 0.533203125
Batch 40: loss = 1.5342670679092407, acc = 0.5205078125
Batch 41: loss = 1.4210684299468994, acc = 0.5703125
Batch 42: loss = 1.4284924268722534, acc = 0.5380859375
Batch 43: loss = 1.5060468912124634, acc = 0.5185546875
Batch 44: loss = 1.515425443649292, acc = 0.548828125
Batch 45: loss = 1.3692721128463745, acc = 0.5908203125
Batch 46: loss = 1.4850261211395264, acc = 0.537109375
Batch 47: loss = 1.4696733951568604, acc = 0.552734375
Batch 48: loss = 1.405904769897461, acc = 0.5576171875
Batch 49: loss = 1.3978850841522217, acc = 0.5791015625
Batch 50: loss = 1.3992995023727417, acc = 0.5703125
Batch 51: loss = 1.4656224250793457, acc = 0.5380859375
Batch 52: loss = 1.580080509185791, acc = 0.5166015625
Batch 53: loss = 1.5560492277145386, acc = 0.51171875
Batch 54: loss = 1.4628970623016357, acc = 0.568359375
Batch 55: loss = 1.49594247341156, acc = 0.5546875
Batch 56: loss = 1.437619924545288, acc = 0.548828125
Batch 57: loss = 1.5457146167755127, acc = 0.494140625
Batch 58: loss = 1.5131821632385254, acc = 0.5361328125
Batch 59: loss = 1.29469633102417, acc = 0.6025390625
Batch 60: loss = 1.4472311735153198, acc = 0.5498046875
Batch 61: loss = 1.5681004524230957, acc = 0.529296875
Batch 62: loss = 1.6522669792175293, acc = 0.4794921875
Batch 63: loss = 1.639312744140625, acc = 0.5029296875
Batch 64: loss = 1.2426300048828125, acc = 0.6279296875
Batch 65: loss = 1.4528210163116455, acc = 0.5576171875
Batch 66: loss = 1.444413423538208, acc = 0.54296875
Batch 67: loss = 1.431443691253662, acc = 0.58203125
Batch 68: loss = 1.4601266384124756, acc = 0.5615234375
Batch 69: loss = 1.3851629495620728, acc = 0.5703125
Batch 70: loss = 1.5688163042068481, acc = 0.5087890625
Batch 71: loss = 1.4020183086395264, acc = 0.5654296875
Batch 72: loss = 1.3784745931625366, acc = 0.5849609375
Batch 73: loss = 1.503767728805542, acc = 0.533203125
Batch 74: loss = 1.4347527027130127, acc = 0.54296875
Batch 75: loss = 1.5082447528839111, acc = 0.517578125
Batch 76: loss = 1.5231221914291382, acc = 0.5263671875
Batch 77: loss = 1.326963186264038, acc = 0.59375
Batch 78: loss = 1.3381198644638062, acc = 0.580078125
Batch 79: loss = 1.2440199851989746, acc = 0.6240234375
Batch 80: loss = 1.3342610597610474, acc = 0.5634765625
Batch 81: loss = 1.3417084217071533, acc = 0.5830078125
Batch 82: loss = 1.295815110206604, acc = 0.6083984375
Batch 83: loss = 1.372413992881775, acc = 0.572265625
Batch 84: loss = 1.352799892425537, acc = 0.5849609375
Batch 85: loss = 1.451314926147461, acc = 0.5498046875
Batch 86: loss = 1.4389500617980957, acc = 0.568359375
Batch 87: loss = 1.3966171741485596, acc = 0.5986328125
Batch 88: loss = 1.599222183227539, acc = 0.5185546875
Batch 89: loss = 1.5080074071884155, acc = 0.5537109375
Batch 90: loss = 1.614975929260254, acc = 0.5146484375
Batch 91: loss = 1.5151299238204956, acc = 0.5224609375
Batch 92: loss = 1.3099496364593506, acc = 0.58984375
Batch 93: loss = 1.3074612617492676, acc = 0.599609375
Batch 94: loss = 1.3375568389892578, acc = 0.5947265625
Batch 95: loss = 1.408564805984497, acc = 0.5322265625
Batch 96: loss = 1.4151920080184937, acc = 0.5537109375
Batch 97: loss = 1.419140338897705, acc = 0.572265625
Batch 98: loss = 1.4190597534179688, acc = 0.5791015625
Batch 99: loss = 1.426762342453003, acc = 0.541015625
Batch 100: loss = 1.3658517599105835, acc = 0.55859375
Batch 101: loss = 1.4968757629394531, acc = 0.533203125
Batch 102: loss = 1.5403295755386353, acc = 0.541015625
Batch 103: loss = 1.4981763362884521, acc = 0.5439453125
Batch 104: loss = 1.3710696697235107, acc = 0.576171875
Batch 105: loss = 1.4374803304672241, acc = 0.5478515625
Batch 106: loss = 1.4731428623199463, acc = 0.5439453125
Batch 107: loss = 1.4026811122894287, acc = 0.556640625
Batch 108: loss = 1.395268201828003, acc = 0.5595703125
Batch 109: loss = 1.368475079536438, acc = 0.55078125
Batch 110: loss = 1.3529396057128906, acc = 0.5810546875
Batch 111: loss = 1.5620969533920288, acc = 0.5166015625
Batch 112: loss = 1.3634144067764282, acc = 0.564453125
Batch 113: loss = 1.4171531200408936, acc = 0.55078125
Batch 114: loss = 1.475853443145752, acc = 0.564453125
Batch 115: loss = 1.5191617012023926, acc = 0.5498046875
Batch 116: loss = 1.5188453197479248, acc = 0.52734375
Batch 117: loss = 1.3985536098480225, acc = 0.580078125
Batch 118: loss = 1.3866125345230103, acc = 0.564453125
Batch 119: loss = 1.4692407846450806, acc = 0.51953125
Batch 120: loss = 1.5310275554656982, acc = 0.517578125
Batch 121: loss = 1.3509331941604614, acc = 0.5634765625
Batch 122: loss = 1.3760803937911987, acc = 0.5703125
Batch 123: loss = 1.3960552215576172, acc = 0.564453125
Batch 124: loss = 1.4557362794876099, acc = 0.537109375
Batch 125: loss = 1.4211480617523193, acc = 0.548828125
Batch 126: loss = 1.540765404701233, acc = 0.5419921875
Epoch 6/100
Batch 1: loss = 1.9315521717071533, acc = 0.4833984375
Batch 2: loss = 1.502223253250122, acc = 0.529296875
Batch 3: loss = 1.5402510166168213, acc = 0.53515625
Batch 4: loss = 1.395174264907837, acc = 0.5556640625
Batch 5: loss = 1.5998222827911377, acc = 0.5029296875
Batch 6: loss = 1.61801016330719, acc = 0.517578125
Batch 7: loss = 1.578840970993042, acc = 0.525390625
Batch 8: loss = 1.4221124649047852, acc = 0.5625
Batch 9: loss = 1.4262405633926392, acc = 0.5419921875
Batch 10: loss = 1.2946531772613525, acc = 0.5986328125
Batch 11: loss = 1.4595897197723389, acc = 0.5390625
Batch 12: loss = 1.571437120437622, acc = 0.5126953125
Batch 13: loss = 1.476283311843872, acc = 0.537109375
Batch 14: loss = 1.3531785011291504, acc = 0.5625
Batch 15: loss = 1.4273473024368286, acc = 0.5771484375
Batch 16: loss = 1.522951602935791, acc = 0.513671875
Batch 17: loss = 1.458640217781067, acc = 0.5419921875
Batch 18: loss = 1.5558825731277466, acc = 0.5234375
Batch 19: loss = 1.5491985082626343, acc = 0.5322265625
Batch 20: loss = 1.3527979850769043, acc = 0.5732421875
Batch 21: loss = 1.584853172302246, acc = 0.5029296875
Batch 22: loss = 1.369291067123413, acc = 0.5712890625
Batch 23: loss = 1.4353551864624023, acc = 0.5361328125
Batch 24: loss = 1.4849257469177246, acc = 0.53515625
Batch 25: loss = 1.3559033870697021, acc = 0.57421875
Batch 26: loss = 1.3743318319320679, acc = 0.583984375
Batch 27: loss = 1.4338196516036987, acc = 0.5546875
Batch 28: loss = 1.415677785873413, acc = 0.546875
Batch 29: loss = 1.3702118396759033, acc = 0.59375
Batch 30: loss = 1.3603321313858032, acc = 0.5771484375
Batch 31: loss = 1.4863672256469727, acc = 0.537109375
Batch 32: loss = 1.6672344207763672, acc = 0.4814453125
Batch 33: loss = 1.4645100831985474, acc = 0.548828125
Batch 34: loss = 1.4863485097885132, acc = 0.53125
Batch 35: loss = 1.4548799991607666, acc = 0.54296875
Batch 36: loss = 1.5402699708938599, acc = 0.546875
Batch 37: loss = 1.521811604499817, acc = 0.5634765625
Batch 38: loss = 1.5299572944641113, acc = 0.53515625
Batch 39: loss = 1.4768110513687134, acc = 0.5361328125
Batch 40: loss = 1.4450714588165283, acc = 0.546875
Batch 41: loss = 1.3257054090499878, acc = 0.576171875
Batch 42: loss = 1.3736200332641602, acc = 0.5634765625
Batch 43: loss = 1.4458731412887573, acc = 0.5400390625
Batch 44: loss = 1.3904356956481934, acc = 0.5595703125
Batch 45: loss = 1.2639966011047363, acc = 0.6044921875
Batch 46: loss = 1.3793420791625977, acc = 0.5693359375
Batch 47: loss = 1.382025957107544, acc = 0.55859375
Batch 48: loss = 1.3020353317260742, acc = 0.5791015625
Batch 49: loss = 1.2741715908050537, acc = 0.5927734375
Batch 50: loss = 1.294905662536621, acc = 0.59375
Batch 51: loss = 1.360120177268982, acc = 0.5634765625
Batch 52: loss = 1.4850363731384277, acc = 0.52734375
Batch 53: loss = 1.459820032119751, acc = 0.53125
Batch 54: loss = 1.3580362796783447, acc = 0.5830078125
Batch 55: loss = 1.371085524559021, acc = 0.580078125
Batch 56: loss = 1.3300187587738037, acc = 0.564453125
Batch 57: loss = 1.4514625072479248, acc = 0.5126953125
Batch 58: loss = 1.4102078676223755, acc = 0.5537109375
Batch 59: loss = 1.1889535188674927, acc = 0.623046875
Batch 60: loss = 1.3514833450317383, acc = 0.5498046875
Batch 61: loss = 1.4593846797943115, acc = 0.5400390625
Batch 62: loss = 1.551917552947998, acc = 0.486328125
Batch 63: loss = 1.5478003025054932, acc = 0.5146484375
Batch 64: loss = 1.1386183500289917, acc = 0.6376953125
Batch 65: loss = 1.3817732334136963, acc = 0.5732421875
Batch 66: loss = 1.329563856124878, acc = 0.5751953125
Batch 67: loss = 1.3317127227783203, acc = 0.5986328125
Batch 68: loss = 1.3706095218658447, acc = 0.58984375
Batch 69: loss = 1.322316288948059, acc = 0.58203125
Batch 70: loss = 1.4439070224761963, acc = 0.544921875
Batch 71: loss = 1.3050638437271118, acc = 0.5810546875
Batch 72: loss = 1.2633998394012451, acc = 0.6005859375
Batch 73: loss = 1.4172146320343018, acc = 0.5537109375
Batch 74: loss = 1.3647761344909668, acc = 0.5390625
Batch 75: loss = 1.4412983655929565, acc = 0.537109375
Batch 76: loss = 1.446054220199585, acc = 0.54296875
Batch 77: loss = 1.2518343925476074, acc = 0.6025390625
Batch 78: loss = 1.2649849653244019, acc = 0.5927734375
Batch 79: loss = 1.1486788988113403, acc = 0.626953125
Batch 80: loss = 1.2631334066390991, acc = 0.560546875
Batch 81: loss = 1.2637226581573486, acc = 0.6044921875
Batch 82: loss = 1.2089431285858154, acc = 0.62109375
Batch 83: loss = 1.2902650833129883, acc = 0.5830078125
Batch 84: loss = 1.2895236015319824, acc = 0.5908203125
Batch 85: loss = 1.3878209590911865, acc = 0.556640625
Batch 86: loss = 1.3461568355560303, acc = 0.5869140625
Batch 87: loss = 1.296225905418396, acc = 0.615234375
Batch 88: loss = 1.5045137405395508, acc = 0.5302734375
Batch 89: loss = 1.3958749771118164, acc = 0.56640625
Batch 90: loss = 1.486483097076416, acc = 0.53515625
Batch 91: loss = 1.4482157230377197, acc = 0.54296875
Batch 92: loss = 1.2417532205581665, acc = 0.595703125
Batch 93: loss = 1.23195481300354, acc = 0.615234375
Batch 94: loss = 1.263588786125183, acc = 0.6064453125
Batch 95: loss = 1.3266751766204834, acc = 0.548828125
Batch 96: loss = 1.3282504081726074, acc = 0.5654296875
Batch 97: loss = 1.3402273654937744, acc = 0.5751953125
Batch 98: loss = 1.3367373943328857, acc = 0.603515625
Batch 99: loss = 1.3470829725265503, acc = 0.556640625
Batch 100: loss = 1.2930052280426025, acc = 0.5810546875
Batch 101: loss = 1.393386960029602, acc = 0.5556640625
Batch 102: loss = 1.4557414054870605, acc = 0.5400390625
Batch 103: loss = 1.4242725372314453, acc = 0.55859375
Batch 104: loss = 1.272107720375061, acc = 0.591796875
Batch 105: loss = 1.3494008779525757, acc = 0.5732421875
Batch 106: loss = 1.4194886684417725, acc = 0.5576171875
Batch 107: loss = 1.3279523849487305, acc = 0.572265625
Batch 108: loss = 1.338085651397705, acc = 0.5615234375
Batch 109: loss = 1.3065310716629028, acc = 0.5712890625
Batch 110: loss = 1.2691212892532349, acc = 0.5947265625
Batch 111: loss = 1.4756088256835938, acc = 0.53515625
Batch 112: loss = 1.291202187538147, acc = 0.5771484375
Batch 113: loss = 1.3276047706604004, acc = 0.564453125
Batch 114: loss = 1.390828013420105, acc = 0.5830078125
Batch 115: loss = 1.4630873203277588, acc = 0.5478515625
Batch 116: loss = 1.4615644216537476, acc = 0.5361328125
Batch 117: loss = 1.3363871574401855, acc = 0.5986328125
Batch 118: loss = 1.3057982921600342, acc = 0.56640625
Batch 119: loss = 1.429585576057434, acc = 0.5283203125
Batch 120: loss = 1.4601539373397827, acc = 0.5283203125
Batch 121: loss = 1.2985029220581055, acc = 0.572265625
Batch 122: loss = 1.3175666332244873, acc = 0.587890625
Batch 123: loss = 1.3357853889465332, acc = 0.576171875
Batch 124: loss = 1.3886535167694092, acc = 0.5478515625
Batch 125: loss = 1.3596996068954468, acc = 0.5732421875
Batch 126: loss = 1.4686166048049927, acc = 0.556640625
Epoch 7/100
Batch 1: loss = 1.8302080631256104, acc = 0.4931640625
Batch 2: loss = 1.448176383972168, acc = 0.541015625
Batch 3: loss = 1.454297423362732, acc = 0.5546875
Batch 4: loss = 1.3474721908569336, acc = 0.572265625
Batch 5: loss = 1.5251286029815674, acc = 0.5205078125
Batch 6: loss = 1.5451326370239258, acc = 0.52734375
Batch 7: loss = 1.5001229047775269, acc = 0.525390625
Batch 8: loss = 1.354753851890564, acc = 0.568359375
Batch 9: loss = 1.3390129804611206, acc = 0.57421875
Batch 10: loss = 1.2114009857177734, acc = 0.611328125
Batch 11: loss = 1.384641408920288, acc = 0.5546875
Batch 12: loss = 1.522101879119873, acc = 0.5146484375
Batch 13: loss = 1.3917925357818604, acc = 0.5498046875
Batch 14: loss = 1.2949354648590088, acc = 0.5810546875
Batch 15: loss = 1.3492867946624756, acc = 0.5791015625
Batch 16: loss = 1.470292329788208, acc = 0.5283203125
Batch 17: loss = 1.4090275764465332, acc = 0.564453125
Batch 18: loss = 1.4856314659118652, acc = 0.529296875
Batch 19: loss = 1.4692955017089844, acc = 0.5458984375
Batch 20: loss = 1.289626121520996, acc = 0.5859375
Batch 21: loss = 1.5217982530593872, acc = 0.529296875
Batch 22: loss = 1.2941443920135498, acc = 0.5791015625
Batch 23: loss = 1.349380612373352, acc = 0.544921875
Batch 24: loss = 1.4202171564102173, acc = 0.5546875
Batch 25: loss = 1.2805981636047363, acc = 0.591796875
Batch 26: loss = 1.3241233825683594, acc = 0.58203125
Batch 27: loss = 1.3845229148864746, acc = 0.5498046875
Batch 28: loss = 1.3551045656204224, acc = 0.5712890625
Batch 29: loss = 1.3181798458099365, acc = 0.607421875
Batch 30: loss = 1.3123897314071655, acc = 0.583984375
Batch 31: loss = 1.4171767234802246, acc = 0.5595703125
Batch 32: loss = 1.5915563106536865, acc = 0.5068359375
Batch 33: loss = 1.386869192123413, acc = 0.564453125
Batch 34: loss = 1.392135500907898, acc = 0.5478515625
Batch 35: loss = 1.3934085369110107, acc = 0.556640625
Batch 36: loss = 1.4442203044891357, acc = 0.5556640625
Batch 37: loss = 1.4487361907958984, acc = 0.5771484375
Batch 38: loss = 1.4371356964111328, acc = 0.544921875
Batch 39: loss = 1.3889515399932861, acc = 0.5546875
Batch 40: loss = 1.3879884481430054, acc = 0.5615234375
Batch 41: loss = 1.2733652591705322, acc = 0.59375
Batch 42: loss = 1.2993764877319336, acc = 0.5712890625
Batch 43: loss = 1.3711656332015991, acc = 0.544921875
Batch 44: loss = 1.337634801864624, acc = 0.5751953125
Batch 45: loss = 1.189774990081787, acc = 0.615234375
Batch 46: loss = 1.3318696022033691, acc = 0.580078125
Batch 47: loss = 1.2812776565551758, acc = 0.5732421875
Batch 48: loss = 1.2240759134292603, acc = 0.603515625
Batch 49: loss = 1.1968662738800049, acc = 0.626953125
Batch 50: loss = 1.244269847869873, acc = 0.6044921875
Batch 51: loss = 1.301901936531067, acc = 0.5595703125
Batch 52: loss = 1.4221737384796143, acc = 0.53125
Batch 53: loss = 1.3936803340911865, acc = 0.5458984375
Batch 54: loss = 1.2843091487884521, acc = 0.603515625
Batch 55: loss = 1.313774824142456, acc = 0.58203125
Batch 56: loss = 1.291394829750061, acc = 0.5751953125
Batch 57: loss = 1.3884791135787964, acc = 0.521484375
Batch 58: loss = 1.34963059425354, acc = 0.56640625
Batch 59: loss = 1.1095308065414429, acc = 0.6337890625
Batch 60: loss = 1.284639596939087, acc = 0.578125
Batch 61: loss = 1.3829598426818848, acc = 0.564453125
Batch 62: loss = 1.4853688478469849, acc = 0.498046875
Batch 63: loss = 1.4684219360351562, acc = 0.52734375
Batch 64: loss = 1.077593207359314, acc = 0.650390625
Batch 65: loss = 1.3551369905471802, acc = 0.568359375
Batch 66: loss = 1.3026520013809204, acc = 0.5849609375
Batch 67: loss = 1.2619688510894775, acc = 0.6259765625
Batch 68: loss = 1.3166654109954834, acc = 0.595703125
Batch 69: loss = 1.2634284496307373, acc = 0.599609375
Batch 70: loss = 1.4051485061645508, acc = 0.5537109375
Batch 71: loss = 1.2637171745300293, acc = 0.6005859375
Batch 72: loss = 1.199277400970459, acc = 0.6162109375
Batch 73: loss = 1.3675044775009155, acc = 0.5615234375
Batch 74: loss = 1.3314542770385742, acc = 0.564453125
Batch 75: loss = 1.4133286476135254, acc = 0.5205078125
Batch 76: loss = 1.4005331993103027, acc = 0.5625
Batch 77: loss = 1.1986815929412842, acc = 0.6279296875
Batch 78: loss = 1.2256653308868408, acc = 0.58984375
Batch 79: loss = 1.0984009504318237, acc = 0.6396484375
Batch 80: loss = 1.2122653722763062, acc = 0.5830078125
Batch 81: loss = 1.2062444686889648, acc = 0.6083984375
Batch 82: loss = 1.144284725189209, acc = 0.62890625
Batch 83: loss = 1.2590044736862183, acc = 0.59765625
Batch 84: loss = 1.2487449645996094, acc = 0.6083984375
Batch 85: loss = 1.3543469905853271, acc = 0.572265625
Batch 86: loss = 1.3028497695922852, acc = 0.5849609375
Batch 87: loss = 1.2440742254257202, acc = 0.625
Batch 88: loss = 1.4393460750579834, acc = 0.544921875
Batch 89: loss = 1.3369061946868896, acc = 0.576171875
Batch 90: loss = 1.4504499435424805, acc = 0.5537109375
Batch 91: loss = 1.3845469951629639, acc = 0.5546875
Batch 92: loss = 1.2120749950408936, acc = 0.6025390625
Batch 93: loss = 1.167757511138916, acc = 0.6357421875
Batch 94: loss = 1.2204668521881104, acc = 0.6025390625
Batch 95: loss = 1.2829262018203735, acc = 0.5576171875
Batch 96: loss = 1.3098255395889282, acc = 0.5556640625
Batch 97: loss = 1.2893365621566772, acc = 0.5791015625
Batch 98: loss = 1.2889957427978516, acc = 0.6044921875
Batch 99: loss = 1.3047351837158203, acc = 0.5693359375
Batch 100: loss = 1.2652453184127808, acc = 0.5908203125
Batch 101: loss = 1.3335552215576172, acc = 0.5693359375
Batch 102: loss = 1.3951983451843262, acc = 0.572265625
Batch 103: loss = 1.3572778701782227, acc = 0.57421875
Batch 104: loss = 1.2238569259643555, acc = 0.603515625
Batch 105: loss = 1.300654411315918, acc = 0.587890625
Batch 106: loss = 1.3644709587097168, acc = 0.5576171875
Batch 107: loss = 1.272972583770752, acc = 0.5908203125
Batch 108: loss = 1.2787046432495117, acc = 0.583984375
Batch 109: loss = 1.2607715129852295, acc = 0.5771484375
Batch 110: loss = 1.22431218624115, acc = 0.611328125
Batch 111: loss = 1.4141873121261597, acc = 0.5498046875
Batch 112: loss = 1.238264799118042, acc = 0.6044921875
Batch 113: loss = 1.286467432975769, acc = 0.578125
Batch 114: loss = 1.3342657089233398, acc = 0.6044921875
Batch 115: loss = 1.3953051567077637, acc = 0.5625
Batch 116: loss = 1.3910677433013916, acc = 0.548828125
Batch 117: loss = 1.2642887830734253, acc = 0.6123046875
Batch 118: loss = 1.2428429126739502, acc = 0.58984375
Batch 119: loss = 1.3453738689422607, acc = 0.548828125
Batch 120: loss = 1.395221471786499, acc = 0.55859375
Batch 121: loss = 1.2213882207870483, acc = 0.5947265625
Batch 122: loss = 1.2457201480865479, acc = 0.6005859375
Batch 123: loss = 1.2662208080291748, acc = 0.5966796875
Batch 124: loss = 1.3422988653182983, acc = 0.5537109375
Batch 125: loss = 1.3307318687438965, acc = 0.5732421875
Batch 126: loss = 1.4468778371810913, acc = 0.5673828125
Epoch 8/100
Batch 1: loss = 1.7242096662521362, acc = 0.525390625
Batch 2: loss = 1.4119117259979248, acc = 0.5703125
Batch 3: loss = 1.3692104816436768, acc = 0.5673828125
Batch 4: loss = 1.3185925483703613, acc = 0.57421875
Batch 5: loss = 1.463935136795044, acc = 0.5419921875
Batch 6: loss = 1.4952950477600098, acc = 0.53125
Batch 7: loss = 1.4710556268692017, acc = 0.5263671875
Batch 8: loss = 1.2774760723114014, acc = 0.5849609375
Batch 9: loss = 1.2981263399124146, acc = 0.5888671875
Batch 10: loss = 1.1922290325164795, acc = 0.6142578125
Batch 11: loss = 1.3873662948608398, acc = 0.54296875
Batch 12: loss = 1.4614207744598389, acc = 0.53125
Batch 13: loss = 1.3422315120697021, acc = 0.55859375
Batch 14: loss = 1.2498230934143066, acc = 0.58984375
Batch 15: loss = 1.284131407737732, acc = 0.5947265625
Batch 16: loss = 1.441833257675171, acc = 0.5322265625
Batch 17: loss = 1.3522543907165527, acc = 0.572265625
Batch 18: loss = 1.4298827648162842, acc = 0.544921875
Batch 19: loss = 1.394632339477539, acc = 0.5673828125
Batch 20: loss = 1.252681016921997, acc = 0.6025390625
Batch 21: loss = 1.464604139328003, acc = 0.53125
Batch 22: loss = 1.2270641326904297, acc = 0.580078125
Batch 23: loss = 1.3109849691390991, acc = 0.55859375
Batch 24: loss = 1.3468579053878784, acc = 0.5654296875
Batch 25: loss = 1.2269630432128906, acc = 0.6083984375
Batch 26: loss = 1.2342369556427002, acc = 0.6064453125
Batch 27: loss = 1.3269615173339844, acc = 0.580078125
Batch 28: loss = 1.3225510120391846, acc = 0.5712890625
Batch 29: loss = 1.2855600118637085, acc = 0.609375
Batch 30: loss = 1.2411303520202637, acc = 0.603515625
Batch 31: loss = 1.378375768661499, acc = 0.5537109375
Batch 32: loss = 1.5365511178970337, acc = 0.5146484375
Batch 33: loss = 1.3137623071670532, acc = 0.583984375
Batch 34: loss = 1.3573081493377686, acc = 0.5478515625
Batch 35: loss = 1.3464511632919312, acc = 0.5576171875
Batch 36: loss = 1.3720927238464355, acc = 0.5771484375
Batch 37: loss = 1.3756004571914673, acc = 0.5830078125
Batch 38: loss = 1.4003269672393799, acc = 0.5771484375
Batch 39: loss = 1.3372294902801514, acc = 0.564453125
Batch 40: loss = 1.320343017578125, acc = 0.5625
Batch 41: loss = 1.2240877151489258, acc = 0.6015625
Batch 42: loss = 1.2536191940307617, acc = 0.5869140625
Batch 43: loss = 1.3357446193695068, acc = 0.55078125
Batch 44: loss = 1.2867484092712402, acc = 0.5830078125
Batch 45: loss = 1.1476775407791138, acc = 0.6337890625
Batch 46: loss = 1.2863048315048218, acc = 0.587890625
Batch 47: loss = 1.2548637390136719, acc = 0.587890625
Batch 48: loss = 1.2005133628845215, acc = 0.595703125
Batch 49: loss = 1.1470768451690674, acc = 0.6318359375
Batch 50: loss = 1.1893696784973145, acc = 0.625
Batch 51: loss = 1.2320265769958496, acc = 0.580078125
Batch 52: loss = 1.3626965284347534, acc = 0.5517578125
Batch 53: loss = 1.3452521562576294, acc = 0.55078125
Batch 54: loss = 1.2581596374511719, acc = 0.6103515625
Batch 55: loss = 1.2500410079956055, acc = 0.5986328125
Batch 56: loss = 1.2515017986297607, acc = 0.580078125
Batch 57: loss = 1.3437796831130981, acc = 0.544921875
Batch 58: loss = 1.2974298000335693, acc = 0.5830078125
Batch 59: loss = 1.0588221549987793, acc = 0.654296875
Batch 60: loss = 1.2527270317077637, acc = 0.5791015625
Batch 61: loss = 1.323241949081421, acc = 0.5654296875
Batch 62: loss = 1.4835164546966553, acc = 0.49609375
Batch 63: loss = 1.4274349212646484, acc = 0.5419921875
Batch 64: loss = 1.0460166931152344, acc = 0.6435546875
Batch 65: loss = 1.3214821815490723, acc = 0.5791015625
Batch 66: loss = 1.2566176652908325, acc = 0.5986328125
Batch 67: loss = 1.2420063018798828, acc = 0.6240234375
Batch 68: loss = 1.3221515417099, acc = 0.591796875
Batch 69: loss = 1.247521162033081, acc = 0.6083984375
Batch 70: loss = 1.3585247993469238, acc = 0.576171875
Batch 71: loss = 1.200092077255249, acc = 0.615234375
Batch 72: loss = 1.169158935546875, acc = 0.6162109375