experiment.log
#####################################################
Generating data ...
#####################################################
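The log does not show how this data is produced; in a SeqGAN-style synthetic experiment the "real" training sequences are typically sampled from a fixed, randomly initialized oracle LSTM. The sketch below only illustrates that assumption; every name and hyperparameter in it is hypothetical, not taken from this repository.

# Hypothetical sketch of the data-generation phase, assuming a SeqGAN-style
# oracle LSTM that autoregressively samples the "real" sequences.
import torch
import torch.nn as nn

class OracleLSTM(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=32, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    @torch.no_grad()
    def sample(self, batch_size, seq_len, start_token=0):
        tokens = torch.full((batch_size, 1), start_token, dtype=torch.long)
        hidden, samples = None, []
        for _ in range(seq_len):
            emb = self.embed(tokens[:, -1:])                 # (B, 1, E)
            out, hidden = self.lstm(emb, hidden)             # (B, 1, H)
            probs = torch.softmax(self.out(out[:, -1]), dim=-1)
            nxt = torch.multinomial(probs, 1)                # (B, 1)
            samples.append(nxt)
            tokens = torch.cat([tokens, nxt], dim=1)
        return torch.cat(samples, dim=1)                     # (B, seq_len)

oracle = OracleLSTM()
real_batch = oracle.sample(batch_size=64, seq_len=20)        # one batch of "real" data
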
#####################################################
Start pre-training generator with MLE...
#####################################################
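Each G-Step below reports the teacher-forced negative log-likelihood (cross-entropy over next-token prediction) on a training pass and on held-out data; the eval loss is presumably the same quantity on a held-out set, which is why it sits above the train loss throughout. A minimal sketch of one such step, assuming a PyTorch generator that returns per-token logits (all names are illustrative, not taken from this repository):

# Minimal sketch of one MLE pre-training pass over the data (one "G-Step").
import torch
import torch.nn.functional as F

def mle_pretrain_step(generator, data_loader, optimizer):
    generator.train()
    total_loss, n_batches = 0.0, 0
    for seqs in data_loader:                          # seqs: (B, T) token ids
        inputs, targets = seqs[:, :-1], seqs[:, 1:]   # teacher forcing
        logits = generator(inputs)                    # (B, T-1, vocab)
        loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               targets.reshape(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
        n_batches += 1
    return total_loss / n_batches                     # the "train loss" printed per G-Step
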
G-Step 0
Epoch 0, train loss: 1.95778
eval loss: 3.18820
G-Step 1
Epoch 0, train loss: 1.50074
eval loss: 2.57729
G-Step 2
Epoch 0, train loss: 1.35728
eval loss: 2.26309
G-Step 3
Epoch 0, train loss: 1.28384
eval loss: 2.12456
G-Step 4
Epoch 0, train loss: 1.24565
eval loss: 2.01085
G-Step 5
Epoch 0, train loss: 1.21686
eval loss: 1.86459
G-Step 6
Epoch 0, train loss: 1.18300
eval loss: 1.80417
G-Step 7
Epoch 0, train loss: 1.15361
eval loss: 1.70990
G-Step 8
Epoch 0, train loss: 1.12995
eval loss: 1.63818
G-Step 9
Epoch 0, train loss: 1.10801
eval loss: 1.60416
G-Step 10
Epoch 0, train loss: 1.09192
eval loss: 1.54670
G-Step 11
Epoch 0, train loss: 1.07958
eval loss: 1.53503
G-Step 12
Epoch 0, train loss: 1.06872
eval loss: 1.48851
G-Step 13
Epoch 0, train loss: 1.06266
eval loss: 1.49841
G-Step 14
Epoch 0, train loss: 1.05296
eval loss: 1.42393
G-Step 15
Epoch 0, train loss: 1.04644
eval loss: 1.42307
G-Step 16
Epoch 0, train loss: 1.04078
eval loss: 1.47327
G-Step 17
Epoch 0, train loss: 1.03412
eval loss: 1.45094
G-Step 18
Epoch 0, train loss: 1.03002
eval loss: 1.42999
G-Step 19
Epoch 0, train loss: 1.02369
eval loss: 1.41396
G-Step 20
Epoch 0, train loss: 1.02134
eval loss: 1.40341
G-Step 21
Epoch 0, train loss: 1.01532
eval loss: 1.39258
G-Step 22
Epoch 0, train loss: 1.01128
eval loss: 1.39962
G-Step 23
Epoch 0, train loss: 1.00942
eval loss: 1.39466
G-Step 24
Epoch 0, train loss: 1.00385
eval loss: 1.39647
G-Step 25
Epoch 0, train loss: 1.00019
eval loss: 1.38826
G-Step 26
Epoch 0, train loss: 0.99834
eval loss: 1.37058
G-Step 27
Epoch 0, train loss: 0.99407
eval loss: 1.36701
G-Step 28
Epoch 0, train loss: 0.99197
eval loss: 1.34471
G-Step 29
Epoch 0, train loss: 0.98912
eval loss: 1.37130
G-Step 30
Epoch 0, train loss: 0.98699
eval loss: 1.37200
G-Step 31
Epoch 0, train loss: 0.98498
eval loss: 1.35384
G-Step 32
Epoch 0, train loss: 0.98197
eval loss: 1.38496
G-Step 33
Epoch 0, train loss: 0.98035
eval loss: 1.34161
G-Step 34
Epoch 0, train loss: 0.97819
eval loss: 1.38768
G-Step 35
Epoch 0, train loss: 0.97611
eval loss: 1.31921
G-Step 36
Epoch 0, train loss: 0.97550
eval loss: 1.34016
G-Step 37
Epoch 0, train loss: 0.97239
eval loss: 1.32903
G-Step 38
Epoch 0, train loss: 0.97530
eval loss: 1.36813
G-Step 39
Epoch 0, train loss: 0.96995
eval loss: 1.33078
G-Step 40
Epoch 0, train loss: 0.96882
eval loss: 1.33785
G-Step 41
Epoch 0, train loss: 0.96655
eval loss: 1.38280
G-Step 42
Epoch 0, train loss: 0.96658
eval loss: 1.37037
G-Step 43
Epoch 0, train loss: 0.96421
eval loss: 1.31964
G-Step 44
Epoch 0, train loss: 0.96472
eval loss: 1.35492
G-Step 45
Epoch 0, train loss: 0.96184
eval loss: 1.33095
G-Step 46
Epoch 0, train loss: 0.96400
eval loss: 1.36843
G-Step 47
Epoch 0, train loss: 0.96008
eval loss: 1.27642
G-Step 48
Epoch 0, train loss: 0.95940
eval loss: 1.30638
G-Step 49
Epoch 0, train loss: 0.95776
eval loss: 1.32153
G-Step 50
Epoch 0, train loss: 0.95777
eval loss: 1.32579
G-Step 51
Epoch 0, train loss: 0.95653
eval loss: 1.29442
G-Step 52
Epoch 0, train loss: 0.95414
eval loss: 1.30739
G-Step 53
Epoch 0, train loss: 0.95388
eval loss: 1.29043
G-Step 54
Epoch 0, train loss: 0.95249
eval loss: 1.34594
G-Step 55
Epoch 0, train loss: 0.95218
eval loss: 1.31625
G-Step 56
Epoch 0, train loss: 0.95240
eval loss: 1.33682
G-Step 57
Epoch 0, train loss: 0.95100
eval loss: 1.31346
G-Step 58
Epoch 0, train loss: 0.94944
eval loss: 1.30011
G-Step 59
Epoch 0, train loss: 0.94933
eval loss: 1.33010
G-Step 60
Epoch 0, train loss: 0.94776
eval loss: 1.29219
G-Step 61
Epoch 0, train loss: 0.94678
eval loss: 1.32799
G-Step 62
Epoch 0, train loss: 0.94930
eval loss: 1.30732
G-Step 63
Epoch 0, train loss: 0.94491
eval loss: 1.31615
G-Step 64
Epoch 0, train loss: 0.94496
eval loss: 1.31841
G-Step 65
Epoch 0, train loss: 0.95073
eval loss: 1.28885
G-Step 66
Epoch 0, train loss: 0.94259
eval loss: 1.29647
G-Step 67
Epoch 0, train loss: 0.94230
eval loss: 1.27450
G-Step 68
Epoch 0, train loss: 0.94252
eval loss: 1.28120
G-Step 69
Epoch 0, train loss: 0.94075
eval loss: 1.29456
G-Step 70
Epoch 0, train loss: 0.94020
eval loss: 1.27954
G-Step 71
Epoch 0, train loss: 0.93972
eval loss: 1.32796
G-Step 72
Epoch 0, train loss: 0.94125
eval loss: 1.28191
G-Step 73
Epoch 0, train loss: 0.93847
eval loss: 1.24182
G-Step 74
Epoch 0, train loss: 0.93806
eval loss: 1.24173
G-Step 75
Epoch 0, train loss: 0.93879
eval loss: 1.25873
G-Step 76
Epoch 0, train loss: 0.93686
eval loss: 1.32008
G-Step 77
Epoch 0, train loss: 0.94260
eval loss: 1.28654
G-Step 78
Epoch 0, train loss: 0.93536
eval loss: 1.31335
G-Step 79
Epoch 0, train loss: 0.93478
eval loss: 1.27942
G-Step 80
Epoch 0, train loss: 0.93418
eval loss: 1.24272
G-Step 81
Epoch 0, train loss: 0.93410
eval loss: 1.30203
G-Step 82
Epoch 0, train loss: 0.93365
eval loss: 1.30017
G-Step 83
Epoch 0, train loss: 0.93348
eval loss: 1.30062
G-Step 84
Epoch 0, train loss: 0.93278
eval loss: 1.30732
G-Step 85
Epoch 0, train loss: 0.93397
eval loss: 1.29220
G-Step 86
Epoch 0, train loss: 0.93131
eval loss: 1.29157
G-Step 87
Epoch 0, train loss: 0.93924
eval loss: 1.28801
G-Step 88
Epoch 0, train loss: 0.93043
eval loss: 1.29633
G-Step 89
Epoch 0, train loss: 0.93007
eval loss: 1.26467
G-Step 90
Epoch 0, train loss: 0.92940
eval loss: 1.31685
G-Step 91
Epoch 0, train loss: 0.92923
eval loss: 1.27251
G-Step 92
Epoch 0, train loss: 0.92872
eval loss: 1.31288
G-Step 93
Epoch 0, train loss: 0.92970
eval loss: 1.21818
G-Step 94
Epoch 0, train loss: 0.92744
eval loss: 1.27318
G-Step 95
Epoch 0, train loss: 0.92740
eval loss: 1.25763
G-Step 96
Epoch 0, train loss: 0.92722
eval loss: 1.29681
G-Step 97
Epoch 0, train loss: 0.93106
eval loss: 1.27471
G-Step 98
Epoch 0, train loss: 0.93097
eval loss: 1.24930
G-Step 99
Epoch 0, train loss: 0.92590
eval loss: 1.27641
G-Step 100
Epoch 0, train loss: 0.92548
eval loss: 1.27268
G-Step 101
Epoch 0, train loss: 0.92545
eval loss: 1.26987
G-Step 102
Epoch 0, train loss: 0.92475
eval loss: 1.23178
G-Step 103
Epoch 0, train loss: 0.92469
eval loss: 1.24827
G-Step 104
Epoch 0, train loss: 0.92484
eval loss: 1.26548
G-Step 105
Epoch 0, train loss: 0.92445
eval loss: 1.25300
G-Step 106
Epoch 0, train loss: 0.92328
eval loss: 1.29323
G-Step 107
Epoch 0, train loss: 0.92349
eval loss: 1.30007
G-Step 108
Epoch 0, train loss: 0.93742
eval loss: 1.23213
G-Step 109
Epoch 0, train loss: 0.92270
eval loss: 1.30379
G-Step 110
Epoch 0, train loss: 0.92200
eval loss: 1.30377
G-Step 111
Epoch 0, train loss: 0.92192
eval loss: 1.23835
G-Step 112
Epoch 0, train loss: 0.92191
eval loss: 1.26244
G-Step 113
Epoch 0, train loss: 0.92150
eval loss: 1.25912
G-Step 114
Epoch 0, train loss: 0.92103
eval loss: 1.21884
G-Step 115
Epoch 0, train loss: 0.93700
eval loss: 1.25221
G-Step 116
Epoch 0, train loss: 0.92023
eval loss: 1.30946
G-Step 117
Epoch 0, train loss: 0.92033
eval loss: 1.28230
G-Step 118
Epoch 0, train loss: 0.92024
eval loss: 1.25854
G-Step 119
Epoch 0, train loss: 0.91975
eval loss: 1.26829
#####################################################
#####################################################
Start pre-training discriminator...
#####################################################
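Each D-Step below reports binary-classification loss and accuracy over three epochs on a mix of real sequences and generator samples, followed by an evaluation pass. A minimal sketch of one such epoch, assuming the discriminator outputs a single real/fake logit per sequence (all names, including generator.sample, are illustrative):

# Minimal sketch of one discriminator pre-training epoch.
import torch
import torch.nn.functional as F

def d_pretrain_epoch(discriminator, generator, real_loader, optimizer, seq_len=20):
    discriminator.train()
    total_loss, total_correct, total_n = 0.0, 0, 0
    for real in real_loader:                                   # real: (B, T)
        with torch.no_grad():
            fake = generator.sample(real.size(0), seq_len)     # assumed sampling helper
        x = torch.cat([real, fake], dim=0)
        y = torch.cat([torch.ones(real.size(0)), torch.zeros(fake.size(0))])
        logits = discriminator(x).squeeze(-1)                  # (2B,) real/fake logits
        loss = F.binary_cross_entropy_with_logits(logits, y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        preds = (logits > 0).float()
        total_correct += (preds == y).sum().item()
        total_loss += loss.item() * y.numel()
        total_n += y.numel()
    return total_loss / total_n, total_correct / total_n       # "train loss", "train acc"
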
D-Step 0
Epoch 0, train loss: 0.70015, train acc: 0.507
Epoch 1, train loss: 0.69971, train acc: 0.508
Epoch 2, train loss: 0.69748, train acc: 0.511
eval loss: 0.69997, eval acc: 0.498
D-Step 1
Epoch 0, train loss: 0.69881, train acc: 0.508
Epoch 1, train loss: 0.69754, train acc: 0.512
Epoch 2, train loss: 0.69720, train acc: 0.513
eval loss: 0.69818, eval acc: 0.510
D-Step 2
Epoch 0, train loss: 0.69719, train acc: 0.512
Epoch 1, train loss: 0.69603, train acc: 0.519
Epoch 2, train loss: 0.69635, train acc: 0.512
eval loss: 0.69664, eval acc: 0.512
D-Step 3
Epoch 0, train loss: 0.69465, train acc: 0.515
Epoch 1, train loss: 0.69580, train acc: 0.513
Epoch 2, train loss: 0.69621, train acc: 0.508
eval loss: 0.69404, eval acc: 0.516
D-Step 4
Epoch 0, train loss: 0.69409, train acc: 0.522
Epoch 1, train loss: 0.69523, train acc: 0.517
Epoch 2, train loss: 0.69362, train acc: 0.521
eval loss: 0.69440, eval acc: 0.517
D-Step 5
Epoch 0, train loss: 0.69302, train acc: 0.521
Epoch 1, train loss: 0.69302, train acc: 0.525
Epoch 2, train loss: 0.69302, train acc: 0.523
eval loss: 0.69153, eval acc: 0.530
D-Step 6
Epoch 0, train loss: 0.69348, train acc: 0.526
Epoch 1, train loss: 0.69417, train acc: 0.517
Epoch 2, train loss: 0.69278, train acc: 0.520
eval loss: 0.69280, eval acc: 0.525
D-Step 7
Epoch 0, train loss: 0.69258, train acc: 0.521
Epoch 1, train loss: 0.69173, train acc: 0.525
Epoch 2, train loss: 0.69013, train acc: 0.533
eval loss: 0.69002, eval acc: 0.534
D-Step 8
Epoch 0, train loss: 0.69124, train acc: 0.530
Epoch 1, train loss: 0.69052, train acc: 0.533
Epoch 2, train loss: 0.69059, train acc: 0.531
eval loss: 0.68993, eval acc: 0.536
D-Step 9
Epoch 0, train loss: 0.69073, train acc: 0.527
Epoch 1, train loss: 0.69025, train acc: 0.532
Epoch 2, train loss: 0.69031, train acc: 0.533
eval loss: 0.69027, eval acc: 0.528
D-Step 10
Epoch 0, train loss: 0.68959, train acc: 0.533
Epoch 1, train loss: 0.68967, train acc: 0.536
Epoch 2, train loss: 0.68837, train acc: 0.543
eval loss: 0.68955, eval acc: 0.532
D-Step 11
Epoch 0, train loss: 0.68969, train acc: 0.533
Epoch 1, train loss: 0.68845, train acc: 0.544
Epoch 2, train loss: 0.68937, train acc: 0.537
eval loss: 0.68922, eval acc: 0.538
D-Step 12
Epoch 0, train loss: 0.69049, train acc: 0.531
Epoch 1, train loss: 0.69001, train acc: 0.530
Epoch 2, train loss: 0.68920, train acc: 0.536
eval loss: 0.68884, eval acc: 0.536
D-Step 13
Epoch 0, train loss: 0.68788, train acc: 0.540
Epoch 1, train loss: 0.68718, train acc: 0.546
Epoch 2, train loss: 0.68692, train acc: 0.545
eval loss: 0.68823, eval acc: 0.537
D-Step 14
Epoch 0, train loss: 0.68812, train acc: 0.541
Epoch 1, train loss: 0.68720, train acc: 0.539
Epoch 2, train loss: 0.68878, train acc: 0.537
eval loss: 0.68619, eval acc: 0.543
D-Step 15
Epoch 0, train loss: 0.68763, train acc: 0.543
Epoch 1, train loss: 0.68535, train acc: 0.547
Epoch 2, train loss: 0.68745, train acc: 0.542
eval loss: 0.68894, eval acc: 0.538
D-Step 16
Epoch 0, train loss: 0.68695, train acc: 0.545
Epoch 1, train loss: 0.68634, train acc: 0.545
Epoch 2, train loss: 0.68431, train acc: 0.550
eval loss: 0.68717, eval acc: 0.539
D-Step 17
Epoch 0, train loss: 0.68566, train acc: 0.550
Epoch 1, train loss: 0.68478, train acc: 0.545
Epoch 2, train loss: 0.68417, train acc: 0.548
eval loss: 0.68677, eval acc: 0.543
D-Step 18
Epoch 0, train loss: 0.68527, train acc: 0.546
Epoch 1, train loss: 0.68477, train acc: 0.550
Epoch 2, train loss: 0.68426, train acc: 0.551
eval loss: 0.68268, eval acc: 0.560
D-Step 19
Epoch 0, train loss: 0.68557, train acc: 0.546
Epoch 1, train loss: 0.68526, train acc: 0.551
Epoch 2, train loss: 0.68440, train acc: 0.554
eval loss: 0.68520, eval acc: 0.548
D-Step 20
Epoch 0, train loss: 0.68512, train acc: 0.551
Epoch 1, train loss: 0.68503, train acc: 0.548
Epoch 2, train loss: 0.68443, train acc: 0.556
eval loss: 0.68417, eval acc: 0.550
D-Step 21
Epoch 0, train loss: 0.68278, train acc: 0.556
Epoch 1, train loss: 0.68351, train acc: 0.552
Epoch 2, train loss: 0.68361, train acc: 0.558
eval loss: 0.68319, eval acc: 0.554
D-Step 22
Epoch 0, train loss: 0.68333, train acc: 0.558
Epoch 1, train loss: 0.68297, train acc: 0.554
Epoch 2, train loss: 0.68216, train acc: 0.554
eval loss: 0.68182, eval acc: 0.557
D-Step 23
Epoch 0, train loss: 0.68493, train acc: 0.550
Epoch 1, train loss: 0.68240, train acc: 0.553
Epoch 2, train loss: 0.68289, train acc: 0.556
eval loss: 0.68181, eval acc: 0.560
D-Step 24
Epoch 0, train loss: 0.68148, train acc: 0.569
Epoch 1, train loss: 0.68076, train acc: 0.566
Epoch 2, train loss: 0.67999, train acc: 0.562
eval loss: 0.67891, eval acc: 0.572
D-Step 25
Epoch 0, train loss: 0.68040, train acc: 0.565
Epoch 1, train loss: 0.68061, train acc: 0.562
Epoch 2, train loss: 0.68051, train acc: 0.565
eval loss: 0.67854, eval acc: 0.568
D-Step 26
Epoch 0, train loss: 0.68019, train acc: 0.566
Epoch 1, train loss: 0.68154, train acc: 0.559
Epoch 2, train loss: 0.67940, train acc: 0.562
eval loss: 0.67973, eval acc: 0.565
D-Step 27
Epoch 0, train loss: 0.67960, train acc: 0.566
Epoch 1, train loss: 0.67907, train acc: 0.563
Epoch 2, train loss: 0.67991, train acc: 0.563
eval loss: 0.67972, eval acc: 0.564
D-Step 28
Epoch 0, train loss: 0.68010, train acc: 0.561
Epoch 1, train loss: 0.67813, train acc: 0.569
Epoch 2, train loss: 0.67780, train acc: 0.572
eval loss: 0.67707, eval acc: 0.571
D-Step 29
Epoch 0, train loss: 0.67938, train acc: 0.564
Epoch 1, train loss: 0.67738, train acc: 0.571
Epoch 2, train loss: 0.67799, train acc: 0.571
eval loss: 0.67845, eval acc: 0.568
D-Step 30
Epoch 0, train loss: 0.67857, train acc: 0.565
Epoch 1, train loss: 0.67712, train acc: 0.562
Epoch 2, train loss: 0.67824, train acc: 0.566
eval loss: 0.68028, eval acc: 0.554
D-Step 31
Epoch 0, train loss: 0.67649, train acc: 0.571
Epoch 1, train loss: 0.67615, train acc: 0.575
Epoch 2, train loss: 0.67433, train acc: 0.572
eval loss: 0.67717, eval acc: 0.567
D-Step 32
Epoch 0, train loss: 0.67671, train acc: 0.568
Epoch 1, train loss: 0.67578, train acc: 0.571
Epoch 2, train loss: 0.67594, train acc: 0.569
eval loss: 0.67737, eval acc: 0.566
D-Step 33
Epoch 0, train loss: 0.67632, train acc: 0.569
Epoch 1, train loss: 0.67526, train acc: 0.574
Epoch 2, train loss: 0.67550, train acc: 0.579
eval loss: 0.67774, eval acc: 0.568
D-Step 34
Epoch 0, train loss: 0.67658, train acc: 0.572
Epoch 1, train loss: 0.67520, train acc: 0.571
Epoch 2, train loss: 0.67424, train acc: 0.578
eval loss: 0.67910, eval acc: 0.561
D-Step 35
Epoch 0, train loss: 0.67553, train acc: 0.575
Epoch 1, train loss: 0.67269, train acc: 0.580
Epoch 2, train loss: 0.67369, train acc: 0.577
eval loss: 0.67304, eval acc: 0.579
D-Step 36
Epoch 0, train loss: 0.67477, train acc: 0.580
Epoch 1, train loss: 0.67475, train acc: 0.575
Epoch 2, train loss: 0.67256, train acc: 0.575
eval loss: 0.67445, eval acc: 0.575
D-Step 37
Epoch 0, train loss: 0.67411, train acc: 0.578
Epoch 1, train loss: 0.67422, train acc: 0.572
Epoch 2, train loss: 0.67273, train acc: 0.578
eval loss: 0.67178, eval acc: 0.582
D-Step 38
Epoch 0, train loss: 0.67270, train acc: 0.579
Epoch 1, train loss: 0.67180, train acc: 0.586
Epoch 2, train loss: 0.67259, train acc: 0.578
eval loss: 0.66978, eval acc: 0.585
D-Step 39
Epoch 0, train loss: 0.67363, train acc: 0.575
Epoch 1, train loss: 0.67194, train acc: 0.583
Epoch 2, train loss: 0.67144, train acc: 0.582
eval loss: 0.66993, eval acc: 0.585
D-Step 40
Epoch 0, train loss: 0.67230, train acc: 0.579
Epoch 1, train loss: 0.67270, train acc: 0.579
Epoch 2, train loss: 0.67013, train acc: 0.582
eval loss: 0.67042, eval acc: 0.587
D-Step 41
Epoch 0, train loss: 0.67242, train acc: 0.585
Epoch 1, train loss: 0.67103, train acc: 0.581
Epoch 2, train loss: 0.67016, train acc: 0.583
eval loss: 0.66787, eval acc: 0.592
D-Step 42
Epoch 0, train loss: 0.66929, train acc: 0.588
Epoch 1, train loss: 0.67089, train acc: 0.578
Epoch 2, train loss: 0.66773, train acc: 0.588
eval loss: 0.66914, eval acc: 0.586
D-Step 43
Epoch 0, train loss: 0.66703, train acc: 0.595
Epoch 1, train loss: 0.66595, train acc: 0.595
Epoch 2, train loss: 0.66630, train acc: 0.590
eval loss: 0.66722, eval acc: 0.589
D-Step 44
Epoch 0, train loss: 0.67010, train acc: 0.585
Epoch 1, train loss: 0.66741, train acc: 0.592
Epoch 2, train loss: 0.66795, train acc: 0.587
eval loss: 0.66548, eval acc: 0.597
D-Step 45
Epoch 0, train loss: 0.66741, train acc: 0.590
Epoch 1, train loss: 0.66433, train acc: 0.594
Epoch 2, train loss: 0.66679, train acc: 0.589
eval loss: 0.66773, eval acc: 0.589
D-Step 46
Epoch 0, train loss: 0.66669, train acc: 0.588
Epoch 1, train loss: 0.66447, train acc: 0.594
Epoch 2, train loss: 0.66618, train acc: 0.591
eval loss: 0.66757, eval acc: 0.588
D-Step 47
Epoch 0, train loss: 0.66667, train acc: 0.591
Epoch 1, train loss: 0.66485, train acc: 0.593
Epoch 2, train loss: 0.66361, train acc: 0.593
eval loss: 0.66568, eval acc: 0.591
D-Step 48
Epoch 0, train loss: 0.66483, train acc: 0.591
Epoch 1, train loss: 0.66324, train acc: 0.598
Epoch 2, train loss: 0.66260, train acc: 0.601
eval loss: 0.66568, eval acc: 0.589
D-Step 49
Epoch 0, train loss: 0.66335, train acc: 0.593
Epoch 1, train loss: 0.65977, train acc: 0.603
Epoch 2, train loss: 0.66167, train acc: 0.604
eval loss: 0.66560, eval acc: 0.596
#####################################################
#####################################################
Start adversarial training...
#####################################################
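Each Round below runs one generator step followed by three discriminator steps of three epochs each, then prints generator and discriminator evaluation metrics. The generator step is a policy-gradient (REINFORCE-style) update that uses the discriminator's output as reward; SeqGAN proper estimates per-token rewards with Monte Carlo rollouts, which this purely illustrative sketch omits. All names, including sample_with_log_probs, are hypothetical:

# Simplified sketch of the adversarial generator update (one "##G-Step"),
# rewarding whole sequences rather than individual tokens.
import torch

def adversarial_g_step(generator, discriminator, g_optimizer,
                       batch_size=64, seq_len=20):
    # Hypothetical helper: returns sampled tokens (B, T) and their log-probs (B, T).
    samples, log_probs = generator.sample_with_log_probs(batch_size, seq_len)
    with torch.no_grad():
        reward = torch.sigmoid(discriminator(samples)).squeeze(-1)   # (B,)
    # REINFORCE: increase the log-probability of sequences the discriminator
    # currently scores as real.
    loss = -(log_probs.sum(dim=1) * reward).mean()
    g_optimizer.zero_grad()
    loss.backward()
    g_optimizer.step()
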
Round 0
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.59151, train acc: 0.720
Epoch 1, train loss: 0.54995, train acc: 0.761
Epoch 2, train loss: 0.51250, train acc: 0.793
##D-Step 1
Epoch 0, train loss: 0.48067, train acc: 0.804
Epoch 1, train loss: 0.45579, train acc: 0.812
Epoch 2, train loss: 0.43562, train acc: 0.818
##D-Step 2
Epoch 0, train loss: 0.42442, train acc: 0.820
Epoch 1, train loss: 0.41558, train acc: 0.817
Epoch 2, train loss: 0.40602, train acc: 0.822
gen eval loss: 1.40718, dis eval loss: 0.40555, dis eval acc: 0.820
Round 1
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.66479, train acc: 0.641
Epoch 1, train loss: 0.63251, train acc: 0.663
Epoch 2, train loss: 0.61968, train acc: 0.670
##D-Step 1
Epoch 0, train loss: 0.61171, train acc: 0.667
Epoch 1, train loss: 0.60287, train acc: 0.674
Epoch 2, train loss: 0.59755, train acc: 0.680
##D-Step 2
Epoch 0, train loss: 0.59420, train acc: 0.677
Epoch 1, train loss: 0.59016, train acc: 0.681
Epoch 2, train loss: 0.58628, train acc: 0.683
gen eval loss: 1.50046, dis eval loss: 0.58667, dis eval acc: 0.683
Round 2
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.58504, train acc: 0.685
Epoch 1, train loss: 0.58112, train acc: 0.692
Epoch 2, train loss: 0.57866, train acc: 0.695
##D-Step 1
Epoch 0, train loss: 0.57929, train acc: 0.696
Epoch 1, train loss: 0.57986, train acc: 0.694
Epoch 2, train loss: 0.57525, train acc: 0.700
##D-Step 2
Epoch 0, train loss: 0.57174, train acc: 0.703
Epoch 1, train loss: 0.56819, train acc: 0.705
Epoch 2, train loss: 0.56589, train acc: 0.709
gen eval loss: 1.53040, dis eval loss: 0.56829, dis eval acc: 0.704
Round 3
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.57758, train acc: 0.701
Epoch 1, train loss: 0.57169, train acc: 0.706
Epoch 2, train loss: 0.57005, train acc: 0.706
##D-Step 1
Epoch 0, train loss: 0.57348, train acc: 0.703
Epoch 1, train loss: 0.57088, train acc: 0.705
Epoch 2, train loss: 0.56803, train acc: 0.704
##D-Step 2
Epoch 0, train loss: 0.56420, train acc: 0.714
Epoch 1, train loss: 0.56091, train acc: 0.715
Epoch 2, train loss: 0.55888, train acc: 0.715
gen eval loss: 1.51330, dis eval loss: 0.56298, dis eval acc: 0.712
Round 4
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.57107, train acc: 0.708
Epoch 1, train loss: 0.56484, train acc: 0.713
Epoch 2, train loss: 0.56324, train acc: 0.713
##D-Step 1
Epoch 0, train loss: 0.56093, train acc: 0.716
Epoch 1, train loss: 0.56198, train acc: 0.712
Epoch 2, train loss: 0.55765, train acc: 0.722
##D-Step 2
Epoch 0, train loss: 0.55798, train acc: 0.718
Epoch 1, train loss: 0.55727, train acc: 0.722
Epoch 2, train loss: 0.55309, train acc: 0.720
gen eval loss: 1.49720, dis eval loss: 0.55024, dis eval acc: 0.725
Round 5
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.56707, train acc: 0.705
Epoch 1, train loss: 0.56217, train acc: 0.712
Epoch 2, train loss: 0.55833, train acc: 0.711
##D-Step 1
Epoch 0, train loss: 0.55648, train acc: 0.710
Epoch 1, train loss: 0.55107, train acc: 0.717
Epoch 2, train loss: 0.54860, train acc: 0.718
##D-Step 2
Epoch 0, train loss: 0.55329, train acc: 0.716
Epoch 1, train loss: 0.54824, train acc: 0.718
Epoch 2, train loss: 0.54637, train acc: 0.725
gen eval loss: 1.44837, dis eval loss: 0.54945, dis eval acc: 0.717
Round 6
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.56166, train acc: 0.698
Epoch 1, train loss: 0.55578, train acc: 0.710
Epoch 2, train loss: 0.54909, train acc: 0.716
##D-Step 1
Epoch 0, train loss: 0.55429, train acc: 0.708
Epoch 1, train loss: 0.55032, train acc: 0.713
Epoch 2, train loss: 0.54631, train acc: 0.714
##D-Step 2
Epoch 0, train loss: 0.54737, train acc: 0.715
Epoch 1, train loss: 0.54241, train acc: 0.716
Epoch 2, train loss: 0.54282, train acc: 0.719
gen eval loss: 1.37043, dis eval loss: 0.54431, dis eval acc: 0.716
Round 7
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.54859, train acc: 0.718
Epoch 1, train loss: 0.54219, train acc: 0.725
Epoch 2, train loss: 0.53867, train acc: 0.727
##D-Step 1
Epoch 0, train loss: 0.53038, train acc: 0.734
Epoch 1, train loss: 0.53056, train acc: 0.730
Epoch 2, train loss: 0.52710, train acc: 0.733
##D-Step 2
Epoch 0, train loss: 0.52913, train acc: 0.736
Epoch 1, train loss: 0.52826, train acc: 0.735
Epoch 2, train loss: 0.52487, train acc: 0.736
gen eval loss: 1.34006, dis eval loss: 0.51751, dis eval acc: 0.743
Round 8
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.52375, train acc: 0.736
Epoch 1, train loss: 0.51892, train acc: 0.743
Epoch 2, train loss: 0.51671, train acc: 0.747
##D-Step 1
Epoch 0, train loss: 0.51435, train acc: 0.743
Epoch 1, train loss: 0.50758, train acc: 0.750
Epoch 2, train loss: 0.50943, train acc: 0.748
##D-Step 2
Epoch 0, train loss: 0.50186, train acc: 0.753
Epoch 1, train loss: 0.50100, train acc: 0.752
Epoch 2, train loss: 0.49398, train acc: 0.759
gen eval loss: 1.31720, dis eval loss: 0.50034, dis eval acc: 0.754
Round 9
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.49255, train acc: 0.766
Epoch 1, train loss: 0.48876, train acc: 0.765
Epoch 2, train loss: 0.48592, train acc: 0.767
##D-Step 1
Epoch 0, train loss: 0.49019, train acc: 0.768
Epoch 1, train loss: 0.48792, train acc: 0.764
Epoch 2, train loss: 0.48628, train acc: 0.767
##D-Step 2
Epoch 0, train loss: 0.48696, train acc: 0.768
Epoch 1, train loss: 0.48251, train acc: 0.768
Epoch 2, train loss: 0.47848, train acc: 0.773
gen eval loss: 1.30945, dis eval loss: 0.48090, dis eval acc: 0.769
Round 10
#Train generator
##G-Step 0
#Train discriminator
##D-Step 0
Epoch 0, train loss: 0.47620, train acc: 0.778
Epoch 1, train loss: 0.47299, train acc: 0.779
Epoch 2, train loss: 0.46769, train acc: 0.783
##D-Step 1
Epoch 0, train loss: 0.47547, train acc: 0.772
Epoch 1, train loss: 0.47199, train acc: 0.776
Epoch 2, train loss: 0.47110, train acc: 0.777
##D-Step 2
Epoch 0, train loss: 0.46376, train acc: 0.785
Epoch 1, train loss: 0.46196, train acc: 0.782
Epoch 2, train loss: 0.45862, train acc: 0.787
gen eval loss: 1.25969, dis eval loss: 0.45362, dis eval acc: 0.786