RotNetR validation loss won't go down #20
This usually means the model has learned some odd, irrelevant features. Maybe try training with all obverse (or all reverse) sides first? Remember to ...

```python
import argparse

import torch

from rotate_captcha_crack.common import device

if __name__ == "__main__":
    ...
```
I only just noticed your training time and realized the problem is too little data, combined with very few texture types (just two). If your application is a fixed set of a few designs, you would be better off with traditional feature-point matching.
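The "match against known designs" idea above can be sketched minimally. This is not the repo's code: it is a toy illustration restricted to multiples of 90 degrees, with hypothetical function names. A real implementation would use OpenCV feature matching (e.g. ORB descriptors with a brute-force matcher) over arbitrary angles.

```python
# Toy sketch of template matching over rotations, assuming rotations are
# restricted to multiples of 90 degrees. All names here are hypothetical.

def rot90(grid):
    """Rotate a square grid (list of lists) 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def find_rotation(template, query):
    """Return k such that rotating `template` by k*90 degrees yields `query`."""
    cur = template
    for k in range(4):
        if cur == query:
            return k
        cur = rot90(cur)
    return None

template = [[1, 2], [3, 4]]
query = rot90(rot90(template))          # template rotated 180 degrees
print(find_rotation(template, query))   # -> 2
```

With only a couple of known coin designs, exhaustively comparing the query against each stored design at each candidate angle is cheap and needs no training data at all.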
After switching to RotNet, the loss did come down.
Could it be because the number of classes increased?
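For context on the class-count question: RotNet-style models typically discretize the rotation angle into N uniform bins over 360 degrees and classify. The sketch below illustrates that convention; the function name and binning scheme are assumptions, not this repo's API.

```python
# Hedged sketch of RotNet-style angle discretization: the rotation angle is
# binned into one of `cls_num` uniform classes over 360 degrees. More classes
# give finer angular resolution but a harder classification problem, which
# matters on a small dataset.

def angle_to_cls(angle_deg: float, cls_num: int) -> int:
    """Map an angle in [0, 360) to a class index in [0, cls_num)."""
    return int(angle_deg / 360.0 * cls_num) % cls_num

print(angle_to_cls(90.0, 4))     # coarse binning: 4 classes  -> index 1
print(angle_to_cls(90.0, 360))   # fine binning: 360 classes -> index 90
```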
Training parameters used:

- lr = 0.01
- momentum = 0.9
- epochs = 300
- steps = 128
- train:val split = 70%:30%, roughly 1100 coin images in total (white borders removed)
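A 70%/30% split of ~1100 images works out to about 770 train / 330 val. A minimal, reproducible way to do such a split (a sketch with a fixed seed and hypothetical file names, not the repo's actual loader):

```python
# Reproducible 70/30 split sketch: shuffle with a fixed seed, then cut.
import random

def split_dataset(items, train_ratio=0.7, seed=42):
    """Shuffle `items` deterministically and split into (train, val)."""
    items = list(items)
    random.Random(seed).shuffle(items)
    cut = int(len(items) * train_ratio)
    return items[:cut], items[cut:]

paths = [f"coin_{i:04d}.jpg" for i in range(1100)]  # hypothetical names
train, val = split_dataset(paths)
print(len(train), len(val))  # -> 770 330
```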
Sample material preview (image attached)
All the predicted angles at test time are wrong. How can I improve this?
```
<2023-08-18 13:57:37.784> [INFO] Epoch#0. time_cost: 32.89 s. train_loss: 5.39984232. val_loss: 5.36462307
<2023-08-18 13:57:55.110> [INFO] Epoch#1. time_cost: 48.94 s. train_loss: 5.35185540. val_loss: 5.38960536
<2023-08-18 13:58:11.490> [INFO] Epoch#2. time_cost: 64.82 s. train_loss: 5.28306431. val_loss: 5.41948541
<2023-08-18 13:58:28.066> [INFO] Epoch#3. time_cost: 80.73 s. train_loss: 5.21700990. val_loss: 5.41810258
<2023-08-18 13:58:44.535> [INFO] Epoch#4. time_cost: 96.61 s. train_loss: 5.15511119. val_loss: 5.41366911
<2023-08-18 13:59:01.068> [INFO] Epoch#5. time_cost: 112.57 s. train_loss: 5.06084383. val_loss: 5.40824318
<2023-08-18 13:59:17.673> [INFO] Epoch#6. time_cost: 128.55 s. train_loss: 4.97198361. val_loss: 5.40104500
<2023-08-18 13:59:34.405> [INFO] Epoch#7. time_cost: 144.58 s. train_loss: 4.89924210. val_loss: 5.39258560
<2023-08-18 13:59:51.164> [INFO] Epoch#8. time_cost: 160.73 s. train_loss: 4.81734937. val_loss: 5.38335117
<2023-08-18 14:00:08.268> [INFO] Epoch#9. time_cost: 177.25 s. train_loss: 4.71686673. val_loss: 5.37633689
<2023-08-18 14:00:24.879> [INFO] Epoch#10. time_cost: 193.28 s. train_loss: 4.65475571. val_loss: 5.37279606
<2023-08-18 14:00:41.440> [INFO] Epoch#11. time_cost: 209.35 s. train_loss: 4.55467379. val_loss: 5.36483145
<2023-08-18 14:00:58.116> [INFO] Epoch#12. time_cost: 225.54 s. train_loss: 4.47352850. val_loss: 5.35914739
<2023-08-18 14:01:15.269> [INFO] Epoch#13. time_cost: 241.79 s. train_loss: 4.41723913. val_loss: 5.35702181
<2023-08-18 14:01:32.629> [INFO] Epoch#14. time_cost: 258.01 s. train_loss: 4.31735170. val_loss: 5.35660950
<2023-08-18 14:01:49.672> [INFO] Epoch#15. time_cost: 274.19 s. train_loss: 4.23688066. val_loss: 5.35307089
<2023-08-18 14:02:06.594> [INFO] Epoch#16. time_cost: 290.31 s. train_loss: 4.18479526. val_loss: 5.35310698
<2023-08-18 14:02:23.423> [INFO] Epoch#17. time_cost: 306.63 s. train_loss: 4.08692378. val_loss: 5.34825500
<2023-08-18 14:02:40.599> [INFO] Epoch#18. time_cost: 322.81 s. train_loss: 4.01058915. val_loss: 5.34844208
<2023-08-18 14:02:57.059> [INFO] Epoch#19. time_cost: 338.78 s. train_loss: 3.92014989. val_loss: 5.34933837
<2023-08-18 14:03:13.625> [INFO] Epoch#20. time_cost: 354.79 s. train_loss: 3.84989792. val_loss: 5.34641616
<2023-08-18 14:03:30.806> [INFO] Epoch#21. time_cost: 371.08 s. train_loss: 3.75730199. val_loss: 5.34646304
<2023-08-18 14:03:47.440> [INFO] Epoch#22. time_cost: 387.23 s. train_loss: 3.69406390. val_loss: 5.34865459
<2023-08-18 14:04:04.430> [INFO] Epoch#23. time_cost: 403.58 s. train_loss: 3.58468822. val_loss: 5.34718132
<2023-08-18 14:04:21.192> [INFO] Epoch#24. time_cost: 419.76 s. train_loss: 3.53524327. val_loss: 5.35206652
<2023-08-18 14:04:37.991> [INFO] Epoch#25. time_cost: 435.94 s. train_loss: 3.43011984. val_loss: 5.34888013
<2023-08-18 14:04:54.962> [INFO] Epoch#26. time_cost: 452.26 s. train_loss: 3.37779093. val_loss: 5.34560808
<2023-08-18 14:05:11.900> [INFO] Epoch#27. time_cost: 468.42 s. train_loss: 3.25674978. val_loss: 5.34464852
<2023-08-18 14:05:28.939> [INFO] Epoch#28. time_cost: 484.61 s. train_loss: 3.21313724. val_loss: 5.34361029
<2023-08-18 14:05:46.150> [INFO] Epoch#29. time_cost: 500.94 s. train_loss: 3.11788267. val_loss: 5.34200462
<2023-08-18 14:06:03.293> [INFO] Epoch#30. time_cost: 517.17 s. train_loss: 3.03583649. val_loss: 5.34520737
<2023-08-18 14:06:20.056> [INFO] Epoch#31. time_cost: 533.36 s. train_loss: 3.02078411. val_loss: 5.34665251
<2023-08-18 14:06:36.799> [INFO] Epoch#32. time_cost: 549.56 s. train_loss: 2.89035311. val_loss: 5.34966660
<2023-08-18 14:06:53.613> [INFO] Epoch#33. time_cost: 565.89 s. train_loss: 2.80607820. val_loss: 5.34706863
<2023-08-18 14:07:10.240> [INFO] Epoch#34. time_cost: 582.04 s. train_loss: 2.75119737. val_loss: 5.34983921
<2023-08-18 14:07:27.220> [INFO] Epoch#35. time_cost: 598.46 s. train_loss: 2.66276371. val_loss: 5.34948270
<2023-08-18 14:07:43.911> [INFO] Epoch#36. time_cost: 614.61 s. train_loss: 2.62098613. val_loss: 5.34578753
<2023-08-18 14:08:00.852> [INFO] Epoch#37. time_cost: 631.00 s. train_loss: 2.56218195. val_loss: 5.35035912
<2023-08-18 14:08:17.749> [INFO] Epoch#38. time_cost: 647.41 s. train_loss: 2.44102857. val_loss: 5.35140228
<2023-08-18 14:08:34.332> [INFO] Epoch#39. time_cost: 663.49 s. train_loss: 2.41077507. val_loss: 5.35149717
<2023-08-18 14:08:50.979> [INFO] Epoch#40. time_cost: 679.53 s. train_loss: 2.29778108. val_loss: 5.35420863
<2023-08-18 14:09:07.607> [INFO] Epoch#41. time_cost: 695.53 s. train_loss: 2.24829721. val_loss: 5.35726897
<2023-08-18 14:09:24.769> [INFO] Epoch#42. time_cost: 712.05 s. train_loss: 2.17894366. val_loss: 5.35133410
<2023-08-18 14:09:41.792> [INFO] Epoch#43. time_cost: 728.57 s. train_loss: 2.12200819. val_loss: 5.35315466
<2023-08-18 14:09:58.563> [INFO] Epoch#44. time_cost: 744.72 s. train_loss: 2.07205370. val_loss: 5.34872564
<2023-08-18 14:10:15.165> [INFO] Epoch#45. time_cost: 760.83 s. train_loss: 1.99733144. val_loss: 5.34793647
<2023-08-18 14:10:32.038> [INFO] Epoch#46. time_cost: 777.10 s. train_loss: 1.95925856. val_loss: 5.34805107
<2023-08-18 14:10:49.068> [INFO] Epoch#47. time_cost: 793.60 s. train_loss: 1.88353942. val_loss: 5.34884135
<2023-08-18 14:11:05.784> [INFO] Epoch#48. time_cost: 809.81 s. train_loss: 1.83938010. val_loss: 5.35141436
<2023-08-18 14:11:22.648> [INFO] Epoch#49. time_cost: 826.09 s. train_loss: 1.79292178. val_loss: 5.34627692
<2023-08-18 14:11:39.304> [INFO] Epoch#50. time_cost: 842.29 s. train_loss: 1.68158425. val_loss: 5.34675503
<2023-08-18 14:11:55.872> [INFO] Epoch#51. time_cost: 858.40 s. train_loss: 1.62872086. val_loss: 5.34923681
<2023-08-18 14:12:12.628> [INFO] Epoch#52. time_cost: 874.63 s. train_loss: 1.59070106. val_loss: 5.35235278
<2023-08-18 14:12:29.318> [INFO] Epoch#53. time_cost: 890.81 s. train_loss: 1.58161676. val_loss: 5.35133855
<2023-08-18 14:12:46.143> [INFO] Epoch#54. time_cost: 907.11 s. train_loss: 1.45260657. val_loss: 5.34634527
<2023-08-18 14:13:02.999> [INFO] Epoch#55. time_cost: 923.48 s. train_loss: 1.47507057. val_loss: 5.34150346
<2023-08-18 14:13:20.303> [INFO] Epoch#56. time_cost: 939.80 s. train_loss: 1.38150918. val_loss: 5.34167035
<2023-08-18 14:13:36.953> [INFO] Epoch#57. time_cost: 955.96 s. train_loss: 1.36797532. val_loss: 5.33937915
<2023-08-18 14:13:54.243> [INFO] Epoch#58. time_cost: 972.27 s. train_loss: 1.28458185. val_loss: 5.34360949
<2023-08-18 14:14:11.088> [INFO] Epoch#59. time_cost: 988.53 s. train_loss: 1.24424957. val_loss: 5.34156942
<2023-08-18 14:14:27.971> [INFO] Epoch#60. time_cost: 1004.85 s. train_loss: 1.18852963. val_loss: 5.33656184
<2023-08-18 14:14:44.796> [INFO] Epoch#61. time_cost: 1020.81 s. train_loss: 1.13654406. val_loss: 5.33123016
<2023-08-18 14:15:02.092> [INFO] Epoch#62. time_cost: 1037.30 s. train_loss: 1.11281125. val_loss: 5.33135033
<2023-08-18 14:15:19.106> [INFO] Epoch#63. time_cost: 1053.65 s. train_loss: 1.05972394. val_loss: 5.33335686
<2023-08-18 14:15:35.944> [INFO] Epoch#64. time_cost: 1069.97 s. train_loss: 1.02143451. val_loss: 5.33008957
<2023-08-18 14:15:52.917> [INFO] Epoch#65. time_cost: 1086.13 s. train_loss: 1.01201259. val_loss: 5.32418426
<2023-08-18 14:16:10.429> [INFO] Epoch#66. time_cost: 1102.75 s. train_loss: 0.97423361. val_loss: 5.32560396
<2023-08-18 14:16:26.975> [INFO] Epoch#67. time_cost: 1118.77 s. train_loss: 0.98155708. val_loss: 5.32422972
<2023-08-18 14:16:43.738> [INFO] Epoch#68. time_cost: 1134.93 s. train_loss: 0.89851645. val_loss: 5.32218838
<2023-08-18 14:17:00.692> [INFO] Epoch#69. time_cost: 1151.01 s. train_loss: 0.90840406. val_loss: 5.32185570
<2023-08-18 14:17:17.757> [INFO] Epoch#70. time_cost: 1167.22 s. train_loss: 0.84972844. val_loss: 5.31999524
<2023-08-18 14:17:34.972> [INFO] Epoch#71. time_cost: 1183.57 s. train_loss: 0.82386800. val_loss: 5.31885560
<2023-08-18 14:17:52.144> [INFO] Epoch#72. time_cost: 1199.80 s. train_loss: 0.79325682. val_loss: 5.32391421
<2023-08-18 14:18:09.007> [INFO] Epoch#73. time_cost: 1216.09 s. train_loss: 0.75515443. val_loss: 5.32378785
<2023-08-18 14:18:25.720> [INFO] Epoch#74. time_cost: 1232.24 s. train_loss: 0.71768803. val_loss: 5.31845586
<2023-08-18 14:18:42.934> [INFO] Epoch#75. time_cost: 1248.52 s. train_loss: 0.74559616. val_loss: 5.31854550
<2023-08-18 14:18:59.665> [INFO] Epoch#76. time_cost: 1264.72 s. train_loss: 0.68202233. val_loss: 5.31668854
<2023-08-18 14:19:16.607> [INFO] Epoch#77. time_cost: 1280.92 s. train_loss: 0.65233253. val_loss: 5.31342141
<2023-08-18 14:19:33.557> [INFO] Epoch#78. time_cost: 1296.93 s. train_loss: 0.64974236. val_loss: 5.30892277
<2023-08-18 14:19:50.504> [INFO] Epoch#79. time_cost: 1313.06 s. train_loss: 0.58420439. val_loss: 5.30771160
<2023-08-18 14:20:07.773> [INFO] Epoch#80. time_cost: 1329.33 s. train_loss: 0.60893153. val_loss: 5.30896378
<2023-08-18 14:20:24.518> [INFO] Epoch#81. time_cost: 1345.57 s. train_loss: 0.57614258. val_loss: 5.30884250
<2023-08-18 14:20:41.130> [INFO] Epoch#82. time_cost: 1361.64 s. train_loss: 0.53065565. val_loss: 5.30851078
<2023-08-18 14:20:58.022> [INFO] Epoch#83. time_cost: 1377.96 s. train_loss: 0.55741334. val_loss: 5.30744998
<2023-08-18 14:21:14.904> [INFO] Epoch#84. time_cost: 1393.96 s. train_loss: 0.48479454. val_loss: 5.30663506
<2023-08-18 14:21:32.100> [INFO] Epoch#85. time_cost: 1410.37 s. train_loss: 0.47788703. val_loss: 5.30053854
<2023-08-18 14:21:49.213> [INFO] Epoch#86. time_cost: 1426.53 s. train_loss: 0.48423344. val_loss: 5.29945850
<2023-08-18 14:22:06.621> [INFO] Epoch#87. time_cost: 1442.83 s. train_loss: 0.44990043. val_loss: 5.30025927
<2023-08-18 14:22:23.385> [INFO] Epoch#88. time_cost: 1458.98 s. train_loss: 0.44953348. val_loss: 5.29905891
<2023-08-18 14:22:40.358> [INFO] Epoch#89. time_cost: 1474.94 s. train_loss: 0.43896342. val_loss: 5.30130974
<2023-08-18 14:22:57.296> [INFO] Epoch#90. time_cost: 1491.33 s. train_loss: 0.40358200. val_loss: 5.30313905
<2023-08-18 14:23:14.203> [INFO] Epoch#91. time_cost: 1507.74 s. train_loss: 0.38030928. val_loss: 5.30020889
<2023-08-18 14:23:31.249> [INFO] Epoch#92. time_cost: 1524.19 s. train_loss: 0.36855397. val_loss: 5.29769659
<2023-08-18 14:23:48.304> [INFO] Epoch#93. time_cost: 1540.40 s. train_loss: 0.39343657. val_loss: 5.29662418
<2023-08-18 14:24:05.624> [INFO] Epoch#94. time_cost: 1556.82 s. train_loss: 0.35523611. val_loss: 5.29806995
<2023-08-18 14:24:22.604> [INFO] Epoch#95. time_cost: 1573.18 s. train_loss: 0.38470637. val_loss: 5.29897292
<2023-08-18 14:24:39.336> [INFO] Epoch#96. time_cost: 1589.40 s. train_loss: 0.35899499. val_loss: 5.30027898
<2023-08-18 14:24:56.202> [INFO] Epoch#97. time_cost: 1605.79 s. train_loss: 0.33074107. val_loss: 5.29959599
<2023-08-18 14:25:12.997> [INFO] Epoch#98. time_cost: 1622.00 s. train_loss: 0.31499263. val_loss: 5.29822334
<2023-08-18 14:25:29.901> [INFO] Epoch#99. time_cost: 1638.42 s. train_loss: 0.31733391. val_loss: 5.30082989
<2023-08-18 14:25:46.728> [INFO] Epoch#100. time_cost: 1654.55 s. train_loss: 0.32080573. val_loss: 5.29806964
<2023-08-18 14:26:03.189> [INFO] Epoch#101. time_cost: 1670.50 s. train_loss: 0.30004243. val_loss: 5.29328775
<2023-08-18 14:26:20.261> [INFO] Epoch#102. time_cost: 1686.65 s. train_loss: 0.29755034. val_loss: 5.29279629
<2023-08-18 14:26:37.467> [INFO] Epoch#103. time_cost: 1702.96 s. train_loss: 0.28089099. val_loss: 5.29503473
<2023-08-18 14:26:54.288> [INFO] Epoch#104. time_cost: 1719.07 s. train_loss: 0.28207827. val_loss: 5.29597712
<2023-08-18 14:27:11.176> [INFO] Epoch#105. time_cost: 1735.35 s. train_loss: 0.26215186. val_loss: 5.29589160
<2023-08-18 14:27:27.886> [INFO] Epoch#106. time_cost: 1751.49 s. train_loss: 0.25227861. val_loss: 5.29495859
<2023-08-18 14:27:44.839> [INFO] Epoch#107. time_cost: 1767.86 s. train_loss: 0.27683722. val_loss: 5.29369672
<2023-08-18 14:28:01.458> [INFO] Epoch#108. time_cost: 1783.96 s. train_loss: 0.24133206. val_loss: 5.29565271
<2023-08-18 14:28:18.337> [INFO] Epoch#109. time_cost: 1800.35 s. train_loss: 0.23433661. val_loss: 5.29376634
<2023-08-18 14:28:34.999> [INFO] Epoch#110. time_cost: 1816.54 s. train_loss: 0.23755440. val_loss: 5.29343255
<2023-08-18 14:28:51.802> [INFO] Epoch#111. time_cost: 1832.89 s. train_loss: 0.22447769. val_loss: 5.29052639
<2023-08-18 14:29:08.952> [INFO] Epoch#112. time_cost: 1849.30 s. train_loss: 0.23238146. val_loss: 5.29428466
<2023-08-18 14:29:25.784> [INFO] Epoch#113. time_cost: 1865.56 s. train_loss: 0.22170261. val_loss: 5.29492776
<2023-08-18 14:29:42.793> [INFO] Epoch#114. time_cost: 1882.06 s. train_loss: 0.21545648. val_loss: 5.29312213
<2023-08-18 14:29:59.412> [INFO] Epoch#115. time_cost: 1898.21 s. train_loss: 0.20180116. val_loss: 5.29303535
<2023-08-18 14:30:16.259> [INFO] Epoch#116. time_cost: 1914.53 s. train_loss: 0.19667596. val_loss: 5.29073985
<2023-08-18 14:30:33.082> [INFO] Epoch#117. time_cost: 1930.81 s. train_loss: 0.20202053. val_loss: 5.29082489
<2023-08-18 14:30:49.921> [INFO] Epoch#118. time_cost: 1947.08 s. train_loss: 0.19160863. val_loss: 5.29071124
<2023-08-18 14:31:06.598> [INFO] Epoch#119. time_cost: 1963.19 s. train_loss: 0.19253234. val_loss: 5.29217211
<2023-08-18 14:31:23.262> [INFO] Epoch#120. time_cost: 1979.35 s. train_loss: 0.20008727. val_loss: 5.29381418
<2023-08-18 14:31:40.064> [INFO] Epoch#121. time_cost: 1995.64 s. train_loss: 0.18423576. val_loss: 5.29436827
<2023-08-18 14:31:57.018> [INFO] Epoch#122. time_cost: 2012.03 s. train_loss: 0.18706201. val_loss: 5.29476261
<2023-08-18 14:32:13.724> [INFO] Epoch#123. time_cost: 2028.22 s. train_loss: 0.17524794. val_loss: 5.29642137
<2023-08-18 14:32:30.843> [INFO] Epoch#124. time_cost: 2044.73 s. train_loss: 0.17649993. val_loss: 5.29553525
<2023-08-18 14:32:47.986> [INFO] Epoch#125. time_cost: 2061.28 s. train_loss: 0.16367954. val_loss: 5.29705667
<2023-08-18 14:33:04.760> [INFO] Epoch#126. time_cost: 2077.57 s. train_loss: 0.16235787. val_loss: 5.29822620
<2023-08-18 14:33:21.836> [INFO] Epoch#127. time_cost: 2094.10 s. train_loss: 0.15805034. val_loss: 5.29666471
<2023-08-18 14:33:38.651> [INFO] Epoch#128. time_cost: 2110.38 s. train_loss: 0.16695243. val_loss: 5.29671876
<2023-08-18 14:33:55.408> [INFO] Epoch#129. time_cost: 2126.55 s. train_loss: 0.14886506. val_loss: 5.29992644
<2023-08-18 14:34:12.275> [INFO] Epoch#130. time_cost: 2142.95 s. train_loss: 0.14749421. val_loss: 5.29999638
<2023-08-18 14:34:29.053> [INFO] Epoch#131. time_cost: 2159.16 s. train_loss: 0.16307789. val_loss: 5.29719162
<2023-08-18 14:34:45.788> [INFO] Epoch#132. time_cost: 2175.27 s. train_loss: 0.14282146. val_loss: 5.29932435
<2023-08-18 14:35:02.511> [INFO] Epoch#133. time_cost: 2191.50 s. train_loss: 0.14902505. val_loss: 5.29678647
<2023-08-18 14:35:19.176> [INFO] Epoch#134. time_cost: 2207.62 s. train_loss: 0.16553759. val_loss: 5.29835685
<2023-08-18 14:35:36.006> [INFO] Epoch#135. time_cost: 2223.88 s. train_loss: 0.13034127. val_loss: 5.29728015
<2023-08-18 14:35:52.973> [INFO] Epoch#136. time_cost: 2240.26 s. train_loss: 0.13669445. val_loss: 5.29960585
<2023-08-18 14:36:09.725> [INFO] Epoch#137. time_cost: 2256.47 s. train_loss: 0.13718895. val_loss: 5.30132929
<2023-08-18 14:36:26.503> [INFO] Epoch#138. time_cost: 2272.64 s. train_loss: 0.13112331. val_loss: 5.30023400
```
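The log above shows train_loss collapsing from ~5.40 to ~0.13 while val_loss stays flat near 5.3: the model is memorizing the training set rather than generalizing. One standard guardrail is early stopping on val_loss; here is a minimal, hedged sketch in plain Python (hypothetical names, not this repo's trainer) that stops once val_loss has not improved for `patience` epochs.

```python
# Early-stopping sketch: stop when val_loss has not improved by at least
# `min_delta` for `patience` consecutive epochs.

def early_stop_epoch(val_losses, patience=10, min_delta=0.01):
    """Return the epoch index at which training should stop, or None."""
    best = float("inf")
    since_best = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            best = loss
            since_best = 0
        else:
            since_best += 1
            if since_best >= patience:
                return epoch
    return None

# Toy curve: improves, then plateaus -> stops `patience` epochs after
# the last meaningful improvement.
curve = [5.36, 5.34, 5.30, 5.29, 5.29, 5.29, 5.29]
print(early_stop_epoch(curve, patience=3))  # -> 5
```

Early stopping only limits the damage, though; with ~1100 images and two texture types, more data or heavier augmentation is the more direct fix.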