sayakpaul/glpn-nyu-finetuned-diode-230103-091356

This model is a fine-tuned version of vinvino02/glpn-nyu on the diode-subset dataset.
It achieves the following results on the evaluation set:

  • Loss: 0.4360
  • Mae: 0.4251
  • Rmse: 0.6169
  • Abs Rel: 0.4500
  • Log Mae: 0.1721
  • Log Rmse: 0.2269
  • Delta1: 0.3828
  • Delta2: 0.6326
  • Delta3: 0.8051
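
The card does not include inference code, so the snippet below is only a minimal sketch of how this checkpoint could be run for monocular depth estimation with the Transformers depth-estimation pipeline; the example image URL is a placeholder, not part of the original card.

```python
# Minimal usage sketch (assumes transformers, torch and Pillow are installed).
# The image URL below is an arbitrary example, not from the model author.
import requests
from PIL import Image
from transformers import pipeline

depth_estimator = pipeline(
    "depth-estimation",
    model="sayakpaul/glpn-nyu-finetuned-diode-230103-091356",
)

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

outputs = depth_estimator(image)
outputs["depth"].save("depth_map.png")   # PIL image of the predicted depth map
print(outputs["predicted_depth"].shape)  # raw depth tensor, for custom post-processing
```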


Model description

More information needed


Intended uses & limitations

More information needed


Training and evaluation data

More information needed


Training procedure


Training hyperparameters

The following hyperparameters were used during training (an approximate TrainingArguments equivalent is sketched after the list):

  • learning_rate: 0.0003
  • train_batch_size: 24
  • eval_batch_size: 48
  • seed: 2022
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.15
  • num_epochs: 100
  • mixed_precision_training: Native AMP
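
As a rough illustration, these settings correspond approximately to the following Hugging Face TrainingArguments; the output directory and the per-device interpretation of the batch sizes are assumptions, not taken from the original training script.

```python
# Approximate TrainingArguments matching the hyperparameters listed above.
# output_dir and the per-device batch-size mapping are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="glpn-nyu-finetuned-diode-230103-091356",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=48,
    seed=2022,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.15,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```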


Training results

| Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.0762 | 1.0 | 72 | 0.5031 | 0.4779 | 0.6690 | 0.5503 | 0.2006 | 0.2591 | 0.3020 | 0.5337 | 0.8000 |
| 0.478 | 2.0 | 144 | 0.4653 | 0.4509 | 0.6307 | 0.4891 | 0.1861 | 0.2377 | 0.3300 | 0.5805 | 0.7734 |
| 0.4668 | 3.0 | 216 | 0.4845 | 0.4712 | 0.6373 | 0.5469 | 0.1963 | 0.2471 | 0.3110 | 0.5254 | 0.7235 |
| 0.4389 | 4.0 | 288 | 0.4587 | 0.4368 | 0.6219 | 0.4887 | 0.1787 | 0.2344 | 0.3578 | 0.6099 | 0.7926 |
| 0.4626 | 5.0 | 360 | 0.4879 | 0.4662 | 0.6351 | 0.5617 | 0.1937 | 0.2482 | 0.3135 | 0.5462 | 0.7395 |
| 0.4534 | 6.0 | 432 | 0.4638 | 0.4422 | 0.6236 | 0.4951 | 0.1810 | 0.2358 | 0.3606 | 0.5844 | 0.7831 |
| 0.4108 | 7.0 | 504 | 0.4688 | 0.4508 | 0.6279 | 0.5050 | 0.1856 | 0.2385 | 0.3426 | 0.5701 | 0.7623 |
| 0.3832 | 8.0 | 576 | 0.4759 | 0.4533 | 0.6284 | 0.5257 | 0.1869 | 0.2411 | 0.3331 | 0.5701 | 0.7617 |
| 0.4097 | 9.0 | 648 | 0.4771 | 0.4501 | 0.6303 | 0.5361 | 0.1855 | 0.2433 | 0.3454 | 0.5838 | 0.7609 |
| 0.3799 | 10.0 | 720 | 0.4575 | 0.4375 | 0.6240 | 0.4874 | 0.1790 | 0.2349 | 0.3669 | 0.6032 | 0.7916 |
| 0.3659 | 11.0 | 792 | 0.4718 | 0.4590 | 0.6298 | 0.5176 | 0.1893 | 0.2396 | 0.3283 | 0.5502 | 0.7368 |
| 0.4145 | 12.0 | 864 | 0.4776 | 0.4561 | 0.6298 | 0.5325 | 0.1883 | 0.2421 | 0.3333 | 0.5611 | 0.7540 |
| 0.4224 | 13.0 | 936 | 0.4320 | 0.4138 | 0.6202 | 0.4013 | 0.1655 | 0.2232 | 0.4217 | 0.6641 | 0.8004 |
| 0.4142 | 14.0 | 1008 | 0.4597 | 0.4440 | 0.6234 | 0.4842 | 0.1813 | 0.2330 | 0.3520 | 0.5895 | 0.7617 |
| 0.4393 | 15.0 | 1080 | 0.4333 | 0.4251 | 0.6197 | 0.4182 | 0.1712 | 0.2225 | 0.3787 | 0.6303 | 0.8100 |
| 0.4045 | 16.0 | 1152 | 0.4603 | 0.4356 | 0.6197 | 0.4819 | 0.1776 | 0.2322 | 0.3635 | 0.6050 | 0.7858 |
| 0.3708 | 17.0 | 1224 | 0.4738 | 0.4567 | 0.6292 | 0.5264 | 0.1886 | 0.2411 | 0.3283 | 0.5557 | 0.7596 |
| 0.4042 | 18.0 | 1296 | 0.5004 | 0.4802 | 0.6423 | 0.6101 | 0.2008 | 0.2560 | 0.3022 | 0.5165 | 0.6931 |
| 0.3763 | 19.0 | 1368 | 0.4501 | 0.4361 | 0.6213 | 0.4723 | 0.1772 | 0.2303 | 0.3634 | 0.6034 | 0.7889 |
| 0.4084 | 20.0 | 1440 | 0.4272 | 0.4133 | 0.6208 | 0.3958 | 0.1649 | 0.2226 | 0.4284 | 0.6684 | 0.8009 |
| 0.3637 | 21.0 | 1512 | 0.4307 | 0.4145 | 0.6199 | 0.4134 | 0.1665 | 0.2241 | 0.3957 | 0.6847 | 0.8137 |
| 0.3655 | 22.0 | 1584 | 0.4591 | 0.4374 | 0.6370 | 0.4594 | 0.1791 | 0.2384 | 0.3816 | 0.6264 | 0.7826 |
| 0.3844 | 23.0 | 1656 | 0.4692 | 0.4444 | 0.6273 | 0.5241 | 0.1824 | 0.2407 | 0.3540 | 0.5990 | 0.7756 |
| 0.428 | 24.0 | 1728 | 0.4982 | 0.4753 | 0.6403 | 0.6084 | 0.1984 | 0.2552 | 0.3099 | 0.5233 | 0.7204 |
| 0.4051 | 25.0 | 1800 | 0.4824 | 0.4618 | 0.6329 | 0.5533 | 0.1915 | 0.2461 | 0.3248 | 0.5495 | 0.7415 |
| 0.3584 | 26.0 | 1872 | 0.4434 | 0.4207 | 0.6177 | 0.4468 | 0.1694 | 0.2277 | 0.3975 | 0.6442 | 0.8038 |
| 0.3443 | 27.0 | 1944 | 0.4602 | 0.4434 | 0.6241 | 0.4912 | 0.1822 | 0.2351 | 0.3431 | 0.5877 | 0.7893 |
| 0.3714 | 28.0 | 2016 | 0.4818 | 0.4594 | 0.6316 | 0.5521 | 0.1900 | 0.2455 | 0.3283 | 0.5567 | 0.7493 |
| 0.3688 | 29.0 | 2088 | 0.4443 | 0.4215 | 0.6242 | 0.4386 | 0.1702 | 0.2294 | 0.4024 | 0.6522 | 0.8065 |
| 0.3615 | 30.0 | 2160 | 0.4462 | 0.4291 | 0.6189 | 0.4500 | 0.1739 | 0.2277 | 0.3792 | 0.6208 | 0.7896 |
| 0.3655 | 31.0 | 2232 | 0.4808 | 0.4574 | 0.6305 | 0.5524 | 0.1893 | 0.2452 | 0.3322 | 0.5590 | 0.7460 |
| 0.3576 | 32.0 | 2304 | 0.4321 | 0.4102 | 0.6182 | 0.4079 | 0.1640 | 0.2241 | 0.4296 | 0.6713 | 0.8074 |
| 0.3947 | 33.0 | 2376 | 0.4468 | 0.4298 | 0.6232 | 0.4574 | 0.1744 | 0.2306 | 0.3873 | 0.6163 | 0.7873 |
| 0.3402 | 34.0 | 2448 | 0.4565 | 0.4352 | 0.6195 | 0.4913 | 0.1776 | 0.2337 | 0.3734 | 0.6039 | 0.7865 |
| 0.3412 | 35.0 | 2520 | 0.4438 | 0.4261 | 0.6180 | 0.4546 | 0.1728 | 0.2279 | 0.3778 | 0.6252 | 0.8043 |
| 0.3547 | 36.0 | 2592 | 0.4577 | 0.4416 | 0.6218 | 0.4868 | 0.1807 | 0.2329 | 0.3517 | 0.5862 | 0.7862 |
| 0.3425 | 37.0 | 2664 | 0.4682 | 0.4511 | 0.6285 | 0.5210 | 0.1860 | 0.2406 | 0.3411 | 0.5748 | 0.7694 |
| 0.3853 | 38.0 | 2736 | 0.4752 | 0.4514 | 0.6289 | 0.5458 | 0.1863 | 0.2438 | 0.3408 | 0.5721 | 0.7760 |
| 0.3643 | 39.0 | 2808 | 0.4737 | 0.4547 | 0.6291 | 0.5401 | 0.1875 | 0.2428 | 0.3316 | 0.5673 | 0.7617 |
| 0.398 | 40.0 | 2880 | 0.4662 | 0.4467 | 0.6274 | 0.5124 | 0.1838 | 0.2394 | 0.3514 | 0.5823 | 0.7700 |
| 0.3579 | 41.0 | 2952 | 0.4781 | 0.4545 | 0.6290 | 0.5513 | 0.1880 | 0.2446 | 0.3343 | 0.5624 | 0.7718 |
| 0.3545 | 42.0 | 3024 | 0.4460 | 0.4277 | 0.6221 | 0.4553 | 0.1730 | 0.2294 | 0.3862 | 0.6285 | 0.7999 |
| 0.3527 | 43.0 | 3096 | 0.4330 | 0.4153 | 0.6169 | 0.4221 | 0.1668 | 0.2240 | 0.4106 | 0.6618 | 0.8084 |
| 0.3251 | 44.0 | 3168 | 0.4503 | 0.4286 | 0.6172 | 0.4781 | 0.1744 | 0.2313 | 0.3725 | 0.6224 | 0.8095 |
| 0.3433 | 45.0 | 3240 | 0.4471 | 0.4346 | 0.6187 | 0.4652 | 0.1772 | 0.2293 | 0.3606 | 0.6043 | 0.7952 |
| 0.3607 | 46.0 | 3312 | 0.4474 | 0.4263 | 0.6166 | 0.4658 | 0.1728 | 0.2293 | 0.3835 | 0.6287 | 0.8039 |
| 0.3722 | 47.0 | 3384 | 0.4527 | 0.4337 | 0.6205 | 0.4857 | 0.1768 | 0.2329 | 0.3696 | 0.6084 | 0.7922 |
| 0.3322 | 48.0 | 3456 | 0.4629 | 0.4431 | 0.6236 | 0.5118 | 0.1818 | 0.2373 | 0.3460 | 0.5897 | 0.7954 |
| 0.3624 | 49.0 | 3528 | 0.4431 | 0.4304 | 0.6203 | 0.4511 | 0.1742 | 0.2277 | 0.3827 | 0.6152 | 0.7917 |
| 0.3386 | 50.0 | 3600 | 0.4475 | 0.4260 | 0.6173 | 0.4697 | 0.1727 | 0.2301 | 0.3870 | 0.6283 | 0.8102 |
| 0.3316 | 51.0 | 3672 | 0.4558 | 0.4328 | 0.6194 | 0.4982 | 0.1770 | 0.2345 | 0.3618 | 0.6120 | 0.8124 |
| 0.3259 | 52.0 | 3744 | 0.4316 | 0.4084 | 0.6165 | 0.4234 | 0.1630 | 0.2245 | 0.4311 | 0.6809 | 0.8148 |
| 0.3299 | 53.0 | 3816 | 0.4489 | 0.4222 | 0.6198 | 0.4779 | 0.1706 | 0.2327 | 0.4049 | 0.6441 | 0.8021 |
| 0.3334 | 54.0 | 3888 | 0.4831 | 0.4598 | 0.6319 | 0.5716 | 0.1902 | 0.2476 | 0.3281 | 0.5597 | 0.7549 |
| 0.3342 | 55.0 | 3960 | 0.4478 | 0.4288 | 0.6166 | 0.4786 | 0.1745 | 0.2310 | 0.3749 | 0.6218 | 0.8091 |
| 0.3276 | 56.0 | 4032 | 0.4524 | 0.4342 | 0.6192 | 0.4852 | 0.1773 | 0.2326 | 0.3596 | 0.6113 | 0.8007 |
| 0.326 | 57.0 | 4104 | 0.4411 | 0.4226 | 0.6162 | 0.4486 | 0.1704 | 0.2268 | 0.3947 | 0.6403 | 0.7959 |
| 0.3429 | 58.0 | 4176 | 0.4578 | 0.4418 | 0.6221 | 0.4961 | 0.1812 | 0.2349 | 0.3497 | 0.5956 | 0.7750 |
| 0.3347 | 59.0 | 4248 | 0.4586 | 0.4409 | 0.6220 | 0.4946 | 0.1808 | 0.2347 | 0.3439 | 0.6004 | 0.7869 |
| 0.3215 | 60.0 | 4320 | 0.4583 | 0.4382 | 0.6232 | 0.4974 | 0.1789 | 0.2357 | 0.3667 | 0.6008 | 0.7855 |
| 0.331 | 61.0 | 4392 | 0.4412 | 0.4206 | 0.6145 | 0.4579 | 0.1699 | 0.2276 | 0.3966 | 0.6413 | 0.8047 |
| 0.3124 | 62.0 | 4464 | 0.4455 | 0.4236 | 0.6181 | 0.4727 | 0.1715 | 0.2313 | 0.3902 | 0.6417 | 0.8098 |
| 0.322 | 63.0 | 4536 | 0.4406 | 0.4230 | 0.6143 | 0.4548 | 0.1716 | 0.2269 | 0.3775 | 0.6425 | 0.8115 |
| 0.3194 | 64.0 | 4608 | 0.4473 | 0.4331 | 0.6193 | 0.4657 | 0.1765 | 0.2297 | 0.3606 | 0.6122 | 0.8014 |
| 0.3159 | 65.0 | 4680 | 0.4407 | 0.4225 | 0.6186 | 0.4548 | 0.1712 | 0.2293 | 0.3913 | 0.6433 | 0.8075 |
| 0.3118 | 66.0 | 4752 | 0.4478 | 0.4258 | 0.6169 | 0.4801 | 0.1728 | 0.2315 | 0.3762 | 0.6391 | 0.8064 |
| 0.336 | 67.0 | 4824 | 0.4659 | 0.4463 | 0.6252 | 0.5210 | 0.1834 | 0.2394 | 0.3464 | 0.5820 | 0.7786 |
| 0.3233 | 68.0 | 4896 | 0.4370 | 0.4208 | 0.6168 | 0.4452 | 0.1696 | 0.2265 | 0.4019 | 0.6425 | 0.8059 |
| 0.3285 | 69.0 | 4968 | 0.4479 | 0.4340 | 0.6189 | 0.4773 | 0.1771 | 0.2312 | 0.3609 | 0.6136 | 0.7972 |
| 0.3186 | 70.0 | 5040 | 0.4469 | 0.4308 | 0.6198 | 0.4698 | 0.1751 | 0.2310 | 0.3741 | 0.6219 | 0.7966 |
| 0.3351 | 71.0 | 5112 | 0.4476 | 0.4292 | 0.6176 | 0.4769 | 0.1745 | 0.2311 | 0.3718 | 0.6220 | 0.8035 |
| 0.3286 | 72.0 | 5184 | 0.4415 | 0.4229 | 0.6155 | 0.4655 | 0.1713 | 0.2289 | 0.3816 | 0.6376 | 0.8117 |
| 0.3135 | 73.0 | 5256 | 0.4527 | 0.4335 | 0.6198 | 0.4918 | 0.1769 | 0.2338 | 0.3621 | 0.6152 | 0.8036 |
| 0.3244 | 74.0 | 5328 | 0.4449 | 0.4290 | 0.6171 | 0.4685 | 0.1746 | 0.2296 | 0.3667 | 0.6234 | 0.8073 |
| 0.3253 | 75.0 | 5400 | 0.4450 | 0.4303 | 0.6182 | 0.4680 | 0.1750 | 0.2296 | 0.3703 | 0.6185 | 0.8013 |
| 0.3072 | 76.0 | 5472 | 0.4312 | 0.4212 | 0.6161 | 0.4337 | 0.1700 | 0.2242 | 0.3840 | 0.6411 | 0.8104 |
| 0.3159 | 77.0 | 5544 | 0.4434 | 0.4314 | 0.6186 | 0.4636 | 0.1754 | 0.2290 | 0.3643 | 0.6171 | 0.7996 |
| 0.3176 | 78.0 | 5616 | 0.4319 | 0.4207 | 0.6177 | 0.4330 | 0.1695 | 0.2249 | 0.3889 | 0.6524 | 0.8080 |
| 0.3243 | 79.0 | 5688 | 0.4432 | 0.4304 | 0.6186 | 0.4698 | 0.1752 | 0.2302 | 0.3667 | 0.6218 | 0.8058 |
| 0.3183 | 80.0 | 5760 | 0.4438 | 0.4288 | 0.6175 | 0.4665 | 0.1742 | 0.2294 | 0.3730 | 0.6235 | 0.8030 |
| 0.323 | 81.0 | 5832 | 0.4365 | 0.4248 | 0.6170 | 0.4480 | 0.1716 | 0.2263 | 0.3820 | 0.6313 | 0.8056 |
| 0.3348 | 82.0 | 5904 | 0.4385 | 0.4280 | 0.6179 | 0.4532 | 0.1738 | 0.2273 | 0.3651 | 0.6249 | 0.8099 |
| 0.2948 | 83.0 | 5976 | 0.4456 | 0.4330 | 0.6190 | 0.4727 | 0.1763 | 0.2305 | 0.3622 | 0.6121 | 0.7981 |
| 0.3156 | 84.0 | 6048 | 0.4349 | 0.4236 | 0.6155 | 0.4442 | 0.1712 | 0.2252 | 0.3834 | 0.6331 | 0.8086 |
| 0.3227 | 85.0 | 6120 | 0.4352 | 0.4251 | 0.6160 | 0.4423 | 0.1719 | 0.2250 | 0.3799 | 0.6293 | 0.8055 |
| 0.3044 | 86.0 | 6192 | 0.4349 | 0.4235 | 0.6165 | 0.4444 | 0.1714 | 0.2259 | 0.3858 | 0.6312 | 0.8108 |
| 0.3067 | 87.0 | 6264 | 0.4293 | 0.4214 | 0.6150 | 0.4293 | 0.1700 | 0.2229 | 0.3862 | 0.6397 | 0.8102 |
| 0.3083 | 88.0 | 6336 | 0.4260 | 0.4164 | 0.6139 | 0.4229 | 0.1673 | 0.2221 | 0.3989 | 0.6536 | 0.8126 |
| 0.2989 | 89.0 | 6408 | 0.4381 | 0.4270 | 0.6168 | 0.4526 | 0.1731 | 0.2270 | 0.3766 | 0.6248 | 0.8051 |
| 0.3232 | 90.0 | 6480 | 0.4352 | 0.4230 | 0.6158 | 0.4480 | 0.1711 | 0.2263 | 0.3854 | 0.6358 | 0.8112 |
| 0.3201 | 91.0 | 6552 | 0.4361 | 0.4242 | 0.6164 | 0.4462 | 0.1718 | 0.2262 | 0.3842 | 0.6327 | 0.8078 |
| 0.3096 | 92.0 | 6624 | 0.4390 | 0.4273 | 0.6171 | 0.4563 | 0.1733 | 0.2279 | 0.3790 | 0.6237 | 0.8046 |
| 0.322 | 93.0 | 6696 | 0.4338 | 0.4229 | 0.6157 | 0.4447 | 0.1709 | 0.2258 | 0.3889 | 0.6351 | 0.8069 |
| 0.3096 | 94.0 | 6768 | 0.4348 | 0.4238 | 0.6160 | 0.4448 | 0.1714 | 0.2256 | 0.3839 | 0.6342 | 0.8077 |
| 0.3067 | 95.0 | 6840 | 0.4414 | 0.4298 | 0.6181 | 0.4628 | 0.1748 | 0.2290 | 0.3707 | 0.6205 | 0.8027 |
| 0.3198 | 96.0 | 6912 | 0.4334 | 0.4228 | 0.6162 | 0.4434 | 0.1709 | 0.2258 | 0.3872 | 0.6370 | 0.8077 |
| 0.295 | 97.0 | 6984 | 0.4367 | 0.4261 | 0.6169 | 0.4507 | 0.1728 | 0.2269 | 0.3791 | 0.6283 | 0.8045 |
| 0.305 | 98.0 | 7056 | 0.4373 | 0.4266 | 0.6171 | 0.4524 | 0.1730 | 0.2273 | 0.3781 | 0.6280 | 0.8046 |
| 0.3304 | 99.0 | 7128 | 0.4334 | 0.4230 | 0.6162 | 0.4432 | 0.1709 | 0.2257 | 0.3874 | 0.6378 | 0.8062 |
| 0.3099 | 100.0 | 7200 | 0.4360 | 0.4251 | 0.6169 | 0.4500 | 0.1721 | 0.2269 | 0.3828 | 0.6326 | 0.8051 |
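
For reference, the metric columns follow the conventions commonly used for monocular depth estimation; the sketch below shows the standard definitions of MAE, RMSE, Abs Rel, the log errors and the Delta thresholds. It is illustrative only and may differ in details (masking, depth clipping, log base) from the evaluation script that produced the numbers above.

```python
# Common monocular depth-estimation metrics (standard definitions; the exact
# evaluation code behind the table above may differ, e.g. in masking or log base).
import numpy as np

def depth_metrics(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-6) -> dict:
    """Compute depth metrics over pixels with valid (positive) ground truth.

    Assumes `pred` is strictly positive on the valid pixels.
    """
    mask = gt > eps                          # ignore invalid / zero-depth pixels
    pred, gt = pred[mask], gt[mask]

    ratio = np.maximum(pred / gt, gt / pred)  # per-pixel max(d/d*, d*/d)
    return {
        "mae": float(np.mean(np.abs(pred - gt))),
        "rmse": float(np.sqrt(np.mean((pred - gt) ** 2))),
        "abs_rel": float(np.mean(np.abs(pred - gt) / gt)),
        "log_mae": float(np.mean(np.abs(np.log10(pred) - np.log10(gt)))),
        "log_rmse": float(np.sqrt(np.mean((np.log(pred) - np.log(gt)) ** 2))),
        "delta1": float(np.mean(ratio < 1.25)),
        "delta2": float(np.mean(ratio < 1.25 ** 2)),
        "delta3": float(np.mean(ratio < 1.25 ** 3)),
    }
```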
