{"id":8028,"date":"2022-09-23T03:57:00","date_gmt":"2022-09-22T19:57:00","guid":{"rendered":"http:\/\/139.9.1.231\/?p=8028"},"modified":"2022-09-20T15:57:43","modified_gmt":"2022-09-20T07:57:43","slug":"swin-transformer-v2","status":"publish","type":"post","link":"http:\/\/139.9.1.231\/index.php\/2022\/09\/23\/swin-transformer-v2\/","title":{"rendered":"Swin Transformer v2"},"content":{"rendered":"\n<p>Paper: https:\/\/arxiv.org\/pdf\/2111.09883.pdf<\/p>\n\n\n\n<p><strong>Swin Transformer V2: Scaling Up Capacity and Resolution<\/strong><\/p>\n\n\n\n\n\n<p>Transformer is a classic NLP model proposed by a Google team in 2017; the currently popular BERT is also built on it. The Transformer model uses the Self-Attention mechanism and <strong>does not adopt<\/strong> the <strong>sequential structure<\/strong> of RNNs, so the model <strong>can be trained in parallel<\/strong> and <strong>has access to global information.<\/strong><\/p>\n\n\n\n<p>This article introduces Swin Transformer v2, the upgraded version of the Swin Transformer series. Swin Transformer is a general-purpose vision Transformer that topped the leaderboards of major CV tasks, surpassing the previous SOTA across image classification, object detection, and segmentation; on ADE20K semantic segmentation it reached 53.5 mIoU, about 4.5 mIoU above the previous SOTA, making it a serious candidate to replace CNNs. In addition, this article covers the code implementation of Swin MLP: the Swin Transformer authors implemented a Swin MLP model on top of their existing models, demonstrating the effectiveness of window-based attention for MLP models.<\/p>\n\n\n\n<p>There are two kinds of Swin Transformer Blocks. Their overall structure matches a standard Transformer Block; they differ only in the internal attention module, which is either Window-based MSA or Shifted Window-based MSA. Unlike ordinary MSA, Window-based MSA computes self-attention within each window, so the computation scales linearly with the sequence length N=hw. Although Window-based MSA greatly reduces computation, it sacrifices the modeling of relationships between windows; the lack of information exchange across non-overlapping windows hurts the model's representational power. Shifted Window-based MSA solves this problem: the window positions of the next Swin Transformer Block are shifted, producing a different, non-overlapping partition of the patches.<\/p>\n\n\n\n<p>Building on Swin Transformer, researchers further developed SwinIR for low-level image restoration tasks.<\/p>\n\n\n\n<h2><strong>How Swin Transformer v2 works<\/strong><\/h2>\n\n\n\n<p>Swin Transformer proposed a general-purpose Transformer architecture for vision tasks. MSRA went further and built a large Swin Transformer with 3 billion parameters that accepts input resolutions up to 1536\u00d71536, called SwinV2. It set new records on multiple benchmark datasets (ImageNet classification, COCO detection, ADE20K semantic segmentation, and Kinetics-400 action classification): 84.0% Top-1 accuracy on ImageNet-V2 image classification, 63.1\/54.4 box\/mask mAP on COCO object detection, 59.9 mIoU on ADE20K semantic segmentation, and 86.8% Top-1 accuracy on Kinetics-400 video action recognition.<\/p>\n\n\n\n<p><strong>The core goal of Swin Transformer v2 is to scale the Swin Transformer model up<\/strong>, into a large pre-trained model in the spirit of BERT-large with its 340M parameters. In NLP, some pre-trained large models, such as Megatron-Turing-530B or Switch-Transformer-1.6T, have reached 530 billion or even 1.6 trillion parameters.<\/p>\n\n\n\n<p>Large vision models, on the other hand, have lagged behind. The largest Vision Transformer models so far have only reached 1-2 billion parameters, and they only support image recognition tasks. This is partly because of <strong>the following difficulties in training and deployment:<\/strong><\/p>\n\n\n\n<ul><li><strong>Problem 1:<\/strong> training instability. In large models, the discrepancy in activation amplitudes across layers becomes much larger. Activations accumulate layer by layer, so amplitudes in deep layers are significantly larger than in shallow ones. Figure 1 below shows this instability when scaling up model capacity: when the original Swin Transformer is scaled from a small model to a large one, activation values in the deep layers increase dramatically, and the gap between the highest and lowest amplitudes reaches 10<sup>4<\/sup>. When scaled further to a huge size (658M parameters), the model cannot complete training, as shown in Figure 2.<\/li><\/ul>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic4.zhimg.com\/v2-d5e3f5469799204997e3374778301c17_r.jpg\" alt=\"\"\/><figcaption>Figure 1: instability when scaling up model capacity<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic2.zhimg.com\/v2-e79ef75c5971509080f73bc724c9682d_r.jpg\" alt=\"\"\/><figcaption>Figure 2: with Pre-Norm, training cannot complete when scaling further to a huge size (658M parameters)<\/figcaption><\/figure>\n\n\n\n<ul><li><strong>Problem 2:<\/strong> many downstream vision tasks require high-resolution images or windows, yet models are pre-trained at low resolution and fine-tuned at high resolution. The conventional way to bridge the resolution gap is bi-cubic interpolation of the position encodings, which is sub-optimal. Figure 3 below compares the performance of different position encoding schemes: when we <strong>directly test at a larger image resolution and window size<\/strong> a model pre-trained on ImageNet-1k (resolution 256\u00d7256, window size 8\u00d78), accuracy drops significantly.<\/li><\/ul>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic1.zhimg.com\/v2-56691f3bd9d2c71d24c934944a9eb838_r.jpg\" alt=\"\"\/><figcaption>Figure 3: performance comparison of different position encoding schemes<\/figcaption><\/figure>\n\n\n\n<ul><li><strong>Problem 3:<\/strong> at high image resolutions, GPU memory consumption is also a problem.<\/li><\/ul>\n\n\n\n<p>To address these problems, the authors propose:<\/p>\n\n\n\n<h3><strong>Method 1: post normalization:<\/strong> addressing training instability.<\/h3>\n\n\n\n<p>Place the Layer Normalization layer after the Attention or MLP module. This keeps the output of each residual block from varying too much, because both the main branch and the residual branch are outputs of LN layers and are therefore constrained by LN's normalizing effect. As shown in Figure 1 above, this keeps the output magnitudes of all layers roughly comparable. When training the largest models, the authors additionally insert an LN layer on the main branch after every 6 Transformer Blocks to further stabilize training and output amplitudes.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic4.zhimg.com\/v2-7469ab0bda387b2812f0d2ba87115163_r.jpg\" alt=\"\"\/><figcaption>Figure 4: improvements of Swin v2 over Swin Transformer (in red)<\/figcaption><\/figure>\n\n\n\n<h3><strong>Method 2: scaled cosine attention:<\/strong> addressing training instability.<\/h3>\n\n\n\n<p>In the original self-attention computation, the similarity between query and key is measured by a dot product. The authors found that attention maps learned this way are often dominated by a few pixel pairs, so they replaced the dot product with a cosine function to measure the similarity between query and key.<\/p>\n\n\n\n<p>\\[\\operatorname{Sim}\\left(\\mathbf{q}_i, \\mathbf{k}_j\\right)=\\cos \\left(\\mathbf{q}_i, \\mathbf{k}_j\\right) \/ \\tau+B_{i j}\\]<br>where \\(B_{i j}\\) is the relative position bias described below, and \\(\\tau\\) is a learnable parameter. The cosine function is naturally normalized, which yields milder attention values.<\/p>\n\n\n\n<h3>Method 3: log-spaced continuous position bias: handling the mismatch of position encodings when the resolution changes.<\/h3>\n\n\n\n<ul><li>This method lets model weights pre-trained at low resolution transfer more smoothly to high-resolution weights.<br>Let us first review the relative position bias of Swin Transformer:<br>\\[\\operatorname{Attention}(Q, K, V)=\\operatorname{SoftMax}\\left(Q K^T \/ \\sqrt{d}+B\\right) V\\]<br>where \\(B \\in \\mathbb{R}^{M^2 \\times M^2}\\) is the relative position bias of each head, \\(Q, K, V \\in \\mathbb{R}^{M^2 \\times d}\\) are the query, key and value of window-based attention, \\(d\\) is the query\/key dimension, and \\(M\\) is the window size (so each window contains \\(M^2\\) patches).<\/li><\/ul>\n\n\n\n<p>The authors introduce a <strong>log-spaced continuous position bias<\/strong> so that the relative position bias transitions smoothly across different window resolutions.<\/p>\n\n\n\n<h3><strong>Method 4: techniques for saving GPU memory:<\/strong><\/h3>\n\n\n\n<p><strong>1. Zero-Redundancy Optimizer (ZeRO):<\/strong><\/p>\n\n\n\n<blockquote class=\"wp-block-quote\"><p>From the paper: ZeRO: Memory Optimizations Toward Training Trillion Parameter Models<\/p><\/blockquote>\n\n\n\n<p>Conventional data-parallel training (e.g. DDP) broadcasts the whole model to every GPU, which is very unfriendly to large models: for a 3,000M = 3B-parameter model with the AdamW optimizer and 32-bit floats, the parameters and optimizer states alone occupy 48 GB of GPU memory. The ZeRO optimizer partitions the model parameters and the corresponding optimizer states across multiple GPUs, greatly reducing memory consumption. Training uses the DeepSpeed framework with the ZeRO stage-1 option.<\/p>\n\n\n\n<p><strong>2. Activation check-pointing:<\/strong><\/p>\n\n\n\n<blockquote class=\"wp-block-quote\"><p>From the paper: Training Deep Nets with Sublinear Memory Cost<\/p><\/blockquote>\n\n\n\n<p>The feature maps in the Transformer layers also consume a lot of GPU memory and become a bottleneck when the image and window resolutions are high. Activation checkpointing trades compute for memory, reducing training speed by at most 30%.<\/p>\n\n\n\n<p><strong>3. Sequential self-attention computation:<\/strong><\/p>\n\n\n\n<p>When training large models at very high resolution, e.g. 1536\u00d71536 with window size 32\u00d732, even with the two optimizations above, a regular GPU (40 GB of memory) still cannot cope. The authors found that in this case the self-attention module becomes the bottleneck. To solve this, they compute self-attention sequentially, window by window, instead of using the previous batched computation. This optimization is applied to the layers of the first two stages and has only a small impact on the overall training speed.<\/p>\n\n\n\n<p>In this work, the authors also moderately enlarged the ImageNet-22k dataset by a factor of 5, to 70 million images with noisy labels, and adopted a self-supervised learning method to better exploit this data. By combining these two strategies, they trained a powerful 3-billion-parameter Swin Transformer model that set new records on multiple benchmark datasets, with input resolutions up to 1536\u00d71536 (on Nvidia A100-40G GPUs). In addition, the authors shared several key implementation details of SwinV2 that bring significant savings in GPU memory consumption, making it feasible to train large vision models on regular GPUs. 
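The batched-versus-sequential trade-off in window attention can be illustrated with a small NumPy sketch. This is not the authors' implementation: the shapes, the `tau` value, and the omission of the relative position bias are simplifications for illustration only. Both functions compute the same (cosine-similarity) window attention, but the sequential variant keeps only one tokens-by-tokens similarity matrix in memory at a time instead of one per window.

```python
import numpy as np

def _l2norm(x):
    # Normalize the feature dimension so that dot products become cosines.
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def _softmax(x):
    # Numerically stable softmax over the last axis.
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def batched_window_attention(q, k, v, tau=0.1):
    # q, k, v: (num_windows, tokens, dim). All windows at once: the
    # (num_windows, tokens, tokens) similarity tensor dominates peak memory
    # at large resolutions and window sizes.
    sim = np.einsum("wtd,wsd->wts", _l2norm(q), _l2norm(k)) / tau
    return _softmax(sim) @ v

def sequential_window_attention(q, k, v, tau=0.1):
    # One window at a time: identical output, but only a single
    # (tokens, tokens) similarity matrix is alive at any moment.
    out = np.empty_like(v)
    for w in range(q.shape[0]):
        sim = (_l2norm(q[w]) @ _l2norm(k[w]).T) / tau
        out[w] = _softmax(sim) @ v[w]
    return out
```

The loop costs some speed but caps the memory footprint, which matches the motivation given above for applying the sequential computation at the high-resolution early stages.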
The authors' goal is to spark more research on large pre-trained vision models and ultimately close the capacity gap between vision models and language models.<\/p>\n\n\n\n<h3><strong>Swin V2 model configurations:<\/strong><\/h3>\n\n\n\n<ul><li>SwinV2-T: C= 96, layer numbers ={2,2,6,2}<\/li><li>SwinV2-S: C= 96, layer numbers ={2,2,18,2}<\/li><li>SwinV2-B: C= 128, layer numbers ={2,2,18,2}<\/li><li>SwinV2-L: C= 192, layer numbers ={2,2,18,2}<\/li><li>SwinV2-H: C= 352, layer numbers ={2,2,18,2}<\/li><li>SwinV2-G: C= 512, layer numbers ={2,2,42,2}<\/li><\/ul>\n\n\n\n<p>For training SwinV2-H and SwinV2-G, the authors insert an LN layer on the main branch after every 6 Transformer Blocks to further stabilize training and output amplitudes.<\/p>\n\n\n\n<h2><strong>Experiments<\/strong><\/h2>\n\n\n\n<p><strong>Model:<\/strong> SwinV2-G, 3B parameters<\/p>\n\n\n\n<p><strong>Image classification<\/strong><\/p>\n\n\n\n<p><strong>Dataset for Evaluation:<\/strong> ImageNet-1k, ImageNet-1k V2<\/p>\n\n\n\n<p><strong>Dataset for Pre-Training:<\/strong> ImageNet-22K-ext (70M images, 22k classes)<\/p>\n\n\n\n<p><strong>Training strategy:<\/strong> a resolution of 192\u00d7192 is used to reduce training cost, with a 2-step pre-training strategy: first train in a self-supervised fashion on the ImageNet-22K-ext dataset for 20 epochs, then in a supervised fashion on the same dataset for 30 epochs. The SwinV2-G model reaches an impressive 90.17% Top-1 accuracy on ImageNet-1k and 84.00% Top-1 accuracy on ImageNet-1k V2, surpassing the previous best of 83.33%.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic4.zhimg.com\/v2-eb1342b3c89a611586970f234b191c0f_r.jpg\" alt=\"\"\/><figcaption>Figure 5: image classification results<\/figcaption><\/figure>\n\n\n\n<p>Moreover, with the Swin V2 training strategy, the Base and Large models also improve further: SwinV2-B and SwinV2-L gain 0.8% and 0.4% over SwinV1-B and SwinV1-L respectively. The gains come from more labelled data (ImageNet-22k-ext, 70M images), stronger regularization, and the self-supervised learning strategy.<\/p>\n\n\n\n<p><strong>Object detection, Instance Segmentation<\/strong><\/p>\n\n\n\n<p><strong>Dataset for Evaluation:<\/strong> COCO<\/p>\n\n\n\n<p><strong>Dataset for Pre-Training:<\/strong> Object 365 v2<\/p>\n\n\n\n<p>Figure 6 below compares the SwinV2-G model with the previous best-performing models on COCO object detection and instance segmentation. SwinV2-G achieves 63.1\/54.4 box\/mask AP on COCO test-dev, an improvement of +1.8\/1.4 over SoftTeacher (61.3\/53.0).<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic4.zhimg.com\/v2-d8868852c93b8dae5eabf40c622715fb_r.jpg\" alt=\"\"\/><figcaption>Figure 6: COCO object detection and instance segmentation<\/figcaption><\/figure>\n\n\n\n<p><strong>Semantic segmentation<\/strong><\/p>\n\n\n\n<p><strong>Dataset for Evaluation:<\/strong> ADE20K<\/p>\n\n\n\n<p>Figure 7 below compares the SwinV2-G model with previous SOTA results on the ADE20K semantic segmentation benchmark. SwinV2-G achieves 59.9 mIoU on the ADE20K val set, 1.5 higher than BEiT's 58.4.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic3.zhimg.com\/v2-cd273584d8650ec7b6b3dab65250e57e_r.jpg\" alt=\"\"\/><figcaption>Figure 7: ADE20K semantic segmentation<\/figcaption><\/figure>\n\n\n\n<p><strong>Video action classification<\/strong><\/p>\n\n\n\n<p><strong>Dataset for Evaluation:<\/strong> Kinetics-400 (K400)<\/p>\n\n\n\n<p>Figure 8 below compares the SwinV2-G model with previous SOTA results on the Kinetics-400 action classification benchmark. Video-SwinV2-G achieves 86.8% top-1 accuracy, +1.4% higher than the previous TokenLearner method's 85.4%.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic1.zhimg.com\/v2-eade9854e04af79283aeb9cbb5070bbc_r.jpg\" alt=\"\"\/><figcaption>Figure 8: K400 video action classification<\/figcaption><\/figure>\n\n\n\n<p><strong>Ablation: the effect of post-norm and scaled cosine attention<\/strong><\/p>\n\n\n\n<p>As shown in Figure 9, both techniques improve the performance of Swin-T, Swin-S and Swin-B, by 0.2%, 0.4% and 0.5% overall, which suggests they benefit larger models more. More importantly, they make training more stable: for the Swin-H and Swin-G models, self-supervised pre-training with the original Swin V1 fails to converge, whereas the Swin V2 models train well.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img src=\"https:\/\/pic1.zhimg.com\/v2-8e3b037807a7d344a2e41d99648f1ce4_r.jpg\" alt=\"\"\/><figcaption>Figure 9: ablation results for post-norm and scaled cosine attention<\/figcaption><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Paper: https:\/\/arxiv.org\/pdf\/2111.09883.pdf Swin Transfo &hellip; <a href=\"http:\/\/139.9.1.231\/index.php\/2022\/09\/23\/swin-transformer-v2\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\">Swin Transformer v2<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[21,4,9],"tags":[],"_links":{"self":[{"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/posts\/8028"}],"collection":[{"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/comments?post=8028"}],"version-history":[{"count":10,"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/posts\/8028\/revisions"}],"predecessor-version":[{"id":8038,"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/posts\/8028\/revisions\/8038"}],"wp:attachment":[{"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/media?parent=8028"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\/v2\/categories?post=8028"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/139.9.1.231\/index.php\/wp-json\/wp\
/v2\/tags?post=8028"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}