<img src="https://imagedelivery.net/phxEHgsq3j8gSnfNAJVJSQ/NODE2_4BDCF2BA-0053-405A-A98E-7820FC0C6846/public" width="700" height="732">

Batch Normalization refers to normalizing each feature separately across all samples in a batch.

<img src="https://imagedelivery.net/phxEHgsq3j8gSnfNAJVJSQ/NODE2_F43AF76B-049A-4BF0-B6B7-0194152A587F/public" width="710" height="764">

Layer Normalization refers to normalizing all features within each individual sample separately.

Reference: <a href="https://blog.csdn.net/Little_White_9/article/details/123345062">https://blog.csdn.net/Little_White_9/article/details/123345062</a>
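The two definitions differ only in which axis the statistics are computed over. Below is a minimal NumPy sketch of that difference for a 2-D (batch, features) tensor; the epsilon value is an arbitrary choice, and the learnable scale/shift parameters (gamma, beta) of the real layers are omitted for brevity.

```python
import numpy as np

# Toy batch: 4 samples, each with 3 features.
x = np.random.randn(4, 3)
eps = 1e-5  # small constant for numerical stability (assumed value)

# Batch Normalization: statistics over the batch dimension (axis=0),
# i.e. each feature column is normalized across all samples in the batch.
bn_mean = x.mean(axis=0, keepdims=True)   # shape (1, 3)
bn_var = x.var(axis=0, keepdims=True)
x_bn = (x - bn_mean) / np.sqrt(bn_var + eps)

# Layer Normalization: statistics over the feature dimension (axis=1),
# i.e. all features of each individual sample are normalized together.
ln_mean = x.mean(axis=1, keepdims=True)   # shape (4, 1)
ln_var = x.var(axis=1, keepdims=True)
x_ln = (x - ln_mean) / np.sqrt(ln_var + eps)

print(x_bn.mean(axis=0))  # ~0 per feature (BatchNorm)
print(x_ln.mean(axis=1))  # ~0 per sample (LayerNorm)
```

Because LayerNorm's statistics depend only on a single sample, its behavior does not change with batch size, which is one reason it is preferred in sequence models; BatchNorm's statistics depend on the whole batch.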