From patchwork Fri May 2 05:30:53 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 887269
Date: Fri, 02 May 2025 13:30:53 +0800
From: Herbert Xu
Subject: [v2 PATCH 1/9] crypto: lib/sha256 - Add helpers for block-based shash
To: Linux Crypto Mailing List
X-Mailing-List: linux-crypto@vger.kernel.org

Add an internal sha256_finup helper and move the finalisation code
from __sha256_final into it.  Also add sha256_choose_blocks and
CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD so that the Crypto API can use the
SIMD block function unconditionally.  The Crypto API must not be used
in hard IRQs and there is no reason to have a fallback path for
hardirqs.
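The finalisation scheme that moves into sha256_finup (append the 0x80 terminator, zero-pad, and store the message length in bits big-endian in the last 8 bytes of the final block) can be sketched as a stand-alone userspace model. `pad_final` below is a hypothetical illustration of the padding only, not the kernel helper itself:

```c
#include <stdint.h>
#include <string.h>

#define BLOCK 64  /* SHA256_BLOCK_SIZE */

/* Hypothetical model of the padding done by sha256_finup(): len is the
 * number of buffered bytes (< BLOCK), count the total message length in
 * bytes.  Returns how many 64-byte blocks were written into out[]
 * (1, or 2 when the 8-byte length field no longer fits). */
static int pad_final(uint8_t out[2 * BLOCK], size_t len, uint64_t count)
{
        const size_t bit_offset = BLOCK - 8;
        uint8_t *buf = out;
        int blocks = 1;
        uint64_t bits;
        int i;

        buf[len++] = 0x80;              /* mandatory terminator bit */
        if (len > bit_offset) {         /* no room for the bit count */
                memset(&buf[len], 0, BLOCK - len);
                buf = out + BLOCK;      /* spill into a second block */
                len = 0;
                blocks = 2;
        }
        memset(&buf[len], 0, bit_offset - len);
        bits = count << 3;              /* bytes -> bits, big-endian */
        for (i = 0; i < 8; i++)
                buf[bit_offset + i] = (uint8_t)(bits >> (56 - 8 * i));
        return blocks;
}
```

For an empty message this yields a single block beginning with 0x80 and ending in eight zero bytes; buffering 60 bytes forces a second block, because 60 + 1 exceeds the 56-byte offset of the length field.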
Signed-off-by: Herbert Xu
---
 include/crypto/internal/sha2.h | 45 ++++++++++++++++++++++++++++++++++
 lib/crypto/Kconfig             |  8 ++++++
 lib/crypto/sha256.c            | 32 +++++++-----------------
 3 files changed, 62 insertions(+), 23 deletions(-)

diff --git a/include/crypto/internal/sha2.h b/include/crypto/internal/sha2.h
index d641c67abcbc..fff156f66edc 100644
--- a/include/crypto/internal/sha2.h
+++ b/include/crypto/internal/sha2.h
@@ -3,7 +3,12 @@
 #ifndef _CRYPTO_INTERNAL_SHA2_H
 #define _CRYPTO_INTERNAL_SHA2_H

+#include
 #include
+#include
+#include
+#include
+#include

 void sha256_update_generic(struct sha256_state *sctx,
                            const u8 *data, size_t len);
@@ -24,5 +29,45 @@ void sha256_blocks_generic(u32 state[SHA256_STATE_WORDS],
                            const u8 *data, size_t nblocks);
 void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
                         const u8 *data, size_t nblocks);
+void sha256_blocks_simd(u32 state[SHA256_STATE_WORDS],
+                        const u8 *data, size_t nblocks);
+
+static inline void sha256_choose_blocks(
+        u32 state[SHA256_STATE_WORDS], const u8 *data, size_t nblocks,
+        bool force_generic, bool force_simd)
+{
+        if (!IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_SHA256) || force_generic)
+                sha256_blocks_generic(state, data, nblocks);
+        else if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD) &&
+                 (force_simd || crypto_simd_usable()))
+                sha256_blocks_simd(state, data, nblocks);
+        else
+                sha256_blocks_arch(state, data, nblocks);
+}
+
+static __always_inline void sha256_finup(
+        struct crypto_sha256_state *sctx, u8 buf[SHA256_BLOCK_SIZE],
+        size_t len, u8 out[SHA256_DIGEST_SIZE], size_t digest_size,
+        bool force_generic, bool force_simd)
+{
+        const size_t bit_offset = SHA256_BLOCK_SIZE - 8;
+        __be64 *bits = (__be64 *)&buf[bit_offset];
+        int i;
+
+        buf[len++] = 0x80;
+        if (len > bit_offset) {
+                memset(&buf[len], 0, SHA256_BLOCK_SIZE - len);
+                sha256_choose_blocks(sctx->state, buf, 1, force_generic,
+                                     force_simd);
+                len = 0;
+        }
+
+        memset(&buf[len], 0, bit_offset - len);
+        *bits = cpu_to_be64(sctx->count << 3);
+        sha256_choose_blocks(sctx->state, buf, 1, force_generic, force_simd);
+
+        for (i = 0; i < digest_size; i += 4)
+                put_unaligned_be32(sctx->state[i / 4], out + i);
+}

 #endif /* _CRYPTO_INTERNAL_SHA2_H */
diff --git a/lib/crypto/Kconfig b/lib/crypto/Kconfig
index 6319358b38c2..1ec1466108cc 100644
--- a/lib/crypto/Kconfig
+++ b/lib/crypto/Kconfig
@@ -150,6 +150,14 @@ config CRYPTO_ARCH_HAVE_LIB_SHA256
           Declares whether the architecture provides an arch-specific
           accelerated implementation of the SHA-256 library interface.

+config CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD
+        bool
+        help
+          Declares whether the architecture provides an arch-specific
+          accelerated implementation of the SHA-256 library interface
+          that is SIMD-based and therefore not usable in hardirq
+          context.
+
 config CRYPTO_LIB_SHA256_GENERIC
         tristate
         default CRYPTO_LIB_SHA256 if !CRYPTO_ARCH_HAVE_LIB_SHA256
diff --git a/lib/crypto/sha256.c b/lib/crypto/sha256.c
index 563f09c9f381..2ced29efa181 100644
--- a/lib/crypto/sha256.c
+++ b/lib/crypto/sha256.c
@@ -15,7 +15,6 @@
 #include
 #include
 #include
-#include

 /*
  * If __DISABLE_EXPORTS is defined, then this file is being compiled for a
@@ -26,14 +25,16 @@
 #include "sha256-generic.c"
 #endif

+static inline bool sha256_purgatory(void)
+{
+        return __is_defined(__DISABLE_EXPORTS);
+}
+
 static inline void sha256_blocks(u32 state[SHA256_STATE_WORDS],
                                  const u8 *data, size_t nblocks,
                                  bool force_generic)
 {
-#if IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_SHA256) && !defined(__DISABLE_EXPORTS)
-        if (!force_generic)
-                return sha256_blocks_arch(state, data, nblocks);
-#endif
-        sha256_blocks_generic(state, data, nblocks);
+        sha256_choose_blocks(state, data, nblocks,
+                             force_generic || sha256_purgatory(), false);
 }

 static inline void __sha256_update(struct sha256_state *sctx, const u8 *data,
@@ -79,25 +80,10 @@ EXPORT_SYMBOL(sha256_update);
 static inline void __sha256_final(struct sha256_state *sctx, u8 *out,
                                   size_t digest_size, bool force_generic)
 {
-        const size_t bit_offset = SHA256_BLOCK_SIZE - sizeof(__be64);
-        __be64 *bits = (__be64 *)&sctx->buf[bit_offset];
         size_t partial = sctx->count % SHA256_BLOCK_SIZE;
-        size_t i;
-
-        sctx->buf[partial++] = 0x80;
-        if (partial > bit_offset) {
-                memset(&sctx->buf[partial], 0, SHA256_BLOCK_SIZE - partial);
-                sha256_blocks(sctx->state, sctx->buf, 1, force_generic);
-                partial = 0;
-        }
-
-        memset(&sctx->buf[partial], 0, bit_offset - partial);
-        *bits = cpu_to_be64(sctx->count << 3);
-        sha256_blocks(sctx->state, sctx->buf, 1, force_generic);
-
-        for (i = 0; i < digest_size; i += 4)
-                put_unaligned_be32(sctx->state[i / 4], out + i);
+        sha256_finup(&sctx->ctx, sctx->buf, partial, out, digest_size,
+                     force_generic || sha256_purgatory(), false);

         memzero_explicit(sctx, sizeof(*sctx));
 }

From patchwork Fri May 2 05:30:56 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 887268
Date: Fri, 02 May 2025 13:30:56 +0800
From: Herbert Xu
Subject: [v2 PATCH 2/9] crypto: sha256 - Use the partial block API for generic
To: Linux Crypto Mailing List
X-Mailing-List: linux-crypto@vger.kernel.org

The shash interface already handles partial blocks, use it for
sha224-generic and sha256-generic instead of going through the
lib/sha256 interface.

Signed-off-by: Herbert Xu
---
 crypto/sha256.c       | 83 +++++++++++++++++++++++--------------------
 include/crypto/sha2.h | 14 ++++++--
 2 files changed, 56 insertions(+), 41 deletions(-)

diff --git a/crypto/sha256.c b/crypto/sha256.c
index c2588d08ee3e..d6b90c6ea63d 100644
--- a/crypto/sha256.c
+++ b/crypto/sha256.c
@@ -30,15 +30,26 @@ EXPORT_SYMBOL_GPL(sha256_zero_message_hash);

 static int crypto_sha256_init(struct shash_desc *desc)
 {
-        sha256_init(shash_desc_ctx(desc));
+        sha256_block_init(shash_desc_ctx(desc));
         return 0;
 }

+static inline int crypto_sha256_update(struct shash_desc *desc, const u8 *data,
+                                       unsigned int len, bool force_generic)
+{
+        struct crypto_sha256_state *sctx = shash_desc_ctx(desc);
+        int remain = len % SHA256_BLOCK_SIZE;
+
+        sctx->count += len - remain;
+        sha256_choose_blocks(sctx->state, data, len / SHA256_BLOCK_SIZE,
+                             force_generic, !force_generic);
+        return remain;
+}
+
 static int crypto_sha256_update_generic(struct shash_desc *desc, const u8 *data,
                                         unsigned int len)
 {
-        sha256_update_generic(shash_desc_ctx(desc), data, len);
-        return 0;
+        return crypto_sha256_update(desc, data, len, true);
 }

 static int crypto_sha256_update_arch(struct shash_desc *desc, const u8 *data,
@@ -48,26 +59,35 @@ static int crypto_sha256_update_arch(struct shash_desc *desc, const u8 *data,
         return 0;
 }

-static int crypto_sha256_final_generic(struct shash_desc *desc, u8 *out)
-{
-        sha256_final_generic(shash_desc_ctx(desc), out);
-        return 0;
-}
-
 static int crypto_sha256_final_arch(struct shash_desc *desc, u8 *out)
 {
         sha256_final(shash_desc_ctx(desc), out);
         return 0;
 }

+static __always_inline int crypto_sha256_finup(struct shash_desc *desc,
+                                               const u8 *data,
+                                               unsigned int len, u8 *out,
+                                               bool force_generic)
+{
+        struct crypto_sha256_state *sctx = shash_desc_ctx(desc);
+        unsigned int remain = len;
+        u8 *buf;
+
+        if (len >= SHA256_BLOCK_SIZE)
+                remain = crypto_sha256_update(desc, data, len, force_generic);
+        sctx->count += remain;
+        buf = memcpy(sctx + 1, data + len - remain, remain);
+        sha256_finup(sctx, buf, remain, out,
+                     crypto_shash_digestsize(desc->tfm), force_generic,
+                     !force_generic);
+        return 0;
+}
+
 static int crypto_sha256_finup_generic(struct shash_desc *desc, const u8 *data,
                                        unsigned int len, u8 *out)
 {
-        struct sha256_state *sctx = shash_desc_ctx(desc);
-
-        sha256_update_generic(sctx, data, len);
-        sha256_final_generic(sctx, out);
-        return 0;
+        return crypto_sha256_finup(desc, data, len, out, true);
 }

 static int crypto_sha256_finup_arch(struct shash_desc *desc, const u8 *data,
@@ -83,12 +103,8 @@ static int crypto_sha256_finup_arch(struct shash_desc *desc, const u8 *data,
 static int crypto_sha256_digest_generic(struct shash_desc *desc,
                                         const u8 *data,
                                         unsigned int len, u8 *out)
 {
-        struct sha256_state *sctx = shash_desc_ctx(desc);
-
-        sha256_init(sctx);
-        sha256_update_generic(sctx, data, len);
-        sha256_final_generic(sctx, out);
-        return 0;
+        crypto_sha256_init(desc);
+        return crypto_sha256_finup_generic(desc, data, len, out);
 }

 static int crypto_sha256_digest_arch(struct shash_desc *desc, const u8 *data,
@@ -100,13 +116,7 @@ static int crypto_sha256_digest_arch(struct shash_desc *desc, const u8 *data,

 static int crypto_sha224_init(struct shash_desc *desc)
 {
-        sha224_init(shash_desc_ctx(desc));
-        return 0;
-}
-
-static int crypto_sha224_final_generic(struct shash_desc *desc, u8 *out)
-{
-        sha224_final_generic(shash_desc_ctx(desc), out);
+        sha224_block_init(shash_desc_ctx(desc));
         return 0;
 }

@@ -147,35 +157,30 @@ static struct shash_alg algs[] = {
         .base.cra_name          = "sha256",
         .base.cra_driver_name   = "sha256-generic",
         .base.cra_priority      = 100,
+        .base.cra_flags         = CRYPTO_AHASH_ALG_BLOCK_ONLY |
+                                  CRYPTO_AHASH_ALG_FINUP_MAX,
         .base.cra_blocksize     = SHA256_BLOCK_SIZE,
         .base.cra_module        = THIS_MODULE,
         .digestsize             = SHA256_DIGEST_SIZE,
         .init                   = crypto_sha256_init,
         .update                 = crypto_sha256_update_generic,
-        .final                  = crypto_sha256_final_generic,
         .finup                  = crypto_sha256_finup_generic,
         .digest                 = crypto_sha256_digest_generic,
-        .descsize               = sizeof(struct sha256_state),
-        .statesize              = sizeof(struct crypto_sha256_state) +
-                                  SHA256_BLOCK_SIZE + 1,
-        .import                 = crypto_sha256_import_lib,
-        .export                 = crypto_sha256_export_lib,
+        .descsize               = sizeof(struct crypto_sha256_state),
 }, {
         .base.cra_name          = "sha224",
         .base.cra_driver_name   = "sha224-generic",
         .base.cra_priority      = 100,
+        .base.cra_flags         = CRYPTO_AHASH_ALG_BLOCK_ONLY |
+                                  CRYPTO_AHASH_ALG_FINUP_MAX,
         .base.cra_blocksize     = SHA224_BLOCK_SIZE,
         .base.cra_module        = THIS_MODULE,
         .digestsize             = SHA224_DIGEST_SIZE,
         .init                   = crypto_sha224_init,
         .update                 = crypto_sha256_update_generic,
-        .final                  = crypto_sha224_final_generic,
-        .descsize               = sizeof(struct sha256_state),
-        .statesize              = sizeof(struct crypto_sha256_state) +
-                                  SHA256_BLOCK_SIZE + 1,
-        .import                 = crypto_sha256_import_lib,
-        .export                 = crypto_sha256_export_lib,
+        .finup                  = crypto_sha256_finup_generic,
+        .descsize               = sizeof(struct crypto_sha256_state),
 }, {
         .base.cra_name          = "sha256",
diff --git a/include/crypto/sha2.h b/include/crypto/sha2.h
index 9853cd2d1291..4912572578dc 100644
--- a/include/crypto/sha2.h
+++ b/include/crypto/sha2.h
@@ -88,7 +88,7 @@ struct sha512_state {
         u8 buf[SHA512_BLOCK_SIZE];
 };

-static inline void sha256_init(struct sha256_state *sctx)
+static inline void sha256_block_init(struct crypto_sha256_state *sctx)
 {
         sctx->state[0] = SHA256_H0;
         sctx->state[1] = SHA256_H1;
@@ -100,11 +100,16 @@ static inline void sha256_init(struct sha256_state *sctx)
         sctx->state[7] = SHA256_H7;
         sctx->count = 0;
 }
+
+static inline void sha256_init(struct sha256_state *sctx)
+{
+        sha256_block_init(&sctx->ctx);
+}
 void sha256_update(struct sha256_state *sctx, const u8 *data, size_t len);
 void sha256_final(struct sha256_state *sctx, u8 out[SHA256_DIGEST_SIZE]);
 void sha256(const u8 *data, size_t len, u8 out[SHA256_DIGEST_SIZE]);

-static inline void sha224_init(struct sha256_state *sctx)
+static inline void sha224_block_init(struct crypto_sha256_state *sctx)
 {
         sctx->state[0] = SHA224_H0;
         sctx->state[1] = SHA224_H1;
@@ -116,6 +121,11 @@ static inline void sha224_init(struct sha256_state *sctx)
         sctx->state[7] = SHA224_H7;
         sctx->count = 0;
 }
+
+static inline void sha224_init(struct sha256_state *sctx)
+{
+        sha224_block_init(&sctx->ctx);
+}
 /* Simply use sha256_update as it is equivalent to sha224_update. */
 void sha224_final(struct sha256_state *sctx, u8 out[SHA224_DIGEST_SIZE]);

From patchwork Fri May 2 05:30:58 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 886715
Date: Fri, 02 May 2025 13:30:58 +0800
Message-Id: <3f9c55263754eda2968b5fe319666fa00c2c46e2.1746162259.git.herbert@gondor.apana.org.au>
From: Herbert Xu
Subject: [v2 PATCH 3/9] crypto: arch/sha256 - Export block functions as GPL only
To: Linux Crypto Mailing List
X-Mailing-List: linux-crypto@vger.kernel.org

Export the block functions as GPL only, there is no reason to let
arbitrary modules use these internal functions.
Signed-off-by: Herbert Xu
---
 arch/arm/lib/crypto/sha256.c                   | 4 ++--
 arch/arm64/lib/crypto/sha256.c                 | 4 ++--
 arch/mips/cavium-octeon/crypto/octeon-sha256.c | 4 ++--
 arch/powerpc/lib/crypto/sha256.c               | 4 ++--
 arch/riscv/lib/crypto/sha256.c                 | 4 ++--
 arch/s390/lib/crypto/sha256.c                  | 4 ++--
 arch/sparc/lib/crypto/sha256.c                 | 4 ++--
 arch/x86/lib/crypto/sha256.c                   | 4 ++--
 8 files changed, 16 insertions(+), 16 deletions(-)

diff --git a/arch/arm/lib/crypto/sha256.c b/arch/arm/lib/crypto/sha256.c
index 3a8dfc304807..e2fae3664428 100644
--- a/arch/arm/lib/crypto/sha256.c
+++ b/arch/arm/lib/crypto/sha256.c
@@ -35,14 +35,14 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
                 sha256_block_data_order(state, data, nblocks);
         }
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         /* We always can use at least the ARM scalar implementation. */
         return true;
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 static int __init sha256_arm_mod_init(void)
 {
diff --git a/arch/arm64/lib/crypto/sha256.c b/arch/arm64/lib/crypto/sha256.c
index 2bd413c586d2..91c7ca727992 100644
--- a/arch/arm64/lib/crypto/sha256.c
+++ b/arch/arm64/lib/crypto/sha256.c
@@ -45,14 +45,14 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
                 sha256_block_data_order(state, data, nblocks);
         }
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         /* We always can use at least the ARM64 scalar implementation. */
         return true;
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 static int __init sha256_arm64_mod_init(void)
 {
diff --git a/arch/mips/cavium-octeon/crypto/octeon-sha256.c b/arch/mips/cavium-octeon/crypto/octeon-sha256.c
index f169054852bc..f93faaf1f4af 100644
--- a/arch/mips/cavium-octeon/crypto/octeon-sha256.c
+++ b/arch/mips/cavium-octeon/crypto/octeon-sha256.c
@@ -60,13 +60,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
         state64[3] = read_octeon_64bit_hash_dword(3);
         octeon_crypto_disable(&cop2_state, flags);
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         return octeon_has_crypto();
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 MODULE_LICENSE("GPL");
 MODULE_DESCRIPTION("SHA-256 Secure Hash Algorithm (OCTEON)");
diff --git a/arch/powerpc/lib/crypto/sha256.c b/arch/powerpc/lib/crypto/sha256.c
index c05023c5acdd..6b0f079587eb 100644
--- a/arch/powerpc/lib/crypto/sha256.c
+++ b/arch/powerpc/lib/crypto/sha256.c
@@ -58,13 +58,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
                 nblocks -= unit;
         } while (nblocks);
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         return true;
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 MODULE_LICENSE("GPL");
 MODULE_DESCRIPTION("SHA-256 Secure Hash Algorithm, SPE optimized");
diff --git a/arch/riscv/lib/crypto/sha256.c b/arch/riscv/lib/crypto/sha256.c
index 18b84030f0b3..2905a6dbb485 100644
--- a/arch/riscv/lib/crypto/sha256.c
+++ b/arch/riscv/lib/crypto/sha256.c
@@ -32,13 +32,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
                 sha256_blocks_generic(state, data, nblocks);
         }
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         return static_key_enabled(&have_extensions);
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 static int __init riscv64_sha256_mod_init(void)
 {
diff --git a/arch/s390/lib/crypto/sha256.c b/arch/s390/lib/crypto/sha256.c
index 50c592ce7a5d..fcfa2706a7f9 100644
--- a/arch/s390/lib/crypto/sha256.c
+++ b/arch/s390/lib/crypto/sha256.c
@@ -21,13 +21,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
         else
                 sha256_blocks_generic(state, data, nblocks);
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         return static_key_enabled(&have_cpacf_sha256);
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 static int __init sha256_s390_mod_init(void)
 {
diff --git a/arch/sparc/lib/crypto/sha256.c b/arch/sparc/lib/crypto/sha256.c
index 6f118a23d210..b4fc475dcc40 100644
--- a/arch/sparc/lib/crypto/sha256.c
+++ b/arch/sparc/lib/crypto/sha256.c
@@ -30,13 +30,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
         else
                 sha256_blocks_generic(state, data, nblocks);
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         return static_key_enabled(&have_sha256_opcodes);
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 static int __init sha256_sparc64_mod_init(void)
 {
diff --git a/arch/x86/lib/crypto/sha256.c b/arch/x86/lib/crypto/sha256.c
index 47865b5cd94b..8735ec871f86 100644
--- a/arch/x86/lib/crypto/sha256.c
+++ b/arch/x86/lib/crypto/sha256.c
@@ -35,13 +35,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
                 sha256_blocks_generic(state, data, nblocks);
         }
 }
-EXPORT_SYMBOL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)
 {
         return static_key_enabled(&have_sha256_x86);
 }
-EXPORT_SYMBOL(sha256_is_arch_optimized);
+EXPORT_SYMBOL_GPL(sha256_is_arch_optimized);

 static int __init sha256_x86_mod_init(void)
 {

From patchwork Fri May 2 05:31:00 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 886714
Date: Fri, 02 May 2025 13:31:00 +0800
Message-Id: <11573e030c689c70c634c7eceef48a5fa25a8507.1746162259.git.herbert@gondor.apana.org.au>
From: Herbert Xu
Subject: [v2 PATCH 4/9] crypto: arm/sha256 - Add simd block function
To: Linux Crypto Mailing List
X-Mailing-List: linux-crypto@vger.kernel.org

Add CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD and a SIMD block function
so that the caller can decide whether to use SIMD.
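The division of labour this patch sets up — the SIMD routine assumes SIMD is already usable, while the generic/arch/simd choice happens in the caller via sha256_choose_blocks — can be modelled with a small function-pointer sketch. The names below are illustrative stand-ins, not the kernel API:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical model of the caller-side dispatch: stub block functions
 * standing in for the generic, scalar-arch and SIMD implementations. */
typedef void (*blocks_fn)(uint32_t *state, const uint8_t *data,
                          size_t nblocks);

static void blocks_generic(uint32_t *s, const uint8_t *d, size_t n)
{ (void)s; (void)d; (void)n; }
static void blocks_arch(uint32_t *s, const uint8_t *d, size_t n)
{ (void)s; (void)d; (void)n; }
static void blocks_simd(uint32_t *s, const uint8_t *d, size_t n)
{ (void)s; (void)d; (void)n; }

/* Mirrors the shape of sha256_choose_blocks: generic when forced or no
 * arch code exists; SIMD only when the caller says it is usable (so the
 * SIMD routine needs no scalar fallback); otherwise scalar arch code,
 * which stays safe in hardirq context. */
static blocks_fn choose_blocks(bool have_arch, bool have_simd,
                               bool force_generic, bool simd_usable)
{
        if (!have_arch || force_generic)
                return blocks_generic;
        if (have_simd && simd_usable)
                return blocks_simd;
        return blocks_arch;
}
```

The point of the design is that the "may I use SIMD?" question is answered once, by the caller, instead of being re-checked inside every architecture's block function.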
Signed-off-by: Herbert Xu
---
 arch/arm/lib/crypto/Kconfig         |  1 +
 arch/arm/lib/crypto/sha256-armv4.pl | 20 ++++++++++----------
 arch/arm/lib/crypto/sha256.c        | 14 +++++++-------
 3 files changed, 18 insertions(+), 17 deletions(-)

diff --git a/arch/arm/lib/crypto/Kconfig b/arch/arm/lib/crypto/Kconfig
index 9f3ff30f4032..d1ad664f0c67 100644
--- a/arch/arm/lib/crypto/Kconfig
+++ b/arch/arm/lib/crypto/Kconfig
@@ -28,3 +28,4 @@ config CRYPTO_SHA256_ARM
 	depends on !CPU_V7M
 	default CRYPTO_LIB_SHA256
 	select CRYPTO_ARCH_HAVE_LIB_SHA256
+	select CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD
diff --git a/arch/arm/lib/crypto/sha256-armv4.pl b/arch/arm/lib/crypto/sha256-armv4.pl
index f3a2b54efd4e..8122db7fd599 100644
--- a/arch/arm/lib/crypto/sha256-armv4.pl
+++ b/arch/arm/lib/crypto/sha256-armv4.pl
@@ -204,18 +204,18 @@ K256:
 .word	0				@ terminator
 #if __ARM_MAX_ARCH__>=7 && !defined(__KERNEL__)
 .LOPENSSL_armcap:
-.word	OPENSSL_armcap_P-sha256_block_data_order
+.word	OPENSSL_armcap_P-sha256_blocks_arch
 #endif
 .align	5
-.global	sha256_block_data_order
-.type	sha256_block_data_order,%function
-sha256_block_data_order:
-.Lsha256_block_data_order:
+.global	sha256_blocks_arch
+.type	sha256_blocks_arch,%function
+sha256_blocks_arch:
+.Lsha256_blocks_arch:
 #if __ARM_ARCH__<7
-	sub	r3,pc,#8		@ sha256_block_data_order
+	sub	r3,pc,#8		@ sha256_blocks_arch
 #else
-	adr	r3,.Lsha256_block_data_order
+	adr	r3,.Lsha256_blocks_arch
 #endif
 #if __ARM_MAX_ARCH__>=7 && !defined(__KERNEL__)
 	ldr	r12,.LOPENSSL_armcap
@@ -282,7 +282,7 @@ $code.=<<___;
 	moveq	pc,lr			@ be binary compatible with V4, yet
 	bx	lr			@ interoperable with Thumb ISA:-)
 #endif
-.size	sha256_block_data_order,.-sha256_block_data_order
+.size	sha256_blocks_arch,.-sha256_blocks_arch
 ___

 ######################################################################
 # NEON stuff
@@ -470,8 +470,8 @@ sha256_block_data_order_neon:
 	stmdb	sp!,{r4-r12,lr}
 	sub	$H,sp,#16*4+16
-	adr	$Ktbl,.Lsha256_block_data_order
-	sub	$Ktbl,$Ktbl,#.Lsha256_block_data_order-K256
+	adr	$Ktbl,.Lsha256_blocks_arch
+	sub	$Ktbl,$Ktbl,#.Lsha256_blocks_arch-K256
 	bic	$H,$H,#15		@ align for 128-bit stores
 	mov	$t2,sp
 	mov	sp,$H			@ alloca
diff --git a/arch/arm/lib/crypto/sha256.c b/arch/arm/lib/crypto/sha256.c
index e2fae3664428..1dd71b8fd611 100644
--- a/arch/arm/lib/crypto/sha256.c
+++ b/arch/arm/lib/crypto/sha256.c
@@ -6,12 +6,12 @@
  */
 #include
 #include
-#include
 #include
 #include
-asmlinkage void sha256_block_data_order(u32 state[SHA256_STATE_WORDS],
-					const u8 *data, size_t nblocks);
+asmlinkage void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+				   const u8 *data, size_t nblocks);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);
 asmlinkage void sha256_block_data_order_neon(u32 state[SHA256_STATE_WORDS],
					     const u8 *data, size_t nblocks);
 asmlinkage void sha256_ce_transform(u32 state[SHA256_STATE_WORDS],
@@ -20,11 +20,11 @@ asmlinkage void sha256_ce_transform(u32 state[SHA256_STATE_WORDS],
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_neon);
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_ce);
-void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+void sha256_blocks_simd(u32 state[SHA256_STATE_WORDS],
 			const u8 *data, size_t nblocks)
 {
 	if (IS_ENABLED(CONFIG_KERNEL_MODE_NEON) &&
-	    static_branch_likely(&have_neon) && crypto_simd_usable()) {
+	    static_branch_likely(&have_neon)) {
 		kernel_neon_begin();
 		if (static_branch_likely(&have_ce))
 			sha256_ce_transform(state, data, nblocks);
@@ -32,10 +32,10 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
 			sha256_block_data_order_neon(state, data, nblocks);
 		kernel_neon_end();
 	} else {
-		sha256_block_data_order(state, data, nblocks);
+		sha256_blocks_arch(state, data, nblocks);
 	}
 }
-EXPORT_SYMBOL_GPL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_simd);

 bool sha256_is_arch_optimized(void)
 {

From patchwork Fri May 2 05:31:03 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 887267
Date: Fri, 02 May 2025 13:31:03 +0800
Message-Id: <45df26ced152bec1bd86aa852869d10f8e1dae31.1746162259.git.herbert@gondor.apana.org.au>
From: Herbert Xu
Subject: [v2 PATCH 5/9] crypto: arm64/sha256 - Add simd block function
To: Linux Crypto Mailing List

Add CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD and a SIMD block function so that
the caller can decide whether to use SIMD.
Signed-off-by: Herbert Xu
---
 arch/arm64/crypto/sha512-glue.c     |  6 +++---
 arch/arm64/lib/crypto/Kconfig       |  1 +
 arch/arm64/lib/crypto/sha2-armv8.pl |  2 +-
 arch/arm64/lib/crypto/sha256.c      | 14 +++++++-------
 4 files changed, 12 insertions(+), 11 deletions(-)

diff --git a/arch/arm64/crypto/sha512-glue.c b/arch/arm64/crypto/sha512-glue.c
index ab2e1c13dfad..15aa9d8b7b2c 100644
--- a/arch/arm64/crypto/sha512-glue.c
+++ b/arch/arm64/crypto/sha512-glue.c
@@ -18,13 +18,13 @@ MODULE_LICENSE("GPL v2");
 MODULE_ALIAS_CRYPTO("sha384");
 MODULE_ALIAS_CRYPTO("sha512");
-asmlinkage void sha512_block_data_order(u64 *digest, const void *data,
-					unsigned int num_blks);
+asmlinkage void sha512_blocks_arch(u64 *digest, const void *data,
+				   unsigned int num_blks);

 static void sha512_arm64_transform(struct sha512_state *sst, u8 const *src,
				    int blocks)
 {
-	sha512_block_data_order(sst->state, src, blocks);
+	sha512_blocks_arch(sst->state, src, blocks);
 }

 static int sha512_update(struct shash_desc *desc, const u8 *data,
diff --git a/arch/arm64/lib/crypto/Kconfig b/arch/arm64/lib/crypto/Kconfig
index 49e57bfdb5b5..129a7685cb4c 100644
--- a/arch/arm64/lib/crypto/Kconfig
+++ b/arch/arm64/lib/crypto/Kconfig
@@ -17,3 +17,4 @@ config CRYPTO_SHA256_ARM64
 	tristate
 	default CRYPTO_LIB_SHA256
 	select CRYPTO_ARCH_HAVE_LIB_SHA256
+	select CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD
diff --git a/arch/arm64/lib/crypto/sha2-armv8.pl b/arch/arm64/lib/crypto/sha2-armv8.pl
index 35ec9ae99fe1..4aebd20c498b 100644
--- a/arch/arm64/lib/crypto/sha2-armv8.pl
+++ b/arch/arm64/lib/crypto/sha2-armv8.pl
@@ -95,7 +95,7 @@ if ($output =~ /512/) {
 	$reg_t="w";
 }
-$func="sha${BITS}_block_data_order";
+$func="sha${BITS}_blocks_arch";

 ($ctx,$inp,$num,$Ktbl)=map("x$_",(0..2,30));
diff --git a/arch/arm64/lib/crypto/sha256.c b/arch/arm64/lib/crypto/sha256.c
index 91c7ca727992..fdceb2d0899c 100644
--- a/arch/arm64/lib/crypto/sha256.c
+++ b/arch/arm64/lib/crypto/sha256.c
@@ -6,12 +6,12 @@
  */
 #include
 #include
-#include
 #include
 #include
-asmlinkage void sha256_block_data_order(u32 state[SHA256_STATE_WORDS],
-					const u8 *data, size_t nblocks);
+asmlinkage void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+				   const u8 *data, size_t nblocks);
+EXPORT_SYMBOL_GPL(sha256_blocks_arch);
 asmlinkage void sha256_block_neon(u32 state[SHA256_STATE_WORDS],
				   const u8 *data, size_t nblocks);
 asmlinkage size_t __sha256_ce_transform(u32 state[SHA256_STATE_WORDS],
@@ -20,11 +20,11 @@ asmlinkage size_t __sha256_ce_transform(u32 state[SHA256_STATE_WORDS],
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_neon);
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_ce);
-void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+void sha256_blocks_simd(u32 state[SHA256_STATE_WORDS],
 			const u8 *data, size_t nblocks)
 {
 	if (IS_ENABLED(CONFIG_KERNEL_MODE_NEON) &&
-	    static_branch_likely(&have_neon) && crypto_simd_usable()) {
+	    static_branch_likely(&have_neon)) {
 		if (static_branch_likely(&have_ce)) {
 			do {
 				size_t rem;
@@ -42,10 +42,10 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
 			kernel_neon_end();
 		}
 	} else {
-		sha256_block_data_order(state, data, nblocks);
+		sha256_blocks_arch(state, data, nblocks);
 	}
 }
-EXPORT_SYMBOL_GPL(sha256_blocks_arch);
+EXPORT_SYMBOL_GPL(sha256_blocks_simd);

 bool sha256_is_arch_optimized(void)
 {

From patchwork Fri May 2 05:31:05 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 886713
Date: Fri, 02 May 2025 13:31:05 +0800
From: Herbert Xu
Subject: [v2 PATCH 6/9] crypto: riscv/sha256 - Add simd block function
To: Linux Crypto Mailing List

Add CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD and a SIMD block function so that
the caller can decide whether to use SIMD.

Signed-off-by: Herbert Xu
---
 arch/riscv/lib/crypto/Kconfig  |  1 +
 arch/riscv/lib/crypto/sha256.c | 13 +++++++++----
 2 files changed, 10 insertions(+), 4 deletions(-)

diff --git a/arch/riscv/lib/crypto/Kconfig b/arch/riscv/lib/crypto/Kconfig
index c100571feb7e..47c99ea97ce2 100644
--- a/arch/riscv/lib/crypto/Kconfig
+++ b/arch/riscv/lib/crypto/Kconfig
@@ -12,4 +12,5 @@ config CRYPTO_SHA256_RISCV64
 	depends on 64BIT && RISCV_ISA_V && TOOLCHAIN_HAS_VECTOR_CRYPTO
 	default CRYPTO_LIB_SHA256
 	select CRYPTO_ARCH_HAVE_LIB_SHA256
+	select CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD
 	select CRYPTO_LIB_SHA256_GENERIC
diff --git a/arch/riscv/lib/crypto/sha256.c b/arch/riscv/lib/crypto/sha256.c
index 2905a6dbb485..c1358eafc2ad 100644
--- a/arch/riscv/lib/crypto/sha256.c
+++ b/arch/riscv/lib/crypto/sha256.c
@@ -9,10 +9,8 @@
  * Author: Jerry Shih
 */
-#include
 #include
 #include
-#include
 #include
 #include
@@ -21,10 +19,10 @@ asmlinkage void sha256_transform_zvknha_or_zvknhb_zvkb(
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_extensions);
-void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+void sha256_blocks_simd(u32 state[SHA256_STATE_WORDS],
 			const u8 *data, size_t nblocks)
 {
-	if (static_branch_likely(&have_extensions) && crypto_simd_usable()) {
+	if (static_branch_likely(&have_extensions)) {
 		kernel_vector_begin();
 		sha256_transform_zvknha_or_zvknhb_zvkb(state, data, nblocks);
 		kernel_vector_end();
@@ -32,6 +30,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
 		sha256_blocks_generic(state, data, nblocks);
 	}
 }
+EXPORT_SYMBOL_GPL(sha256_blocks_simd);
+
+void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+			const u8 *data, size_t nblocks)
+{
+	sha256_blocks_generic(state, data, nblocks);
+}
 EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)

From patchwork Fri May 2 05:31:07 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 887266
Date: Fri, 02 May 2025 13:31:07 +0800
Message-Id: <7ac4f761c7526b75d17b3d5bcc4467e68c46da21.1746162259.git.herbert@gondor.apana.org.au>
From: Herbert Xu
Subject: [v2 PATCH 7/9] crypto: x86/sha256 - Add simd block function
To: Linux Crypto Mailing List

Add CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD and a SIMD block function so that
the caller can decide whether to use SIMD.
Signed-off-by: Herbert Xu
---
 arch/x86/lib/crypto/Kconfig  |  1 +
 arch/x86/lib/crypto/sha256.c | 12 +++++++++---
 2 files changed, 10 insertions(+), 3 deletions(-)

diff --git a/arch/x86/lib/crypto/Kconfig b/arch/x86/lib/crypto/Kconfig
index e344579db3d8..5e94cdee492c 100644
--- a/arch/x86/lib/crypto/Kconfig
+++ b/arch/x86/lib/crypto/Kconfig
@@ -30,4 +30,5 @@ config CRYPTO_SHA256_X86_64
 	depends on 64BIT
 	default CRYPTO_LIB_SHA256
 	select CRYPTO_ARCH_HAVE_LIB_SHA256
+	select CRYPTO_ARCH_HAVE_LIB_SHA256_SIMD
 	select CRYPTO_LIB_SHA256_GENERIC
diff --git a/arch/x86/lib/crypto/sha256.c b/arch/x86/lib/crypto/sha256.c
index 8735ec871f86..cdd88497eedf 100644
--- a/arch/x86/lib/crypto/sha256.c
+++ b/arch/x86/lib/crypto/sha256.c
@@ -6,7 +6,6 @@
 */
 #include
 #include
-#include
 #include
 #include
 #include
@@ -24,10 +23,10 @@
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_sha256_x86);

 DEFINE_STATIC_CALL(sha256_blocks_x86, sha256_transform_ssse3);

-void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+void sha256_blocks_simd(u32 state[SHA256_STATE_WORDS],
 			const u8 *data, size_t nblocks)
 {
-	if (static_branch_likely(&have_sha256_x86) && crypto_simd_usable()) {
+	if (static_branch_likely(&have_sha256_x86)) {
 		kernel_fpu_begin();
 		static_call(sha256_blocks_x86)(state, data, nblocks);
 		kernel_fpu_end();
@@ -35,6 +34,13 @@ void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
 		sha256_blocks_generic(state, data, nblocks);
 	}
 }
+EXPORT_SYMBOL_GPL(sha256_blocks_simd);
+
+void sha256_blocks_arch(u32 state[SHA256_STATE_WORDS],
+			const u8 *data, size_t nblocks)
+{
+	sha256_blocks_generic(state, data, nblocks);
+}
 EXPORT_SYMBOL_GPL(sha256_blocks_arch);

 bool sha256_is_arch_optimized(void)

From patchwork Fri May 2 05:31:09 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 886712
Date: Fri, 02 May 2025 13:31:09 +0800
From: Herbert Xu
Subject: [v2 PATCH 8/9] crypto: lib/sha256 - Use generic block helper
To: Linux Crypto Mailing List

Use the BLOCK_HASH_UPDATE_BLOCKS helper instead of duplicating
partial block handling.

Also remove the unused lib/sha256 force-generic interface.
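The partial-block pattern that BLOCK_HASH_UPDATE_BLOCKS centralises (and that the open-coded `__sha256_update` in this patch's diff duplicates) can be sketched as a standalone toy. `toy_ctx`, `toy_blocks`, and `toy_update` are illustrative names, not kernel API:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define BLOCK_SIZE 64

/* Toy context: a partial-block buffer plus a running byte count, as in
 * struct sha256_state; blocks_fed just records how many whole blocks
 * reached the (stand-in) block function. */
struct toy_ctx {
	uint8_t buf[BLOCK_SIZE];
	uint64_t count;
	size_t blocks_fed;
};

static void toy_blocks(struct toy_ctx *ctx, const uint8_t *data, size_t n)
{
	(void)data;
	ctx->blocks_fed += n;
}

/* Buffer a partial block, flush it once full, then feed any remaining
 * whole blocks directly and stash the tail for next time. */
static void toy_update(struct toy_ctx *ctx, const uint8_t *data, size_t len)
{
	size_t partial = ctx->count % BLOCK_SIZE;

	ctx->count += len;
	if (partial + len >= BLOCK_SIZE) {
		if (partial) {
			size_t l = BLOCK_SIZE - partial;

			memcpy(ctx->buf + partial, data, l);
			data += l;
			len -= l;
			toy_blocks(ctx, ctx->buf, 1);
		}
		toy_blocks(ctx, data, len / BLOCK_SIZE);
		data += len / BLOCK_SIZE * BLOCK_SIZE;
		len %= BLOCK_SIZE;
		partial = 0;
	}
	if (len)
		memcpy(ctx->buf + partial, data, len);
}
```

Every block-based hash needs exactly this bookkeeping, which is why hoisting it into one shared helper removes so much duplicated code.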
Signed-off-by: Herbert Xu
---
 include/crypto/internal/sha2.h |  7 ----
 lib/crypto/sha256.c            | 75 ++++++----------------------------
 2 files changed, 12 insertions(+), 70 deletions(-)

diff --git a/include/crypto/internal/sha2.h b/include/crypto/internal/sha2.h
index fff156f66edc..b9bccd3ff57f 100644
--- a/include/crypto/internal/sha2.h
+++ b/include/crypto/internal/sha2.h
@@ -10,13 +10,6 @@
 #include
 #include
-void sha256_update_generic(struct sha256_state *sctx,
-			   const u8 *data, size_t len);
-void sha256_final_generic(struct sha256_state *sctx,
-			  u8 out[SHA256_DIGEST_SIZE]);
-void sha224_final_generic(struct sha256_state *sctx,
-			  u8 out[SHA224_DIGEST_SIZE]);
-
 #if IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_SHA256)
 bool sha256_is_arch_optimized(void);
 #else
diff --git a/lib/crypto/sha256.c b/lib/crypto/sha256.c
index 2ced29efa181..107e5162507a 100644
--- a/lib/crypto/sha256.c
+++ b/lib/crypto/sha256.c
@@ -11,6 +11,7 @@
 * Copyright (c) 2014 Red Hat Inc.
 */
+#include
 #include
 #include
 #include
@@ -31,71 +32,40 @@ static inline bool sha256_purgatory(void)
 }

 static inline void sha256_blocks(u32 state[SHA256_STATE_WORDS], const u8 *data,
-				 size_t nblocks, bool force_generic)
+				 size_t nblocks)
 {
-	sha256_choose_blocks(state, data, nblocks,
-			     force_generic || sha256_purgatory(), false);
-}
-
-static inline void __sha256_update(struct sha256_state *sctx, const u8 *data,
-				   size_t len, bool force_generic)
-{
-	size_t partial = sctx->count % SHA256_BLOCK_SIZE;
-
-	sctx->count += len;
-
-	if (partial + len >= SHA256_BLOCK_SIZE) {
-		size_t nblocks;
-
-		if (partial) {
-			size_t l = SHA256_BLOCK_SIZE - partial;
-
-			memcpy(&sctx->buf[partial], data, l);
-			data += l;
-			len -= l;
-
-			sha256_blocks(sctx->state, sctx->buf, 1, force_generic);
-		}
-
-		nblocks = len / SHA256_BLOCK_SIZE;
-		len %= SHA256_BLOCK_SIZE;
-
-		if (nblocks) {
-			sha256_blocks(sctx->state, data, nblocks,
-				      force_generic);
-			data += nblocks * SHA256_BLOCK_SIZE;
-		}
-		partial = 0;
-	}
-	if (len)
-		memcpy(&sctx->buf[partial], data, len);
+	sha256_choose_blocks(state, data, nblocks, sha256_purgatory(), false);
 }

 void sha256_update(struct sha256_state *sctx, const u8 *data, size_t len)
 {
-	__sha256_update(sctx, data, len, false);
+	size_t partial = sctx->count % SHA256_BLOCK_SIZE;
+
+	sctx->count += len;
+	BLOCK_HASH_UPDATE_BLOCKS(sha256_blocks, sctx->ctx.state, data, len,
+				 SHA256_BLOCK_SIZE, sctx->buf, partial);
 }
 EXPORT_SYMBOL(sha256_update);

 static inline void __sha256_final(struct sha256_state *sctx, u8 *out,
-				  size_t digest_size, bool force_generic)
+				  size_t digest_size)
 {
 	size_t partial = sctx->count % SHA256_BLOCK_SIZE;

 	sha256_finup(&sctx->ctx, sctx->buf, partial, out, digest_size,
-		     force_generic || sha256_purgatory(), false);
+		     sha256_purgatory(), false);
 	memzero_explicit(sctx, sizeof(*sctx));
 }

 void sha256_final(struct sha256_state *sctx, u8 out[SHA256_DIGEST_SIZE])
 {
-	__sha256_final(sctx, out, SHA256_DIGEST_SIZE, false);
+	__sha256_final(sctx, out, SHA256_DIGEST_SIZE);
 }
 EXPORT_SYMBOL(sha256_final);

 void sha224_final(struct sha256_state *sctx, u8 out[SHA224_DIGEST_SIZE])
 {
-	__sha256_final(sctx, out, SHA224_DIGEST_SIZE, false);
+	__sha256_final(sctx, out, SHA224_DIGEST_SIZE);
 }
 EXPORT_SYMBOL(sha224_final);

@@ -109,26 +79,5 @@ void sha256(const u8 *data, size_t len, u8 out[SHA256_DIGEST_SIZE])
 }
 EXPORT_SYMBOL(sha256);

-#if IS_ENABLED(CONFIG_CRYPTO_SHA256) && !defined(__DISABLE_EXPORTS)
-void sha256_update_generic(struct sha256_state *sctx,
-			   const u8 *data, size_t len)
-{
-	__sha256_update(sctx, data, len, true);
-}
-EXPORT_SYMBOL(sha256_update_generic);
-
-void sha256_final_generic(struct sha256_state *sctx, u8 out[SHA256_DIGEST_SIZE])
-{
-	__sha256_final(sctx, out, SHA256_DIGEST_SIZE, true);
-}
-EXPORT_SYMBOL(sha256_final_generic);
-
-void sha224_final_generic(struct sha256_state *sctx, u8 out[SHA224_DIGEST_SIZE])
-{
-	__sha256_final(sctx, out, SHA224_DIGEST_SIZE, true);
-}
-EXPORT_SYMBOL(sha224_final_generic);
-#endif
-
 MODULE_DESCRIPTION("SHA-256 Algorithm");
MODULE_LICENSE("GPL");

From patchwork Fri May 2 05:31:12 2025
X-Patchwork-Submitter: Herbert Xu
X-Patchwork-Id: 887265
Date: Fri, 02 May 2025 13:31:12 +0800
Message-Id: <02900cbea8a00fe479f99b494b7d028ea7b9f119.1746162259.git.herbert@gondor.apana.org.au>
From: Herbert Xu
Subject: [v2 PATCH 9/9] crypto: sha256 - Use the partial block API
To: Linux Crypto Mailing List

Use the shash partial block API by default.

Add a separate set of lib shash algorithms to preserve testing
coverage until lib/sha256 has its own tests.
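The coexistence of "lib" and "arch" flavours can be modelled as a priority choice: two registrations of the same algorithm name, where the framework picks the higher-priority one. This is a toy outside the kernel; `alg_entry` and `pick_alg` are illustrative names, with the arch flavour at priority 300 as in the diff below and the lib flavour at a hypothetical lower value:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Minimal model of shash registration: same cra_name, different
 * driver names and priorities; lookup by name returns the entry
 * with the highest priority, as the Crypto API does. */
struct alg_entry {
	const char *name;
	const char *driver;
	int priority;
};

static const struct alg_entry *pick_alg(const struct alg_entry *algs,
					size_t n, const char *name)
{
	const struct alg_entry *best = NULL;
	size_t i;

	for (i = 0; i < n; i++) {
		if (strcmp(algs[i].name, name) != 0)
			continue;
		if (!best || algs[i].priority > best->priority)
			best = &algs[i];
	}
	return best;
}
```

With both flavours registered, ordinary users of "sha256" transparently get the arch one, while the lib flavour stays reachable by driver name for testing.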
Signed-off-by: Herbert Xu
---
 crypto/sha256.c | 81 +++++++++++++++++++++++++++++++++++--------------
 1 file changed, 58 insertions(+), 23 deletions(-)

diff --git a/crypto/sha256.c b/crypto/sha256.c
index d6b90c6ea63d..a20c92098176 100644
--- a/crypto/sha256.c
+++ b/crypto/sha256.c
@@ -52,14 +52,20 @@ static int crypto_sha256_update_generic(struct shash_desc *desc, const u8 *data,
 	return crypto_sha256_update(desc, data, len, true);
 }

-static int crypto_sha256_update_arch(struct shash_desc *desc, const u8 *data,
-				     unsigned int len)
+static int crypto_sha256_update_lib(struct shash_desc *desc, const u8 *data,
+				    unsigned int len)
 {
 	sha256_update(shash_desc_ctx(desc), data, len);
 	return 0;
 }

-static int crypto_sha256_final_arch(struct shash_desc *desc, u8 *out)
+static int crypto_sha256_update_arch(struct shash_desc *desc, const u8 *data,
+				     unsigned int len)
+{
+	return crypto_sha256_update(desc, data, len, false);
+}
+
+static int crypto_sha256_final_lib(struct shash_desc *desc, u8 *out)
 {
 	sha256_final(shash_desc_ctx(desc), out);
 	return 0;
@@ -93,11 +99,7 @@ static int crypto_sha256_finup_generic(struct shash_desc *desc, const u8 *data,
 static int crypto_sha256_finup_arch(struct shash_desc *desc, const u8 *data,
				     unsigned int len, u8 *out)
 {
-	struct sha256_state *sctx = shash_desc_ctx(desc);
-
-	sha256_update(sctx, data, len);
-	sha256_final(sctx, out);
-	return 0;
+	return crypto_sha256_finup(desc, data, len, out, false);
 }

 static int crypto_sha256_digest_generic(struct shash_desc *desc, const u8 *data,
@@ -107,20 +109,27 @@ static int crypto_sha256_digest_generic(struct shash_desc *desc, const u8 *data,
 	return crypto_sha256_finup_generic(desc, data, len, out);
 }

-static int crypto_sha256_digest_arch(struct shash_desc *desc, const u8 *data,
-				     unsigned int len, u8 *out)
+static int crypto_sha256_digest_lib(struct shash_desc *desc, const u8 *data,
+				    unsigned int len, u8 *out)
 {
 	sha256(data, len, out);
 	return 0;
 }

+static int crypto_sha256_digest_arch(struct shash_desc *desc, const u8 *data,
+				     unsigned int len, u8 *out)
+{
+	crypto_sha256_init(desc);
+	return crypto_sha256_finup_arch(desc, data, len, out);
+}
+
 static int crypto_sha224_init(struct shash_desc *desc)
 {
 	sha224_block_init(shash_desc_ctx(desc));
 	return 0;
 }

-static int crypto_sha224_final_arch(struct shash_desc *desc, u8 *out)
+static int crypto_sha224_final_lib(struct shash_desc *desc, u8 *out)
 {
 	sha224_final(shash_desc_ctx(desc), out);
 	return 0;
@@ -184,16 +193,14 @@ static struct shash_alg algs[] = {
 	},
 	{
 		.base.cra_name		= "sha256",
-		.base.cra_driver_name	= "sha256-" __stringify(ARCH),
-		.base.cra_priority	= 300,
+		.base.cra_driver_name	= "sha256-lib",
 		.base.cra_blocksize	= SHA256_BLOCK_SIZE,
 		.base.cra_module	= THIS_MODULE,
 		.digestsize		= SHA256_DIGEST_SIZE,
 		.init			= crypto_sha256_init,
-		.update			= crypto_sha256_update_arch,
-		.final			= crypto_sha256_final_arch,
-		.finup			= crypto_sha256_finup_arch,
-		.digest			= crypto_sha256_digest_arch,
+		.update			= crypto_sha256_update_lib,
+		.final			= crypto_sha256_final_lib,
+		.digest			= crypto_sha256_digest_lib,
 		.descsize		= sizeof(struct sha256_state),
 		.statesize		= sizeof(struct crypto_sha256_state) +
					  SHA256_BLOCK_SIZE + 1,
@@ -202,20 +209,48 @@ static struct shash_alg algs[] = {
 	},
 	{
 		.base.cra_name		= "sha224",
-		.base.cra_driver_name	= "sha224-" __stringify(ARCH),
-		.base.cra_priority	= 300,
+		.base.cra_driver_name	= "sha224-lib",
 		.base.cra_blocksize	= SHA224_BLOCK_SIZE,
 		.base.cra_module	= THIS_MODULE,
 		.digestsize		= SHA224_DIGEST_SIZE,
 		.init			= crypto_sha224_init,
-		.update			= crypto_sha256_update_arch,
-		.final			= crypto_sha224_final_arch,
+		.update			= crypto_sha256_update_lib,
+		.final			= crypto_sha224_final_lib,
 		.descsize		= sizeof(struct sha256_state),
 		.statesize		= sizeof(struct crypto_sha256_state) +
					  SHA256_BLOCK_SIZE + 1,
 		.import			= crypto_sha256_import_lib,
 		.export			= crypto_sha256_export_lib,
 	},
+	{
+		.base.cra_name		= "sha256",
+		.base.cra_driver_name	= "sha256-" __stringify(ARCH),
+		.base.cra_priority	= 300,
+		.base.cra_flags		= CRYPTO_AHASH_ALG_BLOCK_ONLY |
+					  CRYPTO_AHASH_ALG_FINUP_MAX,
+		.base.cra_blocksize	= SHA256_BLOCK_SIZE,
+		.base.cra_module	= THIS_MODULE,
+		.digestsize		= SHA256_DIGEST_SIZE,
+		.init			= crypto_sha256_init,
+		.update			= crypto_sha256_update_arch,
+		.finup			= crypto_sha256_finup_arch,
+		.digest			= crypto_sha256_digest_arch,
+		.descsize		= sizeof(struct crypto_sha256_state),
+	},
+	{
+		.base.cra_name		= "sha224",
+		.base.cra_driver_name	= "sha224-" __stringify(ARCH),
+		.base.cra_priority	= 300,
+		.base.cra_flags		= CRYPTO_AHASH_ALG_BLOCK_ONLY |
+					  CRYPTO_AHASH_ALG_FINUP_MAX,
+		.base.cra_blocksize	= SHA224_BLOCK_SIZE,
+		.base.cra_module	= THIS_MODULE,
+		.digestsize		= SHA224_DIGEST_SIZE,
+		.init			= crypto_sha224_init,
+		.update			= crypto_sha256_update_arch,
+		.finup			= crypto_sha256_finup_arch,
+		.descsize		= sizeof(struct crypto_sha256_state),
+	},
 };

 static unsigned int num_algs;

@@ -224,9 +259,9 @@ static int __init crypto_sha256_mod_init(void)
 {
 	/* register the arch flavours only if they differ from generic */
 	num_algs = ARRAY_SIZE(algs);
-	BUILD_BUG_ON(ARRAY_SIZE(algs) % 2 != 0);
+	BUILD_BUG_ON(ARRAY_SIZE(algs) <= 2);
 	if (!sha256_is_arch_optimized())
-		num_algs /= 2;
+		num_algs -= 2;
 	return crypto_register_shashes(algs, ARRAY_SIZE(algs));
 }
 subsys_initcall(crypto_sha256_mod_init);
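The registration arithmetic at the end of the series is easy to get wrong, so here is a toy restatement: with the two arch flavours placed last in the array, skipping them when the arch code is no better than generic is a subtraction of two rather than the old halving. `NUM_ALGS` and `algs_to_register()` are illustrative names, not kernel symbols.

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* algs[] now holds: sha256/sha224 generic, sha256/sha224 lib,
 * sha256/sha224 arch -- six entries, arch flavours at the tail. */
#define NUM_ALGS 6

static size_t algs_to_register(bool arch_optimized)
{
	size_t n = NUM_ALGS;

	if (!arch_optimized)
		n -= 2;	/* drop the trailing sha256/sha224 arch entries */
	return n;
}
```

The old `num_algs /= 2` relied on exactly half the array being arch entries; once the lib flavours are added that invariant no longer holds, hence the switch to `num_algs -= 2`.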