[PULL,16/23] tcg/arm: Tighten tlb indexing offset test

Message ID 20170907224051.21518-17-richard.henderson@linaro.org
State New
Series: tcg constant pools and USE_DIRECT_JUMP cleanup

Commit Message

Richard Henderson Sept. 7, 2017, 10:40 p.m.
From: Richard Henderson <rth@twiddle.net>


We are not going to use ldrd to load the comparator
for 32-bit guests, so there is no need to limit cmp_off
to 8 bits in that case.  This eliminates one insn in the
tlb load sequence for some guests.

Signed-off-by: Richard Henderson <rth@twiddle.net>

---
 tcg/arm/tcg-target.inc.c | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

-- 
2.13.5

Patch

diff --git a/tcg/arm/tcg-target.inc.c b/tcg/arm/tcg-target.inc.c
index 66c369c239..6c12b169ce 100644
--- a/tcg/arm/tcg-target.inc.c
+++ b/tcg/arm/tcg-target.inc.c
@@ -1202,7 +1202,9 @@  static TCGReg tcg_out_tlb_read(TCGContext *s, TCGReg addrlo, TCGReg addrhi,
     }
 
     /* We checked that the offset is contained within 16 bits above.  */
-    if (add_off > 0xfff || (use_armv6_instructions && cmp_off > 0xff)) {
+    if (add_off > 0xfff
+        || (use_armv6_instructions && TARGET_LONG_BITS == 64
+            && cmp_off > 0xff)) {
         tcg_out_dat_imm(s, COND_AL, ARITH_ADD, TCG_REG_R2, base,
                         (24 << 7) | (cmp_off >> 8));
         base = TCG_REG_R2;