
[030/nnn] poly_int: get_addr_base_and_unit_offset

Message ID 87a80hpz4j.fsf@linaro.org
State New
Series [030/nnn] poly_int: get_addr_base_and_unit_offset

Commit Message

Richard Sandiford Oct. 23, 2017, 5:12 p.m. UTC
This patch changes the offsets returned by
get_addr_base_and_unit_offset from HOST_WIDE_INT to poly_int64.
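For existing callers the change is largely mechanical: the offset variable
becomes a poly_int64, and anything that needs a compile-time constant or an
ordered comparison goes through the poly_int helpers instead of plain C
operators.  The fragment below is only an illustrative sketch of that pattern
in GCC-internal style; frob_address is hypothetical and not part of the patch,
and the usual tree-dfa.h/poly-int.h declarations are assumed to be in scope.

/* Hypothetical caller, for illustration only.  */
static tree
frob_address (tree addr)
{
  gcc_assert (TREE_CODE (addr) == ADDR_EXPR);

  /* The offset is now a poly_int64 rather than a HOST_WIDE_INT.  */
  poly_int64 off;
  tree base = get_addr_base_and_unit_offset (TREE_OPERAND (addr, 0), &off);
  if (!base)
    return NULL_TREE;

  /* Code that still needs a compile-time constant asks for one explicitly.  */
  HOST_WIDE_INT const_off;
  if (off.is_constant (&const_off))
    {
      /* ...use const_off exactly as before...  */
    }

  /* Direct comparisons give way to the may/must predicates.  */
  if (maybe_nonzero (off))
    {
      /* ...handle the offsetted case...  */
    }
  return base;
}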

maxsize in gimple_fold_builtin_memory_op goes from HOST_WIDE_INT
to poly_uint64 (rather than poly_int64) to match the previous use
of tree_fits_uhwi_p.
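That poly_uint64 then feeds straight into the overlap test, which becomes a
"may overlap" query.  A condensed, illustrative view of the corresponding
gimple-fold.c hunk (variable names as in the patch; len is the length tree
from the surrounding code):

poly_int64 src_offset = 0, dest_offset = 0;
poly_uint64 maxsize;

/* Replaces the old tree_fits_uhwi_p / tree_to_uhwi pair; -1 wraps to
   the maximum poly_uint64, meaning "length unknown".  */
if (!poly_int_tree_p (len, &maxsize))
  maxsize = -1;

/* The overlap check becomes a "may" query: the fold is abandoned if
   the ranges might overlap for any runtime offset.  */
if (ranges_may_overlap_p (src_offset, maxsize, dest_offset, maxsize))
  return false;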


2017-10-23  Richard Sandiford  <richard.sandiford@linaro.org>
	    Alan Hayward  <alan.hayward@arm.com>
	    David Sherwood  <david.sherwood@arm.com>

gcc/
	* tree-dfa.h (get_addr_base_and_unit_offset_1): Return the offset
	as a poly_int64_pod rather than a HOST_WIDE_INT.
	(get_addr_base_and_unit_offset): Likewise.
	* tree-dfa.c (get_addr_base_and_unit_offset_1): Likewise.
	(get_addr_base_and_unit_offset): Likewise.
	* doc/match-and-simplify.texi: Change off from HOST_WIDE_INT
	to poly_int64 in example.
	* fold-const.c (fold_binary_loc): Update call to
	get_addr_base_and_unit_offset.
	* gimple-fold.c (gimple_fold_builtin_memory_op): Likewise.
	(maybe_canonicalize_mem_ref_addr): Likewise.
	(gimple_fold_stmt_to_constant_1): Likewise.
	* ipa-prop.c (ipa_modify_call_arguments): Likewise.
	* match.pd: Likewise.
	* omp-low.c (lower_omp_target): Likewise.
	* tree-sra.c (build_ref_for_offset): Likewise.
	(build_debug_ref_for_model): Likewise.
	* tree-ssa-address.c (maybe_fold_tmr): Likewise.
	* tree-ssa-alias.c (ao_ref_init_from_ptr_and_size): Likewise.
	* tree-ssa-ccp.c (optimize_memcpy): Likewise.
	* tree-ssa-forwprop.c (forward_propagate_addr_expr_1): Likewise.
	(constant_pointer_difference): Likewise.
	* tree-ssa-loop-niter.c (expand_simple_operations): Likewise.
	* tree-ssa-phiopt.c (jump_function_from_stmt): Likewise.
	* tree-ssa-pre.c (create_component_ref_by_pieces_1): Likewise.
	* tree-ssa-sccvn.c (vn_reference_fold_indirect): Likewise.
	(vn_reference_maybe_forwprop_address, vn_reference_lookup_3): Likewise.
	(set_ssa_val_to): Likewise.
	* tree-ssa-strlen.c (get_addr_stridx, addr_stridxptr): Likewise.
	* tree.c (build_simple_mem_ref_loc): Likewise.

Comments

Jeff Law Dec. 6, 2017, 12:26 a.m. UTC | #1
On 10/23/2017 11:12 AM, Richard Sandiford wrote:
> This patch changes the offsets returned by
> get_addr_base_and_unit_offset from HOST_WIDE_INT to poly_int64.
> 
> maxsize in gimple_fold_builtin_memory_op goes from HOST_WIDE_INT
> to poly_uint64 (rather than poly_int64) to match the previous use
> of tree_fits_uhwi_p.
> 
> 
> 2017-10-23  Richard Sandiford  <richard.sandiford@linaro.org>
> 	    Alan Hayward  <alan.hayward@arm.com>
> 	    David Sherwood  <david.sherwood@arm.com>
> 
> gcc/
> 	* tree-dfa.h (get_addr_base_and_unit_offset_1): Return the offset
> 	as a poly_int64_pod rather than a HOST_WIDE_INT.
> 	(get_addr_base_and_unit_offset): Likewise.
> 	* tree-dfa.c (get_addr_base_and_unit_offset_1): Likewise.
> 	(get_addr_base_and_unit_offset): Likewise.
> 	* doc/match-and-simplify.texi: Change off from HOST_WIDE_INT
> 	to poly_int64 in example.
> 	* fold-const.c (fold_binary_loc): Update call to
> 	get_addr_base_and_unit_offset.
> 	* gimple-fold.c (gimple_fold_builtin_memory_op): Likewise.
> 	(maybe_canonicalize_mem_ref_addr): Likewise.
> 	(gimple_fold_stmt_to_constant_1): Likewise.
> 	* ipa-prop.c (ipa_modify_call_arguments): Likewise.
> 	* match.pd: Likewise.
> 	* omp-low.c (lower_omp_target): Likewise.
> 	* tree-sra.c (build_ref_for_offset): Likewise.
> 	(build_debug_ref_for_model): Likewise.
> 	* tree-ssa-address.c (maybe_fold_tmr): Likewise.
> 	* tree-ssa-alias.c (ao_ref_init_from_ptr_and_size): Likewise.
> 	* tree-ssa-ccp.c (optimize_memcpy): Likewise.
> 	* tree-ssa-forwprop.c (forward_propagate_addr_expr_1): Likewise.
> 	(constant_pointer_difference): Likewise.
> 	* tree-ssa-loop-niter.c (expand_simple_operations): Likewise.
> 	* tree-ssa-phiopt.c (jump_function_from_stmt): Likewise.
> 	* tree-ssa-pre.c (create_component_ref_by_pieces_1): Likewise.
> 	* tree-ssa-sccvn.c (vn_reference_fold_indirect): Likewise.
> 	(vn_reference_maybe_forwprop_address, vn_reference_lookup_3): Likewise.
> 	(set_ssa_val_to): Likewise.
> 	* tree-ssa-strlen.c (get_addr_stridx, addr_stridxptr): Likewise.
> 	* tree.c (build_simple_mem_ref_loc): Likewise.

OK.

Note that Martin S. has some code ready to go into the tree that will
likely require converting some bits to poly_int and touches some of the
same areas.  Given that the tree isn't poly_int aware right now, some
coordination between you and Martin S. may be needed, depending on which
bits go in first.

jeff

Patch

Index: gcc/tree-dfa.h
===================================================================
--- gcc/tree-dfa.h	2017-10-23 17:16:59.705267681 +0100
+++ gcc/tree-dfa.h	2017-10-23 17:17:01.432034493 +0100
@@ -33,9 +33,9 @@  extern tree get_ref_base_and_extent (tre
 				     poly_int64_pod *, bool *);
 extern tree get_ref_base_and_extent_hwi (tree, HOST_WIDE_INT *,
 					 HOST_WIDE_INT *, bool *);
-extern tree get_addr_base_and_unit_offset_1 (tree, HOST_WIDE_INT *,
+extern tree get_addr_base_and_unit_offset_1 (tree, poly_int64_pod *,
 					     tree (*) (tree));
-extern tree get_addr_base_and_unit_offset (tree, HOST_WIDE_INT *);
+extern tree get_addr_base_and_unit_offset (tree, poly_int64_pod *);
 extern bool stmt_references_abnormal_ssa_name (gimple *);
 extern void replace_abnormal_ssa_names (gimple *);
 extern void dump_enumerated_decls (FILE *, dump_flags_t);
Index: gcc/tree-dfa.c
===================================================================
--- gcc/tree-dfa.c	2017-10-23 17:16:59.705267681 +0100
+++ gcc/tree-dfa.c	2017-10-23 17:17:01.432034493 +0100
@@ -705,10 +705,10 @@  get_ref_base_and_extent_hwi (tree exp, H
    its argument or a constant if the argument is known to be constant.  */
 
 tree
-get_addr_base_and_unit_offset_1 (tree exp, HOST_WIDE_INT *poffset,
+get_addr_base_and_unit_offset_1 (tree exp, poly_int64_pod *poffset,
 				 tree (*valueize) (tree))
 {
-  HOST_WIDE_INT byte_offset = 0;
+  poly_int64 byte_offset = 0;
 
   /* Compute cumulative byte-offset for nested component-refs and array-refs,
      and find the ultimate containing object.  */
@@ -718,10 +718,13 @@  get_addr_base_and_unit_offset_1 (tree ex
 	{
 	case BIT_FIELD_REF:
 	  {
-	    HOST_WIDE_INT this_off = TREE_INT_CST_LOW (TREE_OPERAND (exp, 2));
-	    if (this_off % BITS_PER_UNIT)
+	    poly_int64 this_byte_offset;
+	    poly_uint64 this_bit_offset;
+	    if (!poly_int_tree_p (TREE_OPERAND (exp, 2), &this_bit_offset)
+		|| !multiple_p (this_bit_offset, BITS_PER_UNIT,
+				&this_byte_offset))
 	      return NULL_TREE;
-	    byte_offset += this_off / BITS_PER_UNIT;
+	    byte_offset += this_byte_offset;
 	  }
 	  break;
 
@@ -729,15 +732,14 @@  get_addr_base_and_unit_offset_1 (tree ex
 	  {
 	    tree field = TREE_OPERAND (exp, 1);
 	    tree this_offset = component_ref_field_offset (exp);
-	    HOST_WIDE_INT hthis_offset;
+	    poly_int64 hthis_offset;
 
 	    if (!this_offset
-		|| TREE_CODE (this_offset) != INTEGER_CST
+		|| !poly_int_tree_p (this_offset, &hthis_offset)
 		|| (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
 		    % BITS_PER_UNIT))
 	      return NULL_TREE;
 
-	    hthis_offset = TREE_INT_CST_LOW (this_offset);
 	    hthis_offset += (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
 			     / BITS_PER_UNIT);
 	    byte_offset += hthis_offset;
@@ -755,17 +757,18 @@  get_addr_base_and_unit_offset_1 (tree ex
 	      index = (*valueize) (index);
 
 	    /* If the resulting bit-offset is constant, track it.  */
-	    if (TREE_CODE (index) == INTEGER_CST
+	    if (poly_int_tree_p (index)
 		&& (low_bound = array_ref_low_bound (exp),
-		    TREE_CODE (low_bound) == INTEGER_CST)
+		    poly_int_tree_p (low_bound))
 		&& (unit_size = array_ref_element_size (exp),
 		    TREE_CODE (unit_size) == INTEGER_CST))
 	      {
-		offset_int woffset
-		  = wi::sext (wi::to_offset (index) - wi::to_offset (low_bound),
+		poly_offset_int woffset
+		  = wi::sext (wi::to_poly_offset (index)
+			      - wi::to_poly_offset (low_bound),
 			      TYPE_PRECISION (TREE_TYPE (index)));
 		woffset *= wi::to_offset (unit_size);
-		byte_offset += woffset.to_shwi ();
+		byte_offset += woffset.force_shwi ();
 	      }
 	    else
 	      return NULL_TREE;
@@ -842,7 +845,7 @@  get_addr_base_and_unit_offset_1 (tree ex
    is not BITS_PER_UNIT-aligned.  */
 
 tree
-get_addr_base_and_unit_offset (tree exp, HOST_WIDE_INT *poffset)
+get_addr_base_and_unit_offset (tree exp, poly_int64_pod *poffset)
 {
   return get_addr_base_and_unit_offset_1 (exp, poffset, NULL);
 }
Index: gcc/doc/match-and-simplify.texi
===================================================================
--- gcc/doc/match-and-simplify.texi	2017-10-23 17:07:40.843798706 +0100
+++ gcc/doc/match-and-simplify.texi	2017-10-23 17:17:01.428035033 +0100
@@ -205,7 +205,7 @@  Captures can also be used for capturing
   (pointer_plus (addr@@2 @@0) INTEGER_CST_P@@1)
   (if (is_gimple_min_invariant (@@2)))
   @{
-    HOST_WIDE_INT off;
+    poly_int64 off;
     tree base = get_addr_base_and_unit_offset (@@0, &off);
     off += tree_to_uhwi (@@1);
     /* Now with that we should be able to simply write
Index: gcc/fold-const.c
===================================================================
--- gcc/fold-const.c	2017-10-23 17:11:40.244945208 +0100
+++ gcc/fold-const.c	2017-10-23 17:17:01.429034898 +0100
@@ -9455,7 +9455,7 @@  fold_binary_loc (location_t loc,
 	  && handled_component_p (TREE_OPERAND (arg0, 0)))
 	{
 	  tree base;
-	  HOST_WIDE_INT coffset;
+	  poly_int64 coffset;
 	  base = get_addr_base_and_unit_offset (TREE_OPERAND (arg0, 0),
 						&coffset);
 	  if (!base)
Index: gcc/gimple-fold.c
===================================================================
--- gcc/gimple-fold.c	2017-10-23 17:16:59.703267951 +0100
+++ gcc/gimple-fold.c	2017-10-23 17:17:01.430034763 +0100
@@ -838,8 +838,8 @@  gimple_fold_builtin_memory_op (gimple_st
 	      && TREE_CODE (dest) == ADDR_EXPR)
 	    {
 	      tree src_base, dest_base, fn;
-	      HOST_WIDE_INT src_offset = 0, dest_offset = 0;
-	      HOST_WIDE_INT maxsize;
+	      poly_int64 src_offset = 0, dest_offset = 0;
+	      poly_uint64 maxsize;
 
 	      srcvar = TREE_OPERAND (src, 0);
 	      src_base = get_addr_base_and_unit_offset (srcvar, &src_offset);
@@ -850,16 +850,14 @@  gimple_fold_builtin_memory_op (gimple_st
 							 &dest_offset);
 	      if (dest_base == NULL)
 		dest_base = destvar;
-	      if (tree_fits_uhwi_p (len))
-		maxsize = tree_to_uhwi (len);
-	      else
+	      if (!poly_int_tree_p (len, &maxsize))
 		maxsize = -1;
 	      if (SSA_VAR_P (src_base)
 		  && SSA_VAR_P (dest_base))
 		{
 		  if (operand_equal_p (src_base, dest_base, 0)
-		      && ranges_overlap_p (src_offset, maxsize,
-					   dest_offset, maxsize))
+		      && ranges_may_overlap_p (src_offset, maxsize,
+					       dest_offset, maxsize))
 		    return false;
 		}
 	      else if (TREE_CODE (src_base) == MEM_REF
@@ -868,17 +866,12 @@  gimple_fold_builtin_memory_op (gimple_st
 		  if (! operand_equal_p (TREE_OPERAND (src_base, 0),
 					 TREE_OPERAND (dest_base, 0), 0))
 		    return false;
-		  offset_int off = mem_ref_offset (src_base) + src_offset;
-		  if (!wi::fits_shwi_p (off))
-		    return false;
-		  src_offset = off.to_shwi ();
-
-		  off = mem_ref_offset (dest_base) + dest_offset;
-		  if (!wi::fits_shwi_p (off))
-		    return false;
-		  dest_offset = off.to_shwi ();
-		  if (ranges_overlap_p (src_offset, maxsize,
-					dest_offset, maxsize))
+		  poly_offset_int full_src_offset
+		    = mem_ref_offset (src_base) + src_offset;
+		  poly_offset_int full_dest_offset
+		    = mem_ref_offset (dest_base) + dest_offset;
+		  if (ranges_may_overlap_p (full_src_offset, maxsize,
+					    full_dest_offset, maxsize))
 		    return false;
 		}
 	      else
@@ -4317,7 +4310,7 @@  maybe_canonicalize_mem_ref_addr (tree *t
 	      || handled_component_p (TREE_OPERAND (addr, 0))))
 	{
 	  tree base;
-	  HOST_WIDE_INT coffset;
+	  poly_int64 coffset;
 	  base = get_addr_base_and_unit_offset (TREE_OPERAND (addr, 0),
 						&coffset);
 	  if (!base)
@@ -5903,7 +5896,7 @@  gimple_fold_stmt_to_constant_1 (gimple *
 	      else if (TREE_CODE (rhs) == ADDR_EXPR
 		       && !is_gimple_min_invariant (rhs))
 		{
-		  HOST_WIDE_INT offset = 0;
+		  poly_int64 offset = 0;
 		  tree base;
 		  base = get_addr_base_and_unit_offset_1 (TREE_OPERAND (rhs, 0),
 							  &offset,
Index: gcc/ipa-prop.c
===================================================================
--- gcc/ipa-prop.c	2017-10-23 17:16:59.704267816 +0100
+++ gcc/ipa-prop.c	2017-10-23 17:17:01.431034628 +0100
@@ -4297,7 +4297,7 @@  ipa_modify_call_arguments (struct cgraph
 	    off = build_int_cst (adj->alias_ptr_type, byte_offset);
 	  else
 	    {
-	      HOST_WIDE_INT base_offset;
+	      poly_int64 base_offset;
 	      tree prev_base;
 	      bool addrof;
 
Index: gcc/match.pd
===================================================================
--- gcc/match.pd	2017-10-23 17:11:39.914313353 +0100
+++ gcc/match.pd	2017-10-23 17:17:01.431034628 +0100
@@ -3345,7 +3345,7 @@  DEFINE_INT_AND_FLOAT_ROUND_FN (RINT)
   (cmp (convert1?@2 addr@0) (convert2? addr@1))
   (with
    {
-     HOST_WIDE_INT off0, off1;
+     poly_int64 off0, off1;
      tree base0 = get_addr_base_and_unit_offset (TREE_OPERAND (@0, 0), &off0);
      tree base1 = get_addr_base_and_unit_offset (TREE_OPERAND (@1, 0), &off1);
      if (base0 && TREE_CODE (base0) == MEM_REF)
@@ -3384,23 +3384,23 @@  DEFINE_INT_AND_FLOAT_ROUND_FN (RINT)
      }
      (if (equal == 1)
       (switch
-       (if (cmp == EQ_EXPR)
-	{ constant_boolean_node (off0 == off1, type); })
-       (if (cmp == NE_EXPR)
-	{ constant_boolean_node (off0 != off1, type); })
-       (if (cmp == LT_EXPR)
-	{ constant_boolean_node (off0 < off1, type); })
-       (if (cmp == LE_EXPR)
-	{ constant_boolean_node (off0 <= off1, type); })
-       (if (cmp == GE_EXPR)
-	{ constant_boolean_node (off0 >= off1, type); })
-       (if (cmp == GT_EXPR)
-	{ constant_boolean_node (off0 > off1, type); }))
+       (if (cmp == EQ_EXPR && (must_eq (off0, off1) || must_ne (off0, off1)))
+	{ constant_boolean_node (must_eq (off0, off1), type); })
+       (if (cmp == NE_EXPR && (must_eq (off0, off1) || must_ne (off0, off1)))
+	{ constant_boolean_node (must_ne (off0, off1), type); })
+       (if (cmp == LT_EXPR && (must_lt (off0, off1) || must_ge (off0, off1)))
+	{ constant_boolean_node (must_lt (off0, off1), type); })
+       (if (cmp == LE_EXPR && (must_le (off0, off1) || must_gt (off0, off1)))
+	{ constant_boolean_node (must_le (off0, off1), type); })
+       (if (cmp == GE_EXPR && (must_ge (off0, off1) || must_lt (off0, off1)))
+	{ constant_boolean_node (must_ge (off0, off1), type); })
+       (if (cmp == GT_EXPR && (must_gt (off0, off1) || must_le (off0, off1)))
+	{ constant_boolean_node (must_gt (off0, off1), type); }))
       (if (equal == 0
 	   && DECL_P (base0) && DECL_P (base1)
 	   /* If we compare this as integers require equal offset.  */
 	   && (!INTEGRAL_TYPE_P (TREE_TYPE (@2))
-	       || off0 == off1))
+	       || must_eq (off0, off1)))
        (switch
 	(if (cmp == EQ_EXPR)
 	 { constant_boolean_node (false, type); })
Index: gcc/omp-low.c
===================================================================
--- gcc/omp-low.c	2017-10-23 17:11:39.972424406 +0100
+++ gcc/omp-low.c	2017-10-23 17:17:01.432034493 +0100
@@ -8397,7 +8397,7 @@  lower_omp_target (gimple_stmt_iterator *
 		|| OMP_CLAUSE_MAP_KIND (c) == GOMP_MAP_FIRSTPRIVATE_REFERENCE)
 	      {
 		location_t clause_loc = OMP_CLAUSE_LOCATION (c);
-		HOST_WIDE_INT offset = 0;
+		poly_int64 offset = 0;
 		gcc_assert (prev);
 		var = OMP_CLAUSE_DECL (c);
 		if (DECL_P (var)
Index: gcc/tree-sra.c
===================================================================
--- gcc/tree-sra.c	2017-10-23 17:16:59.705267681 +0100
+++ gcc/tree-sra.c	2017-10-23 17:17:01.433034358 +0100
@@ -1678,7 +1678,7 @@  build_ref_for_offset (location_t loc, tr
   tree prev_base = base;
   tree off;
   tree mem_ref;
-  HOST_WIDE_INT base_offset;
+  poly_int64 base_offset;
   unsigned HOST_WIDE_INT misalign;
   unsigned int align;
 
@@ -1786,7 +1786,7 @@  build_ref_for_model (location_t loc, tre
 build_debug_ref_for_model (location_t loc, tree base, HOST_WIDE_INT offset,
 			   struct access *model)
 {
-  HOST_WIDE_INT base_offset;
+  poly_int64 base_offset;
   tree off;
 
   if (TREE_CODE (model->expr) == COMPONENT_REF
Index: gcc/tree-ssa-address.c
===================================================================
--- gcc/tree-ssa-address.c	2017-10-23 17:11:40.248952867 +0100
+++ gcc/tree-ssa-address.c	2017-10-23 17:17:01.433034358 +0100
@@ -1061,7 +1061,7 @@  maybe_fold_tmr (tree ref)
   else if (addr.symbol
 	   && handled_component_p (TREE_OPERAND (addr.symbol, 0)))
     {
-      HOST_WIDE_INT offset;
+      poly_int64 offset;
       addr.symbol = build_fold_addr_expr
 		      (get_addr_base_and_unit_offset
 		         (TREE_OPERAND (addr.symbol, 0), &offset));
Index: gcc/tree-ssa-alias.c
===================================================================
--- gcc/tree-ssa-alias.c	2017-10-23 17:16:59.705267681 +0100
+++ gcc/tree-ssa-alias.c	2017-10-23 17:17:01.433034358 +0100
@@ -683,8 +683,7 @@  ao_ref_alias_set (ao_ref *ref)
 void
 ao_ref_init_from_ptr_and_size (ao_ref *ref, tree ptr, tree size)
 {
-  HOST_WIDE_INT t;
-  poly_int64 size_hwi, extra_offset = 0;
+  poly_int64 t, size_hwi, extra_offset = 0;
   ref->ref = NULL_TREE;
   if (TREE_CODE (ptr) == SSA_NAME)
     {
Index: gcc/tree-ssa-ccp.c
===================================================================
--- gcc/tree-ssa-ccp.c	2017-10-23 17:07:40.843798706 +0100
+++ gcc/tree-ssa-ccp.c	2017-10-23 17:17:01.433034358 +0100
@@ -3003,7 +3003,7 @@  optimize_memcpy (gimple_stmt_iterator *g
 
   gimple *defstmt = SSA_NAME_DEF_STMT (vuse);
   tree src2 = NULL_TREE, len2 = NULL_TREE;
-  HOST_WIDE_INT offset, offset2;
+  poly_int64 offset, offset2;
   tree val = integer_zero_node;
   if (gimple_store_p (defstmt)
       && gimple_assign_single_p (defstmt)
@@ -3035,16 +3035,16 @@  optimize_memcpy (gimple_stmt_iterator *g
 	    ? DECL_SIZE_UNIT (TREE_OPERAND (src2, 1))
 	    : TYPE_SIZE_UNIT (TREE_TYPE (src2)));
   if (len == NULL_TREE
-      || TREE_CODE (len) != INTEGER_CST
+      || !poly_int_tree_p (len)
       || len2 == NULL_TREE
-      || TREE_CODE (len2) != INTEGER_CST)
+      || !poly_int_tree_p (len2))
     return;
 
   src = get_addr_base_and_unit_offset (src, &offset);
   src2 = get_addr_base_and_unit_offset (src2, &offset2);
   if (src == NULL_TREE
       || src2 == NULL_TREE
-      || offset < offset2)
+      || may_lt (offset, offset2))
     return;
 
   if (!operand_equal_p (src, src2, 0))
@@ -3053,7 +3053,8 @@  optimize_memcpy (gimple_stmt_iterator *g
   /* [ src + offset2, src + offset2 + len2 - 1 ] is set to val.
      Make sure that
      [ src + offset, src + offset + len - 1 ] is a subset of that.  */
-  if (wi::to_offset (len) + (offset - offset2) > wi::to_offset (len2))
+  if (may_gt (wi::to_poly_offset (len) + (offset - offset2),
+	      wi::to_poly_offset (len2)))
     return;
 
   if (dump_file && (dump_flags & TDF_DETAILS))
Index: gcc/tree-ssa-forwprop.c
===================================================================
--- gcc/tree-ssa-forwprop.c	2017-10-23 17:07:40.843798706 +0100
+++ gcc/tree-ssa-forwprop.c	2017-10-23 17:17:01.434034223 +0100
@@ -758,12 +758,12 @@  forward_propagate_addr_expr_1 (tree name
       && TREE_OPERAND (lhs, 0) == name)
     {
       tree def_rhs_base;
-      HOST_WIDE_INT def_rhs_offset;
+      poly_int64 def_rhs_offset;
       /* If the address is invariant we can always fold it.  */
       if ((def_rhs_base = get_addr_base_and_unit_offset (TREE_OPERAND (def_rhs, 0),
 							 &def_rhs_offset)))
 	{
-	  offset_int off = mem_ref_offset (lhs);
+	  poly_offset_int off = mem_ref_offset (lhs);
 	  tree new_ptr;
 	  off += def_rhs_offset;
 	  if (TREE_CODE (def_rhs_base) == MEM_REF)
@@ -850,11 +850,11 @@  forward_propagate_addr_expr_1 (tree name
       && TREE_OPERAND (rhs, 0) == name)
     {
       tree def_rhs_base;
-      HOST_WIDE_INT def_rhs_offset;
+      poly_int64 def_rhs_offset;
       if ((def_rhs_base = get_addr_base_and_unit_offset (TREE_OPERAND (def_rhs, 0),
 							 &def_rhs_offset)))
 	{
-	  offset_int off = mem_ref_offset (rhs);
+	  poly_offset_int off = mem_ref_offset (rhs);
 	  tree new_ptr;
 	  off += def_rhs_offset;
 	  if (TREE_CODE (def_rhs_base) == MEM_REF)
@@ -1169,12 +1169,12 @@  #define CPD_ITERATIONS 5
 	  if (TREE_CODE (p) == ADDR_EXPR)
 	    {
 	      tree q = TREE_OPERAND (p, 0);
-	      HOST_WIDE_INT offset;
+	      poly_int64 offset;
 	      tree base = get_addr_base_and_unit_offset (q, &offset);
 	      if (base)
 		{
 		  q = base;
-		  if (offset)
+		  if (maybe_nonzero (offset))
 		    off = size_binop (PLUS_EXPR, off, size_int (offset));
 		}
 	      if (TREE_CODE (q) == MEM_REF
Index: gcc/tree-ssa-loop-niter.c
===================================================================
--- gcc/tree-ssa-loop-niter.c	2017-10-23 17:07:40.843798706 +0100
+++ gcc/tree-ssa-loop-niter.c	2017-10-23 17:17:01.434034223 +0100
@@ -1987,7 +1987,7 @@  expand_simple_operations (tree expr, tre
 	return expand_simple_operations (e, stop);
       else if (code == ADDR_EXPR)
 	{
-	  HOST_WIDE_INT offset;
+	  poly_int64 offset;
 	  tree base = get_addr_base_and_unit_offset (TREE_OPERAND (e, 0),
 						     &offset);
 	  if (base
Index: gcc/tree-ssa-phiopt.c
===================================================================
--- gcc/tree-ssa-phiopt.c	2017-10-23 17:07:40.843798706 +0100
+++ gcc/tree-ssa-phiopt.c	2017-10-23 17:17:01.434034223 +0100
@@ -692,12 +692,12 @@  jump_function_from_stmt (tree *arg, gimp
     {
       /* For arg = &p->i transform it to p, if possible.  */
       tree rhs1 = gimple_assign_rhs1 (stmt);
-      HOST_WIDE_INT offset;
+      poly_int64 offset;
       tree tem = get_addr_base_and_unit_offset (TREE_OPERAND (rhs1, 0),
 						&offset);
       if (tem
 	  && TREE_CODE (tem) == MEM_REF
-	  && (mem_ref_offset (tem) + offset) == 0)
+	  && known_zero (mem_ref_offset (tem) + offset))
 	{
 	  *arg = TREE_OPERAND (tem, 0);
 	  return true;
Index: gcc/tree-ssa-pre.c
===================================================================
--- gcc/tree-ssa-pre.c	2017-10-23 17:11:39.943368879 +0100
+++ gcc/tree-ssa-pre.c	2017-10-23 17:17:01.435034088 +0100
@@ -2504,7 +2504,7 @@  create_component_ref_by_pieces_1 (basic_
 	if (TREE_CODE (baseop) == ADDR_EXPR
 	    && handled_component_p (TREE_OPERAND (baseop, 0)))
 	  {
-	    HOST_WIDE_INT off;
+	    poly_int64 off;
 	    tree base;
 	    base = get_addr_base_and_unit_offset (TREE_OPERAND (baseop, 0),
 						  &off);
Index: gcc/tree-ssa-sccvn.c
===================================================================
--- gcc/tree-ssa-sccvn.c	2017-10-23 17:16:59.706267546 +0100
+++ gcc/tree-ssa-sccvn.c	2017-10-23 17:17:01.435034088 +0100
@@ -1154,7 +1154,7 @@  vn_reference_fold_indirect (vec<vn_refer
   vn_reference_op_t op = &(*ops)[i];
   vn_reference_op_t mem_op = &(*ops)[i - 1];
   tree addr_base;
-  HOST_WIDE_INT addr_offset = 0;
+  poly_int64 addr_offset = 0;
 
   /* The only thing we have to do is from &OBJ.foo.bar add the offset
      from .foo.bar to the preceding MEM_REF offset and replace the
@@ -1164,8 +1164,10 @@  vn_reference_fold_indirect (vec<vn_refer
   gcc_checking_assert (addr_base && TREE_CODE (addr_base) != MEM_REF);
   if (addr_base != TREE_OPERAND (op->op0, 0))
     {
-      offset_int off = offset_int::from (wi::to_wide (mem_op->op0), SIGNED);
-      off += addr_offset;
+      poly_offset_int off
+	= (poly_offset_int::from (wi::to_poly_wide (mem_op->op0),
+				  SIGNED)
+	   + addr_offset);
       mem_op->op0 = wide_int_to_tree (TREE_TYPE (mem_op->op0), off);
       op->op0 = build_fold_addr_expr (addr_base);
       if (tree_fits_shwi_p (mem_op->op0))
@@ -1188,7 +1190,7 @@  vn_reference_maybe_forwprop_address (vec
   vn_reference_op_t mem_op = &(*ops)[i - 1];
   gimple *def_stmt;
   enum tree_code code;
-  offset_int off;
+  poly_offset_int off;
 
   def_stmt = SSA_NAME_DEF_STMT (op->op0);
   if (!is_gimple_assign (def_stmt))
@@ -1199,7 +1201,7 @@  vn_reference_maybe_forwprop_address (vec
       && code != POINTER_PLUS_EXPR)
     return false;
 
-  off = offset_int::from (wi::to_wide (mem_op->op0), SIGNED);
+  off = poly_offset_int::from (wi::to_poly_wide (mem_op->op0), SIGNED);
 
   /* The only thing we have to do is from &OBJ.foo.bar add the offset
      from .foo.bar to the preceding MEM_REF offset and replace the
@@ -1207,7 +1209,7 @@  vn_reference_maybe_forwprop_address (vec
   if (code == ADDR_EXPR)
     {
       tree addr, addr_base;
-      HOST_WIDE_INT addr_offset;
+      poly_int64 addr_offset;
 
       addr = gimple_assign_rhs1 (def_stmt);
       addr_base = get_addr_base_and_unit_offset (TREE_OPERAND (addr, 0),
@@ -1217,7 +1219,7 @@  vn_reference_maybe_forwprop_address (vec
 	 dereference isn't offsetted.  */
       if (!addr_base
 	  && *i_p == ops->length () - 1
-	  && off == 0
+	  && known_zero (off)
 	  /* This makes us disable this transform for PRE where the
 	     reference ops might be also used for code insertion which
 	     is invalid.  */
@@ -1234,7 +1236,7 @@  vn_reference_maybe_forwprop_address (vec
 	      vn_reference_op_t new_mem_op = &tem[tem.length () - 2];
 	      new_mem_op->op0
 		= wide_int_to_tree (TREE_TYPE (mem_op->op0),
-				    wi::to_wide (new_mem_op->op0));
+				    wi::to_poly_wide (new_mem_op->op0));
 	    }
 	  else
 	    gcc_assert (tem.last ().opcode == STRING_CST);
@@ -2242,10 +2244,8 @@  vn_reference_lookup_3 (ao_ref *ref, tree
 	}
       if (TREE_CODE (lhs) == ADDR_EXPR)
 	{
-	  HOST_WIDE_INT tmp_lhs_offset;
 	  tree tem = get_addr_base_and_unit_offset (TREE_OPERAND (lhs, 0),
-						    &tmp_lhs_offset);
-	  lhs_offset = tmp_lhs_offset;
+						    &lhs_offset);
 	  if (!tem)
 	    return (void *)-1;
 	  if (TREE_CODE (tem) == MEM_REF
@@ -2272,10 +2272,8 @@  vn_reference_lookup_3 (ao_ref *ref, tree
 	rhs = SSA_VAL (rhs);
       if (TREE_CODE (rhs) == ADDR_EXPR)
 	{
-	  HOST_WIDE_INT tmp_rhs_offset;
 	  tree tem = get_addr_base_and_unit_offset (TREE_OPERAND (rhs, 0),
-						    &tmp_rhs_offset);
-	  rhs_offset = tmp_rhs_offset;
+						    &rhs_offset);
 	  if (!tem)
 	    return (void *)-1;
 	  if (TREE_CODE (tem) == MEM_REF
@@ -3282,7 +3280,7 @@  dominated_by_p_w_unex (basic_block bb1,
 set_ssa_val_to (tree from, tree to)
 {
   tree currval = SSA_VAL (from);
-  HOST_WIDE_INT toff, coff;
+  poly_int64 toff, coff;
 
   /* The only thing we allow as value numbers are ssa_names
      and invariants.  So assert that here.  We don't allow VN_TOP
@@ -3364,7 +3362,7 @@  set_ssa_val_to (tree from, tree to)
 	   && TREE_CODE (to) == ADDR_EXPR
 	   && (get_addr_base_and_unit_offset (TREE_OPERAND (currval, 0), &coff)
 	       == get_addr_base_and_unit_offset (TREE_OPERAND (to, 0), &toff))
-	   && coff == toff))
+	   && must_eq (coff, toff)))
     {
       if (dump_file && (dump_flags & TDF_DETAILS))
 	fprintf (dump_file, " (changed)\n");
Index: gcc/tree-ssa-strlen.c
===================================================================
--- gcc/tree-ssa-strlen.c	2017-10-23 17:07:40.843798706 +0100
+++ gcc/tree-ssa-strlen.c	2017-10-23 17:17:01.436033953 +0100
@@ -227,8 +227,9 @@  get_addr_stridx (tree exp, tree ptr, uns
   if (!decl_to_stridxlist_htab)
     return 0;
 
-  base = get_addr_base_and_unit_offset (exp, &off);
-  if (base == NULL || !DECL_P (base))
+  poly_int64 poff;
+  base = get_addr_base_and_unit_offset (exp, &poff);
+  if (base == NULL || !DECL_P (base) || !poff.is_constant (&off))
     return 0;
 
   list = decl_to_stridxlist_htab->get (base);
@@ -368,8 +369,9 @@  addr_stridxptr (tree exp)
 {
   HOST_WIDE_INT off;
 
-  tree base = get_addr_base_and_unit_offset (exp, &off);
-  if (base == NULL_TREE || !DECL_P (base))
+  poly_int64 poff;
+  tree base = get_addr_base_and_unit_offset (exp, &poff);
+  if (base == NULL_TREE || !DECL_P (base) || !poff.is_constant (&off))
     return NULL;
 
   if (!decl_to_stridxlist_htab)
Index: gcc/tree.c
===================================================================
--- gcc/tree.c	2017-10-23 17:11:40.252960525 +0100
+++ gcc/tree.c	2017-10-23 17:17:01.436033953 +0100
@@ -4903,7 +4903,7 @@  build5 (enum tree_code code, tree tt, tr
 tree
 build_simple_mem_ref_loc (location_t loc, tree ptr)
 {
-  HOST_WIDE_INT offset = 0;
+  poly_int64 offset = 0;
   tree ptype = TREE_TYPE (ptr);
   tree tem;
   /* For convenience allow addresses that collapse to a simple base