2009-04-03  Richard Guenther  <rguenther@suse.de>

PR middle-end/13146
PR tree-optimization/23940
PR tree-optimization/33237
PR middle-end/33974
PR middle-end/34093
PR tree-optimization/36201
PR tree-optimization/36230
PR tree-optimization/38049
PR tree-optimization/38207
PR tree-optimization/38230
PR tree-optimization/38301
PR tree-optimization/38585
PR middle-end/38895
PR tree-optimization/38985
PR tree-optimization/39299
* tree-ssa-structalias.h: Remove.
* tree-ssa-operands.h (NULL_USE_OPERAND_P): Make of type use_operand_p.
(NULL_DEF_OPERAND_P): Make of type def_operand_p.
(struct vuse_element_d): Remove.
(struct vuse_vec_d): Likewise.
(VUSE_VECT_NUM_ELEM, VUSE_VECT_ELEMENT_NC, VUSE_ELEMENT_PTR_NC,
VUSE_ELEMENT_VAR_NC, VUSE_VECT_ELEMENT, VUSE_ELEMENT_PTR,
SET_VUSE_VECT_ELEMENT, SET_VUSE_ELEMENT_VAR, SET_VUSE_ELEMENT_PTR,
VUSE_ELEMENT_VAR): Likewise.
(struct voptype_d): Likewise.
(NUM_VOP_FREE_BUCKETS): Likewise.
(struct ssa_operands): Remove vop_free_buckets and mpt_table fields.
(struct stmt_operands_d): Remove.
(VUSE_OP_PTR, VUSE_OP, SET_VUSE_OP, VUSE_NUM, VUSE_VECT,
VDEF_RESULT_PTR, VDEF_RESULT, VDEF_OP_PTR, VDEF_OP, SET_VDEF_OP,
VDEF_NUM, VDEF_VECT): Likewise.
(copy_virtual_operands): Remove.
(operand_build_cmp): Likewise.
(create_ssa_artificial_load_stmt): Likewise.
(enum ssa_op_iter_type): Remove ssa_op_iter_vdef.
(struct ssa_operand_iterator_d): Remove vuses, vdefs, mayuses,
vuse_index and mayuse_index members. Pack and move done and iter_type
members to the front.
(SSA_OP_VMAYUSE): Remove.
(SSA_OP_VIRTUAL_USES): Adjust.
(FOR_EACH_SSA_VDEF_OPERAND): Remove.
(unlink_stmt_vdef): Declare.
(add_to_addressable_set): Remove.
* tree-vrp.c (stmt_interesting_for_vrp): Adjust.
(vrp_visit_stmt): Likewise.
* doc/tree-ssa.texi (Alias analysis): Update.
* doc/invoke.texi (max-aliased-vops): Remove docs.
(avg-aliased-vops): Likewise.
* tree-into-ssa.c (syms_to_rename): Remove.
(need_to_update_vops_p): Likewise.
(need_to_initialize_update_ssa_p): Rename to ...
(update_ssa_initialized_fn): ... this. Track function we are
initialized for.
(symbol_marked_for_renaming): Simplify.
(add_new_name_mapping): Do not set need_to_update_vops_p.
(dump_currdefs): Use SYMS_TO_RENAME.
(rewrite_update_stmt): Always walk all uses/defs.
(dump_update_ssa): Adjust.
(init_update_ssa): Take function argument. Track what we are
initialized for.
(delete_update_ssa): Reset SYMS_TO_RENAME and update_ssa_initialized_fn.
(create_new_def_for): Initialize for cfun, assert we are initialized
for cfun.
(mark_sym_for_renaming): Simplify.
(mark_set_for_renaming): Do not initialize update-ssa.
(need_ssa_update_p): Simplify. Take function argument.
(name_mappings_registered_p): Assert we ask for the correct function.
(name_registered_for_update_p): Likewise.
(ssa_names_to_replace): Likewise.
(release_ssa_name_after_update_ssa): Likewise.
(update_ssa): Likewise. Use SYMS_TO_RENAME.
(dump_decl_set): Do not print a newline.
(debug_decl_set): Do it here.
(dump_update_ssa): And here.
* tree-ssa-loop-im.c (move_computations): Adjust.
(movement_possibility): Likewise.
(determine_max_movement): Likewise.
(gather_mem_refs_stmt): Likewise.
* tree-dump.c (dequeue_and_dump): Do not handle SYMBOL_MEMORY_TAG
or NAME_MEMORY_TAG.
* tree-complex.c (update_all_vops): Remove.
(expand_complex_move): Adjust.
* tree-ssa-loop-niter.c (chain_of_csts_start): Use NULL_TREE.
Simplify test for memory referencing statement. Exclude
non-invariant ADDR_EXPRs.
* tree-pretty-print.c (dump_generic_node): Do not handle memory tags.
* tree-loop-distribution.c (generate_memset_zero): Adjust.
(rdg_flag_uses): Likewise.
* tree-tailcall.c (suitable_for_tail_opt_p): Remove memory-tag
related code.
(tree_optimize_tail_calls_1): Also split the
edge from the entry block if we have degenerate PHI nodes in
the first basic block.
* tree.c (init_ttree): Remove memory-tag related code.
(tree_code_size): Likewise.
(tree_node_structure): Likewise.
(build7_stat): Re-write to be build6_stat.
* tree.h (MTAG_P, TREE_MEMORY_TAG_CHECK, TMR_TAG): Remove.
(SSA_VAR_P): Adjust.
(struct tree_memory_tag): Remove.
(struct tree_memory_partition_tag): Likewise.
(union tree_node): Adjust.
(build7): Re-write to be build6.
* tree-pass.h (pass_reset_cc_flags): Remove.
(TODO_update_address_taken): New flag.
(pass_simple_dse): Remove.
* ipa-cp.c (ipcp_update_callgraph): Update SSA form.
* params.h (MAX_ALIASED_VOPS): Remove.
(AVG_ALIASED_VOPS): Likewise.
* omp-low.c (expand_omp_taskreg): Update SSA form.
* tree-ssa-dse.c (dse_optimize_stmt): Properly query if the rhs
aliases the lhs in a copy stmt.
* tree-ssa-dse.c (struct address_walk_data): Remove.
(memory_ssa_name_same): Likewise.
(memory_address_same): Likewise.
(get_kill_of_stmt_lhs): Likewise.
(dse_possible_dead_store_p): Simplify, use the oracle. Handle
unused stores. Look through PHI nodes into post-dominated regions.
(dse_optimize_stmt): Simplify. Properly remove stores.
(tree_ssa_dse): Compute dominators.
(execute_simple_dse): Remove.
(pass_simple_dse): Likewise.
* ipa-reference.c (scan_stmt_for_static_refs): Open-code
gimple_loaded_syms and gimple_stored_syms computation.
* toplev.c (dump_memory_report): Dump alias and pta stats.
* tree-ssa-sccvn.c (vn_reference_compute_hash): Simplify.
(vn_reference_eq): Likewise.
(vuses_to_vec, copy_vuses_from_stmt, vdefs_to_vec,
copy_vdefs_from_stmt, shared_lookup_vops, shared_vuses_from_stmt,
valueize_vuses): Remove.
(get_def_ref_stmt_vuses): Simplify. Rename to ...
(get_def_ref_stmt_vuse): ... this.
(vn_reference_lookup_2): New function.
(vn_reference_lookup_pieces): Use walk_non_aliased_vuses for
walking equivalent vuses. Simplify.
(vn_reference_lookup): Likewise.
(vn_reference_insert): Likewise.
(vn_reference_insert_pieces): Likewise.
(visit_reference_op_call): Simplify.
(visit_reference_op_load): Likewise.
(visit_reference_op_store): Likewise.
(init_scc_vn): Remove shared_lookup_vuses initialization.
(free_scc_vn): Remove shared_lookup_vuses freeing.
(sort_vuses, sort_vuses_heap): Remove.
(get_ref_from_reference_ops): Export.
* tree-ssa-sccvn.h (struct vn_reference_s): Replace vuses
vector with single vuse pointer.
(vn_reference_lookup_pieces, vn_reference_lookup,
vn_reference_insert, vn_reference_insert_pieces): Adjust prototypes.
(shared_vuses_from_stmt): Remove.
(get_ref_from_reference_ops): Declare.
* tree-ssa-loop-manip.c (slpeel_can_duplicate_loop_p): Adjust.
* tree-ssa-copyrename.c (copy_rename_partition_coalesce): Remove
memory-tag related code.
* tree-ssa-ccp.c (get_symbol_constant_value): Remove memory-tag code.
(likely_value): Add comment, skip static-chain of call statements.
(surely_varying_stmt_p): Adjust.
(gimplify_and_update_call_from_tree): Likewise.
(execute_fold_all_builtins): Do not rebuild alias info.
(gimplify_and_update_call_from_tree): Properly update VOPs.
* tree-ssa-loop-ivopts.c (get_ref_tag): Remove.
(copy_ref_info): Remove memory-tag related code.
* tree-call-cdce.c (tree_call_cdce): Rename the VOP.
* ipa-pure-const.c (check_decl): Remove memory-tag related code.
(check_stmt): Open-code gimple_loaded_syms and gimple_stored_syms
computation.
* tree-ssa-dom.c (gimple_p): Remove typedef.
(eliminate_redundant_computations): Adjust.
(record_equivalences_from_stmt): Likewise.
(avail_expr_hash): Likewise.
(avail_expr_eq): Likewise.
* tree-ssa-propagate.c (update_call_from_tree): Properly
update VOPs.
(stmt_makes_single_load): Likewise.
(stmt_makes_single_store): Likewise.
* tree-ssa-alias.c: Rewrite completely.
(debug_memory_partitions, dump_mem_ref_stats, debug_mem_ref_stats,
debug_mem_sym_stats, dump_mem_sym_stats_for_var,
debug_all_mem_sym_stats, debug_mp_info, update_mem_sym_stats_from_stmt,
delete_mem_ref_stats, create_tag_raw, dump_points_to_info,
dump_may_aliases_for, debug_may_aliases_for, new_type_alias):
Remove public functions.
(pass_reset_cc_flags): Remove.
(pass_build_alias): Move ...
* tree-ssa-structalias.c (pass_build_alias): ... here.
* tree-ssa-alias.c (may_be_aliased): Move ...
* tree-flow-inline.h (may_be_aliased): ... here.
* tree-ssa-alias.c (struct count_ptr_d, count_ptr_derefs,
count_uses_and_derefs): Move ...
* gimple.c: ... here.
* gimple.h (count_uses_and_derefs): Declare.
* tree-ssa-alias.c (dump_alias_stats, ptr_deref_may_alias_global_p,
ptr_deref_may_alias_decl_p, ptr_derefs_may_alias_p,
same_type_for_tbaa, nonaliasing_component_refs_p, decl_refs_may_alias_p,
indirect_ref_may_alias_decl_p, indirect_refs_may_alias_p,
ref_maybe_used_by_call_p, ref_maybe_used_by_stmt_p,
call_may_clobber_ref_p, stmt_may_clobber_ref_p, maybe_skip_until,
get_continuation_for_phi, walk_non_aliased_vuses, walk_aliased_vdefs):
New functions.
* tree-dfa.c (refs_may_alias_p): Move ...
* tree-ssa-alias.c (refs_may_alias_p): ... here. Extend.
* tree-ssa-alias.h: New file.
* tree-ssa-sink.c (is_hidden_global_store): Adjust.
(statement_sink_location): Likewise.
* opts.c (decode_options): Do not adjust max-aliased-vops or
avg-aliased-vops values.
* timevar.def (TV_TREE_MAY_ALIAS): Remove.
(TV_CALL_CLOBBER): Likewise.
(TV_FLOW_SENSITIVE): Likewise.
(TV_FLOW_INSENSITIVE): Likewise.
(TV_MEMORY_PARTITIONING): Likewise.
(TV_ALIAS_STMT_WALK): New timevar.
* tree-ssa-loop-ivcanon.c (empty_loop_p): Adjust.
* tree-ssa-address.c (create_mem_ref_raw): Use build6.
(get_address_description): Remove memory-tag related code.
* tree-ssa-ifcombine.c (bb_no_side_effects_p): Adjust.
* treestruct.def (TS_MEMORY_TAG, TS_MEMORY_PARTITION_TAG): Remove.
* tree-eh.c (cleanup_empty_eh): Do not leave stale SSA_NAMEs
and immediate uses in statements. Document.
* gimple-pretty-print.c (dump_gimple_mem_ops): Adjust.
(dump_symbols): Remove.
(dump_gimple_mem_ops): Do not dump loaded or stored syms.
* alias.c (get_deref_alias_set): New function split out from ...
(get_alias_set): ... here.
* alias.h (get_deref_alias_set): Declare.
* tree-vect-data-refs.c (vect_create_data_ref_ptr): Remove unused
type parameter. Remove restrict pointer handling. Create a
ref-all pointer in case type-based alias sets do not conflict.
(vect_analyze_data_refs): Remove SMT related code.
* tree-vect-stmts.c (vectorizable_store): Re-instantiate TBAA assert.
(vectorizable_load): Likewise.
* tree-data-ref.h (struct dr_alias): Remove symbol_tag field.
(DR_SYMBOL_TAG, DR_VOPS): Remove.
* tree-data-ref.c (dr_may_alias_p): Use the alias-oracle.
Ignore vops and SMTs.
(dr_analyze_alias): Likewise.
(free_data_ref): Likewise.
(create_data_ref): Likewise.
(analyze_all_data_dependences): Likewise.
(get_references_in_stmt): Adjust.
* tree-flow-inline.h (gimple_aliases_computed_p,
gimple_addressable_vars, gimple_call_clobbered_vars,
gimple_call_used_vars, gimple_global_var, may_aliases, memory_partition,
factoring_name_p, mark_call_clobbered, clear_call_clobbered,
compare_ssa_operands_equal, symbol_mem_tag, set_symbol_mem_tag,
gimple_mem_ref_stats): Remove.
(gimple_vop): New function.
(op_iter_next_use): Remove vuses and mayuses cases.
(op_iter_next_def): Remove vdefs case.
(op_iter_next_tree): Remove vuses, mayuses and vdefs cases.
(clear_and_done_ssa_iter): Do not set removed fields.
(op_iter_init): Likewise. Skip vuse and/or vdef if requested.
Assert we are not iterating over vuses or vdefs if not also
iterating over uses or defs.
(op_iter_init_use): Likewise.
(op_iter_init_def): Likewise.
(op_iter_next_vdef): Remove.
(op_iter_next_mustdef): Likewise.
(op_iter_init_vdef): Likewise.
(compare_ssa_operands_equal): Likewise.
(link_use_stmts_after): Handle vuse operand.
(is_call_used): Use is_call_clobbered.
(is_call_clobbered): Global variables are always call clobbered,
query the call-clobbers bitmap.
(mark_call_clobbered): Ignore global variables.
(clear_call_clobbered): Likewise.
* tree-ssa-coalesce.c (create_outofssa_var_map): Adjust
virtual operands sanity check.
* tree.def (NAME_MEMORY_TAG, SYMBOL_MEMORY_TAG, MEMORY_PARTITION_TAG):
Remove.
(TARGET_MEM_REF): Remove TMR_TAG operand.
* tree-dfa.c (add_referenced_var): Initialize call-clobber state.
Remove call-clobber related code.
(remove_referenced_var): Likewise. Do not clear mpt or symbol_mem_tag.
(dump_variable): Do not dump SMTs, memory stats, may-aliases or
partitions or escape reason.
(get_single_def_stmt, get_single_def_stmt_from_phi,
get_single_def_stmt_with_phi): Remove.
(dump_referenced_vars): Tidy.
(get_ref_base_and_extent): Allow bare decls.
(collect_dfa_stats): Adjust.
* graphite.c (rename_variables_in_stmt): Adjust.
(graphite_copy_stmts_from_block): Likewise.
(translate_clast): Likewise.
* tree-ssa-pre.c (struct bb_bitmap_sets): Add expr_dies bitmap.
(EXPR_DIES): New.
(translate_vuse_through_block): Use the oracle.
(phi_translate_1): Adjust.
(value_dies_in_block_x): Use the oracle. Cache the outcome
in EXPR_DIES.
(valid_in_sets): Check if the VUSE for
a REFERENCE is available.
(eliminate): Do not remove stmts during elimination,
instead queue and remove them afterwards.
(do_pre): Do not rebuild alias info.
(pass_pre): Run TODO_rebuild_alias before PRE.
* tree-ssa-live.c (remove_unused_locals): Remove memory-tag code.
* tree-sra.c (sra_walk_function): Use gimple_references_memory_p.
(mark_all_v_defs_stmt): Remove.
(mark_all_v_defs_seq): Adjust.
(sra_replace): Likewise.
(scalarize_use): Likewise.
(scalarize_copy): Likewise.
(scalarize_init): Likewise.
(scalarize_ldst): Likewise.
(todoflags): Remove.
(tree_sra): Do not rebuild alias info.
(tree_sra_early): Adjust.
(pass_sra): Run TODO_update_address_taken before SRA.
* tree-predcom.c (set_alias_info): Remove.
(prepare_initializers_chain): Do not call it.
(mark_virtual_ops_for_renaming): Adjust.
(mark_virtual_ops_for_renaming_list): Remove.
(initialize_root_vars): Adjust.
(initialize_root_vars_lm): Likewise.
(prepare_initializers_chain): Likewise.
* tree-ssa-copy.c (may_propagate_copy): Remove memory-tag related code.
(may_propagate_copy_into_stmt): Likewise.
(merge_alias_info): Do nothing for now.
(propagate_tree_value_into_stmt): Adjust.
(stmt_may_generate_copy): Likewise.
* tree-ssa-forwprop.c (tidy_after_forward_propagate_addr): Do
not mark symbols for renaming.
(forward_propagate_addr_expr): Match up push/pop_stmt_changes
with the same statement, make sure to update the new pointed-to one.
* tree-ssa-dce.c (eliminate_unnecessary_stmts): Do not copy
call statements, do not mark symbols for renaming.
(mark_operand_necessary): Dump something.
(ref_may_be_aliased): New function.
(mark_aliased_reaching_defs_necessary_1): New helper function.
(mark_aliased_reaching_defs_necessary): Likewise.
(mark_all_reaching_defs_necessary_1): Likewise.
(mark_all_reaching_defs_necessary): Likewise.
(propagate_necessity): Do not process virtual PHIs. For
non-aliased loads mark all reaching definitions as necessary.
For aliased loads and stores mark the immediate dominating
aliased clobbers as necessary.
(visited): New global static.
(perform_tree_ssa_dce): Free visited bitmap after propagating
necessity.
(remove_dead_phis): Perform simple dead virtual PHI removal.
(remove_dead_stmt): Properly unlink virtual operands when
removing stores.
(eliminate_unnecessary_stmts): Schedule PHI removal after
stmt removal.
* tree-ssa-ter.c (is_replaceable_p): Adjust.
(process_replaceable): Likewise.
(find_replaceable_in_bb): Likewise.
* tree-ssa.c (verify_ssa_name): Verify all VOPs are
based on the single gimple vop.
(verify_flow_insensitive_alias_info): Remove.
(verify_flow_sensitive_alias_info): Likewise.
(verify_call_clobbering): Likewise.
(verify_memory_partitions): Likewise.
(verify_alias_info): Likewise.
(verify_ssa): Adjust.
(execute_update_addresses_taken): Export. Update SSA
manually. Optimize only when optimizing. Use a local bitmap.
(pass_update_address_taken): Remove TODO_update_ssa, add
TODO_dump_func.
(pass_update_address_taken): Just use TODO_update_address_taken.
(init_tree_ssa): Do not initialize addressable_vars.
(verify_ssa): Verify new VUSE / VDEF properties.
Verify that all stmts definitions have the stmt as SSA_NAME_DEF_STMT.
Do not call verify_alias_info.
(delete_tree_ssa): Clear the VUSE, VDEF operands.
Do not free the loaded and stored syms bitmaps. Reset the escaped
and callused solutions. Do not free addressable_vars.
Remove memory-tag related code.
(warn_uninitialized_var): Aliases are always available.
* tree-ssa-loop-prefetch.c (gather_memory_references): Adjust.
* lambda-code.c (can_put_in_inner_loop): Adjust.
(can_put_after_inner_loop): Likewise.
(perfect_nestify): Likewise.
* tree-vect-stmts.c (vect_stmt_relevant_p): Adjust.
(vect_gen_widened_results_half): Remove CALL_EXPR handling.
(vectorizable_conversion): Do not mark symbols for renaming.
* tree-inline.c (remap_gimple_stmt): Clear VUSE/VDEF.
(expand_call_inline): Unlink the calls virtual operands before
replacing it.
(tree_function_versioning): Do not call update_ssa if we are not
updating clones. Simplify.
* tree-ssa-phiprop.c (phivn_valid_p): Adjust.
(propagate_with_phi): Likewise.
* tree-outof-ssa.c (create_temp): Remove memory tag and call
clobber code. Assert we are not aliased or global.
* tree-flow.h: Include tree-ssa-alias.h.
(enum escape_type): Remove.
(struct mem_sym_stats_d): Likewise.
(struct mem_ref_stats_d): Likewise.
(struct gimple_df): Add vop member. Remove global_var,
call_clobbered_vars, call_used_vars, addressable_vars,
aliases_computed_p and mem_ref_stats members. Add syms_to_rename,
escaped and callused members.
(struct ptr_info_def): Remove all members, add points-to solution
member pt.
(struct var_ann_d): Remove in_vuse_list, in_vdef_list,
call_clobbered, escape_mask, mpt and symbol_mem_tag members.
* Makefile.in (TREE_FLOW_H): Add tree-ssa-alias.h.
(tree-ssa-structalias.o): Remove tree-ssa-structalias.h.
(tree-ssa-alias.o): Likewise.
(toplev.o): Add tree-ssa-alias.h.
(GTFILES): Remove tree-ssa-structalias.h, add tree-ssa-alias.h.
* gimple.c (gimple_set_bb): Fix off-by-one error.
(is_gimple_reg): Do not handle memory tags.
(gimple_copy): Also copy virtual operands.
Delay updating the statement. Do not reset loaded and stored syms.
(gimple_set_stored_syms): Remove.
(gimple_set_loaded_syms): Likewise.
(gimple_call_copy_skip_args): Copy the virtual operands
and mark the new statement modified.
* tree-ssa-structalias.c (may_alias_p): Remove.
(set_uids_in_ptset): Take the alias set to prune with as
parameter. Fold in the alias test of may_alias_p.
(compute_points_to_sets): Compute whether a ptr is dereferenced
in a local sbitmap.
(process_constraint): Deal with &ANYTHING on the lhs, reject all
other ADDRESSOF constraints on the lhs.
(get_constraint_for_component_ref): Assert that we don't get
ADDRESSOF constraints from the base of the reference.
Properly generate UNKNOWN_OFFSET for DEREF if needed.
(struct variable_info): Remove collapsed_to member.
(get_varinfo_fc): Remove.
(new_var_info): Do not set collapsed_to.
(dump_constraint): Do not follow cycles.
(dump_constraint_graph): Likewise.
(build_pred_graph): Likewise.
(build_succ_graph): Likewise.
(rewrite_constraints): Likewise.
(do_simple_structure_copy): Remove.
(do_rhs_deref_structure_copy): Remove.
(do_lhs_deref_structure_copy): Remove.
(collapse_rest_of_var): Remove.
(do_structure_copy): Re-implement.
(pta_stats): New global variable.
(dump_pta_stats): New function.
(struct constraint_expr): Make offset signed.
(UNKNOWN_OFFSET): Define special value.
(dump_constraint): Dump UNKNOWN_OFFSET as UNKNOWN.
(solution_set_expand): New helper function split out from ...
(do_sd_constraint): ... here.
(solution_set_add): Handle UNKNOWN_OFFSET. Handle negative offsets.
(do_ds_constraint): Likewise.
(do_sd_constraint): Likewise. Do not special-case ESCAPED = *ESCAPED
and CALLUSED = *CALLUSED.
(set_union_with_increment): Make inc argument signed.
(type_safe): Remove.
(get_constraint_for_ptr_offset): Handle unknown and negative
constant offsets.
(first_vi_for_offset): Handle offsets before start. Bail
out early for offsets beyond the variable extent.
(first_or_preceding_vi_for_offset): New function.
(init_base_vars): Add ESCAPED = ESCAPED + UNKNOWN_OFFSET constraint.
Together with ESCAPED = *ESCAPED this properly computes reachability.
(find_what_var_points_to): New function.
(find_what_p_points_to): Implement in terms of find_what_var_points_to.
(pt_solution_reset, pt_solution_empty_p, pt_solution_includes_global,
pt_solution_includes_1, pt_solution_includes, pt_solutions_intersect_1,
pt_solutions_intersect): New functions.
(compute_call_used_vars): Remove.
(compute_may_aliases): New main entry into PTA computation.
* gimple.h (gimple_p): New typedef.
(struct gimple_statement_base): Remove references_memory_p.
(struct gimple_statement_with_memory_ops_base): Remove
vdef_ops, vuse_ops, stores and loads members. Add vdef and vuse
members.
(gimple_vuse_ops, gimple_set_vuse_ops, gimple_vdef_ops,
gimple_set_vdef_ops, gimple_loaded_syms, gimple_stored_syms,
gimple_set_references_memory): Remove.
(gimple_vuse_op, gimple_vdef_op, gimple_vuse, gimple_vdef,
gimple_vuse_ptr, gimple_vdef_ptr, gimple_set_vuse, gimple_set_vdef):
New functions.
* tree-cfg.c (move_block_to_fn): Fix off-by-one error.
(verify_expr): Allow RESULT_DECL.
(gimple_duplicate_bb): Do not copy virtual operands.
(gimple_duplicate_sese_region): Adjust.
(gimple_duplicate_sese_tail): Likewise.
(mark_virtual_ops_in_region): Remove.
(move_sese_region_to_fn): Do not call it.
* passes.c (init_optimization_passes): Remove pass_reset_cc_flags
and pass_simple_dse.
(execute_function_todo): Handle TODO_update_address_taken,
call execute_update_addresses_taken for TODO_rebuild_alias.
(execute_todo): Adjust.
(execute_one_pass): Init dump files early.
* ipa-struct-reorg.c (finalize_var_creation): Do not mark vars
call-clobbered.
(create_general_new_stmt): Clear vops.
* tree-ssa-reassoc.c (get_rank): Adjust.
* tree-vect-slp.c (vect_create_mask_and_perm): Do not mark
symbols for renaming.
* params.def (PARAM_MAX_ALIASED_VOPS): Remove.
(PARAM_AVG_ALIASED_VOPS): Likewise.
* tree-ssanames.c (init_ssanames): Allocate SYMS_TO_RENAME.
(duplicate_ssa_name_ptr_info): No need to copy the shared bitmaps.
* tree-ssa-operands.c: Simplify for new virtual operand
representation.
(operand_build_cmp, copy_virtual_operands,
create_ssa_artificial_load_stmt, add_to_addressable_set,
gimple_add_to_addresses_taken): Remove public functions.
(unlink_stmt_vdef): New function.
* gcc.dg/pr19633-1.c: Adjust.
* gcc.dg/torture/pta-callused-1.c: Likewise.
* gcc.dg/torture/pr39074-2.c: Likewise.
* gcc.dg/torture/pr39074.c: Likewise.
* gcc.dg/torture/pta-ptrarith-3.c: New testcase.
* gcc.dg/torture/pr30375.c: Adjust.
* gcc.dg/torture/pr33563.c: Likewise.
* gcc.dg/torture/pr33870.c: Likewise.
* gcc.dg/torture/pr33560.c: Likewise.
* gcc.dg/torture/pta-structcopy-1.c: New testcase.
* gcc.dg/torture/ssa-pta-fn-1.c: Likewise.
* gcc.dg/tree-ssa/alias-15.c: Remove.
* gcc.dg/tree-ssa/ssa-dce-4.c: New testcase.
* gcc.dg/tree-ssa/pr26421.c: Adjust.
* gcc.dg/tree-ssa/ssa-fre-10.c: XFAIL.
* gcc.dg/tree-ssa/ssa-dce-5.c: New testcase.
* gcc.dg/tree-ssa/pr23382.c: Adjust.
* gcc.dg/tree-ssa/ssa-fre-20.c: New testcase.
* gcc.dg/tree-ssa/alias-16.c: Adjust.
* gcc.dg/tree-ssa/ssa-fre-13.c: Likewise.
* gcc.dg/tree-ssa/ssa-fre-14.c: Likewise.
* gcc.dg/tree-ssa/alias-18.c: Likewise.
* gcc.dg/tree-ssa/ssa-fre-15.c: Likewise.
* gcc.dg/tree-ssa/ssa-lim-3.c: Likewise.
* gcc.dg/tree-ssa/alias-19.c: Likewise.
* gcc.dg/tree-ssa/pta-ptrarith-1.c: New testcase.
* gcc.dg/tree-ssa/pr13146.c: Likewise.
* gcc.dg/tree-ssa/ssa-pre-23.c: Likewise.
* gcc.dg/tree-ssa/pta-ptrarith-2.c: Likewise.
* gcc.dg/tree-ssa/ssa-fre-18.c: Likewise.
* gcc.dg/tree-ssa/ssa-pre-24.c: New XFAILed testcase.
* gcc.dg/tree-ssa/ssa-fre-19.c: New testcase.
* gcc.dg/tree-ssa/alias-20.c: Likewise.
* gcc.dg/tree-ssa/ssa-dse-12.c: Likewise.
* gcc.dg/tree-ssa/pr38895.c: Likewise.
* gcc.dg/uninit-B.c: XFAIL.
* gcc.dg/vect/no-vfa-vect-43.c: Adjust.
* gcc.dg/uninit-pr19430.c: XFAIL.
* g++.dg/tree-ssa/pr13146.C: New testcase.
* g++.dg/opt/pr36187.C: Adjust.
* g++.dg/torture/20090329-1.C: New testcase.
git-svn-id: svn+ssh://gcc.gnu.org/svn/gcc/trunk@145494 138bc75d-0d04-0410-961f-82ee72b054a4
+2009-04-03 Richard Guenther <rguenther@suse.de>
+
+ PR middle-end/13146
+ PR tree-optimization/23940
+ PR tree-optimization/33237
+ PR middle-end/33974
+ PR middle-end/34093
+ PR tree-optimization/36201
+ PR tree-optimization/36230
+ PR tree-optimization/38049
+ PR tree-optimization/38207
+ PR tree-optimization/38230
+ PR tree-optimization/38301
+ PR tree-optimization/38585
+ PR middle-end/38895
+ PR tree-optimization/38985
+ PR tree-optimization/39299
+ * tree-ssa-structalias.h: Remove.
+ * tree-ssa-operands.h (NULL_USE_OPERAND_P): Make of type use_operand_p.
+ (NULL_DEF_OPERAND_P): Make of type def_operand_p.
+ (struct vuse_element_d): Remove.
+ (struct vuse_vec_d): Likewise.
+ (VUSE_VECT_NUM_ELEM, VUSE_VECT_ELEMENT_NC, VUSE_ELEMENT_PTR_NC,
+ VUSE_ELEMENT_VAR_NC, VUSE_VECT_ELEMENT, VUSE_ELEMENT_PTR,
+ SET_VUSE_VECT_ELEMENT, SET_VUSE_ELEMENT_VAR, SET_VUSE_ELEMENT_PTR,
+ VUSE_ELEMENT_VAR): Likewise.
+ (struct voptype_d): Likewise.
+ (NUM_VOP_FREE_BUCKETS): Likewise.
+ (struct ssa_operands): Remove vop_free_buckets and mpt_table fields.
+ (struct stmt_operands_d): Remove.
+ (VUSE_OP_PTR, VUSE_OP, SET_VUSE_OP, VUSE_NUM, VUSE_VECT,
+ VDEF_RESULT_PTR, VDEF_RESULT, VDEF_OP_PTR, VDEF_OP, SET_VDEF_OP,
+ VDEF_NUM, VDEF_VECT): Likewise.
+ (copy_virtual_operands): Remove.
+ (operand_build_cmp): Likewise.
+ (create_ssa_artificial_load_stmt): Likewise.
+ (enum ssa_op_iter_type): Remove ssa_op_iter_vdef.
+ (struct ssa_operand_iterator_d): Remove vuses, vdefs, mayusesm
+ vuse_index and mayuse_index members. Pack and move done and iter_type
+ members to the front.
+ (SSA_OP_VMAYUSE): Remove.
+ (SSA_OP_VIRTUAL_USES): Adjust.
+ (FOR_EACH_SSA_VDEF_OPERAND): Remove.
+ (unlink_stmt_vdef): Declare.
+ (add_to_addressable_set): Remove.
+ * tree-vrp.c (stmt_interesting_for_vrp): Adjust.
+ (vrp_visit_stmt): Likewise.
+ * doc/tree-ssa.texi (Alias analysis): Update.
+ * doc/invoke.texi (max-aliased-vops): Remove docs.
+ (avg-aliased-vops): Likewise.
+ * tree-into-ssa.c (syms_to_rename): Remove.
+ (need_to_update_vops_p): Likewise.
+ (need_to_initialize_update_ssa_p): Rename to ...
+ (update_ssa_initialized_fn): ... this. Track function we are
+ initialized for.
+ (symbol_marked_for_renaming): Simplify.
+ (add_new_name_mapping): Do not set need_to_update_vops_p.
+ (dump_currdefs): Use SYMS_TO_RENAME.
+ (rewrite_update_stmt): Always walk all uses/defs.
+ (dump_update_ssa): Adjust.
+ (init_update_ssa): Take function argument. Track what we are
+ initialized for.
+ (delete_update_ssa): Reset SYMS_TO_RENAME and update_ssa_initialized_fn.
+ (create_new_def_for): Initialize for cfun, assert we are initialized
+ for cfun.
+ (mark_sym_for_renaming): Simplify.
+ (mark_set_for_renaming): Do not initialize update-ssa.
+ (need_ssa_update_p): Simplify. Take function argument.
+ (name_mappings_registered_p): Assert we ask for the correct function.
+ (name_registered_for_update_p): Likewise.
+ (ssa_names_to_replace): Likewise.
+ (release_ssa_name_after_update_ssa): Likewise.
+ (update_ssa): Likewise. Use SYMS_TO_RENAME.
+ (dump_decl_set): Do not print a newline.
+ (debug_decl_set): Do it here.
+ (dump_update_ssa): And here.
+ * tree-ssa-loop-im.c (move_computations): Adjust.
+ (movement_possibility): Likewise.
+ (determine_max_movement): Likewise.
+ (gather_mem_refs_stmt): Likewise.
+ * tree-dump.c (dequeue_and_dump): Do not handle SYMBOL_MEMORY_TAG
+ or NAME_MEMORY_TAG.
+ * tree-complex.c (update_all_vops): Remove.
+ (expand_complex_move): Adjust.
+ * tree-ssa-loop-niter.c (chain_of_csts_start): Use NULL_TREE.
+ Simplify test for memory referencing statement. Exclude
+ non-invariant ADDR_EXPRs.
+ * tree-pretty-print.c (dump_generic_node): Do not handle memory tags.
+ * tree-loop-distribution.c (generate_memset_zero): Adjust.
+ (rdg_flag_uses): Likewise.
+ * tree-tailcall.c (suitable_for_tail_opt_p): Remove memory-tag
+ related code.
+ (tree_optimize_tail_calls_1): Also split the
+ edge from the entry block if we have degenerate PHI nodes in
+ the first basic block.
+ * tree.c (init_ttree): Remove memory-tag related code.
+ (tree_code_size): Likewise.
+ (tree_node_structure): Likewise.
+ (build7_stat): Re-write to be build6_stat.
+ * tree.h (MTAG_P, TREE_MEMORY_TAG_CHECK, TMR_TAG): Remove.
+ (SSA_VAR_P): Adjust.
+ (struct tree_memory_tag): Remove.
+ (struct tree_memory_partition_tag): Likewise.
+ (union tree_node): Adjust.
+ (build7): Re-write to be build6.
+ * tree-pass.h (pass_reset_cc_flags): Remove.
+ (TODO_update_address_taken): New flag.
+ (pass_simple_dse): Remove.
+ * ipa-cp.c (ipcp_update_callgraph): Update SSA form.
+ * params.h (MAX_ALIASED_VOPS): Remove.
+ (AVG_ALIASED_VOPS): Likewise.
+ * omp-low.c (expand_omp_taskreg): Update SSA form.
+ * tree-ssa-dse.c (dse_optimize_stmt): Properly query if the rhs
+ aliases the lhs in a copy stmt.
+ * tree-ssa-dse.c (struct address_walk_data): Remove.
+ (memory_ssa_name_same): Likewise.
+ (memory_address_same): Likewise.
+ (get_kill_of_stmt_lhs): Likewise.
+ (dse_possible_dead_store_p): Simplify, use the oracle. Handle
+ unused stores. Look through PHI nodes into post-dominated regions.
+ (dse_optimize_stmt): Simplify. Properly remove stores.
+ (tree_ssa_dse): Compute dominators.
+ (execute_simple_dse): Remove.
+ (pass_simple_dse): Likewise.
+ * ipa-reference.c (scan_stmt_for_static_refs): Open-code
+ gimple_loaded_syms and gimple_stored_syms computation.
+ * toplev.c (dump_memory_report): Dump alias and pta stats.
+ * tree-ssa-sccvn.c (vn_reference_compute_hash): Simplify.
+ (vn_reference_eq): Likewise.
+ (vuses_to_vec, copy_vuses_from_stmt, vdefs_to_vec,
+ copy_vdefs_from_stmt, shared_lookup_vops, shared_vuses_from_stmt,
+ valueize_vuses): Remove.
+ (get_def_ref_stmt_vuses): Simplify. Rename to ...
+ (get_def_ref_stmt_vuse): ... this.
+ (vn_reference_lookup_2): New function.
+ (vn_reference_lookup_pieces): Use walk_non_aliased_vuses for
+ walking equivalent vuses. Simplify.
+ (vn_reference_lookup): Likewise.
+ (vn_reference_insert): Likewise.
+ (vn_reference_insert_pieces): Likewise.
+ (visit_reference_op_call): Simplify.
+ (visit_reference_op_load): Likewise.
+ (visit_reference_op_store): Likewise.
+ (init_scc_vn): Remove shared_lookup_vuses initialization.
+ (free_scc_vn): Remove shared_lookup_vuses freeing.
+ (sort_vuses, sort_vuses_heap): Remove.
+ (get_ref_from_reference_ops): Export.
+ * tree-ssa-sccvn.h (struct vn_reference_s): Replace vuses
+ vector with single vuse pointer.
+ (vn_reference_lookup_pieces, vn_reference_lookup,
+ vn_reference_insert, vn_reference_insert_pieces): Adjust prototypes.
+ (shared_vuses_from_stmt): Remove.
+ (get_ref_from_reference_ops): Declare.
+ * tree-ssa-loop-manip.c (slpeel_can_duplicate_loop_p): Adjust.
+ * tree-ssa-copyrename.c (copy_rename_partition_coalesce): Remove
+ memory-tag related code.
+ * tree-ssa-ccp.c (get_symbol_constant_value): Remove memory-tag code.
+ (likely_value): Add comment, skip static-chain of call statements.
+ (surely_varying_stmt_p): Adjust.
+ (gimplify_and_update_call_from_tree): Likewise.
+ (execute_fold_all_builtins): Do not rebuild alias info.
+ (gimplify_and_update_call_from_tree): Properly update VOPs.
+ * tree-ssa-loop-ivopts.c (get_ref_tag): Remove.
+ (copy_ref_info): Remove memory-tag related code.
+ * tree-call-cdce.c (tree_call_cdce): Rename the VOP.
+ * ipa-pure-const.c (check_decl): Remove memory-tag related code.
+ (check_stmt): Open-code gimple_loaded_syms and gimple_stored_syms
+ computation.
+ * tree-ssa-dom.c (gimple_p): Remove typedef.
+ (eliminate_redundant_computations): Adjust.
+ (record_equivalences_from_stmt): Likewise.
+ (avail_expr_hash): Likewise.
+ (avail_expr_eq): Likewise.
+ * tree-ssa-propagate.c (update_call_from_tree): Properly
+ update VOPs.
+ (stmt_makes_single_load): Likewise.
+ (stmt_makes_single_store): Likewise.
+ * tree-ssa-alias.c: Rewrite completely.
+ (debug_memory_partitions, dump_mem_ref_stats, debug_mem_ref_stats,
+ debug_mem_sym_stats, dump_mem_sym_stats_for_var,
+ debug_all_mem_sym_stats, debug_mp_info, update_mem_sym_stats_from_stmt,
+ delete_mem_ref_stats, create_tag_raw, dump_points_to_info,
+ dump_may_aliases_for, debug_may_aliases_for, new_type_alias):
+ Remove public functions.
+ (pass_reset_cc_flags): Remove.
+ (pass_build_alias): Move ...
+ * tree-ssa-structalias.c (pass_build_alias): ... here.
+ * tree-ssa-alias.c (may_be_aliased): Move ...
+ * tree-flow-inline.h (may_be_aliased): ... here.
+ * tree-ssa-alias.c (struct count_ptr_d, count_ptr_derefs,
+ count_uses_and_derefs): Move ...
+ * gimple.c: ... here.
+ * gimple.h (count_uses_and_derefs): Declare.
+ * tree-ssa-alias.c (dump_alias_stats, ptr_deref_may_alias_global_p,
+ ptr_deref_may_alias_decl_p, ptr_derefs_may_alias_p,
+ same_type_for_tbaa, nonaliasing_component_refs_p, decl_refs_may_alias_p,
+ indirect_ref_may_alias_decl_p, indirect_refs_may_alias_p,
+ ref_maybe_used_by_call_p, ref_maybe_used_by_stmt_p,
+ call_may_clobber_ref_p, stmt_may_clobber_ref_p, maybe_skip_until,
+ get_continuation_for_phi, walk_non_aliased_vuses, walk_aliased_vdefs):
+ New functions.
+ * tree-dfa.c (refs_may_alias_p): Move ...
+ * tree-ssa-alias.c (refs_may_alias_p): ... here. Extend.
+ * tree-ssa-alias.h: New file.
+ * tree-ssa-sink.c (is_hidden_global_store): Adjust.
+ (statement_sink_location): Likewise.
+ * opts.c (decode_options): Do not adjust max-aliased-vops or
+ avg-aliased-vops values.
+ * timevar.def (TV_TREE_MAY_ALIAS): Remove.
+ (TV_CALL_CLOBBER): Likewise.
+ (TV_FLOW_SENSITIVE): Likewise.
+ (TV_FLOW_INSENSITIVE): Likewise.
+ (TV_MEMORY_PARTITIONING): Likewise.
+ (TV_ALIAS_STMT_WALK): New timevar.
+ * tree-ssa-loop-ivcanon.c (empty_loop_p): Adjust.
+ * tree-ssa-address.c (create_mem_ref_raw): Use build6.
+ (get_address_description): Remove memory-tag related code.
+ * tree-ssa-ifcombine.c (bb_no_side_effects_p): Adjust.
+ * treestruct.def (TS_MEMORY_TAG, TS_MEMORY_PARTITION_TAG): Remove.
+ * tree-eh.c (cleanup_empty_eh): Do not leave stale SSA_NAMEs
+ and immediate uses in statements. Document.
+ * gimple-pretty-print.c (dump_gimple_mem_ops): Adjust.
+ (dump_symbols): Remove.
+ (dump_gimple_mem_ops): Do not dump loaded or stored syms.
+ * alias.c (get_deref_alias_set): New function split out from ...
+ (get_alias_set): ... here.
+ * alias.h (get_deref_alias_set): Declare.
+ * tree-vect-data-refs.c (vect_create_data_ref_ptr): Remove unused
+ type parameter. Remove restrict pointer handling. Create a
+ ref-all pointer in case type-based alias sets do not conflict.
+ (vect_analyze_data_refs): Remove SMT related code.
+ * tree-vect-stmts.c (vectorizable_store): Re-instantiate TBAA assert.
+ (vectorizable_load): Likewise.
+ * tree-data-ref.h (struct dr_alias): Remove symbol_tag field.
+ (DR_SYMBOL_TAG, DR_VOPS): Remove.
+ * tree-data-ref.c (dr_may_alias_p): Use the alias-oracle.
+ Ignore vops and SMTs.
+ (dr_analyze_alias): Likewise.
+ (free_data_ref): Likewise.
+ (create_data_ref): Likewise.
+ (analyze_all_data_dependences): Likewise.
+ (get_references_in_stmt): Adjust.
+ * tree-flow-inline.h (gimple_aliases_computed_p,
+ gimple_addressable_vars, gimple_call_clobbered_vars,
+ gimple_call_used_vars, gimple_global_var, may_aliases, memory_partition,
+ factoring_name_p, mark_call_clobbered, clear_call_clobbered,
+ compare_ssa_operands_equal, symbol_mem_tag, set_symbol_mem_tag,
+ gimple_mem_ref_stats): Remove.
+ (gimple_vop): New function.
+ (op_iter_next_use): Remove vuses and mayuses cases.
+ (op_iter_next_def): Remove vdefs case.
+ (op_iter_next_tree): Remove vuses, mayuses and vdefs cases.
+ (clear_and_done_ssa_iter): Do not set removed fields.
+ (op_iter_init): Likewise. Skip vuse and/or vdef if requested.
+ Assert we are not iterating over vuses or vdefs if not also
+ iterating over uses or defs.
+ (op_iter_init_use): Likewise.
+ (op_iter_init_def): Likewise.
+ (op_iter_next_vdef): Remove.
+ (op_iter_next_mustdef): Likewise.
+ (op_iter_init_vdef): Likewise.
+ (compare_ssa_operands_equal): Likewise.
+ (link_use_stmts_after): Handle vuse operand.
+ (is_call_used): Use is_call_clobbered.
+ (is_call_clobbered): Global variables are always call clobbered;
+ otherwise query the call-clobbers bitmap.
+ (mark_call_clobbered): Ignore global variables.
+ (clear_call_clobbered): Likewise.
+ * tree-ssa-coalesce.c (create_outofssa_var_map): Adjust
+ virtual operands sanity check.
+ * tree.def (NAME_MEMORY_TAG, SYMBOL_MEMORY_TAG, MEMORY_PARTITION_TAG):
+ Remove.
+ (TARGET_MEM_REF): Remove TMR_TAG operand.
+ * tree-dfa.c (add_referenced_var): Initialize call-clobber state.
+ Remove call-clobber related code.
+ (remove_referenced_var): Likewise. Do not clear mpt or symbol_mem_tag.
+ (dump_variable): Do not dump SMTs, memory stats, may-aliases or
+ partitions or escape reason.
+ (get_single_def_stmt, get_single_def_stmt_from_phi,
+ get_single_def_stmt_with_phi): Remove.
+ (dump_referenced_vars): Tidy.
+ (get_ref_base_and_extent): Allow bare decls.
+ (collect_dfa_stats): Adjust.
+ * graphite.c (rename_variables_in_stmt): Adjust.
+ (graphite_copy_stmts_from_block): Likewise.
+ (translate_clast): Likewise.
+ * tree-ssa-pre.c (struct bb_bitmap_sets): Add expr_dies bitmap.
+ (EXPR_DIES): New.
+ (translate_vuse_through_block): Use the oracle.
+ (phi_translate_1): Adjust.
+ (value_dies_in_block_x): Use the oracle. Cache the outcome
+ in EXPR_DIES.
+ (valid_in_sets): Check if the VUSE for
+ a REFERENCE is available.
+ (eliminate): Do not remove stmts during elimination,
+ instead queue and remove them afterwards.
+ (do_pre): Do not rebuild alias info.
+ (pass_pre): Run TODO_rebuild_alias before PRE.
+ * tree-ssa-live.c (remove_unused_locals): Remove memory-tag code.
+ * tree-sra.c (sra_walk_function): Use gimple_references_memory_p.
+ (mark_all_v_defs_stmt): Remove.
+ (mark_all_v_defs_seq): Adjust.
+ (sra_replace): Likewise.
+ (scalarize_use): Likewise.
+ (scalarize_copy): Likewise.
+ (scalarize_init): Likewise.
+ (scalarize_ldst): Likewise.
+ (todoflags): Remove.
+ (tree_sra): Do not rebuild alias info.
+ (tree_sra_early): Adjust.
+ (pass_sra): Run TODO_update_address_taken before SRA.
+ * tree-predcom.c (set_alias_info): Remove.
+ (prepare_initializers_chain): Do not call it.
+ (mark_virtual_ops_for_renaming): Adjust.
+ (mark_virtual_ops_for_renaming_list): Remove.
+ (initialize_root_vars): Adjust.
+ (initialize_root_vars_lm): Likewise.
+ (prepare_initializers_chain): Likewise.
+ * tree-ssa-copy.c (may_propagate_copy): Remove memory-tag related code.
+ (may_propagate_copy_into_stmt): Likewise.
+ (merge_alias_info): Do nothing for now.
+ (propagate_tree_value_into_stmt): Adjust.
+ (stmt_may_generate_copy): Likewise.
+ * tree-ssa-forwprop.c (tidy_after_forward_propagate_addr): Do
+ not mark symbols for renaming.
+ (forward_propagate_addr_expr): Match up push/pop_stmt_changes
+ with the same statement, make sure to update the new pointed-to one.
+ * tree-ssa-dce.c (eliminate_unnecessary_stmts): Do not copy
+ call statements, do not mark symbols for renaming.
+ (mark_operand_necessary): Dump something.
+ (ref_may_be_aliased): New function.
+ (mark_aliased_reaching_defs_necessary_1): New helper function.
+ (mark_aliased_reaching_defs_necessary): Likewise.
+ (mark_all_reaching_defs_necessary_1): Likewise.
+ (mark_all_reaching_defs_necessary): Likewise.
+ (propagate_necessity): Do not process virtual PHIs. For
+ non-aliased loads mark all reaching definitions as necessary.
+ For aliased loads and stores mark the immediate dominating
+ aliased clobbers as necessary.
+ (visited): New global static.
+ (perform_tree_ssa_dce): Free visited bitmap after propagating
+ necessity.
+ (remove_dead_phis): Perform simple dead virtual PHI removal.
+ (remove_dead_stmt): Properly unlink virtual operands when
+ removing stores.
+ (eliminate_unnecessary_stmts): Schedule PHI removal after
+ stmt removal.
+ * tree-ssa-ter.c (is_replaceable_p): Adjust.
+ (process_replaceable): Likewise.
+ (find_replaceable_in_bb): Likewise.
+ * tree-ssa.c (verify_ssa_name): Verify all VOPs are
+ based on the single gimple vop.
+ (verify_flow_insensitive_alias_info): Remove.
+ (verify_flow_sensitive_alias_info): Likewise.
+ (verify_call_clobbering): Likewise.
+ (verify_memory_partitions): Likewise.
+ (verify_alias_info): Likewise.
+ (verify_ssa): Adjust.
+ (execute_update_addresses_taken): Export. Update SSA
+ manually. Optimize only when optimizing. Use a local bitmap.
+ (pass_update_address_taken): Remove TODO_update_ssa, add
+ TODO_dump_func.
+ (pass_update_address_taken): Just use TODO_update_address_taken.
+ (init_tree_ssa): Do not initialize addressable_vars.
+ (verify_ssa): Verify new VUSE / VDEF properties.
+ Verify that all stmts definitions have the stmt as SSA_NAME_DEF_STMT.
+ Do not call verify_alias_info.
+ (delete_tree_ssa): Clear the VUSE, VDEF operands.
+ Do not free the loaded and stored syms bitmaps. Reset the escaped
+ and callused solutions. Do not free addressable_vars.
+ Remove memory-tag related code.
+ (warn_uninitialized_var): Aliases are always available.
+ * tree-ssa-loop-prefetch.c (gather_memory_references): Adjust.
+ * lambda-code.c (can_put_in_inner_loop): Adjust.
+ (can_put_after_inner_loop): Likewise.
+ (perfect_nestify): Likewise.
+ * tree-vect-stmts.c (vect_stmt_relevant_p): Adjust.
+ (vect_gen_widened_results_half): Remove CALL_EXPR handling.
+ (vectorizable_conversion): Do not mark symbols for renaming.
+ * tree-inline.c (remap_gimple_stmt): Clear VUSE/VDEF.
+ (expand_call_inline): Unlink the call's virtual operands before
+ replacing it.
+ (tree_function_versioning): Do not call update_ssa if we are not
+ updating clones. Simplify.
+ * tree-ssa-phiprop.c (phivn_valid_p): Adjust.
+ (propagate_with_phi): Likewise.
+ * tree-outof-ssa.c (create_temp): Remove memory tag and call
+ clobber code. Assert we are not aliased or global.
+ * tree-flow.h: Include tree-ssa-alias.h.
+ (enum escape_type): Remove.
+ (struct mem_sym_stats_d): Likewise.
+ (struct mem_ref_stats_d): Likewise.
+ (struct gimple_df): Add vop member. Remove global_var,
+ call_clobbered_vars, call_used_vars, addressable_vars,
+ aliases_computed_p and mem_ref_stats members. Add syms_to_rename,
+ escaped and callused members.
+ (struct ptr_info_def): Remove all members, add points-to solution
+ member pt.
+ (struct var_ann_d): Remove in_vuse_list, in_vdef_list,
+ call_clobbered, escape_mask, mpt and symbol_mem_tag members.
+ * Makefile.in (TREE_FLOW_H): Add tree-ssa-alias.h.
+ (tree-ssa-structalias.o): Remove tree-ssa-structalias.h.
+ (tree-ssa-alias.o): Likewise.
+ (toplev.o): Add tree-ssa-alias.h.
+ (GTFILES): Remove tree-ssa-structalias.h, add tree-ssa-alias.h.
+ * gimple.c (gimple_set_bb): Fix off-by-one error.
+ (is_gimple_reg): Do not handle memory tags.
+ (gimple_copy): Also copy virtual operands.
+ Delay updating the statement. Do not reset loaded and stored syms.
+ (gimple_set_stored_syms): Remove.
+ (gimple_set_loaded_syms): Likewise.
+ (gimple_call_copy_skip_args): Copy the virtual operands
+ and mark the new statement modified.
+ * tree-ssa-structalias.c (may_alias_p): Remove.
+ (set_uids_in_ptset): Take the alias set to prune with as
+ parameter. Fold in the alias test of may_alias_p.
+ (compute_points_to_sets): Compute whether a ptr is dereferenced
+ in a local sbitmap.
+ (process_constraint): Deal with &ANYTHING on the lhs, reject all
+ other ADDRESSOF constraints on the lhs.
+ (get_constraint_for_component_ref): Assert that we don't get
+ ADDRESSOF constraints from the base of the reference.
+ Properly generate UNKNOWN_OFFSET for DEREF if needed.
+ (struct variable_info): Remove collapsed_to member.
+ (get_varinfo_fc): Remove.
+ (new_var_info): Do not set collapsed_to.
+ (dump_constraint): Do not follow cycles.
+ (dump_constraint_graph): Likewise.
+ (build_pred_graph): Likewise.
+ (build_succ_graph): Likewise.
+ (rewrite_constraints): Likewise.
+ (do_simple_structure_copy): Remove.
+ (do_rhs_deref_structure_copy): Remove.
+ (do_lhs_deref_structure_copy): Remove.
+ (collapse_rest_of_var): Remove.
+ (do_structure_copy): Re-implement.
+ (pta_stats): New global variable.
+ (dump_pta_stats): New function.
+ (struct constraint_expr): Make offset signed.
+ (UNKNOWN_OFFSET): Define special value.
+ (dump_constraint): Dump UNKNOWN_OFFSET as UNKNOWN.
+ (solution_set_expand): New helper function split out from ...
+ (do_sd_constraint): ... here.
+ (solution_set_add): Handle UNKNOWN_OFFSET. Handle negative offsets.
+ (do_ds_constraint): Likewise.
+ (do_sd_constraint): Likewise. Do not special-case ESCAPED = *ESCAPED
+ and CALLUSED = *CALLUSED.
+ (set_union_with_increment): Make inc argument signed.
+ (type_safe): Remove.
+ (get_constraint_for_ptr_offset): Handle unknown and negative
+ constant offsets.
+ (first_vi_for_offset): Handle offsets before start. Bail
+ out early for offsets beyond the variable extent.
+ (first_or_preceding_vi_for_offset): New function.
+ (init_base_vars): Add ESCAPED = ESCAPED + UNKNOWN_OFFSET constraint.
+ Together with ESCAPED = *ESCAPED this properly computes reachability.
+ (find_what_var_points_to): New function.
+ (find_what_p_points_to): Implement in terms of find_what_var_points_to.
+ (pt_solution_reset, pt_solution_empty_p, pt_solution_includes_global,
+ pt_solution_includes_1, pt_solution_includes, pt_solutions_intersect_1,
+ pt_solutions_intersect): New functions.
+ (compute_call_used_vars): Remove.
+ (compute_may_aliases): New main entry into PTA computation.
+ * gimple.h (gimple_p): New typedef.
+ (struct gimple_statement_base): Remove references_memory_p.
+ (struct gimple_statement_with_memory_ops_base): Remove
+ vdef_ops, vuse_ops, stores and loads members. Add vdef and vuse
+ members.
+ (gimple_vuse_ops, gimple_set_vuse_ops, gimple_vdef_ops,
+ gimple_set_vdef_ops, gimple_loaded_syms, gimple_stored_syms,
+ gimple_set_references_memory): Remove.
+ (gimple_vuse_op, gimple_vdef_op, gimple_vuse, gimple_vdef,
+ gimple_vuse_ptr, gimple_vdef_ptr, gimple_set_vuse, gimple_set_vdef):
+ New functions.
+ * tree-cfg.c (move_block_to_fn): Fix off-by-one error.
+ (verify_expr): Allow RESULT_DECL.
+ (gimple_duplicate_bb): Do not copy virtual operands.
+ (gimple_duplicate_sese_region): Adjust.
+ (gimple_duplicate_sese_tail): Likewise.
+ (mark_virtual_ops_in_region): Remove.
+ (move_sese_region_to_fn): Do not call it.
+ * passes.c (init_optimization_passes): Remove pass_reset_cc_flags
+ and pass_simple_dse.
+ (execute_function_todo): Handle TODO_update_address_taken,
+ call execute_update_addresses_taken for TODO_rebuild_alias.
+ (execute_todo): Adjust.
+ (execute_one_pass): Init dump files early.
+ * ipa-struct-reorg.c (finalize_var_creation): Do not mark vars
+ call-clobbered.
+ (create_general_new_stmt): Clear vops.
+ * tree-ssa-reassoc.c (get_rank): Adjust.
+ * tree-vect-slp.c (vect_create_mask_and_perm): Do not mark
+ symbols for renaming.
+ * params.def (PARAM_MAX_ALIASED_VOPS): Remove.
+ (PARAM_AVG_ALIASED_VOPS): Likewise.
+ * tree-ssanames.c (init_ssanames): Allocate SYMS_TO_RENAME.
+ (duplicate_ssa_name_ptr_info): No need to copy the shared bitmaps.
+ * tree-ssa-operands.c: Simplify for new virtual operand
+ representation.
+ (operand_build_cmp, copy_virtual_operands,
+ create_ssa_artificial_load_stmt, add_to_addressable_set,
+ gimple_add_to_addresses_taken): Remove public functions.
+ (unlink_stmt_vdef): New function.
+
2009-04-03 Alan Modra <amodra@bigpond.net.au>
* config.gcc (powerpc-*-linux*): Merge variants.
TREE_DUMP_H = tree-dump.h $(SPLAY_TREE_H) tree-pass.h
TREE_FLOW_H = tree-flow.h tree-flow-inline.h tree-ssa-operands.h \
$(BITMAP_H) $(BASIC_BLOCK_H) hard-reg-set.h $(GIMPLE_H) \
- $(HASHTAB_H) $(CGRAPH_H) $(IPA_REFERENCE_H)
+ $(HASHTAB_H) $(CGRAPH_H) $(IPA_REFERENCE_H) \
+ tree-ssa-alias.h
TREE_SSA_LIVE_H = tree-ssa-live.h $(PARTITION_H) vecprim.h
PRETTY_PRINT_H = pretty-print.h $(INPUT_H) $(OBSTACK_H)
DIAGNOSTIC_H = diagnostic.h diagnostic.def $(PRETTY_PRINT_H) options.h
$(TREE_H) $(PARAMS_H) $(FLAGS_H) $(FUNCTION_H) $(EXPR_H) output.h $(RTL_H) \
$(GGC_H) $(TM_P_H) $(TARGET_H) langhooks.h $(REGS_H) gt-stor-layout.h \
$(TOPLEV_H)
-tree-ssa-structalias.o: tree-ssa-structalias.c tree-ssa-structalias.h \
+tree-ssa-structalias.o: tree-ssa-structalias.c \
$(SYSTEM_H) $(CONFIG_H) coretypes.h $(TM_H) $(GGC_H) $(OBSTACK_H) $(BITMAP_H) \
$(FLAGS_H) $(RTL_H) $(TM_P_H) hard-reg-set.h $(BASIC_BLOCK_H) output.h \
$(DIAGNOSTIC_H) $(TREE_H) $(C_COMMON_H) $(TREE_FLOW_H) $(TREE_INLINE_H) varray.h \
$(RTL_H) $(TREE_H) $(TM_P_H) $(EXPR_H) $(GGC_H) $(TREE_INLINE_H) $(FLAGS_H) \
$(FUNCTION_H) $(TIMEVAR_H) convert.h $(TM_H) coretypes.h langhooks.h \
$(TREE_DUMP_H) tree-pass.h $(PARAMS_H) $(BASIC_BLOCK_H) $(DIAGNOSTIC_H) \
- hard-reg-set.h $(GIMPLE_H) vec.h tree-ssa-structalias.h \
+ hard-reg-set.h $(GIMPLE_H) vec.h \
$(IPA_TYPE_ESCAPE_H) vecprim.h pointer-set.h alloc-pool.h
tree-ssa-reassoc.o : tree-ssa-reassoc.c $(TREE_FLOW_H) $(CONFIG_H) \
$(SYSTEM_H) $(TREE_H) $(GGC_H) $(DIAGNOSTIC_H) $(TIMEVAR_H) \
value-prof.h $(PARAMS_H) $(TM_P_H) reload.h ira.h dwarf2asm.h $(TARGET_H) \
langhooks.h insn-flags.h $(CFGLAYOUT_H) $(CFGLOOP_H) hosthooks.h \
$(CGRAPH_H) $(COVERAGE_H) alloc-pool.h $(GGC_H) $(INTEGRATE_H) \
- opts.h params.def tree-mudflap.h $(REAL_H) tree-pass.h $(GIMPLE_H)
+ opts.h params.def tree-mudflap.h $(REAL_H) tree-pass.h $(GIMPLE_H) \
+ tree-ssa-alias.h
$(CC) $(ALL_CFLAGS) $(ALL_CPPFLAGS) \
-DTARGET_NAME=\"$(target_noncanonical)\" \
-c $(srcdir)/toplev.c $(OUTPUT_OPTION)
$(srcdir)/targhooks.c $(out_file) $(srcdir)/passes.c $(srcdir)/cgraphunit.c \
$(srcdir)/tree-ssa-propagate.c \
$(srcdir)/tree-phinodes.c \
- $(srcdir)/ipa-reference.c $(srcdir)/tree-ssa-structalias.h \
+ $(srcdir)/ipa-reference.c \
$(srcdir)/tree-ssa-structalias.c $(srcdir)/tree-inline.c \
+ $(srcdir)/tree-ssa-alias.h \
@all_gtfiles@
# Compute the list of GT header files from the corresponding C sources,
}
}
+/* Return the alias set for the memory pointed to by T, which may be
+ either a type or an expression. Return -1 if there is nothing
+ special about dereferencing T. */
+
+static alias_set_type
+get_deref_alias_set_1 (tree t)
+{
+ /* If we're not doing any alias analysis, just assume everything
+ aliases everything else. */
+ if (!flag_strict_aliasing)
+ return 0;
+
+ if (! TYPE_P (t))
+ {
+ tree decl = find_base_decl (t);
+
+ if (decl && DECL_POINTER_ALIAS_SET_KNOWN_P (decl))
+ {
+ /* If we haven't computed the actual alias set, do it now. */
+ if (DECL_POINTER_ALIAS_SET (decl) == -2)
+ {
+ tree pointed_to_type = TREE_TYPE (TREE_TYPE (decl));
+
+ /* No two restricted pointers can point at the same thing.
+ However, a restricted pointer can point at the same thing
+ as an unrestricted pointer, if that unrestricted pointer
+ is based on the restricted pointer. So, we make the
+ alias set for the restricted pointer a subset of the
+ alias set for the type pointed to by the type of the
+ decl. */
+ alias_set_type pointed_to_alias_set
+ = get_alias_set (pointed_to_type);
+
+ if (pointed_to_alias_set == 0)
+ /* It's not legal to make a subset of alias set zero. */
+ DECL_POINTER_ALIAS_SET (decl) = 0;
+ else if (AGGREGATE_TYPE_P (pointed_to_type))
+ /* For an aggregate, we must treat the restricted
+ pointer the same as an ordinary pointer. If we
+ were to make the type pointed to by the
+ restricted pointer a subset of the pointed-to
+ type, then we would believe that other subsets
+ of the pointed-to type (such as fields of that
+ type) do not conflict with the type pointed to
+ by the restricted pointer. */
+ DECL_POINTER_ALIAS_SET (decl)
+ = pointed_to_alias_set;
+ else
+ {
+ DECL_POINTER_ALIAS_SET (decl) = new_alias_set ();
+ record_alias_subset (pointed_to_alias_set,
+ DECL_POINTER_ALIAS_SET (decl));
+ }
+ }
+
+ /* We use the alias set indicated in the declaration. */
+ return DECL_POINTER_ALIAS_SET (decl);
+ }
+
+ /* Now all we care about is the type. */
+ t = TREE_TYPE (t);
+ }
+
+ /* If we have an INDIRECT_REF via a void pointer, we don't
+ know anything about what that might alias. Likewise if the
+ pointer is marked that way. */
+ if (TREE_CODE (TREE_TYPE (t)) == VOID_TYPE
+ || TYPE_REF_CAN_ALIAS_ALL (t))
+ return 0;
+
+ return -1;
+}
+
+/* Return the alias set for the memory pointed to by T, which may be
+ either a type or an expression. */
+
+alias_set_type
+get_deref_alias_set (tree t)
+{
+ alias_set_type set = get_deref_alias_set_1 (t);
+
+ /* Fall back to the alias-set of the pointed-to type. */
+ if (set == -1)
+ {
+ if (! TYPE_P (t))
+ t = TREE_TYPE (t);
+ set = get_alias_set (TREE_TYPE (t));
+ }
+
+ return set;
+}
+
/* Return the alias set for T, which may be either a type or an
expression. Call language-specific routine for help, if needed. */
STRIP_NOPS (inner);
}
- /* Check for accesses through restrict-qualified pointers. */
if (INDIRECT_REF_P (inner))
{
- tree decl;
-
- if (TREE_CODE (TREE_OPERAND (inner, 0)) == SSA_NAME)
- decl = SSA_NAME_VAR (TREE_OPERAND (inner, 0));
- else
- decl = find_base_decl (TREE_OPERAND (inner, 0));
-
- if (decl && DECL_POINTER_ALIAS_SET_KNOWN_P (decl))
- {
- /* If we haven't computed the actual alias set, do it now. */
- if (DECL_POINTER_ALIAS_SET (decl) == -2)
- {
- tree pointed_to_type = TREE_TYPE (TREE_TYPE (decl));
-
- /* No two restricted pointers can point at the same thing.
- However, a restricted pointer can point at the same thing
- as an unrestricted pointer, if that unrestricted pointer
- is based on the restricted pointer. So, we make the
- alias set for the restricted pointer a subset of the
- alias set for the type pointed to by the type of the
- decl. */
- alias_set_type pointed_to_alias_set
- = get_alias_set (pointed_to_type);
-
- if (pointed_to_alias_set == 0)
- /* It's not legal to make a subset of alias set zero. */
- DECL_POINTER_ALIAS_SET (decl) = 0;
- else if (AGGREGATE_TYPE_P (pointed_to_type))
- /* For an aggregate, we must treat the restricted
- pointer the same as an ordinary pointer. If we
- were to make the type pointed to by the
- restricted pointer a subset of the pointed-to
- type, then we would believe that other subsets
- of the pointed-to type (such as fields of that
- type) do not conflict with the type pointed to
- by the restricted pointer. */
- DECL_POINTER_ALIAS_SET (decl)
- = pointed_to_alias_set;
- else
- {
- DECL_POINTER_ALIAS_SET (decl) = new_alias_set ();
- record_alias_subset (pointed_to_alias_set,
- DECL_POINTER_ALIAS_SET (decl));
- }
- }
-
- /* We use the alias set indicated in the declaration. */
- return DECL_POINTER_ALIAS_SET (decl);
- }
-
- /* If we have an INDIRECT_REF via a void pointer, we don't
- know anything about what that might alias. Likewise if the
- pointer is marked that way. */
- else if (TREE_CODE (TREE_TYPE (inner)) == VOID_TYPE
- || (TYPE_REF_CAN_ALIAS_ALL
- (TREE_TYPE (TREE_OPERAND (inner, 0)))))
- return 0;
+ set = get_deref_alias_set_1 (TREE_OPERAND (inner, 0));
+ if (set != -1)
+ return set;
}
/* Otherwise, pick up the outermost object that we could have a pointer
extern alias_set_type new_alias_set (void);
extern alias_set_type get_alias_set (tree);
+extern alias_set_type get_deref_alias_set (tree);
extern alias_set_type get_varargs_alias_set (void);
extern alias_set_type get_frame_alias_set (void);
extern bool component_uses_parent_alias_set (const_tree);
@item max-cse-insns
The maximum instructions CSE process before flushing. The default is 1000.
-@item max-aliased-vops
-
-Maximum number of virtual operands per function allowed to represent
-aliases before triggering the alias partitioning heuristic. Alias
-partitioning reduces compile times and memory consumption needed for
-aliasing at the expense of precision loss in alias information. The
-default value for this parameter is 100 for -O1, 500 for -O2 and 1000
-for -O3.
-
-Notice that if a function contains more memory statements than the
-value of this parameter, it is not really possible to achieve this
-reduction. In this case, the compiler will use the number of memory
-statements as the value for @option{max-aliased-vops}.
-
-@item avg-aliased-vops
-
-Average number of virtual operands per statement allowed to represent
-aliases before triggering the alias partitioning heuristic. This
-works in conjunction with @option{max-aliased-vops}. If a function
-contains more than @option{max-aliased-vops} virtual operators, then
-memory symbols will be grouped into memory partitions until either the
-total number of virtual operators is below @option{max-aliased-vops}
-or the average number of virtual operators per memory statement is
-below @option{avg-aliased-vops}. The default value for this parameter
-is 1 for -O1 and -O2, and 3 for -O3.
-
@item ggc-min-expand
GCC uses a garbage collector to manage its own memory allocation. This
@cindex flow-sensitive alias analysis
@cindex flow-insensitive alias analysis
-Alias analysis proceeds in 4 main phases:
+Alias analysis in GIMPLE SSA form consists of two pieces.  First,
+the virtual SSA web ties conflicting memory accesses together and
+provides an SSA use-def chain and SSA immediate-use chains for
+walking possibly dependent memory accesses.  Second, an alias
+oracle can be queried to disambiguate explicit and implicit memory
+references.
@enumerate
-@item Structural alias analysis.
+@item Memory SSA form.
-This phase walks the types for structure variables, and determines which
-of the fields can overlap using offset and size of each field. For each
-field, a ``subvariable'' called a ``Structure field tag'' (SFT)@ is
-created, which represents that field as a separate variable. All
-accesses that could possibly overlap with a given field will have
-virtual operands for the SFT of that field.
+All statements that may use memory have exactly one accompanying use
+of a virtual SSA name that represents the state of memory at the
+given point in the IL.
+
+All statements that may define memory have exactly one accompanying
+definition of a virtual SSA name using the previous state of memory
+and defining the new state of memory after the given point in the IL.
@smallexample
-struct foo
-@{
- int a;
- int b;
-@}
-struct foo temp;
-int bar (void)
+int i;
+int foo (void)
@{
- int tmp1, tmp2, tmp3;
- SFT.0_2 = VDEF <SFT.0_1>
- temp.a = 5;
- SFT.1_4 = VDEF <SFT.1_3>
- temp.b = 6;
-
- VUSE <SFT.1_4>
- tmp1_5 = temp.b;
- VUSE <SFT.0_2>
- tmp2_6 = temp.a;
-
- tmp3_7 = tmp1_5 + tmp2_6;
- return tmp3_7;
+ # .MEM_3 = VDEF <.MEM_2(D)>
+ i = 1;
+ # VUSE <.MEM_3>
+ return i;
@}
@end smallexample
-If you copy the symbol tag for a variable for some reason, you probably
-also want to copy the subvariables for that variable.
+The virtual SSA names in this case are @code{.MEM_2(D)} and
+@code{.MEM_3}.  The store to the global variable @code{i}
+defines @code{.MEM_3}, invalidating @code{.MEM_2(D)}.  The
+load from @code{i} uses that new state @code{.MEM_3}.
+
+The virtual SSA web serves as a set of constraints on the SSA
+optimizers, preventing illegitimate code motion and optimization.
+It also provides a way to walk related memory statements.
@item Points-to and escape analysis.
-This phase walks the use-def chains in the SSA web looking for
-three things:
+Points-to analysis builds a set of constraints from the GIMPLE
+SSA IL representing all pointer operations and facts we do
+or do not know about pointers. Solving this set of constraints
+yields a conservatively correct solution for each pointer
+variable in the program (though we are only interested in
+SSA name pointers) as to what it may possibly point to.
+
+This points-to solution for a given SSA name pointer is stored
+in the @code{pt_solution} sub-structure of the
+@code{SSA_NAME_PTR_INFO} record. The following accessor
+functions are available:
@itemize @bullet
-@item Assignments of the form @code{P_i = &VAR}
-@item Assignments of the form P_i = malloc()
-@item Pointers and ADDR_EXPR that escape the current function.
+@item @code{pt_solution_includes}
+@item @code{pt_solutions_intersect}
@end itemize
-The concept of `escaping' is the same one used in the Java world.
-When a pointer or an ADDR_EXPR escapes, it means that it has been
-exposed outside of the current function. So, assignment to
-global variables, function arguments and returning a pointer are
-all escape sites.
-
-This is where we are currently limited. Since not everything is
-renamed into SSA, we lose track of escape properties when a
-pointer is stashed inside a field in a structure, for instance.
-In those cases, we are assuming that the pointer does escape.
-
-We use escape analysis to determine whether a variable is
-call-clobbered. Simply put, if an ADDR_EXPR escapes, then the
-variable is call-clobbered. If a pointer P_i escapes, then all
-the variables pointed-to by P_i (and its memory tag) also escape.
-
-@item Compute flow-sensitive aliases
+Points-to analysis also computes the solution for two special
+sets of pointers, @code{ESCAPED} and @code{CALLUSED}.  Those
+represent all memory that has escaped the scope of analysis
+or that is used by pure or nested const calls.
-We have two classes of memory tags. Memory tags associated with
-the pointed-to data type of the pointers in the program. These
-tags are called ``symbol memory tag'' (SMT)@. The other class are
-those associated with SSA_NAMEs, called ``name memory tag'' (NMT)@.
-The basic idea is that when adding operands for an INDIRECT_REF
-*P_i, we will first check whether P_i has a name tag, if it does
-we use it, because that will have more precise aliasing
-information. Otherwise, we use the standard symbol tag.
+@item Type-based alias analysis
-In this phase, we go through all the pointers we found in
-points-to analysis and create alias sets for the name memory tags
-associated with each pointer P_i. If P_i escapes, we mark
-call-clobbered the variables it points to and its tag.
-
-
-@item Compute flow-insensitive aliases
-
-This pass will compare the alias set of every symbol memory tag and
-every addressable variable found in the program. Given a symbol
-memory tag SMT and an addressable variable V@. If the alias sets
-of SMT and V conflict (as computed by may_alias_p), then V is
-marked as an alias tag and added to the alias set of SMT@.
+Type-based alias analysis is frontend-dependent, though generic
+support is provided by the middle-end in @code{alias.c}.  TBAA
+code is used by both tree optimizers and RTL optimizers.
Every language that wishes to perform language-specific alias analysis
should define a function that computes, given a @code{tree}
node, an alias set for the node. Nodes in different alias sets are not
allowed to alias. For an example, see the C front-end function
@code{c_get_alias_set}.
-@end enumerate
-
-For instance, consider the following function:
-
-@smallexample
-foo (int i)
-@{
- int *p, *q, a, b;
-
- if (i > 10)
- p = &a;
- else
- q = &b;
-
- *p = 3;
- *q = 5;
- a = b + 2;
- return *p;
-@}
-@end smallexample
-
-After aliasing analysis has finished, the symbol memory tag for
-pointer @code{p} will have two aliases, namely variables @code{a} and
-@code{b}.
-Every time pointer @code{p} is dereferenced, we want to mark the
-operation as a potential reference to @code{a} and @code{b}.
-
-@smallexample
-foo (int i)
-@{
- int *p, a, b;
-
- if (i_2 > 10)
- p_4 = &a;
- else
- p_6 = &b;
- # p_1 = PHI <p_4(1), p_6(2)>;
- # a_7 = VDEF <a_3>;
- # b_8 = VDEF <b_5>;
- *p_1 = 3;
+@item Tree alias-oracle
- # a_9 = VDEF <a_7>
- # VUSE <b_8>
- a_9 = b_8 + 2;
+The tree alias oracle provides the means to disambiguate two memory
+references and to disambiguate memory references against statements.
+The following queries are available:
- # VUSE <a_9>;
- # VUSE <b_8>;
- return *p_1;
-@}
-@end smallexample
-
-In certain cases, the list of may aliases for a pointer may grow
-too large. This may cause an explosion in the number of virtual
-operands inserted in the code. Resulting in increased memory
-consumption and compilation time.
-
-When the number of virtual operands needed to represent aliased
-loads and stores grows too large (configurable with @option{--param
-max-aliased-vops}), alias sets are grouped to avoid severe
-compile-time slow downs and memory consumption. The alias
-grouping heuristic proceeds as follows:
-
-@enumerate
-@item Sort the list of pointers in decreasing number of contributed
-virtual operands.
-
-@item Take the first pointer from the list and reverse the role
-of the memory tag and its aliases. Usually, whenever an
-aliased variable Vi is found to alias with a memory tag
-T, we add Vi to the may-aliases set for T@. Meaning that
-after alias analysis, we will have:
-
-@smallexample
-may-aliases(T) = @{ V1, V2, V3, @dots{}, Vn @}
-@end smallexample
-
-This means that every statement that references T, will get
-@code{n} virtual operands for each of the Vi tags. But, when
-alias grouping is enabled, we make T an alias tag and add it
-to the alias set of all the Vi variables:
-
-@smallexample
-may-aliases(V1) = @{ T @}
-may-aliases(V2) = @{ T @}
-@dots{}
-may-aliases(Vn) = @{ T @}
-@end smallexample
-
-This has two effects: (a) statements referencing T will only get
-a single virtual operand, and, (b) all the variables Vi will now
-appear to alias each other. So, we lose alias precision to
-improve compile time. But, in theory, a program with such a high
-level of aliasing should not be very optimizable in the first
-place.
-
-@item Since variables may be in the alias set of more than one
-memory tag, the grouping done in step (2) needs to be extended
-to all the memory tags that have a non-empty intersection with
-the may-aliases set of tag T@. For instance, if we originally
-had these may-aliases sets:
-
-@smallexample
-may-aliases(T) = @{ V1, V2, V3 @}
-may-aliases(R) = @{ V2, V4 @}
-@end smallexample
-
-In step (2) we would have reverted the aliases for T as:
-
-@smallexample
-may-aliases(V1) = @{ T @}
-may-aliases(V2) = @{ T @}
-may-aliases(V3) = @{ T @}
-@end smallexample
+@itemize @bullet
+@item @code{refs_may_alias_p}
+@item @code{ref_maybe_used_by_stmt_p}
+@item @code{stmt_may_clobber_ref_p}
+@end itemize
-But note that now V2 is no longer aliased with R@. We could
-add R to may-aliases(V2), but we are in the process of
-grouping aliases to reduce virtual operands so what we do is
-add V4 to the grouping to obtain:
+In addition to these queries, two kinds of statement walkers are
+available that walk statements related to a reference @var{ref}.
+@code{walk_non_aliased_vuses} walks over the dominating
+memory-defining statements and invokes its callback for each
+statement that does not clobber @var{ref}, providing the non-aliased
+VUSE.  The walk stops at the first clobbering statement, or when the
+callback asks it to.
+@code{walk_aliased_vdefs} walks over the dominating memory-defining
+statements and invokes its callback on each statement clobbering
+@var{ref}, providing its aliasing VDEF.  The walk stops when the
+callback asks it to.
-@smallexample
-may-aliases(V1) = @{ T @}
-may-aliases(V2) = @{ T @}
-may-aliases(V3) = @{ T @}
-may-aliases(V4) = @{ T @}
-@end smallexample
-
-@item If the total number of virtual operands due to aliasing is
-still above the threshold set by max-alias-vops, go back to (2).
@end enumerate
+
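As a self-contained illustration of the kind of question the oracle's disambiguation queries answer (this is ordinary user-level C, not GCC-internal code, and only a sketch): under C's type-based aliasing rules a store through an `int *` cannot modify a `float` object, so the two references below do not alias and the final load of `*f` may be replaced by the value just stored.

```c
#include <assert.h>

/* Illustrative only: a store through an int * cannot alias a float
   object, so a compiler that disambiguates the two references may
   forward the stored constant to the final load of *f.  */
static float
store_both (int *i, float *f)
{
  *f = 1.0f;
  *i = 42;      /* cannot alias *f: different effective types */
  return *f;    /* may be optimized to 1.0f */
}
```

The later testcase in this patch that is compiled with `-fstrict-aliasing` exercises the same class of disambiguation, just through class hierarchies instead of scalar types.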
}
-/* Dump the set of decls SYMS. BUFFER, SPC and FLAGS are as in
- dump_generic_node. */
-
-static void
-dump_symbols (pretty_printer *buffer, bitmap syms, int flags)
-{
- unsigned i;
- bitmap_iterator bi;
-
- if (syms == NULL)
- pp_string (buffer, "NIL");
- else
- {
- pp_string (buffer, " { ");
-
- EXECUTE_IF_SET_IN_BITMAP (syms, 0, i, bi)
- {
- tree sym = referenced_var_lookup (i);
- dump_generic_node (buffer, sym, 0, flags, false);
- pp_character (buffer, ' ');
- }
-
- pp_character (buffer, '}');
- }
-}
-
-
/* Dump a PHI node PHI. BUFFER, SPC and FLAGS are as in
dump_gimple_stmt. */
static void
dump_gimple_mem_ops (pretty_printer *buffer, gimple gs, int spc, int flags)
{
- struct voptype_d *vdefs;
- struct voptype_d *vuses;
- int i, n;
+ tree vdef = gimple_vdef (gs);
+ tree vuse = gimple_vuse (gs);
if (!ssa_operands_active () || !gimple_references_memory_p (gs))
return;
- /* Even if the statement doesn't have virtual operators yet, it may
- contain symbol information (this happens before aliases have been
- computed). */
- if ((flags & TDF_MEMSYMS)
- && gimple_vuse_ops (gs) == NULL
- && gimple_vdef_ops (gs) == NULL)
- {
- if (gimple_loaded_syms (gs))
- {
- pp_string (buffer, "# LOADS: ");
- dump_symbols (buffer, gimple_loaded_syms (gs), flags);
- newline_and_indent (buffer, spc);
- }
-
- if (gimple_stored_syms (gs))
- {
- pp_string (buffer, "# STORES: ");
- dump_symbols (buffer, gimple_stored_syms (gs), flags);
- newline_and_indent (buffer, spc);
- }
-
- return;
- }
-
- vuses = gimple_vuse_ops (gs);
- while (vuses)
+ if (vdef != NULL_TREE)
{
- pp_string (buffer, "# VUSE <");
-
- n = VUSE_NUM (vuses);
- for (i = 0; i < n; i++)
- {
- dump_generic_node (buffer, VUSE_OP (vuses, i), spc + 2, flags, false);
- if (i < n - 1)
- pp_string (buffer, ", ");
- }
-
+ pp_string (buffer, "# ");
+ dump_generic_node (buffer, vdef, spc + 2, flags, false);
+ pp_string (buffer, " = VDEF <");
+ dump_generic_node (buffer, vuse, spc + 2, flags, false);
pp_character (buffer, '>');
-
- if (flags & TDF_MEMSYMS)
- dump_symbols (buffer, gimple_loaded_syms (gs), flags);
-
newline_and_indent (buffer, spc);
- vuses = vuses->next;
}
-
- vdefs = gimple_vdef_ops (gs);
- while (vdefs)
+ else if (vuse != NULL_TREE)
{
- pp_string (buffer, "# ");
- dump_generic_node (buffer, VDEF_RESULT (vdefs), spc + 2, flags, false);
- pp_string (buffer, " = VDEF <");
-
- n = VDEF_NUM (vdefs);
- for (i = 0; i < n; i++)
- {
- dump_generic_node (buffer, VDEF_OP (vdefs, i), spc + 2, flags, 0);
- if (i < n - 1)
- pp_string (buffer, ", ");
- }
-
+ pp_string (buffer, "# VUSE <");
+ dump_generic_node (buffer, vuse, spc + 2, flags, false);
pp_character (buffer, '>');
-
- if ((flags & TDF_MEMSYMS) && vdefs->next == NULL)
- dump_symbols (buffer, gimple_stored_syms (gs), flags);
-
newline_and_indent (buffer, spc);
- vdefs = vdefs->next;
}
}
LABEL_DECL_UID (t) = uid = cfun->cfg->last_label_uid++;
if (old_len <= (unsigned) uid)
{
- unsigned new_len = 3 * uid / 2;
+ unsigned new_len = 3 * uid / 2 + 1;
VEC_safe_grow_cleared (basic_block, gc, label_to_block_map,
new_len);
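A minimal sketch of why the hunk above appends `+ 1` to the growth formula (illustrative values only, not GCC code): the vector must grow to a length strictly greater than `uid` for index `uid` to be valid, and for `uid` of 0 or 1 the old formula `3 * uid / 2` does not exceed `uid`.

```c
#include <assert.h>

/* Illustrative only: the old and new growth formulas from the hunk
   above.  A valid new length must be strictly greater than UID.  */
static unsigned grow_old (unsigned uid) { return 3 * uid / 2; }
static unsigned grow_new (unsigned uid) { return 3 * uid / 2 + 1; }
```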
if (gimple_has_mem_ops (stmt))
{
- gimple_set_vdef_ops (copy, NULL);
- gimple_set_vuse_ops (copy, NULL);
- copy->gsmem.membase.stores = NULL;
- copy->gsmem.membase.loads = NULL;
+ gimple_set_vdef (copy, gimple_vdef (stmt));
+ gimple_set_vuse (copy, gimple_vuse (stmt));
}
- update_stmt (copy);
+ /* SSA operands need to be updated. */
+ gimple_set_modified (copy, true);
}
return copy;
}
-/* Deep copy SYMS into the set of symbols stored by STMT. If SYMS is
- NULL or empty, the storage used is freed up. */
-
-void
-gimple_set_stored_syms (gimple stmt, bitmap syms, bitmap_obstack *obs)
-{
- gcc_assert (gimple_has_mem_ops (stmt));
-
- if (syms == NULL || bitmap_empty_p (syms))
- BITMAP_FREE (stmt->gsmem.membase.stores);
- else
- {
- if (stmt->gsmem.membase.stores == NULL)
- stmt->gsmem.membase.stores = BITMAP_ALLOC (obs);
-
- bitmap_copy (stmt->gsmem.membase.stores, syms);
- }
-}
-
-
-/* Deep copy SYMS into the set of symbols loaded by STMT. If SYMS is
- NULL or empty, the storage used is freed up. */
-
-void
-gimple_set_loaded_syms (gimple stmt, bitmap syms, bitmap_obstack *obs)
-{
- gcc_assert (gimple_has_mem_ops (stmt));
-
- if (syms == NULL || bitmap_empty_p (syms))
- BITMAP_FREE (stmt->gsmem.membase.loads);
- else
- {
- if (stmt->gsmem.membase.loads == NULL)
- stmt->gsmem.membase.loads = BITMAP_ALLOC (obs);
-
- bitmap_copy (stmt->gsmem.membase.loads, syms);
- }
-}
-
-
/* Return the number of operands needed on the RHS of a GIMPLE
assignment for an expression with tree code CODE. */
if (TREE_CODE (t) == SSA_NAME)
t = SSA_NAME_VAR (t);
- if (MTAG_P (t))
- return false;
-
if (!is_gimple_variable (t))
return false;
if (gimple_call_lhs (stmt))
gimple_call_set_lhs (new_stmt, gimple_call_lhs (stmt));
+ gimple_set_vuse (new_stmt, gimple_vuse (stmt));
+ gimple_set_vdef (new_stmt, gimple_vdef (stmt));
+
gimple_set_block (new_stmt, gimple_block (stmt));
if (gimple_has_location (stmt))
gimple_set_location (new_stmt, gimple_location (stmt));
gimple_call_set_return_slot_opt (new_stmt, gimple_call_return_slot_opt_p (stmt));
gimple_call_set_from_thunk (new_stmt, gimple_call_from_thunk_p (stmt));
gimple_call_set_va_arg_pack (new_stmt, gimple_call_va_arg_pack_p (stmt));
+
+ gimple_set_modified (new_stmt, true);
+
return new_stmt;
}
+
+/* Data structure used to count the number of dereferences to PTR
+ inside an expression. */
+struct count_ptr_d
+{
+ tree ptr;
+ unsigned num_stores;
+ unsigned num_loads;
+};
+
+/* Helper for count_uses_and_derefs. Called by walk_tree to look for
+ (ALIGN/MISALIGNED_)INDIRECT_REF nodes for the pointer passed in DATA. */
+
+static tree
+count_ptr_derefs (tree *tp, int *walk_subtrees, void *data)
+{
+ struct walk_stmt_info *wi_p = (struct walk_stmt_info *) data;
+ struct count_ptr_d *count_p = (struct count_ptr_d *) wi_p->info;
+
+ /* Do not walk inside ADDR_EXPR nodes. In the expression &ptr->fld,
+ pointer 'ptr' is *not* dereferenced, it is simply used to compute
+ the address of 'fld' as 'ptr + offsetof(fld)'. */
+ if (TREE_CODE (*tp) == ADDR_EXPR)
+ {
+ *walk_subtrees = 0;
+ return NULL_TREE;
+ }
+
+ if (INDIRECT_REF_P (*tp) && TREE_OPERAND (*tp, 0) == count_p->ptr)
+ {
+ if (wi_p->is_lhs)
+ count_p->num_stores++;
+ else
+ count_p->num_loads++;
+ }
+
+ return NULL_TREE;
+}
+
+/* Count the number of direct and indirect uses for pointer PTR in
+ statement STMT. The number of direct uses is stored in
+ *NUM_USES_P. Indirect references are counted separately depending
+ on whether they are store or load operations. The counts are
+ stored in *NUM_STORES_P and *NUM_LOADS_P. */
+
+void
+count_uses_and_derefs (tree ptr, gimple stmt, unsigned *num_uses_p,
+ unsigned *num_loads_p, unsigned *num_stores_p)
+{
+ ssa_op_iter i;
+ tree use;
+
+ *num_uses_p = 0;
+ *num_loads_p = 0;
+ *num_stores_p = 0;
+
+ /* Find out the total number of uses of PTR in STMT. */
+ FOR_EACH_SSA_TREE_OPERAND (use, stmt, i, SSA_OP_USE)
+ if (use == ptr)
+ (*num_uses_p)++;
+
+ /* Now count the number of indirect references to PTR. This is
+ truly awful, but we don't have much choice. There are no parent
+ pointers inside INDIRECT_REFs, so an expression like
+ '*x_1 = foo (x_1, *x_1)' needs to be traversed piece by piece to
+ find all the indirect and direct uses of x_1 inside. The only
+ shortcut we can take is the fact that GIMPLE only allows
+ INDIRECT_REFs inside the expressions below. */
+ if (is_gimple_assign (stmt)
+ || gimple_code (stmt) == GIMPLE_RETURN
+ || gimple_code (stmt) == GIMPLE_ASM
+ || is_gimple_call (stmt))
+ {
+ struct walk_stmt_info wi;
+ struct count_ptr_d count;
+
+ count.ptr = ptr;
+ count.num_stores = 0;
+ count.num_loads = 0;
+
+ memset (&wi, 0, sizeof (wi));
+ wi.info = &count;
+ walk_gimple_op (stmt, count_ptr_derefs, &wi);
+
+ *num_stores_p = count.num_stores;
+ *num_loads_p = count.num_loads;
+ }
+
+ gcc_assert (*num_uses_p >= *num_loads_p + *num_stores_p);
+}
+
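A self-contained sketch of the walk performed by `count_ptr_derefs` above, using a hypothetical toy tree in place of GIMPLE: dereferences of a given pointer are counted as stores or loads depending on their position (mirroring the `wi_p->is_lhs` check), and subtrees of an address-of node are skipped, because `&ptr->fld` uses `ptr` without dereferencing it.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Toy expression nodes standing in for GIMPLE trees (hypothetical,
   for illustration only).  */
enum toy_kind { TOY_VAR, TOY_DEREF, TOY_ADDR, TOY_PLUS };
struct toy_node
{
  enum toy_kind kind;
  struct toy_node *op0, *op1;
  const char *name;             /* for TOY_VAR */
};

/* Count dereferences of the variable named PTR below N.  IS_LHS
   selects stores vs. loads; subtrees of TOY_ADDR are skipped,
   mirroring the ADDR_EXPR early-out in count_ptr_derefs.  */
static void
toy_count_derefs (const struct toy_node *n, const char *ptr, int is_lhs,
                  unsigned *num_stores, unsigned *num_loads)
{
  if (n == NULL)
    return;
  if (n->kind == TOY_ADDR)
    return;                     /* &x->fld does not dereference x */
  if (n->kind == TOY_DEREF
      && n->op0->kind == TOY_VAR
      && strcmp (n->op0->name, ptr) == 0)
    {
      if (is_lhs)
        ++*num_stores;
      else
        ++*num_loads;
    }
  toy_count_derefs (n->op0, ptr, is_lhs, num_stores, num_loads);
  toy_count_derefs (n->op1, ptr, is_lhs, num_stores, num_loads);
}

/* Build the toy statement *x = *x + 1 and count derefs of x on each
   side, as count_uses_and_derefs does for a real statement.  */
static void
toy_demo (unsigned *stores, unsigned *loads)
{
  static struct toy_node x   = { TOY_VAR,   NULL, NULL, "x" };
  static struct toy_node lhs = { TOY_DEREF, &x,   NULL, NULL };
  static struct toy_node rd  = { TOY_DEREF, &x,   NULL, NULL };
  static struct toy_node one = { TOY_VAR,   NULL, NULL, "one" };
  static struct toy_node rhs = { TOY_PLUS,  &rd,  &one, NULL };
  *stores = *loads = 0;
  toy_count_derefs (&lhs, "x", 1, stores, loads);
  toy_count_derefs (&rhs, "x", 0, stores, loads);
}
```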
#include "gt-gimple.h"
DEF_VEC_ALLOC_P(gimple,heap);
DEF_VEC_ALLOC_P(gimple,gc);
+typedef gimple *gimple_p;
+DEF_VEC_P(gimple_p);
+DEF_VEC_ALLOC_P(gimple_p,heap);
+
DEF_VEC_P(gimple_seq);
DEF_VEC_ALLOC_P(gimple_seq,gc);
DEF_VEC_ALLOC_P(gimple_seq,heap);
/* Nonzero if this statement contains volatile operands. */
unsigned has_volatile_ops : 1;
- /* Nonzero if this statement contains memory refernces. */
- unsigned references_memory_p : 1;
+ /* Padding to get subcode to 16 bit alignment. */
+ unsigned pad : 1;
/* The SUBCODE field can be used for tuple-specific flags for tuples
that do not require subcodes. Note that SUBCODE should be at
/* [ WORD 1-7 ] */
struct gimple_statement_with_ops_base opbase;
- /* [ WORD 8-9 ]
- Vectors for virtual operands. */
- struct voptype_d GTY((skip (""))) *vdef_ops;
- struct voptype_d GTY((skip (""))) *vuse_ops;
-
- /* [ WORD 9-10 ]
- Symbols stored/loaded by this statement. */
- bitmap GTY((skip (""))) stores;
- bitmap GTY((skip (""))) loads;
+ /* [ WORD 8-9 ]
+ Virtual operands for this statement. The GC will pick them
+ up via the ssa_names array. */
+ tree GTY((skip (""))) vdef;
+ tree GTY((skip (""))) vuse;
};
struct gimple_statement_with_memory_ops GTY(())
{
- /* [ WORD 1-10 ] */
+ /* [ WORD 1-9 ] */
struct gimple_statement_with_memory_ops_base membase;
- /* [ WORD 11 ]
+ /* [ WORD 10 ]
Operand vector. NOTE! This must always be the last field
of this structure. In particular, this means that this
structure cannot be embedded inside another one. */
struct gimple_statement_asm GTY(())
{
- /* [ WORD 1-10 ] */
+ /* [ WORD 1-9 ] */
struct gimple_statement_with_memory_ops_base membase;
- /* [ WORD 11 ]
+ /* [ WORD 10 ]
__asm__ statement. */
const char *string;
- /* [ WORD 12 ]
+ /* [ WORD 11 ]
Number of inputs, outputs and clobbers. */
unsigned char ni;
unsigned char no;
unsigned short nc;
- /* [ WORD 13 ]
+ /* [ WORD 12 ]
Operand vector. NOTE! This must always be the last field
of this structure. In particular, this means that this
structure cannot be embedded inside another one. */
extern tree get_call_expr_in (tree t);
extern void recalculate_side_effects (tree);
+extern void count_uses_and_derefs (tree, gimple, unsigned *, unsigned *,
+ unsigned *);
/* In gimplify.c */
extern tree create_tmp_var_raw (tree, const char *);
/* In builtins.c */
extern bool validate_gimple_arglist (const_gimple, ...);
-/* In tree-ssa-operands.c */
-extern void gimple_add_to_addresses_taken (gimple, tree);
-
/* In tree-ssa.c */
extern bool tree_ssa_useless_type_conversion (tree);
extern bool useless_type_conversion_p (tree, tree);
}
-/* Return the set of VUSE operands for statement G. */
+/* Return the use operand for the single VUSE of statement G, if any. */
-static inline struct voptype_d *
-gimple_vuse_ops (const_gimple g)
+static inline use_operand_p
+gimple_vuse_op (const_gimple g)
{
+ struct use_optype_d *ops;
if (!gimple_has_mem_ops (g))
- return NULL;
- return g->gsmem.membase.vuse_ops;
+ return NULL_USE_OPERAND_P;
+ ops = g->gsops.opbase.use_ops;
+ if (ops
+ && USE_OP_PTR (ops)->use == &g->gsmem.membase.vuse)
+ return USE_OP_PTR (ops);
+ return NULL_USE_OPERAND_P;
}
+/* Return the def operand for the single VDEF of statement G, if any. */
-/* Set OPS to be the set of VUSE operands for statement G. */
-
-static inline void
-gimple_set_vuse_ops (gimple g, struct voptype_d *ops)
+static inline def_operand_p
+gimple_vdef_op (const_gimple g)
{
- gcc_assert (gimple_has_mem_ops (g));
- g->gsmem.membase.vuse_ops = ops;
+ struct def_optype_d *ops;
+ if (!gimple_has_mem_ops (g))
+ return NULL_DEF_OPERAND_P;
+ ops = g->gsops.opbase.def_ops;
+ if (ops
+ && DEF_OP_PTR (ops) == &g->gsmem.membase.vdef)
+ return DEF_OP_PTR (ops);
+ return NULL_DEF_OPERAND_P;
}
-/* Return the set of VDEF operands for statement G. */
+/* Return the single VUSE operand of the statement G. */
-static inline struct voptype_d *
-gimple_vdef_ops (const_gimple g)
+static inline tree
+gimple_vuse (const_gimple g)
{
if (!gimple_has_mem_ops (g))
- return NULL;
- return g->gsmem.membase.vdef_ops;
+ return NULL_TREE;
+ return g->gsmem.membase.vuse;
}
+/* Return the single VDEF operand of the statement G. */
-/* Set OPS to be the set of VDEF operands for statement G. */
-
-static inline void
-gimple_set_vdef_ops (gimple g, struct voptype_d *ops)
+static inline tree
+gimple_vdef (const_gimple g)
{
- gcc_assert (gimple_has_mem_ops (g));
- g->gsmem.membase.vdef_ops = ops;
+ if (!gimple_has_mem_ops (g))
+ return NULL_TREE;
+ return g->gsmem.membase.vdef;
}
+/* Return a pointer to the single VUSE operand of the statement G. */
-/* Return the set of symbols loaded by statement G. Each element of the
- set is the DECL_UID of the corresponding symbol. */
-
-static inline bitmap
-gimple_loaded_syms (const_gimple g)
+static inline tree *
+gimple_vuse_ptr (gimple g)
{
if (!gimple_has_mem_ops (g))
return NULL;
- return g->gsmem.membase.loads;
+ return &g->gsmem.membase.vuse;
}
+/* Return a pointer to the single VDEF operand of the statement G. */
-/* Return the set of symbols stored by statement G. Each element of
- the set is the DECL_UID of the corresponding symbol. */
-
-static inline bitmap
-gimple_stored_syms (const_gimple g)
+static inline tree *
+gimple_vdef_ptr (gimple g)
{
if (!gimple_has_mem_ops (g))
return NULL;
- return g->gsmem.membase.stores;
+ return &g->gsmem.membase.vdef;
+}
+
+/* Set the single VUSE operand of the statement G. */
+
+static inline void
+gimple_set_vuse (gimple g, tree vuse)
+{
+ gcc_assert (gimple_has_mem_ops (g));
+ g->gsmem.membase.vuse = vuse;
+}
+
+/* Set the single VDEF operand of the statement G. */
+
+static inline void
+gimple_set_vdef (gimple g, tree vdef)
+{
+ gcc_assert (gimple_has_mem_ops (g));
+ g->gsmem.membase.vdef = vdef;
}
static inline bool
gimple_references_memory_p (gimple stmt)
{
- return gimple_has_mem_ops (stmt) && stmt->gsbase.references_memory_p;
+ return gimple_has_mem_ops (stmt) && gimple_vuse (stmt);
}
-/* Set the REFERENCES_MEMORY_P flag for STMT to MEM_P. */
-
-static inline void
-gimple_set_references_memory (gimple stmt, bool mem_p)
-{
- if (gimple_has_mem_ops (stmt))
- stmt->gsbase.references_memory_p = (unsigned) mem_p;
-}
-
/* Return the subcode for OMP statement S. */
static inline unsigned
ssa_op_iter iter;
use_operand_p use_p;
- FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, SSA_OP_USE)
+ FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, SSA_OP_ALL_USES)
{
tree use = USE_FROM_PTR (use_p);
tree new_name = get_new_name_from_old_name (map, use);
tree new_name = force_gimple_operand_gsi (gsi, expr, true, NULL,
true, GSI_SAME_STMT);
- set_symbol_mem_tag (SSA_NAME_VAR (new_name),
- symbol_mem_tag (SSA_NAME_VAR (old_name)));
return fold_build1 (code, type, new_name);
}
operands. */
copy = gimple_copy (stmt);
gsi_insert_after (&gsi_tgt, copy, GSI_NEW_STMT);
- mark_symbols_for_renaming (copy);
+ mark_sym_for_renaming (gimple_vop (cfun));
region = lookup_stmt_eh_region (stmt);
if (region >= 0)
/* Create new names for all the definitions created by COPY and
add replacement mappings for each new name. */
- FOR_EACH_SSA_DEF_OPERAND (def_p, copy, op_iter, SSA_OP_DEF)
+ FOR_EACH_SSA_DEF_OPERAND (def_p, copy, op_iter, SSA_OP_ALL_DEFS)
{
tree old_name = DEF_FROM_PTR (def_p);
tree new_name = create_new_def_for (old_name, copy, def_p);
next_e, map);
htab_delete (map);
loop_iv_stack_remove_constants (ivstack);
- update_ssa (TODO_update_ssa);
recompute_all_dominators ();
+ update_ssa (TODO_update_ssa);
graphite_verify ();
return translate_clast (scop, context_loop, stmt->next, next_e, ivstack);
}
new_stmt = gimple_call_copy_skip_args (cs->call_stmt,
args_to_skip);
+ if (gimple_vdef (new_stmt))
+ SSA_NAME_DEF_STMT (gimple_vdef (new_stmt)) = new_stmt;
gsi = gsi_for_stmt (cs->call_stmt);
gsi_replace (&gsi, new_stmt, true);
cgraph_set_call_stmt (cs, new_stmt);
check_decl (funct_state local,
tree t, bool checking_write)
{
- if (MTAG_P (t))
- return;
/* Do not want to do anything with volatile except mark any
function that uses one to be not const or pure. */
if (TREE_THIS_VOLATILE (t))
/* Direct functions calls are handled by IPA propagation. */
}
-/* Look into pointer pointed to by GSIP and figure out what interesting side effects
- it have. */
+/* Look at the statement pointed to by the iterator GSIP and figure out
+ what interesting side effects it has. */
static void
check_stmt (gimple_stmt_iterator *gsip, funct_state local, bool ipa)
{
gimple stmt = gsi_stmt (*gsip);
unsigned int i = 0;
- bitmap_iterator bi;
if (dump_file)
{
fprintf (dump_file, " scanning: ");
print_gimple_stmt (dump_file, stmt, 0, 0);
}
- if (gimple_loaded_syms (stmt))
- EXECUTE_IF_SET_IN_BITMAP (gimple_loaded_syms (stmt), 0, i, bi)
- check_decl (local, referenced_var_lookup (i), false);
- if (gimple_stored_syms (stmt))
- EXECUTE_IF_SET_IN_BITMAP (gimple_stored_syms (stmt), 0, i, bi)
- check_decl (local, referenced_var_lookup (i), true);
+
+ /* Look for direct loads and stores. */
+ if (gimple_has_lhs (stmt))
+ {
+ tree lhs = get_base_address (gimple_get_lhs (stmt));
+ if (lhs && DECL_P (lhs))
+ check_decl (local, lhs, true);
+ }
+ if (gimple_assign_single_p (stmt))
+ {
+ tree rhs = get_base_address (gimple_assign_rhs1 (stmt));
+ if (rhs && DECL_P (rhs))
+ check_decl (local, rhs, false);
+ }
+ else if (is_gimple_call (stmt))
+ {
+ for (i = 0; i < gimple_call_num_args (stmt); ++i)
+ {
+ tree rhs = get_base_address (gimple_call_arg (stmt, i));
+ if (rhs && DECL_P (rhs))
+ check_decl (local, rhs, false);
+ }
+ }
+ else if (gimple_code (stmt) == GIMPLE_ASM)
+ {
+ for (i = 0; i < gimple_asm_ninputs (stmt); ++i)
+ {
+ tree op = TREE_VALUE (gimple_asm_input_op (stmt, i));
+ op = get_base_address (op);
+ if (op && DECL_P (op))
+ check_decl (local, op, false);
+ }
+ for (i = 0; i < gimple_asm_noutputs (stmt); ++i)
+ {
+ tree op = TREE_VALUE (gimple_asm_output_op (stmt, i));
+ op = get_base_address (op);
+ if (op && DECL_P (op))
+ check_decl (local, op, true);
+ }
+ }
if (gimple_code (stmt) != GIMPLE_CALL
&& stmt_could_throw_p (stmt))
if (fn)
local = get_reference_vars_info (fn)->local;
- if (gimple_loaded_syms (stmt))
- EXECUTE_IF_SET_IN_BITMAP (gimple_loaded_syms (stmt), 0, i, bi)
- mark_load (local, referenced_var_lookup (i));
- if (gimple_stored_syms (stmt))
- EXECUTE_IF_SET_IN_BITMAP (gimple_stored_syms (stmt), 0, i, bi)
- mark_store (local, referenced_var_lookup (i));
- if (gimple_addresses_taken (stmt))
- EXECUTE_IF_SET_IN_BITMAP (gimple_addresses_taken (stmt), 0, i, bi)
- mark_address_taken (referenced_var_lookup (i));
-
- switch (gimple_code (stmt))
+ /* Look for direct loads and stores. */
+ if (gimple_has_lhs (stmt))
+ {
+ tree lhs = get_base_address (gimple_get_lhs (stmt));
+ if (lhs && DECL_P (lhs))
+ mark_store (local, lhs);
+ }
+ if (gimple_assign_single_p (stmt))
{
- case GIMPLE_CALL:
+ tree rhs = get_base_address (gimple_assign_rhs1 (stmt));
+ if (rhs && DECL_P (rhs))
+ mark_load (local, rhs);
+ }
+ else if (is_gimple_call (stmt))
+ {
+ for (i = 0; i < gimple_call_num_args (stmt); ++i)
+ {
+ tree rhs = get_base_address (gimple_call_arg (stmt, i));
+ if (rhs && DECL_P (rhs))
+ mark_load (local, rhs);
+ }
check_call (local, stmt);
- break;
-
- case GIMPLE_ASM:
+ }
+ else if (gimple_code (stmt) == GIMPLE_ASM)
+ {
+ for (i = 0; i < gimple_asm_ninputs (stmt); ++i)
+ {
+ tree op = TREE_VALUE (gimple_asm_input_op (stmt, i));
+ op = get_base_address (op);
+ if (op && DECL_P (op))
+ mark_load (local, op);
+ }
+ for (i = 0; i < gimple_asm_noutputs (stmt); ++i)
+ {
+ tree op = TREE_VALUE (gimple_asm_output_op (stmt, i));
+ op = get_base_address (op);
+ if (op && DECL_P (op))
+ mark_store (local, op);
+ }
check_asm_memory_clobber (local, stmt);
- break;
-
- /* We used to check nonlocal labels here and set them as potentially modifying
- everything. This is not needed, since we can get to nonlocal label only
- from callee and thus we will get info propagated. */
-
- default:
- break;
}
+
+ if (gimple_addresses_taken (stmt))
+ EXECUTE_IF_SET_IN_BITMAP (gimple_addresses_taken (stmt), 0, i, bi)
+ mark_address_taken (referenced_var_lookup (i));
return NULL;
}
finalize_var_creation (tree new_decl)
{
add_referenced_var (new_decl);
- if (is_global_var (new_decl))
- mark_call_clobbered (new_decl, ESCAPE_UNKNOWN);
mark_sym_for_renaming (new_decl);
}
gimple new_stmt = gimple_copy (old_stmt);
unsigned i;
+ /* We are really building a new stmt, clear the virtual operands. */
+ if (gimple_has_mem_ops (new_stmt))
+ {
+ gimple_set_vuse (new_stmt, NULL_TREE);
+ gimple_set_vdef (new_stmt, NULL_TREE);
+ }
+
for (i = 0; VEC_iterate (tree, acc->vars, i, var); i++)
{
tree *pos;
use_operand_p use_p;
gcc_assert (is_gimple_assign (stmt));
- if (!ZERO_SSA_OPERANDS (stmt, SSA_OP_ALL_VIRTUALS)
+ if (gimple_vuse (stmt)
|| !stmt_invariant_in_loop_p (inner, stmt))
return false;
imm_use_iterator imm_iter;
use_operand_p use_p;
- if (!ZERO_SSA_OPERANDS (stmt, SSA_OP_ALL_VIRTUALS))
+ if (gimple_vuse (stmt))
return false;
FOR_EACH_IMM_USE_FAST (use_p, imm_iter, gimple_assign_lhs (stmt))
incremented when we do. */
for (bsi = gsi_start_bb (bbs[i]); !gsi_end_p (bsi);)
{
- ssa_op_iter i;
- tree n;
gimple stmt = gsi_stmt (bsi);
if (stmt == exit_condition
VEC_index (tree, lbounds, 0), replacements, &firstbsi);
gsi_move_before (&bsi, &tobsi);
-
+
/* If the statement has any virtual operands, they may
need to be rewired because the original loop may
still reference them. */
- FOR_EACH_SSA_TREE_OPERAND (n, stmt, i, SSA_OP_ALL_VIRTUALS)
- mark_sym_for_renaming (SSA_NAME_VAR (n));
+ if (gimple_vuse (stmt))
+ mark_sym_for_renaming (gimple_vop (cfun));
}
}
if (changed)
cleanup_tree_cfg ();
}
+ if (gimple_in_ssa_p (cfun))
+ update_ssa (TODO_update_ssa);
current_function_decl = save_current;
pop_cfun ();
}
decode_options (unsigned int argc, const char **argv)
{
static bool first_time_p = true;
- static int initial_max_aliased_vops;
- static int initial_avg_aliased_vops;
static int initial_min_crossjump_insns;
static int initial_max_fields_for_field_sensitive;
static int initial_loop_invariant_max_bbs_in_loop;
lang_hooks.initialize_diagnostics (global_dc);
/* Save initial values of parameters we reset. */
- initial_max_aliased_vops = MAX_ALIASED_VOPS;
- initial_avg_aliased_vops = AVG_ALIASED_VOPS;
initial_min_crossjump_insns
= compiler_params[PARAM_MIN_CROSSJUMP_INSNS].value;
initial_max_fields_for_field_sensitive
flag_tree_switch_conversion = 1;
flag_ipa_cp = opt2;
- /* Allow more virtual operators to increase alias precision. */
-
- set_param_value ("max-aliased-vops",
- (opt2) ? 500 : initial_max_aliased_vops);
-
/* Track fields in field-sensitive alias analysis. */
set_param_value ("max-fields-for-field-sensitive",
(opt2) ? 100 : initial_max_fields_for_field_sensitive);
if (flag_ipa_cp_clone)
flag_ipa_cp = 1;
- /* Allow even more virtual operators. Max-aliased-vops was set above for
- -O2, so don't reset it unless we are at -O3. */
- if (opt3)
- set_param_value ("max-aliased-vops", 1000);
-
- set_param_value ("avg-aliased-vops", (opt3) ? 3 : initial_avg_aliased_vops);
-
/* Just -O1/-O0 optimizations. */
opt1_max = (optimize <= 1);
align_loops = opt1_max;
"The maximum number of instructions to search backward when looking for equivalent reload",
100, 0, 0)
-DEFPARAM(PARAM_MAX_ALIASED_VOPS,
- "max-aliased-vops",
- "The maximum number of virtual operators that a function is allowed to have before triggering memory partitioning heuristics",
- 100, 0, 0)
-
-DEFPARAM(PARAM_AVG_ALIASED_VOPS,
- "avg-aliased-vops",
- "The average number of virtual operators that memory statements are allowed to have before triggering memory partitioning heuristics",
- 1, 0, 0)
-
DEFPARAM(PARAM_MAX_SCHED_REGION_BLOCKS,
"max-sched-region-blocks",
"The maximum number of blocks in a region to be considered for interblock scheduling",
PARAM_VALUE (PARAM_SMS_DFA_HISTORY)
#define SMS_LOOP_AVERAGE_COUNT_THRESHOLD \
PARAM_VALUE (PARAM_SMS_LOOP_AVERAGE_COUNT_THRESHOLD)
-#define MAX_ALIASED_VOPS \
- PARAM_VALUE (PARAM_MAX_ALIASED_VOPS)
-#define AVG_ALIASED_VOPS \
- PARAM_VALUE (PARAM_AVG_ALIASED_VOPS)
#define INTEGER_SHARE_LIMIT \
PARAM_VALUE (PARAM_INTEGER_SHARE_LIMIT)
#define MAX_LAST_VALUE_RTL \
NEXT_PASS (pass_expand_omp);
NEXT_PASS (pass_referenced_vars);
- NEXT_PASS (pass_reset_cc_flags);
NEXT_PASS (pass_build_ssa);
NEXT_PASS (pass_early_warn_uninitialized);
NEXT_PASS (pass_all_early_optimizations);
NEXT_PASS (pass_copy_prop);
NEXT_PASS (pass_merge_phi);
NEXT_PASS (pass_cd_dce);
- NEXT_PASS (pass_simple_dse);
NEXT_PASS (pass_tail_recursion);
NEXT_PASS (pass_convert_switch);
NEXT_PASS (pass_cleanup_eh);
SSA form to become out-of-date (see PR 22037). So, even
if the parent pass had not scheduled an SSA update, we may
still need to do one. */
- if (!(flags & TODO_update_ssa_any) && need_ssa_update_p ())
+ if (!(flags & TODO_update_ssa_any) && need_ssa_update_p (cfun))
flags |= TODO_update_ssa;
}
cfun->last_verified &= ~TODO_verify_ssa;
}
+ if (flags & TODO_update_address_taken)
+ execute_update_addresses_taken (true);
+
if (flags & TODO_rebuild_alias)
{
+ if (!(flags & TODO_update_address_taken))
+ execute_update_addresses_taken (true);
compute_may_aliases ();
cfun->curr_properties |= PROP_alias;
}
execute_todo (unsigned int flags)
{
#if defined ENABLE_CHECKING
- if (need_ssa_update_p ())
+ if (cfun
+ && need_ssa_update_p (cfun))
gcc_assert (flags & TODO_update_ssa_any);
#endif
This is a hack until the new folder is ready. */
in_gimple_form = (cfun && (cfun->curr_properties & PROP_trees)) != 0;
+ initializing_dump = pass_init_dump_file (pass);
+
/* Run pre-pass verification. */
execute_todo (pass->todo_flags_start);
(void *)(size_t)pass->properties_required);
#endif
- initializing_dump = pass_init_dump_file (pass);
-
/* If a timevar is present, start it. */
if (pass->tv_id)
timevar_push (pass->tv_id);
+2009-04-03 Richard Guenther <rguenther@suse.de>
+
+ PR middle-end/13146
+ PR tree-optimization/23940
+ PR tree-optimization/33237
+ PR middle-end/33974
+ PR middle-end/34093
+ PR tree-optimization/36201
+ PR tree-optimization/36230
+ PR tree-optimization/38049
+ PR tree-optimization/38207
+ PR tree-optimization/38230
+ PR tree-optimization/38301
+ PR tree-optimization/38585
+ PR middle-end/38895
+ PR tree-optimization/38985
+ PR tree-optimization/39299
+ * gcc.dg/pr19633-1.c: Adjust.
+ * gcc.dg/torture/pta-callused-1.c: Likewise.
+ * gcc.dg/torture/pr39074-2.c: Likewise.
+ * gcc.dg/torture/pr39074.c: Likewise.
+ * gcc.dg/torture/pta-ptrarith-3.c: New testcase.
+ * gcc.dg/torture/pr30375.c: Adjust.
+ * gcc.dg/torture/pr33563.c: Likewise.
+ * gcc.dg/torture/pr33870.c: Likewise.
+ * gcc.dg/torture/pr33560.c: Likewise.
+ * gcc.dg/torture/pta-structcopy-1.c: New testcase.
+ * gcc.dg/torture/ssa-pta-fn-1.c: Likewise.
+ * gcc.dg/tree-ssa/alias-15.c: Remove.
+ * gcc.dg/tree-ssa/ssa-dce-4.c: New testcase.
+ * gcc.dg/tree-ssa/pr26421.c: Adjust.
+ * gcc.dg/tree-ssa/ssa-fre-10.c: XFAIL.
+ * gcc.dg/tree-ssa/ssa-dce-5.c: New testcase.
+ * gcc.dg/tree-ssa/pr23382.c: Adjust.
+ * gcc.dg/tree-ssa/ssa-fre-20.c: New testcase.
+ * gcc.dg/tree-ssa/alias-16.c: Adjust.
+ * gcc.dg/tree-ssa/ssa-fre-13.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-fre-14.c: Likewise.
+ * gcc.dg/tree-ssa/alias-18.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-fre-15.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-lim-3.c: Likewise.
+ * gcc.dg/tree-ssa/alias-19.c: Likewise.
+ * gcc.dg/tree-ssa/pta-ptrarith-1.c: New testcase.
+ * gcc.dg/tree-ssa/pr13146.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-pre-23.c: Likewise.
+ * gcc.dg/tree-ssa/pta-ptrarith-2.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-fre-18.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-pre-24.c: New XFAILed testcase.
+ * gcc.dg/tree-ssa/ssa-fre-19.c: New testcase.
+ * gcc.dg/tree-ssa/alias-20.c: Likewise.
+ * gcc.dg/tree-ssa/ssa-dse-12.c: Likewise.
+ * gcc.dg/tree-ssa/pr38895.c: Likewise.
+ * gcc.dg/uninit-B.c: XFAIL.
+ * gcc.dg/vect/no-vfa-vect-43.c: Adjust.
+ * gcc.dg/uninit-pr19430.c: XFAIL.
+ * g++.dg/tree-ssa/pr13146.C: New testcase.
+ * g++.dg/opt/pr36187.C: Adjust.
+ * g++.dg/torture/20090329-1.C: New testcase.
+
2009-04-02 Chao-ying Fu <fu@mips.com>
* gcc.target/mips/interrupt_handler.c: New test.
/* { dg-do run } */
-/* { dg-options "-O2 --param max-aliased-vops=20" } */
+/* { dg-options "-O2" } */
extern "C" void abort (void);
enum SbxDataType { SbxINTEGER, SbxDECIMAL, SbxBYREF = 0x4000 };
--- /dev/null
+/* { dg-do compile } */
+
+struct input_iterator_tag { };
+template<typename _Category, typename _Tp, typename _Distance = long, typename _Pointer = _Tp*, typename _Reference = _Tp&>
+struct iterator {
+ typedef _Category iterator_category;
+};
+template<typename _Iterator> struct iterator_traits {
+ typedef typename _Iterator::iterator_category iterator_category;
+};
+template<typename, typename> struct __lc_rai {
+ template<typename _II1, typename _II2>
+ static _II1 __newlast1(_II1, _II1 __last1, _II2, _II2) {
+ return __last1;
+ }
+ template<typename _II>
+ static bool __cnd2(_II __first, _II __last) {
+ return __first != __last;
+ }
+};
+template<typename _II1, typename _II2, typename _Compare>
+bool lexicographical_compare(_II1 __first1, _II1 __last1, _II2 __first2,
+ _II2 __last2, _Compare __comp) {
+ typedef typename iterator_traits<_II1>::iterator_category _Category1;
+ typedef typename iterator_traits<_II2>::iterator_category _Category2;
+ typedef __lc_rai<_Category1, _Category2> __rai_type;
+ __last1 = __rai_type::__newlast1(__first1, __last1, __first2, __last2);
+ for (;
+ __first1 != __last1 && __rai_type::__cnd2(__first2, __last2);
+ ++__first1, ++__first2) {
+ if (__comp(*__first1, *__first2)) return true;
+ }
+}
+void __assert_fail () throw () __attribute__ ((__noreturn__));
+template<typename T> struct BoundsContainer { };
+template<class T> class input_iterator_wrapper : public iterator<input_iterator_tag, T, long, T*, T&> {
+public:
+ typedef BoundsContainer<T> ContainerType;
+ T* ptr;
+ ContainerType* SharedInfo;
+ input_iterator_wrapper(const input_iterator_wrapper& in) : ptr(in.ptr), SharedInfo(in.SharedInfo) { }
+ bool operator==(const input_iterator_wrapper& in) const {
+ (static_cast<void> ((SharedInfo != __null
+ && SharedInfo == in.SharedInfo)
+ ? 0 : (__assert_fail (), 0)));
+ }
+ bool operator!=(const input_iterator_wrapper& in) const {
+ return !(*this == in);
+ }
+ T& operator*() const { }
+ input_iterator_wrapper& operator++() { }
+};
+struct X { };
+bool predicate(const X&, const X&) {
+ return true;
+}
+bool test2(input_iterator_wrapper<X>& x) {
+ return lexicographical_compare(x, x, x, x, predicate);
+}
--- /dev/null
+/* { dg-do link } */
+/* { dg-options "-O -fstrict-aliasing" } */
+
+class first
+{
+public:
+ double d;
+ int f1;
+};
+
+class middle : public first
+{
+};
+
+class second : public middle
+{
+public:
+ int f2;
+ short a;
+};
+
+class third
+{
+public:
+ char a;
+ char b;
+};
+
+class multi: public third, public second
+{
+public:
+ short s;
+ char f3;
+};
+
+extern void link_error ();
+
+void
+foo (first *s1, second *s2)
+{
+ s1->f1 = 0;
+ s2->f2 = 0;
+ s1->f1++;
+ s2->f2++;
+ s1->f1++;
+ s2->f2++;
+ if (s1->f1 != 2)
+ link_error ();
+}
+
+void
+bar (first *s1, multi *s3)
+{
+ s1->f1 = 0;
+ s3->f3 = 0;
+ s1->f1++;
+ s3->f3++;
+ s1->f1++;
+ s3->f3++;
+ if (s1->f1 != 2)
+ link_error ();
+}
+
+
+int
+main()
+{
+ first a;
+ second b;
+ multi c;
+ foo (&a, &b);
+ bar (&a, &c);
+ return 0;
+}
/* { dg-do run } */
-
-/* The max-aliased-vops setting is a temporary workaround to avoid the
- random failures as described in PR 30194. This test case does not
- need alias sets bigger than 13 elements. */
-/* { dg-options "-O2 --param max-aliased-vops=15" } */
+/* { dg-options "-O2" } */
extern void abort (void);
/* { dg-do run } */
-/* { dg-options "--param max-aliased-vops=0" } */
typedef struct _s {
int a;
/* { dg-do run } */
-/* { dg-options "--param max-aliased-vops=0" } */
struct T
{
/* { dg-do run } */
-/* { dg-options "--param max-aliased-vops=0" } */
struct T
{
/* { dg-do run } */
-/* { dg-options "--param max-aliased-vops=1" } */
struct X {
int i;
return 0;
}
-/* { dg-final { scan-tree-dump "y.._., name memory tag: NMT..., is dereferenced, points-to vars: { i }" "alias" } } */
+/* { dg-final { scan-tree-dump "y.._., points-to vars: { i }" "alias" } } */
/* { dg-final { cleanup-tree-dump "alias" } } */
return 0;
}
-/* { dg-final { scan-tree-dump "y.._., name memory tag: NMT..., is dereferenced, points-to vars: { i }" "alias" } } */
+/* { dg-final { scan-tree-dump "y.._., points-to vars: { i }" "alias" } } */
/* { dg-final { cleanup-tree-dump "alias" } } */
return 0;
}
-/* { dg-final { scan-tree-dump "p.._., name memory tag: NMT..., is dereferenced, points-to vars: { i j }" "alias" } } */
+/* { dg-final { scan-tree-dump "p.._., points-to vars: { i j }" "alias" } } */
/* { dg-final { cleanup-tree-dump "alias" } } */
--- /dev/null
+/* { dg-do run } */
+/* { dg-options "-fdump-tree-alias" } */
+/* { dg-skip-if "" { *-*-* } { "-O0" } { "" } } */
+
+extern void abort (void);
+struct X {
+ int *p;
+ int *q;
+ int *r;
+};
+int __attribute__((noinline))
+foo(int i, int j, int k, int off)
+{
+ struct X x;
+ int **p, *q;
+ x.p = &i;
+ x.q = &j;
+ x.r = &k;
+ p = &x.q;
+ p += off;
+ /* *p points to { i, j, k } */
+ q = *p;
+ return *q;
+}
+int main()
+{
+ if (foo(1, 2, 3, -1) != 1)
+ abort ();
+ if (foo(1, 2, 3, 0) != 2)
+ abort ();
+ if (foo(1, 2, 3, 1) != 3)
+ abort ();
+ return 0;
+}
+
+/* { dg-final { scan-tree-dump "q_., points-to vars: { i j k }" "alias" } } */
+/* { dg-final { cleanup-tree-dump "alias" } } */
--- /dev/null
+/* { dg-do run } */
+/* { dg-options "-fno-tree-sra -fdump-tree-alias" } */
+/* { dg-skip-if "" { *-*-* } { "-O0" } { "" } } */
+
+struct X
+{
+ long l1;
+ struct Y
+ {
+ long l2;
+ int *p;
+ } y;
+};
+int i;
+static int
+foo (struct X *x)
+{
+ struct Y y = x->y;
+ *y.p = 0;
+ i = 1;
+ return *y.p;
+}
+extern void abort (void);
+int main()
+{
+ struct X x;
+ x.y.p = &i;
+ if (foo(&x) != 1)
+ abort ();
+ return 0;
+}
+
+/* { dg-final { scan-tree-dump "points-to vars: { i }" "alias" } } */
+/* { dg-final { cleanup-tree-dump "alias" } } */
--- /dev/null
+/* { dg-do run } */
+/* { dg-options "-fdump-tree-alias" } */
+/* { dg-skip-if "" { *-*-* } { "-O0" } { "" } } */
+
+extern void abort (void);
+int *glob;
+
+int * __attribute__((noinline,const))
+foo_const(int *p) { return p; }
+
+int * __attribute__((noinline,pure))
+foo_pure(int *p) { return glob; }
+
+int * __attribute__((noinline))
+foo_normal(int *p) { glob = p; return p; }
+
+void test_const(void)
+{
+ int i;
+ int *p = &i;
+ int *q_const = foo_const(p);
+ *p = 1;
+ *q_const = 2;
+ if (*p != 2)
+ abort ();
+}
+
+void test(void)
+{
+ int i;
+ int *p = &i;
+ int *q_normal = foo_normal(p);
+ *p = 1;
+ *q_normal = 2;
+ if (*p != 2)
+ abort ();
+}
+
+void test_pure(void)
+{
+ int i;
+ int *p = &i;
+ int *q_pure = foo_pure(p);
+ *p = 1;
+ *q_pure = 2;
+ if (*p != 2)
+ abort ();
+}
+
+int main()
+{
+ test_const();
+ test();
+ test_pure();
+ return 0;
+}
+
+/* { dg-final { scan-tree-dump "q_const_., points-to non-local, points-to vars: { i }" "alias" } } */
+/* { dg-final { scan-tree-dump "q_pure_., points-to non-local, points-to escaped, points-to vars: { i }" "alias" } } */
+/* { dg-final { scan-tree-dump "q_normal_., points-to non-local, points-to escaped, points-to vars: { }" "alias" } } */
+/* { dg-final { cleanup-tree-dump "alias" } } */
+++ /dev/null
-/* { dg-do compile } */
-/* { dg-options "-O -fno-early-inlining -fdump-tree-alias-vops-details" } */
-
-struct foo {
- int a;
- struct X {
- int b[4];
- } b;
-} m;
-static inline struct X *wrap(struct X *p) { return p; }
-int test2(void)
-{
- struct X *p = wrap(&m.b);
- /* Both memory references need to alias the same tags. */
- return p->b[3] - m.b.b[3];
-}
-
-/* { dg-final { scan-tree-dump-times "VUSE <m_.\\\(D\\\)>" 2 "alias" } } */
-/* { dg-final { cleanup-tree-dump "alias" } } */
/* { dg-do run } */
-/* { dg-options "-O --param max-aliased-vops=1" } */
-/* Compile with -O --param max-aliased-vops=1. This partitions all
- the initial SFTs for 'm' which was causing the operand scanner to
- miss adding the right SFTs to p->b[2]. */
extern void abort (void);
struct X {
/* { dg-do compile } */
-/* { dg-options "-O2 -fdump-tree-fre-details -fdump-tree-optimized --param max-aliased-vops=0" } */
+/* { dg-options "-O2 -fdump-tree-fre-details -fdump-tree-optimized" } */
struct A {
int i;
}
/* { dg-final { scan-tree-dump "q_. = { a b }" "alias" } } */
-/* { dg-final { scan-tree-dump "q_., name memory tag: NMT..., is dereferenced, points-to vars: { a b }" "alias" } } */
-/* { dg-final { scan-tree-dump "# VUSE <a_.\\\(D\\\), b_.>" "alias" } } */
+/* { dg-final { scan-tree-dump "q_., points-to vars: { a b }" "alias" } } */
/* { dg-final { cleanup-tree-dump "alias" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fstrict-aliasing -fdump-tree-optimized" } */
+
+struct S { float f; int i; };
+struct R { int x; int i; };
+
+/* Strict-aliasing rules say that int and float do not alias. */
+int bar(struct S *s, int *i)
+{
+ *i = 0;
+ s->f = 1.0;
+ return *i;
+}
+
+/* Strict-aliasing rules say that S and R do not alias. */
+int foo(struct S *s, struct R *r)
+{
+ r->i = 0;
+ s->i = 1;
+ return r->i;
+}
+
+/* { dg-final { scan-tree-dump-times "return 0;" 2 "optimized" } } */
+/* { dg-final { cleanup-tree-dump "optimized" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fstrict-aliasing -fdump-tree-optimized" } */
+
+struct A
+{
+ int i;
+};
+struct B
+{
+ struct A a;
+ int j;
+};
+
+int foo (struct A *p, struct B *q)
+{
+ p->i = 0;
+ q->j = 1;
+ return p->i;
+}
+
+/* { dg-final { scan-tree-dump "return 0;" "optimized" } } */
+/* { dg-final { cleanup-tree-dump "optimized" } } */
/* { dg-do compile } */
-/* { dg-options "-O2 -fdump-tree-alias-vops" } */
+/* { dg-options "-O2 -fdump-tree-pre-details" } */
struct a
{
int length;
struct a *a = malloc(sizeof(struct a));
return a->length;
}
-/* { dg-final { scan-tree-dump-times "VDEF <HEAP" 1 "alias"} } */
-/* { dg-final { cleanup-tree-dump "alias" } } */
+/* { dg-final { scan-tree-dump-times "Variable: HEAP" 1 "pre"} } */
+/* { dg-final { cleanup-tree-dump "pre" } } */
/* { dg-do compile } */
-/* { dg-options "-O2 -fdump-tree-alias-vops" } */
+/* { dg-options "-O2 -fdump-tree-optimized" } */
typedef struct {
int i;
/* Verify the call clobbers all of a. */
-/* { dg-final { scan-tree-dump-times "VDEF <a_" 2 "alias" } } */
-/* { dg-final { cleanup-tree-dump "alias" } } */
+/* { dg-final { scan-tree-dump-not "return 1;" "optimized" } } */
+/* { dg-final { cleanup-tree-dump "optimized" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fstrict-aliasing -fdump-tree-optimized" } */
+
+struct A {
+ int i;
+ int j;
+};
+struct B {
+ struct A a1;
+ struct A a2;
+};
+struct C {
+ struct A a1;
+ struct B b;
+};
+int foo(struct C *c, struct B *b)
+{
+ c->a1.i = 1;
+ b->a1.i = 0;
+ return c->a1.i;
+}
+
+/* { dg-final { scan-tree-dump "return 1;" "optimized" } } */
+/* { dg-final { cleanup-tree-dump "optimized" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O2 -fno-tree-ccp -fdump-tree-alias" } */
+
+extern void abort (void);
+struct X {
+ int *p;
+ int *q;
+ int *r;
+};
+int __attribute__((noinline))
+foo(int i, int j, int k, int off)
+{
+ struct X x;
+ int **p, *q;
+ x.p = &i;
+ x.q = &j;
+ x.r = &k;
+ p = &x.q;
+ p += 1;
+ /* *p points to { k } */
+ q = *p;
+ return *q;
+}
+
+/* { dg-final { scan-tree-dump "q_., points-to vars: { k }" "alias" } } */
+/* { dg-final { cleanup-tree-dump "alias" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O2 -fno-tree-ccp -fdump-tree-alias" } */
+
+extern void abort (void);
+struct X {
+ int *p;
+ int *q;
+ int *r;
+};
+int __attribute__((noinline))
+foo(int i, int j, int k, int off)
+{
+ struct X x;
+ int **p, *q;
+ x.p = &i;
+ x.q = &j;
+ x.r = &k;
+ p = &x.q;
+ p -= 1;
+ /* *p points to { i } */
+ q = *p;
+ return *q;
+}
+
+/* { dg-final { scan-tree-dump "q_., points-to vars: { i }" "alias" } } */
+/* { dg-final { cleanup-tree-dump "alias" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fdump-tree-cddce1" } */
+
+int foo(int b)
+{
+ int a[128];
+ a[b] = 1;
+ if (b)
+ {
+ b = 2;
+ a[2] = 0;
+ }
+ a[2] = 3;
+ return a[2] + b;
+}
+
+/* { dg-final { scan-tree-dump-times "a\\\[\[^\n\]\\\]" 2 "cddce1" } } */
+/* { dg-final { cleanup-tree-dump "cddce1" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fno-tree-sra -fdump-tree-cddce1" } */
+
+struct X { int i; };
+struct X foo(int b)
+{
+ struct X x;
+ if (b)
+ x.i = 0;
+ x.i = 1;
+ return x;
+}
+
+/* { dg-final { scan-tree-dump-times "x.i =" 1 "cddce1" } } */
+/* { dg-final { cleanup-tree-dump "cddce1" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fdump-tree-dse1" } */
+
+void foo (int *p, int b)
+{
+ if (b)
+ *p = 1;
+ *p = 0;
+}
+
+/* { dg-final { scan-tree-dump-times "\\\*p" 1 "dse1" } } */
+/* { dg-final { cleanup-tree-dump "dse1" } } */
}
}
-/* { dg-final { scan-tree-dump "Insertions: 2" "pre" } } */
+/* This is a weird testcase.  It would need PPRE to hoist the loop
+   invariants, and the volatility of state_in prevents DSE of the
+   first store.  Thus, this is XFAILed.  */
+
+/* { dg-final { scan-tree-dump "Insertions: 2" "pre" { xfail *-*-* } } } */
/* { dg-final { cleanup-tree-dump "pre" } } */
/* { dg-do compile } */
-/* { dg-options "-O -fstrict-aliasing -fno-tree-sra --param max-aliased-vops=0 --param max-fields-for-field-sensitive=0 -fdump-tree-fre-details" } */
+/* { dg-options "-O -fstrict-aliasing -fno-tree-sra -fdump-tree-fre-details" } */
-/* Should be optimized, propagating &a into (*p)[i] with parameters
- --param max-aliased-vops=0 --param max-fields-for-field-sensitive=0
- which means max 1 VOP per stmt and no SFTs. */
+/* Should be optimized, propagating &a into (*p)[i]. */
/* For this testcase we need TBAA to work. */
/* { dg-do compile } */
-/* { dg-options "-O -fno-tree-sra --param max-aliased-vops=0 --param max-fields-for-field-sensitive=0 -fdump-tree-fre-details" } */
+/* { dg-options "-O -fno-tree-sra -fdump-tree-fre-details" } */
-/* Should be optimized, propagating &a into (*p)[i] with parameters
- --param max-aliased-vops=0 --param max-fields-for-field-sensitive=0
- which means max 1 VOP per stmt and no SFTs. */
+/* Should be optimized, propagating &a into (*p)[i]. */
struct Foo
{
/* { dg-do compile } */
-/* { dg-options "-O -fno-tree-sra --param max-aliased-vops=0 --param max-fields-for-field-sensitive=0 -fdump-tree-fre-details" } */
+/* { dg-options "-O -fno-tree-sra -fdump-tree-fre-details" } */
-/* Should be optimized, propagating &a into (*p)[i] with parameters
- --param max-aliased-vops=0 --param max-fields-for-field-sensitive=0
- which means max 1 VOP per stmt and no SFTs. */
+/* Should be optimized, propagating &a into (*p)[i]. */
struct Foo
{
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fdump-tree-fre" } */
+
+struct a
+{
+ union
+ {
+ int a;
+ int b;
+ };
+ union
+ {
+ int c;
+ int d;
+ };
+};
+
+int f(struct a *c)
+{
+ int d = c->a;
+ c->c = 1;
+ return c->a + d;
+}
+
+/* We should have CSEd the load from c->a. */
+
+/* { dg-final { scan-tree-dump-times "c_.*\\\.a" 1 "fre" } } */
+/* { dg-final { cleanup-tree-dump "fre" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fdump-tree-fre" } */
+
+struct a
+{
+ union
+ {
+ int a;
+ int b;
+ };
+ union
+ {
+ int c;
+ int d;
+ };
+ int e;
+};
+
+int f(struct a *c)
+{
+ int d;
+ c->e = 2;
+ d = c->a;
+ c->c = 1;
+ return c->a + d;
+}
+
+/* We should have CSEd the load from c->a. */
+
+/* { dg-final { scan-tree-dump-times "c_.*\\\.a" 1 "fre" } } */
+/* { dg-final { cleanup-tree-dump "fre" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O -fdump-tree-optimized" } */
+
+int i, j;
+int foo(int b)
+{
+ j = 0;
+ if (b)
+ goto L2;
+L1:
+ i = i + 1;
+L2:
+ i = i + 1;
+ if (i == 1)
+ goto L1;
+ return j;
+}
+
+/* { dg-final { scan-tree-dump "return 0;" "optimized" } } */
+/* { dg-final { cleanup-tree-dump "optimized" } } */
/* { dg-do compile } */
-/* { dg-options "-O2 -fdump-tree-lim-details" } */
+/* { dg-options "-O -fdump-tree-lim-details" } */
struct { int x; int y; } global;
void foo(int n)
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O2 -fdump-tree-pre-stats" } */
+
+struct { int x; int y; } global;
+void foo(int n)
+{
+ int i;
+ for ( i=0; i<n; i++)
+ global.y += global.x*global.x;
+}
+
+/* { dg-final { scan-tree-dump "Eliminated: 2" "pre" } } */
+/* { dg-final { cleanup-tree-dump "pre" } } */
--- /dev/null
+/* { dg-do compile } */
+/* { dg-options "-O2 -fdump-tree-pre" } */
+
+void foo(int *p, double *x, int n)
+{
+ int i;
+ for (i = 0; i < n; ++i)
+ *(x + *p * i) = 0.0;
+}
+
+/* We should remove the unnecessary insertion of a phi-node and
+   _not_ end up using the phi result as the replacement for *p.
+ The issue here is that when PHI-translating the virtual operands
+ we assign different value-numbers to the load. Re-running VN
+ after insertion or trying to be clever and doing this on the
+ fly during PHI translation would solve this. The next copyprop
+ fixes this anyway. */
+
+/* { dg-final { scan-tree-dump-not "= prephitmp" "pre" { xfail *-*-* } } } */
+/* { dg-final { cleanup-tree-dump "pre" } } */
baz (void)
{
int i;
- if (i) /* { dg-warning "is used uninitialized" "uninit i warning" } */
+ if (i) /* { dg-warning "is used uninitialized" "uninit i warning" { xfail *-*-* } } */
bar (i);
foo (&i);
}
int main(void)
{
int i;
- printf("i = %d\n", i); /* { dg-warning "'i' is used uninitialized in this function" } */
+ printf("i = %d\n", i); /* { dg-warning "'i' is used uninitialized in this function" "" { xfail *-*-* } } */
frob(&i);
return 0;
void foo3(int*);
void bar3(void) {
int x;
- if(x) /* { dg-warning "'x' is used uninitialized in this function" "uninitialized" } */
+ if(x) /* { dg-warning "'x' is used uninitialized in this function" "uninitialized" { xfail *-*-* } } */
foo3(&x);
}
float pb[N] __attribute__ ((__aligned__(16))) = {0,3,6,9,12,15,18,21,24,27,30,33,36,39,42,45,48,51,54,57};
float pc[N] __attribute__ ((__aligned__(16))) = {0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19};
- /* Not vectorizable: pa may alias pb and/or pc, since their addresses escape. */
+  /* Vectorizable: pa cannot alias pb or pc, even though their
+     addresses escape.  &pa would need to escape for pa to point to
+     escaped memory.  */
for (i = 0; i < N; i++)
{
pa[i] = pb[i] * pc[i];
return 0;
}
-/* { dg-final { scan-tree-dump-times "vectorized 1 loops" 1 "vect" } } */
+/* { dg-final { scan-tree-dump-times "vectorized 1 loops" 2 "vect" } } */
/* { dg-final { scan-tree-dump-times "Alignment of access forced using versioning" 1 "vect" { target vect_no_align } } } */
/* { dg-final { cleanup-tree-dump "vect" } } */
DEFTIMEVAR (TV_REG_STATS , "register information")
DEFTIMEVAR (TV_ALIAS_ANALYSIS , "alias analysis")
+DEFTIMEVAR (TV_ALIAS_STMT_WALK , "alias stmt walking")
DEFTIMEVAR (TV_REG_SCAN , "register scan")
DEFTIMEVAR (TV_REBUILD_JUMP , "rebuild jump labels")
/* Timing in various stages of the compiler. */
DEFTIMEVAR (TV_TREE_STORE_COPY_PROP , "tree store copy prop")
DEFTIMEVAR (TV_FIND_REFERENCED_VARS , "tree find ref. vars")
DEFTIMEVAR (TV_TREE_PTA , "tree PTA")
-DEFTIMEVAR (TV_TREE_MAY_ALIAS , "tree alias analysis")
-DEFTIMEVAR (TV_CALL_CLOBBER , "tree call clobbering")
-DEFTIMEVAR (TV_FLOW_SENSITIVE , "tree flow sensitive alias")
-DEFTIMEVAR (TV_FLOW_INSENSITIVE , "tree flow insensitive alias")
-DEFTIMEVAR (TV_MEMORY_PARTITIONING , "tree memory partitioning")
DEFTIMEVAR (TV_TREE_INSERT_PHI_NODES , "tree PHI insertion")
DEFTIMEVAR (TV_TREE_SSA_REWRITE_BLOCKS, "tree SSA rewrite")
DEFTIMEVAR (TV_TREE_SSA_OTHER , "tree SSA other")
#include "tree-mudflap.h"
#include "tree-pass.h"
#include "gimple.h"
+#include "tree-ssa-alias.h"
#if defined (DWARF2_UNWIND_INFO) || defined (DWARF2_DEBUGGING_INFO)
#include "dwarf2out.h"
dump_bitmap_statistics ();
dump_vec_loc_statistics ();
dump_ggc_loc_statistics (final);
+ dump_alias_stats (stderr);
+ dump_pta_stats (stderr);
}
/* Clean up: close opened files, etc. */
{
free_dominance_info (CDI_DOMINATORS);
free_dominance_info (CDI_POST_DOMINATORS);
+  /* As we introduced new control-flow, we need to insert PHI-nodes
+     for the call-clobbers of the remaining call.  */
+ mark_sym_for_renaming (gimple_vop (cfun));
return (TODO_update_ssa | TODO_cleanup_cfg | TODO_ggc_collect
| TODO_remove_unused_locals);
}
x = TREE_OPERAND (x, 0))
;
- if (TREE_CODE (x) != VAR_DECL && TREE_CODE (x) != PARM_DECL)
+ if (!(TREE_CODE (x) == VAR_DECL
+ || TREE_CODE (x) == PARM_DECL
+ || TREE_CODE (x) == RESULT_DECL))
return NULL;
if (!TREE_ADDRESSABLE (x))
{
operands. */
copy = gimple_copy (stmt);
gsi_insert_after (&gsi_tgt, copy, GSI_NEW_STMT);
- copy_virtual_operands (copy, stmt);
region = lookup_stmt_eh_region (stmt);
if (region >= 0)
add_stmt_to_eh_region (copy, region);
free_region_copy = true;
}
- gcc_assert (!need_ssa_update_p ());
+ gcc_assert (!need_ssa_update_p (cfun));
/* Record blocks outside the region that are dominated by something
inside. */
free_region_copy = true;
}
- gcc_assert (!need_ssa_update_p ());
+ gcc_assert (!need_ssa_update_p (cfun));
/* Record blocks outside the region that are dominated by something
inside. */
mark_virtual_ops_for_renaming (gsi_stmt (gsi));
}
-/* Marks virtual operands of all statements in basic blocks BBS for
- renaming. */
-
-static void
-mark_virtual_ops_in_region (VEC (basic_block,heap) *bbs)
-{
- basic_block bb;
- unsigned i;
-
- for (i = 0; VEC_iterate (basic_block, bbs, i, bb); i++)
- mark_virtual_ops_in_bb (bb);
-}
-
/* Move basic block BB from function CFUN to function DEST_FN. The
block is moved out of the original linked list and placed after
block AFTER in the new list. Also, the block is removed from the
old_len = VEC_length (basic_block, cfg->x_label_to_block_map);
if (old_len <= (unsigned) uid)
{
- new_len = 3 * uid / 2;
+ new_len = 3 * uid / 2 + 1;
VEC_safe_grow_cleared (basic_block, gc,
cfg->x_label_to_block_map, new_len);
}
pop_cfun ();
- /* The ssa form for virtual operands in the source function will have to
- be repaired. We do not care for the real operands -- the sese region
- must be closed with respect to those. */
- mark_virtual_ops_in_region (bbs);
-
/* Move blocks from BBS into DEST_CFUN. */
gcc_assert (VEC_length (basic_block, bbs) >= 2);
after = dest_cfun->cfg->x_entry_block_ptr;
}
}
-/* Mark each virtual op in STMT for ssa update. */
-
-static void
-update_all_vops (gimple stmt)
-{
- ssa_op_iter iter;
- tree sym;
-
- FOR_EACH_SSA_TREE_OPERAND (sym, stmt, iter, SSA_OP_ALL_VIRTUALS)
- {
- if (TREE_CODE (sym) == SSA_NAME)
- sym = SSA_NAME_VAR (sym);
- mark_sym_for_renaming (sym);
- }
-}
-
-
/* Expand a complex move to scalars. */
static void
}
else
{
- update_all_vops (stmt);
if (gimple_assign_rhs_code (stmt) != COMPLEX_EXPR)
{
r = extract_component (gsi, rhs, 0, true);
gimple_return_set_retval (stmt, lhs);
}
- update_all_vops (stmt);
update_stmt (stmt);
}
}
static void
dr_analyze_alias (struct data_reference *dr)
{
- gimple stmt = DR_STMT (dr);
tree ref = DR_REF (dr);
- tree base = get_base_address (ref), addr, smt = NULL_TREE;
- ssa_op_iter it;
- tree op;
- bitmap vops;
+ tree base = get_base_address (ref), addr;
- if (DECL_P (base))
- smt = base;
- else if (INDIRECT_REF_P (base))
+ if (INDIRECT_REF_P (base))
{
addr = TREE_OPERAND (base, 0);
if (TREE_CODE (addr) == SSA_NAME)
- {
- smt = symbol_mem_tag (SSA_NAME_VAR (addr));
- DR_PTR_INFO (dr) = SSA_NAME_PTR_INFO (addr);
- }
+ DR_PTR_INFO (dr) = SSA_NAME_PTR_INFO (addr);
}
-
- DR_SYMBOL_TAG (dr) = smt;
-
- vops = BITMAP_ALLOC (NULL);
- FOR_EACH_SSA_TREE_OPERAND (op, stmt, it, SSA_OP_VIRTUAL_USES)
- {
- bitmap_set_bit (vops, DECL_UID (SSA_NAME_VAR (op)));
- }
-
- DR_VOPS (dr) = vops;
}
/* Returns true if the address of DR is invariant. */
void
free_data_ref (data_reference_p dr)
{
- BITMAP_FREE (DR_VOPS (dr));
VEC_free (tree, heap, DR_ACCESS_FNS (dr));
free (dr);
}
print_generic_expr (dump_file, DR_ALIGNED_TO (dr), TDF_SLIM);
fprintf (dump_file, "\n\tbase_object: ");
print_generic_expr (dump_file, DR_BASE_OBJECT (dr), TDF_SLIM);
- fprintf (dump_file, "\n\tsymbol tag: ");
- print_generic_expr (dump_file, DR_SYMBOL_TAG (dr), TDF_SLIM);
fprintf (dump_file, "\n");
}
const_tree type_a, type_b;
const_tree decl_a = NULL_TREE, decl_b = NULL_TREE;
- /* If the sets of virtual operands are disjoint, the memory references do not
- alias. */
- if (!bitmap_intersect_p (DR_VOPS (a), DR_VOPS (b)))
- return false;
-
/* If the accessed objects are disjoint, the memory references do not
alias. */
if (disjoint_objects_p (DR_BASE_OBJECT (a), DR_BASE_OBJECT (b)))
return false;
+ /* Query the alias oracle. */
+ if (!refs_may_alias_p (DR_REF (a), DR_REF (b)))
+ return false;
+
if (!addr_a || !addr_b)
return true;
- /* If the references are based on different static objects, they cannot alias
- (PTA should be able to disambiguate such accesses, but often it fails to,
- since currently we cannot distinguish between pointer and offset in pointer
- arithmetics). */
+ /* If the references are based on different static objects, they cannot
+ alias (PTA should be able to disambiguate such accesses, but often
+ it fails to). */
if (TREE_CODE (addr_a) == ADDR_EXPR
&& TREE_CODE (addr_b) == ADDR_EXPR)
return TREE_OPERAND (addr_a, 0) == TREE_OPERAND (addr_b, 0);
&& gimple_asm_volatile_p (stmt)))
clobbers_memory = true;
- if (ZERO_SSA_OPERANDS (stmt, SSA_OP_ALL_VIRTUALS))
+ if (!gimple_vuse (stmt))
return clobbers_memory;
if (stmt_code == GIMPLE_ASSIGN)
{
unsigned nb_top_relations = 0;
unsigned nb_bot_relations = 0;
- unsigned nb_basename_differ = 0;
unsigned nb_chrec_relations = 0;
struct data_dependence_relation *ddr;
nb_top_relations++;
else if (DDR_ARE_DEPENDENT (ddr) == chrec_known)
- {
- struct data_reference *a = DDR_A (ddr);
- struct data_reference *b = DDR_B (ddr);
-
- if (!bitmap_intersect_p (DR_VOPS (a), DR_VOPS (b)))
- nb_basename_differ++;
- else
- nb_bot_relations++;
- }
+ nb_bot_relations++;
else
nb_chrec_relations++;
gimple_stmt_iterator bsi;
for (bsi = gsi_start_bb (bb); !gsi_end_p (bsi); gsi_next (&bsi))
- if (!ZERO_SSA_OPERANDS (gsi_stmt (bsi), SSA_OP_VDEF))
+ if (gimple_vdef (gsi_stmt (bsi)))
VEC_safe_push (gimple, heap, *stmts, gsi_stmt (bsi));
}
{
/* The alias information that should be used for new pointers to this
location. SYMBOL_TAG is either a DECL or a SYMBOL_MEMORY_TAG. */
- tree symbol_tag;
struct ptr_info_def *ptr_info;
/* The set of virtual operands corresponding to this memory reference,
#define DR_OFFSET(DR) (DR)->innermost.offset
#define DR_INIT(DR) (DR)->innermost.init
#define DR_STEP(DR) (DR)->innermost.step
-#define DR_SYMBOL_TAG(DR) (DR)->alias.symbol_tag
#define DR_PTR_INFO(DR) (DR)->alias.ptr_info
-#define DR_VOPS(DR) (DR)->alias.vops
#define DR_ALIGNED_TO(DR) (DR)->innermost.aligned_to
#define DR_ACCESS_MATRIX(DR) (DR)->access_matrix
{
fprintf (file, "Variable: ");
dump_variable (file, var);
- fprintf (file, "\n");
}
+
+ fprintf (file, "\n");
}
fprintf (file, ", ");
print_generic_expr (file, TREE_TYPE (var), dump_flags);
- if (ann && ann->symbol_mem_tag)
- {
- fprintf (file, ", symbol memory tag: ");
- print_generic_expr (file, ann->symbol_mem_tag, dump_flags);
- }
-
if (TREE_ADDRESSABLE (var))
fprintf (file, ", is addressable");
if (TREE_THIS_VOLATILE (var))
fprintf (file, ", is volatile");
- dump_mem_sym_stats_for_var (file, var);
-
if (is_call_clobbered (var))
- {
- const char *s = "";
- var_ann_t va = var_ann (var);
- unsigned int escape_mask = va->escape_mask;
-
- fprintf (file, ", call clobbered");
- fprintf (file, " (");
- if (escape_mask & ESCAPE_STORED_IN_GLOBAL)
- { fprintf (file, "%sstored in global", s); s = ", "; }
- if (escape_mask & ESCAPE_TO_ASM)
- { fprintf (file, "%sgoes through ASM", s); s = ", "; }
- if (escape_mask & ESCAPE_TO_CALL)
- { fprintf (file, "%spassed to call", s); s = ", "; }
- if (escape_mask & ESCAPE_BAD_CAST)
- { fprintf (file, "%sbad cast", s); s = ", "; }
- if (escape_mask & ESCAPE_TO_RETURN)
- { fprintf (file, "%sreturned from func", s); s = ", "; }
- if (escape_mask & ESCAPE_TO_PURE_CONST)
- { fprintf (file, "%spassed to pure/const", s); s = ", "; }
- if (escape_mask & ESCAPE_IS_GLOBAL)
- { fprintf (file, "%sis global var", s); s = ", "; }
- if (escape_mask & ESCAPE_IS_PARM)
- { fprintf (file, "%sis incoming pointer", s); s = ", "; }
- if (escape_mask & ESCAPE_UNKNOWN)
- { fprintf (file, "%sunknown escape", s); s = ", "; }
- fprintf (file, ")");
- }
+ fprintf (file, ", call clobbered");
+ else if (is_call_used (var))
+ fprintf (file, ", call used");
if (ann->noalias_state == NO_ALIAS)
fprintf (file, ", NO_ALIAS (does not alias other NO_ALIAS symbols)");
print_generic_expr (file, gimple_default_def (cfun, var), dump_flags);
}
- if (MTAG_P (var) && may_aliases (var))
- {
- fprintf (file, ", may aliases: ");
- dump_may_aliases_for (file, var);
- }
-
- if (!is_gimple_reg (var))
- {
- if (memory_partition (var))
- {
- fprintf (file, ", belongs to partition: ");
- print_generic_expr (file, memory_partition (var), dump_flags);
- }
-
- if (TREE_CODE (var) == MEMORY_PARTITION_TAG)
- {
- fprintf (file, ", partition symbols: ");
- dump_decl_set (file, MPT_SYMBOLS (var));
- }
- }
-
fprintf (file, "\n");
}
gimple stmt = gsi_stmt (si);
dfa_stats_p->num_defs += NUM_SSA_OPERANDS (stmt, SSA_OP_DEF);
dfa_stats_p->num_uses += NUM_SSA_OPERANDS (stmt, SSA_OP_USE);
- dfa_stats_p->num_vdefs += NUM_SSA_OPERANDS (stmt, SSA_OP_VDEF);
- dfa_stats_p->num_vuses += NUM_SSA_OPERANDS (stmt, SSA_OP_VUSE);
+ dfa_stats_p->num_vdefs += gimple_vdef (stmt) ? 1 : 0;
+ dfa_stats_p->num_vuses += gimple_vuse (stmt) ? 1 : 0;
}
}
}
/* Insert VAR into the referenced_vars has table if it isn't present. */
if (referenced_var_check_and_insert (var))
{
- /* This is the first time we found this variable, annotate it with
- attributes that are intrinsic to the variable. */
-
- /* Tag's don't have DECL_INITIAL. */
- if (MTAG_P (var))
- return true;
-
/* Scan DECL_INITIAL for pointer variables as they may contain
address arithmetic referencing the address of other
variables.
void **loc;
unsigned int uid = DECL_UID (var);
- clear_call_clobbered (var);
- bitmap_clear_bit (gimple_call_used_vars (cfun), uid);
- if ((v_ann = var_ann (var)))
+ /* Preserve var_anns of globals. */
+ if (!is_global_var (var)
+ && (v_ann = var_ann (var)))
{
- /* Preserve var_anns of globals, but clear their alias info. */
- if (MTAG_P (var)
- || (!TREE_STATIC (var) && !DECL_EXTERNAL (var)))
- {
- ggc_free (v_ann);
- var->base.ann = NULL;
- }
- else
- {
- v_ann->mpt = NULL_TREE;
- v_ann->symbol_mem_tag = NULL_TREE;
- }
+ ggc_free (v_ann);
+ var->base.ann = NULL;
}
gcc_assert (DECL_P (var));
in.uid = uid;
bool seen_variable_array_ref = false;
bool seen_union = false;
- gcc_assert (!SSA_VAR_P (exp));
-
/* First get the final access size from just the outermost expression. */
if (TREE_CODE (exp) == COMPONENT_REF)
size_tree = DECL_SIZE (TREE_OPERAND (exp, 1));
return false;
}
-/* Return true, if the two memory references REF1 and REF2 may alias. */
-
-bool
-refs_may_alias_p (tree ref1, tree ref2)
-{
- tree base1, base2;
- HOST_WIDE_INT offset1 = 0, offset2 = 0;
- HOST_WIDE_INT size1 = -1, size2 = -1;
- HOST_WIDE_INT max_size1 = -1, max_size2 = -1;
- bool strict_aliasing_applies;
-
- gcc_assert ((SSA_VAR_P (ref1)
- || handled_component_p (ref1)
- || INDIRECT_REF_P (ref1)
- || TREE_CODE (ref1) == TARGET_MEM_REF)
- && (SSA_VAR_P (ref2)
- || handled_component_p (ref2)
- || INDIRECT_REF_P (ref2)
- || TREE_CODE (ref2) == TARGET_MEM_REF));
-
- /* Defer to TBAA if possible. */
- if (flag_strict_aliasing
- && !alias_sets_conflict_p (get_alias_set (ref1), get_alias_set (ref2)))
- return false;
-
- /* Decompose the references into their base objects and the access. */
- base1 = ref1;
- if (handled_component_p (ref1))
- base1 = get_ref_base_and_extent (ref1, &offset1, &size1, &max_size1);
- base2 = ref2;
- if (handled_component_p (ref2))
- base2 = get_ref_base_and_extent (ref2, &offset2, &size2, &max_size2);
-
- /* If both references are based on different variables, they cannot alias.
- If both references are based on the same variable, they cannot alias if
- the accesses do not overlap. */
- if (SSA_VAR_P (base1)
- && SSA_VAR_P (base2))
- {
- if (!operand_equal_p (base1, base2, 0))
- return false;
- return ranges_overlap_p (offset1, max_size1, offset2, max_size2);
- }
-
- /* If one base is a ref-all pointer weird things are allowed. */
- strict_aliasing_applies = (flag_strict_aliasing
- && (!INDIRECT_REF_P (base1)
- || get_alias_set (base1) != 0)
- && (!INDIRECT_REF_P (base2)
- || get_alias_set (base2) != 0));
-
- /* If strict aliasing applies the only way to access a scalar variable
- is through a pointer dereference or through a union (gcc extension). */
- if (strict_aliasing_applies
- && ((SSA_VAR_P (ref2)
- && !AGGREGATE_TYPE_P (TREE_TYPE (ref2))
- && !INDIRECT_REF_P (ref1)
- && TREE_CODE (TREE_TYPE (base1)) != UNION_TYPE)
- || (SSA_VAR_P (ref1)
- && !AGGREGATE_TYPE_P (TREE_TYPE (ref1))
- && !INDIRECT_REF_P (ref2)
- && TREE_CODE (TREE_TYPE (base2)) != UNION_TYPE)))
- return false;
-
- /* If both references are through the same type, or if strict aliasing
- doesn't apply they are through two same pointers, they do not alias
- if the accesses do not overlap. */
- if ((strict_aliasing_applies
- && (TYPE_MAIN_VARIANT (TREE_TYPE (base1))
- == TYPE_MAIN_VARIANT (TREE_TYPE (base2))))
- || (TREE_CODE (base1) == INDIRECT_REF
- && TREE_CODE (base2) == INDIRECT_REF
- && operand_equal_p (TREE_OPERAND (base1, 0),
- TREE_OPERAND (base2, 0), 0)))
- return ranges_overlap_p (offset1, max_size1, offset2, max_size2);
-
- /* If both are component references through pointers try to find a
- common base and apply offset based disambiguation. This handles
- for example
- struct A { int i; int j; } *q;
- struct B { struct A a; int k; } *p;
- disambiguating q->i and p->a.j. */
- if (strict_aliasing_applies
- && (TREE_CODE (base1) == INDIRECT_REF
- || TREE_CODE (base2) == INDIRECT_REF)
- && handled_component_p (ref1)
- && handled_component_p (ref2))
- {
- tree *refp;
- /* Now search for the type of base1 in the access path of ref2. This
- would be a common base for doing offset based disambiguation on. */
- refp = &ref2;
- while (handled_component_p (*refp)
- /* Note that the following is only conservative if there are
- never copies of types appearing as sub-structures. */
- && (TYPE_MAIN_VARIANT (TREE_TYPE (*refp))
- != TYPE_MAIN_VARIANT (TREE_TYPE (base1))))
- refp = &TREE_OPERAND (*refp, 0);
- if (TYPE_MAIN_VARIANT (TREE_TYPE (*refp))
- == TYPE_MAIN_VARIANT (TREE_TYPE (base1)))
- {
- HOST_WIDE_INT offadj, sztmp, msztmp;
- get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp);
- offset2 -= offadj;
- return ranges_overlap_p (offset1, max_size1, offset2, max_size2);
- }
- /* The other way around. */
- refp = &ref1;
- while (handled_component_p (*refp)
- && (TYPE_MAIN_VARIANT (TREE_TYPE (*refp))
- != TYPE_MAIN_VARIANT (TREE_TYPE (base2))))
- refp = &TREE_OPERAND (*refp, 0);
- if (TYPE_MAIN_VARIANT (TREE_TYPE (*refp))
- == TYPE_MAIN_VARIANT (TREE_TYPE (base2)))
- {
- HOST_WIDE_INT offadj, sztmp, msztmp;
- get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp);
- offset1 -= offadj;
- return ranges_overlap_p (offset1, max_size1, offset2, max_size2);
- }
- /* If we can be sure to catch all equivalent types in the search
- for the common base then we could return false here. In that
- case we would be able to disambiguate q->i and p->k. */
- }
-
- return true;
-}
-
-/* Given a stmt STMT that references memory, return the single stmt
- that is reached by following the VUSE -> VDEF link. Returns
- NULL_TREE, if there is no single stmt that defines all VUSEs of
- STMT.
- Note that for a stmt with a single virtual operand this may return
- a PHI node as well. Note that if all VUSEs are default definitions
- this function will return an empty statement. */
-
-gimple
-get_single_def_stmt (gimple stmt)
-{
- gimple def_stmt = NULL;
- tree use;
- ssa_op_iter iter;
-
- FOR_EACH_SSA_TREE_OPERAND (use, stmt, iter, SSA_OP_VIRTUAL_USES)
- {
- gimple tmp = SSA_NAME_DEF_STMT (use);
-
- /* ??? This is too simplistic for multiple virtual operands
- reaching different PHI nodes of the same basic blocks or for
- reaching all default definitions. */
- if (def_stmt
- && def_stmt != tmp
- && !(gimple_nop_p (def_stmt)
- && gimple_nop_p (tmp)))
- return NULL;
-
- def_stmt = tmp;
- }
-
- return def_stmt;
-}
-
-/* Given a PHI node of virtual operands, tries to eliminate cyclic
- reached definitions if they do not alias REF and returns the
- defining statement of the single virtual operand that flows in
- from a non-backedge. Returns NULL_TREE if such statement within
- the above conditions cannot be found. */
-
-gimple
-get_single_def_stmt_from_phi (tree ref, gimple phi)
-{
- tree def_arg = NULL_TREE;
- unsigned i;
-
- /* Find the single PHI argument that is not flowing in from a
- back edge and verify that the loop-carried definitions do
- not alias the reference we look for. */
- for (i = 0; i < gimple_phi_num_args (phi); ++i)
- {
- tree arg = PHI_ARG_DEF (phi, i);
- gimple def_stmt;
-
- if (!(gimple_phi_arg_edge (phi, i)->flags & EDGE_DFS_BACK))
- {
- /* Multiple non-back edges? Do not try to handle this. */
- if (def_arg)
- return NULL;
- def_arg = arg;
- continue;
- }
-
- /* Follow the definitions back to the original PHI node. Bail
- out once a definition is found that may alias REF. */
- def_stmt = SSA_NAME_DEF_STMT (arg);
- do
- {
- if (!is_gimple_assign (def_stmt)
- || refs_may_alias_p (ref, gimple_assign_lhs (def_stmt)))
- return NULL;
- /* ??? This will only work, reaching the PHI node again if
- there is a single virtual operand on def_stmt. */
- def_stmt = get_single_def_stmt (def_stmt);
- if (!def_stmt)
- return NULL;
- }
- while (def_stmt != phi);
- }
-
- return SSA_NAME_DEF_STMT (def_arg);
-}
-
-/* Return the single reference statement defining all virtual uses
- on STMT or NULL_TREE, if there are multiple defining statements.
- Take into account only definitions that alias REF if following
- back-edges when looking through a loop PHI node. */
-
-gimple
-get_single_def_stmt_with_phi (tree ref, gimple stmt)
-{
- switch (NUM_SSA_OPERANDS (stmt, SSA_OP_VIRTUAL_USES))
- {
- case 0:
- gcc_unreachable ();
-
- case 1:
- {
- gimple def_stmt = SSA_NAME_DEF_STMT (SINGLE_SSA_TREE_OPERAND
- (stmt, SSA_OP_VIRTUAL_USES));
- /* We can handle lookups over PHI nodes only for a single
- virtual operand. */
- if (gimple_code (def_stmt) == GIMPLE_PHI)
- return get_single_def_stmt_from_phi (ref, def_stmt);
- return def_stmt;
- }
-
- default:
- return get_single_def_stmt (stmt);
- }
-}
case CONST_DECL:
dump_child ("cnst", DECL_INITIAL (t));
break;
-
- case SYMBOL_MEMORY_TAG:
- case NAME_MEMORY_TAG:
- break;
case VAR_DECL:
case PARM_DECL:
similar updating as jump threading does. */
for (si = gsi_start_phis (bb); !gsi_end_p (si); gsi_next (&si))
- mark_sym_for_renaming (SSA_NAME_VAR (PHI_RESULT (gsi_stmt (si))));
+ {
+ tree res = PHI_RESULT (gsi_stmt (si));
+ gimple stmt;
+ imm_use_iterator iter;
+ use_operand_p use_p;
+
+ /* As we are going to delete this block we will release all
+ defs, which would make the immediate uses on their use stmts
+ invalid. Avoid that by replacing all uses with the bare
+ variable and updating the stmts. */
+ FOR_EACH_IMM_USE_STMT (stmt, iter, res)
+ {
+ FOR_EACH_IMM_USE_ON_STMT (use_p, iter)
+ SET_USE (use_p, SSA_NAME_VAR (res));
+ update_stmt (stmt);
+ }
+ mark_sym_for_renaming (SSA_NAME_VAR (res));
+ }
+ /* We want to thread over the current receiver to the next reachable
+ one. Do so by deleting all outgoing EH edges from all
+ predecessors of the receiver block we are going to delete and
+ rebuilding EH edges for them. */
while ((e = ei_safe_edge (ei_start (bb->preds))))
{
basic_block src = e->src;
if (!stmt_can_throw_internal (last_stmt (src)))
continue;
make_eh_edges (last_stmt (src));
+ /* Make sure to also rename symbols that feed into receivers
+ that are now newly reachable from current src. */
FOR_EACH_EDGE (e, ei, src->succs)
if (e->flags & EDGE_EH)
{
return fun && fun->gimple_df && fun->gimple_df->in_ssa_p;
}
-/* 'true' after aliases have been computed (see compute_may_aliases). */
-static inline bool
-gimple_aliases_computed_p (const struct function *fun)
-{
- gcc_assert (fun && fun->gimple_df);
- return fun->gimple_df->aliases_computed_p;
-}
-
-/* Addressable variables in the function. If bit I is set, then
- REFERENCED_VARS (I) has had its address taken. Note that
- CALL_CLOBBERED_VARS and ADDRESSABLE_VARS are not related. An
- addressable variable is not necessarily call-clobbered (e.g., a
- local addressable whose address does not escape) and not all
- call-clobbered variables are addressable (e.g., a local static
- variable). */
-static inline bitmap
-gimple_addressable_vars (const struct function *fun)
-{
- gcc_assert (fun && fun->gimple_df);
- return fun->gimple_df->addressable_vars;
-}
-
-/* Call clobbered variables in the function. If bit I is set, then
- REFERENCED_VARS (I) is call-clobbered. */
-static inline bitmap
-gimple_call_clobbered_vars (const struct function *fun)
-{
- gcc_assert (fun && fun->gimple_df);
- return fun->gimple_df->call_clobbered_vars;
-}
-
-/* Call-used variables in the function. If bit I is set, then
- REFERENCED_VARS (I) is call-used at pure function call-sites. */
-static inline bitmap
-gimple_call_used_vars (const struct function *fun)
-{
- gcc_assert (fun && fun->gimple_df);
- return fun->gimple_df->call_used_vars;
-}
-
/* Array of all variables referenced in the function. */
static inline htab_t
gimple_referenced_vars (const struct function *fun)
return fun->gimple_df->referenced_vars;
}
-/* Artificial variable used to model the effects of function calls. */
+/* Artificial variable used to model the effects of nonlocal
+ variables. */
static inline tree
-gimple_global_var (const struct function *fun)
+gimple_nonlocal_all (const struct function *fun)
{
gcc_assert (fun && fun->gimple_df);
- return fun->gimple_df->global_var;
+ return fun->gimple_df->nonlocal_all;
}
-/* Artificial variable used to model the effects of nonlocal
- variables. */
+/* Artificial variable used for the virtual operand FUD chain. */
static inline tree
-gimple_nonlocal_all (const struct function *fun)
+gimple_vop (const struct function *fun)
{
gcc_assert (fun && fun->gimple_df);
- return fun->gimple_df->nonlocal_all;
+ return fun->gimple_df->vop;
}
/* Initialize the hashtable iterator HTI to point to hashtable TABLE */
return ann->common.type;
}
-/* Return the may_aliases bitmap for variable VAR, or NULL if it has
- no may aliases. */
-static inline bitmap
-may_aliases (const_tree var)
-{
- return MTAG_ALIASES (var);
-}
-
/* Return the line number for EXPR, or return -1 if we have no line
number information for it. */
static inline int
}
-/* Return true if T (assumed to be a DECL) is a global variable. */
+/* Return true if T (assumed to be a DECL) is a global variable.
+ A variable is considered global if its storage is not automatic. */
static inline bool
is_global_var (const_tree t)
{
- if (MTAG_P (t))
- return MTAG_GLOBAL (t);
- else
- return (TREE_STATIC (t) || DECL_EXTERNAL (t));
+ return (TREE_STATIC (t) || DECL_EXTERNAL (t));
+}
+
+
+/* Return true if VAR may be aliased. A variable is considered
+ aliased if its address may be taken, either within the local TU
+ or possibly by another TU. */
+
+static inline bool
+may_be_aliased (const_tree var)
+{
+ return (TREE_PUBLIC (var) || DECL_EXTERNAL (var) || TREE_ADDRESSABLE (var));
}
+
/* PHI nodes should contain only ssa_names and invariants. A test
for ssa_name is definitely simpler; don't let invalid contents
slip in in the meantime. */
}
-/* Return the memory partition tag associated with symbol SYM. */
-
-static inline tree
-memory_partition (tree sym)
-{
- tree tag;
-
- /* MPTs belong to their own partition. */
- if (TREE_CODE (sym) == MEMORY_PARTITION_TAG)
- return sym;
-
- gcc_assert (!is_gimple_reg (sym));
- /* Autoparallelization moves statements from the original function (which has
- aliases computed) to the new one (which does not). When rebuilding
- operands for the statement in the new function, we do not want to
- record the memory partition tags of the original function. */
- if (!gimple_aliases_computed_p (cfun))
- return NULL_TREE;
- tag = get_var_ann (sym)->mpt;
-
-#if defined ENABLE_CHECKING
- if (tag)
- gcc_assert (TREE_CODE (tag) == MEMORY_PARTITION_TAG);
-#endif
-
- return tag;
-}
-
-/* Return true if NAME is a memory factoring SSA name (i.e., an SSA
- name for a memory partition. */
-
+/* Return true if VAR is clobbered by function calls. */
static inline bool
-factoring_name_p (const_tree name)
+is_call_clobbered (const_tree var)
{
- return TREE_CODE (SSA_NAME_VAR (name)) == MEMORY_PARTITION_TAG;
+ return (is_global_var (var)
+ || (may_be_aliased (var)
+ && pt_solution_includes (&cfun->gimple_df->escaped, var)));
}
/* Return true if VAR is used by function calls. */
static inline bool
is_call_used (const_tree var)
{
- return (var_ann (var)->call_clobbered
- || bitmap_bit_p (gimple_call_used_vars (cfun), DECL_UID (var)));
-}
-
-/* Return true if VAR is clobbered by function calls. */
-static inline bool
-is_call_clobbered (const_tree var)
-{
- return var_ann (var)->call_clobbered;
-}
-
-/* Mark variable VAR as being clobbered by function calls. */
-static inline void
-mark_call_clobbered (tree var, unsigned int escape_type)
-{
- var_ann (var)->escape_mask |= escape_type;
- var_ann (var)->call_clobbered = true;
- bitmap_set_bit (gimple_call_clobbered_vars (cfun), DECL_UID (var));
-}
-
-/* Clear the call-clobbered attribute from variable VAR. */
-static inline void
-clear_call_clobbered (tree var)
-{
- var_ann_t ann = var_ann (var);
- ann->escape_mask = 0;
- if (MTAG_P (var))
- MTAG_GLOBAL (var) = 0;
- var_ann (var)->call_clobbered = false;
- bitmap_clear_bit (gimple_call_clobbered_vars (cfun), DECL_UID (var));
+ return (is_call_clobbered (var)
+ || (may_be_aliased (var)
+ && pt_solution_includes (&cfun->gimple_df->callused, var)));
}
/* Return the common annotation for T. Return NULL if the annotation
ptr->uses = ptr->uses->next;
return use_p;
}
- if (ptr->vuses)
- {
- use_p = VUSE_OP_PTR (ptr->vuses, ptr->vuse_index);
- if (++(ptr->vuse_index) >= VUSE_NUM (ptr->vuses))
- {
- ptr->vuse_index = 0;
- ptr->vuses = ptr->vuses->next;
- }
- return use_p;
- }
- if (ptr->mayuses)
- {
- use_p = VDEF_OP_PTR (ptr->mayuses, ptr->mayuse_index);
- if (++(ptr->mayuse_index) >= VDEF_NUM (ptr->mayuses))
- {
- ptr->mayuse_index = 0;
- ptr->mayuses = ptr->mayuses->next;
- }
- return use_p;
- }
if (ptr->phi_i < ptr->num_phi)
{
return PHI_ARG_DEF_PTR (ptr->phi_stmt, (ptr->phi_i)++);
ptr->defs = ptr->defs->next;
return def_p;
}
- if (ptr->vdefs)
- {
- def_p = VDEF_RESULT_PTR (ptr->vdefs);
- ptr->vdefs = ptr->vdefs->next;
- return def_p;
- }
ptr->done = true;
return NULL_DEF_OPERAND_P;
}
ptr->uses = ptr->uses->next;
return val;
}
- if (ptr->vuses)
- {
- val = VUSE_OP (ptr->vuses, ptr->vuse_index);
- if (++(ptr->vuse_index) >= VUSE_NUM (ptr->vuses))
- {
- ptr->vuse_index = 0;
- ptr->vuses = ptr->vuses->next;
- }
- return val;
- }
- if (ptr->mayuses)
- {
- val = VDEF_OP (ptr->mayuses, ptr->mayuse_index);
- if (++(ptr->mayuse_index) >= VDEF_NUM (ptr->mayuses))
- {
- ptr->mayuse_index = 0;
- ptr->mayuses = ptr->mayuses->next;
- }
- return val;
- }
if (ptr->defs)
{
val = DEF_OP (ptr->defs);
ptr->defs = ptr->defs->next;
return val;
}
- if (ptr->vdefs)
- {
- val = VDEF_RESULT (ptr->vdefs);
- ptr->vdefs = ptr->vdefs->next;
- return val;
- }
ptr->done = true;
return NULL_TREE;
{
ptr->defs = NULL;
ptr->uses = NULL;
- ptr->vuses = NULL;
- ptr->vdefs = NULL;
- ptr->mayuses = NULL;
ptr->iter_type = ssa_op_iter_none;
ptr->phi_i = 0;
ptr->num_phi = 0;
ptr->phi_stmt = NULL;
ptr->done = true;
- ptr->vuse_index = 0;
- ptr->mayuse_index = 0;
}
/* Initialize the iterator PTR to the virtual defs in STMT. */
static inline void
op_iter_init (ssa_op_iter *ptr, gimple stmt, int flags)
{
- ptr->defs = (flags & SSA_OP_DEF) ? gimple_def_ops (stmt) : NULL;
- ptr->uses = (flags & SSA_OP_USE) ? gimple_use_ops (stmt) : NULL;
- ptr->vuses = (flags & SSA_OP_VUSE) ? gimple_vuse_ops (stmt) : NULL;
- ptr->vdefs = (flags & SSA_OP_VDEF) ? gimple_vdef_ops (stmt) : NULL;
- ptr->mayuses = (flags & SSA_OP_VMAYUSE) ? gimple_vdef_ops (stmt) : NULL;
+ /* We do not support iterating over virtual defs or uses without
+ iterating over defs or uses at the same time. */
+ gcc_assert ((!(flags & SSA_OP_VDEF) || (flags & SSA_OP_DEF))
+ && (!(flags & SSA_OP_VUSE) || (flags & SSA_OP_USE)));
+ ptr->defs = (flags & (SSA_OP_DEF|SSA_OP_VDEF)) ? gimple_def_ops (stmt) : NULL;
+ if (!(flags & SSA_OP_VDEF)
+ && ptr->defs
+ && gimple_vdef (stmt) != NULL_TREE)
+ ptr->defs = ptr->defs->next;
+ ptr->uses = (flags & (SSA_OP_USE|SSA_OP_VUSE)) ? gimple_use_ops (stmt) : NULL;
+ if (!(flags & SSA_OP_VUSE)
+ && ptr->uses
+ && gimple_vuse (stmt) != NULL_TREE)
+ ptr->uses = ptr->uses->next;
ptr->done = false;
ptr->phi_i = 0;
ptr->num_phi = 0;
ptr->phi_stmt = NULL;
- ptr->vuse_index = 0;
- ptr->mayuse_index = 0;
}
/* Initialize iterator PTR to the use operands in STMT based on FLAGS. Return
static inline use_operand_p
op_iter_init_use (ssa_op_iter *ptr, gimple stmt, int flags)
{
- gcc_assert ((flags & SSA_OP_ALL_DEFS) == 0);
+ gcc_assert ((flags & SSA_OP_ALL_DEFS) == 0
+ && (flags & SSA_OP_USE));
op_iter_init (ptr, stmt, flags);
ptr->iter_type = ssa_op_iter_use;
return op_iter_next_use (ptr);
static inline def_operand_p
op_iter_init_def (ssa_op_iter *ptr, gimple stmt, int flags)
{
- gcc_assert ((flags & SSA_OP_ALL_USES) == 0);
+ gcc_assert ((flags & SSA_OP_ALL_USES) == 0
+ && (flags & SSA_OP_DEF));
op_iter_init (ptr, stmt, flags);
ptr->iter_type = ssa_op_iter_def;
return op_iter_next_def (ptr);
return op_iter_next_tree (ptr);
}
-/* Get the next iterator mustdef value for PTR, returning the mustdef values in
- KILL and DEF. */
-static inline void
-op_iter_next_vdef (vuse_vec_p *use, def_operand_p *def,
- ssa_op_iter *ptr)
-{
-#ifdef ENABLE_CHECKING
- gcc_assert (ptr->iter_type == ssa_op_iter_vdef);
-#endif
- if (ptr->mayuses)
- {
- *def = VDEF_RESULT_PTR (ptr->mayuses);
- *use = VDEF_VECT (ptr->mayuses);
- ptr->mayuses = ptr->mayuses->next;
- return;
- }
-
- *def = NULL_DEF_OPERAND_P;
- *use = NULL;
- ptr->done = true;
- return;
-}
-
-
-static inline void
-op_iter_next_mustdef (use_operand_p *use, def_operand_p *def,
- ssa_op_iter *ptr)
-{
- vuse_vec_p vp;
- op_iter_next_vdef (&vp, def, ptr);
- if (vp != NULL)
- {
- gcc_assert (VUSE_VECT_NUM_ELEM (*vp) == 1);
- *use = VUSE_ELEMENT_PTR (*vp, 0);
- }
- else
- *use = NULL_USE_OPERAND_P;
-}
-
-/* Initialize iterator PTR to the operands in STMT. Return the first operands
- in USE and DEF. */
-static inline void
-op_iter_init_vdef (ssa_op_iter *ptr, gimple stmt, vuse_vec_p *use,
- def_operand_p *def)
-{
- gcc_assert (gimple_code (stmt) != GIMPLE_PHI);
-
- op_iter_init (ptr, stmt, SSA_OP_VMAYUSE);
- ptr->iter_type = ssa_op_iter_vdef;
- op_iter_next_vdef (use, def, ptr);
-}
-
/* If there is a single operand in STMT matching FLAGS, return it. Otherwise
return NULL. */
}
-/* This routine will compare all the operands matching FLAGS in STMT1 to those
- in STMT2. TRUE is returned if they are the same. STMTs can be NULL. */
-static inline bool
-compare_ssa_operands_equal (gimple stmt1, gimple stmt2, int flags)
-{
- ssa_op_iter iter1, iter2;
- tree op1 = NULL_TREE;
- tree op2 = NULL_TREE;
- bool look1, look2;
-
- if (stmt1 == stmt2)
- return true;
-
- look1 = stmt1 != NULL;
- look2 = stmt2 != NULL;
-
- if (look1)
- {
- op1 = op_iter_init_tree (&iter1, stmt1, flags);
- if (!look2)
- return op_iter_done (&iter1);
- }
- else
- clear_and_done_ssa_iter (&iter1);
-
- if (look2)
- {
- op2 = op_iter_init_tree (&iter2, stmt2, flags);
- if (!look1)
- return op_iter_done (&iter2);
- }
- else
- clear_and_done_ssa_iter (&iter2);
-
- while (!op_iter_done (&iter1) && !op_iter_done (&iter2))
- {
- if (op1 != op2)
- return false;
- op1 = op_iter_next_tree (&iter1);
- op2 = op_iter_next_tree (&iter2);
- }
-
- return (op_iter_done (&iter1) && op_iter_done (&iter2));
-}
-
-
/* If there is a single DEF in the PHI node which matches FLAG, return it.
Otherwise return NULL_DEF_OPERAND_P. */
static inline tree
comp = (is_gimple_reg (phi_def) ? SSA_OP_DEF : SSA_OP_VIRTUAL_DEFS);
- /* If the PHI node doesn't the operand type we care about, we're done. */
+ /* If the PHI node doesn't have the operand type we care about,
+ we're done. */
if ((flags & comp) == 0)
{
ptr->done = true;
- return NULL_USE_OPERAND_P;
+ return NULL_DEF_OPERAND_P;
}
ptr->iter_type = ssa_op_iter_def;
}
else
{
- FOR_EACH_SSA_USE_OPERAND (use_p, head_stmt, op_iter, flag)
- if (USE_FROM_PTR (use_p) == use)
- last_p = move_use_after_head (use_p, head, last_p);
+ if (flag == SSA_OP_USE)
+ {
+ FOR_EACH_SSA_USE_OPERAND (use_p, head_stmt, op_iter, flag)
+ if (USE_FROM_PTR (use_p) == use)
+ last_p = move_use_after_head (use_p, head, last_p);
+ }
+ else if ((use_p = gimple_vuse_op (head_stmt)) != NULL_USE_OPERAND_P)
+ {
+ if (USE_FROM_PTR (use_p) == use)
+ last_p = move_use_after_head (use_p, head, last_p);
+ }
}
/* Link iter node in after last_p. */
if (imm->iter_node.prev != NULL)
imm->iter_node.prev = NULL_USE_OPERAND_P;
imm->iter_node.next = NULL_USE_OPERAND_P;
imm->iter_node.loc.stmt = NULL;
- imm->iter_node.use = NULL_USE_OPERAND_P;
+ imm->iter_node.use = NULL;
if (end_imm_use_stmt_p (imm))
return NULL;
if (TREE_CODE (var) == SSA_NAME)
var = SSA_NAME_VAR (var);
- if (MTAG_P (var))
- return false;
-
return TREE_READONLY (var) && (TREE_STATIC (var) || DECL_EXTERNAL (var));
}
return false;
}
-/* Return the memory tag associated with symbol SYM. */
-
-static inline tree
-symbol_mem_tag (tree sym)
-{
- tree tag = get_var_ann (sym)->symbol_mem_tag;
-
-#if defined ENABLE_CHECKING
- if (tag)
- gcc_assert (TREE_CODE (tag) == SYMBOL_MEMORY_TAG);
-#endif
-
- return tag;
-}
-
-
-/* Set the memory tag associated with symbol SYM. */
-
-static inline void
-set_symbol_mem_tag (tree sym, tree tag)
-{
-#if defined ENABLE_CHECKING
- if (tag)
- gcc_assert (TREE_CODE (tag) == SYMBOL_MEMORY_TAG);
-#endif
-
- get_var_ann (sym)->symbol_mem_tag = tag;
-}
-
/* Accessor to tree-ssa-operands.c caches. */
static inline struct ssa_operands *
gimple_ssa_operands (const struct function *fun)
return &fun->gimple_df->ssa_operands;
}
-/* Map describing reference statistics for function FN. */
-static inline struct mem_ref_stats_d *
-gimple_mem_ref_stats (const struct function *fn)
-{
- return &fn->gimple_df->mem_ref_stats;
-}
-
/* Given an edge_var_map V, return the PHI arg definition. */
static inline tree
#include "tree-ssa-operands.h"
#include "cgraph.h"
#include "ipa-reference.h"
+#include "tree-ssa-alias.h"
/* Forward declare structures for the garbage collector GTY markers. */
#ifndef GCC_BASIC_BLOCK_H
#endif
struct static_var_ann_d;
-/* The reasons a variable may escape a function. */
-enum escape_type
-{
- NO_ESCAPE = 0, /* Doesn't escape. */
- ESCAPE_STORED_IN_GLOBAL = 1 << 0,
- ESCAPE_TO_ASM = 1 << 1, /* Passed by address to an assembly
- statement. */
- ESCAPE_TO_CALL = 1 << 2, /* Escapes to a function call. */
- ESCAPE_BAD_CAST = 1 << 3, /* Cast from pointer to integer */
- ESCAPE_TO_RETURN = 1 << 4, /* Returned from function. */
- ESCAPE_TO_PURE_CONST = 1 << 5, /* Escapes to a pure or constant
- function call. */
- ESCAPE_IS_GLOBAL = 1 << 6, /* Is a global variable. */
- ESCAPE_IS_PARM = 1 << 7, /* Is an incoming function argument. */
- ESCAPE_UNKNOWN = 1 << 8 /* We believe it escapes for
- some reason not enumerated
- above. */
-};
-
-/* Memory reference statistics for individual memory symbols,
- collected during alias analysis. */
-struct mem_sym_stats_d GTY(())
-{
- /* Memory symbol. */
- tree var;
-
- /* Nonzero if this entry has been assigned a partition. */
- unsigned int partitioned_p : 1;
-
- /* Nonzero if VAR is a memory partition tag that already contains
- call-clobbered variables in its partition set. */
- unsigned int has_call_clobbered_vars : 1;
-
- /* Number of direct reference sites. A direct reference to VAR is any
- reference of the form 'VAR = ' or ' = VAR'. For GIMPLE reg
- pointers, this is the number of sites where the pointer is
- dereferenced. */
- long num_direct_writes;
- long num_direct_reads;
-
- /* Number of indirect reference sites. An indirect reference to VAR
- is any reference via a pointer that contains VAR in its points-to
- set or, in the case of call-clobbered symbols, a function call. */
- long num_indirect_writes;
- long num_indirect_reads;
-
- /* Execution frequency. This is the sum of the execution
- frequencies of all the statements that reference this object
- weighted by the number of references in each statement. This is
- the main key used to sort the list of symbols to partition.
- Symbols with high execution frequencies are put at the bottom of
- the work list (ie, they are partitioned last).
- Execution frequencies are taken directly from each basic block,
- so compiling with PGO enabled will increase the precision of this
- estimate. */
- long frequency_reads;
- long frequency_writes;
-
- /* Set of memory tags that contain VAR in their alias set. */
- bitmap parent_tags;
-};
-
-typedef struct mem_sym_stats_d *mem_sym_stats_t;
-DEF_VEC_P(mem_sym_stats_t);
-DEF_VEC_ALLOC_P(mem_sym_stats_t, heap);
-
-/* Memory reference statistics collected during alias analysis. */
-struct mem_ref_stats_d GTY(())
-{
- /* Number of statements that make memory references. */
- long num_mem_stmts;
-
- /* Number of statements that make function calls. */
- long num_call_sites;
-
- /* Number of statements that make calls to pure/const functions. */
- long num_pure_const_call_sites;
-
- /* Number of ASM statements. */
- long num_asm_sites;
-
- /* Estimated number of virtual operands needed as computed by
- compute_memory_partitions. */
- long num_vuses;
- long num_vdefs;
-
- /* This maps every symbol used to make "memory" references
- (pointers, arrays, structures, etc) to an instance of struct
- mem_sym_stats_d describing reference statistics for the symbol. */
- struct pointer_map_t * GTY((skip)) mem_sym_stats;
-};
-
/* Gimple dataflow datastructure. All publicly available fields shall have
gimple_ accessor defined in tree-flow-inline.h, all publicly modifiable
/* Array of all SSA_NAMEs used in the function. */
VEC(tree,gc) *ssa_names;
- /* Artificial variable used to model the effects of function calls. */
- tree global_var;
+ /* Artificial variable used for the virtual operand FUD chain. */
+ tree vop;
/* Artificial variable used to model the effects of nonlocal
variables. */
tree nonlocal_all;
- /* Call clobbered variables in the function. If bit I is set, then
- REFERENCED_VARS (I) is call-clobbered. */
- bitmap call_clobbered_vars;
-
- /* Call-used variables in the function. If bit I is set, then
- REFERENCED_VARS (I) is call-used at pure function call-sites. */
- bitmap call_used_vars;
+ /* The PTA solution for the ESCAPED artificial variable. */
+ struct pt_solution escaped;
- /* Addressable variables in the function. If bit I is set, then
- REFERENCED_VARS (I) has had its address taken. Note that
- CALL_CLOBBERED_VARS and ADDRESSABLE_VARS are not related. An
- addressable variable is not necessarily call-clobbered (e.g., a
- local addressable whose address does not escape) and not all
- call-clobbered variables are addressable (e.g., a local static
- variable). */
- bitmap addressable_vars;
+ /* The PTA solution for the CALLUSED artificial variable. */
+ struct pt_solution callused;
/* Free list of SSA_NAMEs. */
tree free_ssanames;
for this variable with an empty defining statement. */
htab_t GTY((param_is (union tree_node))) default_defs;
- /* 'true' after aliases have been computed (see compute_may_aliases). */
- unsigned int aliases_computed_p : 1;
+ /* Symbols whose SSA form needs to be updated or created for the first
+ time. */
+ bitmap syms_to_rename;
/* True if the code is in ssa form. */
unsigned int in_ssa_p : 1;
struct ssa_operands ssa_operands;
-
- /* Memory reference statistics collected during alias analysis.
- This information is used to drive the memory partitioning
- heuristics in compute_memory_partitions. */
- struct mem_ref_stats_d mem_ref_stats;
};
/* Accessors for internal use only. Generic code should use abstraction
#define SSANAMES(fun) (fun)->gimple_df->ssa_names
#define MODIFIED_NORETURN_CALLS(fun) (fun)->gimple_df->modified_noreturn_calls
#define DEFAULT_DEFS(fun) (fun)->gimple_df->default_defs
+#define SYMS_TO_RENAME(fun) (fun)->gimple_df->syms_to_rename
typedef struct
{
/* Aliasing information for SSA_NAMEs representing pointer variables. */
struct ptr_info_def GTY(())
{
- /* Mask of reasons this pointer's value escapes the function. */
- ENUM_BITFIELD (escape_type) escape_mask : 9;
-
- /* Nonzero if points-to analysis couldn't determine where this pointer
- is pointing to. */
- unsigned int pt_anything : 1;
-
- /* Nonzero if the value of this pointer escapes the current function. */
- unsigned int value_escapes_p : 1;
-
- /* Nonzero if a memory tag is needed for this pointer. This is
- true if this pointer is eventually dereferenced. */
- unsigned int memory_tag_needed : 1;
-
- /* Nonzero if this pointer is really dereferenced. */
- unsigned int is_dereferenced : 1;
-
- /* Nonzero if this pointer points to a global variable. */
- unsigned int pt_global_mem : 1;
-
- /* Nonzero if this pointer points to NULL. */
- unsigned int pt_null : 1;
-
- /* Set of variables that this pointer may point to. */
- bitmap pt_vars;
-
- /* If this pointer has been dereferenced, and points-to information is
- more precise than type-based aliasing, indirect references to this
- pointer will be represented by this memory tag, instead of the type
- tag computed by TBAA. */
- tree name_mem_tag;
+ /* The points-to solution, TBAA-pruned if the pointer is dereferenced. */
+ struct pt_solution pt;
};
states. */
ENUM_BITFIELD (need_phi_state) need_phi_state : 2;
- /* Used during operand processing to determine if this variable is already
- in the VUSE list. */
- unsigned in_vuse_list : 1;
-
- /* Used during operand processing to determine if this variable is already
- in the VDEF list. */
- unsigned in_vdef_list : 1;
-
/* True for HEAP artificial variables. These variables represent
the memory area allocated by a call to malloc. */
unsigned is_heapvar : 1;
- /* True if the variable is call clobbered. */
- unsigned call_clobbered : 1;
-
/* This field describes several "no alias" attributes that some
symbols are known to have. See the enum's definition for more
information on each attribute. */
ENUM_BITFIELD (noalias_state) noalias_state : 2;
- /* Mask of values saying the reasons why this variable has escaped
- the function. */
- ENUM_BITFIELD (escape_type) escape_mask : 9;
-
- /* Memory partition tag assigned to this symbol. */
- tree mpt;
-
- /* If this variable is a pointer P that has been dereferenced, this
- field is an artificial variable that represents the memory
- location *P. Every other pointer Q that is type-compatible with
- P will also have the same memory tag. If the variable is not a
- pointer or if it is never dereferenced, this must be NULL.
- FIXME, do we really need this here? How much slower would it be
- to convert to hash table? */
- tree symbol_mem_tag;
-
/* Used when going out of SSA form to indicate which partition this
variable represents storage for. */
unsigned partition;
static inline function_ann_t get_function_ann (tree);
static inline enum tree_ann_type ann_type (tree_ann_t);
static inline void update_stmt (gimple);
-static inline bitmap may_aliases (const_tree);
static inline int get_lineno (const_gimple);
/*---------------------------------------------------------------------------
extern void set_default_def (tree, tree);
extern tree gimple_default_def (struct function *, tree);
extern bool stmt_references_abnormal_ssa_name (gimple);
-extern bool refs_may_alias_p (tree, tree);
-extern gimple get_single_def_stmt (gimple);
-extern gimple get_single_def_stmt_from_phi (tree, gimple);
-extern gimple get_single_def_stmt_with_phi (tree, gimple);
+extern tree get_ref_base_and_extent (tree, HOST_WIDE_INT *,
+ HOST_WIDE_INT *, HOST_WIDE_INT *);
/* In tree-phinodes.c */
extern void reserve_phi_args_for_new_edge (basic_block);
extern bool gimple_seq_may_fallthru (gimple_seq);
extern bool gimple_stmt_may_fallthru (gimple);
-/* In tree-ssa-alias.c */
-extern unsigned int compute_may_aliases (void);
-extern void dump_may_aliases_for (FILE *, tree);
-extern void debug_may_aliases_for (tree);
-extern void dump_alias_info (FILE *);
-extern void debug_alias_info (void);
-extern void dump_points_to_info (FILE *);
-extern void debug_points_to_info (void);
-extern void dump_points_to_info_for (FILE *, tree);
-extern void debug_points_to_info_for (tree);
-extern bool may_be_aliased (tree);
-extern bool may_alias_p (tree, alias_set_type, tree, alias_set_type, bool);
-extern struct ptr_info_def *get_ptr_info (tree);
-extern bool may_point_to_global_var (tree);
-extern void new_type_alias (tree, tree, tree);
-extern void count_uses_and_derefs (tree, gimple, unsigned *, unsigned *,
- unsigned *);
-static inline bool ref_contains_array_ref (const_tree);
-static inline bool array_ref_contains_indirect_ref (const_tree);
-extern tree get_ref_base_and_extent (tree, HOST_WIDE_INT *,
- HOST_WIDE_INT *, HOST_WIDE_INT *);
-extern tree create_tag_raw (enum tree_code, tree, const char *);
-extern void delete_mem_ref_stats (struct function *);
-extern void dump_mem_ref_stats (FILE *);
-extern void debug_mem_ref_stats (void);
-extern void debug_memory_partitions (void);
-extern void debug_mem_sym_stats (tree var);
-extern void dump_mem_sym_stats_for_var (FILE *, tree);
-extern void debug_all_mem_sym_stats (void);
-
-/* Call-back function for walk_use_def_chains(). At each reaching
- definition, a function with this prototype is called. */
-typedef bool (*walk_use_def_chains_fn) (tree, gimple, void *);
-
-/* In tree-ssa-alias-warnings.c */
-extern void strict_aliasing_warning_backend (void);
-
/* In tree-ssa.c */
extern void flush_pending_stmts (edge);
extern void verify_ssa (bool);
extern void delete_tree_ssa (void);
-extern void walk_use_def_chains (tree, walk_use_def_chains_fn, void *, bool);
extern bool ssa_undefined_value_p (tree);
+extern void execute_update_addresses_taken (bool);
+
+/* Call-back function for walk_use_def_chains(). At each reaching
+ definition, a function with this prototype is called. */
+typedef bool (*walk_use_def_chains_fn) (tree, gimple, void *);
+
+extern void walk_use_def_chains (tree, walk_use_def_chains_fn, void *, bool);
/* In tree-into-ssa.c */
void delete_update_ssa (void);
void register_new_name_mapping (tree, tree);
tree create_new_def_for (tree, gimple, def_operand_p);
-bool need_ssa_update_p (void);
+bool need_ssa_update_p (struct function *);
bool name_mappings_registered_p (void);
bool name_registered_for_update_p (tree);
bitmap ssa_names_to_replace (void);
/* In tree-flow-inline.h */
static inline bool is_call_clobbered (const_tree);
-static inline void mark_call_clobbered (tree, unsigned int);
static inline void set_is_used (tree);
static inline bool unmodifiable_var_p (const_tree);
+static inline bool ref_contains_array_ref (const_tree);
+static inline bool array_ref_contains_indirect_ref (const_tree);
/* In tree-eh.c */
extern void make_eh_edges (gimple);
tree gimple_fold_indirect_ref (tree);
void mark_addressable (tree);
-/* In tree-ssa-structalias.c */
-bool find_what_p_points_to (tree);
-bool clobber_what_escaped (void);
-void compute_call_used_vars (void);
-
/* In tree-ssa-live.c */
extern void remove_unused_locals (void);
extern void dump_scope_blocks (FILE *, int);
void get_address_description (tree, struct mem_address *);
tree maybe_fold_tmr (tree);
-void init_alias_heapvars (void);
-void delete_alias_heapvars (void);
unsigned int execute_fixup_cfg (void);
#include "tree-flow-inline.h"
else
walk_gimple_op (copy, remap_gimple_op_r, &wi);
+ /* Clear the copied virtual operands. We are not remapping them here
+ but are going to recreate them from scratch. */
+ if (gimple_has_mem_ops (copy))
+ {
+ gimple_set_vdef (copy, NULL_TREE);
+ gimple_set_vuse (copy, NULL_TREE);
+ }
+
/* We have to handle EH region remapping of GIMPLE_RESX specially because
the region number is not an operand. */
if (gimple_code (stmt) == GIMPLE_RESX && id->eh_region_offset)
pointer_map_destroy (id->decl_map);
id->decl_map = st;
+ /* Unlink the call's virtual operands before replacing it. */
+ unlink_stmt_vdef (stmt);
+
/* If the inlined function returns a result that we care about,
substitute the GIMPLE_CALL with an assignment of the return
variable to the LHS of the call. That is, if STMT was
stmt = gimple_build_assign (gimple_call_lhs (stmt), use_retvar);
gsi_replace (&stmt_gsi, stmt, false);
if (gimple_in_ssa_p (cfun))
- {
- update_stmt (stmt);
- mark_symbols_for_renaming (stmt);
- }
+ mark_symbols_for_renaming (stmt);
maybe_clean_or_replace_eh_stmt (old_stmt, stmt);
}
else
undefined via a move. */
stmt = gimple_build_assign (gimple_call_lhs (stmt), def);
gsi_replace (&stmt_gsi, stmt, true);
- update_stmt (stmt);
}
else
{
/* Clean up. */
pointer_map_destroy (id.decl_map);
+ free_dominance_info (CDI_DOMINATORS);
+ free_dominance_info (CDI_POST_DOMINATORS);
if (!update_clones)
{
fold_marked_statements (0, id.statements_to_fold);
pointer_set_destroy (id.statements_to_fold);
fold_cond_expr_cond ();
- }
- if (gimple_in_ssa_p (cfun))
- {
- free_dominance_info (CDI_DOMINATORS);
- free_dominance_info (CDI_POST_DOMINATORS);
- if (!update_clones)
- delete_unreachable_blocks ();
+ delete_unreachable_blocks ();
update_ssa (TODO_update_ssa);
- if (!update_clones)
- {
- fold_cond_expr_cond ();
- if (need_ssa_update_p ())
- update_ssa (TODO_update_ssa);
- }
}
- free_dominance_info (CDI_DOMINATORS);
- free_dominance_info (CDI_POST_DOMINATORS);
VEC_free (gimple, heap, init_stmts);
pop_cfun ();
current_function_decl = old_current_function_decl;
static sbitmap new_ssa_names;
-/* Symbols whose SSA form needs to be updated or created for the first
- time. */
-static bitmap syms_to_rename;
-
/* Subset of SYMS_TO_RENAME. Contains all the GIMPLE register symbols
that have been marked for renaming. */
static bitmap regs_to_rename;
then REPL_TBL[N_i] = { O_1, O_2, ..., O_j }. */
static htab_t repl_tbl;
-/* true if register_new_name_mapping needs to initialize the data
- structures needed by update_ssa. */
-static bool need_to_initialize_update_ssa_p = true;
-
-/* true if update_ssa needs to update virtual operands. */
-static bool need_to_update_vops_p = false;
+/* The function the SSA updating data structures have been initialized for.
+ NULL if they need to be initialized by register_new_name_mapping. */
+static struct function *update_ssa_initialized_fn = NULL;
/* Statistics kept by update_ssa to use in the virtual mapping
heuristic. If the number of virtual mappings is beyond certain
static inline bool
symbol_marked_for_renaming (tree sym)
{
- return bitmap_bit_p (syms_to_rename, DECL_UID (sym));
+ return bitmap_bit_p (SYMS_TO_RENAME (cfun), DECL_UID (sym));
}
is_old_name (tree name)
{
unsigned ver = SSA_NAME_VERSION (name);
+ if (!new_ssa_names)
+ return false;
return ver < new_ssa_names->n_bits && TEST_BIT (old_ssa_names, ver);
}
is_new_name (tree name)
{
unsigned ver = SSA_NAME_VERSION (name);
+ if (!new_ssa_names)
+ return false;
return ver < new_ssa_names->n_bits && TEST_BIT (new_ssa_names, ver);
}
{
tree sym;
- need_to_update_vops_p = true;
-
update_ssa_stats.num_virtual_mappings++;
update_ssa_stats.num_virtual_symbols++;
fprintf (file, " ");
}
- fprintf (file, "}\n");
+ fprintf (file, "}");
}
else
- fprintf (file, "NIL\n");
+ fprintf (file, "NIL");
}
debug_decl_set (bitmap set)
{
dump_decl_set (stderr, set);
+ fprintf (stderr, "\n");
}
fprintf (file, "\n\nCurrent reaching definitions\n\n");
FOR_EACH_REFERENCED_VAR (var, i)
- if (syms_to_rename == NULL || bitmap_bit_p (syms_to_rename, DECL_UID (var)))
+ if (SYMS_TO_RENAME (cfun) == NULL
+ || bitmap_bit_p (SYMS_TO_RENAME (cfun), DECL_UID (var)))
{
fprintf (file, "CURRDEF (");
print_generic_expr (file, var, 0);
/* Rewrite USES included in OLD_SSA_NAMES and USES whose underlying
symbol is marked for renaming. */
if (rewrite_uses_p (stmt))
- {
- FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, SSA_OP_USE)
- maybe_replace_use (use_p);
-
- if (need_to_update_vops_p)
- FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, SSA_OP_VIRTUAL_USES)
- maybe_replace_use (use_p);
- }
+ FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, SSA_OP_ALL_USES)
+ maybe_replace_use (use_p);
/* Register definitions of names in NEW_SSA_NAMES and OLD_SSA_NAMES.
Also register definitions for names whose underlying symbol is
marked for renaming. */
if (register_defs_p (stmt))
- {
- FOR_EACH_SSA_DEF_OPERAND (def_p, stmt, iter, SSA_OP_DEF)
- maybe_register_def (def_p, stmt);
-
- if (need_to_update_vops_p)
- FOR_EACH_SSA_DEF_OPERAND (def_p, stmt, iter, SSA_OP_VIRTUAL_DEFS)
- maybe_register_def (def_p, stmt);
- }
+ FOR_EACH_SSA_DEF_OPERAND (def_p, stmt, iter, SSA_OP_ALL_DEFS)
+ maybe_register_def (def_p, stmt);
}
0, /* properties_destroyed */
0, /* todo_flags_start */
TODO_dump_func
+ | TODO_update_ssa_only_virtuals
| TODO_verify_ssa
| TODO_remove_unused_locals /* todo_flags_finish */
}
unsigned i = 0;
bitmap_iterator bi;
- if (!need_ssa_update_p ())
+ if (!need_ssa_update_p (cfun))
return;
if (new_ssa_names && sbitmap_first_set_bit (new_ssa_names) >= 0)
update_ssa_stats.num_virtual_symbols);
}
- if (syms_to_rename && !bitmap_empty_p (syms_to_rename))
+ if (!bitmap_empty_p (SYMS_TO_RENAME (cfun)))
{
fprintf (file, "\n\nSymbols to be put in SSA form\n\n");
- dump_decl_set (file, syms_to_rename);
+ dump_decl_set (file, SYMS_TO_RENAME (cfun));
+ fprintf (file, "\n");
}
if (names_to_release && !bitmap_empty_p (names_to_release))
/* Initialize data structures used for incremental SSA updates. */
static void
-init_update_ssa (void)
+init_update_ssa (struct function *fn)
{
/* Reserve more space than the current number of names. The calls to
add_new_name_mapping are typically done after creating new SSA
sbitmap_zero (new_ssa_names);
repl_tbl = htab_create (20, repl_map_hash, repl_map_eq, repl_map_free);
- need_to_initialize_update_ssa_p = false;
- need_to_update_vops_p = false;
- syms_to_rename = BITMAP_ALLOC (NULL);
regs_to_rename = BITMAP_ALLOC (NULL);
mem_syms_to_rename = BITMAP_ALLOC (NULL);
names_to_release = NULL;
memset (&update_ssa_stats, 0, sizeof (update_ssa_stats));
update_ssa_stats.virtual_symbols = BITMAP_ALLOC (NULL);
+ update_ssa_initialized_fn = fn;
}
htab_delete (repl_tbl);
repl_tbl = NULL;
- need_to_initialize_update_ssa_p = true;
- need_to_update_vops_p = false;
- BITMAP_FREE (syms_to_rename);
+ bitmap_clear (SYMS_TO_RENAME (update_ssa_initialized_fn));
BITMAP_FREE (regs_to_rename);
BITMAP_FREE (mem_syms_to_rename);
BITMAP_FREE (update_ssa_stats.virtual_symbols);
BITMAP_FREE (blocks_with_phis_to_rewrite);
BITMAP_FREE (blocks_to_update);
+ update_ssa_initialized_fn = NULL;
}
update_ssa. */
void
-register_new_name_mapping (tree new_Tree ATTRIBUTE_UNUSED, tree old ATTRIBUTE_UNUSED)
+register_new_name_mapping (tree new_tree, tree old)
{
- if (need_to_initialize_update_ssa_p)
- init_update_ssa ();
+ if (!update_ssa_initialized_fn)
+ init_update_ssa (cfun);
+
+ gcc_assert (update_ssa_initialized_fn == cfun);
- add_new_name_mapping (new_Tree, old);
+ add_new_name_mapping (new_tree, old);
}
void
mark_sym_for_renaming (tree sym)
{
- if (need_to_initialize_update_ssa_p)
- init_update_ssa ();
-
- bitmap_set_bit (syms_to_rename, DECL_UID (sym));
-
- if (!is_gimple_reg (sym))
- {
- need_to_update_vops_p = true;
- if (memory_partition (sym))
- bitmap_set_bit (syms_to_rename, DECL_UID (memory_partition (sym)));
- }
+ bitmap_set_bit (SYMS_TO_RENAME (cfun), DECL_UID (sym));
}
if (set == NULL || bitmap_empty_p (set))
return;
- if (need_to_initialize_update_ssa_p)
- init_update_ssa ();
-
EXECUTE_IF_SET_IN_BITMAP (set, 0, i, bi)
mark_sym_for_renaming (referenced_var (i));
}
-/* Return true if there is any work to be done by update_ssa. */
+/* Return true if there is any work to be done by update_ssa
+ for function FN. */
bool
-need_ssa_update_p (void)
+need_ssa_update_p (struct function *fn)
{
- return syms_to_rename || old_ssa_names || new_ssa_names;
+ gcc_assert (fn != NULL);
+ return (update_ssa_initialized_fn == fn
+ || (fn->gimple_df
+ && !bitmap_empty_p (SYMS_TO_RENAME (fn))));
}
/* Return true if SSA name mappings have been registered for SSA updating. */
bool
name_mappings_registered_p (void)
{
+ if (!update_ssa_initialized_fn)
+ return false;
+
+ gcc_assert (update_ssa_initialized_fn == cfun);
+
return repl_tbl && htab_elements (repl_tbl) > 0;
}
bool
name_registered_for_update_p (tree n ATTRIBUTE_UNUSED)
{
- if (!need_ssa_update_p ())
+ if (!update_ssa_initialized_fn)
return false;
- return is_new_name (n)
- || is_old_name (n)
- || symbol_marked_for_renaming (SSA_NAME_VAR (n));
+ gcc_assert (update_ssa_initialized_fn == cfun);
+
+ return is_new_name (n) || is_old_name (n);
}
bitmap ret;
sbitmap_iterator sbi;
+ gcc_assert (update_ssa_initialized_fn == NULL
+ || update_ssa_initialized_fn == cfun);
+
ret = BITMAP_ALLOC (NULL);
EXECUTE_IF_SET_IN_SBITMAP (old_ssa_names, 0, i, sbi)
bitmap_set_bit (ret, i);
void
release_ssa_name_after_update_ssa (tree name)
{
- gcc_assert (!need_to_initialize_update_ssa_p);
+ gcc_assert (cfun && update_ssa_initialized_fn == cfun);
if (names_to_release == NULL)
names_to_release = BITMAP_ALLOC (NULL);
bool insert_phi_p;
sbitmap_iterator sbi;
- if (!need_ssa_update_p ())
+ if (!need_ssa_update_p (cfun))
return;
timevar_push (TV_TREE_SSA_INCREMENTAL);
+ if (!update_ssa_initialized_fn)
+ init_update_ssa (cfun);
+ gcc_assert (update_ssa_initialized_fn == cfun);
+
blocks_with_phis_to_rewrite = BITMAP_ALLOC (NULL);
if (!phis_to_rewrite)
phis_to_rewrite = VEC_alloc (gimple_vec, heap, last_basic_block);
/* If there are symbols to rename, identify those symbols that are
GIMPLE registers into the set REGS_TO_RENAME and those that are
memory symbols into the set MEM_SYMS_TO_RENAME. */
- if (!bitmap_empty_p (syms_to_rename))
+ if (!bitmap_empty_p (SYMS_TO_RENAME (cfun)))
{
unsigned i;
bitmap_iterator bi;
- EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
+ EXECUTE_IF_SET_IN_BITMAP (SYMS_TO_RENAME (cfun), 0, i, bi)
{
tree sym = referenced_var (i);
if (is_gimple_reg (sym))
bitmap_set_bit (regs_to_rename, i);
- else
- {
- /* Memory partitioning information may have been
- computed after the symbol was marked for renaming,
- if SYM is inside a partition also mark the partition
- for renaming. */
- tree mpt = memory_partition (sym);
- if (mpt)
- bitmap_set_bit (syms_to_rename, DECL_UID (mpt));
- }
}
/* Memory symbols are those not in REGS_TO_RENAME. */
- bitmap_and_compl (mem_syms_to_rename, syms_to_rename, regs_to_rename);
+ bitmap_and_compl (mem_syms_to_rename,
+ SYMS_TO_RENAME (cfun), regs_to_rename);
}
/* If there are names defined in the replacement table, prepare
removal, and there are no symbols to rename, then there's
nothing else to do. */
if (sbitmap_first_set_bit (new_ssa_names) < 0
- && bitmap_empty_p (syms_to_rename))
+ && bitmap_empty_p (SYMS_TO_RENAME (cfun)))
goto done;
}
/* Next, determine the block at which to start the renaming process. */
- if (!bitmap_empty_p (syms_to_rename))
+ if (!bitmap_empty_p (SYMS_TO_RENAME (cfun)))
{
/* If we have to rename some symbols from scratch, we need to
start the process at the root of the CFG. FIXME, it should
sbitmap_free (tmp);
}
- EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
+ EXECUTE_IF_SET_IN_BITMAP (SYMS_TO_RENAME (cfun), 0, i, bi)
insert_updated_phi_nodes_for (referenced_var (i), dfs, blocks_to_update,
update_flags);
EXECUTE_IF_SET_IN_SBITMAP (old_ssa_names, 0, i, sbi)
set_current_def (ssa_name (i), NULL_TREE);
- EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
+ EXECUTE_IF_SET_IN_BITMAP (SYMS_TO_RENAME (cfun), 0, i, bi)
set_current_def (referenced_var (i), NULL_TREE);
/* Now start the renaming process at START_BB. */
predecessor a node that writes to memory. */
static bitmap upstream_mem_writes;
-/* TODOs we need to run after the pass. */
-static unsigned int todo;
-
/* Update the PHI nodes of NEW_LOOP. NEW_LOOP is a duplicate of
ORIG_LOOP. */
generate_memset_zero (gimple stmt, tree op0, tree nb_iter,
gimple_stmt_iterator bsi)
{
- tree t, addr_base;
+ tree addr_base;
tree nb_bytes = NULL;
bool res = false;
gimple_seq stmts = NULL, stmt_list = NULL;
gimple fn_call;
tree mem, fndecl, fntype, fn;
gimple_stmt_iterator i;
- ssa_op_iter iter;
struct data_reference *dr = XCNEW (struct data_reference);
DR_STMT (dr) = stmt;
{
gimple s = gsi_stmt (i);
update_stmt_if_modified (s);
-
- FOR_EACH_SSA_TREE_OPERAND (t, s, iter, SSA_OP_VIRTUAL_DEFS)
- {
- if (TREE_CODE (t) == SSA_NAME)
- t = SSA_NAME_VAR (t);
- mark_sym_for_renaming (t);
- }
- }
-
- /* Mark also the uses of the VDEFS of STMT to be renamed. */
- FOR_EACH_SSA_TREE_OPERAND (t, stmt, iter, SSA_OP_VIRTUAL_DEFS)
- {
- if (TREE_CODE (t) == SSA_NAME)
- {
- gimple s;
- imm_use_iterator imm_iter;
-
- FOR_EACH_IMM_USE_STMT (s, imm_iter, t)
- update_stmt (s);
-
- t = SSA_NAME_VAR (t);
- }
- mark_sym_for_renaming (t);
}
gsi_insert_seq_after (&bsi, stmt_list, GSI_CONTINUE_LINKING);
if (dump_file && (dump_flags & TDF_DETAILS))
fprintf (dump_file, "generated memset zero\n");
- todo |= TODO_rebuild_alias;
-
end:
free_data_ref (dr);
return res;
rdg_flag_uses (struct graph *rdg, int u, bitmap partition, bitmap loops,
bitmap processed, bool *part_has_writes)
{
- ssa_op_iter iter;
use_operand_p use_p;
struct vertex *x = &(rdg->vertices[u]);
gimple stmt = RDGV_STMT (x);
if (gimple_code (stmt) != GIMPLE_PHI)
{
- FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, SSA_OP_VIRTUAL_USES)
+ if ((use_p = gimple_vuse_op (stmt)) != NULL_USE_OPERAND_P)
{
tree use = USE_FROM_PTR (use_p);
loop_iterator li;
int nb_generated_loops = 0;
- todo = 0;
-
FOR_EACH_LOOP (li, loop, 0)
{
VEC (gimple, heap) *work_list = VEC_alloc (gimple, heap, 3);
VEC_free (gimple, heap, work_list);
}
- return todo;
+ return 0;
}
static bool
DECL_GIMPLE_REG_P (tmp) = DECL_GIMPLE_REG_P (t);
add_referenced_var (tmp);
- /* add_referenced_var will create the annotation and set up some
- of the flags in the annotation. However, some flags we need to
- inherit from our original variable. */
- set_symbol_mem_tag (tmp, symbol_mem_tag (t));
- if (is_call_clobbered (t))
- mark_call_clobbered (tmp, var_ann (t)->escape_mask);
- if (bitmap_bit_p (gimple_call_used_vars (cfun), DECL_UID (t)))
- bitmap_set_bit (gimple_call_used_vars (cfun), DECL_UID (tmp));
+ /* We should never have copied variables in non-automatic storage
+ or variables that have their address taken. So it is pointless
+ to try to copy call-clobber state here. */
+ gcc_assert (!may_be_aliased (t) && !is_global_var (t));
return tmp;
}
#define TODO_mark_first_instance (1 << 19)
/* Rebuild aliasing info. */
-#define TODO_rebuild_alias (1 << 20)
+#define TODO_rebuild_alias (1 << 20)
+
+/* Rebuild the addressable-vars bitmap and do register promotion. */
+#define TODO_update_address_taken (1 << 21)
#define TODO_update_ssa_any \
(TODO_update_ssa \
extern struct gimple_opt_pass pass_phiprop;
extern struct gimple_opt_pass pass_tree_ifcombine;
extern struct gimple_opt_pass pass_dse;
-extern struct gimple_opt_pass pass_simple_dse;
extern struct gimple_opt_pass pass_nrv;
extern struct gimple_opt_pass pass_mark_used_blocks;
extern struct gimple_opt_pass pass_rename_ssa_copies;
extern struct gimple_opt_pass pass_rebuild_cgraph_edges;
extern struct gimple_opt_pass pass_remove_cgraph_callee_edges;
extern struct gimple_opt_pass pass_build_cgraph_edges;
-extern struct gimple_opt_pass pass_reset_cc_flags;
extern struct gimple_opt_pass pass_local_pure_const;
/* IPA Passes */
void
mark_virtual_ops_for_renaming (gimple stmt)
{
- ssa_op_iter iter;
tree var;
if (gimple_code (stmt) == GIMPLE_PHI)
}
update_stmt (stmt);
-
- FOR_EACH_SSA_TREE_OPERAND (var, stmt, iter, SSA_OP_ALL_VIRTUALS)
- {
- if (TREE_CODE (var) == SSA_NAME)
- var = SSA_NAME_VAR (var);
- mark_sym_for_renaming (var);
- }
-}
-
-/* Calls mark_virtual_ops_for_renaming for all members of LIST. */
-
-static void
-mark_virtual_ops_for_renaming_list (gimple_seq list)
-{
- gimple_stmt_iterator gsi;
-
- for (gsi = gsi_start (list); !gsi_end_p (gsi); gsi_next (&gsi))
- mark_virtual_ops_for_renaming (gsi_stmt (gsi));
+ if (gimple_vuse (stmt))
+ mark_sym_for_renaming (gimple_vop (cfun));
}
/* Returns a new temporary variable used for the I-th variable carrying
init = force_gimple_operand (init, &stmts, true, NULL_TREE);
if (stmts)
- {
- mark_virtual_ops_for_renaming_list (stmts);
- gsi_insert_seq_on_edge_immediate (entry, stmts);
- }
+ gsi_insert_seq_on_edge_immediate (entry, stmts);
phi = create_phi_node (var, loop->header);
SSA_NAME_DEF_STMT (var) = phi;
init = force_gimple_operand (init, &stmts, written, NULL_TREE);
if (stmts)
- {
- mark_virtual_ops_for_renaming_list (stmts);
- gsi_insert_seq_on_edge_immediate (entry, stmts);
- }
+ gsi_insert_seq_on_edge_immediate (entry, stmts);
if (written)
{
}
}
-/* Sets alias information based on data reference DR for REF,
- if necessary. */
-
-static void
-set_alias_info (tree ref, struct data_reference *dr)
-{
- tree var;
- tree tag = DR_SYMBOL_TAG (dr);
-
- gcc_assert (tag != NULL_TREE);
-
- ref = get_base_address (ref);
- if (!ref || !INDIRECT_REF_P (ref))
- return;
-
- var = SSA_NAME_VAR (TREE_OPERAND (ref, 0));
- if (var_ann (var)->symbol_mem_tag)
- return;
-
- if (!MTAG_P (tag))
- new_type_alias (var, tag, ref);
- else
- var_ann (var)->symbol_mem_tag = tag;
-}
-
/* Prepare initializers for CHAIN in LOOP. Returns false if this is
impossible because one of these initializers may trap, true otherwise. */
init = force_gimple_operand (init, &stmts, false, NULL_TREE);
if (stmts)
- {
- mark_virtual_ops_for_renaming_list (stmts);
- gsi_insert_seq_on_edge_immediate (entry, stmts);
- }
- set_alias_info (init, dr);
+ gsi_insert_seq_on_edge_immediate (entry, stmts);
VEC_replace (tree, chain->inits, i, init);
}
}
break;
- case SYMBOL_MEMORY_TAG:
- case NAME_MEMORY_TAG:
case VAR_DECL:
case PARM_DECL:
case FIELD_DECL:
case NAMESPACE_DECL:
- case MEMORY_PARTITION_TAG:
dump_decl_name (buffer, node, flags);
break;
/* True if this is the "early" pass, before inlining. */
static bool early_sra;
-/* The set of todo flags to return from tree_sra. */
-static unsigned int todoflags;
-
/* The set of aggregate variables that are candidates for scalarization. */
static bitmap sra_candidates;
static tree generate_element_ref (struct sra_elt *);
static gimple_seq sra_build_assignment (tree dst, tree src);
static void mark_all_v_defs_seq (gimple_seq);
-static void mark_all_v_defs_stmt (gimple);
\f
/* Return true if DECL is an SRA candidate. */
ni = si;
gsi_next (&ni);
- /* If the statement has no virtual operands, then it doesn't
+ /* If the statement does not reference memory, then it doesn't
make any structure references that we care about. */
- if (gimple_aliases_computed_p (cfun)
- && ZERO_SSA_OPERANDS (stmt, (SSA_OP_VIRTUAL_DEFS | SSA_OP_VUSE)))
- continue;
+ if (!gimple_references_memory_p (stmt))
+ continue;
switch (gimple_code (stmt))
{
\f
/* Phase Four: Update the function to match the replacements created. */
-/* Mark all the variables in VDEF/VUSE operators for STMT for
- renaming. This becomes necessary when we modify all of a
- non-scalar. */
-
-static void
-mark_all_v_defs_stmt (gimple stmt)
-{
- tree sym;
- ssa_op_iter iter;
-
- update_stmt_if_modified (stmt);
-
- FOR_EACH_SSA_TREE_OPERAND (sym, stmt, iter, SSA_OP_ALL_VIRTUALS)
- {
- if (TREE_CODE (sym) == SSA_NAME)
- sym = SSA_NAME_VAR (sym);
- mark_sym_for_renaming (sym);
- }
-}
-
-
/* Mark all the variables in virtual operands in all the statements in
LIST for renaming. */
gimple_stmt_iterator gsi;
for (gsi = gsi_start (seq); !gsi_end_p (gsi); gsi_next (&gsi))
- mark_all_v_defs_stmt (gsi_stmt (gsi));
+ update_stmt_if_modified (gsi_stmt (gsi));
}
/* Mark every replacement under ELT with TREE_NO_WARNING. */
sra_replace (gimple_stmt_iterator *gsi, gimple_seq seq)
{
sra_insert_before (gsi, seq);
+ unlink_stmt_vdef (gsi_stmt (*gsi));
gsi_remove (gsi, false);
if (gsi_end_p (*gsi))
*gsi = gsi_last (gsi_seq (*gsi));
replacement = tmp;
}
if (is_output)
- mark_all_v_defs_stmt (stmt);
+ update_stmt_if_modified (stmt);
*expr_p = REPLDUP (replacement);
update_stmt (stmt);
}
original block copy statement. */
stmt = gsi_stmt (*gsi);
- mark_all_v_defs_stmt (stmt);
+ update_stmt_if_modified (stmt);
seq = NULL;
generate_element_copy (lhs_elt, rhs_elt, &seq);
/* The LHS is fully instantiated. The list of initializations
replaces the original structure assignment. */
gcc_assert (seq);
- mark_all_v_defs_stmt (gsi_stmt (*gsi));
+ update_stmt_if_modified (gsi_stmt (*gsi));
mark_all_v_defs_seq (seq);
sra_replace (gsi, seq);
}
gimple_seq seq = NULL;
gimple stmt = gsi_stmt (*gsi);
- mark_all_v_defs_stmt (stmt);
+ update_stmt_if_modified (stmt);
generate_copy_inout (elt, is_output, other, &seq);
gcc_assert (seq);
mark_all_v_defs_seq (seq);
tree_sra (void)
{
/* Initialize local variables. */
- todoflags = 0;
gcc_obstack_init (&sra_obstack);
sra_candidates = BITMAP_ALLOC (NULL);
needs_copy_in = BITMAP_ALLOC (NULL);
scan_function ();
decide_instantiations ();
scalarize_function ();
- if (!bitmap_empty_p (sra_candidates))
- todoflags |= TODO_rebuild_alias;
}
/* Free allocated memory. */
BITMAP_FREE (sra_type_decomp_cache);
BITMAP_FREE (sra_type_inst_cache);
obstack_free (&sra_obstack, NULL);
- return todoflags;
+ return 0;
}
static unsigned int
ret = tree_sra ();
early_sra = false;
- return ret & ~TODO_rebuild_alias;
+ return ret;
}
static bool
PROP_cfg | PROP_ssa, /* properties_required */
0, /* properties_provided */
0, /* properties_destroyed */
- 0, /* todo_flags_start */
+ TODO_update_address_taken, /* todo_flags_start */
TODO_dump_func
| TODO_update_ssa
| TODO_ggc_collect
if (addr->offset && integer_zerop (addr->offset))
addr->offset = NULL_TREE;
- return build7 (TARGET_MEM_REF, type,
+ return build6 (TARGET_MEM_REF, type,
addr->symbol, addr->base, addr->index,
- addr->step, addr->offset, NULL, NULL);
+ addr->step, addr->offset, NULL);
}
/* Returns true if OBJ is an object whose address is a link time constant. */
void
copy_mem_ref_info (tree to, tree from)
{
- /* Copy the annotation, to preserve the aliasing information. */
- TMR_TAG (to) = TMR_TAG (from);
-
/* And the info about the original reference. */
TMR_ORIGINAL (to) = TMR_ORIGINAL (from);
}
#include "tree-flow.h"
#include "tree-inline.h"
#include "tree-pass.h"
-#include "tree-ssa-structalias.h"
#include "convert.h"
#include "params.h"
#include "ipa-type-escape.h"
#include "vecprim.h"
#include "pointer-set.h"
#include "alloc-pool.h"
+#include "tree-ssa-alias.h"
-/* Broad overview of how aliasing works:
-
- First we compute points-to sets, which is done in
- tree-ssa-structalias.c
-
- During points-to set constraint finding, a bunch of little bits of
- information is collected.
- This is not done because it is necessary for points-to, but because
- points-to has to walk every statement anyway. The function performing
- this collecting is update_alias_info.
-
- Bits update_alias_info collects include:
- 1. Directly escaping variables and variables whose value escapes
- (using is_escape_site). This is the set of variables and values that
- escape prior to transitive closure of the clobbers.
- 2. The set of variables dereferenced on the LHS (into
- dereferenced_ptr_stores)
- 3. The set of variables dereferenced on the RHS (into
- dereferenced_ptr_loads)
- 4. The set of all pointers we saw.
- 5. The number of loads and stores for each variable
- 6. The number of statements touching memory
- 7. The set of address taken variables.
-
-
- #1 is computed by a combination of is_escape_site, and counting the
- number of uses/deref operators. This function properly accounts for
- situations like &ptr->field, which is *not* a dereference.
-
- After points-to sets are computed, the sets themselves still
- contain points-to specific variables, such as a variable that says
- the pointer points to anything, a variable that says the pointer
- points to readonly memory, etc.
-
- These are eliminated in a later phase, as we will see.
-
- The rest of the phases are located in tree-ssa-alias.c
-
- The next phase after points-to set computation is called
- "setup_pointers_and_addressables"
-
- This pass does 3 main things:
-
- 1. All variables that can have TREE_ADDRESSABLE removed safely (IE
- non-globals whose address is not taken), have TREE_ADDRESSABLE
- removed.
- 2. All variables that may be aliased (which is the set of addressable
- variables and globals) at all, are marked for renaming, and have
- symbol memory tags created for them.
- 3. All variables which are stored into have their SMT's added to
- written vars.
-
-
- After this function is run, all variables that will ever have an
- SMT, have one, though its aliases are not filled in.
-
- The next phase is to compute flow-insensitive aliasing, which in
- our case, is a misnomer. it is really computing aliasing that
- requires no transitive closure to be correct. In particular, it
- uses stack vs non-stack, TBAA, etc, to determine whether two
- symbols could *ever* alias . This phase works by going through all
- the pointers we collected during update_alias_info, and for every
- addressable variable in the program, seeing if they alias. If so,
- the addressable variable is added to the symbol memory tag for the
- pointer.
-
- As part of this, we handle symbol memory tags that conflict but
- have no aliases in common, by forcing them to have a symbol in
- common (through unioning alias sets or adding one as an alias of
- the other), or by adding one as an alias of another. The case of
- conflicts with no aliases in common occurs mainly due to aliasing
- we cannot see. In particular, it generally means we have a load
- through a pointer whose value came from outside the function.
- Without an addressable symbol to point to, they would get the wrong
- answer.
-
- After flow insensitive aliasing is computed, we compute name tags
- (called compute_flow_sensitive_info). We walk each pointer we
- collected and see if it has a usable points-to set. If so, we
- generate a name tag using that pointer, and make an alias bitmap for
- it. Name tags are shared between all things with the same alias
- bitmap. The alias bitmap will be translated from what points-to
- computed. In particular, the "anything" variable in points-to will be
- transformed into a pruned set of SMT's and their aliases that
- compute_flow_insensitive_aliasing computed.
- Note that since 4.3, every pointer that points-to computed a solution for
- will get a name tag (whereas before 4.3, only those whose set did
- *not* include the anything variable would). At the point where name
- tags are all assigned, symbol memory tags are dead, and could be
- deleted, *except* on global variables. Global variables still use
- symbol memory tags as of right now.
-
- After name tags are computed, the set of clobbered variables is
- transitively closed. In particular, we compute the set of clobbered
- variables based on the initial set of clobbers, plus the aliases of
- pointers which either escape, or have their value escape.
-
- After this, maybe_create_global_var is run, which handles a corner
- case where we have no call clobbered variables, but have pure and
- non-pure functions.
-
- Staring at this function, I now remember it is a hack for the fact
- that we do not mark all globals in the program as call clobbered for a
- function unless they are actually used in that function. Instead, we
- only mark the set that is actually clobbered. As a result, you can
- end up with situations where you have no call clobbered vars set.
-
- After maybe_create_global_var, we set pointers with the REF_ALL flag
- to have alias sets that include all clobbered
- memory tags and variables.
-
- After this, memory partitioning is computed (by the function
- compute_memory_partitions) and alias sets are reworked accordingly.
-
- Lastly, we delete partitions with no symbols, and clean up after
- ourselves. */
-
-
-/* Alias information used by compute_may_aliases and its helpers. */
-struct alias_info
-{
- /* SSA names visited while collecting points-to information. If bit I
- is set, it means that SSA variable with version I has already been
- visited. */
- sbitmap ssa_names_visited;
-
- /* Array of SSA_NAME pointers processed by the points-to collector. */
- VEC(tree,heap) *processed_ptrs;
-
- /* ADDRESSABLE_VARS contains all the global variables and locals that
- have had their address taken. */
- struct alias_map_d **addressable_vars;
- size_t num_addressable_vars;
-
- /* POINTERS contains all the _DECL pointers with unique memory tags
- that have been referenced in the program. */
- struct alias_map_d **pointers;
- size_t num_pointers;
-
- /* Pointers that have been used in an indirect load/store operation. */
- struct pointer_set_t *dereferenced_ptrs;
-};
-
-
-/* Structure to map a variable to its alias set. */
-struct alias_map_d
-{
- /* Variable and its alias set. */
- tree var;
- alias_set_type set;
-};
-
-
-/* Counters used to display statistics on alias analysis. */
-struct alias_stats_d
-{
- unsigned int alias_queries;
- unsigned int alias_mayalias;
- unsigned int alias_noalias;
- unsigned int simple_queries;
- unsigned int simple_resolved;
- unsigned int tbaa_queries;
- unsigned int tbaa_resolved;
- unsigned int structnoaddress_queries;
- unsigned int structnoaddress_resolved;
-};
-
-
-/* Local variables. */
-static struct alias_stats_d alias_stats;
-static bitmap_obstack alias_bitmap_obstack;
-
-/* Local functions. */
-static void compute_flow_insensitive_aliasing (struct alias_info *);
-static void dump_alias_stats (FILE *);
-static tree create_memory_tag (tree type, bool is_type_tag);
-static tree get_smt_for (tree, struct alias_info *);
-static tree get_nmt_for (tree);
-static void add_may_alias (tree, tree);
-static struct alias_info *init_alias_info (void);
-static void delete_alias_info (struct alias_info *);
-static void compute_flow_sensitive_aliasing (struct alias_info *);
-static void setup_pointers_and_addressables (struct alias_info *);
-static void update_alias_info (struct alias_info *);
-static void create_global_var (void);
-static void maybe_create_global_var (void);
-static void set_pt_anything (tree);
-
-void debug_mp_info (VEC(mem_sym_stats_t,heap) *);
-
-static alloc_pool mem_sym_stats_pool;
-
-/* Return memory reference stats for symbol VAR. Create a new slot in
- cfun->gimple_df->mem_sym_stats if needed. */
-
-static struct mem_sym_stats_d *
-get_mem_sym_stats_for (tree var)
-{
- void **slot;
- struct mem_sym_stats_d *stats;
- struct pointer_map_t *map = gimple_mem_ref_stats (cfun)->mem_sym_stats;
-
- gcc_assert (map);
-
- slot = pointer_map_insert (map, var);
- if (*slot == NULL)
- {
- stats = (struct mem_sym_stats_d *) pool_alloc (mem_sym_stats_pool);
- memset (stats, 0, sizeof (*stats));
- stats->var = var;
- *slot = (void *) stats;
- }
- else
- stats = (struct mem_sym_stats_d *) *slot;
-
- return stats;
-}
-
-
-/* Return memory reference statistics for variable VAR in function FN.
- This is computed by alias analysis, but it is not kept
- incrementally up-to-date. So, these stats are only accurate if
- pass_may_alias has been run recently. If no alias information
- exists, this function returns NULL. */
-
-static mem_sym_stats_t
-mem_sym_stats (struct function *fn, tree var)
-{
- void **slot;
- struct pointer_map_t *stats_map = gimple_mem_ref_stats (fn)->mem_sym_stats;
-
- if (stats_map == NULL)
- return NULL;
-
- slot = pointer_map_contains (stats_map, var);
- if (slot == NULL)
- return NULL;
-
- return (mem_sym_stats_t) *slot;
-}
-
-
-/* Set MPT to be the memory partition associated with symbol SYM. */
-
-static inline void
-set_memory_partition (tree sym, tree mpt)
-{
-#if defined ENABLE_CHECKING
- if (mpt)
- gcc_assert (TREE_CODE (mpt) == MEMORY_PARTITION_TAG
- && !is_gimple_reg (sym));
-#endif
-
- var_ann (sym)->mpt = mpt;
- if (mpt)
- {
- if (MPT_SYMBOLS (mpt) == NULL)
- MPT_SYMBOLS (mpt) = BITMAP_ALLOC (&alias_bitmap_obstack);
-
- bitmap_set_bit (MPT_SYMBOLS (mpt), DECL_UID (sym));
-
- /* MPT inherits the call-clobbering attributes from SYM. */
- if (is_call_clobbered (sym))
- {
- MTAG_GLOBAL (mpt) = 1;
- mark_call_clobbered (mpt, ESCAPE_IS_GLOBAL);
- }
- }
-}
-
-
-/* Mark variable VAR as being non-addressable. */
-
-static void
-mark_non_addressable (tree var)
-{
- tree mpt;
-
- if (!TREE_ADDRESSABLE (var))
- return;
-
- mpt = memory_partition (var);
-
- clear_call_clobbered (var);
- TREE_ADDRESSABLE (var) = 0;
-
- if (mpt)
- {
- /* Note that it's possible for a symbol to have an associated
- MPT and for the MPT to have an empty (NULL) set. During
- init_alias_info, all MPTs get their sets cleared out, but the
- symbols still point to the old MPTs that used to hold them.
- This is done so that compute_memory_partitions can know which
- symbols are losing or changing partitions and mark them for
- renaming. */
- if (MPT_SYMBOLS (mpt))
- bitmap_clear_bit (MPT_SYMBOLS (mpt), DECL_UID (var));
- set_memory_partition (var, NULL_TREE);
- }
-}
-
-
-/* qsort comparison function to sort type/name tags by DECL_UID. */
-
-static int
-sort_tags_by_id (const void *pa, const void *pb)
-{
- const_tree const a = *(const_tree const *)pa;
- const_tree const b = *(const_tree const *)pb;
-
- return DECL_UID (a) - DECL_UID (b);
-}
-
-/* Initialize WORKLIST to contain those memory tags that are marked call
- clobbered. Initialize WORKLIST2 to contain the reasons these
- memory tags escaped. */
-
-static void
-init_transitive_clobber_worklist (VEC (tree, heap) **worklist,
- VEC (int, heap) **worklist2,
- bitmap on_worklist)
-{
- referenced_var_iterator rvi;
- tree curr;
-
- FOR_EACH_REFERENCED_VAR (curr, rvi)
- {
- if (MTAG_P (curr) && is_call_clobbered (curr))
- {
- VEC_safe_push (tree, heap, *worklist, curr);
- VEC_safe_push (int, heap, *worklist2,
- var_ann (curr)->escape_mask);
- bitmap_set_bit (on_worklist, DECL_UID (curr));
- }
- }
-}
-
-/* Add ALIAS to WORKLIST (and the reason for escaping REASON to WORKLIST2) if
- ALIAS is not already marked call clobbered, and is a memory
- tag. */
-
-static void
-add_to_worklist (tree alias, VEC (tree, heap) **worklist,
- VEC (int, heap) **worklist2, int reason,
- bitmap on_worklist)
-{
- if (MTAG_P (alias) && !is_call_clobbered (alias)
- && !bitmap_bit_p (on_worklist, DECL_UID (alias)))
- {
- VEC_safe_push (tree, heap, *worklist, alias);
- VEC_safe_push (int, heap, *worklist2, reason);
- bitmap_set_bit (on_worklist, DECL_UID (alias));
- }
-}
-
-/* Mark aliases of TAG as call clobbered, and place any tags on the
- alias list that were not already call clobbered on WORKLIST. */
-
-static void
-mark_aliases_call_clobbered (tree tag, VEC (tree, heap) **worklist,
- VEC (int, heap) **worklist2, bitmap on_worklist)
-{
- bitmap aliases;
- bitmap_iterator bi;
- unsigned int i;
- tree entry;
- var_ann_t ta = var_ann (tag);
-
- if (!MTAG_P (tag))
- return;
- aliases = may_aliases (tag);
- if (!aliases)
- return;
-
- EXECUTE_IF_SET_IN_BITMAP (aliases, 0, i, bi)
- {
- entry = referenced_var (i);
- /* If you clobber one part of a structure, you
- clobber the entire thing. While this does not make
- the world a particularly nice place, it is necessary
- in order to allow C/C++ tricks that involve
- pointer arithmetic to work. */
- if (!unmodifiable_var_p (entry))
- {
- add_to_worklist (entry, worklist, worklist2, ta->escape_mask,
- on_worklist);
- mark_call_clobbered (entry, ta->escape_mask);
- }
- }
-}
-
-/* Tags containing global vars need to be marked as global.
- Tags containing call clobbered vars need to be marked as call
- clobbered. */
-
-static void
-compute_tag_properties (void)
-{
- referenced_var_iterator rvi;
- tree tag;
- bool changed = true;
- VEC (tree, heap) *taglist = NULL;
-
- FOR_EACH_REFERENCED_VAR (tag, rvi)
- {
- if (!MTAG_P (tag))
- continue;
- VEC_safe_push (tree, heap, taglist, tag);
- }
-
- /* We sort the taglist by DECL_UID, for two reasons.
- 1. To get a sequential ordering to make the bitmap accesses
- faster.
- 2. Because of the way we compute aliases, it's more likely that
- an earlier tag is included in a later tag, and this will reduce
- the number of iterations.
-
- If we had a real tag graph, we would just topo-order it and be
- done with it. */
- qsort (VEC_address (tree, taglist),
- VEC_length (tree, taglist),
- sizeof (tree),
- sort_tags_by_id);
-
- /* Go through each tag not marked as global, and if it aliases
- global vars, mark it global.
-
- If the tag contains call clobbered vars, mark it call
- clobbered.
-
- This loop iterates because tags may appear in the may-aliases
- list of other tags when we group. */
-
- while (changed)
- {
- unsigned int k;
-
- changed = false;
- for (k = 0; VEC_iterate (tree, taglist, k, tag); k++)
- {
- bitmap ma;
- bitmap_iterator bi;
- unsigned int i;
- tree entry;
- bool tagcc = is_call_clobbered (tag);
- bool tagglobal = MTAG_GLOBAL (tag);
-
- if (tagcc && tagglobal)
- continue;
-
- ma = may_aliases (tag);
- if (!ma)
- continue;
-
- EXECUTE_IF_SET_IN_BITMAP (ma, 0, i, bi)
- {
- entry = referenced_var (i);
- /* Call clobbered entries cause the tag to be marked
- call clobbered. */
- if (!tagcc && is_call_clobbered (entry))
- {
- mark_call_clobbered (tag, var_ann (entry)->escape_mask);
- tagcc = true;
- changed = true;
- }
-
- /* Global vars cause the tag to be marked global. */
- if (!tagglobal && is_global_var (entry))
- {
- MTAG_GLOBAL (tag) = true;
- changed = true;
- tagglobal = true;
- }
-
- /* Early exit once both global and cc are set, since the
- loop can't do any more than that. */
- if (tagcc && tagglobal)
- break;
- }
- }
- }
- VEC_free (tree, heap, taglist);
-}
-
-/* Set up the initial variable clobbers, call-uses and globalness.
- When this function completes, only tags whose aliases need to be
- clobbered will be set clobbered. Tags clobbered because they
- contain call clobbered vars are handled in compute_tag_properties. */
-
-static void
-set_initial_properties (struct alias_info *ai)
-{
- unsigned int i;
- referenced_var_iterator rvi;
- tree var;
- tree ptr;
- bool any_pt_anything = false;
- enum escape_type pt_anything_mask = 0;
-
- FOR_EACH_REFERENCED_VAR (var, rvi)
- {
- if (is_global_var (var))
- {
- if (!unmodifiable_var_p (var))
- mark_call_clobbered (var, ESCAPE_IS_GLOBAL);
- }
- else if (TREE_CODE (var) == PARM_DECL
- && gimple_default_def (cfun, var)
- && POINTER_TYPE_P (TREE_TYPE (var)))
- {
- tree def = gimple_default_def (cfun, var);
- get_ptr_info (def)->value_escapes_p = 1;
- get_ptr_info (def)->escape_mask |= ESCAPE_IS_PARM;
- }
- }
-
- if (!clobber_what_escaped ())
- {
- any_pt_anything = true;
- pt_anything_mask |= ESCAPE_TO_CALL;
- }
-
- compute_call_used_vars ();
-
- for (i = 0; VEC_iterate (tree, ai->processed_ptrs, i, ptr); i++)
- {
- struct ptr_info_def *pi = SSA_NAME_PTR_INFO (ptr);
- tree tag = symbol_mem_tag (SSA_NAME_VAR (ptr));
-
- /* A pointer that only escapes via a function return does not
- add to the call clobber or call used solution.
- To exclude ESCAPE_TO_PURE_CONST we would need to track
- call used variables separately or compute those properly
- in the operand scanner. */
- if (pi->value_escapes_p
- && pi->escape_mask & ~ESCAPE_TO_RETURN)
- {
- /* If PTR escapes then its associated memory tags and
- pointed-to variables are call-clobbered. */
- if (pi->name_mem_tag)
- mark_call_clobbered (pi->name_mem_tag, pi->escape_mask);
-
- if (tag)
- mark_call_clobbered (tag, pi->escape_mask);
- }
-
- /* If the name tag is call clobbered, so is the symbol tag
- associated with the base VAR_DECL. */
- if (pi->name_mem_tag
- && tag
- && is_call_clobbered (pi->name_mem_tag))
- mark_call_clobbered (tag, pi->escape_mask);
-
- /* Name tags and symbol tags that we don't know where they point
- to, might point to global memory, and thus, are clobbered.
-
- FIXME: This is not quite right. They should only be
- clobbered if value_escapes_p is true, regardless of whether
- they point to global memory or not.
- So removing this code and fixing all the bugs would be nice.
- It is the cause of a bunch of clobbering. */
- if ((pi->pt_global_mem || pi->pt_anything)
- && pi->memory_tag_needed && pi->name_mem_tag)
- {
- mark_call_clobbered (pi->name_mem_tag, ESCAPE_IS_GLOBAL);
- MTAG_GLOBAL (pi->name_mem_tag) = true;
- }
-
- if ((pi->pt_global_mem || pi->pt_anything)
- && pi->memory_tag_needed
- && tag)
- {
- mark_call_clobbered (tag, ESCAPE_IS_GLOBAL);
- MTAG_GLOBAL (tag) = true;
- }
- }
-
- /* If a pt_anything pointer escaped we need to mark all addressable
- variables call clobbered. */
- if (any_pt_anything)
- {
- bitmap_iterator bi;
- unsigned int j;
-
- EXECUTE_IF_SET_IN_BITMAP (gimple_addressable_vars (cfun), 0, j, bi)
- {
- tree var = referenced_var (j);
- if (!unmodifiable_var_p (var))
- mark_call_clobbered (var, pt_anything_mask);
- }
- }
-}
-
-/* Compute which variables need to be marked call clobbered because
- their tag is call clobbered, and which tags need to be marked
- global because they contain global variables. */
-
-static void
-compute_call_clobbered (struct alias_info *ai)
-{
- VEC (tree, heap) *worklist = NULL;
- VEC (int,heap) *worklist2 = NULL;
- bitmap on_worklist;
-
- timevar_push (TV_CALL_CLOBBER);
- on_worklist = BITMAP_ALLOC (NULL);
-
- set_initial_properties (ai);
- init_transitive_clobber_worklist (&worklist, &worklist2, on_worklist);
- while (VEC_length (tree, worklist) != 0)
- {
- tree curr = VEC_pop (tree, worklist);
- int reason = VEC_pop (int, worklist2);
-
- bitmap_clear_bit (on_worklist, DECL_UID (curr));
- mark_call_clobbered (curr, reason);
- mark_aliases_call_clobbered (curr, &worklist, &worklist2, on_worklist);
- }
- VEC_free (tree, heap, worklist);
- VEC_free (int, heap, worklist2);
- BITMAP_FREE (on_worklist);
- compute_tag_properties ();
- timevar_pop (TV_CALL_CLOBBER);
-}
-
-
-/* Dump memory partition information to FILE. */
-
-static void
-dump_memory_partitions (FILE *file)
-{
- unsigned i, npart;
- unsigned long nsyms;
- tree mpt;
-
- fprintf (file, "\nMemory partitions\n\n");
- for (i = 0, npart = 0, nsyms = 0;
- VEC_iterate (tree, gimple_ssa_operands (cfun)->mpt_table, i, mpt);
- i++)
- {
- if (mpt)
- {
- bitmap syms = MPT_SYMBOLS (mpt);
- unsigned long n = (syms) ? bitmap_count_bits (syms) : 0;
-
- fprintf (file, "#%u: ", i);
- print_generic_expr (file, mpt, 0);
- fprintf (file, ": %lu elements: ", n);
- dump_decl_set (file, syms);
- npart++;
- nsyms += n;
- }
- }
-
- fprintf (file, "\n%u memory partitions holding %lu symbols\n", npart, nsyms);
-}
-
-
-/* Dump memory partition information to stderr. */
-
-void
-debug_memory_partitions (void)
-{
- dump_memory_partitions (stderr);
-}
-
-
-/* Return true if memory partitioning is required given the memory
- reference estimates in STATS. */
-
-static inline bool
-need_to_partition_p (struct mem_ref_stats_d *stats)
-{
- long num_vops = stats->num_vuses + stats->num_vdefs;
- long avg_vops = CEIL (num_vops, stats->num_mem_stmts);
- return (num_vops > (long) MAX_ALIASED_VOPS
- && avg_vops > (long) AVG_ALIASED_VOPS);
-}
-
-
-/* Count the actual number of virtual operators in CFUN. Note that
- this is only meaningful after virtual operands have been populated,
- so it should be invoked at the end of compute_may_aliases.
-
- The number of virtual operators is stored in *NUM_VDEFS_P and
- *NUM_VUSES_P, the number of partitioned symbols in
- *NUM_PARTITIONED_P and the number of unpartitioned symbols in
- *NUM_UNPARTITIONED_P.
-
- If any of these pointers is NULL the corresponding count is not
- computed. */
-
-static void
-count_mem_refs (long *num_vuses_p, long *num_vdefs_p,
- long *num_partitioned_p, long *num_unpartitioned_p)
-{
- gimple_stmt_iterator gsi;
- basic_block bb;
- long num_vdefs, num_vuses, num_partitioned, num_unpartitioned;
- referenced_var_iterator rvi;
- tree sym;
-
- num_vuses = num_vdefs = num_partitioned = num_unpartitioned = 0;
-
- if (num_vuses_p || num_vdefs_p)
- FOR_EACH_BB (bb)
- for (gsi = gsi_start_bb (bb); !gsi_end_p (gsi); gsi_next (&gsi))
- {
- gimple stmt = gsi_stmt (gsi);
- if (gimple_references_memory_p (stmt))
- {
- num_vuses += NUM_SSA_OPERANDS (stmt, SSA_OP_VUSE);
- num_vdefs += NUM_SSA_OPERANDS (stmt, SSA_OP_VDEF);
- }
- }
-
- if (num_partitioned_p || num_unpartitioned_p)
- FOR_EACH_REFERENCED_VAR (sym, rvi)
- {
- if (is_gimple_reg (sym))
- continue;
-
- if (memory_partition (sym))
- num_partitioned++;
- else
- num_unpartitioned++;
- }
-
- if (num_vdefs_p)
- *num_vdefs_p = num_vdefs;
-
- if (num_vuses_p)
- *num_vuses_p = num_vuses;
-
- if (num_partitioned_p)
- *num_partitioned_p = num_partitioned;
-
- if (num_unpartitioned_p)
- *num_unpartitioned_p = num_unpartitioned;
-}
-
-
-/* The list is sorted by increasing partitioning score (PSCORE).
- This score is computed such that symbols with high scores are
- those that are least likely to be partitioned. Given a symbol
- S = MP->VAR, PSCORE(S) is the result of the following weighted sum
-
- PSCORE(S) = FW * 64 + FR * 32
- + DW * 16 + DR * 8
- + IW * 4 + IR * 2
- + NO_ALIAS
-
- where
-
- FW Execution frequency of writes to S
- FR Execution frequency of reads from S
- DW Number of direct writes to S
- DR Number of direct reads from S
- IW Number of indirect writes to S
- IR Number of indirect reads from S
- NO_ALIAS State of the NO_ALIAS* flags
-
- The basic idea here is that symbols that are frequently
- written-to in hot paths of the code are the last to be considered
- for partitioning. */
-
-static inline long
-mem_sym_score (mem_sym_stats_t mp)
-{
- return mp->frequency_writes * 64 + mp->frequency_reads * 32
- + mp->num_direct_writes * 16 + mp->num_direct_reads * 8
- + mp->num_indirect_writes * 4 + mp->num_indirect_reads * 2
- + var_ann (mp->var)->noalias_state;
-}
-
-
-/* Dump memory reference stats for function CFUN to FILE. */
-
-void
-dump_mem_ref_stats (FILE *file)
-{
- long actual_num_vuses, actual_num_vdefs;
- long num_partitioned, num_unpartitioned;
- struct mem_ref_stats_d *stats;
-
- stats = gimple_mem_ref_stats (cfun);
-
- count_mem_refs (&actual_num_vuses, &actual_num_vdefs, &num_partitioned,
- &num_unpartitioned);
-
- fprintf (file, "\nMemory reference statistics for %s\n\n",
- lang_hooks.decl_printable_name (current_function_decl, 2));
-
- fprintf (file, "Number of memory statements: %ld\n",
- stats->num_mem_stmts);
- fprintf (file, "Number of call sites: %ld\n",
- stats->num_call_sites);
- fprintf (file, "Number of pure/const call sites: %ld\n",
- stats->num_pure_const_call_sites);
- fprintf (file, "Number of asm sites: %ld\n",
- stats->num_asm_sites);
- fprintf (file, "Estimated number of loads: %ld (%ld/stmt)\n",
- stats->num_vuses,
- (stats->num_mem_stmts)
- ? CEIL (stats->num_vuses, stats->num_mem_stmts)
- : 0);
- fprintf (file, "Actual number of loads: %ld (%ld/stmt)\n",
- actual_num_vuses,
- (stats->num_mem_stmts)
- ? CEIL (actual_num_vuses, stats->num_mem_stmts)
- : 0);
-
- if (actual_num_vuses > stats->num_vuses + (stats->num_vuses / 25))
- fprintf (file, "\t(warning: estimation is lower by more than 25%%)\n");
-
- fprintf (file, "Estimated number of stores: %ld (%ld/stmt)\n",
- stats->num_vdefs,
- (stats->num_mem_stmts)
- ? CEIL (stats->num_vdefs, stats->num_mem_stmts)
- : 0);
- fprintf (file, "Actual number of stores: %ld (%ld/stmt)\n",
- actual_num_vdefs,
- (stats->num_mem_stmts)
- ? CEIL (actual_num_vdefs, stats->num_mem_stmts)
- : 0);
-
- if (actual_num_vdefs > stats->num_vdefs + (stats->num_vdefs / 25))
- fprintf (file, "\t(warning: estimation is lower by more than 25%%)\n");
-
- fprintf (file, "Partitioning thresholds: MAX = %d AVG = %d "
- "(%sNEED TO PARTITION)\n", MAX_ALIASED_VOPS, AVG_ALIASED_VOPS,
- stats->num_mem_stmts && need_to_partition_p (stats) ? "" : "NO ");
- fprintf (file, "Number of partitioned symbols: %ld\n", num_partitioned);
- fprintf (file, "Number of unpartitioned symbols: %ld\n", num_unpartitioned);
-}
-
-
-/* Dump memory reference stats for function CFUN to stderr. */
-
-void
-debug_mem_ref_stats (void)
-{
- dump_mem_ref_stats (stderr);
-}
-
-
-/* Dump memory reference stats for variable VAR to FILE. */
-
-static void
-dump_mem_sym_stats (FILE *file, tree var)
-{
- mem_sym_stats_t stats = mem_sym_stats (cfun, var);
-
- if (stats == NULL)
- return;
-
- fprintf (file, "read frequency: %6ld, write frequency: %6ld, "
- "direct reads: %3ld, direct writes: %3ld, "
- "indirect reads: %4ld, indirect writes: %4ld, symbol: ",
- stats->frequency_reads, stats->frequency_writes,
- stats->num_direct_reads, stats->num_direct_writes,
- stats->num_indirect_reads, stats->num_indirect_writes);
- print_generic_expr (file, stats->var, 0);
- fprintf (file, ", tags: ");
- dump_decl_set (file, stats->parent_tags);
-}
-
-
-/* Dump memory reference stats for variable VAR to stderr. */
-
-void
-debug_mem_sym_stats (tree var)
-{
- dump_mem_sym_stats (stderr, var);
-}
-
-/* Dump memory reference stats for variable VAR to FILE. For use
- by tree-dfa.c:dump_variable. */
-
-void
-dump_mem_sym_stats_for_var (FILE *file, tree var)
-{
- mem_sym_stats_t stats = mem_sym_stats (cfun, var);
-
- if (stats == NULL)
- return;
-
- fprintf (file, ", score: %ld", mem_sym_score (stats));
- fprintf (file, ", direct reads: %ld", stats->num_direct_reads);
- fprintf (file, ", direct writes: %ld", stats->num_direct_writes);
- fprintf (file, ", indirect reads: %ld", stats->num_indirect_reads);
- fprintf (file, ", indirect writes: %ld", stats->num_indirect_writes);
-}
-
-/* Dump memory reference stats for all memory symbols to FILE. */
-
-static void
-dump_all_mem_sym_stats (FILE *file)
-{
- referenced_var_iterator rvi;
- tree sym;
-
- FOR_EACH_REFERENCED_VAR (sym, rvi)
- {
- if (is_gimple_reg (sym))
- continue;
-
- dump_mem_sym_stats (file, sym);
- }
-}
-
-
-/* Dump memory reference stats for all memory symbols to stderr. */
-
-void
-debug_all_mem_sym_stats (void)
-{
- dump_all_mem_sym_stats (stderr);
-}
-
-
-/* Dump the MP_INFO array to FILE. */
-
-static void
-dump_mp_info (FILE *file, VEC(mem_sym_stats_t,heap) *mp_info)
-{
- unsigned i;
- mem_sym_stats_t mp_p;
-
- for (i = 0; VEC_iterate (mem_sym_stats_t, mp_info, i, mp_p); i++)
- if (!mp_p->partitioned_p)
- dump_mem_sym_stats (file, mp_p->var);
-}
-
-
-/* Dump the MP_INFO array to stderr. */
-
-void
-debug_mp_info (VEC(mem_sym_stats_t,heap) *mp_info)
-{
- dump_mp_info (stderr, mp_info);
-}
-
-
-/* Update memory reference stats for symbol VAR in statement STMT.
- NUM_DIRECT_READS and NUM_DIRECT_WRITES specify the number of times
- that VAR is read/written in STMT (indirect reads/writes are not
- recorded by this function, see compute_memory_partitions). */
-
-void
-update_mem_sym_stats_from_stmt (tree var, gimple stmt, long num_direct_reads,
- long num_direct_writes)
-{
- mem_sym_stats_t stats;
-
- gcc_assert (num_direct_reads >= 0 && num_direct_writes >= 0);
-
- stats = get_mem_sym_stats_for (var);
-
- stats->num_direct_reads += num_direct_reads;
- stats->frequency_reads += ((long) gimple_bb (stmt)->frequency
- * num_direct_reads);
-
- stats->num_direct_writes += num_direct_writes;
- stats->frequency_writes += ((long) gimple_bb (stmt)->frequency
- * num_direct_writes);
-}
-
-
-/* Given two MP_INFO entries MP1 and MP2, return -1 if MP1->VAR should
- be partitioned before MP2->VAR, 0 if they are the same, or 1 if
- MP1->VAR should be partitioned after MP2->VAR. */
-
-static inline int
-compare_mp_info_entries (mem_sym_stats_t mp1, mem_sym_stats_t mp2)
-{
- long pscore1 = mem_sym_score (mp1);
- long pscore2 = mem_sym_score (mp2);
-
- if (pscore1 < pscore2)
- return -1;
- else if (pscore1 > pscore2)
- return 1;
- else
- return DECL_UID (mp1->var) - DECL_UID (mp2->var);
-}
-
-
-/* Comparison routine for qsort. The list is sorted by increasing
- partitioning score (PSCORE). This score is computed such that
- symbols with high scores are those that are least likely to be
- partitioned. */
-
-static int
-mp_info_cmp (const void *p, const void *q)
-{
- mem_sym_stats_t e1 = *((const mem_sym_stats_t *) p);
- mem_sym_stats_t e2 = *((const mem_sym_stats_t *) q);
- return compare_mp_info_entries (e1, e2);
-}
-
-
-/* Sort the array of reference counts used to compute memory partitions.
- Elements are sorted in ascending order of execution frequency and
- descending order of virtual operators needed. */
-
-static inline void
-sort_mp_info (VEC(mem_sym_stats_t,heap) *list)
-{
- unsigned num = VEC_length (mem_sym_stats_t, list);
-
- if (num < 2)
- return;
-
- if (num == 2)
- {
- if (compare_mp_info_entries (VEC_index (mem_sym_stats_t, list, 0),
- VEC_index (mem_sym_stats_t, list, 1)) > 0)
- {
- /* Swap elements if they are in the wrong order. */
- mem_sym_stats_t tmp = VEC_index (mem_sym_stats_t, list, 0);
- VEC_replace (mem_sym_stats_t, list, 0,
- VEC_index (mem_sym_stats_t, list, 1));
- VEC_replace (mem_sym_stats_t, list, 1, tmp);
- }
-
- return;
- }
-
- /* There are 3 or more elements, call qsort. */
- qsort (VEC_address (mem_sym_stats_t, list),
- VEC_length (mem_sym_stats_t, list),
- sizeof (mem_sym_stats_t),
- mp_info_cmp);
-}
-
-
-/* Return the memory partition tag (MPT) associated with memory
- symbol SYM. */
-
-static tree
-get_mpt_for (tree sym)
-{
- tree mpt;
-
- /* Don't create a new tag unnecessarily. */
- mpt = memory_partition (sym);
- if (mpt == NULL_TREE)
- {
- mpt = create_tag_raw (MEMORY_PARTITION_TAG, TREE_TYPE (sym), "MPT");
- TREE_ADDRESSABLE (mpt) = 0;
- add_referenced_var (mpt);
- VEC_safe_push (tree, heap, gimple_ssa_operands (cfun)->mpt_table, mpt);
- gcc_assert (MPT_SYMBOLS (mpt) == NULL);
- set_memory_partition (sym, mpt);
- }
-
- return mpt;
-}
-
-
-/* Add MP_P->VAR to a memory partition and return the partition. */
-
-static tree
-find_partition_for (mem_sym_stats_t mp_p)
-{
- unsigned i;
- VEC(tree,heap) *mpt_table;
- tree mpt;
-
- mpt_table = gimple_ssa_operands (cfun)->mpt_table;
- mpt = NULL_TREE;
-
- /* Find an existing partition for MP_P->VAR. */
- for (i = 0; VEC_iterate (tree, mpt_table, i, mpt); i++)
- {
- mem_sym_stats_t mpt_stats;
-
- /* If MPT does not have any symbols yet, use it. */
- if (MPT_SYMBOLS (mpt) == NULL)
- break;
-
- /* Otherwise, see if MPT has common parent tags with MP_P->VAR,
- but avoid grouping clobbered variables with non-clobbered
- variables (otherwise, this tends to create a single memory
- partition because other call-clobbered variables may have
- common parent tags with non-clobbered ones). */
- mpt_stats = get_mem_sym_stats_for (mpt);
- if (mp_p->parent_tags
- && mpt_stats->parent_tags
- && is_call_clobbered (mpt) == is_call_clobbered (mp_p->var)
- && bitmap_intersect_p (mpt_stats->parent_tags, mp_p->parent_tags))
- break;
-
- /* If no common parent tags are found, see if both MPT and
- MP_P->VAR are call-clobbered. */
- if (is_call_clobbered (mpt) && is_call_clobbered (mp_p->var))
- break;
- }
-
- if (mpt == NULL_TREE)
- mpt = get_mpt_for (mp_p->var);
- else
- set_memory_partition (mp_p->var, mpt);
-
- mp_p->partitioned_p = true;
-
- mark_sym_for_renaming (mp_p->var);
- mark_sym_for_renaming (mpt);
-
- return mpt;
-}
-
-
-/* Rewrite the alias set for TAG to use the newly created partitions.
- If TAG is NULL, rewrite the set of call-clobbered variables.
- NEW_ALIASES is a scratch bitmap to build the new set of aliases for
- TAG. */
-
-static void
-rewrite_alias_set_for (tree tag, bitmap new_aliases)
-{
- bitmap_iterator bi;
- unsigned i;
- tree mpt, sym;
-
- EXECUTE_IF_SET_IN_BITMAP (MTAG_ALIASES (tag), 0, i, bi)
- {
- sym = referenced_var (i);
- mpt = memory_partition (sym);
- if (mpt)
- bitmap_set_bit (new_aliases, DECL_UID (mpt));
- else
- bitmap_set_bit (new_aliases, DECL_UID (sym));
- }
-
- /* Rebuild the may-alias array for TAG. */
- bitmap_copy (MTAG_ALIASES (tag), new_aliases);
-}
-
-
-/* Determine how many virtual operands can be saved by partitioning
- MP_P->VAR into MPT. When a symbol S is thrown inside a partition
- P, every virtual operand that used to reference S will now
- reference P. Whether it reduces the number of virtual operands
- depends on:
-
- 1- Direct references to S are never saved. Instead of the virtual
- operand to S, we will now have a virtual operand to P.
-
- 2- Indirect references to S are reduced only for those memory tags
- holding S that already had other symbols partitioned into P.
- For instance, if a memory tag T has the alias set { a b S c },
- the first time we partition S into P, the alias set will become
- { a b P c }, so no virtual operands will be saved. However, if
- we now partition symbol 'c' into P, then the alias set for T
- will become { a b P }, so we will be saving one virtual operand
- for every indirect reference to 'c'.
-
- 3- If S is call-clobbered, we save as many virtual operands as
- call/asm sites exist in the code, but only if other
- call-clobbered symbols have been grouped into P. The first
- call-clobbered symbol that we group does not produce any
- savings.
-
- MEM_REF_STATS points to CFUN's memory reference information. */
-
-static void
-estimate_vop_reduction (struct mem_ref_stats_d *mem_ref_stats,
- mem_sym_stats_t mp_p, tree mpt)
-{
- unsigned i;
- bitmap_iterator bi;
- mem_sym_stats_t mpt_stats;
-
- /* We should only get symbols with indirect references here. */
- gcc_assert (mp_p->num_indirect_reads > 0 || mp_p->num_indirect_writes > 0);
-
- /* Note that the only statistics we keep for MPT is the set of
- parent tags to know which memory tags have had alias members
- partitioned, and the indicator has_call_clobbered_vars.
- Reference counts are not important for MPT. */
- mpt_stats = get_mem_sym_stats_for (mpt);
-
- /* Traverse all the parent tags for MP_P->VAR. For every tag T, if
- partition P is already grouping aliases of T, then reduce the
- number of virtual operands by the number of direct references
- to T. */
- if (mp_p->parent_tags)
- {
- if (mpt_stats->parent_tags == NULL)
- mpt_stats->parent_tags = BITMAP_ALLOC (&alias_bitmap_obstack);
-
- EXECUTE_IF_SET_IN_BITMAP (mp_p->parent_tags, 0, i, bi)
- {
- if (bitmap_bit_p (mpt_stats->parent_tags, i))
- {
- /* Partition MPT is already partitioning symbols in the
- alias set for TAG. This means that we are now saving
- 1 virtual operand for every direct reference to TAG. */
- tree tag = referenced_var (i);
- mem_sym_stats_t tag_stats = mem_sym_stats (cfun, tag);
- mem_ref_stats->num_vuses -= tag_stats->num_direct_reads;
- mem_ref_stats->num_vdefs -= tag_stats->num_direct_writes;
- }
- else
- {
- /* This is the first symbol in tag I's alias set that is
- being grouped under MPT. We will not save any
- virtual operands this time, but record that MPT is
- grouping a symbol from TAG's alias set so that the
- next time we get the savings. */
- bitmap_set_bit (mpt_stats->parent_tags, i);
- }
- }
- }
-
- /* If MP_P->VAR is call-clobbered, and MPT is already grouping
- call-clobbered symbols, then we will save as many virtual
- operands as asm/call sites there are. */
- if (is_call_clobbered (mp_p->var))
- {
- if (mpt_stats->has_call_clobbered_vars)
- mem_ref_stats->num_vdefs -= mem_ref_stats->num_call_sites
- + mem_ref_stats->num_asm_sites;
- else
- mpt_stats->has_call_clobbered_vars = true;
- }
-}
-
-
-/* Helper for compute_memory_partitions. Transfer reference counts
- from pointers to their pointed-to sets. Counters for pointers were
- computed by update_alias_info. MEM_REF_STATS points to CFUN's
- memory reference information. */
-
-static void
-update_reference_counts (struct mem_ref_stats_d *mem_ref_stats)
-{
- unsigned i;
- bitmap_iterator bi;
- mem_sym_stats_t sym_stats;
-
- for (i = 1; i < num_ssa_names; i++)
- {
- tree ptr;
- struct ptr_info_def *pi;
-
- ptr = ssa_name (i);
- if (ptr
- && POINTER_TYPE_P (TREE_TYPE (ptr))
- && (pi = SSA_NAME_PTR_INFO (ptr)) != NULL
- && pi->memory_tag_needed)
- {
- unsigned j;
- bitmap_iterator bj;
- tree tag;
- mem_sym_stats_t ptr_stats, tag_stats;
-
- /* If PTR has flow-sensitive points-to information, use
- PTR's name tag, otherwise use the symbol tag associated
- with PTR's symbol. */
- if (pi->name_mem_tag)
- tag = pi->name_mem_tag;
- else
- tag = symbol_mem_tag (SSA_NAME_VAR (ptr));
-
- ptr_stats = get_mem_sym_stats_for (ptr);
- tag_stats = get_mem_sym_stats_for (tag);
-
- /* TAG has as many direct references as dereferences we
- found for its parent pointer. */
- tag_stats->num_direct_reads += ptr_stats->num_direct_reads;
- tag_stats->num_direct_writes += ptr_stats->num_direct_writes;
-
- /* All the dereferences of pointer PTR are considered direct
- references to PTR's memory tag (TAG). In turn,
- references to TAG will become virtual operands for every
- symbol in TAG's alias set. So, for every symbol ALIAS in
- TAG's alias set, add as many indirect references to ALIAS
- as direct references there are for TAG. */
- if (MTAG_ALIASES (tag))
- EXECUTE_IF_SET_IN_BITMAP (MTAG_ALIASES (tag), 0, j, bj)
- {
- tree alias = referenced_var (j);
- sym_stats = get_mem_sym_stats_for (alias);
-
- /* All the direct references to TAG are indirect references
- to ALIAS. */
- sym_stats->num_indirect_reads += ptr_stats->num_direct_reads;
- sym_stats->num_indirect_writes += ptr_stats->num_direct_writes;
- sym_stats->frequency_reads += ptr_stats->frequency_reads;
- sym_stats->frequency_writes += ptr_stats->frequency_writes;
-
- /* Indicate that TAG is one of ALIAS's parent tags. */
- if (sym_stats->parent_tags == NULL)
- sym_stats->parent_tags = BITMAP_ALLOC (&alias_bitmap_obstack);
- bitmap_set_bit (sym_stats->parent_tags, DECL_UID (tag));
- }
- }
- }
-
- /* Call-clobbered symbols are indirectly written at every
- call/asm site. */
- EXECUTE_IF_SET_IN_BITMAP (gimple_call_clobbered_vars (cfun), 0, i, bi)
- {
- tree sym = referenced_var (i);
- sym_stats = get_mem_sym_stats_for (sym);
- sym_stats->num_indirect_writes += mem_ref_stats->num_call_sites
- + mem_ref_stats->num_asm_sites;
- }
-
- /* Addressable symbols are indirectly written at some ASM sites.
- Since only ASM sites that clobber memory actually affect
- addressable symbols, this is an over-estimation. */
- EXECUTE_IF_SET_IN_BITMAP (gimple_addressable_vars (cfun), 0, i, bi)
- {
- tree sym = referenced_var (i);
- sym_stats = get_mem_sym_stats_for (sym);
- sym_stats->num_indirect_writes += mem_ref_stats->num_asm_sites;
- }
-}
-
-
-/* Helper for compute_memory_partitions. Add all memory symbols to
- *MP_INFO_P and compute the initial estimate for the total number of
- virtual operands needed. MEM_REF_STATS points to CFUN's memory
- reference information. On exit, *TAGS_P will contain the list of
- memory tags whose alias sets need to be rewritten after
- partitioning. */
-
-static void
-build_mp_info (struct mem_ref_stats_d *mem_ref_stats,
- VEC(mem_sym_stats_t,heap) **mp_info_p,
- VEC(tree,heap) **tags_p)
-{
- tree var;
- referenced_var_iterator rvi;
-
- FOR_EACH_REFERENCED_VAR (var, rvi)
- {
- mem_sym_stats_t sym_stats;
- tree old_mpt;
-
- /* We are only interested in memory symbols other than MPTs. */
- if (is_gimple_reg (var) || TREE_CODE (var) == MEMORY_PARTITION_TAG)
- continue;
-
- /* Collect memory tags into the TAGS array so that we can
- rewrite their alias sets after partitioning. */
- if (MTAG_P (var) && MTAG_ALIASES (var))
- VEC_safe_push (tree, heap, *tags_p, var);
-
- /* Since we are going to re-compute partitions, any symbols that
- used to belong to a partition must be detached from it and
- marked for renaming. */
- if ((old_mpt = memory_partition (var)) != NULL)
- {
- mark_sym_for_renaming (old_mpt);
- set_memory_partition (var, NULL_TREE);
- mark_sym_for_renaming (var);
- }
-
- sym_stats = get_mem_sym_stats_for (var);
-
- /* Add VAR's reference info to MP_INFO. Note that the only
- symbols that make sense to partition are those that have
- indirect references. If a symbol S is always directly
- referenced, partitioning it will not reduce the number of
- virtual operands. The only symbols that are profitable to
- partition are those that belong to alias sets and/or are
- call-clobbered. */
- if (sym_stats->num_indirect_reads > 0
- || sym_stats->num_indirect_writes > 0)
- VEC_safe_push (mem_sym_stats_t, heap, *mp_info_p, sym_stats);
-
- /* Update the number of estimated VOPS. Note that direct
- references to memory tags are always counted as indirect
- references to their alias set members, so if a memory tag has
- aliases, do not count its direct references to avoid double
- accounting. */
- if (!MTAG_P (var) || !MTAG_ALIASES (var))
- {
- mem_ref_stats->num_vuses += sym_stats->num_direct_reads;
- mem_ref_stats->num_vdefs += sym_stats->num_direct_writes;
- }
-
- mem_ref_stats->num_vuses += sym_stats->num_indirect_reads;
- mem_ref_stats->num_vdefs += sym_stats->num_indirect_writes;
- }
-}
-
-
-/* Compute memory partitions. A memory partition (MPT) is an
- arbitrary grouping of memory symbols, such that a reference to one
- member of the group is considered a reference to all the members of
- the group.
-
- As opposed to alias sets in memory tags, the grouping into
- partitions is completely arbitrary and only done to reduce the
- number of virtual operands. The only rule that needs to be
- observed when creating memory partitions is that given two memory
- partitions MPT.i and MPT.j, they must not contain symbols in
- common.
-
- Memory partitions are used when putting the program into Memory-SSA
- form. In particular, in Memory-SSA PHI nodes are not computed for
- individual memory symbols but for memory partitions. This reduces
- the number of PHI nodes in the SSA graph at the expense of
- precision (i.e., it makes unrelated stores affect each other).
-
- However, it is possible to increase precision by changing this
- partitioning scheme. For instance, if the partitioning scheme is
- such that get_mpt_for is the identity function (that is,
- get_mpt_for (s) = s), this will result in ultimate precision at the
- expense of huge SSA webs.
-
- At the other extreme, a partitioning scheme that groups all the
- symbols in the same set results in minimal SSA webs and almost
- total loss of precision.
-
- The partitioning heuristic uses three parameters to decide the
- order in which symbols are processed. The list of symbols is
- sorted so that symbols that are more likely to be partitioned are
- near the top of the list:
-
- - Execution frequency. If a memory reference is in a frequently
- executed code path, grouping it into a partition may block useful
- transformations and cause sub-optimal code generation. So, the
- partition heuristic tries to avoid grouping symbols with high
- execution frequency scores. Execution frequency is taken
- directly from the basic blocks where every reference is made (see
- update_mem_sym_stats_from_stmt), which in turn uses the
- profile-guided machinery, so if the program is compiled with PGO
- enabled, more accurate partitioning decisions will be made.
-
- - Number of references. Symbols with few references in the code
- are partitioned before symbols with many references.
-
- - NO_ALIAS attributes. Symbols with any of the NO_ALIAS*
- attributes are partitioned after symbols marked MAY_ALIAS.
-
- Once the list is sorted, the partitioning proceeds as follows:
-
- 1- For every symbol S in MP_INFO, create a new memory partition MP,
- if necessary. To avoid memory partitions that contain symbols
- from non-conflicting alias sets, memory partitions are
- associated with the memory tag that holds S in its alias set. So,
- when looking for a memory partition for S, the memory partition
- associated with one of the memory tags holding S is chosen. If
- none exists, a new one is created.
-
- 2- Add S to memory partition MP.
-
- 3- Reduce by 1 the number of VOPS for every memory tag holding S.
-
- 4- If the total number of VOPS is less than MAX_ALIASED_VOPS or the
- average number of VOPS per statement is less than
- AVG_ALIASED_VOPS, stop. Otherwise, go to the next symbol in the
- list. */
-
-static void
-compute_memory_partitions (void)
-{
- tree tag;
- unsigned i;
- mem_sym_stats_t mp_p;
- VEC(mem_sym_stats_t,heap) *mp_info;
- bitmap new_aliases;
- VEC(tree,heap) *tags;
- struct mem_ref_stats_d *mem_ref_stats;
- int prev_max_aliased_vops;
-
- mem_ref_stats = gimple_mem_ref_stats (cfun);
- gcc_assert (mem_ref_stats->num_vuses == 0 && mem_ref_stats->num_vdefs == 0);
-
- if (mem_ref_stats->num_mem_stmts == 0)
- return;
-
- timevar_push (TV_MEMORY_PARTITIONING);
-
- mp_info = NULL;
- tags = NULL;
- prev_max_aliased_vops = MAX_ALIASED_VOPS;
-
- /* Since we clearly cannot lower the number of virtual operands
- below the total number of memory statements in the function, we
- may need to adjust MAX_ALIASED_VOPS beforehand. */
- if (MAX_ALIASED_VOPS < mem_ref_stats->num_mem_stmts)
- MAX_ALIASED_VOPS = mem_ref_stats->num_mem_stmts;
-
- /* Update reference stats for all the pointed-to variables and
- memory tags. */
- update_reference_counts (mem_ref_stats);
-
- /* Add all the memory symbols to MP_INFO. */
- build_mp_info (mem_ref_stats, &mp_info, &tags);
-
- /* No partitions required if we are below the threshold. */
- if (!need_to_partition_p (mem_ref_stats))
- {
- if (dump_file)
- fprintf (dump_file, "\nMemory partitioning NOT NEEDED for %s\n",
- get_name (current_function_decl));
- goto done;
- }
-
- /* Sort the MP_INFO array so that symbols that should be partitioned
- first are near the top of the list. */
- sort_mp_info (mp_info);
-
- if (dump_file)
- {
- fprintf (dump_file, "\nMemory partitioning NEEDED for %s\n\n",
- get_name (current_function_decl));
- fprintf (dump_file, "Memory symbol references before partitioning:\n");
- dump_mp_info (dump_file, mp_info);
- }
-
- /* Create partitions for variables in MP_INFO until the total
- number of VOPS is below MAX_ALIASED_VOPS or the average number
- of VOPS per statement is below AVG_ALIASED_VOPS. */
- for (i = 0; VEC_iterate (mem_sym_stats_t, mp_info, i, mp_p); i++)
- {
- tree mpt;
-
- /* If we are below the threshold, stop. */
- if (!need_to_partition_p (mem_ref_stats))
- break;
-
- mpt = find_partition_for (mp_p);
- estimate_vop_reduction (mem_ref_stats, mp_p, mpt);
- }
-
- /* After partitions have been created, rewrite alias sets to use
- them instead of the original symbols. This way, if the alias set
- was computed as { a b c d e f }, and the subset { b e f } was
- grouped into partition MPT.3, then the new alias set for the tag
- will be { a c d MPT.3 }.
-
- Note that this is not strictly necessary. The operand scanner
- will always check if a symbol belongs to a partition when adding
- virtual operands. However, by reducing the size of the alias
- sets to be scanned, the work needed inside the operand scanner is
- significantly reduced. */
- new_aliases = BITMAP_ALLOC (&alias_bitmap_obstack);
-
- for (i = 0; VEC_iterate (tree, tags, i, tag); i++)
- {
- rewrite_alias_set_for (tag, new_aliases);
- bitmap_clear (new_aliases);
- }
-
- BITMAP_FREE (new_aliases);
-
- if (dump_file)
- {
- fprintf (dump_file, "\nMemory symbol references after partitioning:\n");
- dump_mp_info (dump_file, mp_info);
- }
-
-done:
- /* Free allocated memory. */
- VEC_free (mem_sym_stats_t, heap, mp_info);
- VEC_free (tree, heap, tags);
-
- MAX_ALIASED_VOPS = prev_max_aliased_vops;
-
- timevar_pop (TV_MEMORY_PARTITIONING);
-}
-
-/* Compute may-alias information for every variable referenced in function
- FNDECL.
-
- Alias analysis proceeds in 3 main phases:
-
- 1- Points-to and escape analysis.
-
- This phase walks the use-def chains in the SSA web looking for three
- things:
-
- * Assignments of the form P_i = &VAR
- * Assignments of the form P_i = malloc()
- * Pointers and ADDR_EXPR that escape the current function.
-
- The concept of 'escaping' is the same one used in the Java world. When
- a pointer or an ADDR_EXPR escapes, it means that it has been exposed
- outside of the current function. So, assigning a pointer to a global
- variable, passing it as a function argument and returning it are all
- escape sites, as are conversions between pointers and integers.
-
- This is where we are currently limited. Since not everything is renamed
- into SSA, we lose track of escape properties when a pointer is stashed
- inside a field in a structure, for instance. In those cases, we
- assume that the pointer does escape.
-
- We use escape analysis to determine whether a variable is
- call-clobbered. Simply put, if an ADDR_EXPR escapes, then the variable
- is call-clobbered. If a pointer P_i escapes, then all the variables
- pointed-to by P_i (and its memory tag) also escape.
-
- 2- Compute flow-sensitive aliases
-
- We have two classes of memory tags. The first class is associated
- with the pointed-to data type of the pointers in the program; these
- tags are called "symbol memory tags" (SMT). The other class is
- associated with SSA_NAMEs and is called "name memory tags" (NMT).
- The basic idea is that when adding operands for an INDIRECT_REF
- *P_i, we first check whether P_i has a name tag; if it does, we use
- it, because it has more precise aliasing information. Otherwise, we
- use the standard symbol tag.
-
- In this phase, we go through all the pointers we found in points-to
- analysis and create alias sets for the name memory tags associated with
- each pointer P_i. If P_i escapes, we mark the variables it points to,
- and its tag, as call-clobbered.
-
-
- 3- Compute flow-insensitive aliases
-
- This pass compares the alias set of every symbol memory tag with
- that of every addressable variable found in the program. Given a
- symbol memory tag SMT and an addressable variable V, if the alias
- sets of SMT and V conflict (as computed by may_alias_p), then V is
- marked as an alias tag and added to the alias set of SMT.
-
- For instance, consider the following function:
-
- foo (int i)
- {
- int *p, a, b;
-
- if (i > 10)
- p = &a;
- else
- p = &b;
-
- *p = 3;
- a = b + 2;
- return *p;
- }
-
- After alias analysis has finished, the symbol memory tag for pointer
- 'p' will have two aliases, namely variables 'a' and 'b'. Every time
- pointer 'p' is dereferenced, we want to mark the operation as a
- potential reference to 'a' and 'b'.
-
- foo (int i)
- {
- int *p, a, b;
-
- if (i_2 > 10)
- p_4 = &a;
- else
- p_6 = &b;
- # p_1 = PHI <p_4(1), p_6(2)>;
-
- # a_7 = VDEF <a_3>;
- # b_8 = VDEF <b_5>;
- *p_1 = 3;
-
- # a_9 = VDEF <a_7>
-