This is not a patch review; let's move this to gcc@gcc.gnu.org.

On 15/07/2014 17:03, Roman Gareev wrote:
> I've found out that int128_integer_type_node and
> long_long_integer_type_node are NULL at the moment of definition of
> the graphite_expression_size_type. Maybe we should use
> long_long_integer_type_node, because, as you said before, using
> signed 64 bits has also proved to be very robust. What do you think
> about this?

I do not fully understand this message. You first say that
long_long_integer_type_node is NULL, but then you want to use it. That
does not seem to be a solution. Most likely it is a solution, but the
problem description makes it hard to follow. Is the problem caused by
initialization order issues? Or why are these types NULL?

I am fine with using 64 bits by default, but I would like to keep the
possibility to compile with 128 bits, to allow the size to be changed
easily during debugging. So using a specific type directly, without
going through a Graphite-specific variable, is something I would like
to avoid.
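
Something along the lines of the following sketch is what I have in
mind. It is only an illustration, not a proposed patch: the variable
and function names are placeholders, it assumes GCC's internal tree API
(build_nonstandard_integer_type), and it builds the type lazily so that
the global type nodes are already set up by the time it runs.

  /* Sketch only: names and call site are assumptions, not the actual
     patch.  Requires GCC's internal headers.  */
  #include "config.h"
  #include "system.h"
  #include "coretypes.h"
  #include "tree.h"

  /* Precision of the type used for generated index expressions; switch
     to 128 during debugging on targets that provide __int128.  */
  #define GRAPHITE_EXPRESSION_TYPE_PRECISION 64

  static tree graphite_expression_size_type;

  static void
  graphite_init_expression_size_type (void)
  {
    /* Build the type lazily, once the front end has initialized the
       global type nodes, instead of reading a possibly-NULL node such
       as long_long_integer_type_node at file scope.  */
    graphite_expression_size_type
      = build_nonstandard_integer_type (GRAPHITE_EXPRESSION_TYPE_PRECISION,
                                        /*unsignedp=*/0);
  }

With such an indirection, switching the whole of the generated code to
128 bits is a one-line change while debugging.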

Cheers,
Tobias
