Hi there!
As some may know, I've been working on a native code generation backend for
aarch64[1]. When Ben initially wrote about The State of GHC on ARM[2], I
was quite skeptical whether a native code generator would really be what we
should be doing. And the claim that it would take a week or two might h…
On September 25, 2020 6:21:23 PM EDT, Ryan Scott wrote:
...
>However, I discovered recently that there are places where GHC *does* use
>unboxed tuples with arity greater than 62. For example, the
>GHC.Prim.unpackInt8X64# [2] function returns an unboxed tuple of size 64.
>I was confused for a wh…
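For readers following along: the arity question matters because unboxed tuples are not heap objects at all — their components are returned directly in registers or on the stack, which is why a 64-wide result like unpackInt8X64#'s is sensitive to backend limits. A minimal sketch of an unboxed-tuple-returning function (an illustrative example, not code from this thread):

```haskell
{-# LANGUAGE UnboxedTuples #-}
module Main where

-- Returns quotient and remainder as an unboxed pair: no tuple cell is
-- allocated on the heap; both components come back in registers.
divMod# :: Int -> Int -> (# Int, Int #)
divMod# n d = (# n `div` d, n `mod` d #)

main :: IO ()
main =
  case divMod# 7 2 of
    (# q, r #) -> print (q, r)  -- prints (3,1)
```

Because unboxed tuples cannot be stored or passed around as first-class values, their width is a property of a function's calling convention — which is exactly where a fixed size limit like 62 bites.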
I had a feeling that this might be the case. Unfortunately, this technology
preview is actively blocking progress on !4097, which leaves me at a loss
for what to do. I can see two ways forward:
1. Remove unpackInt8X64# and friends.
2. Reconsider whether the tuple size limit should apply to unboxed tuples.
Luite is currently working on unboxed tuple support in the interpreter.
This will also be limited, as getting a generic solution for arbitrary
sized tuples raises a lot of complications.
Thus, from a practical point of view, I’d go for (1) ;-)
We’ll need to rethink and get proper SIMD support at s…
I think it would be worth trying to add tuples up to width 64. The only real
cost here is the interface file size of GHC.Tuple and if adding a 63-wide tuple
really does induce a crash then that is a bug in its own right that deserves
investigation.
- Ben
On September 26, 2020 8:26:32 AM EDT,
I think as long as it's bounded it's ok.
On Sat, Sep 26, 2020 at 8:52 PM Ben Gamari wrote:
> I think it would be worth trying to add tuples up to width 64. The only
> real cost here is the interface file size of GHC.Tuple and if adding a
> 63-wide tuple really does induce a crash then that is a bug in its own
> right that deserves investigation.
Steady and wonderful work!
Regards,
Takenobu
On Sat, Sep 26, 2020 at 6:44 PM Moritz Angermann wrote:
>
> Hi there!
>
> As some may know I've been working on a native code generation backend for
> aarch64[1]. When Ben initially wrote about The state of GHC on ARM[2], I was
> quite skeptical if a native code generator would really be what we should
> be doing. …