On Friday, 3 January 2014 at 17:41:48 UTC, Jeroen Bollen wrote:
On Friday, 3 January 2014 at 13:42:19 UTC, monarch_dodra wrote:
On Friday, 3 January 2014 at 13:30:09 UTC, Jeroen Bollen wrote:
I already considered this, but the problem is that I need to smooth the noise, and to do that I need all the surrounding 'checkpoints' too. This means it'll have to load 5 at a time.

I don't see that as a problem. Just because you can load sub-regions individually doesn't mean you are limited to loading only one at a time.

That still means that out of 5 million pixels loaded, only about 1,004,000 will be used. I guess I can recycle them, though.
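Recycling could be as simple as a small LRU cache keyed by block coordinates, so the neighbours loaded for smoothing one block are reused when the adjacent block is processed. A minimal Python sketch of the idea; the block size, world seed, seed-mixing constants, and cache API are all assumptions for illustration, not anything from this thread:

```python
import random
from collections import OrderedDict

BLOCK = 100          # assumed block edge length; adjust to taste
MASTER_SEED = 42     # assumed world seed

def block_seed(bx, by):
    # Derive a deterministic per-block seed from the block coordinates,
    # so the same block always regenerates identically.
    return (MASTER_SEED * 1_000_003 + bx) * 1_000_003 + by

def generate_block(bx, by):
    rng = random.Random(block_seed(bx, by))
    return [[rng.random() for _ in range(BLOCK)] for _ in range(BLOCK)]

class BlockCache:
    """Keep the most recently used blocks around so neighbours loaded
    for smoothing can be reused instead of regenerated."""
    def __init__(self, capacity=8):
        self.capacity = capacity
        self.blocks = OrderedDict()

    def get(self, bx, by):
        key = (bx, by)
        if key in self.blocks:
            self.blocks.move_to_end(key)      # mark as recently used
        else:
            self.blocks[key] = generate_block(bx, by)
            if len(self.blocks) > self.capacity:
                self.blocks.popitem(last=False)  # evict oldest block
        return self.blocks[key]
```

With a capacity of 8 or so, smoothing a row of blocks only ever generates each block once, since the 4 neighbours of the next block are mostly already cached.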

You could also do a "subdivision" approach:

First, a table that contains seeds for 1Kx1K blocks. However, each seed is designed to seed 10,000 new seeds, each generating a 10x10 block (for example).

This way, you can first load your big 1Kx1K block, and then 400 10x10 blocks. Seems reasonable to me.

I don't know exactly how big your data is, so your mileage may vary. Depending on your algorithm, you may have to adapt the numbers, or even the number of subdivisions.
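The two-level seeding described above can be sketched as follows. This is only an illustration of the subdivision idea, in Python rather than D; the sub-block size, world seed, and seed-mixing scheme are assumptions:

```python
import random

SUB = 10             # assumed sub-block edge length
SUBS_PER_SIDE = 100  # 100x100 sub-blocks = one 1Kx1K big block
MASTER_SEED = 42     # assumed world seed

def sub_seeds(bx, by):
    """Cheaply derive the 10,000 sub-seeds for big block (bx, by).
    Only integers are produced here; no pixel data yet."""
    rng = random.Random((MASTER_SEED * 1_000_003 + bx) * 1_000_003 + by)
    return [rng.getrandbits(64) for _ in range(SUBS_PER_SIDE ** 2)]

def generate_sub_block(bx, by, sx, sy):
    """Materialize just one 10x10 sub-block of big block (bx, by),
    without generating the other 9,999 sub-blocks."""
    seed = sub_seeds(bx, by)[sy * SUBS_PER_SIDE + sx]
    rng = random.Random(seed)
    return [[rng.random() for _ in range(SUB)] for _ in range(SUB)]
```

The point is that the top level only deals in seeds (cheap integers), so you can regenerate exactly the handful of 10x10 sub-blocks a region needs instead of a whole 1Kx1K block of pixels.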
