On Mon, Feb 15, 2021 at 3:17 AM Walter Dnes <waltd...@waltdnes.org> wrote:
>
> On Sun, Feb 14, 2021 at 06:09:58PM -0700, Grant Taylor wrote
> > On 2/14/21 10:51 AM, Jack wrote:
> > > I don't think you can completely get rid of it.
> >
> > My (long term) desire is to do away with /lib32 and /lib64, ultimately
> > only using /lib.  Likewise for the other library directories in /usr or
> > wherever they are.  I don't see a need for the specific bit variants in
> > the future.
>
>   How long before we see /lib and /lib64 *AND* /lib128 ?

Well, anything is possible, but it seems unlikely.  If it happens
soon, then chances are multilib will still be a thing, and so less
stuff will break than when amd64 was introduced.  If it happens in a
century when we're all running no-multilib, then we'll be reinventing
the wheel.

The main things that drove amd64, though, were:
* increasing the number of registers available
* allowing direct access to more than 4GB of RAM (32-bit designs were
capped at 4GB, or a fraction of that depending on the OS design)

I suspect the first is less of a concern these days - compilers
generally only need so many registers, and when new instructions are
added that need more register space they tend to come with registers
to accommodate them.

The second will be a concern when exabyte-scale data structures are
common.  Note that current processors generally can't handle that
much address space (they only implement 48 or 57 bits of virtual
address space), but the amd64 instruction set itself can (I think),
so CPUs can continue to scale up.  RAM capacity doesn't really seem
to be increasing in recent years - I'm not sure if that is more
market-driven or a technological limitation.  RAM speed has improved
somewhat, especially in niches like GPUs.  Computers with 1GB of RAM
were a thing in Y2K, and today it is pretty uncommon for a standard
desktop to have more than 8GB; if you want to cram more than about
128GB into a motherboard you start needing enterprise-grade hardware.
That isn't a very large increase in 20 years - doubling every 3 years
(in terms of max capacity).  We're using 37 bits today (on desktops),
so at 3 years per bit that is another ~80 years until we exhaust 64
bits, assuming we continue to grow exponentially at the same rate.
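
If you want to check that extrapolation, here's the back-of-envelope
in Python.  The inputs (128GB today, one doubling per ~3 years) are
just the rough figures above, nothing rigorous:

  import math

  # Assumption: desktop max RAM keeps doubling every ~3 years,
  # extrapolating the ~1GB (2000) to ~128GB (today) trend.
  bits_today = math.log2(128 * 2**30)            # 128GB -> 37 bits
  years_per_bit = 3                              # one doubling per ~3 years
  years_left = (64 - bits_today) * years_per_bit
  print(round(bits_today))                       # 37
  print(round(years_left))                       # 81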
Though you do have to think about what use cases actually need that
kind of working set.  At 64-bit depth, 300dpi 3D graphics would
require about 200MB/in^3.  If you had a house-sized VR space (20k
ft^3) rendered at that detail you'd need roughly 7PB of RAM to store
a frame of video, which is still only ~53 bits.  Maybe if you want a
holodeck that 1000 people can play around in at once you'd run into
the 64-bit limit (of course you'd have a ton of IO issues to fix long
before then).
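
Same deal for the voxel numbers, using the figures above (300 voxels
per inch in each dimension, 8 bytes per voxel, 20k ft^3) - all
order-of-magnitude guesses:

  import math

  voxels_per_in3 = 300 ** 3                 # 27 million voxels per cubic inch
  bytes_per_in3 = voxels_per_in3 * 8        # 64-bit depth -> ~216MB/in^3
  space_in3 = 20_000 * 12 ** 3              # 20k ft^3 in cubic inches
  total_bytes = space_in3 * bytes_per_in3   # ~7.5e15 bytes, i.e. ~7PB
  print(round(bytes_per_in3 / 1e6))         # 216 (MB per cubic inch)
  print(round(total_bytes / 1e15, 1))       # 7.5 (PB per frame)
  print(round(math.log2(total_bytes)))      # 53 (address bits needed)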

So, that makes me wonder what the practical requirements are in order
to implement The Matrix.  :)  Of course, if you're sticking people in
it maybe you can borrow some of their own memory capacity and
processing abilities to drive it.  Kind of makes you wonder why you'd
even need the human brains in the first place if you're able to deal
with that kind of data in a simulation...

-- 
Rich
