Marco van de Voort wrote:
What I think we should devise or adapt is something a bit similar to the python/perl library model.

What exactly is that model? Triple the average Linux packaging system size
by sticking every unit into a separate installable package and provide a
web of dependencies that pulls half of them in on an average install?
The model is that you download the source you need and compile it into your own system.
- use/extend the existing fpc tools to load the package. They get
  installed into a predefined directory like the /perl/site/lib directory
  in perl.

Define "load". Also keep in mind that Perl installs interpreter source code;
what exactly do you imagine for FPC? Compiled libs, source? How do you deal
with versioning?
Right. That's what perl does. We should do the same thing. Then compile it into a library that we can use.
- the compilable source files get compiled at package load time
  and "integrated" into the system for use in the "uses" clause.

I don't think it is wise to bother the compiler with the packaging system.
It has no feelings... :-) No pain will be caused by, say, 3 seconds of compiling.
Keep in mind we are not a scripting system, but a compiler. The whole idea
is that the end-users of a binary don't need the whole shebang.
I agree. It is a compiler. So we shouldn't be afraid to ask it to compile an extra file for us.

The issue is more about platform independence and download size. Distributing packages as source code and compiling them after download means quicker downloads and less time spent recompiling the whole of Lazarus just to add a new package.
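Just to make that concrete, here is a rough sketch of what a "package load" step could look like: a small tool compiles the downloaded sources into a predefined site directory (which fpc.cfg would already list as a unit search path with -Fu), so the units simply become available to a uses clause afterwards. The paths, the package unit name and the tool itself are all invented for illustration; fppkg may well organise this quite differently.

program LoadPackage;   { hypothetical sketch of a "package load" step }

{$mode objfpc}{$H+}

uses
  SysUtils;

const
  { Assumed site directory; fpc.cfg would list it with -Fu so that
    programs can simply "use" whatever gets compiled into it. }
  SiteUnitDir = '/usr/local/lib/fpc/site/units';

var
  SrcDir: string;
  Status: Integer;
begin
  if ParamCount < 1 then
  begin
    WriteLn('usage: loadpackage <package-source-dir>');
    Halt(1);
  end;
  SrcDir := ParamStr(1);

  { Compile the downloaded package into the site directory.
    A real tool would read a package manifest instead of
    hard-coding a single unit name. }
  Status := ExecuteProcess('/usr/bin/fpc',
    '-FU' + SiteUnitDir + ' ' + SrcDir + '/mypackageunit.pp');

  if Status <> 0 then
    WriteLn('compilation failed with exit code ', Status)
  else
    WriteLn('package installed into ', SiteUnitDir);
end.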
- fpc/lazarus uses these library files just as it would
  internal functions...

That is impossible. (or you use "internal functions" entirely the wrong
way).
Why impossible?

At compile time, the compiler could decide whether the function is "internal", i.e. within the Lazarus library files, or "external", within the user's library directory.

It makes no difference to the linker (ld) where the code is. The linker just needs the names of the additional libraries to link against.
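To illustrate (just a sketch; the library and routine names are invented): the compiler already lets a declaration say which shared library a routine lives in, and from there ld only needs the library name and a search path, exactly as it would for any C library.

unit GLSceneBinding;   { hypothetical binding unit }

{$mode objfpc}{$H+}

interface

{ The "external" directive tells the compiler the routine lives in a
  shared library; the compiler just passes the library name on to the
  linker.  Both names below are made up. }
function gls_render_scene(handle: Pointer): LongInt; cdecl;
  external 'glscene_rt';

implementation

end.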

Maybe we already have some of the parts of this. It would be good
to have something that works as simply as this. Yes, I might be able to
help with making it work if people want to go this way.

This is what fppkg and friends are achieving. However, it has not much to do
with dynlinking. There are tough technical nuts to be cracked there first.
Read the wiki articles.
Well, I don't have enough working experience with FPC to comment too deeply on what it is or isn't doing.

But I have noticed that we are using the GNU ld linker, and I have been exposed to it a bit in the past. I know what it can and can't do.

When I use GLScene in Lazarus, ld takes an enormous amount of time to link. It is unbearable: ld is being asked to write all of the GLScene code into the .exe and seems to take forever (15+ seconds).

I would prefer to have the option to link against the runtime libraries dynamically. This would save so much time.

As far as I am aware, it is only a matter of changing the command-line parameters when ld is called, and ensuring that the libraries being linked against are in the correct directories.
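And even before link-time dynamic linking is sorted out, FPC can already load a shared library at run time with the dynlibs unit, so symbols resolved that way never go through ld at all. A rough sketch, with the library and function names invented:

program DynLoadSketch;

{$mode objfpc}{$H+}

uses
  dynlibs;

type
  TRenderProc = function(handle: Pointer): LongInt; cdecl;

var
  Lib: TLibHandle;
  Render: TRenderProc;
begin
  { SharedSuffix is 'so', 'dll' or 'dylib' depending on the target. }
  Lib := LoadLibrary('libglscene_rt.' + SharedSuffix);
  if Lib = NilHandle then
  begin
    WriteLn('could not load the library');
    Halt(1);
  end;
  try
    { The symbol name is hypothetical. }
    Pointer(Render) := GetProcedureAddress(Lib, 'gls_render_scene');
    if not Assigned(Render) then
      WriteLn('symbol not found')
    else
      WriteLn('render returned ', Render(nil));
  finally
    UnloadLibrary(Lib);
  end;
end.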

David