Re: Please integrate build framework into the compiler

2009-03-21 Thread davidl
On Sun, 22 Mar 2009 12:18:03 +0800, Andrei Alexandrescu wrote:



grauzone wrote:
My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041  
contains an outdated, buggy version. Where can I find the up-to-date  
source code?


Hold off on that for now.

Another question, rdmd just calls dmd, right? How does it scan for  
dependencies, or is this step actually done by dmd itself?


rdmd invokes dmd -v to get deps. It's an interesting idea to add a 
compilation mode to rdmd that asks dmd to generate headers and diff them 
against the old headers. That way we can implement incremental rebuilds 
without changing the compiler.


Andrei


The bad news is that public imports ruin the simplicity of dependency 
tracking, though in most cases D projects use private imports.


Maybe we can further restrict the public imports.

I suggest we add a new kind of module for interfacing. Public imports 
would only be allowed in such modules, and an interface module could 
contain only public imports.


example: all.d

module(interface) all;
public import blah;
public import blah.foo;

An interface module cannot import another interface module, so no public 
import chain can be created. The shortcoming is:


module(interface) subpack.all;
public import subpack.mod;

module(interface) all;
public import subpack.mod;  // duplication here.
public import subpack1.mod1;


Re: for in D versus C and C++

2009-03-21 Thread Walter Bright

Sean Kelly wrote:
I've found that once I created one lexer it could be re-used pretty 
easily for other languages too.  And recursive descent parsers are 
trivial to write.  It may be overkill for command-line parameters, but 
for anything remotely structured it's generally worth using.


When I was looking into parsing date strings, I thought it would be much 
easier if I adopted a lex/parse style approach. The result is in 
std.dateparse. The payoff is I've had very little trouble with it.


Re: Licences issues with d runtime

2009-03-21 Thread Walter Bright

Robert Jacques wrote:
This is a serious legal obligation which isn't in the primary DMD 
licence or readme. Would it be possible for the licence in druntime to 
be unified? (If not, a more prominent notice would be appreciated)


Sean is working on fixing this.


Licences issues with d runtime

2009-03-21 Thread Robert Jacques
Deep in the 'eliminate writeln et comp?' thread there's been a recent 
discussion about the confusion over Tango licences. In particular, 
regarding the desire that the standard library shouldn't require binary 
'copies' (a.k.a. every single executable compiled using it) to 
publish/contain the library's licence. (And specifically, trying to 
understand the AFL.) Anyway, I recently checked D2, and about half the 
druntime files are under the BSD licence (which requires publication) while 
the other half are under the zlib/libpng/Phobos licence (which doesn't).


This is a serious legal obligation which isn't in the primary DMD licence  
or readme. Would it be possible for the licence in druntime to be unified?  
(If not, a more prominent notice would be appreciated)


Thank you.


Re: .NET on a string

2009-03-21 Thread Cristian Vlasceanu
>>
>> The idea of slices and arrays being distinct types does seem to have 
>> advantages. I've seen a couple of mentions of this lately, but has there 
>> been a *rigorous* discussion?
>
> There has been.  But there are very good reasons to keep arrays and slices 
> the same type.  Even in C# and Java, a substring is the same type as a 
> string.  It allows iterative patterns such as:
>
> str = str[1..$];
>

I am afraid that there's a fallacy in your argument: substrings ARE strings: 
1) they are full copies of the characters in the given range, and 2) once 
the substring is created it goes its own merry way (i.e. it does not keep 
track of any relationship to the "original" string). Slices ARE NOT arrays. 
Slices are more like "views" into the original array. It is like the 
difference between the icon and the saint / deity that it represents.

Another point that I have a hard time getting across (even to the language 
heavy-weights) is that just because it is easy to represent arrays and 
slices seamlessly IN THE PARTICULAR CASE OF THE DIGITAL MARS BACKEND, it 
does not mean it is going to work as smoothly and seamlessly in other 
systems. The .NET backend that I am working on is a case in point. If 
instead of using 
.NET built-in arrays I craft my own representation (to stay compatible with 
the DMD's way of doing array and slices) then I give up interoperability 
with other languages -- and that would defeat the point of doing D on .NET 
to begin with.

Your proposed solution is interesting but implementation-specific. I am 
afraid that I cannot use it with .NET (I just generate IL code, which is 
more high-level than "ordinary" assembly code).

I passed a proposal of my own to Walter and Andrei, and that is to have D 
coders explicitly state the intent of using a slice with the "ref" keyword; 
"ref" is already a legal token in D (at least in 2.0) albeit it is only 
valid in the context of a parameter list, or foreach argument list. It is 
not legal to say "ref int j = i;" in a declaration, for example. But it is a 
trivial change in the parser (I have implemented this change as a proof of 
concept / language extension research) to allow ref (just for slices): "ref 
int[] s = a[1..2];" other than in parameter and foreach arg lists.

I think that "ref" makes sense, because slices, like I said, are 
conceptually views (or references) into the "true" arrays. This simple 
change would a) make D code more self-documenting, and it would give a very 
powerful hint to the compiler. Also, the "ref" semantics is backwards 
compatbile with the exisiting cases where "ref" is allowed.

But there may be some yet unforeseen side-effects (when dealing with generic 
code, for instance) that need to be further investigated before I push for 
this solution at full speed. If I find it to work suitably well I have no 
problem in making my own D 2.0 / .NET dialect...

Cheers,
Cristian 




Re: Please integrate build framework into the compiler

2009-03-21 Thread Andrei Alexandrescu

grauzone wrote:
My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041 
contains an outdated, buggy version. Where can I find the up-to-date 
source code?


Hold off on that for now.

Another question, rdmd just calls dmd, right? How does it scan for 
dependencies, or is this step actually done by dmd itself?


rdmd invokes dmd -v to get deps. It's an interesting idea to add a 
compilation mode to rdmd that asks dmd to generate headers and diff them 
against the old headers. That way we can implement incremental rebuilds 
without changing the compiler.
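
For illustration, a rough sketch (not rdmd's actual source; it assumes D2 
Phobos and the "import   module.name   (path)" line format that dmd -v 
currently prints) of harvesting the dependency list:

import std.algorithm : filter, map, startsWith;
import std.array : array, split;
import std.process : execute;
import std.string : lineSplitter;

// Run "dmd -v -o-" on the root module and keep the "import ..." lines;
// the second whitespace-separated column is the imported module's name.
string[] depsOf(string root)
{
    auto r = execute(["dmd", "-v", "-o-", root]);
    return r.output
        .lineSplitter
        .filter!(l => l.startsWith("import "))
        .map!(l => l.split[1])
        .array;
}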


Andrei


Re: Response files

2009-03-21 Thread Andrei Alexandrescu

Walter Bright wrote:

Andrei Alexandrescu wrote:

.o on Linux, .obj on Windows.


OBJSUFFIX_win32 = .obj
OBJSUFFIX_linux = .o
...
OS = linux
...
... file$(OBJSUFFIX_$(OS)) ...


I hadn't thought of using macros to generate macros. It's a good idea.


I confess I also hadn't until the third iteration of the Phobos makefile.

Andrei


Re: Please integrate build framework into the compiler

2009-03-21 Thread grauzone

Little self-promotion here, and in case Walter misses some of them:
http://d.puremagic.com/issues/show_bug.cgi?id=2744
http://d.puremagic.com/issues/show_bug.cgi?id=2745
http://d.puremagic.com/issues/show_bug.cgi?id=2747
http://d.puremagic.com/issues/show_bug.cgi?id=2748
http://d.puremagic.com/issues/show_bug.cgi?id=2751


If it's about bugs, it would (probably) be easier for Walter to fix that 
code generation bug, that forces dsss/rebuild to invoke a new dmd 
process to recompile each outdated file separately.


This would bring a critical speedup for incremental compilation (from 
absolutely useless to relatively useful), and all impatient D users with 
middle sized source bases could be happy.


In C++, a sophisticated makefile carefully builds the .h dependencies of 
.c files: once .h files are updated, the .c files that depend on them are 
recompiled. The same detection could be done here by comparing the old 
.di files and the new .di files for equality.


This sounds like a really nice idea, but it's also quite complex.

For example, to guarantee correctness, the D compiler would _always_ have 
to read the .di file when importing a module (and not the .d file 
directly). If it didn't do that, it could "accidentally" use 
information that isn't included in the .di file (like code needed for 
inlining). This means you'd have to generate the .di files first. When 
doing this, you'd also have to deal with circular dependencies, which 
brings extra headaches. And of course, you'd need to fix all those .di 
generation bugs. It's actually a bit scary that the compiler not only 
has to be able to parse D code, but also to output D source code again. 
And .di files are not even standardized.


It's perhaps messy enough to deem it unrealistic. Still, nice idea.


Re: Please integrate build framework into the compiler

2009-03-21 Thread grauzone
My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041 
contains an outdated, buggy version. Where can I find the up-to-date 
source code?


Another question, rdmd just calls dmd, right? How does it scan for 
dependencies, or is this step actually done by dmd itself?


Re: Response files

2009-03-21 Thread Walter Bright

Andrei Alexandrescu wrote:

.o on Linux, .obj on Windows.


OBJSUFFIX_win32 = .obj
OBJSUFFIX_linux = .o
...
OS = linux
...
... file$(OBJSUFFIX_$(OS)) ...


I hadn't thought of using macros to generate macros. It's a good idea.


Re: Response files

2009-03-21 Thread Sean Kelly

Andrei Alexandrescu wrote:


More code in makefiles doesn't necessarily improve things quite a lot. 
druntime has a lot of makefiles; apparently every single blessed thing 
has a makefile dedicated to it. But that complicates things without 
benefit.


In theory, each thing in druntime with a makefile is actually a 
standalone project, but since they're built together for the current 
distribution I agree it's overcomplicated.


Regarding DM's make program, I'd be happy if it simply accepted rules 
with wildcards (ie. %.o : %.d).  That would eliminate basically all the 
differences between the Win32 and Posix makefiles in druntime.


‘final’ variables: an alternative definition and its usefulness.

2009-03-21 Thread Robert Jacques
I propose to extend the concept of ‘final’ to variables in order to fill a 
much-needed place in the current type system. Final variables would be 
transitive (a.k.a. deep), a super-type of non-final, assignable only at 
declaration, and would apply only to references. Essentially, it would be a 
references-only version of ‘const’. Despite this similarity, it is 
orthogonal to both the immutable-mutable-const and the shared-local-scope 
(a.k.a. shared-heap-stack) type trees. The primary reasons for proposing 
yet another type qualifier are bug 2095 and the lack of an equivalent of 
‘const’ for the shared-local-scope storage type system.


Regarding Bug 2095
Bug 2095 occurs because a reference to a reference type is implicitly  
convertible to a reference to a reference super-type. i.e.


class A {}
class B : A {}
B[] b = new B[10];
A[] a = b;      // Practically, this is really important
a[0] = new A(); // But allowing this causes bugs, since b has also changed


Now, the implicit conversion of B[] to A[] is really important for subtyping 
and for all functions operating on arrays of the super-type, in order to avoid 
casting left, right and center. However, if the implicitly converted  
super-type array is assigned to, and then the original array is accessed,  
random code execution can occur. Similarly, immutability can be  
circumvented.


Final solves this problem by providing an alternative implicit conversion  
of B[] to final A[]. E.g.


final A[] fa = b; // Still valid, and mutability/storage isn’t changed.
fa[0] = new A();  // But now this is a compile-time error.

Regarding shared-local-scope
The storage type system, consisting of shared, local and scope provides  
many correctness and performance advantages. However, akin to mutable and  
immutable there is no safe way to cast between them. This is an undesired  
situation as it would require three separate variants of most functions.  
The currently allowed implicit conversion between local and scope has  
resulted in functions being prevented from returning explicit scope types.  
However, a scope object can still escape, resulting in bugs.


Final references cannot escape a scope, as they can only be assigned at 
declaration. Thus, final scope and final local may be implicitly cast to 
each other, and either may be implicitly cast to final shared. (The reason 
final shared may not be implicitly cast to final local or final scope is 
that its member variables will be protected by memory fences and are 
therefore accessed differently at the machine level.)


Regarding Transitivity
Though bug 2095 would require arrays of arrays, etc., to be implicitly 
converted to final in a transitive manner, complete transitivity is not 
required for that and represents a limitation on the actual objects. It is 
the need to guarantee that final variables cannot result in an object 
escaping its scope that motivates full transitivity. In a situation where 
the storage type of member variables and member returns is dictated by an 
object’s own storage type, the implicit casting of final storage types 
causes the member variable and return storage types to become unknown, 
necessitating that they too be implicitly cast to final, and thus 
transitivity.


Additional ‘final’ restrictions
Functions whose return type is final may result in an object’s escape. This 
can occur only when the returned object is final or scope. Returning 
scope types is already illegal, but returning a final object may be needed 
and logically valid. However, as this is a potentially dangerous operation, 
an explicit cast should be required for final return types. Another possible 
escape could occur when final member variables are assigned during 
construction, and explicit casts should again be required there.


Re: new D2.0 + C++ language

2009-03-21 Thread Rainer Deyke
Sergey Gromov wrote:
> I think this is an overstatement.  It's only abstract write buffers
> where GC really doesn't work, like std.stream.BufferedFile.  In any
> other resource management case I can think of GC works fine.

OpenGL objects (textures/shader programs/display lists).
SDL surfaces.
Hardware sound buffers.
Mutex locks.
File handles.
Any object with a non-trivial destructor.
Any object that contains or manages one of the above.

Many of the above need to be released in a timely manner. For example,
it is a serious error to free an SDL surface after closing the SDL video
subsystem, and closing the SDL video subsystem is the only way to close
the application window under SDL.  Non-deterministic garbage collection
cannot work here.

Others don't strictly need to be released immediately after use, but
should still be released as soon as reasonably possible to prevent
resource hogging.  The GC triggers when the program is low on system
memory, not when the program is low on texture memory.
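
As a side note, here is a minimal D2 sketch (names and the file path are 
invented) of the kind of deterministic release being asked for, using only 
core.stdc.stdio: a struct destructor, like scope(exit), runs when the scope 
ends rather than whenever the GC gets around to it.

import core.stdc.stdio : FILE, fopen, fclose;

struct OwnedFile
{
    FILE* fp;

    this(const(char)* path, const(char)* mode) { fp = fopen(path, mode); }
    ~this() { if (fp) fclose(fp); }   // runs deterministically at end of scope
    @disable this(this);              // one owner: forbid copying the handle
}

void useIt()
{
    auto f = OwnedFile("data.txt", "r");   // "data.txt" is just a placeholder
    // ... read through f.fp ...
}   // fclose happens here, regardless of GC activity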

By my estimate, in my current project (rewritten in C++ after abandoning
D due to its poor resource management), about half of the classes manage
resources (directly or indirectly) that need to be released in a timely
manner.  The other 50% does not need RAII, but also wouldn't benefit
from GC in any area other than performance.


-- 
Rainer Deyke - rain...@eldwood.com


Re: Please integrate build framework into the compiler

2009-03-21 Thread davidl

On Sun, 22 Mar 2009 04:19:31 +0800, grauzone wrote:

I don't really understand what you mean. But if you want the compiler to  
scan for dependencies, I fully agree.


I claim that we don't even need incremental compilation. It would be  
better if the compiler would scan for dependencies, and if a source file  
has changed, recompile the whole project in one go. This would be simple  
and efficient.




This may not be true. Consider the DWT lib case: once you tweak a module 
only a little (that is, you do not modify any interface that connects with 
outside modules, nor code that could possibly affect modules in the same 
package), the optimal way is


dmd -c your_tweaked_module
link all_obj

That's much faster than regenerating all the other object files. Yes, 
feeding them all to DMD compiles really fast, but writing all the object 
files to disk costs a lot of time. And your impression of incremental 
compilation seems to be shaped by the rebuild and dsss systems. Rebuild 
takes no advantage of .di files, so it has to recompile every time, even 
when the modules a file depends on have unchanged .di files. I posted 
several blocking header-generation bugs in DMD, along with fixes. With 
just those small changes dmd can generate almost all header files 
correctly. I tested Tango, DWT and dwt-addons; those projects are very big 
and some make advanced use of templates. So the header-generation building 
strategy is really not far away.


Little self-promotion here, and in case Walter misses some of them:
http://d.puremagic.com/issues/show_bug.cgi?id=2744
http://d.puremagic.com/issues/show_bug.cgi?id=2745
http://d.puremagic.com/issues/show_bug.cgi?id=2747
http://d.puremagic.com/issues/show_bug.cgi?id=2748
http://d.puremagic.com/issues/show_bug.cgi?id=2751

In C++, a sophisticated makefile carefully builds the .h dependencies of 
.c files: once .h files are updated, the .c files that depend on them are 
recompiled. The same detection could be done here by comparing the old 
.di files and the new .di files for equality.
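
A minimal sketch of that comparison step (assuming D2 and Phobos' std.file; 
not part of any existing tool): regenerate the header, then recompile a 
module's importers only if the new header differs from the old one.

import std.file : exists, read;

// True if the freshly generated header differs from the previous one (or no
// previous one exists), i.e. the module's interface changed and the modules
// importing it need recompiling.
bool headerChanged(string oldDi, string newDi)
{
    if (!exists(oldDi))
        return true;
    return cast(ubyte[]) read(oldDi) != cast(ubyte[]) read(newDi);
}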


Re: new D2.0 + C++ language

2009-03-21 Thread Sergey Gromov
Sat, 21 Mar 2009 00:59:22 -0600, Rainer Deyke wrote:

> GC is useless for resource management.

I think this is an overstatement.  It's only abstract write buffers
where GC really doesn't work, like std.stream.BufferedFile.  In any
other resource management case I can think of GC works fine.


Re: Response files

2009-03-21 Thread Andrei Alexandrescu

Walter Bright wrote:

Georg Wrede wrote:
'Round here we say "maassa maan tavalla", which is probably something 
like "When in Rome, do like the Romans do".


Makefiles aren't just a C(++) thing. Unix has a culture of its own, 
Windows (I wouldn't say have a culture, but still) does it another 
way. So do we import the Unix way to Windows or the other way around? 
I'd go with the Romans in Rome.


If there were a vote (outside of this NG!!) with D users, probably 
there are more folks who write in D /and/ in C or another language /on 
their own/ OS, than folks who write D apps (big enough to need 
makefiles) for both Windows and Linux.



There is no standard for makefiles, I've run across dozens of different 
make programs that use different syntax and have different extensions. 
So, if you're going to have the same makefile across systems, you have 
to start with finding a make program that is fairly identical across 
those systems.


Then you have the \ vs / problems. Some people assure me that Windows 
now works flawlessly with /, but that simply isn't so. I keep running 
into odd cases where it doesn't, so I don't use / on Windows.


File name case sensitivity differs.

The command line utilities called by makefiles differ in their names, 
switches, and how they work.


dmd's flags are the same. Other than that, you only need to configure 
how files are deleted and how the C compiler is invoked.



.o on Linux, .obj on Windows.


OBJSUFFIX_win32 = .obj
OBJSUFFIX_linux = .o
...
OS = linux
...
... file$(OBJSUFFIX_$(OS)) ...


nothing on Linux, .exe on Windows.


See above.


.a on Linux, .lib on Windows.


See above.


It just goes on and on.


No. At some point it stops, and you gain by understanding where the OS 
matters for your product and how.


You could try and parameterize all of it, but 
then the makefile becomes an inscrutable mess. You could have scripts 
generate makefiles, embed scripts in the makefiles, etc., but is this 
really worthwhile? It's just a makefile. I spend almost zero time on 
them. I like them simple even if that means they're more verbose.


This is because you don't really need to. I work on Phobos a fair 
amount, and I don't want to update four-odd places whenever I add a 
module. There is something to be said for once and only once. I've 
overhauled Phobos' makefile twice, and every time I've gained by it.


And again this brings a basic disagreement I have about making a 
hodge-podge of particular cases instead of searching the higher ground 
of proper abstraction.


More code in makefiles doesn't necessarily improve things quite a lot. 
druntime has a lot of makefiles; apparently every single blessed thing 
has a makefile dedicated to it. But that complicates things without benefit.



Andrei


Re: Please integrate build framework into the compiler

2009-03-21 Thread Andrei Alexandrescu

grauzone wrote:

Andrei Alexandrescu wrote:

grauzone wrote:
I don't really understand what you mean. But if you want the compiler 
to scan for dependencies, I fully agree.


I claim that we don't even need incremental compilation. It would be 
better if the compiler would scan for dependencies, and if a source 
file has changed, recompile the whole project in one go. This would 
be simple and efficient.


That's precisely what rdmd does.


This looks really good, but I couldn't get it to work. Am I doing 
something wrong?


--- o.d:
module o;

import tango.io.Stdout;

void k() {
Stdout("foo").newline;
}

--- u.d:
module u;

import o;

void main() {
k();
}



$ rdmd u.d
/tmp/u-1000-20-49158160-A46C236CDE107E3B9F053881E4257C2D.o:(.data+0x38): 
undefined reference to `_D1o12__ModuleInfoZ'
/tmp/u-1000-20-49158160-A46C236CDE107E3B9F053881E4257C2D.o: In function 
`_Dmain':

u.d:(.text._Dmain+0x4): undefined reference to `_D1o1kFZv'
collect2: ld returned 1 exit status
--- errorlevel 1
rdmd: Couldn't compile or execute u.d.

$ dmd|grep Compiler
Digital Mars D Compiler v1.041


Should work, but I tested only with D2. You may want to pass --chatty to 
rdmd and see what commands it invokes.



Andrei


Re: eliminate writeln et comp?

2009-03-21 Thread Georg Wrede

Lars Ivar Igesund wrote:

Daniel Keep wrote:


I'm not talking about distribution of the actual library machine code,
I'm talking about the LEGAL ISSUES.  Tango's license apparently requires
you to explicitly include attribution for Tango in your program.  This
means it's possible to naively compile "Hello, World" with Tango,
distribute it and break the law.


Sorry to use you as the source to enter the thread, Daniel.

Tango DOES NOT IN ANY WAY require you to put attribution into your program. 
That is a choice you as a user would make entirely on your own by choosing 
to use Tango licensed under the BSD (which is quite possible because this 
license is better suited for use alongside the GPL).


However, the AFL does not put such a restriction on your binaries, and 
(unless you use the GPL for your code) the AFL is the license most users 
should use. This is also noted on the license page (it was probably not 
clear enough, I hope it is now).


http://dsource.org/projects/tango/wiki/LibraryLicense

For current or prospective contributors; you are completely and entirely 
entitled to relicense your own code to whichever license you wish, however 
these should also include the AFL and BSD when used in Tango.


To change the license to something else at this point (for instance to 
Apache 2.0 only), would be a major undertaking, but something that we may 
consider to do at a later point.


I read http://dsource.org/projects/tango/wiki/LibraryLicense.

I am sorry to say, the page /still/ is not /clear/ enough. (As of Mar 
22, 00:24 UTC.)


The first bullets establish the intent, yes. But everything after that 
is actually... worthless. What the page should instead do is explain, /in 
terms understandable to/ *anybody*, what you have to do if you 
incorporate Tango in your software, or if you make another library that 
depends on Tango.


Even if this includes "awkward things" (like having to have a constant 
string in the binary, mentioning Tango in the "About" menu item, or 
whatever else), it should be stated in layman-understandable terms.


Currently, words like "encumbrance", phrases like "provides broad 
rights" etc. only make the prospective reader run away in frustration. 
Just state what you want, in language that can be understood at First 
Reading, without asking your mother. Or both of you having an IQ of 170+.


I'm not surprised that Don and others are getting second thoughts about 
contributing. A /clear/ stance to these issues makes everybody's 
(contributors, users, OS distributors, even app vendors) life easier. 
And, therefore, increases the popularity of Tango.


Re: Response files

2009-03-21 Thread Walter Bright

Georg Wrede wrote:
'Round here we say "maassa maan tavalla", which is probably something 
like "When in Rome, do like the Romans do".


Makefiles aren't just a C(++) thing. Unix has a culture of its own, 
Windows (I wouldn't say have a culture, but still) does it another way. 
So do we import the Unix way to Windows or the other way around? I'd go 
with the Romans in Rome.


If there were a vote (outside of this NG!!) with D users, probably there 
are more folks who write in D /and/ in C or another language /on their 
own/ OS, than folks who write D apps (big enough to need makefiles) for 
both Windows and Linux.



There is no standard for makefiles, I've run across dozens of different 
make programs that use different syntax and have different extensions. 
So, if you're going to have the same makefile across systems, you have 
to start with finding a make program that is fairly identical across 
those systems.


Then you have the \ vs / problems. Some people assure me that Windows 
now works flawlessly with /, but that simply isn't so. I keep running 
into odd cases where it doesn't, so I don't use / on Windows.


File name case sensitivity differs.

The command line utilities called by makefiles differ in their names, 
switches, and how they work.


.o on Linux, .obj on Windows.

nothing on Linux, .exe on Windows.

.a on Linux, .lib on Windows.

It just goes on and on. You could try and parameterize all of it, but 
then the makefile becomes an inscrutable mess. You could have scripts 
generate makefiles, embed scripts in the makefiles, etc., but is this 
really worthwhile? It's just a makefile. I spend almost zero time on 
them. I like them simple even if that means they're more verbose.


Re: Please integrate build framework into the compiler

2009-03-21 Thread Christopher Wright

grauzone wrote:
- Long dependency chains. Unlike in C/C++, you can't separate a module 
into interface and implementation. Compared to C++, it's as if a change 
to one .c file triggers recompilation of a _lot_ of other .c files. This 
makes incremental compilation really look useless. Unless you move 
modules into libraries and use them through .di files.


You can use interfaces for this, though that is not always possible.


Re: Please integrate build framework into the compiler

2009-03-21 Thread Christopher Wright

dsimcha wrote:

1.  std.traits could offer a way to get a tuple of all derived classes,
essentially the opposite of BaseTypeType.
2.  Since DMD would know about all derived classes when compiling the base 
class,
it would be feasible to allow templates to add virtual functions to classes.
IMHO, this would be an absolute godsend, as it is currently a _huge_ limitation 
of
templates.
3.  For the same reason, method calls to classes with no derived classes could 
be made directly instead of through the vtable.


This is only if there is no dynamic linking.


Re: Response files

2009-03-21 Thread Georg Wrede

Andrei Alexandrescu wrote:

Walter Bright wrote:

Frank Benoit wrote:

Because, imagine you set up a build process for your application. Why
should i have to care about that difference in my 'makefile',
'rakefile', ... whatever ?


I use different makefiles for Windows, Linux, and OSX. It's easier 
than tearing my few strands of hair out trying to figure out how to 
remove system differences.


...NOT.


Well... that's a bit D centric. :-)

'Round here we say "maassa maan tavalla", which is probably something 
like "When in Rome, do like the Romans do".


Makefiles aren't just a C(++) thing. Unix has a culture of its own, 
Windows (I wouldn't say have a culture, but still) does it another way. 
So do we import the Unix way to Windows or the other way around? I'd go 
with the Romans in Rome.


If there were a vote (outside of this NG!!) with D users, probably there 
are more folks who write in D /and/ in C or another language /on their 
own/ OS, than folks who write D apps (big enough to need makefiles) for 
both Windows and Linux.


Re: Please integrate build framework into the compiler

2009-03-21 Thread BCS

Hello grauzone,


I would even go so far as to say that dmd should automatically follow
all imports and compile them in one go. This would be faster than
having a separate responsefile step, because the source code needs to
be analyzed only once. To prevent compilation of imported library
headers, the compiler could provide a new include switch for library
code. Modules inside "library" include paths wouldn't be compiled.


Adding that without a way to turn it off would kill D in some cases. I have 
a project where DMD uses up >30% of the available address space compiling 
one module. If I was forced to compile all modules at once, it might not 
work, end of story.


That said, for many cases, I don't see a problem with having that feature 
available.





Re: Please integrate build framework into the compiler

2009-03-21 Thread grauzone

dsimcha wrote:

== Quote from grauzone (n...@example.net)'s article

I claim that we don't even need incremental compilation. It would be
better if the compiler would scan for dependencies, and if a source file
has changed, recompile the whole project in one go. This would be simple
and efficient.


I'm surprised that this could possibly be more efficient than incremental
compilation, but I've never worked on a project large enough for compile times 
to
be a major issue, so I've never really looked into this.


Maybe incremental compilation could be faster, but dmd has a bug that 
forces tools like dsss/rebuild to use a slower method. Instead of 
invoking the compiler once to recompile all modules that depend on 
changed files, it has to start a new compiler process for each file.



If incremental compilation were removed from the spec, meaning the compiler 
would
always know about the whole program when compiling, I assume (correct me if I'm
wrong) that would mean the following restrictions could be removed:

1.  std.traits could offer a way to get a tuple of all derived classes,
essentially the opposite of BaseTypeType.
2.  Since DMD would know about all derived classes when compiling the base 
class,
it would be feasible to allow templates to add virtual functions to classes.
IMHO, this would be an absolute godsend, as it is currently a _huge_ limitation 
of
templates.
3.  For the same reason, method calls to classes with no derived classes could 
be made directly instead of through the vtable.


And you could do all kinds of interprocedural optimizations.


Of course, these restrictions would still apply to libraries that use .di files.
If incremental compilation is actually causing more problems than it solves
anyhow, it would be great to get rid of it along with the annoying restrictions 
it
creates.


It seems Microsoft thought the same: C# goes without incremental 
compilation. But for now, D's build model is too similar to C/C++'s to 
completely remove that ability.


Re: Please integrate build framework into the compiler

2009-03-21 Thread grauzone

Andrei Alexandrescu wrote:

grauzone wrote:
I don't really understand what you mean. But if you want the compiler 
to scan for dependencies, I fully agree.


I claim that we don't even need incremental compilation. It would be 
better if the compiler would scan for dependencies, and if a source 
file has changed, recompile the whole project in one go. This would be 
simple and efficient.


That's precisely what rdmd does.


This looks really good, but I couldn't get it to work. Am I doing 
something wrong?


--- o.d:
module o;

import tango.io.Stdout;

void k() {
Stdout("foo").newline;
}

--- u.d:
module u;

import o;

void main() {
k();
}



$ rdmd u.d
/tmp/u-1000-20-49158160-A46C236CDE107E3B9F053881E4257C2D.o:(.data+0x38): 
undefined reference to `_D1o12__ModuleInfoZ'
/tmp/u-1000-20-49158160-A46C236CDE107E3B9F053881E4257C2D.o: In function 
`_Dmain':

u.d:(.text._Dmain+0x4): undefined reference to `_D1o1kFZv'
collect2: ld returned 1 exit status
--- errorlevel 1
rdmd: Couldn't compile or execute u.d.

$ dmd|grep Compiler
Digital Mars D Compiler v1.041


Andrei


Re: eliminate writeln et comp?

2009-03-21 Thread Fawzi Mohamed

On 2009-03-21 20:19:15 +0100, Fawzi Mohamed  said:


On 2009-03-21 14:23:51 +0100, Daniel Keep  said:



Christopher Wright wrote:

Daniel Keep wrote:


Christopher Wright wrote:

Daniel Keep wrote:

When was the last time you had to put this in your GCC-compiled
programs?

"Portions of this program Copyright (C) Free Software Foundation.  Uses
glibc."

Executable code resulting from compilation is not a work derived from
GCC.

glibc is extremely difficult to link statically and is distributed under
the LGPL, so no copyright notice is necessary.

If dmd had good support for dynamic linking, this wouldn't be nearly as
much of an issue. Sadly, ddl seems to be on hiatus, and at any rate, it
can't be applied to the runtime.


I think you're missing my point.  I'm saying that a standard library
shouldn't require you to insert legal disclaimers or attribution notices
into your program or its documentation.

A standard library should be as invisible as possible in this regard.

-- Daniel


Right. It's invisible with glibc because you link to it dynamically, and
because everyone installs it by default. Druntime has neither of these
advantages.


I'm not talking about distribution of the actual library machine code,
I'm talking about the LEGAL ISSUES.  Tango's license apparently requires
you to explicitly include attribution for Tango in your program.  This
means it's possible to naively compile "Hello, World" with Tango,
distribute it and break the law.

That glibc uses dynamic linking is immaterial: that there is no way to
avoid the legal issues with Tango no matter what you do is the point I'm
trying to make.

  -- Daniel


This is bullshit: if you look at the header of C's stdio.h you are extremely 
likely to find exactly the same disclaimer (at least I did find it).


If your program has an "about" box and a copyright notice, or it has 
documentation, then yes, you should credit the inclusion of Tango if you 
use the BSD license.


If you want to avoid this then you should use the AFL license (which 
yes is incompatible with GPLv2).


This is looking more and more like FUD.

Fawzi


Sorry if I reacted a little too vehemently

Fawzi



Re: Please integrate build framework into the compiler

2009-03-21 Thread dsimcha
== Quote from grauzone (n...@example.net)'s article
> I claim that we don't even need incremental compilation. It would be
> better if the compiler would scan for dependencies, and if a source file
> has changed, recompile the whole project in one go. This would be simple
> and efficient.

I'm surprised that this could possibly be more efficient than incremental
compilation, but I've never worked on a project large enough for compile times 
to
be a major issue, so I've never really looked into this.

If incremental compilation were removed from the spec, meaning the compiler 
would
always know about the whole program when compiling, I assume (correct me if I'm
wrong) that would mean the following restrictions could be removed:

1.  std.traits could offer a way to get a tuple of all derived classes,
essentially the opposite of BaseTypeType.
2.  Since DMD would know about all derived classes when compiling the base 
class,
it would be feasible to allow templates to add virtual functions to classes.
IMHO, this would be an absolute godsend, as it is currently a _huge_ limitation 
of
templates.
3.  For the same reason, method calls to classes with no derived classes could 
be made directly instead of through the vtable.

Of course, these restrictions would still apply to libraries that use .di files.
If incremental compilation is actually causing more problems than it solves
anyhow, it would be great to get rid of it along with the annoying restrictions 
it
creates.


Re: Please integrate build framework into the compiler

2009-03-21 Thread Andrei Alexandrescu

grauzone wrote:
I don't really understand what you mean. But if you want the compiler to 
scan for dependencies, I fully agree.


I claim that we don't even need incremental compilation. It would be 
better if the compiler would scan for dependencies, and if a source file 
has changed, recompile the whole project in one go. This would be simple 
and efficient.


That's precisely what rdmd does.

Andrei


Re: Please integrate build framework into the compiler

2009-03-21 Thread grauzone
I don't really understand what you mean. But if you want the compiler to 
scan for dependencies, I fully agree.


I claim that we don't even need incremental compilation. It would be 
better if the compiler would scan for dependencies, and if a source file 
has changed, recompile the whole project in one go. This would be simple 
and efficient.


Here are some arguments that speak for this approach:

- A full compiler is the only piece of software that can build a 
correct/complete module dependency graph. This is because you need full 
semantic analysis to catch all import statements. For example, you can 
use a string mixin to generate import statements: mixin("import bla;") 
(see the sketch after this list). No naive dependency scanner would be 
able to detect such an import. You need CTFE capabilities, which require 
almost a full compiler. (Actually, dsss uses the dmd frontend for 
dependency scanning.)


- Speed. Incremental compilation is godawfully slow (10 times slower 
than compiling all files in one dmd invocation). You could pass all 
changed files to dmd at once, but this is broken and often causes linker 
errors (ask the dsss author for details lol). Recompiling the whole 
thing every time is faster.


- Long dependency chains. Unlike in C/C++, you can't separate a module 
into interface and implementation. Compared to C++, it's as if a change 
to one .c file triggers recompilation of a _lot_ of other .c files. This 
makes incremental compilation really look useless. Unless you move 
modules into libraries and use them through .di files.
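
Here is the sketch promised above for the string-mixin point: a small, 
self-contained D2 example (module and names invented for illustration) of 
an import that only semantic analysis can resolve.

module scanme;

// The module name is assembled at compile time, so a regex or text-level
// scanner cannot recover the dependency; only semantic analysis (CTFE plus
// mixin expansion) sees that this module really imports std.stdio.
enum parts = ["std", "stdio"];
mixin("import " ~ parts[0] ~ "." ~ parts[1] ~ ";");

void main()
{
    writeln("hello");   // resolves only because of the mixed-in import
}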


I would even go so far as to say that dmd should automatically follow all 
imports and compile them in one go. This would be faster than having a 
separate responsefile step, because the source code needs to be analyzed 
only once. To prevent compilation of imported library headers, the 
compiler could provide a new include switch for library code. Modules 
inside "library" include paths wouldn't be compiled.


Hell, maybe I'll even manage to come up with a compiler patch, to turn 
this into reality.


Re: eliminate writeln et comp?

2009-03-21 Thread Fawzi Mohamed

On 2009-03-21 14:23:51 +0100, Daniel Keep  said:




Christopher Wright wrote:

Daniel Keep wrote:


Christopher Wright wrote:

Daniel Keep wrote:

When was the last time you had to put this in your GCC-compiled
programs?

"Portions of this program Copyright (C) Free Software Foundation.  Uses
glibc."

Executable code resulting from compilation is not a work derived from
GCC.

glibc is extremely difficult to link statically and is distributed under
the LGPL, so no copyright notice is necessary.

If dmd had good support for dynamic linking, this wouldn't be nearly as
much of an issue. Sadly, ddl seems to be on hiatus, and at any rate, it
can't be applied to the runtime.


I think you're missing my point.  I'm saying that a standard library
shouldn't require you to insert legal disclaimers or attribution notices
into your program or its documentation.

A standard library should be as invisible as possible in this regard.

-- Daniel


Right. It's invisible with glibc because you link to it dynamically, and
because everyone installs it by default. Druntime has neither of these
advantages.


I'm not talking about distribution of the actual library machine code,
I'm talking about the LEGAL ISSUES.  Tango's license apparently requires
you to explicitly include attribution for Tango in your program.  This
means it's possible to naively compile "Hello, World" with Tango,
distribute it and break the law.

That glibc uses dynamic linking is immaterial: that there is no way to
avoid the legal issues with Tango no matter what you do is the point I'm
trying to make.

  -- Daniel


This is bullshit: if you look at the header of C's stdio.h you are extremely 
likely to find exactly the same disclaimer (at least I did find it).


If your program has an "about" box and a copyright notice, or it has 
documentation, then yes, you should credit the inclusion of Tango if you 
use the BSD license.


If you want to avoid this then you should use the AFL license (which 
yes is incompatible with GPLv2).


This is looking more and more like FUD.

Fawzi



Re: Response files

2009-03-21 Thread Andrei Alexandrescu

Walter Bright wrote:

Frank Benoit wrote:

Because, imagine you set up a build process for your application. Why
should i have to care about that difference in my 'makefile',
'rakefile', ... whatever ?


I use different makefiles for Windows, Linux, and OSX. It's easier than 
tearing my few strands of hair out trying to figure out how to remove 
system differences.


...NOT.

Andrei


Re: Response files

2009-03-21 Thread Nick Sabalausky
"Jason House"  wrote in message 
news:gq2dv9$2vn...@digitalmars.com...
> Walter Bright Wrote:
>
>> Frank Benoit wrote:
>> > DMD 1.041 on windows does support response files, that is a file
>> > containing arguments.
>> > On Linux dmd does not understand that.
>>
>> The Windows response files date back to the problem DOS/Windows had where
>> only a very short command line length was allowed. So the arguments were
>> put into a file instead.
>>
>> It's probably a good idea to do it for Linux, too.
>
> Ick. Why? Command files are hacks for Windows' shortcomings. Why spread 
> such hacks across all platforms? The linux command line is already well 
> adapted to handle this kind of thing.

Sometimes command lines get too long to keep typing (obviously). You *could* 
solve that with a shell/batch script, but then that would be specific to a 
particular shell/OS. As long as the app you're using is reasonably 
cross-platform, then a response file is completely shell/OS-agnostic. 
Response files may have originated as a workaround, but that doesn't mean 
they didn't turn out to have additional benefits. 
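
For instance (file and argument names made up purely for illustration), a 
response file build.rsp containing

-O -release -ofmyapp
src/main.d
src/util.d

can be passed to the compiler as

dmd @build.rsp

and works the same from any shell that can start dmd.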




Re: Response files

2009-03-21 Thread Frank Benoit
Walter Bright schrieb:
> Frank Benoit wrote:
>> Because, imagine you set up a build process for your application. Why
>> should i have to care about that difference in my 'makefile',
>> 'rakefile', ... whatever ?
> 
> I use different makefiles for Windows, Linux, and OSX. It's easier than
> tearing my few strands of hair out trying to figure out how to remove
> system differences.

Right, and that is precisely because of such stuff. With dmd acting the
same on every platform, it would be one step easier.

On the other hand, separate build scripts are against DRY.


Re: for in D versus C and C++

2009-03-21 Thread Sean Kelly

Walter Bright wrote:

Georg Wrede wrote:
Sometimes I need to have a command line UI in a program. Such programs 
usually have 5 to 10 commands, with their parameters. One command per 
line.


So far I have tested and split the command line with regular 
expressions, because using a parser generator has felt like shooting 
mosquitos with a shotgun.


What would your strategy be?


If it's 5 or 10, you can get by with ad-hoc. But if you find yourself 
repeatedly fixing bugs in the parsing, it's time to consider a more 
principled approach.


I've found that once I created one lexer it could be re-used pretty 
easily for other languages too.  And recursive descent parsers are 
trivial to write.  It may be overkill for command-line parameters, but 
for anything remotely structured it's generally worth using.
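
As a toy illustration of the lexing half of that approach (D2, standard 
library only; the command line and token rules are invented): split a line 
into words, honoring double quotes, before handing the tokens to whatever 
command dispatch you like.

import std.ascii : isWhite;
import std.stdio : writeln;

// Break a command line into tokens: runs of non-space characters, or
// anything between a pair of double quotes.
string[] lex(string line)
{
    string[] toks;
    size_t i = 0;
    while (i < line.length)
    {
        while (i < line.length && isWhite(line[i])) ++i;   // skip blanks
        if (i >= line.length) break;
        if (line[i] == '"')                                // quoted token
        {
            size_t start = ++i;
            while (i < line.length && line[i] != '"') ++i;
            toks ~= line[start .. i];
            if (i < line.length) ++i;                      // eat closing quote
        }
        else                                               // bare word
        {
            size_t start = i;
            while (i < line.length && !isWhite(line[i])) ++i;
            toks ~= line[start .. i];
        }
    }
    return toks;
}

void main()
{
    writeln(lex(`open "some file.txt" readonly`));
    // prints: ["open", "some file.txt", "readonly"]
}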


Re: for in D versus C and C++

2009-03-21 Thread Walter Bright

Georg Wrede wrote:
Sometimes I need to have a command line UI in a program. Such programs 
usually have 5 to 10 commands, with their parameters. One command per line.


So far I have tested and split the command line with regular 
expressions, because using a parser generator has felt like shooting 
mosquitos with a shotgun.


What would your strategy be?


If it's 5 or 10, you can get by with ad-hoc. But if you find yourself 
repeatedly fixing bugs in the parsing, it's time to consider a more 
principled approach.


Re: Response files

2009-03-21 Thread Walter Bright

Frank Benoit wrote:

Because, imagine you set up a build process for your application. Why
should i have to care about that difference in my 'makefile',
'rakefile', ... whatever ?


I use different makefiles for Windows, Linux, and OSX. It's easier than 
tearing my few strands of hair out trying to figure out how to remove 
system differences.


Re: Response files

2009-03-21 Thread Walter Bright

Jason House wrote:

Ick. Why? Command files are hacks for Windows' shortcomings. Why
spread such hacks across all platforms? The linux command line is
already well adapted to handle this kind of thing.


gcc already supports it. There's apparently a demand for it.


Re: for in D versus C and C++

2009-03-21 Thread Georg Wrede

Sean Kelly wrote:

== Quote from Walter Bright (newshou...@digitalmars.com)'s article

Sean Kelly wrote:

Sounds like HTTP/HTML.  The best I've come up with so far for parsing
that stuff is to have the lexer actually return tokens representing whitespace
in some instances.  It's totally ridiculous.

When you see ad-hoc designs like that, it's obvious the designer has no
experience with compilers. It's why every programmer should take a basic
course in compiler design .


I very much agree.  In fact, I'd go so far as to say that my compiler design
course was the single most valuable CS course I took while in college.  It's
amazing how many problems I encounter have something to do with parsing
or language translation.  It's also amazing how many crappy parsers there
are out there for these same tasks.  Clearly, compiler design doesn't get
as much attention as it should in undergrad CS.


Sometimes I need to have a command line UI in a program. Such programs 
usually have 5 to 10 commands, with their parameters. One command per line.


So far I have tested and split the command line with regular 
expressions, because using a parser generator has felt like shooting 
mosquitos with a shotgun.


What would your strategy be?


Re: Returning a struct by reference

2009-03-21 Thread BCS

Hello Simon,


If I add a getter-property that returns the field by value, the
following instruction "object.position.x = 12;" won't modify the
position of the object, but will only modify the returned copy of the
position, right?

That's actually why I'd like to have a getter that returns the field
by reference and not by value.




That is correct. Reference returns are on the to-do list. For now this hack 
should work.



struct S { float x; float y; }


class C
{
   S s;

   class C_S { void x(float v){ s.x=v; } void y(float v){ s.y=v; } }

   C_S pos() { return new C_S(); }
}
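
Hypothetical usage, given some instance c of C (note that every call to 
pos() allocates a new proxy object):

c.pos.x = 12;   // forwards to c.s.x through the proxy's setter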




Please integrate build framework into the compiler

2009-03-21 Thread davidl


1. The compiler knows in what situations a file needs to be recompiled

Consider a file whose generated header is unchanged: its object file is 
still required for linking, but other files that import it shouldn't 
require any recompilation in this case. If a file's header file changes, 
and thus the interface changes, all files that import it should be 
recompiled.

The compiler could emit build commands like rebuild does.

I would enjoy:

dmd -buildingcommand abc.d  > responsefile

dmd @responsefile

I think we need to eliminate useless recompilation as much as we can, in 
consideration of growing D project sizes.


2. Maintaining the build without compiler support has costs.



Re: eliminate writeln et comp?

2009-03-21 Thread Lars Ivar Igesund
Daniel Keep wrote:

> I'm not talking about distribution of the actual library machine code,
> I'm talking about the LEGAL ISSUES.  Tango's license apparently requires
> you to explicitly include attribution for Tango in your program.  This
> means it's possible to naively compile "Hello, World" with Tango,
> distribute it and break the law.

Sorry to use you as the source to enter the thread, Daniel.

Tango DOES NOT IN ANY WAY require you to put attribution into your program. 
That is a choice you as a user would make entirely on your own by choosing 
to use Tango licensed under the BSD (which is quite possible because this 
license is better suited for use alongside the GPL).

However, the AFL does not put such a restriction on your binaries, and 
(unless you use the GPL for your code) the AFL is the license most users 
should use. This is also noted on the license page (it was probably not 
clear enough, I hope it is now).

http://dsource.org/projects/tango/wiki/LibraryLicense

For current or prospective contributors; you are completely and entirely 
entitled to relicense your own code to whichever license you wish, however 
these should also include the AFL and BSD when used in Tango.

To change the license to something else at this point (for instance to 
Apache 2.0 only), would be a major undertaking, but something that we may 
consider to do at a later point.

-- 
Lars Ivar Igesund
blog at http://larsivi.net
DSource, #d.tango & #D: larsivi
Dancing the Tango



Re: Returning a struct by reference

2009-03-21 Thread grauzone

Simon TRENY wrote:

Daniel Keep Wrote:



Simon TRENY wrote:

Ok, but then, what if I'd like to make the variable "read-only"? i.e. 
preventing the user from writing things like this:
myObject.position = pos2;


So... you're rejecting a solution on the basis that it prevents you from
doing the exact opposite of what you want to do?

*boggle*

  -- Daniel


Here is a complete example of what I'd like to achieve:
struct Position {
   private float m_x;
   private float m_y;

   public float x() {
  return m_x;
   }

   public void x(float x) {
  m_x = x;
  EmitSignal("changed");
   }

   public float y() {
  return m_y;
   }

   public void y(float y) {
  m_y = y;
  EmitSignal("changed");
   }
}

class Object {
   private Position m_position;

   public this() {
  m_position.CallOnSignal("changed", onPositionChanged);
   }

   //This syntax is not working
   public ref Position position() {
  return m_position;
   }

   public void onPositionChanged() {
  writeln("Position Changed!!);
   }
}

With this "fictional" code, I could write things like:
object.position.x = 14; and the object will be "aware" that its position has 
changed.

Making the "position"-variable public will lead the user to be able to do 
things like this:
object.position = pos2; and then, the object won't be "aware" that its position 
has changed. And this is a problem for me.


But if position is returned as ref, this still could happen. The 
returned value is still assignable, and because it's a ref, overwriting 
it is like overwriting m_position directly. As far as I see, your 
position()-getter just emulates a public field. Including overwriting by 
assignment.
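
A minimal D2 sketch of that point (the class is named Obj here to avoid 
clashing with the built-in Object):

struct Position { float x, y; }

class Obj
{
    private Position m_position;
    ref Position position() { return m_position; }   // D2 ref return
}

void main()
{
    auto o = new Obj;
    o.position.x = 14;               // writes through to m_position, as desired
    o.position() = Position(1, 2);   // but whole-struct overwriting also compiles
}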



Now D2.0 has const. If position() returned a const object, this kind 
of unwanted "overwriting" couldn't happen. This improves correctness, 
because uncaught changes or changes to temporaries can't happen. But 
then again, the setters in the Position struct wouldn't work, because 
the struct is const (as far as I understand the const system). This 
means returning the field as a "const ref" wouldn't help either.



Also, how is this EmitSignal() working? What happens if you write:

Position p = object.position;
p.x = 14; //EmitSignal() calls what?

p is simply a bit-copy of the struct returned by the getter, and 
EmitSignal() has no way to check if it's still supposed to notify class 
Object. (All solutions I can come up with sound really hairy and hackish.)


Maybe you're better off with Position as a class instead of a struct. 
Even when you use D2.0's const/opAssign/copy-ctor/post-blit/ref-return 
features.


Looking forward to the replies pointing out that D2.0 actually allows you 
to implement exactly what you want, and how this is done.



I hope it's clearer now


Re: Returning a struct by reference

2009-03-21 Thread Sergey Gromov
Sat, 21 Mar 2009 09:55:13 -0400, Simon TRENY wrote:

>//This syntax is not working
>public ref Position position() {
>   return m_position;
>}

D2 supports this.
D1 won't, ever.  I think.


Re: Response files

2009-03-21 Thread TomD
Frank Benoit Wrote:
[...]
> This seems to be enough, however, 32k/64k are not.
> There is a related bug, because the dmd response file workaround is not
> working with >64k, see http://d.puremagic.com/issues/show_bug.cgi?id=2705
> 
Just out of curiosity: With which kind of project
do you hit this limit?

Ciao
Tom


Re: [OT]new D2.0 + C++ language

2009-03-21 Thread bearophile
Piotrek:

> Haha, I found good discussion on reddit<

On the other hand, for me D too becomes a puzzle language when I use many 
string mixins or templates in a functional style. D macros may improve 
that situation somewhat.


> J language example:<

The K language seems worse to me; this is a full raytracer that saves to PGM 
(in C++ it's about 120 lines of code written in a normal style):
http://www.nsl.com/k/ray/ray.k

U:{x%_sqrt x _dot x}
S:{[r;s]:[0>d:_sqr[s 1]+_sqr[b:v _dot r 1]-v _dot v:s[0]-*r;0i;0>t:b+e:_sqrt 
d;0i;0l:S[r]o;h;(l;U 
r[0]-o[0]-l*r 1)]}
T:{[r;o;d;z;l]:[0i=*h:I[r;z]o;0.;~0>g:h[1]_dot l;0.;0i=*I[(r[0]+(r[1]**h)+d*h 
1;-l);z]o;-g;0.]}
N:{[n;o;i]0{x+T[(0 0 -4.;U(i+(y%4)-n%2),n);o;_sqrt 2^-42;0i,,3#0.]U -1 -3 
2.}/+4_vs!16}
R:{[k;n]"P5\n",(5:n,n),"\n255\n",_ci _.5+15.9375*N[n*1.;C[k;0 -1 0.]1.]'+|@[n 
_vs!n*n;0;|:]}
C:{[k;c;r]:[k=1;(c;r);((c;r*3);(,(c;r)),C[k-1;;r%2]'+c+-3 3[2_vs 2 3 6 
7]*r%_sqrt 12)]}
\t q:R[3]32
"temp.pgm"6:q
\"C:\\Program Files\\IrfanView\\i_view32.exe" temp.pgm

APL-like languages are a dead-end...

Bye,
bearophile


Re: Returning a struct by reference

2009-03-21 Thread Simon TRENY
grauzone Wrote:

> Simon TRENY wrote:
> > grauzone Wrote:
> > 
> >> Simon TRENY wrote:
> >>> Hi there!
> >>>
> >>> I'm quite new at D and I'm still just playing with it, but there is a 
> >>> thing that I find currently missing. Sometimes, I'd like to be able to 
> >>> return a struct by reference and not by value. For example, in the 
> >>> following example:
> >>>
> >>> struct Position {
> >>>float x;
> >>>float y;
> >>> }
> >>>
> >>> class Object {
> >>>private Position m_position;
> >>>
> >>>public Position position() {
> >>>   return m_position;
> >>>}
> >>> }
> >>>
> >>> I'd like to be able to write things like this: myObject.position.x = 43 
> >>> to actually change the position of the object. But right now, since 
> >>> "position" is a struct, it is returned by value and not by reference, and 
> >>> then the previous instruction won't change the position of the object, 
> >>> but it will work on a copy of the position field.
> >>>
> >>>
> >>> Here is the solutions that I can see to this problem:
> >>>
> >>> - Returning a pointer to the position: "public Position *position() { ... 
> >>> }", but I'd like to keep my code as free from pointers as possible.
> >>>  - Make "Position" a class and not a struct. That could be a solution, 
> >>> but then, when I'll do things like "Position pos = object.position; pos.x 
> >>> = 43;", it will effectively change the position of the object, which I 
> >>> wouldn't like with this syntax.
> >>>
> >>> Actually, I'd like to be able to do a thing like this:
> >>>public ref Position position() {
> >>>   return m_position;
> >>>}
> >>> which would be the equivalent form to passing structs by reference in a 
> >>> parameter.
> >>>
> >>> Is there a way to do this in D?
> >> Yes. Make the variable public.
> >>
> >> class Object {
> >>Position position;
> >> }
> >>
> >> This code is even simpler than yours above. Incredible, isn't it?
> > 
> > Ok, but then, what if I'd like to make the variable "read-only"? i.e. 
> > preventing the user from writing things like this:
> > myObject.position = pos2;
> 
> Then you write a getter that simply returns the field by value.
> 
> The D compiler will (hopefully) inline the getter function, so there 
> shouldn't be a disadvantage in performance.

If I add a getter-property that returns the field by value, the following 
instruction "object.position.x = 12;" won't modify the position of the object, 
but will only modify the returned copy of the position, right?
That's actually why I'd like to have a getter that returns the field by 
reference and not by value.

> 
> Note: I think D2.0 wants to introduce ref-returns at some point in the 
> future.
> 
> >>> Regards,
> >>> Simon
> >>>
> > 



Re: Returning a struct by reference

2009-03-21 Thread Simon TRENY
Daniel Keep Wrote:

> 
> 
> Simon TRENY wrote:
> > Ok, but then, what if I'd like to make the variable "read-only"? i.e. 
> > preventing the user from writing things like this:
> > myObject.position = pos2;
> > 
> 
> So... you're rejecting a solution on the basis that it prevents you from
> doing the exact opposite of what you want to do?
> 
> *boggle*
> 
>   -- Daniel

Here is a complete example of what I'd like to achieve:
struct Position {
   private float m_x;
   private float m_y;

   public float x() {
  return m_x;
   }

   public void x(float x) {
  m_x = x;
  EmitSignal("changed");
   }

   public float y() {
  return m_y;
   }

   public void y(float y) {
  m_y = y;
  EmitSignal("changed");
   }
}

class Object {
   private Position m_position;

   public this() {
  m_position.CallOnSignal("changed", onPositionChanged);
   }

   //This syntax is not working
   public ref Position position() {
  return m_position;
   }

   public void onPositionChanged() {
  writeln("Position Changed!!);
   }
}

With this "fictional" code, I could write things like:
object.position.x = 14; and the object will be "aware" that its position has 
changed.

Making the "position" variable public would let the user do 
things like this:
object.position = pos2; and then the object won't be "aware" that its position 
has changed. That is a problem for me.

I hope it's clearer now


Re: Returning a struct by reference

2009-03-21 Thread grauzone

Simon TRENY wrote:

grauzone Wrote:


Simon TRENY wrote:

Hi there!

I'm quite new at D and I'm still just playing with it, but there is a thing 
that I find currently missing. Sometimes, I'd like to be able to return a 
struct by reference and not by value. For example, in the following example:

struct Position {
   float x;
   float y;
}

class Object {
   private Position m_position;

   public Position position() {
  return m_position;
   }
}

I'd like to be able to write things like this: myObject.position.x = 43 to actually 
change the position of the object. But right now, since "position" is a struct, 
it is returned by value and not by reference, and then the previous instruction won't 
change the position of the object, but it will work on a copy of the position field.


Here are the solutions that I can see to this problem:

- Returning a pointer to the position: "public Position *position() { ... }", 
but I'd like to keep my code as free from pointers as possible.
 - Make "Position" a class and not a struct. That could be a solution, but then, when 
I'll do things like "Position pos = object.position; pos.x = 43;", it will effectively 
change the position of the object, which I wouldn't like with this syntax.

Actually, I'd like to be able to do a thing like this:
   public ref Position position() {
  return m_position;
   }
which would be the equivalent form to passing structs by reference in a 
parameter.

Is there a way to do this in D?

Yes. Make the variable public.

class Object {
Position position;
}

This code is even simpler than yours above. Incredible, isn't it?


Ok, but then, what if I'd like to make the variable "read-only"? i.e. 
preventing the user from writing things like this:
myObject.position = pos2;


Then you write a getter that simply returns the field by value.

The D compiler will (hopefully) inline the getter function, so there 
shouldn't be a disadvantage in performance.


Note: I think D2.0 wants to introduce ref-returns at some point in the 
future.



Regards,
Simon





Re: Returning a struct by reference

2009-03-21 Thread Simon TRENY
grauzone Wrote:

> Simon TRENY wrote:
> > Hi there!
> > 
> > I'm quite new at D and I'm still just playing with it, but there is a thing 
> > that I find currently missing. Sometimes, I'd like to be able to return a 
> > struct by reference and not by value. For example, in the following example:
> > 
> > struct Position {
> >float x;
> >float y;
> > }
> > 
> > class Object {
> >private Position m_position;
> > 
> >public Position position() {
> >   return m_position;
> >}
> > }
> > 
> > I'd like to be able to write things like this: myObject.position.x = 43 to 
> > actually change the position of the object. But right now, since "position" 
> > is a struct, it is returned by value and not by reference, and then the 
> > previous instruction won't change the position of the object, but it will 
> > work on a copy of the position field.
> > 
> > 
> > Here are the solutions that I can see to this problem:
> > 
> > - Returning a pointer to the position: "public Position *position() { ... 
> > }", but I'd like to keep my code as free from pointers as possible.
> >  - Make "Position" a class and not a struct. That could be a solution, but 
> > then, when I'll do things like "Position pos = object.position; pos.x = 
> > 43;", it will effectively change the position of the object, which I 
> > wouldn't like with this syntax.
> > 
> > Actually, I'd like to be able to do a thing like this:
> >public ref Position position() {
> >   return m_position;
> >}
> > which would be the equivalent form to passing structs by reference in a 
> > parameter.
> > 
> > Is there a way to do this in D?
> 
> Yes. Make the variable public.
> 
> class Object {
>   Position position;
> }
> 
> > This code is even simpler than yours above. Incredible, isn't it?

Ok, but then, what if I'd like to make the variable "read-only"? i.e. 
preventing the user from writing things like this:
myObject.position = pos2;

> 
> > Regards,
> > Simon
> > 



Re: Returning a struct by reference

2009-03-21 Thread Daniel Keep


Simon TRENY wrote:
> Ok, but then, what if I'd like to make the variable "read-only"? i.e. 
> preventing the user from writing things like this:
> myObject.position = pos2;
> 

So... you're rejecting a solution on the basis that it prevents you from
doing the exact opposite of what you want to do?

*boggle*

  -- Daniel


Re: Returning a struct by reference

2009-03-21 Thread grauzone

Simon TRENY wrote:

Hi there!

I'm quite new at D and I'm still just playing with it, but there is a thing 
that I find currently missing. Sometimes, I'd like to be able to return a 
struct by reference and not by value. For example, in the following example:

struct Position {
   float x;
   float y;
}

class Object {
   private Position m_position;

   public Position position() {
  return m_position;
   }
}

I'd like to be able to write things like this: myObject.position.x = 43 to actually 
change the position of the object. But right now, since "position" is a struct, 
it is returned by value and not by reference, and then the previous instruction won't 
change the position of the object, but it will work on a copy of the position field.


Here are the solutions that I can see to this problem:

- Returning a pointer to the position: "public Position *position() { ... }", 
but I'd like to keep my code as free from pointers as possible.
 - Make "Position" a class and not a struct. That could be a solution, but then, when 
I'll do things like "Position pos = object.position; pos.x = 43;", it will effectively 
change the position of the object, which I wouldn't like with this syntax.

Actually, I'd like to be able to do a thing like this:
   public ref Position position() {
  return m_position;
   }
which would be the equivalent form to passing structs by reference in a 
parameter.

Is there a way to do this in D?


Yes. Make the variable public.

class Object {
Position position;
}

This code is even simpler than yours above. Incredible, isn't it?


Regards,
Simon



Re: eliminate writeln et comp?

2009-03-21 Thread Daniel Keep


Christopher Wright wrote:
> Daniel Keep wrote:
>>
>> Christopher Wright wrote:
>>> Daniel Keep wrote:
>>>> When was the last time you had to put this in your GCC-compiled
>>>> programs?
>>>>
>>>> "Portions of this program Copyright (C) Free Software Foundation.  Uses
>>>> glibc."
>>> Executable code resulting from compilation is not a work derived from
>>> GCC.
>>>
>>> glibc is extremely difficult to link statically and is distributed under
>>> the LGPL, so no copyright notice is necessary.
>>>
>>> If dmd had good support for dynamic linking, this wouldn't be nearly as
>>> much of an issue. Sadly, ddl seems to be on hiatus, and at any rate, it
>>> can't be applied to the runtime.
>>
>> I think you're missing my point.  I'm saying that a standard library
>> shouldn't require you to insert legal disclaimers or attribution notices
>> into your program or its documentation.
>>
>> A standard library should be as invisible as possible in this regard.
>>
>>   -- Daniel
> 
> Right. It's invisible with glibc because you link to it dynamically, and
> because everyone installs it by default. Druntime has neither of these
> advantages.

I'm not talking about distribution of the actual library machine code,
I'm talking about the LEGAL ISSUES.  Tango's license apparently requires
you to explicitly include attribution for Tango in your program.  This
means it's possible to naively compile "Hello, World" with Tango,
distribute it and break the law.

That glibc uses dynamic linking is immaterial: that there is no way to
avoid the legal issues with Tango no matter what you do is the point I'm
trying to make.

  -- Daniel


Re: [OT]new D2.0 + C++ language

2009-03-21 Thread Piotrek

bearophile wrote:

Piotrek:

(of course there are bugs, but when I write something in D
I'm so glad I don't have to do it in something else).


D is surely not a puzzle language :-)
http://prog21.dadgum.com/38.html



Haha, I found a good discussion on reddit

http://www.reddit.com/r/programming/comments/7vnm0/puzzle_languages/

Just a couple of nice quotes:

J language example:
meanf=: 1 : '(0>.255<.])@:<.@(m (+/ % #)@,;.3 ])"2'
boxf=: 1 : '(0>.255<.])@:<.@((2 2$m) (+/ % #)@,;.3 ])"2'
pc=: &.:([: 2 0 1&|: 256 256 256&#:)
inter=: ( [: , 2&(({. , (+/%#))\) )"1
inter2=: ([: (<@inter pc) |:)^:2
fit=: [: >.`<./ ,
fit1=: [: >.`<./ (,~ (,~ -))~
histogram=: <: @ (#/.~) @ (i...@#@[ , I.)

Does it look like the Matrix?

And the one that made me laugh the most:
"CSS is the puzzliest puzzle that I ever puzzled"

Everyone should try CSS :D

Cheers


Returning a struct by reference

2009-03-21 Thread Simon TRENY
Hi there!

I'm quite new to D and I'm still just playing with it, but there is one thing 
that I currently find missing. Sometimes, I'd like to be able to return a 
struct by reference and not by value. For example:

struct Position {
   float x;
   float y;
}

class Object {
   private Position m_position;

   public Position position() {
  return m_position;
   }
}

I'd like to be able to write things like this: myObject.position.x = 43 to 
actually change the position of the object. But right now, since "position" is 
a struct, it is returned by value and not by reference, and then the previous 
instruction won't change the position of the object, but it will work on a copy 
of the position field.


Here are the solutions that I can see to this problem:

- Returning a pointer to the position: "public Position *position() { ... }", 
but I'd like to keep my code as free from pointers as possible.
 - Make "Position" a class and not a struct. That could be a solution, but 
then, when I'll do things like "Position pos = object.position; pos.x = 43;", 
it will effectively change the position of the object, which I wouldn't like 
with this syntax.

Actually, I'd like to be able to do a thing like this:
   public ref Position position() {
  return m_position;
   }
which would be the equivalent of passing a struct by reference as a parameter.

Is there a way to do this in D?

Regards,
Simon



Re: new D2.0 + C++ language

2009-03-21 Thread Christopher Wright

Weed wrote:

Piotrek wrote:

Weed wrote:

Weed wrote:

Hi!

I want to offer the dialect of the language D2.0, suitable for use where
are now used C/C++. Main goal of this is making language like D, but
corresponding "zero-overhead principle" like C++:

at least to something like this idea? )

The idea could be ok but have you written a compiler or specification?


My experience in creating compilers is limited to a half-read copy of
"Compilers: Principles, Techniques, and Tools".

It was easier to write up the differences from D than a full specification
- I hoped to receive pointers to some fundamental problems, but everything
seems to be fine (excluding the holy war about GC, of course).


I inferred from your original post that you had written such a compiler. 
I think probably the first thing to do, if you are serious about this, 
is to choose one essential feature of your dialect and implement that. 
Then there will be something concrete to discuss.


That isn't the normal modus operandi around here, but there doesn't seem 
to be much support for your suggestions. I think this might be less to 
do with your ideas and more that I can't really envision what you're 
talking about and how code would look with your dialect.


Re: new D2.0 + C++ language

2009-03-21 Thread Christopher Wright

Rainer Deyke wrote:

Christopher Wright wrote:

I was pulling numbers out of my ass.


That's what I assumed.  I'm a game developer.  I use GC.


0.1 seconds out of every ten is a small amount to pay for the benefits
of garbage collection in most situations.


GC is useless for resource management.  RAII solves the resource
management problem, in C++ and D2.  GC is a performance optimization on
top of that.  If the GC isn't faster than simple reference counting,
then it serves no purpose, because you could use RAII with reference
counting for the same effect.

(No, I don't consider circular references a problem worth discussing.)


I believe Python is using reference counting with a garbage collector, 
with the collector intended to solve the circular reference problem, so 
apparently Guido van Rossum thinks it's a problem worth discussing.


And my opinion of reference counting is: if it requires no programmer 
intervention, it's just another garbage collector. Reference counting 
would probably be a win overall if a reference count going to zero only 
optionally triggered a collection -- you're eliminating the 'mark' 
from 'mark and sweep'. Though I would still demand a full 
mark-and-sweep, just not as often, since nontrivial data structures nearly 
always have circular references.
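To make that last point concrete, here is a tiny, made-up D example where plain reference counting alone can never reclaim the nodes:

class Node {
   Node next;
}

void main() {
   auto a = new Node;
   auto b = new Node;
   a.next = b;
   b.next = a;
   // a and b reference each other, so with pure reference counting both
   // counts stay at 1 after the locals go out of scope; a tracing
   // (mark-and-sweep) collector can still reclaim the cycle.
}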


Re: new D2.0 + C++ language

2009-03-21 Thread bearophile
Piotrek:
> (of course there are bugs, but when I write something in D
> I'm so glad I don't have to do it in something else).

D is surely not a puzzle language :-)
http://prog21.dadgum.com/38.html

Well, writing D templates in functional style with a lot of string mixins is a 
puzzle (even if less of a puzzle than some things you have to do in 
Forth). AST macros can become puzzles too, but I think that, if well designed, 
they can be more natural to use than the current templates & string mixins.
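As a small, made-up illustration of the compile-time string pasting that can feel like a puzzle, a string mixin that generates a field plus getter:

import std.stdio;

// returns D source code as a string; the class below mixes it in at compile time
string property(string type, string name) {
   return "private " ~ type ~ " m_" ~ name ~ ";\n"
        ~ "public " ~ type ~ " " ~ name ~ "() { return m_" ~ name ~ "; }";
}

class Point {
   mixin(property("float", "x"));
   mixin(property("float", "y"));
}

void main() {
   auto p = new Point;
   writeln(p.x, " ", p.y);   // prints the default float values (nan nan)
}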

Bye,
bearophile


Re: Response files

2009-03-21 Thread Frank Benoit
Jason House wrote:
> Walter Bright Wrote:
> 
>> Frank Benoit wrote:
>>> DMD 1.041 on windows does support response files, that is a file
>>> containing arguments.
>>> On Linux dmd does not understand that.
>> The Windows response files date back to the problem that DOS/Windows allowed 
>> only a very short command line. So the arguments were 
>> put into a file instead.
>>
>> It's probably a good idea to do it for Linux, too.
> 
> Ick. Why? Command files are hacks for Windows' shortcomings. Why spread such 
> hacks across all platforms? The linux command line is already well adapted to 
> handle this kind of thing.

Even in cygwin, there seems to be a 32k limit on the command line.
On Linux you can find out that limit with
getconf ARG_MAX (=>2097152 on my linux box)

This seems to be enough, however, 32k/64k are not.
There is a related bug, because the dmd response file workaround is not
working with >64k, see http://d.puremagic.com/issues/show_bug.cgi?id=2705




Re: eliminate writeln et comp?

2009-03-21 Thread Christopher Wright

Daniel Keep wrote:


Christopher Wright wrote:

Daniel Keep wrote:

When was the last time you had to put this in your GCC-compiled programs?

"Portions of this program Copyright (C) Free Software Foundation.  Uses
glibc."

Executable code resulting from compilation is not a work derived from GCC.

glibc is extremely difficult to link statically and is distributed under
the LGPL, so no copyright notice is necessary.

If dmd had good support for dynamic linking, this wouldn't be nearly as
much of an issue. Sadly, ddl seems to be on hiatus, and at any rate, it
can't be applied to the runtime.


I think you're missing my point.  I'm saying that a standard library
shouldn't require you to insert legal disclaimers or attribution notices
into your program or its documentation.

A standard library should be as invisible as possible in this regard.

  -- Daniel


Right. It's invisible with glibc because you link to it dynamically, and 
because everyone installs it by default. Druntime has neither of these 
advantages.


Re: new D2.0 + C++ language

2009-03-21 Thread Piotrek

Weed wrote:

No. )
I'm not suggesting anything new; I'm only suggesting time-tested things.


OK. I'll tell you what I think. D is a well designed language. What you 
suggest is some kind of hack to that language. I don't think there's 
much interest in it. As you said, you don't have much experience in 
writing compilers (neither do I), but you should know how hard it is to keep 
the design consistent through the whole way a language works. Walter spent many years 
on it. From my point of view he does the best he can (of course there are bugs, 
but when I write something in D I'm so glad I don't have to do it in 
something else).


Cheers


Re: new D2.0 + C++ language

2009-03-21 Thread Weed
Piotrek wrote:
> Weed wrote:
>> Weed wrote:
>>> Hi!
>>>
>>> I want to offer the dialect of the language D2.0, suitable for use where
>>> are now used C/C++. Main goal of this is making language like D, but
>>> corresponding "zero-overhead principle" like C++:
>>
>> at least to something like this idea? )
> 
> The idea could be ok but have you written a compiler or specification?

My experience in creating compilers is limited to a half-read copy of
"Compilers: Principles, Techniques, and Tools".

It was easier to write up the differences from D than a full specification
- I hoped to receive pointers to some fundamental problems, but everything
seems to be fine (excluding the holy war about GC, of course).

> Or is it wishful thinking like let's make the language that's productive
> to the skies while faster than asm? ;)

No. )
I'm not suggesting anything new; I'm only suggesting time-tested things.


Re: Library for Linear Algebra?

2009-03-21 Thread Trass3r

Don wrote:
I abandoned it largely because array operations got into the language; 
since then I've been working on getting the low-level math language 
stuff working.

Don't worry, I haven't gone away!


I see.




http://www.dsource.org/projects/lyla


Though array operations still only give us SIMD and no multithreading (?!).
I think the best approach is lyla's, taking an existing, optimized C 
BLAS library and writing some kind of wrapper using operator overloading 
etc. to make programming easier and more intuitive.


Re: new D2.0 + C++ language

2009-03-21 Thread Piotrek

Weed wrote:

Weed wrote:

Hi!

I want to offer the dialect of the language D2.0, suitable for use where
are now used C/C++. Main goal of this is making language like D, but
corresponding "zero-overhead principle" like C++:


at least to something like this idea? )


The idea could be ok but have you written a compiler or specification? 
Or is it wishful thinking like let's make the language that's productive 
to the skies while faster than asm? ;)


Cheers


Re: Response files

2009-03-21 Thread Frank Benoit
Jason House wrote:
> Walter Bright Wrote:
> 
>> Frank Benoit wrote:
>>> DMD 1.041 on windows does support response files, that is a file
>>> containing arguments.
>>> On Linux dmd does not understand that.
>> The Windows response files date back to the problem that DOS/Windows allowed 
>> only a very short command line. So the arguments were 
>> put into a file instead.
>>
>> It's probably a good idea to do it for Linux, too.
> 
> Ick. Why? Command files are hacks for Windows' shortcomings. Why spread such 
> hacks across all platforms? The linux command line is already well adapted to 
> handle this kind of thing.

Because, imagine you set up a build process for your application. Why
should I have to care about that difference in my 'makefile',
'rakefile', ... whatever?


Re: new D2.0 + C++ language

2009-03-21 Thread Weed
Weed wrote:
> Hi!
> 
> I want to offer the dialect of the language D2.0, suitable for use where
> are now used C/C++. Main goal of this is making language like D, but
> corresponding "zero-overhead principle" like C++:

at least to something like this idea? )


Re: eliminate writeln et comp?

2009-03-21 Thread Daniel Keep


Christopher Wright wrote:
> Daniel Keep wrote:
>> When was the last time you had to put this in your GCC-compiled programs?
>>
>> "Portions of this program Copyright (C) Free Software Foundation.  Uses
>> glibc."
> 
> Executable code resulting from compilation is not a work derived from GCC.
> 
> glibc is extremely difficult to link statically and is distributed under
> the LGPL, so no copyright notice is necessary.
> 
> If dmd had good support for dynamic linking, this wouldn't be nearly as
> much of an issue. Sadly, ddl seems to be on hiatus, and at any rate, it
> can't be applied to the runtime.

I think you're missing my point.  I'm saying that a standard library
shouldn't require you to insert legal disclaimers or attribution notices
into your program or its documentation.

A standard library should be as invisible as possible in this regard.

  -- Daniel


Re: new D2.0 + C++ language

2009-03-21 Thread Daniel Keep


bearophile wrote:
> Jarrett Billingsley:
>>> Pointer syntax of Pascal is better, and the := �= often avoid the C bugs 
>>> like if(a = b).
>> Which isn't a problem in D ;)
> 
> Let's say D has a workaround to patch most of that C-syntax hole :-)
> And I'll never like C pointer syntax.
> 
> 
>> That's actually pretty nice.
> 
> An alternative syntax that avoids the two nested {{}}:
> Lambda functions:
> {int x -> x*x}
> {x -> x*x}
> {float x, float y -> x*y}
> Lambda delegates:
> {int x => x*x}
> {x => x*x}
> {float x, float y => x*y}
> 
> I may even like that :-)
> 
> Bye,
> bearophile

{ int -> int } // function
{ this int -> int } // delegate

Not saying I support this syntax; just proposing an alternative.  The
way I see it, there's no reason why functions are -> and delegates are
=>; the difference is non-obvious.
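For comparison, what already works in D2 distinguishes the two by a leading keyword rather than by an arrow; a small, self-contained sketch:

void main() {
   // explicit function literal: no context pointer
   int function(int) sq1 = function int(int x) { return x * x; };
   // explicit delegate literal: carries a context pointer
   int delegate(int) sq2 = delegate int(int x) { return x * x; };
   assert(sq1(3) == 9 && sq2(4) == 16);
}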

  -- Daniel


Re: Response files

2009-03-21 Thread Jason House
Walter Bright Wrote:

> Frank Benoit wrote:
> > DMD 1.041 on windows does support response files, that is a file
> > containing arguments.
> > On Linux dmd does not understand that.
> 
> The Windows response files date back to the problem that DOS/Windows allowed 
> only a very short command line. So the arguments were 
> put into a file instead.
> 
> It's probably a good idea to do it for Linux, too.

Ick. Why? Command files are hacks for Windows' shortcomings. Why spread such 
hacks across all platforms? The Linux command line is already well adapted to 
handle this kind of thing.


Re: new D2.0 + C++ language

2009-03-21 Thread Rainer Deyke
Christopher Wright wrote:
> I was pulling numbers out of my ass.

That's what I assumed.  I'm a game developer.  I use GC.

> 0.1 seconds out of every ten is a small amount to pay for the benefits
> of garbage collection in most situations.

GC is useless for resource management.  RAII solves the resource
management problem, in C++ and D2.  GC is a performance optimization on
top of that.  If the GC isn't faster than simple reference counting,
then it serves no purpose, because you could use RAII with reference
counting for the same effect.

(No, I don't consider circular references a problem worth discussing.)
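To make the RAII-plus-reference-counting point concrete, here is a bare-bones D2 sketch (purely illustrative: the names are invented, it is not thread safe, and it assumes a compiler with struct constructors, postblits, and destructors):

struct RefCounted(T) {
   private T* payload;
   private size_t* count;

   this(T value) {
      payload = new T;
      *payload = value;
      count = new size_t;
      *count = 1;
   }

   this(this) {                       // copying a handle bumps the count
      if (count) ++*count;
   }

   ~this() {                          // the last handle frees the payload
      if (count && --*count == 0) {
         delete payload;
         delete count;
      }
   }

   ref T get() { return *payload; }
}

void main() {
   auto a = RefCounted!(int)(42);
   auto b = a;                        // count goes to 2
   b.get() = 43;                      // both handles see the change
   assert(a.get() == 43);
}                                     // count drops to 0 here, payload released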


-- 
Rainer Deyke - rain...@eldwood.com