Re: OT: on IDEs and code writing on steroids

2010-07-27 Thread Bruno Medeiros

On 20/05/2009 02:12, Walter Bright wrote:

Not even this book cover could save Forth!

http://www.globalnerdy.com/2007/09/14/reimagining-programming-book-covers/


Ah, Julie Bell and Boris Vallejo, one (well, two) of my favorite fantasy 
artists; they're pretty awesome.


--
Bruno Medeiros - Software Engineer


Re: OT: on IDEs and code writing on steroids

2009-05-26 Thread grauzone
I do not wish to recompile a 1.5GB standalone executable if I just 
changed a minor version of a lib.


Can you tell me why that application needs to be that big, and can't be 
split into several smaller processes?


Re: OT: on IDEs and code writing on steroids

2009-05-26 Thread BCS

Hello Yigal,


BCS wrote:


Reply to Yigal,


BCS wrote:


Hello Yigal,


C# assemblies are analogous to C/C++/D libs.
you can't create a standalone executable in D just by parsing the
D
source files (for all the imports) if you need to link in external
libs.
you need to at least specify the lib name if it's on the linker's
search path or provide the full path otherwise.

pragma(lib, "..."); //?


that's a compiler directive. nothing prevents a C# compiler from
implementing this. In general though this is a bad idea. why would you
want to embed such outside data inside your code?


Because it's needed to build the code


info needed for building your task should not be part of the code.


IMO it should. Ideally it should be available in the code in a form
tools can read. At a minimum, it should be in the comment header. The
only other choice is placing it outside your code and we have already
covered why I think that is a bad idea.


what if I want to rename the lib,


So you rename the lib and whatever references to it (inside the code
or outside) end up needing to be updated. Regardless, you will need to
update something by hand or have a tool do it for you. I see nothing
harder about updating it in the code than outside the code.


do I have to recompile everything?


Nope. pragma(lib, ...) just passes a static lib to the linker and
doesn't have any effect at runtime. (if you are dealing with .dll/.so
libraries then you link in an export .lib with a pragma or load them
manually and don't even worry about it at all)


what if I don't have the source?


It's pointing at a static library so source doesn't matter. If you
are working from source then you don't need the pragma and what
matters is DMD's import path (if it is an unrelated code tree, in
which case the path can be developer specific and needs to be set up
per system).


what if I want to change the version?


In that case you change the pragma. (again assuming static libs and
the same side note for dynamic libs)


what if I want to switch a vendor for this lib?


I have never heard of this being possible without major changes in
the calling code so it doesn't matter.


What I was trying to say is that you're hardcoding the lib name and
version inside the code. I see two problems with this: if the pragma
is in my code then I need to re-compile my code if I want to edit the
pragma (rename lib, change version, change vendor, etc...);
if the pragma is in some 3rd party component which I don't have the
source for then I can't change the pragma.
either way, it conflicts with my work-flow and goals.
I do not wish to recompile a 1.5GB standalone executable if I just
changed a minor version of a lib.


I see your point but I think it is invalid.

For starters, I could be wrong but I think that the use of pragma(lib,) can't 
be detected in object code; I think it just instructs DMD to pass the lib 
on to the linker when it gets called by DMD. If I am wrong about that, I still 
think it doesn't matter because (as far as static libraries go) I think it 
would be a very BAD idea to try and switch them out from under a closed source 
lib. Third, if you really want to go mucking around with those internals, 
you can always copy the new lib over the old one.




IIRC, the math lib in C/C++ comes in three flavors so you can choose
your trade-off (speed or accuracy) and the only thing you need to do
is just link the flavor you want in your executable.


Everything needs a math lib so there will be a default. I'm not willing to 
second-guess the original programmer if they choose to switch to another 
lib. The same goes for other libs as well. If you start switching to libs 
that the lib's programmer doesn't explicitly support, you're already on your 
own and you have bigger problems than what I'm talking about.
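
For what it's worth, that kind of flavor choice can also be recorded in D source rather than in a build file. A minimal sketch (the version identifiers and library names here are invented for illustration; you would select a flavor with `-version=FastMath` on the command line):

```d
// Sketch: selecting a math-lib "flavor" with version blocks, so the
// speed-vs-accuracy trade-off lives in the source, not a project file.
// "FastMath", "math_fast", etc. are hypothetical names.
version (FastMath)
{
    pragma(lib, "math_fast");      // hypothetical speed-optimized flavor
}
else version (AccurateMath)
{
    pragma(lib, "math_accurate");  // hypothetical accuracy-optimized flavor
}
else
{
    pragma(lib, "math_default");   // fallback default flavor
}
```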



you seem keen on combining the build process with compilation which is
in my experience a very bad thing. it may simplify your life for your
small projects but as I was telling you before it's a pain in the neck
for the scale of projects I work on. I don't get why you refuse to see
that. what you suggest is _not_ a good solution for me.


What I want is a language where most of the time you build a project from 
only the information in the source code. What I don't want is a language 
where the only way to keep track of the information you need to build a project 
is with an external data file. I don't want that because the only practical 
way to do that is to _force_ the programmer to use an IDE and have it maintain 
that file.





Re: OT: on IDEs and code writing on steroids

2009-05-26 Thread BCS

Hello grauzone,


I do not wish to recompile a 1.5GB standalone executable if I just
changed a minor version of a lib.


Can you tell me, why that application needs to be that big, and can't
be split in several, smaller processes?



I'm more interested in how you got 1.5GB of executable.




Re: OT: on IDEs and code writing on steroids

2009-05-26 Thread Jussi Jumppanen
BCS Wrote:

 What I want is a language where most of the time you build 
 a project from only the information in the source code. 

There is nothing in C# that stops you doing exactly this.

You can build this Simple.cs file: 
  
  using System;
  using System.Windows.Forms;
  
  namespace SimpleApplication
  {
  static class Program
  {
  [STAThread]
  static void Main()
  {
          MessageBox.Show("Hello World!", "C# Application", 
              MessageBoxButtons.OK, 
              MessageBoxIcon.Information);
  }
  }
  }

to create a Simple.exe using nothing but this command line:

  csc.exe /r:System.dll; D:\temp\simple.cs

 What I don't want is a language where the only way to keep track 
 of the information you need to build a project, is with an external 
 data file. 

People have been developing projects using an external data file 
for decades. It's called the makefile.

 I don't want that because the only practical way to do that is _force_ 
 the programmer to use an IDE and have it maintain that file.

What exactly is it about C# that makes you think you are FORCED
to use an IDE to write the code?
 
MSBuild.exe is nothing more than Microsoft's replacement for make.exe. 

It is just a version of make.exe that takes XML 
make files as its input.





Re: OT: on IDEs and code writing on steroids

2009-05-25 Thread Yigal Chripun

BCS wrote:

Hello Yigal,


C# assemblies are analogous to C/C++/D libs.
you can't create a standalone executable in D just by parsing the D
source files (for all the imports) if you need to link in external libs.
you need to at least specify the lib name if it's on the linker's
search path or provide the full path otherwise.


pragma(lib, "..."); //?


that's a compiler directive. nothing prevents a C# compiler from implementing 
this. In general though this is a bad idea. why would you want to embed 
such outside data inside your code? info needed for building your task 
should not be part of the code. what if I want to rename the lib, do I 
have to recompile everything? what if I don't have the source? what if I 
want to change the version? what if I want to switch a vendor for this lib?





Same thing with assemblies.
you have to provide that meta-data (lib names) anyway both in C# and
D. the only difference is that C# (correctly) recognizes that this is
the better default.


IMHO the c# way is the /worse/ default. Based on that being my opinion, 
I think we have found where we will have to disagree. Part of my 
reasoning is that in the normal case, for practical reasons, that file 
will have to be maintained by an IDE, thus /requiring/ development to be 
in an IDE of some kind. In D, that data can normally be part of the 
source code, and only in unusual cases does it need to be formally 
codified.





Re: OT: on IDEs and code writing on steroids

2009-05-25 Thread BCS

Reply to Yigal,


BCS wrote:


Hello Yigal,


C# assemblies are analogous to C/C++/D libs.
you can't create a standalone executable in D just by parsing the D
source files (for all the imports) if you need to link in external
libs.
you need to at least specify the lib name if it's on the linker's
search path or provide the full path otherwise.

pragma(lib, "..."); //?


that's a compiler directive. nothing prevents a C# compiler from
implementing this. In general though this is a bad idea. why would you
want to embed such outside data inside your code?


Because it's needed to build the code


info needed for building your task should not be part of the code.


IMO it should. Ideally it should be available in the code in a form tools 
can read. At a minimum, it should be in the comment header. The only other 
choice is placing it outside your code and we have already covered why I 
think that is a bad idea.



what if I want to rename the lib,


So you rename the lib and whatever references to it (inside the code or outside) 
end up needing to be updated. Regardless, you will need to update something 
by hand or have a tool do it for you. I see nothing harder about updating 
it in the code than outside the code.



do I have to recompile everything?


Nope. pragma(lib, ...) just passes a static lib to the linker and doesn't 
have any effect at runtime. (if you are dealing with .dll/.so libraries then 
you link in an export .lib with a pragma or load them manually and don't even 
worry about it at all)
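
To make the mechanism concrete, here is a minimal sketch of a D module carrying its own link information; the library name "mylib" and the function in it are made up for illustration, and the pragma only affects the link step, not the generated object code:

```d
// The pragma asks the compiler to pass the static library to the
// linker; no external project file records this dependency.
// "mylib" and mylib_frob are hypothetical.
pragma(lib, "mylib");

extern (C) int mylib_frob(int x); // declared here, defined in mylib

void main()
{
    // Resolved at link time against mylib.lib / libmylib.a.
    auto y = mylib_frob(42);
}
```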



what if I don't have the source?


It's pointing at a static library so source doesn't matter. If you are working 
from source then you don't need the pragma and what matters is DMD's import 
path (if it is an unrelated code tree, in which case the path can be developer 
specific and needs to be set up per system).



what if I want to change the version?


In that case you change the pragma. (again assuming static libs and the same 
side note for dynamic libs)



what if I want to switch a vendor for this lib?


I have never heard of this being possible without major changes in the calling 
code so it doesn't matter.





Re: OT: on IDEs and code writing on steroids

2009-05-24 Thread BCS

Hello Christopher,


BCS wrote:


But that's not the point. Neither make nor VS's equivalent is what
this thread was about. At least not where I was involved. My point is
that the design of c# *requires* the maintenance (almost certainly
by a c#-specific IDE) of some kind of external metadata file that
contains information that can't be derived from the source code
itself, whereas with D, no such metadata is *needed*. If you wanted,
you could build a tool to take D source code and generate a makefile
or a bash build script from it


If you wanted, you could create a tool to do the same with C# source
code, assuming there exists a directory containing all and only those
source files that should end up in the resulting assembly.


I'm /not/ willing to assume that (because all too often it's not true) and 
you also need the list of other assemblies that should be included.





Re: OT: on IDEs and code writing on steroids

2009-05-24 Thread Lutger
Yigal Chripun wrote:
...
 
 this I completely disagree with. those are the same faulty reasons I 
 already answered.
 an IDE does _not_ create bad programmers, and does _not_ encourage bad 
 code. it does encourage descriptive names which is a _good_ thing.
 
 writing strcpy a la C style is cryptic and *wrong*. code is read a 
 hundred times more than it's written and a better name would be, for 
 instance, stringCopy.
 it's common nowadays to have terabyte-sized HDDs so why do people try to 
 save a few bytes from their source while sacrificing readability?
...

This is not what I was saying. 

I'm not talking about strcpy vs stringCopy. stringCopy is short. I'm talking 
about things like SetCompatibleTextRenderingDefault.

And this example isn't even so bad. Fact is, it is easier to come up with long 
identifiers and there is no penalty in the form of typing cost for doing so. 

It's not about bad programmers (or saving bytes, that's just ridiculous), but 
an IDE does encourage certain kinds of constructs because they are easier in 
that environment. Good programmers come up with good, descriptive names, 
whether they program in an IDE or not. 

At work I must program in VB.NET. This language is pretty verbose in describing 
even the most common things. It's easier to parse when you're new to the 
language, but after a while I find all the verbosity gets in the way of 
readability.





Re: OT: on IDEs and code writing on steroids

2009-05-24 Thread Yigal Chripun

BCS wrote:

Hello Christopher,


BCS wrote:


But that's not the point. Neither make nor VS's equivalent is what
this thread was about. At least not where I was involved. My point is
that the design of c# *requires* the maintenance (almost certainly
by a c#-specific IDE) of some kind of external metadata file that
contains information that can't be derived from the source code
itself, whereas with D, no such metadata is *needed*. If you wanted,
you could build a tool to take D source code and generate a makefile
or a bash build script from it


If you wanted, you could create a tool to do the same with C# source
code, assuming there exists a directory containing all and only those
source files that should end up in the resulting assembly.


I'm /not/ willing to assume that (because all too often it's not true) 
and you also need the list of other assemblies that should be included.




C# assemblies are analogous to C/C++/D libs.
you can't create a standalone executable in D just by parsing the D 
source files (for all the imports) if you need to link in external libs. 
you need to at least specify the lib name if it's on the linker's search 
path or provide the full path otherwise.

Same thing with assemblies.

you have to provide that meta-data (lib names) anyway both in C# and D. 
the only difference is that C# (correctly) recognizes that this is the 
better default.


Re: OT: on IDEs and code writing on steroids

2009-05-24 Thread BCS

Hello Yigal,


C# assemblies are analogous to C/C++/D libs.
you can't create a standalone executable in D just by parsing the D
source files (for all the imports) if you need to link in external libs.
you need to at least specify the lib name if it's on the linker's
search path or provide the full path otherwise.


pragma(lib, "..."); //?


Same thing with assemblies.
you have to provide that meta-data (lib names) anyway both in C# and
D. the only difference is that C# (correctly) recognizes that this is
the better default.


IMHO the c# way is the /worse/ default. Based on that being my opinion, I 
think we have found where we will have to disagree. Part of my reasoning 
is that in the normal case, for practical reasons, that file will have to 
be maintained by an IDE, thus /requiring/ development to be in an IDE of 
some kind. In D, that data can normally be part of the source code, and 
only in unusual cases does it need to be formally codified.





Re: OT: on IDEs and code writing on steroids

2009-05-23 Thread Yigal Chripun

Georg Wrede wrote:

Yigal Chripun wrote:
What I was saying was not specific to DWT but rather that _any_ 
reasonably big project will use such a system and it's simply not 
practical to do otherwise. how would you handle a project with a 
hundred files that takes 30 min. to compile without any tool 
whatsoever except the compiler itself?


Make?

And if you're smart, a version control system. (Whether you use an IDE or 
not.)


Make _is_ a build tool


Re: OT: on IDEs and code writing on steroids

2009-05-22 Thread Andrei Alexandrescu

Yigal Chripun wrote:
Last thing, basing your arguments on history is flawed. the Micro-Kernel 
idea got the same treatment after the failures in the 80's (Mach and 
co.) but nowadays the idea has been revived and there are already several 
million cellphones that run an OS built on the L4 micro-kernel, so it's 
even a commercial success.


Our industry goes in cycles, and I've been around to see a few 
(multitasking, microkernels, memory paging, virtual machine monitors, 
client-server...). That is because various tradeoffs in hardware have 
subtly changed with time. The human factor, however, has stayed the same.



Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-22 Thread Nick Sabalausky
Yigal Chripun yigal...@gmail.com wrote in message 
news:gv5dpn$2oe...@digitalmars.com...

 I think Nemerle provides this - the constructs in Nemerle for the Macro 
 system are very simple and intuitive. you only have one extra syntax 
 feature, the <[ ]>. think of D's CTFE only much more extended in scope - 
 you write a CTFE function and compile it. (that's the Nemerle Macro 
 equivalent) then you just call it in a different file and the compiler 
 will execute this function at compile time.
 Nemerle does not need an interpreter for this since these functions are 
 compiled just like the rest of the code. Nemerle also provides compiler 
 APIs so these functions could work on the AST to add/alter/remove types and 
 other constructs.


As I recall, we got onto this subject from discussing ways to combine the 
power of D/C++-style templates with the cross-[assembly/object] benefits of 
C#-style generics. I understand what you're saying here about why Nemerle's 
macro system is like a better form of D's CTFE. But I'm unclear on how 
Nemerle's macro system relates to the problem of achieving the best of both 
worlds between D/C++-style templates and C#-style generics. 




Re: OT: on IDEs and code writing on steroids

2009-05-22 Thread Nick Sabalausky
Georg Wrede georg.wr...@iki.fi wrote in message 
news:gv4t8t$1r4...@digitalmars.com...
 Nick Sabalausky wrote:
 Suppose (purely hypothetically) that the .NET assembly system were 
 changed to allow the source for a D/C++ style of source-level template to 
 be embedded into the assembly. Then they'd be able to do D/C++ style 
 source-level template/code-generation. Right?

 I assume, actually presume, that would take the better part of a decade.

 Now obviously the big problem with that is it would only be usable in
 the same language it was originally written in.

 That actually depends. Done the M$ way, it would. Done properly, it would 
 work in any language. But then, that would mean a rewrite of the entire 
 CLR, wouldn't it?


Actually, I only intended that part of it as a lead-in to my idea about 
templates/generics working on an AST-level instead of source-level (D/C++ 
templates) or bytecode-level (C# generics). 




Re: OT: on IDEs and code writing on steroids

2009-05-22 Thread BCS

Hello Yigal,


BCS wrote:


It is my strongly held opinion that the primary argument for dlls and
friends, code sharing, is attempting to solve a completely
intractable problem. As soon as you bring in versioning, installers
and uninstallers, the problem becomes flat out impossible to solve.
(the one exception is for low level system things like KERNEL32.DLL
and stdc*.so)


so, in your opinion Office, photoshop, adobe acrobat, can all be
provided as monolithic executables? that's just ridiculous.


Office, I wouldn't mind. Photoshop, it's got lots of plugins (#1), right? 
adobe acrobat, it might as well BE a plugin (#1 again).



My work uses this monolithic model approach for some programs and this
brings so much pain that you wouldn't believe.


How exactly?


just so you'd understand the scale I'm talking about - our largest
executable is 1.5 Gigs in size.


That's point #3 and I'd love to know how you got that big. (I guess I should 
add a #5: resource-only DLLs.) 


you're wrong on both accounts, DLL type systems are not only the
common case, they are the correct solution.


I didn't say they aren't common. I said it's a bad idea IMO. 


the DLL HELL you're so afraid of is mostly solved by using
jars/assemblies (smart dlls) that contain meta-data such as versions.
this problem is also solved on Linux systems that use package
managers, like Debian's APT.


If you ignore system libraries like .NET itself, I'd almost bet that if 
you look at them long enough those systems, from a practical standpoint, are 
almost the same as installing dll/so files to be used only by one program. 
That is, the average number of programs/applications that depend on any 
given file is 1. And as I already pointed out, I'll burn disk space to 
get the reliability that static linkage gets me.


I seem to recall running into this issue with .NET assemblies and .so files 
within the last year. 


monolithic design like you suggest is in fact bad design that leads to
things like - Windows Vista running slower on my 8-core machine than
Window XP on my extremely weak laptop.


If the same design runs slower with static linkage than with dynamic linkage, 
then there is something wrong with the OS. I can say that with confidence 
because everything that a static version needs to do, the dynamic version 
must do as well, and then a pile more.





Re: OT: on IDEs and code writing on steroids

2009-05-22 Thread BCS

Hello Rainer,


My favorite deployment system is the application bundle under OS X.
It's a directory that looks like a file.  Beneath the covers it has
frameworks and configuration files and multiple executables and all
that
crap, but to the user, it looks like a single file.  You can copy it,
rename it, move it (on a single computer or between computers), even
delete it, and it just works.  Too bad the system doesn't work under
any
other OS.


Oh man would I love to have that :) I've day dreamed of a system a lot like 
that. One thing I'd mandate if I ever designed my ideal system is that installation 
/in total/ is plunking the directory in place, and removal is simply deleting it. Once 
it's gone, it's gone. Nothing may remain that can in ANY WAY affect other 
apps. That implies that after each boot nothing is installed and everything 
is installed on launch. 





Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Christopher Wright

Nick Sabalausky wrote:
Christopher Wright dhase...@gmail.com wrote in message 
news:gv29vn$7a...@digitalmars.com...

Nick Sabalausky wrote:
Christopher Wright dhase...@gmail.com wrote in message 
news:gv0p4e$uv...@digitalmars.com...

Nick Sabalausky wrote:
I can see certain potential benefits to the general way C# does 
generics, but until the old (and I do mean old) issue of "There's an 
IComparable, so why the hell won't MS give us an IArithmetic so we can 
actually use arithmetic operators on generic code?" gets fixed (and at 
this point I'm convinced they've never had any intent of ever fixing 
that), I don't care how valid the reasoning behind C#'s general 
approach to generics is, the actual state of C#'s generics still falls 
squarely into the categories of "crap" and "almost useless".
IArithmetic is impossible in C# because operator overloads are static 
methods, and interfaces cannot specify static methods.

Then how does IComparable work?

It uses a member function instead.


And they can't do the same for arithmetic? 


I believe the rationale for using static functions is so that you can 
add null to something. (The indexing operator, mind you, is a member 
property, so this doesn't always hold.) Additionally, this gets rid of 
opX_r.


In practice, I doubt anyone uses that. But it's too late to make that 
change.


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Steven Schveighoffer

On Wed, 20 May 2009 23:40:54 -0400, Nick Sabalausky a...@a.a wrote:


Christopher Wright dhase...@gmail.com wrote in message
news:gv29vn$7a...@digitalmars.com...

Nick Sabalausky wrote:

Christopher Wright dhase...@gmail.com wrote in message
news:gv0p4e$uv...@digitalmars.com...

Nick Sabalausky wrote:

I can see certain potential benefits to the general way C# does
generics, but until the old (and I do mean old) issue of "There's an
IComparable, so why the hell won't MS give us an IArithmetic so we
can actually use arithmetic operators on generic code?" gets fixed
(and at this point I'm convinced they've never had any intent of ever
fixing that), I don't care how valid the reasoning behind C#'s general
approach to generics is, the actual state of C#'s generics still falls
squarely into the categories of "crap" and "almost useless".

IArithmetic is impossible in C# because operator overloads are static
methods, and interfaces cannot specify static methods.


Then how does IComparable work?


It uses a member function instead.


And they can't do the same for arithmetic?


Keep in mind that the member method does not map to an operator.

So you still have to call it directly:

object.compareTo(object2);

-Steve


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Yigal Chripun

BCS wrote:

in C# you almost never compile each source file separately, rather you
compile a bunch of sources into an assembly all at once and you provide
the list of other assemblies your code depends on. so the dependency is
on the package level rather than on the file level. this makes so much
more sense since each assembly is a self contained unit of
functionality.


That is more or less what I thought it was. Also, that indicates that 
the design of c# assumes a build model that I think is a bad idea; the 
big dumb all or nothing build where a sub part of a program is either 
up to date, or rebuilt by recompiling everything in it.


C# has a different compilation model which is what I was saying all 
along. However I disagree with your assertion that this model is bad.
It makes much more sense than the C++/D model. the idea here is that 
each self contained sub-component is compiled by itself. this self 
contained component might as well be a single file; nothing in the above 
prevents this.
consider a project with 100 files where you have one specific feature 
implemented by 4 tightly coupled classes which you put in separate 
files. each of the files depends on the rest. what's the best compiling 
strategy here?
if you compile each file separately then you parse all 4 files for each 
object file, which is completely redundant and makes little sense since 
you'll need to recompile all of them anyway because of their dependencies.





Last I heard Re-Sharper is a VS plugin, not an IDE in its own right, 
and even if that has changed, it's still an IDE. Even so, my point is 
Any IDE vs. No IDE, so it doesn't address my point.




My use of the term IDE here is a loose one. let me rephrase:
yes, Re-sharper is a plugin for VS. without it VS provides just 
text-editing features and I don't consider it an IDE like eclipse is.
Re-sharper provides all the features of a real IDE for VS. so, while 
it's just a plugin, it's more important than VS itself.


So DWT depends on DSSS's meta data. That's a design choice of DWT, not D. 
What I'm asserting is that C# projects depending on meta data is a 
design choice of C#, not the project. D projects can (even if some don't) 
be practically designed so that they don't need that meta data whereas, 
I will assert, C# projects, for practical purposes, can't do away with it.


--


What I was saying was not specific to DWT but rather that _any_ 
reasonably big project will use such a system and it's simply not 
practical to do otherwise. how would you handle a project with a hundred 
files that takes 30 min. to compile without any tool whatsoever except 
the compiler itself?




I'm fine with any build system you want to have implemented as long as a 
tool stack can still be built that works like the current one. That is, 
that it can practically:


- support projects that need no external meta data
- produce monolithic OS native binary executables
- work with the only language aware tool being the compiler

I don't expect it to require that projects be done that way and I 
wouldn't take any issue if a tool stack were built that didn't fit that 
list. What I /would/ take issue with is if the language (okay, or DMD 
in particular) were altered to the point that one or more of those 
*couldn't* be done.




your points are skewed IMO.
 - support projects that need no external meta data
this is only practical for small projects and that works the same way in 
both languages.


 - produce monolithic OS native binary executables
that is unrelated to our topic. Yes .Net uses byte-code and not native 
executables. I never said I want this aspect to be brought to D.

 - work with the only language aware tool being the compiler
again, only practical for small-mid projects in both languages.


just to clarify: you _can_ compile C# files one at a time just like you 
would with C or D, and there is an output format for that which is not an 
assembly.


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Yigal Chripun

Nick Sabalausky wrote:


Maybe this is naive, but what about an AST-level template/generic? Couldn't 
that provide for the best of both worlds?


For instance, suppose (purely hypothetically) that the .NET assembly system 
were changed to allow the source for a D/C++ style of source-level template 
to be embedded into the assembly. Then they'd be able to do D/C++ style 
source-level template/code-generation. Right? Now obviously the big problem 
with that is it would only be usable in the same language it was originally 
written in. So, instead of getting that cross-language support by going all 
the way down to the IL bytecode level to implement generics (which, as you 
said, would somehow prevent the flexibility that the D/C++ style enjoys) 
suppose it only went down as far as a language-agnostic AST?


I suppose that might make reverse-engineering easier which MS might not 
like, but I'm not suggesting this as something that MS should like or should 
even do, but rather suggesting it as (business issues completely aside) 
something that would possibly gain the benefits of both styles. 





that's the exact opposite of a good solution.
I already mentioned several times before the language Nemerle which 
provides the correct solution.
important fact - Nemerle is a .NET language and it does _NOT_ need to 
modify the underlying system.


The way it works in Nemerle is pretty simple:
the language has a syntax to compose/decompose AST.
a Macro in nemerle is just a plain old function that uses the same 
syntax you'd use at run-time and this function can use APIs to access 
the compiler's internal data structures (the AST) and manipulate it.
you connect it to your regular code by either just calling it like a 
regular function or by using attributes.


let's compare to see the benefits:
in D:
tango.io.Stdout("Hello World").newline; // prints at run-time
pragma(msg, "Hello World"); // prints at compile-time

in Nemerle:
macro m () {
  Nemerle.IO.printf ("compile-time\n");
  <[ Nemerle.IO.printf ("run-time\n") ]>;
}
// and you call it like this:
m();
Nemerle.IO.printf ("run-time\n");

notice how both use the same code, the same printf function?
the only change is that the second line inside the macro is enclosed 
inside <[ ]> which means "output (return) the AST for this code" instead 
of actually running the code and returning the result of the call.


Macros in Nemerle need to be compiled since they are regular Nemerle 
code and they need to be loaded by the compiler (added to the command 
line) in order to compile the code that calls the macros.


essentially these are just plugins for the compiler.

compared to the elegance of this solution, templates are just a crude 
copy-paste mechanism implemented inside the compiler.
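For readers unfamiliar with Nemerle, here is a rough Python analogue of the quoting idea described above (a hypothetical illustration, not code from the original post): an ordinary function runs some statements immediately (the "compile-time" part) and returns a parsed AST for the rest, which the caller compiles and executes later (the "run-time" part).

```python
import ast

def m():
    # Runs immediately when the "macro" is invoked -- the analogue of
    # the compile-time printf inside the Nemerle macro body.
    print("compile-time")
    # Instead of executing this code now, return its AST -- the analogue
    # of Nemerle's quotation brackets around the run-time printf.
    return ast.parse('print("run-time")', mode="exec")

# "Compile time": invoking the macro prints immediately and yields an AST.
tree = m()
# "Run time": the quoted code is compiled and executed later.
exec(compile(tree, "<macro>", "exec"))
```

The key point carries over: both halves are written in the same language with the same functions, and only the quotation decides whether code runs now or is handed back as data.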


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Andrei Alexandrescu

Yigal Chripun wrote:

Nick Sabalausky wrote:
I suppose that might make reverse-engineering easier which MS might 
not like, but I'm not suggesting this as something that MS should like 
or should even do, but rather suggesting it as (business issues 
completely aside) something that would possibly gain the benefits of 
both styles.




that's the exact opposite of a good solution.
I already mentioned several times before the language Nemerle, which 
provides the correct solution.
An important fact: Nemerle is a .NET language and it does _NOT_ need to 
modify the underlying system.


The way it works in Nemerle is pretty simple:
the language has a syntax to compose/decompose ASTs.
A macro in Nemerle is just a plain old function that uses the same 
syntax you'd use at run-time, and this function can use APIs to access 
the compiler's internal data structures (the AST) and manipulate them.
You connect it to your regular code by either just calling it like a 
regular function or by using attributes.


let's compare to see the benefits:
in D:
tango.io.Stdout("Hello World").newline; // prints at run-time
pragma(msg, "Hello World"); // prints at compile-time

in Nemerle:
macro m () {
  Nemerle.IO.printf ("compile-time\n");
  <[ Nemerle.IO.printf ("run-time\n") ]>;
}
// and you call it like this:
m();
// which expands at compile time into:
Nemerle.IO.printf ("run-time\n");

notice how both use the same code, the same printf function?
the only change is that the second line inside the macro is enclosed 
inside <[ ]>, which means "output (return) the AST for this code instead 
of actually running the code and returning the result of the call".


Macros in Nemerle need to be compiled, since they are regular Nemerle 
code, and they need to be loaded by the compiler (added to the command 
line) in order to compile the code that calls the macros.


essentially these are just plugins for the compiler.

compared to the elegance of this solution, templates are just a crude 
copy-paste mechanism implemented inside the compiler.


Nemerle's interesting, but it has its own issues. The largest one is 
that it will have to beat history: languages with configurable syntax 
have failed in droves in the 1970s.


Before I got into D, I was working on Enki. Enki was my own programming 
language and of course made D look like a piece of crap. In Enki, you 
had only very few primitives related to macro expansion, and you could 
construct all language elements - if, while, for, structures, classes, 
exceptions, you name it, from those primitive elements.


There were two elements that convinced me to quit Enki. One was that I'd 
got word of a language called IMP72. IMP72 embedded the very same ideas 
Enki had, with two exceptions: (1) it was created in 1972, and (2) 
nobody gave a damn ever since. IMP72 (and there were others too around 
that time) started with essentially one primitive and then generated 
itself with a bootstrap routine, a notion that completely wowed me and 
that I erroneously thought would have the world wowed too.


The second reason was that I've had many coffees and some beers with 
Walter and he convinced me that configurable syntax is an idea that 
people just don't like. Thinking a bit more, I realized that humans 
don't operate well with configurable syntax. To use the hackneyed 
comparison, no natural language or similar concoction has configurable 
syntax. Not even musical notation or whatnot. There's one syntax for 
every human language. I speculated that humans can learn one syntax for 
a language and then wire their brains to just pattern match semantics 
using it. Configurable syntax just messes with that approach, and 
besides makes any program hugely context-dependent and consequently any 
large program a pile of crap.


That being said, I have no idea whether or not Nemerle will be 
successful. I just speculate it has an uphill battle to win.



Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread BCS

Reply to Yigal,


BCS wrote:


in C# you almost never compile each source file separately, rather
you compile a bunch of sources into an assembly all at once and you
provide the list of other assemblies your code depends on. so the
dependency is on the package level rather than on the file level.
this makes so much more sense since each assembly is a self-contained
unit of functionality.


That is more or less what I thought it was. Also, that indicates that
the design of c# assumes a build model that I think is a bad idea;
the big dumb all or nothing build where a sub part of a program is
either up to date, or rebuilt by recompiling everything in it.


if you compile each file separately then you parse all 4 files for each
object file which is completely redundant and makes little sense since
you'll need to recompile all of them anyway because of their
dependencies.


All of the above is (as far as D goes) an implementation detail[*]. What 
I'm railing on is that in c# 1) you have no option BUT to do it that way 
and 2) the only practical way to build is from a config file


[*] I am working very slowly on building a compiler and am thinking of building 
it so that along with object files, it generates public export (.pe) files 
that have a binary version of the public interface for the module. I'd set 
it up so that the compiler never parses more than one file per process. If 
you pass it more, it forks and when it runs into imports, it loads the .pe 
files after, if needed, forking off a process to generate it.



without it VS provides just
text-editing features and I don't consider it an IDE like eclipse is.


The IDE features I don't want the language to depend on are in VS so this 
whole side line is un-important.



So DWT depends on DSSS's meta data. That's a design choice of DWT, not
D. What I'm asserting is that C# projects depending on meta data is
a design choice of C#, not the project. D projects can (even if some
don't) be practically designed so that they don't need that meta data,
whereas, I will assert, C# projects, for practical purposes, can't
do away with it.

--


What I was saying was not specific for DWT but rather that _any_
reasonably big project will use such a system and it's simply not
practical to do otherwise.


I assert that the above is false because...


how would you handle a project with a hundred
files that takes 30 min. to compile without any tool whatsoever
except the compiler itself?


I didn't say that the only tool you can use is the compiler. I'm fine with 
bud/DSSS/rebuild being used. What I don't want is a language that effectively 
_requires_ that some config file be maintained along with the code files. 
I suspect that the bulk of pure D projects (including large ones) /could/ 
have been written so that they didn't need a dsss.conf file, and many of those 
that do have a dsss.conf, I'd almost bet, can be handled without it. IIRC, 
all that DSSS really needs is what file to start with (whereas c# needs 
to be handed the full file list at some point).
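As a sketch of why this works for D, here is a toy dependency scanner (a hypothetical illustration, not DSSS's actual code): given only the root file and the import directory list, it recovers the project's file list purely from the text of the import statements, which is exactly the information a C# compiler cannot derive from the sources alone.

```python
import re
from pathlib import Path

# Match D import statements such as "import pkg.util;", including the
# "public import" / "static import" forms.
IMPORT_RE = re.compile(r"^\s*(?:public\s+|static\s+)?import\s+([\w.]+)",
                       re.MULTILINE)

def find_deps(root_file, import_dirs):
    """Return all project files reachable from root_file via imports."""
    seen, todo = set(), [Path(root_file)]
    while todo:
        f = todo.pop()
        if f in seen or not f.exists():
            continue  # skip visited files and unresolved candidate paths
        seen.add(f)
        for match in IMPORT_RE.finditer(f.read_text()):
            # Map module name a.b.c to candidate path a/b/c.d in every
            # import directory, mirroring D's module lookup rule.
            rel = Path(*match.group(1).split(".")).with_suffix(".d")
            todo += [Path(d) / rel for d in import_dirs]
    return sorted(seen)
```

A real tool (bud/DSSS/rebuild) uses the compiler front end rather than a regex, but the principle is the same: the file list is derivable, so no external config is strictly required.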



I'm fine with any build system you want to have implemented as long
as a tool stack can still be built that works like the current one.
That is that it can practically:

- support projects that need no external meta data
- produce monolithic OS native binary executables
- work with the only language aware tool being the compiler

I don't expect it to require that projects be done that way, and I
wouldn't take any issue if a tool stack were built that didn't fit
that list. What I /would/ take issue with is if the language (okay,
or DMD in particular) were altered to the point that one or more of
those *couldn't* be done.


your points are skewed IMO.


- support projects that need no external meta data


this is only practical for small projects and that works the same way
in both languages.



As I said, I think this is false.


- produce monolithic OS native binary executables


that is unrelated to our topic. Yes .Net uses byte-code and not native
executables. I never said I want this aspect to be brought to D.



Mostly I'm interested in the monolithic bit (no DLL hell!) but I was just 
pulling out my laundry list.



- work with the only language aware tool being the compiler


again, only practical for small-mid projects in both languages.


ditto the point on 1


just to clarify: you _can_ compile C# files one at a time, just like
you would with C or D, and there is an output format for that which is
not an assembly.


I think we won't converge on this.

I think I'm seeing a tools dependency issue that I don't like in the design 
of C# that I _know_ I'm not seeing in D. You think that D is already just 
as dependent on the tools and don't see that as an issue.


One of the major attractions for me to DMD is its build model so I tend to 
be very conservative and resistant to change on this point.





Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Yigal Chripun

BCS wrote:

Reply to Yigal,


BCS wrote:


in C# you almost never compile each source file separately, rather
you compile a bunch of sources into an assembly all at once and you
provide the list of other assemblies your code depends on. so the
dependency is on the package level rather than on the file level.
this makes so much more sense since each assembly is a self-contained
unit of functionality.


That is more or less what I thought it was. Also, that indicates that
the design of c# assumes a build model that I think is a bad idea;
the big dumb all or nothing build where a sub part of a program is
either up to date, or rebuilt by recompiling everything in it.


if you compile each file separately then you parse all 4 files for each
object file which is completely redundant and makes little sense since
you'll need to recompile all of them anyway because of their
dependencies.


All of the above is (as far as D goes) an implementation detail[*]. What 
I'm railing on is that in c# 1) you have no option BUT to do it that way 
and 2) the only practical way to build is from a config file


it's as much an implementation detail in D as it is in C#. Nothing 
prevents you from creating your own compiler for C# as well.


[*] I am working very slowly on building a compiler and am thinking of 
building it so that along with object files, it generates public 
export (.pe) files that have a binary version of the public interface 
for the module. I'd set it up so that the compiler never parses more 
than one file per process. If you pass it more, it forks and when it 
runs into imports, it loads the .pe files after, if needed, forking off 
a process to generate it.


sounds like an interesting idea - basically your compiler will generate 
the meta data just as an IDE does for C#.



without it VS provides just
text-editing features and I don't consider it an IDE like eclipse is.


The IDE features I don't want the language to depend on are in VS so 
this whole side line is un-important.



So DWT depends on DSSS's meta data. That's a design choice of DWT, not
D. What I'm asserting is that C# projects depending on meta data is
a design choice of C#, not the project. D projects can (even if some
don't) be practically designed so that they don't need that meta data,
whereas, I will assert, C# projects, for practical purposes, can't
do away with it.

--


What I was saying was not specific for DWT but rather that _any_
reasonably big project will use such a system and it's simply not
practical to do otherwise.


I assert that the above is false because...


how would you handle a project with a hundred
files that takes 30 min. to compile without any tool whatsoever
except the compiler itself?


I didn't say that the only tool you can use is the compiler. I'm fine 
with bud/DSSS/rebuild being used. What I don't want is a language that 
effectively _requires_ that some config file be maintained along with 
the code files. I suspect that the bulk of pure D projects (including 
large ones) /could/ have been written so that they didn't need a 
dsss.conf file, and many of those that do have a dsss.conf, I'd almost 
bet, can be handled without it. IIRC, all that DSSS really needs is what 
file to start with (whereas c# needs to be handed the full file list at 
some point).


you miss a critical issue here: DSSS/rebuild/etc can mostly be used 
without a config file _because_ they embed the DMDFE which generates 
that information (dependencies) for them. There is no conceptual 
difference between that and using an IDE. you just moved some 
functionality from the IDE to the build tool.

both need to parse the code to get the dependencies.




I'm fine with any build system you want to have implemented as long
as a tool stack can still be built that works like the current one.
That is that it can practically:

- support projects that need no external meta data
- produce monolithic OS native binary executables
- work with the only language aware tool being the compiler

I don't expect it to require that projects be done that way, and I
wouldn't take any issue if a tool stack were built that didn't fit
that list. What I /would/ take issue with is if the language (okay,
or DMD in particular) were altered to the point that one or more of
those *couldn't* be done.


your points are skewed IMO.


- support projects that need no external meta data


this is only practical for small projects and that works the same way
in both languages.



As I said, I think this is false.


- produce monolithic OS native binary executables


that is unrelated to our topic. Yes .Net uses byte-code and not native
executables. I never said I want this aspect to be brought to D.



Mostly I'm interested in the monolithic bit (no DLL hell!) but I was 
just pulling out my laundry list.



- work with the only language aware tool being the compiler


again, only practical for small-mid projects in both languages.


ditto the point on 1


just to 

Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread dsimcha
== Quote from Andrei Alexandrescu (seewebsiteforem...@erdani.org)'s
 Before I got into D, I was working on Enki. Enki was my own programming
 language and of course made D look like a piece of crap. In Enki, you
 had only very few primitives related to macro expansion, and you could
 construct all language elements - if, while, for, structures, classes,
 exceptions, you name it, from those primitive elements.
 There were two elements that convinced me to quit Enki. One was that I'd
 got word of a language called IMP72. IMP72 embedded the very same ideas
 Enki had, with two exceptions: (1) it was created in 1972, and (2)
 nobody gave a damn ever since. IMP72 (and there were others too around
 that time) started with essentially one primitive and then generated
 itself with a bootstrap routine, notion that completely wowed me and I
 erroneously thought would have the world wowed too.
 The second reason was that I've had many coffees and some beers with
 Walter and he convinced me that configurable syntax is an idea that
 people just don't like. Thinking a bit more, I realized that humans
 don't operate well with configurable syntax. To use the hackneyed
 comparison, no natural language or similar concoction has configurable
 syntax. Not even musical notation or whatnot. There's one syntax for
 every human language. I speculated that humans can learn one syntax for
 a language and then wire their brains to just pattern match semantics
 using it. Configurable syntax just messes with that approach, and
 besides makes any program hugely context-dependent and consequently any
 large program a pile of crap.
 That being said, I have no idea whether or not Nemerle will be
 successful. I just speculate it has an uphill battle to win.
 Andrei

This is pretty much a special case of a more general statement about
customizability.  Customizability is good as long as there are also sane
default/de facto standard ways of doing things and simple things are still made
simple.  Take EMacs or vi, for example.  I absolutely despise both because they
have very annoying, idiosyncratic ways of doing basic stuff like saving a file,
navigating through a file, etc.  The backspace, delete, home, etc. keys don't
always do what I've come to expect them to do out of the box.  I know all this
stuff is customizable, but the barrier to entry of learning how to configure all
this stuff is much higher than the barrier to just using a simple GUI text 
editor
like gedit or Notepad++ instead.  I don't care how powerful these programs are,
they're still not special enough that violating such basic conventions is 
acceptable.

Bringing this analogy back to language design, if you make a language very 
highly
configurable and don't provide good defaults, the barrier to entry will just be
too high.  If people have to understand a whole bunch of intricacies of the 
macro
system to do anything more complex than "Hello, world", the language will be
confined to a highly devoted niche.  On the other hand, if you do provide strong
conventions and sane defaults, people will probably avoid violating them because
doing so would make their code no longer portable from programmer to programmer,
and would probably break a whole bunch of library code, etc. that relies on the
convention.  In other words, they would feel like they were creating a whole new
language, and for all practical purposes they would be.  Thus, the purpose of 
this
customizability would be defeated.

As I've said before in various places, my favorite thing about D is that it 
takes
a level headed, pragmatic view on so many issues that other languages fight holy
wars about.  These include performance vs. simplicity, safety vs. flexibility,
etc.  This is just another one to add to the list.  Having a sane subset that's
something like a hybrid between Java and Python, but then putting D's template
system on top of it for when the sane subset just doesn't cut it is a pragmatic,
level-headed solution to the holy war between meta-languages that let (force?) 
you
to customize everything and don't have a well-defined sane, simple subset and
excessively rigid static languages that sometimes don't make complex things 
possible.

(Note:  Don't get me wrong, IMHO, parts of the template system have actually
earned a rightful place in the sane subset, just not the more advanced
metaprogramming stuff.)


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Denis Koroskin
On Thu, 21 May 2009 23:07:32 +0400, BCS a...@pathlink.com wrote:

 Reply to Yigal,

 BCS wrote:

 Reply to Yigal,

  if you compile each file separately then you parse all 4 files for
 each object file which is completely redundant and makes little
 sense since you'll need to recompile all of them anyway because of
 their dependencies.

 All of the above is (as far as D goes) an implementation detail[*].
 What I'm railing on is that in c# 1) you have no option BUT to do it
 that way and 2) the only practical way to build is from a config file

 it's as much an implementation detail in D as it is in C#. nothing
  prevents you from creating your own compiler for C# as well.


 I disagree, see below:

 [*] I am working very slowly on building a compiler and am thinking
 of building it so that along with object files, it generates public
 export (.pe) files that have a binary version of the public
 interface for the module. I'd set it up so that the compiler never
 parses more than one file per process. If you pass it more, it forks
 and when it runs into imports, it loads the .pe files after, if
 needed, forking off a process to generating it.

 sounds like an interesting idea - basically your compiler will
 generate the meta data just as an IDE does for C#.


 Maybe that's the confusion: No it won't!  
 That's not the meta data I've been talking about. The meta data that c#  
 needs that I'm referring to is the list of file that the compiler needs  
 to look at. In D this information can be derived from the text of the  
 import statements in the .d files (well it also needs the the import  
 directory list). In c# this can't be done even within a single assembly.  
 Without me explicitly telling the compiler what files to look in, it  
  can't find anything! It can't even just search the local dir for files  
 that have what it's looking for because I could have old copies of the  
 file laying around that shouldn't be used.

  I didn't say that the only tool you can use is the compiler. I'm fine
  with bud/DSSS/rebuild being used. What I don't want is a language
  that effectively _requires_ that some config file be maintained
  along with the code files. I suspect that the bulk of pure D projects
  (including large ones) /could/ have been written so that they didn't
  need a dsss.conf file and many of those that do have a dsss.conf, I'd
  almost bet can be handled without it. IIRC, all that DSSS really needs
  is what file to start with (whereas c# needs to be handed the full
  file list at some point).

 you miss a critical issue here: DSSS/rebuild/etc can mostly be used
 without a config file _because_ they embed the DMDFE which generates
 that information (dependencies) for them. There is no conceptual
 difference between that and using an IDE. you just moved some
 functionality from the IDE to the build tool.
 both need to parse the code to get the dependencies.

 Again, in c# you /can't get that information/ by parsing the code. And  
 that is my point exactly.

 I think we won't converge on this.
  I think I'm seeing a tools dependency issue that I don't like in the
  design of C# that I _know_ I'm not seeing in D. You think that D is
 already just as dependent on the tools and don't see that as an
 issue.
  One of the major attractions for me to DMD is its build model so I
 tend to be very conservative and resistant to change on this point.


 you're right that we will not converge on this. you only concentrate on
 the monolithic executable case and ignore the fact that in real life
 projects the common case is to have sub-components, be it Java jars, C#
 assemblies, C/C++ dll/so/a or D DDLs.

 Yes the common case, but that doesn't make it the right case. See below.

 in any of those cases you still need to manage the sub components and
 their dependencies.
 one of the reasons for dll hell is because c/c++ do not handle this
 properly and that's what Java and .net and DDL try to solve. the
 dependency is already there for external tools to manage this
 complexity.

 I assert that it is very rare that a program NEEDS to use a DLL/so/DDL type  
 of system. The only unavoidable reasons to use them that I see are:

 1) you are forced to use code that can't be had at compile time (rare  
 outside of plugins and they don't count because they are not your code)
 2) you have lots of code that is mostly never run and you can't load it  
 all (and that sounds like you have bigger problems)
 3) you are running into file size limits (outside of something like a  
 kernel image, this is unlikely)
 4) booting takes too long (and that says you're doing something else wrong)


5) The most common case - your program relies on some third-party middleware 
that doesn't provide any source code.

 It is my strongly held opinion that the primary argument for dlls and  
 friends, code sharing, is attempting to solve a completely intractable  
 problem. As soon as you bring in versioning, installers and  
 uninstallers, the problem becomes flat 

Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Andrei Alexandrescu

dsimcha wrote:

Bringing this analogy back to language design, if you make a language very 
highly
configurable and don't provide good defaults, the barrier to entry will just be
too high.  If people have to understand a whole bunch of intricacies of the 
macro
system to do anything more complex than "Hello, world", the language will be
confined to a highly devoted niche.  On the other hand, if you do provide strong
conventions and sane defaults, people will probably avoid violating them because
doing so would make their code no longer portable from programmer to programmer,
and would probably break a whole bunch of library code, etc. that relies on the
convention.  In other words, they would feel like they were creating a whole new
language, and for all practical purposes they would be.  Thus, the purpose of 
this
customizability would be defeated.


The symptom is visible: even with good defaults and all, such languages 
invariably come with the advice "don't use our best feature". That's 
terrible. It's bad language design to put in power that can do more 
harm than good, to the extent that you openly unrecommend usage of the 
feature. Like you have a car with a very powerful engine but not a 
clutch to match (there are many, Subaru Impreza comes to mind). Then you 
have this awesome power on paper, but invariably the mechanic tells you: 
"don't put the pedal to the metal if you want to have a car."


I therefore much prefer D's templates which I use and also recommend to 
non-advanced users, plus the occasional string mixin that is a 
seldom-used feature instrumental to only a minority of idioms.
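For readers who haven't used the feature, a D string mixin pastes a compile-time-generated string into the program as code. A rough run-time Python analogue (a hypothetical illustration, not from the original post) conveys the idea:

```python
# Analogue of D's string mixin: build source code as a string, then
# "mix it in" by compiling and executing it into a namespace.
def make_getters(field_names):
    src = "\n".join(
        f"def get_{name}(obj):\n    return obj['{name}']"
        for name in field_names
    )
    ns = {}
    exec(src, ns)  # paste the generated definitions into ns
    return {k: v for k, v in ns.items() if k.startswith("get_")}

getters = make_getters(["x", "y"])
```

The difference, of course, is that in D the generation happens entirely at compile time, so the mixed-in code carries no run-time cost; the sketch only illustrates the "code from strings" idiom.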



Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Nick Sabalausky
Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message 
news:gv3ubr$2u...@digitalmars.com...
 Yigal Chripun wrote:

 Nemerle's interesting, but it has its own issues. The largest one is that 
 it will have to beat history: languages with configurable syntax have 
 failed in droves in the 1970s.

 Before I got into D, I was working on Enki. Enki was my own programming 
 language and of course made D look like a piece of crap. In Enki, you had 
 only very few primitives related to macro expansion, and you could 
 construct all language elements - if, while, for, structures, classes, 
 exceptions, you name it, from those primitive elements.

 There were two elements that convinced me to quit Enki. One was that I'd 
 got word of a language called IMP72. IMP72 embedded the very same ideas 
 Enki had, with two exceptions: (1) it was created in 1972, and (2) nobody 
 gave a damn ever since. IMP72 (and there were others too around that time) 
 started with essentially one primitive and then generated itself with a 
 bootstrap routine, notion that completely wowed me and I erroneously 
 thought would have the world wowed too.


There are many possible reasons for a language's failure. One of the 
biggest is lack of visibility. Who has ever heard of IMP72? Sure, that lack 
of visibility could have been because people hated that particular aspect of 
the language, but it could also have been from any one of a number of other 
reasons.

 The second reason was that I've had many coffees and some beers with 
 Walter and he convinced me that configurable syntax is an idea that people 
 just don't like. Thinking a bit more, I realized that humans don't operate 
 well with configurable syntax. To use the hackneyed comparison, no natural 
 language or similar concoction has configurable syntax. Not even musical 
 notation or whatnot. There's one syntax for every human language. I 
 speculated that humans can learn one syntax for a language and then wire 
 their brains to just pattern match semantics using it. Configurable syntax 
 just messes with that approach, and besides makes any program hugely 
 context-dependent and consequently any large program a pile of crap.


So I take it AST Macros are no longer on the table for D3?




Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Andrei Alexandrescu

Nick Sabalausky wrote:
There are many possible reasons for a language's failure. One of the 
biggest is lack of visibility. Who has ever heard of IMP72? Sure, that lack 
of visibility could have been because people hated that particular aspect of 
the language, but it could also have been from any one of a number of other 
reasons.


As I said, there were many languages with configurable syntax created 
during that period. None was even remembered. But then, correlation is 
not causation :o).


The second reason was that I've had many coffees and some beers with 
Walter and he convinced me that configurable syntax is an idea that people 
just don't like. Thinking a bit more, I realized that humans don't operate 
well with configurable syntax. To use the hackneyed comparison, no natural 
language or similar concoction has configurable syntax. Not even musical 
notation or whatnot. There's one syntax for every human language. I 
speculated that humans can learn one syntax for a language and then wire 
their brains to just pattern match semantics using it. Configurable syntax 
just messes with that approach, and besides makes any program hugely 
context-dependent and consequently any large program a pile of crap.




So I take it AST Macros are no longer on the table for D3?


AST macros can be implemented to not allow configurable syntax.


Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread BCS

Reply to Denis,


I assert that it is very rare that a program NEEDS to use a DLL/so/DDL type
of system. The only unavoidable reasons to use them that I see are:

1) you are forced to use code that can't be had at compile time (rare
outside of plugins and they don't count because they are not your code)
2) you have lots of code that is mostly never run and you can't load it
all (and that sounds like you have bigger problems)
3) you are running into file size limits (outside of something like a
kernel image, this is unlikely)
4) booting takes too long (and that says you're doing something else
wrong)


5) The most common case - your program relies on some third-party
middleware that doesn't provide any source code.



They /should/ ship static libs as well, IMNSHO. Also the same aside as for 
#1 fits here.





Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Georg Wrede

Nick Sabalausky wrote:
Suppose (purely hypothetically) that the .NET assembly system 
were changed to allow the source for a D/C++ style of source-level template 
to be embedded into the assembly. Then they'd be able to do D/C++ style 
source-level template/code-generation. Right?


I assume, actually presume, that would take the better part of a decade.


Now obviously the big problem with that is it would only be usable in
the same language it was originally written in.


That actually depends. Done the M$ way, it would. Done properly, it 
would work in any language. But then, that would mean a rewrite of the 
entire CLR, wouldn't it?



I suppose that might make reverse-engineering easier  [...]


I don't think that's got anything to do with it. Not at least, if they'd 
really do it below the language/CLR boundary.


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Yigal Chripun

BCS wrote:

I assert that it is very rare that a program NEEDS to use a DLL/so/DDL type 
of system. The only unavoidable reasons to use them that I see are:


1) you are forced to use code that can't be had at compile time (rare 
outside of plugins and they don't count because they are not your code)
2) you have lots of code that is mostly never run and you can't load it 
all (and that sounds like you have bigger problems)
3) you are running into file size limits (outside of something like a 
kernel image, this is unlikely)

4) booting takes too long (and that says you're doing something else wrong)

It is my strongly held opinion that the primary argument for dlls and 
friends, code sharing, is attempting to solve a completely intractable 
problem. As soon as you bring in versioning, installers and 
uninstallers, the problem becomes flat out impossible to solve. (the one 
exception is for low level system things like KERNEL32.DLL and stdc*.so)


In this day and age where HDD's are ready to be measured in TB and 
people ask how many Gigs of RAM you have, *who cares* about code sharing?





so, in your opinion, Office, Photoshop, Adobe Acrobat can all be 
provided as monolithic executables? that's just ridiculous.


My work uses this monolithic model approach for some programs and this 
brings so much pain that you wouldn't believe. Now we're trying to 
slowly move away from this retarded model. I'm talking from experience 
here - the monolithic approach does NOT work.
just so you'd understand the scale I'm talking about - our largest 
executable is 1.5 Gigs in size.


you're wrong on both accounts, DLL type systems are not only the common 
case, they are the correct solution.
the DLL HELL you're so afraid of is mostly solved by using 
jars/assemblies (smart dlls) that contain meta-data such as versions.
this problem is also solved on Linux systems that use package managers, 
like Debian's APT.


monolithic design like you suggest is in fact bad design that leads to 
things like - Windows Vista running slower on my 8-core machine than 
Window XP on my extremely weak laptop.


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Georg Wrede

Yigal Chripun wrote:
What I was saying was not specific for DWT but rather that _any_ 
reasonably big project will use such a system and it's simply not 
practical to do otherwise. how would you handle a project with a hundred 
 files that takes 30 min. to compile without any tool whatsoever except 
the compiler itself?


Make?

And if you're smart, a version control system. (Whether you use an IDE or 
not.)


Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Yigal Chripun

Andrei Alexandrescu wrote:

Yigal Chripun wrote:

Nick Sabalausky wrote:
I suppose that might make reverse-engineering easier which MS might 
not like, but I'm not suggesting this as something that MS should 
like or should even do, but rather suggesting it as (business issues 
completely aside) something that would possibly gain the benefits of 
both styles.




that's the exact opposite of a good solution.
I already mentioned several times before the language Nemerle which 
provide the correct solution.
important fact - Nemerle is a .NET language and it does _NOT_ need to 
modify the underlining system.


The way it works in Nemerle is pretty simple:
the language has a syntax to compose/decompose AST.
a Macro in nemerle is just a plain old function that uses the same 
syntax you'd use at run-time and this function can use APIs to 
access the compiler's internal data structures (the AST) and 
manipulate it.
you connect it to your regular code by either just calling it like a 
regular function or by using attributes.


let's compare to see the benefits:
in D:
tango.io.Stdout("Hello World").newline; // prints at run-time
pragma(msg, "Hello World"); // prints at compile-time

in Nemerle:
macro m () {
  Nemerle.IO.printf ("compile-time\n");
  <[ Nemerle.IO.printf ("run-time\n") ]>;
}
// and you call it like this:
m();
Nemerle.IO.printf ("run-time\n");

notice how both use the same code, the same printf function?
the only change is that the second line inside the macro is enclosed 
inside <[ ]>, which means: output (return) the AST for this code instead 
of actually running the code and returning the result of the call.


Macros in Nemerle need to be compiled since they are regular Nemerle 
code and they need to be loaded by the compiler (added to the command 
line) in order to compile the code that calls the macros.


essentially these are just plugins for the compiler.

compared to the elegance of this solution, templates are just a crude 
copy-paste mechanism implemented inside the compiler.


Nemerle's interesting, but it has its own issues. The largest one is 
that it will have to beat history: languages with configurable syntax 
have failed in droves in the 1970s.


Before I got into D, I was working on Enki. Enki was my own programming 
language and of course made D look like a piece of crap. In Enki, you 
had only very few primitives related to macro expansion, and you could 
construct all language elements - if, while, for, structures, classes, 
exceptions, you name it, from those primitive elements.


There were two elements that convinced me to quit Enki. One was that I'd 
got word of a language called IMP72. IMP72 embedded the very same ideas 
Enki had, with two exceptions: (1) it was created in 1972, and (2) 
nobody gave a damn ever since. IMP72 (and there were others too around 
that time) started with essentially one primitive and then generated 
itself with a bootstrap routine, a notion that completely wowed me and 
that I erroneously thought would wow the world too.


The second reason was that I've had many coffees and some beers with 
Walter and he convinced me that configurable syntax is an idea that 
people just don't like. Thinking a bit more, I realized that humans 
don't operate well with configurable syntax. To use the hackneyed 
comparison, no natural language or similar concoction has configurable 
syntax. Not even musical notation or whatnot. There's one syntax for 
every human language. I speculated that humans can learn one syntax for 
a language and then wire their brains to just pattern match semantics 
using it. Configurable syntax just messes with that approach, and 
besides makes any program hugely context-dependent and consequently any 
large program a pile of crap.


That being said, I have no idea whether or not Nemerle will be 
successful. I just speculate it has an uphill battle to win.



Andrei


I didn't talk about configurable syntax at all above.
Yes, Nemerle has this feature as part of their Macro system but that's 
just one rather small aspect about the system that can be removed. I 
also don't see the bigger problem with it as you describe since this is 
limited in Nemerle to specific things.


I'm also unsure as to what you define as syntax and what you define as 
semantics. for example, Smalltalk has only 5 keywords and it's 
implemented entirely as a collection of libraries. "if", "else", 
"switch", "while" are all implemented as methods of objects and are 
therefore configurable. Is that also wrong in your opinion?


I agree with dsimcha -
the language needs to provide simple intuitive defaults, that's why I 
think LISP didn't succeed. it is very powerful but the need to write 
your own macro just so you can say 4 + 5 instead of (+ 4 5) shows this 
point.


I think Nemerle provies this - the constructs in Nemerle for the Macro 
system are very simple and intuitive. you only have one extra syntax 
feature, the <[ ]>. think of D's CTFE, only much more extended in scope - 
you write a CTFE 

Re: OT: on IDEs and code writing on steroids

2009-05-21 Thread Rainer Deyke
Yigal Chripun wrote:
 just so you'd understand the scale I'm talking about - our largest
 executable is 1.5 Gigs in size.

How is 1.5 GB of dlls better than a 1.5 GB executable?  (And don't
forget, removing dead code across dll boundaries is a lot more difficult
than removing it within a single executable, so you're more likely to
have 3 GB of dlls.)

 you're wrong on both accounts, DLL type systems are not only the common
 case, they are the correct solution.
 the DLL HELL you're so afraid of is mostly solved by using
 jars/assemblies (smart dlls) that contain meta-data such as versions.
 this problem is also solved on Linux systems that use package managers,
 like Debian's APT.

You have a curious definition of "solved".  Package managers work
(sometimes, sort of) so long as you get all of your software from a
single source and you never need a newer version of your software that
is not yet available in package form.  I've got programs that I've
almost given up on deploying at all because of assembly hell.  Plain old
DLLs weren't anywhere near as bad as that.

My favorite deployment system is the application bundle under OS X.
It's a directory that looks like a file.  Beneath the covers it has
frameworks and configuration files and multiple executables and all that
crap, but to the user, it looks like a single file.  You can copy it,
rename it, move it (on a single computer or between computers), even
delete it, and it just works.  Too bad the system doesn't work under any
other OS.


-- 
Rainer Deyke - rain...@eldwood.com


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Yigal Chripun

Lutger wrote:

Yigal Chripun wrote:

...
IMO, designing the language to support this better work-flow is a good 
decision made by MS, and D should follow it instead of trying to get 
away without an IDE.


I'm not sure about this. D is designed to be easier to parse than C++ 
(but that's saying nothing) to allow better tools made for it. I think this 
should be enough. 

C# & friends not only better support working inside an IDE, but make it a pain to 
do without. Autocomplete dictates that related functions should be named with
the exact same prefix - even when this isn't logical. It also encourages names to be 
as descriptive as possible, in practice leading to a part of the api docs encoded in 
the function name. Extremely bloated names are the consequence of this. It doesn't 
always make code more readable imho. 

this I completely disagree with. those are the same faulty reasons I 
already answered.
an IDE does _not_ create bad programmers, and does _not_ encourage bad 
code. it does encourage descriptive names which is a _good_ thing.


writing "strcpy" a la C style is cryptic and *wrong*. code is read a 
hundred times more than it's written, and a better name would be, for 
instance, "stringCopy".
it's common nowadays to have tera-byte sized HDD so why people try to 
save a few bytes from their source while sacrificing readability?


the only issue I have with too long names is when dealing with C/C++ 
code that prefixes all symbols with their file-names/namespaces. At 
least in C++ this is solved by using namespaces. but this is a problem 
with the languages themselves and has nothing to do with the IDE.


The documentation comments are in XML: pure insanity. I tried to generate 
documentation for my stuff at work once, expecting to be done in max 5 min. 
like ddoc. Turns out nobody at work uses documentation generation for a 
reason: it isn't really fleshed out and one-click from the IDE; in fact it 
is a pain in the arse compared to using ddoc.

I should stop now before this turns into a rant.



I agree fully with this. XML documents are a mistake made by MS. javadoc 
is a much better format and even that can be improved.


This however has nothing to do with the IDE. the important part is that 
the IDE parses whatever format is used and can show you the 
documentation via simple means. no need for you to spend time to find 
the documentation yourself.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Lutger
Andrei Alexandrescu wrote:

...
 What the heck do you need generics for when you have real templates?  To me,
 generics seem like just a lame excuse for templates.
 
 I agree. Then, templates aren't easy to implement and they were 
 understandably already busy implementing the using statement.
 
 Andrei

While I don't fully understand how generics work under the hood in .NET, there 
are some benefits to how it is done. For example, you can use runtime 
reflection on generic types. And the jit compiler 
instantiates them at runtime. They may serve a different purpose than templates:

Anders Hejlsberg: "To me the best way to understand the distinction between C# 
generics and C++ templates is this: C# generics are really just like classes, 
except they have a type parameter. C++ templates 
are really just like macros, except they look like classes."

It seems that lack of structural typing is seen as a feature:

"When you think about it, constraints are a pattern matching mechanism. You 
want to be able to say, 'This type parameter must have a constructor that takes 
two arguments, implement operator+, have this 
static method, have these two instance methods, etc.' The question is, how 
complicated do you want this pattern matching mechanism to be?
There's a whole continuum from nothing to grand pattern matching. We think it's 
too little to say nothing, and the grand pattern matching becomes very 
complicated, so we're in-between."

From: http://www.artima.com/intv/genericsP.html


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Lutger
Andrei Alexandrescu wrote:

...
 I've repeatedly failed to figure out the coolness of C#, and would 
 appreciate a few pointers. Or references. Or delegates :o).


It's not in the language. 

C# only has to do 'better' than C++ and Java to be cool and in that it 
succeeds: Besides many smaller improvements, it provides delegates / events, 
lambda's and true generics compared to java. Compared to 
C++ it provides, well, lots of the same stuff that make people prefer Java over 
C++. 
It does this while still being familiar. Take F# for example, probably a much 
cooler language: I suspect this is too alien for most people.

The real attraction of C# is the framework and the IDE. A lot of programmers 
don't program in a language, but in an ecosystem where the language is just a 
part of it alongside the framework, toolchain and 
documentation.







Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Tim Matthews
On Wed, 20 May 2009 17:31:14 +1200, Jarrett Billingsley  
jarrett.billings...@gmail.com wrote:




Just, uh, wow.  Please dude, read up on this stuff first.



This thread turned into a Java vs .NET argument. I'm sorry but I don't  
know the details of the JVM's just-in-time compiler. The virtual machine  
in the name plus the design goals ("It should be interpreted, threaded, 
and dynamic") led me to this misunderstanding.


http://en.wikipedia.org/wiki/Java_(programming_language)#Primary_goals

http://en.wikipedia.org/wiki/Comparison_of_the_Java_and_.NET_platforms


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Kagamin
BCS Wrote:

 smaller object code? OTOH a good implementation will notice when it can fold 
 together several template expansions

That's the difference. You can't fold templates because they're binary 
incompatible, as opposed to generics.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Frits van Bommel

Kagamin wrote:

BCS Wrote:

smaller object code? OTOH a good implementation will notice when it can fold 
together several template expansions


That's the difference. You can't fold templates because they're binary 
incompatible, as opposed to generics.


They're not always binary-incompatible. For instance, if a template only works 
with pointers or references (this includes object references) to parameter types 
it might well contain the exact same machine code for some of the instantiations.
A compiler backend or linker could recognize these cases and use a single 
instantiation's machine code for them.
(Essentially, these are pretty much the same cases where generics would have 
been sufficient)


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Denis Koroskin
On Wed, 20 May 2009 13:09:37 +0400, Kagamin s...@here.lot wrote:

 BCS Wrote:

 smaller object code? OTOH a good implementation will noice when I can  
 fold
 together several template expansions

 That's the difference. You can't fold templates because they're binary  
 incompatible as opposite to generics.

You can fold /some/ templates. I believe LLVM already does merging of identical 
functions (including templates, virtual functions etc) as a part of 
optimization process. Not sure about LDC, though.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Nick Sabalausky
Lutger lutger.blijdest...@gmail.com wrote in message 
news:gv090o$22...@digitalmars.com...
 Andrei Alexandrescu wrote:

 ...
 What the heck do you need generics for when you have real templates?  To 
 me,
 generics seem like just a lame excuse for templates.

 I agree. Then, templates aren't easy to implement and they were
 understandably already busy implementing the using statement.

 Andrei

 While I don't fully understand how generics work under the hood in .NET, 
 there are some benefits to how it is done. For example, you can use 
 runtime reflection on generic types. And the jit compiler
 instantiates them at runtime. They may serve a different purpose than 
 templates:

 Anders Hejlsberg: To me the best way to understand the distinction 
 between C# generics and C++ templates is this: C# generics are really just 
 like classes, except they have a type parameter. C++ templates
 are really just like macros, except they look like classes.

 It seems that lack of structural typing is seen as a feature:

 When you think about it, constraints are a pattern matching mechanism. 
 You want to be able to say, This type parameter must have a constructor 
 that takes two arguments, implement operator+, have this
 static method, has these two instance methods, etc. The question is, how 
 complicated do you want this pattern matching mechanism to be?
 There's a whole continuum from nothing to grand pattern matching. We think 
 it's too little to say nothing, and the grand pattern matching becomes 
 very complicated, so we're in- between.

 From: http://www.artima.com/intv/genericsP.html


I can see certain potential benefits to the general way C# does generics, 
but until the old (and I do mean old) issue of "There's an IComparable, so 
why the hell won't MS give us an IArithmetic so we can actually use 
arithmetic operators on generic code?" gets fixed (and at this point I'm 
convinced they've never had any intent of ever fixing that), I don't care 
how valid the reasoning behind C#'s general approach to generics is, the 
actual state of C#'s generics still falls squarely into the categories of 
crap and almost useless. 




Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Frits van Bommel

Denis Koroskin wrote:

On Wed, 20 May 2009 13:09:37 +0400, Kagamin s...@here.lot wrote:


BCS Wrote:

smaller object code? OTOH a good implementation will notice when it can fold
together several template expansions
That's the difference. You can't fold templates because they're binary  
incompatible, as opposed to generics.


You can fold /some/ templates. I believe LLVM already does merging of identical 
functions (including templates, virtual functions etc) as a part of 
optimization process. Not sure about LDC, though.


LLVM has a function merging pass, but LDC doesn't run it by default at any 
optimization level. (You can pass -mergefunc to run it explicitly, as with any 
LLVM pass)
It has some limitations though. Since it runs on IR, it matters what LLVM type 
values have. That means it might merge Templ!(int) and Templ!(uint) since int 
and uint are both an i32 to LLVM, but it normally wouldn't merge Templ!(int*) 
and Templ(short*) even if the template compiles down to return cast(T) 
somefunc(cast(void*) arg); because the types are still different (i32* vs i16*).
To do the latter transformation, the pass would need to be reimplemented to run 
when the code is closer to machine code.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Ary Borenszweig

dsimcha escribió:

== Quote from Christopher Wright (dhase...@gmail.com)'s article

Nick Sabalausky wrote:

Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message
news:gus0lu$1sm...@digitalmars.com...


I've repeatedly failed to figure out the coolness of C#, and would
appreciate a few pointers. Or references. Or delegates :o).

Outside of this group, I think most of the people considering C# really cool
are people who are unaware of D and are coming to C# from Java. What's
cool about C# is that it's like a less-shitty version of Java (and *had*
good tools, although the newer versions of VS are almost as much of a
bloated unresponsive mess as Eclipse - Which come to think of it, makes me
wonder - If Java has gotten so fast as many people claim, why is Eclipse
still such a sluggish POS?).

Compare C# to D though and most of the coolness fades, even though there are
still a handful of things I think D could still learn from C# (but there's
probably more than a handful that C# could learn from D).

Generics and reflection. Generics just hide a lot of casts, usually, but
that's still quite useful. And autoboxing is convenient, though not
appropriate for D.


What the heck do you need generics for when you have real templates?  To me,
generics seem like just a lame excuse for templates.


Yesterday doob reported a bug in Descent saying when you compile your 
project and it references a user library that has errors, when you click 
on the console to jump to the error, it doesn't work. I said to him: "I 
never thought a user library could have errors! How did this happen to 
you?" He replied: "I found a bug in a template in Tango."


That's why generics don't suck: if there's something wrong in them, 
the compiler tells you in compile-time. In D, you get the errors only 
when instantiating that template.


Generics might not be as powerful as templates, but once you write one 
that compiles, you know you will always be able to instantiate it.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread bearophile
Ary Borenszweig:
 That's why generics doesn't suck: if there's something wrong in them, 
 the compiler tells you in compile-time. In D, you get the errors only 
 when instantiating that template.

It's just like in dynamic languages, you need to unittest them a lot :-)
So having a static throws() to assert that a template isn't working is very 
useful.

Bye,
bearophile


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Christopher Wright

Nick Sabalausky wrote:
I can see certain potential benefits to the general way C# does generics, 
but until the old (and I do mean old) issue of "There's an IComparable, so 
why the hell won't MS give us an IArithmetic so we can actually use 
arithmetic operators on generic code?" gets fixed (and at this point I'm 
convinced they've never had any intent of ever fixing that), I don't care 
how valid the reasoning behind C#'s general approach to generics is, the 
actual state of C#'s generics still falls squarely into the categories of 
crap and almost useless. 


IArithmetic is impossible in C# because operator overloads are static 
methods, and interfaces cannot specify static methods.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread bearophile
Frits van Bommel:
 To do the latter transformation, the pass would need to be reimplemented to 
 run 
 when the code is closer to machine code.

Can't this feature be asked to the LLVM developers?

Bye,
bearophile


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Christopher Wright

dsimcha wrote:

== Quote from Christopher Wright (dhase...@gmail.com)'s article

Nick Sabalausky wrote:

Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message
news:gus0lu$1sm...@digitalmars.com...


I've repeatedly failed to figure out the coolness of C#, and would
appreciate a few pointers. Or references. Or delegates :o).

Outside of this group, I think most of the people considering C# really cool
are people who are unaware of D and are coming to C# from Java. What's
cool about C# is that it's like a less-shitty version of Java (and *had*
good tools, although the newer versions of VS are almost as much of a
bloated unresponsive mess as Eclipse - Which come to think of it, makes me
wonder - If Java has gotten so fast as many people claim, why is Eclipse
still such a sluggish POS?).

Compare C# to D though and most of the coolness fades, even though there are
still a handful of things I think D could still learn from C# (but there's
probably more than a handful that C# could learn from D).

Generics and reflection. Generics just hide a lot of casts, usually, but
that's still quite useful. And autoboxing is convenient, though not
appropriate for D.


What the heck do you need generics for when you have real templates?  To me,
generics seem like just a lame excuse for templates.


Put a template in an interface.

Use reflection to instantiate a template.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Frits van Bommel

bearophile wrote:

Frits van Bommel:
To do the latter transformation, the pass would need to be reimplemented to run 
when the code is closer to machine code.


Can't this feature be asked to the LLVM developers?


Sure, feel free to file a feature request: 
http://llvm.org/bugs/enter_bug.cgi?product=new-bugs


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread dsimcha
== Quote from Ary Borenszweig (a...@esperanto.org.ar)'s article
 dsimcha escribió:
  == Quote from Christopher Wright (dhase...@gmail.com)'s article
  Nick Sabalausky wrote:
  Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message
  news:gus0lu$1sm...@digitalmars.com...
 
  I've repeatedly failed to figure out the coolness of C#, and would
  appreciate a few pointers. Or references. Or delegates :o).
  Outside of this group, I think most of the people considering C# really 
  cool
  are people who are unaware of D and are coming to C# from Java. What's
  cool about C# is that it's like a less-shitty version of Java (and *had*
  good tools, although the newer versions of VS are almost as much of a
  bloated unresponsive mess as Eclipse - Which come to think of it, makes me
  wonder - If Java has gotten so fast as many people claim, why is Eclipse
  still such a sluggish POS?).
 
  Compare C# to D though and most of the coolness fades, even though there 
  are
  still a handful of things I think D could still learn from C# (but there's
  probably more than a handful that C# could learn from D).
  Generics and reflection. Generics just hide a lot of casts, usually, but
  that's still quite useful. And autoboxing is convenient, though not
  appropriate for D.
 
  What the heck do you need generics for when you have real templates?  To me,
  generics seem like just a lame excuse for templates.
 Yesterday doob reported a bug in Descent saying when you compile your
 project and it references a user library that has errors, when you click
 on the console to jump to the error, it doesn't work. I said to him: I
 never thought a user library could have errors! How did this happen to
 you? He replied: I found a bug in a template in Tango.
 That's why generics doesn't suck: if there's something wrong in them,
 the compiler tells you in compile-time. In D, you get the errors only
 when instantiating that template.
 Generics might not be as powerful as templates, but once you write one
 that compiles, you know you will always be able to instantiate it.

Yes, but there are a few flaws in this argument:

1.  If you are only using templates like generics, you simply use a unit test to
see if it compiles.  If you're not doing anything fancy and it compiles for one 
or
two types, it will probably compile for everything that you would reasonably
expect it to.

2.  If you're doing something fancier, like metaprogramming, you have to just 
face
the fact that this is non-trivial, and couldn't be done with generics anyhow.

3.  As Bearophile alluded to, templates are really a clever hack to give you the
flexibility of a dynamic language with the performance and compile time checking
of a static language.  This is done by moving the dynamism to instantiation 
time.
 Therefore, whereas in a dynamic language you pay at runtime in terms of the 
"here be monsters: this code may not be being used as the author intended and tested 
it" risk, with templates you pay at instantiation time.  However, IMHO this is orders
of magnitude better than not having that flexibility at all.  I personally can't
figure out how people accomplish anything in static languages w/o templates.  
It's
just too inflexible.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Andrei Alexandrescu

Lutger wrote:

Andrei Alexandrescu wrote:

...

What the heck do you need generics for when you have real templates?  To me,
generics seem like just a lame excuse for templates.
I agree. Then, templates aren't easy to implement and they were 
understandably already busy implementing the using statement.


Andrei


While I don't fully understand how generics work under the hood in .NET, there are some benefits to how it is done. For example, you can use runtime reflection on generic types. And the jit compiler 
instantiates them at runtime. They may serve a different purpose than templates:


Anders Hejlsberg: To me the best way to understand the distinction between C# generics and C++ templates is this: C# generics are really just like classes, except they have a type parameter. C++ templates 
are really just like macros, except they look like classes. 


It seems that lack of structural typing is seen as a feature:

When you think about it, constraints are a pattern matching mechanism. You want to be able to say, This type parameter must have a constructor that takes two arguments, implement operator+, have this 
static method, has these two instance methods, etc. The question is, how complicated do you want this pattern matching mechanism to be?
There's a whole continuum from nothing to grand pattern matching. We think it's too little to say nothing, and the grand pattern matching becomes very complicated, so we're in- between.  


From: http://www.artima.com/intv/genericsP.html


Oh, so Wal^H^H^Ha friend of mine I was talking to was right: there's 
some missing-of-the-point going on. The code generation aspect of 
templates is a blind spot the size of Canada.


Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Frits van Bommel

Kagamin wrote:

Frits van Bommel Wrote:


That's the difference. You can't fold templates because they're binary 
incompatible as opposite to generics.
They're not always binary-incompatible. For instance, if a template only works 
with pointers or references (this includes object references) to parameter types 
it might well contain the exact same machine code for some of the instantiations.


If you require that the class inherits some interface and call that interface's 
methods, they'll be incompatible. I'll dare to say this is the most useful 
variant of generic code.


Okay, so it doesn't (usually) work for interfaces, but it'll work if the 
requirement is for a common base class. Or perhaps even if they happen to have a 
common base class that implements the interface in question.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Kagamin
Frits van Bommel Wrote:

  That's the difference. You can't fold templates because they're binary 
  incompatible as opposite to generics.
 
 They're not always binary-incompatible. For instance, if a template only 
 works 
 with pointers or references (this includes object references) to parameter 
 types 
 it might well contain the exact same machine code for some of the 
 instantiations.

If you require that the class inherits some interface and call that interface's 
methods, they'll be incompatible. I'll dare to say this is the most useful 
variant of generic code.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Nick Sabalausky
Christopher Wright dhase...@gmail.com wrote in message 
news:gv0p4e$uv...@digitalmars.com...
 Nick Sabalausky wrote:
 I can see certain potential benefits to the general way C# does generics, 
 but until the old (and I do mean old) issue of "There's an IComparable, 
 so why the hell won't MS give us an IArithmetic so we can actually use 
 arithmetic operators on generic code?" gets fixed (and at this point I'm 
 convinced they've never had any intent of ever fixing that), I don't care 
 how valid the reasoning behind C#'s general approach to generics is, the 
 actual state of C#'s generics still falls squarely into the categories of 
 crap and almost useless.

 IArithmetic is impossible in C# because operator overloads are static 
 methods, and interfaces cannot specify static methods.

Then how does IComparable work? 




Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Yigal Chripun

BCS wrote:

  minor point; I said you have to give the compiler all the source files.
You might not actually need to compile them all, but without some 
external meta data, it still needs to be handed the full set because it 
can't find them on its own. And at that point you might as well compile 
them anyway.


you are only considering small hobby projects. that's not true for big 
projects where you do not want to build all at once. Think of DWT for 
instance.
besides, you do NOT need to provide all sources, not even just for 
partially processing them to find the symbols.

there is no difference between C#'s /r someAssembly and GCC's -llib

I don't think you fully understand the C# compilation model -
in C# you almost never compile each source file separately, rather you 
compile a bunch of sources into an assembly all at once and you provide 
the list of other assemblies your code depends on. so the dependency is 
on the package level rather than on the file level. this makes so much 
more sense since each assembly is a self-contained unit of functionality.





sure, you don't get the full power of an IDE that can track all the
source files in the project for you. That just means that it's worth
the money you pay for it.

you can write makefiles or what ever (scons, rake, ant, ...) in the
same way you'd do for C and C++. In other words:
if you prefer command line tools you get the same experience and if
you do use an IDE you get a *much* better experience.
same goes for D - either write your own makefile or use rebuild which




uses the compiler front-end to parse the source files just like you
suggested above for C#.



where did I suggest that?


I replied to both you and Daniel. I think I was referring to what Daniel 
said here.





where in all of that, do you see any contradiction to what I said?
again, I said the D compilation model is ancient legacy and should be
replaced and that has nothing to do with the format you prefer for
your build scripts.



I think that you think I'm saying something other than what I'm trying 
to say. I'm struggling to make my argument clear but can't seem to put 
it in words. My thesis is that, in effect, C# is married to VS and that 
D is married only to the compiler.


I understand your thesis and disagree with it. what i'm saying is that 
not only C# is NOT married to VS but also that VS isn't even the best 
IDE for C#. VS is just a fancy text-editor with lots of bells and 
whistles. if you want a real IDE for C# you'd probably use Re-Sharper or 
a similar offering.


My argument is that a D project can be done as nothing but a collection 
of .d files with no extra project files of any kind. In c# this is 
theoretically possible, but from any practical standpoint it's not going 
to be done. There is going to be some extra files that list, in some 
form, extra information the compiler needs to resolve symbols and figure 
out where to look for stuff. In any practical environment this extra bit 
that c# more or less forces you to have (and the D doesn't) will be 
maintained by some sort of IDE.


this is wrong. you cannot have a big project based solely on .d files. 
look at DWT as an example. no matter what tool you use, let's say DSSS, 
it still has a config file of some sort which contains that additional 
meta-data. a DSSS config file might be shorter than what's required for 
a C# project file but don't forget that this comes from DSSS relying on 
rebuild which embeds the entire DMDFE.

in practice, both languages need more than just the compiler.



To put it quantitatively:

productivity on a scale of 0 to whatever
c# w/o IDE - ~1
D w/o IDE - 10
c# w/ IDE - 100+
D w/ IDE - 100+

Either C# or D will be lots more productive with an IDE but D without an 
IDE will be lots more productive than C# without an IDE. D is designed 
to be used however you want, IDE or not. C# is *designed* to be used 
from within VS. I rather suspect that the usability of C# without VS is 
very low on MS's list of things they care about.







Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Yigal Chripun

Andrei Alexandrescu wrote:

Lutger wrote:

Andrei Alexandrescu wrote:

...
What the heck do you need generics for when you have real 
templates?  To me,

generics seem like just a lame excuse for templates.
I agree. Then, templates aren't easy to implement and they were 
understandably already busy implementing the using statement.


Andrei


While I don't fully understand how generics work under the hood in 
.NET, there are some benefits to how it is done. For example, you can 
use runtime reflection on generic types. And the jit compiler 
instantiates them at runtime. They may serve a different purpose than 
templates:


Anders Hejlsberg: To me the best way to understand the distinction 
between C# generics and C++ templates is this: C# generics are really 
just like classes, except they have a type parameter. C++ templates 
are really just like macros, except they look like classes.

It seems that lack of structural typing is seen as a feature:

When you think about it, constraints are a pattern matching 
mechanism. You want to be able to say, This type parameter must have 
a constructor that takes two arguments, implement operator+, have this 
static method, has these two instance methods, etc. The question is, 
how complicated do you want this pattern matching mechanism to be?
There's a whole continuum from nothing to grand pattern matching. We 
think it's too little to say nothing, and the grand pattern matching 
becomes very complicated, so we're in- between. 
From: http://www.artima.com/intv/genericsP.html


Oh, so Wal^H^H^Ha friend of mine I was talking to was right: there's 
some missing of the point point going on. The code generation aspect of 
templates is a blind spot of the size of Canada.


Andrei


I think you miss the point here.
Generics and code generation are two separate and orthogonal features 
that were conflated together by C++.


while you can do powerful stuff with templates it smells of trying to 
write Haskell code with the C pre-processor.

if you want to see a clean solution to this issue look at Nemerle.
essentially, their AST Macro system provides multi-level compilation.

c++ templates are a horrible hack designed to wean C programmers off 
the pre-processor, and the D templates provide mostly cosmetic changes 
to this. they do not solve the bigger design issue.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Andrei Alexandrescu

Yigal Chripun wrote:

I think you miss the point here.
Generics and code generation are two separate and orthogonal features 
that were conflated together by C++.


It's kind of odd, then, that for example the Generative Programming book 
(http://www.generative-programming.org) chose to treat the two notions 
in conjunction.


while you can do powerful stuff with templates it smells of trying to 
write Haskell code with the C pre-processor.

if you want to see a clean solution to this issue look at Nemerle.
essentially, their AST Macro system provides multi-level compilation.

c++ templates are a horrible hack designed to wean C programmers off 
the pre-processor, and the D templates provide mostly cosmetic changes 
to this. they do not solve the bigger design issue.


What is the bigger design issue?


Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread dsimcha
== Quote from Yigal Chripun (yigal...@gmail.com)'s article
 Andrei Alexandrescu wrote:
  Lutger wrote:
  Andrei Alexandrescu wrote:
 
  ...
  What the heck do you need generics for when you have real
  templates?  To me,
  generics seem like just a lame excuse for templates.
  I agree. Then, templates aren't easy to implement and they were
  understandably already busy implementing the using statement.
 
  Andrei
 
  While I don't fully understand how generics work under the hood in
  .NET, there are some benefits to how it is done. For example, you can
  use runtime reflection on generic types. And the jit compiler
  instantiates them at runtime. They may serve a different purpose than
  templates:
 
  Anders Hejlsberg: To me the best way to understand the distinction
  between C# generics and C++ templates is this: C# generics are really
  just like classes, except they have a type parameter. C++ templates
  are really just like macros, except they look like classes.
  It seems that lack of structural typing is seen as a feature:
 
  When you think about it, constraints are a pattern matching
  mechanism. You want to be able to say, This type parameter must have
  a constructor that takes two arguments, implement operator+, have this
  static method, has these two instance methods, etc. The question is,
  how complicated do you want this pattern matching mechanism to be?
  There's a whole continuum from nothing to grand pattern matching. We
  think it's too little to say nothing, and the grand pattern matching
  becomes very complicated, so we're in- between.
  From: http://www.artima.com/intv/genericsP.html
 
  Oh, so Wal^H^H^Ha friend of mine I was talking to was right: there's
  some missing of the point point going on. The code generation aspect of
  templates is a blind spot of the size of Canada.
 
  Andrei
 I think you miss the point here.
 Generics and code generation are two separate and orthogonal features
 that were conflated together by C++.
 while you can do powerful stuff with templates it smells of trying to
 write Haskell code with the C pre-processor.
 if you want to see a clean solution to this issue look at Nemerle.
 essentially, their AST Macro system provides multi-level compilation.
 c++ templates are a horrible hack designed to wean C programmers off
 the pre-processor and the D templates provide mostly cosmetic
 changes to this. they do not solve the bigger design issue.

Not sure I agree.  C++ templates were probably intended to be something like
generics initially and became Turing-complete almost by accident.  To get Turing
completeness in C++ templates requires severe abuse of features and spaghetti 
code
writing.  D extends templates so that they're actually *designed* for
metaprogramming, not just an implementation of generics, thus solving C++'s 
design
problem.  Mixins (to really allow code generation), CTFE (to make it easier to
generate code), static if (to avoid kludges like using specialization just to 
get
branching) and tuples (to handle variadics) make D templates useful for
metaprogramming without massive kludges.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Andrei Alexandrescu

dsimcha wrote:

Not sure I agree.  C++ templates were probably intended to be something like
generics initially and became Turing-complete almost by accident.


That is factually correct. It was quite a hubbub on the C++ 
standardization committee when Erwin Unruh wrote a C++ program that 
emitted prime numbers in error messages. See http://tinyurl.com/oqk6nl


Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Jacob Carlborg

dsimcha wrote:

== Quote from Ary Borenszweig (a...@esperanto.org.ar)'s article

dsimcha escribió:

== Quote from Christopher Wright (dhase...@gmail.com)'s article

Nick Sabalausky wrote:

Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message
news:gus0lu$1sm...@digitalmars.com...


I've repeatedly failed to figure out the coolness of C#, and would
appreciate a few pointers. Or references. Or delegates :o).

Outside of this group, I think most of the people considering C# really cool
are people who are unaware of D and are coming to C# from Java. What's
cool about C# is that it's like a less-shitty version of Java (and *had*
good tools, although the newer versions of VS are almost as much of a
bloated unresponsive mess as Eclipse - Which come to think of it, makes me
wonder - If Java has gotten so fast as many people claim, why is Eclipse
still such a sluggish POS?).

Compare C# to D though and most of the coolness fades, even though there are
still a handful of things I think D could still learn from C# (but there's
probably more than a handful that C# could learn from D).

Generics and reflection. Generics just hide a lot of casts, usually, but
that's still quite useful. And autoboxing is convenient, though not
appropriate for D.

What the heck do you need generics for when you have real templates?  To me,
generics seem like just a lame excuse for templates.

Yesterday doob reported a bug in Descent saying when you compile your
project and it references a user library that has errors, when you click
on the console to jump to the error, it doesn't work. I said to him: I
never thought a user library could have errors! How did this happen to
you? He replied: I found a bug in a template in Tango.
That's why generics don't suck: if there's something wrong in them,
the compiler tells you in compile-time. In D, you get the errors only
when instantiating that template.
Generics might not be as powerful as templates, but once you write one
that compiles, you know you will always be able to instantiate it.


Yes, but there are two flaws in this argument:

1.  If you are only using templates like generics, you simply use a unit test to
see if it compiles.  If you're not doing anything fancy and it compiles for one 
or
two types, it will probably compile for everything that you would reasonably
expect it to.


I used tango.text.xml.Document with wchar and dchar as the template type 
and in tango.text.xml.PullParser there were some functions that took 
char[] instead of T[] as the argument. 
http://www.dsource.org/projects/tango/ticket/1663



2.  If you're doing something fancier, like metaprogramming, you have to just 
face
the fact that this is non-trivial, and couldn't be done with generics anyhow.

3.  As Bearophile alluded to, templates are really a clever hack to give you the
flexibility of a dynamic language with the performance and compile time checking
of a static language.  This is done by moving the dynamism to instantiation 
time.
 Therefore, whereas in a dynamic language you pay at runtime in terms of 
the "here be monsters: this code may not be being used as the author 
intended and tested it" risk, with templates you pay at instantiation 
time.  However, IMHO this is orders
of magnitude better than not having that flexibility at all.  I personally can't
figure out how people accomplish anything in static languages w/o templates.  
It's
just too inflexible.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread BCS

Reply to Yigal,


D templates provide mostly cosmetic changes to this.



If you think D's templates are C++'s template with a few cosmetic changes 
than you aren't paying attention.


A few cosmetic changes aren't going to allow 1.4MB of C++ header files to 
be anywhere near duplicated in 2000 LOC (Boost Spirit vs dparse)





Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread BCS

Reply to Yigal,


BCS wrote:


minor point; I said you have to give the compiler all the source
files. You might not actually need to compile them all, but without
some external meta data, it still needs to be handed the full set
because it can't find them on its own. And at that point you might
as well compile them anyway.



(BTW: that is only referring to c#)


you are only considering small hobby projects. that's not true for big
projects where you do not want to build all at once. Think of DWT for
instance. besides, you do NOT need to provide all sources, not even
just for partially processing them to find the symbols.
there is no difference between C#'s /r someAssembly and GCC's -llib
I don't think you fully understand the C# compilation model -

in C# you almost never compile each source file separately, rather you
compile a bunch of sources into an assembly all at once and you provide
the list of other assemblies your code depends on. so the dependency is
on the package level rather than on the file level. this makes so much
more sense since each assembly is a self-contained unit of
functionality.


That is more or less what I thought it was. Also, that indicates that the 
design of c# assumes a build model that I think is a bad idea; the big, 
dumb, all-or-nothing build where a sub-part of a program is either up to 
date or rebuilt by recompiling everything in it.



 where in all of that, do you see any contradiction to what I said?
again, I said the D compilation model is ancient legacy and should
be replaced and that has nothing to do with the format you prefer
for your build scripts.


I think that you think I'm saying something other than what I'm
trying to say. I'm struggling to make my argument clear but can't
seem to put it in words. My thesis is that, in effect, C# is married
to VS and that D is married only to the compiler.


I understand your thesis and disagree with it. what i'm saying is that
not only C# is NOT married to VS but also that VS isn't even the best
IDE for C#.


Maybe I should have said it's married to having *an IDE*, it's just VS by 
default and design.



VS is just a fancy text-editor with lots of bells and
whistles. if you want a real IDE for C# you'd probably use Re-Sharper
or a similar offering.


Last I heard Re-Sharper is a VS plugin, not an IDE in its own right, and 
even if that has changed, it's still an IDE. Even so, my point is Any IDE 
vs. No IDE, so it doesn't address my point.



My argument is that a D project can be done as nothing but a
collection of .d files with no extra project files of any kind. In c#
this is theoretically possible, but from any practical standpoint
it's not going to be done. There is going to be some extra files that
list, in some form, extra information the compiler needs to resolve
symbols and figure out where to look for stuff. In any practical
environment this extra bit that c# more or less forces you to have
(and D doesn't) will be maintained by some sort of IDE.


this is wrong. you cannot have a big project based solely on .d files.
look at DWT as an example. no matter what tool you use, let's say DSSS,
it still has a config file of some sort which contains that additional
meta-data.


So DWT depends on DSSS's meta data. That's a design choice of DWT not D. 
What I'm asserting is that C# projects depending on meta data is a design 
choice of C#, not the project. D projects can (even if some aren't) be 
practically designed so that they don't need that meta data whereas, I 
will assert, C# projects, for practical purposes, can't do away with it.


--

I'm fine with any build system you want to have implemented as long as a 
tool stack can still be built that works like the current one. That is that 
it can practically:


- support projects that need no external meta data
- produce monolithic OS native binary executables
- work with the only language aware tool being the compiler

I don't expect it to require that projects be done that way and I wouldn't 
take any issue if a tool stack were built that didn't fit that list. What 
I /would/ take issue with is if the language (okay, or DMD in particular) 
were altered to the point that one or more of those *couldn't* be done.





Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Bill Baxter
On Wed, May 20, 2009 at 1:09 PM, Andrei Alexandrescu
seewebsiteforem...@erdani.org wrote:
 Yigal Chripun wrote:

 I think you miss the point here.
 Generics and code generation are two separate and orthogonal features that
 where conflated together by C++.

 It's kind of odd, then, that for example the Generative Programming book
 (http://www.generative-programming.org) chose to treat the two notions in
 conjunction.

Yeh, there's definitely an overlap.  Orthogonal isn't quite the right word there.

I'm reading a bit on C++/CLI right now, which is C++ extended to
inter-operate with CLR.
C++/CLI has *both* classic C++ templates and CLR generics:

   template<typename T> ...  // all code specializations generated at
compile-time (if at all)
   generic<typename T>  ...  // code generated at compile-time
unconditionally, specialized at run-time.

I'm not clear on exactly what happens at runtime in the generic
case.  I had been thinking it was simply that the compiler does some
type checking at compile time and the VM code then just manipulates
pointers to Object from there.  That may be what happens in Java
generics, but in CLR generics at least you can specialize on
non-Object value types and that apparently does not result in
everything getting boxed.  So it seems like there's a little something
extra going on.

I think the main reason for having Generics is that they're the best
anyone currently knows how to do at the IL bytecode level.  Generics
give you a way to define generic parameterized types that work across
all the languages that target a given VM's bytecode.  But that doesn't
preclude any language that targets that VM from *also* implementing
compile-time templates, or code generators, or AST macros at the
source code level.

But the problem with source-level code generation is that you then
need the source code in order to use the library.  I think they were
trying to avoid that with C#.   If you have a compiled C# assembly,
then you have everything you need to use it.   Period.   (I think.)
At any rate, a tech that requires inclusion of source code is not very
interesting to Microsoft, because Microsoft doesn't generally like to
let people see their source code in the first place, and they know
that many of their biggest customers don't like to either.  They're
nervous enough about just putting de-compileable bytecode out there.

--bb


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Christopher Wright

Nick Sabalausky wrote:
Christopher Wright dhase...@gmail.com wrote in message 
news:gv0p4e$uv...@digitalmars.com...

Nick Sabalausky wrote:
I can see certain potential benefits to the general way C# does generics, 
but until the old (and I do mean old) issue of There's an IComparable, 
so why the hell won't MS give us an IArithmetic so we can actually use 
arithmetic operators on generic code? gets fixed (and at this point I'm 
convinced they've never had any intent of ever fixing that), I don't care 
how valid the reasoning behind C#'s general approach to generics is, the 
actual state of C#'s generics still falls squarely into the categories of 
crap and almost useless.
IArithmetic is impossible in C# because operator overloads are static 
methods, and interfaces cannot specify static methods.


Then how does IComparable work? 


It uses a member function instead.


Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Nick Sabalausky
Christopher Wright dhase...@gmail.com wrote in message 
news:gv29vn$7a...@digitalmars.com...
 Nick Sabalausky wrote:
 Christopher Wright dhase...@gmail.com wrote in message 
 news:gv0p4e$uv...@digitalmars.com...
 Nick Sabalausky wrote:
 I can see certain potential benefits to the general way C# does 
 generics, but until the old (and I do mean old) issue of There's an 
 IComparable, so why the hell won't MS give us an IArithmetic so we can 
 actually use arithmetic operators on generic code? gets fixed (and at 
 this point I'm convinced they've never had any intent of ever fixing 
 that), I don't care how valid the reasoning behind C#'s general 
 approach to generics is, the actual state of C#'s generics still falls 
 squarely into the categories of crap and almost useless.
 IArithmetic is impossible in C# because operator overloads are static 
 methods, and interfaces cannot specify static methods.

 Then how does IComparable work?

 It uses a member function instead.

And they can't do the same for arithmetic? 




Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Nick Sabalausky
Bill Baxter wbax...@gmail.com wrote in message 
news:mailman.151.1242855932.13405.digitalmar...@puremagic.com...
 On Wed, May 20, 2009 at 1:09 PM, Andrei Alexandrescu
 seewebsiteforem...@erdani.org wrote:
 Yigal Chripun wrote:

 I think you miss the point here.
 Generics and code generation are two separate and orthogonal features 
 that
 where conflated together by C++.

 It's kind of odd, then, that for example the Generative Programming book
 (http://www.generative-programming.org) chose to treat the two notions in
 conjunction.

 Yeh, there's definitely a overlap.  Orthogonal isn't quite the right word 
 there.

 I'm reading a bit on C++/CLI right now, which is C++ extended to
 inter-operate with CLR.
 C++/CLI has *both* classic C++ templates and CLR generics:

  template<typename T> ...  // all code specializations generated at
 compile-time (if at all)
  generic<typename T>  ...  // code generated at compile-time
 unconditionally, specialized at run-time.

 I'm not clear on exactly what happens at runtime in the generic
 case.  I had been thinking it was simply that the compiler does some
 type checking at compile time and the VM code then just manipulates
 pointers to Object from there.  That may be what happens in Java
 generics, but in CLR generics at least you can specialize on
 non-Object value types and that apparently does not result in
 everything getting boxed.  So it seems like there's a little something
 extra going on.

 I think the main reason for having Generics is that they're the best
 anyone currently knows how to do at the IL bytecode level.  Generics
 give you a way to define generic parameterized types that work across
 all the languages that target a given VM's bytecode.  But that doesn't
 preclude any language that targets that VM from *also* implementing
 compile-time templates, or code generators, or AST macros at the
 source code level.

 But the problem with source-level code generation is that you then
 need the source code in order to use the library.  I think they were
 trying to avoid that with C#.   If you have a compiled C# assembly,
 then you have everything you need to use it.   Period.   (I think.)
 At any rate, a tech that requires inclusion of source code is not very
 interesting to Microsoft, because Microsoft doesn't generally like to
 let people see their source code in the first place, and they know
 that many of their biggest customers don't like to either.  They're
 nervous enough about just putting de-compileable bytecode out there.


Maybe this is naive, but what about an AST-level template/generic? Couldn't 
that provide for the best of both worlds?

For instance, suppose (purely hypothetically) that the .NET assembly system 
were changed to allow the source for a D/C++ style of source-level template 
to be embedded into the assembly. Then they'd be able to do D/C++ style 
source-level template/code-generation. Right? Now obviously the big problem 
with that is it would only be usable in the same language it was originally 
written in. So, instead of getting that cross-language support by going all 
the way down to the IL bytecode level to implement generics (which, as you 
said, would somehow prevent the flexibility that the D/C++ style enjoys) 
suppose it only went down as far as a language-agnostic AST?

I suppose that might make reverse-engineering easier which MS might not 
like, but I'm not suggesting this as something that MS should like or should 
even do, but rather suggesting it as (business issues completely aside) 
something that would possibly gain the benefits of both styles. 




Re: OT: on IDEs and code writing on steroids

2009-05-20 Thread Daniel Keep


Nick Sabalausky wrote:
 ...
 
 Maybe this is naive, but what about an AST-level template/generic? Couldn't 
 that provide for the best of both worlds?
 
 For instance, suppose (purely hypothetically) that the .NET assembly system 
 were changed to allow the source for a D/C++ style of source-level template 
 to be embedded into the assembly. Then they'd be able to do D/C++ style 
 source-level template/code-generation. Right? Now obviously the big problem 
 with that is it would only be usable in the same language it was originally 
 written in. So, instead of getting that cross-language support by going all 
 the way down to the IL bytecode level to implement generics (which, as you 
 said, would somehow prevent the flexibility that the D/C++ style enjoys) 
 suppose it only went down as far as a language-agnostic AST?
 
 ...

What I've always thought might be an interesting experiment would be to
change templates in LDC so that instead of generating an AST, they
generate code that generates code.

So when you use A!(T), what happens is that at runtime the template is
run with T as an argument.  This generates a chunk of LLVM bitcode
which LLVM then assembles to machine code and links into the program.

This alleviates the problem with using source in that if you embed the
template's actual source, then you suddenly ALSO have to embed the
standard library's source and the source of any other libraries you
happened to compile with.

Oh, and the same version of the compiler.

  -- Daniel


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Rainer Deyke
Yigal Chripun wrote:
 oh, I forgot my last point:
 for C link-time compatibility you need to be able to _read_ C object
 files and link them to your executable. you gain little from the ability
 to _write_ object files.

You get transitivity.  Two compilers for different languages that both
produce C object files can link to each other; two compilers that can
only read C object files cannot.

 if you want to do a reverse integration (use D code in your C project)
 you can and IMO should have created a library anyway instead of using
 object files and the compiler should allow this as a separate option via
 a flag, e.g. --make-so or whatever

If you can read and write compatible library files, you don't need to
read or write compatible object files, since library files can take the
place of object files.


-- 
Rainer Deyke - rain...@eldwood.com


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Yigal Chripun

BCS wrote:

one other thing, this thread discusses also the VS project files. This
is completely irrelevant. those XML files are VS specific and their
complexity is MS' problem. Nothing prevents a developer from using
different build tools like make, rake or scons with their C# sources
since VS comes with a command line compiler. the issue is not the
build tool but rather the compilation model itself.


I think you are in error here as the c# files don't contain enough 
information for the compiler to know where to resolve symbols. You might 
be able to get away with throwing every single .cs/.dll/whatever file in 
the project at the compiler all at once. (Now if you want to talk about 
archaic!) Aside from that, how can it find meta-data for your types?


 you're mistaken since there are build tools that support C#. I think I 
saw this in Scons last time I looked.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Yigal Chripun

Rainer Deyke wrote:

Yigal Chripun wrote:

oh, I forgot my last point:
for C link-time compatibility you need to be able to _read_ C object
files and link them to your executable. you gain little from the ability
to _write_ object files.


You get transitivity.  Two compilers for different languages that both
produce C object files can link to each other; two compilers that can
only read C object files cannot.


good point.




if you want to do a reverse integration (use D code in your C project)
you can and IMO should have created a library anyway instead of using
object files and the compiler should allow this as a separate option via
a flag, e.g. --make-so or whatever


If you can read and write compatible library files, you don't need to
read or write compatible object files, since library files can take the
place of object files.




that's even better. just allow 2-way usage of C libs and that's it. no 
need to support the C object file formats directly.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Lutger
Yigal Chripun wrote:

...
 IMO, designing the language to support this better work-flow is a good 
 decision made by MS, and D should follow it instead of trying to get 
 away without an IDE.

I'm not sure about this. D is designed to be easier to parse than C++ 
(but that's saying nothing) to allow better tools to be made for it. I think 
this should be enough. 

C# & friends not only better support working inside an IDE, but make it a 
pain to do without. Autocomplete dictates that related functions be named with 
the exact same prefix, even when this isn't logical. It also encourages names 
to be as descriptive as possible, in practice leading to part of the API docs 
being encoded in the function name. Extremely bloated names are the consequence. 
It doesn't always make code more readable, imho. 

The documentation comments are in XML: pure insanity. I tried to generate 
documentation for my stuff at work once, expecting to be done in max 5 min. 
like with ddoc. Turns out nobody at work uses documentation generation, for a 
reason: it isn't really fleshed out and one-click from the IDE; in fact it is 
a pain in the arse compared to using ddoc.

I should stop now before this turns into a rant.










Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Lutger
BCS wrote:

...
 
 all LINQ is is a set of standard naming conventions and sugar. I add a 
 Where function to some SQL table object and you get the above as well.
 
...

Not really: LINQ is 'sugar' for the underlying libraries that implement 
querying. Instead of calling it just sugar, it is more proper to call it a 
language in its own right.

LINQ to SQL is just one thing; the power of LINQ is that you separate queries 
from the source of data. You can (ideally) write one query that works with 
SQL, XML or plain arrays. It's not only that you don't have to write SQL 
queries anymore; a lot of messy for/while/etc. loops can be totally replaced 
with LINQ queries too. 
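The "one query, many sources" idea has a rough analogy in Python's iterator protocol. This is only an illustration of the shape of the argument, not LINQ itself; the `adults` function and sample data are made up:

```python
def adults(people):
    # One query, written once, indifferent to where the rows come from --
    # roughly the shape of 'from p in people where p.age >= 18 select p.name'.
    return [name for name, age in people if age >= 18]

# The same query runs over an in-memory list...
in_memory = [("ann", 30), ("bob", 12)]
# ...or over a lazily produced stream, without changing a character.
streamed = ((n, a) for n, a in [("cho", 45), ("dee", 9)])

print(adults(in_memory))  # ['ann']
print(adults(streamed))   # ['cho']
```

The point being that the query is written against an abstract sequence, so swapping the data source (array, generator, database cursor) does not touch the query itself.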





Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Ary Borenszweig

Frits van Bommel escribió:

Ary Borenszweig wrote:

Frits van Bommel wrote:

Jacob Carlborg wrote:

Daniel Keep wrote:
Actually, Descent isn't perfect, either.  For example, it mandates 
that

cases in a switch MUST be aligned with the braces.  What's more fun is
that you can't override it until AFTER it's corrected YOU.


Just file a ticket.


The relevant ticket[1] is a year old, according to dsource...


[1]: At least I *think* he's talking about this: 
http://dsource.org/projects/descent/ticket/82


Well, I didn't know it was *that* important for using it. If you 
consider it really important, post something in the forums, reply to 
that ticket, or something like that.


Why would I reply to it? I *wrote* it.


LOL!


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Daniel Keep


Yigal Chripun wrote:
 Rainer Deyke wrote:
 ...

 If you can read and write compatible library files, you don't need to
 read or write compatible object files, since library files can take the
 place of object files.
 
 that's even better. just allow 2-way usage of C libs and that's it. no
 need to support the C object file formats directly.

Ummm... IIRC, an .a file is just an archive of .o files.  A .lib file in
Windows is something similar.

If you want to support C libraries, you need to support the object file
format as well.

  -- Daniel
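Daniel's point is easy to check in miniature: a classic ar archive really is just the members concatenated behind fixed-size text headers. The sketch below follows the common System V layout (16-byte name, 12-byte mtime, 6+6-byte uid/gid, 8-byte mode, 10-byte size, then a 2-byte terminator); real `.a` files from GNU ar add details this toy ignores, such as long-name tables and the symbol index:

```python
import io

AR_MAGIC = b"!<arch>\n"

def ar_pack(members):
    """Build a minimal ar-style archive from (name, data) pairs:
    global magic, then per member a 60-byte text header followed by
    the raw data, padded to 2-byte alignment."""
    out = io.BytesIO()
    out.write(AR_MAGIC)
    for name, data in members:
        # header fields: name(16) mtime(12) uid(6) gid(6) mode(8) size(10) + "`\n"
        hdr = "{:<16}{:<12}{:<6}{:<6}{:<8}{:<10}".format(
            name + "/", 0, 0, 0, 644, len(data)).encode("ascii") + b"`\n"
        out.write(hdr + data)
        if len(data) % 2:
            out.write(b"\n")  # pad odd-sized members
    return out.getvalue()

def ar_list(blob):
    """List member names, like 'ar t libfoo.a' would."""
    assert blob.startswith(AR_MAGIC)
    names, pos = [], len(AR_MAGIC)
    while pos + 60 <= len(blob):
        hdr = blob[pos:pos + 60]
        names.append(hdr[:16].decode("ascii").strip().rstrip("/"))
        size = int(hdr[48:58])
        pos += 60 + size + size % 2
    return names
```

So a tool that can read library files gets object-file handling almost for free on Unix: the members are ordinary `.o` files stored verbatim.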


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Tim Matthews

On Tue, 19 May 2009 08:56:59 +1200, BCS a...@pathlink.com wrote:



VS/MS/etc is a for-profit ecosystem. They assume that your system and  
software is paid for by your boss and he's spending 10-20 times that much  
on your paycheck, so who cares. At least that's the impression I get.




I think the VS Express editions, which can be used to make great software and  
sell that software without paying MS a single cent, are very generous of them, and  
the .NET CIL is a genius idea. The most successful compilers are the ones that  
recognize that there are multiple languages and multiple architectures, and  
that there should be something in the middle. CIL just leaves the code in that  
middle form until the last minute. MS may not make the best operating  
systems, but the whole .NET thing is very good in my opinion, and I think  
Sun is better for their Solaris than their Java.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread grauzone
and the .NET CIL is a genius idea. The most successful compilers are the 
ones that recognize that there are multiple languages and multiple 
architectures, and that there should be something in the middle. CIL 
just leaves the code in that middle form until the last minute. MS may not 
make the best operating systems but the whole .NET thing is very good in my 


And what exactly is good about byte code?

It's portable? My D code is portable too. Sure, it requires 
recompilation, but it doesn't need a clusterfuck-VM just for running it.



opinion and I think Sun is better for their Solaris than their Java.


.Net is just Microsoft's Java clone, and Sun didn't invent byte code either.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread BCS

Reply to Daniel,


Yigal Chripun wrote:


BCS wrote:


one other thing, this thread discusses also the VS project files.
This is completely irrelevant. those XML files are VS specific and
their complexity is MS' problem. Nothing prevents a developer from
using different build tools like make, rake or scons with their C#
sources since VS comes with a command line compiler. the issue is
not the build tool but rather the compilation model itself.


I think you are in error here as the c# files don't contain enough
information for the compiler to know where to resolve symbols. You
might be able to get away with throwing every single
.cs/.dll/whatever file in the project at the compiler all at once.
(Now if you want to talk about archaic!) Aside from that, how can it
find meta-data for your types?


you're mistaken since there are build tools that support C#. I think
I saw this in Scons last time I looked.


Maybe you should back up your statements instead of just guessing.

http://www.scons.org/wiki/CsharpBuilder

Oh look, you have to list all the source files because C# source files
*do not contain enough information*.

A C# source file containing 'using Foo.Bar;' tells you exactly ZERO
about what other files it depends on.

-- Daniel



Exactly. The only practical way to deal with C# is an IDE or build system 
of some kind that is aware of C#. You /can/ deal with it by hand but IMHO 
that would be about half way from D to using C without even a make file or 
build script.
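The asymmetry Daniel and BCS are describing can be sketched in a few lines. This is a toy, not how dmd or rebuild is actually implemented: it handles only plain `import foo.bar;` declarations (no renamed or selective imports), but it shows why the mapping from a D import to a file is purely lexical, while a C# `using Foo.Bar;` names a namespace that could live in any assembly, so no such function can exist for C# without external metadata:

```python
import re

def resolve_d_imports(source_text, known_files):
    """Map plain 'import foo.bar;' declarations in D source text to the
    files that define them. Module name 'foo.bar' means 'foo/bar.d' on
    the import path, which is what lets a tool chase D dependencies from
    source alone, with no project file."""
    resolved = {}
    for module in re.findall(r'\bimport\s+([\w.]+)\s*;', source_text):
        path = "/".join(module.split(".")) + ".d"
        # Resolvable from the source tree alone, or genuinely missing:
        resolved[module] = path if path in known_files else None
    return resolved

src = "import std.stdio; import foo.bar;"
print(resolve_d_imports(src, {"std/stdio.d", "foo/bar.d"}))
```

Nothing comparable is recoverable from a C# source file: the namespace-to-assembly mapping lives in the project file or on the compiler command line, which is the external metadata the thread keeps coming back to.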





Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread BCS

Reply to Lutger,


BCS wrote:

...


all LINQ is is a set of standard naming conventions and sugar. I add
a Where function to some SQL table object and you get the above as
well.


...

Not really, LINQ is 'sugar' for the underlying libraries that


As far as language features go, I'm even less impressed with sugar for 
libraries.


implements querying. Instead of calling it just sugar, it is more
proper to call it a language in it's own right.



I still don't think it's anything too spectacular. The AST stuff on the other 
hand...





Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Yigal Chripun

BCS wrote:

Reply to Daniel,


Yigal Chripun wrote:


BCS wrote:


one other thing, this thread discusses also the VS project files.
This is completely irrelevant. those XML files are VS specific and
their complexity is MS' problem. Nothing prevents a developer from
using different build tools like make, rake or scons with their C#
sources since VS comes with a command line compiler. the issue is
not the build tool but rather the compilation model itself.


I think you are in error here as the c# files don't contain enough
information for the compiler to know where to resolve symbols. You
might be able to get away with throwing every single
.cs/.dll/whatever file in the project at the compiler all at once.
(Now if you want to talk about archaic!) Aside from that, how can it
find meta-data for your types?


you're mistaken since there are build tools that support C#. I think
I saw this in Scons last time I looked.


Maybe you should back up your statements instead of just guessing.

http://www.scons.org/wiki/CsharpBuilder

Oh look, you have to list all the source files because C# source files
*do not contain enough information*.

A C# source file containing 'using Foo.Bar;' tells you exactly ZERO
about what other files it depends on.

-- Daniel



Exactly. The only practical way to deal with C# is an IDE or build 
system of some kind that is aware of C#. You /can/ deal with it by hand 
but IMHO that would be about half way from D to using C without even a 
make file or build script.





first, thanks Daniel for the evidence I missed.
BCS wrote that a programmer needs to compile all the source files at 
once to make it work without an IDE. as I already said, he's wrong, and 
Daniel provided the proof above.


sure, you don't get the full power of an IDE that can track all the 
source files in the project for you. That just means that it's worth the 
money you pay for it.


you can write makefiles or whatever (scons, rake, ant, ...) in the same 
way you'd do for C and C++. In other words:
if you prefer command line tools you get the same experience, and if you 
do use an IDE you get a *much* better experience.
The same goes for D - either write your own makefile or use rebuild, which 
uses the compiler front-end to parse the source files, just like you 
suggested above for C#.


where in all of that, do you see any contradiction to what I said?
again, I said the D compilation model is ancient legacy and should be 
replaced and that has nothing to do with the format you prefer for your 
build scripts.




Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread BCS

Reply to Yigal,


BCS wrote:


Reply to Daniel,


Yigal Chripun wrote:


BCS wrote:


one other thing, this thread discusses also the VS project files.
This is completely irrelevant. those XML files are VS specific
and their complexity is MS' problem. Nothing prevents a developer
from using different build tools like make, rake or scons with
their C# sources since VS comes with a command line compiler. the
issue is not the build tool but rather the compilation model
itself.


I think you are in error here as the c# files don't contain enough
information for the compiler to know where to resolve symbols. You
might be able to get away with throwing every single
.cs/.dll/whatever file in the project at the compiler all at once.
(Now if you want to talk about archaic!) Aside from that, how can
it find meta-data for your types?


you're mistaken since there are build tools that support C#. I
think I saw this in Scons last time I looked.


Maybe you should back up your statements instead of just guessing.

http://www.scons.org/wiki/CsharpBuilder

Oh look, you have to list all the source files because C# source
files *do not contain enough information*.

A C# source file containing 'using Foo.Bar;' tells you exactly ZERO
about what other files it depends on.

-- Daniel


Exactly. The only practical way to deal with C# is an IDE or build
system of some kind that is aware of C#. You /can/ deal with it by
hand but IMHO that would be about half way from D to using C without
even a make file or build script.



first, thanks Daniel for the evidence I missed.
BCS wrote that a programmer needs to compile all the source files at
once to make it work without an IDE. as I already said, he's wrong,
and Daniel provided the proof above.


Minor point; I said you have to give the compiler all the source files. You 
might not actually need to compile them all, but without some external 
metadata, the compiler still needs to be handed the full list because it 
can't find them on its own. And at that point you might as well compile them 
anyway.



sure, you don't get the full power of an IDE that can track all the
source files in the project for you. That just means that it's worth
the money you pay for it.

you can write makefiles or what ever (scons, rake, ant, ...) in the
same way you'd do for C and C++. In other words:
if you prefer commnad line tools you get the same experience and if
you do use an IDE you get a *much* better experience.
same goes for D - either write your own makefile or use rebuild which




uses the compiler front-end to parse the source files just like you
suggested above for C#.



where did I suggest that?


where in all of that, do you see any contradiction to what I said?
again, I said the D compilation model is ancient legacy and should be
replaced and that has nothing to do with the format you prefer for
your build scripts.



I think that you think I'm saying something other than what I'm trying to 
say. I'm struggling to make my argument clear but can't seem to put it in 
words. My thesis is that, in effect, C# is married to VS and that D is married 
only to the compiler.


My argument is that a D project can be done as nothing but a collection of 
.d files with no extra project files of any kind. In C# this is theoretically 
possible, but from any practical standpoint it's not going to be done. There 
are going to be some extra files that list, in some form, extra information 
the compiler needs to resolve symbols and figure out where to look for stuff. 
In any practical environment this extra bit that C# more or less forces you 
to have (and D doesn't) will be maintained by some sort of IDE.


To put it quantitatively:

productivity on a scale of 0 to whatever
c# w/o IDE - ~1
D w/o IDE - 10
c# w/ IDE - 100+
D w/ IDE - 100+

Either C# or D will be lots more productive with an IDE, but D without an 
IDE will be lots more productive than C# without an IDE. D is designed to 
be used however you want, IDE or not. C# is *designed* to be used from within 
VS. I rather suspect that the usability of C# without VS is very low on MS's 
"things we care about" list.






Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Walter Bright

Georg Wrede wrote:
In the Good Old Days (when it was usual for an average programmer to 
write parts of the code in ASM (that was the time before the late 
eighties -- be it Basic, Pascal, or even C, some parts had to be done in 
ASM to help a bearable user experience when the mainframes had less 
power than today's MP3 players), the ASM programing was very different 
on, say, Zilog, MOS, or Motorola processors. The rumor was that the 6502 
was made for hand coded ASM, whereas the 8088 was more geared towards 
automatic code generation (as in C compilers, etc.). My experiences of 
both certainly seemed to support this.


The 6502 is an 8 bit processor, the 8088 is 16 bits. All 8 bit 
processors were a terrible fit for C, which was designed for 16 bit 
CPUs. Everyone who coded professional apps for the 6502, 6800, 8080 and 
Z80 (all 8 bit CPUs) wrote in assembler. (Including myself.)



If we were smart with D, we'd find out a way of leapfrogging this 
thinking. We have a language that's more powerful than any of C#, Java 
or C++, more practical than Haskell, Scheme, Ruby & co, and more 
maintainable than C or Perl, but which *still* is Human Writable. All we 
need is some outside-of-the-box thinking, and we might reap some 
overwhelming advantages when we combine *this* language with the IDEs 
and the horsepower that the modern drone takes for granted.


Easier parsing, CTFE, actually usable templates, practical mixins, pure 
functions, safe code, you name it! We have all the bits and pieces to 
really make writing + IDE assisted program authoring, a superior reality.


Right, but I can't think of any IDE feature that would be a bad fit for 
using the filesystem to store the D source modules.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Walter Bright

Christopher Wright wrote:

I really like IDEs. They let me think less when creating code.


It wouldn't be hard to do a competent IDE for D. After all, D is 
designed to make that job easy.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Ary Borenszweig

Walter Bright escribió:

Christopher Wright wrote:

I really like IDEs. They let me think less when creating code.


It wouldn't be hard to do a competent IDE for D. After all, D is 
designed to make that job easy.


Like, for example, if you have this:

---
char[] someFunction(char[] name) {
  return "int " ~ name ~ ";";
}

class Foo {
mixin(someFunction("variable"));
}

void main() {
Foo foo = new Foo();
foo.  // <-- I'd really like the IDE to suggest me "variable"
}
---

Do you really think implementing a *good* IDE for D is easy now? :-P

(of course Descent works in this case, but just because it has the full 
dmdfe in it... so basically a good IDE will need to be able to do CTFE, 
instantiante templates, etc., and all of those things are kind of 
unclear in the specification of the D language, so if you don't use 
dmdfe... well... I hope you get my point)


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread bearophile
Walter Bright:
 The 6502 is an 8 bit processor, the 8088 is 16 bits. All 8 bit 
 processors were a terrible fit for C, which was designed for 16 bit 
 CPUs. Everyone who coded professional apps for the 6502, 6800, 8080 and 
 Z80 (all 8 bit CPUs) wrote in assembler. (Including myself.)

Forth interpreters can be very small, it's a very flexible language, you can 
metaprogram it almost like Lisp, and if implemented well it can be efficient 
(surely more than interpreted Basic, but less than handwritten asm; you can 
have an optimizing Forth in probably less than 4-5 KB).

But people were waiting/asking for Basic, most people didn't know Forth, and 
Basic was common in schools, so Basic was the language shipped inside the 
machine instead of Forth:
http://www.npsnet.com/danf/cbm/languages.html#FORTH

A Commodore 64 with built-in Forth instead of Basic might have driven computer 
science in a quite different direction.

Do you agree?

Bye,
bearophile


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Walter Bright

Ary Borenszweig wrote:

Do you really think implementing a *good* IDE for D is easy now? :-P

(of course Descent works in this case, but just because it has the full 
dmdfe in it... so basically a good IDE will need to be able to do CTFE, 
instantiante templates, etc., and all of those things are kind of 
unclear in the specification of the D language, so if you don't use 
dmdfe... well... I hope you get my point)


The dmdfe is available, so one doesn't have to recreate it. That makes 
it easy :-)


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Walter Bright

bearophile wrote:

Forth interpreters can be very small, it's a very flexible language,
you can metaprogram it almost like Lisp, and if implemented well it can
be efficient (surely more than interpreted Basic, but less than
handwritten asm; you can have an optimizing Forth in probably less
than 4-5 KB).

But people were waiting/asking for Basic, most people didn't know
Forth, and Basic was common in schools, so Basic was the language
shipped inside the machine instead of Forth: 
http://www.npsnet.com/danf/cbm/languages.html#FORTH


A Commodore 64 with built-in Forth instead of Basic might have driven
computer science in a quite different direction.

Do you agree?


I remember lots of talk about Forth, and nobody using it.



Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Ary Borenszweig

Walter Bright escribió:

Ary Borenszweig wrote:

Do you really think implementing a *good* IDE for D is easy now? :-P

(of course Descent works in this case, but just because it has the 
full dmdfe in it... so basically a good IDE will need to be able to do 
CTFE, instantiante templates, etc., and all of those things are kind 
of unclear in the specification of the D language, so if you don't use 
dmdfe... well... I hope you get my point)


The dmdfe is available, so one doesn't have to recreate it. That makes 
it easy :-)


Except if the IDE is not made in C++ ;-)


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Georg Wrede

Walter Bright wrote:

Georg Wrede wrote:
In the Good Old Days (when it was usual for an average programmer to 
write parts of the code in ASM (that was the time before the late 
eighties -- be it Basic, Pascal, or even C, some parts had to be done 
in ASM to help a bearable user experience when the mainframes had less 
power than today's MP3 players), the ASM programing was very different 
on, say, Zilog, MOS, or Motorola processors. The rumor was that the 
6502 was made for hand coded ASM, whereas the 8088 was more geared 
towards automatic code generation (as in C compilers, etc.). My 
experiences of both certainly seemed to support this.


The 6502 is an 8 bit processor, the 8088 is 16 bits. All 8 bit 
processors were a terrible fit for C, which was designed for 16 bit 
CPUs. Everyone who coded professional apps for the 6502, 6800, 8080 and 
Z80 (all 8 bit CPUs) wrote in assembler. (Including myself.)


Sloppy me, 8080 was what I meant, instead of the 8088. My bad.

And you're right about ASM coding. But over here, with smaller software 
companies, stuff was done with S-Basic (does anyone even know that one 
anymore???), C-Basic, and Turbo Pascal. Ron Cain's SmallC wasn't really 
up to anything serious, and C wasn't all that well known around here 
then. But Turbo Pascal was already at 3.0 in 1985, and a good 
investment, because using it was the same on the pre-PC computers and 
the then-new IBM-PC.


If we were smart with D, we'd find out a way of leapfrogging this 
thinking. We have a language that's more powerful than any of C#, Java 
or C++, more practical than Haskell, Scheme, Ruby & co, and more 
maintainable than C or Perl, but which *still* is Human Writable. All 
we need is some outside-of-the-box thinking, and we might reap some 
overwhelming advantages when we combine *this* language with the IDEs 
and the horsepower that the modern drone takes for granted.


Easier parsing, CTFE, actually usable templates, practical mixins, 
pure functions, safe code, you name it! We have all the bits and 
pieces to really make writing + IDE assisted program authoring, a 
superior reality.


Right, but I can't think of any IDE feature that would be a bad fit for 
using the filesystem to store the D source modules.


I remember writing something about it here, like 7 years ago. But today 
there are others who have newer opinions about it. I haven't thought 
about it since then.


I wonder how a seasoned template author would describe what the most 
welcome help would be when writing serious templates?


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread BCS

Reply to Georg,


I wonder how a seasoned template author would describe what the most
welcome help would be when writing serious templates?



Breakpoint debugging of template expansion. Pick a template, feed it 
values and see (as in syntax highlighting and foreach unrolling) what happens. 
Pick an invoked template and dive in. Real breakpoint debugging of CTFE where 
it will stop on the line that is not CTFEable.


Oh, and auto-complete that works with meta but doesn't fall over on its side 
twitching with larger systems.





Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Derek Parnell
On Tue, 19 May 2009 16:09:54 -0700, Walter Bright wrote:

 bearophile wrote:
 Forth interpreters can be very small, it's a very flexible language,
 you can metaprogram it almost like Lisp, and if implemented well it can
 be efficient (surely more than interpreted Basic, but less than
 handwritten asm; you can have an optimizing Forth in probably less
 than 4-5 KB).
 
 But people were waiting/asking for Basic, most people didn't know
 Forth, and Basic was common in schools, so Basic was the language
 shipped inside the machine instead of Forth: 
 http://www.npsnet.com/danf/cbm/languages.html#FORTH
 
 A Commodore 64 with built-in Forth instead of Basic might have driven
 computer science in a quite different direction.
 
 Do you agree?
 
 I remember lots of talk about Forth, and nobody using it.

It can quickly degenerate into a write-only language because it encourages
one to extend the syntax, and even semantics, of the language. It takes
extreme discipline to make a Forth program maintainable by anyone other
than the original author. 

The other difficulty with it is that most people don't use Reverse Polish
Notation often enough for it to become second nature, thus making it hard
for people to read a Forth program and 'see' what it's trying to do.

However, it has its own elegance and simplicity that can be very alluring.
I see it as the Circe of programming languages.

-- 
Derek Parnell
Melbourne, Australia
skype: derek.j.parnell
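Derek's RPN point is easy to see in miniature. Below is a toy postfix evaluator, illustrative only; a real Forth has a dictionary, defining words, and far more than four operators:

```python
def rpn(expr):
    """Evaluate a postfix expression, Forth-style: operands push onto a
    stack, operators pop their arguments. '2 3 4 * +' is 2 + (3 * 4)."""
    import operator
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.floordiv}  # integer division, Forth-style
    stack = []
    for tok in expr.split():
        if tok in ops:
            b = stack.pop()      # note the order: second operand comes off first
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))
    (result,) = stack            # a well-formed expression leaves exactly one value
    return result

print(rpn("2 3 4 * +"))  # 14
```

Reading "2 3 4 * +" and seeing 2 + 3 * 4 takes deliberate practice, which is exactly the maintainability problem described above.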


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Robert Fraser

BCS wrote:
Oh, and auto-complete that works with meta but doesn't fall over on its 
side twitching with larger systems.


:-) It's getting better, slowly.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread BCS

Hello Robert,


BCS wrote:


Oh, and auto-complete that works with meta but doesn't fall over on
its side twitching with larger systems.


:-) It's getting better, slowly.



I can get you some test cases if you want... :-)




Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Walter Bright

Not even this book cover could save Forth!

http://www.globalnerdy.com/2007/09/14/reimagining-programming-book-covers/


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Andrei Alexandrescu

Walter Bright wrote:

Not even this book cover could save Forth!

http://www.globalnerdy.com/2007/09/14/reimagining-programming-book-covers/


Hehe...

And of course the Ruby book has the obligatory distasteful sexual 
reference.


Only today I was reading another book on Rails and within the third page 
I got the notion that good website development is like good porn: you 
know it when you see it. Yeah, you've apparently seen too much of it. Get 
a date. :o/


I'm all for sexual jokes, but give me a break with the lucky stiff. 
The subtler the better. I made one such joke in a talk at ACCU, and it 
took people 30 seconds to even suspect it. (Walter of course got it in a 
femtosecond.)



Andrei


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Nick Sabalausky
Steven Schveighoffer schvei...@yahoo.com wrote in message 
news:op.ut4vynx5eav...@steves.networkengines.com...
 The docs are reasonable once you figure out how  they are laid out.


I find the docs to be so slow as to be almost unusable. F*(*^*%* AJAX. 




Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread BCS

Hello Nick,


If Java has gotten so fast as many
people claim, why is Eclipse still such a sluggish POS?).



for the same reason that anything is slow, people more than make up for any 
gains in perf with more features (and shoddy code)





Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread dsimcha
== Quote from Christopher Wright (dhase...@gmail.com)'s article
 Nick Sabalausky wrote:
  Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message
  news:gus0lu$1sm...@digitalmars.com...
 
  I've repeatedly failed to figure out the coolness of C#, and would
  appreciate a few pointers. Or references. Or delegates :o).
 
  Outside of this group, I think most of the people considering C# really cool
  are people who are unaware of D and are coming to C# from Java. What's
  cool about C# is that it's like a less-shitty version of Java (and *had*
  good tools, although the newer versions of VS are almost as much of a
  bloated unresponsive mess as Eclipse - Which come to think of it, makes me
  wonder - If Java has gotten so fast as many people claim, why is Eclipse
  still such a sluggish POS?).
 
  Compare C# to D though and most of the coolness fades, even though there are
  still a handful of things I think D could still learn from C# (but there's
  probably more than a handful that C# could learn from D).
 Generics and reflection. Generics just hide a lot of casts, usually, but
 that's still quite useful. And autoboxing is convenient, though not
 appropriate for D.

What the heck do you need generics for when you have real templates?  To me,
generics seem like just a lame excuse for templates.


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Daniel Keep

grauzone wrote:
 and the .NET CIL is a genius idea. The most successful compilers are
 the ones that recognize that there are multiple languages and multiple
 architectures, and that there should be something in the middle. CIL
 just leaves the code in that middle form until the last minute. MS may not
 make the best operating systems but the whole .NET thing is very good in my 
 
 And what exactly is good about byte code?
 
 It's portable? My D code is portable too. Sure, it requires
 recompilation, but it doesn't need a clusterfuck-VM just for running it.

There's a few points here:

1. Users don't like compiling software.  Hell, *I* don't like having to
compile software since it invariably doesn't work first go, even when
the build instructions are correct (they often aren't.)

2. A very large number of Windows developers write closed-source
software.  The idea of having customers obtain and compile their
software scares the pants off of them.  If it didn't, they wouldn't
invest so much money in obfuscators.

I hate to be the one to tell you this, but... MS didn't design .NET to
make you happy.  *ducks*

  -- Daniel


Re: OT: on IDEs and code writing on steroids

2009-05-19 Thread Andrei Alexandrescu

dsimcha wrote:

== Quote from Christopher Wright (dhase...@gmail.com)'s article

Nick Sabalausky wrote:

Andrei Alexandrescu seewebsiteforem...@erdani.org wrote in message
news:gus0lu$1sm...@digitalmars.com...


I've repeatedly failed to figure out the coolness of C#, and would
appreciate a few pointers. Or references. Or delegates :o).

Outside of this group, I think most of the people considering C# really cool
are people who are unaware of D and are coming to C# from Java. What's
cool about C# is that it's like a less-shitty version of Java (and *had*
good tools, although the newer versions of VS are almost as much of a
bloated unresponsive mess as Eclipse - Which come to think of it, makes me
wonder - If Java has gotten so fast as many people claim, why is Eclipse
still such a sluggish POS?).

Compare C# to D though and most of the coolness fades, even though there are
still a handful of things I think D could still learn from C# (but there's
probably more than a handful that C# could learn from D).

Generics and reflection. Generics just hide a lot of casts, usually, but
that's still quite useful. And autoboxing is convenient, though not
appropriate for D.


What the heck do you need generics for when you have real templates?  To me,
generics seem like just a lame excuse for templates.


I agree. Then, templates aren't easy to implement and they were 
understandably already busy implementing the using statement.


Andrei

