Re: Had another 48hr game jam this weekend...

2013-09-18 Thread deadalnix

On Wednesday, 18 September 2013 at 16:25:16 UTC, Dicebot wrote:
You can get this information from gcc and use it to define 100% 
correct dependencies. However, it highlights another issue 
common for build tools not coupled with the compiler - to generate 
the dependency graph reliably you now need to parse all the source 
files, which contradicts one of the core motivations for doing it. The 
way the C preprocessor works does not help the matter either.


The idea is to cache the dependencies to not reparse everything.
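
A minimal sketch of the approach being discussed, assuming GNU make and gcc 
with illustrative file names: -MMD makes gcc write, as a side effect of each 
compile, a small make fragment listing the headers that translation unit 
pulled in, and re-including those cached fragments on the next run gives 
correct dependencies without a separate parsing pass (recipe lines below are 
tab-indented, as make requires).

SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)
DEPS := $(OBJS:.o=.d)

app: $(OBJS)
	$(CC) -o $@ $(OBJS)

# -MMD writes foo.d next to foo.o listing the (non-system) headers foo.c
# included; -MP adds phony targets so a deleted header doesn't break the build.
%.o: %.c
	$(CC) -MMD -MP -c $< -o $@

# Reuse the dependency fragments cached from previous builds, if any.
-include $(DEPS)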


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Jacob Carlborg

On 2013-09-18 20:06, H. S. Teoh wrote:


Really? I thought rdmd caches object files. Or does that only apply to
executables?


There are some issues with that. Just search for incremental 
compilation, or something like that, in this newsgroup. It has been 
tried before, notably by Tomaz.


--
/Jacob Carlborg


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 10:32 AM, Peter Alexander wrote:

On Wednesday, 18 September 2013 at 17:25:21 UTC, Andrei Alexandrescu wrote:

But what I meant to say was join me in a place where you get to write D.


Facebook is using D for actual projects? Can you elaborate any more?


Not yet. I mean I am not at liberty to elaborate yet.

Andrei



Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrej Mitrovic
On 9/18/13, H. S. Teoh  wrote:
> I thought rdmd caches object files.

It caches dependency lists.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread H. S. Teoh
On Wed, Sep 18, 2013 at 10:25:57AM -0700, Andrei Alexandrescu wrote:
> On 9/18/13 9:09 AM, Manu wrote:
> >On 19 September 2013 02:05, Andrej Mitrovic wrote:
> >
> >On 9/18/13, Andrei Alexandrescu wrote:
> > >> Problem is, 80% of the code I write is C code still... :(
> > >
> > > join me
> >
> >I don't suppose GCC and other compilers have some kind of verbose
> >output that exports include declarations? Or maybe just modify an RDMD
> >fork to run the preprocessor over a C/C++ file. What I'm saying is, I
> >think a tool like RDMD could be used for C/C++ (with some trickery).
> >
> >
> >rdmd implies rebuild-all every time. It doesn't really scale.
[...]

Really? I thought rdmd caches object files. Or does that only apply to
executables?


T

-- 
Nobody is perfect.  I am Nobody. -- pepoluan, GKC forum


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 11:06 AM, H. S. Teoh wrote:

On Wed, Sep 18, 2013 at 10:25:57AM -0700, Andrei Alexandrescu wrote:

On 9/18/13 9:09 AM, Manu wrote:

On 19 September 2013 02:05, Andrej Mitrovic <andrej.mitrov...@gmail.com> wrote:

On 9/18/13, Andrei Alexandrescu <seewebsiteforem...@erdani.org> wrote:
 >> Problem is, 80% of the code I write is C code still... :(
 >
 > join me

I don't suppose GCC and other compilers have some kind of verbose
output that exports include declarations? Or maybe just modify an RDMD
fork to run the preprocessor over a C/C++ file. What I'm saying is, I
think a tool like RDMD could be used for C/C++ (with some trickery).


rdmd implies rebuild-all every time. It doesn't really scale.

[...]

Really? I thought rdmd caches object files. Or does that only apply to
executables?


rdmd is geared toward building one executable out of several files. It 
uses the all-in-one-cmdline approach. For programs up to medium size 
that's the fastest way. We could provide file-at-a-time as an option.


Andrei
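
A rough sketch of the contrast being described, with hypothetical module names 
(plain dmd invocations, not rdmd's actual internals):

# all-in-one command line: every module is recompiled together whenever
# a rebuild is triggered
dmd -ofapp main.d util.d render.d

# file-at-a-time: each module is compiled separately and then linked,
# so object files for unchanged modules could be reused
dmd -c main.d
dmd -c util.d
dmd -c render.d
dmd -ofapp main.o util.o render.o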



Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 9:05 AM, Andrej Mitrovic wrote:

On 9/18/13, Andrei Alexandrescu  wrote:

Problem is, 80% of the code I write is C code still... :(


join me


I don't suppose GCC and other compilers have some kind of verbose
output that exports include declarations? Or maybe just modify an RDMD
fork to run the preprocessor over a C/C++ file. What I'm saying is, I
think a tool like RDMD could be used for C/C++ (with some trickery).


http://scottmcpeak.com/autodepend/autodepend.html

But what I meant to say was join me in a place where you get to write D.


Andrei



Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Peter Alexander
On Wednesday, 18 September 2013 at 16:24:19 UTC, Adam D. Ruppe 
wrote:
On Wednesday, 18 September 2013 at 16:22:27 UTC, Iain Buclaw 
wrote:

Takes about 30-40 seconds with gdc. ;-)


Yikes.

Though that reminds me of something, I don't use dmd -O very 
often, which is horribly slow too. I guess in game dev, even 
when debugging/toying around, you'd probably want to optimize 
so maybe that is a problem.


Yes, development builds are generally optimised for game dev.

When debugging in MSVC, you can use

#pragma optimize("", off)
...
#pragma optimize("", on)

To selectively de-optimise regions of code... or just get used to 
debugging optimised builds, which isn't that difficult once you 
know what registers to look in ;-)


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Peter Alexander
On Wednesday, 18 September 2013 at 17:25:21 UTC, Andrei 
Alexandrescu wrote:
But what I meant to say was join me in a place where you get to 
write D.


Facebook is using D for actual projects? Can you elaborate any 
more?




Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 9:09 AM, Manu wrote:

On 19 September 2013 02:05, Andrej Mitrovic <andrej.mitrov...@gmail.com> wrote:

On 9/18/13, Andrei Alexandrescu <seewebsiteforem...@erdani.org> wrote:
 >> Problem is, 80% of the code I write is C code still... :(
 >
 > join me

I don't suppose GCC and other compilers have some kind of verbose
output that exports include declarations? Or maybe just modify an RDMD
fork to run the preprocessor over a C/C++ file. What I'm saying is, I
think a tool like RDMD could be used for C/C++ (with some trickery).


rdmd implies rebuild-all every time. It doesn't really scale.
If you have a few hundred thousand LOC, it's probably not the man for the
job.


That is correct.

Andrei


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 9:15 AM, Adam D. Ruppe wrote:

On Wednesday, 18 September 2013 at 16:09:46 UTC, Manu wrote:

rdmd implies rebuild-all every time. It doesn't really scale.


Have you actually tried it? My biggest D program is coming up on 100,000
lines of code, plus phobos, and I still compile it all at once, every
time. The compile time is ~15 seconds, slower than I'd like (due to ctfe
more than anything else) but not bad.


Interesting! I'm pleasantly surprised.

Andrei



Re: Had another 48hr game jam this weekend...

2013-09-18 Thread H. S. Teoh
On Wed, Sep 18, 2013 at 06:36:19PM +0200, David Eagen wrote:
> On Wednesday, 18 September 2013 at 15:43:21 UTC, Manu wrote:
> 
> >I've had lots of problems in the past where a header included by
> >another header doesn't prompt the dependent code to be rebuilt, and I
> >ended up in a conservative state of rebuild-all-ing every time... :/
> >Maybe I should try working with that environment more...
> >
> 
> This is what makes tup so interesting to me. It tracks every file that
> is read in during a compile so it knows what to recompile when a
> particular header changes.

+1.

Anyone who's ever had to 'make clean; make' just to make sure everything
was up-to-date seriously should ditch make and use a real build system
like tup. SCons is also pretty good, though algorithm-wise, tup is
probably better.


T

-- 
Just because you can, doesn't mean you should.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Paulo Pinto

Am 18.09.2013 18:24, schrieb Adam D. Ruppe:

On Wednesday, 18 September 2013 at 16:22:27 UTC, Iain Buclaw wrote:

Takes about 30-40 seconds with gdc. ;-)


Yikes.

Though that reminds me of something, I don't use dmd -O very often,
which is horribly slow too. I guess in game dev, even when
debugging/toying around, you'd probably want to optimize so maybe that
is a problem.


I think, ideally we should have Turbo Pascal/Delphi compilation times, 
without requiring dependencies to be available as text files.


--
Paulo


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread David Eagen

On Wednesday, 18 September 2013 at 15:43:21 UTC, Manu wrote:

I've had lots of problems in the past where a header included 
by another
header doesn't prompt the dependent code to be rebuilt, and I 
ended up in a

conservative state of rebuild-all-ing every time... :/
Maybe I should try working with that environment more...



This is what makes tup so interesting to me. It tracks every file 
that is read in during a compile so it knows what to recompile 
when a particular header changes.
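
For reference, a minimal Tupfile sketch under the assumption of plain gcc and 
C sources; tup records every file each command reads while it runs, so the 
rule itself never has to list headers:

# compile each .c into an .o; %f is the input, %o the output, %B its basename
: foreach *.c |> gcc -c %f -o %o |> %B.o
# link all the objects into the program
: *.o |> gcc %f -o %o |> app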


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Adam D. Ruppe
On Wednesday, 18 September 2013 at 16:22:27 UTC, Iain Buclaw 
wrote:

Takes about 30-40 seconds with gdc. ;-)


Yikes.

Though that reminds me of something, I don't use dmd -O very 
often, which is horribly slow too. I guess in game dev, even when 
debugging/toying around, you'd probably want to optimize so maybe 
that is a problem.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Dicebot
On Wednesday, 18 September 2013 at 16:05:18 UTC, Andrej Mitrovic 
wrote:
On 9/18/13, Andrei Alexandrescu  
wrote:

Problem is, 80% of the code I write is C code still... :(


join me


I don't suppose GCC and other compilers have some kind of 
verbose
output that exports include declarations? Or maybe just modify 
an RDMD
fork to run the preprocessor over a C/C++ file. What I'm saying 
is, I
think a tool like RDMD could be used for C/C++ (with some 
trickery).


You can get this information from gcc and use it to define 100% 
correct dependencies. However, it highlights another issue common 
for build tools not coupled with the compiler - to generate the 
dependency graph reliably you now need to parse all the source 
files, which contradicts one of the core motivations for doing it. The way 
the C preprocessor works does not help the matter either.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrej Mitrovic
On 9/18/13, Andrei Alexandrescu  wrote:
>> Problem is, 80% of the code I write is C code still... :(
>
> join me

I don't suppose GCC and other compilers have some kind of verbose
output that exports include declarations? Or maybe just modify an RDMD
fork to run the preprocessor over a C/C++ file. What I'm saying is, I
think a tool like RDMD could be used for C/C++ (with some trickery).


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 19 September 2013 01:46, Andrei Alexandrescu <
seewebsiteforem...@erdani.org> wrote:

> On 9/18/13 8:44 AM, Manu wrote:
>
>> On 19 September 2013 01:04, Andrei Alexandrescu
>> > > >>
>>
>> wrote:
>>
>> On 9/18/13 6:54 AM, Wyatt wrote:
>>
>> On Wednesday, 18 September 2013 at 11:45:55 UTC, Manu wrote:
>> ?
>>
>> The problem I've always had with make-based build systems is
>> rebuild
>> dependencies... how do any of those build systems go
>> performing a
>> minimal rebuild,
>>
>>
>> As in "only rebuild the files that changed"?  In my experience,
>> that
>> comes with using make.  Even really ancient versions.
>>
>>
>> Plus rdmd --make-depend, yum.
>>
>>
>> I actually only just realised how cool rdmd was last weekend.
>> Never occurred to me really to use it before. Very handy!
>>
>
> yay 2 dat
>
>
>  Problem is, 80% of the code I write is C code still... :(
>>
>
> join me


You never know... it could happen.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 19 September 2013 02:05, Andrej Mitrovic wrote:

> On 9/18/13, Andrei Alexandrescu  wrote:
> >> Problem is, 80% of the code I write is C code still... :(
> >
> > join me
>
> I don't suppose GCC and other compilers have some kind of verbose
> output that exports include declarations? Or maybe just modify an RDMD
> fork to run the preprocessor over a C/C++ file. What I'm saying is, I
> think a tool like RDMD could be used for C/C++ (with some trickery).
>

rdmd implies rebuild-all every time. It doesn't really scale.
If you have a few hundred thousand LOC, it's probably not the man for the
job.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Iain Buclaw
On 18 September 2013 17:15, Adam D. Ruppe  wrote:
> On Wednesday, 18 September 2013 at 16:09:46 UTC, Manu wrote:
>>
>> rdmd implies rebuild-all every time. It doesn't really scale.
>
>
> Have you actually tried it? My biggest D program is coming up on 100,000
> lines of code, plus phobos, and I still compile it all at once, every time.
> The compile time is ~15 seconds, slower than I'd like (due to ctfe more than
> anything else) but not bad.
>
> Compiling all of phobos at once, > 100,000 lines of D, takes about 2 seconds
> on my computer, excluding linking.

Takes about 30-40 seconds with gdc. ;-)

-- 
Iain Buclaw

*(p < e ? p++ : p) = (c & 0x0f) + '0';


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Adam D. Ruppe

On Wednesday, 18 September 2013 at 16:09:46 UTC, Manu wrote:

rdmd implies rebuild-all every time. It doesn't really scale.


Have you actually tried it? My biggest D program is coming up on 
100,000 lines of code, plus phobos, and I still compile it all at 
once, every time. The compile time is ~15 seconds, slower than 
I'd like (due to ctfe more than anything else) but not bad.


Compiling all of phobos at once, > 100,000 lines of D, takes 
about 2 seconds on my computer, excluding linking.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 8:44 AM, Manu wrote:

On 19 September 2013 01:04, Andrei Alexandrescu
<seewebsiteforem...@erdani.org>
wrote:

On 9/18/13 6:54 AM, Wyatt wrote:

On Wednesday, 18 September 2013 at 11:45:55 UTC, Manu wrote:
?

The problem I've always had with make-based build systems is
rebuild
dependencies... how do any of those build systems go
performing a
minimal rebuild,


As in "only rebuild the files that changed"?  In my experience, that
comes with using make.  Even really ancient versions.


Plus rdmd --make-depend, yum.


I actually only just realised how cool rdmd was last weekend.
Never occurred to me really to use it before. Very handy!


yay 2 dat


Problem is, 80% of the code I write is C code still... :(


join me


Andrei



Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 19 September 2013 01:04, Andrei Alexandrescu <
seewebsiteforem...@erdani.org> wrote:

> On 9/18/13 6:54 AM, Wyatt wrote:
>
>> On Wednesday, 18 September 2013 at 11:45:55 UTC, Manu wrote:
>> ?
>>
>>> The problem I've always had with make-based build systems is rebuild
>>> dependencies... how do any of those build systems go performing a
>>> minimal rebuild,
>>>
>>
>> As in "only rebuild the files that changed"?  In my experience, that
>> comes with using make.  Even really ancient versions.
>>
>
> Plus rdmd --make-depend, yum.


I actually only just realised how cool rdmd was last weekend.
Never occurred to me really to use it before. Very handy!

Problem is, 80% of the code I write is C code still... :(


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 18 September 2013 23:54, Wyatt  wrote:

> On Wednesday, 18 September 2013 at 11:45:55 UTC, Manu wrote:
> ?
>
>> The problem I've always had with make-based build systems is rebuild
>> dependencies... how do any of those build systems go performing a minimal
>> rebuild,
>>
>
> As in "only rebuild the files that changed"?  In my experience, that comes
> with using make.  Even really ancient versions.


I've had lots of problems in the past where a header included by another
header doesn't prompt the dependent code to be rebuilt, and I ended up in a
conservative state of rebuild-all-ing every time... :/
Maybe I should try working with that environment more...


>  or incremental linking?
>>
>
> This is a bit harder.  If you're using the Gold linker, -Wl,--incremental
> sounds like your medicine, though I'm not sure if Gold is ready for
> primetime, yet.


How does that work? Can you re-link while paused in the debugger, and then
continue with the new code?


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Andrei Alexandrescu

On 9/18/13 6:54 AM, Wyatt wrote:

On Wednesday, 18 September 2013 at 11:45:55 UTC, Manu wrote:
?

The problem I've always had with make-based build systems is rebuild
dependencies... how do any of those build systems go performing a
minimal rebuild,


As in "only rebuild the files that changed"?  In my experience, that
comes with using make.  Even really ancient versions.


Plus rdmd --make-depend, yum.

Andrei



Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Joseph Rushton Wakeling

On 18/09/13 15:54, Wyatt wrote:

As in "only rebuild the files that changed"?  In my experience, that comes with
using make.  Even really ancient versions.


More like, top-down vs. bottom-up resolution of dependencies.  E.g. suppose 
you've built GDC from source -- this means you've built both the GCC backend 
(and C/C++ compilers etc.) and the D frontend.


Now you pull in the latest updates to the frontend, and you want to rebuild. 
Theoretically, you shouldn't need to care about the backend at all -- it's 
already built -- and all that you should need to do is rebuild the frontend and 
the resulting GDC executable and libraries.


In practice, the GCC make process will trawl through all the different 
subdirectories related to the backend -- it'll find that nothing needs to be 
rebuilt, but it takes quite some time to check all of that, when you'd like your 
build system to be smart enough to avoid that kind of bottom-up checking.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Wyatt

On Wednesday, 18 September 2013 at 11:45:55 UTC, Manu wrote:
?
The problem I've always had with make-based build systems is 
rebuild
dependencies... how do any of those build systems go performing 
a minimal rebuild,


As in "only rebuild the files that changed"?  In my experience, 
that comes with using make.  Even really ancient versions.



or incremental linking?


This is a bit harder.  If you're using the Gold linker, 
-Wl,--incremental sounds like your medicine, though I'm not sure 
if Gold is ready for primetime, yet.


-Wyatt
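
A sketch of how that flag is typically passed through the compiler driver, 
assuming gcc with gold available (object names are illustrative, and gold's 
incremental mode was still fairly experimental at the time):

# first link reserves extra space so later links can patch the output in place
g++ -fuse-ld=gold -Wl,--incremental -o game main.o render.o physics.o
# after recompiling one object, the same command updates the existing
# executable instead of rewriting it from scratch
g++ -fuse-ld=gold -Wl,--incremental -o game main.o render.o physics.o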


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Joseph Rushton Wakeling

On 18/09/13 14:56, Dicebot wrote:

On Wednesday, 18 September 2013 at 12:55:14 UTC, Dicebot wrote:

Most frustrating thing here is that that `proposed` repo has
4.0.5 while Mono-D will only work on 4.1.7+ (judging by
http://mono-d.alexanderbothe.com/) :)


P.S.

I guess that makes Arch Linux 4.0.12 package unusable too. Love
that versioning hell.


That's annoying.  A minor version number bump shouldn't render plugins etc. 
incompatible.  It doesn't say many good things about the development process of 
MonoDevelop ... :-(


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Dicebot

On Wednesday, 18 September 2013 at 12:55:14 UTC, Dicebot wrote:

On Wednesday, 18 September 2013 at 12:47:43 UTC, Joseph Rushton
Wakeling wrote:

MonoDevelop 4 is in the "proposed" repository of Ubuntu 13.10:
https://launchpad.net/ubuntu/saucy/+source/monodevelop

... which I hope means that it will make it into the regular 
repositories of the final release next month.  It's also in 
Debian Unstable:

http://packages.debian.org/sid/monodevelop


Most frustrating thing here is that that `proposed` repo has
4.0.5 while Mono-D will only work on 4.1.7+ (judging by
http://mono-d.alexanderbothe.com/) :)


P.S.

I guess that makes Arch Linux 4.0.12 package unusable too. Love
that versioning hell.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Dicebot

On Wednesday, 18 September 2013 at 12:47:43 UTC, Joseph Rushton
Wakeling wrote:

MonoDevelop 4 is in the "proposed" repository of Ubuntu 13.10:
https://launchpad.net/ubuntu/saucy/+source/monodevelop

... which I hope means that it will make it into the regular 
repositories of the final release next month.  It's also in 
Debian Unstable:

http://packages.debian.org/sid/monodevelop


Most frustrating thing here is that that `proposed` repo has
4.0.5 while Mono-D will only work on 4.1.7+ (judging by
http://mono-d.alexanderbothe.com/) :)


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Dicebot

On Wednesday, 18 September 2013 at 12:44:07 UTC, Manu wrote:

So... is MonoDevelop effectively a one-man project?


https://github.com/mono/monodevelop/graphs/contributors ;)


How is it that it doesn't have a Debian distribution? It's the most 
popular form of linux by far.


There is a package for the older version. The version 4 branch is rather 
new, and it will take some time until maintainers of non-bleeding-edge 
Linux distributions consider it stable enough to be allowed into 
official repositories.


It is a common approach on its own. The problem is the plugin API 
incompatibility they introduce, which does not allow Alex to maintain 
Mono-D for the older version without effectively spending twice the 
time/effort. I don't know the details, but he was ranting about it 
quite a lot :)


There is a certain unpleasant similarity between MonoDevelop and 
DMD development process :D


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Joseph Rushton Wakeling

On 18/09/13 14:37, Dicebot wrote:

You shouldn't unless you are going to verify source and build package on your
own ;) AFAIK there is no official MonoDevelop 4 package for Debian-based distros
and one won't appear in any time soon. Alex provides latest suitable MonoDevelop
as blob via http://simendsjo.me/files/abothe (with generous help of simendsjo
for hosting)


MonoDevelop 4 is in the "proposed" repository of Ubuntu 13.10:
https://launchpad.net/ubuntu/saucy/+source/monodevelop

... which I hope means that it will make it into the regular repositories of the 
final release next month.  It's also in Debian Unstable:

http://packages.debian.org/sid/monodevelop


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 18 September 2013 22:37, Dicebot  wrote:

> On Wednesday, 18 September 2013 at 07:13:01 UTC, Manu wrote:
>
>> That looks thoroughly unofficial. Why should I trust it?
>>
>
> You shouldn't unless you are going to verify source and build package on
> your own ;) AFAIK there is no official MonoDevelop 4 package for
> Debian-based distros and one won't appear in any time soon. Alex provides
> latest suitable MonoDevelop as blob via 
> http://simendsjo.me/files/abothe (with
> generous help of simendsjo for hosting)
>
> I personally can only recommend using a proper bleeding-edge Linux
> distribution in that regard :P
>
> P.S. It is a disaster how much trouble MonoDevelop upstream is causing for
> Mono-D judging by Alex posts. Breaking plugin API in minor releases all the
> time.
>

So... is MonoDevelop effectively a one-man project? How is it that it
doesn't have a Debian distribution? It's the most popular form of linux by
far.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread PauloPinto

On Wednesday, 18 September 2013 at 12:37:28 UTC, Dicebot wrote:

On Wednesday, 18 September 2013 at 07:13:01 UTC, Manu wrote:

That looks thoroughly unofficial. Why should I trust it?


You shouldn't unless you are going to verify source and build 
package on your own ;) AFAIK there is no official MonoDevelop 4 
package for Debian-based distros and one won't appear in any 
time soon. Alex provides latest suitable MonoDevelop as blob 
via http://simendsjo.me/files/abothe (with generous help of 
simendsjo for hosting)


I personally can only recommend using a proper bleeding-edge 
Linux distribution in that regard :P


P.S. It is a disaster how much trouble MonoDevelop upstream is 
causing for Mono-D judging by Alex posts. Breaking plugin API 
in minor releases all the time.


Well, Miguel is no longer interested in Linux I guess.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Dicebot

On Wednesday, 18 September 2013 at 07:13:01 UTC, Manu wrote:

That looks thoroughly unofficial. Why should I trust it?


You shouldn't, unless you are going to verify the source and build 
the package on your own ;) AFAIK there is no official MonoDevelop 4 
package for Debian-based distros and one won't appear any time 
soon. Alex provides the latest suitable MonoDevelop as a blob via 
http://simendsjo.me/files/abothe (with the generous help of simendsjo 
for hosting)


I personally can only recommend using a proper bleeding-edge Linux 
distribution in that regard :P


P.S. It is a disaster how much trouble MonoDevelop upstream is 
causing for Mono-D, judging by Alex's posts. Breaking the plugin API 
in minor releases all the time.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 18 September 2013 21:45, Manu  wrote:

> On 18 September 2013 19:44, Joseph Rushton Wakeling <
> joseph.wakel...@webdrake.net> wrote:
>
>> On 18/09/13 02:45, Manu wrote:
>>
>>> The extent of my experience with QtCreator is that it has a button
>>> "Generate VS
>>> Project" in the menu, which I clicked on, and then I opened visual
>>> studio.
>>> For the few moments that I used it, I was surprised by how
>>> un-like-eclipse it
>>> looked :)
>>> Maybe it's decent? What's its underlying project/build system?
>>> I have ex-trolltech mates who are now all unemployed... so what's the
>>> future of
>>> the tool?
>>>
>>
>> I had the impression the future was very positive, ever since Digia
>> acquired Qt from Nokia a couple of years back.
>>
>> AFAIK it can use make, cmake or qmake as project build systems and can
>> use GDB, CDB (Microsoft?) and Valgrind for debugging.  I don't have enough
>> IDE experience to really make any judgement, but I wondered if it might
>> meet your needs as a light, cross-platform IDE.
>>
>
> The problem I've always had with make-based build systems is rebuild
> dependencies... how do any of those build systems go performing a minimal
> rebuild, or incremental linking?
>

And of course their edit-and-continue support to update a binary while
debugging and continue debugging the edited binary with your code tweak (an
extension from incremental linking)...


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 18 September 2013 19:44, Joseph Rushton Wakeling <
joseph.wakel...@webdrake.net> wrote:

> On 18/09/13 02:45, Manu wrote:
>
>> The extent of my experience with QtCreator is that it has a button
>> "Generate VS
>> Project" in the menu, which I clicked on, and then I opened visual studio.
>> For the few moments that I used it, I was surprised by how
>> un-like-eclipse it
>> looked :)
>> Maybe it's decent? What's its underlying project/build system?
>> I have ex-trolltech mates who are now all unemployed... so what's the
>> future of
>> the tool?
>>
>
> I had the impression the future was very positive, ever since Digia
> acquired Qt from Nokia a couple of years back.
>
> AFAIK it can use make, cmake or qmake as project build systems and can use
> GDB, CDB (Microsoft?) and Valgrind for debugging.  I don't have enough IDE
> experience to really make any judgement, but I wondered if it might meet
> your needs as a light, cross-platform IDE.
>

The problem I've always had with make-based build systems is rebuild
dependencies... how do any of those build systems go performing a minimal
rebuild, or incremental linking?


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread PauloPinto

On Wednesday, 18 September 2013 at 07:59:43 UTC, Manu wrote:
On 18 September 2013 17:45, PauloPinto  
wrote:




It is mostly C# actually.

The VS 2010 rewrite was a way to fix WPF bugs and prove to the 
developers at large that big applications could be done in WPF.

As far as I can tell from MSDN blogs, the only C++ bits left 
standing were

the ones related to C++ development.

This most likely changed with the whole "Going Native" story 
afterwards.




It's clear that MS have no idea WTF they're doing with 
VisualStudio, that's
been clear for half a decade... I'm just waiting for a viable 
alternative

to emerge.
Still nothing...


I think what is driving them is all very political.

With Longhorn, there was the plan to make the OS .NET based, 
similar to how OS/400 works and, to a certain extent, how Android 
and WP7 were done.


As the Longhorn project was rebooted and Vista came out, whatever 
problems the teams were having with the Longhorn rewrite of Win32 
into .NET were attributed to the tooling.


As we all know from our jobs, it is easier to blame the tooling 
than the people.


As such, the native tools group inside Microsoft felt empowered 
and started pushing in the back-to-native direction we see 
nowadays.


Even the WinRT runtime is nothing new, it was actually developed 
in 1999.


Microsoft Research proposed a language neutral COM runtime, which 
eventually became .NET instead.


http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-components-postattachments/00-10-32-72-38/Ext_2D00_VOS.pdf

Full story here, 
http://blogs.msdn.com/b/dsyme/archive/2012/07/05/more-c-net-generics-history-the-msr-white-paper-from-mid-1999.aspx


I have no idea how far from or close to the truth this is, but it is 
my gut feeling about how these events developed.


Typical enterprise political games, which affect everyone that 
wants to target Windows as well.


--
Paulo


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Joseph Rushton Wakeling

On 18/09/13 02:45, Manu wrote:

The extent of my experience with QtCreator is that it has a button "Generate VS
Project" in the menu, which I clicked on, and then I opened visual studio.
For the few moments that I used it, I was surprised by how un-like-eclipse it
looked :)
Maybe it's decent? What's its underlying project/build system?
I have ex-trolltech mates who are now all unemployed... so what's the future of
the tool?


I had the impression the future was very positive, ever since Digia acquired Qt 
from Nokia a couple of years back.


AFAIK it can use make, cmake or qmake as project build systems and can use GDB, 
CDB (Microsoft?) and Valgrind for debugging.  I don't have enough IDE experience 
to really make any judgement, but I wondered if it might meet your needs as a 
light, cross-platform IDE.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 18 September 2013 17:45, PauloPinto  wrote:

>
> It is mostly C# actually.
>
> The VS 2010 rewrite was a way to fix WPF bugs and prove to the developers at
> large that big applications could be done in WPF.
>
> As far as I can tell from MSDN blogs, the only C++ bits left standing were
> the ones related to C++ development.
>
> This most likely changed with the whole "Going Native" story afterwards.
>

It's clear that MS have no idea WTF they're doing with VisualStudio, that's
been clear for half a decade... I'm just waiting for a viable alternative
to emerge.
Still nothing...


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread PauloPinto
On Tuesday, 17 September 2013 at 15:15:09 UTC, Bruno Medeiros 
wrote:

On 17/09/2013 15:19, Manu wrote:

On 17 September 2013 23:46, Bruno Medeiros wrote:

On 17/09/2013 07:24, Manu wrote:

I closed about half my open tabs after my last email (~50 left open). Down
to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game engine with over
500 source files, hundreds of thousands of LOC. Intellisense info for all
of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with no project open
at all...

4 times ? You must have a pretty light instance of eclipse !

It's a fairly fresh eclipse install, and I just boot it up. It showed
the home screen, no project loaded. It was doing absolutely nothing and
well into 400mb.
When I do use it for android and appengine, it more or less works well
enough, but the UI feels like it's held together with stickytape and
glue, and it's pretty sluggish. Debugging (native code) is slow and
clunky. How can I take that software seriously?
I probably waste significant portion of my life hovering and waiting for
eclipse to render the pop-up variable inspection windows. That shit
needs to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to appear on the
screen...

Android and Appengine?
There are two flaws in that comparison, the first is that apparently
you are comparing an Eclipse installation with a lot more tools than
your VS installation (which I'm guessing has only C++ tools, perhaps
some VCS tools too?). No wonder the footprint is bigger. For example,
my Eclipse instance with only DDT and Git installed, and opened on a
workspace with D projects takes up 130Mb:
http://i.imgur.com/VmKzrRU.png

My VS installation has VisualD, VCS tools, xbox 360, ps3, android,
emsscripten, nacl, clang and gcc tools. (I don't think these offer any
significant resource burden though, they're not really active processes)
If Eclipse has a lot more tools as you say, then it's a problem is that
I never selected them, and apparently they hog resources even when not
being used. That seems like a serious engineering fail if that's the case.
As far as I know, I don't have DDT and git installed, so you're 2 up on
me :) .. I only have android beyond default install (and no project was
open). No appengine in this installation.

Eclipse is designed such that plugins should be lazy-loaded: they are
only loaded when needed (for example if you open a
view/editor/preference-page/project, etc., contributed from a plugin).
But that requires the plugin to be well-behaved in that regard, and
some plugins might not be so.
I'm not familiar at all with the Eclipse Android plugins or AppEngine
plugins so I have no idea how these behave performance-wise. I can't
comment on that. Again, it should be noted that Eclipse is not a
monolithic application, and a lot of things are going to depend on what
plugins/tools you have installed. (neither is VisualStudio a monolithic
application, but I would argue that Eclipse has more plugins and
extensions available, and thus more variation in setup and quality of
installations)

With the recommended JVM memory settings (see
http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ), the
usage in that startup scenario goes up to 180Mb.
But even so that is not a fair comparison, the second flaw here is
that Eclipse is running on a VM, and is not actually using all the
memory that is taken from the OS.

It's perfectly fair. Let's assume for a second that I couldn't care less
that it runs in a VM (I couldn't), all you're really saying is that VM's
are effectively a waste of memory and performance, and that doesn't
redeem Eclipse in any way.
You're really just suggesting that Eclipse may be inherently inefficient
because it's lynched by it's VM. So there's no salvation for it? :)

If you wanna see how much memory the Java application itself is
using for its data structures, you have to use a tool like jconsole
(included in the JDK) to check out JVM stats. For example, in the
DDT scenario above, after startup the whole of Eclipse is just using
just 40Mb for the Java heap:
http://i.imgur.com/yCPtS52.png

I don't care how much memory the app is 'really' using beneath it's
overhead. All I care about is how much memory it's using (actually, I
don't really care about that at all, I only care about how it performs,
which is poorly).

Re: Had another 48hr game jam this weekend...

2013-09-18 Thread PauloPinto

On Wednesday, 18 September 2013 at 00:45:58 UTC, Manu wrote:

On 18 September 2013 00:35, Joseph Rushton Wakeling <
joseph.wakel...@webdrake.net> wrote:


On 17/09/13 16:19, Manu wrote:

I had some experience with kdevelop this past weekend trying 
to find a
reasonable working environment on linux. It's fairly nice. 
Certainly come

along
since I last tried to take it seriously a year or 2 back.
It would be nice if there was D support though. It has 
rudimentary

support that
some whipped up, but it could do a lot better.



Do you have any experience/opinion on Qt Creator as an IDE?  
My impression
is that it's nice in and of itself but limited compared to 
others in the
range of languages/tools it supports -- but it's a very 
superficial

impression so may be wrong.



The extent of my experience with QtCreator is that it has a 
button
"Generate VS Project" in the menu, which I clicked on, and then 
I opened

visual studio.
For the few moments that I used it, I was surprised by how 
un-like-eclipse it looked :)
Maybe it's decent? What's its underlying project/build system?
I have ex-trolltech mates who are now all unemployed... so 
what's the

future of the tool?


It is the official IDE for Qt development, so I guess while Digia 
has customers it will be developed.


Re: Had another 48hr game jam this weekend...

2013-09-18 Thread Manu
On 18 September 2013 14:09, Dicebot  wrote:

> On Wednesday, 18 September 2013 at 00:47:09 UTC, Manu wrote:
>
>> Mint15... Link me? I can't find it.
>>
>
> Mint is Debian/Ubuntu based so you should look at PPA mentioned above:


That looks thoroughly unofficial. Why should I trust it?
MonoDevelop's website points to badgerports which is thoroughly out of
date: http://monodevelop.com/Download

My friend also found this:
http://software.opensuse.org/download/package?project=home:tpokorra:mono&package=monodevelop-opt

I'm not sure what to trust if I can't even trust the official website...

>> If you're on some sort of ubuntu variant:
>> https://launchpad.net/~keks9n/+archive/monodevelop-latest
>>
>


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Dicebot

On Wednesday, 18 September 2013 at 00:47:09 UTC, Manu wrote:

Mint15... Link me? I can't find it.


Mint is Debian/Ubuntu based so you should look at PPA mentioned 
above:


If you're on some sort of ubuntu variant: 
https://launchpad.net/~keks9n/+archive/monodevelop-latest


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Manu
On 18 September 2013 00:40, Dicebot  wrote:

> On Tuesday, 17 September 2013 at 14:20:03 UTC, Manu wrote:
>
>> Can any linux MonoDevelop user enlighten me on how to use MonoDevelop4 on
>> linux? I couldn't find a package for it anywhere... only MD3. It seems
>> linux MD is way behind... no idea why.
>>
>
> Your distro? It is 4.0.12 in Arch Linux official repository :P
>

Mint15... Link me? I can't find it.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Manu
On 18 September 2013 00:35, Joseph Rushton Wakeling <
joseph.wakel...@webdrake.net> wrote:

> On 17/09/13 16:19, Manu wrote:
>
>> I had some experience with kdevelop this past weekend trying to find a
>> reasonable working environment on linux. It's fairly nice. Certainly come
>> along
>> since I last tried to take it seriously a year or 2 back.
>> It would be nice if there was D support though. It has rudimentary
>> support that
>> some whipped up, but it could do a lot better.
>>
>
> Do you have any experience/opinion on Qt Creator as an IDE?  My impression
> is that it's nice in and of itself but limited compared to others in the
> range of languages/tools it supports -- but it's a very superficial
> impression so may be wrong.
>

The extent of my experience with QtCreator is that it has a button
"Generate VS Project" in the menu, which I clicked on, and then I opened
visual studio.
For the few moments that I used it, I was surprised by how un-like-eclipse
it looked :)
Maybe it's decent? What's its underlying project/build system?
I have ex-trolltech mates who are now all unemployed... so what's the
future of the tool?


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Arjan
On Tue, 17 Sep 2013 17:15:12 +0200, Bruno Medeiros  
 wrote:


That said, I do agree that Eclipse is generally slower than Visual  
Studio. Eclipse (and most existing plugins) are almost entirely  
Java-based, which has some JIT and GC overheards. Whereas VS is done in  
C++ and C# (I'm guessing a lot of critical bits are developed in C++, at  
least for VS2010).
But I don't think Eclipse is as bad or as slow as a lot of people make  
it appear to be. And better functionality would make up for it, for sure.


+1

It is slower, but IMO not unbearable slower.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread John Colvin

On Tuesday, 17 September 2013 at 14:20:03 UTC, Manu wrote:
On 17 September 2013 23:46, Bruno Medeiros 
wrote:



On 17/09/2013 07:24, Manu wrote:



I closed about half my open tabs after my last email 
(~50 left

open). Down
to 93mb. You must all use some heavy plugins or 
something.
My current solution has 10 projects, one is an entire 
game

engine with over
500 source files, hundreds of thousands of LOC. 
Intellisense

info for all
of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory 
idling with no

project open
at all...


4 times ? You must have a pretty light instance of 
eclipse !



It's a fairly fresh eclipse install, and I just boot it up. 
It showed
the home screen, no project loaded. It was doing absolutely 
nothing and

well into 400mb.
When I do use it for android and appengine, it more or less 
works well
enough, but the UI feels like it's held together with 
stickytape and
glue, and it's pretty sluggish. Debugging (native code) is 
slow and

clunky. How can I take that software seriously?
I probably waste significant portion of my life hovering and 
waiting for
eclipse to render the pop-up variable inspection windows. 
That shit
needs to be instant, no excuse. It's just showing a value 
from ram.
Then I press a key, it doesn't take ages for the letter to 
appear on the

screen...



Android and Appengine?
There are two flaws in that comparison, the first is that 
apparently you
are comparing an Eclipse installation with a lot more tools 
than your VS
installation (which I'm guessing has only C++ tools, perhaps 
some VCS tools
too?). No wonder the footprint is bigger. For example, my 
Eclipse instance
with only DDT and Git installed, and opened on a workspace 
with D projects

takes up 130Mb:
http://i.imgur.com/VmKzrRU.png



My VS installation has VisualD, VCS tools, xbox 360, ps3, 
android,
emsscripten, nacl, clang and gcc tools. (I don't think these 
offer any
significant resource burden though, they're not really active 
processes)
If Eclipse has a lot more tools as you say, then it's a problem 
is that I
never selected them, and apparently they hog resources even 
when not being
used. That seems like a serious engineering fail if that's the 
case.
As far as I know, I don't have DDT and git installed, so you're 
2 up on me
:) .. I only have android beyond default install (and no 
project was open).

No appengine in this installation.

With the recommended JVM memory settings (see 
http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ), 
the usage in that startup scenario goes up to 180Mb.
But even so that is not a fair comparison, the second flaw 
here is that
Eclipse is running on a VM, and is not actually using all the 
memory that

is taken from the OS.



It's perfectly fair. Let's assume for a second that I couldn't 
care less
that it runs in a VM (I couldn't), all you're really saying is 
that VM's
are effectively a waste of memory and performance, and that 
doesn't redeem

Eclipse in any way.
You're really just suggesting that Eclipse may be inherently 
inefficient
because it's lynched by it's VM. So there's no salvation for 
it? :)


If you wanna see how much memory the Java application itself is 
using for
its data structures, you have to use a tool like jconsole 
(included in the
JDK) to check out JVM stats. For example, in the DDT scenario 
above, after
startup the whole of Eclipse is just using just 40Mb for the 
Java heap:

http://i.imgur.com/yCPtS52.png



I don't care how much memory the app is 'really' using beneath 
it's
overhead. All I care about is how much memory it's using 
(actually, I don't
really care about that at all, I only care about how it 
performs, which is
poorly), and the windows task manager surely offers the most 
fair measure
for comparison available to the OS, at least for the memory 
consumption
metric ;) .. The problem remains that I find eclipse 
significantly less
responsive, and the UI is messy and disorganised. I feel a lack 
of

coherency between different parts of Eclipse.
So in summary, I prefer and use VS whenever I have the option.

I had some experience with kdevelop this past weekend trying to 
find a
reasonable working environment on linux. It's fairly nice. 
Certainly come

along since I last tried to take it seriously a year or 2 back.
It would be nice if there was D support though. It has 
rudimentary support

that some whipped up, but it could do a lot better.

Can any linux MonoDevelop user enlighten me on how to use 
MonoDevelop4 on
linux? I couldn't find a package for it anywhere... only MD3. 
It seems

linux MD is way behind... no idea why.


If you're on some sort of ubuntu variant: 
https://launchpad.net/~keks9n/+archive/monodevelop-latest


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Bruno Medeiros

On 17/09/2013 15:33, PauloPinto wrote:

On Tuesday, 17 September 2013 at 13:46:43 UTC, Bruno Medeiros wrote:

On 17/09/2013 07:24, Manu wrote:


   I closed about half my open tabs after my last email (~50 left
   open). Down
   to 93mb. You must all use some heavy plugins or something.
   My current solution has 10 projects, one is an entire game
   engine with over
   500 source files, hundreds of thousands of LOC. Intellisense
   info for all
   of it... dunno what to tell you.
   Eclipse uses more than 4 times that much memory idling with no
   project open
   at all...


   4 times ? You must have a pretty light instance of eclipse !


It's a fairly fresh eclipse install, and I just boot it up. It showed
the home screen, no project loaded. It was doing absolutely nothing and
well into 400mb.
When I do use it for android and appengine, it more or less works well
enough, but the UI feels like it's held together with stickytape and
glue, and it's pretty sluggish. Debugging (native code) is slow and
clunky. How can I take that software seriously?
I probably waste significant portion of my life hovering and waiting for
eclipse to render the pop-up variable inspection windows. That shit
needs to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to appear on the
screen...


Android and Appengine?
There are two flaws in that comparison, the first is that apparently
you are comparing an Eclipse installation with a lot more tools than
your VS installation (which I'm guessing has only C++ tools, perhaps
some VCS tools too?). No wonder the footprint is bigger. For example,
my Eclipse instance with only DDT and Git installed, and opened on a
workspace with D projects takes up 130Mb:
http://i.imgur.com/VmKzrRU.png

With the recommend JVM memory settings (see
http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ), the
usage in that startup scenario goes up to 180Mb.
But even so that is not a fair comparison, the second flaw here is
that Eclipse is running on a VM, and is not actually using all the
memory that is taken from the OS.

If you wanna see how much memory the Java application itself is using
for its data structures, you have to use a tool like jconsole
(included in the JDK) to check out JVM stats. For example, in the DDT
scenario above, after startup the whole of Eclipse is just using just
40Mb for the Java heap:
http://i.imgur.com/yCPtS52.png


VS is also running in a VM as it is mostly a C# application nowadays,
since the WPF rewrite done to 2010.

--
Paulo


The point was not so much that it is running on a VM, but that the 
actual memory in use is much less than the memory taken from the OS. 
(The same thing could happen with a non-VM process.)



--
Bruno Medeiros - Software Engineer
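
For reference, the JVM memory settings being referred to go in eclipse.ini 
after the -vmargs marker; the values below are illustrative assumptions only, 
not the ones from the linked guide:

-vmargs
-Xms256m
-Xmx1024m
-XX:MaxPermSize=256m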


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Bruno Medeiros

On 17/09/2013 15:19, Manu wrote:

On 17 September 2013 23:46, Bruno Medeiros
<brunodomedeiros+...@gmail.com>
wrote:

On 17/09/2013 07:24, Manu wrote:


 I closed about half my open tabs after my last email
(~50 left
 open). Down
 to 93mb. You must all use some heavy plugins or something.
 My current solution has 10 projects, one is an entire game
 engine with over
 500 source files, hundreds of thousands of LOC.
Intellisense
 info for all
 of it... dunno what to tell you.
 Eclipse uses more than 4 times that much memory idling
with no
 project open
 at all...


 4 times ? You must have a pretty light instance of eclipse !


It's a fairly fresh eclipse install, and I just boot it up. It
showed
the home screen, no project loaded. It was doing absolutely
nothing and
well into 400mb.
When I do use it for android and appengine, it more or less
works well
enough, but the UI feels like it's held together with stickytape and
glue, and it's pretty sluggish. Debugging (native code) is slow and
clunky. How can I take that software seriously?
I probably waste significant portion of my life hovering and
waiting for
eclipse to render the pop-up variable inspection windows. That shit
needs to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to
appear on the
screen...


Android and Appengine?
There are two flaws in that comparison, the first is that apparently
you are comparing an Eclipse installation with a lot more tools than
your VS installation (which I'm guessing has only C++ tools, perhaps
some VCS tools too?). No wonder the footprint is bigger. For
example, my Eclipse instance with only DDT and Git installed, and
opened on a workspace with D projects takes up 130Mb:
http://i.imgur.com/VmKzrRU.png


My VS installation has VisualD, VCS tools, xbox 360, ps3, android,
emsscripten, nacl, clang and gcc tools. (I don't think these offer any
significant resource burden though, they're not really active processes)
If Eclipse has a lot more tools as you say, then it's a problem is that
I never selected them, and apparently they hog resources even when not
being used. That seems like a serious engineering fail if that's the case.
As far as I know, I don't have DDT and git installed, so you're 2 up on
me :) .. I only have android beyond default install (and no project was
open). No appengine in this installation.



Eclipse is designed such that plugins should be lazy-loaded: they are 
only loaded when needed (for example if you open a 
view/editor/preference-page/project, etc., contributed from a plugin). 
But that requires the plugin to be well-behaved in that regard, and 
some plugins might not be so.
I'm not familiar at all with the Eclipse Android plugins or AppEngine 
plugins so I have no idea how these behave performance-wise. I can't 
comment on that. Again, it should be noted that Eclipse is not a monolithic 
application, and a lot of things are going to depend on what 
plugins/tools you have installed. (neither is VisualStudio a monolithic 
application, but I would argue that Eclipse has more plugins and 
extensions available, and thus more variation in setup and quality of 
installations)




With the recommended JVM memory settings (see
http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ), the
usage in that startup scenario goes up to 180Mb.
But even so that is not a fair comparison, the second flaw here is
that Eclipse is running on a VM, and is not actually using all the
memory that is taken from the OS.


It's perfectly fair. Let's assume for a second that I couldn't care less
that it runs in a VM (I couldn't), all you're really saying is that VM's
are effectively a waste of memory and performance, and that doesn't
redeem Eclipse in any way.
You're really just suggesting that Eclipse may be inherently inefficient
because it's lynched by it's VM. So there's no salvation for it? :)

If you wanna see how much memory the Java application itself is
using for its data structures, you have to use a tool like jconsole
(included in the JDK) to check out JVM stats. For example, in the
DDT scenario above, after startup the whole of Eclipse is just using
just 40Mb for the Java heap:
http://i.imgur.com/yCPtS52.png


I don't care how much memory the app is 'really' using beneath it's
overhead. All I care about is how much memory it's using (actually, I
don't really care about that at all, I only care about how it performs,
which is poorly).

Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Dicebot

On Tuesday, 17 September 2013 at 14:20:03 UTC, Manu wrote:
Can any linux MonoDevelop user enlighten me on how to use 
MonoDevelop4 on
linux? I couldn't find a package for it anywhere... only MD3. 
It seems

linux MD is way behind... no idea why.


Your distro? It is 4.0.12 in Arch Linux official repository :P


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread PauloPinto
On Tuesday, 17 September 2013 at 13:46:43 UTC, Bruno Medeiros 
wrote:

On 17/09/2013 07:24, Manu wrote:


   I closed about half my open tabs after my last email 
(~50 left

   open). Down
   to 93mb. You must all use some heavy plugins or 
something.
   My current solution has 10 projects, one is an entire 
game

   engine with over
   500 source files, hundreds of thousands of LOC. 
Intellisense

   info for all
   of it... dunno what to tell you.
   Eclipse uses more than 4 times that much memory idling 
with no

   project open
   at all...


   4 times ? You must have a pretty light instance of eclipse !


It's a fairly fresh eclipse install, and I just boot it up. It 
showed
the home screen, no project loaded. It was doing absolutely 
nothing and

well into 400mb.
When I do use it for android and appengine, it more or less 
works well
enough, but the UI feels like it's held together with 
stickytape and
glue, and it's pretty sluggish. Debugging (native code) is 
slow and

clunky. How can I take that software seriously?
I probably waste significant portion of my life hovering and 
waiting for
eclipse to render the pop-up variable inspection windows. That 
shit
needs to be instant, no excuse. It's just showing a value from 
ram.
Then I press a key, it doesn't take ages for the letter to 
appear on the

screen...


Android and Appengine?
There are two flaws in that comparison, the first is that 
apparently you are comparing an Eclipse installation with a lot 
more tools than your VS installation (which I'm guessing has 
only C++ tools, perhaps some VCS tools too?). No wonder the 
footprint is bigger. For example, my Eclipse instance with only 
DDT and Git installed, and opened on a workspace with D 
projects takes up 130Mb:

http://i.imgur.com/VmKzrRU.png

With the recommend JVM memory settings (see 
http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ), 
the usage in that startup scenario goes up to 180Mb.
But even so that is not a fair comparison, the second flaw here 
is that Eclipse is running on a VM, and is not actually using 
all the memory that is taken from the OS.


If you wanna see how much memory the Java application itself is 
using for its data structures, you have to use a tool like 
jconsole (included in the JDK) to check out JVM stats. For 
example, in the DDT scenario above, after startup the whole of 
Eclipse is just using just 40Mb for the Java heap:

http://i.imgur.com/yCPtS52.png


VS is also running in a VM as it is mostly a C# application 
nowadays, since the WPF rewrite done to 2010.


--
Paulo


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread deadalnix

On Tuesday, 17 September 2013 at 14:30:12 UTC, PauloPinto wrote:

On Tuesday, 17 September 2013 at 14:26:59 UTC, deadalnix wrote:
On Tuesday, 17 September 2013 at 12:56:56 UTC, PauloPinto 
wrote:
And people are still working on new ways to package stuff. 
What a waste of everybody's time! Meanwhile, Windows does not 
have any packaging system, and that's even worse :D


It is called MSI, blame bad companies for still using exe 
installers.


This is only one piece of the puzzle. Add repositories, 
integration into Windows Update and all the goodies, and then 
we have something.


http://chocolatey.org/


Nice ! A similar tool should really be integrated into windows 
and become the de facto standard.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread PauloPinto

On Tuesday, 17 September 2013 at 14:26:59 UTC, deadalnix wrote:

On Tuesday, 17 September 2013 at 12:56:56 UTC, PauloPinto wrote:
And people are still working on new ways to package stuff. 
What a waste of everybody's time ! Meanwhile, Windows does not 
have any packaging system, and that's even worse :D


It is called MSI, blame bad companies for still using exe 
installers.


This is only one piece of the puzzle. Add repositories, 
integration into Windows Update and all the goodies, and then 
we have something.


http://chocolatey.org/


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Joseph Rushton Wakeling

On 17/09/13 16:19, Manu wrote:

I had some experience with kdevelop this past weekend trying to find a
reasonable working environment on linux. It's fairly nice. Certainly come along
since I last tried to take it seriously a year or 2 back.
It would be nice if there was D support though. It has rudimentary support that
some whipped up, but it could do a lot better.


Do you have any experience/opinion on Qt Creator as an IDE?  My impression is 
that it's nice in and of itself but limited compared to others in the range of 
languages/tools it supports -- but it's a very superficial impression so may be 
wrong.




Re: Had another 48hr game jam this weekend...

2013-09-17 Thread deadalnix

On Tuesday, 17 September 2013 at 12:56:56 UTC, PauloPinto wrote:
And people are still working on new ways to package stuff. 
What a waste of everybody's time ! Meanwhile, Windows does not 
have any packaging system, and that's even worse :D


It is called MSI, blame bad companies for still using exe 
installers.


This is only one piece of the puzzle. Add repositories, 
integration into Windows Update and all the goodies, and then we 
have something.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Manu
On 17 September 2013 23:46, Bruno Medeiros wrote:

> On 17/09/2013 07:24, Manu wrote:
>
>>
>> I closed about half my open tabs after my last email (~50 left
>> open). Down
>> to 93mb. You must all use some heavy plugins or something.
>> My current solution has 10 projects, one is an entire game
>> engine with over
>> 500 source files, hundreds of thousands of LOC. Intellisense
>> info for all
>> of it... dunno what to tell you.
>> Eclipse uses more than 4 times that much memory idling with no
>> project open
>> at all...
>>
>>
>> 4 times ? You must have a pretty light instance of eclipse !
>>
>>
>> It's a fairly fresh eclipse install, and I just boot it up. It showed
>> the home screen, no project loaded. It was doing absolutely nothing and
>> well into 400mb.
>> When I do use it for android and appengine, it more or less works well
>> enough, but the UI feels like it's held together with stickytape and
>> glue, and it's pretty sluggish. Debugging (native code) is slow and
>> clunky. How can I take that software seriously?
>> I probably waste significant portion of my life hovering and waiting for
>> eclipse to render the pop-up variable inspection windows. That shit
>> needs to be instant, no excuse. It's just showing a value from ram.
>> Then I press a key, it doesn't take ages for the letter to appear on the
>> screen...
>>
>
> Android and Appengine?
> There are two flaws in that comparison, the first is that apparently you
> are comparing an Eclipse installation with a lot more tools than your VS
> installation (which I'm guessing has only C++ tools, perhaps some VCS tools
> too?). No wonder the footprint is bigger. For example, my Eclipse instance
> with only DDT and Git installed, and opened on a workspace with D projects
> takes up 130Mb:
> http://i.imgur.com/VmKzrRU.png


My VS installation has VisualD, VCS tools, xbox 360, ps3, android,
emscripten, nacl, clang and gcc tools. (I don't think these add any
significant resource burden though, they're not really active processes.)
If Eclipse has a lot more tools as you say, then the problem is that I
never selected them, and apparently they hog resources even when not being
used. That seems like a serious engineering fail if that's the case.
As far as I know, I don't have DDT or git installed, so you're 2 up on me
:) .. I only have android beyond the default install (and no project was open).
No appengine in this installation.

With the recommended JVM memory settings (see
> http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ),
> the usage in that startup scenario goes up to 180Mb.
> But even so that is not a fair comparison, the second flaw here is that
> Eclipse is running on a VM, and is not actually using all the memory that
> is taken from the OS.
>

It's perfectly fair. Let's assume for a second that I couldn't care less
that it runs in a VM (I couldn't); all you're really saying is that VMs
are effectively a waste of memory and performance, and that doesn't redeem
Eclipse in any way.
You're really just suggesting that Eclipse may be inherently inefficient
because it's lynched by its VM. So there's no salvation for it? :)

If you wanna see how much memory the Java application itself is using for
> its data structures, you have to use a tool like jconsole (included in the
> JDK) to check out JVM stats. For example, in the DDT scenario above, after
> startup the whole of Eclipse is just using just 40Mb for the Java heap:
> http://i.imgur.com/yCPtS52.png


I don't care how much memory the app is 'really' using beneath its
overhead. All I care about is how much memory it's using (actually, I don't
really care about that at all, I only care about how it performs, which is
poorly), and the windows task manager surely offers the fairest measure
for comparison available to the OS, at least for the memory consumption
metric ;) .. The problem remains that I find eclipse significantly less
responsive, and the UI is messy and disorganised. I feel a lack of
coherency between different parts of Eclipse.
So in summary, I prefer and use VS whenever I have the option.

I had some experience with kdevelop this past weekend trying to find a
reasonable working environment on linux. It's fairly nice. Certainly come
along since I last tried to take it seriously a year or 2 back.
It would be nice if there was D support though. It has rudimentary support
that some whipped up, but it could do a lot better.

Can any linux MonoDevelop user enlighten me on how to use MonoDevelop4 on
linux? I couldn't find a package for it anywhere... only MD3. It seems
linux MD is way behind... no idea why.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Bruno Medeiros

On 17/09/2013 07:24, Manu wrote:


I closed about half my open tabs after my last email (~50 left
open). Down
to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game
engine with over
500 source files, hundreds of thousands of LOC. Intellisense
info for all
of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with no
project open
at all...


4 times ? You must have a pretty light instance of eclipse !


It's a fairly fresh eclipse install, and I just boot it up. It showed
the home screen, no project loaded. It was doing absolutely nothing and
well into 400mb.
When I do use it for android and appengine, it more or less works well
enough, but the UI feels like it's held together with stickytape and
glue, and it's pretty sluggish. Debugging (native code) is slow and
clunky. How can I take that software seriously?
I probably waste significant portion of my life hovering and waiting for
eclipse to render the pop-up variable inspection windows. That shit
needs to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to appear on the
screen...


Android and Appengine?
There are two flaws in that comparison, the first is that apparently you 
are comparing an Eclipse installation with a lot more tools than your VS 
installation (which I'm guessing has only C++ tools, perhaps some VCS 
tools too?). No wonder the footprint is bigger. For example, my Eclipse 
instance with only DDT and Git installed, and opened on a workspace with 
D projects takes up 130Mb:

http://i.imgur.com/VmKzrRU.png

With the recommended JVM memory settings (see 
http://code.google.com/p/ddt/wiki/UserGuide#Eclipse_basics ), the usage 
in that startup scenario goes up to 180Mb.
But even so that is not a fair comparison, the second flaw here is that 
Eclipse is running on a VM, and is not actually using all the memory 
that is taken from the OS.


If you wanna see how much memory the Java application itself is using 
for its data structures, you have to use a tool like jconsole (included 
in the JDK) to check out JVM stats. For example, in the DDT scenario 
above, after startup the whole of Eclipse is using just 40Mb for 
the Java heap:

http://i.imgur.com/yCPtS52.png

--
Bruno Medeiros - Software Engineer


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread PauloPinto

On Tuesday, 17 September 2013 at 08:19:12 UTC, deadalnix wrote:

On Tuesday, 17 September 2013 at 07:24:23 UTC, PauloPinto wrote:

Well, they want to sell their own console, Linux based.

So of course they need to create awareness for it.



Well, that does make sense. Many Windows OpenGL games run faster in 
wine than on Windows (obviously that isn't the case for 
DirectX). And we are talking here about the Windows version of 
the game, not a native Linux one.


Which is good for Linux gaming in general, but like commercial 
UNIXes, unless you use the right distribution, there is 
nothing for you, because of the typical fragmentation issues.


And people are still working on new ways to package stuff. What 
a waste of everybody's time ! Meanwhile, Windows does not have 
any packaging system, and that's even worse :D


It is called MSI, blame bad companies for still using exe 
installers.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Dicebot

On Tuesday, 17 September 2013 at 01:18:39 UTC, Manu wrote:
What kind of quantity are we talking? My VisualStudio2010 is 
humming away
right now at 80mb with a large project open (less than i 
expected).

It's a text editor... what does it do?


I don't know what the minimal Java runtime heap amount is to make 
Eclipse perfectly responsive, but I usually set it to 1GB+.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread deadalnix

On Tuesday, 17 September 2013 at 07:24:23 UTC, PauloPinto wrote:

Well, they want to sell their own console, Linux based.

So of course they need to create awareness for it.



Well, that does make sense. Many Windows OpenGL games run faster in 
wine than on Windows (obviously that isn't the case for DirectX). 
And we are talking here about the Windows version of the game, 
not a native Linux one.


Which is good for Linux gaming in general, but like commercial 
UNIXes, unless you use the right distribution, there is nothing 
for you, because of the typical fragmentation issues.


And people are still working on new ways to package stuff. What a 
waste of everybody's time ! Meanwhile, Windows does not have any 
packaging system, and that's even worse :D


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread PauloPinto
On Tuesday, 17 September 2013 at 06:39:59 UTC, Brad Anderson 
wrote:

On Tuesday, 17 September 2013 at 06:24:20 UTC, Manu wrote:
On 17 September 2013 15:48, deadalnix  
wrote:



On Tuesday, 17 September 2013 at 05:32:28 UTC, Manu wrote:

In my experience, more memory == slower. If you care about 
performance,

the
only time it's acceptable to use more memory is if your data 
structures

are
as efficient as they can get, and the alternative is reading 
off the hard

drive.
Bandwidth isn't free, cache is only so big, and logic to 
process and make
use of so much memory isn't free either. It usually just 
suggests
inefficient (or just lazy) data structures, which often also 
implies

inefficient processing logic.
And the more memory an app uses, the higher chance of 
invoking the page

file, which is a mega-killer.


I do agree as this is generally true. However, the problem 
isn't really cache size or bandwidth, but rather latency. We know 
how to increase bandwidth or cache size, but the first comes at a 
cost with no big benefit, and the second comes at an increase of 
cost and an increase of latency.

What is capping the perf here is really latency.



Latency bottlenecks are usually a function of inefficient 
cache usage, or a

working set that's too large and non-linear.

That being said, less memory == more of your working set in 
cache => faster

program.



Precisely.

Dunno what to tell you. My VS instance is pretty light.



Yup, VS is one of these program that microsoft did better 
than the

alternative :D



Perhaps the only one, and also the single reason I still use 
Windows
(despite their best efforts to ruin it more and more with 
almost every
release!). There is STILL no realistic alternative for my 
money, well over

a decade later...
I don't get it. VS has been there a long time. It's not even 
perfect; far from it in fact. But the fact that, given over a 
decade of solid working example, nobody has yet managed to create 
a competitive product just blows my mind...
Seriously, where is the competition? I probably use about 10% 
of VS's
features, but the features that I do use and rely on work, and 
work well.
Although even they could be significantly improved in some 
very simple ways.


I closed about half my open tabs after my last email (~50 left 
open). Down

to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game 
engine with

over
500 source files, hundreds of thousands of LOC. Intellisense 
info for all

of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with 
no project

open
at all...



4 times ? You must have a pretty light instance of eclipse !



It's a fairly fresh eclipse install, and I just boot it up. It 
showed the
home screen, no project loaded. It was doing absolutely 
nothing and well

into 400mb.
When I do use it for android and appengine, it more or less 
works well
enough, but the UI feels like it's held together with 
stickytape and glue,
and it's pretty sluggish. Debugging (native code) is slow and 
clunky. How

can I take that software seriously?
I probably waste significant portion of my life hovering and 
waiting for
eclipse to render the pop-up variable inspection windows. That 
shit needs

to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to 
appear on the

screen...


Better get used to it.  The Gaben has spoken: 
http://arstechnica.com/gaming/2013/09/gabe-newell-linux-is-the-future-of-gaming-new-hardware-coming-soon/


I actually agree, my experience with full blown IDEs other than 
VS has been terrible (and I just spent all day fixing a VS 2010 
PCH corruption bug). I've always got my beloved vim to fall 
back on though.




Well, they want to sell their own console, Linux based.

So of course they need to create awareness for it.

Which is good for Linux gaming in general, but like commercial 
UNIXes, unless you use the right distribution, there is nothing 
for you, because of the typical fragmentation issues.


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread growler
On Tuesday, 17 September 2013 at 06:57:04 UTC, Jacob Carlborg 
wrote:

On 2013-09-17 07:32, Manu wrote:

I closed about half my open tabs after my last email (~50 left 
open).

Down to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game 
engine with
over 500 source files, hundreds of thousands of LOC. 
Intellisense info

for all of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with no 
project

open at all...

VS is light years better than MonoDevelop. MD is only good 
where VS is

not available ;)

My task manager:
http://i.imgur.com/crbUrH1.png


Are we talking virtual or physical memory? Can the operating 
system affect it - Windows 7 vs 8, for example?


That will be the working set for the process, i.e. physical 
memory, both private and shared.




Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Jacob Carlborg

On 2013-09-17 05:32, Manu wrote:


I presume you mean megabytes?
Well I've been working all morning since I made that comment; I have
about 100 tabs open for editing in VS now (I don't clean up open tabs
often >_<), and it's sitting at 120mb.
For reference, that's considerably less than the chrome process that
hosts gmail (200mb!). About the same as the steam client which I haven't
even opened since I turned on my PC, and less than double that of
dropbox (70mb!).
I just booted eclipse, doing absolutely nothing, no projects open on the
start screen. over 410mb...
I don't know why modern software uses so much memory. But it seems
VisualStudio at ~100mb is pretty bloody good comparatively!

Dunno why you're seeing 200mb? (still less than my gmail tab...)
Perhaps you use Visual Assist or some other bulky plugins? I only have
Visual-D installed.


Opening Xcode takes 76  MB real memory (I guess that's physical memory?) 
and 324 MB virtual memory.


After opening the DMD project, indexing, downloading some doc sets, it's 
at 227 MB real memory and 420 virtual memory.


Yesterday I got a stack overflow in our Rails application and the Ruby 
instance took 4 GB real memory.


--
/Jacob Carlborg


Re: Had another 48hr game jam this weekend...

2013-09-17 Thread Jacob Carlborg

On 2013-09-17 07:32, Manu wrote:


I closed about half my open tabs after my last email (~50 left open).
Down to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game engine with
over 500 source files, hundreds of thousands of LOC. Intellisense info
for all of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with no project
open at all...

VS is light years better than MonoDevelop. MD is only good where VS is
not available ;)

My task manager:
http://i.imgur.com/crbUrH1.png


Are we talking virtual or physical memory? Can the operating system 
affect it - Windows 7 vs 8, for example?


--
/Jacob Carlborg


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Brad Anderson

On Tuesday, 17 September 2013 at 06:24:20 UTC, Manu wrote:
On 17 September 2013 15:48, deadalnix  
wrote:



On Tuesday, 17 September 2013 at 05:32:28 UTC, Manu wrote:

In my experience, more memory == slower. If you care about 
performance,

the
only time it's acceptable to use more memory is if your data 
structures

are
as efficient as they can get, and the alternative is reading 
off the hard

drive.
Bandwidth isn't free, cache is only so big, and logic to 
process and make
use of so much memory isn't free either. It usually just 
suggests
inefficient (or just lazy) data structures, which often also 
implies

inefficient processing logic.
And the more memory an app uses, the higher chance of 
invoking the page

file, which is a mega-killer.


I do agree as this is generally true. However, the problem 
isn't really cache size or bandwidth, but rather latency. We know 
how to increase bandwidth or cache size, but the first comes at a 
cost with no big benefit, and the second comes at an increase of 
cost and an increase of latency.

What is capping the perf here is really latency.



Latency bottlenecks are usually a function of inefficient cache 
usage, or a

working set that's too large and non-linear.

That being said, less memory == more of your working set in 
cache => faster

program.



Precisely.

 Dunno what to tell you. My VS instance is pretty light.



Yup, VS is one of these program that microsoft did better than 
the

alternative :D



Perhaps the only one, and also the single reason I still use 
Windows
(despite their best efforts to ruin it more and more with 
almost every
release!). There is STILL no realistic alternative for my 
money, well over

a decade later...
I don't get it. VS has been there a long time. It's not even 
perfect; far from it in fact. But the fact that, given over a 
decade of solid working example, nobody has yet managed to create 
a competitive product just blows my mind...
Seriously, where is the competition? I probably use about 10% 
of VS's
features, but the features that I do use and rely on work, and 
work well.
Although even they could be significantly improved in some very 
simple ways.


 I closed about half my open tabs after my last email (~50 left 
open). Down

to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game 
engine with

over
500 source files, hundreds of thousands of LOC. Intellisense 
info for all

of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with 
no project

open
at all...



4 times ? You must have a pretty light instance of eclipse !



It's a fairly fresh eclipse install, and I just boot it up. It 
showed the
home screen, no project loaded. It was doing absolutely nothing 
and well

into 400mb.
When I do use it for android and appengine, it more or less 
works well
enough, but the UI feels like it's held together with 
stickytape and glue,
and it's pretty sluggish. Debugging (native code) is slow and 
clunky. How

can I take that software seriously?
I probably waste significant portion of my life hovering and 
waiting for
eclipse to render the pop-up variable inspection windows. That 
shit needs

to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to 
appear on the

screen...


Better get used to it.  The Gaben has spoken: 
http://arstechnica.com/gaming/2013/09/gabe-newell-linux-is-the-future-of-gaming-new-hardware-coming-soon/


I actually agree, my experience with full blown IDEs other than 
VS has been terrible (and I just spent all day fixing a VS 2010 
PCH corruption bug). I've always got my beloved vim to fall back 
on though.


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread dennis luehring

Am 17.09.2013 08:24, schrieb Manu:

Perhaps the only one, and also the single reason I still use Windows
(despite their best efforts to ruin it more and more with almost every
release!). There is STILL no realistic alternative for my money, well over
a decade later...
I don't get it. VS has been there a long time. It's not even perfect; far
from it in fact. But the fact that, given over a decade of solid working
example, nobody has yet managed to create a competitive product just blows
my mind...


QtCreator seems to get better and better: it's fast, has features like 
VS+VisualAssist included, is easy to write plugins for, and has a nice 
build system.


The only part which is still not in a (perfectly) good state is the 
Windows debugger integration.


One of my customers is evaluating it for >1M LOC projects and ~30 
developers, trying to get rid of the VS IDE, but still using the VS 
compiler and debugger as backend, plus tools like IncrediBuild + a gcc 
environment.




Re: Had another 48hr game jam this weekend...

2013-09-16 Thread PauloPinto

On Tuesday, 17 September 2013 at 05:48:21 UTC, deadalnix wrote:

On Tuesday, 17 September 2013 at 05:32:28 UTC, Manu wrote:
In my experience, more memory == slower. If you care about 
performance, the
only time it's acceptable to use more memory is if your data 
structures are
as efficient as they can get, and the alternative is reading 
off the hard

drive.
Bandwidth isn't free, cache is only so big, and logic to 
process and make
use of so much memory isn't free either. It usually just 
suggests
inefficient (or just lazy) data structures, which often also 
implies

inefficient processing logic.
And the more memory an app uses, the higher chance of invoking 
the page

file, which is a mega-killer.



I do agree as this is generally true. However, the problem 
isn't really cache size or bandwidth, but rather latency. We 
know how to increase bandwidth or cache size, but the first 
comes at a cost with no big benefit, and the second comes at an 
increase of cost and an increase of latency. What is capping the 
perf here is really latency.


That being said, less memory == more of your working set in 
cache => faster program.



Dunno what to tell you. My VS instance is pretty light.



Yup, VS is one of these program that microsoft did better than 
the alternative :D



Yet in 2013 it still doesn't do color printing with syntax 
highlighting, like any MS-DOS IDE used to offer around the MS-DOS 
5/6 timeframe, unless one installs third-party plugins.


And the refactoring tools are a joke compared to Java IDEs, 
unless one installs a third party tool.


Even QtCreator has better C/C++ refactoring tools out of the box.


Visual Studio is a very good IDE, but in some areas it is surely 
lacking.


--
Paulo


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Manu
On 17 September 2013 15:48, deadalnix  wrote:

> On Tuesday, 17 September 2013 at 05:32:28 UTC, Manu wrote:
>
>> In my experience, more memory == slower. If you care about performance,
>> the
>> only time it's acceptable to use more memory is if your data structures
>> are
>> as efficient as they can get, and the alternative is reading off the hard
>> drive.
>> Bandwidth isn't free, cache is only so big, and logic to process and make
>> use of so much memory isn't free either. It usually just suggests
>> inefficient (or just lazy) data structures, which often also implies
>> inefficient processing logic.
>> And the more memory an app uses, the higher chance of invoking the page
>> file, which is a mega-killer.
>>
>>
> I do agree as this is generally true. However, the problem isn't really
> cache size or bandwidth, but rather latency. We know how to increase
> bandwidth or cache size, but the first comes at a cost with no big
> benefit, and the second comes at an increase of cost and an increase of latency.
> What is capping the perf here is really latency.
>

Latency bottlenecks are usually a function of inefficient cache usage, or a
working set that's too large and non-linear.

That being said, less memory == more of your working set in cache => faster
> program.


Precisely.

 Dunno what to tell you. My VS instance is pretty light.
>>
>>
> Yup, VS is one of these program that microsoft did better than the
> alternative :D


Perhaps the only one, and also the single reason I still use Windows
(despite their best efforts to ruin it more and more with almost every
release!). There is STILL no realistic alternative for my money, well over
a decade later...
I don't get it. VS has been there a long time. It's not even perfect; far
from it in fact. But the fact that, given over a decade of solid working
example, nobody has yet managed to create a competitive product just blows
my mind...
Seriously, where is the competition? I probably use about 10% of VS's
features, but the features that I do use and rely on work, and work well.
Although even they could be significantly improved in some very simple ways.

 I closed about half my open tabs after my last email (~50 left open). Down
>> to 93mb. You must all use some heavy plugins or something.
>> My current solution has 10 projects, one is an entire game engine with
>> over
>> 500 source files, hundreds of thousands of LOC. Intellisense info for all
>> of it... dunno what to tell you.
>> Eclipse uses more than 4 times that much memory idling with no project
>> open
>> at all...
>>
>>
> 4 times ? You must have a pretty light instance of eclipse !
>

It's a fairly fresh eclipse install, and I just boot it up. It showed the
home screen, no project loaded. It was doing absolutely nothing and well
into 400mb.
When I do use it for android and appengine, it more or less works well
enough, but the UI feels like it's held together with stickytape and glue,
and it's pretty sluggish. Debugging (native code) is slow and clunky. How
can I take that software seriously?
I probably waste significant portion of my life hovering and waiting for
eclipse to render the pop-up variable inspection windows. That shit needs
to be instant, no excuse. It's just showing a value from ram.
Then I press a key, it doesn't take ages for the letter to appear on the
screen...


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread deadalnix

On Tuesday, 17 September 2013 at 05:32:28 UTC, Manu wrote:
In my experience, more memory == slower. If you care about 
performance, the
only time it's acceptable to use more memory is if your data 
structures are
as efficient as they can get, and the alternative is reading 
off the hard

drive.
Bandwidth isn't free, cache is only so big, and logic to 
process and make
use of so much memory isn't free either. It usually just 
suggests
inefficient (or just lazy) data structures, which often also 
implies

inefficient processing logic.
And the more memory an app uses, the higher chance of invoking 
the page

file, which is a mega-killer.



I do agree as this is generally true. However, the problem isn't 
really cache size or bandwidth, but rather latency. We know how 
to increase bandwidth or cache size, but the first comes at a 
cost with no big benefit, and the second comes at an increase of 
cost and an increase of latency. What is capping the perf here is 
really latency.


That being said, less memory == more of your working set in cache 
=> faster program.



Dunno what to tell you. My VS instance is pretty light.



Yup, VS is one of these program that microsoft did better than 
the alternative :D


I closed about half my open tabs after my last email (~50 left 
open). Down

to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game 
engine with over
500 source files, hundreds of thousands of LOC. Intellisense 
info for all

of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with no 
project open

at all...



4 times ? You must have a pretty light instance of eclipse !


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Manu
On 17 September 2013 13:43, Kapps  wrote:

> On Tuesday, 17 September 2013 at 03:32:17 UTC, Manu wrote:
>
>>
>> I presume you mean megabytes?
>> Well I've been working all morning since I made that comment; I have about
>> 100 tabs open for editing in VS now (I don't clean up open tabs often
>> >_<),
>> and it's sitting at 120mb.
>> VisualStudio at ~100mb is pretty bloody good comparatively!
>>
>> Dunno why you're seeing 200mb? (still less than my gmail tab...)
>> Perhaps you use Visual Assist or some other bulky plugins? I only have
>> Visual-D installed.
>>
>
>
> That's quite surprising, Visual Studio for me is always in the ~300MB or
> so range, often more. Right now using MonoDevelop on Linux with Mono-D is
> using ~500MB. That being said, I'm perfectly okay with IDEs using lots of
> memory. RAM is cheap, if the IDE can make itself even slightly better by
> using an extra 2GB when I have spare, I'd be happy to let it. I have 16GB
> in my laptop and 12GB in my desktop and nothing ever comes even remotely
> close to causing me to run out of memory. Things using CPU usage in the
> background however is quite frustrating. Somehow my most CPU intensive
> process on this laptop is my touchpad driver (touchegg), which likely kills
> battery life.
>

In my experience, more memory == slower. If you care about performance, the
only time it's acceptable to use more memory is if your data structures are
as efficient as they can get, and the alternative is reading off the hard
drive.
Bandwidth isn't free, cache is only so big, and logic to process and make
use of so much memory isn't free either. It usually just suggests
inefficient (or just lazy) data structures, which often also implies
inefficient processing logic.
And the more memory an app uses, the higher chance of invoking the page
file, which is a mega-killer.

Dunno what to tell you. My VS instance is pretty light.

I closed about half my open tabs after my last email (~50 left open). Down
to 93mb. You must all use some heavy plugins or something.
My current solution has 10 projects, one is an entire game engine with over
500 source files, hundreds of thousands of LOC. Intellisense info for all
of it... dunno what to tell you.
Eclipse uses more than 4 times that much memory idling with no project open
at all...

VS is light years better than MonoDevelop. MD is only good where VS is not
available ;)

My task manager:
http://i.imgur.com/crbUrH1.png


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread deadalnix

On Tuesday, 17 September 2013 at 03:44:08 UTC, Kapps wrote:

On Tuesday, 17 September 2013 at 03:32:17 UTC, Manu wrote:


I presume you mean megabytes?
Well I've been working all morning since I made that comment; 
I have about
100 tabs open for editing in VS now (I don't clean up open 
tabs often >_<),

and it's sitting at 120mb.
VisualStudio at ~100mb is pretty bloody good comparatively!

Dunno why you're seeing 200mb? (still less than my gmail 
tab...)
Perhaps you use Visual Assist or some other bulky plugins? I 
only have

Visual-D installed.



That's quite surprising, Visual Studio for me is always in the 
~300MB or so range, often more. Right now using MonoDevelop on 
Linux with Mono-D is using ~500MB. That being said, I'm 
perfectly okay with IDEs using lots of memory. RAM is cheap, if 
the IDE can make itself even slightly better by using an extra 
2GB when I have spare, I'd be happy to let it. I have 16GB in 
my laptop and 12GB in my desktop and nothing ever comes even 
remotely close to causing me to run out of memory. Things using 
CPU usage in the background however is quite frustrating. 
Somehow my most CPU intensive process on this laptop is my 
touchpad driver (touchegg), which likely kills battery life.


Eclipse is made for you :D

I tend to agree with you, considering the benefit is high enough 
(it is in Java or C# for instance). For D I disagree, as the 
benefit is not as high, and the frontend can consume quite a lot 
of memory, so having some extra memory around is a big deal 
(especially if you also run a browser somewhere, which will use 
several GB of memory).


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Meta
On Tuesday, 17 September 2013 at 02:14:30 UTC, Jonathan M Davis 
wrote:

On Tuesday, September 17, 2013 04:04:55 Meta wrote:

On Tuesday, 17 September 2013 at 01:18:39 UTC, Manu wrote:
> What kind of quantity are we talking? My VisualStudio2010 is
> humming away
> right now at 80mb with a large project open (less than i
> expected).
> It's a text editor... what does it do?

How in the world are you getting that small of a memory
footprint? My CS2010 is currently using 203k with a fairly
bare-bones project. Hell, it gobbles up 100k just idling with
nothing open.


He did say 80 _MB_, not 80 KB. Whether that's a small footprint 
or not for VS, I don't know, but it's way more than the 203 KB 
that you're talking about.

- Jonathan M Davis


Heh, an unfortunate mistake. Just pretend I meant 203k kilobytes 
in base-10 kilobytes.


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Kapps

On Tuesday, 17 September 2013 at 03:32:17 UTC, Manu wrote:


I presume you mean megabytes?
Well I've been working all morning since I made that comment; I 
have about
100 tabs open for editing in VS now (I don't clean up open tabs 
often >_<),

and it's sitting at 120mb.
VisualStudio at ~100mb is pretty bloody good comparatively!

Dunno why you're seeing 200mb? (still less than my gmail tab...)
Perhaps you use Visual Assist or some other bulky plugins? I 
only have

Visual-D installed.



That's quite surprising, Visual Studio for me is always in the 
~300MB or so range, often more. Right now using MonoDevelop on 
Linux with Mono-D is using ~500MB. That being said, I'm perfectly 
okay with IDEs using lots of memory. RAM is cheap, if the IDE can 
make itself even slightly better by using an extra 2GB when I 
have spare, I'd be happy to let it. I have 16GB in my laptop and 
12GB in my desktop and nothing ever comes even remotely close to 
causing me to run out of memory. Things using CPU usage in the 
background however is quite frustrating. Somehow my most CPU 
intensive process on this laptop is my touchpad driver 
(touchegg), which likely kills battery life.


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Manu
On 17 September 2013 12:04, Meta  wrote:

> On Tuesday, 17 September 2013 at 01:18:39 UTC, Manu wrote:
>
>> What kind of quantity are we talking? My VisualStudio2010 is humming away
>> right now at 80mb with a large project open (less than i expected).
>> It's a text editor... what does it do?
>>
>
> How in the world are you getting that small of a memory footprint? My
> CS2010 is currently using 203k with a fairly bare-bones project. Hell, it
> gobbles up 100k just idling with nothing open.
>

I presume you mean megabytes?
Well I've been working all morning since I made that comment; I have about
100 tabs open for editing in VS now (I don't clean up open tabs often >_<),
and it's sitting at 120mb.
For reference, that's considerably less than the chrome process that hosts
gmail (200mb!). About the same as the steam client which I haven't even
opened since I turned on my PC, and less than double that of dropbox
(70mb!).
I just booted eclipse, doing absolutely nothing, no projects open on the
start screen. over 410mb...
I don't know why modern software uses so much memory. But it seems
VisualStudio at ~100mb is pretty bloody good comparatively!

Dunno why you're seeing 200mb? (still less than my gmail tab...)
Perhaps you use Visual Assist or some other bulky plugins? I only have
Visual-D installed.


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Jonathan M Davis
On Tuesday, September 17, 2013 04:04:55 Meta wrote:
> On Tuesday, 17 September 2013 at 01:18:39 UTC, Manu wrote:
> > What kind of quantity are we talking? My VisualStudio2010 is
> > humming away
> > right now at 80mb with a large project open (less than i
> > expected).
> > It's a text editor... what does it do?
> 
> How in the world are you getting that small of a memory
> footprint? My CS2010 is currently using 203k with a fairly
> bare-bones project. Hell, it gobbles up 100k just idling with
> nothing open.

He did say 80 _MB_, not 80 KB. Whether that's a small footprint or not for VS, 
I don't know, but it's way more than the 203 KB that you're talking 
about.

- Jonathan M Davis


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Meta

On Tuesday, 17 September 2013 at 01:18:39 UTC, Manu wrote:
What kind of quantity are we talking? My VisualStudio2010 is 
humming away
right now at 80mb with a large project open (less than i 
expected).

It's a text editor... what does it do?


How in the world are you getting that small of a memory 
footprint? My CS2010 is currently using 203k with a fairly 
bare-bones project. Hell, it gobbles up 100k just idling with 
nothing open.


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Meta

On Tuesday, 17 September 2013 at 02:05:12 UTC, Meta wrote:
How in the world are you getting that small of a memory 
footprint? My CS2010 is currently using 203k with a fairly 
bare-bones project. Hell, it gobbles up 100k just idling with 
nothing open.


VS2010*


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Manu
On 17 September 2013 01:33, Dicebot  wrote:

> On Monday, 16 September 2013 at 14:00:14 UTC, Bruno Medeiros wrote:
>
>> On 02/09/2013 15:15, Manu wrote:
>>
>>> For me, I absolutely will not work without a symbolic debugger,
>>>
>>
>> Oh well, so much getting you to try DDT, at least for now. :p
>>
>> But I do understand that is a reasonable deal-breaker. (However, ditching
>> Eclipse IDEs just because they are Eclipse-based is not though, regardless
>> of what may be the status quo in the C/C++ community)
>>
>
> I have discovered that lot of issues with Eclipse from fellow C++
> developers came simply from using default eclipse.ini - it often does have
> rather small memory limits defined for VM and Eclipse does want plenty of
> memory. Increasing most parameters 2x-3x times in eclipse.ini can make it
> much more smooth and convenient.
>

What kind of quantity are we talking? My VisualStudio2010 is humming away
right now at 80mb with a large project open (less than i expected).
It's a text editor... what does it do?


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Bruno Medeiros

On 16/09/2013 16:33, Dicebot wrote:

On Monday, 16 September 2013 at 14:00:14 UTC, Bruno Medeiros wrote:

On 02/09/2013 15:15, Manu wrote:

For me, I absolutely will not work without a symbolic debugger,


Oh well, so much getting you to try DDT, at least for now. :p

But I do understand that is a reasonable deal-breaker. (However,
ditching Eclipse IDEs just because they are Eclipse-based is not
though, regardless of what may be the status quo in the C/C++ community)


I have discovered that a lot of issues with Eclipse reported by fellow C++
developers came simply from using the default eclipse.ini - it often does
have rather small memory limits defined for the VM, and Eclipse does want
plenty of memory. Increasing most parameters 2x-3x in eclipse.ini
can make it much smoother and more convenient.


DUUH, I totally forgot to mention this to DDT users. I've added that ( 
http://www.vogella.com/articles/Eclipse/article.html#eclipse_memorysettings 
) to the UserGuide, and will mention it in the next release.


--
Bruno Medeiros - Software Engineer


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Dicebot
On Monday, 16 September 2013 at 14:00:14 UTC, Bruno Medeiros 
wrote:

On 02/09/2013 15:15, Manu wrote:

For me, I absolutely will not work without a symbolic debugger,


Oh well, so much getting you to try DDT, at least for now. :p

But I do understand that is a reasonable deal-breaker. 
(However, ditching Eclipse IDEs just because they are 
Eclipse-based is not though, regardless of what may be the 
status quo in the C/C++ community)


I have discovered that a lot of issues with Eclipse reported by fellow 
C++ developers came simply from using the default eclipse.ini - it 
often does have rather small memory limits defined for the VM, and 
Eclipse does want plenty of memory. Increasing most parameters 2x-3x 
in eclipse.ini can make it much smoother and more convenient.
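
Concretely, the knobs are the JVM arguments after the -vmargs line in
eclipse.ini (the file sitting next to the Eclipse executable). The exact
numbers below are only an illustration, not a recommendation:

    -vmargs
    -Xms512m
    -Xmx2048m
    -XX:MaxPermSize=512m

-Xms/-Xmx set the initial and maximum Java heap, MaxPermSize is the
class/metadata area that matters on the Java 6/7 VMs Eclipse usually runs
on, and everything after -vmargs is passed straight to the VM.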


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Manu
On 17 September 2013 00:00, Bruno Medeiros wrote:

> On 02/09/2013 15:15, Manu wrote:
>
>> For me, I absolutely will not work without a symbolic debugger,
>>
>
> Oh well, so much getting you to try DDT, at least for now. :p
>
> But I do understand that is a reasonable deal-breaker. (However, ditching
> Eclipse IDEs just because they are Eclipse-based is not though, regardless
> of what may be the status quo in the C/C++ community)


I don't have any particular issue with Eclipse. I just prefer VisualStudio
and MonoDevelop.
Eclipse feels like it's held together with sticky tape and glue. But it
works for Android, and AppEngine.

We've been experimenting with kdevelop on linux recently. It's showing
promise. Basic support and debugging works, but it doesn't have a semantic
analysis engine :(


Re: Had another 48hr game jam this weekend...

2013-09-16 Thread Bruno Medeiros

On 02/09/2013 15:15, Manu wrote:

For me, I absolutely will not work without a symbolic debugger,


Oh well, so much getting you to try DDT, at least for now. :p

But I do understand that is a reasonable deal-breaker. (However, 
ditching Eclipse IDEs just because they are Eclipse-based is not though, 
regardless of what may be the status quo in the C/C++ community)


--
Bruno Medeiros - Software Engineer


Re: Had another 48hr game jam this weekend...

2013-09-07 Thread Joseph Rushton Wakeling

On 04/09/13 01:17, ixid wrote:

On Monday, 2 September 2013 at 20:25:51 UTC, Walter Bright wrote:

Fer gawd's sake, why not put their entire freakin' back catalog on it?

For example, there's a "sampling" of a few of Julia Childs' shows from the
60's. Why not put them all on?


The deliberate scarcity of entertainment, or at least enforcement of the new and
unavailability of the old is a part of their plan. Music is similar in that it's
hard to get hold of lots of older recordings through legal channels. They don't
want the consumer to have freedom.


That's not entirely fair.  Often the reason why older shows are difficult to get 
hold of is that back then distribution rights were not managed as 
comprehensively as they are now, because no one ever thought there'd be a need 
to (e.g. before there was ever such a thing as video tapes or DVDs, let alone 
streaming).


The result is that whoever owns the physical recordings may not have good enough 
documentation to know who they have to negotiate with in order to ensure a 
certain kind of distribution rights, and so they can't distribute even if they 
want to.


So, for example, with an older TV series the original contracts would probably 
have covered TV repeats but not distribution.  In order to issue a DVD or 
distribute via streaming media or download, the TV studio would likely have to 
negotiate and secure new contracts with not just the writer, director and 
production company but also with all the performers, possibly many of the crew, 
and very likely also the music composer and any musicians who performed on the 
soundtrack.


There can also be confusion over who holds the copyright to various material, 
and so on.


All of that makes it _very_ difficult to release some older material, and unless 
the prospective income is large enough, companies are unlikely to feel it's 
worth doing all the legwork tracing the different rights holders.


Just as one example: Disney has a huge amount of stuff in its archives which it 
would love to release as DVD extras, but they can't because of issues like these.




Re: Had another 48hr game jam this weekend...

2013-09-06 Thread PauloPinto
On Sunday, 1 September 2013 at 20:27:22 UTC, Nick Sabalausky 
wrote:

On Sun, 1 Sep 2013 23:20:37 +1000
Manu  wrote:


On 1 September 2013 17:46, Nick Sabalausky <
seewebsitetocontac...@semitwist.com> wrote:

> On Sun, 01 Sep 2013 06:45:48 +0200
> "Kapps"  wrote:
>
> > On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
> > > One more thing:
> > > I'll just pick one language complaint from the weekend.
> > > It is how quickly classes became disorganised and 
> > > difficult to

> > > navigate
> > > (like Java and C#).
> > > We all wanted to ability to define class member functions
> > > outside the class
> > > definition:
> > >   class MyClass
> > >   {
> > > void method();
> > >   }
> > >
> > >   void MyClass.method()
> > >   {
> > > //...
> > >   }
> > >
> > > It definitely cost us time simply trying to understand 
> > > the

> > > class layout
> > > visually (ie, when IDE support is barely available).
> > > You don't need to see the function bodies in the class
> > > definition, you want
> > > to quickly see what a class has and does.
> >
> > This isn't something I've found to be an issue personally, 
> > but I
> > suppose it's a matter of what you're used to. Since I'm 
> > used to
> > C#, I haven't had problems with this. I've always felt 
> > that this
> > was the IDE's job, personally. That being said, perhaps 
> > .di files

> > could help with this?
>
> I see it as the job of doc generators.
>

Why complicate the issue? What's wrong with readable code?



I spent several years using C/C++ exclusively (and was happy 
with it
at the time) and I still don't understand what's "readable" 
about having
a class's members separate from the class itself. It's also a 
non-DRY

maintenance PITA and one of the biggest reasons I left C/C++.

I don't like complicating things, and I like readable code. 
That's

why I find C++-style class definitions intolerable.



I also hate them. It is always a pain to get back to C and C++ 
land with separate header and implementation files, especially 
after being spoiled by languages that have proper module support.


Re: Had another 48hr game jam this weekend...

2013-09-05 Thread Sumit Raja


Screw makefiles. dub[1] is the way to go. Dead easy to 
configure [2] and dead easy to use. A default debug build on 
the command line is "dub build", or even just "dub".


[1] http://code.dlang.org/packages/dub
[2] http://code.dlang.org/package-format


dub + Geany is my combination of choice. Great for cross platform 
- I'm using the same source tree to build across Windows, Linux 
and FreeBSD.
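
For reference, the package description dub reads is only a handful of
lines - a rough sketch with made-up values, and note the file is called
dub.json or package.json depending on the dub version:

    {
        "name": "mygame",
        "description": "48hr jam entry",
        "dependencies": {}
    }

"dependencies" maps package names from code.dlang.org to version ranges;
with that file in place, "dub build" (or plain "dub" to build and run)
does the rest.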


Re: Had another 48hr game jam this weekend...

2013-09-05 Thread Danni Coy
The linux user ended up heading the art team so we didn't test on that
environment.
Ideally the Linux user would like D support in KDevelop. Monodevelop is
acceptable but a bit clunky.



On Sun, Sep 1, 2013 at 7:57 PM, Jacob Carlborg  wrote:

> On 2013-09-01 04:05, Manu wrote:
>
>  Naturally, this is primarily a problem with the windows experience, but
>> it's so frustrating that it is STILL a problem... how many years later?
>> People don't want to 'do work' to install a piece of software. Rather,
>> they expect it to 'just work'. We lost about 6 hours trying to get
>> everyone's machines working properly.
>> In the context of a 48 hour game jam, that's a terrible sign! I just
>> kept promising people that it would save time overall... which I wish
>> were true.
>>
>
> Was this only on Windows or were there problems on Linux/Mac OS X as well?
>
>
>  Getting a workable environment:
>>
>> Unsurprisingly, the Linux user was the only person happy work with a
>> makefile. Everybody else wanted a comfortable IDE solution (and the
>> linux user would prefer it too).
>>
>
> I can understand that.
>
>
>  IDE integration absolutely needs to be considered a first class feature
>> of D.
>> I also suggest that the IDE integration downloads should be hosted on
>> the dlang download page so they are obvious and available to everyone
>> without having to go looking, and also as a statement that they are
>> actually endorsed by the dlanguage authorities. As an end-user, you're
>> not left guessing which ones are good/bad/out of date/actually work/etc.
>>
>
> I completely agree.
>
>
>  Obviously, we settled on Visual-D (Windows) and Mono-D (OSX/Linux); the
>> only realistic choices available.
>>
>
> There's also DDT with Eclipse. It supports auto completion, go to
> definition, has an outline view and so on.
>
>
>  The OSX user would have preferred an  XCode integration.
>>
>
> This one is a bit problematic since Xcode doesn't officially supports
> plugins. But it's still possible, as been shown by Michel Fortin with his D
> for Xcode plugin.
>
>  One more thing:
>> I'll just pick one language complaint from the weekend.
>> It is how quickly classes became disorganised and difficult to navigate
>> (like Java and C#).
>> We all wanted to ability to define class member functions outside the
>> class definition:
>>class MyClass
>>{
>>  void method();
>>}
>>
>>void MyClass.method()
>>{
>>  //...
>>}
>>
>> It definitely cost us time simply trying to understand the class layout
>> visually (ie, when IDE support is barely available).
>> You don't need to see the function bodies in the class definition, you
>> want to quickly see what a class has and does.
>>
>
> Sounds like you want an outline view in the IDE. This is supported by DDT
> in Eclipse. Even TextMate on Mac OS X has a form of outline view.
>
> --
> /Jacob Carlborg
>
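
As an aside, the closest thing D offers today without a language change
is UFCS: member-like operations can live as free functions after the
class body (in the same module, if they need access to its internals)
and still be called with dot syntax. A rough sketch with made-up names:

    class MyClass
    {
        int value;
    }

    // Defined outside the class body, but callable like a member via UFCS.
    void method(MyClass c)
    {
        c.value += 1;
    }

    void main()
    {
        auto c = new MyClass;
        c.method(); // rewritten by the compiler to method(c)
    }

It isn't what was asked for (no virtual dispatch, and the receiver is an
explicit parameter), but it does keep the class definition itself down to
a declaration-like skeleton.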


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread deadalnix

On Wednesday, 4 September 2013 at 18:00:21 UTC, Joakim wrote:
Well, if this kind of simple-minded pseudo-reasoning is to 
find resonance, it has to be targeted at a less critical 
audience.


Except there was little reasoning in my above two sentences, 
only two statements about the other thread.


That, my friend, is called self-destruction.

Now I'll have to invoke Poe's law and get out of that thread.


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Joseph Rushton Wakeling

On Wednesday, 4 September 2013 at 21:34:43 UTC, Ramon wrote:

for the info and your friendly offer to help. But I'm already
fine and settled thanks to some hints in the GDC forum and in
particular thanks to hints and help from Iain Buclaw (whose help
and work can't be praised enough).


I couldn't agree more. :-)


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread H. S. Teoh
On Wed, Sep 04, 2013 at 11:00:10PM +0200, Adam D. Ruppe wrote:
> I *hate* shell scripting. My rule is if it is more than three lines,
> do yourself a favor and use a real programming language. This is
> equally true on unix and windows.

I dunno, I find that windows batch files are so quirky, inconsistent,
and straitjacketed that they're nigh unusable for anything but the most
trivial uses. *nix shell scripts are a lot better.


> Well, actually, the limit with batch might be one line rather than
> three. But still, shells are for interactive entry. Doing any
> scripting on them is a filthy, time wasting, bug-prone hack.

I agree that bash scripting beyond simple uses is fragile and full of
unexpected holes (the primary culprit being the shell's over-eager
interpolation that sometimes interpolates multiple times per command,
and the lack of any usable built-in computational functions).  It's
generally pretty good for automating stuff you'd type by hand, but if
you need anything more complex like actual computations, data
manipulation, or control structures, I'd recommend Perl.

Or rather, D. :)
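
As a rough illustration (the converter name and paths below are made up),
the "do it to 1000 photos" case mentioned elsewhere in the thread is only
a few lines of D, with no quoting rules or command-line length limits to
trip over:

    import std.file : dirEntries, SpanMode;
    import std.path : baseName;
    import std.process : execute;
    import std.stdio : writeln;

    void main()
    {
        // Run a hypothetical converter once per file instead of
        // building one giant shell command line.
        foreach (entry; dirEntries("photos", "*.jpg", SpanMode.shallow))
        {
            auto r = execute(["convert-tool", entry.name,
                              "out/" ~ entry.name.baseName]);
            if (r.status != 0)
                writeln("failed on ", entry.name, ": ", r.output);
        }
    }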


> (Especially on unix where you get idiocy like "command line too long"
> even trying to do simple tasks like deleting a bunch of files!

At least bash isn't so stupid as to impose arbitrary command-line length
limits. But yeah, on *nixes where there is such a limit (and where it's
unreasonably small), it's a royal pain.


> Or the output to a pipe gets truncated due to terminal width - I kid
> you not, FreeBSD did that to me some years ago when I had to use it on
> a server. Drove me nuts.)

Hmm. I haven't seen this one before on Linux. A BSD-specific issue
maybe?


T

-- 
GEEK = Gatherer of Extremely Enlightening Knowledge


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Craig Dillabaugh

On Wednesday, 4 September 2013 at 21:34:21 UTC, Iain Buclaw wrote:
On 4 September 2013 22:08, Craig Dillabaugh 
 wrote:

clip


Really.Long.Names.For.Everything


I thought Powershell got deprecated...


I don't follow Windows much, but a quick check didn't turn up
anything. Maybe some of the Windows guys hanging around will
know. Apparently there is a Powershell 4 coming out


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Ramon

On Wednesday, 4 September 2013 at 20:51:03 UTC, Joseph Rushton
Wakeling wrote:

On Tuesday, 3 September 2013 at 21:26:46 UTC, Ramon wrote:

Now, if you will excuse me, I'll hurry to debian unstable *g


Latest GDC release is also in the soon-to-be released Ubuntu 
13.10, if that's useful to you. And check D.Announce for the 
latest info on D packages in Arch Linux.


I can't remember if you've tried to build from source, but for 
what it's worth that's now a fairly straightforward, albeit 
time-consuming, process. So don't be afraid to do that in order 
to have the latest release. I can help guide you through it if 
you like.


Thank you, J R,

for the info and your friendly offer to help. But I'm already
fine and settled thanks to some hints in the GDC forum and in
particular thanks to hints and help from Iain Buclaw (whose help
and work can't be praised enough).

In case someone else runs into similar problems:

debian unstable (or unstable-based derivatives) offers an
apt-gettable GDC-4.8 .deb (and phobos) which works well right out
of the box.

Thanks and A+ -R


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Iain Buclaw
On 4 September 2013 22:08, Craig Dillabaugh  wrote:
> On Wednesday, 4 September 2013 at 20:37:40 UTC, Sean Kelly wrote:
>>
>> On Sep 2, 2013, at 2:04 PM, Walter Bright 
>> wrote:
>>
>>> On 9/2/2013 1:36 PM, H. S. Teoh wrote:

 It's things like this "keyhole interface", that caused me to be
 convinced that the GUI emperor has no clothes, and to turn to CLI-only
 development.
>>>
>>>
>>> One of the giant failures of the GUI interface, and that VS suffers from,
>>> too, is when you need to do repetitive operations.
>>>
>>> On the CLI, I constantly use the history list, and I constantly write
>>> throwaway scripts to automate what I'm doing at the moment. It makes
>>> everything I do, no matter how obscure, only 2 or 3 keypresses.
>>>
>>> With VS, or any GUI, if there's not a button to do it, I'm reduced to:
>>>
>>> move mouse
>>> click
>>> move mouse
>>> click
>>
>>
>> Most editors these days have an option to record and playback macros.
>> Does VS really not have this?
>>
>>
>>> Sounds easy, right? It is easy. Now do it to 1000 photos. With a command
>>> line tool:
>>>
>>> write a script that does it to one picture, name it cc.bat
>>
>>
>> The problem I've encountered on Windows is that its default batch language
>> is terrible.  Any reasonable amount of command-line scripting requires
>> either a different shell or ports of all the Unix tools.
>
>
> Newer versions of Windows have Powershell, which as a Linux/CLI
> guy, I must admit is reasonably close to something like BASH
> (perhaps even superior for some tasks), and there are aliases to
> many of the Unix commands.   I think it may be the default batch
> language in the latest Windows versions.
>
> However, while it has aliases to many Unix-like commands it
> doesn't have everything, and it also suffers from the Java disease
> of having:
>
> Really.Long.Names.For.Everything

I thought Powershell got deprecated...

-- 
Iain Buclaw

*(p < e ? p++ : p) = (c & 0x0f) + '0';


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread H. S. Teoh
On Wed, Sep 04, 2013 at 01:10:53PM -0700, Sean Kelly wrote:
> On Sep 2, 2013, at 2:04 PM, Walter Bright  wrote:
[...]
> > Sounds easy, right? It is easy. Now do it to 1000 photos. With a
> > command line tool:
> > 
> > write a script that does it to one picture, name it cc.bat
> 
> The problem I've encountered on Windows is that its default batch
> language is terrible.  Any reasonable amount of command-line scripting
> requires either a different shell or ports of all the Unix tools.

"Those who don't understand Unix are condemned to reinvent it, poorly."

;-)


T

-- 
It is the quality rather than the quantity that matters. -- Lucius Annaeus 
Seneca


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Joseph Rushton Wakeling

On Tuesday, 3 September 2013 at 21:26:46 UTC, Ramon wrote:

Now, if you will excuse me, I'll hurry to debian unstable *g


Latest GDC release is also in the soon-to-be released Ubuntu 
13.10, if that's useful to you. And check D.Announce for the 
latest info on D packages in Arch Linux.


I can't remember if you've tried to build from source, but for 
what it's worth that's now a fairly straightforward, albeit 
time-consuming, process. So don't be afraid to do that in order 
to have the latest release. I can help guide you through it if 
you like.


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Adam D. Ruppe
I *hate* shell scripting. My rule is if it is more than three 
lines, do yourself a favor and use a real programming language. 
This is equally true on unix and windows. Well, actually, the 
limit with batch might be one line rather than three. But still, 
shells are for interactive entry. Doing any scripting on them is 
a filthy, time wasting, bug-prone hack. (Especially on unix where 
you get idiocy like "command line too long" even trying to do 
simple tasks like deleting a bunch of files! Or the output to a 
pipe gets truncated due to terminal width - I kid you not, 
FreeBSD did that to me some years ago when I had to use it on a 
server. Drove me nuts.)
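
To be fair to the real-language route: the "bunch of files" case is
only a handful of lines of D. A rough, untested sketch, assuming the
files to delete match a *.tmp glob in the current directory:

// Rough sketch: the moral equivalent of "rm *.tmp", except no argument
// list is ever built, so there is no "command line too long" to hit.
import std.file : dirEntries, remove, SpanMode;

void main()
{
    foreach (entry; dirEntries(".", "*.tmp", SpanMode.shallow))
        remove(entry.name);
}

dirEntries walks the directory lazily, one entry at a time, so it never
matters how many files there are.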


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Timon Gehr

On 09/04/2013 08:00 PM, Joakim wrote:

On Wednesday, 4 September 2013 at 13:23:19 UTC, Timon Gehr wrote:

On 09/04/2013 11:26 AM, Joakim wrote:

On Tuesday, 3 September 2013 at 21:34:42 UTC, Timon Gehr wrote:

On 09/03/2013 06:33 PM, Joakim wrote:

Sure, but I did provide demonstration, that thread.


That thread seems to demonstrate a failure of communication.


By whom?  [...]



When communication fails, there is usually not a single side
responsible for it. (Unless one side is trolling. Trolls are typically
anonymous.)


Except that trolling has nothing to do with communication failure


Good trolling is often _indistinguishable_ from communication failure.


...


"Any" impartial observer would notice the personal attacks, even if
that observer was completely ignorant of the discussion topic. "Any"
impartial observer would interpret those as lack of a well-reasoned
argument and decide to spend his time impartially observing something
more interesting.


I call it like I see it.


Great.


Except that you then criticize me


I don't criticize people, I question arguments. If you think these two 
things should be conflated, I beg you to reconsider.



for "personal attacks" and name-calling, [...]
...


There are multiple possibilities to replace the above statement in a way 
I would disapprove of, eg:


- "I call it like I don't see it."

- "I state inevitable fact."


An impartial observer can determine whether what
you call "personal attacks" (more like a labeling of the usually silly
or wrong tenor of their arguments,
and of the kind of person who generally makes such dumb arguments) are
accurate.


How? Accuracy of conclusions of fallacious reasoning is mostly
incidental. Consider googling "ad hominem", "association fallacy" and
"fallacy of irrelevance".


[...] what "incidental" means. :)


It means: "Occurring by chance in connection with something else." A 
possible reason why informal reasoning makes use of heuristics is that 
they often work by chance in some evolutionarily relevant contexts.



[...]they make several statements that are just factually
wrong, [...]


IIRC you more or less successfully debunk some factually wrong 
statements. Not all of them were actually made, though.



If you [...] don't [...] know the facts, there can be no discussion,


One of the points of a discussion is to exchange facts and to widen 
one's understanding of different viewpoints.



which is why I bailed on that thread.
...


There are less intrusive ways of doing that.


If you want to take a long thread full of arguments about the topic
and pick out a little name-calling
and then run away, clearly the argument is lost on you.



Frankly, I'm unimpressed. It's you who picked out the name-calling
instead of arguments when summarizing the past discussion. In case any
valuable arguments were part of that discussion, I'd advise picking
those out instead and putting them in a coherent form.


I called them what they are,


As I see it, it is irrelevant in a discussion how anyone may classify 
anyone else taking part in that discussion. It is often even irrelevant 
who those people are. I'm just saying that if the goal is to make one's 
reasoning and opinions available to a potential reader, making them 
inconvenient to read and seemingly irrelevant is not the way to go.



[...] which isn't really name-calling
but an accurate description,


:o)


and noted one of their main [...] arguments,
 so I did both.


No point can be made by noting that one hasn't made a specific 
fallacious argument or by noting that somebody has defended another 
point poorly.



[...]

On Wednesday, 4 September 2013 at 00:25:30 UTC, deadalnix wrote:

You seem confused by the difference between saying something and
providing conclusive evidence.


That thread _is_ conclusive evidence.  [...]


(Please do not mess up the threading.)

[...]

Well, if this kind of simple-minded pseudo-reasoning is to find
resonance, it has to be targeted at a less critical audience.


Except there was little reasoning in my above two sentences, only two
statements about the other thread.


Exactly. (Or rather, one statement about the other thread and one 
irrelevant statement about a community member.)


So a point of contention appears to be that some assume that evidence 
should be given in the form of reasoning or at least be accompanied by 
reasoning, whereas others don't?



[...] I'll leave this "meta-discussion" here, as you two are clearly
incapable of dealing with


Typically the ones incapable of dealing with something leave.


my actual arguments.


What actual arguments are there? ("Go look for them yourself." is not a 
valid answer.)


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Craig Dillabaugh

On Wednesday, 4 September 2013 at 20:37:40 UTC, Sean Kelly wrote:
On Sep 2, 2013, at 2:04 PM, Walter Bright 
 wrote:



On 9/2/2013 1:36 PM, H. S. Teoh wrote:
It's things like this "keyhole interface", that caused me to 
be convinced that the GUI emperor has no clothes, and to turn 
to CLI-only development.


One of the giant failures of the GUI interface, and that VS 
suffers from, too, is when you need to do repetitive 
operations.


On the CLI, I constantly use the history list, and I 
constantly write throwaway scripts to automate what I'm doing 
at the moment. It makes everything I do, no matter how 
obscure, only 2 or 3 keypresses.


With VS, or any GUI, if there's not a button to do it, I'm 
reduced to:


move mouse
click
move mouse
click


Most editors these days have an option to record and playback 
macros.  Does VS really not have this?


Sounds easy, right? It is easy. Now do it to 1000 photos. With 
a command line tool:


write a script that does it to one picture, name it cc.bat


The problem I've encountered on Windows is that its default 
batch language is terrible.  Any reasonable amount of 
command-line scripting requires either a different shell or 
ports of all the Unix tools.


Newer versions of Windows have Powershell, which as a Linux/CLI
guy, I must admit is reasonably close to something like BASH
(perhaps even superior for some tasks), and there are aliases to
many of the Unix commands.   I think it may be the default batch
language in the latest Windows versions.

However, while it has aliases to many Unix-like commands it
doesn't have everything, and it also suffers from the Java disease
of having:

Really.Long.Names.For.Everything


Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Sean Kelly
On Sep 2, 2013, at 2:04 PM, Walter Bright  wrote:

> On 9/2/2013 1:36 PM, H. S. Teoh wrote:
>> It's things like this "keyhole interface", that caused me to be
>> convinced that the GUI emperor has no clothes, and to turn to CLI-only
>> development.
> 
> One of the giant failures of the GUI interface, and that VS suffers from, 
> too, is when you need to do repetitive operations.
> 
> On the CLI, I constantly use the history list, and I constantly write 
> throwaway scripts to automate what I'm doing at the moment. It makes 
> everything I do, no matter how obscure, only 2 or 3 keypresses.
> 
> With VS, or any GUI, if there's not a button to do it, I'm reduced to:
> 
> move mouse
> click
> move mouse
> click

Most editors these days have an option to record and playback macros.  Does VS 
really not have this?

> Sounds easy, right? It is easy. Now do it to 1000 photos. With a command line 
> tool:
> 
> write a script that does it to one picture, name it cc.bat

The problem I've encountered on Windows is that its default batch language is 
terrible.  Any reasonable amount of command-line scripting requires either a 
different shell or ports of all the Unix tools.
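
For what it's worth, Walter's "do it to 1000 photos" loop also writes
itself in D, sidestepping both cmd.exe and the need for Unix tool
ports. A rough, untested sketch; "cc" here just stands in for whatever
his cc.bat actually invokes, and the *.jpg-in / *.png-out convention is
my own assumption:

// Hypothetical batch loop: run a per-file converter over every photo in
// the current directory. "cc" is a stand-in, not a real tool.
import std.file : dirEntries, SpanMode;
import std.path : setExtension;
import std.process : execute;
import std.stdio : stderr;

void main()
{
    foreach (photo; dirEntries(".", "*.jpg", SpanMode.shallow))
    {
        auto result = execute(["cc", photo.name,
                               photo.name.setExtension("png")]);
        if (result.status != 0)
            stderr.writefln("failed on %s:\n%s", photo.name, result.output);
    }
}

The same source runs unchanged on Windows and *nix, and execute() never
goes through a shell, so neither quoting rules nor argument-length
limits come into play.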

Re: Had another 48hr game jam this weekend...

2013-09-04 Thread Joakim

On Tuesday, 3 September 2013 at 04:29:54 UTC, Walter Bright wrote:

On 9/2/2013 6:13 PM, deadalnix wrote:
Unless the industry is showing signs of understanding, I'm done 
with this stuff. When amateurs can do better for free, you are 
not providing any service, you are just scamming your customers.


I don't know about scamming, but I find the business practice 
of ignoring people who want to throw money at you to be utterly 
baffling.


For example, I want to watch Forbrydelsen. It's only available 
as Region 2 DVDs. I have several dvd/bluray players, none will 
play it. What the hell? It's 6 years old. Who is making money 
off of me not being able to watch it?


(Amazon sez: "It won't play on standard DVD/Blu-ray players 
sold in the United States.")


I'm unimpressed.


It's an issue of rights negotiation.  Someone has to go buy the 
rights for each of those shows for every region and type of 
technology (broadcast, DVD, internet); each one is handled 
separately.  Because there are no standardized contracts or 
pricing, these deals take forever, and they simply don't bother 
if the market is too small, i.e. you and the three other people 
who want to watch Forbrydelsen, whatever that is. ;) If it costs 
them more to hire the high-priced lawyers to cut these deals than 
they will get from foreign sales, they don't bother.


This is what bit torrent is for:

http://bitsnoop.com/
http://thepiratebay.sx/
http://www.transmissionbt.com/

I've watched the full runs of HBO shows like Game of Thrones and 
Boardwalk Empire and any popular movie I want, in HD, through 
these torrent sites.  I discovered an Australian reality show 
called My Restaurant Rules through a torrent site, despite never 
having heard of it anywhere else, and enjoyed it enough that I 
watched the entire second season through torrent almost a decade 
ago (http://en.wikipedia.org/wiki/My_Restaurant_Rules#Series_two).


I haven't had any cable, HBO, or online video subscription 
service in more than a decade; I've probably rented one, maybe 
two, DVD/blurays during that time.  It's all moving online 
anyway, only a question of when.

