Re: How does buffering actually work?

2019-02-28 Thread Cleverson Casarin Uliana via Digitalmars-d-learn

Thanks Adam, I'll test it later.

Cheers
Cleverson


Re: How does buffering actually work?

2019-02-28 Thread Adam D. Ruppe via Digitalmars-d-learn
On Thursday, 28 February 2019 at 23:07:44 UTC, Cleverson Casarin 
Uliana wrote:
How am I supposed to use the flush function? I've found it in 
the std.stdio reference, but calling it in my code gives 
"undefined identifier", even though std.stdio is imported.



try

stdout.flush();


flush is a member function of the File object.
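
For example, a minimal sketch (the prompt text and names are 
illustrative, not from your code):

import std.stdio;

void main() {
    write("First name: ");  // no trailing newline, so this can sit in the buffer
    stdout.flush();         // push the prompt out before blocking on input
    string line = readln(); // waits for Enter
    writeln("Got: ", line);
}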


Re: How does buffering actually work?

2019-02-28 Thread Cleverson Casarin Uliana via Digitalmars-d-learn

Hi, thank you both Ali and Sarn.

How am I supposed to use the flush function? I've found it in the 
std.stdio reference, but calling it in my code gives "undefined 
identifier", even though std.stdio is imported.


Greetings,
Cleverson


Re: How does buffering actually work?

2019-02-28 Thread sarn via Digitalmars-d-learn
On Thursday, 28 February 2019 at 21:17:23 UTC, Cleverson Casarin 
Uliana wrote:
It works almost perfectly, except that it doesn't wait for my 
first Enter after printing "First name: value1". Rather, it 
prints both "First name: value1" and "Second name: value2" 
together on the same line, then it starts to behave as 
expected, i.e. printing one line at a time and waiting for me 
to press Enter.


Perhaps that happened with some other variation of the code.  The 
code you wrote shouldn't work like that (at least, it didn't when 
I tried it).


Ali has some good answers for fixing your code.  (readf("\n") 
also works, BTW.)  Hopefully this helps with the "How does 
buffering actually work?" question:


D uses the system's standard C library for IO, like most 
programming languages do, so IO buffering isn't fundamentally 
different (but some high-level functions might have different 
behaviour).


The standard C library provides buffered IO for input and output. 
By default terminal IO is line-buffered (not sure if that's true 
on all systems), so you might see delays up until a newline, but 
line-by-line IO won't notice the buffering.
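
For instance, a sketch that makes the line buffering visible 
(exact behaviour depends on the platform's buffering):

import std.stdio;
import core.thread : Thread;
import core.time : seconds;

void main() {
    write("partial line");   // may not appear yet on a line-buffered terminal
    Thread.sleep(2.seconds); // the text above often stays invisible here
    writeln(" ...done");     // the newline triggers the flush
}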


What happens here?

write()
read()
write()
read()

The first write goes to the output buffer.  If the buffer ever 
gets full (or has a newline in the case of line buffering), the 
data gets flushed to the real output.


At the read, it's possible there's still some data in the output 
buffer that's not flushed.  If needed, you can explicitly call 
flush() to make sure there isn't.  If there happens to already be 
data in the read buffer, read() will take as much as it needs to. 
If there isn't enough, then real input will happen, and the call 
will block until data comes in.  The real read will ask for a 
chunk of data, which will often be more than the read() call 
needs.  The remainder gets put into the buffer (that's what it's 
for).  (The kernel and libc actually both have IO buffers.)


In any case, the second write won't happen until the read has 
finished.


Rinse and repeat for the remaining lines.
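
Putting that together with the flush suggestion, a hedged sketch 
of the prompt loop from the original post (readf("\n") just waits 
for Enter, per the note above; the values are illustrative):

import std.stdio;

void prompt(string name, double value) {
    write(name, value);  // no newline: may stay in the output buffer
    stdout.flush();      // make sure the prompt is visible before blocking
    readf("\n");         // block until Enter (a newline) arrives
}

void main() {
    prompt("First name: ", 1.0);
    prompt("Second name: ", 2.0);
}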


Re: How does buffering actually work?

2019-02-28 Thread Ali Çehreli via Digitalmars-d-learn

On 02/28/2019 01:17 PM, Cleverson Casarin Uliana wrote:

> I experimented with readln instead of readf, but then it doesn't
> recognize my Enter presses and hangs.
>
> What is a more suitable approach to this problem, please?

readln and strip work, and formattedRead can be useful as well. I have 
an example here:


  http://ddili.org/ders/d.en/strings.html#ix_strings.readln

import std.stdio;
import std.string;

void main() {
    char[] name;

    write("What is your name? ");
    readln(name);
    name = strip(name);

    writeln("Hello ", name, "!");
}
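
And a hedged sketch of the formattedRead variant (parsing a 
number from the line; the prompt and names are illustrative):

import std.stdio;
import std.format : formattedRead;

void main() {
    write("Enter a value: ");
    string line = readln();
    double value;
    line.formattedRead("%s", value); // parses the leading number from the line
    writeln("Got ", value);
}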

Ali



How does buffering actually work?

2019-02-28 Thread Cleverson Casarin Uliana via Digitalmars-d-learn
Hi all, this may be a really newbie question, but I'm not really used to 
medium/low-level stuff like this. I've actually done the same thing in 
Python, and it just works...


Suppose I want my program to print some content on a given line, let's say 
"First name: value1", then it should sit down and wait for me to press 
Enter, then print "Second name: value2", then sit down again, and so on. 
I've tried to do that the following way:


import std.stdio;

void myFunc(string name, double value) {
    write(name, value);
    string buf;
    readf(" %s", buf);
}

then I just call it multiple times:

void main() {
    myFunc("First name: ", value1);
    myFunc("Second name: ", value2);
    (...)
}

It works almost perfectly, except that it doesn't wait for my first 
Enter after printing "First name: value1". Rather, it prints both "First 
name: value1" and "Second name: value2" together on the same line, then 
it starts to behave as expected, i.e. printing one line at a time and 
waiting for me to press Enter.


I experimented with readln instead of readf, but then it doesn't 
recognize my Enter presses and hangs.


What is a more suitable approach to this problem, please?

Thanks,
Cleverson


dmd for Haiku OS

2019-02-28 Thread MGW via Digitalmars-d-learn
I was recently looking through Haiku OS and was surprised to find 
dmd 2.072 there.

There was only the executable, but Phobos and druntime were missing.

Are there any plans to port dmd to Haiku OS these days?
Is there a manual (or example) on porting dmd to Haiku OS?


Re: dcompute - Error: unrecognized `pragma(LDC_intrinsic)

2019-02-28 Thread Michelle Long via Digitalmars-d-learn
On Thursday, 28 February 2019 at 10:37:22 UTC, Nicholas Wilson 
wrote:
On Thursday, 28 February 2019 at 09:58:35 UTC, Michelle Long 
wrote:
I've included it in Visual D as a .di and it seems not to add it 
to the include line...


Is it in any way possible that it being a .di file would cause 
that? It seems to be an LDC issue, though LDC itself has some 
usage of it, I believe, and that works fine.


Could it be the LDC version? Or do I need to include something 
to make it work? It seems like it would be part of the compiler 
itself.


(I'm on cold and flu meds so apologies if this doesn't make 
sense)


It should be on the auto-import path of LDC, as that file and the 
others 
(https://github.com/ldc-developers/druntime/tree/ldc/src/ldc) 
are part of LDC's druntime, which should be imported by ldc's 
.conf file; it won't show up on the command line unless you do 
something (... brain not working properly...).


Irrespective of all that, the error comes from

https://github.com/ldc-developers/ldc/blob/f5a5324773484447953746725ea455d2827e6004/dmd/dsymbolsem.d#L1862

which should never happen because this branch should be taken

https://github.com/ldc-developers/ldc/blob/f5a5324773484447953746725ea455d2827e6004/dmd/dsymbolsem.d#L1807

because that pragma is explicitly checked for here

https://github.com/ldc-developers/ldc/blob/187d8198e63564c633f22f2ef4db2a31a8a600ce/gen/pragma.cpp#L110




Yeah, in the config it is:

// default switches appended after all explicit command-line switches
post-switches = [
    "-I%%ldcbinarypath%%/../import",

I've hard coded it and it still doesn't find them.

I've added -conf=

and nothing...

dcompute\driver\cuda\package.d(3): Error: module `dcompute` is in 
file 'ldc\dcompute.d' which cannot be read


Of course if I manually include it then it gives more problems...

It seems ldc is not even reading the conf file, or not using it if it is?


If you are right, then it seems you are suggesting it is an LDC 
bug... but that seems highly unlikely...


I am using Visual D; maybe it is all set up for dub, and dub 
takes care of configuring something that Visual D does not?



Note that I've had to include the dcompute imports (downloaded 
from git), rt (for xmalloc, I think), and the derelict modules.


I did those because when I added dcompute it would complain about 
the missing modules, so I had to hunt and peck to find them.



When I remove the ldc imports from Visual D, the error about the 
missing ldc.dcompute is at

module dcompute.driver.cuda;
public import ldc.dcompute;

which is in the ldc imports dir (which is why I manually included 
them; that solves it but then gives the intrinsics error).



In any case, it seems like a very strange bug, since 
`pragma(LDC_intrinsic)` should work fine. It's analogous to 
`pragma(msg)` not working in dmd...


Are you using the latest ldc? (1.14.0?)





Re: dcompute - Error: unrecognized `pragma(LDC_intrinsic)

2019-02-28 Thread Michelle Long via Digitalmars-d-learn
On Thursday, 28 February 2019 at 11:22:49 UTC, Michelle Long 
wrote:
On Thursday, 28 February 2019 at 10:37:22 UTC, Nicholas Wilson 
wrote:

[...]




Yeah, in the config it is

[...]


Also, is it possible that intrinsics are disabled?



Re: dcompute - Error: unrecognized `pragma(LDC_intrinsic)

2019-02-28 Thread Nicholas Wilson via Digitalmars-d-learn
On Thursday, 28 February 2019 at 09:58:35 UTC, Michelle Long 
wrote:
I've included it in Visual D as a .di and it seems not to add it 
to the include line...

Is it in any way possible that it being a .di file would cause 
that? It seems to be an LDC issue, though LDC itself has some 
usage of it, I believe, and that works fine.


Could it be the LDC version? Or do I need to include something to 
make it work? It seems like it would be part of the compiler 
itself.


(I'm on cold and flu meds so apologies if this doesn't make sense)

It should be on the auto-import path of LDC, as that file and the 
others 
(https://github.com/ldc-developers/druntime/tree/ldc/src/ldc) are 
part of LDC's druntime, which should be imported by ldc's .conf 
file; it won't show up on the command line unless you do 
something (... brain not working properly...).


Irrespective of all that, the error comes from

https://github.com/ldc-developers/ldc/blob/f5a5324773484447953746725ea455d2827e6004/dmd/dsymbolsem.d#L1862

which should never happen because this branch should be taken

https://github.com/ldc-developers/ldc/blob/f5a5324773484447953746725ea455d2827e6004/dmd/dsymbolsem.d#L1807

because that pragma is explicitly checked for here

https://github.com/ldc-developers/ldc/blob/187d8198e63564c633f22f2ef4db2a31a8a600ce/gen/pragma.cpp#L110


Re: dcompute - Error: unrecognized `pragma(LDC_intrinsic)

2019-02-28 Thread Michelle Long via Digitalmars-d-learn
On Thursday, 28 February 2019 at 02:35:59 UTC, Nicholas Wilson 
wrote:
On Wednesday, 27 February 2019 at 22:56:14 UTC, Michelle Long 
wrote:
Trying to get dcompute to work... after a bunch of issues 
dealing with all the crap, this is what I can't get past:


Error: unrecognized `pragma(LDC_intrinsic)`

This is actually from the ldc.intrinsics file, which I had to 
rename from .di to .d so it would be picked up by Visual D.


I upgraded to the latest LDC just before:

LDC - the LLVM D compiler (1.14.0git-1bbda74):
  based on DMD v2.084.1 and LLVM 7.0.1


pragma(LDC_intrinsic, "llvm.returnaddress")
void* llvm_returnaddress(uint level);


..\..\..\..\D\LDC\import\ldc\intrinsics.d(85): Error: 
unrecognized `pragma(LDC_intrinsic)`


I've got no idea why that is the case, although I note that you 
shouldn't need to change intrinsics.di to intrinsics.d; as a 
.di file it only needs to be on the include path, which should 
already be the case for LDC.


I've included it in Visual D as a .di and it seems not to add it to 
the include line...


Is it in any way possible that it being a .di file would cause 
that? It seems to be an LDC issue, though LDC itself has some usage 
of it, I believe, and that works fine.


Could it be the LDC version? Or do I need to include something to 
make it work? It seems like it would be part of the compiler itself.




Re: how to pass a malloc'd C string over to be managed by the GC

2019-02-28 Thread Kagamin via Digitalmars-d-learn

byte[] snappyCompress(in byte[] plaintext) {
    import deimos.snappy.snappy;

    size_t output_length = snappy_max_compressed_length(plaintext.length);
    byte[] output = new byte[output_length];

    if (snappy_compress(cast(const(char)*) plaintext.ptr, plaintext.length,
                        cast(char*) output.ptr, &output_length) == SNAPPY_OK) {
        byte[] compressed = output[0 .. output_length];
        // < do something magical here
        return compressed;
    }
    assert(0);
}

Snappy works on bytes, not text; char is the wrong type there.
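
For the thread-title question itself (handing a malloc'd C buffer 
over to the GC), one common approach is simply to copy the data 
into GC memory and free the original; a minimal sketch (the helper 
name and signature are hypothetical):

import core.stdc.stdlib : free;

// Hypothetical helper: copy a malloc'd C buffer into a GC-managed array,
// then release the original allocation.
byte[] adoptCBuffer(byte* p, size_t len) {
    byte[] managed = p[0 .. len].dup; // .dup allocates on the GC heap and copies
    free(p);                          // the malloc'd buffer is no longer needed
    return managed;
}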