GCC 14.1.0 now available for Dreamcast's KallistiOS, including GDC support.
According to the KallistiOS Dreamcast maintainers, the new GCC version has been integrated, with support for all GCC frontends: https://twitter.com/falco_girgis/status/1788064911689404612
Re: Safer Linux Kernel Modules Using the D Programming Language
On Wednesday, 11 January 2023 at 09:52:23 UTC, Walter Bright wrote: By the way, back in the 80's, I wrote my own pointer checker for my own use developing C code. It was immensely useful in flushing bugs out of my code. There are vestiges of it still in the dmd source code. But it ran very slooowwly, and was not usable for shipped code. A lot of very capable engineers have been working on this problem C has for many decades. If it was solvable, they would have solved it by now. It is kind of "solved", by turning all computers into C machines: Solaris on SPARC with ADI, https://docs.oracle.com/cd/E53394_01/html/E54815/gqajs.html Android with MTE, https://source.android.com/docs/security/test/memory-safety/arm-mte iOS with PAC, https://developer.apple.com/documentation/security/preparing_your_app_to_work_with_pointer_authentication FreeBSD with CHERI, https://www.cheribsd.org/ Intel messed up their MPX design, but certainly won't want to be left behind. This basically acknowledges that only having bounds and pointer checking via hardware memory tagging will fix C-derived issues, and that all mitigations thus far have failed one way or another.
Re: Safer Linux Kernel Modules Using the D Programming Language
On Monday, 9 January 2023 at 20:07:01 UTC, areYouSureAboutThat wrote: On Monday, 9 January 2023 at 11:04:24 UTC, Patrick Schluter wrote: On Monday, 9 January 2023 at 09:08:59 UTC, areYouSureAboutThat wrote: On Monday, 9 January 2023 at 03:54:32 UTC, Walter Bright wrote: Yes, as long as you don't make any mistakes. A table saw won't cut your fingers off if you never make a mistake, too. And yet, people keep using them (table saws). Don't underestimate the level of risk humans are happily willing to accept in exchange for some personal benefit. And people literally kill themselves by overestimating their skills: https://youtu.be/wzosDKcXQ0I?t=441 Wood is a conductor? I never knew that. And yet, @safe is still not the default ;-) Because the DIP to make it the default treated calling extern(C) code as @safe, thus voiding its guarantees.
Re: Safer Linux Kernel Modules Using the D Programming Language
On Monday, 9 January 2023 at 07:23:48 UTC, Siarhei Siamashka wrote: On Monday, 9 January 2023 at 06:34:23 UTC, Paulo Pinto wrote: On Monday, 9 January 2023 at 04:31:48 UTC, Siarhei Siamashka wrote: ASAN, Valgrind, Clang Static Analyzer and plenty of other tools are the practical mechanisms to prevent buffer overflows. Yes, they are not baked into the ISO language standard. But D has no ISO language standard at all. The best part of memory-safe systems programming languages is that many of those tools don't even have to exist; they are part of the language semantics! Memory-safe systems programming language is an oxymoron. To be suitable for systems programming, the language has to provide a mechanism to opt out of safety, at least for some parts of the code. These parts of code may have memory safety bugs. The compiler of the safe language itself may have bugs. Valgrind is very useful for troubleshooting D issues and this usefulness won't go away any time soon. Here's one example: https://forum.dlang.org/post/msjrcymphcdquslfg...@forum.dlang.org There is a big difference between having to audit 100% of the source code, as in C, and auditing just 1%. One of the reasons why Burroughs is still available as Unisys ClearPath MCP is that a couple of agencies that care about secure servers above anything else are willing to keep paying for it, alongside the safety guarantees provided by NEWP.
Re: Safer Linux Kernel Modules Using the D Programming Language
On Monday, 9 January 2023 at 04:31:48 UTC, Siarhei Siamashka wrote: On Monday, 9 January 2023 at 03:54:32 UTC, Walter Bright wrote: Buffer overflows are trivial to have in C, and C has no mechanism to prevent them. ASAN, Valgrind, Clang Static Analyzer and plenty of other tools are the practical mechanisms to prevent buffer overflows. Yes, they are not baked into the ISO language standard. But D has no ISO language standard at all. The best part of memory-safe systems programming languages is that many of those tools don't even have to exist; they are part of the language semantics!
Re: The D Programming Language Vision Document
On Wednesday, 6 July 2022 at 15:43:25 UTC, ryuukk_ wrote: On Wednesday, 6 July 2022 at 14:30:07 UTC, Paulo Pinto wrote: On Tuesday, 5 July 2022 at 12:34:57 UTC, ryuukk_ wrote: On Monday, 4 July 2022 at 05:30:10 UTC, Andrej Mitrovic wrote: [...] GC is one of D's strengths because it is optional; not making core APIs bring-your-own-memory-allocation-strategy through @nogc or allocators is making it no longer optional, which is no longer a strength imo. You don't want GC when you do microcontroller development, so as a result core APIs (most of them) become useless; moving forward, that should make the story better for everyone. Which becomes a strength again. Feel free to consider it a strength, when in reality it is a flaw against established market players. https://www.microej.com/ https://www.wildernesslabs.co/ https://www.ptc.com/en/products/developer-tools/perc https://www.aicas.com/wp/products-services/jamaicavm/ https://www.astrobe.com/ People also use nodejs and npm, what is your point? If you invest in the future you must take the pragmatic approach and give options. Those are not Oracle's products; companies took the JVM for what it is as a foundation and built their products. They haven't picked the default Oracle JDK and the default concurrent GC. D should enable similar stories with what it has and can provide; read up on the challenges TinyGo faced. If D provides the tools for companies to experiment with it, with a proper set of efficient and minimal core APIs, that alone makes it a proper and more efficient alternative solution. My point is that GC hate is not a fixed problem in D, and the vision does little to fix it. Meanwhile, the language communities that embraced GC in embedded deployment are at least 20 years ahead of D in production deployments, where no-GC keeps being a reason to rejoice, like the comment I replied to. The future was when Andrei's book came out.
Re: The D Programming Language Vision Document
On Tuesday, 5 July 2022 at 12:34:57 UTC, ryuukk_ wrote: On Monday, 4 July 2022 at 05:30:10 UTC, Andrej Mitrovic wrote: On Sunday, 3 July 2022 at 08:46:31 UTC, Mike Parker wrote: You can find the final draft of the high-level goals for the D programming language at the following link: https://github.com/dlang/vision-document Under 'Memory safety': Allow the continued use of garbage collection as the default memory management strategy without impact. The GC is one of D's strengths, and we should not "throw the baby out with the bath water". Under 'Phobos and DRuntime': @nogc as much as possible. Aren't these the polar opposites of each other? The GC is one of D's strengths, yet we should avoid it as much as possible in the standard library. Then it's not part of D's strengths. GC is one of D's strengths because it is optional; not making core APIs bring-your-own-memory-allocation-strategy through @nogc or allocators is making it no longer optional, which is no longer a strength imo. You don't want GC when you do microcontroller development, so as a result core APIs (most of them) become useless; moving forward, that should make the story better for everyone. Which becomes a strength again. Feel free to consider it a strength, when in reality it is a flaw against established market players. https://www.microej.com/ https://www.wildernesslabs.co/ https://www.ptc.com/en/products/developer-tools/perc https://www.aicas.com/wp/products-services/jamaicavm/ https://www.astrobe.com/
Re: Adding Modules to C in 10 Lines of Code
On Sunday, 5 June 2022 at 22:41:14 UTC, Walter Bright wrote: On 6/4/2022 10:54 PM, Paulo Pinto wrote: That paper had a real implementation to follow along. I didn't see it. while Lucid and IBM products were real things one could buy. Are those *C* compilers doing imports for *C* code? What C compilers have imports:
gcc - nope
clang - nope
VC - nope
Digital Mars C - nope
C Standard - nope
ImportC - yes!
https://clang.llvm.org/docs/Modules.html And I am out of this thread.
Re: Adding Modules to C in 10 Lines of Code
On Sunday, 5 June 2022 at 04:42:44 UTC, claptrap wrote: On Saturday, 4 June 2022 at 12:29:59 UTC, Paulo Pinto wrote: On Saturday, 4 June 2022 at 07:40:36 UTC, Walter Bright wrote: On 5/29/2022 11:13 PM, Paulo Pinto wrote: Precompiled headers for C and C++ were certainly a module-like system (I know because I wrote one), but they are horrific kludges. I guess going around the horn is a synonym for let's pretend there wasn't prior art and keep arguing D did it first, as usual. Yikes, what's up with you man? You need a hug? Nah, fed up with the "D did it first, other languages keep copying us" meme. Don't worry, I'll refrain in the future.
Re: Adding Modules to C in 10 Lines of Code
On Saturday, 4 June 2022 at 19:26:27 UTC, Walter Bright wrote: On 6/4/2022 5:29 AM, Paulo Pinto wrote: I guess going around the horn is a synonym for let's pretend there wasn't prior art and keep arguing D did it first, as usual. Writing a paper is not doing it first. That paper had a real implementation to follow along, while Lucid and IBM products were real things one could buy.
Re: Adding Modules to C in 10 Lines of Code
On Saturday, 4 June 2022 at 07:40:36 UTC, Walter Bright wrote: On 5/29/2022 11:13 PM, Paulo Pinto wrote: [...] Not the same as C doing the importing of C code. [...] Going around the horn, really doing it the hard way. Besides, writing a paper is not the same thing as implementing a working system. There were many proposals about adding modules to C, they just all went nowhere. [...] Yee gods, doing it with a database representation is certainly going around the horn. Note that neither D nor ImportC rely on some database or symbol table file. [...] Precompiled headers for C and C++ were certainly a module-like system (I know because I wrote one), but they are horrific kludges. I guess going around the horn is a synonym for let's pretend there wasn't prior art and keep arguing D did it first, as usual.
Re: Adding Modules to C in 10 Lines of Code
On Tuesday, 3 May 2022 at 01:54:16 UTC, forkit wrote: On Friday, 22 April 2022 at 19:54:13 UTC, Walter Bright wrote: On 4/17/2022 1:12 PM, Walter Bright wrote: https://nwcpp.org/ An online presentation. Monday at 7PM PST. Slides: https://nwcpp.org/talks/2022/modules_in_c.pdf Video: https://www.youtube.com/watch?v=2ImfbGm0fls Here is the answer to the 2 questions you posed in your presentation: (1) why has nobody done this in 40 years? (2) what went on with C++ for 20 years? It's simple. Rational choice theory tells us that the reward of the action must outweigh the costs incurred. Stepstone did it for Objective-C with #import, and Apple with module maps for C and Objective-C, the modules design that preceded C++ modules on clang. Then we have those failed attempts at fixing C, like SafeC. And if we count research work, Bjarne Stroustrup and Gabriel dos Reis did it back when they were teaching at Texas A&M University; here is the 2009 paper, "A Principled, Complete, and Efficient Representation of C++": https://www.stroustrup.com/gdr-bs-macis09.pdf Both Visual Age for C++ v0.4 and Lucid's Energize C++ did it 40 years ago, by serializing C++ code into a database representation; both failed due to the high hardware requirements of late-80's/early-90's PCs. Lucid Energize Demo in 1993: https://www.youtube.com/watch?v=pQQTScuApWk Its database system, Cadillac, "Foundation for a C++ Programming Environment": https://dreamsongs.com/Files/Energize.pdf Visual Age for C++ v4: http://www.edm2.com/0704/vacpp4/vacpp4.html The build environment is totally different from traditional compilers. The concept of header files and source code files is obsolete. VAC++ utilizes a global approach to definitions and implementations. That is, once a definition is processed it stays in memory for the duration of the build. To maintain compatibility, header files can still be #included.
This new approach to handling source code is disorienting at first and will make migrating existing code to the compiler somewhat difficult. Errors pertaining to objects being defined more than once will likely occur while migrating. Often these errors are incorrect. The workaround is to remove the #include line in the source file that contains the offending “redefinition”. https://books.google.de/books?id=ZwHxz0UaB54C=PA206_esc=y#v=onepage=false Additionally, the way pre-compiled headers work on C++ Builder and Visual C++, versus the UNIX way, meant that on Windows the reward of the action did not outweigh the costs incurred. Ironically, Visual C++ is the one leading C++ modules support anyway, thanks to Gabriel dos Reis being part of the team and building on those 2009 learnings.
Re: On the D Blog: A Gas Dynamics Toolkit in D
On Wednesday, 2 February 2022 at 12:53:50 UTC, Paulo Pinto wrote: On Wednesday, 2 February 2022 at 08:14:32 UTC, Mike Parker wrote: The University of Queensland's Centre for Hypersonics has [a gas dynamics toolkit](https://gdtk.uqcloud.net/) that, since 1994, has evolved from C, to C++, and now to D. Peter Jacobs, Rowan Gallon, and Kyle Damm wrote a little about it for the D Blog. The blog: https://dlang.org/blog/2022/02/02/a-gas-dynamics-toolkit-in-d/ Reddit: https://www.reddit.com/r/programming/comments/sij99d/they_wrote_a_gas_dynamics_toolkit_in_d/ And HN, https://dlang.org/blog/2022/02/02/a-gas-dynamics-toolkit-in-d/ Sorry, copy-paste gone wrong; here is the right URL: https://news.ycombinator.com/item?id=30176778
Re: On the D Blog: A Gas Dynamics Toolkit in D
On Wednesday, 2 February 2022 at 08:14:32 UTC, Mike Parker wrote: The University of Queensland's Centre for Hypersonics has [a gas dynamics toolkit](https://gdtk.uqcloud.net/) that, since 1994, has evolved from C, to C++, and now to D. Peter Jacobs, Rowan Gallon, and Kyle Damm wrote a little about it for the D Blog. The blog: https://dlang.org/blog/2022/02/02/a-gas-dynamics-toolkit-in-d/ Reddit: https://www.reddit.com/r/programming/comments/sij99d/they_wrote_a_gas_dynamics_toolkit_in_d/ And HN, https://dlang.org/blog/2022/02/02/a-gas-dynamics-toolkit-in-d/
Re: Why I Like D
On Friday, 14 January 2022 at 02:13:48 UTC, H. S. Teoh wrote: On Fri, Jan 14, 2022 at 01:19:01AM +, forkit via Digitalmars-d-announce wrote: [...] [...] [...] [...] How is using D "losing autonomy"? Unlike Java, D does not force you to use anything. You can write all-out GC code, you can write @nogc code (slap it on main() and your entire program will be guaranteed to be GC-free -- statically verified by the compiler). You can write functional-style code, and, thanks to metaprogramming, you can even use more obscure paradigms like declarative programming. [...] When languages are compared on grammar and semantics alone, you are fully correct. Except we have this nasty thing called an ecosystem, where libraries, IDE tooling, OS, teammates, books, and contractors are also part of the comparison. And in that regard it doesn't matter how great D is against C# 10, if C# 10 gets me 90% there with a legion of libraries, IDE tooling, OS, teammates, and books to help me get there. I'm using the C# version explicitly here because, since C# 7, the language has grown multiple features that used to be only on D's side of the comparison. Naturally C# 10 was only an example among several possible ones that have a flourishing ecosystem and keep getting the features only D could brag about when Andrei's book came out 10 years ago.
Re: Why I Like D
On Thursday, 13 January 2022 at 15:44:33 UTC, Ola Fosheim Grøstad wrote: On Thursday, 13 January 2022 at 10:21:12 UTC, Stanislav Blinov wrote: TLDR: it's pointless to lament over irrelevant trivia. Time it! Any counter-arguments from either side are pointless without that. "Time it" isn't really useful for someone starting on a project, as it is too late when you have something worth measuring. The reason for this is that it gets worse and worse as your application grows. Then you end up either giving up on the project or going through a very expensive and bug-prone rewrite. There is no trivial upgrade path for code relying on the D GC. And quite frankly, 4 ms is not a realistic worst-case scenario for the D GC. You have to wait for all threads to stop on the worst possible OS/old-budget-hardware/program-state configuration. It is better to start with a solution that is known to scale well if you are writing highly interactive applications. For D that could be ARC. Just leaving this here, from a little well-known company: https://developer.arm.com/solutions/internet-of-things/languages-and-libraries/go ARC, tracing GC, whatever -- but make up your mind; otherwise, other languages that know what they want to be get the spotlight with such vendors.
Re: Why I Like D
On Thursday, 13 January 2022 at 10:21:12 UTC, Stanislav Blinov wrote: On Wednesday, 12 January 2022 at 16:17:02 UTC, H. S. Teoh wrote: [...] Oh there is a psychological barrier for sure. On both sides of the, uh, "argument". I've said this before but I can repeat it again: time it. 4 milliseconds. That's how long a single GC.collect() takes on my machine. That's a quarter of a frame. And that's a dry run. Doesn't matter if you can GC.disable or not, eventually you'll have to collect, so you're paying that cost (more, actually, since that's not going to be a dry run). If you can afford that - you can befriend the GC. If not - GC goes out the window. In other words, it's only acceptable if you have natural pauses (loading screens, transitions, etc.) with limited resource consumption between them OR if you can afford to e.g. halve your FPS for a while. The alternative is to collect every frame, which means sacrificing a quarter of runtime. No, thanks. Thing is, "limited resource consumption" means you're preallocating anyway, at which point one has to question why use the GC in the first place. The majority of garbage created per frame can be trivially allocated from an arena and "deallocated" in one `mov` instruction (or a few of them). And things that can't be allocated in an arena, i.e. things with destructors - you *can't* reliably delegate to the GC anyway - which means your persistent state is more likely to be manually managed. TLDR: it's pointless to lament over irrelevant trivia. Time it! Any counter-arguments from either side are pointless without that. You collect it when it matters less, like when loading a level; some loads take so long that people have even written mini-games that play during loading screens, so players won't notice a couple of ms more. That is hardly any different from having an arena throw away the whole set of frame data during loading. Unless we start talking about DirectStorage and similar.
Re: Why I Like D
On Wednesday, 12 January 2022 at 02:37:47 UTC, Walter Bright wrote: "Why I like D" is on the front page of HackerNews at the moment at number 11. https://news.ycombinator.com/news I enjoyed reading the article.
Re: Release D 2.097.0
On Friday, 18 June 2021 at 06:14:03 UTC, Martin Nowak wrote: On Monday, 7 June 2021 at 08:51:52 UTC, Bastiaan Veelo wrote: I am having issues as well, but I don't think the installer is at fault: I see the `C:\D\dmd2` directory get filled as the installer progresses, then files just disappear. It doesn't seem to be consistent though. After failure I tried with `dmd-2.096.1.exe` and the same thing happened, whereas it had installed fine before. I tried `dmd-2.097.0.exe` and this time the whole directory got wiped. I tried again and it installed fine. Windows 10 Pro N version 20H2 build 19042.985. I suspect MS cloud security scan. -- Bastiaan. Maybe we could recruit someone to replace the dated NSIS installer with a native msi installer. https://issues.dlang.org/show_bug.cgi?id=15375 https://en.wikipedia.org/wiki/List_of_installation_software#Windows Don't have much of a clue about Windows nowadays, maybe there are more suitable alternatives. Speaking with my Windows dev hat on, the suitable alternative would be to use MSIX package files, the replacement for MSI files, which date back to Windows 2000. https://docs.microsoft.com/en-us/windows/msix/packaging-tool/tool-overview In what concerns typical Windows development workflows, its use is quite simple for basic use cases: namely, add an MSIX project to the solution, configure the manifest file, and add as dependencies the .NET and C++ projects whose binaries are going to be part of the MSIX. With something like D I expect a somewhat more convoluted process. https://docs.microsoft.com/en-us/windows/msix/desktop/desktop-to-uwp-third-party-installer
Re: D Language Foundation Monthly Meeting Summary
On Thursday, 10 June 2021 at 10:55:50 UTC, sighoya wrote: On Saturday, 5 June 2021 at 09:14:52 UTC, Ola Fosheim Grøstad wrote: The current GC strategy is a dead end. No GC makes the language too much of a C++ with no real edge. D needs to offer something other languages do not, to offset the cost of learning the language complexities. I think the switch to ARC with cycle detection as opt-out (like in Python) is the right direction; it fits a systems-level language better, making use of destructors more often. Rewriting cyclic code to acyclic code is easier than lifetime management in general. Further, optimizations can be introduced that detect acyclic structures in D, just as is the case for Nim (https://nim-lang.org/docs/manual.html#pragmas-acyclic-pragma). That doesn't mean tracing GC is bad; I'm still skeptical that ARC + cycle detection is better than tracing in general for true high-level languages. Well, I advise reading "On Adding Garbage Collection and Runtime Types to a Strongly-Typed, Statically-Checked, Concurrent Language": http://www.bitsavers.org/pdf/xerox/parc/techReports/CSL-84-7_On_Adding_Garbage_Collection_and_Runtime_Types_to_a_Strongly-Typed_Statically-Checked_Concurrent_Language.pdf And watching the upcoming WWDC 2021 talk "ARC in Swift: Basics and beyond" on Friday: https://developer.apple.com/wwdc21/sessions What Cedar, Swift and Nim have in common that D lacks are fat pointers, compiler awareness of optimizations that elide counter-manipulation code, and tricks like background threads for the cycle collector or cascade deletions. It is no accident that high-performance reference-counting GC is similar to tracing GC in regards to implementation complexity.
Re: D Language Foundation Monthly Meeting Summary
On Monday, 7 June 2021 at 23:04:12 UTC, Norm wrote: On Saturday, 5 June 2021 at 08:58:47 UTC, Paulo Pinto wrote: On Friday, 4 June 2021 at 21:35:43 UTC, IGotD- wrote: On Friday, 4 June 2021 at 19:56:06 UTC, sighoya wrote: This uniformization sounds too good to be true. I think most people think that, but it's simply not true. malloc/free is incompatible with garbage collection. This is true, and even druntime has a malloc/free option for the GC. However, its implementation is really bad. Also, the implementation of the current GC has a lot of room for improvement. It is still not appropriate for many embedded systems, as it requires another layer that steals CPU time and code memory. Speaking of embedded, https://learn.adafruit.com/welcome-to-circuitpython https://blog.arduino.cc/2019/08/23/tinygo-on-arduino https://www.microsoft.com/en-us/makecode/resources http://www.ulisp.com/ https://developer.android.com/training/wearables/principles https://www.microej.com/product/vee/ Meanwhile kids, the future generation of developers, keep adopting the hardware and programming languages listed above, completely oblivious that there is a programming language where all discussion threads turn into GC vs no-GC no matter what the original subject was. There is also https://micropython.org/ I just skipped MicroPython, because CircuitPython seems to have more uptake even though it is based on it. It would not be my choice of language for medical devices, but uPython is used in a small number of embedded medical devices and has been ported to several flavours of STM32. This is a space where D could make a difference, although unfortunately the language has some dark corner cases and friction that put some people off, to the point where they don't see any benefit moving to D. Exactly, and the whole GC vs no-GC debate takes the language nowhere in that regard.
Re: D Language Foundation Monthly Meeting Summary
On Friday, 4 June 2021 at 21:35:43 UTC, IGotD- wrote: On Friday, 4 June 2021 at 19:56:06 UTC, sighoya wrote: This uniformization sounds too good to be true. I think most people think that, but it's simply not true. malloc/free is incompatible with garbage collection. This is true, and even druntime has a malloc/free option for the GC. However, its implementation is really bad. Also, the implementation of the current GC has a lot of room for improvement. It is still not appropriate for many embedded systems, as it requires another layer that steals CPU time and code memory. Speaking of embedded, https://learn.adafruit.com/welcome-to-circuitpython https://blog.arduino.cc/2019/08/23/tinygo-on-arduino https://www.microsoft.com/en-us/makecode/resources http://www.ulisp.com/ https://developer.android.com/training/wearables/principles https://www.microej.com/product/vee/ Meanwhile kids, the future generation of developers, keep adopting the hardware and programming languages listed above, completely oblivious that there is a programming language where all discussion threads turn into GC vs no-GC no matter what the original subject was.
Re: LWDR (Light Weight D Runtime) for Microcontrollers v0.2.3
On Sunday, 30 May 2021 at 14:28:25 UTC, Dylan Graham wrote: Github: https://github.com/0dyl/LWDR DUB: https://code.dlang.org/packages/lwdr Hi, all! This is LWDR (Light Weight D Runtime). It is a ground-up implementation of a D runtime targeting ARM Cortex-M microcontrollers and other microcontroller platforms with RTOSes (Real Time Operating Systems). It doesn't, and possibly may not, support all D features, in order to remain viable for constrained environments. For example, all memory allocation is done manually via `new` and `delete` - no GC. It works by providing a series of barebones API hooks (alloc, dealloc, assert, etc.), defined in `rtoslink.d`, which you must implement and/or point to your RTOS implementation. It can be compiled with either GDC or LDC and it is DUB compatible. It has so far been successfully run on a real STM32F407. LWDR currently supports the following language features:
- Class allocations and deallocations (via new and delete)
- Struct heap allocations and deallocations (via new and delete)
- Invariants
- Asserts
- Contract programming
- Basic RTTI (via TypeInfo stubs)
- Interfaces
- Static Arrays
- Virtual functions and overrides
- Abstract classes
- Static classes
- Allocation and deallocation of dynamic arrays
- Concatenate an item to a dynamic array
- Concatenate two dynamic arrays together
- Dynamic array resizing
The following features are experimental:
- Exceptions and Throwables (so far working on GDC only)
Not supported:
- Module constructors and destructors
- ModuleInfo
- There is no GC implementation
- TLS (thread local static) variables
- Delegates/closures
- Associative arrays
- Shared/synchronized
- Object hashing
- Other stuff I have forgotten :(
It is beta, so expect bugs. Great work!
Re: On the D Blog--Symphony of Destruction: Structs, Classes, and the GC
On Thursday, 18 March 2021 at 12:27:56 UTC, Petar Kirov [ZombineDev] wrote: On Thursday, 18 March 2021 at 09:21:27 UTC, Per Nordlöw wrote: [...] Just implementation deficiency. I think it is fixable with some refactoring of the GC pipeline. One approach would be, (similar to other language implementations - see below), that GC-allocated objects with destructors should be placed on a queue and their destructors be called when the GC has finished the collection. Afterwards, the GC can release their memory during the next collection. [...] Small correction: since .NET 5 / C# 9, implementing IDisposable isn't required if a Dispose() method is available. This was done as a performance improvement, allowing structs with deterministic destruction while avoiding implicit conversions to references when interfaces are used.
Re: Say Hello to Our Two New Pull-Request/Issue Managers
On Wednesday, 13 January 2021 at 11:33:44 UTC, Mike Parker wrote: I'm very, very happy that I can finally announce the news. Some of you may recall the job announcements I put out on the blog back in September [1]. Symmetry Investments offered to fund one full-time, or two part-time, Pull Request Manager positions, the goal being to improve the efficiency of our process (prevent pull requests from stagnating for ages, make sure the right people see the PRs in need of more than a simple review, persuade the right people to help with specific Bugzilla issues, etc). [...] Congratulations to everyone, thanks for making it happen, and good luck with the job.
Re: Our community seems to have grown, so many people are joining the Facebook group
On Wednesday, 30 December 2020 at 02:31:36 UTC, Murilo wrote: On Tuesday, 29 December 2020 at 15:06:07 UTC, Ola Fosheim Grøstad wrote: No, the OP clearly stated that he made the group "official". That is a deliberate attempt to fracture. I'm sorry you see it like this, but my intention when I created the group was to expand Dlang by bringing it to places people couldn't find it yet. The whole point of the FB group is to aggregate people into our community, to bring more people to Dlang and make Dlang famous. My whole intention was to help our community grow, not fracture it. You did well; many get all fancy about Facebook, yet have no issue using FOSS software paid for with Facebook's money. The more the better.
Re: Talk by Herb Sutter: Bridge to NewThingia
On Monday, 29 June 2020 at 18:29:54 UTC, Russel Winder wrote: On Mon, 2020-06-29 at 12:41 +, Paulo Pinto via Digitalmars-d-announce wrote: […] Concepts, coroutines, and modules are already in ISO C++20. Only once the standard is voted in. :-) Also ranges are in, I believe. And C++ co-routines are a much better story than the incompatible runtimes that currently exist for Rust's async/await. I have not used C++ co-routines, but having used Rust co-routines, they seem fine. You need to make good on your negative criticism – which I would like to hear. Rust has only standardized part of the async/await story; the asynchronous runtime is not part of the standard library, so currently it is impossible to write code that works flawlessly across the existing runtimes. https://stjepang.github.io/2020/04/03/why-im-building-a-new-async-runtime.html Additionally, there are still rough edges with lifetimes across async/await calls. Rust still needs to improve a lot in its tooling and ecosystem to cover many of the scenarios we use C++ for, even if it is safer. I can believe that may be true for others, but for me JetBrains CLion, Rustup, and Cargo make for an excellent environment. crates.io works very well – better than CLion, CMake, and lots of manual hacking around to get libraries for C++. The typical scenarios where we would use GPGPU shaders, iDevices, Android and Windows drivers, Arduino, SYCL, DPC++, Unreal, XBox/PS/Switch SDKs, ... Already the fact that it lacks an ISO standard is a no-go in many domains. That is a choice for those organisations. I am guessing those organisations do not use Java, D, Python, etc. Java has a standard to follow, updated for each language release. So it doesn't need to be ISO; it can be ECMA, or some other formal written specification, with multiple vendor implementations. I guess you mean using Python as glue for GPGPU libraries written in C++. In C, but yes. Though I haven't done it in a while now.
Re: Talk by Herb Sutter: Bridge to NewThingia
On Monday, 29 June 2020 at 12:17:57 UTC, Russel Winder wrote: On Mon, 2020-06-29 at 10:31 +, IGotD- via Digitalmars-d-announce wrote: […] Back to C++20 and beyond, which Herb Sutter refers to a lot. Is C++20 a success, or even C++17? Does anyone know this? Modern C++ isn't a programming standard, so what I've seen is just a mix of everything. I guess the question is whether concepts, coroutines, and modules finally make it in. The really interesting question is whether metaclasses make C++23. Concepts, coroutines, and modules are already in ISO C++20. And co-routines have a much better story than the incompatible runtimes that currently exist for Rust's async/await. [...] On the other hand people are stopping using C++ in favour of Go, Rust, Python, but it seems not D. Rust still needs to improve a lot on its tooling and ecosystem to cover many of the scenarios we use C++ for, even if it is safer. Already the fact that it lacks an ISO standard is a no-go in many domains. I guess you mean using Python as glue for GPGPU libraries written in C++.
Re: Talk by Herb Sutter: Bridge to NewThingia
On Monday, 29 June 2020 at 10:31:43 UTC, IGotD- wrote: On Saturday, 27 June 2020 at 15:48:33 UTC, Andrei Alexandrescu wrote: How to answer "why will yours succeed, when X, Y, and Z have failed?" https://www.youtube.com/watch?v=wIHfaH9Kffs Very insightful talk. Back to C++20 and beyond, which Herb Sutter refers to a lot. Is C++20 a success, or even C++17? Does anyone know this? Modern C++ isn't a programming standard, so what I've seen is just a mix of everything. I have lost track of all the new C++ features, and now he even refers to it as "NewLang", whatever that is. Is that Bjarne's famous quote "Within C++, there is a much smaller and clearer language struggling to get out."? I'll believe it when I see it. One thing that isn't mentioned, but is very important for a language to succeed, is libraries. C++ has a lack of standard libraries, which forces the programmer to look for third-party alternatives of varying standard. This means there is no particular programming API standard, and everything gravitates to the lowest common denominator. This is in contrast to Phobos, which is more complete. Does C++ need more language features or does C++ need better standard libraries? I would say the latter. If it weren't for Qt, C++ would just be a skeleton language. Qt is a great library and was that even before C++11, which proves that the new language features weren't that important. What do you think, did "modern C++" really succeed? Yes it did. Thanks to its rejuvenation and CUDA, C++ has become the main language in HPC and ML; NVidia now designs their GPUs with C++ semantics in mind, although CUDA is designed as a language-agnostic GPGPU environment. Metal Shaders and HLSL are largely based on C++14, and due to game developer pressure, Google and Samsung have taken the effort to make HLSL available on Vulkan as well, porting Microsoft's open-sourced HLSL compiler to SPIR-V.
On Windows, the Windows team is quite keen on pushing C++/WinRT (based on C++17) to eventually provide a .NET-like experience while using C++/WinUI, although the Visual Studio tooling is still lacking. Unreal already supports C++17, and GCC is discussing moving to C++17 as the default dialect. Apple, Google, Microsoft, Nintendo, Sony, NVidia, AMD, ARM still have lots of plans for it, even if they also own other language stacks on their SDKs. So, even for those of us who would rather spend our productive time in other stacks, occasionally dealing with C++ is unavoidable, and it will continue to be so for decades to come. Which is why good C++ compatibility is a very valuable sales pitch for any language. For D on Windows, that would mean improving COM support to deal with UWP as well, as it is COM vNext. Project Reunion plans support for C++, C#, Python and JavaScript.
Re: Codefence, an embeddable interactive code editor, has added D support.
On Saturday, 23 May 2020 at 19:49:33 UTC, welkam wrote: On Saturday, 23 May 2020 at 15:04:35 UTC, Paulo Pinto wrote: Hi everyone, as the subject states, you can find it here, https://codefence.io/ The current version is 2.092.0 with dmd. Regards, Why is such a thing free? Who pays for the servers? No idea, I just found it while browsing Reddit.
Codefence, an embeddable interactive code editor, has added D support.
Hi everyone, as the subject states, you can find it here, https://codefence.io/ The current version is 2.092.0 with dmd. Regards,
Re: LDC 1.21.0
On Friday, 24 April 2020 at 06:58:52 UTC, Sebastiaan Koppe wrote: On Friday, 24 April 2020 at 06:13:09 UTC, Paulo Pinto wrote: Great work! What is the status of WebAssembly support beyond betterC? Almost there. I originally planned to complete it last February. It turned out to be a bit more work because I didn't consider that I would need to port parts of phobos as well. I then set a new deadline in March. Covid-19 happened and my wife and I have been homeschooling our kids in between work. I haven't been able to find a lot of time to work on this. The druntime and phobos test-runners both compile, and I am using whatever spare time I have to fix or disable any tests that fail. Once that is done, I will open up PRs to both upstream repos. I do expect some discussions on that front, so it probably won't be a one-day deal. After that there will be a little celebration, since I already have a PR to update LDC's CI to create prebuilt releases. From my side, no need for explanations; it is already a much better experience with D and betterC than having to deal with emscripten. Many thanks for the work, all the best, and stay safe in these days.
Re: LDC 1.21.0
On Thursday, 23 April 2020 at 17:53:05 UTC, kinke wrote: Glad to announce an exciting LDC 1.21 release - some highlights: * Based on D 2.091.1+; LLVM upgraded to v10.0.0. * Experimental iOS/arm64 support - all druntime/Phobos unittests pass, thanks Jacob! The prebuilt macOS package supports cross-compilation out of the box, just add `-mtriple=arm64-apple-ios12.0` (ldc2/ldmd2) or `--arch=arm64-apple-ios12.0` (dub). * Initial support for GCC/GDC-style inline assembly syntax, primarily for portability across GDC and LDC (and GCC-style C). * Android improvements, incl. an important fix for x86 architectures, usability of the ld.gold linker, no more D main() requirement and reduced .so sizes. * Important AArch64 bugfix, especially for unoptimized code. Full release log and downloads: https://github.com/ldc-developers/ldc/releases/tag/v1.21.0 Thanks to all contributors! Great work! What is the status of WebAssembly support beyond betterC?
Re: Top Five World’s Most Underrated Programming Languages
On Friday, 18 January 2019 at 03:41:38 UTC, Brian wrote: On Monday, 14 January 2019 at 20:21:25 UTC, Andrei Alexandrescu wrote: Of possible interest: https://www.technotification.com/2019/01/most-underrated-programming-languages.html Because no software can use it. examples: 1. Docker uses Go. 2. Middleware systems use Java. 3. Shells use Python. 4. AI uses Python and R. 5. Desktop applications use Qt / C#. 6. Web frameworks & databases use PHP's Laravel and Java's Spring Boot. 7. Web uses JavaScript / TypeScript. Google is using Go for gVisor and Fuchsia, MIT for Biscuit, TUM (Munich) for userspace high-performance network drivers, in spite of the naysayers regarding Go and systems programming. Apparently Google is ramping up the use of Rust in Fuchsia and hiring quite a few devs. Azure IoT Edge uses a mix of C# and Rust. C# support for low-level systems programming is looking better every release since they started integrating Midori lessons into it, while making it beat TechEmpower and working closely with Unity. Now C# support is starting to be something all AAA devs wish for in their game engines, even if only for gameplay scripts, while Unity is betting on improving C# AOT compilation via their HPC# subset (C#'s -betterC in IL2CPP). D really needs its killer use case if it is to move away from that list.
Re: State of D 2018 Survey
On Friday, 2 March 2018 at 11:16:51 UTC, psychoticRabbit wrote: On Friday, 2 March 2018 at 10:21:05 UTC, Russel Winder wrote: ...continue with C in the face of overwhelming evidence it is the wrong thing to do. yeah, the health fanatics who promote their crap to governments and insurance agencies use very similar arguments about sugar, salt, alcohol, this and that, when really it's all about moderation, not prohibition (or increased taxes on things people say are bad). and science is so dodgy these days that even scientific evidence requires evidence. No, it is about costs and saving people's lives. It is cheaper to prevent diseases than to try to cure them afterwards, especially chronic ones that cause people's deaths. Likewise, it is cheaper to prevent security exploits caused by memory corruption by not having them, instead of having to pay millions of dollars in compensation to everyone who was impacted by one. c rules! Thanks to AT&T not being able to sell UNIX, giving it away at a symbolic price to universities like Berkeley, followed by a few startups like Sun and SGI basing their OS on it. Had AT&T been allowed to sell UNIX at the same price as VMS, OS/z and the others, C wouldn't rule anywhere. And if you like C so much, what are you doing in a safe systems programming language forum?
Re: State of D 2018 Survey
On Friday, 2 March 2018 at 10:21:05 UTC, Russel Winder wrote: […] There are those who use C because the only other option is assembly language, so they make the right decision. This is an indicator that high-level language toolchain manufacturers have failed to port to their platform. I'll wager there are still a lot of 8051s out there. I'll also wager the C++ compilers for that target do not realise C++, but a subset that is worse than using C. Even after 14 years of improvement. It is going to be interesting what happens when Rust begins to have toolchains to deal with microcontrollers. Hopefully though, ARM cores dominate now, especially given the silicon area is reputedly smaller than the 8051's. I've been out of the smartcard arena for over a decade now, and yet I bet it is all still very much the same. There are safer alternatives (Pascal and Basic), but they suffer from the same stigma that has pushed them out of the market, namely that they aren't offered in the chip vendor's SDK, thus requiring an additional purchase, which only a few bother with. http://turbo51.com/ https://www.mikroe.com/compilers
Re: State of D 2018 Survey
On Friday, 2 March 2018 at 04:38:24 UTC, psychoticRabbit wrote: On Friday, 2 March 2018 at 03:57:25 UTC, barry.harris wrote: Sorry little rabbit, you are misguided in this belief. Back in the day we all used C, and this is the reason most "safer" languages exist today. You can write pretty safe code in C these days, without too much trouble. We have the tooling and the knowledge to make that happen, developed over decades - and both keep getting better, because the language is not subjected to a constant and frequent release cycle. Ironically, the demands on programmers to adapt to constant change are actually making applications less safe - at least, that's my thesis ;-) The real problem with using C these days (in some areas) is more to do with its limited abstraction power, not its lack of safety. And also C is frowned upon (and C++ too, for that matter), cause most programmers are so lazy these days and don't want to write code - but prefer to just 'link algorithms' that someone else wrote. I include myself in this - hence my interest in D ;-) Keep those algorithms coming! Those tools have existed since 1979, so C programmers have had quite some time to actually use them. "To encourage people to pay more attention to the official language rules, to detect legal but suspicious constructions, and to help find interface mismatches undetectable with simple mechanisms for separate compilation, Steve Johnson adapted his pcc compiler to produce lint [Johnson 79b], which scanned a set of files and remarked on dubious constructions." Dennis Ritchie, https://www.bell-labs.com/usr/dmr/www/chist.html Also, anyone who wasn't using safer systems programming languages before C got widespread outside UNIX can spend some time educating themselves on BitSavers or Archive about all the systems outside AT&T that were developed in such languages since 1961. The first well-known one, the Burroughs B5000, has kept being improved and is sold by Unisys as ClearPath nowadays.
Or PL/8, used by IBM for RISC research, creating a compiler with a pluggable architecture similar to what many think of as LLVM's ideas, along with the respective OS. They only switched to C when they decided to bet on UNIX for going commercial with RISC. There are only two reasons we are stuck with C until we get to radically change computer architectures: UNIX-like OSes, and embedded developers who won't use anything else even at gunpoint. All the quantum computing research is using languages that don't have anything to do with C.
Re: We're looking for a Software Developer! (D language)
On Wednesday, 29 November 2017 at 12:05:06 UTC, Ola Fosheim Grostad wrote: On Wednesday, 29 November 2017 at 10:47:31 UTC, aberba wrote: to death learning this stuff in lectures. I learnt them beyond the syllabus years back on my own, at a much quicker pace. CS isn't about the languages themselves; that is trivial. Basically covered in the first or second semester. You become experienced and skilled when you're passionate about it. Sure, imperative languages are all mostly the same, and easy to learn once you know the basics (C++ being an exception). Learning frameworks takes time, but there are too many frameworks for anyone to master, and they are quickly outdated. So the only knowledge base that isn't getting outdated is the models from CS. Wirth puts it nicely: it is all about algorithms, data structures, and learning how to apply them to any language.
Re: We're looking for a Software Developer! (D language)
On Wednesday, 29 November 2017 at 10:47:31 UTC, aberba wrote: On Thursday, 8 January 2015 at 11:10:09 UTC, Johanna Burgos wrote: Your Mission Your Track Record Degree in Computer Science, or closely-related It baffles me that recruitment still works using this as a requirement. A CS graduate will never know any of these besides a basic intro to C, C++, HTML, CSS, databases, and basic hardware-software theory... without self-learning and practice. ... Sure they will; it is a matter of university quality. During my 5-year degree, we got to learn C++, Prolog, Caml Light and SML, x86 and MIPS Assembly, Pascal, PL/SQL, Java, Smalltalk. Those who took compiler design also had a look into Algol, Concurrent C, Oberon, Modula-3, Eiffel, Lisp. We had access to DG/UX, Aix, GNU/Linux, Mac System 7 and Windows as OSes. Additionally we got all the layers of OS development, from drivers to graphics programming, distributed computing using PVM and MPI, web design, architecture, algorithms and data structures, calculus, linear algebra, among many other concepts. Each area required projects to be delivered during each semester and a final examination. Sure, many can self-learn some of those topics, but it requires great discipline to keep the rhythm.
Re: Questionnaire
On Wednesday, 8 February 2017 at 18:27:57 UTC, Ilya Yaroshenko wrote: 1. Why does your company use D? a. D is the best b. We like D c. I like D and my company allowed me to use D d. My head likes D e. Because of marketing reasons f. Because my company can be more efficient with D for some tasks than with any other system language My company doesn't use D. 2. Does your company use C/C++, Java, Scala, Go, Rust? My company uses what the customers ask for; we don't get to choose that much. iOS projects - Objective-C, Swift Java projects - Java, Scala, Clojure Windows projects - C#, VB.NET Web projects - JavaScript on the frontend with a Java or .NET stack on the backend Android projects - Java Hybrid development across iOS and Android - Cordova, Ionic Across all projects, C++ as an infrastructure language for performance reasons or integration of C and C++ libraries. 3. If yes, what are the reasons not to use D instead? Customers don't ask for it in their RFPs, so we can't use it. 2. Have you used one of the following Mir projects in production: a. https://github.com/libmir/mir b. https://github.com/libmir/mir-algorithm c. https://github.com/libmir/mir-cpuid d. https://github.com/libmir/mir-random e. https://github.com/libmir/dcv - D Computer Vision Library f. std.experimental.ndslice No, as they don't fit the typical enterprise computing projects we work on. 3. If yes, can the Mir community use your company's logo in a section "Used by" or similar. 4. Have you used one of the following Tamedia projects in your production: a. https://github.com/tamediadigital/asdf b. https://github.com/tamediadigital/je c. https://github.com/tamediadigital/lincount No, I wasn't aware of their existence. 5. What does D miss to be a commercially successful language? A GC that can compete with the Java and .NET ones. IDE tooling at the same level as IntelliJ and Visual Studio. Above all, a killer project that makes customers ask us about employees with D skills.
As an example, the success of Docker and Kubernetes means our devops guys are slowly improving their Go skills in the newly introduced internal training. 6. Why do many top-notch system projects use the C programming language nowadays? Due to existing tooling and libraries, and because for many companies using a managed language + C is good enough and allows for cheaper developers. Also, many developers born after memory-safe systems languages lost the market to UNIX + C think that C was the first language to exist for systems programming. = All my current D projects are finished. Probably I will use other languages for production this year, Java/Go/whatever. Mir libraries are amazing and good quality. If you use them this would be a good motivation for us to improve the docs and provide regular updates. Plus, they can be enhanced during GSoC 2017. Thanks, Ilya
Re: Moving forward with work on the D language and foundation
On Monday, 24 August 2015 at 18:43:01 UTC, Andrei Alexandrescu wrote: Hello everyone, Following an increasing desire to focus on working on the D language and foundation, I have recently made the difficult decision to part ways with Facebook, my employer of five years and nine months. Facebook has impacted my career and life very positively, and I am grateful to have been a part of it for this long. The time has come for me, however, to fully focus on pushing D forward. As sorry as I am for leaving a good and secure career behind, I am excited many times over about the great challenges and opportunities going forward. Next step with the D Language Foundation is a formal talk with the foundation's prospective attorney tomorrow. I hope to get the foundation in motion as soon as possible, though I'm told there are numerous steps to complete. I will keep this forum posted about progress. I'm also glad to announce that the D Language Foundation already has a donor - I have decided to contribute my books' royalties to it. I encourage others to respond in kind. Thanks, Andrei Good luck with this new step in your career, and also for the D community. Although my focus is now elsewhere, this community is great, and even as a bystander it would be nice to see D flourish as a mainstream language, along with safer systems programming practices. Good luck, Paulo
Re: Now official: we are livestreaming DConf 2015
On Wednesday, 27 May 2015 at 21:05:23 UTC, Leandro Lucarella wrote: Dicebot, el 27 de May a las 19:26 me escribiste: On Wednesday, 27 May 2015 at 19:23:32 UTC, Kai Nacke wrote: Hi! Any chance to change the YouTube settings? Here in Germany I get only the message: Live streaming is not available in your country due to rights issues. Regards, Kai The whole live streaming feature seems to be banned in Germany :( I am afraid there is nothing that can be done apart from using a VPN. Are you sure? It's not like it would surprise me much, but I have the feeling I have seen live streaming in DE in the past without any trickery... Yes, for some reason the URL fell under the war Google is having with our beloved GEMA. Google refuses to pay the royalties GEMA asks for. To see how ridiculous the whole situation is, even schools had to search for GEMA-free songs for their children to sing, given how GEMA handles the whole royalties situation.
Re: Gary Willoughby: Why Go's design is a disservice to intelligent programmers
On Sunday, 29 March 2015 at 22:07:40 UTC, Laeeth Isharc wrote: should we add a link to the wiki and ask the author if we could mirror it there? This section on the wiki looks like it could do with a bit of fleshing out! http://wiki.dlang.org/Coming_From/Python I just saw what you did in the wiki, that's great! I don't have much time to invest tonight but I'll definitely do my part of the job in a day or two. Thank you for noticing. It's not very inspired, but I don't have much energy at the moment, and it is the best I can do with what I have. Better an acceptable start than trying to be perfect. The Ruby, Java, Eiffel, C#, and Basic sections also need starting. While not forgetting that Java, Eiffel, C#, and Basic have options to compile straight to native code, just like D, so the focus should be on other features and not on native vs VM. -- Paulo
Re: Gary Willoughby: Why Go's design is a disservice to intelligent programmers
On Monday, 30 March 2015 at 03:26:14 UTC, deadalnix wrote: On Sunday, 29 March 2015 at 16:32:32 UTC, Idan Arye wrote: Computer science is all about tradeoffs. I used to love Ruby, but then a Rails project got out of hand... Nowadays I use it mainly as a bash replacement - a hundredfold more expressive, with only a tiny bit of syntax overhead, and for things where bash's safety would be enough, Ruby's certainly suffices. This is pretty much the recurring story with Ruby. The first 10,000 lines are a lot of fun, and then it gets out of hand. Just like any other language with dynamic typing.
Re: Gary Willoughby: Why Go's design is a disservice to intelligent programmers
On Monday, 30 March 2015 at 08:53:15 UTC, Ola Fosheim Grøstad wrote: On Sunday, 29 March 2015 at 19:03:06 UTC, Laeeth Isharc wrote: On Sunday, 29 March 2015 at 15:34:35 UTC, Ola Fosheim Grøstad wrote: Actually, there is quite a large overlap if you look beyond the syntax. Dart is completely unexciting, but I also find it very productive when used with the IDE. Glad to hear this - I haven't yet got very far with Dart, but it seems like a toss-up between Dart and LiveScript for a passable language to run on the client (for my little use case). I don't know the future of Dart, but if you have time to wait for it you might consider AtScript/Angular 2.0. That future looks very dark, as the Angular team decided to go with TypeScript instead[0]. http://blogs.msdn.com/b/typescript/archive/2015/03/05/angular-2-0-built-on-typescript.aspx Now with the Dart team giving up on their VM, Dart becomes just yet another language that transpiles to JavaScript. http://news.dartlang.org/2015/03/dart-for-entire-web.html So, it will just fade away in the sea of JavaScript wannabe replacements. -- Paulo
Re: I'll be presenting at NWCPP on Jan 21 at Microsoft
On Friday, 23 January 2015 at 05:54:41 UTC, Walter Bright wrote: On 1/22/2015 12:52 PM, Gary Willoughby wrote: Me too, is there any video available? https://www.youtube.com/watch?v=IkwaV6k6BmM I can't bear to watch it, you'll have to do it for me! I enjoyed watching it. -- Paulo
Re: I'll be presenting at NWCPP on Jan 21 at Microsoft
On Saturday, 24 January 2015 at 13:03:33 UTC, Andrej Mitrovic wrote: On 1/23/15, MattCoder via Digitalmars-d-announce digitalmars-d-announce@puremagic.com wrote: My right ear can't hear too! :) While the youtube engineers are too lazy to fix this, in the meantime you can use the youtube-dl tool to download the video, watch it in VLC and select Audio-Select channel-Left (or something like that). Worked for me! Funny enough, I didn't have any audio problems. Just watched with FF Flash plugin in Windows 8.1. -- Paulo
Re: Anyone interested in embedding a JVM in their D app?
On Wednesday, 14 January 2015 at 09:29:25 UTC, Russel Winder via Digitalmars-d-announce wrote: On Wed, 2015-01-14 at 02:00 +, james via Digitalmars-d-announce wrote: I've been playing with jni.h and D. I think I've got a fully working jni.d and I have the start of a nicer D wrapper around it with djvm.d. Whilst I have tinkered with JNI, I have never had to really use it in anger. And I, and many others, really want to keep it that way, even though there are many who use it. It's like trying to program Python from C, only with worse performance. Performance is good enough if you take the same approach as remote method invocation, using a single call rather than multiple ones. There is JNA of course, which does some similar stuff; many use it, though I have never used it. The current fashion is (or will be) JNR (which leads to JEP 191). As far as I know JNA, JNR (and JEP 191) use JNI, more or less because they have to. The issue is to make using the adaptor as easy as possible. JNI is not easy; JNA is easy but slow; JNR is supposedly easy and fast, so hopefully JEP 191 will be. JNI is hard on purpose. Mark Reinhold said during JavaOne 2014 that it was made so to force Java developers to stay away from writing unsafe code, especially given Java's portability goal. Now with Java being adopted left and right for HPC and big data, that is a hindrance for integrating legacy code, hence the need for JNR, born out of the JRuby project. Interestingly enough, something like JNR was one of Microsoft's extensions to Java and the precursor of .NET P/Invoke. -- Paulo
Re: Visual Studio Community and .NET Open Source
On Friday, 21 November 2014 at 08:02:07 UTC, philippecp wrote: .Net does have a pretty damn good GC. It is both a moving garbage collector (improves locality, reduces heap fragmentation and allows memory allocation to be a single pointer operation) and a generational garbage collector (reduces garbage collection cost by leveraging the heuristic that most collected objects are usually very young). I believe their server GC is even concurrent, to avoid long stop-the-world pauses. The problem is I'm not sure how much of those principles can be applied to D. I can see moving objects being problematic given that D supports unions. Another thing to consider is that .Net's GC is the result of many man-years of full-time work on a single platform, while D is mostly done by volunteers in their spare time for many platforms. It would probably require a lot of work to port, unless you're volunteering yourself for that work ;) ... The official .NET runs on x86, x64, ARM (including Cortex variants), MIPS. It scales from embedded hardware running with 512KB of flash and 128KB of RAM (http://www.netmf.com/get-started/), all the way up to Azure deployments. http://www.microsoft.com/net/multiple-platform-support -- Paulo
Re: D2 port of Sociomantic CDGC available for early experiments
On 11.10.2014 at 06:43, dennis luehring wrote: On 11.10.2014 at 06:25, Andrei Alexandrescu wrote: On 10/10/14, 7:54 PM, Walter Bright wrote: On 10/10/2014 5:45 PM, Leandro Lucarella wrote: I still don't understand why we wouldn't use environment variables for what they've been created for, it's foolish :-) Because using environment variables to tune program X will also affect programs A-Z. Nope. Try this at your Unix command prompt: echo $CRAP CRAP=hello echo $CRAP CRAP=world echo $CRAP In Windows there are user environment variables (which Walter is talking about) and shell environment variables (like in your example); setting user environment variables will affect every program, which is why Java is not using them. And let's not forget about OS/400 or any of the other non-POSIX systems out there, unless D is never expected to target such OSes. -- Paulo
Re: [OT Security PSA] Shellshock: Update your bash, now!
On Wednesday, 1 October 2014 at 13:58:25 UTC, eles wrote: On Wednesday, 1 October 2014 at 13:41:43 UTC, JN wrote: On Wednesday, 1 October 2014 at 05:09:45 UTC, Nick Sabalausky wrote: I find it ironic that it's another big global security hole which Windows users don't even have to be concerned about. That's of course very true, since Windows runs on no serious servers. You would be surprised how some Fortune 500 companies are doing their serious work on 100% Windows servers. Sadly I need to comply with NDAs. -- Paulo
Re: DUB 0.9.22 released
On 22.09.2014 at 11:33, Sönke Ludwig wrote: After another longer-than-anticipated wait, the next release of the DUB package and build manager is finally ready. This is a major milestone with some important changes in the way dependency versions are handled, making it more robust for a rapidly growing ecosystem. The number of available packages is now well above the 300 mark and keeps growing steadily: http://vibed.org/temp/dub-packages.png But even more important, I'm pleased to announce that DUB is now officially developed as part of the D language ecosystem! Based on the decision back during this year's DConf, the repository has been migrated to the D-Programming-Language organization on GitHub [1], and we are now working towards a 1.0.0 milestone [2] that is supposed to be ready for inclusion into the official DMD installation package. If you can think of any potentially important and especially backwards-incompatible changes/additions, please mention them (ideally as GitHub tickets), so that we can include them before the 1.0.0 release. Major changes and additions in 0.9.22 include: - Improved dependency version handling scheme. Version upgrades are now explicit, with the current snapshot being stored in the dub.selections.json file. This is similar to how other popular systems, such as Bundler [3], work, but built into the core system. Committing dub.selections.json to the repository ensures that everyone gets the same (working) combination of dependency versions. - Branch based dependencies (e.g. ~master) have been deprecated due to their destructive influence on the package ecosystem. See the wiki [4] for more information, including on the alternative approaches that are now supported. - Simple DustMite [5] integration. Using the dub dustmite command it is now possible to reduce bugs in DUB packages with ease, even in complex package hierarchies.
The condition used for reduction can be specified in terms of exit code or as a regular expression on the output of either the compiler, linker, or final executable. - Added BASH and FISH shell completion scripts. - Added general support for single-file compilation mode, as well as separate compile/link mode for GDC. - Platform detection now also works when cross-compiling. - Added the * version specifier to match any version, and path based dependencies don't need to specify an explicit version anymore. As always, find the full list of changes in the change log [6] and download at: http://code.dlang.org/download [1]: https://github.com/D-Programming-Language/dub/ [2]: https://github.com/D-Programming-Language/dub/issues?q=is%3Aopen+is%3Aissue+milestone%3A1.0.0 [3]: http://bundler.io/ [4]: https://github.com/D-Programming-Language/dub/wiki/Version-management [5]: https://github.com/CyberShadow/DustMite/wiki [6]: https://github.com/D-Programming-Language/dub/blob/master/CHANGELOG.md This is great. I have been using it on my toy projects since code.dlang.org came into existence. Congratulations to everyone involved. -- Paulo
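As a sketch of the version-pinning scheme described above, a dub.selections.json committed to a repository might look like the following (the package names and versions here are made up for illustration; only the fileVersion/versions layout reflects the announcement):

```json
{
    "fileVersion": 1,
    "versions": {
        "vibe-d": "0.7.20",
        "some-dependency": "1.2.3"
    }
}
```

Everyone who checks out the repository then builds against exactly these dependency versions until an explicit upgrade regenerates the file.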
Re: D for the Win
On 21.08.2014 at 00:02, anonymous wrote: On Wednesday, 20 August 2014 at 21:43:26 UTC, Walter Bright wrote: On 8/20/2014 2:33 PM, anonymous wrote: Dlang Dlang Über Alles as a German, O_O I'm not surprised that the German programming community has taken to D. After all, German cars all have those D stickers on them :-) No, no, Dlang Dlang Über Alles is a take on Deutschland Deutschland über alles (Germany Germany over everything), the first verse of the national anthem as sung in Nazi times. I was actually worried that the author might be German. He's not, thankfully. He's from Israel. From a German author that would be an embrace of fascism. Coming from an Israeli, I don't really know where to put it; probably completely benign. As a Portuguese living in Germany, I would say not everyone outside Germany knows that. Especially the younger generations; they just use it because it sounds cool. -- Paulo
Re: Mago Debugger changes hands
On Tuesday, 12 August 2014 at 08:21:40 UTC, Manu via Digitalmars-d-announce wrote: Thanks Aldo for this very important work! Very sad to see you move on. Thanks also to Rainer for taking on another big project. I wouldn't be a D user if it weren't for both of your work. I think this stuff is much more important than the attention it tends to get from this relatively Linux-biased community. I suspect there is a much larger number of Windows-based users and lurkers than the representation in this forum tends to suggest. I for one.