You can specify one or more runtimes in the runtime-versions section of 
your buildspec file. If your runtime is dependent upon another runtime, you 
can also specify its dependent runtime in the buildspec file. If you do not 
specify any runtimes in the buildspec file, CodeBuild chooses the default 
runtimes that are available in the image you use. If you specify one or 
more runtimes, CodeBuild uses only those runtimes. If a dependent runtime 
is not specified, CodeBuild attempts to choose the dependent runtime for 
you. For more information, see Specify runtime versions in the buildspec 
file.
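For example, a buildspec that pins two runtimes might look like the following; the runtime names and version numbers are illustrative, and the versions actually available depend on the image you use:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      # Illustrative versions; use ones supported by your build image.
      nodejs: 16
      python: 3.11
  build:
    commands:
      - node --version
      - python --version
```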

NVM for Windows has been a good way to work with Node.js on Windows, but 
I've always felt it could be better. A *lot* better. I've made many changes 
and built many features that never saw the light of day (like a macOS 
version back in 2015). A prototype hooks system was built, but I never had 
time to finish. These are just a few things I've experimented with. 
Meanwhile, the JavaScript ecosystem continued to evolve.

I eventually decided to start working on a new cross-platform version 
manager in 2019. I made some solid progress, but I believe the community 
wants more than a version manager. Many tools have come out to manage the Node.js workflow, a concept that is often lumped in with "version management". I decided to rethink the whole thing, starting over from scratch and focusing on "environment management". This begins with managing JavaScript runtimes.

There are far more nuances to maintaining safe and secure development environments in organizations than there are for individual developers. The Runtime core will provide an updated version management experience for everyone, but there is a direct intention to address enterprise/organizational environment management and other needs via a commercial edition, add-ons, and services. I've set up a new company and have nearly completed initial fundraising (a process of finishing paid client work - VC comes later).

I intend to deprecate NVM for Windows after Runtime is released. There are 
two things slowing work on Runtime. First, client engagements. I am 
wrapping up the most time-consuming projects, which extended into Q2 2023. *My 
co-developer and I started working on this full-time on June 20, 2023.* The 
other thing slowing down Runtime development is this project. So, I'm *freezing 
feature development* on NVM for Windows.

*Update:* An AWOL employee left Corey with several months of incomplete work on a client project. He worked as fast as possible to catch up so he could focus exclusively on Runtime. This was finally resolved in February, and Corey is now working full-time on Runtime.

To be notified when the Runtime beta is ready, complete this form (the last 
survey question allows you to sign up). I'll also post Runtime updates on 
Twitter (assuming Elon doesn't paywall it) @goldglovecb.

But it looks like *Docker Desktop on Windows doesn't look there*. I found the daemon.json in the Docker Desktop *Settings, under Docker Engine*. I *added the entry manually* there, and *now it loads the nvidia runtime*. I hope it helps someone else.

Currently on Windows, mainCRTStartup runs before Rust's internal std main (which in turn calls the application's main function). This initializes the C/C++ runtime. However, as far as I understand it, Rust itself handles things like setting up the main thread and SEH exceptions. Rust is also very insistent that no constructors run before the application's main (aka "no life before main"), so there is no need to ensure constructors are initialized before the application's main is called. The Microsoft docs mention calling __security_init_cookie, but I think Rust has its own stack guard.

So I'm wondering: does mainCRTStartup do anything that's useful for Rust? I'm finding it hard to find definitive information on this. My tests so far don't show a problem with replacing the entry point with a stub that simply calls Rust's internal std main directly, but I may well be missing something very important.

Long story short, the Win32 loader creates the initial thread and runs a function called LdrInitializeThunk before the exe's (or dll's) entry point. This function is part of the OS, not the binary. As far as I'm aware, this function does all the necessary initialization for threads, TLS, etc. So for those keeping score, the functions run before the application's main are:

- LdrInitializeThunk (part of the OS loader)
- mainCRTStartup (the C/C++ runtime's entry point)
- Rust's std internal main (lang_start), which finally calls the application's main

I think it should be possible for Rust to set the entry point to lang_start 
(which is exported as main), therefore skipping mainCRTStartup. But I'm not 
100% certain the C initialization isn't doing anything useful for Rust, 
hence my question.

I think Rust code could be linked with C/C++ static libraries that need the CRT, and I also think it's legitimate to have global constructors in actual Rust code (albeit unsafe, and via inline asm or a C/C++ helper). So it seems problematic to unconditionally skip initializing the CRT; it would probably need to be an opt-in option, and would probably still need code to call global constructors.
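To illustrate the global-constructor point, here is a rough sketch of one way to register a constructor from Rust on the MSVC target without a C/C++ helper: place a function pointer in the CRT's initializer section (the approach crates like ctor take). It only fires if the CRT entry point actually performs its initializer walk, which is exactly why skipping that walk needs care:

```rust
use std::sync::atomic::{AtomicBool, Ordering};

static RAN_EARLY: AtomicBool = AtomicBool::new(false);

extern "C" fn early_init() {
    // Keep pre-main work minimal; std may not be fully set up this early.
    RAN_EARLY.store(true, Ordering::Relaxed);
}

// mainCRTStartup walks the .CRT$XC* sections before main; putting a function
// pointer into .CRT$XCU makes the CRT call it during that walk.
#[used]
#[link_section = ".CRT$XCU"]
static EARLY_INIT: extern "C" fn() = early_init;

fn main() {
    println!("constructor ran before main: {}", RAN_EARLY.load(Ordering::Relaxed));
}
```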

That link confuses me. What memory management and file IO need to be set up for C/C++ on Windows? Perhaps that's talking about managed code? Also, this is more of a nitpick, but the Windows C++ main function called by mainCRTStartup has this definition (as documented by Microsoft, including the optional envp parameter): int main(int argc, char *argv[], char *envp[]);

The thing is, I'm unsure whether C itself needs any particular setup on Windows. I know mainCRTStartup sets up the arguments for main, but Rust doesn't use them. I'm assuming it also does something to run (or arrange to run) global static constructors, which is where I think there could be a problem, but I'm unsure of the implementation.

In summary, for a pure Rust exe the CRT initialization can be skipped (I think), even if it calls C functions. However, when statically linking C/C++ libraries there may be issues if they use globals that require a function call before main.

For what it's worth, the entire source to mainCRTStartup is shipped with MSVC, in VC\Tools\MSVC\14.24.28314\crt\src\vcruntime of the installation. You can also get to it by walking up the call stack from main in the VS debugger.

The thing that concerns me about looking at source code or 
disassembling/debugging is the license. I don't know if having looked at 
the code could come back to bite me if I do write some code of my own.

The license is not a problem. That source is provided explicitly and only for the purpose of reading and debugging it; it is not used by VS for anything else (i.e., VS doesn't build it to produce the binary form of the vcruntime, it doesn't depend on it itself, etc.).

What I mean is that, sure, I'm allowed to look at the code. But what if one day I want to write and distribute my own Windows C runtime initializer, outside of VC++? Could my knowledge of Microsoft's code be used to accuse me of violating their copyright? I'd prefer not to have to deal with that.

I'm investigating how Rust compiles and links on Windows and how, or if, the experience could be improved. As part of that I've been looking into, for example, how Rust handles imported DLL functions, and wondering to what extent Rust depends on the Windows C startup routine.

Avoiding looking at someone else's code can be useful as an affirmative 
defense if you are accused of copyright infringement. This is a practice 
known as clean room design. It's probably useful only if you document the 
entire specification and implementation process sufficiently to prove it in 
court. If you aren't working with an independent written specification and 
a knowledgeable legal team, then it's probably not useful to worry about 
this.

If you are using the CRT then you need the CRT entry point. If you statically link to any C/C++ code then you most likely are using the CRT and therefore need the CRT entry point. If you write a pure Rust program where the only C/C++ you touch is in other DLLs, then you likely don't need the CRT and therefore don't need the CRT entry point. In the future, when rust-lang/rust#58713 is finally implemented, Rust will gain a new pure Rust target for Windows which won't link to the CRT at all and therefore won't use the CRT entry point.

The Project Manager for Java extension helps you to manage your Java 
projects and their dependencies. It also helps you to create new Java 
projects, packages, and classes. To get the complete Java language support 
in Visual Studio Code, you can install the Extension Pack for Java, which 
includes the Project Manager for Java extension.

By default, the Java Projects view is displayed below the *Explorer* view. If you cannot see it, try clicking the *...* button in the *EXPLORER* title bar and selecting *Java Projects*.

You can directly import existing Java projects and modules to your 
workspace through *File* > *Open Folder...* (Make sure the opened folder 
contains your build tool scripts, for example, pom.xml or build.gradle). VS 
Code for Java will detect your projects and import them automatically.

When you add a new module into your projects, you can trigger the command *Java: Import Java projects in workspace* to import it into your workspace. This command helps to import new projects into the workspace without the need to reload the VS Code window.

As Java evolves, it's common for developers to work with multiple versions of the JDK. You can map them to your local installation paths via the setting java.configuration.runtimes. The setting has the following format:
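```json
"java.configuration.runtimes": [
  {
    "name": "JavaSE-1.8",
    "path": "/path/to/jdk-8"
  },
  {
    "name": "JavaSE-17",
    "path": "/path/to/jdk-17",
    "default": true
  }
]
```

The paths above are placeholders for your local JDK installations; mark the runtime you want used by default with "default": true.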

If you want to change the JDK version for your Maven or Gradle projects, you need to update it in your build scripts (pom.xml or build.gradle). Clicking the corresponding project entry navigates to its build script file, where you can make such changes.

If your project is an unmanaged folder without any build tools, you can manage the dependencies by clicking the *+* or *-* icon on the *Referenced Libraries* node (or on the items under it), or you can simply drag your JAR libraries onto the *Referenced Libraries* node.
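If you prefer configuration over drag and drop, the referenced libraries can also be listed in settings.json through the java.project.referencedLibraries setting; the glob patterns below are only placeholders:

```json
"java.project.referencedLibraries": [
  "lib/**/*.jar",
  "/home/username/libs/foo.jar"
]
```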
