denoland/deno

"deno compile" into executable

matthewmueller opened this issue · 111 comments

I was wondering if there's any plan to support compiling deno scripts into a single executable?

I saw "Single executable" as a feature, but the code that follows seems to suggest it's deno as the single executable, not the programs executed with it.

It'd be awesome to support something like go build main.go or something like zeit/pkg.

I believe this is still on the long term roadmap.

ry commented

I'm into the idea - and it's quite doable with our infrastructure. I'd estimate ~2 weeks of my time to get it working. It seems very cool, but it's unclear to me what the actual use cases are - do you have any? I also think there are some more important features that should come first.

The main use case is to simplify deploys - you don't need to care about what's installed on the server because it's all present in that single executable, including the deno executable for the given architecture and OS.

You also tend to find issues at compile time, rather than runtime (e.g. we've all had something something this node module is not found after deploying to production).

You also tend to find issues at compile time, rather than runtime

I don't think a single binary helps or hurts in this case. Unless you are going to exercise all the code in tests, you aren't going to find issues. Building it into a binary doesn't help with that.

If you are using TypeScript modules, either local or remote with Deno, and you do a prefetch, all the types will be checked at least. Again, building this into a binary doesn't offer you anything more.

hayd commented

I think the use case is not server deploys but cli/gui apps where you want to lock down the deno version used and it's easier to distribute and call a single file.

The compile before deploy is solved with --prefetch if you include the cached deps.

I don't think a single binary helps or hurts in this case. Unless you are going to exercise all the code in tests, you aren't going to find issues. Building it into a binary doesn't help with that.

I think you're right about this - good point @kitsonk!

I think the use case is not server deploys but cli/gui apps where you want to lock down the deno version used and it's easier to distribute and call a single file.

It depends on what you're doing, but a single binary definitely makes it easier in those cases too!

Want to reiterate what @hayd says here. I like to create small single-use CLI tools that are essentially "done" and not having to worry about Deno version compatibility would be very enticing. Aside from security patches, it would be nice to have a binary that never needs upgrading.

A couple of use cases for me:

• CLI - Having a single executable makes distribution easy. Especially if cross-compiling is easy!

• Services built into simple build artifacts - This can be great for small container images, and generally nice for CI/CD workflows.

Support for this would push me over the fence of wanting to use Deno for real projects (once stable). Right now, it's not compelling enough for my use cases.

It seems very cool, but it's unclear to me what the actual use cases are

deno could solve an issue that Python and Node have: Ease of running foreign scripts for casual users.

The use case is simple: When my aunt asks me for a program to organize her holiday pictures on her Windows machine, she does not want to download and install a 30 MB programming environment - she wants a 20 KB do_stuff.exe she can double-click.

ry commented

Bert and I are actively working on this. Hopefully we'll have a demo in two weeks or so.

ry commented

FYI development has stalled on this feature while we are fixing other bugs in the system... but still on my back burner.

Thanks for the update, @ry. I'd be careful not to wait too long on getting some version of this going. It seems like a feature that could quickly become infeasible unless we support it early.

awkj commented

By the way, I've really found that ry is deeply influenced by Golang: study Golang, use Rust, write TypeScript.

Thank you for your work. Is there any progress?
I look forward to it so much.

@awkj Good observation, I just thought the same thing. To me, taking the good parts from other languages/platforms and integrating them into something we use is a great thing, so thanks @ry. Haters gonna hate 😎 (that issue gets me every time).

mnxn commented

@satishbabariya I think that's a vast oversimplification of the amount of work needed to make a compiler. Implementing all the features of js/ts in a compiler is a monumental task and I would assume to be out of the scope of this project. A much more feasible way would be to embed the already existing V8 VM into the executable.

I think that's a vast oversimplification

Agreed! Many, many people have tried to outperform V8 (and other advanced JavaScript runtimes). JavaScript (and therefore TypeScript) is such a permissive language that it takes a lot of work to get anything that performs well. Also, a "TypeScript AST" only solves part of the problem of creating what is offered by the runtime, like a highly performant garbage collector and a main event loop, just to start. None of that is expressed in a "TypeScript AST".

The snapshot building in Deno has been reworked and we are largely creating the internal Deno bundles from very similar code to deno bundle. We will likely unify that very soon, which leads to a path of creating something that can create a snapshot that includes the Deno runtime and the transpiled target program as one binary. For certain workloads it will greatly improve startup time too, I would suspect.

The snapshot building in Deno has been reworked and we are largely creating the internal Deno bundles from very similar code to deno bundle. We will likely unify that very soon, which leads to a path of creating something that can create a snapshot that includes the Deno runtime and the transpiled target program as one binary. For certain workloads it will greatly improve startup time too, I would suspect.

@kitsonk That's what I assumed the path would be for such a feature.

This would still be great to see!

For me, the major benefit of having this feature is to be able to make commercial versions of the code. Businesses usually don't want to expose their code to their customers.

In Node.js, there is a package called "pkg" which compiles javascript. I am not sure how they do it but I'm guessing they are forcing V8 to compile each function into bytecode and then put everything into a binary file along with node's executable.

No. This discussion predates that thread by about a year, and even before it was raised, compiling user code into a binary was something I think Ry and I talked about.

NW.js does this by sort of "cheating". It's a single exe, but it's actually a very clever zip file that extracts itself and bootstraps node on first execution. I really appreciate the single-exe deployment option with NW.js, and I've shipped boxed software with it, but would love to see deno take a better approach (if possible?).

Our approach (which I don't know if it is better or not) would be something like this:

  • Deno bundles modules in a bundle.
  • Deno loads the resulting bundle into a v8 isolate and takes a snapshot.
  • Deno builds a binary that includes Deno and this additional snapshot.

The move to rusty_v8 should make this whole process easier, but it still isn't an easy process.

One of the biggest challenges is there are things that can be done to make snapshotting difficult. We have run into all sorts of strange things in building the internal snapshots for Deno, but we are doing some things that end users would never need to do. The challenge still is that snapshotting can throw some really strange errors that would be hard to provide back to users in a meaningful, actionable way. Something we would need to consider carefully.

I believe that @kitsonk's approach is the way to go, and many NodeJS developers (like me and @andyfleming) may finally switch to Deno when this feature arrives.

@kitsonk I think that's significantly more "compilation" than what zeit/pkg did the last time I inspected their resulting bundle/binary.

It simply bundled the files used by the entry script (all required JS is traversed and copied into a snapshot filesystem) and Node.js into a single executable which unpacks everything in RAM and starts it. I don't think they even bundled the JS with something like rollup; they didn't mention it in their docs, and my inspection of the binary confirmed as much. They did later develop a separate tool called ncc for bundling, and you could feasibly use it first and feed the bundled script to pkg, I guess.

Back when I toyed with it, if you used the strings CLI utility you could find bits of the JS source code inside, and even filenames; runtime traces would contain paths. Maybe it's different now; I didn't test later versions nor follow their development. We did use it to ship "boxed" code to people without them needing to bother with Node.js, and it worked great. The "boxed" code would work on any Linux x86_64, so in a lot of ways it was similar to Go wrt deployment. Fantastic for containers as well.

I feel that by grabbing a V8 snapshot you'd be skipping the lexing/parsing part of the process, and startup would be faster. And for people that want their shipped code to be "obfuscated" and hard to reverse-engineer, it would hardly get better than that with JS/TS.

Adding a use case. I would love to be able to build and distribute small CLI based applications (that would accept arguments) specifically for CI platforms. A utility that has a dependency on a particular language/framework being installed is a compatibility limitation for many users who would otherwise make use of the tool if it could be shipped as its own binary.

A small example of this, for instance, is the AWS CLI v1 vs v2. The first version of the AWS CLI required Python to be installed, even if you were building a Node.js application. The new v2 CLI ships without the need for Python to be installed locally.

@kitsonk, is there anything the community could do to help you and other core developers on this topic?

It likely requires someone who understands V8 and Rust to a decent degree. In particular someone familiar with V8 snapshots. At a high level the process needs to be:

  • Use the compiler APIs to generate a bundle of JavaScript.
  • Load that JavaScript in a Deno CLI V8 isolate.
  • Take a snapshot of that isolate.
  • Write the Deno CLI binary and the snapshot out to disk in a single executable file (this is the part I personally don't know how to do yet, though Bert or Ry might have an idea).

The first is basically available now, and the second two should be possible given the infrastructure we have; for the last, as I indicated, I'm personally not even sure of an approach right now.

It is certainly something that will get attention post 1.0. It is very much a desired feature. It just is on the complex end of the scale.

Just to share the info: the #4402 PR now lets Deno load non-static snapshots. With the ability to load snapshots at runtime we are one step closer to making a single bin containing deno + snapshot + a bit of glue.

This is a very interesting feature.
I think before it can be attempted, some lower level primitives are needed.

First, making a snapshot of an isolate which is based on a snapshot. See this comment: #1877 (comment)
This is the LoC related to it:

// TODO(ry) Support loading snapshots before snapshotting.

Secondly, how is Deno going to produce a binary of itself? What are common concepts for doing something like this?
Do we even need the whole binary included or could we use a smaller binary that just supports the basics of executing JS? Where would that come from?
I think a PoC approach could be to leverage the Rust compiler in order to compile bytes into a Deno binary. Example here:

include_bytes!(concat!(env!("OUT_DIR"), "/CLI_SNAPSHOT.bin"));
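For illustration, here is a minimal sketch of that include_bytes! mechanism, with a build script standing in for the real snapshot generation (the file name and the placeholder bytes are hypothetical; a real build would produce serialized V8 data the way Deno produces CLI_SNAPSHOT.bin):

// build.rs (sketch): a real setup would evaluate the bundled JS in a V8
// isolate and serialize the isolate; here we just write placeholder bytes.
use std::{env, fs, path::Path};

fn main() {
    let out_dir = env::var("OUT_DIR").unwrap();
    let snapshot: &[u8] = b"placeholder for the serialized V8 snapshot";
    fs::write(Path::new(&out_dir).join("USER_SNAPSHOT.bin"), snapshot).unwrap();
}

// main.rs (sketch): the snapshot bytes are baked into the executable at
// compile time, exactly like Deno's own CLI_SNAPSHOT.bin above.
static USER_SNAPSHOT: &[u8] =
    include_bytes!(concat!(env!("OUT_DIR"), "/USER_SNAPSHOT.bin"));

fn main() {
    // A real binary would hand these bytes to V8 as its startup snapshot.
    println!("embedded snapshot: {} bytes", USER_SNAPSHOT.len());
}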

@mraerino yup! You are thinking exactly along the same lines here. I think the first step would be to just be able to write out the snapshot to disk and bring it back in. Trying to have it be a single executable I believe is going to be difficult... but if we got to the point of deno run --bin=snapshot.bin or something, trying to inline it would be another step.

yeah, unless we load the cli utils from plain javascript we won't be able to snapshot the running isolate right now

From my perspective, I see two complementary paths.

The primary path would be a Rust toolchain approach to support CI/CD based building with code signing workflows. The more we can make compiling a Deno + Snapshot look like normal compilation, the more workflows it can integrate with. Less magic, friendly to human audits, and it allows for downstream innovation by developers recombining the parts.

The second path is enabling Deno to create the binary executables itself, in the great tradition of self-extracting archives. I have a bit of experience with these in Python with py2exe and the like. Essentially the Snapshot payload is (re-)linked into a binary that looks for the Snapshot binary data at "the right place". In earlier days, that right place was aligned to the end of the binary, allowing you to simply concatenate the payload onto the executable. Unfortunately, the technique is dependent on the format of the binary: PE, PE/32, Mach-O, COFF, ELF, etc.

That said, the Rust compiler folks might have some modern tricks, insight, and code that could dramatically help!
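To make the "concatenate the payload onto the executable" idea concrete, here is a rough sketch of the packing side in Rust (the trailer format and file names are hypothetical, and it deliberately ignores the binary-format and code-signing issues just mentioned):

// Sketch: copy the runtime binary, append the bundled script, then append a
// fixed-width trailer recording where the payload starts and how long it is.
use std::fs;
use std::io::Write;

fn pack(runtime: &str, script: &str, out: &str) -> std::io::Result<()> {
    let mut bytes = fs::read(runtime)?; // the deno-like executable
    let payload = fs::read(script)?;    // the bundled JS to embed
    let offset = bytes.len();
    bytes.extend_from_slice(&payload);
    // Hypothetical trailer; a real tool must keep the PE/Mach-O/ELF headers
    // and any code signature valid, which is exactly the hard part above.
    write!(bytes, "\nDENO_PAYLOAD {:016x} {:016x}\n", offset, payload.len())?;
    fs::write(out, bytes)
}

fn main() -> std::io::Result<()> {
    pack("deno", "bundle.js", "my_app") // hypothetical file names
}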

Maybe we can learn from the .NET single-file executable publish mode:
pack all files into a single file and give this file to the user; when the user clicks this executable, it extracts itself to a tmp dir and runs the compiled js file.
Sorry for my bad English - I think this is an easy way to implement single-file executable publishing.

This is a rust tool that basically does this: https://github.com/dgiagio/warp
Sadly, it's not released as a crate.

One thing to think about is closed-source binaries. Tools like warp that simply compress and decompress the source code wouldn't work for scenarios like that, and having closed-source compilation would give deno a serious advantage over NodeJS.

warp would allow compiling in deno snapshots, which do not expose the JS/TS source code

Ah, my bad, it looked like Warp was decompressing into a local cache. In that case I think warp looks pretty promising.

it looked like Warp was decompressing into a local cache

that's indeed what it's doing. But deno allows snapshotting the V8 state after the initial parse of the user script, so if we run from that snapshot, the source code is not exposed

I'd be willing to take a shot at this, but I'm not the most familiar with Rust. Might be better to have a developer with more experience implement this.

Correct me if I'm wrong here's what I understand looking at the comments;

  1. Write some code on your development machine
  2. Once done invoke Deno to take a snapshot of the V8 isolate / state and then export it (though I'm not sure if it can be exported as a file, my initial thought for this step comes from this comment)
  3. Use Warp (or similar tool / Rust code) to pack Deno, the snapshot and probably a launcher script (for packed Deno to load the extracted snapshot on the target machine)
  4. Distribute packed, self-contained executable
  5. On the target machine, once that self-contained executable is run, let the packer (Warp or, again, something similar to that) unpack the contents (the first time, or after an update) and run the launcher script to run Deno and make it load the snapshot (at runtime)

So basically the code can still be closed source and wouldn't be exposed. Although I'm not a Rust developer, I'm totally into Deno.

FWIW:

I just tried Warp with deno bundle:

$ mkdir bundle/
$ cp $HOME/.deno/bin/deno bundle/
$ deno bundle https://deno.land/std/http/file_server.ts bundle/bundle.js
$ cat <<'EOF' >> bundle/launch
#!/bin/sh
DIR="$(cd "$(dirname "$0")"; pwd -P)"
exec "$DIR/deno" run --allow-net --allow-read "$DIR/bundle.js" "$@"
EOF
$ chmod +x bundle/launch 
$ warp-packer -a linux-x64 -i bundle -e launch -o file_server

And then:

$ ./file_server 
HTTP server listening on http://0.0.0.0:4507/

It worked like a charm.

And yes, the extracted files are in $HOME/.local/share/warp/packages; it's not "protected" like zeit/pkg is.

Maybe a binary instead of a js bundle could help:

deno bundle --bin https://deno.land/std/http/file_server.ts snapshot.bin
deno run --bin  --allow-net --allow-read snapshot.bin

Warp is nice; it uses a similar approach to gzexe, and because of that it needs to be 'extracted'. It would be nice to be able to configure this and run directly, as zeit/pkg does.

Total noob here, but could there be a way for, say, two or more Deno executables to share snapshots with each other on the same machine if the snapshots they use are somewhat similar, to save resources?

I could see the first Deno executable launched start a service that tracks snapshots, and every subsequent Deno executable checking first the tracking service before spinning up snapshots of their own.

At the OS level this could make a lot of sense. I could see this being used in Linux distros etc.

Just a thought.

Would it be worth letting the OS handle the sort of copy-on-write optimisation you're talking about, @PhilParisot?

Hey @lukebrowell, like I said, total noob here, but my impression is that apps based on frameworks, for example electron, consume a huge amount of memory at runtime partly because they spin up their individual runtime environments (in electron's case, nodejs and chromium) for the sake of portability. They don't share their runtimes with all other concurrent electron apps on the same machine and I'm not sure why that optimization isn't done at the OS level.

It's a huge detractor.

Again, I could be totally wrong. Thoughts?

@PhilParisot I believe the case is different for Deno and I vote to not have this concern about snapshot sharing.

On the examples you provided, the problem is in Electron's design, and the solution is not in the language or the way it compiles binaries, but in choosing not to incorporate a huge browser engine inside the binary. Instead, it can use the system default engine or a smaller one, which is done in NeutralinoJS, Electrino, Quark, DeskGAP, Sciter, Azula and many others (more info here).

But, also, this is just my 2c. 😃

I'm in agreement with @paulocoghi; snapshot sharing is definitely a cool feature. However, this problem is more about how to have Deno output a native binary, as opposed to a complete GUI application like what Electron provides. I'm sure in the coming days someone will figure out how to make Electron work with Deno (I'd be interested in creating a project like that myself), and at that point snapshot sharing would become essential.

FWIW: In my very small example above, bundle.js is ~180 KiB and Deno itself is ~51 MiB. It isn't worth over-complicating the solution to share the snapshot or runtime if you don't "extract" the binary anywhere. The best approach is to do like zeit/pkg or nexe and just make a regular uncompressed binary.

You can always use gzexe or another tool to do this afterwards (gzexe compresses deno to 20 MiB).

It seems promising to look at, and learn what could eventually be helpful from, the approach of Tauri (github and website), which is made in Rust.

The generated binaries are extremely small.

Maybe @lucasfernog (hello from Brazil, Lucas!) can detail his approach :)

It is even more interesting that Tauri uses NodeJS inside. Imagine Tauri using Deno!

In their words:

Tauri is a polyglot system for building apps. It uses NodeJS to scaffold an HTML/CSS/JS rendering Window as a User Interface that is bootstrapped and managed by Rust. The final product is a monolithic binary that can be distributed as common file-types for Windows (exe/msi), Linux (deb/appimage) and Macintosh (app/dmg).

I've now found that using Deno is on their roadmap. See the end of the page.

Tauri's CLI is written in Node.js but we would love to get it running in Deno ASAP, since we're security-driven and Deno is a great match!

We worked a little bit on building applications that have the Node.js binary + app bundled, using tools like pkg. It would be pretty cool if Deno had that kind of support, and we could secure the source code using features like include_str or include_bytes.

@lucasfernog certainly reach out to @ry (ry@tinyclouds.com), as something like Tauri is well aligned with Deno's goals, and having parts of Deno consumable as crates is a big objective for Deno.

The problem I'm facing with Warp is that it's not packing in the bundled js file. It's packing in deno, which is great, but without including the one other file needed, so I may as well just be distributing the deno binary 🙄

This directory, with this cmd file:
[screenshot]

Packed with this command

warp-packer --arch windows-x64 --input_dir .\warp\ --exec warp-launcher.cmd --output ./dist/echo.exe

Which results in
[screenshot]

However, when I put the bundled js file beside echo.exe it works

[screenshot]

But I don't want that, I want the js file included in the exe file. Kind of the whole point :|

@Wallacy I was wondering if I might request a look at my comment here ( #986 (comment) ); maybe you can tell me what I'm doing wrong? Or is that the expected result? I find it weird that Warp will only pack in the deno binary but nothing else, and I'm kind of at my wits' end with this one :|

@RedactedProfile it's not a Deno or Warp problem. It's just your launch script.

Warp decompresses your input_dir and runs from there.

Check the examples on: https://github.com/dgiagio/warp/blob/master/README.md#windows

@Wallacy hi thanks for responding!

That's kind of what I mean though, I figured warp just took an input folder, packed whatever was in there into the final executable, and ran your script

However the content of the folder beyond just the deno binary isn't getting packed in there

I have to place the bundled js file alongside the final binary warp creates in order for it to work. If I don't, the execution fails to find the js file.

So I figure I'm doing something wrong here, but after a couple of days of fiddling around I'm only ever getting the same result, and I'm not sure what else to do

I ping you because you seem to be the only person who has got it to work so far, so I apologize if I'm being a little direct here

@RedactedProfile those questions should be made on Warp github issues, not here.

Anyway, as I said before, warp is packing everything; your script is just wrong...

You can look at the files in: %LOCALAPPDATA%\warp\

Your script is:
CALL deno.exe run --allow-net --allow-env .\echo.bundle.js

The problem is .\: you are telling the script to run a script relative to the calling path. That's wrong, and the error message is clear about that.

It should be something like this:
CALL %~dp0\deno.exe run --allow-net --allow-env %~dp0\echo.bundle.js

I'm not good at windows scripts, but it should be something like that. Ask on https://github.com/dgiagio/warp to get better support on that.

@Wallacy

those questions should be made on Warp github issues, not here.

Trust me, I understand what you mean, but Warp has had open issues submitted before with similar concerns (that it's not including everything from the input folder) with no resolution years later: dgiagio/warp#19. I'm posting here only because you got it working for something it's not originally designed for, in a thread about this very subject, so I don't think it's entirely unreasonable to be posting here. I'm simply asking how you did it, because I'm trying to do something very similar, and asking if you could help me do the very thing you already did, with the benefit of helping others who find their way to this thread. I'd have DM'd you if I didn't think it would be helpful to discuss this in public 🤷‍♂️ The activity on Warp's issues and repo also inspired very little confidence that they care one way or another.

Thank you for helping anyway though; most would have just deflected and left it at that.

Sadly, with what was provided, it no longer knows how to find the runtime:

[screenshot]

Getting rid of the %~dp0 for the runtime allows it to be found, but now:

[screenshot]

This is the script that ended up working. It looks dishearteningly similar to a previous attempt from several days ago, which is frustrating, but it's simply:

CALL deno.exe run --allow-net --allow-env %~dp0.\echo.bundle.js

The major difference between this version and another I was using was that the paths were stored in variables.

Thank you for taking the time to show me that the %~dp0 was indeed the key. I got frustrated because it wasn't finding the file before, when I followed the examples provided by Warp themselves and a few other websites and articles.

I'm just really glad to see it working 👍 Thank you, and I sincerely hope this helps others out too

(Long thread, so I haven't gone through it in full.) I too would like to be able to distribute self-contained binaries. I presume binary size isn't a concern, at least initially, until we can do better. Given that, the following scheme could work, I think:

  • The end of the deno binary can be used to store javascript/typescript code to load on startup.
  • Reserve the last 20 bytes of the file.
  • On startup, deno reads the last 20 bytes of itself as a string.
  • The string is expected to be in the format <CR><LF>// js 000000000000 .. where the number after "// js" is a decimal giving the last N bytes to read from the deno executable that are a script.
  • Deno then reads this part and runs it like a normal js as though deno run packed.js was executed.
  • The last 20 bytes will look like a JS comment line, so there won't be a problem even if the entire section is loaded.
  • "// js" can be used to indicate js code and "// ts" to indicate typescript code.

I've tried this scheme on windows, linux and macosx in my old scheme dialect interpreter and it works. (ref: https://github.com/srikumarks/muSE/wiki/StandAloneExecutables)

If we put aside this seeming like a hack, would it suffice as an interim solution to the "self contained executable" problem?

edit: Based on @rivy's suggestion - // jz and // tz can be used to indicate compressed javascript and typescript sources using gzip. If Brotli is also intended to be supported (not sure whether that's useful), this can be extended to // jb and // tb.
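A minimal Rust sketch of the reading side of this scheme, assuming exactly the 20-byte trailer format described above (untested against any real deno binary):

// Sketch: read the reserved 20-byte trailer from our own executable; if it
// matches "// js <decimal>", read back the last N bytes as the embedded
// script. The trailer itself parses as a JS comment, so it can stay in.
use std::env;
use std::fs::File;
use std::io::{Read, Seek, SeekFrom};

fn embedded_script() -> std::io::Result<Option<String>> {
    let mut f = File::open(env::current_exe()?)?;
    let len = f.metadata()?.len();
    let mut trailer = [0u8; 20];
    f.seek(SeekFrom::Start(len - 20))?; // e.g. "\r\n// js 000000000042"
    f.read_exact(&mut trailer)?;
    let trailer = String::from_utf8_lossy(&trailer).into_owned();
    let tail = trailer.trim_start();
    if !tail.starts_with("// js ") {
        return Ok(None); // no payload appended: behave like plain deno
    }
    let n: u64 = tail["// js ".len()..].trim().parse().unwrap_or(0);
    let mut script = String::new();
    f.seek(SeekFrom::Start(len - n))?;
    (&mut f).take(n).read_to_string(&mut script)?;
    Ok(Some(script)) // hand this to the JS engine as if `deno run packed.js`
}

fn main() -> std::io::Result<()> {
    if let Some(script) = embedded_script()? {
        println!("would run {} bytes of embedded JS", script.len());
    }
    Ok(())
}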

I think @srikumarks' solution would work fine, and if the binary used was a stripped-down version with only the bare necessities (no linting etc.) the size of the compiled binary could be reduced. Another potential optimization for final executable size and speed would be storing the V8 heap snapshot instead of the raw source code (although I have no idea if the snapshot actually is smaller) and, if possible without too big a performance impact, compressing it (or the source, if snapshots are not worth it) using something like gzip or brotli.

rivy commented

Additionally, if you're doing compression, you could use a character tag to represent compressed vs non-compressed text of the included script (eg, <CR><LF>// js+000000000000 vs <CR><LF>// js-000000000000). That would likely save a great deal of space.
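As a rough illustration of the compression side, gzip-packing the script payload (what a "// js+" or "// jz" tag would indicate) is only a few lines with the flate2 crate; this is a sketch, not tied to any actual deno code:

// Sketch: gzip the script before appending it; the startup path mirrors
// this with a decoder once the tag says the payload is compressed.
use flate2::{read::GzDecoder, write::GzEncoder, Compression};
use std::io::{Read, Write};

fn compress(script: &[u8]) -> std::io::Result<Vec<u8>> {
    let mut enc = GzEncoder::new(Vec::new(), Compression::default());
    enc.write_all(script)?;
    enc.finish()
}

fn decompress(payload: &[u8]) -> std::io::Result<String> {
    let mut out = String::new();
    GzDecoder::new(payload).read_to_string(&mut out)?;
    Ok(out)
}

fn main() -> std::io::Result<()> {
    let js = b"console.log('hello from the embedded script');";
    let packed = compress(js)?;
    assert_eq!(decompress(&packed)?.as_bytes(), &js[..]);
    Ok(())
}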

The way that has been proposed above is pretty simple, but I'd like to suggest a more robust approach:

__denoOptions = {
  compression: {
    type: "bz2",
    options: {
        // ...
    }
  },
  executable: {
    type: "typescript",
    checksum: "...",
    // ...
  },
};
// DENOEXE 0000000000000000 0000000000000000 

The first set of 0s would be a hex offset from the end of the file indicating where __denoOptions is defined, and the second set indicating where the script/V8 blob starts. Alternatively you could base64 encode the __denoOptions object as a trailing comment.
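For what it's worth, here is a sketch of how the reading side of that trailer could look; everything here (names and the exact format) is hypothetical and just mirrors the proposal above:

// Sketch: parse the trailing "// DENOEXE <opts> <blob>" line, where both
// fields are hex offsets counted back from the end of the file.
use std::fs;

fn parse_trailer(path: &str) -> Option<(u64, u64)> {
    let bytes = fs::read(path).ok()?;
    let text = String::from_utf8_lossy(&bytes);
    let last = text.lines().last()?;
    let mut parts = last.trim().strip_prefix("// DENOEXE ")?.split_whitespace();
    let options_offset = u64::from_str_radix(parts.next()?, 16).ok()?;
    let payload_offset = u64::from_str_radix(parts.next()?, 16).ok()?;
    Some((options_offset, payload_offset)) // both relative to end-of-file
}

fn main() {
    // Hypothetical packed binary name.
    if let Some((opts, payload)) = parse_trailer("packed_app") {
        println!("__denoOptions at end-{:#x}, payload at end-{:#x}", opts, payload);
    }
}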

Any progress on when this may be available? I think it could be VERY useful for distribution of projects. Say for example you have a client who purchases software from you, but does not purchase the source, you would generally deliver them a binary. It would be very nice to have this option with deno, especially for this use case. I saw it got cut from 1.0, and was curious if it's still on the roadmap somewhere.

Personally, this would be the defining feature that would shift most of my work to Deno (as opposed to Node/Go). If I weren't a complete n00b with Rust, I'd consider working on this. I'm still considering a crash course in Rust just to help on this feature.

I hope @ry, @piscisaureus, and everyone else recognize the impact this feature would have on the community. This aligns with the idea of Deno being the runtime for the 2020's (while Node was the runtime for the 2010's), primarily because people are doing a lot more with JS now than just writing servers. The ability to generate cross-platform executables has significant momentum, as illustrated by the popularity of Electron and the Go ecosystem. It's amazing how many people use Electron just to make an executable version of a Node app. I have a small series of components I wrote years ago, called node-windows, node-mac, and node-linux. The projects wrap Node so they operate as executables/daemons on servers. While the star count on the repos is fairly low, there are over 900K processes running on these tools (that I'm aware of, based on a portion of download counts). Popular tools like forever, pm2, nexe, pkg, and others show just how many people are interested in the concept of isolated executable distribution/operation.

Given I'm sure everyone's personal time is limited, is there anything the community can do to help accelerate this? Are there dependencies that need help being fleshed out to make this practical/possible?

I think @srikumarks' solution would work fine, and if the binary used was a stripped-down version with only the bare necessities (no linting etc.) the size of the compiled binary could be reduced.

I'd just like to add my voice: if the Deno binary is included in the implementation of this feature, it should be included sans all of the extras that make it good for development but don't matter once the ts code has been compiled to js and is already linted/tested etc.

My main use case is that I'd like to be able to generate CLI tools (preferably <10 MB if possible) with closed-source code in them. Given the Deno CLI crate is <2.5 MB, I suspect this is at least in the realm of the possible? My second use case revolves around an apparent growing push away from resource-hungry Electron apps (see relative newcomer frameworks/toolchains such as Tauri and Neutralino for reference). I think Deno could be a strong part of making this push a reality if it can create lightweight compiled binaries that work with these kinds of tools. Not all HTML/CSS/JS apps need to talk to a cloud-based server, after all (nth millionth todo app, I'm looking at you!).

My justification for targeting a smaller size is that if you start to have multiple 50 MB+ compiled Deno binaries + apps, it adds up fast and becomes quite redundant, a waste of resources on limited servers/containers (at that point I'd rather just link to a separate deno binary, and then I may as well just stick with the status quo).

No pressure. Can't contribute as I don't know rust (and can't make the time), so no expectations. Just adding input in case it helps drive development direction. Thanks for the work!

Personally, this would be the defining feature that would shift most of my work to Deno.

I also want to note that (at the time of this writing) this is the single most wanted/anticipated feature right now. The evidence for this is that, at the moment, this issue has the most comments and the most (non-negative) reaction emojis of all the open issues in this entire repo!

P.S.: I also tried to get deno buy-in from my team, and the lack of this feature was the single deal breaker.

Unrelated: Microsoft announced that the single-file executable feature is coming to .NET 5 (released next week). So this is something that others outside the project are also starting to see great benefit in adding.

The core team pays attention to the reactions on issues. We also talk about this one semi-regularly.

I think the community has to respect that there are certain issues that are a lot more complex than they appear, and while it might seem straight forward, this particular one hasn't been. It certainly isn't out of spite that the feature hasn't been implemented.

Anyone would be foolish to think this is a simple issue or that it is being ignored. Mad respect for the work being done!

Is there any way of breaking down the complexity and enumerating the steps involved so that the community could lend a hand?

We've discussed this issue extensively over the last week. Grepping through the comments on this issue, there are two distinct needs raised by the community.

a) "deno compile" to provide self-contained binaries that could be cross-compiled for other systems, similar to what nexe does. The problem with this approach is that we'd need to edit already-compiled binaries in place, which mandates changes to the binary itself (to support loading code from within the binary). In theory we could do that in the current binary, however the produced binaries would be heavily bloated (because of the inclusion of tools like the formatter, linter, test runner, TS support, etc), resulting in 45 MB+ binaries even for the simplest scripts. We've explored the idea of producing "lite" binaries that are stripped of all the tooling (#8381), which reduced the binary size to about 30 MB (or 18 MB if we were to strip all debug info). I realize this is the preferred solution for the community, but it comes with some drawbacks:

  • not configurable - users would be able to include a single bundled script and have no way to load additional code during runtime

  • hard time notarizing built binaries (nexe/nexe#446)

  • there are a lot of open questions about deno features in the context of self-contained binaries (eg. should permissions still be checked, what about lock files and other flags deno supports)

This approach is quite complex to implement and there's no clear path forward at the moment.

b) provide a "scaffold" for deno-based binaries. This approach would essentially create a Rust project with some basic setup that allows leveraging Deno's runtime. There would be no additional tooling included (unless the user opts in), and the user would have total control of what goes on in the binary. In essence this approach means that Deno would be available as a library crate (already requested several times: #2633, #7928). Having total control over compilation means that much more complex projects could be conceived (eg. an Electron clone); however, it would require the user to set up a full Rust toolchain. There's also no option to cross-compile. Users would have to set up a full CI workflow similar to Deno's.

The second approach would be much easier and faster to deliver (I expect the refactors needed to provide the CLI as a library crate would take about a week), while at the same time giving users more powerful tools to start working with. The first approach would heavily benefit from the work required for the second approach anyway, so in my opinion the core team should focus on delivering a Rust library that users can start building projects on. Then we could collaborate with the community on making the first approach feasible.

I'm happy to hear your opinions on the matter.

I'm a little confused by the conclusion on (a). I, and I imagine others, don't care about tooling or how it gets done initially. Even file size is a secondary concern next to the primary concern of the deployment format.

All I care about on the first pass is something that can be distributed as a single, self-contained binary.

REDACTED - I was a tired idiot who misunderstood which user was being referenced.

A few thoughts:

  • Only getting it down to 18mb surprises me. If possible, it would be good if it was even lower somehow (minor secondary issue)
  • I don’t see the inability to let users add code as a disadvantage. I hear secure and easy to support. If they want more, they can compile it themselves (first project that comes to my mind that has this is Caddy written in Go, but there are heaps of examples where this is usual fare).
  • I think more complex electron like projects will be made using compiled binaries from A (see my previous comment)
  • I go as far as saying ditch everything you can from the needed deno-core / deno-lite binary that you can (whatever it is called). If you’re compiling software, there is no way to verify what it is doing if the run permission is enabled anyway and I expect it’ll be needed a lot for projects where you are going to compile the end project. Just provide the bare minimum needed to run the transpiled and bundled TS.

ALSO REDACTED - Again, a tired idiot who used tired words to explain something that should have waited. Please see my more awake response down below

aral commented

As someone using Nexe today (on Site.js) and keeping an eye on Deno as a possible, lighter alternative to Node.js in the future, it sounds like (b) first is a good idea. As an outsider, it feels more in line with Deno's approach in general - which (correct me if I'm wrong) is not to try and recreate Node but to take a first-principles approach to tackling the same problem again from scratch. It feels like (b) would allow that sort of experimentation - and could lead to (a) - whereas getting stuck on (a)-or-bust from the outset could be a dead end.

Personally, I'd be happy to start playing with (b) if it were available.

@binaryben I think you are misunderstanding what the ultimate goal for Deno is here. Deno will always use V8 to run your JS code. There will be no compiling from JS/TS to "native code" (yes, I am simplifying here, disregard V8 snapshots). Doing anything else is far outside of the scope for Deno.

With any approach we take, we will still be running JS code through V8. So the question that is open currently, is how we create a binary that contains V8, the JS code to run, and all of Deno's runtime code. Either approach (a or b) would build the binary on the developers machine. The ultimate binary shipped to the user would be very similar - and likely also similar in size.

yes, I am simplifying here, disregard V8 snapshots

Snapshots aren't "native code" anyways, they are just serialization of the memory of the JavaScript isolate. 😁

I like B. It's a reasonable route towards A, the undeniable objective of this thread, and it engages more of the community in achieving it due to the shared objectives. Starting with B gets us closer to A without creating unnecessary technical debt, bloat, and wasted effort. Deferred-gratification theory and game theory both apply to make B the logical first milestone choice.

In many of my use cases, I'm building CLI utilities, so even 18 MB is pretty massive compared to the Go executables I can create (which are as small as 1-3 MB), but still better than most pkg/nexe output. That said, the ease of building with JS can offset this, meaning I wouldn't personally get too hung up on an 18 MB binary. It would still be a very nice improvement over the Electron footprint and other options. Plus, there are tools like upx that can help with this. At the end of the day, I think having something would be a big step forward.

Ultimately, I think the majority of users are looking for a way to run a command to produce a binary, avoiding much of the build process. Of course, others want control. It seems like approach B provides more flexibility and control over the output, and is therefore a logical choice en route to outcome A.

The executable file produced should only contain V8 and deno-core.

TypeScript code will be compiled into JavaScript or V8 bytecode, so swc/typescript/linter/formatter/repl can be dropped.

This should reduce the size of the executable file.

@binaryben I think you are misunderstanding what the ultimate goal for Deno is here. Deno will always use V8 to run your JS code. There will be no compiling from JS/TS to "native code" (yes, I am simplifying here, disregard V8 snapshots). Doing anything else is far outside of the scope for Deno.

Please forgive my late-night ramblings. I was very, very tired and probably should have saved replying until today. You're right, and I know that using V8 is the end goal. The word I should have used was transpiled, not compiled or native. Specifically, my belief is that Deno should be able to produce a binary that is just (what has been called so far) [deno-lite] + [the developer's TS code transpiled to JS], where [deno-lite] is V8 plus the bare minimum to get the developer's transpiled/bundled Deno code running on it (EDIT: essentially what @axetroy said above while I was typing this).

I think V8 is something like 10 MB, so I made the assumption that 8 MB is needed for transpiling TS → JS at runtime, among other things that arguably aren't needed once development is done and binaries are shipped. But hey, that assumption is making me look like an ass! 🙃 I can see in #8381 that TS support can apparently already be removed.

In my mind, the ideal solution is to have deno-run ⊂ deno-server ⊂ deno, or something to that effect (deno-run is essentially what has been called deno-lite so far, but this name seems more fitting). I fully expect the final solution to these desired features will be a hybrid of options A & B, as already alluded to in this reply. But the priority should be on A first, as I suspect most of Deno's target audience doesn't want to install a Rust toolchain. Option A would be the killer feature for me personally (and it seems for a large number of others), and it would push me to switch to using Deno over Node for many projects. Here are some user stories that explain how it should work in my mind:

  • As a developer, I install deno and use the helpful additional functions such as doc generation, formatting, linting, bundling, file watching, testing, etc to develop my program. Anything that isn't relevant once a program has been deployed is included in this Deno release. It would also have the requested feature here of deno compile. The compile subcommand should NOT need a Rust toolchain installed to be functional though as it would defeat one of the benefits of developing with Deno, that is only needing one self contained binary to get started developing. If possible, it should be that deno compile can create a self contained binary for different platforms by using a precompiled deno-run binary for the target architecture. These deno-run binaries don't need to be included in the full deno binary. In fact, it should be possible to use a flag to point to the lightweight Deno binary to include in the final binary (as a bonus benefit, allowing the developer the flexibility to use a customised binary based on the result of implementing option B). Otherwise deno could download an appropriate deno-run binary from Denoland and cache it for future use by default. The final 'compiled' binary is just a wrapper around an appropriately tested Deno bundle and deno-run

  • As an administrator, I can download the source code of a Deno program, inspect it as needed and run it on a server using deno-server (or use CI tools to automate this). This Deno release still has the ability to transpile TS to JS, download modules from URLs, full logging capabilities, lockfiles, security flags (where an administrator is more likely to understand the implications and it will have greater value), etc. An administrator can also include plugins and extend the original developers code using this option. Ultimately, I would expect that as an administrator, I would compile the developers Deno code and any desired plugins for use in containerised environments, but that's up to me. If it makes sense to just install one deno-server binary and run a bunch of code from that, I have the option. Transpiling from TS to JS at startup in these scenarios is also unlikely to be a major issue as the benefit of seeing the source as it was written will outweigh any issues of slow downs on startup and I can otherwise compile it myself if it is a problem.

  • As an end user, I can download a self contained, lightweight binary with no awareness of what code is in it other than what is labeled on the box (so to speak). I run it knowing it can do anything any other executable file can do (no need for permission flags). All this binary contains is the developers bundled source code and the linked deno-run binary. If it needs to log anything beyond fatal errors, the developer will have written that code in themselves. Because the transpiling of TS was done on the developers machine and every extra feature of Deno that is no longer needed is removed, it is as fast and lightweight as possible.

I'm curious about option B. I can now see the benefit in being able to build more complicated software (i.e., an Electron alternative as suggested), especially if deno compile can link in a custom Deno binary that is likely to be based on the result of developing option B. But to get started, it seems like the long way round to realising the requirements for the above personas when there is already a proof of concept (#8381) for making Deno binaries that are subsets of the full Deno package. Fleshing out that POC and releasing three flavours, and then adding a compile command that creates a wrapper around the lite/run flavour seems like the most direct and flexible approach.

I am not a Rust developer. Is there a large amount of technical debt introduced in the linked POC? Because otherwise it looks like a promising solution that supports this proposed end goal.

@bartlomieju - I think the final solution for most TypeScript/JavaScript developers is something nearer to A: "deno compile" or some set of deno-like tools which produce the final self-contained executable.
If going through option B is a better way to achieve A, let's go. But having to deal with a Rust toolchain to produce a self-contained executable is not a good final solution for most of Deno's audience, I think.

Pardon my ignorance, but what would "deal with Rust toolchain" mean for deno users? If we could encapsulate it all in a "deno compile" command (setting up the whole toolchain and firing the bundling/compilation process), I guess it would be fine as a first working solution. The size of the executable could be improved in later iterations as we approach the A solution.
On the other hand, I think it's great to have the community involved, but self-contained executable generation should be part of deno's official tooling.

Pardon my ignorance, but what would "deal with Rust toolchain" mean for deno users?

@bartlomieju said about option B:
It would require the user to set up a full Rust toolchain. There's also no option to cross-compile. Users would have to set up a full CI workflow similar to Deno's.

It means it's necessary to set up Rust tooling for a TypeScript/JavaScript developer to produce a self-contained executable. And this user would have to know a little about how Rust compilation, building, linking, etc. work, and what some Rust errors mean.
It's fine as an intermediate step, as I understand option B, but it isn't the final goal of this issue for me as a TypeScript node/deno user.

I don't like the idea of the "compile" command.

I think it would be better to do a "pack" command which creates a ZIP file with all the dependencies and a boot file included, so that we can do a simple "deno run app.zip" - and, if it is a library, an import from "lib.zip".

I'm afraid that with the "compile" command we would end up with thousands of applications each weighing 16 MB (minimal deno) + ?? MB of waste (deno is just a binary and easy to install). With my idea we would only have one deno + many zips ;)

Sorry for my bad English!!!

I think everyone agrees that option-A is ideal but ...

I would like to point out that many/most of us are devs that are intimately familiar with the node.js ecosystem. However, many of us that are concerned about option-B may have forgotten that node-gyp (which you will probably run into at some point or another even if you don't write native node.js modules yourself) also requires prior installation of a foreign build toolchain (in this case, a cpp toolchain possibly sprinkled with some python on top).

So, in my view, we should not look at the option-B installation of a rust toolchain onto our dev/CI environments as something that will be completely foreign to us, especially if we all understand that we are ultimately going to strive for an experience that will, in time, be or approach something close to option-A.

I like the idea above of seeking out ways to have deno compile automatically install the rust toolchain for you if you don't already have it (or otherwise "help" you get the toolchain in some other less invasive manner).

I think it would be better to do a "pack" command which creates a ZIP file with all the dependencies and a boot file included, so that we can do a simple "deno run app.zip" - and, if it is a library, an import from "lib.zip".

Not user-friendly.
In that case, I would be doing the same thing I do today using Node, which is creating WinRAR self-extracting packages...

I think it would be better to do a "pack" command which creates a ZIP file with all the dependencies and a boot file included, so that we can do a simple "deno run app.zip" - and, if it is a library, an import from "lib.zip".

This basically exists already. Use deno bundle if that's your preference @IllusionPerdu. You'd still benefit from the proposed deno-run binary.


I appreciate what you are saying @somombo, but Deno is trying to learn from past experience and lessons to do better. Requiring Rust means we may as well stick with Node for now and use existing, well-known options.

@binaryben Requiring Rust means we may as well stick with Node for now and use existing, well-known options.

Fair enough, I think some people will feel that way... but others will not. I suspect the latter will instead feel like, "for now, something is better than nothing".

And hopefully as the tool approaches option-A the former will also come back to making full use of Deno.

This basically exists already. Use deno bundle if that's your preference @IllusionPerdu. You'd still benefit from the proposed deno-run binary.

In the zip file you could include some resources other than just the js/ts - I think of it as a sort of virtual drive.
The benefit is that it saves 16 MB per app ;)

I personally fail to see what the issue with option B is at all.

I am making an assumption that deno compile, or what have you, will create a Rust project, grab all the required crates, and finally compile the Deno libs and my JS/TS app into a single executable with the resources (that I specify) bundled. And since the user's code is (at least for the users that are not interested in getting under the hood, i.e. that seek a PKG-like experience) only going to be TS/JS, it's highly unlikely, once this whole thing is stable of course, that the user will ever need to deal with any rustc errors.

Setting up the Rust toolchain is pretty straightforward (see: https://rustup.rs/), and if Deno handles everything else, it's as good as a turnkey solution. The target audience is developers, after all.

The initial implementation of this feature will be released in Deno 1.6. For those interested, this is implemented using option A. Option B is still being worked on and will also be available at some point.

So if I understand correctly (according to #986 (comment)), you are taking the hardest path first, which I think will answer most users' needs. Am I correct or have I misread it?

We have taken the path that takes the least effort :-)

If you want to discuss, join the Discord: https://discord.gg/deno

Wow, this came as a surprise! Thank you very much @lucacasonato

You did it!!! 1.6.0 is now released including this feature.

deno compile --unstable https://deno.land/std@0.79.0/examples/cat.ts will make you an executable version of the module.

  • Startup time is on pair with running via deno!

  • The size of the 47 MB output file is an area of improvement but we knew that already from this thread.

I'm very very excited.

Now this is a celebration! I am really excited to try this :)

Nice feature! I tried deno compile --unstable https://deno.land/std@0.79.0/examples/cat.ts, and it produced a 31.4 MB cat.exe. Looking forward to a smaller size in the future.