If there's any demand at all for doing what the feature does, then devs will tend to write polyfills to make the feature "work" (for some value of "work") in the other engines, so that their code can still be "portable" (i.e. at least not crash) despite calling the feature. Also, eventually, the other engines might introduce their own versions of the feature, with their own APIs, for the polyfill to fill over; and then even more eventually, those will get standardized.
If there's a huge demand for the feature (think: XMLHttpRequest), then other engines will rush to just clone the feature's API as-is, so that their engine won't be held back by the lack of it, or by the bad performance of a polyfill solution.
But if there's next-to-no demand for doing what the feature does, such that you can predict that nobody will even bother to write a polyfill, let alone port the feature, and so the feature will languish in that one engine, then yes, people will intentionally avoid using that feature at that point. See e.g. most of the Microsoft ActiveX-based stuff you could do in Chakra (IE's long-dead JavaScript engine).
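The polyfill pattern itself is simple enough. A minimal sketch, using `structuredClone` purely as an example of a late-arriving API:

    // Feature-detect, then fill the gap. The JSON round-trip fallback is
    // deliberately naive: it only handles JSON-serializable values, i.e.
    // it "works" for some value of "work".
    if (typeof globalThis.structuredClone !== "function") {
      (globalThis as any).structuredClone = (value: unknown) =>
        JSON.parse(JSON.stringify(value));
    }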
Even if there's no technical trade-off, it does seem that most of their engineering efforts will be focused on backwards compatibility with Node, instead of the new vision that they were promising.
Or a new frontend package is built for React by default rather than vanilla JS (or a very light set of JS dependencies), even if it easily could have been.
And my experience is that React has been corrosive to the vanilla JS ecosystem as a result.
It's a bit of the reverse of the node -> deno situation, but I could easily see Deno being stunted, or affected in some unknown and possibly not-great way, by this.
All that said, Deno has had to pivot because node seems to have gotten its act together to some extent. At least compared to when Deno set out.
So as a practical user-acquisition need (not a want, a need), Deno had no choice but to do this. And the decision to go down this path was made quite a while back, iirc.
I remain hopeful for Deno and the team behind it. Frontend tech is very competitive.
The thrashing across the frontend stack that backend folks recoil from has made things stronger and there’s a lot of talent at work.
It doesn't mean the application at its core will be more Node-based, so much as that people want to be able to use a given npm module.
Unlike with browser-embedded JS engines, where "what native APIs are accessible from JS" is a relevant question†, in backend JS runtimes the answer to that question is always "all of them, because they're all just C FFI libraries, and these runtimes use the same C ABI, allowing interoperation with any such library." So for backend runtimes, it's only things to do with the JS engine itself — like language features — where the differences really matter.
† And actually, browser JS engines could have gone this way as well, if NPAPI/PPAPI <object> had caught on. The "Java" in "JavaScript" was supposed to refer to using Java applets as ActiveX-like COM servers which would expose APIs to be scripted against! In other words: a common standard for native FFI, just with sandboxing.
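To make the C-ABI point concrete, a minimal sketch using Deno's FFI API; the library path assumes a typical Linux system, and older Deno versions also need `--unstable`:

    // ffi_cos.ts -- any C-ABI library is directly callable from the runtime.
    // "libm.so.6" assumes a typical Linux system; the path differs per OS.
    const libm = Deno.dlopen("libm.so.6", {
      cos: { parameters: ["f64"], result: "f64" },
    } as const);

    console.log(libm.symbols.cos(0)); // 1
    libm.close();

    // deno run --allow-ffi ffi_cos.ts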
Deno does have some benefits, such as its security model and speed, but those are not things that couldn't be added to Node. Once Node adds the security model, why use Deno?
1. Native support for TypeScript.
2. Better built-in APIs.
3. Security model.
4. Easy package management.
Node.js is closing the gap. E.g. top-level await, fetch() API, Permissions.
I believe Node.js will follow Deno and implement more and more features. Sort of an npm/yarn situation.
It's cool to see what Deno does, but I don't think you will be left out by sticking with Node.js.
If anything, deno's package management is definitely not easy.
And no, "just throw your deps into deps.ts and re-export each manually" isn't easy either.
About the security model: can someone explain why it's a good idea to have isolation at the interpreter level rather than relying on OS features (or VMs)? I cannot see why a security team would want an isolation system that's unique to one specific platform.
Those technologies are platform-specific and difficult to use.
For example, provide the command line for running a Node.js program in Linux without file system access.
In Deno, it's:

    deno run main.ts

Or, to enable fs access:

    deno run --allow-read --allow-write main.ts
That said, I agree that I'd rather rely on containers, VMs, etc.
    docker run -it --rm -v "$PWD":/app -w /app node:18 node main.js
It's a tad longer but it's more flexible too ;)
But I do see your point :)
Forgive me if I don't believe that running a full OS on top of a host OS, just to run a single node command, amounts to anything less than running a VM.
You can think of Docker containers as a way to package an application's dependencies, ending at the (Linux) kernel API/ABI; the container shares the host's kernel rather than running its own.
As in: "a completely different node js binary in its own environment, its own networking, its own file system overlay etc."
Or the millions of daily passengers in transport mechanisms powered by high-integrity operating systems.
Zero POSIX.
And almost all systems on AWS are running on Linux/KVM. Even quite a few that report as Xen are actually running on KVM; Amazon added a bunch of code to KVM to lie in the guest-visible hypervisor CPUID leaves, as well as emulate Xen's hypercall interface.
Which happens to be a special Windows build, Azure Host OS.
https://techcommunity.microsoft.com/t5/windows-os-platform-b...
Additionally, bare bones Linux kernel infrastructure for Xen and KVM support isn't UNIX.
"Transcending POSIX: The End of an Era?"
https://www.usenix.org/publications/loginonline/transcending...
> Additionally, bare bones Linux kernel infrastructure for Xen and KVM support isn't UNIX.
> "Transcending POSIX: The End of an Era?"
Xen and KVM aren't equivalent here.
KVM is ultimately an interface involving file descriptors, mmap(2), read(2), write(2), etc. IMO, while not being UNIX™, it still very much embodies the UNIX spirit.
Deno supports TypeScript out of the box. This alone greatly simplifies any deployment, and even if Deno didn't offer anything else, it would already take the lead in developer experience.
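For example (a trivial sketch), a plain .ts file with type annotations runs as-is, with no tsc step and no build config:

    // greet.ts -- run directly with `deno run greet.ts`.
    function greet(name: string): string {
      return `Hello, ${name}!`;
    }

    console.log(greet("Deno"));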
People will make sure to distribute their library so that it works on all of them. They might even use `deno bundle` to bundle the library into a single file that can be consumed from any of the runtimes.
Long shot though.
I expect deno to convert all of their features to node modules and simply be an opinionated stack, similar to create-react-app vs. next.js. The path they're entertaining now isn't really tenable.
To me, this is like complaining that you can mix JS into TypeScript projects. The only reason the latter is successful is because you can migrate from the former in a first-class way.
I don't think Deno has the community support to make it under its own steam - that said, I think it's a huge value-add in terms of DX. I support them on this 100%.
Node is trying to add in some capabilities of Deno. We'll see which one comes out ahead.
Just my opinion, but I'd rather that not be the case.
deno does not have any inherent advantages over node if they continue to support node functionality. I won't be surprised if package.json support is next. Literally the purpose and sole advantage of deno was that it did not have node cruft. Things like speed are not inherent advantages: it's true deno is written in Rust, but node is basically V8, which has its own advantages.
Could you be more specific about how this harms deno? Is it just "cruft"?
Haven't tested it, but one of the modules I had a lot of trouble importing back when MS-SQL was a major database I interacted with was `mssql`, backed by `tedious`. I'm at least able to do `import mssql from 'npm:mssql';` now. I don't have many other good/complex imports that weren't working before or were lacking suitable Deno alternatives. Back when I was trying to use the likes of esm.sh and similar, it used to blow up spectacularly.
No it isn't.
> deno was able to ignore all of that
So maintaining a Node.js stdlib (or subset thereof) is more work for the Deno team. I agree, but that seems a weak argument for harming a Deno user.
Sure it is. How could deno maintain its security model and implement all of the node APIs? Part of why it works is that there's a standard lib and there's no way to obscure usage and thus circumvent the permissions. Unless your answer is that --allow-run is the solution. Even with the OP post, they don't even support child_process with Deno Deploy, which is why; clearly they can't support everything and still maintain security, which is the point.
Consider just the C++ embedder APIs. How exactly would deno deal with those with respect to permissions?
> for one the security model is incompatible with node APIs.
All Node.js APIs are polyfilled using Deno APIs, and as such these Node.js APIs are subject to the same permission model as regular Deno programs.
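A quick sketch of what that means in practice (the file path is illustrative):

    // read_demo.ts -- the polyfilled node:fs goes through Deno's permission model.
    import { readFileSync } from "node:fs";

    console.log(readFileSync("./hello.txt", "utf8"));

    // deno run read_demo.ts               -> PermissionDenied (read access)
    // deno run --allow-read read_demo.ts  -> prints the file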
> however if deno were to support all npm packages, it would necessarily basically have to implement the entire node api, including support for pre async/await (aka callbacks)
We're not aiming for 100% compatibility, though we've managed to polyfill a hefty number of built-in Node modules, including callback-based ones.
> consider just the c++ embedder apis. how exactly would deno deal with that with respect to permissions?
This is the same situation as with Deno's FFI API. You need to pass the `--allow-ffi` flag to enable usage of `Deno.dlopen` (or `require("./foo.node")` for N-API extensions). That said, when you load dynamic libraries, all bets are off: we can't control what the dynamic library can do, and you effectively open up your whole system. Still, you can't load such libraries unless the `--allow-ffi` flag is passed.
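For illustration, the gate behaves roughly like this (the library path and symbol name are hypothetical):

    // load_native.ts -- sketch of the --allow-ffi gate described above.
    const lib = Deno.dlopen("./libfoo.so", {
      do_thing: { parameters: [], result: "void" },
    } as const);

    lib.symbols.do_thing();

    // deno run load_native.ts              -> PermissionDenied (ffi access)
    // deno run --allow-ffi load_native.ts  -> runs, but the library is unsandboxed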
Hope this clears up some things.
Yeah, I'm aware of this. My point was that you couldn't reach 100% compatibility without compromising. FWIW, it's a good thing. Node can be a security nightmare if your team is not disciplined about dependencies.
W.r.t. the node APIs: technically you are, like you say, polyfilling them, and with that approach, implementation differences hopefully still allow you to continue to polyfill.
Deno is very vulnerable to the same fate. Yes, it's easier to get adoption if you can plug into an ecosystem that is already popular, but then it's not your ecosystem and the behemoth that actually owns the ecosystem will feel the threat and adapt.
If you don't prioritize interop, your initial adoption will be much much slower, but if you make it out of the early stages you have an ecosystem of your own that has its own distinct advantages. The incumbent can't just pull in a few good features and thereby take your mind share.
I don’t think it will suffer the same fate as Kotlin because a lot of the value add is in Deploy, Deno KV, etc.
Kotlin is doing fine, but its growth has slowed a lot, and the team is pivoting rapidly to multiplatform to try to forge a niche that isn't just "better Java". That pivot isn't smooth, because most Kotlin code (even the standard library) depends heavily on the JVM and Java libraries.
I didn't mean "fate" to imply that Kotlin as a language is dying, just that their position as "better Java" is no longer tenable and they've had to shift strategy dramatically to try to stay relevant. That's what I'm afraid will happen to Deno as Node catches up.
- Big plus: the built-in stdlib is strong (coming from Node; it's not as strong as Python or even Ruby, but I'll take what I can get). The stdlib is also pretty "good Unix citizen" focused--it's very easy to build an ergonomic command-line application.
- Theoretical plus: the capabilities-based security model is smart and cool.
- In-reality minus: you end up passing most `--allow` flags anyway for anything that isn't trivial. You could scope things down if you are a tryhard, but I am not sufficiently a tryhard, and this really should just be within the capability of Deno to figure out and recommend for you (there's a sketch of scoped flags after this list).
- Moderate plus: `deno compile` being built in is nice, and it is easier than the Node `pkg` library from Vercel.
- Small minus: The set of targets for `deno compile` is limited, though, and (for reasons that are both reasonable and unfortunate) it spits out truckasaurus-sized executables, which is a bummer.
- Enormous, world-stopping minus: dependency management. It's bad, and it's not easy to get around. "Just put all your deps in a TypeScript file and export it" is not the amazing solution that its adherents want you to think it is. The developer experience of upgrades is annoying and unpleasant and `deno-upgrade` helps but makes your project feel out-of-control. (Import maps don't really help, it's just another road into the same unpleasantness and since they don't work if you're writing a library I don't think you should ever use them at all; just stay consistent between applications or libraries.)
- Whiny minus: VScode support is pretty bad. It exists, but it's not great, and there's a weird lag to dependency management that I never figured out.
- Real minus: the Deno VScode stuff freaks out when you use a Node project, even if you set `deno.enabled: false` in your .vscode/settings.json file.
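On the scoping point above: the `--allow` flags do accept narrower grants, per path and per host. A sketch (the path and host are illustrative):

    // fetch_config.ts -- narrowing --allow flags instead of granting everything.
    const config = await Deno.readTextFile("./data/config.json");
    const res = await fetch("https://example.com/");

    console.log(config.length, res.status);

    // Blanket: deno run --allow-read --allow-net fetch_config.ts
    // Scoped:  deno run --allow-read=./data --allow-net=example.com fetch_config.ts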
There are definite, real benefits to some of what Deno does. But I just don't see why I'd continue to use it over Just Continuing To Use Node.
In Deno you can just open your IDE, add some imports to the import map, and you're ready to code.
Direct imports that the runtime caches for you are way easier to deal with than node_modules, most of the time.
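A sketch of that workflow (the map contents and package choice are illustrative):

    // main.ts -- bare specifier resolved through an import map, assuming an
    // import_map.json next to it containing:
    //   { "imports": { "lodash": "npm:lodash@4" } }
    import lodash from "lodash";

    console.log(lodash.chunk([1, 2, 3, 4], 2)); // [[1, 2], [3, 4]]

    // deno run --import-map=import_map.json main.ts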
It's not Deno itself that's adding support for the Node.js built-ins here; it's Deno Deploy. Meaning that this will not work on other hosting platforms, right... or?
Note: Cloudflare Workers is also starting to support a lot of Node APIs as well (opt-in), so bundles will likely be able to run in both of them and others. Can't speak specifically to it, but Bun will probably also largely work.
I'm really excited for when they also get state (Deno KV).
But if you already have a huge legacy node project, why would you switch to deno? Or are we all going to pretend to forget why Deno was even created?
Only the people who like having to rewrite everything with every breaking change that happens on cutting-edge platforms. It's not all that fun for most people. I'll wait until "new shiny" isn't as new and shiny, when it finally becomes stable and dependable. Until then, the current stable platform is what I'll continue building on, because it still does everything I need it to do.