Deno: is Node.js's time running out?







About 18 months have passed since Deno first appeared: a preview release is out, several articles have appeared on Habr, and Ryan travels to conferences talking about it. Yet I have never seen a thoughtful analysis of the project anywhere; for some reason everyone limits themselves to retelling the documentation...







Well, let's try to do that now. For the last 5 years I have been writing in Node.js, and OneTwoTrip, the company where I work now, has been building projects on Node for about 9 years (yes, the story about 9 years in a monolith on Node was mine). So the analysis should be a good one. Moreover, I already gave this talk at Moscow Node.JS Meetup 10, and it was well received. By the way, if you would rather listen than read, you can watch the recording; mine is the second talk, I'm the guy in the pink shirt.







Oddly enough, to understand where the project came from and why, we need to travel into the past. So let's load up some plutonium, raise the doors of our DeLorean, and set off on a journey through the ten important years that made Node.js what we see today.













Forward to the past



2009



Ryan Dahl announces Node.js; here it is, the very first presentation, at JSConf 2009.







2010



Express and socket.io appear, to this day the basic building blocks of almost any service.







And there are crazy people who actually write server-side code with this thing!







2011



The big players start flirting with Node.js, including Uber and LinkedIn.







npm 1.0 is released.







Node starts working on Windows.







2012



Ryan steps away from Node.js development. Remember this: it was 2012. So Ryan is certainly the creator and has done a lot for the ecosystem, but the next 7 years passed without his participation.







2013



Node at PayPal, Walmart, and eBay.







Koa appears; remember how much ink was spilled over generators?







2014



Node at Netflix. Attempts begin to formalize the project into something more mature, with an open governance model and an advisory board. Technical stagnation sets in, leading to the io.js fork.







2015



Working on the mistakes. io.js and Node merge under the auspices of the Node.js Foundation, and Node 4 is released. I must say, this is the version I consider the first on which it was really possible to build something. If you wrote code on the 0.x versions, you remember var, callback hell, and the pile of libraries for taming asynchronous code (like step and async; yes, async is not just async/await, but also an npm library).







2016



The left-pad incident, which evil tongues still bring up against the ecosystem. How many articles and attacks there were... Well, haters gonna hate. Still, important lessons were learned from it.







2017



A breakthrough year. I will not list every Node release or the growth of npm module installations, but this year the number of services on Node.js exceeded 8 million, and installations reached 3 billion per week. Absolutely cosmic figures that are hard to even imagine.







N-API also appeared, and Node.js was forked again, into Ayo.js. A very funny and instructive story involving SJW politics; it deserves a separate article, so I will not dwell on it here, just recommend reading about it at your leisure. Spoiler: the fork quietly died.







2018



The second mass hysteria since left-pad, this time about event-stream stealing bitcoins. Hundreds of posts about the insecurity of the ecosystem. Fantasy pieces about npm packages stealing credit card data. The community slings mud like a firehose. I must say, it was very useful, and conclusions were drawn here too; more on them a little later.







Also, Ryan suddenly stirs up the community with claims that serious services should be written in Go, describes the 10 things in Node that he regrets, and announces Deno, which is supposed to solve all those problems.







2019



Deno reaches a preview release, a bunch of articles appear on Habr, and now you are reading one of them.













Back to the present



I hope that after this tour it is clearer what sore spots the ecosystem had and how it developed; with this context, it is much easier to understand what is happening now.







10 things Ryan Dahl regrets about Node.js



Unfortunately, I could not find a translated write-up of the talk offhand, so I will list the points briefly here and comment on them.







  1. Lack of promise support from the start. Yes, it would have been simpler if Ryan had not cut promises early on, considering them an extra complication that had not caught on when Node was young. The time lost to all that callback hell is certainly a pity, but in 2019 every sane library exposes promises as its main interface. Moreover, even the core libraries finally provide promise APIs.
  2. Security of system and network calls. On the one hand, yes, it is good when everything is safe. On the other, it is unclear how Node turned out to be any worse in this respect than any other runtime...
  3. Building native modules with GYP. Yes, it was probably a mistake in hindsight, but who could have known that Chrome would abandon it? Then again, if Chrome left it behind, so can we...
  4. A bloated package.json, and npm as a monopoly registry. The package.json argument is a bit strange. For example, Ryan says it is full of noise like the license field; but if it were not there, how would you quickly find out the licenses of the modules used in your project? The npm argument is closer to the truth, and we will dwell on it in more detail later.
  5. node_modules. A complex dependency resolution that does not work like the browser's. Yes, fair enough. Stable dependency installs only started working without miracles around npm 4-5. But the mechanism works and allows you to do amazing things; at this point it is fine. As for browser compatibility, no matter what you do, there will still be build stages like transpilation and bundling, so node_modules hardly matters in that context.
  6. require without an extension, and the ambiguity it causes. Yes, probably bad, but hardly worth a mention...
  7. index.js as an unnecessary complication. Also too trivial and boring a point to comment on.


By the way, notice that I said Ryan regrets 10 things, yet there are only 7 points. This is not a mistake; I rewatched his talk and the reviews of it several times. Either it was a subtle joke about handling numeric values, or Ryan was simply too shy to name 3 more points...







All right then: we understand the problems, let's move on. Logically, in Deno Ryan decided to get rid of all of Node.js's problems. Let's see how that worked out for him.







What Deno consists of



  1. Deno is written in Rust.
  2. The event loop is Tokio, also written in Rust.
  3. Deno supports TypeScript out of the box.
  4. And the code is executed by the same V8 that has conquered the whole world.


Sounds good at first glance, but let's take a closer look at these points.







Rust. No, please understand me correctly: I believe Rust and Go are wonderful languages, and I am sincerely glad they exist. They make it possible to write low-level code faster and more safely than in C++. However, there is a nuance: at best, the result will be no slower than a C++ implementation. So I personally see no point in rewriting a complete analogue of the system threads and the event loop; it is unlikely to bring any real benefit, since these are things that at some point simply reach an optimal state and are not really optimized any further.







TypeScript. Again, I have nothing against TS. A huge number of companies and developers use it, and it has already proven its worth. But Deno simply hides the transpiler inside the binary and the transpiled code in special directories. That is, under the hood everything is the same as before, and there is no benefit here beyond the aesthetic. There is a minus, though: the transpiler version is nailed to the Deno version. I am not sure that is a good thing; it is easy to imagine a situation where you want to update either the transpiler or the runtime, but not both.







So far nothing tasty in sight. Let's go further and look at the main features.







The main differences between Deno and Node.js



Deno does not use npm. There is no central registry, modules are imported by URL, and there is no package.json.



That is, the code looks something like this:







import { test, runIfMain } from "https://deno.land/std/testing/mod.ts";
import { assertEquals } from "https://deno.land/std/testing/asserts.ts";

test(function t1() {
  assertEquals("hello", "hello");
});





On the first run the code is downloaded and cached, and afterwards the cache is used. Versioning is supported by specifying the version in the URL. I do not know why this delights everyone so much. I see only two scenarios:







  1. Everything goes the way Ryan wants. Developers upload component code to personal websites, create version folders, and everyone downloads packages from there. Hmm. I think this is a decisive leap into the past:

    1.1. Any code, even with a pinned version, can change uncontrollably on the author's server.

    1.2. There will be no single source from which to learn about a package's stability, how widely it is reused, or whether it contains backdoors or other problems.

    1.3. The package author will have to make sure his server can handle the load of developers downloading packages. Didn't we go through this around 2012, when npm was down more often than it was up? As the car stickers say, "shall we do it again?"
  2. The other, more realistic option: developers put packages on GitHub, GitLab, or any other repository that can handle the load and is transparent to the community. Everything seems fine, and only one question remains. Suppose we give up npm; what difference does it make which centralized repository takes its place? Everything will be exactly the same, just viewed from a different angle. Even the decentralized registry project Entropic, which seems to have quietly passed away, looked more interesting.


A separate consideration is reproducible builds. Since no one guarantees that you will download the same thing twice from some random server (especially if you did not pin the version in the URL), Ryan suggests... keeping the source code of imported packages in the project repository... Is it really 2019? Maybe Ryan is not aware that Node.js got shrinkwrap back in 2012, and later a lock file, which solve this problem far more simply, visibly, and economically?
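For reference, this is the kind of record a lock file pins. A hypothetical excerpt in the npm 5+ package-lock.json shape (the integrity value is elided here), where the content hash guarantees you receive byte-for-byte the same tarball on every install:

```json
{
  "dependencies": {
    "left-pad": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
      "integrity": "sha512-..."
    }
  }
}
```

A URL import with a version folder gives you a name and a version, but nothing that verifies the bytes behind them.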







In my opinion, loading modules by URL is a very strange and controversial solution whose main advantage is the wow effect it has on junior developers.







All asynchronous operations in Deno return a Promise.



Cool overall, yes. But a lot of water has passed under the bridge since 2012: in Node, every sane library has long worked with promises, and even the core libraries have almost finished migrating to them. So it is hard to consider this any kind of competitive advantage.







Deno always shuts down on Uncaught Errors



It is strange to see this on the list of significant differences: in Node.js you can do exactly the same thing, if you need it.







ES Modules are used instead of require



Yes, it is good that the back end and the front end are converging on the same format. But you know, Node.js now has ES Modules support too...







Deno requires explicit permissions to work with the network, files, and environment variables.



Sounds great! But the implementation... The implementation is just a set of flags: --allow-read, --allow-net, --allow-run, --allow-env. It looks something like this:







deno --allow-read=/etc https://deno.land/std/examples/cat.ts /etc/passwd





This, again, raises questions for me:







  1. In a large application, the startup command will turn into flag soup.
  2. Most likely, these restrictions will simply degenerate into the habit of launching everything with the --allow-all flag.
  3. Virtually no other runtime has such restrictions, and everyone gets along fine without them, for the simple reason that file access has long been controlled at the level of the user the process runs as, and network access is perfectly handled by firewalls. Why Ryan decided this was the runtime's problem is deeply unclear to me.
  4. And finally: containers have not just arrived, they are firmly entrenched in everyday use and have even stopped being a hype topic. They solve these problems perfectly. One gets the feeling that Ryan arrived in 2019 by time machine straight from 2012; only that would explain everything, since back then there was still a whole year to go before Docker's release...


Our days. npm









Well, to recap, here is what has happened to npm since 2012:







  1. Packages no longer disappear: deleting or replacing a published version is prohibited.
  2. The lock file provides reproducible builds.
  3. There is a security audit, via GitHub, Snyk, and npm itself.
  4. There are usage and dependency statistics.
  5. There are alternative clients.
  6. You can install packages from other sources: git, GitHub, whatever.
  7. There are proxy registries.


And I consider the main advantage of npm... the fact that it can be thrown out of the ecosystem at any moment. There is a protocol, there are clients, there are other registries... As soon as Akela misses his leap, any large company can stand up an alternative registry: Facebook, Google, Microsoft, GitLab... So far this has not happened precisely because npm works quite stably and meets the community's needs.
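Swapping the registry out is literally a one-line config change. A hypothetical .npmrc pointing the standard client at an alternative registry that speaks the same protocol (the URL is made up):

```ini
; .npmrc - the registry URL below is hypothetical
registry=https://registry.example.com/
```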







To summarize



Let's go through the points:







  1. Rust is not an advantage.
  2. TypeScript is not an advantage.
  3. Loading modules by URL without npm is more like two steps back.
  4. The security enhancements look awful in their current form.
  5. The remaining differences are minor. Except the logo. The logo is awesome. I love dinosaurs!








In the end, I simply do not see the point of using Deno. I cannot stand the position of "haven't tried it, but condemn it"; then again, even Ryan says that Deno is still raw, so I have not tried it.







However, I really wanted to find an antagonist who would tell me that I was wrong, that Ryan had made a cool thing, and that I simply did not understand its use case. I discussed Deno a lot with colleagues and friends, and presented all of this at Moscow Node.JS Meetup, and no one gave me an alternative opinion. That is partly why I am writing this article on Habr: tell me, maybe I still missed or misunderstood something?







