Importing public packages into Deno


I was looking at some code examples in the new Deno, and something caught my attention and caused me some discomfort: the way public packages are imported in Deno, through a URL.

I know that npm is not a perfect world and has its problems, but importing modules through a URL felt strange to me at first sight. I know the Go language imports this way, but since I have never used it, I have nothing to say about that. Still, bringing this into an environment 'similar' to Node.js, so to speak, left me with some doubts when looking at code like this:

import { Response } from "https://deno.land/[email protected]/http/server.ts";
  • Do I have to import by URL in every file where I need a library/framework/module?
  • If one of these modules gets a security update, is deprecated, changes domains, or simply goes offline, do I have to manually 'scan' each file and change these versions by hand?
  • Deno uses a built-in cache so you don't have to re-download things you already downloaded for other projects, but each project has its own package versions; how is that managed? Can you manage this cache?

I saw in the documentation that you can create an import_map.json file (see here), where you place all the imports in one file and avoid manual imports in each of your files.

  • But then what? If I have 50 imports in this JSON, do I still have to manage it manually the same way?
  • If I update this JSON, will it overwrite the previous version saved in the cache?

Anyway, I don't know if I'm complaining too early and in the future there will be a package manager (I don't know if this idea is on Deno's roadmap).

Deno has some very interesting ideas, mainly related to security and to running TypeScript without having to install anything, and that caught my attention. But from the little I have studied, I've come to the conclusion that Deno still has a long way to go to get where Node.js is. At the moment, in my view, Deno's main effect has been to generate news; I don't see a purpose in using it in production, at least for now, beyond studying the concepts where it differs from Node.js. But it's worth discussing why this platform did not ship with a native package manager. Maybe I'm talking nonsense and years from now this question will be irrelevant.
I accept any material on the subject as an answer; these are my doubts.

  • "But what? If I have 50 imports in this JSON, will I have to manage it manually anyway?": I don't get it, isn't it the same concept as package.json, updating in one place?

  • @Rafaeltavares in a way it is, but it would mean changing the URL from one std version to another, and I wondered whether that would overwrite the files already in the cache.

  • This update should be posted as an answer, not here; you shouldn't answer a doubt or suggestion inside the question itself.

  • @novic ah yes, I was going to post it as an answer, but I forgot. Thank you for the reminder :)

2 answers



Warning: this answer contains opinions.


I installed Deno these days (version 1.5) to "play around" a little, so I'll share my impressions, which may help with your doubts. One caveat: I only started using Node about 2 years ago and I don't consider myself an expert in this technology (that is, I haven't used it enough to have a consolidated opinion, only the impressions of someone who doesn't yet know the full background).

Do I have to import by URL in every file where I need a library/framework/module?

In principle, if they are external libs, yes (remember that you can also import local modules from the file system). Of course, the import_map.json you mentioned is an alternative to "type less", but that seems to be the idea (I understood it as more or less, but not exactly, a "package.json with URLs").
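For illustration, a sketch of what such an import map looks like; the std version in the URL is hypothetical, just to show the shape:

```json
{
  "imports": {
    "http/": "https://deno.land/std@0.74.0/http/"
  }
}
```

With this map (passed via the --import-map flag), a file can write `import { serve } from "http/server.ts";` and Deno expands the prefix using the map; bumping the version is then done in a single place, if I understood the docs correctly.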

It is totally different from Node, where there is a central repository from which everything is downloaded "magically", so you don't need to specify which URL will be used (although it is possible to use more than one registry). Maybe Deno is like this because there is no central repository yet (will there be one? I don't know), or maybe because the idea is exactly this: to let you download from anywhere, without limiting yourself to a single "central entity" that holds all the packages in the world.

Both approaches (centralized vs. decentralized) have advantages and disadvantages, and each person will prefer one for various reasons. Honestly, it doesn't really matter much to me whether it's centralized or not; it would be enough if people didn't abuse dependencies and didn't add unnecessary modules for trivial things, but I digress...

Anyway, there is no perfect model. A centralized repository can make management easier (fewer URLs to configure), but it does not necessarily guarantee "quality"/curation (let's face it, there is a lot of questionable stuff on npm). A decentralized model doesn't guarantee it either, although "competition" may perhaps stimulate a search for quality. I think some abstraction above import_map.json might even emerge, maybe "favorite URL profiles/groups" or something like that. But maybe Deno wants to start simple and only later get more elaborate (if done carefully, it won't turn into the mess that much software ends up becoming). I don't know, only time will tell.

Anyway, Deno is being built with ideas different from Node's, and we still can't tell whether it will "hit" or "miss". But at least it's trying something different, which is commendable (given that both were created by the same person, and it is unusual in our field for someone to decide to build something that could kill their previous creation while it is still popular). I prefer not to give a verdict on which is better, not least because I don't think there is a "better"; there are pros and cons to each approach, and each person chooses what they prefer (preferably based on technical evaluation). In computing everything is a trade-off; what it can't turn into is a "religious war".


If one of these modules gets a security update, is deprecated, changes domains, or simply goes offline, do I have to manually 'scan' each file and change these versions by hand?

Yes.

But doesn't that also happen with any dependency manager? If you are using some lib and it is updated or removed, you will have to go to the configuration file (whatever it is) and change it by hand. Of course, you can configure package.json to use newer versions automatically, but that doesn't save you from the problems that can occur if a lib is suddenly removed.

Of course, with each lib coming from a different URL, it might take a little more work to manage. But someone may well create a solution for this. We shall see...
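In fact, the Deno manual itself suggests a convention for this: a local deps.ts file that re-exports all external dependencies, so URLs and versions live in a single place. A sketch (the std version in the URL is illustrative):

```typescript
// deps.ts: the only file in the project that references external URLs
// (the version below is illustrative)
export { serve } from "https://deno.land/std@0.74.0/http/server.ts";

// In the rest of the project you would then write:
// import { serve } from "./deps.ts";
```

Updating a dependency then means editing one line in deps.ts, instead of scanning every file.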


If I update this JSON, will it overwrite the previous version saved in the cache?

No.

To test this, I created a very simple lib with 2 versions, and laid it out in the following structure:

libs/
  |
   \_ mylib@1.0
  |      |
  |       \_ mylib.ts
  |
   \_ mylib@2.0
         |
          \_ mylib.ts

The content of each is as simple as possible:

// libs/mylib@1.0/mylib.ts
export function f() {
    console.log('mylib 1.0');
}

// libs/mylib@2.0/mylib.ts
export function f() {
    console.log('mylib 2.0');
}

And I made a test file (test.ts):

import { f } from "http://localhost:1234/libs/mylib@1.0/mylib.ts";

f();

Note that I served the libs from a local HTTP server.

When running it (deno run test.ts) for the first time, the lib is downloaded (version 1.0) and the output is:

Download http://localhost:1234/libs/mylib@1.0/mylib.ts
Check file:///home/hkotsubo/test_deno/test.ts
mylib 1.0

From the second run on, the download is no longer done (I confirmed this by looking at the HTTP server logs, and by doing another test with the HTTP server turned off, which worked), and the script's output is only "mylib 1.0".

And in the cache directory, an entry was created for mylib, in the folder .cache/deno/deps/http/localhost_PORT1234 (the .cache folder was created in my user's home), with 2 files:

fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f
fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f.metadata.json

The first contains the entire content of version 1.0 of mylib.ts, and the second contains its metadata:

{
  "headers": {
    "date": "Wed, 28 Oct 2020 10:51:30 GMT",
    "connection": "close",
    "content-length": "55",
    "content-type": "application/octet-stream",
    "host": "localhost:1234"
  },
  "url": "http://localhost:1234/libs/mylib@1.0/mylib.ts"
}

Now, what happens if I change my script to use version 2.0 of mylib.ts?

// test.ts, use version 2.0 of mylib
import { f } from "http://localhost:1234/libs/mylib@2.0/mylib.ts";

f();

When running again with deno run test.ts, the output is:

Download http://localhost:1234/libs/mylib@2.0/mylib.ts
Check file:///home/hkotsubo/test_deno/test.ts
mylib 2.0

It downloaded again (after all, it's another version, another URL), but again this only happens the first time I run the script. From the second run on, no download is done and the code just prints "mylib 2.0". And 2 more files were created in the cache directory, i.e., now there are 4:

fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f
fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f.metadata.json
fb378398f20710b3d0dc83428cddc811c69646561a7da4fd6d3ec2989be2756b
fb378398f20710b3d0dc83428cddc811c69646561a7da4fd6d3ec2989be2756b.metadata.json

The 2 new files refer to version 2.0 of mylib.ts: the first contains the code, and the second (metadata) contains the information for that version:

{
  "headers": {
    "host": "localhost:1234",
    "content-type": "application/octet-stream",
    "connection": "close",
    "content-length": "55",
    "date": "Wed, 28 Oct 2020 11:02:34 GMT"
  },
  "url": "http://localhost:1234/libs/mylib@2.0/mylib.ts"
}

Notice that the URL is different from the one in the other file.

That is, both versions of mylib.ts stay in the cache, and I can change the import used in test.ts to either of them; the right one is found in the cache (and no download is done again). There is no confusion between versions.

Remember that the cache is centralized and external to your project. All projects look in the same cache, and since it can keep several versions of the same lib, there is no risk of overwriting anything (each project can use whichever version it wants, which is read from the cache without problems).
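Just to make the idea concrete, here is a toy sketch (not Deno's actual implementation) of a cache keyed by the full URL; since the version is part of the URL, each version gets its own entry and nothing is ever overwritten:

```typescript
// Toy sketch only (NOT Deno's real implementation): a download cache
// keyed by the full URL.
const cache = new Map<string, string>();

function fetchModule(url: string, download: () => string): string {
  if (!cache.has(url)) {
    // Cache miss: "download" the module and store it; later calls reuse it.
    console.log(`Download ${url}`);
    cache.set(url, download());
  }
  return cache.get(url)!;
}

fetchModule("http://localhost:1234/libs/mylib@1.0/mylib.ts", () => "mylib 1.0");
fetchModule("http://localhost:1234/libs/mylib@2.0/mylib.ts", () => "mylib 2.0");
// Asking for 1.0 again prints no "Download" line: it comes from the cache.
fetchModule("http://localhost:1234/libs/mylib@1.0/mylib.ts", () => "mylib 1.0");
```

Two different URLs mean two cache entries, which is exactly why updating the import to another version cannot overwrite the old one.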

Attention, paragraph with personal opinion: I think it's better this way, with a centralized cache that downloads only once, instead of a directory (node_modules) per project, where the same things can be downloaded several times, often redundantly, since they are the same versions of the same libs (by the way, a centralized cache is also the approach that other managers, like Maven and Gradle, have used for a long time). But let's not get into an endless discussion about which dependency manager is best, so we don't lose the focus of the answer. As I said, there is no perfect solution; each approach has its pros and cons.

I'll just leave for reflection a presentation by the creator of Node, in particular what he says from the 13-minute mark of the video ("It's my fault and I'm very sorry"). Maybe that's why he wanted to make Deno so different from Node, and probably there will "never" be 100% compatibility.


even without passing the --allow-net flag, it accessed the internet and downloaded the packages

It makes sense. If one of Deno's premises is that you can import modules directly from a URL, then by default imports have to be able to access the internet.

The --allow-net option is what allows the script's own code to access the network. Let's look at the example script from Deno's website:

import { serve } from "https://deno.land/[email protected]/http/server.ts";
const s = serve({ port: 8000 });
console.log("http://localhost:8000/");
for await (const req of s) {
  req.respond({ body: "Hello World\n" });
}

I saved this code in the file server.ts and ran it (deno run server.ts). The output was:

Download https://deno.land/[email protected]/http/server.ts
Download https://deno.land/[email protected]/_util/assert.ts
Download https://deno.land/[email protected]/encoding/utf8.ts
Download https://deno.land/[email protected]/io/bufio.ts
Download https://deno.land/[email protected]/async/mod.ts
Download https://deno.land/[email protected]/http/_io.ts
Download https://deno.land/[email protected]/http/http_status.ts
Download https://deno.land/[email protected]/textproto/mod.ts
Download https://deno.land/[email protected]/async/delay.ts
Download https://deno.land/[email protected]/async/mux_async_iterator.ts
Download https://deno.land/[email protected]/async/deferred.ts
Download https://deno.land/[email protected]/async/pool.ts
Download https://deno.land/[email protected]/bytes/mod.ts
Check file:///home/hkotsubo/test_deno/server.ts
error: Uncaught PermissionDenied: network access to "0.0.0.0:8000", run again with the --allow-net flag
    at processResponse (core.js:226:13)
    at Object.jsonOpSync (core.js:250:12)
    at opListen (deno:cli/rt/30_net.js:32:17)
    at Object.listen (deno:cli/rt/30_net.js:207:17)
    at serve (server.ts:287:25)
    at server.ts:2:11

That is, the import downloaded the dependencies, but the script's code (which starts an HTTP server) did not run, because it needs the --allow-net flag.

If --allow-net were needed just to import packages, then every script with an import (even ones whose code makes no network access at all) would have to run with this flag enabled, which would make no sense (if everyone needed the permission, a flag to enable it would be half useless; it would be simpler to enable it by default). That's why imports can access the internet by default (I mean, it makes sense to me that it works this way).


I don’t know if I’m complaining too early and in the future there will be a package manager

I didn't dig deep, but in a quick search I found one (a "package management tool for Deno similar to npm but keeping close to the Deno philosophy"). Many other alternatives have probably already emerged, and only time will tell which will become more popular.

For the record, the documentation says that "Deno explicitly takes on the role of both runtime and package manager"; that is, Deno considers itself a "package manager". Maybe not in the same way as Node (which everyone has gotten used to), but anyway...

My conclusion is that it is too early to say whether Deno will be a success or not. It has only 2 years of existence against more than 10 years of Node, and we still don't know what is coming (whether someone will create a "killer" dependency manager, or invent something else, etc.); there is no way to know how things will be in 10 years. Maybe only one of them survives, maybe both coexist, maybe something else comes along and kills them both. There is no way to know...

So I guess yes, you’re complaining too soon :-)

  • I know that commenting just to compliment is an anti-pattern here, but the answer was excellent. :)

  • Excellent explanation/opinion about Deno. I started studying Node.js a little over a year and a half ago, so I kind of indoctrinated myself with the concepts Node implemented, and started assuming that Node's creator would replicate those concepts in Deno. I agree when you say there is no better or worse, because comparing 10 years of Node with a newborn Deno would be nonsense on my part. My intention was not to speak ill of Deno or be a Node fanboy, but rather to question and exchange ideas.

  • "what it can't turn into is a 'religious war'": I agree 100%. I went back to reading Deno's documentation out of hobby and curiosity, and as the good nerd I am, I'm open to learning new things. Deno became, shall we say, my "second backend language": I like some of its concepts, I question others, but I like seeing something new. "So I guess yes, you're complaining too soon :-)": hahaha, I think so too. A few months after asking, I think I rushed to judgment. Deno has a long way to go. Two years from now I'll come back to this question and find it funny XD


UPDATE (26/10/2020)

I went back and read the entire Deno documentation, and some of my doubts are starting to be resolved. I will post updates here in case they help other people.

For this doubt:

Deno uses a built-in cache so you don't have to re-download things you already downloaded for other projects, but each project has its own package versions; how is that managed? Can you manage this cache?

The answer is yes, as the documentation itself explains:

Deno caches remote imports in a special directory specified by the DENO_DIR environment variable. It defaults to the system's cache directory if DENO_DIR is not specified. The next time you run the program, no downloads will be made. If the program hasn't changed, it won't be recompiled either.

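As a sketch (assuming Deno is installed), pointing DENO_DIR at a project-local folder gives you an isolated cache, and deno info shows where the cache lives:

```shell
# Use a project-local cache instead of the system default
DENO_DIR=./deno_dir deno run test.ts

# Print cache locations (DENO_DIR, remote modules cache, etc.)
deno info
```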

But I found it strange that, even without passing the --allow-net flag, it accessed the internet and downloaded the packages. It seems this privilege is the default when performing imports.
