Note: this answer contains opinions.
I installed Deno recently (version 1.5) to "play" a little, so I'll share my impressions, which may help with your doubts. One caveat: I started using Node about 2 years ago and I don't consider myself an expert in this technology (that is, I haven't used it enough to have a consolidated opinion, only the impressions of someone who doesn't yet know it in depth).
Do I have to import from a URL in every file where I need a library/framework/module?
At first, if they are external libs, yes (remember that you can also import local modules from the file system). Of course the import_map.json that you mentioned is an alternative to "type less", but it seems to me that's the idea (I understood it would be more or less, but not exactly, like a "package.json with URLs").
It is totally different from Node, where there is a central repository from which everything is downloaded "magically", so you don't need to specify which URL will be used (although it is possible to use more than one registry). Maybe Deno is like this because there isn't a central repository yet (will there be one? I don't know), or maybe because the idea is precisely this: to let you download from anywhere, without being limited to a single "central entity" that holds all the packages in the world.
Both approaches (centralized vs. decentralized) have advantages and disadvantages, and each person will prefer one for various reasons. Honestly, it doesn't really matter to me whether it's centralized or not; it would be enough if people didn't abuse dependencies and didn't add unnecessary modules for trivial things, but I digress...
Anyway, there is no perfect model. A centralized repository can make management easier (fewer URLs to configure), but it does not necessarily guarantee "quality"/curation (let's face it, there is a lot of questionable stuff on npm). But a decentralized model does not guarantee this either, although "competition" may perhaps stimulate a search for quality. I think there could even be some abstraction on top of import_map.json, maybe "favorite URL profiles/groups" or something like that. But maybe Deno wants to start simple and only add complexity later (if done carefully, it won't turn into the mess that much software ends up becoming). I don't know, only time will tell.
Anyway, Deno is being built with different ideas from Node, and it's still not possible to know whether it will "hit" or "miss". But at least it's trying to do something different, which is commendable (given that both were created by the same person, and it is unusual in our field for someone to decide to build something that could kill their previous creation, even while it is still popular). I prefer not to give a verdict on which is best, not least because I don't think there is a "best"; there are pros and cons to each approach, and everyone chooses what they prefer (preferably based on technical evaluation). In computing everything is a trade-off; what it cannot turn into is a "religious war".
If one of these modules gets a security update, or is deprecated, or if the repositories change domains, or simply go offline, do I have to manually 'scan' each of them and change these versions by hand?
Yes.
But doesn't that also happen with any dependency manager? If you are using some lib and it is updated or removed, you will have to go to the configuration file (whatever it is) and change it by hand. Of course you can configure package.json to use newer versions automatically, but that doesn't save you from the problems that can occur if a lib is suddenly removed.
Of course, with each lib coming from a different URL, it might take a little more work to manage. But someone may well create a solution for this. We will see...
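One convention that already exists in the Deno community (and that the Deno manual itself suggests) is to centralize the URLs in a single deps.ts file that re-exports everything; the file name is just a convention:

```typescript
// deps.ts - the only file in the project that mentions external URLs/versions
export { serve } from "https://deno.land/[email protected]/http/server.ts";

// any other file in the project then imports locally:
// import { serve } from "./deps.ts";
```

This way, a version bump (or a domain change) is still done "by hand", but in one place only, similar in spirit to a package.json.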
If I update this JSON, will it overwrite the previous version saved in the cache?
No.
To test this, I created a very simple lib, with 2 versions, in the following structure:
libs/
|
\_ [email protected]
| |
| \_ mylib.ts
|
\_ [email protected]
|
\_ mylib.ts
The content is as simple as possible (each version in its respective file):

// libs/[email protected]/mylib.ts
export function f() {
  console.log('mylib 1.0');
}

// libs/[email protected]/mylib.ts
export function f() {
  console.log('mylib 2.0');
}
And I made a test file (test.ts):
import { f } from "http://localhost:1234/libs/[email protected]/mylib.ts";
f();
Note that I was serving the libs from a local HTTP server.
When running it (deno run test.ts) for the first time, the lib is downloaded (version 1.0) and the output is:
Download http://localhost:1234/libs/[email protected]/mylib.ts
Check file:///home/hkotsubo/test_deno/test.ts
mylib 1.0
From the second time on, the download is no longer done (I confirmed this by looking at the HTTP server logs, and also by doing another test with the HTTP server turned off, which still worked), and the script's output is only "mylib 1.0".
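As a side note, connecting with the earlier question about updates: since the cache is not refreshed on its own, when the content behind a URL changes you can force a new download with the --reload flag:

```shell
# run the script, re-downloading all of its dependencies
deno run --reload test.ts

# or just refresh the cache, without running anything
deno cache --reload test.ts
```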
And in the cache directory, an entry was created for mylib, in the folder .cache/deno/deps/http/localhost_PORT1234 (the .cache folder was created in my user's home), with 2 files:
fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f
fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f.metadata.json
The first contains the entire content of version 1.0 of mylib.ts, and the second contains its metadata:
{
"headers": {
"date": "Wed, 28 Oct 2020 10:51:30 GMT",
"connection": "close",
"content-length": "55",
"content-type": "application/octet-stream",
"host": "localhost:1234"
},
"url": "http://localhost:1234/libs/[email protected]/mylib.ts"
}
Now, what happens if I change my script to use version 2.0 of mylib.ts?
// test.ts, use version 2.0 of mylib
import { f } from "http://localhost:1234/libs/[email protected]/mylib.ts";
f();
When running it again with deno run test.ts, the output is:
Download http://localhost:1234/libs/[email protected]/mylib.ts
Check file:///home/hkotsubo/test_deno/test.ts
mylib 2.0
It did another download (after all, it's another version, another URL), but again this is only done the first time I run the script. From the second time onwards, the download is no longer done and the code only prints "mylib 2.0". And in the cache directory 2 more files were created, i.e., there are now 4:
fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f
fd82babb7ac771cf78249676f381f5aa11a8eb6e7199a09893615123f2ec587f.metadata.json
fb378398f20710b3d0dc83428cddc811c69646561a7da4fd6d3ec2989be2756b
fb378398f20710b3d0dc83428cddc811c69646561a7da4fd6d3ec2989be2756b.metadata.json
The 2 new files refer to version 2.0 of mylib.ts: the first contains the code, and the second (metadata) contains information about the respective version:
{
"headers": {
"host": "localhost:1234",
"content-type": "application/octet-stream",
"connection": "close",
"content-length": "55",
"date": "Wed, 28 Oct 2020 11:02:34 GMT"
},
"url": "http://localhost:1234/libs/[email protected]/mylib.ts"
}
Notice how the URL is different from the one in the other file.
That is, both versions of mylib.ts stay in the cache, and I can change the import used in test.ts to use either of them; the right one is found in the cache correctly (and no download is done again). There is no confusion between versions.
Remember that the cache is centralized and external to your project. All projects look in the same cache, and since it can hold several versions of the same lib, there is no risk of overwriting anything (each project can use whatever version it wants, which is read from the cache without problems).
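If you want to inspect this on your machine, the deno info command shows where this cache lives and, given a script, its whole dependency tree (the paths it prints will obviously differ per system):

```shell
# show DENO_DIR and the location of the remote modules cache
deno info

# show the dependency tree (and cached files) of a specific script
deno info test.ts
```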
Warning, paragraph with personal opinion: I think it's better this way, with a centralized cache that only downloads once, instead of having a directory (node_modules) for each project, where the same things can be downloaded several times, which is often redundant, since they are the same versions of the same libs (by the way, a centralized cache is also the approach that other managers, like Maven and Gradle for example, have been using for a long time). But let's not get into an endless discussion about which dependency manager is best, so we don't lose focus of the answer. Because, as I said, there is no perfect solution; each approach has its pros and cons.
I'll just leave for reflection a presentation by the creator of Node, in particular what he says from the 13-minute mark of that video ("It's my fault and I'm very sorry"). Maybe that's why he wanted to make Deno so different from Node, and probably there will "never" be 100% compatibility.
even without passing the --allow-net flag, it accessed the internet and downloaded the packages
It makes sense. If one of Deno's premises is that you can import modules directly from a URL, then by default the import has to be able to access the internet.
The --allow-net option exists so that the script's code can access the network. Let's look at the example script from Deno's website:
import { serve } from "https://deno.land/[email protected]/http/server.ts";
const s = serve({ port: 8000 });
console.log("http://localhost:8000/");
for await (const req of s) {
  req.respond({ body: "Hello World\n" });
}
I saved this code in the file server.ts and ran it (deno run server.ts). The output was:
Download https://deno.land/[email protected]/http/server.ts
Download https://deno.land/[email protected]/_util/assert.ts
Download https://deno.land/[email protected]/encoding/utf8.ts
Download https://deno.land/[email protected]/io/bufio.ts
Download https://deno.land/[email protected]/async/mod.ts
Download https://deno.land/[email protected]/http/_io.ts
Download https://deno.land/[email protected]/http/http_status.ts
Download https://deno.land/[email protected]/textproto/mod.ts
Download https://deno.land/[email protected]/async/delay.ts
Download https://deno.land/[email protected]/async/mux_async_iterator.ts
Download https://deno.land/[email protected]/async/deferred.ts
Download https://deno.land/[email protected]/async/pool.ts
Download https://deno.land/[email protected]/bytes/mod.ts
Check file:///home/hkotsubo/test_deno/server.ts
error: Uncaught PermissionDenied: network access to "0.0.0.0:8000", run again with the --allow-net flag
at processResponse (core.js:226:13)
at Object.jsonOpSync (core.js:250:12)
at opListen (deno:cli/rt/30_net.js:32:17)
at Object.listen (deno:cli/rt/30_net.js:207:17)
at serve (server.ts:287:25)
at server.ts:2:11
That is, the import downloaded the dependencies, but the script's code (which starts an HTTP server) did not run, because it needs the --allow-net flag.
If --allow-net were required to import packages, then every script with some import (even those whose code makes no network access at all) would have to run with this flag enabled, which would make no sense (if everyone needed this permission, a flag to enable it would be rather useless; it would be simpler if it were already enabled by default). That's why the import can access the internet by default (I mean, it makes sense to me that it works this way).
I don’t know if I’m complaining too early and in the future there will be a package manager
I didn't dig deep, but in a quick search I found one ("package management tool for Deno similar to npm but keeping close to the Deno Philosophy"). Many other alternatives have probably already emerged, and only time will tell which one will become more popular.
For the record, the documentation says that "Deno explicitly takes on the role of both runtime and package manager" - that is, Deno considers itself a "package manager". Maybe not in the same way as Node (the way everyone got used to), but anyway...
My conclusion is that it is too early to say whether Deno will succeed or not. It's only 2 years old against more than 10 years of Node, and we still don't know what is to come (whether someone will create a "killer" dependency manager, or invent something else, etc.); there is no way to know how things will be in 10 years. Maybe only one of them survives, maybe both coexist, maybe something else comes along and kills them both. There is no way to know...
So I guess yes, you’re complaining too soon :-)
"But what? If I have 50 imports in this JSON, will I have to manage it manually anyway?" - I don't get it; isn't it the same concept as package.json, updating in one place?
– Rafael Tavares
@Rafaeltavares in a way it is, but it would be a matter of changing the URL from */[email protected]/http/server.ts to */[email protected]/http/server.ts, and of whether this would overwrite the files that are already in the cache.
– Cmte Cardeal
This update should be posted as another question, not here; you cannot answer a doubt or suggestion within the question itself.
– novic
@novic ah yes, I was going to post it as an answer, but I forgot. Thanks for the reminder :)
– Cmte Cardeal