Support npm URLs #13703
Why have a built-in import map?

```json
{
  "imports": {
    "npm:": "https://unpkg.com/"
  }
}
```

This already solves the problem?
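For illustration, the idea is that an import like the one below would be rewritten to an unpkg URL (the package name is an arbitrary example; note that, per the import-maps spec, only keys ending in `/` act as prefix matches, so a key of `"npm:"` as written matches only the literal specifier `npm:`, a point a later comment in this thread runs into):

```ts
// Intended rewrite: "npm:lodash-es" -> "https://unpkg.com/lodash-es",
// which unpkg redirects to the package's published main file.
import * as _ from "npm:lodash-es";
```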
I am not supportive of this suggestion, and I think there are lots of semantics that would need to be defined to even really consider it. Reasons I am not supportive:

- The current widely used solution in Deno for accessing npm packages is to use a CDN like esm.sh, esm.run, or skypack.dev. Admittedly there are some rough edges and the usability is something we need to work on, but it helps ensure that Deno-first and npm code work together without having to bring the whole of the npm/Node.js ecosystem along. While esm.sh is great, there are lots of improvements that could be made with something like npm.deno.land to help ensure a smooth, frictionless journey and help people move smoothly from a legacy Node.js world to a more modern runtime.
- While there are some challenges and limitations with import maps, they are a) a web standard, and b) transparent, user-editable configuration. Being transparent about resolution, and auditable, is a big thing versus some sort of opaque mechanism, because we are likely to get it wrong sometimes, and a user editing a bit in an import map (or having better and richer tools to manipulate it) is a lot better than something locked away in the binary that you can't influence.
But how will `--compat` know this is code written for Node? I suggest having a convention for an npm import config file, like the deno.json convention.
- Deno code is not portable to other platforms: Deno code is generally Deno-specific TypeScript (with extension imports), so it does not work directly in browsers, nor directly with canonical TypeScript.
- The resolution algorithm will be changed inside of npm packages, so bare specifiers will be resolved via the Node resolution algorithm. In normal Deno scripts, bare specifiers will work as they currently do (see the sketch after this list).
- Transparently - I see no difficulties here.
- This suggestion does not imply
- Probably remote dependencies with
- You should read Strategy Letter II: Chicken and Egg Problems.
- It would probably use the default registry, and maybe we'd add a
- I'm confident they can be made to work.
- It will not be littered with bugs - this is a straightforward feature. Most of the infrastructure is already in place.
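A sketch of the two resolution behaviors described in the list above (the `npm:` specifier syntax and the package names are illustrative assumptions, not the final design):

```ts
// Normal Deno script: bare specifiers are NOT resolved via the Node
// algorithm; they need an import map entry or a full URL.
import { serve } from "https://deno.land/std@0.140.0/http/server.ts";

// Hypothetical npm specifier: *inside* the fetched express package,
// bare specifiers such as require("accepts") would be resolved with
// the Node resolution algorithm against express's own dependencies.
import express from "npm:express";
```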
For dotland there is a requested feature and PR to transpile TS to JS, so browsers can consume /x/ modules. This suggestion would quite hinder that (on top of the Deno namespace, but that's another topic).
I like this proposal better than the current

This doesn't expose
This proposal suggests we switch the runtime to what the current
@crowlKats I'm a huge fan of that proposal, so I'm curious why this proposal would hinder it? I was under the impression that all the Node.js compatibility polyfills are in TS, so it would be able to run in the browser after being compiled to JS (assuming the npm module doesn't e.g. use the file system / native addons / etc., but the same can be said of Deno modules)? I.e. I'd have thought that the Node stuff could essentially be "browserified". I'm probably misunderstanding something here.
@kt3k Okay, I didn't know it switched to a Node.js context. I thought it just downloads it, but this clarifies it for me, thanks! 😊 But why not allow something like this:

```ts
import express from "https://registry.npmjs.org/express/4.17.3";

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.send('Hello World!');
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});
```

This could check whether the endpoint is an npm-like registry and provide everything in the background for you.
Something like that might work, but
@kt3k Why not just make npm accessible via something like

Or is it not possible to solve this via some sort of transpilation/shimming of npm packages? That seems unlikely, but I might be misunderstanding something. If it is possible, then the cost of this

I think it'd be a really bad idea to hurt browser compatibility here if there were some other viable way to get npm packages working in Deno.
We already have that, it's esm.sh/package-name
By adding a query parameter of ?module to unpkg.com URLs, any bare import specifiers will automatically be expanded to absolute URLs, and it will serve ES modules for all packages that specify a "module" field in their package.json. This way there doesn't need to be any extra configuration like a package.json, or any non-web-compatible changes to Deno. If a package isn't offering an ES-module-compatible version, usually there is already an issue on their GitHub asking for it, and it's only a matter of time.

I've been using this a lot, and it works well for most packages, but it requires suffixing all imported modules with ?module.

Also, trying to use

```json
{
  "imports": {
    "npm:": "https://unpkg.com/"
  }
}
```

fails on Deno version 1.21.2, but

```json
{
  "imports": {
    "npm/": "https://unpkg.com/"
  }
}
```

works fine.

If you don't want to suffix all of your imports with "?module", you have to include a separate mapping for every single package you want from npm/unpkg.com, specifying the entire URL. So if you just want foo.js:

```json
{
  "imports": {
    "npm/foo.js": "https://unpkg.com/foo/bax/bar/foo.js?module"
  }
}
```

It would be nice if import maps passed query parameters on to redirects/mappings, like

```json
{
  "imports": {
    "npm/": "https://unpkg.com/?module"
  }
}
```

I saw some discussion about this on WICG/import-maps, but currently trying this results in an error.

esm.sh works in a similar way, where the returned resource can be changed depending on the query parameter passed to it, but with a lot more options.
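For illustration, a direct ?module import looks like this (the package and version are arbitrary examples):

```ts
// unpkg serves the package's "module" build and rewrites any bare
// specifiers inside it to absolute ?module URLs, so transitive
// imports also resolve in Deno and in browsers without extra config.
import { html, render } from "https://unpkg.com/lit-html@2.2.2?module";
```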
@lucsoft I think this issue is about trying to avoid the need for the

So, naively (not suggesting this as an actual solution),
With regards to whether or not this requires

I think we should start by allowing
Regarding the scheme format, allowing a (non-npm) authority to be specified would be pretty useful.
@jsejcksn Essentially, yes. But technically I'm referring to a URI authority. However I don't think it would be feasible to support registry URLs that include a path section (e.g. "https://skimdb.npmjs.com/registry"). |
@kitsonk @ry @lucacasonato @crowlKats Can I ask a couple of questions about the direction that's being taken here according to the latest blog post? If so:
My main worry here is that people will start using

Sorry if these questions have been answered already in some other thread that I've missed. Thanks!
Presumably a problem with the
React requires

https://unpkg.com/react-dom@<version>/server.node.js

Other packages have it inverted, where prod logic is the default. Either way,

https://unpkg.com/react-dom@<version>/client.js

That module is meant for client use, but it shows how packages use such logic with inline code instead of nested

Ideally, over time npm packages are published as pure ESM, with dev and prod builds that can be selected via import maps, and CDNs that don't do any build steps at all but just statically host the published files (like unpkg.com).

Imagined dev import map:

```json
{
  "imports": {
    "react/": "https://esm.sh/react@<version>/dev/",
    "react-dom/": "https://unpkg.com/react-dom@<version>/dev/"
  }
}
```

Imagined prod import map:

```json
{
  "imports": {
    "react/": "https://esm.sh/react@<version>/",
    "react-dom/": "https://unpkg.com/react-dom@<version>/"
  }
}
```

Imagined imports:

```js
import createElement from "react/createElement.mjs";
import hydrateRoot from "react-dom/hydrateRoot.mjs";
```

There would be no main index modules to map, if packages had optimal JavaScript module design. No

Are projects like React so ossified, untouchable, and aloof that we would rather build non-standard features into JavaScript runtimes than simply fix the module format and structure of their packages so they align to web standards, allowing them to be used efficiently and universally with the tools we have today in Node.js, Deno, and browsers?

A JavaScript runtime should align to web standards, and not start cramming in proprietary features that will lead to a fractured ecosystem based on non-web-standard features that endorse arbitrary for-profit corporations. How would we feel if Chrome added support for an `npm:` scheme?

We already have a solution for using legacy CJS npm packages in Deno and browsers until they clean up their act and publish ESM: CDNs like esm.sh. Why don't we just fix up the remaining bugs such CDNs have, and work with that? The engineering work to fix the CDN bugs is surely a lot less/cheaper than implementing the `npm:` scheme.

As other commenters have pointed out, the
@josephrocca there are certain npm packages that can't work in the browser, but will eventually work in Deno. For packages that are possible to transpile for the browser, the npm specifiers can be mapped to a transpile server like esm.sh. So, these specifiers provide a way for dependencies to state their version requirements, rather than depending on a single explicit version; the runtime can then do the work of automatically de-duplicating dependencies. For the browser, although this requires an extra step by using an import map, the map can be auto-generated, and you get the added benefit of probably less code delivered to the user, because dependencies have started expressing version requirements instead.

There are several technical reasons why a transpile server like esm.sh doesn't always work. First, it's not always possible to convert CJS to ESM. There can be stuff like synchronous deferred requires, or we've seen code that's non-analyzable, using requires in an
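A sketch of the kinds of CommonJS patterns mentioned above that defeat static CJS-to-ESM conversion (the module names are made up):

```js
// Non-analyzable require: the target is only known at runtime, so a
// transpile server cannot rewrite it into a static ESM import.
const backend = process.env.DB_BACKEND || "sqlite";
const driver = require(`./drivers/${backend}`);

// Synchronous deferred require: the module must load lazily, at call
// time, which static ESM `import` statements cannot express.
function getParser() {
  return require("./heavy-parser");
}
```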
@dsherret Thanks for your reply! So am I correct in understanding that library/package authors would ideally use unpkg.com/esm.sh/etc. whenever possible (instead of `npm:` specifiers)?

If so, I'm wondering: would it somehow be possible to give a warning if

Or can we just solve this "browser-incompatible-for-no-good-reason" case with on-the-fly (and cached, of course) pre-compilation/transpilation when a

If that's the case, then that's somewhat relieving. Thanks again for your reply!
If you have an import map that maps npm specifiers to something like unpkg, then this is not a build step. If a package ships with TypeScript, then that's browser-incompatible and actually requires a build step; we don't have a warning for that, so I don't think we would give a warning for this (though we'll discuss it internally).

Yes, I think this is something that could be better solved by registries: if you specify that you want a browser-compatible module, the registry would convert npm specifiers to an ESM package, handle deduplication for you, and convert TS to JS.
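For the browser workflow described above, an auto-generated import map might look something like this (the exact output format such a registry or tool would emit is an assumption):

```json
{
  "imports": {
    "npm:chalk@5": "https://esm.sh/chalk@5",
    "npm:preact@10": "https://esm.sh/preact@10"
  }
}
```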
A very, very rough unstable implementation of this landed in #15484. We're missing a lot of stuff, like
Example:
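A representative snippet of what the proposed `npm:` imports look like (the package and version pin are arbitrary examples):

```ts
// Deno fetches the package and its dependencies from the npm registry;
// no package.json or `npm install` step is needed.
import chalk from "npm:chalk@5";

console.log(chalk.green("Hello from an npm package!"));
```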
Depends on #12648