NHacker Next
JSR Is Not Another Package Manager (deno.com)
lrvick 24 days ago [-]
"To publish provenance for a package, you must publish the package from a GitHub Actions workflow."

Read as: "We wish to uphold NPM tradition of not allowing the authors of code to sign their own code, but as an alternative we will increase your package trust score if you build your package with a heavily censored, centralized, and proprietary build system owned by Microsoft"

How is it that the only source for signed JavaScript packages is STILL Debian apt-get repos? NPM and JSR still have dramatically worse JS supply chain security than a -terrible- 30-year-old package manager, which itself still requires a lot of custom tooling overhead in every project for reproducible builds (docker, apt package hash pinning, apt-archive, etc.).

Oh right, because the NPM team was worried even having -optional- support for package signing would scare off people from publishing javascript packages.

https://github.com/npm/npm/pull/4016

Philip-J-Fry 24 days ago [-]
I think true innovation here would be forgoing a centralised package registry and just using good old file systems. Go modules have the the right idea, pull a package from a server which responds to basic HTTP requests or even a file system. If you want some smart searching capability, or generated docs then create a proxy which packages are downloaded through which allows you to index them.

Just take things back to basics. You shouldn't have to publish a package on some centralised registry, you should just be able to import a package from anywhere.

crabmusket 24 days ago [-]
Deno literally did this[1] in order to be browser-compatible[2], and everybody whinged about it. I happen to love that I can just import an HTTPS URL inside Deno code. It's really nice for playing around with libraries, writing little scripts that pull in dependencies, etc.

[1] https://docs.deno.com/runtime/manual/basics/modules/#remote-...

[2] https://html.spec.whatwg.org/multipage/webappapis.html#modul...

tionis 23 days ago [-]
I love this approach in Deno! I did write some nice scripts using this!
wwwigham 24 days ago [-]
You can use file:// or git:// versioned dependencies in normal npm `package.json`s today. People just don't, outside some edge cases (local packages, nightly versions) because the centralized registry has upsides for discoverability and maintenance. There's also private registries, where you can setup a .npmrc to pull different namespaces from different registries. But if you want, you can totally be that guy who only publishes their packages to their presumably self-hosted repo - it works.
Macha 23 days ago [-]
The same is also true of Python, and once used to be the majority option, which was around the time that the "Python packaging is terrible" impression was starting. Since like 2015, pypi has banned it in hosted packages, but you can do it for private packages all you want.
lifthrasiir 24 days ago [-]
The fact that Go eventually introduced the central proxy server highlights one of the biggest issues with this approach though: the vast majority of use cases do prefer centralized systems.
crabbone 24 days ago [-]
The major advantages of centralized systems in my mind are:

* No need to deal with duplicates (what if two indices provide the same version of a package but with different dependencies?)

* Oversight (the ability of most popular package managers to "side-load" from Git repo / URL / etc. "illegitimate" source has been a bane of my existence for a while). Being accepted into some "central" index means at least some (albeit usually not much...) degree of oversight. Being able to load from anywhere means exposure to spoofing.

Also, unfortunately, lots of package management systems in common use can be "subverted" to be used in a decentralized way. And, if it were up to me, I'd rather not have this ability at all than have to try to defend against negative consequences.

pjmlp 24 days ago [-]
Go is exactly how one should not do it: hardcoding SCM URLs in source code, only allowing source code distribution, and forcing everyone else to jump through hoops to make it work otherwise.
davexunit 24 days ago [-]
Go programs are a nightmare to package for linux distros due to the way they handle dependencies so I would strongly discourage any other language from copying the Go model.
jitl 24 days ago [-]
This says more about Linux distros creaky 90s era design than it says about Go’s model.
crabbone 24 days ago [-]
Not really. With all the problems of packaging software for Linux, I'll take that any time over Go.

The whole "spirit" of how Go authors approached every infrastructure task they had was to start as simple as possible and grow as necessary, in small increments. Unfortunately, incrementalism doesn't work in this domain. It's very hard / virtually impossible to go back and undo bad decisions once they become public. Some steps will require sweeping changes all across the board and cannot be performed in one increment.

Go packaging mostly works because of the extra restrictions most package authors impose on themselves. But if you stray off the beaten path, you'll discover that the system is unprepared to deal with your case. Go is the opposite of thoughtful, insightful design. It wins at first because it's fast and easy, and once you're hooked, you have to work extra hard to deal with the difficult parts yourself.

jitl 23 days ago [-]
I guess I view Debian et al similarly to how you view Go: distros like Debian have been incrementally iterating on a simple design since the 90s and their package systems have become a Katamari Damacy ball of mud around a core designed for having a single agreed-upon version of each C library shared by all software in the universe.

Now to reliably distribute Linux software, it’s very popular to distribute the software along with the entire userspace the software compiled against in the form of a Docker container.

crabbone 23 days ago [-]
> it’s very popular to distribute the software along with the entire userspace

OK, there are two ways to interpret the word "popular".

* As in "a presidential candidate enjoyed popular support".

* As in "ski is a popular sport in some European countries".

So, let me tell you this: distributing software with the entire userspace is not popular under the first interpretation of the word. Users hate developers who distribute software like this; it's what they call "bloatware", "not a team player", etc. I personally despise developers who do this because, from my perspective, these are low-skill developers who do less to get equal pay at my expense (as a user).

But, of course, it's popular to be a bad developer, in the same way how it's popular to steal bicycles -- low effort, high reward, if you ignore the disappointment of someone else.

ManBeardPc 24 days ago [-]
Bazel also allows pulling dependencies from different sources via Bzlmod.

1. Define a dependency, e.g. bazel_dep(name = "protobuf", version = "3.19.0")

2. Define the repositories where to look for it

A repository is just a file structure like /modules/$MODULE/$VERSION plus some info about the module. It can be on an HTTP server or just on your local file system.

https://bazel.build/external/overview#bzlmod

I hope more tools adopt something similar. Or maybe a single package manager for everything, so we can finally build cross-language software without having to debug why tool A can't find stuff from B.

fire_lake 24 days ago [-]
Bzlmod requires a registry of packages in a git repo, so it’s quite centralized.
ManBeardPc 24 days ago [-]
From the documentation: https://bazel.build/external/registry

> Bzlmod discovers dependencies by requesting their information from Bazel registries: databases of Bazel modules. Currently, Bzlmod only supports index registries — local directories or static HTTP servers following a specific format.

It just uses the Bazel Central Registry (BCR) by default. You can specify your own via the --registry flag and then it uses them instead. It is possible to specify multiple registries at the same time, so you can mix the official, company internal and on your local filesystem at the same time.
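As a sketch (the internal registry URL below is hypothetical), the two pieces combined might look like:

```text
# MODULE.bazel — declare the module dependency (example from the thread)
bazel_dep(name = "protobuf", version = "3.19.0")

# .bazelrc — consult an internal registry first, then fall back to the BCR;
# registries are tried in the order given
common --registry=https://bazel-registry.internal.example.com
common --registry=https://bcr.bazel.build
```

The registry itself is just static files, so "hosting" one can be as simple as serving a /modules/protobuf/3.19.0/ directory over HTTP.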

mike_hearn 24 days ago [-]
What's the business model, I wonder? The reason the npm registry didn't evolve much is that it is expensive to give away cloud services, and it eventually got sold to Microsoft, who presumably assessed that adding features wouldn't drive much extra revenue. How many people publish private packages to npmjs.com, and how much does it cost to host and serve the ever-growing collection, especially as they're pretty lenient about people serving large binaries from it?

The Java world got burned by this a few years ago when JFrog shut down Bintray, which had been the second largest open source package repository after Maven Central. A ton of stuff had to be republished, a ton of build configs updated. Now Maven Central is hopefully Too Big To Fail and Sonatype is a sustainable independent business, partly due to the widespread practice of companies buying its Nexus product to mirror Central internally, something I haven't seen so much of in the JS space, and partly because the Java ecosystem doesn't tend to host giant binaries off it. But still.

Gotta admit, I'd like to see a more decentralized approach become popular here. There's no specific reason packages always have to be hosted in one or two central registries.

wwwigham 24 days ago [-]
> What's the business model, I wonder?

Serverless functions, right? That's what deno deploy is billed as. Presumably the registry is a platform-adjacent investment to try and bring more serverless market-share to deno. Since it provides a npm-registry compatible facade, presumably you should feel safe publishing deno-y code to it (without calling platform APIs?), and should thus be more likely to use deno, and thus enter the funnel for deno deploy.

Personally, I just use deno's rust v8 wrappers a bunch, since they make embedding a V8 runtime into a rust app very simple, and js is a very nice scripting engine (especially with optional types). A hugely valuable contribution to the open source community. But then again, I don't deploy serverless functions on the regular. To each their own.

chrisldgk 24 days ago [-]
I believe you hit the nail on the head there.

That, together with the fact that you can still host the deno runtime on your own hardware actually makes it a pretty viable alternative to new projects that you would be building using NodeJS otherwise, with the added bonus that if you ever decide to go serverless, you don‘t have to rewrite your code since you can take the same codebase and move it to deno deploy (or supabase functions, which just uses deno under the hood itself)

chrisweekly 24 days ago [-]
FTR a large and rapidly-increasing percentage of the JS ecosystem supports targeting multiple runtimes. Things like Remix.run, Fastify, Hono, etc. can deploy to Node.js, Deno, Cloudflare workers, etc.
pjmlp 24 days ago [-]
We also use Nexus for .NET, C++, nodejs stuff.

Decentralized approach mostly matters for having a curated repo of what teams are actually allowed to use on their projects, instead of having legal surprised by random devs downloading the Internet into their projects.

vmfunction 24 days ago [-]
>Gotta admit, I'd like to see a more decentralized approach become popular here. There's no specific reason packages always have to be hosted in one or two central registries.

That is what ESM is. Each https link is basically a decentralised registry.

hobofan 24 days ago [-]
That's Deno's interpretation of ESM, but it's nothing inherent to ESM.

The ECMAScript standard doesn't specify the "ModuleSpecifier" further than a string, so how the module specifier is interpreted and loaded may range anywhere from "a handful of well-known modules" to "any retrievable URI".
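Browsers expose one such interpretation via import maps: a page-level table that maps bare module specifiers to URLs before module resolution runs (the CDN URL below is hypothetical):

```html
<script type="importmap">
{
  "imports": {
    "lodash": "https://cdn.example.com/lodash-es@4.17.21/lodash.js"
  }
}
</script>
<script type="module">
  // "lodash" is a bare specifier; the import map decides what it resolves to
  import _ from "lodash";
</script>
```

Node, Deno, and bundlers each plug their own resolution logic into the same ModuleSpecifier slot, which is exactly why the ecosystem's behaviors diverge.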

chrisldgk 24 days ago [-]
While I haven’t tried it myself yet, I remember finding nest.land[1] and thinking decentralization is a beautiful approach to something as centrally important to the JS development world as packages are.

I also remember thinking it's the first and only thing I've ever seen where using the blockchain as the technical basis makes sense and isn't tacked on for grifting purposes.

[1] https://nest.land

fire_lake 24 days ago [-]
Who pays for hosting? The site claims it’s free but I read that as unsustainable
norman784 24 days ago [-]
> What's the business model, I wonder?

I think this video[0] answers that question, overall I think the video is interesting if you have time to watch it entirely.

[0] https://youtu.be/dHfZiqVWVhk?si=RqEmPmizm7Rs_91V&t=2779

eviks 24 days ago [-]
Why isn't "just one build script/config you don't need to update" not a specific reason for a central registry?
chrisldgk 24 days ago [-]
Though I‘m not sure if it will get adopted by as many people as NodeJS (and other kind-of drop-in replacements like Bun) did, I‘m really excited by the innovation that Deno brings to the JS/TS space. Particularly actually looking at the hassles that working with NodeJS currently brings and trying to find elegant solutions to that is an admirable goal. If Deno doesn‘t get adopted by many people, maybe at least some of these ideas can later find their way into other runtimes.

Anyone who has tried to publish hybrid packages that include types plus both a CJS and an ESM version, all while maintaining semver, knows it can be a real hassle. Everyone seems to have a different solution, and most of the time you end up writing a convoluted build system for your package consisting of an amalgamation of tsc, esbuild, rollup or whatever other bundler is the hot new stuff.
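A minimal sketch of the package.json "exports" dance for such a hybrid package (name and file paths illustrative):

```json
{
  "name": "my-lib",
  "version": "1.0.0",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```

Node picks the "import" or "require" entry depending on how the consumer loads the package, and TypeScript reads "types" (which must come first among the conditions); getting all three artifacts built and kept in sync is the part that usually demands the convoluted tooling.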

pjmlp 24 days ago [-]
Based on similar experience from other language ecosystems since Perl introduced CPAN, unless a customer requires me to use JSR, I am not really convinced to stay away from npm repos.
eqvinox 24 days ago [-]
Sure, it's not a "Package Manager" by JavaScript nomenclature, where the package manager apparently refers only to the client-side piece. Instead there's a "Package Manager" plus a "Package Registry"… a pair most other environments just call a "package manager".

As a non-JavaScript developer, drawing this distinction feels incredibly silly to me…

chaorace 24 days ago [-]
Well... go ahead and try installing JSR on your local machine. You can't, because JSR is a website. Let's do the inverse and try installing a package from pnpm.io! You can't, because the only thing available for download on that website is pnpm itself.

pnpm lets you download packages to your computer -- a "manager", if you will. JSR gives you a centralized place to find packages at -- a "registry", if you will. We use different words because these two things are not interchangeable and it's not just a JS thing.

chrisweekly 24 days ago [-]
They are completely separate concerns! Why would they necessarily be tightly coupled? Assuming use of an npm package registry (the default public one at npmjs.org, or a private one like jfrog artifactory), there's nothing "silly" about using a pnpm client instead of npm (or yarn).
mort96 24 days ago [-]
I mean the exact same distinction is drawn in the Linux space: dpkg/apt and pacman and dnf and opkg are package managers, while Debian's repos and Ubuntu's repos and Arch Linux's repos and Alpine's repos and Fedora's repos and Red Hat's repos are "package registries". It's a natural distinction to draw IMO, even if some tools have a hard-coded "package registry" URL.
eqvinox 24 days ago [-]
There's one primary tool¹ to access these "package registries" (repositories) though, and it's a 1:n relationship with the tool being generally able to use more than one repository. JavaScript seems to be doing n:1 instead with a whole bunch of frontends to npm.js? And the "special" thing about JSR is that it additionally(?) has its own repo?

Put differently — the tool is the standard, and you have a choice of repos. In JS it's the other way around?

I would describe the JS situation as… hmm… "oddly tilted" in comparison to other ecosystems; the closest thing I can think of is Python's package management situation?

[¹] I'm not counting GUI frontends here, since I'm relatively sure they just call into the same tool.

Macha 23 days ago [-]
apt-get, the modern apt command, aptitude...

yum, dnf...

The package registries also have multiple. Sticking with apt repos, Ubuntu has one per release, Debian has one per release, Mint has one per release...

So it's an n:n situation.

chrisweekly 24 days ago [-]
> "relatively sure" Sorry, but you're flat-out incorrect. See e.g. https://pnpm.io/motivation

See also my response to your earlier comment; in the npm ecosystem you can choose your package registry, and your package manager/client, not to mention your target runtime. Coupling any of these distinct concepts is not the norm.

eqvinox 23 days ago [-]
> Sorry, but you're flat-out incorrect.

I was talking about the non-JS world, i.e. tools like aptitude and Ubuntu's GUI package manager.

crabmusket 24 days ago [-]
Rust's Cargo and crates.io are separate, just as another example.
thenerdhead 24 days ago [-]
I like how they encourage best practices using a score similar to pub.dev & what npm tried to do years ago. Also the compatibility portion is quite interesting given the evolution of runtimes.

I suppose I don't quite understand the rationale of another registry. But perhaps that is what is needed nowadays with various ecosystems and their chosen defaults. What for example prevents npm from adopting ES modules tomorrow? (Although there are many opinions on CommonJS/ES Modules)

posix_monad 24 days ago [-]
TypeScript is not a good foundation to be building on. The language has very complex semantics and the only "spec" is the main implementation, which is also evolving.
postepowanieadm 24 days ago [-]
And has some really odd behaviors like https://github.com/microsoft/TypeScript/issues/17053
c-hendricks 24 days ago [-]
Are you saying typescript should parse regular expressions?

Turn on `noUncheckedIndexedAccess` and it works as you expect (plus it forces you to check all the places where you're doing unchecked access).

https://www.typescriptlang.org/play/?noUncheckedIndexedAcces...
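A sketch of what the flag changes (illustrative date-parsing example, not the exact code from the linked issue): with `noUncheckedIndexedAccess` enabled in tsconfig, indexed access like `m[1]` is typed `string | undefined`, so the compiler forces the runtime checks below instead of assuming the capture group exists.

```typescript
// Capture groups from a regex match are only known to exist at runtime.
const m = "2024-05-01".match(/^(\d+)-(\d+)-(\d+)$/);

let year = "unknown";
// Under noUncheckedIndexedAccess, m[1] is string | undefined,
// so TypeScript requires narrowing before use.
if (m && m[1] !== undefined) {
  year = m[1]; // narrowed to string
}
console.log(year); // prints "2024"
```

Without the flag, `m[1]` is just `string`, and a failed match surfaces as a runtime `undefined` instead of a compile error.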

swift 24 days ago [-]
Regular expressions are part of the language, so it's not so unreasonable that TypeScript should parse them and take their semantics into account. Indeed, TypeScript 5.5 will include new support for syntax checking of regular expressions[1], and presumably they'll eventually be able to solve the problem the GP highlighted on top of those foundations.

[1]: https://github.com/microsoft/TypeScript/pull/55600

c-hendricks 23 days ago [-]
That's crazy, in the best way
mort96 24 days ago [-]
Yet it's the best we have at the moment in terms of statically typed JS (other than JSDoc which encodes the type system in comments, which .. has its own issues).
crabmusket 24 days ago [-]
Does JSDoc even have a type checking system? The point of TS is that it actually does something with your declared types. JSDoc is nice... for docs being interpreted by a human. But unless those are used to create red squigglies, it's kind of arbitrary.

I've used JSDoc with TS's type syntax and typechecker in JS files. Which is fine - I find it subjectively uglier but I know some prefer it. But it's still TS that's doing the actual work of determining if all my annotations are compatible when the rubber hits the road.

mbarneyme 23 days ago [-]
JSDoc provides the exact same level of type safety as TypeScript, because the TypeScript Language Server/CLI themselves are what does the type checking. Using JSDoc removes the need to transpile your code before executing it. You still want to type-check your code (with a `tsc --noEmit` or similar) in CI, just like you'd run unit tests/a linter/etc.
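A minimal sketch of the workflow being described: plain JS with JSDoc annotations, which the TypeScript checker validates (e.g. via `tsc --allowJs --checkJs --noEmit`, or `// @ts-check` in the editor) without any transpile step.

```javascript
// @ts-check
// The annotations below are read by the TypeScript checker;
// at runtime this is ordinary JavaScript.

/**
 * @param {number} a
 * @param {number} b
 * @returns {number}
 */
function add(a, b) {
  return a + b;
}

console.log(add(2, 3)); // prints 5
// add("2", 3) would be flagged by tsc, but runs fine as plain JS.
```

The type errors you get are the same ones `tsc` would emit for equivalent `.ts` code; only the annotation syntax differs.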
crabmusket 22 days ago [-]
Yes, that's my point. JSDoc isn't an alternative to TypeScript if you want your types to actually be checked.
dinckelman 24 days ago [-]
My motivation to host in this registry completely vanished the moment I realized that any Zod types are considered "slow types" and, as a result, will not get any type definitions generated on JSR.

On a surface level, they have automatic mechanisms for everything my projects already have implemented, except with arbitrary limitations and not a whole lot of material explaining them properly.

nailer 24 days ago [-]
> higher scores are awarded to packages that include comprehensive JSDoc documentation on each exported symbol

Nope. I love Ryan Dahl, but code filled with JSdoc comments that needlessly replicate type information already in the code, and mandatory descriptions for every variable (often used as a substitute for proper naming) has a really low signal:noise ratio.

hamilyon2 24 days ago [-]
Well of course. JSR is a Java Specification Request.
thriftwy 24 days ago [-]
I thought that JSR is an RFC for Java - can we please be more creative than reusing TLAs?

Call it Genet.

Traubenfuchs 24 days ago [-]
The same kids who never used it also dared to repurpose JAX from JAX-WS.