I used to use yarn all the time until I tried pnpm. The speed is amazing, and as a traveler I love that it works offline by default. I’d be curious to see some benchmarks between pnpm and yarn 3, because pnpm has been the fastest of the bunch for me. It also gets bonus points for pnpx, workspaces, and essentially being a drop-in replacement for npm.
We maintain a set of automated benchmarks against Yarn 1/2+ (PnP and node_modules tracked separately), pnpm, and npm:
https://yarnpkg.com/benchmarks
The fastest are usually either Yarn PnP or pnpm. Note however that regardless of the package manager, it's quite clear that performance is reaching a plateau in the common cases. I personally believe better things to consider are feature set, stability, documentation, and general codebase health (since it impacts how fast features and bugfixes ship, and how dangerous upgrades may be). But those are quite a bit harder to measure, of course!
Good benchmarks! Looks like pnpm is still quite a bit faster in most cases though. It does look like performance has reached a plateau, but a different plateau for different tools, so surely it still makes sense to decide based on performance?
I've had a wonderful time with PNPM ever since switching for a couple of projects - installs are near instant, and my hard drive is no longer stuffed with a million installs of the same package in different places.
I always thought it was weird that npm didn't use a central store on your hard drive for your packages and instead just duplicated them everywhere, and I always thought it was even weirder that yarn didn't change this behaviour, but maybe I'm missing something.
I think having a simple global cache is fine, but otherwise I much prefer NPM's approach, where all dependencies are contained in the project directory, to Python's, where everything is tangled up in your global system environment.
That's not quite true: PnP has always been the default, and we have no plans to change that. The only thing that changed is that when you migrate from v1, we automatically enable the node-modules linker.
The reason for that is that PnP tends to require recent versions of some of your dependencies, so migrating to it can be somewhat involved, and it's a better experience to let you keep using the install strategy you're used to until you decide to change it yourself.
New projects, on the other hand, already use the latest versions of their dependencies, so they're a much better starting point for PnP.
Sorry for the long delay on replying to this - what packages depend on files being there? I've never had a single issue with pnpm's approach to package management, minus some poor support from github.
Please, if you're downvoting this, share why. The replies here are full of valid anecdotal evidence for every package manager being mentioned, and they're all valid. If you feel this is invalid or poor content to be sharing, please state why so conversation can be had.
I agree - PNPM supports workspaces, is incredibly fast, and is space-efficient (it more or less builds node_modules out of symlinks back to somewhere central on your local machine). It also structures the node_modules folder correctly, unlike NPM and Yarn, making it very difficult to import a library that you haven't explicitly declared as a dependency.
pnpm is really good. The workspace implementation is the only one that feels (mostly) intuitive and also comes with certain functionality usually associated with tools like Lerna. It also doesn't seem to break certain expectations that packages make in regard to node_modules and the like. While there are a few things that could be improved, I think it's definitely worth a try and provides a good compromise between NPM and Yarn 2+.
I'd be interested to know what HN's experience with Yarn 2 has been. Yarn 1 was compatible with NPM, so switching was easy, but Yarn 2 seems to need explicit support in a lot of cases.
Has anyone found significant benefits from switching to yarn 2? And conversely, has anyone had any issues with this?
EDIT: It seems that there is also a 3rd option: pnpm. Experiences with pnpm would also be welcomed.
We have a monorepo with 200 packages. NPM install times were longer than 1 hour. Unusable. Yarn v2 took this down to 20 seconds. Yes, around 200x faster. They just are not comparable on that aspect.
The issues we had are with some packages that don't list their peerDependencies correctly, or which assume the file structure of node_modules will be the one made by npm. But yarn has improved and these obstacles are much easier to deal with now.
I don't understand why npm just can't improve perf. Seems fixable. In my smaller personal projects without as many dependencies, I just use npm and it's easier.
NPM has gotten faster over the last few years. When Yarn first came out it was dramatically faster than NPM. NPM 5 then caught up with Yarn 1 on performance, and NPM 7 is often faster than Yarn 1. NPM has not caught up with Yarn 2, but Yarn 2 had to break a lot more things than NPM is willing to break.
I think part of any (perceived?) performance advantage comes from not being as safe as npm in terms of dependency resolution and lock file preciseness. I seem to get very little out of Yarn, considering its downsides, while npm on the other hand is the standard package manager and simple to use.
Yarn is absolutely safer than npm in terms of determinism and lockfile preciseness. I think it was actually one of the main goals of yarn v2 to be fully reproducible and deterministic.
After performance, the other main reason we moved from npm to yarn was exactly that it completely removed an entire class of bugs we had that were caused by incompatible package versions installed in various places of the repo.
In exchange for that, yarn will warn if it encounters packages that don't declare their deps and peerDeps correctly. And there are loads of those. If you want to be safe and remove these warnings, you will have to either ask the package authors to declare their deps correctly, or fix them yourself manually in your .yarnrc.yml file.
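For reference, those manual fixes go under packageExtensions in .yarnrc.yml. A minimal sketch, with hypothetical package names (the exact syntax is in the Yarn docs):

    packageExtensions:
      "some-broken-package@*":
        peerDependencies:
          react: "*"
        dependencies:
          lodash: "*"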
Full disclosure, I don't have any experience with yarn aside from personal experiments, so I can't compare yarn with npm fairly. But is it possible that another package manager is even less safe than npm regarding dependency resolution?
Ask your team the question: what's the difference between npm update, npm install, npm update --workspaces, npm install --workspaces, and how the outcome differs between npm versions.
Bonus points for knowing in which scenario package-lock.json is taken into account.
If yarn is even worse, my last hope is gone. Please don't take that post too seriously :)
Right? I'm a web developer and work a lot with PHP.
PHP's de facto package manager is Composer and it's very simple and clear in how it works:
Your composer.json states your dependencies and their version constraints.
Your composer.lock (also a json file) states the actual versions that should be installed, based off your composer.json.
"composer install" installs the exact versions from your composer.lock file, "composer update [package]" updates the lock file based on your constraints.
With npm this doesn't seem to be as straightforward: sometimes I run "npm install" and the package-lock.json ends up changing. I definitely don't consider npm to be safe.
This is why you should use yarn v2 or v3. Dependency clusterfucks are not a thing with yarn. We went from having a lot of weird bugs like that every month to having zero, for 2 years, thanks to yarn. The counterpart is that yarn sometimes needs your input when package maintainers don't declare their deps or peerDeps correctly.
Is that due to yarn v2 including a zipped version of all the dependencies that you can check into the repo?
Because that is what I plan to do. Such a pain when some random dependency 50 packages deep is broken or even pulled from npm and so we can't even finish a deployment build until fixed. Especially for older projects.
It is first and foremost due to the “mathematically correct” (or at least “more correct”) resolution algorithm used in yarn. And it is further improved by the local caching of packages, indeed. Note that you can even be in “zero install” mode with yarn, where you check this dependency cache into git, so you never have to “yarn install”: a git checkout is enough to get everything. If npmjs.com goes down you wouldn’t be in any trouble.
Composer is not that much better though. It has (had?) the brain-dead convention that you may only refer to branches by prepending "dev" or "dev-", which can itself collide with a branch name or an organizational convention, and thus the convention prescribed by Composer gets in the way unnecessarily of whatever naming conventions an organization may already have.
So I am not sure it is better than npm. However, at least with Composer one does not have that split between competing package managers, which additionally creates friction within and between teams and projects.
> I run "npm install" and the package-lock.json ends up changing
This is intended behaviour, but seems totally counterintuitive to me. I’m with you, I’m used to Composer and NPM just seems inscrutable at times.
For reference, the approximate equivalent to `composer install` is `npm ci`. This will install the exact versions from package-lock.json without changing it; however, it will also blow away your node_modules directory and install from scratch each time.
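In other words, roughly:

    # Reproducible install straight from package-lock.json
    # (wipes node_modules first, never modifies the lock file)
    npm ci

    # Resolve within the ranges in package.json and update package-lock.json if needed
    npm install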
My experience is that yarn is safer in terms of locking as well: I had a lot of sub-package versions suddenly changing with npm, causing CI tests to fail, whereas with yarn it's very rare.
You've got it backwards. npm has historically been much less deterministic than yarn, and recently decided auto-installing peerDependencies was a good idea...
That is so appealing... I don't have a monorepo with 200 packages, but I do `yarn install` in a Dockerfile, and if something changes that invalidates that layer's cache, the whole thing can take 10 minutes or more to download, unpack, and install. Would love to try Yarn 2 (3, now, I guess) when I can.
Does Rome use yarn? Anyone using Rome in production?
If you are on a Mac, this is a known Docker issue, because on non-Linux systems fs access is virtualized (Docker runs inside a Linux VM). It becomes pretty unusable.
When doing dev you can specify the workspace command and cache any node_modules etc. in local volumes, which greatly increases local development speed when changing deps.
E.g. no Dockerfile in dev unless you need custom system dependencies; otherwise, the Alpine Node image in Docker Compose for dev and a Dockerfile in prod.
NPM 7 is fantastic, and feels like a worthy successor to Yarn 1. Yarn >=2 is probably fine for what it is, but I wasn't prepared to invest resources into rearchitecting how we handle dependencies for nebulous (or possibly negative) benefit. Meanwhile, the old release of Yarn 1.x that I'd pinned in order to work around a blocker regression (v1.21.1) was starting to show its age, and eventually became unusable after a third-party package update triggered another hidden blocker bug in that release (which, IIRC, the team had elected not to fix because only 1.x was affected). At that point, it was untenable for us to continue relying on a broken, unmaintained tool for such a critical function.
Luckily, the stable release of NPM 7 arrived just in time for us to switch over with minimal hassle. Performance and reliability have been notably better in my experience (keeping in mind that my baseline for comparison is a release of Yarn from two years ago).
I'm grateful to the Yarn project for pushing NPM in the right direction, and wish it and its remaining users all the success in the world, but I no longer see an urgent need in the ecosystem for Yarn to exist and would no longer recommend it by default for any greenfield project.
My team upgraded our frontend monorepo from npm+lerna to Yarn 2 with PnP. We saw vastly improved installation time, most importantly on incremental installations. The startup times for jest, webpack, and TypeScript also improved. PnP's dependency strictness eliminated cases where updating a package in one workspace would break a different workspace.
It was a fairly difficult migration, in particular because Yarn 2 PnP does not allow you to import a module that is not specified in package.json. Lots of libraries play fast and loose with dependencies, and so our .yarnrc.yml file which adds those missing dependencies is now 172 lines. PnP requires configuration for vscode and a custom build of TypeScript, which isn't ideal, but those have worked well.
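For anyone attempting the same setup, the VS Code part boils down to generating the editor SDKs; the command below is what the Yarn docs recommended when we migrated, so double-check the current docs:

    # Generate the VS Code SDK shims so tsserver can resolve PnP'd packages
    yarn dlx @yarnpkg/sdks vscode
    # then pick "Use Workspace Version" for TypeScript inside VS Code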
I run ~30 JS repos, ~10 of them monorepos. Since I migrated everything to yarn 2 it's been a breeze: my other contributors have been able to completely forget about it (as it just works), and I've been able to assert that everything that works today will work in six months and in two years.
That is because I have not gone against the default of zero-install (0-config) repos, that is, saving the .yarn/cache folder (the equivalent of node_modules) to git (via git-lfs). The lockfiles work. Dependencies can have conflicting dependent packages. I've had a small number of packages have issues with the zip format, which is solved by marking them as unplugged (very well documented).
A user clones the repo and they are done. Yarn itself is saved into the repo, so the only external dependency is nodejs.
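The .gitignore for that setup is roughly the one from the zero-install docs (check the current docs for the exact list):

    # commit the offline cache and the Yarn release itself, ignore the rest of .yarn
    .yarn/*
    !.yarn/cache
    !.yarn/patches
    !.yarn/plugins
    !.yarn/releases
    !.yarn/sdks
    !.yarn/versions
    # note: the .pnp.* files are committed too in zero-install mode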
Things I haven't thought about or done since switching to yarn 2:
* I don't worry that the server is running something different to the dev machine
* I don't get breakage because I switched branches
* I don't have to teach other people how to look after their packages (because it 'just works')
* I don't have to chase people to update when a lib we use has a vuln
* I don't spend time installing
* I don't care about package compatibility
Things I have had to worry about:
* build systems. When you use webpack, esbuild, etc. then they need a little help resolving packages
* in the beginning it was hard to step into dependencies. That is all fixed now.
* Sometimes a package is broken because they don't specify peerDependencies properly (which is a security & reliability issue), so you have to add to a config file (surprising first time, easy from then on).
- yes, you can have the same lockfile and run it behind different proxies with the same result now! very strong for corporate proxy situations with external dev teams
- pnp
- superbly fast package install, gets faster as you use it more
Yarn 2 also has an upfront cost
- all new config (yml)
- also renamed pretty much every property
- docs are decent and mitigate some of this
- doesn't come for free with node.js
- conflicts with npm with regards to commands
- interacts awkwardly with npm config, leading to some situations where yarn installs fail due to npm config being partially and implicitly merged
Overall it's pretty good. Not really sure it's worth it unless install performance or multiple-proxy situations are present at your workplace though.
I manage several monorepos with PNPM's workspaces. One of which is an open source repo for a popular bundler which accumulates 100s of millions of weekly downloads. Another is internal and private that contains an entire organization's codebase. PNPM is an absolute joy to work with. If you haven't tried it, make a point to.
FWIW I tried pnpm 8 months ago and had problems with a lot of tools/packages. Next.js (or some Next-related project) did not work well with it, and many others didn't either. Went back to yarn v2.
You’re right, it was probably with our react native package, and maybe also with our electron package. And also with Strapi. And maybe others, I will never know.
That's one thing that is unfortunate about pnpm - the methodology is superior in many ways, but there are a lot of tooling packages that don't consider symlinks for dependency resolution because npm and yarn are so prevalent.
I had issues with Yarn 1 concurrently managing its own cache when I had packages with dependencies on private git repos that could be referenced using multiple ways of writing their URLs. In such cases one had to manually make sure that the order of installed packages was exactly right, otherwise the cache got into trouble and errors would appear. Afaik that issue never got resolved, except perhaps by accident in a newer version of Yarn, with no explicit treatment or admission of the issue.
To me it was outrageous that a tool used by so many could have such flaws. I stayed with npm, which also has the advantage of storing a more precise lock file than Yarn, and I never had the same kind of issues as I had with Yarn. It has always seemed more standard and safer than Yarn, so I never bothered with upgrading Yarn.
I often need to work with a project, which has a big monorepo and chose to bundle its version of Yarn with that. Now it seems that there will forever be a Yarn version inside that project.
I have not noticed that much of a speed difference between npm and Yarn once things are cached and I value safety and correctness more than speed anyway, so that has not been a reason for me to try and make the switch either.
I've found two major benefits with yarn. First is package install time, like others have said it's an order of magnitude faster for projects I've worked on.
Second is a little more niche: running a Linux image on a Windows host via Vagrant/VirtualBox. Out of the box symlinks won't work, so bin files often fail. I would also often run into file locks when running npm install on larger projects, to the point where I was having to run it 5-10x before everything would finish installing. You can instruct VirtualBox (via Vagrant) to offer some support (`VBoxManage setextradata <vm-name> VBoxInternal2/SharedFoldersEnableSymlinksCreate/v-root 1`), or set an environment variable to key on and update any scripts to use --no-bin-links, but it's flaky in my experience. I never found a solution to the file locks with npm.
I installed yarn on a whim on that setup and have yet to have any of those issues with it.
Even for a smaller project on yarn v1, our installs were like 3 minutes. With yarn v2 it is basically instant - saves so much time!
I can also add that the monorepo setup really makes it easy to manage project with multiple packages and interdependencies. You can have a look at our config for this here: https://github.com/lowdefy/lowdefy
It's basically not supported in Ember yet, so.... Not at all. My fingers are crossed for pnpm support soon so I can be done with the sadness pile that has been yarn as of late (e.g. linking just flat out doesn't work for us with Ember addons).
My experience is that Yarn 2 works great as long as you stick with the `node_modules` linker. It's the "Plug 'n Play" functionality that is great in theory but still rough in practice.
The single decision from the Yarn team to not offer a reasonable transition path from v1 to v2 was a total blocker for my teams.
After trying it on selected projects, we collectively decided that it was not worth it to follow a project that did not take its users' time seriously, since the migration was so hard.
I was the main developer back when the 2.0 was started (so, like, rc.1), but since then we grew and are now a team, with at least four very active contributors of similar expertise.
I believe this is the greatest accomplishment of the v2.
Yarn was great and all, but it’s not npm so you have to convince the whole team to use it.
I can’t believe npm still sucks after 10 years. A week doesn’t go by where I don’t have to nuke node_modules and occasionally the lockfile as well. It’s junk as far as I’m concerned, but I just happen to know what the problem is every time.
I hear stories like this occasionally. I've used node heavily at every day job for about 9 years. Needing to blow away node modules is so rare. Even on large teams with nested package.json files. With npm link and library development. Once npm added the lock file instead of shrinkwrap, it became a total non issue.
I am genuinely curious: what are the specific issues that necessitate deleting all modules? Colleagues across operating systems with native modules? Old node versions?
Take React.js's "Add React to a Website" article[1] for example. The example of adding JSX to an existing website[2]... breaks! And you won't exactly know when until it does and you have to delete your package-lock.json file. It should work the first time, but once you commit those package versions, running `npx babel [...]` may break depending on what is in your package-lock.json file, and when.
Why? Because installing the dependencies they specify and running babel (implicitly at the latest version) does not have reproducible results. It's that simple. package-lock.json is specifically supposed to help with this, but in some cases, it just doesn't, and it's entirely possible, and common(!) to have two sets of package.json files produce differing package-lock.json files.
They even have a warning below about an error that can still occur even if you already have the dependencies specified above installed:
> If you see an error message saying “You have mistakenly installed the babel package”, you might have missed the previous step. Perform it in the same folder, and then try again.
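For context, the commands in that article are roughly the following (from memory, so treat the exact versions as approximate):

    npm init -y
    npm install babel-cli@6 babel-preset-react-app@3
    npx babel --watch src --out-dir . --presets react-app/prod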
I don’t think in general I’ve ever had a hard time selling it. The issue is that the decision is pretty much all or nothing; everyone has to be using it as well. Getting a number of people to switch is always difficult.
Note that we don't recommend using both Yarn and npm: both have different feature sets, heuristics, and implementation details, and as a result your colleagues and you may have slightly different behaviors in development (on top of desync'd dependency trees).
Just like no one would think of letting their developers choose whether they want to bundle their code using either Webpack or Rollup, the package manager should really be enforced at the project level, whether you choose Yarn or npm.
I understand that there are probably some weird edge cases where this could cause an issue
But I've been doing this for years and so far haven't run into it.
I really don't have the mental energy to fight with people over package managers -- it's just not worth it.
> your colleagues and you may have slightly different behaviors in development (on top of desync'd dependency trees).
Yarn and NPM both take a "package.json" and install the dependencies so that you can import them though.
If I have "express" in my package.json and do "npm install" or "yarn install" -- the functional outcome is (and always should be) the same
Unless you're using some package-manager specific behavior so that it only works properly or relies on particularities from either yarn/npm/pnpm whatnot, I'm not sure I understand how this could cause problems
But also, you're the lead maintainer of Yarn and I'm just some schmuck who's been on the consuming end for the last many years. I reckon you've got a fair bit more of a clue here than I do.
yarn 2 solved real problems with zero-installs and advanced PnP. Maybe the only problem was that it was released too early and wasn't mature enough when it came out. Now it's just better than v1 and npm, and works particularly well on large monorepos, where upgrading a package can often break other modules due to how node_modules hoisting works.
The problem with Yarn 2 is that it wasn’t Yarn. You can’t change the whole thing and keep the same name. They thought people would just keep using Yarn out of inertia. That’s not how it works.
Eh. They're allowed to do that as per a literal definition of semver, but they turned it into a completely different tool with completely different usage patterns and use cases. It's one thing to have to make some small tweaks to handle an isolated breaking change in a dependency. It's another matter entirely to have a perfectly good core part of your stack deprecated out of the blue and told that you need to rewrite every line of code that it touches.
I see this as analogous to the Angular 2 situation, except that Google actually did a good job maintaining Angular 1 (retroactively named "Angular.js") for a number of years afterwards and providing a solid migration path. Everyone who had staked their projects and businesses on the future of Angular 1 was understandably annoyed.
All that being said, while I have problems with specifics of their approach, I actually think Yarn made the right call on this. After NPM caught up with v7, it became a bit of a wasted effort to have two redundant projects that were almost drop-in replacements for one another. Yarn staking out a different path at least justifies its continued existence.
What I think could have been better is if they'd put an explicit acknowledgement in the migration docs that Yarn 2 wasn't going to be a good fit for all users of Yarn 1, and a recommendation of NPM 7 as an alternative successor to Yarn 1 for such users. An even nicer gesture would have been if they'd written an alternate migration doc for Yarn 1 -> NPM 7.
I tried updating my team's frontend pipelines from Yarn 1 to Yarn 2, since there seemed to be some significant performance improvements.
After spending some time trying to understand why I needed a third-party plugin to run the equivalent of `yarn install --production`, I just said fuck it and I'll let the frontend team figure this shit out.
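(For anyone curious, the plugin route I balked at is roughly this; I just didn't feel it should be necessary:)

    # add the workspace-tools plugin to the repo
    yarn plugin import workspace-tools
    # rough equivalent of the old `yarn install --production`
    yarn workspaces focus --all --production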
So yeah, I wouldn't be surprised if the Yarn 1 usage is significantly higher than Yarn 2
The only project of mine where I use Yarn 2 was one where arcanis, the yarn2 maintainer, added it himself. Otherwise I use yarn@1, which requires significantly fewer brain cells than following yarn2 opinions/changes/plugins/installs.
We also remained on Yarn 1 because developing packages using the workspaces feature is really nice.
We did make an attempt at moving to Yarn 2; it did not work out well. At that time (probably 12 months ago or more) lots of upstream packages had broken package.json files, and we never managed to fix them all. Lots of things with React were a bit screwy too!
One of the biggest things holding me back from transitioning my open source projects over to Yarn and PnP is Dependabot support. I'd even settle for support with the `node-modules` plugin at this point, but GitHub is of the opinion that there isn't much community usage of yarn v2 to warrant the additional development required to support yarn v2+.
I'm really glad for all the forward thinking that projects like yarn and pnpm are bringing to the node package management space, and I'm also glad to see that npm is listening and taking those advancements into consideration. I'm currently exploring npm v7's workspaces feature, which was something I originally wanted to use yarn for.
So did the dawn of Yarn make package management for Node.js better for everyone, or do we now just have another package manager to choose from and take into account when we publish packages?
Take a look at many of the other comments. My view as a regular user of JS tooling is that yarn provided a better option and also forced npm to become better in a hurry. The current level of package management fragmentation seems good for everyone.
My team is still using yarn v1 because we want both dependabot support (rules out yarn v2+ and pnpm) and support for overriding transitive dependency versions to force security fixes (eg, yarn resolutions). We would love to explore other options but right now yarn v1 seems to be the only game in town that meets those requirements.
Once npm implements their recently-accepted overrides RFC, we're eager to try switching to that.
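For anyone unfamiliar, the resolutions override lives in the root package.json and looks roughly like this (package and version are just an illustration):

    {
      "resolutions": {
        "minimist": "^1.2.6"
      }
    }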
For me yarn 2 was a clusterfuck. I thought dependency resolution would be better, but nah. Then it also polluted my workspace with dozens of JSON files. And there was the issue of packages not being found when using ESM via node.
Does the combination of Yarn, VSCode and TypeScript work now?
I've wasted time twice now trying to move projects to Yarn 2 just to have the tsserver LSP break every time, being unable to find the packages despite following the documentation to the letter.
We use Yarn / VSCode / TypeScript (both to maintain Yarn and at work), so we are very invested in this use case. If something didn't work, we'd notice it quickly and the fix wouldn't take long!
Well I just tried to make a repro by grabbing a random repo I had, but I get "Invalid authentication (as an anonymous user)" when trying to get a package from a GitHub registry, after configuring my scope with `yarn config set npmScopes.MYSCOPE.npmRegistryServer "https://npm.pkg.github.com"` and successfully logging in with `yarn npm login --scope MYSCOPE`.
Every single time I've tried Yarn 2 I end up frustrated.
We don't do support on HN, but if you really don't find information in our repository issues I'm sure our Discord can help. We publish all the Yarn packages to both the npm and GitHub package registries, so I'm certain it works fine.
This is exactly what turned me off Yarn 2. I think it was a language server issue, but if it's broken, it's broken. It sent me back to NPM, and I rarely use Yarn these days at all.
What are the advantages of using Yarn over NPM? It just feels to me that npm does everything that is needed and is available in stuff like ansible-node docker containers. Why should someone who is used to npm check out Yarn?
So... What exactly is this version, how is it different from yarn 2 that it warrants a major version update (not that I even know what yarn 2 was about), and how does it compare to npm?
Forced peerDependency installations in v7 are a deal breaker for me. And as a package author, it's one of the more painful changes to NPM in recent times. Lest we forget the debacle between prepare, install, prepublish and prepublishOnly that we're still dealing with, and which the other package managers also have to contend with.
We support it once you enable the `node-modules` linker (cf our documentation). It's a little slower and doesn't leverage some of the stability improvements brought by PnP (in short, you get a good old node_modules folder without much fanciness), but you still get to benefit from all the other features and bugfixes we made since the 1.x.
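For reference, it's a single setting in your .yarnrc.yml:

    nodeLinker: node-modules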
One of the reasons I really like Yarn 3 is the really good support for linking local dependencies. The whole “@internal/mypackage”: “workspace:*” thing.
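Roughly, in a consuming workspace's package.json (names hypothetical):

    {
      "name": "@internal/myapp",
      "dependencies": {
        "@internal/mypackage": "workspace:*"
      }
    }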
That, and the plugin system.
Because it meant I could build on top of yarn a build/test system much like Bazel, Buck and Pants. But without the overhead of those tools.
Honestly, being able to build functionality into tooling like this is really awesome. Especially when it’s aware of the dependency graph.
I often have polyglot repos, with a heap of typescript, and some golang, and other stuff.
I spent a couple of weeks last year hacking away on a plugin that can look at the whole dependency tree defined in yarn and derive a build graph that can build everything in the required order, taking advantage of every CPU thread available, while keeping track of what’s previously been built.
From that, I extended the support to testing to do the same thing.
I wrapped it all up into a nice package and published it to:
Then it’s just `yarn build` which runs `package.json#scripts.build`, and `yarn test` which runs `package.json#scripts.test`.
—
Since then, there’s been plenty of development. I added a bundle command. You tell it which local package to bundle up, and it copies your repo to a temporary folder, removes everything that package doesn’t need (useful in a giant monorepo), leaves the other packages it depends on. Finally it adds a file `entrypoint.js` which sets up pnp for you, and re-exports the `main` file from your target package.
Then it zips it up for you ready for AWS Lambda (my use case), but also Docker and any other node runtime.
And recently I hacked another interesting plugin that yarn and pnp enabled. A really neat plugin that lets you write package.yaml instead of package.json.
If you replace package.json with package.yaml (or .yml), it will appear to yarn as a normal package.json. Everything works fine (with yarn, unsure about some node tooling), and you even get comments.
It’s a completely separate plugin (because it’s pretty out there), and it only has an effect if you don’t have a package.json in your package folder.
I’m using it on the golang packages in monorepos, because those developers are often a bit upset by seeing build tooling in a JSON file.
The package.yaml plugin can be installed with the following command:
yarn plugin import https://yarn.build/yaml
It’s still a bit experimental, but works pretty well.
—
It’s all open source on GitHub; if you have any issues using it, feel free to post a bug report.
Github: "Starting next month, all new source code repositories created on GitHub will be named "main" instead of "master" as part of the company's effort to remove unnecessary references to slavery and replace them with more inclusive terms. Teams need tools to help them collaborate and stay productive while remotely working."