Cargo is the native package manager and build system for Rust, allowing you to easily bring in dependencies from the global crates.io registry,1 or to publish your own crates to crates.io. Tor Hovland and I recently contributed a long-requested feature to Cargo: the ability to publish many interdependent crates in one go. That might not sound like a big deal, but there were a few tricky parts; there’s a reason the original feature request was open for more than 10 years! In this post, I’ll walk you through the feature and — if you’re a Rust developer — tell you how you can try it out.
Workspaces
The Rust unit of packaging — like a gem in Ruby or a module in Go — is called a “crate”, and it’s pretty common for a medium-to-large Rust project to be divided into several of them. This division helps keep code modular and interfaces well-defined, and also allows you to build and test components individually. Cargo supports multi-crate workflows using “workspaces”: a workspace is just a bunch of crates that Cargo handles “together”, sharing a common dependency tree, a common build directory, and so on. A basic workspace might look like this:
.
├── Cargo.toml
├── Cargo.lock
├── taco
│   ├── Cargo.toml
│   └── src
│       ├── lib.rs
│       └── ... more source files
└── tortilla
    ├── Cargo.toml
    └── src
        ├── lib.rs
        └── ... more source files
The top-level Cargo.toml just tells Cargo where the crates in the workspace live.2
# ./Cargo.toml
workspace.members = ["taco", "tortilla"]
The crate-level Cargo.toml files tell us about the crates (surprise!). Here’s taco’s Cargo.toml:
# ./taco/Cargo.toml
[package]
name = "taco"
version = "2.0.0"

[dependencies]
tortilla = { path = "../tortilla", version = "1.3" }
The dependency specification is actually pretty interesting. First, it tells us that the tortilla package is located at ../tortilla (relative to taco). When you’re developing locally, Cargo uses this local path to find the tortilla crate. But when you publish the taco crate for public consumption, Cargo strips out the path = "../tortilla" setting because it’s only meaningful within your local workspace. Instead, the published taco crate will depend on version 1.3 of the published tortilla crate. This doubly-specified dependency gives you the benefits of a monorepo (for example, you get to work on tortilla and taco simultaneously and be sure that they stay compatible) without leaking that local setup to downstream users of your crates.
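To make that concrete, here is roughly what the dependency ends up looking like in the manifest of the published taco package (a sketch; the manifest Cargo actually generates is normalized and carries extra metadata):

```toml
# Sketch of ./taco/Cargo.toml as it appears inside the published package:
# the `path` key has been stripped, leaving only the version requirement.
[dependencies]
tortilla = { version = "1.3" }
```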
If you’ve been hurt by packaging incompatibilities before, the previous paragraph might have raised some red flags: allowing a dependency to come from one of two places could lead to problems if they get out-of-sync. Like, couldn’t you accidentally make a broken package by locally updating both your crates and then only publishing taco? You won’t see the breakage when building locally, but the published taco will be incompatible with the previously published tortilla.
To deal with this issue, Cargo verifies packages before you publish them. When you type cargo publish --package taco, it packages up the taco crate (removing the local ../tortilla dependency), then unpackages the new package in a temporary location and attempts to build it from scratch. This rebuild-from-scratch sees the taco crate exactly as a downstream user would, and so it will catch any incompatibilities between the existing, published tortilla and the about-to-be-published taco.
Cargo’s crate verification is not completely fool-proof because it only checks that the package compiles.3 In practice, I find that checking compilation is already pretty useful, but I also like to run other static checks.
Publish all my crates
Imagine you’ve been working in your workspace, updating your crates in backwards-incompatible ways. Now you want to bump tortilla to version 2.0 and taco to version 3.0 and publish them both. This isn’t too hard:
- Edit tortilla/Cargo.toml to increase the version to 2.0.
- Run cargo publish --package tortilla, and wait for it to appear on crates.io.
- Edit taco/Cargo.toml to increase its version to 3.0, and change its tortilla dependency to 2.0 (see the sketch after this list).
- Run cargo publish --package taco.
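For concreteness, after those edits taco’s manifest would look something like this (a sketch; note that the package version field takes a full three-component semver version, while the dependency requirement can stay short):

```toml
# ./taco/Cargo.toml after the version bumps
[package]
name = "taco"
version = "3.0.0"

[dependencies]
# tortilla/Cargo.toml now declares version = "2.0.0", and we require it here.
tortilla = { path = "../tortilla", version = "2.0" }
```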
The ordering is important here. You can’t publish the new taco before tortilla 2.0 is publicly available: if you try, the verification step will fail.
This multi-crate workflow works, but it has two problems:
- It can get tedious. With two crates it’s manageable, but what about when the dependency graph gets complicated? I worked for a client whose CI had custom Python scripts for checking versions, bumping versions, publishing things in the right order, and so on. It worked, but it wasn’t pretty.4
- It’s non-atomic: if, in the process of verifying and packaging dependent crates, you discover some problems with the dependencies, then you’re out of luck because you’ve already published them. crates.io doesn’t allow deleting packages, so you’ll just have to yank5 the broken packages, increase the version number some more, and start publishing again. This one can’t be solved by scripts or third-party tooling: verifying the dependent crate requires the dependencies to be published.
In mid-2024, my colleague Tor Hovland and I began working on native support for this in Cargo. A few months and dozens of code-review comments later, our initial implementation landed in Cargo 1.83.0. By the way, the Cargo team are super supportive of new contributors — I highly recommend going to their office hours if you’re interested.
How it works
In our implementation, we use a sort of registry “overlay” to verify dependent crates before their dependencies are published. This overlay wraps an upstream registry (like crates.io), allowing us to add local crates to the overlay without actually publishing them upstream. This kind of registry overlay is an interesting topic on its own. The “virtualization” of package sources is an often-requested feature that hasn’t yet been implemented in general because it’s tricky to design without exposing users to dependency confusion attacks: the more flexible you are about where dependencies come from, the easier it is for an attacker to sneak their way into your dependency tree. Our registry overlay passed scrutiny because it’s only available to Cargo internally, and only gets used for workspace-local packages during workspace publishing.
The registry overlay was pretty simple to implement, since it’s just a composition of two existing Cargo features: local registries and abstract sources. A local registry in Cargo is just a registry (like crates.io) that lives on your local disk instead of in the cloud. Cargo has long supported them because they’re useful for offline builds and integration testing. When packaging a workspace we create a temporary, initially-empty local registry for storing the new local packages as we produce them.
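If you’re curious what a local registry looks like from a user’s point of view, Cargo’s existing source-replacement configuration gives a rough idea (a sketch of a .cargo/config.toml; the overlay used during workspace publishing is created internally and doesn’t require any configuration like this):

```toml
# .cargo/config.toml (sketch): serve crates from a registry stored on disk
# instead of from crates.io, e.g. for offline builds.
[source.my-local-registry]
local-registry = "/path/to/local/registry"

[source.crates-io]
replace-with = "my-local-registry"
```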
Our second ingredient is Cargo’s Source trait: since Cargo can pull dependencies from many different kinds of places (crates.io, private registries, git repositories, etc.), it already has a nice abstraction that encapsulates how to query availability, download, and cache packages from all of those places. So our registry overlay is just a new implementation of the Source trait that wraps two other Sources: the upstream registry (like crates.io) that we want to publish to, and the local registry that we put our local packages in. When someone queries our overlay source for a package, we check the local registry first and fall back to the upstream registry.
Now that we have our local registry overlay, the workspace-publishing workflow looks like this:
- Gather all the to-be-published crates and figure out any inter-dependencies. Sort them in a “dependency-compatible” order, meaning that every crate will be processed after all its dependencies.
- In that dependency-compatible order, package and verify each crate. For each crate:
  - Package it up, removing any mention of local path dependencies.
  - Unpackage it in a temporary location and check that it builds. This build step uses the local registry overlay, so the build sees the local dependencies that were previously added to the overlay as if they were really published.
  - “Publish” the crate in the local registry overlay.
- In the dependency-compatible order, actually upload all the crates to crates.io. This is done in parallel as much as possible: for example, if tortilla and carnitas don’t depend on one another but taco depends on them both, then tortilla and carnitas can be uploaded simultaneously.
It’s possible for the final upload to fail (if your network goes down, for example) and for some crates to remain unpublished; in that sense, the new workspace publishing workflow is not truly atomic. But because all of the new crates have already been verified with one another, you can just retry publishing the ones that failed to upload.
How to try it
Cargo, as critical infrastructure for Rust development, is pretty conservative about introducing new features. Multi-package publishing was recently stabilized, but that stabilization hasn’t yet reached a stable release: for now it is only available in nightly builds. If you’re using a recent nightly build of Cargo (1.90.0 or later), running cargo publish in a workspace will work as described in this blog post.
If you don’t want to publish everything in your workspace, the usual package-selection arguments should work as expected: cargo publish --package taco --package tortilla will package just taco and tortilla, while correctly managing any dependencies between them. Or you can exclude packages with cargo publish --exclude onions.
If you’re using a stable Rust toolchain, workspace publishing will be available in Cargo 1.90 in September 2025.
1. If you use Node.js, Cargo is like the npm command and crates.io is like the NPM registry. If you use Python, Cargo is like pip (or Poetry, or uv) and crates.io is like PyPI.↩
2. It can also contain lots of other useful workspace-scoped information, like dependencies that are common between crates or global compiler settings.↩
3. To be even more precise, it only checks that the package compiles against the dependencies that are locked in your Cargo.lock file, which gets included in the package. If you or someone in your dependency tree doesn’t correctly follow semantic versioning, downstream users could still experience compilation problems. In practice, we’ve seen this cause binary packages to break because cargo install ignores the lock file by default.↩
4. There are also several third-party tools (for example, cargo-release, cargo-smart-release, and release-plz) to help automate multi-crate releases. If one of these meets your needs, it might be better than a custom script.↩
5. “Yanking” is Cargo’s mechanism for marking packages as broken without actually deleting their contents and breaking everyone’s builds.↩