I’m a Unity developer: Professionally I make mobile games with the Unity game engine. I’ve been working with Unity since 2014, and have seen the engine and the ecosystem around it change a lot over the years. I’m also an avid open source developer, and a big believer in the value of having a community-driven ecosystem around any core technology. Many times in my years working with Unity I’ve tried unsuccessfully to set up reusable, open source libraries for Unity in the way that I would for other ecosystems like JavaScript (via NPM) or Rust, failing inevitably due to some limitation in the tooling available for Unity. In the last year or so things have been changing a lot within the Unity ecosystem, and I’m finding that I’m finally able to do all of the things that I’m used to being able to do when setting up open source libraries.

What follows is my attempt to recount the history of package management in Unity, along with the ways that Unity and its ecosystem have been changing recently to make authoring packages easier.

The Bad Old Days

I started using Unity right at the tail end of the Unity 4 cycle, just before Unity 5 came out. At that time, Unity’s only tool for sharing assets between Unity projects was asset packages. Asset packages are glorified zip archives containing a set of game assets (including code files!) in a pre-defined directory tree. When you import an asset package, Unity merges the package’s directory tree into your project’s root Assets folder, giving you the option to preview which files will be imported ahead of time.

This system works well enough as a way to do one-off asset imports, but as you might imagine it’s not a terribly good system for managing more complex dependencies, especially code dependencies. If a new version of a package moves or renames files or folders, the import process doesn’t handle removing the old versions of the assets, leaving you with duplicates. Ideally a package will keep all of its assets under a single top-level directory so that you only need to delete a single folder before importing the new version; in practice many packages fail to follow this convention. Some packages even store project-specific configuration files within the package’s folder, so if you try to delete the root package folder before upgrading you’ll lose configuration settings that you probably meant to keep. There’s also nothing preventing you from making modifications to code and assets pulled in from these packages, and in practice such modifications are common in Unity projects. This makes upgrading packages doubly difficult, because you now also have to manage merging your changes with the incoming ones.

There were also limited means of distributing such packages. The main place for hosting them was the Unity Asset Store, which mainly existed for selling pre-made assets. You could distribute packages there for free, but the interface for accessing and downloading packages as a user was pretty rough. It was also not uncommon to find open source projects on GitHub that provided pre-built asset packages for import. However, it was just as common to find projects on GitHub that didn’t provide a pre-built package, where the recommended way of grabbing the code was to manually copy the contents into your project. For a long time the Unify Community Wiki was also a popular option for smaller code snippets, though you had to manually copy-paste the code into your project which… blech. I get the impression that it sees a lot less usage these days than it once did, though.

So while there were undoubtedly useful utilities out there, and loads of developers trying to do their best with the tools available, I wouldn’t say that Unity really had an ecosystem per se. The tooling available simply made it impossible to share common code dependencies in a way that would allow for large, reusable libraries to be built. Instead, everyone working on a Unity project had their own local copy of the same handful of common dependencies (usually with a couple of bespoke modifications).

Take, for example, JSON parsing. For a long time Unity didn’t have built-in support for JSON serialization, and even now the support it has is very limited and not usable for many games. So instead most projects using JSON in some form end up having to pull in a separate JSON serialization library. The most common solution for a long time was SimpleJSON, which was posted on the Unify wiki. Nowadays I see miniJSON used a lot (Unity even uses it within their UDP package). The far more robust Json.NET was also made into a Unity package for the low, low price of $20 (the actual distribution of Json.NET has always been free). Of course, if you’re making a Unity package that itself needs JSON parsing support, you can’t assume that everyone using your package will already have a JSON library in their project (or which one they’ll be using even if they do already have one), so you need to provide your own copy of the JSON parsing library you’re using. I’ve worked on a project that had no fewer than 6 JSON parsing implementations in various places, several of them copies of SimpleJSON! The project I work on currently has 3 copies of miniJSON (pulled in via external packages) in addition to the 2 different JSON libraries we’re using for the game itself!
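To make the pain concrete, here’s a rough sketch of the kind of thing these libraries get pulled in for. Unity’s built-in JsonUtility can only round-trip plain serializable classes (no dictionaries, no top-level arrays), so projects reach for something like miniJSON instead. This assumes a copy of MiniJSON.cs lives somewhere in the project:

```csharp
using System.Collections.Generic;
using MiniJSON; // assumes a copy of MiniJSON.cs somewhere in the project

public static class JsonDemo
{
    public static void Run()
    {
        // Data like this (an object with a nested array) is beyond what
        // JsonUtility can deserialize, hence the third-party library.
        const string json = "{\"name\": \"player1\", \"scores\": [10, 20, 30]}";

        // miniJSON deserializes into loosely-typed dictionaries and lists...
        var data = Json.Deserialize(json) as Dictionary<string, object>;
        var name = (string)data["name"];
        var scores = (List<object>)data["scores"];

        // ...and serializes plain collections back out to a JSON string.
        string roundTripped = Json.Serialize(data);
    }
}
```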

A New Hope

In the 2017.2 release Unity started adding a new package manager (UPM), which became available to users in the 2018.1 release. In its initial form there wasn’t official support for making custom packages (it was only being used to distribute Unity’s own packages). However, at least one clever person was able to reverse engineer the package format, making it possible to start experimenting with the package manager early.

With the 2018.3 release, Unity added official support for custom packages, as well as experimental support for distributing packages via Git and custom NPM servers. At this point the functionality was still largely undocumented, but was working well enough to start using in actual projects. Synapse has at least one project on Unity 2018.4 that relies on this functionality and has found that it works well in practice.
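For reference, a UPM package is defined by a package.json manifest at its root. A minimal sketch looks something like this (the package name, versions, and dependency below are all made up for illustration):

```json
{
  "name": "com.example.my-utilities",
  "version": "1.0.0",
  "displayName": "My Utilities",
  "unity": "2018.3",
  "dependencies": {
    "com.example.other-package": "1.2.0"
  }
}
```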

Starting in 2019.1 Unity provided official documentation for setting up custom packages, and they’ve continued to flesh out the docs and improve on UPM’s functionality throughout the 2019 release cycle. The big thing this has enabled is the ability to start breaking reusable bits of functionality out into local packages. For studios like Synapse that have made many games over the years, it’s useful to have common utilities that are reused between games. Historically we’ve been able to share these utilities between games using version control mechanisms like SVN externals or Git submodules.

At this point, UPM provides enough functionality that building out an ecosystem of Unity packages is actually a viable prospect. Well, at least in theory. In practice there’s still one major hiccup that needs to be addressed:

Hosting Packages

At the time of writing, Unity still doesn’t have an official way to host custom UPM packages. The officially-supported ways for pulling in package dependencies are (all three are sketched in the manifest example after this list):

  • The local file system, either by dropping the package directly into your project’s Packages folder or by manually specifying the path to the package on your local filesystem.
  • Via Git, by specifying the URL of a Git repository. This makes posting a package up on GitHub a pretty common way of sharing Unity packages.
  • Via NPM (of all things). Unity’s docs specifically suggest hosting your own NPM package registry.
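Here’s a rough sketch of what all three options look like in a project’s Packages/manifest.json (every package name and URL below is made up for illustration):

```json
{
  "dependencies": {
    "com.example.local-utils": "file:../../shared/com.example.local-utils",
    "com.example.git-utils": "https://github.com/example/git-utils.git#1.2.0",
    "com.example.registry-utils": "1.0.0"
  },
  "scopedRegistries": [
    {
      "name": "Example Registry",
      "url": "https://npm.example.com",
      "scopes": ["com.example"]
    }
  ]
}
```

The scopedRegistries block is what tells Unity to resolve matching package names against a custom NPM registry rather than Unity’s own.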

The last option is, in theory, the best since it doesn’t force a dependency on Git (not all projects are already using Git) and doesn’t require users to vendor local copies of the package. However, hosting your own package registry is a hurdle for most developers who just want to share some useful utility code. Some intrepid folks have actually started hosting their packages on NPM proper, which… is something, I guess.

Fortunately, it was only a matter of time before someone stepped in to provide a common package registry for Unity developers. Enter OpenUPM: An open source package registry with a built-in build pipeline for automatically deploying packages to the registry. Any UPM package hosted on GitHub can be added to the registry, and OpenUPM will build the package and host it for redistribution. It also provides a nifty command line tool for adding and updating packages, since the “scoped registry” system for adding external packages can be tedious to update by hand.
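As a rough sketch of the workflow (the package name here is hypothetical), installing the CLI and adding a package looks something like this:

```sh
# Install the OpenUPM command line tool (distributed via NPM).
npm install -g openupm-cli

# Add a package to the current project; this rewrites Packages/manifest.json,
# including the scoped registry entries needed to resolve it against OpenUPM.
openupm add com.example.some-package
```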

OpenUPM is… a bit of a weird project. As far as package registries go, it’s pretty odd to be able to publish other people’s packages. The built-in build pipeline is also somewhat unusual, since you usually publish packages to the registry directly rather than having the registry go out and find the package elsewhere. The need for a separate command line tool also goes against the grain for UPM, where the expected flow for adding/updating packages is to go through the package manager window in the editor.

However I can forgive OpenUPM’s quirks since it’s providing a very important service (and most of those quirks are working around problems that Unity caused in the first place). Being able to easily host Unity packages means that for the first time in Unity’s history it’s actually possible to start building out a more complete package ecosystem! Packages can be published and versioned properly, and packages can reliably depend on other packages without having to manually copy their contents.

Testing Open Source Projects

However, if you’re maintaining an open source project of any kind, having build and test automation is pretty critical to ensuring that the code you’re publishing actually works as intended. Historically this has been a major pain point for Unity projects.

For one thing, running the Unity editor from the command line has always been a struggle. For a long time the command line options were very poorly documented, and the editor would often fail to correctly report errors, leaving you with no feedback as to what failed or why. It was also especially difficult to run Unity in a headless environment, meaning things like Docker were often non-starters.
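For context, a headless test run looks something like the sketch below (the paths are placeholders, and the exact flags vary a bit between Unity versions):

```sh
# Run edit mode tests with no GUI; -nographics avoids needing a GPU.
Unity -batchmode -nographics \
    -projectPath ./MyProject \
    -runTests -testPlatform EditMode \
    -testResults ./results.xml \
    -logFile ./unity.log
```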

Worst of all is Unity’s license activation policy. In order to run Unity you need to activate a license. Anyone can activate a free license for personal use, but doing so is a manual process; there’s no automated way of doing it. What’s worse, license activations are pinned to the machine that you activated the license on, which means that VM-based build systems are basically unusable since each run requires a fresh license activation. If you have a professional Unity license you can activate that more easily from the command line. However, professional licenses can only be activated on two machines at a time, which means that even if you’re paying the big bucks for a license you can still have at most two concurrent builds! Even once running Unity from the command line became more viable, the need to activate a license effectively killed every attempt I’ve ever made to set up automated testing for my open source projects (and I’ve tried many times over the last few years).
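For reference, activating a professional license from the command line looks roughly like this (the serial and credentials are obviously placeholders):

```sh
# Activate a professional license on this machine, then exit.
Unity -batchmode -quit \
    -serial XX-XXXX-XXXX-XXXX-XXXX-XXXX \
    -username 'user@example.com' \
    -password 'correct-horse-battery-staple'
```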

However, in the last couple of years two projects have popped up that have managed to solve this issue (for the most part). First, a user on GitLab has started providing pre-built Docker images with Unity installed. The project also includes instructions for how to activate a Unity personal license from within the Docker container. This effectively works around the need to activate a license per machine, because a given Docker image looks like the same machine to Unity no matter how many times you run it!

Using those Docker images, another person has been able to build out pre-made Unity actions for GitHub Actions (GitHub’s new CI service). The project provides actions for running tests and building for different platforms, and provides built-in support for activating personal licenses! This cuts the amount of manual work needed to set up test automation down to a minimum, and makes it actually viable to set up automation for an open source project. For example, the kongregate-web package that I maintain is set up to test against two different versions of Unity, and verifies that the code works both in the editor and when built for WebGL!
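As a sketch of what that setup looks like (the action name, version, and inputs here reflect my understanding of the project and may not match its current docs exactly), a workflow that runs a package’s tests can be as short as:

```yaml
# .github/workflows/test.yml
name: Tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: game-ci/unity-test-runner@v2
        env:
          # A pre-activated personal license, stored as a repository secret.
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE }}
        with:
          projectPath: ./TestProject
          unityVersion: 2019.3.0f6
```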

Generating Documentation

Another longstanding issue I’ve had in trying to maintain open source Unity packages is difficulty in generating API documentation. C# has built-in support for doc comments; however, I previously hadn’t been able to find a tool that could generate a hostable website for browsing the docs. This makes it hard for users to see what functionality your package provides without digging through source code, which is less than ideal.

But recently I came across DocFX, which seems to now be the semi-official documentation generator for .NET, and I was able to get it working for a Unity package without much issue! DocFX knows how to parse C# source code, and it doesn’t seem to mind that the code isn’t set up with a proper .csproj file. Just write regular XML comments in your source code, point DocFX at it, and you’re good to go. It’s even pretty easy to automatically publish the generated docs to a GitHub Pages site using GitHub Actions! Unity seems to be using it to generate the documentation for all of their packages, too, which gives me some confidence that the tool has proven to work reasonably well with Unity projects.
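For anyone who hasn’t used them, these are the standard C# XML doc comments; a quick sketch of the kind of thing DocFX consumes:

```csharp
public static class MathUtils
{
    /// <summary>
    /// Clamps a value to the inclusive range [min, max].
    /// </summary>
    /// <param name="value">The value to clamp.</param>
    /// <param name="min">The lower bound of the range.</param>
    /// <param name="max">The upper bound of the range.</param>
    /// <returns><paramref name="value"/> clamped to the given range.</returns>
    public static float Clamp(float value, float min, float max)
    {
        if (value < min) { return min; }
        if (value > max) { return max; }
        return value;
    }
}
```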

The Not-So-Bright Side

As per usual, not everything is sunshine and roses in the Unity world. UPM is a massive step forward compared to what we had before, but there are still some unfortunate pain points to deal with:

  • The lack of an official package registry is really a massive oversight. Centralized package hosting is usually like half the point of having a package manager in the first place, and treating GitHub as the semi-official solution isn’t ideal for projects that aren’t already using Git. OpenUPM is a stopgap solution, but the way the scoped registry system is set up poses problems for packages that depend on other packages on OpenUPM. Specifically…
  • A package can’t itself declare scoped registries, so a project pulling in the package needs to also add the scoped registry declarations for the package’s entire dependency tree. This is gradually becoming more of an issue as people continue to publish more packages on OpenUPM that in turn depend on other packages, and it’s one of the things that makes the custom OpenUPM command line tool necessary: it handles adding those registry entries for you.
  • The package manager UI in the editor doesn’t seem to work well with general purpose registries like OpenUPM. The UI will show packages that are a part of the declared scope for the registry (i.e. where the package name starts with the specified prefix), but for OpenUPM there’s no common scope that all packages are a part of, so the UI doesn’t let you browse or add OpenUPM packages. I can imagine some valid reasons why UPM is set up to work this way, but it highlights the difficulties that come with not having an official package registry.
  • Dealing with conflicts between package versions in dependencies isn’t great. Your project can only pull in a single canonical version of a package, so if multiple packages depend on different versions of the same package UPM needs to pick a single version to use. Unity can sometimes resolve this automatically by grabbing the highest required version, but sometimes it can’t and you get to deal with it yourself. Honestly, I’m not too upset about this one, though; this restriction comes down more to how .NET works than to anything Unity-specific, and you run into the same problem with various other package managers, so it’s not like this is an entirely solved problem elsewhere. Still, it’s a pain point that’s only going to grow as inter-dependencies between packages become more common.
  • While I highlighted earlier the ways in which running tests for a Unity project has gotten easier, there’s still more setup when testing a package than is really necessary: For each Unity version that you want to test against you need a separate test project, including a separate manual license activation for each of those Unity versions. In most cases the package will be self-contained with everything it needs to run its test suite, so these are generally empty projects that exist just so that you can run Unity from the command line. It would be far easier to set up CI for new packages if you could just point Unity at a package and have it run the tests without needing a full project setup, and if you didn’t need a license activation when running package tests.

At least a couple of these can potentially be addressed by the community building better tooling. However, some of them can only really be addressed by improvements to Unity itself.

Closing Thoughts

Things are currently looking much brighter for the Unity ecosystem than they have in the past: With an actual package manager for Unity and an easy way to host packages, it’s much easier to create reusable code than it was previously; the recent improvements to CI setups make it much more viable to maintain an open source Unity package; and the ability to generate readable documentation for packages makes it easier to use community-provided packages. While I don’t think all of the difficulties around building open source Unity packages are completely behind us, I have hope that it will continue to get easier as more tooling is built by the community.