Codebase Conversion: Building a MEAN.js AngularJS Project with Create-React-App

This is a post in the Codebase Conversion series.


How I hacked CRA's config to have it build a classic AngularJS codebase

Intro 🔗︎

For the last several years, I'd worked on a project codebase that was primarily Backbone. Over time, we'd been incrementally migrating that codebase to React+Redux.

I was recently reassigned to another project, which is an internal metrics dashboard. It turns out that the app is an AngularJS 1.x codebase, written using the classic MEAN.js starter kit. (I'd managed to avoid working with any flavor of Angular thus far in my career, and certainly didn't ever want to use AngularJS 1.x, but here we are.)

As I began learning the codebase and trying to work on my first couple tasks, I realized that the build setup is woefully deficient in comparison to what I've been used to:

  • Client-side packages are being installed via Bower (which was deprecated in 2017)
  • The app is still using Node 6.x as the runtime
  • The build process is entirely managed by Gulp
  • This codebase had Babel and @babel/preset-env added, but Babel was only being used as part of the prod build process
  • There's no actual bundling going on at all. The prod pipeline minifies and concatenates the client JS, but nothing at all in dev. Also, all the client code is written as IIFEs that access globals. That means you can't actually use CommonJS or ES module syntax, can't import from other files, can't use any packages from NPM, and it's downloading tons of Angular templates and individual CSS files as part of the loading process.

As the new team lead, I'm going to get to drive the technical direction for this app. Long-term, of course I want to replace it all with a modern React stack, but that's a long ways off. In the short term, I just want to update the build tooling to support a modernized workflow (bundling, Babel, NPM packages, asset management), and enable at least using React inside the existing AngularJS UI to start doing some bits of incremental migration.

After realizing these issues, I dove into working on the build tooling, and I was able to pull off a hack that is incredibly bizarre and yet works surprisingly well:

I've successfully used Create-React-App to build this classic AngularJS codebase!

I figure it's worth documenting this as a public service in case anyone else ever wants to try this, as well as showing some techniques that might be useful in general.

A couple notes:

  • I don't feel like trying to recreate this on a fresh MEAN.js repo atm, but I believe this should work in other projects based on the similarities between my app's codebase and the original MEAN.js boilerplate repo
  • When I originally wrote this post, I had just done the initial CRA conversion the day before and only tested it in my local dev environment. I did a bunch of additional conversion work later on and also built a new feature on top of the build tooling changes, and have updated the post to reflect that. The app has been deployed in a test environment and appears to be pretty stable, with the bundling working as intended.
  • The codebase is based off of MEAN.js v0.5.0. It looks like there may be a 0.6.0 release, or at least there's a bunch of changes in master since then.

Upgrading Node and Package Management 🔗︎

Node 7.x Support 🔗︎

The first issue was that the app server crashed when run under Node >= 7.x. Turns out there was a fix in the MEAN.js repo already. Apparently the debug server task was passing a --debug flag to Node when starting server.js, and that flag was deprecated as of Node 7. Passing --inspect fixed the crash, and allowed me to run the server with newer Node (10.x). Also, Gulp 3.x doesn't work on Node 12, so an upgrade to Gulp 4.x would be necessary to use Node 12. Haven't refactored that yet.
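
For reference, the fix itself is a one-line change. Assuming the Gulpfile uses gulp-nodemon the way the MEAN.js boilerplate does (the task name and other option values here are illustrative, trimmed from memory of the boilerplate), it looks roughly like this:

Gulpfile.js

gulp.task('nodemon', function () {
  return plugins.nodemon({
    script: 'server.js',
    // was: nodeArgs: ['--debug'],
    nodeArgs: ['--inspect'],
    ext: 'js,html',
    watch: defaultAssets.server.allJS
  });
});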

In addition, the node-inspector package should be removed, because that's apparently years outdated.

Switching from Bower to Yarn 🔗︎

Since we were still on Node 6.x, we were using NPM 3 and had no lockfiles. As a first step, I used Yarn 1.22 to reinstall all the packages, and generated an initial lockfile. I also noted that we ended up with like 30 different copies of Lodash in our node_modules, so I used Yarn's resolutions to force a single consistent version of Lodash. That shrank things considerably (from around 57K files to 28K).
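
If you haven't used Yarn's resolutions before, it's just a top-level field in package.json (the exact Lodash version shown here is illustrative):

  "resolutions": {
    "lodash": "^4.17.15"
  }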

The next step was to drop Bower itself. I skimmed through a few different resources on migrating from Bower to NPM-based installs.

I ended up pasting the deps list from bower.json into package.json, removing any hash versions, and fixing up package names. For example, angular-ui-router on Bower had been replaced with @uirouter/angularjs on NPM. I kept retrying installs and looking at error messages until Yarn succeeded in installing all packages.

The MEAN.js template then has some config files with arrays of the third-party JS and CSS files from Bower, like:

  css: [
    'public/lib/bootstrap/dist/css/bootstrap.css'
  ]

As an initial temp conversion step, I changed all references of public/lib to node_modules, and double-checked the paths to make sure those files existed.
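
After that search-and-replace, the same config entry pointed at the NPM copy instead:

  css: [
    'node_modules/bootstrap/dist/css/bootstrap.css'
  ]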

That process made all the existing build logic break, because the Gulp tasks were using wiredep for pulling in Bower assets and doing various concatenations. So, time to bring in Webpack.

Initial Webpack Migration 🔗︎

I did some more searching on Github to find other repos where folks might have migrated their MEAN.js project to use Webpack. In addition to an open PR on the MEAN.js repo itself, I found bcdevexchange/devex PR #509. I used that as a guideline for my initial Webpack conversion attempt. There's also a good post at Migrating AngularJS from Gulp to Webpack that has some useful instructions and snippets.

That bcdevexchange PR and the subsequent work in the repo sets up Webpack as part of the Gulp process, and pipes globs of source files into Webpack. I was able to successfully get the client app building using this approach.

There were a couple key parts of this conversion process.

First, I added a vendor.js file that imported all the client-side lib JS and CSS assets so that Webpack would process them:

modules/core/client/app/vendor.js

import "angular";
import "@uirouter/angularjs";
// other libs

import "bootstrap/dist/css/bootstrap.css";
// other CSS

I also added a ./paths.js file to calculate some project paths:

paths.js

const path = require("path");

const projectBasePath = path.resolve(__dirname);

module.exports = {
  modules: path.resolve(projectBasePath, "modules"),
  config: path.resolve(projectBasePath, "config"),
  dist: path.join(projectBasePath, "public")
}

Second, while the blog post recommends adding index.js files to the various modules/ folders and explicitly importing things (which is probably a good idea overall), the PR actually globs together all the JS files and passes them as the contents of the Webpack entry point:

webpack.config.js

const commonConfig = merge([
  {
    entry: {
      app: [
        path.join(paths.modules, "core/client/app/vendor.js"),
        path.join(paths.modules, "core/client/app/config.js"),
        path.join(paths.modules, "core/client/app/init.js"),
      ].concat(
        glob.sync(path.join(paths.modules, "*/client/*.js*")),
        glob.sync(path.join(paths.modules, "*/client/**/*.js*"), {
          ignore: [
            path.join(paths.modules, "*/client/*.js*"),
            path.join(paths.modules, "core/client/app/config.js"),
            path.join(paths.modules, "core/client/app/init.js"),
          ]
        }),
      )
    }
  }
])

That technique for defining the entry point worked for me.

However, I wasn't happy with the "Gulp drives Webpack" approach for several reasons:

  • I'm used to having a standalone dev server
  • Webpack was writing a large bundle to disk in dev mode
  • None of the other assets were being handled by Webpack
  • Still no use of other modules on the client side

I really wanted the same kind of experience I have when using a typical Create-React-App project.

So... why not use Create-React-App itself to build this AngularJS app? :)

Looking Inside Create-React-App 🔗︎

Understanding Create-React-App's Internals 🔗︎

Create-React-App may seem like magic, but it's not. It's just a lot of very carefully crafted configuration files for Webpack, Babel, and other build tools, with thousands of man-hours put into solving problems and making things work right for a wide variety of users.

This is why I still strongly believe that most apps should default to Create-React-App or Next.js as their build tooling, vs trying to set up a Webpack+Babel config by hand from scratch. Sure, it's easier to whip up a starting config these days, but you'll never be able to match the amount of effort that's been put into ensuring that everything works right and handles different edge cases correctly. (That's not to say you shouldn't ever write a config by hand, just that you should have really good reasons to do so and know what you're doing first.)

The create-react-app command is actually just a bootstrapping tool that sets up a new project. The real guts of CRA are in the react-scripts package in the CRA repo. In particular, you can see the heavily customized CRA webpack.config.js and the source for the start command.

If you look at those files, you'll note that both of them rely on a package called react-dev-utils. This package contains numerous utility functions that have been extracted from react-scripts, like functions to format output messages from Webpack/ESLint/TypeScript, open your default browser or editor, assorted Webpack config utils, and so on.

On my earlier Backbone project, I'd first set up a custom Webpack config and dev server from scratch. I later was able to leverage these APIs from react-dev-utils to make my dev server look and behave more like CRA's dev server.

It's also worth noting that CRA has a file called paths.js that contains calculated paths to many folders and files, and that this structure is used throughout the build process.

Create-Any-App? 🔗︎

Despite its name, almost nothing about Create-React-App's build tooling is really React-specific. The React references are really limited to adding specific handling of .jsx and .tsx files, inserting @babel/preset-react, and the fact that the CRA template projects all depend on react and react-dom by default and render a React component tree in index.js.

But, from there, you don't have to write React code in a CRA-generated project. You could remove the react and react-dom dependencies and then write pretty much any JS code that you want: Preact, vanilla JS, Vue, Angular, Backbone, or whatever.

Overriding Create-React-App 🔗︎

CRA is deliberately opinionated in many ways. It hides the Webpack and Babel configs inside the react-scripts package so you can't access them, and doesn't provide a way to override that configuration. It also expects that your app will have its code in ./src, that the app's entry point is ./src/index.{js,ts}, and that ./public/index.html exists and will be the template for Webpack's HTML output.

Because of these opinions, a cottage industry of tooling has sprung up to allow overriding CRA's config.

I used react-app-rewired originally, but its maintenance has somewhat lapsed since CRA 2.0 came out. Today I use and would recommend craco for overriding CRA configs.

I've used CRA overrides on multiple projects. Last year I overrode CRA to allow inserting a proxy middleware at the end of the Express pipeline, and when I created a boilerplate for CRA+Spectacle+MDX, I set up the config to use a Markdown loader.

So, I have plenty of experience poking through the guts of CRA and fiddling with its internals. This prior knowledge was extremely useful here.

Hacking CRA's Configuration for Fun and Profit 🔗︎

Bypassing Folder Structure Checks 🔗︎

I installed react-scripts and @craco/craco, added a craco.config.js file to my project, and a run script of "client:start": "craco start --verbose". If I ran it, the process would start up for a bit... and then suddenly exit with an error that ./src/index.js didn't exist.

  • I also removed the "watch" and "webpack-watch" tasks I'd added to the Gulpfile from its "default", "dev", and "prod" commands, to ensure it wouldn't touch client-side code, and disabled all the JS and CSS-related linting steps.

CRA has a hardcoded check on startup to enforce that both ./src/index.{js,ts} and ./public/index.html exist. Since this Angular app's code all lives under ./modules/, those checks won't pass.

I could have created dummy folders and files, but there are other issues. CRA wants to use ./src/index.{js,ts} as the entry point for Webpack's bundling. It also assumes that ./public/index.html is the HTML template to pass to Webpack, and then it tries to copy all your assets from ./public/ into the output ./build/ folder. Meanwhile, the MEAN.js Express server is configured to first serve static assets from ./public/ (including index.html), and we need to pass that big glob of source files from ./modules/ into Webpack.

To make this work, I'd have to override several bits of this behavior, not just alter the Webpack config.

The first step was to mock out the checkRequiredFiles() function that CRA uses. Fortunately, that function is in its own file in react-dev-utils. Node.js's require.cache API allows us to replace the values that an imported module has exported with our own values:

craco.config.js

const path = require("path");

// Import CRA's "check required files" module so we can fake it out completely
const craCheckRequiredFilesPath = path.resolve("node_modules/react-dev-utils/checkRequiredFiles.js");
require(craCheckRequiredFilesPath);

// Supply a fake implementation
require.cache[craCheckRequiredFilesPath].exports = () => true;

That was sufficient to let CRA bypass the initial file existence checks.

Overriding Folder Paths 🔗︎

The next issue was altering the various paths that CRA uses. I needed to:

  • Change paths.appIndexJs to be the big glob of actual source files to bundle
  • Point paths.appHtml to the alternate location of my index.html template
  • Change the location of the public folder used as the source of non-compiled assets, and put the build output in the actual ./public folder served by Express

I created ./modules/core/client/public/ as the new public folder location, and dropped a modified version of the MEAN.js layout.server.view.html file into that folder as index.html. I also moved some of the image and icon assets there as well.

I then tried using the same trick for overriding the cached CRA paths.js file:

craco.config.js

const glob = require("glob");

const appPaths = require("./paths");

// Import the CRA paths object so we can manipulate it
const craPathsPath = path.resolve("node_modules/react-scripts/config/paths.js");
const craPaths = require(craPathsPath);

const indexHtmlPath = path.resolve(appPaths.modules, "core/client/public/index.html");

// Point CRA to our actual index.html file
craPaths.appHtml = indexHtmlPath;

// Glob together all our actual client source files
const actualClientSourceGlob = [
  // big glob o' files from the earlier snippet in this post
];

// Write build output to `./public` so it can be served by Express
craPaths.appBuild = craPaths.appPublic
// and use the alternate "public" folder as the asset folder copy source
craPaths.appPublic = path.resolve(appPaths.modules, "core/client/public");

require.cache[craPathsPath].exports = craPaths;

This didn't actually work as I expected. After adding some more logging directly inside node_modules files, it appeared that craco was somehow re-importing the CRA paths.js file, causing my assignment to the cache to not have the effect I wanted.

I eventually was able to get that to work by doing the cache override a second time, in the process of altering the Webpack config itself, which apparently runs after craco has re-imported the paths file.

Note: After later research, I discovered that CRA explicitly clears the require cache for its paths.js file, and I was unable to find anywhere in the craco loading process that would let me successfully re-overwrite that cache value at the correct time for some of the other aspects I needed.

Rewriting the Webpack Config 🔗︎

With those steps in place, it was time to actually modify the Webpack config. I needed to:

  • Remove the original index.js path from the entry point and use my globs instead
  • Add "absolute path" handling using the modules/ folder as a base (not required, but nice to have)
  • Alter the HTML template plugin to use my own template file
  • Ensure that the build output is written to the right place
  • Set up initial handling for loading Angular HTML templates

Here's what the initial overridden config looked like:

craco.config.js

module.exports = {
  webpack: {
    configure: (webpackConfig, ctx) => {
      webpackConfig.entry = webpackConfig.entry
        .filter(path => !path.includes("index."))
        .concat(actualClientSourceGlob);
        
      // Enable absolute paths starting from "./modules"
      webpackConfig.resolve.modules.push(appPaths.modules)
      
      // Use the right HTML template
      const [htmlWebpackPlugin] = webpackConfig.plugins;
      htmlWebpackPlugin.options.template = indexHtmlPath;
      
      // Process Angular HTML template files
      webpackConfig.module.rules.push({
        test: /\.html$/,
        use: "html-loader"
      });
      
      require.cache[craPathsPath].exports = craPaths;
      
      if(ctx.env === "production") {
        // set build output folder
        webpackConfig.output.path = craPaths.appBuild;
      }
      
      return webpackConfig;
    }
  }
}

That was enough to let the app actually build... but would it run?

Fixing Runtime Issues 🔗︎

The first time I tried to load the app with that config, I got an infinite number of 404 error messages appearing in my browser's devtools console. Some API requests were failing and causing a 404, but the client didn't know how to load the Angular template for a 404, which caused another request, which....

The first step was adding a proxy: "http://localhost:8080" setting to the app's package.json to enable proxying API requests through to the backend.
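
That's just a single top-level field in package.json, pointing at the port the Express server listens on:

  "proxy": "http://localhost:8080"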

Loading Angular Templates 🔗︎

From there, I needed to load AngularJS's templates through Webpack, so that they would be bundled in and accessible without needing to request them from the server. The post I linked earlier on Migrating AngularJS from Gulp to Webpack had a section on doing this. I was able to modify their approach a bit to match this repo's structure.

First, I added a util function to process a list of template filenames:

modules/core/client/utils.js

import angular from "angular";

export function loadTemplates(name, ctx) {
  angular.module(name).run([
    "$templateCache",
    function($templateCache) {
      ctx.keys().forEach(key => {
        // looks like: "./some.template.html"
        const filename = key.replace("./", "");
        
        // All our templates are at paths like this:
        // /modules/core/client/views/some.template.html
        // and referred to in the source using that path string
        const path = `/modules/${name}/client/views/${filename}`;
        
        // Add the template to the cache
        $templateCache.put(path, ctx(key));
      })
    }
  ])
}

Then, in our top-level module definition files, I used Webpack's require.context API to import all HTML files within that module folder. I also added another require.context call to ensure all CSS files were imported as well.

modules/someFeature/someFeature.client.module.js

+import {loadTemplates} from "core/client/utils";
(function(app) {
  "use strict";
  
  app.registerModule("someFeature");
  
+ const ctx = require.context("./views", true, /.*\.html$/);
+ loadTemplates("someFeature", ctx);

+ const cssCtx = require.context("./css", true, /.*\.css$/);
+ cssCtx.keys().forEach(key => cssCtx(key));
}(ApplicationConfiguration));

And that was sufficient to get the app client effectively running in local dev. With those changes in place, I was able to successfully load the app from my CRA dev server at http://localhost:3000, browse through the UI, see data being fetched, and things were getting served up correctly.

I was also able to run CRA's build command using the same setup. It correctly output the files under ./public, with the typical app and vendor chunks I'd expect to see.

Updating Production Behavior 🔗︎

At this point, it was pretty much just "it works on my machine!". I needed to expand on this to enable actually deploying the codebase to anything resembling a production environment.

Fixing AngularJS Annotations 🔗︎

When I tried doing a prod build with CRA and browsing to the app server, I started getting a bunch of errors about missing AngularJS injection dependencies.

I concluded that this was likely due to use of the implicit dependency injection syntax somewhere, and that prod minification of the bundle was breaking things.
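
If you haven't run into this before, here's an illustrative example (not from the actual codebase) of why minification breaks implicit annotations:

  // Implicit annotation: AngularJS infers the dependency from the parameter name.
  // After minification, `$http` gets renamed to something like `a`, and injection fails.
  angular.module('core').controller('SomeController', function ($http) { /* ... */ });

  // Explicit annotation survives minification, because the dependency names
  // live in strings that the minifier won't rename.
  angular.module('core').controller('SomeController', ['$http', function ($http) { /* ... */ }]);

babel-plugin-angularjs-annotate rewrites implicitly-annotated functions to include explicit annotations at build time, which is exactly what's needed here.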

Our particular repo already had a .babelrc file that was trying to use babel-plugin-angularjs-annotate, although I'm not entirely convinced it was set up right.

I had to update the craco config to include that plugin by adding this section:

  babel: {    
    plugins: [
      [
        "angularjs-annotate",
        {
            "explicitOnly": false,
        }
      ]
    ]
  }

I also found that one of our dependencies, angularjs-dropdown-multiselect, seemed to have an implicit annotation in its minified dist file and wouldn't load right. I had to explicitly update the vendor.js file I'd added to import the original unminified file instead.

// Specifically use the unminified version as the source, so that it will be run 
// through the Babel ng-annotate plugin before it's minified
import 'angularjs-dropdown-multiselect/dist/src/angularjs-dropdown-multiselect.js';

IE11 Polyfills 🔗︎

As an internal enterprise app, we unfortunately seem to have a sizeable percentage of IE11 users (ack!). The team had recently started using ES6 Promises in the codebase, and that of course is not supported in IE11.

Fortunately, CRA makes it easy to fix this. Just make sure that the "browserslist" section of your package.json has an "ie 11" entry in the "production" array, and import 'react-app-polyfill/ie11' and 'react-app-polyfill/stable' at the top of your entry point / vendor file. Loading the page with IE11 before the change produced errors that Promise doesn't exist, and it loaded fine afterwards.
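
Roughly, the two pieces look like this (the non-IE entries are just CRA's default browserslist values):

package.json

  "browserslist": {
    "production": [
      "ie 11",
      ">0.2%",
      "not dead",
      "not op_mini all"
    ],
    "development": [
      "last 1 chrome version",
      "last 1 firefox version",
      "last 1 safari version"
    ]
  }

modules/core/client/app/vendor.js

// These polyfill imports need to come before everything else in the bundle
import 'react-app-polyfill/ie11';
import 'react-app-polyfill/stable';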

Serving the Index Route 🔗︎

The app server originally relied on a pair of template files in modules/core/server/views/, named index.server.view.html and layout.server.view.html. The layout file contained the HTML head and body, script and CSS tags, and EJS template syntax placeholders for data on the current logged-in user and an OWASP password strength init options object.

Going forward, the app wouldn't actually be serving these templates at all, because the index file is now generated by the Webpack HTML plugin. In dev, the CRA dev server generates that, and in prod, that's a static public/index.html file.

In core.server.routes.js, I deleted the app.route('/*').get(core.renderIndex); line, and also deleted renderIndex entirely from core.server.controller.js. Then, in config/lib/express.js, I added this catch-all handler to the end of the main export function, right after all the route init calls:

  // Catch-all: handle loading unknown client pages by serving up index.html,
  // so that a direct request to a client-side route like `/items` works right.
  app.get('/*', (req, res) => {
    res.sendFile(path.join(config.folders.dist, 'index.html'));
  });

Fixing User Sessions and App Init 🔗︎

The original layout.server.view.html file had these scripts expecting data to be written in by the server when rendered:

  <script type="text/javascript">
    var user = {{{ user }}};
    var env = "{{ env }}";
  </script>

  <!--owasp config sync-->
  <script type="text/javascript">
    var sharedConfig = {{{ sharedConfig }}};
    owaspPasswordStrengthTest.config(sharedConfig.owasp);
  </script>

None of that was going to be there any more, so I had to rework things.

core/client/app/config.js was reading window.env. I replaced that with process.env.NODE_ENV instead.

I moved the OWASP init into core/client/app/vendor.js. The "shared config" was really hardcoded from the server-side, so I pasted that entire config object directly into the owasp.config() call.
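
That ends up looking something like this in vendor.js (assuming the same owasp-password-strength-test package the MEAN.js boilerplate uses; the config values here are illustrative, not our actual settings):

  import owaspPasswordStrengthTest from 'owasp-password-strength-test';

  // These values previously came from the server-side "shared config", but they
  // were effectively constants, so now they live here directly
  owaspPasswordStrengthTest.config({
    allowPassphrases: true,
    maxLength: 128,
    minLength: 10,
    minPhraseLength: 20,
    minOptionalTestsToPass: 4
  });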

The users/client/services/authentication.client.service file was just saving a reference to window.user for other services to access later. I changed the service to fetch the current user data on app startup instead:

  angular
    .module('users.services')
    .factory('Authentication', Authentication)
    .run(['$http', '$rootScope', 'Authentication', function($http, $rootScope, Authentication) {
      async function checkUserSessionOnLoad() {
        try {
          const response = await $http.get('/api/users/me');
          Authentication.user = response.data;
        } catch (err) {
          Authentication.user = null;
        }
        $rootScope.$apply();
      }

      checkUserSessionOnLoad();
    }]);

  Authentication.$inject = [];

  function Authentication() {
    const auth = {
      // The run block above assigns any existing user session on app startup,
      // and AuthenticationController assigns this if the user logs in or signs up
      user: null
    };

    return auth;
  }

In the process, I also had to update the ESLint config to recognize async/await syntax, with "parserOptions": {"ecmaVersion": 2018}.

(No, I'm not convinced that the auth service as designed is a good idea, nor that fetching the user here is a great idea either, but this was sufficient to let the app work correctly.)

Converting Tests from Karma to Jest 🔗︎

The process of bundling all the code with CRA completely broke all the existing Karma client-side tests. In the process of reviewing the tests, I noted that a large portion were fairly useless anyway, but I wanted to try to get as many of them as possible running for the sake of parity with how the codebase had been before I started changing things.

I did a bunch of research and read through several posts on converting AngularJS test suites from Karma to Jest.

However, the single most helpful resource I found was a PR to convert an app called Trustroots from Karma to Jest. Looking through that PR, as well as later versions of the codebase, gave me several useful techniques and code snippets to use.

Jest Setup and Config 🔗︎

To shorten the explanation, here's the set of config files I ended up with:

jest.config.js

const path = require('path');

module.exports = {
  moduleNameMapper: {
    '^@/(.*)$': '<rootDir>/$1',
    '^modules/(.*)$': '<rootDir>/modules/$1',
    '^.+\\.(css|html)$': '<rootDir>/jest/jest.empty-module.js',
  },
  moduleDirectories: [
    'modules',
    'node_modules'
  ],
  testMatch: ['<rootDir>/modules/*/tests/client/**/*.tests.js'],
  testEnvironment: 'jest-environment-jsdom-sixteen',
  setupFilesAfterEnv: ['<rootDir>/jest/jest.setup.js'],
  transform: {
    '^.+\\.(js|jsx|mjs|cjs|ts|tsx)$': path.resolve('./jest/babelTransform.js'),
    '^.+\\.css$': path.resolve('./jest/cssTransform.js')
  },
  transformIgnorePatterns: [
    // we want to ignore everything in node_modules
    // except the html templates inside angular-ui-bootstrap
    '/node_modules/(?!angular-ui-bootstrap.+\\.html)',
  ],
};

jest/jest.empty-module.js

module.exports = '';

jest/jest.setup.js

const angular = require('angular');
require('angular-mocks');

require('core/client/app/vendor');
require('core/client/app/config');
require('core/client/app/init');

require('core/client/core.client.module');
require('core/client/config/core.client.menus');
require('core/client/services/menu.client.service');
require('core/client/services/interceptors/auth-interceptor.client.service');
require('core/client/services/interceptors/request-interceptor.client.service');
require('users/client/users.client.module');
require('users/client/services/authentication.client.service');

I also explicitly copied over the babelTransform.js and cssTransform.js files from react-scripts.
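
For reference, the copied babelTransform.js is basically just a thin wrapper around babel-jest that points at CRA's Babel preset (this is roughly what the react-scripts 3.x version contains), and cssTransform.js similarly just stubs out CSS imports:

jest/babelTransform.js

const babelJest = require('babel-jest');

module.exports = babelJest.createTransformer({
  presets: [require.resolve('babel-preset-react-app')],
  babelrc: false,
  configFile: false,
});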

Enabling Use of require.context 🔗︎

I had originally used Webpack's require.context() to mass-import CSS files and HTML templates. Unfortunately, that doesn't work right out of the box with Jest.

After digging around, I came across a Babel macro that imitates require.context at build time. Since CRA's Babel config already supports Babel macros, I was able to pull that in and use it instead of the actual require.context, and have it work in both the CRA and Jest environments:

users/client/users.client.module.js

+import requireContext from 'require-context.macro';
import {loadTemplates} from 'core/client/utils';

(function (app) {
  'use strict';

  app.registerModule('users');
  app.registerModule('users.admin');
  app.registerModule('users.admin.routes', ['ui.router', 'core.routes', 'users.admin.services']);
  app.registerModule('users.admin.services');
  app.registerModule('users.routes', ['ui.router', 'core.routes']);
  app.registerModule('users.services');

- const ctx = require.context(`./views`, true, /.*\.html$/);
+ const ctx = requireContext(`./views`, true, /.*\.html$/);
  loadTemplates('users', ctx);
}(ApplicationConfiguration));

Updating Tests for Jest 🔗︎

I had to go through and update all the test files by adding explicit imports of their various dependency files at the top, so that all the relevant services and such were available at injection time. I also had to change calls like beforeEach(module(ApplicationConfiguration.applicationModuleName)); so that they referenced angular.mock.module() instead.
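
Here's an illustrative before/after for a typical spec file (the file and module names are made up, but the shape of the change matches what I did across the real test files):

modules/someFeature/tests/client/some-feature.client.service.tests.js

+const angular = require('angular');
+
+// Explicitly pull in the modules/services this spec relies on, so they're
+// registered with Angular before injection happens
+require('someFeature/client/someFeature.client.module');
+require('someFeature/client/services/someFeature.client.service');
+
describe('SomeFeature Service Tests', function () {
- beforeEach(module(ApplicationConfiguration.applicationModuleName));
+ beforeEach(angular.mock.module(ApplicationConfiguration.applicationModuleName));

  // ...existing specs
});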

After the tests were all sufficiently passing, I went through and nuked all the references to Karma in the codebase, and removed all Karma-related dependencies.

Adding React Integration 🔗︎

With the codebase now sufficiently stable, I could begin working on the short-term migration plan: enabling the ability to add React components inside of our AngularJS-rendered UI.

Setting Up React Configuration 🔗︎

I added react, react-dom, and react2angular, the React TS typings packages, and TypeScript itself as deps. I also added @types/angular-resource, @types/angular-ui-notification, and @types/ng-file-upload.

I copied over the standard CRA-generated tsconfig.json. CRA picked it up on build, but that wasn't sufficient to get things working. It specifically wanted a react-app-env.d.ts file, which is normally located in src. I also had to update my file entry search glob to include .ts/tsx files, and realized I needed to modify the babel-loader entries to correctly process code inside of the modules folder. Highlights of those changes:

craco.config.js

const glob = require('glob');

const appPaths = require('./paths');
+const { getLoaders, loaderByName } = require("@craco/craco");

// omitted

craPaths.appPublic = path.resolve(appPaths.modules, 'core/client/public');

+craPaths.appTypeDeclarations = path.resolve(appPaths.modules, 'react-app-env.d.ts');

require.cache[craPathsPath].exports = craPaths;

// omitted
module.exports = {
  webpack: {
    configure: (webpackConfig, ctx) => {
      // omitted
      const { hasFoundAny, matches } = getLoaders(webpackConfig, loaderByName('babel-loader'));

+     if (hasFoundAny) {
+       matches.forEach(entry => {
+         const { include } = entry.loader;
+         if (typeof include === 'string' && include.includes('src')) {
+           entry.loader.include = [include, appPaths.modules];
+         }
+       });
+     }

      if (ctx.env === 'production') {
        // set build output folder
        webpackConfig.output.path = craPaths.appBuild;
      }

      return webpackConfig;
    }
  }
}
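
The react-app-env.d.ts file itself is just the standard one-liner that CRA normally generates in src, dropped into the modules folder to match the overridden path:

modules/react-app-env.d.ts

/// <reference types="react-scripts" />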

That let me write an initial React dummy component in TSX and manually call ReactDOM.render() to put it somewhere on the page, including use of a CSS Modules file to style it.

Wrapping React Components with react2angular 🔗︎

The key to this incremental migration effort was react2angular, a library that accepts a React component type, wraps it in an Angular component directive, and lets you pass through binding values and injected values as props.

General usage of react2angular looks roughly like this:

import React, { useState } from 'react';
import angular, * as ng from 'angular';
import { react2angular } from 'react2angular';
 
import { SomeAppService } from '../../../someFeature/client/someAppService.client.types';
 
// Define type declaration of props object for this component
interface MRCProps {
  SomeAppService: SomeAppService;
  Notification: ng.uiNotification.INotificationService;
  isEditing: boolean;
}
 
const MyReactComponent = (props: MRCProps) => {
  const showItemName = async () => {
    // query data from an Angular Resource service
    const response = await props.SomeAppService.query({ 'some.field': 123 }).$promise;
    const [item] = response;
     
    if (item) {
      props.Notification.info({
        message: `Item name is: ${item.name}`
      })
    }
  }
   
  return (
    <div>
      Are we editing something? {props.isEditing}
      <button onClick={showItemName}>Show an Item Name</button>
    </div>
  )
}
 
angular
  .module('core')
  .component(
     // Register a camelCased component name with Angular.
     // It will then be available as a kebab-cased tag name in templates, like <my-react-component>
    'myReactComponent',
    react2angular(
      // Pass a reference to the component type itself
      MyReactComponent,
      // An array of camelCased names of template attributes to bind as props.
      // These must be used as kebab-cased attributes in the template, like 'is-editing'
      ['isEditing'],
      // Array of AngularJS injection services to be passed in as props
      ['SomeAppService', 'Notification']
    )
  );

What really confused me for a while is that AngularJS wants everything in its templates to be kebab-cased, not camelCased. So, even though my component accepts an isEditing prop, and the binding names array takes an 'isEditing' string, the attributes and the tags in the template have to be kebab-cased like is-editing and <my-react-component>:

<div>
  <my-react-component is-editing="vm.someObject.editing"></my-react-component>
</div>

Building a New Feature with React and TypeScript 🔗︎

Now that I had a solid foundation in place, I had a new feature I needed to add: a moderately complex form that let the user edit a specific array field in a much larger domain object. We already had a page that let the user edit the rest of the fields, but we'd added a new field containing a specific array of items to the data model, and needed to let the user add, remove, and visualize the items in that array.

Using React+TS, I was able to build out that feature over the next couple weeks. This went far faster than it would have if I'd had to take the time to learn enough AngularJS to write it that way. And, by writing it in React+TS, I was able to write code that was much more solid and predictable, and I found numerous typos and mistakes as soon as I saved and compiled instead of waiting until runtime.

In the process, I had to come up with some rather hacky workarounds for specific use cases.

Syncing Data Updates between AngularJS and React 🔗︎

Data and functions can be passed down from AngularJS controllers / scope into your React components as props. However, if the React code needs to update data that lives on the AngularJS side, things get tricky:

  • AngularJS relies on direct mutation of existing values, like vm.someObject.someField = 123 . In the React ecosystem, code generally assumes that updates are done immutably, by copying data to create new references so that you can detect changes by comparing if(newItem !== oldItem).
  • react2angular specifically will only force a re-render of the wrapped component if the values being passed in have changed to new references. So, if you're injecting something like <my-react-component some-object="vm.someObject"> , and the controller mutates vm.someObject.someField = 123 , then react2angular won't see that things have changed and won't force the React component to re-render.
  • One of the key rules of React is that components should never mutate data they receive as props, because they don't own those values conceptually.

But, sometimes you have to break the rules :)

In the case of this form, the domainObject reference in the controller is an AngularJS Resource object, which will serialize itself to the server when told to save changes. Fortunately, only this form (which was written in React) cared about the new domainObject.subitems field, since it's a new addition. The problem was that I needed to ensure that domainObject.subitems always pointed to the latest version of the array, so that if the user clicked the "Save Changes" button, it would have that data ready to include in the request to the server.

I implemented this by:

  • Copying the domainObject.subitems array into React component state when my outermost component is initialized
  • Every time that local component state value gets updated, actually mutating props.domainObject.subitems = newDataArray to ensure the data is updated on the AngularJS side

To make this more complicated, if the AngularJS code was trying to databind to that array somewhere in the UI (like showing the list of items), it also wouldn't know that anything had changed, because the mutation occurred outside of its own event handlers. So, we have to use AngularJS's $scope.$apply() API to force Angular to look through its data and detect any changes. (I later had to switch to a safeApply() wrapper function to avoid cases where $scope.$apply() might be called twice in a row.)
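
The safeApply() wrapper follows the classic community pattern: only kick off a digest if one isn't already in progress. Here's a sketch of that pattern (the exact implementation in our app may differ slightly):

  export function safeApply($scope, fn) {
    const phase = $scope.$root.$$phase;
    if (phase === '$apply' || phase === '$digest') {
      // A digest is already running; just evaluate the function (if any) inside it
      if (typeof fn === 'function') {
        $scope.$eval(fn);
      }
    } else {
      $scope.$apply(fn);
    }
  }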

The resulting logic looked like this:

export const DomainDataEditingContent = ({
  domainObject,
  isEditing,
  $scope,
  Notification
}: ACProps) => {
  const [itemsState, itemsDispatch] = useReducer(itemsReducer, undefined, calculateInitialItemsState);
 
  const wasPreviouslyEditing = usePrevious(isEditing);
  const domainObjRef = useRef<DomainObject>();
  const [, forceRender] = useReducer(s => s + 1, 0);
 
  useLayoutEffect(() => {
    domainObjRef.current = domainObject;
  });
 
  useLayoutEffect(() => {
    if (isEditing) {
      if (wasPreviouslyEditing) {
        // Interop between AngularJS and React.
        // AngularJS wants direct assignments to the `domainObject` object to update it,
        // while React relies on immutable updates of state.
        // Since only this React component tree cares about the items field, we'll make
        // immutable updates internally, then directly assign to `domainObject` afterwards.
        // Don't do this at home, okay, kids? :)
        domainObject.subitems = itemsState.items;
        // Just in case, force Angular to acknowledge we've changed things
        safeApply($scope);
      } else {
        // When we start editing, ensure that we're using the latest items array in this form
        itemsDispatch(resetState(domainObject.subitems));
      }
    } else {
      if (wasPreviouslyEditing) {
        // When we stop editing, ensure that we're using the latest items array in this form
        itemsDispatch(resetState(domainObject.subitems));
      }
    }
  }, [isEditing, domainObject.subitems, itemsState.items]);
   
  // other component logic and render output here
}

Making AngularJS Services and Data Accessible to React Components 🔗︎

Only the root component in this React component tree is wrapped by react2angular - everything inside of it is just plain React. So, if a deeply nested component needs access to an AngularJS service or piece of data, you'll need to inject that into this root component, and then pass the service/data down to the child component that needs it. I'd found a post on The woes of using React2Angular to mix ReactJS components into my AngularJS App, which pointed out some of the problems this can lead to.

If it's just one or two levels down, you can pass it explicitly as props through each component. If it's more deeply nested, you should create a React context value instance using React.createContext(), put an object containing that service/data into the context as the value, read from the context in the nested child, and retrieve the pieces that component needs.

I had my root content component pull in all the services that were needed anywhere in this component tree, and either put them directly into a context object or wrap them in specific internal functions and put those into the context object:

const contextValue = {
  itemsState,
  itemsDispatch,
  SomeAppService,
  AnotherAppService,
  doOneThing,
  doAnotherThing
}
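
The provider/consumer wiring around that object is standard React context usage. A quick sketch (component and service names here are illustrative, not the real code):

import React, { createContext, useContext } from 'react';

// Created once at module level, exported so nested components can import it
export const AppServicesContext = createContext(null);

// The root component wrapped by react2angular receives the injected services as props
// and makes them available to the whole React subtree
export const DomainDataEditingRoot = ({ SomeAppService, Notification }) => (
  <AppServicesContext.Provider value={{ SomeAppService, Notification }}>
    <DeeplyNestedChild />
  </AppServicesContext.Provider>
);

const DeeplyNestedChild = () => {
  // Pull out only the pieces this component actually needs
  const { Notification } = useContext(AppServicesContext);
  return <button onClick={() => Notification.info({ message: 'Hello from React!' })}>Notify</button>;
};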

(Fun side note: while I don't have Redux added to this app yet, I did pull in our official Redux Toolkit package just so that I could make use of createSlice with useReducer! The logic for this form turned into a fairly complex reducer, and I needed it to be well-typed. createSlice does that work amazingly well, and you can use the reducers it generates with useReducer, not just with a Redux store.)
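
To sketch what that looks like (loosely modeled on the itemsReducer and resetState names used above, but not the actual slice):

import { createSlice } from '@reduxjs/toolkit';
import { useReducer } from 'react';

const itemsSlice = createSlice({
  name: 'items',
  initialState: { items: [] },
  reducers: {
    // Immer lets us "mutate" draft state here while still producing immutable updates
    itemAdded(state, action) {
      state.items.push(action.payload);
    },
    resetState(state, action) {
      state.items = action.payload || [];
    }
  }
});

export const { itemAdded, resetState } = itemsSlice.actions;

// Inside the component, the generated reducer plugs straight into useReducer:
// const [itemsState, itemsDispatch] = useReducer(itemsSlice.reducer, { items: [] });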

Results 🔗︎

Network Improvements 🔗︎

I measured the performance of our app in a prod-like setting:

  • With the original build tooling, loading a particular dashboard page resulted in 250 network requests (!), most of which were individual JS files or AngularJS HTML templates, about 1.3MB transferred, and it took about 6.5 seconds to load
  • With CRA's build tooling in place, only 22 requests were made (including several images), 860K transferred, and it loaded in about 2.5 seconds.

Granted, this is an internal dashboard app and not some public ecommerce site, so I'm not worried about bundle sizes and perf metrics atm. But, hey, 90% fewer requests and roughly a third less data transferred, just by swapping build tooling? Sure!

Incremental Migration Capabilities 🔗︎

Now that we've got the ability to write code in React+TS, we can do incremental migration of parts of the codebase in-place, or add new features and have confidence that the code is reasonably safe if it compiles.

Developer Experience 🔗︎

The app now loads faster in dev thanks to fewer requests, and we can see meaningful lint warnings and TS compiler errors in the CRA dev server. Also, we can now actually import NPM packages, CSS, and images directly into the bundle for processing as needed, write ES modules, and import shared code across the client codebase.

Future Improvements 🔗︎

Codebase Cleanup 🔗︎

I've added Prettier and enabled it for all our TS files, but haven't reformatted the codebase yet. I'll probably pull the trigger on that in the near future. (Debating whether to try converting all the IIFE files to ESM in the process.)

Removing Gulp 🔗︎

I've already yanked out most of the Gulp tasks, including dropping it entirely from running the server in production. At this point it's only starting the app server in dev and running linting, and both of those can be done as separate scripts.

Server Conversion to TypeScript 🔗︎

There's a ton of server logic here, and it's all plain JS. I'd at least like to play with converting some of the key files to TS, and adding a few more hand-written typings files to the codebase for the rest of our data models.

Long-Term Migration 🔗︎

Long-term, my goal is to replace this codebase entirely, probably with something built on Next.js. I hope to research setting up a new codebase in the near future, and we'll likely work on migrating functionality over one page at a time, with the individual pages being reverse-proxied from our existing server process to the Next.js process and loaded in an iframe for ease of migration.

Final Thoughts 🔗︎

I'm honestly both amazed and disgusted by how well this worked out :) I did the initial Bower->Yarn and Gulp+Webpack work earlier over a few days, but it only took me 3 hours of fiddling with config stuff to make CRA build this app as the proof-of-concept. (In fact, looking at the clock... I spent more time writing the first version of this blog post than I did setting up CRA initially!)

That said, the only reason I was able to pull this off was all my existing knowledge of CRA's internals. I already knew that this should be possible, and the pieces that I'd need to make this happen - I just had to figure out the actual mechanics of how to do it.

Stepping back for a minute, this says a lot about the value of experience in a career. The tasks we work on and the problems we solve heavily influence what we're able to do down the road and how quickly we can solve future similar problems.

I don't expect the main points of this post to be directly relevant to many people, given the scarcity of AngularJS codebases in 2020, but hopefully some of the ideas and techniques that I showed here will be potentially useful.

I know I didn't completely show my work with the various code snippets - the hazards of trying to explain stuff I've done at the day job on codebases that aren't public :) Hopefully the examples and explanations are sufficient to give the idea. If you've got any questions or want to know more specifics, please leave a comment or ping me on Twitter, and I'll see if I can provide more details.

