Anton Korzunov

IE11 and the Missing Polyfills

It was a beautiful sunny day, and our brand new site was working well, as it usually does. However, nobody (except us) knew how cool it was, yet 😉. Our startup was in stealth mode.
We had no traffic, no customers, and, obviously, no worries.

Everything was perfect - the code was DRY, KISS, and fast, best practices were applied, dependencies were up to date, and even the UX design was not that bad.
And it was launch day!

All systems ready! Prepare! Launch 🚀🚀🚀!
3 seconds! Telemetry! All green! 10 seconds! 60 seconds!
🥳 We are online! 🥳

We were confident - everything was ok. Of course, there was no reason to worry - we had proof that everything was perfect: 100% unit test coverage and puppeteer-based E2E tests would not let any bug slip through.


We were online...

We were happy...

We were not expecting anything bad to happen... but it happened...

Rollbar error

...shouted Rollbar, the service we use to monitor our frontend errors.

...this and nothing more, keeping silent for the next minute.

And then it happened AGAIN! And AGAIN! And AGAIN, and our happy life was destroyed, and our belief in ourselves had vanished 😭😭😭...

...

Sounds like a scary story? Well, it was very scary, and a bit unexpected. But, looking back, we did everything to get ourselves into this trouble - we hadn't provided the polyfills required to let our so cool, so modern code work in legacy browsers, the browsers no developer would ever use, the browsers which are still around.

According to the statistics, almost 90% of your customers can usually be expected to use more or less "modern" browsers; however, in some cases, that share might be as low as 50%. It depends on who you are, where you are, and who your target audience is.

Percentage of Instagram users with ES2017 supported vs unsupported browsers, source.

BTW: React and Create-React-App still support IE9, 🦖

And we had neither made our code better for the modern browsers, shipping the more compact and fast "ES6" which old browsers are absolutely unable to understand, but the new ones could benefit from; nor made our code compatible with those "old" browsers, shipping everything in "ES5" and adding the "missing pieces", known as polyfills, without which nothing would work as expected.

Polyfills are in fact a very bad thing, because they are a very big thing. Don't add them unless you need them. (well, so we didn't)

I would ask you one thing: what is better - to provide the best experience possible for the majority of your customers, around 90%, and let the others suffer... or to provide the same "not great" experience for everyone, including that "majority"?

And would you be surprised if I told you that no matter what you do, you will pick the first way? There are always people who cannot run as much JavaScript as you are sending, or settings and environments where JS is disabled altogether.

If not JavaScript, then CSS - maintaining a perfect result across different browsers, when some of them just don't (yet) support something, is hard and (that's the truth) economically inefficient.

Even more - the common "target" for a bundle is "last 2 versions + Firefox ESR + IE11". So IE11 is the minimum requirement the vast majority of sites are designed to work with.
But what if I am using IE10, or QQ Browser Mobile? (I have no idea what exactly it is, but it's not quite a "standard" thingy).

So here is the point: it would be better for you to handle the bigger part of your customers in the best possible way - i.e. ship code as modern as possible. However, you should always be ready to ship de-modernized bundles for your other users, who should not be forgotten.

de-modernized. de-gradated. de-evoluted. devoluted

PS: Have you heard about “graceful degradation”? Not a new thing.

🦎 -> 🦖

However, this story is not about modern bundles from an es5/es6/es7 perspective. This story is about polyfills. And polyfills - language feature polyfills, as well as web platform polyfills - can be quite a big thing (and we are trying to make this "efficient").

I still remember my PTE English exam, where you have to explain a random picture or graph. What could you tell me looking at this picture?

Bundle separation

By looking at this picture (which I've borrowed from Smart Bundling), there are 4 points I want to highlight:

  • you don't have to ship polyfills to a browser which supports these features. Polyfills exist to polyfill something that is missing.
  • you don't have to ship a polyfill which is not going to be used right away. You need it only when it's actually required.
  • and you have to have all the "missing functional parts" in place when they are needed, or your code will produce a runtime exception.
  • there is no way to automatically detect which parts are missing 🤔. Well, that's not clearly visible from the image, but it is true.

The missing parts

Let's imagine you use vanilla JavaScript. You call Object.entries, and it just works. For you. However, it will not work in IE11 - that's a sad but obvious fact.
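A minimal illustration of the failure mode (a sketch, the data is made up):

// runs fine in any evergreen browser…
const totals = Object.entries({ clicks: 10, views: 200 }); // [["clicks", 10], ["views", 200]]

// …but IE11 never implemented Object.entries, so there it is `undefined`
// and the line above throws a TypeError at runtime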

You may see the error in the logs, and add @babel/polyfill for the first time. It's the first thing to do, and the first result in a Google search. That fixes the problem, but adds way too much stuff you don't need - like all possible polyfills.

There should be a better way.

useBuiltIns

So, you google deeper, and find that babel can magically make everything better - just use useBuiltIns: "entry"

{
  "presets": [
    [
      "@babel/preset-env",
      {
        "useBuiltIns": "entry"
      }
    ]
  ]
}

What does it do? It replaces @babel/polyfill with the polyfills actually required by the target system, sometimes halving their count. Once you have configured "modern + IE11" targets, it will remove polyfills for IE9-10, as well as A LOT of polyfills for Android-related browsers.
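To make it concrete, here is roughly what entry mode does (a sketch; the exact list of injected modules depends on your browserslist, and the names below - core-js@3 style - are just illustrative):

// what you write once, at your entry point:
import "core-js/stable";
import "regenerator-runtime/runtime";

// roughly what preset-env leaves behind for a "modern + IE11" target:
import "core-js/modules/es.promise.finally";
import "core-js/modules/es.object.entries";
import "core-js/modules/web.dom-collections.iterator";
// …and so on - only what at least one of the targets is missing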

However, that "half" might still include stuff you are NOT using, and there is another option to tackle this - usage

        "useBuiltIns": "usage"

usage is a bit smarter than entry - it adds polyfills only for the stuff you actually use, halving the already halved size.
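Roughly, it keys off what each file references (a sketch; the injected module name is illustrative):

// src/stats.js - what you write:
export const toPairs = (obj) => Object.entries(obj);

// roughly what "usage" produces when IE11 is among the targets:
import "core-js/modules/es.object.entries";
export const toPairs = (obj) => Object.entries(obj);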

  • haven't used WeakSets? Removed!
  • no RegExp? Removed!
  • using Symbols? Polyfills!
  • not using String.repeat? Re... Well...

What's not so great...

Actually "useBuiltIns": "usage" is not removing anything - it is adding. It somehow detects that stuff got used, and works it out.

Usage has two design problems:

  • it's actually not quite "smart", because "JavaScript". I mean, "JavaScript" is the reason why it works in a not-so-great way.
    • If you do anything.description it would add the polyfill for Symbol.description, cos ".description" 😉
    • If you do Symbol.toStringTag it will add:
      • es.symbol.to-string-tag
      • es.math.to-string-tag
      • es.json.to-string-tag
    Cos, you got it, .toStringTag 😉. It really doesn't know the types - JS is not a typed language, it's 🦆 Duck Typing. If something quacks like toStringTag - get it polyfilled! (a tiny sketch of this follows below)
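Here is that false positive - nothing Symbol-related at all, yet the property name alone is enough to trigger the polyfills:

// just a plain object with an unlucky property name…
const meta = { toStringTag: "just a string field" };
console.log(meta.toStringTag);
// …but "usage" matches on `.toStringTag` and still injects
// es.symbol.to-string-tag (and friends) - harmless, just heavier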

There is no way to fix it - false positives would not break your application, while false negatives might. So - live with it.

This is not a "real" problem. You might just get more polyfills than you really need, but still fewer than with entry mode.
And, the main difference - you will get the required polyfills where you need them, not at your entry point. So this mode is code splitting's best friend.

  • the second problem is more severe. usage is about "usage", and "usage" only within files "under babel management". If some of your node modules require a polyfill - it will not be detected, and you will have to add it manually. Hopefully before shipping stuff to production. Well, like we did. 🥳

Sometimes you can work around this by extending babel to the whole of node_modules, but that's not always an option.

This is, however, enabled by default for create-react-app as well as parcel.
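If you own the webpack config, "extending babel to node_modules" usually just means dropping the usual exclude from babel-loader (a sketch, assuming webpack + babel-loader; expect slower builds):

// webpack.config.js
module.exports = {
  module: {
    rules: [
      {
        test: /\.m?js$/,
        // no `exclude: /node_modules/` here, so dependencies are transpiled
        // too and their polyfill "usage" is detected as well
        use: { loader: "babel-loader" },
      },
    ],
  },
};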

CoreJS2 and CoreJS3

In fact, there are two useBuiltIns usage plugins - one for corejs-2 and one for corejs-3.
v3 "detects" many more cases, which is good from one point of view - you are more "safe" - but from another, it leads to a much higher rate of false positives.

Theoretically, the whole corejs3 is 50kb gzipped, but you might need only 2kb of it.
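Which core-js the plugin targets is controlled by the corejs option of preset-env (core-js itself has to be installed as a regular dependency):

{
  "presets": [
    [
      "@babel/preset-env",
      {
        "useBuiltIns": "usage",
        "corejs": 3
      }
    ]
  ]
}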

Takeaways?

  • use @babel/polyfill or the underlying core-js to make your application compatible with a wide range of customer browsers, including aged or buggy ones.
  • use @babel/preset-env with useBuiltIns: "entry" to safely reduce the number of polyfills sent.
  • use @babel/preset-env with useBuiltIns: "usage" to UNsafely reduce the number of polyfills sent even more.
  • 😉 don't forget - using only one bundle for all customers makes this sort of optimization inefficient, because too many polyfills, prepared for the "legacy targets", would be sent to the "modern targets". As well as less compact js code.

8 polyfills (24kb) were needed for the modern bundle, and 37 (154kb) were required for IE11 in our case. Sizes are before gzip.

listing polyfills

⚠️ spoiler alert ⚠️

Core Duo

So, to get something measurable from shipping the right polyfills to the right client, you have to send different code to different clients.

There are simple ways to do it:

  • use polyfill.io to automatically deliver all the required polyfills. A one-line fix 😉 (see the snippet after this list). One more blocking script in your head 🤔.
  • use pika to deliver legacy/modern bundles. Sounds just amazing 🥳. You'd probably have to change your whole build pipeline, though 🤓.
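The polyfill.io one-liner looks roughly like this - the service inspects the User-Agent and returns only what that particular browser is missing (the feature list here is just an example):

<script
  crossorigin="anonymous"
  src="https://polyfill.io/v3/polyfill.min.js?features=default,Object.entries,Promise.prototype.finally">
</script>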

There is a slightly harder way - use double bundling or multicompiler mode to create bundles for different targets. That's the best you can get, but it's hard to manage in terms of code-splitting, prefetching and service-workers.
parcel2 promises to get it working out of the box; time will show how useful it is.

There is another question to ask yourself -

Which bundle to build?

And how does this "double bundling" work, which operations are required to make your code compatible with browsers, and what's the goal...

And that's simple, really simple - modern browsers are able to run your code as-is.

Babel, as well as TypeScript, is here to create a "lower" version of your code (babel was initially named 6to5 - it converted es6 to es5, you got it 😉).

The idea of bundling is to take your files, combine them together, and create a version for a "lower target". Like es5, digestible by any browser. Well, digestible not only with the language "downgraded", but also with the "missing pieces" polyfilled - keep that in mind.

Double-bundling does exactly that, just twice - first for one target, and then for another. Modern and legacy. Module and nomodule.
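One possible shape of such a setup is a single babel config with two env sections and the bundler run twice (a sketch - it assumes you pick the env via BABEL_ENV and write each pass to its own output folder):

// babel.config.js
module.exports = {
  env: {
    modern: {
      presets: [["@babel/preset-env", { targets: { esmodules: true } }]],
    },
    legacy: {
      presets: [
        ["@babel/preset-env", { targets: "ie 11", useBuiltIns: "usage", corejs: 3 }],
      ],
    },
  },
};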

And I think there is a more efficient way to handle it

🦎 -> (devolution) -> 🦖

The idea behind devolution is simple:

  • you compile your bundle, the one you can run in your browser. The "modern" one.
  • devolution takes it as an input, and produces the legacy output, with the language version "downgraded" and the required polyfills added.
  • it does this faster than the bundler would, with easier configuration, although at some cost to the final bundle size.

Let's go step by step:

you are compiling your bundle to a modern target

Just do it. Pick the esmodules target, which targets browsers with "module" support, or pick an even higher target, without old Safari included. Feel free to use preset-modules, which creates more compact es6 code than preset-env, but does not add any polyfills.
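A minimal preset-env config for that first, modern pass could be as small as this (swap the preset for @babel/preset-modules to get the more compact output mentioned above):

{
  "presets": [
    ["@babel/preset-env", { "targets": { "esmodules": true } }]
  ]
}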

devolution takes it as an input, and produces the legacy output

Run yarn devolution and it will first create a self-documented .devolutionrc, letting you configure absolutely everything.
The second run will create a legacy bundle.

Well, asciinema is not looking good at dev.to content width :(

The process is split into a few steps:

  • detecting the required polyfills, using a port of babel's usage plugin.
  • adding missing polyfills, as well as elaborating what is required where
  • recompiling the code to another target - in fact, devoluting it
  • re-minification of the resulting code, to keep it compact
  • and that's all.

There is only one piece left - pick the right bundle to use, and that's easy - just load the right one; everything else, including the __webpack_public_path__ update, is already inside.

<script>
  // feature-detect ES module support: only modern browsers know `noModule`
  var script = document.createElement('script');
  var prefix = !('noModule' in script) ? "/ie11" : "/esm";
  script.src = prefix + "/index.js";
  document.head.appendChild(script);
</script>

This is the main difference between the approach taken by devolution and common "double bundling" - devolution produces two structurally identical directories; the only difference is the public path.

The process works quite fast, because:

  • all the heavy lifting has already been done by the bundler
  • each file is managed in a separate thread, so if you are using code splitting the process might be quite fast.
  • you can opt in to swc, making compilation 10 times faster - there is no need to apply any babel plugins again, they are already applied, so we are able to use more specialised tools.
  • bonus: you are getting polyfills for the "whole" bundle - all your node_modules are covered. As well as es5 for the whole bundle - if you are using some es6-based @sindresorhus modules, like query-string, they would just work without any extra configuration!
  • another bonus: it does not matter which framework or bundler you are using - this works at deployment time. So it would work even for closed systems like CRA.

In numbers - it takes 10 seconds to handle a bundle that takes 30 seconds to build 😒, and 30 seconds to handle a bundle that takes 20 minutes to build 😃.

Bonus - you might use preset-modules, which does not add any polyfills, to create a bundle, and then use devolution to add the required ones to your "esm bundle".

The point

The resulting bundles are the same. They are just lying in different directories. You can use __webpack_public_path__ to control which one is loaded, while parcel would work out of the box.

Read an article about shipping module/nomodule bundles for details:
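For reference, the module/nomodule pattern those write-ups describe boils down to two script tags (paths reused from the loader snippet above):

<!-- modern browsers load the type="module" bundle and ignore `nomodule` -->
<script type="module" src="/esm/index.js"></script>
<!-- legacy browsers skip type="module" and fall back to this one -->
<script nomodule src="/ie11/index.js"></script>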

The real conclusion

  • you have to ship polyfills to support all possible browsers your customers might use
  • it's quite a bad idea to ship all the theoretically required polyfills to everybody
  • consider separating bundles for the "modern" and "legacy" browsers, or, at least, separating the polyfills you send to your clients. Or, at the very least, use @babel/polyfill with useBuiltIns: "entry"

And keep in mind

  • you are going to send more javascript code to the aged browsers, because it will be the result of transpilation from es6-es7-es8.
  • the amount of polyfills to be sent will increase as the number of features to polyfill grows
  • the "legacy customers" will suffer in any case, because even the "modern customers" suffer - there is too much javascript around.
  • however, you might help at least the majority with a few lines

Start using double bundling. Please.
⚠️ However, removing the polyfills you don't need from the more modern bundle could have the biggest impact on the bundle size ⚠️, especially for the entry chunk.

Don't be a 🦖, let devolution handle it. At least give it a try - you can set it up in moments and start being more efficient.

yarn add --dev devolution
yarn devolution [source-dist] [target-dist]
# for example
yarn devolution build build

  • feel free to use modern code anywhere, node_modules included. Create as modern a bundle as you can, and devolute it as a whole.
  • be confident - all polyfills are included. devolution uses the same usage plugin @babel/preset-env uses internally, and you can configure the corejs version to use.
  • it's just a few lines to add it to your setup and start shipping separated bundles for different customers.
  • well, it's the only "safe" way to use @babel/preset-modules
  • it's 100% configurable - .devolutionrc.js lets you control almost everything
  • and it's fast - multi-threaded by nature, with optional usage of the lightning fast swc transpiler.
