Do not use global cache for createCache() to avoid memory leaks #3110

Open
wants to merge 3 commits into main

Conversation


@functino functino commented Sep 21, 2023

What:

Currently createCache() seems to use a global cache that is reused even if you create multiple caches. In an SSR scenario this means that even if you follow the recommendation to create a cache per render, you will still end up with a shared cache.
Also, getServerStylisCache currently returns an accessor function, but later it is used as if it were a plain object.

Why:
I think the current approach can cause memory leaks in bigger applications. Since serverStylisCache is additionally used as a plain object, I think the cached values will never be garbage collected. This causes issues in apps that have a lot of CSS and/or use a lot of dynamic styles.

How:
I want to change the return type of getServerStylisCache to a simple cache object. This is how it is already used. We could also try to use the accessor function. But I wouldn't know how to write to the cache then.

Also I want to move the call to getServerStylisCache inside the function body of createCache. This way we will get a new instance of the cache for every request.
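For illustration, a simplified sketch of the shape of the change (this is not the actual @emotion/cache source; only the names getServerStylisCache, serverStylisCache and createCache are taken from the existing code):

```js
// Roughly how it works today: a memoizing factory at module scope hands out
// the same server cache for the same plugins array, so its entries outlive
// any single createCache() call.
const serverCaches = new WeakMap()
function getServerStylisCache(plugins) {
  if (!serverCaches.has(plugins)) {
    const cache = {}
    serverCaches.set(plugins, name => cache[name]) // accessor function
  }
  return serverCaches.get(plugins)
}

// Proposed shape: build a plain cache object inside createCache(), so a cache
// created per request really is per request and can be garbage collected.
function createCache(options) {
  const serverStylisCache = {} // read and written as serverStylisCache[name]
  // ...the rest of createCache would use serverStylisCache[name] directly
  return { key: options.key, serverStylisCache }
}
```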

I'm not sure how to add tests for this. And I'm also not sure if my assessment of what the current code is doing is correct. Maybe someone can help to shed some light on it.

Checklist:

  • Documentation
  • Tests
  • Code complete
  • Changeset

We are never using the accessor function that is returned.
Later in the code base we access it like this: `serverStylisCache[name]`, not like this: `serverStylisCache(name)`.

While we could change the access,
I'm not sure how we would write to the cache.
So I think we can just directly return the cache object.
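A tiny illustration of that read/write asymmetry (not the actual code, just the two shapes being compared):

```js
// Shape 1: an accessor function. Call sites can read through it, but there is
// no obvious way to write new entries from the outside.
const makeAccessor = () => {
  const cache = {}
  return name => cache[name]
}
const accessor = makeAccessor()
accessor('animation-abc') // read only

// Shape 2: a plain object. The same call sites can both read and write.
const serverStylisCache = {}
serverStylisCache['animation-abc'] = '.css-abc{opacity:1;}' // write
const cached = serverStylisCache['animation-abc']           // read
```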
Currently we share caches even for multiple createCache() calls.

So if you have an SSR app that creates a cache on the server like this:

const cache = createCache({ key: 'custom' });

and you do this (as recommended) on every request, you still share a cache
between all requests. I also think this is currently a memory leak.
If you have a big application and maybe even lots of dynamic styles
the cache can get quite big and will never be garbage collected.
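For context, a typical per-request SSR setup looks roughly like this (the imports and helper names follow the emotion SSR docs as I understand them, so treat them as assumptions rather than code from this repo):

```jsx
import { renderToString } from 'react-dom/server'
import { CacheProvider } from '@emotion/react'
import createCache from '@emotion/cache'
import createEmotionServer from '@emotion/server/create-instance'

function handleRequest(App) {
  const cache = createCache({ key: 'custom' }) // fresh cache for this request
  const { extractCriticalToChunks, constructStyleTags } = createEmotionServer(cache)

  const html = renderToString(
    <CacheProvider value={cache}>
      <App />
    </CacheProvider>
  )
  const styles = constructStyleTags(extractCriticalToChunks(html))

  // `cache` should become collectable after this returns, but the shared
  // module-level serverStylisCache behind it keeps growing across requests.
  return `<!DOCTYPE html><html><head>${styles}</head><body>${html}</body></html>`
}
```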
@changeset-bot

changeset-bot bot commented Sep 21, 2023

🦋 Changeset detected

Latest commit: c417aac

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package
Name Type
@emotion/cache Patch


@codesandbox-ci

This pull request is automatically built and testable in CodeSandbox.


Latest deployment of this branch is based on commit c417aac.


@codecov

codecov bot commented Sep 21, 2023

Codecov Report

Merging #3110 (c417aac) into main (f3b268f) will not change coverage.
The diff coverage is 100.00%.

Files Changed Coverage Δ
packages/cache/src/index.js 97.80% <100.00%> (ø)

@jrosspaperless

My coworker traced a memory leak in our application down to this same location. He found that if we downgrade stylis and pass in our own array of stylis plugins, instead of relying on the one built into emotion, the memory leak is resolved.

We believe it's because the WeakMap references to the static array created in emotion's configuration end up ballooning the cache, since it does not recognize the key being passed.

Since the only purpose of the cache is SSR, and moving any potential caching benefit into a method would mean never being able to benefit from the cache, I think the best option would be to remove the caching concept entirely.

Coming across your PR has helped us to confirm our suspicions and we really appreciate your taking the time to do this!
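To make the WeakMap point concrete, here is a standalone illustration of the mechanism being described (plain WeakMap semantics, not emotion code):

```js
// A WeakMap entry can only be collected once its *key* becomes unreachable.
const cachePerPluginsArray = new WeakMap()

// Case 1: a module-level static array as the key. The key lives as long as
// the module does, so whatever is cached under it lives forever too.
const staticPlugins = []
cachePerPluginsArray.set(staticPlugins, {}) // grows with every unique style

// Case 2: a fresh array per request as the key. Once the request finishes and
// the array is unreachable, the entry becomes eligible for garbage collection.
function handleRequest() {
  const perRequestPlugins = []
  cachePerPluginsArray.set(perRequestPlugins, {})
  // ...render...
} // perRequestPlugins (and the cache keyed on it) can be collected here
```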

@vgeyvandov

vgeyvandov commented May 2, 2024

@functino This is a great find!

As @jrosspaperless mentioned, we had a similar issue with Emotion during SSR, and have found that providing our own stylisPlugins array (a new instance of the array each render cycle) with the default stylis prefixer resolves the issue of the cache building up over time.
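In code, that workaround looks roughly like this (assuming prefixer is importable from stylis, as in stylis v4; treat the exact import as an assumption):

```js
import createCache from '@emotion/cache'
import { prefixer } from 'stylis'

// Build the plugins array inside the per-request code path so the WeakMap key
// is a new object on every render cycle and old entries can be collected.
function createRequestCache() {
  return createCache({
    key: 'custom',
    stylisPlugins: [prefixer] // new array instance per request, default prefixing kept
  })
}
```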

After debugging further, we found that an even simpler solution within the Emotion cache module would be to update this line to be the following:

const stylisPlugins = options.stylisPlugins || [prefixer]

And then delete this line entirely since this variable will no longer be used.

This way, the WeakMap cache receives a new instance of the plugins array as the key for the cache, and the garbage collector is able to clean up the key and its entries, since a new instance of it is set during each SSR render cycle.

The root issue seems to be that the current defaultStylisPlugins variable is declared globally and is being used as the default key, such that it is never garbage collected and all new cache entries during each SSR cycle end up tied to defaultStylisPlugins and are never cleaned up.

Edit: I do also want to mention that our proposed "fixes" for this issue do call into question the usefulness of the serverStylisCache for SSR, as it does not come with any limits or eviction system (the cache will just grow forever with each unique page hit over time). Perhaps the ultimate solution here is to propose some kind of maximum entries concept and an eviction policy for it.
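If the cache is kept, the "maximum entries plus eviction" idea could be as small as an LRU built on a Map. The sketch below only shows the shape of such a policy and is not a concrete proposal for emotion's internals:

```js
// Minimal LRU-style cache: Map preserves insertion order, so the first key is
// the least recently used one and can be evicted once the cap is reached.
function createBoundedCache(maxEntries = 10000) {
  const entries = new Map()
  return {
    get(name) {
      if (!entries.has(name)) return undefined
      const value = entries.get(name)
      entries.delete(name) // refresh recency by re-inserting
      entries.set(name, value)
      return value
    },
    set(name, value) {
      if (entries.has(name)) entries.delete(name)
      entries.set(name, value)
      if (entries.size > maxEntries) {
        entries.delete(entries.keys().next().value) // evict the oldest entry
      }
    }
  }
}
```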

@szagi3891

What's next? It would be great if this PR were merged.

@Andarist
Member

Could somebody familiar with the issue describe step by step how the leak is created here? I admittedly haven't looked into this code yet, but a concise description of the problem would help me.

@vgeyvandov

@Andarist I have some details in my comment above, but the root issue seems to be that the defaultStylisPlugins variable, which is used as the default cache key for the server Stylis cache WeakMap, is declared globally.

When Emotion consumers do not create their own cache, create a custom cache without specifying a custom stylisPlugins option, or set stylisPlugins to a globally defined array, this cache will grow with each unique SSR request. Since the cache has no entry limits or eviction policy, memory consumption slowly rises until the server is restarted.

@functino
Author

Could somebody familiar with the issue describe step by step how the leak is created here? I admittedly haven't looked into this code yet, but a concise description of the problem would help me.

Hi @Andarist thanks for looking into it.

The issue is that regardless of how often you call createCache(), it actually creates only ONE serverStylisCache.
Now let's assume we have a big app:

  • it is SSR rendered
  • we call createCache() on every request
  • we have lots of styles and some of them are dynamic (something like opacity: Math.random() in a worst case)

Then the cache will be shared between all those requests AND it will constantly increase in size without ever getting garbage collected.
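A hypothetical worst case with dynamic styles, to show why that shared cache only ever grows (the component is made up for illustration):

```jsx
/** @jsxImportSource @emotion/react */

// The style changes on every render, so every request serializes a brand new
// rule with a new hashed class name.
function Flicker() {
  return <div css={{ opacity: Math.random() }}>flicker</div>
}

// With one module-level serverStylisCache shared by all requests, each of
// those unique rules becomes another entry that is never evicted and never
// garbage collected, so memory rises until the server is restarted.
```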
