Stefan Holm Olsen

Cache busting with ASP.Net MVC

UPDATE: The code samples in this blog post have been superseded. I have described a newer and simpler solution in this blog post.

Whenever a webpage links to pre-compiled frontend files, we as developers can make web browsers, as well as proxy servers, cache (save) those files for a very long time. This is done by setting an HTTP header, called Cache-Control, to a very high number of seconds. The browser can thereby avoid downloading the same files again and again, which saves the visitors from downloading a lot of extra data.
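For example, the server's response for a static file could include a header like the following, which tells the browser that the file may be kept for up to a year (the value is a number of seconds):

```
Cache-Control: max-age=31536000
```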

However, what then happens when we, the developers, make changes to one of those files on the server?

Well, a browser that already has the changed file in its own cache will not reload it, since the server specifically told it not to. When the browser determines whether or not it has a certain file in its cache, it looks at the URL of the requested file (and maybe some headers, if specified by the server). If the URL in the HTML document matches a URL in the browser cache, the file will not be downloaded from the server again.

So when an HTML document links to a file at a URL like the following, the browser may find it in its cache.

/html/dist/main.js

If the file is changed on the server, but the link remains exactly the same, the browser will not download it again. The link needs to point to a different URL to make the browser download the file again.


One way of forcing the browser to re-download a file is called fingerprinting. The technique adds a parameter to the URL of a static file. The value could be random, or it could be derived from the file. This way the browser is tricked into seeing the new URL as different from the old one. And then we tell the browser to also cache this file for a very long time.

I usually add a parameter with the date of the file’s last modification, converted to a number of ticks (a 64-bit integer). Alternatively, you could use the assembly version of the running web application, a version number that can automatically be increased when building the web solution. As far as the browsers are concerned, both methods are equally good.
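As a minimal sketch, outside of any web context and using a hypothetical file name, the two fingerprint values mentioned above could be read like this:

```csharp
using System;
using System.IO;
using System.Reflection;

class FingerprintDemo
{
    static void Main()
    {
        // Fingerprint option 1: the last-write time of the file, as ticks.
        // "site.css" is a hypothetical path, used only for illustration.
        long ticks = new FileInfo("site.css").LastWriteTime.Ticks;

        // Fingerprint option 2: the version of the running assembly.
        Version version = Assembly.GetExecutingAssembly().GetName().Version;

        Console.WriteLine($"?v={ticks}");
        Console.WriteLine($"?v={version}");
    }
}
```

Either value can be appended as a query string parameter; the only requirement is that it changes whenever the file changes.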

With the first method, the URL from before will look like this:

/html/dist/main.js?v=636343694099634717

It now includes a timestamp, which is automatically updated every time the file is modified. Since the browser considers the whole URL of the file (including query string parameters), it will not be considered the same as the following URL (which was saved later):

/html/dist/main.js?v=636343802157694301

To implement the fingerprinting I just mentioned, you can do these steps:

  1. Add an extension method to the UrlHelper class for creating a URL with a timestamp.
  2. Use the extension method in Razor views, for things like stylesheets, scripts, images etc.
  3. Configure cache headers for static content.

The following is an example of an implementation for ASP.Net MVC. Add it to your own solution.

using System;
using System.IO;
using System.Web;
using System.Web.Caching;
using System.Web.Mvc;

public static class UrlExtensions
{
    private const string FileDateTicksCacheKeyFormat = "FileDateTicks_{0}";

    private static long GetFileDateTicks(this UrlHelper urlHelper, string filename)
    {
        var context = urlHelper.RequestContext.HttpContext;
        string cacheKey = string.Format(FileDateTicksCacheKeyFormat, filename);

        // Return the ticks if they are already in the cache.
        if (context.Cache[cacheKey] != null)
            return (long)context.Cache[cacheKey];

        var physicalPath = context.Server.MapPath(filename);
        var fileInfo = new FileInfo(physicalPath);
        var dependency = new CacheDependency(physicalPath);

        // If the file exists, read the number of ticks from its last write date. Or fall back to 0.
        long ticks = fileInfo.Exists ? fileInfo.LastWriteTime.Ticks : 0;

        // Add the number of ticks to the cache for 12 hours.
        // The cache dependency will remove the entry if the file is changed or deleted.
        context.Cache.Add(cacheKey, ticks, dependency,
            DateTime.Now.AddHours(12), TimeSpan.Zero,
            CacheItemPriority.Normal, null);

        return ticks;
    }

    public static string ContentVersioned(this UrlHelper urlHelper, string contentPath)
    {
        string url = urlHelper.Content(contentPath);

        long fileTicks = GetFileDateTicks(urlHelper, url);

        return $"{url}?v={fileTicks}";
    }
}

The extension method can then be used like this:

<img src="@Url.ContentVersioned("~/html/dist/logo.svg")"/>

Which will be rendered like this:

<img src="/html/dist/logo.svg?v=636343694099634717"/>

Also use it for scripts and stylesheets, like these examples:

<script src="@Url.ContentVersioned("~/html/dist/main.js")"></script>
<link rel="stylesheet" href="@Url.ContentVersioned("~/html/dist/screen.css")"/>

To configure the cache headers, add this piece of XML to the web.config file. After that, the web server will tell browsers to cache static files for 365 days and effectively ignore any changes to those files.

  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00"/>
    </staticContent>
  </system.webServer>


That’s it. Now browsers will cache static files for a very long time. And when a file is updated, it will be re-downloaded, since it gets a new URL and is therefore a different file as far as the browser is concerned.