Self-host your static assets with a TagHelper

Make your site faster by self-hosting your static assets instead of loading them from a third-party CDN. In this article you'll learn how to build a TagHelper that makes self-hosting just as easy as using a CDN.

If you've worked in web development since before the era of npm install for including JavaScript or CSS libraries on your website, you are probably familiar with loading them from an external source like cdnjs, jsDelivr or similar, where you simply copy and paste a <script> tag onto your site to include your desired library.

This used to be considered a performant way of including libraries. The browser would often already have cached the same library from another website using the same CDN service, and CDN services are exceptionally good at matching the user with the nearest server location to deliver the files quickly.

The thing is, the benefits of using a third-party CDN don't outweigh the risks.

If the CDN is having slowdowns, your critical assets might not be delivered fast enough, and that's not even considering the possibility of the service breaking down, or simply shutting down.

Since HTTPS is now more or less the standard, you force the browser to open a new connection to the third-party host, including a DNS lookup and TLS handshake, before it can even fetch the file.

And the benefit of the browser having already cached the asset from another site is pretty much a myth. In addition to that, Safari no longer shares cached cross-domain requests between sites for privacy reasons.

So, there is no reason NOT to self-host your static assets anymore, at least if you care about the stability and speed of your site.

The simple but tedious way

Some might reach for good old npm install when dealing with third-party dependencies, and while it has its pros, it has never really been a fun way of handling this.

Using this approach, you’ll likely need some kind of build system like Webpack, Vite, Gulp or similar. And who has ever thought that was fun?

The simplest way of handling this is to manually save the desired file to your local file system and reference it in a script tag. So, you go from <script src="https://unpkg.com/alpinejs@3.10.5/dist/cdn.min.js"> to <script src="/assets/alpine.js">.

It doesn’t get much simpler than this, but in my opinion, this is tedious and boring, and you’ll probably never want to upgrade your dependency this way.

So why not use this as a chance to experiment, learn something new and help your future self?

The fun and convenient way

I wanted an effortless way of just referencing a dependency by its CDN URL, without actually having to use the CDN, while still having the option to fall back to the CDN URL in case something goes wrong.

I was inspired by the asp-append-version TagHelper, which you can attach to e.g. <script> tags, and which appends a query string with some kind of version value for the file to the URL in the src attribute.

I'm not sure how it works under the hood, but I use it as a form of cache busting. This way I can tell the browser to cache the file for a long time. Performance testing tools like Lighthouse like this, and the query string value is automatically updated when the file changes, busting the browser cache.
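To illustrate (with /js/site.js as an example path), asp-append-version turns this:

<script src="/js/site.js" asp-append-version="true"></script>

into something like this, where the v value is a hash of the file contents (shown as a placeholder here):

<script src="/js/site.js?v=hash-of-file-contents"></script>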

So, I thought: could I do something similar, just for self-hosting files? Instead of asp-append-version, could I make an our-self-host TagHelper?

The idea is that I add the script tag as I normally would when using the CDN URL, and then add the attribute for the TagHelper, like <script src="https://unpkg.com/alpinejs@3.10.5/dist/cdn.min.js" our-self-host>. The TagHelper should then download the referenced file, save it somewhere I can access it, and swap out the src attribute with the path to the newly downloaded file.
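To make the idea concrete, here is a sketch of the before and after, assuming the default ~/assets root folder you will see configured later:

<script src="https://unpkg.com/alpinejs@3.10.5/dist/cdn.min.js" our-self-host></script>

should end up rendering as something like:

<script src="/assets/alpinejs@3.10.5/dist/cdn.min.js" data-original-src="https://unpkg.com/alpinejs@3.10.5/dist/cdn.min.js"></script>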

Building a TagHelper

Building a TagHelper is well documented in the ASP.NET Core docs, so I won't go into too much detail on that. The gist of my TagHelper is that it targets all elements that have the attribute our-self-host, but it won't have any effect on anything not using a src or href attribute.


[HtmlTargetElement("*", Attributes = "our-self-host")]
public class SelfHostTagHelper : TagHelper

I then add some other members to the TagHelper: href and src for getting the values of those attributes on the tag. I want to be able to configure where on disk the asset is saved, so I add a folder attribute for this. Last, I need some way of defining the file extension, in case the URL to the static asset is extensionless. For this I have the ext attribute.


[HtmlAttributeName("folder")]
public string? FolderName { get; set; }
[HtmlAttributeName("src")]
public string? SrcAttribute { get; set; }
[HtmlAttributeName("href")]
public string? HrefAttribute { get; set; }
[HtmlAttributeName("ext")]
public string? Extension { get; set; }
// Use the src attribute if it is set; otherwise fall back to the href attribute
public string Url => SrcAttribute.IfNullOrWhiteSpace(HrefAttribute);

The result of this is that I now have a TagHelper which is activated by using the attribute our-self-host, like this.


<script src="https://unpkg.com/jquery" ext="js" our-self-host></script>

The necessary code

For this to work, I need some code that actually downloads the file specified in the src (or href) attribute. The first part is added to the TagHelper's ProcessAsync method:


public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
{
    var url = (Url.StartsWith("//") ? $"https:{Url}" : Url);
    var selfHostedFile = await _selfHostService.SelfHostFile(url, FolderName, Extension);

    if (SrcAttribute.IsNullOrWhiteSpace() == false)
    {
        output.Attributes.SetAttribute("data-original-src", SrcAttribute);
        output.Attributes.SetAttribute("src", selfHostedFile.Url);
    }
    else if (HrefAttribute.IsNullOrWhiteSpace() == false)
    {
        output.Attributes.SetAttribute("data-original-href", HrefAttribute);
        output.Attributes.SetAttribute("href", selfHostedFile.Url);
    }

    output.Attributes.RemoveAll("our-self-host");
    output.Attributes.RemoveAll("folder");
    output.Attributes.RemoveAll("ext");
}

In this part, I first make sure that there is a protocol on the URL (whether it comes from the src or the href attribute). I then use the SelfHostService (we'll get to that, carry on) to actually self-host the file, and update the tag accordingly with a new src or href attribute pointing to the now self-hosted file. For informational purposes, the TagHelper adds the original src/href as a data attribute.
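For the code above to compile, the TagHelper needs the ISelfHostService injected, and the service needs to be registered with the DI container. A minimal sketch of how that could look - the composer name is just illustrative, and the actual registration in Our.Umbraco.TagHelpers may differ:

// Constructor injection on the TagHelper
private readonly ISelfHostService _selfHostService;

public SelfHostTagHelper(ISelfHostService selfHostService)
{
    _selfHostService = selfHostService;
}

// Service registration, e.g. via an Umbraco composer
// (IComposer and IUmbracoBuilder live in Umbraco.Cms.Core.Composing
// and Umbraco.Cms.Core.DependencyInjection)
public class SelfHostComposer : IComposer
{
    public void Compose(IUmbracoBuilder builder)
    {
        builder.Services.AddSingleton<ISelfHostService, SelfHostService>();
    }
}

On top of that, the TagHelper has to be made available to your views with an @addTagHelper directive in _ViewImports.cshtml, pointing at the assembly containing it.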

Programmatically download and save the file

For the heavy lifting of the TagHelper, I have created a SelfHostService that takes the URL, optional folder name and optional extension, and downloads the file to disk on the server.

Because of this, there is only the performance hit of downloading the file once - any subsequent request will just reference my local file. Cache busting is simple too: delete the file from the server, and the next time a request hits the TagHelper, the file will be re-downloaded.


public class SelfHostService : ISelfHostService
{
    private readonly IAppPolicyCache _runtimeCache;
    private readonly IWebHostEnvironment _hostingEnvironment;
    private readonly IConfiguration _config;
    private readonly IProfilingLogger _logger; // used by the TraceDuration calls further down

    public SelfHostService(
        IAppPolicyCache appPolicyCache,
        IWebHostEnvironment hostingEnvironment,
        IConfiguration config,
        IProfilingLogger profilingLogger
        )
    {
        _runtimeCache = appPolicyCache;
        _hostingEnvironment = hostingEnvironment;
        _config = config;
        _logger = profilingLogger;
    }
}

The service contains separate methods for each part of the self-hosting logic. The main entry point, SelfHostFile(url, subfolder, fileExtension), takes care of generating the model later used by the TagHelper, which contains the external (original) URL of the file, the resulting filename on disk, the path where the file is saved, and the actual self-hosted file URL. The result is cached, so we don't need to work it out more than once.
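The SelfHostedFile model itself isn't shown here, but judging from the properties used in the code below, a sketch of it could look something like this:

public class SelfHostedFile
{
    public string? ExternalUrl { get; set; }  // the original CDN URL
    public string? FileName { get; set; }     // e.g. "jquery.js"
    public string? FolderPath { get; set; }   // e.g. "~/assets/alpinejs@3.10.5/dist"
    public string? Url { get; set; }          // the local URL, or the external URL if the download fails
}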

If an extension is specified, it is added to the filename of the local file. This is important for getting the server to serve the file with the right MIME type. E.g. using https://unpkg.com/jquery as the URL, you will get a file named jquery. To make the server and browser understand this as JavaScript, you need the .js extension. If you write the script tag as <script src="https://unpkg.com/jquery" our-self-host ext="js">, the filename will end up being jquery.js.


public async Task<SelfHostedFile> SelfHostFile(string url, string? subfolder = null, string? fileExtension = null)
{
    return await _runtimeCache.GetCacheItem($"Our.Umbraco.TagHelpers.Services.SelfHostService.SelfHostedFile({url}, {subfolder}, {fileExtension})", async () =>
    {
        using (_logger.TraceDuration<ISelfHostService>($"Start generating SelfHostedFile: {url}", $"Finished generating SelfHostedFile: {url}"))
        {
            var uri = new Uri(url, UriKind.Absolute);

            var selfHostedFile = new SelfHostedFile()
            {
                ExternalUrl = url,
                FileName = uri.Segments.Last() + fileExtension.IfNotNull(ext => ext.EnsureStartsWith(".")),
                FolderPath = GetFolderPath(uri, subfolder)
            };

            selfHostedFile.Url = await GetSelfHostedUrl(selfHostedFile);
            return selfHostedFile;
        }
    });
}

If the external file is in a subfolder, e.g. https://cdnjs.cloudflare.com/ajax/libs/bootstrap/5.2.2/js/bootstrap.min.js is in ajax/libs/bootstrap/5.2.2/js, the path will be preserved on disk to avoid collisions. This is handled by the GetFolderPath method, which combines the root folder (which is configurable) and the path from the external URL.


private string GetFolderPath(Uri uri, string? subfolder = null)
{
    var folderPath = _config["Our.Umbraco.TagHelpers.SelfHost.RootFolder"].IfNullOrWhiteSpace("~/assets");

    if (subfolder.IsNullOrWhiteSpace() == false) folderPath += subfolder.EnsureStartsWith("/");

    folderPath += GetRemoteFolderPath(uri);

    return folderPath;
}

private string GetRemoteFolderPath(Uri uri)
{
    var segments = uri?.Segments;

    // if there are more than 2 segments (the first segment is the root, the last segment is the file)
    // we can extract the folder path
    if (segments?.Length > 2)
    {
        segments = segments.Skip(1).SkipLast(1).ToArray();

        // remove trailing slash from segments
        segments = segments.Select(x => x.Replace("/", "")).ToArray();

        // join segments with slash
        return string.Join("/", segments).EnsureStartsWith("/");
    }

    return string.Empty;
}

When we know where the file is going to be located and what it will be called, we first check if the file already exists - if it does, we can skip the downloading part and simply return the new local URL for the file.

If the file doesn't exist, we try to download it using an HttpClient, and create the necessary folders on disk. If the download fails, the external URL is returned instead, so the browser can still try to fetch the external file itself. In this case, the resulting src/href attribute on our tag will be the same as the original.


private async Task<string> GetSelfHostedUrl(SelfHostedFile file)
{
    var filePath = $"{file.FolderPath}/{file.FileName}";
    var localPath = _hostingEnvironment.MapPathWebRoot(file.FolderPath);
    var localFilePath = _hostingEnvironment.MapPathWebRoot(filePath);

    if (!File.Exists(localFilePath))
    {
        using (_logger.TraceDuration<ISelfHostService>($"Start downloading SelfHostedFile: {file.ExternalUrl} to {localFilePath}", $"Finished downloading SelfHostedFile: {file.ExternalUrl} to {localFilePath}"))
        {
            var content = await GetUrlContent(file.ExternalUrl);
            if (content != null)
            {
                if (!Directory.Exists(localPath)) Directory.CreateDirectory(localPath);
                await File.WriteAllBytesAsync(localFilePath, content);
                return filePath;
            }
            else
            {
                return file.ExternalUrl;
            }
        }
    }

    return filePath;
}

// Downloads the remote file, returning null if the request fails
private static async Task<byte[]?> GetUrlContent(string url)
{
    using (var client = new HttpClient())
    {
        using (var result = await client.GetAsync(url))
        {
            if (result is not null && result.IsSuccessStatusCode)
            {
                return await result.Content.ReadAsByteArrayAsync();
            }
            else
            {
                return null;
            }
        }
    }
}

So, the next time I need a third-party library for some one-off thing, I can just do:

  • <script src="https://cdnjs.cloudflare.com/ajax/libs/animejs/3.2.1/anime.min.js" our-self-host></script>
  • <link href="https://unpkg.com/nes.css@2.3.0/css/nes.min.css" rel="stylesheet" our-self-host />
  • or even <img src="https://i3.ytimg.com/vi/xm3YgoEiEDc/maxresdefault.jpg" our-self-host folder="ytimg" />

Some tips

If you use a CDN URL that automatically serves the newest version of a library, like https://unpkg.com/jquery - don't! Always specify the version number. The our-self-host TagHelper downloads the file once, so you will miss out on the auto upgrade anyway - and you can never know when your library of choice introduces breaking changes. So always pin the version yourself!

If you are worried about getting the right file, use subresource integrity to verify the file contents against a hash. The downloading mechanism doesn't tamper with the file, so you should expect the external and the local file to be identical.
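For example, something like this - the integrity value is a placeholder, the real hash comes from the CDN or an SRI hash generator - and the check still passes for the self-hosted copy, since the bytes are identical:

<script src="https://unpkg.com/alpinejs@3.10.5/dist/cdn.min.js"
        integrity="sha384-[hash provided by the CDN]"
        crossorigin="anonymous"
        our-self-host></script>

The crossorigin attribute is only needed if the tag falls back to the external URL; it is harmless for the self-hosted file.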

Don't use this technique to self-host fonts from e.g. Google Fonts. When using Google Fonts, you are asked to link to a stylesheet from Google Fonts - a seemingly perfect candidate for this technique. But the stylesheet contains links to the actual font files, which won't be self-hosted by the TagHelper. This could be a great feature for a version 2 of the TagHelper, but until then you can use something like google webfonts helper to help with self-hosting.

Full code example

I've added a PR to the community project, Our.Umbraco.TagHelpers, with this TagHelper if you want to snoop around the code, or maybe even try it out yourself.

Use it locally too. The files get created in your local environment; I add them to git once they have been downloaded, and simply use the TagHelper as an uncomplicated way of downloading the files needed for my sites. On top of that, I keep a record of where each file actually came from.


Søren Kottal
