Please recommend a CDN with Brotli level 11 for js/css files

peixotorms Member
edited December 2020 in Help

Hi all,

Can someone please recommend a CDN service that is PAYG and comes with:

  • brotli level 11 for css and js files (files should get precompressed at the highest level by the CDN and served at the smallest size possible; see the quick check below)
  • WebP conversion from jpg and png formats for browsers that support it
  • option to set maximum image sizes for mobile and desktop (if possible)
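
One rough way to check whether a CDN meets the first requirement (hypothetical URL and file names; assumes the brotli CLI is installed) is to compare the bytes the CDN actually sends against a local brotli level 11 compression of the same asset:

    # bytes actually sent over the wire when brotli is accepted
    curl -so /dev/null -H 'Accept-Encoding: br' -w '%{size_download} bytes\n' https://cdn.example.com/app.js
    # local best-case brotli level 11 size, in bytes
    brotli -q 11 -c app.js | wc -c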

So far, I am using bunnycdn.com and I really love their service; however, their compression level really sucks.

I saw that pagecdn.com has this, but there are traffic limits per plan and on the number of sites. I would like something like BunnyCDN but with the highest compression for js and css files.

Please suggest CDN services that support these features.

Comments

  • eva2000 Veteran
    edited December 2020

    Brotli level 11 is highly CPU-intensive to compress. That's why you won't see CDN providers use such high levels. See the benchmarks I did for zstd vs brotli vs pigz vs bzip2 vs xz etc. at https://community.centminmod.com/threads/round-4-compression-comparison-benchmarks-zstd-vs-brotli-vs-pigz-vs-bzip2-vs-xz-etc.18669/

    At some point you need to balance compression speed against compression size/ratio, otherwise visitors end up spending more time waiting on compression than they save on the smaller download.
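
    If you want to see the cost yourself, a rough local comparison (any minified js file will do; the output paths are just for illustration):

    time brotli -q 4 -f -o /tmp/q4.br jquery.min.js    # fast, larger output
    time brotli -q 11 -f -o /tmp/q11.br jquery.min.js  # much slower, smallest output
    ls -la /tmp/q4.br /tmp/q11.br                      # compare the resulting sizes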

  • I am aware... however, assets can be precompressed and served instantly with brotli_static on nginx, without being resource-intensive.

    PageCDN can do it, and I've heard Akamai can too; they certainly precompress.
    Just looking for CDNs with those features, not an explanation of why few offer them.
    Thanks

  • isunbejo Member
    edited December 2020

    For speed I choose zlib from CF (Cloudflare's fork). Brotli is good for compression ratio, but slow and highly CPU-intensive.

  • eva2000 Veteran
    edited December 2020

    Yeah, I don't know many CDNs that would do brotli 11. You can get close with Cloudflare: pre-compress at the origin with gzip/zopfli level 11 compression and let Cloudflare take your pre-compressed gzip/zopfli level 11 files and serve them directly. The difference for a jquery.min.js file between brotli 11 and gzip/zopfli level 11 is ~6.3%:

    87K May  4  2020 jquery.min.precomp.js
    28K May  4  2020 jquery.min.precomp.js.br
    29K May  4  2020 jquery.min.precomp.js.gz
    

    Compression tests for gzip and brotli HTTP requests with Cloudflare CDN and a Centmin Mod Nginx origin with gzip_static and brotli_static enabled; Cloudflare respects the origin's gzip'd pre-compressed version.

    gzip on the fly with Cloudflare, serving the origin's pre-compressed version via gzip/zopfli level 11:

    curltest gzip https://$domain/cftests/jquery.min.precomp.js
    Uncompressed size : 86.07 KiB
    Compressed size   : 28.93 KiB
    

    brotli on the fly with Cloudflare

    curltest br https://$domain/cftests/jquery.min.precomp.js  
    Uncompressed size : 86.07 KiB
    Compressed size   : 29.38 KiB
    

    The brotli 11 pre-compressed size would have been:

    curltest br https://$domain/cftests/jquery.min.precomp.js.br
    Uncompressed size : 27.10 KiB
    Compressed size   : 27.10 KiB
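
    (curltest is a Centmin Mod helper script; for anyone without it, a plain-curl approximation of the same checks would be:)

    # %{size_download} reports the compressed bytes actually transferred
    curl -so /dev/null -H 'Accept-Encoding: gzip' -w '%{size_download} bytes\n' https://$domain/cftests/jquery.min.precomp.js
    curl -so /dev/null -H 'Accept-Encoding: br'   -w '%{size_download} bytes\n' https://$domain/cftests/jquery.min.precomp.js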
    
    Thanked by: pouyam, ariq01
  • Daniel15 Veteran
    edited December 2020

    Most CDNs should support Brotli level 11 if you statically compress it yourself (that is, you deploy foo.js, foo.js.br and foo.js.gz).
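
    A minimal build-step sketch matching that foo.js example (gzip and brotli CLIs; -k keeps the original file):

    gzip -9 -k foo.js       # emits foo.js.gz next to the original
    brotli -q 11 -k foo.js  # emits foo.js.br (11 is the CLI default, shown for clarity)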

    I thought BunnyCDN supported static compression; I'm surprised they don't: https://support.bunnycdn.com/hc/en-us/community/posts/360008401559-Static-Compression.

  • BunnySpeed Member, Host Rep
    edited December 2020

    We use a lower compression level because, in this example, with a 20 Mbps connection you're delivering 2.5 KB per millisecond. If you go, for example, from level 4 to 11 and add as little as 1-2 ms of overhead, be it on the server or the client, you've already lost any benefits and in fact added some delay.
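
    (For reference, the arithmetic: 20 Mbps ÷ 8 = 2.5 MB/s, i.e. 2.5 KB per millisecond, so 1-2 ms of added overhead costs as much time as roughly 2.5-5 KB of extra payload.)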

    We did many tests to try and find something that gives a good ratio of minimal overhead to good compression; however, we are looking to make some improvements in this area soon, and potentially to allow you to pre-compress files.

  • yoursunny Member, IPv6 Advocate
    edited December 2020

    @BunnySpeed said:
    We use a lower compression level because, in this example, with a 20 Mbps connection you're delivering 2.5 KB per millisecond. If you go, for example, from level 4 to 11 and add as little as 1-2 ms of overhead, be it on the server or the client, you've already lost any benefits and in fact added some delay.

    If the user is on a 112 Kbps ISDN line, 1.5 Mbps ADSL, or a metered cellular/satellite connection, higher compression helps. Can you identify the user's connection type and adjust accordingly?

  • BunnySpeed Member, Host Rep
    edited December 2020

    Unfortunately, we can't detect this. I guess we could dynamically set it based on the region of our PoPs, though; that might be something interesting to explore, for example in Lagos or South Africa.

    Regarding ISDN: if a user is on a 112 Kbps ISDN line, they can probably grab a coffee before anything modern loads either way. A busy modern website can easily reach 5 MB, which would take ~6 minutes to load... Maybe it's not the best use case to optimize for, but I understand the concern :)

    Thanked by: yoursunny
  • dfroe Member, Host Rep

    I agree with the basic idea of compressing ASCII files at a reasonably balanced level.
    But isn't chasing another few percent of compression ratio what we call micro-optimization?

    I mean, a typical user may have 10 Mbps throughput and 50 ms RTT.
    That's my educated guess when walking through Germany.

    Under these circumstances, does it really make any noticeable difference to gain another few milliseconds?

    Probably a CDN with slightly better-positioned servers will deliver a much better user experience. I guess moving to another CDN just because it supports a preferred compression method may easily perform worse in the end.

  • @yoursunny said:
    If the user is on a 112 Kbps ISDN line, 1.5 Mbps ADSL, or a metered cellular/satellite connection, higher compression helps. Can you identify the user's connection type and adjust accordingly?

    If the user is on 112 Kbps or 1.5 Mbps, you're likely already serving very small files, since you'd obviously optimize for your target users. You're thus talking about bytes (or, worst case, a kilobyte) of difference in compression.

    In the example @eva2000 gave, jQuery saves 2.28 kilobytes at Brotli compression level 11. You're talking about super minor savings, where you could probably optimize other things to save those 2.28 kilobytes, such as not using jQuery in the first place.

    Obviously, if you've optimized your applications to the level where the brotli compression level matters, then kudos :) But some people just tend to have super weird requirements with no actual added benefit.

    Thanked by: yoursunny, Shot2
  • @Zerpy said: some people just tend to have super weird requirements with no actual added benefit

    This is how I feel. If you are building a modern web application, you have a build pipeline like webpack that can also output brotli, zlib, and other formats. Therefore any/most CDNs should work?

    If you are building a legacy app, just use any of the free CDNs for jQuery.

  • Daniel15 Veteran
    edited December 2020

    @BunnySpeed said: If you go, for example, from level 4 to 11 and add as little as 1-2 ms of overhead, be it on the server or the client, you've already lost any benefits and in fact added some delay.

    That 1-2 ms overhead is only a one-off cost on cache fill, though. Assuming a high hit rate and a low cache eviction rate, only one user per file per PoP should see the slower speed; all future users can then take advantage of the smaller file.

    Allowing statically compressed files (https://support.bunnycdn.com/hc/en-us/community/posts/360008401559-Static-Compression) would avoid the overhead, too. Either the files could be compressed during deployment, or the origin server could compress and cache them, so in either case they're already compressed at the source.

  • BunnySpeed Member, Host Rep
    edited December 2020

    @Daniel15 said:

    @BunnySpeed said: If you go, for example, from level 4 to 11 and add as little as 1-2 ms of overhead, be it on the server or the client, you've already lost any benefits and in fact added some delay.

    That 1-2 ms overhead is only a one-off cost on cache fill, though. Assuming a high hit rate and a low cache eviction rate, only one user per file per PoP should see the slower speed; all future users can then take advantage of the smaller file.

    Allowing statically compressed files (https://support.bunnycdn.com/hc/en-us/community/posts/360008401559-Static-Compression) would avoid the overhead, too. Either the files could be compressed during deployment, or the origin server could compress and cache them, so in either case they're already compressed at the source.

    Yep, that part we're looking into.

  • @eva2000
    With reference to your comment above about brotli and other compression: what if I precompress a .js file on the origin server (with brotli, zlib, or gzip) but have also enabled "Brotli" compression in my Cloudflare account? In that case, how will CF handle the resource? Will it serve the compressed file directly to the user, or will it decompress the gzip/zlib file, save it on its edge, and then recompress it with brotli before sending it to the user's browser?

  • @JasonM said:
    @eva2000
    With reference to your comment above about brotli and other compression: what if I precompress a .js file on the origin server (with brotli, zlib, or gzip) but have also enabled "Brotli" compression in my Cloudflare account? In that case, how will CF handle the resource? Will it serve the compressed file directly to the user, or will it decompress the gzip/zlib file, save it on its edge, and then recompress it with brotli before sending it to the user's browser?

    I haven't done extensive testing with Cloudflare in particular, but CDNs generally use the Accept-Encoding header to tell your server which compression formats they support. If your origin server supports Brotli, and the client supports Brotli, they should cache the Brotli response and serve it to the client as-is.

    If your origin server supports Brotli but the client only supports gzip, there are two approaches they may take:
    1. Send a separate request to your server to specifically request a gzipped version (i.e. pass through the Accept-Encoding header from the client); or
    2. Decompress the Brotli version, recompress it with gzip, and cache and serve that version.

    The first approach is the most compatible and generally matches how the server is configured (the server should send a Vary: Accept-Encoding header, which means the CDN cache key should include the Accept-Encoding value), but the second approach is still fine and a few CDNs do that.
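
    A quick way to observe which approach a given CDN takes (hypothetical URL; some servers answer HEAD requests differently, so results may vary) is to request the same asset with different Accept-Encoding values and compare the headers:

    curl -sI -H 'Accept-Encoding: br'   https://cdn.example.com/foo.js | grep -iE '^(content-encoding|vary)'
    curl -sI -H 'Accept-Encoding: gzip' https://cdn.example.com/foo.js | grep -iE '^(content-encoding|vary)'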

  • eva2000 Veteran
    edited December 2020

    @JasonM said: With reference to your comment above about brotli and other compression: what if I precompress a .js file on the origin server (with brotli, zlib, or gzip) but have also enabled "Brotli" compression in my Cloudflare account? In that case, how will CF handle the resource? Will it serve the compressed file directly to the user, or will it decompress the gzip/zlib file, save it on its edge, and then recompress it with brotli before sending it to the user's browser?

    Cloudflare right now only talks to origins via gzip or uncompressed, not brotli. So only your pre-compressed gzip origin files will pass directly to the Cloudflare edge cache and on to your visitors: https://support.cloudflare.com/hc/en-us/articles/200168086-Does-Cloudflare-compress-resources-. Brotli requests via Cloudflare will always be encoded at the Cloudflare edge.

    If you're already using gzip we will honor your gzip settings as long as you're passing the details in a header from your web server for the files.

    Cloudflare only supports the content types gzip towards your origin server and can also only deliver content either gzip compressed, brotli compressed, or not compressed.

    Cloudflare's reverse proxy is also able to convert between compressed formats and uncompressed formats, meaning that it can pull content from a customer's origin server via gzip and serve it to clients uncompressed (or vice versa). This is done independently of caching.

    Please note: The Accept-Encoding header is not respected and will be removed.

    and

    @Daniel15 said: but CDNs generally use the Accept-Encoding header to tell your server which compression formats they support.

    Unfortunately, per the quoted doc, Cloudflare doesn't support Vary: Accept-Encoding in origin communications. I wish they'd change that, though all the other Cloudflare performance and security benefits greatly outweigh this missing feature.

    And as others have pointed out, the small difference in sizes can be made up elsewhere, since Google's search and page experience signals aren't about absolute page size but about user experience (the critical render path above the fold). The order and manner in which your assets load matter more than their size. Of course smaller sizes will always beat larger ones, but 1-3 KB isn't going to make much difference compared to 3rd-party javascript/advertising scripts render-blocking your page loads!

    Thanked by: JasonM
  • @eva2000 said: Cloudflare right now only talks to origins via gzip or uncompressed, not brotli. So only your pre-compressed gzip origin files will pass directly to the Cloudflare edge cache and on to your visitors

    Thanks.

    That means I should toggle OFF brotli compression in my CF account, since I already gzip all resources on my server!

  • eva2000 Veteran
    edited December 2020

    Or you can let Cloudflare compress your non-pre-compressed assets with brotli on the fly, and for pre-compressed gzip assets just add a no-transform header to tell Cloudflare not to convert/touch them and to serve them as-is: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control

    no-transform
    An intermediate cache or proxy cannot edit the response body, Content-Encoding, Content-Range, or Content-Type. It therefore forbids a proxy or browser feature, such as Google’s Web Light, from converting images to minimize data for a cache store or slow connection.
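
    On an nginx origin that's an add_header Cache-Control directive; either way, a quick way to confirm the header actually reaches the edge (hypothetical URL):

    curl -sI https://example.com/app.min.js | grep -i '^cache-control'
    # expect something like: Cache-Control: public, max-age=31536000, no-transform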

  • TTFB with brotli on my site: average 300-400 ms
    with cloudflare-zlib: under 200 ms

  • @isunbejo said:
    TTFB with brotli on my site: average 300-400 ms
    with cloudflare-zlib: under 200 ms

    When using Brotli, you should pre-compress everything. It's designed to have a better compression ratio in exchange for slower compression times. TTFB shouldn't be affected if it's compressed beforehand.
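
    An easy way to measure that (assumed URL; curl's time_starttransfer is effectively TTFB, and includes any on-the-fly compression delay):

    curl -so /dev/null -H 'Accept-Encoding: br' -w 'TTFB: %{time_starttransfer}s\n' https://example.com/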

  • @Daniel15 said:

    @isunbejo said:
    TTFB with brotli on my site: average 300-400 ms
    with cloudflare-zlib: under 200 ms

    When using Brotli, you should pre-compress everything. It's designed to have a better compression ratio in exchange for slower compression times. TTFB shouldn't be affected if it's compressed beforehand.

    When proxying through CF, I use zlib-cf (https://github.com/cloudflare/zlib) and get good TTFB (under 200 ms), checked on the GTmetrix waterfall.

    With brotli (https://github.com/google/ngx_brotli), TTFB averages 300-400 ms.

    Yes, not compressed beforehand.

  • @isunbejo said: Yes, not compressed beforehand

    You should compress beforehand and only use brotli_static for best performance :)

  • @Daniel15 said: You should compress beforehand and only use brotli_static for best performance

    Yes, for TTFB brotli is good for pre-compressing static files, not dynamic HTML.

    Generate the static pre-compressed files:

    # pre-compress text assets with brotli (CLI default quality is 11)
    find /var/www -type f -regextype posix-extended -regex '.*\.(js|css|svg|xml|rss|json)' -exec brotli '{}' \;
    

    :smile:

    Thanked by: peixotorms
  • peixotorms Member
    edited January 2021

    @BunnySpeed said:
    We use a lower compression level because, in this example, with a 20 Mbps connection you're delivering 2.5 KB per millisecond. If you go, for example, from level 4 to 11 and add as little as 1-2 ms of overhead, be it on the server or the client, you've already lost any benefits and in fact added some delay.

    We did many tests to try and find something that gives a good ratio of minimal overhead to good compression; however, we are looking to make some improvements in this area soon, and potentially to allow you to pre-compress files.

    The point is... there is a clear difference in the Google PageSpeed Insights mobile test
    https://developers.google.com/speed/pagespeed/insights/
    when you use a brotli 11 pre-compressed large js or css file, compared to the default compression on BunnyCDN.

    If people use WordPress and are merging scripts and CSS files there, they can end up with a lot of data in their CSS or JS files.

    The difference when using BunnyCDN to serve those assets can then be significant: something like 200 KB of javascript versus 130 KB or less with pre-compression.

    And this makes a difference in their mobile test, frequently between scoring green and scoring less than 70 points.

    So in those cases, while a normal user will benefit from using the CDN, many site owners care more about Google showing them that the site scores better without the CDN.

    Trust me when I say I encounter this every day (I work in speed optimization), and often clients prefer to skip BunnyCDN altogether, or use it exclusively for images (because of the WebP support).

    Hence, at the very least, allow us ASAP to pre-compress our own assets and serve them directly. Or better: use a mirror request (nginx supports this) to download the pre-compressed file from the origin for use with brotli_static (I know it's not this straightforward for a reverse proxy, but some CDNs are doing it).

    Thanked by: Daniel15, _MS_
  • yoursunny Member, IPv6 Advocate

    @peixotorms said:
    If people use WordPress and are merging scripts and CSS files there, they can end up with a lot of data in their CSS or JS files.

    The difference when using BunnyCDN to serve those assets can then be significant: something like 200 KB of javascript versus 130 KB or less with pre-compression.

    And this makes a difference in their mobile test, frequently between scoring green and scoring less than 70 points.

    A blog shouldn't need 200 KB or 130 KB of JavaScript.
    I can make a blog with only 46 KB of JavaScript, for example this page:
    https://yoursunny.com/t/2020/NDNts-webpack-start/

    At 160 KB of JavaScript, I can have video streaming: https://pushups.ndn.today

    It seems that WordPress is the problem / cancer. You should ditch WordPress and use a static site generator.

  • Daniel15 Veteran
    edited January 2021

    @yoursunny said: It seems that WordPress is the problem / cancer. You should ditch WordPress and use a static site generator.

    The server side has nothing to do with the client side, though. You can have a WordPress blog with 0 bytes of JavaScript if you wanted to :)

    Thanked by: _MS_
  • eva2000 Veteran
    edited January 2021

    @peixotorms said: The point is... there is a clear difference in the Google PageSpeed Insights mobile test
    https://developers.google.com/speed/pagespeed/insights/
    when you use a brotli 11 pre-compressed large js or css file, compared to the default compression on BunnyCDN.

    You're mistaken if you believe compression alone will improve Google's page speed metrics. Yes, it helps with javascript compared to no compression at all, but the difference between gzip and brotli isn't much if your gzip sizes are optimal. That covers same-origin requests; the bigger factor is requests served by remote 3rd-party servers like AdSense etc. That is what usually pulls down Google's page speed metrics, showing up as poor Total Blocking Time and Time to Interactive. For a well-optimised site, the compression configuration of same-origin requests isn't usually the problem.

    I wrote a guide for my Centmin Mod users which might be worth reading here too https://community.centminmod.com/threads/google-page-speed-insights-and-google-core-web-vital-metrics.20735/

    @yoursunny said: It seems that WordPress is the problem / cancer. You should ditch WordPress and use a static site generator.

    Depends on how WordPress is optimised/configured. No problems for me with WordPress: https://blog.centminmod.com/2020/12/16/2175/gtmetrix-using-google-lighthouse-v6/ :)

  • @eva2000 I have never seen such a fast WordPress site in the wild. Great job! I learned some new things from your cache tutorial.

    Thanked by: yoursunny, eva2000
  • @sepei said:
    @eva2000 I have never seen such a fast WordPress site in the wild. Great job! I learned some new things from your cache tutorial.

    Cheers - it helps that I'm a speed/performance addict ^_^
