NGINX reverse proxy shared caching suggestions

Hello there!

I'm trying to configure content caching for my NGINX reverse proxy.
Besides many HTML and PHP files, images, etc., there are plenty of videos from 100 MB to 15 GB that I need cached on my reverse proxy server to trim down the traffic a bit.

At the moment content caching seems to work and cache files are stored at the defined location, but it doesn't behave as expected. Every time a video is retrieved, a new cache file is created, so it seems NGINX is caching per connection rather than sharing one cache between clients.

My NGINX Reverse-Proxy configuration:

proxy_cache_path /srv/nginx/cache levels=1:2 keys_zone=cache:10m inactive=168h max_size=1000g use_temp_path=off loader_threshold=300 loader_files=200;

server {
    listen xxx;

    # caching
    slice                   1m;
    proxy_cache             cache;
    proxy_buffering         on;
    proxy_cache_key         $uri$is_args$args$slice_range;
    proxy_set_header        Range $slice_range;
    proxy_http_version      1.1;
    proxy_cache_valid       200 206 302 168h;

    # all response header fields to ignore, listed in a single directive
    proxy_ignore_headers    Expires X-Accel-Expires Cache-Control Set-Cookie;

    proxy_hide_header       X-Accel-Expires;
    proxy_hide_header       Expires;
    proxy_hide_header       Cache-Control;
    proxy_hide_header       Pragma;

    add_header              X-Proxy-Cache $upstream_cache_status;
    # caching over

    location / {
        proxy_pass              http://xxx/;
        proxy_set_header        X-Forwarded-Host        $server_name:$server_port;
        proxy_hide_header       Referer;
        proxy_hide_header       Origin;
        proxy_set_header        Referer                 '';
        proxy_set_header        Origin                  '';
        add_header              X-Frame-Options         "SAMEORIGIN";
    }
}

Is it even possible with plain open-source NGINX to create such a shared cache? I've heard of NGINX Plus for exactly this use case, but that would blow my budget.

Does anyone have any other suggestions on how to implement this?
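
For reference, this is roughly what I have pieced together from the docs so far. I'm not sure whether proxy_cache_lock is the intended way to make concurrent clients share the same slices, so treat it as a sketch rather than a working config (zone name, sizes, and times are placeholders, not tuned values):

    proxy_cache_path /srv/nginx/cache levels=1:2 keys_zone=cache:10m
                     inactive=168h max_size=1000g use_temp_path=off;

    server {
        listen xxx;

        # fetch and cache each video in fixed 1 MB slices
        slice              1m;
        proxy_cache        cache;
        proxy_cache_key    $uri$is_args$args$slice_range;
        proxy_set_header   Range $slice_range;
        proxy_http_version 1.1;
        proxy_cache_valid  200 206 168h;

        # only one request at a time is allowed to populate a missing slice;
        # other requests for the same key wait for it instead of hitting the origin
        proxy_cache_lock         on;
        proxy_cache_lock_timeout 120s;

        location / {
            proxy_pass http://xxx/;
        }
    }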

Comments

  • yoursunny Member, IPv6 Advocate
    edited December 2020

    Your NGINX config has $slice_range as part of the cache key.
    If two clients are downloading the same video with different values in the Range request header, there would be no cache hit.

    For example:

    GET /pushups.mp4 HTTP/1.1
    Range: bytes=1024-4096

    GET /pushups.mp4 HTTP/1.1
    Range: bytes=2048-8192
    

    These two requests have an overlapping range 2048-4096, but NGINX would have cache miss because they have different cache keys.
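    To make the key collision concrete: if those two different ranges end up as two different $slice_range values, the resulting key strings differ even though the bytes overlap (illustrative only; the exact strings depend on how the slice module aligns the range):

        # proxy_cache_key $uri$is_args$args$slice_range
        /pushups.mp4bytes=1024-4096
        /pushups.mp4bytes=2048-8192
        # two different key strings -> two independent cache entries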


    My push-up videos are delivered using Named Data Networking (NDN).
    The global cache network can be used for free to deliver NDN packets over a UDP overlay network; browsers can access this network via WebSockets.
    The software is open source, but you'll need to spend a few weeks studying the protocol and libraries.

    The videos are packetized into DASH format using Shaka Packager. Each file is less than 200 KB and can be cached individually; the client always requests the whole file and never asks for a range.
    Shaka Packager could be used on HTTP servers as well, but it does require you to upload packaged videos to the origin HTTP server.
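
    On the HTTP side, caching whole segments like these doesn't need the slice/range machinery at all; a location block along these lines would be enough (a rough sketch only; the /dash/ prefix and the upstream name are placeholders, not anything from the OP's setup):

        # whole-file caching of small DASH segments; no slice module needed
        location /dash/ {
            proxy_pass        http://origin/;   # placeholder upstream
            proxy_cache       cache;
            proxy_cache_key   $uri;
            proxy_cache_valid 200 168h;
            add_header        X-Proxy-Cache $upstream_cache_status;
        }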

  • Any static content can be cached with nginx, but I don't think caching large videos (100 MB+) is a great idea, because you would need a huge amount of local SSD cache or memory to sustain it.

    I think you should start with smaller files and work your way up; for very large files, rely on external providers or just host the files on S3-like storage.
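
    If you do end up caching through nginx but want to skip the really big files, one sketch-level option (untested; the threshold is just an example) is to key off the upstream Content-Length and tell nginx not to store those responses. Note this only applies when you proxy whole files; with the slice module each slice reports its own small Content-Length, so the check below wouldn't trigger there.

        # http-level: flag responses whose Content-Length has 9+ digits (>= 100 MB)
        map $upstream_http_content_length $skip_big_file {
            default          0;
            "~^[0-9]{9,}$"   1;
        }

        # server/location-level: don't save flagged responses to the cache
        proxy_no_cache $skip_big_file;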
