Is a CDN the only solution?
I'm about to launch a web-based RPG I've been working on for months. It has a character design tool I created to maximize the customizability of your characters, and I use PHP to compose the image.
I'm looking at each .png image; they're around 150x130 pixels and approximately 14.2 kilobytes each, give or take depending on whether a user adds items/accessories/armor to their character.
So around 100 character images take roughly 1.42 MB of storage. I expect to hit around 5,000+ characters.
These images will be displayed when players join a party, in the lobby, in forum topics, on profile views, etc. I'm not sure how this is going to scale or what kind of bandwidth it will use.
What CDN service would you guys here at LET recommend? Honestly, I'm not looking for something as expensive as Akamai (the game will not get that big). I can cache the images fine on the main server, but I feel like a CDN would be more appropriate, as it would pull weight off the websocket game server and the overall load... (Or am I overthinking this? Would caching these images on the main server be fine?)
I am sorry if this is not 100% VPS related. I figure you guys would be knowledgeable in this area though! Thanks in advance.
Comments
I don't think static requests are a concern for a fast web server like nginx. You can tell browsers to cache the pictures, too. But I am not a specialist.
Nginx is good at serving static images, and browsers should cache the images anyway, I wouldn't worry about it.
I agree, your needs don't seem to be anything that would overtax most servers. But your users will get faster response times if you use a CDN (as they're pulling images, JavaScript, etc. from a server near them). I'd highly recommend CloudFlare. Besides being free for the CDN, they'll also provide free SSL certificates, manage your SSL traffic, and provide some security to your server. It's really easy to set up.
Maybe the issue isn't the server, but the clients. Why can't the client preload assets and cache them for some time? Wouldn't that reduce load and perceived latency? Set the correct caching/ETag headers for the website, and optimize the game side to preload and cache.
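For illustration, a minimal nginx sketch of what those caching headers might look like (the `/avatars/` path and the 30-day lifetime are assumptions, not details from this thread):

```nginx
# Serve avatar PNGs with long-lived cache headers so browsers
# (and any CDN in front) can reuse them instead of re-fetching.
location /avatars/ {
    expires 30d;                       # sends Cache-Control: max-age=2592000
    add_header Cache-Control "public";
    etag on;                           # lets clients revalidate cheaply (304s)
}
```

With headers like these, repeat views of the same avatar cost the server nothing but an occasional conditional request.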
We're using KeyCDN for imgs.us. It's $40/TB, which is a hell of a lot cheaper than MaxCDN's ~$80/TB. Another option would be purchasing some VPSes and putting up reverse Squid proxies to cache content. Then you'd just need a DNS solution to send the traffic to the right VPS based on location.
Since 5,000 characters sounds like it'd be around 71 MB, we're still in the realm of storing it all in RAM. You could, relatively easily, cache the images in RAM, only writing to disk when a user changes their avatar — that should keep the disk from ever becoming a bottleneck.
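A minimal sketch of that idea — Python here purely for illustration (the OP's stack is PHP), with hypothetical paths and names: hold every avatar in an in-memory dict and only touch disk on writes.

```python
from pathlib import Path

class AvatarCache:
    """Keep every avatar PNG in RAM; hit the disk only when one changes."""

    def __init__(self, directory):
        self.directory = Path(directory)
        self.directory.mkdir(parents=True, exist_ok=True)
        # Preload everything at startup: roughly 71 MB for 5,000 avatars.
        self.images = {p.name: p.read_bytes()
                       for p in self.directory.glob("*.png")}

    def get(self, name):
        return self.images.get(name)               # served straight from RAM

    def put(self, name, data):
        self.images[name] = data                   # update RAM first
        (self.directory / name).write_bytes(data)  # write-through to disk
```

Reads never touch the disk; the disk only sees a write when a user actually changes their avatar.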
From there, it's a matter of bandwidth. We don't know your average bandwidth needs per user, but assuming you had 2 TB of transfer, you could serve up a huge number of images.
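To put rough numbers on that — a back-of-the-envelope sketch using the thread's 14.2 KB figure, with 2 TB/month as an assumed cap:

```python
# Back-of-the-envelope: how many 14.2 KB avatar loads fit in 2 TB?
IMAGE_KB = 14.2
MONTHLY_TB = 2

monthly_kb = MONTHLY_TB * 1000 * 1000 * 1000  # decimal TB -> KB
loads = monthly_kb / IMAGE_KB
print(f"{loads:,.0f} image loads/month")      # on the order of 140 million
```

Even before browser caching cuts repeat fetches, 2 TB covers an enormous number of avatar views for a 5,000-character game.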
Thank you everyone for the replies. I will be ditching the CDN service at launch, and I'll carefully log and analyze the data once we grow. And if need be, I'll try out @sambling's advice with KeyCDN. That sounds like the cheapest route to go. I think Jemaltz hit the nail on the head. What an awesome community! Appreciate it fellas.
Yes, and your operating system should take care of doing all of this for you. All you need is some available RAM for disk-caching.
I'd say the only issue with relying on the OS to do it is the amount of RAM that you have available, and how much other "stuff" you have going on. For speed's sake, it may be worth doing it yourself in order to ensure that everything is acting as you'd want it to.
You should consider OVH CDN; they're priced at €11.90/TB plus a fixed fee of €6/month. For a lot of traffic it will be cheaper than other CDN services.
I set up my own CDN, pretty much, with a bunch of little VPS boxes.
Using Rage4 for DNS and GeoIP location
Using the free version of LiteSpeed as the web server
rsync to keep data synced across servers
PM me if you want to test it out.
How about an anycast cluster of low cost VPS servers?
We have servers in Las Vegas, New Jersey, & Luxembourg. You'd be able to buy a single VPS in each (or two in each for failover) and then assign an anycast IP address for free to your account. You'd more or less be creating your own CDN for way cheaper than what it goes for on the market.
Bandwidth is pooled together (so if you buy 3 VMs, you'd have 3 TB to spend between them, etc.), & it costs $2.50/month per extra pooled TB of transit.
All in all, you'd be looking at $10.50/month for 30GB of SSD storage over 3 locations & 3TB transit. Gotta be up there as one of the cheaper/cheapest CDN options around
Francisco
This sounds like a great idea! It's pretty much what I did grouping a bunch of different LEB systems. Love the pooled data traffic idea as well! Question about the anycast IP... how does that differ from using, for example, Rage4 and having cdn.DOMAIN.com point to GeoIP-located servers?
You should do a little tutorial — you might be able to carve out a nice niche in the market if you can provide an easy setup/configuration for a $10.50/month, 3 TB/month CDN!
Same here, but I use nsone for DNS.
If the "other stuff" is using the RAM cache more than the content, it's because it has a greater need (i.e., more frequent accesses). So it's the right thing to do. If you dedicate memory to less-frequently-accessed content, you'll negatively impact the system overall.
The Linux kernel is very smart about this. The best thing we mortals can do to help is add more RAM.
@doughnet - The GEO setup works by checking the GeoIP of the DNS resolver you used to resolve the domain; their system then uses MaxMind (or their own GEO database) to make decisions. GEO databases are notorious for being way off the mark, and there's also the large increase in anycast DNS servers, which means that while you used a DNS server in Europe, the IPs might be owned by a US corporation, so the GEO lookup will point to the USA.
Anycast uses BGP's "shortest path" nature to route traffic over the fewest AS hops possible. You also get the benefit that if a location's BGP goes down (very rare, but it could happen), the tubes self-correct and will automatically send data to the next closest location.
Yes, I know — documentation is something we're spending the coming weeks working on, alongside new websites.
Francisco
Well, I love the idea... I'll be grabbing 3 VPSes to play around with the setup for sure!
Let us know if you need a hand with anything
Remember, if you use a website to 'test' if a page/site is loading, take into consideration where the tester is located. We had a few people that were racking their brain over why it was pulling only from their US location....then remembered that the tester they're using is only US based :P
Francisco
But only 3 points? What about Asia / NZ / AU? And what about the central US, and so on?
We're looking to get an Asia-Pacific location together for our anycast users at least. A central US location is probably not needed since both Vegas & Jersey are close by (~20ms).
We can't dump 30 locations into the anycast project since it becomes a cost burden to people that want to enjoy it. I think 4 strong POPs would give global coverage and still keep things very affordable.
Francisco
Well, why does every location need to be bought?
And do you have a test IP from the anycasted network?
Because the same IP range is announced in every location and we don't "backhaul" traffic to other DCs. It becomes costly to do that and puts an unneeded technical burden on me to put it all in place.
I could put together multiple /24's and combo them together to try to allow cherry picking of configurations, but we'd end up burning a /21 just to add a tiny bit of flexibility.
You can ping 198.251.87.1
Francisco
Are you sure 198.251.87.1 is anycasted? I'm getting 220 ms from Austria.
Woops, do 198.251.86.1
Francisco
This IP is DDoS protected, right? How much is the protection?
Here's a better idea. Instead of relying on an external CDN, why not consider running a few locally-concentrated machines?
1 machine for dynamic service, 1 machine for static service, 1 machine for compressing images (14.2 KB for a 150x130 image? I'd imagine at that size you could push them down into the single-digit-kilobyte range with PNG optimization). Or better yet, a 3+ 'core' VPS that does all of this.
Also, +1 for @Francisco's solution. I've not used it, but I'd imagine it'd do you swell.
Anycast IPs are free. By default users can assign 5 to themselves via Stallion and can request more for free. Filtering is 20 Gbit and is a flat $5/month per IP address.
At the risk of a derail, do tell..
That's "greater need" as defined by the kernel. But if the OP, say, wants as close to zero lag as possible for the character images, and doesn't mind normal caching/loading delays on other content, then they have a greater need to keep the avatars loaded 24/7 — and there's no way to explain that to the kernel.