Reduce WordPress size, archive old posts
Hi all,
I have a client's WordPress site hosted on my server that has outgrown 16 GB, and since it's a movie-review / celebrity-gossip site, it has a lot of images to host. I'm looking for recommendations on how I can cut down / archive posts that are more than a year old and keep only the current ones active. The only problem is... she (the owner) doesn't want to compromise on the SEO and backlinks she already has on the old posts.
The biggest problem is the weekly backup sets, which take up quite a lot of space on the server as well as in the cloud backup I've set up.
One solution that immediately comes to mind is to make another WP site inside it at domain.com/archive and slice out all posts older than a year into it. Then I can exclude that site's folder from my weekly backups.
Still open to better suggestions... so I'm calling on all pro bloggers & admins to suggest a remedy that's as non-invasive as possible.
Comments
There are some WP plugins that provide lossless image compression you can try. At best you can save ~40% of the image size.
Move all the images to a third-party service (paid or free), pull them back via reverse proxying, problem solved.
@deadbeef, that came to my mind too, though I didn't give it a thorough look. Can you point me to a real-world case & recommend which service would be best? Imgur??
@khuongcomputer... this is of course an option. Can you recommend a good, stable plugin? I don't have much experience with WP; maybe Smush.it or something?
"EWWW Image Optimizer" will reduce images size w/o compromise too much images's quality
Image compression:
http://nikkhokkho.sourceforge.net/static.php?page=FileOptimizer
http://pnggauntlet.com/ for pngs
For speed tests:
https://developers.google.com/speed/pagespeed/insights/
http://gtmetrix.com/dashboard.html
http://tools.pingdom.com/fpt/
http://www.webpagetest.org/
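If you'd rather optimize on the server than with those desktop tools, here's a minimal command-line sketch using jpegoptim and optipng (assuming both are installed and your uploads live under /var/www/site/wp-content/uploads; the paths and tool choice are my assumptions, not something from the links above):

    # Losslessly optimize all JPEGs in place; --strip-all removes EXIF/metadata.
    find /var/www/site/wp-content/uploads -name '*.jpg' -exec jpegoptim --strip-all {} +

    # Losslessly recompress PNGs; -o2 is a moderate optimization level.
    find /var/www/site/wp-content/uploads -name '*.png' -exec optipng -o2 {} +

Test on a copy of a few files first; "lossless" here means the pixel data is unchanged, but stripped metadata is gone for good.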
Google gives you the optimized content as a download: images, CSS and JS. I'm using WordPress too, and I've ditched the uncompressed jQuery 2.1.4 and moved to the compressed version on GitHub. The funny thing is the loading time is about the same, even though the compressed version on GitHub is a lot smaller. But GitHub only caches it for 5 minutes, whereas code.jquery.com is probably cacheable for longer. Maybe I should just use their compressed version from jquery.com instead.
Google's tool is pretty good. I still don't know how to inline and combine JS and CSS, so I get 70-80 on mobile and 85+ on desktop.
I'm using a minimal theme from a marketing forum, so it's not a bloated theme you get from bloatforest and the like.
Edit: I just remembered that she could put all her images on a page like domain.tld/allimagesetc, let Google's speed test analyse that, download the optimized images Google offers, and re-upload them via FTP/WinSCP, overwriting the older images.
I'd love to know what google uses to compress images.
JS:
http://javascript-minifier.com/
CSS:
http://cssminifier.com/
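Both of those sites also expose a simple POST API, so you can minify from the command line instead of pasting into the web form. A sketch (the /raw endpoints are how the sites documented it; double-check before relying on them):

    # Minify a JS file via javascript-minifier.com and save the result.
    curl -X POST -s --data-urlencode 'input@script.js' \
      http://javascript-minifier.com/raw > script.min.js

    # Same idea for CSS via cssminifier.com.
    curl -X POST -s --data-urlencode 'input@style.css' \
      http://cssminifier.com/raw > style.min.css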
Yours. But since you don't seem to agree, here's one from another guy: https://unsplash.com. Check where they host their images.
Any image hosting service would do. Imgur has the disadvantage that they may delete your images if there's no traffic to them for a long (unspecified) time.
I suggest: http://www.imgix.com
This service looks cool, but the pricing is not adequate.
Even if we forget about bandwidth (a CDN can reduce that number), $3 per 1,000 master images is crazy. For example, I have a typical website with 333,143 images, so I'd have to pay $999 for this service. WTF? I pay much less to host the whole website, and the images are 1% of the whole load and take just 20 GB, which is fine for now.
So with them I'd have to pay $999 for 20 GB of images. That's not adequate. If imgix.com cost $0.01 per 1,000 images, which I think would be reasonable pricing, I'd use it, since I'd pay about $3 in my case. Even at $0.10 per 1,000 images it's rather costly.
@mehargags Usually for your problem I do:
back up everything EXCEPT images to one cloud/backup engine daily/weekly/monthly
back up the images incrementally to another cloud/backup engine (see the sketch below).
If you think that's slow, then you should optimize your server/infrastructure and buy SSDs. 100K images can be backed up in a minute.
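A minimal sketch of that split, assuming the site lives in /var/www/site and backups go under /backups (both paths are placeholders):

    #!/bin/sh
    # Weekly: archive everything except the uploads (images) directory.
    tar --exclude='wp-content/uploads' -czf "/backups/site-$(date +%F).tar.gz" -C /var/www site

    # Daily: sync only the uploads; rsync transfers just new/changed files.
    rsync -a /var/www/site/wp-content/uploads/ /backups/uploads/

The tar set stays small because the images are excluded, and the rsync pass is fast because it only copies what changed.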
@profforg
Here's a crazy idea: find another one that suits your specific needs better. Worst case scenario, you can always fall back to S3 + AWS Lambda + AWS's newly launched API Gateway, tie them together and do it "manually". Choices, choices, choices, so many choices all over.
Well... thanks for the suggestions, but I feel we are losing track a little bit here.
@Profforg, my site is not slow or heavy to load; I'm not looking for WordPress speed optimizations. The problem is how you can ARCHIVE WP's content folder with 1000s of images.
Being a media site, people tend to see only the first few posts/pages... whatever is hot in the news. The rest of it gets old pretty fast, so all the media images are just sitting there eating valuable space (not RAM or CPU), and then I'm backing everything up bi-weekly and carrying the "burden" around even though it isn't needed.
Deleting the posts would mean losing the backlinks and the SEO value... and bloggers are touchy about that, being non-technical.
So I'm looking for some way to keep an archival copy somewhere, and a way to point my permalinks to that site so the SEO purpose is served while I cut the bloat from the current site. That way I can keep a small, fresh site running with the last 6 months of posts, and older posts can sit on a separate server/directory that can be excluded from the weekly backups.
Hope I'm clear...
Thanks anyways for all the help.
You organize your images by year/month, right? Why not just exclude wp-content/uploads/2013/* etc. from your backups, if that's your primary concern?
There's no reason you need to remove the old posts to remove the old images; you could just replace old images with a 0 KB placeholder so you avoid 404s on the pages.
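A quick sketch of both ideas, assuming the 2013 uploads are the ones being retired (the year and paths are placeholders; make sure an archived copy exists before truncating anything):

    # Back up the site but skip the old upload years.
    tar --exclude='wp-content/uploads/2013' -czf /backups/site.tar.gz -C /var/www site

    # Replace the 2013 images with 0 KB placeholders so the old posts
    # still get a 200 response instead of a 404 (browsers will show a
    # blank/broken image, but the URLs keep resolving).
    find /var/www/site/wp-content/uploads/2013 -type f \
      \( -name '*.jpg' -o -name '*.png' -o -name '*.gif' \) \
      -exec truncate -s 0 {} +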
Nice answer @nunim... great ideas there!! I will definitely think over both of your suggestions...
Many thanks
I said nothing about performance. The solution you're trying is rather dirty; deleting posts or images is not cool.
What I said is that you should back up the images separately. And, as I said, I usually back up the images incrementally with a second script. The overall amount of stored data is usually 2x the images' size (one incremental backup daily, plus one full backup, a day old, in case the incremental fails for some reason).
Optimize images and everything else that consumes a lot of space.
Use rsync for backup.
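For the incremental part, one common rsync pattern is hard-linked daily snapshots via --link-dest. A sketch with placeholder paths (GNU date assumed for the "yesterday" arithmetic):

    #!/bin/sh
    # Snapshot the uploads dir; files unchanged since yesterday's snapshot
    # become hard links, so each day only stores what actually changed.
    TODAY=$(date +%F)
    YESTERDAY=$(date -d yesterday +%F)
    rsync -a --link-dest="/backups/uploads/$YESTERDAY" \
      /var/www/site/wp-content/uploads/ "/backups/uploads/$TODAY/"

Each snapshot looks like a full copy on disk, but the hard links keep the real space usage close to incremental.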
@nunim ...best idea so far. And besides, you could always upload the previous backups to cloud storage (free ones, of course) and remove them from your server; when the cloud storage gets full, start removing the oldest sets, because if you don't, the images will end up with hundreds of copies when you only need one to restore in case of a loss. You could run a cron job to upload it all to Google Drive...
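One way to script that, assuming rclone is installed and already configured with a Google Drive remote named gdrive (both assumptions; the retention window is just an example):

    # crontab entry: every Sunday at 3am, move the weekly backups to
    # Google Drive (rclone move deletes the local copies after upload).
    0 3 * * 0 rclone move /backups/weekly gdrive:site-backups

    # Then prune anything older than 90 days from the remote.
    0 4 * * 0 rclone delete --min-age 90d gdrive:site-backups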
Does loading the uploads folder from an NFS share on a different VPS on the same machine, and excluding it from this backup, sound as crazy to you as it sounds to me?
You could set the uploads folder to a lower number of backups per month; that way you can safely back up all of WordPress without worrying about size.
rsync incremental?
We too have a huge site with almost 75 GB of image data. We use a VaultPress backup subscription to avoid all these hassles.
Yes, I would also prefer third-party image hosting. BUT how do you now change the links in every post? And then image gallery plugins may not support it.
@riu,
You are right, and I'm seeking the same info: how to change all the old posts in case I switch to a third-party service/server to host the images elsewhere. That is practically the biggest hurdle.
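If the images keep the same path layout on the new host, WP-CLI's search-replace can rewrite the URLs across the database (it handles serialized data too). A sketch; the CDN hostname is a placeholder, and export the DB first:

    # Back up the database before touching anything.
    wp db export backup.sql

    # Dry run first to see how many URLs would change.
    wp search-replace 'http://domain.com/wp-content/uploads' \
      'http://cdn.example.com/uploads' --dry-run

    # Then run it for real.
    wp search-replace 'http://domain.com/wp-content/uploads' \
      'http://cdn.example.com/uploads'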
Keeping this open for someone to comment... thanks!
I will explore CDN services; they must also involve changing the image URLs. I was thinking of hosting with Flickr, but then it's a huge task, especially when I'm not sure of the final outcome.