Best practice to add more disk space to a CentOS/WHM installation

Hello.
I have taken over a CentOS web server with WHM/cPanel. There are a few hundred customers on this server.
The disk is getting full, and I need to add more disk space.

Here is the disk setup:

What would be the best practice? Just take a backup of the whole WHM setup and move it to a new server with a bigger disk, or is there an easy way to increase the disk space?
The server is hosted, and there is no problem adding more disk space to it.

I've just never done this before.

Comments

  • Looks like you are using LVM for your /home; you can extend it using this guide: https://www.tecmint.com/extend-and-reduce-lvms-in-linux/

    Be careful so you don't lose your data.

    Thanked by 3: myhken, WSS, MrOli3000
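    Before following that guide, it is worth confirming the current layout with a few read-only commands; a minimal pre-check sketch (nothing here changes any data, safe on a live box):

    lsblk                 # disks, partitions and logical volumes as the kernel sees them
    pvs; vgs; lvs         # LVM physical volumes, volume groups and logical volumes
    df -h /home           # what the mounted filesystem itself currently reports
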
  • mikho Member, Host Rep

    Add a second disk, set as /home2 and create new accounts there?

  • mikho said: Add a second disk, set as /home2 and create new accounts there?

    Is it as simple as that? Where in WHM do I set it so that all new sites use /home2?

  • ExpertVM Member, Host Rep

    https://documentation.cpanel.net/display/ALD/Rearrange+an+Account

    WHM will pick up the extra partition and create new accounts on it, using a "load balancing" method.

    Thanked by 1: myhken
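    For anyone who goes the /home2 route, a minimal sketch of the usual steps (the device name /dev/sdb1 is a placeholder, and the /etc/wwwacct.conf keys below are from memory of a stock cPanel install -- double-check them in WHM's "Basic WebHost Manager Setup"):

    # format and mount the new disk as /home2 (placeholder device name)
    mkfs.ext4 /dev/sdb1
    mkdir -p /home2
    echo '/dev/sdb1  /home2  ext4  defaults  0 2' >> /etc/fstab
    mount /home2

    # cPanel spreads new accounts across every mounted partition whose
    # name matches HOMEMATCH (so /home, /home2, ...)
    grep -E '^HOME(DIR|MATCH)' /etc/wwwacct.conf
    # expected output, roughly:
    #   HOMEDIR /home
    #   HOMEMATCH home
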
  • mikho Member, Host Rep

    @ExpertVM said:
    https://documentation.cpanel.net/display/ALD/Rearrange+an+Account

    WHM will pick up the extra partition and create new accounts on it, using a "load balancing" method.

    Beat me to it

    Thanked by 1: myhken
  • They're all trolling you. You made the mistake all failed hosts do. Never exceed 10% usage of disk space. You'll need to change your name because when it gets full, it will crash and never start up ever again. You can't slow it or migrate either. It just doesn't work like that. This isn't like in the movie Tron where you wave your hands and a city is created.

  • @cheapwebdev

    But how do I solve this then, if what you're saying is true?

  • FlamesRunner Member
    edited December 2017

    @myhken

    It was a joke :/

    You can use the WHM rearrange function to move accounts over to another partition, e.g. /home2. I've seen hosts with 100% disk usage, and it didn't crash the server (although it was slow as molasses).

    Thanked by 2: myhken, doghouch
  • As said, add it to a pool with LVM. Also, why did you block out your mount point? Lawl.

    Thanked by 1: myhken
  • AnthonySmith Member, Patron Provider
    edited December 2017

    It's just a logical volume: add the second disk, pvcreate on the new disk, then vgextend volgroupname /dev/new-disk-name.

    Then shut down the server, boot into recovery, lvextend, and done.

    Thanked by 2: Junkless, myhken
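    To make those steps concrete, a minimal sketch (the device and volume group names below are placeholders -- substitute whatever lsblk and vgs report on your own box; note the filesystem itself still has to be grown afterwards, which comes up further down the thread):

    # the new disk shows up as /dev/sdb (placeholder name)
    pvcreate /dev/sdb

    # add it to the existing volume group (placeholder name)
    vgextend vg_example /dev/sdb

    # grow the /home logical volume into all of the new free space
    lvextend -l +100%FREE /dev/vg_example/lv_home
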
  • MikeA Member, Patron Provider
    edited December 2017

    @FlamesRunner said:
    I've seen hosts with 100% disk usage,

    "Where'd my XXXX go??"

    Thanked by 2: myhken, FlamesRunner
  • myhken Member
    edited December 2017

    @AnthonySmith said:
    It's just a logical volume: add the second disk, pvcreate on the new disk, then vgextend volgroupname /dev/new-disk-name.

    Then shut down the server, boot into recovery, lvextend, and done.

    OK, my first try failed, so I'm working on my test server; what do I do next? When I first tried, I got the space from sdb into /home, but cPanel did not see the space. A df -h did not show the space either, but with lsblk I could see the new space.

    So here is my output after I have used pvcreate and lvextend; as you can see, lsblk shows the correct new size, but df -h does not:

  • Nobody? I'm sure I'm just missing a simple command or something? I have tried both in rescue mode and without rescue mode, with the same result.

  • you will probably need to use resize2fs?
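    That matches the symptom above: lsblk reports the size of the (already grown) block device, while df reports the filesystem, which has to be grown separately. A minimal sketch, assuming the /home logical volume is ext4 (device path is a placeholder):

    # ext3/ext4 can normally be grown online, i.e. while still mounted
    resize2fs /dev/vg_example/lv_home

    # on XFS (the CentOS 7 default) the equivalent would be:
    #   xfs_growfs /home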

  • myhken Member
    edited December 2017

    @nobizzle said:
    you will probably need to use resize2fs?

    That did the trick. Now... the next thing is to do this on the production server... it worked fine on my test server, but the production server has around 200 sites on it...
    Of course, I have a backup and a snapshot of the server... still, will it actually be that simple on the production server too?

  • AnthonySmith Member, Patron Provider

    It will be fine; if you're not confident you could always just boot it up with gparted instead and have it do the work for you :)

    Thanked by 1: myhken
  • On my test server unmounting the home partition worked fine, but what will happen on the production server? Will people just lose access to their sites until it's mounted again? And when I mount it again, will everything just work?

  • AlexanderM Member, Top Host, Host Rep

    @myhken said:
    On my test server unmounting the home partition worked fine, but what will happen on the production server? Will people just lose access to their sites until it's mounted again? And when I mount it again, will everything just work?

    You would be best off scheduling the work as a maintenance window, and stopping certain services so nothing is writing to /home while you work.

    If it's a live server, you would be better off getting someone to do it for you, or simply mounting the new space as /home2.

    Also - you left the hostname in the PuTTY title bar.

    Alexander
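    A rough sketch of the maintenance-window approach Alexander describes (the service names are assumptions for a typical cPanel/CentOS box -- check what is actually running before copying this):

    # stop the services most likely to be writing to /home
    systemctl stop httpd exim dovecot    # on CentOS 6: service httpd stop, etc.

    # ... do the umount / resize / remount work here ...

    systemctl start httpd exim dovecot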

  • @AlexanderM said:
    Also - you left the hostname in the PuTTY title bar.

    Alexander

    You signed your post.

    Alexander

  • AlexanderM Member, Top Host, Host Rep

    @WSS said:

    @AlexanderM said:
    Also - you left the hostname in the PuTTY title bar.

    Alexander

    You signed your post.

    Alexander

    Sorry,

    WSS

  • @AlexanderM said:

    @WSS said:

    @AlexanderM said:
    Also - you left the hostname in the PuTTY title bar.

    Alexander

    You signed your post.

    Alexander

    Sorry,

    WSS

    That's better.

    Thanked by 1: AlexanderM
  • AlexanderM said: If it's a live server, you would be better off getting someone to do it for you, or simply mounting the new space as /home2.

    Also - you left the hostname in the PuTTY title bar.

    Fixed, and I just did it on the production server. I rebooted the server first, then ran the commands, and it just worked; I had to force a umount, but no issues besides that.
    And now we have 50% free space instead of 7%.

    The commands were:

    pvcreate /dev/sdd

    vgextend vg_cpanel1 /dev/sdd

    lvextend -L+74G /dev/vg_cpanel1/lv_home

    umount -f /dev/mapper/vg_cpanel1-lv_home

    e2fsck -f /dev/mapper/vg_cpanel1-lv_home

    resize2fs /dev/mapper/vg_cpanel1-lv_home

    mount /dev/mapper/vg_cpanel1-lv_home
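    For anyone repeating this later: the umount / e2fsck / resize2fs / mount sequence above is the offline way of growing the filesystem; on ext4 the resize2fs grow can normally also be done online, without unmounting /home. Either way, a quick read-only check afterwards confirms the result:

    lvs                                     # the LV should show the extra 74G
    df -h /home                             # the filesystem size should now match
    lsblk /dev/mapper/vg_cpanel1-lv_home    # block-device view of the same LV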
