Disaster recovery for self hosted Vaultwarden

I am curious to know how others are approaching this. I came up with a simple procedure to perform a disaster recovery of my password manager in case some major shit happens and

  • I lose access to all my devices with a copy of the Bitwarden client (fire at home or whatever)
    AND

  • the server hosting Vaultwarden is down/offline at the same time

This would be a very unlucky situation but one I want to be prepared for nevertheless.

So what I did was:

  • create a gmail account for this specific purpose (so it's separate from my own email, since I am self hosting that too now)
  • schedule a daily task on the Vaultwarden server that creates an archive, encrypted with openssl and a strong password, that includes Vaultwarden's postgres database and data directory with other files and config, and then sends this archive to that gmail address

The email includes just the encrypted archive as an attachment, plus a message in the body with the exact openssl command to decrypt it (minus the password, of course), so I can be sure about the version and the parameters I used for strong encryption.
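For reference, the daily task boils down to something like this. It's a rough sketch, not my exact setup: the paths, database name, mail client (mutt), and recipient address are all placeholders.

```shell
#!/bin/sh
# Sketch of the daily backup job. Paths, DB name, and the mail
# command are placeholders -- adapt to your own setup.
export BACKUP_PASSWORD='use-a-long-random-passphrase'  # better: read from a root-only file
STAMP=$(date +%Y%m%d)
WORKDIR=$(mktemp -d)
OUT="/tmp/vw-backup-$STAMP.tar.gz.enc"

# Collect the Postgres dump, Vaultwarden's data dir, and the compose file.
pg_dump -U vaultwarden vaultwarden > "$WORKDIR/vaultwarden.sql"
cp -r /opt/vaultwarden/data "$WORKDIR/data"
cp /opt/vaultwarden/docker-compose.yml "$WORKDIR/"

# Archive and encrypt; -pbkdf2 -iter uses a proper KDF instead of
# openssl's weak legacy key derivation.
tar -czf - -C "$WORKDIR" . \
  | openssl enc -aes-256-cbc -pbkdf2 -iter 100000 -salt \
      -pass env:BACKUP_PASSWORD -out "$OUT"

# Mail it (mutt shown as an example; any MTA works). The body carries the
# exact decrypt command, minus the password.
echo "Decrypt with: openssl enc -d -aes-256-cbc -pbkdf2 -iter 100000 -in FILE -out backup.tar.gz" \
  | mutt -s "Vaultwarden backup $STAMP" -a "$OUT" -- backup@example.com

rm -rf "$WORKDIR" "$OUT"
```

The -pbkdf2/-iter parameters matter: decryption must use exactly the same ones, which is why the full command goes in the email body.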

NOTE: the Vaultwarden vault is already encrypted, but there is still some non-sensitive data in the clear that I want to keep private anyway, so I encrypt everything in the attachment just in the unlikely case someone gains access to that gmail account (which is difficult to guess and has a strong password too).

Once I decrypt the archive, I have the .sql dump of the database, the directory with the other files used by Vaultwarden, and a docker-compose.yml.

So if something major happens, all I need to do to recover access to my passwords and thus all my services and servers is

  • download the attachment from the latest email sent to the dedicated gmail account
  • decrypt it
  • run docker compose up (docker is required of course)
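Concretely, the recovery amounts to something like the following. This sketch is self-contained: since the real attachment can't be included here, the first step fakes the downloaded file with a dummy encrypted archive, and the docker step is guarded so the snippet also runs where Docker isn't installed.

```shell
#!/bin/sh
set -e
export BACKUP_PASSWORD='the-passphrase-you-remember'

# Stand-in for "download the attachment": a dummy encrypted archive.
# In the real scenario this file comes from the dedicated gmail account.
mkdir -p demo && echo "-- postgres dump" > demo/vaultwarden.sql
tar -czf - -C demo . | openssl enc -aes-256-cbc -pbkdf2 -iter 100000 -salt \
    -pass env:BACKUP_PASSWORD -out vw-backup.tar.gz.enc

# The actual recovery: decrypt with the SAME cipher/KDF parameters, unpack.
openssl enc -d -aes-256-cbc -pbkdf2 -iter 100000 \
  -pass env:BACKUP_PASSWORD -in vw-backup.tar.gz.enc -out vw-backup.tar.gz
mkdir -p restore && tar -xzf vw-backup.tar.gz -C restore
ls restore  # vaultwarden.sql (plus data/ and docker-compose.yml in a real backup)

# Bring the instance up (guarded so the sketch runs without Docker/compose file).
if command -v docker >/dev/null && [ -f restore/docker-compose.yml ]; then
  (cd restore && docker compose up -d)
fi
```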

Boom, within a couple of minutes I have a copy of my Vaultwarden instance running locally. Then I can just visit /admin, enter the admin token (which of course I remember), disable 2FA on my vault, and log in to see all my passwords, secret notes, SSH/GPG keys, etc.

It's a simple procedure but critical if some major shit happens.

Do you self host your password manager? If yes do you have a disaster recovery procedure?

Comments

  • I ensure redundancy for my selfhosted Vaultwarden by running synchronized instances on 3 servers hosted on:

    • hxservers
    • Alpharacks
    • Dedipath

    This way, if up to 2 of them go down or encounter a problem, I still have a 3rd instance remaining.

  • @Moopah said:
    I ensure redundancy for my selfhosted Vaultwarden by running synchronized instances on 3 servers hosted on:

    • hxservers
    • Alpharacks
    • Dedipath

    This way, if up to 2 of them go down or encounter a problem, I still have a 3rd instance remaining.

    Interesting approach, I like it. I guess you are using small servers for the sync to keep costs down? Since Vaultwarden is very light anyway.

  • @Moopah said:
    I ensure redundancy for my selfhosted Vaultwarden by running synchronized instances on 3 servers hosted on:

    • hxservers
    • Alpharacks
    • Dedipath

    This way, if up to 2 of them go down or encounter a problem, I still have a 3rd instance remaining.

    How do you sync them btw?

  • remy Member
    edited December 2023

    I don't really understand why you do something so complicated.

    Personally, I just back up the Vaultwarden data (same as for my other docker containers / LXCs) to 2 different locations daily (my home storage server + a remote backup server).
    And I have verified that I can launch a docker container from the compose file and the backup data within a few minutes.

    That's all, and it doesn't seem to me to be the most critical thing,
    because, as you said, each client keeps a local copy.

    But that's just the procedure for all my containers.

  • @remy said:
    I don't really understand why you do something so complicated.

    Personally, I just back up the Vaultwarden data (same as for my other containers) to 2 different locations daily (my home storage server + a remote backup server).
    And I have verified that I can launch a docker container from the compose file and the backup data within a few minutes.

    That's all, and it doesn't seem to me to be the most critical thing,
    because, as you said, each client keeps a local copy.

    But that's just the procedure for all my containers.

    Yeah, but the whole premise of talking about "disaster recovery" in the first place is, like I said in the OP, that I lose access to everything at home due to a fire or something. In that case, if the Vaultwarden instance is also down right then, how do you access your backup copy of the Vaultwarden data?

  • It lives here: https://github.com/berkant/bw2gh/blob/main/bw.lowdb.json

    The drawback is it doesn't work with attachments, which I don't use whatsoever.

  • @0xbkt said:
    It lives here: https://github.com/berkant/bw2gh/blob/main/bw.lowdb.json

    The drawback is it doesn't work with attachments, which I don't use whatsoever.

    Is that.. your actual vault?

  • @vitobotta said:

    @0xbkt said:
    It lives here: https://github.com/berkant/bw2gh/blob/main/bw.lowdb.json

    The drawback is it doesn't work with attachments, which I don't use whatsoever.

    Is that.. your actual vault?

    Yeah, it is.

  • remy Member
    edited December 2023

    @vitobotta said:

    @remy said:
    I don't really understand why you do something so complicated.

    Personally, I just back up the Vaultwarden data (same as for my other containers) to 2 different locations daily (my home storage server + a remote backup server).
    And I have verified that I can launch a docker container from the compose file and the backup data within a few minutes.

    That's all, and it doesn't seem to me to be the most critical thing,
    because, as you said, each client keeps a local copy.

    But that's just the procedure for all my containers.

    Yeah, but the whole premise of talking about "disaster recovery" in the first place is, like I said in the OP, that I lose access to everything at home due to a fire or something. In that case, if the Vaultwarden instance is also down right then, how do you access your backup copy of the Vaultwarden data?

    So basically I'm left with just my phone? And the instance is down?
    I won't be worried because, as I said, I'll still have another backup.

    Vaultwarden VPS: Location A
    Remote backup: Location B
    My home: Location C

    And if my computer burned down, and so did my house...
    I don't think repairing the Vaultwarden instance would be the emergency.

    It will probably take me several days to buy a new computer,
    and in the meantime I'd still have access to the passwords on my phone even if the instance is down.
    To buy a new house on eBay :D

  • @0xbkt said:

    @vitobotta said:

    @0xbkt said:
    It lives here: https://github.com/berkant/bw2gh/blob/main/bw.lowdb.json

    The drawback is it doesn't work with attachments, which I don't use whatsoever.

    Is that.. your actual vault?

    Yeah, it is.

    Wow. Aren't you worried at all by exposing it like that? What if some vulnerability is found with Vaultwarden?

  • @remy said:

    @vitobotta said:

    @remy said:
    I don't really understand why you do something so complicated.

    Personally, I just back up the Vaultwarden data (same as for my other containers) to 2 different locations daily (my home storage server + a remote backup server).
    And I have verified that I can launch a docker container from the compose file and the backup data within a few minutes.

    That's all, and it doesn't seem to me to be the most critical thing,
    because, as you said, each client keeps a local copy.

    But that's just the procedure for all my containers.

    Yeah, but the whole premise of talking about "disaster recovery" in the first place is, like I said in the OP, that I lose access to everything at home due to a fire or something. In that case, if the Vaultwarden instance is also down right then, how do you access your backup copy of the Vaultwarden data?

    So basically I'm left with just my phone? And the instance is down?
    I won't be worried because, as I said, I'll still have another backup.

    Vaultwarden VPS: Location A
    Remote backup: Location B
    My home: Location C

    And if my computer burned down, and so did my house...
    I don't think repairing the Vaultwarden instance would be the emergency.

    It will probably take me several days to buy a new computer,
    and in the meantime I'd still have access to the passwords on my phone even if the instance is down.
    To buy a new house on eBay :D

    The problem is, if you get logged out on your phone, you can't log back in while the instance is unavailable. At least that's what I saw during testing.

  • @vitobotta said:

    @remy said:

    @vitobotta said:

    @remy said:
    I don't really understand why you do something so complicated.

    Personally, I just back up the Vaultwarden data (same as for my other containers) to 2 different locations daily (my home storage server + a remote backup server).
    And I have verified that I can launch a docker container from the compose file and the backup data within a few minutes.

    That's all, and it doesn't seem to me to be the most critical thing,
    because, as you said, each client keeps a local copy.

    But that's just the procedure for all my containers.

    Yeah, but the whole premise of talking about "disaster recovery" in the first place is, like I said in the OP, that I lose access to everything at home due to a fire or something. In that case, if the Vaultwarden instance is also down right then, how do you access your backup copy of the Vaultwarden data?

    So basically I'm left with just my phone? And the instance is down?
    I won't be worried because, as I said, I'll still have another backup.

    Vaultwarden VPS: Location A
    Remote backup: Location B
    My home: Location C

    And if my computer burned down, and so did my house...
    I don't think repairing the Vaultwarden instance would be the emergency.

    It will probably take me several days to buy a new computer,
    and in the meantime I'd still have access to the passwords on my phone even if the instance is down.
    To buy a new house on eBay :D

    The problem is, if you get logged out on your phone, you can't log back in while the instance is unavailable. At least that's what I saw during testing.

    That's a lot of ifs. I can always use the "forgot password" feature until I launch a new server.
    But I'll keep following the thread for any technical ideas that come up.

    Let's hope I don't die in a fire as a result of the pessimism of @vitobotta :#

  • 0xbkt Member
    edited December 2023

    @vitobotta said:

    @0xbkt said:

    @vitobotta said:

    @0xbkt said:
    It lives here: https://github.com/berkant/bw2gh/blob/main/bw.lowdb.json

    The drawback is it doesn't work with attachments, which I don't use whatsoever.

    Is that.. your actual vault?

    Yeah, it is.

    Wow. Aren't you worried at all by exposing it like that? What if some vulnerability is found with Vaultwarden?

    The server sees nothing. The Bitwarden CLI is used in an Actions workflow to sign in with token authentication, so you can't really get anything out of it without torturing my master password out of me in a $5 wrench attack, unless you have enough compute power to successfully brute-force the decryption. I don't feel I'm worth it, and I know no one pathetic enough to be after me.

  • I initially had a much simpler setup: a replicated VM (running docker) between my 2 Proxmox nodes.

    Backups went to a VM at my parents' house, from which I could manually restore the data.

    Then, thinking about having my spouse use it too instead of LastPass... I made it much more complicated:

    • 3-node Galera cluster to sync the DB in real time
    • haproxy at my home for load balancing / sending to the active (preferred) server

    • I have yet to move the 3rd node to my parents' place because I'm not done setting up all 3 nodes (see below)

    • I have not figured out a reliable way to sync the filesystem for attachments between all nodes (I've noticed rsync is not reliable) and haven't tested other alternatives yet

  • yoursunny Member, IPv6 Advocate

    @vitobotta said:

    • create a gmail account for this specific purpose

    Google could deadpool at any time.
    Google could delete your account at any time.

    • schedule a daily task on the Vaultwarden server that creates an archive, encrypted with openssl and a strong password, that includes Vaultwarden's postgres database and data directory with other files and config, and then sends this archive to that gmail address

    Google could reject your email at any time.

    • download the attachment from the latest email sent to the dedicated gmail account

    Google could ask you to react to a push notification on your Android at any time.
    Too bad the Android is burned.

  • @T1an said:
    I initially had a much simpler setup: a replicated VM (running docker) between my 2 Proxmox nodes.

    Backups went to a VM at my parents' house, from which I could manually restore the data.

    Then, thinking about having my spouse use it too instead of LastPass... I made it much more complicated:

    • 3-node Galera cluster to sync the DB in real time
    • haproxy at my home for load balancing / sending to the active (preferred) server

    • I have yet to move the 3rd node to my parents' place because I'm not done setting up all 3 nodes (see below)

    • I have not figured out a reliable way to sync the filesystem for attachments between all nodes (I've noticed rsync is not reliable) and haven't tested other alternatives yet

    You set up a Galera cluster for this? Wow

  • I use their hosted solution. Reading this thread, it seems that has saved me a lot of hours.
