Comments
@PetaByet Without wanting to be demanding, is there any ETA about the "restore on another server / ip" feature?
I'm pretty sure tar doesn't support this.
This week!
@PetaByet Please add Plesk backup in your next update too.
Plesk doesn't have an easy API to work with. It's easier if you use the file backup and database backup functions.
@shivoham
Or don't be a noob and stop using Plesk.
That's why the installer asks for a 'fresh' OS installation.
Thanks for sharing, awesome...
@PetaByet Does CDP provide a way to list the files inside a backup?
@PetaByet, actually tar does provide a way to list all the files inside a tarball; see the tar(1) man page:
$ tar -tvf sometarfile.tar
This should work for compressed files as well. Just put the flag that matches the compression algorithm that you have used:
$ tar -ztvf sometarfile.tar.gz
or
$ tar -jtvf sometarfile.tar.bz2
or any other algorithm.
If you are compressing with gzip or just putting files together without compressing it, you can use less:
$ less sometarfile.tar.gz
less gives a very graceful output, much like ls -l. And tar can pull just one file or directory out of an archive too:
$ tar -zxvf sometarfile.tar.gz somedir/subdir/justafile
If you want to pull a directory, use the same command but point at the directory path inside the tar file. Note that you don't put a / at the beginning of the internal tar path:
$ tar -zxvf sometarfile.tar.gz somedir/subdir/
This works for any compression algorithm used on the tarball. You can also extract all files matching a wildcard inside the tar file:
tar -zxvf sometarfile.tar.gz --wildcards --no-anchored '*.conf'
Or the same wildcard inside a specific dir:
tar -zxvf sometarfile.tar.gz somedir/subdir/ --wildcards '*.conf'
@EkaatyLinux Create an issue on GitHub, please.
It's opened.
Thanks, will have a look at this.
CDP v1.0 is out!
Link: https://github.com/PetaByet/cdp/releases/tag/V1.0
New screenshots: http://imgur.com/a/yPALv
I shall have some exclusive Vultr storage VPS coupons for you in a few days.
@PetaByet - first, nice to see someone taking backups seriously.
However...I personally would never use this.
The problem is that you're pushing backups from the client to the backup server. If your client (the VPS being backed up) is ever compromised, the attacker can easily wipe out your backups, too. The attacker has all the backup config info, keys, etc.
I always have my backups pulled from the backup server for this reason. The backup server contacts the client and does an rsync, rotating directories on its end if I want the ability to go back in time.
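As a rough sketch of that "rotating directories" idea on the backup server side: keep numbered snapshots and hardlink-copy the newest one forward, so unchanged files cost no extra disk space. This is a hedged illustration, not CDP's mechanism; the paths, retention count, and the `somenode::backup` module name are placeholders.

```shell
# Keep up to 8 dated snapshots in daily.0 .. daily.7, rotating on each run.
backups=$(mktemp -d)                 # stand-in for /backups/somenode
mkdir "$backups/daily.0"
echo demo > "$backups/daily.0/file.txt"

rotate() {
    rm -rf "$backups/daily.7"        # drop the oldest snapshot
    for i in 6 5 4 3 2 1 0; do
        [ -d "$backups/daily.$i" ] && mv "$backups/daily.$i" "$backups/daily.$((i+1))"
    done
    # hardlink copy of the newest snapshot becomes the new daily.0
    [ -d "$backups/daily.1" ] && cp -al "$backups/daily.1" "$backups/daily.0"
}

rotate
# then pull fresh data into daily.0, e.g.:
#   rsync -a --delete somenode::backup/ "$backups/daily.0/"
```

After the pull, rsync replaces changed files in daily.0 (breaking their hardlinks), while untouched files stay shared with daily.1.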
rsync over ssh is possible, but I don't recommend it because then you have to have lots of clients trusting the backup server's key. If the backup server is compromised...
So instead I run rsyncd, which can be setup read only, only allowing one concurrent connection, with a password, only permitting access from the master(s), and of course you can lock it down with iptables as well.
Admittedly, this is more of a headache to setup, but it's the only solution I've found (other than running a proper backup client like R1Soft, bacula, etc.) that meets all the requirements.
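For reference, a locked-down rsyncd module along those lines might look like this on the node being backed up. This is a hedged sketch: the module name, path, user, and IP are placeholders, not anything CDP ships.

```ini
# /etc/rsyncd.conf on the backed-up node
[vpsbackup]
    path = /srv
    # an attacker on the backup server cannot write back to the node
    read only = yes
    max connections = 1
    auth users = backupmaster
    # /etc/rsyncd.secrets holds "backupmaster:password", chmod 600
    secrets file = /etc/rsyncd.secrets
    # the backup master's IP only
    hosts allow = 203.0.113.10
```

The backup server then pulls with something like `rsync -a rsync://backupmaster@node/vpsbackup/ /backups/node/`, and iptables can restrict port 873 on top of that.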
So if the backup server gets compromised, one has full read-only access to every node? In that case, the attacker could easily rsync all the servers' private keys, sensitive files etc to the backup server and then transfer those to somewhere else?
Depends what you backup - I exclude root's .ssh dir.
In fact, I only rsync backups that are created locally on the server. So:
I'd like to find a proper distributed batch management system for BSD/Linux but one doesn't seem to exist and so far I've been too lazy to write one.
Oh, alright. That's exactly what I do, although I push the (encrypted) backup to the backup server.
That way the backup server doesn't have any kind of access to nodes + even if the backup server gets compromised, everything is encrypted anyway. I believe this is the most secure "home-made" backup solution.
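A minimal sketch of that push-encrypted approach, assuming openssl stands in for whatever cipher tool you prefer; the temp paths, the passphrase, and the `backup@backupserver` destination are placeholders:

```shell
# Tar locally, encrypt locally, push only the ciphertext --
# the backup server never holds keys or plaintext.
workdir=$(mktemp -d)
echo "demo data" > "$workdir/important.txt"
tar -czf "$workdir/backup.tar.gz" -C "$workdir" important.txt

# Encrypt symmetrically; in practice read the passphrase from a
# root-only file instead of putting it on the command line.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:demo-passphrase \
    -in  "$workdir/backup.tar.gz" \
    -out "$workdir/backup.tar.gz.enc"

# Then push the ciphertext, e.g.:
#   scp "$workdir/backup.tar.gz.enc" backup@backupserver:/backups/
```

Even a full compromise of the backup server then yields only ciphertext, at the cost of having to keep the passphrase safe on (or off) each node.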
This isn't a good idea right now. CDP uses JSON files as its database, which shouldn't be used for more than a couple of users. As the system grows and accumulates data in those files, CDP will get slower, and slower, and slower.
The security of the system isn't that good either. I'm working on implementing a better user management system for CDP.
CDP is very good and is getting more powerful over time, but it needs some improvement before it's powerful enough for that kind of use.
Wasn't planning to. Someone seems to want ACLs, pre-defined resources, etc. Just answered the 'why'.
Sorry, but my advice is still valid :P
@raindog308 @socials What about an optional server agent?
Unison can be the server agent. You can make an API for auth when a client tries to push backups to the server.
I have problems with these Linux images on Online.net's dedi,
and the last option left is CentOS 7, which doesn't work with CDP.
hello,
I installed cdp.me on a new machine with CentOS 6.5.
Here is what I do:
When I want to add a server, it does not work.
The server is not registered.
Thank you for your help
@comeback @psycholyzern Please try now.
I'm reinstalling Debian 7 on my Online.net dedi; will share the result.
What now? No files have been backed up.
As the backup server, do I just add a password for a non-root user on a storage box, or what?
Did you get any errors or emailed a backup log (email can be set in config.php)?
Password or SSH key for any user that has enough privileges to tar the directories.
Is it possible to back up a full cPanel server, i.e. all cPanel accounts under WHM?