Discussion:
[BackupPC-devel] BackupPC 4.1.1 released
Craig Barratt
2017-03-30 16:21:54 UTC
Permalink
BackupPC 4.1.1 <https://github.com/backuppc/backuppc/releases/tag/4.1.1> has
been released on Github.

BackupPC 4.1.1 is a bug fix release. There are several minor bug fixes
listed below.

Craig

* Merged pull requests: #77, #78, #79, #82

* Added missing BackupPC_migrateV3toV4 to makeDist (issue #75) reported
by spikebike.

* Fixed divide-by-zero in progress % report in BackupPC_migrateV3toV4
(issue #75) reported by spikebike.

* In lib/BackupPC/Lib.pm, if Socket::getaddrinfo() doesn't exist (ie,
an old version of Socket.pm), then default to ipv4 ping.

* Updates to configure.pl to make config-path default be based on
config-dir (#79), prepended config-path with dest-dir, fixing a
config.pl merge bug affecting $Conf{PingPath} reported by Richard Shaw,
and a few other fixes.

* Updated required version of BackupPC::XS to 0.53 and rsync_bpc to
3.0.9.6.

* Minor changes to systemd/src/init.d/gentoo-backuppc from sigmoidal (#82).

* Added RuntimeDirectory to systemd/src/backuppc.service.

* Use the scalar form of getpwnam() in lib/BackupPC/CGI/Lib.pm and
lib/BackupPC/Lib.pm
Bill Broadley
2017-04-05 03:48:42 UTC
Permalink
Two or so weeks ago I upgraded a backuppc 3.3 system to 4.0. Things generally
worked pretty well. The pool was 500GB or so, 31 hosts, all using rsync+ssh on the
client side and rsync-bpc on the server side.

A larger setup with 40 hosts, 2.7TB pool, and backuppc-4.1.1 has had no problems.

I haven't tinkered with it for a week:
http://broadley.org/backuppc-pool.png

But suddenly the pool is 100% full:
$ df -h /backuppc/
Filesystem Size Used Avail Use% Mounted on
/dev/md1 3.6T 3.4T 984M 100% /backuppc

I poked around a bit without finding anything obvious.

Any thoughts on whether this is worth tracking down? Or should I just upgrade to
backuppc 4.1.1 (btw, the main backuppc page still lists 4.0) and the related
rsync-bpc and friends?

I'm running du -x on that partition, but I think it has 55M files, so it's
going to take a while. I'm hoping feeding that to xdu or similar will show
something obvious.

I did stop/start the backuppc daemon, so it's not a hung file handle held by the
main backuppc process.



------------------------------------------------------------------------------
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
_______________________________________________
BackupPC-devel mailing list
BackupPC-***@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-devel
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
Craig Barratt
2017-04-05 05:16:36 UTC
Permalink
Bill,

What is $Conf{PoolV3Enabled} set to?

Craig
Bill Broadley
2017-04-05 05:19:42 UTC
Permalink
Post by Craig Barratt
Bill,
What is $Conf{PoolV3Enabled} set to?
***@node1:/etc/backuppc# cat config.pl | grep PoolV3
$Conf{PoolV3Enabled} = '0';

I did that after the V3 to V4 migrate completed. I ran the migrate again and it
didn't do anything and finished quickly. I restarted the backuppc daemon as well.



Craig Barratt
2017-04-05 05:59:47 UTC
Permalink
Bill,

Is the V3 pool empty? You can check by looking in, eg,
TOPDIR/cpool/0/0/0. If there are files there, check that all of them have
only 1 link.
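One way to automate that check is find's -links predicate, which matches on the hardlink count directly. A sketch in a sandbox (substitute the real TOPDIR/cpool path in practice):

```shell
# Sketch: list v3 pool files still referenced by a backup (link count > 1).
# A temporary directory stands in for TOPDIR/cpool here.
tmp=$(mktemp -d)
mkdir -p "$tmp/cpool/0/0/0"
echo a > "$tmp/cpool/0/0/0/aaa"               # 1 link: orphaned pool file
echo b > "$tmp/cpool/0/0/0/bbb"
ln "$tmp/cpool/0/0/0/bbb" "$tmp/backupcopy"   # 2 links: still referenced
# -links +1 matches files with MORE than one hardlink
find "$tmp/cpool" -type f -links +1
rm -rf "$tmp"
```

On a real pool, `find TOPDIR/cpool/?/?/? -type f -links +1 | wc -l` gives a quick count of how many v3 files are still referenced.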

Craig
Bill Broadley
2017-04-05 06:04:44 UTC
Permalink
Bill,
Is the V3 pool empty? You can check by looking in, eg, TOPDIR/cpool/0/0/0. If
there are files there, check that all of them have only 1 link.
***@node1:/backuppc# du -hs cpool/0/0/0
380K cpool/0/0/0
***@node1:/backuppc# find cpool/0/0/0 | wc -l
63
***@node1:/backuppc# ls -al cpool/0/0/0 | head -10
total 384
drwxr-x--- 2 backuppc backuppc 114688 Mar 28 21:58 .
drwxr-x--- 18 backuppc backuppc 4096 Aug 24 2011 ..
-rw-r----- 3 backuppc backuppc 32 Apr 23 2014 00002564f9012849e45bfa1f4fd47578
-rw-r----- 2 backuppc backuppc 159 Apr 23 2014 000026b5ae9afbffa56382c6019dbfe1
-rw-r----- 2 backuppc backuppc 40 Oct 29 2014 0000c32069243ef9cb6fd5113bd0891c
-rw-r----- 1 backuppc backuppc 35 Feb 15 20:07 0000c34a3e2faf9ccf2c31c6ded1c849
-rw-r----- 2 backuppc backuppc 55 Dec 28 22:00 0000c50fe18070ca0ae0d11aaed2f261
-rw-r----- 2 backuppc backuppc 133 Apr 23 2014 0000feef14f3c8f589894b983456389a
-rw-r----- 1 backuppc backuppc 145 Feb 11 2015 00012f3df3fef9176f4a08f470d1f5e6

Not sure how to check the 1 link thing.





higuita
2017-04-05 22:25:45 UTC
Permalink
Hi
Post by Bill Broadley
total 384
drwxr-x--- 2 backuppc backuppc 114688 Mar 28 21:58 .
drwxr-x--- 18 backuppc backuppc 4096 Aug 24 2011 ..
-rw-r----- 3 backuppc backuppc 32 Apr 23 2014 00002564f9012849e45bfa1f4fd47578
-rw-r----- 2 backuppc backuppc 159 Apr 23 2014 000026b5ae9afbffa56382c6019dbfe1
-rw-r----- 2 backuppc backuppc 40 Oct 29 2014 0000c32069243ef9cb6fd5113bd0891c
-rw-r----- 1 backuppc backuppc 35 Feb 15 20:07 0000c34a3e2faf9ccf2c31c6ded1c849
-rw-r----- 2 backuppc backuppc 55 Dec 28 22:00 0000c50fe18070ca0ae0d11aaed2f261
-rw-r----- 2 backuppc backuppc 133 Apr 23 2014 0000feef14f3c8f589894b983456389a
-rw-r----- 1 backuppc backuppc 145 Feb 11 2015 00012f3df3fef9176f4a08f470d1f5e6
^
|
This field is the number of hardlinks.
So if you have entries >1, then you still have backups pointing to the v3 pool.

Try to find what backups are still in V3 format
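The link count can also be read per file with stat (GNU coreutils; %h is the hardlink count). A sandbox sketch; on a real pool you would point this at files under TOPDIR/cpool:

```shell
# Sketch: read the hardlink count directly with GNU stat.
# The sandbox file stands in for a real pool file.
tmp=$(mktemp -d)
echo data > "$tmp/poolfile"
stat -c '%h %n' "$tmp/poolfile"    # link count 1: nothing else references it
ln "$tmp/poolfile" "$tmp/extra"
stat -c '%h %n' "$tmp/poolfile"    # link count is now 2: still referenced
rm -rf "$tmp"
```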

Best regards
--
Naturally the common people don't want war... but after all it is the
leaders of a country who determine the policy, and it is always a
simple matter to drag the people along, whether it is a democracy, or
a fascist dictatorship, or a parliament, or a communist dictatorship.
Voice or no voice, the people can always be brought to the bidding of
the leaders. That is easy. All you have to do is tell them they are
being attacked, and denounce the pacifists for lack of patriotism and
exposing the country to danger. It works the same in every country.
-- Hermann Goering, Nazi and war criminal, 1883-1946
Bill Broadley
2017-04-07 08:08:49 UTC
Permalink
Hi
Post by Bill Broadley
-rw-r----- 1 backuppc backuppc 145 Feb 11 2015 00012f3df3fef9176f4a08f470d1f5e6
This field is the number of hardlinks.
So if you have entries >1, then you still have backups pointing to the v3 pool.
Odd.

I ran the V3 to V4 migration script several times; it wasn't finding anything
and ran quickly. I was worried that my filesystem was somehow corrupt, since it
had been up for 360-some days. I unmounted and fsck'd it: not a single complaint.

***@node1:/backuppc/cpool/0/0/0# ls -al | awk ' { print $2 } ' | grep -v "1" | wc -l
39

I didn't have many with more than one link. The entire dir is small:
***@node1:/backuppc/cpool/0/0/0# du -hs .
380K .

I see similar elsewhere:
***@node1:/backuppc/cpool/8/8/8# ls -al | wc -l; ls -al | awk ' { print $2 } ' | grep -v "1" | wc -l
59
31

(59 files, 31 with more than one link).

I'm still seeing crazy disk usage: over 3TB used, with only about 650GB (the
total from the host status "full size" column) visible to backuppc.

Keep in mind this happened with no changes to the server. 30 hosts backed up
for a week or so, then suddenly much more disk was used. No host has larger
backups; the pool just grew by a factor of 6 one night.

I was using the v3 to v4 migration script from git since it wasn't in the
release package yet (that's since been fixed).

I upgraded to backuppc 4.1.1, and the current versions of backuppc-xs and
rsync-bpc. I ran the V3 to V4 migration script (now included in the release)
again and it's doing some serious chewing (unlike before). It used to just fly
through them all with "refCnt directory; skipping this backup".

So maybe this will fix it.


Craig Barratt
2017-04-08 17:54:41 UTC
Permalink
Bill,

/backuppc/cpool/0/0/0 is just one of the 4096 3.x pool directories (each of
the last three path components is a single hex digit, 0-9 or a-f). So to see the
total storage remaining in the 3.x pool you should do this:

du -csh /backuppc/cpool/?/?/?

I'm not sure why some of the 3.x files didn't get migrated. You could pick
one that has more than one link (e.g., 00002564f9012849e45bfa1f4fd47578
above) and find its inode:

ls -li /backuppc/cpool/0/0/0/00002564f9012849e45bfa1f4fd47578

then look for other files that have that same inode (replace NNN with the
inode printed by ls -i):

find /backuppc -inum NNN -print
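The two steps can be combined so the inode never has to be copied by hand; stat -c %i prints it directly (GNU coreutils). A sandbox sketch rather than the real /backuppc tree:

```shell
# Sketch: find every name sharing an inode with a given pool file.
# Sandbox demo; substitute a real pool file and /backuppc in practice.
tmp=$(mktemp -d)
echo data > "$tmp/a"
ln "$tmp/a" "$tmp/b"
inode=$(stat -c %i "$tmp/a")
# -xdev keeps find on the same filesystem, since inode numbers are
# only unique per filesystem
find "$tmp" -xdev -inum "$inode"
rm -rf "$tmp"
```

Each printed path is another hardlink to the same data, i.e. another backup still referencing that pool file.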

But given your point about 0/0/0 being quite small, it's unlikely this can
explain 3TB of extra usage, and I suspect the du command above won't show
more than a few MB.

So another path is to use du to find which directories are so large. For
example:

du -hs /backuppc

If that number is reasonable, then it must be something outside /backuppc
that is using so much space.

Next:

du -hs /backuppc/cpool /backuppc/pool /backuppc/pc

Are any of those close to 3TB? If so, do the du inside those directories
to narrow things down.

Is it possible your excludes aren't working after the 4.x transition? For
example, on some Linux systems /var/log/lastlog is a sparse file, and
backing it up will create a huge (regular) file.
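The sparse-file effect is easy to demonstrate: du can report both the apparent size and the blocks actually allocated, and the gap is what a naive copy would fill in. A self-contained sketch (assumes a filesystem with sparse-file support, which is typical on Linux):

```shell
# Sketch: apparent size vs allocated blocks for a sparse file.
truncate -s 100M sparsefile        # 100 MB apparent, almost no blocks allocated
du -h --apparent-size sparsefile   # reports the 100M logical size
du -h sparsefile                   # typically reports ~0: nothing allocated
rm sparsefile
```

A backup tool that doesn't preserve sparseness turns that ~0 of real usage into a full 100 MB on the server, which is how a lastlog of a few GB apparent size can quietly eat a pool.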

You could also use find to look for single huge files, eg:

find /backuppc -size +1G -print

will list all files over 1G.

Craig