Starting this morning, I got errors with my two backup jobs that upload to B2. When I was notified of the errors, I saw that there was a Duplicati update available. Thinking it could be the cause of the errors, I went ahead and updated to 2.0.7.1_beta_2023-05-25.
I then tried to run my backup job again, and got the last error shown below.
It seems like there might be some issue with B2, but I don’t know how to get log information from B2 in order to troubleshoot. Can somebody help me figure this out? Duplicati has been rock solid for me for a few years now, so not having a daily backup is stressing me out!
2023-06-01 05:14:19 -07 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.IO.IOException: Unable to read data from the transport connection: Connection reset by peer. ---> System.Net.Sockets.SocketException: Connection reset by peer
--- End of inner exception stack trace ---
at System.Net.Sockets.Socket+AwaitableSocketAsyncEventArgs.ThrowException (System.Net.Sockets.SocketError error) [0x00007] in <33b19a7ad5234d94abf4fd9b47566616>:0
at System.Net.Sockets.Socket+AwaitableSocketAsyncEventArgs.System.Threading.Tasks.Sources.IValueTaskSource.GetResult (System.Int16 token) [0x0001f] in <33b19a7ad5234d94abf4fd9b47566616>:0
at System.Threading.Tasks.ValueTask+ValueTaskSourceAsTask+<>c.<.cctor>b__4_0 (System.Object state) [0x00030] in <de882a77e7c14f8ba5d298093dde82b2>:0
--- End of stack trace from previous location where exception was thrown ---
at Mono.Net.Security.MobileAuthenticatedStream.InnerWrite (System.Boolean sync, System.Threading.CancellationToken cancellationToken) [0x000d3] in <33b19a7ad5234d94abf4fd9b47566616>:0
at Mono.Net.Security.AsyncProtocolRequest.ProcessOperation (System.Threading.CancellationToken cancellationToken) [0x00199] in <33b19a7ad5234d94abf4fd9b47566616>:0
at Mono.Net.Security.AsyncProtocolRequest.StartOperation (System.Threading.CancellationToken cancellationToken) [0x0008b] in <33b19a7ad5234d94abf4fd9b47566616>:0
at Mono.Net.Security.MobileAuthenticatedStream.StartOperation (Mono.Net.Security.MobileAuthenticatedStream+OperationType type, Mono.Net.Security.AsyncProtocolRequest asyncRequest, System.Threading.CancellationToken cancellationToken) [0x0024b] in <33b19a7ad5234d94abf4fd9b47566616>:0
at System.Net.WebRequestStream.ProcessWrite (System.Byte[] buffer, System.Int32 offset, System.Int32 size, System.Threading.CancellationToken cancellationToken) [0x00270] in <33b19a7ad5234d94abf4fd9b47566616>:0
at System.Net.WebRequestStream.WriteAsyncInner (System.Byte[] buffer, System.Int32 offset, System.Int32 size, System.Net.WebCompletionSource completion, System.Threading.CancellationToken cancellationToken) [0x001aa] in <33b19a7ad5234d94abf4fd9b47566616>:0
at Duplicati.Library.Utility.Utility.CopyStreamAsync (System.IO.Stream source, System.IO.Stream target, System.Boolean tryRewindSource, System.Threading.CancellationToken cancelToken, System.Byte[] buf) [0x00166] in <2a3ee711c7c04f6c957360f2cf183a7f>:0
at Duplicati.Library.Utility.Utility.CopyStreamAsync (System.IO.Stream source, System.IO.Stream target, System.Threading.CancellationToken cancelToken) [0x00077] in <2a3ee711c7c04f6c957360f2cf183a7f>:0
at Duplicati.Library.Backend.Backblaze.B2+<>c__DisplayClass28_0.<PutAsync>b__1 (Duplicati.Library.Utility.AsyncHttpRequest req, System.Threading.CancellationToken reqCancelToken) [0x0008a] in <f30a9ba7585445e094ae4320fb244dfc>:0
at Duplicati.Library.JSONWebHelper.GetJSONDataAsync[T] (System.String url, System.Threading.CancellationToken cancelToken, System.Action`1[T] setup, System.Func`3[T1,T2,TResult] setupbodyreq) [0x000c1] in <fd6b26d6c1004933aab594493968119c>:0
at Duplicati.Library.Backend.Backblaze.B2.PutAsync (System.String remotename, System.IO.Stream stream, System.Threading.CancellationToken cancelToken) [0x003c7] in <f30a9ba7585445e094ae4320fb244dfc>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.DoPut (Duplicati.Library.Main.Operation.Common.BackendHandler+FileEntryItem item, Duplicati.Library.Interface.IBackend backend, System.Threading.CancellationToken cancelToken) [0x00426] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader+<>c__DisplayClass20_0.<UploadFileAsync>b__0 () [0x0010a] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.DoWithRetry (System.Func`1[TResult] method, Duplicati.Library.Main.Operation.Common.BackendHandler+FileEntryItem item, Duplicati.Library.Main.Operation.Backup.BackendUploader+Worker worker, System.Threading.CancellationToken cancelToken) [0x0017c] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.DoWithRetry (System.Func`1[TResult] method, Duplicati.Library.Main.Operation.Common.BackendHandler+FileEntryItem item, Duplicati.Library.Main.Operation.Backup.BackendUploader+Worker worker, System.Threading.CancellationToken cancelToken) [0x003a5] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.UploadFileAsync (Duplicati.Library.Main.Operation.Common.BackendHandler+FileEntryItem item, Duplicati.Library.Main.Operation.Backup.BackendUploader+Worker worker, System.Threading.CancellationToken cancelToken) [0x000da] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.UploadBlockAndIndexAsync (Duplicati.Library.Main.Operation.Backup.VolumeUploadRequest upload, Duplicati.Library.Main.Operation.Backup.BackendUploader+Worker worker, System.Threading.CancellationToken cancelToken) [0x00098] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at CoCoL.AutomationExtensions.RunTask[T] (T channels, System.Func`2[T,TResult] method, System.Boolean catchRetiredExceptions) [0x000d4] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0
at Duplicati.Library.Main.Operation.BackupHandler.FlushBackend (Duplicati.Library.Main.BackupResults result, CoCoL.IWriteChannel`1[T] uploadtarget, System.Threading.Tasks.Task uploader) [0x001da] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00cb2] in <e60bc008dd1b454d861cfacbdd3760b9>:0
2023-06-01 05:32:49 -07 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Net.Sockets.SocketException (0x80004005): Connection reset by peer
at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0 (<>f__AnonymousType12`1[<Input>j__TPar] self) [0x005ac] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0 (<>f__AnonymousType12`1[<Input>j__TPar] self) [0x007d7] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at CoCoL.AutomationExtensions.RunTask[T] (T channels, System.Func`2[T,TResult] method, System.Boolean catchRetiredExceptions) [0x000d5] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0
at Duplicati.Library.Main.Operation.BackupHandler.FlushBackend (Duplicati.Library.Main.BackupResults result, CoCoL.IWriteChannel`1[T] uploadtarget, System.Threading.Tasks.Task uploader) [0x000f2] in <e60bc008dd1b454d861cfacbdd3760b9>:0
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00cb2] in <e60bc008dd1b454d861cfacbdd3760b9>:0
2023-06-01 07:03:48 -07 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Net.Sockets.SocketException (0x80004005): The socket has been shut down
at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0 (<>f__AnonymousType12`1[<Input>j__TPar] self) [0x005ac] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0 (<>f__AnonymousType12`1[<Input>j__TPar] self) [0x007d7] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at CoCoL.AutomationExtensions.RunTask[T] (T channels, System.Func`2[T,TResult] method, System.Boolean catchRetiredExceptions) [0x000d5] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0
at Duplicati.Library.Main.Operation.BackupHandler.FlushBackend (Duplicati.Library.Main.BackupResults result, CoCoL.IWriteChannel`1[T] uploadtarget, System.Threading.Tasks.Task uploader) [0x000f2] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00cb2] in <9cab5b8f1b4f49ec980acae8e278968b>:0
Hello
How about testing the connection? There is a button for that in the UI: just click the edit button and go to the second step (Destination). Or, from the command line, there is
mono Duplicati.CommandLine.BackendTester.exe --max-file-size=20kb --number-of-files=2 --reruns=1 <connection string>
You can get the connection string for your backend by exporting the backup as a command line. If the backend is working, it should report that the directory is not empty. If your backend supports creating a directory, change the directory in the connection string (you have to create it in the backend first) and it will run an extended test in that directory.
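For instance, with a made-up B2 connection string (bucket, folder, and keys below are placeholders, and the exact parameter names may differ; use whatever the export actually gives you):

mono Duplicati.CommandLine.BackendTester.exe --max-file-size=20kb --number-of-files=2 --reruns=1 "b2://my-bucket/duplicati-test?auth-username=KEY_ID&auth-password=APPLICATION_KEY"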
Testing the connection from the UI succeeded. When I tried to run it from the command line, I got Cannot open assembly 'Duplicati.CommandLine.BackendTester.exe': No such file or directory.
It’s probably worth noting that leading up to the errors in my original post, there are several “Backend event: Put - Started” and “Backend event: Put - Completed” entries in the logs. I think that means that at least some of the files in the backup were written to B2 successfully, right?
I tried running the job again, and now I’m getting a different error:
2023-06-01 20:21:14 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2023-06-01 20:23:19 -07 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 10: 5/18/2023 5:00:00 AM (database id: 1415), found 261396 entries, but expected 261397
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0036e] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at Duplicati.Library.Main.Operation.Common.SingleRunner.DoRunOnMain[T] (System.Func`1[TResult] method) [0x000b0] in <9cab5b8f1b4f49ec980acae8e278968b>:0
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x0034d] in <9cab5b8f1b4f49ec980acae8e278968b>:0
What’s this error about? Why would a problem with the fileset on 5/18 prevent me from doing a backup today?
using (var cmd2 = m_connection.CreateCommand(transaction))
foreach (var filesetid in cmd.ExecuteReaderEnumerable(@"SELECT ""ID"" FROM ""Fileset"" ").Select(x => x.ConvertValueToInt64(0, -1)))
{
    // "found" count: distinct paths reachable through the fileset's file list plus folders and symlinks
    var expandedCmd = string.Format(@"SELECT COUNT(*) FROM (SELECT DISTINCT ""Path"" FROM ({0}) UNION SELECT DISTINCT ""Path"" FROM ({1}))", LocalDatabase.LIST_FILESETS, LocalDatabase.LIST_FOLDERS_AND_SYMLINKS);
    var expandedlist = cmd2.ExecuteScalarInt64(expandedCmd, 0, filesetid, FOLDER_BLOCKSET_ID, SYMLINK_BLOCKSET_ID, filesetid);
    //var storedfilelist = cmd2.ExecuteScalarInt64(string.Format(@"SELECT COUNT(*) FROM ""FilesetEntry"", ""FileLookup"" WHERE ""FilesetEntry"".""FilesetID"" = ? AND ""FileLookup"".""ID"" = ""FilesetEntry"".""FileID"" AND ""FileLookup"".""BlocksetID"" != ? AND ""FileLookup"".""BlocksetID"" != ?"), 0, filesetid, FOLDER_BLOCKSET_ID, SYMLINK_BLOCKSET_ID);

    // "expected" count: rows recorded in FilesetEntry for this version
    var storedlist = cmd2.ExecuteScalarInt64(@"SELECT COUNT(*) FROM ""FilesetEntry"" WHERE ""FilesetEntry"".""FilesetID"" = ?", 0, filesetid);

    if (expandedlist != storedlist)
    {
        // build a readable name ("version N: date (database id: M)") for the error message
        var filesetname = filesetid.ToString();
        var fileset = FilesetTimes.Zip(Enumerable.Range(0, FilesetTimes.Count()), (a, b) => new Tuple<long, long, DateTime>(b, a.Key, a.Value)).FirstOrDefault(x => x.Item2 == filesetid);
        if (fileset != null)
            filesetname = string.Format("version {0}: {1} (database id: {2})", fileset.Item1, fileset.Item3, fileset.Item2);

        throw new Interface.UserInformationException(string.Format("Unexpected difference in fileset {0}, found {1} entries, but expected {2}", filesetname, expandedlist, storedlist), "FilesetDifferences");
    }
}
storedlist is the straightforward one: it comes from the record of what’s in that version, and it’s shown as the expected count. The found value, one less here, is computed from what appear to be the internal pieces actually available.
Something might have gone missing, but there’s no history of it unless you had set up a profiling log.
If you can post a link to a database bug report that you store somewhere, maybe there will be clues…
Having you type in database queries that I suggest is possible, but that’s probably a more difficult method.
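For reference, a minimal sketch of what such a check could look like, assuming the sqlite3 command-line tool, a made-up job database path, and the database id 1415 from your error message (all three are assumptions to adjust):

sqlite3 ~/.config/Duplicati/XXXXXXXXXX.sqlite <<'SQL'
-- list fileset ids with their backup times (Timestamp appears to be stored as unix seconds)
SELECT "ID", datetime("Timestamp", 'unixepoch') FROM "Fileset" ORDER BY "Timestamp";
-- the "expected" side of the error message: rows recorded in FilesetEntry for that version
SELECT COUNT(*) FROM "FilesetEntry" WHERE "FilesetID" = 1415;
SQL

The “found” side is the larger union query in the code above (LIST_FILESETS plus LIST_FOLDERS_AND_SYMLINKS), which I haven’t tried to reproduce here.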
FWIW I’m backing up to Backblaze B2 just fine, but possibly your issues are due to network problems.
Export As Command-line and edit the URL to point at an empty folder for Duplicati.CommandLine.BackendTester.exe would be one test. Or check whether any backup that works shows a non-zero “RetryAttempts” in its Complete log.
You can also try restarting any WiFi, routing, or other network equipment to see whether that will help.
Network errors that outlast number-of-retries sometimes cause issues, and some spots are more vulnerable than others.
Compacting files at the backend has issues on interruption, but there may be other sensitive spots too.
tkohhh:
Why would a problem with the fileset on 5/18 prevent me from doing a backup today?
The verification stops at the first error it finds. Sometimes deleting that version will help, but sometimes another version then pops up as broken instead. This is because deduplication means a version shares identical blocks (and other things) with other versions. Versions are not full file copies… The advantage is a huge space saving. The disadvantage is that damage to a shared item damages every version that shares it.
Although SQL doesn’t guarantee it, I think verification usually runs in increasing date order. How many backups happened on 5/18 and after? I’m a little concerned if this is a daily backup, as that would put a lot of later versions at risk.
This is a daily backup with retention set at 1W:1D,4W:1W,12M:1M,5Y:1Y
The bug report is here: tomk.xyz/download/bugreport.zip
As for today, the main backup failed again with the Unexpected difference in fileset version 10 error. However, the database backup succeeded. The log shows 0 retry attempts, so it seems everything is fine on that particular backup (today at least!).
Please let me know how I can proceed.
Is that a backup of the database from the main backup (which some people do in case DB gets lost)?
If so, that might come in handy at some time, serving as a historical reference. Is retention the same?
How large is that bug report? I can’t download it using 2 browsers and 2 Internet connections.
Last try said it’s 1.2 GB, but the four tries have only gotten 512034, 512093, 512050, 512038.
EDIT:
It was downloading at a pretty good clip until it ended early, which the 7-Zip Test function confirms.
If there’s a web server limit or log you can adjust or examine, that might give some answers…
If all else fails, 7-Zip and probably other tools can split a large file into a series of smaller files.
I suppose it’s also possible that some network glitch is causing both the B2 problems and this.
tkohhh:
the main backup failed again with the Unexpected difference in fileset version 10.
Note that you can ignore this error with the undocumented flag --disable-filelist-consistency-checks.
ts678:
I suppose it’s also possible that some network glitch is causing both the B2 problems and this.
I don’t think so, as I don’t host that site myself. It’s possible there are some file size limitations with my website hosting provider.
Can you try this link? It’s on my Nextcloud server: bugreport.zip
Note that you can ignore this error with the undocumented flag --disable-filelist-consistency-checks.
How would I implement that? What are the implications of ignoring the error?
@tkohhh
Edit your backup and add the advanced option on tab 5.
Implication: obviously you are working with a damaged database, with a dubious filelist. If you can’t afford to repair the database (and if your system is dubious, a database repair may not be the best option), you can still do backups. It’s not a long-term solution, unless you can somehow be sure that this damaged filelist was an accident.
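If you ever run the job from an exported command line instead of the UI, the same option can simply be appended there; a rough sketch with a placeholder URL and source path (take the real values from Export as command line):

mono Duplicati.CommandLine.exe backup "b2://my-bucket/folder?auth-username=KEY_ID&auth-password=APPLICATION_KEY" /path/to/source --disable-filelist-consistency-checks=true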
gpatel-fr:
If you can’t afford to repair the database (and if your system is dubious, a database repair may not be the best option)
Nobody has suggested that a database repair is in order, so I haven’t considered that. Is that something I should be thinking about? What is meant by “if your system is dubious”?
tkohhh:
Cannot open assembly ‘Duplicati.CommandLine.BackendTester.exe’: No such file or directory.
Are you in a directory that has that file? It’s in Duplicati’s installation folder.
Meanwhile, the bug report has a clear view of when the B2 uploads broke.
Regardless of any DB fixes, it’s important to get the destination working…
I’m not seeing a compact (good), but maybe use the no-auto-compact option for a while, especially if you’re considering trying to get this limping along again.
Without any compact, it might be possible to just back the recent history out.
After a compact, old and new data are mixed, reducing the chances of recovery.
I’ll keep looking for damage like what verification saw. It takes lots of SQL.
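If it comes to that, a first step could be listing the versions from an exported command line so it’s clear what “recent history” covers; a sketch with a placeholder URL (the exported command includes the real URL and passphrase):

mono Duplicati.CommandLine.exe list "b2://my-bucket/folder?auth-username=KEY_ID&auth-password=APPLICATION_KEY"

The no-auto-compact option can be appended to the exported backup command in the same way (as --no-auto-compact=true) while investigating.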
I don’t think I’d say I want to get to “limping”. What I really want is to get back to where my daily backup jobs are running and backing up my data. I’d prefer if I could do that without starting from scratch and losing the versions that have accumulated in the years that I’ve been running Duplicati. If that’s not possible, I’d like to know that now so I can nuke it all and start from scratch.
With that in mind, is it possible to just delete all versions from May 18th forward? The idea is to eliminate the fileset with the error and any subsequent filesets that might depend on the May 18th fileset. Would that work?