There are a lot of 5520 errors now. In addition, we sometimes see "A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: SSL Provider, error: 0 - The wait operation timed out.)"

The mitigation was finished yesterday (at least that is what it says on status.visma.com), but there is no doubt that something is still not as it should be.
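For anyone hit by the same thing in their integration: since both the 5520 responses and the handshake timeouts look transient, we have wrapped our GET calls in a simple retry with exponential backoff as a stopgap. A minimal sketch in Python; the marker strings are taken from the responses in this thread, and the headers are whatever authentication your integration already sends:

    import time
    import requests

    # Error signatures from this thread that we treat as transient and retry.
    TRANSIENT_MARKERS = ('"ExceptionFaultCode":"5520"', 'pre-login handshake')

    def get_with_retry(url, headers, params=None, attempts=5):
        """GET with exponential backoff (1s, 2s, 4s, ...) on known transient errors."""
        for attempt in range(attempts):
            try:
                response = requests.get(url, headers=headers, params=params, timeout=60)
                if response.ok:
                    return response
                if not any(marker in response.text for marker in TRANSIENT_MARKERS):
                    response.raise_for_status()  # unknown error: fail fast
            except requests.ConnectionError:
                pass  # dropped connection, also worth a retry
            time.sleep(2 ** attempt)
        raise RuntimeError("still failing after %d attempts: %s" % (attempts, url))

This obviously doesn't fix anything on Visma's side; it just keeps our nightly sync from aborting on the first transient error.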

We have been able to identify the issue and we are currently applying a fix.

Please let us know if you are still seeing connection issues.

Thank you.

New strange error:

Url: /controller/api/v1/paymentMethod/
Content received:
{"message":"An error has occurred.","exceptionMessage":"The 'CompressedContent' type failed to serialize the response body for content type 'application/json; charset=utf-8'.","exceptionType":"System.InvalidOperationException","stackTrace":null,"innerException":{"message":"An error has occurred.","exceptionMessage":"Timeout performing EVAL (5000ms), next: EVAL, inst: 2, qu: 0, qs: 7, aw: False, rs: ReadAsync, ws: Idle, in: 0, serverEndpoint: 10.10.6.4:15005, mc: 1/1/0, mgr: 8 of 10 available, clientName: erp096701000036, PerfCounterHelperkeyHashSlot: 15367, IOCP: (Busy=0,Free=1000,Min=8,Max=1000), WORKER: (Busy=32,Free=32735,Min=8,Max=32767), v: 2.1.58.34321 (Please take a look at this article for some common client-side issues that can cause timeouts: https://stackexchange.github.io/StackExchange.Redis/Timeouts)","exceptionType":"StackExchange.Redis.RedisTimeoutException","stackTrace":" at StackExchange.Redis.ConnectionMultiplexer.ExecuteSyncImpl[T](Message message, ResultProcessor`1 processor, ServerEndPoint server) in /_/src/StackExchange.Redis/ConnectionMultiplexer.cs:line 2807\r\n at StackExchange.Redis.RedisBase.ExecuteSync[T](Message message, ResultProcessor`1 processor, ServerEndPoint server) in /_/src/StackExchange.Redis/RedisBase.cs:line 54\r\n at StackExchange.Redis.RedisDatabase.ScriptEvaluate(String script, RedisKey[] keys, RedisValue[] values, CommandFlags flags) in /_/src/StackExchange.Redis/RedisDatabase.cs:line 1153\r\n at Microsoft.Web.Redis.StackExchangeClientConnection.<>c__DisplayClass11_0.b__0()\r\n at Microsoft.Web.Redis.StackExchangeClientConnection.RetryForScriptNotFound(Func`1 redisOperation)\r\n at Microsoft.Web.Redis.StackExchangeClientConnection.RetryLogic(Func`1 redisOperation)\r\n at Microsoft.Web.Redis.StackExchangeClientConnection.Eval(String script, String[] keyArgs, Object[] valueArgs)\r\n at Microsoft.Web.Redis.RedisConnectionWrapper.TryCheckWriteLockAndGetData(Object& lockId, ISessionStateItemCollection& data, Int32& sessionTimeout)\r\n at Microsoft.Web.Redis.RedisSessionStateProvider.GetItemFromSessionStore(Boolean isWriteLockRequired, HttpContext context, String id, Boolean& locked, TimeSpan& lockAge, Object& lockId, SessionStateActions& actions)\r\n at Microsoft.Web.Redis.RedisSessionStateProvider.GetItem(HttpContext context, String id, Boolean& locked, TimeSpan& lockAge, Object& lockId, SessionStateActions& actions)\r\n at PX.Data.PXSessionStateStore.ProviderGetItem(HttpContext context, String id, Boolean& locked, TimeSpan& lockAge, Object& lockId, SessionStateActions& actionFlags)\r\n at PX.Data.PXSessionStateStore.LoadSessionItem(HttpContext context, String id, Boolean isReadonly, Object& lockId, Boolean& isNew)\r\n at PX.Data.PXSessionStateStore.WriteToSession(String id, Boolean isReadonly, Action`1 processMethod)\r\n at PX.Data.PXSessionStateStore.ProcessWithSessionContext(String id, Boolean isReadonly, Action`1 processMethod)\r\n at PX.Data.PXLongOperation.PXTaskPool.ReadSession(Object key, Boolean& abort)\r\n at PX.Data.PXLongOperation.PXTaskPool.TryGetValue(Object key, PXAsyncResult& result)\r\n at PX.Data.PXLongOperation.ClearStatus(Object key, Boolean abort)\r\n at PX.Data.PXLongOperation.ClearStatus(Object key)\r\n at PX.Data.PXGraph.Clear(PXClearOption option)\r\n at Visma.net.ERP.Api.Base.BaseRepository.d__4.MoveNext() in D:\\Data\\BuildAgent\\work\\11023e52ea15d342\\Web\\Lib\\Visma.net.ERP.Api\\Base\\BaseRepository.cs:line 77\r\n at Visma.net.ERP.Api.CA.Repository.PaymentMethodRepository.d__12.MoveNext() in 
D:\\Data\\BuildAgent\\work\\11023e52ea15d342\\Web\\Lib\\Visma.net.ERP.Api\\CA\\Repository\\PaymentMethodRepository.cs:line 129\r\n at Visma.net.ERP.Web.Api.Mapping.PaymentMethodMappingExtension.d__1.MoveNext() in D:\\Data\\BuildAgent\\work\\11023e52ea15d342\\Web\\Lib\\Visma.net.ERP.Web.Api\\Mapping\\PaymentMethodMappingExtension.cs:line 31\r\n at System.Linq.Buffer`1..ctor(IEnumerable`1 source)\r\n at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)\r\n at Visma.net.ERP.Web.Api.Mapping.PaymentMethodMappingExtension.ToDtoPaymentMethod(IPaymentMethodVo paymentMethodVo) in D:\\Data\\BuildAgent\\work\\11023e52ea15d342\\Web\\Lib\\Visma.net.ERP.Web.Api\\Mapping\\PaymentMethodMappingExtension.cs:line 16\r\n at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()\r\n at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeList(JsonWriter writer, IEnumerable values, JsonArrayContract contract, JsonProperty member, JsonContainerContract collectionContract, JsonProperty containerProperty)\r\n at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize(JsonWriter jsonWriter, Object value, Type objectType)\r\n at Newtonsoft.Json.JsonSerializer.SerializeInternal(JsonWriter jsonWriter, Object value, Type objectType)\r\n at System.Net.Http.Formatting.BaseJsonMediaTypeFormatter.WriteToStream(Type type, Object value, Stream writeStream, Encoding effectiveEncoding)\r\n at System.Net.Http.Formatting.JsonMediaTypeFormatter.WriteToStream(Type type, Object value, Stream writeStream, Encoding effectiveEncoding)\r\n at System.Net.Http.Formatting.BaseJsonMediaTypeFormatter.WriteToStreamAsync(Type type, Object value, Stream writeStream, HttpContent content, TransportContext transportContext, CancellationToken cancellationToken)\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at PX.Api.Compression.BaseCompressor.d__4.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at PX.Api.Compression.CompressedContent.d__4.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.WebHost.HttpControllerHandler.d__22.MoveNext()"}}
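Side note for anyone parsing these responses: the part that actually tells you what happened (the RedisTimeoutException) is nested inside innerException, so we unwrap the whole chain before deciding whether a 500 is worth retrying. A small Python sketch, using only the field names visible in the response above:

    import json

    def exception_chain(body_text):
        """Collect (exceptionType, exceptionMessage) through nested innerException objects."""
        chain = []
        node = json.loads(body_text)
        while node is not None:
            chain.append((node.get("exceptionType"), node.get("exceptionMessage")))
            node = node.get("innerException")
        return chain

For the paymentMethod response above this yields System.InvalidOperationException wrapping StackExchange.Redis.RedisTimeoutException, i.e. a server-side session-state timeout rather than anything wrong with the request, so we simply retry the call.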

It's a lot better, but there are still quite a few 5520 errors. These are the last three that I have logged:

18:13: Completed: /controller/api/v1/Inventory?availabilityLastModifiedDateTimeCondition=%3E&availabilityLastModifiedDateTime=2023-10-23 18:02:18&pageNumber=1
{"ExceptionType":"IPPException","ExceptionMessage":"","ExceptionFaultCode":"5520","ExceptionMessageID":"5520_45f4518c-0a8f-453b-9392-9a94fe84de41","ExceptionDetails":""}

18:09: Completed: /controller/api/v1/Inventory?availabilityLastModifiedDateTimeCondition=%3E&availabilityLastModifiedDateTime=2023-10-23 18:03:25&pageNumber=2
{"ExceptionType":"IPPException","ExceptionMessage":"","ExceptionFaultCode":"5520","ExceptionMessageID":"5520_c9ed6b64-c339-4e30-8f97-2241534c9675","ExceptionDetails":""}

18:07: Completed: /controller/api/v1/carrier/
{"ExceptionType":"IPPException","ExceptionMessage":"","ExceptionFaultCode":"5520","ExceptionMessageID":"5520_93028f4b-ae94-460c-b38c-1b6045af90aa","ExceptionDetails":""}

I can see that the status has changed to Operational, but that seems a bit premature.
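For reference, the Inventory calls above are plain paged GETs; we build them as in the sketch below and let the HTTP library handle the encoding (the ">" becomes %3E, as in the logged URLs). The base URL and headers are placeholders for your own environment:

    import requests

    BASE_URL = "https://<your-erp-host>/controller/api/v1"  # placeholder
    HEADERS = {"Authorization": "Bearer <token>"}            # plus any context headers you already send

    def changed_inventory(since, page=1):
        """One page of inventory items whose availability changed after 'since'."""
        params = {
            "availabilityLastModifiedDateTimeCondition": ">",
            "availabilityLastModifiedDateTime": since,  # e.g. "2023-10-23 18:02:18"
            "pageNumber": page,
        }
        return requests.get(BASE_URL + "/Inventory", headers=HEADERS, params=params, timeout=60)

So there is nothing exotic on our side that should be triggering a 5520.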

From the service log:

Error: InternalServerError

Error: VismaId: 3593668b-3ed6-4c57-8eb1-466ba02181b4. Error updating supplier invoice. A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: SSL Provider, error: 0 - The wait operation timed out.)
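Because the handshake error here hits mid-write, the client cannot tell whether the invoice update actually landed, so we don't blindly retry writes the way we do reads: we first re-read the invoice and compare. A rough Python sketch; the supplierInvoice path and the flat field comparison are assumptions to adjust to your own integration:

    import requests

    BASE_URL = "https://<your-erp-host>/controller/api/v1"  # placeholder, as above
    HEADERS = {"Authorization": "Bearer <token>"}           # placeholder

    def update_already_applied(invoice_number, payload):
        """After a failed PUT, check whether the update actually landed before retrying.

        The supplierInvoice endpoint path and the field-by-field comparison are
        assumptions; adapt both to the document type you are updating.
        """
        url = "%s/supplierInvoice/%s" % (BASE_URL, invoice_number)
        current = requests.get(url, headers=HEADERS, timeout=60).json()
        return all(current.get(field) == value for field, value in payload.items())

Only if this returns False do we send the update again; otherwise a retry would repeat a write that already succeeded.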

IPP exception during the weekend

We have identified the same problem as above this morning, along with a 502 Bad Gateway error. Is this something Visma are aware of?

These seem to mainly occur when handling supplier invoices. Examples of the errors below:

{"ExceptionType":"IPPException","ExceptionMessage":"","ExceptionFaultCode":"5520","ExceptionMessageID":"5520_f468c19a-16fa-4251-ac21-928d180061dd","ExceptionDetails":""}.

{"message":"VismaId: f22461e8-9168-43b8-baf6-a387b230acb0. A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: SSL Provider, error: 0 - The wait operation timed out.)"}. .

Error response: <html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
</body>
</html>
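One practical note on the 502s: the body above is the gateway's HTML, not the usual JSON error, so any client that calls response.json() unconditionally will crash with a decode error on top of the real failure. We guard for that before classifying the error; a sketch:

    import requests

    def parse_error(response):
        """Return the API's JSON error body, or a stub when a gateway answers with HTML."""
        if "application/json" in response.headers.get("Content-Type", ""):
            return response.json()
        # 502s like the one above come back as text/html from the gateway.
        return {"message": "non-JSON error response",
                "status": response.status_code,
                "body": response.text[:200]}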