saleem-mirza/serilog-sinks-azure-analytics

Some messages not appearing in LogAnalytics

Closed this issue · 10 comments

We are currently having issues with certain log messages not appearing in our LogAnalytics workspaces.

Tech stack:

  * .NET Core 2.1 API
  * Serilog.AspNetCore 2.1.1
  * Serilog.Sinks.AzureAnalytics 4.0.0

The app runs in Service Fabric inside a managed cluster.

Serilog is configured to log to a rolling file and to the Azure sink, with a batch size of 10 and logBufferSize left at its default.
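Our setup looks roughly like the following (a sketch with placeholder keys; the exact AzureAnalytics parameter names, such as workspaceId, authenticationId and batchSize, are assumed from the sink overload we use and may differ by version):

        using Serilog;

        // Sketch of the logger setup: rolling file plus the Azure Analytics sink.
        Log.Logger = new LoggerConfiguration()
            .WriteTo.File("logs/app-.log", rollingInterval: RollingInterval.Day)   // rolling file sink
            .WriteTo.AzureAnalytics(
                workspaceId: "<workspace-id>",        // Log Analytics workspace ID (placeholder)
                authenticationId: "<primary-key>",    // workspace shared key (placeholder)
                batchSize: 10)                        // logBufferSize left at its default
            .CreateLogger();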

Our API calls out to a third-party calculation engine, and we have lots of log messages wrapped around these HTTP calls that dump the internal mapped C# Request and Response objects serialized to JSON, along with timer metrics.

These are large nested objects with many properties, but in general each JSON string is approximately 5-20 KB in size.

Our understanding is that each POST Serilog makes to the Azure Data Collector API has a limit of 30 MB (i.e. per batch). Given the above configuration and our message sizes, we aren't getting anywhere near that limit.

Serilog doesn't seem to log any internal errors that we can see, and it appears to send all the message batches, but sometimes the messages simply never appear, even after several days.

We would really appreciate some pointers to help us get to the bottom of this.

Some additional questions:

  1. Could we be hitting other request limits, e.g. cluster / Service Fabric / .NET SDK?
  2. Are we going against best practice by using Azure Log Analytics to log such large chunks of JSON in this way, and if so, what's the alternative?

Thanks

I'm trying to debug this and get Serilog's SelfLog working, but it doesn't seem to output to a text file.

I've added an issue on the main Serilog repo here: serilog/serilog#1272 so you can see my setup.

When I output SelfLog to the console I can see it logging each batch being sent to the Azure Data Collector API, but the SelfLog text file is always empty, and not all the messages in each batch appear in Log Analytics.

I need to get SelfLog working in our Service Fabric environment. When the API runs there it has no console, so text file output is the only way to see what this sink is doing, unless you can point me in a better direction?
Thanks

@sicollins apologies for the inconvenience. I can successfully log SelfLog messages to a text file, see:

        // Write SelfLog output to a file; wrap the writer so concurrent sink threads can write safely.
        var writer = System.IO.File.CreateText("selflog.txt");
        Serilog.Debugging.SelfLog.Enable(System.IO.TextWriter.Synchronized(writer));

Make sure you close the writer with writer.Close() before your application exits, otherwise buffered output may be lost.

Regarding the rest of your issues, I believe your application is being terminated unexpectedly, which prevents the logger from transmitting logs to the remote server.

Thanks @saleem-mirza. The app is not being terminated unexpectedly; otherwise I would see telemetry for this, and when it restarts we log service start-up messages. So it is running continuously, unless of course a new deployment replaces it.

Sounds great. I am interested to see the log, so please see if you can capture SelfLog. One more thing: this sink by default can accept 25000 messages in bulk. If you have more than 25000 messages in a burst, it will ignore the excess.

See what happens if you increase the logBufferSize limit to 50000 or even 75000. Also consider tweaking the batch size.
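If you are configuring the sink in code, the tweak would look something like this (a sketch of the WriteTo call only; it assumes the batchSize and logBufferSize named parameters, and the key values are placeholders):

        // Raise the buffer ceiling above the 25000-message default and post larger batches.
        .WriteTo.AzureAnalytics(
            workspaceId: "<workspace-id>",
            authenticationId: "<primary-key>",
            logBufferSize: 50000,    // events held in memory before new ones are dropped
            batchSize: 100)          // events per POST to the Data Collector API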

Did you perhaps add some new properties? After we did this, log events stopped showing up in Log Analytics, although I think the sink sends them correctly to the collection API. Log events without the new property show up in Log Analytics correctly. I'm currently investigating this issue with Microsoft support.

I'm sorry, I'm lost. What properties are you talking about? Please explain how I can reproduce the issue.

Hi @saleem-mirza,
according to Microsoft Support, there is a limit of about 500 custom fields per Log Analytics workspace (check Azure portal -> Log Analytics workspace -> Advanced settings -> Data -> Custom fields for the current field count). It shows that we currently have 599 custom fields, and as soon as we log something with yet another custom field, that event is not available in Log Analytics. The main source of custom fields seems to be exceptions (with nested inner exceptions):

[Screenshot: Log Analytics workspace advanced settings showing the custom field count, 2019-01-21]

I think the new flattenObject feature should help here :)
@sicollins: how many custom fields do you have? There's a good chance that we have encountered the same limitation.

@MartinDoerig I think you're on to something here. Just checked one of our Workspaces and it currently has 889 custom fields!

This is extremely difficult to control, especially when you are running multiple microservices, each logging different events and messages, and some calling out to 3rd-party APIs. When exceptions occur, as you mention, those stack traces and inner exceptions can swallow up many new custom fields, on top of your own structured logs.

The flatten feature would definitely help, @saleem-mirza.

For now, I've taken a new approach to how I log my large C# Request/Response models, and it seems to be working better, with no loss of log messages thus far - but I'm still testing.

I create a new LogModel object with a few fields that specify the type of payload, plus the payload itself serialized to a JSON string using JsonConvert.SerializeObject(...). I then pass this into my Debug log call using the {@log} destructuring syntax (sketched below).

In LogAnalytics I can then see:

LogProperties_Log_PayloadType_s
LogProperties_Log_Payload_s

From these I can extract the JSON if need be and filter the logs using the other LogModel properties.
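Roughly, the approach looks like this (a sketch; LogModel and its member names are just what I chose, and the {@Log} property name appears to be what produces the LogProperties_Log_ prefix above):

        using Newtonsoft.Json;
        using Serilog;

        // Wrapper that keeps the payload as one string column instead of a deep object graph.
        public class LogModel
        {
            public string PayloadType { get; set; }   // e.g. "CalculationRequest" / "CalculationResponse"
            public string Payload { get; set; }       // the full model pre-serialized to a JSON string
        }

        // At the call site, wrap the mapped model and destructure the wrapper:
        var entry = new LogModel
        {
            PayloadType = "CalculationResponse",
            Payload = JsonConvert.SerializeObject(response)   // 'response' is the mapped C# model
        };
        Log.Debug("Calculation engine call completed {@Log}", entry);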

Glad to hear that you have figured out a solution. However, you do not need to call JsonConvert.SerializeObject(...); just pass the object to the logger using '{@log}'.
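i.e. something like this (a sketch; 'response' stands for any of your mapped models):

        // The '@' operator destructures the object itself; no manual serialization is needed.
        Log.Debug("Calculation engine call completed {@Log}", response);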