COVESA/dlt-daemon

Offline logstorage does not rotate to a new file during continuous power-on

Closed this issue · 7 comments

When a log file is full, no new file is created to continue writing logs.

Hello @tangzhiqiang3,
thank you for your interest in DLT topics.

Could you provide more information on this, e.g. your dlt_logstorage.conf and your dlt.conf?

Version: origin/master (4ed1b97).

dlt.conf

##############################################################################
# Offline logstorage                                                         #
##############################################################################
# Send automatic get log info response during context registration
SendContextRegistration = 1

# Set ECU ID (Default: ECU1)
ECUId = CSC1

# Size of shared memory (Default: 100000)
SharedMemorySize = 100000

# Directory where to store the persistant configuration (Default: /tmp)
PersistanceStoragePath = /var/data

# Store DLT log messages, if not set offline logstorage is off (Default: off)
# Maximum devices to be used as offline logstorage devices
OfflineLogstorageMaxDevices = 10

# Path to store DLT offline log storage messages (Default: off)
OfflineLogstorageDirPath = /var/log/qisi

# Wrap around value for log file count in file name (Default: UINT_MAX)
OfflineLogstorageMaxCounter = 100

# Maximal used memory for Logstorage Cache in KB (Default: 30000 KB)
OfflineLogstorageCacheSize = 80000

# Store DLT messages to local directory, if not set offline Trace is off (Default: off)
OfflineTraceDirectory = /var/log/qisi

ControlSocketPath = /tmp/tmp.xxxx.sock

dlt_logstorage.conf

[FILTER1]
LogAppName=xxx
ContextName=xxx, xxx
LogLevel=DLT_LOG_INFO
File=xxx
FileSize=10485760
NOFiles=50
GzipCompression=on
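For reference, with the settings above the daemon is expected to roll over to a new file once one reaches FileSize, keep at most NOFiles files, and wrap the file-name counter back to 1 after OfflineLogstorageMaxCounter. A minimal sketch of that expected wrap behavior (illustrative only, not the daemon's actual code):

```python
# Sketch (not DLT source code): expected log-file index wrap-around
# once the counter reaches OfflineLogstorageMaxCounter.

def next_index(current, max_counter):
    """Return the next log-file index, wrapping back to 1 after max_counter."""
    return 1 if current >= max_counter else current + 1

# Simulate 120 rotations with OfflineLogstorageMaxCounter = 100.
indices = []
idx = 0
for _ in range(120):
    idx = next_index(idx, 100)
    indices.append(idx)

print(indices[98], indices[99], indices[100])  # -> 99 100 1
```

If rotation stalls once the counter reaches its maximum instead of wrapping, that would match the symptom described in this issue.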

Could you please help me take a look? Thank you

Hello @tangzhiqiang3,
sorry, we have been busy these days.

Have you made any progress on this issue?
Also, which dlt-daemon version is causing it?
I will try to reproduce it on my side and comment.

Regards

The issue has not been fixed yet; please keep this open.

Hello @tangzhiqiang3,
sorry for the late fix.
A fix is now available for review at: #571

Okay, I'll try again

Please try it and ping me if there are any issues.
Thank you