Can't import sar files due to timestamp issues
Hi,
I have a couple of sar files that I would like to analyze with naarad; however, I'm getting some parsing errors.
My sar files were generated with a different timestamp format than the one sar_metric.py expects.
E.g.:
$ head sar15.cpu
Linux 2.6.32-504.8.1.el6.x86_64 (web.local) 2016-06-15 x86_64 (4 CPU)
00:00:01 CPU %usr %nice %sys %iowait %steal %irq %soft %guest %idle
00:01:01 all 10.17 0.43 2.99 43.54 0.00 0.07 0.37 0.00 42.43
00:01:01 0 8.75 0.53 2.83 49.33 0.00 0.29 1.16 0.00 37.11
The sar_metric.py parser currently looks a bit hardcoded in terms of timestamp parsing.
I fixed it with a quick and dirty hack so I could import my files, but in the long run I think using a timestamp parsing library would be preferable (see the sketch after the snippet below):
# adjusted for my sar files
try:
  sar_ts = line.split()[3]
  if '-' in sar_ts:
    ts_sep = '-'
  else:
    ts_sep = '/'
  datesar = sar_ts.split(ts_sep)
  # year is not fully qualified - this will work till year 2999 :)
  # if int(datesar[0]) < 1000:
  #   year = int(datesar[0]) + 2000
  #   datesar[0] = str(year)
except IndexError:
  logger.error("Header not found for file: %s", input_file)
  logger.error("line: %s", line)
  return False
if int(datesar[0]) > 1000:
  # ISO-style date, e.g. 2016-06-15 or 2016/06/06
  date = datesar[0] + '-' + datesar[1] + '-' + datesar[2]
else:
  # Assume US format 02/23/2012
  date = datesar[2] + '-' + datesar[0] + '-' + datesar[1]
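To illustrate what I mean by using a library, here is a minimal sketch (not naarad's actual code, and assuming python-dateutil is available); the helper name and the assumption that the date is the 4th whitespace-separated field of the sar header are mine:

# Minimal sketch, assuming python-dateutil is installed; the function name
# and the "date is the 4th header field" assumption are hypothetical.
from dateutil import parser as dateparser

def parse_sar_header_date(line):
  """Return the sar header date as 'YYYY-MM-DD', or None if not found."""
  try:
    sar_ts = line.split()[3]
  except IndexError:
    return None  # header line did not have a date field
  try:
    # dateutil accepts both '2016-06-15' and US-style '02/23/2012'
    return dateparser.parse(sar_ts).strftime('%Y-%m-%d')
  except (ValueError, OverflowError):
    return None  # field was not a recognizable date

That way the separator and field order would not need to be hardcoded at all.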
I just had the same issue. It seems like the code calls int(timestamp) on the keys, so it fails if you pass timestamps as the dictionary keys, but it works if you simply use an index instead.
# re-key the values by integer index before building the TimeSeries
enu_dict = {}
for i, v in enumerate(data['values']):
  enu_dict[i] = v
ts_object = TimeSeries(enu_dict)
That's how I got it to work; a short illustration of the failure mode follows.
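For reference, this is my own tiny example of the failure described above, not code from naarad:

# int() on an integer index works, but on a 'HH:MM:SS' timestamp string
# it raises ValueError, which is why indexed keys get through and
# timestamp keys do not.
int(0)            # -> 0
int('00:01:01')   # -> ValueError: invalid literal for int() with base 10: '00:01:01'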