lynxthecat/cake-autorate

Octave plotting tool update

Opened this issue · 12 comments

The DATA and SUMMARY lines were tweaked a while ago - see headers here:

print_headers()

@moeller0 any chance you might be able to update the Octave script to use the current ones?

Maybe? Could you just send me a recent logfile with the new data, so I can add this?

Here is an up-to-date log file:

cake-autorate.primary.log

The data lines seem incorrect:

DATA; 2024-08-05-10:05:14; 1722848714.085549; 1722848714.084762; 0; 0; 0; 0; [1722848714.08198]; 1.0.0.1; 0; 37900; 31000; -655; -6900; 80200; 37900; 31000; -655; -6900; 80400; 0; -2318; 30200; 130200; 0; -2318; 30400; 130400; dl_idle; ul_idle; 60000; 30000

The ICMP timestamp has superfluous square brackets around it: [1722848714.08198]. The parser does not know what to do with them... The easiest fix would be to remove the brackets from the log files. Or is there a deeper rationale for the square brackets?
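For existing log files, a hedged one-off workaround (assuming the brackets only ever appear around that single timestamp field) would be to strip them with sed before feeding the log to the parser. A minimal sketch, demonstrated on a shortened copy of the DATA line above rather than on a real file:

```shell
# Hypothetical cleanup: strip the square brackets around the ICMP timestamp
# field so the parser sees a plain number. For a whole file one would run
# the same sed expression over the file instead of a single line.
line='DATA; 2024-08-05-10:05:14; 1722848714.085549; 1722848714.084762; 0; 0; 0; 0; [1722848714.08198]; 1.0.0.1; 0; 37900'
printf '%s\n' "$line" | sed 's/\[\([0-9.]*\)\]/\1/'
```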


Were those not there before?

Not as far as I can tell; here is an example from a log file someone sent me earlier this year:

DATA; 2024-04-24-11:11:05; 1713946265.521166; 1713946265.521078; 60; 5; 0; 0; 1713946265.520201; 195.50.193.163; 2552; -4906; 16000; 22283; 20906; 30234; 1449; 17000; 8068; 15551; 30984; 0; 20186; 60234; 0; 7939; 60984; dl_idle; ul_idle; 52547; 12500

No brackets?

Ah, and should there be an extra digit after the decimal place? It looks like maybe we should use ${timestamp_us} rather than ${timestamp}:

timestamp_us=${timestamp//[\[\].]}0

Looking at the old code, it looks like we set timestamp to timestamp_us before outputting:

timestamp_us=${timestamp//[.]}
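To illustrate the difference between the two expansions (a minimal sketch, using the bracketed timestamp from the log line above): the newer expansion strips the brackets and the decimal point and pads a trailing zero, yielding a full integer microsecond value, whereas the old one only removed the decimal point:

```shell
# New expansion: the bracket expression [\[\].] removes '[', ']' and '.',
# and the appended '0' pads the 5 fractional digits out to microseconds.
timestamp='[1722848714.08198]'
timestamp_us=${timestamp//[\[\].]}0
echo "$timestamp_us"    # 1722848714081980

# Old expansion: only the decimal point is removed, so a 5-decimal
# timestamp comes out one digit short of full microseconds.
old_ts='1722848714.08198'
old_ts_us=${old_ts//[.]}
echo "$old_ts_us"       # 172284871408198
```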

@rany2 do you suppose the best fix is simply to swap ${timestamp_us} for ${timestamp} here:

				if (( output_processing_stats ))
				then
					printf -v processing_stats '%s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s' "${EPOCHREALTIME}" "${achieved_rate_kbps[dl]}" "${achieved_rate_kbps[ul]}" "${load_percent[dl]}" "${load_percent[ul]}" "${timestamp_us}" "${reflector}" "${seq}" "${dl_owd_baselines_us[${reflector}]}" "${dl_owd_us}" "${dl_owd_delta_ewmas_us[${reflector}]}" "${dl_owd_delta_us}" "${compensated_owd_delta_delay_thr_us[dl]}" "${ul_owd_baselines_us[${reflector}]}" "${ul_owd_us}" "${ul_owd_delta_ewmas_us[${reflector}]}" "${ul_owd_delta_us}" "${compensated_owd_delta_delay_thr_us[ul]}" "${sum_dl_delays}" "${avg_owd_delta_us[dl]}" "${compensated_avg_owd_delta_max_adjust_up_thr_us[dl]}" "${compensated_avg_owd_delta_max_adjust_down_thr_us[dl]}" "${sum_ul_delays}" "${avg_owd_delta_us[ul]}" "${compensated_avg_owd_delta_max_adjust_up_thr_us[ul]}" "${compensated_avg_owd_delta_max_adjust_down_thr_us[ul]}" "${load_condition[dl]}" "${load_condition[ul]}" "${shaper_rate_kbps[dl]}" "${shaper_rate_kbps[ul]}"
					log_msg "DATA" "${processing_stats}"
				fi

printf -v processing_stats '%s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s; %s' "${EPOCHREALTIME}" "${achieved_rate_kbps[dl]}" "${achieved_rate_kbps[ul]}" "${load_percent[dl]}" "${load_percent[ul]}" "${timestamp}" "${reflector}" "${seq}" "${dl_owd_baselines_us[${reflector}]}" "${dl_owd_us}" "${dl_owd_delta_ewmas_us[${reflector}]}" "${dl_owd_delta_us}" "${compensated_owd_delta_delay_thr_us[dl]}" "${ul_owd_baselines_us[${reflector}]}" "${ul_owd_us}" "${ul_owd_delta_ewmas_us[${reflector}]}" "${ul_owd_delta_us}" "${compensated_owd_delta_delay_thr_us[ul]}" "${sum_dl_delays}" "${avg_owd_delta_us[dl]}" "${compensated_avg_owd_delta_max_adjust_up_thr_us[dl]}" "${compensated_avg_owd_delta_max_adjust_down_thr_us[dl]}" "${sum_ul_delays}" "${avg_owd_delta_us[ul]}" "${compensated_avg_owd_delta_max_adjust_up_thr_us[ul]}" "${compensated_avg_owd_delta_max_adjust_down_thr_us[ul]}" "${load_condition[dl]}" "${load_condition[ul]}" "${shaper_rate_kbps[dl]}" "${shaper_rate_kbps[ul]}"

All I can say is that the parser does not like the brackets, and if they serve no purpose it seems easiest to remove them from the logs rather than have the parser strip them from every line (parsing these files is already slow, no need to make it even slower ;) )

I'll regenerate log without brackets now.

This should fix things, hopefully, as the parser will first dissect the header to figure out dynamically how many data fields to expect...
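The idea can be sketched as follows (a minimal shell illustration of the principle, not the actual Octave parser code), using the DATA line quoted earlier in this thread:

```shell
# Sketch of the dynamic-field-count idea: count the semicolon-separated
# fields in a DATA line. A parser that first reads the header line can
# compare each DATA line's field count against the header's count and
# skip or flag lines that do not match.
line='DATA; 2024-08-05-10:05:14; 1722848714.085549; 1722848714.084762; 0; 0; 0; 0; [1722848714.08198]; 1.0.0.1; 0; 37900; 31000; -655; -6900; 80200; 37900; 31000; -655; -6900; 80400; 0; -2318; 30200; 130200; 0; -2318; 30400; 130400; dl_idle; ul_idle; 60000; 30000'
printf '%s\n' "$line" | awk -F'; ' '{ print NF }'    # 33 fields for this line
```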