openvinotoolkit/nncf

[Good First Issue][NNCF]: Optimize memory footprint by removing redundant collected statistics

kshpv opened this issue · 8 comments

Context

One of the most frequently used algorithms in NNCF is PostTrainingQuantization.

This algorithm operates within a pipeline structure, comprising multiple steps. Each step is independent and may involve the iterative application of one or more algorithms to a model.

During the execution of each step, all statistics points requested by all participating algorithms are gathered. Then, based on the provided statistics points, statistics are collected for all algorithms inside the step and preserved throughout the entire step's lifecycle.

Note that some statistics may be used by more than one algorithm, while others are used by only a single one.

What is a statistic point? - https://github.com/openvinotoolkit/nncf/blob/develop/nncf/common/tensor_statistics/statistic_point.py#L20
What does the statistics point collection for a step look like? - https://github.com/openvinotoolkit/nncf/blob/develop/nncf/quantization/algorithms/pipeline.py#L170
What does the execution of a step look like? - https://github.com/openvinotoolkit/nncf/blob/develop/nncf/quantization/algorithms/pipeline.py#L95
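To make the concept concrete, here is a minimal sketch of what a statistic point represents. The class and field names below are illustrative only, not the actual NNCF API: a statistic point binds a target node in the model graph to a tensor collector registered by a specific algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class TensorCollector:
    """Accumulates tensor values observed at one target point."""
    values: list = field(default_factory=list)

    def register_input(self, value: float) -> None:
        self.values.append(value)

@dataclass
class StatisticPoint:
    target_node: str  # where in the model graph statistics are gathered
    algorithm: str    # which algorithm requested this point
    collector: TensorCollector = field(default_factory=TensorCollector)

# During a pipeline step, every participating algorithm registers its points,
# and the collected statistics live for the whole step - which is exactly the
# memory cost this issue aims to reduce.
point = StatisticPoint("conv1/output", "MinMaxQuantization")
point.collector.register_input(0.5)
```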

What needs to be done?

The objective is to minimize memory usage during the execution of a pipeline step by eliminating redundant statistics that are used only by previously applied algorithms.

  • Implement a function in the StatisticPointsContainer class that enables the removal of statistics points associated with a particular algorithm.
  • Integrate this function into the pipeline workflow, calling it after each algorithm application.
  • Evaluate the impact of these changes on memory consumption using NNCF examples for Torch, TensorFlow, OpenVINO, and ONNX.
  • Write a test covering the proposed functionality.
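The first two items above could be sketched as follows. This is a simplified, self-contained illustration under the assumption that points are keyed by target node and record the requesting algorithm; the method name `remove_statistic_points_for_algorithm` and the `apply`/`name` interface of the algorithms are hypothetical, not the real NNCF signatures.

```python
from collections import defaultdict

class StatisticPointsContainer:
    """Simplified container: statistic points grouped by target node,
    each point tagged with the algorithm that requested it."""

    def __init__(self):
        # target_node -> list of (algorithm_name, statistics) pairs
        self._points = defaultdict(list)

    def add(self, target_node, algorithm, statistics):
        self._points[target_node].append((algorithm, statistics))

    def remove_statistic_points_for_algorithm(self, algorithm):
        """Drop points requested only by the given (already applied)
        algorithm; points shared with other algorithms are kept."""
        for node in list(self._points):
            kept = [(alg, s) for alg, s in self._points[node] if alg != algorithm]
            if kept:
                self._points[node] = kept
            else:
                del self._points[node]  # no remaining algorithm needs this node

def run_pipeline_step(model, algorithms, container):
    """Hypothetical integration point: after each algorithm is applied,
    its now-redundant statistics are released before the next one runs."""
    for algorithm in algorithms:
        model = algorithm.apply(model, container)
        container.remove_statistic_points_for_algorithm(algorithm.name)
    return model
```

The key design point is that removal must respect sharing: a statistic requested by both an already-applied algorithm and a later one must survive until the last consumer has run.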

Example Pull Requests

N/A

Resources

Contact points

@kshpv

Ticket

120377

.take

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

.take

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

I will be out on the following week. Please, contact @KodiaqQ for any questions.

.take

Thanks for being interested in this issue. It looks like this ticket is already assigned to a contributor. Please communicate with the assigned contributor to confirm the status of the issue.

Experiments with #2563 showed that the proposed changes have a negligible impact on memory consumption. Therefore, the decision was made not to merge them and to close the issue.