jjkislele/i_just_want_a_simple_demo

How do I implement ReLU6 with TensorRT?

Closed this issue · 1 comment

I've been puzzling over this for a while and good references are hard to find. Does anyone know how to do it?

$$\mathrm{ReLU6}(x) = \min(\max(0, x), 6)$$

That's all it is, so you can just build it straight from the formula:

# relu6(x) = min(relu(x), 6): clamp the ReLU output against a constant 6
import numpy as np

relu   = network.add_activation(input=tmp.get_output(0), type=trt.ActivationType.RELU)
shape  = (1,) * len(your_input_shape)           # rank-matched shape so MIN broadcasts
tensor = np.full(shape, 6.0, dtype=np.float32)  # constant tensor filled with 6.0
trt_6  = network.add_constant(shape, tensor)
relu_6 = network.add_elementwise(relu.get_output(0), trt_6.get_output(0), trt.ElementWiseOperation.MIN)
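
For context, here is a minimal self-contained sketch of where that snippet sits during network construction. The logger/builder setup, the "data" input name, and the (1, 3, 224, 224) shape are illustrative assumptions (the original only shows the three layers); it targets the explicit-batch Python API of TensorRT 7+:

import numpy as np
import tensorrt as trt

logger  = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

# Hypothetical input; in a real network this would be the previous layer's output
input_shape = (1, 3, 224, 224)
data = network.add_input(name="data", dtype=trt.float32, shape=input_shape)

# relu6(x) = min(relu(x), 6)
relu  = network.add_activation(input=data, type=trt.ActivationType.RELU)
shape = (1,) * len(input_shape)
six   = network.add_constant(shape, np.full(shape, 6.0, dtype=np.float32))
relu6 = network.add_elementwise(relu.get_output(0), six.get_output(0), trt.ElementWiseOperation.MIN)
network.mark_output(relu6.get_output(0))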
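
As an aside, if your TensorRT is 5.1 or newer, the same thing folds into a single layer with the CLIP activation, which clamps its input to [alpha, beta]:

# Single-layer alternative: CLIP clamps input to [alpha, beta]
clip = network.add_activation(input=data, type=trt.ActivationType.CLIP)
clip.alpha = 0.0  # lower bound
clip.beta  = 6.0  # upper bound
network.mark_output(clip.get_output(0))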