paritytech/substrate-api-sidecar

Fee estimation/tx broadcast stopped working

Closed this issue · 2 comments

Description

Today around noon, fee estimation and broadcasting of transactions stopped working, both against the public node at https://polkadot-public-sidecar.parity-chains.parity.io and against a local setup we run.

Steps to Reproduce

Calling POST https://polkadot-public-sidecar.parity-chains.parity.io/transaction/fee-estimate with body

{"tx":"0x3d028400000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001b0091010404030000000000000000000000000000000000000000000000000000000000000000000284d717"}

returns

{
    "code": 400,
    "error": "Unable to fetch fee info",
    "transaction": "0x3d028400000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001b0091010404030000000000000000000000000000000000000000000000000000000000000000000284d717",
    "cause": "4003: Client error: Execution failed: Execution aborted due to trap: wasm trap: wasm `unreachable` instruction executed\nWASM backtrace:\nerror while executing at wasm backtrace:\n    0: 0x63a3 - <unknown>!rust_begin_unwind\n    1: 0x2eb7 - <unknown>!core::panicking::panic_fmt::hbb5a6b42001bdfec\n    2: 0x53ea58 - <unknown>!TransactionPaymentApi_query_info",
    "stack": "RpcError: 4003: Client error: Execution failed: Execution aborted due to trap: wasm trap: wasm `unreachable` instruction executed\nWASM backtrace:\nerror while executing at wasm backtrace:\n    0: 0x63a3 - <unknown>!rust_begin_unwind\n    1: 0x2eb7 - <unknown>!core::panicking::panic_fmt::hbb5a6b42001bdfec\n    2: 0x53ea58 - <unknown>!TransactionPaymentApi_query_info\n    at checkError (/usr/src/app/node_modules/@polkadot/rpc-provider/cjs/coder/index.js:23:15)\n    at RpcCoder.decodeResponse (/usr/src/app/node_modules/@polkadot/rpc-provider/cjs/coder/index.js:39:9)\n    at WsProvider.__internal__onSocketMessageResult (/usr/src/app/node_modules/@polkadot/rpc-provider/cjs/ws/index.js:413:51)\n    at WebSocket.__internal__onSocketMessage (/usr/src/app/node_modules/@polkadot/rpc-provider/cjs/ws/index.js:402:20)\n    at callListener (/usr/src/app/node_modules/ws/lib/event-target.js:290:14)\n    at WebSocket.onMessage (/usr/src/app/node_modules/ws/lib/event-target.js:209:9)\n    at WebSocket.emit (node:events:513:28)\n    at Receiver.receiverOnMessage (/usr/src/app/node_modules/ws/lib/websocket.js:1211:20)\n    at Receiver.emit (node:events:513:28)\n    at Receiver.dataMessage (/usr/src/app/node_modules/ws/lib/receiver.js:594:14)",
    "at": {
        "hash": "0x6737c3c64299dd084ea90f089a90ebf9eca9a390474c15263e9ccc598acdb15f"
    }
}
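For reference, the request above can be reproduced programmatically. The sketch below builds the same POST request to `/transaction/fee-estimate`; the helper name `buildFeeEstimateRequest` is hypothetical, while the URL and body shape are taken directly from the report:

```typescript
// Sketch: build the POST request for Sidecar's /transaction/fee-estimate
// endpoint. `buildFeeEstimateRequest` is a hypothetical helper name; the
// endpoint URL and {"tx": "0x..."} body shape come from the report above.
const SIDECAR_URL = 'https://polkadot-public-sidecar.parity-chains.parity.io';

interface FeeEstimateRequest {
  url: string;
  options: { method: string; headers: Record<string, string>; body: string };
}

function buildFeeEstimateRequest(
  tx: string,
  baseUrl: string = SIDECAR_URL,
): FeeEstimateRequest {
  return {
    url: `${baseUrl}/transaction/fee-estimate`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ tx }),
    },
  };
}

// Usage (requires network access):
//   const { url, options } = buildFeeEstimateRequest('0x3d0284...');
//   const res = await fetch(url, options);
//   console.log(await res.json());
```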

Note that the transaction here is a demo tx (with some fields unset) that we don't expect to succeed. However, we've also tried real transactions (transferKeepAlive calls that succeeded previously) and get similar results: the stack traces differ, but the call usually fails. In a few cases, fee estimation did succeed.

Expected vs. Actual Behavior

We made no changes to our setup. Is it possible that a protocol upgrade was rolled out that the sidecar no longer supports, or that the WASM engine state has somehow become corrupted?

Hi @symtor, today a runtime upgrade was enacted on Polkadot that introduced changes to the tx format. That same change has been in effect on Kusama since early June. You can see the change here, and you should update your tx building accordingly.
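For others hitting this: one way to confirm that a runtime upgrade has been enacted is to compare the spec version reported by Sidecar's `GET /runtime/spec` endpoint against the version your tx-building code was written for. A minimal sketch, assuming the response carries a `specVersion` field serialized as a decimal string (check the Sidecar docs for the exact shape):

```typescript
// Sketch: check whether the connected chain's runtime spec version is at
// least some expected minimum. Assumes Sidecar's GET /runtime/spec response
// includes a `specVersion` field serialized as a decimal string.
interface RuntimeSpecResponse {
  specName?: string;
  specVersion: string;
}

function isSpecAtLeast(spec: RuntimeSpecResponse, minVersion: number): boolean {
  return parseInt(spec.specVersion, 10) >= minVersion;
}

// Usage (requires network access; SIDECAR_URL and EXPECTED_SPEC_VERSION
// are placeholders for your own deployment and tooling):
//   const res = await fetch(`${SIDECAR_URL}/runtime/spec`);
//   const spec = (await res.json()) as RuntimeSpecResponse;
//   if (!isSpecAtLeast(spec, EXPECTED_SPEC_VERSION)) {
//     console.warn('tx-building code may predate the current runtime');
//   }
```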


Alright, thanks for the heads up! That's what we were afraid of.