Obfuscated file will not run
LoganDark opened this issue · 1 comment
```console
0 LoganDark ~/Desktop/ShEdit lua main.lua.obf
lua: main.lua.obf:1: <name> expected near char(226)
```
This happens whether I use level 1, 2, 3, or 4.
Here are the contents of the file I'm trying to obfuscate:
```lua
local lex = require('.lib.lexer')

local function getTokens(text)
    return ({lex(text)})[2].getTokenList()
end

local function flattenTokens(tokens)
    local flattened = {}

    local function append(type, data)
        local last = flattened[#flattened]

        if last ~= nil and last.type == type then
            last.data = last.data .. data
        else
            flattened[#flattened + 1] = {type = type, data = data}
        end
    end

    for t = 1, #tokens do
        local token = tokens[t]

        for w = #token.LeadingWhite, 1, -1 do
            local white = token.LeadingWhite[w]

            if white.Data ~= nil and white.Data:len() > 0 then
                append(white.Type, white.Data)
            end
        end

        if token.Data ~= nil and token.Data:len() > 0 then
            append(token.Type, token.Data)
        end
    end

    return flattened
end

local function flatToLines(flattened)
    local function splitTwoLines(tokens)
        local l1 = {}
        local l2 = {}

        for i = 1, #tokens do
            local token = tokens[i]
            local data = token.data
            local pos = data:find('\n')

            if pos then
                local before = data:sub(1, pos - 1)
                local after = data:sub(pos + 1)

                if before:len() > 0 then
                    l1[#l1 + 1] = {type = token.type, data = before}
                end

                if after:len() > 0 then
                    l2[#l2 + 1] = {type = token.type, data = after}
                end

                for x = i + 1, #tokens do
                    l2[#l2 + 1] = {type = tokens[x].type, data = tokens[x].data}
                end

                return l1, l2
            else
                l1[#l1 + 1] = {type = token.type, data = data}
            end
        end

        return l1, nil
    end

    local lines = {}
    local current = flattened

    while current and #current > 0 do
        local l1, l2 = splitTwoLines(current)
        current = l2
        lines[#lines + 1] = l1
    end

    return lines
end

return {
    getTokens = getTokens,
    flattenTokens = flattenTokens,
    flatToLines = flatToLines
}
```
Here's the command I used to obfuscate it:
```console
0 LoganDark ~/LuaObfuscator
python3.6 __main__.py --input ~/Desktop/ShEdit/main.lua --output ~/Desktop/ShEdit/main.lua.obf --level 3 --dontcopy
Version: Beta 1.0.3 (Updated on Jan 14, 2018 @ 04:34 PM)
Randomly chose XOR value 53.
Stripped 2 strings and 0 comments.
Finished tokenizing 460 tokens.
Beginning Level 3 obfuscation...
Step 1...Done.
Step 2...Done.
Step 3...Done.
Step 4...Done.
Step 5...Done.
Step 6...Done.
Step 7...Done.
Assuming "Data" is a global.
Step 8...Done.
Step 9...Done.
Step 10...Done.
Finished in 0.165 seconds.
```
It says `Assuming "Data" is a global.`, so I think it doesn't realize that tables can be accessed like `table.key` instead of just `table['key']`.
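For reference, dot and bracket field access are interchangeable in Lua, so `table.key` and `table['key']` name the same slot (the `token` table below is just a made-up example):

```lua
local token = {Data = "local", Type = "Keyword"}

-- Dot and bracket notation read the same field:
assert(token.Data == token["Data"])

-- Writes through either form are equivalent too:
token["Type"] = "Ident"
assert(token.Type == "Ident")

print("field access forms agree")
```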
The tokenizer and obfuscator are not perfect and will fail to produce working code under certain conditions. It has been a long time since I looked into these issues, but one thing that I remember is faulty handling of colon notation.
I believe the issue is occurring at `white.Data:len()` and `token.Data:len()`. I think the obfuscator is rewriting these statements as `white.Data.len(Data)` and `token.Data.len(Data)`, missing the `white.` and `token.` prefixes.
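For context, in Lua a colon call `v:name(args)` is syntactic sugar for `v.name(v, args)`, where the *entire* prefix expression is reused as the first argument. A minimal sketch of the correct versus broken desugaring (the `white` table here is an illustrative stand-in, not the obfuscator's actual output):

```lua
local white = {Data = "  \t"}  -- Data holds a 3-character string

-- Colon call: sugar for white.Data.len(white.Data)
local viaColon = white.Data:len()

-- Correct desugaring keeps the full prefix as the self argument.
-- Strings index into the `string` library, so .len is string.len:
local viaExplicit = white.Data.len(white.Data)

assert(viaColon == 3 and viaExplicit == 3)

-- The suspected broken rewrite drops the prefix and passes a bare
-- `Data`, which is an undefined global here, so it raises an error:
local ok = pcall(function() return white.Data.len(Data) end)
assert(ok == false)
```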
Sorry about this. To work around the issue, please try modifying the input code. If `Data` is a string, then `string.len(white.Data)` or `#white.Data` should work just fine. Otherwise, try using a local variable (ex: `local whiteData = white.Data` ... `whiteData:len()`), or the uglier solution `white.Data.len(white.Data)`.
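To illustrate the suggested workarounds side by side (assuming `Data` holds a string), all four forms avoid colon notation directly on the field access, or hoist the field first, and return the same length:

```lua
local white = {Data = "hello"}

-- 1. Library call: no colon, no method lookup on the field
local a = string.len(white.Data)

-- 2. Length operator: simplest when Data is a string
local b = #white.Data

-- 3. Hoist the field into a local, then colon-call the local
local whiteData = white.Data
local c = whiteData:len()

-- 4. Explicit desugared call (the "uglier" form)
local d = white.Data.len(white.Data)

assert(a == 5 and b == 5 and c == 5 and d == 5)
```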
Please let me know if this helps.
PS. There is a `--debug` flag you can use which will produce slightly more helpful output.