What you see here is an overview of how I approach recon.
graph LR
A((Project setup)) --> B(recon-ng)
B --> C(isup)
C --> D(Nmap)
C --> F(Sn1per)
B --> E(httpx)
E --> K(paramspider)
E --> G(EyeWitness)
B --> H(Osmedeus)
B --> I(Heartbleed)
B --> J(takeover)
Create folders (subdomains, urls, ips, patterns, params, javascripts, downloads, ffuf). Here is a good one-liner:
export projectname=[name]
mkdir ~/$projectname && cd ~/$projectname && mkdir subdomains urls ips patterns params javascripts downloads ffuf
Install recon-ng and use it to passively search for subdomains, IPs, and ports. Here are some of the commands you may need:
workspaces create [workspace_name]
# insert domains
db insert domains
# insert company names
db insert companies
# if you have a narrow scope, you can enter the hosts using this command
db insert hosts
# now load some modules and then run
modules load recon/domai.....
run
# save the list
modules load reporting/list
# save ip addresses
options set COLUMN ip_address
options set FILENAME /root/$projectname/ips/ips.txt
run
# save domains
options set COLUMN host
options set FILENAME /root/$projectname/subdomains/subdomains.txt
run
Now make sure you are in your working directory:
cd ~/$projectname
Check whether the IPs are alive using isup:
cd ~/isup && rm -rf tmp && ./isup.sh ~/$projectname/ips/ips.txt && cp ~/isup/tmp/valid-ips.txt ~/$projectname/ips/valid-ips.txt
Validate the domains using httpx:
cat ~/$projectname/subdomains/subdomains.txt | httpx -verbose > ~/$projectname/urls/urls.txt
Always use ffuf 1.5 or later. This loop fuzzes for files in the web root:
for URL in $(<~/$projectname/urls/urls.txt); do ( ffuf -u "${URL}/FUZZ" -w /root/SecLists/Discovery/Web-Content/raft-large-files-lowercase.txt -ic -c -of csv -o ~/$projectname/ffuf/$(echo "files-${URL}" | sed 's/https:\/\///g ; s/http:\/\///g')); done
This loop fuzzes for directories:
for URL in $(<~/$projectname/urls/urls.txt); do ( ffuf -u "${URL}/FUZZ" -w /root/SecLists/Discovery/Web-Content/raft-medium-directories-lowercase.txt -ic -c -recursion -recursion-depth 3 -of csv -o ~/$projectname/ffuf/$(echo "dir-${URL}" | sed 's/https:\/\///g ; s/http:\/\///g')); done
Scan the live IPs with Nmap:
nmap -iL ~/$projectname/ips/valid-ips.txt -sSV -A -T4 -O -Pn -v -F -oA ~/$projectname/"$projectname"_nmap_result
Run Sn1per in massweb mode against the IPs and copy the loot to the working folder:
sniper -f ~/$projectname/ips/ips.txt -m massweb -w $projectname && mkdir ~/$projectname/sniper && cp -R /usr/share/sniper/loot/workspace/$projectname ~/$projectname/sniper/
Run EyeWitness on the validated URLs, then save the results and copy them to our working folder:
cd ~/$projectname/ && eyewitness -f ~/$projectname/urls/urls.txt
zip -r $projectname.zip [foldername]
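A minimal sketch for completing this step, assuming a recent EyeWitness build where --web selects HTTP screenshotting, -d sets the report directory, and --no-prompt skips the interactive prompt:
# assumption: --web / -d / --no-prompt are available in your EyeWitness version
eyewitness --web -f ~/$projectname/urls/urls.txt -d ~/$projectname/eyewitness --no-prompt
zip -r ~/$projectname/$projectname-eyewitness.zip ~/$projectname/eyewitness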
To hunt for URLs with parameters automatically from the Wayback Machine:
cat ~/$projectname/urls/urls.txt | while IFS="" read -r p || [ -n "$p" ]
do
# paramspider expects a bare domain, so strip the scheme first
domain=$(echo "$p" | sed 's/https:\/\///g ; s/http:\/\///g')
python3 ~/ParamSpider/paramspider.py --domain "$domain" --exclude woff,png,svg,php,jpg --output ~/$projectname/params/"$domain".txt
done && cat ~/$projectname/params/* > ~/$projectname/params/all.txt
Technique to swap the FUZZ placeholder out of the collected params:
cat ~/$projectname/params/all.txt | sed 's/FUZZ/[whatever-you-like]/g' > ~/$projectname/params/all-changed.txt
To scan everything with nuclei:
nuclei -l ~/$projectname/urls/urls.txt -o ~/$projectname/nuclei_all_result.txt
For CVE templates only:
nuclei -l ~/$projectname/urls/urls.txt -t /root/nuclei-templates/cves/ -o ~/$projectname/nuclei-cve-result.txt
Run Jaeles with several signature sets, then generate a report:
cat ~/$projectname/urls/urls.txt | jaeles scan -s 'cves' -s 'sensitive' -s 'fuzz' -s 'common' -s 'routines' -o ~/$projectname/jaeles/
jaeles report -o ~/$projectname/jaeles/ --title "[$projectname] Jaeles Full Report"
Run ChopChop against the URL list:
~/ChopChop/gochopchop scan --url-file ~/$projectname/urls/urls.txt --threads 4 -e csv --export-filename ~/$projectname/chopchop-result.txt
Gau (Get All URLs) extracts archived URLs in real time during manual testing, giving you more endpoints to attack. Hunt for links that carry parameters by feeding your hostnames to gau (it expects domains rather than full URLs, so use the subdomain list):
cat ~/$projectname/subdomains/subdomains.txt | gau --blacklist png,jpg,gif --o ~/$projectname/urls/urls-gau.txt --verbose
First download and scan all the JavaScript with JSScanner:
~/JSScanner/script.sh ~/$projectname/params/all.txt && cp -r Jsscanner_results/ ~/$projectname/javascripts
Hunt for public documents and their metadata using metagoofil through proxychains (strip the scheme, since -d expects a bare domain):
cat ~/$projectname/urls/urls.txt | while IFS="" read -r p || [ -n "$p" ]
do
domain=$(echo "$p" | sed 's/https:\/\///g ; s/http:\/\///g')
proxychains python3 ~/metagoofil/metagoofil.py -d "$domain" -t pdf,doc,docx,xls,xlsx,ppt,pptx -e 30 -l 12 -o ~/$projectname/downloads/
done
Osmedeus: select one of these actions. Run a vuln scan and directory scan directly on a list of domains:
osmedeus scan -f vuln-and-dirb -t ~/$projectname/subdomains/subdomains.txt
Scan a list of targets:
osmedeus scan -T ~/$projectname/subdomains/subdomains.txt
Get targets from stdin and start the scan with a concurrency of 2:
cat ~/$projectname/subdomains/subdomains.txt | osmedeus scan -c 2
Start a simple scan with the default 'general' flow:
osmedeus scan -t sample.com
Then save the results and copy them to our working folder (a sketch follows below):
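A minimal sketch, assuming an Osmedeus build that stores its results under ~/.osmedeus/workspaces (adjust the path to your install):
# assumption: scan results live in ~/.osmedeus/workspaces
mkdir -p ~/$projectname/osmedeus && cp -R ~/.osmedeus/workspaces/* ~/$projectname/osmedeus/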
Check the subdomains for the Heartbleed heartbeat extension (openssl needs -tlsextdebug to print TLS extensions; this only shows the extension is enabled, not that it is exploitable):
cat ~/$projectname/subdomains/subdomains.txt | while read line ; do echo "QUIT" | openssl s_client -connect $line:443 -tlsextdebug 2>&1 | grep 'server extension "heartbeat" (id=15)' || echo $line: safe; done
Check for subdomain takeover:
takeover -l ~/$projectname/subdomains/subdomains.txt -v -t 10
Run Smuggler on the URL list to test for HTTP request smuggling (desync): it sends crafted chunked requests to spot cases where the front-end and back-end servers parse the body differently, which can let an attacker smuggle a request that the back end processes with another user's cookies and data.
cat ~/$projectname/urls/urls.txt | python3 ~/smuggler/smuggler.py -l ~/$projectname/smuggler-result.txt
Since we have parameterized URLs from ParamSpider, dalfox needs to know where to inject. ParamSpider marks injection points with FUZZ; you can define the injection point with XSS instead, so here is a command to replace the marker and create a new list for dalfox:
cat ~/$projectname/params/all.txt | sed 's/FUZZ/XSS/g' > ~/$projectname/all-dalfox-ready.txt
You are now ready to pipe the URLs into dalfox:
cat ~/$projectname/all-dalfox-ready.txt | dalfox pipe | cut -d " " -f 2 > ~/$projectname/dalfox-all-result.txt
Or use file mode:
dalfox file ~/$projectname/all-dalfox-ready.txt | cut -d " " -f 2 > ~/$projectname/dalfox-all-result.txt
For deeper attacks, add: --deep-domxss
Add --silence to print only the PoC (when found) and progress.
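For example, combining both flags with the file mode command above:
dalfox file ~/$projectname/all-dalfox-ready.txt --deep-domxss --silence > ~/$projectname/dalfox-deep-result.txt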
Scan JavaScript files for endpoints, secrets, hardcoded credentials, IDOR, open redirects, and more:
- Paste your URLs into alive.txt
- Run the JSScanner script against alive.txt (see the sketch below)
- Examine the results using GF advanced patterns
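A minimal sketch of the run step, reusing the JSScanner invocation from earlier (the alive.txt location is an assumption):
# assumption: alive.txt is in the current directory
~/JSScanner/script.sh alive.txt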
Use the tree command, and cat through the subdirectories:
cat * */*.txt
cat */*.js | gf api-keys
cat */*.txt | gf ssrf > /root/Desktop/ssrf.txt
Or, a newer method with GitLeaks:
Scan a directory of JavaScript, JSON, and other files for secrets!
gitleaks --path=/directory -v --no-git
Scan a File with Any Extension for Secrets!
gitleaks --path=/file.xxx -v --no-git
When you find Keys/Tokens - Check from here: https://github.com/streaak/keyhacks
Examine the Results Manually
B) Pattern Check Example for Results with gf & gf-patterns:
After you have gathered the parameters, check for specific patterns and potentially vulnerable URLs that can be attacked using Meg or other fuzzing tools (a Meg sketch follows the gf command below):
cat /root/Desktop/Bounty/params.txt | gf xss | sed 's/FUZZ/ /g' >> /root/Desktop/Bounty/xss_params_forMeg.txt
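Meg takes positional arguments: a path (or file of paths), a file of base URLs, and an output directory. A minimal sketch with hypothetical input files:
# hypothetical files: megpaths.txt (one path per line), meghosts.txt (one base URL per line)
meg megpaths.txt meghosts.txt /root/Desktop/Bounty/meg-out/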
OSINT & Passive Amplified Attacks: (Raspberry Pi)
OSINT:
Perform OSINT using Spiderfoot; a minimal CLI sketch follows below.
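A minimal sketch, assuming Spiderfoot's sf.py entry point with -s (target), -u (use case), and -o (output format); the install path and [target-domain] are placeholders:
# passive-only modules, tab-separated output
python3 ~/spiderfoot/sf.py -s [target-domain] -u passive -o tab > ~/$projectname/spiderfoot-result.txt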
One-off 1337 powerful command attacks with amass (a minimal passive example follows after the Gotty command below):
Use Gotty - https://github.com/yudai/gotty - to share the session in the browser:
gotty -p 1337 -w recon-ng
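The amass command itself isn't spelled out above; a minimal passive enumeration sketch, assuming amass is in your PATH:
# passive-only subdomain enumeration; [target-domain] is a placeholder
amass enum -passive -d [target-domain] -o ~/$projectname/subdomains/amass.txt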
Here are some of the tools we use when performing passive-only live recon on Twitch:
- Recon-ng https://github.com/lanmaster53/recon-ng
- httpx https://github.com/projectdiscovery/httpx
- isup.sh https://github.com/gitnepal/isup
- Arjun https://github.com/s0md3v/Arjun
- jSQL https://github.com/ron190/jsql-injection
- Smuggler https://github.com/defparam/smuggler
- Sn1per https://github.com/1N3/Sn1per
- Spiderfoot https://github.com/smicallef/spiderfoot
- Nuclei https://github.com/projectdiscovery/nuclei
- Jaeles https://github.com/jaeles-project/jaeles
- ChopChop https://github.com/michelin/ChopChop
- Inception https://github.com/proabiral/inception
- EyeWitness https://github.com/FortyNorthSecurity/EyeWitness
- Meg https://github.com/tomnomnom/meg
- Gau - Get All Urls https://github.com/lc/gau
- Snallygaster https://github.com/hannob/snallygaster
- Nmap https://github.com/nmap/nmap
- Waybackurls https://github.com/tomnomnom/waybackurls
- Gotty https://github.com/yudai/gotty
- GF https://github.com/tomnomnom/gf
- GF Patterns https://github.com/1ndianl33t/Gf-Patterns
- Paramspider https://github.com/devanshbatham/ParamSpider
- XSSER https://github.com/epsylon/xsser
- UPDOG https://github.com/sc0tfree/updog
- JSScanner https://github.com/dark-warlord14/JSScanner
- Takeover https://github.com/m4ll0k/takeover
- Keyhacks https://github.com/streaak/keyhacks
- S3 Bucket AIO Pwn https://github.com/blackhatethicalhacking/s3-buckets-aio-pwn
- BHEH Sub Pwner Recon https://github.com/blackhatethicalhacking/bheh-sub-pwner
- GitLeaks https://github.com/zricethezav/gitleaks
- Domain-2IP-Converter https://github.com/blackhatethicalhacking/Domain2IP-Converter
- Dalfox https://github.com/hahwul/dalfox
- Log4j Scanner https://github.com/Black-Hat-Ethical-Hacking/log4j-scan
- Osmedeus https://github.com/j3ssie/osmedeus
- getJS https://github.com/003random/getJS
- MetaGoofil https://github.com/opsdisk/metagoofil