j4xx3n

How to Enumerate Paths

Crawling

Katana

katana -list subs.txt -o katana.txt

Katana (Lostsec Method)

katana -list subs.txt -d 5 -ps -pss waybackarchive,commoncrawl,alienvault -kf -jc -fx -ef woff,css,png,svg,jpg,woff2,jpeg,gif -o katana2.txt

Katana (Blackhat Ethical Hacking Method)

katana -list subs.txt -hl -jc --no-sandbox -c 1 -p 1 -rd 3 -rl 5 -tlsi -o katana3.txt

OSINT

Wayback Machine (Waybackurls)

cat subs.txt | waybackurls | anew waybackurls.txt

Directory Brute Force

Dirb

cat subs.txt | while read -r sub; do dirb "https://$sub" | anew dirb.txt; done

grep -Eo 'https?://[^ ]+' dirb.txt | anew dirbUrls.txt
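The grep above works because -o prints only the matched URL token, not the whole dirb result line. A quick demo on hypothetical dirb-style output (sample hostnames are made up):

```shell
# Hypothetical dirb-style output, to show what the grep extraction pulls out.
printf '%s\n' \
  '+ https://app.example.com/admin (CODE:200|SIZE:1234)' \
  '==> DIRECTORY: https://app.example.com/assets/' \
  > dirb_sample.txt

# Same pattern as above: grab every URL token, one per line.
grep -Eo 'https?://[^ ]+' dirb_sample.txt > dirb_urls_sample.txt
cat dirb_urls_sample.txt
```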

Ffuf

ffuf -u https://sub.example.com/FUZZ -w wordlist.txt -mc 200 -recursion | anew ffuf.txt

cat subs.txt | while read -r sub; do ffuf -u "https://$sub/FUZZ" -w wordlist.txt -mc 200 -recursion | anew ffuf.txt; done

grep -Eo 'https?://[^ ]+' ffuf.txt | anew urls.txt

Custom Wordlist

cat wordlist.txt | anew customPaths.txt

cat urls.txt | unfurl paths | anew customPaths.txt
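unfurl paths keeps only the path component of each URL. For illustration, here is a rough sed stand-in on made-up sample URLs (the real step above should still use unfurl, which handles edge cases properly):

```shell
printf '%s\n' \
  'https://example.com/api/v1/users?id=1' \
  'https://example.com/login' > urls_sample.txt

# Strip scheme and host, then drop any query string --
# roughly what `unfurl paths` emits for these inputs.
sed -E 's#https?://[^/]+##; s#\?.*$##' urls_sample.txt > paths_sample.txt
cat paths_sample.txt
```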

cat urls.txt | while read -r url; do cewl "$url" --with-numbers | anew customPaths.txt; done

cat subs.txt | while read -r sub; do ffuf -u "https://$sub/FUZZ" -w customPaths.txt -mc 200 -recursion | anew ffuf.txt; done
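anew appends only lines not already present in the target file, so customPaths.txt stays deduplicated across the three sources above. If anew is unavailable, a portable (but order-destroying) equivalent with sort -u, shown on made-up lists:

```shell
printf 'admin\nlogin\n' > paths_a.txt
printf 'login\napi\n'   > paths_b.txt

# Merge and deduplicate; unlike anew, this sorts the output.
cat paths_a.txt paths_b.txt | sort -u > customPaths_sample.txt
cat customPaths_sample.txt
```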

Scanning JS Files

cat urls.txt | grep '\.js$' | httpx-toolkit -mc 200 | tee js.txt

cat js.txt | jsleaks -s -l -k
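Note that the grep '\.js$' filter above keeps only URLs that end in .js, so it misses JS URLs carrying a query string (e.g. app.js?v=2). A quick demo on hypothetical URLs, with a looser alternative pattern:

```shell
printf '%s\n' \
  'https://example.com/app.js' \
  'https://example.com/index.html' \
  'https://example.com/vendor.min.js' \
  'https://example.com/app.js?v=2' > urls_js_sample.txt

# Strict filter from above: only URLs ending in .js.
grep '\.js$' urls_js_sample.txt > js_sample.txt

# Looser filter that also keeps query-string variants.
grep -E '\.js($|\?)' urls_js_sample.txt
```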