Introduction
Shell scripting is one of the best skills you can learn to speed up your ethical hacking tasks—especially during reconnaissance.
Reconnaissance, or recon, is the first step in ethical hacking. It’s when you gather information about a target to learn how it works. This helps you find weak points later.
But doing recon manually takes time. You have to run many tools again and again. You might forget steps or miss something important. That’s where shell scripting becomes very useful.
With shell scripting, you can automate your entire recon process. You can run multiple tools together, save their output in files, and repeat the steps without typing everything each time.
In this blog, we’ll show you 7 powerful shell scripting tricks that will help you do recon faster and better. These tips are beginner-friendly, but useful even for advanced learners.
Let’s get started!
Table of Contents
- Introduction
- 1. Use Shell Scripting to Automate Subdomain Enumeration
- 2. Shell Scripting for Historical URL Collection
- 3. Automate Directory Bruteforcing
- 4. Chain Recon Tools Using Shell Scripting Pipelines
- 5. Shell Scripting for OSINT Data Collection
- 6. Auto-Test Common Vulnerabilities with Curl + Shell Logic
- 7. Advanced Shell Scripting to Organize Recon Output
- How to Create, Save, and Run a Shell Script (For Beginners)
- Conclusion
1. Use Shell Scripting to Automate Subdomain Enumeration
Finding subdomains is one of the first things you should do in recon. Subdomains can lead to hidden websites, old apps, or test servers that are often forgotten and sometimes vulnerable.
You usually have to run different tools like assetfinder, amass, and subfinder one by one. That takes time and can be messy. But with shell scripting, you can combine them all into one simple script.
Let’s create a shell script that does this for you.
#!/bin/bash
# Check if domain is given
if [ -z "$1" ]; then
echo "Usage: ./subenum.sh example.com"
exit 1
fi
DOMAIN=$1
OUTPUT_DIR="$DOMAIN-recon"
mkdir -p $OUTPUT_DIR
echo "[+] Finding subdomains for: $DOMAIN"
echo "[+] Using assetfinder..."
assetfinder --subs-only $DOMAIN > $OUTPUT_DIR/assetfinder.txt
echo "[+] Using subfinder..."
subfinder -d $DOMAIN -silent > $OUTPUT_DIR/subfinder.txt
echo "[+] Using amass (passive mode)..."
amass enum -passive -d $DOMAIN > $OUTPUT_DIR/amass.txt
echo "[+] Merging results..."
cat $OUTPUT_DIR/*.txt | sort -u > $OUTPUT_DIR/final_subdomains.txt
echo "[+] Found $(wc -l < $OUTPUT_DIR/final_subdomains.txt) unique subdomains."
echo "[+] Results saved to $OUTPUT_DIR/final_subdomains.txt"
Here are a few important things to understand:
- if [ -z "$1" ]: This checks if the user entered a domain name when running the script. If not, it shows how to use it.
- mkdir -p $OUTPUT_DIR: This creates a folder named after the domain, like example.com-recon, to store all the output files.
- > vs >>: The > symbol overwrites a file, while >> appends to it. Each tool here writes to its own file with >, so re-running the script starts fresh instead of piling up old results (see the short demo after this list).
- cat $OUTPUT_DIR/*.txt | sort -u: This combines all text files and removes duplicate lines. You get a clean list of unique subdomains.
- $(wc -l < file): This counts how many lines (subdomains) are in the final file. It shows the total number at the end.
These are the parts that beginners often find confusing, but once you understand them, you can easily modify or improve this script later.
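If the difference between > and >> still feels abstract, here is a tiny standalone demo you can run in any empty directory (demo.txt is just a throwaway example file):
echo "first run" > demo.txt     # > overwrites demo.txt, or creates it if missing
echo "second run" >> demo.txt   # >> appends a new line to the same file
cat demo.txt                    # prints both lines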
2. Shell Scripting for Historical URL Collection
Old URLs can reveal a lot. Sometimes, developers remove pages from websites but forget to remove the actual code or endpoints. These old pages can still be accessed and may contain security flaws.
As an ethical hacker, finding these historical URLs helps you discover hidden paths like /admin, /backup, or old APIs.
There are tools like waybackurls and gau that collect these URLs from archive websites like the Wayback Machine. Instead of running these tools one by one, we’ll use shell scripting to do it all at once.
Let’s write a script to collect old URLs from both tools and save them in one clean file.
#!/bin/bash
# Check if domain is provided
if [ -z "$1" ]; then
echo "Usage: ./url-collector.sh example.com"
exit 1
fi
DOMAIN=$1
OUTPUT_DIR="$DOMAIN-urls"
mkdir -p $OUTPUT_DIR
echo "[+] Collecting URLs for: $DOMAIN"
# Collect using waybackurls
echo "[+] Using waybackurls..."
echo $DOMAIN | waybackurls > $OUTPUT_DIR/wayback.txt
# Collect using gau
echo "[+] Using gau..."
echo $DOMAIN | gau > $OUTPUT_DIR/gau.txt
# Merge and clean
echo "[+] Merging and removing duplicates..."
cat $OUTPUT_DIR/*.txt | sort -u > $OUTPUT_DIR/final_urls.txt
echo "[+] Found $(wc -l < $OUTPUT_DIR/final_urls.txt) unique URLs"
echo "[+] All URLs saved to $OUTPUT_DIR/final_urls.txt"
- if [ -z "$1" ]: This checks if you gave a domain name when running the script. If not, it shows usage instructions.
- echo $DOMAIN | waybackurls: This sends the domain into the waybackurls tool, which fetches archived links.
- mkdir -p $OUTPUT_DIR: This creates a folder to store results. If it already exists, it won’t give an error.
- cat *.txt | sort -u: This merges both tool results and removes duplicates using sort -u.
- wc -l: This counts the total number of unique URLs found.
It will create a folder called example.com-urls and save everything there. The final list of URLs will be in a file called final_urls.txt.
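Once final_urls.txt exists, a quick grep pass can surface the most promising entries. This is an optional follow-up sketch; the patterns and file paths are just examples you can adapt:
# URLs with parameters are good candidates for XSS and SQLi testing
grep "=" example.com-urls/final_urls.txt > example.com-urls/with_params.txt
# JavaScript files often leak endpoints and keys
grep -iE "\.js(\?|$)" example.com-urls/final_urls.txt > example.com-urls/js_files.txt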
Pro Tip: Our Hackers Real World membership provides exclusive scripts and guides for building your own tools in Python and Bash, offering deeper insights into how to start ethical hacking.
3. Automate Directory Bruteforcing
Directory bruteforcing is an important part of recon. It helps you find hidden folders or pages on a website — like /admin, /backup, or /test. These hidden paths might lead to serious vulnerabilities.
Tools like FFUF are great for this job. But running the same tool again and again, for every target, can be boring and time-consuming. That’s where shell scripting helps.
With one script, you can scan any website using your favorite wordlist. You can even save the results neatly in a folder.
Let’s write a simple script to automate FFUF for directory bruteforcing.
#!/bin/bash
# Check if URL is given
if [ -z "$1" ]; then
echo "Usage: ./dirscan.sh https://example.com"
exit 1
fi
TARGET=$1
WORDLIST="/usr/share/wordlists/dirb/common.txt" # Change if needed
OUTPUT_DIR="dirscan-results"
mkdir -p $OUTPUT_DIR
FILENAME=$(echo $TARGET | sed -E 's|https?://||' | sed 's|/|_|g')
echo "[+] Starting directory scan on $TARGET"
ffuf -u "$TARGET/FUZZ" -w $WORDLIST -o "$OUTPUT_DIR/$FILENAME.json" -of json
echo "[+] Scan finished. Results saved to $OUTPUT_DIR/$FILENAME.json"
- WORDLIST: This is the file that contains a list of common folder names to try. You can replace it with your own wordlist.
- ffuf -u "$TARGET/FUZZ": FFUF replaces the word FUZZ with each word from the list and tries it on the target.
- sed -E 's|https?://||' and sed 's|/|_|g': These strip the protocol from the target URL and replace slashes so it becomes a safe filename.
- -of json: This tells FFUF to save the results in JSON format.
It will use the common wordlist to scan directories and save results in a file like dirscan-results/target.com.json.
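If you have jq installed, you can pull the discovered paths out of that JSON file. The snippet below is a sketch assuming FFUF's default JSON layout (a top-level results array with url and status fields); check your own output if your version differs:
# List each hit as "status url", one per line
jq -r '.results[] | "\(.status) \(.url)"' dirscan-results/target.com.json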
4. Chain Recon Tools Using Shell Scripting Pipelines
Recon becomes more powerful when you combine tools together. You can collect subdomains, find old URLs, filter interesting patterns, and test them — all in one flow. This is called tool chaining, and shell scripting makes it easy.
Instead of running tools one by one, you can connect them using pipes (|). This passes output from one tool to the next, like a chain.
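For example, this one-liner chains three tools before we wrap them into a script (example.com is a placeholder domain):
subfinder -d example.com -silent | httpx -silent | waybackurls | sort -u > urls.txt
Each | hands the previous tool's output straight to the next one, so nothing needs to be saved and re-loaded in between.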
Let’s write a full script that takes a domain, finds subdomains, collects URLs, filters them for juicy paths, and checks if they’re live.
#!/bin/bash
# Check if domain is given
if [ -z "$1" ]; then
echo "Usage: ./recon-chain.sh example.com"
exit 1
fi
DOMAIN=$1
OUTPUT_DIR="$DOMAIN-fullrecon"
mkdir -p $OUTPUT_DIR
echo "[+] Starting recon on $DOMAIN..."
# Step 1: Find subdomains
echo "[+] Finding subdomains..."
subfinder -d $DOMAIN -silent > $OUTPUT_DIR/subs.txt
# Step 2: Check which subdomains are live
echo "[+] Checking live subdomains..."
cat $OUTPUT_DIR/subs.txt | httpx -silent > $OUTPUT_DIR/live_subs.txt
# Step 3: Get URLs from Wayback Machine
echo "[+] Collecting historical URLs..."
cat $OUTPUT_DIR/live_subs.txt | waybackurls > $OUTPUT_DIR/wayback.txt
# Step 4: Filter URLs using gf patterns
echo "[+] Filtering for interesting paths..."
cat $OUTPUT_DIR/wayback.txt | gf admin > $OUTPUT_DIR/admin.txt
cat $OUTPUT_DIR/wayback.txt | gf sqli > $OUTPUT_DIR/sqli.txt
# Step 5: Check if those URLs are live
echo "[+] Checking live filtered URLs..."
cat $OUTPUT_DIR/admin.txt $OUTPUT_DIR/sqli.txt | httpx -silent > $OUTPUT_DIR/final_targets.txt
echo "[+] Recon complete. Results saved in $OUTPUT_DIR/"
This script will give you a folder like example.com-fullrecon with all the results inside.
5. Shell Scripting for OSINT Data Collection
OSINT stands for Open Source Intelligence. It means collecting information from public sources like websites, social media, and search engines.
Hackers, security researchers, and investigators use OSINT to learn more about a target. This can include email addresses, phone numbers, usernames, and more.
Collecting this data manually takes time. But with shell scripting, we can automate many OSINT tasks.
Let’s create a script that collects useful data like domain info, emails, and social media mentions.
#!/bin/bash
# Check for input
if [ -z "$1" ]; then
echo "Usage: ./osint-collector.sh targetname"
exit 1
fi
TARGET=$1
OUTPUT_DIR="osint-$TARGET"
mkdir -p $OUTPUT_DIR
echo "[+] Starting OSINT collection for: $TARGET"
# Whois info
echo "[+] Getting WHOIS info..."
whois $TARGET > $OUTPUT_DIR/whois.txt
# DNS info
echo "[+] Getting DNS info..."
dig $TARGET ANY +noall +answer > $OUTPUT_DIR/dns.txt
# Search for emails using theHarvester
echo "[+] Collecting emails using theHarvester..."
theHarvester -d $TARGET -b all > $OUTPUT_DIR/emails.txt
# Check public breaches using haveibeenpwned (requires API or manual curl call)
echo "[+] Checking breaches on HaveIBeenPwned..."
curl -s "https://haveibeenpwned.com/unifiedsearch/$TARGET" > $OUTPUT_DIR/breaches.json
# Google dorking (basic)
echo "[+] Running basic Google dorking..."
echo "site:$TARGET" > $OUTPUT_DIR/google_dorks.txt
echo "inurl:login site:$TARGET" >> $OUTPUT_DIR/google_dorks.txt
echo "intitle:index.of site:$TARGET" >> $OUTPUT_DIR/google_dorks.txt
echo "filetype:pdf site:$TARGET" >> $OUTPUT_DIR/google_dorks.txt
echo "[+] OSINT collection completed. Results saved in $OUTPUT_DIR/"
- whois gets domain registration details like owner, registrar, and contact info.
- dig fetches DNS records, including IPs and subdomains.
- theHarvester gathers emails and subdomains from public sources like search engines and public databases.
- curl checks for data breaches (the official HaveIBeenPwned API gives more detail; see the sketch below).
- Basic Google dorks are saved in a file for manual or later use.
It will create a folder named osint-example.com and save all the results there.
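The unifiedsearch endpoint used above powers the website's search box and may block scripted requests. If you have an API key, the official HaveIBeenPwned v3 API is more reliable. Here is a hedged sketch, assuming you export HIBP_API_KEY yourself and replace the placeholder email:
# Query the official HIBP v3 API for one account (paid API key required)
curl -s "https://haveibeenpwned.com/api/v3/breachedaccount/someone@example.com" \
  -H "hibp-api-key: $HIBP_API_KEY" \
  -H "user-agent: osint-collector" > "$OUTPUT_DIR/breaches_api.json"
Note that this endpoint takes an email address, not a domain, so you would loop it over any addresses theHarvester found.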
6. Auto-Test Common Vulnerabilities with Curl + Shell Logic
Once you find live URLs, the next step is to check if they are vulnerable. You don’t always need big tools for that. With just curl and shell scripting, you can test for simple and common web vulnerabilities.
This method works well for:
- Testing URL parameters for XSS
- Checking for open redirects
- Sending payloads to see basic server responses
Let’s write a small but powerful shell script that sends different payloads using curl and checks the response.
#!/bin/bash
# Check if URL file is given
if [ -z "$1" ]; then
echo "Usage: ./vuln-tester.sh urls.txt"
exit 1
fi
URL_FILE=$1
PAYLOADS=("'><script>alert(1)</script>" "//evil.com" "' OR '1'='1" "<?php phpinfo(); ?>" "<svg onload=alert(1)>")
OUTPUT="vuln-results.txt"
> $OUTPUT
echo "[+] Starting auto tests on URLs..."
while read -r URL; do
for PAYLOAD in "${PAYLOADS[@]}"; do
TEST_URL="${URL}${PAYLOAD}"
RESPONSE=$(curl -sk "$TEST_URL" -o /dev/null -w "%{http_code} %{size_download}")
echo "[+] $TEST_URL => $RESPONSE" >> $OUTPUT
done
done < "$URL_FILE"
echo "[+] Testing done. Results saved to $OUTPUT"
Create a file named urls.txt with one testable URL per line. For example:
- https://example.com/page?id=
- https://test.com/login?redirect=
- https://site.com/search?q=
- PAYLOADS: This is a list of simple test payloads. They target XSS, open redirect, SQLi, and basic RCE.
- "${URL}${PAYLOAD}": This adds each payload to the end of the URL. It assumes the URL ends with a parameter like ?q=.
- curl -sk: Sends a request silently (-s) and skips SSL check (-k).
- -o /dev/null: Ignores saving the response body.
- -w "%{http_code} %{size_download}": Prints HTTP response code and size. If something changes a lot, the payload may have triggered something.
- >> $OUTPUT: Saves every result into a file.
This simple logic helps you quickly test many URLs with different payloads.
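After a run, you can filter vuln-results.txt to make anomalies stand out. A minimal follow-up sketch (which status codes are interesting depends on the target):
# Drop plain 404 responses so unusual codes are easier to spot
grep -v " 404 " vuln-results.txt
# Count how often each status code appeared (status is the second-to-last field)
awk '{print $(NF-1)}' vuln-results.txt | sort | uniq -c | sort -rn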
7. Advanced Shell Scripting to Organize Recon Output
Recon generates a flood of data: subdomains, endpoints, screenshots, and vulnerabilities. Without proper structure, things get messy — fast.
Using shell scripting, you can automate both the recon and the way everything is stored. This keeps your work neat and makes your recon flow repeatable and professional.
Let’s build a smart script that:
- Runs recon tools.
- Stores their output in the right folders.
- Logs everything in a single place
Tools Used:
Make sure subfinder, httpx, and waybackurls are installed before running this script.
#!/bin/bash
# Check input
if [ -z "$1" ]; then
echo "Usage: ./advanced-recon.sh example.com"
exit 1
fi
DOMAIN=$1
DATE=$(date +%Y-%m-%d)
ROOT_DIR="recon-$DOMAIN-$DATE"
LOG_FILE="$ROOT_DIR/recon.log"
# Create directory structure
mkdir -p "$ROOT_DIR"/{subdomains,live,wayback,scans,logs,screenshots}
echo "[+] Starting recon for: $DOMAIN" | tee -a $LOG_FILE
# Step 1: Subdomain enumeration
echo "[*] Finding subdomains..." | tee -a $LOG_FILE
subfinder -d $DOMAIN -silent > "$ROOT_DIR/subdomains/subs.txt"
SUB_COUNT=$(cat "$ROOT_DIR/subdomains/subs.txt" | wc -l)
echo "[+] Found $SUB_COUNT subdomains" | tee -a $LOG_FILE
# Step 2: Check for live subdomains
echo "[*] Checking live subdomains..." | tee -a $LOG_FILE
httpx -silent -l "$ROOT_DIR/subdomains/subs.txt" > "$ROOT_DIR/live/live.txt"
LIVE_COUNT=$(cat "$ROOT_DIR/live/live.txt" | wc -l)
echo "[+] Found $LIVE_COUNT live domains" | tee -a $LOG_FILE
# Step 3: Collect historical URLs
echo "[*] Fetching historical URLs..." | tee -a $LOG_FILE
cat "$ROOT_DIR/live/live.txt" | waybackurls > "$ROOT_DIR/wayback/urls.txt"
WAYBACK_COUNT=$(cat "$ROOT_DIR/wayback/urls.txt" | wc -l)
echo "[+] Retrieved $WAYBACK_COUNT URLs from Wayback Machine" | tee -a $LOG_FILE
# Step 4: Log Summary
echo "-----------------------------------" | tee -a $LOG_FILE
echo "Recon Summary for $DOMAIN" | tee -a $LOG_FILE
echo "Subdomains: $SUB_COUNT" | tee -a $LOG_FILE
echo "Live Domains: $LIVE_COUNT" | tee -a $LOG_FILE
echo "Wayback URLs: $WAYBACK_COUNT" | tee -a $LOG_FILE
echo "Output directory: $ROOT_DIR" | tee -a $LOG_FILE
echo "-----------------------------------" | tee -a $LOG_FILE
- ${ROOT_DIR}/{subdomains,…}: Uses brace expansion to make multiple folders in one command.
- tee -a $LOG_FILE: Saves all important output both on screen and into a log.
- Counters: Shows how much data was collected at each step.
- You can extend this easily to include tools like nuclei, gf, or screenshotters.
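For example, a nuclei pass could be appended near the end of the script like this. This is a sketch assuming nuclei is installed with its default templates; adjust the flags to your setup:
# Extra step: run nuclei against the live hosts
echo "[*] Running nuclei..." | tee -a $LOG_FILE
nuclei -l "$ROOT_DIR/live/live.txt" -o "$ROOT_DIR/scans/nuclei.txt"
echo "[+] Nuclei findings: $(wc -l < "$ROOT_DIR/scans/nuclei.txt")" | tee -a $LOG_FILE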
How to Create, Save, and Run a Shell Script (For Beginners)
If you’re new to shell scripting, don’t worry. Here’s a simple step-by-step guide to help you create, save, and run any bash script on your terminal.
Step 1: Open a Terminal and Create the File
In your Linux or macOS terminal, type:
nano script-name.sh
Replace script-name.sh with whatever you want to name your script (e.g., recon.sh, urls.sh, etc.).
This will open a blank file in the nano text editor.
Step 2: Paste Your Script
Now copy the shell script code and paste it into the editor.
Use right-click → paste or Ctrl + Shift + V (on Linux) or Cmd + V (on Mac) if supported
Step 3: Save the Script
After pasting the code:
- Press CTRL + O (the letter “O”) to save.
- Press Enter to confirm the file name.
- Then press CTRL + X to exit the editor.
Step 4: Give Execute Permission
Before you can run the script, make it executable:
chmod +x script-name.sh
This command tells the system: “I want this file to be a runnable script.”
Step 5: Run the Script
Now simply run the script with:
./script-name.sh
If your script requires an input (like a domain), add it like this:
./script-name.sh example.com
Your script will now start running, and it will follow the steps written inside. You’ll see outputs and logs just like a real tool.
This process works for all shell scripts. Once you get used to it, you can automate almost anything using simple Bash logic.
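If you want to practice the whole flow once before touching the bigger recon scripts, you could save this tiny example as practice.sh, run chmod +x practice.sh, and then ./practice.sh example.com (the file name and message are just placeholders):
#!/bin/bash
# Minimal practice script: checks for an argument and prints a message
if [ -z "$1" ]; then
  echo "Usage: ./practice.sh example.com"
  exit 1
fi
echo "[+] Practice run complete for: $1"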
Conclusion: Why Shell Scripting Is a Must-Have Recon Skill
If you want to get better at ethical hacking or bug bounty, learning shell scripting is a must. It saves time, avoids mistakes, and helps you stay organized.
When you do recon manually, you waste a lot of time. You run one tool, copy results, paste them into another, and do this over and over. This is slow and boring. With shell scripting, you can automate all of this.
You can run tools like subfinder, httpx, and waybackurls in one go. You can also chain tools together, clean your output, and store everything in neat folders. This is not only faster — it’s smarter.
With a single script, you can scan a domain, find subdomains, check which ones are live, collect old URLs, and more. And every time you use it, you save hours.
Another great thing is: once you build a script, you can reuse it forever. You can change one part or add new tools easily. It grows with your skills.
Shell scripting makes you look professional. It keeps your recon clean. It shows that you understand how tools work, and how to use them well.
So, whether you’re just starting out or already experienced, don’t ignore this skill.
If you want to level up your recon — start learning shell scripting today. It’s one of the most powerful things you can add to your hacking toolkit.