I generated a 16-character (mixed upper/lowercase) subdomain, set up an Apache virtual host for it, and within an hour was seeing vulnerability scans.

How are folks digging this up? What’s the strategy to avoid this?

I am serving it all with a single wildcard SSL cert, if that’s relevant.

Thanks

Edit:

  • I am using a single wildcard cert, with no individual subdomains attached/embedded (however those work).
  • I don't have any individual subdomains registered in DNS.
  • I tried dig axfr example.com @ns1.example.com and the zone transfer was DENIED.

Edit 2: I'm left wondering: is there an Apache endpoint that returns all configured virtual hosts?
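For what it's worth, Apache has no HTTP endpoint that lists virtual hosts by default; the parsed vhost configuration can be dumped locally, and mod_info/mod_status can expose similar detail over HTTP only if they're enabled and left unrestricted. A quick local check, assuming the stock apachectl wrapper:

    # Dump the parsed virtual host configuration locally (no HTTP involved)
    apachectl -S        # on Debian/Ubuntu: apache2ctl -S

    # Handlers that can leak configuration details if enabled and publicly
    # reachable; lock them down or disable them:
    #   /server-info    (mod_info)
    #   /server-status  (mod_status)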

Edit 3: I'm going to go through this hardening guide and try again with a new random subdomain: https://www.tecmint.com/apache-security-tips/

  • foggy@lemmy.world · +8 · 5 hours ago

    https://crt.sh/

    When a CA issues an SSL/TLS certificate, it's required to submit it to public CT logs (append-only, cryptographically verifiable ledgers). Certificate Transparency was designed to detect misissued or malicious certificates.

    Red and blue teams alike use this resource (crt.sh) to enumerate subdomains.
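    A quick way to see what CT has for a given domain, assuming crt.sh's JSON output (%25 is a URL-encoded % wildcard; example.com is a placeholder):

        # List certificates logged for example.com and its subdomains
        curl -s 'https://crt.sh/?q=%25.example.com&output=json'

    Note that a wildcard certificate only shows up there as *.example.com; it doesn't enumerate the individual subdomains behind it.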

  • eleijeep@piefed.social · +3 · 6 hours ago

    You need to look at the DNS server used by whatever client is resolving that name. If it’s going to an external recursive resolver instead of using your own internal DNS server then you could be leaking lookups to the wider internet.
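    A couple of quick checks from the client side (the resolver address and hostname below are placeholders; resolvectl only exists on systemd-resolved systems):

        # Which resolver is this machine actually configured to use?
        cat /etc/resolv.conf
        resolvectl status

        # Where does a lookup for the secret name actually go?
        dig secret-abc123.example.com                 # via the default resolver
        dig secret-abc123.example.com @192.0.2.53     # directly against your internal DNS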

  • stratself@lemdro.id · +3 · 6 hours ago

    My guess would be NSEC zone walking, if your DNS provider supports DNSSEC. But that shouldn't work with unregistered or wildcard domains.

    The next guess would be that during setup, someone somewhere got hold of your SNI (and/or your outgoing DNS requests). Maybe your ISP/VPN provider actually logs them and announces them to the world.

    I suggest that next time you try setting up without any over-the-internet traffic at all, e.g. always use curl with the --resolve flag on the same VM as Apache to check if it's working.
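    A sketch of that kind of no-leak test, with placeholder names, assuming Apache is already listening on 443 on the same box:

        # Does the zone hand out walkable NSEC records (vs. hashed NSEC3)?
        dig +dnssec doesnotexist.example.com

        # Verify the vhost without any DNS query leaving the machine:
        # --resolve pins the name to localhost, while TLS is still checked
        # against the real hostname
        curl -v --resolve secret-abc123.example.com:443:127.0.0.1 \
            https://secret-abc123.example.com/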

  • Fair Fairy@thelemmy.club · +7 · 12 hours ago

    Crawlers typically crawl by IP.

    Are you sure they're not just hitting the IP?

    You need to explicitly configure Apache to drop the connection (or deny the request) when the requested domain doesn't match a configured vhost.

    I use a similar pattern and get 0 crawls.
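    A minimal Apache sketch of that idea, assuming a name-based setup where this catch-all vHost is listed first; the paths and names are placeholders. Strictly speaking it answers 403 rather than literally dropping the connection, but it keeps unknown-Host/SNI traffic and bare-IP scans away from the real site:

        <VirtualHost *:443>
            # First-listed vhost on this ip:port = fallback for requests whose
            # Host/SNI doesn't match any other ServerName
            ServerName catchall.invalid
            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/wildcard.pem
            SSLCertificateKeyFile /etc/ssl/private/wildcard.key
            <Location "/">
                Require all denied
            </Location>
        </VirtualHost>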

  • Fedditor385@lemmy.world · +16/-1 · 15 hours ago

    If you have a browser with search suggestions enabled, everything you type in the URL bar gets sent to a search engine like Google to generate suggestions. I would not be surprised if Google uses this data to check what it knows about the domain you entered, and if it sees that it knows nothing, sends a bot to scan it and gather more information.

    But in general, you can't really access a domain without using a browser, and that browser might send what you type to some company's backend, and voila, you've leaked your data.

    • Derpgon@programming.dev · +3 · 11 hours ago

      Easily verified by creating another batch of subdomains and using a browser that doesn't do tracking, like Waterfox.

    • kumi@feddit.online · +4 · 14 hours ago

      What you can do is segregate networks.

      If the browser runs in, say, a VM with only access to the intranet and no internet access at all, this risk is greatly reduced.

  • oranki@sopuli.xyz · +15 · 15 hours ago

    Maybe that particular subdomain is getting treated as the default virtual host by Apache? Are the other subdomains receiving scans too?

    I don't use Apache much, but NGINX sometimes surprises you with which server block it uses when a default isn't explicitly defined.
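    A quick way to see what unmatched requests actually get served (the IP and hostname below are placeholders):

        # What does a request by bare IP, with no matching Host/SNI, get?
        curl -vk https://203.0.113.10/

        # What does an unknown name pointed at the same IP get?
        curl -vk --resolve bogus.example.com:443:203.0.113.10 https://bogus.example.com/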

  • fubarx@lemmy.world · +14/-1 · 16 hours ago

    A long time ago, I turned a PC in my basement into a web server. No DNS. Just a static IP address. Within 15 minutes, the logs showed it was getting scanned.

    SSL only encrypts traffic in transit; it doesn't control who can connect. You need to set up auth/access control. Even better, stick it behind a Web Application Firewall.

    Or set up a tunnel. Cloudflare offers a free one: https://developers.cloudflare.com/cloudflare-one/networks/connectors/cloudflare-tunnel/
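    If going the tunnel route, the usual cloudflared flow looks roughly like this (tunnel name and hostname are placeholders; the linked docs are the authoritative reference):

        cloudflared tunnel login                                  # authorize the connector
        cloudflared tunnel create my-tunnel                       # create a named tunnel
        cloudflared tunnel route dns my-tunnel app.example.com    # map a public hostname to it
        cloudflared tunnel run my-tunnel                          # start forwarding traffic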

  • androidul@lemmy.world · +81 · 20 hours ago

    If you use Let's Encrypt (the ACME protocol), AFAIK you can find every issued certificate in a public directory that even has search, no matter whether it's a wildcard or not.

    It was something like this, https://crt.sh/ , but I can't find the exact site anymore.

    Edit: you can also find some here: https://search.censys.io/

  • Decronym@lemmy.decronym.xyz (bot) · +49/-1 · 10 minutes ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    CA    (SSL) Certificate Authority
    DNS   Domain Name Service/System
    IP    Internet Protocol
    SSL   Secure Sockets Layer, for transparent encryption
    TLS   Transport Layer Security, supersedes SSL
    VPN   Virtual Private Network
    VPS   Virtual Private Server (opposed to shared hosting)

    7 acronyms in this thread; the most compressed thread commented on today has 16 acronyms.

  • 4am@lemmy.zip · +34/-2 · 20 hours ago

    For anyone who needs to read it: at the end of the day this is obscurity, not security; however, obscurity is a good secondary defense because it buys time.

    I too would be interested to learn how this leaked.

    • zeca@lemmy.ml · +1 · 7 hours ago

      Isn't security mostly achieved by heavy obscurity? A password secures something because other people don't know what it is; it is obscured.

      • pishadoot@sh.itjust.works · +3 · 5 hours ago

        They’re not the same.

        Hiding an unlocked treasure chest in the forest is obscurity. Sure, you might be the only one who knows it’s there at first but eventually someone might come across it.

        Having a vault at a bank branch is security - everyone knows there’s a vault there, but you’ll be damned if you’re going to get into it when you’re not authorized.

        Good passwords, when implemented correctly, use hashing (a one-way function, not reversible encryption) to provide security. It's not obscurity: people know you need a password to access the thing (in our example); they just can't produce it.
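        For example, what the server stores is a salted one-way hash rather than the password itself; a quick illustration with OpenSSL's SHA-512 crypt mode (OpenSSL 1.1.1 or newer; the password is obviously a placeholder):

            # Prints something like $6$<salt>$<hash>; the original can't be read back out
            openssl passwd -6 'correct-horse-battery'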

      • bluehambrgr@lemmy.world · +1 · 5 hours ago

        In cryptography, there’s a difference between “secrets” (like passwords and encryption keys), and hiding / obscuring something (like steganography or changing your web server to run on a different port)

      • Fair Fairy@thelemmy.club · +9 · 10 hours ago

        It's not. It's wildcard DNS and a wildcard cert; the domain is not logged publicly.

        People who keep saying it's logged publicly simply don't understand the setup and the technology.

      • Keelhaul@sh.itjust.works · +9 · 13 hours ago

        How is it being logged publicly? Like OP said, there is no specific subdomain registered in the DNS records (it uses a wildcard instead). Same for the SSL cert. The only things I can think of are the browser leaking the subdomain (through Google or Microsoft), or the DNS queries themselves being logged and leaked (possibly by the ISP inspecting the traffic, or logging and leaking on its own DNS servers?). I would hardly call either of those public.

  • yeehaw@lemmy.ca · +4 · 16 hours ago

    Reverse DNS? Or vuln scans just hitting IPs. Don’t need DNS for that.
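    Reverse DNS is easy to check for the server's address (the IP below is a placeholder); if a PTR record points back at a hostname, that's one more place a name is visible:

        dig -x 203.0.113.10 +short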

    • Fair Fairy@thelemmy.club · +1 · 9 hours ago

      OP is doing the hidden-subdomain pattern: wildcard DNS and a wildcard SSL cert.

      This way the subdomain acts as a password, and the application is essentially inaccessible to bot crawls.

      It works very well.
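      For anyone unfamiliar with the pattern, it boils down to one wildcard DNS record plus one wildcard certificate, so individual names never appear in DNS or in CT logs. The zone snippet and certbot command below are illustrative only (wildcard issuance requires the DNS-01 challenge):

          ; zone file: every name under example.com resolves to the same host
          *.example.com.   300   IN   A   203.0.113.10

          # one wildcard cert via DNS-01 (certbot shown as an example ACME client)
          certbot certonly --manual --preferred-challenges dns -d '*.example.com'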