-- Leo's gemini proxy

-- Connecting to gemplex.space:1965...

-- Connected

-- Sending request

-- Meta line: 20 text/gemini

Gemplex Crawler


Gemplex tries to be slow and careful when crawling Geminispace: robots.txt rules are honored, requests are spaced so that no single host is hit more than once per second, and "SLOW DOWN" responses from servers (status code 44) are respected.
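The throttling behavior described above — one request per host per second, plus backing off when a server answers with code 44 — can be sketched roughly as follows. This is an illustrative sketch only, not Gemplex's actual implementation; the class name, interval, and method names are assumptions:

```python
import time
from urllib.parse import urlparse

class HostThrottle:
    """Sketch of per-host politeness: at most one request per host per
    second, plus honoring code 44 'SLOW DOWN' backoff requests.
    (Illustrative only; not Gemplex's real code.)"""

    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval
        self.last_request = {}   # host -> timestamp of last request
        self.backoff_until = {}  # host -> timestamp before which the host is off-limits

    def wait_time(self, url, now=None):
        """Seconds to wait before the given URL may be fetched."""
        host = urlparse(url).hostname
        now = time.monotonic() if now is None else now
        earliest = self.last_request.get(host, float("-inf")) + self.min_interval
        # A code 44 backoff can push the earliest allowed time further out.
        earliest = max(earliest, self.backoff_until.get(host, float("-inf")))
        return max(0.0, earliest - now)

    def record_request(self, url, now=None):
        host = urlparse(url).hostname
        self.last_request[host] = time.monotonic() if now is None else now

    def record_slow_down(self, url, seconds, now=None):
        """Honor a code 44 response whose meta field is a delay in seconds."""
        host = urlparse(url).hostname
        now = time.monotonic() if now is None else now
        self.backoff_until[host] = now + seconds
```

A crawler loop would call wait_time before each fetch, sleep for the returned duration, and call record_slow_down whenever a 44 response arrives.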


The crawler's user-agent string is "elektito/gemplex". You can use this user-agent to set limits specific to the Gemplex crawler. Any limits set for the virtual user-agents "crawler", "indexer", or "researcher" are also honored, as described in the spec:


robots.txt Specification for Gemini
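The matching rule described above — a path is off-limits if it is disallowed for "elektito/gemplex", for any of the applicable virtual user-agents, or for "*" — can be sketched with a minimal robots.txt check. This is a simplified, Disallow-only sketch under the assumption of prefix matching; it is not Gemplex's actual parser:

```python
# Virtual user-agents the document says Gemplex honors, plus the wildcard.
GEMPLEX_AGENTS = ("elektito/gemplex", "crawler", "indexer", "researcher", "*")

def is_allowed(robots_txt, path, agents=GEMPLEX_AGENTS):
    """Return True if `path` is not disallowed for any of `agents`.
    Simplified sketch: Disallow-only rules, prefix matching, grouped
    User-agent lines. (Assumed behavior, not Gemplex's real code.)"""
    agent_set = {a.lower() for a in agents}
    disallowed = []
    current_agents = []  # user-agents the current rule group applies to
    in_rules = False     # whether we are past the group's User-agent lines
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if in_rules:  # a new group starts
                current_agents = []
                in_rules = False
            current_agents.append(value.lower())
        elif field == "disallow":
            in_rules = True
            if value and any(a in agent_set for a in current_agents):
                disallowed.append(value)
    return not any(path.startswith(prefix) for prefix in disallowed)
```

For example, a robots.txt containing "User-agent: crawler" followed by "Disallow: /private/" would keep Gemplex out of /private/ while leaving the rest of the capsule crawlable.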


If there are any issues, you are welcome to contact the author at:


mostafa@sepent.com

-- Response ended

-- Page fetched on Fri May 10 08:49:31 2024