Search Engine Review: SurfGopher.com

SurfGopher has been around since at least 1998, when it was billing itself as a "Domain name homepage directory." In plain English: SurfGopher isn't really a search engine or a directory; it's a database of domain names with "card catalog" descriptions of each domain's web content. SurfGopher only tells you what the domain owner wants you to know about the domain.

The idea behind domain-indexes is fairly simple: By indexing domains instead of pages, domain-indexes filter out minor sites (ones without domains) while putting all others on equal footing (since each domain gets the same number of listings in the database). In practice, this seldom works as well as intended, since having a domain name doesn't really guarantee good content, and any spammer with two dimes to rub together is going to have extra domains to fill the database with.
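
To make that concrete, here's a minimal sketch of what a domain-index amounts to; the structure and field names are my own illustration, not SurfGopher's actual schema.

    # A minimal sketch of a domain-index: one listing per domain, keyed by
    # the domain name.  Field names here are illustrative, not SurfGopher's.
    domain_index = {}

    def add_domain(domain, title, description, keywords):
        """Every domain gets exactly one entry, so large and small sites
        end up on equal footing -- and so does every spammer's extra domain."""
        domain_index[domain] = {
            "title": title,
            "description": description,
            "keywords": keywords,
        }

    add_domain("example.com", "Example Site",
               "Whatever the owner says the site is about.",
               ["example", "demo"])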

For a site that does so little, SurfGopher has been around for a surprisingly long time, having outlasted several larger competitors like DirectHit and Infoseek. Apparently, SurfGopher exists in that strange middle ground of search engines, where it's too small to be an acquisition target, but just large enough to survive off its advertising revenue.

The Users' Side

Searching SurfGopher is almost too easy: there's a search box and nothing else, with no advanced options. The engine seems to treat multiple keywords as a boolean "AND", with listings that contain only one of the words trailing after those that contain all of them. Phrase-searching is enabled by placing search phrases in double-quotes (").
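
My reading of that behavior, sketched in code; this is an outside guess at how the matching works, not anything SurfGopher documents, and the listing fields are the same hypothetical ones from the earlier sketch.

    import re

    def parse_query(query):
        """Split a query into double-quoted phrases and bare keywords."""
        phrases = re.findall(r'"([^"]+)"', query)
        keywords = re.sub(r'"[^"]+"', " ", query).split()
        return [p.lower() for p in phrases], [k.lower() for k in keywords]

    def search(index, query):
        """index maps domain -> {"title", "description", "keywords"}.
        Listings matching every term come first; listings matching only
        some of the terms trail behind them."""
        phrases, keywords = parse_query(query)
        terms = phrases + keywords
        full, partial = [], []
        for domain, listing in index.items():
            text = " ".join([listing["title"], listing["description"],
                             " ".join(listing["keywords"])]).lower()
            hits = sum(1 for term in terms if term in text)
            if hits == len(terms):
                full.append(domain)      # boolean "AND" matches
            elif hits:
                partial.append(domain)   # partial matches trail after
        return full + partial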

Search results are pretty straightforward: Each listing includes the title, meta description, and URI of a site. (All of this information is provided by the authors of the indexed pages.) Pages that don't have a meta description don't get any description at all.

SurfGopher's search results are a mixed bag. They often do include domains that are too new to appear in major engines (which take longer to add sites, because they maintain more complex databases), but those new domains are often lost in a quagmire of out-of-date listings. SurfGopher doesn't appear to revisit sites after their original submission, so changes in site content (or the termination of sites) aren't reflected in the search results. Until SurfGopher changes their webcrawling methods, their database is likely to continue degrading.

SurfGopher's partner site MetaGopher is a meta-search engine that gets its results from various paid-placement search providers. Search Engine Snob isn't a fan of meta-search engines, because I think combining the results of multiple mediocre search engines doesn't create a good search engine.

The Webmasters' Side

SurfGopher does not crawl the web looking for sites; if you want your domain listed, you need to submit it yourself. Submission is free, but requires an e-mail address for the confirmation message. (The confirmation message, as is usual in these situations, includes a bunch of advertising, but it's a one-time mailing.) The e-mail address doesn't have to be from the domain being submitted.

SurfGopher doesn't even have a branded robot; submitted sites will be visited by "libwww-perl/5.47" almost immediately after being submitted. SurfGopher doesn't appear to re-visit included sites (which is why there are so many dead listings in their index), so you should resubmit if your site changes.
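
If you want to confirm the visit, a quick scan of your access log for that user-agent string is enough; the log path and format below are assumptions about a typical Apache-style setup, not anything SurfGopher publishes.

    # Look for SurfGopher's unbranded robot in an Apache-style access log.
    LOG_PATH = "/var/log/apache/access.log"   # adjust for your own server

    with open(LOG_PATH) as log:
        for line in log:
            if "libwww-perl/5.47" in line:
                print(line.rstrip())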

As far as listings go, SurfGopher is one of those dream sites that give webmasters almost complete control over their listings. SurfGopher listings display the title, meta DESCRIPTION, and URI for a site.

On the other hand, SurfGopher does not index the full contents of submitted pages. The only information indexed is the contents of the TITLE, meta DESCRIPTION, and meta KEYWORDS elements; so pages that don't have a meta DESCRIPTION won't have any description at all in SurfGopher's results.
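
For illustration, everything SurfGopher seems to look at can be pulled out of a page with a few lines of Python; the parsing approach here is mine, not theirs.

    from html.parser import HTMLParser

    class ListingExtractor(HTMLParser):
        """Collect the only three things SurfGopher appears to index:
        the TITLE, meta DESCRIPTION, and meta KEYWORDS."""
        def __init__(self):
            super().__init__()
            self.title, self.description, self.keywords = "", "", ""
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "meta":
                name = (attrs.get("name") or "").lower()
                if name == "description":
                    self.description = attrs.get("content", "")
                elif name == "keywords":
                    self.keywords = attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    page = ("<html><head><title>Example Site</title>"
            "<meta name='description' content='A sample description.'>"
            "<meta name='keywords' content='sample, example'></head></html>")
    extractor = ListingExtractor()
    extractor.feed(page)
    print(extractor.title, extractor.description, extractor.keywords, sep=" | ")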

SurfGopher appears to have problems displaying HTML character entities; some of them (including the "@" sign, entity &#64;) won't appear in SurfGopher's version of site titles and descriptions.
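
For the record, decoding these references is trivial; this generic example (not SurfGopher's code) shows the expected behavior.

    import html

    # "&#64;" is the numeric character reference for "@".
    title = "Contact &#64; Example Site"
    print(html.unescape(title))   # prints: Contact @ Example Site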

SurfGopher's indexing system isn't fooled by HTTP redirects or stealth redirection; in my tests of those tricks, SurfGopher just went ahead and indexed the site being redirected to (and discarded it, if that site wasn't the top page of a domain).
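
Seen from the outside, the behavior amounts to something like the following approximation; the URL is a placeholder and the top-page test is my guess at SurfGopher's rule.

    from urllib.parse import urlparse
    from urllib.request import urlopen

    def index_candidate(url):
        """Follow any HTTP redirects (urlopen does this by default), then
        keep the destination only if it is the top page of its domain."""
        final_url = urlopen(url).geturl()
        if urlparse(final_url).path in ("", "/"):
            return ("index", final_url)
        return ("discard", final_url)

    print(index_candidate("http://example.com/"))   # placeholder URL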

Conclusions

While a "homepage-only" search engine sounds like an interesting idea for leveling the playing field between sites, SurfGopher's implementation leaves a lot to be desired. It's greatest problem is its limited spidering; unless they beging a more aggressive crawl of the web, the database will always be second-rate.

From the users' perspective, SurfGopher is a "Why bother?" engine. For webmasters, it's more like a "Can't hurt" engine, in that it only spiders sites when requested, and lets webmasters control the listings.