In the years that followed, the future seemed obvious. The number of Gopher users expanded at orders of magnitude more than the World Wide Web. Gopher developers held gatherings around the country, called GopherCons, and issued a Gopher T-shirt – worn by MTV veejay Adam Curry when he announced the network’s Gopher site. The White House revealed its Gopher site on Good Morning America. In the race to rule the internet, one observer noted, “Gopher seems to have won out.”
Well, things turned out a little differently. Sadly, we tend to only remember the victors, not the ones lying in a ditch by the side of the road to victory.
Interesting article! There doesn’t seem to be one simple reason for Gopher’s decline, but it sounds like incompetent university administrators and porn were two of the big ones. I wonder if we could have ended up with a less advertising-based internet… That’s probably unlikely, but it still could have ended up looking quite a bit different – maybe even more elitist and old-media, in a way, rather than following the tabloid instincts of the Web.
I rather enjoyed that article.
These days GopherCons are conferences for the Go programming language, which has a gopher as its logo.
What killed gopher? The university tried to “monetize” it right at the start. If you want to get REALLY big, you have to find some other way than demanding that EVERYONE pay you based on how much you can squeeze from them.
What’s sad is that ISPs haven’t learned – they’re trying to do the same thing now that killed gopher then: charging companies on top of their connection fees based on how they use their ISP connection. Thank God for net neutrality.
But that’s how you recognize a company/business/whatever that is swirling the bowl that needs to be dumped NOW – they talk about “monetizing” their “assets”… very often “IP” of some sort.
“Everyone would answer phones at least one day a week,” Lindner says, “even if you were programming. That way you were close to the pain you were inflicting on people — if programmers today still took calls, we’d have more user-friendly software.”
I find this sort of history fascinating. And you are absolutely right, Thom, that we tend to only remember the victors.
I found that bit interesting.
I had never heard of Hyper-G, but it probably deserves a footnote as well. I didn’t find very much, though; this document explains some of the differences. It’s hard to understand what this means in practice without having experienced it.
http://www.jucs.org/jucs_1_11/a_comparison_of_www/Pam_A.html
Hyper-G seems to be more structured, with bidirectional links and built-in indexing – things that HTTP servers today still can’t do well without laborious crawlers.
Search for “hyperg” and “tu graz” – this should bring up some resources.
http://www.iicm.tugraz.at/thesis_113/cderler.pdf
http://ftp.iicm.tugraz.at/pub/papers/inet93.pdf
http://www.iicm.tugraz.at/hwbook/5
…
There is still a lot of stuff online… it’s the university where Hyper-G was created (by my former professors).
I think we keep finding that the worse technology tends to be used more. I like how McCahill likens the Web to Tim Berners-Lee’s own personality. People are naturally a bit scattered, and that’s what the worse technology always seems to be – a bit scattered. Linux – very scattered.
I would coin it “The Fine China Principle”. No one wants to use the fine china. Your grandmother would shout at you. Use the regular china. It’s cheap to replace if you break it.
The Web’s unstructured, unidirectional, unindexed nature just means there are fewer obstacles to it spreading. All you have to do is bang up some HTML and you don’t even have to think.
Bi-directional links? Damn, imagine knowing exactly how many pages linked to a page on your website. Technically you could Google search, but this is more something built into the thing itself. I’ve always hated the wild abandon with which people change links or switch CRM software in ways that break links. If they could easily get a sense of how many people depend on their URLs remaining the same, maybe we’d have a less disjointed, less volatile internet.
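A toy sketch of what built-in bidirectional links buy you (my own illustration, not anything from Hyper-G’s actual design; the page paths are made up): if the server records both directions at write time, “who links here?” becomes a trivial lookup instead of a crawl.

from collections import defaultdict

class LinkIndex:
    """Toy store that records every link in both directions at write time."""
    def __init__(self):
        self.outgoing = defaultdict(set)  # page -> pages it links to
        self.incoming = defaultdict(set)  # page -> pages that link to it

    def add_link(self, source, target):
        self.outgoing[source].add(target)
        self.incoming[target].add(source)

    def backlinks(self, page):
        # "Who links here?" without any external crawler.
        return sorted(self.incoming[page])

index = LinkIndex()
index.add_link("/news/gopher.html", "/history/protocols.html")
index.add_link("/blog/why-www-won.html", "/history/protocols.html")
print(index.backlinks("/history/protocols.html"))
# ['/blog/why-www-won.html', '/news/gopher.html']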
“It was designed as a defense, a secure means of communication should the Soviets destroy the American telephone system”
The original concept was extremely simple: the same way a holographic image can be reconstructed from some of its fragments, messages and other data could be reconstructed if damaged or destroyed in transit. It required redundancy, distribution, and diversity in transmission channels and formats.
Encryption was presumed, or simply unnecessary; the whole purpose was resilience in delivery.
The Web was built on top of that. Amazing.
The history as told suggests that Gopher was killed.
This was a fascinating read. I think I can trace my personal career development back to these events.
Back when Gopher was first released, I was a high-school student in Minnesota. Was an Amiga-guy, but was definitely into the BBS scene and such.
The U was running some sort of modem-based terminal server at the time. There was no authentication; once you connected it would just say “host>” and you’d type the name of a server. So long as it was *.umn.edu, it would happily connect you, but you then had to log in to the host itself.
So, since Gopher was a new thing, the folks from the article were running a demonstration server. If memory serves me, it was called hafnhaf.tc.umn.edu. If you were on the Internet, you could telnet to it, login as “gopher” with no password and be presented with a gopher client. Since this was at the U, you could login through the modem-based terminal server and get gopher access for free on a local telephone number.
While this was neat in and of itself, what really made it great for me and my high-school geek friends was that Gopher supported telnet links, and there were a few public Unix-shell services that were linked from gopher. One of them was M-Net in Ann Arbor. I signed up for an account and learned everything I could about Unix on this public system that I was connecting to through a not-so-legit connection to the U of M’s public gopher server.
I’ve worked with Unix/Linux for more than 20 years now, and I’m not so sure I would have gotten into this field if it wasn’t for the events described in the article.
http://static.nautil.us/5322_a4a8a31750a23de2da88ef6a491dfd5c.png
“…but Otlet introduced an important new twist: a set of so-called “auxiliary tables” that allowed indexers to connect one topic to another by using a combination of numeric codes and familiar marks like the equal sign, plus sign, colon, and quotation marks”.
…..
“…After invading Brussels, the Nazis destroyed much of his life’s work, removing more than 70 tons worth of material and repurposing the World Palace site for an exhibition of Third Reich art. Otlet died in 1944, and has remained largely forgotten ever since”.
http://nautil.us/issue/21/information/the-future-of-the-web-is-100-…
I remember Gopher in the early-mid 90s. It was much faster to use over dialup than the web. I’d always thought that it might have made sense for use in limited bandwidth environments like cell phone networks, but nobody seems to have gone for it.
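For anyone wondering why it felt so fast on a modem: the protocol (RFC 1436) is about as small as a protocol can get. A request is just a selector string plus CRLF sent to TCP port 70, and the reply is plain text. A minimal Python sketch of a fetch (the hostname is a made-up placeholder):

import socket

def gopher_fetch(host, selector="", port=70):
    """Send one Gopher request and return the raw text of the reply."""
    with socket.create_connection((host, port), timeout=10) as sock:
        # The whole request: the selector, then CRLF. No headers, no negotiation.
        sock.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("ascii", errors="replace")

# A menu reply is one tab-separated line per item (type, display string,
# selector, host, port), terminated by a line containing a single ".".
print(gopher_fetch("gopher.example.org"))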
I really started using the internet right as gopher was dying and www was coming of age. Gopher was really useful in the late ’90s for accessing things that had otherwise been removed from the web. There were a number of gopher servers that people just left running, which still had the old stuff on them that had long been scrubbed from the web.
I guess that makes me a little odd too. The structure of the web maps to my thinking very well. I find myself constantly being drawn into topics and discussions associated with the original, even when links are not present. Search engines have made links serve a very different function these days.
While working as a university research assistant I was put in charge of a Gopher server. I remember, in 1993, when the web started picking up steam I downloaded NCSA Mosaic and poked around a bit at the early websites. My first reaction was “this will never catch on,” and I went back to Gopher. Gopher had a much better user experience back then. But of course the Web evolved quickly, and Gopher stagnated, and by early 1994, I moved to web development and never looked back.
I know 20 years is not much compared with the internet’s history, but in the last 20 years or so, which is my contact with the internet, I NEVER used Gopher. Not even once. The only thing I know about it is “it was something people used in the early days to search, but don’t ask me how, where or what.” Most likely a majority of internet users these days don’t know even that about it.
I started to use the ‘Net in about 1993, when I started University. Before that I’d seen a few BBSes, but in the UK, phone calls were charged by the minute (every call, no matter how long), so you’d kind of try not to use a modem for extended periods.
So – when I started at Uni, we mainly had access through a terminal (VAX, running VMS), but we had a script that was passed around to get on to JANET (real internet) and a copy of Netscape. So that’s what we did. I never used Gopher. It might as well have never existed. By the time I got a modem, the Web was all I used and so I never used it at home either. I think I’d have to google Gopher to even know what it did!