Share community alive and kicking

Those of you who are regular readers of Seeing Digital will know that these new electronic communities are one of my favourite topics. While researching this article on the internet I came across many communities gathered around common interests. They really are spreading like wildfire now that it is becoming slightly easier to get onto the net. While looking for something interesting to write about, I realised that I had missed the most obvious community of all, one that exists right under my nose.

Sharetalk originated a year ago when I noticed that some people were copying notes to each other with share tips and discussion. At about the same time we installed new software to handle our internet eMail, and it included an option for running a listserver. I decided I would be putting the listserver to good use by setting up an eMail based discussion forum covering share investments. Thus Sharetalk was born.

For those of you who have heard the term but are perhaps still wondering what it is, a listserver is a very simple piece of software that allows people to join a mailing list by eMail. It's a bit like the mailroom at Reader's Digest. You write to them and say please put me on your mailing list (unlikely as this sounds) and they start sending you mail. The difference is that instead of just Reader's Digest being able to communicate with you (one way), everyone on the list is able to communicate with everyone else on the list.
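For the programmers among you, the idea can be sketched in a few lines. This is a hypothetical toy model, not the software Sharetalk actually runs on: members subscribe, and anything one member posts is fanned out to everyone else on the list.

```python
# Minimal sketch of what a listserver does (hypothetical toy, not a real
# listserv product): members subscribe, and a posted message is forwarded
# to every other subscriber.

class ListServer:
    def __init__(self, name):
        self.name = name
        self.members = set()   # subscribed eMail addresses
        self.outbox = []       # (recipient, sender, body) messages "sent"

    def subscribe(self, address):
        self.members.add(address)

    def post(self, sender, body):
        # Only members may post; the message goes to everyone else on the list.
        if sender not in self.members:
            return
        for member in self.members:
            if member != sender:
                self.outbox.append((member, sender, body))

sharetalk = ListServer("sharetalk")
for addr in ("ann@example.com", "ben@example.com", "cat@example.com"):
    sharetalk.subscribe(addr)
sharetalk.post("ann@example.com", "Anyone watching this counter?")
# Ann's message is delivered to the other two members, not back to her.
```

The digest option mentioned below is just a variation on the same fan-out: hold the day's messages and deliver them in one bundle each evening.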

So every member of the mailing list can post messages, which are then forwarded on to all other members. If this sounds like a full inbox to you, you're quite right. The eMail can accumulate very quickly. The listserver software does offer various options, one of which sends you a single message each evening containing both a summary and the full text of that day's messages. This suits some users, but most of the investors on Sharetalk get their mail directly so that they can act on it as it arrives.

Enough of the technical stuff. Why do these people hang out in this forum? Initial reactions when I asked people what they thought of Sharetalk and whether it would become popular were negative. "Why would people share their stock market tips?" was one comment. Many thought that share investors were selfish and wanted to glean as much information as possible without exchanging their own ideas.

The amazing thing was that within a few weeks, Sharetalk had organised itself into a functioning community. One of the very first developments, which set the scene for the culture that followed, was a member with access to live share prices starting a trading game. Members are given R100 000 (US$22 000) and are free to buy and sell at the ruling prices. A base set of rules was agreed upon after some debate, and the rules are posted to the forum periodically. The facilitator posts a daily update of all portfolios so that you can track how you are doing.

One of the almost unwritten rules of the game is that you give a reason for your trades. This in turn has led to a debate on the merits of fundamental versus technical analysis. With strong arguments coming from both sides, the proof is in the pudding, and the latest development is the techies keeping track of their profits based on their analysis and challenging the fundamentalists to do the same. At the moment: techies 3, fundamentalists 0.

As much fun as the game is, real money has been made and lost on Sharetalk over the last year. A hot tip on Crendel, posted late one evening some months back, led to many members calling their brokers (there are a couple on the forum) the next morning. While many made good money, some held on too long and lost a bundle. The success and failure stories were posted to the forum so that others could learn from the mistakes.

The level of knowledge on the forum varies from the first-time share trader to the old hands who are executing bear trades and making profits from arbitrage. The attitude is relaxed, and it is never a problem to share some hard-earned knowledge with a newcomer.

People come and go in the forum all the time. With a core of active members keeping the 10 to 30 messages a day circulating, there is always something interesting happening. Fridays have become humour day, and when trading slows down towards the end of the week, jokes start taking the place of share tips.

Latest developments include one of the members setting up a Sharetalk web page, and an international search engine now archives all messages, making it easy to go back and find the details of a discussion from a few months ago.

To the sceptics who thought it wouldn't work – well, I think the internet has again proven that things aren't always as you would expect them to be. There is a sense of community out there, and people seem to work together naturally for the good of all without only looking out for #1.

Anyone interested in more information can send an eMail to to get a reply with instructions for joining.

Hits (and misses)

Excited about your website notching up hits? Pleased with that counter on your web page clicking over week after week? Perhaps you shouldn't be, or at least not until you have taken a closer look at what lies behind those figures. Current technology makes calculating the number of visitors to a web site a bit like measuring the water in a dam using a rusty old sieve.

The problems with hits…
…are numerous, but let's take a look at some of the major issues. Caching is one of them. Service providers and large companies generally set their new users up to point at what is called a proxy server. A proxy server uses some clever technology to read in the contents of popular web sites and store them locally. When you request a page from a website – let's use CNN.COM as an example – the proxy first looks in its own store of pages. If the page you are requesting is already there, the proxy server compares its copy of the page to the original on CNN.COM to see if it has changed. If it hasn't, the proxy simply shows you its copy. If it doesn't have the latest copy, it pulls it in and stores it locally for you and the next users. This has a major performance advantage for users, because they get their pages from across town instead of across the world. Unfortunately the owners of CNN.COM are none the wiser that someone has looked at their page. Sites that use hits to calculate how many users are viewing them never hear about the hits that get delivered from proxies located all around the world.
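The caching idea can be illustrated with a few lines of code. This is a deliberately simplified sketch (it caches forever and never checks whether the page has changed, whereas a real proxy revalidates using the page's headers); the function names are made up for illustration:

```python
# Simplified sketch of proxy caching: the origin web site only sees a hit
# on a cache miss, no matter how many users view the cached page.

cache = {}  # url -> page content held locally by the proxy

def fetch_via_proxy(url, fetch_origin):
    """Return the page for url, serving from the proxy's cache when possible.

    fetch_origin retrieves the page from the real site; it is only called
    (and the site only logs a hit) when the proxy does not have a copy.
    """
    if url in cache:
        return cache[url]      # served from across town; origin none the wiser
    page = fetch_origin(url)   # cache miss: fetch from across the world once
    cache[url] = page          # keep a copy for the next users
    return page

origin_hits = []
def origin(url):
    origin_hits.append(url)    # this is all the web site's counter ever sees
    return "<html>news</html>"

fetch_via_proxy("http://cnn.com/", origin)  # first user: one hit at the origin
fetch_via_proxy("http://cnn.com/", origin)  # second user: served from cache
# origin_hits holds a single entry, though two users viewed the page.
```

Two viewers, one recorded hit: that is the undercounting the site owner never sees.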

The next problem is with hits themselves. Without going into the intricacies of what can be done on a web page, a typical web page consists of text and graphics. A web page that consists of only text registers 1 hit when a browser views it. A page that contains text and 10 graphics will deliver 11 hits: 1 for the text and 1 for each of the graphics. As you can imagine, this causes a major distortion in your web site figures. The more graphics you put on a site, the more hits you receive.
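The arithmetic is simple enough to write down:

```python
def hits_per_view(graphics):
    # One hit for the page text plus one for each graphic it contains.
    return 1 + graphics

# A text-only page registers a single hit per view:
assert hits_per_view(0) == 1
# The page with 10 graphics from the example above registers 11:
assert hits_per_view(10) == 11
```

So a webmaster who doubles the graphics on a page roughly doubles its "popularity" without a single extra visitor.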

On the flip side of this rather distorted coin lie the search engines. A quick survey I did found more than 200 companies that make their living by exploring the web and cataloguing every page they can lay their hands on. These companies use sophisticated computers to browse the web and follow every link on every page. If we take our Stones site as an example, a search engine would start at our home page and then follow every link to our unit trust site, our restaurant guide, our business directory, Hyperactive (our web design company) and so on. At each of those pages it would again follow links to the next level until it could find no more pages.

Companies that publish web sites don't often take into account the fact that search engines may account for a significant number of hits on their site. Those who do may quickly (and correctly) conclude that the more pages they have on their site, the more hits they will get from the search engines. One search engine working its way through a site with 10 pages, each with 5 graphics on it, will notch up 60 hits (10 text-page hits plus 10 x 5 graphics hits). If that same site were to re-arrange itself so that, instead of displaying the information on 10 pages, it split it across 20 pages with links between them, the hits would double to 120 (20 text-page hits plus 20 x 5 graphics hits) just from that one search engine. Now take the 200 or so search engines that may be working their way through a site at any one time…
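The sums above, carried one step further to the 200-odd search engines, look like this:

```python
def crawl_hits(pages, graphics_per_page):
    # A crawler fetches every page: one text hit plus its graphics per page.
    return pages * (1 + graphics_per_page)

# The example site: 10 pages with 5 graphics each -> 60 hits per full crawl.
assert crawl_hits(10, 5) == 60
# The same content re-arranged over 20 pages doubles the hits to 120.
assert crawl_hits(20, 5) == 120
# If all ~200 search engines crawl the 20-page site, that is 24 000 hits
# with not one human visitor among them.
assert 200 * crawl_hits(20, 5) == 24000
```

All of which lands on the hit counter looking exactly like genuine readers.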

User sessions – a much more accurate way of tracking usage.
Most reasonable web statistics software records not only hits but user sessions. A timeout is set, perhaps 15 minutes (which is what we use), and a person entering a site and remaining active (not pausing for longer than 15 minutes) counts as one user session. This means that Joe Bloggs entering a site at 8:15 on a Monday morning and browsing through 30 or 40 pages (perhaps 150 to 200 hits) without a break of longer than 15 minutes would count as 1 user session. Likewise, a search engine doing the same thing would also record only 1 user session, even if it spent 6 hours perusing the site. The search engine dilemma mentioned above is thus not likely to distort user session statistics enough to matter.
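The timeout rule can be sketched like this. It is a toy version of what the statistics packages do (times here are minutes since midnight for one visitor; real software works from full log timestamps and identifies visitors by address):

```python
TIMEOUT = 15  # minutes of inactivity before a new session starts

def count_sessions(hit_times):
    """Count sessions in one visitor's sorted list of hit times (minutes)."""
    if not hit_times:
        return 0
    sessions = 1
    for prev, cur in zip(hit_times, hit_times[1:]):
        if cur - prev > TIMEOUT:  # a pause longer than the timeout
            sessions += 1         # starts a fresh session
    return sessions

# Joe Bloggs browsing steadily from 8:15 (495 minutes) with no long pause:
joe = [495, 496, 498, 505, 512, 520]
assert count_sessions(joe) == 1   # 200 hits or 20, still one session
# A visitor who returns after a 40-minute break counts as two sessions:
assert count_sessions([495, 500, 540]) == 2
```

A six-hour search engine crawl with no pauses falls out of the same rule as a single session, which is exactly why sessions resist the crawler distortion.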

If this is taken a step further and unique user sessions are measured, an even more accurate picture of web site usage appears. Unique user sessions count each different person that browses a website only once, so that even if a person spends more than 15 minutes away from the site, they are still measured as one user.

The way forward
The measurement of web site visitors is likely to remain an inexact science until research companies step into the arena and turn web site statistics into a business. At that point standards can be agreed and webmasters will be forced to report usage in a standard manner. Until then, we will be left attempting to make sense of disparate reporting that depends only on the whim of the person doing the reporting.