I have about 100 gigs of web log files from a wiki site I created when I first started working at my company. A small project at first, it grew in popularity and was opened up to everyone in the company. I'd like to run some statistics on the most popular pages and on which divisions access what and when.

Aside from the commercial applications, what are some of the better free utilities for analyzing large log files? I've looked at awstats, Webalizer, and analog. They all look good, but I'd like some opinions from those of you who run websites and look at trends.

The IT folks who manage and maintain the site now are going to start using Google Analytics, which looks like a great solution, but I'd still like to be able to take a look at the traffic over the past 3 years.
11/2/2008 2:24:06 PM
take your pick of the numerous scripting languages and have at it
11/2/2008 6:39:54 PM
you know, my first thought when I saw this thread was "perl," but I had a feeling it'd be received poorly. Now that someone has broken the ice, I am going to second ^
11/2/2008 6:45:34 PM
^ My first thought also.
11/2/2008 7:04:36 PM
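Since a couple of posters are suggesting the scripting route, here is a minimal Perl sketch of that approach: it streams through log files and tallies the most-requested paths. It assumes Apache common/combined log format; the script name, the GET/POST-only regex, and the top-20 cutoff are all arbitrary choices, so adjust them to match what the wiki actually writes.

#!/usr/bin/perl
use strict;
use warnings;

my %hits;
while (my $line = <>) {
    # Common Log Format: host ident user [date] "METHOD path HTTP/1.x" status bytes
    # Capture the request path from the quoted request field.
    if ($line =~ m{"(?:GET|POST)\s+(\S+)}) {
        $hits{$1}++;
    }
}

# Print the 20 most-requested paths with their hit counts.
my @paths = sort { $hits{$b} <=> $hits{$a} } keys %hits;
for my $path (@paths[0 .. ($#paths < 19 ? $#paths : 19)]) {
    printf "%8d  %s\n", $hits{$path}, $path;
}

Usage would be something like: zcat access_log*.gz | perl tophits.pl. On 100 gigs it will take a while, but since it streams the input, memory use scales with the number of unique paths rather than with the size of the logs.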
i've used awstats, http://www.haveamint.com/ and one other, but i can't remember what it was, which is funny, 'cause it's the one we ended up actually deploying

oh yeah, it was SmarterStats

[Edited on November 2, 2008 at 7:05 PM. Reason : remembered]
11/2/2008 7:04:49 PM
awstats is what I've used in the past. No problems with it.
11/3/2008 12:38:44 AM
Urchin is good - they were bought by Google a few years ago, but it looks like you can download a demo version for free:

http://www.google.com/urchin/download.html
11/3/2008 10:53:18 AM