I put together a little component recently that was designed to give basic stats on the number of times a particular page was “hit”. Each time it rendered, it wrote a record away to a list, and I then tallied that up via a lookup in another list. It was simple, it met the requirements, and everyone was pretty happy. A few days into testing we suddenly found that the number of hits for all pages seemed to be growing on their own. “What is going on here?” I thought.
Then it hit me: the SharePoint indexing service was hitting those pages, and each time it did, it pumped up the number of hits. This was one of those things that had somehow become buried in the back of my mind. Digging up some old SP2003 code, I pulled out the bit that checked whether the requesting party was the SharePoint indexer (using the UserAgent), and if so, ensured the web part did nothing at all.
We have now incorporated this little feature into the zevenseas Web Part Base class (I plan to talk about this more very soon). However, it strikes me that it’s a little bit of code worth thinking about for all your web parts, as there is often not much value in rendering to the indexer. In fact, it could even be having a negative impact on crawl performance.
private static bool IsIndexer()
{
    HttpContext context = HttpContext.Current;
    if (context != null && context.Request.UserAgent != null)
    {
        // The SharePoint crawler identifies itself with this UserAgent string
        return context.Request.UserAgent.Contains("MS Search 5.0 Robot");
    }
    return false;
}
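As a rough sketch, here is how a web part might wire the check into its Render override. The LogHit method is purely illustrative (it stands in for whatever writes the hit record to your list); this is not the actual zevenseas base class code:

```csharp
protected override void Render(HtmlTextWriter writer)
{
    // When the crawler is the requesting party, skip both the hit
    // logging and the rendered output entirely.
    if (IsIndexer())
    {
        return;
    }

    LogHit();            // hypothetical: writes a hit record to the stats list
    base.Render(writer);
}
```

Because the check sits at the very top of Render, the indexer gets an empty response and never touches the stats list, while ordinary browser requests are counted and rendered as before.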