[O365] SharePoint Content Search webpart is not being indexed

Thinking of a title for this post, I wanted to start with “Inconvenient” because that pretty much describes the essence. Inconveniently enough, that trademark has already been claimed by an MVP you might know if you’ve ever searched for anything SharePoint related (shout out to Waldek!). Anyway, as inconvenient as this all is, let’s get started.

For a project we had a requirement stating that a lot of content had to be put together on a page dynamically. These were reusable blocks of content which should be shown on multiple pages depending on certain properties. Your first thought might go to SharePoint’s reusable content feature, which would work for this purpose. But the number of combinations would have required us to create a large number of pages, which in turn would have become unmanageable.

So I started to come up with alternatives and looked to a good friend of mine: search. If I could tag the content blocks with terms, I could easily retrieve the right blocks and display them on a page using the Content Search web part. Good! To make it even more dynamic, I decided to use the catalog feature to get a completely dynamic structure built on a navigation term set. If you want more information about the catalog features, I highly recommend this blog series by Bella Engen.

I won’t go into the exact details so this post stays short; suffice it to say that in the end we managed to get it all working. Except for one thing: search. The requirement there was that the dynamically generated pages would be indexed and searchable by content. And although the pages were indexed already, the only way to get them to appear as a result was to search for a word from the title of the page.

As mentioned, the Content Search web part was used to show the reusable content blocks on the page. And it seemed as if search was just ignoring the contents of that web part. I found some pointers around server-side XSLT rendering which might have something to do with it, but that didn’t fix the problem. After a lot of searching and asking around, the conclusion was an inconvenient one.

The Content Search web part detects the search crawler and, when it does, skips rendering entirely.

In other words: as soon as the crawler hits the page, the webpart won’t appear. You can mimic this behaviour by sending a fake user agent string that includes “MS Search […] Robot”.

The most likely reasoning behind this is that search crawls the environment having full read permissions. The web part would show all kinds of results on the page, which a less privileged user might not be allowed to see. But because the user has access to our dynamic page, the content might be rendered in the search results. Hmmm… makes sense unfortunately.

Is there a way around this?

No. There isn’t. At least not in Office 365. There is no override or similar escape hatch. When you take a look at the code responsible, you’ll find the following:

// True when "MS Search" appears in the user agent string before "Robot"
int num = userAgent.IndexOf("MS Search", System.StringComparison.OrdinalIgnoreCase);
if (num >= 0 && num < userAgent.IndexOf("Robot", System.StringComparison.OrdinalIgnoreCase))
{
    result = true;
}
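The decompiled check above boils down to a simple substring test: the user agent must contain “MS Search” somewhere before “Robot”, case-insensitively. A minimal Python sketch of that same logic (the function name and the sample user agent strings are illustrative, not SharePoint’s actual values):

```python
def is_sharepoint_crawler(user_agent: str) -> bool:
    """Mirror of the decompiled check: returns True only when
    'MS Search' occurs before 'Robot' (case-insensitive)."""
    ua = user_agent.lower()
    ms = ua.find("ms search")
    # If "robot" is absent, find() returns -1 and the comparison fails,
    # matching the C# IndexOf behaviour.
    return ms >= 0 and ms < ua.find("robot")

# A crawler-style user agent string trips the check:
print(is_sharepoint_crawler("Mozilla/4.0 (compatible; MS Search 6.0 Robot)"))  # True
# A normal browser does not:
print(is_sharepoint_crawler("Mozilla/5.0 (Windows NT 10.0)"))  # False
```

This is also how you can mimic the behaviour yourself: send a request with a fake user agent string matching the pattern and watch the web part disappear from the rendered page.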

And this method is called by the RenderWebPart override:

protected override void RenderWebPart(HtmlTextWriter output)
{
    using (new SearchUXMonitoredScope(this, "ResultScriptWebPart::RenderWebPart"))
    {
        if (!this.RenderOnServer && !base.IsSharePointCrawler())
        {
            base.RenderWebPart(output);
        }
    }
}

When you’re running on-premises, you might be able to create a custom web part based on the out-of-the-box Content Search web part, override the RenderWebPart method, and strip the call to “IsSharePointCrawler”. That should make the web part render regardless of the user agent string. But… be aware of the security implications above!
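For the on-premises scenario, such a derived web part could look roughly like the sketch below. This is untested and makes a big assumption: that `IsSharePointCrawler` is overridable in your SharePoint version. If it isn’t, you’d have to replicate the base rendering logic yourself instead of simply overriding the check. The class name is made up; `ResultScriptWebPart` is the real OOTB class behind the Content Search web part.

```csharp
using Microsoft.Office.Server.Search.WebControls;

// Hypothetical sketch: neutralize the crawler detection so the web part
// renders its results for the search crawler as well. Remember the caveat:
// the crawler runs with full read permissions, so content surfaced this way
// may leak into search results for users who shouldn't see it.
public class CrawlableContentSearchWebPart : ResultScriptWebPart
{
    // Assumption: IsSharePointCrawler() is virtual/overridable in your build.
    protected override bool IsSharePointCrawler()
    {
        return false; // never treat the request as coming from the crawler
    }
}
```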

A lot of credit goes to Evariste on StackExchange, who provided an excellent answer to this question of mine. I took the liberty of documenting it in a blog post, since there seems to be little information available on this topic.

For those interested in how the initial requirements were solved: we ended up dynamically generating static publishing pages based on the snippets and tags. Works equally well and plays a lot nicer with search and refiners. It’s a bit heavier on the maintenance side, but still an ok solution.
