Setting values for the robots meta tag

Hi all,

Just adding a summary first and a question below.

The robots meta tag is generated by Litium, and the default seems to produce a tag like this:

<meta name="robots" content="index,follow" />

If other values are needed (e.g. to stop crawlers from indexing/following pages used for email signup confirmations, etc.), a SQL script is required, as these values can't be set from the page type definition (according to support ticket ZAP-189-43978, answered by Patric Forsgard).

The SQL script should modify the SearchEngineIndex and SearchEngineFollow fields in both the dbo.CMS_Page and dbo.CMS_WorkingCopy tables for the pages that are related to the affected page types.
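For reference, a minimal sketch of such a script (T-SQL) is below. The table names and the SearchEngineIndex/SearchEngineFollow columns come from the support answer above, but the PageTypeID column, the bit values and the @PageTypeID placeholder are assumptions; verify them against your own schema (if CMS_WorkingCopy identifies the page type differently, join via dbo.CMS_Page instead) and take a backup before running anything.

-- Sketch only: PageTypeID and the values below are assumptions, verify against your schema.
DECLARE @PageTypeID UNIQUEIDENTIFIER = '00000000-0000-0000-0000-000000000000'; -- replace with the affected page type

-- Stop crawlers from indexing/following published pages of the affected page type.
UPDATE p
SET p.SearchEngineIndex = 0,
    p.SearchEngineFollow = 0
FROM dbo.CMS_Page p
WHERE p.PageTypeID = @PageTypeID;

-- Keep the working copies in sync so Backoffice shows the same values.
UPDATE w
SET w.SearchEngineIndex = 0,
    w.SearchEngineFollow = 0
FROM dbo.CMS_WorkingCopy w
WHERE w.PageTypeID = @PageTypeID;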

Question:
When adding more pages of the affected types, what is the best approach for setting the correct values for these fields? A Litium task that runs regularly and resets the correct values for all pages of these page types?

Litium version: 6.x

Regards
Magnus

Not sure I understand the question here, but the rendering of the robots tag can be edited in the accelerator's Head.cshtml file. Can you not just modify that so that it always renders specific page types the way you like?
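Something along these lines, perhaps (just a sketch; Model.PageTypeName, Model.RobotsContent and the page type name are placeholders for whatever your accelerator's view model actually exposes):

@* Sketch only: the model properties and the page type name are placeholders. *@
@if (Model.PageTypeName == "EmailSignupConfirmation")
{
    @* Always hide confirmation pages from crawlers, regardless of the db fields. *@
    <meta name="robots" content="noindex,nofollow" />
}
else
{
    @* Standard rendering based on the page's SearchEngineIndex/SearchEngineFollow values. *@
    <meta name="robots" content="@Model.RobotsContent" />
}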

Sounds like an alternative approach, but wouldn't that bypass the existing function of the db fields?

Yes, but updating the values with a scheduled job might be more confusing since that would change the values that the users have set on the page.

Totally overriding this function in cshtml for some page types seems almost as confusing, IMHO.

I corrected my first post, as users can see and change these settings in Backoffice for page instances (but not for the page type).

If you want to set it on the page, you could listen to the page published event and modify the value there. You could also add a new field that overrides the standard robots values and give that field a default value.

The new field's description should clearly state that it overrides any other robots value whenever a value is specified.
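Roughly how that precedence could look in Head.cshtml (a sketch only; RobotsOverride and RobotsContent are made-up property names, map them to however you expose the new field and the standard values on your view model):

@* Sketch only: RobotsOverride and RobotsContent are made-up property names. *@
@if (!string.IsNullOrEmpty(Model.RobotsOverride))
{
    @* The override field wins whenever the editor has specified a value. *@
    <meta name="robots" content="@Model.RobotsOverride" />
}
else
{
    @* Otherwise fall back to the standard SearchEngineIndex/SearchEngineFollow rendering. *@
    <meta name="robots" content="@Model.RobotsContent" />
}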