> if that would be true: why bother with meta tags?

Well, I would think it's because you can have insights into what's relevant about a page that a search engine couldn't. As others have pointed out, though, it seems the search engines are unlikely to trust you anyway.

If you did want to go ahead though, this should be a good start:

tag 'meta_keywords' do |tag|
    # Maximum number of keywords; can be overridden with a length="n" attribute
    length = (tag.attr['length'] || 15).to_i
    # Words too common or irrelevant to be worth emitting as keywords
    blacklisted = %w[words that are too common or not relevant]
    min_word_size = 3

    # Strip HTML tags from the body part, split on non-letters, drop duplicates
    words_from_body = tag.locals.page.part("body").content.gsub(/<.*?>/, "").split(/[^a-zA-Z]/).uniq
    # Drop short words, then remove anything blacklisted
    allowed_words = words_from_body.delete_if { |word| word.size <= min_word_size } - blacklisted
    keywords = allowed_words.first(length)
    %{<meta name='keywords' content='#{keywords.join(' ')}' />}
end
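
In a layout you'd then call it with something like <r:meta_keywords length="10" /> in the head, assuming the tag is defined somewhere Radiant loads custom tags from (e.g. an extension module that includes Radiant::Taggable).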

Hope it's easy enough to follow. A description tag could be done similarly, but the rules for choosing which substring to use are likely to be site-specific.
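
If it helps, here's a minimal sketch of what a description tag along the same lines might look like. The 150-character default, the attribute name and the simple truncation rule are just my assumptions; swap in whatever selection logic fits your site:

tag 'meta_description' do |tag|
    # Assumed default of ~150 characters; override with a length="n" attribute
    length = (tag.attr['length'] || 150).to_i
    # Strip HTML tags and collapse whitespace, then take the leading substring
    text = tag.locals.page.part("body").content.gsub(/<.*?>/, "").gsub(/\s+/, " ").strip
    %{<meta name='description' content='#{text[0, length]}' />}
end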


-Arthur
