My thoughts/hopes are that the actual search engine algorithm is smart enough to weigh only the text that's actually visible on page load, and to ignore (or at least down-weight) hidden text. There's so much JavaScript on websites these days moving things around and hiding/showing content that it would make sense for the best algorithms to replicate the real user experience as closely as possible. The engine will still see the duplicate text, which isn't ideal, but probably/hopefully it has less effect than spamming the same text visibly on the page.

SEO report tools are always going to be simpler than the actual algorithm: just analysing the raw source code is the easier way for them to generate reports. At the end of the day, though, with variants in Xara it's just one of those things you have to live with.
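To illustrate the gap between the two, here's a minimal sketch of the difference between reading the raw source (what a simple report tool does) and discounting hidden elements (a crude stand-in for what a rendering crawler might do). This is purely illustrative: it uses BeautifulSoup, only catches inline `display:none`/`visibility:hidden` styles and the `hidden` attribute, and the sample HTML is made up. Real pages hide content via stylesheets and JavaScript, which would need a headless browser to evaluate, and no actual search engine's logic is being claimed here.

```python
from bs4 import BeautifulSoup

HTML = """
<div>Visible headline text.</div>
<div style="display:none">Duplicate text hidden in a variant.</div>
<div hidden>More hidden text.</div>
"""

def raw_text(html: str) -> str:
    # What a simple SEO report tool sees: every text node in the source.
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

def is_hidden(tag) -> bool:
    # Crude check: inline CSS hiding or the HTML `hidden` attribute.
    style = (tag.get("style") or "").replace(" ", "").lower()
    return ("display:none" in style
            or "visibility:hidden" in style
            or tag.has_attr("hidden"))

def visible_text(html: str) -> str:
    # Rough approximation of "visible on page load": drop hidden elements
    # before extracting text.
    soup = BeautifulSoup(html, "html.parser")
    for el in soup.find_all(is_hidden):
        if not el.decomposed:  # skip children already removed with a parent
            el.decompose()
    return soup.get_text(" ", strip=True)

print("raw:    ", raw_text(HTML))   # includes the duplicated hidden text
print("visible:", visible_text(HTML))  # only the on-load visible text
```

Running it shows the raw pass picking up all three blocks while the visible pass keeps only the first, which is roughly the discrepancy between what a report tool flags and what a rendering crawler might actually weigh.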