  1. #1
    Join Date
    May 2019
    Posts
    4

    Default Search Engine Optimisation

    I find that when a Xara website includes both a desktop version and a mobile version, the search engines read both versions of a page separately but group them together as one page, so it appears that all the text, headings, etc. occur twice on the same page. This can have a negative effect on SEO, as search engines could interpret it as duplicate text, spam or keyword stuffing. Is there a way to prevent this from happening? I'm currently using Xara Web Designer Premium 12.6.0.49270 CD x64 Mar 30 2017.
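    To make the problem concrete, here is a minimal Python sketch of what a tool that reads the raw page source sees if both variants end up in the same published HTML file. The markup and class names below are invented for illustration only; this is an assumption about the published output, not Xara's actual export format.

    Code:
    from collections import Counter
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    # Hypothetical combined output: both variants of the same content in one file.
    # The class names are made up for this example.
    html = """
    <div class="variant-desktop">
      <h1>Search Engine Optimisation</h1>
      <p>We design and build websites for small businesses.</p>
    </div>
    <div class="variant-mobile">
      <h1>Search Engine Optimisation</h1>
      <p>We design and build websites for small businesses.</p>
    </div>
    """

    soup = BeautifulSoup(html, "html.parser")
    counts = Counter(tag.get_text(strip=True) for tag in soup.find_all(["h1", "h2", "p"]))
    for text, n in counts.items():
        if n > 1:
            print(f"seen {n} times: {text!r}")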

  2. #2

    Default Re: Search Engine Optimisation

    If you're referring to Google's URL Inspection tool, I've found that when I run the inspection it sometimes returns a "not mobile friendly" result. If you then view the inspected page, you see the desktop version with the mobile version rendered on top of it. However, if I run the test a day later, it works properly. In fact, that first display (the mobile version on top of the desktop version) was exactly what I saw in Google Chrome when displaying a just-published web page, until someone suggested that you first need to clear the Chrome cache.

  3. #3
    Join Date
    May 2019
    Posts
    4

    Default Re: Search Engine Optimisation

    I use an SEO program called IBP (iBusinessPromoter), and when it analyses the text on a page I can see that it reads all the wording twice, so any advice it offers about increasing or decreasing the number of keywords in the text to improve the page ranking will be inaccurate. I thought it might be a glitch in the IBP program, but I've also tried running my webpage URLs through online search engine spider simulators, which show how the engines "see" a web page, and these also show the engines reading the text twice.
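    For anyone who wants to reproduce what those spider simulators report, a rough sketch along these lines (assuming the requests and beautifulsoup4 libraries, with a placeholder URL) fetches the raw HTML without running any JavaScript, which is how most simple SEO tools read a page, and counts how often each heading appears:

    Code:
    from collections import Counter
    import requests                   # pip install requests
    from bs4 import BeautifulSoup     # pip install beautifulsoup4

    url = "https://www.example.com/"  # placeholder: put your own page URL here
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Drop scripts and styles so only the marked-up text is counted,
    # roughly the way a simple text-only spider reads a page.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()

    headings = Counter(h.get_text(strip=True) for h in soup.find_all(["h1", "h2", "h3"]))
    duplicates = {text: n for text, n in headings.items() if text and n > 1}
    print("headings that appear more than once:", duplicates or "none")

    If the desktop and mobile variants share the same wording, the headings should show up at least twice in the output.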

  4. #4
    Join Date
    Apr 2010
    Location
    Kildare, Ireland
    Posts
    906

    Default Re: Search Engine Optimisation

    My thought (and hope) is that the actual search engine algorithm is smart enough to weight only the text that is actually visible on page load and to ignore, or at least give less weight to, the hidden text. There's so much JavaScript on websites these days moving things around and hiding/showing content that it would make sense for the best algorithm to replicate the actual user experience as closely as possible. It will see the duplicate text, and that's not ideal, but it probably (hopefully) has less effect than spamming the same text visibly on the page. SEO report tools are always going to be simpler than the actual algorithm; analysing just the raw source code makes it easier for them to generate reports. At the end of the day, with variants in Xara it's just one of those things you have to live with.
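    As a rough illustration of that gap between the raw source and what a visitor actually sees, the sketch below (again assuming requests and beautifulsoup4, with a placeholder URL) compares the word count of the full source with the word count after stripping elements hidden by inline styles. It is only a crude approximation: a real engine also applies stylesheets and renders JavaScript, which this does not.

    Code:
    import requests                   # pip install requests
    from bs4 import BeautifulSoup     # pip install beautifulsoup4

    def word_count(soup):
        return len(soup.get_text(" ", strip=True).split())

    url = "https://www.example.com/"  # placeholder: put your own page URL here
    html = requests.get(url, timeout=10).text

    # Pass 1: everything in the raw source (roughly what a simple SEO tool reads).
    raw = BeautifulSoup(html, "html.parser")
    for tag in raw(["script", "style", "noscript"]):
        tag.decompose()

    # Pass 2: the same page with inline-hidden elements removed.
    visible = BeautifulSoup(html, "html.parser")
    for tag in visible(["script", "style", "noscript"]):
        tag.decompose()
    for tag in visible.find_all(style=True):
        style = tag["style"].replace(" ", "").lower()
        if "display:none" in style or "visibility:hidden" in style:
            tag.extract()

    print("words in raw source:", word_count(raw))
    print("words not hidden by inline styles:", word_count(visible))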
    XT-CMS - a self-hosted CMS for Xara Designers - Xara + CMS Demo with blog & ecommerce shopping cart system.

 

 
