October 15, 2018 at 7:05 am #1021428
Hi
We used the Enfold Construction template – https://kriesi.at/themes/enfold-construction/
to create the website – https://farquharprojects.com.au
However, the Enfold theme template failed the Google Mobile Friendly Test:
– Clickable elements too close together
– Content wider than screen
Results link –
https://search.google.com/test/mobile-friendly?utm_source=mft&utm_medium=redirect&utm_campaign=mft-redirect&id=c1nCeMV6a69e_5uqnub-Zw
Any ideas how to rectify this?
Thanks, Dave
October 15, 2018 at 7:30 am #1021437
Hey Dave,
Which elements are too close together, and which content is wider than the screen on your actual site?
Best regards,
Rikard
October 15, 2018 at 11:00 am #1021494
Here is the results link; check it out, it will show you –
https://search.google.com/test/mobile-friendly?utm_source=mft&utm_medium=redirect&utm_campaign=mft-redirect&id=c1nCeMV6a69e_5uqnub-Zw
October 15, 2018 at 12:06 pm #1021529
Hi,
Thanks, but your actual site doesn’t look the way the Google test displays it. If you need to make changes, then we need to know what should change.
Best regards,
Rikard
October 16, 2018 at 4:31 am #1021937
The two things are:
– Clickable elements too close together
– Content wider than screen
I need to fix these… ignoring these two things will not fix what Google sees. And that’s the main focus of SEO.
So if someone can offer support that will fix the problem, I’d appreciate it.
cheers
Dave
October 16, 2018 at 6:44 am #1021999
Hi Dave,
I understand that, but where can we see the problems on your actual site? We can’t fix a problem we can’t see unfortunately.
Best regards,
Rikard
October 30, 2018 at 5:21 pm #1028170
If I may intrude, the issue is that Google thinks there is a problem with Enfold sites. I get the same errors on most of my Enfold sites, which means that savvy clients also get the same errors and think that we’ve built them an inferior website. You can test an Enfold site yourself here: https://www.google.com/webmasters/tools/mobile-friendly/
I don’t believe users are experiencing the problem that Google reports. Enfold offers a great user experience, but can you please help us resolve or explain why Google thinks Enfold sites aren’t mobile friendly? I can’t tell clients I design a great mobile experience if Google doesn’t agree.
November 1, 2018 at 6:16 am #1028707
Hi bluesbrush,
I see your point, but we have no idea why the Google test is giving those suggestions unfortunately. I can’t even reproduce the results the original poster got when I try with the construction demo: https://search.google.com/test/mobile-friendly?utm_source=mft&utm_medium=redirect&utm_campaign=mft-redirect&id=ozVibM2dIscIaUqr512Rjg
Best regards,
Rikard
December 19, 2018 at 6:03 am #1047107
Hi Kriesi,
I have recently become aware of this.
Google says: “Your site is not mobile friendly” and displays the messages that the OP listed, BUT I do not believe it. My site IS mobile friendly. Clickable elements are NOT too close and content is NOT larger than the screen. I do not know if there is an actual “unfriendly” issue happening. At this point I do not believe there is.
Here is my website.
Here is the link to the Google Mobile-Friendly Test result.
I’ve attached a screenshot of my page listed in Google and my website login details for Kriesi.
Thanks!
Best,
Dameon
December 19, 2018 at 12:17 pm #1047145
@dameonjamie … I’d say your text and the clickable area of the links in the top region (above the logo and navigation) and in the footer socket are definitely smaller than what Google calls mobile friendly.
Edit: Looking at your test result, it is obvious Google couldn’t load the CSS needed to render the page. That might be a temporary problem … but since Google often doesn’t load the whole page when testing it for mobile, you should maybe try to prioritize the contents of the page better (if that test result from Google is important to you).
December 19, 2018 at 5:20 pm #1047257
I had the same issue a couple of weeks ago with two of my sites. It IS NOT an issue with Enfold. It IS an issue with Googlebot and how it is reaching certain Enfold components, including certain folders used for CSS styling. Not sure if you have a robots.txt file in your directory structure, but it would help to have one. robots.txt should include the following lines and be put in the root directory. Once there, go into Google Search Console for your site and resubmit the site to clear the errors. This is Google’s screwup.
robots.txt contents
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Crawl-delay: 10

User-agent: Googlebot
Allow: /*.js*
Allow: /*.css*
Allow: /wp-content/*.js*
Allow: /wp-content/*.css*
Allow: /wp-includes/*.js*
Allow: /wp-includes/*.css*
Allow: /wp-content/plugins/*.css*
Allow: /wp-content/plugins/*.js*
Allow: /wp-content/plugins/*ajax*
Allow: /wp-content/themes/*.css*
Allow: /wp-content/themes/*.js*
Allow: /wp-content/themes/*ajax*
Allow: /wp-content/themes/*.woff*
Allow: /wp-content/themes/*.ttf*
Allow: /wp-content/plugins/*/images/
Allow: /wp-content/uploads/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /?s=*

Sitemap: https://yourdomain.com/sitemap.xml
In addition, it may be wise to use the Redirection plugin and set up permanent redirects for any obviously strange 404s showing up that reference Enfold folders. They can just be redirected to the site home URL.
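A quick way to sanity-check how such directives are interpreted is Python’s built-in robots.txt parser; the sketch below uses a placeholder domain and a simplified rule set, and the standard-library parser follows the original robots.txt spec, so Google-style wildcard rules (e.g. Allow: /*.css*) are best verified with a Google-compatible tester such as the one in Search Console.

# Rough sketch, assuming a placeholder domain and simplified rules.
# Python's parser does not implement Googlebot's wildcard matching,
# so results for rules like "Allow: /*.css*" may differ from Google's.
from urllib.robotparser import RobotFileParser

SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

test_urls = [
    "https://example.com/",                           # a normal content page
    "https://example.com/wp-admin/options.php",       # an admin page
    "https://example.com/wp-includes/css/admin.css",  # a core CSS file
]
for url in test_urls:
    verdict = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(verdict, url)

The last URL comes back blocked under these simplified rules, which is exactly the kind of CSS blocking the Allow lines in the robots.txt above are meant to prevent.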
December 19, 2018 at 5:32 pm #1047261
@crtdude: That’s not an advisable approach you are taking there.
Two reasons:
First, the robots.txt is openly available and you disclose all admin folders of your content management system. Security-wise, that’s not a good idea.
Secondly, it’s a common misunderstanding how a robots.txt directive works for Google. A “Disallow” does not prevent Google from indexing a page … it disallows Google from crawling that page. A big difference.
December 19, 2018 at 5:54 pm #1047280
Yes, the robots file is visible, however it DOES NOT contain any items specific to the theme, therefore it is NOT a security issue. I don’t know where you came up with that, but our procedure was developed by an SEO & security team. Furthermore, any developer familiar with WP sites already knows the reported base folder structure.
You are correct in stating that “a ‘Disallow’ does not prevent Google from indexing a page”; however, that is precisely why I suggested the redirections be employed, to gain the indexing pass.
This method worked cleanly in fixing the reported errors on our sites without one glitch.
December 19, 2018 at 6:08 pm #1047296
Of course it’s a security issue when you disclose your admin-folders.
I’m not talking about “any developer familiar with WP” … they aren’t attacking your site in most cases … automated bots are the threat, and you are telling them what system you run and where to look for possible exploits.
Redirecting 404s is another bad idea you came up with … a 404 is a 404 and should stay one, as long as it’s not caused by a changed URL or similar.
Here are some links … maybe you should also send them to your so-called “SEO & Security team” … lol
—–
robots.txt possible security issue:
=> https://www.synopsys.com/blogs/software-security/robots-txt/
=> https://www.willmaster.com/library/security/one-way-robots.txt-can-be-a-security-risk.php
“The robots.txt file can be a security risk if this one thing is present: A disallow line to a directory containing sensitive information.”
—–
robots.txt – Yoast team about indexing
=> https://yoast.com/prevent-site-being-indexed/
“We’ve said it in 2009, and we’ll say it again: it keeps amazing us that there are still people using just a robots.txt files to prevent indexing of their site in Google or Bing.”
—–
Edit: Last but not least … all of your statements are totally off-topic and not related to this thread about ‘Google Mobile Friendly Test’ … or should someone disallow his homepage URL, like you recommend, because Google says it’s not mobile friendly?
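To make the crawl-versus-index distinction concrete: a robots.txt Disallow only stops compliant bots from fetching a URL, while keeping a page out of the index requires a noindex signal, either a robots meta tag or an X-Robots-Tag response header. A rough Python sketch of checking a page for those two signals, using a placeholder URL:

# Rough sketch with a placeholder URL; looks for the two common noindex signals.
import re
import urllib.request

url = "https://example.com/"
with urllib.request.urlopen(url) as resp:
    x_robots = resp.headers.get("X-Robots-Tag")
    charset = resp.headers.get_content_charset() or "utf-8"
    html = resp.read().decode(charset, "replace")

# naive match for <meta name="robots" content="...">
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
                 html, re.IGNORECASE)

print("X-Robots-Tag header:", x_robots or "none")
print("robots meta tag:", meta.group(1) if meta else "none")
# Only a "noindex" in one of these keeps the page out of the index;
# a robots.txt Disallow alone does not (Google may still index the bare URL).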
December 19, 2018 at 6:27 pm #1047308
Conversation ended. You have no clue what I am talking about with the off-topic comment. This is precisely related to the ‘Google Mobile Friendly Test’. And that is how we permanently rectified it.
December 19, 2018 at 6:33 pm #1047310Just look at the URLs the people here are having problems with in the Mobile Test … those are not WordPress-Admin-Pages or Theme-URLs … those are their normal home-urls and content-pages …
They can’t disallow them in the robots.txt … don’t be that ignorant.December 19, 2018 at 7:19 pm #1047332“don’t be that ignorant”? – seriously? You have no clue what you are talking about here. Kindly move on. If you know anything about this specific issue and have experienced the same yourself, then offer a specific solution. Knowledge-less rambling does no one any good.
We had exactly the same thing occur. The specific reason is that the Googlebot had issues with the CSS files when it grabbed the content. What we did fixed all the issues, and we then directed the bot to rescan once the issues (not actually issues) were fixed. If you bothered to look at the screenshot of the rendered page, you might figure that out.
December 20, 2018 at 11:11 am #1047634
Sorry to say that … but … you are wrong again … :-)
You say that your robots.txt solved the problem for the Googlebot with the CSS files.
But … you have multiple lines in your robots.txt to allow Google to crawl any CSS file … so your robots.txt changes nothing about how the bot handles CSS … because that is the default behaviour without a robots.txt …
It’s totally possible that the reindexing solved your problem (and may help others with similar notices from Google).
But your robots.txt has nothing to do with that.
For a normal home URL or a URL of a content page, your robots.txt has no effect at all …
—–
Edit: If you still disagree … please explain what your robots.txt does in this example case:
Let’s take the Enfold 2017 demo and assume your robots.txt is located in the webroot and active for crawling bots.
=> https://kriesi.at/themes/enfold-2017/
What would your robots.txt change for that page with that URL?
(spoiler: nothing)
December 20, 2018 at 11:26 am #1047640
You can believe what you want – the complete solution I suggested to the user fixed the problem, permanently. It forces Googlebot to read the robots file and do what we tell it to do – whether or not we allow or disallow anything is of no consequence. Their default run missed certain objects and caused Google to think there was something awry, throwing the site into an SEO warning state.
Like I said before, if you knew what you were talking about, you would have suggested a fix for the gentleman that actually worked.
December 20, 2018 at 12:00 pm #1047653
Ok … I expected that … you just make assumptions and can’t answer the question regarding the effect of your robots.txt …
Edit: Everyone reading our discussion will easily recognize which of us two is the one “knowing nothing” … I really don’t know why you are taking it personally …
December 20, 2018 at 4:55 pm #1047786
I don’t like how this thread evolved because of the destructive comments from crtdude.
So I dug a little deeper into this to find an answer to the question of why and when those messages from the Google Search Console appear.
I found one of the sites I have full access to with those notifications from GSC.
I filtered the server access logs to see when and what Google bots crawled on that domain and compared it to the time when Google sent an email about that reputed “error”. That only works roughly, since Google’s reaction isn’t instantly correlated with the crawling.
Google uses several stages to crawl and index a site, and only sometimes does it fetch the whole page with all components.
To keep it simple, let’s assume two steps: most of the time Google does a simple “lightweight” crawl, and sometimes a “full” crawl.
So … this is what I think sounds plausible:
—–
# How does it happen?
1. Google visits the site/URL and does an initial full crawl. At that time the bot also fetches files like images and CSS — including the merged CSS file “a-wild-string.css”.
2. Then something is changed in the CSS of that page, a new merged CSS file “a-wild-string-2.css” is created, and the old one is deleted (by the theme options or by a caching plugin, whatever).
3. The mobile-test bot from Google comes over … it is a different bot and uses the cached/hashed data from the crawling bot … and tries to use the old CSS file “a-wild-string.css” to render the page without recognizing that the new one should be used … so the page is tested with half or no CSS in use … FAILURE! it (the bot) screams …
That’s how I think this happens.
—–
# What to do?
The easiest way: just wait two, three, four weeks … at some point the mobile bot from Google will use the new CSS data …
More options:
– don’t delete old CSS files
– let Google reindex the site
– use the options in Google Search Console to manually push the testing of each URL
=> https://support.google.com/webmasters/answer/9063469?hl=en#validation
If you use the last option and there are still errors while testing the live version of the site … only then are there actual problems with your page.
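One way to check the theory above against a live page is to confirm that every stylesheet it currently references still resolves; a stale merged CSS filename would show up here as a 404. A rough sketch with a placeholder URL and naive regex-based extraction:

# Rough sketch: print the HTTP status of every stylesheet a page references,
# so a renamed or deleted merged CSS file shows up as a 404. Placeholder URL.
import re
import urllib.error
import urllib.request
from urllib.parse import urljoin

page_url = "https://example.com/"
with urllib.request.urlopen(page_url) as resp:
    html = resp.read().decode("utf-8", "replace")

# naive extraction; assumes rel="stylesheet" appears before href in the markup
hrefs = re.findall(
    r'<link[^>]+rel=["\']stylesheet["\'][^>]*href=["\']([^"\']+)["\']', html, re.I)

for href in sorted(set(hrefs)):
    css_url = urljoin(page_url, href)
    try:
        status = urllib.request.urlopen(css_url).status
    except urllib.error.HTTPError as err:  # e.g. 404 for a deleted merged file
        status = err.code
    print(status, css_url)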
—–
Thanks for your attention. :-)
p.s.: I am not a native speaker of English; I hope the text above is understandable.
p.p.s.: don’t use robots.txt to solve indexing issues
—–
Edit: Obviously there are more possible reasons why such a rendering error from Google can occur … so it has to be examined case by case …
December 21, 2018 at 2:23 pm #1048184