Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
I developed a single-page responsive website for my company, http://germin8.com/. Everything is going well, however I now face a problem with SEO: the site's different sections do not show up in search engines.
I know the cause: being a single-page site, it is not crawler friendly. In order to get the URL to change, I used the history pushState technique and put href links on the menu bar items pointing to the sections... confused?? eh
Sample anchor tag (I thought this was enough for my section to show up in a search engine :-/)
<li><a style="text-decoration:none;color:black;padding-left:30px;" class="scrollTo" id="contactUs_Menu" href="/contact-us">CONTACT</a></li>
Or you may have a look at the source code of the website and follow the anchor tags.
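For reference, the menu click handling is roughly the following (a simplified sketch of the pushState approach I described; the scrollTo class and the /contact-us style of href come from the snippet above, while the section-id naming is illustrative):

// Sketch of the menu behaviour. Assumes each section's id matches the
// link's path ("/contact-us" -> id="contact-us").
document.querySelectorAll('a.scrollTo').forEach(function (link) {
  link.addEventListener('click', function (e) {
    e.preventDefault(); // stay on the single page

    // Update the address bar without a reload, e.g. to /contact-us.
    var path = link.getAttribute('href');
    history.pushState(null, '', path);

    // Scroll to the matching section.
    var section = document.getElementById(path.slice(1));
    if (section) section.scrollIntoView();
  });
});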
After some research and a POC, I came across Google's AJAX crawlable technique (https://developers.google.com/webmasters/ajax-crawling). However, I couldn't understand it, and I also feel that loading the site's sections through AJAX would be a lot more work at this stage, since my entire site is a static HTML file (index.php) with nothing rendered dynamically through JavaScript/AJAX.
Can someone who has faced a similar problem suggest the simplest and fastest way to get my site's different sections (e.g. Clients, Partners, Contact Us) to show up in Google?
Thanks in advance guys :)
Actually this question is more suitable for https://webmasters.stackexchange.com/, but since it has been raised here, I'll try to answer it to the best of my knowledge.
Unfortunately, there is no shortcut in SEO: getting search results to move in your favor is a slow and painful process. The basic principle of SEO is to do the simple things right and provide quality content to your users, rather than worrying too much about ranking.
That being said, your expectation is slightly unrealistic, for the following reasons:
You are asking Google to index a page that doesn't even exist.
The URL is changed with JavaScript at runtime, which is something no search engine bot is good at indexing.
However, there are a couple of things that you can improve in terms of SEO (no guarantee they will achieve exactly what you asked for):
Make sure you have a sitemap.xml file in the root directory of your website. You need to add an individual entry for each sub-page URL, like this:
<url>
  <loc>http://germin8.com/clients</loc>
  <lastmod>2005-01-01</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.8</priority>
</url>
Once you are done with the sitemap.xml file, open your Google Webmaster Tools account (and make sure your Google Analytics account is linked to your Webmaster profile) and validate the structure and schema of the sitemap file.
Write better anchor text: add a title attribute to your anchor tags, avoid inline styles as much as you can, and use complete URLs instead of relative paths in the href attribute.
Google doesn't like slow websites, so you need to focus heavily on the performance of your website; no user likes watching a webpage load for ages either. Make the effort to concatenate, minify, and lint your assets (HTML/CSS/JS). Gzip compression is required as well.
149 requests weighing 4.1 MB is huge. You need to massively reduce the number of HTTP requests you make! (A server-side sketch follows below.)
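To illustrate the gzip and caching point: the exact setup depends on your server (your site appears to be PHP, where Apache's mod_deflate and mod_expires do this job), but as a sketch only, on an assumed Node/Express server it would look like this:

// Illustrative sketch only: gzip plus static-asset caching on an
// assumed Node/Express server.
const express = require('express');
const compression = require('compression'); // gzip middleware

const app = express();
app.use(compression());           // gzip-compress every response
app.use(express.static('public', {
  maxAge: '30d'                   // let browsers cache static assets
}));

app.listen(3000);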
Conclusion
Apart from the above, I don't really see your internal links not being visible in search results as a big problem. Your primary objective is to make sure that your users land on your webpage (which you are already doing). Once a user enters your territory (the website), they are free to navigate to any section of the page.
http://webcache.googleusercontent.com/search?q=cache:http://germin8.com&client=firefox-a&rls=org.mozilla:en-US:official&strip=1
I don't see any problem with the indexing of your site. Clients will not show up in a normal search, but they will show up in Google Images. You should give each client image an alt attribute that best describes it. The URL above will give you an idea of how Googlebot sees your site: notice that all the text is indexed by Google, including the heading under which your clients are listed. Hope this resolves your concern.
Related
I am having trouble figuring out how to do even the simplest things in Alfresco, like typing a simple document. I've been Googling and noticed that customizations can be done through HTML documents. I need help and decided to post a question to a knowledgeable user platform. The following customizations I would like are WAY far-fetched and most likely not even achievable, but I would really appreciate any help that can be provided.
*list items in bold are most important
Anyone could be assigned a login, and when they logged in they would have access to, and could easily view, all of the contents of the site (or multiple sites that make up one accessible website?)
All of the items on the website would form a hierarchy. The user-facing content of the site would be a list of links with thumbnails; when a link was clicked, it would show another list of links with large thumbnails; when one of those links was clicked, a text document would be brought up; that document would contain clickable sections, and when one of those sections was clicked, a page containing only that section would open:
Links (crafts)
2nd layer of links (modules)
Text and image document with clickable links (single module containing clickable sections)
Section (single sections of module)
The module and section text would also contain images and tables throughout, mixed in with the text.
If a link (module or section) was used in multiple places, all instances of the link would be linked to each other: if one instance was edited, the others would also change. This setting could be turned off for any individual link if necessary.
Every document should have an easy-to-use live commenting system (something simple like Disqus would work). The comments are most important on the single-section pages but would also be good on the module pages.
An advanced tagging system that would be part of the entire site/website environment. A user could type anything they wanted as a tag and use multiple tags. The tags would be applied to their comments on the content (text, sections), but could be searched at any time across the whole environment (most importantly by the administrators of the site). The popularity of any tag could also be viewed (I'm not sure how that would work; possibly another section of the site, or an easy-to-see column on any text/image document?)
A user could edit their own comment if they wished but would not be able to delete it entirely. Comments would also be date- and time-stamped.
I know all of this is most likely impossible, but if anyone has an idea of Alfresco customizations that could pull any of this off, or of an entirely different secure platform or site that could do anything similar, please let me know.
Thank you!
It sounds like you are looking for a Web Content Management (WCM) System. Alfresco is a Document Management (DM) System. You can use Alfresco as a back-end for a custom content-centric solution, but if you are expecting to install it, start it up, and have anything close to what you've listed above, you are barking up the wrong tree.
Everything you've listed is a front-end concern. You can use whatever you want to develop that functionality, but none of it will leverage Alfresco unless you choose to store some of the data in the Alfresco back-end.
You might be better off looking at something in the WCM space, such as Drupal or WordPress. Or if you want something Java-based, look at Magnolia CMS or Hippo CMS.
Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I placed some text early in the HTML code for SEO reasons, but visually this text should be at the bottom of the page, and I can't achieve that with CSS alone.
My question: is it a problem for SEO if I move the text in the DOM with JavaScript? Not hidden or removed, just moved.
Thanks!
If you placed your text at the beginning for SEO reasons but moved it down because it isn't aimed at the user, Google will interpret that as misleading to the end user. Will they notice a single paragraph? Doubtful. But if you were to make this common practice, I believe it would fall into the category of blackhat SEO, even though it's mild in comparison to other things you can do.
You have to remember that a visitor who ends up on your site through Google is a customer of Google, and if that visitor feels misled, it reflects on Google not doing a good enough job. Keep that in mind when designing.
A tip, if you're looking for one: simply never do things like this. If you design your site and write your text well enough that visitors find it useful, Google will see that and the ranking will follow. That has been my experience anyway; shortcuts have never worked out (for me) in the long run.
Any "I want machine to see content more prominently, but obscure it from user" action is simply called cloaking and will reflect negatively on your index once caught.
How much will it take? Depends on many factors, including complains from users lured to your site if it indeed have nothing but "SEO reasons" to it.
The most important content first is good SEO practice.
You cannot say that moving content around is bad SEO practice per se. Take a look at the HTML of Wikipedia, for example: on a detail page (an article), the content comes before the HTML of the navigation.
The bot will see it more prominently; it's the most important thing on the page, and that's it.
Another example: why should it be bad practice to place the content first in the HTML and "move up" a slider with some images, completely coded in JS? Things like that.
In one video on YouTube, Matt Cutts discusses a similar question about JavaScript and dialog elements that toggle the display of text (show/hide), which is quite common. As long as typical phrases are used ("on", "off", "more", "see more", things like that) and you use common tools, you don't look suspicious.
As usual, the devil will be in the details.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
I wonder, does Google index and rank sites with dynamically loaded content as well as sites with static content? For example, what about a site where all the layout and elements are created by JavaScript and all the content is AJAX-loaded?
You need to do some extra work, but it is perfectly possible to build indexable AJAX-based websites.
The main thing you have to do is make sure that any URL that uses a #! hash to determine what to view also has a ?_escaped_fragment_= URL that generates the exact same content.
For more (and likely more understandable) information look at https://developers.google.com/webmasters/ajax-crawling/docs/specification
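As a sketch of what that means in practice, assuming a Node/Express server, where renderSnapshot is a hypothetical helper that returns pre-rendered HTML for a fragment:

const express = require('express');
const app = express();

app.get('/', (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    // A crawler requested e.g. /?_escaped_fragment_=clients, which is
    // the "ugly" form of the pretty URL /#!clients: serve a static
    // HTML snapshot with the exact same content.
    res.send(renderSnapshot(fragment)); // hypothetical helper
  } else {
    res.sendFile(__dirname + '/index.html'); // the normal JS-driven page
  }
});

app.listen(3000);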
Google's crawler now does index content that is generated from JavaScript. You can see this working at http://www.adynamicjssite.com/ and by googling for site:adynamicjssite.com (the snippet Google renders comes from content that's stored in JavaScript).
Note that Bing and others do not yet support running JavaScript while spidering your site.
No. Crawlers do not scrape dynamically generated content.
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769
Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
If you view your page source (CTRL+U in Firefox), you'll see everything the web crawler sees.
You could also use Crawlme to automatically make your web app pages indexable by Google.
Hi
My web site provides instant filtering of articles via JavaScript.
Initially, the 12 freshest article summaries are displayed.
Summaries of ALL articles are put into a JavaScript cache-object (rendered by the server in script tags).
When the user clicks on a tag, the corresponding article summaries are taken from the JS cache-object and inserted into the page as HTML pieces.
Does this have a negative impact on how SEO-friendly my web site is?
The main problem is clear: only 12 "static" URLs are displayed, and the others appear programmatically, only on user interaction.
How can I make the site SEO-friendly while keeping this nice filtering feature?
If I add an "all articles" link that loads a separate page with all the articles, will that solve the SEO problems?
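To illustrate, my setup looks roughly like this (simplified; the real ids, markup, and cache contents differ):

// The server renders all summaries into a JS cache-object in a script tag.
var articleCache = {
  javascript: '<div class="summary">Article 1 summary</div>',
  seo: '<div class="summary">Article 2 summary</div>'
};

// On a tag click, the matching summaries are inserted as HTML pieces.
document.getElementById('tag-list').addEventListener('click', function (e) {
  var tag = e.target.getAttribute('data-tag');
  if (tag && articleCache[tag]) {
    document.getElementById('articles').innerHTML = articleCache[tag];
  }
});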
The way to make this work for search engines and for users who don't have JavaScript, while keeping your funky filtering, is to build the feature in stages.
Stage 1: Get a working "paged" version of this page, so it shows 12 results and you can click on "next page" and "last page", and maybe even on various page numbers.
Stage 2: Implement the filter using a form post and have it change the results shown in the page view.
Stage 3: Add JavaScript on top of the working form and have it display the results the normal post would display (see the sketch below). You can also replace the full-page reload for paging with JavaScript, safe in the knowledge that it all works without JavaScript.
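For stage 3, a minimal sketch, assuming the stage-2 form posts to /articles and the element ids below (adapt them to your markup):

document.querySelector('#filter-form').addEventListener('submit', function (e) {
  e.preventDefault(); // the plain form post remains the non-JS fallback
  var tag = document.querySelector('#tag-input').value;
  var url = '/articles?tag=' + encodeURIComponent(tag);

  // Fetch the same server-rendered results the form post would return.
  fetch(url)
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.querySelector('#results').innerHTML = html;
      history.pushState({ tag: tag }, '', url); // keep the view linkable
    });
});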
Most people use an AJAX request rather than storing an ever-increasing list in a JavaScript array.
Crawlers (or most of them) don't enable JavaScript while crawling. Therefore, all JavaScript-powered content won't be referenced, and your site will be considered smaller than it is by search engines. This will penalize your pages.
Making a "directory" page can be a solution, but if you do so, search engines will send users to those static pages, not through your JavaScript viewer homepage.
Anyway, I would not recommend making content viewable only through JavaScript:
- it's not SEO friendly;
- it's not friendly to users with JavaScript disabled;
- the history functions (back and forward) can't be used;
- middle-click "open in a new tab/window" won't work;
- you can't bookmark JavaScript-generated content.
So, is your nice feature nice enough to lose the points mentioned above?
There are ways to reconcile your feature with all those points, but that's far from easy to do. One piece of it is sketched below.
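For example, one way to claw back history and bookmarks; a sketch only, where loadContentFor is a hypothetical function that renders the view for a given key:

// Mirror each JS view change in the URL hash so back/forward and
// bookmarks keep working.
window.addEventListener('hashchange', function () {
  loadContentFor(location.hash.slice(1)); // "#clients" -> "clients"
});

// Restore the right view on a direct or bookmarked visit.
if (location.hash) {
  loadContentFor(location.hash.slice(1));
}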
Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
I have a JavaScript snippet that clients can put on their webpages; it loads some text associated with embedded Flash objects (like SlideShare presentations) on that page. Does Google crawl this type of content? Will it provide any SEO benefit? If not, what else should I consider? I don't want to force people to embed the actual content, since they typically have multiple pages that use this script and there is typically a lot of text. Any suggestions?
Google does execute on-page JavaScript quite well, but the current SEO consensus is that external JavaScript (i.e., asynchronously loaded content) does not count as part of the page.
This means that the script's loaded text is not SEO-valuable.
If you want the text to be valuable, it must be on-page, meaning it must be in the HTML of the page; so basically you will have to go with the big (text already included) JS snippet.
But before you rush to make it "SEO-valuable", please be aware that duplicate content is usually not valuable. So if the text shows up on different pages, it might not be useful to include it at all.
Flash is popular on the Web, but it presents challenges to the search engines in terms of indexing the related content. This creates a gap between the user experience of a site and what the search engines can find on that site.
It used to be that search engines did not index Flash content at all. In June 2008, Google announced that it was offering improved indexing of this content
(http://googlewebmastercentral.blogspot.com/2008/06/improved-flash-indexing.html).
This announcement indicates that Google can index text content and find and follow links within Flash files. However, Google still cannot tell what is contained in images within the Flash file. Here are some reasons why Flash is still not fully SEO-friendly:
Different content is not on different URLs
This is the same problem you encounter with AJAX-based pages. You could have unique frames, movies within movies, and so on that appear to be completely unique portions of the Flash site, yet there’s often no way to link to these individual elements.
The breakdown of text is not clean
Google can index the output files in the SWF file to see words and phrases, but in Flash, a lot of your text is not inside clean <h1> or <p> tags; it is jumbled up into half-phrases for graphical effects and will often be output in the incorrect order. Worse still are text effects that often require "breaking" words apart into individual letters to animate them.
Flash gets embedded
A lot of Flash content is only linked to by other Flash content wrapped inside shell Flash pages. This line of links, where no other internal or external URLs are referencing the interior content, means some very low PageRank/link juice documents. Even if they manage to stay in the main index, they probably won’t rank for anything.
Flash doesn’t earn external links like HTML
An all-Flash site might get a large number of links to the home page, but interior pages almost always suffer. For embeddable Flash content, it is the HTML host page earning those links when they do come.
SEO basics are often missing
Anchor text, headlines, bold/strong text, img alt attributes, and even title tags are not simple elements to properly include in Flash. Developing Flash with SEO in mind is just more difficult than doing it in HTML. In addition, it is not part of the cultural lexicon of the Flash development world.
A lot of Flash isn’t even crawlable
Google has indicated that it doesn't execute external JavaScript calls (which many Flash-based sites use) or index the content of external files called by Flash (which, again, a lot of Flash sites rely on). These limitations can severely impact what a visitor can see versus what Googlebot can index.
Note that it used to be impossible to test the crawlability of Flash, but the Adobe Search Engine SDK now lets you get an idea of how the search engines will see your Flash file.
You can have the content on an external page.
If you don't want Google to crawl it, block it with robots.txt.
If you don't want Google to index it (possibly the better option), use an X-Robots-Tag header or a noindex meta tag in the head, as sketched below.
Whether you use JavaScript to pull it into the page, iframes, or both really comes down to the implementation and to what the included page may need to access on the host page: tracking, sessions, etc.
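A sketch of the noindex option, assuming the external text is served from its own route on a Node/Express server (loadEmbeddedText is a hypothetical helper):

const express = require('express');
const app = express();

app.get('/embedded-text/:id', (req, res) => {
  // Allow the embedding script to fetch the text, but keep the bare
  // text page itself out of the index.
  res.set('X-Robots-Tag', 'noindex');
  res.send(loadEmbeddedText(req.params.id)); // hypothetical helper
});

app.listen(3000);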
Although Google does not crawl Flash and JavaScript very well, these are not the only things crucial for SEO. Many other things matter, such as keyword density, quality of content, inbound and outbound linking, and titles and content that are well managed with proper tags. So if Flash/JavaScript is a necessity, use it, but do not use it in excess.
Google is not efficient at reading or indexing Flash elements. If I had to publish content like SlideShare presentations, I would produce a PDF: it can be indexed with no problem and could drag traffic to my website.
Google crawls Flash objects to some extent, but in my experience the best solution (if Flash is unavoidable) is to use SWFObject to provide alternative HTML text. This will make your Flash and your site 100% Google friendly and, more importantly, user friendly.
For more information go here:
http://www.adobe.com/devnet/flashplayer/articles/alternative_content.html
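As a sketch (assuming swfobject.js is loaded; the file name and element id are placeholders): the element holding the alternative HTML text is what crawlers and Flash-less users see, and embedSWF swaps it for the movie when Flash is available.

swfobject.embedSWF(
  'slides.swf',    // the Flash movie (placeholder name)
  'presentation',  // id of the element holding the alternative HTML
  '550', '400',    // width and height
  '9.0.0'          // minimum Flash Player version
);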
People read way too much into what Google will think about the technology or specific code they use on their site. If you're on the up-and-up and not using the code to cloak, deceive visitors, or hijack sessions, you're going to be just fine. Would you rank better if you substituted text for all the Flash? Maybe a little, but at the end of the day it comes down to the quality of your content (yes, even if it's not text-based), the number of people who find it useful (via high-quality links), and other small factors.
Eight years ago, avoiding JS would have been more worthwhile, but it just doesn't matter much anymore: Google treats navigable websites the same and ranks primarily on "quality", not on usability tricks or excessive keyword-rich text.