Unlike humans, search engines have many limitations as they peruse (“crawl”) the internet and interpret content on websites. In fact, human eyes and search engines see web pages very differently. That’s why building or modifying web pages represents a very delicate balance between what works best for viewers and what works best for search engines. In the end, the best optimized pages are a well-crafted compromise that works well for people and search engines alike.
The Challenge of Creating Indexable Content
Content in HTML text format is the standard for performing well in search engine listings. Many people don’t realize that Flash files, images, Java applets, and other non-text content are frequently ignored by search engines, despite the many advances in “crawling” technology.
While keeping content in HTML text format is of primary importance, there are other things you can do to make the non-HTML elements of your website visible to search engines (a markup sketch follows this list):
- Images: provide alt text for every image. Give images in GIF, JPG, or PNG format an alt attribute so that search engines have a crawlable text description of the visual content.
- Search boxes: pair on-site search with standard navigation and crawlable links; crawlers can’t type queries into a search box.
- Video & audio: provide a transcript so that engines can index the content.
- Flash or Java plug-ins: be sure to incorporate text on the page itself.
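To make these points concrete, here is a brief markup sketch, with hypothetical file names and placeholder text, showing one crawlable alternative for each element type:

```html
<!-- Image: the alt attribute gives crawlers a text description of the visual content. -->
<img src="/images/lakefront-cabin.jpg" alt="Lakefront cabin at sunset">

<!-- Search box: back up on-site search with ordinary, crawlable navigation links. -->
<nav>
  <a href="/services/">Services</a>
  <a href="/about/">About Us</a>
  <a href="/contact/">Contact</a>
</nav>

<!-- Video or audio: link to a full text transcript that engines can index. -->
<video src="/media/welcome.mp4" controls></video>
<p><a href="/media/welcome-transcript.html">Read the full transcript</a></p>

<!-- Flash or Java plug-in: keep the important text in the page markup itself. -->
<object data="/media/virtual-tour.swf" type="application/x-shockwave-flash">
  <p>Take a virtual tour of our facility, including the lobby, gym, and pool.</p>
</object>
```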
Are Your Link Structures Crawlable? Answer: They Better Be!
Search engines must be able to see links to crawl through a website’s content. That means you’ve got to have a crawlable link structure. Without it, you could be wasting a great deal of time on crafting good content that gets ignored. Don’t make the common mistake of creating web pages that engines can’t find.
Here’s what it looks like when you don’t employ a crawlable link structure: imagine a site whose home page links to pages A and B, while nothing at all links to pages C or D. With no direct links pointing to pages C and D, they remain invisible to search engines. It doesn’t matter if those pages are beautifully written and use keywords effectively. All of that effort is in vain if they sit as orphans, with no links connecting them to the rest of the site.
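A minimal fix, sketched below with hypothetical page names, is simply to add plain anchor links on the home page so that crawlers have a path to every page, including C and D:

```html
<!-- Hypothetical home page navigation: plain <a href> links are the paths
     crawlers follow. Pages C and D become discoverable only because they
     are linked here; without these anchors they remain orphans. -->
<nav>
  <a href="/page-a.html">Page A</a>
  <a href="/page-b.html">Page B</a>
  <a href="/page-c.html">Page C</a> <!-- previously unlinked -->
  <a href="/page-d.html">Page D</a> <!-- previously unlinked -->
</nav>
```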
Why Don’t Search Engines See Certain Pages?
Here are just a few reasons why areas of your website could be inaccessible to search engines:
- Forms that require a submission
If users must fill out and submit an online form to reach certain content, search engines most likely won’t find the pages behind it, because crawlers don’t fill out or submit forms.
- Links in Flash and other plug-ins
Once again, these links won’t be accessible to search engines.
- Links that point to pages blocked by a Meta Robots tag or robots.txt
A Meta Robots tag or the robots.txt file lets site owners block crawler access to a particular web page (see the sketch after this list). Although this is useful for keeping out rogue bots, it shuts out legitimate search engines as well.
- Huge numbers of links on pages
Only so many links on a page will be crawled by a search engine. This isn’t meant to punish ethical websites. It’s meant to reduce spam and ensure reliable search results. Therefore, building pages with hundreds or thousands of links can kill a website’s search ranking.
- Frames or iframes
Although links in frames or iframes can technically be crawled, both present structural issues that can be problematic for search engines. It’s best to leave this area to advanced webmasters who have a solid understanding of how engines index and follow links in frames.
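For reference, here is a brief sketch of the two blocking mechanisms mentioned above (the directory path is hypothetical); any page covered by these directives is off-limits to compliant crawlers, no matter how many links point to it:

```html
<!-- Page-level block: a Meta Robots tag placed in the page's <head> tells
     crawlers not to index the page or follow its links. -->
<meta name="robots" content="noindex, nofollow">

<!-- Site-level block: robots.txt is a plain-text file at the site root
     (its rules are shown here inside a comment); these two lines keep
     compliant crawlers out of an entire directory:

     User-agent: *
     Disallow: /private/
-->
```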
Designing a search-engine-friendly website is no small task. As the SEO landscape becomes more complex, it takes serious experience to create SEO-friendly websites. Dreamscape Marketing, a proud Google Partner, can help you create or revamp your website for today’s—and tomorrow’s—demanding web environment. Call us today at 877-958-9180 and let’s discuss your SEO needs.