Saturday 14 July, 2007

Accessibility and Usability issues

Since Googlebot is designed to simulate the behavior of an average Internet user, it checks web pages for accessibility and usability issues as well. The algorithms that identify major problems on a web site are highly refined, so errors, misused code and hard-to-comprehend layouts all play a role in deciding the ranking of its pages. While a few errors will most likely be ignored, major problems, site-wide navigational inconsistencies, and especially the intentional misuse or overuse of certain elements may well lead to a decline in rankings.

Known issues


Accessibility and usability checks rely heavily on browser compatibility, which is an ever-changing factor. Some practices may be more widespread now than they were a year ago, yet still be viewed as a hindrance because a minority of web browsing software still cannot display them correctly. Google updates its algorithms and Googlebot constantly, thereby expanding the range of methods a web site may use in its design while still getting its content properly indexed; the results aim to keep pace with the majority of users and with technical advancements.

Shockwave Flash content is analyzed for its textual content; JavaScript-based links are followed the same way as anchor text links (although they do not pass any parameters); and image maps, information in the NOFRAMES tag, and other advancements in standards are evaluated in the same manner for relevance and trust. However, the broader the range of browsers a web site can serve, the more importance it will be given. There is still a hierarchy in judging usability issues, ranking the most accessible sites above specialized designs: plain text links, for example, weigh more than image-based links, and references buried in heavy code are likely to be followed at a slower rate than easy-to-access navigation.
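To see why plain text links are easier to pick up than script-driven navigation, here is a minimal sketch of how a markup-level parser views the two kinds of links. It uses only the Python standard library; the LinkExtractor class name and the sample markup are made up for illustration and do not describe how Googlebot actually works.

from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets a very simple crawler could follow."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Plain anchor links expose their target directly in the markup.
            if href and not href.startswith("javascript:"):
                self.links.append(href)
            # A link built only from an onclick handler or a javascript: URL
            # offers nothing a markup-level parser can follow.

sample = """
<a href="/products.html">Products</a>
<a href="javascript:void(0)" onclick="openPage('about')">About us</a>
"""

parser = LinkExtractor()
parser.feed(sample)
print(parser.links)   # only '/products.html' is recoverable from the markup

Only the first link surfaces, which is the practical reason text link references carry more weight than navigation buried in script.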

+ Resolution: The W3C standards for web pages are a good hint as to whether a web site is ready to be evaluated by Googlebot based on the simulated user experience. While a page does not need to comply with every standard, major errors, and problems that are more than browser-specific differences, are less likely to be ignored. Asking yourself whether your web site is easy to use, and whether it is accessible with the most common web browsers, is also a good starting point. A simple checklist might include:

- no broken links or orphaned pages, and reasonable loading times
- a manageable number of links within the navigation, with the overall navigation communicating a consistent and coherent page hierarchy
- images labeled with ALT attributes
- unique TITLE and META description tags on every page
- proper page encoding and language settings
- text of readable size and color, with no hidden text
- no overuse of anchor text in links
- no cloaking, off-screen content, or invisible layers
- no redirect chains
- no overuse of keywords to the point where the content becomes meaningless
- use of all necessary HTML tags, and closing of every tag that is opened
- a layout that emphasizes the parts unique to each page
- code that does not rely on practices yet to become standard

While the list of things to watch for may seem long, some knowledge of web page coding and a little common sense will save most pages from becoming a burden to your web site, or to the visitor trying to decipher them. The most common errors are still the most obvious ones: misused, or vital but forgotten, HTML code leads the list of problems and causes many instances of a drop in rankings.
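Several of these items are mechanical enough to check automatically. The sketch below is a rough illustration using only the Python standard library; the BasicPageCheck class, the report function and the sample page are assumptions for the example, not anything Google publishes. It flags images without ALT text and pages missing a TITLE or META description.

from html.parser import HTMLParser

class BasicPageCheck(HTMLParser):
    """Flags a few checklist items: missing ALT text,
    missing TITLE, and missing META description."""

    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False
        self.images_without_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.has_meta_description = attrs.get("content", "").strip() != ""
        elif tag == "img" and not attrs.get("alt", "").strip():
            self.images_without_alt += 1

def report(html_source):
    checker = BasicPageCheck()
    checker.feed(html_source)
    if not checker.has_title:
        print("Missing TITLE tag")
    if not checker.has_meta_description:
        print("Missing or empty META description")
    if checker.images_without_alt:
        print(f"{checker.images_without_alt} image(s) without ALT text")

report("<html><head></head><body><img src='logo.gif'></body></html>")

A script like this will not catch judgment calls such as keyword stuffing or a confusing hierarchy, but it does cover the obvious, easily forgotten markup errors that lead the list of problems.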
