With the dramatic increase in the number of websites, whether advertising-supported, subscription-based (amateur and professional), or free from well-respected organizations such as governments and non-profits, as well as those with commercial or political agendas, it is challenging to identify unbiased, reliable sites just by randomly searching the Web.

Search engines make the process of differentiation difficult in several ways. First, they are exactly what their name says: search engines, not find engines. They enable us to conduct a search, but not necessarily research. Second, they favor websites that pay for higher placement on search results pages by purchasing text and graphical advertising, known as search-engine marketing (SEM), which includes paid listings and keywords, with the most popular keywords going to the highest bidders. Third, because of the way search-engine algorithms are designed, the more links a site receives from other sites (in-bound links), the higher it ranks on the search-engine results page (SERP). Free and open sites with a sizable amount of content tend to accumulate the greatest number of in-bound links, and therefore the highest SERP rank, so the very best content, or the content most relevant to what users are looking for, may not always appear at the top. As a result, website producers are forced to compete on the Web by constructing their sites and writing text copy around targeted keywords and phrases, a practice called search-engine optimization (SEO), to improve their site's SERP position.
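The in-bound-link effect described above can be illustrated with a toy PageRank-style computation. This is a deliberate simplification, with an illustrative graph and function name of our own choosing; real ranking algorithms combine link analysis with many other signals.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns a dict of rank scores that sum to 1.0."""
    # Collect every page that appears as a source or a target.
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page keeps a baseline share of rank...
        new_rank = {p: (1 - damping) / n for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                # ...and passes the rest along its out-bound links.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# "hub" receives in-bound links from three sites; "b" and "c" receive none.
graph = {
    "hub": ["a"],
    "a": ["hub"],
    "b": ["hub"],
    "c": ["hub"],
}
ranks = pagerank(graph)
```

Running this, the heavily linked-to "hub" page ends up with the highest score, regardless of whether its content is actually the best match for a given query, which is exactly the distortion the paragraph above describes.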