Search engines are a common way to find information on the Web. The results that appear when you type in a query are based on complex algorithms that weigh a wide range of factors, including how often your keywords occur on a page and whether they appear together or in order. Understanding how to use a search engine well can make a big difference in the quality of your results. Useful techniques include searching for exact phrases, striking a balance between broad and narrow searches, and combining search terms with Boolean operators.
Search engine technology is not only complicated but also constantly changing. Search engine software sifts through the millions of pages recorded in an index, finds matches to a user's query and ranks them by relevance. The first page of results alone typically lists a dozen or more sites, so companies that rank higher in search results are more likely to attract visitors and customers.
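To make that matching-and-ranking step concrete, here is a minimal Python sketch that scores a few invented pages against a query using nothing more than keyword frequency and a small bonus when the query words appear together in order. Real engines weigh hundreds of signals; the URLs and page text below are placeholders for illustration only.

```python
# Toy ranking pass: score pages against a query using keyword frequency
# plus a small bonus wherever the query terms appear together in order.

def score(page_words, query_words):
    words = [w.lower() for w in page_words]
    query = [q.lower() for q in query_words]

    # Frequency component: how often each query term occurs on the page.
    frequency = sum(words.count(q) for q in query)

    # Proximity component: a bonus for each place the terms appear in order.
    phrase_hits = sum(
        1 for i in range(len(words) - len(query) + 1)
        if words[i:i + len(query)] == query
    )
    return frequency + 2 * phrase_hits


# Hypothetical pages standing in for an index.
pages = {
    "example.com/denim":  "Joe Bloggs jeans are a popular denim brand".split(),
    "example.com/people": "Joe is a common first name and Bloggs a placeholder surname".split(),
}

query = ["Joe", "Bloggs"]
for url, words in sorted(pages.items(), key=lambda page: -score(page[1], query)):
    print(score(words, query), url)
```

The page where the two words appear side by side outranks the page that merely mentions each word, which is the behaviour the frequency-and-proximity factors above are meant to capture.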
The most popular search engine is Google, but there are many others. Some are specialized, focusing on audio, video or images, or on finding jobs, torrent files or news articles. Nearly all of them are free to use.
A search engine is composed of three parts: the index, which records Web pages and their locations; a program that sifts through the millions of entries in the index; and an interface, displayed in a Web browser, that presents the results. The index is built by a computer that crawls the Internet, collecting and recording information from Web sites, reading their meta tags and following links to other Web sites. The information is then stored in a central repository. An index may be refreshed automatically on a regular schedule or updated manually, depending on how the database is structured and how often its content changes.
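The index itself can be pictured as an inverted index: a mapping from each word to the pages and positions where it occurs, which is what lets the search program jump straight to candidate pages instead of rescanning everything for every query. The short sketch below illustrates the idea with made-up pages; it is not how any particular engine actually stores its data.

```python
from collections import defaultdict

# Minimal inverted index: each term maps to the pages and word positions
# where it occurs, so a query can be answered by looking up its terms
# rather than scanning every stored page.

def build_index(pages):
    index = defaultdict(list)
    for url, text in pages.items():
        for position, word in enumerate(text.lower().split()):
            index[word].append((url, position))
    return index


# Hypothetical crawled pages.
pages = {
    "example.com/a": "search engines build an index of web pages",
    "example.com/b": "an index maps each word to the pages that contain it",
}

index = build_index(pages)
print(index["index"])   # [('example.com/a', 4), ('example.com/b', 1)]
```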
There are two basic types of search engines: those that use automated software agents called crawlers and those that rely on human submissions. Crawler-based search engines send automated software agents (or bots) to visit each Web site, read the content of the page and its metadata, and follow links so that other sites can be indexed as well. The collected information is returned to the central repository, where it is catalogued and organized. Human-powered search engines rely on volunteers who review submitted information and place it in the index.
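The crawler side of that process can be sketched in a few lines: fetch a page, record its meta description, collect its links and repeat. The Python sketch below uses only the standard library and a placeholder seed URL, and it leaves out everything a production crawler needs, such as robots.txt handling, rate limiting, deduplication and robust error recovery.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

# Very small crawler sketch: fetch a page, remember its meta description,
# and queue the links it points to. The seed URL is a placeholder.

class PageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "meta" and attrs.get("name"):
            self.meta[attrs["name"]] = attrs.get("content", "")


def crawl(seed, limit=10):
    queue, seen, records = [seed], set(), {}
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue               # skip pages that fail to download
        parser = PageParser()
        parser.feed(html)
        records[url] = parser.meta.get("description", "")
        queue.extend(urljoin(url, link) for link in parser.links)
    return records


if __name__ == "__main__":
    for url, description in crawl("https://example.com/").items():
        print(url, "-", description)
```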
When searching for specific information, you need to balance the amount of information returned against the time it takes to sort through the results. You can save time by searching for exact phrases: enclosing a phrase in quotation marks tells the search engine to return only pages that contain that exact sequence of words, so a search for “Joe Bloggs” finds pages where those two words appear together in that order rather than pages that merely mention each word somewhere. You can broaden the results with the OR operator, which tells the search engine that either keyword is acceptable in a result. You can also narrow results by excluding words with the NOT operator, usually written as a minus sign: a search for Joe Bloggs -jeans eliminates any results containing the word jeans.
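The toy example below mimics these operators over a few invented page titles, so you can see how an exact phrase narrows the result set, OR accepts either keyword, and a minus term filters matches out; real search engines apply the same logic against their full index rather than a short list.

```python
# Toy demonstration of query operators over a few invented page titles.

pages = [
    "Joe Bloggs jeans spring catalogue",
    "Interview with author Joe Bloggs",
    "Bloggs family history and the name Joe",
]

def contains_phrase(page, phrase):
    return phrase.lower() in page.lower()

def contains_word(page, word):
    return word.lower() in page.lower().split()

# "Joe Bloggs" -- exact phrase: only pages with the words together, in order.
print([p for p in pages if contains_phrase(p, "Joe Bloggs")])

# jeans OR history -- either keyword is enough, so more pages match
# than either word would return on its own.
print([p for p in pages if contains_word(p, "jeans") or contains_word(p, "history")])

# Joe Bloggs -jeans -- both names required, pages mentioning jeans excluded.
print([p for p in pages
       if contains_word(p, "Joe") and contains_word(p, "Bloggs")
       and not contains_word(p, "jeans")])
```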