4 search engine snags and their satisfying solutions
Users have a hard time finding government information on the Web. Remedying the situation requires a mix of technical and information design skills.
Consistency is generally a good thing, but the Food Safety and Inspection Service’s website established a pattern for its search function no organization wants to own: It was consistently bad.
Year after year, the search function lurked at the bottom of customer satisfaction surveys for FSIS, a public health agency in the Agriculture Department. When users were asked to judge website elements such as content, functionality, look and feel, navigation, and site performance, “search was dead last,” said Janet Stevens, CIO at FSIS.
The agency used a combination of Web analytics and more detailed survey questions to zero in on the problem and discovered what was frustrating some site visitors: They were searching for information that couldn’t be found on the site. FSIS’ food safety purview covers meat, poultry and eggs, but some users were searching for information on vegetable and seafood recalls. Those alerts fall under the Food and Drug Administration.
The agency solved the problem by directing users to the appropriate government entity when search requests are not within FSIS’ purview. The site’s A-Z index, for example, now includes seafood and vegetable listings with links to FDA’s website.
“To improve the search satisfaction for these customers, we had to route them off our site ... and point them in the right direction,” said Kim Taylor, director of Web services at FSIS.
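The FSIS approach amounts to checking a query against a map of out-of-scope topics before running the site's own search. A minimal sketch, assuming made-up topic keywords and using FDA's public site as the destination (the actual FSIS routing rules are not described in detail here):

```python
# Sketch: route out-of-scope search topics to the right agency.
# The topic keywords below are illustrative, not FSIS' actual mapping.

OFF_SITE_TOPICS = {
    "seafood": "https://www.fda.gov",    # seafood recalls fall under FDA
    "vegetable": "https://www.fda.gov",  # produce recalls fall under FDA
}

def route_query(query):
    """Return an external URL if the query is outside this site's purview,
    or None to fall through to the site's own search."""
    words = query.lower().split()
    for topic, url in OFF_SITE_TOPICS.items():
        if topic in words:
            return url
    return None
```

The same mapping can feed an A-Z index: each out-of-scope topic becomes a listing that links directly to the other agency's site.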
Although search satisfaction scores have gone up at FSIS, many agencies remain stuck in seek-and-you-might-find mode. Indeed, users often have little assurance that their searches will produce the desired result, whether they use a search engine such as Google or Bing or a site’s native search capability. Many agency sites falter when it comes to findability — the degree to which visitors can readily obtain information.
“Government sites are horrible,” said Shari Thurow, founder of Omni Marketing Interactive, a firm that specializes in Web design, usability analysis and search engine optimization (SEO).
Thurow, who advises both government and corporate organizations, said search engines try to determine the “aboutness” of Web documents. But site owners don’t always take the time to clearly define their pages. As a consequence, a search engine is compelled to guess what a page is about, and those guesses can easily be wrong.
“The government is notorious for not defining the aboutness of pages,” Thurow said. “Not only can [agencies] not expect their pages to rank in a Web search engine, they can’t expect their site search results to be accurate either. That is why people have a terrible time on government websites.”
To be fair, commercial enterprises haven’t been covering themselves in findability glory either. A 2008 AIIM study found that nearly half the businesses surveyed lacked a formal goal for enterprise findability. About 70 percent of respondents estimated that half or less of their organization’s information was searchable online.
So what can resource-strapped agencies do to improve their sites’ findability? Here are the core problems and some potential solutions.
Problem 1: Poor information architecture
Lack of SEO gets the blame for dismal search results, but the problem goes much deeper. Search dysfunction begins with poor website design and information architecture.
In terms of findability, information architecture refers to the organization and labeling of content on a website. It shapes how users will navigate the site and ultimately determines how easily they will locate information.
Thurow cited labeling as one of the key components of information architecture. Labels aim to make a site’s information easier to find and can take a number of forms, including HTML/XHTML title tags, which define what a Web page is about; headings and subheadings; breadcrumb links, which help users navigate a site, as in Home > Apparel > Women > Dresses; and text on navigation buttons.
Website owners, however, “often leave the labeling decisions to Web developers, who are not trained, educated or knowledgeable about information architecture,” Thurow said.
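Whether a site's labels are in place can be checked mechanically. A minimal audit sketch using only Python's standard library, which flags pages missing a title tag or top-level heading (a real audit would also look at breadcrumb links and navigation button text):

```python
# Sketch: flag pages whose key labels (title tag, top heading) are
# missing or empty. Uses only the standard library's HTML parser.
from html.parser import HTMLParser

class LabelAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.labels = {"title": "", "h1": ""}
        self._current = None  # label tag we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in self.labels:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.labels[self._current] += data.strip()

def missing_labels(html):
    """Return the label tags that are absent or empty in the page."""
    audit = LabelAudit()
    audit.feed(html)
    return [tag for tag, text in audit.labels.items() if not text]
```

Running a script like this across a site inventory gives Web managers a concrete list of pages whose “aboutness” a search engine would have to guess.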
In addition, website owners often view architecture in terms of hierarchical navigation and emphasize only a few key pages. Peter Morville, president of Semantic Studios, said government and commercial organizations spend a lot of time designing home pages and top-level pages. “But what you find when you start looking at the analytics is that many visitors don’t come through the home pages,” said Morville, whose firm specializes in information architecture, user experience and findability.
He said more than 75 percent of the Library of Congress’ Web visitors arrive via searches and therefore land deeper in the site. He is working with library officials on a more search-centered approach to their Web offerings.
Thurow said there’s an easy way for agencies to tell if their architecture needs help: A constant stream of calls and e-mail messages asking where to find things should send a strong signal that something is amiss.
Solutions: Effective labeling boosts navigation, but agencies can take additional steps to improve site architecture and, by extension, searchability.
Organizing content into groups is one method. Thurow cited one of her clients, the National Cancer Institute, as an agency that has performed admirably in this area. For example, NCI groups content according to cancer topics and types of cancer.
Morville, who has also worked with NCI, said the institute identified its most important content and organized it accordingly. “Prioritizing content is something that is ultimately part of the search experience,” he added.
In general, he encourages clients to think about their content along the lines of products in a catalog. In this website-as-database model, the search function knits together a site’s collection of objects, he said.
Problem 2: Not enough people or expertise
Structuring and labeling content takes time and effort, but resource-constrained agencies aren’t brimming with information architects, search experts and metadata mavens. Indeed, several comments posted during the National Dialogue on Improving Federal Websites in September and October pointed to staffing issues.
“Too many government programs are run by overworked staff with little to no expertise in website development,” one commenter said.
Another noted a lack of advanced, specialized skills in areas such as usability, information architecture and SEO.
Solutions: Agencies can struggle to justify hiring employees who would bolster search and findability. An e-commerce site can tie such an investment to a projected increase in product sales. Federal Web teams might instead argue that enhanced searchability avoids lost time and productivity and deflects inquiries from higher-touch call centers.
Training existing Web employees in key areas is another option. That approach also requires an investment, but it doesn’t call for an increase in staffing levels.
Morville said graphic designers, Web developers and others involved in running a federal website can learn enough about information architecture to do a reasonable job. However, particularly complicated or sophisticated sites might need the expertise of an information architect, he added.
Another strategy is to set priorities and keep findability within the grasp of existing employees.
Only a handful of a website’s documents end up in high demand, said Louis Rosenfeld, an information architecture consultant who recently wrote a book on search analytics. Similarly, a relatively small number of search queries might account for 30 percent to 40 percent of a site’s search traffic, he added.
In other words, target your resources at the most important areas rather than at the website as a whole.
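Finding that high-demand “short head” is a simple exercise in search-log analysis. A sketch, with made-up query counts, that returns the smallest set of top queries covering a target share of traffic:

```python
# Sketch: find the "short head" of a search log -- the smallest set of
# queries that accounts for a target share of total search traffic.
# The query counts below are invented for illustration.
from collections import Counter

def short_head(query_counts, target_share=0.4):
    """Return top queries whose combined volume reaches target_share."""
    total = sum(query_counts.values())
    head, covered = [], 0
    for query, count in query_counts.most_common():
        head.append(query)
        covered += count
        if covered / total >= target_share:
            break
    return head

log = Counter({"recalls": 500, "jobs": 300, "e coli": 120,
               "labeling": 50, "forms": 30})
```

The queries that come back are the ones worth the richest tagging and curation; everything in the long tail can be left to the search engine's default indexing.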
Rosenfeld uses an onion metaphor to describe how an organization might prioritize its findability activities. The outer layer with the least important content might be left alone and indexed via the search engine. The next layer moving inward could involve an ad hoc approach, such as user-generated tagging. The next layer could employ simple metadata tagging, while the subsequent layer uses more sophisticated tagging. The most important content at the onion’s core might call for the most sophisticated techniques, such as ontological modeling.
“The point isn’t having more people, it’s investing resources more efficiently,” Rosenfeld said.
Organizations basically have two options for addressing findability, said Leslie Owens, a senior analyst at Forrester Research who focuses on enterprise search. In a systematic approach, the website owner builds a business case and obtains funding to hire personnel and buy technology. In contrast, a tactical approach would rely on informal methods, such as encouraging people to tag content.
“You can address findability in a low-cost way or in a more strategic way,” she said.
Problem 3: Too many government websites
Many agency websites compete with one another in search lists, complicating users’ efforts to find the desired information. Citing the need to keep content readily accessible, the Obama administration is seeking to reduce redundancies among the estimated 24,000 federal websites. State governments also struggle with a proliferation of official websites.
Solutions: Consolidating sites is one way to address the issue. In June, the Office of Management and Budget imposed a freeze on new executive branch domains and said the move seeks to “simplify access to federal services.”
OMB also directed agencies to “eliminate duplicative and outdated websites” and identify sites that should be consolidated.
At the state level, Massachusetts has been cutting back on websites over the years. Commenting on the “GovFresh” blog, Sarah Bourne, director of assistive technology and chief technology strategist for Mass.gov, said the state has been integrating sites into the Mass.gov portal. In her view, the guiding principle behind consolidation is improving the customer’s experience.
Consolidation could help with efforts to aggregate and organize similar information, eliminate outdated or redundant content, and minimize “the number of host/domain names to boost page ranking,” Bourne wrote.
But some observers say consolidation is of limited use in improving findability.
“I’m not sure consolidation is necessarily the answer,” Rosenfeld said. “Rather than throwing garbage together in a huge dumpster, make the good stuff rise to the surface.”
He said identifying “best bet” search results for the most common queries is one way to do that. Created by an individual or group, the best bets would show up at the top of a site’s search results page. The technique can span multiple websites, he added.
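Mechanically, best bets are just hand-curated results pinned above whatever the engine returns. A minimal sketch, with illustrative entries rather than any agency's actual curation:

```python
# Sketch: hand-curated "best bet" results pinned above organic search
# results for the most common queries. Entries are illustrative.
BEST_BETS = {
    "recalls": ["/recalls/current", "/recalls/archive"],
}

def search_with_best_bets(query, organic_results):
    """Prepend any curated best bets for this query, then append the
    engine's organic results, dropping duplicates of pinned pages."""
    pinned = BEST_BETS.get(query.lower().strip(), [])
    return pinned + [r for r in organic_results if r not in pinned]
```

Because the mapping is just data, a central team could maintain one best-bets table and apply it across several sites' search pages, which is the cross-site use Rosenfeld describes.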
Problem 4: Little or no SEO
SEO offers a mechanism for boosting website visibility, but agencies have acquired a reputation for doing it poorly or not at all. Again, staffing and related costs appear to be key impediments. Some National Dialogue commenters pointed out that website staffers are often generalists with limited knowledge of specialized areas such as SEO.
Lack of awareness of SEO best practices and insufficient training also contribute to the situation.
“The public is increasingly visiting government websites through search engines rather than accessing these websites directly,” said a government source who spoke on background. “With that in mind, search engine optimization techniques are very important for improving the public’s access to and awareness of government-provided services and information.”
Solutions: The government is making a push to get agencies up to speed on SEO. For example, the site HowTo.gov, managed by the General Services Administration’s Office of Citizen Services and Innovative Technologies and the Federal Web Managers Council, includes a section on SEO guidance.
Improving SEO might be somewhat easier for agencies that have already gone through the information architecture process because one discipline flows into the other, Thurow said. Breadcrumb navigation, for example, can be a component of information architecture and also give SEO a lift.
Agencies could also get the jump on SEO by incorporating it into website design and development rather than tuning sites for SEO after the fact. A website owner might start with information architecture, move on to usability testing, and then embark on design and programming, managing each stage with SEO in mind, Thurow said.
“People really need search engine optimizers throughout all of these processes: architecture, design, development,” she added.
In the design and development phase, for instance, site owners should avoid inserting technologies that make search more difficult, such as expanding/collapsing menus, Thurow said.
And as agencies look for ways to improve their search functions, they shouldn’t ignore other aspects of website construction.
“Ultimately, it’s important to maintain a balance between SEO and website design for improving findability and ensuring access to quality content and links,” the government source said.
Morville said the pendulum has swung back and forth over the years, with arguments alternately favoring search or browsing. In reality, users switch between the two activities depending on what works best for the task at hand.
“You really need to strike a balance,” he said. “It’s not an either/or.”