Web of Confusion May Untangle With Standards, Cooperative Efforts
Despite the Internet’s growing use and successes, designers are losing patience with Web-page obstacles.
A growing industry and government effort to provide nationwide Internet services has created an intricate maze of accessibility, content and quality control challenges. With the number of Internet users now totaling 320 million, more than a score of browsers in use and the development of constantly changing Internet technologies, the online challenges are complex. Navigation has been difficult, errors continue to creep in, and many users are excluded from access to a large number of World Wide Web sites.
The Web is a robust tool that can disseminate information on a wide range of topics quickly and in many formats. It is also part of the United States’ democratic philosophy that all citizens are entitled to access—now electronic—to information on what the government is doing as well as the right to respond. The problem with this tool is that it performs inconsistently, excludes some citizens and causes other glitches that contend daily for Web designers’ attention.
Browser incompatibility is among the top Web challenges, according to Jeffrey Zeldman, designer and group leader, The Web Standards Project, an industry and government virtual collaborative effort. “In 1998, when 4.0 browsers came out, a lot of us [designers] got really disgusted, and there was a lot of grumbling because Microsoft’s and Netscape’s browser technologies were so completely at odds with each other. So a group of designers and programmers formed an organization to protest and to call attention to the problem and the solution,” Zeldman shares. The solution is to implement standards created by the World Wide Web Consortium (W3C), an organization that develops common protocols to promote the Web’s evolution and ensure its interoperability. “Our mandate is to help get these standards implemented by any means necessary,” he adds.
“Incompatibility means [that a page] works in Netscape but doesn’t work in Internet Explorer (IE), or it works in Explorer for Windows but not in Explorer for Macintosh,” Zeldman explains. These types of issues correspond to how hypertext markup language (HTML), cascading style sheets (CSS) and JavaScript are implemented, he notes. “For instance, if I embedded a Flash movie in Netscape, I would write a little JavaScript that will ask, ‘Does my visitor have the right plug-in?’ If so, play the movie. If not, take the user to a modified page of just text,” he states.
“The case is that you could implement this in Netscape 4.0. You couldn’t implement it in IE 4.0 ... Microsoft technology enables a similar version, but then the page doesn’t work on the Macintosh,” Zeldman says. “So basically when Macintosh users of IE 4.0 would get to a site using Flash, they would get into an endless loop. They would be asked if they wanted to download Flash. They would download it, and then they would be asked [again] if they wanted to download Flash. They would download it, and then eventually they would decide never to come back to your site again,” he explains.
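The plug-in check Zeldman describes can be sketched as a small script. The function names and page URLs here are hypothetical, and the `navigator.plugins` array it inspects is the Netscape-era mechanism; IE exposed plug-ins differently, which is exactly the cross-browser gap at issue:

```javascript
// Sketch of period plug-in sniffing; function names and URLs are hypothetical.
// Netscape listed installed plug-ins in navigator.plugins; IE did not, which
// is the incompatibility Zeldman describes.
function hasPlugin(plugins, name) {
  for (var i = 0; i < plugins.length; i++) {
    if (plugins[i].name && plugins[i].name.indexOf(name) !== -1) {
      return true;
    }
  }
  return false;
}

function chooseVersion(plugins) {
  // Play the Flash movie if the plug-in is present; otherwise send the
  // visitor to a text-only page instead of looping on a download prompt.
  return hasPlugin(plugins, "Shockwave Flash")
    ? "flash.html"
    : "text-only.html";
}
```

The crucial detail is the explicit fallback: a visitor whose browser cannot answer the question reliably, as Macintosh IE 4.0 could not, gets usable text rather than the endless download loop.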
Zeldman also notes that the advanced features are not the only cause of problems for users. Other difficulties lie in what should be simple design techniques such as the display of text. If a designer is using style sheets and wants text to appear small, he or she can use the CSS keyword small to set the font size. But depending on the browser, the text will appear anywhere from very small to large. Keywords are used so that site visitors are not limited to a specific font size. “I want it to be small, but I also want someone who is visually impaired to be able to enlarge it by clicking a button,” Zeldman says. This can be done with keywords but only if they are implemented properly.
“If they’re not, I have to decide that I either can’t control how my text looks on a page,” Zeldman continues, “or I have to exclude the visually impaired, or I have to design a Web page that looks big and horsey to 90 percent of my viewers. There are huge incompatibilities among browsers that cause problems over even the smallest issue.”
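The keyword approach Zeldman favors looks like this in a style sheet; the class name is hypothetical. The point of writing the keyword small rather than a fixed pixel size is that the browser, and therefore the visitor, can scale it:

```css
/* Hypothetical rule: a relative size keyword rather than a fixed pixel
   size, so a visually impaired visitor can enlarge the text. Browsers
   of the 4.0 era rendered these keywords at inconsistent sizes, which
   is the incompatibility Zeldman describes. */
.caption {
  font-size: small;
}
```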
To bypass these incompatibilities, Web designers spend at least 25 percent of their time building workarounds and alternate versions, Zeldman indicates. Therefore, clients pay 25 percent more than they would if browser standards were implemented universally. Suppose a designer is working on a $25,000 job: $6,250 is being wasted on workarounds. Multiply that amount by the 10 other jobs a designer will do in a year, then by the tens of thousands of people who are also creating Web sites, and the sum is astronomical. “You’re talking about millions of dollars being spent that don’t need to be,” Zeldman maintains.
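Zeldman’s back-of-the-envelope arithmetic can be spelled out directly; the figures are his illustrative examples, not audited costs:

```javascript
// Zeldman's example workaround cost, using his own figures.
var workaroundShare = 0.25;   // share of design time spent on workarounds
var jobPrice = 25000;         // example job, in dollars
var jobsPerYear = 10;         // jobs one designer handles in a year

var wastePerJob = jobPrice * workaroundShare;       // $6,250 per job
var wastePerDesigner = wastePerJob * jobsPerYear;   // $62,500 per year
// Multiplied across tens of thousands of people building Web sites,
// the total reaches the millions of dollars Zeldman cites.
```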
“It’s as if I were making television shows for people and I said, ‘That scene was great and it will really work well on the Sony televisions. Okay, Seinfeld, could you come back please? We’re going to do a shoot for the RCA televisions now,’” Zeldman illustrates. The result, he notes, is that either the client pays more than is necessary, the Web company loses money because it cannot bill for the workarounds, or some users get excluded.
According to Section 508 of the U.S. Rehabilitation Act, this exclusion is unacceptable for government Web sites, including state-funded institution sites. Section 508 is designed to ensure that people with disabilities have access to information made available electronically. A report on one of Zeldman’s Web sites, www.alistapart.com, states that approximately 43 million Americans have some type of disability. While not all are restricted from the Web, the total demonstrates the need for a greater awareness of accessibility and surrounding issues.
Zeldman observes, “The government has laws about accessibility based on the World Wide Web Consortium standards, for instance, Web content accessibility guidelines. The government could mandate the use of browsers that accommodate those guidelines. If the browser isn’t compliant, the government could say, ‘We can’t use this, sorry guys.’” Zeldman explains that if browser X, the most popular browser, is used for all government intranets, and someone working in a government office states that he or she cannot read a page, then the agency will have to switch to a different browser. “The government law ... will force people who make Web sites to treat accessibility much more seriously,” Zeldman offers.
Chris Unger, a project manager for the U.S. Defense Department’s DefenseLINK Web site, Alexandria, Virginia, agrees. He notes that DefenseLINK’s written policy is for site developers to design in accordance with W3C standards. “I don’t want to have to redesign my site each time a new browser feature appears on the market,” he admits. “We just can’t react that fast.”
Zeldman adds, “To some of us, this is a moral issue. It may be a moral issue because we just aren’t comfortable with excluding non-English speakers or those who are visually impaired.” He notes that incompatibilities result in a human cost. “Some people get left out. If HTML 4.0 isn’t fully implemented, then people whose first language isn’t English … [and] disabled individuals have a harder time with the Web. The Web is a gift, and it’s supposed to be platform and device agnostic. It is supposed to enable anyone in the world to share information—share art, transact business, and communicate—from a Palm Pilot to a Pentium III and on any platform. When we can’t support every browser and every platform, we’re hurting people.”
George Olsen, co-founder of The Web Standards Project and information architecture manager, Parago.com, Santa Monica, California, notes that other problems plague Web designers and users. He explains that content organization is extremely challenging, especially when a site is both broad and deep. If a site is hard to navigate, he observes, users are less likely to return.
Olsen’s company has conducted extensive traffic analysis and usability testing. This resulted in what they call a breadcrumb system, where categorization is explicit and the page always displays the category and subcategories. This allows the user to see where he or she has been and navigate easily to the next pages.
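The breadcrumb system Olsen describes can be sketched as a routine that turns a page’s category path into a visible trail; the function name, separator and URL scheme here are hypothetical:

```javascript
// Hypothetical sketch of a breadcrumb trail: every page shows its
// category and subcategories, with each ancestor linked so the visitor
// can see where he or she has been and navigate back up.
function breadcrumbHtml(path) {
  // path is an array such as ["Home", "Electronics", "Cameras"].
  var parts = [];
  var href = "";
  for (var i = 0; i < path.length; i++) {
    href += "/" + path[i].toLowerCase();
    if (i < path.length - 1) {
      parts.push('<a href="' + href + '">' + path[i] + "</a>");
    } else {
      parts.push(path[i]); // the current page is plain text, not a link
    }
  }
  return parts.join(" &gt; ");
}
```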
Mark Wolfson, content manager for the Federal Emergency Management Agency (FEMA) Web site, Washington, D.C., shares that FEMA was among the first government agencies to realize the potential for disseminating information 24 hours a day to such a large audience. He too notes the initial difficulty of content organization.
FEMA’s site started out as 200 pages—taken from existing public relations materials—and has grown to 50,000 pages. Wolfson relates that the difficulty in building the site came not from HTML programming but from site structure decisions. An agency’s organizational chart, he notes, is not necessarily the right paradigm for a Web site, so mimicking it is rarely effective. Determining the audience and what it needs, then organizing the content accordingly, will be infinitely more helpful to the user, he explains.
Listening to the needs of the user is essential, concurs Donald O’Brien, program manager, U.S. Department of Defense E-Mall, Fort Belvoir, Virginia. O’Brien explains that the government E-Mall site improved with the use of customer focus groups and the implementation of an issue tracker where any user or vendor can input problems or comments. The comments are then entered into a prioritized to-do list.
Issue tracker response is based on a three-tiered system. Hot lines—items that keep the site from being used—are fixed immediately. Warm lines are workaround problems, and these are corrected within one to two weeks. Issues that do not fall under one of the first two categories are called scheduled software releases and are addressed quarterly.
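O’Brien’s three tiers map naturally onto a triage routine. The tier names come from his description; the function and its fields are a hypothetical sketch:

```javascript
// Hypothetical triage following O'Brien's three-tier scheme:
// hot lines block use of the site, warm lines have a workaround,
// everything else waits for a scheduled quarterly release.
function triage(issue) {
  if (issue.blocksUse) {
    return { tier: "hot", response: "fix immediately" };
  }
  if (issue.hasWorkaround) {
    return { tier: "warm", response: "fix within one to two weeks" };
  }
  return { tier: "scheduled", response: "next quarterly release" };
}
```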
O’Brien gives an example of a warm-line glitch. The E-Mall site’s servers are located in Salt Lake City, Utah; Battle Creek, Michigan; and Falls Church, Virginia. These servers must work together, and to maintain efficiency, cookies—information that a server stores on the client side and reads back on later requests—are shared among the servers. Problems arose when a user wanted to update his or her profile because the changes affected the cookies. When a user would move from server to server, the new server would not recognize the user as having logged on, forcing the user to log on repeatedly. “This was a major irritant,” O’Brien expresses. “By implementing the issue tracker, site managers were able to respond to this problem much more quickly.”
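One common remedy for the problem O’Brien describes is to key the user’s session on a single shared identifier that every server can look up, rather than on per-server state. The article does not say how the E-Mall team fixed it; this sketch simulates the shared-lookup idea with an in-memory store and is purely illustrative:

```javascript
// Illustrative only: a session store all cooperating servers can reach,
// keyed by one cookie value, so any server recognizes a logged-in user.
var sharedSessions = {}; // stands in for storage shared across servers

function logIn(userId) {
  var sessionId = "s-" + userId; // toy ID; real IDs must be unguessable
  sharedSessions[sessionId] = { userId: userId };
  return sessionId;              // the value the browser keeps as a cookie
}

function recognize(serverName, sessionId) {
  // Every server consults the shared store instead of local state, so a
  // profile update or a hop between servers no longer forces a re-login.
  var session = sharedSessions[sessionId];
  return session ? session.userId : null;
}
```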
Unger, through his work with DefenseLINK, also contends that Web site quality control is important. He shares that the DefenseLINK site has many contributors from the Office of the Secretary of Defense, and for each, the process for submitting content differs. He explains that some staff use simplified automated processes, while others create the pages end-to-end. Multiple contributors allow coding and spelling errors to creep in, making quality control difficult.
Quality control is a topic on which DefenseLINK designers are actively working. “We are putting into place a content management publishing system that is more automated. We are hoping to use that as a way to implement quality control up front. We are developing this in house,” Unger offers. “Our initial prototype is going to be built on a ColdFusion application server with an Access database. Our production environment is going to be HP-UX [Hewlett-Packard UNIX]—again running the ColdFusion application server with an Oracle database in the back. We’re currently in our first beta right now.”
The concept is that content contributors will have a Web-based system in which they can create new products—such as a press release or news article—and edit existing ones. Additionally, built-in editorial controls will facilitate online submission of information. A superior will approve the content, and an editor will give an overall approval for publishing—all completed online, Unger says. Quality control will be re-established as a joint effort.
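The approval chain Unger outlines is essentially a small state machine: a contributor drafts, a superior approves, an editor publishes. The states and transitions below are a hypothetical reading of his description, not the actual DefenseLINK design:

```javascript
// Hypothetical state machine for the editorial flow Unger describes.
// Each state maps the actions allowed in it to the state that follows.
var transitions = {
  draft:     { submit: "submitted" },
  submitted: { approve: "approved", reject: "draft" },
  approved:  { publish: "published", reject: "draft" }
};

function advance(state, action) {
  var next = transitions[state] && transitions[state][action];
  if (!next) {
    throw new Error("illegal action '" + action + "' in state '" + state + "'");
  }
  return next;
}
```

Encoding the workflow this way is what lets quality control happen “up front”: a press release simply cannot reach the published state without passing through approval.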
“Web site successes come from a strong partnership between the techies and the communicators,” Wolfson summarizes. “For example, a television cameraman is not necessarily a good newscaster. You need both to disseminate the evening news. You need a marriage of skills.”
The process of disseminating information on the Internet is still being learned, Wolfson emphasizes. “It is analogous to the introduction of the television and how we first communicated through this medium. At first, stations broadcast five to 10 minutes of news once a day. Now, we disseminate information much differently.”