*** NOTICE ***


The ERIC Clearinghouse on Information & Technology
web site is no longer in operation.


The United States Department of Education continues to offer the


ERIC Database




All ERIC Clearinghouses plus AskERIC will be closed permanently as of December 31, 2003.


In January 2004, the Department of Education will implement a reengineering plan for ERIC. The new ERIC mission continues the core function of providing a centralized bibliographic database of journal articles and other published and unpublished education materials. It enhances the database by adding free full text and electronic links to commercial sources and by making it easy to use and up to date.


From January 2004 until the new ERIC model for acquiring education literature is developed later in 2004, no new materials will be received and accepted for the database. However, the ERIC database will continue to grow, as thousands of documents selected by the ERIC clearinghouses throughout 2003 will be added. When the new model is ready later in 2004, the new ERIC contractor will communicate with publishers, education organizations, and other database contributors to add publications and materials released from January 2004 forward.


Please use www.eric.ed.gov to:

  • Search the ERIC database.
  • Search the ERIC Calendar of Education-Related Conferences.
  • Link to the ERIC Document Reproduction Service (EDRS) to purchase ERIC full-text documents.
  • Link to the ERIC Processing and Reference Facility to purchase ERIC tapes and tools.
  • Stay up to date on the ERIC transition to a new contractor and model.

Archived version of the site:


Digital reference may be known by several names, including electronic question answering, online reference, and e-mail-based reference. Until recently, digital reference was considered a novel way for organizations to provide information. Now it is ubiquitous, conducted at libraries, schools, universities and many other organizations. Digital reference affects, and is affected by, the organizations in which it is conducted.

Recognizing the importance of such knowledge, the United States Department of Education recently commissioned a large-scale study of its own digital reference services. At the Department of Education, digital reference is conducted in centers where information specialists use e-mail to provide answers to consumers' questions. Information specialists at the Department of Education consider themselves part of an inter-office Web and email-based reference service, and refer to themselves collectively as “ED.gov.”

Because access to computers and the Internet is increasing, the number of consumers sending e-mail questions to ED.gov is growing. The centers received over one million digital reference questions in 1999 and the number is burgeoning. ED.gov information specialists are dedicated to customer service, but the increased demand for information strains departmental resources and can pose difficulties.

The United States Department of Education was an early and innovative advocate of e-mail and Web-based information service. In the early 1990s, long before most federal agencies were online, the Department of Education encouraged each of its offices to craft its own infrastructure and to develop procedures and policies for answering consumers' questions. For a long time, this approach served the individual centers well. Sharing information became difficult, however, as other offices and new users wanted access to it. The use of different information formats made it difficult to standardize processes, and to track and archive questions across ED.gov centers.

In 1999, the Department of Education contracted with the Virtual Reference Desk at the ERIC Clearinghouse on Information & Technology at Syracuse University to conduct a major research initiative. The intention of the research was to reveal issues about, and find recommendations for, optimal provision of information. The study featured methodical data gathering, and began with an assessment of the current processes, procedures and challenges in ED.gov centers. Outcomes included suggestions for improving information delivery to customers, recommendations for policy implementation, software requirements for possible future automation of processes, and suggestions for training managers and specialists.

The findings may be applicable to digital reference services in many organizations.

Key Findings

ED.gov specialists reported that they encountered challenges with almost every aspect of answering and tracking consumers' questions. They found it was becoming increasingly difficult to:

  • check quality and content of referred answers
  • formalize and share reference lists (of other specialists and their areas of expertise)
  • share and use Frequently Asked Question (FAQ) files and archives
  • keep FAQs accurate, up to date and consistent across related resources
  • differentiate among types of questions and answers
  • understand standards and procedures for tracking, archiving and referring questions
  • educate consumers of varying expertise levels
  • identify consumer populations and set priorities for their levels of service.

Certain philosophical differences created divergent policies and practices, and were particularly troublesome in the centers' quickly changing environments. Specialists often disagreed with each other, and with managers, about the identity of their primary customers and about how best to answer their questions. One specialist, for example, might believe that a United States senator's question must be answered first; another might believe that the general public should be served first.

Philosophical differences over the identity of the primary customer were exacerbated by the fact that new users came not only from the centers' traditional user population, but from new populations as well. New users had varying levels of skill and some asked high-context questions and required rich synthesis in their answers. Others had little knowledge about how to phrase queries or conduct searches.

New customers also asked new kinds of questions and some could have been sent to inappropriate offices for answers. Specialists had to identify questions that were within the scope of their offices (in-scope questions) vs. questions that were outside the scope of their offices (out-of-scope questions). Once a specialist determined that a question was in-scope, he or she had to choose the most appropriate format out of many for an answer.
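The in-scope/out-of-scope triage described above can be sketched in code. The office names and scope keywords below are invented for illustration only; the study does not publish the actual ED.gov routing rules.

```python
# A minimal sketch of question triage, assuming hypothetical offices and
# scope keywords (invented for illustration, not the Department's rules).
OFFICE_SCOPES = {
    "student_aid": {"loan", "grant", "fafsa", "scholarship"},
    "civil_rights": {"discrimination", "accessibility", "title ix"},
}

def triage(question: str, home_office: str) -> str:
    """Return the office that should answer, or 'out-of-scope' for referral."""
    text = question.lower()
    # In-scope: the question matches the receiving office's own keywords.
    if any(k in text for k in OFFICE_SCOPES.get(home_office, ())):
        return home_office
    # Otherwise look for another office whose scope matches (a referral).
    for office, keywords in OFFICE_SCOPES.items():
        if any(k in text for k in keywords):
            return office
    return "out-of-scope"
```

In a real service the keyword match would be replaced by a specialist's judgment or a richer classifier, but the decision structure (answer, refer, or reject) is the same.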

A growing number of online information resources overwhelmed specialists with more information, new interfaces, more learning curves, and greater expectations for service. These forces combined to create both internal and external challenges to the centers' management of answers.

Internal Challenges in Managing Answers

The term tracking refers to the monitoring of a question's progress as it is sent to other information providers. It is an important activity in customer service. Tracking at the ED.gov centers, however, lacked consistency across centers, and varied according to time, tools and media types. Commitment to overcoming these challenges also varied across centers and according to the abilities of the specialists who worked with the systems.

Archiving refers to the storage of answers for possible reuse, and presented some of the same challenges as tracking. Like tracking, archiving was a fragmented process that differed from center to center, and it was constrained by a lack of standards and insufficient support.

FAQs are educational tools developed from archived answers to frequently asked questions. At the ED.gov centers, FAQs were difficult to find and therefore underused, resulting in wasted resources and the continual re-creation of answers.

In addition, the procedures for tracking, archiving and creating FAQs were developed in separate offices, each of which selected its own database platform, processes, operating system and applications. These "islands" of development created disconnects when centers attempted to share information internally.
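The "islands" problem suggests one remedy: tracking, archiving and FAQ creation all operate on the same underlying unit, a question and its answer. A shared record format, sketched below with illustrative field names (an assumption, not the Department's actual schema), could serve all three processes at once.

```python
# A sketch of a shared question record that could support tracking
# (the referral trail), archiving (the stored answer) and FAQ creation
# (the reuse count). Field names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class QuestionRecord:
    question_id: str
    text: str
    received: datetime
    center: str                                    # center currently holding it
    referrals: list = field(default_factory=list)  # tracking: where it has been sent
    answer: str = ""                               # archiving: filled when resolved
    reuse_count: int = 0                           # FAQ: frequent answers get promoted

    def refer(self, to_center: str) -> None:
        """Record a referral so the question's progress can be tracked."""
        self.referrals.append(to_center)
        self.center = to_center
```

With every center reading and writing the same record, a question referred from one office to another carries its history with it instead of disappearing into a separate, incompatible system.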

External Challenges in Managing Answers

The centers faced two kinds of challenges that came from outside the organization. First, current events in the news triggered increased demand for information. Because of their dynamic nature, questions about current events allowed little time for coordination of timely, consistent and accurate department-wide answers.

Second, journalism influenced the centers in two ways: formally and informally. Formally, professional journalists often wrote about a Department of Education office and disseminated the center's e-mail address in newspapers and other media without first contacting the center. Consumers, having read the articles in their local newspapers, went to their computers in great numbers to access the service. The “traffic” of questions then spiked, straining workloads and answer quality.

Informally, non-professional “journalists” (parents, students, teachers and researchers) cut, pasted and electronically disseminated previously published articles into school newsletters, personal home pages and consumer guides. Again, spikes in traffic resulted.

Nevertheless, it is important to note that growing consumer interest generated by formal or informal dissemination of United States Department of Education information must be served and encouraged. To do that, ED.gov specialists and managers must acquire new skills to address emerging issues in digital reference.


In summary, the findings of this study can be grouped into six issues:

  1. Coordination of standards, documents and procedures.
  2. Consistency of policies from center to center.
  3. Standardization across centers.
  4. Development of software.
  5. Training for center specialists and managers.
  6. Resource sharing.

Solutions to these challenges cannot be accomplished easily or quickly, but the following recommendations may help ED.gov (or any large organization practicing digital reference) address the issues.


The Department of Education already recognizes the importance of re-thinking digital reference, and the recommendations made here are intended to provide specific, actionable steps that will support the goal of improved services. While each recommendation below may not be appropriate for all other organizations, some may be useful to other government agencies or large organizations. The recommendations are predicated on the assumption that each digital reference service must consider a broader view of its own services than it may be accustomed to. Recommendations include:

  1. Choose a champion. The momentum needed to induce large-scale change must be spearheaded by a person or group that is both influential and well resourced. In the case of ED.gov, a group of specialists self-organized, procured a budget and made digital reference optimization a mission. In other organizations, the champion could be a highly placed executive. Whether a group or an individual, the champion's duties are to procure resources and determine the optimal level of centralization.

  2. Determine level of centralization. Should one centralized authority require all centers to use the same tools and policies? Route their questions in the same way? Format their answers similarly? If not, how much standardization will there be across centers? These questions should be answered before implementation begins.

  3. Incorporate AskA software into a Department-wide intranet. Once high level issues such as policy, format and procedures have been decided upon, they can be implemented using software. Some digital reference services use existing software and build on customized functionality. Others use software designed specifically for AskA services, such as the Incubator created by the Virtual Reference Desk Project at the ERIC Clearinghouse on Information & Technology.

  4. Employ QuIP protocols to enable resource sharing across Centers. QuIP is the Question Interchange Protocol (created by R. David Lankes, the director of the ERIC Clearinghouse on Information & Technology), and is designed to facilitate the exchange of questions and answers among organizations. It could be used at ED.gov to automate processes and provide faster and more accurate answers. More information about QuIP is available at http://www.vrd.org/Tech/QuIP/1.01/1.01d.htm.
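As a rough illustration of what protocol-based question exchange looks like, the sketch below builds and parses a simple question-interchange message. The element names are assumptions made for illustration only, not the normative QuIP 1.01 schema; the URL above gives the actual specification.

```python
# An illustrative question-interchange message in the spirit of QuIP.
# The element names ("interchange", "question", "origin") are assumptions
# for illustration, not the normative QuIP 1.01 schema.
import xml.etree.ElementTree as ET

def build_message(question_id: str, text: str, origin: str) -> str:
    """Serialize a question into a small XML interchange message."""
    msg = ET.Element("interchange")
    q = ET.SubElement(msg, "question", id=question_id)
    ET.SubElement(q, "text").text = text
    ET.SubElement(q, "origin").text = origin
    return ET.tostring(msg, encoding="unicode")

def parse_message(xml_text: str) -> dict:
    """Recover the question fields from a received interchange message."""
    q = ET.fromstring(xml_text).find("question")
    return {"id": q.get("id"),
            "text": q.findtext("text"),
            "origin": q.findtext("origin")}
```

The value of such a protocol is that both ends agree on the message structure, so a question routed from one service to another arrives with its identity and provenance intact, ready for automated tracking.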

  5. Use checklists to translate policy into actionable items.

  6. Formalize the Frontline Forum (an ED communications group) and use it to coordinate software specification and standardize operations across centers.

  7. Coordinate a "fast-response" team to provide fast and accurate answers to questions about current events, thus preventing traffic spikes in the centers.

  8. Research commercially available software packages to determine if they support center processes and procedures.

  9. Create training goals and plans, and decide on implementation mode(s).

  10. Evaluate daily operations using checklists.

  11. Continuously gather and use feedback to upgrade systems and services.

These recommendations address specific challenges at ED.gov, but may also serve to inform and support digital reference practice at other organizations that face increasing expectations for digital reference service.

Joanne Silverstein is head of research for the Information Institute of Syracuse at Syracuse University.
