A recurring theme in cataloging over the last century or so has been the integration of differing cataloging traditions. One gap that has been closing over the years is the divide between library and archival cataloging. Michael Gorman, author of Cataloguing in an Electronic Age, examines this divide and concludes that the two traditions are being brought together in the latest cataloging rules. This harmonization, he explains, should be applied to the various parts of the American Memory Project and other digital archives. His concern, shared by many in the field of cataloging, is that differing methods of organization be integrated so as to enable the archiving of the Internet's contents.
The American Memory Project is a digital-artifact initiative of the Library of Congress designed to make selected collections widely available in electronic form. Many librarians encourage this theme, because the idea of cataloging and archiving the Internet has intrigued the profession for years, and its scope has expanded recently to encompass vast areas of the Web. What to do, and how to do it on such a large scale, has perplexed catalogers and librarians; and as the years go by, concern grows for information that could be lost forever.
Even though it was understood from the beginning that cataloging everything on the Internet would be impossible, it was agreed that much of the Internet's resource bank should be cataloged. Unlike traditional library materials, the Internet's inventory of Web pages grows at an incredible rate. A librarian's job is to organize information for retrieval and use, but certain obstacles stand in the way of typical cataloging practices. For one thing, the quality and stability of information on the Internet vary widely.
A natural result of this instability in the online environment is the desire to store and archive digital material. According to Michael Gorman, a few optimists believe that governments and private companies will ensure the survival of the Internet's digital records. He responds that the cost and practicalities of such schemes boggle the mind and defy credulity. Funding, however, has been awarded: $100 million to the Library of Congress to develop a national digital strategy for preserving the country's important materials of intellectual significance that happen to exist in digital form.
Another example of Internet archiving comes from the U.K. Web Archiving Consortium, which is archiving selected U.K. Web sites in order to study the outcome. Six thousand sites were chosen for their historical and cultural importance. The project manager, Magnus Research, aims to lengthen the average lifespan of a Web site from 44 days to about a century.
Along with programming and data management, hardware must also be considered for long-term storage, and storage alone only preserves the digital files: the viewers, readers, and players needed to use them require just as much attention. File formats, hardware formats, and operating-system platforms change with great regularity, and software companies have only recently begun to focus on long-term preservation software. Hurdles reside not only on the physical side of the issue but also on the legal side: the 1998 Digital Millennium Copyright Act has made it more difficult to legally copy digitized data. Because long-term preservation is expensive, and digitized information grows far faster than the resources available to preserve it, access to Internet archives may be limited for an unknown period of time.