VTF Bibliography

Agnew, G. (2003). Developing a metadata strategy. Cataloging & Classification Quarterly, 36 (3/4), 31-46.

Agre, Philip E. (1995). "Institutional Circuitry: Thinking About the Forms and Uses of Information". Information Technology and Libraries, Dec. 1995, v. 14:4, p. 225-231. (Also note related discussion in the April 1996 issue of his newsletter.) "Information is not a natural category," Agre writes, but rather "an object of certain professional ideologies" (i.e., those of librarians and information technologists). Librarians share an ideology that considers 'information' a kind of value-neutral, homogeneous substance. This has served us well in many ways, but has also caused us to distort the ideologies of those whom we profess to serve--i.e., other professional communities--by subjecting their respective literatures to ill-fitting description, indexing, and classification. Agre sounds a bit like Clay Shirky (2005), where the latter argues that LCC (versus social tagging and folksonomies) does violence to the dynamism and specificity of real ideas and communication. And he seems to contradict Kurth (in press), who sees the cataloger as a welcome mediator among diverse scholarly communities. Agre concludes that "it may become possible--and perhaps even unavoidable--for librarians to abandon the ideology of information and replace it with the specialized ideology that governs the circuitry of a particular institution." In other words, librarians may increasingly attach themselves to particular disciplines or research teams (rather like journalists embedded with military units, it seems), where, through customized bibliographies, classification, reader's guides, etc., they would identify with and represent the native ideologies and literatures of their hosts. Given the current trend away from "cataloging" toward "metadata consulting", one wonders if the transformation hasn't already begun. [DL]

Ahronheim, J. R., & Marko, L. (2000). Exploding out of the MARC box: Building new roles for cataloging departments. Cataloging & Classification Quarterly, 30 (2/3), 217-225. Describes non-MARC metadata activities undertaken at the University of Michigan Library and offers advice to other institutions contemplating the same. Librarians should be encouraged to attend non-library metadata conferences, in order to learn what standards and tools are being used by other professional communities. The authors recommend rapid implementation of pilot metadata projects following training, so that staff benefit from immediate hands-on experience (p. 222).

Association of College and Research Libraries. Guidelines Regarding Thefts in Libraries. (2003) http://www.ala.org/ala/acrl/acrlstandards/guidelinesregardingthefts.htm. Also see reformatted version (posted to VTF Web site). Discusses (among many other things) shelf listing as a tool for inventory-taking and discouragement of in-house theft. The author is mostly thinking about special collections, but the argument lends itself to regular collections as well.

ALCTS Cataloging & Classification Section Executive Committee. ALCTS and the Future of Bibliographic Control. Position paper, distributed at ALA Midwinter 2007. In July 2006 CCS Exec was charged by ALCTS Exec "with developing a series of recommendations or discussion points for the next steps that ALCTS should take to enhance its leadership position with respect to the changing nature of bibliographic control (cataloging and classification)". An important stimulus for this charge was the decision by LC to discontinue creating authority records for series titles. "The model of a national library that provides cataloging data in the manner to which we are accustomed is changing," the authors note. Communication habits need to change in order to allow quicker response times to rapidly changing conditions (i.e., "so that the library community at large is not blindsided by unilateral or small-group actions on the part of major members of the community"). Toward this end, support for 'hot topic' forums, virtual conferences, RSS feeds, wikis, and blogs is encouraged. Some other key points: "the concept of 'legacy materials' needs to be treated with caution ... we do not assume that non-digital resources are simply waiting to be digitized ... We [also] treat with skepticism the concept of 'legacy metadata' ... MARC-based metadata, in particular, has a long and useful life ahead." Also: "Budgets are the outcomes of political processes ... technical services librarians too often display passive negativity with regard to budget and staffing decisions ... we treat phrases such as 'common-sense business decisions' and 'fiscal inevitabilities' with skepticism". Note the statement on p. 7: "ALCTS understands and respects the most diverse needs of all types of library users for all types of materials. There are no 'transitional' users. Library users comprise both those who sometimes, or always, prefer non-digital resources and those who now prefer digital resources, among others. All are to be treated with equal respect and without condescension. Similarly, the concept of 'legacy materials' needs to be treated with caution. Whatever the values conferred by digitization of 'non-born-digital' resources, we realize that those are added values, beyond those inherent in the resources as originally created. In other words, resources which are not universally available are not inherently of lesser value. We do not assume that non-digital resources are simply waiting to be digitized."

Bauer, Kathleen. (2004). "Trends in Electronic Content at the Cushing/Whitney Medical Library: 1999-2003." Journal of Electronic Resources in Medical Libraries, 1:4.

Beacom, M. (2005). "Reading alphabet soup: RDA, the JSC, the PCC, and the future of cataloging." (See also text version.) PCC Participants Meeting ALA Annual Conference, Chicago, Illinois, June 26, 2005. Keynote address to ALA 2005 PCC Participants Meeting. Discusses relationship of PCC to RDA, with an eye toward the future as envisioned by Dale Flecker, Lorcan Dempsey, Deanna Marcum, and Roy Tennant. Stresses importance of high-quality metadata as foundation for any repurposing of library staff and services. Suggests how the PCC can help catalogers flourish in the age of the 'recombinant library' (Dempsey's phrase). Beacom's challenge to the PCC is quite relevant to our committee's charge: "How can PCC work to retrain catalogers through its programs to prepare them to function effectively in a diverse information environment?" In our own case we need to ask: how is the Catalog Department preparing/educating its members to succeed in an increasingly diverse and unpredictable information environment? Are we developing expertise in non-MARC, non-AACR metadata, acquiring database and programming skills, cultivating entrepreneurial spirit, etc.?

Bibliographic Services Task Force (2005). See University of California, Bibliographic Services Task Force (below)

Big Heads (Technical Services Directors of Large Research Libraries Discussion Group). Minutes from the ALA Chicago meeting, June 24, 2005 (http://www.loc.gov/library/bigheads/bigheads-june05.html, accessed Jan. 2, 2006). Includes an RDA briefing by Jennifer Bowen, CC:DA rep to the JSC. The new title (i.e., RDA vs. AACR3) signifies a framework more hospitable to digital resources; newly emerging database architectures (such as digital repositories [?] and content management systems); compatibility with pre-existing AACR records; interoperability with data coming from outside the OPAC; organization around FRBR; separation of content from display standards; commitment to "plain English" (i.e., minimal jargon); and outreach to non-library metadata communities. The code is to be organized in three parts: (1) resource description, (2) relationships among records, and (3) authority control. Interesting fact: AACR has been translated into 25 languages, so tinkering with the rules has significant international ramifications. Fears were expressed about expensive new training and documentation, plus retrofitting of old catalog records to interoperate in the new RDA environment.

Bishoff, Liz. (2004). The Collaboration Imperative. Library Journal, v. 129, no. 1, Jan. 2004, pp. 34-35. PDF full text retrieved through LibraryLit 11/3/05. Bishoff--who is VP of OCLC's Digital Collection and Preservation Services--argues that librarians need to collaborate more with archivists and museum curators. Areas of mutual interest and benefit include IMLS grant-writing, scaling up to more cost-effective project management, collaborative training and best practices, and, perhaps most importantly for our purposes, interoperability through common metadata standards. The current proliferation of standards limits our ability to provide "seamless integrated access" to our users: Encoded Archival Description (EAD) is used for finding aids, Dublin Core for digital photos and maps, MARC for e-books, and VRA Core for art resources.
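The interoperability problem Bishoff raises is commonly addressed with metadata crosswalks. As a minimal sketch (the field mappings below are illustrative assumptions, not drawn from Bishoff's article), a crosswalk that flattens records from different schemas onto a common Dublin Core subset might look like this:

```python
# Minimal metadata crosswalk sketch: map records from different schemas
# (MARC-style, VRA-style) onto a shared Dublin Core subset so they can
# be searched together. Field names are illustrative only.

CROSSWALK = {
    "marc": {"245a": "title", "100a": "creator", "260c": "date"},
    "vra":  {"title": "title", "agent": "creator", "date": "date"},
}

def to_dublin_core(record: dict, schema: str) -> dict:
    """Translate a source record into a flat Dublin Core dict."""
    mapping = CROSSWALK[schema]
    return {dc: record[src] for src, dc in mapping.items() if src in record}

marc_rec = {"245a": "Walden", "100a": "Thoreau, Henry David", "260c": "1854"}
vra_rec = {"title": "Water Lilies", "agent": "Monet, Claude", "date": "1906"}

print(to_dublin_core(marc_rec, "marc")["creator"])  # Thoreau, Henry David
print(to_dublin_core(vra_rec, "vra")["title"])      # Water Lilies
```

Real crosswalks (e.g., the Library of Congress MARC-to-Dublin-Core mapping) are far more detailed, and lossy in both directions; the point here is only the mechanism behind "seamless integrated access."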

Bogan, R. A. (2004). Redesign of database management at Rutgers University libraries. In B. L. Eden (Ed.), Innovative redesign and reorganization of library technical services. (pp. 161-177). Westport, Conn.: Libraries Unlimited. Rutgers University Libraries (RUL) held a “Digital Futures” half-day retreat in 2003 to have staff members discuss upcoming changes. The management model used was called “Core Competencies of the Corporation,” developed by C.K. Prahalad and Gary Hamel (p. 163). Bogan reports that members of the Database Management team (or “DBM”, roughly equivalent to Yale's Catalog Management Team) had a tough time of it. They couldn't figure out how their jobs fit into the new vision of the library, and concluded that they would either be reassigned to new duties or be out of a job altogether (162). The demoralizing effect of the retreat could have been avoided, Bogan suspects, if technical service managers had effectively followed the Competencies model, that is: envision how staff job responsibilities will evolve, identify skill sets needed to assume these responsibilities, and build the organizational competencies that will support these responsibilities and skills. (Judging from the reaction of the DBM unit, it would seem reasonable to add a fourth step, namely, communicate the organization's vision inclusively and effectively.)

Based on the Core Competencies of the Corporation model, the key question to ask of oneself would be: 'What do we do best, that no one else does, that would be difficult to imitate, and that is valuable to the library and its patrons?' The typical DBM response at the “Digital Futures” retreat had been “we fix other people's mistakes”. This would not do. Upon further reflection, the team realized that its deep understanding of cataloging principles and conventions, and its ability to navigate and find things using the catalog [or something like that; need to check], was what they do best. Projecting ahead, they might want to define their group's core competency as the ability to obtain virtually any metadata from virtually any data source, schema, and interface. This fairly expansive though precise self-understanding would facilitate grafting new skills onto the old, and allow the changing nature of one's work to look like the logical extension of one's earlier expertise, which in fact it is. Bogan cites Dorothy Leonard-Barton on core competencies: they yield competitive advantage, build up incrementally, and resist imitation (p. 163, citing Leonard-Barton, 1995, p. 4).

Buckland, Michael K. "Information as Thing". Journal of the American Society for Information Science 42:5 (1991), p. 351-360. John Wiley & Sons. Cited in Agre (1995). Buckland distinguishes three uses of the word "information": (1) information-as-process; (2) information-as-knowledge; and (3) information-as-thing.

Calhoun, Karen (2006). The Changing Nature of the Catalog and its Integration with Other Discovery Tools. DRAFT 2B. February 21, 2006. 45 p. (incl. large appendices). http://dspace.library.cornell.edu/bitstream/1813/2670/1/LC+64+report+draft2b.pdf. Statement of problem: "a large and growing number of students and scholars routinely bypass library catalogs in favor of other discovery tools, and the catalog represents a shrinking proportion of the universe of scholarly information" (p. 5) [does she prove this?]. By way of background, Calhoun cites LC's "Bicentennial Conference on Bibliographic Control in the 21st Century", and especially Action Plan Item 6.4, namely, to "support research and development on the changing nature of the catalog to include consideration of a framework for its integration with other discovery tools." Calhoun was enlisted as principal investigator for this item, one result of which seems to be this report. Calhoun describes four strategies for confronting the decline of the catalog. Maintaining the status quo, which is not recommended, would be a "harvest" strategy, as it simply reaps the gains of earlier strategic investments (and doesn't necessarily sow new seeds for the future) (p. 12). The three recommended strategies are "Extending" (e.g., Endeca at NCSU), "Expanding" (e.g., consortial efforts among California libraries), and "Leading" (e.g., early implementers of new technology?). A schematic pyramid on p. 14 shows the foundation strategy to be "Extending". Of particular relevance to our group, this category includes "Support RDA with qualifications; support experimentation with FRBR; simplify cataloging practice to a set of basic elements; eliminate LCSH". This last item resurfaces in Calhoun's blueprint under section 4, "Innovate and Reduce Costs" (p. 17), where she also recommends "accepting as much cataloging copy as possible without review or modification" and, in 4.2.3, "Abandon the attempt to do comprehensive subject analysis manually with LCSH in favor of subject keywords; urge LC to dismantle LCSH". Section 9 speaks to another aspect of our charge: "Develop, Retrain, and Recruit". Some other items of interest: 10.1, "Expand the number of staff members who can write effective grant proposals"; 10.6, "Introduce a new product/service innovation program and process"; and 10.7, "Encourage and reward an entrepreneurial spirit." Interviewees included Lorcan Dempsey, Dale Flecker, David Lindahl, Clifford Lynch, and Roy Tennant. Were any catalogers interviewed? An 8-point "Vision for Change" is included on p. 16.

Calhoun, Karen. (2006). “On Competition for Catalogers”. Address to the PCC Participants Meeting at ALA Midwinter (http://www.loc.gov/catdir/pcc/archive/pccpart06m.html). Catalog departments have achieved economies of scale through cooperative agreements and resource sharing. At the same time, the increasingly networked environment of the World Wide Web has created unprecedented job insecurity for catalogers. Google, Amazon, and Yahoo, for example, have implemented knowledge management tools that appear more advanced than those offered by libraries. Traditional cataloging in an OPAC doesn't seem to scale up, and users are increasingly turning elsewhere to find what they need. (Calhoun cited extremely low statistics on OPAC use, but several in our group questioned their accuracy.) The OPAC is a diminishing portion of the researcher's 'infosphere'. Calhoun argued that catalogers should be more involved in managing the Web. She also suggests catalogers are well-equipped to survive in a more disintermediated library environment. It is we, after all, who created the self-service OPAC in the first place, allowing users to bypass library staff and go straight to the stacks with call number in hand. So we have long been champions of "user empowerment" and should not be afraid to employ new technologies that make users even more self-sufficient. [5/28/06]

Calhoun, K. (2003). Technology, productivity and change in library technical services. Library Collections, Acquisitions, and Technical Services, 27 (3), 281-289. Describes changes in technical services implemented at Cornell (Calhoun, 2003). Following the "Future Search" management model developed by Cornell's Organizational Development Services (ODS), Catalog Department managers attended a full-day retreat in February 2001. By January 2002, four staff members had been transferred from Cataloging to a new metadata services group. While the loss to cataloging operations was palpable, Calhoun believes the tradeoff was justified. She adds that coming up with the strategic plan was the easy part; getting stakeholders to buy in was much more difficult. Her main challenge was persuading skeptical colleagues that “to be successful in the long run, technical services must play a central role in digital library design and development and in e-resource management.” In the end, Calhoun feels she was able to accomplish this, but it would be important to know the long-term consequences of reallocating staff and other resources away from traditional cataloging. If the strategy proved successful, it would be worth considering a similar approach at Yale, possibly in collaboration with the Organizational Development Center or the School of Management.

Carlson, Scott, "Lost in a Sea of Science Data". Chronicle of Higher Education, 00095982, 6/23/2006, vol. 52, issue 42. Database: Full text via Academic Search, accessed 8/7/06. Catalogers may find themselves increasingly in demand to help organize and preserve experimental research data from university laboratories. Librarians at Purdue, for example, are already working with scientists there to supply metadata for data sets stored on networked servers. Due to the threats of software/hardware obsolescence, and turnover in laboratory personnel, it is suggested that a centralized, library-run repository may be a more effective solution in the long term. Does Yale have something like this in mind as it implements FEDORA/VITAL? [8/7/06]

Coyle, Karen, & Hillmann, Diane (2007). "Resource Description and Access (RDA): Cataloging Rules for the 20th Century". D-Lib Magazine 13:1/2, Jan./Feb. 2007; ISSN 1082-9873. http://www.dlib.org/dlib/january07/coyle/01coyle.html. The authors argue that RDA, should it continue to develop along current lines, will fail to provide a viable cataloging framework for the 21st century. The rules have not kept up with changes in electronic publishing and distribution: "the switch from physical media formats distributed through traditional channels to web-distributed digital information pulled the last remaining rug from under catalogers used to relatively stable materials. Descriptive rules based on predictable, stable and named 'sources of information' (title pages, colophons, etc.) about a resource, with a prescribed order of preference, were not adaptable to resources without title pages or pages, and not suitable for resources that existed in a state of constant change."

Dempsey, Lorcan (2006). "Libraries and the Long Tail: Some Thoughts about Libraries in a Network Age". D-Lib Magazine, April 2006, v. 12, no. 4, ISSN 1082-9873 (http://www.dlib.org/dlib/april06/dempsey/04dempsey.html). The 'long tail' argument was first articulated by Chris Anderson in a 2004 Wired Magazine article. The basic idea seems to be that, before the Web, purveyors of information and media often had to cater to the lowest common denominator of public opinion in order to turn a profit. Aside from exceptional cases such as well-endowed research institutions (and, I assume, high-end enthusiast specialty stores), where narrow profit margins were not a concern, service providers could not afford to catalog and warehouse items that might only be purchased or borrowed once every hundred or thousand years.

The so-called "long tail" refers to the string of outlying data points trailing off along one side of an x-y frequency distribution graph (typically drawn to the right), i.e., the things retail vendors have traditionally been loath to supply. But the long tail is also where a lot of the most interesting stuff is. The Internet has made marketing of the long tail more affordable because it aggregates producers, consumers, and content; lowers advertising and transaction costs; and provides inexhaustible storage space that diminishes the need for expensive bricks-and-mortar warehouses.

Dempsey's article looks at prospects for libraries to flourish in this new environment. Citing the 5 Google Book Search libraries (the “G5”) as his sample, Dempsey finds that only about 10% of books account for 90% of circulation (i.e., transactions within a single institution), and that even though 60% of aggregate G5 holdings can be found in only one of the G5 libraries, ILL transactions account for only 4.7% of total circulation. These statistics confirm what is commonly known, namely, that many library items rarely if ever circulate.
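Dempsey's 10%/90% concentration falls out naturally from skewed, Zipf-like usage distributions. A toy computation (synthetic counts, not the actual G5 circulation data) shows the effect:

```python
# Toy illustration of circulation concentration: under a Zipf-like
# distribution, a small head of titles absorbs most transactions.
# Synthetic data; not the actual G5 figures.

def zipf_counts(n_titles: int, s: float = 1.0) -> list[float]:
    """Hypothetical circulation counts proportional to 1 / rank**s."""
    return [1.0 / (rank ** s) for rank in range(1, n_titles + 1)]

def head_share(counts: list[float], head_fraction: float) -> float:
    """Fraction of total circulation captured by the top head_fraction of titles."""
    ranked = sorted(counts, reverse=True)
    head = ranked[: max(1, int(len(ranked) * head_fraction))]
    return sum(head) / sum(ranked)

counts = zipf_counts(100_000, s=1.2)
share = head_share(counts, 0.10)
print(f"top 10% of titles -> {share:.0%} of circulation")
```

With the exponent chosen here the top decile captures the overwhelming majority of transactions; the exact share depends on the parameters, which is why the sketch is illustrative only.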

Dempsey's next step is highly questionable, however. He says: "These numbers suggest that many items in a specific collection may be underused" [emphasis added]. Looking for support, he invokes Ranganathan's famous axiom that "Every book has its reader". If Ranganathan is correct, and every book ever published has at least one person out there wanting (or destined?) to read it, then how do we explain that ca. 90% of our books are not finding their respective destined readers? The answer has to be, Dempsey seems to be saying, that we librarians are not doing a good enough job getting them into public consciousness. But isn't it possible that, while every book may indeed have its reader, it may only get read every hundred or so years? And, for academic research purposes, is this necessarily a bad thing? Could it not simply be a reflection of how vast and deeply ramified our collections are? Dempsey, for his part, appears to hold that, given Ranganathan's axiom, there are probably readers out there who would love to get their hands on the long tail of library collections, but so far haven't been able to because librarians are failing to aggregate holdings properly. In fact, we are surely the "just-in-case" inventories par excellence, a situation that Dempsey may think is no longer sustainable or desirable.

Dempsey offers specific recommendations for libraries, including better intelligence gathering (e.g., usage statistics and other forms of market research), reduced service fragmentation (including, I presume, though he doesn't mention it here, merging RLG with OCLC), providing better D2D (Discovery to Delivery) services, and changing our cost-recovery model (e.g., as in PayPal, where smaller stakeholders are empowered through reduced transaction fees). Dempsey concludes that "in this context, aggregation of supply is about improving discovery and reducing transaction costs. ... Aggregation of demand is about mobilizing a community of users so that the chances of rendezvous between a resource and an interested user are increased ... We need new services that operate at the network level, above the level of individual libraries."

Duranceau, Ellen Finnie (2007). The Role of the Librarian in an Open Access World (presentation slides from a BioMed Central Consultation Workshop, 5/21/07). Shows various ways the MIT library has redefined itself for the 21st century. For example, a new mission was approved in 2003, namely: "to create and sustain an intuitive, trusted information environment that enables learning and the advancement of knowledge at MIT. We are committed to developing strategies and systems that promote discovery and facilitate worldwide scholarly communication" [emphasis in original]. Job descriptions have been replaced or re-written to support the Budapest Open Access Initiative and other international programs: a "Research Group" was formed in 2002 to design and develop discovery tools, reimagine the role of librarians on campus, and partner with the computer science department and Technology Services; a "Metadata specialist" position was designated in 2005 to support the OpenCourseWare program; a "DSpace product manager" was hired in 2006 to promote and develop MIT's digital institutional repository; a "Scholarly publishing consultant" was hired in 2006 to support author rights, institutional/faculty research, and open access publishing; and a "Digital products manager" was hired in 2005 to build systems in support of open access to theses. One question to ask ourselves here is: how similar is our mission to MIT's? And what differentiates us? The role of technology is different, of course, but the commitment to leadership in global education and literacy sounds familiar. MIT seems to have made OA a basic organizing principle. What role does it play at Yale?

Flecker, Dale (2005). OPACs & Our Changing Environment. Presentation made to the PCC Participants meeting in January 2005. Flecker worries that the traditional library OPAC is losing ground to more sophisticated and convenient information retrieval services (e.g., Google and Amazon). Recommends merging the OPAC with portal functions and databases. Worries that librarians will be preoccupied with AACR3 and FRBR debates while we continue to fall behind other IT professions.

Flecker, D. P. (2000). Harvard's library digital initiative: Building a first generation digital library infrastructure [computer file]. D-Lib Magazine. Mentions a program (monetary awards) at Harvard that encouraged staff to propose and implement digital library prototypes (Harvard College Library, 2000). The prototypes would give staff members the skills and experience Harvard needed to jump-start a sustainable digital library infrastructure. The skills list included: (1) administrative, technical, and intellectual metadata; (2) digital formats; (3) reformatting technology and workflows; (4) licensing; (5) intellectual property rights; (6) preservation of digital objects; and (7) knowledge of interface and access.

Gallagher, John, Kathleen Bauer, & Daniel Dollar (2005). "Evidence-Based Librarianship: Utilizing Data from All Available Sources to Make Judicious Print Cancellation Decisions". Library Collections, Acquisitions, & Technical Services, 29: p. 169-179. Discusses the rapid migration from print to electronic resources at the Yale medical library from 1999 to 2004, with an eye toward empirical evidence reflecting and justifying this migration. During those 5 years, e-journal subscriptions grew more than sixfold, from 528 to 3,391. Moreover, between July 1, 1998 and June 30, 2004, the number of patrons observed entering the physical library declined by 32%. Changes are driven by publishing trends, user expectations, and budget issues. While this article is mostly of interest to STM library administrators, other types of libraries are going through a similar, if less dramatic, transformation of their services.

Gentry, Mark, and R. Kenny Marone. "The Virtual Medical Library: Resources at the Point of Need via a Proxy Server." Journal of Electronic Resources in Medical Libraries, Vol. 1(1), 2004, p. 3-20.

Goldsmith, Beth, and Frances Knudson (Los Alamos National Laboratory Research Library). "Repository Librarian and the Next Crusade: The Search for a Common Standard for Digital Repository Metadata". D-Lib Magazine, Sept. 2006, vol. 12:9, doi:10.1045/september2006-goldsmith. This paper describes the testing and decision-making process whereby LANL's Research Library selected MARCXML as the metadata format for its 80 million metadata records, 1.5 million full-text records, and millions of additional complex digital objects. The other schemas under consideration were Dublin Core, MODS, ONIX (Online Information Exchange), and PRISM (Publishing Requirements for Industry Standard Metadata). Contrary to frequent reports in the literature, the complexity of MARCXML turned out not to be overwhelming. Citing analyses by Moen et al. (2005), the authors explain that "although there are well over one thousand tag/subfield combinations, only thirty-seven are used in 90% of MARC records". The large number of available well-defined data elements, however, makes the MARCXML standard unusually robust, transparent, and extensible. As we consider the future of Yale's Catalog Department, it may be helpful to keep in mind this study of MARCXML. As Matthew points out, wide adoption of MARCXML at Yale could help unify local encoding standards, improve system interoperability, lower operating costs, and improve services to our users. [10/1/06]
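For readers who have not worked with the format, a minimal MARCXML record (invented for illustration; not one of LANL's) can be parsed with nothing but the Python standard library, which makes concrete the tag/subfield combinations the Moen statistics count:

```python
# Minimal illustrative MARCXML record, parsed with the standard library
# to pull out the tag/subfield pairs. The record content is invented.
import xml.etree.ElementTree as ET

MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam a2200000 a 4500</leader>
  <datafield tag="245" ind1="1" ind2="0">
    <subfield code="a">Walden ;</subfield>
    <subfield code="c">Henry David Thoreau.</subfield>
  </datafield>
  <datafield tag="650" ind1=" " ind2="0">
    <subfield code="a">Natural history</subfield>
  </datafield>
</record>"""

NS = {"marc": "http://www.loc.gov/MARC21/slim"}
root = ET.fromstring(MARCXML)

pairs = [
    (df.get("tag"), sf.get("code"))
    for df in root.findall("marc:datafield", NS)
    for sf in df.findall("marc:subfield", NS)
]
print(pairs)  # [('245', 'a'), ('245', 'c'), ('650', 'a')]
```

The same traversal, run over a full record dump, is how one would gather tag/subfield usage statistics of the kind Moen et al. report.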

Greenberg, Jane. "Understanding Metadata and Metadata Schemes". Metadata: A Cataloger's Primer (ed. Richard Smiraglia). Haworth Press, 2005, pp. 17-36. Includes discussion of the origins and definition of the term "metadata". The term was coined by Jack E. Myers in 1969 to designate "data about data" in a computer science context (p. 19). Library and information scientists later adopted it to describe the cataloging of electronic resources. In order to stress the functional aspect of metadata, one could call it "structured data about data". Lynne Howarth, in another chapter (see below), compares "metadata" with "bibliographic control".

Gross, Tina, and Taylor, Arlene. "What Have We Got to Lose? The Effect of Controlled Vocabulary on Keyword Searching Results". College and Research Libraries, 66:3, May 2005, pp. 212-230. Gross and Taylor show how, without cataloger-assigned subject headings to provide a common denominator for terminology, the accuracy of keyword retrieval drops precipitously.
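Gross and Taylor's point can be illustrated with a toy retrieval sketch (records and headings invented for the example): raw keyword search matches only the wording an author happened to use, while a cataloger-assigned heading collocates synonymous records under a single term.

```python
# Toy illustration of controlled vocabulary vs. raw keyword search.
# Records and the subject heading are invented for the example.

records = [
    {"title": "Heart attack survival rates", "subjects": ["Myocardial infarction"]},
    {"title": "Outcomes after myocardial infarction", "subjects": ["Myocardial infarction"]},
    {"title": "Coronary thrombosis in older adults", "subjects": ["Myocardial infarction"]},
]

def keyword_search(term: str) -> list[dict]:
    """Match only the words actually present in the title."""
    return [r for r in records if term.lower() in r["title"].lower()]

def subject_search(heading: str) -> list[dict]:
    """Match the cataloger-assigned controlled heading."""
    return [r for r in records if heading in r["subjects"]]

print(len(keyword_search("myocardial infarction")))  # 1: only one title uses that wording
print(len(subject_search("Myocardial infarction")))  # 3: the heading unifies the synonyms
```

The two misses in the keyword case are exactly the retrieval loss the article quantifies.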

Gruber, Tom. "Ontology of Folksonomy: A Mash-up of Apples and Oranges" (2005) [from the author's web site, retrieved 7/15/06]. Cited by Steve Gaither 7/14/06 on ACAT. Gruber was once a research scientist at the Stanford Knowledge Systems Lab, and is currently Chief Architect at Vignette. "Ontologies are enabling technology for the Semantic Web," he writes. "Folksonomies are an emergent phenomenon of the social web." They serve very different purposes. Ontologies are necessary for systems interoperability, since otherwise there is no way to determine whether a tag applied in one system should or will correspond with a similar tag in another. Gruber is interested in finding a way of linking together the various tagging communities through a project called TagOntology. (Cf. Shirky, 2005, below.)

Harvard College Library. Workflow Design Task Force. (2000). Report of the Workflow Design Task Force on Widener Technical Services Relocation to Central Square. Unpublished.

Harvard College Library. Technical Services Working Group (2001). Unpublished report. Accessed from Harvard Intranet July 11, 2001. Describes efforts to consolidate all technical services in a single physical location, relocated from the main research facility to a satellite office building.

Hillmann, Diane, Stuart A. Sutton, Jon Phipps, and Ryan Laundry. (2006). "A Metadata Registry from Vocabularies Up: The NSDL Registry Project". Submitted to the Dublin Core 2006 Conference. http://www.citebase.org/abstract?id=oai:arXiv.org:cs/0605111. Abstract: "The NSDL Metadata Registry is designed to provide humans and machines with the means to discover, create, access and manage metadata schemes, schemas, application profiles, crosswalks and concept mappings. This paper describes the general goals and architecture of the NSDL Metadata Registry as well as issues encountered during the first year of the project's implementation."

Hixson, Carol. (2005). "When Just Doing It Isn't Enough: the University of Oregon Takes Stock". RLG DigiNews, Dec. 15, 2005, vol. 9, no. 6. (http://www.rlg.org/en/page.php?). Responding to rapid growth in digital library collections, the University of Oregon (UO) implemented new metadata standards and workflows. The digital repository architecture of CONTENTdm and DSpace (locally customized as "Scholars' Bank") is managed by the UO metadata staff.

In December 2003 the Catalog Department was renamed Metadata and Digital Library Services (MDLS) "in recognition of its expanded role of implementing and maintaining digital collections--in addition to cataloging and preserving analog materials." Hixson is concerned, however, that UO's metadata efforts are unsustainable, given that the digital content she manages has increased by over 1,000% in one year, with no substantive change in staffing or funding to handle the growing workload. She warns that the "Oregon model of 'just do it' is about to do us in."

Hoebelheinrich, N. J. (2001). Metadata at SUL/AIR. Unpublished report to the Big Heads meeting at Stanford University, June 2001.

Howarth, Lynne C. "Metadata and Bibliographic Control: Soul-Mates or Two Solitudes?". Metadata: A Cataloger's Primer (ed.: Richard P. Smiraglia; see below). Haworth Press, 2005, pp. 37-56. Attempts to "situate metadata in relation to bibliographic control" (p. 38), tracing the latter back to Anthony Panizzi's ninety-one rules for the British Museum catalog in 1841, and Charles Cutter's 1904 discussion of the three "objects" for library catalogs. Like Greenberg, whose chapter immediately precedes this one, Howarth provides some definitions for "metadata" as such. According to Gilliland-Swetland (2000), "Until the mid-1990's, 'metadata' was a term most prevalently used by communities involved with the management and interoperability of geospatial data, and with data management and systems design and maintenance in general." An example of this earlier usage is the File Allocation Table (FAT) used by operating systems to record the names and physical positions of files on a computer disk. It is worth keeping in mind the different usages of the term when considering whether to include it in the name of our department.

Indiana University Task Group on the Future of Cataloging. "White Paper on the Future of Cataloging at Indiana University." (January 2006). 31 p. http://www.iub.edu/~libtserv/pub/Future_of_Cataloging_White_Paper.doc

Excellent report covering much of the same ground we've been discussing here. The first five pages are an executive summary. The rest is divided into two sections: (1) a survey of the landscape and trends impacting cataloging operations; and (2) possible new roles for the online catalog and cataloging staff.

Included in the first section is the observation that: "much hope is placed on the ability of an institutional repository to rescue scholarly communication, yet in no way can it become part of this conceptual global system unless there is a strong backbone of cataloging and metadata" (p. 5). This is timely for us given the imminent introduction of FEDORA at Yale. Also: Google will never be able to provide us with the metadata we need. As Thomas Mann has pointed out, relevance ranking [and, I would add here: full text keyword searching as well] "is expressly designed and optimized for quick information seeking rather than scholarship....” (p. 7), whereas sustained scholarly research benefits considerably from classification, authority control, subject analysis, and other forms of bibliographic control.

The Task Group offers four survival strategies for catalog departments (pp. 16-19): (1) form partnerships on and off campus; (2) expand staff expertise in non-MARC metadata, with the understanding that "the best training is by doing" (p. 17); (3) continue streamlining workflows; and (4) study and prepare for the evolution of the OPAC, including decisions about whether tiered levels of cataloging are being applied appropriately.

At the end of the report there are four appendices: Appendix A (p. 22) has the original charge of the Task Group. Appendix B (p. 23-28) is a bibliography of sources; Appendix C is a summary of results from the IU library staff survey; and Appendix D shows a sample of survey questions that were asked.

Jacso, P. (2002). XML and digital librarians. Computers in Libraries, 22 (8), 46.

Johnson, Gary M. (Jan. 11, 2007). "Eliminating Series Authority Records and Series Title Control: Improving Efficiency or Creating Waste? Or, 12 Reasons Why the Library of Congress Should Reconsider Its SARs Decision" prepared for AFSCME 2910.

Kelleher, Kevin. (2005). "Who's Afraid of Google? Everyone". Wired Magazine, 13.12 December 2005. (http://www.wired.com/wired/archive/13.12/google_pr.html). We're all too familiar with Google's effect on libraries (including, with its post-coordinate searching algorithm, the challenge it poses to traditional cataloging), but Kelleher describes how "Google's ever-expanding agenda has put it on a collision course with nearly every company in the information technology industry: Amazon.com, Comcast, eBay, Yahoo!, even Microsoft."

Koppel, Ted, and George S. Machovec. "An Interview with Ted Koppel of the Library Corporation on Standards." Charleston Advisor, Op-Ed, vol 6:4, April 2005. Cited as "background reading" for Yale's Electronic Resources Management (ERM) Implementation Group. Koppel, product manager for Ex Libris's ERM "Verde" tool, which Yale seems to be implementing, talks about OpenURL, ONIX, Z39.50, ISBNs, DOIs, and other metadata and resource identification standards. Koppel worries that the slow consensus-building required for open universal standards leads some stakeholders (OCLC in particular?) to adopt proprietary APIs (application programming interfaces) instead.

Kroski, Ellyssa. "The Hive Mind: Folksonomies and User-Based Tagging". Infotangle [Blog] (7 December 2005, http://infotangle.blogsome.com/2005/12/07/the-hive-mind-folksonomies-and-user-based-tagging/). "The wisdom of crowds," Kroski writes, "the hive mind, and the collective intelligence are doing what heretofore only expert catalogers, information architects and website authors have done. They are categorizing and organizing the Internet and determining the user experience, and it's working." Kroski notes that the University of Pennsylvania has implemented a del.icio.us-based feature called "PennTags" that allows bookmarking of library web pages and bibliographic records. Roy Tennant describes Kroski's posting in the December 2005 issue of Current Cites as "one of the best I've seen on both the good and the bad of folksonomies."

Kurth, Martin. "Found in Translation: Four Characteristics of Metadata Practice." In Metadata and the Digitization of Information: A Festschrift in Honor of Thomas P. Turner, edited by Elaine Westbrooks and Keith Jenkins. Lanham, MD: Scarecrow Press. (In press.) http://www.library.cornell.edu/cts/
Suggests that the role of catalog versus metadata librarian is essentially the same, i.e., to provide the pre-conditions for reconciliation (or semantic interoperability) among different persons' and disciplines' representations of knowledge. In other words, catalog/metadata librarians serve as conceptual translators from one mindset to another, reconciling vocabulary and facilitating interdisciplinary research. Metadata, in turn, forms the connective tissue that makes translation, reuse, mapping and transformation possible.

Lagoze, Carl, et al. (2006). "Metadata Aggregation and 'Automated Digital Libraries': a Retrospective on the NSDL Experience." Retrieved from http://arxiv.org/abs/cs.DL/0601125 on 7/12/06. Apparent winner of the Joint Conference on Digital Libraries (JCDL) 2006 "Vannevar Bush Best Paper Award", this article discusses an OAI-PMH/Dublin Core aggregator now in its third year of development at the National Science Digital Library (NSDL). It turns out there are serious problems with the data-harvesting technique. A few quotations: "Even if all other aspects of the system worked perfectly, poor quality metadata would degrade the quality of the resulting library." (p. 3); "Minimally descriptive metadata, like Dublin Core, is still minimally descriptive even after multiple quality repairs. We suggest that the time spent on such format-specific transforms might be better spent on analysis of the resource itself--the source of all manner of rich information" [which sounds like a plea for more expert cataloging] (p. 7). Ed Summers cited the paper in his blog, pointing out that, among other factors, low-quality metadata was to blame for poor harvesting success rates. He writes: "Good metadata requires domain expertise, metadata expertise, and technical expertise--and unfortunately the NSDL data providers typically lacked people or a team with these skills (aka library technologists)".
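The aggregation pipeline Lagoze et al. describe rests on two moving parts: an OAI-PMH harvesting request (the verb and metadataPrefix parameters are defined by the OAI-PMH 2.0 protocol) and the unqualified Dublin Core records it returns. The Python sketch below is illustrative only; the base URL and sample record are invented for the example, not taken from NSDL.

```python
# Minimal sketch of the two halves of an OAI-PMH/Dublin Core harvest:
# building a ListRecords request and reading dc:title out of a record.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def list_records_url(base_url, metadata_prefix="oai_dc"):
    """Build an OAI-PMH ListRecords request URL (protocol-defined params)."""
    return base_url + "?" + urlencode(
        {"verb": "ListRecords", "metadataPrefix": metadata_prefix})

# Harvested records carry unqualified Dublin Core inside an <oai_dc:dc>
# wrapper; element names are namespace-qualified.
DC = "http://purl.org/dc/elements/1.1/"
OAI_DC = "http://www.openarchives.org/OAI/2.0/oai_dc/"

def dc_titles(record_xml):
    """Extract dc:title values from one oai_dc metadata block."""
    root = ET.fromstring(record_xml)
    return [el.text for el in root.iter("{%s}title" % DC)]

# Invented sample record for illustration.
sample = (
    f'<oai_dc:dc xmlns:oai_dc="{OAI_DC}" xmlns:dc="{DC}">'
    '<dc:title>Metadata Aggregation</dc:title>'
    '<dc:creator>Lagoze, Carl</dc:creator>'
    '</oai_dc:dc>')

print(list_records_url("http://example.org/oai"))
print(dc_titles(sample))
```

A real harvester would then follow the protocol's resumptionToken values to page through the full repository; the quality problems the paper documents arise in what these dc elements actually contain.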

Leonard-Barton, D. (1995). Wellsprings of knowledge : Building and sustaining the sources of innovation . Boston, Mass.: Harvard Business School Press.

Mann, Thomas (January 1, 2007). "More on What is Going on at the Library of Congress" prepared for the Library of Congress Professional Guild. http://www.guild2910.org/AFSCMEMoreOnWhatIsGoing.pdf

Mann, Thomas. "The Changing Nature of the Catalog and Its Integration with Other Discovery Tools: a Critical Review". (2006) Prepared for the Library of Congress Professional Guild. http://guild2910.org/AFSCMECalhounReviewREV.pdf . Takes Calhoun (2006) to task for (in his view) misrepresenting research data and having more of a pro-business than pro-scholarship agenda. This isn't to say that Mann doesn't have his own blind spots. For example, in his discussion of the 'niche' strategy he completely dismisses the point that disciplines vary in the extent to which their practitioners benefit from cataloging services. The high energy physics community (along with others) does much of its scholarly work via the full-text-searchable e-print repository arXiv, whereas humanists remain far more committed to current and historical print resources. It seems reasonable to suggest, then, that different groups might require cataloging services to different degrees. Mann, for his part, approves of the niche strategy, but maintains that the appropriate niche would be to support scholars (as opposed to "quick information seekers") rather than to identify and support only selected disciplines. In any event, I think the two papers taken together shed light on the politics of cataloging, and help frame the discussion of our department's mission and vision. [4/13/06]

Mann, Thomas (2006). "What is Going on at the Library of Congress?" Paper delivered to the Library of Congress Professional Guild (http://guild2910.org/AFSCMEWhatIsGoingOn.pdf); accessed 7/11/06. Points out a disturbing pattern at LC: commissioning the Calhoun Report (which includes a proposal to "dismantle LCSH"); discontinuing SARs; purchasing digital copies of dissertations at the expense of preservation-worthy microfilm; debasing CIP standards; and shrinking the cataloging budget and staff.

Mann, Thomas (2005). "Will Google's Keyword Searching Eliminate the Need for LC Cataloging and Classification?" Paper delivered to the Library of Congress Professional Guild (www.guild2910.org/searching.htm)

Mann, Thomas (2005b). "Research at Risk". Library Journal, July 15, 2005.

Marcum, D. (2005). The Future of Cataloging. Ebsco Leadership Seminar, Boston, Massachusetts, Retrieved August 12, 2005, pp. 6-11. Another version published in LRTS 50(1), Jan. 2006, pp. 5-9. An administrator at the Library of Congress, Marcum finds the $44 million annual cataloging budget excessive. She asks provocatively, "in the age of digital information ... how much do we need to continue to spend on carefully constructed catalogs?" Pointing out that the need for "intermediate-level descriptions" (as Arlene Taylor defines the goal of cataloging) has come under increasing scrutiny, she wonders aloud whether Google now functions sufficiently well as a retrieval engine for digital objects. Her message is not entirely negative. "If the task of descriptive cataloging could be assumed by technicians," she writes, "then retooled catalogers could give more time to authority control, subject analysis, resource identification and evaluation, and collaboration with information technology units on automated applications and digitization projects". Marcum's staff at LC has been experimenting with the use of "access level records". This is likely to be cheaper than full or core level records, but it remains to be seen whether, as was the case with Dublin Core, the 'dumbing-down' of the record undermines the usability of the catalog.

Markey, Karen. (2007). "The Online Library Catalog: Paradise Lost and Paradise Regained?" D-Lib Magazine Jan./Feb. 2007, 13:1/2. http://www.dlib.org/dlib/january07/markey/01markey.html. Recalls the golden age of the OPAC, between 1980 and 1995, when it was "the jewel in the crown when people eagerly queued at its terminals to find information written by the world's experts". What caused its downfall? The World Wide Web came into existence, while at the same time expert recommendations for improved usability (e.g., addition of full text resources and post-Boolean probabilistic searching) were being ignored by ILS customers and vendors. Catalogers were preoccupied with maintaining descriptive rules and missed the boat on rising IT capabilities and reader expectations. Now Google reigns supreme as the place to begin (if not necessarily finish) one's research. Markey suggests that the current age of mass digitization offers an opportunity to reclaim the lead role. The attempt to simplify cataloging rules and emphasize primary sources (as proposed by Calhoun) will not bring people back to the online catalog. Rather, Markey suggests, we should be adding advanced features like full text searching, relevance ranking, social tagging, and recommendation engines.

Metadata: a Cataloger's Primer. (ed: Richard P. Smiraglia). Haworth Press, 2005. Co-published simultaneously as Cataloging & Classification Quarterly, 40:3/4, 2005. 287 pp. Divided into two parts: I. Intellectual Foundations; and II. How to Create, Use, and Apply Metadata. See specific chapter annotations for details. The volume begins with Smiraglia's "Introducing Metadata". Part I includes (A.) Jane Greenberg's "Understanding Metadata and Metadata Schemes"; (B.) Lynne Howarth's "Metadata and Bibliographic Control"; (C.) D. Grant Campbell's "Metadata, Metaphor, and Metonymy"; (D.) Leatrice Ferraioli's "Exploratory Study of Metadata Creation"; (E.) Jennifer Cwiok's "The Defining Element--A Discussion of the Creator Element Within Metadata Schemas"; and (F.) Richard Smiraglia's "Content Metadata--An Analysis of Etruscan Artifacts in a Museum of Archeology". Part II includes (A.) Anita Coleman's "From Cataloging to Metadata: Dublin Core Records for the Library Catalog"; (B.) Alexander Thurman's "Metadata Standards for Archival Control: An Introduction to EAD and EAC"; (C.) Patrick Yott's "Introduction to XML"; (D.) Linda Cantara's "METS: The Metadata Encoding and Transmission Standard"; and (E.) Michael Chopey's "Planning and Implementing a Metadata-Driven Digital Repository".

Morgan, Eric L. "Mass Digitization" [Weblog entry]. Infomotions, 21 Mar. 2006. (http://infomotions.com/musings/mass-digitization/). Accessed 24 Mar. 2006. Review of symposium at University of Michigan (March 10-11) entitled "Scholarship and Libraries in Transition: A Dialog about the Impacts of Mass Digitization Projects", featuring Tim O'Reilly, Clifford Lynch, and others. How will scanning of entire library collections (e.g., Google Print) affect the business of libraries? Morgan concludes: "a library's collection will not be as important as it is today. Everybody will be carrying the collection around in their pocket. Instead what people will need are sets of services -- tools -- to apply against the collections making the content more useful. In a digital environment the things of traditional librarianship (books) will give way to their content and this makes services increasingly important."

Naun, Chew Chiat. "Objectivity and Subject Access in the Print Library". Cataloging & Classification Quarterly, vol. 43(2), 83-95, 2006. Naun acknowledges that traditional subject analysis is historically tied to card catalogs and book collections, but suggests that its usefulness and power transcend its original historical context. While some might consider full-text information retrieval, which removes the librarian's traditional representation of knowledge (the catalog record), to be a step toward greater objectivity, Naun suggests the opposite may be the case. Knowledge resources left to their own devices often contain biased, obscure, or misleading terminology. Far from distorting the information they organize, catalogers help create a neutral clearing, an open space, where civil discourse and mutual respect have their best chance of flourishing. In an age of corporate news media and tendentious blogs, the work of disinterested librarians may turn out to be more important than ever. [10/11/06]

Pace, Andrew K. "The Relevance of 'Relevant Relevance'", October 2005, American Libraries, pp. 78-79. Pace writes partly in reaction to Thomas Mann's opinion piece (see above), which he considers problematic. "The primary flaw in his argument," Pace writes of Mann, "is comparing the precision and beauty of LCSH to Google keyword. More relevant would be discussion of meaningful keyword relevance applied to the OPAC itself, not an apples-and-oranges contrast of still-disparate technologies."

PCC Mission Task Group (2005). Report Of The Task Group on the PCC Mission Statement. Accessed online Sept. 16, 2005.

Pilsk, Susan, et al. (2002). "Organizing Corporate Knowledge: the Ever-Changing Role of Cataloging and Classification". Information Outlook, April, 2002. http://www.encyclopedia.com/doc/1G1-95200282.html. Reviews history of cataloging from Panizzi's 1841 Rules for Compilation of the Catalogue of the British Museum through Dublin Core and the beginning of RDA. Draws comparisons between the proliferation of ephemera at the end of the 19th century (via mass-produced print) and that which began taking place at the end of the 20th century with the advent of the Web. In both cases, the capacity to establish bibliographic control over published material was overwhelmed. Mentions the 1889 punch card technique used by Herman Hollerith to represent 1890 census data, Hollerith's company being renamed IBM in 1924, the punch card library system developed by Ralph Parker in 1936, and the project by Parker and Kilgour to develop a shared cataloging network in 1965. Also in 1965 Henriette Avram began developing MARC, without which a distributed union catalog would not have been possible. The authors note, "Many librarians were afraid that what started as the Ohio College Library Center and unified online catalog vision would put catalogers out of business ..." but what actually happened is that "technology automated the clerical routine allowing humans to perform higher-level work by integrating mountains of information into more useful knowledge repositories."

Prahalad, C. K., & Hamel, G. (1990). The core competence of the corporation. Harvard Business Review, 68 (3), 79-91.

Shirky, Clay (2005). Ontology is Overrated: Categories, Links, and Tags. Self-archived at http://www.shirky.com/writings/ontology_overrated.html [Per author introduction: "This piece is based on two talks I gave in the spring of 2005 -- one at the O'Reilly ETech conference in March, entitled "Ontology Is Overrated", and one at the IMCExpo in April entitled "Folksonomies & Tags: The rise of user-developed classification." The written version is a heavily edited concatenation of those two talks."] Part of his argument is that the driving force behind LCC (Library of Congress Classification) has been arrangement of books on shelves rather than authentic relations among ideas. In the age of ubiquitous digital information, he believes the system may have outlived its usefulness. His analysis goes pretty deep, and poses some interesting challenges for catalog librarians.

Smiraglia, Richard. "Introducing Metadata". in Metadata: A Cataloger's Primer, ed. Richard Smiraglia (see above). Haworth Press, 2005. Recounts milestones of development of "metadata" as distinct concept. Depicts ISBD punctuation as it emerged from the 1961 Paris Principles and then MARC codes as early forms of document markup (p.5f.)

Swanekamp, J. (2004) [Report on Finding Aids to LMC]. Unpublished report. Written September 2004; Viewed August 24, 2005.

Taiga Forum organized by Meg Bellinger, Karen Calhoun, and others. See Provocative Recommendations and Program.

Tennant, R. (2004). Metadata's bitter harvest. Library Journal (1976), 129 (12), 32. Accessed online Sept. 15, 2005 (http://www.libraryjournal.com/article/CA434443.html)

Tillett, Barbara B. "Change Cataloging, but Don't Throw the Baby Out with the Bathwater". (http://www.loc.gov/catdir/cpso/Mittler.pdf) In International Librarianship--Today and Tomorrow. K.G. Saur Verlag, submitted Nov. 2004. Recalls Panizzi's successful bid in the 1840s to maintain a "full and accurate catalogue" when challenged by trustees of the British Museum to provide more of an inventory control tool. Also recalls Charles Ammi Cutter's Rules for a Printed Dictionary Catalog (1876), which advocated a cataloging scheme whereby all known items by a given author or on a given topic could be rapidly identified and retrieved by the user. Tillett acknowledges that certain practices need to be reformed, e.g., the intricate "special case law" that provides special guidance on seemingly every permutation of cataloging. But she also reminds us of the folly of eliminating useful services for short term gain, specifically, LC's abandonment of relator terms and codes in the 1980s. These terms and codes had been used to establish relationships between and among creators and works, and, had they been retained, would have made the current shift to a FRBR data environment easier and less costly. Worth remembering as LC now discontinues its practice of series authority control. [5/28/06]

University of California Libraries Bibliographic Services Task Force. Final Report: December 2005 (http://libraries.universityofcalifornia.edu/sopag/BSTF/FinalsansBiblio.pdf ). Excellent report, written for UC libraries in particular, but with conclusions applicable to many research libraries. There are four sections: (I) Enhancing Search and Retrieval; (II) Rearchitecting the OPAC; (III) Adopting New Cataloging Practices; and (IV) Supporting Continuous Improvement, of which III, naturally, is worthiest of our group's attention.

The specific recommendations on cataloging practice include: (1) consolidating and streamlining workflows; (2) applying appropriate metadata standards (i.e., not always using AACR2/MARC), discontinuing or modifying (e.g., through FAST) authority control for subject terms, and giving priority cataloging to otherwise unfindable items; (3) manually enhancing the shelf list for prolific literary authors, and implementing a structured serials holdings format; (4) automating metadata creation wherever possible, preserving vendor-supplied metadata such as cover art, publisher blurbs, tables of contents, bibliographies, etc., and allowing items to go into the collection before they've been fully cataloged.

In terms of optimizing the interface, the model is self-consciously based on Google and Amazon: don't abandon the patron; no dead ends; take users to a logical default choice, give them a meaningful range of other choices, or suggest alternative search strategies; but never tell the user "No results found" and leave them to fend for themselves. [2/10/06]

University of Chicago Task Force Report... Favorably reviewed by Mann 2007. Not sure if publicly available.

Vellucci, Sherry. "Cataloging vs. Metadata: Competition or Collaboration?" PowerPoint presentation to the UCSD Metadata Services Department, March 14, 2006. She quotes from her Jan. 2000 LRTS article (44:1; 33-43) on the 2nd slide: "Catalogers must learn several metadata schemes and organizational structures beyond AACR and the MARC record. They must free themselves from thinking in terms of flat files and linear access and begin to think in terms of multi-scheme data registries, new record constructs, and relational data models… They must envision a new spectrum of authority control that includes many types of identifiers along with the more familiar names, titles, and subjects. And most critically, catalogers must actively participate in the development of system architectures and data registries. Only this level of activity will ensure that catalogers play a key role in the development of authority control systems for electronic resources." Many useful slides with diagrams and notes. Slide 44 mentions Metadata Education and Research Information Center (MERIC) standards, which could help inform our in-house strategy for recruitment and professional development. Slide 49 shows a workplan for implementing new metadata projects, according to which catalogers/metadaticians will "play a critical role [1] In creating metadata to share with others [2] Using metadata created by others [3] Developing Application Profiles for digital repositories [and 4] Implementing metadata schemas". [10/4/06]

Westbrooks, E. L. (2004). A vision for the future: Cornell University's geospatial information repository (CUGIR). In B. L. Eden (Ed.), Innovative redesign and reorganization of library technical services (pp. 445-464). Westport, Conn.: Libraries Unlimited. While library production and assignment of non-MARC metadata is becoming commonplace, the ability to exchange non-MARC records across systems and platforms is still at an early stage. The author describes the approach taken at the Cornell University Geospatial Information Repository (CUGIR), where digital objects are described using Content Standard for Digital Geospatial Metadata (CSDGM), as established by the Federal Geographic Data Committee (FGDC, 1994).

Yale University. Library. Catalog Department (2005). Efficiency Improvement Recommendations.

Yale University. Library. DPIP Production and Content Integration Working Group. Final Report August 2006. Interesting report of great significance to our department. Note compatibility with Joan's vision for non-MARC metadata within the Catalog Department: "Strategic planning groups in the library advocated a 'federated approach' to Yale's integrated library and argued that a separate digital library unit would create a digital elite and discourage mainstreaming of digital activities." (p. 11). Cornell DCAPS is cited as a helpful model, showing what can be accomplished even in the absence of a budget increase. Other institutions singled out for comparison are Harvard (representing the ideal case where an additional $20 million was invested in digital infrastructure), University of Virginia (which hosts the Electronic Text Center, the Institute for Advanced Technology in the Humanities, the Tibetan & Himalayan Digital Library, and the team that developed FEDORA), University of Michigan (which hosts an enviable 45-FTE Library IT department, the Digital Library eXtension Service, and the nearly 10-million-record and counting OAIster repository) and the University of Oregon. The report does not mention Yale's international programs mandate, but, according to Fred (in his 10/12/06 report to the Catalog Librarians), this is taken into account by DPIP, though out of scope for the present working group. Progress made so far includes introduction of the Yale Elements Set, creation of a rescue repository (i.e., an interim safe haven for digital masters), and the soon-to-be-implemented FEDORA repository. A list of projects recommended for rapid implementation includes digitization of ca. 50,000 Classics Department slides, cataloging and posting of ca. 6,000 Byzantine collection slides into Luna Insight, management of digital surrogates for the WWI virtual collection, collaboration on conversion of Yale Daily News back issues, conversion of ca. 371,000 VRC slides, and provision of access to digital objects generated through preservation reformatting. Section 3.4.2 is devoted to metadata issues, and includes recommendations to form a virtual Metadata Services Group (p. 21), to have ILTS support automated manipulation of data and metadata productivity tools, and to have DPIP encourage participants "to ensure that their metadata is reusable by higher-level systems such as harvesters and compatible with external projects such as the DLF Aquifer initiative" (p. 24). [10/14/06, 11/12/06]

Yale University. Library. (2005). Portal Opportunities Group [POG]. Report 33 p., 09/22/2005. Main sections: I. Environmental Scan; II. Library Content Integration at Yale; III. User Needs Assessment; and IV. Recommendations. Appendixes: A. Integrated Library diagram (Fred Martz); B. Use cases; and C. Annotated List of Resources. POG disbanded in 2005 upon completion of this report. Successor is the Portal Opportunities Implementation Group (POIG), charged with prioritizing and implementing at least some of the over 30 projects thus far proposed. [6/3/07]

Yale University. Provost's Office. Cyberinfrastructure Survey Report (10/3/06). Based on 1,084 faculty responses (out of 5,824 invited), three of the four highest-rated enhancements involve the library implicitly, namely: (1) "Easier electronic access to scholarly materials," (2) "Providing students with digital access to research and instructional materials," and (4) "Providing better search tools to locate materials across all of Yale's holdings and collections." There seems to be broad agreement that Yale is trailing its peers in information technology, with the sole exception, said one respondent, of our "subscription to electronic journals" which is "mostly adequate." The top three recommendations of the report are: (1) to convene focus groups on what respondents meant by "easier access to scholarly materials" (and similar statements); (2) development, funding and implementation of projects that connect research tools to digital content, including integration of digital repositories with the YaleInfo and ClassesV.2 portals, archiving and re-use of digital objects via SOAP and REST, and a "handle server" to "persistently identify digital resources and facilitate the integration and access of digital objects stored in repositories across campus"; and (3) annual re-grants to develop new digital collections. [11/12/06]

Yott, Patrick (2005). "Introduction to XML". Cataloging & Classification Quarterly. Haworth Press, vol. 40 no. 3/4, pp. 213-235. Simultaneously published in Metadata: A Cataloger's Primer (see above; ed. Richard P. Smiraglia), 2005. [DSL 3/10/06] Clear step-by-step instructions on creating well-formed, valid XML documents. One limitation is that Yott uses a DTD to illustrate validity testing, whereas XML schemas have become the preferred option among librarians and other knowledge managers. Includes a brief introduction to XSLT at the end of the article.
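Yott's core distinction between well-formedness and validity can be illustrated in a few lines of Python. This sketch covers only the well-formedness half: the standard library parser enforces proper nesting and closing of tags, while checking validity against a DTD or schema would require a validating parser such as lxml, not shown here. The sample documents are invented.

```python
# A well-formedness check: the parser rejects any document whose
# tags do not nest and close properly, regardless of vocabulary.
import xml.etree.ElementTree as ET

def is_well_formed(doc):
    """Return True if doc parses as well-formed XML."""
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False

good = "<record><title>Primer</title></record>"
bad = "<record><title>Primer</record></title>"  # tags close out of order

print(is_well_formed(good))  # True
print(is_well_formed(bad))   # False
```

Validity is the stricter test: a document can be well-formed yet still invalid against its declared DTD or schema (e.g., a required element missing), which is the distinction Yott's article walks through.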



This file last modified 06/03/07