David Weinberger, who is a Shorenstein Fellow at Harvard University and former co-director of the Harvard Library Innovation Lab, presented a talk, “Libraries as Platforms: Enabling Libraries to Become Community Centers of Meaning,” as part of the Program on Information Science Brown Bag Series.
In the talk, illustrated by the slides below, David discusses how libraries can increase their relevance in a networked world by creating information platforms that enable communities to locate, create, and discuss contextually relevant connections among information resources.
In his abstract, David describes his talk as follows:
Libraries are in a unique position to reflect a community back to itself, enabling us to see what matters, and to use that information so that the community learns from itself. This is one of the primary use cases for developing and widely deploying library platforms. But becoming a community center of meaning can easily turn into creating an echo chamber. The key is developing interoperable systems that let communities learn from one another. We’ll look at one proposal for a relatively straightforward way of doing so that’s so dumb that it just might work.
David describes libraries as a “black hole on the Net” — the knowledge and culture that only libraries have been entrusted with is generally not available on the web. He claims that the core institutional advantage of libraries is not only access, but an understanding of what matters to specific communities, paired with incentives that are fully aligned with those communities.
His talk argued that meaning comprises a set of connections that are important to a community. Libraries have always been aligned with user communities and have helped them discover and make sense of meaningful information. And changes in internet and communication technology create an opportunity for libraries to help communities create and reflect back community meaning.
The talk suggested that libraries can move toward this by creating APIs that enable open access to their open content, and to metadata (broadly defined) related both to content and to the local use of that content; and it conjectured that linked data approaches are necessary for integrating platforms and metadata at scale.
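To make the API-plus-linked-data idea concrete, here is a minimal sketch of what a library platform might expose: a catalog record serialized as JSON-LD using the schema.org vocabulary, so other platforms can consume and link the metadata. The field names and the example record are hypothetical illustrations, not drawn from any real catalog or from StackLife.

```python
import json

# Hedged sketch: one way a library platform API might expose a catalog
# record as linked data (JSON-LD with the schema.org vocabulary).
# The record fields below are illustrative assumptions, not a real schema.

def record_to_jsonld(record):
    """Express a minimal catalog record as a JSON-LD document string."""
    doc = {
        "@context": "https://schema.org",
        "@type": "Book",
        "name": record["title"],
        "author": {"@type": "Person", "name": record["author"]},
        "isbn": record.get("isbn"),
    }
    return json.dumps(doc, indent=2)

example = {"title": "Example Title", "author": "A. Author", "isbn": "0000000000"}
print(record_to_jsonld(example))
```

Because the output uses a shared vocabulary rather than a local one, a third-party site could merge records from many libraries without bespoke per-library adapters — which is the interoperability argument the talk makes.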
David discussed StackLife as an example. StackLife uses circulation metadata to provide a normalized measure of physical book usage in several libraries — aggregated so that individual borrowing remains private, yet public and shareable, allowing for comparisons across libraries.
I will note that the Program is engaged in research toward creating a modern approach to privacy concepts and controls. I will also note that to maintain a platform will require digital sustainability and organizational sustainability. Realizing the former will require designing systems with a view towards supporting long term access. Realizing the latter will require identifying stakeholders that have mutually reinforcing incentives to create digital stuff, use digital stuff created by others, and maintain platforms for such stuff. (Typically, in sciences, such stakeholders are clustered around sets of domain problems…)
A recurring theme of David’s talk was that “libraries won’t invent their own future”: libraries can now see, and participate in, their communities’ cultural appropriation of the work entrusted to libraries. And open platforms will enable the world to integrate library knowledge into sites, tools, and services that libraries on their own might not have envisioned or had the resources to develop.
This resonates with me, and I will add that any successful platform will almost certainly require using tools and infrastructure neither built by nor for the libraries. It will also require us to collaborate with organizations far beyond our boundaries.
“We’re ‘In IT’ Together:” Active Learning at NERCOMP ‘15
NERCOMP ‘15 was held in Providence, Rhode Island, at the Rhode Island Convention Center, from March 30th until April 1st. There were over 700 attendees present (including presenters and exhibitors) and over 50 vendors with representation from companies like Adobe, Microsoft, and McGraw-Hill Education.
The theme of this year’s NERCOMP (NorthEast Regional Computing Program) conference was “We’re ‘In IT’ Together” — when we come together, no matter our titles, “we can do transformative things.”
One of the most memorable sessions for me was “Building a Disco: Active Learning in a Library Discovery and Collaboration Space.” This presentation piqued my interest, as I realized that active learning and collaboration spaces were to be a central theme of the proceedings for the day.
Elizabeth and Patricia dazzled audience members with the story of their library renovation that took place between May and August of 2014. They began with an underutilized space in their library that was intended for collaborative and active learning, but was not designed to be optimal for either of those purposes.
We were then shown the finished product: their beautiful new library space, named the Brian J. Flynn Discovery & Collaboration Space (or the “DisCo” — there’s even a disco ball hanging from the ceiling). The new room features movable furniture, which makes for an adaptable environment conducive to collaboration and active learning, and movable whiteboards, which are used for collaboration and to define space (students immediately began using them to section off areas of the room to work on projects).
The most impactful change in the DisCo is all of the new technology that was purchased. The room is outfitted with projection capabilities using Crestron AirMedia technology, which offers wireless presentation functionality from any device. Twenty-six HP Elite tablets (complete with keyboards and mice) and four HDTVs (that have AirMedia as well as cable TV capabilities) were also purchased.
The theme of collaborative and active learning spaces and the technology that we use within them reverberated throughout the day: from the gamification of the student work environment at the Nease Library to the 3D printer in the exhibit hall (I won a 3D printed compass charm — NERCOMP’s logo), library spaces were definitely on my mind.
NERCOMP left me with a few questions: How can we best use new technologies in active learning spaces? In the DisCo, several technology issues arose immediately after the space was opened, and a LibGuide was created in response to those issues. This has some implications for launching an active learning space: perhaps faculty and student training sessions should take place before [or soon after] launch, and as much technology should be tested as possible during the planning stage (this was a direct recommendation from Elizabeth and Patricia and something they wish they had known beforehand).
What will active learning in libraries look like in the future? There is a wide range of new technologies that can be appropriated for active learning use. They can be designed to support a myriad of library services and needs: for example, tablets can serve as personal computing devices, for collaboration purposes, or for presentations. Spaces equipped with networking, projection, and appropriate seating can also support students and faculty in conducting presentations, and in collaborating both formally and informally.
Further, these spaces can support active learning: for example, a librarian could more easily conduct an information literacy workshop in the space using projection equipment and tablets to allow the students to share their answers simultaneously.
Will active learning spaces become a staple in academic libraries? I believe so, but the challenge is to design spaces that support substantial specific needs so that they are used regularly for their intended purpose. In my opinion, one way to overcome this is to engage in a participatory design process. If patrons are heavily included in the process from the beginning, they will be left feeling enthusiastic about using the space and advocating for it.
These are only a few considerations; each library space will have its own unique populations to serve and obstacles to overcome. I am of the opinion that if we remain collaborative in nature when discussing and designing active learning spaces, we will be successful in providing active learning spaces for our patrons.
 The Program on Information Science is engaged in a number of research efforts related to active learning — including investigations of Makerspaces, and measurements of attention in massive online education systems. The program website links to classes, resources, and publications in these and other areas.
Caren Torrey, who is a Graduate Research Intern in the program, reflects on the recent ACRL Conference:
ACRL 2015 – Gut Churn
Jad Abumrad, host and creator of RadioLab, gave a fantastic keynote speech at the ACRL conference on Thursday afternoon titled “Gut Churn.” He used this term to describe the moment when creativity goes into a dark space: when you lose your perspective, maybe give up a little hope, when you are not sure of yourself, when your creative process fills you with anxiety. Gut churn is Abumrad’s term for sitting with uncertainty. Abumrad feels that this part of the creative process is key to overcoming hurdles and breaking through to an innovative answer.
Gut churn was echoed at the ACRL conference (held March 25th-28th in Portland, OR). Academic librarians embraced his speech; the term was repeated throughout the event. This feeling of creativity and embracing challenges was important for the conference theme of sustaining community.
I was rejuvenated after hearing Abumrad’s speech. Not only was I surprised that the keynote was like a private version of RadioLab — just for us (!), but I was relieved to hear that most successful, creative people also are apprehensive when trying new approaches to old concepts. As I navigate my path into the professional libraries world, I am embracing my own feelings of gut churn.
What resonated with me most about this conference were the specific challenges that libraries are currently facing: embracing new technology, outreach to faculty and students, education and information literacy, and demonstrating value. Each of these issues was discussed in the context of the academic library. In each case, there is a need for innovation and creativity that can only be accomplished by pushing through uncertainty. The uncertainty that libraries are facing include funding, use and lack of space, accelerated advances in technology, and the evolving role of librarians.
Many of the sessions that I attended discussed bringing new applications and e-resources to the library and the implementation of open educational resources. Librarians seemed excited about these changes. They want to incorporate new ways of presenting, accessing and finding information.
I could also sense the gut churn in the room during these presentations. The questions raised were about implementation and training: How do you get new technologies into your library? How do you fund technology? How do you keep up? The anxiety and excitement expressed come hand-in-hand with bringing innovation into the workplace.
As a profession, librarians are excited about information. We enjoy the feeling of wonder, the search for information, and the joy of finding the perfect answer. Librarians should embrace our collective “gut churn” to seek out new paths for finding solutions to the challenges of our environment. Creative marketing and outreach to faculty and students can be approached as a collaborative exercise for all. Using new methods of interactive technology is going to be vital to accelerating education and information literacy. Our biggest challenge is demonstrating our value to our communities; perhaps we can incorporate open data and open resources to track our impact.
As a graduate research intern at MIT’s Program on Information Science, I am conducting research on early career scientists. This includes investigating the way in which researchers advance their scholarly reputation early in their careers. I am exploring various methods of and technologies for sharing research and communicating oneself and one’s professional life via scholarly communication and social media. This research has been extremely valuable as an incoming academic librarian. Although the challenges vary by profession, building a name for yourself and your research is vital to a lasting, satisfying career.
The sessions I attended for students and new professionals also echoed the overall feeling of the conference. The gut churn and excitement in these sessions were similar to my own feelings: where do we fit in at this moment of change? How do we effectively lead change as we enter the workplace? Is the millennial generation of librarians really that different from the current professionals? Am I going to get a job?
Overall, the ACRL conference felt like a success. I learned that in order to really make effective change, you have to embrace your uncertainty and learn from it. After all, there are only two outcomes: success and failure. Failure isn’t the end, it is a new beginning.
Brown Bag: DMCA §1201 and Video Game Preservation Institutions: A Case Study in Combining Preservation and Advocacy, by Kendra Albert, Harvard Law School
Kendra Albert, who has served as research associate at the Harvard Law School; as an intern at the Electronic Frontier Foundation; as a fellow at the Berkman Center for Internet & Society; and is now completing her J.D. at Harvard Law, presented this talk as part of the Program on Information Science Brown Bag Series.
Kendra brings a fresh perspective developed through collaborating with librarians and archivists on projects such as perma.cc, EFF’s response to DMCA 1201, and our PrivacyTools project.
In her talk, Kendra discusses the intersection of law, librarianship, and advocacy, focusing on the following question:
Archival institutions and libraries are often on the front lines of battles over ownership of digital content and the legality of ensuring copies are preserved. How can institutions devoted to preservation use their expertise to advocate for users?
A number of themes ran through Kendra’s presentation:
- Libraries have substantial potential to affect law and policy by advocating for legal change
- Libraries enjoy a position of trust as an information source, and as an authority on long-term access for posterity
- Intellectual property law that is created for the purpose of limiting present use may have substantial unintended consequences for long-term access and cultural heritage.
Reflecting on Kendra’s talk, and on the subsequent discussions…
The courts have sometimes recognized preservation as having value — explicitly in formulating DMCA exceptions, and implicitly in adopting perma.cc. But the gap between the content’s short-term private value to its controller and its long-term value to the public is both a strength and a weakness for preservation efforts.
For example, Kendra’s talk noted that the lack of a market for older games is an important factor in determining that distribution of that content is fair use — which works in favor of preservation. The talk also mentioned that the game companies’ short-term focus on the next release was a barrier to collaborating on preservation activities. These points seem to me connected — the companies would become interested if there were a market… but this would, in turn, weaken the fair use consideration. Effective public preservation efforts must walk a tightrope — supporting access and use that is of value, while neither impinging on private value in the short term, nor creating so much of a market for access that there is political pressure to re-privatize it.
Furthermore, it is well recognized that institutional legal counsel tends to be conservative … both to minimize risks to the institution as a whole, and to avoid the risk of setting precedent with bad cases. It is clear from Kendra’s talk that librarians dealing with projects that use intellectual property in new ways should both engage with their institution’s legal counsel early in the process, and have some independent legal expertise on the library team in order to generate possible new approaches.
For more information you can see some of the outputs of Kendra’s work here:
Marguerite Avery, who is a Research Affiliate in the program, reflects on changes in the research and scholarly publishing ecosystem.
Some thoughts on “Shaking It Up: How to Thrive in – and Change – the Research Ecosystem” some weeks later – proof that we’ve embraced the need for change, and of just how far the conversation has evolved
When I got together with Amy Brand (Digital Science) and Chris Erdman (Center for Astrophysics library at Harvard) in August to kickstart planning for the workshop that would become Shaking it Up, our goal was to continue an earlier conversation started by Tony Hey and Lee Dirks (Microsoft Research) at a 2011 workshop on eScience, Transforming Scholarly Communication. Its objective, according to my notes, was to present a roadmap for action with a four- to five-year time horizon specifying recommendations for the whole scholarly communications ecosystems. Having already passed the halfway mark, this was an opportune time to check in and take stock of the evolving scholarly communication ecosystem, and raise key questions: How are we doing? Where are we on this path? What’s working? What’s still broken? What progress have we made?
Providing definitive answers to these open-ended questions was decidedly out of scope, so we focused on business models – one of the major obstacles in the evolution of the scholarly communication ecosystem. The willingness to consider alternatives to traditional models, coupled with the proliferation of startups in this space, demonstrated some progress in changing the attitudes and expectations of scholars, researchers, and other stakeholders. Yet the greater institutional forces lagged behind, with a willful yet fully understandable deference to an aging knowledge infrastructure. Our theme would be hacking business models, and we set out to assemble those people in scholarly communication who landed somewhere between the disruptive innovation of Clay Christensen and the creative destruction of Joseph Schumpeter. Our delegates would report on new pathways, models, and formats in scholarly communication, and we could get a snapshot of our progress at this midpoint check-up.
The lapsed historian in me insists that we cannot assess our current standpoint without some historical context, thus a recap of the 2011 meeting follows. Transforming Scholarly Communication was an ambitious event in scope and presentation. The workshop spanned three days, with one each devoted to raising issues, demonstrations from the field, and the drafting of a report. Invited participation was capped at 75, and participants were assigned to one of six groups: platforms, resources, review, recognition, media, and literature. (See the table below for descriptions of each group.) The opening remarks (as gleaned from a rough transcript on the workshop Tumblr) from Phil Bourne focused on the benefits of open science beyond the (already contentious) end result of free content. Much more provocative was a shift in thought from considering scholarly communication simply as a product to treating it as a process — and a complex one at that. Bourne stated: “Open science is more tha[n] changing how we interface with the final product; it is interfacing with the complete scientific process – motivation, ideas, hypotheses, experiments to test the hypotheses, data generated, analysis of that data, conclusions, and awareness.” With this refocusing on processes, collaboration becomes visible and thus possible to be assessed and valued.
This emphasis on action bleeds over from scholarly communication itself into the necessity of cultivating a movement to effect change. “We need to define ways to recruit a movement – it will take more than tools to do so – are there clear wins for all concerned? If so, what are they? Platforms to disseminate scholarship, new reward systems, knowledge discovery from open access content…” Although it was not identified as such, Bourne had given recognition to the scholarly communication ecosystem. [This allows us to think of new “products” and to pay attention to the infrastructure.] The revolutionary aspect of his proposal wasn’t so much the free content, but what we consider to be content and how it would be published.
The six categories – platforms, media, literature, review, resources, and recognition – ran the gamut of the processes (and products) of the scholarly communication ecosystem. Each group was to consider the following issues: essential elements in successful systems, failure mode of unsuccessful experiments, promising future technologies, key unsolved problems, and interdependencies with other topics. (The notes from each section are accessible in full on the Tumblr page). The focus and recommendations varied with each topic: the Resources group, with its extensive listing of tools for each stage of the research process, was particularly concerned with the differences between formats and the lack of universal search capabilities across platforms and topics; the Platform group lamented the low participation rates for developing tools and worried over the blurring between platforms for personal and professional use; and the Recognition group cautioned that new tools should augment rather than supplant established forms of published communication.
| Theme | Description |
| --- | --- |
| Platforms | project collaboration software, “smart” laboratory software, provenance systems |
| Media | production, distribution, archiving (e.g. video, 3-D modeling, databases) |
| Literature | publications based on text and still images (including creation, reviewing, dissemination, archiving, reproducibility) |
| Review | standard publication-based systems, alternative rating systems, etc. |
| Resources | seamless technologies for literature and data (literature/data search engines; cloud-based, group sharing, adjustable permissions, integration with search) |
| Recognition | how can we best enable cooperation and adoption? |

Six themes from the 2011 eScience workshop
As one of only three traditional publishers in the room, I was focused on our role within scholarly communication – what was our value-add to the process? I identified four functions – authority (how do we know whom to trust?), discoverability (how will your work find its audience?), recognition (will your community acknowledge your contribution as valid?), and community/collaboration (will your audience engage with your work?). The scholarly communication ecosystem generates an enormous volume of content, from the granular and descriptive (data sets and procedures/processes) and the observational (tweets, blog posts), to the reflective and declarative (journal articles and books with hypotheses/arguments). It clearly doesn’t make sense for traditional publishers to participate in every stage; however, that doesn’t mean this content should not be published. And by published, I mean made readily available for peer critique and consumption. (Which leads into a type of peer review, with readers assessing and determining value.)
As I had been one of three participants assigned to float from group to group, I was charged with making a summary statement. My closing remarks echoed Bourne’s identification of a more holistic approach to scholarly communication in terms of product and process. “Clearly scholarly communication needs to move beyond words and print. Fields of inquiry have changed dramatically since the dawn of scholarly publishing hundreds of years ago. Research encompasses subjects requiring massive data sets, highly complex procedures, and massive collaboration.” I advocated for the inclusion of “ancillary material” – all of the content that we could neither print nor incorporate into traditional texts and push to the web such as data sets, video, audio, workflows and processes, color images, and dynamic processes – and acknowledged these would require a new model of authoritative digital publication.
And despite the frustration voiced by many participants with the shortcomings of the current publishing process, an indelible reverence remained for the work of publishers. “As Sharon Traweek reminds us, the scholarly journal article has remained a stable, enduring standard of scholarship. She wisely reminds us of the difference between documenting process and documenting results. David Shotton echoed this, by describing a peer reviewed journal article as a line in the sand upon which knowledge is built and captures a scientist’s understanding of his or her work at a particular moment. Perhaps we need to distinguish between scholarship and research, with scholarship being the codified results, and research representing the process and observations. And while these tools are two halves of the same whole, each requires a different approach and different tools to reach different audiences.” At this meeting, I caught a glimpse of an evolving role for scholarly publishers, but without a clear path forward.
How the conversation has evolved over the last three years.
Shaking It Up illuminated a number of specific issues for us to consider in scholarly communication reform, but how do these compare from 2011 to 2014? (I won’t offer a summary of the workshop – see accounts from Roger Schonfeld and Lou Woodley; the DigitalScience report will be released soon.) For starters, the emphasis on infrastructure increased dramatically. Whereas in 2011 the word infrastructure appears only three times (!!!) in the meeting documentation, it undergirds the entire 2014 discussion as well as being the subject of the keynote. In “Mind the Gap: Scholarly Cyberinfrastructure is the Third Rail of Funders, Institutions, and Researchers: Why Do We Keep Getting It Wrong and What Can We Do About It?” CrossRef’s Geoff Bilder offered energetic provocations on the flaws in scholarly communication cyberinfrastructures (“Why are we so crap at infrastructure?”). He lamented the inclination of scholars and researchers to forge their own systems specifically tailored to their research agendas and outputs at the expense of interoperability and persistence. While these forays represent brief moments of progress and do serve to push the conversation and expectations, the unique systems (and their languishing trapped content) ultimately do not meet the broader needs of scholarly and research communities over time. He advocated for a shared, interoperable platform created by professionals who think about infrastructure. Overall this demonstrates a significant change in how we think about our research objects and environment.
(It’s difficult not to view this as a metacommentary on the evolution of scholarly communication itself, specifically with original digital publications. These often suffer a similar fate due to the particular creation and design carefully tied to the specific research project; while this may yield a brilliant instance of digital publication, its long-term fate is tenuous due to a highly specialized format, platform, and/or mode of access.)
Another welcome departure was the level of community engagement on scholarly communication issues. If the 2011 meeting participants considered themselves representative of only 10% of their communities, they would no longer hold a minority opinion or awareness. The importance of open access, intellectual property, and copyright is recognized across disciplines, if still unevenly. And not only are individuals and thematic communities taking on these issues, but so are institutions. The number of scholarly communication officers (or dedicated offices) at universities is growing, as is the adoption of open access mandates and the use of institutional repositories on campuses. (And of course funding agencies and other sponsoring organizations have long been sympathetic to open access.) Although the depth of knowledge and understanding of such issues needs continued improvement, these concerns are no longer relegated to the fringe but have become scholarly mainstream.
And as our more science-themed meeting unfolded in Cambridge, MA and virtually, other discussions were happening in parallel. Leuphana University’s Hybrid Publishing Lab in Lüneburg, Germany hosted the Post-Digital Scholar Conference. This humanities-focused meeting “focused on the interplay between pixels and print, and discussed open and closed modes of knowledge, in order to seek out what this elusive thing could be: post-digital knowledge.” It’s fair to say that many scholars and researchers agree on the need for change and the areas in which change is most needed (for example, see this previous talk for thoughts on where change is needed in university publishing). As many scholars grow increasingly comfortable experimenting with these possibilities, the level of concern becomes more granular as we reflect upon the experience of these new possibilities and how they resonate in the professional space. What are the barriers here?
Addressing Access, Attribution, Authority – and Format
While the original six themes [platforms | media | literature | review | resources | recognition] proved incredibly useful for discovery and early identification of the issues and challenges of transforming scholarly communication, I’ve sharpened my categories of analysis. Based on my many conversations with scholars and researchers over the years on the publication of digital scholarship and new formats, I’ve determined that the concerns and issues facing new digital publication formats fall across three categories: access, attribution, and authority. If the problem is that traditional scholarly publishing cannot accommodate research objects and research results in the digital space, then why haven’t we experienced the groundswell of creative activity and innovation we’ve seen in other areas of scholarly communication such as search, sharing, and collaboration? The barriers to participation – setting aside the obvious issues such as time, technical skill, and/or resources – are access, attribution, and authority. In other words, innovative digital publications are viewed as a professional gamble, which only a few brave souls are willing to take at this time given the current affordances of the scholarly communication ecosystem.
Let me say more about these three points:
- To elaborate on access: scholars are legitimately concerned by the prospect of creating an innovative digital publication only to have it become inaccessible to its audience; this could happen either immediately or over time, as a proprietary platform goes fallow due to an end to funding, the creator/keeper of the platform moving on to new projects, and libraries – the preservation endpoint – being unequipped and unprepared to handle these digital orphans. Bilder’s infrastructure concerns speak precisely to this point.
- To elaborate on attribution: scholars publish work not only to share their research results and to advance the conversations in their fields and beyond; of equal importance is the acknowledgement of authorship of the work – receiving credit for a publication in the eyes of the academy.
- To elaborate on authority: and speaking of receiving credit from one’s peers and adjudicators (e.g. tenure and promotion committees), a publication carries more weight if published with an established press of solid reputation. Of course these presses are not publishing anything beyond standard books and journal articles, and most anything digital is really just electronic – print content marked up for reading on a device – and not truly embracing the affordances of the digital space.
I was gratified to hear echoes of these points throughout the day. Accessing Content: New Thinking / New Business Models for Accessing Research Literature addressed challenges of new formats and access barriers (e.g. Eric Hellman of Gluejar raised the problem of libraries including open access titles within their catalogs, as well as presentations from ReadCube and Mendeley). Attribution and authority were themes within the Measuring Content: Evaluation, Metrics, and Measuring Impact panel, which grappled with these issues in developing tools for alternative assessment (e.g. altmetrics) with presentations from UberResearch and Plum Analytics. The last panel, Publishing Content: The Article of the Future, offered alternative visions for the future of scholarly communication (albeit while still adhering to the book-journal binary with ‘article’ of the future). These participants posed the greatest challenge to traditional format expectations by embracing the affordances of the web: Authorea, plot.ly, and Annotopia offer tools that mesh readily with aspects of the existing infrastructure while simultaneously threatening other elements of the scholarly communication ecosystem. And this is what we’d hoped to accomplish with Shaking It Up – to identify an array of business models and the possibilities for changing existing structures and/or developing new ones to accommodate the changing needs of scholarship.
Where do we go from here?
So while the observation that so much has changed and yet so much remains the same seems cliché, there really couldn’t be another analysis when you think about the disparate factors at play: scientific research (communities, practices, and research results) is evolving at warp speed, while university administrations and scholarly publishing entities have a retrenched commitment to the persistence of traditional content dissemination, from the product to the supporting infrastructure for publishing, which undergirds the tenure and promotion process. I applaud the incremental change pushed from start-ups and distinct projects exploring facets of this massive issue – and it is a massive issue, with an infrastructure, many moving parts, and continually evolving research methods and results – as these demonstrate the changing needs of the communities, that change is embraced by users, and that other possibilities exist beyond the established systems. But it’s not enough. To move the needle, we need a critical mass of scholars publishing authoritative digital publications in an array of formats. Otherwise, these stay on the fringe. Just like that guy using cooking oil as automobile fuel.
And how can we change these systems? Stay tuned for the next blog post.
Brown Bag: Conservation Collaborations at the MIT Libraries (Summary of the December Talk by Jana Dambrogio)
My colleague, Jana Dambrogio, Thomas F. Peterson (1957) Conservator at the MIT Libraries, presented this talk as part of the Program on Information Science Brown Bag Series. Jana is an expert in physical preservation, having worked in the preservation field for 15 years as a conservator, consultant, and teaching professional at the US National Archives, the United Nations, and the Vatican Secret Archives — and now we are pleased to have her at MIT.
In her talk, below, Jana discusses two research projects to preserve artifactual knowledge in MIT Libraries’ special collections — including work to reengineer ‘letterlocking’ methods and the broken spines of historical books.
A number of themes ran through Jana’s presentation:
- Conserving physical objects requires ensuring that their integrity is maintained for access and interpretation.
- Effective conservation may require applied research to reengineer the original processes used to produce historical works, in order to understand what information was conveyed by choice of that process.
Reflecting on Jana’s talk, I see connections between her physical conservation research and information science more generally…
The information associated with a work is not simply that embedded within it, although the embedded information is often the focus when creating digital surrogates — both digital and physical may carry with them information about their method of production, provenance, security, authenticity, history of use, and affordances of use. It is useful to model each of these types of information, even if one chooses not to spend equal amounts of effort in capturing each.
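To make this modeling suggestion concrete, here is a minimal sketch of the categories of work-associated information named above. This is my own illustration, not something from Jana's talk; the class and field names are hypothetical:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class WorkInformation:
    """Illustrative (hypothetical) model of information associated with a
    work beyond its embedded content, following the categories above."""
    production_method: Optional[str] = None        # e.g. letterlocking style, binding technique
    provenance: list = field(default_factory=list) # chain-of-custody entries
    security_features: list = field(default_factory=list)  # e.g. seals, locks
    authenticity_evidence: Optional[str] = None
    use_history: list = field(default_factory=list)        # annotations, wear, circulation
    use_affordances: list = field(default_factory=list)    # how the object can be handled

# Example: a letterlocked letter, modeled even though we may choose to
# capture only some of these fields in a digital surrogate.
letter = WorkInformation(
    production_method="letterlocked, triangle fold",
    security_features=["adhesive seal"],
)
```

Even a skeletal model like this makes explicit which kinds of information a digitization workflow is, and is not, capturing.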
Second, new fabrication technologies, such as 3-D printing, are making the boundaries between physical and digital more permeable. Patrons may learn of the affordances of a work through a fabricated surrogate, for example. Furthermore, the scanning and digitization processes that are being used in association with rapid fabrication may also be used in conservation practice as part of the reengineering process — Jana’s presentation describes working with researchers at MIT to do just this…
Finally, collaboration with educational and research users is increasingly important in understanding the potential information associated with each work, and thus in guiding selection and conservation in order to create a portfolio of works that is likely to be of future educational and research value. As in digital curation, we can’t offer access to everything, for everyone, forever — so modeling the information associated with each work, and its future uses, is critical to making rational decisions.
Marguerite Avery, who is a Research Affiliate in the program, presented the talk below as part of Shaking It Up — a one-day workshop on the changing state of the research ecosystem jointly sponsored by Digital Science, MIT, Harvard, and Microsoft.
For the past ten years, Margy was Senior Acquisitions Editor at The MIT Press where she acquired scholarly, trade, and reference work in Science and Technology Studies, Information Science, Communications, and Internet Studies. She joined the research program in September to collaborate on explorations of new forms of library publishing.
A number of themes ran through the talk:
- The two formats published by the vast majority of university presses, books and journals, increasingly compromise the ability of the press to capture and publish modern research.
- The time to publish is also increasingly out of sync with the pace of research — publication occurs too slowly.
- Existing business models and price points are a significant barrier for university presses that do wish to move to different formats or more agile publication models.
As a follow-on, we are collaborating to analytically unpack the “university press” model, and to identify the minimum necessary characteristics for a sustainable publisher of scholarship. Some preliminary thoughts on a short list include:
- A process to ensure durability of the published work — possibly through supporting organizations such as HathiTrust, the Internet Archive, Portico, SSRN, LOCKSS, or CLOCKSS
- A mechanism to persistently and uniquely identify works — likely through ISBNs (supported by Bowker) and DOIs (supported by Crossref)
- Metadata and mechanisms supporting metadata discoverability — e.g. MARC records, LC catalog entries, WorldCat entries, ONIX feeds
- Mechanisms for supporting content discoverability and previewing — e.g. through Google Books, Google Scholar, Amazon, and Books in Print
- A business process to broker and process purchases and subscriptions
- A way to select quality content and to signal the quality of the selection
- A process to establish and maintain an acquisition pipeline
- A production workflow
- Marketing channels
Matching these necessary criteria to new forms of scholarship, which are accompanied by new affordances and barriers, promises to be an interesting and challenging task.
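As a small concrete illustration of the identifier mechanism in the list above (my own sketch, not part of the talk): an ISBN-13 carries its own integrity check. The first twelve digits are weighted alternately 1 and 3, summed, and the thirteenth digit brings the total to a multiple of 10:

```python
def isbn13_check_digit(first12: str) -> int:
    """Compute the ISBN-13 check digit from the first twelve digits.
    Digits are weighted 1, 3, 1, 3, ... and summed; the check digit
    is whatever brings the total to a multiple of 10."""
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

def is_valid_isbn13(isbn: str) -> bool:
    """Validate a hyphenated or unhyphenated ISBN-13 string."""
    digits = isbn.replace("-", "")
    return (len(digits) == 13 and digits.isdigit()
            and isbn13_check_digit(digits[:12]) == int(digits[12]))
```

For example, `is_valid_isbn13("978-0-306-40615-7")` returns `True`, while any single-digit transcription error in that string fails the check. DOIs, by contrast, carry no checksum; their persistence rests on the Crossref/DOI resolution infrastructure rather than on the identifier's internal structure.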