2012-08-28T15:17:41.064-05:00

Having a mobile device has changed the way that I work. Being always connected allows me to see emails and status updates from anywhere and at any time. I can't help but see email, Twitter messages, and other alerts when using my iPad at home at night, or on vacation, with the audible notifications or new email messages popping up on the screen as I watch LOL cats.
1. Ignore Everybody
2. The idea doesn’t have to be big. It just has to be yours.
5. You are responsible for your own experience.
6. Everyone is born creative; everyone is given a box of crayons in kindergarten.
8. Companies that squelch creativity can no longer compete with companies that champion creativity.
10. The more talented somebody is, the less they need the props.
11. Don’t try to stand out from the crowd; avoid crowds altogether.
18. Avoid the Watercooler Gang.
20. The choice of media is irrelevant.
22. Nobody cares. Do it for yourself.
27. The best way to get approval is not to need it.
30. The hardest part of being creative is getting used to it.
36. Start blogging.
2012-09-26T14:54:41.295-05:00

In the movie The Terminator, the viewer is frequently taken to the Terminator's point of view. We know this is the Terminator's POV because there is image digitization, and the people he is chasing are more luminous than objects in the foreground and background. In the margins of the viewpoint there are scrolling columns of characters, including numbers and acronyms. The data changes so rapidly that it leaves no doubt that we are viewing the world as the Terminator does.

Science fiction? Well, yes. However, parts of the Terminator's POV are no longer sci-fi.

Augmented reality (AR) is the application of computer-generated imagery embedded into live video streams as a way to expand information as it relates to the real world. Through the use of AR technology, information about a user's surrounding environment, and the objects within it, is stored and then retrieved as an information layer on top of a live real-world view. The technology behind AR requires a camera, a target, and software that renders contextual data, images, or 3D animations on top of the live image. The target could consist of a graphic or a physical object. Google's recent announcement of their Google Glasses prototype is another step in the development of applied AR.

As for using AR technology in libraries, Ken Fujiuchi proposes possible uses:

"When someone finds a book in the library catalog, they can have the option to snap a QR code or unique image of the book, which will first store the information about the book. Then the user can first be directed to a specific section of the library, and once they are in the right section they can use a mobile device to scan the book spines to start being guided towards the book they are looking for."

Helene Blowers paints this scenario:

"When I shift my thinking about AR apps to the physical library space I see our whole collection opening up before our eyeballs.
Imagine the ability to walk down an aisle and see the reviews and popularity of an entire shelf of titles just by pointing the camera lens on your phone at the spines (or outfacing covers)."

Bo Brinkman, associate professor at the Armstrong Institute for Interactive Media Studies at Miami University (OH), has created a prototype augmented reality shelf-reading application.

As is the case with many emerging technologies, there are many competing standards, and a marker created for one application is not viewable using another. String Labs makes both a reader and test targets available for experimenting with the technology. One can also play with creating AR experiences using applications such as Aurasma Lite for iOS.

Here are some other possible uses for AR, with the assumption that every information source and device is networked:

- Scan a building to find out if study rooms are available
- Scan a building to identify hours of service, or which librarians are on duty. Touch the screen to contact them (text, IM, etc.)
- 3-D images of special collection artifacts are viewable from a QR code or bibliographic record
- Physical exhibits and artwork can provide supplemental content and materials

Do you have any ideas?

Resources:
Educause: 7 things you should know about Augmented Reality
How Stuff Works: Augmented Reality [...]
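Fujiuchi's wayfinding scenario boils down to a lookup step: a scanned code resolves to a book record, which in turn yields the shelf location a user should be guided to. Here is a minimal sketch of that resolution step; all identifiers, titles, and locations are invented for illustration, not drawn from any real catalog.

```python
# Hypothetical sketch of the QR-code wayfinding scenario described above.
# A scanned QR payload maps to a book record, which yields a direction.
CATALOG = {
    "qr:0001": {"title": "Augmented Reality Primer",
                "call_number": "QA76.9 .A94",
                "section": "3rd floor, aisle 12"},
    "qr:0002": {"title": "Library Wayfinding",
                "call_number": "Z679.2 .L53",
                "section": "2nd floor, aisle 4"},
}

def guide_to_book(scanned_code: str) -> str:
    """Return a human-readable direction for a scanned QR payload."""
    record = CATALOG.get(scanned_code)
    if record is None:
        return "Unknown code - please re-scan."
    return f"{record['title']} ({record['call_number']}): head to {record['section']}."

print(guide_to_book("qr:0001"))
```

The real work in such a system is in the camera-side recognition and the live overlay; the lookup shown here is only the data layer underneath it.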
2012-03-07T09:23:14.798-05:00

First off, let's get right to the first question. Gasp!!! Yes, it has been over a year since my last post to this blog! The cobwebs have taken over and the number of readers has dropped. While I have a regularly scheduled event on my calendar to create new entries, well, work has gotten in the way. Fortunately, my lame excuse is also a good topic for a post!

As regular readers of this blog are aware, one of my regular themes is scholarly communication. More specifically, the broadening of what is considered scholarship for promotion and tenure purposes. During the past year, I have been serving as the Procedures Oversight Designee (POD) for the Ohio State University Libraries. In short, my role is to help ensure that the P&T reviews follow the procedures as defined by the Board of Trustees, the Office of Academic Affairs, and the Libraries.

In addition to the normal workload required when taking on such a role, there were many significant changes made to our review process PLUS we experienced a bit of a 'baby boom.' Until this past year, a committee of tenured faculty consisting of about 1/3 of the eligible voting faculty reviewed all cases. Given the size of the review group and the fact that there are normally around 3 cases to review annually, materials were made available in a single paper dossier stored in our HR office. Committee members checked them out and reviewed them on site. Given the scale, the system worked.

Last year, our faculty voted to change the review procedures so that all the eligible faculty would review all the cases. At the same time, the number of cases scheduled for review was 4 times larger than usual. Having almost 40 individuals descending upon HR to review over a dozen dossiers made little logistical sense.
This convergence of events required a significant rethinking of the method used to make the review documents available. The solution was to create an eDossier system using our course management system. A 'course' was set up for the review materials, with each candidate given their own content tree. All the materials that made up the physical dossier were scanned and uploaded into the system. The total number of documents for all the candidates was just over 600. All the eligible voting faculty were added as students and were granted access to those dossiers that they were eligible to review. Access to materials was turned on and off as required by the review schedule.

While this approach required a significant time commitment on my part, it really represented a small percentage of the time saved collectively by faculty reviewers, since they didn't have to take a trip over to HR to read paper dossiers. Instead, the dossiers could be reviewed online wherever and whenever. One faculty member commented that they even reviewed materials on their iPad while waiting at an airport. There will be additional workflow efficiencies in future reviews, since documents from pre-tenure reviews will be added into the system as they become available.

Based on my experience, one of my new goals is to develop a plan that uses a similar approach to distribute materials to external evaluators. This will require buy-in from the Office of Academic Affairs.

While the POD role is considered an overload responsibility, all the changes turned it into a full-time job at times. Yet I still had to manage it plus my regular job responsibilities plus keep a reasonable work-life balance. Something had to give, and it was this blog. So, please accept my excuse for the long time between posts. It's good to see you again [...]
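The core access rule described above, that reviewers see a candidate's materials only while the review schedule says the window is open, is simple to state concretely. This is a hypothetical sketch, not the actual course-management-system configuration; the candidate names and dates are invented.

```python
from datetime import date

# Hypothetical eDossier access rule: a dossier is visible to eligible
# reviewers only during its scheduled review window (dates invented).
ACCESS_WINDOWS = {
    "candidate_a": (date(2012, 1, 9), date(2012, 2, 3)),
    "candidate_b": (date(2012, 2, 6), date(2012, 3, 2)),
}

def can_review(candidate: str, today: date) -> bool:
    """True if the review window for this candidate's dossier is open."""
    window = ACCESS_WINDOWS.get(candidate)
    if window is None:
        return False
    start, end = window
    return start <= today <= end

print(can_review("candidate_a", date(2012, 1, 20)))  # window open
print(can_review("candidate_b", date(2012, 1, 20)))  # window not yet open
```

In practice, the course management system handled this by turning content access on and off per review group rather than evaluating dates per request, but the effect is the same.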
2011-02-14T16:43:15.928-05:00

I have been having an increasing number of conversations with colleagues about the creation of mobile apps. Much of the conversation is not about IF apps are needed; instead it is focused on how apps should be developed. There are two different types of mobile apps, each with advantages and disadvantages. The apps one would find in the iTunes App Store or the Android Marketplace are known as "native" apps. Native apps are pieces of software that must be installed on the device and in most cases are downloaded from a distribution point. A Web app, on the other hand, is not a piece of software but a web site optimized for viewing on mobile devices. A well-designed Web app can have all the look and feel of a native app.

Develop a native app if:

- your library needs to take advantage of the features built into the device itself, for example, to vibrate the phone or use GPS. However, this will be changing soon as HTML5 rolls out. Web application developers are already using solutions like PhoneGap, an open source framework suite that provides support for a variety of device features on a variety of platforms. (video)
- your library needs to make sure content or a service is available offline. If the core purpose of your application is to make your content available without an Internet connection, then a native app is needed.
- performance and user responsiveness are crucial
- your library is looking to make money directly from the sale of the app
- your application needs to access the device file system

Develop a Web app if:

- your library web site has all the same content that will be featured in the app
- your library is interested in potentially reaching users on different devices and platforms with the same app. An Apple native app can only be used on iDevices and is not easily ported to other platforms such as Android and BlackBerry.
Web apps are platform-agnostic.
- your library wants its app content to appear in search engine results. Library users are beginning to demand that mobile-optimized library content shows up in those results. Content in a native app will not show up in Internet search results.

General considerations:

- Native app development could be more expensive than building web apps, since a greater skill set is required to build apps for multiple platforms.
- Native apps require the use of a software development kit supplied by each operating system creator.
- Developing apps for multiple platforms requires maintaining and creating enhancements for each.
- Native app user interfaces tend to be smoother and take greater advantage of the full graphics capabilities of a device.
- Web apps require round trips to the server where the app is hosted, whereas with a native app that time is almost instantaneous.
- Web app content is more current because it refreshes itself from the network.
- Most native app stores require approval; Web apps can be deployed immediately.
- Native apps require updates to be installed; Web app changes are immediate.
- Native apps may be more secure.

Resources:
Meredith Farkas. The Library in Your Pocket: Mobile Trends for Libraries
Brian Fling. Mobile Design and Development: Practical concepts and techniques for creating mobile sites and web apps
Lorraine Paterson. Designing for Mobile Devices in Higher Education Research
[...]
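The native-versus-web criteria above amount to a simple rule of thumb: if any native-only requirement applies, build native; otherwise a web app will do. This is a deliberately crude encoding of those bullets for illustration; real decisions involve more nuance (budget, staff skills, timelines) than a boolean check.

```python
# Rule-of-thumb encoding of the native-vs-web criteria listed above
# (illustrative only; flag names are my own shorthand for the bullets).
def recommend_app_type(needs_device_features: bool = False,
                       needs_offline: bool = False,
                       performance_critical: bool = False,
                       selling_app: bool = False,
                       needs_file_system: bool = False) -> str:
    """Return 'native' if any native-only requirement applies, else 'web'."""
    native_reasons = [needs_device_features, needs_offline,
                      performance_critical, selling_app, needs_file_system]
    return "native" if any(native_reasons) else "web"

print(recommend_app_type(needs_offline=True))  # offline content -> native
print(recommend_app_type())                    # no native-only needs -> web
```

Note that the "needs device features" test is exactly the criterion the post says HTML5 and PhoneGap are eroding, so this rule of thumb has a shelf life.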
2011-01-18T12:54:04.931-05:00

My buddy Jeff and I took our annual trip up to Detroit to check out the North American International Auto Show this past Monday. While we were there to see all the new automobile technology, I again paid attention to the use of technology on the floor:
2010-12-06T11:32:15.431-05:00

No. Linkrolls are not a delicious appetizer or a holiday baked good.
2010-11-09T14:37:45.563-05:00

I have stated in the past that I feel blogging is a valid form of scholarly communication in the discipline of academic librarianship. Still, the question continues to arise as to whether blogging should count as scholarship or a creative activity in academic promotion and tenure.

In "Bloggership, or is publishing a blog scholarship? A survey of academic librarians" (Library Hi Tech, Vol. 28, Iss. 3, pp. 470-477, DOI 10.1108/07378831011076701), Arthur Hendricks details the results of a survey of academic librarians to uncover how much weight their libraries, and/or their parent institutions, place on blogs in promotion and tenure reviews. Of the 67 complete responses, 53.6 percent indicated that their performance review committees do not weigh a blog the same as an article published in a peer-reviewed journal, while only 1.5 percent stated they did.

Respondents were asked, “If you consider the above blogs to be scholarly (equal to an article published in a peer-reviewed journal), please describe why.” Answers varied, but one person wrote, “I'm not sure I would say ‘equal to peer reviewed journal’ but as intellectually thoughtful, important, and influential? sometimes. They tend to be more in the formative stage, like a conference presentation rather than the lengthy, substantial, finished nature of a peer reviewed article.” Of those respondents who publish a blog, 57.1 percent indicated that they find others' blogs to be scholarly.

Younger librarians are more inclined than their older colleagues to think of their blog as counting toward scholarship. Of those 22-30 years of age, 40.0 percent indicated that they thought their blog should count as scholarship, and of those 31-40 years of age, 27.3 percent thought their blog should count.
None of those 41-50 years of age indicated that their blog should count as scholarship, and of those over 51, 12.5 percent considered their blog scholarly.

From the information provided in the paper, it appears that many of the respondents equate research with scholarship, when in fact research is a subset of scholarship. Scholarship is the creation of new knowledge or the organization of knowledge within a new framework or presentation. Scholarship can take the form of a peer-reviewed publication, but it can also be evidenced in other ways such as exhibits, public performances, digital resources, and papers at professional meetings. So, if a blog communicates some sort of new knowledge or the organization of knowledge within a new framework or presentation, or is even seen as an equivalent of a conference presentation, it is indeed scholarship.

Criteria for evaluating any work of scholarship in any form should take into consideration originality, breadth of dissemination, and impact on scholarship and/or practice in the field of librarianship. I would argue that blogs may be having a greater impact on the practice of librarianship than traditional publications are. Blogs have invigorated the exchange of ideas within librarianship and have enabled academics to connect with a larger general readership for their insight and expertise.

What was very interesting was that, for an article discussing scholarly blogging, it did not include a single reference to a blog. If blogs are to be recognized as scholarly contributions, then they should also be cited as such.

See also: The “voice” of academic librarianship [...]
2010-10-25T12:35:00.582-05:00

The Educause Center for Applied Research (ECAR) Study of Undergraduates and Information Technology is a longitudinal study of students and information technology. The study focuses on what kinds of information technologies these students use, own, and experience; their technology behaviors, preferences, and skills; how IT impacts their experiences in their courses; and their perceptions of the role of IT in the academic experience.
2010-10-06T16:21:13.171-05:00

About a year ago, I wrote about the idea of creating a University-branded URL shortening service. Late last month, a small team that I collaborated with at Ohio State launched such a service, called Go.OSU.
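The post doesn't describe how Go.OSU mints its short codes, but the standard approach for a branded shortener is to encode a sequential database ID in base 62 and append it to the branded domain. A minimal sketch of that encoding step, under that assumption (nothing here reflects Go.OSU's actual implementation):

```python
import string

# One common way a URL shortener mints codes: base-62-encode a
# sequential ID (hypothetical; not Go.OSU's documented design).
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase  # 62 chars

def encode_base62(n: int) -> str:
    """Encode a non-negative integer as a base-62 string."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n > 0:
        n, r = divmod(n, 62)
        digits.append(ALPHABET[r])
    return "".join(reversed(digits))

# ID 125 becomes the two-character code "21"; a lookup table maps
# each code back to its long URL on redirect.
print(encode_base62(125))  # "21"
```

Two characters already cover 3,844 links and six characters cover tens of billions, which is why branded short links stay short.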
2010-08-25T10:22:18.229-05:00

In what now appears to be a trend, the NYT reports that the prestigious 60-year-old Shakespeare Quarterly is also experimenting with open peer review.
2010-06-16T12:45:28.052-05:00

Christopher S. Penn's post Are you ready for the Twitpocalypse? details coming changes to the Twitter API that will impact many of the widgets, sites, clients, and applications one may use to access Twitter.
2010-02-15T13:13:06.595-05:00

I attended the annual address to the University Senate presented by Ohio State's Executive Vice President and Provost Joseph A. Alutto late last week. While I do not normally go to Senate sessions, I was given the heads-up that his address would include a discussion of potential changes to our promotion standards.

Although we talk about the need for faculty to have balance in their portfolios, the reality is that when it comes to promotion to full professor, faculty dossiers become a monopod: research. Some faculty have responsibilities that are essential for the organization to succeed, as is the case within the University Libraries system. The work of library faculty involved in e-resources or building digital collections makes visible and demonstrably outstanding contributions to the missions of the university. The perception communicated by many faculty is that since such activities are not traditional research, they will not be given much weight in full professor deliberations.

Provost Alutto's observations support this perception:

"That leads me to the standards used for promotion from associate professor to full professor. Here I am talking about cases in which the 30- to 40-year compact between university and professor, the thing we call tenure, has already been agreed to and celebrated. Given that this commitment has been made, the next question is what should be the basis for advancement from associate professor (with tenure) to full professor (with tenure)...

"One answer, and the one that is most reflected in our formal documents and policies, is “more of the same.” That is, a full professor is supposed to have more publications, greater teaching achievements, and higher service contributions to justify promotion. Wonderfully, for many faculty members, this is exactly the pattern we see played out. They continue to perform powerfully on all dimensions.
However, in reality, promotion to professor tends to be based primarily on assessments of the impact of a faculty member’s scholarship in a particular discipline."

"If one reviews hundreds of such promotion cases, as does any provost, it becomes clear that promotion to full professor tends to be reserved for those whose research impact is clearly superior. The faculty member whose primary impact and distinctive contributions are in the areas of dissemination of knowledge through teaching or service to the university or professional associations will tend to be passed over for promotion to full professor—unless a department can find a way to “fudge” a demonstrated level of research impact."

As the Provost points out, this approach is insidiously harmful. It generates cynicism among productive faculty when they realize the “game” being played. This can frustrate productive faculty who contribute to their disciplines and the university in ways other than traditional research. It not only flies in the face of everything we have been told about the need for a balanced portfolio, it also overlooks the need to recognize evolving interests and skills. It tends to exacerbate to dysfunctional levels all differences in perspective about what is valuable, both personally and institutionally.

The Provost continues:

"Given these observations, I intend to work with faculty and administrative groups to begin focusing on the following:

- making certain that there are clear criteria for assessing “impact”—whether in terms of research, teaching, or service—in cases of promotion to professor; in all such cases, these criteria should involve both quantitative and qualitative measures, most of which will require seeking data from external sources rather than relying on purely internal ones
- ensuring clear identification of the bases for promotion to professor; these might wel[...]
2010-01-06T13:35:36.066-05:00

In my inbox yesterday morning was the news that Tom Sanville has submitted his resignation as Executive Director of OhioLINK, effective March 31. The OhioLINK system has grown and flourished primarily because of Tom's leadership (and by hiring a very talented staff!).

When I came to Ohio State in 1992, OhioLINK was still crawling. The OhioLINK system consisted of an ILS that included catalog creation and maintenance; the online public access catalog; circulation, interlibrary loan, and document delivery; acquisitions and serials control; and collection development and management. While the OhioLINK Library Catalog is still a centerpiece, the system now includes databases, a Digital Media Center, an E-Book Center, an Electronic Theses and Dissertations Center, and the Electronic Journal Center. Just recently, I came across an old 1998 talking points document where I was touting the 1,400 electronic journal titles made available to our users via OhioLINK, from two publishers - Elsevier Science and Academic Press. (see: Diedrichs, CP. E-journals: the OhioLINK experience) By last year, the Electronic Journal Center contained more than 8,200 full-text research journals (12.2 million articles) from 100+ publishers.

Tom's reach went well beyond Ohio. If you are part of a consortium that licenses electronic journals, you should also thank Tom for his service. He was one of the pioneers in the establishment of consortium pricing from publishing giants like Elsevier. It is possible that one of the license agreements you have signed today (did I mention pricing?) grew from Tom's adept negotiation skills.

The text of his email announcement:

"I have submitted my resignation to Eric Fingerhut, Chancellor of the Ohio Board of Regents, as Executive Director of OhioLINK. This will be effective March 31, 2010.
This should ensure a full transfer of my almost 18 years of knowledge and files to other staff.

After my long tenure as director, this was a very difficult decision for me, but I believe it's the right one for me at this time. With the continued evolution of an integrated Educational Technology infrastructure and the OhioLINK staff’s role in it, and with the establishment of key strategic projects as reflected in the OhioLINK Fall 2009 Update, this is an appropriate time to resign my position and to seek new challenges.

In accepting my resignation the Chancellor has been gracious in recognizing that “OhioLINK is a wonderful example of the collaborations possible when many institutions work together with the support of the state to share resources and reduce costs. Your formative leadership in this effort is widely acknowledged and greatly appreciated… As educational technology and the format of academic materials rapidly changes, OhioLINK will be called on to play a critical role in expanding the availability of new resources to all academic institutions and directly to students and faculty.”

The job of OhioLINK will never be done. It has been my greatest professional experience to have worked with the OhioLINK community and staff in building a world-recognized library consortium. I am grateful for having had the opportunity to serve with so many wonderful colleagues and friends over the years."

Thank you, Tom, for sharing your vision and service to the state and the library profession. [...]
2010-01-04T13:28:30.794-05:00

One of the more interesting discussions I have read since rebooting from a long vacation has been the Twitter weirdness uncovered by the Disruptive Library Technology Jester (a.k.a. Peter Murray).

It all started when the Jester authored a blog post detailing his ALA Midwinter meeting plans. It appears that others have been constructing tweets that consist of his blog post headlines with links back to his postings. This practice has increased dramatically in the past few weeks. The Jester uses a WordPress plugin to inject posts into his Twitter stream. He runs the BackType service to uncover commentary found in other social media sites so it can be added as comments to his postings. It was the BackType service that alerted the Jester to the Twitter updates.

In all cases, the Twitter IDs used are unlike those of other spammers. The account names did not contain a string of numbers, and the accounts had numerous followers. The Jester did notice one thing in common with all of these updates: they came from the Twitterfeed service. The Jester speculates that these users are grabbing his blog post feed using Twitterfeed and are then syndicating it into their own Twitter streams. He has since analyzed this hypothesis and uncovered other blogs which are also being tweeted by others.

What is interesting is that the tweets of the Jester's blog post headlines are not spam in and of themselves. The shortened URLs to his posts do not redirect the user to spam sites, but instead go to the original posts. However, the Twitter profiles of some of these accounts contain profile URLs for spam.

In the comments of the Jester's post, D0r0th34 speculates, and Mr. Gunn concurs, that the prank being performed on the Jester sounds like a Twitterfarm, a la a Google linkfarm. Mr. Gunn observes:

"Many of the follower qualification tools use follower ratios to determine spamminess of followers.
Additionally, there’s a parallel in "macro" blogging where presumptive search engine spiders are really just post harvesters. The point is to provide real or whitelisted content for getting around spam filters, increasing pagerank, or making it look like a twitter account has real non-spam content."

So, while the Jester's posts are not being used to propagate spam per se, it appears they are being used to make spammer user profiles look, well, a lot less spammy. I guess in a weird way that should make the Jester feel good. His content is good enough to make it around spam filters. Still, it could be troubling for the Jester if the additional posts directed to his blog flag it for banning/autobury by those sites that keep a lookout for those who post too much (e.g. Digg). [...]
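Mr. Gunn's point about follower ratios is worth making concrete: tools that score "spamminess" often just compare how many accounts a user follows against how many follow back. A toy version of such a heuristic, with thresholds I invented for illustration (real follower-qualification tools are undocumented and doubtless more elaborate):

```python
# Toy follower-ratio spamminess heuristic of the kind Mr. Gunn describes.
# The 0.1 threshold is invented for illustration, not taken from any tool.
def looks_spammy(followers: int, following: int,
                 ratio_threshold: float = 0.1) -> bool:
    """Flag accounts that follow many users but attract few followers back."""
    if following == 0:
        return False  # follows nobody: the ratio signal doesn't apply
    return (followers / following) < ratio_threshold

print(looks_spammy(followers=20, following=2000))  # True: 0.01 ratio
print(looks_spammy(followers=500, following=300))  # False: healthy ratio
```

This is exactly why syndicating real blog headlines helps a spam account: the borrowed content attracts genuine followers, which pushes the ratio back above the threshold.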
2009-12-14T10:32:32.974-05:00

The CDC (Centers for Disease Control and Prevention) is a large government agency with 14,000 full-time, part-time, and contract employees. While headquartered in Atlanta, the CDC has a large, geographically dispersed workforce working in 19 facilities across the United States and 54 countries around the globe. Since many of the CDC's employees are isolated geographically, the processes of collaborating, communicating, and sharing information efficiently and effectively are challenging.

The solution to their challenge is what they call the IdeaLab. IdeaLab is a web-based application which CDC employees are encouraged to use to post their ideas, comment on others’ posts, and vote on the quality of the posts and comments. Submissions are attributed and authenticated in real time. Ideas are categorized according to CDC organizational goals, and related ideas are affinity-grouped using tag clouds. A weekly “Bright Idea” highlights a submission that has broad agency interest across multiple national centers and offices. All communications are stored in a searchable archive that anyone at CDC can review at any time. IdeaLab enables CDC employees to "use their insights and experiences to help colleagues build and implement high-impact solutions to important public health challenges."

The CDC hopes that the IdeaLab will:

- Increase connectivity of CDC employees who support multidisciplinary, evidence-based solutions
- Promote scientific crowd-sourcing and peer-2-peer networking to build ideas, enable virtual piloting and refinement of ideas, and foster rapid implementation and adoption of the best ideas
- Foster retention and sharing of institutional memory
- Improve interactions among networks of knowledge
- Improve orientation for and assimilation by new employees
- Accelerate health impacts by increasing employee-driven innovation and improving organizational efficiency

[...]
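Two of IdeaLab's mechanics described above, affinity-grouping ideas by tag frequency and surfacing a top-voted "Bright Idea," are easy to model in a few lines. The idea titles, tags, and vote counts below are invented for illustration; the CDC's actual data model is not public.

```python
from collections import Counter

# Toy model of two IdeaLab mechanics: a tag cloud built from tag
# frequencies, and a "Bright Idea" chosen by votes (data invented).
ideas = [
    {"title": "Cross-center data portal", "tags": ["data", "collaboration"], "votes": 42},
    {"title": "Field office wiki",        "tags": ["collaboration", "knowledge"], "votes": 17},
    {"title": "Open dataset registry",    "tags": ["data"], "votes": 31},
]

# Tag cloud: how often each tag appears across all submissions
# (tag frequency is what drives font size in a rendered cloud).
tag_cloud = Counter(tag for idea in ideas for tag in idea["tags"])

# "Bright Idea": the submission with the most votes.
bright_idea = max(ideas, key=lambda idea: idea["votes"])

print(tag_cloud.most_common())
print(bright_idea["title"])
```

The real system also weighs breadth of interest across centers and offices when picking the weekly highlight, which a pure vote count does not capture.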
2009-12-07T11:04:49.568-05:00

Harvard researchers recently released the study Hospital Computing and the Costs and Quality of Care: A National Study, which examined computerization’s cost and quality impacts at 4,000 hospitals in the U.S. over a four-year period.
"We used a variety of analytic strategies to search for evidence that computerization might be cost-saving. In cross-sectional analyses, we examined whether more computerized hospitals had lower costs or more efficient administration in any of the 5 years. We also looked for lagged effects, that is, whether cost-savings might emerge after the implementation of computerized systems. We looked for subgroups of computer applications, as well as individual applications, that might result in savings. None of these hypotheses were borne out. Even the select group of hospitals at the cutting edge of computerization showed neither cost nor efficiency advantages. Our longitudinal analysis suggests that computerization may actually increase administrative costs, at least in the near term."

Himmelstein, D., Wright, A., & Woolhandler, S. (2009). Hospital Computing and the Costs and Quality of Care: A National Study. The American Journal of Medicine. DOI: 10.1016/j.amjmed.2009.09.004
2009-12-02T16:10:14.833-05:00

I just finished reading What Technology? Reflections on Evolving Services, a report from Sharon Collins and the 2009 EDUCAUSE Evolving Technologies Committee. For the first time, information resource management technologies for libraries were featured as an evolving technology.
"The increasing primacy of highly distributed digital resources has brought disruptive change to the way libraries must approach their work to remain relevant to their parent organizations and constituencies."
"Organizing content to support research and learning is at the heart of the library's institutional role. Once limited to applying subject terms, co-locating physical materials, and producing research guides, this role has been changed by the volume and variety of online resources, which require new tools to more effectively meet the needs of users. A growing collection of technologies and tools can be used to more granularly organize, customize, and personalize the online information environment to fit professional, learning, and research activities."
"These technologies are evolving away from being strictly stand-alone tools and resources and are converging into a more interoperable, collaborative, enterprise-level information management environment — one more closely integrated with teaching, learning, research, and administrative systems. Underlying system architectures are focusing more on providing discrete services (service-oriented architecture) rather than monolithic systems, enabling more interoperable and customizable workflows."
"By combining discrete services with cloud storage and cloud-enabled applications, institutions can build collaborative work environments between libraries as well as between libraries and non-library units, both on and off their home campuses, for discovering, acquiring, describing, and managing all types of resources. Layered over this enterprise-level resource management environment, information discovery and management tools are providing individuals and workgroups with much more intuitive and productive ways to discover, manipulate, incorporate, and share information for teaching, learning, and research, allowing users to shift time from the mechanics of managing specific resources to a focus on analyzing the information itself."
2009-11-16T12:16:04.818-05:00

A one-day NSF-funded workshop entitled "Scholarly Evaluation Metrics: Opportunities and Challenges" will take place at the Renaissance Washington DC Hotel on Wednesday, December 16th, 2009. The 50 available seats were filled the day that the workshop was announced. I would have loved to be in attendance, given my role as a P&T chair, but I heard about it four days after the announcement.

The focus of the workshop is the future of scholarly assessment approaches, including organizational, infrastructural, and community issues. The overall goal is to "identify requirements for novel assessment approaches, several of which have been proposed in recent years, to become acceptable to community stakeholders including scholars, academic and research institutions, and funding agencies."

Panelists include Oren Beit-Arie (Ex Libris), Peter Binfield (PLoS ONE), Johan Bollen (Indiana University), Lorcan Dempsey (OCLC), Tony Hey (Microsoft), Jorge E. Hirsch (UCSD), Julia Lane (NSF), Michael Kurtz (Astrophysics Data Service), Don Waters (Andrew W. Mellon Foundation), Jevin West (UW/eigenfactor.org), and Jan Velterop (Concept Web Alliance).

A summary of the goal of the workshop:

The quantitative evaluation of scholarly impact and value has historically been conducted on the basis of metrics derived from citation data. For example, the well-known journal Impact Factor is defined as a mean two-year citation rate for the articles published in a particular journal. Although well-established and productive, this approach is not always best suited to fit the fast-paced, open, and interdisciplinary nature of today's digital scholarship. Also, consensus seems to be emerging that it would be constructive to have multiple metrics, not just one. In the past years, significant advances have been made in this realm. First, we have seen a rapid expansion of proposed metrics to evaluate scientific impact.
This expansion has been driven by interdisciplinary work in web, network, and social network science, e.g. citation PageRank, the h-index, and various other social network metrics. Second, new data sets such as usage and query data, which represent aspects of scholarly dynamics other than citation, have been investigated as the basis for novel metrics. The COUNTER and MESUR projects are examples in this realm. And, third, an interest in applying Web reputation concepts in the realm of scholarly evaluation has emerged and is generally referred to as Webometrics. A plethora of proposals, both concrete and speculative, has thus emerged to expand the toolkit available for evaluating scholarly impact, to the degree that it has become difficult to see the forest for the trees. Which of these new metrics and underlying data sets best approximate a common-sense understanding of scholarly impact? Which can best be applied to assess a particular facet of scholarly impact? Which ones are fit to be used in a future, fully electronic and open science environment? Which make the most sense from the perspective of those involved with the practice of evaluating scientific impact? Which are regarded as fair by scholars? Under which conditions can novel metrics become an accepted and well-understood part of the evaluation toolkit that is, for example, used in promotion and tenure decisions? I look forward to the Twitter stream. [...]
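Of the metrics named above, the h-index is one of the few that is trivial to compute directly from raw citation counts: an author has index h if h of their papers have at least h citations each. A quick back-of-the-envelope Python sketch (my own illustration, not taken from the workshop materials):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    # Sort citation counts in descending order, then find the last
    # position where the count still meets or exceeds its 1-based rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h
```

For example, an author whose five papers have been cited 10, 8, 5, 4, and 3 times has an h-index of 4: four papers have at least four citations each, but not five papers with at least five.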
2009-11-02T15:29:49.494-05:00A report entitled Patterns of Information Use and Exchange: Case Studies of Researchers in the Life Sciences has been released by the British Library and the Research Information Network. The report was developed by capturing the day-to-day patterns of information use in seven research teams from a wide range of disciplines. The study, undertaken over 11 months and involving 56 participants, concluded that ‘one-size-fits-all’ information and data sharing policies are not achieving "scientifically productive and cost-efficient information use in life sciences". Skip past all of that and jump to page 47 of the report. There, they state (I'll let the report speak for itself):"Conventional university library facilities rank low as a vehicle for accessing published information. The traditional role of professional information intermediaries has been largely replaced by direct access to online resources, with heavy reliance upon Google to identify them. Given the limitations of generic search engines such as Google, measures to reconnect researchers with IIS professionals could bring improvements in information retrieval, and benefits to the research process."Researchers also tend to use services that have been ‘proven’ by colleagues, or to interrogate websites they regard as authoritative and comprehensive in their field. When they use such services, researchers tend to take the results on trust: the specificity and the breadth of the information retrieved do not appear to require further enquiry."The result of all these developments is that many life science researchers have removed themselves from the mainstream library user population. They do not even use the library catalogue. Library-based services can replace the services researchers do use only by demonstrating that they can improve retrieval capability, and deliver results within a timeframe that corresponds to researchers’ own patterns of work. 
This is a significant challenge when researchers are driven by a desire for immediate online access to specific resources of interest, at a time convenient to them, and from a known and trusted source."Overall they found that the groups that they studied use a narrow range of search engines and bibliographic resources, for three reasons:
• lack of awareness and time to achieve or build a broader suite
• the ‘comfort’ that comes from relying on a small set of familiar resources, usually endorsed by peers and colleagues, and
• the cost in time and effort needed to identify other resources, and to learn to use them effectively.
They detail what would appear to be emerging roles of the library in a researcher's information seeking patterns: "The challenge for institutional information services is thus to develop and provide online services geared to the needs of their research groups and thereby to add value to the research process, facilitating the use of new tools, providing individuated professional support, as well as advice, training and documentation on a subject or discipline basis. Any such strategy would have to be proactive: as noted by our regenerative medicine group, researchers are reluctant to adopt new tools and services unless they know a colleague who can recommend or share knowledge about them.""Library and information service providers in the higher education sector need to come to a clearer view of their structures and roles... some of our groups expressed a desire for better portals and tools to identify the information resources relevant to researchers wo[...]
2012-08-10T11:04:27.531-05:00Twitter users are quite familiar with URL shortening tools as a way to include web links within their 140-character limit. URL shortening is the process of taking a long URL and turning it into, well, a short one. For example, instead of using the long URL of http://library.osu.edu/blogs/techtips/2009/09/21/techtips-augmented-reality/ one can use the shortened URL of http://tinyurl.com/ykdkmss. Shortened URLs are extremely useful in Internet conversations such as forum threads, IM chats, etc. They are also essential in communication channels limited to a specific number of characters, such as Twitter. Shortened URLs can also be useful when reading long URLs aloud to customers over the phone, adding URLs to print materials, and when showing them on video displays or during presentations. Shortened URLs are also easier to enter into a mobile device. There are many services that create shortened URLs, most notably TinyURL.com. OCLC was ahead of this game way back in 1995 with their PURL (Persistent Uniform Resource Locator) service. While the goal of PURL was to allow content providers to redirect users as content moves from site to site, it did so using shorter URLs. The mechanism for resolving a shortened URL is simple: the browser is directed to the shortened URL site. That site performs an HTTP redirect of the address, and the browser is sent to the registered long URL. The URL shortening service maintains the master table of redirects. One problem is that all the shortened links die when such free services die, as tr.im almost did in August '09. As a result, members of the academic community who rely upon such services will eventually lose access to their shortened links. This will require reentering the URLs into another service, which might also die. Another concern with existing shortening services is that the URL domain plays an important role in identifying the authority of a Web resource. 
Shortened URLs lose their link and organizational information. All brand/name recognition - the authority of an organization - goes away, since the domain is hidden within the shortened URL. One needs to click on the shortened URL and visit the redirected site before discovering the domain's authority. An example of where short-URL branding works is Flickr. Each photo page also gets a shortened Flickr URL. The domain flic.kr is owned and operated by flickr.com, so the shortening service will be as reliable as the Flickr service. When someone goes to the site flic.kr they know they will get to a Flickr photo page, not a redirect to a site containing malware. It therefore makes a lot of sense that academic institutions consider building their own URL shortening services as a way to brand and create authority with their shortened URLs. One university that has done just that is the University of Nebraska-Lincoln. Wayne State University also appears to have such a service. I would love to see a local url.osu.edu shortening service. If I had the programming chops, I would write it over the next weekend. I know. It's easier to start a shortening service than it is to maintain it in perpetuity. Yet creating an in-house URL shortening service not only helps to promote and support the institutional brand, it lessens the chance that the institution's carefully crafted custom links will die if the third party goes down, or goes out of business. [...]
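The redirect-table mechanism described above is simple enough that a toy version fits in a few lines. Here is a hedged Python sketch of what such a service might keep under the hood (the class, method, and url.osu.edu domain names are my own illustration, and the HTTP 301/302 redirect a real service performs on lookup is omitted):

```python
import string

# 62 characters: 0-9, a-z, A-Z, used to keep generated codes short.
BASE62 = string.digits + string.ascii_lowercase + string.ascii_uppercase


class Shortener:
    """Toy URL shortener: a counter plus a master table of redirects."""

    def __init__(self, domain="url.osu.edu"):
        self.domain = domain
        self.table = {}    # short code -> long URL (the master redirect table)
        self.counter = 0

    def _encode(self, n):
        # Base-62 encode the counter so codes stay as short as possible.
        if n == 0:
            return BASE62[0]
        out = []
        while n:
            n, r = divmod(n, 62)
            out.append(BASE62[r])
        return "".join(reversed(out))

    def shorten(self, long_url):
        code = self._encode(self.counter)
        self.counter += 1
        self.table[code] = long_url
        return "http://%s/%s" % (self.domain, code)

    def resolve(self, short_url):
        # A hosted service would answer this lookup with an HTTP redirect.
        code = short_url.rsplit("/", 1)[-1]
        return self.table[code]
```

A real service would wrap resolve() in a web handler that issues the redirect, and would persist the master table so that links survive restarts; as noted above, keeping that table alive in perpetuity is the hard part.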
2011-02-11T12:32:47.662-05:00Thank You President Gee! At his annual presidential address yesterday afternoon, Ohio State University President E. Gordon Gee said it is time for faculty members to be evaluated on the quality and impact of their work. New faculty members at Ohio State University Libraries enter as assistant professors and have six years to build up their record of scholarship, teaching, and service. They receive performance evaluations every year, a fourth-year comprehensive review, and in their sixth year undergo a more rigorous examination to see if they measure up to the level of performance required for tenure and a promotion to associate professor. Library faculty can then choose to undergo an additional review later in their careers to attain the rank of full professor. President Gee said in his address that professors should be rewarded for their talents and should be encouraged to work with academic departments outside their own. Instead of using an arbitrary formula for evaluation, he would like OSU to create a system in which faculty members are judged on the quality of their work and their impact on students, their disciplines, and the community. This is exactly the position I have been advocating, not only on this blog but in discussions with my library faculty colleagues. Even though I have articulated to colleagues all the points that Gee highlighted, inertia has indeed won out. From his prepared remarks:Let me state this directly: We must change our recognition and reward criteria.Since I returned to Ohio State two years ago, I have made this point a number of times. Changing the way we define scholarship, appreciate new forms of engagement, and properly reward superb teaching can be this University’s signal differential.If we do not properly and tangibly value those activities, our efforts to extend our resources more fully into our communities will be stymied. 
We must take it upon ourselves to revise the centuries-old equations for promotion and tenure and develop new reward structures.Without a doubt, this is a nettlesome issue. And I am not the first person to raise it. Ernie Boyer articulated the case nearly 20 years ago in a speech here on campus. And of course he did so very persuasively in his 1990 book, “Scholarship Reconsidered,” in which he called for “recognition that knowledge is acquired through research, through synthesis, through practice, and through teaching.”At Ohio State, and at colleges and universities across the country, we have long had faculty committees devoted to looking at revising promotion and tenure standards. And yet, the status quo remains. Inertia is winning.I believe we must finally speak aloud the truth: that some arbitrary volume of published papers, on some narrowly defined points of debate, is not necessarily more worthy than other activities.Ladies and gentlemen, this University is big and strong enough to be bold enough to judge by a different standard.We can dare to say, “No more,” to quantity over quality.We can stop looking at the length of a vita and start measuring its true heft.This University, finally, can be the first to say, “We judge by a different standard.” And let others follow our lead, if they wish. I sit here thinking: what if OSU Libraries HAD acted a year ago and begun to change our criteria? Would we have been included in the President's speech as the leaders of where the University should be heading? Would that have raised our visibility on campus? As a pro[...]
2009-10-07T07:16:00.309-05:00The approach that Kathleen Fitzpatrick has taken with her new book manuscript might be one possible path for the future of scholarly communication.
"One of the points that this text argues hardest about is the need to reform peer review for the digital age, insisting that peer review will be a more productive, more helpful, more transparent, and more effective process if conducted in the open. And so here’s the text, practicing what it preaches, available online for open review."Not only is the process being used to write the manuscript exciting; the manuscript itself is as well. A couple of parts of the text relate to the academic rewards system:
"our institutional misunderstanding of peer review as a necessary prior indicator of “quality,” rather than as one means among many of assessing quality, dooms us to misunderstand the ways that scholars establish and maintain their reputations within the field."
"we need to remind ourselves, as Cathy Davidson has pointed out, that the materials used in a tenure review are meant in some sense to be metonymic, standing in for the “promise” of all the future work that a scholar will do (“Research”). We currently reduce such “promise” to the existence of a certain quantity of texts; we need instead to shift our focus to active scholarly engagement"
"Until institutional assumptions about how scholarly work should be assessed are changed — but moreover, until we come to understand peer-review as part of an ongoing conversation among scholars rather than a convenient means of determining “value” without all that inconvenient reading and discussion — the processes of evaluation for tenure and promotion are doomed to become a monster that eats its young, trapped in an early twentieth century model of scholarly production that simply no longer works."
"I want to suggest that the time has come for us to consider whether, really, we might all be better served by separating the question of credentialing from the publishing process, by allowing everything through the gate, and by designing a post-publication peer review process that focuses on how a scholarly text should be received rather than whether it should be out there in the first place."
2009-10-01T12:19:11.708-05:00Almost all peer reviewers get worse, not better, over time.