I am reporting on LIBER and on Helsinki again. So it had better be good. A few days of both is a good way to spend your time. At LIBER 2016, “opening paths to knowledge” (the 45th edition), the usual topics were on the agenda. The best short speech of day 1 for me was the one during the conference dinner, by the deputy mayor. He referred to the Helsinki open data site and called Helsinki a city of transparency. In times when populism rules, it is necessary to know your facts and to advocate for the better argument. This is why it is so important to share your data and your knowledge. It had been a long day and I could not make notes, so the quotes are not perfect, but I thought it was a very good dinner speech. Copying from the website: “Imagine a city where public decision-making is easy for all to follow and comment on using any digital channel. A solution to this challenge is being sought in Helsinki, which has long been working to unlock the data reserves related to municipal decision-making.”
The first day also started with data. The topic of the pre-workshop I attended was “skills for supporting research data”. There were a lot of examples of libraries starting training for staff and for researchers (at different levels), with a lot of variety in topics, in forms (flipped classroom, MOOCs, offline and online mixes) and in experiences. The conclusion Wolfram Horstmann drew at the end was that our role in research data skills training is established; what remains is at what level and in what detail we can or want to do this. Useful links (besides, of course, our own training Essentials 4 Data Support) are the overview of existing education models by DataONE, and the MOOC developed by the University of North Carolina at Chapel Hill and The University of Edinburgh.
Another topic at LIBER was libraries in publishing (or should we say releasing results, as was suggested during the conference). I liked the presentation from Göttingen. Margo Bargheer and Birgit Schmidt found a few answers when preparing their paper. Research libraries are on a mission: they work on more transparency, more participation, open access and more accuracy. Libraries can help researchers to “be good, and avoid the bad”. I liked their references to the Open Science peer reviewer oath and the Singapore statement on research integrity, and their training for junior scientists answered the question we had asked ourselves in the pre-workshop (when is the right time to start training). Talking about outreach: on the last day we had a presentation about altmetrics. Susanna Kirsi Nykyri and Valtteri Reino Vainikka, from Helsinki University Library, shared their experience with PlumX from EBSCO. I really appreciated their reservations and conclusions at the end. Altmetrics are not the answer for everyone; as always, it is discipline-related. As a library you may have a lot of extra work, the choice of platform is essential, and success depends on language, complete metadata, and the use of identifiers and source lists. ORCID seems to be of great help (though ORCID accounts also need to be kept up to date).
Of course I could not attend every session (my colleague Zofia Dzwig also attended LIBER and went to other presentations), but I was enticed to go to the “user-centred” session on day 2, and good that I did, because this was a very nice session. The one I highlight here is from Cambridge University Library. Sue Mehrer and Andy Priestner gave an impressive presentation. Bear in mind (quoting Margaret Mead): “What people say, what people do, and what they say they do are entirely different things”, and try to benchmark yourself against services that people encounter in their daily life. A good idea, according to Sue and Andy, is tested via an MVP (minimum viable product), which gives you the opportunity to fail forward (learn and improve). Their Futurelib prototypes (70% complete) are often not brought to a final version. When I later spoke to Andy, he mentioned that this is the way it is in a time where things change so rapidly: we are living in beta forever. Their staffing is just 1.5 FTE; depending on the topic, they involve other employees and hire extra resources. All sessions made clear that innovation is dynamic, that changes need to be evaluated, and that users need to be asked about their experiences on a regular basis. However, beware that you check what your users do (not what they say). To give some credit to the other two presentations in this session as well: keep on listening, reviewing and challenging (Penny Hicks). And if you go out and ask your users, bring in an outside view, and do not present yourself as a library (Eva Dahlbäck and Martin Wincent).
And of course open science and open access were present at the congress. Ralf Schimmer gave a keynote, but did not bring a new view or a “how” roadmap beyond his transformation paper.
A bit before the wrap-up I had to leave. Thank you LIBER, organisers and participants, for yet another conference worth attending!
Is it possible to know (the intention or story within) a book without reading it? To just discuss its outline with some friends or colleagues who more or less “believe” in it? And then, without further ado, to write a blog post about it?
Well anyway, that is what I am doing, so apologies if I have made the wrong assumptions. As far as I understand it, the book is about clearing the mess in your house, in a logical order, to help you reach or live a happy life without too much focus on possessions.
I wondered about the analogy with us librarians. Part of us is about collecting: in the old days paper, and nowadays online content (and then often without possession ;-). A collection that we order, clear and clean in a conscious or thoughtful way. (Other parts of us are about so many more things: offering the best study places, providing academic skills to students and staff, facilitating the production of online education or research output. Of course.)
Do we wonder, during our special tasks, if something “sparks joy” (the phrase I hear and read whenever the book is quoted)? Is that something we could or would consider in our review of (online) materials? I guess not always. However, perhaps this question alone would be enough. Joy. Inspiration. Good memories. Useful tips. Bringing in the joy of reading, of keeping, or of telling why something is or was important. So that could also be a good subtitle for the Library: a place filled with sparks of joy.
On June 2, we had a Research Exhibition in our Library. Twenty innovative projects of our university were presented, and a lot of (external) visitors were walking around, talking with the researchers, and listening to some speed talks. Some people asked me whether this was a proper event to take place in the Library. I explained that we want people to search (be curious!), find (get answers!) and share information (create and innovate!) via all sorts of channels or ways. So of course our place can contain so much more than the traditional books or study places. Exhibitions, project presentations, workshops. In the context of science, design and engineering they all fit our purpose. To create sparks of joy.
And by the way. I am hesitating to follow my friends’ way of clearing or cleaning the stuff in my own house. I am fuzzying my way through my possessions. 😉
Educate. Innovate. Create. After we transformed to a Library Learning Centre, several years ago, these were the words that we posted on our walls. I thought of them again during the two days of the Open Science Presidency Conference in Amsterdam. I am not going to repeat the words of the Dutch State Secretary Sander Dekker or European commissioner Carlos Moedas (though I am very pleased with the priority and attention that they give to open access to publications and the sharing of research data). I would like to reflect on the session we organized on the second day, the break-out session on Innovation. The key items for this session were looking at successful new models for scholarly communication, and how new users can benefit from opening up science.
Under inspiring guidance (thank you David Bohmert), we listened to several speakers:
- Cees Leeuwis on responsible life sciences innovations for developing countries (referring to EVOCA, environmental virtual observatories for connective action). He would benefit if (grant) calls were interdisciplinary and targeted, and he emphasized that we should open up the whole research process (not focus entirely on research outputs).
- Lucia Malfent reported to us about the experiences with the citizen science project ‘Tell Us’ as a best practice of open innovation in science. She asked us to train scientists in applying methods of openness, and in the discussion afterwards we realized that citizens would also benefit if we made open “what is already out there”. Should universities be funding citizen science?
- Jeroen Bosman and Bianca Kramer broadened our perspective with 101 innovations in scholarly communication. On 15 April 2016 they will open the data of their survey amongst researchers. They presented their G-E-O model, Good-Efficient-Open (as goals for science & scholarship). Researchers mainly focus on doing things efficiently, so we need to stimulate the open and good angles.
- Daniel Wyler brought it back to money: he talked about innovations in funding and funding innovation. He made it clear that new funding schemes encourage innovative research.
The talks were preceded by Vincent Lien, who in August 2014 set up an ePetition in the UK calling for free access to research journals for teachers.
The results of our session, and of the conference, were captured in a Call for Action, which was published on 7 April 2016, both as a PDF capturing the state of the art on that day, and as a dynamic wiki, so that all participants and other stakeholders could add comments (possible until 14 April).
In our Innovation session we collected the ideas and improvements of our delegates in an innovative way (of course!). Everybody was invited to write them down on a postcard, and we connected the postcards to make a truly concerted action line. All actions have been processed in our own Trello board, including the tweets harvested via #innotrack.
We also wanted to showcase nice innovations in our session, but two hours is not that much, so Marina Noordegraaf created a Tour d’Horizon. In this short movie we show three models in the developing landscape of Open Science: 1. APC-funded journals; 2. distributed publishing roles, with the overlay journal as one of the examples; and 3. building innovations around the timely sharing of smaller units of research output. We call for “research and innovation to take a long term perspective and not be trapped by the past”, quoting commissioner Moedas. If the Commission wants to make Europe open to innovation, open to science and open to the world, it must dare to choose new models for opening up research outputs and to credit participatory and Open Science.
Wrapping up: a nice session, a good experience! Educate each other, Treasure innovations, and Be creative!
P.S. Talking about different approaches: on 7 April Leiden University launched a movie, “On being a scientist“, touching on issues of plagiarism, publication pressure and integrity. Nice!
Anka Mulder, Vice President of Education and Operations at TU Delft, invited me to speak briefly about Open Science at our Open Education Seminar during Open Education Week 2016. I had only limited time so I could not really expand on the topic, but I think it is really good to combine the two. We want to share our work and results with society, so that both the economy and society at large can benefit. Open up your software, your research data & publications, your education and your campus. To move forward. To make science better. Of course: open if possible, closed if necessary.
As I stated, it all started (far) more than a decade ago with lobbying for open access. Open access is a topic that goes hand in hand with libraries. TU Delft Library’s mission is to let knowledge flow freely, because students, teachers and researchers become better when they use the knowledge of others and share their own (see also an older post “share to grow“). In The Netherlands things have gone rather “wild” lately in relation to open access to publications, and I think that has a reason.
The Netherlands is now seen as a forerunner, a guide towards open access. In my opinion this was due to two factors: enhancement & diversity. After the statement by our State Secretary, we joined efforts (VSNU, UKB and SURFmarket) in the negotiation teams, and we had them chaired by our Vice Chancellors. Three of them were directly involved and formed a united front of our universities towards the publishers. A true enhancement compared with the situation before, where we negotiated without Board-level involvement. When we discussed our conditions with the publishers, we made it explicit that we wanted to move to open access via the license deals, without paying more money. We stuck to our principles, but we added diversity by accepting a variety of paces along the way.
Learning from experience is always wise. As of 1 May 2016, all TU Delft authors will be asked to post their papers (final accepted version) in our institutional repository. We diversify: we follow the gold route as far as we are successful in our license negotiations, we follow the green route where journals do not offer other open access solutions (or only far too expensive ones), and we stimulate new initiatives. We enhance our Open Science umbrella by implementing Open Research as a stepping stone to Open Science. We are setting up a data stewardship programme with our faculties, supported by a multidisciplinary team (Library, ICT, Legal Services, Strategic Development), as part of our 2016 agenda. Understanding that for research data, diversification means that we open them if possible and close them if necessary, using a FAIR (findable, accessible, interoperable, reusable) approach.
TU Delft is also a forerunner in Open Education. We started with OpenCourseWare, providing free and open educational resources, in 2007. Many other initiatives have arisen in the meantime, from free MOOCs and online masters to paid Professional Education. So here too, some content is open, some is closed. The bottom line is that we want to share. Or as Anka Mulder puts it in the tagline of her weblog: “Deliver World Class Education to Everyone”.
It is as if it is virtual reality week. I hear about developments at Apple, Google and a virtual reality movie to be recorded, all within just a few days. News that I pick up via (internet) radio, the newspaper and blogs. Yes, I am a slow changer. Though I have worked paperless for some years now, have a mobile office (my laptop with my wires) and put my notes in OneNote, I do like to read the paper news and the paper book, and to scribble lyrics on a paper note.
Making slow changes does move me forward, however. I use Twitter and Facebook (but more as an observer, checking what is happening), but not Instagram, Snapchat or Pinterest. I checked Meerkat and have Spotify, but no Netflix account (we very rarely buy a movie via Apple TV). I use my Samsung Galaxy Tab for minuting meetings or conferences, or for reading via the Kindle app when travelling. My adaptation is mainly driven by office needs, adapting when things can be done more practically, but I am surely no new-gadget adept.
So how about virtual reality? I am not sure whether this would work for me. Would I have a VR experience instead of going to a museum? Would I enjoy a VR movie or documentary? Go for a dinner, always at the same place, but in a virtual world? I am not sure. I just watched the Raw Data game by Survios – that is not the thing for me. Or have a look at Microsoft’s HoloLens: a different way to watch football, to interact, to have virtual avatars in the same room. For me it should probably first start in my working environment, a virtual board meeting instead of using GoToMeeting, Lync or Skype.
My Library colleagues in Research & Development and our Programme Manager for the Library Learning Centre are investigating a possible VR happening in our Library, presenting faculty work. Read the blogs they now write on new Library trends, some in Dutch, some in English. Great stuff, of course on virtual reality, but also on the new Beam(er) and a Library with a Pinterest account (and that of course could get me using it!).
We should be lifelong learning, right!? Learning about, for example, the future of publishing (“Don’t be disillusioned. There was nothing wrong with email, public transportation or cameras, but they were all flipped upside down by Facebook, Uber and iPhones. Digital will change publishing. In fact, it already has.”). The best source to learn from is of course young people. This week my daughter turned 12; she took my telephone and showed me how I could use WhatsApp Web on the PC, via the QR reader. I had no idea, and that is worrying. I know that we should all be learning how to code (like reading, writing and arithmetic), and in the end I probably will … though I am a slow changer.
A different kind of annual retrospect. This time, a few days before our Christmas drinks, I asked our Library staff to name a few remarkable events (“things that you are proud of”) that happened in 2015. My last 2015 blog is dedicated to the events mentioned, which all seem to be concentrated in the last months of the year. I mentioned that I would put the contributions in this weblog and that I would share the pictures. I apologize that I do not refer to all creators.
Library tour app
…we did it!
…our team was so diverse, it was fun and everybody was committed and did just what was needed
…we involved the end-user right from the start
…we finally did something with mobile
…we received so many positive responses
…there is still a lot of potential for students to use the app
…projects like these give our R&D team so much energy
On December 15 this year our Board approved the plans outlined in our report “activating academic heritage Delft University of Technology”, with which the Library can continue maintaining this heritage and making it accessible, but may also exploit it to strengthen the TU’s collective identity and standing. We will be able to attract dedicated staff for these activities. In 2016 the heritage collection will be transferred to our book depot – a lot of our staff worked extremely hard and efficiently to make this possible.
We had a positive end of the year: our Open Science programme made good progress, and we got budget to set up proper research data management with our faculties, to promote our how-to guide, and to stimulate open access via our open access fund. We are also happy that we continue to provide access to our Elsevier journals, while at the same time making true steps towards open access with this publisher.
A big achievement this year was our transition from Aleph to WorldShare Management Services, and from our own Discover environment to WorldCat Discovery. We call the new library management system Nubes. All people involved deserve a big compliment: we managed to deliver on time and within budget, and within the coming weeks we hope to have solved some remaining loose ends.
We have never registered as much as we did this year in METIS, our current research information system (to be replaced by Pure in 2016), and we uploaded more than ever to our TU Delft repository. Great!
In October we moved from SharePoint 2010 to 2013; in total 200 team sites were migrated. This went really smoothly, among other things due to good preparation and great cooperation between our own functional application managers and the product managers from ICT.
Office in your bag
For the past two weeks you could find a Christmas tree in our “Praethuys”. Everybody could put a wish in the tree, related to what you think you might need as an office app. Because your smartphone can do everything for you: communicate, find stuff, make a shopping list or check your account. But what app could help you and make you more productive at work?
And in December we showed in our Library Learning Centre how flexible we are, hosting a PhD defence after a (fortunately small) incident in the Auditorium, where PhD defences normally take place.
So within a few days we created a quick and certainly not complete overview of our 2015. And now we are about to enter the millennium’s sweet sixteen birthday party. We are starting to become grown-ups in this millennium, so there are fewer things that may surprise us, fewer things that we can just deny; we might realize that the world is amazing, and we can only make it better if we really try.
I spent a few days in Paris, France, with my research data colleagues: almost 600 participants from 38 countries gathered for the 6th RDA Plenary. This RDA (Research Data Alliance) meeting focused on the need to work with enterprises, and had climate change as its underlying theme.
That was the reason that Barbara Ryan (Secretariat Director, Group on Earth Observations) held a keynote on the first day. She did not just focus on climate change per se, but explained how she managed to get their data opened, and the effect this has had on usage. “Countries have borders, earth observations have not.”
We were all impressed with the statement that Axelle Lemaire (Minister of State for Digital Technology, French Ministry of Economy, Industry and Digital Technology) made at the start of the conference. She preferred to use the metaphor of light instead of oil, when talking about data. Data is not a fossil source that might run dry, data is around in many forms, sometimes a bit diffused, but crucial and it needs to be shared to create value. She told us that France will launch a public consultation on 26 September about “the Digital Bill”. A delightful presentation.
On the Plenary day I attended (there were three days in total), 23 September, I was especially curious to see how working groups I had attended before had progressed. So I went to the Publishing Data Workflows and Data Citation groups. The first group gave us a link to their article, with sample cases where Dataverse, Dryad or figshare is used in the publisher’s data workflow. Future work will concentrate on moving forward in the research process and analysing how processes for data publishing might work there. The working group invites everybody to contribute their best practices, thoughts and comments.
I think that we as libraries should realize that this is indeed what publishers are doing now (just notice the press release announced at the RDA meeting about Mendeley Data and DANS). If we support our researchers with their data management plans and data stewardship, we can advise them how to keep, store and share their data without giving the content away. I found the remark by William Gunn from Mendeley at the workshop a day before reassuring: “All types of content providers need to focus on value-added services and not paywalls”.
The Data Citation Working Group will shortly report on their 14 recommendations. The idea of RDA was that working groups only work on a certain topic for 18 months, after which the group dissolves and new groups emerge. The difficulty here is that people like to continue their work, either because they feel committed to their legacy, or because there are many more ideas or recommendations to explore or make. New for me in this session was the “query store” as a middleman: you need to be able to reproduce your queries, so you give them a persistent identifier, but you also need to be able to retrieve the same data with that query, so you version your data with a timestamp. I also learned that data can be watermarked or carry fingerprints as a protection layer (this related to data from social insurance providers for doctors and hospitals). Another term often used was “snapshot”: a version is a snapshot of your database. And I think it was Stefan Pröll (or perhaps somebody who asked him a question) who said: “If users do not cite your data, cite your users”.
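The query store idea can be sketched in a few lines of Python. This is only an illustration of the principle, not the working group's actual design: the class and record names are invented, a logical counter stands in for real timestamps, and a real query store would persist queries and mint proper PIDs rather than keep everything in memory.

```python
import itertools
import uuid


class VersionedStore:
    """Toy version of the data-citation pattern: every row carries the
    timestamp at which it was inserted, and a cited query gets a
    persistent identifier plus its execution timestamp, so resolving
    the PID re-runs the query against the data as it was back then."""

    def __init__(self):
        self._clock = itertools.count()  # logical timestamps, monotonically increasing
        self._rows = []                  # list of (timestamp, record)
        self._query_store = {}           # PID -> (predicate, timestamp)

    def insert(self, record):
        self._rows.append((next(self._clock), record))

    def cite_query(self, predicate):
        """Register the query, stamp it, and hand back an identifier."""
        pid = "query/" + uuid.uuid4().hex[:8]
        self._query_store[pid] = (predicate, next(self._clock))
        return pid

    def resolve(self, pid):
        """Re-run the cited query on the snapshot that existed at citation time."""
        predicate, stamp = self._query_store[pid]
        return [r for (t, r) in self._rows if t < stamp and predicate(r)]


store = VersionedStore()
store.insert({"patient": "A", "score": 3})
store.insert({"patient": "B", "score": 7})
pid = store.cite_query(lambda r: r["score"] > 5)
store.insert({"patient": "C", "score": 9})  # arrives after the citation
print(store.resolve(pid))  # row C is not part of the cited result
```

The essential point is that the identifier resolves to both the query and its execution timestamp; later inserts change the database, but not what the citation returns.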
I already referred to the workshop on the day preceding the RDA, which was on e-Infrastructures & RDA for data-intensive science. There was some overlap between these two days. One overlap I did not mind at all: a very nice tool called RD Switchboard, presented by Amir Aryani from ANDS (Australia). This switchboard connects datasets on the basis of co-authorship or other collaboration (e.g. via funding). Paolo Manghi showed that they already work together with the RD Switchboard, finding connections between publications and projects, and between publications and data, via the OpenAIRE database.
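The linking principle behind such a switchboard can be shown with a toy example (the dataset DOIs and ORCID iDs below are made up, and the real service of course works on harvested metadata at an entirely different scale): two datasets become connected as soon as they share a collaborator.

```python
from collections import defaultdict
from itertools import combinations


def connect_datasets(datasets):
    """Link every pair of datasets that shares at least one collaborator.
    `datasets` maps a dataset ID to the set of people (or grants) behind it."""
    by_person = defaultdict(set)
    for dataset_id, people in datasets.items():
        for person in people:
            by_person[person].add(dataset_id)
    links = set()
    for shared in by_person.values():
        links.update(combinations(sorted(shared), 2))
    return links


datasets = {
    "doi:10.x/aaa": {"orcid:0001", "orcid:0002"},
    "doi:10.x/bbb": {"orcid:0002", "orcid:0003"},  # shares orcid:0002 with aaa
    "doi:10.x/ccc": {"orcid:0004"},                # no shared collaborator
}
print(connect_datasets(datasets))  # only aaa and bbb are connected
```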
Mark Parsons, the secretary general of RDA, talked (amongst other funny stuff) about infrastructures during the opening session of the preceding day: how we went from systems, to networks, to networked infrastructures. Infrastructures are about bridges, both social and technical, and that is what RDA wants to do: create bridges, and be open! “Preserve the freedom to tinker, that is why the choice for open source is important.”
My Paris RDA trip started even a day before that, with the persistent identifiers workshop organized by DataCite and ePIC. ePIC stands for persistent identifiers for eResearch, and works on data in the full research cycle (what they call referrable data), whereas DataCite provides identifiers for citeable data. At the workshop there were presentations about identifiers such as ARK, DOI, Handle, ORCID and ISNI. Domain-specific identifiers are often also needed: Anne Cambon-Thomsen started a journal for descriptions of bioresources, and Kerstin Lehnert introduced the IGSN, the geosample number.
And we are not there yet: we want to use identifiers for more physical objects, we should always make sure that we refer to the PID in the metadata, and according to Peter Wittenburg, we should also use identifiers for the metadata itself. It is obvious that the most important thing is that these persistent identifiers are linked across platforms, and that we have an open scholarly infrastructure. A project about this has just started: “Technical and Human Infrastructure for Open Research” (THOR). Tobias Weigel even wanted to take it further: “We need an operational transition process. Go from one pid to the other. That is not possible yet.”
New for me in the presentation by Laura Paglione from ORCID was that review information will be added to your ORCID profile, and she showed that there is already a data flow between Crossref, DataCite and ORCID.
So even though I could not attend the full Plenary, enough inspiration as a take-away!
I attended the first two days of the IATUL conference 2015, on strategic partnerships for access and discovery, on July 6 and 7, 2015. It has been a while since I last visited an IATUL conference (the last time was in Auckland, New Zealand, in 2008), so it was nice to be invited by TIB Hannover to give a keynote and be able to attend the meeting in Hannover, Germany.
IATUL celebrated its 60th anniversary, and last year the General Assembly decided that members from university libraries other than technical ones are now also welcome as IATUL members. Though of course this conference still mainly drew representatives from technical libraries (which I like, perhaps because it can provide a more focused flavour, though in the end we all have the same challenges of course). Let me mention a few observations I made during these first two days:
- The obvious topic of open access was immediately addressed on the first day by Martin Hofmann-Apitius, who made a very explicit case for why publishers should allow automated text mining, which could save the lives of cancer patients. In the panel afterwards he said that he thought scientific youth has been spoiled by Google: “We should come back from the hype, and define what solid-based literature search means”.
- José Cotta (from CONNECT, a Directorate-General of the European Commission), the second keynote speaker, phrased open science as a “democratisation” of science, and explained that free flow of data is necessary for a digital single market. He divided open science as follows: e-infrastructures for open science; open access to research results & processes; evidence-based policy making (global systems science); and public engagement (citizen science, crowdsourcing). Cotta further referred to the blog post Moedas and Oettinger jointly wrote, and ended with his statement that we need to “catalyse a change in culture”. (I had already heard the update Cotta gave on the pilot for research data.)
- Look out for the English translation of a public paper from ETH Zurich about the strategy for their collections (summary is already available). They identified the following four functions of scientific collections: Research; Teaching; Transfer of knowledge to the public; and Preservation of cultural heritage.
- Frank Seeliger (from the library of TH Wildau, Germany) showed some nice visual presentations in the (digital or physical) library environment, or as he puts it: “making knowledge and science tangible”. They invite their professors to the library with the tagline: “Come to us to play”. Twice during these two IATUL days a reference was made to the fluid library, i.e., the library can be where you want it. The tangible example at Wildau: every table in the library has an RFID reader, so when a book is put there, it can stay there and remain available (because it is localised).
- Simone Fühles-Ubach (Cologne University of Applied Sciences, Germany) explained a model for a library strategy, where the focus is on what the user wants to do instead of what they want. It is called the Openstrategies PRUB-model, because libraries run Projects to produce Results which customers and citizens Use to create Benefits.
- Wolf-Tilo Balke made it clear that libraries can (or should) connect information science and computer science, and combine this with their knowledge of the subject domain. He believes that this is necessary because otherwise people are lost in the information overflow, and “taking something out of the collection is the same as putting something on the 2nd page of Google”.
- I have made notes to take a better look at WorldWideScience.org, which presented itself as “a global science gateway comprised of national and international scientific databases and portals”. I must also investigate the impressive overview of research support services Hester Mountifield (University of Auckland, New Zealand) provided in her talk “Through power of collaboration. How we increased our impact by helping researchers to increase theirs”. And I should have a better look at the circle (slide 20 in the linked pdf at this page) of the (physical) library functions Brian Irwin showed in his joint talk with Sharon L. Bostick.
So there is enough to get back to when I am in Delft. I enjoyed the meeting and the conversations with the attendees. The organisation was very “light”, which made it pleasantly informal, and there was a lot of diversity in the topics (though perhaps some more focus in the overall programme would have made it even better).
I have already given some attention to my own keynote via SlideShare and Twitter, but I would like to repeat very briefly what I feel is important for us librarians. We know why we find open access important: we want everybody to have easy access to research findings. We should work together to reach this goal, and be Flexible, Assertive, Innovative and Realistic while doing so. If you want to know more, just reach out (and I’ll be there ;-).
While we are implementing (hopefully more and more) open access deals in The Netherlands (e.g. with Springer), one thought has crossed my mind. For quite some time, funding agencies have supported the open access publication of scientific articles. This is of course what we want, because we want knowledge to flow and our economy to grow. However … is all this extra funding for open access publications a sensible way forward? I ask because of the upcoming offset deals. In the old days we (institutes, libraries) put money into the publishing system for access, for reading (and of course for the additional services publishers offered alongside in the digital age). We have created a movement to (try to) change to a system where we (will) pay for the publication production part, and reading or access is “for free”. There is a lot to read about the total cost of publishing, and about how open access may cost us much more than we currently pay for “closed” access.
And where does or will the extra money we need come from? From funding agencies, research grants, other faculty budgets, or perhaps the government (the latter is not the case in The Netherlands). We are creating complexity in the world of publishing by finding new ways to fund open access, where it could perhaps be as simple as it used to be. Apart from member or personal subscriptions, the major share of the cost of reading access came from institutional (library) budgets. There was no budget for this from funding agencies; if you wanted extra content, not covered by the “big deals”, you arranged this with your librarian, and publishers marketed their content via libraries and via researchers (“ask your library to …”). So why are we changing this? If a library or consortium wants an open access arrangement for or alongside its subscribed content, it cannot “use” this extra funder money in the system. To refer again to the Springer deal: the money paid is (we accept that in this transition period) mainly for opening up our articles in hybrid journals, and it is not possible for funders to fund this (which I do understand). However, we all know this extra money exists, and we cannot ignore it during our negotiations. And this creates complexity.
Let there just be one budget for publishing (for the sake of simplicity I leave out the surcharges that authors might pay out of their own research budgets, also in the old days), and have no, I say no, extra funding for open access. Too bad. Not because I am not in favour of open access, but because I am.
A few afterthoughts (and of course do not consider this blog post a well-thought-out and reviewed article; I am just writing it on the train):
- This total publishing budget should be the university’s. The library can keep it, but we need to make sure that authors understand what costs are involved in publishing.
- We need all stakeholders to make the transition happen; as I stated earlier, there is a true choice to be made to make progress in open access.
- We need to take care that open access does not add a barrier for people to publish (researchers who are not affiliated with research institutes that can afford this – though we had the same problem with subscriptions, of course).
- In the transition phase funders can stimulate open access, but I think that funding APCs is not the way to do this.
- The complexity is perhaps created more by traditional publishers that do not want to change their models because of an uncertain future. We know there are emerging, successful open access publishers with changed business models.
I attended 75% of APE2015, i.e., I missed the second afternoon; the sessions I did attend of course got my full attention. My report touches upon some highlights; for full coverage, please check the recordings or slides that will be put online later.
Open access as a means, not a goal
Quite a few presentations of course made reference to open access, starting with Martin Grötschel, incoming President of the Berlin-Brandenburg Academy of Sciences, who stated that “everything should be available at your fingertips” and that “openness is the best possible way to foster high quality science”. And ending (well, at least for me) with Jan Velterop, independent advocate and advisor on open access and open science: “The goal is optimal dissemination of knowledge”. He referred to the term “lamp post research”: with the publication overload, researchers might only be looking where the light is (but not necessarily where the key is). I come back to that later. Celina Ramjoué, Head of Sector, Open Access to Scientific Publications and Data, European Commission, talked about open science. She also emphasized the importance of the free circulation of knowledge, and explained the actions of her department around four issues: 1. e-infrastructures (big data); 2. evidence-based policy making; 3. open access to research results and processes; and 4. citizen engagement.
The dotcoms-to-watch session Eefke Smit chaired carried the title “sharing is multiplying”, which nicely captures the goal of open access and resembles our library’s vision that “if you share, you grow”. For the circulation of samples, Olivier Acher (www.sampleofscience.net) wants to connect scientists creating samples with scientists who can use them. Descriptions are put in the journal Sample of Science, and each disseminated sample becomes a citable item. I did remember readcube.com, but I had not realized that they were the launch partner for the shareable, read-only articles that Nature Publishing Group has on its website, which can be used for peer-to-peer sharing and media referral, as Nicko Goncharov (Digital Science) explained. David Sommer (Kudos) was back from last year; since their launch in April 2014 they have gained 29,000 author registrations. Authors can explain, enrich, share and measure what happens with their articles. New are an institutional partnership and evidence that enriched articles are read and cited more.
From knowledge maps to knowledge vault
Hans Uszkoreit, Professor of Computational Linguistics, Saarland University at Saarbrücken, referred to the highest level of offering information resources, with newly structured knowledge. Uszkoreit, Jan Velterop and David Wade referred to knowledge maps or graphs: how these are already the basis of current Google and Bing services, and how they can be improved or sharpened by adding other, “more closed” content. As Velterop put it: “Make sure that we at least get access to meaningful stuff in the articles. The more we have, the sharper the knowledge picture.” Wade is Director of Scholarly Communication at Microsoft Research. He had some advice for the (mainly publisher) audience: “get crawled and indexed, get a sitemaps.xml, and mark up your content”. Interestingly, he showed us that Microsoft is using these knowledge maps “behind the screen”: a new feature of Word is a direct reference to Word Online (a sort of extended autosuggest), and you can also search for online pictures (with CC-BY licenses) in PowerPoint or search for data via an online search in Excel.
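For readers unfamiliar with Wade’s “get a sitemaps.xml” advice: a sitemap is a small XML file, referenced from robots.txt, that lists the pages a crawler should index. As a minimal sketch (the journal URL and dates below are placeholders, not examples from the talk), a publisher could list one entry per article landing page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap following the sitemaps.org 0.9 schema:
     one <url> entry per article landing page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://journal.example.org/articles/12345</loc>
    <lastmod>2015-01-20</lastmod>
  </url>
  <url>
    <loc>https://journal.example.org/articles/12346</loc>
    <lastmod>2015-01-22</lastmod>
  </url>
</urlset>
```

The “mark up your content” part of his advice generally refers to embedding structured metadata (e.g. schema.org vocabulary) in those landing pages, which is what feeds the knowledge graphs he described.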
Making it happen, reaching our goal
Obviously, if you want something, you should try to get there, and there may be more ways to reach your goal than you can think of yourself. In The Netherlands we have just reached an agreement with Springer. At APE, Veronika Spinka (Open Access Manager) showed us how Springer is working on automating the identification and verification of authors/institutions for APC deposits. Richard Wynne, Vice President of Sales and Marketing, Aries Systems Corporation, explained that we really should keep metadata and business rules separate from each other, as was the case with the subscription model. Striking, I thought, was the author landing page Jake Kelleher (Senior Director of Licensing and Business Development, Copyright Clearance Center) showed us, where an author (or institute) can see the agreed fee to be paid, consisting of an APC and all sorts of surcharges (for supplementary data, colour and a CC-BY license).
Ramjoué mentioned a possibility the EU has now created whereby APCs can also be paid up to two years after a project has ended (in relation to FP7).
Frederick Dylla (see also last year) updated us on CHORUS: they now have 100-plus signatories and the number is growing. CHORUS builds on existing infrastructure and consists of a landing page (a popular term, I must say!) for public access on the publisher’s site; progress can be followed via a live dashboard. In the US they chose the green road, and CHORUS is the publishers’ solution. On the library side there is SHARE (the SHared Access Research Ecosystem), which works together with CHORUS, e.g. on identifiers.
(I need to) Come back to this later, or remember these phrases!
- Research pad (convert all open content to ePub format)
- Utopiadocs.com (I already heard about it, but should check it again, about “resurrecting knowledge”)
- Corona, your personal digital research assistant (Microsoft)
- Uszkoreit: “Owners of the texts do not own the facts”
- Dirk Pieper referring to an open access clearing centre to pay APCs, and a landing page for their authors/publications
- Phil Archer (W3C): “a book is a broken, dead thing for youngsters.” “Semantics matter, otherwise machines cannot read it efficiently; if models are different, again you make it difficult for machines”
- Kent Anderson on what peer review is about: “Is this new, done well, important? First, best or last?”, and, provoking the publishers: “Investing in peer review is investing in your core function”
- Velterop: “You don’t get answers, but hints”
- Follow what happens in the CC-BY discussion: is it true that authors need to pay more under this license? Should we add ND to it?