Posts tagged APE
I attended about 75% of APE2015, i.e. I missed the second afternoon, but the sessions I did attend had of course my full attention. My report touches upon some highlights; for full coverage, please check the recordings or slides that will be put online later.
Open access as a means, not a goal
Quite a few presentations of course made reference to open access, starting off with Martin Grötschel, incoming President of the Berlin Brandenburg Academy of Sciences, who stated that “everything should be available at your fingertips” and that “openness is the best possible way to foster high quality science”. And ending (well, at least for me) with Jan Velterop, as independent advocate and advisor on open access and open science: “The goal is optimal dissemination of knowledge”. He referred to the term “lamp post research”: with the publication overload, researchers might only be looking where the light is (but not necessarily where the key is). I come back to that later. Celina Ramjoué, Head of Sector, Open Access to Scientific Publications and Data, European Commission, talked about Open Science. She also emphasized the importance of free circulation of knowledge, and explained the actions of her department along four issues: 1. e-infrastructures (big data); 2. evidence-based policy making; 3. open access to research results and processes; and 4. citizen engagement.
The dotcoms-to-watch session Eefke Smit was chairing carried the title “sharing is multiplying”, which nicely captures the goal of open access and resembles our library’s vision that “if you share, you grow”. For the circulation of samples, Olivier Acher (www.sampleofscience.net) wants to connect scientists creating samples with scientists who can use them. Descriptions are published in the journal Sample of Science, and each disseminated sample becomes a citable item. I did remember readcube.com, but I did not realize that they were the launching partner for the shareable, read-only articles that Nature Publishing Group has on its website, which can be used for peer-to-peer sharing and media referral, as Nicko Goncharov (Digital Science) explained. David Sommer (Kudos) was back from last year; since their launch in April 2014 they have gathered 29,000 author registrations. Authors can explain, enrich, share and measure what happens with their articles. New are an institutional partnership and proof that enriched articles are read and cited more.
From knowledge maps to knowledge vault
Hans Uszkoreit, Professor of Computational Linguistics, Saarland University at Saarbrücken, referred to the highest level of offering information resources, with new structured knowledge. Uszkoreit, Jan Velterop and David Wade all referred to knowledge maps or graphs: how these are already the basis of current Google and Bing services, and how they can be improved or sharpened by adding other, “more closed” content. As Velterop put it: “Make sure that we at least get access to meaningful stuff in the articles. The more we have, the sharper the knowledge picture.” Wade is Director Scholarly Communication at Microsoft Research. He had some advice for the (mainly publisher) audience: “get crawled and indexed, get a sitemap.xml, and mark up your content”. Interestingly, he showed us that Microsoft is using these knowledge maps “behind the screen”: a new feature of Word is a direct reference to Word Online (a sort of extended autosuggest), and you can also search for online pictures (with CC-BY licenses) in PowerPoint or search for data via an online search in Excel.
Making it happen, reaching our goal
Obviously, if you want something, you should try to get there, and there might be more ways to reach your goal than you can think of yourself. In the Netherlands we have just reached an agreement with Springer. At APE, Veronika Spinka (Open Access Manager) showed us how Springer is working on automating the process of identifying and verifying the author/institution for APC deposits. Richard Wynne, Vice President of Sales and Marketing, Aries Systems Corporation, explained that we really should keep metadata and business rules separate from each other, as was the case with the subscription model. Striking, I thought, was the author landing page Jake Kelleher (Senior Director of Licensing and Business Development, Copyright Clearance Center) showed us, where an author (or institute) can see the agreed fee to be paid, consisting of an APC and all sorts of surcharges (for supplementary data, colour and a CC-BY license).
Ramjoué made reference to a possibility the EU has now created whereby APCs can also be claimed for two years after a project has ended (in relation to FP7).
Frederick Dylla (see also last year) updated us on CHORUS: they now have over 100 signatories and the number is growing. CHORUS builds on existing infrastructure and consists of a landing page (a popular term, I must say!) for public access on the publisher’s site; progress can be followed via a live dashboard. In the US they chose the green road, and CHORUS is the publishers’ solution. For the library side there is SHARE (the SHared Access Research Ecosystem), which works together with CHORUS, e.g. on identifiers.
(I need to) Come back to this later, or remember these phrases!
- Research pad (convert all open content to ePub format)
- Utopiadocs.com (I already heard about it, but should check it again, about “resurrecting knowledge”)
- Corona, your personal digital research assistant (Microsoft)
- Uszkoreit : “Owners of the texts do not own the facts”
- Dirk Pieper referring to an open access clearing centre to pay APCs and a landing page for their authors/publications
- Phil Archer (W3C): “a book is a broken, dead thing for youngsters.” “Semantics matter, otherwise machines cannot read it efficiently; if models are different, again you make it difficult for machines”
- Kent Anderson about what peer review is about: “Is this new, done well, important? First, best or last?”, and provoking the publishers: “Investing in peer review is investing in your core function”
- Velterop: “You don’t get answers, but hints”
- Follow what happens in the CC-BY discussion: is it true that authors need to pay more under this license? Should we add ND to it?
Flying back to Amsterdam after two days of attending the APE Conference and one day of attending a Board Meeting of DataCite in Berlin, I try to capture what APE has brought me this year. This was the fourth or perhaps fifth year that I attended this conference. It is organized each year by Arnoud de Kemp, and this was its 9th edition.
I will not report in chronological order, but just pick out a few strands. The topic of the meeting was “Redefining the Scientific Record: The Future of the Article, Big Data & Metrics”, and participants were (mainly) publishers, some researchers involved in funding or publishing, and library, governmental or funding agency representatives. A lot of Dutch people attended APE2014, of course also due to the keynote speech of Sander Dekker, our State Secretary of the Ministry of Education, Culture and Science, at the start of the conference.
If you want to view any or all of the presentations, you can do so via the recorded live stream.
Peer review under discussion
David Black, Secretary General of ICSU (International Council for Science), and originally a researcher in organic chemistry, claimed that the interdependence between curators and creators will remain in this digital era. According to Black, authors will in future send their findings to repositories (standardized, subject-focused and international) instead of primary journals. Anybody should be allowed to add comments to papers submitted to such a repository. Peer review becomes an open evaluation, and takes place after publication. Out of this repository secondary publications can be selected (and that could still be a role for the major publishing houses). Reputation building is not merely based on these publications, but also on local contributions, your presence at conferences and individual (personal) references. An important condition for this to be a success is that authors take responsibility for their own work (be aware of what you submit). Jan Velterop referred me via Twitter to a recent blog post he wrote about this.
ScienceOpen, also present at the conference, mentioned that they are already supporting scientists and are offering public post-publication peer review.
From the publishers
APE is really a conference for and with publishers. Let me highlight two presentations from the publishers. The first one was by H. Frederick Dylla, Executive Director and CEO of the American Institute of Physics. He talked about CHORUS, which started in September 2013. CHORUS stands for ClearingHouse for the Open Research of the US and provides public access to manuscripts/articles reporting on federally funded research, using existing infrastructure. Another one was from Eefke Smit. Apart from offering some nice poetry and examples, e.g. the Atlas of Digital Damages, she asked all publishing participants to make sure they are aware they need preservation strategies for their content (outsourcing, normalisation, migration and emulation). In the Keepers Registry, as we later learned from Peter Burnhill (EDINA and Head of Edinburgh University Data Library), 22,000 e-serial titles are being preserved with “archival intent”. Knowing that 113,000 titles (ISSNs) are registered, only 19% is safe. Eefke therefore called on us to solve the identifier soup and to make sure that we are creating the connections to the future, so that those in the future can make their connections back to us.
A talk that might be worth watching if you have a spare moment is the one by Jason Swedlow, Professor of Quantitative Cell Biology at the University of Dundee and President of Glencoe Software, talking about OMERO: The Open Microscopy Environment. OMERO deals with (the storage of) images. Swedlow introduced the ubiquitous image problem: is it a pretty picture, a measurement or a resource? According to Swedlow, his tool brings in a driver for integrity and publishes trusted scientific data.
Paul Groth, from the Department of Computer Science & the Network Institute at the Free University in Amsterdam, wondered what impact really is. Policy makers are interested to know whether you are doing good science. Evidence up till now has been limited to the publication (the article), and has not included slides, videos, code, data or the fact that you might have different types of stories to tell (citation is not always the driver). Altmetrics captures activity in online tools and environments. Paul gave us a few examples (ImpactStory; Open Phacts – published AND discussed AND cited; LISC 2013 – where results of a workshop are saved as short wrap-ups in figshare).
Mike Taylor, from labs.elsevier.com, presented seven reasons why one should care about altmetrics and gave us insight into the work in progress at Elsevier. Would it not be great to have real-time information on what others are reading right now?
And just quickly to wrap this up: the “dotcoms-to-watch” session was a good addition to the other lectures. The company Kudos makes an effort to match people to the right articles. ReadCube told us about ReadCube Instant PDF (keeping users online and engaged) and ReadCube Access, an eCommerce system for libraries with access restrictions in exchange for lower prices (rent, buy or download). The latter has recently been launched with the University of Utah as development partner.
A moral appeal
As said, one of the exciting moments was the keynote of Sander Dekker. His speech is available in full text at Science Guide. Dekker is convinced that the digital world will be a game-changer in scientific publishing. Above all he sees open access as a moral obligation, and a matter of principle: the whole of society will benefit from free and open access. So a true challenge for all stakeholders; let us hope we will indeed be able to put the flags out! Connecting this opening keynote with the closing one by David Sweeney (from HEFCE) is interesting. Sweeney, experiencing the UK situation as a big funder, sees three things we should be doing: address the double dipping issue; do what you are allowed to do (publishers are doing their part, how about academia and funders?); and test whether we really need embargo periods.
It is perhaps enough to repeat two of the Einstein quotes Bernhard A. Sabel (who gave a guest lecture about The Psychology of Innovation) presented; they always ring true:
“Great spirits have always encountered violent opposition from mediocre minds”
“Everything that is really great and inspiring is created by the individual who can work in freedom”
And three to take away from Sabel:
“Old technology or lawyers are in committees”
“Get the brightest people, do not compromise on people – ever”
“Sometimes it is better to be sufficiently ignorant”
At the closing panel there were representatives from a funding agency, publishers, and research communities. The panel drew no real conclusion, but the voice of librarians was clearly not included in it, which is perhaps telling. Libraries were a logical partner as license brokers, and now that open access is growing to become a commodity (why not ;-), we should (and will) find new challenges. Susan Reilly from LIBER used a few keywords in her lecture earlier that day: libraries should focus on digital preservation, improve findability and integrity, and aim at resource sharing and collaboration.
Data were addressed at the conference, but brought no new insights for me. As a DataCite representative, though, it was clear to me that we should keep pushing the necessity of using persistent identifiers to make your data findable, citable and usable. Partners such as Brian Hole from Ubiquity Press might be able to help us. In one of his last slides he said that they help universities to set up a data repository, and by doing so give power to the university presses.
Finally, I would like to repeat the reference Peter Burnhill made to the State of the Union: 2014 should be the Year of Action!