About DPF Manager
DPF Manager is an application and a framework designed to allow end users and developers to gain full control over the technical properties and structure of TIFF images intended for Long Term Preservation.
The main objective is to give memory institutions full control over the process of checking files for conformity. This is a three-step process:
- Validation: validating conformance to a specific standard. The standard can be defined by a standardization organization, or by acceptance criteria based on locally defined policy rules.
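The policy-based side of the validation step can be sketched in a few lines. The following is a hypothetical illustration, not the DPF Manager implementation: each locally defined rule names a TIFF technical property, a comparison operator, and an accepted value, and a file's extracted properties either satisfy every rule or yield a list of violations. The rule set and property names here are invented for the example.

```python
import operator

# Hypothetical locally defined policy rules: (property, operator, expected).
RULES = [
    ("BitsPerSample", operator.eq, 8),
    ("Compression", operator.eq, 1),   # 1 = uncompressed in baseline TIFF
    ("ImageWidth", operator.ge, 1),
]

def check_policy(properties, rules=RULES):
    """Return the (property, expected) pairs that the file violates."""
    violations = []
    for name, op, expected in rules:
        actual = properties.get(name)
        # A missing property counts as a violation of the rule.
        if actual is None or not op(actual, expected):
            violations.append((name, expected))
    return violations

# Example: an LZW-compressed file (Compression = 5) fails the policy.
props = {"BitsPerSample": 8, "Compression": 5, "ImageWidth": 2048}
print(check_policy(props))  # [('Compression', 1)]
```

A real checker would extract the properties from the TIFF tag structure itself; the point here is only that a policy is a declarative rule set evaluated against technical metadata.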
Bill McCoy’s article, “Takeaways on the Future of Documents: Report from the 2015 PDF Technical Conference,” offers some interesting thoughts on the future of PDF. I can’t find much to disagree with. PDF is in practice a format for reproducing a specific document appearance, and that’s becoming less important as the variety of computing devices increases. He makes a point I hadn’t thought of, that the “de facto interoperable PDF format” is well behind the latest specifications, which may explain why I haven’t seen complaints that JHOVE doesn’t know about ISO 32000 PDF!
The Digital Preservation Coalition is pleased to share the news that a critical mass of content has been prepared and peer reviewed, and that the project board has agreed to release the majority of the Handbook. DPC members have already seen the emerging revised 2nd edition of the Handbook in the members' private area, and this has now been switched to the public side of the DPC website. This partial release will be further enhanced with additional functionality when a new, responsively designed platform for the website is brought on stream by the DPC early in 2016.
Downloading an object over the internet through a standard web browser is a less-than-optimal mechanism for delivering archival objects: the download does not preserve the object's file-system metadata. Tools like Wget can preserve it, but do we want the same behaviour as the browser? And if not, do we need to create mandatory new requirements for future digital preservation systems, such as the repatriation of modified dates with born-digital records?
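The "repatriation of modified dates" idea can be made concrete. A browser typically discards the server's `Last-Modified` header, while a tool such as Wget (with timestamping enabled) parses it and stamps it back onto the local file. The sketch below shows just that stamping step, using only the Python standard library; the function name and the offline setup with a temporary file are my own illustration, not part of any named system.

```python
import os
import tempfile
from email.utils import parsedate_to_datetime

def apply_last_modified(path, last_modified_header):
    """Set a local file's mtime from an HTTP Last-Modified header value."""
    dt = parsedate_to_datetime(last_modified_header)  # RFC-style HTTP date
    ts = dt.timestamp()
    # Set both access time and modification time to the server's value.
    os.utime(path, (ts, ts))
    return ts

# Offline demonstration with a temporary stand-in for a downloaded object.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"archival object")
    path = f.name

ts = apply_last_modified(path, "Fri, 20 Nov 2015 14:00:00 GMT")
assert os.path.getmtime(path) == ts
os.remove(path)
```

In a real workflow the header would come from the HTTP response of the download itself; the open question in the text is whether preservation systems should mandate this kind of metadata capture rather than leave it to tool behaviour.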
The veraPDF consortium is pleased to announce the latest release of the veraPDF PDF/A validation software and test-suite currently under development.
Highlights for this release are:
- validation of all conformance criteria for ISO 19005-1 (PDF/A-1), conformance level b;
- a complete PDF/A-1b test corpus, including 200 new test-files;
- PDF features reporting; and
- a cross-platform installer.
Prototype features include:
- PDF metadata fixing;
- validation model and rules for PDF/A-1a, PDF/A-2 & PDF/A-3.
Volume 18, Issue 2
In the era of research infrastructures and big data, sophisticated data management practices are becoming essential building blocks of successful science. Most practices follow a data-centric approach, which does not take into account the processes that created, analysed and presented the data. This fact limits the possibilities for reliable verification of results. Furthermore, it does not guarantee the reuse of research, which is one of the key aspects of credible data-driven science.
Serendipity, generally thought of as accidental good fortune, has long been a staple of popular science. The notion of fortuitous discovery still has strong appeal, but could it encourage innovation? Antony Funnell meets researchers who not only believe in serendipity, but are actively trying to engineer it.
This memo responds to the request by the Danish e-Infrastructure Cooperation (DeIC), in partnership with Denmark’s Electronic Research Library (DEFF), for a document ‘to provide an overview of current best practices for research data management (RDM) policies within a number of subject areas, and as such inspire the development of a Danish national strategy on the area of RDM policies.’
Friday 20 November, 14:00 GMT / 15:00 CET