Notes on WP14 Mega Meeting 23 March 2012

Attendees:

  • Ash Hunter, Tessella
  • David Giaretta, APA
  • Simon Lambert, STFC
  • Yannis Tzitzikas, FORTH
  • Sabine Schrimpf, DNB
  • Juha Lehtonen, CSC
  • Mark Guttenbrunner, SBA
  • Herve L'Hours, UKDA

Agenda

  1. Agree our plan to address the Review recommendations that were made about Deliverable D14.1 and its implications for the direction of the work package.
  2. Agree that we will formally put forward to the Project's PMB the need to extend the WP to end M20.
  3. Define a process for better visibility of assignment and ownership of tasks within the WP. It would be useful if you could provide me, in advance of the meeting, with your estimate of how many Person Months you have left to contribute to this WP, based on your totals calculated at the end of Year 1. For Tessella, we have 1.8 PMs left to work on WP14.

To assist us with item 1 above, we have drafted the attached document to collate all of the reviewer recommendations that impact this work package and to make some initial suggestions as to what can be done to move towards meeting these recommendations.

Discussion Points

Yannis suggested that the Post Review Actions document be maintained as a response document that can be revised and then submitted as part of the M16 Checkpoint documentation, in order to state what WP14 has done to address the issues raised by the reviewer recommendations.

Reviewers suggested that WP14 could be seen as providing a service, such as a website that allows 'end users' to submit material for analysis and then returns some sort of automated result advising them of the best digital preservation strategy for preserving the material for the long term. This is not achievable within the original scope of the DOW and the resources that the WP has available to it. Ash thinks it may be useful to at least acknowledge this as a highly ambitious goal that could be seen as a viable post-project objective for the VCoE that emerges out of APARSEN, i.e. don't throw this idea away completely, but take a pragmatic view regarding the effort involved to implement it.

Yannis suggested that an alternative approach to this website would be to take a 'training' view: why not create a web-based training resource that makes references (links) to other qualified sources of information, perhaps other APARSEN WPs? Discussion then extended this view to providing something like a Digital Preservation 'Wizard' that allows users to answer some questions about the material they need to preserve, taking them through a 'decision tree' to a results page focussed on their needs based on the answers they gave.

This could be implemented as part of the matrix of analysis, following links to 'evidence of capability', etc.

For the wizard approach to be successful, it will need to let users identify material that is similar to their own so that they can learn about the 'best practices' that can be applied to preserve it.

The questions in the decision tree will need to be self-describing and lead to clear decisions.
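
To make the wizard idea more concrete, the sketch below shows one possible way such a decision tree could be represented and walked. It is purely illustrative: the questions, answer options, and result links are invented placeholders, not anything agreed in the meeting.

    # Minimal sketch of a decision-tree 'wizard' (illustrative only; the
    # questions, answers and result URLs below are invented placeholders).
    DECISION_TREE = {
        "start": {
            "question": "What kind of material do you need to preserve?",
            "answers": {"Documents": "doc_volume", "Images": "result_images"},
        },
        "doc_volume": {
            "question": "Roughly how many documents?",
            "answers": {
                "Fewer than 10,000": "result_small_docs",
                "More than 10,000": "result_large_docs",
            },
        },
        # Leaf nodes point the user at a results page of relevant best practices.
        "result_small_docs": {"result": "https://example.org/best-practices/small-document-collections"},
        "result_large_docs": {"result": "https://example.org/best-practices/large-document-collections"},
        "result_images": {"result": "https://example.org/best-practices/image-collections"},
    }

    def run_wizard(tree, node="start"):
        """Ask each question in turn and follow the chosen branch until a leaf is reached."""
        while "result" not in tree[node]:
            entry = tree[node]
            options = list(entry["answers"])
            print(entry["question"])
            for i, option in enumerate(options, 1):
                print(f"  {i}. {option}")
            choice = int(input("Choose an option: ")) - 1
            node = entry["answers"][options[choice]]
        return tree[node]["result"]

    if __name__ == "__main__":
        print("Suggested starting point:", run_wizard(DECISION_TREE))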

On a different topic, Yannis suggested that we cite the CoRR paper (URL: http://arxiv.org/abs/1201.0385) and say that the notion of Information Format (defined in that paper) can be used as a driver for testing. This direction is more refined than typologies that are carrier-oriented (Word, PDF, image, etc.). So we would propose an information-oriented approach and state that this direction is ongoing and part of the APARSEN research.

Ash should talk to all WP leaders (of ongoing WPs, and probably also of WPs that haven't started yet) about possible test procedures, systems, tools, etc. that are related to the objectives of their tasks; this would make the entire project look more coherent.

Ash also raised the question of what benefit would be gained by offering two different Digital Preservation test systems if they both end up performing the same underlying function. For example, you can replicate the migration of a single TIF to JPG file conversion, or many thousands of them, but if both systems use the third-party ImageMagick tool to perform the file transformation, are they not both just testing the quality of the tool itself and not the wider preservation systems? This fundamental question needs to be addressed quickly, since it defines the purpose of implementing and using Test Environments.
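
To illustrate the concern (the function and file names below are invented, and it assumes ImageMagick's legacy `convert` command-line interface), if both test systems reduce their migration step to the same external tool call, the test result mostly reflects the tool rather than the surrounding preservation system:

    import subprocess

    def migrate_tif_to_jpg(source_path, target_path):
        """Run an ImageMagick conversion; both hypothetical test systems
        would end up executing essentially this same call."""
        subprocess.run(["convert", source_path, target_path], check=True)

    # Whether 'System A' or 'System B' drives the test, the actual work is:
    migrate_tif_to_jpg("sample.tif", "sample.jpg")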

Actions

| Action | Who | Description | Status |
| 1 | Ash H. | Provide a wiki version of the Post Review document | Complete: WP14Year1Actions |
| 2 | Yannis / David | Design a way to implement a Decision Tree web site linked to APARSEN | OPEN |
| 3 | All | Review Yannis's suggestion re: typology and agree to its inclusion | OPEN |
| 4 | Ash H. | Contact other WP Leads to identify dependencies on Test Environments / Tools / etc. | OPEN |
| 5 | All | Scope out the meaning of Test Environments in APARSEN terms: identify the differential value that can be obtained by running similar tests in different Digital Preservation Systems that use the same tools, e.g. what added value is gained when a preservation test involves using ImageMagick to migrate a TIF file to a JP2 file in each system? Are you not just testing the migration tool, not the wider DPS? | OPEN |

-- AshHunter - 2012-03-28
