Recommendation 1:

The panel strongly recommends a project checkpoint review at M16, to assess the eleven deliverables due at M14, the availability of the details requested from the project management, and the implementation of some of the recommendations provided. By this time the project should also be able to show a common vision, including a clear description of what the end goals of the project will be, with particular emphasis on the implementation, post-project management and sustainability of the VCoE.

Just for concreteness: M16 = 1st May 2012

-- SimonLambert - 2012-03-06

I think this request explains why I have been keen for the consortium to start 'sketching out' what the final 'products' of the project will be in three years' time. This may seem like a long way off now, but the reviewers are encouraging us to have the same 'common vision' now so that we work towards implementing the same objectives; otherwise we will just work in our own separate silos of interest and the primary objective of the NoE will not be met. This will not necessarily be an easy path: we will all have different corporate objectives that we want from the project, and this common vision needs to demonstrate that we have addressed and resolved these issues, rather than delaying or ignoring their impact until the majority of the project is complete.

-- AshHunter - 2012-03-06

Recommendation 2:

The panel recommends that all M14 deliverables are completed and made available for the M16 checkpoint review (see Recommendation 1). This includes all deliverables mentioned in DoW Part B (D1101; D1102; D1301; D4101; D4301) for which no draft or presentation has yet been provided for review.

One step towards better articulating the work that has been done, and the work that will be done in APARSEN, could be the following: the M14 deliverables (but also the subsequent ones) are extended by a new section that describes how the particular deliverable/task/result is related to the rest of the WPs and Tasks (even to those WPs that have not started yet). I think that this exercise could enhance coherence and assist us in defining a common vision.

-- YannisTzitzikas - 2012-03-12

Recommendation 3:

The panel recommends that a more thorough description of the Year 2 plan than was available for the Year 1 review is completed, across all work packages, and made available for the M16 checkpoint review. This should include a clear description of how all work packages align with the newly articulated vision of Recommendation 1. The plan should also provide a clear focus on integration activities in Year 2. It is important to be able to see that all work packages are directed towards the overall vision of the project; where they are not, clear justification for any divergence should be provided. It is important that the Year 2 plan clearly articulates what the NoE will produce and have to offer, especially for its participants, including clear differentiation from other research consortia and clear indications of the organisational model and governance to be applied. For example, in 4 years' time, will Airbus, the BL or the KB (etc.) discard their current practices and all start to adopt APARSEN's new 'shared vision and framework for a sustainable digital information infrastructure providing permanent access to digitally encoded information'?

One bullet for the Year 2 plan: during the KO meeting of APARSEN I remember that Liina mentioned that the number of publications involving more than one APARSEN partner is an indicator of the progress/success of the NoE. So probably we (APARSEN) should make a plan for common publications and have this available for the M16 checkpoint (as part of the plan for Year 2). The work done in WP44 could aid such planning.

-- YannisTzitzikas - 2012-03-22

Recommendation 4:

The panel recommends that a formal explanation is made to the Project Officer regarding the changes to WP43 and WP12 on training and staff exchanges including a full explanation of how budget has been spent to date and a clear description of the current status of these work packages, both of which clearly deviate from what was expected from the DoW. This explanation is to be provided for the M16 checkpoint review (Recommendation 1).

I've commented below under WP12 and WP43. I think the concerns around WP43 are slightly misplaced, while we have already worked to revise WP12 in line with recommendations.

-- WilliamKilbride - 2012-03-06

Recommendation 5:

The panel recommends a realignment of deliverable due dates to M24, so that deliverables due immediately after future project reviews (e.g. the eleven deliverables due at M26) will fall within the second reporting period.

This suggests that the review team are not happy with the current direction and want tighter management control over the project, even if this means they have more documents to review for the Year 2 Review Meeting.

-- AshHunter - 2012-03-06

I propose leaving them as they are in the current DoW (which is our contract). It is risky (from various perspectives) to move them earlier. We could even propose moving them to M28 to allow enough time for more integrated and coherent results. We could, however, promise that at M24 we will deliver drafts of all M26 deliverables.

-- YannisTzitzikas - 2012-03-12

Recommendation 6:

The panel recommends that certification, accreditation and self-assessment tools, and the various types of testbeds, be impartially and consistently described within APARSEN, so that both internal and external stakeholders can have access to all relevant information and details to choose the most suitable tool for their needs. It is strongly recommended that a mapping of such tools is developed and incorporated within the Year 2 reporting period, and that an overview and documentation of this work is provided at the M16 checkpoint review.

If this is within WP33 then it is not what is envisaged in the DoW. We have to be careful about expanding the scope of work beyond the contract.

-- SimonLambert - 2012-03-06

Agree with Simon. Does the last sentence require the creation of a new deliverable, or should it be incorporated into one or more existing deliverables? We will not get any additional funding for this activity, so it will have to come as a 'cut' from other planned activities. External stakeholders: does this mean public access to results? How, by website?

-- AshHunter - 2012-03-06

Recommendation 7:

The panel recommends that issues relating to development are surfaced fully within the project and their implications resolved during Year 2, and that an operational plan to achieve this is provided at the M16 checkpoint review. Too often the panel were told that development would not be undertaken as part of the APARSEN project, or that discussions with SCIDIP-ES would be had regarding development. This means that activities such as the test environment (WP14) and the Persistent Identifier Interoperability Framework (WP22) will remain at purely report level, provide no meaningful advance on the state of the art, and contribute no practical benefit to the VCoE.

But isn't productive interaction with other projects part of what APARSEN should be doing?!

-- SimonLambert - 2012-03-06

We should check the original "Scope of Supply", as we do not believe that we promised anything more than a report on the Test Environments (WP14); there is nowhere near enough funding to implement a set of test environments as part of this project. We should check this against the original WP text before challenging the EC.

-- AshHunter - 2012-03-06

Recommendation 8:

The Panel recommends significant strengthening of the APARSEN Project Management processes. The project must improve and demonstrate its capacity to effectively and transparently manage and monitor the project. This should be reflected in:
  • operational management and interrelatedness of WPs, streams, committees etc.
  • timely production and delivery of detailed, clear and consistent project documents, from the Project Reports to the Deliverables
  • a clear breakdown of effort and costs by WP, task and partner
  • a coherent Gantt chart, risk analysis and QA, including resource usage, how deviations/risks have been addressed, and a record of review and QA activity
  • demonstration of QA and monitoring being carried out across the project

This is a fair comment. The first year report was produced in a hurry and is of variable quality. Reporting needs to be improved.

-- SimonLambert - 2012-03-06

One issue regarding the strengthening of the APARSEN Project Management processes: I was wondering what happens if a partner does nothing (no contribution, no participation, etc.). Do we have a precise procedure for who shows yellow (or even red) cards, and when and how?

-- YannisTzitzikas - 2012-03-12

Recommendation 9:

The panel recommends significant strengthening of current dissemination activities including increased promotion of project activities and results, and a more proactive and effective exploitation of existing social network and dissemination channels.

Recommendation 10:

The panel recommends fostering close engagement with the Open Planets Foundation in terms of successes/failures and lessons learnt, as this would be instructive for the setting up of the VCoE.

We should restate William Kilbride's suggestion from last week's MegaMeeting that it would be good to approach Richard Wright (BBC), who is involved in the PrestoPRIME project, which also wants to set up a VCoE.

-- AshHunter - 2012-03-06


2.1.1 PR-P1-01-0_4 – Period 1 Report – Draft

This first annual report was unacceptable. The current form of the report obfuscates the review of the project: it was difficult to understand what progress had been made against all work packages and what was planned for the Year 2 period. There was insufficient overview of the project, and more detail is needed with regard to all work packages, and to effort and cost per WP and partner. This needs to include both figures to date and figures against projected totals. The document must also provide greater clarity and consistency on work done, on actions taken to deal with failures, and on resources consumed and the reasons for this.

The Project Report needs to contain sufficient granularity to allow reviewing of the project. All partners need to make an assessment of where the project is going and what the common vision is for the project and for the VCoE. This needs to be done now, in D11.2 due at M14, as described in DoW Part B.



2.1.2 WP51 D51.1 - Project Quality Plan – implement

The Quality Plan was submitted two months late. This is unlikely to have a material impact on the project. However, there are a number of discrepancies in the submitted document (e.g. it still lists Microsoft and Philips Consumer Lifestyle) which appear to reflect a somewhat slapdash approach to project documentation. Procedural issues need to be more clearly defined; e.g. section 5.1 mentions 'periodic progress meeting' and 'review of main project milestones' but not how they will be executed from a quality perspective. Similarly, there is no mention of a process for the periodic update of this document. The same applies to 'the procedure to publish documents in the Public area' (section 8.2) and the Risk Register (D51.2).

The panel is concerned that this deliverable may reflect poor overall project coordination.



I think we do need to be more careful about putting into practice what we have said we will do in terms of procedures.

-- SimonLambert - 2012-03-06

Reviewers said "Approved. Needs to be implemented and updated regularly."

-- DavidGiaretta - 2012-03-06

2.1.3 WP51 D51.2 – Project Risk Register – implement

The register identifies a good selection of risks that are specific to the project. However, it omits 'generic-type' risks (e.g. cancellation of funding, withdrawal of a key partner, staffing problems, non-delivery by partners) that would normally be in a risk register. The mitigations are not convincing and mostly could be re-stated as 'just do the project'. Risks [4, 10, 11, 12 and 13] should be revised and integrated, and a risk on lack of integration within the project consortium should be added.

The relationship between the illustrative risks quoted in D51.1 Section 6.1 (taken from the project’s extranet) and those in this risk register is not clear. Specifically, both this risk register document and D51.1 show the same date, so the risks should be identical if both are to be credible. However, the risks differ, details of the impact and probability vary, and wordings are different; this calls into question the integrity of the risk management process. Where is the central statement of risks which will dictate the risk management activity for the project as whole?

For effective project management, a risk register must be periodically reviewed and maintained, as new risks arise and existing risks change.



Reviewers said "Approved. Needs to be implemented and updated regularly. Integration risk should be added."

-- DavidGiaretta - 2012-03-06

Seems like they have had time to re-read all the deliverables and then use this additional information to try and throw the book at us!

-- AshHunter - 2012-03-06

2.1.4 WP52 D52.1 – Project website – implement (reject)

The website exists, but the information is extremely minimal (and most of it was added in the two weeks prior to the review). Further, most of the links on each page take the reader to another site (APA), and the visualisation map does not work. It is not doing a good job of raising awareness of the project and the European Commission. The site needs to be redeveloped with meaningful and useful content, based on an APARSEN URL and infrastructure, and resubmitted at the M16 checkpoint review. If APARSEN truly has the ambition to bring together all digital preservation work in Europe, it must look more credible.



The DoW is very clear about the website being part of the APA site, but clearly it needs a lot more information. Making the deliverables available there, with background information and opportunities for discussion, would make a great deal of difference. In the extreme we could set up a separate site, but then we would have the problem of how to merge them over the rest of the project.

-- DavidGiaretta - 2012-03-06

Liina was pretty clear during the feedback session at the review meeting that they didn't want the website to be part of another, bigger website; it should be separate, but you can link between the two via links or logos etc. at the tops of the pages, that kind of thing.

-- AshHunter - 2012-03-06

2.1.5 WP22 D22.1 – PI System Framework

The panel was divided on the utility and direction of WP22. While D22.1 was accepted as a reasonable description of current thinking about identifiers, there was concern regarding the object-centric nature of the report (e.g. an institution might want to permanently identify only at the collection or aggregation level), the lack of recognition of possible zero-to-one and many-to-many relationships, and the lack of detail on the role of the survey respondents (actors) within their organisations (which influences the survey's replies). Also, the document does not contain the analysis of requirements promised in the DoW (page 26); there is no rationale for its exclusion.

While the underlying importance of identifiers was accepted there was also concern that without development effort (prototype services and an Interoperability Framework) this work package will end up as no more than an environmental scan and potential reference model for persistent identifiers. Will the project meet its M16 delivery of ‘intermediate delivery’ of ‘advanced services’ and ‘practical implementation’? This issue of development needs to be explicitly addressed by the project in Year 2.

Concentration in Year 2 needs to be on the Framework, with a view to institutional choices being insertable into the Framework (i.e. if organisation 1 chooses Handle and organisation 2 chooses ARK, then both should be equally coherent within the PI Framework). Similarly, the three tasks elaborated in D22.1, page 10, should be reframed to focus on building the IF, not purely on modelling.



The reviewers said "It would be good to have a discussion about what the final project outcomes for PIs will be, e.g. will it be developed, will it be operationalised in VCoE?" Is it practical to have a M16 intermediate delivery?

-- DavidGiaretta - 2012-03-06

2.1.6 WP44 D44.1 - Communications Plan – not formally submitted yet

The messages in Section 3, which should be crucially important, are unfortunately weak. They need to be reviewed to make clear what the benefits are. The messages currently will not be news to the targeted audience. What is APARSEN's point of difference, which will set it apart from other digital preservation initiatives? The stakeholder groupings in Section 4 are not convincing:
  • groups 1 to 7 are 'the digital preservation community', and there is no need to 'sell' the digital preservation message to this community;
  • groups 8 to 11 in effect represent most of 'the rest of the world', except for public sector bodies.
This deliverable is quite a long way from being acceptable; much more work needs to be done on the plan. The authors may wish to consider increasing clarity by removing overlap with the stakeholder plan (D45.1), possibly by integrating the two deliverables.



Reviewers said "Not formally submitted yet. Replaces Social Media Facilities Plan. Could be combined with 45.1. Needs regular updating and more detail on management of feedback."

We should follow the suggestion and merge with D45.1

Need discussion and consistent view of the key messages and stakeholders.

However, surely we were not just "selling" digital preservation to groups 1-7: we were promoting the uptake of the results of the APARSEN work. If that is true we need to make it much clearer.

-- DavidGiaretta - 2012-03-06

The communications plan was originally intended as a strategic document informing the work of all WPs, rather than a comprehensive account: clearly the reviewers were looking for the latter. The problem is that comms work is fragmented across a range of WPs, and the main problems in terms of delivery of comms, especially the website, are not part of WP44. This needs to be addressed. The current comms plan provides a framework for this.

-- WilliamKilbride - 2012-03-06

2.1.7 WP45 D45.1 - Stakeholder identification – not formally submitted yet.

Poor progress has been made on the stakeholder plan despite it having started in M7. While stakeholder management is a complicated process, this plan as it stands is not clear as to the status and functions of the stakeholders included (e.g. the EAC). The lack of archival representation was noted by the panel; as archival institutions are major stakeholders, this should be addressed.

What will APARSEN offer stakeholders? What will it require of stakeholders? It might also be desirable to address ‘multiplier’ organisations such as AIIM to increase the reach of APARSEN.

The plan overall lacks specificity. For example, it mentions ‘APARSEN content packaged as part of university courses’. How much and what kind of content? Which universities and which courses? What level?

The authors may wish to consider including as stakeholders ‘multiplier’ organisations (i.e. membership organisations) that do not have much digital preservation activity (or none) but that ‘should’ – in other words, communicate with them to encourage adoption of the NoE’s digital preservation views. Examples include ARMA, IRMS, document@work, IKMS, ARA, BRA, DLM Forum, ICA, CDM, Sedona Conference(?), DigCCurr(?), VOI, Records Management Guild.

The lack of clarity regarding the role of the External Advisory Committee raised questions regarding the scope of APARSEN, its expected/desired status with respect to Europe and how much of its work is/should be influenced by, and influencing, countries outside Europe.

Arguably, the plan is generic and unimaginative, with modern ‘social media’ channels featuring little. The panel is not requiring that social media play a more important role, but suggests that the consortium may wish to consider the matter, and also the possibility of more innovative approaches.

Equally, the consortium could consider setting quantified targets (numbers of stakeholders of various kinds by various times) in this document to provide a method of assessing success as the work progresses.

This deliverable must be revised in both form and content, providing greater clarity on the devised strategy, on its relation to D44.1, and on how contact with the external networks will be managed. It is also recommended that the External Advisory Committee include a higher proportion of European members, as this is an EU-funded project aimed at European communities in the first place.



Reviewers said "Not formally submitted yet. Needs more detail – who, what, when, why etc. Could be combined with 44.1. External Advisory Committee needs more European members"

We can combine with D44.1 as suggested.

We can easily add more European members to the EAC - but we need names.

-- DavidGiaretta - 2012-03-06


The panel offers the following comments on M14 deliverables. This is in the context that the documents reviewed were marked as drafts, but also were due to be delivered in final form within four weeks of the review.

2.2.1 WP12 D12.1 - Register/map of research activities … and placements

The description of progress was limited to a presentation at the review meeting. The deliverable due at M14 was not made available. The Register (D12.1) needs more detail – who, when, why etc.

There has been substantial deviation in this work package, judging by the presentation at the review against the DoW: e.g. instead of placements of up to 6 months, placements of 2 weeks are now being suggested. The panel questions the value of such a short-term placement to either partner or to the project as a whole. There was no explanation for this deviation or of any budget implications for the project. Such a substantive change would need to be fully justified before it can be accepted.

Implications for WP42 and WP43 need to be considered in this context.

This needs to be explicitly addressed with the Project Officer (Recommendation 4).



Reviewers said "Would be rejected. Budget implications of changes from DoW. More detail – who, when, why etc. Two week placements not meaningful. Integration between WP12, 42, 43?"

We need to consider the budget implications of doing more. The DoW estimates that a 2 month placement would cost about 10K Euros.

-- DavidGiaretta - 2012-03-06

We have reduced the burden of management to maximise the time available for exchanges. These are now set to a minimum of one month based on the months available in the current allocation of months within the project.

A more complete listing of the opportunities has been provided, including more details of people, timings and topics. Partners for the first tranche of exchanges in 2012 have each provided preliminary notes on their topics in anticipation of a formal 'call for hosts' in April 2012.

Paperwork and processes have been described in detail on a new website established specifically for the purpose which will be integrated into the APARSEN website in due course.

-- WilliamKilbride - 2012-03-06

2.2.2 WP14 D14.1 - Report on testing environments

The panel was concerned with the progress of this work package, which ends in M14, and that it may not be able to meet its contracted deliverables. Its status is also unclear: it is titled a 'report' but listed instead on p12 of the DoW as 'Other', and omitted entirely from the list of deliverables on p123 of the DoW.

This concern arises from a number of areas: it seems that real testing on real testbeds has not been performed; it is not clear that all available testbeds have been collected together (Task 1410); it is not clear what the fundamental unit of preservation is in APARSEN terms; the approach taken is not clearly articulated (D14.1, page 6, para 5); the description of user experiences is poor (D14.1, page 6, para 6); the methodology for the gap analysis is not described (D14.1, page 6, para 7); the nature of the taxonomies to be used is not clearly stated (D14.1, section 3.2.1, page 14); and relevant citations of other work done in this area are absent from the draft.

The panel was unclear as to how this work package was intended to be used in the overall context of APARSEN. How and when is it intended to be used and by whom? For example, it would be interesting to see the development of Transformational Information Properties for a range of representative formats across a range of disciplines etc.

It is important that a test environment supports a range of preservation strategies across a range of material types across a range of disciplines. It is not at all clear whether this work package is intended to provide such an environment. This work package should be able to develop a ‘generic’ testing model and some proved test cases which have replicability and extensibility to other testing issues/problems. Or, if this is not possible, how a heterogeneous approach to testing would be managed.

It is also important to ensure the independence of the specific scenario from the preservation solutions being used, i.e. the preservation testbeds and tests should be separable from the software being used to manage the objects, whether that be a specialist digital preservation system or what might be considered a digital repository (commercial or open source). Similarly, given that different testbeds have different ways to test, the proposed environment should be configured to take a particular type of object and apply several different testbeds to it, thus resulting in a range of comparative capability metrics.

At a more detailed level, the classification scheme explored in Section 3.4 is arbitrary and adds nothing to the process because of that. It is based on binary characteristics (e.g. static vs dynamic); but while it is easy to think of a practically unlimited series of such characteristics, only a few are included here, without any justification. And if more are added, the decision table representation (Table 3) will rapidly become totally unrealistic. Most worryingly, what reason is there to believe that all objects in one resulting class will have the same characteristics for preservation purposes? For example, why are TIFF and WordPerfect files different, or why are WordPerfect and CATIA v4 files different for preservation purposes? It normally would be a matter of concern that so close to the scheduled end of this WP the underlying classification scheme has not yet been agreed (section 3.4.2), though given the concerns it raises that may turn out to be beneficial. The final sentence of section 3.4.1 seems to confirm that its authors have little faith in the classification scheme.



  • it seems that real testing on real testbeds has not been performed;
  • it is not clear that all available test beds have been collected together (Task 1410);
    • What are we missing?
  • it is not clear what the fundamental unit of preservation is in APARSEN terms;
    • Not sure what this means
  • the approach taken is not clearly articulated (D14.1, page 6, para 5)
  • the description of user experiences is poor (D14.1, page 6, para 6)
  • the methodology for the gap analysis is not described (D14.1, page 6, para 7)
  • the nature of the taxonomies to be used is not clearly stated (D14.1, section 3.2.1, page 14)
    • Need to state this more clearly;
  • relevant citations on other works done in this area are absent from the draft. Such as the DCC Methodology for Designing and Evaluating Curation and Preservation Experiments V1.1
    • thought we had quite a few citations - check

-- DavidGiaretta - 2012-03-06

Reviewers said "Would be rejected. WP ends at M14. Testing has not been done. Quality of work and reference to prior work absent. Attributes categorisation arbitrary and needs to be fully reworked. Where will this go in terms of the overall project?"

It does seem as if the reviewers took the categorisation as a proposal for classifying the world rather than something simply to force us all to think about more than just documents or single science datasets.
In the DoW we said: "In this work package we will look across partners and beyond to identify candidate testing techniques. These will be classified and themselves tested against various types of data and scenarios; a number of open competitions will be organised to encourage a competitive spirit. Ultimately we aim to produce a collection of testbeds which will include testing procedures and test data, together with test software if appropriate, which can be used to provide a common measure for digital preservation techniques. "

We can (1) give a clear listing of testbeds and instructions where they can be run and who to ask for advice (2) collect test software if appropriate and test datasets (3) make clear links to the training courses, audit & certification and authenticity work etc.

It would be relatively easy to look at science datasets used in some significant properties testbeds to see what comes out. I do not think we can do any of the CASPAR or SHAMAN testbeds in the time available but we can describe them in more detail and point to good examples.

-- DavidGiaretta - 2012-03-06

I have added the additional comments that Liina made in her email of 14MAR2012 in response to David's comments on the initial review report to a new wiki page here: WP14Year1ReviewComments

-- AshHunter - 2012-03-15

2.2.3 WP16 D16.1 - Software Repository

The software repository was not described or presented at the review. It would be useful to have a gap analysis against other similar activities, such as the KEEP project.



Reviewers said "It would be good to see a demonstration and maybe some commentary on potential overlap with KEEP project."

Is there an overlap with KEEP?

My idea was to link it in with WP14 so people could add views and evidence of the effectiveness of the s/w. There are long lists of tools already available that we could use as sources of information.

-- DavidGiaretta - 2012-03-06

2.2.4 WP24 D24.1 - Report on authenticity and plan for … evaluation system
D24.2 – Implementation and testing … on a specific domain

In general progress on WP24 was good.

The draft paper for D24.1 was a clear and comprehensive description of the issues and players, although there has already been a lot of work done on this topic. It would be useful to hear what this work package is adding to the authenticity work of CASPAR and the other quoted initiatives. What exactly will this WP deliver? There needs to be a fuller plan for the evaluation system.

The notion of crosswalks has a long and distinguished history. The difficulty has always been increasing the corpus of mappings within a model and then the development of automated tools to manage and effect the crosswalks. What tools are expected to be delivered by WP24?

The presentation explained a proposed use of ‘Authenticity Evidence Records’ as the basis for the model. This seems highly appropriate. The team may wish to note that this concept has already been fully developed in detail by another European project, MoReq2010.

Deliverable 24.2 was not completed by the time of the review; only a presentation was available.



Reviewers said "Would be approved. Need a fuller plan for the evaluation system. What is the nature of authenticity mechanisms pre and post implementation of specific preservation strategies, e.g. migration, emulation, à la XCDL work from Planets?"

We do need specific proposals for the evaluation. My impression is that XCDL is about significant properties for rendered objects and no good for other types of digital objects. If so we need to make that clear and provide some examples. This is related to WP14.

-- DavidGiaretta - 2012-03-06

WP24 - I2401 Report on provenance interoperability and mappings:

Reviewers said: Need to see more mappings and how they are to be utilised. Some bibliographic references missing.

Response: A revised (extended) version and a response letter are available on the corresponding wiki page:

2.2.5 WP26 D26.1 - Report and strategy on annotation, reputation and data quality

In general progress on WP26 was satisfactory.

A clearer definition of annotation is essential. If it is defined as ‘… metadata or free comments’ (page 14) there needs to be a very clear statement as to what the strategy is for this WP and what is expected to be delivered over the life of the project. If it is about interoperability of annotations/metadata (page 16) then the relationship with WP24 needs to be clearly articulated and it would be useful to hear what the expected outcomes for Reputation and Data Quality are. In any event, the use of ‘annotation’ instead of ‘metadata’, if retained, needs to be explained.

These issues could usefully be incorporated into a conclusions and recommendations section.

The panel noted that the surveys in WP22 and WP26 asked essentially the same question to overlapping constituencies, but ended up with different typologies for the responses:

It may have been a better use of resources, and there may have been better results, if the two surveys had been combined. If this had been better coordinated, D24.1 would have been able to report on n=103 instead of its barely-helpful n=11.

There are also inconsistencies within the typologies used and concern as to the utility of these typologies. For example, the terms ‘3D objects’, ‘raw data’ and ‘datasets’ in the D22 typology are not mutually exclusive; likewise ‘archived data’ and ‘AV data’ in D24.

During the review presentation, the possibility of carrying out another survey was floated. Given the results of the existing surveys, and acknowledging the consortium’s recognition of ‘survey fatigue’ in the digital preservation community, the panel considers that another survey should be avoided if possible.



Reviewers said "Would be approved. Some replies (e.g. on existing certification/accreditation) raise qualms about survey results. Overview of state-of-the-art and survey analysis should be added. What is the strategy for this work?"

Not sure what is meant by '... replies on certification ...'.

We need to add the state-of-the-art overview and the survey analysis.

-- DavidGiaretta - 2012-03-06

2.2.6 WP33 D33.1A - Final report on peer review … in scholarly communication; D33.1B - Report on peer review of digital repositories

In general progress on WP33 has been acceptable. D33.1A is a very good description of current thinking in this area and D33.1B provides an initial coverage of current audit and certification practice.

In D33.1A, the relationship to and possible overlap with WP26 on Reputation and Data Quality needs to be managed.

For D33.1B, what is the expected outcome of audit and certification activities in the context of a European VCoE for digital preservation? There was also concern from the panel around the lack of information regarding the audit and certification methodologies used, how they were chosen and how any comparative modelling was initiated and the outcome of any such comparisons. Finally, the panel expressed concern over the intention to set up a completely new infrastructure to carry out and govern audit and certification; this could be complex, legally-fraught and expensive, yet unnecessary because testing companies and regulatory infrastructure already exist in a mature infrastructure for other sorts of certification.

It is also important that an individual institution is able to understand where its repository sits in the world of audit methodologies, i.e. how to make a decision on the level of certification it should aim for. The difference between arbitrary constructs such as ‘extended’ and ‘formal’ certification also needs to be clarified.

See recommendation 6 on mappings.



For D33.1B the reviewers do not seem to have understood that we were doing what we were asked to do under the European Framework, and that APARSEN is not setting up new infrastructure (the DIN and ISO bodies already exist). It is probably a fair point, though, that we need to help institutions with "how to make a decision on the level of certification".

-- DavidGiaretta - 2012-03-06

2.2.7 WP43 D43.1 - Survey for the assessment of training material …

WP43 shows substantial deviation from the DoW without any explanation or justification for the deviation and no discussion of the implications of the deviation, e.g. impact on personnel and financial resources, with the Project Officer.

Implications for WP12 and WP42 need to be considered in this context.

Deliverable D43.1 was not provided, so the state of the training could not be evaluated. It was only from the progress report and the onsite review that it emerged that training is now focusing only on the training of auditors. This needs to be readjusted in conjunction with the Project Officer.



Reviewers said "Would be rejected. Presentation at review meeting, paper not yet ready. Clear deviation from DoW."

Looks as if we should (1) do more in terms of evaluation of existing material and (2) downplay the training of auditors.

-- DavidGiaretta - 2012-03-06

There is a range of issues packed into the training, not just audit. It is certainly the case that we have done quite a lot of work on audit in order to inform our understanding of training needs, but if the reviewers/project officer were under the impression that this would be the only topic then we have made a rod for our own backs!

In any case, planning of the specific training and allocation of topics begins in WP43.2, which starts in March 2012, informed by the recommendations of D43.1, which is wide-ranging in its conclusions. We therefore have scope to incorporate comments from the reviewers and/or the project officer without significant impact. Indeed, given the expertise of the reviewers and PO, any supportive comments are welcome.

The plan for WP43 should be sufficient to clarify this.

-- WilliamKilbride - 2012-03-06

Topic revision: r12 - 2012-03-22 - YannisTzitzikas