Reviewers' comments on WPs and deliverables.

The other part of the report is here

PR-P1-01-1_0 PERIOD 1 REPORT and WP 51 Administrative co-ordination

This now contains all required quantitative data broken down by WP and by partner, in a commendably clear manner.

The Gantt chart on page 12 is not acceptable, on several counts:

  • it is illegible;
  • it shows only the original plan, but not progress or slippage to date;
  • it does not show changes to the plan (such as delayed completion of WP14);
  • it is inconsistent with the detail in the report in several places (e.g. the start date of WP41, the end date of WP33) and contains apparently inconsistent rounding.

The Gantt chart on page 46 is not acceptable either. The bar for WP 1400 has been moved, as if to indicate the delay in delivering D14.1. However, it has been moved crudely, so that the start and end dates are both wrong and the dependencies no longer align – it is not clear what this chart is attempting to convey. See the illustration below, extracted from the chart. The bar for the other delayed task (WP26) has not been changed – it still reflects the original dates, now incorrect.

Most of these points were made at the first review. It is essential that future reports should contain a legible, updated, and correct Gantt chart.

Milestones, past and future, are almost ignored in this report; the only mention is the identification of a date and an event name for one milestone on page 57. It is clear that milestones are not considered a helpful project management tool by APARSEN, which is surprising given the large number of tasks and work packages. The Gantt chart includes 8 ‘checkpoint’ symbols, most (but not all) of which correspond to the 9 DoW milestones. Here again, it is not clear what is meant to be conveyed by ‘milestones’ and ‘checkpoints’.

We cannot detect any attempt to simplify the project plan structure from its present combination of WPs, streams, aspects, phases, segments, checkpoints and milestones (this issue was pointed out in the first review).

We should re-do the Gantt chart in MS Project properly. There was a comment in the previous report about the WPs, streams etc., but to change that would require a contract change to the DoW.

-- DavidGiaretta - 2012-06-16

We note that, for every work package save one, fewer person-days were expended than a simple straight-line expenditure would predict. This is unexpected. It may arise for several reasons – not least that the straight-line assumption may be too simplistic – and need not be a cause for concern. What is a cause for concern is that the possible shortfall is neither recognised nor explained in the progress report. In the absence of any explanation we have prepared the following chart to illustrate the issue:

The planned effort per WP is an estimate only. If we matched the plans exactly it would seem suspicious to me!

-- DavidGiaretta - 2012-06-16

Planned versus actual person months spent

The chart is based on figures from the progress report and has not been reconciled to the DoW.

Several WPs started in month 11 (WP21, WP23, WP32, WP34, WP36). These are not reported on, save a statement that they have started, “because they do not have anything substantive to report after just one month.” We accept that the WPs will not have any substantive findings at this stage. However, it is not acceptable project management to omit them from the reporting. It is a basic tenet of project management that the resources expended on such tasks, any issues encountered, and their outlook must be reported routinely.

It is striking that section 3.2, which deals with sustainability activities, mentions several other initiatives, yet the only initiatives mentioned are those that already involve STFC.

Do other partners have any we should have mentioned?

-- DavidGiaretta - 2012-06-16

In terms of the WPs starting in M11, we were putting effort into delivering the M14 deliverables early rather than working on those.

-- DavidGiaretta - 2012-06-16

We should add in the effort tables for the M11 start WPs - even if the numbers are very small.

-- DavidGiaretta - 2012-06-16

Section 3 is intended to describe year 2 activities, but it is inadequate for planning purposes and difficult to relate to the common vision. It does not “clearly articulate what the NoE will produce and will have to offer” (as required by recommendation R1.3). By way of illustration, the following is a simple list of all the phrases in section 3 that allude to future delivery or future work:

Section 3.1

  • needs further discussion
  • virtual meetings will be held
  • will also be discussed
Section 3.2
  • We will look review
  • will be critically evaluated
  • will be able to use
  • could for example be designed
  • will be investigated
  • we will also have the opportunity to benefit from the work of
  • can open discussions
  • we will take advantage of our ability to bring together many different views
  • will allow us to test published models and improve on them.
  • will focus on looking at
  • information will be collected
  • APARSEN again can exploit its members

Section 3.3

  • we will need to look at the commonalities
Section 3.4
  • will be evaluated
  • a small prototype will be developed
  • There is a further deliverable
Section 3.5
  • expand on these plans
  • carrying out the initial steps of internal consultations
  • We will also be consulting
  • will be evaluated
  • A checkpoint on the standardisation activities will be carried out
  • will hold talks with
  • will be identified
  • comments from users collected
  • evaluate the usefulness
Section 3.6
  • will have the basis for spreading excellence (followed by a list of communication channels)

This list is notable for its lack of commitments to delivery. It contains only one commitment to develop and deliver something new (“a small prototype will be developed”). All other statements are vague (“…will be able to…”) or relate to looking at existing efforts.

Not sure how we could sensibly be more specific. Any ideas?

-- DavidGiaretta - 2012-06-16

Section 3 also fails to provide “clear differentiation from other research consortia” (also in R1.3); on the contrary, it seems to describe mainly reviews and analyses of the work of other consortia (all of which include STFC).

The relevance of the two-page table in section 3.1 is unclear. It does not serve to illustrate progress or any other aspect of project management. It appears to be content that is a candidate for inclusion in some other deliverable, but that is only a presumption: there is no indication of its provenance or significance.

This report is rejected pending further improvements and a clear plan for year 2 activities.

D11.1 - INTERIM Report on Comparison of Research Programmes

This is an early interim report, as the final deliverable is not due until M46. It is a light overview of the preservation activities of the APARSEN partners and confirms the existence of “fragmentation and lack of common approaches”.

The scope of the deliverable is of more concern. It is entitled ‘Interim Report on Comparison of Research Programmes’. The title of the deliverable, according to the DoW, should be ‘Comparison of research programmes as a measure of integration’. In its interim and final versions, this deliverable is meant to represent the measure (‘to be used as a regular check on progress’) of the first of the project SMART objectives in the DoW (‘the integration of the majority of the research activities in DP within a common vision and common terminology and evidence standard’). However, the abstract describes it as ‘an interim report of the comparison of the digital preservation research programmes of consortium members.’ This could be interpreted in two different ways. The deliverable ‘will provide key evidence as to the effectiveness of our defragmentation activities’. It will not be a surprise that APARSEN aims to bring a degree of measurable defragmentation to the activities of its members. The focus, however, should be on the impact of APARSEN, through the VCoE, on the level of defragmentation across the wider European digital preservation community. This scope issue needs to be clarified as soon as possible.

The deliverable claims to report on “…the context of the digital preservation activities for most of the APARSEN members…” It actually reports on 23 of the 33 members, without explaining why ten are excluded. Furthermore, the omission of the British Library – a leading researcher in this field – is surprising in this context.

This interim report may be useful to some consortium partners as a handy reference with which to look up other partners’ research interests – though it is handicapped by being incomplete. Moreover, the information currently provided is not contextualised within the specific environment of each organisation (e.g. mission and scope of the organisation, scale, funding, strategic planning, legislative framework), so it does not allow an effective and critical comparison of research interests.

The DoW explains that the data for this deliverable was to be collected “…at the Kick-Off meeting at which the participant will have an opportunity to present their views on digital preservation research for the next 6 or 7 years.” In fact the deliverable was first drafted in M13, well after kick-off, which calls the approach into doubt. The DoW also states that “…if the discussions in this task uncover omissions from our plan or, more likely, differences in emphasis, we will continue over the next 3 months with a number of Tiger Teams to identify the actions required to integrate these into our plans to deliver the common vision.” It is not clear where this is reflected in the deliverables. At this (admittedly interim) stage, no preliminary indications are given on how to move forward towards integrating ‘the majority of the research activities in DP within a common vision and common terminology and evidence standard’.

In our view this deliverable is acceptable as an interim product, but only barely. It is meant to “provide key evidence as to the effectiveness of our defragmentation activities” (DoW PDF page 18). This is a key measure of the success or failure of APARSEN. Consortium members will need to bear its content and purpose in mind as they strive for uniformity over the next two and a half years; movement towards defragmentation should be monitored on an ongoing basis as part of the programme, not just presented at the end of the project as a ‘fait accompli’ success or failure.

D11.2 - INTERIM Report on Common Vision

This is an early interim report, the final deliverable being due in M46.

D11.2, on the common vision of digital preservation, was an issue for the Year 1 review, which requested ‘a clear description of what the end goals of the project will be with particular emphasis on the implementation, post-project management and sustainability of the VCoE.’ This interim report begins that process.

The deliverable correctly identifies the key questions as “(1) what specifically should the VCoE do to improve digital preservation and (2) how should it be organised to achieve this.” The present interim draft seeks to address the second of these. It does not present a firm answer – admittedly, the question is extremely difficult to answer. Notwithstanding its difficulty, this remains the key question the consortium needs to focus on, as it is the expected key underlying outcome of the project.

Section 4, titled ‘What can the VCoE do for its members’, is reasonably general rather than specific. We are, however, concerned at the specific detail concerning one potential service. In the 1st year review the panel expressed concern ‘around the lack of information regarding the audit and certification methodologies used, how they were chosen and how any comparative modelling was initiated and the outcome of any such comparisons. Finally, the panel expressed concern over the intention to set up a completely new infrastructure to carry out and govern audit and certification; this could be complex, legally-fraught and expensive, yet unnecessary because testing companies and regulatory infrastructure already exist in a mature infrastructure for other sorts of certification.’ However, in the ‘What can the VCoE do for its members’ section of this deliverable, the first planned activity of the VCoE is consultancy, in particular on audit and certification, defined as a ‘special type of consultancy’: ‘The VCoE may contain certified auditors who can perform the audits or provide consultation to help prepare for an audit. The centre will also have a register of auditors, including probably feedbacks from previous auditees’. We are concerned that the VCoE still seems heavily focused on auditing activities.

The timetable set out in section 2.1 and the method in section 2.2 are appropriate, and consortium members must work hard to achieve them. It would be good to see in year 2 a more formal plan for the development of the VCoE and the building blocks (rather than chapter headings) to be put in place to achieve it. The plan should include a draft business plan, funding model, etc.

We note that the progress of APA is described in section 2.4 without any recognition that the APA is envisaged as disappearing in favour of APARSEN. On the contrary, the future relationships of APA and APARSEN are opaque.

This document is of acceptable quality as a starting point and an interim deliverable, but with a major caveat: the activities of the VCoE should be revised in the light of the recommendations provided in the 1st year review.

D12.1 - Register/map of research activities, positions available and placements

D12.1, a register of activities and placements, was due in M14. Nothing has been explicitly presented for this review – the only mention is a sentence stating that it is a website. No evidence is presented that any work has been done on this, nor any evidence of its quality or shortcomings. This is despite the statement in the Periodic Report that D12.1 is one of the deliverables for which “Final versions will be delivered at the M16 checkpoint” (that is, the present review). The Periodic Report does not claim that any time has been expended on this deliverable, possibly because it started late. The situation is unclear and unsatisfactory.

This said, the panel notes – and discovered almost by chance – a ‘microsite’ at that appears to be D12.1 (although the 'APARSEN exchange manual' at is marked as draft). Our comments below assume this to be the case.

The microsite lists 14 exchange opportunities. Each opportunity should be described on one page, with links to further details. However, of the 14, two pages are missing – this may be because the site is unfinished, but we have no way to know (they have been missing for several weeks). All but one of the opportunities are for one person month, the remaining one being for two person months – a total of 15 person months. This is not compatible with the DoW, which undertakes that “placements will range from a few months to one year in duration”. The consortium needs to honour this DoW undertaking.

We have attempted to reconcile the months of effort to the budgeted months and the descriptions in the DoW. Our attempts were fruitless, as the material is opaque on this point.

Additionally, while the microsite might be sufficiently functional for the few exchange opportunities currently advertised, we have doubts about its suitability for the larger number of exchanges envisaged.

The microsite hosts a 7-page ‘manual’ titled ‘Staff and Experience Exchange’ (marked Draft 7 and referenced as WP12A). We have not reviewed this in detail. A brief examination suggests that it covers all the essential ground in a clear manner.

We consider that:

  • If the microsite represents the work performed in WP12, it represents a material deviation from the DoW.
  • If this microsite were presented as a deliverable, it would be rejected.

Needs urgent attention

-- DavidGiaretta - 2012-06-16

D13.1 - Intermediate Report about the Coordination of Common Standards

This is an early interim report, as the final deliverable is not due until M48. As an interim version of a “summary” it is long – 119 pages – reflecting the complexity of the preservation standards landscape.

The report notes that it is “mandatory that a VCoE as an open network of networks combines all the standard related knowledge, the applicable validation and certification procedures, represents a common mind for services related to preservation and maintains the infrastructure that APARSEN members are generating.” Years 3 & 4 will be central to achieving this in the context of WP13.

The deliverable states that “The main goal for the remaining 3 periods of the APARSEN project is to split the standards beside their community applicability on the dedicated elements of the OAIS model which are covered by the standard […]”. Unfortunately, the intention here is unclear.

The deliverable is patchy (in this interim draft). While some standards are described in detail, others are described only by a summary quotation. For example, ISO 23081 is described only as defining “terms and definitions applicable to the standards on management systems for records (MSR) prepared by ISO/TC 46/SC 11. It also establishes the objectives for using a MSR, provides principles for a MSR, describes a process approach and specifies roles for top management” – a description that does not help the reader – whereas an explanation of the multiplicity of views contained in the publication might be more helpful.

It would be helpful if the standards could be mapped against a conceptual model of the digital preservation conceptual space. Something more complete than the table presented in section 1.5 is intended here: a model that represents both the different levels of abstraction addressed by different standards and their different purposes (for example, recognising how XMP, ISO 15836, and ISO 23081 all address metadata but at different levels; or the different levels addressed by XML and XML DTD). If this could be achieved, it would for the first time make it possible for standards to be compared meaningfully with each other; indeed, a meaningful gap analysis is barely possible without it. However, if no suitable existing conceptual model can be identified, developing one within the scope of APARSEN may be challenging, and the consortium may have to reconsider what is usefully achievable.

Section 2.3, on “previous and ongoing preservation initiatives”, is more problematic. For one thing, with a global ambition it can hardly aspire to the encyclopædic coverage that the section on standards is likely to achieve. For another, interpretive descriptions of projects are fraught with risk. It may not be realistic to attempt to complete this section.

The apparent attempt to list all file formats (in the Appendix) is a step too far. Accepting that the present deliverable is only interim, a cursory glance at the long list of some 583 formats in the Appendix reveals the absence of some relatively well-known generic formats (e.g. DjVu, OpenDocument, Office Open XML) and many proprietary formats used by scientific instruments (including the spectrophotometry format .WSV, the gas chromatography formats .MI and .MX, and presumably an uncountable number of others). A list of file formats more than twice as long as the one in this Appendix can be assembled within minutes from publicly available sources. Any attempt at complete coverage will be fruitless, and we recommend that it be abandoned.

However, the shortage of information on preservation file formats – notably XML, and the PDF/A family of standards – is surprising. Assuming that this is not purely because of the draft status of the report, this should be remedied or justified. This is not to say that all the standards listed in the Annex need to be fully documented – any attempt to do so would be hopeless – but standards with especial relevance to preservation could be considered further.

As a relatively minor point, we point out that the hyperlinks in the delivered PDF version of the deliverable do not work. We presume that the links are fully functional in the original document, and we trust this will be rectified in future versions.

This deliverable is acceptable as an interim, subject to the above. The consortium needs to focus further work in this area on what will be new, valuable to the preservation community and/or other work packages, and what can realistically be achieved.

D14.1 - Report on testing environments

D14.1 has been deferred to M20. We are concerned that this deviation is not tenable, because of the dependencies of this deliverable within the project's overall future activities, and because of the potential further deviation suggested in the postponement letter regarding the possibility of a decision tree or ‘wizard’.

In terms of dependencies, we are concerned that the postponement of this key deliverable would create a wave of risk across the project's planned work, which could potentially hinder the successful creation of the VCoE. As indicated in the DoW, D14.1 is a ‘Report on testing environments’. This deliverable consists of a report summarising the test environments which have been examined and the proposed common testing approach; it also provides a framework within which to evaluate the efficacy and applicability of proposed preservation tools and techniques. It is the only deliverable regarding the testing of testbeds in the project, and it represents a key underpinning for project activities in Years 2 and 3.

In D11.2, regarding Year 1, it is mentioned that ‘We have carried out joint work on Trust, producing a number of reports about different aspects of this topic, so we will have something to disseminate and establish our credentials as an organisation in year 2. At the same time we put in place the basics about training and testing which will underpin the rest of the topics and provide a basis for some services.’ In regard to the Year 2 timetable, D11.2 further mentions that ‘During this period we flesh out the potential services and also the organisational issues’ and that ‘Also in year 2 we will be working on Sustainability, which should feed ideas into the VCoE Discussions’. Postponing D14.1 to M20 would represent a substantial postponement of the activities that take the baton from this deliverable, with the risk of creating a wave of further postponements and incomplete work across the project.

The postponement letter also mentions the possibility of a decision tree or ‘wizard’ intended to help ‘users’ with advice on digital preservation (in connection with WP14). This might, depending on the details, replicate work done by PLANETS (in particular its Plato tool), and we urge the consortium to be clear about both the scope of the related PLANETS work and its nature and findings before undertaking this challenging additional task. We suggest that this task, though potentially useful, is too large to fit into the budget left by an accidental underspend of less than 15 person-months.

The postponement letter mentions a preservation glossary, which we welcome. We ask that this be made available to the review panel at the next review.

"It is the only deliverable regarding the testing of testbeds" - I don't think this was the purpose of the WP. It is to state the capability of existing testbeds, and to provide a framework for new preservation tools / strategies / approaches to be developed as part of follow-on WP work (e.g. implementation of new migration tools, preservation approaches including incorporating new representation network services, etc.)

-- AshHunter - 2012-06-19

Response ... Bister will be maintained. May add 1 mm for Tessella from other WPs. May be dependencies for WP16, 21 and 25.

-- DavidGiaretta - 2012-06-20

WP15 - Internal workshops, symposia and events

This WP concerns internal symposia, workshops and events. As such, it is natural that the majority of the activity will fall in later years, so the first formal deliverable is not due until after the end of the second year (M26). However, no seminars, workshops or events are noted in the progress report, despite a suggestion in the DoW that the WP might include two-monthly virtual meetings – all the more so as scheduled work on Trust should be complete by now. We question how the 2 person-months spent on the WP have been used (acknowledging that this is a small amount).

The 2 person months spent on the WP were used:

  • to prepare and contribute to "all hands meetings" of the APARSEN project, which can be considered internal events; some of the regular teleconferences in which project progress is discussed are also considered internal events;
  • to organize events within organizations to spread the outcomes of the APARSEN project (large organizations, such as CERN and STFC, use resources from WP15 to discuss APARSEN deliverables within the organization).

-- ReneVanHorik - 2012-07-12

D16.1 - Software repository

D16.1 has been deferred to M20.

The DoW is unclear about what is to be delivered here, and why. For the repository to succeed by providing a useful resource, it will have to incorporate a complete software categorisation system, as well as categorise a wide range of existing software. This is remarkably similar to attempts made by other initiatives with much greater resources. Further, it is far from clear why other initiatives would wish an APARSEN repository to take over the fruits of their research.

The WP is intended to be small – only 11 person-months remain in the budget – but it may be worth considering cancelling it. Alternatively, the repository could be produced with a clear explanation of its purpose and value. If it is cancelled, careful thought will need to be given to re-allocating the resources, in keeping with the spirit of the DoW.

Task 4470: Interactive map

The interactive map cited in the section on WP44 now works. However, it is not a deliverable of this project, is hosted by another initiative, and is in any case of very limited value or interest, so it seems largely irrelevant. Unless its relevance is demonstrated, it should not be mentioned in future deliverables.


D22.1 - Persistent Identifiers Interoperability Framework

In the 1st year review the panel accepted this deliverable, noting that ‘It would be good to have a discussion about what the final project outcomes for PIs will be, e.g. will it be developed, will it be operationalised in VCoE?’

In the updated version of this deliverable, project partners outlined how ‘some of the identified services will be designed, taking the IF as a reference. In particular, by addressing the citability issues, advanced services for resources identified by different PI systems, can be implemented’. However, these activities are not currently included in the interim D11.2, and there is no indication that they will feed into the VCoE. It would be good to have them added to the planning of the VCoE.

D24.1 - Report on Authenticity and Plan for Interoperable Authenticity Evaluation System

D24.2 - Implementation and Testing of an Authenticity Protocol on a Specific Domain

D24.1 is a reasonable introduction to the issues. It is not clear how the APARSEN project intends to enhance the work already undertaken by the CASPAR project and the ongoing work of 3D-COFORM. The proposed model appears to rest on some assumptions, e.g. that the base repositories will comprise RDF triples. If this is the case, what are the implications for repositories not based on RDF triples?

The model is based on a lifecycle definition of a digital object. This does not allow for a distinction between record keeping requirements and digital preservation. Many of the components of the proposed authenticity and provenance interoperability model appear to be more aspects of the record keeping process, e.g. inheritance of properties from super- to sub-processes, inheritance along processing steps, merging metadata of intermediate steps etc. It would be useful to understand the implications of the model should a boundary be drawn between record keeping and digital preservation, which may dictate a different representation of use and re-use over time from within a digital preservation repository. This is reflected in D24.2 where the UKDA use case states that “authenticity is ‘assumed’ from the point in the lifecycle where the UKDA has custody”.

The proposed model relies on ‘a question of convention’, e.g. ‘if we regard that persons carrying out a process carry out all of its sub-processes’, while at the same time acknowledging that this only applies until the moment ‘we encounter that one person left too early’. There appear to be several such assumptions required for the model to work (camera settings, environmental conditions, configurations). This raises the question of what happens when one or more of the assumptions is not satisfied. This would appear to be an intrinsic flaw in the model as described, especially if the failure of an assumption is not discovered early on. This is noted in the second D24.1 document, which states that “the use of inference rules introduces difficulties with respect to the evolution of knowledge”.

Overall, this is a sound description of the authentication/provenance domain, but it is not clear where the current state of the art has been extended. Similarly, it may have been useful for the project to discuss the two key commercial systems in this space as well as the two open source systems selected (6.5.1).

The paper D24.2 is an excellent set of case studies of the model presented in D24.1. It grounds the model in current practice and gives a very clear representation of the current state of practice in the real world and the areas where the model needs to be enhanced in order to work in the real world.

The case studies show ‘current practice in the production environments of LTDP system as somewhat distant from the current debates on best draft practice in academic circles’ (p 40). The onus is on APARSEN and its proposed VCoE to ensure that this gap is lessened, including:

  • agreeing clear best practice in the areas of authenticity and provenance
  • conversion of conceptual best practice into an accurate structured data model/schema
  • guidelines on technical and procedural implementation for:
    • development of supporting data/metadata capture and management tools
    • deployment of such tools throughout the DR lifecycle.

This is acknowledged in D24.1 but needs to be addressed strongly. The UKDA case study makes it clear that this would be a sine qua non for its movement away from current practice (p54). This is reflected in Section 6, but in a weak and non-committal manner. It should be a primary focus of project planning for the VCoE going forward.

Overall, these two deliverables are of acceptable quality.

The reviewers' comments on both D24.1 and D24.2 confirm that the direction taken by the work package (a crucial part of the whole stream) is convincing. Of course, it has to be further developed and implemented as part of the VCoE initiative. The aspects discussed in Section 6 of D24.1 have been considered in the case studies analysis (specifically in the e-health case) and handled, according to the methodology, as one (relevant but not sufficient) component of a continuing approach to digital preservation.

-- MariellaGuercio - 2012-06-18

D26.1 - Report and strategy on annotation, reputation and data quality

D26.1 has been deferred to M18.

Should be ready by mid July

-- DavidGiaretta - 2012-06-20

D33.1a - Report on Peer Review of Research Data in Scholarly Communication

The version reviewed – indicated by the coordinator as the final version – is a version marked ‘Released’.

As a minor point, illustration 1, which is identical to illustration 3, presents a model. The model shows 5 ‘manifestations’ arranged within four (rather than five) levels. This conveys the idea of some sort of data hierarchy, but the details confuse. Illustration 1 also confuses because its content is not referred to until later (and then only fleetingly).

Overall, this deliverable is of acceptable quality.

D33.1b - Report on Peer Review of Digital Repositories

D33.1b was presented as a draft at the 1st Year Review. The title of this deliverable in the DoW is ‘Final report on peer review and 3rd party certification of repositories’, with the scope to describe ‘possible requirements and research agenda in the area of data quality, using input from domain scientists and the range of archives and other data holders within the consortium and associated with it’. Overall, D33.1b fits this description, although in our view it presents some weaknesses that we suggested addressing in the 1st Year Review. We recommended that ‘The concepts of this deliverable should be further clarified and contextualised within a more comprehensive and impartial state-of-the-art analysis. Need more information on the methodologies used, e.g. why and how.’ Such a comprehensive and impartial state-of-the-art analysis is not present in the final version of D33.1b. We furthermore recommended that ‘While there is substantial reference to other EC and international programmes there should be a greater declaration of how a given WP is going to contribute to the extension of knowledge or practice, e.g. WP16 KEEP, WP32 LIFE, WP33 TRAC, DANS, DRAMBORA, WP34 CASPAR, WP35’. In the DoW, DRAMBORA is mentioned in relation to self-auditing adopted in national legislation, for example in Italy. However, this is not mentioned or analysed in the final version of D33.1b.

This deliverable is of acceptable quality, with a caveat: a comprehensive and impartial state-of-the-art analysis should be added to the final version.

This is beyond the DoW and the MoU supported by the EU – this WP was added at the specific request of the EU. However, the state-of-the-art analysis is certainly something that we could do.

-- DavidGiaretta - 2012-06-16

The adoption of audit processes in Italian legislation can be easily added to the report

-- MariellaGuercio - 2012-06-18

The reviewers have pointed out somewhere that there is a contradiction between the methodology developed by WP24 and the development of a certification and audit strategy. I cannot understand the contradiction. On what basis can certification be provided without a method for evaluating the authenticity evidence, based on the life cycle or on a continuum approach? The fact that much crucial information has to be collected early, maintained, monitored, submitted, preserved and carefully documented simply means that certification could be easier and less expensive, and the quality of the repository more reliable.

-- MariellaGuercio - 2012-06-18

CINES may have information on ISO std (ISO 14641-1 2012) used in France (NF Z 42-013) "Electronic archiving -- Part 1: Specifications concerning the design and the operation of an information system for electronic information preservation". Audit done by Inter-Ministerial Service for Archives (SIAF).

CINES went through DRAMBORA - may be able to say how/whether it can be compared with DSA/ISO audits

"A comprehensive and impartial state of the art analysis should be added to the final version." seems to be outside the scope of the deliverable.

-- DavidGiaretta - 2012-06-19

Mariella could supply an update of the situation in Italy regarding government legislation. We could ask all partners for information on their own countries.

-- DavidGiaretta - 2012-06-19

Correction - CINES did NOT use DRAMBORA.

So CINES and CINI CAN add some extra details (national legislation) in a week (they could also say what happens in other countries – ask partners – even the statement that there is nothing would be useful), BUT we *CANNOT do a "comprehensive and impartial state of the art analysis" within this WP - the WP and the ToC were set up according to the Unit's wishes.*

-- DavidGiaretta - 2012-06-19

- CINES and Mariella will write a paragraph each about national legislation
- other partners will check their own national legislation
- time 1 week
- include these in the deliverable.

-- DavidGiaretta - 2012-06-19

Regarding the implementation of the test audits by the members of PTAB, as indicated in a list of participants sent by the coordinator to the PO during periodic review 1 in February 2012: the use of so many non-EU experts should be further clarified and justified, as should the timing of these audits in relation to the finalisation of the ISO standard that is amongst the activities promoted by PTAB. The consortium needs to provide detailed evidence and measures of how these test audits and related activities are supporting the research required to set up the European Framework and the ISO Digital Repository Audit and Certification processes.

In particular it is noted that twelve attendees (average) for each test audit is well above expectations. It should be explained why PTAB members from outside the APARSEN team had to be involved instead of APARSEN members, what specific skills the non-Europeans bring and would justify their involvement and how many trips per person were actually made to assess the acceptance of the cost related to the test audits.

We state in D33.1B: An added purpose for the test audits for ISO 16363 was to verify the practicality of the audit process and understandability of the metrics, help to develop the process for the audits and to check the consistency of the judgements of the auditors. This last point required that, whereas a normal audit would involve two auditors, the test audits had to involve larger groups of auditors.

-- DavidGiaretta - 2012-06-16


This WP concerns external symposia and workshops.

The progress report states that this WP started “recently”, though it was due to start last July. As with WP15, it is natural that much of the effort will come later. And, also as with WP15, it is not clear how the reported 7.76 person-months – around 155 days, or 12% of the budget – have been spent. Running a workshop at an existing conference and building an Excel spreadsheet of contacts from PARSE.Insight, DPE, SHAMAN and CASPAR would not justify this (especially as APA is already involved in two of these).

D41.1 - Workshops Summary and Planning Report

This is an interim report, the final deliverable being due in M46.

The deliverable states that “The success criteria for the APARSEN/APA conferences and workshops will be measured by:

  • Number of attending persons and the variety of organizations represented
  • which APARSEN pillars, i.e. Trust, Usability, Sustainability, and Access that were covered
  • which APARSEN deliverables were covered
  • the results/outcomes of the discussions or analysis made during the events”

Of these four proposed measures, the first and fourth are sensible and essential; the second and third are not measures and can safely be ignored in this context. More detail will be needed on the first and fourth measures, such as:

  • How many events are planned
  • The number of attendees of the events of Year 1
  • The numbers of attendees that will be considered ‘successful’
  • What ‘analyses’ and ‘outcomes’ are proposed (we assume that they will include, at a minimum, surveys of attendees’ opinions of the events, understanding of the issues, work, intentions towards the VCoE, etc), how these will be measured and target success levels. This is essential for a credible approach to dissemination at this level.

This deliverable is acceptable only as an interim. There needs to be more detail on achieved activities and more emphasis on forward planning in the next version.

D43.1 - Report on Survey of Training Material/ Assessment of Digital Curation Requirements

The title of this deliverable in the DoW is ‘Survey for the assessment of training material/ Assessment of digital curation requirements’. The scope of this survey is to ‘collect information about the types of training courses which are on offer, together with a critique of their coverage and quality’. In its current form this deliverable is mostly a collection of survey results and materials from other projects and initiatives, rather than original research. The material presented is unbalanced, critical analyses are weak or absent, bibliographic references are insufficient, and the analysis of repository staff needs represents a deviation from the DoW. It is important that these issues are attended to, if only to avoid any risk that the originality of the work may be challenged by others.

Annex B lists training events identified. The period covered is not stated, and coverage is worldwide rather than European.

It is surprising that only one PLANETS event is listed; perhaps this results from the period chosen.

Large sections of the deliverable are content reformatted from DigCurV and from websites of the Digital Curation Centre and the Digital Preservation Coalition. We question how this can be considered research, and whether it adds any value to the community.

It is strange to see the same AIIM course listed six times in Appendix B. The six listings are the same course, delivered in different American cities. This should have been obvious to the author, as all six references have the same URL. It is all the stranger as the online version of this course is omitted, and its European offerings are missing too; and the course anyway includes only one hour of digital preservation training. The appendix purports to list training initiatives that were ‘analysed’, and it is not credible that these six offerings of the same course were analysed. As a side effect, Figure 1 is incorrect, as are conclusions drawn from it.

It is important that APARSEN articulates as soon as possible what it is proposing to deliver. There is currently a wide and increasing range of professional development activities available to digital preservation practitioners, and the project needs to be clear whether this is the market it intends to enter, as opposed to the education market, i.e. the development of tertiary curricula for the academic training of future digital preservation leaders, managers and practitioners.

The first deliverable for WP22, on formal (academic) training for digital preservation, is due at M22, by which time the consortium must be able to answer this question. The impact of APARSEN on training and education activities within the domain may be determined by the answer to this question.

The lack of clarity in this deliverable is very worrying. A good understanding of training – starting with what is available – is an essential prerequisite for the future work of the VCoE. This deliverable does not provide a usable basis.

This deliverable is not of acceptable quality.

Needs urgent attention

-- DavidGiaretta - 2012-06-16

The fact that the deliverable is a collection of survey results and not an original proposal should not in itself be taken as a critical (negative) aspect. Of course, in this sector the first step is to collect information about successful experiences and make it available.

-- MariellaGuercio - 2012-06-18

We need to 1) collect missing information about course materials and 2) add proposed courses strongly linked to the APARSEN research WPs - this will also allow us to link the research to the training and create training materials.

-- DavidGiaretta - 2012-06-19

D44.1 – Communication Plan

This deliverable ‘sets out the strategy for the APARSEN’s project internal and external communication activities throughout the lifetime of the project’ and would therefore be a key deliverable to support the project success in outreach.

D44.1 outlines communication strategies, activities, roles and responsibilities. However, no measures of project impact are provided, and many of the success indicators are decidedly unambitious for a project whose main goal is to create a network of networks.

Participation in several externally-organised events is noted.

Section 3.1 lists events which “…APARSEN/APA has been part of organizing…” It lists just three events, and there is little evidence of explicit APARSEN involvement, despite the claim cited here. Taking for example the only event for which an external reference is given (PV 2011), the programme makes no mention at all of APARSEN (though it refers frequently to APA).

Appendix B is a partial listing of potentially relevant third party events (workshops, conferences etc.). It is not clear what purpose it serves, as (in its updated form) it is confidential to the consortium.

The same Appendix B highlights events for which “…APARSEN currently plans to submit bid for a workshop or will organize a workshop.” Nine such events are highlighted; they appear to be well chosen.

This deliverable is of acceptable quality. However, it needs to be periodically updated based on management feedback, as recommended in the 1st Year Review, and merged with deliverable D45.1.1 (as suggested in the section below).

D45.1.1 - Stakeholder identification and communication strategy

The first part of this deliverable is very similar to D44.1, and the rest seems to target primarily memory and archival institutions. APARSEN should be about the records of science and about stakeholders in the widest sense, but throughout the communication materials memory institutions seem to dominate attention.

We see no reason for this document to continue with an existence separate from D44.1, and we suggest the two be merged. The consortium needs to ensure that future versions are enhanced to address the intended subject area. On this basis, it is rejected.

We were committed to deliver D45.1. We do intend to merge with D44.1.

It is true that we did not list (section 2.4) the various science archives out there. Otherwise I would have argued that we meant "Archival institutions" in a broader sense. How did we overlook that?!

-- DavidGiaretta - 2012-06-16

D46.1 - International liaison communication report

The DoW states that an interim version of D46.1 should be available for each review (PDF page 68). No such interim has been produced.

This should in any case be merged with D41.1

-- DavidGiaretta - 2012-06-16

D52.1 – Project Website

The project website is greatly improved. We are pleased to note that some deliverables have been published openly. There remains scope for improvement of the site, which is normal at this stage; we expect improvements by the next review.

We recommend that the site should be enhanced to make the purpose of the VCoE more evident (the website should be a central component of the VCoE) – at present this is rather ‘buried’ and slightly confused by the APA/APARSEN co-branding. For example, the relationship of the two initiatives is ignored on the joint home page. As another example, it is confusing to see that APA claims to be a ‘Centre of Excellence’ while APARSEN claims to be a ‘Network of Excellence’ (both on the ‘About…’ web pages); and it is not evident how the two initiatives’ work differ and/or overlap. We suggest that this is not sustainable. At some point soon, APARSEN needs to be clearer about its branding and presentation – probably as soon as the question of direction referred to in D11.2 is resolved.

The deliverable is accepted with a caveat regarding the remarks in section 6 below.

c. Milestones and deliverables

NOTE Note whether the planned milestones and deliverables have been achieved for the reporting period


The DoW recognises some milestones (DoW PDF page 74). However, there is no clarity as to what they represent other than dates. It is not clear what has to be done to achieve these milestones. This was pointed out in the first review report but has not been addressed.



No. Title Delivery date Status (Approved/Rejected) Remarks OUR RESPONSES
PR-P1-01-1_0 PERIOD 1 REPORT M12 Rejected See above.  
D11_1-01-1_0-M14 INTERIM Report on Comparison of Research Programmes M46 Acceptable interim See above.  
D11_2-01-1_0-M14 INTERIM Report on Common Vision M48 Acceptable interim See above.  
D12.1 Register/map of research activities, positions available and placements M14 Status uncertain. Would be rejected if submitted. See above.  
D13_1-01-1_0-M16 Intermediate Report about the Coordination of Common Standards M48 Acceptable interim See above.  
D14.1 Report on testing environments M20 Not reviewed Deferred from M14.  
D16.1 Software repository M20 Not reviewed Deferred from M20.  
D22_1-01-1_8 Persistent Identifiers Interoperability Framework M12 Approved in the Y1 review. Not reviewed again.  
D24_1-01-2_3 Report on Authenticity and Plan for Interoperable Authenticity Evaluation System M14 Formal approval in Y2 review Approvable quality. See above.  
D24_2-01-2_2 Implementation and Testing of an Authenticity Protocol on a Specific Domain M14 Formal approval in Y2 review See above.  
D26.1 Report and strategy on annotation, reputation and data quality M18 Not reviewed Deferred from M14.  
D33_1A-01-1_0 Report on Peer Review of Research Data in Scholarly Communication (Part A of D33.1) M14 Formal approval in Y2 review Approvable quality. See above.  
D33_1B-01-1_0 Report on Peer Review of Digital Repositories (Part B of D33.1) M14 Formal approval in Y2 review Approvable quality but with caveat. See above.  
D41_1-01-1_0-M14 Workshops Summary and Planning Report M46 Formal approval in Y4 review Acceptable interim. See above.  
D43_1-01-3_0 Report on Survey of Training Material/ Assessment of Digital Curation Requirements M14 Formal approval in Y2 review Not of approvable quality. See above.  
D44_1-01-2_0 Communication Plan M12 Approved See above  
D45_1-01-1_0 Stakeholder identification and communication strategy M10 Rejected See above.  
D46.1 International liaison communication report M46 Yearly status report not submitted See above  
n/a APARSEN (269977) Month 16 checkpoint and response to review   n/a See above.  

d. Relevance of objectives

NOTE Note whether the objectives for the coming periods are (i) still relevant and (ii) still achievable within the time and resources available to the project… also whether the approach and methodology continue to be relevant.

There has been no change in relevance. See Year 1 Review report.

e. For Networks of Excellence (NoEs) only

NOTE Note how the Joint Programme of Activities has been realised for the period and whether all the planned activities have been satisfactorily completed.

The M16 Checkpoint Review was requested by the review panel to be able to assess the M14 deliverables and to respond to material concerns from the panel regarding the direction of the project.

The M14 deliverables have been received although the panel is concerned at the lack of originality of the reports produced to date. Continuing to collate results from other projects without an underlying and unitary conceptual framework is not acceptable.

The panel still has major concerns regarding the nature of some of the project deliverables and the ability of the project to achieve its stated goal of ‘building a long-lived Virtual Centre of Digital Preservation Excellence’.

In this context it is clear that the JPA has not come together in such a manner as to provide the panel with confidence that the project can achieve its objectives.


a. Assessment of the use of resources

NOTE Note use of resources, i.e. personnel resources and other major cost items. In particular, indicate whether the resources have been utilised (i) to achieve the progress and (ii) in a manner consistent with the principle of economy, efficiency and effectiveness. Note that both aspects (i) and (ii) have to be covered in your answer. The assessment should cover the deployment of resources overall and by each participant. Are the resources used appropriate and necessary for the work performed and commensurate with the results achieved? Are the major cost items appropriate? In your assessment, consider the person months, equipment, subcontracting, consumables and travel.

In general we have no reason to question the usage of resources other than as mentioned elsewhere in this report.

b. Deviations

NOTE Note If applicable, please comment on major deviations with respect to the planned resources.

The panel is concerned about the knock-on effects that the postponement of D14.1 and D16.1 – currently deferred to M20 – will have on the project. We are concerned that the dependencies of these deliverables (in particular D14.1) within the project's overall future activities will lead to further slippage. We have to date observed no mechanism in the planning to deal with the potential implications of this slippage from timing, resourcing or funding perspectives. These are key deliverables for the project and should be key contributors to the fullness and robustness of a Virtual Centre of Excellence.

The panel is also concerned about deviations in D43.1 - Report on Survey of Training Material/ Assessment of Digital Curation Requirements with regard to the analysis of repository staff needs.

Without more clarity around effort and costs (in particular a detailed breakdown of allocated budget per WP and per Partner and proper Gantt charts) it is impossible to be certain that other material deviations have not occurred.


a. Technical, administrative and financial management of the project

NOTE Note quality and effectiveness of the project management, including the management of individual work packages, the handling of any problems and the implementation of previous review recommendations. Comment also on the quality and completeness of information and documentation.

Our concerns about project management are explained in section 2b under the heading PR P1 01 1_0.

b. Collaboration and communication

NOTE Note quality and effectiveness of the collaboration and communication between the beneficiaries.

Refer to the first review report. We expect material improvements by the next periodic review.

c. Beneficiaries’ roles

NOTE Note assessment of the role and contribution of each individual beneficiary and indicate if there is any evidence of underperformance, lack of commitment or change of interest.

Refer to the first review report. We expect material improvements by the next periodic review.


a. Impact

NOTE Note evidence that the project has so far had, and is it likely to have, significant scientific, technical, commercial, social or environmental impact (where applicable)

As mentioned in the 1st Year Review, core impacts are likely to lie in the primary outcomes related to the VCoE, although this depends on exactly what the VCoE will be delivering, e.g. training and information provision, or an open platform for accessing digital preservation services. In our view this was not sufficiently clarified at the M16 checkpoint. The project has not yet devised evidence-based measures of its potential scientific, technical and commercial impact. This needs to be addressed urgently.

b. Use of results

NOTE Note exploitable results (functionality, purpose innovation) and comment on whether the plan for the use of foreground, including any updates, is still appropriate… also on the plan for the exploitation and use of foreground for the consortium as a whole, or for individual beneficiaries or groups of beneficiaries, and its progress to date, identifying any technical and market considerations, including IPR and third party rights, which may need attention

The nature and extent of the expected exploitable results from the project remain unclear. The balance between commercial and non-commercial exploitation also remains unclear, in particular in relation to the VCoE. An exploitation register of APARSEN deliverables, suggested in the 1st Year review, has not been produced.

c. Dissemination

NOTE Note whether the dissemination of project results and information (via the project website, publications, conferences, etc.) has been adequate and appropriate.

Dissemination should be one of the key activities of an NoE. However, in the first year of the project poor results have been achieved in this respect, in terms of both active and passive dissemination. At this checkpoint review some effort has been made by the project towards social media; however, the dissemination and communication plans and activities continue to present weaknesses which could undermine the project's success.

In the first report, we pointed out that links to the APARSEN “…website from partners’ websites are absent in most cases and poor in others, despite the commitment in the DoW to include them”. For this review, we revisited some partners’ websites, though due to resource limitations, we limited this to 14 partners’ websites. Our findings for these 14 websites are tabulated below.

Beneficiary Website Link to APARSEN website? OUR COMMENT
APA Yes  
CERN First review: No
This review: Yes
STM First review: No. APARSEN mentioned in 2 CVs. Site links to APA
This review: Yes
FTK First review: Yes
This review: Yes
STFC First review: Yes. Very buried – namely: STFC Home > e-Science Home > News and events > News archive (pre-2010) > News 2011 > STFC data management projects feature at APA Conference
This review: Yes, very buried, as above.
CSC First review: Yes
This review: No (broken link)
Redirect fixed - link works
DNB First review: Yes
This review: No
DPC First review: Yes. Not easy to find (‘newsroom’ articles)
This review: Yes, but as above.
AFPUM First review: Yes. English site only, not German site. Link only, no explanation
This review: No (broken link, from English site only)
Link works
BL First review: Yes. Unexpected place – ‘datasets programme’ not ‘digital preservation’
This review: No
ESA First review: No
This review: No
KNAW First review: No
This review: Yes (only in News article- then broken link)
KB First review: No. Mentioned in the small print of annual report (no link)
This review: Mentioned in the small print of PDF files, no link, Dutch only.
AIRBUS First review: Airbus website search engine offline for at least 7 days.
This review: No

In summary, some links have been added, but 8 of the 14 websites above do not provide adequate links. Some websites now have broken links, presumably because of the change of URL with the recent redesign.

This situation is unsatisfactory. The requirement is clear: at a minimum, every partner must clearly promote and explain its participation in APARSEN on its website. This must be on one or more pages in a logical location, easily found and navigated to by readers who are not entering APARSEN as a search term (therefore a mention buried in a news article or PDF document does not suffice), and must link to the current APARSEN website home page. Preferably, every partner should also report on APARSEN news, developments etc.

d. Involvement of potential users and stakeholders

NOTE Note whether potential users and other stakeholders (outside the consortium) are suitably involved in the use of results and dissemination (if applicable).

It remains too early in the project to determine the degree of success in engaging potential users and stakeholders to date. The revised version of the stakeholder plan (D45.1) now includes mention of large public sector archives, but this is inadequate to address this task. The deliverable remains weak and lacks evidence-based measures of success for this task.

e. Links with other projects and programmes

NOTE Note consortium’s interaction with other related Framework Programme projects and other national/international R&D programmes and standardisation bodies (if relevant).

The deliverables make extensive reference to links to other initiatives. However, most of them are initiatives that already involve APARSEN partners, particularly STFC/APA. This raises questions about whether any meaningful exchanges are taking place. It is possible to interpret many of the deliverables' references as meaning ‘we are writing down what we already know, because it is easy for us to do so’ – though this is only a possibility. The consortium may wish to establish meaningful exchanges with other initiatives.


NOTE Note whether other relevant issues (e.g. ethical issues, policy/regulatory issues, safety issues) have been handled appropriately.

The website has been improved since the first review. It now contains worthwhile content. However, the site and domain are shared with APA; thus the homepage focuses more on APA than on APARSEN. Although this reflects the fact that APARSEN is an extension of APA, it is unfortunate for the visibility and identity of the EU-funded project.

We note that since this report was first drafted a new logo has been introduced for APARSEN on the website. The new logo emphasises the first three letters of APARSEN – APA. One effect of this change is to reinforce the APA “brand” at the expense of the APARSEN “brand”.

Original logo New logo

The appearance of the new logo leads to questions about whether APA will take over from APARSEN; whether APARSEN will be a part of APA; or whether the two initiatives will continue side by side. It further muddies the waters already clouded by the unclear decision to share a website between the initiatives.

As APA is only one beneficiary of the APARSEN NoE project, it is not clear what is the position of the entire consortium – and particularly those who are not members of APA – regarding the reinforcement of the APA brand at the expense of the EU-funded APARSEN brand, whose goals (provision of a coherent policy framework and funding models and the development of European based training and education for digital preservation with a view to a Virtual Centre of Excellence) differ from APA’s goals.


NOTE Note recommendations, together with outline timescale for implementation, describing the actions that need to be undertaken in order to better pursue the project goals or to adjust them where appropriate. The recommendations may cover technical as well as organisational and/or managerial, dissemination and exploitation aspects. The review may recommend any course of action that may be required in order to better achieve the project objectives, improve performance and/or remedy non-performance, including modifications to the workprogramme. These should include but may elaborate further on the recommendations under b, c and d of the Overall Assessment

The reviewers recommend:

Recommendation R2.1:

The consortium should clarify the relationship between APA and APARSEN, between the ‘centre of excellence’ of one and the ‘network of excellence’ of the other, with a view to deciding on the long-term branding. We recommend this on the basis that having both APA and APARSEN, with both sharing a website and both claiming responsibility for events, is not sustainable. The decision on how to manage this needs to be taken, though not necessarily fully executed, by the next review, so as to avoid promotion and awareness-building for a ‘brand’ name that rapidly disappears.

Recommendation R2.2:

The consortium should consider granting reviewers access to its collaboration platform, so that reviewers can access evidential documents such as risk registers, quality assurance records, the latest versions of event lists, etc. This is suggested because the deliverables schedule severely limits the reviewers’ visibility of such documentation; it would be to the benefit of all parties if the reviewers could have ready access.

No problem as far as I can see

-- DavidGiaretta - 2012-06-16

Recommendation R2.3:

We recommend that D11.3 Virtual Centre of Excellence organisation (currently planned in only one final version at M48) should additionally be presented as an intermediate deliverable at the following project reviews. This should include a clear description of how all deliverables will be tied into the Virtual Centre of Excellence, which is the primary objective of the APARSEN project.

Surely the research helps to generate "excellence" and provide a coherent view on digital preservation research. We have the schedule for the VCoE in D11.2.

-- DavidGiaretta - 2012-06-16

Recommendation R2.4:

The project should conduct a critical path analysis of all work packages with a view to providing a rationale/justification for the continuation of all aspects of the project. The critical path analysis should classify all proposed activities as:
  • central to the delivery of the VCoE;
  • of clear scientific/technical value to the digital preservation field;
  • other.

It is not clear that a "critical path analysis" makes sense.

In brief:

- The research WPs (Streams 2 and 3) are not mappable to the VCoE except in a general way.

- Stream 4 (dissemination): pretty fundamental. Maybe we broke it into too many WPs, but are any of the activities unnecessary?

- Stream 1: spreading the word inside organisations and providing the basis for testing and plans for the VCoE. WP12 is a possible deletion.

-- DavidGiaretta - 2012-06-16

I think a critical path analysis would be MOST useful. Define a few main outcomes that lead toward a VCoE with a common vision, and redirect all efforts to only those activities that lead towards this. The critical path analysis should be the main focus now.

-- HildeliesBalk - 2012-06-18

Certainly "critical analysis"

-- DavidGiaretta - 2012-06-19

Recommendation R2.5:

Issues to consider for continuation:

1 provide all deliverables and the periodic report of period 1 in acceptable quality;

We should be able to do this - D43.1 is probably the biggest problem.

-- DavidGiaretta - 2012-06-16

2 rectify all slippages, or incorporate them into the planning;

We can do this

-- DavidGiaretta - 2012-06-16

3 have fully realised project planning for the rest of the project, including the establishment of the VCoE;

How much more detail is needed? There are a lot of real uncertainties.

-- DavidGiaretta - 2012-06-16

4 have re-framed all key project deviations in conjunction with the Project Officer (in particular issues relating to education and training);

5 be able to show how APARSEN will provide coverage and uniqueness in relation to other projects;

Not sure what this means

-- DavidGiaretta - 2012-06-16

6 provide a clear roadmap of developments to be undertaken and their alignment with a future Virtual Centre of Excellence;

7 be able to show which of the project deliverables will be incorporated into the VCoE and how, and which will not.

They will all be relevant to the VCoE. There is some confusion about the research agenda, which we need to explain more carefully.

-- DavidGiaretta - 2012-06-16

The project should continue only if all these points, and the other recommendations, are addressed satisfactorily; failing that, it should be terminated.

Topic revision: r11 - 2012-07-12 - ReneVanHorik