Attendees

Bram Luyten - @mire
Elin Stangeland - Cambridge University Library
Iryna Kuchma - eIFL.net
Sarah Molloy - Queen Mary, University of London
Sarah Potvin - Texas A&M University
Valorie Hollister - DuraSpace

Time

  • 10:00am Eastern/14:00 UTC

Dial-in

We will use the international conference call dial-in. Please follow directions below.

  • U.S.A/Canada toll free: 866-740-1260, participant code: 2257295
  • International toll free: http://www.readytalk.com/intl
    • Use the above link and input 2257295 and the country you are calling from to get your country's toll-free dial-in number
    • Once on the call, enter participant code 2257295

Discussion Topics

 

1) News / announcements / events, etc. (discussion leader: Val)

2) Metadata project (discussion leaders: Maureen, Sarah, Amy, Bram)

3) Any progress on JIRA issue DS-1583, "Adopt interface translating by Translatewiki.net"? No mailing list responses or comments on JIRA were seen. No news at this point; the Translatewiki.net integration PHP code still needs to be reviewed. (discussion leader: Bram)

4) Any new JIRA issues? https://jira.duraspace.org/browse/DS#selectedTab=com.atlassian.jira.plugin.system.project%3Aissues-panel (discussion leader: Val)

  • "More Details Needed" - try to ask relevant questions in the comments, prompting the original submitter to provide more detail
  • "Volunteer Needed" - these issues are waiting for a volunteer. You can put some of these tickets in the spotlight by commenting on or +1'ing them to indicate that they are really important to you.
  • "Received" - help to judge whether the incoming tickets contain enough detail

 


Discussion Notes/References

1) News

2) Metadata project

  • MetadataMapper
    • will replace, assign, and merge metadata
    • Mark Wood and Richard agree that we need some validation capability, but not in this tool
    • doesn't completely solve migration issues - the mapping for the tool needs to be defined in configuration rather than being available as a curation task
    • Sarah P.: can we test out the tool before it is included in DSpace? She will post a comment on the proposal page asking whether the tool will be in 4.0 and noting that DCAT is ready to help test, either now or during the testathon
  • Project chunks
  • Review group
    • Sarah P.: was on a NISO review group - a great way to get feedback
    • Sarah M.: you can get people to participate for a short time because the work is targeted and limited; locally they refer to it as a "task & finish group"
    • Val: perhaps a review group / task & finish group could be used to validate the DC mapping; it may allow for broader participation, and DC compliance experts outside the DSpace community could take part because it is a defined, short period
      • Bram: note the difference between the terms "mapping" and "crosswalk" from the DSpace developers' perspective - "crosswalk" usually refers to OAI-PMH, so we should use the term "mapping"
      • the metadata team has done the preliminary DC mapping; it needs to be reviewed and validated to confirm the end result is compliant (an illustrative sketch of this kind of field-level mapping appears at the end of these metadata project notes): https://docs.google.com/a/fedora-commons.org/spreadsheet/ccc?key=0AgU-htsSmo31dEtaM1M1Q2E1NlRxNG11ZHFrSkMxNFE#gid=0
      • Sarah P.: is going to the DC conference in two weeks and will try to find some resources for a review group on the mapping
      • Elin: Jessica Lindholm from Malmö University(?) in Sweden is a metadata expert
      • Sarah P.: the metadata team needs to clean up the mapping a bit before the review group starts
    • Sarah: also believes that we need a review group to validate the staging/phases
      • it is challenging to unpack all the issues - technical infrastructure issues, DC compliance issues, etc.
      • we need DC experts to validate as well as DSpace metadata experts, plus the broader community's sense of what is supportable
    • Actions:
      • fix the references to the preliminary mappings on the proposal page - make them easier to find?
      • add a link to the OR13 presentation to the proposal page
      • clean up the preliminary DC mapping
      • look at other applications using DC - Omeka uses unqualified DC; others?
      • think about / ask others about who could participate in the review groups
      • contact the volunteers to date to update them and ask for their feedback
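
To make the mapping discussion above concrete, here is a minimal sketch of the kind of field-level mapping the review group would validate. It assumes nothing about the team's actual decisions: the specific pairs below are illustrative only, and the authoritative work in progress is the spreadsheet linked above.

    # Minimal sketch of a DSpace-to-Dublin-Core field mapping (Python).
    # The pairs are assumed examples for illustration; the real preliminary
    # mapping lives in the metadata team's spreadsheet.
    dspace_to_dcterms = {
        "dc.title":                "dcterms:title",
        "dc.contributor.author":   "dcterms:creator",
        "dc.date.issued":          "dcterms:issued",
        "dc.description.abstract": "dcterms:abstract",
    }

    def map_field(dspace_field):
        """Return the DCTERMS term a DSpace field maps to, or None if unmapped."""
        return dspace_to_dcterms.get(dspace_field)

    # Example: map_field("dc.contributor.author") -> "dcterms:creator"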

3) JIRA issue DS-1583

No news at this point; the Translatewiki.net integration PHP code still needs to be reviewed (see agenda item 3 above).

Action Items

1) Metadata project - prepping for the review group

    • fix the references to the preliminary mappings on the proposal page - make them easier to find?
    • add a link to the OR13 presentation to the proposal page
    • clean up the preliminary DC mapping

Assignee: metadata team (Sarah P., Bram, Amy, Maureen)

2) Metadata project - research / brainstorm on recruiting

    • look at other applications using DC - Omeka uses unqualified DC; others?
    • think about / ask others about who could participate in the review groups

Assignee: ALL

3) Contact the metadata project volunteers to date to update them and ask for their feedback on the review group idea

Assignee: Val

4) Re-start JIRA reviews

a) Everyone should select a JIRA issue of interest (Bram's recommended view: https://jira.duraspace.org/browse/DS#selectedTab=com.atlassian.jira.plugin.system.project%3Aissues-panel)

    • Received: help to judge whether the incoming tickets contain enough detail
    • More details needed: try to ask relevant questions in the comments, prompting the original submitter to provide more detail
    • Volunteer needed: there are currently 155 issues waiting for a volunteer. You can put some of these tickets in the spotlight by commenting on or +1'ing them to indicate that they are really important to you.

b) If the issue you select merits DCAT discussion (you want to confirm the importance of the request, want input on the request, want to propose that DCAT send a message to the listserv to find volunteers, etc.), please schedule the month in which you'd like to hold the discussion on the 2013 DCAT Discussion Schedule. If your JIRA issue doesn't merit a DCAT discussion, use JIRA to follow up directly.

c) Monthly discussion leaders should use the DCAT Discussion Forum to start their discussions with the rest of DCAT.

d) Post links/updates to the JIRA issue (a summary of the DCAT discussion with links to the forum, etc.).

Assignee: ALL

 
   

1 Comment

  1. Elin made an excellent suggestion to look at other systems that are also using Dublin Core in a way similar to how DSpace uses it. I volunteered to find some information on how Omeka deals with schemas and standards. Here's what I found out:

    Element Set: The Omeka notion of an element set is a set of metadata elements that should be available to all item types. In a sense, it corresponds to the notion of metadata schemas in DSpace. The default element set in Omeka is the 15 standard Dublin Core elements. Namespaces are not mentioned. Interestingly enough, it doesn't seem possible to edit the standard element set from the user interface; you need to make changes to the database directly. This means less flexibility, but it ensures greater standards adherence. They do have a plugin that extends the default element set with the entire selection of metadata elements from DCTERMS.

    Resources for these claims (I may have misunderstood some of this, as I have never worked with Omeka myself):
    http://omeka.org/codex/Creating_an_Element_Set
    http://omeka.org/codex/Managing_Element_Sets_2.0 : reordering and adding labels are possible, but removing fields and creating new ones don't seem to be within the scope of what the UI can do.
    http://omeka.org/codex/Plugins/DublinCoreExtended_2.0 : the plugin that extends the standard set with DCTERMS fields

    Item Types: DSpace has system-wide metadata schemas, plus input forms that define which metadata is collected for a particular collection. Omeka has more strongly defined item types: basically, for each item type the administrator chooses which elements from the element set are displayed. While Omeka seems to adhere more strictly to standards when it comes to element sets, it appears to allow breaking compliance completely by permitting custom metadata elements in a particular item type. (A conceptual sketch of this follows the link below.)

    http://omeka.org/codex/Managing_Item_Types_2.0
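
    A conceptual sketch of the element set / item type distinction, purely as I understand it; this is not Omeka's actual API or database schema, and the item type and custom field names are made up:

        # Conceptual sketch only (Python); not Omeka's real code or data model.
        DUBLIN_CORE_ELEMENT_SET = [
            "Title", "Creator", "Subject", "Description", "Publisher",
            "Contributor", "Date", "Type", "Format", "Identifier",
            "Source", "Language", "Relation", "Coverage", "Rights",
        ]  # the 15 standard DC elements that every item type can draw from

        class ItemType:
            """An item type selects which shared elements to display and may add custom ones."""
            def __init__(self, name, displayed_elements, custom_elements=()):
                self.name = name
                # only elements that exist in the shared element set are displayed...
                self.displayed = [e for e in displayed_elements
                                  if e in DUBLIN_CORE_ELEMENT_SET]
                # ...plus free-form custom elements, which is where strict
                # DC compliance can be broken
                self.custom = list(custom_elements)

        # Hypothetical item type: shows four DC elements plus one custom field.
        still_image = ItemType("Still Image",
                               displayed_elements=["Title", "Creator", "Date", "Rights"],
                               custom_elements=["Camera Model"])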

    Multilingual metadata: Judging from the forum discussion below, Omeka doesn't address the challenges of multilingual metadata. DSpace does not do a stellar job here either, but at least we have a language field for each of the metadata fields.

    http://omeka.org/forums/topic/omeka-multilanguage-site

    Authority control / storing links instead of strings: I found a few examples of controlled vocabulary support, but I am unsure at this point how they work under the hood. (A rough sketch of the general idea follows the links below.)

    http://omeka.org/codex/Plugins/Library_of_Congress_Suggest : plugin offering auto-complete on LC name authorities
    http://omeka.org/codex/Plugins/Library_of_Congress_Suggest_2.0
    http://omeka.org/codex/Plugins/SimpleVocab : simple controlled vocabulary plugin
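
    A rough illustration of "links instead of strings", based on how DSpace-style authority control generally works rather than on either system's actual storage; the field name, the placeholder authority URI, and the confidence value are examples only:

        # Rough sketch (Python): plain string value vs. authority-controlled value.
        plain_value = {
            "field": "dc.contributor.author",
            "value": "Smith, John",   # just a string; nothing ties it to an authority record
        }

        authority_value = {
            "field": "dc.contributor.author",
            "value": "Smith, John",
            # placeholder link to an authority record (e.g. an LC name authority URI)
            "authority": "http://id.loc.gov/authorities/names/<record-id>",
            "confidence": 600,        # DSpace-style confidence that the match is accepted
        }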

    Other:

    http://omeka.org/codex/Working_with_Dublin_Core
    http://omeka.org/codex/Plugins/ItemRelations : plugin for relations between items