
Calls are held every Thursday at 1 pm Eastern Standard Time (GMT-5) – convert to your time at http://www.thetimezoneconverter.com

Please add additional agenda items or updates – we always welcome suggestions

Updates

  • Weill Cornell – 1) working on XML data from HSS (a CTSC partner): using Google Refine (testing version 2.5) to convert from XML to RDF; 2) mass ingest of ~36,000 publications using the Scopus Harvester – fixed some bugs along the way, and testing whether the server can accommodate such a volume; 3) another of our CTSC partners, Hunter College, also has its VIVO site (http://vivo.hunter.cuny.edu/vivo/people) up and running – we'll try rebuilding the index of our customized local federated search site.
  • Stony Brook – interested in learning about the Florida work on logging provenance – saving Harvester models. VIVO supports separate graphs for different data sources, but it's still cumbersome. Brian Lowe adds that basic metadata about graphs will be standardized so that it will be possible to display helpful information about where data came from; auditing information won't be provided yet.
  • Scripps – North Texas scripts are built, using a database and the JDBC Harvester examples
  • Johns Hopkins
  • Indiana – successfully installed all the UF data locally, so can work with the co-author network thresholding issue
  • Florida – finished courses from 2008 through the 2011 Fall term – 7,672 unique courses and 66,800 unique course sections. Will ingest each successive semester a week or two after it ends. Working on more improvements to the Harvester to account for data already in VIVO.
  • Duke – working on grants – pulled in a flat file initially. Recently improved access to the data store.
  • Cornell – Chris Westling has been working on HR ingest code, unifying 5 separate spreadsheets we get with people, job, and department information. Using an intermediate database and moving the data across to VIVO with D2R. Working on a cron job.
  • Colorado – getting departments, institutes, and other units into VIVO, along with appointments and other affiliations. Starting with faculty and faculty administrative positions, highlighting the home department relationship.
  • Brown – still in the early stages – working on identifying data. Faculty submit PDF and Word documents, and will work with a vendor to structure the data; plan to use VIVO as the way to update their CVs.
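The Weill Cornell item above mentions converting XML to RDF (there via Google Refine). As a minimal sketch of what such a conversion does – with an invented toy schema and an assumed namespace, not the actual HSS data or Refine transform – the idea is to walk the XML and emit one RDF triple per fact:

```python
import xml.etree.ElementTree as ET

# Hypothetical input loosely resembling a personnel/publications export;
# the real HSS schema is not shown in the notes.
SAMPLE_XML = """
<people>
  <person id="p1">
    <name>Jane Doe</name>
  </person>
</people>
"""

BASE = "http://vivo.example.edu/individual/"  # assumed local namespace

def xml_to_ntriples(xml_text):
    """Convert the toy XML above into a list of N-Triples lines."""
    triples = []
    root = ET.fromstring(xml_text)
    for person in root.findall("person"):
        uri = "<%s%s>" % (BASE, person.get("id"))
        triples.append("%s <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> "
                       "<http://xmlns.com/foaf/0.1/Person> ." % uri)
        name = person.findtext("name")
        if name:
            triples.append('%s <http://xmlns.com/foaf/0.1/name> "%s" .'
                           % (uri, name))
    return triples

for line in xml_to_ntriples(SAMPLE_XML):
    print(line)
```

In practice a tool like Google Refine (or the Harvester's XSLT-based translation) handles the mapping declaratively rather than in hand-written code, but the input/output shape is the same.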

UF Demos and Discussion

The University of Florida will demonstrate the latest on their course ingest and their implementation of logging for the Harvester and for VIVO itself, including the ability to produce reports of all the changes made to the UF VIVO every day.

A listener on the web context picks up anything coming through the web interface – self-editing or the Harvester – and logs it to a separate file.

Beginning to do logging – the who, what, when, and how of what has happened – but not yet the tools to leverage that in terms of blocking editing.
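The "who, what, when, how" logging described above can be sketched as follows. This is an illustrative outline only – the file name, logger name, and record fields are assumptions, not UF's actual implementation, which is in the VIVO/Harvester Java code:

```python
import json
import logging

# A dedicated "audit" logger writes change records to a separate file
# (hypothetical name), distinct from the application log.
audit = logging.getLogger("vivo.audit")
audit.setLevel(logging.INFO)
handler = logging.FileHandler("edit-audit.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))  # "when"
audit.addHandler(handler)

def format_edit(who, what, how, added=True):
    """Build one audit record: who made the change, what triple changed,
    and how it arrived (self-editing vs. the Harvester)."""
    return json.dumps({"who": who, "what": what, "how": how,
                       "action": "add" if added else "remove"})

audit.info(format_edit("jdoe", '<p1> foaf:name "Jane Doe"', "self-edit"))
```

Records in this shape are what make the daily change reports mentioned above straightforward to produce: a report is just a filter/aggregate over the audit file.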

vivo.ideascale.com

Please check out http://vivo.ideascale.com and vote – developers are encouraged to participate, not just implementers and users.

Notable Development List Traffic

  • People tab – how to show people by subject – Subject areas could also be set up as a new menu page – see Cornell test site.
  • CVs and biosketches – using the work that's completed, developing requirements for the missing interactive control component
  • Exceptions when starting VIVO (WCMC) – the DB admin configured MySQL binlog_format to MIXED, which resolved the problem
  • Connecting the Harvester to a RESTful web service for the Dutch Metis service – John Fereira responded that one could use the SimpleXMLFetch or JSONFetch classes he wrote and adapt them to retrieve data from the Metis service, replacing JDBCFetch or one of the other "Fetch" tools currently available in the harvesting pipeline. The classes also require a couple of additional jar files, which he can provide, for parsing JSON/XML.
  • Report of a Freemarker error in the wild
  • Processing images for VIVO
  • Joseki query performance
  • Targeting a specific graph for removal of content – Nicholas documented the procedure on the implementation section of this wiki
  • Removing the publications for one faculty member
  • Simple reasoner error: "a NullPointerException was received while recomputing the ABox inferences. Halting inference computation." – any clues?

Implementation Fest update

Registration is open for the Implementation Fest at http://vivoweb.org/2012-vivo-implementation-fest and a draft schedule is online.

Items for next week

Call-in Information

1. Please join my meeting. https://www1.gotomeeting.com/join/322087560

2. Use your microphone and speakers (VoIP) - a headset is recommended. Or, call in using your telephone.

Dial +1 (773) 897-3008
Access Code: 322-087-560
Audio PIN: Shown after joining the meeting

Meeting ID: 322-087-560

last meeting | next meeting