
Calls are held every Thursday at 1 pm Eastern Daylight Time (GMT-4) – convert to your time at

These calls now use WebEx – see the "Call-in Information" at the bottom of this page.

Please add additional agenda items or updates --


  • Brown – (Ted) – Still cleaning up data that has been loaded; met with people who maintain the local faculty information system and will be getting current faculty positions and titles from that.
    • Also created new local object properties to display service to the university and other types of service in line with how they typically appear on faculty CVs – looking at how to display date ranges using custom list views
  • Buffalo – (Mark) Has VIVO installed on a new production server and is configuring mod_jk
  • Colorado – (Stephen) Working on modifying some of the list views when faculty have multiple appointments in the same department; a SPARQL query that works via the interactive query interface throws a Java NullPointerException
    • Still pushing out new data updates every couple of days for Colorado Springs and Boulder campuses; this is the faculty reporting season
    • Met earlier this week with the Laboratory for Atmospheric and Space Physics and UCAR, the University Corporation for Atmospheric Research – UCAR has ~120 member universities and they want to explore a hybrid local and remote VIVO, since a significant number of their member institutions already have VIVO or VIVO-compatible systems
    • Also interested in Datastar, and will be attending the I-Fest
  • Cornell – Tim has continued to develop the horizontal tabbed interface as an alternative to vertical property groups
    • He's added a "view all" tab that Alex suggests could be set up to use a print stylesheet that could potentially include page breaks
      • What would the relationship be between a good print stylesheet and the DV-Docs CV export (implemented and live at Florida)? The CV export produces rich text to import directly into Word for further editing, so it would have advantages, but a print stylesheet option is likely not much work and is straightforward to use.
    • Question: is all the data for the page fetched when the page is loaded, or is data for the different tabbed sections only loaded as the user clicks on the tab?
      • Right now Tim has not modified how the data are loaded for the page, but "lazy loading" might be worth looking into as a way to address the performance of page loads for people with very large numbers of publications
    • still working on URITool
      • UF is missing some of the pieces like XSL
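On the print-stylesheet idea above, a minimal sketch might look like the following; the selector names (.tab-bar, .property-group) are illustrative assumptions, not VIVO's actual markup:

```css
/* Hypothetical print stylesheet for a "view all" profile tab. */
@media print {
  nav, .tab-bar, #footer { display: none; }        /* hide navigation chrome */
  .property-group { page-break-inside: avoid; }    /* keep a section together */
  .property-group + .property-group { page-break-before: always; }
}
```

Rules like these would only affect printed output, so the on-screen tabbed interface would be unchanged.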
  • Duke – (Richard) Upgraded test environment to VIVO 1.5 but had a question about exporting a large model – kept running into memory problems, so tried exporting the data in chunks by getting a handle on a dataset rather than a model – there are a few ways to issue a query there. Brian – definitely want to use the actual SDB dataset object, not Jena models, or you will miss out on the optimizations SDB applies to LIMIT queries.
    • Also found that with 1.5, named graphs have to have real URIs, not just strings. Modified the graph names in the SQL table and reactivated the graphs, which seemed to work. Brian – yes, going forward the graph names have to be valid URIs
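As a sketch of the chunked-export approach: page a CONSTRUCT query against the SDB-backed dataset, so SDB can push the limit down into SQL. The graph URI below is VIVO's conventional main content graph and the page size is arbitrary; repeat with an increasing OFFSET until no triples come back. (Without ORDER BY, SPARQL guarantees no stable row order across pages, so one is included here at some cost.)

```sparql
# Paged export: rerun with OFFSET 0, 10000, 20000, ... until the result is empty.
CONSTRUCT { ?s ?p ?o }
WHERE {
  GRAPH <http://vitro.mannlib.cornell.edu/default/vitro-kb-2> { ?s ?p ?o }
}
ORDER BY ?s ?p
LIMIT 10000
OFFSET 0
```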
  • Florida – (Matt and Nicholas) Fixed the indexing problem – had to use a Java debugger and go through the SQL tables to find the source; a manual index would fail, and incremental indexing had not been working for a while. The cause was a single form feed character pasted into VIVO from a PDF file in March 2011; the error cannot be reproduced by pasting the same text back into more recent versions, but can be if the data is uploaded as an n-triples file.
    • Now working on getting the logs to reflect manual edits, which previous versions used to record in vivo.all.log – Brian suggests modifying the auditor, since none of the logging functionality UF modified locally is built into VIVO; with 1.5 there is one central place to listen to all changes via the new RDF API, and image editing must not yet be going through it, which is why it still shows up in the logs. By switching to logging from the auditor in the RDF API, you should see all the edits again, but will have to filter out the inferred triples that the application inserts at the same time. (See dev list messages)
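A simple guard against the kind of data problem described above might look like this sketch: scan incoming literals for control characters (such as form feed, \x0c) that are illegal in XML 1.0 and can break downstream indexing. This is a generic illustration, not UF's or VIVO's actual fix.

```python
import re

# Control characters other than tab (\x09), newline (\x0a), and carriage
# return (\x0d) are illegal in XML 1.0 and can silently break Solr indexing;
# a form feed (\x0c) was the culprit in the Florida case described above.
_CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f]")

def sanitize_literal(text):
    """Replace disallowed control characters with a space before loading."""
    return _CONTROL_CHARS.sub(" ", text)

dirty = "Research overview\x0cpasted in from a PDF"
print(sanitize_literal(dirty))  # "Research overview pasted in from a PDF"
```

Running a filter like this over an n-triples file before ingest would catch the character whether it arrives through the editing interface or a bulk upload.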
  • Memorial University of Newfoundland – (Lisa and John) Just had a call with David Baker of CASRAI; talking about going to the October CASRAI conference in Ottawa and perhaps doing a half-day workshop on knowledge mobilization, as well as taking the chance to meet up with the VIVO team there.
    • Memorial University is adopting VIVO as the back end for its Yaffle tool – have the go-ahead for a 2-year project with the understanding that they need to extend the VIVO ontology or come up with a separate add-on ontology that models knowledge brokering/knowledge management. The university solicits research topics that citizen groups, private companies, and government units around the province suggest for collaborations with researchers at the university. The university holds four workshops a year in Newfoundland and Labrador where researchers present and local officials and entrepreneurs discuss possible joint projects; these ideas are brought into Yaffle-VIVO by individual users and by staff entry following events, and brokering then connects opportunities with researchers and research units.
    • Want to be able to model both the actors and the opportunities, visualize them, and generate reports collaboratively and offer the models to other people interested in public engagement, knowledge transfer, and related activities, as demonstrated on their public engagement site.
    • Will be sending out a poll to the ontology, dev and implementation lists to solicit interest in working on the ontology modeling
  • NYU – (Yin) Looking at the VIVO development model and came up with a good way of staying synchronized with the development master branch while checking in local changes to the search indexing and results display. Is substantially replacing some sections of code to convert VIVO to searching people only – redefining how the indexing and weighting work, using what is still a prototype; wants to be able to keep up with ongoing changes in the dev version so as not to have to continually re-apply local changes. Will write it up and solicit feedback from others for comparison.
    • could be straightforward to swap out the way that indexing is done for people, but less clear how alternative indexing would work on other types of entities like publications, events, organizations, etc.
    • is there a branching model that is recommended?  Jim – holding pretty close to the Gitflow model; Yin – use that in-house already
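A minimal, self-contained sketch of that Gitflow-style model – a long-lived develop branch with short-lived feature branches merged back via --no-ff. Branch and commit names are illustrative, not VIVO's actual repository layout.

```shell
set -e
demo=$(mktemp -d) && cd "$demo"
git init -q
git checkout -q -b develop                          # long-lived integration branch
git -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "baseline on develop"
git checkout -q -b feature/people-search            # local work branches off develop
git -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "local search-indexing changes"
git checkout -q develop                             # merge finished work back
git -c user.name=demo -c user.email=demo@example.org \
    merge -q --no-ff -m "Merge feature/people-search" feature/people-search
git log --oneline --graph
```

Keeping local modifications on feature branches this way makes it easier to rebase or re-merge them as the upstream development branch moves.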
  • Stony Brook – (Tammy) Erich has WebID working in sample code, is still looking at bringing it into VIVO, and will contribute the module. Upgraded to the latest Tomcat 7 and JDK 7, and Erich is working with both of those.
  • UCSF – (Eric) Getting ready to release the RDF version of Profiles, which is compatible with the VIVO ontology, this weekend. Will be doing a panel presentation on OpenSocial at the upcoming AMIA conference. Not sure whether anybody else has gone live with the OpenSocial additions to VIVO? Alex – has discussed features like the slideshare gadget at some meetings and gotten positive feedback. Eric – reviewers for the panel liked the concept of bringing content in from industry. Profiles currently stores its data in a relational model, but from working with the RDF version Eric now sees how to store RDF – wondering what should be in RDF vs. relational, since some content wants to be available as linked data, while other data may be more private and/or just transactional support. Has there been any thinking about that on the development team?
    • Jim – so far all the discussion has involved storing everything as RDF; Jon – VIVO user accounts are stored in RDB, a different data store from where the public data is stored (SDB)
    • Eric – Loki stores everything in RDB and exports on request as RDF for linked open data; Profiles is more of a hybrid, in part due to its transition from a purely relational product, but also perhaps for performance reasons and because it uses a .Net stack. Profiles has kept passwords in RDB, for example.
  • Washington University – (Kristi) Interested in discussions about the Implementation Fest
  • Weill – (Paul) Still working to a deadline of getting publications clean enough to be used as the basis for a separate faculty reporting exercise. Are making very specific Scopus queries for each faculty member rather than relying on general Harvester queries.
    • Still working on performance issues – noticed that the Google API is being loaded for QR codes – maybe it should be lazy-loaded only when needed

Other updates or topics

  • Implementation Fest dates and agenda
    • still working on reserving spaces, which may be easier to schedule on Thursday or Friday at the Law School where the I-Fest was held last year
      • could then potentially have recreation like skiing on Saturday
    • may check with a few other spaces including LASP
    • hotel in walking distance from the Law School has a rate of $84/night
    • might be able to do a code sprint with a smaller group at a local startup; open source projects have sometimes done that
    • we'll hope for good snow to enjoy Saturday and/or Sunday...
  • Chin Hua is leading a visualization sprint at Indiana to address performance and scaling issues with visualizations, and now has an updated copy of the UF dataset for testing
  • Upcoming presentation February 7 on Karma, a data integration tool used to map USC faculty data to VIVO
    • Powerpoint from 2012 VIVO Conference presentation by Pedro Szekely of the USC Information Sciences Institute

Notable development list traffic

  • Resolution of search indexing problem at Florida, and remaining questions about how to trap for the data issue involved (a form feed character pasted in from a PDF)
  • Exporting large models – Ted at Brown developed a Jython script to call Jena with Harvester libraries, specifying a particular model
  • Modifying VIVO to allow authentication with WebID while still using modules in Apache httpd to support authentication
    • from Erich at Stony Brook: 
      When you try to log onto a site that accepts WebID, it will ask for your certificate. That site will then look in the certificate for the URI of your foaf file, grab your foaf file, and reference it to get your public key. The server will then check the signature of your certificate with the public key to verify that you have the private key matching the public key. If they match, you are logged on, and, as a bonus, the server now knows basic information about you from your foaf file. Also realize there are no commercial certificate authorities involved. If you suspect your private key has been compromised, re-key your foaf file and you're done. No running around to numerous sites. Change the picture in your foaf file and sites accepting WebID can update from your updated foaf file, as well as other information. Cool eh? :-)
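The verification flow described above can be caricatured in a few lines. This is a toy sketch only: dicts stand in for real certificates and FOAF documents, and no actual TLS handshake or RDF parsing is involved.

```python
# Toy sketch of the WebID check: the server matches the public key in the
# presented certificate against the key published in the FOAF profile named
# by the certificate's subjectAltName URI.

def webid_verify(cert, fetch_foaf):
    """Return True if the certificate's key matches the FOAF-published key."""
    webid_uri = cert["subject_alt_name_uri"]   # URI of the user's foaf file
    foaf = fetch_foaf(webid_uri)               # dereference the foaf document
    # A match proves the client holds the private key bound to this WebID
    return (cert["public_key"]["modulus"] == foaf["cert_modulus"]
            and cert["public_key"]["exponent"] == foaf["cert_exponent"])

# Stand-in certificate and an in-memory "web" of foaf documents
cert = {"subject_alt_name_uri": "https://example.org/erich#me",
        "public_key": {"modulus": 0xB5F3, "exponent": 65537}}
foaf_docs = {"https://example.org/erich#me":
             {"cert_modulus": 0xB5F3, "cert_exponent": 65537}}
print(webid_verify(cert, foaf_docs.__getitem__))  # True: keys match
```

Re-keying is then just publishing a new modulus/exponent in the foaf file, exactly as the note says – no certificate authority is consulted.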
  • VIVO 1.5.1 logging – how to log all edits made through the application via the RDF API, and the need to have the auditor ignore all changes made to the inference graphs
  • Prefacing URIs with a non-numeric character (and an issue in VIVO to fix for 1.6)
  • Desire to combine multiple positions/titles for one employee when seen from the department's page
  • Jim: show me your code – "When I get your code, I'll 'diff' it against the original release code to see what you had to change, and how hard you had to work to make it happen. I'll summarize for the team, and we'll talk about ways to make it easier."

Call-in Information

Topic: VIVO weekly call

Date: Every Thursday, no end date

Time: 1:00 pm, Eastern Daylight Time (New York, GMT-04:00)

Meeting Number: 641 825 891

To join the online meeting

To view in other time zones or languages, please click the link:

If those links don't work, please visit the Cornell meeting page and look for a VIVO meeting.

To join the audio conference only

Access code: 645 873 290

last meeting | next meeting
