Hi all,
Would love to brainstorm how to implement linked data across a variety of cultural heritage institutional metadata. We’ve been working on a project about artists’ books, enhancing the records with Getty LOD vocabularies, and I’d love to get a feel for whether and how others are thinking about enhancing their legacy data to make it more usable for others. Some topics that could be discussed:
- What to do with legacy data – what forms it takes, what is best for the objects, and what is best for interoperability?
- Identifying relevant tools for cleaning and enhancing data – OpenRefine, the Heidelberg tool?
- Relevant team members for such a project, including domain, metadata, and systems experts – how to communicate effectively across different areas of expertise
- Balancing over- and under-enhancement, particularly for works of art
- What are some ways to visualize connections and relationships between creators and artifacts?
- How can we ensure that a project is sustainable?
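On the enhancement point above: one common first step is reconciling local terms against the Getty vocabularies, which are published as linked data with a public SPARQL endpoint (vocab.getty.edu/sparql). Below is a minimal sketch of building such a lookup query in Python; the exact label predicates (plain SKOS vs. Getty’s SKOS-XL modelling) are an assumption and may need adjusting against the live data.

```python
def aat_lookup_query(term: str, limit: int = 5) -> str:
    """Build a SPARQL query matching a label in a SKOS-XL vocabulary.

    Sketched against the Getty AAT as published at vocab.getty.edu;
    the label path used here is an assumption, not confirmed syntax
    for every Getty dataset.
    """
    safe = term.replace('"', '\\"')  # naive escaping for the string literal
    return f"""
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
PREFIX xl:   <http://www.w3.org/2008/05/skos-xl#>
SELECT ?concept ?label WHERE {{
  ?concept xl:prefLabel/xl:literalForm ?label .
  FILTER(LCASE(STR(?label)) = LCASE("{safe}"))
}}
LIMIT {limit}
"""

# The resulting string could then be POSTed to the public endpoint
# with urllib or requests; the network call is omitted here so the
# sketch stays offline.
```

OpenRefine’s reconciliation services cover much of the same ground interactively; a scripted query like this is mainly useful for batch pipelines.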
Hi Emilee,
Our Visual Resources Center at Vanderbilt University is also dealing with these very issues, and our homegrown digital asset management system “dimli” recently integrated the Getty’s LOD vocabularies using RDF and SPARQL.
We are asking questions about viability (triple stores are $$$) and whether to point to the Getty RDF or save it locally (think government shutdown and no LC vocabs), to name a few!
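On the point-to-it-or-save-it question: a middle path is to dereference the remote RDF but keep a local copy to fall back on when the endpoint is down. A minimal sketch, with an injectable fetch function for testing; the cache layout and naming here are assumptions, not an established convention.

```python
import hashlib
import urllib.request
from pathlib import Path

def cached_fetch(uri: str, cache_dir: str = "vocab_cache",
                 fetcher=None) -> bytes:
    """Fetch an RDF document, refreshing a local cache on success
    and falling back to the cached copy on network failure."""
    cache = Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    # One file per URI, named by hash so any URI is a safe filename
    path = cache / (hashlib.sha256(uri.encode()).hexdigest() + ".ttl")

    if fetcher is None:
        def fetcher(u):
            # Plain HTTP GET, asking for Turtle via content negotiation
            req = urllib.request.Request(u, headers={"Accept": "text/turtle"})
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.read()

    try:
        data = fetcher(uri)
        path.write_bytes(data)   # refresh the local copy
        return data
    except OSError:              # urllib's URLError is an OSError
        if path.exists():        # endpoint down: serve the cache
            return path.read_bytes()
        raise
```

This keeps the authoritative data remote while hedging against outages; a full triple store is only needed once you want to query across the cached graphs.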
I have a poster on said topic for the conference, and would be very interested in contributing to a workshop at ThatCamp too.
Millie
Agenda: docs.google.com/document/d/1PFLQoOOnUK7yRpD_s-WS3_iV0NKhPLO0PGpdWEuvUoQ/edit?usp=sharing