update oquare link
jesualdotomasfernandezbreis committed Feb 29, 2024
1 parent d51fe5c commit dc051ea
Showing 1 changed file with 1 addition and 1 deletion.
README.md (1 addition, 1 deletion)
@@ -36,7 +36,7 @@ The qualitative assessment was done by an ontology engineer. The objective is to

The quantitative assessment has been done using automated ontology evaluation tools. The objective of this evaluation is to compare the ontologies developed by the human and the LLM and to study whether the modelling style is consistent across datasets.

- * Calculating and comparing the values of the 19 quality metrics included in the [https://semantics.inf.um.es/oquare/](OQuaRE ontology evaluation framework). OQuaRE scales the metric values to a Likert scale from 1 (lowest score) to 5 (highest score). To generate the OQuaRE scores we used the [GitHub action available for OQuaRE](https://github.com/tecnomod-um/oquare-metrics), whose results are available in the [oquare folder](https://github.com/tecnomod-um/OntoGenixEvaluation/tree/main/oquare/results/ontologies), with one folder per ontology. For each ontology there is a *README.md* file showing all the figures available in the *img* folder. OQuaRE outputs the metric values in an XML file, which is also provided in the *metrics* folder. The LLM ontologies are in RDF/XML format even though OntoGenix generates them in Turtle. The files have been transformed into RDF/XML using the [EasyRDF Converter](https://www.easyrdf.org/converter) because OQuaRE input ontologies must be in this format.
+ * Calculating and comparing the values of the 19 quality metrics included in the [OQuaRE ontology evaluation framework](https://semantics.inf.um.es/oquare). OQuaRE scales the metric values to a Likert scale from 1 (lowest score) to 5 (highest score). To generate the OQuaRE scores we used the [GitHub action available for OQuaRE](https://github.com/tecnomod-um/oquare-metrics), whose results are available in the [oquare folder](https://github.com/tecnomod-um/OntoGenixEvaluation/tree/main/oquare/results/ontologies), with one folder per ontology. For each ontology there is a *README.md* file showing all the figures available in the *img* folder. OQuaRE outputs the metric values in an XML file, which is also provided in the *metrics* folder. The LLM ontologies are in RDF/XML format even though OntoGenix generates them in Turtle. The files have been transformed into RDF/XML using the [EasyRDF Converter](https://www.easyrdf.org/converter) because OQuaRE input ontologies must be in this format.

* Finding potential modelling errors with the [Ontology Pitfall Scanner (OOPS!)](https://oops.linkeddata.es). The PNG and RDF files generated by the tool are available in a [specific repository](https://github.com/tecnomod-um/OntoGenixEvaluation/tree/main/oops). OOPS! checks for [41 types of pitfalls in ontologies](https://oops.linkeddata.es/catalogue.jsp).

