Schemas of the OpenAIRE Graph main entities and the relationships among them, both internal and public (dump) definitions.


Introduction

This project adheres to the Contributor Covenant code of conduct. By participating, you are expected to uphold this code. Please report unacceptable behavior to dnet-team@isti.cnr.it.

This project is licensed under the AGPL v3 or later version.

Purpose

This project defines the object schemas of the OpenAIRE main entities and the relationships among them. Specifically, it defines the model for:

  • the graph internal representation, defined under the package eu.dnetlib.dhp.schema.oaf
  • the scholexplorer content representation, defined under the package eu.dnetlib.dhp.schema.sx
  • the contents acquired from the metadata aggregation subsystem, defined under the package eu.dnetlib.dhp.schema.mdstore
  • the ORCID common schemas, defined under the package eu.dnetlib.dhp.schema.orcid
  • the Solr common schemas used to represent the information returned to the Explore portal and the APIs, defined under the package eu.dnetlib.dhp.schema.solr
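As a rough illustration of the internal graph representation, entities and the relationships among them are modeled as separate objects, with relationships pointing at entity identifiers. The sketch below uses hypothetical classes and field names for illustration only, not the actual definitions under eu.dnetlib.dhp.schema.oaf:

```java
// Hypothetical sketch of an entity/relationship split as used in a
// property-graph model. Class and field names are illustrative only,
// NOT the actual eu.dnetlib.dhp.schema.oaf definitions.
public class GraphModelSketch {

    // An entity carries its own identifier, type, and payload fields.
    static class Entity {
        String id;
        String type; // e.g. "publication", "dataset", "person"
        Entity(String id, String type) { this.id = id; this.type = type; }
    }

    // A relationship is a first-class object referencing two entity ids,
    // so it can be serialized and processed independently of the entities.
    static class Relation {
        String source;
        String target;
        String relClass; // e.g. "isAuthorOf" (hypothetical label)
        Relation(String source, String target, String relClass) {
            this.source = source; this.target = target; this.relClass = relClass;
        }
    }

    public static void main(String[] args) {
        Entity pub = new Entity("50|doi_0001", "publication");
        Entity person = new Entity("30|orcid_0001", "person");
        Relation rel = new Relation(person.id, pub.id, "isAuthorOf");
        System.out.println(rel.source + " -[" + rel.relClass + "]-> " + rel.target);
    }
}
```

Keeping relationships as standalone objects, rather than nesting them inside entities, lets entities and relations be stored and joined independently in the processing pipeline.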

The serializations of these objects (data store files) are used to pass data between workflow nodes in the processing pipeline and/or are intended to be shared across components.
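A minimal sketch of what such a data store file could look like, assuming newline-delimited JSON (one serialized object per line, a common choice for streaming records between pipeline stages). The hand-rolled JSON and record shape are purely illustrative; a real implementation would serialize the actual schema classes with a JSON library such as Jackson:

```java
import java.util.List;

// Illustrative sketch: writing a data store as newline-delimited records,
// one serialized object per line, so a downstream workflow node can
// stream them back in. The toJson helper and record fields are
// hypothetical, not part of the actual schema classes.
public class DataStoreSketch {

    // Hand-rolled JSON for the sketch only; use a proper JSON library
    // (e.g. Jackson) for real schema objects.
    static String toJson(String id, String type) {
        return "{\"id\":\"" + id + "\",\"type\":\"" + type + "\"}";
    }

    public static void main(String[] args) {
        List<String[]> records = List.of(
            new String[] {"50|doi_0001", "publication"},
            new String[] {"30|orcid_0001", "person"});

        StringBuilder dataStore = new StringBuilder();
        for (String[] r : records) {
            dataStore.append(toJson(r[0], r[1])).append('\n'); // one record per line
        }
        System.out.print(dataStore);
    }
}
```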