Initial Commit
George Kalampokis, 2024-01-26 11:07:11 +02:00, commit b6af310532
78 changed files with 11685 additions and 0 deletions

.gitignore (new file, 2 lines)
@@ -0,0 +1,2 @@
.idea/
target/

Dockerfile (new file, 36 lines)
@@ -0,0 +1,36 @@
####################################### Build stage #######################################
FROM maven:3.9-eclipse-temurin-21-alpine AS build-stage
ARG MAVEN_ACCOUNT_USR
ARG MAVEN_ACCOUNT_PSW
ARG REVISION
ARG PROFILE
ARG DEV_PROFILE_URL
ENV server_username=$MAVEN_ACCOUNT_USR
ENV server_password=$MAVEN_ACCOUNT_PSW
COPY pom.xml /build/
COPY core /build/core/
COPY web /build/web/
COPY settings.xml /root/.m2/settings.xml
RUN rm -f /build/web/src/main/resources/config/app.env
RUN rm -f /build/web/src/main/resources/config/*-dev.yml
# RUN rm -f /build/web/src/main/resources/logging/*.xml
WORKDIR /build/
RUN mvn -Drevision=${REVISION} -DdevProfileUrl=${DEV_PROFILE_URL} -P${PROFILE} dependency:go-offline
# Build project
RUN mvn -Drevision=${REVISION} -DdevProfileUrl=${DEV_PROFILE_URL} -P${PROFILE} clean package
######################################## Run Stage ########################################
FROM eclipse-temurin:21-jre-alpine
ARG PROFILE
ARG REVISION
ENV SERVER_PORT=8080
EXPOSE ${SERVER_PORT}
COPY --from=build-stage /build/web/target/file-transformer-rda-web-${REVISION}.jar /app/file-transformer-rda-web.jar
# Exec-form ENTRYPOINT does not expand ${PROFILE}; hand the profile to Spring via its standard environment variable instead.
ENV SPRING_PROFILES_ACTIVE=${PROFILE}
ENTRYPOINT ["java","-Dspring.config.additional-location=file:/config/","-Djava.security.egd=file:/dev/./urandom","-jar","/app/file-transformer-rda-web.jar"]

LICENSE.txt (new file, 21 lines)
@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2019-2020 OpenAIRE AMKE
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md (new file, 9 lines)
@@ -0,0 +1,9 @@
# Using RDA File Transformer with Argos
The repository-file-transformer-rda module implements the [file-transformer-base](https://code-repo.d4science.org/MaDgiK-CITE/file-transformer-base) interface for the RDA JSON file format.
## Setup
After building the project jar, set the following environment variables; they are referenced from application.properties:
1) STORAGE_TMP_ZENODO - path to a temporary storage directory
2) CONFIGURATION_ZENODO - path to the JSON file containing the repository configuration
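As a sketch, the setup above amounts to exporting the two variables before launching the service (paths and jar name here are illustrative, not prescribed by the project):

```shell
# Hypothetical values; point these at real locations in your deployment.
export STORAGE_TMP_ZENODO=/tmp/file-transformer
export CONFIGURATION_ZENODO=/etc/argos/zenodo-config.json

# Then start the service (command shown for reference):
# java -jar file-transformer-rda-web.jar
```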

THIRD-PARTY-NOTICES.txt (new file, 428 lines)
@@ -0,0 +1,428 @@
THIRD-PARTY SOFTWARE NOTICES AND INFORMATION
Do Not Translate or Localize
This component uses third party material from the projects listed below.
The original copyright notice and the license under which CITE
received such third party material are set forth below. CITE
reserves all other rights not expressly granted, whether by
implication, estoppel or otherwise.
In the event that we accidentally failed to list a required notice, please
bring it to our attention. Post an issue or email us: reception@cite.gr
1. spring-boot-starter-parent
2. spring-boot-starter-web
3. json
4. file-transformer-base
spring-boot-starter-parent NOTICES, INFORMATION, AND LICENSE BEGIN HERE
=========================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
=========================================
END OF spring-boot-starter-parent NOTICES, INFORMATION, AND LICENSE
spring-boot-starter-web NOTICES, INFORMATION, AND LICENSE BEGIN HERE
=========================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
=========================================
END OF spring-boot-starter-web NOTICES, INFORMATION, AND LICENSE

core/pom.xml (new file, 36 lines)
@@ -0,0 +1,36 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>gr.cite.opendmp</groupId>
<artifactId>file-transformer-rda-parent</artifactId>
<version>${revision}</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>file-transformer-rda</artifactId>
<version>${revision}</version>
<packaging>jar</packaging>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<maven.compiler.release>21</maven.compiler.release>
<revision>1.0.0-SNAPSHOT</revision>
<transformer-base.version>0.0.3</transformer-base.version>
</properties>
<dependencies>
<dependency>
<groupId>gr.cite.opendmp</groupId>
<artifactId>file-transformer-base</artifactId>
<version>${transformer-base.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
</dependencies>
</project>

FileStorageConfiguration.java (new file, 9 lines)
@@ -0,0 +1,9 @@
package eu.eudat.file.transformer.configuration;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableConfigurationProperties(FileStorageProperties.class)
public class FileStorageConfiguration {
}

FileStorageProperties.java (new file, 24 lines)
@@ -0,0 +1,24 @@
package eu.eudat.file.transformer.configuration;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.context.properties.bind.ConstructorBinding;
@ConfigurationProperties(prefix = "file.storage")
public class FileStorageProperties {
private final String temp;
private final String transientPath;
@ConstructorBinding
public FileStorageProperties(String temp, String transientPath) {
this.temp = temp;
this.transientPath = transientPath;
}
public String getTemp() {
return temp;
}
public String getTransientPath() {
return transientPath;
}
}
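The constructor-bound properties above map to keys under the `file.storage` prefix; a hypothetical application.yml fragment (the paths are illustrative) would look like:

```yaml
file:
  storage:
    temp: /tmp/file-transformer/temp
    # relaxed binding maps transient-path to the transientPath constructor argument
    transient-path: /tmp/file-transformer/transient
```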

RdaFileTransformer.java (new file, 93 lines)
@@ -0,0 +1,93 @@
package eu.eudat.file.transformer.executor;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import eu.eudat.file.transformer.interfaces.FileTransformerClient;
import eu.eudat.file.transformer.interfaces.FileTransformerConfiguration;
import eu.eudat.file.transformer.models.description.DescriptionFileTransformerModel;
import eu.eudat.file.transformer.models.dmp.DmpFileTransformerModel;
import eu.eudat.file.transformer.models.misc.FileEnvelope;
import eu.eudat.file.transformer.models.misc.FileFormat;
import eu.eudat.file.transformer.rda.Dataset;
import eu.eudat.file.transformer.rda.Dmp;
import eu.eudat.file.transformer.rda.RDAModel;
import eu.eudat.file.transformer.rda.mapper.DatasetRDAMapper;
import eu.eudat.file.transformer.rda.mapper.DmpRDAMapper;
import eu.eudat.file.transformer.utils.service.storage.FileStorageService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import javax.management.InvalidApplicationException;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;
@Service
public class RdaFileTransformer implements FileTransformerClient {
private final DmpRDAMapper dmpRDAMapper;
private final DatasetRDAMapper descriptionRDAMapper;
private final ObjectMapper mapper;
private final FileStorageService fileStorageService;
@Autowired
public RdaFileTransformer(DmpRDAMapper dmpRDAMapper, DatasetRDAMapper descriptionRDAMapper, FileStorageService fileStorageService) {
this.dmpRDAMapper = dmpRDAMapper;
this.descriptionRDAMapper = descriptionRDAMapper;
this.fileStorageService = fileStorageService;
mapper = new ObjectMapper();
mapper.registerModule(new JavaTimeModule());
mapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);
}
@Override
public FileEnvelope exportDmp(DmpFileTransformerModel dmpFileTransformerModel) throws InvalidApplicationException, IOException {
Dmp dmp = this.dmpRDAMapper.toRDA(dmpFileTransformerModel);
RDAModel rdaModel = new RDAModel();
rdaModel.setDmp(dmp);
String dmpJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(rdaModel);
FileEnvelope result = new FileEnvelope();
result.setFilename(dmpFileTransformerModel.getLabel() + ".json");
result.setFile(this.fileStorageService.storeFile(dmpJson.getBytes(StandardCharsets.UTF_8)));
return result;
}
@Override
public FileEnvelope exportDescription(DescriptionFileTransformerModel descriptionFileTransformerModel, String format) throws InvalidApplicationException, IOException {
Dataset dataset = this.descriptionRDAMapper.toRDA(descriptionFileTransformerModel, this.dmpRDAMapper.toRDA(descriptionFileTransformerModel.getDmp()));
String datasetJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(dataset);
FileEnvelope result = new FileEnvelope();
result.setFilename(descriptionFileTransformerModel.getLabel() + ".json");
result.setFile(this.fileStorageService.storeFile(datasetJson.getBytes(StandardCharsets.UTF_8)));
return result;
}
@Override
public DmpFileTransformerModel importDmp(FileEnvelope fileEnvelope) {
/*try { //TODO
String jsonString = String.valueOf(this.fileStorageService.readFile(fileEnvelope.getFile()));
RDAModel rda = mapper.readValue(jsonString, RDAModel.class);
DmpFileTransformerModel model = this.dmpRDAMapper.toEntity(rda.getDmp(), )
} catch (JsonProcessingException e) {
}*/
return null;
}
@Override
public DescriptionFileTransformerModel importDescription(FileEnvelope fileEnvelope) {
return null;
}
@Override
public FileTransformerConfiguration getConfiguration() {
List<FileFormat> supportedFormats = List.of(new FileFormat("json", false, null));
FileTransformerConfiguration configuration = new FileTransformerConfiguration();
configuration.setFileTransformerId("json");
configuration.setExportVariants(supportedFormats);
configuration.setImportVariants(supportedFormats);
return configuration;
}
}
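The ObjectMapper configuration in the constructor above is what makes java.time values serialize as ISO-8601 strings rather than numeric timestamps. A minimal standalone sketch of that setup (the class name is mine; Jackson and its jsr310 module are assumed to be on the classpath):

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import java.time.LocalDate;

public class MapperDemo {
    // Same setup as RdaFileTransformer's constructor: register jsr310 support
    // and switch date serialization from epoch timestamps to ISO-8601 text.
    static String isoDate() throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(new JavaTimeModule());
        mapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);
        return mapper.writeValueAsString(LocalDate.of(2024, 1, 26));
    }

    public static void main(String[] args) throws Exception {
        System.out.println(isoDate()); // a JSON string, e.g. "2024-01-26"
    }
}
```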

Contact.java (new file, 144 lines)
@@ -0,0 +1,144 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The DMP Contact Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"contact_id",
"mbox",
"name"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Contact implements Serializable
{
/**
* The Contact ID Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contact_id")
private ContactId contactId;
/**
* The Mailbox Schema
* <p>
* Contact Person's E-mail address
* (Required)
*
*/
@JsonProperty("mbox")
@JsonPropertyDescription("Contact Person's E-mail address")
private String mbox;
/**
* The Name Schema
* <p>
* Name of the contact person
* (Required)
*
*/
@JsonProperty("name")
@JsonPropertyDescription("Name of the contact person")
private String name;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -2062619884605400321L;
/**
* The Contact ID Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contact_id")
public ContactId getContactId() {
return contactId;
}
/**
* The Contact ID Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contact_id")
public void setContactId(ContactId contactId) {
this.contactId = contactId;
}
/**
* The Mailbox Schema
* <p>
* Contact Person's E-mail address
* (Required)
*
*/
@JsonProperty("mbox")
public String getMbox() {
return mbox;
}
/**
* The Mailbox Schema
* <p>
* Contact Person's E-mail address
* (Required)
*
*/
@JsonProperty("mbox")
public void setMbox(String mbox) {
this.mbox = mbox;
}
/**
* The Name Schema
* <p>
* Name of the contact person
* (Required)
*
*/
@JsonProperty("name")
public String getName() {
return name;
}
/**
* The Name Schema
* <p>
* Name of the contact person
* (Required)
*
*/
@JsonProperty("name")
public void setName(String name) {
this.name = name;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
}


@ -0,0 +1,150 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Contact ID Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class ContactId implements Serializable
{
/**
* The DMP Contact Identifier Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("identifier")
private String identifier;
/**
* The DMP Contact Identifier Type Schema
* <p>
* Identifier type. Allowed values: orcid, isni, openid, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("Identifier type. Allowed values: orcid, isni, openid, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -7066973565810615822L;
/**
* The DMP Contact Identifier Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The DMP Contact Identifier Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The DMP Contact Identifier Type Schema
* <p>
* Identifier type. Allowed values: orcid, isni, openid, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The DMP Contact Identifier Type Schema
* <p>
* Identifier type. Allowed values: orcid, isni, openid, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
ORCID("orcid"),
ISNI("isni"),
OPENID("openid"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}
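Every identifier enum in this model (`ContactId.Type` here, and likewise `ContributorId.Type`, `DatasetId.Type`, and the currency codes) repeats the same Jackson idiom: `@JsonValue` serializes the lowercase wire value, and a `@JsonCreator` factory backed by a static `CONSTANTS` map rejects unknown values at parse time. A dependency-free sketch of that lookup logic (class name is illustrative, not part of this repository):

```java
import java.util.HashMap;
import java.util.Map;

// Standalone replica of the value-lookup pattern behind the generated enums:
// serialize via value(), deserialize via fromValue(), fail fast on unknown input.
public class EnumLookup {
    public enum Type {
        ORCID("orcid"), ISNI("isni"), OPENID("openid"), OTHER("other");

        private final String value;
        private static final Map<String, Type> CONSTANTS = new HashMap<>();

        static {
            for (Type c : values()) {
                CONSTANTS.put(c.value, c);
            }
        }

        Type(String value) {
            this.value = value;
        }

        public String value() {
            return value;
        }

        public static Type fromValue(String value) {
            Type constant = CONSTANTS.get(value);
            if (constant == null) {
                throw new IllegalArgumentException(value); // unknown wire value
            }
            return constant;
        }
    }

    public static void main(String[] args) {
        System.out.println(Type.fromValue("orcid")); // the ORCID constant
        System.out.println(Type.ORCID.value());      // "orcid"
    }
}
```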


@ -0,0 +1,180 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
/**
* The Contributor Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"contributor_id",
"mbox",
"name",
"role"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Contributor implements Serializable
{
/**
* The Contributor_id Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contributor_id")
private ContributorId contributorId;
/**
* The Contributor Mailbox Schema
* <p>
* Contributor Mail address
*
*/
@JsonProperty("mbox")
@JsonPropertyDescription("Contributor Mail address")
private String mbox;
/**
* The Name Schema
* <p>
* Name of the contributor
* (Required)
*
*/
@JsonProperty("name")
@JsonPropertyDescription("Name of the contributor")
private String name;
/**
* The Role Schema
* <p>
* Type of contributor
* (Required)
*
*/
@JsonProperty("role")
@JsonDeserialize(as = java.util.LinkedHashSet.class)
@JsonPropertyDescription("Type of contributor")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Set<String> role = null;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 3452606902359513114L;
/**
* The Contributor_id Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contributor_id")
public ContributorId getContributorId() {
return contributorId;
}
/**
* The Contributor_id Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contributor_id")
public void setContributorId(ContributorId contributorId) {
this.contributorId = contributorId;
}
/**
* The Contributor Mailbox Schema
* <p>
* Contributor Mail address
*
*/
@JsonProperty("mbox")
public String getMbox() {
return mbox;
}
/**
* The Contributor Mailbox Schema
* <p>
* Contributor Mail address
*
*/
@JsonProperty("mbox")
public void setMbox(String mbox) {
this.mbox = mbox;
}
/**
* The Name Schema
* <p>
* Name of the contributor
* (Required)
*
*/
@JsonProperty("name")
public String getName() {
return name;
}
/**
* The Name Schema
* <p>
* Name of the contributor
* (Required)
*
*/
@JsonProperty("name")
public void setName(String name) {
this.name = name;
}
/**
* The Role Schema
* <p>
* Type of contributor
* (Required)
*
*/
@JsonProperty("role")
public Set<String> getRole() {
return role;
}
/**
* The Role Schema
* <p>
* Type of contributor
* (Required)
*
*/
@JsonProperty("role")
public void setRole(Set<String> role) {
this.role = role;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
}


@ -0,0 +1,151 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Contributor_id Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class ContributorId implements Serializable
{
/**
* The Contributor Identifier Schema
* <p>
* Identifier for a contact person
* (Required)
*
*/
@JsonProperty("identifier")
@JsonPropertyDescription("Identifier for a contact person")
private String identifier;
/**
* The Contributor Identifier Type Schema
* <p>
* Identifier type. Allowed values: orcid, isni, openid, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("Identifier type. Allowed values: orcid, isni, openid, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 3089650417960767482L;
/**
* The Contributor Identifier Schema
* <p>
* Identifier for a contact person
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The Contributor Identifier Schema
* <p>
* Identifier for a contact person
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The Contributor Identifier Type Schema
* <p>
* Identifier type. Allowed values: orcid, isni, openid, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The Contributor Identifier Type Schema
* <p>
* Identifier type. Allowed values: orcid, isni, openid, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
ORCID("orcid"),
ISNI("isni"),
OPENID("openid"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}


@ -0,0 +1,370 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Cost Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"currency_code",
"description",
"title",
"value"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Cost implements Serializable
{
/**
* The Cost Currency Code Schema
* <p>
* Allowed values defined by ISO 4217
*
*/
@JsonProperty("currency_code")
@JsonPropertyDescription("Allowed values defined by ISO 4217")
private CurrencyCode currencyCode;
/**
* The Cost Description Schema
* <p>
* Cost(s) Description
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Cost(s) Description")
private String description;
/**
* The Cost Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Title")
private String title;
/**
* The Cost Value Schema
* <p>
* Value
*
*/
@JsonProperty("value")
@JsonPropertyDescription("Value")
private Double value;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -322637784848035165L;
/**
* The Cost Currency Code Schema
* <p>
* Allowed values defined by ISO 4217
*
*/
@JsonProperty("currency_code")
public CurrencyCode getCurrencyCode() {
return currencyCode;
}
/**
* The Cost Currency Code Schema
* <p>
* Allowed values defined by ISO 4217
*
*/
@JsonProperty("currency_code")
public void setCurrencyCode(CurrencyCode currencyCode) {
this.currencyCode = currencyCode;
}
/**
* The Cost Description Schema
* <p>
* Cost(s) Description
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Cost Description Schema
* <p>
* Cost(s) Description
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Cost Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The Cost Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
/**
* The Cost Value Schema
* <p>
* Value
*
*/
@JsonProperty("value")
public Double getValue() {
return value;
}
/**
* The Cost Value Schema
* <p>
* Value
*
*/
@JsonProperty("value")
public void setValue(Double value) {
this.value = value;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum CurrencyCode {
AED("AED"),
AFN("AFN"),
ALL("ALL"),
AMD("AMD"),
ANG("ANG"),
AOA("AOA"),
ARS("ARS"),
AUD("AUD"),
AWG("AWG"),
AZN("AZN"),
BAM("BAM"),
BBD("BBD"),
BDT("BDT"),
BGN("BGN"),
BHD("BHD"),
BIF("BIF"),
BMD("BMD"),
BND("BND"),
BOB("BOB"),
BRL("BRL"),
BSD("BSD"),
BTN("BTN"),
BWP("BWP"),
BYN("BYN"),
BZD("BZD"),
CAD("CAD"),
CDF("CDF"),
CHF("CHF"),
CLP("CLP"),
CNY("CNY"),
COP("COP"),
CRC("CRC"),
CUC("CUC"),
CUP("CUP"),
CVE("CVE"),
CZK("CZK"),
DJF("DJF"),
DKK("DKK"),
DOP("DOP"),
DZD("DZD"),
EGP("EGP"),
ERN("ERN"),
ETB("ETB"),
EUR("EUR"),
FJD("FJD"),
FKP("FKP"),
GBP("GBP"),
GEL("GEL"),
GGP("GGP"),
GHS("GHS"),
GIP("GIP"),
GMD("GMD"),
GNF("GNF"),
GTQ("GTQ"),
GYD("GYD"),
HKD("HKD"),
HNL("HNL"),
HRK("HRK"),
HTG("HTG"),
HUF("HUF"),
IDR("IDR"),
ILS("ILS"),
IMP("IMP"),
INR("INR"),
IQD("IQD"),
IRR("IRR"),
ISK("ISK"),
JEP("JEP"),
JMD("JMD"),
JOD("JOD"),
JPY("JPY"),
KES("KES"),
KGS("KGS"),
KHR("KHR"),
KMF("KMF"),
KPW("KPW"),
KRW("KRW"),
KWD("KWD"),
KYD("KYD"),
KZT("KZT"),
LAK("LAK"),
LBP("LBP"),
LKR("LKR"),
LRD("LRD"),
LSL("LSL"),
LYD("LYD"),
MAD("MAD"),
MDL("MDL"),
MGA("MGA"),
MKD("MKD"),
MMK("MMK"),
MNT("MNT"),
MOP("MOP"),
MRU("MRU"),
MUR("MUR"),
MVR("MVR"),
MWK("MWK"),
MXN("MXN"),
MYR("MYR"),
MZN("MZN"),
NAD("NAD"),
NGN("NGN"),
NIO("NIO"),
NOK("NOK"),
NPR("NPR"),
NZD("NZD"),
OMR("OMR"),
PAB("PAB"),
PEN("PEN"),
PGK("PGK"),
PHP("PHP"),
PKR("PKR"),
PLN("PLN"),
PYG("PYG"),
QAR("QAR"),
RON("RON"),
RSD("RSD"),
RUB("RUB"),
RWF("RWF"),
SAR("SAR"),
SBD("SBD"),
SCR("SCR"),
SDG("SDG"),
SEK("SEK"),
SGD("SGD"),
SHP("SHP"),
SLL("SLL"),
SOS("SOS"),
SPL("SPL*"),
SRD("SRD"),
STN("STN"),
SVC("SVC"),
SYP("SYP"),
SZL("SZL"),
THB("THB"),
TJS("TJS"),
TMT("TMT"),
TND("TND"),
TOP("TOP"),
TRY("TRY"),
TTD("TTD"),
TVD("TVD"),
TWD("TWD"),
TZS("TZS"),
UAH("UAH"),
UGX("UGX"),
USD("USD"),
UYU("UYU"),
UZS("UZS"),
VEF("VEF"),
VND("VND"),
VUV("VUV"),
WST("WST"),
XAF("XAF"),
XCD("XCD"),
XDR("XDR"),
XOF("XOF"),
XPF("XPF"),
YER("YER"),
ZAR("ZAR"),
ZMW("ZMW"),
ZWD("ZWD");
private final String value;
private final static Map<String, CurrencyCode> CONSTANTS = new HashMap<String, CurrencyCode>();
static {
for (CurrencyCode c: values()) {
CONSTANTS.put(c.value, c);
}
}
private CurrencyCode(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static CurrencyCode fromValue(String value) {
CurrencyCode constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}
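The `CurrencyCode` enum mirrors ISO 4217, though a few entries (e.g. `SPL*`, `TVD`, `GGP`, `IMP`, `JEP`) are territory currencies outside the standard. When maintaining this list, the JDK's built-in `java.util.Currency` registry offers a quick cross-check; a small sketch (class name is illustrative, not part of this repository):

```java
import java.util.Currency;

// Cross-checks a candidate code against the JDK's ISO 4217 currency registry.
public class CurrencyCheck {
    public static boolean isIso4217(String code) {
        try {
            Currency.getInstance(code);
            return true;
        } catch (IllegalArgumentException e) {
            return false; // not a supported ISO 4217 code, e.g. "SPL*"
        }
    }

    public static void main(String[] args) {
        System.out.println(isIso4217("EUR"));  // true
        System.out.println(isIso4217("SPL*")); // false
    }
}
```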


@ -0,0 +1,619 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* The Dataset Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"data_quality_assurance",
"dataset_id",
"description",
"distribution",
"issued",
"keyword",
"language",
"metadata",
"personal_data",
"preservation_statement",
"security_and_privacy",
"sensitive_data",
"technical_resource",
"title",
"type",
"additional_properties"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Dataset implements Serializable
{
/**
* The Data Quality Assurance Schema
* <p>
* Data Quality Assurance
*
*/
@JsonProperty("data_quality_assurance")
@JsonPropertyDescription("Data Quality Assurance")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<String> dataQualityAssurance = null;
/**
* The Dataset ID Schema
* <p>
* Dataset ID
* (Required)
*
*/
@JsonProperty("dataset_id")
@JsonPropertyDescription("Dataset ID")
private DatasetId datasetId;
/**
* The Dataset Description Schema
* <p>
* Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.")
private String description;
/**
* The Dataset Distribution Schema
* <p>
* To provide technical information on a specific instance of data.
*
*/
@JsonProperty("distribution")
@JsonPropertyDescription("To provide technical information on a specific instance of data.")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<Distribution> distribution = null;
/**
* The Dataset Date of Issue Schema
* <p>
* Date of Issue
*
*/
@JsonProperty("issued")
@JsonPropertyDescription("Date of Issue")
private String issued;
/**
* The Dataset Keyword(s) Schema
* <p>
* Keywords
*
*/
@JsonProperty("keyword")
@JsonPropertyDescription("Keywords")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<String> keyword = null;
/**
* The Dataset Language Schema
* <p>
* Language of the dataset expressed using ISO 639-3.
*
*/
@JsonProperty("language")
@JsonPropertyDescription("Language of the dataset expressed using ISO 639-3.")
private Language language;
/**
* The Dataset Metadata Schema
* <p>
* To describe metadata standards used.
*
*/
@JsonProperty("metadata")
@JsonPropertyDescription("To describe metadata standards used.")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<Metadatum> metadata = null;
/**
* The Dataset Personal Data Schema
* <p>
* If any personal data is contained. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("personal_data")
@JsonPropertyDescription("If any personal data is contained. Allowed values: yes, no, unknown")
private PersonalData personalData;
/**
* The Dataset Preservation Statement Schema
* <p>
* Preservation Statement
*
*/
@JsonProperty("preservation_statement")
@JsonPropertyDescription("Preservation Statement")
private String preservationStatement;
/**
* The Dataset Security and Policy Schema
* <p>
* To list all issues and requirements related to security and privacy
*
*/
@JsonProperty("security_and_privacy")
@JsonPropertyDescription("To list all issues and requirements related to security and privacy")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<SecurityAndPrivacy> securityAndPrivacy = null;
/**
* The Dataset Sensitive Data Schema
* <p>
* If any sensitive data is contained. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("sensitive_data")
@JsonPropertyDescription("If any sensitive data is contained. Allowed values: yes, no, unknown")
private SensitiveData sensitiveData;
/**
* The Dataset Technical Resource Schema
* <p>
* To list all technical resources needed to implement a DMP
*
*/
@JsonProperty("technical_resource")
@JsonPropertyDescription("To list all technical resources needed to implement a DMP")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<TechnicalResource> technicalResource = null;
/**
* The Dataset Title Schema
* <p>
* Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.")
private String title;
/**
* The Dataset Type Schema
* <p>
* If appropriate, type according to: DataCite and/or COAR dictionary. Otherwise use the common name for the type, e.g. raw data, software, survey, etc. https://schema.datacite.org/meta/kernel-4.1/doc/DataCite-MetadataKernel_v4.1.pdf http://vocabularies.coar-repositories.org/pubby/resource_type.html
*
*/
@JsonProperty("type")
@JsonPropertyDescription("If appropriate, type according to: DataCite and/or COAR dictionary. Otherwise use the common name for the type, e.g. raw data, software, survey, etc. https://schema.datacite.org/meta/kernel-4.1/doc/DataCite-MetadataKernel_v4.1.pdf http://vocabularies.coar-repositories.org/pubby/resource_type.html")
private String type;
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -6931119120629009399L;
/**
* The Data Quality Assurance Schema
* <p>
* Data Quality Assurance
*
*/
@JsonProperty("data_quality_assurance")
public List<String> getDataQualityAssurance() {
return dataQualityAssurance;
}
/**
* The Data Quality Assurance Schema
* <p>
* Data Quality Assurance
*
*/
@JsonProperty("data_quality_assurance")
public void setDataQualityAssurance(List<String> dataQualityAssurance) {
this.dataQualityAssurance = dataQualityAssurance;
}
/**
* The Dataset ID Schema
* <p>
* Dataset ID
* (Required)
*
*/
@JsonProperty("dataset_id")
public DatasetId getDatasetId() {
return datasetId;
}
/**
* The Dataset ID Schema
* <p>
* Dataset ID
* (Required)
*
*/
@JsonProperty("dataset_id")
public void setDatasetId(DatasetId datasetId) {
this.datasetId = datasetId;
}
/**
* The Dataset Description Schema
* <p>
* Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Dataset Description Schema
* <p>
* Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Dataset Distribution Schema
* <p>
* To provide technical information on a specific instance of data.
*
*/
@JsonProperty("distribution")
public List<Distribution> getDistribution() {
return distribution;
}
/**
* The Dataset Distribution Schema
* <p>
* To provide technical information on a specific instance of data.
*
*/
@JsonProperty("distribution")
public void setDistribution(List<Distribution> distribution) {
this.distribution = distribution;
}
/**
* The Dataset Date of Issue Schema
* <p>
* Date of Issue
*
*/
@JsonProperty("issued")
public String getIssued() {
return issued;
}
/**
* The Dataset Date of Issue Schema
* <p>
* Date of Issue
*
*/
@JsonProperty("issued")
public void setIssued(String issued) {
this.issued = issued;
}
/**
* The Dataset Keyword(s) Schema
* <p>
* Keywords
*
*/
@JsonProperty("keyword")
public List<String> getKeyword() {
return keyword;
}
/**
* The Dataset Keyword(s) Schema
* <p>
* Keywords
*
*/
@JsonProperty("keyword")
public void setKeyword(List<String> keyword) {
this.keyword = keyword;
}
/**
* The Dataset Language Schema
* <p>
* Language of the dataset expressed using ISO 639-3.
*
*/
@JsonProperty("language")
public Language getLanguage() {
return language;
}
/**
* The Dataset Language Schema
* <p>
* Language of the dataset expressed using ISO 639-3.
*
*/
@JsonProperty("language")
public void setLanguage(Language language) {
this.language = language;
}
/**
* The Dataset Metadata Schema
* <p>
* To describe metadata standards used.
*
*/
@JsonProperty("metadata")
public List<Metadatum> getMetadata() {
return metadata;
}
/**
* The Dataset Metadata Schema
* <p>
* To describe metadata standards used.
*
*/
@JsonProperty("metadata")
public void setMetadata(List<Metadatum> metadata) {
this.metadata = metadata;
}
/**
* The Dataset Personal Data Schema
* <p>
* If any personal data is contained. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("personal_data")
public PersonalData getPersonalData() {
return personalData;
}
/**
* The Dataset Personal Data Schema
* <p>
* If any personal data is contained. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("personal_data")
public void setPersonalData(PersonalData personalData) {
this.personalData = personalData;
}
/**
* The Dataset Preservation Statement Schema
* <p>
* Preservation Statement
*
*/
@JsonProperty("preservation_statement")
public String getPreservationStatement() {
return preservationStatement;
}
/**
* The Dataset Preservation Statement Schema
* <p>
* Preservation Statement
*
*/
@JsonProperty("preservation_statement")
public void setPreservationStatement(String preservationStatement) {
this.preservationStatement = preservationStatement;
}
/**
* The Dataset Security and Policy Schema
* <p>
* To list all issues and requirements related to security and privacy
*
*/
@JsonProperty("security_and_privacy")
public List<SecurityAndPrivacy> getSecurityAndPrivacy() {
return securityAndPrivacy;
}
/**
* The Dataset Security and Policy Schema
* <p>
* To list all issues and requirements related to security and privacy
*
*/
@JsonProperty("security_and_privacy")
public void setSecurityAndPrivacy(List<SecurityAndPrivacy> securityAndPrivacy) {
this.securityAndPrivacy = securityAndPrivacy;
}
/**
* The Dataset Sensitive Data Schema
* <p>
* If any sensitive data is contained. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("sensitive_data")
public SensitiveData getSensitiveData() {
return sensitiveData;
}
/**
* The Dataset Sensitive Data Schema
* <p>
* If any sensitive data is contained. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("sensitive_data")
public void setSensitiveData(SensitiveData sensitiveData) {
this.sensitiveData = sensitiveData;
}
/**
* The Dataset Technical Resource Schema
* <p>
* To list all technical resources needed to implement a DMP
*
*/
@JsonProperty("technical_resource")
public List<TechnicalResource> getTechnicalResource() {
return technicalResource;
}
/**
* The Dataset Technical Resource Schema
* <p>
* To list all technical resources needed to implement a DMP
*
*/
@JsonProperty("technical_resource")
public void setTechnicalResource(List<TechnicalResource> technicalResource) {
this.technicalResource = technicalResource;
}
/**
* The Dataset Title Schema
* <p>
* Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The Dataset Title Schema
* <p>
* Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
/**
* The Dataset Type Schema
* <p>
* If appropriate, type according to: DataCite and/or COAR dictionary. Otherwise use the common name for the type, e.g. raw data, software, survey, etc. https://schema.datacite.org/meta/kernel-4.1/doc/DataCite-MetadataKernel_v4.1.pdf http://vocabularies.coar-repositories.org/pubby/resource_type.html
*
*/
@JsonProperty("type")
public String getType() {
return type;
}
/**
* The Dataset Type Schema
* <p>
* If appropriate, type according to: DataCite and/or COAR dictionary. Otherwise use the common name for the type, e.g. raw data, software, survey, etc. https://schema.datacite.org/meta/kernel-4.1/doc/DataCite-MetadataKernel_v4.1.pdf http://vocabularies.coar-repositories.org/pubby/resource_type.html
*
*/
@JsonProperty("type")
public void setType(String type) {
this.type = type;
}
@JsonProperty("additional_properties")
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum PersonalData {
YES("yes"),
NO("no"),
UNKNOWN("unknown");
private final String value;
private final static Map<String, PersonalData> CONSTANTS = new HashMap<String, PersonalData>();
static {
for (PersonalData c: values()) {
CONSTANTS.put(c.value, c);
}
}
private PersonalData(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static PersonalData fromValue(String value) {
PersonalData constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
public enum SensitiveData {
YES("yes"),
NO("no"),
UNKNOWN("unknown");
private final String value;
private final static Map<String, SensitiveData> CONSTANTS = new HashMap<String, SensitiveData>();
static {
for (SensitiveData c: values()) {
CONSTANTS.put(c.value, c);
}
}
private SensitiveData(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static SensitiveData fromValue(String value) {
SensitiveData constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}


@ -0,0 +1,160 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Dataset ID Schema
* <p>
* Dataset ID
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class DatasetId implements Serializable
{
/**
* The Dataset Identifier Schema
* <p>
* Identifier for a dataset
* (Required)
*
*/
@JsonProperty("identifier")
@JsonPropertyDescription("Identifier for a dataset")
private String identifier;
/**
* The Dataset Identifier Type Schema
* <p>
* Dataset identifier type. Allowed values: handle, doi, ark, url, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("Dataset identifier type. Allowed values: handle, doi, ark, url, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -6295164005851378031L;
public DatasetId() {
}
public DatasetId(String identifier, Type type) {
this.identifier = identifier;
this.type = type;
}
/**
* The Dataset Identifier Schema
* <p>
* Identifier for a dataset
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The Dataset Identifier Schema
* <p>
* Identifier for a dataset
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The Dataset Identifier Type Schema
* <p>
* Dataset identifier type. Allowed values: handle, doi, ark, url, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The Dataset Identifier Type Schema
* <p>
* Dataset identifier type. Allowed values: handle, doi, ark, url, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
HANDLE("handle"),
DOI("doi"),
ARK("ark"),
URL("url"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}
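The nested `Type` enum above registers each constant's wire value in a static `CONSTANTS` map, so `fromValue` resolves an incoming JSON string in O(1) and rejects unknown values instead of silently defaulting. A self-contained sketch of the same pattern (hypothetical `Color` enum, no Jackson required):

```java
import java.util.HashMap;
import java.util.Map;

public class EnumLookupDemo {

    // Mirrors the CONSTANTS-map pattern used by DatasetId.Type: each constant
    // registers its serialized value once, so fromValue is a single map lookup.
    enum Color {
        RED("red"),
        GREEN("green");

        private final String value;
        private static final Map<String, Color> CONSTANTS = new HashMap<>();

        static {
            for (Color c : values()) {
                CONSTANTS.put(c.value, c);
            }
        }

        Color(String value) {
            this.value = value;
        }

        // Unknown strings fail loudly, just like the generated fromValue methods.
        public static Color fromValue(String value) {
            Color constant = CONSTANTS.get(value);
            if (constant == null) {
                throw new IllegalArgumentException(value);
            }
            return constant;
        }
    }

    public static void main(String[] args) {
        System.out.println(Color.fromValue("red")); // RED
        try {
            Color.fromValue("blue");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage()); // rejected: blue
        }
    }
}
```

In the generated classes, Jackson invokes `fromValue` via `@JsonCreator` on deserialization and `value()` via `@JsonValue` on serialization, so the enum round-trips as its lowercase wire string.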

View File

@ -0,0 +1,410 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.net.URI;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* The Dataset Distribution Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"access_url",
"available_until",
"byte_size",
"data_access",
"description",
"download_url",
"format",
"host",
"license",
"title",
"additional_properties"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Distribution implements Serializable
{
/**
* The Dataset Distribution Access URL Schema
* <p>
* A URL of the resource that gives access to a distribution of the dataset. e.g. landing page.
*
*/
@JsonProperty("access_url")
@JsonPropertyDescription("A URL of the resource that gives access to a distribution of the dataset. e.g. landing page.")
private String accessUrl;
/**
* The Dataset Distribution Available Until Schema
* <p>
* Indicates how long this distribution will be / should be available.
*
*/
@JsonProperty("available_until")
@JsonPropertyDescription("Indicates how long this distribution will be / should be available.")
private String availableUntil;
/**
* The Dataset Distribution Byte Size Schema
* <p>
* Size in bytes.
*
*/
@JsonProperty("byte_size")
@JsonPropertyDescription("Size in bytes.")
private Integer byteSize;
/**
* The Dataset Distribution Data Access Schema
* <p>
* Indicates access mode for data. Allowed values: open, shared, closed
* (Required)
*
*/
@JsonProperty("data_access")
@JsonPropertyDescription("Indicates access mode for data. Allowed values: open, shared, closed")
private DataAccess dataAccess;
/**
* The Dataset Distribution Description Schema
* <p>
* Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.")
private String description;
/**
* The Dataset Distribution Download URL Schema
* <p>
* The URL of the downloadable file in a given format. E.g. CSV file or RDF file.
*
*/
@JsonProperty("download_url")
@JsonPropertyDescription("The URL of the downloadable file in a given format. E.g. CSV file or RDF file.")
private URI downloadUrl;
/**
* The Dataset Distribution Format Schema
* <p>
* Format according to: https://www.iana.org/assignments/media-types/media-types.xhtml if appropriate, otherwise use the common name for this format.
*
*/
@JsonProperty("format")
@JsonPropertyDescription("Format according to: https://www.iana.org/assignments/media-types/media-types.xhtml if appropriate, otherwise use the common name for this format.")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<String> format = null;
/**
* The Dataset Distribution Host Schema
* <p>
* To provide information on quality of service provided by infrastructure (e.g. repository) where data is stored.
*
*/
@JsonProperty("host")
@JsonPropertyDescription("To provide information on quality of service provided by infrastructure (e.g. repository) where data is stored.")
private Host host;
/**
* The Dataset Distribution License(s) Schema
* <p>
* To list all licenses applied to a specific distribution of data.
*
*/
@JsonProperty("license")
@JsonPropertyDescription("To list all licenses applied to a specific distribution of data.")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<License> license = null;
/**
* The Dataset Distribution Title Schema
* <p>
* Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.")
private String title;
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -6018365280419917902L;
/**
* The Dataset Distribution Access URL Schema
* <p>
* A URL of the resource that gives access to a distribution of the dataset. e.g. landing page.
*
*/
@JsonProperty("access_url")
public String getAccessUrl() {
return accessUrl;
}
/**
* The Dataset Distribution Access URL Schema
* <p>
* A URL of the resource that gives access to a distribution of the dataset. e.g. landing page.
*
*/
@JsonProperty("access_url")
public void setAccessUrl(String accessUrl) {
this.accessUrl = accessUrl;
}
/**
* The Dataset Distribution Available Until Schema
* <p>
* Indicates how long this distribution will be / should be available.
*
*/
@JsonProperty("available_until")
public String getAvailableUntil() {
return availableUntil;
}
/**
* The Dataset Distribution Available Until Schema
* <p>
* Indicates how long this distribution will be / should be available.
*
*/
@JsonProperty("available_until")
public void setAvailableUntil(String availableUntil) {
this.availableUntil = availableUntil;
}
/**
* The Dataset Distribution Byte Size Schema
* <p>
* Size in bytes.
*
*/
@JsonProperty("byte_size")
public Integer getByteSize() {
return byteSize;
}
/**
* The Dataset Distribution Byte Size Schema
* <p>
* Size in bytes.
*
*/
@JsonProperty("byte_size")
public void setByteSize(Integer byteSize) {
this.byteSize = byteSize;
}
/**
* The Dataset Distribution Data Access Schema
* <p>
* Indicates access mode for data. Allowed values: open, shared, closed
* (Required)
*
*/
@JsonProperty("data_access")
public DataAccess getDataAccess() {
return dataAccess;
}
/**
* The Dataset Distribution Data Access Schema
* <p>
* Indicates access mode for data. Allowed values: open, shared, closed
* (Required)
*
*/
@JsonProperty("data_access")
public void setDataAccess(DataAccess dataAccess) {
this.dataAccess = dataAccess;
}
/**
* The Dataset Distribution Description Schema
* <p>
* Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Dataset Distribution Description Schema
* <p>
* Description is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Dataset Distribution Download URL Schema
* <p>
* The URL of the downloadable file in a given format. E.g. CSV file or RDF file.
*
*/
@JsonProperty("download_url")
public URI getDownloadUrl() {
return downloadUrl;
}
/**
* The Dataset Distribution Download URL Schema
* <p>
* The URL of the downloadable file in a given format. E.g. CSV file or RDF file.
*
*/
@JsonProperty("download_url")
public void setDownloadUrl(URI downloadUrl) {
this.downloadUrl = downloadUrl;
}
/**
* The Dataset Distribution Format Schema
* <p>
* Format according to: https://www.iana.org/assignments/media-types/media-types.xhtml if appropriate, otherwise use the common name for this format.
*
*/
@JsonProperty("format")
public List<String> getFormat() {
return format;
}
/**
* The Dataset Distribution Format Schema
* <p>
* Format according to: https://www.iana.org/assignments/media-types/media-types.xhtml if appropriate, otherwise use the common name for this format.
*
*/
@JsonProperty("format")
public void setFormat(List<String> format) {
this.format = format;
}
/**
* The Dataset Distribution Host Schema
* <p>
* To provide information on quality of service provided by infrastructure (e.g. repository) where data is stored.
*
*/
@JsonProperty("host")
public Host getHost() {
return host;
}
/**
* The Dataset Distribution Host Schema
* <p>
* To provide information on quality of service provided by infrastructure (e.g. repository) where data is stored.
*
*/
@JsonProperty("host")
public void setHost(Host host) {
this.host = host;
}
/**
* The Dataset Distribution License(s) Schema
* <p>
* To list all licenses applied to a specific distribution of data.
*
*/
@JsonProperty("license")
public List<License> getLicense() {
return license;
}
/**
* The Dataset Distribution License(s) Schema
* <p>
* To list all licenses applied to a specific distribution of data.
*
*/
@JsonProperty("license")
public void setLicense(List<License> license) {
this.license = license;
}
/**
* The Dataset Distribution Title Schema
* <p>
* Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The Dataset Distribution Title Schema
* <p>
* Title is a property in both Dataset and Distribution, in compliance with W3C DCAT. In some cases these might be identical, but in most cases the Dataset represents a more abstract concept, while the distribution can point to a specific file.
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
@JsonProperty("additional_properties")
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum DataAccess {
OPEN("open"),
SHARED("shared"),
CLOSED("closed");
private final String value;
private final static Map<String, DataAccess> CONSTANTS = new HashMap<String, DataAccess>();
static {
for (DataAccess c: values()) {
CONSTANTS.put(c.value, c);
}
}
private DataAccess(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static DataAccess fromValue(String value) {
DataAccess constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}
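The `byte_size` and `format` fields above are typically derived from the actual file being described. A minimal stdlib-only sketch of how a caller might populate them (the `DistributionFields` helper is hypothetical, not part of this commit; note that `Files.probeContentType` may return `null` on platforms without a MIME table, in which case the schema suggests falling back to a common name for the format):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class DistributionFields {

    // Size in bytes, as expected by Distribution.setByteSize.
    public static long byteSize(Path file) throws IOException {
        return Files.size(file);
    }

    // IANA media type where the platform can detect one, otherwise a fallback,
    // matching the schema's "use the common name for this format" guidance.
    public static String format(Path file) throws IOException {
        String mime = Files.probeContentType(file);
        return mime != null ? mime : "unknown";
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("dist", ".csv");
        Files.writeString(tmp, "a,b\n1,2\n");
        System.out.println(byteSize(tmp)); // 8
        System.out.println(format(tmp));
        Files.delete(tmp);
    }
}
```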

View File

@ -0,0 +1,551 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.net.URI;
import java.time.Instant;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* The DMP Schema
* <p>
*
*
*/
@JsonIgnoreProperties(value = { "schema" }, ignoreUnknown = true)
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"contact",
"contributor",
"cost",
"created",
"dataset",
"description",
"dmp_id",
"ethical_issues_description",
"ethical_issues_exist",
"ethical_issues_report",
"language",
"modified",
"project",
"title",
"additional_properties"
})
public class Dmp implements Serializable
{
/**
* The DMP Contact Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contact")
private Contact contact;
/**
* The Contributor Schema
* <p>
*
*
*/
@JsonProperty("contributor")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<Contributor> contributor = null;
/**
* The Cost Schema
* <p>
*
*
*/
@JsonProperty("cost")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<Cost> cost = null;
/**
* The DMP Creation Schema
* <p>
*
*
*/
@JsonProperty("created")
@JsonPropertyDescription("")
private Instant created;
/**
* The Dataset Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("dataset")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<Dataset> dataset = null;
/**
* The DMP Description Schema
* <p>
* To provide any free-form text information on a DMP
*
*/
@JsonProperty("description")
@JsonPropertyDescription("To provide any free-form text information on a DMP")
private String description;
/**
* The DMP Identifier Schema
* <p>
* Identifier for the DMP itself
* (Required)
*
*/
@JsonProperty("dmp_id")
@JsonPropertyDescription("Identifier for the DMP itself")
private DmpId dmpId;
/**
* The DMP Ethical Issues Description Schema
* <p>
* To describe ethical issues directly in a DMP
*
*/
@JsonProperty("ethical_issues_description")
@JsonPropertyDescription("To describe ethical issues directly in a DMP")
private String ethicalIssuesDescription;
/**
* The DMP Ethical Issues Exist Schema
* <p>
* To indicate whether there are ethical issues related to data that this DMP describes. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("ethical_issues_exist")
@JsonPropertyDescription("To indicate whether there are ethical issues related to data that this DMP describes. Allowed values: yes, no, unknown")
private EthicalIssuesExist ethicalIssuesExist;
/**
* The DMP Ethical Issues Report Schema
* <p>
 * To indicate where a protocol from a meeting with an ethical committee can be found
*
*/
@JsonProperty("ethical_issues_report")
@JsonPropertyDescription("To indicate where a protocol from a meeting with an ethical committee can be found")
private URI ethicalIssuesReport;
/**
* The DMP Language Schema
* <p>
* Language of the DMP expressed using ISO 639-3.
* (Required)
*
*/
@JsonProperty("language")
@JsonPropertyDescription("Language of the DMP expressed using ISO 639-3.")
private Language language;
/**
* The DMP Modification Schema
* <p>
* Must be set each time DMP is modified. Indicates DMP version.
* (Required)
*
*/
@JsonProperty("modified")
@JsonPropertyDescription("Must be set each time DMP is modified. Indicates DMP version.")
private Instant modified;
/**
* The DMP Project Schema
* <p>
* Project related to a DMP
*
*/
@JsonProperty("project")
@JsonPropertyDescription("Project related to a DMP")
private List<Project> project = null;
/**
* The DMP Title Schema
* <p>
* Title of a DMP
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Title of a DMP")
private String title;
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 4599713332472772292L;
/**
* The DMP Contact Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contact")
public Contact getContact() {
return contact;
}
/**
* The DMP Contact Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("contact")
public void setContact(Contact contact) {
this.contact = contact;
}
/**
* The Contributor Schema
* <p>
*
*
*/
@JsonProperty("contributor")
public List<Contributor> getContributor() {
return contributor;
}
/**
* The Contributor Schema
* <p>
*
*
*/
@JsonProperty("contributor")
public void setContributor(List<Contributor> contributor) {
this.contributor = contributor;
}
/**
* The Cost Schema
* <p>
*
*
*/
@JsonProperty("cost")
public List<Cost> getCost() {
return cost;
}
/**
* The Cost Schema
* <p>
*
*
*/
@JsonProperty("cost")
public void setCost(List<Cost> cost) {
this.cost = cost;
}
/**
* The DMP Creation Schema
* <p>
*
*
*/
@JsonProperty("created")
public Instant getCreated() {
return created;
}
/**
* The DMP Creation Schema
* <p>
*
*
*/
@JsonProperty("created")
public void setCreated(Instant created) {
this.created = created;
}
/**
* The Dataset Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("dataset")
public List<Dataset> getDataset() {
return dataset;
}
/**
* The Dataset Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("dataset")
public void setDataset(List<Dataset> dataset) {
this.dataset = dataset;
}
/**
* The DMP Description Schema
* <p>
* To provide any free-form text information on a DMP
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The DMP Description Schema
* <p>
* To provide any free-form text information on a DMP
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The DMP Identifier Schema
* <p>
* Identifier for the DMP itself
* (Required)
*
*/
@JsonProperty("dmp_id")
public DmpId getDmpId() {
return dmpId;
}
/**
* The DMP Identifier Schema
* <p>
* Identifier for the DMP itself
* (Required)
*
*/
@JsonProperty("dmp_id")
public void setDmpId(DmpId dmpId) {
this.dmpId = dmpId;
}
/**
* The DMP Ethical Issues Description Schema
* <p>
* To describe ethical issues directly in a DMP
*
*/
@JsonProperty("ethical_issues_description")
public String getEthicalIssuesDescription() {
return ethicalIssuesDescription;
}
/**
* The DMP Ethical Issues Description Schema
* <p>
* To describe ethical issues directly in a DMP
*
*/
@JsonProperty("ethical_issues_description")
public void setEthicalIssuesDescription(String ethicalIssuesDescription) {
this.ethicalIssuesDescription = ethicalIssuesDescription;
}
/**
* The DMP Ethical Issues Exist Schema
* <p>
* To indicate whether there are ethical issues related to data that this DMP describes. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("ethical_issues_exist")
public EthicalIssuesExist getEthicalIssuesExist() {
return ethicalIssuesExist;
}
/**
* The DMP Ethical Issues Exist Schema
* <p>
* To indicate whether there are ethical issues related to data that this DMP describes. Allowed values: yes, no, unknown
* (Required)
*
*/
@JsonProperty("ethical_issues_exist")
public void setEthicalIssuesExist(EthicalIssuesExist ethicalIssuesExist) {
this.ethicalIssuesExist = ethicalIssuesExist;
}
/**
* The DMP Ethical Issues Report Schema
* <p>
 * To indicate where a protocol from a meeting with an ethical committee can be found
*
*/
@JsonProperty("ethical_issues_report")
public URI getEthicalIssuesReport() {
return ethicalIssuesReport;
}
/**
* The DMP Ethical Issues Report Schema
* <p>
 * To indicate where a protocol from a meeting with an ethical committee can be found
*
*/
@JsonProperty("ethical_issues_report")
public void setEthicalIssuesReport(URI ethicalIssuesReport) {
this.ethicalIssuesReport = ethicalIssuesReport;
}
/**
* The DMP Language Schema
* <p>
* Language of the DMP expressed using ISO 639-3.
* (Required)
*
*/
@JsonProperty("language")
public Language getLanguage() {
return language;
}
/**
* The DMP Language Schema
* <p>
* Language of the DMP expressed using ISO 639-3.
* (Required)
*
*/
@JsonProperty("language")
public void setLanguage(Language language) {
this.language = language;
}
/**
* The DMP Modification Schema
* <p>
* Must be set each time DMP is modified. Indicates DMP version.
* (Required)
*
*/
@JsonProperty("modified")
public Instant getModified() {
return modified;
}
/**
* The DMP Modification Schema
* <p>
* Must be set each time DMP is modified. Indicates DMP version.
* (Required)
*
*/
@JsonProperty("modified")
public void setModified(Instant modified) {
this.modified = modified;
}
/**
* The DMP Project Schema
* <p>
* Project related to a DMP
*
*/
@JsonProperty("project")
public List<Project> getProject() {
return project;
}
/**
* The DMP Project Schema
* <p>
* Project related to a DMP
*
*/
@JsonProperty("project")
public void setProject(List<Project> project) {
this.project = project;
}
/**
* The DMP Title Schema
* <p>
* Title of a DMP
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The DMP Title Schema
* <p>
* Title of a DMP
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
@JsonProperty("additional_properties")
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum EthicalIssuesExist {
YES("yes"),
NO("no"),
UNKNOWN("unknown");
private final String value;
private final static Map<String, EthicalIssuesExist> CONSTANTS = new HashMap<String, EthicalIssuesExist>();
static {
for (EthicalIssuesExist c: values()) {
CONSTANTS.put(c.value, c);
}
}
private EthicalIssuesExist(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static EthicalIssuesExist fromValue(String value) {
EthicalIssuesExist constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}
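The `created` and `modified` fields above are `java.time.Instant` values; serializing them with Jackson requires the `jackson-datatype-jsr310` `JavaTimeModule` to be registered on the `ObjectMapper` (an assumption here, since the mapper configuration is not part of this commit). A stdlib-only sketch of the ISO-8601 round-trip those fields rely on:

```java
import java.time.Instant;

public class DmpTimestamps {
    public static void main(String[] args) {
        // "modified" must be set each time the DMP changes; on the wire it is
        // the ISO-8601 instant form, which Instant parses and prints natively.
        Instant modified = Instant.parse("2024-01-26T09:07:11Z");
        System.out.println(modified); // 2024-01-26T09:07:11Z
    }
}
```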

View File

@ -0,0 +1,152 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The DMP Identifier Schema
* <p>
* Identifier for the DMP itself
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class DmpId implements Serializable
{
/**
* The DMP Identifier Value Schema
* <p>
* Identifier for a DMP
* (Required)
*
*/
@JsonProperty("identifier")
@JsonPropertyDescription("Identifier for a DMP")
private String identifier;
/**
* The DMP Identifier Type Schema
* <p>
* The DMP Identifier Type. Allowed values: handle, doi, ark, url, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("The DMP Identifier Type. Allowed values: handle, doi, ark, url, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -6059908070202476841L;
/**
* The DMP Identifier Value Schema
* <p>
* Identifier for a DMP
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The DMP Identifier Value Schema
* <p>
* Identifier for a DMP
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The DMP Identifier Type Schema
* <p>
* The DMP Identifier Type. Allowed values: handle, doi, ark, url, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The DMP Identifier Type Schema
* <p>
* The DMP Identifier Type. Allowed values: handle, doi, ark, url, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
HANDLE("handle"),
DOI("doi"),
ARK("ark"),
URL("url"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}

View File

@ -0,0 +1,150 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Funder ID Schema
* <p>
* Funder ID of the associated project
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class FunderId implements Serializable
{
/**
* The Funder ID Value Schema
* <p>
* Funder ID, recommended to use CrossRef Funder Registry. See: https://www.crossref.org/services/funder-registry/
* (Required)
*
*/
@JsonProperty("identifier")
@JsonPropertyDescription("Funder ID, recommended to use CrossRef Funder Registry. See: https://www.crossref.org/services/funder-registry/")
private String identifier;
/**
* The Funder ID Type Schema
* <p>
* Identifier type. Allowed values: fundref, url, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("Identifier type. Allowed values: fundref, url, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 1783349151334366078L;
/**
* The Funder ID Value Schema
* <p>
* Funder ID, recommended to use CrossRef Funder Registry. See: https://www.crossref.org/services/funder-registry/
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The Funder ID Value Schema
* <p>
* Funder ID, recommended to use CrossRef Funder Registry. See: https://www.crossref.org/services/funder-registry/
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The Funder ID Type Schema
* <p>
* Identifier type. Allowed values: fundref, url, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The Funder ID Type Schema
* <p>
* Identifier type. Allowed values: fundref, url, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
FUNDREF("fundref"),
URL("url"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}

View File

@ -0,0 +1,183 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The DMP Project Funding Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"funder_id",
"funding_status",
"grant_id"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Funding implements Serializable
{
/**
* The Funder ID Schema
* <p>
* Funder ID of the associated project
* (Required)
*
*/
@JsonProperty("funder_id")
@JsonPropertyDescription("Funder ID of the associated project")
private FunderId funderId;
/**
* The Funding Status Schema
* <p>
* To express different phases of project lifecycle. Allowed values: planned, applied, granted, rejected
*
*/
@JsonProperty("funding_status")
@JsonPropertyDescription("To express different phases of project lifecycle. Allowed values: planned, applied, granted, rejected")
private FundingStatus fundingStatus;
/**
* The Funding Grant ID Schema
* <p>
* Grant ID of the associated project
* (Required)
*
*/
@JsonProperty("grant_id")
@JsonPropertyDescription("Grant ID of the associated project")
private GrantId grantId;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 8962229321225336165L;
/**
* The Funder ID Schema
* <p>
* Funder ID of the associated project
* (Required)
*
*/
@JsonProperty("funder_id")
public FunderId getFunderId() {
return funderId;
}
/**
* The Funder ID Schema
* <p>
* Funder ID of the associated project
* (Required)
*
*/
@JsonProperty("funder_id")
public void setFunderId(FunderId funderId) {
this.funderId = funderId;
}
/**
* The Funding Status Schema
* <p>
* To express different phases of project lifecycle. Allowed values: planned, applied, granted, rejected
*
*/
@JsonProperty("funding_status")
public FundingStatus getFundingStatus() {
return fundingStatus;
}
/**
* The Funding Status Schema
* <p>
* To express different phases of project lifecycle. Allowed values: planned, applied, granted, rejected
*
*/
@JsonProperty("funding_status")
public void setFundingStatus(FundingStatus fundingStatus) {
this.fundingStatus = fundingStatus;
}
/**
* The Funding Grant ID Schema
* <p>
* Grant ID of the associated project
* (Required)
*
*/
@JsonProperty("grant_id")
public GrantId getGrantId() {
return grantId;
}
/**
* The Funding Grant ID Schema
* <p>
* Grant ID of the associated project
* (Required)
*
*/
@JsonProperty("grant_id")
public void setGrantId(GrantId grantId) {
this.grantId = grantId;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum FundingStatus {
PLANNED("planned"),
APPLIED("applied"),
GRANTED("granted"),
REJECTED("rejected");
private final String value;
private final static Map<String, FundingStatus> CONSTANTS = new HashMap<String, FundingStatus>();
static {
for (FundingStatus c: values()) {
CONSTANTS.put(c.value, c);
}
}
private FundingStatus(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static FundingStatus fromValue(String value) {
FundingStatus constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}

View File

@ -0,0 +1,149 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Funding Grant ID Schema
* <p>
* Grant ID of the associated project
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class GrantId implements Serializable
{
/**
* The Funding Grant ID Value Schema
* <p>
* Grant ID
* (Required)
*
*/
@JsonProperty("identifier")
@JsonPropertyDescription("Grant ID")
private String identifier;
/**
* The Funding Grant ID Type Schema
* <p>
* Identifier type. Allowed values: url, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("Identifier type. Allowed values: url, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -7738072672837592065L;
/**
* The Funding Grant ID Value Schema
* <p>
* Grant ID
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The Funding Grant ID Value Schema
* <p>
* Grant ID
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The Funding Grant ID Type Schema
* <p>
* Identifier type. Allowed values: url, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The Funding Grant ID Type Schema
* <p>
* Identifier type. Allowed values: url, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
URL("url"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}


@@ -0,0 +1,772 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.net.URI;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* The Dataset Distribution Host Schema
* <p>
* To provide information on quality of service provided by infrastructure (e.g. repository) where data is stored.
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"availability",
"backup_frequency",
"backup_type",
"certified_with",
"description",
"geo_location",
"pid_system",
"storage_type",
"support_versioning",
"title",
"url"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Host implements Serializable
{
/**
* The Dataset Distribution Host Availability Schema
* <p>
* Availability
*
*/
@JsonProperty("availability")
@JsonPropertyDescription("Availability")
private String availability;
/**
* The Dataset Distribution Host Backup Frequency Schema
* <p>
* Backup Frequency
*
*/
@JsonProperty("backup_frequency")
@JsonPropertyDescription("Backup Frequency")
private String backupFrequency;
/**
* The Dataset Distribution Host Backup Type Schema
* <p>
* Backup Type
*
*/
@JsonProperty("backup_type")
@JsonPropertyDescription("Backup Type")
private String backupType;
/**
* The Dataset Distribution Host Certification Type Schema
* <p>
* Repository certified to a recognised standard. Allowed values: din31644, dini-zertifikat, dsa, iso16363, iso16919, trac, wds, coretrustseal
*
*/
@JsonProperty("certified_with")
@JsonPropertyDescription("Repository certified to a recognised standard. Allowed values: din31644, dini-zertifikat, dsa, iso16363, iso16919, trac, wds, coretrustseal")
private CertifiedWith certifiedWith;
/**
* The Dataset Distribution Host Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Description")
private String description;
/**
* The Dataset Distribution Host Geographical Location Schema
* <p>
* Physical location of the data expressed using ISO 3166-1 country code.
*
*/
@JsonProperty("geo_location")
@JsonPropertyDescription("Physical location of the data expressed using ISO 3166-1 country code.")
private GeoLocation geoLocation;
/**
* The Dataset Distribution Host PID System Schema
* <p>
* PID system(s). Allowed values: ark, arxiv, bibcode, doi, ean13, eissn, handle, igsn, isbn, issn, istc, lissn, lsid, pmid, purl, upc, url, urn, other
*
*/
@JsonProperty("pid_system")
@JsonPropertyDescription("PID system(s). Allowed values: ark, arxiv, bibcode, doi, ean13, eissn, handle, igsn, isbn, issn, istc, lissn, lsid, pmid, purl, upc, url, urn, other")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<PidSystem> pidSystem = null;
/**
* The Dataset Distribution Host Storage Type Schema
* <p>
* The type of storage required
*
*/
@JsonProperty("storage_type")
@JsonPropertyDescription("The type of storage required")
private String storageType;
/**
* The Dataset Distribution Host Support Versioning Schema
* <p>
* If host supports versioning. Allowed values: yes, no, unknown
*
*/
@JsonProperty("support_versioning")
@JsonPropertyDescription("If host supports versioning. Allowed values: yes, no, unknown")
private SupportVersioning supportVersioning;
/**
* The Dataset Distribution Host Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Title")
private String title;
/**
 * The Dataset Distribution Host URL Schema
* <p>
* The URL of the system hosting a distribution of a dataset
* (Required)
*
*/
@JsonProperty("url")
@JsonPropertyDescription("The URL of the system hosting a distribution of a dataset")
private URI url;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 8564338806797654115L;
/**
* The Dataset Distribution Host Availability Schema
* <p>
* Availability
*
*/
@JsonProperty("availability")
public String getAvailability() {
return availability;
}
/**
* The Dataset Distribution Host Availability Schema
* <p>
* Availability
*
*/
@JsonProperty("availability")
public void setAvailability(String availability) {
this.availability = availability;
}
/**
* The Dataset Distribution Host Backup Frequency Schema
* <p>
* Backup Frequency
*
*/
@JsonProperty("backup_frequency")
public String getBackupFrequency() {
return backupFrequency;
}
/**
* The Dataset Distribution Host Backup Frequency Schema
* <p>
* Backup Frequency
*
*/
@JsonProperty("backup_frequency")
public void setBackupFrequency(String backupFrequency) {
this.backupFrequency = backupFrequency;
}
/**
* The Dataset Distribution Host Backup Type Schema
* <p>
* Backup Type
*
*/
@JsonProperty("backup_type")
public String getBackupType() {
return backupType;
}
/**
* The Dataset Distribution Host Backup Type Schema
* <p>
* Backup Type
*
*/
@JsonProperty("backup_type")
public void setBackupType(String backupType) {
this.backupType = backupType;
}
/**
* The Dataset Distribution Host Certification Type Schema
* <p>
* Repository certified to a recognised standard. Allowed values: din31644, dini-zertifikat, dsa, iso16363, iso16919, trac, wds, coretrustseal
*
*/
@JsonProperty("certified_with")
public CertifiedWith getCertifiedWith() {
return certifiedWith;
}
/**
* The Dataset Distribution Host Certification Type Schema
* <p>
* Repository certified to a recognised standard. Allowed values: din31644, dini-zertifikat, dsa, iso16363, iso16919, trac, wds, coretrustseal
*
*/
@JsonProperty("certified_with")
public void setCertifiedWith(CertifiedWith certifiedWith) {
this.certifiedWith = certifiedWith;
}
/**
* The Dataset Distribution Host Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Dataset Distribution Host Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Dataset Distribution Host Geographical Location Schema
* <p>
* Physical location of the data expressed using ISO 3166-1 country code.
*
*/
@JsonProperty("geo_location")
public GeoLocation getGeoLocation() {
return geoLocation;
}
/**
* The Dataset Distribution Host Geographical Location Schema
* <p>
* Physical location of the data expressed using ISO 3166-1 country code.
*
*/
@JsonProperty("geo_location")
public void setGeoLocation(GeoLocation geoLocation) {
this.geoLocation = geoLocation;
}
/**
* The Dataset Distribution Host PID System Schema
* <p>
* PID system(s). Allowed values: ark, arxiv, bibcode, doi, ean13, eissn, handle, igsn, isbn, issn, istc, lissn, lsid, pmid, purl, upc, url, urn, other
*
*/
@JsonProperty("pid_system")
public List<PidSystem> getPidSystem() {
return pidSystem;
}
/**
* The Dataset Distribution Host PID System Schema
* <p>
* PID system(s). Allowed values: ark, arxiv, bibcode, doi, ean13, eissn, handle, igsn, isbn, issn, istc, lissn, lsid, pmid, purl, upc, url, urn, other
*
*/
@JsonProperty("pid_system")
public void setPidSystem(List<PidSystem> pidSystem) {
this.pidSystem = pidSystem;
}
/**
* The Dataset Distribution Host Storage Type Schema
* <p>
* The type of storage required
*
*/
@JsonProperty("storage_type")
public String getStorageType() {
return storageType;
}
/**
* The Dataset Distribution Host Storage Type Schema
* <p>
* The type of storage required
*
*/
@JsonProperty("storage_type")
public void setStorageType(String storageType) {
this.storageType = storageType;
}
/**
* The Dataset Distribution Host Support Versioning Schema
* <p>
* If host supports versioning. Allowed values: yes, no, unknown
*
*/
@JsonProperty("support_versioning")
public SupportVersioning getSupportVersioning() {
return supportVersioning;
}
/**
* The Dataset Distribution Host Support Versioning Schema
* <p>
* If host supports versioning. Allowed values: yes, no, unknown
*
*/
@JsonProperty("support_versioning")
public void setSupportVersioning(SupportVersioning supportVersioning) {
this.supportVersioning = supportVersioning;
}
/**
* The Dataset Distribution Host Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The Dataset Distribution Host Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
/**
 * The Dataset Distribution Host URL Schema
* <p>
* The URL of the system hosting a distribution of a dataset
* (Required)
*
*/
@JsonProperty("url")
public URI getUrl() {
return url;
}
/**
 * The Dataset Distribution Host URL Schema
* <p>
* The URL of the system hosting a distribution of a dataset
* (Required)
*
*/
@JsonProperty("url")
public void setUrl(URI url) {
this.url = url;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum CertifiedWith {
DIN_31644("din31644"),
DINI_ZERTIFIKAT("dini-zertifikat"),
DSA("dsa"),
ISO_16363("iso16363"),
ISO_16919("iso16919"),
TRAC("trac"),
WDS("wds"),
CORETRUSTSEAL("coretrustseal");
private final String value;
private final static Map<String, CertifiedWith> CONSTANTS = new HashMap<String, CertifiedWith>();
static {
for (CertifiedWith c: values()) {
CONSTANTS.put(c.value, c);
}
}
private CertifiedWith(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static CertifiedWith fromValue(String value) {
CertifiedWith constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
public enum GeoLocation {
AD("AD"),
AE("AE"),
AF("AF"),
AG("AG"),
AI("AI"),
AL("AL"),
AM("AM"),
AO("AO"),
AQ("AQ"),
AR("AR"),
AS("AS"),
AT("AT"),
AU("AU"),
AW("AW"),
AX("AX"),
AZ("AZ"),
BA("BA"),
BB("BB"),
BD("BD"),
BE("BE"),
BF("BF"),
BG("BG"),
BH("BH"),
BI("BI"),
BJ("BJ"),
BL("BL"),
BM("BM"),
BN("BN"),
BO("BO"),
BQ("BQ"),
BR("BR"),
BS("BS"),
BT("BT"),
BV("BV"),
BW("BW"),
BY("BY"),
BZ("BZ"),
CA("CA"),
CC("CC"),
CD("CD"),
CF("CF"),
CG("CG"),
CH("CH"),
CI("CI"),
CK("CK"),
CL("CL"),
CM("CM"),
CN("CN"),
CO("CO"),
CR("CR"),
CU("CU"),
CV("CV"),
CW("CW"),
CX("CX"),
CY("CY"),
CZ("CZ"),
DE("DE"),
DJ("DJ"),
DK("DK"),
DM("DM"),
DO("DO"),
DZ("DZ"),
EC("EC"),
EE("EE"),
EG("EG"),
EH("EH"),
ER("ER"),
ES("ES"),
ET("ET"),
FI("FI"),
FJ("FJ"),
FK("FK"),
FM("FM"),
FO("FO"),
FR("FR"),
GA("GA"),
GB("GB"),
GD("GD"),
GE("GE"),
GF("GF"),
GG("GG"),
GH("GH"),
GI("GI"),
GL("GL"),
GM("GM"),
GN("GN"),
GP("GP"),
GQ("GQ"),
GR("GR"),
GS("GS"),
GT("GT"),
GU("GU"),
GW("GW"),
GY("GY"),
HK("HK"),
HM("HM"),
HN("HN"),
HR("HR"),
HT("HT"),
HU("HU"),
ID("ID"),
IE("IE"),
IL("IL"),
IM("IM"),
IN("IN"),
IO("IO"),
IQ("IQ"),
IR("IR"),
IS("IS"),
IT("IT"),
JE("JE"),
JM("JM"),
JO("JO"),
JP("JP"),
KE("KE"),
KG("KG"),
KH("KH"),
KI("KI"),
KM("KM"),
KN("KN"),
KP("KP"),
KR("KR"),
KW("KW"),
KY("KY"),
KZ("KZ"),
LA("LA"),
LB("LB"),
LC("LC"),
LI("LI"),
LK("LK"),
LR("LR"),
LS("LS"),
LT("LT"),
LU("LU"),
LV("LV"),
LY("LY"),
MA("MA"),
MC("MC"),
MD("MD"),
ME("ME"),
MF("MF"),
MG("MG"),
MH("MH"),
MK("MK"),
ML("ML"),
MM("MM"),
MN("MN"),
MO("MO"),
MP("MP"),
MQ("MQ"),
MR("MR"),
MS("MS"),
MT("MT"),
MU("MU"),
MV("MV"),
MW("MW"),
MX("MX"),
MY("MY"),
MZ("MZ"),
NA("NA"),
NC("NC"),
NE("NE"),
NF("NF"),
NG("NG"),
NI("NI"),
NL("NL"),
NO("NO"),
NP("NP"),
NR("NR"),
NU("NU"),
NZ("NZ"),
OM("OM"),
PA("PA"),
PE("PE"),
PF("PF"),
PG("PG"),
PH("PH"),
PK("PK"),
PL("PL"),
PM("PM"),
PN("PN"),
PR("PR"),
PS("PS"),
PT("PT"),
PW("PW"),
PY("PY"),
QA("QA"),
RE("RE"),
RO("RO"),
RS("RS"),
RU("RU"),
RW("RW"),
SA("SA"),
SB("SB"),
SC("SC"),
SD("SD"),
SE("SE"),
SG("SG"),
SH("SH"),
SI("SI"),
SJ("SJ"),
SK("SK"),
SL("SL"),
SM("SM"),
SN("SN"),
SO("SO"),
SR("SR"),
SS("SS"),
ST("ST"),
SV("SV"),
SX("SX"),
SY("SY"),
SZ("SZ"),
TC("TC"),
TD("TD"),
TF("TF"),
TG("TG"),
TH("TH"),
TJ("TJ"),
TK("TK"),
TL("TL"),
TM("TM"),
TN("TN"),
TO("TO"),
TR("TR"),
TT("TT"),
TV("TV"),
TW("TW"),
TZ("TZ"),
UA("UA"),
UG("UG"),
UM("UM"),
US("US"),
UY("UY"),
UZ("UZ"),
VA("VA"),
VC("VC"),
VE("VE"),
VG("VG"),
VI("VI"),
VN("VN"),
VU("VU"),
WF("WF"),
WS("WS"),
YE("YE"),
YT("YT"),
ZA("ZA"),
ZM("ZM"),
ZW("ZW");
private final String value;
private final static Map<String, GeoLocation> CONSTANTS = new HashMap<String, GeoLocation>();
static {
for (GeoLocation c: values()) {
CONSTANTS.put(c.value, c);
}
}
private GeoLocation(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static GeoLocation fromValue(String value) {
GeoLocation constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
public enum SupportVersioning {
YES("yes"),
NO("no"),
UNKNOWN("unknown");
private final String value;
private final static Map<String, SupportVersioning> CONSTANTS = new HashMap<String, SupportVersioning>();
static {
for (SupportVersioning c: values()) {
CONSTANTS.put(c.value, c);
}
}
private SupportVersioning(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static SupportVersioning fromValue(String value) {
SupportVersioning constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}


@@ -0,0 +1,229 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonValue;
import java.util.HashMap;
import java.util.Map;
public enum Language {
AAR("aar"),
ABK("abk"),
AFR("afr"),
AKA("aka"),
AMH("amh"),
ARA("ara"),
ARG("arg"),
ASM("asm"),
AVA("ava"),
AVE("ave"),
AYM("aym"),
AZE("aze"),
BAK("bak"),
BAM("bam"),
BEL("bel"),
BEN("ben"),
BIH("bih"),
BIS("bis"),
BOD("bod"),
BOS("bos"),
BRE("bre"),
BUL("bul"),
CAT("cat"),
CES("ces"),
CHA("cha"),
CHE("che"),
CHU("chu"),
CHV("chv"),
COR("cor"),
COS("cos"),
CRE("cre"),
CYM("cym"),
DAN("dan"),
DEU("deu"),
DIV("div"),
DZO("dzo"),
ELL("ell"),
ENG("eng"),
EPO("epo"),
EST("est"),
EUS("eus"),
EWE("ewe"),
FAO("fao"),
FAS("fas"),
FIJ("fij"),
FIN("fin"),
FRA("fra"),
FRY("fry"),
FUL("ful"),
GLA("gla"),
GLE("gle"),
GLG("glg"),
GLV("glv"),
GRN("grn"),
GUJ("guj"),
HAT("hat"),
HAU("hau"),
HBS("hbs"),
HEB("heb"),
HER("her"),
HIN("hin"),
HMO("hmo"),
HRV("hrv"),
HUN("hun"),
HYE("hye"),
IBO("ibo"),
IDO("ido"),
III("iii"),
IKU("iku"),
ILE("ile"),
INA("ina"),
IND("ind"),
IPK("ipk"),
ISL("isl"),
ITA("ita"),
JAV("jav"),
JPN("jpn"),
KAL("kal"),
KAN("kan"),
KAS("kas"),
KAT("kat"),
KAU("kau"),
KAZ("kaz"),
KHM("khm"),
KIK("kik"),
KIN("kin"),
KIR("kir"),
KOM("kom"),
KON("kon"),
KOR("kor"),
KUA("kua"),
KUR("kur"),
LAO("lao"),
LAT("lat"),
LAV("lav"),
LIM("lim"),
LIN("lin"),
LIT("lit"),
LTZ("ltz"),
LUB("lub"),
LUG("lug"),
MAH("mah"),
MAL("mal"),
MAR("mar"),
MKD("mkd"),
MLG("mlg"),
MLT("mlt"),
MON("mon"),
MRI("mri"),
MSA("msa"),
MYA("mya"),
NAU("nau"),
NAV("nav"),
NBL("nbl"),
NDE("nde"),
NDO("ndo"),
NEP("nep"),
NLD("nld"),
NNO("nno"),
NOB("nob"),
NOR("nor"),
NYA("nya"),
OCI("oci"),
OJI("oji"),
ORI("ori"),
ORM("orm"),
OSS("oss"),
PAN("pan"),
PLI("pli"),
POL("pol"),
POR("por"),
PUS("pus"),
QUE("que"),
ROH("roh"),
RON("ron"),
RUN("run"),
RUS("rus"),
SAG("sag"),
SAN("san"),
SIN("sin"),
SLK("slk"),
SLV("slv"),
SME("sme"),
SMO("smo"),
SNA("sna"),
SND("snd"),
SOM("som"),
SOT("sot"),
SPA("spa"),
SQI("sqi"),
SRD("srd"),
SRP("srp"),
SSW("ssw"),
SUN("sun"),
SWA("swa"),
SWE("swe"),
TAH("tah"),
TAM("tam"),
TAT("tat"),
TEL("tel"),
TGK("tgk"),
TGL("tgl"),
THA("tha"),
TIR("tir"),
TON("ton"),
TSN("tsn"),
TSO("tso"),
TUK("tuk"),
TUR("tur"),
TWI("twi"),
UIG("uig"),
UKR("ukr"),
URD("urd"),
UZB("uzb"),
VEN("ven"),
VIE("vie"),
VOL("vol"),
WLN("wln"),
WOL("wol"),
XHO("xho"),
YID("yid"),
YOR("yor"),
ZHA("zha"),
ZHO("zho"),
ZUL("zul");
private final String value;
private final static Map<String, Language> CONSTANTS = new HashMap<>();
static {
for (Language c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Language(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Language fromValue(String value) {
Language constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}


@@ -0,0 +1,111 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.net.URI;
import java.util.HashMap;
import java.util.Map;
/**
* The Dataset Distribution License Items
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"license_ref",
"start_date"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class License implements Serializable
{
/**
* The Dataset Distribution License Reference Schema
* <p>
* Link to license document.
* (Required)
*
*/
@JsonProperty("license_ref")
@JsonPropertyDescription("Link to license document.")
private URI licenseRef;
/**
* The Dataset Distribution License Start Date Schema
* <p>
* Starting date of license. If date is set in the future, it indicates embargo period.
* (Required)
*
*/
@JsonProperty("start_date")
@JsonPropertyDescription("Starting date of license. If date is set in the future, it indicates embargo period.")
private String startDate;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 4148207295817559010L;
/**
* The Dataset Distribution License Reference Schema
* <p>
* Link to license document.
* (Required)
*
*/
@JsonProperty("license_ref")
public URI getLicenseRef() {
return licenseRef;
}
/**
* The Dataset Distribution License Reference Schema
* <p>
* Link to license document.
* (Required)
*
*/
@JsonProperty("license_ref")
public void setLicenseRef(URI licenseRef) {
this.licenseRef = licenseRef;
}
/**
* The Dataset Distribution License Start Date Schema
* <p>
* Starting date of license. If date is set in the future, it indicates embargo period.
* (Required)
*
*/
@JsonProperty("start_date")
public String getStartDate() {
return startDate;
}
/**
* The Dataset Distribution License Start Date Schema
* <p>
* Starting date of license. If date is set in the future, it indicates embargo period.
* (Required)
*
*/
@JsonProperty("start_date")
public void setStartDate(String startDate) {
this.startDate = startDate;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
}
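`License.startDate` is carried as a plain `String`; per the field's description, a start date set in the future indicates an embargo period. A sketch of how a consumer might test for that with `java.time` — the `isUnderEmbargo` helper is hypothetical and assumes ISO-8601 date strings:

```java
import java.time.LocalDate;

public class EmbargoCheckDemo {

    // Hypothetical helper: the License Javadoc above states that a
    // start_date in the future indicates an embargo period.
    // Assumes the string is an ISO-8601 local date (yyyy-MM-dd).
    static boolean isUnderEmbargo(String startDate, LocalDate today) {
        return LocalDate.parse(startDate).isAfter(today);
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.of(2024, 1, 26); // date of this commit
        System.out.println(isUnderEmbargo("2025-01-01", today));
        System.out.println(isUnderEmbargo("2023-01-01", today));
    }
}
```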


@@ -0,0 +1,149 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Dataset Metadata Standard ID Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"identifier",
"type"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class MetadataStandardId implements Serializable
{
/**
* The Dataset Metadata Standard Identifier Value Schema
* <p>
* Identifier for the metadata standard used.
* (Required)
*
*/
@JsonProperty("identifier")
@JsonPropertyDescription("Identifier for the metadata standard used.")
private String identifier;
/**
* The Dataset Metadata Standard Identifier Type Schema
* <p>
* Identifier type. Allowed values: url, other
* (Required)
*
*/
@JsonProperty("type")
@JsonPropertyDescription("Identifier type. Allowed values: url, other")
private Type type;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -7641042701935397947L;
/**
* The Dataset Metadata Standard Identifier Value Schema
* <p>
* Identifier for the metadata standard used.
* (Required)
*
*/
@JsonProperty("identifier")
public String getIdentifier() {
return identifier;
}
/**
* The Dataset Metadata Standard Identifier Value Schema
* <p>
* Identifier for the metadata standard used.
* (Required)
*
*/
@JsonProperty("identifier")
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
/**
* The Dataset Metadata Standard Identifier Type Schema
* <p>
* Identifier type. Allowed values: url, other
* (Required)
*
*/
@JsonProperty("type")
public Type getType() {
return type;
}
/**
* The Dataset Metadata Standard Identifier Type Schema
* <p>
* Identifier type. Allowed values: url, other
* (Required)
*
*/
@JsonProperty("type")
public void setType(Type type) {
this.type = type;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Type {
URL("url"),
OTHER("other");
private final String value;
private final static Map<String, Type> CONSTANTS = new HashMap<String, Type>();
static {
for (Type c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Type(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Type fromValue(String value) {
Type constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}
}


@@ -0,0 +1,364 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Dataset Metadata Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"description",
"language",
"metadata_standard_id",
"additional_properties"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Metadatum implements Serializable
{
/**
* The Dataset Metadata Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Description")
private String description;
/**
* The Dataset Metadata Language Schema
* <p>
* Language of the metadata expressed using ISO 639-3.
* (Required)
*
*/
@JsonProperty("language")
@JsonPropertyDescription("Language of the metadata expressed using ISO 639-3.")
private Language language;
/**
* The Dataset Metadata Standard ID Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("metadata_standard_id")
private MetadataStandardId metadataStandardId;
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 6511312853153406190L;
/**
* The Dataset Metadata Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Dataset Metadata Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Dataset Metadata Language Schema
* <p>
* Language of the metadata expressed using ISO 639-3.
* (Required)
*
*/
@JsonProperty("language")
public Language getLanguage() {
return language;
}
/**
* The Dataset Metadata Language Schema
* <p>
* Language of the metadata expressed using ISO 639-3.
* (Required)
*
*/
@JsonProperty("language")
public void setLanguage(Language language) {
this.language = language;
}
/**
* The Dataset Metadata Standard ID Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("metadata_standard_id")
public MetadataStandardId getMetadataStandardId() {
return metadataStandardId;
}
/**
* The Dataset Metadata Standard ID Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("metadata_standard_id")
public void setMetadataStandardId(MetadataStandardId metadataStandardId) {
this.metadataStandardId = metadataStandardId;
}
@JsonProperty("additional_properties")
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
public enum Language {
AAR("aar"),
ABK("abk"),
AFR("afr"),
AKA("aka"),
AMH("amh"),
ARA("ara"),
ARG("arg"),
ASM("asm"),
AVA("ava"),
AVE("ave"),
AYM("aym"),
AZE("aze"),
BAK("bak"),
BAM("bam"),
BEL("bel"),
BEN("ben"),
BIH("bih"),
BIS("bis"),
BOD("bod"),
BOS("bos"),
BRE("bre"),
BUL("bul"),
CAT("cat"),
CES("ces"),
CHA("cha"),
CHE("che"),
CHU("chu"),
CHV("chv"),
COR("cor"),
COS("cos"),
CRE("cre"),
CYM("cym"),
DAN("dan"),
DEU("deu"),
DIV("div"),
DZO("dzo"),
ELL("ell"),
ENG("eng"),
EPO("epo"),
EST("est"),
EUS("eus"),
EWE("ewe"),
FAO("fao"),
FAS("fas"),
FIJ("fij"),
FIN("fin"),
FRA("fra"),
FRY("fry"),
FUL("ful"),
GLA("gla"),
GLE("gle"),
GLG("glg"),
GLV("glv"),
GRN("grn"),
GUJ("guj"),
HAT("hat"),
HAU("hau"),
HBS("hbs"),
HEB("heb"),
HER("her"),
HIN("hin"),
HMO("hmo"),
HRV("hrv"),
HUN("hun"),
HYE("hye"),
IBO("ibo"),
IDO("ido"),
III("iii"),
IKU("iku"),
ILE("ile"),
INA("ina"),
IND("ind"),
IPK("ipk"),
ISL("isl"),
ITA("ita"),
JAV("jav"),
JPN("jpn"),
KAL("kal"),
KAN("kan"),
KAS("kas"),
KAT("kat"),
KAU("kau"),
KAZ("kaz"),
KHM("khm"),
KIK("kik"),
KIN("kin"),
KIR("kir"),
KOM("kom"),
KON("kon"),
KOR("kor"),
KUA("kua"),
KUR("kur"),
LAO("lao"),
LAT("lat"),
LAV("lav"),
LIM("lim"),
LIN("lin"),
LIT("lit"),
LTZ("ltz"),
LUB("lub"),
LUG("lug"),
MAH("mah"),
MAL("mal"),
MAR("mar"),
MKD("mkd"),
MLG("mlg"),
MLT("mlt"),
MON("mon"),
MRI("mri"),
MSA("msa"),
MYA("mya"),
NAU("nau"),
NAV("nav"),
NBL("nbl"),
NDE("nde"),
NDO("ndo"),
NEP("nep"),
NLD("nld"),
NNO("nno"),
NOB("nob"),
NOR("nor"),
NYA("nya"),
OCI("oci"),
OJI("oji"),
ORI("ori"),
ORM("orm"),
OSS("oss"),
PAN("pan"),
PLI("pli"),
POL("pol"),
POR("por"),
PUS("pus"),
QUE("que"),
ROH("roh"),
RON("ron"),
RUN("run"),
RUS("rus"),
SAG("sag"),
SAN("san"),
SIN("sin"),
SLK("slk"),
SLV("slv"),
SME("sme"),
SMO("smo"),
SNA("sna"),
SND("snd"),
SOM("som"),
SOT("sot"),
SPA("spa"),
SQI("sqi"),
SRD("srd"),
SRP("srp"),
SSW("ssw"),
SUN("sun"),
SWA("swa"),
SWE("swe"),
TAH("tah"),
TAM("tam"),
TAT("tat"),
TEL("tel"),
TGK("tgk"),
TGL("tgl"),
THA("tha"),
TIR("tir"),
TON("ton"),
TSN("tsn"),
TSO("tso"),
TUK("tuk"),
TUR("tur"),
TWI("twi"),
UIG("uig"),
UKR("ukr"),
URD("urd"),
UZB("uzb"),
VEN("ven"),
VIE("vie"),
VOL("vol"),
WLN("wln"),
WOL("wol"),
XHO("xho"),
YID("yid"),
YOR("yor"),
ZHA("zha"),
ZHO("zho"),
ZUL("zul");
private final String value;
private final static Map<String, Language> CONSTANTS = new HashMap<String, Language>();
static {
for (Language c: values()) {
CONSTANTS.put(c.value, c);
}
}
private Language(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static Language fromValue(String value) {
Language constant = CONSTANTS.get(value);
// Unlike the other enums in this package, an unknown value maps to
// null instead of throwing, so an unrecognised metadata language is
// silently dropped rather than failing deserialization.
if (constant == null) {
return null;
} else {
return constant;
}
}
}
}


@@ -0,0 +1,64 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonValue;
import java.util.HashMap;
import java.util.Map;
public enum PidSystem {
ARK("ark"),
ARXIV("arxiv"),
BIBCODE("bibcode"),
DOI("doi"),
EAN_13("ean13"),
EISSN("eissn"),
HANDLE("handle"),
IGSN("igsn"),
ISBN("isbn"),
ISSN("issn"),
ISTC("istc"),
LISSN("lissn"),
LSID("lsid"),
PMID("pmid"),
PURL("purl"),
UPC("upc"),
URL("url"),
URN("urn"),
OTHER("other");
private final String value;
private final static Map<String, PidSystem> CONSTANTS = new HashMap<String, PidSystem>();
static {
for (PidSystem c: values()) {
CONSTANTS.put(c.value, c);
}
}
private PidSystem(String value) {
this.value = value;
}
@Override
public String toString() {
return this.value;
}
@JsonValue
public String value() {
return this.value;
}
@JsonCreator
public static PidSystem fromValue(String value) {
PidSystem constant = CONSTANTS.get(value);
if (constant == null) {
throw new IllegalArgumentException(value);
} else {
return constant;
}
}
}


@@ -0,0 +1,211 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* The DMP Project Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"description",
"end",
"funding",
"start",
"title"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class Project implements Serializable
{
/**
* The DMP Project Description Schema
* <p>
* Project description
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Project description")
private String description;
/**
* The DMP Project End Date Schema
* <p>
* Project end date
* (Required)
*
*/
@JsonProperty("end")
@JsonPropertyDescription("Project end date")
private String end;
/**
* The DMP Project Funding Schema
* <p>
* Funding related with a project
*
*/
@JsonProperty("funding")
@JsonPropertyDescription("Funding related with a project")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private List<Funding> funding = null;
/**
* The DMP Project Start Date Schema
* <p>
* Project start date
* (Required)
*
*/
@JsonProperty("start")
@JsonPropertyDescription("Project start date")
private String start;
/**
* The DMP Project Title Schema
* <p>
* Project title
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Project title")
private String title;
@JsonIgnore
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 1437619307195890472L;
/**
* The DMP Project Description Schema
* <p>
* Project description
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The DMP Project Description Schema
* <p>
* Project description
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The DMP Project End Date Schema
* <p>
* Project end date
* (Required)
*
*/
@JsonProperty("end")
public String getEnd() {
return end;
}
/**
* The DMP Project End Date Schema
* <p>
* Project end date
* (Required)
*
*/
@JsonProperty("end")
public void setEnd(String end) {
this.end = end;
}
/**
* The DMP Project Funding Schema
* <p>
* Funding related with a project
*
*/
@JsonProperty("funding")
public List<Funding> getFunding() {
return funding;
}
/**
* The DMP Project Funding Schema
* <p>
* Funding related with a project
*
*/
@JsonProperty("funding")
public void setFunding(List<Funding> funding) {
this.funding = funding;
}
/**
* The DMP Project Start Date Schema
* <p>
* Project start date
* (Required)
*
*/
@JsonProperty("start")
public String getStart() {
return start;
}
/**
* The DMP Project Start Date Schema
* <p>
* Project start date
* (Required)
*
*/
@JsonProperty("start")
public void setStart(String start) {
this.start = start;
}
/**
* The DMP Project Title Schema
* <p>
* Project title
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The DMP Project Title Schema
* <p>
* Project title
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
}

@@ -0,0 +1,58 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
/**
* RDA DMP Common Standard Schema
* <p>
* JSON Schema for the RDA DMP Common Standard
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"dmp"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class RDAModel implements Serializable
{
/**
* The DMP Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("dmp")
private Dmp dmp;
private final static long serialVersionUID = 7331666133368350998L;
/**
* The DMP Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("dmp")
public Dmp getDmp() {
return dmp;
}
/**
* The DMP Schema
* <p>
*
* (Required)
*
*/
@JsonProperty("dmp")
public void setDmp(Dmp dmp) {
this.dmp = dmp;
}
}

@@ -0,0 +1,107 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Dataset Security & Policy Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"description",
"title",
"additional_properties"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class SecurityAndPrivacy implements Serializable
{
/**
* The Dataset Security & Policy Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Description")
private String description;
/**
* The Dataset Security & Policy Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
@JsonPropertyDescription("Title")
private String title;
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = 7863747935827682977L;
/**
* The Dataset Security & Policy Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Dataset Security & Policy Description Schema
* <p>
* Description
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Dataset Security & Policy Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
public String getTitle() {
return title;
}
/**
* The Dataset Security & Policy Title Schema
* <p>
* Title
* (Required)
*
*/
@JsonProperty("title")
public void setTitle(String title) {
this.title = title;
}
@JsonProperty("additional_properties")
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
}

@@ -0,0 +1,107 @@
package eu.eudat.file.transformer.rda;
import com.fasterxml.jackson.annotation.*;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
/**
* The Dataset Technical Resource Items Schema
* <p>
*
*
*/
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"description",
"name",
"additional_properties"
})
@JsonIgnoreProperties(ignoreUnknown = true)
public class TechnicalResource implements Serializable
{
/**
* The Dataset Technical Resource Description Schema
* <p>
* Description of the technical resource
*
*/
@JsonProperty("description")
@JsonPropertyDescription("Description of the technical resource")
private String description;
/**
* The Dataset Technical Resource Name Schema
* <p>
* Name of the technical resource
* (Required)
*
*/
@JsonProperty("name")
@JsonPropertyDescription("Name of the technical resource")
private String name;
@JsonProperty("additional_properties")
@JsonInclude(JsonInclude.Include.NON_EMPTY)
private Map<String, Object> additionalProperties = new HashMap<String, Object>();
private final static long serialVersionUID = -7451757227129483110L;
/**
* The Dataset Technical Resource Description Schema
* <p>
* Description of the technical resource
*
*/
@JsonProperty("description")
public String getDescription() {
return description;
}
/**
* The Dataset Technical Resource Description Schema
* <p>
* Description of the technical resource
*
*/
@JsonProperty("description")
public void setDescription(String description) {
this.description = description;
}
/**
* The Dataset Technical Resource Name Schema
* <p>
* Name of the technical resource
* (Required)
*
*/
@JsonProperty("name")
public String getName() {
return name;
}
/**
* The Dataset Technical Resource Name Schema
* <p>
* Name of the technical resource
* (Required)
*
*/
@JsonProperty("name")
public void setName(String name) {
this.name = name;
}
@JsonProperty("additional_properties")
public Map<String, Object> getAdditionalProperties() {
return this.additionalProperties;
}
@JsonProperty("additional_properties")
public void setAdditionalProperty(String name, Object value) {
this.additionalProperties.put(name, value);
}
}

@@ -0,0 +1,20 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.rda.ContactId;
import java.util.UUID;
public class ContactIdRDAMapper {
public static ContactId toRDA(UUID id) {
ContactId rda = new ContactId();
rda.setIdentifier(id.toString());
rda.setType(ContactId.Type.OTHER);
return rda;
}
public static UUID toEntity(ContactId rda) {
return UUID.fromString(rda.getIdentifier());
}
}

@@ -0,0 +1,39 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.enums.ContactInfoType;
import eu.eudat.file.transformer.models.user.UserContactInfoFileTransformerModel;
import eu.eudat.file.transformer.models.user.UserFileTransformerModel;
import eu.eudat.file.transformer.rda.Contact;
import java.util.List;
public class ContactRDAMapper {
public static Contact toRDA(UserFileTransformerModel creator) {
Contact rda = new Contact();
if (creator.getName() == null) {
throw new IllegalArgumentException("Contact Name is missing");
}
rda.setName(creator.getName());
//TODO: GetEmail
UserContactInfoFileTransformerModel emailContact = creator.getContacts().stream().filter(userContactInfo -> userContactInfo.getType().equals(ContactInfoType.Email)).findFirst().orElse(null);
if (emailContact == null) {
throw new IllegalArgumentException("Contact Email is missing");
}
rda.setMbox(emailContact.getValue());
rda.setContactId(ContactIdRDAMapper.toRDA(creator.getId()));
return rda;
}
public static UserFileTransformerModel toEntity(Contact rda) {
UserFileTransformerModel entity = new UserFileTransformerModel();
entity.setId(ContactIdRDAMapper.toEntity(rda.getContactId()));
entity.setName(rda.getName());
UserContactInfoFileTransformerModel emailContactInfo = new UserContactInfoFileTransformerModel();
emailContactInfo.setType(ContactInfoType.Email);
emailContactInfo.setValue(rda.getMbox());
entity.setContacts(List.of(emailContactInfo));
// entity.setEmail(rda.getMbox());//TODO: GetEmail
return entity;
}
}

@@ -0,0 +1,22 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.rda.ContributorId;
public class ContributorIdRDAMapper {
public static ContributorId toRDA(Object id) {
ContributorId rda = new ContributorId();
        String[] idParts = id.toString().split(":");
String prefix = idParts.length > 1 ? idParts[0] : id.toString();
if (prefix.equals("orcid")) {
String finalId = id.toString().replace(prefix + ":", "");
rda.setIdentifier("http://orcid.org/" + finalId);
rda.setType(ContributorId.Type.ORCID);
} else {
rda.setIdentifier(id.toString());
rda.setType(ContributorId.Type.OTHER);
}
return rda;
}
}
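The mapper's prefix handling can be isolated as a plain-string sketch: an `orcid:`-prefixed identifier is rewritten as a resolvable ORCID URL, anything else passes through unchanged. The class and method names below are illustrative and not part of the mapper API:

```java
// Sketch of ContributorIdRDAMapper's identifier normalization.
class ContributorIdSketch {
    static String toIdentifier(String id) {
        String[] idParts = id.split(":");
        // A lone token (no ':') is treated as its own prefix, so it
        // never matches "orcid" unless the whole id is literally "orcid".
        String prefix = idParts.length > 1 ? idParts[0] : id;
        if (prefix.equals("orcid")) {
            // Strip the scheme and prepend the ORCID resolver base URL.
            return "http://orcid.org/" + id.replace(prefix + ":", "");
        }
        return id;
    }
}
```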

@@ -0,0 +1,90 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.eudat.file.transformer.enums.ContactInfoType;
import eu.eudat.file.transformer.models.dmp.DmpUserFileTransformerModel;
import eu.eudat.file.transformer.models.reference.DefinitionFileTransformerModel;
import eu.eudat.file.transformer.models.reference.FieldFileTransformerModel;
import eu.eudat.file.transformer.models.reference.ReferenceFileTransformerModel;
import eu.eudat.file.transformer.models.user.UserContactInfoFileTransformerModel;
import eu.eudat.file.transformer.rda.Contributor;
import eu.eudat.file.transformer.rda.ContributorId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.util.HashSet;
import java.util.List;
public class ContributorRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(ContributorRDAMapper.class);
public static Contributor toRDA(DmpUserFileTransformerModel userDMP) {
Contributor rda = new Contributor();
rda.setContributorId(ContributorIdRDAMapper.toRDA(userDMP.getUser().getId()));
if (userDMP.getUser().getName() == null) {
throw new IllegalArgumentException("Contributor Name is missing");
}
rda.setName(userDMP.getUser().getName());
UserContactInfoFileTransformerModel emailContact = userDMP.getUser().getContacts().stream().filter(userContactInfo -> userContactInfo.getType().equals(ContactInfoType.Email)).findFirst().orElse(null);
if (emailContact != null) {
rda.setMbox(emailContact.getValue());
}
rda.setRole(new HashSet<>(List.of(userDMP.getRole().name())));
return rda;
}
public static Contributor toRDA(ReferenceFileTransformerModel researcher) {
Contributor rda = new Contributor();
rda.setContributorId(ContributorIdRDAMapper.toRDA(researcher.getReference()));
rda.setName(researcher.getLabel());
if (researcher.getDefinition() != null) {
FieldFileTransformerModel emailField = researcher.getDefinition().getFields().stream().filter(field -> field.getCode().equals("primaryEmail")).findFirst().orElse(null);
if (emailField != null) {
rda.setMbox(emailField.getValue());
}
}
// rda.setRole(new HashSet<>(Arrays.asList(UserDMP.UserDMPRoles.fromInteger(userDMP.getRole()).name())));
return rda;
}
public static Contributor toRDA(String value) {
ObjectMapper mapper = new ObjectMapper();
try {
ReferenceFileTransformerModel researcher = mapper.readValue(value, ReferenceFileTransformerModel.class);
return toRDA(researcher);
} catch (IOException e) {
logger.error(e.getMessage(), e);
}
return null;
}
public static ReferenceFileTransformerModel toEntity(Contributor rda) {
ReferenceFileTransformerModel reference = new ReferenceFileTransformerModel();
String referenceString;
if (rda.getContributorId() != null) {
if (rda.getContributorId().getType() == ContributorId.Type.ORCID) {
String id = rda.getContributorId().getIdentifier().replace("http://orcid.org/", "");
referenceString = "orcid:" + id;
} else {
                String[] idParts = rda.getContributorId().getIdentifier().split(":");
if (idParts.length == 1) {
referenceString = "dmp:" + rda.getContributorId().getIdentifier();
} else {
referenceString = rda.getContributorId().getIdentifier();
}
}
reference.setReference(referenceString);
reference.setLabel(rda.getName());
FieldFileTransformerModel field = new FieldFileTransformerModel();
field.setCode("primaryEmail");
field.setValue(rda.getMbox());
reference.setDefinition(new DefinitionFileTransformerModel());
reference.getDefinition().setFields(List.of(field));
} else {
return null;
}
return reference;
}
}

@@ -0,0 +1,114 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.Cost;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class CostRDAMapper {
    private static final Logger logger = LoggerFactory.getLogger(CostRDAMapper.class);
private static final ObjectMapper mapper = new ObjectMapper();
public static Cost toRDA(Map<String, Object> cost) throws JsonProcessingException {
Cost rda = new Cost();
Map<String, Object> code = mapper.readValue((String) cost.get("code"), HashMap.class);
rda.setCurrencyCode(Cost.CurrencyCode.fromValue((String) code.get("value")));
rda.setDescription((String) cost.get("description"));
if (cost.get("title") == null) {
throw new IllegalArgumentException("Cost Title is missing");
}
rda.setTitle((String) cost.get("title"));
rda.setValue(((Integer) cost.get("value")).doubleValue());
return rda;
}
public static List<Cost> toRDAList(List<FieldFileTransformerModel> nodes) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
Map<String, Cost> rdaMap = new HashMap<>();
for(FieldFileTransformerModel node: nodes){
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dmp.cost")).findFirst().orElse("");
if (node.getData() == null) {
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
String key = node.getNumbering();
if(!key.contains("mult")){
key = "0";
}
else{
key = "" + key.charAt(4);
}
Cost rda;
if(rdaMap.containsKey(key)){
rda = rdaMap.get(key);
}
else{
rda = new Cost();
rdaMap.put(key, rda);
}
if(rdaProperty.contains("value")){
try {
rda.setValue(Double.valueOf(rdaValue));
}
catch (NumberFormatException e) {
logger.warn("Dmp cost value " + rdaValue + " is not valid. Cost value will not be set.");
}
}
else if(rdaProperty.contains("currency_code")){
try {
HashMap<String, String> result =
new ObjectMapper().readValue(rdaValue, HashMap.class);
rda.setCurrencyCode(Cost.CurrencyCode.fromValue(result.get("value")));
}
catch (Exception e) {
logger.warn("Dmp cost currency code is not valid and will not be set.");
}
}
else if(rdaProperty.contains("title")){
Iterator<JsonNode> iter = mapper.readTree(rdaValue).elements();
StringBuilder title = new StringBuilder();
while(iter.hasNext()){
String next = iter.next().asText();
if(!next.equals("Other")) {
title.append(next).append(", ");
}
}
if(title.length() > 2){
rda.setTitle(title.substring(0, title.length() - 2));
}
else{
String t = rda.getTitle();
if(t == null){ // only other as title
rda.setTitle(rdaValue);
}
else{ // option + other
rda.setTitle(t + ", " + rdaValue);
}
}
}
else if(rdaProperty.contains("description")){
rda.setDescription(rdaValue);
}
}
List<Cost> rdaList = rdaMap.values().stream()
.filter(cost -> cost.getTitle() != null)
.collect(Collectors.toList());
return rdaList;
}
}
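`toRDAList` folds many template fields into a few `Cost` objects by deriving a grouping key from each field's numbering: numberings without `mult` all share group `"0"`, while `mult`-prefixed numberings contribute the single digit at position 4. A self-contained sketch of that key derivation (the class name is illustrative; note it inherits the original's assumption of a single-digit multiplicity index):

```java
// Sketch of the grouping-key logic in CostRDAMapper.toRDAList.
class CostKeySketch {
    static String groupKey(String numbering) {
        if (!numbering.contains("mult")) {
            // Non-multiplicity fields all merge into one Cost entry.
            return "0";
        }
        // "multN_..." -> "N"; charAt(4) assumes the index is one digit.
        return "" + numbering.charAt(4);
    }
}
```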

@@ -0,0 +1,139 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.DatasetId;
import eu.eudat.file.transformer.utils.json.JsonSearcher;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class DatasetIdRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(DatasetIdRDAMapper.class);
/*public static DatasetId toRDA(UUID id) {
DatasetId rda = new DatasetId();
rda.setIdentifier(id.toString());
rda.setType(DatasetId.Type.OTHER);
return rda;
}*/
public static DatasetId toRDA(List<FieldFileTransformerModel> nodes) {
DatasetId data = new DatasetId();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.dataset_id")).findFirst().orElse("");
if (node.getData() == null) {
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
ObjectMapper mapper = new ObjectMapper();
try {
Map<String, Object> values = mapper.readValue(rdaValue, HashMap.class);
if (!values.isEmpty()) {
values.entrySet().forEach(entry -> finalRDAMap(data, entry.getKey(), (String) entry.getValue()));
} else {
finalRDAMap(data, rdaProperty, rdaValue);
}
} catch (IOException e) {
                logger.warn(e.getMessage() + ". Passing value as is");
finalRDAMap(data, rdaProperty, rdaValue);
}
}
if (data.getIdentifier() != null && data.getType() != null) {
return data;
}
return null;
}
private static void finalRDAMap(DatasetId rda, String property, String value) {
if (value != null) {
for (DatasetIdProperties datasetIdProperties : DatasetIdProperties.values()) {
if (property.contains(datasetIdProperties.getName())) {
switch (datasetIdProperties) {
case IDENTIFIER:
rda.setIdentifier(value);
break;
case TYPE:
try {
rda.setType(DatasetId.Type.fromValue(value));
}
catch (IllegalArgumentException e){
logger.warn("Type " + value + " from semantic rda.dataset.dataset_id.type was not found. Setting type to OTHER.");
rda.setType(DatasetId.Type.OTHER);
}
break;
}
}
}
}
}
//TODO
/*
public static List<Field> toProperties(DatasetId rda, JsonNode node) {
List<Field> properties = new ArrayList<>();
List<JsonNode> idNodes = JsonSearcher.findNodes(node, "schematics", "rda.dataset.dataset_id");
for (JsonNode idNode: idNodes) {
for (DatasetIdProperties datasetIdProperties : DatasetIdProperties.values()) {
JsonNode schematics = idNode.get("schematics");
if(schematics.isArray()){
for(JsonNode schematic: schematics){
if(schematic.asText().endsWith(datasetIdProperties.getName())){
switch (datasetIdProperties) {
case IDENTIFIER:
Field field1 = new Field();
field1.setKey(idNode.get("id").asText());
field1.setValue(rda.getIdentifier());
properties.add(field1);
break;
case TYPE:
Field field2 = new Field();
field2.setKey(idNode.get("id").asText());
field2.setValue(rda.getType().value());
properties.add(field2);
break;
}
break;
}
}
}
}
}
return properties;
}
*/
private enum DatasetIdProperties {
IDENTIFIER("identifier"),
TYPE("type");
private final String name;
DatasetIdProperties(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
}
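`finalRDAMap` dispatches by substring-matching the semantic property path against the enum's field names, so `rda.dataset.dataset_id.type` routes to the `TYPE` branch. A minimal sketch of that matching step, with illustrative names:

```java
// Sketch of the property-name dispatch in DatasetIdRDAMapper.finalRDAMap:
// the first known field name contained in the semantic path wins.
class DatasetIdDispatchSketch {
    static String match(String property) {
        for (String name : new String[]{"identifier", "type"}) {
            if (property.contains(name)) {
                return name;
            }
        }
        return null; // unrecognized semantic path: nothing is set
    }
}
```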

@@ -0,0 +1,436 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.description.DescriptionFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.DescriptionTemplateFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.SectionFileTransformerModel;
import eu.eudat.file.transformer.models.tag.TagFileTransformerModel;
import eu.eudat.file.transformer.rda.*;
import eu.eudat.file.transformer.utils.descriptionTemplate.TemplateFieldSearcher;
import eu.eudat.file.transformer.utils.json.JsonSearcher;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import java.net.URI;
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;
@Component
public class DatasetRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(DatasetRDAMapper.class);
private final ObjectMapper mapper;
@Autowired
public DatasetRDAMapper() {
this.mapper = new ObjectMapper();
}
public Dataset toRDA(DescriptionFileTransformerModel descriptionEntity, Dmp dmp) {
Dataset rda = new Dataset();
// rda.setDatasetId(DatasetIdRDAMapper.toRDA(dataset.getId()));
if (descriptionEntity.getLabel() == null) {
throw new IllegalArgumentException("Dataset Label is missing");
}
Map<String, Object> templateIdsToValues = this.createFieldIdValueMap(descriptionEntity.getDescriptionTemplate());
rda.setTitle(descriptionEntity.getLabel());
rda.setDescription(descriptionEntity.getDescription());
//rda.setAdditionalProperty("template", descriptionEntity.getDescriptionTemplate()); //TODO
try {
List<FieldFileTransformerModel> idNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.dataset_id");
if (!idNodes.isEmpty()) {
rda.setDatasetId(DatasetIdRDAMapper.toRDA(idNodes));
}
if (rda.getDatasetId() == null) {
rda.setDatasetId(new DatasetId(descriptionEntity.getId().toString(), DatasetId.Type.OTHER));
}
List<FieldFileTransformerModel> typeNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.type");
if (!typeNodes.isEmpty() && typeNodes.get(0).getData() != null && !typeNodes.get(0).getData().getValue().isEmpty()) {
rda.setType(typeNodes.get(0).getData().getValue());
} else {
rda.setType("DMP Dataset");
}
List<FieldFileTransformerModel> languageNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.language");
if (!languageNodes.isEmpty() && languageNodes.get(0).getData() != null && !languageNodes.get(0).getData().getValue().isEmpty()) {
String lang = languageNodes.get(0).getData().getValue();
try {
rda.setLanguage(Language.fromValue(lang));
}
catch (IllegalArgumentException e){
//TODO
logger.warn("Language " + lang + " from semantic rda.dataset.language was not found. Setting '" + descriptionEntity.getDescriptionTemplate().getLanguage() +"' as language from the dataset profile.");
rda.setLanguage(LanguageRDAMapper.mapLanguageIsoToRDAIso(descriptionEntity.getDescriptionTemplate().getLanguage()));
}
} else {
//TODO
rda.setLanguage(LanguageRDAMapper.mapLanguageIsoToRDAIso(descriptionEntity.getDescriptionTemplate().getLanguage()));
}
List<FieldFileTransformerModel> metadataNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.metadata");
if (!metadataNodes.isEmpty()) {
rda.setMetadata(MetadataRDAMapper.toRDAList(metadataNodes));
}else{
rda.setMetadata(new ArrayList<>());
}
List<FieldFileTransformerModel> qaNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.data_quality_assurance");
if (!qaNodes.isEmpty()) {
rda.setDataQualityAssurance(qaNodes.stream().filter(qaNode -> qaNode.getData() != null).map(qaNode -> qaNode.getData().getValue()).collect(Collectors.toList()));
for (int i = 0; i < qaNodes.size(); i++) {
rda.setAdditionalProperty("qaId" + (i + 1), qaNodes.get(i).getId());
}
List<String> qaList = new ArrayList<>();
String qa;
for(FieldFileTransformerModel node: qaNodes){
if (node.getData() == null) {
continue;
}
JsonNode valueNode = mapper.readTree(node.getData().getValue());
if(valueNode.isArray()){
Iterator<JsonNode> iter = valueNode.elements();
while(iter.hasNext()) {
qa = iter.next().asText();
qaList.add(qa);
}
}
}
String data_quality;
for(FieldFileTransformerModel dqa: qaNodes){
if (dqa.getData() == null) {
continue;
}
data_quality = dqa.getData().getValue();
if(!data_quality.isEmpty()){
qaList.add(data_quality);
rda.setAdditionalProperty("otherDQAID", dqa.getId());
rda.setAdditionalProperty("otherDQA", data_quality);
break;
}
}
rda.setDataQualityAssurance(qaList);
}else{
rda.setDataQualityAssurance(new ArrayList<>());
}
List<FieldFileTransformerModel> preservationNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.preservation_statement");
if (!preservationNodes.isEmpty() && preservationNodes.get(0).getData() != null && !preservationNodes.get(0).getData().getValue().isEmpty()) {
rda.setPreservationStatement(preservationNodes.get(0).getData().getValue());
}
List<FieldFileTransformerModel> distributionNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.distribution");
if (!distributionNodes.isEmpty()) {
rda.setDistribution(DistributionRDAMapper.toRDAList(distributionNodes));
}else{
rda.setDistribution(new ArrayList<>());
}
List<FieldFileTransformerModel> keywordNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.keyword");
if (!keywordNodes.isEmpty()) {
rda.setKeyword(keywordNodes.stream().filter(keywordNode -> keywordNode.getData() != null).map(keywordNode -> {
try {
JsonNode value = mapper.readTree(keywordNode.getData().getValue());
if (value.isArray()) {
return StreamSupport.stream(value.spliterator(), false).map(node -> KeywordRDAMapper.toRDA(node.toString())).flatMap(Collection::stream).collect(Collectors.toList());
} else {
return KeywordRDAMapper.toRDA(keywordNode.getData().getValue());
}
}catch (JsonProcessingException e) {
logger.error(e.getMessage(), e);
return null;
}
}).filter(Objects::nonNull).flatMap(Collection::stream).collect(Collectors.toList()));
for (int i = 0; i < keywordNodes.size(); i++) {
rda.setAdditionalProperty("keyword" + (i + 1), keywordNodes.get(i).getId());
}
}
// else if (apiContext.getOperationsContext().getElasticRepository().getDatasetRepository().exists()) { //TODO
// List<String> tags = apiContext.getOperationsContext().getElasticRepository().getDatasetRepository().findDocument(descriptionEntity.getId().toString()).getTags().stream().map(Tag::getName).collect(Collectors.toList());
// rda.setKeyword(tags);
// }
List<FieldFileTransformerModel> personalDataNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.personal_data");
if (!personalDataNodes.isEmpty()) {
                try{
                    rda.setPersonalData(personalDataNodes.stream().filter(personalDataNode -> personalDataNode.getData() != null).map(personalDataNode -> Dataset.PersonalData.fromValue(personalDataNode.getData().getValue())).findFirst().orElse(Dataset.PersonalData.UNKNOWN));
                }catch(IllegalArgumentException e){
                    rda.setPersonalData(Dataset.PersonalData.UNKNOWN);
                }
} else {
rda.setPersonalData(Dataset.PersonalData.UNKNOWN);
}
List<FieldFileTransformerModel> securityAndPrivacyNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.security_and_privacy");
if (!securityAndPrivacyNodes.isEmpty()) {
rda.setSecurityAndPrivacy(SecurityAndPrivacyRDAMapper.toRDAList(securityAndPrivacyNodes));
}else{
rda.setSecurityAndPrivacy(new ArrayList<>());
}
List<FieldFileTransformerModel> sensitiveDataNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.sensitive_data");
if (!sensitiveDataNodes.isEmpty()) {
                try{
                    rda.setSensitiveData(sensitiveDataNodes.stream().filter(sensitiveDataNode -> sensitiveDataNode.getData() != null).map(sensitiveDataNode -> Dataset.SensitiveData.fromValue(sensitiveDataNode.getData().getValue())).findFirst().orElse(Dataset.SensitiveData.UNKNOWN));
                }catch(IllegalArgumentException e){
                    rda.setSensitiveData(Dataset.SensitiveData.UNKNOWN);
                }
} else {
rda.setSensitiveData(Dataset.SensitiveData.UNKNOWN);
}
List<FieldFileTransformerModel> technicalResourceNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.technical_resource");
if (!technicalResourceNodes.isEmpty()) {
rda.setTechnicalResource(TechnicalResourceRDAMapper.toRDAList(technicalResourceNodes));
}else{
rda.setTechnicalResource(new ArrayList<>());
}
List<FieldFileTransformerModel> issuedNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dataset.issued");
if (!issuedNodes.isEmpty() && issuedNodes.get(0).getData() != null && !issuedNodes.get(0).getData().getValue().isEmpty()) {
rda.setIssued(issuedNodes.get(0).getData().getValue());
}
List<FieldFileTransformerModel> contributorNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dmp.contributor");
if (!contributorNodes.isEmpty()) {
dmp.getContributor().addAll(contributorNodes.stream().filter(contributorNode -> contributorNode.getData() != null).map(contributorNode -> {
try {
JsonNode value = mapper.readTree(contributorNode.getData().getValue());
if (value.isArray()) {
return StreamSupport.stream(value.spliterator(), false).map(node -> ContributorRDAMapper.toRDA(node.asText())).collect(Collectors.toList());
} else {
return Collections.singletonList(new Contributor());
}
}catch (JsonProcessingException e) {
return null;
}
}).filter(Objects::nonNull).flatMap(Collection::stream).toList());
dmp.setContributor(dmp.getContributor().stream().filter(contributor -> contributor.getContributorId() != null && contributor.getName() != null).collect(Collectors.toList()));
}
List<FieldFileTransformerModel> costNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dmp.cost");
if (!costNodes.isEmpty()) {
dmp.getCost().addAll(CostRDAMapper.toRDAList(costNodes));
}
List<FieldFileTransformerModel> ethicsNodes = TemplateFieldSearcher.searchFields(descriptionEntity.getDescriptionTemplate(), "schematics", "rda.dmp.ethical_issues");
if (!ethicsNodes.isEmpty()) {
for(FieldFileTransformerModel node: ethicsNodes){
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dmp.ethical_issues")).findFirst().orElse("");
if (node.getData() == null) {
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
if(rdaProperty.contains("exist")){
try {
Dmp.EthicalIssuesExist exists = dmp.getEthicalIssuesExist();
if(exists == null
|| ((exists == Dmp.EthicalIssuesExist.NO || exists == Dmp.EthicalIssuesExist.UNKNOWN) && rdaValue.equals("yes"))
|| (exists == Dmp.EthicalIssuesExist.YES && !(rdaValue.equals("no") || rdaValue.equals("unknown")))
|| (exists == Dmp.EthicalIssuesExist.UNKNOWN && rdaValue.equals("no"))){
dmp.setEthicalIssuesExist(Dmp.EthicalIssuesExist.fromValue(rdaValue));
}
}catch(IllegalArgumentException e){
logger.warn(e.getLocalizedMessage() + ". Setting ethical_issues_exist to unknown");
dmp.setEthicalIssuesExist(Dmp.EthicalIssuesExist.UNKNOWN);
}
}
else if(rdaProperty.contains("description")){
if(dmp.getEthicalIssuesDescription() == null){
dmp.setEthicalIssuesDescription(rdaValue);
}
else{
dmp.setEthicalIssuesDescription(dmp.getEthicalIssuesDescription() + ", " + rdaValue);
}
}
else if(rdaProperty.contains("report")){
try {
dmp.setEthicalIssuesReport(URI.create(rdaValue));
} catch (IllegalArgumentException e) {
logger.warn(e.getLocalizedMessage() + ". Skipping url parsing");
}
}
}
}
List<FieldFileTransformerModel> foundNodes = Stream.of(typeNodes, languageNodes, metadataNodes, qaNodes, preservationNodes, distributionNodes,
keywordNodes, personalDataNodes, securityAndPrivacyNodes, sensitiveDataNodes, technicalResourceNodes).flatMap(Collection::stream).toList();
templateIdsToValues.entrySet().forEach(entry -> {
boolean isFound = foundNodes.stream().anyMatch(node -> node.getId().equals(entry.getKey()));
if (!isFound && entry.getValue() != null && !entry.getValue().toString().isEmpty()) {
try {
Instant time = Instant.parse(entry.getValue().toString());
rda.setAdditionalProperty(entry.getKey(), DateTimeFormatter.ofPattern("yyyy-MM-dd").withZone(ZoneId.systemDefault()).format(time));
} catch (DateTimeParseException e) {
rda.setAdditionalProperty(entry.getKey(), entry.getValue());
}
}
});
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
return rda;
}
public DescriptionFileTransformerModel toEntity(Dataset rda, DescriptionTemplateFileTransformerModel defaultProfile) {
DescriptionFileTransformerModel entity = new DescriptionFileTransformerModel();
entity.setLabel(rda.getTitle());
entity.setDescription(rda.getDescription());
/*try {
DescriptionTemplateEntity profile = apiContext.getOperationsContext().getDatabaseRepository().getDatasetProfileDao().find(UUID.fromString(rda.getAdditionalProperties().get("template").toString()));
//entity.setDescriptionTemplateId(profile.getId()); //TODO
}catch(Exception e) {
logger.warn(e.getMessage(), e);*/
entity.setDescriptionTemplate(defaultProfile); //TODO
// }
try {
// PropertyDefini properties = new PropertyDefinition();
// properties.setFields(new ArrayList<>());
String datasetDescriptionJson = mapper.writeValueAsString(entity.getDescriptionTemplate());
JsonNode datasetDescriptionObj = mapper.readTree(datasetDescriptionJson);
List<FieldFileTransformerModel> typeNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.type");
if (!typeNodes.isEmpty()) {
typeNodes.get(0).getData().setValue(rda.getType());
}
List<FieldFileTransformerModel> languageNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.language");
if (!languageNodes.isEmpty() && rda.getLanguage() != null) {
languageNodes.get(0).getData().setValue(rda.getLanguage().value());
}
//TODO
/*if (rda.getMetadata() != null) {
properties.getFields().addAll(MetadataRDAMapper.toProperties(rda.getMetadata()));
}*/
//TODO
/*if (rda.getDatasetId() != null) {
properties.getFields().addAll(DatasetIdRDAMapper.toProperties(rda.getDatasetId(), datasetDescriptionObj));
}*/
/*List <String> qaIds = rda.getAdditionalProperties().entrySet().stream().filter(entry -> entry.getKey().startsWith("qaId")).map(entry -> entry.getValue().toString()).collect(Collectors.toList());
for (int i = 0; i < qaIds.size(); i++) {
properties.put(qaIds.get(i), rda.getDataQualityAssurance().get(i));
}*/
List<FieldFileTransformerModel> qaNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.data_quality_assurance");
if (!qaNodes.isEmpty() && rda.getDataQualityAssurance() != null && !rda.getDataQualityAssurance().isEmpty()) {
List<String> qas = new ArrayList<>(rda.getDataQualityAssurance());
if(!qas.isEmpty()){
qaNodes.get(0).getData().setValue(mapper.writeValueAsString(qas));
if(rda.getAdditionalProperties().containsKey("otherDQAID")){
List<FieldFileTransformerModel> subFields = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "id", (String) rda.getAdditionalProperties().get("otherDQAID"));
if (subFields != null && !subFields.isEmpty()) {
subFields.get(0).getData().setValue((String) rda.getAdditionalProperties().get("otherDQA"));
}
}
}
}
List<FieldFileTransformerModel> preservationNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.preservation_statement");
if (!preservationNodes.isEmpty()) {
preservationNodes.get(0).getData().setValue(rda.getPreservationStatement());
}
List<FieldFileTransformerModel> issuedNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.issued");
if (!issuedNodes.isEmpty()) {
issuedNodes.get(0).getData().setValue(rda.getIssued());
}
//TODO
/*if (rda.getDistribution() != null && !rda.getDistribution().isEmpty()) {
properties.getFields().addAll(DistributionRDAMapper.toProperties(rda.getDistribution().get(0), datasetDescriptionObj));
}*/
if (rda.getKeyword() != null) {
List<String> keywordIds = rda.getAdditionalProperties().entrySet().stream().filter(entry -> entry.getKey().startsWith("keyword")).map(entry -> entry.getValue().toString()).collect(Collectors.toList());
boolean takeAll = false;
if (keywordIds.size() < rda.getKeyword().size()) {
takeAll = true;
}
for (int i = 0; i < keywordIds.size(); i++) {
List<FieldFileTransformerModel> tagField = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "id", keywordIds.get(i));
if (takeAll) {
List<String> tags = new ArrayList<>();
for (String keyword : rda.getKeyword()) {
tags.add(mapper.writeValueAsString(toTagEntity(keyword)));
}
tagField.get(0).getData().setValue(String.valueOf(tags));
} else {
tagField.get(0).getData().setValue(mapper.writeValueAsString(toTagEntity(rda.getKeyword().get(i))));
}
/*properties.getFields().add(field);
Field field1 = new Field();
field1.setKey(keywordIds.get(i));
field1.setValue(rda.getKeyword().get(i));
properties.getFields().add(field1);*/
}
}
List<FieldFileTransformerModel> personalDataNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.personal_data");
if (!personalDataNodes.isEmpty()) {
personalDataNodes.get(0).getData().setValue(rda.getPersonalData().value());
}
//TODO
/*if (rda.getSecurityAndPrivacy() != null) {
properties.getFields().addAll(SecurityAndPrivacyRDAMapper.toProperties(rda.getSecurityAndPrivacy()));
}*/
List<FieldFileTransformerModel> sensitiveDataNodes = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "schematics", "rda.dataset.sensitive_data");
if (!sensitiveDataNodes.isEmpty()) {
sensitiveDataNodes.get(0).getData().setValue(rda.getSensitiveData().value());
}
//TODO
/*if (rda.getTechnicalResource() != null) {
properties.getFields().addAll(TechnicalResourceRDAMapper.toProperties(rda.getTechnicalResource()));
}*/
rda.getAdditionalProperties().entrySet().stream()
.filter(entry -> !entry.getKey().equals("template") && !entry.getKey().startsWith("qaId") && !entry.getKey().startsWith("keyword"))
.forEach(entry -> {
List<FieldFileTransformerModel> field = TemplateFieldSearcher.searchFields(entity.getDescriptionTemplate(), "id", entry.getKey());
field.get(0).getData().setValue((String) entry.getValue());
});
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
return entity;
}
private static TagFileTransformerModel toTagEntity(String name) {
TagFileTransformerModel tag = new TagFileTransformerModel();
tag.setId(UUID.randomUUID());
tag.setLabel(name);
return tag;
}
private Map<String, Object> createFieldIdValueMap(DescriptionTemplateFileTransformerModel template) {
Map<String, Object> result = new HashMap<>();
template.getDefinition().getPages().forEach(page -> page.getSections().forEach(section -> result.putAll(createFieldIdValueMapFromSection(section))));
return result;
}
private Map<String, Object> createFieldIdValueMapFromSection(SectionFileTransformerModel section) {
Map<String, Object> result = new HashMap<>();
if (section.getSections() != null && !section.getSections().isEmpty()) {
section.getSections().forEach(subSection -> result.putAll(createFieldIdValueMapFromSection(subSection)));
}
if (section.getFieldSets() != null && !section.getFieldSets().isEmpty()) {
section.getFieldSets().stream().filter(fieldSet -> fieldSet.getFields() != null && !fieldSet.getFields().isEmpty())
.forEach(fieldSet -> fieldSet.getFields().stream().filter(field -> field.getData() != null).forEach(field -> result.put(field.getId(), field.getData().getValue())));
}
return result;
}
}

View File

@ -0,0 +1,439 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.Distribution;
import eu.eudat.file.transformer.rda.License;
import eu.eudat.file.transformer.utils.json.JsonSearcher;
import eu.eudat.file.transformer.utils.string.MyStringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.net.URI;
import java.util.*;
import java.util.stream.Collectors;
public class DistributionRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(DistributionRDAMapper.class);
private static final ObjectMapper mapper = new ObjectMapper();
public static List<Distribution> toRDAList(List<FieldFileTransformerModel> nodes) {
Map<String, Distribution> rdaMap = new HashMap<>();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = getRdaDistributionProperty(node);
if(rdaProperty.isEmpty() || node.getData() == null){
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
String key = node.getNumbering();
if(!key.contains("mult")){
key = "0";
}
else{
// numbering is of the form "multN..."; keep the digit right after the "mult" prefix
key = "" + key.charAt(4);
}
Distribution rda;
if(rdaMap.containsKey(key)){
rda = rdaMap.get(key);
}
else {
rda = new Distribution();
rdaMap.put(key, rda);
}
/* Distribution rda = getRelative(rdaMap, node.get("numbering").asText());
if (!rdaMap.containsValue(rda)) {
rdaMap.put(node.get("numbering").asText(), rda);
} */
for (ExportPropertyName exportPropertyName : ExportPropertyName.values()) {
if (rdaProperty.contains(exportPropertyName.getName())) {
switch (exportPropertyName) {
case ACCESS_URL:
rda.setAccessUrl(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.ACCESS_URL.getName(), node.getId());
break;
case AVAILABLE_UNTIL:
rda.setAvailableUntil(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.AVAILABLE_UNTIL.getName(), node.getId());
break;
case DOWNLOAD_URL:
rda.setDownloadUrl(URI.create(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.DOWNLOAD_URL.getName(), node.getId());
break;
case DESCRIPTION:
if(!rdaProperty.contains("host")) {
rda.setDescription(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.DESCRIPTION.getName(), node.getId());
}
break;
case DATA_ACCESS:
try {
rda.setDataAccess(Distribution.DataAccess.fromValue(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.DATA_ACCESS.getName(), node.getId());
}
catch (IllegalArgumentException e) {
logger.warn("Distribution data access " + rdaValue + " from semantic distribution.data_access is not valid. Data access will not be set.");
}
break;
case BYTE_SIZE:
try {
rda.setByteSize(Integer.parseInt(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.BYTE_SIZE.getName(), node.getId());
} catch (NumberFormatException e) {
logger.warn("Distribution byte size " + rdaValue + " is not a valid integer. Byte size will not be set.");
}
break;
case LICENSE:
List<FieldFileTransformerModel> licenseNodes = nodes.stream().filter(lnode -> {
//if(lnode.get("schematics").isArray()){
for(String schematic: lnode.getSchematics()){
if(schematic.startsWith("rda.dataset.distribution.license")){
return true;
}
}
//}
return false;
}).collect(Collectors.toList());
License license = LicenseRDAMapper.toRDA(licenseNodes);
rda.setLicense(license != null? Collections.singletonList(license): new ArrayList<>());
break;
case FORMAT:
try {
JsonNode valueNode = mapper.readTree(node.getData().getValue());
if(valueNode.isArray()){
Iterator<JsonNode> iter = valueNode.elements();
List<String> formats = new ArrayList<>();
int i = 1;
while(iter.hasNext()) {
JsonNode current = iter.next();
String format = current.toString();
Map<String, String> result = mapper.readValue(format, HashMap.class);
format = result.get("label");
formats.add(format);
rda.setAdditionalProperty("format" + i++, mapper.readTree(current.toString()));
}
rda.setFormat(formats);
}
else{
if(rda.getFormat() == null || rda.getFormat().isEmpty()){
rda.setFormat(new ArrayList<>(Arrays.asList(rdaValue.replace(" ", "").split(","))));
}
else{
rda.getFormat().addAll(Arrays.asList(rdaValue.replace(" ", "").split(",")));
}
}
rda.setAdditionalProperty(ImportPropertyName.FORMAT.getName(), node.getId());
}
catch(JsonProcessingException e){
logger.warn(e.getMessage());
}
break;
case TITLE:
if(!rdaProperty.contains("host")) {
rda.setTitle(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.TITLE.getName(), node.getId());
}
break;
case HOST:
rda.setHost(HostRDAMapper.toRDA(nodes, node.getNumbering()));
break;
}
}
}
}
return rdaMap.values().stream()
.filter(distro -> distro.getTitle() != null).collect(Collectors.toList());
}
//TODO
/*public static List<Field> toProperties(List<Distribution> rdas) {
List<Field> properties = new ArrayList<>();
rdas.forEach(rda -> {
rda.getAdditionalProperties().entrySet().forEach(entry -> {
try {
Field field = new Field();
field.setKey(entry.getValue().toString());
ImportPropertyName importPropertyName = ImportPropertyName.fromString(entry.getKey());
switch (importPropertyName) {
case ACCESS_URL:
field.setValue(rda.getAccessUrl());
break;
case TITLE:
field.setValue(rda.getTitle());
break;
case DESCRIPTION:
field.setValue(rda.getDescription());
break;
case FORMAT:
field.setValue(rda.getFormat().get(0));
break;
case BYTE_SIZE:
field.setValue(rda.getByteSize().toString());
break;
case DATA_ACCESS:
field.setValue(rda.getDataAccess().value());
break;
case DOWNLOAD_URL:
field.setValue(rda.getDownloadUrl().toString());
break;
case AVAILABLE_UNTIL:
field.setValue(rda.getAvailableUntil());
break;
}
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
});
if (rda.getHost() != null) {
properties.addAll(HostRDAMapper.toProperties(rda.getHost()));
}
if (rda.getLicense() != null && !rda.getLicense().isEmpty()) {
properties.addAll(LicenseRDAMapper.toProperties(rda.getLicense()));
}
});
return properties;
}
public static List<Field> toProperties(Distribution rda, JsonNode root) {
List<Field> properties = new ArrayList<>();
List<JsonNode> distributionNodes = JsonSearcher.findNodes(root, "schematics", "rda.dataset.distribution");
for (JsonNode distributionNode: distributionNodes) {
for (ExportPropertyName exportPropertyName: ExportPropertyName.values()) {
JsonNode schematics = distributionNode.get("schematics");
if(schematics.isArray()){
for(JsonNode schematic: schematics){
Field field = new Field();
field.setKey(distributionNode.get("id").asText());
if(schematic.asText().contains(exportPropertyName.getName())){
switch (exportPropertyName) {
case ACCESS_URL:
field.setValue(rda.getAccessUrl());
break;
case DESCRIPTION:
field.setValue(rda.getDescription());
break;
case TITLE:
field.setValue(rda.getTitle());
break;
case AVAILABLE_UNTIL:
field.setValue(rda.getAvailableUntil());
break;
case DOWNLOAD_URL:
if (rda.getDownloadUrl() != null) {
field.setValue(rda.getDownloadUrl().toString());
}
break;
case DATA_ACCESS:
field.setValue(rda.getDataAccess().value());
break;
case BYTE_SIZE:
if (rda.getByteSize() != null) {
field.setValue(rda.getByteSize().toString());
}
break;
case FORMAT:
if (rda.getFormat() != null && !rda.getFormat().isEmpty()) {
String style = distributionNode.get("viewStyle").get("renderStyle").asText();
if(style.equals("combobox")) {
if (distributionNode.get("data").get("type").asText().equals("autocomplete")) {
Map<String, Object> additionalProperties = rda.getAdditionalProperties();
List<Object> standardFormats = new ArrayList<>();
rda.getAdditionalProperties().forEach((key, value) -> {
try {
if (key.matches("format\\d+")) {
standardFormats.add(additionalProperties.get(key));
Field field1 = new Field();
field1.setKey(distributionNode.get("id").asText());
field1.setValue(mapper.writeValueAsString(standardFormats));
properties.add(field1);
}
} catch (JsonProcessingException e) {
logger.error(e.getMessage(), e);
}
});
}
}
else if(style.equals("freetext")){
field.setValue(String.join(", ", rda.getFormat()));
}
}
break;
case LICENSE:
if (rda.getLicense() != null && !rda.getLicense().isEmpty()) {
properties.addAll(LicenseRDAMapper.toProperties(rda.getLicense().get(0), root));
}
break;
case HOST:
if (rda.getHost() != null) {
properties.addAll(HostRDAMapper.toProperties(rda.getHost()));
}
break;
}
if (field.getValue() != null) {
properties.add(field);
}
break;
}
}
}
}
}
return properties;
}*/
public static Distribution toRDA(List<FieldFileTransformerModel> nodes) {
Distribution rda = new Distribution();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = getRdaDistributionProperty(node);
if(rdaProperty.isEmpty() || node.getData() == null){
continue;
}
String rdaValue = node.getData().getValue();
for (ExportPropertyName exportPropertyName: ExportPropertyName.values()) {
if (rdaProperty.contains(exportPropertyName.getName())) {
switch (exportPropertyName) {
case ACCESS_URL:
rda.setAccessUrl(rdaValue);
break;
case DESCRIPTION:
rda.setDescription(rdaValue);
break;
case TITLE:
rda.setTitle(rdaValue);
break;
case AVAILABLE_UNTIL:
rda.setAvailableUntil(rdaValue);
break;
case DOWNLOAD_URL:
rda.setDownloadUrl(URI.create(rdaValue));
break;
case DATA_ACCESS:
rda.setDataAccess(Distribution.DataAccess.fromValue(rdaValue));
break;
case BYTE_SIZE:
rda.setByteSize(Integer.parseInt(rdaValue));
break;
case FORMAT:
rda.setFormat(Collections.singletonList(rdaValue));
break;
case LICENSE:
List<FieldFileTransformerModel> licenseNodes = nodes.stream().filter(lnode -> lnode.getSchematics().stream().anyMatch(schematic -> schematic.startsWith("rda.dataset.distribution.license"))).collect(Collectors.toList());
rda.setLicense(Collections.singletonList(LicenseRDAMapper.toRDA(licenseNodes)));
break;
case HOST:
List<FieldFileTransformerModel> hostNodes = nodes.stream().filter(lnode -> lnode.getSchematics().stream().anyMatch(schematic -> schematic.startsWith("rda.dataset.distribution.host"))).collect(Collectors.toList());
rda.setHost(HostRDAMapper.toRDA(hostNodes, "0"));
break;
}
}
}
/*if (rdaProperty.contains("access_url")) {
rda.setAccessUrl(rdaValue);
} else if (rdaProperty.contains("available_util")) {
rda.setAvailableUntil(rdaValue);
} else if (rdaProperty.contains("byte_size")) {
rda.setByteSize(Integer.parseInt(rdaValue));
} else if (rdaProperty.contains("data_access")) {
rda.setDataAccess(Distribution.DataAccess.fromValue(rdaValue));
} else if (rdaProperty.contains("description")) {
rda.setDescription(rdaValue);
} else if (rdaProperty.contains("download_url")) {
rda.setDownloadUrl(URI.create(rdaValue));
} else if (rdaProperty.contains("format")) {
rda.setFormat(Collections.singletonList(rdaValue));
} else if (rdaProperty.contains("host")) {
// rda.setHost(HostRDAMapper.toRDA(node));
} else if (rdaProperty.contains("license")) {
rda.setLicense(Collections.singletonList(LicenseRDAMapper.toRDA(node)));
} else if (rdaProperty.contains("title")) {
rda.setTitle(rdaValue);
}*/
}
if (rda.getTitle() == null) {
throw new IllegalArgumentException("Distribution title is missing");
}
if (rda.getDataAccess() == null) {
throw new IllegalArgumentException("Distribution Data Access is missing");
}
return rda;
}
private static String getRdaDistributionProperty(FieldFileTransformerModel node) {
return node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.distribution")).findFirst().orElse("");
}
private static Distribution getRelative(Map<String, Distribution> rdaMap, String numbering) {
return rdaMap.entrySet().stream().filter(entry -> MyStringUtils.getFirstDifference(entry.getKey(), numbering) > 0)
.max(Comparator.comparingInt(entry -> MyStringUtils.getFirstDifference(entry.getKey(), numbering))).map(Map.Entry::getValue).orElse(new Distribution());
}
private enum ExportPropertyName {
ACCESS_URL("access_url"),
AVAILABLE_UNTIL("available_until"),
BYTE_SIZE("byte_size"),
DATA_ACCESS("data_access"),
DESCRIPTION("description"),
DOWNLOAD_URL("download_url"),
FORMAT("format"),
HOST("host"),
LICENSE("license"),
TITLE("title");
private final String name;
ExportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
private enum ImportPropertyName {
ACCESS_URL("accessurlId"),
AVAILABLE_UNTIL("availableUtilId"),
BYTE_SIZE("byteSizeId"),
DATA_ACCESS("dataAccessId"),
DESCRIPTION("descriptionId"),
DOWNLOAD_URL("downloadUrlId"),
FORMAT("formatId"),
/*HOST("host"),
LICENSE("license"),*/
TITLE("titleId");
private final String name;
ImportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
public static ImportPropertyName fromString(String name) throws Exception {
for (ImportPropertyName importPropertyName: ImportPropertyName.values()) {
if (importPropertyName.getName().equals(name)) {
return importPropertyName;
}
}
throw new Exception("No ImportPropertyName found for name: " + name);
}
}
}
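The `toRDAList` method above buckets distribution fields by a key derived from the field numbering: numberings containing "mult" contribute the single character at index 4 (the digit right after the "mult" prefix), while everything else collapses into bucket "0". A minimal standalone sketch of that rule (class and method names here are hypothetical, not part of this commit):

```java
public class NumberingKeyDemo {
    // Mirrors the key extraction in DistributionRDAMapper.toRDAList:
    // "multN..." -> "N", anything without "mult" -> "0"
    static String bucketKey(String numbering) {
        if (!numbering.contains("mult")) {
            return "0";
        }
        // index 4 is the character immediately after the "mult" prefix
        return "" + numbering.charAt(4);
    }

    public static void main(String[] args) {
        System.out.println(bucketKey("mult3_field2")); // 3
        System.out.println(bucketKey("field7"));       // 0
    }
}
```

Note the sketch, like the original, assumes a single-digit multiplicity index; "mult10..." would still yield "1".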

View File

@ -0,0 +1,19 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.rda.DmpId;
import java.util.UUID;
public class DmpIdRDAMapper {
public static DmpId toRDA(Object id) {
DmpId rda = new DmpId();
rda.setIdentifier(id.toString());
if (id instanceof UUID) {
rda.setType(DmpId.Type.OTHER);
} else {
rda.setType(DmpId.Type.DOI);
}
return rda;
}
}

View File

@ -0,0 +1,209 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.eudat.file.transformer.enums.ReferenceType;
import eu.eudat.file.transformer.models.descriptiontemplate.DescriptionTemplateFileTransformerModel;
import eu.eudat.file.transformer.models.dmp.DmpFileTransformerModel;
import eu.eudat.file.transformer.models.dmp.DmpReferenceFileTransformerModel;
import eu.eudat.file.transformer.models.dmp.DmpUserFileTransformerModel;
import eu.eudat.file.transformer.models.entitydoi.EntityDoiFileTransformerModel;
import eu.eudat.file.transformer.models.reference.ReferenceFileTransformerModel;
import eu.eudat.file.transformer.models.user.UserFileTransformerModel;
import eu.eudat.file.transformer.rda.Dmp;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import javax.management.InvalidApplicationException;
import java.io.IOException;
import java.util.*;
@Component
public class DmpRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(DmpRDAMapper.class);
private DatasetRDAMapper datasetRDAMapper;
private final ObjectMapper mapper;
@Autowired
public DmpRDAMapper(DatasetRDAMapper datasetRDAMapper) throws IOException {
this.datasetRDAMapper = datasetRDAMapper;
this.mapper = new ObjectMapper();
}
public Dmp toRDA(DmpFileTransformerModel dmp) throws InvalidApplicationException, JsonProcessingException {
List<ReferenceFileTransformerModel> grants = new ArrayList<>();
List<ReferenceFileTransformerModel> researchers = new ArrayList<>();
List<ReferenceFileTransformerModel> organizations = new ArrayList<>();
List<ReferenceFileTransformerModel> funders = new ArrayList<>();
List<ReferenceFileTransformerModel> projects = new ArrayList<>();
if (dmp.getDmpReferences() != null) {
grants = dmp.getDmpReferences().stream().map(DmpReferenceFileTransformerModel::getReference).filter(referenceFileModel -> referenceFileModel.getType().equals(ReferenceType.Grants)).toList();
researchers = dmp.getDmpReferences().stream().map(DmpReferenceFileTransformerModel::getReference).filter(reference -> reference.getType().equals(ReferenceType.Researcher)).toList();
organizations = dmp.getDmpReferences().stream().map(DmpReferenceFileTransformerModel::getReference).filter(referenceFileModel -> referenceFileModel.getType().equals(ReferenceType.Organizations)).toList();
funders = dmp.getDmpReferences().stream().map(DmpReferenceFileTransformerModel::getReference).filter(referenceFileModel -> referenceFileModel.getType().equals(ReferenceType.Funder)).toList();
projects = dmp.getDmpReferences().stream().map(DmpReferenceFileTransformerModel::getReference).filter(reference -> reference.getType().equals(ReferenceType.Project)).toList();
}
if (dmp.getDescriptions() == null || dmp.getDescriptions().isEmpty()) { //TODO
throw new IllegalArgumentException("DMP has no Datasets");
}
Map<String, Object> extraProperties;
if (dmp.getProperties() == null) {
throw new IllegalArgumentException("DMP is missing language and contact properties");
} else {
extraProperties = mapper.readValue(dmp.getProperties(), HashMap.class);
/*if (extraProperties.get("language") == null) {
throw new IllegalArgumentException("DMP must have it's language property defined");
}*/
if (extraProperties.get("contacts") == null) {
throw new IllegalArgumentException("DMP must have its contact property defined");
}
}
Dmp rda = new Dmp();
if (dmp.getEntityDois() != null && !dmp.getEntityDois().isEmpty()) {
for(EntityDoiFileTransformerModel doi: dmp.getEntityDois()){
if(doi.getRepositoryId().equals("Zenodo")){
rda.setDmpId(DmpIdRDAMapper.toRDA(doi.getDoi()));
}
}
} else {
rda.setDmpId(DmpIdRDAMapper.toRDA(dmp.getId()));
}
if (dmp.getCreatedAt() == null) {
throw new IllegalArgumentException("DMP Created is missing");
}
if (dmp.getUpdatedAt() == null) {
throw new IllegalArgumentException("DMP Modified is missing");
}
if (dmp.getLabel() == null) {
throw new IllegalArgumentException("DMP Label is missing");
}
rda.setCreated(dmp.getCreatedAt()); //TODO
rda.setDescription(dmp.getDescription());
rda.setModified(dmp.getUpdatedAt());
rda.setTitle(dmp.getLabel());
rda.setLanguage(LanguageRDAMapper.mapLanguageIsoToRDAIso(dmp.getLanguage() != null ? dmp.getLanguage() : "en"));
if (!extraProperties.isEmpty()) {
if (extraProperties.get("ethicalIssues") != null) {
rda.setEthicalIssuesExist(Dmp.EthicalIssuesExist.fromValue(extraProperties.get("ethicalIssues").toString()));
} else {
rda.setEthicalIssuesExist(Dmp.EthicalIssuesExist.UNKNOWN);
}
if (extraProperties.get("costs") != null) {
rda.setCost(new ArrayList<>());
((List) extraProperties.get("costs")).forEach(costl -> {
try {
rda.getCost().add(CostRDAMapper.toRDA((Map)costl));
} catch (JsonProcessingException e) {
logger.error(e.getMessage(), e);
}
});
}
UUID contactId = UUID.fromString((String) ((List<Map<String, Object>>) extraProperties.get("contacts")).get(0).get("userId"));
// UUID.fromString never returns null (it throws on invalid input), so only the resolved user needs a null check
UserFileTransformerModel userContact = dmp.getDmpUsers().stream().map(DmpUserFileTransformerModel::getUser)
.filter(userFileModel -> userFileModel.getId().equals(contactId))
.findFirst().orElse(null);
if (userContact != null) {
rda.setContact(ContactRDAMapper.toRDA(userContact));
}
}
/*UserInfo creator;
if (dmp.getCreator() != null) {
creator = dmp.getCreator();
} else {
creator = dmp.getUsers().stream().filter(userDMP -> userDMP.getRole().equals(UserDMP.UserDMPRoles.OWNER.getValue())).map(UserDMP::getUser).findFirst().orElse(new UserInfo());
}
rda.setContact(ContactRDAMapper.toRDA(creator));*/
rda.setContributor(new ArrayList<>());
if (!researchers.isEmpty()) {
rda.getContributor().addAll(researchers.stream().map(ContributorRDAMapper::toRDA).toList());
}
rda.getContributor().addAll(dmp.getDmpUsers().stream().map(ContributorRDAMapper::toRDA).toList());
rda.setDataset(dmp.getDescriptions().stream().map(dataset -> datasetRDAMapper.toRDA(dataset, rda)).toList());
if (!projects.isEmpty() && !grants.isEmpty() && !funders.isEmpty()) {
rda.setProject(List.of(ProjectRDAMapper.toRDA(projects.get(0), grants.get(0), funders.get(0))));
}
rda.setAdditionalProperty("templates", dmp.getDescriptions().stream().map(descriptionFileTransformerModel -> descriptionFileTransformerModel.getDescriptionTemplate().getId().toString()).toList());
return rda;
}
public DmpFileTransformerModel toEntity(Dmp rda, List<DescriptionTemplateFileTransformerModel> profiles) throws InvalidApplicationException, JsonProcessingException {
DmpFileTransformerModel entity = new DmpFileTransformerModel();
entity.setLabel(rda.getTitle());
/*if (rda.getDmpId().getType() == DmpId.Type.DOI) { //TODO
try {
//TODO: Find from doi = rda.getDmpId().getIdentifier()
EntityDoi doi = new EntityDoi();
List<EntityDoi> dois = new ArrayList<>();
dois.add(doi);
entity.setEntityDois(dois);
}
catch (NoResultException e) {
logger.warn("No entity doi: " + rda.getDmpId().getIdentifier() + " found in database. No dois are added to dmp.");
entity.setDois(new HashSet<>());
}
}*/
/*if (((List<String>) rda.getAdditionalProperties().get("templates")) != null && !((List<String>) rda.getAdditionalProperties().get("templates")).isEmpty() && entity.getId() != null) {
entity.setAssociatedDmps(((List<String>) rda.getAdditionalProperties().get("templates")).stream().map(x -> {
try {
return this.getProfile(x, entity.getId());
} catch (InvalidApplicationException e) {
throw new RuntimeException(e);
}
}).filter(Objects::nonNull).collect(Collectors.toSet()));
}*/
/*if (entity.getAssociatedDmps() == null) {
entity.setAssociatedDmps(new HashSet<>());
}*/
/*if (profiles != null && entity.getId() != null) {
for (String profile : profiles) {
entity.getAssociatedDmps().add(this.getProfile(profile, entity.getId()));
}
}*/
entity.setDmpReferences(new ArrayList<>());
if (rda.getContributor() != null && !rda.getContributor().isEmpty() && rda.getContributor().get(0).getContributorId() != null) {
entity.getDmpReferences().addAll(rda.getContributor().stream().filter(r -> r.getContributorId() != null).map(ContributorRDAMapper::toEntity)
.map(reference -> {
DmpReferenceFileTransformerModel dmpReference = new DmpReferenceFileTransformerModel();
dmpReference.setReference(reference);
return dmpReference;
}).toList());
}
entity.setCreatedAt(rda.getCreated());
entity.setUpdatedAt(rda.getModified());
entity.setDescription(rda.getDescription());
entity.setDescriptions(rda.getDataset().stream().map(rda1 -> datasetRDAMapper.toEntity(rda1, profiles.get(0))).toList());
if (!rda.getProject().isEmpty()) {
entity.getDmpReferences().addAll(ProjectRDAMapper.toEntity(rda.getProject().get(0)).stream()
.map(reference -> {
DmpReferenceFileTransformerModel dmpReference = new DmpReferenceFileTransformerModel();
dmpReference.setReference(reference);
return dmpReference;
}).toList());
}
Map<String, Object> extraProperties = new HashMap<>();
extraProperties.put("language", LanguageRDAMapper.mapRDAIsoToLanguageIso(rda.getLanguage()));
entity.setProperties(mapper.writeValueAsString(extraProperties));
return entity;
}
// private DmpDescriptionTemplateEntity getProfile(String descriptionTemplateId, UUID dmpId) throws InvalidApplicationException {
// return this.queryFactory.query(DmpDescriptionTemplateQuery.class).dmpIds(dmpId).descriptionTemplateIds(UUID.fromString(descriptionTemplateId)).first();
// }
}

View File

@ -0,0 +1,19 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.rda.FunderId;
import java.util.UUID;
public class FunderIdRDAMapper {
public static FunderId toRDA(Object id) {
FunderId rda = new FunderId();
rda.setIdentifier(id.toString());
if (id instanceof UUID) {
rda.setType(FunderId.Type.OTHER);
} else {
rda.setType(FunderId.Type.FUNDREF);
}
return rda;
}
}


@ -0,0 +1,48 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.enums.ReferenceType;
import eu.eudat.file.transformer.models.reference.ReferenceFileTransformerModel;
import eu.eudat.file.transformer.rda.Funding;
import java.util.ArrayList;
import java.util.List;
public class FundingRDAMapper {
public static Funding toRDA(ReferenceFileTransformerModel grant, ReferenceFileTransformerModel funder) {
Funding rda = new Funding();
String referencePrefix;
String shortReference;
int prefixLength = 0;
if (funder.getReference() != null) {
referencePrefix = funder.getReference().split(":")[0];
prefixLength = referencePrefix.length() == funder.getReference().length() ? referencePrefix.length() - 1 : referencePrefix.length();
shortReference = funder.getReference().substring(prefixLength + 1);
rda.setFunderId(FunderIdRDAMapper.toRDA(shortReference));
} else {
rda.setFunderId(FunderIdRDAMapper.toRDA(funder.getId()));
}
if (grant.getReference() != null) {
referencePrefix = grant.getReference().split(":")[0];
prefixLength = referencePrefix.length() == grant.getReference().length() ? referencePrefix.length() - 1 : referencePrefix.length();
shortReference = grant.getReference().substring(prefixLength + 1);
rda.setGrantId(GrantIdRDAMapper.toRDA(shortReference));
} else {
rda.setGrantId(GrantIdRDAMapper.toRDA(grant.getId().toString()));
}
return rda;
}
public static List<ReferenceFileTransformerModel> toEntity(Funding rda) {
List<ReferenceFileTransformerModel> references = new ArrayList<>();
ReferenceFileTransformerModel funder = new ReferenceFileTransformerModel();
funder.setType(ReferenceType.Funder);
funder.setReference(rda.getFunderId().getIdentifier());
references.add(funder);
ReferenceFileTransformerModel grant = new ReferenceFileTransformerModel();
grant.setType(ReferenceType.Grants);
grant.setReference(rda.getGrantId().getIdentifier());
references.add(grant);
return references;
}
}
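The prefix handling in `FundingRDAMapper` (splitting a reference on its first colon and keeping the remainder) can be sketched standalone. The sample reference strings below are hypothetical; note that an input with no colon at all ends up with an empty short reference, since the substring starts past the last character:

```java
// Standalone sketch of FundingRDAMapper's prefix stripping. A reference such
// as "fundref:10.13039/501100000780" keeps only the part after the first ':'.
// When no ':' is present, split(":")[0] is the whole string, so the
// prefixLength - 1 branch makes substring(prefixLength + 1) yield "".
// Sample inputs are hypothetical, not taken from real data.
public class ReferencePrefixSketch {
    static String shortReference(String reference) {
        String referencePrefix = reference.split(":")[0];
        int prefixLength = referencePrefix.length() == reference.length()
                ? referencePrefix.length() - 1   // no colon in the reference
                : referencePrefix.length();
        return reference.substring(prefixLength + 1);
    }

    public static void main(String[] args) {
        System.out.println(shortReference("fundref:10.13039/501100000780")); // 10.13039/501100000780
        System.out.println(shortReference("grantid:777536"));                // 777536
    }
}
```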


@ -0,0 +1,13 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.rda.GrantId;
public class GrantIdRDAMapper {
public static GrantId toRDA(String id) {
GrantId rda = new GrantId();
rda.setIdentifier(id);
rda.setType(GrantId.Type.OTHER);
return rda;
}
}


@ -0,0 +1,262 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.Host;
import eu.eudat.file.transformer.rda.PidSystem;
import eu.eudat.file.transformer.utils.string.MyStringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.net.URI;
import java.util.*;
import java.util.stream.Collectors;
public class HostRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(HostRDAMapper.class);
public static Host toRDA(List<FieldFileTransformerModel> nodes, String numbering) {
ObjectMapper mapper = new ObjectMapper();
Host rda = new Host();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.distribution.host")).findFirst().orElse("");
if (rdaProperty.contains("host")) {
int firstDiff = MyStringUtils.getFirstDifference(numbering, node.getNumbering());
if (firstDiff == -1 || firstDiff >= 2) {
if (node.getData() == null) {
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
for (ExportPropertyName propertyName: ExportPropertyName.values()) {
if (rdaProperty.contains(propertyName.getName())) {
switch (propertyName) {
case AVAILABILITY:
rda.setAvailability(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.AVAILABILITY.getName(), node.getId());
break;
case BACKUP_FREQUENCY:
rda.setBackupFrequency(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.BACKUP_FREQUENCY.getName(), node.getId());
break;
case BACKUP_TYPE:
rda.setBackupType(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.BACKUP_TYPE.getName(), node.getId());
break;
case CERTIFIED_WITH:
try {
rda.setCertifiedWith(Host.CertifiedWith.fromValue(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.CERTIFIED_WITH.getName(), node.getId());
}
catch (IllegalArgumentException e) {
logger.warn("Distribution host certified with " + rdaValue + " from semantic distribution.host.certified_with is not valid. Certified_with will not be set.");
}
break;
case DESCRIPTION:
rda.setDescription(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.DESCRIPTION.getName(), node.getId());
break;
case GEO_LOCATION:
if (rdaValue.startsWith("{")) {
try {
rdaValue = mapper.readValue(rdaValue, Map.class).get("id").toString();
} catch (JsonProcessingException e) {
logger.warn(e.getLocalizedMessage() + ". Falling back to using the value as-is");
}
}
try {
rda.setGeoLocation(Host.GeoLocation.fromValue(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.GEO_LOCATION.getName(), node.getId());
}
catch (IllegalArgumentException e) {
logger.warn("Distribution host geo location " + rdaValue + " from semantic distribution.host.geo_location is not valid. Geo location will not be set.");
}
break;
case PID_SYSTEM:
try{
JsonNode valueNode = mapper.readTree(rdaValue);
Iterator<JsonNode> iter = valueNode.elements();
List<String> pList = new ArrayList<>();
while(iter.hasNext()) {
pList.add(iter.next().asText());
}
List<PidSystem> pidList;
if (pList.isEmpty()) {
pidList = Arrays.stream(rdaValue.replaceAll("[\\[\"\\]]","").split(","))
.map(PidSystem::fromValue).collect(Collectors.toList());
}
else{
pidList = pList.stream().map(PidSystem::fromValue).collect(Collectors.toList());
}
rda.setPidSystem(pidList);
rda.setAdditionalProperty(ImportPropertyName.PID_SYSTEM.getName(), node.getId());
}
catch (IllegalArgumentException e){
rda.setPidSystem(new ArrayList<>());
break;
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
break;
case STORAGE_TYPE:
rda.setStorageType(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.STORAGE_TYPE.getName(), node.getId());
break;
case SUPPORT_VERSIONING:
try {
rda.setSupportVersioning(Host.SupportVersioning.fromValue(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.SUPPORT_VERSIONING.getName(), node.getId());
}
catch (IllegalArgumentException e) {
logger.warn("Distribution host support versioning " + rdaValue + " from semantic distribution.host.support_versioning is not valid. Support versioning will not be set.");
}
break;
case TITLE:
rda.setTitle(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.TITLE.getName(), node.getId());
break;
case URL:
try {
rda.setUrl(URI.create(rdaValue));
rda.setAdditionalProperty(ImportPropertyName.URL.getName(), node.getId());
} catch (IllegalArgumentException e) {
logger.warn(e.getLocalizedMessage() + ". Skipping url parsing");
}
break;
}
}
}
}
}
}
if(rda.getTitle() == null || rda.getUrl() == null){
return null;
}
return rda;
}
//TODO
/*
public static List<Field> toProperties(Host rda) {
List<Field> properties = new ArrayList<>();
rda.getAdditionalProperties().entrySet().forEach(entry -> {
try {
ImportPropertyName importPropertyName = ImportPropertyName.fromString(entry.getKey());
Field field = new Field();
field.setKey(entry.getValue().toString());
switch (importPropertyName) {
case AVAILABILITY:
field.setValue(rda.getAvailability());
break;
case TITLE:
field.setValue(rda.getTitle());
break;
case DESCRIPTION:
field.setValue(rda.getDescription());
break;
case BACKUP_FREQUENCY:
field.setValue(rda.getBackupFrequency());
break;
case BACKUP_TYPE:
field.setValue(rda.getBackupType());
break;
case CERTIFIED_WITH:
field.setValue(rda.getCertifiedWith().value());
break;
case GEO_LOCATION:
field.setValue(rda.getGeoLocation().value());
break;
case PID_SYSTEM:
List<Object> pids = new ArrayList<>();
ObjectMapper mapper = new ObjectMapper();
for(PidSystem pid: rda.getPidSystem()){
pids.add(pid.value());
}
if(!pids.isEmpty()){
field.setValue(mapper.writeValueAsString(pids));
}
break;
case STORAGE_TYPE:
field.setValue(rda.getStorageType());
break;
case SUPPORT_VERSIONING:
field.setValue(rda.getSupportVersioning().value());
break;
case URL:
field.setValue(rda.getUrl().toString());
break;
}
properties.add(field);
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
});
return properties;
}
*/
private enum ExportPropertyName {
AVAILABILITY("availability"),
BACKUP_FREQUENCY("backup_frequency"),
BACKUP_TYPE("backup_type"),
CERTIFIED_WITH("certified_with"),
DESCRIPTION("description"),
GEO_LOCATION("geo_location"),
PID_SYSTEM("pid_system"),
STORAGE_TYPE("storage_type"),
SUPPORT_VERSIONING("support_versioning"),
TITLE("title"),
URL("url");
private final String name;
ExportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
private enum ImportPropertyName {
AVAILABILITY("availabilityId"),
BACKUP_FREQUENCY("backup_frequencyId"),
BACKUP_TYPE("backup_typeId"),
CERTIFIED_WITH("certified_withId"),
DESCRIPTION("descriptionId"),
GEO_LOCATION("geo_locationId"),
PID_SYSTEM("pid_systemId"),
STORAGE_TYPE("storage_typeId"),
SUPPORT_VERSIONING("support_versioningId"),
TITLE("titleId"),
URL("urlId");
private final String name;
ImportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
public static ImportPropertyName fromString(String name) throws Exception {
for (ImportPropertyName importPropertyName: ImportPropertyName.values()) {
if (importPropertyName.getName().equals(name)) {
return importPropertyName;
}
}
throw new Exception("No ImportPropertyName matching \"" + name + "\"");
}
}
}
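The `pid_system` branch in `HostRDAMapper` falls back, when JSON parsing yields no elements, to stripping brackets and quotes from the raw value and splitting on commas. That fallback can be shown standalone with only the standard library; the sample value is hypothetical:

```java
// Standalone sketch of HostRDAMapper's pid_system fallback: the regex
// character class [\["\]] removes '[', '"' and ']' from the raw value,
// and the remainder is split on commas. The sample input is hypothetical.
import java.util.Arrays;
import java.util.List;

public class PidFallbackSketch {
    static List<String> parsePidValues(String rdaValue) {
        return Arrays.asList(rdaValue.replaceAll("[\\[\"\\]]", "").split(","));
    }

    public static void main(String[] args) {
        System.out.println(parsePidValues("[\"ark\",\"doi\"]")); // [ark, doi]
    }
}
```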


@ -0,0 +1,30 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.eudat.file.transformer.models.tag.TagFileTransformerModel;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
public class KeywordRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(KeywordRDAMapper.class);
private static final ObjectMapper mapper = new ObjectMapper();
public static List<String> toRDA(String value) {
if (!value.isEmpty() && !value.equals("null")) {
try {
TagFileTransformerModel tag = mapper.readValue(value, TagFileTransformerModel.class);
return new ArrayList<>(Collections.singletonList(tag.getLabel()));
} catch (JsonProcessingException e) {
logger.warn(e.getMessage() + ". Attempting to parse it as a String since it's a new tag.");
return new ArrayList<>(Collections.singletonList(value));
}
}
return new ArrayList<>();
}
}


@ -0,0 +1,45 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.eudat.file.transformer.rda.Language;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Objects;
public class LanguageRDAMapper {
private final static Map<String, Object> langMap = new HashMap<>();
private static final Logger logger = LoggerFactory.getLogger(LanguageRDAMapper.class);
static {
try {
ObjectMapper mapper = new ObjectMapper();
InputStreamReader isr = new InputStreamReader(LanguageRDAMapper.class.getClassLoader().getResource("internal/rda-lang-map.json").openStream(), StandardCharsets.UTF_8);
langMap.putAll(mapper.readValue(isr, LinkedHashMap.class));
isr.close();
} catch (IOException e) {
logger.error(e.getMessage(), e);
}
}
public static Language mapLanguageIsoToRDAIso(String code) {
return langMap.entrySet().stream()
        .filter(entry -> entry.getValue().toString().equals(code))
        .map(entry -> Language.fromValue(entry.getKey()))
        .findFirst()
        .orElseThrow();
}
public static String mapRDAIsoToLanguageIso(Language lang) {
return langMap.get(lang.value()).toString();
}
}
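`LanguageRDAMapper` loads a JSON file mapping RDA language keys to ISO codes, so the reverse direction (ISO code back to RDA key) is a linear scan over the entries. A minimal self-contained sketch, using a hypothetical two-entry map in place of `rda-lang-map.json` and returning the matching key as a plain string rather than the RDA `Language` enum:

```java
// Sketch of LanguageRDAMapper's reverse lookup. The real class reads
// internal/rda-lang-map.json into a Map<String, Object>; here a tiny
// hard-coded map stands in for it, and the RDA key is returned as a
// String instead of the Language enum. Entries are hypothetical.
import java.util.Map;
import java.util.Optional;

public class LangLookupSketch {
    static final Map<String, String> LANG_MAP = Map.of("eng", "en", "ell", "el");

    static Optional<String> rdaKeyForIso(String isoCode) {
        return LANG_MAP.entrySet().stream()
                .filter(entry -> entry.getValue().equals(isoCode))
                .map(Map.Entry::getKey)
                .findFirst();
    }
}
```

Returning `Optional` makes the missing-mapping case explicit, whereas the mapper above throws when no entry matches.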


@ -0,0 +1,135 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.License;
import eu.eudat.file.transformer.utils.json.JsonSearcher;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.net.URI;
import java.util.ArrayList;
import java.util.List;
public class LicenseRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(LicenseRDAMapper.class);
public static License toRDA(List<FieldFileTransformerModel> nodes) {
License rda = new License();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.distribution.license")).findFirst().orElse("");
if (node.getData() == null) {
continue;
}
String value = node.getData().getValue();
if(value == null || value.isEmpty()){
continue;
}
for (LicenceProperties licenceProperties: LicenceProperties.values()) {
if (rdaProperty.contains(licenceProperties.getName())) {
switch (licenceProperties) {
case LICENSE_REF:
try {
rda.setLicenseRef(URI.create(value));
} catch (IllegalArgumentException e) {
logger.warn(e.getLocalizedMessage() + ". Skipping url parsing");
}
break;
case START_DATE:
rda.setStartDate(value);
break;
}
}
}
/*if (rdaProperty.contains("license_ref")) {
rda.setLicenseRef(URI.create(value));
rda.setAdditionalProperty("license_refId", node.get("id").asText());
} else if (rdaProperty.contains("start_date")) {
rda.setStartDate(value);
rda.setAdditionalProperty("start_dateId", node.get("id").asText());
}*/
}
if(rda.getLicenseRef() == null || rda.getStartDate() == null){
return null;
}
return rda;
}
//TODO
/*public static List<Field> toProperties(List<License> rdas) {
List<Field> properties = new ArrayList<>();
rdas.forEach(rda -> {
rda.getAdditionalProperties().entrySet().forEach(entry -> {
Field field = new Field();
field.setKey(entry.getValue().toString());
switch (entry.getKey()) {
case "license_refId":
field.setValue(rda.getLicenseRef().toString());
break;
case "start_dateId":
field.setValue(rda.getStartDate());
break;
}
properties.add(field);
});
});
return properties;
}
public static List<Field> toProperties(License rda, JsonNode root) {
List<Field> properties = new ArrayList<>();
List<JsonNode> licenseNodes = JsonSearcher.findNodes(root, "schematics", "rda.dataset.distribution.license");
for (JsonNode licenseNode: licenseNodes) {
for (LicenceProperties licenceProperty: LicenceProperties.values()) {
JsonNode schematics = licenseNode.get("schematics");
if(schematics.isArray()) {
for (JsonNode schematic : schematics) {
if (schematic.asText().endsWith(licenceProperty.getName())) {
switch (licenceProperty) {
case LICENSE_REF:
if (rda.getLicenseRef() != null) {
Field field = new Field();
field.setKey(licenseNode.get("id").asText());
field.setValue(rda.getLicenseRef().toString());
properties.add(field);
}
break;
case START_DATE:
Field field = new Field();
field.setKey(licenseNode.get("id").asText());
field.setValue(rda.getStartDate());
properties.add(field);
break;
}
}
break;
}
}
}
}
return properties;
}*/
public enum LicenceProperties {
LICENSE_REF("license_ref"),
START_DATE("start_date");
private final String name;
LicenceProperties(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
}


@ -0,0 +1,230 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.TextNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.Metadatum;
import eu.eudat.file.transformer.utils.string.MyStringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.util.*;
import java.util.stream.Collectors;
public class MetadataRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(MetadataRDAMapper.class);
public static List<Metadatum> toRDAList(List<FieldFileTransformerModel> nodes) {
ObjectMapper mapper = new ObjectMapper();
Map<String, String> rdaMap = new HashMap<>();
List<Metadatum> rdas = new ArrayList<>();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.metadata")).findFirst().orElse("");
try {
if (node.getData() == null) {
continue;
}
if (node.getData().getValue() == null || node.getData().getValue().isEmpty()) {
continue;
}
String stringValue = node.getData().getValue().startsWith("[") ? node.getData().getValue() : "\"" + node.getData().getValue() + "\"";
JsonNode rdaValue = mapper.readTree(stringValue);
for (PropertyName propertyName : PropertyName.values()) {
if (rdaProperty.contains(propertyName.getName())) {
switch (propertyName) {
case METADATA_STANDARD_ID:
if (rdaValue instanceof ArrayNode) {
for (Iterator<JsonNode> it = rdaValue.elements(); it.hasNext(); ) {
JsonNode data = mapper.readTree(it.next().asText());
if (data.get("uri") != null) {
rdas.add(new Metadatum());
rdas.get(rdas.size() - 1).setMetadataStandardId(MetadataStandardIdRDAMapper.toRDA(data.get("uri").asText()));
rdas.get(rdas.size() - 1).setDescription(data.get("label").asText());
rdas.get(rdas.size() - 1).setAdditionalProperty("fieldId", node.getId());
rdas.get(rdas.size() - 1).setAdditionalProperty("valueId", data.get("id").asText());
rdaMap.put(data.get("uri").asText(), node.getNumbering());
}
}
} else if (rdaValue instanceof TextNode && rdaProperty.contains("identifier") && !rdaValue.asText().isEmpty()) {
rdas.add(new Metadatum());
rdas.get(rdas.size() - 1).setMetadataStandardId(MetadataStandardIdRDAMapper.toRDA(rdaValue.asText()));
rdas.get(rdas.size() - 1).setAdditionalProperty("identifierId", node.getId());
rdaMap.put(rdaValue.asText(), node.getNumbering());
}
break;
case DESCRIPTION:
if (!rdaValue.asText().isEmpty()) {
Metadatum rda = getRelative(rdas, rdaMap, node.getNumbering());
if (rda != null) {
rda.setDescription(rdaValue.asText());
rda.setAdditionalProperty("descriptionId", node.getId());
} else {
rdas.stream().filter(rda1 -> rda1.getDescription() == null || rda1.getDescription().isEmpty()).forEach(rda1 -> rda1.setDescription(rdaValue.asText()));
}
}
break;
case LANGUAGE:
String language = rdaValue.asText();
Metadatum.Language lang = Metadatum.Language.fromValue(language);
Metadatum rda = getRelative(rdas, rdaMap, node.getNumbering());
if (rda != null) {
rda.setLanguage(lang);
rda.setAdditionalProperty("languageId", node.getId());
} else {
rdas.forEach(rda1 -> rda1.setLanguage(lang));
}
break;
}
}
}
} catch (JsonProcessingException e) {
logger.error(e.getMessage(), e);
}
}
return rdas;
}
//TODO
/*
public static void toProperties(List<Metadatum> rdas, List<FieldFileTransformerModel> fields) {
List<Object> standardIds = new ArrayList<>();
ObjectMapper mapper = new ObjectMapper();
rdas.forEach(rda -> {
rda.getAdditionalProperties().entrySet().forEach(entry -> {
try {
switch (entry.getKey()) {
case "fieldId":
Map<String, String> metadata = toMap(rda);
standardIds.add(metadata);
Field field1 = new Field();
field1.setKey(entry.getValue().toString());
field1.setValue(mapper.writeValueAsString(standardIds));
properties.add(field1);
break;
case "identifierId":
Field field2 = new Field();
field2.setKey(entry.getValue().toString());
field2.setValue(rda.getMetadataStandardId().getIdentifier());
properties.add(field2);
break;
case "descriptionId":
Field field3 = new Field();
field3.setKey(entry.getValue().toString());
field3.setValue(rda.getDescription());
properties.add(field3);
break;
case "languageId":
if (rda.getLanguage() != null) {
Field field4 = new Field();
field4.setKey(entry.getValue().toString());
field4.setValue(rda.getLanguage().value());
properties.add(field4);
}
break;
}
}catch (Exception e) {
logger.error(e.getMessage(), e);
}
});
});
return properties;
}
*/
public static Metadatum toRDA(JsonNode node) {
Metadatum rda = new Metadatum();
String rdaProperty = "";
JsonNode schematics = node.get("schematics");
if(schematics.isArray()){
for(JsonNode schematic: schematics){
if(schematic.asText().startsWith("rda.dataset.metadata")){
rdaProperty = schematic.asText();
break;
}
}
}
JsonNode rdaValue = node.get("value");
if (rdaProperty.contains("metadata_standard_id")) {
if (rdaValue instanceof ArrayNode) {
for (Iterator<JsonNode> it = rdaValue.elements(); it.hasNext(); ) {
JsonNode data = it.next();
if (data.get("uri") != null) {
rda.setMetadataStandardId(MetadataStandardIdRDAMapper.toRDA(data.get("uri").asText()));
}
}
}
} else if (rdaProperty.contains("description")) {
rda.setDescription(rdaValue.asText());
} else if (rdaProperty.contains("language")) {
String language = rdaValue.asText();
Metadatum.Language lang = Metadatum.Language.fromValue(language);
rda.setLanguage(lang);
}
return rda;
}
private static Metadatum getRelative(List<Metadatum> rdas, Map<String, String> rdaMap, String numbering) {
String target = rdaMap.entrySet().stream().filter(entry -> MyStringUtils.getFirstDifference(entry.getValue(), numbering) > 0)
.max(Comparator.comparingInt(entry -> MyStringUtils.getFirstDifference(entry.getValue(), numbering))).map(Map.Entry::getKey).orElse("");
return rdas.stream().filter(rda -> rda.getMetadataStandardId().getIdentifier().equals(target)).distinct().findFirst().orElse(null);
}
private enum PropertyName {
METADATA_STANDARD_ID("metadata_standard_id"),
DESCRIPTION("description"),
LANGUAGE("language");
private final String name;
PropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
private static Map<String, String> toMap(Metadatum rda) {
Map<String, String> result = new HashMap<>();
ObjectMapper mapper = new ObjectMapper();
Map<String, Object> metadata = mapper.convertValue(rda, Map.class);
Map<String, String> additionalProperties = mapper.convertValue(metadata.get("additional_properties"), Map.class);
String id = additionalProperties.remove("valueId");
additionalProperties.clear();
additionalProperties.put("id", id);
Map<String, String> metadataStandardId = mapper.convertValue(metadata.get("metadata_standard_id"), Map.class);
String url = metadataStandardId.remove("identifier");
metadataStandardId.remove("type");
metadataStandardId.put("uri", url);
metadata.remove("additional_properties");
metadata.remove("metadata_standard_id");
Map<String, String> newMetadata = metadata.entrySet().stream().collect(Collectors.toMap(Map.Entry::getKey, entry -> entry.getValue().toString()));
String label = newMetadata.remove("description");
newMetadata.put("label", label);
result.putAll(newMetadata);
result.putAll(metadataStandardId);
result.putAll(additionalProperties);
return result;
}
}


@ -0,0 +1,13 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.rda.MetadataStandardId;
public class MetadataStandardIdRDAMapper {
public static MetadataStandardId toRDA(String uri) {
MetadataStandardId rda = new MetadataStandardId();
rda.setIdentifier(uri);
rda.setType(MetadataStandardId.Type.URL);
return rda;
}
}


@ -0,0 +1,69 @@
package eu.eudat.file.transformer.rda.mapper;
import eu.eudat.file.transformer.enums.ReferenceType;
import eu.eudat.file.transformer.models.reference.DefinitionFileTransformerModel;
import eu.eudat.file.transformer.models.reference.FieldFileTransformerModel;
import eu.eudat.file.transformer.models.reference.ReferenceFileTransformerModel;
import eu.eudat.file.transformer.rda.Project;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.*;
public class ProjectRDAMapper {
private final static Logger logger = LoggerFactory.getLogger(ProjectRDAMapper.class);
public static Project toRDA(ReferenceFileTransformerModel project, ReferenceFileTransformerModel grant, ReferenceFileTransformerModel funder) {
Project rda = new Project();
try {
rda.setTitle(project.getLabel());
rda.setDescription(project.getDescription());
String startDateString = project.getDefinition().getFields().stream().filter(field -> field.getCode().equals("startDate")).map(FieldFileTransformerModel::getValue).findFirst().orElse(null);
if (startDateString != null) {
rda.setStart(startDateString);
}
String endDateString = project.getDefinition().getFields().stream().filter(field -> field.getCode().equals("endDate")).map(FieldFileTransformerModel::getValue).findFirst().orElse(null);
if (endDateString != null) {
rda.setEnd(endDateString);
}
rda.setFunding(List.of(FundingRDAMapper.toRDA(grant, funder)));
if (rda.getTitle() == null) {
throw new IllegalArgumentException("Project Title is missing");
}
} catch (Exception e) {
logger.error(e.getLocalizedMessage(), e);
}
return rda;
}
public static List<ReferenceFileTransformerModel> toEntity(Project rda) {
List<ReferenceFileTransformerModel> entities = new ArrayList<>();
ReferenceFileTransformerModel project = new ReferenceFileTransformerModel();
project.setLabel(rda.getTitle());
project.setDescription(rda.getDescription());
project.setType(ReferenceType.Project);
DefinitionFileTransformerModel projectDefinition = new DefinitionFileTransformerModel();
projectDefinition.setFields(new ArrayList<>());
if (rda.getStart() != null && !rda.getStart().isEmpty()) {
FieldFileTransformerModel startDateField = new FieldFileTransformerModel();
startDateField.setCode("startDate");
startDateField.setValue(rda.getStart());
projectDefinition.getFields().add(startDateField);
}
if (rda.getEnd() != null && !rda.getEnd().isEmpty()) {
FieldFileTransformerModel startDateField = new FieldFileTransformerModel();
startDateField.setCode("endDate");
startDateField.setValue(rda.getEnd());
projectDefinition.getFields().add(startDateField);
}
project.setDefinition(projectDefinition);
entities.add(project);
for (int i = 0; i < rda.getFunding().size(); i++) {
entities.addAll(FundingRDAMapper.toEntity(rda.getFunding().get(i)));
}
return entities;
}
}


@ -0,0 +1,152 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.SecurityAndPrivacy;
import eu.eudat.file.transformer.utils.string.MyStringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.*;
import java.util.stream.Collectors;
public class SecurityAndPrivacyRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(SecurityAndPrivacyRDAMapper.class);
public static List<SecurityAndPrivacy> toRDAList(List<FieldFileTransformerModel> nodes) {
Map<String, SecurityAndPrivacy> rdaMap = new HashMap<>();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.security_and_privacy")).findFirst().orElse("");
if (node.getData() == null) {
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
SecurityAndPrivacy rda = getRelative(rdaMap, node.getNumbering());
if (!rdaMap.containsValue(rda)) {
rdaMap.put(node.getNumbering(), rda);
}
for (ExportPropertyName exportPropertyName : ExportPropertyName.values()) {
if (rdaProperty.contains(exportPropertyName.getName())) {
switch (exportPropertyName) {
case TITLE:
rda.setTitle(rdaValue);
rda.getAdditionalProperties().put(ImportPropertyName.TITLE.getName(), node.getId());
break;
case DESCRIPTION:
rda.setDescription(rdaValue);
rda.getAdditionalProperties().put(ImportPropertyName.DESCRIPTION.getName(), node.getId());
break;
}
}
}
}
return rdaMap.values().stream()
.filter(sap -> sap.getTitle() != null)
.collect(Collectors.toList());
}
//TODO
/*
public static List<Field> toProperties(List<SecurityAndPrivacy> rdas) {
List<Field> properties = new ArrayList<>();
rdas.forEach(rda -> rda.getAdditionalProperties().entrySet().forEach(entry -> {
try {
Field field = new Field();
field.setKey(entry.getValue().toString());
ImportPropertyName importPropertyName = ImportPropertyName.fromString(entry.getKey());
switch(importPropertyName) {
case TITLE:
field.setValue(rda.getTitle());
break;
case DESCRIPTION:
field.setValue(rda.getDescription());
break;
}
properties.add(field);
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
}));
return properties;
}
*/
public static SecurityAndPrivacy toRDA(JsonNode node) {
SecurityAndPrivacy rda = new SecurityAndPrivacy();
String rdaProperty = "";
JsonNode schematics = node.get("schematics");
if(schematics.isArray()){
for(JsonNode schematic: schematics){
if(schematic.asText().startsWith("rda.dataset.security_and_privacy")){
rdaProperty = schematic.asText();
break;
}
}
}
String value = node.get("value").asText();
if (rdaProperty.contains("description")) {
rda.setDescription(value);
}
if (rdaProperty.contains("title")) {
rda.setTitle(value);
}
if (rda.getTitle() == null) {
throw new IllegalArgumentException("Security And Privacy Title is missing");
}
return rda;
}
private static SecurityAndPrivacy getRelative(Map<String, SecurityAndPrivacy> rdaMap, String numbering) {
return rdaMap.entrySet().stream().filter(entry -> MyStringUtils.getFirstDifference(entry.getKey(), numbering) > 0)
.max(Comparator.comparingInt(entry -> MyStringUtils.getFirstDifference(entry.getKey(), numbering))).map(Map.Entry::getValue).orElse(new SecurityAndPrivacy());
}
private enum ExportPropertyName {
TITLE("title"),
DESCRIPTION("description");
private final String name;
ExportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
private enum ImportPropertyName {
TITLE("titleId"),
DESCRIPTION("descriptionId");
private final String name;
ImportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
public static ImportPropertyName fromString(String name) throws Exception {
for (ImportPropertyName importPropertyName: ImportPropertyName.values()) {
if (importPropertyName.getName().equals(name)) {
return importPropertyName;
}
}
throw new Exception("Property not available");
}
}
}


@ -0,0 +1,153 @@
package eu.eudat.file.transformer.rda.mapper;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.rda.TechnicalResource;
import eu.eudat.file.transformer.utils.string.MyStringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.*;
import java.util.stream.Collectors;
public class TechnicalResourceRDAMapper {
private static final Logger logger = LoggerFactory.getLogger(TechnicalResourceRDAMapper.class);
public static List<TechnicalResource> toRDAList(List<FieldFileTransformerModel> nodes) {
Map<String, TechnicalResource> rdaMap = new HashMap<>();
for (FieldFileTransformerModel node: nodes) {
String rdaProperty = node.getSchematics().stream().filter(schematic -> schematic.startsWith("rda.dataset.technical_resource")).findFirst().orElse("");
if (node.getData() == null) {
continue;
}
String rdaValue = node.getData().getValue();
if(rdaValue == null || rdaValue.isEmpty()){
continue;
}
TechnicalResource rda = getRelative(rdaMap, node.getNumbering());
if (!rdaMap.containsValue(rda)) {
rdaMap.put(node.getNumbering(), rda);
}
for (ExportPropertyName exportPropertyName : ExportPropertyName.values()) {
if (rdaProperty.contains(exportPropertyName.getName())) {
switch (exportPropertyName) {
case NAME:
rda.setName(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.NAME.getName(), node.getId());
break;
case DESCRIPTION:
rda.setDescription(rdaValue);
rda.setAdditionalProperty(ImportPropertyName.DESCRIPTION.getName(), node.getId());
break;
}
}
}
}
return rdaMap.values().stream()
.filter(tr -> tr.getName() != null)
.collect(Collectors.toList());
}
//TODO
/*
public static List<Field> toProperties(List<TechnicalResource> rdas) {
List<Field> properties = new ArrayList<>();
rdas.forEach(rda -> rda.getAdditionalProperties().entrySet().forEach(entry -> {
try {
Field field = new Field();
field.setKey(entry.getValue().toString());
ImportPropertyName importPropertyName = ImportPropertyName.fromString(entry.getKey());
switch(importPropertyName) {
case DESCRIPTION:
field.setValue(rda.getDescription());
break;
case NAME:
field.setValue(rda.getName());
break;
}
properties.add(field);
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
}));
return properties;
}
*/
public static TechnicalResource toRDA(JsonNode node) {
TechnicalResource rda = new TechnicalResource();
String rdaProperty = "";
JsonNode schematics = node.get("schematics");
if (schematics != null && schematics.isArray()) {
for (JsonNode schematic : schematics) {
if (schematic.asText().startsWith("rda.dataset.technical_resource")) {
rdaProperty = schematic.asText();
break;
}
}
}
JsonNode valueNode = node.get("value");
String value = valueNode != null ? valueNode.asText() : null;
if (value != null && !value.isEmpty()) {
if (rdaProperty.contains("description")) {
rda.setDescription(value);
}
if (rdaProperty.contains("name")) {
rda.setName(value);
}
}
if (rda.getName() == null) {
throw new IllegalArgumentException("Technical Resources Name is missing");
}
return rda;
}
private static TechnicalResource getRelative(Map<String, TechnicalResource> rdaMap, String numbering) {
return rdaMap.entrySet().stream().filter(entry -> MyStringUtils.getFirstDifference(entry.getKey(), numbering) > 0)
.max(Comparator.comparingInt(entry -> MyStringUtils.getFirstDifference(entry.getKey(), numbering))).map(Map.Entry::getValue).orElse(new TechnicalResource());
}
private enum ExportPropertyName {
NAME("name"),
DESCRIPTION("description");
private final String name;
ExportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
}
private enum ImportPropertyName {
NAME("nameId"),
DESCRIPTION("descriptionId");
private final String name;
ImportPropertyName(String name) {
this.name = name;
}
public String getName() {
return name;
}
public static ImportPropertyName fromString(String name) throws Exception {
for (ImportPropertyName importPropertyName: ImportPropertyName.values()) {
if (importPropertyName.getName().equals(name)) {
return importPropertyName;
}
}
throw new Exception("Unknown import property: " + name);
}
}
}


@ -0,0 +1,63 @@
package eu.eudat.file.transformer.utils.descriptionTemplate;
import eu.eudat.file.transformer.models.descriptiontemplate.DescriptionTemplateFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.FieldSetFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.PageFileTransformerModel;
import eu.eudat.file.transformer.models.descriptiontemplate.definition.SectionFileTransformerModel;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Locale;
public class TemplateFieldSearcher {
public static List<FieldFileTransformerModel> searchFields(DescriptionTemplateFileTransformerModel template, String key, String value) {
List<FieldFileTransformerModel> result;
List<PageFileTransformerModel> pages = template.getDefinition().getPages();
result = pages.stream().flatMap(pageFileTransformerModel -> searchFieldsFromSections(pageFileTransformerModel.getSections(), key, value).stream()).toList();
return result;
}
private static List<FieldFileTransformerModel> searchFieldsFromSections(List<SectionFileTransformerModel> sections, String key, String value) {
List<FieldFileTransformerModel> result = new ArrayList<>();
for (SectionFileTransformerModel section : sections) {
if (section.getSections() != null && !section.getSections().isEmpty()) {
result.addAll(searchFieldsFromSections(section.getSections(), key, value));
}
if (section.getFieldSets() != null && !section.getFieldSets().isEmpty()) {
List<FieldSetFileTransformerModel> fieldSets = section.getFieldSets();
for (FieldSetFileTransformerModel fieldSet : fieldSets) {
List<FieldFileTransformerModel> fields = fieldSet.getFields();
for (FieldFileTransformerModel field : fields) {
Method keyGetter = Arrays.stream(FieldFileTransformerModel.class.getDeclaredMethods()).filter(method -> method.getName().equals(makeGetter(key))).findFirst().orElse(null);
if (keyGetter != null && keyGetter.canAccess(field)) {
try {
Object fieldValue = keyGetter.invoke(field);
if (fieldValue == null) {
continue;
}
if (fieldValue.equals(value) || fieldValue.toString().startsWith(value)) {
result.add(field);
} else if (fieldValue instanceof List<?> items) {
for (Object item : items) {
if (item != null && (item.toString().equals(value) || item.toString().startsWith(value))) {
result.add(field);
break;
}
}
}
} catch (IllegalAccessException | InvocationTargetException e) {
throw new RuntimeException(e);
}
}
}
}
}
}
return result;
}
private static String makeGetter(String fieldName) {
return "get" + fieldName.substring(0, 1).toUpperCase(Locale.ROOT) + fieldName.substring(1);
}
}
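The getter-by-name reflection pattern used above is the crux of `TemplateFieldSearcher`; a minimal, self-contained sketch of the same idea (the `Sample` class and its value are illustrative stand-ins, not part of the template model):

```java
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.Locale;

public class GetterLookupDemo {
    // A stand-in for FieldFileTransformerModel, for illustration only.
    public static class Sample {
        public String getId() { return "rda.dataset.title"; }
    }

    // Same convention as TemplateFieldSearcher.makeGetter: "id" -> "getId".
    static String makeGetter(String fieldName) {
        return "get" + fieldName.substring(0, 1).toUpperCase(Locale.ROOT) + fieldName.substring(1);
    }

    public static void main(String[] args) throws Exception {
        // Locate the getter by its derived name, then invoke it reflectively.
        Method getter = Arrays.stream(Sample.class.getDeclaredMethods())
                .filter(m -> m.getName().equals(makeGetter("id")))
                .findFirst()
                .orElseThrow();
        Object value = getter.invoke(new Sample());
        // The searcher accepts either an exact match or a prefix match.
        System.out.println(value.toString().startsWith("rda.dataset")); // prints "true"
    }
}
```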


@ -0,0 +1,81 @@
package eu.eudat.file.transformer.utils.json;
import com.fasterxml.jackson.databind.JsonNode;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
public class JsonSearcher {
public static List<JsonNode> findNodes(JsonNode root, String key, String value) {
return findNodes(root, key, value, false);
}
public static List<JsonNode> findNodes(JsonNode root, String key, String value, boolean parent) {
List<JsonNode> nodes = new ArrayList<>();
for (Iterator<JsonNode> it = root.elements(); it.hasNext(); ) {
JsonNode node = it.next();
int found = 0;
for (Iterator<String> iter = node.fieldNames(); iter.hasNext(); ) {
String fieldName = iter.next();
if (fieldName.equals(key)) {
if (node.get(fieldName).asText().equals(value) || node.get(fieldName).asText().startsWith(value)) {
if (parent) {
nodes.add(root);
} else {
nodes.add(node);
}
found++;
}
else if (node.get(fieldName).isArray()) {
for (JsonNode item : node.get(fieldName)) {
if (item.asText().equals(value) || item.asText().startsWith(value)) {
if (parent) {
nodes.add(root);
} else {
nodes.add(node);
}
found++;
break; // add each matching node once, even if several array items match
}
}
}
}
}
if (found == 0) {
nodes.addAll(findNodes(node, key, value, parent));
}
}
return nodes;
}
public static List<String> getParentValues(JsonNode root, String childValue, String key) {
List<String> values = new LinkedList<>();
for (Iterator<JsonNode> it = root.elements(); it.hasNext(); ) {
JsonNode node = it.next();
int found = 0;
for (Iterator<String> iter = node.fieldNames(); iter.hasNext(); ) {
String fieldName = iter.next();
if (fieldName.equals(key)) {
if (node.get(fieldName).asText().equals(childValue) || node.get(fieldName).asText().startsWith(childValue)) {
values.add(childValue);
found++;
}
}
}
if (found == 0) {
values.addAll(getParentValues(node, childValue, key));
if (!values.isEmpty() && node.has(key)) {
values.add(node.get(key).asText());
values.remove(childValue);
}
}
}
return values;
}
}


@ -0,0 +1,44 @@
package eu.eudat.file.transformer.utils.service.storage;
import eu.eudat.file.transformer.configuration.FileStorageProperties;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.io.*;
import java.nio.file.*;
import java.util.UUID;
@Service
public class FileStorageService {
private final static Logger logger = LoggerFactory.getLogger(FileStorageService.class);
private final FileStorageProperties properties;
@Autowired
public FileStorageService(FileStorageProperties properties) {
this.properties = properties;
}
public String storeFile(byte[] data) {
try {
String fileName = UUID.randomUUID().toString();
Path storagePath = Paths.get(properties.getTransientPath(), fileName);
Files.write(storagePath, data, StandardOpenOption.CREATE_NEW);
return fileName;
} catch (IOException e) {
logger.error(e.getMessage(), e);
}
return null;
}
public byte[] readFile(String fileRef) {
try (FileInputStream inputStream = new FileInputStream(properties.getTransientPath() + "/" + fileRef)) {
return inputStream.readAllBytes();
} catch (IOException e) {
logger.error(e.getMessage(), e);
}
return new byte[0]; // an empty array signals "no data" rather than a single zero byte
}
}
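The storage scheme above (a fresh UUID as the file name under a transient directory, later resolved back to bytes) can be sketched with only the JDK; here a temp directory stands in for `FileStorageProperties.getTransientPath()`:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.UUID;

public class TransientStoreDemo {
    static Path transientDir;

    // store: write the payload under a fresh UUID-derived name, return the reference
    static String storeFile(byte[] data) throws IOException {
        String fileRef = UUID.randomUUID().toString();
        Files.write(transientDir.resolve(fileRef), data, StandardOpenOption.CREATE_NEW);
        return fileRef;
    }

    // read: resolve the reference back to the stored bytes
    static byte[] readFile(String fileRef) throws IOException {
        return Files.readAllBytes(transientDir.resolve(fileRef));
    }

    public static void main(String[] args) throws IOException {
        transientDir = Files.createTempDirectory("transient-demo");
        String ref = storeFile("hello".getBytes());
        System.out.println(new String(readFile(ref))); // prints "hello"
    }
}
```

`CREATE_NEW` fails if the name already exists, which is the point of the random UUID: collisions are effectively impossible, so each stored payload gets its own file.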


@ -0,0 +1,21 @@
package eu.eudat.file.transformer.utils.string;
public class MyStringUtils {
/**
* Returns the index of the first position at which s1 and s2 differ,
* the length of s2 when s2 is a proper prefix of s1,
* or -1 when s1 is a prefix of s2 (including equal strings).
*/
public static int getFirstDifference(String s1, String s2) {
char[] s1ar = s1.toCharArray();
char[] s2ar = s2.toCharArray();
for(int i = 0; i < s1ar.length; i++) {
if (s2ar.length > i) {
if (s1ar[i] != s2ar[i]) {
return i;
}
} else {
return i;
}
}
return -1;
}
}
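The return convention of `getFirstDifference` is subtle, and `getRelative` in the RDA mappers depends on it: a positive result means the two numbering strings share a prefix. A standalone sketch of the same comparison (logic reproduced here for illustration):

```java
public class FirstDifferenceDemo {
    // Same behavior as MyStringUtils.getFirstDifference.
    static int getFirstDifference(String s1, String s2) {
        for (int i = 0; i < s1.length(); i++) {
            if (i >= s2.length() || s1.charAt(i) != s2.charAt(i)) {
                return i; // first mismatch, or s2 ran out of characters
            }
        }
        return -1; // s1 is a prefix of s2 (or the strings are equal)
    }

    public static void main(String[] args) {
        System.out.println(getFirstDifference("1.2.1", "1.2.3")); // prints 4
        System.out.println(getFirstDifference("1.2.1", "1.2"));   // prints 3
        System.out.println(getFirstDifference("1.2", "1.2.1"));   // prints -1
    }
}
```

Note the asymmetry: a child numbering compared against its parent yields the parent's length (positive, so `getRelative` treats them as related), while the reverse comparison yields -1.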

37
pom.xml Normal file

@ -0,0 +1,37 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.1.0</version>
<relativePath/>
</parent>
<groupId>gr.cite.opendmp</groupId>
<artifactId>file-transformer-rda-parent</artifactId>
<version>${revision}</version>
<packaging>pom</packaging>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<maven.compiler.release>21</maven.compiler.release>
<revision>1.0.0-SNAPSHOT</revision>
</properties>
<dependencies>
<dependency>
<groupId>org.yaml</groupId>
<artifactId>snakeyaml</artifactId>
<version>2.0</version>
</dependency>
</dependencies>
<modules>
<module>core</module>
<module>web</module>
</modules>
</project>

29
settings.xml Normal file

@ -0,0 +1,29 @@
<settings>
<servers>
<server>
<id>ossrh</id>
<username>${server_username}</username>
<password>${server_password}</password>
</server>
<server>
<id>dev</id>
<username>${server_username}</username>
<password>${server_password}</password>
</server>
</servers>
<profiles>
<profile>
<id>dev</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<repositories>
<repository>
<id>dev</id>
<name>Dev Profile</name>
<url>${devProfileUrl}</url>
</repository>
</repositories>
</profile>
</profiles>
</settings>

59
web/pom.xml Normal file

@ -0,0 +1,59 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>gr.cite.opendmp</groupId>
<artifactId>file-transformer-rda-parent</artifactId>
<version>${revision}</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>file-transformer-rda-web</artifactId>
<version>${revision}</version>
<packaging>jar</packaging>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<maven.compiler.release>21</maven.compiler.release>
<revision>1.0.0-SNAPSHOT</revision>
</properties>
<dependencies>
<dependency>
<groupId>gr.cite.opendmp</groupId>
<artifactId>file-transformer-rda</artifactId>
<version>${revision}</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>gr.cite</groupId>
<artifactId>oidc-authn</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>gr.cite</groupId>
<artifactId>cache</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-cache</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>3.1.0</version>
</plugin>
</plugins>
</build>
</project>


@ -0,0 +1,16 @@
package eu.eudat.file.transformer;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication(scanBasePackages = {
"eu.eudat.file.transformer.*",
"gr.cite.tools",
"gr.cite.commons"
})
public class FileTransformerApplication {
public static void main(String[] args) {
SpringApplication.run(FileTransformerApplication.class, args);
}
}


@ -0,0 +1,76 @@
package eu.eudat.file.transformer.config;
import gr.cite.commons.web.oidc.configuration.WebSecurityProperties;
import gr.cite.commons.web.oidc.configuration.filter.ApiKeyFilter;
import jakarta.servlet.http.HttpServletRequest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.authentication.AuthenticationManagerResolver;
import org.springframework.security.config.Customizer;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.config.annotation.web.configurers.AbstractHttpConfigurer;
import org.springframework.security.config.http.SessionCreationPolicy;
import org.springframework.security.web.SecurityFilterChain;
import org.springframework.security.web.authentication.preauth.AbstractPreAuthenticatedProcessingFilter;
import java.util.Set;
@Configuration
@EnableWebSecurity
public class SecurityConfiguration {
private final ApiKeyFilter apiKeyFilter;
private final WebSecurityProperties webSecurityProperties;
private final AuthenticationManagerResolver<HttpServletRequest> authenticationManagerResolver;
@Autowired
public SecurityConfiguration(ApiKeyFilter apiKeyFilter, WebSecurityProperties webSecurityProperties, AuthenticationManagerResolver<HttpServletRequest> authenticationManagerResolver) {
this.apiKeyFilter = apiKeyFilter;
this.webSecurityProperties = webSecurityProperties;
this.authenticationManagerResolver = authenticationManagerResolver;
}
@Bean
protected SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
if (webSecurityProperties.isEnabled()) {
http.csrf(AbstractHttpConfigurer::disable)
.cors(Customizer.withDefaults())
.addFilterBefore(apiKeyFilter, AbstractPreAuthenticatedProcessingFilter.class)
.authorizeHttpRequests(authorizationManagerRequestMatcherRegistry -> authorizationManagerRequestMatcherRegistry
.requestMatchers(buildAntPatterns(webSecurityProperties.getAuthorizedEndpoints())).authenticated()
.requestMatchers(buildAntPatterns(webSecurityProperties.getAllowedEndpoints())).anonymous())
.sessionManagement(httpSecuritySessionManagementConfigurer -> httpSecuritySessionManagementConfigurer.sessionCreationPolicy(SessionCreationPolicy.NEVER))
.oauth2ResourceServer(oauth2 -> oauth2.authenticationManagerResolver(authenticationManagerResolver));
return http.build();
} else {
return http.csrf(AbstractHttpConfigurer::disable)
.cors(Customizer.withDefaults())
.authorizeHttpRequests(authorizationManagerRequestMatcherRegistry ->
authorizationManagerRequestMatcherRegistry.anyRequest().anonymous())
.build();
}
}
private String[] buildAntPatterns(Set<String> endpoints) {
if (endpoints == null) {
return new String[0];
}
return endpoints.stream()
.filter(endpoint -> endpoint != null && !endpoint.isBlank())
.map(endpoint -> "/" + stripUnnecessaryCharacters(endpoint) + "/**")
.toArray(String[]::new);
}
private String stripUnnecessaryCharacters(String endpoint) {
endpoint = endpoint.strip();
if (endpoint.startsWith("/")) {
endpoint = endpoint.substring(1);
}
if (endpoint.endsWith("/")) {
endpoint = endpoint.substring(0, endpoint.length() - 1);
}
return endpoint;
}
}
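The endpoint normalization above turns configured entries such as `/api/`, `api`, or ` api ` into one canonical ant pattern before they are handed to the request matchers. A self-contained sketch of that normalization (the sample endpoint values are illustrative):

```java
import java.util.Set;
import java.util.stream.Collectors;

public class AntPatternDemo {
    // Same trimming rules as SecurityConfiguration.stripUnnecessaryCharacters.
    static String strip(String endpoint) {
        endpoint = endpoint.strip();
        if (endpoint.startsWith("/")) {
            endpoint = endpoint.substring(1);
        }
        if (endpoint.endsWith("/")) {
            endpoint = endpoint.substring(0, endpoint.length() - 1);
        }
        return endpoint;
    }

    // Same shape as buildAntPatterns, collected to a Set for easy comparison.
    static Set<String> buildAntPatterns(Set<String> endpoints) {
        return endpoints.stream()
                .filter(e -> e != null && !e.isBlank())
                .map(e -> "/" + strip(e) + "/**")
                .collect(Collectors.toSet());
    }

    public static void main(String[] args) {
        // " /api/ " and "health" normalize to "/api/**" and "/health/**"
        System.out.println(buildAntPatterns(Set.of(" /api/ ", "health")));
    }
}
```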


@ -0,0 +1,48 @@
package eu.eudat.file.transformer.controller;
import eu.eudat.file.transformer.interfaces.FileTransformerClient;
import eu.eudat.file.transformer.interfaces.FileTransformerConfiguration;
import eu.eudat.file.transformer.models.description.DescriptionFileTransformerModel;
import eu.eudat.file.transformer.models.dmp.DmpFileTransformerModel;
import eu.eudat.file.transformer.models.misc.FileEnvelope;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import java.util.List;
@RestController
@RequestMapping("/api/file")
public class FileTransformerController {
private final FileTransformerClient fileTransformerExecutor;
@Autowired
public FileTransformerController(FileTransformerClient fileTransformerExecutor) {
this.fileTransformerExecutor = fileTransformerExecutor;
}
@PostMapping("/export/dmp")
public FileEnvelope exportDmp(@RequestBody DmpFileTransformerModel dmpDepositModel) throws Exception {
return fileTransformerExecutor.exportDmp(dmpDepositModel);
}
@PostMapping("/export/description")
public FileEnvelope exportDescription(@RequestBody DescriptionFileTransformerModel descriptionFileTransformerModel, @RequestParam(value = "format", required = false) String format, @RequestParam(value = "descriptionId", required = false) String descriptionId) throws Exception {
return fileTransformerExecutor.exportDescription(descriptionFileTransformerModel, format);
}
@PostMapping("/import/dmp")
public DmpFileTransformerModel importFileToDmp(@RequestBody FileEnvelope fileEnvelope) {
return fileTransformerExecutor.importDmp(fileEnvelope);
}
@PostMapping("/import/description")
public DescriptionFileTransformerModel importFileToDescription(@RequestBody FileEnvelope fileEnvelope) {
return fileTransformerExecutor.importDescription(fileEnvelope);
}
@GetMapping("/formats")
public FileTransformerConfiguration getSupportedFormats() {
return fileTransformerExecutor.getConfiguration();
}
}


@ -0,0 +1,11 @@
spring:
jackson:
default-property-inclusion: non_null
config:
import: optional:classpath:config/app.env[.properties], optional:file:../config/app.env[.properties],
optional:classpath:config/server.yml[.yml], optional:classpath:config/server-${spring.profiles.active}.yml[.yml], optional:file:../config/server-${spring.profiles.active}.yml[.yml],
optional:classpath:config/storage.yml[.yml], optional:classpath:config/storage-${spring.profiles.active}.yml[.yml], optional:file:../config/storage-${spring.profiles.active}.yml[.yml],
optional:classpath:config/security.yml[.yml], optional:classpath:config/security-${spring.profiles.active}.yml[.yml], optional:file:../config/security-${spring.profiles.active}.yml[.yml],
optional:classpath:config/cache.yml[.yml], optional:classpath:config/cache-${spring.profiles.active}.yml[.yml], optional:file:../config/cache-${spring.profiles.active}.yml[.yml],
optional:classpath:config/pdf.yml[.yml], optional:classpath:config/pdf-${spring.profiles.active}.yml[.yml], optional:file:../config/pdf-${spring.profiles.active}.yml[.yml]


@ -0,0 +1,16 @@
cache:
manager:
fallbackToNoOpCache: true
caffeineCaches:
- names: [ "apikey" ]
allowNullValues: true
initialCapacity: 100
maximumSize: 500
enableRecordStats: false
expireAfterWriteMinutes: 10
expireAfterAccessMinutes: 10
refreshAfterWriteMinutes: 10
mapCaches:
apiKey:
name: apikey
keyPattern: resolve_$keyhash$:v0


@ -0,0 +1,3 @@
pdf:
converter:
url: ${PDF_CONVERTER_URL:}


@ -0,0 +1,20 @@
web:
security:
enabled: true
authorized-endpoints: [ api ]
allowed-endpoints: [ health ]
idp:
api-key:
enabled: true
authorization-header: Authorization
client-id: ${IDP_APIKEY_CLIENT_ID:}
client-secret: ${IDP_APIKEY_CLIENT_SECRET:}
scope: ${IDP_APIKEY_SCOPE:}
resource:
token-type: JWT #| opaque
opaque:
client-id: ${IDP_OPAQUE_CLIENT_ID:}
client-secret: ${IDP_OPAQUE_CLIENT_SECRET:}
jwt:
claims: [ role, x-role ]
issuer-uri: ${IDP_ISSUER_URI:}


@ -0,0 +1,12 @@
server:
port: 8086
tomcat:
threads:
max: 20
max-connections: 10000
spring:
servlet:
multipart:
max-file-size: 10MB
max-request-size: 10MB


@ -0,0 +1,4 @@
file:
storage:
temp: ${TEMP_PATH}
transient-path: ${TRANSIENT_PATH}

Binary file not shown.

Binary file not shown.

File diff suppressed because one or more lines are too long

Binary file not shown.


@ -0,0 +1,13 @@
<fetchConfig>
<configs>
<config>
<type>currency</type>
<fileType>xml</fileType>
<filePath>internal/iso-4217.xml</filePath>
<parseClass>eu.eudat.logic.proxy.fetching.entities.CurrencyModel</parseClass>
<parseField>currencies</parseField>
<name>currency</name>
<value>code</value>
</config>
</configs>
</fetchConfig>

File diff suppressed because it is too large Load Diff


@ -0,0 +1,186 @@
{
"aar": "aa",
"abk": "ab",
"afr": "af",
"aka": "ak",
"amh": "am",
"ara": "ar",
"arg": "an",
"asm": "as",
"ava": "av",
"ave": "ae",
"aym": "ay",
"aze": "az",
"bak": "ba",
"bam": "bm",
"bel": "be",
"ben": "bn",
"bis": "bi",
"bod": "bo",
"bos": "bs",
"bre": "br",
"bul": "bg",
"cat": "ca",
"ces": "cs",
"cha": "ch",
"che": "ce",
"chu": "cu",
"chv": "cv",
"cor": "kw",
"cos": "co",
"cre": "cr",
"cym": "cy",
"dan": "da",
"deu": "de",
"div": "dv",
"dzo": "dz",
"ell": "el",
"eng": "en",
"epo": "eo",
"est": "et",
"eus": "eu",
"ewe": "ee",
"fao": "fo",
"fas": "fa",
"fij": "fj",
"fin": "fi",
"fra": "fr",
"fry": "fy",
"ful": "ff",
"gla": "gd",
"gle": "ga",
"glg": "gl",
"glv": "gv",
"grn": "gn",
"guj": "gu",
"hat": "ht",
"hau": "ha",
"hbs": "sh",
"heb": "he",
"her": "hz",
"hin": "hi",
"hmo": "ho",
"hrv": "hr",
"hun": "hu",
"hye": "hy",
"ibo": "ig",
"ido": "io",
"iii": "ii",
"iku": "iu",
"ile": "ie",
"ina": "ia",
"ind": "id",
"ipk": "ik",
"isl": "is",
"ita": "it",
"jav": "jv",
"jpn": "ja",
"kal": "kl",
"kan": "kn",
"kas": "ks",
"kat": "ka",
"kau": "kr",
"kaz": "kk",
"khm": "km",
"kik": "ki",
"kin": "rw",
"kir": "ky",
"kom": "kv",
"kon": "kg",
"kor": "ko",
"kua": "kj",
"kur": "ku",
"lao": "lo",
"lat": "la",
"lav": "lv",
"lim": "li",
"lin": "ln",
"lit": "lt",
"ltz": "lb",
"lub": "lu",
"lug": "lg",
"mah": "mh",
"mal": "ml",
"mar": "mr",
"mkd": "mk",
"mlg": "mg",
"mlt": "mt",
"mon": "mn",
"mri": "mi",
"msa": "ms",
"mya": "my",
"nau": "na",
"nav": "nv",
"nbl": "nr",
"nde": "nd",
"ndo": "ng",
"nep": "ne",
"nld": "nl",
"nno": "nn",
"nob": "nb",
"nor": "no",
"nya": "ny",
"oci": "oc",
"oji": "oj",
"ori": "or",
"orm": "om",
"oss": "os",
"pan": "pa",
"pli": "pi",
"pol": "pl",
"por": "pt",
"pus": "ps",
"que": "qu",
"roh": "rm",
"ron": "ro",
"run": "rn",
"rus": "ru",
"sag": "sg",
"san": "sa",
"sin": "si",
"slk": "sk",
"slv": "sl",
"sme": "se",
"smo": "sm",
"sna": "sn",
"snd": "sd",
"som": "so",
"sot": "st",
"spa": "es",
"sqi": "sq",
"srd": "sc",
"srp": "sr",
"ssw": "ss",
"sun": "su",
"swa": "sw",
"swe": "sv",
"tah": "ty",
"tam": "ta",
"tat": "tt",
"tel": "te",
"tgk": "tg",
"tgl": "tl",
"tha": "th",
"tir": "ti",
"ton": "to",
"tsn": "tn",
"tso": "ts",
"tuk": "tk",
"tur": "tr",
"twi": "tw",
"uig": "ug",
"ukr": "uk",
"urd": "ur",
"uzb": "uz",
"ven": "ve",
"vie": "vi",
"vol": "vo",
"wln": "wa",
"wol": "wo",
"xho": "xh",
"yid": "yi",
"yor": "yo",
"zha": "za",
"zho": "zh",
"zul": "zu"
}


@ -0,0 +1,100 @@
{
"pidLinks": [
{
"pid": "doi",
"link": "https://doi.org/{pid}"
},
{
"pid": "uniprot",
"link": "https://uniprot.org/uniprotkb/{pid}"
},
{
"pid": "handle",
"link": "https://hdl.handle.net/{pid}"
},
{
"pid": "arxiv",
"link": "https://arxiv.org/abs/{pid}"
},
{
"pid": "ascl",
"link": "https://ascl.net/{pid}"
},
{
"pid": "orcid",
"link": "https://orcid.org/{pid}"
},
{
"pid": "pmid",
"link": "https://pubmed.ncbi.nlm.nih.gov/{pid}"
},
{
"pid": "ads",
"link": "https://ui.adsabs.harvard.edu/#abs/{pid}"
},
{
"pid": "pmcid",
"link": "https://ncbi.nlm.nih.gov/pmc/{pid}"
},
{
"pid": "gnd",
"link": "https://d-nb.info/gnd/{pid}"
},
{
"pid": "urn",
"link": "https://nbn-resolving.org/{pid}"
},
{
"pid": "sra",
"link": "https://ebi.ac.uk/ena/data/view/{pid}"
},
{
"pid": "bioproject",
"link": "https://ebi.ac.uk/ena/data/view/{pid}"
},
{
"pid": "biosample",
"link": "https://ebi.ac.uk/ena/data/view/{pid}"
},
{
"pid": "ensembl",
"link": "https://ensembl.org/id/{pid}"
},
{
"pid": "refseq",
"link": "https://ncbi.nlm.nih.gov/entrez/viewer.fcgi?val={pid}"
},
{
"pid": "genome",
"link": "https://ncbi.nlm.nih.gov/assembly/{pid}"
},
{
"pid": "geo",
"link": "https://ncbi.nlm.nih.gov/geo/query/acc.cgi?acc={pid}"
},
{
"pid": "arrayexpress_array",
"link": "https://ebi.ac.uk/arrayexpress/arrays/{pid}"
},
{
"pid": "arrayexpress_experiment",
"link": "https://ebi.ac.uk/arrayexpress/experiments/{pid}"
},
{
"pid": "hal",
"link": "https://hal.archives-ouvertes.fr/{pid}"
},
{
"pid": "swh",
"link": "https://archive.softwareheritage.org/{pid}"
},
{
"pid": "ror",
"link": "https://ror.org/{pid}"
},
{
"pid": "viaf",
"link": "https://viaf.org/viaf/{pid}"
}
]
}