Compare commits

No commits in common. "main" and "doiboost_wf" have entirely different histories.

main ... doiboost_wf

@@ -3,6 +3,8 @@
 *.iws
 *.ipr
 *.iml
+*.ipr
+*.iws
 *~
 .vscode
 .metals
@@ -25,6 +27,4 @@ spark-warehouse
 /**/job-override.properties
 /**/*.log
 /**/.factorypath
-/**/.scalafmt.conf
-/.java-version
-/dhp-shade-package/dependency-reduced-pom.xml
@@ -1,21 +0,0 @@
-style = defaultWithAlign
-
-align.openParenCallSite = false
-align.openParenDefnSite = false
-align.tokens = [{code = "->"}, {code = "<-"}, {code = "=>", owner = "Case"}]
-continuationIndent.callSite = 2
-continuationIndent.defnSite = 2
-danglingParentheses = true
-indentOperator = spray
-maxColumn = 120
-newlines.alwaysBeforeTopLevelStatements = true
-project.excludeFilters = [".*\\.sbt"]
-rewrite.rules = [AvoidInfix]
-rewrite.rules = [ExpandImportSelectors]
-rewrite.rules = [RedundantBraces]
-rewrite.rules = [RedundantParens]
-rewrite.rules = [SortImports]
-rewrite.rules = [SortModifiers]
-rewrite.rules = [PreferCurlyFors]
-spaces.inImportCurlyBraces = false
-unindentTopLevelOperators = true
@@ -1,43 +0,0 @@
-# Contributor Code of Conduct
-
-Openness, transparency and our community-driven participatory approach guide us in our day-to-day interactions and decision-making. Our open source projects are no exception. Trust, respect, collaboration and transparency are core values we believe should live and breathe within our projects. Our community welcomes participants from around the world with different experiences, unique perspectives, and great ideas to share.
-
-## Our Pledge
-
-In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.
-
-## Our Standards
-
-Examples of behavior that contributes to creating a positive environment include:
-
-- Using welcoming and inclusive language
-- Being respectful of differing viewpoints and experiences
-- Gracefully accepting constructive criticism
-- Attempting collaboration before conflict
-- Focusing on what is best for the community
-- Showing empathy towards other community members
-
-Examples of unacceptable behavior by participants include:
-
-- Violence, threats of violence, or inciting others to commit self-harm
-- The use of sexualized language or imagery and unwelcome sexual attention or advances
-- Trolling, intentionally spreading misinformation, insulting/derogatory comments, and personal or political attacks
-- Public or private harassment
-- Publishing others' private information, such as a physical or electronic address, without explicit permission
-- Abuse of the reporting process to intentionally harass or exclude others
-- Advocating for, or encouraging, any of the above behavior
-- Other conduct which could reasonably be considered inappropriate in a professional setting
-
-## Our Responsibilities
-
-Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.
-
-Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
-
-## Scope
-
-This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.
-
-## Attribution
-
-This Code of Conduct is adapted from the [Contributor Covenant](https://www.contributor-covenant.org/), [version 1.4](https://www.contributor-covenant.org/version/1/4/code-of-conduct.html).
@@ -1,10 +0,0 @@
-# Contributing to D-Net Hadoop
-
-:+1::tada: First off, thanks for taking the time to contribute! :tada::+1:
-
-This project and everyone participating in it is governed by our [Code of Conduct](CODE_OF_CONDUCT.md). By participating, you are expected to uphold this code. Please report unacceptable behavior to [dnet-team@isti.cnr.it](mailto:dnet-team@isti.cnr.it).
-
-The following is a set of guidelines for contributing to this project and its packages. These are mostly guidelines, not rules, which apply to this project as a whole, including all its sub-modules.
-Use your best judgment, and feel free to propose changes to this document in a pull request.
-
-All contributions are welcome; all contributions will be considered to be contributed under the [project license](LICENSE.md).
README.md (133)

@@ -1,133 +1,2 @@
 # dnet-hadoop
-Dnet-hadoop is the project that defined all the [OOZIE workflows](https://oozie.apache.org/) for the OpenAIRE Graph construction, processing, provisioning.
+Dnet-hadoop is the project that defined all the OOZIE workflows for the OpenAIRE Graph construction, processing, provisioning.
-
-This project adheres to the Contributor Covenant [code of conduct](CODE_OF_CONDUCT.md).
-By participating, you are expected to uphold this code. Please report unacceptable behavior to [dnet-team@isti.cnr.it](mailto:dnet-team@isti.cnr.it).
-
-This project is licensed under the [AGPL v3 or later version](#LICENSE.md).
-
-How to build, package and run oozie workflows
-====================
-
-Oozie-installer is a utility allowing building, uploading and running oozie workflows. In practice, it creates a `*.tar.gz`
-package that contains the resources that define a workflow and some helper scripts.
-
-This module is automatically executed when running:
-
-`mvn package -Poozie-package -Dworkflow.source.dir=classpath/to/parent/directory/of/oozie_app`
-
-on a module having set:
-
-```
-<parent>
-	<groupId>eu.dnetlib.dhp</groupId>
-	<artifactId>dhp-workflows</artifactId>
-</parent>
-```
-
-in its `pom.xml` file. The `oozie-package` profile initializes oozie workflow packaging, while the `workflow.source.dir` property points to
-a workflow (notice: this is not a relative path but a classpath to the directory usually holding the `oozie_app` subdirectory).
-
-The outcome of this packaging is an `oozie-package.tar.gz` file containing all the resources required to run the Oozie workflow:
-
-- jar packages
-- workflow definitions
-- job properties
-- maintenance scripts
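For example, a concrete invocation of the command above (the classpath below is a hypothetical placeholder, not a module taken from this diff) would be:

```
mvn clean package -Poozie-package \
    -Dworkflow.source.dir=eu/dnetlib/dhp/example
```

where `src/main/resources/eu/dnetlib/dhp/example/oozie_app/workflow.xml` is expected to exist in the module being built.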
-
-Required properties
-====================
-
-In order to include the proper workflow within the package, the `workflow.source.dir` property has to be set. It can be provided
-by setting the `-Dworkflow.source.dir=some/job/dir` maven parameter.
-
-In order to define the full set of cluster environment properties one should create a `~/.dhp/application.properties` file with
-the following properties:
-
-- `dhp.hadoop.frontend.user.name` - your user name on the hadoop cluster and frontend machine
-- `dhp.hadoop.frontend.host.name` - frontend host name
-- `dhp.hadoop.frontend.temp.dir` - frontend directory for temporary files
-- `dhp.hadoop.frontend.port.ssh` - frontend machine ssh port
-- `oozieServiceLoc` - oozie service location required by the run_workflow.sh script executing the oozie job
-- `nameNode` - name node address
-- `jobTracker` - job tracker address
-- `oozie.execution.log.file.location` - location of the file that will be created when executing the oozie job; it contains the output
-  produced by the `run_workflow.sh` script (needed to obtain the oozie job id)
-- `maven.executable` - mvn command location, requires parameterization due to a different setup of the CI cluster
-- `sparkDriverMemory` - amount of memory assigned to the spark jobs driver
-- `sparkExecutorMemory` - amount of memory assigned to the spark jobs executors
-- `sparkExecutorCores` - number of cores assigned to the spark jobs executors
-
-All values will be overridden with the ones from `job.properties` and eventually `job-override.properties` stored in the module's
-main folder.
-
-When overriding properties from `job.properties`, a `job-override.properties` file can be created in the main module directory
-(the one containing the `pom.xml` file) to define all the new properties that will override the existing ones.
-One can also provide those properties one by one as command line `-D` arguments.
-
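For orientation, a minimal `~/.dhp/application.properties` could look like the following sketch (all host names, ports, paths and memory sizes below are placeholder values, not settings taken from this repository):

```
dhp.hadoop.frontend.user.name=jdoe
dhp.hadoop.frontend.host.name=hadoop-frontend.example.org
dhp.hadoop.frontend.temp.dir=/tmp/jdoe
dhp.hadoop.frontend.port.ssh=22
oozieServiceLoc=http://oozie.example.org:11000/oozie
nameNode=hdfs://namenode.example.org:8020
jobTracker=jobtracker.example.org:8032
oozie.execution.log.file.location=/tmp/jdoe/oozie-execution.log
maven.executable=/usr/bin/mvn
sparkDriverMemory=4G
sparkExecutorMemory=6G
sparkExecutorCores=2
```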
-The properties overriding order is the following:
-
-1. `pom.xml` defined properties (located in the project root dir)
-2. `~/.dhp/application.properties` defined properties
-3. `${workflow.source.dir}/job.properties`
-4. `job-override.properties` (located in the project root dir)
-5. `maven -Dparam=value`
-
-where the maven `-Dparam` property overrides all the other ones.
-
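As an illustration of levels 4 and 5 above (the values are placeholders), a `job-override.properties` placed next to the module's `pom.xml` could raise the Spark resources for a single submission, and any of its entries could in turn still be overridden from the command line with `-Dparam=value`:

```
sparkDriverMemory=8G
sparkExecutorMemory=10G
sparkExecutorCores=4
```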
-Workflow definition requirements
-====================
-
-The `workflow.source.dir` property should point to the following directory structure:
-
-	[${workflow.source.dir}]
-	|-job.properties (optional)
-	\-[oozie_app]
-	  \-workflow.xml
-
-This property can be set using the maven `-D` switch.
-
-`[oozie_app]` is the default directory name; however, it can be set to any value as long as the `oozieAppDir` property is
-provided with the directory name as its value.
-
-Sub-workflows are supported as well, and sub-workflow directories should be nested within the `[oozie_app]` directory.
-
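To make the expected layout concrete, a do-nothing `oozie_app/workflow.xml` skeleton might look like the sketch below (the application name and the schema version are illustrative assumptions, not taken from this project):

```
<workflow-app name="example-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="done"/>
    <kill name="fail">
        <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="done"/>
</workflow-app>
```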
-Creating the oozie installer step-by-step
-=====================================
-
-The automated oozie-installer steps are the following:
-
-1. creating jar packages: `*.jar` and `*tests.jar`, along with copying all dependencies into `target/dependencies`
-2. reading properties from maven, `~/.dhp/application.properties`, `job.properties`, `job-override.properties`
-3. invoking the priming mechanism linking resources from the `import.txt` file (currently resolving subworkflow resources)
-4. assembling shell scripts for preparing the Hadoop filesystem, uploading the Oozie application and starting the workflow
-5. copying the whole `${workflow.source.dir}` content to `target/${oozie.package.file.name}`
-6. generating an updated `job.properties` file in `target/${oozie.package.file.name}` based on maven,
-   `~/.dhp/application.properties`, `job.properties` and `job-override.properties`
-7. creating a `lib` directory (or multiple directories for sub-workflows, one for each nested directory) and copying the jar packages
-   created at step (1) into each one of them
-8. bundling the whole `${oozie.package.file.name}` directory into a single tar.gz package
-
-Uploading the oozie package and running the workflow on the cluster
-=======================================================
-
-In order to simplify the deployment and execution process, two dedicated profiles were introduced:
-
-- `deploy`
-- `run`
-
-to be used along with the `oozie-package` profile, e.g. by providing the `-Poozie-package,deploy,run` maven parameters.
-
-The `deploy` profile supplements the packaging process with:
-1) uploading the oozie-package via scp to the `/home/${user.name}/oozie-packages` directory on the `${dhp.hadoop.frontend.host.name}` machine
-2) extracting the uploaded package
-3) uploading the oozie content to the hadoop cluster HDFS location defined in the `oozie.wf.application.path` property (generated dynamically by the maven build process, based on the `${dhp.hadoop.frontend.user.name}` and `workflow.source.dir` properties)
-
-The `run` profile introduces:
-1) executing the oozie application uploaded to the HDFS cluster by the `deploy` step. It triggers the `run_workflow.sh` script, providing the runtime properties defined in the `job.properties` file.
-
-Notice: SSH access to the frontend machine has to be configured at the system level, and it is preferable to set up key-based authentication in order to simplify remote operations.
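Putting the profiles together, a typical end-to-end invocation that packages, uploads and starts a workflow would be of the form below (the workflow path is again a hypothetical placeholder):

```
mvn clean package -Poozie-package,deploy,run \
    -Dworkflow.source.dir=eu/dnetlib/dhp/example
```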
@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>eu.dnetlib.dhp</groupId>
 		<artifactId>dhp-build</artifactId>
-		<version>1.2.5-SNAPSHOT</version>
+		<version>1.2.4-SNAPSHOT</version>
 	</parent>
 
 	<artifactId>dhp-build-assembly-resources</artifactId>
@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>eu.dnetlib.dhp</groupId>
 		<artifactId>dhp-build</artifactId>
-		<version>1.2.5-SNAPSHOT</version>
+		<version>1.2.4-SNAPSHOT</version>
 	</parent>
 
 	<artifactId>dhp-build-properties-maven-plugin</artifactId>
@@ -80,15 +80,7 @@ class WritePredefinedProjectPropertiesTest {
 		mojo.outputFile = testFolder;
 
 		// execute
-		try {
-			mojo.execute();
-			Assertions.assertTrue(false); // not reached
-		} catch (Exception e) {
-			Assertions
-				.assertTrue(
-					MojoExecutionException.class.isAssignableFrom(e.getClass()) ||
-						IllegalArgumentException.class.isAssignableFrom(e.getClass()));
-		}
+		Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
 	}
 
 	@Test
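The two sides of this hunk assert the expected failure in different ways: one catches the exception manually and inspects its type, the other relies on the JUnit 5 `assertThrows` idiom, which fails the test by itself when no exception is raised and returns the thrown instance for further checks. A minimal sketch of that idiom (a hypothetical example, not code from this repository):

```
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class ExampleTest {

	@Test
	void parsingGarbageThrows() {
		// assertThrows fails the test when no exception is raised
		// and returns the thrown instance for further inspection
		NumberFormatException e = assertThrows(
			NumberFormatException.class, () -> Integer.parseInt("not a number"));
		assertNotNull(e.getMessage());
	}
}
```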
@@ -5,7 +5,7 @@
 
 	<groupId>eu.dnetlib.dhp</groupId>
 	<artifactId>dhp-code-style</artifactId>
-	<version>1.2.5-SNAPSHOT</version>
+	<version>1.2.4-SNAPSHOT</version>
 
 	<packaging>jar</packaging>
 
@@ -22,20 +22,9 @@
 			<id>dnet45-releases</id>
 			<url>https://maven.d4science.org/nexus/content/repositories/dnet45-releases</url>
 		</repository>
-		<site>
-			<id>DHPSite</id>
-			<url>${dhp.site.stage.path}/dhp-build/dhp-code-style</url>
-		</site>
 	</distributionManagement>
 
 	<build>
-		<extensions>
-			<extension>
-				<groupId>org.apache.maven.wagon</groupId>
-				<artifactId>wagon-ssh</artifactId>
-				<version>2.10</version>
-			</extension>
-		</extensions>
 		<pluginManagement>
 			<plugins>
 				<plugin>
@@ -46,19 +35,14 @@
 				<plugin>
 					<groupId>org.apache.maven.plugins</groupId>
 					<artifactId>maven-site-plugin</artifactId>
-					<version>3.9.1</version>
-					<configuration>
-						<skip>true</skip>
-					</configuration>
+					<version>3.7.1</version>
 				</plugin>
 			</plugins>
 		</pluginManagement>
 	</build>
 
 	<properties>
 
 		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-		<dhp.site.stage.path>sftp://dnet-hadoop@static-web.d4science.org/dnet-hadoop</dhp.site.stage.path>
 	</properties>
 
 </project>
@@ -1,21 +0,0 @@
-style = defaultWithAlign
-
-align.openParenCallSite = false
-align.openParenDefnSite = false
-align.tokens = [{code = "->"}, {code = "<-"}, {code = "=>", owner = "Case"}]
-continuationIndent.callSite = 2
-continuationIndent.defnSite = 2
-danglingParentheses = true
-indentOperator = spray
-maxColumn = 120
-newlines.alwaysBeforeTopLevelStatements = true
-project.excludeFilters = [".*\\.sbt"]
-rewrite.rules = [AvoidInfix]
-rewrite.rules = [ExpandImportSelectors]
-rewrite.rules = [RedundantBraces]
-rewrite.rules = [RedundantParens]
-rewrite.rules = [SortImports]
-rewrite.rules = [SortModifiers]
-rewrite.rules = [PreferCurlyFors]
-spaces.inImportCurlyBraces = false
-unindentTopLevelOperators = true
@@ -1,21 +0,0 @@
-<?xml version="1.0" encoding="ISO-8859-1"?>
-<project xmlns="http://maven.apache.org/DECORATION/1.8.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-	xsi:schemaLocation="http://maven.apache.org/DECORATION/1.8.0 https://maven.apache.org/xsd/decoration-1.8.0.xsd"
-	name="DHP-Aggregation">
-	<skin>
-		<groupId>org.apache.maven.skins</groupId>
-		<artifactId>maven-fluido-skin</artifactId>
-		<version>1.8</version>
-	</skin>
-	<poweredBy>
-		<logo name="OpenAIRE Research Graph" href="https://graph.openaire.eu/"
-			img="https://graph.openaire.eu/assets/common-assets/logo-large-graph.png"/>
-	</poweredBy>
-	<body>
-		<links>
-			<item name="Code" href="https://code-repo.d4science.org/" />
-		</links>
-		<menu ref="modules" />
-		<menu ref="reports"/>
-	</body>
-</project>
@@ -4,15 +4,12 @@
 	<parent>
 		<groupId>eu.dnetlib.dhp</groupId>
 		<artifactId>dhp</artifactId>
-		<version>1.2.5-SNAPSHOT</version>
+		<version>1.2.4-SNAPSHOT</version>
 	</parent>
 	<artifactId>dhp-build</artifactId>
 	<packaging>pom</packaging>
 
 	<description>This module is a container for the build tools used in dnet-hadoop</description>
-	<properties>
-		<maven.javadoc.skip>true</maven.javadoc.skip>
-	</properties>
 
 	<modules>
 		<module>dhp-code-style</module>
@@ -20,12 +17,4 @@
 		<module>dhp-build-properties-maven-plugin</module>
 	</modules>
 
-
-	<distributionManagement>
-		<site>
-			<id>DHPSite</id>
-			<url>${dhp.site.stage.path}/dhp-build/</url>
-		</site>
-	</distributionManagement>
-
 </project>
@@ -1,22 +0,0 @@
-<?xml version="1.0" encoding="ISO-8859-1"?>
-<project xmlns="http://maven.apache.org/DECORATION/1.8.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-	xsi:schemaLocation="http://maven.apache.org/DECORATION/1.8.0 https://maven.apache.org/xsd/decoration-1.8.0.xsd"
-	name="DHP-Aggregation">
-	<skin>
-		<groupId>org.apache.maven.skins</groupId>
-		<artifactId>maven-fluido-skin</artifactId>
-		<version>1.8</version>
-	</skin>
-	<poweredBy>
-		<logo name="OpenAIRE Research Graph" href="https://graph.openaire.eu/"
-			img="https://graph.openaire.eu/assets/common-assets/logo-large-graph.png"/>
-	</poweredBy>
-	<body>
-		<links>
-			<item name="Code" href="https://code-repo.d4science.org/" />
-		</links>
-
-		<menu ref="modules" />
-		<menu ref="reports"/>
-	</body>
-</project>
@@ -5,7 +5,7 @@
 	<parent>
 		<groupId>eu.dnetlib.dhp</groupId>
 		<artifactId>dhp</artifactId>
-		<version>1.2.5-SNAPSHOT</version>
+		<version>1.2.4-SNAPSHOT</version>
 		<relativePath>../pom.xml</relativePath>
 
 	</parent>
@@ -13,64 +13,14 @@
 	<artifactId>dhp-common</artifactId>
 	<packaging>jar</packaging>
 
-	<distributionManagement>
-		<site>
-			<id>DHPSite</id>
-			<url>${dhp.site.stage.path}/dhp-common</url>
-		</site>
-	</distributionManagement>
-
 	<description>This module contains common utilities meant to be used across the dnet-hadoop submodules</description>
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>net.alchim31.maven</groupId>
-				<artifactId>scala-maven-plugin</artifactId>
-				<version>${net.alchim31.maven.version}</version>
-				<executions>
-					<execution>
-						<id>scala-compile-first</id>
-						<phase>initialize</phase>
-						<goals>
-							<goal>add-source</goal>
-							<goal>compile</goal>
-						</goals>
-					</execution>
-					<execution>
-						<id>scala-test-compile</id>
-						<phase>process-test-resources</phase>
-						<goals>
-							<goal>testCompile</goal>
-						</goals>
-					</execution>
-					<execution>
-						<id>scala-doc</id>
-						<phase>process-resources</phase> <!-- or wherever -->
-						<goals>
-							<goal>doc</goal>
-						</goals>
-					</execution>
-				</executions>
-				<configuration>
-					<failOnMultipleScalaVersions>true</failOnMultipleScalaVersions>
-					<scalaCompatVersion>${scala.binary.version}</scalaCompatVersion>
-					<scalaVersion>${scala.version}</scalaVersion>
-				</configuration>
-			</plugin>
-		</plugins>
-
-	</build>
 
 	<dependencies>
-		<dependency>
-			<groupId>edu.cmu</groupId>
-			<artifactId>secondstring</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>com.ibm.icu</groupId>
-			<artifactId>icu4j</artifactId>
-		</dependency>
-
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
+		</dependency>
 		<dependency>
 			<groupId>com.github.sisyphsu</groupId>
 			<artifactId>dateparser</artifactId>
@@ -82,11 +32,11 @@
 
 		<dependency>
 			<groupId>org.apache.spark</groupId>
-			<artifactId>spark-core_${scala.binary.version}</artifactId>
+			<artifactId>spark-core_2.11</artifactId>
 		</dependency>
 		<dependency>
 			<groupId>org.apache.spark</groupId>
-			<artifactId>spark-sql_${scala.binary.version}</artifactId>
+			<artifactId>spark-sql_2.11</artifactId>
 		</dependency>
 
 		<dependency>
@@ -148,6 +98,11 @@
 			<artifactId>okhttp</artifactId>
 		</dependency>
 
+		<dependency>
+			<groupId>eu.dnetlib</groupId>
+			<artifactId>dnet-pace-core</artifactId>
+		</dependency>
+
 		<dependency>
 			<groupId>org.apache.httpcomponents</groupId>
 			<artifactId>httpclient</artifactId>
@@ -169,23 +124,4 @@
 		</dependency>
 	</dependencies>
 
-	<!-- dependencies required on JDK9+ because J2EE has been removed -->
-	<profiles>
-		<profile>
-			<id>spark-34</id>
-			<dependencies>
-				<dependency>
-					<groupId>javax.xml.bind</groupId>
-					<artifactId>jaxb-api</artifactId>
-					<version>2.2.11</version>
-				</dependency>
-				<dependency>
-					<groupId>com.sun.xml.ws</groupId>
-					<artifactId>jaxws-ri</artifactId>
-					<version>2.3.3</version>
-					<type>pom</type>
-				</dependency>
-			</dependencies>
-		</profile>
-	</profiles>
 </project>
@@ -10,12 +10,6 @@ public class Constants {
 	public static final Map<String, String> accessRightsCoarMap = Maps.newHashMap();
 	public static final Map<String, String> coarCodeLabelMap = Maps.newHashMap();
 
-	public static final String ROR_NS_PREFIX = "ror_________";
-
-	public static final String ROR_OPENAIRE_ID = "10|openaire____::993a7ae7a863813cf95028b50708e222";
-
-	public static final String ROR_DATASOURCE_NAME = "Research Organization Registry (ROR)";
-
 	public static String COAR_ACCESS_RIGHT_SCHEMA = "http://vocabularies.coar-repositories.org/documentation/access_rights/";
 
 	private Constants() {
@@ -51,7 +45,6 @@ public class Constants {
 	public static final String RETRY_DELAY = "retryDelay";
 	public static final String CONNECT_TIMEOUT = "connectTimeOut";
 	public static final String READ_TIMEOUT = "readTimeOut";
-	public static final String REQUEST_METHOD = "requestMethod";
 	public static final String FROM_DATE_OVERRIDE = "fromDateOverride";
 	public static final String UNTIL_DATE_OVERRIDE = "untilDateOverride";
 
@@ -59,10 +52,4 @@ public class Constants {
 	public static final String CONTENT_INVALIDRECORDS = "InvalidRecords";
 	public static final String CONTENT_TRANSFORMEDRECORDS = "transformedItems";
 
-	// IETF Draft and used by Repositories like ZENODO , not included in APACHE HTTP java packages
-	// see https://ietf-wg-httpapi.github.io/ratelimit-headers/draft-ietf-httpapi-ratelimit-headers.html
-	public static final String HTTPHEADER_IETF_DRAFT_RATELIMIT_LIMIT = "X-RateLimit-Limit";
-	public static final String HTTPHEADER_IETF_DRAFT_RATELIMIT_REMAINING = "X-RateLimit-Remaining";
-	public static final String HTTPHEADER_IETF_DRAFT_RATELIMIT_RESET = "X-RateLimit-Reset";
-
 }
@@ -7,12 +7,12 @@ import java.sql.*;
 import java.util.function.Consumer;
 
 import org.apache.commons.lang3.StringUtils;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
 
 public class DbClient implements Closeable {
 
-	private static final Logger log = LoggerFactory.getLogger(DbClient.class);
+	private static final Log log = LogFactory.getLog(DbClient.class);
 
 	private final Connection connection;
 
@@ -37,8 +37,6 @@ public class DbClient implements Closeable {
 		try (final Statement stmt = connection.createStatement()) {
 			stmt.setFetchSize(100);
 
-			log.info("running SQL:\n\n{}\n\n", sql);
-
 			try (final ResultSet rs = stmt.executeQuery(sql)) {
 				while (rs.next()) {
 					consumer.accept(rs);
@ -0,0 +1,413 @@
|
||||||
|
|
||||||
|
package eu.dnetlib.dhp.common;
|
||||||
|
|
||||||
|
import java.io.Serializable;
|
||||||
|
import java.util.*;
|
||||||
|
import java.util.stream.Collectors;
|
||||||
|
|
||||||
|
import eu.dnetlib.dhp.schema.common.ModelConstants;
|
||||||
|
import eu.dnetlib.dhp.schema.dump.oaf.*;
|
||||||
|
import eu.dnetlib.dhp.schema.dump.oaf.community.CommunityInstance;
|
||||||
|
import eu.dnetlib.dhp.schema.dump.oaf.community.CommunityResult;
|
||||||
|
import eu.dnetlib.dhp.schema.oaf.DataInfo;
|
||||||
|
import eu.dnetlib.dhp.schema.oaf.Field;
|
||||||
|
import eu.dnetlib.dhp.schema.oaf.Journal;
|
||||||
|
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
|
||||||
|
|
||||||
|
public class GraphResultMapper implements Serializable {
|
||||||
|
|
||||||
|
public static <E extends eu.dnetlib.dhp.schema.oaf.OafEntity> Result map(
|
||||||
|
E in) {
|
||||||
|
|
||||||
|
CommunityResult out = new CommunityResult();
|
||||||
|
|
||||||
|
eu.dnetlib.dhp.schema.oaf.Result input = (eu.dnetlib.dhp.schema.oaf.Result) in;
|
||||||
|
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> ort = Optional.ofNullable(input.getResulttype());
|
||||||
|
if (ort.isPresent()) {
|
||||||
|
switch (ort.get().getClassid()) {
|
||||||
|
case "publication":
|
||||||
|
Optional<Journal> journal = Optional
|
||||||
|
.ofNullable(((eu.dnetlib.dhp.schema.oaf.Publication) input).getJournal());
|
||||||
|
if (journal.isPresent()) {
|
||||||
|
Journal j = journal.get();
|
||||||
|
Container c = new Container();
|
||||||
|
c.setConferencedate(j.getConferencedate());
|
||||||
|
c.setConferenceplace(j.getConferenceplace());
|
||||||
|
c.setEdition(j.getEdition());
|
||||||
|
c.setEp(j.getEp());
|
||||||
|
c.setIss(j.getIss());
|
||||||
|
c.setIssnLinking(j.getIssnLinking());
|
||||||
|
c.setIssnOnline(j.getIssnOnline());
|
||||||
|
c.setIssnPrinted(j.getIssnPrinted());
|
||||||
|
c.setName(j.getName());
|
||||||
|
c.setSp(j.getSp());
|
||||||
|
c.setVol(j.getVol());
|
||||||
|
out.setContainer(c);
|
||||||
|
out.setType(ModelConstants.PUBLICATION_DEFAULT_RESULTTYPE.getClassname());
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
case "dataset":
|
||||||
|
eu.dnetlib.dhp.schema.oaf.Dataset id = (eu.dnetlib.dhp.schema.oaf.Dataset) input;
|
||||||
|
Optional.ofNullable(id.getSize()).ifPresent(v -> out.setSize(v.getValue()));
|
||||||
|
Optional.ofNullable(id.getVersion()).ifPresent(v -> out.setVersion(v.getValue()));
|
||||||
|
|
||||||
|
out
|
||||||
|
.setGeolocation(
|
||||||
|
Optional
|
||||||
|
.ofNullable(id.getGeolocation())
|
||||||
|
.map(
|
||||||
|
igl -> igl
|
||||||
|
.stream()
|
||||||
|
.filter(Objects::nonNull)
|
||||||
|
.map(gli -> {
|
||||||
|
GeoLocation gl = new GeoLocation();
|
||||||
|
gl.setBox(gli.getBox());
|
||||||
|
gl.setPlace(gli.getPlace());
|
||||||
|
gl.setPoint(gli.getPoint());
|
||||||
|
return gl;
|
||||||
|
})
|
||||||
|
.collect(Collectors.toList()))
|
||||||
|
.orElse(null));
|
||||||
|
|
||||||
|
out.setType(ModelConstants.DATASET_DEFAULT_RESULTTYPE.getClassname());
|
||||||
|
break;
|
||||||
|
case "software":
|
||||||
|
|
||||||
|
eu.dnetlib.dhp.schema.oaf.Software is = (eu.dnetlib.dhp.schema.oaf.Software) input;
|
||||||
|
Optional
|
||||||
|
.ofNullable(is.getCodeRepositoryUrl())
|
||||||
|
.ifPresent(value -> out.setCodeRepositoryUrl(value.getValue()));
|
||||||
|
Optional
|
||||||
|
.ofNullable(is.getDocumentationUrl())
|
||||||
|
.ifPresent(
|
||||||
|
value -> out
|
||||||
|
.setDocumentationUrl(
|
||||||
|
value
|
||||||
|
.stream()
|
||||||
|
.map(Field::getValue)
|
||||||
|
.collect(Collectors.toList())));
|
||||||
|
|
||||||
|
Optional
|
||||||
|
.ofNullable(is.getProgrammingLanguage())
|
||||||
|
.ifPresent(value -> out.setProgrammingLanguage(value.getClassid()));
|
||||||
|
|
||||||
|
out.setType(ModelConstants.SOFTWARE_DEFAULT_RESULTTYPE.getClassname());
|
||||||
|
break;
|
||||||
|
case "other":
|
||||||
|
|
||||||
|
eu.dnetlib.dhp.schema.oaf.OtherResearchProduct ir = (eu.dnetlib.dhp.schema.oaf.OtherResearchProduct) input;
|
||||||
|
out
|
||||||
|
.setContactgroup(
|
||||||
|
Optional
|
||||||
|
.ofNullable(ir.getContactgroup())
|
||||||
|
.map(value -> value.stream().map(Field::getValue).collect(Collectors.toList()))
|
||||||
|
.orElse(null));
|
||||||
|
|
||||||
|
out
|
||||||
|
.setContactperson(
|
||||||
|
Optional
|
||||||
|
.ofNullable(ir.getContactperson())
|
||||||
|
.map(value -> value.stream().map(Field::getValue).collect(Collectors.toList()))
|
||||||
|
.orElse(null));
|
||||||
|
out
|
||||||
|
.setTool(
|
||||||
|
Optional
|
||||||
|
.ofNullable(ir.getTool())
|
||||||
|
.map(value -> value.stream().map(Field::getValue).collect(Collectors.toList()))
|
||||||
|
.orElse(null));
|
||||||
|
|
||||||
|
out.setType(ModelConstants.ORP_DEFAULT_RESULTTYPE.getClassname());
|
||||||
|
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getAuthor())
|
||||||
|
.ifPresent(
|
||||||
|
ats -> out.setAuthor(ats.stream().map(GraphResultMapper::getAuthor).collect(Collectors.toList())));
|
||||||
|
|
||||||
|
// I do not map Access Right UNKNOWN or OTHER
|
||||||
|
|
||||||
|
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> oar = Optional.ofNullable(input.getBestaccessright());
|
||||||
|
if (oar.isPresent()) {
|
||||||
|
if (Constants.accessRightsCoarMap.containsKey(oar.get().getClassid())) {
|
||||||
|
String code = Constants.accessRightsCoarMap.get(oar.get().getClassid());
|
||||||
|
out
|
||||||
|
.setBestaccessright(
|
||||||
|
AccessRight
|
||||||
|
.newInstance(
|
||||||
|
code,
|
||||||
|
Constants.coarCodeLabelMap.get(code),
|
||||||
|
Constants.COAR_ACCESS_RIGHT_SCHEMA));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
final List<String> contributorList = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getContributor())
|
||||||
|
.ifPresent(value -> value.stream().forEach(c -> contributorList.add(c.getValue())));
|
||||||
|
out.setContributor(contributorList);
|
||||||
|
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getCountry())
|
||||||
|
.ifPresent(
|
||||||
|
value -> out
|
||||||
|
.setCountry(
|
||||||
|
value
|
||||||
|
.stream()
|
||||||
|
.map(
|
||||||
|
c -> {
|
||||||
|
if (c.getClassid().equals((ModelConstants.UNKNOWN))) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
Country country = new Country();
|
||||||
|
country.setCode(c.getClassid());
|
||||||
|
country.setLabel(c.getClassname());
|
||||||
|
Optional
|
||||||
|
.ofNullable(c.getDataInfo())
|
||||||
|
.ifPresent(
|
||||||
|
provenance -> country
|
||||||
|
.setProvenance(
|
||||||
|
Provenance
|
||||||
|
.newInstance(
|
||||||
|
provenance
|
||||||
|
.getProvenanceaction()
|
||||||
|
.getClassname(),
|
||||||
|
c.getDataInfo().getTrust())));
|
||||||
|
return country;
|
||||||
|
})
|
||||||
|
.filter(Objects::nonNull)
|
||||||
|
.collect(Collectors.toList())));
|
||||||
|
|
||||||
|
final List<String> coverageList = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getCoverage())
|
||||||
|
.ifPresent(value -> value.stream().forEach(c -> coverageList.add(c.getValue())));
|
||||||
|
out.setCoverage(coverageList);
|
||||||
|
|
||||||
|
out.setDateofcollection(input.getDateofcollection());
|
||||||
|
|
||||||
|
final List<String> descriptionList = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getDescription())
|
||||||
|
.ifPresent(value -> value.forEach(d -> descriptionList.add(d.getValue())));
|
||||||
|
out.setDescription(descriptionList);
|
||||||
|
Optional<Field<String>> oStr = Optional.ofNullable(input.getEmbargoenddate());
|
||||||
|
if (oStr.isPresent()) {
|
||||||
|
out.setEmbargoenddate(oStr.get().getValue());
|
||||||
|
}
|
||||||
|
|
||||||
|
final List<String> formatList = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getFormat())
|
||||||
|
.ifPresent(value -> value.stream().forEach(f -> formatList.add(f.getValue())));
|
||||||
|
out.setFormat(formatList);
|
||||||
|
out.setId(input.getId());
|
||||||
|
out.setOriginalId(input.getOriginalId());
|
||||||
|
|
||||||
|
Optional<List<eu.dnetlib.dhp.schema.oaf.Instance>> oInst = Optional
|
||||||
|
.ofNullable(input.getInstance());
|
||||||
|
|
||||||
|
if (oInst.isPresent()) {
|
||||||
|
out
|
||||||
|
.setInstance(
|
||||||
|
oInst.get().stream().map(GraphResultMapper::getInstance).collect(Collectors.toList()));
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> oL = Optional.ofNullable(input.getLanguage());
|
||||||
|
if (oL.isPresent()) {
|
||||||
|
eu.dnetlib.dhp.schema.oaf.Qualifier language = oL.get();
|
||||||
|
out.setLanguage(Qualifier.newInstance(language.getClassid(), language.getClassname()));
|
||||||
|
}
|
||||||
|
Optional<Long> oLong = Optional.ofNullable(input.getLastupdatetimestamp());
|
||||||
|
if (oLong.isPresent()) {
|
||||||
|
out.setLastupdatetimestamp(oLong.get());
|
||||||
|
}
|
||||||
|
Optional<List<StructuredProperty>> otitle = Optional.ofNullable(input.getTitle());
|
||||||
|
if (otitle.isPresent()) {
|
||||||
|
List<StructuredProperty> iTitle = otitle
|
||||||
|
.get()
|
||||||
|
.stream()
|
||||||
|
.filter(t -> t.getQualifier().getClassid().equalsIgnoreCase("main title"))
|
||||||
|
.collect(Collectors.toList());
|
||||||
|
if (!iTitle.isEmpty()) {
|
||||||
|
out.setMaintitle(iTitle.get(0).getValue());
|
||||||
|
}
|
||||||
|
|
||||||
|
iTitle = otitle
|
||||||
|
.get()
|
||||||
|
.stream()
|
||||||
|
.filter(t -> t.getQualifier().getClassid().equalsIgnoreCase("subtitle"))
|
||||||
|
.collect(Collectors.toList());
|
||||||
|
if (!iTitle.isEmpty()) {
|
||||||
|
out.setSubtitle(iTitle.get(0).getValue());
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
List<ControlledField> pids = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getPid())
|
||||||
|
.ifPresent(
|
||||||
|
value -> value
|
||||||
|
.stream()
|
||||||
|
.forEach(
|
||||||
|
p -> pids
|
||||||
|
.add(
|
||||||
|
ControlledField
|
||||||
|
.newInstance(p.getQualifier().getClassid(), p.getValue()))));
|
||||||
|
out.setPid(pids);
|
||||||
|
oStr = Optional.ofNullable(input.getDateofacceptance());
|
||||||
|
if (oStr.isPresent()) {
|
||||||
|
out.setPublicationdate(oStr.get().getValue());
|
||||||
|
}
|
||||||
|
oStr = Optional.ofNullable(input.getPublisher());
|
||||||
|
if (oStr.isPresent()) {
|
||||||
|
out.setPublisher(oStr.get().getValue());
|
||||||
|
}
|
||||||
|
|
||||||
|
List<String> sourceList = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getSource())
|
||||||
|
.ifPresent(value -> value.stream().forEach(s -> sourceList.add(s.getValue())));
|
||||||
|
// out.setSource(input.getSource().stream().map(s -> s.getValue()).collect(Collectors.toList()));
|
||||||
|
List<Subject> subjectList = new ArrayList<>();
|
||||||
|
Optional
|
||||||
|
.ofNullable(input.getSubject())
|
||||||
|
.ifPresent(
|
||||||
|
value -> value
|
||||||
|
.forEach(s -> subjectList.add(getSubject(s))));
|
||||||
|
|
||||||
|
out.setSubjects(subjectList);
|
||||||
|
|
||||||
|
out.setType(input.getResulttype().getClassid());
|
||||||
|
}
|
||||||
|
|
||||||
|
out
|
||||||
|
.setCollectedfrom(
|
||||||
|
input
|
||||||
|
.getCollectedfrom()
|
||||||
|
.stream()
|
||||||
|
.map(cf -> KeyValue.newInstance(cf.getKey(), cf.getValue()))
|
||||||
|
.collect(Collectors.toList()));
|
||||||
|
|
||||||
|
return out;
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
private static CommunityInstance getInstance(eu.dnetlib.dhp.schema.oaf.Instance i) {
|
||||||
|
CommunityInstance instance = new CommunityInstance();
|
||||||
|
|
||||||
|
setCommonValue(i, instance);
|
||||||
|
|
||||||
|
instance
|
||||||
|
.setCollectedfrom(
|
||||||
|
KeyValue
|
||||||
|
.newInstance(i.getCollectedfrom().getKey(), i.getCollectedfrom().getValue()));
|
||||||
|
|
||||||
|
instance
|
||||||
|
.setHostedby(
|
||||||
|
KeyValue.newInstance(i.getHostedby().getKey(), i.getHostedby().getValue()));
|
||||||
|
|
||||||
|
return instance;
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
private static <I extends Instance> void setCommonValue(eu.dnetlib.dhp.schema.oaf.Instance i, I instance) {
|
||||||
|
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> opAr = Optional
|
||||||
|
.ofNullable(i.getAccessright());
|
||||||
|
if (opAr.isPresent()) {
|
||||||
|
if (Constants.accessRightsCoarMap.containsKey(opAr.get().getClassid())) {
|
||||||
|
String code = Constants.accessRightsCoarMap.get(opAr.get().getClassid());
|
||||||
|
instance
|
||||||
|
.setAccessright(
|
||||||
|
AccessRight
|
||||||
|
.newInstance(
|
||||||
|
code,
|
||||||
|
Constants.coarCodeLabelMap.get(code),
|
||||||
|
Constants.COAR_ACCESS_RIGHT_SCHEMA));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
Optional
|
||||||
|
.ofNullable(i.getLicense())
|
||||||
|
.ifPresent(value -> instance.setLicense(value.getValue()));
|
||||||
|
Optional
|
||||||
|
.ofNullable(i.getDateofacceptance())
|
||||||
|
.ifPresent(value -> instance.setPublicationdate(value.getValue()));
|
||||||
|
Optional
|
||||||
|
.ofNullable(i.getRefereed())
|
||||||
|
.ifPresent(value -> instance.setRefereed(value.getClassname()));
|
||||||
|
Optional
|
||||||
|
.ofNullable(i.getInstancetype())
|
||||||
|
.ifPresent(value -> instance.setType(value.getClassname()));
|
||||||
|
Optional.ofNullable(i.getUrl()).ifPresent(value -> instance.setUrl(value));
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
private static Subject getSubject(StructuredProperty s) {
|
||||||
|
Subject subject = new Subject();
|
||||||
|
subject.setSubject(ControlledField.newInstance(s.getQualifier().getClassid(), s.getValue()));
|
||||||
|
Optional<DataInfo> di = Optional.ofNullable(s.getDataInfo());
|
||||||
|
if (di.isPresent()) {
|
||||||
|
Provenance p = new Provenance();
|
||||||
|
p.setProvenance(di.get().getProvenanceaction().getClassname());
|
||||||
|
p.setTrust(di.get().getTrust());
|
||||||
|
subject.setProvenance(p);
|
||||||
|
}
|
||||||
|
|
||||||
|
return subject;
|
||||||
|
}
|
||||||
|
|
||||||
|
private static Author getAuthor(eu.dnetlib.dhp.schema.oaf.Author oa) {
|
||||||
|
Author a = new Author();
|
||||||
|
a.setFullname(oa.getFullname());
|
||||||
|
a.setName(oa.getName());
|
||||||
|
a.setSurname(oa.getSurname());
|
||||||
|
a.setRank(oa.getRank());
|
||||||
|
|
||||||
|
Optional<List<StructuredProperty>> oPids = Optional
|
||||||
|
.ofNullable(oa.getPid());
|
||||||
|
if (oPids.isPresent()) {
|
||||||
|
Pid pid = getOrcid(oPids.get());
|
||||||
|
if (pid != null) {
|
||||||
|
a.setPid(pid);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return a;
|
||||||
|
}
|
||||||
|
|
||||||
|
private static Pid getOrcid(List<StructuredProperty> p) {
|
||||||
|
for (StructuredProperty pid : p) {
|
||||||
|
if (pid.getQualifier().getClassid().equals(ModelConstants.ORCID)) {
|
||||||
|
Optional<DataInfo> di = Optional.ofNullable(pid.getDataInfo());
|
||||||
|
if (di.isPresent()) {
|
||||||
|
return Pid
|
||||||
|
.newInstance(
|
||||||
|
ControlledField
|
||||||
|
.newInstance(
|
||||||
|
pid.getQualifier().getClassid(),
|
||||||
|
pid.getValue()),
|
||||||
|
Provenance
|
||||||
|
.newInstance(
|
||||||
|
di.get().getProvenanceaction().getClassname(),
|
||||||
|
di.get().getTrust()));
|
||||||
|
} else {
|
||||||
|
return Pid
|
||||||
|
.newInstance(
|
||||||
|
ControlledField
|
||||||
|
.newInstance(
|
||||||
|
pid.getQualifier().getClassid(),
|
||||||
|
pid.getValue())
|
||||||
|
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
|
@@ -28,7 +28,7 @@ public class HdfsSupport {
 	 * @param configuration Configuration of hadoop env
 	 */
 	public static boolean exists(String path, Configuration configuration) {
-		logger.info("Checking existence for path: {}", path);
+		logger.info("Removing path: {}", path);
 		return rethrowAsRuntimeException(
 			() -> {
 				Path f = new Path(path);
@ -1,100 +0,0 @@
|
||||||
|
|
||||||
package eu.dnetlib.dhp.common;
|
|
||||||
|
|
||||||
/**
|
|
||||||
* This utility represent the Metadata Store information
|
|
||||||
* needed during the migration from mongo to HDFS to store
|
|
||||||
*/
|
|
||||||
public class MDStoreInfo {
|
|
||||||
private String mdstore;
|
|
||||||
private String currentId;
|
|
||||||
private Long latestTimestamp;
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Instantiates a new Md store info.
|
|
||||||
*/
|
|
||||||
public MDStoreInfo() {
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Instantiates a new Md store info.
|
|
||||||
*
|
|
||||||
* @param mdstore the mdstore
|
|
||||||
* @param currentId the current id
|
|
||||||
* @param latestTimestamp the latest timestamp
|
|
||||||
*/
|
|
||||||
public MDStoreInfo(String mdstore, String currentId, Long latestTimestamp) {
|
|
||||||
this.mdstore = mdstore;
|
|
||||||
this.currentId = currentId;
|
|
||||||
this.latestTimestamp = latestTimestamp;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Gets mdstore.
|
|
||||||
*
|
|
||||||
* @return the mdstore
|
|
||||||
*/
|
|
||||||
public String getMdstore() {
|
|
||||||
return mdstore;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Sets mdstore.
|
|
||||||
*
|
|
||||||
* @param mdstore the mdstore
|
|
||||||
* @return the mdstore
|
|
||||||
*/
|
|
||||||
public MDStoreInfo setMdstore(String mdstore) {
|
|
||||||
this.mdstore = mdstore;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Gets current id.
|
|
||||||
*
|
|
||||||
* @return the current id
|
|
||||||
*/
|
|
||||||
public String getCurrentId() {
|
|
||||||
return currentId;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Sets current id.
|
|
||||||
*
|
|
||||||
* @param currentId the current id
|
|
||||||
* @return the current id
|
|
||||||
*/
|
|
||||||
public MDStoreInfo setCurrentId(String currentId) {
|
|
||||||
this.currentId = currentId;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Gets latest timestamp.
|
|
||||||
*
|
|
||||||
* @return the latest timestamp
|
|
||||||
*/
|
|
||||||
public Long getLatestTimestamp() {
|
|
||||||
return latestTimestamp;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Sets latest timestamp.
|
|
||||||
*
|
|
||||||
* @param latestTimestamp the latest timestamp
|
|
||||||
* @return the latest timestamp
|
|
||||||
*/
|
|
||||||
public MDStoreInfo setLatestTimestamp(Long latestTimestamp) {
|
|
||||||
this.latestTimestamp = latestTimestamp;
|
|
||||||
return this;
|
|
||||||
}
|
|
||||||
|
|
||||||
@Override
|
|
||||||
public String toString() {
|
|
||||||
return "MDStoreInfo{" +
|
|
||||||
"mdstore='" + mdstore + '\'' +
|
|
||||||
", currentId='" + currentId + '\'' +
|
|
||||||
", latestTimestamp=" + latestTimestamp +
|
|
||||||
'}';
|
|
||||||
}
|
|
||||||
}
|
|
|
@ -5,71 +5,13 @@ import java.io.BufferedInputStream;
|
||||||
import java.io.IOException;
|
import java.io.IOException;
|
||||||
import java.io.InputStream;
|
import java.io.InputStream;
|
||||||
import java.io.Serializable;
|
import java.io.Serializable;
|
||||||
import java.util.Optional;
|
|
||||||
|
|
||||||
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
|
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
|
||||||
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;
|
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;
|
||||||
import org.apache.commons.io.IOUtils;
|
|
||||||
import org.apache.hadoop.conf.Configuration;
|
|
||||||
import org.apache.hadoop.fs.*;
|
import org.apache.hadoop.fs.*;
|
||||||
import org.slf4j.Logger;
|
|
||||||
import org.slf4j.LoggerFactory;
|
|
||||||
|
|
||||||
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
|
|
||||||
|
|
||||||
public class MakeTarArchive implements Serializable {
|
public class MakeTarArchive implements Serializable {
|
||||||
|
|
||||||
private static final Logger log = LoggerFactory.getLogger(MakeTarArchive.class);
|
|
||||||
|
|
||||||
public static void main(String[] args) throws Exception {
|
|
||||||
String jsonConfiguration = IOUtils
|
|
||||||
.toString(
|
|
||||||
MakeTarArchive.class
|
|
||||||
.getResourceAsStream(
|
|
||||||
"/eu/dnetlib/dhp/common/input_maketar_parameters.json"));
|
|
||||||
|
|
||||||
final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
|
|
||||||
parser.parseArgument(args);
|
|
||||||
|
|
||||||
final String outputPath = parser.get("hdfsPath");
|
|
||||||
log.info("hdfsPath: {}", outputPath);
|
|
||||||
|
|
||||||
final String hdfsNameNode = parser.get("nameNode");
|
|
||||||
log.info("nameNode: {}", hdfsNameNode);
|
|
||||||
|
|
||||||
final String inputPath = parser.get("sourcePath");
|
|
||||||
log.info("input path : {}", inputPath);
|
|
||||||
|
|
||||||
final int gBperSplit = Optional
|
|
||||||
.ofNullable(parser.get("splitSize"))
|
|
||||||
.map(Integer::valueOf)
|
|
||||||
.orElse(10);
|
|
||||||
|
|
||||||
Configuration conf = new Configuration();
|
|
||||||
conf.set("fs.defaultFS", hdfsNameNode);
|
|
||||||
|
|
||||||
FileSystem fileSystem = FileSystem.get(conf);
|
|
||||||
|
|
||||||
makeTArArchive(fileSystem, inputPath, outputPath, gBperSplit);
|
|
||||||
|
|
||||||
}
|
|
||||||
|
|
||||||
public static void makeTArArchive(FileSystem fileSystem, String inputPath, String outputPath, int gBperSplit)
|
|
||||||
throws IOException {
|
|
||||||
|
|
||||||
RemoteIterator<LocatedFileStatus> dirIterator = fileSystem.listLocatedStatus(new Path(inputPath));
|
|
||||||
|
|
||||||
while (dirIterator.hasNext()) {
|
|
||||||
LocatedFileStatus fileStatus = dirIterator.next();
|
|
||||||
|
|
||||||
Path p = fileStatus.getPath();
|
|
||||||
String pathString = p.toString();
|
|
||||||
 		String entity = pathString.substring(pathString.lastIndexOf("/") + 1);

 		MakeTarArchive.tarMaxSize(fileSystem, pathString, outputPath + "/" + entity, entity, gBperSplit);
 	}
 }

 private static TarArchiveOutputStream getTar(FileSystem fileSystem, String outputPath) throws IOException {
 	Path hdfsWritePath = new Path(outputPath);
 	if (fileSystem.exists(hdfsWritePath)) {
@@ -79,7 +21,7 @@ public class MakeTarArchive implements Serializable {
 	return new TarArchiveOutputStream(fileSystem.create(hdfsWritePath).getWrappedStream());
 }

-private static void write(FileSystem fileSystem, String inputPath, String outputPath, String dirName)
+private static void write(FileSystem fileSystem, String inputPath, String outputPath, String dir_name)
 	throws IOException {

 	Path hdfsWritePath = new Path(outputPath);
@@ -95,7 +37,7 @@ public class MakeTarArchive implements Serializable {
 		new Path(inputPath), true);

 	while (iterator.hasNext()) {
-		writeCurrentFile(fileSystem, dirName, iterator, ar, 0);
+		writeCurrentFile(fileSystem, dir_name, iterator, ar, 0);
 	}

 }
@@ -117,40 +59,35 @@ public class MakeTarArchive implements Serializable {
 		new Path(inputPath), true);
 	boolean next = fileStatusListIterator.hasNext();
 	while (next) {
-		try (TarArchiveOutputStream ar = getTar(fileSystem, outputPath + "_" + (partNum + 1) + ".tar")) {
+		TarArchiveOutputStream ar = getTar(fileSystem, outputPath + "_" + (partNum + 1) + ".tar");

-			long currentSize = 0;
-			while (next && currentSize < bytesPerSplit) {
-				currentSize = writeCurrentFile(fileSystem, dir_name, fileStatusListIterator, ar, currentSize);
-				next = fileStatusListIterator.hasNext();
-			}
+		long current_size = 0;
+		while (next && current_size < bytesPerSplit) {
+			current_size = writeCurrentFile(fileSystem, dir_name, fileStatusListIterator, ar, current_size);
+			next = fileStatusListIterator.hasNext();
+		}

-			partNum += 1;
-		}
+		partNum += 1;
+		ar.close();
 	}

 }

-private static long writeCurrentFile(FileSystem fileSystem, String dirName,
+private static long writeCurrentFile(FileSystem fileSystem, String dir_name,
 	RemoteIterator<LocatedFileStatus> fileStatusListIterator,
-	TarArchiveOutputStream ar, long currentSize) throws IOException {
+	TarArchiveOutputStream ar, long current_size) throws IOException {
 	LocatedFileStatus fileStatus = fileStatusListIterator.next();

 	Path p = fileStatus.getPath();
-	String pString = p.toString();
-	if (!pString.endsWith("_SUCCESS")) {
-		String name = pString.substring(pString.lastIndexOf("/") + 1);
-		if (name.startsWith("part-") & name.length() > 10) {
-			String tmp = name.substring(0, 10);
-			if (name.contains(".")) {
-				tmp += name.substring(name.indexOf("."));
-			}
-			name = tmp;
-		}
-		TarArchiveEntry entry = new TarArchiveEntry(dirName + "/" + name);
+	String p_string = p.toString();
+	if (!p_string.endsWith("_SUCCESS")) {
+		String name = p_string.substring(p_string.lastIndexOf("/") + 1);
+		TarArchiveEntry entry = new TarArchiveEntry(dir_name + "/" + name);
 		entry.setSize(fileStatus.getLen());
-		currentSize += fileStatus.getLen();
+		current_size += fileStatus.getLen();
 		ar.putArchiveEntry(entry);

 		InputStream is = fileSystem.open(fileStatus.getPath());
@@ -166,7 +103,7 @@ public class MakeTarArchive implements Serializable {
 		ar.closeArchiveEntry();

 	}
-	return currentSize;
+	return current_size;
 }

 }

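For orientation, the hunks above split a directory dump into numbered tar parts capped at bytesPerSplit, and on the main side of the comparison each part is closed through try-with-resources instead of an explicit ar.close(). A minimal driver sketch, assuming a reachable HDFS namenode and purely illustrative paths and split size:

	// Sketch only: namenode URL, paths and the 10 GB split are placeholders, not values from the workflow.
	import org.apache.hadoop.conf.Configuration;
	import org.apache.hadoop.fs.FileSystem;

	Configuration conf = new Configuration();
	conf.set("fs.defaultFS", "hdfs://namenode:8020");
	FileSystem fileSystem = FileSystem.get(conf);

	// produces /tmp/dump/publication_1.tar, /tmp/dump/publication_2.tar, ... each below ~10 GB
	MakeTarArchive.tarMaxSize(fileSystem, "/tmp/graph/publication", "/tmp/dump/publication", "publication", 10);
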
@@ -1,12 +1,12 @@

 package eu.dnetlib.dhp.common;

-import static com.mongodb.client.model.Sorts.descending;
-
 import java.io.Closeable;
 import java.io.IOException;
-import java.util.*;
-import java.util.stream.Collectors;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Optional;
 import java.util.stream.StreamSupport;

 import org.apache.commons.lang3.StringUtils;
@@ -38,26 +38,6 @@ public class MdstoreClient implements Closeable {
 		this.db = getDb(client, dbName);
 	}

-	private Long parseTimestamp(Document f) {
-		if (f == null || !f.containsKey("timestamp"))
-			return null;
-
-		Object ts = f.get("timestamp");
-
-		return Long.parseLong(ts.toString());
-	}
-
-	public Long getLatestTimestamp(final String collectionId) {
-		MongoCollection<Document> collection = db.getCollection(collectionId);
-		FindIterable<Document> result = collection.find().sort(descending("timestamp")).limit(1);
-		if (result == null) {
-			return null;
-		}
-
-		Document f = result.first();
-		return parseTimestamp(f);
-	}
-
 	public MongoCollection<Document> mdStore(final String mdId) {
 		BasicDBObject query = (BasicDBObject) QueryBuilder.start("mdId").is(mdId).get();

@@ -74,16 +54,6 @@ public class MdstoreClient implements Closeable {
 		return getColl(db, currentId, true);
 	}

-	public List<MDStoreInfo> mdStoreWithTimestamp(final String mdFormat, final String mdLayout,
-		final String mdInterpretation) {
-		Map<String, String> res = validCollections(mdFormat, mdLayout, mdInterpretation);
-		return res
-			.entrySet()
-			.stream()
-			.map(e -> new MDStoreInfo(e.getKey(), e.getValue(), getLatestTimestamp(e.getValue())))
-			.collect(Collectors.toList());
-	}
-
 	public Map<String, String> validCollections(
 		final String mdFormat, final String mdLayout, final String mdInterpretation) {

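The block present only on the main side adds timestamp handling: getLatestTimestamp sorts a metadata store collection by its timestamp field, and mdStoreWithTimestamp pairs every valid collection with that value. A hedged usage sketch, assuming the two-argument constructor used elsewhere in the class; the connection URI, database name and format/layout/interpretation values are placeholders:

	// Sketch only: "mongodb://localhost:27017", "mdstore" and the format triple are illustrative.
	try (MdstoreClient client = new MdstoreClient("mongodb://localhost:27017", "mdstore")) {
		for (MDStoreInfo info : client.mdStoreWithTimestamp("ODF", "store", "cleaned")) {
			System.out.println(info); // one entry per valid mdstore, with its latest timestamp (or null)
		}
	}
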
@ -1,18 +1,18 @@
|
||||||
|
|
||||||
package eu.dnetlib.dhp.common;
|
package eu.dnetlib.dhp.common;
|
||||||
|
|
||||||
import java.io.IOException;
|
|
||||||
import java.nio.charset.StandardCharsets;
|
import java.nio.charset.StandardCharsets;
|
||||||
import java.text.Normalizer;
|
import java.text.Normalizer;
|
||||||
import java.util.*;
|
import java.util.HashSet;
|
||||||
import java.util.stream.Collectors;
|
import java.util.List;
|
||||||
|
import java.util.Set;
|
||||||
|
|
||||||
import org.apache.commons.io.IOUtils;
|
import org.apache.commons.io.IOUtils;
|
||||||
import org.apache.commons.lang3.text.WordUtils;
|
import org.apache.commons.lang3.text.WordUtils;
|
||||||
|
|
||||||
import com.ctc.wstx.dtd.LargePrefixedNameSet;
|
|
||||||
import com.google.common.base.Joiner;
|
import com.google.common.base.Joiner;
|
||||||
import com.google.common.base.Splitter;
|
import com.google.common.base.Splitter;
|
||||||
|
import com.google.common.collect.Iterables;
|
||||||
import com.google.common.collect.Lists;
|
import com.google.common.collect.Lists;
|
||||||
import com.google.common.hash.Hashing;
|
import com.google.common.hash.Hashing;
|
||||||
|
|
||||||
|
@ -29,19 +29,7 @@ public class PacePerson {
|
||||||
private List<String> fullname = Lists.newArrayList();
|
private List<String> fullname = Lists.newArrayList();
|
||||||
private final String original;
|
private final String original;
|
||||||
|
|
||||||
private static Set<String> particles;
|
private static Set<String> particles = null;
|
||||||
|
|
||||||
static {
|
|
||||||
try {
|
|
||||||
particles = new HashSet<>(IOUtils
|
|
||||||
.readLines(
|
|
||||||
PacePerson.class
|
|
||||||
.getResourceAsStream(
|
|
||||||
"/eu/dnetlib/dhp/common/name_particles.txt")));
|
|
||||||
} catch (Exception e) {
|
|
||||||
throw new RuntimeException(e);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Capitalizes a string
|
* Capitalizes a string
|
||||||
|
@ -49,20 +37,29 @@ public class PacePerson {
|
||||||
* @param s the string to capitalize
|
* @param s the string to capitalize
|
||||||
* @return the input string with capital letter
|
* @return the input string with capital letter
|
||||||
*/
|
*/
|
||||||
public static String capitalize(final String s) {
|
public static final String capitalize(final String s) {
|
||||||
if (particles.contains(s)) {
|
|
||||||
return s;
|
|
||||||
}
|
|
||||||
return WordUtils.capitalize(s.toLowerCase(), ' ', '-');
|
return WordUtils.capitalize(s.toLowerCase(), ' ', '-');
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Adds a dot to a string with length equals to 1
|
* Adds a dot to a string with length equals to 1
|
||||||
*/
|
*/
|
||||||
public static String dotAbbreviations(final String s) {
|
public static final String dotAbbreviations(final String s) {
|
||||||
return s.length() == 1 ? s + "." : s;
|
return s.length() == 1 ? s + "." : s;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
public static Set<String> loadFromClasspath(final String classpath) {
|
||||||
|
final Set<String> h = new HashSet<>();
|
||||||
|
try {
|
||||||
|
for (final String s : IOUtils.readLines(PacePerson.class.getResourceAsStream(classpath))) {
|
||||||
|
h.add(s);
|
||||||
|
}
|
||||||
|
} catch (final Throwable e) {
|
||||||
|
return new HashSet<>();
|
||||||
|
}
|
||||||
|
return h;
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* The constructor of the class. It fills the fields of the class basing on the input fullname.
|
* The constructor of the class. It fills the fields of the class basing on the input fullname.
|
||||||
*
|
*
|
||||||
|
@ -131,6 +128,10 @@ public class PacePerson {
|
||||||
}
|
}
|
||||||
|
|
||||||
private List<String> splitTerms(final String s) {
|
private List<String> splitTerms(final String s) {
|
||||||
|
if (particles == null) {
|
||||||
|
particles = loadFromClasspath("/eu/dnetlib/dhp/oa/graph/pace/name_particles.txt");
|
||||||
|
}
|
||||||
|
|
||||||
final List<String> list = Lists.newArrayList();
|
final List<String> list = Lists.newArrayList();
|
||||||
for (final String part : Splitter.on(" ").omitEmptyStrings().split(s)) {
|
for (final String part : Splitter.on(" ").omitEmptyStrings().split(s)) {
|
||||||
if (!particles.contains(part.toLowerCase())) {
|
if (!particles.contains(part.toLowerCase())) {
|
||||||
|
@ -186,36 +187,17 @@ public class PacePerson {
|
||||||
}
|
}
|
||||||
|
|
||||||
public List<String> getCapitalFirstnames() {
|
public List<String> getCapitalFirstnames() {
|
||||||
return Optional
|
return Lists
|
||||||
.ofNullable(getNameWithAbbreviations())
|
.newArrayList(
|
||||||
.map(
|
Iterables.transform(getNameWithAbbreviations(), PacePerson::capitalize));
|
||||||
name -> name
|
|
||||||
.stream()
|
|
||||||
.map(PacePerson::capitalize)
|
|
||||||
.collect(Collectors.toList()))
|
|
||||||
.orElse(new ArrayList<>());
|
|
||||||
}
|
}
|
||||||
|
|
||||||
public List<String> getCapitalSurname() {
|
public List<String> getCapitalSurname() {
|
||||||
return Optional
|
return Lists.newArrayList(Iterables.transform(surname, PacePerson::capitalize));
|
||||||
.ofNullable(getSurname())
|
|
||||||
.map(
|
|
||||||
surname -> surname
|
|
||||||
.stream()
|
|
||||||
.map(PacePerson::capitalize)
|
|
||||||
.collect(Collectors.toList()))
|
|
||||||
.orElse(new ArrayList<>());
|
|
||||||
}
|
}
|
||||||
|
|
||||||
public List<String> getNameWithAbbreviations() {
|
public List<String> getNameWithAbbreviations() {
|
||||||
return Optional
|
return Lists.newArrayList(Iterables.transform(name, PacePerson::dotAbbreviations));
|
||||||
.ofNullable(getName())
|
|
||||||
.map(
|
|
||||||
name -> name
|
|
||||||
.stream()
|
|
||||||
.map(PacePerson::dotAbbreviations)
|
|
||||||
.collect(Collectors.toList()))
|
|
||||||
.orElse(new ArrayList<>());
|
|
||||||
}
|
}
|
||||||
|
|
||||||
public boolean isAccurate() {
|
public boolean isAccurate() {
|
||||||
|
|
|
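The two static helpers changed in this file behave the same on both sides for ordinary tokens; only the particle handling and the lazy loading of the particle list differ. A short sketch using just the helpers shown above:

	// Behaviour follows WordUtils.capitalize on ' ' and '-' delimiters.
	System.out.println(PacePerson.capitalize("federico de-rosa")); // "Federico De-Rosa"
	System.out.println(PacePerson.dotAbbreviations("j"));          // "j."
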
@@ -1,81 +0,0 @@

package eu.dnetlib.dhp.common.action;

import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.common.DbClient;
import eu.dnetlib.dhp.common.action.model.MasterDuplicate;
import eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils;

public class ReadDatasourceMasterDuplicateFromDB {

	private static final Logger log = LoggerFactory.getLogger(ReadDatasourceMasterDuplicateFromDB.class);

	private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

	private static final String QUERY = "SELECT distinct dd.id as masterId, d.officialname as masterName, dd.duplicate as duplicateId "
		+
		"FROM dsm_dedup_services dd join dsm_services d on (dd.id = d.id);";

	public static int execute(String dbUrl, String dbUser, String dbPassword, String hdfsPath, String hdfsNameNode)
		throws IOException {
		int count = 0;
		try (DbClient dbClient = new DbClient(dbUrl, dbUser, dbPassword)) {
			Configuration conf = new Configuration();
			conf.set("fs.defaultFS", hdfsNameNode);
			FileSystem fileSystem = FileSystem.get(conf);
			FSDataOutputStream fos = fileSystem.create(new Path(hdfsPath));

			log.info("running query: {}", QUERY);
			log.info("storing results in: {}", hdfsPath);

			try (BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(fos, StandardCharsets.UTF_8))) {
				dbClient.processResults(QUERY, rs -> writeMap(datasourceMasterMap(rs), writer));
				count++;
			}
		}
		return count;
	}

	private static MasterDuplicate datasourceMasterMap(ResultSet rs) {
		try {
			final MasterDuplicate md = new MasterDuplicate();

			final String duplicateId = rs.getString("duplicateId");
			final String masterId = rs.getString("masterId");
			final String masterName = rs.getString("masterName");

			md.setDuplicateId(OafMapperUtils.createOpenaireId(10, duplicateId, true));
			md.setMasterId(OafMapperUtils.createOpenaireId(10, masterId, true));
			md.setMasterName(masterName);

			return md;
		} catch (final SQLException e) {
			throw new RuntimeException(e);
		}
	}

	private static void writeMap(final MasterDuplicate dm, final BufferedWriter writer) {
		try {
			writer.write(OBJECT_MAPPER.writeValueAsString(dm));
			writer.newLine();
		} catch (final IOException e) {
			throw new RuntimeException(e);
		}
	}

}

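This class, present only on the main side, reads the datasource master/duplicate pairs from the OpenAIRE database and writes them as one JSON line per record on HDFS. A hedged invocation sketch; every connection value below is a placeholder:

	// Sketch only: JDBC URL, credentials, HDFS path and namenode are illustrative.
	int processed = ReadDatasourceMasterDuplicateFromDB
		.execute("jdbc:postgresql://host:5432/dnet", "dnetuser", "***",
			"/tmp/masterduplicate", "hdfs://namenode:8020");
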
@@ -1,38 +0,0 @@

package eu.dnetlib.dhp.common.action.model;

import java.io.Serializable;

/**
 * @author miriam.baglioni
 * @Date 21/07/22
 */
public class MasterDuplicate implements Serializable {
	private String duplicateId;
	private String masterId;
	private String masterName;

	public String getDuplicateId() { return duplicateId; }
	public void setDuplicateId(String duplicateId) { this.duplicateId = duplicateId; }
	public String getMasterId() { return masterId; }
	public void setMasterId(String masterId) { this.masterId = masterId; }
	public String getMasterName() { return masterName; }
	public void setMasterName(String masterName) { this.masterName = masterName; }
}

@@ -0,0 +1,53 @@

package eu.dnetlib.dhp.common.api;

import java.io.IOException;
import java.io.InputStream;

import okhttp3.MediaType;
import okhttp3.RequestBody;
import okhttp3.internal.Util;
import okio.BufferedSink;
import okio.Okio;
import okio.Source;

public class InputStreamRequestBody extends RequestBody {

	private final InputStream inputStream;
	private final MediaType mediaType;
	private final long lenght;

	public static RequestBody create(final MediaType mediaType, final InputStream inputStream, final long len) {
		return new InputStreamRequestBody(inputStream, mediaType, len);
	}

	private InputStreamRequestBody(InputStream inputStream, MediaType mediaType, long len) {
		this.inputStream = inputStream;
		this.mediaType = mediaType;
		this.lenght = len;
	}

	@Override
	public MediaType contentType() {
		return mediaType;
	}

	@Override
	public long contentLength() {
		return lenght;
	}

	@Override
	public void writeTo(BufferedSink sink) throws IOException {
		Source source = null;
		try {
			source = Okio.source(inputStream);
			sink.writeAll(source);
		} finally {
			Util.closeQuietly(source);
		}
	}
}

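The request body added here streams the payload straight from an InputStream instead of buffering it in memory, which matters when uploading multi-gigabyte dumps. A hedged sketch of how it plugs into an OkHttp PUT; the file name and URL are placeholders:

	// Sketch only: "dump.zip" and the target URL are illustrative.
	java.io.File f = new java.io.File("dump.zip");
	try (InputStream is = new java.io.FileInputStream(f)) {
		RequestBody body = InputStreamRequestBody.create(MediaType.parse("application/zip"), is, f.length());
		Request request = new Request.Builder().url("https://example.org/bucket/dump.zip").put(body).build();
		try (Response response = new OkHttpClient().newCall(request).execute()) {
			System.out.println(response.code());
		}
	}
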
@@ -0,0 +1,8 @@

package eu.dnetlib.dhp.common.api;

public class MissingConceptDoiException extends Throwable {
	public MissingConceptDoiException(String message) {
		super(message);
	}
}

@@ -0,0 +1,318 @@

package eu.dnetlib.dhp.common.api;

import java.io.*;
import java.io.IOException;
import java.util.concurrent.TimeUnit;

import org.apache.http.HttpHeaders;
import org.apache.http.entity.ContentType;

import com.google.gson.Gson;

import eu.dnetlib.dhp.common.api.zenodo.ZenodoModel;
import eu.dnetlib.dhp.common.api.zenodo.ZenodoModelList;
import okhttp3.*;

public class ZenodoAPIClient implements Serializable {

	String urlString;
	String bucket;

	String deposition_id;
	String access_token;

	public static final MediaType MEDIA_TYPE_JSON = MediaType.parse("application/json; charset=utf-8");

	private static final MediaType MEDIA_TYPE_ZIP = MediaType.parse("application/zip");

	public String getUrlString() {
		return urlString;
	}

	public void setUrlString(String urlString) {
		this.urlString = urlString;
	}

	public String getBucket() {
		return bucket;
	}

	public void setBucket(String bucket) {
		this.bucket = bucket;
	}

	public void setDeposition_id(String deposition_id) {
		this.deposition_id = deposition_id;
	}

	public ZenodoAPIClient(String urlString, String access_token) {
		this.urlString = urlString;
		this.access_token = access_token;
	}

	/**
	 * Brand new deposition in Zenodo. It sets the deposition_id and the bucket where to store the files to upload
	 *
	 * @return response code
	 * @throws IOException
	 */
	public int newDeposition() throws IOException {
		String json = "{}";
		OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();

		RequestBody body = RequestBody.create(json, MEDIA_TYPE_JSON);

		Request request = new Request.Builder()
			.url(urlString)
			.addHeader(HttpHeaders.CONTENT_TYPE, ContentType.APPLICATION_JSON.toString()) // add request headers
			.addHeader(HttpHeaders.AUTHORIZATION, "Bearer " + access_token)
			.post(body)
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			// Get response body
			json = response.body().string();

			ZenodoModel newSubmission = new Gson().fromJson(json, ZenodoModel.class);
			this.bucket = newSubmission.getLinks().getBucket();
			this.deposition_id = newSubmission.getId();

			return response.code();
		}
	}

	/**
	 * Upload files in Zenodo.
	 *
	 * @param is the inputStream for the file to upload
	 * @param file_name the name of the file as it will appear on Zenodo
	 * @param len the size of the file
	 * @return the response code
	 */
	public int uploadIS(InputStream is, String file_name, long len) throws IOException {
		OkHttpClient httpClient = new OkHttpClient.Builder()
			.writeTimeout(600, TimeUnit.SECONDS)
			.readTimeout(600, TimeUnit.SECONDS)
			.connectTimeout(600, TimeUnit.SECONDS)
			.build();

		Request request = new Request.Builder()
			.url(bucket + "/" + file_name)
			.addHeader(HttpHeaders.CONTENT_TYPE, "application/zip") // add request headers
			.addHeader(HttpHeaders.AUTHORIZATION, "Bearer " + access_token)
			.put(InputStreamRequestBody.create(MEDIA_TYPE_ZIP, is, len))
			.build();

		try (Response response = httpClient.newCall(request).execute()) {
			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());
			return response.code();
		}
	}

	/**
	 * Associates metadata information to the current deposition
	 *
	 * @param metadata the metadata
	 * @return response code
	 * @throws IOException
	 */
	public int sendMretadata(String metadata) throws IOException {

		OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();

		RequestBody body = RequestBody.create(metadata, MEDIA_TYPE_JSON);

		Request request = new Request.Builder()
			.url(urlString + "/" + deposition_id)
			.addHeader(HttpHeaders.CONTENT_TYPE, ContentType.APPLICATION_JSON.toString()) // add request headers
			.addHeader(HttpHeaders.AUTHORIZATION, "Bearer " + access_token)
			.put(body)
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			return response.code();
		}
	}

	/**
	 * To publish the current deposition. It works for both new deposition or new version of an old deposition
	 *
	 * @return response code
	 * @throws IOException
	 */
	public int publish() throws IOException {

		String json = "{}";

		OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();

		RequestBody body = RequestBody.create(json, MEDIA_TYPE_JSON);

		Request request = new Request.Builder()
			.url(urlString + "/" + deposition_id + "/actions/publish")
			.addHeader("Authorization", "Bearer " + access_token)
			.post(body)
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			return response.code();
		}
	}

	/**
	 * To create a new version of an already published deposition. It sets the deposition_id and the bucket to be used
	 * for the new version.
	 *
	 * @param concept_rec_id the concept record id of the deposition for which to create a new version. It is the last
	 *            part of the url for the DOI Zenodo suggests to use to cite all versions: DOI: 10.xxx/zenodo.656930
	 *            concept_rec_id = 656930
	 * @return response code
	 * @throws IOException
	 * @throws MissingConceptDoiException
	 */
	public int newVersion(String concept_rec_id) throws IOException, MissingConceptDoiException {
		setDepositionId(concept_rec_id);
		String json = "{}";

		OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();

		RequestBody body = RequestBody.create(json, MEDIA_TYPE_JSON);

		Request request = new Request.Builder()
			.url(urlString + "/" + deposition_id + "/actions/newversion")
			.addHeader(HttpHeaders.AUTHORIZATION, "Bearer " + access_token)
			.post(body)
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			ZenodoModel zenodoModel = new Gson().fromJson(response.body().string(), ZenodoModel.class);
			String latest_draft = zenodoModel.getLinks().getLatest_draft();
			deposition_id = latest_draft.substring(latest_draft.lastIndexOf("/") + 1);
			bucket = getBucket(latest_draft);
			return response.code();
		}
	}

	/**
	 * To finish uploading a version or new deposition not published
	 * It sets the deposition_id and the bucket to be used
	 *
	 * @param deposition_id the deposition id of the not yet published upload
	 *            concept_rec_id = 656930
	 * @return response code
	 * @throws IOException
	 * @throws MissingConceptDoiException
	 */
	public int uploadOpenDeposition(String deposition_id) throws IOException, MissingConceptDoiException {

		this.deposition_id = deposition_id;

		OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();

		Request request = new Request.Builder()
			.url(urlString + "/" + deposition_id)
			.addHeader("Authorization", "Bearer " + access_token)
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			ZenodoModel zenodoModel = new Gson().fromJson(response.body().string(), ZenodoModel.class);
			bucket = zenodoModel.getLinks().getBucket();
			return response.code();
		}
	}

	private void setDepositionId(String concept_rec_id) throws IOException, MissingConceptDoiException {

		ZenodoModelList zenodoModelList = new Gson().fromJson(getPrevDepositions(), ZenodoModelList.class);

		for (ZenodoModel zm : zenodoModelList) {
			if (zm.getConceptrecid().equals(concept_rec_id)) {
				deposition_id = zm.getId();
				return;
			}
		}

		throw new MissingConceptDoiException("The concept record id specified was missing in the list of depositions");
	}

	private String getPrevDepositions() throws IOException {
		OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();

		Request request = new Request.Builder()
			.url(urlString)
			.addHeader(HttpHeaders.CONTENT_TYPE, ContentType.APPLICATION_JSON.toString()) // add request headers
			.addHeader(HttpHeaders.AUTHORIZATION, "Bearer " + access_token)
			.get()
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			return response.body().string();
		}
	}

	private String getBucket(String url) throws IOException {
		OkHttpClient httpClient = new OkHttpClient.Builder()
			.connectTimeout(600, TimeUnit.SECONDS)
			.build();

		Request request = new Request.Builder()
			.url(url)
			.addHeader(HttpHeaders.CONTENT_TYPE, ContentType.APPLICATION_JSON.toString()) // add request headers
			.addHeader(HttpHeaders.AUTHORIZATION, "Bearer " + access_token)
			.get()
			.build();

		try (Response response = httpClient.newCall(request).execute()) {

			if (!response.isSuccessful())
				throw new IOException("Unexpected code " + response + response.body().string());

			// Get response body
			ZenodoModel zenodoModel = new Gson().fromJson(response.body().string(), ZenodoModel.class);

			return zenodoModel.getLinks().getBucket();
		}
	}

}

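Taken together, the client added above follows Zenodo's deposition flow: create or reopen a deposition, PUT the files into its bucket, attach the metadata, then publish. A sketch of the happy path using only the methods defined above; the sandbox URL, token, file name and metadata payload are placeholders:

	// Sketch only: URL, token and file are illustrative; sendMretadata is the method name as defined above.
	public static void main(String[] args) throws Exception {
		ZenodoAPIClient client = new ZenodoAPIClient(
			"https://sandbox.zenodo.org/api/deposit/depositions", "ACCESS_TOKEN");
		client.newDeposition(); // fills deposition_id and bucket
		try (InputStream is = new java.io.FileInputStream("dump.tar")) {
			client.uploadIS(is, "dump.tar", new java.io.File("dump.tar").length());
		}
		client.sendMretadata("{\"metadata\":{\"title\":\"Graph dump\",\"upload_type\":\"dataset\"}}");
		client.publish();
	}
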
@@ -1,39 +0,0 @@

package eu.dnetlib.dhp.common.api.context;

public class CategorySummary {

	private String id;
	private String label;
	private boolean hasConcept;

	public String getId() { return id; }
	public String getLabel() { return label; }
	public boolean isHasConcept() { return hasConcept; }

	public CategorySummary setId(final String id) {
		this.id = id;
		return this;
	}

	public CategorySummary setLabel(final String label) {
		this.label = label;
		return this;
	}

	public CategorySummary setHasConcept(final boolean hasConcept) {
		this.hasConcept = hasConcept;
		return this;
	}

}

@@ -1,7 +0,0 @@

package eu.dnetlib.dhp.common.api.context;

import java.util.ArrayList;

public class CategorySummaryList extends ArrayList<CategorySummary> {
}

@@ -1,52 +0,0 @@

package eu.dnetlib.dhp.common.api.context;

import java.util.List;

public class ConceptSummary {

	private String id;
	private String label;
	public boolean hasSubConcept;
	private List<ConceptSummary> concepts;

	public String getId() { return id; }
	public String getLabel() { return label; }
	public List<ConceptSummary> getConcepts() { return concepts; }

	public ConceptSummary setId(final String id) {
		this.id = id;
		return this;
	}

	public ConceptSummary setLabel(final String label) {
		this.label = label;
		return this;
	}

	public boolean isHasSubConcept() { return hasSubConcept; }

	public ConceptSummary setHasSubConcept(final boolean hasSubConcept) {
		this.hasSubConcept = hasSubConcept;
		return this;
	}

	public ConceptSummary setConcept(final List<ConceptSummary> concepts) {
		this.concepts = concepts;
		return this;
	}

}

@@ -1,7 +0,0 @@

package eu.dnetlib.dhp.common.api.context;

import java.util.ArrayList;

public class ConceptSummaryList extends ArrayList<ConceptSummary> {
}

@@ -1,50 +0,0 @@

package eu.dnetlib.dhp.common.api.context;

public class ContextSummary {

	private String id;
	private String label;
	private String type;
	private String status;

	public String getId() { return id; }
	public String getLabel() { return label; }
	public String getType() { return type; }
	public String getStatus() { return status; }

	public ContextSummary setId(final String id) {
		this.id = id;
		return this;
	}

	public ContextSummary setLabel(final String label) {
		this.label = label;
		return this;
	}

	public ContextSummary setType(final String type) {
		this.type = type;
		return this;
	}

	public ContextSummary setStatus(final String status) {
		this.status = status;
		return this;
	}

}

@@ -1,7 +0,0 @@

package eu.dnetlib.dhp.common.api.context;

import java.util.ArrayList;

public class ContextSummaryList extends ArrayList<ContextSummary> {
}

@@ -0,0 +1,14 @@

package eu.dnetlib.dhp.common.api.zenodo;

public class Community {
	private String identifier;

	public String getIdentifier() { return identifier; }
	public void setIdentifier(String identifier) { this.identifier = identifier; }
}

@@ -0,0 +1,47 @@

package eu.dnetlib.dhp.common.api.zenodo;

public class Creator {
	private String affiliation;
	private String name;
	private String orcid;

	public String getAffiliation() { return affiliation; }
	public void setAffiliation(String affiliation) { this.affiliation = affiliation; }
	public String getName() { return name; }
	public void setName(String name) { this.name = name; }
	public String getOrcid() { return orcid; }
	public void setOrcid(String orcid) { this.orcid = orcid; }

	public static Creator newInstance(String name, String affiliation, String orcid) {
		Creator c = new Creator();
		if (name != null) {
			c.name = name;
		}
		if (affiliation != null) {
			c.affiliation = affiliation;
		}
		if (orcid != null) {
			c.orcid = orcid;
		}
		return c;
	}
}

@@ -0,0 +1,44 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;

public class File implements Serializable {
	private String checksum;
	private String filename;
	private long filesize;
	private String id;

	public String getChecksum() { return checksum; }
	public void setChecksum(String checksum) { this.checksum = checksum; }
	public String getFilename() { return filename; }
	public void setFilename(String filename) { this.filename = filename; }
	public long getFilesize() { return filesize; }
	public void setFilesize(long filesize) { this.filesize = filesize; }
	public String getId() { return id; }
	public void setId(String id) { this.id = id; }
}

@@ -0,0 +1,23 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;

public class Grant implements Serializable {
	private String id;

	public String getId() { return id; }
	public void setId(String id) { this.id = id; }

	public static Grant newInstance(String id) {
		Grant g = new Grant();
		g.id = id;
		return g;
	}
}

@@ -0,0 +1,92 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;

public class Links implements Serializable {

	private String bucket;
	private String discard;
	private String edit;
	private String files;
	private String html;
	private String latest_draft;
	private String latest_draft_html;
	private String publish;
	private String self;

	public String getBucket() { return bucket; }
	public void setBucket(String bucket) { this.bucket = bucket; }
	public String getDiscard() { return discard; }
	public void setDiscard(String discard) { this.discard = discard; }
	public String getEdit() { return edit; }
	public void setEdit(String edit) { this.edit = edit; }
	public String getFiles() { return files; }
	public void setFiles(String files) { this.files = files; }
	public String getHtml() { return html; }
	public void setHtml(String html) { this.html = html; }
	public String getLatest_draft() { return latest_draft; }
	public void setLatest_draft(String latest_draft) { this.latest_draft = latest_draft; }
	public String getLatest_draft_html() { return latest_draft_html; }
	public void setLatest_draft_html(String latest_draft_html) { this.latest_draft_html = latest_draft_html; }
	public String getPublish() { return publish; }
	public void setPublish(String publish) { this.publish = publish; }
	public String getSelf() { return self; }
	public void setSelf(String self) { this.self = self; }
}

@@ -0,0 +1,153 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;
import java.util.List;

public class Metadata implements Serializable {

	private String access_right;
	private List<Community> communities;
	private List<Creator> creators;
	private String description;
	private String doi;
	private List<Grant> grants;
	private List<String> keywords;
	private String language;
	private String license;
	private PrereserveDoi prereserve_doi;
	private String publication_date;
	private List<String> references;
	private List<RelatedIdentifier> related_identifiers;
	private String title;
	private String upload_type;
	private String version;

	public String getUpload_type() { return upload_type; }
	public void setUpload_type(String upload_type) { this.upload_type = upload_type; }
	public String getVersion() { return version; }
	public void setVersion(String version) { this.version = version; }
	public String getAccess_right() { return access_right; }
	public void setAccess_right(String access_right) { this.access_right = access_right; }
	public List<Community> getCommunities() { return communities; }
	public void setCommunities(List<Community> communities) { this.communities = communities; }
	public List<Creator> getCreators() { return creators; }
	public void setCreators(List<Creator> creators) { this.creators = creators; }
	public String getDescription() { return description; }
	public void setDescription(String description) { this.description = description; }
	public String getDoi() { return doi; }
	public void setDoi(String doi) { this.doi = doi; }
	public List<Grant> getGrants() { return grants; }
	public void setGrants(List<Grant> grants) { this.grants = grants; }
	public List<String> getKeywords() { return keywords; }
	public void setKeywords(List<String> keywords) { this.keywords = keywords; }
	public String getLanguage() { return language; }
	public void setLanguage(String language) { this.language = language; }
	public String getLicense() { return license; }
	public void setLicense(String license) { this.license = license; }
	public PrereserveDoi getPrereserve_doi() { return prereserve_doi; }
	public void setPrereserve_doi(PrereserveDoi prereserve_doi) { this.prereserve_doi = prereserve_doi; }
	public String getPublication_date() { return publication_date; }
	public void setPublication_date(String publication_date) { this.publication_date = publication_date; }
	public List<String> getReferences() { return references; }
	public void setReferences(List<String> references) { this.references = references; }
	public List<RelatedIdentifier> getRelated_identifiers() { return related_identifiers; }
	public void setRelated_identifiers(List<RelatedIdentifier> related_identifiers) { this.related_identifiers = related_identifiers; }
	public String getTitle() { return title; }
	public void setTitle(String title) { this.title = title; }
}

@@ -0,0 +1,25 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;

public class PrereserveDoi implements Serializable {
	private String doi;
	private String recid;

	public String getDoi() { return doi; }
	public void setDoi(String doi) { this.doi = doi; }
	public String getRecid() { return recid; }
	public void setRecid(String recid) { this.recid = recid; }
}

@@ -0,0 +1,43 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;

public class RelatedIdentifier implements Serializable {
	private String identifier;
	private String relation;
	private String resource_type;
	private String scheme;

	public String getIdentifier() { return identifier; }
	public void setIdentifier(String identifier) { this.identifier = identifier; }
	public String getRelation() { return relation; }
	public void setRelation(String relation) { this.relation = relation; }
	public String getResource_type() { return resource_type; }
	public void setResource_type(String resource_type) { this.resource_type = resource_type; }
	public String getScheme() { return scheme; }
	public void setScheme(String scheme) { this.scheme = scheme; }
}

@@ -0,0 +1,118 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.io.Serializable;
import java.util.List;

public class ZenodoModel implements Serializable {

	private String conceptrecid;
	private String created;

	private List<File> files;
	private String id;
	private Links links;
	private Metadata metadata;
	private String modified;
	private String owner;
	private String record_id;
	private String state;
	private boolean submitted;
	private String title;

	public String getConceptrecid() { return conceptrecid; }
	public void setConceptrecid(String conceptrecid) { this.conceptrecid = conceptrecid; }
	public String getCreated() { return created; }
	public void setCreated(String created) { this.created = created; }
	public List<File> getFiles() { return files; }
	public void setFiles(List<File> files) { this.files = files; }
	public String getId() { return id; }
	public void setId(String id) { this.id = id; }
	public Links getLinks() { return links; }
	public void setLinks(Links links) { this.links = links; }
	public Metadata getMetadata() { return metadata; }
	public void setMetadata(Metadata metadata) { this.metadata = metadata; }
	public String getModified() { return modified; }
	public void setModified(String modified) { this.modified = modified; }
	public String getOwner() { return owner; }
	public void setOwner(String owner) { this.owner = owner; }
	public String getRecord_id() { return record_id; }
	public void setRecord_id(String record_id) { this.record_id = record_id; }
	public String getState() { return state; }
	public void setState(String state) { this.state = state; }
	public boolean isSubmitted() { return submitted; }
	public void setSubmitted(boolean submitted) { this.submitted = submitted; }
	public String getTitle() { return title; }
	public void setTitle(String title) { this.title = title; }
}

@@ -0,0 +1,7 @@

package eu.dnetlib.dhp.common.api.zenodo;

import java.util.ArrayList;

public class ZenodoModelList extends ArrayList<ZenodoModel> {
}

@@ -1,40 +0,0 @@

package eu.dnetlib.dhp.common.collection;

import java.io.BufferedOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;
import org.apache.commons.io.IOUtils;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DecompressTarGz {

	public static void doExtract(FileSystem fs, String outputPath, String tarGzPath) throws IOException {

		FSDataInputStream inputFileStream = fs.open(new Path(tarGzPath));
		try (TarArchiveInputStream tais = new TarArchiveInputStream(
			new GzipCompressorInputStream(inputFileStream))) {
			TarArchiveEntry entry = null;
			while ((entry = tais.getNextTarEntry()) != null) {
				if (!entry.isDirectory()) {
					try (
						FSDataOutputStream out = fs
							.create(new Path(outputPath.concat(entry.getName()).concat(".gz")));
						GZIPOutputStream gzipOs = new GZIPOutputStream(new BufferedOutputStream(out))) {

						IOUtils.copy(tais, gzipOs);

					}
				}
			}
		}
	}
}

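The removed helper above re-compresses every regular entry of a tar.gz archive as an individual .gz file next to the given output prefix. A hedged usage sketch; the namenode and paths are placeholders:

	// Sketch only: namenode URL and paths are illustrative.
	Configuration conf = new Configuration();
	conf.set("fs.defaultFS", "hdfs://namenode:8020");
	FileSystem fs = FileSystem.get(conf);
	// every regular archive entry ends up as <outputPath><entryName>.gz
	DecompressTarGz.doExtract(fs, "/tmp/extracted/", "/tmp/downloads/dump.tar.gz");
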
@@ -1,9 +1,6 @@

 package eu.dnetlib.dhp.common.collection;

-import java.util.HashMap;
-import java.util.Map;
-
 /**
  * Bundles the http connection parameters driving the client behaviour.
  */
@@ -16,8 +13,6 @@ public class HttpClientParams {
 	public static int _connectTimeOut = 10; // seconds
 	public static int _readTimeOut = 30; // seconds

-	public static String _requestMethod = "GET";
-
 	/**
 	 * Maximum number of allowed retires before failing
 	 */
@@ -43,30 +38,17 @@ public class HttpClientParams {
 	 */
 	private int readTimeOut;

-	/**
-	 * Custom http headers
-	 */
-	private Map<String, String> headers;
-
-	/**
-	 * Request method (i.e., GET, POST etc)
-	 */
-	private String requestMethod;
-
 	public HttpClientParams() {
-		this(_maxNumberOfRetry, _requestDelay, _retryDelay, _connectTimeOut, _readTimeOut, new HashMap<>(),
-			_requestMethod);
+		this(_maxNumberOfRetry, _requestDelay, _retryDelay, _connectTimeOut, _readTimeOut);
 	}

 	public HttpClientParams(int maxNumberOfRetry, int requestDelay, int retryDelay, int connectTimeOut,
-		int readTimeOut, Map<String, String> headers, String requestMethod) {
+		int readTimeOut) {
 		this.maxNumberOfRetry = maxNumberOfRetry;
 		this.requestDelay = requestDelay;
 		this.retryDelay = retryDelay;
 		this.connectTimeOut = connectTimeOut;
 		this.readTimeOut = readTimeOut;
-		this.headers = headers;
-		this.requestMethod = requestMethod;
 	}

 	public int getMaxNumberOfRetry() {
@@ -109,19 +91,4 @@ public class HttpClientParams {
 		this.readTimeOut = readTimeOut;
 	}

-	public Map<String, String> getHeaders() {
-		return headers;
-	}
-
-	public void setHeaders(Map<String, String> headers) {
-		this.headers = headers;
-	}
-
-	public String getRequestMethod() {
-		return requestMethod;
-	}
-
-	public void setRequestMethod(String requestMethod) {
-		this.requestMethod = requestMethod;
-	}
 }

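On the main side the extra fields let a collector plugin preconfigure custom headers and the HTTP verb once, instead of hard-coding them per request. A hedged sketch against the seven-argument constructor shown above; the token is a placeholder:

	// Sketch only: the bearer token is illustrative.
	Map<String, String> headers = new HashMap<>();
	headers.put("Authorization", "Bearer XYZ");
	HttpClientParams params = new HttpClientParams(3, 0, 10, 10, 30, headers, "GET");
	// or start from the defaults and adjust afterwards
	HttpClientParams defaults = new HttpClientParams();
	defaults.setRequestMethod("POST");
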
@@ -8,7 +8,6 @@ import java.io.InputStream;
 import java.net.*;
 import java.util.List;
 import java.util.Map;
-import java.util.concurrent.TimeUnit;
 
 import org.apache.commons.io.IOUtils;
 import org.apache.commons.lang3.math.NumberUtils;
@@ -16,13 +15,12 @@ import org.apache.http.HttpHeaders;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import eu.dnetlib.dhp.common.Constants;
 import eu.dnetlib.dhp.common.aggregation.AggregatorReport;
 
 /**
  * Migrated from https://svn.driver.research-infrastructures.eu/driver/dnet45/modules/dnet-modular-collector-service/trunk/src/main/java/eu/dnetlib/data/collector/plugins/HttpConnector.java
  *
- * @author jochen, michele, andrea, alessia, claudio, andreas
+ * @author jochen, michele, andrea, alessia, claudio
  */
 public class HttpConnector2 {
 
@@ -95,46 +93,29 @@ public class HttpConnector2 {
       throw new CollectorException(msg);
     }
 
+    log.info("Request attempt {} [{}]", retryNumber, requestUrl);
+
     InputStream input = null;
 
-    long start = System.currentTimeMillis();
     try {
       if (getClientParams().getRequestDelay() > 0) {
         backoffAndSleep(getClientParams().getRequestDelay());
       }
-
-      log.info("Request attempt {} [{}]", retryNumber, requestUrl);
-
       final HttpURLConnection urlConn = (HttpURLConnection) new URL(requestUrl).openConnection();
       urlConn.setInstanceFollowRedirects(false);
       urlConn.setReadTimeout(getClientParams().getReadTimeOut() * 1000);
       urlConn.setConnectTimeout(getClientParams().getConnectTimeOut() * 1000);
       urlConn.addRequestProperty(HttpHeaders.USER_AGENT, userAgent);
-      urlConn.setRequestMethod(getClientParams().getRequestMethod());
-
-      // if provided, add custom headers
-      if (!getClientParams().getHeaders().isEmpty()) {
-        for (Map.Entry<String, String> headerEntry : getClientParams().getHeaders().entrySet()) {
-          urlConn.addRequestProperty(headerEntry.getKey(), headerEntry.getValue());
-        }
-      }
-
+      if (log.isDebugEnabled()) {
         logHeaderFields(urlConn);
+      }
 
       int retryAfter = obtainRetryAfter(urlConn.getHeaderFields());
-      String rateLimit = urlConn.getHeaderField(Constants.HTTPHEADER_IETF_DRAFT_RATELIMIT_LIMIT);
-      String rateRemaining = urlConn.getHeaderField(Constants.HTTPHEADER_IETF_DRAFT_RATELIMIT_REMAINING);
-
-      if ((rateLimit != null) && (rateRemaining != null) && (Integer.parseInt(rateRemaining) < 2)) {
-        if (retryAfter > 0) {
-          backoffAndSleep(retryAfter);
-        } else {
-          backoffAndSleep(1000);
-        }
-      }
-
       if (is2xx(urlConn.getResponseCode())) {
-        return getInputStream(urlConn, start);
+        input = urlConn.getInputStream();
+        responseType = urlConn.getContentType();
+        return input;
       }
       if (is3xx(urlConn.getResponseCode())) {
         // REDIRECTS
@@ -144,7 +125,6 @@ public class HttpConnector2 {
           .put(
             REPORT_PREFIX + urlConn.getResponseCode(),
             String.format("Moved to: %s", newUrl));
-        logRequestTime(start);
         urlConn.disconnect();
         if (retryAfter > 0) {
           backoffAndSleep(retryAfter);
@@ -160,50 +140,26 @@ public class HttpConnector2 {
         if (retryAfter > 0) {
           log
             .warn(
-              "waiting and repeating request after suggested retry-after {} sec for URL {}",
-              retryAfter, requestUrl);
+              "{} - waiting and repeating request after suggested retry-after {} sec.",
+              requestUrl, retryAfter);
           backoffAndSleep(retryAfter * 1000);
         } else {
           log
             .warn(
-              "waiting and repeating request after default delay of {} sec for URL {}",
-              getClientParams().getRetryDelay(), requestUrl);
-          backoffAndSleep(retryNumber * getClientParams().getRetryDelay());
+              "{} - waiting and repeating request after default delay of {} sec.",
+              requestUrl, getClientParams().getRetryDelay());
+          backoffAndSleep(retryNumber * getClientParams().getRetryDelay() * 1000);
         }
         report.put(REPORT_PREFIX + urlConn.getResponseCode(), requestUrl);
 
-        logRequestTime(start);
-
         urlConn.disconnect();
 
         return attemptDownload(requestUrl, retryNumber + 1, report);
-      case 422: // UNPROCESSABLE ENTITY
-        report.put(REPORT_PREFIX + urlConn.getResponseCode(), requestUrl);
-        log.warn("waiting and repeating request after 10 sec for URL {}", requestUrl);
-        backoffAndSleep(10000);
-        urlConn.disconnect();
-        logRequestTime(start);
-        try {
-          return getInputStream(urlConn, start);
-        } catch (IOException e) {
-          log
-            .error(
-              "server returned 422 and got IOException accessing the response body from URL {}",
-              requestUrl);
-          log.error("IOException:", e);
-          return attemptDownload(requestUrl, retryNumber + 1, report);
-        }
       default:
-        log.error("gor error {} from URL: {}", urlConn.getResponseCode(), urlConn.getURL());
-        log.error("response message: {}", urlConn.getResponseMessage());
        report
          .put(
            REPORT_PREFIX + urlConn.getResponseCode(),
            String
              .format(
                "%s Error: %s", requestUrl, urlConn.getResponseMessage()));
-        logRequestTime(start);
-        urlConn.disconnect();
        throw new CollectorException(urlConn.getResponseCode() + " error " + report);
       }
     }
@@ -212,11 +168,11 @@ public class HttpConnector2 {
           .format(
             "Unexpected status code: %s errors: %s", urlConn.getResponseCode(),
             MAPPER.writeValueAsString(report)));
-    } catch (MalformedURLException e) {
+    } catch (MalformedURLException | UnknownHostException e) {
       log.error(e.getMessage(), e);
       report.put(e.getClass().getName(), e.getMessage());
       throw new CollectorException(e.getMessage(), e);
-    } catch (SocketTimeoutException | SocketException | UnknownHostException e) {
+    } catch (SocketTimeoutException | SocketException e) {
       log.error(e.getMessage(), e);
       report.put(e.getClass().getName(), e.getMessage());
       backoffAndSleep(getClientParams().getRetryDelay() * retryNumber * 1000);
@@ -224,27 +180,13 @@ public class HttpConnector2 {
     }
   }
 
-  private InputStream getInputStream(HttpURLConnection urlConn, long start) throws IOException {
-    InputStream input = urlConn.getInputStream();
-    responseType = urlConn.getContentType();
-    logRequestTime(start);
-    return input;
-  }
-
-  private static void logRequestTime(long start) {
-    log
-      .info(
-        "request time elapsed: {}sec",
-        TimeUnit.MILLISECONDS.toSeconds(System.currentTimeMillis() - start));
-  }
-
   private void logHeaderFields(final HttpURLConnection urlConn) throws IOException {
-    log.info("Response: {} - {}", urlConn.getResponseCode(), urlConn.getResponseMessage());
+    log.debug("StatusCode: {}", urlConn.getResponseMessage());
 
     for (Map.Entry<String, List<String>> e : urlConn.getHeaderFields().entrySet()) {
       if (e.getKey() != null) {
         for (String v : e.getValue()) {
-          log.info(" key: {} - value: {}", e.getKey(), v);
+          log.debug(" key: {} - value: {}", e.getKey(), v);
         }
       }
     }
@@ -264,7 +206,7 @@ public class HttpConnector2 {
     for (String key : headerMap.keySet()) {
       if ((key != null) && key.equalsIgnoreCase(HttpHeaders.RETRY_AFTER) && (!headerMap.get(key).isEmpty())
         && NumberUtils.isCreatable(headerMap.get(key).get(0))) {
-        return Integer.parseInt(headerMap.get(key).get(0));
+        return Integer.parseInt(headerMap.get(key).get(0)) + 10;
       }
     }
     return -1;

@@ -4,7 +4,6 @@ package eu.dnetlib.dhp.common.vocabulary;
 import java.io.Serializable;
 import java.util.HashMap;
 import java.util.Map;
-import java.util.Objects;
 import java.util.Optional;
 
 import org.apache.commons.lang3.StringUtils;
@@ -63,46 +62,25 @@ public class Vocabulary implements Serializable {
   }
 
   public VocabularyTerm getTermBySynonym(final String syn) {
-    return Optional
-      .ofNullable(syn)
-      .map(s -> getTerm(synonyms.get(s.toLowerCase())))
-      .orElse(null);
+    return getTerm(synonyms.get(syn.toLowerCase()));
   }
 
   public Qualifier getTermAsQualifier(final String termId) {
-    return getTermAsQualifier(termId, false);
-  }
-
-  public Qualifier getTermAsQualifier(final String termId, boolean strict) {
-    final VocabularyTerm term = getTerm(termId);
-    if (Objects.nonNull(term)) {
-      return OafMapperUtils.qualifier(term.getId(), term.getName(), getId(), getName());
-    } else if (Objects.isNull(term) && strict) {
+    if (StringUtils.isBlank(termId)) {
       return OafMapperUtils.unknown(getId(), getName());
+    } else if (termExists(termId)) {
+      final VocabularyTerm t = getTerm(termId);
+      return OafMapperUtils.qualifier(t.getId(), t.getName(), getId(), getName());
     } else {
       return OafMapperUtils.qualifier(termId, termId, getId(), getName());
     }
   }
 
   public Qualifier getSynonymAsQualifier(final String syn) {
-    return getSynonymAsQualifier(syn, false);
-  }
-
-  public Qualifier getSynonymAsQualifier(final String syn, boolean strict) {
     return Optional
       .ofNullable(getTermBySynonym(syn))
-      .map(term -> getTermAsQualifier(term.getId(), strict))
+      .map(term -> getTermAsQualifier(term.getId()))
       .orElse(null);
   }
 
-  public Qualifier lookup(String id) {
-    return lookup(id, false);
-  }
-
-  public Qualifier lookup(String id, boolean strict) {
-    return Optional
-      .ofNullable(getSynonymAsQualifier(id, strict))
-      .orElse(getTermAsQualifier(id, strict));
-  }
-
 }

@@ -57,17 +57,9 @@ public class VocabularyGroup implements Serializable {
         final String syn = arr[2].trim();
 
         vocs.addSynonyms(vocId, termId, syn);
 
       }
     }
 
-    // add the term names as synonyms
-    vocs.vocs.values().forEach(voc -> {
-      voc.getTerms().values().forEach(term -> {
-        voc.addSynonym(term.getName().toLowerCase(), term.getId());
-      });
-    });
-
     return vocs;
   }
 
@@ -81,13 +73,6 @@ public class VocabularyGroup implements Serializable {
     vocs.put(id.toLowerCase(), new Vocabulary(id, name));
   }
 
-  public Optional<Vocabulary> find(final String vocId) {
-    return Optional
-      .ofNullable(vocId)
-      .map(String::toLowerCase)
-      .map(vocs::get);
-  }
-
   public void addTerm(final String vocId, final String id, final String name) {
     if (vocabularyExists(vocId)) {
       vocs.get(vocId.toLowerCase()).addTerm(id, name);
@@ -135,24 +120,6 @@ public class VocabularyGroup implements Serializable {
     return vocs.get(vocId.toLowerCase()).getSynonymAsQualifier(syn);
   }
 
-  public Qualifier lookupTermBySynonym(final String vocId, final String syn) {
-    return find(vocId)
-      .map(
-        vocabulary -> Optional
-          .ofNullable(vocabulary.getTerm(syn))
-          .map(
-            term -> OafMapperUtils
-              .qualifier(term.getId(), term.getName(), vocabulary.getId(), vocabulary.getName()))
-          .orElse(
-            Optional
-              .ofNullable(vocabulary.getTermBySynonym(syn))
-              .map(
-                term -> OafMapperUtils
-                  .qualifier(term.getId(), term.getName(), vocabulary.getId(), vocabulary.getName()))
-              .orElse(null)))
-      .orElse(null);
-  }
-
   /**
    * getSynonymAsQualifierCaseSensitive
    *

@@ -10,7 +10,6 @@ import org.apache.commons.lang3.StringUtils;
 import com.wcohen.ss.JaroWinkler;
 
 import eu.dnetlib.dhp.schema.oaf.Author;
-import eu.dnetlib.dhp.schema.oaf.Qualifier;
 import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
 import eu.dnetlib.pace.model.Person;
 import scala.Tuple2;
@@ -120,47 +119,11 @@ public class AuthorMerger {
     });
   }
 
-  public static String normalizeFullName(final String fullname) {
-    return nfd(fullname)
-      .toLowerCase()
-      // do not compact the regexes in a single expression, would cause StackOverflowError
-      // in case
-      // of large input strings
-      .replaceAll("(\\W)+", " ")
-      .replaceAll("(\\p{InCombiningDiacriticalMarks})+", " ")
-      .replaceAll("(\\p{Punct})+", " ")
-      .replaceAll("(\\d)+", " ")
-      .replaceAll("(\\n)+", " ")
-      .trim();
-  }
-
-  private static String authorFieldToBeCompared(Author author) {
-    if (StringUtils.isNotBlank(author.getSurname())) {
-      return author.getSurname();
-    }
-    if (StringUtils.isNotBlank(author.getFullname())) {
-      return author.getFullname();
-    }
-    return null;
-  }
-
   public static String pidToComparableString(StructuredProperty pid) {
-    final String classId = Optional
-      .ofNullable(pid)
-      .map(
-        p -> Optional
-          .ofNullable(p.getQualifier())
-          .map(Qualifier::getClassid)
-          .map(String::toLowerCase)
-          .orElse(""))
-      .orElse("");
-    return Optional
-      .ofNullable(pid)
-      .map(StructuredProperty::getValue)
-      .map(v -> String.join("|", v, classId))
-      .orElse("");
+    final String classid = pid.getQualifier().getClassid() != null ? pid.getQualifier().getClassid().toLowerCase()
+      : "";
+    return (pid.getQualifier() != null ? classid : "")
+      + (pid.getValue() != null ? pid.getValue().toLowerCase() : "");
   }
 
   public static int countAuthorsPids(List<Author> authors) {
@@ -208,7 +171,7 @@ public class AuthorMerger {
     }
   }
 
-  public static String normalize(final String s) {
+  private static String normalize(final String s) {
     String[] normalized = nfd(s)
       .toLowerCase()
       // do not compact the regexes in a single expression, would cause StackOverflowError

@@ -1,194 +0,0 @@
-
-package eu.dnetlib.dhp.oa.merge;
-
-import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
-import static org.apache.spark.sql.functions.col;
-import static org.apache.spark.sql.functions.when;
-
-import java.util.Map;
-import java.util.Optional;
-import java.util.concurrent.ExecutionException;
-import java.util.concurrent.ForkJoinPool;
-import java.util.stream.Collectors;
-
-import org.apache.commons.io.IOUtils;
-import org.apache.spark.SparkConf;
-import org.apache.spark.api.java.function.MapFunction;
-import org.apache.spark.api.java.function.MapGroupsFunction;
-import org.apache.spark.sql.*;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-import eu.dnetlib.dhp.common.HdfsSupport;
-import eu.dnetlib.dhp.common.vocabulary.VocabularyGroup;
-import eu.dnetlib.dhp.schema.common.EntityType;
-import eu.dnetlib.dhp.schema.common.ModelSupport;
-import eu.dnetlib.dhp.schema.oaf.OafEntity;
-import eu.dnetlib.dhp.schema.oaf.utils.GraphCleaningFunctions;
-import eu.dnetlib.dhp.schema.oaf.utils.MergeUtils;
-import eu.dnetlib.dhp.utils.ISLookupClientFactory;
-import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
-import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
-import scala.Tuple2;
-
-/**
- * Groups the graph content by entity identifier to ensure ID uniqueness
- */
-public class GroupEntitiesSparkJob {
-  private static final Logger log = LoggerFactory.getLogger(GroupEntitiesSparkJob.class);
-
-  private static final Encoder<OafEntity> OAFENTITY_KRYO_ENC = Encoders.kryo(OafEntity.class);
-
-  private ArgumentApplicationParser parser;
-
-  public GroupEntitiesSparkJob(ArgumentApplicationParser parser) {
-    this.parser = parser;
-  }
-
-  public static void main(String[] args) throws Exception {
-
-    String jsonConfiguration = IOUtils
-      .toString(
-        GroupEntitiesSparkJob.class
-          .getResourceAsStream(
-            "/eu/dnetlib/dhp/oa/merge/group_graph_entities_parameters.json"));
-    final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
-    parser.parseArgument(args);
-
-    Boolean isSparkSessionManaged = Optional
-      .ofNullable(parser.get("isSparkSessionManaged"))
-      .map(Boolean::valueOf)
-      .orElse(Boolean.TRUE);
-    log.info("isSparkSessionManaged: {}", isSparkSessionManaged);
-
-    final String isLookupUrl = parser.get("isLookupUrl");
-    log.info("isLookupUrl: {}", isLookupUrl);
-
-    final ISLookUpService isLookupService = ISLookupClientFactory.getLookUpService(isLookupUrl);
-
-    new GroupEntitiesSparkJob(parser).run(isSparkSessionManaged, isLookupService);
-  }
-
-  public void run(Boolean isSparkSessionManaged, ISLookUpService isLookUpService)
-    throws ISLookUpException {
-
-    String graphInputPath = parser.get("graphInputPath");
-    log.info("graphInputPath: {}", graphInputPath);
-
-    String checkpointPath = parser.get("checkpointPath");
-    log.info("checkpointPath: {}", checkpointPath);
-
-    String outputPath = parser.get("outputPath");
-    log.info("outputPath: {}", outputPath);
-
-    boolean filterInvisible = Boolean.parseBoolean(parser.get("filterInvisible"));
-    log.info("filterInvisible: {}", filterInvisible);
-
-    SparkConf conf = new SparkConf();
-    conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
-    conf.registerKryoClasses(ModelSupport.getOafModelClasses());
-
-    final VocabularyGroup vocs = VocabularyGroup.loadVocsFromIS(isLookUpService);
-
-    runWithSparkSession(
-      conf,
-      isSparkSessionManaged,
-      spark -> {
-        HdfsSupport.remove(checkpointPath, spark.sparkContext().hadoopConfiguration());
-        groupEntities(spark, graphInputPath, checkpointPath, outputPath, filterInvisible, vocs);
-      });
-  }
-
-  private static void groupEntities(
-    SparkSession spark,
-    String inputPath,
-    String checkpointPath,
-    String outputPath,
-    boolean filterInvisible, VocabularyGroup vocs) {
-
-    Dataset<OafEntity> allEntities = spark.emptyDataset(OAFENTITY_KRYO_ENC);
-
-    for (Map.Entry<EntityType, Class> e : ModelSupport.entityTypes.entrySet()) {
-      String entity = e.getKey().name();
-      Class<? extends OafEntity> entityClass = e.getValue();
-      String entityInputPath = inputPath + "/" + entity;
-
-      if (!HdfsSupport.exists(entityInputPath, spark.sparkContext().hadoopConfiguration())) {
-        continue;
-      }
-
-      allEntities = allEntities
-        .union(
-          ((Dataset<OafEntity>) spark
-            .read()
-            .schema(Encoders.bean(entityClass).schema())
-            .json(entityInputPath)
-            .filter("length(id) > 0")
-            .as(Encoders.bean(entityClass)))
-            .map((MapFunction<OafEntity, OafEntity>) r -> r, OAFENTITY_KRYO_ENC));
-    }
-
-    Dataset<?> groupedEntities = allEntities
-      .map(
-        (MapFunction<OafEntity, OafEntity>) entity -> GraphCleaningFunctions
-          .applyCoarVocabularies(entity, vocs),
-        OAFENTITY_KRYO_ENC)
-      .groupByKey((MapFunction<OafEntity, String>) OafEntity::getId, Encoders.STRING())
-      .mapGroups((MapGroupsFunction<String, OafEntity, OafEntity>) MergeUtils::mergeById, OAFENTITY_KRYO_ENC)
-      .map(
-        (MapFunction<OafEntity, Tuple2<String, OafEntity>>) t -> new Tuple2<>(
-          t.getClass().getName(), t),
-        Encoders.tuple(Encoders.STRING(), OAFENTITY_KRYO_ENC));
-
-    // pivot on "_1" (classname of the entity)
-    // created columns containing only entities of the same class
-    for (Map.Entry<EntityType, Class> e : ModelSupport.entityTypes.entrySet()) {
-      String entity = e.getKey().name();
-      Class<? extends OafEntity> entityClass = e.getValue();
-
-      groupedEntities = groupedEntities
-        .withColumn(
-          entity,
-          when(col("_1").equalTo(entityClass.getName()), col("_2")));
-    }
-
-    groupedEntities
-      .drop("_1", "_2")
-      .write()
-      .mode(SaveMode.Overwrite)
-      .option("compression", "gzip")
-      .save(checkpointPath);
-
-    ForkJoinPool parPool = new ForkJoinPool(ModelSupport.entityTypes.size());
-
-    ModelSupport.entityTypes
-      .entrySet()
-      .stream()
-      .map(e -> parPool.submit(() -> {
-        String entity = e.getKey().name();
-        Class<? extends OafEntity> entityClass = e.getValue();
-
-        spark
-          .read()
-          .load(checkpointPath)
-          .select(col(entity).as("value"))
-          .filter("value IS NOT NULL")
-          .as(OAFENTITY_KRYO_ENC)
-          .map((MapFunction<OafEntity, OafEntity>) r -> r, (Encoder<OafEntity>) Encoders.bean(entityClass))
-          .filter(filterInvisible ? "dataInfo.invisible != TRUE" : "TRUE")
-          .write()
-          .mode(SaveMode.Overwrite)
-          .option("compression", "gzip")
-          .json(outputPath + "/" + entity);
-      }))
-      .collect(Collectors.toList())
-      .forEach(t -> {
-        try {
-          t.get();
-        } catch (InterruptedException | ExecutionException e) {
-          throw new RuntimeException(e);
-        }
-      });
-  }
-}

@@ -1,77 +0,0 @@
-
-package eu.dnetlib.dhp.oozie;
-
-import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkHiveSession;
-
-import java.net.URL;
-import java.nio.charset.StandardCharsets;
-import java.util.HashMap;
-import java.util.Map;
-import java.util.Optional;
-
-import org.apache.commons.lang3.time.DurationFormatUtils;
-import org.apache.commons.text.StringSubstitutor;
-import org.apache.spark.SparkConf;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import com.google.common.io.Resources;
-
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-
-public class RunSQLSparkJob {
-  private static final Logger log = LoggerFactory.getLogger(RunSQLSparkJob.class);
-
-  private final ArgumentApplicationParser parser;
-
-  public RunSQLSparkJob(ArgumentApplicationParser parser) {
-    this.parser = parser;
-  }
-
-  public static void main(String[] args) throws Exception {
-
-    Map<String, String> params = new HashMap<>();
-    for (int i = 0; i < args.length - 1; i++) {
-      if (args[i].startsWith("--")) {
-        params.put(args[i].substring(2), args[++i]);
-      }
-    }
-
-    /*
-     * String jsonConfiguration = IOUtils .toString( Objects .requireNonNull( RunSQLSparkJob.class
-     * .getResourceAsStream( "/eu/dnetlib/dhp/oozie/run_sql_parameters.json"))); final ArgumentApplicationParser
-     * parser = new ArgumentApplicationParser(jsonConfiguration); parser.parseArgument(args);
-     */
-
-    Boolean isSparkSessionManaged = Optional
-      .ofNullable(params.get("isSparkSessionManaged"))
-      .map(Boolean::valueOf)
-      .orElse(Boolean.TRUE);
-    log.info("isSparkSessionManaged: {}", isSparkSessionManaged);
-
-    URL url = com.google.common.io.Resources.getResource(params.get("sql"));
-    String raw_sql = Resources.toString(url, StandardCharsets.UTF_8);
-
-    String sql = StringSubstitutor.replace(raw_sql, params);
-    log.info("sql: {}", sql);
-
-    SparkConf conf = new SparkConf();
-    conf.set("hive.metastore.uris", params.get("hiveMetastoreUris"));
-
-    runWithSparkHiveSession(
-      conf,
-      isSparkSessionManaged,
-      spark -> {
-        for (String statement : sql.split(";\\s*/\\*\\s*EOS\\s*\\*/\\s*")) {
-          log.info("executing: {}", statement);
-          long startTime = System.currentTimeMillis();
-          spark.sql(statement).show();
-          log
-            .info(
-              "executed in {}",
-              DurationFormatUtils.formatDuration(System.currentTimeMillis() - startTime, "HH:mm:ss.S"));
-        }
-      });
-  }
-
-}

@@ -1,70 +0,0 @@
-/*
- * Copyright (c) 2024.
- * SPDX-FileCopyrightText: © 2023 Consiglio Nazionale delle Ricerche
- * SPDX-License-Identifier: AGPL-3.0-or-later
- */
-
-package eu.dnetlib.dhp.schema.oaf;
-
-import org.apache.commons.lang3.builder.EqualsBuilder;
-import org.apache.commons.lang3.builder.HashCodeBuilder;
-
-public class HashableStructuredProperty extends StructuredProperty {
-
-  private static final long serialVersionUID = 8371670185221126045L;
-
-  public static HashableStructuredProperty newInstance(String value, Qualifier qualifier, DataInfo dataInfo) {
-    if (value == null) {
-      return null;
-    }
-    final HashableStructuredProperty sp = new HashableStructuredProperty();
-    sp.setValue(value);
-    sp.setQualifier(qualifier);
-    sp.setDataInfo(dataInfo);
-    return sp;
-  }
-
-  public static HashableStructuredProperty newInstance(StructuredProperty sp) {
-    HashableStructuredProperty hsp = new HashableStructuredProperty();
-    hsp.setQualifier(sp.getQualifier());
-    hsp.setValue(sp.getValue());
-    hsp.setQualifier(sp.getQualifier());
-    return hsp;
-  }
-
-  public static StructuredProperty toStructuredProperty(HashableStructuredProperty hsp) {
-    StructuredProperty sp = new StructuredProperty();
-    sp.setQualifier(hsp.getQualifier());
-    sp.setValue(hsp.getValue());
-    sp.setQualifier(hsp.getQualifier());
-    return sp;
-  }
-
-  @Override
-  public int hashCode() {
-    return new HashCodeBuilder(11, 91)
-      .append(getQualifier().getClassid())
-      .append(getQualifier().getSchemeid())
-      .append(getValue())
-      .hashCode();
-  }
-
-  @Override
-  public boolean equals(Object obj) {
-    if (obj == null) {
-      return false;
-    }
-    if (obj == this) {
-      return true;
-    }
-    if (obj.getClass() != getClass()) {
-      return false;
-    }
-    final HashableStructuredProperty rhs = (HashableStructuredProperty) obj;
-    return new EqualsBuilder()
-      .append(getQualifier().getClassid(), rhs.getQualifier().getClassid())
-      .append(getQualifier().getSchemeid(), rhs.getQualifier().getSchemeid())
-      .append(getValue(), rhs.getValue())
-      .isEquals();
-  }
-}

@@ -1,46 +0,0 @@
-
-package eu.dnetlib.dhp.schema.oaf.utils;
-
-import java.util.HashSet;
-import java.util.Objects;
-import java.util.Optional;
-import java.util.Set;
-
-import org.apache.commons.lang3.StringUtils;
-
-import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
-
-public class CleaningFunctions {
-
-  public static final String DOI_PREFIX_REGEX = "(^10\\.|\\/10\\.)";
-  public static final String DOI_PREFIX = "10.";
-
-  public static final Set<String> PID_BLACKLIST = new HashSet<>();
-
-  static {
-    PID_BLACKLIST.add("none");
-    PID_BLACKLIST.add("na");
-  }
-
-  public CleaningFunctions() {
-  }
-
-  /**
-   * Utility method that filter PID values on a per-type basis.
-   * @param s the PID whose value will be checked.
-   * @return false if the pid matches the filter criteria, true otherwise.
-   */
-  public static boolean pidFilter(StructuredProperty s) {
-    final String pidValue = s.getValue();
-    if (Objects.isNull(s.getQualifier()) ||
-      StringUtils.isBlank(pidValue) ||
-      StringUtils.isBlank(pidValue.replaceAll("(?:\\n|\\r|\\t|\\s)", ""))) {
-      return false;
-    }
-    if (CleaningFunctions.PID_BLACKLIST.contains(pidValue)) {
-      return false;
-    }
-    return !PidBlacklistProvider.getBlacklist(s.getQualifier().getClassid()).contains(pidValue);
-  }
-
-}

@@ -1,30 +0,0 @@
-
-package eu.dnetlib.dhp.schema.oaf.utils;
-
-import org.apache.commons.lang3.StringUtils;
-
-public class DoiCleaningRule {
-
-  public static String clean(final String doi) {
-    if (doi == null)
-      return null;
-    final String replaced = doi
-      .replaceAll("\\n|\\r|\\t|\\s", "")
-      .replaceAll("^doi:", "")
-      .toLowerCase()
-      .replaceFirst(CleaningFunctions.DOI_PREFIX_REGEX, CleaningFunctions.DOI_PREFIX);
-    if (StringUtils.isEmpty(replaced))
-      return null;
-
-    if (!replaced.contains("10."))
-      return null;
-
-    final String ret = replaced.substring(replaced.indexOf("10."));
-
-    if (!ret.startsWith(CleaningFunctions.DOI_PREFIX))
-      return null;
-
-    return ret;
-  }
-
-}

@@ -1,25 +0,0 @@
-
-package eu.dnetlib.dhp.schema.oaf.utils;
-
-import java.util.regex.Matcher;
-import java.util.regex.Pattern;
-
-public class FundRefCleaningRule {
-
-  public static final Pattern PATTERN = Pattern.compile("\\d+");
-
-  public static String clean(final String fundRefId) {
-
-    String s = fundRefId
-      .toLowerCase()
-      .replaceAll("\\s", "");
-
-    Matcher m = PATTERN.matcher(s);
-    if (m.find()) {
-      return m.group();
-    } else {
-      return "";
-    }
-  }
-
-}

@@ -1,12 +1,6 @@
 
 package eu.dnetlib.dhp.schema.oaf.utils;
 
-import static eu.dnetlib.dhp.schema.common.ModelConstants.*;
-import static eu.dnetlib.dhp.schema.common.ModelConstants.OPENAIRE_META_RESOURCE_TYPE;
-import static eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils.getProvenance;
-
-import java.net.MalformedURLException;
-import java.net.URL;
 import java.time.LocalDate;
 import java.time.ZoneId;
 import java.time.format.DateTimeFormatter;
@@ -22,8 +16,6 @@ import com.github.sisyphsu.dateparser.DateParserUtils;
 import com.google.common.collect.Lists;
 import com.google.common.collect.Sets;
 
-import eu.dnetlib.dhp.common.vocabulary.VocabularyGroup;
-import eu.dnetlib.dhp.common.vocabulary.VocabularyTerm;
 import eu.dnetlib.dhp.schema.common.ModelConstants;
 import eu.dnetlib.dhp.schema.common.ModelSupport;
 import eu.dnetlib.dhp.schema.oaf.*;
@@ -31,215 +23,27 @@ import me.xuender.unidecode.Unidecode;
 
 public class GraphCleaningFunctions extends CleaningFunctions {
 
-  public static final String DNET_PUBLISHERS = "dnet:publishers";
-
-  public static final String DNET_LICENSES = "dnet:licenses";
-
   public static final String ORCID_CLEANING_REGEX = ".*([0-9]{4}).*[-–—−=].*([0-9]{4}).*[-–—−=].*([0-9]{4}).*[-–—−=].*([0-9x]{4})";
   public static final int ORCID_LEN = 19;
   public static final String CLEANING_REGEX = "(?:\\n|\\r|\\t)";
   public static final String INVALID_AUTHOR_REGEX = ".*deactivated.*";
+  public static final String TITLE_FILTER_REGEX = "[.*test.*\\W\\d]";
-  public static final String TITLE_TEST = "test";
+  public static final int TITLE_FILTER_RESIDUAL_LENGTH = 10;
-  public static final String TITLE_FILTER_REGEX = String.format("(%s)|\\W|\\d", TITLE_TEST);
-
-  public static final int TITLE_FILTER_RESIDUAL_LENGTH = 5;
-  private static final String NAME_CLEANING_REGEX = "[\\r\\n\\t\\s]+";
-
-  private static final Set<String> INVALID_AUTHOR_NAMES = new HashSet<>();
-
-  private static final Set<String> INVALID_URLS = new HashSet<>();
-
-  private static final Set<String> INVALID_URL_HOSTS = new HashSet<>();
-
-  private static final HashSet<String> PEER_REVIEWED_TYPES = new HashSet<>();
-
-  static {
-    PEER_REVIEWED_TYPES.add("Article");
-    PEER_REVIEWED_TYPES.add("Part of book or chapter of book");
-    PEER_REVIEWED_TYPES.add("Book");
-    PEER_REVIEWED_TYPES.add("Doctoral thesis");
-    PEER_REVIEWED_TYPES.add("Master thesis");
-    PEER_REVIEWED_TYPES.add("Data Paper");
-    PEER_REVIEWED_TYPES.add("Thesis");
-    PEER_REVIEWED_TYPES.add("Bachelor thesis");
-    PEER_REVIEWED_TYPES.add("Conference object");
-
-    INVALID_AUTHOR_NAMES.add("(:null)");
-    INVALID_AUTHOR_NAMES.add("(:unap)");
-    INVALID_AUTHOR_NAMES.add("(:tba)");
-    INVALID_AUTHOR_NAMES.add("(:unas)");
-    INVALID_AUTHOR_NAMES.add("(:unav)");
-    INVALID_AUTHOR_NAMES.add("(:unkn)");
-    INVALID_AUTHOR_NAMES.add("(:unkn) unknown");
-    INVALID_AUTHOR_NAMES.add(":none");
-    INVALID_AUTHOR_NAMES.add(":null");
-    INVALID_AUTHOR_NAMES.add(":unas");
-    INVALID_AUTHOR_NAMES.add(":unav");
-    INVALID_AUTHOR_NAMES.add(":unkn");
-    INVALID_AUTHOR_NAMES.add("[autor desconocido]");
-    INVALID_AUTHOR_NAMES.add("[s. n.]");
-    INVALID_AUTHOR_NAMES.add("[s.n]");
-    INVALID_AUTHOR_NAMES.add("[unknown]");
-    INVALID_AUTHOR_NAMES.add("anonymous");
-    INVALID_AUTHOR_NAMES.add("n.n.");
-    INVALID_AUTHOR_NAMES.add("nn");
-    INVALID_AUTHOR_NAMES.add("no name supplied");
-    INVALID_AUTHOR_NAMES.add("none");
-    INVALID_AUTHOR_NAMES.add("none available");
-    INVALID_AUTHOR_NAMES.add("not available not available");
-    INVALID_AUTHOR_NAMES.add("null &na;");
-    INVALID_AUTHOR_NAMES.add("null anonymous");
-    INVALID_AUTHOR_NAMES.add("unbekannt");
-    INVALID_AUTHOR_NAMES.add("unknown");
-    INVALID_AUTHOR_NAMES.add("autor, Sin");
-    INVALID_AUTHOR_NAMES.add("Desconocido / Inconnu,");
-
-    INVALID_URL_HOSTS.add("creativecommons.org");
-    INVALID_URL_HOSTS.add("www.academia.edu");
-    INVALID_URL_HOSTS.add("academia.edu");
-    INVALID_URL_HOSTS.add("researchgate.net");
-    INVALID_URL_HOSTS.add("www.researchgate.net");
-
-    INVALID_URLS.add("http://repo.scoap3.org/api");
-    INVALID_URLS.add("http://ora.ox.ac.uk/objects/uuid:");
-    INVALID_URLS.add("http://ntur.lib.ntu.edu.tw/news/agent_contract.pdf");
-    INVALID_URLS.add("https://media.springer.com/full/springer-instructions-for-authors-assets/pdf/SN_BPF_EN.pdf");
-    INVALID_URLS.add("http://www.tobaccoinduceddiseases.org/dl/61aad426c96519bea4040a374c6a6110/");
-    INVALID_URLS.add("https://www.bilboard.nl/verenigingsbladen/bestuurskundige-berichten");
-  }
-
-  public static <T extends Oaf> T cleanContext(T value, String contextId, String verifyParam) {
-    if (ModelSupport.isSubClass(value, Result.class)) {
-      final Result res = (Result) value;
-      if (shouldCleanContext(res, verifyParam)) {
-        res
-          .setContext(
-            res
-              .getContext()
-              .stream()
-              .filter(c -> !StringUtils.startsWith(c.getId().toLowerCase(), contextId))
-              .collect(Collectors.toCollection(ArrayList::new)));
-      }
-      return (T) res;
-    } else {
-      return value;
-    }
-  }
-
-  private static boolean shouldCleanContext(Result res, String verifyParam) {
-    boolean titleMatch = res
-      .getTitle()
-      .stream()
-      .filter(
-        t -> t
-          .getQualifier()
-          .getClassid()
-          .equalsIgnoreCase(ModelConstants.MAIN_TITLE_QUALIFIER.getClassid()))
-      .anyMatch(t -> t.getValue().toLowerCase().startsWith(verifyParam.toLowerCase()));
-
-    return titleMatch && Objects.nonNull(res.getContext());
-  }
-
-  public static <T extends Oaf> T cleanCountry(T value, String[] verifyParam, Set<String> hostedBy,
-    String collectedfrom, String country) {
-    if (ModelSupport.isSubClass(value, Result.class)) {
-      final Result res = (Result) value;
-      if (res.getInstance().stream().anyMatch(i -> hostedBy.contains(i.getHostedby().getKey())) ||
-        !res.getCollectedfrom().stream().anyMatch(cf -> cf.getValue().equals(collectedfrom))) {
-        return (T) res;
-      }
-
-      List<StructuredProperty> ids = getPidsAndAltIds(res).collect(Collectors.toList());
-      if (ids
-        .stream()
-        .anyMatch(
-          p -> p
-            .getQualifier()
-            .getClassid()
-            .equals(PidType.doi.toString()) && pidInParam(p.getValue(), verifyParam))) {
-        res
-          .setCountry(
-            res
-              .getCountry()
-              .stream()
-              .filter(
-                c -> toTakeCountry(c, country))
-              .collect(Collectors.toList()));
-      }
-
-      return (T) res;
-    } else {
-      return value;
-    }
-  }
-
-  private static <T extends Result> Stream<StructuredProperty> getPidsAndAltIds(T r) {
-    final Stream<StructuredProperty> resultPids = Optional
-      .ofNullable(r.getPid())
-      .map(Collection::stream)
-      .orElse(Stream.empty());
-
-    final Stream<StructuredProperty> instancePids = Optional
-      .ofNullable(r.getInstance())
-      .map(
-        instance -> instance
-          .stream()
-          .flatMap(
-            i -> Optional
-              .ofNullable(i.getPid())
-              .map(Collection::stream)
-              .orElse(Stream.empty())))
-      .orElse(Stream.empty());
-
-    final Stream<StructuredProperty> instanceAltIds = Optional
-      .ofNullable(r.getInstance())
-      .map(
-        instance -> instance
-          .stream()
-          .flatMap(
-            i -> Optional
-              .ofNullable(i.getAlternateIdentifier())
-              .map(Collection::stream)
-              .orElse(Stream.empty())))
-      .orElse(Stream.empty());
-
-    return Stream
-      .concat(
-        Stream.concat(resultPids, instancePids),
-        instanceAltIds);
-  }
-
-  private static boolean pidInParam(String value, String[] verifyParam) {
-    for (String s : verifyParam)
-      if (value.startsWith(s))
-        return true;
-    return false;
-  }
-
-  private static boolean toTakeCountry(Country c, String country) {
-    // If dataInfo is not set, or dataInfo.inferenceprovenance is not set or not present then it cannot be
-    // inserted via propagation
-    if (!Optional.ofNullable(c.getDataInfo()).isPresent())
-      return true;
-    if (!Optional.ofNullable(c.getDataInfo().getInferenceprovenance()).isPresent())
-      return true;
-    return !(c
-      .getClassid()
-      .equalsIgnoreCase(country) &&
-      c.getDataInfo().getInferenceprovenance().equals("propagation"));
-  }
-
   public static <T extends Oaf> T fixVocabularyNames(T value) {
-    if (value instanceof OafEntity) {
+    if (value instanceof Datasource) {
+      // nothing to clean here
+    } else if (value instanceof Project) {
+      // nothing to clean here
+    } else if (value instanceof Organization) {
+      Organization o = (Organization) value;
+      if (Objects.nonNull(o.getCountry())) {
+        fixVocabName(o.getCountry(), ModelConstants.DNET_COUNTRY_TYPE);
+      }
+    } else if (value instanceof Relation) {
+      // nothing to clean here
+    } else if (value instanceof Result) {
 
-      OafEntity e = (OafEntity) value;
-
-      Optional
-        .ofNullable(e.getPid())
-        .ifPresent(pid -> pid.forEach(p -> fixVocabName(p.getQualifier(), ModelConstants.DNET_PID_TYPES)));
-
-      if (value instanceof Result) {
       Result r = (Result) value;
 
       fixVocabName(r.getLanguage(), ModelConstants.DNET_LANGUAGES);
@@ -253,11 +57,6 @@ public class GraphCleaningFunctions extends CleaningFunctions {
       for (Instance i : r.getInstance()) {
         fixVocabName(i.getAccessright(), ModelConstants.DNET_ACCESS_MODES);
         fixVocabName(i.getRefereed(), ModelConstants.DNET_REVIEW_LEVELS);
-        Optional
-          .ofNullable(i.getPid())
-          .ifPresent(
-            pid -> pid.forEach(p -> fixVocabName(p.getQualifier(), ModelConstants.DNET_PID_TYPES)));
-
       }
     }
     if (Objects.nonNull(r.getAuthor())) {
@@ -278,47 +77,16 @@ public class GraphCleaningFunctions extends CleaningFunctions {
     } else if (value instanceof Software) {
 
     }
-    } else if (value instanceof Datasource) {
-      // nothing to clean here
-    } else if (value instanceof Project) {
-      // nothing to clean here
-    } else if (value instanceof Organization) {
-      Organization o = (Organization) value;
-      if (Objects.nonNull(o.getCountry())) {
-        fixVocabName(o.getCountry(), ModelConstants.DNET_COUNTRY_TYPE);
-      }
-
-    }
-    } else if (value instanceof Relation) {
-      // nothing to clean here
     }
 
     return value;
   }
 
   public static <T extends Oaf> boolean filter(T value) {
-    if (!(value instanceof Relation) && (Boolean.TRUE
-      .equals(
-        Optional
-          .ofNullable(value)
-          .map(
-            o -> Optional
-              .ofNullable(o.getDataInfo())
-              .map(
-                d -> Optional
-                  .ofNullable(d.getInvisible())
-                  .orElse(true))
-              .orElse(false))
-          .orElse(true)))) {
-      return true;
-    }
-
     if (value instanceof Datasource) {
-      final Datasource d = (Datasource) value;
-      return Objects.nonNull(d.getOfficialname()) && StringUtils.isNotBlank(d.getOfficialname().getValue());
+      // nothing to evaluate here
     } else if (value instanceof Project) {
-      final Project p = (Project) value;
-      return Objects.nonNull(p.getCode()) && StringUtils.isNotBlank(p.getCode().getValue());
+      // nothing to evaluate here
    } else if (value instanceof Organization) {
      // nothing to evaluate here
    } else if (value instanceof Relation) {
@@ -344,21 +112,7 @@ public class GraphCleaningFunctions extends CleaningFunctions {
     return true;
   }
 
-  public static <T extends Oaf> T cleanup(T value, VocabularyGroup vocs) {
-
-    if (Objects.isNull(value.getDataInfo())) {
-      final DataInfo d = new DataInfo();
-      d.setDeletedbyinference(false);
-      value.setDataInfo(d);
-    }
-
-    if (value instanceof OafEntity) {
-
-      OafEntity e = (OafEntity) value;
-      if (Objects.nonNull(e.getPid())) {
-        e.setPid(processPidCleaning(e.getPid()));
-      }
-
+  public static <T extends Oaf> T cleanup(T value) {
     if (value instanceof Datasource) {
       // nothing to clean here
     } else if (value instanceof Project) {
@@ -368,20 +122,21 @@ public class GraphCleaningFunctions extends CleaningFunctions {
       if (Objects.isNull(o.getCountry()) || StringUtils.isBlank(o.getCountry().getClassid())) {
         o.setCountry(ModelConstants.UNKNOWN_COUNTRY);
       }
+    } else if (value instanceof Relation) {
+      Relation r = (Relation) value;
+
+      Optional<String> validationDate = doCleanDate(r.getValidationDate());
+      if (validationDate.isPresent()) {
+        r.setValidationDate(validationDate.get());
+        r.setValidated(true);
+      } else {
+        r.setValidationDate(null);
+        r.setValidated(false);
+      }
     } else if (value instanceof Result) {
 
       Result r = (Result) value;
 
-      if (Objects.isNull(r.getContext())) {
-        r.setContext(new ArrayList<>());
-      }
-
-      if (Objects.nonNull(r.getFulltext())
-        && (ModelConstants.SOFTWARE_RESULTTYPE_CLASSID.equals(r.getResulttype().getClassid()) ||
-          ModelConstants.DATASET_RESULTTYPE_CLASSID.equals(r.getResulttype().getClassid()))) {
-        r.setFulltext(null);
-
-      }
-
      if (Objects.nonNull(r.getDateofacceptance())) {
        Optional<String> date = cleanDateField(r.getDateofacceptance());
        if (date.isPresent()) {
@@ -406,26 +161,8 @@ public class GraphCleaningFunctions extends CleaningFunctions {
            .filter(sp -> StringUtils.isNotBlank(sp.getValue()))
            .collect(Collectors.toList()));
      }
-      if (Objects.nonNull(r.getPublisher())) {
-        if (StringUtils.isBlank(r.getPublisher().getValue())) {
+      if (Objects.nonNull(r.getPublisher()) && StringUtils.isBlank(r.getPublisher().getValue())) {
        r.setPublisher(null);
-        } else {
-          r
-            .getPublisher()
-            .setValue(
-              r
-                .getPublisher()
-                .getValue()
-                .replaceAll(NAME_CLEANING_REGEX, " "));
-
-          if (vocs.vocabularyExists(DNET_PUBLISHERS)) {
-            vocs
-              .find(DNET_PUBLISHERS)
-              .map(voc -> voc.getTermBySynonym(r.getPublisher().getValue()))
-              .map(VocabularyTerm::getName)
-              .ifPresent(publisher -> r.getPublisher().setValue(publisher));
-          }
-        }
      }
      if (Objects.isNull(r.getLanguage()) || StringUtils.isBlank(r.getLanguage().getClassid())) {
        r
@@ -433,8 +170,8 @@ public class GraphCleaningFunctions extends CleaningFunctions {
            qualifier("und", "Undetermined", ModelConstants.DNET_LANGUAGES));
      }
      if (Objects.nonNull(r.getSubject())) {
-        List<Subject> subjects = Lists
-          .newArrayList(
+        r
+          .setSubject(
            r
              .getSubject()
              .stream()
@@ -442,26 +179,8 @@ public class GraphCleaningFunctions extends CleaningFunctions {
              .filter(sp -> StringUtils.isNotBlank(sp.getValue()))
              .filter(sp -> Objects.nonNull(sp.getQualifier()))
              .filter(sp -> StringUtils.isNotBlank(sp.getQualifier().getClassid()))
-              .map(s -> {
-                if ("dnet:result_subject".equals(s.getQualifier().getClassid())) {
-                  s.getQualifier().setClassid(ModelConstants.DNET_SUBJECT_TYPOLOGIES);
-                  s.getQualifier().setClassname(ModelConstants.DNET_SUBJECT_TYPOLOGIES);
-                }
-                return s;
-              })
              .map(GraphCleaningFunctions::cleanValue)
-              .collect(
-                Collectors
-                  .toMap(
-                    s -> Optional
-                      .ofNullable(s.getQualifier())
-                      .map(q -> q.getClassid() + s.getValue())
-                      .orElse(s.getValue()),
-                    Function.identity(),
-                    (s1, s2) -> Collections
-                      .min(Lists.newArrayList(s1, s2), new SubjectProvenanceComparator())))
-              .values());
-        r.setSubject(subjects);
+              .collect(Collectors.toList()));
      }
      if (Objects.nonNull(r.getTitle())) {
        r
@@ -476,29 +195,14 @@ public class GraphCleaningFunctions extends CleaningFunctions {
              final String title = sp
                .getValue()
                .toLowerCase();
-              final String decoded = Unidecode.decode(title);
-
-              if (StringUtils.contains(decoded, TITLE_TEST)) {
-                return decoded
-                  .replaceAll(TITLE_FILTER_REGEX, "")
-                  .length() > TITLE_FILTER_RESIDUAL_LENGTH;
-              }
-              return !decoded
-                .replaceAll("\\W|\\d", "")
-                .isEmpty();
+              final String residual = Unidecode
+                .decode(title)
+                .replaceAll(TITLE_FILTER_REGEX, "");
+              return residual.length() > TITLE_FILTER_RESIDUAL_LENGTH;
            })
            .map(GraphCleaningFunctions::cleanValue)
            .collect(Collectors.toList()));
      }
-      if (Objects.nonNull(r.getFormat())) {
-        r
-          .setFormat(
-            r
-              .getFormat()
-              .stream()
-              .map(GraphCleaningFunctions::cleanValue)
-              .collect(Collectors.toList()));
-      }
      if (Objects.nonNull(r.getDescription())) {
        r
          .setDescription(
@@ -508,10 +212,11 @@ public class GraphCleaningFunctions extends CleaningFunctions {
            .filter(Objects::nonNull)
            .filter(sp -> StringUtils.isNotBlank(sp.getValue()))
            .map(GraphCleaningFunctions::cleanValue)
-            .sorted((s1, s2) -> s2.getValue().length() - s1.getValue().length())
-            .limit(ModelHardLimits.MAX_ABSTRACTS)
            .collect(Collectors.toList()));
      }
+      if (Objects.nonNull(r.getPid())) {
+        r.setPid(processPidCleaning(r.getPid()));
+      }
      if (Objects.isNull(r.getResourcetype()) || StringUtils.isBlank(r.getResourcetype().getClassid())) {
        r
          .setResourcetype(
@@ -520,40 +225,6 @@ public class GraphCleaningFunctions extends CleaningFunctions {
      if (Objects.nonNull(r.getInstance())) {
 
        for (Instance i : r.getInstance()) {
-          if (!vocs
-            .termExists(ModelConstants.DNET_PUBLICATION_RESOURCE, i.getInstancetype().getClassid())) {
-            if (r instanceof Publication) {
-              i
-                .setInstancetype(
-                  OafMapperUtils
-                    .qualifier(
-                      "0038", "Other literature type",
-                      ModelConstants.DNET_PUBLICATION_RESOURCE,
-                      ModelConstants.DNET_PUBLICATION_RESOURCE));
-            } else if (r instanceof Dataset) {
-              i
-                .setInstancetype(
-                  OafMapperUtils
-                    .qualifier(
-                      "0039", "Other dataset type", ModelConstants.DNET_PUBLICATION_RESOURCE,
-                      ModelConstants.DNET_PUBLICATION_RESOURCE));
-            } else if (r instanceof Software) {
-              i
-                .setInstancetype(
-                  OafMapperUtils
-                    .qualifier(
-                      "0040", "Other software type", ModelConstants.DNET_PUBLICATION_RESOURCE,
-                      ModelConstants.DNET_PUBLICATION_RESOURCE));
-            } else if (r instanceof OtherResearchProduct) {
-              i
-                .setInstancetype(
-                  OafMapperUtils
-                    .qualifier(
-                      "0020", "Other ORP type", ModelConstants.DNET_PUBLICATION_RESOURCE,
-                      ModelConstants.DNET_PUBLICATION_RESOURCE));
-            }
-          }
-
          if (Objects.nonNull(i.getPid())) {
            i.setPid(processPidCleaning(i.getPid()));
          }
@@ -563,29 +234,16 @@ public class GraphCleaningFunctions extends CleaningFunctions {
          Optional
            .ofNullable(i.getPid())
            .ifPresent(pid -> {
-              final Set<HashableStructuredProperty> pids = pid
-                .stream()
-                .map(HashableStructuredProperty::newInstance)
-                .collect(Collectors.toCollection(HashSet::new));
+              final Set<StructuredProperty> pids = Sets.newHashSet(pid);
              Optional
                .ofNullable(i.getAlternateIdentifier())
                .ifPresent(altId -> {
-                  final Set<HashableStructuredProperty> altIds = altId
-                    .stream()
-                    .map(HashableStructuredProperty::newInstance)
-                    .collect(Collectors.toCollection(HashSet::new));
-                  i
-                    .setAlternateIdentifier(
-                      Sets
-                        .difference(altIds, pids)
-                        .stream()
-                        .map(HashableStructuredProperty::toStructuredProperty)
-                        .collect(Collectors.toList()));
+                  final Set<StructuredProperty> altIds = Sets.newHashSet(altId);
+                  i.setAlternateIdentifier(Lists.newArrayList(Sets.difference(altIds, pids)));
                });
            });
 
-          if (Objects.isNull(i.getAccessright())
-            || StringUtils.isBlank(i.getAccessright().getClassid())) {
+          if (Objects.isNull(i.getAccessright()) || StringUtils.isBlank(i.getAccessright().getClassid())) {
|
|
||||||
i
|
i
|
||||||
.setAccessright(
|
.setAccessright(
|
||||||
accessRight(
|
accessRight(
|
||||||
|
@ -595,46 +253,9 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
if (Objects.isNull(i.getHostedby()) || StringUtils.isBlank(i.getHostedby().getKey())) {
|
if (Objects.isNull(i.getHostedby()) || StringUtils.isBlank(i.getHostedby().getKey())) {
|
||||||
i.setHostedby(ModelConstants.UNKNOWN_REPOSITORY);
|
i.setHostedby(ModelConstants.UNKNOWN_REPOSITORY);
|
||||||
}
|
}
|
||||||
if (Objects.isNull(i.getRefereed()) || StringUtils.isBlank(i.getRefereed().getClassid())) {
|
if (Objects.isNull(i.getRefereed())) {
|
||||||
i.setRefereed(qualifier("0000", "Unknown", ModelConstants.DNET_REVIEW_LEVELS));
|
i.setRefereed(qualifier("0000", "Unknown", ModelConstants.DNET_REVIEW_LEVELS));
|
||||||
}
|
}
|
||||||
|
|
||||||
if (Objects.nonNull(i.getLicense()) && Objects.nonNull(i.getLicense().getValue())) {
|
|
||||||
vocs
|
|
||||||
.find(DNET_LICENSES)
|
|
||||||
.map(voc -> voc.getTermBySynonym(i.getLicense().getValue()))
|
|
||||||
.map(VocabularyTerm::getId)
|
|
||||||
.ifPresent(license -> i.getLicense().setValue(license));
|
|
||||||
}
|
|
||||||
|
|
||||||
// from the script from Dimitris
|
|
||||||
if ("0000".equals(i.getRefereed().getClassid())) {
|
|
||||||
final boolean isFromCrossref = Optional
|
|
||||||
.ofNullable(i.getCollectedfrom())
|
|
||||||
.map(KeyValue::getKey)
|
|
||||||
.map(id -> id.equals(ModelConstants.CROSSREF_ID))
|
|
||||||
.orElse(false);
|
|
||||||
final boolean hasDoi = Optional
|
|
||||||
.ofNullable(i.getPid())
|
|
||||||
.map(
|
|
||||||
pid -> pid
|
|
||||||
.stream()
|
|
||||||
.anyMatch(
|
|
||||||
p -> PidType.doi.toString().equals(p.getQualifier().getClassid())))
|
|
||||||
.orElse(false);
|
|
||||||
final boolean isPeerReviewedType = PEER_REVIEWED_TYPES
|
|
||||||
.contains(i.getInstancetype().getClassname());
|
|
||||||
final boolean noOtherLitType = r
|
|
||||||
.getInstance()
|
|
||||||
.stream()
|
|
||||||
.noneMatch(ii -> "Other literature type".equals(ii.getInstancetype().getClassname()));
|
|
||||||
if (isFromCrossref && hasDoi && isPeerReviewedType && noOtherLitType) {
|
|
||||||
i.setRefereed(qualifier("0001", "peerReviewed", ModelConstants.DNET_REVIEW_LEVELS));
|
|
||||||
} else {
|
|
||||||
i.setRefereed(qualifier("0002", "nonPeerReviewed", ModelConstants.DNET_REVIEW_LEVELS));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (Objects.nonNull(i.getDateofacceptance())) {
|
if (Objects.nonNull(i.getDateofacceptance())) {
|
||||||
Optional<String> date = cleanDateField(i.getDateofacceptance());
|
Optional<String> date = cleanDateField(i.getDateofacceptance());
|
||||||
if (date.isPresent()) {
|
if (date.isPresent()) {
|
||||||
|
@ -643,24 +264,9 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
i.setDateofacceptance(null);
|
i.setDateofacceptance(null);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
if (StringUtils.isNotBlank(i.getFulltext()) &&
|
|
||||||
(ModelConstants.SOFTWARE_RESULTTYPE_CLASSID.equals(r.getResulttype().getClassid()) ||
|
|
||||||
ModelConstants.DATASET_RESULTTYPE_CLASSID.equals(r.getResulttype().getClassid()))) {
|
|
||||||
i.setFulltext(null);
|
|
||||||
}
|
|
||||||
if (Objects.nonNull(i.getUrl())) {
|
|
||||||
i
|
|
||||||
.setUrl(
|
|
||||||
i
|
|
||||||
.getUrl()
|
|
||||||
.stream()
|
|
||||||
.filter(GraphCleaningFunctions::urlFilter)
|
|
||||||
.collect(Collectors.toList()));
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
if (Objects.isNull(r.getBestaccessright()) || StringUtils.isBlank(r.getBestaccessright().getClassid())) {
|
||||||
if (Objects.isNull(r.getBestaccessright())
|
|
||||||
|| StringUtils.isBlank(r.getBestaccessright().getClassid())) {
|
|
||||||
Qualifier bestaccessrights = OafMapperUtils.createBestAccessRights(r.getInstance());
|
Qualifier bestaccessrights = OafMapperUtils.createBestAccessRights(r.getInstance());
|
||||||
if (Objects.isNull(bestaccessrights)) {
|
if (Objects.isNull(bestaccessrights)) {
|
||||||
r
|
r
|
||||||
|
@ -679,8 +285,8 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
.getAuthor()
|
.getAuthor()
|
||||||
.stream()
|
.stream()
|
||||||
.filter(Objects::nonNull)
|
.filter(Objects::nonNull)
|
||||||
.filter(GraphCleaningFunctions::isValidAuthorName)
|
.filter(a -> StringUtils.isNotBlank(a.getFullname()))
|
||||||
.map(GraphCleaningFunctions::cleanupAuthor)
|
.filter(a -> StringUtils.isNotBlank(a.getFullname().replaceAll("[\\W]", "")))
|
||||||
.collect(Collectors.toList()));
|
.collect(Collectors.toList()));
|
||||||
|
|
||||||
boolean nullRank = r
|
boolean nullRank = r
|
||||||
|
@ -706,20 +312,23 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
.filter(Objects::nonNull)
|
.filter(Objects::nonNull)
|
||||||
.filter(p -> Objects.nonNull(p.getQualifier()))
|
.filter(p -> Objects.nonNull(p.getQualifier()))
|
||||||
.filter(p -> StringUtils.isNotBlank(p.getValue()))
|
.filter(p -> StringUtils.isNotBlank(p.getValue()))
|
||||||
.filter(
|
|
||||||
p -> StringUtils
|
|
||||||
.contains(StringUtils.lowerCase(p.getQualifier().getClassid()), ORCID))
|
|
||||||
.map(p -> {
|
.map(p -> {
|
||||||
// hack to distinguish orcid from orcid_pending
|
// hack to distinguish orcid from orcid_pending
|
||||||
String pidProvenance = getProvenance(p.getDataInfo());
|
String pidProvenance = Optional
|
||||||
|
.ofNullable(p.getDataInfo())
|
||||||
|
.map(
|
||||||
|
d -> Optional
|
||||||
|
.ofNullable(d.getProvenanceaction())
|
||||||
|
.map(Qualifier::getClassid)
|
||||||
|
.orElse(""))
|
||||||
|
.orElse("");
|
||||||
if (p
|
if (p
|
||||||
.getQualifier()
|
.getQualifier()
|
||||||
.getClassid()
|
.getClassid()
|
||||||
.toLowerCase()
|
.toLowerCase()
|
||||||
.contains(ModelConstants.ORCID)) {
|
.contains(ModelConstants.ORCID)) {
|
||||||
if (pidProvenance
|
if (pidProvenance
|
||||||
.equals(ModelConstants.SYSIMPORT_CROSSWALK_ENTITYREGISTRY) ||
|
.equals(ModelConstants.SYSIMPORT_CROSSWALK_ENTITYREGISTRY)) {
|
||||||
pidProvenance.equals("ORCID_ENRICHMENT")) {
|
|
||||||
p.getQualifier().setClassid(ModelConstants.ORCID);
|
p.getQualifier().setClassid(ModelConstants.ORCID);
|
||||||
} else {
|
} else {
|
||||||
p.getQualifier().setClassid(ModelConstants.ORCID_PENDING);
|
p.getQualifier().setClassid(ModelConstants.ORCID_PENDING);
|
||||||
|
@ -760,54 +369,11 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
} else if (value instanceof Software) {
|
} else if (value instanceof Software) {
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|
||||||
}
|
|
||||||
|
|
||||||
} else if (value instanceof Relation) {
|
|
||||||
Relation r = (Relation) value;
|
|
||||||
|
|
||||||
Optional<String> validationDate = doCleanDate(r.getValidationDate());
|
|
||||||
if (validationDate.isPresent()) {
|
|
||||||
r.setValidationDate(validationDate.get());
|
|
||||||
r.setValidated(true);
|
|
||||||
} else {
|
|
||||||
r.setValidationDate(null);
|
|
||||||
r.setValidated(false);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
return value;
|
return value;
|
||||||
}
|
}
|
||||||
|
|
||||||
private static Author cleanupAuthor(Author author) {
|
|
||||||
if (StringUtils.isNotBlank(author.getFullname())) {
|
|
||||||
author
|
|
||||||
.setFullname(
|
|
||||||
author
|
|
||||||
.getFullname()
|
|
||||||
.replaceAll(NAME_CLEANING_REGEX, " ")
|
|
||||||
.replace("\"", "\\\""));
|
|
||||||
}
|
|
||||||
if (StringUtils.isNotBlank(author.getName())) {
|
|
||||||
author
|
|
||||||
.setName(
|
|
||||||
author
|
|
||||||
.getName()
|
|
||||||
.replaceAll(NAME_CLEANING_REGEX, " ")
|
|
||||||
.replace("\"", "\\\""));
|
|
||||||
}
|
|
||||||
if (StringUtils.isNotBlank(author.getSurname())) {
|
|
||||||
author
|
|
||||||
.setSurname(
|
|
||||||
author
|
|
||||||
.getSurname()
|
|
||||||
.replaceAll(NAME_CLEANING_REGEX, " ")
|
|
||||||
.replace("\"", "\\\""));
|
|
||||||
}
|
|
||||||
|
|
||||||
return author;
|
|
||||||
}
|
|
||||||
|
|
||||||
private static Optional<String> cleanDateField(Field<String> dateofacceptance) {
|
private static Optional<String> cleanDateField(Field<String> dateofacceptance) {
|
||||||
return Optional
|
return Optional
|
||||||
.ofNullable(dateofacceptance)
|
.ofNullable(dateofacceptance)
|
||||||
|
@ -841,32 +407,14 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
// HELPERS
|
// HELPERS
|
||||||
|
|
||||||
private static boolean isValidAuthorName(Author a) {
|
private static boolean isValidAuthorName(Author a) {
|
||||||
return StringUtils.isNotBlank(a.getFullname()) &&
|
return !Stream
|
||||||
StringUtils.isNotBlank(a.getFullname().replaceAll("[\\W]", "")) &&
|
|
||||||
!INVALID_AUTHOR_NAMES.contains(StringUtils.lowerCase(a.getFullname()).trim()) &&
|
|
||||||
!Stream
|
|
||||||
.of(a.getFullname(), a.getName(), a.getSurname())
|
.of(a.getFullname(), a.getName(), a.getSurname())
|
||||||
.filter(StringUtils::isNotBlank)
|
.filter(s -> s != null && !s.isEmpty())
|
||||||
.collect(Collectors.joining(""))
|
.collect(Collectors.joining(""))
|
||||||
.toLowerCase()
|
.toLowerCase()
|
||||||
.matches(INVALID_AUTHOR_REGEX);
|
.matches(INVALID_AUTHOR_REGEX);
|
||||||
}
|
}
|
||||||
|
|
||||||
private static boolean urlFilter(String u) {
|
|
||||||
try {
|
|
||||||
final URL url = new URL(u);
|
|
||||||
if (StringUtils.isBlank(url.getPath()) || "/".equals(url.getPath())) {
|
|
||||||
return false;
|
|
||||||
}
|
|
||||||
if (INVALID_URL_HOSTS.contains(url.getHost())) {
|
|
||||||
return false;
|
|
||||||
}
|
|
||||||
return !INVALID_URLS.contains(url.toString());
|
|
||||||
} catch (MalformedURLException ex) {
|
|
||||||
return false;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
private static List<StructuredProperty> processPidCleaning(List<StructuredProperty> pids) {
|
private static List<StructuredProperty> processPidCleaning(List<StructuredProperty> pids) {
|
||||||
return pids
|
return pids
|
||||||
.stream()
|
.stream()
|
||||||
|
@ -875,7 +423,7 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
.filter(sp -> !PID_BLACKLIST.contains(sp.getValue().trim().toLowerCase()))
|
.filter(sp -> !PID_BLACKLIST.contains(sp.getValue().trim().toLowerCase()))
|
||||||
.filter(sp -> Objects.nonNull(sp.getQualifier()))
|
.filter(sp -> Objects.nonNull(sp.getQualifier()))
|
||||||
.filter(sp -> StringUtils.isNotBlank(sp.getQualifier().getClassid()))
|
.filter(sp -> StringUtils.isNotBlank(sp.getQualifier().getClassid()))
|
||||||
.map(PidCleaner::normalizePidValue)
|
.map(CleaningFunctions::normalizePidValue)
|
||||||
.filter(CleaningFunctions::pidFilter)
|
.filter(CleaningFunctions::pidFilter)
|
||||||
.collect(Collectors.toList());
|
.collect(Collectors.toList());
|
||||||
}
|
}
|
||||||
|
@ -904,152 +452,9 @@ public class GraphCleaningFunctions extends CleaningFunctions {
|
||||||
return s;
|
return s;
|
||||||
}
|
}
|
||||||
|
|
||||||
protected static Subject cleanValue(Subject s) {
|
|
||||||
s.setValue(s.getValue().replaceAll(CLEANING_REGEX, " "));
|
|
||||||
return s;
|
|
||||||
}
|
|
||||||
|
|
||||||
protected static Field<String> cleanValue(Field<String> s) {
|
protected static Field<String> cleanValue(Field<String> s) {
|
||||||
s.setValue(s.getValue().replaceAll(CLEANING_REGEX, " "));
|
s.setValue(s.getValue().replaceAll(CLEANING_REGEX, " "));
|
||||||
return s;
|
return s;
|
||||||
}
|
}
|
||||||
|
|
||||||
public static OafEntity applyCoarVocabularies(OafEntity entity, VocabularyGroup vocs) {
|
|
||||||
|
|
||||||
if (entity instanceof Result) {
|
|
||||||
final Result result = (Result) entity;
|
|
||||||
|
|
||||||
Optional
|
|
||||||
.ofNullable(result.getInstance())
|
|
||||||
.ifPresent(
|
|
||||||
instances -> instances
|
|
||||||
.forEach(
|
|
||||||
instance -> {
|
|
||||||
if (Objects.isNull(instance.getInstanceTypeMapping())) {
|
|
||||||
List<InstanceTypeMapping> mapping = Lists.newArrayList();
|
|
||||||
mapping
|
|
||||||
.add(
|
|
||||||
OafMapperUtils
|
|
||||||
.instanceTypeMapping(
|
|
||||||
instance.getInstancetype().getClassname(),
|
|
||||||
OPENAIRE_COAR_RESOURCE_TYPES_3_1));
|
|
||||||
instance.setInstanceTypeMapping(mapping);
|
|
||||||
}
|
|
||||||
Optional<InstanceTypeMapping> optionalItm = instance
|
|
||||||
.getInstanceTypeMapping()
|
|
||||||
.stream()
|
|
||||||
.filter(GraphCleaningFunctions::originalResourceType)
|
|
||||||
.findFirst();
|
|
||||||
if (optionalItm.isPresent()) {
|
|
||||||
InstanceTypeMapping coarItm = optionalItm.get();
|
|
||||||
Optional
|
|
||||||
.ofNullable(
|
|
||||||
vocs
|
|
||||||
.lookupTermBySynonym(
|
|
||||||
OPENAIRE_COAR_RESOURCE_TYPES_3_1, coarItm.getOriginalType()))
|
|
||||||
.ifPresent(type -> {
|
|
||||||
coarItm.setTypeCode(type.getClassid());
|
|
||||||
coarItm.setTypeLabel(type.getClassname());
|
|
||||||
});
|
|
||||||
final List<InstanceTypeMapping> mappings = Lists.newArrayList();
|
|
||||||
if (vocs.vocabularyExists(OPENAIRE_USER_RESOURCE_TYPES)) {
|
|
||||||
Optional
|
|
||||||
.ofNullable(
|
|
||||||
vocs
|
|
||||||
.lookupTermBySynonym(
|
|
||||||
OPENAIRE_USER_RESOURCE_TYPES, coarItm.getTypeCode()))
|
|
||||||
.ifPresent(
|
|
||||||
type -> mappings
|
|
||||||
.add(
|
|
||||||
OafMapperUtils
|
|
||||||
.instanceTypeMapping(coarItm.getTypeCode(), type)));
|
|
||||||
}
|
|
||||||
if (!mappings.isEmpty()) {
|
|
||||||
instance.getInstanceTypeMapping().addAll(mappings);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}));
|
|
||||||
result.setMetaResourceType(getMetaResourceType(result.getInstance(), vocs));
|
|
||||||
}
|
|
||||||
|
|
||||||
return entity;
|
|
||||||
}
|
|
||||||
|
|
||||||
private static boolean originalResourceType(InstanceTypeMapping itm) {
|
|
||||||
return StringUtils.isNotBlank(itm.getOriginalType()) &&
|
|
||||||
OPENAIRE_COAR_RESOURCE_TYPES_3_1.equals(itm.getVocabularyName()) &&
|
|
||||||
StringUtils.isBlank(itm.getTypeCode()) &&
|
|
||||||
StringUtils.isBlank(itm.getTypeLabel());
|
|
||||||
}
|
|
||||||
|
|
||||||
private static Qualifier getMetaResourceType(final List<Instance> instances, final VocabularyGroup vocs) {
|
|
||||||
return Optional
|
|
||||||
.ofNullable(instances)
|
|
||||||
.map(ii -> {
|
|
||||||
if (vocs.vocabularyExists(OPENAIRE_META_RESOURCE_TYPE)) {
|
|
||||||
Optional<InstanceTypeMapping> itm = ii
|
|
||||||
.stream()
|
|
||||||
.filter(Objects::nonNull)
|
|
||||||
.flatMap(
|
|
||||||
i -> Optional
|
|
||||||
.ofNullable(i.getInstanceTypeMapping())
|
|
||||||
.map(Collection::stream)
|
|
||||||
.orElse(Stream.empty()))
|
|
||||||
.filter(t -> OPENAIRE_COAR_RESOURCE_TYPES_3_1.equals(t.getVocabularyName()))
|
|
||||||
.findFirst();
|
|
||||||
|
|
||||||
if (!itm.isPresent() || Objects.isNull(itm.get().getTypeCode())) {
|
|
||||||
return null;
|
|
||||||
} else {
|
|
||||||
final String typeCode = itm.get().getTypeCode();
|
|
||||||
return Optional
|
|
||||||
.ofNullable(vocs.lookupTermBySynonym(OPENAIRE_META_RESOURCE_TYPE, typeCode))
|
|
||||||
.orElseThrow(
|
|
||||||
() -> new IllegalStateException("unable to find a synonym for '" + typeCode + "' in " +
|
|
||||||
OPENAIRE_META_RESOURCE_TYPE));
|
|
||||||
}
|
|
||||||
} else {
|
|
||||||
throw new IllegalStateException("vocabulary '" + OPENAIRE_META_RESOURCE_TYPE + "' not available");
|
|
||||||
}
|
|
||||||
})
|
|
||||||
.orElse(null);
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Implements bad and ugly things that we should get rid of ASAP.
|
|
||||||
*
|
|
||||||
* @param value
|
|
||||||
* @return
|
|
||||||
* @param <T>
|
|
||||||
*/
|
|
||||||
public static <T extends Oaf> T dedicatedUglyHacks(T value) {
|
|
||||||
if (value instanceof OafEntity) {
|
|
||||||
if (value instanceof Result) {
|
|
||||||
final Result r = (Result) value;
|
|
||||||
|
|
||||||
// Fix for AMS Acta
|
|
||||||
Optional
|
|
||||||
.ofNullable(r.getInstance())
|
|
||||||
.map(
|
|
||||||
instance -> instance
|
|
||||||
.stream()
|
|
||||||
.filter(
|
|
||||||
i -> Optional
|
|
||||||
.ofNullable(i.getHostedby())
|
|
||||||
.map(KeyValue::getKey)
|
|
||||||
.map(dsId -> dsId.equals("10|re3data_____::4cc76bed7ce2fb95fd8e7a2dfde16016"))
|
|
||||||
.orElse(false)))
|
|
||||||
.ifPresent(instance -> instance.forEach(i -> {
|
|
||||||
if (Optional
|
|
||||||
.ofNullable(i.getPid())
|
|
||||||
.map(pid -> pid.stream().noneMatch(p -> p.getValue().startsWith("10.6092/unibo/amsacta")))
|
|
||||||
.orElse(false)) {
|
|
||||||
i.setHostedby(UNKNOWN_REPOSITORY);
|
|
||||||
}
|
|
||||||
}));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
return value;
|
|
||||||
}
|
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|
|
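A minimal sketch (not part of the diff) of the orcid/orcid_pending decision taken in the author PID cleaning above. The provenance literal for SYSIMPORT_CROSSWALK_ENTITYREGISTRY is an assumption about what the ModelConstants value resolves to; "ORCID_ENRICHMENT" is taken verbatim from the removed line.

// Sketch only: classify an author PID the way the main branch does.
static String orcidClassidFor(String pidProvenance) {
  // assumed value of ModelConstants.SYSIMPORT_CROSSWALK_ENTITYREGISTRY
  if ("sysimport:crosswalk:entityregistry".equals(pidProvenance)
    || "ORCID_ENRICHMENT".equals(pidProvenance)) {
    return "orcid"; // trusted provenance
  }
  return "orcid_pending"; // anything else stays pending
}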
@@ -1,24 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GridCleaningRule {

  public static final Pattern PATTERN = Pattern.compile("(?<grid>\\d{4,6}\\.[0-9a-z]{1,2})");

  public static String clean(String grid) {
    String s = grid
      .replaceAll("\\s", "")
      .toLowerCase();

    Matcher m = PATTERN.matcher(s);
    if (m.find()) {
      return "grid." + m.group("grid");
    }

    return "";
  }

}
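A hedged usage sketch of the removed rule; the expected outputs follow directly from the pattern above (the input strings are hypothetical).

String a = GridCleaningRule.clean(" GRID.1017.7 "); // -> "grid.1017.7"
String b = GridCleaningRule.clean("not a grid id"); // -> ""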
@@ -1,21 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// https://www.wikidata.org/wiki/Property:P213
public class ISNICleaningRule {

  public static final Pattern PATTERN = Pattern.compile("([0]{4}) ?([0-9]{4}) ?([0-9]{4}) ?([0-9]{3}[0-9X])");

  public static String clean(final String isni) {

    Matcher m = PATTERN.matcher(isni);
    if (m.find()) {
      return String.join("", m.group(1), m.group(2), m.group(3), m.group(4));
    } else {
      return "";
    }
  }
}
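A hedged usage sketch: the four captured groups are re-joined without the optional spaces (the input value is hypothetical).

String isni = ISNICleaningRule.clean("0000 0001 2150 090X"); // -> "000000012150090X"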
@@ -1,294 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import static com.google.common.base.Preconditions.checkArgument;
import static eu.dnetlib.dhp.schema.common.ModelConstants.*;

import java.io.Serializable;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.*;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.apache.commons.codec.binary.Hex;
import org.apache.commons.lang3.StringUtils;

import com.google.common.collect.HashBiMap;
import com.google.common.collect.Maps;

import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.*;

/**
 * Factory class for OpenAIRE identifiers in the Graph
 */
public class IdentifierFactory implements Serializable {

  public static final String ID_SEPARATOR = "::";
  public static final String ID_PREFIX_SEPARATOR = "|";

  public static final int ID_PREFIX_LEN = 12;

  /**
   * Declares the associations PID_TYPE -> [DATASOURCE ID, NAME] considered authoritative for that PID_TYPE.
   * The id of the record (source_::id) will be rewritten as (pidType_::id).
   */
  public static final Map<PidType, HashBiMap<String, String>> PID_AUTHORITY = Maps.newHashMap();

  static {
    PID_AUTHORITY.put(PidType.doi, HashBiMap.create());
    PID_AUTHORITY.get(PidType.doi).put(CROSSREF_ID, "Crossref");
    PID_AUTHORITY.get(PidType.doi).put(DATACITE_ID, "Datacite");
    PID_AUTHORITY.get(PidType.doi).put(ZENODO_OD_ID, "ZENODO");
    PID_AUTHORITY.get(PidType.doi).put(ZENODO_R3_ID, "Zenodo");

    PID_AUTHORITY.put(PidType.pmc, HashBiMap.create());
    PID_AUTHORITY.get(PidType.pmc).put(EUROPE_PUBMED_CENTRAL_ID, "Europe PubMed Central");
    PID_AUTHORITY.get(PidType.pmc).put(PUBMED_CENTRAL_ID, "PubMed Central");

    PID_AUTHORITY.put(PidType.pmid, HashBiMap.create());
    PID_AUTHORITY.get(PidType.pmid).put(EUROPE_PUBMED_CENTRAL_ID, "Europe PubMed Central");
    PID_AUTHORITY.get(PidType.pmid).put(PUBMED_CENTRAL_ID, "PubMed Central");

    PID_AUTHORITY.put(PidType.arXiv, HashBiMap.create());
    PID_AUTHORITY.get(PidType.arXiv).put(ARXIV_ID, "arXiv.org e-Print Archive");

    PID_AUTHORITY.put(PidType.w3id, HashBiMap.create());
    PID_AUTHORITY.get(PidType.w3id).put(ROHUB_ID, "ROHub");
  }

  /**
   * Declares the associations PID_TYPE -> [DATASOURCE ID, PID SUBSTRING] considered as delegated authority for that
   * PID_TYPE. For example, Zenodo is delegated to forge DOIs that contain the word 'zenodo'.
   *
   * If a record with the same id (same pid) comes from 2 data sources, the one coming from a delegated source wins. E.g. Zenodo records win over those from Datacite.
   * See also https://code-repo.d4science.org/D-Net/dnet-hadoop/pulls/187 and the class dhp-common/src/main/java/eu/dnetlib/dhp/schema/oaf/utils/OafMapperUtils.java
   */
  public static final Map<PidType, Map<String, String>> DELEGATED_PID_AUTHORITY = Maps.newHashMap();

  static {
    DELEGATED_PID_AUTHORITY.put(PidType.doi, new HashMap<>());
    DELEGATED_PID_AUTHORITY.get(PidType.doi).put(ZENODO_OD_ID, "zenodo");
    DELEGATED_PID_AUTHORITY.get(PidType.doi).put(ZENODO_R3_ID, "zenodo");
    DELEGATED_PID_AUTHORITY.put(PidType.w3id, new HashMap<>());
    DELEGATED_PID_AUTHORITY.get(PidType.w3id).put(ROHUB_ID, "ro-id");
  }

  /**
   * Declares the associations PID_TYPE -> [DATASOURCE ID, NAME] whose records are considered enrichment for the graph.
   * Their OpenAIRE ID is built from the declared PID type, and they are merged with their corresponding record,
   * identified by the same OpenAIRE id.
   */
  public static final Map<PidType, HashBiMap<String, String>> ENRICHMENT_PROVIDER = Maps.newHashMap();

  static {
    ENRICHMENT_PROVIDER.put(PidType.doi, HashBiMap.create());
    ENRICHMENT_PROVIDER.get(PidType.doi).put(OPEN_APC_ID, OPEN_APC_NAME);
  }

  public static Set<String> delegatedAuthorityDatasourceIds() {
    return DELEGATED_PID_AUTHORITY
      .values()
      .stream()
      .flatMap(m -> m.keySet().stream())
      .collect(Collectors.toCollection(HashSet::new));
  }

  public static List<StructuredProperty> getPids(List<StructuredProperty> pid, KeyValue collectedFrom) {
    return pidFromInstance(pid, collectedFrom, true).distinct().collect(Collectors.toList());
  }

  public static <T extends Result> String createDOIBoostIdentifier(T entity) {
    if (entity == null)
      return null;

    StructuredProperty pid = null;
    if (entity.getPid() != null) {
      pid = entity
        .getPid()
        .stream()
        .filter(Objects::nonNull)
        .filter(s -> s.getQualifier() != null && "doi".equalsIgnoreCase(s.getQualifier().getClassid()))
        .filter(CleaningFunctions::pidFilter)
        .findAny()
        .orElse(null);
    } else {
      if (entity.getInstance() != null) {
        pid = entity
          .getInstance()
          .stream()
          .filter(i -> i.getPid() != null)
          .flatMap(i -> i.getPid().stream())
          .filter(CleaningFunctions::pidFilter)
          .findAny()
          .orElse(null);
      }
    }
    if (pid != null)
      return idFromPid(entity, pid, true);
    return null;
  }

  /**
   * Creates an identifier from the most relevant PID (if available) provided by a known PID authority in the given
   * entity T. Returns entity.id when none of the PIDs meets the selection criteria.
   *
   * @param entity the entity providing PIDs and a default ID.
   * @param <T> the specific entity type. Currently Organization and Result subclasses are supported.
   * @param md5 indicates whether the PID value should be hashed or not.
   * @return an identifier from the most relevant PID, entity.id otherwise
   */
  public static <T extends OafEntity> String createIdentifier(T entity, boolean md5) {

    checkArgument(StringUtils.isNoneBlank(entity.getId()), "missing entity identifier");

    final Map<String, Set<StructuredProperty>> pids = extractPids(entity);

    return pids
      .values()
      .stream()
      .flatMap(Set::stream)
      .min(new PidComparator<>(entity))
      .map(
        min -> Optional
          .ofNullable(pids.get(min.getQualifier().getClassid()))
          .map(
            p -> p
              .stream()
              .sorted(new PidValueComparator())
              .findFirst()
              .map(s -> idFromPid(entity, s, md5))
              .orElseGet(entity::getId))
          .orElseGet(entity::getId))
      .orElseGet(entity::getId);
  }

  private static <T extends OafEntity> Map<String, Set<StructuredProperty>> extractPids(T entity) {
    if (entity instanceof Result) {
      return Optional
        .ofNullable(((Result) entity).getInstance())
        .map(IdentifierFactory::mapPids)
        .orElse(new HashMap<>());
    } else {
      return entity
        .getPid()
        .stream()
        .map(PidCleaner::normalizePidValue)
        .filter(CleaningFunctions::pidFilter)
        .collect(
          Collectors
            .groupingBy(
              p -> p.getQualifier().getClassid(),
              Collectors.mapping(p -> p, Collectors.toCollection(HashSet::new))));
    }
  }

  private static Map<String, Set<StructuredProperty>> mapPids(List<Instance> instance) {
    return instance
      .stream()
      .map(i -> pidFromInstance(i.getPid(), i.getCollectedfrom(), false))
      .flatMap(Function.identity())
      .collect(
        Collectors
          .groupingBy(
            p -> p.getQualifier().getClassid(),
            Collectors.mapping(p -> p, Collectors.toCollection(HashSet::new))));
  }

  private static Stream<StructuredProperty> pidFromInstance(List<StructuredProperty> pid, KeyValue collectedFrom,
    boolean mapHandles) {
    return Optional
      .ofNullable(pid)
      .map(
        pp -> pp
          .stream()
          // filter away PIDs provided by a DS that is not considered an authority for the
          // given PID Type
          .filter(p -> shouldFilterPidByCriteria(collectedFrom, p, mapHandles))
          .map(PidCleaner::normalizePidValue)
          .filter(p -> isNotFromDelegatedAuthority(collectedFrom, p))
          .filter(CleaningFunctions::pidFilter))
      .orElse(Stream.empty());
  }

  private static boolean shouldFilterPidByCriteria(KeyValue collectedFrom, StructuredProperty p, boolean mapHandles) {
    final PidType pType = PidType.tryValueOf(p.getQualifier().getClassid());

    if (Objects.isNull(collectedFrom)) {
      return false;
    }

    boolean isEnrich = Optional
      .ofNullable(ENRICHMENT_PROVIDER.get(pType))
      .map(
        enrich -> enrich.containsKey(collectedFrom.getKey())
          || enrich.containsValue(collectedFrom.getValue()))
      .orElse(false);

    boolean isAuthority = Optional
      .ofNullable(PID_AUTHORITY.get(pType))
      .map(
        authorities -> authorities.containsKey(collectedFrom.getKey())
          || authorities.containsValue(collectedFrom.getValue()))
      .orElse(false);

    return (mapHandles && pType.equals(PidType.handle)) || isEnrich || isAuthority;
  }

  private static boolean isNotFromDelegatedAuthority(KeyValue collectedFrom, StructuredProperty p) {
    final PidType pType = PidType.tryValueOf(p.getQualifier().getClassid());

    final Map<String, String> da = DELEGATED_PID_AUTHORITY.get(pType);
    if (Objects.isNull(da)) {
      return true;
    }
    if (!da.containsKey(collectedFrom.getKey())) {
      return true;
    }
    return StringUtils.contains(p.getValue(), da.get(collectedFrom.getKey()));
  }

  /**
   * @see {@link IdentifierFactory#createIdentifier(OafEntity, boolean)}
   */
  public static <T extends OafEntity> String createIdentifier(T entity) {

    return createIdentifier(entity, true);
  }

  private static <T extends OafEntity> String idFromPid(T entity, StructuredProperty s, boolean md5) {
    return idFromPid(ModelSupport.getIdPrefix(entity.getClass()), s.getQualifier().getClassid(), s.getValue(), md5);
  }

  public static String idFromPid(String numericPrefix, String pidType, String pidValue, boolean md5) {
    return new StringBuilder()
      .append(numericPrefix)
      .append(ID_PREFIX_SEPARATOR)
      .append(createPrefix(pidType))
      .append(ID_SEPARATOR)
      .append(md5 ? md5(pidValue) : pidValue)
      .toString();
  }

  // create the prefix (length = 12)
  private static String createPrefix(String pidType) {
    StringBuilder prefix = new StringBuilder(StringUtils.left(pidType, ID_PREFIX_LEN));
    while (prefix.length() < ID_PREFIX_LEN) {
      prefix.append("_");
    }
    return prefix.substring(0, ID_PREFIX_LEN);
  }

  public static String md5(final String s) {
    try {
      final MessageDigest md = MessageDigest.getInstance("MD5");
      md.update(s.getBytes(StandardCharsets.UTF_8));
      return new String(Hex.encodeHex(md.digest()));
    } catch (final Exception e) {
      return null;
    }
  }

}
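The identifier layout produced by idFromPid and createPrefix above can be illustrated with a small, hypothetical example; the "50" numeric prefix and the DOI value are assumptions for illustration, not taken from the diff.

// numericPrefix | pidType padded to 12 chars :: md5(pidValue)
String id = IdentifierFactory.idFromPid("50", "doi", "10.1000/182", true);
// -> "50|doi_________::" + md5 hex of "10.1000/182"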
@@ -1,78 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;
import java.util.HashSet;
import java.util.Optional;
import java.util.stream.Collectors;

//
// Source code recreated from a .class file by IntelliJ IDEA
// (powered by FernFlower decompiler)
//
import eu.dnetlib.dhp.schema.common.EntityType;
import eu.dnetlib.dhp.schema.oaf.KeyValue;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Result;

public class MergeComparator implements Comparator<Oaf> {
  public MergeComparator() {
  }

  public int compare(Oaf left, Oaf right) {
    // nulls at the end
    if (left == null && right == null) {
      return 0;
    } else if (left == null) {
      return -1;
    } else if (right == null) {
      return 1;
    }

    // invisible
    if (left.getDataInfo() != null && left.getDataInfo().getInvisible() == true) {
      if (right.getDataInfo() != null && right.getDataInfo().getInvisible() == false) {
        return -1;
      }
    }

    // collectedfrom
    HashSet<String> lCf = getCollectedFromIds(left);
    HashSet<String> rCf = getCollectedFromIds(right);
    if (lCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")
      && !rCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")) {
      return -1;
    } else if (!lCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")
      && rCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")) {
      return 1;
    }

    SubEntityType lClass = SubEntityType.fromClass(left.getClass());
    SubEntityType rClass = SubEntityType.fromClass(right.getClass());
    return lClass.ordinal() - rClass.ordinal();

  }

  protected HashSet<String> getCollectedFromIds(Oaf left) {
    return (HashSet) Optional.ofNullable(left.getCollectedfrom()).map((cf) -> {
      return (HashSet) cf.stream().map(KeyValue::getKey).collect(Collectors.toCollection(HashSet::new));
    }).orElse(new HashSet());
  }

  enum SubEntityType {
    publication, dataset, software, otherresearchproduct, datasource, organization, project;

    /**
     * Resolves the EntityType, given the relative class name
     *
     * @param clazz the given class name
     * @param <T> actual OafEntity subclass
     * @return the EntityType associated to the given class
     */
    public static <T extends Oaf> SubEntityType fromClass(Class<T> clazz) {
      return valueOf(clazz.getSimpleName().toLowerCase());
    }
  }

}
@@ -1,106 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.*;

import eu.dnetlib.dhp.schema.common.ModelConstants;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Result;

public class MergeEntitiesComparator implements Comparator<Oaf> {
  static final List<String> PID_AUTHORITIES = Arrays
    .asList(
      ModelConstants.ARXIV_ID,
      ModelConstants.PUBMED_CENTRAL_ID,
      ModelConstants.EUROPE_PUBMED_CENTRAL_ID,
      ModelConstants.DATACITE_ID,
      ModelConstants.CROSSREF_ID);

  static final List<String> RESULT_TYPES = Arrays
    .asList(
      ModelConstants.ORP_RESULTTYPE_CLASSID,
      ModelConstants.SOFTWARE_RESULTTYPE_CLASSID,
      ModelConstants.DATASET_RESULTTYPE_CLASSID,
      ModelConstants.PUBLICATION_RESULTTYPE_CLASSID);

  public static final Comparator<Oaf> INSTANCE = new MergeEntitiesComparator();

  @Override
  public int compare(Oaf left, Oaf right) {
    if (left == null && right == null)
      return 0;
    if (left == null)
      return -1;
    if (right == null)
      return 1;

    int res = 0;

    // pid authority
    int cfp1 = Optional
      .ofNullable(left.getCollectedfrom())
      .map(
        cf -> cf
          .stream()
          .map(kv -> PID_AUTHORITIES.indexOf(kv.getKey()))
          .max(Integer::compare)
          .orElse(-1))
      .orElse(-1);
    int cfp2 = Optional
      .ofNullable(right.getCollectedfrom())
      .map(
        cf -> cf
          .stream()
          .map(kv -> PID_AUTHORITIES.indexOf(kv.getKey()))
          .max(Integer::compare)
          .orElse(-1))
      .orElse(-1);

    if (cfp1 >= 0 && cfp1 > cfp2) {
      return 1;
    } else if (cfp2 >= 0 && cfp2 > cfp1) {
      return -1;
    }

    // trust
    if (left.getDataInfo() != null && right.getDataInfo() != null) {
      res = left.getDataInfo().getTrust().compareTo(right.getDataInfo().getTrust());
    }

    // result type
    if (res == 0) {
      if (left instanceof Result && right instanceof Result) {
        Result r1 = (Result) left;
        Result r2 = (Result) right;

        if (r1.getResulttype() == null || r1.getResulttype().getClassid() == null) {
          if (r2.getResulttype() != null && r2.getResulttype().getClassid() != null) {
            return -1;
          }
        } else if (r2.getResulttype() == null || r2.getResulttype().getClassid() == null) {
          return 1;
        }

        int rt1 = RESULT_TYPES.indexOf(r1.getResulttype().getClassid());
        int rt2 = RESULT_TYPES.indexOf(r2.getResulttype().getClassid());

        if (rt1 >= 0 && rt1 > rt2) {
          return 1;
        } else if (rt2 >= 0 && rt2 > rt1) {
          return -1;
        }
      }
    }

    // id
    if (res == 0) {
      if (left instanceof OafEntity && right instanceof OafEntity) {
        res = ((OafEntity) right).getId().compareTo(((OafEntity) left).getId());
      }
    }

    return res;
  }

}
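A hedged usage sketch: since compare() ranks the more authoritative record higher (PID authority, then trust, then result type, then id), the preferred duplicate could be picked with a stream max; the records list is hypothetical.

Oaf preferred = records
  .stream()
  .max(MergeEntitiesComparator.INSTANCE)
  .orElse(null);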
File diff suppressed because it is too large
@@ -1,27 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

public class ModelHardLimits {

  private ModelHardLimits() {
  }

  public static final String LAYOUT = "index";
  public static final String INTERPRETATION = "openaire";
  public static final String SEPARATOR = "-";

  public static final int MAX_EXTERNAL_ENTITIES = 50;
  public static final int MAX_AUTHORS = 200;
  public static final int MAX_AUTHOR_FULLNAME_LENGTH = 1000;
  public static final int MAX_TITLE_LENGTH = 5000;
  public static final int MAX_TITLES = 10;
  public static final int MAX_ABSTRACTS = 10;
  public static final int MAX_ABSTRACT_LENGTH = 150000;
  public static final int MAX_RELATED_ABSTRACT_LENGTH = 500;
  public static final int MAX_INSTANCES = 10;

  public static String getCollectionName(String format) {
    return format + SEPARATOR + LAYOUT + SEPARATOR + INTERPRETATION;
  }

}
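getCollectionName simply joins the format name with the fixed layout and interpretation constants; with a hypothetical format name:

String name = ModelHardLimits.getCollectionName("DMF"); // -> "DMF-index-openaire"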
@@ -3,8 +3,6 @@ package eu.dnetlib.dhp.schema.oaf.utils;

import static eu.dnetlib.dhp.schema.common.ModelConstants.*;

-import java.sql.Array;
-import java.sql.SQLException;
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
@@ -14,6 +12,7 @@ import java.util.stream.Collectors;
import org.apache.commons.lang3.StringUtils;

import eu.dnetlib.dhp.schema.common.AccessRightComparator;
+import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.*;

public class OafMapperUtils {
@@ -21,6 +20,42 @@ public class OafMapperUtils {
  private OafMapperUtils() {
  }

+  public static Oaf merge(final Oaf left, final Oaf right) {
+    if (ModelSupport.isSubClass(left, OafEntity.class)) {
+      return mergeEntities((OafEntity) left, (OafEntity) right);
+    } else if (ModelSupport.isSubClass(left, Relation.class)) {
+      ((Relation) left).mergeFrom((Relation) right);
+    } else {
+      throw new IllegalArgumentException("invalid Oaf type:" + left.getClass().getCanonicalName());
+    }
+    return left;
+  }
+
+  public static OafEntity mergeEntities(OafEntity left, OafEntity right) {
+    if (ModelSupport.isSubClass(left, Result.class)) {
+      return mergeResults((Result) left, (Result) right);
+    } else if (ModelSupport.isSubClass(left, Datasource.class)) {
+      left.mergeFrom(right);
+    } else if (ModelSupport.isSubClass(left, Organization.class)) {
+      left.mergeFrom(right);
+    } else if (ModelSupport.isSubClass(left, Project.class)) {
+      left.mergeFrom(right);
+    } else {
+      throw new IllegalArgumentException("invalid OafEntity subtype:" + left.getClass().getCanonicalName());
+    }
+    return left;
+  }
+
+  public static Result mergeResults(Result left, Result right) {
+    if (new ResultTypeComparator().compare(left, right) < 0) {
+      left.mergeFrom(right);
+      return left;
+    } else {
+      right.mergeFrom(left);
+      return right;
+    }
+  }
+
  public static KeyValue keyValue(final String k, final String v) {
    final KeyValue kv = new KeyValue();
    kv.setKey(k);
@@ -60,17 +95,6 @@ public class OafMapperUtils {
      .collect(Collectors.toList());
  }

-  public static <T> List<T> listValues(Array values) throws SQLException {
-    if (Objects.isNull(values)) {
-      return null;
-    }
-    return Arrays
-      .stream((T[]) values.getArray())
-      .filter(Objects::nonNull)
-      .distinct()
-      .collect(Collectors.toList());
-  }
-
  public static List<Field<String>> listFields(final DataInfo info, final List<String> values) {
    return values
      .stream()
@@ -80,30 +104,8 @@ public class OafMapperUtils {
      .collect(Collectors.toList());
  }

-  public static InstanceTypeMapping instanceTypeMapping(String originalType, String code, String label,
-    String vocabularyName) {
-    final InstanceTypeMapping m = new InstanceTypeMapping();
-    m.setVocabularyName(vocabularyName);
-    m.setOriginalType(originalType);
-    m.setTypeCode(code);
-    m.setTypeLabel(label);
-    return m;
-  }
-
-  public static InstanceTypeMapping instanceTypeMapping(String originalType, Qualifier term) {
-    return instanceTypeMapping(originalType, term.getClassid(), term.getClassname(), term.getSchemeid());
-  }
-
-  public static InstanceTypeMapping instanceTypeMapping(String originalType) {
-    return instanceTypeMapping(originalType, null, null, null);
-  }
-
-  public static InstanceTypeMapping instanceTypeMapping(String originalType, String vocabularyName) {
-    return instanceTypeMapping(originalType, null, null, vocabularyName);
-  }
-
  public static Qualifier unknown(final String schemeid, final String schemename) {
-    return qualifier(UNKNOWN, "Unknown", schemeid, schemename);
+    return qualifier("UNKNOWN", "Unknown", schemeid, schemename);
  }

  public static AccessRight accessRight(
@@ -151,17 +153,6 @@ public class OafMapperUtils {
    return q;
  }

-  public static Subject subject(
-    final String value,
-    final String classid,
-    final String classname,
-    final String schemeid,
-    final String schemename,
-    final DataInfo dataInfo) {
-
-    return subject(value, qualifier(classid, classname, schemeid, schemename), dataInfo);
-  }
-
  public static StructuredProperty structuredProperty(
    final String value,
    final String classid,
@@ -173,20 +164,6 @@ public class OafMapperUtils {
    return structuredProperty(value, qualifier(classid, classname, schemeid, schemename), dataInfo);
  }

-  public static Subject subject(
-    final String value,
-    final Qualifier qualifier,
-    final DataInfo dataInfo) {
-    if (value == null) {
-      return null;
-    }
-    final Subject s = new Subject();
-    s.setValue(value);
-    s.setQualifier(qualifier);
-    s.setDataInfo(dataInfo);
-    return s;
-  }
-
  public static StructuredProperty structuredProperty(
    final String value,
    final Qualifier qualifier,
@@ -391,88 +368,4 @@ public class OafMapperUtils {
    }
    return null;
  }
-
-  public static KeyValue newKeyValueInstance(String key, String value, DataInfo dataInfo) {
-    KeyValue kv = new KeyValue();
-    kv.setDataInfo(dataInfo);
-    kv.setKey(key);
-    kv.setValue(value);
-    return kv;
-  }
-
-  public static Measure newMeasureInstance(String id, String value, String key, DataInfo dataInfo) {
-    Measure m = new Measure();
-    m.setId(id);
-    m.setUnit(Arrays.asList(newKeyValueInstance(key, value, dataInfo)));
-    return m;
-  }
-
-  public static Relation getRelation(final String source,
-    final String target,
-    final String relType,
-    final String subRelType,
-    final String relClass,
-    final OafEntity entity) {
-    return getRelation(source, target, relType, subRelType, relClass, entity, null);
-  }
-
-  public static Relation getRelation(final String source,
-    final String target,
-    final String relType,
-    final String subRelType,
-    final String relClass,
-    final OafEntity entity,
-    final String validationDate) {
-    return getRelation(
-      source, target, relType, subRelType, relClass, entity.getCollectedfrom(), entity.getDataInfo(),
-      entity.getLastupdatetimestamp(), validationDate, null);
-  }
-
-  public static Relation getRelation(final String source,
-    final String target,
-    final String relType,
-    final String subRelType,
-    final String relClass,
-    final List<KeyValue> collectedfrom,
-    final DataInfo dataInfo,
-    final Long lastupdatetimestamp) {
-    return getRelation(
-      source, target, relType, subRelType, relClass, collectedfrom, dataInfo, lastupdatetimestamp, null, null);
-  }
-
-  public static Relation getRelation(final String source,
-    final String target,
-    final String relType,
-    final String subRelType,
-    final String relClass,
-    final List<KeyValue> collectedfrom,
-    final DataInfo dataInfo,
-    final Long lastupdatetimestamp,
-    final String validationDate,
-    final List<KeyValue> properties) {
-    final Relation rel = new Relation();
-    rel.setRelType(relType);
-    rel.setSubRelType(subRelType);
-    rel.setRelClass(relClass);
-    rel.setSource(source);
-    rel.setTarget(target);
-    rel.setCollectedfrom(collectedfrom);
-    rel.setDataInfo(dataInfo);
-    rel.setLastupdatetimestamp(lastupdatetimestamp);
-    rel.setValidated(StringUtils.isNotBlank(validationDate));
-    rel.setValidationDate(StringUtils.isNotBlank(validationDate) ? validationDate : null);
-    rel.setProperties(properties);
-    return rel;
-  }
-
-  public static String getProvenance(DataInfo dataInfo) {
-    return Optional
-      .ofNullable(dataInfo)
-      .map(
-        d -> Optional
-          .ofNullable(d.getProvenanceaction())
-          .map(Qualifier::getClassid)
-          .orElse(""))
-      .orElse("");
-  }
}
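A minimal sketch of how the merge entry point added on the doiboost_wf side dispatches; the two Publication instances are hypothetical.

Publication a = new Publication();
Publication b = new Publication();
Oaf merged = OafMapperUtils.merge(a, b); // delegates to mergeEntities, then mergeResults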
@@ -1,46 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;

import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class OrganizationPidComparator implements Comparator<StructuredProperty> {

  @Override
  public int compare(StructuredProperty left, StructuredProperty right) {
    if (left == null) {
      return right == null ? 0 : -1;
    } else if (right == null) {
      return 1;
    }

    PidType lClass = PidType.tryValueOf(left.getQualifier().getClassid());
    PidType rClass = PidType.tryValueOf(right.getQualifier().getClassid());

    if (lClass.equals(rClass))
      return 0;

    if (lClass.equals(PidType.openorgs))
      return -1;
    if (rClass.equals(PidType.openorgs))
      return 1;

    if (lClass.equals(PidType.GRID))
      return -1;
    if (rClass.equals(PidType.GRID))
      return 1;

    if (lClass.equals(PidType.mag_id))
      return -1;
    if (rClass.equals(PidType.mag_id))
      return 1;

    if (lClass.equals(PidType.urn))
      return -1;
    if (rClass.equals(PidType.urn))
      return 1;

    return 0;
  }
}
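The comparator above encodes the preference openorgs, then GRID, then mag_id, then urn; selecting the best organisation PID could be sketched as follows (the pids list is hypothetical).

StructuredProperty best = pids
  .stream()
  .min(new OrganizationPidComparator())
  .orElse(null);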
@@ -1,21 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PICCleaningRule {

  public static final Pattern PATTERN = Pattern.compile("\\d{9}");

  public static String clean(final String pic) {

    Matcher m = PATTERN.matcher(pic);
    if (m.find()) {
      return m.group();
    } else {
      return "";
    }
  }

}
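A hedged usage sketch: the first nine-digit run is extracted, everything else is dropped (the input string is hypothetical).

String pic = PICCleaningRule.clean("PIC: 999988877"); // -> "999988877"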
@@ -1,8 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.HashMap;
import java.util.HashSet;

public class PidBlacklist extends HashMap<String, HashSet<String>> {
}
@@ -1,40 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.io.IOException;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;

import org.apache.commons.io.IOUtils;

import com.fasterxml.jackson.databind.ObjectMapper;

public class PidBlacklistProvider {

  private static final PidBlacklist blacklist;

  static {
    try {
      String json = IOUtils.toString(IdentifierFactory.class.getResourceAsStream("pid_blacklist.json"));
      blacklist = new ObjectMapper().readValue(json, PidBlacklist.class);

    } catch (IOException e) {
      throw new RuntimeException(e);
    }
  }

  public static PidBlacklist getBlacklist() {
    return blacklist;
  }

  public static Set<String> getBlacklist(String pidType) {
    return Optional
      .ofNullable(getBlacklist().get(pidType))
      .orElse(new HashSet<>());
  }

  private PidBlacklistProvider() {
  }

}
@@ -1,62 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Optional;

import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class PidCleaner {

	/**
	 * Utility method that normalises PID values on a per-type basis.
	 * @param pid the PID whose value will be normalised.
	 * @return the PID containing the normalised value.
	 */
	public static StructuredProperty normalizePidValue(StructuredProperty pid) {
		pid
			.setValue(
				normalizePidValue(
					pid.getQualifier().getClassid(),
					pid.getValue()));

		return pid;
	}

	public static String normalizePidValue(String pidType, String pidValue) {
		String value = Optional
			.ofNullable(pidValue)
			.map(String::trim)
			.orElseThrow(() -> new IllegalArgumentException("PID value cannot be empty"));

		switch (pidType) {

			// TODO add cleaning for more PID types as needed

			// Result
			case "doi":
				return DoiCleaningRule.clean(value);
			case "pmid":
				return PmidCleaningRule.clean(value);
			case "pmc":
				return PmcCleaningRule.clean(value);
			case "handle":
			case "arXiv":
				return value;

			// Organization
			case "GRID":
				return GridCleaningRule.clean(value);
			case "ISNI":
				return ISNICleaningRule.clean(value);
			case "ROR":
				return RorCleaningRule.clean(value);
			case "PIC":
				return PICCleaningRule.clean(value);
			case "FundRef":
				return FundRefCleaningRule.clean(value);
			default:
				return value;
		}
	}

}
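
A short sketch of the dispatch above (illustrative, assumes the deleted cleaning rules are on the classpath); a pmid value is trimmed and then stripped of leading zeros by PmidCleaningRule, while types not listed in the switch pass through unchanged:

import eu.dnetlib.dhp.schema.oaf.utils.PidCleaner;

public class PidCleanerExample {
	public static void main(String[] args) {
		// "pmid" is routed to PmidCleaningRule: whitespace is trimmed, leading zeros are dropped
		System.out.println(PidCleaner.normalizePidValue("pmid", " 0012345 ")); // 12345
		// unknown types fall into the default branch and are returned as-is
		System.out.println(PidCleaner.normalizePidValue("w3id", "https://w3id.org/example"));
	}
}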
@@ -1,48 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;

import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Organization;
import eu.dnetlib.dhp.schema.oaf.Result;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class PidComparator<T extends OafEntity> implements Comparator<StructuredProperty> {

	private final T entity;

	public PidComparator(T entity) {
		this.entity = entity;
	}

	@Override
	public int compare(StructuredProperty left, StructuredProperty right) {

		if (left == null && right == null)
			return 0;
		if (left == null)
			return 1;
		if (right == null)
			return -1;

		if (ModelSupport.isSubClass(entity, Result.class)) {
			return compareResultPids(left, right);
		}
		if (ModelSupport.isSubClass(entity, Organization.class)) {
			return compareOrganizationtPids(left, right);
		}

		// Else (but unlikely), lexicographical ordering will do.
		return left.getQualifier().getClassid().compareTo(right.getQualifier().getClassid());
	}

	private int compareResultPids(StructuredProperty left, StructuredProperty right) {
		return new ResultPidComparator().compare(left, right);
	}

	private int compareOrganizationtPids(StructuredProperty left, StructuredProperty right) {
		return new OrganizationPidComparator().compare(left, right);
	}
}
@@ -1,79 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import org.apache.commons.lang3.EnumUtils;

public enum PidType {

	/**
	 * The DOI syntax shall be made up of a DOI prefix and a DOI suffix separated by a forward slash.
	 *
	 * There is no defined limit on the length of the DOI name, or of the DOI prefix or DOI suffix.
	 *
	 * The DOI name is case-insensitive and can incorporate any printable characters from the legal graphic characters
	 * of Unicode. Further constraints on character use (e.g. use of language-specific alphanumeric characters) can be
	 * defined for an application by the ISO 26324 Registration Authority.
	 *
	 *
	 * DOI prefix: The DOI prefix shall be composed of a directory indicator followed by a registrant code.
	 * These two components shall be separated by a full stop (period). The directory indicator shall be "10" and
	 * distinguishes the entire set of character strings (prefix and suffix) as digital object identifiers within the
	 * resolution system.
	 *
	 * Registrant code: The second element of the DOI prefix shall be the registrant code. The registrant code is a
	 * unique string assigned to a registrant.
	 *
	 * DOI suffix: The DOI suffix shall consist of a character string of any length chosen by the registrant.
	 * Each suffix shall be unique to the prefix element that precedes it. The unique suffix can be a sequential number,
	 * or it might incorporate an identifier generated from or based on another system used by the registrant
	 * (e.g. ISAN, ISBN, ISRC, ISSN, ISTC, ISNI; in such cases, a preferred construction for such a suffix can be
	 * specified, as in Example 1).
	 *
	 * Source: https://www.doi.org/doi_handbook/2_Numbering.html#2.2
	 */
	doi,

	/**
	 * PubMed Unique Identifier (PMID)
	 *
	 * This field is a 1-to-8 digit accession number with no leading zeros. It is present on all records and is the
	 * accession number for managing and disseminating records. PMIDs are not reused after records are deleted.
	 *
	 * Beginning in February 2012 PMIDs include extensions following a decimal point to account for article versions
	 * (e.g., 21804956.2). All citations are considered version 1 until replaced. The extended PMID is not displayed
	 * on the MEDLINE format.
	 *
	 * View the citation in abstract format in PubMed to access additional versions when available (see the article in
	 * the Jan-Feb 2012 NLM Technical Bulletin).
	 *
	 * Source: https://www.nlm.nih.gov/bsd/mms/medlineelements.html#pmid
	 */
	pmid,

	/**
	 * This field contains the unique identifier for the cited article in PubMed Central. The identifier begins with the
	 * prefix PMC.
	 *
	 * Source: https://www.nlm.nih.gov/bsd/mms/medlineelements.html#pmc
	 */
	pmc, handle, arXiv, nct, pdb, w3id,

	// Organization
	openorgs, ROR, GRID, PIC, ISNI, Wikidata, FundRef, corda, corda_h2020, mag_id, urn,

	// Used by dedup
	undefined, original;

	public static boolean isValid(String type) {
		return EnumUtils.isValidEnum(PidType.class, type);
	}

	public static PidType tryValueOf(String s) {
		try {
			return PidType.valueOf(s);
		} catch (Exception e) {
			return PidType.original;
		}
	}

}
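
A small sketch (not part of the diff) of how the fallback in tryValueOf behaves: classids that do not match any constant map to PidType.original instead of throwing:

import eu.dnetlib.dhp.schema.oaf.utils.PidType;

public class PidTypeExample {
	public static void main(String[] args) {
		System.out.println(PidType.tryValueOf("doi"));      // doi
		System.out.println(PidType.tryValueOf("orcid_v2")); // original (unknown classid falls back)
		System.out.println(PidType.isValid("GRID"));        // true
	}
}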
@@ -1,33 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;
import java.util.Optional;

import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class PidValueComparator implements Comparator<StructuredProperty> {

	@Override
	public int compare(StructuredProperty left, StructuredProperty right) {

		if (left == null && right == null)
			return 0;
		if (left == null)
			return 1;
		if (right == null)
			return -1;

		StructuredProperty l = PidCleaner.normalizePidValue(left);
		StructuredProperty r = PidCleaner.normalizePidValue(right);

		return Optional
			.ofNullable(l.getValue())
			.map(
				lv -> Optional
					.ofNullable(r.getValue())
					.map(rv -> lv.compareTo(rv))
					.orElse(-1))
			.orElse(1);
	}
}
@@ -1,24 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PmcCleaningRule {

	public static final Pattern PATTERN = Pattern.compile("PMC\\d{1,8}");

	public static String clean(String pmc) {
		String s = pmc
			.replaceAll("\\s", "")
			.toUpperCase();

		final Matcher m = PATTERN.matcher(s);

		if (m.find()) {
			return m.group();
		}
		return "";
	}

}
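
Illustrative behaviour of the rule above (not part of the diff): whitespace is removed, the value is upper-cased, and the first PMC\d{1,8} token is returned:

import eu.dnetlib.dhp.schema.oaf.utils.PmcCleaningRule;

public class PmcCleaningExample {
	public static void main(String[] args) {
		System.out.println(PmcCleaningRule.clean("pmc 1234"));     // PMC1234
		System.out.println(PmcCleaningRule.clean("PMC1234 (v2)")); // PMC1234
		System.out.println(PmcCleaningRule.clean("12345"));        // (empty string, no PMC prefix)
	}
}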
@@ -1,25 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// https://researchguides.stevens.edu/c.php?g=442331&p=6577176
public class PmidCleaningRule {

	public static final Pattern PATTERN = Pattern.compile("0*(\\d{1,8})");

	public static String clean(String pmid) {
		String s = pmid
			.toLowerCase()
			.replaceAll("\\s", "");

		final Matcher m = PATTERN.matcher(s);

		if (m.find()) {
			return m.group(1);
		}
		return "";
	}

}
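
Illustrative behaviour of the rule above (not part of the diff): the 0*(\d{1,8}) pattern drops leading zeros and keeps only the numeric accession number:

import eu.dnetlib.dhp.schema.oaf.utils.PmidCleaningRule;

public class PmidCleaningExample {
	public static void main(String[] args) {
		System.out.println(PmidCleaningRule.clean("0012345"));        // 12345
		System.out.println(PmidCleaningRule.clean("PMID: 21804956")); // 21804956
	}
}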
@@ -1,46 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;

import eu.dnetlib.dhp.schema.oaf.Qualifier;

/**
 * Comparator for sorting the values from the dnet:review_levels vocabulary, implements the following ordering
 *
 * peerReviewed (0001) > nonPeerReviewed (0002) > UNKNOWN (0000)
 */
public class RefereedComparator implements Comparator<Qualifier> {

	@Override
	public int compare(Qualifier left, Qualifier right) {
		if (left == null || left.getClassid() == null) {
			return (right == null || right.getClassid() == null) ? 0 : -1;
		} else if (right == null || right.getClassid() == null) {
			return 1;
		}

		String lClass = left.getClassid();
		String rClass = right.getClassid();

		if (lClass.equals(rClass))
			return 0;

		if ("0001".equals(lClass))
			return -1;
		if ("0001".equals(rClass))
			return 1;

		if ("0002".equals(lClass))
			return -1;
		if ("0002".equals(rClass))
			return 1;

		if ("0000".equals(lClass))
			return -1;
		if ("0000".equals(rClass))
			return 1;

		return 0;
	}
}
@@ -1,56 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;

import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class ResultPidComparator implements Comparator<StructuredProperty> {

	@Override
	public int compare(StructuredProperty left, StructuredProperty right) {

		PidType lClass = PidType.tryValueOf(left.getQualifier().getClassid());
		PidType rClass = PidType.tryValueOf(right.getQualifier().getClassid());

		if (lClass.equals(rClass))
			return 0;

		if (lClass.equals(PidType.doi))
			return -1;
		if (rClass.equals(PidType.doi))
			return 1;

		if (lClass.equals(PidType.pmid))
			return -1;
		if (rClass.equals(PidType.pmid))
			return 1;

		if (lClass.equals(PidType.pmc))
			return -1;
		if (rClass.equals(PidType.pmc))
			return 1;

		if (lClass.equals(PidType.handle))
			return -1;
		if (rClass.equals(PidType.handle))
			return 1;

		if (lClass.equals(PidType.arXiv))
			return -1;
		if (rClass.equals(PidType.arXiv))
			return 1;

		if (lClass.equals(PidType.nct))
			return -1;
		if (rClass.equals(PidType.nct))
			return 1;

		if (lClass.equals(PidType.pdb))
			return -1;
		if (rClass.equals(PidType.pdb))
			return 1;

		return 0;
	}
}
@@ -1,27 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// https://ror.readme.io/docs/ror-identifier-pattern
public class RorCleaningRule {

	public static final String ROR_PREFIX = "https://ror.org/";

	private static final Pattern PATTERN = Pattern.compile("(?<ror>0[a-hj-km-np-tv-z|0-9]{6}[0-9]{2})");

	public static String clean(String ror) {
		String s = ror
			.replaceAll("\\s", "")
			.toLowerCase();

		Matcher m = PATTERN.matcher(s);

		if (m.find()) {
			return ROR_PREFIX + m.group("ror");
		}
		return "";
	}

}
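
Illustrative behaviour of the rule above (not part of the diff): the 9-character ROR suffix is extracted from the lower-cased input and re-prefixed with https://ror.org/ :

import eu.dnetlib.dhp.schema.oaf.utils.RorCleaningRule;

public class RorCleaningExample {
	public static void main(String[] args) {
		System.out.println(RorCleaningRule.clean("https://ror.org/02mhbdp94")); // https://ror.org/02mhbdp94
		System.out.println(RorCleaningRule.clean("02MHBDP94"));                 // https://ror.org/02mhbdp94
		System.out.println(RorCleaningRule.clean("not-a-ror"));                 // (empty string)
	}
}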
@@ -1,46 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import static eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils.getProvenance;
import static org.apache.commons.lang3.StringUtils.isBlank;

import java.util.Comparator;

import eu.dnetlib.dhp.schema.oaf.Subject;

public class SubjectProvenanceComparator implements Comparator<Subject> {

	@Override
	public int compare(Subject left, Subject right) {

		String lProv = getProvenance(left.getDataInfo());
		String rProv = getProvenance(right.getDataInfo());

		if (isBlank(lProv) && isBlank(rProv))
			return 0;
		if (isBlank(lProv))
			return 1;
		if (isBlank(rProv))
			return -1;
		if (lProv.equals(rProv))
			return 0;
		if (lProv.toLowerCase().contains("crosswalk"))
			return -1;
		if (rProv.toLowerCase().contains("crosswalk"))
			return 1;
		if (lProv.toLowerCase().contains("user"))
			return -1;
		if (rProv.toLowerCase().contains("user"))
			return 1;
		if (lProv.toLowerCase().contains("propagation"))
			return -1;
		if (rProv.toLowerCase().contains("propagation"))
			return 1;
		if (lProv.toLowerCase().contains("iis"))
			return -1;
		if (rProv.toLowerCase().contains("iis"))
			return 1;

		return 0;
	}
}
@@ -4,19 +4,19 @@ package eu.dnetlib.dhp.utils;

 import java.io.*;
 import java.nio.charset.StandardCharsets;
 import java.security.MessageDigest;
-import java.util.*;
-import java.util.stream.Collectors;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.zip.GZIPInputStream;
+import java.util.zip.GZIPOutputStream;

+import org.apache.commons.codec.binary.Base64;
+import org.apache.commons.codec.binary.Base64OutputStream;
 import org.apache.commons.codec.binary.Hex;
 import org.apache.commons.io.IOUtils;
-import org.apache.commons.lang3.StringUtils;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
-import org.apache.http.client.methods.CloseableHttpResponse;
-import org.apache.http.client.methods.HttpGet;
-import org.apache.http.impl.client.CloseableHttpClient;
-import org.apache.http.impl.client.HttpClients;
 import org.apache.spark.sql.Dataset;
 import org.apache.spark.sql.SaveMode;
 import org.slf4j.Logger;

@@ -26,9 +26,6 @@ import com.fasterxml.jackson.databind.ObjectMapper;

 import com.google.common.collect.Maps;
 import com.jayway.jsonpath.JsonPath;

-import eu.dnetlib.dhp.schema.mdstore.MDStoreWithInfo;
-import eu.dnetlib.dhp.schema.oaf.utils.CleaningFunctions;
-import eu.dnetlib.dhp.schema.oaf.utils.PidCleaner;
 import net.minidev.json.JSONArray;
 import scala.collection.JavaConverters;
 import scala.collection.Seq;

@@ -55,61 +52,10 @@ public class DHPUtils {

 		}
 	}

-	/**
-	 * Retrieves from the metadata store manager application the list of paths associated with mdstores characterized
-	 * by he given format, layout, interpretation
-	 * @param mdstoreManagerUrl the URL of the mdstore manager service
-	 * @param format the mdstore format
-	 * @param layout the mdstore layout
-	 * @param interpretation the mdstore interpretation
-	 * @param includeEmpty include Empty mdstores
-	 * @return the set of hdfs paths
-	 * @throws IOException in case of HTTP communication issues
-	 */
-	public static Set<String> mdstorePaths(final String mdstoreManagerUrl,
-		final String format,
-		final String layout,
-		final String interpretation,
-		boolean includeEmpty) throws IOException {
-		final String url = mdstoreManagerUrl + "/mdstores/";
-		final ObjectMapper objectMapper = new ObjectMapper();
-
-		final HttpGet req = new HttpGet(url);
-
-		log.info("MDStoreManager request: {}", req);
-
-		try (final CloseableHttpClient client = HttpClients.createDefault()) {
-			try (final CloseableHttpResponse response = client.execute(req)) {
-				final String json = IOUtils.toString(response.getEntity().getContent());
-
-				log.info("MDStoreManager response: {}", json);
-
-				final MDStoreWithInfo[] mdstores = objectMapper.readValue(json, MDStoreWithInfo[].class);
-				return Arrays
-					.stream(mdstores)
-					.filter(md -> md.getFormat().equalsIgnoreCase(format))
-					.filter(md -> md.getLayout().equalsIgnoreCase(layout))
-					.filter(md -> md.getInterpretation().equalsIgnoreCase(interpretation))
-					.filter(md -> StringUtils.isNotBlank(md.getHdfsPath()))
-					.filter(md -> StringUtils.isNotBlank(md.getCurrentVersion()))
-					.filter(md -> includeEmpty || md.getSize() > 0)
-					.map(md -> md.getHdfsPath() + "/" + md.getCurrentVersion() + "/store")
-					.collect(Collectors.toSet());
-			}
-		}
-	}
-
 	public static String generateIdentifier(final String originalId, final String nsPrefix) {
 		return String.format("%s::%s", nsPrefix, DHPUtils.md5(originalId));
 	}

-	public static String generateUnresolvedIdentifier(final String pid, final String pidType) {
-
-		final String cleanedPid = PidCleaner.normalizePidValue(pidType, pid);
-
-		return String.format("unresolved::%s::%s", cleanedPid, pidType.toLowerCase().trim());
-	}
-
 	public static String getJPathString(final String jsonPath, final String json) {
 		try {
 			Object o = JsonPath.read(json, jsonPath);
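
For context, an illustrative call of the generateIdentifier helper kept in the hunk above (not part of the diff; the namespace prefix is a made-up example): it joins the prefix and an md5 digest of the original identifier with "::":

import eu.dnetlib.dhp.utils.DHPUtils;

public class IdentifierExample {
	public static void main(String[] args) {
		// produces "<nsPrefix>::<md5(originalId)>", i.e. the prefix followed by a 32-character hex digest
		System.out.println(DHPUtils.generateIdentifier("oai:example.org:record/123", "od______2294"));
	}
}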
@@ -1,101 +0,0 @@

package eu.dnetlib.pace.common;

import java.nio.charset.StandardCharsets;
import java.text.Normalizer;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;

import com.google.common.base.Splitter;
import com.google.common.collect.Iterables;
import com.google.common.collect.Sets;
import com.ibm.icu.text.Transliterator;

/**
 * Set of common functions for the framework
 *
 * @author claudio
 */
public class PaceCommonUtils {

	// transliterator
	protected static Transliterator transliterator = Transliterator.getInstance("Any-Eng");

	protected static final String aliases_from = "⁰¹²³⁴⁵⁶⁷⁸⁹⁺⁻⁼⁽⁾ⁿ₀₁₂₃₄₅₆₇₈₉₊₋₌₍₎àáâäæãåāèéêëēėęəîïíīįìôöòóœøōõûüùúūßśšłžźżçćčñń";
	protected static final String aliases_to = "0123456789+-=()n0123456789+-=()aaaaaaaaeeeeeeeeiiiiiioooooooouuuuussslzzzcccnn";

	protected static Pattern hexUnicodePattern = Pattern.compile("\\\\u(\\p{XDigit}{4})");

	protected static String fixAliases(final String s) {
		final StringBuilder sb = new StringBuilder();

		s.chars().forEach(ch -> {
			final int i = StringUtils.indexOf(aliases_from, ch);
			sb.append(i >= 0 ? aliases_to.charAt(i) : (char) ch);
		});

		return sb.toString();
	}

	protected static String transliterate(final String s) {
		try {
			return transliterator.transliterate(s);
		} catch (Exception e) {
			return s;
		}
	}

	public static String normalize(final String s) {
		return fixAliases(transliterate(nfd(unicodeNormalization(s))))
			.toLowerCase()
			// do not compact the regexes in a single expression, would cause StackOverflowError in case of large input
			// strings
			.replaceAll("[^ \\w]+", "")
			.replaceAll("(\\p{InCombiningDiacriticalMarks})+", "")
			.replaceAll("(\\p{Punct})+", " ")
			.replaceAll("(\\d)+", " ")
			.replaceAll("(\\n)+", " ")
			.trim();
	}

	public static String nfd(final String s) {
		return Normalizer.normalize(s, Normalizer.Form.NFD);
	}

	public static String unicodeNormalization(final String s) {

		Matcher m = hexUnicodePattern.matcher(s);
		StringBuffer buf = new StringBuffer(s.length());
		while (m.find()) {
			String ch = String.valueOf((char) Integer.parseInt(m.group(1), 16));
			m.appendReplacement(buf, Matcher.quoteReplacement(ch));
		}
		m.appendTail(buf);
		return buf.toString();
	}

	public static Set<String> loadFromClasspath(final String classpath) {

		Transliterator transliterator = Transliterator.getInstance("Any-Eng");

		final Set<String> h = Sets.newHashSet();
		try {
			for (final String s : IOUtils
				.readLines(PaceCommonUtils.class.getResourceAsStream(classpath), StandardCharsets.UTF_8)) {
				h.add(fixAliases(transliterator.transliterate(s))); // transliteration of the stopwords
			}
		} catch (final Throwable e) {
			return Sets.newHashSet();
		}
		return h;
	}

	protected static Iterable<String> tokens(final String s, final int maxTokens) {
		return Iterables.limit(Splitter.on(" ").omitEmptyStrings().trimResults().split(s), maxTokens);
	}

}
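
An illustrative call (not part of the diff) of the deleted helpers above: unicodeNormalization replaces literal \uXXXX escape sequences with the characters they encode, and normalize additionally transliterates, lower-cases and strips punctuation and digits:

import eu.dnetlib.pace.common.PaceCommonUtils;

public class NormalizationExample {
	public static void main(String[] args) {
		// the literal escape sequence \u00e9 is turned back into the character 'é'
		System.out.println(PaceCommonUtils.unicodeNormalization("Caf\\u00e9")); // Café
		// normalize() returns a lower-cased form with punctuation and digits removed
		System.out.println(PaceCommonUtils.normalize("Data-Mining, 2nd Edition!"));
	}
}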
@@ -1,156 +0,0 @@

package eu.dnetlib.pace.model;

import java.nio.charset.Charset;
import java.text.Normalizer;
import java.util.List;
import java.util.Set;

import com.google.common.base.Joiner;
import com.google.common.base.Splitter;
import com.google.common.collect.Iterables;
import com.google.common.collect.Lists;
import com.google.common.hash.Hashing;

import eu.dnetlib.pace.common.PaceCommonUtils;
import eu.dnetlib.pace.util.Capitalise;
import eu.dnetlib.pace.util.DotAbbreviations;

public class Person {

	private static final String UTF8 = "UTF-8";
	private List<String> name = Lists.newArrayList();
	private List<String> surname = Lists.newArrayList();
	private List<String> fullname = Lists.newArrayList();
	private final String original;

	private static Set<String> particles = null;

	public Person(String s, final boolean aggressive) {
		original = s;
		s = Normalizer.normalize(s, Normalizer.Form.NFD);
		s = s.replaceAll("\\(.+\\)", "");
		s = s.replaceAll("\\[.+\\]", "");
		s = s.replaceAll("\\{.+\\}", "");
		s = s.replaceAll("\\s+-\\s+", "-");
		s = s.replaceAll("[\\p{Punct}&&[^,-]]", " ");
		s = s.replaceAll("\\d", " ");
		s = s.replaceAll("\\n", " ");
		s = s.replaceAll("\\.", " ");
		s = s.replaceAll("\\s+", " ");

		if (aggressive) {
			s = s.replaceAll("[\\p{InCombiningDiacriticalMarks}&&[^,-]]", "");
			// s = s.replaceAll("[\\W&&[^,-]]", "");
		}

		if (s.contains(",")) { // if the name contains a comma it is easy derivable the name and the surname
			final String[] arr = s.split(",");
			if (arr.length == 1) {
				fullname = splitTerms(arr[0]);
			} else if (arr.length > 1) {
				surname = splitTerms(arr[0]);
				name = splitTerms(arr[1]);
				fullname.addAll(surname);
				fullname.addAll(name);
			}
		} else {
			fullname = splitTerms(s);

			int lastInitialPosition = fullname.size();
			boolean hasSurnameInUpperCase = false;

			for (int i = 0; i < fullname.size(); i++) {
				final String term = fullname.get(i);
				if (term.length() == 1) {
					lastInitialPosition = i;
				} else if (term.equals(term.toUpperCase())) {
					hasSurnameInUpperCase = true;
				}
			}

			if (lastInitialPosition < (fullname.size() - 1)) { // Case: Michele G. Artini
				name = fullname.subList(0, lastInitialPosition + 1);
				surname = fullname.subList(lastInitialPosition + 1, fullname.size());
			} else if (hasSurnameInUpperCase) { // Case: Michele ARTINI
				for (final String term : fullname) {
					if ((term.length() > 1) && term.equals(term.toUpperCase())) {
						surname.add(term);
					} else {
						name.add(term);
					}
				}
			}
		}
	}

	private List<String> splitTerms(final String s) {
		if (particles == null) {
			particles = PaceCommonUtils.loadFromClasspath("/eu/dnetlib/pace/config/name_particles.txt");
		}

		final List<String> list = Lists.newArrayList();
		for (final String part : Splitter.on(" ").omitEmptyStrings().split(s)) {
			if (!particles.contains(part.toLowerCase())) {
				list.add(part);
			}
		}
		return list;
	}

	public List<String> getName() {
		return name;
	}

	public String getNameString() {
		return Joiner.on(" ").join(getName());
	}

	public List<String> getSurname() {
		return surname;
	}

	public List<String> getFullname() {
		return fullname;
	}

	public String getOriginal() {
		return original;
	}

	public String hash() {
		return Hashing.murmur3_128().hashString(getNormalisedFullname(), Charset.forName(UTF8)).toString();
	}

	public String getNormalisedFirstName() {
		return Joiner.on(" ").join(getCapitalFirstnames());
	}

	public String getNormalisedSurname() {
		return Joiner.on(" ").join(getCapitalSurname());
	}

	public String getSurnameString() {
		return Joiner.on(" ").join(getSurname());
	}

	public String getNormalisedFullname() {
		return isAccurate() ? getNormalisedSurname() + ", " + getNormalisedFirstName() : Joiner.on(" ").join(fullname);
	}

	public List<String> getCapitalFirstnames() {
		return Lists.newArrayList(Iterables.transform(getNameWithAbbreviations(), new Capitalise()));
	}

	public List<String> getCapitalSurname() {
		return Lists.newArrayList(Iterables.transform(surname, new Capitalise()));
	}

	public List<String> getNameWithAbbreviations() {
		return Lists.newArrayList(Iterables.transform(name, new DotAbbreviations()));
	}

	public boolean isAccurate() {
		return ((name != null) && (surname != null) && !name.isEmpty() && !surname.isEmpty());
	}
}
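
An illustrative instantiation of the deleted Person parser above (not part of the diff): when the input contains a comma, the part before it is treated as the surname and the part after it as the given name:

import eu.dnetlib.pace.model.Person;

public class PersonExample {
	public static void main(String[] args) {
		Person p = new Person("Artini, Michele", false);
		System.out.println(p.getSurnameString());      // Artini
		System.out.println(p.getNameString());         // Michele
		System.out.println(p.getNormalisedFullname()); // Artini, Michele
	}
}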
@@ -1,18 +0,0 @@

package eu.dnetlib.pace.util;

import org.apache.commons.lang3.text.WordUtils;

import com.google.common.base.Function;

public class Capitalise implements Function<String, String> {

	private final char[] DELIM = {
		' ', '-'
	};

	@Override
	public String apply(final String s) {
		return WordUtils.capitalize(s.toLowerCase(), DELIM);
	}
}
@@ -1,11 +0,0 @@

package eu.dnetlib.pace.util;

import com.google.common.base.Function;

public class DotAbbreviations implements Function<String, String> {
	@Override
	public String apply(String s) {
		return s.length() == 1 ? s + "." : s;
	}
}
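
The two helpers above are plain Guava Functions; an illustrative composition (not part of the diff) shows how Person uses them to render capitalised names with dotted initials:

import eu.dnetlib.pace.util.Capitalise;
import eu.dnetlib.pace.util.DotAbbreviations;

public class NameRenderingExample {
	public static void main(String[] args) {
		Capitalise capitalise = new Capitalise();
		DotAbbreviations dots = new DotAbbreviations();

		System.out.println(capitalise.apply("maría-josé")); // María-José
		System.out.println(dots.apply("M"));                // M.
		System.out.println(dots.apply("Michele"));          // Michele
	}
}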
@@ -1,8 +0,0 @@
van
von
der
de
dell
sig
mr
mrs
@@ -1,38 +0,0 @@
[
  {
    "paramName": "issm",
    "paramLongName": "isSparkSessionManaged",
    "paramDescription": "when true will stop SparkSession after job execution",
    "paramRequired": false
  },
  {
    "paramName": "gin",
    "paramLongName": "graphInputPath",
    "paramDescription": "the input graph root path",
    "paramRequired": true
  },
  {
    "paramName": "cp",
    "paramLongName": "checkpointPath",
    "paramDescription": "checkpoint directory",
    "paramRequired": true
  },
  {
    "paramName": "out",
    "paramLongName": "outputPath",
    "paramDescription": "the output graph root path",
    "paramRequired": true
  },
  {
    "paramName": "fi",
    "paramLongName": "filterInvisible",
    "paramDescription": "if true filters out invisible entities",
    "paramRequired": true
  },
  {
    "paramName": "isu",
    "paramLongName": "isLookupUrl",
    "paramDescription": "url to the ISLookup Service",
    "paramRequired": true
  }
]
@@ -1,20 +0,0 @@
[
  {
    "paramName": "issm",
    "paramLongName": "isSparkSessionManaged",
    "paramDescription": "when true will stop SparkSession after job execution",
    "paramRequired": false
  },
  {
    "paramName": "hmu",
    "paramLongName": "hiveMetastoreUris",
    "paramDescription": "the hive metastore uris",
    "paramRequired": true
  },
  {
    "paramName": "sql",
    "paramLongName": "sql",
    "paramDescription": "sql script to execute",
    "paramRequired": true
  }
]
@@ -154,13 +154,5 @@
   "unknown":{
     "original":"Unknown",
     "inverse":"Unknown"
-  },
-  "isamongtopnsimilardocuments": {
-    "original": "IsAmongTopNSimilarDocuments",
-    "inverse": "HasAmongTopNSimilarDocuments"
-  },
-  "hasamongtopnsimilardocuments": {
-    "original": "HasAmongTopNSimilarDocuments",
-    "inverse": "IsAmongTopNSimilarDocuments"
   }
 }
@@ -1,86 +0,0 @@
package eu.dnetlib.dhp.application

import eu.dnetlib.dhp.common.Constants
import eu.dnetlib.dhp.utils.DHPUtils.writeHdfsFile

import scala.io.Source

/** This is the main Interface SparkApplication
  * where all the Spark Scala class should inherit
  */
trait SparkScalaApplication {

  /** This is the path in the classpath of the json
    * describes all the argument needed to run
    */
  val propertyPath: String

  /** Utility to parse the arguments using the
    * property json in the classpath identified from
    * the variable propertyPath
    *
    * @param args the list of arguments
    */
  def parseArguments(args: Array[String]): ArgumentApplicationParser = {
    val parser = new ArgumentApplicationParser(
      Source.fromInputStream(getClass.getResourceAsStream(propertyPath)).mkString
    )
    parser.parseArgument(args)
    parser
  }

  /** Here all the spark applications runs this method
    * where the whole logic of the spark node is defined
    */
  def run(): Unit
}

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.slf4j.Logger

abstract class AbstractScalaApplication(
  val propertyPath: String,
  val args: Array[String],
  log: Logger
) extends SparkScalaApplication {

  var parser: ArgumentApplicationParser = null

  var spark: SparkSession = null

  def initialize(): SparkScalaApplication = {
    parser = parseArguments(args)
    spark = createSparkSession()
    this
  }

  /** Utility for creating a spark session starting from parser
    *
    * @return a spark Session
    */
  private def createSparkSession(): SparkSession = {
    require(parser != null)

    val conf: SparkConf = new SparkConf()
    val master = parser.get("master")
    log.info(s"Creating Spark session: Master: $master")
    val b = SparkSession
      .builder()
      .config(conf)
      .appName(getClass.getSimpleName)
    if (master != null)
      b.master(master)
    b.getOrCreate()
  }

  def reportTotalSize(targetPath: String, outputBasePath: String): Unit = {
    val total_items = spark.read.text(targetPath).count()
    writeHdfsFile(
      spark.sparkContext.hadoopConfiguration,
      s"$total_items",
      outputBasePath + Constants.MDSTORE_SIZE_PATH
    )
  }

}
@@ -1,10 +0,0 @@
package eu.dnetlib.dhp.application.dedup.log

case class DedupLogModel(
  tag: String,
  configuration: String,
  entity: String,
  startTS: Long,
  endTS: Long,
  totalMs: Long
) {}
@@ -1,14 +0,0 @@
package eu.dnetlib.dhp.application.dedup.log

import org.apache.spark.sql.{SaveMode, SparkSession}

class DedupLogWriter(path: String) {

  def appendLog(dedupLogModel: DedupLogModel, spark: SparkSession): Unit = {
    import spark.implicits._
    val df = spark.createDataset[DedupLogModel](data = List(dedupLogModel))
    df.write.mode(SaveMode.Append).save(path)

  }

}
@@ -1,451 +0,0 @@
package eu.dnetlib.dhp.sx.graph.scholix

import eu.dnetlib.dhp.schema.oaf.{Publication, Relation, Result, StructuredProperty}
import eu.dnetlib.dhp.schema.sx.scholix._
import eu.dnetlib.dhp.schema.sx.summary.{CollectedFromType, SchemeValue, ScholixSummary, Typology}
import eu.dnetlib.dhp.utils.DHPUtils
import org.apache.spark.sql.expressions.Aggregator
import org.apache.spark.sql.{Encoder, Encoders}
import org.json4s
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods.parse
import scala.collection.JavaConverters._
import scala.io.Source

object ScholixUtils extends Serializable {

  val DNET_IDENTIFIER_SCHEMA: String = "DNET Identifier"

  val DATE_RELATION_KEY: String = "RelationDate"

  case class RelationVocabulary(original: String, inverse: String) {}

  case class RelatedEntities(id: String, relatedDataset: Long, relatedPublication: Long) {}

  val relations: Map[String, RelationVocabulary] = {
    val input = Source
      .fromInputStream(
        getClass.getResourceAsStream("/eu/dnetlib/scholexplorer/relation/relations.json")
      )
      .mkString
    implicit lazy val formats: DefaultFormats.type = org.json4s.DefaultFormats

    lazy val json: json4s.JValue = parse(input)

    json.extract[Map[String, RelationVocabulary]]
  }

  def extractRelationDate(relation: Relation): String = {

    if (relation.getProperties == null || !relation.getProperties.isEmpty)
      null
    else {
      val date = relation.getProperties.asScala
        .find(p => DATE_RELATION_KEY.equalsIgnoreCase(p.getKey))
        .map(p => p.getValue)
      if (date.isDefined)
        date.get
      else
        null
    }
  }

  def extractRelationDate(summary: ScholixSummary): String = {

    if (summary.getDate == null || summary.getDate.isEmpty)
      null
    else {
      summary.getDate.get(0)
    }
  }

  def inverseRelationShip(rel: ScholixRelationship): ScholixRelationship = {
    new ScholixRelationship(rel.getInverse, rel.getSchema, rel.getName)

  }

  def generateScholixResourceFromResult(r: Result): ScholixResource = {
    val sum = ScholixUtils.resultToSummary(r)
    if (sum != null)
      generateScholixResourceFromSummary(ScholixUtils.resultToSummary(r))
    else
      null
  }

  val statsAggregator: Aggregator[(String, String, Long), RelatedEntities, RelatedEntities] =
    new Aggregator[(String, String, Long), RelatedEntities, RelatedEntities] with Serializable {
      override def zero: RelatedEntities = null

      override def reduce(b: RelatedEntities, a: (String, String, Long)): RelatedEntities = {
        val relatedDataset = if ("dataset".equalsIgnoreCase(a._2)) a._3 else 0
        val relatedPublication = if ("publication".equalsIgnoreCase(a._2)) a._3 else 0

        if (b == null)
          RelatedEntities(a._1, relatedDataset, relatedPublication)
        else
          RelatedEntities(
            a._1,
            b.relatedDataset + relatedDataset,
            b.relatedPublication + relatedPublication
          )
      }

      override def merge(b1: RelatedEntities, b2: RelatedEntities): RelatedEntities = {
        if (b1 != null && b2 != null)
          RelatedEntities(
            b1.id,
            b1.relatedDataset + b2.relatedDataset,
            b1.relatedPublication + b2.relatedPublication
          )
        else if (b1 != null)
          b1
        else
          b2
      }

      override def finish(reduction: RelatedEntities): RelatedEntities = reduction

      override def bufferEncoder: Encoder[RelatedEntities] = Encoders.bean(classOf[RelatedEntities])

      override def outputEncoder: Encoder[RelatedEntities] = Encoders.bean(classOf[RelatedEntities])
    }

  val scholixAggregator: Aggregator[(String, Scholix), Scholix, Scholix] =
    new Aggregator[(String, Scholix), Scholix, Scholix] with Serializable {
      override def zero: Scholix = null

      def scholix_complete(s: Scholix): Boolean = {
        if (s == null || s.getIdentifier == null) {
          false
        } else if (s.getSource == null || s.getTarget == null) {
          false
        } else if (s.getLinkprovider == null || s.getLinkprovider.isEmpty)
          false
        else
          true
      }

      override def reduce(b: Scholix, a: (String, Scholix)): Scholix = {
        if (scholix_complete(b)) b else a._2
      }

      override def merge(b1: Scholix, b2: Scholix): Scholix = {
        if (scholix_complete(b1)) b1 else b2
      }

      override def finish(reduction: Scholix): Scholix = reduction

      override def bufferEncoder: Encoder[Scholix] = Encoders.kryo[Scholix]

      override def outputEncoder: Encoder[Scholix] = Encoders.kryo[Scholix]
    }

  def createInverseScholixRelation(scholix: Scholix): Scholix = {
    val s = new Scholix
    s.setPublicationDate(scholix.getPublicationDate)
    s.setPublisher(scholix.getPublisher)
    s.setLinkprovider(scholix.getLinkprovider)
    s.setRelationship(inverseRelationShip(scholix.getRelationship))
    s.setSource(scholix.getTarget)
    s.setTarget(scholix.getSource)
    s.setIdentifier(
      DHPUtils.md5(
        s"${s.getSource.getIdentifier}::${s.getRelationship.getName}::${s.getTarget.getIdentifier}"
      )
    )
    s

  }

  def invRel(rel: String): String = {
    val semanticRelation = relations.getOrElse(rel.toLowerCase, null)
    if (semanticRelation != null)
      semanticRelation.inverse
    else
      null
  }

  def extractCollectedFrom(summary: ScholixResource): List[ScholixEntityId] = {
    if (summary.getCollectedFrom != null && !summary.getCollectedFrom.isEmpty) {
      val l: List[ScholixEntityId] = summary.getCollectedFrom.asScala.map { d =>
        new ScholixEntityId(d.getProvider.getName, d.getProvider.getIdentifiers)
      }(collection.breakOut)
      l
    } else List()
  }

  def extractCollectedFrom(summary: ScholixSummary): List[ScholixEntityId] = {
    if (summary.getDatasources != null && !summary.getDatasources.isEmpty) {
      val l: List[ScholixEntityId] = summary.getDatasources.asScala.map { d =>
        new ScholixEntityId(
          d.getDatasourceName,
          List(new ScholixIdentifier(d.getDatasourceId, "DNET Identifier", null)).asJava
        )
      }(collection.breakOut)
      l
    } else List()
  }

  def extractCollectedFrom(relation: Relation): List[ScholixEntityId] = {
    if (relation.getCollectedfrom != null && !relation.getCollectedfrom.isEmpty) {

      val l: List[ScholixEntityId] = relation.getCollectedfrom.asScala.map { c =>
        new ScholixEntityId(
          c.getValue,
          List(new ScholixIdentifier(c.getKey, DNET_IDENTIFIER_SCHEMA, null)).asJava
        )
      }.toList
      l
    } else List()
  }

  def generateCompleteScholix(scholix: Scholix, target: ScholixSummary): Scholix = {
    val s = new Scholix
    s.setPublicationDate(scholix.getPublicationDate)
    s.setPublisher(scholix.getPublisher)
    s.setLinkprovider(scholix.getLinkprovider)
    s.setRelationship(scholix.getRelationship)
    s.setSource(scholix.getSource)
    s.setTarget(generateScholixResourceFromSummary(target))
    s.setIdentifier(
      DHPUtils.md5(
        s"${s.getSource.getIdentifier}::${s.getRelationship.getName}::${s.getTarget.getIdentifier}"
      )
    )
    s
  }

  def generateCompleteScholix(scholix: Scholix, target: ScholixResource): Scholix = {
    val s = new Scholix
    s.setPublicationDate(scholix.getPublicationDate)
    s.setPublisher(scholix.getPublisher)
    s.setLinkprovider(scholix.getLinkprovider)
    s.setRelationship(scholix.getRelationship)
    s.setSource(scholix.getSource)
    s.setTarget(target)
    s.setIdentifier(
      DHPUtils.md5(
        s"${s.getSource.getIdentifier}::${s.getRelationship.getName}::${s.getTarget.getIdentifier}"
      )
    )
    s
  }

  def generateScholixResourceFromSummary(summaryObject: ScholixSummary): ScholixResource = {
    val r = new ScholixResource
    r.setIdentifier(summaryObject.getLocalIdentifier)
    r.setDnetIdentifier(summaryObject.getId)

    r.setObjectType(summaryObject.getTypology.toString)
    r.setObjectSubType(summaryObject.getSubType)

    if (summaryObject.getTitle != null && !summaryObject.getTitle.isEmpty)
      r.setTitle(summaryObject.getTitle.get(0))

    if (summaryObject.getAuthor != null && !summaryObject.getAuthor.isEmpty) {
      val l: List[ScholixEntityId] =
        summaryObject.getAuthor.asScala.map(a => new ScholixEntityId(a, null)).toList
      if (l.nonEmpty)
        r.setCreator(l.asJava)
    }

    if (summaryObject.getDate != null && !summaryObject.getDate.isEmpty)
      r.setPublicationDate(summaryObject.getDate.get(0))
    if (summaryObject.getPublisher != null && !summaryObject.getPublisher.isEmpty) {
      val plist: List[ScholixEntityId] =
        summaryObject.getPublisher.asScala.map(p => new ScholixEntityId(p, null)).toList

      if (plist.nonEmpty)
        r.setPublisher(plist.asJava)
    }

    if (summaryObject.getDatasources != null && !summaryObject.getDatasources.isEmpty) {

      val l: List[ScholixCollectedFrom] = summaryObject.getDatasources.asScala
        .map(c =>
          new ScholixCollectedFrom(
            new ScholixEntityId(
              c.getDatasourceName,
              List(new ScholixIdentifier(c.getDatasourceId, DNET_IDENTIFIER_SCHEMA, null)).asJava
            ),
            "collected",
            "complete"
          )
        )
        .toList

      if (l.nonEmpty)
        r.setCollectedFrom(l.asJava)

    }
    r
  }

  def scholixFromSource(relation: Relation, source: ScholixResource): Scholix = {
    if (relation == null || source == null)
      return null
    val s = new Scholix
    var l: List[ScholixEntityId] = extractCollectedFrom(relation)
    if (l.isEmpty)
      l = extractCollectedFrom(source)
    if (l.isEmpty)
      return null
    s.setLinkprovider(l.asJava)
    var d = extractRelationDate(relation)
    if (d == null)
      d = source.getPublicationDate

    s.setPublicationDate(d)

    if (source.getPublisher != null && !source.getPublisher.isEmpty) {
      s.setPublisher(source.getPublisher)
    }

    val semanticRelation = relations.getOrElse(relation.getRelClass.toLowerCase, null)
    if (semanticRelation == null)
      return null
    s.setRelationship(
      new ScholixRelationship(semanticRelation.original, "datacite", semanticRelation.inverse)
    )
    s.setSource(source)

    s
  }

  def scholixFromSource(relation: Relation, source: ScholixSummary): Scholix = {

    if (relation == null || source == null)
      return null

    val s = new Scholix

    var l: List[ScholixEntityId] = extractCollectedFrom(relation)
    if (l.isEmpty)
      l = extractCollectedFrom(source)
    if (l.isEmpty)
      return null

    s.setLinkprovider(l.asJava)

    var d = extractRelationDate(relation)
    if (d == null)
      d = extractRelationDate(source)

    s.setPublicationDate(d)

    if (source.getPublisher != null && !source.getPublisher.isEmpty) {
      val l: List[ScholixEntityId] = source.getPublisher.asScala
        .map { p =>
          new ScholixEntityId(p, null)
        }(collection.breakOut)

      if (l.nonEmpty)
        s.setPublisher(l.asJava)
    }

    val semanticRelation = relations.getOrElse(relation.getRelClass.toLowerCase, null)
    if (semanticRelation == null)
      return null
    s.setRelationship(
      new ScholixRelationship(semanticRelation.original, "datacite", semanticRelation.inverse)
    )
    s.setSource(generateScholixResourceFromSummary(source))

    s
  }

  def findURLForPID(
    pidValue: List[StructuredProperty],
    urls: List[String]
  ): List[(StructuredProperty, String)] = {
    pidValue.map { p =>
      val pv = p.getValue

      val r = urls.find(u => u.toLowerCase.contains(pv.toLowerCase))
      (p, r.orNull)
    }
  }

  def extractTypedIdentifierFromInstance(r: Result): List[ScholixIdentifier] = {
    if (r.getInstance() == null || r.getInstance().isEmpty)
      return List()
    r.getInstance()
      .asScala
      .filter(i => i.getUrl != null && !i.getUrl.isEmpty)
      .filter(i => i.getPid != null && i.getUrl != null)
      .flatMap(i => findURLForPID(i.getPid.asScala.toList, i.getUrl.asScala.toList))
      .map(i => new ScholixIdentifier(i._1.getValue, i._1.getQualifier.getClassid, i._2))
      .distinct
      .toList
  }

  def resultToSummary(r: Result): ScholixSummary = {
    val s = new ScholixSummary
    s.setId(r.getId)
    if (r.getPid == null || r.getPid.isEmpty)
      return null

    val persistentIdentifiers: List[ScholixIdentifier] = extractTypedIdentifierFromInstance(r)
    if (persistentIdentifiers.isEmpty)
      return null
    s.setLocalIdentifier(persistentIdentifiers.asJava)
    // s.setTypology(r.getResulttype.getClassid)

    s.setSubType(r.getInstance().get(0).getInstancetype.getClassname)

    if (r.getTitle != null && r.getTitle.asScala.nonEmpty) {
      val titles: List[String] = r.getTitle.asScala.map(t => t.getValue).toList
      if (titles.nonEmpty)
        s.setTitle(titles.asJava)
      else
        return null
    }

    if (r.getAuthor != null && !r.getAuthor.isEmpty) {
      val authors: List[String] = r.getAuthor.asScala.map(a => a.getFullname).toList
      if (authors.nonEmpty)
        s.setAuthor(authors.asJava)
    }
    if (r.getInstance() != null) {
      val dt: List[String] = r
        .getInstance()
        .asScala
        .filter(i => i.getDateofacceptance != null)
        .map(i => i.getDateofacceptance.getValue)
        .toList
      if (dt.nonEmpty)
        s.setDate(dt.distinct.asJava)
    }
    if (r.getDescription != null && !r.getDescription.isEmpty) {
      val d = r.getDescription.asScala.find(f => f != null && f.getValue != null)
      if (d.isDefined)
        s.setDescription(d.get.getValue)
    }

    if (r.getSubject != null && !r.getSubject.isEmpty) {
      val subjects: List[SchemeValue] = r.getSubject.asScala
        .map(s => new SchemeValue(s.getQualifier.getClassname, s.getValue))
        .toList
      if (subjects.nonEmpty)
        s.setSubject(subjects.asJava)
    }

    if (r.getPublisher != null)
      s.setPublisher(List(r.getPublisher.getValue).asJava)

    if (r.getCollectedfrom != null && !r.getCollectedfrom.isEmpty) {
      val cf: List[CollectedFromType] = r.getCollectedfrom.asScala
        .map(c => new CollectedFromType(c.getValue, c.getKey, "complete"))
        .toList
      if (cf.nonEmpty)
        s.setDatasources(cf.distinct.asJava)
    }

    s.setRelatedDatasets(0)
    s.setRelatedPublications(0)
    s.setRelatedUnknown(0)

    s
  }

}
@ -1,36 +0,0 @@
package eu.dnetlib.dhp.common;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

import org.junit.jupiter.api.Test;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MdStoreClientTest {

    // @Test
    public void testMongoCollection() throws IOException {
        final MdstoreClient client = new MdstoreClient("mongodb://localhost:27017", "mdstore");

        final ObjectMapper mapper = new ObjectMapper();

        final List<MDStoreInfo> infos = client.mdStoreWithTimestamp("ODF", "store", "cleaned");

        infos.forEach(System.out::println);

        final String s = mapper.writeValueAsString(infos);

        Path fileName = Paths.get("/Users/sandro/mdstore_info.json");

        // Writing into the file
        Files.write(fileName, s.getBytes(StandardCharsets.UTF_8));

    }
}
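
Note that the test above is disabled (its @Test annotation is commented out) and writes to a hard-coded local path, so it only serves as a manual probe. As a rough, hypothetical alternative that does not depend on MdstoreClient (whose internals are not part of this diff), one could inspect the same "mdstore" database directly with the MongoDB sync driver; the class name and connection details below are illustrative only.

// Minimal sketch (assumption): list the collections of the "mdstore" database with the
// plain MongoDB sync driver; this is not the MdstoreClient API shown in the test above.
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

public class MdStoreInspect {
    public static void main(String[] args) {
        // Hypothetical connection string and database name, mirroring the disabled test
        try (MongoClient mongo = MongoClients.create("mongodb://localhost:27017")) {
            MongoDatabase db = mongo.getDatabase("mdstore");
            // Print the collection names; the real MdstoreClient adds timestamp/metadata handling
            for (String name : db.listCollectionNames()) {
                System.out.println(name);
            }
        }
    }
}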
@ -0,0 +1,109 @@
package eu.dnetlib.dhp.common.api;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;

@Disabled
class ZenodoAPIClientTest {

    private final String URL_STRING = "https://sandbox.zenodo.org/api/deposit/depositions";
    private final String ACCESS_TOKEN = "";

    private final String CONCEPT_REC_ID = "657113";

    private final String depositionId = "674915";

    @Test
    void testUploadOldDeposition() throws IOException, MissingConceptDoiException {
        ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
            ACCESS_TOKEN);
        Assertions.assertEquals(200, client.uploadOpenDeposition(depositionId));

        File file = new File(getClass()
            .getResource("/eu/dnetlib/dhp/common/api/COVID-19.json.gz")
            .getPath());

        InputStream is = new FileInputStream(file);

        Assertions.assertEquals(200, client.uploadIS(is, "COVID-19.json.gz", file.length()));

        String metadata = IOUtils.toString(getClass().getResourceAsStream("/eu/dnetlib/dhp/common/api/metadata.json"));

        Assertions.assertEquals(200, client.sendMretadata(metadata));

        Assertions.assertEquals(202, client.publish());

    }

    @Test
    void testNewDeposition() throws IOException {

        ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
            ACCESS_TOKEN);
        Assertions.assertEquals(201, client.newDeposition());

        File file = new File(getClass()
            .getResource("/eu/dnetlib/dhp/common/api/COVID-19.json.gz")
            .getPath());

        InputStream is = new FileInputStream(file);

        Assertions.assertEquals(200, client.uploadIS(is, "COVID-19.json.gz", file.length()));

        String metadata = IOUtils.toString(getClass().getResourceAsStream("/eu/dnetlib/dhp/common/api/metadata.json"));

        Assertions.assertEquals(200, client.sendMretadata(metadata));

        Assertions.assertEquals(202, client.publish());

    }

    @Test
    void testNewVersionNewName() throws IOException, MissingConceptDoiException {

        ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
            ACCESS_TOKEN);

        Assertions.assertEquals(201, client.newVersion(CONCEPT_REC_ID));

        File file = new File(getClass()
            .getResource("/eu/dnetlib/dhp/common/api/newVersion")
            .getPath());

        InputStream is = new FileInputStream(file);

        Assertions.assertEquals(200, client.uploadIS(is, "newVersion_deposition", file.length()));

        Assertions.assertEquals(202, client.publish());

    }

    @Test
    void testNewVersionOldName() throws IOException, MissingConceptDoiException {

        ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
            ACCESS_TOKEN);

        Assertions.assertEquals(201, client.newVersion(CONCEPT_REC_ID));

        File file = new File(getClass()
            .getResource("/eu/dnetlib/dhp/common/api/newVersion2")
            .getPath());

        InputStream is = new FileInputStream(file);

        Assertions.assertEquals(200, client.uploadIS(is, "newVersion_deposition", file.length()));

        Assertions.assertEquals(202, client.publish());

    }

}
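
The tests above drive the full deposit lifecycle against the Zenodo sandbox (create or reopen a deposition, upload a file, send metadata, publish) and are @Disabled because they require a real access token. A minimal, hypothetical sketch of the raw REST call that such a client presumably wraps is shown below; the endpoint mirrors URL_STRING, while the ZENODO_TOKEN variable, the empty JSON body, and the Bearer header are assumptions, not taken from ZenodoAPIClient.

// Minimal sketch (assumption): POST an empty deposition to the sandbox endpoint used above.
// Token handling, body, and header names are illustrative, not the project's client API.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ZenodoDepositionSketch {
    public static void main(String[] args) throws Exception {
        String url = "https://sandbox.zenodo.org/api/deposit/depositions";
        String accessToken = System.getenv("ZENODO_TOKEN"); // do not hard-code tokens

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
            .header("Content-Type", "application/json")
            .header("Authorization", "Bearer " + accessToken)
            .POST(HttpRequest.BodyPublishers.ofString("{}")) // empty deposition
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        // A newly created deposition is expected to answer 201, as asserted in testNewDeposition()
        System.out.println(response.statusCode());
    }
}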
@ -0,0 +1,100 @@
package eu.dnetlib.dhp.oa.merge;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.schema.oaf.Author;
import eu.dnetlib.dhp.schema.oaf.Publication;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
import eu.dnetlib.pace.util.MapDocumentUtil;
import scala.Tuple2;

class AuthorMergerTest {

    private String publicationsBasePath;

    private List<List<Author>> authors;

    @BeforeEach
    public void setUp() throws Exception {

        publicationsBasePath = Paths
            .get(AuthorMergerTest.class.getResource("/eu/dnetlib/dhp/oa/merge").toURI())
            .toFile()
            .getAbsolutePath();

        authors = readSample(publicationsBasePath + "/publications_with_authors.json", Publication.class)
            .stream()
            .map(p -> p._2().getAuthor())
            .collect(Collectors.toList());

    }

    @Test
    void mergeTest() { // used in the dedup: threshold set to 0.95

        for (List<Author> authors1 : authors) {
            System.out.println("List " + (authors.indexOf(authors1) + 1));
            for (Author author : authors1) {
                System.out.println(authorToString(author));
            }
        }

        List<Author> merge = AuthorMerger.merge(authors);

        System.out.println("Merge ");
        for (Author author : merge) {
            System.out.println(authorToString(author));
        }

        Assertions.assertEquals(7, merge.size());

    }

    public <T> List<Tuple2<String, T>> readSample(String path, Class<T> clazz) {
        List<Tuple2<String, T>> res = new ArrayList<>();
        BufferedReader reader;
        try {
            reader = new BufferedReader(new FileReader(path));
            String line = reader.readLine();
            while (line != null) {
                res
                    .add(
                        new Tuple2<>(
                            MapDocumentUtil.getJPathString("$.id", line),
                            new ObjectMapper().readValue(line, clazz)));
                // read next line
                line = reader.readLine();
            }
            reader.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        return res;
    }

    public String authorToString(Author a) {

        String print = "Fullname = ";
        print += a.getFullname() + " pid = [";
        if (a.getPid() != null)
            for (StructuredProperty sp : a.getPid()) {
                print += sp.toComparableString() + " ";
            }
        print += "]";
        return print;
    }
}
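
The readSample helper above opens a FileReader without try-with-resources, builds a new ObjectMapper per line, and swallows IOExceptions. A possible alternative, sketched below under the assumption that the same behaviour is wanted, closes the reader automatically and reuses a single mapper; the SampleReader class name is illustrative, only the MapDocumentUtil and Tuple2 usage is taken from the test.

// Minimal sketch (assumption): a readSample(...) variant with try-with-resources and a
// shared ObjectMapper. Not part of the diff above; the class name is hypothetical.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.pace.util.MapDocumentUtil;
import scala.Tuple2;

public class SampleReader {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static <T> List<Tuple2<String, T>> readSample(String path, Class<T> clazz) throws IOException {
        List<Tuple2<String, T>> res = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Pair each record's JSON-path id with the deserialised bean
                res.add(new Tuple2<>(
                    MapDocumentUtil.getJPathString("$.id", line),
                    MAPPER.readValue(line, clazz)));
            }
        }
        return res;
    }
}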
@ -1,21 +0,0 @@
package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Set;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;

class BlackListProviderTest {

    @Test
    void blackListTest() {

        Assertions.assertNotNull(PidBlacklistProvider.getBlacklist());
        Assertions.assertNotNull(PidBlacklistProvider.getBlacklist().get("doi"));
        Assertions.assertTrue(PidBlacklistProvider.getBlacklist().get("doi").size() > 0);
        final Set<String> xxx = PidBlacklistProvider.getBlacklist("xxx");
        Assertions.assertNotNull(xxx);
        Assertions.assertEquals(0, xxx.size());
    }
}
@ -1,18 +0,0 @@
package eu.dnetlib.dhp.schema.oaf.utils;

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class GridCleaningRuleTest {

    @Test
    void testCleaning() {
        assertEquals("grid.493784.5", GridCleaningRule.clean("grid.493784.5"));
        assertEquals("grid.493784.5x", GridCleaningRule.clean("grid.493784.5x"));
        assertEquals("grid.493784.5x", GridCleaningRule.clean("493784.5x"));
        assertEquals("", GridCleaningRule.clean("493x784.5x"));
    }

}
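
GridCleaningRule itself is not shown in this diff; the sketch below is one possible normalisation that satisfies the four assertions above (extract 4-6 digits, a dot, and 1-2 trailing alphanumerics, then prefix with "grid."). The class and pattern names are illustrative, not the project's implementation.

// Minimal sketch (assumption): a GRID normalisation consistent with the assertions above.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GridCleaningSketch {

    // 4-6 digits, a dot, then 1-2 alphanumeric characters (e.g. 493784.5, 493784.5x)
    private static final Pattern GRID = Pattern.compile("\\d{4,6}\\.[0-9a-z]{1,2}");

    public static String clean(String value) {
        Matcher m = GRID.matcher(value.toLowerCase());
        // Prefix the bare identifier with "grid."; return "" when no identifier is found
        return m.find() ? "grid." + m.group() : "";
    }

    public static void main(String[] args) {
        System.out.println(clean("grid.493784.5")); // grid.493784.5
        System.out.println(clean("493784.5x"));     // grid.493784.5x
        System.out.println(clean("493x784.5x"));    // (empty)
    }
}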
@ -1,19 +0,0 @@
package eu.dnetlib.dhp.schema.oaf.utils;

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class ISNICleaningRuleTest {

    @Test
    void testCleaning() {
        assertEquals("0000000463436020", ISNICleaningRule.clean("0000 0004 6343 6020"));
        assertEquals("0000000463436020", ISNICleaningRule.clean("0000000463436020"));
        assertEquals("", ISNICleaningRule.clean("Q30256598"));
        assertEquals("0000000493403529", ISNICleaningRule.clean("ISNI:0000000493403529"));
        assertEquals("000000008614884X", ISNICleaningRule.clean("0000 0000 8614 884X"));
    }

}
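
As with the GRID rule, ISNICleaningRule is not part of this diff. A normalisation consistent with the assertions above would drop whitespace and any "ISNI:" prefix and keep a 16-character block of 15 digits plus a digit or X check character; the sketch below is an assumption, not the project's code.

// Minimal sketch (assumption): an ISNI normalisation consistent with the assertions above.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class IsniCleaningSketch {

    // 15 digits followed by a digit or the check character X
    private static final Pattern ISNI = Pattern.compile("\\d{15}[0-9Xx]");

    public static String clean(String value) {
        // Drop whitespace so "0000 0004 6343 6020" collapses to one 16-character block
        Matcher m = ISNI.matcher(value.replaceAll("\\s", ""));
        return m.find() ? m.group().toUpperCase() : "";
    }
}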
@ -1,87 +0,0 @@
package eu.dnetlib.dhp.schema.oaf.utils;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;

import java.io.IOException;

import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.Test;

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.schema.oaf.Publication;

class IdentifierFactoryTest {

    private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper()
        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    @Test
    void testCreateIdentifierForPublication() throws IOException {

        verifyIdentifier(
            "publication_doi1.json", "50|doi_________::79dbc7a2a56dc1532659f9038843256e", true);

        verifyIdentifier(
            "publication_doi2.json", "50|doi_________::79dbc7a2a56dc1532659f9038843256e", true);

        verifyIdentifier(
            "publication_doi3.json", "50|pmc_________::e2a339e0e11bfbf55462e14a07f1b304", true);

        verifyIdentifier(
            "publication_doi4.json", "50|od______2852::38861c44e6052a8d49f59a4c39ba5e66", true);

        verifyIdentifier(
            "publication_doi5.json", "50|doi_________::3bef95c0ca26dd55451fc8839ea69d27", true);

        verifyIdentifier(
            "publication_pmc1.json", "50|DansKnawCris::0829b5191605bdbea36d6502b8c1ce1f", true);

        verifyIdentifier(
            "publication_pmc2.json", "50|pmc_________::e2a339e0e11bfbf55462e14a07f1b304", true);

        verifyIdentifier(
            "publication_openapc.json", "50|doi_________::79dbc7a2a56dc1532659f9038843256e", true);

        final String defaultID = "50|DansKnawCris::0829b5191605bdbea36d6502b8c1ce1f";
        verifyIdentifier("publication_3.json", defaultID, true);
        verifyIdentifier("publication_4.json", defaultID, true);
        verifyIdentifier("publication_5.json", defaultID, true);

    }

    @Test
    void testCreateIdentifierForPublicationNoHash() throws IOException {

        verifyIdentifier("publication_doi1.json", "50|doi_________::10.1016/j.cmet.2010.03.013", false);
        verifyIdentifier("publication_doi2.json", "50|doi_________::10.1016/j.cmet.2010.03.013", false);
        verifyIdentifier("publication_pmc1.json", "50|DansKnawCris::0829b5191605bdbea36d6502b8c1ce1f", false);
        verifyIdentifier(
            "publication_urn1.json", "50|DansKnawCris::0829b5191605bdbea36d6502b8c1ce1f", false);

        final String defaultID = "50|DansKnawCris::0829b5191605bdbea36d6502b8c1ce1f";
        verifyIdentifier("publication_3.json", defaultID, false);
        verifyIdentifier("publication_4.json", defaultID, false);
        verifyIdentifier("publication_5.json", defaultID, false);
    }

    @Test
    void testCreateIdentifierForROHub() throws IOException {
        verifyIdentifier(
            "orp-rohub.json", "50|w3id________::afc7592914ae190a50570db90f55f9c2", true);
    }

    protected void verifyIdentifier(String filename, String expectedID, boolean md5) throws IOException {
        final String json = IOUtils.toString(getClass().getResourceAsStream(filename));
        final Publication pub = OBJECT_MAPPER.readValue(json, Publication.class);

        String id = IdentifierFactory.createIdentifier(pub, md5);
        System.out.println(id);
        assertNotNull(id);
        assertEquals(expectedID, id);
    }

}
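
The expected identifiers above suggest a fixed layout: the entity prefix "50|", a 12-character namespace prefix (e.g. "doi_________"), "::", and either an MD5 of the pid value (md5 = true) or the raw value (md5 = false). The sketch below reproduces that layout as an assumption inferred from the fixtures; ResultIdSketch and createId are illustrative names, not the IdentifierFactory API.

// Minimal sketch (assumption): compose "50|<12-char namespace>::<md5(pid) or raw pid>"
// as suggested by the expected IDs in the tests above.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class ResultIdSketch {

    public static String createId(String nsPrefix, String pidValue, boolean md5) throws Exception {
        // Right-pad short namespace prefixes with underscores up to 12 characters
        String prefix = String.format("%-12s", nsPrefix).replace(' ', '_');
        return "50|" + prefix + "::" + (md5 ? md5Hex(pidValue) : pidValue);
    }

    private static String md5Hex(String s) throws Exception {
        byte[] digest = MessageDigest.getInstance("MD5").digest(s.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // With md5 = false this reproduces the "NoHash" expectation for publication_doi1.json
        System.out.println(createId("doi", "10.1016/j.cmet.2010.03.013", false));
    }
}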
@ -1,130 +0,0 @@
package eu.dnetlib.dhp.schema.oaf.utils;

import static org.junit.jupiter.api.Assertions.*;
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.util.HashSet;
import java.util.List;
import java.util.stream.Collectors;

import org.apache.commons.beanutils.BeanUtils;
import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.Test;

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.collect.Lists;

import eu.dnetlib.dhp.schema.common.ModelConstants;
import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.*;

public class MergeUtilsTest {

    private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper()
        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    @Test
    void testMergePubs_new() throws IOException {
        Publication pt = read("publication_test.json", Publication.class);
        Publication p1 = read("publication_test.json", Publication.class);

        assertEquals(1, pt.getCollectedfrom().size());
        assertEquals(ModelConstants.CROSSREF_ID, pt.getCollectedfrom().get(0).getKey());

        Instance i = new Instance();
        i.setUrl(Lists.newArrayList("https://..."));
        p1.getInstance().add(i);

        Publication ptp1 = MergeUtils.mergePublication(pt, p1);

        assertNotNull(ptp1.getInstance());
        assertEquals(2, ptp1.getInstance().size());

    }

    @Test
    void testMergePubs() throws IOException {
        Publication p1 = read("publication_1.json", Publication.class);
        Publication p2 = read("publication_2.json", Publication.class);
        Dataset d1 = read("dataset_1.json", Dataset.class);
        Dataset d2 = read("dataset_2.json", Dataset.class);

        assertEquals(1, p1.getCollectedfrom().size());
        assertEquals(ModelConstants.CROSSREF_ID, p1.getCollectedfrom().get(0).getKey());
        assertEquals(1, d2.getCollectedfrom().size());
        assertFalse(cfId(d2.getCollectedfrom()).contains(ModelConstants.CROSSREF_ID));

        assertEquals(1, p2.getCollectedfrom().size());
        assertFalse(cfId(p2.getCollectedfrom()).contains(ModelConstants.CROSSREF_ID));
        assertEquals(1, d1.getCollectedfrom().size());
        assertTrue(cfId(d1.getCollectedfrom()).contains(ModelConstants.CROSSREF_ID));

        final Result p1d2 = MergeUtils.checkedMerge(p1, d2, true);
        assertEquals(ModelConstants.PUBLICATION_RESULTTYPE_CLASSID, p1d2.getResulttype().getClassid());
        assertTrue(p1d2 instanceof Publication);
        assertEquals(p1.getId(), p1d2.getId());
    }

    @Test
    void testMergePubs_1() throws IOException {
        Publication p2 = read("publication_2.json", Publication.class);
        Dataset d1 = read("dataset_1.json", Dataset.class);

        final Result p2d1 = MergeUtils.checkedMerge(p2, d1, true);
        assertEquals((ModelConstants.DATASET_RESULTTYPE_CLASSID), p2d1.getResulttype().getClassid());
        assertTrue(p2d1 instanceof Dataset);
        assertEquals(d1.getId(), p2d1.getId());
        assertEquals(2, p2d1.getCollectedfrom().size());
    }

    @Test
    void testMergePubs_2() throws IOException {
        Publication p1 = read("publication_1.json", Publication.class);
        Publication p2 = read("publication_2.json", Publication.class);

        Result p1p2 = MergeUtils.checkedMerge(p1, p2, true);
        assertTrue(p1p2 instanceof Publication);
        assertEquals(p1.getId(), p1p2.getId());
        assertEquals(2, p1p2.getCollectedfrom().size());
    }

    @Test
    void testDelegatedAuthority_1() throws IOException {
        Dataset d1 = read("dataset_2.json", Dataset.class);
        Dataset d2 = read("dataset_delegated.json", Dataset.class);

        assertEquals(1, d2.getCollectedfrom().size());
        assertTrue(cfId(d2.getCollectedfrom()).contains(ModelConstants.ZENODO_OD_ID));

        Result res = (Result) MergeUtils.merge(d1, d2, true);

        assertEquals(d2, res);
    }

    @Test
    void testDelegatedAuthority_2() throws IOException {
        Dataset p1 = read("publication_1.json", Dataset.class);
        Dataset d2 = read("dataset_delegated.json", Dataset.class);

        assertEquals(1, d2.getCollectedfrom().size());
        assertTrue(cfId(d2.getCollectedfrom()).contains(ModelConstants.ZENODO_OD_ID));

        Result res = (Result) MergeUtils.merge(p1, d2, true);

        assertEquals(d2, res);
    }

    protected HashSet<String> cfId(List<KeyValue> collectedfrom) {
        return collectedfrom.stream().map(KeyValue::getKey).collect(Collectors.toCollection(HashSet::new));
    }

    protected <T extends Result> T read(String filename, Class<T> clazz) throws IOException {
        final String json = IOUtils.toString(getClass().getResourceAsStream(filename));
        return OBJECT_MAPPER.readValue(json, clazz);
    }

}
Some files were not shown because too many files have changed in this diff.