Compare commits
No commits in common. "main" and "unique_field_in_lists" have entirely different histories.
main ... unique_field_in_lists
@@ -3,10 +3,10 @@
-*.iws
-*.ipr
 *.iml
+*.ipr
+*.iws
 *~
 .vscode
 .metals
 .bloop
 .classpath
 /*/.classpath
 /*/*/.classpath
@@ -24,7 +24,4 @@
 spark-warehouse
 /**/job-override.properties
 /**/*.log
 /**/.factorypath
-/**/.scalafmt.conf
-/.java-version
-/dhp-shade-package/dependency-reduced-pom.xml
@@ -1,21 +0,0 @@
style = defaultWithAlign

align.openParenCallSite = false
align.openParenDefnSite = false
align.tokens = [{code = "->"}, {code = "<-"}, {code = "=>", owner = "Case"}]
continuationIndent.callSite = 2
continuationIndent.defnSite = 2
danglingParentheses = true
indentOperator = spray
maxColumn = 120
newlines.alwaysBeforeTopLevelStatements = true
project.excludeFilters = [".*\\.sbt"]
rewrite.rules = [AvoidInfix]
rewrite.rules = [ExpandImportSelectors]
rewrite.rules = [RedundantBraces]
rewrite.rules = [RedundantParens]
rewrite.rules = [SortImports]
rewrite.rules = [SortModifiers]
rewrite.rules = [PreferCurlyFors]
spaces.inImportCurlyBraces = false
unindentTopLevelOperators = true
@@ -1,43 +0,0 @@
# Contributor Code of Conduct

Openness, transparency and our community-driven participatory approach guide us in our day-to-day interactions and decision-making. Our open source projects are no exception. Trust, respect, collaboration and transparency are core values we believe should live and breathe within our projects. Our community welcomes participants from around the world with different experiences, unique perspectives, and great ideas to share.

## Our Pledge

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment include:

- Using welcoming and inclusive language
- Being respectful of differing viewpoints and experiences
- Gracefully accepting constructive criticism
- Attempting collaboration before conflict
- Focusing on what is best for the community
- Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

- Violence, threats of violence, or inciting others to commit self-harm
- The use of sexualized language or imagery and unwelcome sexual attention or advances
- Trolling, intentionally spreading misinformation, insulting/derogatory comments, and personal or political attacks
- Public or private harassment
- Publishing others' private information, such as a physical or electronic address, without explicit permission
- Abuse of the reporting process to intentionally harass or exclude others
- Advocating for, or encouraging, any of the above behavior
- Other conduct which could reasonably be considered inappropriate in a professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant](https://www.contributor-covenant.org/), [version 1.4](https://www.contributor-covenant.org/version/1/4/code-of-conduct.html).
@@ -1,10 +0,0 @@
# Contributing to D-Net Hadoop

:+1::tada: First off, thanks for taking the time to contribute! :tada::+1:

This project and everyone participating in it is governed by our [Code of Conduct](CODE_OF_CONDUCT.md). By participating, you are expected to uphold this code. Please report unacceptable behavior to [dnet-team@isti.cnr.it](mailto:dnet-team@isti.cnr.it).

The following is a set of guidelines for contributing to this project and its packages. These are mostly guidelines, not rules, which apply to this project as a whole, including all its sub-modules.
Use your best judgment, and feel free to propose changes to this document in a pull request.

All contributions are welcome; all contributions will be considered to be contributed under the [project license](LICENSE.md).
README.md
@@ -1,133 +1,2 @@
# dnet-hadoop

Dnet-hadoop is the project that defines all the [OOZIE workflows](https://oozie.apache.org/) for the OpenAIRE Graph construction, processing and provisioning.

This project adheres to the Contributor Covenant [code of conduct](CODE_OF_CONDUCT.md).
By participating, you are expected to uphold this code. Please report unacceptable behavior to [dnet-team@isti.cnr.it](mailto:dnet-team@isti.cnr.it).

This project is licensed under the [AGPL v3 or later version](#LICENSE.md).

How to build, package and run oozie workflows
====================

Oozie-installer is a utility that allows building, uploading and running oozie workflows. In practice, it creates a `*.tar.gz`
package that contains the resources defining a workflow and some helper scripts.

This module is automatically executed when running:

`mvn package -Poozie-package -Dworkflow.source.dir=classpath/to/parent/directory/of/oozie_app`

on a module having set:

```
<parent>
	<groupId>eu.dnetlib.dhp</groupId>
	<artifactId>dhp-workflows</artifactId>
</parent>
```

in its `pom.xml` file. The `oozie-package` profile initializes the oozie workflow packaging, while the `workflow.source.dir` property points to
a workflow (notice: this is not a relative path but a classpath to the directory usually holding the `oozie_app` subdirectory).

The outcome of this packaging is an `oozie-package.tar.gz` file containing all the resources required to run the Oozie workflow:

- jar packages
- workflow definitions
- job properties
- maintenance scripts

Required properties
====================

In order to include the proper workflow within the package, the `workflow.source.dir` property has to be set. It can be provided
by setting the `-Dworkflow.source.dir=some/job/dir` maven parameter.

In order to define the full set of cluster environment properties one should create a `~/.dhp/application.properties` file with
the following properties:

- `dhp.hadoop.frontend.user.name` - your user name on the hadoop cluster and frontend machine
- `dhp.hadoop.frontend.host.name` - frontend host name
- `dhp.hadoop.frontend.temp.dir` - frontend directory for temporary files
- `dhp.hadoop.frontend.port.ssh` - frontend machine ssh port
- `oozieServiceLoc` - oozie service location required by the `run_workflow.sh` script executing the oozie job
- `nameNode` - name node address
- `jobTracker` - job tracker address
- `oozie.execution.log.file.location` - location of the file that will be created when executing the oozie job; it contains the output
  produced by the `run_workflow.sh` script (needed to obtain the oozie job id)
- `maven.executable` - mvn command location, requires parameterization due to a different setup of the CI cluster
- `sparkDriverMemory` - amount of memory assigned to the spark job driver
- `sparkExecutorMemory` - amount of memory assigned to the spark job executors
- `sparkExecutorCores` - number of cores assigned to the spark job executors

All values will be overridden with the ones from `job.properties` and eventually `job-override.properties` stored in the module's
main folder.

When overriding properties from `job.properties`, a `job-override.properties` file can be created in the main module directory
(the one containing the `pom.xml` file) to define all the new properties that will override the existing ones.
One can also provide those properties one by one as command line `-D` arguments.
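For illustration only, a minimal `~/.dhp/application.properties` might look like the sketch below; every value is a placeholder (host names, ports and memory sizes are assumptions, not taken from this repository):

```
# illustrative ~/.dhp/application.properties — replace every value with your own cluster settings
dhp.hadoop.frontend.user.name=jsmith
dhp.hadoop.frontend.host.name=hadoop-frontend.example.org
dhp.hadoop.frontend.temp.dir=/tmp/jsmith
dhp.hadoop.frontend.port.ssh=22
oozieServiceLoc=http://oozie.example.org:11000/oozie
nameNode=hdfs://namenode.example.org:8020
jobTracker=jobtracker.example.org:8032
oozie.execution.log.file.location=/tmp/jsmith/oozie-execution.log
maven.executable=/usr/bin/mvn
sparkDriverMemory=4G
sparkExecutorMemory=8G
sparkExecutorCores=4
```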
Properties overriding order is the following:

1. `pom.xml` defined properties (located in the project root dir)
2. `~/.dhp/application.properties` defined properties
3. `${workflow.source.dir}/job.properties`
4. `job-override.properties` (located in the project root dir)
5. `maven -Dparam=value`

where the maven `-Dparam` property overrides all the other ones.

Workflow definition requirements
====================

The `workflow.source.dir` property should point to the following directory structure:

	[${workflow.source.dir}]
		|-job.properties (optional)
		\-[oozie_app]
			\-workflow.xml

This property can be set using the maven `-D` switch.

`[oozie_app]` is the default directory name, however it can be set to any value as long as the `oozieAppDir` property is
provided with the directory name as its value.

Sub-workflows are supported as well; sub-workflow directories should be nested within the `[oozie_app]` directory.

Creating oozie installer step-by-step
=====================================

The automated oozie-installer steps are the following:

1. creating jar packages: `*.jar` and `*tests.jar`, along with copying all dependencies into `target/dependencies`
2. reading properties from maven, `~/.dhp/application.properties`, `job.properties`, `job-override.properties`
3. invoking the priming mechanism linking resources from the import.txt file (currently resolving sub-workflow resources)
4. assembling shell scripts for preparing the Hadoop filesystem, uploading the Oozie application and starting the workflow
5. copying the whole `${workflow.source.dir}` content to `target/${oozie.package.file.name}`
6. generating an updated `job.properties` file in `target/${oozie.package.file.name}` based on maven,
   `~/.dhp/application.properties`, `job.properties` and `job-override.properties`
7. creating a `lib` directory (or multiple directories for sub-workflows, one for each nested directory) and copying the jar packages
   created at step (1) to each one of them
8. bundling the whole `${oozie.package.file.name}` directory into a single tar.gz package

Uploading oozie package and running workflow on cluster
=======================================================

In order to simplify the deployment and execution process, two dedicated profiles were introduced:

- `deploy`
- `run`

to be used along with the `oozie-package` profile, e.g. by providing the `-Poozie-package,deploy,run` maven parameters; a complete invocation is sketched below.
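For illustration, combining the three profiles with a workflow path might look like the following; the workflow path is a made-up example, not a module of this repository:

```
# build the oozie package, upload it to the frontend/HDFS and start the workflow in one pass
mvn clean package -Poozie-package,deploy,run \
	-Dworkflow.source.dir=eu/dnetlib/dhp/examples/my_workflow
```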
The `deploy` profile supplements the packaging process with:

1) uploading the oozie-package via scp to the `/home/${user.name}/oozie-packages` directory on the `${dhp.hadoop.frontend.host.name}` machine
2) extracting the uploaded package
3) uploading the oozie content to the hadoop cluster HDFS location defined in the `oozie.wf.application.path` property (generated dynamically by the maven build process, based on the `${dhp.hadoop.frontend.user.name}` and `workflow.source.dir` properties)

The `run` profile introduces:

1) executing the oozie application uploaded to the HDFS cluster using the `deploy` command; it triggers the `run_workflow.sh` script, providing the runtime properties defined in the `job.properties` file.

Notice: ssh access to the frontend machine has to be configured at the system level, and it is preferable to set up key-based authentication in order to simplify remote operations.

Dnet-hadoop is a tool for
@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>eu.dnetlib.dhp</groupId>
 		<artifactId>dhp-build</artifactId>
-		<version>1.2.5-SNAPSHOT</version>
+		<version>1.2.4-SNAPSHOT</version>
 	</parent>
 
 	<artifactId>dhp-build-assembly-resources</artifactId>
@@ -6,7 +6,7 @@
 	<parent>
 		<groupId>eu.dnetlib.dhp</groupId>
 		<artifactId>dhp-build</artifactId>
-		<version>1.2.5-SNAPSHOT</version>
+		<version>1.2.4-SNAPSHOT</version>
 	</parent>
 
 	<artifactId>dhp-build-properties-maven-plugin</artifactId>
@@ -8,6 +8,8 @@ import java.util.List;
 import org.apache.commons.lang.ArrayUtils;
 import org.apache.commons.lang.StringUtils;
 import org.apache.maven.plugin.AbstractMojo;
+import org.apache.maven.plugin.MojoExecutionException;
+import org.apache.maven.plugin.MojoFailureException;
 
 /**
  * Generates oozie properties which were not provided from commandline.
@@ -25,7 +27,7 @@ public class GenerateOoziePropertiesMojo extends AbstractMojo {
 	};
 
 	@Override
-	public void execute() {
+	public void execute() throws MojoExecutionException, MojoFailureException {
 		if (System.getProperties().containsKey(PROPERTY_NAME_WF_SOURCE_DIR)
 			&& !System.getProperties().containsKey(PROPERTY_NAME_SANDBOX_NAME)) {
 			String generatedSandboxName = generateSandboxName(
@@ -44,24 +46,24 @@ public class GenerateOoziePropertiesMojo extends AbstractMojo {
 	/**
 	 * Generates sandbox name from workflow source directory.
 	 *
-	 * @param wfSourceDir workflow source directory
+	 * @param wfSourceDir
 	 * @return generated sandbox name
 	 */
 	private String generateSandboxName(String wfSourceDir) {
 		// utilize all dir names until finding one of the limiters
-		List<String> sandboxNameParts = new ArrayList<>();
+		List<String> sandboxNameParts = new ArrayList<String>();
 		String[] tokens = StringUtils.split(wfSourceDir, File.separatorChar);
 		ArrayUtils.reverse(tokens);
 		if (tokens.length > 0) {
 			for (String token : tokens) {
 				for (String limiter : limiters) {
 					if (limiter.equals(token)) {
-						return !sandboxNameParts.isEmpty()
+						return sandboxNameParts.size() > 0
 							? StringUtils.join(sandboxNameParts.toArray())
 							: null;
 					}
 				}
-				if (!sandboxNameParts.isEmpty()) {
+				if (sandboxNameParts.size() > 0) {
 					sandboxNameParts.add(0, File.separator);
 				}
 				sandboxNameParts.add(0, token);
@ -16,7 +16,6 @@ import java.io.File;
|
|||
import java.io.FileInputStream;
|
||||
import java.io.IOException;
|
||||
import java.io.InputStream;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
|
@ -290,7 +289,7 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
|
|||
*/
|
||||
protected List<String> getEscapeChars(String escapeChars) {
|
||||
List<String> tokens = getListFromCSV(escapeChars);
|
||||
List<String> realTokens = new ArrayList<>();
|
||||
List<String> realTokens = new ArrayList<String>();
|
||||
for (String token : tokens) {
|
||||
String realToken = getRealToken(token);
|
||||
realTokens.add(realToken);
|
||||
|
@ -325,7 +324,7 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
|
|||
* @return content
|
||||
*/
|
||||
protected String getContent(String comment, Properties properties, List<String> escapeTokens) {
|
||||
List<String> names = new ArrayList<>(properties.stringPropertyNames());
|
||||
List<String> names = new ArrayList<String>(properties.stringPropertyNames());
|
||||
Collections.sort(names);
|
||||
StringBuilder sb = new StringBuilder();
|
||||
if (!StringUtils.isBlank(comment)) {
|
||||
|
@ -353,7 +352,7 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
|
|||
throws MojoExecutionException {
|
||||
try {
|
||||
String content = getContent(comment, properties, escapeTokens);
|
||||
FileUtils.writeStringToFile(file, content, StandardCharsets.UTF_8);
|
||||
FileUtils.writeStringToFile(file, content, ENCODING_UTF8);
|
||||
} catch (IOException e) {
|
||||
throw new MojoExecutionException("Error creating properties file", e);
|
||||
}
|
||||
|
@ -400,9 +399,9 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
|
|||
*/
|
||||
protected static final List<String> getListFromCSV(String csv) {
|
||||
if (StringUtils.isBlank(csv)) {
|
||||
return new ArrayList<>();
|
||||
return new ArrayList<String>();
|
||||
}
|
||||
List<String> list = new ArrayList<>();
|
||||
List<String> list = new ArrayList<String>();
|
||||
String[] tokens = StringUtils.split(csv, ",");
|
||||
for (String token : tokens) {
|
||||
list.add(token.trim());
|
||||
|
|
|
@ -9,18 +9,18 @@ import org.junit.jupiter.api.BeforeEach;
|
|||
import org.junit.jupiter.api.Test;
|
||||
|
||||
/** @author mhorst, claudio.atzori */
|
||||
class GenerateOoziePropertiesMojoTest {
|
||||
public class GenerateOoziePropertiesMojoTest {
|
||||
|
||||
private final GenerateOoziePropertiesMojo mojo = new GenerateOoziePropertiesMojo();
|
||||
|
||||
@BeforeEach
|
||||
void clearSystemProperties() {
|
||||
public void clearSystemProperties() {
|
||||
System.clearProperty(PROPERTY_NAME_SANDBOX_NAME);
|
||||
System.clearProperty(PROPERTY_NAME_WF_SOURCE_DIR);
|
||||
}
|
||||
|
||||
@Test
|
||||
void testExecuteEmpty() throws Exception {
|
||||
public void testExecuteEmpty() throws Exception {
|
||||
// execute
|
||||
mojo.execute();
|
||||
|
||||
|
@ -29,7 +29,7 @@ class GenerateOoziePropertiesMojoTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteSandboxNameAlreadySet() throws Exception {
|
||||
public void testExecuteSandboxNameAlreadySet() throws Exception {
|
||||
// given
|
||||
String workflowSourceDir = "eu/dnetlib/dhp/wf/transformers";
|
||||
String sandboxName = "originalSandboxName";
|
||||
|
@ -44,7 +44,7 @@ class GenerateOoziePropertiesMojoTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteEmptyWorkflowSourceDir() throws Exception {
|
||||
public void testExecuteEmptyWorkflowSourceDir() throws Exception {
|
||||
// given
|
||||
String workflowSourceDir = "";
|
||||
System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);
|
||||
|
@ -57,7 +57,7 @@ class GenerateOoziePropertiesMojoTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteNullSandboxNameGenerated() throws Exception {
|
||||
public void testExecuteNullSandboxNameGenerated() throws Exception {
|
||||
// given
|
||||
String workflowSourceDir = "eu/dnetlib/dhp/";
|
||||
System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);
|
||||
|
@ -70,7 +70,7 @@ class GenerateOoziePropertiesMojoTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecute() throws Exception {
|
||||
public void testExecute() throws Exception {
|
||||
// given
|
||||
String workflowSourceDir = "eu/dnetlib/dhp/wf/transformers";
|
||||
System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);
|
||||
|
@ -83,7 +83,7 @@ class GenerateOoziePropertiesMojoTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithoutRoot() throws Exception {
|
||||
public void testExecuteWithoutRoot() throws Exception {
|
||||
// given
|
||||
String workflowSourceDir = "wf/transformers";
|
||||
System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);
|
||||
|
|
|
@ -20,7 +20,7 @@ import org.mockito.junit.jupiter.MockitoExtension;
|
|||
|
||||
/** @author mhorst, claudio.atzori */
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class WritePredefinedProjectPropertiesTest {
|
||||
public class WritePredefinedProjectPropertiesTest {
|
||||
|
||||
@Mock
|
||||
private MavenProject mavenProject;
|
||||
|
@ -39,7 +39,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
// ----------------------------------- TESTS ---------------------------------------------
|
||||
|
||||
@Test
|
||||
void testExecuteEmpty() throws Exception {
|
||||
public void testExecuteEmpty() throws Exception {
|
||||
// execute
|
||||
mojo.execute();
|
||||
|
||||
|
@ -50,7 +50,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithProjectProperties() throws Exception {
|
||||
public void testExecuteWithProjectProperties() throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
String value = "projectPropertyValue";
|
||||
|
@ -70,7 +70,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test()
|
||||
void testExecuteWithProjectPropertiesAndInvalidOutputFile(@TempDir File testFolder) {
|
||||
public void testExecuteWithProjectPropertiesAndInvalidOutputFile(@TempDir File testFolder) {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
String value = "projectPropertyValue";
|
||||
|
@ -80,19 +80,11 @@ class WritePredefinedProjectPropertiesTest {
|
|||
mojo.outputFile = testFolder;
|
||||
|
||||
// execute
|
||||
try {
|
||||
mojo.execute();
|
||||
Assertions.assertTrue(false); // not reached
|
||||
} catch (Exception e) {
|
||||
Assertions
|
||||
.assertTrue(
|
||||
MojoExecutionException.class.isAssignableFrom(e.getClass()) ||
|
||||
IllegalArgumentException.class.isAssignableFrom(e.getClass()));
|
||||
}
|
||||
Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
|
||||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithProjectPropertiesExclusion(@TempDir File testFolder) throws Exception {
|
||||
public void testExecuteWithProjectPropertiesExclusion(@TempDir File testFolder) throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
String value = "projectPropertyValue";
|
||||
|
@ -116,7 +108,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithProjectPropertiesInclusion(@TempDir File testFolder) throws Exception {
|
||||
public void testExecuteWithProjectPropertiesInclusion(@TempDir File testFolder) throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
String value = "projectPropertyValue";
|
||||
|
@ -140,7 +132,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteIncludingPropertyKeysFromFile(@TempDir File testFolder) throws Exception {
|
||||
public void testExecuteIncludingPropertyKeysFromFile(@TempDir File testFolder) throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
String value = "projectPropertyValue";
|
||||
|
@ -172,7 +164,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteIncludingPropertyKeysFromClasspathResource(@TempDir File testFolder)
|
||||
public void testExecuteIncludingPropertyKeysFromClasspathResource(@TempDir File testFolder)
|
||||
throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
|
@ -202,7 +194,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteIncludingPropertyKeysFromBlankLocation() {
|
||||
public void testExecuteIncludingPropertyKeysFromBlankLocation() {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
String value = "projectPropertyValue";
|
||||
|
@ -222,7 +214,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteIncludingPropertyKeysFromXmlFile(@TempDir File testFolder)
|
||||
public void testExecuteIncludingPropertyKeysFromXmlFile(@TempDir File testFolder)
|
||||
throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
|
@ -255,7 +247,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteIncludingPropertyKeysFromInvalidXmlFile(@TempDir File testFolder)
|
||||
public void testExecuteIncludingPropertyKeysFromInvalidXmlFile(@TempDir File testFolder)
|
||||
throws Exception {
|
||||
// given
|
||||
String key = "projectPropertyKey";
|
||||
|
@ -281,7 +273,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithQuietModeOn(@TempDir File testFolder) throws Exception {
|
||||
public void testExecuteWithQuietModeOn(@TempDir File testFolder) throws Exception {
|
||||
// given
|
||||
mojo.setQuiet(true);
|
||||
mojo.setIncludePropertyKeysFromFiles(new String[] {
|
||||
|
@ -298,7 +290,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteIncludingPropertyKeysFromInvalidFile() {
|
||||
public void testExecuteIncludingPropertyKeysFromInvalidFile() {
|
||||
// given
|
||||
mojo.setIncludePropertyKeysFromFiles(new String[] {
|
||||
"invalid location"
|
||||
|
@ -309,7 +301,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithEnvironmentProperties(@TempDir File testFolder) throws Exception {
|
||||
public void testExecuteWithEnvironmentProperties(@TempDir File testFolder) throws Exception {
|
||||
// given
|
||||
mojo.setIncludeEnvironmentVariables(true);
|
||||
|
||||
|
@ -326,7 +318,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithSystemProperties(@TempDir File testFolder) throws Exception {
|
||||
public void testExecuteWithSystemProperties(@TempDir File testFolder) throws Exception {
|
||||
// given
|
||||
String key = "systemPropertyKey";
|
||||
String value = "systemPropertyValue";
|
||||
|
@ -345,7 +337,7 @@ class WritePredefinedProjectPropertiesTest {
|
|||
}
|
||||
|
||||
@Test
|
||||
void testExecuteWithSystemPropertiesAndEscapeChars(@TempDir File testFolder)
|
||||
public void testExecuteWithSystemPropertiesAndEscapeChars(@TempDir File testFolder)
|
||||
throws Exception {
|
||||
// given
|
||||
String key = "systemPropertyKey ";
|
||||
|
|
|
@ -5,7 +5,7 @@
|
|||
|
||||
<groupId>eu.dnetlib.dhp</groupId>
|
||||
<artifactId>dhp-code-style</artifactId>
|
||||
<version>1.2.5-SNAPSHOT</version>
|
||||
<version>1.2.4-SNAPSHOT</version>
|
||||
|
||||
<packaging>jar</packaging>
|
||||
|
||||
|
@ -15,27 +15,16 @@
|
|||
<snapshotRepository>
|
||||
<id>dnet45-snapshots</id>
|
||||
<name>DNet45 Snapshots</name>
|
||||
<url>https://maven.d4science.org/nexus/content/repositories/dnet45-snapshots</url>
|
||||
<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-snapshots</url>
|
||||
<layout>default</layout>
|
||||
</snapshotRepository>
|
||||
<repository>
|
||||
<id>dnet45-releases</id>
|
||||
<url>https://maven.d4science.org/nexus/content/repositories/dnet45-releases</url>
|
||||
<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-releases</url>
|
||||
</repository>
|
||||
<site>
|
||||
<id>DHPSite</id>
|
||||
<url>${dhp.site.stage.path}/dhp-build/dhp-code-style</url>
|
||||
</site>
|
||||
</distributionManagement>
|
||||
|
||||
<build>
|
||||
<extensions>
|
||||
<extension>
|
||||
<groupId>org.apache.maven.wagon</groupId>
|
||||
<artifactId>wagon-ssh</artifactId>
|
||||
<version>2.10</version>
|
||||
</extension>
|
||||
</extensions>
|
||||
<pluginManagement>
|
||||
<plugins>
|
||||
<plugin>
|
||||
|
@ -46,19 +35,14 @@
|
|||
<plugin>
|
||||
<groupId>org.apache.maven.plugins</groupId>
|
||||
<artifactId>maven-site-plugin</artifactId>
|
||||
<version>3.9.1</version>
|
||||
<configuration>
|
||||
<skip>true</skip>
|
||||
</configuration>
|
||||
<version>3.7.1</version>
|
||||
</plugin>
|
||||
</plugins>
|
||||
</pluginManagement>
|
||||
</build>
|
||||
|
||||
<properties>
|
||||
|
||||
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
|
||||
<dhp.site.stage.path>sftp://dnet-hadoop@static-web.d4science.org/dnet-hadoop</dhp.site.stage.path>
|
||||
</properties>
|
||||
|
||||
</project>
|
|
@@ -19,7 +19,7 @@
 <setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration" value="do not insert"/>
 <setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws" value="do not insert"/>
 <setting id="org.eclipse.jdt.core.formatter.parentheses_positions_in_switch_statement" value="common_lines"/>
-<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="false"/>
+<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="true"/>
 <setting id="org.eclipse.jdt.core.formatter.indentation.size" value="4"/>
 <setting id="org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator" value="do not insert"/>
 <setting id="org.eclipse.jdt.core.formatter.parentheses_positions_in_enum_constant_declaration" value="common_lines"/>
@@ -1,21 +0,0 @@
style = defaultWithAlign

align.openParenCallSite = false
align.openParenDefnSite = false
align.tokens = [{code = "->"}, {code = "<-"}, {code = "=>", owner = "Case"}]
continuationIndent.callSite = 2
continuationIndent.defnSite = 2
danglingParentheses = true
indentOperator = spray
maxColumn = 120
newlines.alwaysBeforeTopLevelStatements = true
project.excludeFilters = [".*\\.sbt"]
rewrite.rules = [AvoidInfix]
rewrite.rules = [ExpandImportSelectors]
rewrite.rules = [RedundantBraces]
rewrite.rules = [RedundantParens]
rewrite.rules = [SortImports]
rewrite.rules = [SortModifiers]
rewrite.rules = [PreferCurlyFors]
spaces.inImportCurlyBraces = false
unindentTopLevelOperators = true
@@ -1,21 +0,0 @@
<?xml version="1.0" encoding="ISO-8859-1"?>
<project xmlns="http://maven.apache.org/DECORATION/1.8.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/DECORATION/1.8.0 https://maven.apache.org/xsd/decoration-1.8.0.xsd"
	name="DHP-Aggregation">
	<skin>
		<groupId>org.apache.maven.skins</groupId>
		<artifactId>maven-fluido-skin</artifactId>
		<version>1.8</version>
	</skin>
	<poweredBy>
		<logo name="OpenAIRE Research Graph" href="https://graph.openaire.eu/"
			img="https://graph.openaire.eu/assets/common-assets/logo-large-graph.png"/>
	</poweredBy>
	<body>
		<links>
			<item name="Code" href="https://code-repo.d4science.org/" />
		</links>
		<menu ref="modules" />
		<menu ref="reports"/>
	</body>
</project>
@ -4,15 +4,12 @@
|
|||
<parent>
|
||||
<groupId>eu.dnetlib.dhp</groupId>
|
||||
<artifactId>dhp</artifactId>
|
||||
<version>1.2.5-SNAPSHOT</version>
|
||||
<version>1.2.4-SNAPSHOT</version>
|
||||
</parent>
|
||||
<artifactId>dhp-build</artifactId>
|
||||
<packaging>pom</packaging>
|
||||
|
||||
<description>This module is a container for the build tools used in dnet-hadoop</description>
|
||||
<properties>
|
||||
<maven.javadoc.skip>true</maven.javadoc.skip>
|
||||
</properties>
|
||||
|
||||
<modules>
|
||||
<module>dhp-code-style</module>
|
||||
|
@ -20,12 +17,4 @@
|
|||
<module>dhp-build-properties-maven-plugin</module>
|
||||
</modules>
|
||||
|
||||
|
||||
<distributionManagement>
|
||||
<site>
|
||||
<id>DHPSite</id>
|
||||
<url>${dhp.site.stage.path}/dhp-build/</url>
|
||||
</site>
|
||||
</distributionManagement>
|
||||
|
||||
</project>
|
||||
|
|
|
@@ -1,22 +0,0 @@
<?xml version="1.0" encoding="ISO-8859-1"?>
<project xmlns="http://maven.apache.org/DECORATION/1.8.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/DECORATION/1.8.0 https://maven.apache.org/xsd/decoration-1.8.0.xsd"
	name="DHP-Aggregation">
	<skin>
		<groupId>org.apache.maven.skins</groupId>
		<artifactId>maven-fluido-skin</artifactId>
		<version>1.8</version>
	</skin>
	<poweredBy>
		<logo name="OpenAIRE Research Graph" href="https://graph.openaire.eu/"
			img="https://graph.openaire.eu/assets/common-assets/logo-large-graph.png"/>
	</poweredBy>
	<body>
		<links>
			<item name="Code" href="https://code-repo.d4science.org/" />
		</links>

		<menu ref="modules" />
		<menu ref="reports"/>
	</body>
</project>
@ -5,88 +5,28 @@
|
|||
<parent>
|
||||
<groupId>eu.dnetlib.dhp</groupId>
|
||||
<artifactId>dhp</artifactId>
|
||||
<version>1.2.5-SNAPSHOT</version>
|
||||
<relativePath>../pom.xml</relativePath>
|
||||
|
||||
<version>1.2.4-SNAPSHOT</version>
|
||||
<relativePath>../</relativePath>
|
||||
</parent>
|
||||
|
||||
<artifactId>dhp-common</artifactId>
|
||||
<packaging>jar</packaging>
|
||||
|
||||
<distributionManagement>
|
||||
<site>
|
||||
<id>DHPSite</id>
|
||||
<url>${dhp.site.stage.path}/dhp-common</url>
|
||||
</site>
|
||||
</distributionManagement>
|
||||
|
||||
<description>This module contains common utilities meant to be used across the dnet-hadoop submodules</description>
|
||||
<build>
|
||||
<plugins>
|
||||
<plugin>
|
||||
<groupId>net.alchim31.maven</groupId>
|
||||
<artifactId>scala-maven-plugin</artifactId>
|
||||
<version>${net.alchim31.maven.version}</version>
|
||||
<executions>
|
||||
<execution>
|
||||
<id>scala-compile-first</id>
|
||||
<phase>initialize</phase>
|
||||
<goals>
|
||||
<goal>add-source</goal>
|
||||
<goal>compile</goal>
|
||||
</goals>
|
||||
</execution>
|
||||
<execution>
|
||||
<id>scala-test-compile</id>
|
||||
<phase>process-test-resources</phase>
|
||||
<goals>
|
||||
<goal>testCompile</goal>
|
||||
</goals>
|
||||
</execution>
|
||||
<execution>
|
||||
<id>scala-doc</id>
|
||||
<phase>process-resources</phase> <!-- or wherever -->
|
||||
<goals>
|
||||
<goal>doc</goal>
|
||||
</goals>
|
||||
</execution>
|
||||
</executions>
|
||||
<configuration>
|
||||
<failOnMultipleScalaVersions>true</failOnMultipleScalaVersions>
|
||||
<scalaCompatVersion>${scala.binary.version}</scalaCompatVersion>
|
||||
<scalaVersion>${scala.version}</scalaVersion>
|
||||
</configuration>
|
||||
</plugin>
|
||||
</plugins>
|
||||
|
||||
</build>
|
||||
|
||||
<dependencies>
|
||||
<dependency>
|
||||
<groupId>edu.cmu</groupId>
|
||||
<artifactId>secondstring</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.ibm.icu</groupId>
|
||||
<artifactId>icu4j</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>com.github.sisyphsu</groupId>
|
||||
<artifactId>dateparser</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>me.xuender</groupId>
|
||||
<artifactId>unidecode</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.apache.spark</groupId>
|
||||
<artifactId>spark-core_${scala.binary.version}</artifactId>
|
||||
<groupId>org.apache.hadoop</groupId>
|
||||
<artifactId>hadoop-common</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.apache.spark</groupId>
|
||||
<artifactId>spark-sql_${scala.binary.version}</artifactId>
|
||||
<artifactId>spark-core_2.11</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.apache.spark</groupId>
|
||||
<artifactId>spark-sql_2.11</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
|
@ -113,6 +53,11 @@
|
|||
<groupId>com.fasterxml.jackson.core</groupId>
|
||||
<artifactId>jackson-databind</artifactId>
|
||||
</dependency>
|
||||
<!-- https://mvnrepository.com/artifact/com.rabbitmq/amqp-client -->
|
||||
<dependency>
|
||||
<groupId>com.rabbitmq</groupId>
|
||||
<artifactId>amqp-client</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>net.sf.saxon</groupId>
|
||||
<artifactId>Saxon-HE</artifactId>
|
||||
|
@ -142,50 +87,6 @@
|
|||
<groupId>org.postgresql</groupId>
|
||||
<artifactId>postgresql</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>com.squareup.okhttp3</groupId>
|
||||
<artifactId>okhttp</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.apache.httpcomponents</groupId>
|
||||
<artifactId>httpclient</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>org.mongodb</groupId>
|
||||
<artifactId>mongo-java-driver</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>eu.dnetlib.dhp</groupId>
|
||||
<artifactId>dhp-schemas</artifactId>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>com.opencsv</groupId>
|
||||
<artifactId>opencsv</artifactId>
|
||||
</dependency>
|
||||
</dependencies>
|
||||
|
||||
<!-- dependencies required on JDK9+ because J2EE has been removed -->
|
||||
<profiles>
|
||||
<profile>
|
||||
<id>spark-34</id>
|
||||
<dependencies>
|
||||
<dependency>
|
||||
<groupId>javax.xml.bind</groupId>
|
||||
<artifactId>jaxb-api</artifactId>
|
||||
<version>2.2.11</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.sun.xml.ws</groupId>
|
||||
<artifactId>jaxws-ri</artifactId>
|
||||
<version>2.3.3</version>
|
||||
<type>pom</type>
|
||||
</dependency>
|
||||
</dependencies>
|
||||
</profile>
|
||||
</profiles>
|
||||
</project>
|
||||
|
|
|
@ -0,0 +1,48 @@
|
|||
|
||||
package eu.dnetlib.collector.worker.model;
|
||||
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
|
||||
public class ApiDescriptor {
|
||||
|
||||
private String id;
|
||||
|
||||
private String baseUrl;
|
||||
|
||||
private String protocol;
|
||||
|
||||
private Map<String, String> params = new HashMap<>();
|
||||
|
||||
public String getBaseUrl() {
|
||||
return baseUrl;
|
||||
}
|
||||
|
||||
public void setBaseUrl(final String baseUrl) {
|
||||
this.baseUrl = baseUrl;
|
||||
}
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(final String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public Map<String, String> getParams() {
|
||||
return params;
|
||||
}
|
||||
|
||||
public void setParams(final HashMap<String, String> params) {
|
||||
this.params = params;
|
||||
}
|
||||
|
||||
public String getProtocol() {
|
||||
return protocol;
|
||||
}
|
||||
|
||||
public void setProtocol(final String protocol) {
|
||||
this.protocol = protocol;
|
||||
}
|
||||
}
|
|
@ -0,0 +1,119 @@
|
|||
|
||||
package eu.dnetlib.data.mdstore.manager.common.model;
|
||||
|
||||
import java.io.Serializable;
|
||||
import java.util.UUID;
|
||||
|
||||
import javax.persistence.Column;
|
||||
import javax.persistence.Entity;
|
||||
import javax.persistence.Id;
|
||||
import javax.persistence.Table;
|
||||
|
||||
@Entity
|
||||
@Table(name = "mdstores")
|
||||
public class MDStore implements Serializable {
|
||||
|
||||
/** */
|
||||
private static final long serialVersionUID = 3160530489149700055L;
|
||||
|
||||
@Id
|
||||
@Column(name = "id")
|
||||
private String id;
|
||||
|
||||
@Column(name = "format")
|
||||
private String format;
|
||||
|
||||
@Column(name = "layout")
|
||||
private String layout;
|
||||
|
||||
@Column(name = "interpretation")
|
||||
private String interpretation;
|
||||
|
||||
@Column(name = "datasource_name")
|
||||
private String datasourceName;
|
||||
|
||||
@Column(name = "datasource_id")
|
||||
private String datasourceId;
|
||||
|
||||
@Column(name = "api_id")
|
||||
private String apiId;
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(final String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getFormat() {
|
||||
return format;
|
||||
}
|
||||
|
||||
public void setFormat(final String format) {
|
||||
this.format = format;
|
||||
}
|
||||
|
||||
public String getLayout() {
|
||||
return layout;
|
||||
}
|
||||
|
||||
public void setLayout(final String layout) {
|
||||
this.layout = layout;
|
||||
}
|
||||
|
||||
public String getInterpretation() {
|
||||
return interpretation;
|
||||
}
|
||||
|
||||
public void setInterpretation(final String interpretation) {
|
||||
this.interpretation = interpretation;
|
||||
}
|
||||
|
||||
public String getDatasourceName() {
|
||||
return datasourceName;
|
||||
}
|
||||
|
||||
public void setDatasourceName(final String datasourceName) {
|
||||
this.datasourceName = datasourceName;
|
||||
}
|
||||
|
||||
public String getDatasourceId() {
|
||||
return datasourceId;
|
||||
}
|
||||
|
||||
public void setDatasourceId(final String datasourceId) {
|
||||
this.datasourceId = datasourceId;
|
||||
}
|
||||
|
||||
public String getApiId() {
|
||||
return apiId;
|
||||
}
|
||||
|
||||
public void setApiId(final String apiId) {
|
||||
this.apiId = apiId;
|
||||
}
|
||||
|
||||
public static MDStore newInstance(
|
||||
final String format, final String layout, final String interpretation) {
|
||||
return newInstance(format, layout, interpretation, null, null, null);
|
||||
}
|
||||
|
||||
public static MDStore newInstance(
|
||||
final String format,
|
||||
final String layout,
|
||||
final String interpretation,
|
||||
final String dsName,
|
||||
final String dsId,
|
||||
final String apiId) {
|
||||
final MDStore md = new MDStore();
|
||||
md.setId("md-" + UUID.randomUUID());
|
||||
md.setFormat(format);
|
||||
md.setLayout(layout);
|
||||
md.setInterpretation(interpretation);
|
||||
md.setDatasourceName(dsName);
|
||||
md.setDatasourceId(dsId);
|
||||
md.setApiId(apiId);
|
||||
return md;
|
||||
}
|
||||
}
|
|
@ -0,0 +1,51 @@
|
|||
|
||||
package eu.dnetlib.data.mdstore.manager.common.model;
|
||||
|
||||
import java.io.Serializable;
|
||||
|
||||
import javax.persistence.Column;
|
||||
import javax.persistence.Entity;
|
||||
import javax.persistence.Id;
|
||||
import javax.persistence.Table;
|
||||
|
||||
@Entity
|
||||
@Table(name = "mdstore_current_versions")
|
||||
public class MDStoreCurrentVersion implements Serializable {
|
||||
|
||||
/** */
|
||||
private static final long serialVersionUID = -4757725888593745773L;
|
||||
|
||||
@Id
|
||||
@Column(name = "mdstore")
|
||||
private String mdstore;
|
||||
|
||||
@Column(name = "current_version")
|
||||
private String currentVersion;
|
||||
|
||||
public String getMdstore() {
|
||||
return mdstore;
|
||||
}
|
||||
|
||||
public void setMdstore(final String mdstore) {
|
||||
this.mdstore = mdstore;
|
||||
}
|
||||
|
||||
public String getCurrentVersion() {
|
||||
return currentVersion;
|
||||
}
|
||||
|
||||
public void setCurrentVersion(final String currentVersion) {
|
||||
this.currentVersion = currentVersion;
|
||||
}
|
||||
|
||||
public static MDStoreCurrentVersion newInstance(final String mdId, final String versionId) {
|
||||
final MDStoreCurrentVersion cv = new MDStoreCurrentVersion();
|
||||
cv.setMdstore(mdId);
|
||||
cv.setCurrentVersion(versionId);
|
||||
return cv;
|
||||
}
|
||||
|
||||
public static MDStoreCurrentVersion newInstance(final MDStoreVersion v) {
|
||||
return newInstance(v.getMdstore(), v.getId());
|
||||
}
|
||||
}
|
|
@ -0,0 +1,99 @@
|
|||
|
||||
package eu.dnetlib.data.mdstore.manager.common.model;
|
||||
|
||||
import java.io.Serializable;
|
||||
import java.util.Date;
|
||||
|
||||
import javax.persistence.Column;
|
||||
import javax.persistence.Entity;
|
||||
import javax.persistence.Id;
|
||||
import javax.persistence.Table;
|
||||
import javax.persistence.Temporal;
|
||||
import javax.persistence.TemporalType;
|
||||
|
||||
@Entity
|
||||
@Table(name = "mdstore_versions")
|
||||
public class MDStoreVersion implements Serializable {
|
||||
|
||||
/** */
|
||||
private static final long serialVersionUID = -4763494442274298339L;
|
||||
|
||||
@Id
|
||||
@Column(name = "id")
|
||||
private String id;
|
||||
|
||||
@Column(name = "mdstore")
|
||||
private String mdstore;
|
||||
|
||||
@Column(name = "writing")
|
||||
private boolean writing;
|
||||
|
||||
@Column(name = "readcount")
|
||||
private int readCount = 0;
|
||||
|
||||
@Column(name = "lastupdate")
|
||||
@Temporal(TemporalType.TIMESTAMP)
|
||||
private Date lastUpdate;
|
||||
|
||||
@Column(name = "size")
|
||||
private long size = 0;
|
||||
|
||||
public static MDStoreVersion newInstance(final String mdId, final boolean writing) {
|
||||
final MDStoreVersion t = new MDStoreVersion();
|
||||
t.setId(mdId + "-" + new Date().getTime());
|
||||
t.setMdstore(mdId);
|
||||
t.setLastUpdate(null);
|
||||
t.setWriting(writing);
|
||||
t.setReadCount(0);
|
||||
t.setSize(0);
|
||||
return t;
|
||||
}
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(final String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getMdstore() {
|
||||
return mdstore;
|
||||
}
|
||||
|
||||
public void setMdstore(final String mdstore) {
|
||||
this.mdstore = mdstore;
|
||||
}
|
||||
|
||||
public boolean isWriting() {
|
||||
return writing;
|
||||
}
|
||||
|
||||
public void setWriting(final boolean writing) {
|
||||
this.writing = writing;
|
||||
}
|
||||
|
||||
public int getReadCount() {
|
||||
return readCount;
|
||||
}
|
||||
|
||||
public void setReadCount(final int readCount) {
|
||||
this.readCount = readCount;
|
||||
}
|
||||
|
||||
public Date getLastUpdate() {
|
||||
return lastUpdate;
|
||||
}
|
||||
|
||||
public void setLastUpdate(final Date lastUpdate) {
|
||||
this.lastUpdate = lastUpdate;
|
||||
}
|
||||
|
||||
public long getSize() {
|
||||
return size;
|
||||
}
|
||||
|
||||
public void setSize(final long size) {
|
||||
this.size = size;
|
||||
}
|
||||
}
|
|
@ -0,0 +1,143 @@
|
|||
|
||||
package eu.dnetlib.data.mdstore.manager.common.model;
|
||||
|
||||
import java.io.Serializable;
|
||||
import java.util.Date;
|
||||
|
||||
import javax.persistence.Column;
|
||||
import javax.persistence.Entity;
|
||||
import javax.persistence.Id;
|
||||
import javax.persistence.Table;
|
||||
import javax.persistence.Temporal;
|
||||
import javax.persistence.TemporalType;
|
||||
|
||||
@Entity
|
||||
@Table(name = "mdstores_with_info")
|
||||
public class MDStoreWithInfo implements Serializable {
|
||||
|
||||
/** */
|
||||
private static final long serialVersionUID = -8445784770687571492L;
|
||||
|
||||
@Id
|
||||
@Column(name = "id")
|
||||
private String id;
|
||||
|
||||
@Column(name = "format")
|
||||
private String format;
|
||||
|
||||
@Column(name = "layout")
|
||||
private String layout;
|
||||
|
||||
@Column(name = "interpretation")
|
||||
private String interpretation;
|
||||
|
||||
@Column(name = "datasource_name")
|
||||
private String datasourceName;
|
||||
|
||||
@Column(name = "datasource_id")
|
||||
private String datasourceId;
|
||||
|
||||
@Column(name = "api_id")
|
||||
private String apiId;
|
||||
|
||||
@Column(name = "current_version")
|
||||
private String currentVersion;
|
||||
|
||||
@Column(name = "lastupdate")
|
||||
@Temporal(TemporalType.TIMESTAMP)
|
||||
private Date lastUpdate;
|
||||
|
||||
@Column(name = "size")
|
||||
private long size = 0;
|
||||
|
||||
@Column(name = "n_versions")
|
||||
private long numberOfVersions = 0;
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(final String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getFormat() {
|
||||
return format;
|
||||
}
|
||||
|
||||
public void setFormat(final String format) {
|
||||
this.format = format;
|
||||
}
|
||||
|
||||
public String getLayout() {
|
||||
return layout;
|
||||
}
|
||||
|
||||
public void setLayout(final String layout) {
|
||||
this.layout = layout;
|
||||
}
|
||||
|
||||
public String getInterpretation() {
|
||||
return interpretation;
|
||||
}
|
||||
|
||||
public void setInterpretation(final String interpretation) {
|
||||
this.interpretation = interpretation;
|
||||
}
|
||||
|
||||
public String getDatasourceName() {
|
||||
return datasourceName;
|
||||
}
|
||||
|
||||
public void setDatasourceName(final String datasourceName) {
|
||||
this.datasourceName = datasourceName;
|
||||
}
|
||||
|
||||
public String getDatasourceId() {
|
||||
return datasourceId;
|
||||
}
|
||||
|
||||
public void setDatasourceId(final String datasourceId) {
|
||||
this.datasourceId = datasourceId;
|
||||
}
|
||||
|
||||
public String getApiId() {
|
||||
return apiId;
|
||||
}
|
||||
|
||||
public void setApiId(final String apiId) {
|
||||
this.apiId = apiId;
|
||||
}
|
||||
|
||||
public String getCurrentVersion() {
|
||||
return currentVersion;
|
||||
}
|
||||
|
||||
public void setCurrentVersion(final String currentVersion) {
|
||||
this.currentVersion = currentVersion;
|
||||
}
|
||||
|
||||
public Date getLastUpdate() {
|
||||
return lastUpdate;
|
||||
}
|
||||
|
||||
public void setLastUpdate(final Date lastUpdate) {
|
||||
this.lastUpdate = lastUpdate;
|
||||
}
|
||||
|
||||
public long getSize() {
|
||||
return size;
|
||||
}
|
||||
|
||||
public void setSize(final long size) {
|
||||
this.size = size;
|
||||
}
|
||||
|
||||
public long getNumberOfVersions() {
|
||||
return numberOfVersions;
|
||||
}
|
||||
|
||||
public void setNumberOfVersions(final long numberOfVersions) {
|
||||
this.numberOfVersions = numberOfVersions;
|
||||
}
|
||||
}
|
|
@ -1,7 +1,10 @@
|
|||
|
||||
package eu.dnetlib.dhp.application;
|
||||
|
||||
import java.io.*;
|
||||
import java.io.ByteArrayInputStream;
|
||||
import java.io.ByteArrayOutputStream;
|
||||
import java.io.Serializable;
|
||||
import java.io.StringWriter;
|
||||
import java.util.*;
|
||||
import java.util.zip.GZIPInputStream;
|
||||
import java.util.zip.GZIPOutputStream;
|
||||
|
@ -9,21 +12,17 @@ import java.util.zip.GZIPOutputStream;
|
|||
import org.apache.commons.cli.*;
|
||||
import org.apache.commons.codec.binary.Base64;
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
|
||||
public class ArgumentApplicationParser implements Serializable {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(ArgumentApplicationParser.class);
|
||||
|
||||
private final Options options = new Options();
|
||||
private final Map<String, String> objectMap = new HashMap<>();
|
||||
|
||||
private final List<String> compressedValues = new ArrayList<>();
|
||||
|
||||
public ArgumentApplicationParser(final String json_configuration) throws IOException {
|
||||
public ArgumentApplicationParser(final String json_configuration) throws Exception {
|
||||
final ObjectMapper mapper = new ObjectMapper();
|
||||
final OptionsParameter[] configuration = mapper.readValue(json_configuration, OptionsParameter[].class);
|
||||
createOptionMap(configuration);
|
||||
|
@ -34,6 +33,7 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
}
|
||||
|
||||
private void createOptionMap(final OptionsParameter[] configuration) {
|
||||
|
||||
Arrays
|
||||
.stream(configuration)
|
||||
.map(
|
||||
|
@ -47,6 +47,10 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
return o;
|
||||
})
|
||||
.forEach(options::addOption);
|
||||
|
||||
// HelpFormatter formatter = new HelpFormatter();
|
||||
// formatter.printHelp("myapp", null, options, null, true);
|
||||
|
||||
}
|
||||
|
||||
public static String decompressValue(final String abstractCompressed) {
|
||||
|
@ -56,13 +60,13 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
final StringWriter stringWriter = new StringWriter();
|
||||
IOUtils.copy(gis, stringWriter);
|
||||
return stringWriter.toString();
|
||||
} catch (IOException e) {
|
||||
log.error("Wrong value to decompress: {}", abstractCompressed);
|
||||
throw new IllegalArgumentException(e);
|
||||
} catch (Throwable e) {
|
||||
System.out.println("Wrong value to decompress:" + abstractCompressed);
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
}
|
||||
|
||||
public static String compressArgument(final String value) throws IOException {
|
||||
public static String compressArgument(final String value) throws Exception {
|
||||
ByteArrayOutputStream out = new ByteArrayOutputStream();
|
||||
GZIPOutputStream gzip = new GZIPOutputStream(out);
|
||||
gzip.write(value.getBytes());
|
||||
|
@ -70,7 +74,7 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
return java.util.Base64.getEncoder().encodeToString(out.toByteArray());
|
||||
}
|
||||
|
||||
public void parseArgument(final String[] args) throws ParseException {
|
||||
public void parseArgument(final String[] args) throws Exception {
|
||||
CommandLineParser parser = new BasicParser();
|
||||
CommandLine cmd = parser.parse(options, args);
|
||||
Arrays
|
||||
|
|
|
@ -9,6 +9,9 @@ public class OptionsParameter {
|
|||
private boolean paramRequired;
|
||||
private boolean compressed;
|
||||
|
||||
public OptionsParameter() {
|
||||
}
|
||||
|
||||
public String getParamName() {
|
||||
return paramName;
|
||||
}
|
||||
|
|
|
@ -1,48 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.collection;
|
||||
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
|
||||
public class ApiDescriptor {
|
||||
|
||||
private String id;
|
||||
|
||||
private String baseUrl;
|
||||
|
||||
private String protocol;
|
||||
|
||||
private Map<String, String> params = new HashMap<>();
|
||||
|
||||
public String getBaseUrl() {
|
||||
return baseUrl;
|
||||
}
|
||||
|
||||
public void setBaseUrl(final String baseUrl) {
|
||||
this.baseUrl = baseUrl;
|
||||
}
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(final String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public Map<String, String> getParams() {
|
||||
return params;
|
||||
}
|
||||
|
||||
public void setParams(final Map<String, String> params) {
|
||||
this.params = params;
|
||||
}
|
||||
|
||||
public String getProtocol() {
|
||||
return protocol;
|
||||
}
|
||||
|
||||
public void setProtocol(final String protocol) {
|
||||
this.protocol = protocol;
|
||||
}
|
||||
}
|
|
@ -1,68 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common;
|
||||
|
||||
import java.util.Map;
|
||||
|
||||
import com.google.common.collect.Maps;
|
||||
|
||||
public class Constants {
|
||||
|
||||
public static final Map<String, String> accessRightsCoarMap = Maps.newHashMap();
|
||||
public static final Map<String, String> coarCodeLabelMap = Maps.newHashMap();
|
||||
|
||||
public static final String ROR_NS_PREFIX = "ror_________";
|
||||
|
||||
public static final String ROR_OPENAIRE_ID = "10|openaire____::993a7ae7a863813cf95028b50708e222";
|
||||
|
||||
public static final String ROR_DATASOURCE_NAME = "Research Organization Registry (ROR)";
|
||||
|
||||
public static String COAR_ACCESS_RIGHT_SCHEMA = "http://vocabularies.coar-repositories.org/documentation/access_rights/";
|
||||
|
||||
private Constants() {
|
||||
}
|
||||
|
||||
static {
|
||||
accessRightsCoarMap.put("OPEN", "c_abf2");
|
||||
accessRightsCoarMap.put("RESTRICTED", "c_16ec");
|
||||
accessRightsCoarMap.put("OPEN SOURCE", "c_abf2");
|
||||
accessRightsCoarMap.put("CLOSED", "c_14cb");
|
||||
accessRightsCoarMap.put("EMBARGO", "c_f1cf");
|
||||
}
|
||||
|
||||
static {
|
||||
coarCodeLabelMap.put("c_abf2", "OPEN");
|
||||
coarCodeLabelMap.put("c_16ec", "RESTRICTED");
|
||||
coarCodeLabelMap.put("c_14cb", "CLOSED");
|
||||
coarCodeLabelMap.put("c_f1cf", "EMBARGO");
|
||||
}
|
||||
|
||||
public static final String SEQUENCE_FILE_NAME = "/sequence_file";
|
||||
public static final String REPORT_FILE_NAME = "/report";
|
||||
public static final String MDSTORE_DATA_PATH = "/store";
|
||||
public static final String MDSTORE_SIZE_PATH = "/size";
|
||||
|
||||
public static final String COLLECTION_MODE = "collectionMode";
|
||||
public static final String METADATA_ENCODING = "metadataEncoding";
|
||||
public static final String OOZIE_WF_PATH = "oozieWfPath";
|
||||
public static final String DNET_MESSAGE_MGR_URL = "dnetMessageManagerURL";
|
||||
|
||||
public static final String MAX_NUMBER_OF_RETRY = "maxNumberOfRetry";
|
||||
public static final String REQUEST_DELAY = "requestDelay";
|
||||
public static final String RETRY_DELAY = "retryDelay";
|
||||
public static final String CONNECT_TIMEOUT = "connectTimeOut";
|
||||
public static final String READ_TIMEOUT = "readTimeOut";
|
||||
public static final String REQUEST_METHOD = "requestMethod";
|
||||
public static final String FROM_DATE_OVERRIDE = "fromDateOverride";
|
||||
public static final String UNTIL_DATE_OVERRIDE = "untilDateOverride";
|
||||
|
||||
public static final String CONTENT_TOTALITEMS = "TotalItems";
|
||||
public static final String CONTENT_INVALIDRECORDS = "InvalidRecords";
|
||||
public static final String CONTENT_TRANSFORMEDRECORDS = "transformedItems";
|
||||
|
||||
// IETF Draft and used by Repositories like ZENODO , not included in APACHE HTTP java packages
|
||||
// see https://ietf-wg-httpapi.github.io/ratelimit-headers/draft-ietf-httpapi-ratelimit-headers.html
|
||||
public static final String HTTPHEADER_IETF_DRAFT_RATELIMIT_LIMIT = "X-RateLimit-Limit";
|
||||
public static final String HTTPHEADER_IETF_DRAFT_RATELIMIT_REMAINING = "X-RateLimit-Remaining";
|
||||
public static final String HTTPHEADER_IETF_DRAFT_RATELIMIT_RESET = "X-RateLimit-Reset";
|
||||
|
||||
}
|
|
@ -7,14 +7,14 @@ import java.sql.*;
|
|||
import java.util.function.Consumer;
|
||||
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.apache.commons.logging.Log;
|
||||
import org.apache.commons.logging.LogFactory;
|
||||
|
||||
public class DbClient implements Closeable {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(DbClient.class);
|
||||
private static final Log log = LogFactory.getLog(DbClient.class);
|
||||
|
||||
private final Connection connection;
|
||||
private Connection connection;
|
||||
|
||||
public DbClient(final String address, final String login, final String password) {
|
||||
|
||||
|
@ -37,8 +37,6 @@ public class DbClient implements Closeable {
|
|||
try (final Statement stmt = connection.createStatement()) {
|
||||
stmt.setFetchSize(100);
|
||||
|
||||
log.info("running SQL:\n\n{}\n\n", sql);
|
||||
|
||||
try (final ResultSet rs = stmt.executeQuery(sql)) {
|
||||
while (rs.next()) {
|
||||
consumer.accept(rs);
|
||||
|
|
|
@ -28,7 +28,7 @@ public class HdfsSupport {
|
|||
* @param configuration Configuration of hadoop env
|
||||
*/
|
||||
public static boolean exists(String path, Configuration configuration) {
|
||||
logger.info("Checking existence for path: {}", path);
|
||||
logger.info("Removing path: {}", path);
|
||||
return rethrowAsRuntimeException(
|
||||
() -> {
|
||||
Path f = new Path(path);
|
||||
|
|
|
@ -1,100 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common;
|
||||
|
||||
/**
|
||||
* This utility represent the Metadata Store information
|
||||
* needed during the migration from mongo to HDFS to store
|
||||
*/
|
||||
public class MDStoreInfo {
|
||||
private String mdstore;
|
||||
private String currentId;
|
||||
private Long latestTimestamp;
|
||||
|
||||
/**
|
||||
* Instantiates a new Md store info.
|
||||
*/
|
||||
public MDStoreInfo() {
|
||||
}
|
||||
|
||||
/**
|
||||
* Instantiates a new Md store info.
|
||||
*
|
||||
* @param mdstore the mdstore
|
||||
* @param currentId the current id
|
||||
* @param latestTimestamp the latest timestamp
|
||||
*/
|
||||
public MDStoreInfo(String mdstore, String currentId, Long latestTimestamp) {
|
||||
this.mdstore = mdstore;
|
||||
this.currentId = currentId;
|
||||
this.latestTimestamp = latestTimestamp;
|
||||
}
|
||||
|
||||
/**
|
||||
* Gets mdstore.
|
||||
*
|
||||
* @return the mdstore
|
||||
*/
|
||||
public String getMdstore() {
|
||||
return mdstore;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets mdstore.
|
||||
*
|
||||
* @param mdstore the mdstore
|
||||
* @return the mdstore
|
||||
*/
|
||||
public MDStoreInfo setMdstore(String mdstore) {
|
||||
this.mdstore = mdstore;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Gets current id.
|
||||
*
|
||||
* @return the current id
|
||||
*/
|
||||
public String getCurrentId() {
|
||||
return currentId;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets current id.
|
||||
*
|
||||
* @param currentId the current id
|
||||
* @return the current id
|
||||
*/
|
||||
public MDStoreInfo setCurrentId(String currentId) {
|
||||
this.currentId = currentId;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Gets latest timestamp.
|
||||
*
|
||||
* @return the latest timestamp
|
||||
*/
|
||||
public Long getLatestTimestamp() {
|
||||
return latestTimestamp;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets latest timestamp.
|
||||
*
|
||||
* @param latestTimestamp the latest timestamp
|
||||
* @return the latest timestamp
|
||||
*/
|
||||
public MDStoreInfo setLatestTimestamp(Long latestTimestamp) {
|
||||
this.latestTimestamp = latestTimestamp;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
return "MDStoreInfo{" +
|
||||
"mdstore='" + mdstore + '\'' +
|
||||
", currentId='" + currentId + '\'' +
|
||||
", latestTimestamp=" + latestTimestamp +
|
||||
'}';
|
||||
}
|
||||
}
|
|
@ -1,172 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common;
|
||||
|
||||
import java.io.BufferedInputStream;
|
||||
import java.io.IOException;
|
||||
import java.io.InputStream;
|
||||
import java.io.Serializable;
|
||||
import java.util.Optional;
|
||||
|
||||
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
|
||||
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.apache.hadoop.conf.Configuration;
|
||||
import org.apache.hadoop.fs.*;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
|
||||
|
||||
public class MakeTarArchive implements Serializable {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(MakeTarArchive.class);
|
||||
|
||||
public static void main(String[] args) throws Exception {
|
||||
String jsonConfiguration = IOUtils
|
||||
.toString(
|
||||
MakeTarArchive.class
|
||||
.getResourceAsStream(
|
||||
"/eu/dnetlib/dhp/common/input_maketar_parameters.json"));
|
||||
|
||||
final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
|
||||
parser.parseArgument(args);
|
||||
|
||||
final String outputPath = parser.get("hdfsPath");
|
||||
log.info("hdfsPath: {}", outputPath);
|
||||
|
||||
final String hdfsNameNode = parser.get("nameNode");
|
||||
log.info("nameNode: {}", hdfsNameNode);
|
||||
|
||||
final String inputPath = parser.get("sourcePath");
|
||||
log.info("input path : {}", inputPath);
|
||||
|
||||
final int gBperSplit = Optional
|
||||
.ofNullable(parser.get("splitSize"))
|
||||
.map(Integer::valueOf)
|
||||
.orElse(10);
|
||||
|
||||
Configuration conf = new Configuration();
|
||||
conf.set("fs.defaultFS", hdfsNameNode);
|
||||
|
||||
FileSystem fileSystem = FileSystem.get(conf);
|
||||
|
||||
makeTArArchive(fileSystem, inputPath, outputPath, gBperSplit);
|
||||
|
||||
}
|
||||
|
||||
public static void makeTArArchive(FileSystem fileSystem, String inputPath, String outputPath, int gBperSplit)
|
||||
throws IOException {
|
||||
|
||||
RemoteIterator<LocatedFileStatus> dirIterator = fileSystem.listLocatedStatus(new Path(inputPath));
|
||||
|
||||
while (dirIterator.hasNext()) {
|
||||
LocatedFileStatus fileStatus = dirIterator.next();
|
||||
|
||||
Path p = fileStatus.getPath();
|
||||
String pathString = p.toString();
|
||||
String entity = pathString.substring(pathString.lastIndexOf("/") + 1);
|
||||
|
||||
MakeTarArchive.tarMaxSize(fileSystem, pathString, outputPath + "/" + entity, entity, gBperSplit);
|
||||
}
|
||||
}
|
||||
|
||||
private static TarArchiveOutputStream getTar(FileSystem fileSystem, String outputPath) throws IOException {
|
||||
Path hdfsWritePath = new Path(outputPath);
|
||||
if (fileSystem.exists(hdfsWritePath)) {
|
||||
fileSystem.delete(hdfsWritePath, true);
|
||||
|
||||
}
|
||||
return new TarArchiveOutputStream(fileSystem.create(hdfsWritePath).getWrappedStream());
|
||||
}
|
||||
|
||||
private static void write(FileSystem fileSystem, String inputPath, String outputPath, String dirName)
|
||||
throws IOException {
|
||||
|
||||
Path hdfsWritePath = new Path(outputPath);
|
||||
if (fileSystem.exists(hdfsWritePath)) {
|
||||
fileSystem.delete(hdfsWritePath, true);
|
||||
|
||||
}
|
||||
try (TarArchiveOutputStream ar = new TarArchiveOutputStream(
|
||||
fileSystem.create(hdfsWritePath).getWrappedStream())) {
|
||||
|
||||
RemoteIterator<LocatedFileStatus> iterator = fileSystem
|
||||
.listFiles(
|
||||
new Path(inputPath), true);
|
||||
|
||||
while (iterator.hasNext()) {
|
||||
writeCurrentFile(fileSystem, dirName, iterator, ar, 0);
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
|
||||
public static void tarMaxSize(FileSystem fileSystem, String inputPath, String outputPath, String dir_name,
|
||||
int gBperSplit) throws IOException {
|
||||
final long bytesPerSplit = 1024L * 1024L * 1024L * gBperSplit;
|
||||
|
||||
long sourceSize = fileSystem.getContentSummary(new Path(inputPath)).getSpaceConsumed();
|
||||
|
||||
if (sourceSize < bytesPerSplit) {
|
||||
write(fileSystem, inputPath, outputPath + ".tar", dir_name);
|
||||
} else {
|
||||
int partNum = 0;
|
||||
|
||||
RemoteIterator<LocatedFileStatus> fileStatusListIterator = fileSystem
|
||||
.listFiles(
|
||||
new Path(inputPath), true);
|
||||
boolean next = fileStatusListIterator.hasNext();
|
||||
while (next) {
|
||||
try (TarArchiveOutputStream ar = getTar(fileSystem, outputPath + "_" + (partNum + 1) + ".tar")) {
|
||||
|
||||
long currentSize = 0;
|
||||
while (next && currentSize < bytesPerSplit) {
|
||||
currentSize = writeCurrentFile(fileSystem, dir_name, fileStatusListIterator, ar, currentSize);
|
||||
next = fileStatusListIterator.hasNext();
|
||||
|
||||
}
|
||||
|
||||
partNum += 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private static long writeCurrentFile(FileSystem fileSystem, String dirName,
|
||||
RemoteIterator<LocatedFileStatus> fileStatusListIterator,
|
||||
TarArchiveOutputStream ar, long currentSize) throws IOException {
|
||||
LocatedFileStatus fileStatus = fileStatusListIterator.next();
|
||||
|
||||
Path p = fileStatus.getPath();
|
||||
String pString = p.toString();
|
||||
if (!pString.endsWith("_SUCCESS")) {
|
||||
String name = pString.substring(pString.lastIndexOf("/") + 1);
|
||||
if (name.startsWith("part-") & name.length() > 10) {
|
||||
String tmp = name.substring(0, 10);
|
||||
if (name.contains(".")) {
|
||||
tmp += name.substring(name.indexOf("."));
|
||||
}
|
||||
name = tmp;
|
||||
}
|
||||
TarArchiveEntry entry = new TarArchiveEntry(dirName + "/" + name);
|
||||
entry.setSize(fileStatus.getLen());
|
||||
currentSize += fileStatus.getLen();
|
||||
ar.putArchiveEntry(entry);
|
||||
|
||||
InputStream is = fileSystem.open(fileStatus.getPath());
|
||||
|
||||
BufferedInputStream bis = new BufferedInputStream(is);
|
||||
|
||||
int count;
|
||||
byte[] data = new byte[1024];
|
||||
while ((count = bis.read(data, 0, data.length)) != -1) {
|
||||
ar.write(data, 0, count);
|
||||
}
|
||||
bis.close();
|
||||
ar.closeArchiveEntry();
|
||||
|
||||
}
|
||||
return currentSize;
|
||||
}
|
||||
|
||||
}
|
|
@ -1,152 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common;
|
||||
|
||||
import static com.mongodb.client.model.Sorts.descending;
|
||||
|
||||
import java.io.Closeable;
|
||||
import java.io.IOException;
|
||||
import java.util.*;
|
||||
import java.util.stream.Collectors;
|
||||
import java.util.stream.StreamSupport;
|
||||
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.bson.Document;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import com.google.common.collect.Iterables;
|
||||
import com.mongodb.BasicDBObject;
|
||||
import com.mongodb.MongoClient;
|
||||
import com.mongodb.MongoClientURI;
|
||||
import com.mongodb.QueryBuilder;
|
||||
import com.mongodb.client.FindIterable;
|
||||
import com.mongodb.client.MongoCollection;
|
||||
import com.mongodb.client.MongoDatabase;
|
||||
|
||||
public class MdstoreClient implements Closeable {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(MdstoreClient.class);
|
||||
|
||||
private final MongoClient client;
|
||||
private final MongoDatabase db;
|
||||
|
||||
private static final String COLL_METADATA = "metadata";
|
||||
private static final String COLL_METADATA_MANAGER = "metadataManager";
|
||||
|
||||
public MdstoreClient(final String baseUrl, final String dbName) {
|
||||
this.client = new MongoClient(new MongoClientURI(baseUrl));
|
||||
this.db = getDb(client, dbName);
|
||||
}
|
||||
|
||||
private Long parseTimestamp(Document f) {
|
||||
if (f == null || !f.containsKey("timestamp"))
|
||||
return null;
|
||||
|
||||
Object ts = f.get("timestamp");
|
||||
|
||||
return Long.parseLong(ts.toString());
|
||||
}
|
||||
|
||||
public Long getLatestTimestamp(final String collectionId) {
|
||||
MongoCollection<Document> collection = db.getCollection(collectionId);
|
||||
FindIterable<Document> result = collection.find().sort(descending("timestamp")).limit(1);
|
||||
if (result == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
Document f = result.first();
|
||||
return parseTimestamp(f);
|
||||
}
|
||||
|
||||
public MongoCollection<Document> mdStore(final String mdId) {
|
||||
BasicDBObject query = (BasicDBObject) QueryBuilder.start("mdId").is(mdId).get();
|
||||
|
||||
log.info("querying current mdId: {}", query.toJson());
|
||||
|
||||
final String currentId = Optional
|
||||
.ofNullable(getColl(db, COLL_METADATA_MANAGER, true).find(query))
|
||||
.map(FindIterable::first)
|
||||
.map(d -> d.getString("currentId"))
|
||||
.orElseThrow(() -> new IllegalArgumentException("cannot find current mdstore id for: " + mdId));
|
||||
|
||||
log.info("currentId: {}", currentId);
|
||||
|
||||
return getColl(db, currentId, true);
|
||||
}
|
||||
|
||||
public List<MDStoreInfo> mdStoreWithTimestamp(final String mdFormat, final String mdLayout,
|
||||
final String mdInterpretation) {
|
||||
Map<String, String> res = validCollections(mdFormat, mdLayout, mdInterpretation);
|
||||
return res
|
||||
.entrySet()
|
||||
.stream()
|
||||
.map(e -> new MDStoreInfo(e.getKey(), e.getValue(), getLatestTimestamp(e.getValue())))
|
||||
.collect(Collectors.toList());
|
||||
}
|
||||
|
||||
public Map<String, String> validCollections(
|
||||
final String mdFormat, final String mdLayout, final String mdInterpretation) {
|
||||
|
||||
final Map<String, String> transactions = new HashMap<>();
|
||||
for (final Document entry : getColl(db, COLL_METADATA_MANAGER, true).find()) {
|
||||
final String mdId = entry.getString("mdId");
|
||||
final String currentId = entry.getString("currentId");
|
||||
if (StringUtils.isNoneBlank(mdId, currentId)) {
|
||||
transactions.put(mdId, currentId);
|
||||
}
|
||||
}
|
||||
|
||||
final Map<String, String> res = new HashMap<>();
|
||||
for (final Document entry : getColl(db, COLL_METADATA, true).find()) {
|
||||
if (entry.getString("format").equals(mdFormat)
|
||||
&& entry.getString("layout").equals(mdLayout)
|
||||
&& entry.getString("interpretation").equals(mdInterpretation)
|
||||
&& transactions.containsKey(entry.getString("mdId"))) {
|
||||
res.put(entry.getString("mdId"), transactions.get(entry.getString("mdId")));
|
||||
}
|
||||
}
|
||||
|
||||
return res;
|
||||
}
|
||||
|
||||
private MongoDatabase getDb(final MongoClient client, final String dbName) {
|
||||
if (!Iterables.contains(client.listDatabaseNames(), dbName)) {
|
||||
final String err = String.format("Database '%s' not found in %s", dbName, client.getAddress());
|
||||
log.warn(err);
|
||||
throw new IllegalArgumentException(err);
|
||||
}
|
||||
return client.getDatabase(dbName);
|
||||
}
|
||||
|
||||
private MongoCollection<Document> getColl(
|
||||
final MongoDatabase db, final String collName, final boolean abortIfMissing) {
|
||||
if (!Iterables.contains(db.listCollectionNames(), collName)) {
|
||||
final String err = String
|
||||
.format(
|
||||
String.format("Missing collection '%s' in database '%s'", collName, db.getName()));
|
||||
log.warn(err);
|
||||
if (abortIfMissing) {
|
||||
throw new IllegalArgumentException(err);
|
||||
} else {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
return db.getCollection(collName);
|
||||
}
|
||||
|
||||
public Iterable<String> listRecords(final String collName) {
|
||||
final MongoCollection<Document> coll = getColl(db, collName, false);
|
||||
return coll == null
|
||||
? new ArrayList<>()
|
||||
: () -> StreamSupport
|
||||
.stream(coll.find().spliterator(), false)
|
||||
.filter(e -> e.containsKey("body"))
|
||||
.map(e -> e.getString("body"))
|
||||
.iterator();
|
||||
}
|
||||
|
||||
@Override
|
||||
public void close() throws IOException {
|
||||
client.close();
|
||||
}
|
||||
}
|
|
@ -1,18 +1,18 @@
|
|||
|
||||
package eu.dnetlib.dhp.common;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.text.Normalizer;
|
||||
import java.util.*;
|
||||
import java.util.stream.Collectors;
|
||||
import java.util.HashSet;
|
||||
import java.util.List;
|
||||
import java.util.Set;
|
||||
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.apache.commons.lang3.text.WordUtils;
|
||||
|
||||
import com.ctc.wstx.dtd.LargePrefixedNameSet;
|
||||
import com.google.common.base.Joiner;
|
||||
import com.google.common.base.Splitter;
|
||||
import com.google.common.collect.Iterables;
|
||||
import com.google.common.collect.Lists;
|
||||
import com.google.common.hash.Hashing;
|
||||
|
||||
|
@ -24,24 +24,13 @@ import com.google.common.hash.Hashing;
|
|||
*/
|
||||
public class PacePerson {
|
||||
|
||||
private static final String UTF8 = "UTF-8";
|
||||
private List<String> name = Lists.newArrayList();
|
||||
private List<String> surname = Lists.newArrayList();
|
||||
private List<String> fullname = Lists.newArrayList();
|
||||
private final String original;
|
||||
|
||||
private static Set<String> particles;
|
||||
|
||||
static {
|
||||
try {
|
||||
particles = new HashSet<>(IOUtils
|
||||
.readLines(
|
||||
PacePerson.class
|
||||
.getResourceAsStream(
|
||||
"/eu/dnetlib/dhp/common/name_particles.txt")));
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
}
|
||||
private static Set<String> particles = null;
|
||||
|
||||
/**
|
||||
* Capitalizes a string
|
||||
|
@ -49,20 +38,29 @@ public class PacePerson {
|
|||
* @param s the string to capitalize
|
||||
* @return the input string with capital letter
|
||||
*/
|
||||
public static String capitalize(final String s) {
|
||||
if (particles.contains(s)) {
|
||||
return s;
|
||||
}
|
||||
public static final String capitalize(final String s) {
|
||||
return WordUtils.capitalize(s.toLowerCase(), ' ', '-');
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a dot to a string with length equals to 1
|
||||
*/
|
||||
public static String dotAbbreviations(final String s) {
|
||||
public static final String dotAbbreviations(final String s) {
|
||||
return s.length() == 1 ? s + "." : s;
|
||||
}
|
||||
|
||||
public static Set<String> loadFromClasspath(final String classpath) {
|
||||
final Set<String> h = new HashSet<>();
|
||||
try {
|
||||
for (final String s : IOUtils.readLines(PacePerson.class.getResourceAsStream(classpath))) {
|
||||
h.add(s);
|
||||
}
|
||||
} catch (final Throwable e) {
|
||||
return new HashSet<>();
|
||||
}
|
||||
return h;
|
||||
}
|
||||
|
||||
/**
|
||||
* The constructor of the class. It fills the fields of the class basing on the input fullname.
|
||||
*
|
||||
|
@ -131,6 +129,10 @@ public class PacePerson {
|
|||
}
|
||||
|
||||
private List<String> splitTerms(final String s) {
|
||||
if (particles == null) {
|
||||
particles = loadFromClasspath("/eu/dnetlib/dhp/oa/graph/pace/name_particles.txt");
|
||||
}
|
||||
|
||||
final List<String> list = Lists.newArrayList();
|
||||
for (final String part : Splitter.on(" ").omitEmptyStrings().split(s)) {
|
||||
if (!particles.contains(part.toLowerCase())) {
|
||||
|
@ -186,36 +188,17 @@ public class PacePerson {
|
|||
}
|
||||
|
||||
public List<String> getCapitalFirstnames() {
|
||||
return Optional
|
||||
.ofNullable(getNameWithAbbreviations())
|
||||
.map(
|
||||
name -> name
|
||||
.stream()
|
||||
.map(PacePerson::capitalize)
|
||||
.collect(Collectors.toList()))
|
||||
.orElse(new ArrayList<>());
|
||||
return Lists
|
||||
.newArrayList(
|
||||
Iterables.transform(getNameWithAbbreviations(), PacePerson::capitalize));
|
||||
}
|
||||
|
||||
public List<String> getCapitalSurname() {
|
||||
return Optional
|
||||
.ofNullable(getSurname())
|
||||
.map(
|
||||
surname -> surname
|
||||
.stream()
|
||||
.map(PacePerson::capitalize)
|
||||
.collect(Collectors.toList()))
|
||||
.orElse(new ArrayList<>());
|
||||
return Lists.newArrayList(Iterables.transform(surname, PacePerson::capitalize));
|
||||
}
|
||||
|
||||
public List<String> getNameWithAbbreviations() {
|
||||
return Optional
|
||||
.ofNullable(getName())
|
||||
.map(
|
||||
name -> name
|
||||
.stream()
|
||||
.map(PacePerson::dotAbbreviations)
|
||||
.collect(Collectors.toList()))
|
||||
.orElse(new ArrayList<>());
|
||||
return Lists.newArrayList(Iterables.transform(name, PacePerson::dotAbbreviations));
|
||||
}
|
||||
|
||||
public boolean isAccurate() {
|
||||
|
|
|
@ -1,81 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.action;
|
||||
|
||||
import java.io.BufferedWriter;
|
||||
import java.io.IOException;
|
||||
import java.io.OutputStreamWriter;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.sql.ResultSet;
|
||||
import java.sql.SQLException;
|
||||
|
||||
import org.apache.hadoop.conf.Configuration;
|
||||
import org.apache.hadoop.fs.FSDataOutputStream;
|
||||
import org.apache.hadoop.fs.FileSystem;
|
||||
import org.apache.hadoop.fs.Path;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
|
||||
import eu.dnetlib.dhp.common.DbClient;
|
||||
import eu.dnetlib.dhp.common.action.model.MasterDuplicate;
|
||||
import eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils;
|
||||
|
||||
public class ReadDatasourceMasterDuplicateFromDB {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(ReadDatasourceMasterDuplicateFromDB.class);
|
||||
|
||||
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
|
||||
|
||||
private static final String QUERY = "SELECT distinct dd.id as masterId, d.officialname as masterName, dd.duplicate as duplicateId "
|
||||
+
|
||||
"FROM dsm_dedup_services dd join dsm_services d on (dd.id = d.id);";
|
||||
|
||||
public static int execute(String dbUrl, String dbUser, String dbPassword, String hdfsPath, String hdfsNameNode)
|
||||
throws IOException {
|
||||
int count = 0;
|
||||
try (DbClient dbClient = new DbClient(dbUrl, dbUser, dbPassword)) {
|
||||
Configuration conf = new Configuration();
|
||||
conf.set("fs.defaultFS", hdfsNameNode);
|
||||
FileSystem fileSystem = FileSystem.get(conf);
|
||||
FSDataOutputStream fos = fileSystem.create(new Path(hdfsPath));
|
||||
|
||||
log.info("running query: {}", QUERY);
|
||||
log.info("storing results in: {}", hdfsPath);
|
||||
|
||||
try (BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(fos, StandardCharsets.UTF_8))) {
|
||||
dbClient.processResults(QUERY, rs -> writeMap(datasourceMasterMap(rs), writer));
|
||||
count++;
|
||||
}
|
||||
}
|
||||
return count;
|
||||
}
|
||||
|
||||
private static MasterDuplicate datasourceMasterMap(ResultSet rs) {
|
||||
try {
|
||||
final MasterDuplicate md = new MasterDuplicate();
|
||||
|
||||
final String duplicateId = rs.getString("duplicateId");
|
||||
final String masterId = rs.getString("masterId");
|
||||
final String masterName = rs.getString("masterName");
|
||||
|
||||
md.setDuplicateId(OafMapperUtils.createOpenaireId(10, duplicateId, true));
|
||||
md.setMasterId(OafMapperUtils.createOpenaireId(10, masterId, true));
|
||||
md.setMasterName(masterName);
|
||||
|
||||
return md;
|
||||
} catch (final SQLException e) {
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
}
|
||||
|
||||
private static void writeMap(final MasterDuplicate dm, final BufferedWriter writer) {
|
||||
try {
|
||||
writer.write(OBJECT_MAPPER.writeValueAsString(dm));
|
||||
writer.newLine();
|
||||
} catch (final IOException e) {
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
}
|
||||
|
||||
}
|
|
@ -1,38 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.action.model;
|
||||
|
||||
import java.io.Serializable;
|
||||
|
||||
/**
|
||||
* @author miriam.baglioni
|
||||
* @Date 21/07/22
|
||||
*/
|
||||
public class MasterDuplicate implements Serializable {
|
||||
private String duplicateId;
|
||||
private String masterId;
|
||||
private String masterName;
|
||||
|
||||
public String getDuplicateId() {
|
||||
return duplicateId;
|
||||
}
|
||||
|
||||
public void setDuplicateId(String duplicateId) {
|
||||
this.duplicateId = duplicateId;
|
||||
}
|
||||
|
||||
public String getMasterId() {
|
||||
return masterId;
|
||||
}
|
||||
|
||||
public void setMasterId(String masterId) {
|
||||
this.masterId = masterId;
|
||||
}
|
||||
|
||||
public String getMasterName() {
|
||||
return masterName;
|
||||
}
|
||||
|
||||
public void setMasterName(String masterName) {
|
||||
this.masterName = masterName;
|
||||
}
|
||||
}
|
|
@ -1,45 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.aggregation;
|
||||
|
||||
import java.io.Closeable;
|
||||
import java.io.IOException;
|
||||
import java.util.HashMap;
|
||||
import java.util.LinkedHashMap;
|
||||
import java.util.Map;
|
||||
import java.util.Objects;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import eu.dnetlib.dhp.message.MessageSender;
|
||||
import eu.dnetlib.dhp.utils.DHPUtils;
|
||||
|
||||
public class AggregatorReport extends LinkedHashMap<String, String> implements Closeable {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(AggregatorReport.class);
|
||||
|
||||
private transient MessageSender messageSender;
|
||||
|
||||
public AggregatorReport() {
|
||||
}
|
||||
|
||||
public AggregatorReport(MessageSender messageSender) {
|
||||
this.messageSender = messageSender;
|
||||
}
|
||||
|
||||
public void ongoing(Long current, Long total) {
|
||||
messageSender.sendMessage(current, total);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void close() throws IOException {
|
||||
if (Objects.nonNull(messageSender)) {
|
||||
log.info("closing report: ");
|
||||
this.forEach((k, v) -> log.info("{} - {}", k, v));
|
||||
|
||||
Map<String, String> m = new HashMap<>();
|
||||
m.put(getClass().getSimpleName().toLowerCase(), DHPUtils.MAPPER.writeValueAsString(values()));
|
||||
messageSender.sendReport(m);
|
||||
}
|
||||
}
|
||||
}
|
|
@ -1,39 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.api.context;
|
||||
|
||||
public class CategorySummary {
|
||||
|
||||
private String id;
|
||||
|
||||
private String label;
|
||||
|
||||
private boolean hasConcept;
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public String getLabel() {
|
||||
return label;
|
||||
}
|
||||
|
||||
public boolean isHasConcept() {
|
||||
return hasConcept;
|
||||
}
|
||||
|
||||
public CategorySummary setId(final String id) {
|
||||
this.id = id;
|
||||
return this;
|
||||
}
|
||||
|
||||
public CategorySummary setLabel(final String label) {
|
||||
this.label = label;
|
||||
return this;
|
||||
}
|
||||
|
||||
public CategorySummary setHasConcept(final boolean hasConcept) {
|
||||
this.hasConcept = hasConcept;
|
||||
return this;
|
||||
}
|
||||
|
||||
}
|
|
@ -1,7 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.api.context;
|
||||
|
||||
import java.util.ArrayList;
|
||||
|
||||
public class CategorySummaryList extends ArrayList<CategorySummary> {
|
||||
}
|
|
@ -1,52 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.api.context;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
public class ConceptSummary {
|
||||
|
||||
private String id;
|
||||
|
||||
private String label;
|
||||
|
||||
public boolean hasSubConcept;
|
||||
|
||||
private List<ConceptSummary> concepts;
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public String getLabel() {
|
||||
return label;
|
||||
}
|
||||
|
||||
public List<ConceptSummary> getConcepts() {
|
||||
return concepts;
|
||||
}
|
||||
|
||||
public ConceptSummary setId(final String id) {
|
||||
this.id = id;
|
||||
return this;
|
||||
}
|
||||
|
||||
public ConceptSummary setLabel(final String label) {
|
||||
this.label = label;
|
||||
return this;
|
||||
}
|
||||
|
||||
public boolean isHasSubConcept() {
|
||||
return hasSubConcept;
|
||||
}
|
||||
|
||||
public ConceptSummary setHasSubConcept(final boolean hasSubConcept) {
|
||||
this.hasSubConcept = hasSubConcept;
|
||||
return this;
|
||||
}
|
||||
|
||||
public ConceptSummary setConcept(final List<ConceptSummary> concepts) {
|
||||
this.concepts = concepts;
|
||||
return this;
|
||||
}
|
||||
|
||||
}
|
|
@ -1,7 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.api.context;
|
||||
|
||||
import java.util.ArrayList;
|
||||
|
||||
public class ConceptSummaryList extends ArrayList<ConceptSummary> {
|
||||
}
|
|
@ -1,50 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.api.context;
|
||||
|
||||
public class ContextSummary {
|
||||
|
||||
private String id;
|
||||
|
||||
private String label;
|
||||
|
||||
private String type;
|
||||
|
||||
private String status;
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public String getLabel() {
|
||||
return label;
|
||||
}
|
||||
|
||||
public String getType() {
|
||||
return type;
|
||||
}
|
||||
|
||||
public String getStatus() {
|
||||
return status;
|
||||
}
|
||||
|
||||
public ContextSummary setId(final String id) {
|
||||
this.id = id;
|
||||
return this;
|
||||
}
|
||||
|
||||
public ContextSummary setLabel(final String label) {
|
||||
this.label = label;
|
||||
return this;
|
||||
}
|
||||
|
||||
public ContextSummary setType(final String type) {
|
||||
this.type = type;
|
||||
return this;
|
||||
}
|
||||
|
||||
public ContextSummary setStatus(final String status) {
|
||||
this.status = status;
|
||||
return this;
|
||||
}
|
||||
|
||||
}
|
|
@ -1,7 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.api.context;
|
||||
|
||||
import java.util.ArrayList;
|
||||
|
||||
public class ContextSummaryList extends ArrayList<ContextSummary> {
|
||||
}
|
|
@ -1,32 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.collection;
|
||||
|
||||
public class CollectorException extends Exception {
|
||||
|
||||
/** */
|
||||
private static final long serialVersionUID = -290723075076039757L;
|
||||
|
||||
public CollectorException() {
|
||||
super();
|
||||
}
|
||||
|
||||
public CollectorException(
|
||||
final String message,
|
||||
final Throwable cause,
|
||||
final boolean enableSuppression,
|
||||
final boolean writableStackTrace) {
|
||||
super(message, cause, enableSuppression, writableStackTrace);
|
||||
}
|
||||
|
||||
public CollectorException(final String message, final Throwable cause) {
|
||||
super(message, cause);
|
||||
}
|
||||
|
||||
public CollectorException(final String message) {
|
||||
super(message);
|
||||
}
|
||||
|
||||
public CollectorException(final Throwable cause) {
|
||||
super(cause);
|
||||
}
|
||||
}
|
|
@ -1,40 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.collection;
|
||||
|
||||
import java.io.BufferedOutputStream;
|
||||
import java.io.IOException;
|
||||
import java.util.zip.GZIPOutputStream;
|
||||
|
||||
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
|
||||
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
|
||||
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.apache.hadoop.fs.FSDataInputStream;
|
||||
import org.apache.hadoop.fs.FSDataOutputStream;
|
||||
import org.apache.hadoop.fs.FileSystem;
|
||||
import org.apache.hadoop.fs.Path;
|
||||
|
||||
public class DecompressTarGz {
|
||||
|
||||
public static void doExtract(FileSystem fs, String outputPath, String tarGzPath) throws IOException {
|
||||
|
||||
FSDataInputStream inputFileStream = fs.open(new Path(tarGzPath));
|
||||
try (TarArchiveInputStream tais = new TarArchiveInputStream(
|
||||
new GzipCompressorInputStream(inputFileStream))) {
|
||||
TarArchiveEntry entry = null;
|
||||
while ((entry = tais.getNextTarEntry()) != null) {
|
||||
if (!entry.isDirectory()) {
|
||||
try (
|
||||
FSDataOutputStream out = fs
|
||||
.create(new Path(outputPath.concat(entry.getName()).concat(".gz")));
|
||||
GZIPOutputStream gzipOs = new GZIPOutputStream(new BufferedOutputStream(out))) {
|
||||
|
||||
IOUtils.copy(tais, gzipOs);
|
||||
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
|
@ -1,56 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.collection;
|
||||
|
||||
import java.io.*;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.util.List;
|
||||
|
||||
import org.apache.hadoop.fs.FSDataOutputStream;
|
||||
import org.apache.hadoop.fs.FileSystem;
|
||||
import org.apache.hadoop.fs.Path;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.opencsv.bean.CsvToBeanBuilder;
|
||||
|
||||
public class GetCSV {
|
||||
|
||||
public static final char DEFAULT_DELIMITER = ',';
|
||||
|
||||
private GetCSV() {
|
||||
}
|
||||
|
||||
public static void getCsv(FileSystem fileSystem, BufferedReader reader, String hdfsPath,
|
||||
String modelClass) throws IOException, ClassNotFoundException {
|
||||
getCsv(fileSystem, reader, hdfsPath, modelClass, DEFAULT_DELIMITER);
|
||||
}
|
||||
|
||||
public static void getCsv(FileSystem fileSystem, Reader reader, String hdfsPath,
|
||||
String modelClass, char delimiter) throws IOException, ClassNotFoundException {
|
||||
|
||||
Path hdfsWritePath = new Path(hdfsPath);
|
||||
FSDataOutputStream fsDataOutputStream = null;
|
||||
if (fileSystem.exists(hdfsWritePath)) {
|
||||
fileSystem.delete(hdfsWritePath, false);
|
||||
}
|
||||
fsDataOutputStream = fileSystem.create(hdfsWritePath);
|
||||
|
||||
try (BufferedWriter writer = new BufferedWriter(
|
||||
new OutputStreamWriter(fsDataOutputStream, StandardCharsets.UTF_8))) {
|
||||
|
||||
final ObjectMapper mapper = new ObjectMapper();
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
final List lines = new CsvToBeanBuilder(reader)
|
||||
.withType(Class.forName(modelClass))
|
||||
.withSeparator(delimiter)
|
||||
.build()
|
||||
.parse();
|
||||
|
||||
for (Object line : lines) {
|
||||
writer.write(mapper.writeValueAsString(line));
|
||||
writer.newLine();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
|
@ -1,127 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.collection;
|
||||
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
|
||||
/**
|
||||
* Bundles the http connection parameters driving the client behaviour.
|
||||
*/
|
||||
public class HttpClientParams {
|
||||
|
||||
// Defaults
|
||||
public static int _maxNumberOfRetry = 3;
|
||||
public static int _requestDelay = 0; // milliseconds
|
||||
public static int _retryDelay = 10; // seconds
|
||||
public static int _connectTimeOut = 10; // seconds
|
||||
public static int _readTimeOut = 30; // seconds
|
||||
|
||||
public static String _requestMethod = "GET";
|
||||
|
||||
/**
|
||||
* Maximum number of allowed retires before failing
|
||||
*/
|
||||
private int maxNumberOfRetry;
|
||||
|
||||
/**
|
||||
* Delay between request (Milliseconds)
|
||||
*/
|
||||
private int requestDelay;
|
||||
|
||||
/**
|
||||
* Time to wait after a failure before retrying (Seconds)
|
||||
*/
|
||||
private int retryDelay;
|
||||
|
||||
/**
|
||||
* Connect timeout (Seconds)
|
||||
*/
|
||||
private int connectTimeOut;
|
||||
|
||||
/**
|
||||
* Read timeout (Seconds)
|
||||
*/
|
||||
private int readTimeOut;
|
||||
|
||||
/**
|
||||
* Custom http headers
|
||||
*/
|
||||
private Map<String, String> headers;
|
||||
|
||||
/**
|
||||
* Request method (i.e., GET, POST etc)
|
||||
*/
|
||||
private String requestMethod;
|
||||
|
||||
public HttpClientParams() {
|
||||
this(_maxNumberOfRetry, _requestDelay, _retryDelay, _connectTimeOut, _readTimeOut, new HashMap<>(),
|
||||
_requestMethod);
|
||||
}
|
||||
|
||||
public HttpClientParams(int maxNumberOfRetry, int requestDelay, int retryDelay, int connectTimeOut,
|
||||
int readTimeOut, Map<String, String> headers, String requestMethod) {
|
||||
this.maxNumberOfRetry = maxNumberOfRetry;
|
||||
this.requestDelay = requestDelay;
|
||||
this.retryDelay = retryDelay;
|
||||
this.connectTimeOut = connectTimeOut;
|
||||
this.readTimeOut = readTimeOut;
|
||||
this.headers = headers;
|
||||
this.requestMethod = requestMethod;
|
||||
}
|
||||
|
||||
public int getMaxNumberOfRetry() {
|
||||
return maxNumberOfRetry;
|
||||
}
|
||||
|
||||
public void setMaxNumberOfRetry(int maxNumberOfRetry) {
|
||||
this.maxNumberOfRetry = maxNumberOfRetry;
|
||||
}
|
||||
|
||||
public int getRequestDelay() {
|
||||
return requestDelay;
|
||||
}
|
||||
|
||||
public void setRequestDelay(int requestDelay) {
|
||||
this.requestDelay = requestDelay;
|
||||
}
|
||||
|
||||
public int getRetryDelay() {
|
||||
return retryDelay;
|
||||
}
|
||||
|
||||
public void setRetryDelay(int retryDelay) {
|
||||
this.retryDelay = retryDelay;
|
||||
}
|
||||
|
||||
public void setConnectTimeOut(int connectTimeOut) {
|
||||
this.connectTimeOut = connectTimeOut;
|
||||
}
|
||||
|
||||
public int getConnectTimeOut() {
|
||||
return connectTimeOut;
|
||||
}
|
||||
|
||||
public int getReadTimeOut() {
|
||||
return readTimeOut;
|
||||
}
|
||||
|
||||
public void setReadTimeOut(int readTimeOut) {
|
||||
this.readTimeOut = readTimeOut;
|
||||
}
|
||||
|
||||
public Map<String, String> getHeaders() {
|
||||
return headers;
|
||||
}
|
||||
|
||||
public void setHeaders(Map<String, String> headers) {
|
||||
this.headers = headers;
|
||||
}
|
||||
|
||||
public String getRequestMethod() {
|
||||
return requestMethod;
|
||||
}
|
||||
|
||||
public void setRequestMethod(String requestMethod) {
|
||||
this.requestMethod = requestMethod;
|
||||
}
|
||||
}
|
|
@ -1,309 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.collection;
|
||||
|
||||
import static eu.dnetlib.dhp.utils.DHPUtils.*;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.io.InputStream;
|
||||
import java.net.*;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.concurrent.TimeUnit;
|
||||
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.apache.commons.lang3.math.NumberUtils;
|
||||
import org.apache.http.HttpHeaders;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import eu.dnetlib.dhp.common.Constants;
|
||||
import eu.dnetlib.dhp.common.aggregation.AggregatorReport;
|
||||
|
||||
/**
|
||||
* Migrated from https://svn.driver.research-infrastructures.eu/driver/dnet45/modules/dnet-modular-collector-service/trunk/src/main/java/eu/dnetlib/data/collector/plugins/HttpConnector.java
|
||||
*
|
||||
* @author jochen, michele, andrea, alessia, claudio, andreas
|
||||
*/
|
||||
public class HttpConnector2 {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(HttpConnector2.class);
|
||||
|
||||
private static final String REPORT_PREFIX = "http:";
|
||||
|
||||
private HttpClientParams clientParams;
|
||||
|
||||
private String responseType = null;
|
||||
|
||||
private static final String userAgent = "Mozilla/5.0 (compatible; OAI; +http://www.openaire.eu)";
|
||||
|
||||
public HttpConnector2() {
|
||||
this(new HttpClientParams());
|
||||
}
|
||||
|
||||
public HttpConnector2(HttpClientParams clientParams) {
|
||||
this.clientParams = clientParams;
|
||||
CookieHandler.setDefault(new CookieManager(null, CookiePolicy.ACCEPT_ALL));
|
||||
}
|
||||
|
||||
/**
|
||||
* @see HttpConnector2#getInputSource(java.lang.String, AggregatorReport)
|
||||
*/
|
||||
public InputStream getInputSourceAsStream(final String requestUrl) throws CollectorException {
|
||||
return IOUtils.toInputStream(getInputSource(requestUrl));
|
||||
}
|
||||
|
||||
/**
|
||||
* @see HttpConnector2#getInputSource(java.lang.String, AggregatorReport)
|
||||
*/
|
||||
public String getInputSource(final String requestUrl) throws CollectorException {
|
||||
return attemptDownloadAsString(requestUrl, 1, new AggregatorReport());
|
||||
}
|
||||
|
||||
/**
|
||||
* Given the URL returns the content via HTTP GET
|
||||
*
|
||||
* @param requestUrl the URL
|
||||
* @param report the list of errors
|
||||
* @return the content of the downloaded resource
|
||||
* @throws CollectorException when retrying more than maxNumberOfRetry times
|
||||
*/
|
||||
public String getInputSource(final String requestUrl, AggregatorReport report)
|
||||
throws CollectorException {
|
||||
return attemptDownloadAsString(requestUrl, 1, report);
|
||||
}
|
||||
|
||||
private String attemptDownloadAsString(final String requestUrl, final int retryNumber,
|
||||
final AggregatorReport report) throws CollectorException {
|
||||
|
||||
try (InputStream s = attemptDownload(requestUrl, retryNumber, report)) {
|
||||
return IOUtils.toString(s);
|
||||
} catch (IOException e) {
|
||||
log.error(e.getMessage(), e);
|
||||
throw new CollectorException(e);
|
||||
}
|
||||
}
|
||||
|
||||
private InputStream attemptDownload(final String requestUrl, final int retryNumber,
|
||||
final AggregatorReport report) throws CollectorException, IOException {
|
||||
|
||||
if (retryNumber > getClientParams().getMaxNumberOfRetry()) {
|
||||
final String msg = String
|
||||
.format(
|
||||
"Max number of retries (%s/%s) exceeded, failing.",
|
||||
retryNumber, getClientParams().getMaxNumberOfRetry());
|
||||
log.error(msg);
|
||||
throw new CollectorException(msg);
|
||||
}
|
||||
|
||||
InputStream input = null;
|
||||
|
||||
long start = System.currentTimeMillis();
|
||||
try {
|
||||
if (getClientParams().getRequestDelay() > 0) {
|
||||
backoffAndSleep(getClientParams().getRequestDelay());
|
||||
}
|
||||
|
||||
log.info("Request attempt {} [{}]", retryNumber, requestUrl);
|
||||
|
||||
final HttpURLConnection urlConn = (HttpURLConnection) new URL(requestUrl).openConnection();
|
||||
urlConn.setInstanceFollowRedirects(false);
|
||||
urlConn.setReadTimeout(getClientParams().getReadTimeOut() * 1000);
|
||||
urlConn.setConnectTimeout(getClientParams().getConnectTimeOut() * 1000);
|
||||
urlConn.addRequestProperty(HttpHeaders.USER_AGENT, userAgent);
|
||||
urlConn.setRequestMethod(getClientParams().getRequestMethod());
|
||||
|
||||
// if provided, add custom headers
|
||||
if (!getClientParams().getHeaders().isEmpty()) {
|
||||
for (Map.Entry<String, String> headerEntry : getClientParams().getHeaders().entrySet()) {
|
||||
urlConn.addRequestProperty(headerEntry.getKey(), headerEntry.getValue());
|
||||
}
|
||||
}
|
||||
|
||||
logHeaderFields(urlConn);
|
||||
|
||||
int retryAfter = obtainRetryAfter(urlConn.getHeaderFields());
|
||||
String rateLimit = urlConn.getHeaderField(Constants.HTTPHEADER_IETF_DRAFT_RATELIMIT_LIMIT);
|
||||
String rateRemaining = urlConn.getHeaderField(Constants.HTTPHEADER_IETF_DRAFT_RATELIMIT_REMAINING);
|
||||
|
||||
if ((rateLimit != null) && (rateRemaining != null) && (Integer.parseInt(rateRemaining) < 2)) {
|
||||
if (retryAfter > 0) {
|
||||
backoffAndSleep(retryAfter);
|
||||
} else {
|
||||
backoffAndSleep(1000);
|
||||
}
|
||||
}
|
||||
|
||||
if (is2xx(urlConn.getResponseCode())) {
|
||||
return getInputStream(urlConn, start);
|
||||
}
|
||||
if (is3xx(urlConn.getResponseCode())) {
|
||||
// REDIRECTS
|
||||
final String newUrl = obtainNewLocation(urlConn.getHeaderFields());
|
||||
log.info("The requested url has been moved to {}", newUrl);
|
||||
report
|
||||
.put(
|
||||
REPORT_PREFIX + urlConn.getResponseCode(),
|
||||
String.format("Moved to: %s", newUrl));
|
||||
logRequestTime(start);
|
||||
urlConn.disconnect();
|
||||
if (retryAfter > 0) {
|
||||
backoffAndSleep(retryAfter);
|
||||
}
|
||||
return attemptDownload(newUrl, retryNumber + 1, report);
|
||||
}
|
||||
if (is4xx(urlConn.getResponseCode()) || is5xx(urlConn.getResponseCode())) {
|
||||
switch (urlConn.getResponseCode()) {
|
||||
case HttpURLConnection.HTTP_NOT_FOUND:
|
||||
case HttpURLConnection.HTTP_BAD_GATEWAY:
|
||||
case HttpURLConnection.HTTP_UNAVAILABLE:
|
||||
case HttpURLConnection.HTTP_GATEWAY_TIMEOUT:
|
||||
if (retryAfter > 0) {
|
||||
log
|
||||
.warn(
|
||||
"waiting and repeating request after suggested retry-after {} sec for URL {}",
|
||||
retryAfter, requestUrl);
|
||||
backoffAndSleep(retryAfter * 1000);
|
||||
} else {
|
||||
log
|
||||
.warn(
|
||||
"waiting and repeating request after default delay of {} sec for URL {}",
|
||||
getClientParams().getRetryDelay(), requestUrl);
|
||||
backoffAndSleep(retryNumber * getClientParams().getRetryDelay());
|
||||
}
|
||||
report.put(REPORT_PREFIX + urlConn.getResponseCode(), requestUrl);
|
||||
|
||||
logRequestTime(start);
|
||||
|
||||
urlConn.disconnect();
|
||||
|
||||
return attemptDownload(requestUrl, retryNumber + 1, report);
|
||||
case 422: // UNPROCESSABLE ENTITY
|
||||
report.put(REPORT_PREFIX + urlConn.getResponseCode(), requestUrl);
|
||||
log.warn("waiting and repeating request after 10 sec for URL {}", requestUrl);
|
||||
backoffAndSleep(10000);
|
||||
urlConn.disconnect();
|
||||
logRequestTime(start);
|
||||
try {
|
||||
return getInputStream(urlConn, start);
|
||||
} catch (IOException e) {
|
||||
log
|
||||
.error(
|
||||
"server returned 422 and got IOException accessing the response body from URL {}",
|
||||
requestUrl);
|
||||
log.error("IOException:", e);
|
||||
return attemptDownload(requestUrl, retryNumber + 1, report);
|
||||
}
|
||||
default:
|
||||
log.error("gor error {} from URL: {}", urlConn.getResponseCode(), urlConn.getURL());
|
||||
log.error("response message: {}", urlConn.getResponseMessage());
|
||||
report
|
||||
.put(
|
||||
REPORT_PREFIX + urlConn.getResponseCode(),
|
||||
String
|
||||
.format(
|
||||
"%s Error: %s", requestUrl, urlConn.getResponseMessage()));
|
||||
logRequestTime(start);
|
||||
urlConn.disconnect();
|
||||
throw new CollectorException(urlConn.getResponseCode() + " error " + report);
|
||||
}
|
||||
}
|
||||
throw new CollectorException(
|
||||
String
|
||||
.format(
|
||||
"Unexpected status code: %s errors: %s", urlConn.getResponseCode(),
|
||||
MAPPER.writeValueAsString(report)));
|
||||
} catch (MalformedURLException e) {
|
||||
log.error(e.getMessage(), e);
|
||||
report.put(e.getClass().getName(), e.getMessage());
|
||||
throw new CollectorException(e.getMessage(), e);
|
||||
} catch (SocketTimeoutException | SocketException | UnknownHostException e) {
|
||||
log.error(e.getMessage(), e);
|
||||
report.put(e.getClass().getName(), e.getMessage());
|
||||
backoffAndSleep(getClientParams().getRetryDelay() * retryNumber * 1000);
|
||||
return attemptDownload(requestUrl, retryNumber + 1, report);
|
||||
}
|
||||
}
|
||||
|
||||
private InputStream getInputStream(HttpURLConnection urlConn, long start) throws IOException {
|
||||
InputStream input = urlConn.getInputStream();
|
||||
responseType = urlConn.getContentType();
|
||||
logRequestTime(start);
|
||||
return input;
|
||||
}
|
||||
|
||||
private static void logRequestTime(long start) {
|
||||
log
|
||||
.info(
|
||||
"request time elapsed: {}sec",
|
||||
TimeUnit.MILLISECONDS.toSeconds(System.currentTimeMillis() - start));
|
||||
}
|
||||
|
||||
private void logHeaderFields(final HttpURLConnection urlConn) throws IOException {
|
||||
log.info("Response: {} - {}", urlConn.getResponseCode(), urlConn.getResponseMessage());
|
||||
|
||||
for (Map.Entry<String, List<String>> e : urlConn.getHeaderFields().entrySet()) {
|
||||
if (e.getKey() != null) {
|
||||
for (String v : e.getValue()) {
|
||||
log.info(" key: {} - value: {}", e.getKey(), v);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void backoffAndSleep(int sleepTimeMs) throws CollectorException {
|
||||
log.info("I'm going to sleep for {}ms", sleepTimeMs);
|
||||
try {
|
||||
Thread.sleep(sleepTimeMs);
|
||||
} catch (InterruptedException e) {
|
||||
log.error(e.getMessage(), e);
|
||||
throw new CollectorException(e);
|
||||
}
|
||||
}
|
||||
|
||||
private int obtainRetryAfter(final Map<String, List<String>> headerMap) {
|
||||
for (String key : headerMap.keySet()) {
|
||||
if ((key != null) && key.equalsIgnoreCase(HttpHeaders.RETRY_AFTER) && (!headerMap.get(key).isEmpty())
|
||||
&& NumberUtils.isCreatable(headerMap.get(key).get(0))) {
|
||||
return Integer.parseInt(headerMap.get(key).get(0));
|
||||
}
|
||||
}
|
||||
return -1;
|
||||
}
|
||||
|
||||
private String obtainNewLocation(final Map<String, List<String>> headerMap) throws CollectorException {
|
||||
for (String key : headerMap.keySet()) {
|
||||
if ((key != null) && key.equalsIgnoreCase(HttpHeaders.LOCATION) && (headerMap.get(key).size() > 0)) {
|
||||
return headerMap.get(key).get(0);
|
||||
}
|
||||
}
|
||||
throw new CollectorException("The requested url has been MOVED, but 'location' param is MISSING");
|
||||
}
|
||||
|
||||
private boolean is2xx(final int statusCode) {
|
||||
return statusCode >= 200 && statusCode <= 299;
|
||||
}
|
||||
|
||||
private boolean is4xx(final int statusCode) {
|
||||
return statusCode >= 400 && statusCode <= 499;
|
||||
}
|
||||
|
||||
private boolean is3xx(final int statusCode) {
|
||||
return statusCode >= 300 && statusCode <= 399;
|
||||
}
|
||||
|
||||
private boolean is5xx(final int statusCode) {
|
||||
return statusCode >= 500 && statusCode <= 599;
|
||||
}
|
||||
|
||||
public String getResponseType() {
|
||||
return responseType;
|
||||
}
|
||||
|
||||
public HttpClientParams getClientParams() {
|
||||
return clientParams;
|
||||
}
|
||||
|
||||
public void setClientParams(HttpClientParams clientParams) {
|
||||
this.clientParams = clientParams;
|
||||
}
|
||||
}
|
|
@ -1,75 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.rest;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Arrays;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.apache.http.client.methods.HttpGet;
|
||||
import org.apache.http.client.methods.HttpPost;
|
||||
import org.apache.http.client.methods.HttpUriRequest;
|
||||
import org.apache.http.entity.StringEntity;
|
||||
import org.apache.http.impl.client.CloseableHttpClient;
|
||||
import org.apache.http.impl.client.HttpClients;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
|
||||
public class DNetRestClient {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(DNetRestClient.class);
|
||||
|
||||
private static final ObjectMapper mapper = new ObjectMapper();
|
||||
|
||||
private DNetRestClient() {
|
||||
}
|
||||
|
||||
public static <T> T doGET(final String url, Class<T> clazz) throws Exception {
|
||||
final HttpGet httpGet = new HttpGet(url);
|
||||
return doHTTPRequest(httpGet, clazz);
|
||||
}
|
||||
|
||||
public static String doGET(final String url) throws IOException {
|
||||
final HttpGet httpGet = new HttpGet(url);
|
||||
return doHTTPRequest(httpGet);
|
||||
}
|
||||
|
||||
public static <V> String doPOST(final String url, V objParam) throws IOException {
|
||||
final HttpPost httpPost = new HttpPost(url);
|
||||
|
||||
if (objParam != null) {
|
||||
final StringEntity entity = new StringEntity(mapper.writeValueAsString(objParam));
|
||||
httpPost.setEntity(entity);
|
||||
httpPost.setHeader("Accept", "application/json");
|
||||
httpPost.setHeader("Content-type", "application/json");
|
||||
}
|
||||
return doHTTPRequest(httpPost);
|
||||
}
|
||||
|
||||
public static <T, V> T doPOST(final String url, V objParam, Class<T> clazz) throws IOException {
|
||||
return mapper.readValue(doPOST(url, objParam), clazz);
|
||||
}
|
||||
|
||||
private static String doHTTPRequest(final HttpUriRequest r) throws IOException {
|
||||
try (CloseableHttpClient client = HttpClients.createDefault()) {
|
||||
|
||||
log.info("performing HTTP request, method {} on URI {}", r.getMethod(), r.getURI().toString());
|
||||
log
|
||||
.info(
|
||||
"request headers: {}",
|
||||
Arrays
|
||||
.asList(r.getAllHeaders())
|
||||
.stream()
|
||||
.map(h -> h.getName() + ":" + h.getValue())
|
||||
.collect(Collectors.joining(",")));
|
||||
|
||||
return IOUtils.toString(client.execute(r).getEntity().getContent());
|
||||
}
|
||||
}
|
||||
|
||||
private static <T> T doHTTPRequest(final HttpUriRequest r, Class<T> clazz) throws Exception {
|
||||
return mapper.readValue(doHTTPRequest(r), clazz);
|
||||
}
|
||||
}
|
|
@ -1,108 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.common.vocabulary;
|
||||
|
||||
import java.io.Serializable;
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
import java.util.Objects;
|
||||
import java.util.Optional;
|
||||
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
|
||||
import com.google.common.collect.Maps;
|
||||
|
||||
import eu.dnetlib.dhp.schema.oaf.Qualifier;
|
||||
import eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils;
|
||||
|
||||
public class Vocabulary implements Serializable {
|
||||
|
||||
private final String id;
|
||||
private final String name;
|
||||
|
||||
/**
|
||||
* Code to Term mappings for this Vocabulary.
|
||||
*/
|
||||
private final Map<String, VocabularyTerm> terms = new HashMap<>();
|
||||
|
||||
/**
|
||||
* Synonym to Code mappings for this Vocabulary.
|
||||
*/
|
||||
private final Map<String, String> synonyms = Maps.newHashMap();
|
||||
|
||||
public Vocabulary(final String id, final String name) {
|
||||
this.id = id;
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public String getName() {
|
||||
return name;
|
||||
}
|
||||
|
||||
protected Map<String, VocabularyTerm> getTerms() {
|
||||
return terms;
|
||||
}
|
||||
|
||||
public VocabularyTerm getTerm(final String id) {
|
||||
return Optional.ofNullable(id).map(String::toLowerCase).map(terms::get).orElse(null);
|
||||
}
|
||||
|
||||
protected void addTerm(final String id, final String name) {
|
||||
terms.put(id.toLowerCase(), new VocabularyTerm(id, name));
|
||||
}
|
||||
|
||||
protected boolean termExists(final String id) {
|
||||
return terms.containsKey(id.toLowerCase());
|
||||
}
|
||||
|
||||
protected void addSynonym(final String syn, final String termCode) {
|
||||
synonyms.put(syn, termCode.toLowerCase());
|
||||
}
|
||||
|
||||
public VocabularyTerm getTermBySynonym(final String syn) {
|
||||
return Optional
|
||||
.ofNullable(syn)
|
||||
.map(s -> getTerm(synonyms.get(s.toLowerCase())))
|
||||
.orElse(null);
|
||||
}
|
||||
|
||||
public Qualifier getTermAsQualifier(final String termId) {
|
||||
return getTermAsQualifier(termId, false);
|
||||
}
|
||||
|
||||
public Qualifier getTermAsQualifier(final String termId, boolean strict) {
|
||||
final VocabularyTerm term = getTerm(termId);
|
||||
if (Objects.nonNull(term)) {
|
||||
return OafMapperUtils.qualifier(term.getId(), term.getName(), getId(), getName());
|
||||
} else if (Objects.isNull(term) && strict) {
|
||||
return OafMapperUtils.unknown(getId(), getName());
|
||||
} else {
|
||||
return OafMapperUtils.qualifier(termId, termId, getId(), getName());
|
||||
}
|
||||
}
|
||||
|
||||
public Qualifier getSynonymAsQualifier(final String syn) {
|
||||
return getSynonymAsQualifier(syn, false);
|
||||
}
|
||||
|
||||
public Qualifier getSynonymAsQualifier(final String syn, boolean strict) {
|
||||
return Optional
|
||||
.ofNullable(getTermBySynonym(syn))
|
||||
.map(term -> getTermAsQualifier(term.getId(), strict))
|
||||
.orElse(null);
|
||||
}
|
||||
|
||||
public Qualifier lookup(String id) {
|
||||
return lookup(id, false);
|
||||
}
|
||||
|
||||
public Qualifier lookup(String id, boolean strict) {
|
||||
return Optional
|
||||
.ofNullable(getSynonymAsQualifier(id, strict))
|
||||
.orElse(getTermAsQualifier(id, strict));
|
||||
}
|
||||
|
||||
}
|
|
@@ -1,207 +0,0 @@

package eu.dnetlib.dhp.common.vocabulary;

import java.io.Serializable;
import java.util.*;
import java.util.stream.Collectors;

import org.apache.commons.lang3.StringUtils;

import eu.dnetlib.dhp.schema.oaf.Qualifier;
import eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;

public class VocabularyGroup implements Serializable {

    public static final String VOCABULARIES_XQUERY = "for $x in collection('/db/DRIVER/VocabularyDSResources/VocabularyDSResourceType') \n"
        +
        "let $vocid := $x//VOCABULARY_NAME/@code\n" +
        "let $vocname := $x//VOCABULARY_NAME/text()\n" +
        "for $term in ($x//TERM)\n" +
        "return concat($vocid,' @=@ ',$vocname,' @=@ ',$term/@code,' @=@ ',$term/@english_name)";

    public static final String VOCABULARY_SYNONYMS_XQUERY = "for $x in collection('/db/DRIVER/VocabularyDSResources/VocabularyDSResourceType')\n"
        +
        "let $vocid := $x//VOCABULARY_NAME/@code\n" +
        "let $vocname := $x//VOCABULARY_NAME/text()\n" +
        "for $term in ($x//TERM)\n" +
        "for $syn in ($term//SYNONYM/@term)\n" +
        "return concat($vocid,' @=@ ',$term/@code,' @=@ ', $syn)\n";

    public static VocabularyGroup loadVocsFromIS(ISLookUpService isLookUpService) throws ISLookUpException {

        final VocabularyGroup vocs = new VocabularyGroup();

        for (final String s : isLookUpService.quickSearchProfile(VOCABULARIES_XQUERY)) {
            final String[] arr = s.split("@=@");
            if (arr.length == 4) {
                final String vocId = arr[0].trim();
                final String vocName = arr[1].trim();
                final String termId = arr[2].trim();
                final String termName = arr[3].trim();

                if (!vocs.vocabularyExists(vocId)) {
                    vocs.addVocabulary(vocId, vocName);
                }

                vocs.addTerm(vocId, termId, termName);
            }
        }

        for (final String s : isLookUpService.quickSearchProfile(VOCABULARY_SYNONYMS_XQUERY)) {
            final String[] arr = s.split("@=@");
            if (arr.length == 3) {
                final String vocId = arr[0].trim();
                final String termId = arr[1].trim();
                final String syn = arr[2].trim();

                vocs.addSynonyms(vocId, termId, syn);
            }
        }

        // add the term names as synonyms
        vocs.vocs.values().forEach(voc -> {
            voc.getTerms().values().forEach(term -> {
                voc.addSynonym(term.getName().toLowerCase(), term.getId());
            });
        });

        return vocs;
    }

    private final Map<String, Vocabulary> vocs = new HashMap<>();

    public Set<String> vocabularyNames() {
        return vocs.keySet();
    }

    public void addVocabulary(final String id, final String name) {
        vocs.put(id.toLowerCase(), new Vocabulary(id, name));
    }

    public Optional<Vocabulary> find(final String vocId) {
        return Optional
            .ofNullable(vocId)
            .map(String::toLowerCase)
            .map(vocs::get);
    }

    public void addTerm(final String vocId, final String id, final String name) {
        if (vocabularyExists(vocId)) {
            vocs.get(vocId.toLowerCase()).addTerm(id, name);
        }
    }

    public VocabularyTerm getTerm(final String vocId, final String id) {
        if (termExists(vocId, id)) {
            return vocs.get(vocId.toLowerCase()).getTerm(id);
        } else {
            return new VocabularyTerm(id, id);
        }
    }

    public Set<String> getTerms(String vocId) {
        if (!vocabularyExists(vocId)) {
            return new HashSet<>();
        }
        return vocs
            .get(vocId.toLowerCase())
            .getTerms()
            .values()
            .stream()
            .map(VocabularyTerm::getId)
            .collect(Collectors.toCollection(HashSet::new));
    }

    public Qualifier lookup(String vocId, String id) {
        return Optional
            .ofNullable(getSynonymAsQualifier(vocId, id))
            .orElse(getTermAsQualifier(vocId, id));
    }

    public Qualifier getTermAsQualifier(final String vocId, final String id) {
        if (vocabularyExists(vocId)) {
            return vocs.get(vocId.toLowerCase()).getTermAsQualifier(id);
        }
        return OafMapperUtils.qualifier(id, id, "", "");
    }

    public Qualifier getSynonymAsQualifier(final String vocId, final String syn) {
        if (StringUtils.isBlank(vocId)) {
            return OafMapperUtils.unknown("", "");
        }
        return vocs.get(vocId.toLowerCase()).getSynonymAsQualifier(syn);
    }

    public Qualifier lookupTermBySynonym(final String vocId, final String syn) {
        return find(vocId)
            .map(
                vocabulary -> Optional
                    .ofNullable(vocabulary.getTerm(syn))
                    .map(
                        term -> OafMapperUtils
                            .qualifier(term.getId(), term.getName(), vocabulary.getId(), vocabulary.getName()))
                    .orElse(
                        Optional
                            .ofNullable(vocabulary.getTermBySynonym(syn))
                            .map(
                                term -> OafMapperUtils
                                    .qualifier(term.getId(), term.getName(), vocabulary.getId(), vocabulary.getName()))
                            .orElse(null)))
            .orElse(null);
    }

    /**
     * getSynonymAsQualifierCaseSensitive
     *
     * reflects the situation where a case sensitive vocabulary must be checked
     */
    public Qualifier getSynonymAsQualifierCaseSensitive(final String vocId, final String syn) {
        if (StringUtils.isBlank(vocId)) {
            return OafMapperUtils.unknown("", "");
        }
        return vocs.get(vocId).getSynonymAsQualifier(syn);
    }

    /**
     * termExists
     *
     * two methods: without and with a case sensitive check
     */
    public boolean termExists(final String vocId, final String id) {
        return termExists(vocId, id, Boolean.FALSE);
    }

    public boolean termExists(final String vocId, final String id, final Boolean caseSensitive) {
        if (Boolean.TRUE.equals(caseSensitive)) {
            return vocabularyExists(vocId) && vocs.get(vocId).termExists(id);
        }
        return vocabularyExists(vocId) && vocs.get(vocId.toLowerCase()).termExists(id);
    }

    public boolean vocabularyExists(final String vocId) {
        return Optional
            .ofNullable(vocId)
            .map(String::toLowerCase)
            .map(vocs::containsKey)
            .orElse(false);
    }

    private void addSynonyms(final String vocId, final String termId, final String syn) {
        String id = Optional
            .ofNullable(vocId)
            .map(String::toLowerCase)
            .orElseThrow(
                () -> new IllegalArgumentException(
                    String
                        .format(
                            "empty vocabulary id for [term:%s, synonym:%s]", termId, syn)));
        Optional
            .ofNullable(vocs.get(id))
            .orElseThrow(() -> new IllegalArgumentException("missing vocabulary id: " + vocId))
            .addSynonym(syn.toLowerCase(), termId);
    }

}
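A small sketch of how a VocabularyGroup can be assembled in memory (e.g. in a test), using only the public methods above; the vocabulary id and term are illustrative:

    VocabularyGroup vocs = new VocabularyGroup();
    vocs.addVocabulary("dnet:publication_resource", "Publication Resource");
    vocs.addTerm("dnet:publication_resource", "0001", "Article");

    boolean exists = vocs.termExists("dnet:publication_resource", "0001"); // true, the vocabulary id is case-insensitive
    Qualifier q = vocs.lookup("dnet:publication_resource", "0001");        // qualifier(0001, Article, vocId, vocName)

Synonyms are only registered through loadVocsFromIS, since addSynonyms is private.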
@@ -1,24 +0,0 @@

package eu.dnetlib.dhp.common.vocabulary;

import java.io.Serializable;

public class VocabularyTerm implements Serializable {

    private final String id;
    private final String name;

    public VocabularyTerm(final String id, final String name) {
        this.id = id;
        this.name = name;
    }

    public String getId() {
        return id;
    }

    public String getName() {
        return name;
    }

}
@@ -1,63 +0,0 @@

package eu.dnetlib.dhp.message;

import java.io.Serializable;
import java.util.LinkedHashMap;
import java.util.Map;

public class Message implements Serializable {

    private static final long serialVersionUID = 401753881204524893L;

    public static final String CURRENT_PARAM = "current";
    public static final String TOTAL_PARAM = "total";

    private MessageType messageType;

    private String workflowId;

    private Map<String, String> body;

    public Message() {
    }

    public Message(final MessageType messageType, final String workflowId) {
        this(messageType, workflowId, new LinkedHashMap<>());
    }

    public Message(final MessageType messageType, final String workflowId, final Map<String, String> body) {
        this.messageType = messageType;
        this.workflowId = workflowId;
        this.body = body;
    }

    public MessageType getMessageType() {
        return messageType;
    }

    public void setMessageType(MessageType messageType) {
        this.messageType = messageType;
    }

    public String getWorkflowId() {
        return workflowId;
    }

    public void setWorkflowId(final String workflowId) {
        this.workflowId = workflowId;
    }

    public Map<String, String> getBody() {
        return body;
    }

    public void setBody(final Map<String, String> body) {
        this.body = body;
    }

    @Override
    public String toString() {
        return String.format("Message [type=%s, workflowId=%s, body=%s]", messageType, workflowId, body);
    }

}
@@ -1,94 +0,0 @@

package eu.dnetlib.dhp.message;

import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPut;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MessageSender {

    private static final Logger log = LoggerFactory.getLogger(MessageSender.class);

    private static final int SOCKET_TIMEOUT_MS = 2000;

    private static final int CONNECTION_REQUEST_TIMEOUT_MS = 2000;

    private static final int CONNTECTION_TIMEOUT_MS = 2000;

    private final ObjectMapper objectMapper = new ObjectMapper();

    private final String dnetMessageEndpoint;

    private final String workflowId;

    private final ExecutorService executorService = Executors.newCachedThreadPool();

    public MessageSender(final String dnetMessageEndpoint, final String workflowId) {
        this.workflowId = workflowId;
        this.dnetMessageEndpoint = dnetMessageEndpoint;
    }

    public void sendMessage(final Message message) {
        executorService.submit(() -> _sendMessage(message));
    }

    public void sendMessage(final Long current, final Long total) {
        sendMessage(createOngoingMessage(current, total));
    }

    public void sendReport(final Map<String, String> report) {
        sendMessage(new Message(MessageType.REPORT, workflowId, report));
    }

    private Message createOngoingMessage(final Long current, final Long total) {
        final Message m = new Message(MessageType.ONGOING, workflowId);
        m.getBody().put(Message.CURRENT_PARAM, current.toString());
        if (total != null) {
            m.getBody().put(Message.TOTAL_PARAM, total.toString());
        }
        return m;
    }

    private void _sendMessage(final Message message) {
        try {
            final String json = objectMapper.writeValueAsString(message);

            final HttpPut req = new HttpPut(dnetMessageEndpoint);
            req.setEntity(new StringEntity(json, ContentType.APPLICATION_JSON));

            final RequestConfig requestConfig = RequestConfig
                .custom()
                .setConnectTimeout(CONNTECTION_TIMEOUT_MS)
                .setConnectionRequestTimeout(CONNECTION_REQUEST_TIMEOUT_MS)
                .setSocketTimeout(SOCKET_TIMEOUT_MS)
                .build();

            try (final CloseableHttpClient client = HttpClients
                .custom()
                .setDefaultRequestConfig(requestConfig)
                .build();
                final CloseableHttpResponse response = client.execute(req)) {
                log.debug("Sent Message to " + dnetMessageEndpoint);
                log.debug("MESSAGE:" + message);
            } catch (final Throwable e) {
                log.error("Error sending message to " + dnetMessageEndpoint + ", message content: " + message, e);
            }
        } catch (final JsonProcessingException e) {
            log.error("Error sending message to " + dnetMessageEndpoint + ", message content: " + message, e);
        }
    }

}
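A usage sketch for progress reporting from a workflow node (the endpoint URL and workflow id below are placeholders):

    MessageSender sender = new MessageSender("http://dnet-host:8080/messages", "wf-1234");
    sender.sendMessage(1000L, 10000L);   // asynchronous ONGOING message with current/total counters
    sender.sendReport(java.util.Collections.singletonMap("processedRecords", "10000"));  // final REPORT message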
@@ -1,21 +0,0 @@

package eu.dnetlib.dhp.message;

import java.io.Serializable;
import java.util.Optional;

import org.apache.commons.lang3.StringUtils;

public enum MessageType implements Serializable {

    ONGOING, REPORT;

    public MessageType from(String value) {
        return Optional
            .ofNullable(value)
            .map(StringUtils::upperCase)
            .map(MessageType::valueOf)
            .orElseThrow(() -> new IllegalArgumentException("unknown message type: " + value));
    }

}
@@ -0,0 +1,121 @@

package eu.dnetlib.dhp.model.mdstore;

import java.io.Serializable;

import eu.dnetlib.dhp.utils.DHPUtils;

/** This class models a record inside the new Metadata store collection on HDFS. */
public class MetadataRecord implements Serializable {

    /** The D-Net Identifier associated to the record */
    private String id;

    /** The original Identifier of the record */
    private String originalId;

    /** The encoding of the record, should be JSON or XML */
    private String encoding;

    /**
     * The information about the provenance of the record, see @{@link Provenance} for the model of this information
     */
    private Provenance provenance;

    /** The content of the metadata */
    private String body;

    /** the date when the record has been stored */
    private long dateOfCollection;

    /** the date when the record has been transformed */
    private long dateOfTransformation;

    public MetadataRecord() {
        this.dateOfCollection = System.currentTimeMillis();
    }

    public MetadataRecord(
        String originalId,
        String encoding,
        Provenance provenance,
        String body,
        long dateOfCollection) {

        this.originalId = originalId;
        this.encoding = encoding;
        this.provenance = provenance;
        this.body = body;
        this.dateOfCollection = dateOfCollection;
        this.id = DHPUtils.generateIdentifier(originalId, this.provenance.getNsPrefix());
    }

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getOriginalId() {
        return originalId;
    }

    public void setOriginalId(String originalId) {
        this.originalId = originalId;
    }

    public String getEncoding() {
        return encoding;
    }

    public void setEncoding(String encoding) {
        this.encoding = encoding;
    }

    public Provenance getProvenance() {
        return provenance;
    }

    public void setProvenance(Provenance provenance) {
        this.provenance = provenance;
    }

    public String getBody() {
        return body;
    }

    public void setBody(String body) {
        this.body = body;
    }

    public long getDateOfCollection() {
        return dateOfCollection;
    }

    public void setDateOfCollection(long dateOfCollection) {
        this.dateOfCollection = dateOfCollection;
    }

    public long getDateOfTransformation() {
        return dateOfTransformation;
    }

    public void setDateOfTransformation(long dateOfTransformation) {
        this.dateOfTransformation = dateOfTransformation;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof MetadataRecord)) {
            return false;
        }
        return ((MetadataRecord) o).getId().equalsIgnoreCase(id);
    }

    @Override
    public int hashCode() {
        return id.hashCode();
    }
}
@@ -0,0 +1,52 @@

package eu.dnetlib.dhp.model.mdstore;

import java.io.Serializable;

/**
 * @author Sandro La Bruzzo
 * <p>
 * The Provenance class models the provenance of the record in the metadataStore. It contains the identifier and the
 * name of the datasource that provides the record.
 */
public class Provenance implements Serializable {

    private String datasourceId;

    private String datasourceName;

    private String nsPrefix;

    public Provenance() {
    }

    public Provenance(String datasourceId, String datasourceName, String nsPrefix) {
        this.datasourceId = datasourceId;
        this.datasourceName = datasourceName;
        this.nsPrefix = nsPrefix;
    }

    public String getDatasourceId() {
        return datasourceId;
    }

    public void setDatasourceId(String datasourceId) {
        this.datasourceId = datasourceId;
    }

    public String getDatasourceName() {
        return datasourceName;
    }

    public void setDatasourceName(String datasourceName) {
        this.datasourceName = datasourceName;
    }

    public String getNsPrefix() {
        return nsPrefix;
    }

    public void setNsPrefix(String nsPrefix) {
        this.nsPrefix = nsPrefix;
    }
}
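A construction sketch combining the two classes above (all values are illustrative):

    Provenance prov = new Provenance("ds-id-123", "Example Datasource", "example_____");
    MetadataRecord record = new MetadataRecord(
        "oai:example.org:1234",    // original identifier
        "XML",                     // encoding
        prov,
        "<record>...</record>",    // metadata payload
        System.currentTimeMillis());
    // record.getId() is derived from the original id and the namespace prefix via DHPUtils.generateIdentifier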
@@ -1,234 +0,0 @@

package eu.dnetlib.dhp.oa.merge;

import java.text.Normalizer;
import java.util.*;
import java.util.stream.Collectors;

import org.apache.commons.lang3.StringUtils;

import com.wcohen.ss.JaroWinkler;

import eu.dnetlib.dhp.schema.oaf.Author;
import eu.dnetlib.dhp.schema.oaf.Qualifier;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
import eu.dnetlib.pace.model.Person;
import scala.Tuple2;

public class AuthorMerger {

    private static final Double THRESHOLD = 0.95;

    private AuthorMerger() {
    }

    public static List<Author> merge(List<List<Author>> authors) {

        authors.sort((o1, o2) -> -Integer.compare(countAuthorsPids(o1), countAuthorsPids(o2)));

        List<Author> author = new ArrayList<>();

        for (List<Author> a : authors) {
            author = mergeAuthor(author, a);
        }

        return author;
    }

    public static List<Author> mergeAuthor(final List<Author> a, final List<Author> b, Double threshold) {
        int pa = countAuthorsPids(a);
        int pb = countAuthorsPids(b);
        List<Author> base;
        List<Author> enrich;
        int sa = authorsSize(a);
        int sb = authorsSize(b);

        if (sa == sb) {
            base = pa > pb ? a : b;
            enrich = pa > pb ? b : a;
        } else {
            base = sa > sb ? a : b;
            enrich = sa > sb ? b : a;
        }
        enrichPidFromList(base, enrich, threshold);
        return base;
    }

    public static List<Author> mergeAuthor(final List<Author> a, final List<Author> b) {
        return mergeAuthor(a, b, THRESHOLD);
    }

    private static void enrichPidFromList(List<Author> base, List<Author> enrich, Double threshold) {
        if (base == null || enrich == null)
            return;

        // <pidComparableString, Author> (if an Author has more than 1 pid, it appears 2 times in the list)
        final Map<String, Author> basePidAuthorMap = base
            .stream()
            .filter(a -> a.getPid() != null && !a.getPid().isEmpty())
            .flatMap(
                a -> a
                    .getPid()
                    .stream()
                    .filter(Objects::nonNull)
                    .map(p -> new Tuple2<>(pidToComparableString(p), a)))
            .collect(Collectors.toMap(Tuple2::_1, Tuple2::_2, (x1, x2) -> x1));

        // <pid, Author> (list of pid that are missing in the other list)
        final List<Tuple2<StructuredProperty, Author>> pidToEnrich = enrich
            .stream()
            .filter(a -> a.getPid() != null && !a.getPid().isEmpty())
            .flatMap(
                a -> a
                    .getPid()
                    .stream()
                    .filter(Objects::nonNull)
                    .filter(p -> !basePidAuthorMap.containsKey(pidToComparableString(p)))
                    .map(p -> new Tuple2<>(p, a)))
            .collect(Collectors.toList());

        pidToEnrich
            .forEach(
                a -> {
                    Optional<Tuple2<Double, Author>> simAuthor = base
                        .stream()
                        .map(ba -> new Tuple2<>(sim(ba, a._2()), ba))
                        .max(Comparator.comparing(Tuple2::_1));

                    if (simAuthor.isPresent()) {
                        double th = threshold;
                        // increase the threshold if the surname is too short
                        if (simAuthor.get()._2().getSurname() != null
                            && simAuthor.get()._2().getSurname().length() <= 3 && threshold > 0.0)
                            th = 0.99;

                        if (simAuthor.get()._1() > th) {
                            Author r = simAuthor.get()._2();
                            if (r.getPid() == null) {
                                r.setPid(new ArrayList<>());
                            }

                            // workaround: a list created with Arrays.asList has a fixed size, so its add method
                            // raises UnsupportedOperationException at java.util.AbstractList.add; copy it first
                            final List<StructuredProperty> tmp = new ArrayList<>(r.getPid());
                            tmp.add(a._1());
                            r.setPid(tmp);
                        }
                    }
                });
    }

    public static String normalizeFullName(final String fullname) {
        return nfd(fullname)
            .toLowerCase()
            // do not compact the regexes into a single expression: it would cause a StackOverflowError
            // in case of large input strings
            .replaceAll("(\\W)+", " ")
            .replaceAll("(\\p{InCombiningDiacriticalMarks})+", " ")
            .replaceAll("(\\p{Punct})+", " ")
            .replaceAll("(\\d)+", " ")
            .replaceAll("(\\n)+", " ")
            .trim();
    }

    private static String authorFieldToBeCompared(Author author) {
        if (StringUtils.isNotBlank(author.getSurname())) {
            return author.getSurname();
        }
        if (StringUtils.isNotBlank(author.getFullname())) {
            return author.getFullname();
        }
        return null;
    }

    public static String pidToComparableString(StructuredProperty pid) {
        final String classId = Optional
            .ofNullable(pid)
            .map(
                p -> Optional
                    .ofNullable(p.getQualifier())
                    .map(Qualifier::getClassid)
                    .map(String::toLowerCase)
                    .orElse(""))
            .orElse("");
        return Optional
            .ofNullable(pid)
            .map(StructuredProperty::getValue)
            .map(v -> String.join("|", v, classId))
            .orElse("");
    }

    public static int countAuthorsPids(List<Author> authors) {
        if (authors == null)
            return 0;

        return (int) authors.stream().filter(AuthorMerger::hasPid).count();
    }

    private static int authorsSize(List<Author> authors) {
        if (authors == null)
            return 0;
        return authors.size();
    }

    private static Double sim(Author a, Author b) {

        final Person pa = parse(a);
        final Person pb = parse(b);

        // if both are accurate (e.g. they have name and surname)
        if (pa.isAccurate() & pb.isAccurate()) {
            return new JaroWinkler().score(normalize(pa.getSurnameString()), normalize(pb.getSurnameString())) * 0.5
                + new JaroWinkler().score(normalize(pa.getNameString()), normalize(pb.getNameString())) * 0.5;
        } else {
            return new JaroWinkler()
                .score(normalize(pa.getNormalisedFullname()), normalize(pb.getNormalisedFullname()));
        }
    }

    private static boolean hasPid(Author a) {
        if (a == null || a.getPid() == null || a.getPid().isEmpty())
            return false;
        return a.getPid().stream().anyMatch(p -> p != null && StringUtils.isNotBlank(p.getValue()));
    }

    private static Person parse(Author author) {
        if (StringUtils.isNotBlank(author.getSurname())) {
            return new Person(author.getSurname() + ", " + author.getName(), false);
        } else {
            if (StringUtils.isNotBlank(author.getFullname()))
                return new Person(author.getFullname(), false);
            else
                return new Person("", false);
        }
    }

    public static String normalize(final String s) {
        String[] normalized = nfd(s)
            .toLowerCase()
            // do not compact the regexes into a single expression: it would cause a StackOverflowError
            // in case of large input strings
            .replaceAll("(\\W)+", " ")
            .replaceAll("(\\p{InCombiningDiacriticalMarks})+", " ")
            .replaceAll("(\\p{Punct})+", " ")
            .replaceAll("(\\d)+", " ")
            .replaceAll("(\\n)+", " ")
            .trim()
            .split(" ");

        Arrays.sort(normalized);

        return String.join(" ", normalized);
    }

    private static String nfd(final String s) {
        return Normalizer.normalize(s, Normalizer.Form.NFD);
    }

}
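The normalization helpers can be exercised in isolation, for instance:

    AuthorMerger.normalizeFullName("Rossi, Maria");  // "rossi maria"  (punctuation and digits stripped)
    AuthorMerger.normalize("Rossi Maria");           // "maria rossi"  (tokens are also sorted, useful as a comparison key)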
@@ -1,194 +0,0 @@

package eu.dnetlib.dhp.oa.merge;

import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.when;

import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Collectors;

import org.apache.commons.io.IOUtils;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.api.java.function.MapGroupsFunction;
import org.apache.spark.sql.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;
import eu.dnetlib.dhp.common.vocabulary.VocabularyGroup;
import eu.dnetlib.dhp.schema.common.EntityType;
import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.utils.GraphCleaningFunctions;
import eu.dnetlib.dhp.schema.oaf.utils.MergeUtils;
import eu.dnetlib.dhp.utils.ISLookupClientFactory;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
import scala.Tuple2;

/**
 * Groups the graph content by entity identifier to ensure ID uniqueness
 */
public class GroupEntitiesSparkJob {
    private static final Logger log = LoggerFactory.getLogger(GroupEntitiesSparkJob.class);

    private static final Encoder<OafEntity> OAFENTITY_KRYO_ENC = Encoders.kryo(OafEntity.class);

    private ArgumentApplicationParser parser;

    public GroupEntitiesSparkJob(ArgumentApplicationParser parser) {
        this.parser = parser;
    }

    public static void main(String[] args) throws Exception {

        String jsonConfiguration = IOUtils
            .toString(
                GroupEntitiesSparkJob.class
                    .getResourceAsStream(
                        "/eu/dnetlib/dhp/oa/merge/group_graph_entities_parameters.json"));
        final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
        parser.parseArgument(args);

        Boolean isSparkSessionManaged = Optional
            .ofNullable(parser.get("isSparkSessionManaged"))
            .map(Boolean::valueOf)
            .orElse(Boolean.TRUE);
        log.info("isSparkSessionManaged: {}", isSparkSessionManaged);

        final String isLookupUrl = parser.get("isLookupUrl");
        log.info("isLookupUrl: {}", isLookupUrl);

        final ISLookUpService isLookupService = ISLookupClientFactory.getLookUpService(isLookupUrl);

        new GroupEntitiesSparkJob(parser).run(isSparkSessionManaged, isLookupService);
    }

    public void run(Boolean isSparkSessionManaged, ISLookUpService isLookUpService)
        throws ISLookUpException {

        String graphInputPath = parser.get("graphInputPath");
        log.info("graphInputPath: {}", graphInputPath);

        String checkpointPath = parser.get("checkpointPath");
        log.info("checkpointPath: {}", checkpointPath);

        String outputPath = parser.get("outputPath");
        log.info("outputPath: {}", outputPath);

        boolean filterInvisible = Boolean.parseBoolean(parser.get("filterInvisible"));
        log.info("filterInvisible: {}", filterInvisible);

        SparkConf conf = new SparkConf();
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        conf.registerKryoClasses(ModelSupport.getOafModelClasses());

        final VocabularyGroup vocs = VocabularyGroup.loadVocsFromIS(isLookUpService);

        runWithSparkSession(
            conf,
            isSparkSessionManaged,
            spark -> {
                HdfsSupport.remove(checkpointPath, spark.sparkContext().hadoopConfiguration());
                groupEntities(spark, graphInputPath, checkpointPath, outputPath, filterInvisible, vocs);
            });
    }

    private static void groupEntities(
        SparkSession spark,
        String inputPath,
        String checkpointPath,
        String outputPath,
        boolean filterInvisible, VocabularyGroup vocs) {

        Dataset<OafEntity> allEntities = spark.emptyDataset(OAFENTITY_KRYO_ENC);

        for (Map.Entry<EntityType, Class> e : ModelSupport.entityTypes.entrySet()) {
            String entity = e.getKey().name();
            Class<? extends OafEntity> entityClass = e.getValue();
            String entityInputPath = inputPath + "/" + entity;

            if (!HdfsSupport.exists(entityInputPath, spark.sparkContext().hadoopConfiguration())) {
                continue;
            }

            allEntities = allEntities
                .union(
                    ((Dataset<OafEntity>) spark
                        .read()
                        .schema(Encoders.bean(entityClass).schema())
                        .json(entityInputPath)
                        .filter("length(id) > 0")
                        .as(Encoders.bean(entityClass)))
                            .map((MapFunction<OafEntity, OafEntity>) r -> r, OAFENTITY_KRYO_ENC));
        }

        Dataset<?> groupedEntities = allEntities
            .map(
                (MapFunction<OafEntity, OafEntity>) entity -> GraphCleaningFunctions
                    .applyCoarVocabularies(entity, vocs),
                OAFENTITY_KRYO_ENC)
            .groupByKey((MapFunction<OafEntity, String>) OafEntity::getId, Encoders.STRING())
            .mapGroups((MapGroupsFunction<String, OafEntity, OafEntity>) MergeUtils::mergeById, OAFENTITY_KRYO_ENC)
            .map(
                (MapFunction<OafEntity, Tuple2<String, OafEntity>>) t -> new Tuple2<>(
                    t.getClass().getName(), t),
                Encoders.tuple(Encoders.STRING(), OAFENTITY_KRYO_ENC));

        // pivot on "_1" (classname of the entity)
        // created columns containing only entities of the same class
        for (Map.Entry<EntityType, Class> e : ModelSupport.entityTypes.entrySet()) {
            String entity = e.getKey().name();
            Class<? extends OafEntity> entityClass = e.getValue();

            groupedEntities = groupedEntities
                .withColumn(
                    entity,
                    when(col("_1").equalTo(entityClass.getName()), col("_2")));
        }

        groupedEntities
            .drop("_1", "_2")
            .write()
            .mode(SaveMode.Overwrite)
            .option("compression", "gzip")
            .save(checkpointPath);

        ForkJoinPool parPool = new ForkJoinPool(ModelSupport.entityTypes.size());

        ModelSupport.entityTypes
            .entrySet()
            .stream()
            .map(e -> parPool.submit(() -> {
                String entity = e.getKey().name();
                Class<? extends OafEntity> entityClass = e.getValue();

                spark
                    .read()
                    .load(checkpointPath)
                    .select(col(entity).as("value"))
                    .filter("value IS NOT NULL")
                    .as(OAFENTITY_KRYO_ENC)
                    .map((MapFunction<OafEntity, OafEntity>) r -> r, (Encoder<OafEntity>) Encoders.bean(entityClass))
                    .filter(filterInvisible ? "dataInfo.invisible != TRUE" : "TRUE")
                    .write()
                    .mode(SaveMode.Overwrite)
                    .option("compression", "gzip")
                    .json(outputPath + "/" + entity);
            }))
            .collect(Collectors.toList())
            .forEach(t -> {
                try {
                    t.get();
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e);
                }
            });
    }
}
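A sketch of a local invocation of the job above. The switch names mirror the parser.get(...) keys read in the code, but their exact form is defined in group_graph_entities_parameters.json (not shown here), and all paths and the lookup URL are placeholders:

    GroupEntitiesSparkJob
        .main(new String[] {
            "--isSparkSessionManaged", "false",
            "--isLookupUrl", "http://localhost:8280/is/services/isLookUp",
            "--graphInputPath", "/tmp/graph/raw",
            "--checkpointPath", "/tmp/graph/grouped_checkpoint",
            "--outputPath", "/tmp/graph/grouped",
            "--filterInvisible", "true"
        });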
@@ -1,77 +0,0 @@

package eu.dnetlib.dhp.oozie;

import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkHiveSession;

import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

import org.apache.commons.lang3.time.DurationFormatUtils;
import org.apache.commons.text.StringSubstitutor;
import org.apache.spark.SparkConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.io.Resources;

import eu.dnetlib.dhp.application.ArgumentApplicationParser;

public class RunSQLSparkJob {
    private static final Logger log = LoggerFactory.getLogger(RunSQLSparkJob.class);

    private final ArgumentApplicationParser parser;

    public RunSQLSparkJob(ArgumentApplicationParser parser) {
        this.parser = parser;
    }

    public static void main(String[] args) throws Exception {

        Map<String, String> params = new HashMap<>();
        for (int i = 0; i < args.length - 1; i++) {
            if (args[i].startsWith("--")) {
                params.put(args[i].substring(2), args[++i]);
            }
        }

        /*
         * String jsonConfiguration = IOUtils .toString( Objects .requireNonNull( RunSQLSparkJob.class
         * .getResourceAsStream( "/eu/dnetlib/dhp/oozie/run_sql_parameters.json"))); final ArgumentApplicationParser
         * parser = new ArgumentApplicationParser(jsonConfiguration); parser.parseArgument(args);
         */

        Boolean isSparkSessionManaged = Optional
            .ofNullable(params.get("isSparkSessionManaged"))
            .map(Boolean::valueOf)
            .orElse(Boolean.TRUE);
        log.info("isSparkSessionManaged: {}", isSparkSessionManaged);

        URL url = com.google.common.io.Resources.getResource(params.get("sql"));
        String raw_sql = Resources.toString(url, StandardCharsets.UTF_8);

        String sql = StringSubstitutor.replace(raw_sql, params);
        log.info("sql: {}", sql);

        SparkConf conf = new SparkConf();
        conf.set("hive.metastore.uris", params.get("hiveMetastoreUris"));

        runWithSparkHiveSession(
            conf,
            isSparkSessionManaged,
            spark -> {
                for (String statement : sql.split(";\\s*/\\*\\s*EOS\\s*\\*/\\s*")) {
                    log.info("executing: {}", statement);
                    long startTime = System.currentTimeMillis();
                    spark.sql(statement).show();
                    log
                        .info(
                            "executed in {}",
                            DurationFormatUtils.formatDuration(System.currentTimeMillis() - startTime, "HH:mm:ss.S"));
                }
            });
    }

}
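A sketch of how this job is typically driven. The parameter names mirror the params.get(...) keys above; the SQL resource path and the metastore URI are placeholders:

    RunSQLSparkJob
        .main(new String[] {
            "--isSparkSessionManaged", "false",
            "--hiveMetastoreUris", "thrift://hive-metastore:9083",
            "--sql", "eu/dnetlib/dhp/oozie/some_script.sql"
        });

Inside the SQL resource, statements are separated by the EOS marker matched above, e.g. "DROP TABLE IF EXISTS ${DB}.tmp; /* EOS */", and ${...} placeholders are resolved against the same parameter map by StringSubstitutor.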
@@ -12,9 +12,6 @@ import com.ximpleware.VTDNav;

/** Created by sandro on 9/29/16. */
public class VtdUtilityParser {

    private VtdUtilityParser() {
    }

    public static List<Node> getTextValuesWithAttributes(
        final AutoPilot ap, final VTDNav vn, final String xpath, final List<String> attributes)
        throws VtdException {
@@ -1,70 +0,0 @@

/*
 * Copyright (c) 2024.
 * SPDX-FileCopyrightText: © 2023 Consiglio Nazionale delle Ricerche
 * SPDX-License-Identifier: AGPL-3.0-or-later
 */

package eu.dnetlib.dhp.schema.oaf;

import org.apache.commons.lang3.builder.EqualsBuilder;
import org.apache.commons.lang3.builder.HashCodeBuilder;

public class HashableStructuredProperty extends StructuredProperty {

    private static final long serialVersionUID = 8371670185221126045L;

    public static HashableStructuredProperty newInstance(String value, Qualifier qualifier, DataInfo dataInfo) {
        if (value == null) {
            return null;
        }
        final HashableStructuredProperty sp = new HashableStructuredProperty();
        sp.setValue(value);
        sp.setQualifier(qualifier);
        sp.setDataInfo(dataInfo);
        return sp;
    }

    public static HashableStructuredProperty newInstance(StructuredProperty sp) {
        HashableStructuredProperty hsp = new HashableStructuredProperty();
        hsp.setQualifier(sp.getQualifier());
        hsp.setValue(sp.getValue());
        hsp.setQualifier(sp.getQualifier());
        return hsp;
    }

    public static StructuredProperty toStructuredProperty(HashableStructuredProperty hsp) {
        StructuredProperty sp = new StructuredProperty();
        sp.setQualifier(hsp.getQualifier());
        sp.setValue(hsp.getValue());
        sp.setQualifier(hsp.getQualifier());
        return sp;
    }

    @Override
    public int hashCode() {
        return new HashCodeBuilder(11, 91)
            .append(getQualifier().getClassid())
            .append(getQualifier().getSchemeid())
            .append(getValue())
            .hashCode();
    }

    @Override
    public boolean equals(Object obj) {
        if (obj == null) {
            return false;
        }
        if (obj == this) {
            return true;
        }
        if (obj.getClass() != getClass()) {
            return false;
        }
        final HashableStructuredProperty rhs = (HashableStructuredProperty) obj;
        return new EqualsBuilder()
            .append(getQualifier().getClassid(), rhs.getQualifier().getClassid())
            .append(getQualifier().getSchemeid(), rhs.getQualifier().getSchemeid())
            .append(getValue(), rhs.getValue())
            .isEquals();
    }
}
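A dedup sketch based on the hashCode/equals contract above (the qualifier values are illustrative; the dataInfo argument does not take part in the comparison):

    Set<HashableStructuredProperty> pids = new HashSet<>();
    Qualifier doi = OafMapperUtils.qualifier("doi", "doi", "dnet:pid_types", "dnet:pid_types");
    pids.add(HashableStructuredProperty.newInstance("10.1234/abc", doi, null));
    pids.add(HashableStructuredProperty.newInstance("10.1234/abc", doi, null));
    // pids.size() == 1: equality considers only (classid, schemeid, value)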
@@ -1,46 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.HashSet;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;

import org.apache.commons.lang3.StringUtils;

import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class CleaningFunctions {

    public static final String DOI_PREFIX_REGEX = "(^10\\.|\\/10\\.)";
    public static final String DOI_PREFIX = "10.";

    public static final Set<String> PID_BLACKLIST = new HashSet<>();

    static {
        PID_BLACKLIST.add("none");
        PID_BLACKLIST.add("na");
    }

    public CleaningFunctions() {
    }

    /**
     * Utility method that filters PID values on a per-type basis.
     * @param s the PID whose value will be checked.
     * @return false if the pid matches the filter criteria, true otherwise.
     */
    public static boolean pidFilter(StructuredProperty s) {
        final String pidValue = s.getValue();
        if (Objects.isNull(s.getQualifier()) ||
            StringUtils.isBlank(pidValue) ||
            StringUtils.isBlank(pidValue.replaceAll("(?:\\n|\\r|\\t|\\s)", ""))) {
            return false;
        }
        if (CleaningFunctions.PID_BLACKLIST.contains(pidValue)) {
            return false;
        }
        return !PidBlacklistProvider.getBlacklist(s.getQualifier().getClassid()).contains(pidValue);
    }

}
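A short sketch of the filter behaviour (the qualifier is illustrative):

    StructuredProperty pid = new StructuredProperty();
    pid.setQualifier(OafMapperUtils.qualifier("doi", "doi", "dnet:pid_types", "dnet:pid_types"));
    pid.setValue("none");
    CleaningFunctions.pidFilter(pid);  // false: "none" is in PID_BLACKLIST
    pid.setValue("10.1234/abc");
    CleaningFunctions.pidFilter(pid);  // true, unless the per-type blacklist from PidBlacklistProvider contains the value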
@@ -1,30 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import org.apache.commons.lang3.StringUtils;

public class DoiCleaningRule {

    public static String clean(final String doi) {
        if (doi == null)
            return null;
        final String replaced = doi
            .replaceAll("\\n|\\r|\\t|\\s", "")
            .replaceAll("^doi:", "")
            .toLowerCase()
            .replaceFirst(CleaningFunctions.DOI_PREFIX_REGEX, CleaningFunctions.DOI_PREFIX);
        if (StringUtils.isEmpty(replaced))
            return null;

        if (!replaced.contains("10."))
            return null;

        final String ret = replaced.substring(replaced.indexOf("10."));

        if (!ret.startsWith(CleaningFunctions.DOI_PREFIX))
            return null;

        return ret;
    }

}
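For instance:

    DoiCleaningRule.clean("doi:10.1000/ABC 123");          // "10.1000/abc123"
    DoiCleaningRule.clean("https://doi.org/10.1234/xyz");  // "10.1234/xyz"
    DoiCleaningRule.clean("not-a-doi");                    // null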
@@ -1,25 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FundRefCleaningRule {

    public static final Pattern PATTERN = Pattern.compile("\\d+");

    public static String clean(final String fundRefId) {

        String s = fundRefId
            .toLowerCase()
            .replaceAll("\\s", "");

        Matcher m = PATTERN.matcher(s);
        if (m.find()) {
            return m.group();
        } else {
            return "";
        }
    }

}
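For instance:

    FundRefCleaningRule.clean("  501100000780 ");                          // "501100000780"
    FundRefCleaningRule.clean("http://dx.doi.org/10.13039/501100000780");  // "10", only the first run of digits is kept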
File diff suppressed because it is too large
@@ -1,24 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GridCleaningRule {

    public static final Pattern PATTERN = Pattern.compile("(?<grid>\\d{4,6}\\.[0-9a-z]{1,2})");

    public static String clean(String grid) {
        String s = grid
            .replaceAll("\\s", "")
            .toLowerCase();

        Matcher m = PATTERN.matcher(s);
        if (m.find()) {
            return "grid." + m.group("grid");
        }

        return "";
    }

}
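For instance:

    GridCleaningRule.clean("GRID.419716.d");  // "grid.419716.d"
    GridCleaningRule.clean("419716.d");       // "grid.419716.d"
    GridCleaningRule.clean("no digits");      // ""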
@@ -1,21 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// https://www.wikidata.org/wiki/Property:P213
public class ISNICleaningRule {

    public static final Pattern PATTERN = Pattern.compile("([0]{4}) ?([0-9]{4}) ?([0-9]{4}) ?([0-9]{3}[0-9X])");

    public static String clean(final String isni) {

        Matcher m = PATTERN.matcher(isni);
        if (m.find()) {
            return String.join("", m.group(1), m.group(2), m.group(3), m.group(4));
        } else {
            return "";
        }
    }
}
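For instance:

    ISNICleaningRule.clean("0000 0001 2150 090X");                     // "000000012150090X"
    ISNICleaningRule.clean("https://isni.org/isni/0000000121500909");  // "0000000121500909"
    ISNICleaningRule.clean("12345");                                   // ""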
@@ -1,294 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import static com.google.common.base.Preconditions.checkArgument;
import static eu.dnetlib.dhp.schema.common.ModelConstants.*;

import java.io.Serializable;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.*;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.apache.commons.codec.binary.Hex;
import org.apache.commons.lang3.StringUtils;

import com.google.common.collect.HashBiMap;
import com.google.common.collect.Maps;

import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.*;

/**
 * Factory class for OpenAIRE identifiers in the Graph
 */
public class IdentifierFactory implements Serializable {

    public static final String ID_SEPARATOR = "::";
    public static final String ID_PREFIX_SEPARATOR = "|";

    public static final int ID_PREFIX_LEN = 12;

    /**
     * Declares the associations PID_TYPE -> [DATASOURCE ID, NAME] considered authoritative for that PID_TYPE.
     * The id of the record (source_::id) will be rewritten as (pidType_::id).
     */
    public static final Map<PidType, HashBiMap<String, String>> PID_AUTHORITY = Maps.newHashMap();

    static {
        PID_AUTHORITY.put(PidType.doi, HashBiMap.create());
        PID_AUTHORITY.get(PidType.doi).put(CROSSREF_ID, "Crossref");
        PID_AUTHORITY.get(PidType.doi).put(DATACITE_ID, "Datacite");
        PID_AUTHORITY.get(PidType.doi).put(ZENODO_OD_ID, "ZENODO");
        PID_AUTHORITY.get(PidType.doi).put(ZENODO_R3_ID, "Zenodo");

        PID_AUTHORITY.put(PidType.pmc, HashBiMap.create());
        PID_AUTHORITY.get(PidType.pmc).put(EUROPE_PUBMED_CENTRAL_ID, "Europe PubMed Central");
        PID_AUTHORITY.get(PidType.pmc).put(PUBMED_CENTRAL_ID, "PubMed Central");

        PID_AUTHORITY.put(PidType.pmid, HashBiMap.create());
        PID_AUTHORITY.get(PidType.pmid).put(EUROPE_PUBMED_CENTRAL_ID, "Europe PubMed Central");
        PID_AUTHORITY.get(PidType.pmid).put(PUBMED_CENTRAL_ID, "PubMed Central");

        PID_AUTHORITY.put(PidType.arXiv, HashBiMap.create());
        PID_AUTHORITY.get(PidType.arXiv).put(ARXIV_ID, "arXiv.org e-Print Archive");

        PID_AUTHORITY.put(PidType.w3id, HashBiMap.create());
        PID_AUTHORITY.get(PidType.w3id).put(ROHUB_ID, "ROHub");
    }

    /**
     * Declares the associations PID_TYPE -> [DATASOURCE ID, PID SUBSTRING] considered as delegated authority for that
     * PID_TYPE. Example: Zenodo is delegated to forge DOIs that contain the 'zenodo' word.
     *
     * If a record with the same id (same pid) comes from 2 data sources, the one coming from a delegated source wins. E.g. Zenodo records win over those from Datacite.
     * See also https://code-repo.d4science.org/D-Net/dnet-hadoop/pulls/187 and the class dhp-common/src/main/java/eu/dnetlib/dhp/schema/oaf/utils/OafMapperUtils.java
     */
    public static final Map<PidType, Map<String, String>> DELEGATED_PID_AUTHORITY = Maps.newHashMap();

    static {
        DELEGATED_PID_AUTHORITY.put(PidType.doi, new HashMap<>());
        DELEGATED_PID_AUTHORITY.get(PidType.doi).put(ZENODO_OD_ID, "zenodo");
        DELEGATED_PID_AUTHORITY.get(PidType.doi).put(ZENODO_R3_ID, "zenodo");
        DELEGATED_PID_AUTHORITY.put(PidType.w3id, new HashMap<>());
        DELEGATED_PID_AUTHORITY.get(PidType.w3id).put(ROHUB_ID, "ro-id");
    }

    /**
     * Declares the associations PID_TYPE -> [DATASOURCE ID, NAME] whose records are considered enrichment for the graph.
     * Their OpenAIRE ID is built from the declared PID type. They are merged with their corresponding record, identified by
     * the same OpenAIRE id.
     */
    public static final Map<PidType, HashBiMap<String, String>> ENRICHMENT_PROVIDER = Maps.newHashMap();

    static {
        ENRICHMENT_PROVIDER.put(PidType.doi, HashBiMap.create());
        ENRICHMENT_PROVIDER.get(PidType.doi).put(OPEN_APC_ID, OPEN_APC_NAME);
    }

    public static Set<String> delegatedAuthorityDatasourceIds() {
        return DELEGATED_PID_AUTHORITY
            .values()
            .stream()
            .flatMap(m -> m.keySet().stream())
            .collect(Collectors.toCollection(HashSet::new));
    }

    public static List<StructuredProperty> getPids(List<StructuredProperty> pid, KeyValue collectedFrom) {
        return pidFromInstance(pid, collectedFrom, true).distinct().collect(Collectors.toList());
    }

    public static <T extends Result> String createDOIBoostIdentifier(T entity) {
        if (entity == null)
            return null;

        StructuredProperty pid = null;
        if (entity.getPid() != null) {
            pid = entity
                .getPid()
                .stream()
                .filter(Objects::nonNull)
                .filter(s -> s.getQualifier() != null && "doi".equalsIgnoreCase(s.getQualifier().getClassid()))
                .filter(CleaningFunctions::pidFilter)
                .findAny()
                .orElse(null);
        } else {
            if (entity.getInstance() != null) {
                pid = entity
                    .getInstance()
                    .stream()
                    .filter(i -> i.getPid() != null)
                    .flatMap(i -> i.getPid().stream())
                    .filter(CleaningFunctions::pidFilter)
                    .findAny()
                    .orElse(null);
            }
        }
        if (pid != null)
            return idFromPid(entity, pid, true);
        return null;
    }

    /**
     * Creates an identifier from the most relevant PID (if available) provided by a known PID authority in the given
     * entity T. Returns entity.id when none of the PIDs meets the selection criteria.
     *
     * @param entity the entity providing PIDs and a default ID.
     * @param <T> the specific entity type. Currently Organization and Result subclasses are supported.
     * @param md5 indicates whether the PID value should be hashed or not.
     * @return an identifier from the most relevant PID, entity.id otherwise
     */
    public static <T extends OafEntity> String createIdentifier(T entity, boolean md5) {

        checkArgument(StringUtils.isNoneBlank(entity.getId()), "missing entity identifier");

        final Map<String, Set<StructuredProperty>> pids = extractPids(entity);

        return pids
            .values()
            .stream()
            .flatMap(Set::stream)
            .min(new PidComparator<>(entity))
            .map(
                min -> Optional
                    .ofNullable(pids.get(min.getQualifier().getClassid()))
                    .map(
                        p -> p
                            .stream()
                            .sorted(new PidValueComparator())
                            .findFirst()
                            .map(s -> idFromPid(entity, s, md5))
                            .orElseGet(entity::getId))
                    .orElseGet(entity::getId))
            .orElseGet(entity::getId);
    }

    private static <T extends OafEntity> Map<String, Set<StructuredProperty>> extractPids(T entity) {
        if (entity instanceof Result) {
            return Optional
                .ofNullable(((Result) entity).getInstance())
                .map(IdentifierFactory::mapPids)
                .orElse(new HashMap<>());
        } else {
            return entity
                .getPid()
                .stream()
                .map(PidCleaner::normalizePidValue)
                .filter(CleaningFunctions::pidFilter)
                .collect(
                    Collectors
                        .groupingBy(
                            p -> p.getQualifier().getClassid(),
                            Collectors.mapping(p -> p, Collectors.toCollection(HashSet::new))));
        }
    }

    private static Map<String, Set<StructuredProperty>> mapPids(List<Instance> instance) {
        return instance
            .stream()
            .map(i -> pidFromInstance(i.getPid(), i.getCollectedfrom(), false))
            .flatMap(Function.identity())
            .collect(
                Collectors
                    .groupingBy(
                        p -> p.getQualifier().getClassid(),
                        Collectors.mapping(p -> p, Collectors.toCollection(HashSet::new))));
    }

    private static Stream<StructuredProperty> pidFromInstance(List<StructuredProperty> pid, KeyValue collectedFrom,
        boolean mapHandles) {
        return Optional
            .ofNullable(pid)
            .map(
                pp -> pp
                    .stream()
                    // filter away PIDs provided by a DS that is not considered an authority for the
                    // given PID Type
                    .filter(p -> shouldFilterPidByCriteria(collectedFrom, p, mapHandles))
                    .map(PidCleaner::normalizePidValue)
                    .filter(p -> isNotFromDelegatedAuthority(collectedFrom, p))
                    .filter(CleaningFunctions::pidFilter))
            .orElse(Stream.empty());
    }

    private static boolean shouldFilterPidByCriteria(KeyValue collectedFrom, StructuredProperty p, boolean mapHandles) {
        final PidType pType = PidType.tryValueOf(p.getQualifier().getClassid());

        if (Objects.isNull(collectedFrom)) {
            return false;
        }

        boolean isEnrich = Optional
            .ofNullable(ENRICHMENT_PROVIDER.get(pType))
            .map(
                enrich -> enrich.containsKey(collectedFrom.getKey())
                    || enrich.containsValue(collectedFrom.getValue()))
            .orElse(false);

        boolean isAuthority = Optional
            .ofNullable(PID_AUTHORITY.get(pType))
            .map(
                authorities -> authorities.containsKey(collectedFrom.getKey())
                    || authorities.containsValue(collectedFrom.getValue()))
            .orElse(false);

        return (mapHandles && pType.equals(PidType.handle)) || isEnrich || isAuthority;
    }

    private static boolean isNotFromDelegatedAuthority(KeyValue collectedFrom, StructuredProperty p) {
        final PidType pType = PidType.tryValueOf(p.getQualifier().getClassid());

        final Map<String, String> da = DELEGATED_PID_AUTHORITY.get(pType);
        if (Objects.isNull(da)) {
            return true;
        }
        if (!da.containsKey(collectedFrom.getKey())) {
            return true;
        }
        return StringUtils.contains(p.getValue(), da.get(collectedFrom.getKey()));
    }

    /**
     * @see {@link IdentifierFactory#createIdentifier(OafEntity, boolean)}
     */
    public static <T extends OafEntity> String createIdentifier(T entity) {

        return createIdentifier(entity, true);
    }

    private static <T extends OafEntity> String idFromPid(T entity, StructuredProperty s, boolean md5) {
        return idFromPid(ModelSupport.getIdPrefix(entity.getClass()), s.getQualifier().getClassid(), s.getValue(), md5);
    }

    public static String idFromPid(String numericPrefix, String pidType, String pidValue, boolean md5) {
        return new StringBuilder()
            .append(numericPrefix)
            .append(ID_PREFIX_SEPARATOR)
            .append(createPrefix(pidType))
            .append(ID_SEPARATOR)
            .append(md5 ? md5(pidValue) : pidValue)
            .toString();
    }

    // create the prefix (length = 12)
    private static String createPrefix(String pidType) {
        StringBuilder prefix = new StringBuilder(StringUtils.left(pidType, ID_PREFIX_LEN));
        while (prefix.length() < ID_PREFIX_LEN) {
            prefix.append("_");
        }
        return prefix.substring(0, ID_PREFIX_LEN);
    }

    public static String md5(final String s) {
        try {
            final MessageDigest md = MessageDigest.getInstance("MD5");
            md.update(s.getBytes(StandardCharsets.UTF_8));
            return new String(Hex.encodeHex(md.digest()));
        } catch (final Exception e) {
            return null;
        }
    }

}
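A sketch of the identifier layout produced by idFromPid (the numeric prefix "50" and the DOI value are illustrative):

    String id = IdentifierFactory.idFromPid("50", "doi", "10.1234/abc", true);
    // "50" + "|" + "doi_________" (pid type padded to 12 chars) + "::" + md5("10.1234/abc")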
@@ -1,78 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Comparator;
import java.util.HashSet;
import java.util.Optional;
import java.util.stream.Collectors;

//
// Source code recreated from a .class file by IntelliJ IDEA
// (powered by FernFlower decompiler)
//
import eu.dnetlib.dhp.schema.common.EntityType;
import eu.dnetlib.dhp.schema.oaf.KeyValue;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Result;

public class MergeComparator implements Comparator<Oaf> {
    public MergeComparator() {
    }

    public int compare(Oaf left, Oaf right) {
        // nulls at the end
        if (left == null && right == null) {
            return 0;
        } else if (left == null) {
            return -1;
        } else if (right == null) {
            return 1;
        }

        // invisible
        if (left.getDataInfo() != null && left.getDataInfo().getInvisible() == true) {
            if (right.getDataInfo() != null && right.getDataInfo().getInvisible() == false) {
                return -1;
            }
        }

        // collectedfrom
        HashSet<String> lCf = getCollectedFromIds(left);
        HashSet<String> rCf = getCollectedFromIds(right);
        if (lCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")
            && !rCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")) {
            return -1;
        } else if (!lCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")
            && rCf.contains("10|openaire____::081b82f96300b6a6e3d282bad31cb6e2")) {
            return 1;
        }

        SubEntityType lClass = SubEntityType.fromClass(left.getClass());
        SubEntityType rClass = SubEntityType.fromClass(right.getClass());
        return lClass.ordinal() - rClass.ordinal();

    }

    protected HashSet<String> getCollectedFromIds(Oaf left) {
        return (HashSet) Optional.ofNullable(left.getCollectedfrom()).map((cf) -> {
            return (HashSet) cf.stream().map(KeyValue::getKey).collect(Collectors.toCollection(HashSet::new));
        }).orElse(new HashSet());
    }

    enum SubEntityType {
        publication, dataset, software, otherresearchproduct, datasource, organization, project;

        /**
         * Resolves the EntityType, given the relative class name
         *
         * @param clazz the given class name
         * @param <T> actual OafEntity subclass
         * @return the EntityType associated to the given class
         */
        public static <T extends Oaf> SubEntityType fromClass(Class<T> clazz) {
            return valueOf(clazz.getSimpleName().toLowerCase());
        }
    }

}
@ -1,106 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import java.util.*;
|
||||
|
||||
import eu.dnetlib.dhp.schema.common.ModelConstants;
|
||||
import eu.dnetlib.dhp.schema.oaf.Oaf;
|
||||
import eu.dnetlib.dhp.schema.oaf.OafEntity;
|
||||
import eu.dnetlib.dhp.schema.oaf.Result;
|
||||
|
||||
public class MergeEntitiesComparator implements Comparator<Oaf> {
|
||||
static final List<String> PID_AUTHORITIES = Arrays
|
||||
.asList(
|
||||
ModelConstants.ARXIV_ID,
|
||||
ModelConstants.PUBMED_CENTRAL_ID,
|
||||
ModelConstants.EUROPE_PUBMED_CENTRAL_ID,
|
||||
ModelConstants.DATACITE_ID,
|
||||
ModelConstants.CROSSREF_ID);
|
||||
|
||||
static final List<String> RESULT_TYPES = Arrays
|
||||
.asList(
|
||||
ModelConstants.ORP_RESULTTYPE_CLASSID,
|
||||
ModelConstants.SOFTWARE_RESULTTYPE_CLASSID,
|
||||
ModelConstants.DATASET_RESULTTYPE_CLASSID,
|
||||
ModelConstants.PUBLICATION_RESULTTYPE_CLASSID);
|
||||
|
||||
public static final Comparator<Oaf> INSTANCE = new MergeEntitiesComparator();
|
||||
|
||||
@Override
|
||||
public int compare(Oaf left, Oaf right) {
|
||||
if (left == null && right == null)
|
||||
return 0;
|
||||
if (left == null)
|
||||
return -1;
|
||||
if (right == null)
|
||||
return 1;
|
||||
|
||||
int res = 0;
|
||||
|
||||
// pid authority
|
||||
int cfp1 = Optional
|
||||
.ofNullable(left.getCollectedfrom())
|
||||
.map(
|
||||
cf -> cf
|
||||
.stream()
|
||||
.map(kv -> PID_AUTHORITIES.indexOf(kv.getKey()))
|
||||
.max(Integer::compare)
|
||||
.orElse(-1))
|
||||
.orElse(-1);
|
||||
int cfp2 = Optional
|
||||
.ofNullable(right.getCollectedfrom())
|
||||
.map(
|
||||
cf -> cf
|
||||
.stream()
|
||||
.map(kv -> PID_AUTHORITIES.indexOf(kv.getKey()))
|
||||
.max(Integer::compare)
|
||||
.orElse(-1))
|
||||
.orElse(-1);
|
||||
|
||||
if (cfp1 >= 0 && cfp1 > cfp2) {
|
||||
return 1;
|
||||
} else if (cfp2 >= 0 && cfp2 > cfp1) {
|
||||
return -1;
|
||||
}
|
||||
|
||||
// trust
|
||||
if (left.getDataInfo() != null && right.getDataInfo() != null) {
|
||||
res = left.getDataInfo().getTrust().compareTo(right.getDataInfo().getTrust());
|
||||
}
|
||||
|
||||
// result type
|
||||
if (res == 0) {
|
||||
if (left instanceof Result && right instanceof Result) {
|
||||
Result r1 = (Result) left;
|
||||
Result r2 = (Result) right;
|
||||
|
||||
if (r1.getResulttype() == null || r1.getResulttype().getClassid() == null) {
|
||||
if (r2.getResulttype() != null && r2.getResulttype().getClassid() != null) {
|
||||
return -1;
|
||||
}
|
||||
} else if (r2.getResulttype() == null || r2.getResulttype().getClassid() == null) {
|
||||
return 1;
|
||||
}
|
||||
|
||||
int rt1 = RESULT_TYPES.indexOf(r1.getResulttype().getClassid());
|
||||
int rt2 = RESULT_TYPES.indexOf(r2.getResulttype().getClassid());
|
||||
|
||||
if (rt1 >= 0 && rt1 > rt2) {
|
||||
return 1;
|
||||
} else if (rt2 >= 0 && rt2 > rt1) {
|
||||
return -1;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// id
|
||||
if (res == 0) {
|
||||
if (left instanceof OafEntity && right instanceof OafEntity) {
|
||||
res = ((OafEntity) right).getId().compareTo(((OafEntity) left).getId());
|
||||
}
|
||||
}
|
||||
|
||||
return res;
|
||||
}
|
||||
|
||||
}
|
File diff suppressed because it is too large
@@ -1,27 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

public class ModelHardLimits {

    private ModelHardLimits() {
    }

    public static final String LAYOUT = "index";
    public static final String INTERPRETATION = "openaire";
    public static final String SEPARATOR = "-";

    public static final int MAX_EXTERNAL_ENTITIES = 50;
    public static final int MAX_AUTHORS = 200;
    public static final int MAX_AUTHOR_FULLNAME_LENGTH = 1000;
    public static final int MAX_TITLE_LENGTH = 5000;
    public static final int MAX_TITLES = 10;
    public static final int MAX_ABSTRACTS = 10;
    public static final int MAX_ABSTRACT_LENGTH = 150000;
    public static final int MAX_RELATED_ABSTRACT_LENGTH = 500;
    public static final int MAX_INSTANCES = 10;

    public static String getCollectionName(String format) {
        return format + SEPARATOR + LAYOUT + SEPARATOR + INTERPRETATION;
    }

}
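A quick usage note (a sketch, not part of the diff): the collection name simply joins a metadata format with the fixed layout and interpretation, and the MAX_* constants act as guards when trimming incoming records. The "DMF" format name and the `title` variable below are illustrative placeholders.

String collection = ModelHardLimits.getCollectionName("DMF"); // -> "DMF-index-openaire"

// typical guard when trimming an over-long field, assuming `title` holds an incoming value
String trimmed = title.length() > ModelHardLimits.MAX_TITLE_LENGTH
    ? title.substring(0, ModelHardLimits.MAX_TITLE_LENGTH)
    : title;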
@ -1,478 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import static eu.dnetlib.dhp.schema.common.ModelConstants.*;
|
||||
|
||||
import java.sql.Array;
|
||||
import java.sql.SQLException;
|
||||
import java.util.*;
|
||||
import java.util.concurrent.ConcurrentHashMap;
|
||||
import java.util.function.Function;
|
||||
import java.util.function.Predicate;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
|
||||
import eu.dnetlib.dhp.schema.common.AccessRightComparator;
|
||||
import eu.dnetlib.dhp.schema.oaf.*;
|
||||
|
||||
public class OafMapperUtils {
|
||||
|
||||
private OafMapperUtils() {
|
||||
}
|
||||
|
||||
public static KeyValue keyValue(final String k, final String v) {
|
||||
final KeyValue kv = new KeyValue();
|
||||
kv.setKey(k);
|
||||
kv.setValue(v);
|
||||
return kv;
|
||||
}
|
||||
|
||||
public static List<KeyValue> listKeyValues(final String... s) {
|
||||
if (s.length % 2 > 0) {
|
||||
throw new IllegalArgumentException("Invalid number of parameters (k,v,k,v,....)");
|
||||
}
|
||||
|
||||
final List<KeyValue> list = new ArrayList<>();
|
||||
for (int i = 0; i < s.length; i += 2) {
|
||||
list.add(keyValue(s[i], s[i + 1]));
|
||||
}
|
||||
return list;
|
||||
}
|
||||
|
||||
public static <T> Field<T> field(final T value, final DataInfo info) {
|
||||
if (value == null || StringUtils.isBlank(value.toString())) {
|
||||
return null;
|
||||
}
|
||||
|
||||
final Field<T> field = new Field<>();
|
||||
field.setValue(value);
|
||||
field.setDataInfo(info);
|
||||
return field;
|
||||
}
|
||||
|
||||
public static List<Field<String>> listFields(final DataInfo info, final String... values) {
|
||||
return Arrays
|
||||
.stream(values)
|
||||
.map(v -> field(v, info))
|
||||
.filter(Objects::nonNull)
|
||||
.filter(distinctByKey(Field::getValue))
|
||||
.collect(Collectors.toList());
|
||||
}
|
||||
|
||||
public static <T> List<T> listValues(Array values) throws SQLException {
|
||||
if (Objects.isNull(values)) {
|
||||
return null;
|
||||
}
|
||||
return Arrays
|
||||
.stream((T[]) values.getArray())
|
||||
.filter(Objects::nonNull)
|
||||
.distinct()
|
||||
.collect(Collectors.toList());
|
||||
}
|
||||
|
||||
public static List<Field<String>> listFields(final DataInfo info, final List<String> values) {
|
||||
return values
|
||||
.stream()
|
||||
.map(v -> field(v, info))
|
||||
.filter(Objects::nonNull)
|
||||
.filter(distinctByKey(Field::getValue))
|
||||
.collect(Collectors.toList());
|
||||
}
|
||||
|
||||
public static InstanceTypeMapping instanceTypeMapping(String originalType, String code, String label,
|
||||
String vocabularyName) {
|
||||
final InstanceTypeMapping m = new InstanceTypeMapping();
|
||||
m.setVocabularyName(vocabularyName);
|
||||
m.setOriginalType(originalType);
|
||||
m.setTypeCode(code);
|
||||
m.setTypeLabel(label);
|
||||
return m;
|
||||
}
|
||||
|
||||
public static InstanceTypeMapping instanceTypeMapping(String originalType, Qualifier term) {
|
||||
return instanceTypeMapping(originalType, term.getClassid(), term.getClassname(), term.getSchemeid());
|
||||
}
|
||||
|
||||
public static InstanceTypeMapping instanceTypeMapping(String originalType) {
|
||||
return instanceTypeMapping(originalType, null, null, null);
|
||||
}
|
||||
|
||||
public static InstanceTypeMapping instanceTypeMapping(String originalType, String vocabularyName) {
|
||||
return instanceTypeMapping(originalType, null, null, vocabularyName);
|
||||
}
|
||||
|
||||
public static Qualifier unknown(final String schemeid, final String schemename) {
|
||||
return qualifier(UNKNOWN, "Unknown", schemeid, schemename);
|
||||
}
|
||||
|
||||
public static AccessRight accessRight(
|
||||
final String classid,
|
||||
final String classname,
|
||||
final String schemeid,
|
||||
final String schemename) {
|
||||
return accessRight(classid, classname, schemeid, schemename, null);
|
||||
}
|
||||
|
||||
public static AccessRight accessRight(
|
||||
final String classid,
|
||||
final String classname,
|
||||
final String schemeid,
|
||||
final String schemename,
|
||||
final OpenAccessRoute openAccessRoute) {
|
||||
final AccessRight accessRight = new AccessRight();
|
||||
accessRight.setClassid(classid);
|
||||
accessRight.setClassname(classname);
|
||||
accessRight.setSchemeid(schemeid);
|
||||
accessRight.setSchemename(schemename);
|
||||
accessRight.setOpenAccessRoute(openAccessRoute);
|
||||
return accessRight;
|
||||
}
|
||||
|
||||
public static Qualifier qualifier(
|
||||
final String classid,
|
||||
final String classname,
|
||||
final String schemeid,
|
||||
final String schemename) {
|
||||
final Qualifier q = new Qualifier();
|
||||
q.setClassid(classid);
|
||||
q.setClassname(classname);
|
||||
q.setSchemeid(schemeid);
|
||||
q.setSchemename(schemename);
|
||||
return q;
|
||||
}
|
||||
|
||||
public static Qualifier qualifier(final Qualifier qualifier) {
|
||||
final Qualifier q = new Qualifier();
|
||||
q.setClassid(qualifier.getClassid());
|
||||
q.setClassname(qualifier.getClassname());
|
||||
q.setSchemeid(qualifier.getSchemeid());
|
||||
q.setSchemename(qualifier.getSchemename());
|
||||
return q;
|
||||
}
|
||||
|
||||
public static Subject subject(
|
||||
final String value,
|
||||
final String classid,
|
||||
final String classname,
|
||||
final String schemeid,
|
||||
final String schemename,
|
||||
final DataInfo dataInfo) {
|
||||
|
||||
return subject(value, qualifier(classid, classname, schemeid, schemename), dataInfo);
|
||||
}
|
||||
|
||||
public static StructuredProperty structuredProperty(
|
||||
final String value,
|
||||
final String classid,
|
||||
final String classname,
|
||||
final String schemeid,
|
||||
final String schemename,
|
||||
final DataInfo dataInfo) {
|
||||
|
||||
return structuredProperty(value, qualifier(classid, classname, schemeid, schemename), dataInfo);
|
||||
}
|
||||
|
||||
public static Subject subject(
|
||||
final String value,
|
||||
final Qualifier qualifier,
|
||||
final DataInfo dataInfo) {
|
||||
if (value == null) {
|
||||
return null;
|
||||
}
|
||||
final Subject s = new Subject();
|
||||
s.setValue(value);
|
||||
s.setQualifier(qualifier);
|
||||
s.setDataInfo(dataInfo);
|
||||
return s;
|
||||
}
|
||||
|
||||
public static StructuredProperty structuredProperty(
|
||||
final String value,
|
||||
final Qualifier qualifier,
|
||||
final DataInfo dataInfo) {
|
||||
if (value == null) {
|
||||
return null;
|
||||
}
|
||||
final StructuredProperty sp = new StructuredProperty();
|
||||
sp.setValue(value);
|
||||
sp.setQualifier(qualifier);
|
||||
sp.setDataInfo(dataInfo);
|
||||
return sp;
|
||||
}
|
||||
|
||||
public static ExtraInfo extraInfo(
|
||||
final String name,
|
||||
final String value,
|
||||
final String typology,
|
||||
final String provenance,
|
||||
final String trust) {
|
||||
final ExtraInfo info = new ExtraInfo();
|
||||
info.setName(name);
|
||||
info.setValue(value);
|
||||
info.setTypology(typology);
|
||||
info.setProvenance(provenance);
|
||||
info.setTrust(trust);
|
||||
return info;
|
||||
}
|
||||
|
||||
public static OAIProvenance oaiIProvenance(
|
||||
final String identifier,
|
||||
final String baseURL,
|
||||
final String metadataNamespace,
|
||||
final Boolean altered,
|
||||
final String datestamp,
|
||||
final String harvestDate) {
|
||||
|
||||
final OriginDescription desc = new OriginDescription();
|
||||
desc.setIdentifier(identifier);
|
||||
desc.setBaseURL(baseURL);
|
||||
desc.setMetadataNamespace(metadataNamespace);
|
||||
desc.setAltered(altered);
|
||||
desc.setDatestamp(datestamp);
|
||||
desc.setHarvestDate(harvestDate);
|
||||
|
||||
final OAIProvenance p = new OAIProvenance();
|
||||
p.setOriginDescription(desc);
|
||||
|
||||
return p;
|
||||
}
|
||||
|
||||
public static Journal journal(
|
||||
final String name,
|
||||
final String issnPrinted,
|
||||
final String issnOnline,
|
||||
final String issnLinking,
|
||||
final DataInfo dataInfo) {
|
||||
|
||||
return hasIssn(issnPrinted, issnOnline, issnLinking) ? journal(
|
||||
name,
|
||||
issnPrinted,
|
||||
issnOnline,
|
||||
issnLinking,
|
||||
null,
|
||||
null,
|
||||
null,
|
||||
null,
|
||||
null,
|
||||
null,
|
||||
null,
|
||||
dataInfo) : null;
|
||||
}
|
||||
|
||||
public static Journal journal(
|
||||
final String name,
|
||||
final String issnPrinted,
|
||||
final String issnOnline,
|
||||
final String issnLinking,
|
||||
final String ep,
|
||||
final String iss,
|
||||
final String sp,
|
||||
final String vol,
|
||||
final String edition,
|
||||
final String conferenceplace,
|
||||
final String conferencedate,
|
||||
final DataInfo dataInfo) {
|
||||
|
||||
if (StringUtils.isNotBlank(name) || hasIssn(issnPrinted, issnOnline, issnLinking)) {
|
||||
final Journal j = new Journal();
|
||||
j.setName(name);
|
||||
j.setIssnPrinted(issnPrinted);
|
||||
j.setIssnOnline(issnOnline);
|
||||
j.setIssnLinking(issnLinking);
|
||||
j.setEp(ep);
|
||||
j.setIss(iss);
|
||||
j.setSp(sp);
|
||||
j.setVol(vol);
|
||||
j.setEdition(edition);
|
||||
j.setConferenceplace(conferenceplace);
|
||||
j.setConferencedate(conferencedate);
|
||||
j.setDataInfo(dataInfo);
|
||||
return j;
|
||||
} else {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
private static boolean hasIssn(String issnPrinted, String issnOnline, String issnLinking) {
|
||||
return StringUtils.isNotBlank(issnPrinted)
|
||||
|| StringUtils.isNotBlank(issnOnline)
|
||||
|| StringUtils.isNotBlank(issnLinking);
|
||||
}
|
||||
|
||||
public static DataInfo dataInfo(
|
||||
final Boolean deletedbyinference,
|
||||
final String inferenceprovenance,
|
||||
final Boolean inferred,
|
||||
final Boolean invisible,
|
||||
final Qualifier provenanceaction,
|
||||
final String trust) {
|
||||
final DataInfo d = new DataInfo();
|
||||
d.setDeletedbyinference(deletedbyinference);
|
||||
d.setInferenceprovenance(inferenceprovenance);
|
||||
d.setInferred(inferred);
|
||||
d.setInvisible(invisible);
|
||||
d.setProvenanceaction(provenanceaction);
|
||||
d.setTrust(trust);
|
||||
return d;
|
||||
}
|
||||
|
||||
public static String createOpenaireId(
|
||||
final int prefix,
|
||||
final String originalId,
|
||||
final boolean to_md5) {
|
||||
if (StringUtils.isBlank(originalId)) {
|
||||
return null;
|
||||
} else if (to_md5) {
|
||||
final String nsPrefix = StringUtils.substringBefore(originalId, "::");
|
||||
final String rest = StringUtils.substringAfter(originalId, "::");
|
||||
return String.format("%s|%s::%s", prefix, nsPrefix, IdentifierFactory.md5(rest));
|
||||
} else {
|
||||
return String.format("%s|%s", prefix, originalId);
|
||||
}
|
||||
}
|
||||
|
||||
public static String createOpenaireId(
|
||||
final String type,
|
||||
final String originalId,
|
||||
final boolean to_md5) {
|
||||
switch (type) {
|
||||
case "datasource":
|
||||
return createOpenaireId(10, originalId, to_md5);
|
||||
case "organization":
|
||||
return createOpenaireId(20, originalId, to_md5);
|
||||
case "person":
|
||||
return createOpenaireId(30, originalId, to_md5);
|
||||
case "project":
|
||||
return createOpenaireId(40, originalId, to_md5);
|
||||
default:
|
||||
return createOpenaireId(50, originalId, to_md5);
|
||||
}
|
||||
}
|
||||
|
||||
public static String asString(final Object o) {
|
||||
return o == null ? "" : o.toString();
|
||||
}
|
||||
|
||||
public static <T> Predicate<T> distinctByKey(
|
||||
final Function<? super T, ?> keyExtractor) {
|
||||
final Map<Object, Boolean> seen = new ConcurrentHashMap<>();
|
||||
return t -> seen.putIfAbsent(keyExtractor.apply(t), Boolean.TRUE) == null;
|
||||
}
|
||||
|
||||
public static Qualifier createBestAccessRights(final List<Instance> instanceList) {
|
||||
return getBestAccessRights(instanceList);
|
||||
}
|
||||
|
||||
protected static Qualifier getBestAccessRights(final List<Instance> instanceList) {
|
||||
if (instanceList != null) {
|
||||
final Optional<AccessRight> min = instanceList
|
||||
.stream()
|
||||
.map(Instance::getAccessright)
|
||||
.min(new AccessRightComparator<>());
|
||||
|
||||
final Qualifier rights = min.map(OafMapperUtils::qualifier).orElseGet(Qualifier::new);
|
||||
|
||||
if (StringUtils.isBlank(rights.getClassid())) {
|
||||
rights.setClassid(UNKNOWN);
|
||||
}
|
||||
if (StringUtils.isBlank(rights.getClassname())
|
||||
|| UNKNOWN.equalsIgnoreCase(rights.getClassname())) {
|
||||
rights.setClassname(NOT_AVAILABLE);
|
||||
}
|
||||
if (StringUtils.isBlank(rights.getSchemeid())) {
|
||||
rights.setSchemeid(DNET_ACCESS_MODES);
|
||||
}
|
||||
if (StringUtils.isBlank(rights.getSchemename())) {
|
||||
rights.setSchemename(DNET_ACCESS_MODES);
|
||||
}
|
||||
|
||||
return rights;
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
public static KeyValue newKeyValueInstance(String key, String value, DataInfo dataInfo) {
|
||||
KeyValue kv = new KeyValue();
|
||||
kv.setDataInfo(dataInfo);
|
||||
kv.setKey(key);
|
||||
kv.setValue(value);
|
||||
return kv;
|
||||
}
|
||||
|
||||
public static Measure newMeasureInstance(String id, String value, String key, DataInfo dataInfo) {
|
||||
Measure m = new Measure();
|
||||
m.setId(id);
|
||||
m.setUnit(Arrays.asList(newKeyValueInstance(key, value, dataInfo)));
|
||||
return m;
|
||||
}
|
||||
|
||||
public static Relation getRelation(final String source,
|
||||
final String target,
|
||||
final String relType,
|
||||
final String subRelType,
|
||||
final String relClass,
|
||||
final OafEntity entity) {
|
||||
return getRelation(source, target, relType, subRelType, relClass, entity, null);
|
||||
}
|
||||
|
||||
public static Relation getRelation(final String source,
|
||||
final String target,
|
||||
final String relType,
|
||||
final String subRelType,
|
||||
final String relClass,
|
||||
final OafEntity entity,
|
||||
final String validationDate) {
|
||||
return getRelation(
|
||||
source, target, relType, subRelType, relClass, entity.getCollectedfrom(), entity.getDataInfo(),
|
||||
entity.getLastupdatetimestamp(), validationDate, null);
|
||||
}
|
||||
|
||||
public static Relation getRelation(final String source,
|
||||
final String target,
|
||||
final String relType,
|
||||
final String subRelType,
|
||||
final String relClass,
|
||||
final List<KeyValue> collectedfrom,
|
||||
final DataInfo dataInfo,
|
||||
final Long lastupdatetimestamp) {
|
||||
return getRelation(
|
||||
source, target, relType, subRelType, relClass, collectedfrom, dataInfo, lastupdatetimestamp, null, null);
|
||||
}
|
||||
|
||||
public static Relation getRelation(final String source,
|
||||
final String target,
|
||||
final String relType,
|
||||
final String subRelType,
|
||||
final String relClass,
|
||||
final List<KeyValue> collectedfrom,
|
||||
final DataInfo dataInfo,
|
||||
final Long lastupdatetimestamp,
|
||||
final String validationDate,
|
||||
final List<KeyValue> properties) {
|
||||
final Relation rel = new Relation();
|
||||
rel.setRelType(relType);
|
||||
rel.setSubRelType(subRelType);
|
||||
rel.setRelClass(relClass);
|
||||
rel.setSource(source);
|
||||
rel.setTarget(target);
|
||||
rel.setCollectedfrom(collectedfrom);
|
||||
rel.setDataInfo(dataInfo);
|
||||
rel.setLastupdatetimestamp(lastupdatetimestamp);
|
||||
rel.setValidated(StringUtils.isNotBlank(validationDate));
|
||||
rel.setValidationDate(StringUtils.isNotBlank(validationDate) ? validationDate : null);
|
||||
rel.setProperties(properties);
|
||||
return rel;
|
||||
}
|
||||
|
||||
public static String getProvenance(DataInfo dataInfo) {
|
||||
return Optional
|
||||
.ofNullable(dataInfo)
|
||||
.map(
|
||||
d -> Optional
|
||||
.ofNullable(d.getProvenanceaction())
|
||||
.map(Qualifier::getClassid)
|
||||
.orElse(""))
|
||||
.orElse("");
|
||||
}
|
||||
}
|
|
@ -1,46 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import java.util.Comparator;
|
||||
|
||||
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
|
||||
|
||||
public class OrganizationPidComparator implements Comparator<StructuredProperty> {
|
||||
|
||||
@Override
|
||||
public int compare(StructuredProperty left, StructuredProperty right) {
|
||||
if (left == null) {
|
||||
return right == null ? 0 : -1;
|
||||
} else if (right == null) {
|
||||
return 1;
|
||||
}
|
||||
|
||||
PidType lClass = PidType.tryValueOf(left.getQualifier().getClassid());
|
||||
PidType rClass = PidType.tryValueOf(right.getQualifier().getClassid());
|
||||
|
||||
if (lClass.equals(rClass))
|
||||
return 0;
|
||||
|
||||
if (lClass.equals(PidType.openorgs))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.openorgs))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.GRID))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.GRID))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.mag_id))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.mag_id))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.urn))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.urn))
|
||||
return 1;
|
||||
|
||||
return 0;
|
||||
}
|
||||
}
|
|
@@ -1,21 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PICCleaningRule {

    public static final Pattern PATTERN = Pattern.compile("\\d{9}");

    public static String clean(final String pic) {

        Matcher m = PATTERN.matcher(pic);
        if (m.find()) {
            return m.group();
        } else {
            return "";
        }
    }

}
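Usage sketch for the rule above (the PIC digits are made up for illustration): the rule extracts the first nine-digit run it finds, or returns an empty string.

PICCleaningRule.clean("participant PIC 999884321"); // -> "999884321"
PICCleaningRule.clean("12345");                     // -> ""  (no nine-digit sequence)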
@@ -1,8 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.HashMap;
import java.util.HashSet;

public class PidBlacklist extends HashMap<String, HashSet<String>> {
}
@@ -1,40 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.io.IOException;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;

import org.apache.commons.io.IOUtils;

import com.fasterxml.jackson.databind.ObjectMapper;

public class PidBlacklistProvider {

    private static final PidBlacklist blacklist;

    static {
        try {
            String json = IOUtils.toString(IdentifierFactory.class.getResourceAsStream("pid_blacklist.json"));
            blacklist = new ObjectMapper().readValue(json, PidBlacklist.class);

        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static PidBlacklist getBlacklist() {
        return blacklist;
    }

    public static Set<String> getBlacklist(String pidType) {
        return Optional
            .ofNullable(getBlacklist().get(pidType))
            .orElse(new HashSet<>());
    }

    private PidBlacklistProvider() {
    }

}
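A small usage sketch, assuming a hypothetical pid_blacklist.json of the shape the class expects (a JSON object mapping a PID type to an array of blacklisted values; the entries actually shipped with the module are not shown in this diff):

// pid_blacklist.json (hypothetical content): { "doi": ["10.1234/fake-doi"], "pmid": ["0000000"] }

java.util.Set<String> blacklistedDois = PidBlacklistProvider.getBlacklist("doi");
boolean skip = blacklistedDois.contains("10.1234/fake-doi"); // true with the hypothetical content above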
@@ -1,62 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.Optional;

import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class PidCleaner {

    /**
     * Utility method that normalises PID values on a per-type basis.
     * @param pid the PID whose value will be normalised.
     * @return the PID containing the normalised value.
     */
    public static StructuredProperty normalizePidValue(StructuredProperty pid) {
        pid
            .setValue(
                normalizePidValue(
                    pid.getQualifier().getClassid(),
                    pid.getValue()));

        return pid;
    }

    public static String normalizePidValue(String pidType, String pidValue) {
        String value = Optional
            .ofNullable(pidValue)
            .map(String::trim)
            .orElseThrow(() -> new IllegalArgumentException("PID value cannot be empty"));

        switch (pidType) {

            // TODO add cleaning for more PID types as needed

            // Result
            case "doi":
                return DoiCleaningRule.clean(value);
            case "pmid":
                return PmidCleaningRule.clean(value);
            case "pmc":
                return PmcCleaningRule.clean(value);
            case "handle":
            case "arXiv":
                return value;

            // Organization
            case "GRID":
                return GridCleaningRule.clean(value);
            case "ISNI":
                return ISNICleaningRule.clean(value);
            case "ROR":
                return RorCleaningRule.clean(value);
            case "PIC":
                return PICCleaningRule.clean(value);
            case "FundRef":
                return FundRefCleaningRule.clean(value);
            default:
                return value;
        }
    }

}
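Usage sketch for the dispatcher above. The PMID result follows from PmidCleaningRule further down in this diff; the DOI case delegates to DoiCleaningRule, which is not shown in this hunk, so its result is left unstated; unknown types simply come back trimmed.

String pmid = PidCleaner.normalizePidValue("pmid", " 0021804956 ");          // -> "21804956"
String doi = PidCleaner.normalizePidValue("doi", "10.1000/182");             // delegated to DoiCleaningRule (not shown in this hunk)
String other = PidCleaner.normalizePidValue("isbn", " 978-3-16-148410-0 ");  // unknown type -> trimmed value, otherwise unchanged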
@ -1,48 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import java.util.Comparator;
|
||||
|
||||
import eu.dnetlib.dhp.schema.common.ModelSupport;
|
||||
import eu.dnetlib.dhp.schema.oaf.OafEntity;
|
||||
import eu.dnetlib.dhp.schema.oaf.Organization;
|
||||
import eu.dnetlib.dhp.schema.oaf.Result;
|
||||
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
|
||||
|
||||
public class PidComparator<T extends OafEntity> implements Comparator<StructuredProperty> {
|
||||
|
||||
private final T entity;
|
||||
|
||||
public PidComparator(T entity) {
|
||||
this.entity = entity;
|
||||
}
|
||||
|
||||
@Override
|
||||
public int compare(StructuredProperty left, StructuredProperty right) {
|
||||
|
||||
if (left == null && right == null)
|
||||
return 0;
|
||||
if (left == null)
|
||||
return 1;
|
||||
if (right == null)
|
||||
return -1;
|
||||
|
||||
if (ModelSupport.isSubClass(entity, Result.class)) {
|
||||
return compareResultPids(left, right);
|
||||
}
|
||||
if (ModelSupport.isSubClass(entity, Organization.class)) {
|
||||
return compareOrganizationtPids(left, right);
|
||||
}
|
||||
|
||||
// Else (but unlikely), lexicographical ordering will do.
|
||||
return left.getQualifier().getClassid().compareTo(right.getQualifier().getClassid());
|
||||
}
|
||||
|
||||
private int compareResultPids(StructuredProperty left, StructuredProperty right) {
|
||||
return new ResultPidComparator().compare(left, right);
|
||||
}
|
||||
|
||||
private int compareOrganizationtPids(StructuredProperty left, StructuredProperty right) {
|
||||
return new OrganizationPidComparator().compare(left, right);
|
||||
}
|
||||
}
|
|
@ -1,79 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import org.apache.commons.lang3.EnumUtils;
|
||||
|
||||
public enum PidType {
|
||||
|
||||
/**
|
||||
* The DOI syntax shall be made up of a DOI prefix and a DOI suffix separated by a forward slash.
|
||||
*
|
||||
* There is no defined limit on the length of the DOI name, or of the DOI prefix or DOI suffix.
|
||||
*
|
||||
* The DOI name is case-insensitive and can incorporate any printable characters from the legal graphic characters
|
||||
* of Unicode. Further constraints on character use (e.g. use of language-specific alphanumeric characters) can be
|
||||
* defined for an application by the ISO 26324 Registration Authority.
|
||||
*
|
||||
*
|
||||
* DOI prefix: The DOI prefix shall be composed of a directory indicator followed by a registrant code.
|
||||
* These two components shall be separated by a full stop (period). The directory indicator shall be "10" and
|
||||
* distinguishes the entire set of character strings (prefix and suffix) as digital object identifiers within the
|
||||
* resolution system.
|
||||
*
|
||||
* Registrant code: The second element of the DOI prefix shall be the registrant code. The registrant code is a
|
||||
* unique string assigned to a registrant.
|
||||
*
|
||||
* DOI suffix: The DOI suffix shall consist of a character string of any length chosen by the registrant.
|
||||
* Each suffix shall be unique to the prefix element that precedes it. The unique suffix can be a sequential number,
|
||||
* or it might incorporate an identifier generated from or based on another system used by the registrant
|
||||
* (e.g. ISAN, ISBN, ISRC, ISSN, ISTC, ISNI; in such cases, a preferred construction for such a suffix can be
|
||||
* specified, as in Example 1).
|
||||
*
|
||||
* Source: https://www.doi.org/doi_handbook/2_Numbering.html#2.2
|
||||
*/
|
||||
doi,
|
||||
|
||||
/**
|
||||
* PubMed Unique Identifier (PMID)
|
||||
*
|
||||
* This field is a 1-to-8 digit accession number with no leading zeros. It is present on all records and is the
|
||||
* accession number for managing and disseminating records. PMIDs are not reused after records are deleted.
|
||||
*
|
||||
* Beginning in February 2012 PMIDs include extensions following a decimal point to account for article versions
|
||||
* (e.g., 21804956.2). All citations are considered version 1 until replaced. The extended PMID is not displayed
|
||||
* on the MEDLINE format.
|
||||
*
|
||||
* View the citation in abstract format in PubMed to access additional versions when available (see the article in
|
||||
* the Jan-Feb 2012 NLM Technical Bulletin).
|
||||
*
|
||||
* Source: https://www.nlm.nih.gov/bsd/mms/medlineelements.html#pmid
|
||||
*/
|
||||
pmid,
|
||||
|
||||
/**
|
||||
* This field contains the unique identifier for the cited article in PubMed Central. The identifier begins with the
|
||||
* prefix PMC.
|
||||
*
|
||||
* Source: https://www.nlm.nih.gov/bsd/mms/medlineelements.html#pmc
|
||||
*/
|
||||
pmc, handle, arXiv, nct, pdb, w3id,
|
||||
|
||||
// Organization
|
||||
openorgs, ROR, GRID, PIC, ISNI, Wikidata, FundRef, corda, corda_h2020, mag_id, urn,
|
||||
|
||||
// Used by dedup
|
||||
undefined, original;
|
||||
|
||||
public static boolean isValid(String type) {
|
||||
return EnumUtils.isValidEnum(PidType.class, type);
|
||||
}
|
||||
|
||||
public static PidType tryValueOf(String s) {
|
||||
try {
|
||||
return PidType.valueOf(s);
|
||||
} catch (Exception e) {
|
||||
return PidType.original;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
|
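Usage sketch for the enum above: tryValueOf never throws, falling back to PidType.original for anything it does not recognise ("scopus" below is just an arbitrary unknown string).

PidType known = PidType.tryValueOf("doi");      // -> PidType.doi
PidType unknown = PidType.tryValueOf("scopus"); // no matching constant -> PidType.original
boolean valid = PidType.isValid("pmc");         // -> true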
@ -1,33 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import java.util.Comparator;
|
||||
import java.util.Optional;
|
||||
|
||||
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
|
||||
|
||||
public class PidValueComparator implements Comparator<StructuredProperty> {
|
||||
|
||||
@Override
|
||||
public int compare(StructuredProperty left, StructuredProperty right) {
|
||||
|
||||
if (left == null && right == null)
|
||||
return 0;
|
||||
if (left == null)
|
||||
return 1;
|
||||
if (right == null)
|
||||
return -1;
|
||||
|
||||
StructuredProperty l = PidCleaner.normalizePidValue(left);
|
||||
StructuredProperty r = PidCleaner.normalizePidValue(right);
|
||||
|
||||
return Optional
|
||||
.ofNullable(l.getValue())
|
||||
.map(
|
||||
lv -> Optional
|
||||
.ofNullable(r.getValue())
|
||||
.map(rv -> lv.compareTo(rv))
|
||||
.orElse(-1))
|
||||
.orElse(1);
|
||||
}
|
||||
}
|
|
@@ -1,24 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PmcCleaningRule {

    public static final Pattern PATTERN = Pattern.compile("PMC\\d{1,8}");

    public static String clean(String pmc) {
        String s = pmc
            .replaceAll("\\s", "")
            .toUpperCase();

        final Matcher m = PATTERN.matcher(s);

        if (m.find()) {
            return m.group();
        }
        return "";
    }

}
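Usage sketch (identifiers are illustrative): whitespace is stripped and the value upper-cased before the PMC pattern is applied.

PmcCleaningRule.clean(" pmc1234567 "); // -> "PMC1234567"
PmcCleaningRule.clean("PMC 98765");    // -> "PMC98765"
PmcCleaningRule.clean("1234567");      // -> ""  (missing the PMC prefix)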
@@ -1,25 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// https://researchguides.stevens.edu/c.php?g=442331&p=6577176
public class PmidCleaningRule {

    public static final Pattern PATTERN = Pattern.compile("0*(\\d{1,8})");

    public static String clean(String pmid) {
        String s = pmid
            .toLowerCase()
            .replaceAll("\\s", "");

        final Matcher m = PATTERN.matcher(s);

        if (m.find()) {
            return m.group(1);
        }
        return "";
    }

}
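Usage sketch (values are illustrative): leading zeros are dropped and matching stops at the first non-digit, which also strips the post-2012 version suffix.

PmidCleaningRule.clean("0021804956");   // -> "21804956"  (leading zeros dropped)
PmidCleaningRule.clean(" 21804956.2 "); // -> "21804956"  (whitespace removed, version suffix dropped)
PmidCleaningRule.clean("n/a");          // -> ""  (no digits to extract)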
@ -1,46 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import java.util.Comparator;
|
||||
|
||||
import eu.dnetlib.dhp.schema.oaf.Qualifier;
|
||||
|
||||
/**
|
||||
* Comparator for sorting the values from the dnet:review_levels vocabulary, implements the following ordering
|
||||
*
|
||||
* peerReviewed (0001) > nonPeerReviewed (0002) > UNKNOWN (0000)
|
||||
*/
|
||||
public class RefereedComparator implements Comparator<Qualifier> {
|
||||
|
||||
@Override
|
||||
public int compare(Qualifier left, Qualifier right) {
|
||||
if (left == null || left.getClassid() == null) {
|
||||
return (right == null || right.getClassid() == null) ? 0 : -1;
|
||||
} else if (right == null || right.getClassid() == null) {
|
||||
return 1;
|
||||
}
|
||||
|
||||
String lClass = left.getClassid();
|
||||
String rClass = right.getClassid();
|
||||
|
||||
if (lClass.equals(rClass))
|
||||
return 0;
|
||||
|
||||
if ("0001".equals(lClass))
|
||||
return -1;
|
||||
if ("0001".equals(rClass))
|
||||
return 1;
|
||||
|
||||
if ("0002".equals(lClass))
|
||||
return -1;
|
||||
if ("0002".equals(rClass))
|
||||
return 1;
|
||||
|
||||
if ("0000".equals(lClass))
|
||||
return -1;
|
||||
if ("0000".equals(rClass))
|
||||
return 1;
|
||||
|
||||
return 0;
|
||||
}
|
||||
}
|
|
@ -1,56 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import java.util.Comparator;
|
||||
|
||||
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
|
||||
|
||||
public class ResultPidComparator implements Comparator<StructuredProperty> {
|
||||
|
||||
@Override
|
||||
public int compare(StructuredProperty left, StructuredProperty right) {
|
||||
|
||||
PidType lClass = PidType.tryValueOf(left.getQualifier().getClassid());
|
||||
PidType rClass = PidType.tryValueOf(right.getQualifier().getClassid());
|
||||
|
||||
if (lClass.equals(rClass))
|
||||
return 0;
|
||||
|
||||
if (lClass.equals(PidType.doi))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.doi))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.pmid))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.pmid))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.pmc))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.pmc))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.handle))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.handle))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.arXiv))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.arXiv))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.nct))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.nct))
|
||||
return 1;
|
||||
|
||||
if (lClass.equals(PidType.pdb))
|
||||
return -1;
|
||||
if (rClass.equals(PidType.pdb))
|
||||
return 1;
|
||||
|
||||
return 0;
|
||||
}
|
||||
}
|
|
@@ -1,27 +0,0 @@

package eu.dnetlib.dhp.schema.oaf.utils;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// https://ror.readme.io/docs/ror-identifier-pattern
public class RorCleaningRule {

    public static final String ROR_PREFIX = "https://ror.org/";

    private static final Pattern PATTERN = Pattern.compile("(?<ror>0[a-hj-km-np-tv-z|0-9]{6}[0-9]{2})");

    public static String clean(String ror) {
        String s = ror
            .replaceAll("\\s", "")
            .toLowerCase();

        Matcher m = PATTERN.matcher(s);

        if (m.find()) {
            return ROR_PREFIX + m.group("ror");
        }
        return "";
    }

}
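Usage sketch (the identifier below is a syntactically valid but arbitrary ROR id, not a real organisation): the rule lower-cases the input, strips whitespace and re-emits the id with the canonical https://ror.org/ prefix.

RorCleaningRule.clean("https://ror.org/01qrts487"); // -> "https://ror.org/01qrts487"
RorCleaningRule.clean(" 01QRTS487 ");               // -> "https://ror.org/01qrts487"
RorCleaningRule.clean("not-a-ror");                 // -> ""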
@ -1,46 +0,0 @@
|
|||
|
||||
package eu.dnetlib.dhp.schema.oaf.utils;
|
||||
|
||||
import static eu.dnetlib.dhp.schema.oaf.utils.OafMapperUtils.getProvenance;
|
||||
import static org.apache.commons.lang3.StringUtils.isBlank;
|
||||
|
||||
import java.util.Comparator;
|
||||
|
||||
import eu.dnetlib.dhp.schema.oaf.Subject;
|
||||
|
||||
public class SubjectProvenanceComparator implements Comparator<Subject> {
|
||||
|
||||
@Override
|
||||
public int compare(Subject left, Subject right) {
|
||||
|
||||
String lProv = getProvenance(left.getDataInfo());
|
||||
String rProv = getProvenance(right.getDataInfo());
|
||||
|
||||
if (isBlank(lProv) && isBlank(rProv))
|
||||
return 0;
|
||||
if (isBlank(lProv))
|
||||
return 1;
|
||||
if (isBlank(rProv))
|
||||
return -1;
|
||||
if (lProv.equals(rProv))
|
||||
return 0;
|
||||
if (lProv.toLowerCase().contains("crosswalk"))
|
||||
return -1;
|
||||
if (rProv.toLowerCase().contains("crosswalk"))
|
||||
return 1;
|
||||
if (lProv.toLowerCase().contains("user"))
|
||||
return -1;
|
||||
if (rProv.toLowerCase().contains("user"))
|
||||
return 1;
|
||||
if (lProv.toLowerCase().contains("propagation"))
|
||||
return -1;
|
||||
if (rProv.toLowerCase().contains("propagation"))
|
||||
return 1;
|
||||
if (lProv.toLowerCase().contains("iis"))
|
||||
return -1;
|
||||
if (rProv.toLowerCase().contains("iis"))
|
||||
return 1;
|
||||
|
||||
return 0;
|
||||
}
|
||||
}
|
|
@ -1,113 +1,63 @@
|
|||
|
||||
package eu.dnetlib.dhp.utils;
|
||||
|
||||
import java.io.*;
|
||||
import java.io.ByteArrayInputStream;
|
||||
import java.io.ByteArrayOutputStream;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.security.MessageDigest;
|
||||
import java.util.*;
|
||||
import java.util.stream.Collectors;
|
||||
import java.util.zip.GZIPInputStream;
|
||||
import java.util.zip.GZIPOutputStream;
|
||||
|
||||
import org.apache.commons.codec.binary.Base64;
|
||||
import org.apache.commons.codec.binary.Base64OutputStream;
|
||||
import org.apache.commons.codec.binary.Hex;
|
||||
import org.apache.commons.io.IOUtils;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.apache.hadoop.conf.Configuration;
|
||||
import org.apache.hadoop.fs.FileSystem;
|
||||
import org.apache.hadoop.fs.Path;
|
||||
import org.apache.http.client.methods.CloseableHttpResponse;
|
||||
import org.apache.http.client.methods.HttpGet;
|
||||
import org.apache.http.impl.client.CloseableHttpClient;
|
||||
import org.apache.http.impl.client.HttpClients;
|
||||
import org.apache.spark.sql.Dataset;
|
||||
import org.apache.spark.sql.SaveMode;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.google.common.collect.Maps;
|
||||
import com.jayway.jsonpath.JsonPath;
|
||||
|
||||
import eu.dnetlib.dhp.schema.mdstore.MDStoreWithInfo;
|
||||
import eu.dnetlib.dhp.schema.oaf.utils.CleaningFunctions;
|
||||
import eu.dnetlib.dhp.schema.oaf.utils.PidCleaner;
|
||||
import net.minidev.json.JSONArray;
|
||||
import scala.collection.JavaConverters;
|
||||
import scala.collection.Seq;
|
||||
|
||||
public class DHPUtils {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(DHPUtils.class);
|
||||
|
||||
private DHPUtils() {
|
||||
}
|
||||
|
||||
public static Seq<String> toSeq(List<String> list) {
|
||||
return JavaConverters.asScalaIteratorConverter(list.iterator()).asScala().toSeq();
|
||||
}
|
||||
|
||||
public static String md5(final String s) {
|
||||
try {
|
||||
final MessageDigest md = MessageDigest.getInstance("MD5");
|
||||
md.update(s.getBytes(StandardCharsets.UTF_8));
|
||||
return new String(Hex.encodeHex(md.digest()));
|
||||
} catch (final Exception e) {
|
||||
log.error("Error creating id from {}", s);
|
||||
System.err.println("Error creating id");
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Retrieves from the metadata store manager application the list of paths associated with mdstores characterized
|
||||
* by he given format, layout, interpretation
|
||||
* @param mdstoreManagerUrl the URL of the mdstore manager service
|
||||
* @param format the mdstore format
|
||||
* @param layout the mdstore layout
|
||||
* @param interpretation the mdstore interpretation
|
||||
* @param includeEmpty include Empty mdstores
|
||||
* @return the set of hdfs paths
|
||||
* @throws IOException in case of HTTP communication issues
|
||||
*/
|
||||
public static Set<String> mdstorePaths(final String mdstoreManagerUrl,
|
||||
final String format,
|
||||
final String layout,
|
||||
final String interpretation,
|
||||
boolean includeEmpty) throws IOException {
|
||||
final String url = mdstoreManagerUrl + "/mdstores/";
|
||||
final ObjectMapper objectMapper = new ObjectMapper();
|
||||
|
||||
final HttpGet req = new HttpGet(url);
|
||||
|
||||
log.info("MDStoreManager request: {}", req);
|
||||
|
||||
try (final CloseableHttpClient client = HttpClients.createDefault()) {
|
||||
try (final CloseableHttpResponse response = client.execute(req)) {
|
||||
final String json = IOUtils.toString(response.getEntity().getContent());
|
||||
|
||||
log.info("MDStoreManager response: {}", json);
|
||||
|
||||
final MDStoreWithInfo[] mdstores = objectMapper.readValue(json, MDStoreWithInfo[].class);
|
||||
return Arrays
|
||||
.stream(mdstores)
|
||||
.filter(md -> md.getFormat().equalsIgnoreCase(format))
|
||||
.filter(md -> md.getLayout().equalsIgnoreCase(layout))
|
||||
.filter(md -> md.getInterpretation().equalsIgnoreCase(interpretation))
|
||||
.filter(md -> StringUtils.isNotBlank(md.getHdfsPath()))
|
||||
.filter(md -> StringUtils.isNotBlank(md.getCurrentVersion()))
|
||||
.filter(md -> includeEmpty || md.getSize() > 0)
|
||||
.map(md -> md.getHdfsPath() + "/" + md.getCurrentVersion() + "/store")
|
||||
.collect(Collectors.toSet());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
public static String generateIdentifier(final String originalId, final String nsPrefix) {
|
||||
return String.format("%s::%s", nsPrefix, DHPUtils.md5(originalId));
|
||||
}
|
||||
|
||||
public static String generateUnresolvedIdentifier(final String pid, final String pidType) {
|
||||
public static String compressString(final String input) {
|
||||
try (ByteArrayOutputStream out = new ByteArrayOutputStream();
|
||||
Base64OutputStream b64os = new Base64OutputStream(out)) {
|
||||
GZIPOutputStream gzip = new GZIPOutputStream(b64os);
|
||||
gzip.write(input.getBytes(StandardCharsets.UTF_8));
|
||||
gzip.close();
|
||||
return out.toString();
|
||||
} catch (Throwable e) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
final String cleanedPid = PidCleaner.normalizePidValue(pidType, pid);
|
||||
|
||||
return String.format("unresolved::%s::%s", cleanedPid, pidType.toLowerCase().trim());
|
||||
public static String decompressString(final String input) {
|
||||
byte[] byteArray = Base64.decodeBase64(input.getBytes());
|
||||
int len;
|
||||
try (GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream((byteArray)));
|
||||
ByteArrayOutputStream bos = new ByteArrayOutputStream(byteArray.length)) {
|
||||
byte[] buffer = new byte[1024];
|
||||
while ((len = gis.read(buffer)) != -1) {
|
||||
bos.write(buffer, 0, len);
|
||||
}
|
||||
return bos.toString();
|
||||
} catch (Exception e) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
public static String getJPathString(final String jsonPath, final String json) {
|
||||
|
@ -122,72 +72,4 @@ public class DHPUtils {
|
|||
return "";
|
||||
}
|
||||
}
|
||||
|
||||
public static final ObjectMapper MAPPER = new ObjectMapper();
|
||||
|
||||
public static void writeHdfsFile(final Configuration conf, final String content, final String path)
|
||||
throws IOException {
|
||||
|
||||
log.info("writing file {}, size {}", path, content.length());
|
||||
try (FileSystem fs = FileSystem.get(conf);
|
||||
BufferedOutputStream os = new BufferedOutputStream(fs.create(new Path(path)))) {
|
||||
os.write(content.getBytes(StandardCharsets.UTF_8));
|
||||
os.flush();
|
||||
}
|
||||
}
|
||||
|
||||
public static String readHdfsFile(Configuration conf, String path) throws IOException {
|
||||
log.info("reading file {}", path);
|
||||
|
||||
try (FileSystem fs = FileSystem.get(conf)) {
|
||||
final Path p = new Path(path);
|
||||
if (!fs.exists(p)) {
|
||||
throw new FileNotFoundException(path);
|
||||
}
|
||||
return IOUtils.toString(fs.open(p));
|
||||
}
|
||||
}
|
||||
|
||||
public static <T> T readHdfsFileAs(Configuration conf, String path, Class<T> clazz) throws IOException {
|
||||
return MAPPER.readValue(readHdfsFile(conf, path), clazz);
|
||||
}
|
||||
|
||||
public static <T> void saveDataset(final Dataset<T> mdstore, final String targetPath) {
|
||||
log.info("saving dataset in: {}", targetPath);
|
||||
mdstore
|
||||
.write()
|
||||
.mode(SaveMode.Overwrite)
|
||||
.format("parquet")
|
||||
.save(targetPath);
|
||||
}
|
||||
|
||||
public static Configuration getHadoopConfiguration(String nameNode) {
|
||||
// ====== Init HDFS File System Object
|
||||
Configuration conf = new Configuration();
|
||||
// Set FileSystem URI
|
||||
conf.set("fs.defaultFS", nameNode);
|
||||
// Because of Maven
|
||||
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
|
||||
conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
|
||||
|
||||
System.setProperty("hadoop.home.dir", "/");
|
||||
return conf;
|
||||
}
|
||||
|
||||
public static void populateOOZIEEnv(final Map<String, String> report) throws IOException {
|
||||
File file = new File(System.getProperty("oozie.action.output.properties"));
|
||||
Properties props = new Properties();
|
||||
report.forEach((k, v) -> props.setProperty(k, v));
|
||||
|
||||
try (OutputStream os = new FileOutputStream(file)) {
|
||||
props.store(os, "");
|
||||
}
|
||||
}
|
||||
|
||||
public static void populateOOZIEEnv(final String paramName, String value) throws IOException {
|
||||
Map<String, String> report = Maps.newHashMap();
|
||||
report.put(paramName, value);
|
||||
|
||||
populateOOZIEEnv(report);
|
||||
}
|
||||
}
|
||||
|
|
|
@ -1,25 +1,15 @@
|
|||
|
||||
package eu.dnetlib.dhp.utils;
|
||||
|
||||
import org.apache.cxf.endpoint.Client;
|
||||
import org.apache.cxf.frontend.ClientProxy;
|
||||
import org.apache.commons.logging.Log;
|
||||
import org.apache.commons.logging.LogFactory;
|
||||
import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;
|
||||
import org.apache.cxf.transport.http.HTTPConduit;
|
||||
import org.apache.cxf.transports.http.configuration.HTTPClientPolicy;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
|
||||
|
||||
public class ISLookupClientFactory {
|
||||
|
||||
private static final Logger log = LoggerFactory.getLogger(ISLookupClientFactory.class);
|
||||
|
||||
private static final int requestTimeout = 60000 * 10;
|
||||
private static final int connectTimeout = 60000 * 10;
|
||||
|
||||
private ISLookupClientFactory() {
|
||||
}
|
||||
private static final Log log = LogFactory.getLog(ISLookupClientFactory.class);
|
||||
|
||||
public static ISLookUpService getLookUpService(final String isLookupUrl) {
|
||||
return getServiceStub(ISLookUpService.class, isLookupUrl);
|
||||
|
@ -27,30 +17,10 @@ public class ISLookupClientFactory {
|
|||
|
||||
@SuppressWarnings("unchecked")
|
||||
private static <T> T getServiceStub(final Class<T> clazz, final String endpoint) {
|
||||
log.info("creating {} stub from {}", clazz.getName(), endpoint);
|
||||
log.info(String.format("creating %s stub from %s", clazz.getName(), endpoint));
|
||||
final JaxWsProxyFactoryBean jaxWsProxyFactory = new JaxWsProxyFactoryBean();
|
||||
jaxWsProxyFactory.setServiceClass(clazz);
|
||||
jaxWsProxyFactory.setAddress(endpoint);
|
||||
|
||||
final T service = (T) jaxWsProxyFactory.create();
|
||||
|
||||
Client client = ClientProxy.getClient(service);
|
||||
if (client != null) {
|
||||
HTTPConduit conduit = (HTTPConduit) client.getConduit();
|
||||
HTTPClientPolicy policy = new HTTPClientPolicy();
|
||||
|
||||
log
|
||||
.info(
|
||||
"setting connectTimeout to {}, requestTimeout to {} for service {}",
|
||||
connectTimeout,
|
||||
requestTimeout,
|
||||
clazz.getCanonicalName());
|
||||
|
||||
policy.setConnectionTimeout(connectTimeout);
|
||||
policy.setReceiveTimeout(requestTimeout);
|
||||
conduit.setClient(policy);
|
||||
}
|
||||
|
||||
return service;
|
||||
return (T) jaxWsProxyFactory.create();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -10,7 +10,7 @@ import net.sf.saxon.trans.XPathException;
|
|||
|
||||
public abstract class AbstractExtensionFunction extends ExtensionFunctionDefinition {
|
||||
|
||||
public static final String DEFAULT_SAXON_EXT_NS_URI = "http://www.d-net.research-infrastructures.eu/saxon-extension";
|
||||
public static String DEFAULT_SAXON_EXT_NS_URI = "http://www.d-net.research-infrastructures.eu/saxon-extension";
|
||||
|
||||
public abstract String getName();
|
||||
|
||||
|
|
|
@ -26,7 +26,7 @@ public class ExtractYear extends AbstractExtensionFunction {
|
|||
|
||||
@Override
|
||||
public Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException {
|
||||
if (arguments == null || arguments.length == 0) {
|
||||
if (arguments == null | arguments.length == 0) {
|
||||
return new StringValue("");
|
||||
}
|
||||
final Item item = arguments[0].head();
|
||||
|
@ -63,7 +63,8 @@ public class ExtractYear extends AbstractExtensionFunction {
|
|||
for (String format : dateFormats) {
|
||||
try {
|
||||
c.setTime(new SimpleDateFormat(format).parse(s));
|
||||
return String.valueOf(c.get(Calendar.YEAR));
|
||||
String year = String.valueOf(c.get(Calendar.YEAR));
|
||||
return year;
|
||||
} catch (ParseException e) {
|
||||
}
|
||||
}
|
||||
|
|
|
@ -5,8 +5,6 @@ import java.text.ParseException;
|
|||
import java.text.SimpleDateFormat;
|
||||
import java.util.Date;
|
||||
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
|
||||
import net.sf.saxon.expr.XPathContext;
|
||||
import net.sf.saxon.om.Sequence;
|
||||
import net.sf.saxon.trans.XPathException;
|
||||
|
@ -21,8 +19,6 @@ public class NormalizeDate extends AbstractExtensionFunction {
|
|||
|
||||
private static final String normalizeOutFormat = "yyyy-MM-dd'T'hh:mm:ss'Z'";
|
||||
|
||||
public static final String BLANK = "";
|
||||
|
||||
@Override
|
||||
public String getName() {
|
||||
return "normalizeDate";
|
||||
|
@ -30,11 +26,11 @@ public class NormalizeDate extends AbstractExtensionFunction {
|
|||
|
||||
@Override
|
||||
public Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException {
|
||||
if (arguments == null || arguments.length == 0) {
|
||||
return new StringValue(BLANK);
|
||||
if (arguments == null | arguments.length == 0) {
|
||||
return new StringValue("");
|
||||
}
|
||||
String s = arguments[0].head().getStringValue();
|
||||
return new StringValue(_normalizeDate(s));
|
||||
return new StringValue(_year(s));
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -59,8 +55,8 @@ public class NormalizeDate extends AbstractExtensionFunction {
|
|||
return SequenceType.SINGLE_STRING;
|
||||
}
|
||||
|
||||
private String _normalizeDate(String s) {
|
||||
final String date = StringUtils.isNotBlank(s) ? s.trim() : BLANK;
|
||||
private String _year(String s) {
|
||||
final String date = s != null ? s.trim() : "";
|
||||
|
||||
for (String format : normalizeDateFormats) {
|
||||
try {
|
||||
|
@ -70,6 +66,6 @@ public class NormalizeDate extends AbstractExtensionFunction {
|
|||
} catch (ParseException e) {
|
||||
}
|
||||
}
|
||||
return BLANK;
|
||||
return "";
|
||||
}
|
||||
}
|
||||
|
|
|
@ -1,8 +1,6 @@
|
|||
|
||||
package eu.dnetlib.dhp.utils.saxon;
|
||||
|
||||
import static org.apache.commons.lang3.StringUtils.isNotBlank;
|
||||
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
|
||||
import net.sf.saxon.expr.XPathContext;
|
||||
|
@ -28,8 +26,7 @@ public class PickFirst extends AbstractExtensionFunction {
|
|||
final String s1 = getValue(arguments[0]);
|
||||
final String s2 = getValue(arguments[1]);
|
||||
|
||||
final String value = isNotBlank(s1) ? s1 : isNotBlank(s2) ? s2 : "";
|
||||
return new StringValue(value);
|
||||
return new StringValue(StringUtils.isNotBlank(s1) ? s1 : StringUtils.isNotBlank(s2) ? s2 : "");
|
||||
}
|
||||
|
||||
private String getValue(final Sequence arg) throws XPathException {
|
||||
|
|
|
@ -12,9 +12,6 @@ import net.sf.saxon.TransformerFactoryImpl;
|
|||
|
||||
public class SaxonTransformerFactory {
|
||||
|
||||
private SaxonTransformerFactory() {
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates the index record transformer from the given XSLT
|
||||
*
|
||||
|
|
|
@ -0,0 +1,76 @@
|
|||
|
||||
package eu.dnetlib.message;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Map;
|
||||
|
||||
import com.fasterxml.jackson.core.JsonProcessingException;
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
|
||||
public class Message {
|
||||
|
||||
private String workflowId;
|
||||
|
||||
private String jobName;
|
||||
|
||||
private MessageType type;
|
||||
|
||||
private Map<String, String> body;
|
||||
|
||||
public static Message fromJson(final String json) throws IOException {
|
||||
final ObjectMapper jsonMapper = new ObjectMapper();
|
||||
return jsonMapper.readValue(json, Message.class);
|
||||
}
|
||||
|
||||
public Message() {
|
||||
}
|
||||
|
||||
public Message(String workflowId, String jobName, MessageType type, Map<String, String> body) {
|
||||
this.workflowId = workflowId;
|
||||
this.jobName = jobName;
|
||||
this.type = type;
|
||||
this.body = body;
|
||||
}
|
||||
|
||||
public String getWorkflowId() {
|
||||
return workflowId;
|
||||
}
|
||||
|
||||
public void setWorkflowId(String workflowId) {
|
||||
this.workflowId = workflowId;
|
||||
}
|
||||
|
||||
public String getJobName() {
|
||||
return jobName;
|
||||
}
|
||||
|
||||
public void setJobName(String jobName) {
|
||||
this.jobName = jobName;
|
||||
}
|
||||
|
||||
public MessageType getType() {
|
||||
return type;
|
||||
}
|
||||
|
||||
public void setType(MessageType type) {
|
||||
this.type = type;
|
||||
}
|
||||
|
||||
public Map<String, String> getBody() {
|
||||
return body;
|
||||
}
|
||||
|
||||
public void setBody(Map<String, String> body) {
|
||||
this.body = body;
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
final ObjectMapper jsonMapper = new ObjectMapper();
|
||||
try {
|
||||
return jsonMapper.writeValueAsString(this);
|
||||
} catch (JsonProcessingException e) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
}
|
|
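A round-trip sketch for the class above (the workflow id, job name and body are made-up values, and the snippet assumes java.util.Map/HashMap are imported): toString() serialises the message with Jackson and fromJson() parses it back.

Map<String, String> body = new HashMap<>();
body.put("progress", "100");

Message sent = new Message("wf-123", "collection", MessageType.REPORT, body);
String json = sent.toString();              // e.g. {"workflowId":"wf-123","jobName":"collection","type":"REPORT","body":{"progress":"100"}}
Message received = Message.fromJson(json);  // restores the same fields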
@ -0,0 +1,47 @@
|
|||
|
||||
package eu.dnetlib.message;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.util.concurrent.LinkedBlockingQueue;
|
||||
|
||||
import com.rabbitmq.client.AMQP;
|
||||
import com.rabbitmq.client.Channel;
|
||||
import com.rabbitmq.client.DefaultConsumer;
|
||||
import com.rabbitmq.client.Envelope;
|
||||
|
||||
public class MessageConsumer extends DefaultConsumer {
|
||||
|
||||
final LinkedBlockingQueue<Message> queueMessages;
|
||||
|
||||
/**
|
||||
* Constructs a new instance and records its association to the passed-in channel.
|
||||
*
|
||||
* @param channel the channel to which this consumer is attached
|
||||
* @param queueMessages
|
||||
*/
|
||||
public MessageConsumer(Channel channel, LinkedBlockingQueue<Message> queueMessages) {
|
||||
super(channel);
|
||||
this.queueMessages = queueMessages;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void handleDelivery(
|
||||
String consumerTag, Envelope envelope, AMQP.BasicProperties properties, byte[] body)
|
||||
throws IOException {
|
||||
final String json = new String(body, StandardCharsets.UTF_8);
|
||||
Message message = Message.fromJson(json);
|
||||
try {
|
||||
this.queueMessages.put(message);
|
||||
System.out.println("Receiving Message " + message);
|
||||
} catch (InterruptedException e) {
|
||||
if (message.getType() == MessageType.REPORT)
|
||||
throw new RuntimeException("Error on sending message");
|
||||
else {
|
||||
// TODO LOGGING EXCEPTION
|
||||
}
|
||||
} finally {
|
||||
getChannel().basicAck(envelope.getDeliveryTag(), false);
|
||||
}
|
||||
}
|
||||
}
|
|
@ -0,0 +1,136 @@
package eu.dnetlib.message;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeoutException;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class MessageManager {

	private final String messageHost;

	private final String username;

	private final String password;

	private Connection connection;

	private final Map<String, Channel> channels = new HashMap<>();

	private boolean durable;

	private boolean autodelete;

	private final LinkedBlockingQueue<Message> queueMessages;

	public MessageManager(
		String messageHost,
		String username,
		String password,
		final LinkedBlockingQueue<Message> queueMessages) {
		this.queueMessages = queueMessages;
		this.messageHost = messageHost;
		this.username = username;
		this.password = password;
	}

	public MessageManager(
		String messageHost,
		String username,
		String password,
		boolean durable,
		boolean autodelete,
		final LinkedBlockingQueue<Message> queueMessages) {
		this.queueMessages = queueMessages;
		this.messageHost = messageHost;
		this.username = username;
		this.password = password;

		this.durable = durable;
		this.autodelete = autodelete;
	}

	private Connection createConnection() throws IOException, TimeoutException {
		ConnectionFactory factory = new ConnectionFactory();
		factory.setHost(this.messageHost);
		factory.setUsername(this.username);
		factory.setPassword(this.password);
		return factory.newConnection();
	}

	private Channel createChannel(
		final Connection connection,
		final String queueName,
		final boolean durable,
		final boolean autodelete)
		throws Exception {
		Map<String, Object> args = new HashMap<>();
		args.put("x-message-ttl", 10000);
		Channel channel = connection.createChannel();
		// honour the durable/autodelete values passed in, not the instance defaults
		channel.queueDeclare(queueName, durable, false, autodelete, args);
		return channel;
	}

	private Channel getOrCreateChannel(final String queueName, boolean durable, boolean autodelete)
		throws Exception {
		if (channels.containsKey(queueName)) {
			return channels.get(queueName);
		}

		if (this.connection == null) {
			this.connection = createConnection();
		}
		channels.put(queueName, createChannel(this.connection, queueName, durable, autodelete));
		return channels.get(queueName);
	}

	public void close() throws IOException {
		channels
			.values()
			.forEach(
				ch -> {
					try {
						ch.close();
					} catch (Exception e) {
						// TODO LOG
					}
				});

		// the connection is created lazily, so it may never have been opened
		if (this.connection != null) {
			this.connection.close();
		}
	}

	public boolean sendMessage(final Message message, String queueName) throws Exception {
		try {
			Channel channel = getOrCreateChannel(queueName, this.durable, this.autodelete);
			channel.basicPublish("", queueName, null, message.toString().getBytes());
			return true;
		} catch (Throwable e) {
			throw new RuntimeException(e);
		}
	}

	public boolean sendMessage(
		final Message message, String queueName, boolean durable, boolean autodelete)
		throws Exception {
		try {
			Channel channel = getOrCreateChannel(queueName, durable, autodelete);
			channel.basicPublish("", queueName, null, message.toString().getBytes());
			return true;
		} catch (Throwable e) {
			throw new RuntimeException(e);
		}
	}

	public void startConsumingMessage(
		final String queueName, final boolean durable, final boolean autodelete) throws Exception {

		Channel channel = createChannel(createConnection(), queueName, durable, autodelete);
		channel.basicConsume(queueName, false, new MessageConsumer(channel, queueMessages));
	}
}
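For context, here is a minimal usage sketch of the messaging classes introduced above. It is not part of this changeset: the broker host, credentials and queue name are placeholder assumptions, and it requires a running RabbitMQ instance. It publishes an ONGOING message and waits for the MessageConsumer to hand it back through the shared blocking queue.

package eu.dnetlib.message;

// Hypothetical example class: demonstrates sendMessage() and startConsumingMessage() end to end.
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.LinkedBlockingQueue;

public class MessageManagerExample {

	public static void main(String[] args) throws Exception {
		// assumed broker: localhost with default guest/guest credentials
		final LinkedBlockingQueue<Message> inbox = new LinkedBlockingQueue<>();
		final MessageManager manager = new MessageManager("localhost", "guest", "guest", false, false, inbox);

		// declare the queue and attach a MessageConsumer that fills 'inbox'
		manager.startConsumingMessage("dhp.status", false, false);

		// publish an ONGOING message to the same queue
		final Map<String, String> body = new HashMap<>();
		body.put("percentage", "10");
		manager.sendMessage(new Message("wf-123", "collection", MessageType.ONGOING, body), "dhp.status");

		// block until the consumer hands the message over
		final Message received = inbox.take();
		System.out.println("got: " + received);

		// note: close() only releases the publisher-side connection; the connection
		// opened by startConsumingMessage() is not tracked by the manager
		manager.close();
	}
}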
@ -0,0 +1,6 @@
package eu.dnetlib.message;

public enum MessageType {
	ONGOING, REPORT
}
@ -1,101 +0,0 @@
package eu.dnetlib.pace.common;

import java.nio.charset.StandardCharsets;
import java.text.Normalizer;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;

import com.google.common.base.Splitter;
import com.google.common.collect.Iterables;
import com.google.common.collect.Sets;
import com.ibm.icu.text.Transliterator;

/**
 * Set of common functions for the framework
 *
 * @author claudio
 */
public class PaceCommonUtils {

	// transliterator
	protected static Transliterator transliterator = Transliterator.getInstance("Any-Eng");

	protected static final String aliases_from = "⁰¹²³⁴⁵⁶⁷⁸⁹⁺⁻⁼⁽⁾ⁿ₀₁₂₃₄₅₆₇₈₉₊₋₌₍₎àáâäæãåāèéêëēėęəîïíīįìôöòóœøōõûüùúūßśšłžźżçćčñń";
	protected static final String aliases_to = "0123456789+-=()n0123456789+-=()aaaaaaaaeeeeeeeeiiiiiioooooooouuuuussslzzzcccnn";

	protected static Pattern hexUnicodePattern = Pattern.compile("\\\\u(\\p{XDigit}{4})");

	protected static String fixAliases(final String s) {
		final StringBuilder sb = new StringBuilder();

		s.chars().forEach(ch -> {
			final int i = StringUtils.indexOf(aliases_from, ch);
			sb.append(i >= 0 ? aliases_to.charAt(i) : (char) ch);
		});

		return sb.toString();
	}

	protected static String transliterate(final String s) {
		try {
			return transliterator.transliterate(s);
		} catch (Exception e) {
			return s;
		}
	}

	public static String normalize(final String s) {
		return fixAliases(transliterate(nfd(unicodeNormalization(s))))
			.toLowerCase()
			// do not compact the regexes in a single expression, would cause StackOverflowError in case of large input strings
			.replaceAll("[^ \\w]+", "")
			.replaceAll("(\\p{InCombiningDiacriticalMarks})+", "")
			.replaceAll("(\\p{Punct})+", " ")
			.replaceAll("(\\d)+", " ")
			.replaceAll("(\\n)+", " ")
			.trim();
	}

	public static String nfd(final String s) {
		return Normalizer.normalize(s, Normalizer.Form.NFD);
	}

	public static String unicodeNormalization(final String s) {

		Matcher m = hexUnicodePattern.matcher(s);
		StringBuffer buf = new StringBuffer(s.length());
		while (m.find()) {
			String ch = String.valueOf((char) Integer.parseInt(m.group(1), 16));
			m.appendReplacement(buf, Matcher.quoteReplacement(ch));
		}
		m.appendTail(buf);
		return buf.toString();
	}

	public static Set<String> loadFromClasspath(final String classpath) {

		Transliterator transliterator = Transliterator.getInstance("Any-Eng");

		final Set<String> h = Sets.newHashSet();
		try {
			for (final String s : IOUtils
				.readLines(PaceCommonUtils.class.getResourceAsStream(classpath), StandardCharsets.UTF_8)) {
				h.add(fixAliases(transliterator.transliterate(s))); // transliteration of the stopwords
			}
		} catch (final Throwable e) {
			return Sets.newHashSet();
		}
		return h;
	}

	protected static Iterable<String> tokens(final String s, final int maxTokens) {
		return Iterables.limit(Splitter.on(" ").omitEmptyStrings().trimResults().split(s), maxTokens);
	}

}
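For reference, the regex stage of the removed normalize() pipeline can be reproduced with the JDK alone. The sketch below is not part of this changeset; it omits the ICU transliteration and alias-fixing steps and only mirrors the chained replaceAll calls on a made-up sample string.

package eu.dnetlib.pace.common;

// Hypothetical, JDK-only illustration of the regex pipeline used by normalize().
import java.text.Normalizer;

public class NormalizeSketch {

	public static void main(String[] args) {
		final String s = "Café: Data & Knowledge (2021)";

		final String normalized = Normalizer.normalize(s, Normalizer.Form.NFD)
			.toLowerCase()
			.replaceAll("[^ \\w]+", "")
			.replaceAll("(\\p{InCombiningDiacriticalMarks})+", "")
			.replaceAll("(\\p{Punct})+", " ")
			.replaceAll("(\\d)+", " ")
			.replaceAll("(\\n)+", " ")
			.trim();

		// prints something like: "cafe data  knowledge"
		// (accents, punctuation and digits are dropped; repeated spaces are not collapsed)
		System.out.println(normalized);
	}
}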