Compare commits

...

1107 Commits

Author SHA1 Message Date
Miriam Baglioni 32b0c27217 Update 'dhp-workflows/dhp-enrichment/src/main/java/eu/dnetlib/dhp/resulttoorganizationfrominstrepo/PrepareResultInstRepoAssociation.java'
fix in SQL query: the blacklist constraint used d.id to refer to the datasource id, but no alias for the datasource was defined, so I removed the alias
2021-06-09 18:36:11 +02:00
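A minimal sketch of the kind of fix described above; the table and column names here are hypothetical, not the actual query from the workflow.

```java
// Illustrative only: a WHERE clause referencing an alias that was never defined.
public class BlacklistQuery {
	// Before: "d.id" is used although the datasource table is not aliased as "d".
	static final String BROKEN =
		"SELECT * FROM datasource WHERE d.id NOT IN (SELECT id FROM blacklist)";

	// After: drop the alias and reference the table by its own name.
	static final String FIXED =
		"SELECT * FROM datasource WHERE datasource.id NOT IN (SELECT id FROM blacklist)";
}
```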
Miriam Baglioni abd88f663d changed test resource to mirror change in the input file 2021-05-21 15:20:47 +02:00
Miriam Baglioni c844877de2 changed workflow flow to possibly parallelize also the programme and project preparation steps 2021-05-21 14:41:57 +02:00
Miriam Baglioni 073d76864d refactoring 2021-05-21 14:41:03 +02:00
Miriam Baglioni 4c8b4a774c removed unneeded code 2021-05-21 14:40:07 +02:00
Miriam Baglioni 53b9d87fec new prepareProgramme according to the new file 2021-05-21 11:49:31 +02:00
Miriam Baglioni 1ee8f13580 refactoring and added "left" as join type to be 100% sure to get the whole set of projects 2021-05-21 11:49:05 +02:00
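A hedged sketch of the "left" join mentioned above, with hypothetical Dataset and column names: a left join keeps every project even when no match is found on the other side, so the whole set of projects is preserved.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

public class LeftJoinSketch {
	static Dataset<Row> joinProjects(Dataset<Row> projects, Dataset<Row> programmes) {
		return projects.join(
			programmes,
			projects.col("programmeId").equalTo(programmes.col("id")),
			"left"); // "left" instead of the default "inner": unmatched projects are kept
	}
}
```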
Miriam Baglioni e07c3ba089 due to change in the input file the filtering step is no more needed 2021-05-21 11:47:43 +02:00
Miriam Baglioni 54f6e2f693 changed to get the needed information to build the action set as parallel jobs 2021-05-21 11:47:00 +02:00
Miriam Baglioni 7180505519 removed unneeded variable 2021-05-21 11:46:13 +02:00
Miriam Baglioni 2eb1a8b344 changed because the input file changed 2021-05-21 11:40:20 +02:00
Miriam Baglioni 9610224671 added param to workflow property 2021-05-20 18:21:12 +02:00
Miriam Baglioni aa45b4df9b - 2021-05-20 15:57:40 +02:00
Miriam Baglioni 052c837843 - 2021-05-20 15:54:44 +02:00
Claudio Atzori b572f56763 Merge branch 'master' into master 2021-05-20 15:22:35 +02:00
Claudio Atzori 2578b7fbb3 code formatting 2021-05-20 14:59:02 +02:00
Miriam Baglioni dc0ad8d2e0 fixed issue related to a change in the downloaded file name. Added sheet name as parameter and also a check if the name should change 2021-05-20 14:53:53 +02:00
Claudio Atzori aef2977ad0 fixes #6701: xpath for titles to support both datacite and Guidelines v4 mapping 2021-05-20 14:40:22 +02:00
Miriam Baglioni 02b80cf24f resolved conflicts 2021-05-20 10:59:39 +02:00
Claudio Atzori 239d0f0a9a ROR actionset import workflow backported from branch stable_ids 2021-05-18 16:12:11 +02:00
Claudio Atzori eeb8bcf075 using constants from ModelConstants 2021-05-18 11:10:07 +02:00
Sandro La Bruzzo d9a0bbda7b implemented new phase in doiboost to make the dataset Distinct by ID 2021-05-13 12:25:14 +02:00
Claudio Atzori da9d6f3887 mapping datasource.journal only when an issn is available, null otherwise 2021-05-11 10:45:30 +02:00
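An illustrative sketch of the mapping rule above, with hypothetical ISSN fields: a journal value is produced only if at least one ISSN is present, null otherwise.

```java
import org.apache.commons.lang3.StringUtils;

public class JournalMapping {
	// returns a [name, issnPrinted, issnOnline] triple, or null when no ISSN exists
	static String[] journalOrNull(String issnPrinted, String issnOnline, String name) {
		if (StringUtils.isNotBlank(issnPrinted) || StringUtils.isNotBlank(issnOnline)) {
			return new String[] { name, issnPrinted, issnOnline };
		}
		return null; // no ISSN -> no journal element
	}
}
```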
Sandro La Bruzzo 54217d73ff removed old parameters from oozie workflow 2021-05-11 09:59:02 +02:00
Claudio Atzori 3925eb6a79 MDStoreManager model classes moved in dhp-schemas 2021-05-10 13:58:23 +02:00
Claudio Atzori 25254885b9 [ActionManagement] reduced number of xqueries used to access ActionSet info 2021-05-07 17:32:03 +02:00
Sandro La Bruzzo 7dc824fc23 imported changes in stable_id into master 2021-05-07 12:53:50 +02:00
Claudio Atzori 8c96a82a03 fixed mapping applied to ODF records. Added unit test to verify the mapping for OpenTrials 2021-05-05 15:30:06 +02:00
Claudio Atzori 50fc128ff7 alternative way to set timeouts for the ISLookup client 2021-05-05 11:24:44 +02:00
Sandro La Bruzzo 1adfc41d23 merged manually changes on stable_id for doiboost into master 2021-05-05 10:23:32 +02:00
Alessia Bardi a801999e75 fixed query for organisations' pids 2021-04-29 12:18:42 +02:00
Alessia Bardi e6075bb917 updated json schema for results - added instances and accessright definition 2021-04-27 15:15:08 +02:00
Claudio Atzori dd2e0a81f4 added dnet45-bootstrap-snapshot and dnet45-bootstrap-release repositories 2021-04-27 12:08:43 +02:00
Claudio Atzori 7ed107be53 depending on external dhp-schemas module 2021-04-23 17:52:36 +02:00
Claudio Atzori 99cfb027fa making ODF record parsing namespace unaware (#6629) 2021-04-23 17:09:36 +02:00
Miriam Baglioni 72e5aa3b42 refactoring 2021-04-23 12:10:30 +02:00
Miriam Baglioni 4ae6fba01d refactoring 2021-04-23 12:09:19 +02:00
Miriam Baglioni 7d1b8b7f64 merge upstream 2021-04-23 11:55:49 +02:00
Claudio Atzori 906d50563c Merge pull request 'properly invalidating impala metadata' (#105) from antonis.lempesis/dnet-hadoop:master into master
Reviewed-on: D-Net/dnet-hadoop#105
2021-04-15 15:06:22 +02:00
Antonis Lempesis 03d36fadea properly invalidating impala metadata 2021-04-15 13:34:22 +03:00
miconis dcff9cecdf bug fix: ids in self mergerels are not marked deletedbyinference=true 2021-04-12 15:55:27 +02:00
Miriam Baglioni 70e391d427 merge upstream 2021-04-07 10:38:08 +02:00
Claudio Atzori 37b65cc3ad Merge pull request 'updates on stats-update workflow' (#100) from antonis.lempesis/dnet-hadoop:master into master
The workflow integrated in the _stable_ids_ branch has been run correctly on the BETA content, thus IMO this PR can be integrated in the master branch.

Reviewed-on: D-Net/dnet-hadoop#100
2021-04-02 16:13:35 +02:00
Miriam Baglioni 4b6e514f02 merge upstream 2021-03-30 10:27:12 +02:00
Antonis Lempesis 0ba0a6b9da update promote wf to support monitor&production 2021-03-12 16:42:59 +02:00
Antonis Lempesis 60ebdf2dbe update promote wf to support monitor&production 2021-03-12 16:34:53 +02:00
Antonis Lempesis 236435b470 following redirects 2021-03-12 14:11:21 +02:00
Antonis Lempesis 3c75a05044 fixed a ton of typos 2021-03-12 13:47:04 +02:00
Claudio Atzori 19f3580b3d introduced java8-based date parsing 2021-03-11 16:46:23 +01:00
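A small illustration of java.time (Java 8) based date parsing; the pattern list is an assumption for the sketch, not the set used in the codebase.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

public class DateParsing {
	private static final DateTimeFormatter[] FORMATS = {
		DateTimeFormatter.ISO_LOCAL_DATE,          // 2021-03-11
		DateTimeFormatter.ofPattern("dd/MM/yyyy"), // 11/03/2021
		DateTimeFormatter.ofPattern("yyyyMMdd")    // 20210311
	};

	public static LocalDate parse(String s) {
		for (DateTimeFormatter f : FORMATS) {
			try {
				return LocalDate.parse(s.trim(), f);
			} catch (DateTimeParseException e) {
				// try the next pattern
			}
		}
		return null; // unparsable
	}
}
```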
Antonis Lempesis fa1ec5b5e9 fixed typo... 2021-03-10 14:05:58 +02:00
Antonis Lempesis f40c150a0d fixed steps... 2021-03-06 00:35:57 +02:00
Antonis Lempesis 6147ee4950 correctly assigning hive contexts to concepts 2021-03-05 14:12:18 +02:00
Antonis Lempesis c5fbad8093 Contexts are now downloaded instead of using the stats_ext db 2021-03-04 00:42:21 +02:00
Claudio Atzori e8789b0cdb Merge pull request 'stats DB for monitor' (#99) from antonis.lempesis/dnet-hadoop:master into master
Looks good to me, just a note on the parsing of the citations: since the last version, IIS produces citations as proper relationships among results. This is what we got already in the BETA graph

```
count		r.reltype	r.subreltype	r.relclass
62.129.254	resultResult	citation	cites
62.043.309	resultResult	citation	isCitedBy
```

Thus, I suggest to move away from the current property based implementation for the extraction of the citation links and start relying on the relationships instead.
2021-03-03 10:29:09 +01:00
Antonis Lempesis 27796343ca crude sleep. hardcoded value 2021-03-03 01:37:47 +02:00
Miriam Baglioni 896919e735 merge upstream 2021-02-23 10:45:29 +01:00
Antonis Lempesis d90767c733 correctly invalidating metadata 2021-02-19 03:18:47 +02:00
Antonis Lempesis 3681afbe04 typo 2021-02-19 03:04:27 +02:00
Antonis Lempesis c5502eba8f actually moved stats computation in impala instead of hive... 2021-02-19 02:54:39 +02:00
Antonis Lempesis 33c85d4e66 moved stats computation in impala instead of hive 2021-02-18 17:23:34 +02:00
Antonis Lempesis b8e96c8ae7 moved cache update to the end 2021-02-18 16:42:22 +02:00
Antonis Lempesis bcbfc052b1 fixed last errors in step 21 2021-02-18 16:32:54 +02:00
Antonis Lempesis 10a29a4b9a fixes in monitor step 2021-02-18 15:05:59 +02:00
Antonis Lempesis 8ef66452d5 fixed typo 2021-02-17 22:24:44 +02:00
Antonis Lempesis a8836e2f5f fixed typo 2021-02-17 19:27:07 +02:00
Antonis Lempesis a445c1ac3d fixed variable names in monitor script 2021-02-17 16:45:09 +02:00
Antonis Lempesis 00d516360f added missing ; 2021-02-17 16:41:10 +02:00
Antonis Lempesis cd1b794409 added the monitor db wf 2021-02-17 02:11:55 +02:00
Alessia Bardi bf2830b981 Merge pull request 'manage merging of Relation validation attributes' (#95) from merge_rel_validation into master 2021-02-16 12:14:27 +01:00
Claudio Atzori 6f9864c564 manage merging of Relation validation attributes 2021-02-16 11:53:44 +01:00
Alessia Bardi 32e81c2d89 non validated rel has null value in validated field 2021-02-16 11:01:42 +01:00
Antonis Lempesis 1c029b9fc0 fixed formatting 2021-02-14 03:14:24 +02:00
Antonis Lempesis 2c4dcc90ba analyzing tables to produce stats 2021-02-14 02:54:55 +02:00
Michele Artini 83d815d0bc only stats 2021-02-11 10:57:23 +01:00
Michele Artini 8c836bf930 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2021-02-11 10:54:41 +01:00
Michele Artini 8c1600398a added resumeFrom parameter 2021-02-11 10:54:16 +01:00
Claudio Atzori 3f8f78cbfb Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2021-02-11 09:36:10 +01:00
Claudio Atzori b34b5a39ca index field authoridtypevalue mixes up different author id-type value pairs, dropped in favour of orcidtypevalue 2021-02-11 09:36:04 +01:00
Michele Artini 7249cceb53 switch of 2 nodes 2021-02-11 09:27:08 +01:00
Claudio Atzori 73393d3c4d Merge pull request 'validatedLinksToProjects' (#93) from validatedLinksToProjects into master
LGTM
2021-02-10 12:32:35 +01:00
Alessia Bardi 986dd969d3 use the proper import for Lists 2021-02-10 12:03:54 +01:00
Alessia Bardi c4d1feca74 mapper test with validated link to project 2021-02-10 11:22:54 +01:00
Alessia Bardi 09fc7e2f78 serialization of validated flag on relationships 2021-02-10 11:22:09 +01:00
Claudio Atzori bc458d1b54 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2021-02-09 16:27:30 +01:00
Claudio Atzori 82e6c50f3f updated solr fields (authoridtypevalue, resultsubject, resultresourcetypename) 2021-02-09 16:27:04 +01:00
Claudio Atzori 62bd3c53ee Merge branch 'master' into provision_indexing 2021-02-09 15:46:26 +01:00
Miriam Baglioni 2f5e6647c6 merge upstream 2021-02-08 10:33:11 +01:00
Alessia Bardi c67329d3ad updated test for EU Open Data portal datasets 2021-02-03 17:06:48 +01:00
Alessia Bardi fd705404a1 tests for EU Open Data portal dataset mapping 2021-02-03 10:28:17 +01:00
Miriam Baglioni 6190465851 merge upstream 2021-02-03 10:27:27 +01:00
Claudio Atzori f1a852f278 align usage-stats workflow poms with latest snapshot version 2021-01-26 15:42:42 +01:00
Claudio Atzori 9c32119dc2 Merge pull request 'usage-stats-export-wf-v2' (#89) from dimitris.pierrakos/dnet-hadoop:usage-stats-export-wf-v2 into master
Thank you Dimitris!
2021-01-26 15:01:41 +01:00
Claudio Atzori 885e0dd926 [Cleaning] filter authors not providing word characters in the fullname 2021-01-26 09:48:53 +01:00
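A sketch of the cleaning rule above, assuming fullnames are plain strings: keep an author only if the fullname contains at least one word character (letter, digit or underscore).

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class AuthorCleaning {
	private static final Pattern WORD = Pattern.compile("\\w");

	// drops null fullnames and fullnames made only of punctuation/whitespace
	static List<String> cleanFullnames(List<String> fullnames) {
		return fullnames.stream()
			.filter(f -> f != null && WORD.matcher(f).find())
			.collect(Collectors.toList());
	}
}
```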
Claudio Atzori 2890511613 [Cleaning] normalise missing Result.country 2021-01-26 09:41:44 +01:00
Claudio Atzori 4eb9ed35b1 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2021-01-25 18:12:24 +01:00
Claudio Atzori cd379eb5e3 [Cleaning] trying to avoid NPEs, this time by ruling out authors without a defined fullname 2021-01-25 18:11:49 +01:00
Alessia Bardi 505477f36f format code 2021-01-25 18:02:49 +01:00
Alessia Bardi ded6ed8d7d no ',' author if there are no authors in ODF records 2021-01-25 17:57:51 +01:00
Claudio Atzori 3465c8ccee [Cleaning] trying to avoid NPEs 2021-01-25 16:54:53 +01:00
Claudio Atzori 07a0ccfc96 [Cleaning] trying to avoid NPEs 2021-01-25 13:36:01 +01:00
Claudio Atzori 646dab7f68 trying to avoid NPEs 2021-01-22 18:24:34 +01:00
Claudio Atzori 34d653de41 [Cleaning] updated cleaning rule for DOIs 2021-01-22 14:16:33 +01:00
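An illustrative DOI normalisation in the spirit of the updated rule above; the regex (the canonical 10.&lt;registrant&gt;/&lt;suffix&gt; shape) is an assumption, not the exact rule adopted by the workflow.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DoiCleaning {
	private static final Pattern DOI = Pattern.compile("10\\.\\d{4,9}/[-._;()/:a-zA-Z0-9]+");

	// strips prefixes such as "https://doi.org/" by extracting the DOI core
	static String clean(String value) {
		if (value == null) return null;
		Matcher m = DOI.matcher(value.toLowerCase().trim());
		return m.find() ? m.group() : null;
	}
}
```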
Miriam Baglioni fe36895c53 added datasource blacklist for the organization to result propagation through institutional repositories 2021-01-22 11:55:10 +01:00
Dimitris 3e8d2a6b2d Clean workflows 2021-01-15 16:19:12 +02:00
Michele Artini f667e94a31 Merge pull request 'broker' (#88) from broker into master 2021-01-14 14:48:13 +01:00
Michele Artini cfbcdc95bc fixed a wf param 2021-01-14 14:45:23 +01:00
Michele Artini 69ba3203c0 fixed a conflict 2021-01-14 14:43:25 +01:00
Michele Artini fafb5b2e08 Merge branch 'broker' of code-repo.d4science.org:D-Net/dnet-hadoop into broker 2021-01-14 14:32:42 +01:00
Michele Artini b230d44411 fixed conflict 2021-01-14 14:32:31 +01:00
Michele Artini b9d90e95b8 Added eventId to ShortEventMessage 2021-01-14 14:32:31 +01:00
Michele Artini 64b0b0bfb3 fixed a bug with invalid subject topic 2021-01-14 14:32:31 +01:00
Michele Artini e3e0ab1de1 fixed a problem with join 2021-01-14 14:32:31 +01:00
Michele Artini 26a941315a openaireId 2021-01-14 14:32:31 +01:00
Michele Artini 6f4d1a37f0 ES wf properties 2021-01-14 14:32:31 +01:00
Michele Artini 1391341d06 mkdir of output dir 2021-01-14 14:32:31 +01:00
Michele Artini 3c9cbd19f3 whitelist of topics 2021-01-14 14:32:31 +01:00
Michele Artini 467aa77279 workingDir and outputDir 2021-01-14 14:32:31 +01:00
Michele Artini 10f3f7eca7 workingDir and outputDir 2021-01-14 14:32:31 +01:00
Michele Artini ff41a7b3a4 gzipped output 2021-01-14 14:32:31 +01:00
Michele Artini 223fa660cb fixed conflict 2021-01-14 14:23:44 +01:00
Michele Artini ac91e495fc Added eventId to ShortEventMessage 2021-01-14 13:20:35 +01:00
Claudio Atzori 80cf55ef2e [Broker] fixed partitionEventsByOpendoarIds workflow parameter names 2021-01-13 16:24:30 +01:00
Claudio Atzori 41500669e2 [BIP! Scores integration] merged missing classes from bipFinder branch 2021-01-11 14:39:47 +01:00
Claudio Atzori 2a7a10809e [BIP! Scores integration] merged missing classes from bipFinder branch 2021-01-11 10:05:02 +01:00
Claudio Atzori 5bd999efe7 Merge pull request 'bipFinder_master_test' (#84) from bipFinder_master_test into master 2021-01-08 18:16:34 +01:00
Claudio Atzori d6686dd7cf merged from master 2021-01-08 18:16:12 +01:00
Claudio Atzori 34229970e6 [BIP! Scores integration] Create updates as Result rather than subclasses; Result considers also metrics in the mergeFrom operation 2021-01-08 16:29:17 +01:00
Claudio Atzori 1361c9eb0c [BIP! Scores integration] Create updates as Result rather than subclasses; Result considers also metrics in the mergeFrom operation 2021-01-07 10:07:30 +01:00
Claudio Atzori ab2fe9266a [DOIBoost] minor fixes in workflow definition 2021-01-05 10:26:39 +01:00
Claudio Atzori 7c722f3fdc [DOIBoost] fixed typo 2021-01-05 10:25:54 +01:00
Claudio Atzori 8879704ba0 [DOIBoost] configurable ES server url and index name in crossref importer 2021-01-05 10:00:13 +01:00
Claudio Atzori 26e9d55c13 code formatting 2021-01-05 09:59:26 +01:00
Sandro La Bruzzo 7834a35768 avoid saving intermediate dataset before generation of Sequence file 2021-01-04 17:54:57 +01:00
Sandro La Bruzzo e79445a8b4 minor fix for Claudio's complaint 2021-01-04 17:39:25 +01:00
Sandro La Bruzzo 8765020b85 minor fix 2021-01-04 17:37:08 +01:00
Sandro La Bruzzo b0dc92786f defined a single oozie workflow for the generation of doiboost 2021-01-04 17:01:35 +01:00
Claudio Atzori 7185158942 ignore missing properties 2020-12-29 11:06:28 +01:00
Claudio Atzori 28460c2cd1 using com.fasterxml.jackson.databind.ObjectMapper instead of org.codehaus.jackson.map.ObjectMapper 2020-12-23 16:59:52 +01:00
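The gist of the swap above: same role, different package, with com.fasterxml being the maintained Jackson 2.x line. A minimal usage sketch:

```java
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonRoundTrip {
	// com.fasterxml.jackson.databind.ObjectMapper replaces the legacy
	// org.codehaus.jackson.map.ObjectMapper in the same serialization role
	static String toJson(Object o) throws JsonProcessingException {
		return new ObjectMapper().writeValueAsString(o);
	}
}
```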
Claudio Atzori 60649ac7d2 swapped expected and actual in tests, updated expected number of authors 2020-12-23 12:26:04 +01:00
Claudio Atzori 723b01f9e9 trivial: the fewer magic numbers and values around, the better 2020-12-23 12:22:48 +01:00
Claudio Atzori 6848d0c3d7 trivial: avoid duplicated code 2020-12-23 12:21:58 +01:00
Claudio Atzori d8b5f43a7e code formatting 2020-12-22 14:59:03 +01:00
Claudio Atzori 7bfc35df5e Merge pull request 'Changed typo in script names' (#82) from antonis.lempesis/dnet-hadoop:master into master
no need to! :)
2020-12-22 12:36:21 +01:00
Antonis Lempesis be5969a8c2 Changed typo in script names 2020-12-22 13:33:32 +02:00
miconis 794e22b09c bug fix in the authormerge: author lists with larger size now have priority; normalization of author names fixed 2020-12-21 17:51:42 +01:00
Claudio Atzori 6cb0dc3f43 extended ORCID cleaning procedure 2020-12-21 11:40:17 +01:00
Claudio Atzori 573a8a3272 Merge pull request 'Changed typo in script names' (#81) from antonis.lempesis/dnet-hadoop:master into master
ok! LGTM
2020-12-18 17:44:26 +01:00
Antonis Lempesis 2a074c3b2b Changed typo in script names 2020-12-18 18:40:48 +02:00
Claudio Atzori 47270d9af5 lenient mock can be lenient 2020-12-18 15:38:59 +01:00
Claudio Atzori 2e503ee101 code formatting 2020-12-17 13:47:38 +01:00
Claudio Atzori 5a3e2199b2 Merge pull request 'Creation of the action set to include the bipFinder! score' (#80) from miriam.baglioni/dnet-hadoop:bipFinder into bipFinder_master_test 2020-12-17 12:26:38 +01:00
Claudio Atzori 03319d3bd9 Revert "Merge pull request 'Creation of the action set to include the bipFinder! score' (#62) from miriam.baglioni/dnet-hadoop:bipFinder into master"
This reverts commit add7e1693b, reversing
changes made to f9a8fd8bbd.
2020-12-17 12:23:58 +01:00
Claudio Atzori add7e1693b Merge pull request 'Creation of the action set to include the bipFinder! score' (#62) from miriam.baglioni/dnet-hadoop:bipFinder into master 2020-12-17 12:09:03 +01:00
Alessia Bardi f9a8fd8bbd updated test record for textgrid 2020-12-17 11:59:45 +01:00
Claudio Atzori 4766495f5b [orcid_to_result_from_semrel_propagation] fixed typo in SQL 2020-12-17 09:15:50 +01:00
Claudio Atzori de00094ebc Merge pull request 'FIX on the creation of subject based broker enrichments' (#79) from broker into master 2020-12-15 14:58:31 +01:00
Michele Artini f9dc1e45fd fixed a bug with invalid subject topic 2020-12-15 14:54:11 +01:00
Sandro La Bruzzo f92bd56f56 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-12-15 11:47:29 +01:00
Sandro La Bruzzo 1f6c8a9e83 added orcid_pending type to records coming from Crossref 2020-12-15 11:47:15 +01:00
Claudio Atzori 9f1181290e Merge pull request 'broker' (#78) from broker into master
The changes look good to me.
2020-12-15 10:03:45 +01:00
Claudio Atzori 6299f75807 Merge pull request 'validation in claim rels' (#77) from claims_validation into master
LGTM
2020-12-15 09:28:24 +01:00
Michele Artini 0a0f62bd01 Merge branch 'master' into broker 2020-12-15 08:30:52 +01:00
Michele Artini 12fa5d122a fixed a problem with join 2020-12-15 08:30:26 +01:00
Michele Artini 991e675dc6 validation in claim rels 2020-12-14 15:41:25 +01:00
Michele Artini 3e19cf7b4a openaireId 2020-12-14 15:24:33 +01:00
Claudio Atzori b6f08ce226 re-adding the old junit:junit dep as solr-test-framework needs it 2020-12-14 15:07:31 +01:00
Claudio Atzori e8ef8c63d4 delegate merging of OafEntity.dataInfo to the implementation of subclasses 2020-12-14 15:04:44 +01:00
Claudio Atzori 7d325e2c57 using actual result subclasses instead of their parent class 2020-12-14 14:40:54 +01:00
Claudio Atzori 152916890f renamed test name 2020-12-14 14:40:05 +01:00
Michele Artini a203aee32a ES wf properties 2020-12-14 12:02:33 +01:00
Claudio Atzori 1506f49052 Xml record serialization for author PIDs: 1) only one value per PID type is allowed; 2) orcid prevails over orcid_pending 2020-12-14 11:14:03 +01:00
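A hedged sketch of the two serialization rules above, representing each PID as a hypothetical [type, value] pair: one value per PID type, and orcid wins over orcid_pending when both are present.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class AuthorPidSelection {
	static Map<String, String> select(List<String[]> pids) { // each entry: {type, value}
		Map<String, String> byType = new LinkedHashMap<>();
		for (String[] p : pids) {
			byType.putIfAbsent(p[0], p[1]); // rule 1: only one value per PID type
		}
		if (byType.containsKey("orcid")) {
			byType.remove("orcid_pending"); // rule 2: orcid prevails over orcid_pending
		}
		return byType;
	}
}
```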
Michele Artini d03756c962 mkdir of output dir 2020-12-14 11:11:41 +01:00
Michele Artini 399548f221 whitelist of topics 2020-12-14 11:03:55 +01:00
Michele Artini 38da1c282a Merge branch 'master' into broker 2020-12-14 09:14:02 +01:00
Dimitris dc9c2f3272 Commit 12122020 2020-12-12 12:00:14 +02:00
Claudio Atzori 61cd129ded XML serialisation test 2020-12-11 12:44:53 +01:00
Claudio Atzori ce7a319e01 using the correct assertion import 2020-12-11 12:44:17 +01:00
Claudio Atzori 7fe2433137 excluded transitive older junit dependencies, they can compromise the unit test executions 2020-12-11 12:42:55 +01:00
Michele Artini 933b4c1ada workingDir and outputDir 2020-12-10 14:47:51 +01:00
Michele Artini 2e7df07328 workingDir and outputDir 2020-12-10 14:47:22 +01:00
Michele Artini 94bfed1c84 gzipped output 2020-12-10 11:59:28 +01:00
Miriam Baglioni b7adbc7c3e merge branch with master 2020-12-10 10:35:27 +01:00
Alessia Bardi 112da6d76a in theory, just auto-formatting after mvn compile 2020-12-09 20:00:27 +01:00
Alessia Bardi bece04b330 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-12-09 19:54:43 +01:00
Alessia Bardi 426b76ee8e more asserts for TextGrid record 2020-12-09 19:46:11 +01:00
Claudio Atzori ff72fcd91a allow orcid_pending to percolate to the XML graph serialization 2020-12-09 19:04:50 +01:00
Claudio Atzori 4705144918 Merge pull request 'rel_project_validation' (#69) from rel_project_validation into master
LGTM
2020-12-09 19:01:20 +01:00
Claudio Atzori 211aa04726 allow orcid_pending to percolate to the XML graph serialization 2020-12-09 18:08:51 +01:00
Claudio Atzori db4e400a0b introduced Oaf.mergeFrom method to allow merging of dataInfo(s), with prevalence based on datainfo.trust 2020-12-09 18:01:45 +01:00
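A minimal sketch of trust-based prevalence with hypothetical accessors, not the actual Oaf.mergeFrom signature: the dataInfo with the higher trust wins, and the entity keeps its own on ties or unparsable trust.

```java
public class TrustMerge {
	static String mergeByTrust(String ownTrust, String otherTrust,
	                           String ownInfo, String otherInfo) {
		// keep the incoming dataInfo only when it is strictly more trusted
		return parse(otherTrust) > parse(ownTrust) ? otherInfo : ownInfo;
	}

	private static double parse(String trust) {
		try {
			return Double.parseDouble(trust);
		} catch (NullPointerException | NumberFormatException e) {
			return 0.0; // missing or malformed trust loses
		}
	}
}
```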
Claudio Atzori ada21ad920 Merge pull request 'dump of the results related to at least one project' (#61) from miriam.baglioni/dnet-hadoop:dump into master
LGTM
2020-12-09 17:22:56 +01:00
Miriam Baglioni 6fbc67a959 using ModelConstant.ORCID and removing not used constants 2020-12-09 17:10:20 +01:00
Miriam Baglioni 212b52614f added mapper from graph result to community result without context and project, in common, to be used for the doiboost mapping 2020-12-09 16:59:02 +01:00
Michele Artini 1bc9adc10d default trust for validated rels 2020-12-09 16:18:37 +01:00
Michele Artini 5f21a356fd reindent 2020-12-09 11:24:30 +01:00
Michele Artini 370a5e650b validation attributes in resultProject relations 2020-12-09 11:18:26 +01:00
Michele Artini 75bf708351 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-12-09 10:31:33 +01:00
Michele Artini 620e1307a3 indentation 2020-12-09 10:30:47 +01:00
Claudio Atzori 27e96767e0 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-12-07 21:53:22 +01:00
Claudio Atzori fba11eef2a cleanup 2020-12-07 21:53:13 +01:00
Claudio Atzori 2fcc24b36e code formatting 2020-12-07 21:52:32 +01:00
Claudio Atzori 197f286fa4 removed duplicated dependency (org.apache.httpcomponents:httpclient) 2020-12-07 21:52:17 +01:00
Sandro La Bruzzo 7f8b93de72 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-12-07 19:59:39 +01:00
Sandro La Bruzzo 302baab67b fixed doiboost mapping and workflows 2020-12-07 19:59:33 +01:00
Michele Artini d6934f370e Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-12-07 14:56:23 +01:00
Michele Artini 5de8a7276f wf to partition opendoar events 2020-12-07 14:56:06 +01:00
Claudio Atzori 5e8509bef7 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-12-07 13:50:08 +01:00
Claudio Atzori 026ad40633 disabled test 2020-12-07 13:50:01 +01:00
Sandro La Bruzzo 620e585b63 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-12-07 10:42:53 +01:00
Sandro La Bruzzo b31dd126fb fixed crossref workflow; added common ORCID class 2020-12-07 10:42:38 +01:00
Claudio Atzori a104a632df cleanup 2020-12-04 16:32:47 +01:00
Claudio Atzori 5b4e1142a8 Merge pull request 'added last step to update cache' (#64) from antonis.lempesis/dnet-hadoop:master into master
Looks good to me, thanks!
2020-12-04 14:42:31 +01:00
Antonis Lempesis b1ed1afdcc added the new parameter (stats_tool_api_url) in the workflow parameters 2020-12-04 13:07:18 +02:00
Antonis Lempesis 7cb113e088 added the new parameter (stats_tool_api_url) in the workflow parameters 2020-12-04 13:04:25 +02:00
Antonis Lempesis d23ccae0d5 ignoring deletedbyinference relations 2020-12-04 12:42:17 +02:00
Miriam Baglioni 5fb65ffc4a merge branch with master 2020-12-03 11:24:35 +01:00
Miriam Baglioni ea88dc3401 fixed issue in property name 2020-12-03 11:24:23 +01:00
Miriam Baglioni 4c58bd1c93 merge with upstream 2020-12-03 11:24:00 +01:00
Miriam Baglioni 05c452f58d merge with upstream 2020-12-03 10:26:45 +01:00
Antonis Lempesis 413afcfed5 finished first implementation of wf 2020-12-02 15:57:17 +02:00
Antonis Lempesis 0948536614 initial implementation of the promote wf 2020-12-02 15:41:56 +02:00
Sandro La Bruzzo 7da679542f fixed wrong projectId 2020-12-02 14:28:09 +01:00
Sandro La Bruzzo 6ba8037cc7 fixed test failure due to change of input 2020-12-02 11:34:46 +01:00
Claudio Atzori cfb55effd9 code formatting 2020-12-02 11:23:49 +01:00
Claudio Atzori 74242e450e using constants from ModelConstants 2020-12-02 11:23:35 +01:00
Miriam Baglioni d5efa6963a using constants in ModelConstants 2020-12-02 11:20:26 +01:00
Claudio Atzori 873c358d1d Merge pull request 'added extension for new author pid (orcid_pending)' (#63) from miriam.baglioni/dnet-hadoop:master into master
LGTM
2020-12-02 11:15:00 +01:00
Miriam Baglioni cd285e98bc using the constants defined in the ModelConstants class 2020-12-02 11:13:23 +01:00
Miriam Baglioni 51c582c08c added orcid class name among the constants set 2020-12-02 11:12:54 +01:00
Miriam Baglioni 4b0d1530a2 merge upstream 2020-12-02 11:05:00 +01:00
Claudio Atzori faa977df7e Merge pull request 'orcid-no-doi' (#43) from enrico.ottonello/dnet-hadoop:orcid-no-doi into master
The dataset was generated and is now part of the actionsets available in BETA
2020-12-02 10:55:12 +01:00
Claudio Atzori 57f448b7a4 graph cleaning workflow separates orcid_pending from orcid, depending on the author pid provenance 2020-12-02 10:44:05 +01:00
Alessia Bardi 2d15667b4a testing XML generation from json object (case AMS ACTA) 2020-12-02 10:16:26 +01:00
Alessia Bardi a417624670 tests for raw graph mapping 2020-12-02 10:15:26 +01:00
Miriam Baglioni f8468c9c22 added extension for new author pid (orcid_pending) 2020-12-01 20:09:35 +01:00
Miriam Baglioni 888175baf7 added java doc 2020-12-01 18:36:29 +01:00
Miriam Baglioni 3d62d99d5d fixed issue in workflow variable 2020-12-01 15:02:49 +01:00
Miriam Baglioni 17680296b9 removed unnecessary variable and unused method 2020-12-01 15:02:31 +01:00
Miriam Baglioni 5b3ed70808 refactoring 2020-12-01 14:31:34 +01:00
Miriam Baglioni 62ff4999e3 added workflow and last step of collection and save 2020-12-01 14:30:56 +01:00
Miriam Baglioni 45d06c45c7 collecting all the atomic actions for result type and saving them all in the AS path 2020-12-01 14:29:18 +01:00
Miriam Baglioni 0051ebede5 extending test 2020-12-01 12:43:03 +01:00
Miriam Baglioni 719da15f04 added test resources 2020-12-01 12:42:30 +01:00
Miriam Baglioni e819155eb2 added implements Serializable 2020-12-01 09:51:58 +01:00
Miriam Baglioni db36e11912 classes test classes and resources for production of the actionset to include bipFinder score in results 2020-11-30 20:14:23 +01:00
Enrico Ottonello f2df3ead74 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-11-30 14:22:46 +01:00
Enrico Ottonello 40c4559e92 added datainfo on authors pid with "sysimport:crosswalk:entityregistry", 2020-11-30 14:19:22 +01:00
Antonis Lempesis 815d6b25d9 added last step to update cache 2020-11-30 00:48:10 +02:00
Claudio Atzori e731a7658d cleaning texts to remove tab characters too 2020-11-27 09:00:04 +01:00
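A sketch covering this commit and the earlier "remove newlines" cleaning commit below: tabs, newlines and carriage returns in free-text fields are collapsed into single spaces. Field handling is simplified here.

```java
public class TextCleaning {
	// collapse tabs, newlines and carriage returns into single spaces
	static String clean(String text) {
		return text == null ? null : text.replaceAll("[\\t\\n\\r]+", " ").trim();
	}
}
```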
Claudio Atzori a104d2b6ad cleanup 2020-11-26 11:12:00 +01:00
Miriam Baglioni 124591a7f3 refactoring 2020-11-25 18:23:28 +01:00
Miriam Baglioni 1a89f8211c D-Net/dnet-hadoop#61 (comment) 2020-11-25 18:12:40 +01:00
Miriam Baglioni 5fbe54ef54 D-Net/dnet-hadoop#61 (comment) 2020-11-25 18:10:28 +01:00
Miriam Baglioni ed01e5a5e1 D-Net/dnet-hadoop#61 (comment) 2020-11-25 18:09:34 +01:00
Miriam Baglioni d4ddde2ef2 changed because of D-Net/dnet-hadoop#61 (comment) 2020-11-25 18:01:01 +01:00
Miriam Baglioni f5e5e92a10 changed because of D-Net/dnet-hadoop#61 (comment) 2020-11-25 17:58:53 +01:00
Miriam Baglioni 1df94b85b4 changed because of D-Net/dnet-hadoop#61 (comment) 2020-11-25 17:57:43 +01:00
Miriam Baglioni 66c0e3e574 changed because of D-Net/dnet-hadoop#61 (comment) 2020-11-25 17:52:17 +01:00
Claudio Atzori db0181b8af Merge pull request 'added bidirectionality to relations from project and result coming from crossref' (#60) from miriam.baglioni/dnet-hadoop:sxBidirectionality into master 2020-11-25 17:17:40 +01:00
Sandro La Bruzzo ec3e238de6 Fixed problem on duplicated identifier 2020-11-25 17:15:54 +01:00
Miriam Baglioni 90d4369fd2 added test to verify the compression in writing community info on hdfs 2020-11-25 14:34:58 +01:00
Miriam Baglioni 6750e33d69 merge branch with master 2020-11-25 14:09:01 +01:00
Miriam Baglioni b2c455f883 added java doc 2020-11-25 14:08:09 +01:00
Miriam Baglioni 1f130cdf92 changed the relation (produces -> isProducedBy) due to the change in the code 2020-11-25 14:04:26 +01:00
Miriam Baglioni e758d5d9b4 refactoring 2020-11-25 13:46:39 +01:00
Miriam Baglioni 87a9f616ae refactoring and addition of the funder nsp first part as name for the dump instead of the whole nsp 2020-11-25 13:45:41 +01:00
Miriam Baglioni e7e418e444 added decision node to verify if to upload in Zenodo 2020-11-25 13:44:10 +01:00
Miriam Baglioni 305e3d0c9c added resource file for relation with relClass = isProducedBy 2020-11-25 13:43:41 +01:00
Miriam Baglioni 21ce175d17 added FilterFunction specification in filter operation 2020-11-25 13:42:31 +01:00
Miriam Baglioni bde6d337dd test classes for dump of results related to funders 2020-11-25 13:42:01 +01:00
Miriam Baglioni b37b9352d7 added constant value for semantic relationship between projects and results 2020-11-25 13:41:08 +01:00
Sandro La Bruzzo 264723ffd8 updated stuff for zenodo upload 2020-11-25 11:56:07 +01:00
Claudio Atzori eeebd5a920 Cleaning workflow: remove newlines from titles, descriptions, subjects 2020-11-24 18:40:25 +01:00
Enrico Ottonello 99a086f0c6 max concurrent executors set to 10, according to ORCID Director of Technology mail request 2020-11-24 17:49:32 +01:00
Miriam Baglioni 72bb0fe360 changed directory name 2020-11-24 16:47:07 +01:00
Miriam Baglioni 00874a8ce6 added bidirectionality to relations from project and result 2020-11-24 15:17:23 +01:00
Miriam Baglioni 39f4a20873 changed the path and the name for saving the communities_infrastructures dump file 2020-11-24 14:47:32 +01:00
Miriam Baglioni 7e14452a87 final version of the wf to get the dump of results associated to at least one funder, per funder 2020-11-24 14:46:34 +01:00
Miriam Baglioni c167a18057 added new parameter for the dumpType 2020-11-24 14:45:50 +01:00
Miriam Baglioni 54a309bb6b refactoring 2020-11-24 14:45:30 +01:00
Miriam Baglioni 35ecea8842 changed to reflect the modification in the specification of the dump type 2020-11-24 14:45:15 +01:00
Miriam Baglioni b9b6bdb2e6 fixing issue on previous implementation 2020-11-24 14:44:53 +01:00
Miriam Baglioni 7e940f1991 changed to reflect the modification in the specification of the dump type 2020-11-24 14:43:34 +01:00
Miriam Baglioni 62928ef7a5 changed to save the communities_infrastructures information as the other entity dumps: in a json.gz file 2020-11-24 14:42:41 +01:00
Miriam Baglioni 3319440c53 changed the direction of the relation between projects and result considered to select the results linked to projects 2020-11-24 14:41:09 +01:00
Miriam Baglioni 00c377dac2 added specification of MapFunction types in map 2020-11-24 14:40:22 +01:00
Miriam Baglioni 44db258dc4 added enumerated for the dump type 2020-11-24 14:38:06 +01:00
Miriam Baglioni 1832708c42 replaced boolean variable with a string one which specifies the type of dump we are performing: complete, community or funder 2020-11-24 14:37:36 +01:00
Miriam Baglioni 73dbb79602 removed the check for the community name in the common version of MakeTar 2020-11-24 14:36:15 +01:00
Enrico Ottonello 5c17e768b2 set wf configuration with spark.dynamicAllocation.maxExecutors 20 over 20 input partitions 2020-11-23 16:01:23 +01:00
Enrico Ottonello 5c9a727895 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-11-23 09:49:53 +01:00
Enrico Ottonello 97c8111847 action to convert the lambda file into a seq file; spark action to download updated authors 2020-11-23 09:49:22 +01:00
Miriam Baglioni 259c67ce36 fixed issue in path name 2020-11-20 12:32:23 +01:00
Miriam Baglioni 0a9db67eec - 2020-11-20 12:21:33 +01:00
Miriam Baglioni d362f2637d merge branch with master 2020-11-19 19:17:20 +01:00
Miriam Baglioni cf3f47563f new parameter files 2020-11-19 19:16:05 +01:00
Miriam Baglioni 24c56fa7a3 new logic and workflow for the dump of results with links to projects. In this implementation the results match the model of the communityresult. 2020-11-19 19:15:39 +01:00
Claudio Atzori d48f388fb2 Merge branch 'provision_indexing' 2020-11-19 15:59:55 +01:00
Claudio Atzori 46bde9c13f Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-11-19 15:26:27 +01:00
Claudio Atzori 7c9feaf9e7 project attributes removed from the XML record serialization: contactfullname, contactfax, contactphone, contactemail 2020-11-19 15:26:20 +01:00
Michele Artini 293da47ad9 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-11-19 10:42:31 +01:00
Michele Artini ab08d12c46 considering abstract > MIN_LENGTH in ENRICH_MISSING_ABSTRACT 2020-11-19 10:42:10 +01:00
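A sketch of the broker condition above, with a hypothetical MIN_LENGTH value: the ENRICH_MISSING_ABSTRACT event is worth proposing only when the candidate abstract exceeds the minimum length.

```java
public class AbstractCheck {
	static final int MIN_LENGTH = 100; // hypothetical threshold, not the real constant

	static boolean worthEnriching(String candidateAbstract) {
		return candidateAbstract != null
			&& candidateAbstract.trim().length() > MIN_LENGTH;
	}
}
```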
Claudio Atzori e503271abe fixed notification workflow name 2020-11-19 10:41:38 +01:00
Claudio Atzori 0374d34c3e introduced configuration param outputFormat: HDFS | SOLR 2020-11-19 10:34:28 +01:00
Miriam Baglioni fafb688887 - 2020-11-18 18:56:48 +01:00
Miriam Baglioni 906db690d2 - 2020-11-18 17:43:08 +01:00
Claudio Atzori ede7fae6c8 Merge pull request 'XML record indexing test' (#58) from provision_indexing into master 2020-11-18 17:04:34 +01:00
Miriam Baglioni 5402062ff5 changed parameter file with the one associated to the job 2020-11-18 16:58:20 +01:00
Miriam Baglioni a172a37ad1 fixed typo 2020-11-18 16:55:07 +01:00
Miriam Baglioni 46ba3793f6 code, workflow and parameters for the dump of the results associated to funders 2020-11-18 16:47:31 +01:00
Claudio Atzori 5218718e8b updated set of fields from the MDFormatDSResourceType on PROD 2020-11-18 15:00:41 +01:00
Claudio Atzori d9e07a242b extended XmlIndexingJob to accept an optional parameter: outputPath. When present, forces the job to write its output on the specified HDFS location 2020-11-18 14:34:55 +01:00
Claudio Atzori 29dcff0f34 spark complains about missing classes, so here they are again 2020-11-18 14:32:32 +01:00
Miriam Baglioni 57cac36898 changed the workflow name 2020-11-18 13:38:03 +01:00
Claudio Atzori 12acf25519 Merge pull request 'starting from first step...' (#57) from antonis.lempesis/dnet-hadoop:master into master
No judging. Just re-deploying...
2020-11-18 11:01:49 +01:00
Claudio Atzori 8177ce7939 test for XmlIndexingJob based on a local miniSolrCluster 2020-11-18 10:58:05 +01:00
Alessia Bardi 10e673660f Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-11-18 10:01:23 +01:00
Alessia Bardi be7b310cef rel semantics ignore case 2020-11-18 10:01:20 +01:00
Michele Artini 33da2e3d6c xpaths for dateOfCollection and dateOfTransformation 2020-11-18 09:26:20 +01:00
Antonis Lempesis 01a6e03989 starting from first step... 2020-11-17 23:26:47 +02:00
Alessia Bardi 8f87020a50 #56: map relevantDates from aggregated ODF records 2020-11-17 18:42:09 +01:00
Alessia Bardi 7e0a76a8ac test for TextGrid 2020-11-17 18:39:25 +01:00
Enrico Ottonello 2b0c9bbb7e Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-11-17 18:24:34 +01:00
Enrico Ottonello c0c2e05eae added wf for extracting authors and works xml data from the orcid dump to hdfs; added wf to download the lambda file (containing the last orcid update information) from orcid to hdfs 2020-11-17 18:23:12 +01:00
Claudio Atzori cfc01f136e PID filtering based on a blacklist 2020-11-17 12:27:06 +01:00
Claudio Atzori 628ca54dd3 disable old maven repository URLs 2020-11-17 12:26:16 +01:00
Dimitris bbcf6b7c8b Commit 17112020 2020-11-17 08:36:51 +02:00
Enrico Ottonello c796adae24 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-11-16 11:57:19 +01:00
Claudio Atzori 6ab1ce53c9 fixed condition in result pid cleaning; cleanup 2020-11-16 10:09:17 +01:00
Claudio Atzori 4de8c8b237 fixed workflow variable name 2020-11-16 10:03:11 +01:00
Dimitris 3e24c9b176 Changes 14112020 2020-11-14 18:42:07 +02:00
Claudio Atzori 331d621800 added test resource 2020-11-14 12:16:15 +01:00
Claudio Atzori 5d4e34e26a fixed typo in variable name 2020-11-14 10:32:26 +01:00
Claudio Atzori 768bc5304c Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-11-13 15:40:34 +01:00
Claudio Atzori 93f7b7974f Merge pull request 'trust truncated to 3 decimals' (#24) from trunc_trust into master
LGTM
2020-11-13 15:40:02 +01:00
Claudio Atzori 2facfefc19 updated maven repository URL 2020-11-13 15:38:40 +01:00
Claudio Atzori 528231a287 grouping graph entities by id turned out to be an easy extension for the already existing cleaning workflow 2020-11-13 15:37:48 +01:00
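A hedged Spark sketch of grouping entities by id and reducing each group to a single record; idOf and merge are stand-ins for the workflow's real key extraction and merge logic, and the types are simplified to strings.

```java
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.api.java.function.ReduceFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;

public class GroupById {
	static Dataset<String> groupEntities(Dataset<String> entities) {
		return entities
			.groupByKey((MapFunction<String, String>) GroupById::idOf, Encoders.STRING())
			.reduceGroups((ReduceFunction<String>) GroupById::merge)
			.map((MapFunction<scala.Tuple2<String, String>, String>) t -> t._2(), Encoders.STRING());
	}

	private static String idOf(String e) { return e.split(",", 2)[0]; }   // hypothetical id extraction
	private static String merge(String a, String b) { return a.length() >= b.length() ? a : b; } // placeholder merge
}
```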
Enrico Ottonello 005f849674 added compression to output dataset 2020-11-13 12:45:31 +01:00
Enrico Ottonello 9a2fa9dc2f added test for other names parsing from summaries dump 2020-11-13 10:25:34 +01:00
Claudio Atzori 2bed29eb09 WIP: added oozie workflow for grouping graph entities by id 2020-11-13 10:05:12 +01:00
Claudio Atzori 13e36a4da0 WIP: added oozie workflow for grouping graph entities by id 2020-11-13 10:05:02 +01:00
Enrico Ottonello 13f28fa225 moved AuthorData to dhp-schemas; added other names to author data 2020-11-12 17:43:32 +01:00
Enrico Ottonello 2af21150c5 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-11-12 09:58:33 +01:00
Claudio Atzori 75324ae58a Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-11-12 09:23:37 +01:00
Claudio Atzori 822971f54f no need to filter relations in CreateRelatedEntitiesJob_phase1; replaced 'left outer' join with 'left' join in CreateRelatedEntitiesJob_phase2; cleanup; 2020-11-12 09:22:59 +01:00
Enrico Ottonello 1f861f2b0d now wf output is a sequence file with the format seq("eu.dnetlib.dhp.schema.oaf.Publication",eu.dnetlib.dhp.schema.action.AtomicActions) 2020-11-11 17:38:50 +01:00
Claudio Atzori 9841488482 Merge pull request 'latest changes in stats wf' (#54) from antonis.lempesis/dnet-hadoop:master into master
LGTM, thanks!
2020-11-11 16:01:51 +01:00
Antonis Lempesis 99ebaee347 fixed #5913 2020-11-11 16:56:46 +02:00
Claudio Atzori e3d3481fb9 Merge pull request 'organizations pids' (#53) from organization_pids into master
LGTM
2020-11-11 14:08:25 +01:00
Antonis Lempesis f14e65f6a3 reverted wrong change 2020-11-10 17:23:04 +02:00
Antonis Lempesis c02c7741c9 fixes in db creation 2020-11-10 17:11:30 +02:00
Antonis Lempesis e603fa5847 fixes in db creation 2020-11-10 17:11:12 +02:00
Enrico Ottonello fea2451658 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-11-10 11:49:43 +01:00
Claudio Atzori 18d9aad70c improved documentation in dhp-graph-provision 2020-11-10 11:48:55 +01:00
Enrico Ottonello 1513174d7e added further test case 2020-11-10 11:44:55 +01:00
Michele Artini 40160d171f organizations pids 2020-11-09 12:58:36 +01:00
Sandro La Bruzzo 027ef2326c Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-11-06 17:12:42 +01:00
Sandro La Bruzzo cd27df91a1 fixed bug on missing relation in ANDS 2020-11-06 17:12:31 +01:00
Enrico Ottonello 6bc7dbeca7 first version of dataset successfully generated from orcid dump 2020 2020-11-06 13:47:50 +01:00
Claudio Atzori d10447e747 re-packaged graph dump workflow sources 2020-11-05 17:38:18 +01:00
Claudio Atzori 144216fb88 Merge pull request 'OpenAIRE graph dump' (#51) from miriam.baglioni/dnet-hadoop:dump into master
LGTM
2020-11-05 17:09:52 +01:00
Miriam Baglioni f8e9bda24c merge branch with master 2020-11-05 16:31:18 +01:00
Miriam Baglioni afa0b1489b merge upstream 2020-11-05 16:31:09 +01:00
Miriam Baglioni 7ebdfacee9 removed commented code and added documentation to new method 2020-11-05 16:30:36 +01:00
Miriam Baglioni be5ed8f554 added check to avoid sending empty metadata. 2020-11-05 16:10:17 +01:00
Claudio Atzori 2148a51fae minor changes 2020-11-05 11:24:12 +01:00
Claudio Atzori 4625b7486e code formatting 2020-11-04 18:12:43 +01:00
Claudio Atzori f5f346dd2b Merge pull request 'dump' (#50) from miriam.baglioni/dnet-hadoop:dump into master
LGTM
2020-11-04 18:07:01 +01:00
Miriam Baglioni e9ac471ae9 removed dependency from classes for the pid graph dump 2020-11-04 18:04:42 +01:00
Miriam Baglioni f45c23316f removed entities added for the pid graph dump 2020-11-04 17:31:24 +01:00
Miriam Baglioni e9d948786d removed commented code 2020-11-04 17:30:51 +01:00
Miriam Baglioni b90a945c49 removed property files for pid graph dump 2020-11-04 17:28:33 +01:00
Miriam Baglioni bac307155a removed properties specific for pid graph dump 2020-11-04 17:28:04 +01:00
Miriam Baglioni 9c9d50f486 removed code specific for pid graph dump 2020-11-04 17:26:22 +01:00
Miriam Baglioni 5669890934 removed commented lines 2020-11-04 17:15:21 +01:00
Miriam Baglioni 6a89f59be9 removed commented lines 2020-11-04 17:13:59 +01:00
Miriam Baglioni 56150d7e5e removed all code related to the dump of pids graph 2020-11-04 17:13:12 +01:00
Miriam Baglioni 16c54a96f8 removed pid dump 2020-11-04 17:11:32 +01:00
Miriam Baglioni d9d8de63cc merge upstream 2020-11-04 13:36:38 +01:00
Miriam Baglioni 0cac5436ff Merge branch 'dump' of code-repo.d4science.org:miriam.baglioni/dnet-hadoop into dump 2020-11-04 13:21:11 +01:00
Alessia Bardi 51808b5afd Updated descriptions 2020-11-04 12:29:48 +01:00
Alessia Bardi e6becf8659 Updated descriptions 2020-11-04 12:17:57 +01:00
Alessia Bardi 0abe0eee33 Updated descriptions 2020-11-04 12:15:30 +01:00
Alessia Bardi f6ab238f5d Updated descriptions 2020-11-04 11:50:47 +01:00
Sandro La Bruzzo 3581244daf Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-11-04 09:04:22 +01:00
Sandro La Bruzzo 66efb39634 implemented merge scholix 2020-11-04 09:04:01 +01:00
Miriam Baglioni c010a8442f fixed issue on test code 2020-11-03 17:26:51 +01:00
Miriam Baglioni 8ec7a61188 merge branch with master 2020-11-03 16:59:08 +01:00
Miriam Baglioni 8b4f7bf492 merge upstream 2020-11-03 16:58:59 +01:00
Miriam Baglioni c209284ca7 new schemas for the entities in the dump with added descriptions 2020-11-03 16:58:08 +01:00
Miriam Baglioni 08806deddf added the splitSize non mandatory parameter. Default size 10G 2020-11-03 16:57:34 +01:00
Miriam Baglioni 7d2eda43ca added new non-mandatory property publish to determine whether to publish the upload or leave it pending. Default value false 2020-11-03 16:57:01 +01:00
Miriam Baglioni cbbb1bdc54 moved business logic to new class in common for handling the zip of the archives 2020-11-03 16:55:50 +01:00
Miriam Baglioni 7d95a5e2b4 refactoring 2020-11-03 16:55:13 +01:00
Miriam Baglioni d4382b54df moved the tar archive with max size to the common module 2020-11-03 16:54:50 +01:00
Claudio Atzori 5310e56dba remove empty PIDs 2020-11-03 11:52:10 +01:00
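A minimal sketch of that cleaning step, assuming a PID is represented here by its string value: blank or null values are dropped.

```java
import java.util.List;
import java.util.stream.Collectors;

import org.apache.commons.lang3.StringUtils;

public class PidCleaning {
	static List<String> removeEmpty(List<String> pids) {
		return pids.stream()
			.filter(StringUtils::isNotBlank) // drops null, "" and whitespace-only values
			.collect(Collectors.toList());
	}
}
```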
Miriam Baglioni 1124ac29fc merge upstream 2020-11-02 10:22:51 +01:00
Sandro La Bruzzo 754c86f33e fixed test to work on jenkins 2020-11-02 09:35:01 +01:00
Sandro La Bruzzo 39337d8a8a fixed test 2020-11-02 09:26:25 +01:00
Dimitris 32bf943979 Changes to download only updates 2020-11-02 09:08:25 +02:00
Miriam Baglioni dabb33e018 changed the discriminant by which to split the file 2020-10-30 17:52:22 +01:00
Claudio Atzori fbad4988be relClass values should be camel-case 2020-10-30 17:26:17 +01:00
Claudio Atzori c5dda3a00c Merge pull request 'h2020classification' (#49) from miriam.baglioni/dnet-hadoop:h2020classification into master
LGTM
2020-10-30 17:10:05 +01:00
Miriam Baglioni 4905739be6 changed resource file to mirror change in business logic 2020-10-30 17:02:57 +01:00
Miriam Baglioni b40360ebfb changed the code to mirror the changed decision on the classification level and programme description labels 2020-10-30 17:02:30 +01:00
Miriam Baglioni 696409fb9f disabled tests because they need a remote resource 2020-10-30 17:01:48 +01:00
Miriam Baglioni 0fba08eae4 max allowed size per file 10 Gb 2020-10-30 16:05:55 +01:00
Miriam Baglioni b828587252 prevent the code from cycling indefinitely 2020-10-30 15:01:25 +01:00
Miriam Baglioni f747e303ac classes for dumping of the graph as ttl file 2020-10-30 14:13:45 +01:00
Miriam Baglioni 16baf5b69e formatting 2020-10-30 14:13:14 +01:00
Miriam Baglioni a9eef9c852 added check for possible Optional value in relation dataInfo 2020-10-30 14:12:28 +01:00
Miriam Baglioni 5f4de9a962 formatting 2020-10-30 14:11:40 +01:00
Miriam Baglioni 10d8bbada8 replaced deprecated method with non-deprecated version 2020-10-30 14:10:10 +01:00
Miriam Baglioni 14bf2e7238 added option to split dumps bigger than 40Gb into different files 2020-10-30 14:09:04 +01:00
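A simplified sketch of size-capped splitting as described above: files accumulate into the current chunk until the cap would be exceeded, then a new chunk starts. Only the bookkeeping is shown; the 40Gb cap comes from the commit message, everything else is assumed.

```java
import java.util.ArrayList;
import java.util.List;

public class SplitBySize {
	static final long MAX_BYTES = 40L * 1024 * 1024 * 1024; // 40Gb cap per output file

	static List<List<Long>> split(List<Long> fileSizes) {
		List<List<Long>> chunks = new ArrayList<>();
		List<Long> current = new ArrayList<>();
		long used = 0;
		for (long size : fileSizes) {
			if (!current.isEmpty() && used + size > MAX_BYTES) {
				chunks.add(current);       // close the full chunk
				current = new ArrayList<>();
				used = 0;
			}
			current.add(size);
			used += size;
		}
		if (!current.isEmpty()) chunks.add(current);
		return chunks;
	}
}
```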
Dimitris b8a3392b59 Commit 30102020 2020-10-30 14:07:21 +02:00
Miriam Baglioni 78fdb11c3f merge branch with master 2020-10-29 12:55:22 +01:00
Miriam Baglioni d6e8dc0313 merge upstream 2020-10-29 12:55:06 +01:00
Sandro La Bruzzo 1d9fdb7367 fixed spark memory issue in SparkSplitOafTODLIEntities 2020-10-28 12:30:32 +01:00
Miriam Baglioni 4cf4454341 changed from deprecated method to new one 2020-10-27 17:46:19 +01:00
Miriam Baglioni c8f32dd109 - 2020-10-27 17:45:58 +01:00
Miriam Baglioni 3582eba565 - 2020-10-27 17:31:33 +01:00
Miriam Baglioni d2374e3b9e added code to handle cases where the funding tree does not exist 2020-10-27 16:15:21 +01:00
Miriam Baglioni 5d3012eeb4 changed code to dump only the programme list and not the classification list 2020-10-27 16:14:18 +01:00
Miriam Baglioni 1bd638d291 removed h2020classification from dump information. Added back the programme info 2020-10-27 16:13:36 +01:00
Miriam Baglioni 3241ec1777 added connection timeout and socket timeout 600 sec 2020-10-27 16:12:11 +01:00
Miriam Baglioni cc68855a1e merge upstream 2020-10-27 15:54:16 +01:00
Miriam Baglioni 1cb60aede4 added connection timeout and socket timeout 600 sec 2020-10-27 15:53:02 +01:00
Enrico Ottonello 9818e74a70 added dependency version in main pom.xml for orcid no doi 2020-10-22 16:38:00 +02:00
Enrico Ottonello 210a50e4f4 replaced null value 2020-10-22 16:24:42 +02:00
Enrico Ottonello b0290dbcb7 moved all dependencies version to main pom.xml 2020-10-22 16:20:46 +02:00
Enrico Ottonello a38ab57062 let test methods run 2020-10-22 15:43:50 +02:00
Enrico Ottonello 1139d6568d replaced null value with a more safe empty string as return value 2020-10-22 15:32:26 +02:00
Enrico Ottonello c58db1c8ea added filter on null value after map function 2020-10-22 15:11:02 +02:00
Enrico Ottonello 846ba30873 if typologies mapping fails, an exception will be propagated 2020-10-22 14:36:18 +02:00
Enrico Ottonello c3114ba0ae replaced null as return value with a more safe empty string 2020-10-22 14:21:31 +02:00
Enrico Ottonello c295c71ca0 added comment 2020-10-22 14:07:26 +02:00
Enrico Ottonello ab083f9946 propagate exception on parsing work (PR request) 2020-10-22 14:02:32 +02:00
sandro 3a81a940b7 solved bug on merge publication 2020-10-21 22:41:55 +02:00
Miriam Baglioni a2ce527fae changed to match the requirements for short titles in level and long titles in classification 2020-10-20 17:03:25 +02:00
Sandro La Bruzzo 346ed65e2c added upload to zenodo node 2020-10-20 16:59:55 +02:00
sandro 271b4db450 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-10-20 16:09:49 +02:00
sandro d58d02d448 added workflow upload on zenodo 2020-10-20 16:09:07 +02:00
Alessia Bardi 1425d810a8 testing mapping 2020-10-19 17:46:14 +02:00
Miriam Baglioni 959f30811e added connection timeout and socket timeout 600 sec 2020-10-16 10:52:30 +02:00
Sandro La Bruzzo fed711da80 Merge remote-tracking branch 'origin/master' into merge_record_to_common 2020-10-13 15:32:45 +02:00
Sandro La Bruzzo 34bf64c94f fixed export Scholexplorer to OpenAire 2020-10-13 08:47:58 +02:00
Alessia Bardi 8775a64bc1 Merge pull request 'Merging different compatibility levels (pinocchio operator)' (#47) from merge_graph into master 2020-10-09 14:44:52 +02:00
Claudio Atzori e751c1402f Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-10-09 13:53:21 +02:00
Claudio Atzori b961dc7d1e added originalid to the fields in the result graph view 2020-10-09 13:53:15 +02:00
Sandro La Bruzzo 734934e2eb fixed error on empty intersection with publication and relation on export to OAF 2020-10-08 17:29:29 +02:00
Sandro La Bruzzo eec418cd26 moved AuthorMerger into dhp-common 2020-10-08 10:33:55 +02:00
Sandro La Bruzzo fe0a7870e6 Added test to check if merge authors works 2020-10-08 10:33:12 +02:00
Sandro La Bruzzo cd9c377d18 adapted scholexplorer Dump generation to the new Dataset definition 2020-10-08 10:10:13 +02:00
Claudio Atzori a3f37a9414 javadoc 2020-10-07 16:44:22 +02:00
Claudio Atzori 8d85a2fced [BETA wf only] datasources involved in the merge operation don't obey the infra precedence policy, but rely on a custom behaviour that, given two datasources from beta and prod, returns the one from prod with the highest compatibility among the two 2020-10-07 16:28:52 +02:00
Claudio Atzori 5f7b75f5c5 code formatting 2020-10-07 13:22:54 +02:00
miconis 5a8bc329c5 bug fix in the result merge: it takes the correct bestaccessright based on the license instead of the trust 2020-10-06 15:26:44 +02:00
Claudio Atzori babebbc9d3 Merge pull request 'Adding H2020 Classification, topic code and topic description to H2020 projects' (#46) from miriam.baglioni/dnet-hadoop:h2020classification into master
LGTM, thanks Miriam.
2020-10-05 14:14:38 +02:00
Miriam Baglioni d8839d1858 adding short description 2020-10-05 14:09:59 +02:00
Miriam Baglioni 061527f06e adding short description 2020-10-05 13:54:39 +02:00
Miriam Baglioni 0c12d7bdd8 adding short description 2020-10-05 11:39:55 +02:00
Miriam Baglioni ae08b3c0dd merge branch with master 2020-10-05 11:35:55 +02:00
Miriam Baglioni 11b7eaae09 changed the name of the folder where the context entity is stored from context to communities_infrastructures 2020-10-05 11:24:54 +02:00
Miriam Baglioni 32bffb0134 changed the name from communities_infrastructures to communities_infrastuctures.json 2020-10-05 11:24:17 +02:00
Claudio Atzori 23f64d9eb4 updated dedup tests following the dnet-pace-core library update 2020-10-02 14:30:53 +02:00
Claudio Atzori 4fddd18403 updating to dnet-pace-core:4.0.5
- fixed error in the treeprocessor. it used th=-1 as default value, now it use th=1  5021e5048f
- fixed error in the block processor: entities with orderField=null were not considered 9e8ea8f6ee
2020-10-02 12:37:25 +02:00
Miriam Baglioni fc2f7636be removed unused code 2020-10-02 12:33:52 +02:00
Miriam Baglioni 12407c1f32 modification related to D-Net/dnet-hadoop#46 (comment); also modified the java doc with correct ref to H2020Classification instead of H2020Programme 2020-10-02 12:29:01 +02:00
Miriam Baglioni 1cda6fd1ba modification related to D-Net/dnet-hadoop#46 (comment) 2020-10-02 12:27:48 +02:00
Miriam Baglioni 25cbcf6114 changed to solve issues about names: context renamed communities_infrastructure.json and removed the double json.gz extension from the name of the part in the tar 2020-10-02 12:17:46 +02:00
Claudio Atzori 9db0f88fb8 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-10-02 09:43:35 +02:00
Claudio Atzori 49ae3450a9 code formatting 2020-10-02 09:43:24 +02:00
Claudio Atzori 1c44182dea minor changes 2020-10-02 09:41:34 +02:00
Claudio Atzori c2a6e2a9bf fixed mapping for datasource journal info (ISSNs) 2020-10-02 09:37:08 +02:00
Miriam Baglioni 01117a46e1 whole workflow activated 2020-10-01 17:19:21 +02:00
Miriam Baglioni cfb5766c6b removed double json.gz from names of files in the tar 2020-10-01 17:18:34 +02:00
Miriam Baglioni fcaedac980 merge branch with master 2020-10-01 16:46:59 +02:00
Miriam Baglioni c6e6ed1bd8 merge branch with master 2020-10-01 16:24:41 +02:00
Miriam Baglioni 4aec347351 refactoring 2020-10-01 16:23:52 +02:00
Miriam Baglioni 61946b4092 refactoring 2020-10-01 16:22:48 +02:00
Miriam Baglioni 7e6d35e56c added the link to the excel file related to topic 2020-10-01 15:53:31 +02:00
Sandro La Bruzzo 1a0a44e85a Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-10-01 15:46:53 +02:00
Sandro La Bruzzo c4a3c52e45 fixed Doiboost bug in the identifier 2020-10-01 15:46:44 +02:00
Miriam Baglioni 43cbd62c2b added classpath.first in the configuration 2020-10-01 15:46:34 +02:00
Miriam Baglioni cd69c6b023 added dependency for the topic file path 2020-10-01 15:45:59 +02:00
Miriam Baglioni 5ef03e5971 added the dependencies from dhp-aggregation for h2020classification 2020-10-01 15:44:40 +02:00
Miriam Baglioni 771cde3d05 moved the library version to global pom 2020-10-01 15:43:47 +02:00
Miriam Baglioni 632351c0da modified test resources to mirror the changes in the code 2020-10-01 15:43:02 +02:00
Miriam Baglioni ebc1c5513f modified test resources to mirror the changes in the code 2020-10-01 15:42:29 +02:00
Miriam Baglioni 3a374c34b6 fixed null pointer exception 2020-10-01 15:41:01 +02:00
Miriam Baglioni 83ea746163 added check to the test 2020-10-01 15:40:28 +02:00
Claudio Atzori 2e9e13444d author pids made unique by value 2020-10-01 12:50:40 +02:00
Miriam Baglioni 6e5db85b32 - 2020-10-01 11:51:11 +02:00
Miriam Baglioni a46179f61c refactoring 2020-10-01 11:22:01 +02:00
Miriam Baglioni b90bee124b removing rows that are empty from those imported 2020-10-01 11:16:49 +02:00
Miriam Baglioni c107f193c9 refactoring 2020-10-01 11:16:22 +02:00
Claudio Atzori e265c3e125 cleaning functions factored out in a dedicated class 2020-10-01 10:50:15 +02:00
Miriam Baglioni 706a80a29a added test to check that separator '-' (not hyphen) will be recognized 2020-10-01 10:38:31 +02:00
Miriam Baglioni 3dca586b3b refactoring 2020-10-01 10:34:48 +02:00
Miriam Baglioni 416bda6066 changed programme.description to use the same value used in the classification instead of the short title or the title 2020-10-01 10:31:33 +02:00
Miriam Baglioni f6587c91f3 added comparison to a char that looks like '-' but is not 2020-10-01 10:30:26 +02:00
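A sketch of the lookalike-separator check mentioned here and in the related test above; which non-ASCII dash actually occurs in the source data is not stated, so the codepoints below are assumptions.

```java
public class SeparatorCheck {
	static boolean isDashLike(char c) {
		return c == '-'        // ASCII hyphen-minus, U+002D
			|| c == '\u2013'   // en dash
			|| c == '\u2014'   // em dash
			|| c == '\u2212';  // minus sign
	}
}
```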
Claudio Atzori 4287164aba include relevantdate field in the result view 2020-10-01 10:28:55 +02:00
Miriam Baglioni 7e73bb88b3 changed the logic to add the topic description to the project 2020-09-28 17:21:43 +02:00
Miriam Baglioni 0a035e3630 - 2020-09-28 17:20:49 +02:00
Miriam Baglioni 16bee2084d added the topic code to the project subset 2020-09-28 17:20:11 +02:00
Miriam Baglioni 0bf2d0db52 added to the workflow the download of the topic excel file and one property needed to get the input path of the topic file in the hdfs filesystem 2020-09-28 12:17:22 +02:00
Miriam Baglioni c2abde4d9f changed the implementation of Atomic Actions creation by exploiting the topic information taken from the cordis excel file 2020-09-28 12:16:34 +02:00
Miriam Baglioni d930b8d3fc changed the query to get only the code of the project and not the optional1 (topic code) and optional2 (topic description) 2020-09-28 12:15:48 +02:00
Miriam Baglioni f8f5cfd5cc removed the part added to set the topic code and description in the step of project preparation 2020-09-28 12:13:33 +02:00
Miriam Baglioni 9e19c9a221 remove the topic description from the values in the CSVProject class 2020-09-28 12:11:03 +02:00
Miriam Baglioni 6d8b932e40 refactoring 2020-09-28 12:06:56 +02:00
Miriam Baglioni b77f166549 changed the package name from csvutils to utils 2020-09-28 12:05:47 +02:00
Miriam Baglioni e33e3277de added needed dependency to read the excel file 2020-09-28 12:03:14 +02:00
Miriam Baglioni f4739a371a code to get the information related to the topic association between code and description. 2020-09-28 12:02:48 +02:00
Miriam Baglioni 7b6a7333e6 merge branch with master 2020-09-25 16:42:07 +02:00
Miriam Baglioni 983a12ed15 temporary modification to allow the upload of files in the sandbox without the need to recreate the mapping from scratch 2020-09-25 16:41:51 +02:00
Miriam Baglioni 8b36d19182 added property depositionId and changed property newVersion from boolean to string to handle the three possible distinct values 2020-09-25 16:41:15 +02:00
Miriam Baglioni ed5239f9ec added new code to handle the new possibility to upload files to an already open deposition 2020-09-25 16:34:32 +02:00
Miriam Baglioni 3a8c524fce refactor 2020-09-25 16:34:02 +02:00
Miriam Baglioni ccd48dd78a added new test for new method 2020-09-25 16:33:43 +02:00
Miriam Baglioni 3e5497b336 added new method to handle an open deposition to which upload data 2020-09-25 16:33:15 +02:00
Miriam Baglioni 2ac2b537b6 merge branch with master 2020-09-25 14:40:47 +02:00
Miriam Baglioni 54800fb9b0 enabled only the step to upload in zenodo 2020-09-25 14:40:22 +02:00
Miriam Baglioni 12c2dfc268 modified the resource to consider the information added to the model 2020-09-25 14:17:23 +02:00
Miriam Baglioni 969fa8d96e fixed issue and changed the transformation of the programme file to consider the new model 2020-09-25 13:32:34 +02:00
Michele Artini c171fdebe1 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-09-25 09:03:09 +02:00
Michele Artini c96598aaa4 opendoar partition 2020-09-25 09:02:58 +02:00
Miriam Baglioni de6c4d46d8 fixed conflicts 2020-09-24 15:35:01 +02:00
Miriam Baglioni e917281822 - 2020-09-24 15:24:05 +02:00
Miriam Baglioni 9f54f69e6d added topic information 2020-09-24 15:23:35 +02:00
Miriam Baglioni d6206d6e63 add the topic description to the action set associated to the project 2020-09-24 15:22:40 +02:00
Miriam Baglioni 6b50226f3b added topic code and topic description 2020-09-24 15:21:49 +02:00
Miriam Baglioni 15af1f527e modified to consider the topic information 2020-09-24 15:20:56 +02:00
Miriam Baglioni 609ff17cfc now the Commission gives us the framework programme (FP7 - H2020), so this information is used to filter out programmes not associated with H2020 2020-09-24 15:19:31 +02:00
Miriam Baglioni b66f930466 Added optional1 and optional2 information to the files read from the db. Optional1 contains the topic code and optional2 contains the topic description 2020-09-24 15:16:56 +02:00
Miriam Baglioni 860e6d38a6 added topic description to the CSV project variables 2020-09-24 15:15:26 +02:00
Miriam Baglioni 89612d639c added the h2020topicdescription 2020-09-24 15:14:50 +02:00
Claudio Atzori 044d3a0214 fixed query used to load datasources in the Graph 2020-09-24 13:48:58 +02:00
Claudio Atzori 27df1cea6d code formatting 2020-09-24 12:16:00 +02:00
Claudio Atzori fb22f4d70b included values for the project fundedamount and totalcost fields in the mapping tests. Swapped expected and actual values in JUnit test assertions 2020-09-24 12:10:59 +02:00
Claudio Atzori 42f55395c8 fixed order of the ISSNs returned by the SQL query 2020-09-24 12:09:58 +02:00
Claudio Atzori fadf5c7c69 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-09-24 10:42:52 +02:00
Claudio Atzori 9a7e72d528 using concat_ws to join textual columns from PSQL. When using || to perform the concatenation, NULL columns make the operation result NULL 2020-09-24 10:42:47 +02:00
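The NULL semantics behind this change can be reproduced directly; a minimal JDBC sketch (the connection URL is a placeholder, and the PostgreSQL driver is assumed to be on the classpath):

```
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ConcatNullDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; any PostgreSQL instance will do.
        try (Connection c = DriverManager.getConnection("jdbc:postgresql://localhost/postgres");
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery(
                 "SELECT 'a' || NULL || 'b' AS with_pipes, "
                     + "concat_ws(' ', 'a', NULL, 'b') AS with_concat_ws")) {
            rs.next();
            System.out.println(rs.getString("with_pipes"));     // null: || propagates NULL
            System.out.println(rs.getString("with_concat_ws")); // "a b": concat_ws skips NULL arguments
        }
    }
}
```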
Claudio Atzori 9e3e93c6b6 setting the correct issn type in the datasource.journal element 2020-09-24 10:39:16 +02:00
Miriam Baglioni 0d83f47166 merge branch with master 2020-09-23 17:33:49 +02:00
Miriam Baglioni 39eb8ab25b changed the dump to move from h2020programme to h2020classification 2020-09-23 17:33:00 +02:00
Miriam Baglioni 1d84cf19a6 added new line to resource file 2020-09-23 17:32:22 +02:00
Miriam Baglioni f0c476b6c9 modification to the test classes to consider h2020classification 2020-09-23 17:31:49 +02:00
Miriam Baglioni 2cba3cb484 modification to the classes building the actionset to consider the h2020classification 2020-09-23 17:31:15 +02:00
Miriam Baglioni 3c7ef5ca04 modification to the schema for h2020classification and h2020topic code 2020-09-23 17:30:21 +02:00
Miriam Baglioni 9d8cb5f827 H2020classification renamed as H2020Classification. Moved a method out of the model definition that was tied to a mapping issue, and changed the char used to split the levels from '$' to '|' 2020-09-22 14:58:53 +02:00
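One practical wrinkle of this change: both '$' and '|' are regex metacharacters, so the separator must be escaped when splitting the classification levels with String.split. A sketch (the sample string is illustrative, not taken from the actual data):

```
import java.util.Arrays;
import java.util.regex.Pattern;

public class LevelSplitDemo {
    public static void main(String[] args) {
        String classification = "Excellent Science|European Research Council|ERC"; // made-up levels
        // An unescaped "|" is the regex alternation operator and would split between every character.
        System.out.println(Arrays.toString(classification.split("\\|")));
        // Pattern.quote is an equivalent, more explicit way to escape the separator.
        System.out.println(Arrays.toString(classification.split(Pattern.quote("|"))));
    }
}
```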
Miriam Baglioni 1069cf243a modification to the schema to consider the H2020classification of the programme. The field Programme has been moved inside the H2020classification, which is now associated to the Project. Programme is no longer associated directly to the Project but via H2020classification 2020-09-22 14:38:00 +02:00
Enrico Ottonello a97ad20c7b exception is now propagated (PR review) 2020-09-22 10:46:34 +02:00
Enrico Ottonello fefbcfb106 dependency version moved to main pom (PR review) 2020-09-22 10:20:25 +02:00
Michele Artini 9e681609fd stats to sql file 2020-09-17 15:51:22 +02:00
Michele Artini 19eb8f9dcc Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-09-17 11:38:45 +02:00
Michele Artini 51321c2701 partition of events by opendoarId 2020-09-17 11:38:07 +02:00
Enrico Ottonello 7cffd14fb0 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-09-15 16:05:55 +02:00
Claudio Atzori cf2ce1a09b code formatting 2020-09-15 15:58:03 +02:00
Enrico Ottonello 9e8e7fe6ef add comments 2020-09-15 11:32:49 +02:00
Miriam Baglioni c2b5c780ff - 2020-09-14 14:34:03 +02:00
Miriam Baglioni e2ceefe9be - 2020-09-14 14:33:28 +02:00
Miriam Baglioni 1f893e63dc - 2020-09-14 14:33:10 +02:00
Enrico Ottonello 538f299767 merged 2020-09-14 12:35:16 +02:00
Enrico Ottonello eb8c9b2348 Merge remote-tracking branch 'upstream/master' into orcid-no-doi 2020-09-14 12:00:56 +02:00
Michele Artini 9b0c12f5d3 send notifications 2020-09-11 12:06:16 +02:00
Michele Artini 028613b751 remove old notifications 2020-09-09 15:32:06 +02:00
Michele Artini 9cfc124ac5 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-09-08 16:39:54 +02:00
Michele Artini a597a218ab * for all topics 2020-09-08 16:39:40 +02:00
Claudio Atzori 8a523474b7 code formatting 2020-09-07 11:40:16 +02:00
Claudio Atzori 80eba5b497 avoid reformatting javadoc comments 2020-09-07 11:40:00 +02:00
Michele Artini bb459caf69 support for all topic subscriptions 2020-08-27 11:01:21 +02:00
Michele Artini 82ed8edafd notification indexing 2020-08-26 15:10:48 +02:00
Miriam Baglioni b72a7dad46 resource for pid graph dump 2020-08-24 17:09:01 +02:00
Miriam Baglioni 8694bb9b31 refactoring due to compilation 2020-08-24 17:07:34 +02:00
Miriam Baglioni 428f1022fd refactoring due to compilation 2020-08-24 17:05:50 +02:00
Miriam Baglioni c90a0d39dd refactoring due to compilation 2020-08-24 17:04:59 +02:00
Miriam Baglioni bd5a72929b refactoring due to compilation 2020-08-24 17:03:06 +02:00
Miriam Baglioni 8a069a4fea - 2020-08-24 17:01:30 +02:00
Miriam Baglioni 34fa96f3b1 - 2020-08-24 17:00:20 +02:00
Miriam Baglioni 5fb2949cb8 added utils methods 2020-08-24 17:00:09 +02:00
Miriam Baglioni 2a540b6c01 added constants for the pid graph dump 2020-08-24 16:55:35 +02:00
Miriam Baglioni da103c399a resources for the pid graph dump test 2020-08-24 16:52:07 +02:00
Miriam Baglioni 630a6a1fe7 first tests for the pid graph dump 2020-08-24 16:51:26 +02:00
Miriam Baglioni 40c8d2de7b test resources for the dump of the pids graph 2020-08-24 16:50:39 +02:00
Miriam Baglioni bef79d3bdf first attempt to the dump of pids graph 2020-08-24 16:49:38 +02:00
Michele Artini da470422d3 deleting events 2020-08-21 14:52:48 +02:00
Michele Artini 6e60bf026a indexing only a subset of events 2020-08-19 12:39:22 +02:00
Miriam Baglioni 85203c16e3 merge branch with master 2020-08-19 11:49:03 +02:00
Miriam Baglioni 2c783793ba removed the affiliation from the author to mirror the changes in the model 2020-08-19 11:48:12 +02:00
Miriam Baglioni c325acef3f changed to be extensions of the classes defining the common parameters 2020-08-19 11:47:17 +02:00
Miriam Baglioni 6b8c5034fc extends the result with parameters specific to the community dump. 2020-08-19 11:46:20 +02:00
Miriam Baglioni f26382fa51 extends the instance with parameters collectedfrom and hostedby to be dumped only for communities 2020-08-19 11:45:23 +02:00
Miriam Baglioni 4584cf6334 contains the specialized instance parameter for the dump of the result in the whole graph 2020-08-19 11:44:46 +02:00
Miriam Baglioni f5bae426f7 modified to store information concerning instance and result common to both the community and whole-graph dumps 2020-08-19 11:43:44 +02:00
Miriam Baglioni 11b80899d7 added to store information concerning project common to both the community and whole-graph dumps 2020-08-19 11:43:18 +02:00
Miriam Baglioni f6bf888016 removed affiliation from author to mirror the changes in the model 2020-08-19 11:41:41 +02:00
Miriam Baglioni 66d0e0d3f2 - 2020-08-19 11:31:50 +02:00
Miriam Baglioni 1c593a9cfe - 2020-08-19 11:29:51 +02:00
Miriam Baglioni e42b2f5ae2 - 2020-08-19 11:29:09 +02:00
Miriam Baglioni f81ee22418 changed to mirror the changes in the model (Instance, CommunityInstance, GraphResult) 2020-08-19 11:28:26 +02:00
Miriam Baglioni 387be43fd4 changed to discriminate whether to dump all the result types together or each one in its own archive 2020-08-19 11:25:27 +02:00
Miriam Baglioni c5858afb88 added parameter to guide the dump for the result (resultAggregation): true if all the result types should be dumped together, false otherwise. 2020-08-19 11:24:14 +02:00
Miriam Baglioni d407852ac2 changed to reflect the changed in the model 2020-08-19 11:15:05 +02:00
Miriam Baglioni 47c21a8961 refactoring due to compilation 2020-08-19 11:11:57 +02:00
Miriam Baglioni 5570678c65 changed parameter name from hfdsNameNode to nameNode 2020-08-19 10:59:26 +02:00
Miriam Baglioni dc5096a327 refactoring due to compilation 2020-08-19 10:57:36 +02:00
Miriam Baglioni 1791cf2e78 refactoring due to compilation 2020-08-19 10:14:41 +02:00
Miriam Baglioni c7f944a533 refactoring due to compilation 2020-08-19 10:01:26 +02:00
Miriam Baglioni 55e24c2547 relclass for relation and corresponding values have been put to lower case (isSupplementedBy was written as IsSupplementedBy - orcid propagation) 2020-08-18 16:42:08 +02:00
Miriam Baglioni f44dd5d886 changed in the mapping the relation semantic name as it will be visible in the relClass of Relation: from IsSupplementedBy to isSupplementedBy 2020-08-17 17:15:09 +02:00
Miriam Baglioni bc6b5d5b34 removed leftover parameter 2020-08-15 11:22:35 +02:00
Miriam Baglioni 200cd5c730 removed leftover parameter 2020-08-15 11:22:19 +02:00
Miriam Baglioni 96600ed04a modified test resource for mirroring the deletion of affiliation from author parameters 2020-08-14 20:41:49 +02:00
Miriam Baglioni 09f5b92763 added specific reference to class 2020-08-14 20:00:09 +02:00
Miriam Baglioni 37e7c43652 changed parameter name from hdfsNaemNode to nameNode 2020-08-14 18:18:25 +02:00
Claudio Atzori 5b994d7ccf Merge branch 'dump' of https://code-repo.d4science.org/miriam.baglioni/dnet-hadoop into resolve_conflicts_pr40_dump 2020-08-14 15:32:29 +02:00
Miriam Baglioni de995970ea try again to solve clash with master 2020-08-14 15:24:36 +02:00
Miriam Baglioni 5040d72d5e changed to make it equal to master branch 2020-08-14 15:20:17 +02:00
Miriam Baglioni be8106c339 added space to avoid conflicts with master branch 2020-08-14 15:16:27 +02:00
Claudio Atzori dc8b545fad Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-08-14 11:19:07 +02:00
Claudio Atzori 1871d1c6f6 solve error java.lang.NoSuchFieldError: INSTANCE when instantiating Solr client 2020-08-14 11:18:30 +02:00
Claudio Atzori f25438b07e solve error java.lang.NoSuchFieldError: INSTANCE when instantiating Solr client 2020-08-14 10:56:35 +02:00
Miriam Baglioni d2a8a4961a refactoring 2020-08-13 18:50:33 +02:00
Miriam Baglioni a5043de5da added method to get the mapped instance 2020-08-13 18:45:50 +02:00
Miriam Baglioni b7e49aee8d removed commented code 2020-08-13 18:44:07 +02:00
Miriam Baglioni e60a9e49b0 added descriptions 2020-08-13 18:23:24 +02:00
Miriam Baglioni 7ae1f76a6f added descriptions 2020-08-13 17:17:49 +02:00
Miriam Baglioni 3f9c0b897d added descriptions 2020-08-13 17:06:49 +02:00
Miriam Baglioni 2c62da288e added descriptions 2020-08-13 17:05:06 +02:00
Miriam Baglioni f439a6231e added missing constraint in XQuery (verify the status of the RC/RI different from hidden) 2020-08-13 15:30:55 +02:00
Miriam Baglioni 0fe800b1c9 modified because of D-Net/dnet-hadoop#40#issuecomment-1902 2020-08-13 15:17:12 +02:00
Miriam Baglioni 270c89489c fixed issue created while renaming subject to subjects in community configuration xml 2020-08-13 15:16:04 +02:00
Miriam Baglioni fcd10f452c changed because of D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:55:32 +02:00
Miriam Baglioni fd48ae3b85 changed because of D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:19:15 +02:00
Miriam Baglioni 04a3e1ab38 disabled tests 2020-08-13 12:18:13 +02:00
Miriam Baglioni 2ede397933 Apply change because of D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:16:39 +02:00
Miriam Baglioni bfd1fcde6d removed a useless method and changed because of D-Net/dnet-hadoop#40 (comment) and D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:14:37 +02:00
Miriam Baglioni 7fd8397123 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:13:15 +02:00
Miriam Baglioni 753d448cc9 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:12:58 +02:00
Miriam Baglioni c0e071fa26 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:12:40 +02:00
Miriam Baglioni 526db915bc apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:12:16 +02:00
Miriam Baglioni b0fab0d138 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:11:57 +02:00
Miriam Baglioni 1b6320b251 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:11:41 +02:00
Miriam Baglioni 743d31be22 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:11:22 +02:00
Miriam Baglioni 65b48df652 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:11:06 +02:00
Miriam Baglioni 90b54d3efb apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:08:24 +02:00
Miriam Baglioni 69bbb9592a apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:07:39 +02:00
Miriam Baglioni 945323299a apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:07:24 +02:00
Miriam Baglioni e04c993247 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:07:07 +02:00
Miriam Baglioni ed0812d0ce apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:06:49 +02:00
Miriam Baglioni d55cfe0ea5 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:06:20 +02:00
Miriam Baglioni 80866bec7d apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:06:05 +02:00
Miriam Baglioni 1400978c0a apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:05:44 +02:00
Miriam Baglioni 7b941a2e0a apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:05:17 +02:00
Miriam Baglioni f7474f50fe apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:04:52 +02:00
Miriam Baglioni 367203f412 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:04:33 +02:00
Miriam Baglioni 3ab4809d31 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:04:10 +02:00
Miriam Baglioni bcfbf7e5ef apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:03:33 +02:00
Miriam Baglioni 23b7c99164 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:01:45 +02:00
Miriam Baglioni d1c78e09a8 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:01:14 +02:00
Miriam Baglioni f7ce7a5be8 apply changes in D-Net/dnet-hadoop#40 (comment) 2020-08-13 12:00:37 +02:00
Miriam Baglioni 14ef4da2de removed affiliation from author as for D-Net/dnet-hadoop#40 (comment) 2020-08-13 11:59:51 +02:00
Miriam Baglioni 02a4986e7b Applying changes from code reviews D-Net/dnet-hadoop#40 (comment) and D-Net/dnet-hadoop#40 (comment) and D-Net/dnet-hadoop#40 (comment) 2020-08-13 11:53:01 +02:00
Miriam Baglioni 33a6a51333 Disabled Test (impossible to publish without accessToken). And applying changes from code review D-Net/dnet-hadoop#40 (comment) 2020-08-13 11:48:32 +02:00
Miriam Baglioni 235d4e4d6e moved Context as relevant for Communities dump 2020-08-12 18:16:45 +02:00
Miriam Baglioni 4871dfe595 merge branch with master 2020-08-12 10:05:30 +02:00
Miriam Baglioni 6953bdcbfb merge upstream 2020-08-12 10:05:19 +02:00
Miriam Baglioni adf9f96a67 test for extraction of relation between organizations and context 2020-08-12 10:04:47 +02:00
Miriam Baglioni 7400cd019d removed not needed variable 2020-08-12 10:03:33 +02:00
Miriam Baglioni 98d28bab5c fixed missing _ in context nsprefix 2020-08-12 10:00:18 +02:00
Miriam Baglioni 306603272e removed access token 2020-08-12 09:39:58 +02:00
Miriam Baglioni 8f48cb29f4 changed resource because of a change in the XQuery that returned the XML to be parsed. The main Zenodo community is no longer a separate element, but part of the <zenodocommunities> element 2020-08-11 18:04:38 +02:00
Miriam Baglioni c3672b162b merge branch with master 2020-08-11 17:53:04 +02:00
Miriam Baglioni a16bbf3202 changed test resource to mirror a change in the XQuery that produced the data to be parsed. The main Zenodo community is no longer provided in a different element, but is part of the <zenodocommunities> 2020-08-11 17:48:44 +02:00
Miriam Baglioni 25f4fbceea draft of test and resources 2020-08-11 17:37:22 +02:00
Miriam Baglioni 30a2b19b65 changed metadata for deposition of covid-19 dump in Zenodo 2020-08-11 17:36:56 +02:00
Claudio Atzori f7cc52ab02 Merge pull request 'enrichment_wfs' (#39) from enrichment_wfs into master
LGTM
2020-08-11 17:26:13 +02:00
Claudio Atzori facca1c574 Merge pull request 'enrichment steps' (#38) from miriam.baglioni/dnet-hadoop:master into enrichment_wfs 2020-08-11 16:40:26 +02:00
Miriam Baglioni 10e2af3d3b - 2020-08-11 16:08:06 +02:00
Miriam Baglioni 49788b532a changed to mirror changes in the schema 2020-08-11 16:05:03 +02:00
Miriam Baglioni b08511287b - 2020-08-11 16:01:36 +02:00
Miriam Baglioni 7e81a17068 changed the XQUERY to mirror the change in the code 2020-08-11 16:00:33 +02:00
Miriam Baglioni 37ad2f28e9 removed the spurious '|' added in the prefix for datasource 2020-08-11 15:55:06 +02:00
Miriam Baglioni f31c2e9461 enabled test 2020-08-11 15:49:25 +02:00
Miriam Baglioni 2d67476417 merge branch with master 2020-08-11 15:46:04 +02:00
Miriam Baglioni 77a390878c merge upstream 2020-08-11 15:45:48 +02:00
Miriam Baglioni 6d3804e24c - 2020-08-11 15:45:12 +02:00
Miriam Baglioni 0603ec4757 changed test to upload the dump for covid-19 community 2020-08-11 15:43:25 +02:00
Miriam Baglioni 7dfd56df9d - 2020-08-11 15:42:35 +02:00
Miriam Baglioni a169d7e7c1 added test file for the MakeTar class 2020-08-11 15:40:41 +02:00
Miriam Baglioni acb0926b2e json schemas for the dumped entities and relation 2020-08-11 15:39:48 +02:00
Miriam Baglioni ff52c51f92 added the communityMapPath parameter and removed the isLookUpUrl parameter 2020-08-11 15:39:22 +02:00
Miriam Baglioni 6f43acda5e added the maketar and send to zenodo step. Adjusted wf parameters 2020-08-11 15:38:20 +02:00
Miriam Baglioni ddc19de2e9 removed the isLookUpUrl among the parameters 2020-08-11 15:37:47 +02:00
Miriam Baglioni 592a8ea573 added parameter file for maketar class 2020-08-11 15:37:14 +02:00
Miriam Baglioni 77a0951b32 added the make archive step in the workflow 2020-08-11 15:32:32 +02:00
Miriam Baglioni cf4d918787 added description, changed parameter name and added method 2020-08-11 15:27:31 +02:00
Miriam Baglioni dc5fc5366d Creation of an archive for each related dump part 2020-08-11 15:26:06 +02:00
Miriam Baglioni 0ce49049d6 added description 2020-08-11 15:25:11 +02:00
Miriam Baglioni ecd2081f84 refactoring 2020-08-11 14:17:31 +02:00
Miriam Baglioni 9bae991167 added description of the class 2020-08-11 11:20:43 +02:00
Miriam Baglioni 341dc59ead removed the repartition(1). Added code for the creation of an archive containing all the parts dumped for each community 2020-08-11 11:18:58 +02:00
Sandro La Bruzzo fe8d640aee fixed error on oozie workflow 2020-08-11 09:43:03 +02:00
Sandro La Bruzzo 304590e854 updated workflow of indexing to start from begin 2020-08-11 09:17:47 +02:00
Sandro La Bruzzo eaf0dc68a2 fixed indexing 2020-08-11 09:17:03 +02:00
Miriam Baglioni 1991a49f70 removed reference to isLookUp to get the communityMap 2020-08-10 18:02:56 +02:00
Miriam Baglioni c378c38546 disabled test. The testing functionalities for the upload in Zenodo are moved to common 2020-08-10 12:41:11 +02:00
Miriam Baglioni 63ad0ed209 changed to use communityMapPath instead of IsLookUp 2020-08-10 12:40:19 +02:00
Miriam Baglioni cec795f2ea changed resources to mirror changes in the model 2020-08-10 12:39:35 +02:00
Miriam Baglioni f50e3e7333 changed the class for which to generate the schema 2020-08-10 12:03:49 +02:00
Miriam Baglioni b8c26f656c test using communityMapPath instead of isLookUp 2020-08-10 12:02:55 +02:00
Miriam Baglioni fe88904df0 changed the wf definition 2020-08-10 12:01:14 +02:00
Miriam Baglioni a6df01d329 - 2020-08-10 11:59:10 +02:00
Miriam Baglioni 87856467e2 removed isLookUpUrl and added code to read the communitymap from HDFS 2020-08-10 11:38:41 +02:00
Miriam Baglioni 1cf7043e26 removed isLookUpUrl from the parameters 2020-08-10 11:38:03 +02:00
Claudio Atzori cf6b68ce5a Merge pull request 'data provision workflow: add nodes to perform DELETE BY QUERY before the indexing begins and COMMIT after the indexing is completed' (#36) from provision_indexing into master 2020-08-10 11:16:29 +02:00
Sandro La Bruzzo 0ade33ad15 updated mergeFrom function for DLI Unknown 2020-08-10 10:18:35 +02:00
Miriam Baglioni 46986aae2d added the new parameters for new deposition/new version and concept_record_id 2020-08-07 18:00:06 +02:00
Miriam Baglioni 3aedfdf0d6 added option to do a new deposition or new version of an old deposition 2020-08-07 17:49:14 +02:00
Miriam Baglioni 1b3ad1bce6 filter out author pids (keep only orcid). Added check to get unique provenance for context id. Filter out countries with code UNKNOWN 2020-08-07 17:48:18 +02:00
Miriam Baglioni 5ceb8c5f0a moved constants from graph.Constants 2020-08-07 17:46:47 +02:00
Miriam Baglioni 6c65c93c0e refactoring 2020-08-07 17:45:35 +02:00
Miriam Baglioni 68adf86fe4 refactoring 2020-08-07 17:43:20 +02:00
Miriam Baglioni 26d2ad6ebb refactoring 2020-08-07 17:41:56 +02:00
Miriam Baglioni 9675af7965 refactoring 2020-08-07 17:41:07 +02:00
Miriam Baglioni 346a91f4d9 Added constants 2020-08-07 17:35:39 +02:00
Miriam Baglioni d52b0e1797 no use of IsLookUp. The query is done once and its result stored on HDFS. The path to the result is given instead of the isLookUpUrl 2020-08-07 17:34:40 +02:00
Miriam Baglioni ae1b7fbfdb changed method signature from set of mapkey entries to String representing path on file system where to find the map 2020-08-07 17:32:27 +02:00
Miriam Baglioni 931fa2ff00 removed dependencies 2020-08-07 16:46:37 +02:00
Miriam Baglioni bcc70dce5e added dependency to POM 2020-08-07 16:46:18 +02:00
Miriam Baglioni 7e54b2189c resources for the test of the automatic deposition in Zenodo 2020-08-07 16:45:59 +02:00
Miriam Baglioni d161fa2aeb test for the automatic deposition in Zenodo 2020-08-07 16:45:28 +02:00
Miriam Baglioni 545ea9f77e moved in common. Zenodo response model and APIClient to deposit in Zenodo 2020-08-07 16:44:51 +02:00
Miriam Baglioni b6b5498a9a removed the fulltext element 2020-08-07 15:29:32 +02:00
Miriam Baglioni 150dd27431 changed the name of the pid: from pid to id 2020-08-07 15:29:13 +02:00
Miriam Baglioni 34973615ec changed the field pid: kept only if the author has an orcid 2020-08-07 15:28:52 +02:00
Sandro La Bruzzo ddb1446ceb fixed test 2020-08-07 11:34:33 +02:00
Sandro La Bruzzo 718bc7bbc8 implemented provision workflows using the new implementation with Dataset 2020-08-07 11:05:18 +02:00
Miriam Baglioni da9b012c15 fixed description 2020-08-06 11:55:44 +02:00
Miriam Baglioni 6dbadcf181 the new schema for the dumped result 2020-08-06 11:05:56 +02:00
Sandro La Bruzzo a44e5abaa7 reformat code 2020-08-06 10:30:22 +02:00
Sandro La Bruzzo 4fb1821fab Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-08-06 10:28:31 +02:00
Sandro La Bruzzo 9d9e9edbd2 improved extractEntity Relation workflows using dataset 2020-08-06 10:28:24 +02:00
Miriam Baglioni adf0ca5aa7 test to send input stream from hdfs 2020-08-05 14:24:43 +02:00
Miriam Baglioni 14eda4f46e added method to try to put inputstream to zenodo 2020-08-05 14:18:25 +02:00
Miriam Baglioni e737a47270 added classes to try to send input stream to zenodo for the upload 2020-08-05 14:17:40 +02:00
Miriam Baglioni 873e9cd50c changed hadoop setting to connect to s3 2020-08-04 15:37:25 +02:00
Alessia Bardi a29565ff57 code formatting 2020-08-04 12:55:27 +02:00
Alessia Bardi 01db29e208 fixes redmine issue #5846: datacite and its different namespace declarations 2020-08-04 12:53:48 +02:00
Alessia Bardi b4e4e5f858 do not duplicate result PIDs 2020-08-04 12:52:14 +02:00
Alessia Bardi 09a323d18d testing a dataset from Nakala 2020-08-04 12:50:52 +02:00
Alessia Bardi c35bf486cc added handle among the possible PIDs 2020-08-04 12:50:12 +02:00
Miriam Baglioni 5b651abf82 merge branch with master 2020-08-04 10:14:07 +02:00
Miriam Baglioni 88e4c3b751 added default trust to context bulktagged 2020-08-04 10:13:25 +02:00
Miriam Baglioni f9342cb484 added constant 2020-08-03 18:32:35 +02:00
Miriam Baglioni 96c3c891f4 added trust 2020-08-03 18:32:17 +02:00
Miriam Baglioni 53656600ad changed XQuery to select only community and ri with status not hidden 2020-08-03 18:29:30 +02:00
Miriam Baglioni b34177d8ef merge upstream 2020-08-03 18:13:42 +02:00
Miriam Baglioni 901ae37f7b added step to workflow 2020-08-03 18:12:54 +02:00
Miriam Baglioni fa38cdb10b added resource 2020-08-03 18:11:12 +02:00
Miriam Baglioni e9fcc0b2f1 commented out unit test - the changes needed to mirror the new logic are still to be decided 2020-08-03 18:10:53 +02:00
Miriam Baglioni e43aeb139a added new property file and changed some parameters in old files 2020-08-03 18:07:28 +02:00
Miriam Baglioni aa9f3d9698 changed logic to save directly to s3 2020-08-03 18:06:18 +02:00
Miriam Baglioni 627c1dc73a added new field and doc 2020-08-03 18:05:41 +02:00
Miriam Baglioni d465f0eec9 added fulltext to result 2020-08-03 18:03:27 +02:00
Miriam Baglioni ec4b392d12 added new dependencies for writing on s3 2020-08-03 17:57:04 +02:00
Miriam Baglioni c892c7dfa7 changed to query for community map just once and save the result for remaining executions 2020-08-03 17:56:31 +02:00
Claudio Atzori 3a11a387a9 data provision workflow enhancement: added nodes to perform DELETE BY QUERY before the indexing begins and COMMIT after the indexing is completed 2020-08-03 14:28:08 +02:00
Alessia Bardi 8cc067fe76 specific test for claims 2020-08-03 11:17:50 +02:00
Claudio Atzori a89b6cc3ba Merge pull request 'nsprefix_blacklist' (#34) from nsprefix_blacklist into master 2020-07-31 11:52:23 +02:00
Sandro La Bruzzo 0c3bc9ea4b Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-31 09:07:18 +02:00
Sandro La Bruzzo 168bfb496a adapted dedup to the new schema 2020-07-31 09:06:57 +02:00
Michele Artini 652b13abb6 Merge branch 'master' into nsprefix_blacklist 2020-07-31 07:58:37 +02:00
Enrico Ottonello 0377b40fba output to one parquet file 2020-07-30 18:38:07 +02:00
Claudio Atzori cd631bb5bc fixed defaults in the cleaning workflow: result.publisher is forced to NULL when result.publisher.value is empty 2020-07-30 17:03:53 +02:00
Miriam Baglioni 872d7783fc - 2020-07-30 16:50:36 +02:00
Miriam Baglioni 3a0f5f1e1b added static newInstance method 2020-07-30 16:49:09 +02:00
Miriam Baglioni e163b1df79 removed the duration attribute (most of the time it is set to 0) 2020-07-30 16:45:03 +02:00
Miriam Baglioni 57c87b7653 re-implemented to fix issue with a non-serializable Set<String> variable 2020-07-30 16:43:43 +02:00
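For context, a frequent cause of this kind of failure in Spark jobs is capturing a non-serializable collection view in a closure; whether that was the exact culprit here isn't stated in the log, but the usual remedy looks like this sketch:

```
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class SerializableSetFix {
    // Map.keySet() returns a live view that is NOT Serializable; capturing it
    // in a Spark lambda fails at task-serialization time. Copying the keys
    // into a HashSet (which is Serializable) is the standard fix.
    static Set<String> serializableKeys(Map<String, String> communityMap) {
        return new HashSet<>(communityMap.keySet());
    }
}
```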
Miriam Baglioni ef8e5957b5 added specific directory where to save results 2020-07-30 16:42:46 +02:00
Miriam Baglioni 75f3361c85 - 2020-07-30 16:41:31 +02:00
Miriam Baglioni 3f695b25fa refactoring 2020-07-30 16:40:15 +02:00
Miriam Baglioni e623f12bef refactoring 2020-07-30 16:32:59 +02:00
Miriam Baglioni ff7d05abb4 added support class to store the pair (organizationId, representativeId) obtained from the SQL query on Hive 2020-07-30 16:32:04 +02:00
Miriam Baglioni cf6d80b2ab added command to close the writer 2020-07-30 16:31:22 +02:00
Miriam Baglioni f985bca37b added USER_CLAIM constant value 2020-07-30 16:25:26 +02:00
Claudio Atzori 4bbfcf1ac6 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-07-30 16:25:06 +02:00
Claudio Atzori 4ff8007518 added function to set the missing vocabulary names, used in the cleaning workflow as a pre-cleaning step 2020-07-30 16:24:39 +02:00
Miriam Baglioni 6f1c40a933 - 2020-07-30 16:24:28 +02:00
Miriam Baglioni 2b66a93f9e added property file that was missing 2020-07-30 16:24:17 +02:00
Michele Artini bdece15ca0 blacklist of nsprefix 2020-07-30 16:13:38 +02:00
Enrico Ottonello 196f36c6ed fix publication dataset creation 2020-07-30 13:38:33 +02:00
Sandro La Bruzzo c97c8f0c44 implemented new oozie job to extract entities in a separate dataset 2020-07-30 12:13:58 +02:00
Sandro La Bruzzo 3010a362bc updated the provision workflow in the aggregation phase: removed serialization to JSON RDD and used Spark Dataset 2020-07-30 09:25:56 +02:00
Sandro La Bruzzo 487226f669 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-30 09:25:39 +02:00
Sandro La Bruzzo 16ae3c9ccf updated the provision workflow in the aggregation phase: removed serialization to JSON RDD and used Spark Dataset 2020-07-30 09:25:32 +02:00
Miriam Baglioni ee8420c6b3 added resource for datasource test 2020-07-29 18:28:43 +02:00
Miriam Baglioni 76bcab98ce added code to filter out null originalId from the dump 2020-07-29 18:28:21 +02:00
Miriam Baglioni ef1d8aef17 added one test to verify the dump for the datasources 2020-07-29 18:27:46 +02:00
Miriam Baglioni 86bab79512 - 2020-07-29 18:20:22 +02:00
Miriam Baglioni 31791dcf3d fixed wrong property file path name 2020-07-29 18:20:08 +02:00
Miriam Baglioni 0f60436e52 - 2020-07-29 18:00:18 +02:00
Miriam Baglioni 9e722aa1ef - 2020-07-29 18:00:08 +02:00
Miriam Baglioni d22f106f27 added constant to identify datasource associated to funders 2020-07-29 17:56:55 +02:00
Miriam Baglioni 40e194fe2f added check to not dump datasources related to funders 2020-07-29 17:56:18 +02:00
Miriam Baglioni b48934f6df changed the workflow name 2020-07-29 17:43:43 +02:00
Miriam Baglioni 1433db825d refactoring 2020-07-29 17:43:24 +02:00
Miriam Baglioni 074e9ab75e refactoring 2020-07-29 17:42:50 +02:00
Miriam Baglioni 8ad8dac7d4 merge branch with fork master 2020-07-29 17:38:28 +02:00
Miriam Baglioni 9e997e63a2 merge upstream 2020-07-29 17:38:14 +02:00
Miriam Baglioni 9fa82dc93b fixed issue 2020-07-29 17:36:16 +02:00
Miriam Baglioni 8907648d6a - 2020-07-29 17:35:47 +02:00
Miriam Baglioni 536e7f6352 added and changed resources for testing of the whole graph dump and of community related products dumps 2020-07-29 17:33:34 +02:00
Miriam Baglioni 4d7f590493 tests for the whole graph dump 2020-07-29 17:32:37 +02:00
Miriam Baglioni a2f73ec2c7 changed due to changes in the model 2020-07-29 17:32:02 +02:00
Miriam Baglioni 481585e9d3 - 2020-07-29 17:31:41 +02:00
Miriam Baglioni 40a8dafbdc - 2020-07-29 17:30:44 +02:00
Miriam Baglioni de2ebb467e changed due to changes in the model 2020-07-29 17:08:02 +02:00
Miriam Baglioni d0ff2a56fb - 2020-07-29 17:06:53 +02:00
Miriam Baglioni b96dedb56b changed due to changes in the model 2020-07-29 17:05:31 +02:00
Miriam Baglioni 6d0f08277b classes to implement the dump of the whole graph. 2020-07-29 17:03:19 +02:00
Miriam Baglioni 8d4327b292 input parameters and workflow definition for the dump of the whole graph 2020-07-29 17:00:34 +02:00
Miriam Baglioni b5f995ab12 refactoring 2020-07-29 16:59:48 +02:00
Miriam Baglioni f7a87cc447 added new constants value 2020-07-29 16:58:40 +02:00
Miriam Baglioni 6b63668d3f - 2020-07-29 16:57:45 +02:00
Miriam Baglioni dac9f7cc10 refactoring 2020-07-29 16:56:46 +02:00
Miriam Baglioni 770b71b14a Added new model class to dump Datasources 2020-07-29 16:55:39 +02:00
Miriam Baglioni c760d16936 removed the collectedfrom element from the entities 2020-07-29 16:55:08 +02:00
Miriam Baglioni 6a499c6b7a added hashCode to relation to avoid producing multiple identical relations 2020-07-29 16:54:16 +02:00
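The standard contract behind this fix: equals and hashCode defined over the fields that identify a relation, so hash-based distinct operations can collapse duplicates. A sketch with illustrative field names (not necessarily those of the actual dump model):

```
import java.util.Objects;

public class Relation {
    private String source;
    private String target;
    private String relType; // illustrative identifying fields

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Relation)) return false;
        Relation r = (Relation) o;
        return Objects.equals(source, r.source)
            && Objects.equals(target, r.target)
            && Objects.equals(relType, r.relType);
    }

    @Override
    public int hashCode() {
        // Must be consistent with equals, or distinct()/HashSet deduplication breaks.
        return Objects.hash(source, target, relType);
    }
}
```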
Miriam Baglioni b71d12cf26 refactoring 2020-07-29 16:52:44 +02:00
Miriam Baglioni a8d65b68cb changed to remove the part that checked whether it was a test or a real execution 2020-07-29 16:47:57 +02:00
Miriam Baglioni 3ec2392904 Added new class to relocate where the split is effectively run 2020-07-29 16:46:50 +02:00
Michele Artini 8ba94833bd added an es prop 2020-07-29 14:16:08 +02:00
Miriam Baglioni 178c2729a7 changed the path to reach the java class to be executed 2020-07-29 12:29:51 +02:00
Miriam Baglioni 437ac12139 removed unused parameter 2020-07-29 12:28:16 +02:00
Enrico Ottonello c82b15b5f4 migrate configuration to ocean, fix publication dataset creation 2020-07-28 15:23:52 +02:00
Claudio Atzori 6f11c0496e fixed typo in module name dhp-worfklow-profiles -> dhp-workflow-profiles 2020-07-28 15:01:58 +02:00
Claudio Atzori f680eb3e12 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-07-28 14:10:56 +02:00
Claudio Atzori 985b360c31 fixed typo in module name dhp-worfklow-profiles -> dhp-workflow-profiles 2020-07-28 14:10:52 +02:00
Claudio Atzori 7fc27bfdd1 Merge pull request 'islookup_timeout' (#30) from islookup_timeout into master
Thanks, Michele!
2020-07-28 13:53:12 +02:00
Michele Artini 3acd632123 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-28 12:02:30 +02:00
Michele Artini 35e6e9c064 tests 2020-07-28 12:02:15 +02:00
Enrico Ottonello a6acb37689 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-07-28 08:07:40 +02:00
Claudio Atzori 2c4196ab22 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into islookup_timeout 2020-07-27 17:40:58 +02:00
Claudio Atzori ee832f358e Merge pull request 'stats_wf_extensions_and_corrections' (#28) from spyros/dnet-hadoop:stats_wf_extensions_and_corrections into master
Thank you Guys! The update workflow will be made available to the beta & production orchestration systems under the HDFS path

```/lib/dnet/oa/graph/stats/oozie_app```
2020-07-27 16:02:03 +02:00
Antonis Lempesis 4ac8ebe427 correctly calculating the project duration 2020-07-24 19:50:40 +03:00
Antonis Lempesis 18d9464b52 creating shadow db only if it does not exist... 2020-07-24 19:50:40 +03:00
Antonis Lempesis e217d496ab added the dest db... 2020-07-24 19:50:40 +03:00
Antonis Lempesis b16bb68b9f added the target db name... 2020-07-24 19:50:40 +03:00
Antonis Lempesis 1ee7eeedf3 added the source db name... 2020-07-24 19:50:40 +03:00
Antonis Lempesis cecbbfa0fc added missing tables and views: contexts, creation_date, funder 2020-07-24 19:50:40 +03:00
Antonis Lempesis 25b7a615f5 moved datasource_sources table creation into the datasource section 2020-07-24 19:50:40 +03:00
Antonis Lempesis a8da4ab9c0 years in projects are now integers 2020-07-24 19:50:40 +03:00
Antonis Lempesis c9cfc165d9 not using impala since the resulting tables are not visible 2020-07-24 19:50:40 +03:00
Antonis Lempesis dd3d6a6e15 compute stats for the used and new impala tables 2020-07-24 19:50:40 +03:00
Antonis Lempesis e6f50de6ef Separated impala from hive steps 2020-07-24 19:50:40 +03:00
Antonis Lempesis de49173420 fixed a typo in queries 2020-07-24 19:50:40 +03:00
antleb 391cf80fb8 Added peer-reviewed, green, gold tables and fields in result. Added shortcuts from result-country 2020-07-24 19:50:40 +03:00
antleb 68389d0125 Corrected the script used by the last step of the wf 2020-07-24 19:50:40 +03:00
antleb ec52141f1a changed refereed type from value to classname 2020-07-24 19:50:40 +03:00
Spyros Zoupanos 63cd797aba Comment out step 15 to make it work with the new schema of Claudio 2020-07-24 19:50:40 +03:00
Spyros Zoupanos 138c6ddffa Insert statement to datasource table that takes into account the piwik_id of the openAIRE graph 2020-07-24 19:50:40 +03:00
Spyros Zoupanos 3630794cef Fix to consider the relationships that have been 'virtually deleted' for project_results - defect #5607 2020-07-24 19:50:40 +03:00
Spyros Zoupanos 5546f29e63 Corrections on the shadow schema and the impala table stats calculation 2020-07-24 19:50:40 +03:00
Spyros Zoupanos adf8a025d2 Adding more relations (Sources, Licences, Additional) and shadow schema as provided and discussed with Antonis Lempesis 2020-07-24 19:50:40 +03:00
Spyros Zoupanos 657a40536b Corrections by Spyros: script cleanup, corrections and re-arrangement 2020-07-24 19:50:40 +03:00
Giorgos Alexiou 477fa6234d Script re-organisation and adding table invalidations needed for impala 2020-07-24 19:50:40 +03:00
Miriam Baglioni 6c2223d1fc added code to get the openaire id for contexts 2020-07-24 17:30:15 +02:00
Miriam Baglioni afd54c1684 removed unneeded upload and refactoring 2020-07-24 17:28:56 +02:00
Miriam Baglioni 7b0569d989 changed to map also the result associated to the whole graph 2020-07-24 17:28:11 +02:00
Miriam Baglioni 082225ad61 - 2020-07-24 17:27:26 +02:00
Miriam Baglioni 968c59d97a added the logic to dump also the products for the whole graph. They will miss collectedfrom and context information, which will be materialized as new relations 2020-07-24 17:25:19 +02:00
Miriam Baglioni 00f2b8410a changed the definition of the model to insert provenance information into some classes 2020-07-24 17:23:57 +02:00
Miriam Baglioni c0f3059676 added needed structures to ModelSupport 2020-07-24 17:22:39 +02:00
Miriam Baglioni 332258d199 split the classes related to the communities dump and to the whole graph dump 2020-07-24 17:21:48 +02:00
Claudio Atzori 56bbfdc65d introduced parameter 'numParitions', driving the hive DB table data partitioning. Currently specified only for table 'project' 2020-07-23 08:54:10 +02:00
Sandro La Bruzzo 9ab594ccf6 fixed test 2020-07-21 10:36:21 +02:00
Claudio Atzori ebf60020ac map results as OPRs when //CobjCategory/@type is missing and the vocabulary dnet:result_typologies doesn't resolve the super type 2020-07-20 19:01:10 +02:00
Miriam Baglioni 355d7e426e added dump for project - not finished 2020-07-20 18:54:43 +02:00
Miriam Baglioni a2f01e5259 added getter and setter 2020-07-20 18:54:17 +02:00
Miriam Baglioni 40bbe94f7c merge with master fork 2020-07-20 18:10:03 +02:00
Miriam Baglioni 2a15494b16 merge upstream 2020-07-20 18:05:01 +02:00
Miriam Baglioni 23160b4d29 realignment of the workflow classes with the changes in the structure of the module 2020-07-20 18:04:30 +02:00
Miriam Baglioni b904e0699a - 2020-07-20 18:02:53 +02:00
Miriam Baglioni 3aab7680f6 changed the test results 2020-07-20 18:00:43 +02:00
Miriam Baglioni cde0300801 moved from projects to project 2020-07-20 17:57:35 +02:00
Miriam Baglioni 5076e4f320 changed test to comply with the modifications 2020-07-20 17:55:18 +02:00
Miriam Baglioni 08dbd99455 changed to dump the whole results graph using classes already implemented for communities. Added class to dump also the organization 2020-07-20 17:54:28 +02:00
Miriam Baglioni e47ea9349c extended some types by adding provenance as the pair (provenance, trust) and moved some classes so they can also be used by the complete graph dump 2020-07-20 17:46:27 +02:00
Claudio Atzori 32f5e466e3 imports cleanup 2020-07-20 17:42:58 +02:00
Claudio Atzori 54ac583923 code formatting 2020-07-20 17:37:08 +02:00
Claudio Atzori 124e7ce19c in case of missing attribute //dr:CobjCategory/@type the resulttype is derived by looking up the vocabulary dnet:result_typologies with the 1st instance type available 2020-07-20 17:33:37 +02:00
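A hedged sketch of the fallback described here, with a plain Map standing in for the dnet:result_typologies vocabulary (the real vocabulary API and the default value are assumptions; the companion commit above maps unresolved cases to OPRs):

```
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class ResultTypeFallback {
    static String resultType(String cobjCategoryType, List<String> instanceTypes,
            Map<String, String> resultTypologies) {
        if (cobjCategoryType != null && !cobjCategoryType.isEmpty()) {
            return cobjCategoryType; // an explicit //dr:CobjCategory/@type wins
        }
        // Otherwise derive the type from the first instance type the vocabulary resolves.
        return instanceTypes.stream()
            .map(resultTypologies::get)
            .filter(Objects::nonNull)
            .findFirst()
            .orElse("otherresearchproduct"); // assumed default, cf. the OPR fallback above
    }
}
```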
Claudio Atzori 050dda223d Merge pull request 'removed duplicated fields' (#25) from unique_field_in_lists into master
Looks good as a temporary workaround. I agree the model could seamlessly make the distinct operation by using HashSets instead of Linked (or Array) Lists.

The task to update the model in such a way is added on #9#issuecomment-1583

Thanks!
2020-07-20 12:12:50 +02:00
Claudio Atzori e0c4cf6f7b added parameter to drive the graph merge strategy: priority (BETA|PROD) 2020-07-20 10:48:01 +02:00
Claudio Atzori 94ccdb4852 Merge branch 'master' into merge_graph 2020-07-20 10:14:55 +02:00
Claudio Atzori 0937c9998f Merge branch 'deduptesting' 2020-07-20 10:00:20 +02:00
Claudio Atzori 105176105c updated dnet-pace-core dependency to version 4.0.4 to include the latest clustering function 2020-07-20 09:59:47 +02:00
Claudio Atzori de72b1c859 cleanup 2020-07-20 09:59:11 +02:00
Michele Artini 331a3cbdd0 fixed originalId 2020-07-20 09:50:29 +02:00
Michele Artini c59c5369b1 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-18 09:40:54 +02:00
Michele Artini 346a1d2b5a update eventId generator 2020-07-18 09:40:36 +02:00
Sandro La Bruzzo 9116d75b3e Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-17 18:01:30 +02:00
Miriam Baglioni d7d84c8217 - 2020-07-17 14:03:23 +02:00
Miriam Baglioni 47c7122773 changed priority from beta to production 2020-07-17 12:56:35 +02:00
Michele Artini 442f30930c removed duplicated fields 2020-07-17 12:25:36 +02:00
Michele Artini 3adedd0a68 trust truncated to 3 decimals 2020-07-17 11:58:11 +02:00
Claudio Atzori 1781609508 code formatting 2020-07-16 19:06:56 +02:00
Claudio Atzori db8b90a156 renamed CORE -> BETA 2020-07-16 19:05:13 +02:00
Miriam Baglioni 44e1c40c42 merge upstream 2020-07-16 18:49:38 +02:00
Claudio Atzori 878f2b931c Merge branch 'master' into merge_graph 2020-07-16 16:34:24 +02:00
Claudio Atzori cc5d13da85 introduced parameter shouldIndex (true|false) 2020-07-16 13:46:39 +02:00
Claudio Atzori b098cc3cbe avoid repeating identical values for fields: source, description 2020-07-16 13:45:53 +02:00
Claudio Atzori 805de4eca1 fix: filter the blocks with size = 1 2020-07-16 10:11:32 +02:00
Claudio Atzori 4b9fb2ffb8 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-07-15 11:26:04 +02:00
Claudio Atzori 5033c25587 code formatting 2020-07-15 11:26:00 +02:00
Claudio Atzori b90389bac4 code formatting 2020-07-15 11:24:48 +02:00
Claudio Atzori 4e6f46e8fa filter blocks with one record only 2020-07-15 11:22:20 +02:00
Michele Artini 262c29463e relations with multiple datasources 2020-07-15 09:18:40 +02:00
Claudio Atzori 7d6e269b40 reverted CreateRelatedEntitiesJob_phase1 to its previous state 2020-07-13 22:54:04 +02:00
Claudio Atzori 8e97598eb4 avoid NPE in case of null instances 2020-07-13 20:46:14 +02:00
Claudio Atzori 06def0c0cb SparkBlockStats allows to repartition the input rdd via the numPartitions workflow parameter 2020-07-13 20:09:06 +02:00
miconis b52c246aed merge done 2020-07-13 19:57:02 +02:00
miconis b8a45041fd minor changes 2020-07-13 19:53:18 +02:00
Claudio Atzori 66f9f6d323 adjusted parameters for the dedup stats workflow 2020-07-13 19:26:46 +02:00
miconis 03ecfa5ebd implementation of the test class for the new block stats spark action 2020-07-13 18:48:23 +02:00
miconis 10e08ccf45 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-13 18:22:45 +02:00
miconis 9258e4f095 implementation of a new workflow to compute statistics on the blocks 2020-07-13 18:22:34 +02:00
Claudio Atzori c6f6fb0f28 code formatting 2020-07-13 16:46:13 +02:00
Claudio Atzori 8d2102d7d2 Merge branch 'deduptesting' 2020-07-13 16:32:43 +02:00
Claudio Atzori 344a90c2e6 updated assertions in propagateRelationTest 2020-07-13 16:32:04 +02:00
Claudio Atzori 1143f426aa WIP SparkCreateMergeRels distinct relations 2020-07-13 16:13:36 +02:00
Claudio Atzori 8c67938ad0 configurable number of partitions used in the SparkCreateSimRels phase 2020-07-13 16:07:07 +02:00
Claudio Atzori c73168b18e Merge branch 'deduptesting' of https://code-repo.d4science.org/D-Net/dnet-hadoop into deduptesting 2020-07-13 15:54:58 +02:00
Claudio Atzori c8284bab06 WIP SparkCreateMergeRels distinct relations 2020-07-13 15:54:51 +02:00
Sandro La Bruzzo 1d133b7fe6 update test 2020-07-13 15:52:41 +02:00
Michele Artini 3635d05061 poms 2020-07-13 15:52:23 +02:00
Claudio Atzori 7dd91edf43 parsing of optional parameter 2020-07-13 15:40:41 +02:00
Claudio Atzori 4c101a9d66 WIP SparkCreateMergeRels distinct relations 2020-07-13 15:31:38 +02:00
Claudio Atzori 8a612d861a WIP SparkCreateMergeRels distinct relations 2020-07-13 15:30:57 +02:00
Sandro La Bruzzo 9ef2385022 implemented test for cut of connected component 2020-07-13 15:28:17 +02:00
Sandro La Bruzzo d561b2dd21 implemented cut of connected component 2020-07-13 14:18:42 +02:00
Miriam Baglioni 8e0e090d7a merge upstream 2020-07-13 12:46:55 +02:00
Claudio Atzori e2093e42db Merge branch 'master' into deduptesting 2020-07-13 10:57:49 +02:00
Michele Artini 2c4ed9a043 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-13 10:55:39 +02:00
Michele Artini ccbe5c5658 fixed import of eu.dnetlib.dhp:dnet-openaire-broker-common 2020-07-13 10:55:27 +02:00
Claudio Atzori 7a3fd9f54c dedup relation aggregator moved into dedicated class 2020-07-13 10:11:36 +02:00
Alessia Bardi 7e96105947 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-07-12 19:29:12 +02:00
Alessia Bardi b7a39731a6 assert, not print 2020-07-12 19:28:56 +02:00
Miriam Baglioni f9ad6f3255 Merge branch 'dump' of code-repo.d4science.org:miriam.baglioni/dnet-hadoop into dump 2020-07-10 19:42:53 +02:00
Miriam Baglioni c27f12d6e8 avoid considering the _SUCCESS file 2020-07-10 19:42:23 +02:00
Claudio Atzori 770adc26e9 WIP aggregator to make relationships unique 2020-07-10 19:35:10 +02:00
Claudio Atzori ecf119f37a Merge branch 'master' into deduptesting 2020-07-10 19:04:16 +02:00
Claudio Atzori 31071e363f Merge branch 'provision_indexing' 2020-07-10 19:03:57 +02:00
Claudio Atzori 06c1913062 added different limits for grouping by source and by target, incremented spark.sql.shuffle.partitions for the join operations 2020-07-10 19:03:33 +02:00
Claudio Atzori cc77446dc4 added dbSchema parameter to the raw_db workflow 2020-07-10 19:01:50 +02:00
Claudio Atzori 4c3836f62e materialize the related entities before joining them 2020-07-10 19:00:44 +02:00
Michele Artini e1ae964bc4 stats 2020-07-10 16:12:08 +02:00
Claudio Atzori 752d28f8eb make the relations produced by the dedup SparkPropagateRelation job unique 2020-07-10 15:09:50 +02:00
Sandro La Bruzzo c01efed79b Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-10 14:44:57 +02:00
Sandro La Bruzzo a7d3977481 added generation of EBI Dataset 2020-07-10 14:44:50 +02:00
Claudio Atzori b21866a2da allow setting different relation cut points by source and by target; adjusted the weight assigned to relationship types 2020-07-10 13:59:48 +02:00
Claudio Atzori ff4d6214f1 experimenting with pruning of relations 2020-07-10 10:06:41 +02:00
Miriam Baglioni faea30cda0 - 2020-07-09 14:05:21 +02:00
Michele Artini 2d742a84ae DedupConfig as json file 2020-07-09 12:53:46 +02:00
Miriam Baglioni a634794242 merge upstream 2020-07-09 11:46:51 +02:00
Michele Artini a44b9b36b9 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-09 11:02:31 +02:00
Michele Artini 1c6a171633 updated pom 2020-07-09 11:02:09 +02:00
Claudio Atzori 3c728aaa0c trying to overcome OOM errors during duplicate scan phase 2020-07-08 22:39:51 +02:00
Claudio Atzori 18c555cd79 Merge branch 'master' into deduptesting 2020-07-08 22:32:01 +02:00
Claudio Atzori 4365cf41d7 trying to overcome OOM errors during duplicate scan phase 2020-07-08 22:31:46 +02:00
Claudio Atzori 67e1d222b6 bulk cleaning: when bestaccessrights is found null or empty, it is set by evaluating the result instances 2020-07-08 17:53:35 +02:00
Alessia Bardi 853e8d7987 test for software merge 2020-07-08 17:03:53 +02:00
Claudio Atzori 610d377d57 first implementation of the BETA & PROD graphs merge procedure 2020-07-08 16:54:26 +02:00
Alessia Bardi 9a898c0e4c Json schema generator 2020-07-08 12:52:00 +02:00
Alessia Bardi 636f9ce7d6 json schema generator lib 2020-07-08 12:50:57 +02:00
Alessia Bardi 8f83b726fa Dump json schema compliant to json schema Draft 7 2020-07-08 12:48:46 +02:00
Claudio Atzori e2ea30f89d updated graph construction workflow definition: cleaning wf moved at the bottom to include cleaning of the information produced by the enrichment workflows 2020-07-08 12:16:24 +02:00
Miriam Baglioni 1b0b968548 fixed issue on substring 2020-07-08 12:11:51 +02:00
Miriam Baglioni 7fe00cb4fb - 2020-07-08 10:29:37 +02:00
Miriam Baglioni 375ef07d7b changed the description for the upload 2020-07-07 18:41:27 +02:00
Miriam Baglioni 35c8265793 added the json extension to the filename 2020-07-07 18:29:49 +02:00
Miriam Baglioni 81434f8e5e added method newInstance 2020-07-07 18:26:10 +02:00
Miriam Baglioni 817cddfc52 - 2020-07-07 18:25:12 +02:00
Miriam Baglioni a66aa9bd83 removed useless tests 2020-07-07 18:25:00 +02:00
Miriam Baglioni 9b20a21b24 removed useless tests 2020-07-07 18:23:37 +02:00
Miriam Baglioni 8a1b42ff21 added check to verify that dump contains at least one product 2020-07-07 18:21:35 +02:00
Miriam Baglioni d86adb82a7 - 2020-07-07 18:20:51 +02:00
Miriam Baglioni b2782025f6 enabled the whole workflow to run. Added property to give priority to a dependency in the classpath - to solve conflicts 2020-07-07 18:10:47 +02:00
Miriam Baglioni 83d2c84b77 added constraints to the xquery to get only profiles with status manager or all 2020-07-07 18:09:48 +02:00
Miriam Baglioni 4c8d86493c - 2020-07-07 18:09:06 +02:00
Miriam Baglioni 0208bc18f3 added new resource for testing 2020-07-07 17:47:24 +02:00
Miriam Baglioni f5bb65c9ef the json schema for the dump of the results 2020-07-07 17:34:40 +02:00
Michele Artini dffa0b01a2 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-07 15:37:29 +02:00
Michele Artini efadbdb2bc fixed a bug with duplicated events 2020-07-07 15:37:13 +02:00
Claudio Atzori 8af8e7481a code formatting 2020-07-07 14:23:34 +02:00
Claudio Atzori b383ed42fa pass optional parameter relationFilter to the PrepareRelationJob implementation 2020-07-07 14:21:28 +02:00
Claudio Atzori 911894a987 Merge branch 'deduptesting' 2020-07-07 14:20:43 +02:00
Miriam Baglioni c19818a3f8 merge branch with fork master 2020-07-06 13:58:23 +02:00
Miriam Baglioni d22240c0ba merge upstream 2020-07-06 13:58:02 +02:00
Enrico Ottonello ca37d3427b separate workflow to parse orcid summaries, activities and generate dataset with no doi publications; test 2020-07-03 23:30:31 +02:00
Michele Artini edf6c6c4dc Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-03 11:48:24 +02:00
Michele Artini 04bebb708c some fixes 2020-07-03 11:48:12 +02:00
Enrico Ottonello 1729cc5cf3 publication conversion from json to oaf test 2020-07-02 18:46:20 +02:00
Claudio Atzori c3d67f709a adjusted dedup configuration for result entities: using new wordssuffixprefix clustering function, removed ngrampairs, adjusted queueMaxSize (800) and slidingWindowSize (80) 2020-07-02 17:35:22 +02:00
Miriam Baglioni f8bf4acd76 - 2020-07-02 16:03:11 +02:00
Miriam Baglioni e6c79d44e6 - 2020-07-02 16:02:02 +02:00
Miriam Baglioni d7f6f0c216 changed code to use another lib 2020-07-02 16:01:34 +02:00
Miriam Baglioni 8fdc9e070c added dependency to OkHttp 2020-07-02 16:01:08 +02:00
Miriam Baglioni 94500a581b merge branch with fork master 2020-07-02 14:25:39 +02:00
Miriam Baglioni c133a23cf0 merge upstream 2020-07-02 14:24:57 +02:00
Claudio Atzori 1d39f7901c Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-07-02 12:45:01 +02:00
Claudio Atzori 0f77cac4b5 fix: deduper must use queueMaxSize instead of groupMaxSize for the block definition 2020-07-02 12:43:51 +02:00
Sandro La Bruzzo 18b9330312 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-02 12:43:19 +02:00
Michele Artini b413db0bff white/blacklists 2020-07-02 12:43:03 +02:00
Claudio Atzori d380b85246 unit test for the preparation of the relations 2020-07-02 12:42:13 +02:00
Claudio Atzori ed1c7e5d75 fixed workflow for the import of the claims alone 2020-07-02 12:40:21 +02:00
Sandro La Bruzzo 07f0723fa7 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-07-02 12:37:49 +02:00
Sandro La Bruzzo 1d420eedb4 added generation of EBI Dataset 2020-07-02 12:37:43 +02:00
Claudio Atzori e4a29a4513 fixed workflow for the import of the claims alone 2020-07-02 12:36:33 +02:00
Enrico Ottonello 5525f57ec8 converter from orcid work json to oaf 2020-07-01 18:36:14 +02:00
Michele Artini 3bcdfbabe9 list with limits 2020-07-01 08:42:39 +02:00
Michele Artini 59a5421c24 indexing, accumulators, limited lists 2020-06-30 16:17:09 +02:00
Enrico Ottonello b7b6be12a5 fixed enriched works generation 2020-06-29 18:03:16 +02:00
Michele Artini 6f13673464 accumulators 2020-06-29 16:33:32 +02:00
Sandro La Bruzzo dab783b173 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-06-29 09:05:00 +02:00
Michele Artini a6ea432435 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-06-29 08:44:20 +02:00
Michele Artini 35ae381d28 all events matchers 2020-06-29 08:43:56 +02:00
Claudio Atzori 7817338e05 added test to verify the relation pre-processing 2020-06-26 17:58:33 +02:00
Enrico Ottonello b2213b6435 merged with dnet version 2020-06-26 17:27:34 +02:00
Enrico Ottonello c5e149c46e Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into orcid-no-doi 2020-06-26 16:15:38 +02:00
Claudio Atzori 8d59fdf34e WIP: dataset based PrepareRelationsJob 2020-06-26 14:32:58 +02:00
Claudio Atzori 74da8a08cf Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop into islookup_timeout 2020-06-26 14:30:07 +02:00
Michele Artini 2393d9da2f limits 2020-06-26 11:20:45 +02:00
Enrico Ottonello d6498278ed added workflow to generate seq(orcidId,work) and seq(orcidId,enrichedWork) 2020-06-25 18:43:29 +02:00
Sandro La Bruzzo 96ce124b59 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-06-25 17:00:43 +02:00
Miriam Baglioni 4a7de07ea2 refactoring 2020-06-25 16:32:40 +02:00
Miriam Baglioni 54a12978d3 fixed issue in xquery 2020-06-25 16:30:20 +02:00
Claudio Atzori 93052ae384 WIP: set the connect & request timeout for BindingProvider service implementation 2020-06-25 16:16:02 +02:00
Michele Artini 408165a756 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-06-25 15:53:35 +02:00
Michele Artini e8fb305f18 compilation of event map 2020-06-25 15:53:20 +02:00
Michele Artini 4eb3e109d7 compilation of event map 2020-06-25 15:45:50 +02:00
Claudio Atzori d839e88783 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-06-25 14:06:30 +02:00
Claudio Atzori 6f5771c1c9 sets author.rank when null 2020-06-25 14:06:21 +02:00
Michele Artini e28033c6d8 some fixes 2020-06-25 13:01:09 +02:00
Claudio Atzori 216975c4ec restored complete provision workflow 2020-06-25 12:55:52 +02:00
Claudio Atzori 2d77d3a388 Merge branch 'master' of https://code-repo.d4science.org/D-Net/dnet-hadoop 2020-06-25 12:54:30 +02:00
Claudio Atzori 93f627ea51 code formatting 2020-06-25 12:54:21 +02:00
Miriam Baglioni 05a99cfb61 change the position of value and description elements in the workflow definition 2020-06-25 12:36:08 +02:00
Claudio Atzori 7df2712824 Merge branch 'provision_indexing' 2020-06-25 12:22:41 +02:00
Claudio Atzori e62333192c WIP: prepare relation job 2020-06-25 12:22:18 +02:00
Claudio Atzori 6933ec11fb WIP: prepare relation job 2020-06-25 11:04:12 +02:00
Sandro La Bruzzo a6c0faac70 added test to verify secondary sorting 2020-06-25 10:48:15 +02:00
Claudio Atzori 69b0391708 WIP: prepare relation job 2020-06-25 10:19:56 +02:00
Michele Artini abcbebcbb4 fixed generation of ids 2020-06-25 09:50:46 +02:00
Michele Artini 77d2a1b1c4 params to choose sql queries for beta or production 2020-06-25 09:28:13 +02:00
Claudio Atzori 46e76affeb WIP: prepare relation job 2020-06-24 19:01:15 +02:00
Claudio Atzori 0e723d378b added default from vocab for missing instance.refereed; remove spurious prefixes from orcid values; WIP: prepare relation job 2020-06-24 18:34:42 +02:00
Enrico Ottonello fcbb4c1489 parser of orcid publication data from xml original dump 2020-06-24 16:29:32 +02:00
Michele Artini 202f6e62ff Split join wf 2020-06-24 15:47:06 +02:00
Sandro La Bruzzo 96689a8994 Merge branch 'master' of code-repo.d4science.org:D-Net/dnet-hadoop 2020-06-24 14:06:50 +02:00
Sandro La Bruzzo 46631a4421 updated mapping scholexplorer to OAF 2020-06-24 14:06:38 +02:00
Michele Artini e53dd62e87 minor changes 2020-06-24 09:24:45 +02:00
Michele Artini 8b9933b934 refactoring aggregators 2020-06-24 08:57:13 +02:00
Miriam Baglioni 3e5570de7a - 2020-06-23 15:44:54 +02:00
Michele Artini d13e3d3f68 fixed paths 2020-06-23 11:01:42 +02:00
Michele Artini 8386c6f90d filter of valid resultResult relations 2020-06-23 10:24:15 +02:00
Michele Artini 38bb45d0b6 test osf:refereed 2020-06-23 10:14:39 +02:00
Michele Artini c3286f4c37 fixed relType 2020-06-23 09:32:32 +02:00
Michele Artini af2f7705fc partial refactoring of some joins 2020-06-23 08:37:35 +02:00
Miriam Baglioni e4b21be004 - 2020-06-22 17:31:50 +02:00
Miriam Baglioni afa19b0c84 changed the way to PUT the files to the rest API 2020-06-22 17:20:07 +02:00
Miriam Baglioni 250fd1c854 merge branch with fork master 2020-06-22 16:25:48 +02:00
Miriam Baglioni df80ae5c1b merge branch with fork master 2020-06-22 10:51:23 +02:00
Miriam Baglioni e8f914f8b3 - 2020-06-22 10:50:41 +02:00
Miriam Baglioni edeb862476 excluded dependency in module that generates conflict 2020-06-22 10:49:56 +02:00
Miriam Baglioni 185facb8e5 changed the deprecated DefaultHttpClient to CloseableHttpClient 2020-06-22 10:49:10 +02:00
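For reference, the general shape of that replacement in Apache HttpClient 4.x (the URL is a placeholder):

```
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class HttpClientDemo {
    public static void main(String[] args) throws Exception {
        // DefaultHttpClient (deprecated since HttpClient 4.3) gives way to a
        // CloseableHttpClient built via HttpClients, managed by try-with-resources.
        try (CloseableHttpClient client = HttpClients.createDefault();
             CloseableHttpResponse response = client.execute(new HttpGet("https://example.org"))) {
            System.out.println(response.getStatusLine().getStatusCode());
        }
    }
}
```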
Miriam Baglioni 669a509430 - 2020-06-19 17:39:46 +02:00
Miriam Baglioni 44a12d244f - 2020-06-18 18:38:54 +02:00
Miriam Baglioni 57b14af231 Merge branch 'd4science' into dump 2020-06-18 14:22:07 +02:00
Miriam Baglioni fb80353018 - 2020-06-18 14:21:36 +02:00
Miriam Baglioni 65bf312360 merge branch with fork master 2020-06-18 11:35:27 +02:00
Miriam Baglioni 3953f56bd3 added dependency to pom 2020-06-18 11:34:47 +02:00
Miriam Baglioni a118b66858 - 2020-06-18 11:34:30 +02:00
Miriam Baglioni f9578312b5 - 2020-06-18 11:34:15 +02:00
Miriam Baglioni 8b145e6aba - 2020-06-18 11:25:28 +02:00
Miriam Baglioni e8b3e972f2 changed the input params and the workflow definition to handle Result as the union of all result product types 2020-06-18 11:25:05 +02:00
Miriam Baglioni 8211cbb9fe extension of Result to contain all the properties owned by any result type 2020-06-18 11:23:52 +02:00
Miriam Baglioni 3233b01089 changes due to adding all the result type under Result 2020-06-18 11:22:58 +02:00
Miriam Baglioni 5c8533d1a1 changed in the testing classes 2020-06-18 11:20:08 +02:00
Miriam Baglioni bc8611a95a added new resources for testing 2020-06-18 11:19:20 +02:00
Miriam Baglioni 9dd3ef22c5 merge branch with fork master 2020-06-15 11:23:26 +02:00
Miriam Baglioni 68cf0fd03f test input 2020-06-15 11:14:42 +02:00
Miriam Baglioni 0467145ae3 test for graph dump 2020-06-15 11:13:51 +02:00
Miriam Baglioni e43eedb5b0 added resources and workflow for dump of community products 2020-06-15 11:13:21 +02:00
Miriam Baglioni f96ca900e1 fixed issues while running on cluster 2020-06-15 11:12:14 +02:00
Miriam Baglioni 56e70573c2 - 2020-06-15 11:06:56 +02:00
Miriam Baglioni 20b9e67728 added new class funder 2020-06-15 11:06:18 +02:00
Miriam Baglioni e145972962 - 2020-06-11 13:08:39 +02:00
Miriam Baglioni a01800224c - 2020-06-11 13:02:04 +02:00
Miriam Baglioni 356dd582a3 map construction moved into a class 2020-06-11 12:59:22 +02:00
Miriam Baglioni db27663750 - 2020-06-11 10:49:01 +02:00
Miriam Baglioni bb9f21d0e7 job test for class producing first step of results dump 2020-06-11 10:20:05 +02:00
Miriam Baglioni 206abba48c merge branch with fork master 2020-06-09 15:41:14 +02:00
Miriam Baglioni a089db18f1 workflow and parameters to execute the dump 2020-06-09 15:39:38 +02:00
Miriam Baglioni 6bbe27587f new classes to execute the dump for products associated with a community, enriching each result with project information and assigning the result to each community it belongs to 2020-06-09 15:39:03 +02:00
Miriam Baglioni 5121cbaf6a new classes for external dump. Only classes functional to dump products 2020-06-09 15:37:46 +02:00
Miriam Baglioni f232db84e9 new classes for external dump. Only classes functional to dump products 2020-06-08 15:11:37 +02:00
886 changed files with 57920 additions and 10679 deletions

View File

@ -15,12 +15,12 @@
<snapshotRepository>
<id>dnet45-snapshots</id>
<name>DNet45 Snapshots</name>
<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-snapshots</url>
<url>https://maven.d4science.org/nexus/content/repositories/dnet45-snapshots</url>
<layout>default</layout>
</snapshotRepository>
<repository>
<id>dnet45-releases</id>
<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-releases</url>
<url>https://maven.d4science.org/nexus/content/repositories/dnet45-releases</url>
</repository>
</distributionManagement>

View File

@ -19,7 +19,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.parentheses_positions_in_switch_statement" value="common_lines"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.indentation.size" value="4"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.parentheses_positions_in_enum_constant_declaration" value="common_lines"/>

View File

@ -6,7 +6,7 @@
<groupId>eu.dnetlib.dhp</groupId>
<artifactId>dhp</artifactId>
<version>1.2.4-SNAPSHOT</version>
<relativePath>../</relativePath>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>dhp-common</artifactId>
@ -87,6 +87,21 @@
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
</dependency>
<dependency>
<groupId>com.squareup.okhttp3</groupId>
<artifactId>okhttp</artifactId>
</dependency>
<dependency>
<groupId>eu.dnetlib</groupId>
<artifactId>dnet-pace-core</artifactId>
</dependency>
<dependency>
<groupId>eu.dnetlib.dhp</groupId>
<artifactId>dhp-schemas</artifactId>
</dependency>
</dependencies>
</project>

View File

@ -1,119 +0,0 @@
package eu.dnetlib.data.mdstore.manager.common.model;
import java.io.Serializable;
import java.util.UUID;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
@Entity
@Table(name = "mdstores")
public class MDStore implements Serializable {
/** */
private static final long serialVersionUID = 3160530489149700055L;
@Id
@Column(name = "id")
private String id;
@Column(name = "format")
private String format;
@Column(name = "layout")
private String layout;
@Column(name = "interpretation")
private String interpretation;
@Column(name = "datasource_name")
private String datasourceName;
@Column(name = "datasource_id")
private String datasourceId;
@Column(name = "api_id")
private String apiId;
public String getId() {
return id;
}
public void setId(final String id) {
this.id = id;
}
public String getFormat() {
return format;
}
public void setFormat(final String format) {
this.format = format;
}
public String getLayout() {
return layout;
}
public void setLayout(final String layout) {
this.layout = layout;
}
public String getInterpretation() {
return interpretation;
}
public void setInterpretation(final String interpretation) {
this.interpretation = interpretation;
}
public String getDatasourceName() {
return datasourceName;
}
public void setDatasourceName(final String datasourceName) {
this.datasourceName = datasourceName;
}
public String getDatasourceId() {
return datasourceId;
}
public void setDatasourceId(final String datasourceId) {
this.datasourceId = datasourceId;
}
public String getApiId() {
return apiId;
}
public void setApiId(final String apiId) {
this.apiId = apiId;
}
public static MDStore newInstance(
final String format, final String layout, final String interpretation) {
return newInstance(format, layout, interpretation, null, null, null);
}
public static MDStore newInstance(
final String format,
final String layout,
final String interpretation,
final String dsName,
final String dsId,
final String apiId) {
final MDStore md = new MDStore();
md.setId("md-" + UUID.randomUUID());
md.setFormat(format);
md.setLayout(layout);
md.setInterpretation(interpretation);
md.setDatasourceName(dsName);
md.setDatasourceId(dsId);
md.setApiId(apiId);
return md;
}
}

View File

@ -1,51 +0,0 @@
package eu.dnetlib.data.mdstore.manager.common.model;
import java.io.Serializable;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
@Entity
@Table(name = "mdstore_current_versions")
public class MDStoreCurrentVersion implements Serializable {
/** */
private static final long serialVersionUID = -4757725888593745773L;
@Id
@Column(name = "mdstore")
private String mdstore;
@Column(name = "current_version")
private String currentVersion;
public String getMdstore() {
return mdstore;
}
public void setMdstore(final String mdstore) {
this.mdstore = mdstore;
}
public String getCurrentVersion() {
return currentVersion;
}
public void setCurrentVersion(final String currentVersion) {
this.currentVersion = currentVersion;
}
public static MDStoreCurrentVersion newInstance(final String mdId, final String versionId) {
final MDStoreCurrentVersion cv = new MDStoreCurrentVersion();
cv.setMdstore(mdId);
cv.setCurrentVersion(versionId);
return cv;
}
public static MDStoreCurrentVersion newInstance(final MDStoreVersion v) {
return newInstance(v.getMdstore(), v.getId());
}
}

View File

@ -1,99 +0,0 @@
package eu.dnetlib.data.mdstore.manager.common.model;
import java.io.Serializable;
import java.util.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
@Entity
@Table(name = "mdstore_versions")
public class MDStoreVersion implements Serializable {
/** */
private static final long serialVersionUID = -4763494442274298339L;
@Id
@Column(name = "id")
private String id;
@Column(name = "mdstore")
private String mdstore;
@Column(name = "writing")
private boolean writing;
@Column(name = "readcount")
private int readCount = 0;
@Column(name = "lastupdate")
@Temporal(TemporalType.TIMESTAMP)
private Date lastUpdate;
@Column(name = "size")
private long size = 0;
public static MDStoreVersion newInstance(final String mdId, final boolean writing) {
final MDStoreVersion t = new MDStoreVersion();
t.setId(mdId + "-" + new Date().getTime());
t.setMdstore(mdId);
t.setLastUpdate(null);
t.setWriting(writing);
t.setReadCount(0);
t.setSize(0);
return t;
}
public String getId() {
return id;
}
public void setId(final String id) {
this.id = id;
}
public String getMdstore() {
return mdstore;
}
public void setMdstore(final String mdstore) {
this.mdstore = mdstore;
}
public boolean isWriting() {
return writing;
}
public void setWriting(final boolean writing) {
this.writing = writing;
}
public int getReadCount() {
return readCount;
}
public void setReadCount(final int readCount) {
this.readCount = readCount;
}
public Date getLastUpdate() {
return lastUpdate;
}
public void setLastUpdate(final Date lastUpdate) {
this.lastUpdate = lastUpdate;
}
public long getSize() {
return size;
}
public void setSize(final long size) {
this.size = size;
}
}

View File

@ -1,143 +0,0 @@
package eu.dnetlib.data.mdstore.manager.common.model;
import java.io.Serializable;
import java.util.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
@Entity
@Table(name = "mdstores_with_info")
public class MDStoreWithInfo implements Serializable {
/** */
private static final long serialVersionUID = -8445784770687571492L;
@Id
@Column(name = "id")
private String id;
@Column(name = "format")
private String format;
@Column(name = "layout")
private String layout;
@Column(name = "interpretation")
private String interpretation;
@Column(name = "datasource_name")
private String datasourceName;
@Column(name = "datasource_id")
private String datasourceId;
@Column(name = "api_id")
private String apiId;
@Column(name = "current_version")
private String currentVersion;
@Column(name = "lastupdate")
@Temporal(TemporalType.TIMESTAMP)
private Date lastUpdate;
@Column(name = "size")
private long size = 0;
@Column(name = "n_versions")
private long numberOfVersions = 0;
public String getId() {
return id;
}
public void setId(final String id) {
this.id = id;
}
public String getFormat() {
return format;
}
public void setFormat(final String format) {
this.format = format;
}
public String getLayout() {
return layout;
}
public void setLayout(final String layout) {
this.layout = layout;
}
public String getInterpretation() {
return interpretation;
}
public void setInterpretation(final String interpretation) {
this.interpretation = interpretation;
}
public String getDatasourceName() {
return datasourceName;
}
public void setDatasourceName(final String datasourceName) {
this.datasourceName = datasourceName;
}
public String getDatasourceId() {
return datasourceId;
}
public void setDatasourceId(final String datasourceId) {
this.datasourceId = datasourceId;
}
public String getApiId() {
return apiId;
}
public void setApiId(final String apiId) {
this.apiId = apiId;
}
public String getCurrentVersion() {
return currentVersion;
}
public void setCurrentVersion(final String currentVersion) {
this.currentVersion = currentVersion;
}
public Date getLastUpdate() {
return lastUpdate;
}
public void setLastUpdate(final Date lastUpdate) {
this.lastUpdate = lastUpdate;
}
public long getSize() {
return size;
}
public void setSize(final long size) {
this.size = size;
}
public long getNumberOfVersions() {
return numberOfVersions;
}
public void setNumberOfVersions(final long numberOfVersions) {
this.numberOfVersions = numberOfVersions;
}
}

View File

@ -0,0 +1,30 @@
package eu.dnetlib.dhp.common;
import java.util.Map;
import com.google.common.collect.Maps;
public class Constants {
public static final Map<String, String> accessRightsCoarMap = Maps.newHashMap();
public static final Map<String, String> coarCodeLabelMap = Maps.newHashMap();
public static String COAR_ACCESS_RIGHT_SCHEMA = "http://vocabularies.coar-repositories.org/documentation/access_rights/";
static {
accessRightsCoarMap.put("OPEN", "c_abf2");
accessRightsCoarMap.put("RESTRICTED", "c_16ec");
accessRightsCoarMap.put("OPEN SOURCE", "c_abf2");
accessRightsCoarMap.put("CLOSED", "c_14cb");
accessRightsCoarMap.put("EMBARGO", "c_f1cf");
}
static {
coarCodeLabelMap.put("c_abf2", "OPEN");
coarCodeLabelMap.put("c_16ec", "RESTRICTED");
coarCodeLabelMap.put("c_14cb", "CLOSED");
coarCodeLabelMap.put("c_f1cf", "EMBARGO");
}
}
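A minimal usage sketch for the two lookup maps above, translating an access-right classid into its COAR code and label (the classid value is a placeholder; AccessRight is the dump-schema class used with these constants elsewhere in this changeset):
String classid = "EMBARGO"; // e.g. input.getBestaccessright().getClassid()
String coarCode = Constants.accessRightsCoarMap.get(classid); // "c_f1cf"
String coarLabel = Constants.coarCodeLabelMap.get(coarCode); // "EMBARGO"
AccessRight ar = AccessRight.newInstance(coarCode, coarLabel, Constants.COAR_ACCESS_RIGHT_SCHEMA);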

View File

@ -0,0 +1,412 @@
package eu.dnetlib.dhp.common;
import java.io.Serializable;
import java.util.*;
import java.util.stream.Collectors;
import eu.dnetlib.dhp.schema.common.ModelConstants;
import eu.dnetlib.dhp.schema.dump.oaf.*;
import eu.dnetlib.dhp.schema.dump.oaf.community.CommunityInstance;
import eu.dnetlib.dhp.schema.dump.oaf.community.CommunityResult;
import eu.dnetlib.dhp.schema.oaf.DataInfo;
import eu.dnetlib.dhp.schema.oaf.Field;
import eu.dnetlib.dhp.schema.oaf.Journal;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
public class GraphResultMapper implements Serializable {
public static <E extends eu.dnetlib.dhp.schema.oaf.OafEntity> Result map(
E in) {
CommunityResult out = new CommunityResult();
eu.dnetlib.dhp.schema.oaf.Result input = (eu.dnetlib.dhp.schema.oaf.Result) in;
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> ort = Optional.ofNullable(input.getResulttype());
if (ort.isPresent()) {
switch (ort.get().getClassid()) {
case "publication":
Optional<Journal> journal = Optional
.ofNullable(((eu.dnetlib.dhp.schema.oaf.Publication) input).getJournal());
if (journal.isPresent()) {
Journal j = journal.get();
Container c = new Container();
c.setConferencedate(j.getConferencedate());
c.setConferenceplace(j.getConferenceplace());
c.setEdition(j.getEdition());
c.setEp(j.getEp());
c.setIss(j.getIss());
c.setIssnLinking(j.getIssnLinking());
c.setIssnOnline(j.getIssnOnline());
c.setIssnPrinted(j.getIssnPrinted());
c.setName(j.getName());
c.setSp(j.getSp());
c.setVol(j.getVol());
out.setContainer(c);
out.setType(ModelConstants.PUBLICATION_DEFAULT_RESULTTYPE.getClassname());
}
break;
case "dataset":
eu.dnetlib.dhp.schema.oaf.Dataset id = (eu.dnetlib.dhp.schema.oaf.Dataset) input;
Optional.ofNullable(id.getSize()).ifPresent(v -> out.setSize(v.getValue()));
Optional.ofNullable(id.getVersion()).ifPresent(v -> out.setVersion(v.getValue()));
out
.setGeolocation(
Optional
.ofNullable(id.getGeolocation())
.map(
igl -> igl
.stream()
.filter(Objects::nonNull)
.map(gli -> {
GeoLocation gl = new GeoLocation();
gl.setBox(gli.getBox());
gl.setPlace(gli.getPlace());
gl.setPoint(gli.getPoint());
return gl;
})
.collect(Collectors.toList()))
.orElse(null));
out.setType(ModelConstants.DATASET_DEFAULT_RESULTTYPE.getClassname());
break;
case "software":
eu.dnetlib.dhp.schema.oaf.Software is = (eu.dnetlib.dhp.schema.oaf.Software) input;
Optional
.ofNullable(is.getCodeRepositoryUrl())
.ifPresent(value -> out.setCodeRepositoryUrl(value.getValue()));
Optional
.ofNullable(is.getDocumentationUrl())
.ifPresent(
value -> out
.setDocumentationUrl(
value
.stream()
.map(v -> v.getValue())
.collect(Collectors.toList())));
Optional
.ofNullable(is.getProgrammingLanguage())
.ifPresent(value -> out.setProgrammingLanguage(value.getClassid()));
out.setType(ModelConstants.SOFTWARE_DEFAULT_RESULTTYPE.getClassname());
break;
case "other":
eu.dnetlib.dhp.schema.oaf.OtherResearchProduct ir = (eu.dnetlib.dhp.schema.oaf.OtherResearchProduct) input;
out
.setContactgroup(
Optional
.ofNullable(ir.getContactgroup())
.map(value -> value.stream().map(cg -> cg.getValue()).collect(Collectors.toList()))
.orElse(null));
out
.setContactperson(
Optional
.ofNullable(ir.getContactperson())
.map(value -> value.stream().map(cp -> cp.getValue()).collect(Collectors.toList()))
.orElse(null));
out
.setTool(
Optional
.ofNullable(ir.getTool())
.map(value -> value.stream().map(t -> t.getValue()).collect(Collectors.toList()))
.orElse(null));
out.setType(ModelConstants.ORP_DEFAULT_RESULTTYPE.getClassname());
break;
}
Optional
.ofNullable(input.getAuthor())
.ifPresent(ats -> out.setAuthor(ats.stream().map(at -> getAuthor(at)).collect(Collectors.toList())));
// I do not map Access Right UNKNOWN or OTHER
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> oar = Optional.ofNullable(input.getBestaccessright());
if (oar.isPresent()) {
if (Constants.accessRightsCoarMap.containsKey(oar.get().getClassid())) {
String code = Constants.accessRightsCoarMap.get(oar.get().getClassid());
out
.setBestaccessright(
AccessRight
.newInstance(
code,
Constants.coarCodeLabelMap.get(code),
Constants.COAR_ACCESS_RIGHT_SCHEMA));
}
}
final List<String> contributorList = new ArrayList<>();
Optional
.ofNullable(input.getContributor())
.ifPresent(value -> value.stream().forEach(c -> contributorList.add(c.getValue())));
out.setContributor(contributorList);
Optional
.ofNullable(input.getCountry())
.ifPresent(
value -> out
.setCountry(
value
.stream()
.map(
c -> {
if (c.getClassid().equals((ModelConstants.UNKNOWN))) {
return null;
}
Country country = new Country();
country.setCode(c.getClassid());
country.setLabel(c.getClassname());
Optional
.ofNullable(c.getDataInfo())
.ifPresent(
provenance -> country
.setProvenance(
Provenance
.newInstance(
provenance
.getProvenanceaction()
.getClassname(),
c.getDataInfo().getTrust())));
return country;
})
.filter(Objects::nonNull)
.collect(Collectors.toList())));
final List<String> coverageList = new ArrayList<>();
Optional
.ofNullable(input.getCoverage())
.ifPresent(value -> value.stream().forEach(c -> coverageList.add(c.getValue())));
out.setCoverage(coverageList);
out.setDateofcollection(input.getDateofcollection());
final List<String> descriptionList = new ArrayList<>();
Optional
.ofNullable(input.getDescription())
.ifPresent(value -> value.forEach(d -> descriptionList.add(d.getValue())));
out.setDescription(descriptionList);
Optional<Field<String>> oStr = Optional.ofNullable(input.getEmbargoenddate());
if (oStr.isPresent()) {
out.setEmbargoenddate(oStr.get().getValue());
}
final List<String> formatList = new ArrayList<>();
Optional
.ofNullable(input.getFormat())
.ifPresent(value -> value.stream().forEach(f -> formatList.add(f.getValue())));
out.setFormat(formatList);
out.setId(input.getId());
out.setOriginalId(input.getOriginalId());
Optional<List<eu.dnetlib.dhp.schema.oaf.Instance>> oInst = Optional
.ofNullable(input.getInstance());
if (oInst.isPresent()) {
out
.setInstance(
oInst.get().stream().map(i -> getInstance(i)).collect(Collectors.toList()));
}
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> oL = Optional.ofNullable(input.getLanguage());
if (oL.isPresent()) {
eu.dnetlib.dhp.schema.oaf.Qualifier language = oL.get();
out.setLanguage(Qualifier.newInstance(language.getClassid(), language.getClassname()));
}
Optional<Long> oLong = Optional.ofNullable(input.getLastupdatetimestamp());
if (oLong.isPresent()) {
out.setLastupdatetimestamp(oLong.get());
}
Optional<List<StructuredProperty>> otitle = Optional.ofNullable(input.getTitle());
if (otitle.isPresent()) {
List<StructuredProperty> iTitle = otitle
.get()
.stream()
.filter(t -> t.getQualifier().getClassid().equalsIgnoreCase("main title"))
.collect(Collectors.toList());
if (iTitle.size() > 0) {
out.setMaintitle(iTitle.get(0).getValue());
}
iTitle = otitle
.get()
.stream()
.filter(t -> t.getQualifier().getClassid().equalsIgnoreCase("subtitle"))
.collect(Collectors.toList());
if (iTitle.size() > 0) {
out.setSubtitle(iTitle.get(0).getValue());
}
}
List<ControlledField> pids = new ArrayList<>();
Optional
.ofNullable(input.getPid())
.ifPresent(
value -> value
.stream()
.forEach(
p -> pids
.add(
ControlledField
.newInstance(p.getQualifier().getClassid(), p.getValue()))));
out.setPid(pids);
oStr = Optional.ofNullable(input.getDateofacceptance());
if (oStr.isPresent()) {
out.setPublicationdate(oStr.get().getValue());
}
oStr = Optional.ofNullable(input.getPublisher());
if (oStr.isPresent()) {
out.setPublisher(oStr.get().getValue());
}
List<String> sourceList = new ArrayList<>();
Optional
.ofNullable(input.getSource())
.ifPresent(value -> value.stream().forEach(s -> sourceList.add(s.getValue())));
// out.setSource(input.getSource().stream().map(s -> s.getValue()).collect(Collectors.toList()));
List<Subject> subjectList = new ArrayList<>();
Optional
.ofNullable(input.getSubject())
.ifPresent(
value -> value
.forEach(s -> subjectList.add(getSubject(s))));
out.setSubjects(subjectList);
out.setType(input.getResulttype().getClassid());
}
out
.setCollectedfrom(
input
.getCollectedfrom()
.stream()
.map(cf -> KeyValue.newInstance(cf.getKey(), cf.getValue()))
.collect(Collectors.toList()));
return out;
}
private static CommunityInstance getInstance(eu.dnetlib.dhp.schema.oaf.Instance i) {
CommunityInstance instance = new CommunityInstance();
setCommonValue(i, instance);
instance
.setCollectedfrom(
KeyValue
.newInstance(i.getCollectedfrom().getKey(), i.getCollectedfrom().getValue()));
instance
.setHostedby(
KeyValue.newInstance(i.getHostedby().getKey(), i.getHostedby().getValue()));
return instance;
}
private static <I extends Instance> void setCommonValue(eu.dnetlib.dhp.schema.oaf.Instance i, I instance) {
Optional<eu.dnetlib.dhp.schema.oaf.Qualifier> opAr = Optional
.ofNullable(i.getAccessright());
if (opAr.isPresent()) {
if (Constants.accessRightsCoarMap.containsKey(opAr.get().getClassid())) {
String code = Constants.accessRightsCoarMap.get(opAr.get().getClassid());
instance
.setAccessright(
AccessRight
.newInstance(
code,
Constants.coarCodeLabelMap.get(code),
Constants.COAR_ACCESS_RIGHT_SCHEMA));
}
}
Optional
.ofNullable(i.getLicense())
.ifPresent(value -> instance.setLicense(value.getValue()));
Optional
.ofNullable(i.getDateofacceptance())
.ifPresent(value -> instance.setPublicationdate(value.getValue()));
Optional
.ofNullable(i.getRefereed())
.ifPresent(value -> instance.setRefereed(value.getClassname()));
Optional
.ofNullable(i.getInstancetype())
.ifPresent(value -> instance.setType(value.getClassname()));
Optional.ofNullable(i.getUrl()).ifPresent(value -> instance.setUrl(value));
}
private static Subject getSubject(StructuredProperty s) {
Subject subject = new Subject();
subject.setSubject(ControlledField.newInstance(s.getQualifier().getClassid(), s.getValue()));
Optional<DataInfo> di = Optional.ofNullable(s.getDataInfo());
if (di.isPresent()) {
Provenance p = new Provenance();
p.setProvenance(di.get().getProvenanceaction().getClassname());
p.setTrust(di.get().getTrust());
subject.setProvenance(p);
}
return subject;
}
private static Author getAuthor(eu.dnetlib.dhp.schema.oaf.Author oa) {
Author a = new Author();
a.setFullname(oa.getFullname());
a.setName(oa.getName());
a.setSurname(oa.getSurname());
a.setRank(oa.getRank());
Optional<List<StructuredProperty>> oPids = Optional
.ofNullable(oa.getPid());
if (oPids.isPresent()) {
Pid pid = getOrcid(oPids.get());
if (pid != null) {
a.setPid(pid);
}
}
return a;
}
private static Pid getOrcid(List<StructuredProperty> p) {
for (StructuredProperty pid : p) {
if (pid.getQualifier().getClassid().equals(ModelConstants.ORCID)) {
Optional<DataInfo> di = Optional.ofNullable(pid.getDataInfo());
if (di.isPresent()) {
return Pid
.newInstance(
ControlledField
.newInstance(
pid.getQualifier().getClassid(),
pid.getValue()),
Provenance
.newInstance(
di.get().getProvenanceaction().getClassname(),
di.get().getTrust()));
} else {
return Pid
.newInstance(
ControlledField
.newInstance(
pid.getQualifier().getClassid(),
pid.getValue())
);
}
}
}
return null;
}
}
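A hedged usage sketch of the mapper above: converting a graph Publication into the dump's CommunityResult. The input record and the readPublicationFromGraph helper are hypothetical; in the dump workflows the call happens inside a Spark map step over the graph:
// Assumption: pub is read from the graph with resulttype, bestaccessright,
// collectedfrom etc. populated, as map() dereferences collectedfrom directly.
eu.dnetlib.dhp.schema.oaf.Publication pub = readPublicationFromGraph(); // hypothetical helper
Result dumped = GraphResultMapper.map(pub);
CommunityResult cr = (CommunityResult) dumped; // map() always builds a CommunityResult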

View File

@ -0,0 +1,114 @@
package eu.dnetlib.dhp.common;
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.Serializable;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;
import org.apache.hadoop.fs.*;
public class MakeTarArchive implements Serializable {
private static TarArchiveOutputStream getTar(FileSystem fileSystem, String outputPath) throws IOException {
Path hdfsWritePath = new Path(outputPath);
FSDataOutputStream fsDataOutputStream = null;
if (fileSystem.exists(hdfsWritePath)) {
fileSystem.delete(hdfsWritePath, true);
}
fsDataOutputStream = fileSystem.create(hdfsWritePath);
return new TarArchiveOutputStream(fsDataOutputStream.getWrappedStream());
}
private static void write(FileSystem fileSystem, String inputPath, String outputPath, String dir_name)
throws IOException {
Path hdfsWritePath = new Path(outputPath);
FSDataOutputStream fsDataOutputStream = null;
if (fileSystem.exists(hdfsWritePath)) {
fileSystem.delete(hdfsWritePath, true);
}
fsDataOutputStream = fileSystem.create(hdfsWritePath);
TarArchiveOutputStream ar = new TarArchiveOutputStream(fsDataOutputStream.getWrappedStream());
RemoteIterator<LocatedFileStatus> fileStatusListIterator = fileSystem
.listFiles(
new Path(inputPath), true);
while (fileStatusListIterator.hasNext()) {
writeCurrentFile(fileSystem, dir_name, fileStatusListIterator, ar, 0);
}
ar.close();
}
public static void tarMaxSize(FileSystem fileSystem, String inputPath, String outputPath, String dir_name,
int gBperSplit) throws IOException {
final long bytesPerSplit = 1024L * 1024L * 1024L * gBperSplit;
long sourceSize = fileSystem.getContentSummary(new Path(inputPath)).getSpaceConsumed();
if (sourceSize < bytesPerSplit) {
write(fileSystem, inputPath, outputPath + ".tar", dir_name);
} else {
int partNum = 0;
RemoteIterator<LocatedFileStatus> fileStatusListIterator = fileSystem
.listFiles(
new Path(inputPath), true);
boolean next = fileStatusListIterator.hasNext();
while (next) {
TarArchiveOutputStream ar = getTar(fileSystem, outputPath + "_" + (partNum + 1) + ".tar");
long current_size = 0;
while (next && current_size < bytesPerSplit) {
current_size = writeCurrentFile(fileSystem, dir_name, fileStatusListIterator, ar, current_size);
next = fileStatusListIterator.hasNext();
}
partNum += 1;
ar.close();
}
}
}
private static long writeCurrentFile(FileSystem fileSystem, String dir_name,
RemoteIterator<LocatedFileStatus> fileStatusListIterator,
TarArchiveOutputStream ar, long current_size) throws IOException {
LocatedFileStatus fileStatus = fileStatusListIterator.next();
Path p = fileStatus.getPath();
String p_string = p.toString();
if (!p_string.endsWith("_SUCCESS")) {
String name = p_string.substring(p_string.lastIndexOf("/") + 1);
TarArchiveEntry entry = new TarArchiveEntry(dir_name + "/" + name);
entry.setSize(fileStatus.getLen());
current_size += fileStatus.getLen();
ar.putArchiveEntry(entry);
InputStream is = fileSystem.open(fileStatus.getPath());
BufferedInputStream bis = new BufferedInputStream(is);
int count;
byte data[] = new byte[1024];
while ((count = bis.read(data, 0, data.length)) != -1) {
ar.write(data, 0, count);
}
bis.close();
ar.closeArchiveEntry();
}
return current_size;
}
}
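A hedged usage sketch for the archiver above; the HDFS URI and paths are placeholders. tarMaxSize writes a single <outputPath>.tar when the source fits in one split, and numbered <outputPath>_N.tar parts otherwise:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://nameservice1"); // placeholder namenode
FileSystem fs = FileSystem.get(conf);
// archive /user/dnet/dump into parts of at most 10 GB, rooted at dir "dump" inside each tar
MakeTarArchive.tarMaxSize(fs, "/user/dnet/dump", "/user/dnet/dump_archive", "dump", 10);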

View File

@ -0,0 +1,53 @@
package eu.dnetlib.dhp.common.api;
import java.io.IOException;
import java.io.InputStream;
import okhttp3.MediaType;
import okhttp3.RequestBody;
import okhttp3.internal.Util;
import okio.BufferedSink;
import okio.Okio;
import okio.Source;
public class InputStreamRequestBody extends RequestBody {
private InputStream inputStream;
private MediaType mediaType;
private long lenght;
public static RequestBody create(final MediaType mediaType, final InputStream inputStream, final long len) {
return new InputStreamRequestBody(inputStream, mediaType, len);
}
private InputStreamRequestBody(InputStream inputStream, MediaType mediaType, long len) {
this.inputStream = inputStream;
this.mediaType = mediaType;
this.lenght = len;
}
@Override
public MediaType contentType() {
return mediaType;
}
@Override
public long contentLength() {
return lenght;
}
@Override
public void writeTo(BufferedSink sink) throws IOException {
Source source = null;
try {
source = Okio.source(inputStream);
sink.writeAll(source);
} finally {
Util.closeQuietly(source);
}
}
}

View File

@ -0,0 +1,8 @@
package eu.dnetlib.dhp.common.api;
public class MissingConceptDoiException extends Throwable {
public MissingConceptDoiException(String message) {
super(message);
}
}

View File

@ -0,0 +1,315 @@
package eu.dnetlib.dhp.common.api;
import java.io.*;
import java.io.IOException;
import java.util.concurrent.TimeUnit;
import com.google.gson.Gson;
import eu.dnetlib.dhp.common.api.zenodo.ZenodoModel;
import eu.dnetlib.dhp.common.api.zenodo.ZenodoModelList;
import okhttp3.*;
public class ZenodoAPIClient implements Serializable {
String urlString;
String bucket;
String deposition_id;
String access_token;
public static final MediaType MEDIA_TYPE_JSON = MediaType.parse("application/json; charset=utf-8");
private static final MediaType MEDIA_TYPE_ZIP = MediaType.parse("application/zip");
public String getUrlString() {
return urlString;
}
public void setUrlString(String urlString) {
this.urlString = urlString;
}
public String getBucket() {
return bucket;
}
public void setBucket(String bucket) {
this.bucket = bucket;
}
public void setDeposition_id(String deposition_id) {
this.deposition_id = deposition_id;
}
public ZenodoAPIClient(String urlString, String access_token) throws IOException {
this.urlString = urlString;
this.access_token = access_token;
}
/**
* Brand new deposition in Zenodo. It sets the deposition_id and the bucket where to store the files to upload
*
* @return response code
* @throws IOException
*/
public int newDeposition() throws IOException {
String json = "{}";
OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();
RequestBody body = RequestBody.create(json, MEDIA_TYPE_JSON);
Request request = new Request.Builder()
.url(urlString)
.addHeader("Content-Type", "application/json") // add request headers
.addHeader("Authorization", "Bearer " + access_token)
.post(body)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
// Get response body
json = response.body().string();
ZenodoModel newSubmission = new Gson().fromJson(json, ZenodoModel.class);
this.bucket = newSubmission.getLinks().getBucket();
this.deposition_id = newSubmission.getId();
return response.code();
}
}
/**
* Upload files in Zenodo.
*
* @param is the inputStream for the file to upload
* @param file_name the name of the file as it will appear on Zenodo
* @param len the size of the file
* @return the response code
*/
public int uploadIS(InputStream is, String file_name, long len) throws IOException {
OkHttpClient httpClient = new OkHttpClient.Builder()
.writeTimeout(600, TimeUnit.SECONDS)
.readTimeout(600, TimeUnit.SECONDS)
.connectTimeout(600, TimeUnit.SECONDS)
.build();
Request request = new Request.Builder()
.url(bucket + "/" + file_name)
.addHeader("Content-Type", "application/zip") // add request headers
.addHeader("Authorization", "Bearer " + access_token)
.put(InputStreamRequestBody.create(MEDIA_TYPE_ZIP, is, len))
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
return response.code();
}
}
/**
* Associates metadata information to the current deposition
*
* @param metadata the metadata
* @return response code
* @throws IOException
*/
public int sendMretadata(String metadata) throws IOException {
OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();
RequestBody body = RequestBody.create(metadata, MEDIA_TYPE_JSON);
Request request = new Request.Builder()
.url(urlString + "/" + deposition_id)
.addHeader("Content-Type", "application/json") // add request headers
.addHeader("Authorization", "Bearer " + access_token)
.put(body)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
return response.code();
}
}
/**
* To publish the current deposition. It works for both new deposition or new version of an old deposition
*
* @return response code
* @throws IOException
*/
public int publish() throws IOException {
String json = "{}";
OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();
RequestBody body = RequestBody.create(json, MEDIA_TYPE_JSON);
Request request = new Request.Builder()
.url(urlString + "/" + deposition_id + "/actions/publish")
.addHeader("Authorization", "Bearer " + access_token)
.post(body)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
return response.code();
}
}
/**
* To create a new version of an already published deposition. It sets the deposition_id and the bucket to be used
* for the new version.
*
* @param concept_rec_id the concept record id of the deposition for which to create a new version. It is the last
* part of the url for the DOI Zenodo suggests to use to cite all versions: DOI: 10.xxx/zenodo.656930
* concept_rec_id = 656930
* @return response code
* @throws IOException
* @throws MissingConceptDoiException
*/
public int newVersion(String concept_rec_id) throws IOException, MissingConceptDoiException {
setDepositionId(concept_rec_id);
String json = "{}";
OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();
RequestBody body = RequestBody.create(json, MEDIA_TYPE_JSON);
Request request = new Request.Builder()
.url(urlString + "/" + deposition_id + "/actions/newversion")
.addHeader("Authorization", "Bearer " + access_token)
.post(body)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
ZenodoModel zenodoModel = new Gson().fromJson(response.body().string(), ZenodoModel.class);
String latest_draft = zenodoModel.getLinks().getLatest_draft();
deposition_id = latest_draft.substring(latest_draft.lastIndexOf("/") + 1);
bucket = getBucket(latest_draft);
return response.code();
}
}
/**
* To finish uploading a version or new deposition not published
* It sets the deposition_id and the bucket to be used
*
*
* @param deposition_id the deposition id of the not yet published upload
* concept_rec_id = 656930
* @return response code
* @throws IOException
* @throws MissingConceptDoiException
*/
public int uploadOpenDeposition(String deposition_id) throws IOException, MissingConceptDoiException {
this.deposition_id = deposition_id;
OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();
Request request = new Request.Builder()
.url(urlString + "/" + deposition_id)
.addHeader("Authorization", "Bearer " + access_token)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
ZenodoModel zenodoModel = new Gson().fromJson(response.body().string(), ZenodoModel.class);
bucket = zenodoModel.getLinks().getBucket();
return response.code();
}
}
private void setDepositionId(String concept_rec_id) throws IOException, MissingConceptDoiException {
ZenodoModelList zenodoModelList = new Gson().fromJson(getPrevDepositions(), ZenodoModelList.class);
for (ZenodoModel zm : zenodoModelList) {
if (zm.getConceptrecid().equals(concept_rec_id)) {
deposition_id = zm.getId();
return;
}
}
throw new MissingConceptDoiException("The concept record id specified was missing in the list of depositions");
}
private String getPrevDepositions() throws IOException {
OkHttpClient httpClient = new OkHttpClient.Builder().connectTimeout(600, TimeUnit.SECONDS).build();
Request request = new Request.Builder()
.url(urlString)
.addHeader("Content-Type", "application/json") // add request headers
.addHeader("Authorization", "Bearer " + access_token)
.get()
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
return response.body().string();
}
}
private String getBucket(String url) throws IOException {
OkHttpClient httpClient = new OkHttpClient.Builder()
.connectTimeout(600, TimeUnit.SECONDS)
.build();
Request request = new Request.Builder()
.url(url)
.addHeader("Content-Type", "application/json") // add request headers
.addHeader("Authorization", "Bearer " + access_token)
.get()
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful())
throw new IOException("Unexpected code " + response + response.body().string());
// Get response body
ZenodoModel zenodoModel = new Gson().fromJson(response.body().string(), ZenodoModel.class);
return zenodoModel.getLinks().getBucket();
}
}
}
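A hedged end-to-end sketch of the deposition flow described by the javadocs above. The endpoint URL, token, file and metadata JSON are placeholders; sendMretadata is spelled as defined in the class:
import java.io.FileInputStream;
import java.io.InputStream;

ZenodoAPIClient client = new ZenodoAPIClient(
	"https://zenodo.org/api/deposit/depositions", // placeholder endpoint
	accessToken); // placeholder token
client.newDeposition(); // sets deposition_id and the upload bucket
java.io.File tar = new java.io.File("/tmp/dump.tar"); // placeholder archive
try (InputStream is = new FileInputStream(tar)) {
	client.uploadIS(is, tar.getName(), tar.length()); // PUT into the bucket
}
client.sendMretadata(metadataJson); // attach the deposition metadata
client.publish();
// For a new version of an already published deposition, call client.newVersion(conceptRecId) first.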

View File

@ -0,0 +1,14 @@
package eu.dnetlib.dhp.common.api.zenodo;
public class Community {
private String identifier;
public String getIdentifier() {
return identifier;
}
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
}

View File

@ -0,0 +1,47 @@
package eu.dnetlib.dhp.common.api.zenodo;
public class Creator {
private String affiliation;
private String name;
private String orcid;
public String getAffiliation() {
return affiliation;
}
public void setAffiliation(String affiliation) {
this.affiliation = affiliation;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getOrcid() {
return orcid;
}
public void setOrcid(String orcid) {
this.orcid = orcid;
}
public static Creator newInstance(String name, String affiliation, String orcid) {
Creator c = new Creator();
if (!(name == null)) {
c.name = name;
}
if (!(affiliation == null)) {
c.affiliation = affiliation;
}
if (!(orcid == null)) {
c.orcid = orcid;
}
return c;
}
}

View File

@ -0,0 +1,58 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
import net.minidev.json.annotate.JsonIgnore;
public class File implements Serializable {
private String checksum;
private String filename;
private long filesize;
private String id;
@JsonIgnore
// private Links links;
public String getChecksum() {
return checksum;
}
public void setChecksum(String checksum) {
this.checksum = checksum;
}
public String getFilename() {
return filename;
}
public void setFilename(String filename) {
this.filename = filename;
}
public long getFilesize() {
return filesize;
}
public void setFilesize(long filesize) {
this.filesize = filesize;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
// @JsonIgnore
// public Links getLinks() {
// return links;
// }
//
// @JsonIgnore
// public void setLinks(Links links) {
// this.links = links;
// }
}

View File

@ -0,0 +1,23 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
public class Grant implements Serializable {
private String id;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public static Grant newInstance(String id) {
Grant g = new Grant();
g.id = id;
return g;
}
}

View File

@ -0,0 +1,92 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
public class Links implements Serializable {
private String bucket;
private String discard;
private String edit;
private String files;
private String html;
private String latest_draft;
private String latest_draft_html;
private String publish;
private String self;
public String getBucket() {
return bucket;
}
public void setBucket(String bucket) {
this.bucket = bucket;
}
public String getDiscard() {
return discard;
}
public void setDiscard(String discard) {
this.discard = discard;
}
public String getEdit() {
return edit;
}
public void setEdit(String edit) {
this.edit = edit;
}
public String getFiles() {
return files;
}
public void setFiles(String files) {
this.files = files;
}
public String getHtml() {
return html;
}
public void setHtml(String html) {
this.html = html;
}
public String getLatest_draft() {
return latest_draft;
}
public void setLatest_draft(String latest_draft) {
this.latest_draft = latest_draft;
}
public String getLatest_draft_html() {
return latest_draft_html;
}
public void setLatest_draft_html(String latest_draft_html) {
this.latest_draft_html = latest_draft_html;
}
public String getPublish() {
return publish;
}
public void setPublish(String publish) {
this.publish = publish;
}
public String getSelf() {
return self;
}
public void setSelf(String self) {
this.self = self;
}
}

View File

@ -0,0 +1,153 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
import java.util.List;
public class Metadata implements Serializable {
private String access_right;
private List<Community> communities;
private List<Creator> creators;
private String description;
private String doi;
private List<Grant> grants;
private List<String> keywords;
private String language;
private String license;
private PrereserveDoi prereserve_doi;
private String publication_date;
private List<String> references;
private List<RelatedIdentifier> related_identifiers;
private String title;
private String upload_type;
private String version;
public String getUpload_type() {
return upload_type;
}
public void setUpload_type(String upload_type) {
this.upload_type = upload_type;
}
public String getVersion() {
return version;
}
public void setVersion(String version) {
this.version = version;
}
public String getAccess_right() {
return access_right;
}
public void setAccess_right(String access_right) {
this.access_right = access_right;
}
public List<Community> getCommunities() {
return communities;
}
public void setCommunities(List<Community> communities) {
this.communities = communities;
}
public List<Creator> getCreators() {
return creators;
}
public void setCreators(List<Creator> creators) {
this.creators = creators;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public String getDoi() {
return doi;
}
public void setDoi(String doi) {
this.doi = doi;
}
public List<Grant> getGrants() {
return grants;
}
public void setGrants(List<Grant> grants) {
this.grants = grants;
}
public List<String> getKeywords() {
return keywords;
}
public void setKeywords(List<String> keywords) {
this.keywords = keywords;
}
public String getLanguage() {
return language;
}
public void setLanguage(String language) {
this.language = language;
}
public String getLicense() {
return license;
}
public void setLicense(String license) {
this.license = license;
}
public PrereserveDoi getPrereserve_doi() {
return prereserve_doi;
}
public void setPrereserve_doi(PrereserveDoi prereserve_doi) {
this.prereserve_doi = prereserve_doi;
}
public String getPublication_date() {
return publication_date;
}
public void setPublication_date(String publication_date) {
this.publication_date = publication_date;
}
public List<String> getReferences() {
return references;
}
public void setReferences(List<String> references) {
this.references = references;
}
public List<RelatedIdentifier> getRelated_identifiers() {
return related_identifiers;
}
public void setRelated_identifiers(List<RelatedIdentifier> related_identifiers) {
this.related_identifiers = related_identifiers;
}
public String getTitle() {
return title;
}
public void setTitle(String title) {
this.title = title;
}
}
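A hedged sketch of assembling this metadata for ZenodoAPIClient.sendMretadata, using the POJOs above and Gson (values are placeholders; the top-level "metadata" envelope follows the Zenodo REST API and is an assumption here):
Metadata m = new Metadata();
m.setTitle("OpenAIRE research products dump"); // placeholder
m.setUpload_type("dataset"); // placeholder
m.setAccess_right("open"); // placeholder
m.setCreators(java.util.Arrays.asList(Creator.newInstance("Doe, Jane", "CNR-ISTI", null))); // placeholders
Community c = new Community();
c.setIdentifier("openaire"); // placeholder community id
m.setCommunities(java.util.Arrays.asList(c));
String metadataJson = "{\"metadata\":" + new com.google.gson.Gson().toJson(m) + "}";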

View File

@ -0,0 +1,25 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
public class PrereserveDoi implements Serializable {
private String doi;
private String recid;
public String getDoi() {
return doi;
}
public void setDoi(String doi) {
this.doi = doi;
}
public String getRecid() {
return recid;
}
public void setRecid(String recid) {
this.recid = recid;
}
}

View File

@ -0,0 +1,43 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
public class RelatedIdentifier implements Serializable {
private String identifier;
private String relation;
private String resource_type;
private String scheme;
public String getIdentifier() {
return identifier;
}
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
public String getRelation() {
return relation;
}
public void setRelation(String relation) {
this.relation = relation;
}
public String getResource_type() {
return resource_type;
}
public void setResource_type(String resource_type) {
this.resource_type = resource_type;
}
public String getScheme() {
return scheme;
}
public void setScheme(String scheme) {
this.scheme = scheme;
}
}

View File

@ -0,0 +1,118 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.io.Serializable;
import java.util.List;
public class ZenodoModel implements Serializable {
private String conceptrecid;
private String created;
private List<File> files;
private String id;
private Links links;
private Metadata metadata;
private String modified;
private String owner;
private String record_id;
private String state;
private boolean submitted;
private String title;
public String getConceptrecid() {
return conceptrecid;
}
public void setConceptrecid(String conceptrecid) {
this.conceptrecid = conceptrecid;
}
public String getCreated() {
return created;
}
public void setCreated(String created) {
this.created = created;
}
public List<File> getFiles() {
return files;
}
public void setFiles(List<File> files) {
this.files = files;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public Links getLinks() {
return links;
}
public void setLinks(Links links) {
this.links = links;
}
public Metadata getMetadata() {
return metadata;
}
public void setMetadata(Metadata metadata) {
this.metadata = metadata;
}
public String getModified() {
return modified;
}
public void setModified(String modified) {
this.modified = modified;
}
public String getOwner() {
return owner;
}
public void setOwner(String owner) {
this.owner = owner;
}
public String getRecord_id() {
return record_id;
}
public void setRecord_id(String record_id) {
this.record_id = record_id;
}
public String getState() {
return state;
}
public void setState(String state) {
this.state = state;
}
public boolean isSubmitted() {
return submitted;
}
public void setSubmitted(boolean submitted) {
this.submitted = submitted;
}
public String getTitle() {
return title;
}
public void setTitle(String title) {
this.title = title;
}
}

View File

@ -0,0 +1,7 @@
package eu.dnetlib.dhp.common.api.zenodo;
import java.util.ArrayList;
public class ZenodoModelList extends ArrayList<ZenodoModel> {
}

View File

@ -1,9 +1,10 @@
package eu.dnetlib.dhp.oa.dedup;
package eu.dnetlib.dhp.oa.merge;
import java.text.Normalizer;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import org.apache.commons.lang3.StringUtils;
@ -32,27 +33,33 @@ public class AuthorMerger {
}
public static List<Author> mergeAuthor(final List<Author> a, final List<Author> b) {
public static List<Author> mergeAuthor(final List<Author> a, final List<Author> b, Double threshold) {
int pa = countAuthorsPids(a);
int pb = countAuthorsPids(b);
List<Author> base, enrich;
int sa = authorsSize(a);
int sb = authorsSize(b);
if (pa == pb) {
base = sa > sb ? a : b;
enrich = sa > sb ? b : a;
} else {
if (sa == sb) {
base = pa > pb ? a : b;
enrich = pa > pb ? b : a;
} else {
base = sa > sb ? a : b;
enrich = sa > sb ? b : a;
}
enrichPidFromList(base, enrich);
enrichPidFromList(base, enrich, threshold);
return base;
}
private static void enrichPidFromList(List<Author> base, List<Author> enrich) {
public static List<Author> mergeAuthor(final List<Author> a, final List<Author> b) {
return mergeAuthor(a, b, THRESHOLD);
}
private static void enrichPidFromList(List<Author> base, List<Author> enrich, Double threshold) {
if (base == null || enrich == null)
return;
// <pidComparableString, Author> (if an Author has more than 1 pid, it appears 2 times in the list)
final Map<String, Author> basePidAuthorMap = base
.stream()
.filter(a -> a.getPid() != null && a.getPid().size() > 0)
@ -63,6 +70,7 @@ public class AuthorMerger {
.map(p -> new Tuple2<>(pidToComparableString(p), a)))
.collect(Collectors.toMap(Tuple2::_1, Tuple2::_2, (x1, x2) -> x1));
// <pid, Author> (list of pid that are missing in the other list)
final List<Tuple2<StructuredProperty, Author>> pidToEnrich = enrich
.stream()
.filter(a -> a.getPid() != null && a.getPid().size() > 0)
@ -83,10 +91,10 @@ public class AuthorMerger {
.max(Comparator.comparing(Tuple2::_1));
if (simAuthor.isPresent()) {
double th = THRESHOLD;
double th = threshold;
// increase the threshold if the surname is too short
if (simAuthor.get()._2().getSurname() != null
&& simAuthor.get()._2().getSurname().length() <= 3)
&& simAuthor.get()._2().getSurname().length() <= 3 && threshold > 0.0)
th = 0.99;
if (simAuthor.get()._1() > th) {
@ -94,7 +102,13 @@ public class AuthorMerger {
if (r.getPid() == null) {
r.setPid(new ArrayList<>());
}
r.getPid().add(a._1());
// TERRIBLE HACK but for some reason when we create and Array with Arrays.asList,
// it creates of fixed size, and the add method raise UnsupportedOperationException at
// java.util.AbstractList.add
final List<StructuredProperty> tmp = new ArrayList<>(r.getPid());
tmp.add(a._1());
r.setPid(tmp);
}
}
});
@ -150,7 +164,7 @@ public class AuthorMerger {
}
private static String normalize(final String s) {
return nfd(s)
String[] normalized = nfd(s)
.toLowerCase()
// do not compact the regexes in a single expression, would cause StackOverflowError
// in case
@ -160,7 +174,12 @@ public class AuthorMerger {
.replaceAll("(\\p{Punct})+", " ")
.replaceAll("(\\d)+", " ")
.replaceAll("(\\n)+", " ")
.trim();
.trim()
.split(" ");
Arrays.sort(normalized);
return String.join(" ", normalized);
}
private static String nfd(final String s) {
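A hedged sketch of the refactored entry point introduced in this diff; the two author lists are placeholders. Note that the short-surname threshold bump to 0.99 only kicks in when the passed threshold is greater than 0.0:
// explicit similarity threshold, as introduced by the new three-argument signature
List<Author> merged = AuthorMerger.mergeAuthor(baseAuthors, enrichingAuthors, 0.95);
// the two-argument overload delegates to the built-in default THRESHOLD
List<Author> mergedDefault = AuthorMerger.mergeAuthor(baseAuthors, enrichingAuthors);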

View File

@ -1,19 +1,57 @@
package eu.dnetlib.dhp.oa.graph.raw.common;
package eu.dnetlib.dhp.schema.oaf;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import java.util.function.Predicate;
import java.util.stream.Collectors;
import org.apache.commons.lang3.StringUtils;
import eu.dnetlib.dhp.schema.oaf.*;
import com.google.common.base.Joiner;
import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.utils.DHPUtils;
public class OafMapperUtils {
public static Oaf merge(final Oaf o1, final Oaf o2) {
if (ModelSupport.isSubClass(o1, OafEntity.class)) {
if (ModelSupport.isSubClass(o1, Result.class)) {
return mergeResults((Result) o1, (Result) o2);
} else if (ModelSupport.isSubClass(o1, Datasource.class)) {
((Datasource) o1).mergeFrom((Datasource) o2);
} else if (ModelSupport.isSubClass(o1, Organization.class)) {
((Organization) o1).mergeFrom((Organization) o2);
} else if (ModelSupport.isSubClass(o1, Project.class)) {
((Project) o1).mergeFrom((Project) o2);
} else {
throw new RuntimeException("invalid OafEntity subtype:" + o1.getClass().getCanonicalName());
}
} else if (ModelSupport.isSubClass(o1, Relation.class)) {
((Relation) o1).mergeFrom((Relation) o2);
} else {
throw new RuntimeException("invalid Oaf type:" + o1.getClass().getCanonicalName());
}
return o1;
}
public static Result mergeResults(Result r1, Result r2) {
if (new ResultTypeComparator().compare(r1, r2) < 0) {
r1.mergeFrom(r2);
return r1;
} else {
r2.mergeFrom(r1);
return r2;
}
}
public static KeyValue keyValue(final String k, final String v) {
final KeyValue kv = new KeyValue();
kv.setKey(k);
@ -49,6 +87,7 @@ public class OafMapperUtils {
.stream(values)
.map(v -> field(v, info))
.filter(Objects::nonNull)
.filter(distinctByKey(f -> f.getValue()))
.collect(Collectors.toList());
}
@ -57,6 +96,7 @@ public class OafMapperUtils {
.stream()
.map(v -> field(v, info))
.filter(Objects::nonNull)
.filter(distinctByKey(f -> f.getValue()))
.collect(Collectors.toList());
}
@ -89,7 +129,9 @@ public class OafMapperUtils {
}
public static StructuredProperty structuredProperty(
final String value, final Qualifier qualifier, final DataInfo dataInfo) {
final String value,
final Qualifier qualifier,
final DataInfo dataInfo) {
if (value == null) {
return null;
}
@ -137,6 +179,28 @@ public class OafMapperUtils {
return p;
}
public static Journal journal(
final String name,
final String issnPrinted,
final String issnOnline,
final String issnLinking,
final DataInfo dataInfo) {
return hasIssn(issnPrinted, issnOnline, issnLinking) ? journal(
name,
issnPrinted,
issnOnline,
issnLinking,
null,
null,
null,
null,
null,
null,
null,
dataInfo) : null;
}
public static Journal journal(
final String name,
final String issnPrinted,
@ -151,10 +215,7 @@ public class OafMapperUtils {
final String conferencedate,
final DataInfo dataInfo) {
if (StringUtils.isNotBlank(name)
|| StringUtils.isNotBlank(issnPrinted)
|| StringUtils.isNotBlank(issnOnline)
|| StringUtils.isNotBlank(issnLinking)) {
if (StringUtils.isNotBlank(name) || hasIssn(issnPrinted, issnOnline, issnLinking)) {
final Journal j = new Journal();
j.setName(name);
j.setIssnPrinted(issnPrinted);
@ -174,6 +235,12 @@ public class OafMapperUtils {
}
}
private static boolean hasIssn(String issnPrinted, String issnOnline, String issnLinking) {
return StringUtils.isNotBlank(issnPrinted)
|| StringUtils.isNotBlank(issnOnline)
|| StringUtils.isNotBlank(issnLinking);
}
public static DataInfo dataInfo(
final Boolean deletedbyinference,
final String inferenceprovenance,
@ -192,8 +259,12 @@ public class OafMapperUtils {
}
public static String createOpenaireId(
final int prefix, final String originalId, final boolean to_md5) {
if (to_md5) {
final int prefix,
final String originalId,
final boolean to_md5) {
if (StringUtils.isBlank(originalId)) {
return null;
} else if (to_md5) {
final String nsPrefix = StringUtils.substringBefore(originalId, "::");
final String rest = StringUtils.substringAfter(originalId, "::");
return String.format("%s|%s::%s", prefix, nsPrefix, DHPUtils.md5(rest));
@ -203,7 +274,9 @@ public class OafMapperUtils {
}
public static String createOpenaireId(
final String type, final String originalId, final boolean to_md5) {
final String type,
final String originalId,
final boolean to_md5) {
switch (type) {
case "datasource":
return createOpenaireId(10, originalId, to_md5);
@ -221,4 +294,10 @@ public class OafMapperUtils {
public static String asString(final Object o) {
return o == null ? "" : o.toString();
}
public static <T> Predicate<T> distinctByKey(
final Function<? super T, ?> keyExtractor) {
final Map<Object, Boolean> seen = new ConcurrentHashMap<>();
return t -> seen.putIfAbsent(keyExtractor.apply(t), Boolean.TRUE) == null;
}
}
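A hedged sketch of the distinctByKey predicate added above: it is stateful (backed by a ConcurrentHashMap, so safe in parallel streams) and keeps the first element seen per key, which is how listFields now deduplicates field values:
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

List<String> names = Stream.of("Baglioni", "baglioni", "Atzori")
	.filter(OafMapperUtils.distinctByKey(String::toLowerCase))
	.collect(Collectors.toList()); // ["Baglioni", "Atzori"]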

View File

@ -0,0 +1,49 @@
package eu.dnetlib.dhp.schema.oaf;
import java.util.Comparator;
import eu.dnetlib.dhp.schema.common.ModelConstants;
public class ResultTypeComparator implements Comparator<Result> {
@Override
public int compare(Result left, Result right) {
if (left == null && right == null)
return 0;
if (left == null)
return 1;
if (right == null)
return -1;
String lClass = left.getResulttype().getClassid();
String rClass = right.getResulttype().getClassid();
if (lClass.equals(rClass))
return 0;
if (lClass.equals(ModelConstants.PUBLICATION_RESULTTYPE_CLASSID))
return -1;
if (rClass.equals(ModelConstants.PUBLICATION_RESULTTYPE_CLASSID))
return 1;
if (lClass.equals(ModelConstants.DATASET_RESULTTYPE_CLASSID))
return -1;
if (rClass.equals(ModelConstants.DATASET_RESULTTYPE_CLASSID))
return 1;
if (lClass.equals(ModelConstants.SOFTWARE_RESULTTYPE_CLASSID))
return -1;
if (rClass.equals(ModelConstants.SOFTWARE_RESULTTYPE_CLASSID))
return 1;
if (lClass.equals(ModelConstants.ORP_RESULTTYPE_CLASSID))
return -1;
if (rClass.equals(ModelConstants.ORP_RESULTTYPE_CLASSID))
return 1;
// Else (but unlikely), lexicographical ordering will do.
return lClass.compareTo(rClass);
}
}
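A hedged sketch of how this ordering drives OafMapperUtils.mergeResults above: publication sorts before dataset, so the dataset is merged into the publication instance (result types set via the ModelConstants qualifiers used elsewhere in this changeset):
Publication pub = new Publication();
pub.setResulttype(ModelConstants.PUBLICATION_DEFAULT_RESULTTYPE);
Dataset ds = new Dataset();
ds.setResulttype(ModelConstants.DATASET_DEFAULT_RESULTTYPE);
Result merged = OafMapperUtils.mergeResults(pub, ds); // returns pub, enriched from ds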

View File

@ -5,6 +5,7 @@ import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.List;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
@ -15,9 +16,15 @@ import org.apache.commons.codec.binary.Hex;
import com.jayway.jsonpath.JsonPath;
import net.minidev.json.JSONArray;
import scala.collection.JavaConverters;
import scala.collection.Seq;
public class DHPUtils {
public static Seq<String> toSeq(List<String> list) {
return JavaConverters.asScalaIteratorConverter(list.iterator()).asScala().toSeq();
}
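// Usage sketch (illustrative, not part of this changeset): bridge a Java list to a
// Scala API expecting Seq[String], e.g. toSeq(Arrays.asList("/data/part-0", "/data/part-1")).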
public static String md5(final String s) {
try {
final MessageDigest md = MessageDigest.getInstance("MD5");

View File

@ -1,15 +1,22 @@
package eu.dnetlib.dhp.utils;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.cxf.endpoint.Client;
import org.apache.cxf.frontend.ClientProxy;
import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;
import org.apache.cxf.transport.http.HTTPConduit;
import org.apache.cxf.transports.http.configuration.HTTPClientPolicy;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
public class ISLookupClientFactory {
private static final Log log = LogFactory.getLog(ISLookupClientFactory.class);
private static final Logger log = LoggerFactory.getLogger(ISLookupClientFactory.class);
private static int requestTimeout = 60000 * 10;
private static int connectTimeout = 60000 * 10;
public static ISLookUpService getLookUpService(final String isLookupUrl) {
return getServiceStub(ISLookUpService.class, isLookupUrl);
@ -21,6 +28,28 @@ public class ISLookupClientFactory {
final JaxWsProxyFactoryBean jaxWsProxyFactory = new JaxWsProxyFactoryBean();
jaxWsProxyFactory.setServiceClass(clazz);
jaxWsProxyFactory.setAddress(endpoint);
return (T) jaxWsProxyFactory.create();
final T service = (T) jaxWsProxyFactory.create();
Client client = ClientProxy.getClient(service);
if (client != null) {
HTTPConduit conduit = (HTTPConduit) client.getConduit();
HTTPClientPolicy policy = new HTTPClientPolicy();
log
.info(
String
.format(
"setting connectTimeout to %s, requestTimeout to %s for service %s",
connectTimeout,
requestTimeout,
clazz.getCanonicalName()));
policy.setConnectionTimeout(connectTimeout);
policy.setReceiveTimeout(requestTimeout);
conduit.setClient(policy);
}
return service;
}
}
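A hedged usage sketch; the endpoint below is a placeholder, not a real service URL:
// The stub returned by the factory now carries the 10-minute connect/receive
// timeouts configured on the underlying CXF HTTP conduit.
ISLookUpService lookUp = ISLookupClientFactory
.getLookUpService("http://localhost:8280/is/services/isLookUp");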

View File

@ -5,6 +5,8 @@ import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.apache.commons.lang3.StringUtils;
import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Sequence;
import net.sf.saxon.trans.XPathException;
@ -19,6 +21,8 @@ public class NormalizeDate extends AbstractExtensionFunction {
private static final String normalizeOutFormat = "yyyy-MM-dd'T'hh:mm:ss'Z'";
public static final String BLANK = "";
@Override
public String getName() {
return "normalizeDate";
@ -27,10 +31,10 @@ public class NormalizeDate extends AbstractExtensionFunction {
@Override
public Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException {
if (arguments == null || arguments.length == 0) {
return new StringValue("");
return new StringValue(BLANK);
}
String s = arguments[0].head().getStringValue();
return new StringValue(_year(s));
return new StringValue(_normalizeDate(s));
}
@Override
@ -55,8 +59,8 @@ public class NormalizeDate extends AbstractExtensionFunction {
return SequenceType.SINGLE_STRING;
}
private String _year(String s) {
final String date = s != null ? s.trim() : "";
private String _normalizeDate(String s) {
final String date = StringUtils.isNotBlank(s) ? s.trim() : BLANK;
for (String format : normalizeDateFormats) {
try {
@ -66,6 +70,6 @@ public class NormalizeDate extends AbstractExtensionFunction {
} catch (ParseException e) {
// this format does not match: fall through and try the next one
}
}
return "";
return BLANK;
}
}
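A standalone sketch of the normalization loop above, assuming the normalizeDateFormats array (outside this hunk) holds patterns such as "yyyy-MM-dd"; imports of java.text.ParseException, java.text.SimpleDateFormat and java.util.Date are implied:
static String normalize(String s, String[] formats) {
final String date = StringUtils.isNotBlank(s) ? s.trim() : "";
for (String format : formats) {
try {
final Date d = new SimpleDateFormat(format).parse(date);
return new SimpleDateFormat("yyyy-MM-dd'T'hh:mm:ss'Z'").format(d);
} catch (ParseException e) {
// no match: try the next format
}
}
return ""; // nothing parsed
}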

View File

@ -0,0 +1,109 @@
package eu.dnetlib.dhp.common.api;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.io.IOUtils;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
@Disabled
public class ZenodoAPIClientTest {
private final String URL_STRING = "https://sandbox.zenodo.org/api/deposit/depositions";
private final String ACCESS_TOKEN = "";
private final String CONCEPT_REC_ID = "657113";
private final String depositionId = "674915";
@Test
public void testUploadOldDeposition() throws IOException, MissingConceptDoiException {
ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
ACCESS_TOKEN);
Assertions.assertEquals(200, client.uploadOpenDeposition(depositionId));
File file = new File(getClass()
.getResource("/eu/dnetlib/dhp/common/api/COVID-19.json.gz")
.getPath());
InputStream is = new FileInputStream(file);
Assertions.assertEquals(200, client.uploadIS(is, "COVID-19.json.gz", file.length()));
String metadata = IOUtils.toString(getClass().getResourceAsStream("/eu/dnetlib/dhp/common/api/metadata.json"));
Assertions.assertEquals(200, client.sendMretadata(metadata));
Assertions.assertEquals(202, client.publish());
}
@Test
public void testNewDeposition() throws IOException {
ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
ACCESS_TOKEN);
Assertions.assertEquals(201, client.newDeposition());
File file = new File(getClass()
.getResource("/eu/dnetlib/dhp/common/api/COVID-19.json.gz")
.getPath());
InputStream is = new FileInputStream(file);
Assertions.assertEquals(200, client.uploadIS(is, "COVID-19.json.gz", file.length()));
String metadata = IOUtils.toString(getClass().getResourceAsStream("/eu/dnetlib/dhp/common/api/metadata.json"));
Assertions.assertEquals(200, client.sendMretadata(metadata));
Assertions.assertEquals(202, client.publish());
}
@Test
public void testNewVersionNewName() throws IOException, MissingConceptDoiException {
ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
ACCESS_TOKEN);
Assertions.assertEquals(201, client.newVersion(CONCEPT_REC_ID));
File file = new File(getClass()
.getResource("/eu/dnetlib/dhp/common/api/newVersion")
.getPath());
InputStream is = new FileInputStream(file);
Assertions.assertEquals(200, client.uploadIS(is, "newVersion_deposition", file.length()));
Assertions.assertEquals(202, client.publish());
}
@Test
public void testNewVersionOldName() throws IOException, MissingConceptDoiException {
ZenodoAPIClient client = new ZenodoAPIClient(URL_STRING,
ACCESS_TOKEN);
Assertions.assertEquals(201, client.newVersion(CONCEPT_REC_ID));
File file = new File(getClass()
.getResource("/eu/dnetlib/dhp/common/api/newVersion2")
.getPath());
InputStream is = new FileInputStream(file);
Assertions.assertEquals(200, client.uploadIS(is, "newVersion_deposition", file.length()));
Assertions.assertEquals(202, client.publish());
}
}

View File

@ -0,0 +1,100 @@
package eu.dnetlib.dhp.oa.merge;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.schema.oaf.Author;
import eu.dnetlib.dhp.schema.oaf.Publication;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
import eu.dnetlib.pace.util.MapDocumentUtil;
import scala.Tuple2;
public class AuthorMergerTest {
private String publicationsBasePath;
private List<List<Author>> authors;
@BeforeEach
public void setUp() throws Exception {
publicationsBasePath = Paths
.get(AuthorMergerTest.class.getResource("/eu/dnetlib/dhp/oa/merge").toURI())
.toFile()
.getAbsolutePath();
authors = readSample(publicationsBasePath + "/publications_with_authors.json", Publication.class)
.stream()
.map(p -> p._2().getAuthor())
.collect(Collectors.toList());
}
@Test
public void mergeTest() { // used in the dedup: threshold set to 0.95
for (List<Author> authors1 : authors) {
System.out.println("List " + (authors.indexOf(authors1) + 1));
for (Author author : authors1) {
System.out.println(authorToString(author));
}
}
List<Author> merge = AuthorMerger.merge(authors);
System.out.println("Merge ");
for (Author author : merge) {
System.out.println(authorToString(author));
}
Assertions.assertEquals(7, merge.size());
}
public <T> List<Tuple2<String, T>> readSample(String path, Class<T> clazz) {
List<Tuple2<String, T>> res = new ArrayList<>();
final ObjectMapper mapper = new ObjectMapper();
try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
String line = reader.readLine();
while (line != null) {
res
.add(
new Tuple2<>(
MapDocumentUtil.getJPathString("$.id", line),
mapper.readValue(line, clazz)));
// read next line
line = reader.readLine();
}
} catch (IOException e) {
e.printStackTrace();
}
return res;
}
public String authorToString(Author a) {
String print = "Fullname = ";
print += a.getFullname() + " pid = [";
if (a.getPid() != null)
for (StructuredProperty sp : a.getPid()) {
print += sp.toComparableString() + " ";
}
print += "]";
return print;
}
}

View File

@ -0,0 +1 @@
{"metadata":{"access_right":"open","communities":[{"identifier":"openaire-research-graph"}],"creators":[{"affiliation":"ISTI - CNR","name":"Bardi, Alessia","orcid":"0000-0002-1112-1292"},{"affiliation":"eifl", "name":"Kuchma, Iryna"},{"affiliation":"BIH", "name":"Brobov, Evgeny"},{"affiliation":"GIDIF RBM", "name":"Truccolo, Ivana"},{"affiliation":"unesp", "name":"Monteiro, Elizabete"},{"affiliation":"und", "name":"Casalegno, Carlotta"},{"affiliation":"CARL ABRC", "name":"Clary, Erin"},{"affiliation":"The University of Edimburgh", "name":"Romanowski, Andrew"},{"affiliation":"ISTI - CNR", "name":"Pavone, Gina"},{"affiliation":"ISTI - CNR", "name":"Artini, Michele"},{"affiliation":"ISTI - CNR","name":"Atzori, Claudio","orcid":"0000-0001-9613-6639"},{"affiliation":"University of Bielefeld","name":"Bäcker, Amelie","orcid":"0000-0001-6015-2063"},{"affiliation":"ISTI - CNR","name":"Baglioni, Miriam","orcid":"0000-0002-2273-9004"},{"affiliation":"University of Bielefeld","name":"Czerniak, Andreas","orcid":"0000-0003-3883-4169"},{"affiliation":"ISTI - CNR","name":"De Bonis, Michele"},{"affiliation":"Athena Research and Innovation Centre","name":"Dimitropoulos, Harry"},{"affiliation":"Athena Research and Innovation Centre","name":"Foufoulas, Ioannis"},{"affiliation":"University of Warsaw","name":"Horst, Marek"},{"affiliation":"Athena Research and Innovation Centre","name":"Iatropoulou, Katerina"},{"affiliation":"University of Warsaw","name":"Jacewicz, Przemyslaw"},{"affiliation":"Athena Research and Innovation Centre","name":"Kokogiannaki, Argiro", "orcid":"0000-0002-3880-0244"},{"affiliation":"ISTI - CNR","name":"La Bruzzo, Sandro","orcid":"0000-0003-2855-1245"},{"affiliation":"ISTI - CNR","name":"Lazzeri, Emma"},{"affiliation":"University of Bielefeld","name":"Löhden, Aenne"},{"affiliation":"ISTI - CNR","name":"Manghi, Paolo","orcid":"0000-0001-7291-3210"},{"affiliation":"ISTI - CNR","name":"Mannocci, Andrea","orcid":"0000-0002-5193-7851"},{"affiliation":"Athena Research and Innovation Center","name":"Manola, Natalia"},{"affiliation":"ISTI - CNR","name":"Ottonello, Enrico"},{"affiliation":"University of Bielefeld","name":"Shirrwagen, Jochen"}],"description":"\\u003cp\\u003eThis dump provides access to the metadata records of publications, research data, software and projects that may be relevant to the Corona Virus Disease (COVID-19) fight. The dump contains records of the OpenAIRE COVID-19 Gateway (https://covid-19.openaire.eu/), identified via full-text mining and inference techniques applied to the OpenAIRE Research Graph (https://explore.openaire.eu/). The Graph is one of the largest Open Access collections of metadata records and links between publications, datasets, software, projects, funders, and organizations, aggregating 12,000+ scientific data sources world-wide, among which the Covid-19 data sources Zenodo COVID-19 Community, WHO (World Health Organization), BIP! FInder for COVID-19, Protein Data Bank, Dimensions, scienceOpen, and RSNA. \\u003cp\\u003eThe dump consists of a gzip file containing one json per line. Each json is compliant to the schema available at https://doi.org/10.5281/zenodo.3974226\\u003c/p\\u003e ","title":"OpenAIRE Covid-19 publications, datasets, software and projects metadata.","upload_type":"dataset","version":"1.0"}}

View File

@ -0,0 +1 @@
This is a test for a new deposition

View File

@ -0,0 +1 @@
This is a test for a new version of an old deposition

View File

@ -0,0 +1,2 @@
This is a test for a new version of an old deposition. This should replace the other new version. I expect to have only two
files in the deposition

File diff suppressed because one or more lines are too long

View File

@ -1,11 +0,0 @@
Description of the project
--------------------------
This project defines the **object schemas** of the OpenAIRE main entities and the relationships that hold among them.
Namely, it defines the model for
- **research product (result)**, which subclasses into publication, dataset, other research product, and software
- **data source**, the object describing the data provider (institutional repository, aggregator, CRIS system)
- **organization**, the research bodies managing a data source or participating in a research project
- **project**, the research project
The serializations of such objects (data store files) are used to pass data between workflow nodes in the processing pipeline.

View File

@ -1,42 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>eu.dnetlib.dhp</groupId>
<artifactId>dhp</artifactId>
<version>1.2.4-SNAPSHOT</version>
<relativePath>../</relativePath>
</parent>
<artifactId>dhp-schemas</artifactId>
<packaging>jar</packaging>
<description>This module contains common schema classes meant to be used across the dnet-hadoop submodules</description>
<dependencies>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>
</dependencies>
</project>

View File

@ -1,40 +0,0 @@
package eu.dnetlib.dhp.schema.action;
import java.io.Serializable;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import eu.dnetlib.dhp.schema.oaf.Oaf;
@JsonDeserialize(using = AtomicActionDeserializer.class)
public class AtomicAction<T extends Oaf> implements Serializable {
private Class<T> clazz;
private T payload;
public AtomicAction() {
}
public AtomicAction(Class<T> clazz, T payload) {
this.clazz = clazz;
this.payload = payload;
}
public Class<T> getClazz() {
return clazz;
}
public void setClazz(Class<T> clazz) {
this.clazz = clazz;
}
public T getPayload() {
return payload;
}
public void setPayload(T payload) {
this.payload = payload;
}
}

View File

@ -1,32 +0,0 @@
package eu.dnetlib.dhp.schema.action;
import java.io.IOException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.schema.oaf.Oaf;
public class AtomicActionDeserializer extends JsonDeserializer {
@Override
public Object deserialize(JsonParser jp, DeserializationContext ctxt)
throws IOException {
JsonNode node = jp.getCodec().readTree(jp);
String classTag = node.get("clazz").asText();
JsonNode payload = node.get("payload");
ObjectMapper mapper = new ObjectMapper();
try {
final Class<?> clazz = Class.forName(classTag);
return new AtomicAction(clazz, (Oaf) mapper.readValue(payload.toString(), clazz));
} catch (ClassNotFoundException e) {
throw new IOException(e);
}
}
}
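A hedged round-trip sketch (assumes Jackson's default serializer writes the clazz field as the fully qualified class name, which is exactly what this deserializer reads back):
ObjectMapper mapper = new ObjectMapper();
AtomicAction<Relation> action = new AtomicAction<>(Relation.class, new Relation());
String json = mapper.writeValueAsString(action);
// the @JsonDeserialize annotation on AtomicAction routes parsing through the class above
AtomicAction<?> parsed = mapper.readValue(json, AtomicAction.class);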

View File

@ -1,21 +0,0 @@
package eu.dnetlib.dhp.schema.common;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
/** Actual entity types in the Graph */
public enum EntityType {
publication, dataset, otherresearchproduct, software, datasource, organization, project;
/**
* Resolves the EntityType, given the relative class name
*
* @param clazz the given class name
* @param <T> actual OafEntity subclass
* @return the EntityType associated to the given class
*/
public static <T extends OafEntity> EntityType fromClass(Class<T> clazz) {
return EntityType.valueOf(clazz.getSimpleName().toLowerCase());
}
}
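A quick illustration of the resolution described in the javadoc:
EntityType t = EntityType.fromClass(Publication.class); // EntityType.publication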

View File

@ -1,69 +0,0 @@
package eu.dnetlib.dhp.schema.common;
import java.util.Comparator;
import eu.dnetlib.dhp.schema.oaf.Qualifier;
public class LicenseComparator implements Comparator<Qualifier> {
@Override
public int compare(Qualifier left, Qualifier right) {
if (left == null && right == null)
return 0;
if (left == null)
return 1;
if (right == null)
return -1;
String lClass = left.getClassid();
String rClass = right.getClassid();
if (lClass.equals(rClass))
return 0;
if (lClass.equals("OPEN SOURCE"))
return -1;
if (rClass.equals("OPEN SOURCE"))
return 1;
if (lClass.equals("OPEN"))
return -1;
if (rClass.equals("OPEN"))
return 1;
if (lClass.equals("6MONTHS"))
return -1;
if (rClass.equals("6MONTHS"))
return 1;
if (lClass.equals("12MONTHS"))
return -1;
if (rClass.equals("12MONTHS"))
return 1;
if (lClass.equals("EMBARGO"))
return -1;
if (rClass.equals("EMBARGO"))
return 1;
if (lClass.equals("RESTRICTED"))
return -1;
if (rClass.equals("RESTRICTED"))
return 1;
if (lClass.equals("CLOSED"))
return -1;
if (rClass.equals("CLOSED"))
return 1;
if (lClass.equals("UNKNOWN"))
return -1;
if (rClass.equals("UNKNOWN"))
return 1;
// Else (but unlikely), lexicographical ordering will do.
return lClass.compareTo(rClass);
}
}
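A usage sketch (assumed): picking the most open access right among a set of qualifiers; java.util.Optional and a Collection<Qualifier> named rights are implied.
Optional<Qualifier> best = rights
.stream()
.min(new LicenseComparator()); // "OPEN SOURCE" sorts before "OPEN", ..., "UNKNOWN"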

View File

@ -1,7 +0,0 @@
package eu.dnetlib.dhp.schema.common;
/** Main entity types in the Graph */
public enum MainEntityType {
result, datasource, organization, project
}

View File

@ -1,125 +0,0 @@
package eu.dnetlib.dhp.schema.common;
import eu.dnetlib.dhp.schema.oaf.DataInfo;
import eu.dnetlib.dhp.schema.oaf.KeyValue;
import eu.dnetlib.dhp.schema.oaf.Qualifier;
public class ModelConstants {
public static final String DNET_RESULT_TYPOLOGIES = "dnet:result_typologies";
public static final String DNET_PUBLICATION_RESOURCE = "dnet:publication_resource";
public static final String DNET_ACCESS_MODES = "dnet:access_modes";
public static final String DNET_LANGUAGES = "dnet:languages";
public static final String DNET_PID_TYPES = "dnet:pid_types";
public static final String DNET_DATA_CITE_DATE = "dnet:dataCite_date";
public static final String DNET_DATA_CITE_RESOURCE = "dnet:dataCite_resource";
public static final String DNET_PROVENANCE_ACTIONS = "dnet:provenanceActions";
public static final String DNET_COUNTRY_TYPE = "dnet:countries";
public static final String DNET_REVIEW_LEVELS = "dnet:review_levels";
public static final String SYSIMPORT_CROSSWALK_REPOSITORY = "sysimport:crosswalk:repository";
public static final String SYSIMPORT_CROSSWALK_ENTITYREGISTRY = "sysimport:crosswalk:entityregistry";
public static final String USER_CLAIM = "user:claim";
public static final String DATASET_RESULTTYPE_CLASSID = "dataset";
public static final String PUBLICATION_RESULTTYPE_CLASSID = "publication";
public static final String SOFTWARE_RESULTTYPE_CLASSID = "software";
public static final String ORP_RESULTTYPE_CLASSID = "other";
public static final String RESULT_RESULT = "resultResult";
/**
* @deprecated Use {@link ModelConstants#RELATIONSHIP} instead.
*/
@Deprecated
public static final String PUBLICATION_DATASET = "publicationDataset";
public static final String IS_RELATED_TO = "isRelatedTo";
public static final String SUPPLEMENT = "supplement";
public static final String IS_SUPPLEMENT_TO = "isSupplementTo";
public static final String IS_SUPPLEMENTED_BY = "isSupplementedBy";
public static final String PART = "part";
public static final String IS_PART_OF = "IsPartOf";
public static final String HAS_PARTS = "HasParts";
public static final String RELATIONSHIP = "relationship";
public static final String CITATION = "citation";
public static final String CITES = "cites";
public static final String IS_CITED_BY = "IsCitedBy";
public static final String REVIEW = "review";
public static final String REVIEWS = "reviews";
public static final String IS_REVIEWED_BY = "IsReviewedBy";
public static final String RESULT_PROJECT = "resultProject";
public static final String OUTCOME = "outcome";
public static final String IS_PRODUCED_BY = "isProducedBy";
public static final String PRODUCES = "produces";
public static final String DATASOURCE_ORGANIZATION = "datasourceOrganization";
public static final String PROVISION = "provision";
public static final String IS_PROVIDED_BY = "isProvidedBy";
public static final String PROVIDES = "provides";
public static final String PROJECT_ORGANIZATION = "projectOrganization";
public static final String PARTICIPATION = "participation";
public static final String HAS_PARTICIPANT = "hasParticipant";
public static final String IS_PARTICIPANT = "isParticipant";
public static final String RESULT_ORGANIZATION = "resultOrganization";
public static final String AFFILIATION = "affiliation";
public static final String IS_AUTHOR_INSTITUTION_OF = "isAuthorInstitutionOf";
public static final String HAS_AUTHOR_INSTITUTION = "hasAuthorInstitution";
public static final String MERGES = "merges";
public static final String UNKNOWN = "UNKNOWN";
public static final String NOT_AVAILABLE = "not available";
public static final Qualifier PUBLICATION_DEFAULT_RESULTTYPE = qualifier(
PUBLICATION_RESULTTYPE_CLASSID, PUBLICATION_RESULTTYPE_CLASSID,
DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
public static final Qualifier DATASET_DEFAULT_RESULTTYPE = qualifier(
DATASET_RESULTTYPE_CLASSID, DATASET_RESULTTYPE_CLASSID,
DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
public static final Qualifier SOFTWARE_DEFAULT_RESULTTYPE = qualifier(
SOFTWARE_RESULTTYPE_CLASSID, SOFTWARE_RESULTTYPE_CLASSID,
DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
public static final Qualifier ORP_DEFAULT_RESULTTYPE = qualifier(
ORP_RESULTTYPE_CLASSID, ORP_RESULTTYPE_CLASSID,
DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
public static final Qualifier REPOSITORY_PROVENANCE_ACTIONS = qualifier(
SYSIMPORT_CROSSWALK_REPOSITORY, SYSIMPORT_CROSSWALK_REPOSITORY,
DNET_PROVENANCE_ACTIONS, DNET_PROVENANCE_ACTIONS);
public static final Qualifier ENTITYREGISTRY_PROVENANCE_ACTION = qualifier(
SYSIMPORT_CROSSWALK_ENTITYREGISTRY, SYSIMPORT_CROSSWALK_ENTITYREGISTRY,
DNET_PROVENANCE_ACTIONS, DNET_PROVENANCE_ACTIONS);
public static final KeyValue UNKNOWN_REPOSITORY = keyValue(
"10|openaire____::55045bd2a65019fd8e6741a755395c8c", "Unknown Repository");
private static Qualifier qualifier(
final String classid,
final String classname,
final String schemeid,
final String schemename) {
final Qualifier q = new Qualifier();
q.setClassid(classid);
q.setClassname(classname);
q.setSchemeid(schemeid);
q.setSchemename(schemename);
return q;
}
private static KeyValue keyValue(String key, String value) {
KeyValue kv = new KeyValue();
kv.setKey(key);
kv.setValue(value);
kv.setDataInfo(new DataInfo());
return kv;
}
}
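A sketch of how these constants are typically combined when emitting relations (the Relation setters are assumed from the schema model):
Relation rel = new Relation();
rel.setRelType(ModelConstants.RESULT_PROJECT);
rel.setSubRelType(ModelConstants.OUTCOME);
rel.setRelClass(ModelConstants.IS_PRODUCED_BY);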

View File

@ -1,467 +0,0 @@
package eu.dnetlib.dhp.schema.common;
import static com.google.common.base.Preconditions.checkArgument;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import java.util.function.Function;
import org.apache.commons.lang3.StringUtils;
import com.google.common.collect.Maps;
import eu.dnetlib.dhp.schema.oaf.*;
/** Oaf model utility methods. */
public class ModelSupport {
/** Defines the mapping between the actual entity type and the main entity type */
private static Map<EntityType, MainEntityType> entityMapping = Maps.newHashMap();
static {
entityMapping.put(EntityType.publication, MainEntityType.result);
entityMapping.put(EntityType.dataset, MainEntityType.result);
entityMapping.put(EntityType.otherresearchproduct, MainEntityType.result);
entityMapping.put(EntityType.software, MainEntityType.result);
entityMapping.put(EntityType.datasource, MainEntityType.datasource);
entityMapping.put(EntityType.organization, MainEntityType.organization);
entityMapping.put(EntityType.project, MainEntityType.project);
}
/**
* Defines the mapping between the actual entity types and the relative classes implementing them
*/
public static final Map<EntityType, Class> entityTypes = Maps.newHashMap();
static {
entityTypes.put(EntityType.datasource, Datasource.class);
entityTypes.put(EntityType.organization, Organization.class);
entityTypes.put(EntityType.project, Project.class);
entityTypes.put(EntityType.dataset, Dataset.class);
entityTypes.put(EntityType.otherresearchproduct, OtherResearchProduct.class);
entityTypes.put(EntityType.software, Software.class);
entityTypes.put(EntityType.publication, Publication.class);
}
public static final Map<String, Class> oafTypes = Maps.newHashMap();
static {
oafTypes.put("datasource", Datasource.class);
oafTypes.put("organization", Organization.class);
oafTypes.put("project", Project.class);
oafTypes.put("dataset", Dataset.class);
oafTypes.put("otherresearchproduct", OtherResearchProduct.class);
oafTypes.put("software", Software.class);
oafTypes.put("publication", Publication.class);
oafTypes.put("relation", Relation.class);
}
public static final Map<Class, String> idPrefixMap = Maps.newHashMap();
static {
idPrefixMap.put(Datasource.class, "10");
idPrefixMap.put(Organization.class, "20");
idPrefixMap.put(Project.class, "40");
idPrefixMap.put(Dataset.class, "50");
idPrefixMap.put(OtherResearchProduct.class, "50");
idPrefixMap.put(Software.class, "50");
idPrefixMap.put(Publication.class, "50");
}
public static final Map<String, String> entityIdPrefix = Maps.newHashMap();
static {
entityIdPrefix.put("datasource", "10");
entityIdPrefix.put("organization", "20");
entityIdPrefix.put("project", "40");
entityIdPrefix.put("result", "50");
}
public static final Map<String, RelationInverse> relationInverseMap = Maps.newHashMap();
static {
relationInverseMap
.put(
"personResult_authorship_isAuthorOf", new RelationInverse()
.setRelation("isAuthorOf")
.setInverse("hasAuthor")
.setRelType("personResult")
.setSubReltype("authorship"));
relationInverseMap
.put(
"personResult_authorship_hasAuthor", new RelationInverse()
.setInverse("isAuthorOf")
.setRelation("hasAuthor")
.setRelType("personResult")
.setSubReltype("authorship"));
relationInverseMap
.put(
"projectOrganization_participation_isParticipant", new RelationInverse()
.setRelation("isParticipant")
.setInverse("hasParticipant")
.setRelType("projectOrganization")
.setSubReltype("participation"));
relationInverseMap
.put(
"projectOrganization_participation_hasParticipant", new RelationInverse()
.setInverse("isParticipant")
.setRelation("hasParticipant")
.setRelType("projectOrganization")
.setSubReltype("participation"));
relationInverseMap
.put(
"resultOrganization_affiliation_hasAuthorInstitution", new RelationInverse()
.setRelation("hasAuthorInstitution")
.setInverse("isAuthorInstitutionOf")
.setRelType("resultOrganization")
.setSubReltype("affiliation"));
relationInverseMap
.put(
"resultOrganization_affiliation_isAuthorInstitutionOf", new RelationInverse()
.setInverse("hasAuthorInstitution")
.setRelation("isAuthorInstitutionOf")
.setRelType("resultOrganization")
.setSubReltype("affiliation"));
relationInverseMap
.put(
"organizationOrganization_dedup_merges", new RelationInverse()
.setRelation("merges")
.setInverse("isMergedIn")
.setRelType("organizationOrganization")
.setSubReltype("dedup"));
relationInverseMap
.put(
"organizationOrganization_dedup_isMergedIn", new RelationInverse()
.setInverse("merges")
.setRelation("isMergedIn")
.setRelType("organizationOrganization")
.setSubReltype("dedup"));
relationInverseMap
.put(
"organizationOrganization_dedupSimilarity_isSimilarTo", new RelationInverse()
.setInverse("isSimilarTo")
.setRelation("isSimilarTo")
.setRelType("organizationOrganization")
.setSubReltype("dedupSimilarity"));
relationInverseMap
.put(
"resultProject_outcome_isProducedBy", new RelationInverse()
.setRelation("isProducedBy")
.setInverse("produces")
.setRelType("resultProject")
.setSubReltype("outcome"));
relationInverseMap
.put(
"resultProject_outcome_produces", new RelationInverse()
.setInverse("isProducedBy")
.setRelation("produces")
.setRelType("resultProject")
.setSubReltype("outcome"));
relationInverseMap
.put(
"projectPerson_contactPerson_isContact", new RelationInverse()
.setRelation("isContact")
.setInverse("hasContact")
.setRelType("projectPerson")
.setSubReltype("contactPerson"));
relationInverseMap
.put(
"projectPerson_contactPerson_hasContact", new RelationInverse()
.setInverse("isContact")
.setRelation("hasContact")
.setRelType("personPerson")
.setSubReltype("coAuthorship"));
relationInverseMap
.put(
"personPerson_coAuthorship_isCoauthorOf", new RelationInverse()
.setInverse("isCoAuthorOf")
.setRelation("isCoAuthorOf")
.setRelType("personPerson")
.setSubReltype("coAuthorship"));
relationInverseMap
.put(
"personPerson_dedup_merges", new RelationInverse()
.setInverse("isMergedIn")
.setRelation("merges")
.setRelType("personPerson")
.setSubReltype("dedup"));
relationInverseMap
.put(
"personPerson_dedup_isMergedIn", new RelationInverse()
.setInverse("merges")
.setRelation("isMergedIn")
.setRelType("personPerson")
.setSubReltype("dedup"));
relationInverseMap
.put(
"personPerson_dedupSimilarity_isSimilarTo", new RelationInverse()
.setInverse("isSimilarTo")
.setRelation("isSimilarTo")
.setRelType("personPerson")
.setSubReltype("dedupSimilarity"));
relationInverseMap
.put(
"datasourceOrganization_provision_isProvidedBy", new RelationInverse()
.setInverse("provides")
.setRelation("isProvidedBy")
.setRelType("datasourceOrganization")
.setSubReltype("provision"));
relationInverseMap
.put(
"datasourceOrganization_provision_provides", new RelationInverse()
.setInverse("isProvidedBy")
.setRelation("provides")
.setRelType("datasourceOrganization")
.setSubReltype("provision"));
relationInverseMap
.put(
"resultResult_similarity_hasAmongTopNSimilarDocuments", new RelationInverse()
.setInverse("isAmongTopNSimilarDocuments")
.setRelation("hasAmongTopNSimilarDocuments")
.setRelType("resultResult")
.setSubReltype("similarity"));
relationInverseMap
.put(
"resultResult_similarity_isAmongTopNSimilarDocuments", new RelationInverse()
.setInverse("hasAmongTopNSimilarDocuments")
.setRelation("isAmongTopNSimilarDocuments")
.setRelType("resultResult")
.setSubReltype("similarity"));
relationInverseMap
.put(
"resultResult_relationship_isRelatedTo", new RelationInverse()
.setInverse("isRelatedTo")
.setRelation("isRelatedTo")
.setRelType("resultResult")
.setSubReltype("relationship"));
relationInverseMap
.put(
"resultResult_supplement_isSupplementTo", new RelationInverse()
.setInverse("isSupplementedBy")
.setRelation("isSupplementTo")
.setRelType("resultResult")
.setSubReltype("supplement"));
relationInverseMap
.put(
"resultResult_supplement_isSupplementedBy", new RelationInverse()
.setInverse("isSupplementTo")
.setRelation("isSupplementedBy")
.setRelType("resultResult")
.setSubReltype("supplement"));
relationInverseMap
.put(
"resultResult_part_isPartOf", new RelationInverse()
.setInverse("hasPart")
.setRelation("isPartOf")
.setRelType("resultResult")
.setSubReltype("part"));
relationInverseMap
.put(
"resultResult_part_hasPart", new RelationInverse()
.setInverse("isPartOf")
.setRelation("hasPart")
.setRelType("resultResult")
.setSubReltype("part"));
relationInverseMap
.put(
"resultResult_dedup_merges", new RelationInverse()
.setInverse("isMergedIn")
.setRelation("merges")
.setRelType("resultResult")
.setSubReltype("dedup"));
relationInverseMap
.put(
"resultResult_dedup_isMergedIn", new RelationInverse()
.setInverse("merges")
.setRelation("isMergedIn")
.setRelType("resultResult")
.setSubReltype("dedup"));
relationInverseMap
.put(
"resultResult_dedupSimilarity_isSimilarTo", new RelationInverse()
.setInverse("isSimilarTo")
.setRelation("isSimilarTo")
.setRelType("resultResult")
.setSubReltype("dedupSimilarity"));
}
private static final String schemeTemplate = "dnet:%s_%s_relations";
private ModelSupport() {
}
public static <E extends OafEntity> String getIdPrefix(Class<E> clazz) {
return idPrefixMap.get(clazz);
}
/**
* Checks subclass-superclass relationship.
*
* @param subClazzObject Subclass object instance
* @param superClazzObject Superclass object instance
* @param <X> Subclass type
* @param <Y> Superclass type
* @return True if X is a subclass of Y
*/
public static <X extends Oaf, Y extends Oaf> Boolean isSubClass(
X subClazzObject, Y superClazzObject) {
return isSubClass(subClazzObject.getClass(), superClazzObject.getClass());
}
/**
* Checks subclass-superclass relationship.
*
* @param subClazzObject Subclass object instance
* @param superClazz Superclass class
* @param <X> Subclass type
* @param <Y> Superclass type
* @return True if X is a subclass of Y
*/
public static <X extends Oaf, Y extends Oaf> Boolean isSubClass(
X subClazzObject, Class<Y> superClazz) {
return isSubClass(subClazzObject.getClass(), superClazz);
}
/**
* Checks subclass-superclass relationship.
*
* @param subClazz Subclass class
* @param superClazz Superclass class
* @param <X> Subclass type
* @param <Y> Superclass type
* @return True if X is a subclass of Y
*/
public static <X extends Oaf, Y extends Oaf> Boolean isSubClass(
Class<X> subClazz, Class<Y> superClazz) {
return superClazz.isAssignableFrom(subClazz);
}
/**
* Lists all the OAF model classes
*
* @param <T>
* @return
*/
public static <T extends Oaf> Class<T>[] getOafModelClasses() {
return new Class[] {
Author.class,
Context.class,
Country.class,
DataInfo.class,
Dataset.class,
Datasource.class,
ExternalReference.class,
ExtraInfo.class,
Field.class,
GeoLocation.class,
Instance.class,
Journal.class,
KeyValue.class,
Oaf.class,
OafEntity.class,
OAIProvenance.class,
Organization.class,
OriginDescription.class,
OtherResearchProduct.class,
Project.class,
Publication.class,
Qualifier.class,
Relation.class,
Result.class,
Software.class,
StructuredProperty.class
};
}
public static String getMainType(final EntityType type) {
return entityMapping.get(type).name();
}
public static boolean isResult(EntityType type) {
return MainEntityType.result.name().equals(getMainType(type));
}
public static String getScheme(final String sourceType, final String targetType) {
return String
.format(
schemeTemplate,
entityMapping.get(EntityType.valueOf(sourceType)).name(),
entityMapping.get(EntityType.valueOf(targetType)).name());
}
public static <T extends Oaf> String tableIdentifier(String dbName, String tableName) {
checkArgument(StringUtils.isNotBlank(dbName), "DB name cannot be empty");
checkArgument(StringUtils.isNotBlank(tableName), "table name cannot be empty");
return String.format("%s.%s", dbName, tableName);
}
public static <T extends Oaf> String tableIdentifier(String dbName, Class<T> clazz) {
checkArgument(Objects.nonNull(clazz), "clazz is needed to derive the table name, thus cannot be null");
return tableIdentifier(dbName, clazz.getSimpleName().toLowerCase());
}
public static <T extends Oaf> Function<T, String> idFn() {
return x -> {
if (isSubClass(x, Relation.class)) {
return idFnForRelation(x);
}
return idFnForOafEntity(x);
};
}
private static <T extends Oaf> String idFnForRelation(T t) {
Relation r = (Relation) t;
return Optional
.ofNullable(r.getSource())
.map(
source -> Optional
.ofNullable(r.getTarget())
.map(
target -> Optional
.ofNullable(r.getRelType())
.map(
relType -> Optional
.ofNullable(r.getSubRelType())
.map(
subRelType -> Optional
.ofNullable(r.getRelClass())
.map(
relClass -> String
.join(
source,
target,
relType,
subRelType,
relClass))
.orElse(
String
.join(
source,
target,
relType,
subRelType)))
.orElse(String.join(source, target, relType)))
.orElse(String.join(source, target)))
.orElse(source))
.orElse(null);
}
private static <T extends Oaf> String idFnForOafEntity(T t) {
return ((OafEntity) t).getId();
}
}
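A few hedged usage sketches tying the helpers above together (values are illustrative):
String table = ModelSupport.tableIdentifier("openaire", Publication.class); // "openaire.publication"
Boolean isRes = ModelSupport.isSubClass(Publication.class, Result.class); // true
Function<Oaf, String> id = ModelSupport.idFn(); // entity id, or joined relation fields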

View File

@ -1,46 +0,0 @@
package eu.dnetlib.dhp.schema.common;
public class RelationInverse {
private String relation;
private String inverse;
private String relType;
private String subReltype;
public String getRelType() {
return relType;
}
public RelationInverse setRelType(String relType) {
this.relType = relType;
return this;
}
public String getSubReltype() {
return subReltype;
}
public RelationInverse setSubReltype(String subReltype) {
this.subReltype = subReltype;
return this;
}
public String getRelation() {
return relation;
}
public RelationInverse setRelation(String relation) {
this.relation = relation;
return this;
}
public String getInverse() {
return inverse;
}
public RelationInverse setInverse(String inverse) {
this.inverse = inverse;
return this;
}
}
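Consumers look these beans up through the relType_subRelType_relClass keys of ModelSupport.relationInverseMap above, e.g. (illustrative):
RelationInverse ri = ModelSupport.relationInverseMap.get("resultProject_outcome_isProducedBy");
String inverse = ri.getInverse(); // "produces"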

View File

@ -1,89 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.*;
public class Author implements Serializable {
private String fullname;
private String name;
private String surname;
private Integer rank;
private List<StructuredProperty> pid;
private List<Field<String>> affiliation;
public String getFullname() {
return fullname;
}
public void setFullname(String fullname) {
this.fullname = fullname;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getSurname() {
return surname;
}
public void setSurname(String surname) {
this.surname = surname;
}
public Integer getRank() {
return rank;
}
public void setRank(Integer rank) {
this.rank = rank;
}
public List<StructuredProperty> getPid() {
return pid;
}
public void setPid(List<StructuredProperty> pid) {
this.pid = pid;
}
public List<Field<String>> getAffiliation() {
return affiliation;
}
public void setAffiliation(List<Field<String>> affiliation) {
this.affiliation = affiliation;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Author author = (Author) o;
return Objects.equals(fullname, author.fullname)
&& Objects.equals(name, author.name)
&& Objects.equals(surname, author.surname)
&& Objects.equals(rank, author.rank)
&& Objects.equals(pid, author.pid)
&& Objects.equals(affiliation, author.affiliation);
}
@Override
public int hashCode() {
return Objects.hash(fullname, name, surname, rank, pid, affiliation);
}
}

View File

@ -1,46 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
import java.util.Objects;
public class Context implements Serializable {
private String id;
private List<DataInfo> dataInfo;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public List<DataInfo> getDataInfo() {
return dataInfo;
}
public void setDataInfo(List<DataInfo> dataInfo) {
this.dataInfo = dataInfo;
}
@Override
public int hashCode() {
return id == null ? 0 : id.hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Context other = (Context) obj;
return Objects.equals(id, other.getId());
}
}

View File

@ -1,34 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.util.Objects;
public class Country extends Qualifier {
private DataInfo dataInfo;
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
if (!super.equals(o))
return false;
Country country = (Country) o;
return Objects.equals(dataInfo, country.dataInfo);
}
@Override
public int hashCode() {
return Objects.hash(super.hashCode(), dataInfo);
}
}

View File

@ -1,85 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class DataInfo implements Serializable {
private Boolean invisible = false;
private Boolean inferred;
private Boolean deletedbyinference = false;
private String trust;
private String inferenceprovenance;
private Qualifier provenanceaction;
public Boolean getInvisible() {
return invisible;
}
public void setInvisible(Boolean invisible) {
this.invisible = invisible;
}
public Boolean getInferred() {
return inferred;
}
public void setInferred(Boolean inferred) {
this.inferred = inferred;
}
public Boolean getDeletedbyinference() {
return deletedbyinference;
}
public void setDeletedbyinference(Boolean deletedbyinference) {
this.deletedbyinference = deletedbyinference;
}
public String getTrust() {
return trust;
}
public void setTrust(String trust) {
this.trust = trust;
}
public String getInferenceprovenance() {
return inferenceprovenance;
}
public void setInferenceprovenance(String inferenceprovenance) {
this.inferenceprovenance = inferenceprovenance;
}
public Qualifier getProvenanceaction() {
return provenanceaction;
}
public void setProvenanceaction(Qualifier provenanceaction) {
this.provenanceaction = provenanceaction;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
DataInfo dataInfo = (DataInfo) o;
return Objects.equals(invisible, dataInfo.invisible)
&& Objects.equals(inferred, dataInfo.inferred)
&& Objects.equals(deletedbyinference, dataInfo.deletedbyinference)
&& Objects.equals(trust, dataInfo.trust)
&& Objects.equals(inferenceprovenance, dataInfo.inferenceprovenance)
&& Objects.equals(provenanceaction, dataInfo.provenanceaction);
}
@Override
public int hashCode() {
return Objects
.hash(
invisible, inferred, deletedbyinference, trust, inferenceprovenance, provenanceaction);
}
}

View File

@ -1,116 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
import eu.dnetlib.dhp.schema.common.ModelConstants;
public class Dataset extends Result implements Serializable {
private Field<String> storagedate;
// candidate for removal
private Field<String> device;
private Field<String> size;
private Field<String> version;
private Field<String> lastmetadataupdate;
private Field<String> metadataversionnumber;
private List<GeoLocation> geolocation;
public Dataset() {
setResulttype(ModelConstants.DATASET_DEFAULT_RESULTTYPE);
}
public Field<String> getStoragedate() {
return storagedate;
}
public void setStoragedate(Field<String> storagedate) {
this.storagedate = storagedate;
}
public Field<String> getDevice() {
return device;
}
public void setDevice(Field<String> device) {
this.device = device;
}
public Field<String> getSize() {
return size;
}
public void setSize(Field<String> size) {
this.size = size;
}
public Field<String> getVersion() {
return version;
}
public void setVersion(Field<String> version) {
this.version = version;
}
public Field<String> getLastmetadataupdate() {
return lastmetadataupdate;
}
public void setLastmetadataupdate(Field<String> lastmetadataupdate) {
this.lastmetadataupdate = lastmetadataupdate;
}
public Field<String> getMetadataversionnumber() {
return metadataversionnumber;
}
public void setMetadataversionnumber(Field<String> metadataversionnumber) {
this.metadataversionnumber = metadataversionnumber;
}
public List<GeoLocation> getGeolocation() {
return geolocation;
}
public void setGeolocation(List<GeoLocation> geolocation) {
this.geolocation = geolocation;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Dataset.class.isAssignableFrom(e.getClass())) {
return;
}
final Dataset d = (Dataset) e;
storagedate = d.getStoragedate() != null && compareTrust(this, e) < 0 ? d.getStoragedate() : storagedate;
device = d.getDevice() != null && compareTrust(this, e) < 0 ? d.getDevice() : device;
size = d.getSize() != null && compareTrust(this, e) < 0 ? d.getSize() : size;
version = d.getVersion() != null && compareTrust(this, e) < 0 ? d.getVersion() : version;
lastmetadataupdate = d.getLastmetadataupdate() != null && compareTrust(this, e) < 0
? d.getLastmetadataupdate()
: lastmetadataupdate;
metadataversionnumber = d.getMetadataversionnumber() != null && compareTrust(this, e) < 0
? d.getMetadataversionnumber()
: metadataversionnumber;
geolocation = mergeLists(geolocation, d.getGeolocation());
mergeOAFDataInfo(d);
}
}
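A sketch of the trust-driven merge (assuming compareTrust compares the trust carried by each entity's DataInfo; the field(...) factory below is hypothetical):
Dataset a = new Dataset(); // carries dataInfo with trust "0.5" (illustrative)
Dataset b = new Dataset(); // carries dataInfo with trust "0.9"
b.setVersion(field("2.0")); // hypothetical Field<String> factory
a.mergeFrom(b); // a.getVersion() is now "2.0"; geolocation lists are unioned via mergeLists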

View File

@ -1,472 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
public class Datasource extends OafEntity implements Serializable {
private Qualifier datasourcetype;
private Qualifier openairecompatibility;
private Field<String> officialname;
private Field<String> englishname;
private Field<String> websiteurl;
private Field<String> logourl;
private Field<String> contactemail;
private Field<String> namespaceprefix;
private Field<String> latitude;
private Field<String> longitude;
private Field<String> dateofvalidation;
private Field<String> description;
private List<StructuredProperty> subjects;
// opendoar specific fields (od*)
private Field<String> odnumberofitems;
private Field<String> odnumberofitemsdate;
private Field<String> odpolicies;
private List<Field<String>> odlanguages;
private List<Field<String>> odcontenttypes;
private List<Field<String>> accessinfopackage;
// re3data fields
private Field<String> releasestartdate;
private Field<String> releaseenddate;
private Field<String> missionstatementurl;
private Field<Boolean> dataprovider;
private Field<Boolean> serviceprovider;
// {open, restricted or closed}
private Field<String> databaseaccesstype;
// {open, restricted or closed}
private Field<String> datauploadtype;
// {feeRequired, registration, other}
private Field<String> databaseaccessrestriction;
// {feeRequired, registration, other}
private Field<String> datauploadrestriction;
private Field<Boolean> versioning;
private Field<String> citationguidelineurl;
// {yes, no, unknown}
private Field<String> qualitymanagementkind;
private Field<String> pidsystems;
private Field<String> certificates;
private List<KeyValue> policies;
private Journal journal;
public Qualifier getDatasourcetype() {
return datasourcetype;
}
public void setDatasourcetype(Qualifier datasourcetype) {
this.datasourcetype = datasourcetype;
}
public Qualifier getOpenairecompatibility() {
return openairecompatibility;
}
public void setOpenairecompatibility(Qualifier openairecompatibility) {
this.openairecompatibility = openairecompatibility;
}
public Field<String> getOfficialname() {
return officialname;
}
public void setOfficialname(Field<String> officialname) {
this.officialname = officialname;
}
public Field<String> getEnglishname() {
return englishname;
}
public void setEnglishname(Field<String> englishname) {
this.englishname = englishname;
}
public Field<String> getWebsiteurl() {
return websiteurl;
}
public void setWebsiteurl(Field<String> websiteurl) {
this.websiteurl = websiteurl;
}
public Field<String> getLogourl() {
return logourl;
}
public void setLogourl(Field<String> logourl) {
this.logourl = logourl;
}
public Field<String> getContactemail() {
return contactemail;
}
public void setContactemail(Field<String> contactemail) {
this.contactemail = contactemail;
}
public Field<String> getNamespaceprefix() {
return namespaceprefix;
}
public void setNamespaceprefix(Field<String> namespaceprefix) {
this.namespaceprefix = namespaceprefix;
}
public Field<String> getLatitude() {
return latitude;
}
public void setLatitude(Field<String> latitude) {
this.latitude = latitude;
}
public Field<String> getLongitude() {
return longitude;
}
public void setLongitude(Field<String> longitude) {
this.longitude = longitude;
}
public Field<String> getDateofvalidation() {
return dateofvalidation;
}
public void setDateofvalidation(Field<String> dateofvalidation) {
this.dateofvalidation = dateofvalidation;
}
public Field<String> getDescription() {
return description;
}
public void setDescription(Field<String> description) {
this.description = description;
}
public List<StructuredProperty> getSubjects() {
return subjects;
}
public void setSubjects(List<StructuredProperty> subjects) {
this.subjects = subjects;
}
public Field<String> getOdnumberofitems() {
return odnumberofitems;
}
public void setOdnumberofitems(Field<String> odnumberofitems) {
this.odnumberofitems = odnumberofitems;
}
public Field<String> getOdnumberofitemsdate() {
return odnumberofitemsdate;
}
public void setOdnumberofitemsdate(Field<String> odnumberofitemsdate) {
this.odnumberofitemsdate = odnumberofitemsdate;
}
public Field<String> getOdpolicies() {
return odpolicies;
}
public void setOdpolicies(Field<String> odpolicies) {
this.odpolicies = odpolicies;
}
public List<Field<String>> getOdlanguages() {
return odlanguages;
}
public void setOdlanguages(List<Field<String>> odlanguages) {
this.odlanguages = odlanguages;
}
public List<Field<String>> getOdcontenttypes() {
return odcontenttypes;
}
public void setOdcontenttypes(List<Field<String>> odcontenttypes) {
this.odcontenttypes = odcontenttypes;
}
public List<Field<String>> getAccessinfopackage() {
return accessinfopackage;
}
public void setAccessinfopackage(List<Field<String>> accessinfopackage) {
this.accessinfopackage = accessinfopackage;
}
public Field<String> getReleasestartdate() {
return releasestartdate;
}
public void setReleasestartdate(Field<String> releasestartdate) {
this.releasestartdate = releasestartdate;
}
public Field<String> getReleaseenddate() {
return releaseenddate;
}
public void setReleaseenddate(Field<String> releaseenddate) {
this.releaseenddate = releaseenddate;
}
public Field<String> getMissionstatementurl() {
return missionstatementurl;
}
public void setMissionstatementurl(Field<String> missionstatementurl) {
this.missionstatementurl = missionstatementurl;
}
public Field<Boolean> getDataprovider() {
return dataprovider;
}
public void setDataprovider(Field<Boolean> dataprovider) {
this.dataprovider = dataprovider;
}
public Field<Boolean> getServiceprovider() {
return serviceprovider;
}
public void setServiceprovider(Field<Boolean> serviceprovider) {
this.serviceprovider = serviceprovider;
}
public Field<String> getDatabaseaccesstype() {
return databaseaccesstype;
}
public void setDatabaseaccesstype(Field<String> databaseaccesstype) {
this.databaseaccesstype = databaseaccesstype;
}
public Field<String> getDatauploadtype() {
return datauploadtype;
}
public void setDatauploadtype(Field<String> datauploadtype) {
this.datauploadtype = datauploadtype;
}
public Field<String> getDatabaseaccessrestriction() {
return databaseaccessrestriction;
}
public void setDatabaseaccessrestriction(Field<String> databaseaccessrestriction) {
this.databaseaccessrestriction = databaseaccessrestriction;
}
public Field<String> getDatauploadrestriction() {
return datauploadrestriction;
}
public void setDatauploadrestriction(Field<String> datauploadrestriction) {
this.datauploadrestriction = datauploadrestriction;
}
public Field<Boolean> getVersioning() {
return versioning;
}
public void setVersioning(Field<Boolean> versioning) {
this.versioning = versioning;
}
public Field<String> getCitationguidelineurl() {
return citationguidelineurl;
}
public void setCitationguidelineurl(Field<String> citationguidelineurl) {
this.citationguidelineurl = citationguidelineurl;
}
public Field<String> getQualitymanagementkind() {
return qualitymanagementkind;
}
public void setQualitymanagementkind(Field<String> qualitymanagementkind) {
this.qualitymanagementkind = qualitymanagementkind;
}
public Field<String> getPidsystems() {
return pidsystems;
}
public void setPidsystems(Field<String> pidsystems) {
this.pidsystems = pidsystems;
}
public Field<String> getCertificates() {
return certificates;
}
public void setCertificates(Field<String> certificates) {
this.certificates = certificates;
}
public List<KeyValue> getPolicies() {
return policies;
}
public void setPolicies(List<KeyValue> policies) {
this.policies = policies;
}
public Journal getJournal() {
return journal;
}
public void setJournal(Journal journal) {
this.journal = journal;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Datasource.class.isAssignableFrom(e.getClass())) {
return;
}
Datasource d = (Datasource) e;
datasourcetype = d.getDatasourcetype() != null && compareTrust(this, e) < 0
? d.getDatasourcetype()
: datasourcetype;
openairecompatibility = d.getOpenairecompatibility() != null && compareTrust(this, e) < 0
? d.getOpenairecompatibility()
: openairecompatibility;
officialname = d.getOfficialname() != null && compareTrust(this, e) < 0
? d.getOfficialname()
: officialname;
englishname = d.getEnglishname() != null && compareTrust(this, e) < 0 ? d.getEnglishname() : englishname;
websiteurl = d.getWebsiteurl() != null && compareTrust(this, e) < 0 ? d.getWebsiteurl() : websiteurl;
logourl = d.getLogourl() != null && compareTrust(this, e) < 0 ? d.getLogourl() : logourl;
contactemail = d.getContactemail() != null && compareTrust(this, e) < 0
? d.getContactemail()
: contactemail;
namespaceprefix = d.getNamespaceprefix() != null && compareTrust(this, e) < 0
? d.getNamespaceprefix()
: namespaceprefix;
latitude = d.getLatitude() != null && compareTrust(this, e) < 0 ? d.getLatitude() : latitude;
longitude = d.getLongitude() != null && compareTrust(this, e) < 0 ? d.getLongitude() : longitude;
dateofvalidation = d.getDateofvalidation() != null && compareTrust(this, e) < 0
? d.getDateofvalidation()
: dateofvalidation;
description = d.getDescription() != null && compareTrust(this, e) < 0 ? d.getDescription() : description;
subjects = mergeLists(subjects, d.getSubjects());
// opendoar specific fields (od*)
odnumberofitems = d.getOdnumberofitems() != null && compareTrust(this, e) < 0
? d.getOdnumberofitems()
: odnumberofitems;
odnumberofitemsdate = d.getOdnumberofitemsdate() != null && compareTrust(this, e) < 0
? d.getOdnumberofitemsdate()
: odnumberofitemsdate;
odpolicies = d.getOdpolicies() != null && compareTrust(this, e) < 0 ? d.getOdpolicies() : odpolicies;
odlanguages = mergeLists(odlanguages, d.getOdlanguages());
odcontenttypes = mergeLists(odcontenttypes, d.getOdcontenttypes());
accessinfopackage = mergeLists(accessinfopackage, d.getAccessinfopackage());
// re3data fields
releasestartdate = d.getReleasestartdate() != null && compareTrust(this, e) < 0
? d.getReleasestartdate()
: releasestartdate;
releaseenddate = d.getReleaseenddate() != null && compareTrust(this, e) < 0
? d.getReleaseenddate()
: releaseenddate;
missionstatementurl = d.getMissionstatementurl() != null && compareTrust(this, e) < 0
? d.getMissionstatementurl()
: missionstatementurl;
dataprovider = d.getDataprovider() != null && compareTrust(this, e) < 0
? d.getDataprovider()
: dataprovider;
serviceprovider = d.getServiceprovider() != null && compareTrust(this, e) < 0
? d.getServiceprovider()
: serviceprovider;
// {open, restricted or closed}
databaseaccesstype = d.getDatabaseaccesstype() != null && compareTrust(this, e) < 0
? d.getDatabaseaccesstype()
: databaseaccesstype;
// {open, restricted or closed}
datauploadtype = d.getDatauploadtype() != null && compareTrust(this, e) < 0
? d.getDatauploadtype()
: datauploadtype;
// {feeRequired, registration, other}
databaseaccessrestriction = d.getDatabaseaccessrestriction() != null && compareTrust(this, e) < 0
? d.getDatabaseaccessrestriction()
: databaseaccessrestriction;
// {feeRequired, registration, other}
datauploadrestriction = d.getDatauploadrestriction() != null && compareTrust(this, e) < 0
? d.getDatauploadrestriction()
: datauploadrestriction;
versioning = d.getVersioning() != null && compareTrust(this, e) < 0 ? d.getVersioning() : versioning;
citationguidelineurl = d.getCitationguidelineurl() != null && compareTrust(this, e) < 0
? d.getCitationguidelineurl()
: citationguidelineurl;
// {yes, no, unknown}
qualitymanagementkind = d.getQualitymanagementkind() != null && compareTrust(this, e) < 0
? d.getQualitymanagementkind()
: qualitymanagementkind;
pidsystems = d.getPidsystems() != null && compareTrust(this, e) < 0 ? d.getPidsystems() : pidsystems;
certificates = d.getCertificates() != null && compareTrust(this, e) < 0
? d.getCertificates()
: certificates;
policies = mergeLists(policies, d.getPolicies());
journal = d.getJournal() != null && compareTrust(this, e) < 0 ? d.getJournal() : journal;
mergeOAFDataInfo(e);
}
}

View File

@ -1,119 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class ExternalReference implements Serializable {
// source
private String sitename;
// title
private String label;
// text()
private String url;
// ?? not mapped yet ??
private String description;
// type
private Qualifier qualifier;
// site internal identifier
private String refidentifier;
// maps the oaf:reference/@query attribute
private String query;
// ExternalReferences might be also inferred
private DataInfo dataInfo;
public String getSitename() {
return sitename;
}
public void setSitename(String sitename) {
this.sitename = sitename;
}
public String getLabel() {
return label;
}
public void setLabel(String label) {
this.label = label;
}
public String getUrl() {
return url;
}
public void setUrl(String url) {
this.url = url;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Qualifier getQualifier() {
return qualifier;
}
public void setQualifier(Qualifier qualifier) {
this.qualifier = qualifier;
}
public String getRefidentifier() {
return refidentifier;
}
public void setRefidentifier(String refidentifier) {
this.refidentifier = refidentifier;
}
public String getQuery() {
return query;
}
public void setQuery(String query) {
this.query = query;
}
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ExternalReference that = (ExternalReference) o;
return Objects.equals(sitename, that.sitename)
&& Objects.equals(label, that.label)
&& Objects.equals(url, that.url)
&& Objects.equals(description, that.description)
&& Objects.equals(qualifier, that.qualifier)
&& Objects.equals(refidentifier, that.refidentifier)
&& Objects.equals(query, that.query)
&& Objects.equals(dataInfo, that.dataInfo);
}
@Override
public int hashCode() {
return Objects
.hash(
sitename, label, url, description, qualifier, refidentifier, query, dataInfo);
}
}

@@ -1,77 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class ExtraInfo implements Serializable {
private String name;
private String typology;
private String provenance;
private String trust;
// json containing a Citation or Statistics
private String value;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getTypology() {
return typology;
}
public void setTypology(String typology) {
this.typology = typology;
}
public String getProvenance() {
return provenance;
}
public void setProvenance(String provenance) {
this.provenance = provenance;
}
public String getTrust() {
return trust;
}
public void setTrust(String trust) {
this.trust = trust;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ExtraInfo extraInfo = (ExtraInfo) o;
return Objects.equals(name, extraInfo.name)
&& Objects.equals(typology, extraInfo.typology)
&& Objects.equals(provenance, extraInfo.provenance)
&& Objects.equals(trust, extraInfo.trust)
&& Objects.equals(value, extraInfo.value);
}
@Override
public int hashCode() {
return Objects.hash(name, typology, provenance, trust, value);
}
}

@@ -1,45 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class Field<T> implements Serializable {
private T value;
private DataInfo dataInfo;
public T getValue() {
return value;
}
public void setValue(T value) {
this.value = value;
}
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
@Override
public int hashCode() {
return getValue() == null ? 0 : getValue().hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Field<T> other = (Field<T>) obj;
return Objects.equals(getValue(), other.getValue());
}
}
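
One design point worth noting: Field equality ignores dataInfo entirely, so two values with different provenance still count as duplicates when lists of fields are merged. A quick illustration (the value is made up; a no-arg DataInfo bean constructor is assumed):

import eu.dnetlib.dhp.schema.oaf.DataInfo;
import eu.dnetlib.dhp.schema.oaf.Field;

public class FieldEqualitySketch {
    public static void main(String[] args) {
        Field<String> a = new Field<>();
        a.setValue("CC-BY-4.0");
        a.setDataInfo(new DataInfo()); // provenance differs...

        Field<String> b = new Field<>();
        b.setValue("CC-BY-4.0");       // ...but the wrapped values match

        // equals/hashCode only look at the value, so fields with different
        // provenance deduplicate to a single entry in list merges.
        System.out.println(a.equals(b)); // true
    }
}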

@@ -1,76 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import org.apache.commons.lang3.StringUtils;
import com.fasterxml.jackson.annotation.JsonIgnore;
public class GeoLocation implements Serializable {
private String point;
private String box;
private String place;
public String getPoint() {
return point;
}
public void setPoint(String point) {
this.point = point;
}
public String getBox() {
return box;
}
public void setBox(String box) {
this.box = box;
}
public String getPlace() {
return place;
}
public void setPlace(String place) {
this.place = place;
}
@JsonIgnore
public boolean isBlank() {
return StringUtils.isBlank(point) && StringUtils.isBlank(box) && StringUtils.isBlank(place);
}
public String toComparableString() {
return isBlank()
? ""
: String
.format(
"%s::%s%s",
point != null ? point.toLowerCase() : "",
box != null ? box.toLowerCase() : "",
place != null ? place.toLowerCase() : "");
}
@Override
public int hashCode() {
return toComparableString().hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
GeoLocation other = (GeoLocation) obj;
return toComparableString().equals(other.toComparableString());
}
}
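
GeoLocation is one of several types here (KeyValue, Qualifier and StructuredProperty follow the same scheme) whose equals/hashCode delegate to a lower-cased toComparableString instead of comparing fields directly, so values differing only in case collapse to one entry. A small example with made-up coordinates:

import eu.dnetlib.dhp.schema.oaf.GeoLocation;

public class GeoLocationEqualitySketch {
    public static void main(String[] args) {
        GeoLocation a = new GeoLocation();
        a.setPoint("41.9028 12.4964");
        a.setPlace("Rome");

        GeoLocation b = new GeoLocation();
        b.setPoint("41.9028 12.4964");
        b.setPlace("ROME");

        // equals() delegates to toComparableString(), which lower-cases each
        // part, so the two instances are equal and hash identically.
        System.out.println(a.equals(b));                  // true
        System.out.println(a.hashCode() == b.hashCode()); // true
    }
}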

@@ -1,152 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
public class Instance implements Serializable {
private Field<String> license;
private Qualifier accessright;
private Qualifier instancetype;
private KeyValue hostedby;
private List<String> url;
// other research products specific
private String distributionlocation;
private KeyValue collectedfrom;
private Field<String> dateofacceptance;
// ( article | book ) processing charges. Defined here to cope with possible wrongly typed
// results
private Field<String> processingchargeamount;
// currency - alphabetic code described in ISO-4217. Defined here to cope with possible wrongly
// typed results
private Field<String> processingchargecurrency;
private Qualifier refereed; // peer-review status
public Field<String> getLicense() {
return license;
}
public void setLicense(Field<String> license) {
this.license = license;
}
public Qualifier getAccessright() {
return accessright;
}
public void setAccessright(Qualifier accessright) {
this.accessright = accessright;
}
public Qualifier getInstancetype() {
return instancetype;
}
public void setInstancetype(Qualifier instancetype) {
this.instancetype = instancetype;
}
public KeyValue getHostedby() {
return hostedby;
}
public void setHostedby(KeyValue hostedby) {
this.hostedby = hostedby;
}
public List<String> getUrl() {
return url;
}
public void setUrl(List<String> url) {
this.url = url;
}
public String getDistributionlocation() {
return distributionlocation;
}
public void setDistributionlocation(String distributionlocation) {
this.distributionlocation = distributionlocation;
}
public KeyValue getCollectedfrom() {
return collectedfrom;
}
public void setCollectedfrom(KeyValue collectedfrom) {
this.collectedfrom = collectedfrom;
}
public Field<String> getDateofacceptance() {
return dateofacceptance;
}
public void setDateofacceptance(Field<String> dateofacceptance) {
this.dateofacceptance = dateofacceptance;
}
public Field<String> getProcessingchargeamount() {
return processingchargeamount;
}
public void setProcessingchargeamount(Field<String> processingchargeamount) {
this.processingchargeamount = processingchargeamount;
}
public Field<String> getProcessingchargecurrency() {
return processingchargecurrency;
}
public void setProcessingchargecurrency(Field<String> processingchargecurrency) {
this.processingchargecurrency = processingchargecurrency;
}
public Qualifier getRefereed() {
return refereed;
}
public void setRefereed(Qualifier refereed) {
this.refereed = refereed;
}
public String toComparableString() {
return String
.format(
"%s::%s::%s::%s",
hostedby != null && hostedby.getKey() != null ? hostedby.getKey().toLowerCase() : "",
accessright != null && accessright.getClassid() != null ? accessright.getClassid() : "",
instancetype != null && instancetype.getClassid() != null ? instancetype.getClassid() : "",
url != null ? url : "");
}
@Override
public int hashCode() {
return toComparableString().hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Instance other = (Instance) obj;
return toComparableString().equals(other.toComparableString());
}
}

@@ -1,167 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class Journal implements Serializable {
private String name;
private String issnPrinted;
private String issnOnline;
private String issnLinking;
private String ep;
private String iss;
private String sp;
private String vol;
private String edition;
private String conferenceplace;
private String conferencedate;
private DataInfo dataInfo;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getIssnPrinted() {
return issnPrinted;
}
public void setIssnPrinted(String issnPrinted) {
this.issnPrinted = issnPrinted;
}
public String getIssnOnline() {
return issnOnline;
}
public void setIssnOnline(String issnOnline) {
this.issnOnline = issnOnline;
}
public String getIssnLinking() {
return issnLinking;
}
public void setIssnLinking(String issnLinking) {
this.issnLinking = issnLinking;
}
public String getEp() {
return ep;
}
public void setEp(String ep) {
this.ep = ep;
}
public String getIss() {
return iss;
}
public void setIss(String iss) {
this.iss = iss;
}
public String getSp() {
return sp;
}
public void setSp(String sp) {
this.sp = sp;
}
public String getVol() {
return vol;
}
public void setVol(String vol) {
this.vol = vol;
}
public String getEdition() {
return edition;
}
public void setEdition(String edition) {
this.edition = edition;
}
public String getConferenceplace() {
return conferenceplace;
}
public void setConferenceplace(String conferenceplace) {
this.conferenceplace = conferenceplace;
}
public String getConferencedate() {
return conferencedate;
}
public void setConferencedate(String conferencedate) {
this.conferencedate = conferencedate;
}
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Journal journal = (Journal) o;
return Objects.equals(name, journal.name)
&& Objects.equals(issnPrinted, journal.issnPrinted)
&& Objects.equals(issnOnline, journal.issnOnline)
&& Objects.equals(issnLinking, journal.issnLinking)
&& Objects.equals(ep, journal.ep)
&& Objects.equals(iss, journal.iss)
&& Objects.equals(sp, journal.sp)
&& Objects.equals(vol, journal.vol)
&& Objects.equals(edition, journal.edition)
&& Objects.equals(conferenceplace, journal.conferenceplace)
&& Objects.equals(conferencedate, journal.conferencedate)
&& Objects.equals(dataInfo, journal.dataInfo);
}
@Override
public int hashCode() {
return Objects
.hash(
name,
issnPrinted,
issnOnline,
issnLinking,
ep,
iss,
sp,
vol,
edition,
conferenceplace,
conferencedate,
dataInfo);
}
}

@@ -1,74 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import org.apache.commons.lang3.StringUtils;
import com.fasterxml.jackson.annotation.JsonIgnore;
public class KeyValue implements Serializable {
private String key;
private String value;
private DataInfo dataInfo;
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
public String toComparableString() {
return isBlank()
? ""
: String
.format(
"%s::%s",
key != null ? key.toLowerCase() : "", value != null ? value.toLowerCase() : "");
}
@JsonIgnore
public boolean isBlank() {
return StringUtils.isBlank(key) && StringUtils.isBlank(value);
}
@Override
public int hashCode() {
return toComparableString().hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
KeyValue other = (KeyValue) obj;
return toComparableString().equals(other.toComparableString());
}
}

@@ -1,59 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.util.List;
import com.google.common.base.Objects;
/**
* Represents a measure; it must be further described by a system-available resource providing name and descriptions.
*/
public class Measure {
/**
* Unique measure identifier.
*/
private String id;
/**
* List of units associated with this measure. KeyValue provides a pair to store the label (key) and the value, plus
* common provenance information.
*/
private List<KeyValue> unit;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public List<KeyValue> getUnit() {
return unit;
}
public void setUnit(List<KeyValue> unit) {
this.unit = unit;
}
public void mergeFrom(Measure m) {
// TODO
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Measure measure = (Measure) o;
return Objects.equal(id, measure.id) &&
Objects.equal(unit, measure.unit);
}
@Override
public int hashCode() {
return Objects.hashCode(id, unit);
}
}

@@ -1,33 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class OAIProvenance implements Serializable {
private OriginDescription originDescription;
public OriginDescription getOriginDescription() {
return originDescription;
}
public void setOriginDescription(OriginDescription originDescription) {
this.originDescription = originDescription;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
OAIProvenance that = (OAIProvenance) o;
return Objects.equals(originDescription, that.originDescription);
}
@Override
public int hashCode() {
return Objects.hash(originDescription);
}
}

@@ -1,73 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
import java.util.Objects;
public abstract class Oaf implements Serializable {
/**
* The list of datasource id/name pairs providing this relationship.
*/
protected List<KeyValue> collectedfrom;
private DataInfo dataInfo;
private Long lastupdatetimestamp;
public List<KeyValue> getCollectedfrom() {
return collectedfrom;
}
public void setCollectedfrom(List<KeyValue> collectedfrom) {
this.collectedfrom = collectedfrom;
}
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
public Long getLastupdatetimestamp() {
return lastupdatetimestamp;
}
public void setLastupdatetimestamp(Long lastupdatetimestamp) {
this.lastupdatetimestamp = lastupdatetimestamp;
}
public void mergeOAFDataInfo(Oaf e) {
if (e.getDataInfo() != null && compareTrust(this, e) < 0)
dataInfo = e.getDataInfo();
}
protected String extractTrust(Oaf e) {
if (e == null || e.getDataInfo() == null || e.getDataInfo().getTrust() == null)
return "0.0";
return e.getDataInfo().getTrust();
}
protected int compareTrust(Oaf a, Oaf b) {
return extractTrust(a).compareTo(extractTrust(b));
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Oaf oaf = (Oaf) o;
return Objects.equals(dataInfo, oaf.dataInfo)
&& Objects.equals(lastupdatetimestamp, oaf.lastupdatetimestamp);
}
@Override
public int hashCode() {
return Objects.hash(dataInfo, lastupdatetimestamp);
}
}
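
extractTrust falls back to "0.0" when provenance is missing, and compareTrust is a plain lexicographic String comparison, which behaves numerically for the usual zero-prefixed "0.xx" values but is not a numeric compare in general. A sketch of mergeOAFDataInfo on a concrete Oaf subtype, with invented trust values and a bean-style DataInfo setter assumed:

import eu.dnetlib.dhp.schema.oaf.DataInfo;
import eu.dnetlib.dhp.schema.oaf.Relation;

public class MergeDataInfoSketch {
    private static Relation withTrust(String trust) {
        Relation r = new Relation(); // any concrete Oaf subtype will do
        DataInfo info = new DataInfo();
        info.setTrust(trust);
        r.setDataInfo(info);
        return r;
    }

    public static void main(String[] args) {
        Relation base = withTrust("0.50");
        Relation other = withTrust("0.90");

        base.mergeOAFDataInfo(other);
        // "0.50".compareTo("0.90") < 0, so the other side's DataInfo wins.
        System.out.println(base.getDataInfo().getTrust()); // -> "0.90"

        // A record with no DataInfo at all is treated as trust "0.0" by
        // extractTrust, so it can never replace existing provenance.
    }
}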

@@ -1,130 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.*;
import java.util.stream.Collectors;
public abstract class OafEntity extends Oaf implements Serializable {
private String id;
private List<String> originalId;
private List<StructuredProperty> pid;
private String dateofcollection;
private String dateoftransformation;
private List<ExtraInfo> extraInfo;
private OAIProvenance oaiprovenance;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public List<String> getOriginalId() {
return originalId;
}
public void setOriginalId(List<String> originalId) {
this.originalId = originalId;
}
public List<StructuredProperty> getPid() {
return pid;
}
public void setPid(List<StructuredProperty> pid) {
this.pid = pid;
}
public String getDateofcollection() {
return dateofcollection;
}
public void setDateofcollection(String dateofcollection) {
this.dateofcollection = dateofcollection;
}
public String getDateoftransformation() {
return dateoftransformation;
}
public void setDateoftransformation(String dateoftransformation) {
this.dateoftransformation = dateoftransformation;
}
public List<ExtraInfo> getExtraInfo() {
return extraInfo;
}
public void setExtraInfo(List<ExtraInfo> extraInfo) {
this.extraInfo = extraInfo;
}
public OAIProvenance getOaiprovenance() {
return oaiprovenance;
}
public void setOaiprovenance(OAIProvenance oaiprovenance) {
this.oaiprovenance = oaiprovenance;
}
public void mergeFrom(OafEntity e) {
if (e == null)
return;
originalId = mergeLists(originalId, e.getOriginalId());
collectedfrom = mergeLists(collectedfrom, e.getCollectedfrom());
pid = mergeLists(pid, e.getPid());
if (e.getDateofcollection() != null && compareTrust(this, e) < 0)
dateofcollection = e.getDateofcollection();
if (e.getDateoftransformation() != null && compareTrust(this, e) < 0)
dateoftransformation = e.getDateoftransformation();
extraInfo = mergeLists(extraInfo, e.getExtraInfo());
if (e.getOaiprovenance() != null && compareTrust(this, e) < 0)
oaiprovenance = e.getOaiprovenance();
}
protected <T> List<T> mergeLists(final List<T>... lists) {
return Arrays
.stream(lists)
.filter(Objects::nonNull)
.flatMap(List::stream)
.filter(Objects::nonNull)
.distinct()
.collect(Collectors.toList());
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
if (!super.equals(o))
return false;
OafEntity oafEntity = (OafEntity) o;
return Objects.equals(id, oafEntity.id);
}
@Override
public int hashCode() {
return Objects.hash(super.hashCode(), id);
}
}
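
mergeLists backs every multi-valued field merge in the subclasses: it tolerates null lists and null elements, and deduplicates through each element's equals (which, for most schema types above, means the normalized comparable string). The same logic, reproduced standalone so it can be run directly:

import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class MergeListsSketch {
    // Same logic as OafEntity.mergeLists, copied here to run in isolation.
    @SafeVarargs
    static <T> List<T> mergeLists(final List<T>... lists) {
        return Arrays
            .stream(lists)
            .filter(Objects::nonNull)   // drop whole-list nulls
            .flatMap(List::stream)
            .filter(Objects::nonNull)   // drop null elements
            .distinct()                 // dedup via T.equals
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> a = Arrays.asList("doi", null, "pmid");
        List<String> b = null; // whole-list nulls are tolerated too
        List<String> c = Arrays.asList("pmid", "handle");
        System.out.println(mergeLists(a, b, c)); // [doi, pmid, handle]
    }
}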

@@ -1,214 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
public class Organization extends OafEntity implements Serializable {
private Field<String> legalshortname;
private Field<String> legalname;
private List<Field<String>> alternativeNames;
private Field<String> websiteurl;
private Field<String> logourl;
private Field<String> eclegalbody;
private Field<String> eclegalperson;
private Field<String> ecnonprofit;
private Field<String> ecresearchorganization;
private Field<String> echighereducation;
private Field<String> ecinternationalorganizationeurinterests;
private Field<String> ecinternationalorganization;
private Field<String> ecenterprise;
private Field<String> ecsmevalidated;
private Field<String> ecnutscode;
private Qualifier country;
public Field<String> getLegalshortname() {
return legalshortname;
}
public void setLegalshortname(Field<String> legalshortname) {
this.legalshortname = legalshortname;
}
public Field<String> getLegalname() {
return legalname;
}
public void setLegalname(Field<String> legalname) {
this.legalname = legalname;
}
public List<Field<String>> getAlternativeNames() {
return alternativeNames;
}
public void setAlternativeNames(List<Field<String>> alternativeNames) {
this.alternativeNames = alternativeNames;
}
public Field<String> getWebsiteurl() {
return websiteurl;
}
public void setWebsiteurl(Field<String> websiteurl) {
this.websiteurl = websiteurl;
}
public Field<String> getLogourl() {
return logourl;
}
public void setLogourl(Field<String> logourl) {
this.logourl = logourl;
}
public Field<String> getEclegalbody() {
return eclegalbody;
}
public void setEclegalbody(Field<String> eclegalbody) {
this.eclegalbody = eclegalbody;
}
public Field<String> getEclegalperson() {
return eclegalperson;
}
public void setEclegalperson(Field<String> eclegalperson) {
this.eclegalperson = eclegalperson;
}
public Field<String> getEcnonprofit() {
return ecnonprofit;
}
public void setEcnonprofit(Field<String> ecnonprofit) {
this.ecnonprofit = ecnonprofit;
}
public Field<String> getEcresearchorganization() {
return ecresearchorganization;
}
public void setEcresearchorganization(Field<String> ecresearchorganization) {
this.ecresearchorganization = ecresearchorganization;
}
public Field<String> getEchighereducation() {
return echighereducation;
}
public void setEchighereducation(Field<String> echighereducation) {
this.echighereducation = echighereducation;
}
public Field<String> getEcinternationalorganizationeurinterests() {
return ecinternationalorganizationeurinterests;
}
public void setEcinternationalorganizationeurinterests(
Field<String> ecinternationalorganizationeurinterests) {
this.ecinternationalorganizationeurinterests = ecinternationalorganizationeurinterests;
}
public Field<String> getEcinternationalorganization() {
return ecinternationalorganization;
}
public void setEcinternationalorganization(Field<String> ecinternationalorganization) {
this.ecinternationalorganization = ecinternationalorganization;
}
public Field<String> getEcenterprise() {
return ecenterprise;
}
public void setEcenterprise(Field<String> ecenterprise) {
this.ecenterprise = ecenterprise;
}
public Field<String> getEcsmevalidated() {
return ecsmevalidated;
}
public void setEcsmevalidated(Field<String> ecsmevalidated) {
this.ecsmevalidated = ecsmevalidated;
}
public Field<String> getEcnutscode() {
return ecnutscode;
}
public void setEcnutscode(Field<String> ecnutscode) {
this.ecnutscode = ecnutscode;
}
public Qualifier getCountry() {
return country;
}
public void setCountry(Qualifier country) {
this.country = country;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Organization.class.isAssignableFrom(e.getClass())) {
return;
}
final Organization o = (Organization) e;
legalshortname = o.getLegalshortname() != null && compareTrust(this, e) < 0
? o.getLegalshortname()
: legalshortname;
legalname = o.getLegalname() != null && compareTrust(this, e) < 0 ? o.getLegalname() : legalname;
alternativeNames = mergeLists(o.getAlternativeNames(), alternativeNames);
websiteurl = o.getWebsiteurl() != null && compareTrust(this, e) < 0 ? o.getWebsiteurl() : websiteurl;
logourl = o.getLogourl() != null && compareTrust(this, e) < 0 ? o.getLogourl() : logourl;
eclegalbody = o.getEclegalbody() != null && compareTrust(this, e) < 0 ? o.getEclegalbody() : eclegalbody;
eclegalperson = o.getEclegalperson() != null && compareTrust(this, e) < 0
? o.getEclegalperson()
: eclegalperson;
ecnonprofit = o.getEcnonprofit() != null && compareTrust(this, e) < 0 ? o.getEcnonprofit() : ecnonprofit;
ecresearchorganization = o.getEcresearchorganization() != null && compareTrust(this, e) < 0
? o.getEcresearchorganization()
: ecresearchorganization;
echighereducation = o.getEchighereducation() != null && compareTrust(this, e) < 0
? o.getEchighereducation()
: echighereducation;
ecinternationalorganizationeurinterests = o.getEcinternationalorganizationeurinterests() != null
&& compareTrust(this, e) < 0
? o.getEcinternationalorganizationeurinterests()
: ecinternationalorganizationeurinterests;
ecinternationalorganization = o.getEcinternationalorganization() != null && compareTrust(this, e) < 0
? o.getEcinternationalorganization()
: ecinternationalorganization;
ecenterprise = o.getEcenterprise() != null && compareTrust(this, e) < 0
? o.getEcenterprise()
: ecenterprise;
ecsmevalidated = o.getEcsmevalidated() != null && compareTrust(this, e) < 0
? o.getEcsmevalidated()
: ecsmevalidated;
ecnutscode = o.getEcnutscode() != null && compareTrust(this, e) < 0 ? o.getEcnutscode() : ecnutscode;
country = o.getCountry() != null && compareTrust(this, e) < 0 ? o.getCountry() : country;
mergeOAFDataInfo(o);
}
}

@@ -1,88 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class OriginDescription implements Serializable {
private String harvestDate;
private Boolean altered = true;
private String baseURL;
private String identifier;
private String datestamp;
private String metadataNamespace;
public String getHarvestDate() {
return harvestDate;
}
public void setHarvestDate(String harvestDate) {
this.harvestDate = harvestDate;
}
public Boolean getAltered() {
return altered;
}
public void setAltered(Boolean altered) {
this.altered = altered;
}
public String getBaseURL() {
return baseURL;
}
public void setBaseURL(String baseURL) {
this.baseURL = baseURL;
}
public String getIdentifier() {
return identifier;
}
public void setIdentifier(String identifier) {
this.identifier = identifier;
}
public String getDatestamp() {
return datestamp;
}
public void setDatestamp(String datestamp) {
this.datestamp = datestamp;
}
public String getMetadataNamespace() {
return metadataNamespace;
}
public void setMetadataNamespace(String metadataNamespace) {
this.metadataNamespace = metadataNamespace;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
OriginDescription that = (OriginDescription) o;
return Objects.equals(harvestDate, that.harvestDate)
&& Objects.equals(altered, that.altered)
&& Objects.equals(baseURL, that.baseURL)
&& Objects.equals(identifier, that.identifier)
&& Objects.equals(datestamp, that.datestamp)
&& Objects.equals(metadataNamespace, that.metadataNamespace);
}
@Override
public int hashCode() {
return Objects.hash(harvestDate, altered, baseURL, identifier, datestamp, metadataNamespace);
}
}

@@ -1,60 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
import eu.dnetlib.dhp.schema.common.ModelConstants;
public class OtherResearchProduct extends Result implements Serializable {
private List<Field<String>> contactperson;
private List<Field<String>> contactgroup;
private List<Field<String>> tool;
public OtherResearchProduct() {
setResulttype(ModelConstants.ORP_DEFAULT_RESULTTYPE);
}
public List<Field<String>> getContactperson() {
return contactperson;
}
public void setContactperson(List<Field<String>> contactperson) {
this.contactperson = contactperson;
}
public List<Field<String>> getContactgroup() {
return contactgroup;
}
public void setContactgroup(List<Field<String>> contactgroup) {
this.contactgroup = contactgroup;
}
public List<Field<String>> getTool() {
return tool;
}
public void setTool(List<Field<String>> tool) {
this.tool = tool;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!OtherResearchProduct.class.isAssignableFrom(e.getClass())) {
return;
}
OtherResearchProduct o = (OtherResearchProduct) e;
contactperson = mergeLists(contactperson, o.getContactperson());
contactgroup = mergeLists(contactgroup, o.getContactgroup());
tool = mergeLists(tool, o.getTool());
mergeOAFDataInfo(e);
}
}

@@ -1,38 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.Objects;
public class Programme implements Serializable {
private String code;
private String description;
public String getCode() {
return code;
}
public void setCode(String code) {
this.code = code;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Programme programme = (Programme) o;
return Objects.equals(code, programme.code);
}
}

@@ -1,338 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
public class Project extends OafEntity implements Serializable {
private Field<String> websiteurl;
private Field<String> code;
private Field<String> acronym;
private Field<String> title;
private Field<String> startdate;
private Field<String> enddate;
private Field<String> callidentifier;
private Field<String> keywords;
private Field<String> duration;
private Field<String> ecsc39;
private Field<String> oamandatepublications;
private Field<String> ecarticle29_3;
private List<StructuredProperty> subjects;
private List<Field<String>> fundingtree;
private Qualifier contracttype;
private Field<String> optional1;
private Field<String> optional2;
private Field<String> jsonextrainfo;
private Field<String> contactfullname;
private Field<String> contactfax;
private Field<String> contactphone;
private Field<String> contactemail;
private Field<String> summary;
private Field<String> currency;
private Float totalcost;
private Float fundedamount;
private List<Programme> programme;
public Field<String> getWebsiteurl() {
return websiteurl;
}
public void setWebsiteurl(Field<String> websiteurl) {
this.websiteurl = websiteurl;
}
public Field<String> getCode() {
return code;
}
public void setCode(Field<String> code) {
this.code = code;
}
public Field<String> getAcronym() {
return acronym;
}
public void setAcronym(Field<String> acronym) {
this.acronym = acronym;
}
public Field<String> getTitle() {
return title;
}
public void setTitle(Field<String> title) {
this.title = title;
}
public Field<String> getStartdate() {
return startdate;
}
public void setStartdate(Field<String> startdate) {
this.startdate = startdate;
}
public Field<String> getEnddate() {
return enddate;
}
public void setEnddate(Field<String> enddate) {
this.enddate = enddate;
}
public Field<String> getCallidentifier() {
return callidentifier;
}
public void setCallidentifier(Field<String> callidentifier) {
this.callidentifier = callidentifier;
}
public Field<String> getKeywords() {
return keywords;
}
public void setKeywords(Field<String> keywords) {
this.keywords = keywords;
}
public Field<String> getDuration() {
return duration;
}
public void setDuration(Field<String> duration) {
this.duration = duration;
}
public Field<String> getEcsc39() {
return ecsc39;
}
public void setEcsc39(Field<String> ecsc39) {
this.ecsc39 = ecsc39;
}
public Field<String> getOamandatepublications() {
return oamandatepublications;
}
public void setOamandatepublications(Field<String> oamandatepublications) {
this.oamandatepublications = oamandatepublications;
}
public Field<String> getEcarticle29_3() {
return ecarticle29_3;
}
public void setEcarticle29_3(Field<String> ecarticle29_3) {
this.ecarticle29_3 = ecarticle29_3;
}
public List<StructuredProperty> getSubjects() {
return subjects;
}
public void setSubjects(List<StructuredProperty> subjects) {
this.subjects = subjects;
}
public List<Field<String>> getFundingtree() {
return fundingtree;
}
public void setFundingtree(List<Field<String>> fundingtree) {
this.fundingtree = fundingtree;
}
public Qualifier getContracttype() {
return contracttype;
}
public void setContracttype(Qualifier contracttype) {
this.contracttype = contracttype;
}
public Field<String> getOptional1() {
return optional1;
}
public void setOptional1(Field<String> optional1) {
this.optional1 = optional1;
}
public Field<String> getOptional2() {
return optional2;
}
public void setOptional2(Field<String> optional2) {
this.optional2 = optional2;
}
public Field<String> getJsonextrainfo() {
return jsonextrainfo;
}
public void setJsonextrainfo(Field<String> jsonextrainfo) {
this.jsonextrainfo = jsonextrainfo;
}
public Field<String> getContactfullname() {
return contactfullname;
}
public void setContactfullname(Field<String> contactfullname) {
this.contactfullname = contactfullname;
}
public Field<String> getContactfax() {
return contactfax;
}
public void setContactfax(Field<String> contactfax) {
this.contactfax = contactfax;
}
public Field<String> getContactphone() {
return contactphone;
}
public void setContactphone(Field<String> contactphone) {
this.contactphone = contactphone;
}
public Field<String> getContactemail() {
return contactemail;
}
public void setContactemail(Field<String> contactemail) {
this.contactemail = contactemail;
}
public Field<String> getSummary() {
return summary;
}
public void setSummary(Field<String> summary) {
this.summary = summary;
}
public Field<String> getCurrency() {
return currency;
}
public void setCurrency(Field<String> currency) {
this.currency = currency;
}
public Float getTotalcost() {
return totalcost;
}
public void setTotalcost(Float totalcost) {
this.totalcost = totalcost;
}
public Float getFundedamount() {
return fundedamount;
}
public void setFundedamount(Float fundedamount) {
this.fundedamount = fundedamount;
}
public List<Programme> getProgramme() {
return programme;
}
public void setProgramme(List<Programme> programme) {
this.programme = programme;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Project.class.isAssignableFrom(e.getClass())) {
return;
}
Project p = (Project) e;
websiteurl = p.getWebsiteurl() != null && compareTrust(this, e) < 0 ? p.getWebsiteurl() : websiteurl;
code = p.getCode() != null && compareTrust(this, e) < 0 ? p.getCode() : code;
acronym = p.getAcronym() != null && compareTrust(this, e) < 0 ? p.getAcronym() : acronym;
title = p.getTitle() != null && compareTrust(this, e) < 0 ? p.getTitle() : title;
startdate = p.getStartdate() != null && compareTrust(this, e) < 0 ? p.getStartdate() : startdate;
enddate = p.getEnddate() != null && compareTrust(this, e) < 0 ? p.getEnddate() : enddate;
callidentifier = p.getCallidentifier() != null && compareTrust(this, e) < 0
? p.getCallidentifier()
: callidentifier;
keywords = p.getKeywords() != null && compareTrust(this, e) < 0 ? p.getKeywords() : keywords;
duration = p.getDuration() != null && compareTrust(this, e) < 0 ? p.getDuration() : duration;
ecsc39 = p.getEcsc39() != null && compareTrust(this, e) < 0 ? p.getEcsc39() : ecsc39;
oamandatepublications = p.getOamandatepublications() != null && compareTrust(this, e) < 0
? p.getOamandatepublications()
: oamandatepublications;
ecarticle29_3 = p.getEcarticle29_3() != null && compareTrust(this, e) < 0
? p.getEcarticle29_3()
: ecarticle29_3;
subjects = mergeLists(subjects, p.getSubjects());
fundingtree = mergeLists(fundingtree, p.getFundingtree());
contracttype = p.getContracttype() != null && compareTrust(this, e) < 0
? p.getContracttype()
: contracttype;
optional1 = p.getOptional1() != null && compareTrust(this, e) < 0 ? p.getOptional1() : optional1;
optional2 = p.getOptional2() != null && compareTrust(this, e) < 0 ? p.getOptional2() : optional2;
jsonextrainfo = p.getJsonextrainfo() != null && compareTrust(this, e) < 0
? p.getJsonextrainfo()
: jsonextrainfo;
contactfullname = p.getContactfullname() != null && compareTrust(this, e) < 0
? p.getContactfullname()
: contactfullname;
contactfax = p.getContactfax() != null && compareTrust(this, e) < 0 ? p.getContactfax() : contactfax;
contactphone = p.getContactphone() != null && compareTrust(this, e) < 0
? p.getContactphone()
: contactphone;
contactemail = p.getContactemail() != null && compareTrust(this, e) < 0
? p.getContactemail()
: contactemail;
summary = p.getSummary() != null && compareTrust(this, e) < 0 ? p.getSummary() : summary;
currency = p.getCurrency() != null && compareTrust(this, e) < 0 ? p.getCurrency() : currency;
totalcost = p.getTotalcost() != null && compareTrust(this, e) < 0 ? p.getTotalcost() : totalcost;
fundedamount = p.getFundedamount() != null && compareTrust(this, e) < 0
? p.getFundedamount()
: fundedamount;
programme = mergeLists(programme, p.getProgramme());
mergeOAFDataInfo(e);
}
}

@@ -1,39 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import eu.dnetlib.dhp.schema.common.ModelConstants;
public class Publication extends Result implements Serializable {
// publication specific
private Journal journal;
public Publication() {
setResulttype(ModelConstants.PUBLICATION_DEFAULT_RESULTTYPE);
}
public Journal getJournal() {
return journal;
}
public void setJournal(Journal journal) {
this.journal = journal;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Publication.class.isAssignableFrom(e.getClass())) {
return;
}
Publication p = (Publication) e;
if (p.getJournal() != null && compareTrust(this, e) < 0)
journal = p.getJournal();
mergeOAFDataInfo(e);
}
}

@@ -1,87 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import org.apache.commons.lang3.StringUtils;
import com.fasterxml.jackson.annotation.JsonIgnore;
public class Qualifier implements Serializable {
private String classid;
private String classname;
private String schemeid;
private String schemename;
public String getClassid() {
return classid;
}
public void setClassid(String classid) {
this.classid = classid;
}
public String getClassname() {
return classname;
}
public void setClassname(String classname) {
this.classname = classname;
}
public String getSchemeid() {
return schemeid;
}
public void setSchemeid(String schemeid) {
this.schemeid = schemeid;
}
public String getSchemename() {
return schemename;
}
public void setSchemename(String schemename) {
this.schemename = schemename;
}
public String toComparableString() {
return isBlank()
? ""
: String
.format(
"%s::%s::%s::%s",
classid != null ? classid : "",
classname != null ? classname : "",
schemeid != null ? schemeid : "",
schemename != null ? schemename : "");
}
@JsonIgnore
public boolean isBlank() {
return StringUtils.isBlank(classid)
&& StringUtils.isBlank(classname)
&& StringUtils.isBlank(schemeid)
&& StringUtils.isBlank(schemename);
}
@Override
public int hashCode() {
return toComparableString().hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Qualifier other = (Qualifier) obj;
return toComparableString().equals(other.toComparableString());
}
}

@@ -1,167 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import static com.google.common.base.Preconditions.checkArgument;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;
/**
* Relation models any edge between two nodes in the OpenAIRE graph. It has a source id and a target id pointing to
* graph node identifiers and it is further characterised by the semantics of the link through the fields relType,
* subRelType and relClass. Provenance information is modeled according to the dataInfo element and collectedFrom, while
* individual relationship types can provide extra information via the properties field.
*/
public class Relation extends Oaf {
/**
* Main relationship classifier, values include 'resultResult', 'resultProject', 'resultOrganization', etc.
*/
private String relType;
/**
* Further classifies a relationship, values include 'affiliation', 'similarity', 'supplement', etc.
*/
private String subRelType;
/**
* Indicates the direction of the relationship, values include 'isSupplementTo', 'isSupplementedBy', 'merges',
* 'isMergedIn'.
*/
private String relClass;
/**
* The source entity id.
*/
private String source;
/**
* The target entity id.
*/
private String target;
/**
* Was this relationship authoritatively validated?
*/
private Boolean validated;
/**
* When this relationship was authoritatively validated.
*/
private String validationDate;
/**
* List of relation specific properties. Values include 'similarityLevel', indicating the similarity score between a
* pair of publications.
*/
private List<KeyValue> properties = new ArrayList<>();
public String getRelType() {
return relType;
}
public void setRelType(final String relType) {
this.relType = relType;
}
public String getSubRelType() {
return subRelType;
}
public void setSubRelType(final String subRelType) {
this.subRelType = subRelType;
}
public String getRelClass() {
return relClass;
}
public void setRelClass(final String relClass) {
this.relClass = relClass;
}
public String getSource() {
return source;
}
public void setSource(final String source) {
this.source = source;
}
public String getTarget() {
return target;
}
public void setTarget(final String target) {
this.target = target;
}
public List<KeyValue> getProperties() {
return properties;
}
public void setProperties(List<KeyValue> properties) {
this.properties = properties;
}
public Boolean getValidated() {
return validated;
}
public void setValidated(Boolean validated) {
this.validated = validated;
}
public String getValidationDate() {
return validationDate;
}
public void setValidationDate(String validationDate) {
this.validationDate = validationDate;
}
public void mergeFrom(final Relation r) {
checkArgument(Objects.equals(getSource(), r.getSource()), "source ids must be equal");
checkArgument(Objects.equals(getTarget(), r.getTarget()), "target ids must be equal");
checkArgument(Objects.equals(getRelType(), r.getRelType()), "relType(s) must be equal");
checkArgument(
Objects.equals(getSubRelType(), r.getSubRelType()), "subRelType(s) must be equal");
checkArgument(Objects.equals(getRelClass(), r.getRelClass()), "relClass(es) must be equal");
setCollectedfrom(
Stream
.concat(
Optional
.ofNullable(getCollectedfrom())
.map(Collection::stream)
.orElse(Stream.empty()),
Optional
.ofNullable(r.getCollectedfrom())
.map(Collection::stream)
.orElse(Stream.empty()))
.distinct() // relies on KeyValue.equals
.collect(Collectors.toList()));
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Relation relation = (Relation) o;
return relType.equals(relation.relType)
&& subRelType.equals(relation.subRelType)
&& relClass.equals(relation.relClass)
&& source.equals(relation.source)
&& target.equals(relation.target);
}
@Override
public int hashCode() {
return Objects.hash(relType, subRelType, relClass, source, target, collectedfrom);
}
}
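
Relation.mergeFrom is stricter than the entity merges: both edges must agree on source, target and the full relType/subRelType/relClass triple before collectedfrom is unioned (distinct by KeyValue.equals). A hedged usage sketch with invented identifiers:

import java.util.Collections;

import eu.dnetlib.dhp.schema.oaf.KeyValue;
import eu.dnetlib.dhp.schema.oaf.Relation;

public class RelationMergeSketch {
    private static Relation rel(String cfKey, String cfName) {
        Relation r = new Relation();
        r.setRelType("resultResult");
        r.setSubRelType("supplement");
        r.setRelClass("isSupplementTo");
        r.setSource("50|resultA"); // invented graph node ids
        r.setTarget("50|resultB");
        KeyValue cf = new KeyValue();
        cf.setKey(cfKey);
        cf.setValue(cfName);
        r.setCollectedfrom(Collections.singletonList(cf));
        return r;
    }

    public static void main(String[] args) {
        Relation a = rel("10|dsA", "Datasource A");
        Relation b = rel("10|dsB", "Datasource B");
        a.mergeFrom(b); // same source/target/relType/subRelType/relClass: OK
        System.out.println(a.getCollectedfrom().size()); // -> 2

        // Edges disagreeing on any of the five fields fail the
        // checkArgument preconditions with an IllegalArgumentException.
    }
}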

@@ -1,348 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;
public class Result extends OafEntity implements Serializable {
private List<Measure> measures;
private List<Author> author;
// resulttype allows subclassing results into publications | datasets | software
private Qualifier resulttype;
// common fields
private Qualifier language;
private List<Country> country;
private List<StructuredProperty> subject;
private List<StructuredProperty> title;
private List<StructuredProperty> relevantdate;
private List<Field<String>> description;
private Field<String> dateofacceptance;
private Field<String> publisher;
private Field<String> embargoenddate;
private List<Field<String>> source;
private List<Field<String>> fulltext; // remove candidate
private List<Field<String>> format;
private List<Field<String>> contributor;
private Qualifier resourcetype;
private List<Field<String>> coverage;
private Qualifier bestaccessright;
private List<Context> context;
private List<ExternalReference> externalReference;
private List<Instance> instance;
public List<Measure> getMeasures() {
return measures;
}
public void setMeasures(List<Measure> measures) {
this.measures = measures;
}
public List<Author> getAuthor() {
return author;
}
public void setAuthor(List<Author> author) {
this.author = author;
}
public Qualifier getResulttype() {
return resulttype;
}
public void setResulttype(Qualifier resulttype) {
this.resulttype = resulttype;
}
public Qualifier getLanguage() {
return language;
}
public void setLanguage(Qualifier language) {
this.language = language;
}
public List<Country> getCountry() {
return country;
}
public void setCountry(List<Country> country) {
this.country = country;
}
public List<StructuredProperty> getSubject() {
return subject;
}
public void setSubject(List<StructuredProperty> subject) {
this.subject = subject;
}
public List<StructuredProperty> getTitle() {
return title;
}
public void setTitle(List<StructuredProperty> title) {
this.title = title;
}
public List<StructuredProperty> getRelevantdate() {
return relevantdate;
}
public void setRelevantdate(List<StructuredProperty> relevantdate) {
this.relevantdate = relevantdate;
}
public List<Field<String>> getDescription() {
return description;
}
public void setDescription(List<Field<String>> description) {
this.description = description;
}
public Field<String> getDateofacceptance() {
return dateofacceptance;
}
public void setDateofacceptance(Field<String> dateofacceptance) {
this.dateofacceptance = dateofacceptance;
}
public Field<String> getPublisher() {
return publisher;
}
public void setPublisher(Field<String> publisher) {
this.publisher = publisher;
}
public Field<String> getEmbargoenddate() {
return embargoenddate;
}
public void setEmbargoenddate(Field<String> embargoenddate) {
this.embargoenddate = embargoenddate;
}
public List<Field<String>> getSource() {
return source;
}
public void setSource(List<Field<String>> source) {
this.source = source;
}
public List<Field<String>> getFulltext() {
return fulltext;
}
public void setFulltext(List<Field<String>> fulltext) {
this.fulltext = fulltext;
}
public List<Field<String>> getFormat() {
return format;
}
public void setFormat(List<Field<String>> format) {
this.format = format;
}
public List<Field<String>> getContributor() {
return contributor;
}
public void setContributor(List<Field<String>> contributor) {
this.contributor = contributor;
}
public Qualifier getResourcetype() {
return resourcetype;
}
public void setResourcetype(Qualifier resourcetype) {
this.resourcetype = resourcetype;
}
public List<Field<String>> getCoverage() {
return coverage;
}
public void setCoverage(List<Field<String>> coverage) {
this.coverage = coverage;
}
public Qualifier getBestaccessright() {
return bestaccessright;
}
public void setBestaccessright(Qualifier bestaccessright) {
this.bestaccessright = bestaccessright;
}
public List<Context> getContext() {
return context;
}
public void setContext(List<Context> context) {
this.context = context;
}
public List<ExternalReference> getExternalReference() {
return externalReference;
}
public void setExternalReference(List<ExternalReference> externalReference) {
this.externalReference = externalReference;
}
public List<Instance> getInstance() {
return instance;
}
public void setInstance(List<Instance> instance) {
this.instance = instance;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Result.class.isAssignableFrom(e.getClass())) {
return;
}
Result r = (Result) e;
// TODO consider also merging Measures
instance = mergeLists(instance, r.getInstance());
if (r.getBestaccessright() != null && compareTrust(this, r) < 0)
bestaccessright = r.getBestaccessright();
if (r.getResulttype() != null && compareTrust(this, r) < 0)
resulttype = r.getResulttype();
if (r.getLanguage() != null && compareTrust(this, r) < 0)
language = r.getLanguage();
country = mergeLists(country, r.getCountry());
subject = mergeLists(subject, r.getSubject());
// merge title lists: keep the main title from the record with higher trust, merge the remaining titles distinct
StructuredProperty baseMainTitle = null;
if (title != null) {
baseMainTitle = getMainTitle(title);
if (baseMainTitle != null) {
final StructuredProperty p = baseMainTitle;
title = title.stream().filter(t -> t != p).collect(Collectors.toList());
}
}
StructuredProperty newMainTitle = null;
if (r.getTitle() != null) {
newMainTitle = getMainTitle(r.getTitle());
if (newMainTitle != null) {
final StructuredProperty p = newMainTitle;
r.setTitle(r.getTitle().stream().filter(t -> t != p).collect(Collectors.toList()));
}
}
if (newMainTitle != null && compareTrust(this, r) < 0) {
baseMainTitle = newMainTitle;
}
title = mergeLists(title, r.getTitle());
if (title != null && baseMainTitle != null) {
title.add(baseMainTitle);
}
relevantdate = mergeLists(relevantdate, r.getRelevantdate());
description = longestLists(description, r.getDescription());
if (r.getPublisher() != null && compareTrust(this, r) < 0)
publisher = r.getPublisher();
if (r.getEmbargoenddate() != null && compareTrust(this, r) < 0)
embargoenddate = r.getEmbargoenddate();
source = mergeLists(source, r.getSource());
fulltext = mergeLists(fulltext, r.getFulltext());
format = mergeLists(format, r.getFormat());
contributor = mergeLists(contributor, r.getContributor());
if (r.getResourcetype() != null)
resourcetype = r.getResourcetype();
coverage = mergeLists(coverage, r.getCoverage());
context = mergeLists(context, r.getContext());
externalReference = mergeLists(externalReference, r.getExternalReference());
}
private List<Field<String>> longestLists(List<Field<String>> a, List<Field<String>> b) {
if (a == null || b == null)
return a == null ? b : a;
if (a.size() == b.size()) {
int msa = a
.stream()
.filter(i -> i.getValue() != null)
.map(i -> i.getValue().length())
.max(Comparator.naturalOrder())
.orElse(0);
int msb = b
.stream()
.filter(i -> i.getValue() != null)
.map(i -> i.getValue().length())
.max(Comparator.naturalOrder())
.orElse(0);
return msa > msb ? a : b;
}
return a.size() > b.size() ? a : b;
}
private StructuredProperty getMainTitle(List<StructuredProperty> titles) {
// need to check whether the list of titles contains more than one main title (in that case, we should choose
// which main title to select from the list)
for (StructuredProperty title : titles) {
if (title.getQualifier() != null && title.getQualifier().getClassid() != null)
if (title.getQualifier().getClassid().equals("main title"))
return title;
}
return null;
}
}
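
The title merge deserves a worked example: the "main title" (qualifier classid "main title") is pulled out of both lists first, the one from the higher-trust record is kept, the remaining titles are merged distinct, and the winner is appended back. With no DataInfo on either record, both default to trust "0.0" and the base main title survives. The classids and values below are illustrative:

import java.util.ArrayList;
import java.util.Arrays;

import eu.dnetlib.dhp.schema.oaf.Publication;
import eu.dnetlib.dhp.schema.oaf.Qualifier;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class TitleMergeSketch {
    private static StructuredProperty title(String value, String classid) {
        StructuredProperty sp = new StructuredProperty();
        sp.setValue(value);
        Qualifier q = new Qualifier();
        q.setClassid(classid);
        sp.setQualifier(q);
        return sp;
    }

    public static void main(String[] args) {
        Publication base = new Publication();
        base.setTitle(new ArrayList<>(Arrays.asList(
            title("A Study of X", "main title"),
            title("Estudio de X", "alternative title"))));

        Publication incoming = new Publication();
        incoming.setTitle(new ArrayList<>(Arrays.asList(
            title("A Study of X, version 2", "main title"))));

        base.mergeFrom(incoming);
        // Equal trust ("0.0" vs "0.0"): the base main title is kept, the
        // incoming one is dropped, and the kept one is re-appended last.
        System.out.println(base.getTitle().size()); // -> 2
    }
}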

@@ -1,80 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
import java.util.List;
import eu.dnetlib.dhp.schema.common.ModelConstants;
public class Software extends Result implements Serializable {
private List<Field<String>> documentationUrl;
// candidate for removal
private List<StructuredProperty> license;
// candidate for removal
private Field<String> codeRepositoryUrl;
private Qualifier programmingLanguage;
public Software() {
setResulttype(ModelConstants.SOFTWARE_DEFAULT_RESULTTYPE);
}
public List<Field<String>> getDocumentationUrl() {
return documentationUrl;
}
public void setDocumentationUrl(List<Field<String>> documentationUrl) {
this.documentationUrl = documentationUrl;
}
public List<StructuredProperty> getLicense() {
return license;
}
public void setLicense(List<StructuredProperty> license) {
this.license = license;
}
public Field<String> getCodeRepositoryUrl() {
return codeRepositoryUrl;
}
public void setCodeRepositoryUrl(Field<String> codeRepositoryUrl) {
this.codeRepositoryUrl = codeRepositoryUrl;
}
public Qualifier getProgrammingLanguage() {
return programmingLanguage;
}
public void setProgrammingLanguage(Qualifier programmingLanguage) {
this.programmingLanguage = programmingLanguage;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
if (!Software.class.isAssignableFrom(e.getClass())) {
return;
}
final Software s = (Software) e;
documentationUrl = mergeLists(documentationUrl, s.getDocumentationUrl());
license = mergeLists(license, s.getLicense());
codeRepositoryUrl = s.getCodeRepositoryUrl() != null && compareTrust(this, s) < 0
? s.getCodeRepositoryUrl()
: codeRepositoryUrl;
programmingLanguage = s.getProgrammingLanguage() != null && compareTrust(this, s) < 0
? s.getProgrammingLanguage()
: programmingLanguage;
mergeOAFDataInfo(e);
}
}

@@ -1,60 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.Serializable;
public class StructuredProperty implements Serializable {
private String value;
private Qualifier qualifier;
private DataInfo dataInfo;
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
public Qualifier getQualifier() {
return qualifier;
}
public void setQualifier(Qualifier qualifier) {
this.qualifier = qualifier;
}
public DataInfo getDataInfo() {
return dataInfo;
}
public void setDataInfo(DataInfo dataInfo) {
this.dataInfo = dataInfo;
}
public String toComparableString() {
return value != null ? value.toLowerCase() : "";
}
@Override
public int hashCode() {
return toComparableString().hashCode();
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
StructuredProperty other = (StructuredProperty) obj;
return toComparableString().equals(other.toComparableString());
}
}

@@ -1,89 +0,0 @@
package eu.dnetlib.dhp.schema.scholexplorer;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.lang3.StringUtils;
import eu.dnetlib.dhp.schema.oaf.Dataset;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
public class DLIDataset extends Dataset {
private String originalObjIdentifier;
private List<ProvenaceInfo> dlicollectedfrom;
private String completionStatus;
public String getCompletionStatus() {
return completionStatus;
}
public void setCompletionStatus(String completionStatus) {
this.completionStatus = completionStatus;
}
public List<ProvenaceInfo> getDlicollectedfrom() {
return dlicollectedfrom;
}
public void setDlicollectedfrom(List<ProvenaceInfo> dlicollectedfrom) {
this.dlicollectedfrom = dlicollectedfrom;
}
public String getOriginalObjIdentifier() {
return originalObjIdentifier;
}
public void setOriginalObjIdentifier(String originalObjIdentifier) {
this.originalObjIdentifier = originalObjIdentifier;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
DLIDataset p = (DLIDataset) e;
if (StringUtils.isBlank(completionStatus) && StringUtils.isNotBlank(p.completionStatus))
completionStatus = p.completionStatus;
if ("complete".equalsIgnoreCase(p.completionStatus))
completionStatus = "complete";
dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
}
private List<ProvenaceInfo> mergeProvenance(
final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
Map<String, ProvenaceInfo> result = new HashMap<>();
if (a != null)
a
.forEach(
p -> {
if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
&& StringUtils.isNotBlank(p.getCompletionStatus())) {
result.put(p.getId(), p);
}
} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
result.put(p.getId(), p);
});
if (b != null)
b
.forEach(
p -> {
if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
&& StringUtils.isNotBlank(p.getCompletionStatus())) {
result.put(p.getId(), p);
}
} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
result.put(p.getId(), p);
});
return new ArrayList<>(result.values());
}
}
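
mergeProvenance keys the two provenance lists by id and lets an entry with a non-blank completion status replace an "incomplete" one, while the record-level completionStatus flips to "complete" as soon as either side reports it. A sketch under stated assumptions: ProvenaceInfo is assumed to expose bean-style setters matching the getters used above, and the inherited Dataset merge is assumed null-safe for the untouched fields; the ids are invented:

import java.util.Arrays;

import eu.dnetlib.dhp.schema.scholexplorer.DLIDataset;
import eu.dnetlib.dhp.schema.scholexplorer.ProvenaceInfo;

public class ProvenanceMergeSketch {
    private static ProvenaceInfo prov(String id, String status) {
        ProvenaceInfo p = new ProvenaceInfo(); // bean-style setters assumed
        p.setId(id);
        p.setCompletionStatus(status);
        return p;
    }

    public static void main(String[] args) {
        DLIDataset base = new DLIDataset();
        base.setDlicollectedfrom(Arrays.asList(prov("ds1", "incomplete")));

        DLIDataset incoming = new DLIDataset();
        incoming.setCompletionStatus("complete");
        incoming.setDlicollectedfrom(Arrays.asList(
            prov("ds1", "complete"), prov("ds2", "complete")));

        base.mergeFrom(incoming);
        // ds1 is upgraded from "incomplete" to "complete", ds2 is added,
        // and the record-level status becomes "complete".
        System.out.println(base.getDlicollectedfrom().size()); // -> 2
        System.out.println(base.getCompletionStatus());        // -> complete
    }
}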

@@ -1,87 +0,0 @@
package eu.dnetlib.dhp.schema.scholexplorer;
import java.io.Serializable;
import java.util.*;
import org.apache.commons.lang3.StringUtils;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Publication;
public class DLIPublication extends Publication implements Serializable {
private String originalObjIdentifier;
private List<ProvenaceInfo> dlicollectedfrom;
private String completionStatus;
public String getCompletionStatus() {
return completionStatus;
}
public void setCompletionStatus(String completionStatus) {
this.completionStatus = completionStatus;
}
public List<ProvenaceInfo> getDlicollectedfrom() {
return dlicollectedfrom;
}
public void setDlicollectedfrom(List<ProvenaceInfo> dlicollectedfrom) {
this.dlicollectedfrom = dlicollectedfrom;
}
public String getOriginalObjIdentifier() {
return originalObjIdentifier;
}
public void setOriginalObjIdentifier(String originalObjIdentifier) {
this.originalObjIdentifier = originalObjIdentifier;
}
@Override
public void mergeFrom(OafEntity e) {
super.mergeFrom(e);
DLIPublication p = (DLIPublication) e;
if (StringUtils.isBlank(completionStatus) && StringUtils.isNotBlank(p.completionStatus))
completionStatus = p.completionStatus;
if ("complete".equalsIgnoreCase(p.completionStatus))
completionStatus = "complete";
dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
}
private List<ProvenaceInfo> mergeProvenance(
final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
Map<String, ProvenaceInfo> result = new HashMap<>();
if (a != null)
a
.forEach(
p -> {
if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
&& StringUtils.isNotBlank(p.getCompletionStatus())) {
result.put(p.getId(), p);
}
} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
result.put(p.getId(), p);
});
if (b != null)
b
.forEach(
p -> {
if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
&& StringUtils.isNotBlank(p.getCompletionStatus())) {
result.put(p.getId(), p);
}
} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
result.put(p.getId(), p);
});
return new ArrayList<>(result.values());
}
}

View File

@ -1,30 +0,0 @@
package eu.dnetlib.dhp.schema.scholexplorer;
import java.util.List;
import eu.dnetlib.dhp.schema.oaf.KeyValue;
import eu.dnetlib.dhp.schema.oaf.Relation;
public class DLIRelation extends Relation {
private String dateOfCollection;
private List<KeyValue> collectedFrom;
public List<KeyValue> getCollectedFrom() {
return collectedFrom;
}
public void setCollectedFrom(List<KeyValue> collectedFrom) {
this.collectedFrom = collectedFrom;
}
public String getDateOfCollection() {
return dateOfCollection;
}
public void setDateOfCollection(String dateOfCollection) {
this.dateOfCollection = dateOfCollection;
}
}

View File

@ -1,115 +0,0 @@
package eu.dnetlib.dhp.schema.scholexplorer;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.lang3.StringUtils;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
public class DLIUnknown extends Oaf implements Serializable {
private String id;
private List<StructuredProperty> pid;
private String dateofcollection;
private String dateoftransformation;
private List<ProvenaceInfo> dlicollectedfrom;
private String completionStatus = "incomplete";
public String getCompletionStatus() {
return completionStatus;
}
public void setCompletionStatus(String completionStatus) {
this.completionStatus = completionStatus;
}
public List<ProvenaceInfo> getDlicollectedfrom() {
return dlicollectedfrom;
}
public void setDlicollectedfrom(List<ProvenaceInfo> dlicollectedfrom) {
this.dlicollectedfrom = dlicollectedfrom;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public List<StructuredProperty> getPid() {
return pid;
}
public void setPid(List<StructuredProperty> pid) {
this.pid = pid;
}
public String getDateofcollection() {
return dateofcollection;
}
public void setDateofcollection(String dateofcollection) {
this.dateofcollection = dateofcollection;
}
public String getDateoftransformation() {
return dateoftransformation;
}
public void setDateoftransformation(String dateoftransformation) {
this.dateoftransformation = dateoftransformation;
}
public void mergeFrom(DLIUnknown p) {
if ("complete".equalsIgnoreCase(p.completionStatus))
completionStatus = "complete";
dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
}
private List<ProvenaceInfo> mergeProvenance(
final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
Map<String, ProvenaceInfo> result = new HashMap<>();
if (a != null)
a
.forEach(
p -> {
if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
&& StringUtils.isNotBlank(p.getCompletionStatus())) {
result.put(p.getId(), p);
}
} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
result.put(p.getId(), p);
});
if (b != null)
b
.forEach(
p -> {
if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
&& StringUtils.isNotBlank(p.getCompletionStatus())) {
result.put(p.getId(), p);
}
} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
result.put(p.getId(), p);
});
return new ArrayList<>(result.values());
}
}

View File

@ -1,47 +0,0 @@
package eu.dnetlib.dhp.schema.scholexplorer;
import java.io.Serializable;
public class ProvenaceInfo implements Serializable {
private String id;
private String name;
private String completionStatus;
private String collectionMode = "collected";
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getCompletionStatus() {
return completionStatus;
}
public void setCompletionStatus(String completionStatus) {
this.completionStatus = completionStatus;
}
public String getCollectionMode() {
return collectionMode;
}
public void setCollectionMode(String collectionMode) {
this.collectionMode = collectionMode;
}
}

View File

@ -1,40 +0,0 @@
package eu.dnetlib.dhp.schema.action;
import static org.junit.jupiter.api.Assertions.*;
import java.io.IOException;
import org.apache.commons.lang3.StringUtils;
import org.junit.jupiter.api.Test;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.schema.oaf.Relation;
/** @author claudio.atzori */
public class AtomicActionTest {
@Test
public void serializationTest() throws IOException {
Relation rel = new Relation();
rel.setSource("1");
rel.setTarget("2");
rel.setRelType("resultResult");
rel.setSubRelType("dedup");
rel.setRelClass("merges");
AtomicAction aa1 = new AtomicAction(Relation.class, rel);
final ObjectMapper mapper = new ObjectMapper();
String json = mapper.writeValueAsString(aa1);
assertTrue(StringUtils.isNotBlank(json));
AtomicAction aa2 = mapper.readValue(json, AtomicAction.class);
assertEquals(aa1.getClazz(), aa2.getClazz());
assertEquals(aa1.getPayload(), aa2.getPayload());
}
}

View File

@ -1,37 +0,0 @@
package eu.dnetlib.dhp.schema.common;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Relation;
import eu.dnetlib.dhp.schema.oaf.Result;
public class ModelSupportTest {
@Nested
class IsSubClass {
@Test
public void shouldReturnFalseWhenSubClassDoesNotExtendSuperClass() {
// when
Boolean result = ModelSupport.isSubClass(Relation.class, OafEntity.class);
// then
assertFalse(result);
}
@Test
public void shouldReturnTrueWhenSubClassExtendsSuperClass() {
// when
Boolean result = ModelSupport.isSubClass(Result.class, OafEntity.class);
// then
assertTrue(result);
}
}
}

View File

@ -1,57 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import java.io.IOException;
import java.util.List;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.collect.Lists;
public class MeasureTest {
public static final ObjectMapper OBJECT_MAPPER = new ObjectMapper()
.setSerializationInclusion(JsonInclude.Include.NON_NULL);
@Test
public void testMeasureSerialization() throws IOException {
Measure popularity = new Measure();
popularity.setId("popularity");
popularity
.setUnit(
Lists
.newArrayList(
unit("score", "0.5")));
Measure influence = new Measure();
influence.setId("influence");
influence
.setUnit(
Lists
.newArrayList(
unit("score", "0.3")));
List<Measure> m = Lists.newArrayList(popularity, influence);
String s = OBJECT_MAPPER.writeValueAsString(m);
System.out.println(s);
List<Measure> mm = OBJECT_MAPPER.readValue(s, new TypeReference<List<Measure>>() {
});
Assertions.assertNotNull(mm);
}
private KeyValue unit(String key, String value) {
KeyValue unit = new KeyValue();
unit.setKey(key);
unit.setValue(value);
return unit;
}
}

View File

@ -1,88 +0,0 @@
package eu.dnetlib.dhp.schema.oaf;
import static org.junit.jupiter.api.Assertions.*;
import java.util.Arrays;
import java.util.List;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
public class MergeTest {
OafEntity oaf;
@BeforeEach
public void setUp() {
oaf = new Publication();
}
@Test
public void mergeListsTest() {
// string list merge test
List<String> a = Arrays.asList("a", "b", "c", "e");
List<String> b = Arrays.asList("a", "b", "c", "d");
List<String> c = null;
System.out.println("merge result 1 = " + oaf.mergeLists(a, b));
System.out.println("merge result 2 = " + oaf.mergeLists(a, c));
System.out.println("merge result 3 = " + oaf.mergeLists(c, c));
}
@Test
public void mergePublicationCollectedFromTest() {
Publication a = new Publication();
Publication b = new Publication();
a.setCollectedfrom(Arrays.asList(setKV("a", "open"), setKV("b", "closed")));
b.setCollectedfrom(Arrays.asList(setKV("A", "open"), setKV("b", "Open")));
a.mergeFrom(b);
assertNotNull(a.getCollectedfrom());
assertEquals(3, a.getCollectedfrom().size());
}
@Test
public void mergePublicationSubjectTest() {
Publication a = new Publication();
Publication b = new Publication();
a.setSubject(Arrays.asList(setSP("a", "open", "classe"), setSP("b", "open", "classe")));
b.setSubject(Arrays.asList(setSP("A", "open", "classe"), setSP("c", "open", "classe")));
a.mergeFrom(b);
assertNotNull(a.getSubject());
assertEquals(3, a.getSubject().size());
}
private KeyValue setKV(final String key, final String value) {
KeyValue k = new KeyValue();
k.setKey(key);
k.setValue(value);
return k;
}
private StructuredProperty setSP(
final String value, final String schema, final String classname) {
StructuredProperty s = new StructuredProperty();
s.setValue(value);
Qualifier q = new Qualifier();
q.setClassname(classname);
q.setClassid(classname);
q.setSchemename(schema);
q.setSchemeid(schema);
s.setQualifier(q);
return s;
}
}

View File

@ -1,83 +0,0 @@
package eu.dnetlib.dhp.schema.scholexplorer;
import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;
import org.junit.jupiter.api.Test;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import eu.dnetlib.dhp.schema.oaf.Qualifier;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
public class DLItest {
@Test
public void testMergePublication() throws JsonProcessingException {
DLIPublication a1 = new DLIPublication();
a1.setPid(Arrays.asList(createSP("123456", "pdb", "dnet:pid_types")));
a1.setTitle(Collections.singletonList(createSP("Un Titolo", "title", "dnetTitle")));
a1.setDlicollectedfrom(Arrays.asList(createCollectedFrom("znd", "Zenodo", "complete")));
a1.setCompletionStatus("complete");
DLIPublication a = new DLIPublication();
a
.setPid(
Arrays
.asList(
createSP("10.11", "doi", "dnet:pid_types"),
createSP("123456", "pdb", "dnet:pid_types")));
a.setTitle(Collections.singletonList(createSP("A Title", "title", "dnetTitle")));
a
.setDlicollectedfrom(
Arrays
.asList(
createCollectedFrom("dct", "datacite", "complete"),
createCollectedFrom("dct", "datacite", "incomplete")));
a.setCompletionStatus("incomplete");
a.mergeFrom(a1);
ObjectMapper mapper = new ObjectMapper();
System.out.println(mapper.writeValueAsString(a));
}
@Test
public void testDeserialization() throws IOException {
final String json = "{\"dataInfo\":{\"invisible\":false,\"inferred\":null,\"deletedbyinference\":false,\"trust\":\"0.9\",\"inferenceprovenance\":null,\"provenanceaction\":null},\"lastupdatetimestamp\":null,\"id\":\"60|bd9352547098929a394655ad1a44a479\",\"originalId\":[\"bd9352547098929a394655ad1a44a479\"],\"collectedfrom\":[{\"key\":\"dli_________::datacite\",\"value\":\"Datasets in Datacite\",\"dataInfo\":null,\"blank\":false}],\"pid\":[{\"value\":\"10.7925/DRS1.DUCHAS_5078760\",\"qualifier\":{\"classid\":\"doi\",\"classname\":\"doi\",\"schemeid\":\"dnet:pid_types\",\"schemename\":\"dnet:pid_types\",\"blank\":false},\"dataInfo\":null}],\"dateofcollection\":\"2020-01-09T08:29:31.885Z\",\"dateoftransformation\":null,\"extraInfo\":null,\"oaiprovenance\":null,\"author\":[{\"fullname\":\"Cathail, S. Ó\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Donnell, Breda Mc\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Ireland. Department of Arts, Culture, and the Gaeltacht\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"University College Dublin\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"National Folklore Foundation\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Cathail, S. Ó\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Donnell, Breda Mc\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null}],\"resulttype\":null,\"language\":null,\"country\":null,\"subject\":[{\"value\":\"Recreation\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null},{\"value\":\"Entertainments and recreational activities\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null},{\"value\":\"Siamsaíocht agus caitheamh aimsire\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null}],\"title\":[{\"value\":\"Games We Play\",\"qualifier\":null,\"dataInfo\":null}],\"relevantdate\":[{\"value\":\"1938-09-28\",\"qualifier\":{\"classid\":\"date\",\"classname\":\"date\",\"schemeid\":\"dnet::date\",\"schemename\":\"dnet::date\",\"blank\":false},\"dataInfo\":null}],\"description\":[{\"value\":\"Story collected by Breda Mc Donnell, a student at Tenure school (Tinure, Co. Louth) (no informant identified).\",\"dataInfo\":null}],\"dateofacceptance\":null,\"publisher\":{\"value\":\"University College Dublin\",\"dataInfo\":null},\"embargoenddate\":null,\"source\":null,\"fulltext\":null,\"format\":null,\"contributor\":null,\"resourcetype\":null,\"coverage\":null,\"refereed\":null,\"context\":null,\"processingchargeamount\":null,\"processingchargecurrency\":null,\"externalReference\":null,\"instance\":[],\"storagedate\":null,\"device\":null,\"size\":null,\"version\":null,\"lastmetadataupdate\":null,\"metadataversionnumber\":null,\"geolocation\":null,\"dlicollectedfrom\":[{\"id\":\"dli_________::datacite\",\"name\":\"Datasets in Datacite\",\"completionStatus\":\"complete\",\"collectionMode\":\"resolved\"}],\"completionStatus\":\"complete\"}";
ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
DLIDataset dliDataset = mapper.readValue(json, DLIDataset.class);
mapper.enable(SerializationFeature.INDENT_OUTPUT);
System.out.println(mapper.writeValueAsString(dliDataset));
}
private ProvenaceInfo createCollectedFrom(
final String id, final String name, final String completionStatus) {
ProvenaceInfo p = new ProvenaceInfo();
p.setId(id);
p.setName(name);
p.setCompletionStatus(completionStatus);
return p;
}
private StructuredProperty createSP(
final String value, final String className, final String schemeName) {
StructuredProperty p = new StructuredProperty();
p.setValue(value);
Qualifier schema = new Qualifier();
schema.setClassname(className);
schema.setClassid(className);
schema.setSchemename(schemeName);
schema.setSchemeid(schemeName);
p.setQualifier(schema);
return p;
}
}

View File

@ -59,7 +59,6 @@
<dependency>
<groupId>eu.dnetlib.dhp</groupId>
<artifactId>dhp-schemas</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>

View File

@ -3,20 +3,23 @@ package eu.dnetlib.dhp.actionmanager;
import java.io.Serializable;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.*;
import java.util.stream.Collectors;
import org.apache.commons.lang3.tuple.Triple;
import org.dom4j.Document;
import org.dom4j.DocumentException;
import org.dom4j.Element;
import org.dom4j.io.SAXReader;
import org.jetbrains.annotations.NotNull;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.common.base.Joiner;
import com.google.common.base.Splitter;
import com.google.common.collect.Iterables;
import com.google.common.collect.Lists;
import com.google.common.collect.Sets;
import eu.dnetlib.actionmanager.rmi.ActionManagerException;
import eu.dnetlib.actionmanager.set.ActionManagerSet;
@ -25,6 +28,7 @@ import eu.dnetlib.dhp.actionmanager.partition.PartitionActionSetsByPayloadTypeJo
import eu.dnetlib.dhp.utils.ISLookupClientFactory;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
import scala.Tuple2;
public class ISClient implements Serializable {
@ -40,80 +44,52 @@ public class ISClient implements Serializable {
public List<String> getLatestRawsetPaths(String setIds) {
List<String> ids = Lists
.newArrayList(
final Set<String> ids = Sets
.newHashSet(
Splitter
.on(INPUT_ACTION_SET_ID_SEPARATOR)
.omitEmptyStrings()
.trimResults()
.split(setIds));
return ids
.stream()
.map(id -> getSet(isLookup, id))
.map(as -> as.getPathToLatest())
.collect(Collectors.toCollection(ArrayList::new));
}
private ActionManagerSet getSet(ISLookUpService isLookup, final String setId) {
final String q = "for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') "
+ "where $x//SET/@id = '"
+ setId
+ "' return $x";
try {
final String basePath = getBasePathHDFS(isLookup);
final String setProfile = isLookup.getResourceProfileByQuery(q);
return getActionManagerSet(basePath, setProfile);
} catch (ISLookUpException | ActionManagerException e) {
throw new RuntimeException("Error accessing Sets, using query: " + q);
// <SET id="..." directory="..." latest="xxx"/>
final String xquery = "for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') "
+
"return <SET id='{$x//SET/@id/string()}' directory='{$x//SET/@directory/string()}' latest='{$x//LATEST/@id/string()}'/>";
return Optional
.ofNullable(isLookup.quickSearchProfile(xquery))
.map(
sets -> sets
.stream()
.map(set -> parseSetInfo(set))
.filter(t -> ids.contains(t.getLeft()))
.map(t -> buildDirectory(basePath, t))
.collect(Collectors.toList()))
.orElseThrow(() -> new IllegalStateException("empty set list"));
} catch (ActionManagerException | ISLookUpException e) {
throw new IllegalStateException("unable to query ActionSets info from the IS");
}
}
private ActionManagerSet getActionManagerSet(final String basePath, final String profile)
throws ActionManagerException {
final SAXReader reader = new SAXReader();
final ActionManagerSet set = new ActionManagerSet();
private Triple<String, String, String> parseSetInfo(String set) {
try {
final Document doc = reader.read(new StringReader(profile));
set.setId(doc.valueOf("//SET/@id").trim());
set.setName(doc.valueOf("//SET").trim());
set.setImpact(ImpactTypes.valueOf(doc.valueOf("//IMPACT").trim()));
set
.setLatest(
doc.valueOf("//RAW_SETS/LATEST/@id"),
doc.valueOf("//RAW_SETS/LATEST/@creationDate"),
doc.valueOf("//RAW_SETS/LATEST/@lastUpdate"));
set.setDirectory(doc.valueOf("//SET/@directory"));
final List expiredNodes = doc.selectNodes("//RAW_SETS/EXPIRED");
if (expiredNodes != null) {
for (int i = 0; i < expiredNodes.size(); i++) {
Element ex = (Element) expiredNodes.get(i);
set
.addExpired(
ex.attributeValue("id"),
ex.attributeValue("creationDate"),
ex.attributeValue("lastUpdate"));
}
}
final StringBuilder sb = new StringBuilder();
sb.append(basePath);
sb.append("/");
sb.append(doc.valueOf("//SET/@directory"));
sb.append("/");
sb.append(doc.valueOf("//RAW_SETS/LATEST/@id"));
set.setPathToLatest(sb.toString());
return set;
} catch (Exception e) {
throw new ActionManagerException("Error creating set from profile: " + profile, e);
Document doc = new SAXReader().read(new StringReader(set));
return Triple
.of(
doc.valueOf("//SET/@id"),
doc.valueOf("//SET/@directory"),
doc.valueOf("//SET/@latest"));
} catch (DocumentException e) {
throw new IllegalStateException(e);
}
}
private String buildDirectory(String basePath, Triple<String, String, String> t) {
return Joiner.on("/").join(basePath, t.getMiddle(), t.getRight());
}
private String getBasePathHDFS(ISLookUpService isLookup) throws ActionManagerException {
return queryServiceProperty(isLookup, "basePath");
}
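
To make the rewritten lookup concrete, a hedged sketch of the parse-and-build step (the profile snippet and base path are invented; the xquery above is expected to emit SET elements in exactly this compact form):

import java.io.StringReader;

import org.apache.commons.lang3.tuple.Triple;
import org.dom4j.Document;
import org.dom4j.io.SAXReader;

import com.google.common.base.Joiner;

public class SetInfoSketch {

    public static void main(String[] args) throws Exception {
        // Invented profile in the compact <SET id directory latest> form
        String set = "<SET id='scholexplorer-dump' directory='scholexplorer' latest='rawset_1589'/>";
        Document doc = new SAXReader().read(new StringReader(set));
        Triple<String, String, String> t = Triple
            .of(doc.valueOf("//SET/@id"), doc.valueOf("//SET/@directory"), doc.valueOf("//SET/@latest"));
        // prints "/var/actionsets/scholexplorer/rawset_1589" (base path invented)
        System.out.println(Joiner.on("/").join("/var/actionsets", t.getMiddle(), t.getRight()));
    }
}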

View File

@ -35,7 +35,6 @@
<dependency>
<groupId>eu.dnetlib.dhp</groupId>
<artifactId>dhp-schemas</artifactId>
<version>${project.version}</version>
</dependency>
@ -66,6 +65,19 @@
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml -->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-compress -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
</dependency>
</dependencies>

View File

@ -0,0 +1,28 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
/**
* Class that maps the model of the bipFinder! input data.
* Only needed for deserialization purposes (a usage sketch follows this file)
*/
public class BipDeserialize extends HashMap<String, List<Score>> implements Serializable {
public BipDeserialize() {
super();
}
public List<Score> get(String key) {
if (super.get(key) == null) {
return new ArrayList<>();
}
return super.get(key);
}
}
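
A hedged usage sketch (the payload is invented but follows the shape the class implies: a json object keyed by doi, each value a list of Score records as defined in this package):

import com.fasterxml.jackson.databind.ObjectMapper;

public class BipDeserializeSketch {

    public static void main(String[] args) throws Exception {
        String json = "{\"10.1234/abc\": [{\"id\": \"influence\", "
            + "\"unit\": [{\"key\": \"score\", \"value\": \"0.3\"}]}]}";
        BipDeserialize bip = new ObjectMapper().readValue(json, BipDeserialize.class);
        System.out.println(bip.get("10.1234/abc").size());     // 1
        System.out.println(bip.get("10.9999/missing").size()); // 0: the overridden get() is null-safe
    }
}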

View File

@ -0,0 +1,30 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import java.io.Serializable;
import java.util.List;
/**
* Rewriting of the bipFinder input data by extracting the identifier of the result (doi)
*/
public class BipScore implements Serializable {
private String id; // doi
private List<Score> scoreList; // unit as given in the input file
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public List<Score> getScoreList() {
return scoreList;
}
public void setScoreList(List<Score> scoreList) {
this.scoreList = scoreList;
}
}

View File

@ -0,0 +1,85 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import java.io.Serializable;
import java.util.Optional;
import org.apache.commons.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;
import eu.dnetlib.dhp.schema.oaf.Result;
/**
* Collects all the atomic actions produced for the different result types and saves them under
* outputPath for the ActionSet
*/
public class CollectAndSave implements Serializable {
private static final Logger log = LoggerFactory.getLogger(CollectAndSave.class);
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
public static <I extends Result> void main(String[] args) throws Exception {
String jsonConfiguration = IOUtils
.toString(
CollectAndSave.class
.getResourceAsStream(
"/eu/dnetlib/dhp/actionmanager/bipfinder/input_actionset_parameter.json"));
final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
parser.parseArgument(args);
Boolean isSparkSessionManaged = Optional
.ofNullable(parser.get("isSparkSessionManaged"))
.map(Boolean::valueOf)
.orElse(Boolean.TRUE);
log.info("isSparkSessionManaged: {}", isSparkSessionManaged);
final String inputPath = parser.get("inputPath");
log.info("inputPath {}: ", inputPath);
final String outputPath = parser.get("outputPath");
log.info("outputPath {}: ", outputPath);
SparkConf conf = new SparkConf();
runWithSparkSession(
conf,
isSparkSessionManaged,
spark -> {
removeOutputDir(spark, outputPath);
collectAndSave(spark, inputPath, outputPath);
});
}
private static void collectAndSave(SparkSession spark, String inputPath, String outputPath) {
JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());
sc
.sequenceFile(inputPath + "/publication", Text.class, Text.class)
.union(sc.sequenceFile(inputPath + "/dataset", Text.class, Text.class))
.union(sc.sequenceFile(inputPath + "/otherresearchproduct", Text.class, Text.class))
.union(sc.sequenceFile(inputPath + "/software", Text.class, Text.class))
.saveAsHadoopFile(outputPath, Text.class, Text.class, SequenceFileOutputFormat.class);
}
private static void removeOutputDir(SparkSession spark, String path) {
HdfsSupport.remove(path, spark.sparkContext().hadoopConfiguration());
}
}
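
For completeness, a hedged sketch of reading the collected output back (local master and path are invented; each record value is the json serialization of an AtomicAction):

import org.apache.hadoop.io.Text;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class ReadActionSetSketch {

    public static void main(String[] args) {
        // Local session just for the sketch; the real job runs with the managed session
        SparkSession spark = SparkSession.builder().master("local[*]").appName("sketch").getOrCreate();
        JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());
        sc
            .sequenceFile("/tmp/actionset", Text.class, Text.class) // invented path
            .take(1)
            .forEach(t -> System.out.println(t._2()));
        spark.stop();
    }
}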

View File

@ -0,0 +1,26 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import java.io.Serializable;
public class KeyValue implements Serializable {
private String key;
private String value;
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
}

View File

@ -0,0 +1,28 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import java.io.Serializable;
/**
* Subset of the information from the generic results that is needed to create the atomic action
*/
public class PreparedResult implements Serializable {
private String id; // openaire id
private String value; // doi
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
}

View File

@ -0,0 +1,30 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import java.io.Serializable;
import java.util.List;
/**
* Represents the score in the input file.
*/
public class Score implements Serializable {
private String id;
private List<KeyValue> unit;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public List<KeyValue> getUnit() {
return unit;
}
public void setUnit(List<KeyValue> unit) {
this.unit = unit;
}
}

View File

@ -0,0 +1,200 @@
package eu.dnetlib.dhp.actionmanager.bipfinder;
import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import java.io.Serializable;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.apache.commons.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.api.java.function.MapGroupsFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;
import eu.dnetlib.dhp.schema.action.AtomicAction;
import eu.dnetlib.dhp.schema.oaf.*;
import eu.dnetlib.dhp.schema.oaf.KeyValue;
import scala.Tuple2;
/**
* Creates the atomic action for each type of result (a sketch of the per-result transformation follows this file)
*/
public class SparkAtomicActionScoreJob implements Serializable {
private static final String DOI = "doi";
private static final Logger log = LoggerFactory.getLogger(SparkAtomicActionScoreJob.class);
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
public static <I extends Result> void main(String[] args) throws Exception {
String jsonConfiguration = IOUtils
.toString(
SparkAtomicActionScoreJob.class
.getResourceAsStream(
"/eu/dnetlib/dhp/actionmanager/bipfinder/input_parameters.json"));
final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
parser.parseArgument(args);
Boolean isSparkSessionManaged = Optional
.ofNullable(parser.get("isSparkSessionManaged"))
.map(Boolean::valueOf)
.orElse(Boolean.TRUE);
log.info("isSparkSessionManaged: {}", isSparkSessionManaged);
final String inputPath = parser.get("inputPath");
log.info("inputPath {}: ", inputPath);
final String outputPath = parser.get("outputPath");
log.info("outputPath {}: ", outputPath);
final String bipScorePath = parser.get("bipScorePath");
log.info("bipScorePath: {}", bipScorePath);
final String resultClassName = parser.get("resultTableName");
log.info("resultTableName: {}", resultClassName);
Class<I> inputClazz = (Class<I>) Class.forName(resultClassName);
SparkConf conf = new SparkConf();
runWithSparkSession(
conf,
isSparkSessionManaged,
spark -> {
removeOutputDir(spark, outputPath);
prepareResults(spark, inputPath, outputPath, bipScorePath, inputClazz);
});
}
private static <I extends Result> void prepareResults(SparkSession spark, String inputPath, String outputPath,
String bipScorePath, Class<I> inputClazz) {
final JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());
JavaRDD<BipDeserialize> bipDeserializeJavaRDD = sc
.textFile(bipScorePath)
.map(item -> OBJECT_MAPPER.readValue(item, BipDeserialize.class));
Dataset<BipScore> bipScores = spark
.createDataset(bipDeserializeJavaRDD.flatMap(entry -> entry.keySet().stream().map(key -> {
BipScore bs = new BipScore();
bs.setId(key);
bs.setScoreList(entry.get(key));
return bs;
}).collect(Collectors.toList()).iterator()).rdd(), Encoders.bean(BipScore.class));
log.info("bipScores: {}", bipScores.count());
Dataset<I> results = readPath(spark, inputPath, inputClazz);
results.createOrReplaceTempView("result");
Dataset<PreparedResult> preparedResult = spark
.sql(
"select pIde.value value, id " +
"from result " +
"lateral view explode (pid) p as pIde " +
"where dataInfo.deletedbyinference = false and pIde.qualifier.classid = '" + DOI + "'")
.as(Encoders.bean(PreparedResult.class));
bipScores
.joinWith(
preparedResult, bipScores.col("id").equalTo(preparedResult.col("value")),
"inner")
.map((MapFunction<Tuple2<BipScore, PreparedResult>, BipScore>) value -> {
BipScore ret = value._1();
ret.setId(value._2().getId());
return ret;
}, Encoders.bean(BipScore.class))
.groupByKey((MapFunction<BipScore, String>) value -> value.getId(), Encoders.STRING())
.mapGroups((MapGroupsFunction<String, BipScore, Result>) (k, it) -> {
Result ret = new Result();
ret.setDataInfo(getDataInfo());
BipScore first = it.next();
ret.setId(first.getId());
ret.setMeasures(getMeasure(first));
it.forEachRemaining(value -> ret.getMeasures().addAll(getMeasure(value)));
return ret;
}, Encoders.bean(Result.class))
.toJavaRDD()
.map(p -> new AtomicAction(inputClazz, p))
.mapToPair(
aa -> new Tuple2<>(new Text(aa.getClazz().getCanonicalName()),
new Text(OBJECT_MAPPER.writeValueAsString(aa))))
.saveAsHadoopFile(outputPath, Text.class, Text.class, SequenceFileOutputFormat.class);
}
private static List<Measure> getMeasure(BipScore value) {
return value
.getScoreList()
.stream()
.map(score -> {
Measure m = new Measure();
m.setId(score.getId());
m
.setUnit(
score
.getUnit()
.stream()
.map(unit -> {
KeyValue kv = new KeyValue();
kv.setValue(unit.getValue());
kv.setKey(unit.getKey());
kv.setDataInfo(getDataInfo());
return kv;
})
.collect(Collectors.toList()));
return m;
})
.collect(Collectors.toList());
}
private static DataInfo getDataInfo() {
DataInfo di = new DataInfo();
di.setInferred(false);
di.setInvisible(false);
di.setDeletedbyinference(false);
di.setTrust("");
Qualifier qualifier = new Qualifier();
qualifier.setClassid("sysimport:actionset");
qualifier.setClassname("Harvested");
qualifier.setSchemename("dnet:provenanceActions");
qualifier.setSchemeid("dnet:provenanceActions");
di.setProvenanceaction(qualifier);
return di;
}
private static void removeOutputDir(SparkSession spark, String path) {
HdfsSupport.remove(path, spark.sparkContext().hadoopConfiguration());
}
public static <R> Dataset<R> readPath(
SparkSession spark, String inputPath, Class<R> clazz) {
return spark
.read()
.textFile(inputPath)
.map((MapFunction<String, R>) value -> OBJECT_MAPPER.readValue(value, clazz), Encoders.bean(clazz));
}
}
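
A hedged sketch of the per-result shape the job produces, outside Spark (ids and values are invented): each Score becomes a Measure keeping the same id and units, and the OpenAIRE identifier swapped in by the join replaces the doi.

import java.util.ArrayList;
import java.util.Arrays;

import eu.dnetlib.dhp.schema.oaf.KeyValue;
import eu.dnetlib.dhp.schema.oaf.Measure;
import eu.dnetlib.dhp.schema.oaf.Result;

public class ScoreToMeasureSketch {

    public static void main(String[] args) {
        KeyValue unit = new KeyValue();
        unit.setKey("score");
        unit.setValue("0.3");

        Measure influence = new Measure();
        influence.setId("influence");
        influence.setUnit(Arrays.asList(unit));

        Result r = new Result();
        r.setId("50|doi_________::abc"); // hypothetical OpenAIRE id from the join
        r.setMeasures(new ArrayList<>(Arrays.asList(influence)));
        System.out.println(r.getMeasures().get(0).getId()); // influence
    }
}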

View File

@ -4,11 +4,15 @@ package eu.dnetlib.dhp.actionmanager.project;
import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.*;
import org.slf4j.Logger;
@ -16,11 +20,79 @@ import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.actionmanager.project.csvutils.CSVProgramme;
import eu.dnetlib.dhp.actionmanager.project.utils.CSVProgramme;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;
import scala.Tuple2;
/**
* Among all the programmes provided in the csv file, selects those in the H2020 framework that have an English title.
*
* The title is then processed to get the programme description at a certain level. The set of programme titles is then
* used to associate a classification with the programme.
*
* The programme code describes a hierarchy that can be exploited to provide the classification. To determine the hierarchy
* the code can be split by '.'. If the length of the split code is less than or equal to 2 it can be directly used
* as the classification: H2020-EU -> Horizon 2020 Framework Programme (It will never be repeated),
* H2020-EU.1. -> Excellent science, H2020-EU.2. -> Industrial leadership etc.
*
* The codes are ordered and for all of them the concatenation of all the titles (from the element in position 1 of
* the split code) handled as below is used to create the classification. For example:
*
* H2020-EU.1.1 -> Excellent science | European Research Council (ERC)
* from H2020-EU.1. -> Excellent science and H2020-EU.1.1. -> European Research Council (ERC)
*
* H2020-EU.3.1.3.1. -> Societal challenges | Health, demographic change and well-being | Treating and managing disease | Treating disease, including developing regenerative medicine
* from H2020-EU.3. -> Societal challenges,
* H2020-EU.3.1. -> Health, demographic change and well-being
* H2020-EU.3.1.3 -> Treating and managing disease
* H2020-EU.3.1.3.1. -> Treating disease, including developing regenerative medicine
*
* The classification up to level three, will be split in dedicated variables, while the complete classification will be stored
* in a variable called classification and provided as shown above.
*
* The programme title is not given in a standardized way:
*
* - Sometimes the title associated with a higher level in the hierarchy carries a PRIORITY prefix and sometimes it
* does not. Since it is not uniform, the prefix is removed from the handled titles (a runnable extraction of this
* step is sketched after this file):
*
* H2020-EU.1. -> PRIORITY 'Excellent science'
* H2020-EU.2. -> PRIORITY 'Industrial leadership'
* H2020-EU.3. -> PRIORITY 'Societal challenges'
*
* will become
*
* H2020-EU.1. -> Excellent science
* H2020-EU.2. -> Industrial leadership
* H2020-EU.3. -> Societal challenges
*
* - Sometimes the title of the parent is repeated in the title for the code, but not always; titles associated with
* previous levels in the hierarchy are therefore removed from the code title.
*
* H2020-EU.1.2. -> EXCELLENT SCIENCE - Future and Emerging Technologies (FET)
* H2020-EU.2.2. -> INDUSTRIAL LEADERSHIP - Access to risk finance
* H2020-EU.3.4. -> SOCIETAL CHALLENGES - Smart, Green And Integrated Transport
*
* will become
*
* H2020-EU.1.2. -> Future and Emerging Technologies (FET)
* H2020-EU.2.2. -> Access to risk finance
* H2020-EU.3.4. -> Smart, Green And Integrated Transport
*
* This holds at all levels in the hierarchy. Hence
*
* H2020-EU.2.1.2. -> INDUSTRIAL LEADERSHIP - Leadership in enabling and industrial technologies – Nanotechnologies
*
* will become
*
* H2020-EU.2.1.2. -> Nanotechnologies
*
* - Euratom codes are not given in the same form as the other programmes: H2020-Euratom- instead of H2020-EU. So
* specific code is needed to handle them.
*
*
*
*/
public class PrepareProgramme {
private static final Logger log = LoggerFactory.getLogger(PrepareProgramme.class);
@ -69,49 +141,148 @@ public class PrepareProgramme {
private static void exec(SparkSession spark, String programmePath, String outputPath) {
Dataset<CSVProgramme> programme = readPath(spark, programmePath, CSVProgramme.class);
programme
JavaRDD<CSVProgramme> h2020Programmes = programme
.toJavaRDD()
.filter(p -> !p.getCode().contains("FP7"))
.mapToPair(csvProgramme -> new Tuple2<>(csvProgramme.getCode(), csvProgramme))
.reduceByKey((a, b) -> {
if (StringUtils.isEmpty(a.getShortTitle())) {
if (StringUtils.isEmpty(b.getShortTitle())) {
if (StringUtils.isEmpty(a.getTitle())) {
if (StringUtils.isNotEmpty(b.getTitle())) {
a.setShortTitle(b.getTitle());
a.setLanguage(b.getLanguage());
}
} else { // a.getTitle() is not empty
if (StringUtils.isEmpty(b.getTitle())) {
a.setShortTitle(a.getTitle());
} else {
if (b.getLanguage().equalsIgnoreCase("en")) {
a.setShortTitle(b.getTitle());
a.setLanguage(b.getLanguage());
} else {
a.setShortTitle(a.getTitle());
}
}
}
} else { // b.getShortTitle() is not empty
a.setShortTitle(b.getShortTitle());
// a.setLanguage(b.getLanguage());
if (!a.getLanguage().equals("en")) {
if (b.getLanguage().equalsIgnoreCase("en")) {
a.setTitle(b.getTitle());
a.setLanguage(b.getLanguage());
}
}
if (StringUtils.isEmpty(a.getShortTitle())) {
if (!StringUtils.isEmpty(b.getShortTitle())) {
a.setShortTitle(b.getShortTitle());
}
}
return a;
})
.map(p -> {
CSVProgramme csvProgramme = p._2();
if (StringUtils.isEmpty(csvProgramme.getShortTitle())) {
csvProgramme.setShortTitle(csvProgramme.getTitle());
String programmeTitle = csvProgramme.getTitle().trim();
if (programmeTitle.length() > 8 && programmeTitle.substring(0, 8).equalsIgnoreCase("PRIORITY")) {
programmeTitle = programmeTitle.substring(9);
if (programmeTitle.charAt(0) == '\'') {
programmeTitle = programmeTitle.substring(1);
}
if (programmeTitle.charAt(programmeTitle.length() - 1) == '\'') {
programmeTitle = programmeTitle.substring(0, programmeTitle.length() - 1);
}
csvProgramme.setTitle(programmeTitle);
}
return OBJECT_MAPPER.writeValueAsString(csvProgramme);
return csvProgramme;
});
// prepareClassification(h2020Programmes);
JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
JavaRDD<CSVProgramme> rdd = jsc.parallelize(prepareClassification(h2020Programmes), 1);
rdd
.map(csvProgramme -> OBJECT_MAPPER.writeValueAsString(csvProgramme))
.saveAsTextFile(outputPath);
}
private static List<CSVProgramme> prepareClassification(JavaRDD<CSVProgramme> h2020Programmes) {
Object[] codedescription = h2020Programmes
.map(
value -> new Tuple2<>(value.getCode(),
new Tuple2<String, String>(value.getTitle(), value.getShortTitle())))
.collect()
.toArray();
for (int i = 0; i < codedescription.length - 1; i++) {
for (int j = i + 1; j < codedescription.length; j++) {
Tuple2<String, Tuple2<String, String>> t2i = (Tuple2<String, Tuple2<String, String>>) codedescription[i];
Tuple2<String, Tuple2<String, String>> t2j = (Tuple2<String, Tuple2<String, String>>) codedescription[j];
if (t2i._1().compareTo(t2j._1()) > 0) {
Tuple2<String, Tuple2<String, String>> temp = t2i;
codedescription[i] = t2j;
codedescription[j] = temp;
}
}
}
Map<String, Tuple2<String, String>> map = new HashMap<>();
for (int j = 0; j < codedescription.length; j++) {
Tuple2<String, Tuple2<String, String>> entry = (Tuple2<String, Tuple2<String, String>>) codedescription[j];
String ent = entry._1();
if (ent.contains("Euratom-")) {
ent = ent.replace("-Euratom-", ".Euratom.");
}
String[] tmp = ent.split("\\.");
if (tmp.length <= 2) {
if (StringUtils.isEmpty(entry._2()._2())) {
map.put(entry._1(), new Tuple2<String, String>(entry._2()._1(), entry._2()._1()));
} else {
map.put(entry._1(), entry._2());
}
} else {
if (ent.endsWith(".")) {
ent = ent.substring(0, ent.length() - 1);
}
String key = ent.substring(0, ent.lastIndexOf(".") + 1);
if (key.contains("Euratom")) {
key = key.replace(".Euratom.", "-Euratom-");
ent = ent.replace(".Euratom.", "-Euratom-");
if (key.endsWith("-")) {
key = key.substring(0, key.length() - 1);
}
}
String current = entry._2()._1();
if (!ent.contains("Euratom")) {
String parent;
String tmp_key = tmp[0] + ".";
for (int i = 1; i < tmp.length - 1; i++) {
tmp_key += tmp[i] + ".";
parent = map.get(tmp_key)._1().toLowerCase().trim();
if (parent.contains("|")) {
parent = parent.substring(parent.lastIndexOf("|") + 1).trim();
}
if (current.trim().length() > parent.length()
&& current.toLowerCase().trim().substring(0, parent.length()).equals(parent)) {
current = current.substring(parent.length() + 1);
if (current.trim().charAt(0) == '-' || current.trim().charAt(0) == '–') {
current = current.trim().substring(1).trim();
}
}
}
}
String shortTitle = entry._2()._2();
if (StringUtils.isEmpty(shortTitle)) {
shortTitle = current;
}
Tuple2<String, String> newEntry = new Tuple2<>(map.get(key)._1() + " | " + current,
map.get(key)._2() + " | " + shortTitle);
map.put(ent + ".", newEntry);
}
}
return h2020Programmes.map(csvProgramme -> {
String code = csvProgramme.getCode();
if (!code.endsWith(".") && !code.contains("Euratom")
&& !code.equals("H2020-EC"))
code += ".";
csvProgramme.setClassification(map.get(code)._1());
csvProgramme.setClassification_short(map.get(code)._2());
return csvProgramme;
}).collect();
}
public static <R> Dataset<R> readPath(
SparkSession spark, String inputPath, Class<R> clazz) {
return spark
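
As promised in the class comment, a runnable extraction of the PRIORITY-stripping branch above (the class and helper names are ours):

public class TitleNormalizationSketch {

    // Mirrors the title normalization in PrepareProgramme.exec
    static String stripPriority(String title) {
        String t = title.trim();
        if (t.length() > 8 && t.substring(0, 8).equalsIgnoreCase("PRIORITY")) {
            t = t.substring(9);
            if (t.charAt(0) == '\'')
                t = t.substring(1);
            if (t.charAt(t.length() - 1) == '\'')
                t = t.substring(0, t.length() - 1);
        }
        return t;
    }

    public static void main(String[] args) {
        // prints "Excellent science"
        System.out.println(stripPriority("PRIORITY 'Excellent science'"));
    }
}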

View File

@ -6,9 +6,7 @@ import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import java.util.*;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
@ -20,17 +18,21 @@ import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.actionmanager.project.csvutils.CSVProgramme;
import eu.dnetlib.dhp.actionmanager.project.csvutils.CSVProject;
import eu.dnetlib.dhp.actionmanager.project.utils.CSVProject;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;
import scala.Tuple2;
/**
* Selects only the relevant information collected for the projects: project grant agreement, project programme code and
* project topic code, for the projects that are also collected from OpenAIRE.
*/
public class PrepareProjects {
private static final Logger log = LoggerFactory.getLogger(PrepareProjects.class);
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
private static final HashMap<String, CSVProgramme> programmeMap = new HashMap<>();
public static void main(String[] args) throws Exception {
@ -97,10 +99,14 @@ public class PrepareProjects {
if (csvProject.isPresent()) {
String[] programme = csvProject.get().getProgramme().split(";");
String topic = csvProject.get().getTopics();
Arrays
.stream(programme)
.forEach(p -> {
CSVProject proj = new CSVProject();
proj.setTopics(topic);
proj.setProgramme(p);
proj.setId(csvProject.get().getId());
csvProjectList.add(proj);
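
A hedged, self-contained rendering of the fan-out above (grant id, topic and codes are invented): a project row whose programme field holds several ';'-separated codes becomes one CSVProject per code, all sharing id and topic.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import eu.dnetlib.dhp.actionmanager.project.utils.CSVProject;

public class ProjectSplitSketch {

    public static void main(String[] args) {
        String id = "894593";           // invented grant agreement
        String topics = "ERC-2019-COG"; // invented topic
        List<CSVProject> out = new ArrayList<>();
        Arrays.stream("H2020-EU.1.1.;H2020-EU.1.3.".split(";")).forEach(p -> {
            CSVProject proj = new CSVProject();
            proj.setId(id);
            proj.setTopics(topics);
            proj.setProgramme(p);
            out.add(proj);
        });
        System.out.println(out.size()); // 2: one record per programme code
    }
}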

View File

@ -3,6 +3,9 @@ package eu.dnetlib.dhp.actionmanager.project;
import java.io.Serializable;
/**
* Class to store the grant agreement (code) of the collected projects
*/
public class ProjectSubset implements Serializable {
private String code;
@ -14,4 +17,5 @@ public class ProjectSubset implements Serializable {
public void setCode(String code) {
this.code = code;
}
}

View File

@ -25,6 +25,10 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.DbClient;
/**
* Queries the OpenAIRE database to get the grant agreements of the projects collected from corda__h2020. The codes
* collected are written on HDFS using the ProjectSubset model
*/
public class ReadProjectsFromDB implements Closeable {
private final DbClient dbClient;
@ -33,7 +37,7 @@ public class ReadProjectsFromDB implements Closeable {
private final BufferedWriter writer;
private final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
private final static String query = "SELECT code " +
"from projects where id like 'corda__h2020%' ";
public static void main(final String[] args) throws Exception {
@ -72,7 +76,6 @@ public class ReadProjectsFromDB implements Closeable {
try {
ProjectSubset p = new ProjectSubset();
p.setCode(rs.getString("code"));
return Arrays.asList(p);
} catch (final Exception e) {

View File

@ -3,47 +3,53 @@ package eu.dnetlib.dhp.actionmanager.project;
import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Objects;
import java.util.Optional;
import java.util.function.Consumer;
import org.apache.commons.io.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.api.java.function.MapGroupsFunction;
import org.apache.spark.rdd.SequenceFileRDDFunctions;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.databind.ObjectMapper;
import eu.dnetlib.dhp.actionmanager.project.csvutils.CSVProgramme;
import eu.dnetlib.dhp.actionmanager.project.csvutils.CSVProject;
import eu.dnetlib.dhp.actionmanager.project.utils.CSVProgramme;
import eu.dnetlib.dhp.actionmanager.project.utils.CSVProject;
import eu.dnetlib.dhp.actionmanager.project.utils.EXCELTopic;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;
import eu.dnetlib.dhp.schema.action.AtomicAction;
import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.Programme;
import eu.dnetlib.dhp.schema.oaf.H2020Classification;
import eu.dnetlib.dhp.schema.oaf.H2020Programme;
import eu.dnetlib.dhp.schema.oaf.Project;
import eu.dnetlib.dhp.utils.DHPUtils;
import scala.Function1;
import scala.Tuple2;
import scala.runtime.BoxedUnit;
/**
* Class that makes the ActionSet. To prepare the AS two joins are needed
*
* 1. a join between the collected project subset and the programme, extended with the classification, on the grant
* agreement. For each entry a
* eu.dnetlib.dhp.Project entity is created and the information about H2020Classification is set together with the
* h2020topiccode variable
* 2. a join between the output of the previous step and the topic information on the topic code. Each time a match is
* found the h2020topicdescription variable is set.
*
* To produce one single entry for each project code a grouping step is needed: each project can be associated to more
* than one programme.
*
*
*/
public class SparkAtomicActionJob {
private static final Logger log = LoggerFactory.getLogger(SparkAtomicActionJob.class);
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@ -77,6 +83,9 @@ public class SparkAtomicActionJob {
final String programmePath = parser.get("programmePath");
log.info("programmePath {}: ", programmePath);
final String topicPath = parser.get("topicPath");
log.info("topic path {}: ", topicPath);
SparkConf conf = new SparkConf();
runWithSparkSession(
@ -88,6 +97,7 @@ public class SparkAtomicActionJob {
spark,
projectPath,
programmePath,
topicPath,
outputPath);
});
}
@ -98,31 +108,54 @@ public class SparkAtomicActionJob {
private static void getAtomicActions(SparkSession spark, String projectPatH,
String programmePath,
String topicPath,
String outputPath) {
Dataset<CSVProject> project = readPath(spark, projectPatH, CSVProject.class);
Dataset<CSVProgramme> programme = readPath(spark, programmePath, CSVProgramme.class);
Dataset<EXCELTopic> topic = readPath(spark, topicPath, EXCELTopic.class);
project
Dataset<Project> aaproject = project
.joinWith(programme, project.col("programme").equalTo(programme.col("code")), "left")
.map(c -> {
CSVProject csvProject = c._1();
Optional<CSVProgramme> csvProgramme = Optional.ofNullable(c._2());
if (csvProgramme.isPresent()) {
Project p = new Project();
p
.setId(
createOpenaireId(
ModelSupport.entityIdPrefix.get("project"),
"corda__h2020", csvProject.getId()));
Programme pm = new Programme();
pm.setCode(csvProject.getProgramme());
pm.setDescription(csvProgramme.get().getShortTitle());
p.setProgramme(Arrays.asList(pm));
return p;
}
.map((MapFunction<Tuple2<CSVProject, CSVProgramme>, Project>) c -> {
return null;
CSVProject csvProject = c._1();
return Optional
.ofNullable(c._2())
.map(csvProgramme -> {
Project pp = new Project();
pp
.setId(
createOpenaireId(
ModelSupport.entityIdPrefix.get("project"),
"corda__h2020", csvProject.getId()));
pp.setH2020topiccode(csvProject.getTopics());
H2020Programme pm = new H2020Programme();
H2020Classification h2020classification = new H2020Classification();
pm.setCode(csvProject.getProgramme());
h2020classification.setClassification(csvProgramme.getClassification());
h2020classification.setH2020Programme(pm);
setLevelsandProgramme(h2020classification, csvProgramme.getClassification_short());
// setProgramme(h2020classification, ocsvProgramme.get().getClassification());
pp.setH2020classification(Arrays.asList(h2020classification));
return pp;
})
.orElse(null);
}, Encoders.bean(Project.class))
.filter(Objects::nonNull);
aaproject
.joinWith(topic, aaproject.col("h2020topiccode").equalTo(topic.col("code")), "left")
.map((MapFunction<Tuple2<Project, EXCELTopic>, Project>) p -> {
Optional<EXCELTopic> op = Optional.ofNullable(p._2());
Project rp = p._1();
if (op.isPresent()) {
rp.setH2020topicdescription(op.get().getTitle());
}
return rp;
}, Encoders.bean(Project.class))
.filter(Objects::nonNull)
.groupByKey(
@ -144,6 +177,24 @@ public class SparkAtomicActionJob {
}
private static void setLevelsandProgramme(H2020Classification h2020Classification, String classification_short) {
String[] tmp = classification_short.split(" \\| ");
h2020Classification.setLevel1(tmp[0]);
if (tmp.length > 1) {
h2020Classification.setLevel2(tmp[1]);
}
if (tmp.length > 2) {
h2020Classification.setLevel3(tmp[2]);
}
h2020Classification.getH2020Programme().setDescription(tmp[tmp.length - 1]);
}
// private static void setProgramme(H2020Classification h2020Classification, String classification) {
// String[] tmp = classification.split(" \\| ");
//
// h2020Classification.getH2020Programme().setDescription(tmp[tmp.length - 1]);
// }
public static <R> Dataset<R> readPath(
SparkSession spark, String inputPath, Class<R> clazz) {
return spark
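
The level extraction in setLevelsandProgramme can be previewed in isolation; a sketch with an invented classification_short (the " | "-separated convention comes from PrepareProgramme above):

public class LevelsSketch {

    public static void main(String[] args) {
        String[] tmp = "Societal challenges | Health, demographic change and well-being | Treating and managing disease"
            .split(" \\| ");
        System.out.println("level1: " + tmp[0]);
        if (tmp.length > 1)
            System.out.println("level2: " + tmp[1]);
        if (tmp.length > 2)
            System.out.println("level3: " + tmp[2]);
        // the H2020Programme description takes the last element
        System.out.println("programme: " + tmp[tmp.length - 1]);
    }
}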

View File

@ -1,5 +1,5 @@
package eu.dnetlib.dhp.actionmanager.project.csvutils;
package eu.dnetlib.dhp.actionmanager.project.utils;
import java.io.IOException;
import java.util.ArrayList;
@ -10,6 +10,9 @@ import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVRecord;
import org.apache.commons.lang.reflect.FieldUtils;
/**
* Reads a generic CSV and maps it into classes that mirror its schema (see the usage sketch after this file)
*/
public class CSVParser {
public <R> List<R> parse(String csvFile, String classForName)
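
A hedged usage sketch, assuming the parser receives the raw csv content (semicolon-delimited with a header row whose columns match the target class fields, as in the EC exports) and that CSVProgramme exposes the corresponding getters; the sample row is invented:

import java.util.List;

import eu.dnetlib.dhp.actionmanager.project.utils.CSVParser;
import eu.dnetlib.dhp.actionmanager.project.utils.CSVProgramme;

public class CsvParserSketch {

    public static void main(String[] args) throws Exception {
        String csv = "code;title;shortTitle;language\n"
            + "H2020-EU.1.;Excellent science;;en";
        List<CSVProgramme> rows = new CSVParser()
            .parse(csv, "eu.dnetlib.dhp.actionmanager.project.utils.CSVProgramme");
        System.out.println(rows.get(0).getCode()); // H2020-EU.1.
    }
}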

Some files were not shown because too many files have changed in this diff.