Giambattista Bloisi (giambattista.bloisi)
  • Joined on 2023-06-30
giambattista.bloisi pushed to airflow at giambattista.bloisi/lot1-kickoff 2024-06-06 19:51:30 +02:00
f77274ce4f initial stage
giambattista.bloisi pushed to airflow at giambattista.bloisi/lot1-kickoff 2024-06-06 19:50:11 +02:00
151d305417 initial stage
giambattista.bloisi pushed to airflow at giambattista.bloisi/lot1-kickoff 2024-06-06 19:48:20 +02:00
94b4add8cd initial stage
giambattista.bloisi pushed to airflow at giambattista.bloisi/lot1-kickoff 2024-06-03 22:03:11 +02:00
1bc94cd835 initial stage
giambattista.bloisi pushed to airflow at giambattista.bloisi/lot1-kickoff 2024-06-03 15:29:11 +02:00
d9e7528927 initial stage
giambattista.bloisi created pull request D-Net/dnet-hadoop#442 2024-06-03 15:15:47 +02:00
Fix for missing collectedfrom after dedup
giambattista.bloisi pushed to fix_mergedcliquesort at D-Net/dnet-hadoop 2024-06-03 15:13:53 +02:00
3feab5d92d Fix MergeUtils.mergeGroup: it could get rid of some records and did not consider all PID authorities while sorting records.
giambattista.bloisi created branch fix_mergedcliquesort in D-Net/dnet-hadoop 2024-06-03 15:13:53 +02:00
giambattista.bloisi pushed to pivotselectionbypid at D-Net/openaire-graph-docs 2024-05-29 23:18:49 +02:00
30f2cca3df Change the selection criteria for the pivot record of a group so that the best pid type becomes the first criterion. This will have the effect of slowly converging to records having a DOI pid
giambattista.bloisi created branch pivotselectionbypid in D-Net/openaire-graph-docs 2024-05-29 23:18:49 +02:00
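Both the mergeGroup fix and the pivot-selection change above revolve around ordering the records of a dedup group by the quality of their persistent identifiers, so that a record carrying a DOI is preferred as the representative. The following is a minimal Scala sketch of that idea only: the PID priority list, the Record shape and the tie-breaking fields are illustrative assumptions, not the actual dnet-hadoop model.

    // Hypothetical, simplified model: not the actual dnet-hadoop classes.
    case class Record(id: String, pidType: Option[String], dateOfCollection: String)

    object PivotSelection {
      // Assumed priority of PID authorities, best first (DOI preferred).
      private val pidPriority = Seq("doi", "pmid", "pmc", "arxiv", "handle")

      private def pidRank(pidType: Option[String]): Int =
        pidType.map(pidPriority.indexOf).filter(_ >= 0).getOrElse(pidPriority.size)

      // Pick the pivot of a dedup group: best pid type first,
      // then oldest collection date, then lexicographic id as a tie-breaker.
      def selectPivot(group: Seq[Record]): Record =
        group.minBy(r => (pidRank(r.pidType), r.dateOfCollection, r.id))
    }
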
giambattista.bloisi pushed to beta at D-Net/dnet-hadoop 2024-05-28 14:14:57 +02:00
73316d8c83 Add jaxb and jaxws dependencies when compiling with the spark-34 profile, as they are required to run with JDK > 8
giambattista.bloisi commented on pull request D-Net/dnet-hadoop#431 2024-05-23 09:33:43 +02:00
WIP: playing with dependencies to compile also with macos arm64 and openjdk 11 or 17

@alessia.bardi I just merged PR #327 into beta; it solves part of the problems. A new profile, -P spark-34, can be used to activate building against spark 3.4.2.openaire that has recently been…

giambattista.bloisi pushed to beta at D-Net/dnet-hadoop 2024-05-23 09:20:32 +02:00
1b2357e10a Merge pull request 'Changes in maven poms to build and test the project using Spark 3.4.x and scala 2.12' (#327) from spark34-integration into beta
f1fe363b19 merged again from beta (I hope for the last time)
66c1ffc866 merged again from beta (I hope for the last time)
103e2652b3 merged beta
a87f9ea643 fixed scholexplorer bug
Compare 23 commits »
giambattista.bloisi merged pull request D-Net/dnet-hadoop#327 2024-05-23 09:20:29 +02:00
Changes in maven poms to build and test the project using Spark 3.4.x and scala 2.12
giambattista.bloisi pushed to pid_stability at D-Net/openaire-graph-docs 2024-05-13 17:56:56 +02:00
75b1cdf92e Describe the usage of the pivot table to improve stability of “representative records” and how “non authoritative” PIDs are used to generate “representative records”
b7cb15e942 Update affiliation matching page in v7.1.3
c017c95486 Adjust text in affiliation matching page
4cdb5f7f31 affiliation matching description update
f279cdfe10 affiliation matching description update
Compare 14 commits »
giambattista.bloisi pushed to pid_stability at D-Net/openaire-graph-docs 2024-05-13 14:41:33 +02:00
6b3533d29a Describe the usage of the pivot table to improve stability of “representative records” and how “non authoritative” PIDs are used to generate “representative records”
giambattista.bloisi created pull request D-Net/dnet-hadoop#434 2024-05-07 16:47:37 +02:00
Fixes in Graph Provision
giambattista.bloisi pushed to beta_provision_relation at D-Net/dnet-hadoop 2024-05-07 15:44:42 +02:00
711048ceed PrepareRelationsJob rewritten to use the Spark DataFrame API and window functions
giambattista.bloisi created branch beta_provision_relation in D-Net/dnet-hadoop 2024-05-07 15:44:42 +02:00
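The commit above replaces the previous grouping logic with the DataFrame API and window functions. As an illustration of the general pattern only (not the actual PrepareRelationsJob code; the column names, input/output paths and the per-source cap are assumptions), a window partitioned by the relation source can number the relations of each source and keep only the first N:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.expressions.Window
    import org.apache.spark.sql.functions.{col, row_number}

    // Sketch only: column names ("source", "relClass"), paths and the cap are assumptions.
    val spark = SparkSession.builder().appName("prepare-relations-sketch").getOrCreate()
    val relations = spark.read.parquet("/path/to/relations")

    val bySource = Window.partitionBy(col("source")).orderBy(col("relClass"))

    val capped = relations
      .withColumn("rn", row_number().over(bySource))  // rank relations within each source
      .filter(col("rn") <= 100)                        // keep at most 100 relations per source
      .drop("rn")

    capped.write.mode("overwrite").parquet("/path/to/prepared_relations")
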
giambattista.bloisi pushed to beta at D-Net/dnet-hadoop 2024-05-03 13:58:04 +02:00
69c5efbd8b Fix: when applying enrichments with no instance information, the resulting merged entity was generated with no instance instead of keeping the original information
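The fix described above amounts to a guard in the merge logic: when the enrichment record carries no instance information, the original instances must be preserved rather than overwritten with an empty list. A hedged sketch of that guard, using hypothetical simplified types rather than the real OpenAIRE model classes:

    // Hypothetical, simplified entities: not the actual dnet-hadoop model.
    case class Instance(url: String)
    case class Entity(id: String, instances: List[Instance])

    def applyEnrichment(original: Entity, enrichment: Entity): Entity = {
      // Guard: if the enrichment brings no instance information,
      // keep the instances of the original entity instead of dropping them.
      val mergedInstances =
        if (enrichment.instances.isEmpty) original.instances
        else (original.instances ++ enrichment.instances).distinct

      original.copy(instances = mergedInstances)
    }
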