forked from D-Net/dnet-hadoop
merge with master fork
commit 4c94231cad
@@ -0,0 +1,661 @@
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble
The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU Affero General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
|
||||||
|
|
||||||
|
If conditions are imposed on you (whether by court order, agreement or
|
||||||
|
otherwise) that contradict the conditions of this License, they do not
|
||||||
|
excuse you from the conditions of this License. If you cannot convey a
|
||||||
|
covered work so as to satisfy simultaneously your obligations under this
|
||||||
|
License and any other pertinent obligations, then as a consequence you may
|
||||||
|
not convey it at all. For example, if you agree to terms that obligate you
|
||||||
|
to collect a royalty for further conveying from those to whom you convey
|
||||||
|
the Program, the only way you could satisfy both those terms and this
|
||||||
|
License would be to refrain entirely from conveying the Program.
|
||||||
|
|
||||||
|
13. Remote Network Interaction; Use with the GNU General Public License.
|
||||||
|
|
||||||
|
Notwithstanding any other provision of this License, if you modify the
|
||||||
|
Program, your modified version must prominently offer all users
|
||||||
|
interacting with it remotely through a computer network (if your version
|
||||||
|
supports such interaction) an opportunity to receive the Corresponding
|
||||||
|
Source of your version by providing access to the Corresponding Source
|
||||||
|
from a network server at no charge, through some standard or customary
|
||||||
|
means of facilitating copying of software. This Corresponding Source
|
||||||
|
shall include the Corresponding Source for any work covered by version 3
|
||||||
|
of the GNU General Public License that is incorporated pursuant to the
|
||||||
|
following paragraph.
|
||||||
|
|
||||||
|
Notwithstanding any other provision of this License, you have
|
||||||
|
permission to link or combine any covered work with a work licensed
|
||||||
|
under version 3 of the GNU General Public License into a single
|
||||||
|
combined work, and to convey the resulting work. The terms of this
|
||||||
|
License will continue to apply to the part which is the covered work,
|
||||||
|
but the work with which it is combined will remain governed by version
|
||||||
|
3 of the GNU General Public License.
|
||||||
|
|
||||||
|
14. Revised Versions of this License.
|
||||||
|
|
||||||
|
The Free Software Foundation may publish revised and/or new versions of
|
||||||
|
the GNU Affero General Public License from time to time. Such new versions
|
||||||
|
will be similar in spirit to the present version, but may differ in detail to
|
||||||
|
address new problems or concerns.
|
||||||
|
|
||||||
|
Each version is given a distinguishing version number. If the
|
||||||
|
Program specifies that a certain numbered version of the GNU Affero General
|
||||||
|
Public License "or any later version" applies to it, you have the
|
||||||
|
option of following the terms and conditions either of that numbered
|
||||||
|
version or of any later version published by the Free Software
|
||||||
|
Foundation. If the Program does not specify a version number of the
|
||||||
|
GNU Affero General Public License, you may choose any version ever published
|
||||||
|
by the Free Software Foundation.
|
||||||
|
|
||||||
|
If the Program specifies that a proxy can decide which future
|
||||||
|
versions of the GNU Affero General Public License can be used, that proxy's
|
||||||
|
public statement of acceptance of a version permanently authorizes you
|
||||||
|
to choose that version for the Program.
|
||||||
|
|
||||||
|
Later license versions may give you additional or different
|
||||||
|
permissions. However, no additional obligations are imposed on any
|
||||||
|
author or copyright holder as a result of your choosing to follow a
|
||||||
|
later version.
|
||||||
|
|
||||||
|
15. Disclaimer of Warranty.
|
||||||
|
|
||||||
|
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||||
|
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||||
|
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||||
|
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||||
|
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||||
|
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||||
|
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||||
|
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||||
|
|
||||||
|
16. Limitation of Liability.
|
||||||
|
|
||||||
|
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||||
|
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||||
|
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||||
|
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||||
|
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||||
|
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||||
|
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||||
|
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||||
|
SUCH DAMAGES.
|
||||||
|
|
||||||
|
17. Interpretation of Sections 15 and 16.
|
||||||
|
|
||||||
|
If the disclaimer of warranty and limitation of liability provided
|
||||||
|
above cannot be given local legal effect according to their terms,
|
||||||
|
reviewing courts shall apply local law that most closely approximates
|
||||||
|
an absolute waiver of all civil liability in connection with the
|
||||||
|
Program, unless a warranty or assumption of liability accompanies a
|
||||||
|
copy of the Program in return for a fee.
|
||||||
|
|
||||||
|
END OF TERMS AND CONDITIONS
|
||||||
|
|
||||||
|
How to Apply These Terms to Your New Programs
|
||||||
|
|
||||||
|
If you develop a new program, and you want it to be of the greatest
|
||||||
|
possible use to the public, the best way to achieve this is to make it
|
||||||
|
free software which everyone can redistribute and change under these terms.
|
||||||
|
|
||||||
|
To do so, attach the following notices to the program. It is safest
|
||||||
|
to attach them to the start of each source file to most effectively
|
||||||
|
state the exclusion of warranty; and each file should have at least
|
||||||
|
the "copyright" line and a pointer to where the full notice is found.
|
||||||
|
|
||||||
|
<one line to give the program's name and a brief idea of what it does.>
|
||||||
|
Copyright (C) <year> <name of author>
|
||||||
|
|
||||||
|
This program is free software: you can redistribute it and/or modify
|
||||||
|
it under the terms of the GNU Affero General Public License as published by
|
||||||
|
the Free Software Foundation, either version 3 of the License, or
|
||||||
|
(at your option) any later version.
|
||||||
|
|
||||||
|
This program is distributed in the hope that it will be useful,
|
||||||
|
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
GNU Affero General Public License for more details.
|
||||||
|
|
||||||
|
You should have received a copy of the GNU Affero General Public License
|
||||||
|
along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
|
Also add information on how to contact you by electronic and paper mail.
|
||||||
|
|
||||||
|
If your software can interact with users remotely through a computer
|
||||||
|
network, you should also make sure that it provides a way for users to
|
||||||
|
get its source. For example, if your program is a web application, its
|
||||||
|
interface could display a "Source" link that leads users to an archive
|
||||||
|
of the code. There are many ways you could offer source, and different
|
||||||
|
solutions will be better for different programs; see section 13 for the
|
||||||
|
specific requirements.
|
||||||
|
|
||||||
|
You should also get your employer (if you work as a programmer) or school,
|
||||||
|
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||||
|
For more information on this, and how to apply and follow the GNU AGPL, see
|
||||||
|
<http://www.gnu.org/licenses/>.
|
|
@@ -12,6 +12,8 @@
 	<artifactId>dhp-build-assembly-resources</artifactId>
 	<packaging>jar</packaging>

+	<description>This module contains a set of scripts supporting the build lifecycle for the dnet-hadoop project</description>
+
 	<build>
 		<plugins>
 			<plugin>
@@ -12,22 +12,29 @@
 	<artifactId>dhp-build-properties-maven-plugin</artifactId>
 	<packaging>maven-plugin</packaging>

+	<description>This module is a maven plugin implementing custom properties substitutions in the build lifecycle</description>
+
 	<dependencies>
 		<dependency>
 			<groupId>org.apache.maven</groupId>
 			<artifactId>maven-plugin-api</artifactId>
-			<version>2.0</version>
+			<version>3.6.3</version>
 		</dependency>
 		<dependency>
 			<groupId>org.apache.maven</groupId>
 			<artifactId>maven-project</artifactId>
-			<version>2.0</version>
+			<version>2.2.1</version>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.maven</groupId>
+			<artifactId>maven-artifact</artifactId>
+			<version>2.2.1</version>
+		</dependency>
+
 		<dependency>
 			<groupId>org.kuali.maven.plugins</groupId>
 			<artifactId>properties-maven-plugin</artifactId>
-			<version>1.3.2</version>
+			<version>${properties.maven.plugin.version}</version>
 		</dependency>
 		<dependency>
 			<groupId>com.google.code.findbugs</groupId>
@@ -73,44 +80,10 @@
 				<artifactId>maven-javadoc-plugin</artifactId>
 				<configuration>
 					<detectLinks>true</detectLinks>
+					<doclint>none</doclint>
 				</configuration>
 			</plugin>
 		</plugins>
-		<pluginManagement>
-			<plugins>
-				<!--This plugin's configuration is used to store Eclipse m2e settings only. It has no influence on the Maven build itself.-->
-				<plugin>
-					<groupId>org.eclipse.m2e</groupId>
-					<artifactId>lifecycle-mapping</artifactId>
-					<version>1.0.0</version>
-					<configuration>
-						<lifecycleMappingMetadata>
-							<pluginExecutions>
-								<pluginExecution>
-									<pluginExecutionFilter>
-										<groupId>
-											org.apache.maven.plugins
-										</groupId>
-										<artifactId>
-											maven-plugin-plugin
-										</artifactId>
-										<versionRange>
-											[3.2,)
-										</versionRange>
-										<goals>
-											<goal>descriptor</goal>
-										</goals>
-									</pluginExecutionFilter>
-									<action>
-										<ignore />
-									</action>
-								</pluginExecution>
-							</pluginExecutions>
-						</lifecycleMappingMetadata>
-					</configuration>
-				</plugin>
-			</plugins>
-		</pluginManagement>
 	</build>

 </project>
@@ -1,8 +1,10 @@
+
 package eu.dnetlib.maven.plugin.properties;
+
 import java.io.File;
 import java.util.ArrayList;
 import java.util.List;

 import org.apache.commons.lang.ArrayUtils;
 import org.apache.commons.lang.StringUtils;
 import org.apache.maven.plugin.AbstractMojo;
@@ -17,55 +19,58 @@ import org.apache.maven.plugin.MojoFailureException;
  */
 public class GenerateOoziePropertiesMojo extends AbstractMojo {

 	public static final String PROPERTY_NAME_WF_SOURCE_DIR = "workflow.source.dir";
 	public static final String PROPERTY_NAME_SANDBOX_NAME = "sandboxName";

-	private final String[] limiters = {"dhp", "dnetlib", "eu"};
+	private final String[] limiters = {
+		"dhp", "dnetlib", "eu"
+	};

 	@Override
 	public void execute() throws MojoExecutionException, MojoFailureException {
 		if (System.getProperties().containsKey(PROPERTY_NAME_WF_SOURCE_DIR)
 			&& !System.getProperties().containsKey(PROPERTY_NAME_SANDBOX_NAME)) {
-			String generatedSandboxName =
-				generateSandboxName(System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
+			String generatedSandboxName = generateSandboxName(
+				System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
 			if (generatedSandboxName != null) {
 				System.getProperties().setProperty(PROPERTY_NAME_SANDBOX_NAME, generatedSandboxName);
 			} else {
-				System.out.println(
-					"unable to generate sandbox name from path: "
-						+ System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
+				System.out
+					.println(
+						"unable to generate sandbox name from path: "
+							+ System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
 			}
 		}
 	}

 	/**
 	 * Generates sandbox name from workflow source directory.
 	 *
 	 * @param wfSourceDir
 	 * @return generated sandbox name
 	 */
 	private String generateSandboxName(String wfSourceDir) {
 		// utilize all dir names until finding one of the limiters
 		List<String> sandboxNameParts = new ArrayList<String>();
 		String[] tokens = StringUtils.split(wfSourceDir, File.separatorChar);
 		ArrayUtils.reverse(tokens);
 		if (tokens.length > 0) {
 			for (String token : tokens) {
 				for (String limiter : limiters) {
 					if (limiter.equals(token)) {
 						return sandboxNameParts.size() > 0
 							? StringUtils.join(sandboxNameParts.toArray())
 							: null;
 					}
 				}
 				if (sandboxNameParts.size() > 0) {
 					sandboxNameParts.add(0, File.separator);
 				}
 				sandboxNameParts.add(0, token);
 			}
 			return StringUtils.join(sandboxNameParts.toArray());
 		} else {
 			return null;
 		}
 	}
 }
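The hunk above only reformats `GenerateOoziePropertiesMojo`; its behavior is easier to see in isolation. The following standalone sketch (not part of this commit) mirrors the `generateSandboxName` logic: path components are consumed from the end of `workflow.source.dir` until one of the limiters (`dhp`, `dnetlib`, `eu`) is hit, and whatever was collected becomes the sandbox name. The class name is hypothetical, and `'/'` is assumed as the separator instead of `File.separatorChar` to keep the example portable.

```java
// Hypothetical standalone sketch of the mojo's sandbox-name derivation.
public class SandboxNameSketch {

	private static final String[] LIMITERS = { "dhp", "dnetlib", "eu" };

	// Mirrors GenerateOoziePropertiesMojo.generateSandboxName, using '/' as the
	// path separator for portability of the example.
	static String generateSandboxName(String wfSourceDir) {
		java.util.List<String> parts = new java.util.ArrayList<>();
		String[] tokens = wfSourceDir.split("/");
		// walk the path components from last to first
		for (int i = tokens.length - 1; i >= 0; i--) {
			String token = tokens[i];
			if (token.isEmpty()) {
				continue; // skip empty tokens from leading/double slashes
			}
			for (String limiter : LIMITERS) {
				if (limiter.equals(token)) {
					// stop at the first limiter; null when nothing was collected
					return parts.isEmpty() ? null : String.join("", parts);
				}
			}
			if (!parts.isEmpty()) {
				parts.add(0, "/");
			}
			parts.add(0, token);
		}
		return String.join("", parts);
	}

	public static void main(String[] args) {
		// everything after the "dhp" component becomes the sandbox name
		System.out.println(generateSandboxName("eu/dnetlib/dhp/wf/transformation")); // wf/transformation
		// a path ending in a limiter yields null
		System.out.println(generateSandboxName("eu/dnetlib/dhp")); // null
	}
}
```

So for a workflow source directory like `eu/dnetlib/dhp/wf/transformation`, the `sandboxName` property would be derived as `wf/transformation`.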
@@ -9,9 +9,9 @@
  * express or implied. See the License for the specific language governing permissions and
  * limitations under the License.
  */

 package eu.dnetlib.maven.plugin.properties;

-import edu.umd.cs.findbugs.annotations.SuppressFBWarnings;
 import java.io.File;
 import java.io.FileInputStream;
 import java.io.IOException;
@@ -24,6 +24,7 @@ import java.util.List;
 import java.util.Map.Entry;
 import java.util.Properties;
 import java.util.Set;
+
 import org.apache.commons.io.FileUtils;
 import org.apache.commons.io.IOUtils;
 import org.apache.commons.lang.StringUtils;
@@ -35,410 +36,412 @@ import org.springframework.core.io.DefaultResourceLoader;
 import org.springframework.core.io.Resource;
 import org.springframework.core.io.ResourceLoader;

+import edu.umd.cs.findbugs.annotations.SuppressFBWarnings;
+
 /**
  * Writes project properties for the keys listed in specified properties files. Based on:
- * http://site.kuali.org/maven/plugins/properties-maven-plugin/1.3.2/write-project-properties-mojo.html
+ * http://site.kuali.org/maven/plugins/properties-maven-plugin/2.0.1/write-project-properties-mojo.html
  *
  * @author mhorst
  * @goal write-project-properties
  */
 public class WritePredefinedProjectProperties extends AbstractMojo {

 	private static final String CR = "\r";
 	private static final String LF = "\n";
 	private static final String TAB = "\t";
 	protected static final String PROPERTY_PREFIX_ENV = "env.";
 	private static final String ENCODING_UTF8 = "utf8";

 	/** @parameter property="properties.includePropertyKeysFromFiles" */
 	private String[] includePropertyKeysFromFiles;

 	/**
 	 * @parameter default-value="${project}"
 	 * @required
 	 * @readonly
 	 */
 	protected MavenProject project;

 	/**
 	 * The file that properties will be written to
 	 *
 	 * @parameter property="properties.outputFile"
 	 *            default-value="${project.build.directory}/properties/project.properties";
 	 * @required
 	 */
 	protected File outputFile;

 	/**
-	 * If true, the plugin will silently ignore any non-existent properties files, and the build will
-	 * continue
+	 * If true, the plugin will silently ignore any non-existent properties files, and the build will continue
 	 *
 	 * @parameter property="properties.quiet" default-value="true"
 	 */
 	private boolean quiet;

 	/**
-	 * Comma separated list of characters to escape when writing property values. cr=carriage return,
-	 * lf=linefeed, tab=tab. Any other values are taken literally.
+	 * Comma separated list of characters to escape when writing property values. cr=carriage return, lf=linefeed,
+	 * tab=tab. Any other values are taken literally.
 	 *
 	 * @parameter default-value="cr,lf,tab" property="properties.escapeChars"
 	 */
 	private String escapeChars;

 	/**
-	 * If true, the plugin will include system properties when writing the properties file. System
-	 * properties override both environment variables and project properties.
+	 * If true, the plugin will include system properties when writing the properties file. System properties override
+	 * both environment variables and project properties.
 	 *
 	 * @parameter default-value="false" property="properties.includeSystemProperties"
 	 */
 	private boolean includeSystemProperties;

 	/**
-	 * If true, the plugin will include environment variables when writing the properties file.
-	 * Environment variables are prefixed with "env". Environment variables override project
-	 * properties.
+	 * If true, the plugin will include environment variables when writing the properties file. Environment variables
+	 * are prefixed with "env". Environment variables override project properties.
 	 *
 	 * @parameter default-value="false" property="properties.includeEnvironmentVariables"
 	 */
 	private boolean includeEnvironmentVariables;

 	/**
 	 * Comma separated set of properties to exclude when writing the properties file
 	 *
 	 * @parameter property="properties.exclude"
 	 */
 	private String exclude;

 	/**
-	 * Comma separated set of properties to write to the properties file. If provided, only the
-	 * properties matching those supplied here will be written to the properties file.
+	 * Comma separated set of properties to write to the properties file. If provided, only the properties matching
+	 * those supplied here will be written to the properties file.
 	 *
 	 * @parameter property="properties.include"
 	 */
 	private String include;

 	/*
 	 * (non-Javadoc)
 	 * @see org.apache.maven.plugin.AbstractMojo#execute()
 	 */
 	@Override
-	@SuppressFBWarnings({"NP_UNWRITTEN_FIELD", "UWF_UNWRITTEN_FIELD"})
+	@SuppressFBWarnings({
+		"NP_UNWRITTEN_FIELD", "UWF_UNWRITTEN_FIELD"
+	})
 	public void execute() throws MojoExecutionException, MojoFailureException {
 		Properties properties = new Properties();
 		// Add project properties
 		properties.putAll(project.getProperties());
 		if (includeEnvironmentVariables) {
 			// Add environment variables, overriding any existing properties with the same key
 			properties.putAll(getEnvironmentVariables());
 		}
 		if (includeSystemProperties) {
 			// Add system properties, overriding any existing properties with the same key
 			properties.putAll(System.getProperties());
 		}

 		// Remove properties as appropriate
 		trim(properties, exclude, include);

 		String comment = "# " + new Date() + "\n";
 		List<String> escapeTokens = getEscapeChars(escapeChars);

 		getLog().info("Creating " + outputFile);
 		writeProperties(outputFile, comment, properties, escapeTokens);
 	}

 	/**
 	 * Provides environment variables.
 	 *
 	 * @return environment variables
 	 */
 	protected static Properties getEnvironmentVariables() {
 		Properties props = new Properties();
 		for (Entry<String, String> entry : System.getenv().entrySet()) {
 			props.setProperty(PROPERTY_PREFIX_ENV + entry.getKey(), entry.getValue());
 		}
 		return props;
 	}

 	/**
 	 * Removes properties which should not be written.
 	 *
 	 * @param properties
 	 * @param omitCSV
 	 * @param includeCSV
 	 * @throws MojoExecutionException
 	 */
 	protected void trim(Properties properties, String omitCSV, String includeCSV)
 		throws MojoExecutionException {
 		List<String> omitKeys = getListFromCSV(omitCSV);
 		for (String key : omitKeys) {
 			properties.remove(key);
 		}

 		List<String> includeKeys = getListFromCSV(includeCSV);
 		// mh: including keys from predefined properties
 		if (includePropertyKeysFromFiles != null && includePropertyKeysFromFiles.length > 0) {
 			for (String currentIncludeLoc : includePropertyKeysFromFiles) {
 				if (validate(currentIncludeLoc)) {
 					Properties p = getProperties(currentIncludeLoc);
 					for (String key : p.stringPropertyNames()) {
 						includeKeys.add(key);
 					}
 				}
 			}
 		}
 		if (includeKeys != null && !includeKeys.isEmpty()) {
 			// removing only when include keys provided
 			Set<String> keys = properties.stringPropertyNames();
 			for (String key : keys) {
 				if (!includeKeys.contains(key)) {
 					properties.remove(key);
 				}
 			}
 		}
 	}

 	/**
 	 * Checks whether file exists.
 	 *
 	 * @param location
 	 * @return true when exists, false otherwise.
 	 */
 	protected boolean exists(String location) {
 		if (StringUtils.isBlank(location)) {
 			return false;
 		}
 		File file = new File(location);
 		if (file.exists()) {
 			return true;
 		}
 		ResourceLoader loader = new DefaultResourceLoader();
 		Resource resource = loader.getResource(location);
 		return resource.exists();
 	}

 	/**
 	 * Validates resource location.
 	 *
 	 * @param location
 	 * @return true when valid, false otherwise
 	 * @throws MojoExecutionException
 	 */
 	protected boolean validate(String location) throws MojoExecutionException {
 		boolean exists = exists(location);
 		if (exists) {
 			return true;
 		}
 		if (quiet) {
 			getLog().info("Ignoring non-existent properties file '" + location + "'");
 			return false;
 		} else {
 			throw new MojoExecutionException("Non-existent properties file '" + location + "'");
 		}
 	}

 	/**
 	 * Provides input stream.
 	 *
 	 * @param location
 	 * @return input stream
 	 * @throws IOException
 	 */
 	protected InputStream getInputStream(String location) throws IOException {
 		File file = new File(location);
 		if (file.exists()) {
 			return new FileInputStream(location);
 		}
 		ResourceLoader loader = new DefaultResourceLoader();
 		Resource resource = loader.getResource(location);
 		return resource.getInputStream();
 	}

 	/**
 	 * Creates properties for given location.
 	 *
 	 * @param location
 	 * @return properties for given location
||||||
* @throws MojoExecutionException
|
* @throws MojoExecutionException
|
||||||
*/
|
*/
|
||||||
protected Properties getProperties(String location) throws MojoExecutionException {
|
protected Properties getProperties(String location) throws MojoExecutionException {
|
||||||
InputStream in = null;
|
InputStream in = null;
|
||||||
try {
|
try {
|
||||||
Properties properties = new Properties();
|
Properties properties = new Properties();
|
||||||
in = getInputStream(location);
|
in = getInputStream(location);
|
||||||
if (location.toLowerCase().endsWith(".xml")) {
|
if (location.toLowerCase().endsWith(".xml")) {
|
||||||
properties.loadFromXML(in);
|
properties.loadFromXML(in);
|
||||||
} else {
|
} else {
|
||||||
properties.load(in);
|
properties.load(in);
|
||||||
}
|
}
|
||||||
return properties;
|
return properties;
|
||||||
} catch (IOException e) {
|
} catch (IOException e) {
|
||||||
throw new MojoExecutionException("Error reading properties file " + location, e);
|
throw new MojoExecutionException("Error reading properties file " + location, e);
|
||||||
} finally {
|
} finally {
|
||||||
IOUtils.closeQuietly(in);
|
IOUtils.closeQuietly(in);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Provides escape characters.
|
* Provides escape characters.
|
||||||
*
|
*
|
||||||
* @param escapeChars
|
* @param escapeChars
|
||||||
* @return escape characters
|
* @return escape characters
|
||||||
*/
|
*/
|
||||||
protected List<String> getEscapeChars(String escapeChars) {
|
protected List<String> getEscapeChars(String escapeChars) {
|
||||||
List<String> tokens = getListFromCSV(escapeChars);
|
List<String> tokens = getListFromCSV(escapeChars);
|
||||||
List<String> realTokens = new ArrayList<String>();
|
List<String> realTokens = new ArrayList<String>();
|
||||||
for (String token : tokens) {
|
for (String token : tokens) {
|
||||||
String realToken = getRealToken(token);
|
String realToken = getRealToken(token);
|
||||||
realTokens.add(realToken);
|
realTokens.add(realToken);
|
||||||
}
|
}
|
||||||
return realTokens;
|
return realTokens;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Provides real token.
|
* Provides real token.
|
||||||
*
|
*
|
||||||
* @param token
|
* @param token
|
||||||
* @return real token
|
* @return real token
|
||||||
*/
|
*/
|
||||||
protected String getRealToken(String token) {
|
protected String getRealToken(String token) {
|
||||||
if (token.equalsIgnoreCase("CR")) {
|
if (token.equalsIgnoreCase("CR")) {
|
||||||
return CR;
|
return CR;
|
||||||
} else if (token.equalsIgnoreCase("LF")) {
|
} else if (token.equalsIgnoreCase("LF")) {
|
||||||
return LF;
|
return LF;
|
||||||
} else if (token.equalsIgnoreCase("TAB")) {
|
} else if (token.equalsIgnoreCase("TAB")) {
|
||||||
return TAB;
|
return TAB;
|
||||||
} else {
|
} else {
|
||||||
return token;
|
return token;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Returns content.
|
* Returns content.
|
||||||
*
|
*
|
||||||
* @param comment
|
* @param comment
|
||||||
* @param properties
|
* @param properties
|
||||||
* @param escapeTokens
|
* @param escapeTokens
|
||||||
* @return content
|
* @return content
|
||||||
*/
|
*/
|
||||||
protected String getContent(String comment, Properties properties, List<String> escapeTokens) {
|
protected String getContent(String comment, Properties properties, List<String> escapeTokens) {
|
||||||
List<String> names = new ArrayList<String>(properties.stringPropertyNames());
|
List<String> names = new ArrayList<String>(properties.stringPropertyNames());
|
||||||
Collections.sort(names);
|
Collections.sort(names);
|
||||||
StringBuilder sb = new StringBuilder();
|
StringBuilder sb = new StringBuilder();
|
||||||
if (!StringUtils.isBlank(comment)) {
|
if (!StringUtils.isBlank(comment)) {
|
||||||
sb.append(comment);
|
sb.append(comment);
|
||||||
}
|
}
|
||||||
for (String name : names) {
|
for (String name : names) {
|
||||||
String value = properties.getProperty(name);
|
String value = properties.getProperty(name);
|
||||||
String escapedValue = escape(value, escapeTokens);
|
String escapedValue = escape(value, escapeTokens);
|
||||||
sb.append(name + "=" + escapedValue + "\n");
|
sb.append(name + "=" + escapedValue + "\n");
|
||||||
}
|
}
|
||||||
return sb.toString();
|
return sb.toString();
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Writes properties to given file.
|
* Writes properties to given file.
|
||||||
*
|
*
|
||||||
* @param file
|
* @param file
|
||||||
* @param comment
|
* @param comment
|
||||||
* @param properties
|
* @param properties
|
||||||
* @param escapeTokens
|
* @param escapeTokens
|
||||||
* @throws MojoExecutionException
|
* @throws MojoExecutionException
|
||||||
*/
|
*/
|
||||||
protected void writeProperties(
|
protected void writeProperties(
|
||||||
File file, String comment, Properties properties, List<String> escapeTokens)
|
File file, String comment, Properties properties, List<String> escapeTokens)
|
||||||
throws MojoExecutionException {
|
throws MojoExecutionException {
|
||||||
try {
|
try {
|
||||||
String content = getContent(comment, properties, escapeTokens);
|
String content = getContent(comment, properties, escapeTokens);
|
||||||
FileUtils.writeStringToFile(file, content, ENCODING_UTF8);
|
FileUtils.writeStringToFile(file, content, ENCODING_UTF8);
|
||||||
} catch (IOException e) {
|
} catch (IOException e) {
|
||||||
throw new MojoExecutionException("Error creating properties file", e);
|
throw new MojoExecutionException("Error creating properties file", e);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Escapes characters.
|
* Escapes characters.
|
||||||
*
|
*
|
||||||
* @param s
|
* @param s
|
||||||
* @param escapeChars
|
* @param escapeChars
|
||||||
* @return
|
* @return
|
||||||
*/
|
*/
|
||||||
protected String escape(String s, List<String> escapeChars) {
|
protected String escape(String s, List<String> escapeChars) {
|
||||||
String result = s;
|
String result = s;
|
||||||
for (String escapeChar : escapeChars) {
|
for (String escapeChar : escapeChars) {
|
||||||
result = result.replace(escapeChar, getReplacementToken(escapeChar));
|
result = result.replace(escapeChar, getReplacementToken(escapeChar));
|
||||||
}
|
}
|
||||||
return result;
|
return result;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Provides replacement token.
|
* Provides replacement token.
|
||||||
*
|
*
|
||||||
* @param escapeChar
|
* @param escapeChar
|
||||||
* @return replacement token
|
* @return replacement token
|
||||||
*/
|
*/
|
||||||
protected String getReplacementToken(String escapeChar) {
|
protected String getReplacementToken(String escapeChar) {
|
||||||
if (escapeChar.equals(CR)) {
|
if (escapeChar.equals(CR)) {
|
||||||
return "\\r";
|
return "\\r";
|
||||||
} else if (escapeChar.equals(LF)) {
|
} else if (escapeChar.equals(LF)) {
|
||||||
return "\\n";
|
return "\\n";
|
||||||
} else if (escapeChar.equals(TAB)) {
|
} else if (escapeChar.equals(TAB)) {
|
||||||
return "\\t";
|
return "\\t";
|
||||||
} else {
|
} else {
|
||||||
return "\\" + escapeChar;
|
return "\\" + escapeChar;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Returns list from csv.
|
* Returns list from csv.
|
||||||
*
|
*
|
||||||
* @param csv
|
* @param csv
|
||||||
* @return list of values generated from CSV
|
* @return list of values generated from CSV
|
||||||
*/
|
*/
|
||||||
protected static final List<String> getListFromCSV(String csv) {
|
protected static final List<String> getListFromCSV(String csv) {
|
||||||
if (StringUtils.isBlank(csv)) {
|
if (StringUtils.isBlank(csv)) {
|
||||||
return new ArrayList<String>();
|
return new ArrayList<String>();
|
||||||
}
|
}
|
||||||
List<String> list = new ArrayList<String>();
|
List<String> list = new ArrayList<String>();
|
||||||
String[] tokens = StringUtils.split(csv, ",");
|
String[] tokens = StringUtils.split(csv, ",");
|
||||||
for (String token : tokens) {
|
for (String token : tokens) {
|
||||||
list.add(token.trim());
|
list.add(token.trim());
|
||||||
}
|
}
|
||||||
return list;
|
return list;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setIncludeSystemProperties(boolean includeSystemProperties) {
|
public void setIncludeSystemProperties(boolean includeSystemProperties) {
|
||||||
this.includeSystemProperties = includeSystemProperties;
|
this.includeSystemProperties = includeSystemProperties;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setEscapeChars(String escapeChars) {
|
public void setEscapeChars(String escapeChars) {
|
||||||
this.escapeChars = escapeChars;
|
this.escapeChars = escapeChars;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setIncludeEnvironmentVariables(boolean includeEnvironmentVariables) {
|
public void setIncludeEnvironmentVariables(boolean includeEnvironmentVariables) {
|
||||||
this.includeEnvironmentVariables = includeEnvironmentVariables;
|
this.includeEnvironmentVariables = includeEnvironmentVariables;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setExclude(String exclude) {
|
public void setExclude(String exclude) {
|
||||||
this.exclude = exclude;
|
this.exclude = exclude;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setInclude(String include) {
|
public void setInclude(String include) {
|
||||||
this.include = include;
|
this.include = include;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setQuiet(boolean quiet) {
|
public void setQuiet(boolean quiet) {
|
||||||
this.quiet = quiet;
|
this.quiet = quiet;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Sets property files for which keys properties should be included.
|
* Sets property files for which keys properties should be included.
|
||||||
*
|
*
|
||||||
* @param includePropertyKeysFromFiles
|
* @param includePropertyKeysFromFiles
|
||||||
*/
|
*/
|
||||||
public void setIncludePropertyKeysFromFiles(String[] includePropertyKeysFromFiles) {
|
public void setIncludePropertyKeysFromFiles(String[] includePropertyKeysFromFiles) {
|
||||||
if (includePropertyKeysFromFiles != null) {
|
if (includePropertyKeysFromFiles != null) {
|
||||||
this.includePropertyKeysFromFiles =
|
this.includePropertyKeysFromFiles = Arrays
|
||||||
Arrays.copyOf(includePropertyKeysFromFiles, includePropertyKeysFromFiles.length);
|
.copyOf(includePropertyKeysFromFiles, includePropertyKeysFromFiles.length);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
|
@@ -1,3 +1,4 @@

package eu.dnetlib.maven.plugin.properties;

import static eu.dnetlib.maven.plugin.properties.GenerateOoziePropertiesMojo.PROPERTY_NAME_SANDBOX_NAME;

@@ -10,87 +11,87 @@ import org.junit.jupiter.api.Test;
/** @author mhorst, claudio.atzori */
public class GenerateOoziePropertiesMojoTest {

	private final GenerateOoziePropertiesMojo mojo = new GenerateOoziePropertiesMojo();

	@BeforeEach
	public void clearSystemProperties() {
		System.clearProperty(PROPERTY_NAME_SANDBOX_NAME);
		System.clearProperty(PROPERTY_NAME_WF_SOURCE_DIR);
	}

	@Test
	public void testExecuteEmpty() throws Exception {
		// execute
		mojo.execute();

		// assert
		assertNull(System.getProperty(PROPERTY_NAME_SANDBOX_NAME));
	}

	@Test
	public void testExecuteSandboxNameAlreadySet() throws Exception {
		// given
		String workflowSourceDir = "eu/dnetlib/dhp/wf/transformers";
		String sandboxName = "originalSandboxName";
		System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);
		System.setProperty(PROPERTY_NAME_SANDBOX_NAME, sandboxName);

		// execute
		mojo.execute();

		// assert
		assertEquals(sandboxName, System.getProperty(PROPERTY_NAME_SANDBOX_NAME));
	}

	@Test
	public void testExecuteEmptyWorkflowSourceDir() throws Exception {
		// given
		String workflowSourceDir = "";
		System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);

		// execute
		mojo.execute();

		// assert
		assertNull(System.getProperty(PROPERTY_NAME_SANDBOX_NAME));
	}

	@Test
	public void testExecuteNullSandboxNameGenerated() throws Exception {
		// given
		String workflowSourceDir = "eu/dnetlib/dhp/";
		System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);

		// execute
		mojo.execute();

		// assert
		assertNull(System.getProperty(PROPERTY_NAME_SANDBOX_NAME));
	}

	@Test
	public void testExecute() throws Exception {
		// given
		String workflowSourceDir = "eu/dnetlib/dhp/wf/transformers";
		System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);

		// execute
		mojo.execute();

		// assert
		assertEquals("wf/transformers", System.getProperty(PROPERTY_NAME_SANDBOX_NAME));
	}

	@Test
	public void testExecuteWithoutRoot() throws Exception {
		// given
		String workflowSourceDir = "wf/transformers";
		System.setProperty(PROPERTY_NAME_WF_SOURCE_DIR, workflowSourceDir);

		// execute
		mojo.execute();

		// assert
		assertEquals("wf/transformers", System.getProperty(PROPERTY_NAME_SANDBOX_NAME));
	}
}

@@ -1,3 +1,4 @@

package eu.dnetlib.maven.plugin.properties;

import static eu.dnetlib.maven.plugin.properties.WritePredefinedProjectProperties.PROPERTY_PREFIX_ENV;

@@ -7,6 +8,7 @@ import static org.mockito.Mockito.lenient;

import java.io.*;
import java.util.Properties;

import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.project.MavenProject;
import org.junit.jupiter.api.*;

@@ -20,337 +22,353 @@ import org.mockito.junit.jupiter.MockitoExtension;
@ExtendWith(MockitoExtension.class)
public class WritePredefinedProjectPropertiesTest {

	@Mock
	private MavenProject mavenProject;

	private WritePredefinedProjectProperties mojo;

	@BeforeEach
	public void init(@TempDir File testFolder) {
		MockitoAnnotations.initMocks(this);
		mojo = new WritePredefinedProjectProperties();
		mojo.outputFile = getPropertiesFileLocation(testFolder);
		mojo.project = mavenProject;
		lenient().doReturn(new Properties()).when(mavenProject).getProperties();
	}

	// ----------------------------------- TESTS ---------------------------------------------

	@Test
	public void testExecuteEmpty() throws Exception {
		// execute
		mojo.execute();

		// assert
		assertTrue(mojo.outputFile.exists());
		Properties storedProperties = getStoredProperties(mojo.outputFile.getParentFile());
		assertEquals(0, storedProperties.size());
	}

	@Test
	public void testExecuteWithProjectProperties() throws Exception {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		doReturn(projectProperties).when(mavenProject).getProperties();

		// execute
		mojo.execute();

		// assert
		assertTrue(mojo.outputFile.exists());
		Properties storedProperties = getStoredProperties(mojo.outputFile.getParentFile());
		assertEquals(1, storedProperties.size());
		assertTrue(storedProperties.containsKey(key));
		assertEquals(value, storedProperties.getProperty(key));
	}

	@Test()
	public void testExecuteWithProjectPropertiesAndInvalidOutputFile(@TempDir File testFolder) {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		doReturn(projectProperties).when(mavenProject).getProperties();
		mojo.outputFile = testFolder;

		// execute
		Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
	}

	@Test
	public void testExecuteWithProjectPropertiesExclusion(@TempDir File testFolder) throws Exception {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		String excludedKey = "excludedPropertyKey";
		String excludedValue = "excludedPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		projectProperties.setProperty(excludedKey, excludedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();
		mojo.setExclude(excludedKey);

		// execute
		mojo.execute();

		// assert
		assertTrue(mojo.outputFile.exists());
		Properties storedProperties = getStoredProperties(testFolder);
		assertEquals(1, storedProperties.size());
		assertTrue(storedProperties.containsKey(key));
		assertEquals(value, storedProperties.getProperty(key));
	}

	@Test
	public void testExecuteWithProjectPropertiesInclusion(@TempDir File testFolder) throws Exception {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		String includedKey = "includedPropertyKey";
		String includedValue = "includedPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		projectProperties.setProperty(includedKey, includedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();
		mojo.setInclude(includedKey);

		// execute
		mojo.execute();

		// assert
		assertTrue(mojo.outputFile.exists());
		Properties storedProperties = getStoredProperties(testFolder);
		assertEquals(1, storedProperties.size());
		assertTrue(storedProperties.containsKey(includedKey));
		assertEquals(includedValue, storedProperties.getProperty(includedKey));
	}

	@Test
	public void testExecuteIncludingPropertyKeysFromFile(@TempDir File testFolder) throws Exception {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		String includedKey = "includedPropertyKey";
		String includedValue = "includedPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		projectProperties.setProperty(includedKey, includedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();

		File includedPropertiesFile = new File(testFolder, "included.properties");
		Properties includedProperties = new Properties();
		includedProperties.setProperty(includedKey, "irrelevantValue");
		includedProperties.store(new FileWriter(includedPropertiesFile), null);

		mojo.setIncludePropertyKeysFromFiles(new String[] {
			includedPropertiesFile.getAbsolutePath()
		});

		// execute
		mojo.execute();

		// assert
		assertTrue(mojo.outputFile.exists());
		Properties storedProperties = getStoredProperties(testFolder);
		assertEquals(1, storedProperties.size());
		assertTrue(storedProperties.containsKey(includedKey));
		assertEquals(includedValue, storedProperties.getProperty(includedKey));
	}

	@Test
	public void testExecuteIncludingPropertyKeysFromClasspathResource(@TempDir File testFolder)
		throws Exception {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		String includedKey = "includedPropertyKey";
		String includedValue = "includedPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		projectProperties.setProperty(includedKey, includedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();

		mojo
			.setIncludePropertyKeysFromFiles(
				new String[] {
					"/eu/dnetlib/maven/plugin/properties/included.properties"
				});

		// execute
		mojo.execute();

		// assert
		assertTrue(mojo.outputFile.exists());
		Properties storedProperties = getStoredProperties(testFolder);
		assertEquals(1, storedProperties.size());
		assertTrue(storedProperties.containsKey(includedKey));
		assertEquals(includedValue, storedProperties.getProperty(includedKey));
	}

	@Test
	public void testExecuteIncludingPropertyKeysFromBlankLocation() {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		String includedKey = "includedPropertyKey";
		String includedValue = "includedPropertyValue";
		Properties projectProperties = new Properties();
		projectProperties.setProperty(key, value);
		projectProperties.setProperty(includedKey, includedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();

		mojo.setIncludePropertyKeysFromFiles(new String[] {
			""
		});

		// execute
		Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
	}

	@Test
	public void testExecuteIncludingPropertyKeysFromXmlFile(@TempDir File testFolder)
		throws Exception {
		// given
		String key = "projectPropertyKey";
		String value = "projectPropertyValue";
		String includedKey = "includedPropertyKey";
|
String includedKey = "includedPropertyKey";
|
||||||
String includedValue = "includedPropertyValue";
|
String includedValue = "includedPropertyValue";
|
||||||
Properties projectProperties = new Properties();
|
Properties projectProperties = new Properties();
|
||||||
projectProperties.setProperty(key, value);
|
projectProperties.setProperty(key, value);
|
||||||
projectProperties.setProperty(includedKey, includedValue);
|
projectProperties.setProperty(includedKey, includedValue);
|
||||||
doReturn(projectProperties).when(mavenProject).getProperties();
|
doReturn(projectProperties).when(mavenProject).getProperties();
|
||||||
|
|
||||||
File includedPropertiesFile = new File(testFolder, "included.xml");
|
File includedPropertiesFile = new File(testFolder, "included.xml");
|
||||||
Properties includedProperties = new Properties();
|
Properties includedProperties = new Properties();
|
||||||
includedProperties.setProperty(includedKey, "irrelevantValue");
|
includedProperties.setProperty(includedKey, "irrelevantValue");
|
||||||
includedProperties.storeToXML(new FileOutputStream(includedPropertiesFile), null);
|
includedProperties.storeToXML(new FileOutputStream(includedPropertiesFile), null);
|
||||||
|
|
||||||
mojo.setIncludePropertyKeysFromFiles(new String[] {includedPropertiesFile.getAbsolutePath()});
|
mojo.setIncludePropertyKeysFromFiles(new String[] {
|
||||||
|
includedPropertiesFile.getAbsolutePath()
|
||||||
|
});
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
mojo.execute();
|
mojo.execute();
|
||||||
|
|
||||||
// assert
|
// assert
|
||||||
assertTrue(mojo.outputFile.exists());
|
assertTrue(mojo.outputFile.exists());
|
||||||
Properties storedProperties = getStoredProperties(testFolder);
|
Properties storedProperties = getStoredProperties(testFolder);
|
||||||
assertEquals(1, storedProperties.size());
|
assertEquals(1, storedProperties.size());
|
||||||
assertTrue(storedProperties.containsKey(includedKey));
|
assertTrue(storedProperties.containsKey(includedKey));
|
||||||
assertEquals(includedValue, storedProperties.getProperty(includedKey));
|
assertEquals(includedValue, storedProperties.getProperty(includedKey));
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void testExecuteIncludingPropertyKeysFromInvalidXmlFile(@TempDir File testFolder)
|
public void testExecuteIncludingPropertyKeysFromInvalidXmlFile(@TempDir File testFolder)
|
||||||
throws Exception {
|
throws Exception {
|
||||||
// given
|
// given
|
||||||
String key = "projectPropertyKey";
|
String key = "projectPropertyKey";
|
||||||
String value = "projectPropertyValue";
|
String value = "projectPropertyValue";
|
||||||
String includedKey = "includedPropertyKey";
|
String includedKey = "includedPropertyKey";
|
||||||
String includedValue = "includedPropertyValue";
|
String includedValue = "includedPropertyValue";
|
||||||
Properties projectProperties = new Properties();
|
Properties projectProperties = new Properties();
|
||||||
projectProperties.setProperty(key, value);
|
projectProperties.setProperty(key, value);
|
||||||
projectProperties.setProperty(includedKey, includedValue);
|
projectProperties.setProperty(includedKey, includedValue);
|
||||||
doReturn(projectProperties).when(mavenProject).getProperties();
|
doReturn(projectProperties).when(mavenProject).getProperties();
|
||||||
|
|
||||||
File includedPropertiesFile = new File(testFolder, "included.xml");
|
File includedPropertiesFile = new File(testFolder, "included.xml");
|
||||||
Properties includedProperties = new Properties();
|
Properties includedProperties = new Properties();
|
||||||
includedProperties.setProperty(includedKey, "irrelevantValue");
|
includedProperties.setProperty(includedKey, "irrelevantValue");
|
||||||
includedProperties.store(new FileOutputStream(includedPropertiesFile), null);
|
includedProperties.store(new FileOutputStream(includedPropertiesFile), null);
|
||||||
|
|
||||||
mojo.setIncludePropertyKeysFromFiles(new String[] {includedPropertiesFile.getAbsolutePath()});
|
mojo.setIncludePropertyKeysFromFiles(new String[] {
|
||||||
|
includedPropertiesFile.getAbsolutePath()
|
||||||
|
});
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
|
Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void testExecuteWithQuietModeOn(@TempDir File testFolder) throws Exception {
|
public void testExecuteWithQuietModeOn(@TempDir File testFolder) throws Exception {
|
||||||
// given
|
// given
|
||||||
mojo.setQuiet(true);
|
mojo.setQuiet(true);
|
||||||
mojo.setIncludePropertyKeysFromFiles(new String[] {"invalid location"});
|
mojo.setIncludePropertyKeysFromFiles(new String[] {
|
||||||
|
"invalid location"
|
||||||
|
});
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
mojo.execute();
|
mojo.execute();
|
||||||
|
|
||||||
// assert
|
// assert
|
||||||
assertTrue(mojo.outputFile.exists());
|
assertTrue(mojo.outputFile.exists());
|
||||||
Properties storedProperties = getStoredProperties(testFolder);
|
Properties storedProperties = getStoredProperties(testFolder);
|
||||||
assertEquals(0, storedProperties.size());
|
assertEquals(0, storedProperties.size());
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void testExecuteIncludingPropertyKeysFromInvalidFile() {
|
public void testExecuteIncludingPropertyKeysFromInvalidFile() {
|
||||||
// given
|
// given
|
||||||
mojo.setIncludePropertyKeysFromFiles(new String[] {"invalid location"});
|
mojo.setIncludePropertyKeysFromFiles(new String[] {
|
||||||
|
"invalid location"
|
||||||
|
});
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
|
Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void testExecuteWithEnvironmentProperties(@TempDir File testFolder) throws Exception {
|
public void testExecuteWithEnvironmentProperties(@TempDir File testFolder) throws Exception {
|
||||||
// given
|
// given
|
||||||
mojo.setIncludeEnvironmentVariables(true);
|
mojo.setIncludeEnvironmentVariables(true);
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
mojo.execute();
|
mojo.execute();
|
||||||
|
|
||||||
// assert
|
// assert
|
||||||
assertTrue(mojo.outputFile.exists());
|
assertTrue(mojo.outputFile.exists());
|
||||||
Properties storedProperties = getStoredProperties(testFolder);
|
Properties storedProperties = getStoredProperties(testFolder);
|
||||||
assertTrue(storedProperties.size() > 0);
|
assertTrue(storedProperties.size() > 0);
|
||||||
for (Object currentKey : storedProperties.keySet()) {
|
for (Object currentKey : storedProperties.keySet()) {
|
||||||
assertTrue(((String) currentKey).startsWith(PROPERTY_PREFIX_ENV));
|
assertTrue(((String) currentKey).startsWith(PROPERTY_PREFIX_ENV));
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void testExecuteWithSystemProperties(@TempDir File testFolder) throws Exception {
|
public void testExecuteWithSystemProperties(@TempDir File testFolder) throws Exception {
|
||||||
// given
|
// given
|
||||||
String key = "systemPropertyKey";
|
String key = "systemPropertyKey";
|
||||||
String value = "systemPropertyValue";
|
String value = "systemPropertyValue";
|
||||||
System.setProperty(key, value);
|
System.setProperty(key, value);
|
||||||
mojo.setIncludeSystemProperties(true);
|
mojo.setIncludeSystemProperties(true);
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
mojo.execute();
|
mojo.execute();
|
||||||
|
|
||||||
// assert
|
// assert
|
||||||
assertTrue(mojo.outputFile.exists());
|
assertTrue(mojo.outputFile.exists());
|
||||||
Properties storedProperties = getStoredProperties(testFolder);
|
Properties storedProperties = getStoredProperties(testFolder);
|
||||||
assertTrue(storedProperties.size() > 0);
|
assertTrue(storedProperties.size() > 0);
|
||||||
assertTrue(storedProperties.containsKey(key));
|
assertTrue(storedProperties.containsKey(key));
|
||||||
assertEquals(value, storedProperties.getProperty(key));
|
assertEquals(value, storedProperties.getProperty(key));
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void testExecuteWithSystemPropertiesAndEscapeChars(@TempDir File testFolder)
|
public void testExecuteWithSystemPropertiesAndEscapeChars(@TempDir File testFolder)
|
||||||
throws Exception {
|
throws Exception {
|
||||||
// given
|
// given
|
||||||
String key = "systemPropertyKey ";
|
String key = "systemPropertyKey ";
|
||||||
String value = "systemPropertyValue";
|
String value = "systemPropertyValue";
|
||||||
System.setProperty(key, value);
|
System.setProperty(key, value);
|
||||||
mojo.setIncludeSystemProperties(true);
|
mojo.setIncludeSystemProperties(true);
|
||||||
String escapeChars = "cr,lf,tab,|";
|
String escapeChars = "cr,lf,tab,|";
|
||||||
mojo.setEscapeChars(escapeChars);
|
mojo.setEscapeChars(escapeChars);
|
||||||
|
|
||||||
// execute
|
// execute
|
||||||
mojo.execute();
|
mojo.execute();
|
||||||
|
|
||||||
// assert
|
// assert
|
||||||
assertTrue(mojo.outputFile.exists());
|
assertTrue(mojo.outputFile.exists());
|
||||||
Properties storedProperties = getStoredProperties(testFolder);
|
Properties storedProperties = getStoredProperties(testFolder);
|
||||||
assertTrue(storedProperties.size() > 0);
|
assertTrue(storedProperties.size() > 0);
|
||||||
assertFalse(storedProperties.containsKey(key));
|
assertFalse(storedProperties.containsKey(key));
|
||||||
assertTrue(storedProperties.containsKey(key.trim()));
|
assertTrue(storedProperties.containsKey(key.trim()));
|
||||||
assertEquals(value, storedProperties.getProperty(key.trim()));
|
assertEquals(value, storedProperties.getProperty(key.trim()));
|
||||||
}
|
}
|
||||||
|
|
||||||
// ----------------------------------- PRIVATE -------------------------------------------
|
// ----------------------------------- PRIVATE -------------------------------------------
|
||||||
|
|
||||||
private File getPropertiesFileLocation(File testFolder) {
|
private File getPropertiesFileLocation(File testFolder) {
|
||||||
return new File(testFolder, "test.properties");
|
return new File(testFolder, "test.properties");
|
||||||
}
|
}
|
||||||
|
|
||||||
private Properties getStoredProperties(File testFolder)
|
private Properties getStoredProperties(File testFolder)
|
||||||
throws FileNotFoundException, IOException {
|
throws IOException {
|
||||||
Properties properties = new Properties();
|
Properties properties = new Properties();
|
||||||
properties.load(new FileInputStream(getPropertiesFileLocation(testFolder)));
|
properties.load(new FileInputStream(getPropertiesFileLocation(testFolder)));
|
||||||
return properties;
|
return properties;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
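The testExecuteIncludingPropertyKeysFromInvalidXmlFile case above hinges on a `java.util.Properties` detail: `store()` writes the plain `key=value` text format even when the target file is named `*.xml`, and `loadFromXML()` rejects that format outright. A minimal standalone sketch of that behavior (plain JDK, hypothetical temp-file name; not part of the plugin itself):

```java
import java.io.*;
import java.util.InvalidPropertiesFormatException;
import java.util.Properties;

public class PropertiesFormatSketch {
	public static void main(String[] args) throws IOException {
		File plain = File.createTempFile("included", ".xml");
		plain.deleteOnExit();

		Properties props = new Properties();
		props.setProperty("includedPropertyKey", "irrelevantValue");
		// store() writes the plain key=value format, regardless of the ".xml" name
		try (OutputStream out = new FileOutputStream(plain)) {
			props.store(out, null);
		}

		// loadFromXML() refuses the plain format; this mismatch is what the
		// "invalid XML file" test turns into a MojoExecutionException
		try (InputStream in = new FileInputStream(plain)) {
			new Properties().loadFromXML(in);
			System.out.println("loaded");
		} catch (InvalidPropertiesFormatException e) {
			System.out.println("not valid XML properties");
		}
	}
}
```

Writing with `storeToXML()` instead, as the valid-XML test does, makes the round trip succeed.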

@@ -11,6 +11,38 @@

	<packaging>jar</packaging>

	<description>This module contains resources supporting common code style conventions</description>

	<distributionManagement>
		<snapshotRepository>
			<id>dnet45-snapshots</id>
			<name>DNet45 Snapshots</name>
			<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-snapshots</url>
			<layout>default</layout>
		</snapshotRepository>
		<repository>
			<id>dnet45-releases</id>
			<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-releases</url>
		</repository>
	</distributionManagement>

	<build>
		<pluginManagement>
			<plugins>
				<plugin>
					<groupId>org.apache.maven.plugins</groupId>
					<artifactId>maven-project-info-reports-plugin</artifactId>
					<version>3.0.0</version>
				</plugin>
				<plugin>
					<groupId>org.apache.maven.plugins</groupId>
					<artifactId>maven-site-plugin</artifactId>
					<version>3.7.1</version>
				</plugin>
			</plugins>
		</pluginManagement>
	</build>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
	</properties>

@@ -8,6 +8,9 @@
	</parent>
	<artifactId>dhp-build</artifactId>
	<packaging>pom</packaging>

	<description>This module is a container for the build tools used in dnet-hadoop</description>

	<modules>
		<module>dhp-code-style</module>
		<module>dhp-build-assembly-resources</module>

@@ -12,6 +12,8 @@
	<artifactId>dhp-common</artifactId>
	<packaging>jar</packaging>

	<description>This module contains common utilities meant to be used across the dnet-hadoop submodules</description>

	<dependencies>

		<dependency>

@@ -1,3 +1,4 @@

package eu.dnetlib.collector.worker.model;

import java.util.HashMap;
@@ -5,43 +6,43 @@ import java.util.Map;

public class ApiDescriptor {

	private String id;

	private String baseUrl;

	private String protocol;

	private Map<String, String> params = new HashMap<>();

	public String getBaseUrl() {
		return baseUrl;
	}

	public void setBaseUrl(final String baseUrl) {
		this.baseUrl = baseUrl;
	}

	public String getId() {
		return id;
	}

	public void setId(final String id) {
		this.id = id;
	}

	public Map<String, String> getParams() {
		return params;
	}

	public void setParams(final HashMap<String, String> params) {
		this.params = params;
	}

	public String getProtocol() {
		return protocol;
	}

	public void setProtocol(final String protocol) {
		this.protocol = protocol;
	}
}

@@ -1,7 +1,9 @@

package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;
import java.util.UUID;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -11,107 +13,107 @@ import javax.persistence.Table;
@Table(name = "mdstores")
public class MDStore implements Serializable {

	/** */
	private static final long serialVersionUID = 3160530489149700055L;

	@Id
	@Column(name = "id")
	private String id;

	@Column(name = "format")
	private String format;

	@Column(name = "layout")
	private String layout;

	@Column(name = "interpretation")
	private String interpretation;

	@Column(name = "datasource_name")
	private String datasourceName;

	@Column(name = "datasource_id")
	private String datasourceId;

	@Column(name = "api_id")
	private String apiId;

	public String getId() {
		return id;
	}

	public void setId(final String id) {
		this.id = id;
	}

	public String getFormat() {
		return format;
	}

	public void setFormat(final String format) {
		this.format = format;
	}

	public String getLayout() {
		return layout;
	}

	public void setLayout(final String layout) {
		this.layout = layout;
	}

	public String getInterpretation() {
		return interpretation;
	}

	public void setInterpretation(final String interpretation) {
		this.interpretation = interpretation;
	}

	public String getDatasourceName() {
		return datasourceName;
	}

	public void setDatasourceName(final String datasourceName) {
		this.datasourceName = datasourceName;
	}

	public String getDatasourceId() {
		return datasourceId;
	}

	public void setDatasourceId(final String datasourceId) {
		this.datasourceId = datasourceId;
	}

	public String getApiId() {
		return apiId;
	}

	public void setApiId(final String apiId) {
		this.apiId = apiId;
	}

	public static MDStore newInstance(
		final String format, final String layout, final String interpretation) {
		return newInstance(format, layout, interpretation, null, null, null);
	}

	public static MDStore newInstance(
		final String format,
		final String layout,
		final String interpretation,
		final String dsName,
		final String dsId,
		final String apiId) {
		final MDStore md = new MDStore();
		md.setId("md-" + UUID.randomUUID());
		md.setFormat(format);
		md.setLayout(layout);
		md.setInterpretation(interpretation);
		md.setDatasourceName(dsName);
		md.setDatasourceId(dsId);
		md.setApiId(apiId);
		return md;
	}
}
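The `MDStore.newInstance` factory above derives record ids from a random UUID rather than from the record's content. A standalone sketch of just that id convention (hypothetical class and method names, plain JDK; the real logic lives inside `MDStore`):

```java
import java.util.UUID;

public class MdStoreIdSketch {
	// Mirrors the id scheme used by MDStore.newInstance:
	// "md-" followed by a random type-4 UUID, so ids are unique per call
	// and carry no information about format/layout/interpretation.
	static String newMdStoreId() {
		return "md-" + UUID.randomUUID();
	}

	public static void main(String[] args) {
		// e.g. md-550e8400-e29b-41d4-a716-446655440000
		System.out.println(newMdStoreId());
	}
}
```

Because the id is random, calling `newInstance` twice with identical arguments produces two distinct mdstores.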

@@ -1,6 +1,8 @@

package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -10,40 +12,40 @@
@Table(name = "mdstore_current_versions")
public class MDStoreCurrentVersion implements Serializable {

	/** */
	private static final long serialVersionUID = -4757725888593745773L;

	@Id
	@Column(name = "mdstore")
	private String mdstore;

	@Column(name = "current_version")
	private String currentVersion;

	public String getMdstore() {
		return mdstore;
	}

	public void setMdstore(final String mdstore) {
		this.mdstore = mdstore;
	}

	public String getCurrentVersion() {
		return currentVersion;
	}

	public void setCurrentVersion(final String currentVersion) {
		this.currentVersion = currentVersion;
	}

	public static MDStoreCurrentVersion newInstance(final String mdId, final String versionId) {
		final MDStoreCurrentVersion cv = new MDStoreCurrentVersion();
		cv.setMdstore(mdId);
		cv.setCurrentVersion(versionId);
		return cv;
	}

	public static MDStoreCurrentVersion newInstance(final MDStoreVersion v) {
		return newInstance(v.getMdstore(), v.getId());
	}
}

@@ -1,7 +1,9 @@

package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;
import java.util.Date;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -13,85 +15,85 @@ import javax.persistence.TemporalType;
@Table(name = "mdstore_versions")
public class MDStoreVersion implements Serializable {

	/** */
	private static final long serialVersionUID = -4763494442274298339L;

	@Id
	@Column(name = "id")
	private String id;

	@Column(name = "mdstore")
	private String mdstore;

	@Column(name = "writing")
	private boolean writing;

	@Column(name = "readcount")
	private int readCount = 0;

	@Column(name = "lastupdate")
	@Temporal(TemporalType.TIMESTAMP)
	private Date lastUpdate;

	@Column(name = "size")
	private long size = 0;

	public static MDStoreVersion newInstance(final String mdId, final boolean writing) {
		final MDStoreVersion t = new MDStoreVersion();
		t.setId(mdId + "-" + new Date().getTime());
		t.setMdstore(mdId);
		t.setLastUpdate(null);
		t.setWriting(writing);
		t.setReadCount(0);
		t.setSize(0);
		return t;
	}

	public String getId() {
		return id;
	}

	public void setId(final String id) {
		this.id = id;
	}

	public String getMdstore() {
		return mdstore;
	}

	public void setMdstore(final String mdstore) {
		this.mdstore = mdstore;
	}

	public boolean isWriting() {
		return writing;
	}

	public void setWriting(final boolean writing) {
		this.writing = writing;
	}

	public int getReadCount() {
		return readCount;
	}

	public void setReadCount(final int readCount) {
		this.readCount = readCount;
	}

	public Date getLastUpdate() {
		return lastUpdate;
	}

	public void setLastUpdate(final Date lastUpdate) {
		this.lastUpdate = lastUpdate;
	}

	public long getSize() {
		return size;
	}

	public void setSize(final long size) {
		this.size = size;
	}
}
@ -1,7 +1,9 @@
|
||||||
|
|
||||||
package eu.dnetlib.data.mdstore.manager.common.model;
|
package eu.dnetlib.data.mdstore.manager.common.model;
|
||||||
|
|
||||||
import java.io.Serializable;
|
import java.io.Serializable;
|
||||||
import java.util.Date;
|
import java.util.Date;
|
||||||
|
|
||||||
import javax.persistence.Column;
|
import javax.persistence.Column;
|
||||||
import javax.persistence.Entity;
|
import javax.persistence.Entity;
|
||||||
import javax.persistence.Id;
|
import javax.persistence.Id;
|
||||||
|
@@ -13,129 +15,129 @@ import javax.persistence.TemporalType;
@Table(name = "mdstores_with_info")
public class MDStoreWithInfo implements Serializable {

    /** */
    private static final long serialVersionUID = -8445784770687571492L;

    @Id
    @Column(name = "id")
    private String id;

    @Column(name = "format")
    private String format;

    @Column(name = "layout")
    private String layout;

    @Column(name = "interpretation")
    private String interpretation;

    @Column(name = "datasource_name")
    private String datasourceName;

    @Column(name = "datasource_id")
    private String datasourceId;

    @Column(name = "api_id")
    private String apiId;

    @Column(name = "current_version")
    private String currentVersion;

    @Column(name = "lastupdate")
    @Temporal(TemporalType.TIMESTAMP)
    private Date lastUpdate;

    @Column(name = "size")
    private long size = 0;

    @Column(name = "n_versions")
    private long numberOfVersions = 0;

    public String getId() {
        return id;
    }

    public void setId(final String id) {
        this.id = id;
    }

    public String getFormat() {
        return format;
    }

    public void setFormat(final String format) {
        this.format = format;
    }

    public String getLayout() {
        return layout;
    }

    public void setLayout(final String layout) {
        this.layout = layout;
    }

    public String getInterpretation() {
        return interpretation;
    }

    public void setInterpretation(final String interpretation) {
        this.interpretation = interpretation;
    }

    public String getDatasourceName() {
        return datasourceName;
    }

    public void setDatasourceName(final String datasourceName) {
        this.datasourceName = datasourceName;
    }

    public String getDatasourceId() {
        return datasourceId;
    }

    public void setDatasourceId(final String datasourceId) {
        this.datasourceId = datasourceId;
    }

    public String getApiId() {
        return apiId;
    }

    public void setApiId(final String apiId) {
        this.apiId = apiId;
    }

    public String getCurrentVersion() {
        return currentVersion;
    }

    public void setCurrentVersion(final String currentVersion) {
        this.currentVersion = currentVersion;
    }

    public Date getLastUpdate() {
        return lastUpdate;
    }

    public void setLastUpdate(final Date lastUpdate) {
        this.lastUpdate = lastUpdate;
    }

    public long getSize() {
        return size;
    }

    public void setSize(final long size) {
        this.size = size;
    }

    public long getNumberOfVersions() {
        return numberOfVersions;
    }

    public void setNumberOfVersions(final long numberOfVersions) {
        this.numberOfVersions = numberOfVersions;
    }
}
@@ -1,6 +1,6 @@

package eu.dnetlib.dhp.application;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.Serializable;
@@ -8,87 +8,91 @@ import java.io.StringWriter;
import java.util.*;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

import org.apache.commons.cli.*;
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.io.IOUtils;

import com.fasterxml.jackson.databind.ObjectMapper;

public class ArgumentApplicationParser implements Serializable {

    private final Options options = new Options();
    private final Map<String, String> objectMap = new HashMap<>();

    private final List<String> compressedValues = new ArrayList<>();

    public ArgumentApplicationParser(final String json_configuration) throws Exception {
        final ObjectMapper mapper = new ObjectMapper();
        final OptionsParameter[] configuration = mapper.readValue(json_configuration, OptionsParameter[].class);
        createOptionMap(configuration);
    }

    public ArgumentApplicationParser(final OptionsParameter[] configuration) {
        createOptionMap(configuration);
    }

    private void createOptionMap(final OptionsParameter[] configuration) {

        Arrays
            .stream(configuration)
            .map(
                conf -> {
                    final Option o = new Option(conf.getParamName(), true, conf.getParamDescription());
                    o.setLongOpt(conf.getParamLongName());
                    o.setRequired(conf.isParamRequired());
                    if (conf.isCompressed()) {
                        compressedValues.add(conf.getParamLongName());
                    }
                    return o;
                })
            .forEach(options::addOption);

        // HelpFormatter formatter = new HelpFormatter();
        // formatter.printHelp("myapp", null, options, null, true);

    }

    public static String decompressValue(final String abstractCompressed) {
        try {
            byte[] byteArray = Base64.decodeBase64(abstractCompressed.getBytes());
            GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(byteArray));
            final StringWriter stringWriter = new StringWriter();
            IOUtils.copy(gis, stringWriter);
            return stringWriter.toString();
        } catch (Throwable e) {
            System.out.println("Wrong value to decompress:" + abstractCompressed);
            throw new RuntimeException(e);
        }
    }

    public static String compressArgument(final String value) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(out);
        gzip.write(value.getBytes());
        gzip.close();
        return java.util.Base64.getEncoder().encodeToString(out.toByteArray());
    }

    public void parseArgument(final String[] args) throws Exception {
        CommandLineParser parser = new BasicParser();
        CommandLine cmd = parser.parse(options, args);
        Arrays
            .stream(cmd.getOptions())
            .forEach(
                it -> objectMap
                    .put(
                        it.getLongOpt(),
                        compressedValues.contains(it.getLongOpt())
                            ? decompressValue(it.getValue())
                            : it.getValue()));
    }

    public String get(final String key) {
        return objectMap.get(key);
    }

    public Map<String, String> getObjectMap() {
        return objectMap;
    }
}
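The `compressArgument`/`decompressValue` pair above gzips a string and Base64-encodes the compressed bytes so that large argument values can be passed safely on a command line. A minimal self-contained sketch of the same round trip, using only the JDK (the class and method names here are illustrative, not part of the module):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipBase64RoundTrip {

    // Gzip the value, then Base64-encode the compressed bytes.
    static String compress(final String value) throws Exception {
        final ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(value.getBytes(StandardCharsets.UTF_8));
        }
        return Base64.getEncoder().encodeToString(out.toByteArray());
    }

    // Reverse the steps: Base64-decode, then gunzip back to a string.
    static String decompress(final String encoded) throws Exception {
        final byte[] bytes = Base64.getDecoder().decode(encoded);
        final ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(bytes))) {
            final byte[] buffer = new byte[4096];
            int n;
            while ((n = gis.read(buffer)) > 0) {
                out.write(buffer, 0, n);
            }
        }
        return new String(out.toByteArray(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        final String original = "a long JSON argument value";
        System.out.println(decompress(compress(original)).equals(original)); // prints "true"
    }
}
```

Note that encoding to UTF-8 explicitly (rather than the platform default charset used by `value.getBytes()` in the original) makes the round trip deterministic across machines.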
@@ -1,36 +1,38 @@

package eu.dnetlib.dhp.application;

public class OptionsParameter {

    private String paramName;
    private String paramLongName;
    private String paramDescription;
    private boolean paramRequired;
    private boolean compressed;

    public OptionsParameter() {
    }

    public String getParamName() {
        return paramName;
    }

    public String getParamLongName() {
        return paramLongName;
    }

    public String getParamDescription() {
        return paramDescription;
    }

    public boolean isParamRequired() {
        return paramRequired;
    }

    public boolean isCompressed() {
        return compressed;
    }

    public void setCompressed(boolean compressed) {
        this.compressed = compressed;
    }
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import java.io.Serializable;
@@ -6,46 +7,48 @@ import java.util.function.Supplier;
/** Provides serializable and throwing extensions to standard functional interfaces. */
public class FunctionalInterfaceSupport {

    private FunctionalInterfaceSupport() {
    }

    /**
     * Serializable supplier of any kind of objects. To be used within Spark processing pipelines when supplying
     * functions externally.
     *
     * @param <T>
     */
    @FunctionalInterface
    public interface SerializableSupplier<T> extends Supplier<T>, Serializable {
    }

    /**
     * Extension of consumer accepting functions throwing an exception.
     *
     * @param <T>
     * @param <E>
     */
    @FunctionalInterface
    public interface ThrowingConsumer<T, E extends Exception> {
        void accept(T t) throws E;
    }

    /**
     * Extension of supplier accepting functions throwing an exception.
     *
     * @param <T>
     * @param <E>
     */
    @FunctionalInterface
    public interface ThrowingSupplier<T, E extends Exception> {
        T get() throws E;
    }

    /**
     * Extension of runnable accepting functions throwing an exception.
     *
     * @param <E>
     */
    @FunctionalInterface
    public interface ThrowingRunnable<E extends Exception> {
        void run() throws E;
    }
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import static eu.dnetlib.dhp.common.ThrowingSupport.rethrowAsRuntimeException;
@@ -5,6 +6,7 @@ import static eu.dnetlib.dhp.common.ThrowingSupport.rethrowAsRuntimeException;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
@@ -14,58 +16,59 @@ import org.slf4j.LoggerFactory;

/** HDFS utility methods. */
public class HdfsSupport {
    private static final Logger logger = LoggerFactory.getLogger(HdfsSupport.class);

    private HdfsSupport() {
    }

    /**
     * Checks a path (file or dir) exists on HDFS.
     *
     * @param path Path to be checked
     * @param configuration Configuration of hadoop env
     */
    public static boolean exists(String path, Configuration configuration) {
        logger.info("Checking path: {}", path);
        return rethrowAsRuntimeException(
            () -> {
                Path f = new Path(path);
                FileSystem fileSystem = FileSystem.get(configuration);
                return fileSystem.exists(f);
            });
    }

    /**
     * Removes a path (file or dir) from HDFS.
     *
     * @param path Path to be removed
     * @param configuration Configuration of hadoop env
     */
    public static void remove(String path, Configuration configuration) {
        logger.info("Removing path: {}", path);
        rethrowAsRuntimeException(
            () -> {
                Path f = new Path(path);
                FileSystem fileSystem = FileSystem.get(configuration);
                if (fileSystem.exists(f)) {
                    fileSystem.delete(f, true);
                }
            });
    }

    /**
     * Lists hadoop files located below path or alternatively lists subdirs under path.
     *
     * @param path Path to be listed for hadoop files
     * @param configuration Configuration of hadoop env
     * @return List with string locations of hadoop files
     */
    public static List<String> listFiles(String path, Configuration configuration) {
        logger.info("Listing files in path: {}", path);
        return rethrowAsRuntimeException(
            () -> Arrays
                .stream(FileSystem.get(configuration).listStatus(new Path(path)))
                .filter(FileStatus::isDirectory)
                .map(x -> x.getPath().toString())
                .collect(Collectors.toList()));
    }
}
@@ -1,74 +1,75 @@

package eu.dnetlib.dhp.common;

import java.util.Objects;
import java.util.function.Function;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;

/** SparkSession utility methods. */
public class SparkSessionSupport {

    private SparkSessionSupport() {
    }

    /**
     * Runs a given function using a SparkSession created using the default builder and the supplied SparkConf. Stops
     * the SparkSession when it is managed. Allows reusing a SparkSession created externally.
     *
     * @param conf SparkConf instance
     * @param isSparkSessionManaged When true will stop SparkSession
     * @param fn Consumer to be applied to constructed SparkSession
     */
    public static void runWithSparkSession(
        SparkConf conf, Boolean isSparkSessionManaged, ThrowingConsumer<SparkSession, Exception> fn) {
        runWithSparkSession(
            c -> SparkSession.builder().config(c).getOrCreate(), conf, isSparkSessionManaged, fn);
    }

    /**
     * Runs a given function using a SparkSession created with hive support, using the default builder and the supplied
     * SparkConf. Stops the SparkSession when it is managed. Allows reusing a SparkSession created externally.
     *
     * @param conf SparkConf instance
     * @param isSparkSessionManaged When true will stop SparkSession
     * @param fn Consumer to be applied to constructed SparkSession
     */
    public static void runWithSparkHiveSession(
        SparkConf conf, Boolean isSparkSessionManaged, ThrowingConsumer<SparkSession, Exception> fn) {
        runWithSparkSession(
            c -> SparkSession.builder().config(c).enableHiveSupport().getOrCreate(),
            conf,
            isSparkSessionManaged,
            fn);
    }

    /**
     * Runs a given function using a SparkSession created using the supplied builder and the supplied SparkConf. Stops
     * the SparkSession when it is managed. Allows reusing a SparkSession created externally.
     *
     * @param sparkSessionBuilder Builder of SparkSession
     * @param conf SparkConf instance
     * @param isSparkSessionManaged When true will stop SparkSession
     * @param fn Consumer to be applied to constructed SparkSession
     */
    public static void runWithSparkSession(
        Function<SparkConf, SparkSession> sparkSessionBuilder,
        SparkConf conf,
        Boolean isSparkSessionManaged,
        ThrowingConsumer<SparkSession, Exception> fn) {
        SparkSession spark = null;
        try {
            spark = sparkSessionBuilder.apply(conf);
            fn.accept(spark);
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            if (Objects.nonNull(spark) && isSparkSessionManaged) {
                spark.stop();
            }
        }
    }
}
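The pattern in `SparkSessionSupport` is a generic "run with managed resource": build the resource from a config, hand it to a consumer, and stop it in `finally` only when the caller marked it as managed. The same shape can be sketched without any Spark dependency (all names here are stand-ins for illustration, not the dnet-hadoop API):

```java
import java.util.Objects;
import java.util.function.Consumer;
import java.util.function.Function;

public class ManagedRunDemo {

    // Stand-in resource with a stop() method, playing the role of SparkSession.
    static class Session {
        boolean stopped = false;

        void stop() {
            stopped = true;
        }
    }

    // Builds the resource from a config, hands it to fn, and stops it only when "managed".
    static Session runWithSession(
        Function<String, Session> builder, String conf, Boolean isManaged, Consumer<Session> fn) {
        Session session = null;
        try {
            session = builder.apply(conf);
            fn.accept(session);
        } finally {
            if (Objects.nonNull(session) && isManaged) {
                session.stop();
            }
        }
        return session;
    }

    public static void main(String[] args) {
        Session managed = runWithSession(c -> new Session(), "conf", true, s -> {});
        Session external = runWithSession(c -> new Session(), "conf", false, s -> {});
        System.out.println(managed.stopped + " " + external.stopped); // prints "true false"
    }
}
```

The `isManaged` flag is what lets the same helper serve both batch jobs (which own their session and must stop it) and callers that pass in a long-lived session they intend to keep using.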
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingRunnable;
@@ -6,69 +7,70 @@ import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingSupplier;

/** Exception handling utility methods. */
public class ThrowingSupport {

    private ThrowingSupport() {
    }

    /**
     * Executes given runnable and rethrows any exceptions as RuntimeException.
     *
     * @param fn Runnable to be executed
     * @param <E> Type of exception thrown
     */
    public static <E extends Exception> void rethrowAsRuntimeException(ThrowingRunnable<E> fn) {
        try {
            fn.run();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    /**
     * Executes given runnable and rethrows any exceptions as RuntimeException with custom message.
     *
     * @param fn Runnable to be executed
     * @param msg Message to be set for rethrown exception
     * @param <E> Type of exception thrown
     */
    public static <E extends Exception> void rethrowAsRuntimeException(
        ThrowingRunnable<E> fn, String msg) {
        try {
            fn.run();
        } catch (Exception e) {
            throw new RuntimeException(msg, e);
        }
    }

    /**
     * Executes given supplier and rethrows any exceptions as RuntimeException.
     *
     * @param fn Supplier to be executed
     * @param <T> Type of returned value
     * @param <E> Type of exception thrown
     * @return Result of supplier execution
     */
    public static <T, E extends Exception> T rethrowAsRuntimeException(ThrowingSupplier<T, E> fn) {
        try {
            return fn.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    /**
     * Executes given supplier and rethrows any exceptions as RuntimeException with custom message.
     *
     * @param fn Supplier to be executed
     * @param msg Message to be set for rethrown exception
     * @param <T> Type of returned value
     * @param <E> Type of exception thrown
     * @return Result of supplier execution
     */
    public static <T, E extends Exception> T rethrowAsRuntimeException(
        ThrowingSupplier<T, E> fn, String msg) {
        try {
            return fn.get();
        } catch (Exception e) {
            throw new RuntimeException(msg, e);
        }
    }
}
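`ThrowingSupport` exists so that code throwing checked exceptions can be used inside lambdas (which cannot declare `throws`) without try/catch boilerplate at every call site. A minimal self-contained sketch of the same wrap-and-rethrow pattern, with a stand-in interface rather than the dnet-hadoop classes:

```java
public class RethrowDemo {

    // Stand-in for FunctionalInterfaceSupport.ThrowingSupplier.
    @FunctionalInterface
    interface ThrowingSupplier<T, E extends Exception> {
        T get() throws E;
    }

    // Wraps any checked exception from the supplier into an unchecked RuntimeException.
    static <T, E extends Exception> T rethrowAsRuntimeException(ThrowingSupplier<T, E> fn) {
        try {
            return fn.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // A supplier declaring a checked exception can now be called without try/catch here.
        final String value = rethrowAsRuntimeException(() -> {
            if (args.length > 100) {
                throw new java.io.IOException("unreachable in this demo");
            }
            return "ok";
        });
        System.out.println(value); // prints "ok"
    }
}
```

The original cause is preserved as the `RuntimeException`'s cause, so stack traces still point at the failing operation.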
@@ -1,120 +1,121 @@

package eu.dnetlib.dhp.model.mdstore;

import java.io.Serializable;

import eu.dnetlib.dhp.utils.DHPUtils;

/** This class models a record inside the new Metadata store collection on HDFS. */
public class MetadataRecord implements Serializable {

    /** The D-Net Identifier associated to the record */
    private String id;

    /** The original Identifier of the record */
    private String originalId;

    /** The encoding of the record, should be JSON or XML */
    private String encoding;

    /**
     * The information about the provenance of the record; see {@link Provenance} for the model of this information
     */
    private Provenance provenance;

    /** The content of the metadata */
    private String body;

    /** the date when the record has been collected */
    private long dateOfCollection;

    /** the date when the record has been transformed */
    private long dateOfTransformation;

    public MetadataRecord() {
        this.dateOfCollection = System.currentTimeMillis();
    }

    public MetadataRecord(
        String originalId,
        String encoding,
        Provenance provenance,
        String body,
        long dateOfCollection) {

        this.originalId = originalId;
        this.encoding = encoding;
        this.provenance = provenance;
        this.body = body;
        this.dateOfCollection = dateOfCollection;
        this.id = DHPUtils.generateIdentifier(originalId, this.provenance.getNsPrefix());
    }

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getOriginalId() {
        return originalId;
    }

    public void setOriginalId(String originalId) {
        this.originalId = originalId;
    }

    public String getEncoding() {
        return encoding;
    }

    public void setEncoding(String encoding) {
        this.encoding = encoding;
    }

    public Provenance getProvenance() {
        return provenance;
    }

    public void setProvenance(Provenance provenance) {
        this.provenance = provenance;
    }

    public String getBody() {
        return body;
    }

    public void setBody(String body) {
        this.body = body;
    }

    public long getDateOfCollection() {
        return dateOfCollection;
    }

    public void setDateOfCollection(long dateOfCollection) {
        this.dateOfCollection = dateOfCollection;
    }

    public long getDateOfTransformation() {
        return dateOfTransformation;
    }

    public void setDateOfTransformation(long dateOfTransformation) {
        this.dateOfTransformation = dateOfTransformation;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof MetadataRecord)) {
            return false;
        }
        return ((MetadataRecord) o).getId().equalsIgnoreCase(id);
    }

    @Override
    public int hashCode() {
        return id.hashCode();
    }
}
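One subtlety in `MetadataRecord` above: `equals` compares identifiers case-insensitively, while `hashCode` delegates to the case-sensitive `String.hashCode`. Two records that compare equal can therefore hash differently, which violates the equals/hashCode contract and lets both land in a `HashSet`. A stripped-down stand-in (the `Rec` class is hypothetical; only the equals/hashCode pattern is copied) demonstrates this:

```java
import java.util.HashSet;
import java.util.Set;

public class IdEquality {

    // Minimal stand-in for MetadataRecord: same equals/hashCode pattern.
    static class Rec {
        final String id;

        Rec(String id) {
            this.id = id;
        }

        @Override
        public boolean equals(Object o) {
            if (!(o instanceof Rec)) {
                return false;
            }
            return ((Rec) o).id.equalsIgnoreCase(id);
        }

        @Override
        public int hashCode() {
            // Case-sensitive: equal records may get different hash codes.
            return id.hashCode();
        }
    }

    public static void main(String[] args) {
        Rec a = new Rec("ABC::123");
        Rec b = new Rec("abc::123");
        System.out.println(a.equals(b));                  // true
        System.out.println(a.hashCode() == b.hashCode()); // false
        Set<Rec> set = new HashSet<>();
        set.add(a);
        set.add(b);
        System.out.println(set.size()); // 2, although a.equals(b)
    }
}
```

Normalizing the identifier's case in `hashCode` (or comparing it case-sensitively in `equals`) would restore the contract.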
@@ -1,49 +1,52 @@

package eu.dnetlib.dhp.model.mdstore;

import java.io.Serializable;

/**
 * @author Sandro La Bruzzo
 *         <p>
 *         The Provenance class models the provenance of a record in the metadata store: it contains the identifier
 *         and the name of the datasource that provided the record.
 */
public class Provenance implements Serializable {

    private String datasourceId;

    private String datasourceName;

    private String nsPrefix;

    public Provenance() {
    }

    public Provenance(String datasourceId, String datasourceName, String nsPrefix) {
        this.datasourceId = datasourceId;
        this.datasourceName = datasourceName;
        this.nsPrefix = nsPrefix;
    }

    public String getDatasourceId() {
        return datasourceId;
    }

    public void setDatasourceId(String datasourceId) {
        this.datasourceId = datasourceId;
    }

    public String getDatasourceName() {
        return datasourceName;
    }

    public void setDatasourceName(String datasourceName) {
        this.datasourceName = datasourceName;
    }

    public String getNsPrefix() {
        return nsPrefix;
    }

    public void setNsPrefix(String nsPrefix) {
        this.nsPrefix = nsPrefix;
    }
}
@@ -1,12 +1,13 @@

package eu.dnetlib.dhp.parser.utility;

public class VtdException extends Exception {

    public VtdException(final Exception e) {
        super(e);
    }

    public VtdException(final Throwable e) {
        super(e);
    }
}
@@ -1,105 +1,110 @@

package eu.dnetlib.dhp.parser.utility;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.ximpleware.AutoPilot;
import com.ximpleware.VTDNav;

/** Created by sandro on 9/29/16. */
public class VtdUtilityParser {

    public static List<Node> getTextValuesWithAttributes(
        final AutoPilot ap, final VTDNav vn, final String xpath, final List<String> attributes)
        throws VtdException {
        final List<Node> results = new ArrayList<>();
        try {
            ap.selectXPath(xpath);

            while (ap.evalXPath() != -1) {
                final Node currentNode = new Node();
                int t = vn.getText();
                if (t >= 0) {
                    currentNode.setTextValue(vn.toNormalizedString(t));
                }
                currentNode.setAttributes(getAttributes(vn, attributes));
                results.add(currentNode);
            }
            return results;
        } catch (Exception e) {
            throw new VtdException(e);
        }
    }

    private static Map<String, String> getAttributes(final VTDNav vn, final List<String> attributes) {
        final Map<String, String> currentAttributes = new HashMap<>();
        if (attributes != null) {
            attributes.forEach(attributeKey -> {
                try {
                    int attr = vn.getAttrVal(attributeKey);
                    if (attr > -1) {
                        currentAttributes.put(attributeKey, vn.toNormalizedString(attr));
                    }
                } catch (Throwable e) {
                    throw new RuntimeException(e);
                }
            });
        }
        return currentAttributes;
    }

    public static List<String> getTextValue(final AutoPilot ap, final VTDNav vn, final String xpath)
        throws VtdException {
        List<String> results = new ArrayList<>();
        try {
            ap.selectXPath(xpath);
            while (ap.evalXPath() != -1) {
                int t = vn.getText();
                if (t > -1)
                    results.add(vn.toNormalizedString(t));
            }
            return results;
        } catch (Exception e) {
            throw new VtdException(e);
        }
    }

    public static String getSingleValue(final AutoPilot ap, final VTDNav nav, final String xpath)
        throws VtdException {
        try {
            ap.selectXPath(xpath);
            while (ap.evalXPath() != -1) {
                int it = nav.getText();
                if (it > -1)
                    return nav.toNormalizedString(it);
            }
            return null;
        } catch (Exception e) {
            throw new VtdException(e);
        }
    }

    public static class Node {

        private String textValue;

        private Map<String, String> attributes;

        public String getTextValue() {
            return textValue;
        }

        public void setTextValue(final String textValue) {
            this.textValue = textValue;
        }

        public Map<String, String> getAttributes() {
            return attributes;
        }

        public void setAttributes(final Map<String, String> attributes) {
            this.attributes = attributes;
        }
    }
}
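`VtdUtilityParser` depends on the external VTD-XML library (`com.ximpleware`), so it cannot run standalone. The same text-value extraction can be sketched with the JDK's built-in XPath API; the class and element names below are illustrative, and checked exceptions are wrapped unchecked, mirroring how the original wraps everything into `VtdException`:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XPathTextValues {

    // Mirrors VtdUtilityParser.getTextValue: collect the text content of
    // every node matched by the XPath expression.
    static List<String> getTextValues(String xml, String xpath) {
        try {
            Document doc = DocumentBuilderFactory
                .newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
            NodeList nodes = (NodeList) XPathFactory
                .newInstance()
                .newXPath()
                .evaluate(xpath, doc, XPathConstants.NODESET);
            List<String> results = new ArrayList<>();
            for (int i = 0; i < nodes.getLength(); i++) {
                results.add(nodes.item(i).getTextContent().trim());
            }
            return results;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String xml = "<record><id>a1</id><id>a2</id></record>";
        System.out.println(getTextValues(xml, "//id")); // [a1, a2]
    }
}
```

VTD-XML is preferred in the original because it navigates the document without building a DOM tree, which matters at the scale of a metadata store; the extraction logic is otherwise the same.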
@@ -1,70 +1,75 @@

package eu.dnetlib.dhp.utils;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.binary.Base64OutputStream;
import org.apache.commons.codec.binary.Hex;

import com.jayway.jsonpath.JsonPath;

import net.minidev.json.JSONArray;

public class DHPUtils {

    public static String md5(final String s) {
        try {
            final MessageDigest md = MessageDigest.getInstance("MD5");
            md.update(s.getBytes(StandardCharsets.UTF_8));
            return new String(Hex.encodeHex(md.digest()));
        } catch (final Exception e) {
            System.err.println("Error creating id");
            return null;
        }
    }

    public static String generateIdentifier(final String originalId, final String nsPrefix) {
        return String.format("%s::%s", nsPrefix, DHPUtils.md5(originalId));
    }

    public static String compressString(final String input) {
        try (ByteArrayOutputStream out = new ByteArrayOutputStream();
            Base64OutputStream b64os = new Base64OutputStream(out)) {
            GZIPOutputStream gzip = new GZIPOutputStream(b64os);
            gzip.write(input.getBytes(StandardCharsets.UTF_8));
            gzip.close();
            return out.toString();
        } catch (Throwable e) {
            return null;
        }
    }

    public static String decompressString(final String input) {
        byte[] byteArray = Base64.decodeBase64(input.getBytes());
        int len;
        try (GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(byteArray));
            ByteArrayOutputStream bos = new ByteArrayOutputStream(byteArray.length)) {
            byte[] buffer = new byte[1024];
            while ((len = gis.read(buffer)) != -1) {
                bos.write(buffer, 0, len);
            }
            return bos.toString();
        } catch (Exception e) {
            return null;
        }
    }

    public static String getJPathString(final String jsonPath, final String json) {
        try {
            Object o = JsonPath.read(json, jsonPath);
            if (o instanceof String)
                return (String) o;
            if (o instanceof JSONArray && ((JSONArray) o).size() > 0)
                return (String) ((JSONArray) o).get(0);
            return o.toString();
        } catch (Exception e) {
            return "";
        }
    }
}
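The md5/identifier and compress/decompress round trips in `DHPUtils` can be exercised with a standalone sketch. This version swaps commons-codec for the JDK's `java.util.Base64` and `String.format`-based hex encoding so it runs without extra dependencies; the behavior, not the API, matches the class above, and the input strings are made-up examples:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class DhpUtilsSketch {

    // MD5 hex digest, as in DHPUtils.md5, with manual hex encoding
    // instead of commons-codec Hex.
    static String md5(String s) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // nsPrefix::md5(originalId), the identifier scheme used by MetadataRecord.
    static String generateIdentifier(String originalId, String nsPrefix) {
        return String.format("%s::%s", nsPrefix, md5(originalId));
    }

    // GZIP then Base64-encode, as in DHPUtils.compressString.
    static String compress(String input) {
        try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
                gzip.write(input.getBytes(StandardCharsets.UTF_8));
            }
            return Base64.getEncoder().encodeToString(out.toByteArray());
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Base64-decode then gunzip, as in DHPUtils.decompressString.
    static String decompress(String input) {
        byte[] bytes = Base64.getDecoder().decode(input);
        try (GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(bytes));
            ByteArrayOutputStream bos = new ByteArrayOutputStream(bytes.length)) {
            byte[] buffer = new byte[1024];
            int len;
            while ((len = gis.read(buffer)) != -1) {
                bos.write(buffer, 0, len);
            }
            return bos.toString(StandardCharsets.UTF_8.name());
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(generateIdentifier("oai:example.org:123", "ns______"));
        System.out.println(decompress(compress("hello metadata"))); // hello metadata
    }
}
```

One design note: the identifier scheme hashes only the original identifier, so records from different datasources stay distinct through their namespace prefix even when the original identifiers collide.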
@@ -1,24 +1,26 @@

package eu.dnetlib.dhp.utils;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;

import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;

public class ISLookupClientFactory {

    private static final Log log = LogFactory.getLog(ISLookupClientFactory.class);

    public static ISLookUpService getLookUpService(final String isLookupUrl) {
        return getServiceStub(ISLookUpService.class, isLookupUrl);
    }

    @SuppressWarnings("unchecked")
    private static <T> T getServiceStub(final Class<T> clazz, final String endpoint) {
        log.info(String.format("creating %s stub from %s", clazz.getName(), endpoint));
        final JaxWsProxyFactoryBean jaxWsProxyFactory = new JaxWsProxyFactoryBean();
        jaxWsProxyFactory.setServiceClass(clazz);
        jaxWsProxyFactory.setAddress(endpoint);
        return (T) jaxWsProxyFactory.create();
    }
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.utils.saxon;

import net.sf.saxon.expr.XPathContext;

@@ -9,25 +10,24 @@ import net.sf.saxon.trans.XPathException;

public abstract class AbstractExtensionFunction extends ExtensionFunctionDefinition {

    public static String DEFAULT_SAXON_EXT_NS_URI = "http://www.d-net.research-infrastructures.eu/saxon-extension";

    public abstract String getName();

    public abstract Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException;

    @Override
    public StructuredQName getFunctionQName() {
        return new StructuredQName("dnet", DEFAULT_SAXON_EXT_NS_URI, getName());
    }

    @Override
    public ExtensionFunctionCall makeCallExpression() {
        return new ExtensionFunctionCall() {
            @Override
            public Sequence call(XPathContext context, Sequence[] arguments) throws XPathException {
                return doCall(context, arguments);
            }
        };
    }
}
@@ -1,9 +1,11 @@

package eu.dnetlib.dhp.utils.saxon;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;

import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Item;
import net.sf.saxon.om.Sequence;

@@ -13,55 +15,59 @@ import net.sf.saxon.value.StringValue;

public class ExtractYear extends AbstractExtensionFunction {

    private static final String[] dateFormats = {
        "yyyy-MM-dd", "yyyy/MM/dd"
    };

    @Override
    public String getName() {
        return "extractYear";
    }

    @Override
    public Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException {
        if (arguments == null || arguments.length == 0) {
            return new StringValue("");
        }
        final Item item = arguments[0].head();
        if (item == null) {
            return new StringValue("");
        }
        return new StringValue(_year(item.getStringValue()));
    }

    @Override
    public int getMinimumNumberOfArguments() {
        return 0;
    }

    @Override
    public int getMaximumNumberOfArguments() {
        return 1;
    }

    @Override
    public SequenceType[] getArgumentTypes() {
        return new SequenceType[] {
            SequenceType.OPTIONAL_ITEM
        };
    }

    @Override
    public SequenceType getResultType(SequenceType[] suppliedArgumentTypes) {
        return SequenceType.SINGLE_STRING;
    }

    private String _year(String s) {
        Calendar c = new GregorianCalendar();
        for (String format : dateFormats) {
            try {
                c.setTime(new SimpleDateFormat(format).parse(s));
                return String.valueOf(c.get(Calendar.YEAR));
            } catch (ParseException e) {
                // date not in this format, try the next one
            }
        }
        return "";
    }
}
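The `_year` loop above does not need Saxon to be exercised; this sketch copies its format list and parse-until-success loop into a plain helper (the class name is illustrative):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;

public class YearExtractor {

    private static final String[] DATE_FORMATS = { "yyyy-MM-dd", "yyyy/MM/dd" };

    // Same loop as ExtractYear._year: try each format in order and return the
    // year of the first one that parses, or "" if none match.
    static String extractYear(String s) {
        Calendar c = new GregorianCalendar();
        for (String format : DATE_FORMATS) {
            try {
                c.setTime(new SimpleDateFormat(format).parse(s));
                return String.valueOf(c.get(Calendar.YEAR));
            } catch (ParseException e) {
                // date not in this format, try the next one
            }
        }
        return "";
    }

    public static void main(String[] args) {
        System.out.println(extractYear("2019-05-03")); // 2019
        System.out.println(extractYear("not a date")); // "" (no format matched)
    }
}
```

Swallowing the `ParseException` is intentional here: failure to parse with one format is the signal to try the next, and only exhausting the list yields the empty-string fallback.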
@@ -1,8 +1,10 @@

package eu.dnetlib.dhp.utils.saxon;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Sequence;
import net.sf.saxon.trans.XPathException;

@@ -11,57 +13,59 @@ import net.sf.saxon.value.StringValue;

public class NormalizeDate extends AbstractExtensionFunction {

    private static final String[] normalizeDateFormats = {
        "yyyy-MM-dd'T'hh:mm:ss", "yyyy-MM-dd", "yyyy/MM/dd", "yyyy"
    };

    private static final String normalizeOutFormat = "yyyy-MM-dd'T'hh:mm:ss'Z'";

    @Override
    public String getName() {
        return "normalizeDate";
    }

    @Override
    public Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException {
        if (arguments == null || arguments.length == 0) {
            return new StringValue("");
        }
        String s = arguments[0].head().getStringValue();
        return new StringValue(_year(s));
    }

    @Override
    public int getMinimumNumberOfArguments() {
        return 0;
    }

    @Override
    public int getMaximumNumberOfArguments() {
        return 1;
    }

    @Override
    public SequenceType[] getArgumentTypes() {
        return new SequenceType[] {
            SequenceType.OPTIONAL_ITEM
        };
    }

    @Override
    public SequenceType getResultType(SequenceType[] suppliedArgumentTypes) {
        return SequenceType.SINGLE_STRING;
    }

    private String _year(String s) {
        final String date = s != null ? s.trim() : "";

        for (String format : normalizeDateFormats) {
            try {
                Date parse = new SimpleDateFormat(format).parse(date);
                return new SimpleDateFormat(normalizeOutFormat).format(parse);
            } catch (ParseException e) {
                // date not in this format, try the next one
            }
        }
        return "";
    }
}
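The normalization loop can likewise be run standalone. Note that both the input and output patterns use `hh` (12-hour clock) rather than `HH`; this sketch keeps the patterns unchanged so it matches the class above, which is why midnight renders as `12:00:00` in the output:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateNormalizer {

    private static final String[] IN_FORMATS = {
        "yyyy-MM-dd'T'hh:mm:ss", "yyyy-MM-dd", "yyyy/MM/dd", "yyyy"
    };

    // Kept verbatim from NormalizeDate, including the 12-hour "hh" quirk.
    private static final String OUT_FORMAT = "yyyy-MM-dd'T'hh:mm:ss'Z'";

    // Same logic as NormalizeDate._year: parse with the first matching format
    // and reformat to the canonical output pattern; "" if nothing matches.
    static String normalize(String s) {
        final String date = s != null ? s.trim() : "";
        for (String format : IN_FORMATS) {
            try {
                Date parsed = new SimpleDateFormat(format).parse(date);
                return new SimpleDateFormat(OUT_FORMAT).format(parsed);
            } catch (ParseException e) {
                // date not in this format, try the next one
            }
        }
        return "";
    }

    public static void main(String[] args) {
        System.out.println(normalize("2019/05/03"));
        System.out.println(normalize("no date")); // "" (no format matched)
    }
}
```

If a true 24-hour UTC timestamp is wanted, the output pattern would need `HH` instead of `hh`; the sketch preserves the original behavior rather than correcting it.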
@@ -1,59 +1,63 @@

package eu.dnetlib.dhp.utils.saxon;

import org.apache.commons.lang3.StringUtils;

import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Item;
import net.sf.saxon.om.Sequence;
import net.sf.saxon.trans.XPathException;
import net.sf.saxon.value.SequenceType;
import net.sf.saxon.value.StringValue;

public class PickFirst extends AbstractExtensionFunction {

	@Override
	public String getName() {
		return "pickFirst";
	}

	@Override
	public Sequence doCall(XPathContext context, Sequence[] arguments) throws XPathException {
		// short-circuit ||, not the eager |: with | a null array would still evaluate arguments.length and NPE
		if (arguments == null || arguments.length == 0) {
			return new StringValue("");
		}

		final String s1 = getValue(arguments[0]);
		// guard the second argument: the function accepts between 0 and 2 arguments
		final String s2 = arguments.length > 1 ? getValue(arguments[1]) : "";

		return new StringValue(StringUtils.isNotBlank(s1) ? s1 : StringUtils.isNotBlank(s2) ? s2 : "");
	}

	private String getValue(final Sequence arg) throws XPathException {
		if (arg != null) {
			final Item item = arg.head();
			if (item != null) {
				return item.getStringValue();
			}
		}
		return "";
	}

	@Override
	public int getMinimumNumberOfArguments() {
		return 0;
	}

	@Override
	public int getMaximumNumberOfArguments() {
		return 2;
	}

	@Override
	public SequenceType[] getArgumentTypes() {
		return new SequenceType[] {
			SequenceType.OPTIONAL_ITEM
		};
	}

	@Override
	public SequenceType getResultType(SequenceType[] suppliedArgumentTypes) {
		return SequenceType.SINGLE_STRING;
	}
}
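`pickFirst` returns the first non-blank of its string arguments, or the empty string when none qualifies. The same fallback logic can be sketched in plain Java, without the Saxon `Sequence`/`StringValue` types:

```java
import java.util.Arrays;

public class PickFirstSketch {

	// Mirror of StringUtils.isNotBlank, inlined to keep the sketch dependency-free.
	private static boolean isNotBlank(String s) {
		return s != null && !s.trim().isEmpty();
	}

	// Return the first non-blank value, or "" when every candidate is blank.
	public static String pickFirst(String... values) {
		return Arrays.stream(values)
			.filter(PickFirstSketch::isNotBlank)
			.findFirst()
			.orElse("");
	}

	public static void main(String[] args) {
		System.out.println(pickFirst("", "fallback"));    // prints "fallback"
		System.out.println(pickFirst("first", "second")); // prints "first"
	}
}
```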
@@ -1,29 +1,32 @@

package eu.dnetlib.dhp.utils.saxon;

import java.io.StringReader;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.stream.StreamSource;

import net.sf.saxon.Configuration;
import net.sf.saxon.TransformerFactoryImpl;

public class SaxonTransformerFactory {

	/**
	 * Creates the index record transformer from the given XSLT
	 *
	 * @param xslt the XSLT stylesheet, as a String
	 * @return a Transformer with the custom Saxon extension functions registered
	 * @throws TransformerException
	 */
	public static Transformer newInstance(final String xslt) throws TransformerException {

		final TransformerFactoryImpl factory = new TransformerFactoryImpl();
		final Configuration conf = factory.getConfiguration();
		conf.registerExtensionFunction(new ExtractYear());
		conf.registerExtensionFunction(new NormalizeDate());
		conf.registerExtensionFunction(new PickFirst());

		return factory.newTransformer(new StreamSource(new StringReader(xslt)));
	}
}
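`SaxonTransformerFactory` compiles an XSLT held in a `String` by wrapping it in a `StreamSource` over a `StringReader`. The same pattern works with the JAXP factory bundled in the JDK, shown here so the sketch carries no Saxon dependency (the extension functions above are, of course, Saxon-specific and not reproduced):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltFromString {

	public static String transform(String xslt, String xml) throws Exception {
		// Compile the stylesheet from an in-memory String, as the factory above does.
		Transformer t = TransformerFactory.newInstance()
			.newTransformer(new StreamSource(new StringReader(xslt)));
		StringWriter out = new StringWriter();
		t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
		return out.toString();
	}

	public static void main(String[] args) throws Exception {
		String xslt = "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
			+ "<xsl:output method='text'/>"
			+ "<xsl:template match='/'><xsl:value-of select='/doc/title'/></xsl:template>"
			+ "</xsl:stylesheet>";
		System.out.println(transform(xslt, "<doc><title>hello</title></doc>")); // prints "hello"
	}
}
```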
@@ -1,73 +1,76 @@

package eu.dnetlib.message;

import java.io.IOException;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class Message {

	private String workflowId;

	private String jobName;

	private MessageType type;

	private Map<String, String> body;

	public static Message fromJson(final String json) throws IOException {
		final ObjectMapper jsonMapper = new ObjectMapper();
		return jsonMapper.readValue(json, Message.class);
	}

	public Message() {
	}

	public Message(String workflowId, String jobName, MessageType type, Map<String, String> body) {
		this.workflowId = workflowId;
		this.jobName = jobName;
		this.type = type;
		this.body = body;
	}

	public String getWorkflowId() {
		return workflowId;
	}

	public void setWorkflowId(String workflowId) {
		this.workflowId = workflowId;
	}

	public String getJobName() {
		return jobName;
	}

	public void setJobName(String jobName) {
		this.jobName = jobName;
	}

	public MessageType getType() {
		return type;
	}

	public void setType(MessageType type) {
		this.type = type;
	}

	public Map<String, String> getBody() {
		return body;
	}

	public void setBody(Map<String, String> body) {
		this.body = body;
	}

	@Override
	public String toString() {
		final ObjectMapper jsonMapper = new ObjectMapper();
		try {
			return jsonMapper.writeValueAsString(this);
		} catch (JsonProcessingException e) {
			return null;
		}
	}
}
@@ -1,45 +1,47 @@

package eu.dnetlib.message;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.LinkedBlockingQueue;

import com.rabbitmq.client.AMQP;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.DefaultConsumer;
import com.rabbitmq.client.Envelope;

public class MessageConsumer extends DefaultConsumer {

	final LinkedBlockingQueue<Message> queueMessages;

	/**
	 * Constructs a new instance and records its association to the passed-in channel.
	 *
	 * @param channel the channel to which this consumer is attached
	 * @param queueMessages the shared queue where received messages are enqueued
	 */
	public MessageConsumer(Channel channel, LinkedBlockingQueue<Message> queueMessages) {
		super(channel);
		this.queueMessages = queueMessages;
	}

	@Override
	public void handleDelivery(
		String consumerTag, Envelope envelope, AMQP.BasicProperties properties, byte[] body)
		throws IOException {
		final String json = new String(body, StandardCharsets.UTF_8);
		Message message = Message.fromJson(json);
		try {
			this.queueMessages.put(message);
			System.out.println("Receiving Message " + message);
		} catch (InterruptedException e) {
			if (message.getType() == MessageType.REPORT)
				throw new RuntimeException("Error on sending message");
			else {
				// TODO LOGGING EXCEPTION
			}
		} finally {
			getChannel().basicAck(envelope.getDeliveryTag(), false);
		}
	}
}
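`MessageConsumer.handleDelivery` deserializes each AMQP delivery and hands it to the shared `LinkedBlockingQueue`, acking in a `finally` block so the broker is answered even when the hand-off fails. The hand-off itself can be sketched with the queue alone; the RabbitMQ channel and ack are assumed infrastructure and not reproduced here:

```java
import java.util.concurrent.LinkedBlockingQueue;

public class QueueHandOffSketch {

	// Stand-in for handleDelivery: a delivery thread enqueues the decoded body,
	// while the workflow side blocks on take() until the message arrives.
	public static String handOff(String body) throws InterruptedException {
		final LinkedBlockingQueue<String> queueMessages = new LinkedBlockingQueue<>();

		Thread deliverer = new Thread(() -> {
			try {
				queueMessages.put(body); // what handleDelivery does with the parsed Message
			} catch (InterruptedException e) {
				Thread.currentThread().interrupt();
			}
		});
		deliverer.start();

		String received = queueMessages.take(); // blocks until the delivery lands
		deliverer.join();
		return received;
	}

	public static void main(String[] args) throws InterruptedException {
		System.out.println("Receiving Message " + handOff("{\"workflowId\":\"wf-1\"}"));
	}
}
```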
@@ -1,134 +1,136 @@

package eu.dnetlib.message;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeoutException;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class MessageManager {

	private final String messageHost;

	private final String username;

	private final String password;

	private Connection connection;

	private final Map<String, Channel> channels = new HashMap<>();

	private boolean durable;

	private boolean autodelete;

	private final LinkedBlockingQueue<Message> queueMessages;

	public MessageManager(
		String messageHost,
		String username,
		String password,
		final LinkedBlockingQueue<Message> queueMessages) {
		this.queueMessages = queueMessages;
		this.messageHost = messageHost;
		this.username = username;
		this.password = password;
	}

	public MessageManager(
		String messageHost,
		String username,
		String password,
		boolean durable,
		boolean autodelete,
		final LinkedBlockingQueue<Message> queueMessages) {
		this.queueMessages = queueMessages;
		this.messageHost = messageHost;
		this.username = username;
		this.password = password;

		this.durable = durable;
		this.autodelete = autodelete;
	}

	private Connection createConnection() throws IOException, TimeoutException {
		ConnectionFactory factory = new ConnectionFactory();
		factory.setHost(this.messageHost);
		factory.setUsername(this.username);
		factory.setPassword(this.password);
		return factory.newConnection();
	}

	private Channel createChannel(
		final Connection connection,
		final String queueName,
		final boolean durable,
		final boolean autodelete)
		throws Exception {
		Map<String, Object> args = new HashMap<>();
		args.put("x-message-ttl", 10000);
		Channel channel = connection.createChannel();
		// use the autodelete parameter, not this.autodelete: the field would shadow the argument
		channel.queueDeclare(queueName, durable, false, autodelete, args);
		return channel;
	}

	private Channel getOrCreateChannel(final String queueName, boolean durable, boolean autodelete)
		throws Exception {
		if (channels.containsKey(queueName)) {
			return channels.get(queueName);
		}

		if (this.connection == null) {
			this.connection = createConnection();
		}
		channels.put(queueName, createChannel(this.connection, queueName, durable, autodelete));
		return channels.get(queueName);
	}

	public void close() throws IOException {
		channels
			.values()
			.forEach(
				ch -> {
					try {
						ch.close();
					} catch (Exception e) {
						// TODO LOG
					}
				});

		this.connection.close();
	}

	public boolean sendMessage(final Message message, String queueName) throws Exception {
		try {
			Channel channel = getOrCreateChannel(queueName, this.durable, this.autodelete);
			channel.basicPublish("", queueName, null, message.toString().getBytes());
			return true;
		} catch (Throwable e) {
			throw new RuntimeException(e);
		}
	}

	public boolean sendMessage(
		final Message message, String queueName, boolean durable_var, boolean autodelete_var)
		throws Exception {
		try {
			Channel channel = getOrCreateChannel(queueName, durable_var, autodelete_var);
			channel.basicPublish("", queueName, null, message.toString().getBytes());
			return true;
		} catch (Throwable e) {
			throw new RuntimeException(e);
		}
	}

	public void startConsumingMessage(
		final String queueName, final boolean durable, final boolean autodelete) throws Exception {

		Channel channel = createChannel(createConnection(), queueName, durable, autodelete);
		channel.basicConsume(queueName, false, new MessageConsumer(channel, queueMessages));
	}
}
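`getOrCreateChannel` is a classic get-or-create cache: look up the queue name, lazily open the connection, then create and memoize the channel on a miss. With an unchecked factory the same shape collapses to `Map.computeIfAbsent`; a sketch with plain strings standing in for AMQP channels (the channel names and counter are illustrative, not part of the real class):

```java
import java.util.HashMap;
import java.util.Map;

public class ChannelCacheSketch {

	private final Map<String, String> channels = new HashMap<>();
	private int channelsOpened = 0;

	// Stand-in for createChannel(connection, queueName, durable, autodelete).
	private String createChannel(String queueName) {
		channelsOpened++;
		return "channel-for-" + queueName;
	}

	// One channel per queue name; the factory runs only on a cache miss.
	public String getOrCreateChannel(String queueName) {
		return channels.computeIfAbsent(queueName, this::createChannel);
	}

	public int openedCount() {
		return channelsOpened;
	}

	public static void main(String[] args) {
		ChannelCacheSketch mgr = new ChannelCacheSketch();
		mgr.getOrCreateChannel("ongoing");
		mgr.getOrCreateChannel("ongoing"); // cache hit: no second channel opened
		mgr.getOrCreateChannel("report");
		System.out.println(mgr.openedCount()); // prints 2
	}
}
```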
@@ -1,6 +1,6 @@

package eu.dnetlib.message;

public enum MessageType {
	ONGOING, REPORT
}
@@ -1,24 +1,25 @@

package eu.dnetlib.scholexplorer.relation;

import java.io.Serializable;

public class RelInfo implements Serializable {
	private String original;
	private String inverse;

	public String getOriginal() {
		return original;
	}

	public void setOriginal(String original) {
		this.original = original;
	}

	public String getInverse() {
		return inverse;
	}

	public void setInverse(String inverse) {
		this.inverse = inverse;
	}
}
@@ -1,18 +1,20 @@

package eu.dnetlib.scholexplorer.relation;

import java.io.Serializable;
import java.util.HashMap;

import org.apache.commons.io.IOUtils;

import com.fasterxml.jackson.databind.ObjectMapper;

public class RelationMapper extends HashMap<String, RelInfo> implements Serializable {

	public static RelationMapper load() throws Exception {

		final String json = IOUtils.toString(RelationMapper.class.getResourceAsStream("relations.json"));

		ObjectMapper mapper = new ObjectMapper();
		return mapper.readValue(json, RelationMapper.class);
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.application;

import static org.junit.jupiter.api.Assertions.assertEquals;
@@ -8,58 +9,59 @@ import org.junit.jupiter.api.Test;

public class ArgumentApplicationParserTest {

	@Test
	public void testParseParameter() throws Exception {
		final String jsonConfiguration = IOUtils
			.toString(
				this.getClass().getResourceAsStream("/eu/dnetlib/application/parameters.json"));
		assertNotNull(jsonConfiguration);
		ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
		parser
			.parseArgument(
				new String[] {
					"-p",
					"value0",
					"-a",
					"value1",
					"-n",
					"value2",
					"-u",
					"value3",
					"-ru",
					"value4",
					"-rp",
					"value5",
					"-rh",
					"value6",
					"-ro",
					"value7",
					"-rr",
					"value8",
					"-w",
					"value9",
					"-cc",
					ArgumentApplicationParser.compressArgument(jsonConfiguration)
				});
		assertNotNull(parser.get("hdfsPath"));
		assertNotNull(parser.get("apidescriptor"));
		assertNotNull(parser.get("namenode"));
		assertNotNull(parser.get("userHDFS"));
		assertNotNull(parser.get("rabbitUser"));
		assertNotNull(parser.get("rabbitPassWord"));
		assertNotNull(parser.get("rabbitHost"));
		assertNotNull(parser.get("rabbitOngoingQueue"));
		assertNotNull(parser.get("rabbitReportQueue"));
		assertNotNull(parser.get("workflowId"));
		assertEquals("value0", parser.get("hdfsPath"));
		assertEquals("value1", parser.get("apidescriptor"));
		assertEquals("value2", parser.get("namenode"));
		assertEquals("value3", parser.get("userHDFS"));
		assertEquals("value4", parser.get("rabbitUser"));
		assertEquals("value5", parser.get("rabbitPassWord"));
		assertEquals("value6", parser.get("rabbitHost"));
		assertEquals("value7", parser.get("rabbitOngoingQueue"));
		assertEquals("value8", parser.get("rabbitReportQueue"));
		assertEquals("value9", parser.get("workflowId"));
		assertEquals(jsonConfiguration, parser.get("ccCoco"));
	}
}
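The test passes `ArgumentApplicationParser.compressArgument(jsonConfiguration)` for `-cc` and expects `parser.get("ccCoco")` to round-trip back to the original JSON, so the parser evidently compresses the argument on the way in and decompresses it on the way out. A common way to make a long JSON string safe to pass as a CLI argument is GZIP plus Base64; the sketch below shows that round trip as an assumption about the technique, not as the actual `compressArgument` implementation:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class CompressedArgSketch {

	// GZIP the value and Base64-encode it into a single token.
	public static String compress(String value) throws Exception {
		ByteArrayOutputStream bos = new ByteArrayOutputStream();
		try (GZIPOutputStream gzip = new GZIPOutputStream(bos)) {
			gzip.write(value.getBytes(StandardCharsets.UTF_8));
		}
		return Base64.getEncoder().encodeToString(bos.toByteArray());
	}

	// Reverse: Base64-decode, then gunzip back to the original string.
	public static String decompress(String encoded) throws Exception {
		byte[] raw = Base64.getDecoder().decode(encoded);
		try (GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(raw))) {
			return new String(gzip.readAllBytes(), StandardCharsets.UTF_8);
		}
	}

	public static void main(String[] args) throws Exception {
		String json = "{\"paramName\":\"cc\",\"paramLongName\":\"ccCoco\"}";
		String packed = compress(json);
		System.out.println(json.equals(decompress(packed))); // prints true
	}
}
```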
@ -1,3 +1,4 @@
|
||||||
|
|
||||||
package eu.dnetlib.dhp.common;
|
package eu.dnetlib.dhp.common;
|
||||||
|
|
||||||
import static org.junit.jupiter.api.Assertions.*;
|
import static org.junit.jupiter.api.Assertions.*;
|
||||||
|
@ -8,6 +9,7 @@ import java.nio.file.Path;
|
||||||
import java.util.Arrays;
|
import java.util.Arrays;
|
||||||
import java.util.List;
|
import java.util.List;
|
||||||
import java.util.stream.Collectors;
|
import java.util.stream.Collectors;
|
||||||
|
|
||||||
import org.apache.hadoop.conf.Configuration;
|
import org.apache.hadoop.conf.Configuration;
|
||||||
import org.junit.jupiter.api.Nested;
|
import org.junit.jupiter.api.Nested;
|
||||||
import org.junit.jupiter.api.Test;
|
import org.junit.jupiter.api.Test;
|
||||||
|
@ -15,63 +17,64 @@ import org.junit.jupiter.api.io.TempDir;
|
||||||
|
|
||||||
public class HdfsSupportTest {
|
public class HdfsSupportTest {
|
||||||
|
|
||||||
@Nested
|
@Nested
|
||||||
class Remove {
|
class Remove {
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void shouldThrowARuntimeExceptionOnError() {
|
public void shouldThrowARuntimeExceptionOnError() {
|
||||||
// when
|
// when
|
||||||
assertThrows(RuntimeException.class, () -> HdfsSupport.remove(null, new Configuration()));
|
assertThrows(RuntimeException.class, () -> HdfsSupport.remove(null, new Configuration()));
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void shouldRemoveADirFromHDFS(@TempDir Path tempDir) {
|
public void shouldRemoveADirFromHDFS(@TempDir Path tempDir) {
|
||||||
// when
|
// when
|
||||||
HdfsSupport.remove(tempDir.toString(), new Configuration());
|
HdfsSupport.remove(tempDir.toString(), new Configuration());
|
||||||
|
|
||||||
// then
|
// then
|
||||||
assertFalse(Files.exists(tempDir));
|
assertFalse(Files.exists(tempDir));
|
||||||
}
|
}
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void shouldRemoveAFileFromHDFS(@TempDir Path tempDir) throws IOException {
|
public void shouldRemoveAFileFromHDFS(@TempDir Path tempDir) throws IOException {
|
||||||
// given
|
// given
|
||||||
Path file = Files.createTempFile(tempDir, "p", "s");
|
Path file = Files.createTempFile(tempDir, "p", "s");
|
||||||
|
|
||||||
// when
|
// when
|
||||||
HdfsSupport.remove(file.toString(), new Configuration());
|
HdfsSupport.remove(file.toString(), new Configuration());
|
||||||
|
|
||||||
// then
|
// then
|
||||||
assertFalse(Files.exists(file));
|
assertFalse(Files.exists(file));
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@Nested
|
 	@Nested
 	class ListFiles {
 
 		@Test
 		public void shouldThrowARuntimeExceptionOnError() {
 			// when
 			assertThrows(RuntimeException.class, () -> HdfsSupport.listFiles(null, new Configuration()));
 		}
 
 		@Test
 		public void shouldListFilesLocatedInPath(@TempDir Path tempDir) throws IOException {
 			Path subDir1 = Files.createTempDirectory(tempDir, "list_me");
 			Path subDir2 = Files.createTempDirectory(tempDir, "list_me");
 
 			// when
 			List<String> paths = HdfsSupport.listFiles(tempDir.toString(), new Configuration());
 
 			// then
 			assertEquals(2, paths.size());
-			List<String> expecteds =
-				Arrays.stream(new String[] {subDir1.toString(), subDir2.toString()})
-					.sorted()
-					.collect(Collectors.toList());
+			List<String> expecteds = Arrays.stream(new String[] {
+				subDir1.toString(), subDir2.toString()
+			})
+				.sorted()
+				.collect(Collectors.toList());
 			List<String> actuals = paths.stream().sorted().collect(Collectors.toList());
 			assertTrue(actuals.get(0).contains(expecteds.get(0)));
 			assertTrue(actuals.get(1).contains(expecteds.get(1)));
 		}
 	}
 }
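For readers without a Hadoop test setup at hand, the behaviour the test above expects from `HdfsSupport.listFiles` can be sketched with plain `java.nio` (the `ListFilesSketch` class and its `listFiles` helper are illustrative stand-ins, not part of dhp-common):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ListFilesSketch {

	// plain-java equivalent of what the test expects from HdfsSupport.listFiles:
	// the string paths of the direct children of a directory
	public static List<String> listFiles(Path dir) throws IOException {
		List<String> paths = new ArrayList<>();
		try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
			for (Path p : stream) {
				paths.add(p.toString());
			}
		}
		return paths;
	}

	public static void main(String[] args) throws IOException {
		// mirror the test fixture: two subdirectories under a temp dir
		Path tempDir = Files.createTempDirectory("hdfs_sketch");
		Files.createTempDirectory(tempDir, "list_me");
		Files.createTempDirectory(tempDir, "list_me");
		System.out.println(listFiles(tempDir).size()); // prints "2"
	}
}
```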
@@ -1,55 +1,58 @@
 
 package eu.dnetlib.dhp.common;
 
 import static org.mockito.Mockito.*;
 
-import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;
 import java.util.function.Function;
 
 import org.apache.spark.SparkConf;
 import org.apache.spark.sql.SparkSession;
 import org.junit.jupiter.api.Nested;
 import org.junit.jupiter.api.Test;
 
+import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;
 
 public class SparkSessionSupportTest {
 
 	@Nested
 	class RunWithSparkSession {
 
 		@Test
 		public void shouldExecuteFunctionAndNotStopSparkSessionWhenSparkSessionIsNotManaged()
 			throws Exception {
 			// given
 			SparkSession spark = mock(SparkSession.class);
 			SparkConf conf = mock(SparkConf.class);
 			Function<SparkConf, SparkSession> sparkSessionBuilder = mock(Function.class);
 			when(sparkSessionBuilder.apply(conf)).thenReturn(spark);
 			ThrowingConsumer<SparkSession, Exception> fn = mock(ThrowingConsumer.class);
 
 			// when
 			SparkSessionSupport.runWithSparkSession(sparkSessionBuilder, conf, false, fn);
 
 			// then
 			verify(sparkSessionBuilder).apply(conf);
 			verify(fn).accept(spark);
 			verify(spark, never()).stop();
 		}
 
 		@Test
 		public void shouldExecuteFunctionAndStopSparkSessionWhenSparkSessionIsManaged()
 			throws Exception {
 			// given
 			SparkSession spark = mock(SparkSession.class);
 			SparkConf conf = mock(SparkConf.class);
 			Function<SparkConf, SparkSession> sparkSessionBuilder = mock(Function.class);
 			when(sparkSessionBuilder.apply(conf)).thenReturn(spark);
 			ThrowingConsumer<SparkSession, Exception> fn = mock(ThrowingConsumer.class);
 
 			// when
 			SparkSessionSupport.runWithSparkSession(sparkSessionBuilder, conf, true, fn);
 
 			// then
 			verify(sparkSessionBuilder).apply(conf);
 			verify(fn).accept(spark);
 			verify(spark, times(1)).stop();
 		}
 	}
 }
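The two test cases above hinge on the `managed` flag of `SparkSessionSupport.runWithSparkSession`: the session is stopped only when it is managed. A plain-Java sketch of that pattern (the `RunSupport` and `Session` names are hypothetical stand-ins; only `ThrowingConsumer` mirrors an actual dhp-common type, and the try/finally placement is an assumption about the real helper):

```java
import java.util.function.Function;

public class RunSupport {

	// analogous to FunctionalInterfaceSupport.ThrowingConsumer in dhp-common
	@FunctionalInterface
	public interface ThrowingConsumer<T, E extends Exception> {
		void accept(T t) throws E;
	}

	// a stand-in for SparkSession: something that can be stopped
	public static class Session {
		public boolean stopped = false;

		public void stop() {
			stopped = true;
		}
	}

	// run fn against a session built from conf; stop the session only when managed
	public static <C> Session runWith(Function<C, Session> builder, C conf, boolean managed,
		ThrowingConsumer<Session, Exception> fn) throws Exception {
		Session session = builder.apply(conf);
		try {
			fn.accept(session);
		} finally {
			if (managed) {
				session.stop();
			}
		}
		return session;
	}

	public static void main(String[] args) throws Exception {
		Session s1 = runWith(c -> new Session(), "conf", false, s -> {});
		Session s2 = runWith(c -> new Session(), "conf", true, s -> {});
		System.out.println(s1.stopped + " " + s2.stopped); // prints "false true"
	}
}
```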
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.model.mdstore;
 
 import static org.junit.jupiter.api.Assertions.assertTrue;
 
@@ -6,10 +7,10 @@ import org.junit.jupiter.api.Test;
 
 public class MetadataRecordTest {
 
 	@Test
 	public void getTimestamp() {
 
 		MetadataRecord r = new MetadataRecord();
 		assertTrue(r.getDateOfCollection() > 0);
 	}
 }
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.message;
 
 import static org.junit.jupiter.api.Assertions.*;
 
@@ -5,46 +6,46 @@ import static org.junit.jupiter.api.Assertions.*;
 
 import java.io.IOException;
 import java.util.HashMap;
 import java.util.Map;
 
 import org.junit.jupiter.api.Test;
 
 public class MessageTest {
 
 	@Test
 	public void fromJsonTest() throws IOException {
 		Message m = new Message();
 		m.setWorkflowId("wId");
 		m.setType(MessageType.ONGOING);
 		m.setJobName("Collection");
 		Map<String, String> body = new HashMap<>();
 		body.put("parsedItem", "300");
 		body.put("ExecutionTime", "30s");
 
 		m.setBody(body);
 		System.out.println("m = " + m);
 		Message m1 = Message.fromJson(m.toString());
 		assertEquals(m1.getWorkflowId(), m.getWorkflowId());
 		assertEquals(m1.getType(), m.getType());
 		assertEquals(m1.getJobName(), m.getJobName());
 
 		assertNotNull(m1.getBody());
 		m1.getBody().keySet().forEach(it -> assertEquals(m1.getBody().get(it), m.getBody().get(it)));
 		assertEquals(m1.getJobName(), m.getJobName());
 	}
 
 	@Test
 	public void toStringTest() {
-		final String expectedJson =
-			"{\"workflowId\":\"wId\",\"jobName\":\"Collection\",\"type\":\"ONGOING\",\"body\":{\"ExecutionTime\":\"30s\",\"parsedItem\":\"300\"}}";
+		final String expectedJson = "{\"workflowId\":\"wId\",\"jobName\":\"Collection\",\"type\":\"ONGOING\",\"body\":{\"ExecutionTime\":\"30s\",\"parsedItem\":\"300\"}}";
 		Message m = new Message();
 		m.setWorkflowId("wId");
 		m.setType(MessageType.ONGOING);
 		m.setJobName("Collection");
 		Map<String, String> body = new HashMap<>();
 		body.put("parsedItem", "300");
 		body.put("ExecutionTime", "30s");
 
 		m.setBody(body);
 
 		assertEquals(expectedJson, m.toString());
 	}
 }
@@ -1,13 +1,14 @@
+
 package eu.dnetlib.scholexplorer.relation;
 
 import org.junit.jupiter.api.Test;
 
 public class RelationMapperTest {
 
 	@Test
 	public void testLoadRels() throws Exception {
 
 		RelationMapper relationMapper = RelationMapper.load();
 		relationMapper.keySet().forEach(System.out::println);
 	}
 }
@@ -12,7 +12,7 @@
 	<artifactId>dhp-schemas</artifactId>
 	<packaging>jar</packaging>
 
+	<description>This module contains common schema classes meant to be used across the dnet-hadoop submodules</description>
 
 	<dependencies>
@@ -1,36 +1,40 @@
+
 package eu.dnetlib.dhp.schema.action;
 
-import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
-import eu.dnetlib.dhp.schema.oaf.Oaf;
 import java.io.Serializable;
 
+import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
+
+import eu.dnetlib.dhp.schema.oaf.Oaf;
+
 @JsonDeserialize(using = AtomicActionDeserializer.class)
 public class AtomicAction<T extends Oaf> implements Serializable {
 
 	private Class<T> clazz;
 
 	private T payload;
 
-	public AtomicAction() {}
+	public AtomicAction() {
+	}
 
 	public AtomicAction(Class<T> clazz, T payload) {
 		this.clazz = clazz;
 		this.payload = payload;
 	}
 
 	public Class<T> getClazz() {
 		return clazz;
 	}
 
 	public void setClazz(Class<T> clazz) {
 		this.clazz = clazz;
 	}
 
 	public T getPayload() {
 		return payload;
 	}
 
 	public void setPayload(T payload) {
 		this.payload = payload;
 	}
 }
@@ -1,29 +1,32 @@
+
 package eu.dnetlib.dhp.schema.action;
 
+import java.io.IOException;
+
 import com.fasterxml.jackson.core.JsonParser;
 import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.DeserializationContext;
 import com.fasterxml.jackson.databind.JsonDeserializer;
 import com.fasterxml.jackson.databind.JsonNode;
 import com.fasterxml.jackson.databind.ObjectMapper;
 
 import eu.dnetlib.dhp.schema.oaf.Oaf;
-import java.io.IOException;
 
 public class AtomicActionDeserializer extends JsonDeserializer {
 
 	@Override
 	public Object deserialize(JsonParser jp, DeserializationContext ctxt)
-		throws IOException, JsonProcessingException {
+		throws IOException {
 		JsonNode node = jp.getCodec().readTree(jp);
 		String classTag = node.get("clazz").asText();
 		JsonNode payload = node.get("payload");
 		ObjectMapper mapper = new ObjectMapper();
 
 		try {
 			final Class<?> clazz = Class.forName(classTag);
 			return new AtomicAction(clazz, (Oaf) mapper.readValue(payload.toString(), clazz));
 		} catch (ClassNotFoundException e) {
 			throw new IOException(e);
 		}
 	}
 }
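The deserializer above resolves the concrete payload type at runtime from the `clazz` tag embedded in the JSON envelope. A dependency-free sketch of that class-tag technique (the `Envelope.decode` name is hypothetical, and reflective instantiation stands in for Jackson's `readValue`; failures surface as `IOException`, as in the real deserializer):

```java
import java.io.IOException;

public class Envelope {

	// resolve a class-tag string to a fresh instance, as the deserializer does
	// with Class.forName before handing the payload to Jackson
	public static Object decode(String classTag) throws IOException {
		try {
			Class<?> clazz = Class.forName(classTag);
			return clazz.getDeclaredConstructor().newInstance();
		} catch (ReflectiveOperationException e) {
			// mirror the deserializer: surface resolution failures as IOException
			throw new IOException(e);
		}
	}

	public static void main(String[] args) throws IOException {
		Object o = decode("java.lang.StringBuilder");
		System.out.println(o instanceof StringBuilder); // prints "true"
	}
}
```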
@@ -1,26 +1,21 @@
+
 package eu.dnetlib.dhp.schema.common;
 
 import eu.dnetlib.dhp.schema.oaf.OafEntity;
 
 /** Actual entity types in the Graph */
 public enum EntityType {
-	publication,
-	dataset,
-	otherresearchproduct,
-	software,
-	datasource,
-	organization,
-	project;
+	publication, dataset, otherresearchproduct, software, datasource, organization, project;
 
 	/**
 	 * Resolves the EntityType, given the relative class name
 	 *
 	 * @param clazz the given class name
 	 * @param <T> actual OafEntity subclass
 	 * @return the EntityType associated to the given class
 	 */
 	public static <T extends OafEntity> EntityType fromClass(Class<T> clazz) {
 
 		return EntityType.valueOf(clazz.getSimpleName().toLowerCase());
 	}
 }
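`fromClass` works only because every enum constant is the lower-cased simple name of the corresponding `OafEntity` subclass. A self-contained sketch of that convention (the `Kind`, `Publication`, and `Datasource` names here are illustrative, not the schema classes):

```java
public class EntityTypeSketch {

	// hypothetical stand-ins for the OafEntity subclasses
	public static class Publication {
	}

	public static class Datasource {
	}

	// same resolution convention as EntityType in the schema module
	public enum Kind {
		publication, datasource;

		public static Kind fromClass(Class<?> clazz) {
			// works because constants are the lower-cased simple class names;
			// an unknown class raises IllegalArgumentException from valueOf
			return Kind.valueOf(clazz.getSimpleName().toLowerCase());
		}
	}

	public static void main(String[] args) {
		System.out.println(Kind.fromClass(Publication.class)); // prints "publication"
		System.out.println(Kind.fromClass(Datasource.class)); // prints "datasource"
	}
}
```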
@@ -1,9 +1,7 @@
+
 package eu.dnetlib.dhp.schema.common;
 
 /** Main entity types in the Graph */
 public enum MainEntityType {
-	result,
-	datasource,
-	organization,
-	project
+	result, datasource, organization, project
 }
@@ -1,40 +1,88 @@
+
 package eu.dnetlib.dhp.schema.common;
 
 import eu.dnetlib.dhp.schema.oaf.Qualifier;
 
 public class ModelConstants {
 
 	public static final String DNET_RESULT_TYPOLOGIES = "dnet:result_typologies";
+	public static final String DNET_PUBLICATION_RESOURCE = "dnet:publication_resource";
+	public static final String DNET_ACCESS_MODES = "dnet:access_modes";
+	public static final String DNET_LANGUAGES = "dnet:languages";
+	public static final String DNET_PID_TYPES = "dnet:pid_types";
+	public static final String DNET_DATA_CITE_DATE = "dnet:dataCite_date";
+	public static final String DNET_DATA_CITE_RESOURCE = "dnet:dataCite_resource";
+	public static final String DNET_PROVENANCE_ACTIONS = "dnet:provenanceActions";
+
+	public static final String SYSIMPORT_CROSSWALK_REPOSITORY = "sysimport:crosswalk:repository";
+	public static final String SYSIMPORT_CROSSWALK_ENTITYREGISTRY = "sysimport:crosswalk:entityregistry";
+	public static final String USER_CLAIM = "user:claim";
 
 	public static final String DATASET_RESULTTYPE_CLASSID = "dataset";
 	public static final String PUBLICATION_RESULTTYPE_CLASSID = "publication";
 	public static final String SOFTWARE_RESULTTYPE_CLASSID = "software";
 	public static final String ORP_RESULTTYPE_CLASSID = "other";
 
-	public static Qualifier PUBLICATION_DEFAULT_RESULTTYPE = new Qualifier();
-	public static Qualifier DATASET_DEFAULT_RESULTTYPE = new Qualifier();
-	public static Qualifier SOFTWARE_DEFAULT_RESULTTYPE = new Qualifier();
-	public static Qualifier ORP_DEFAULT_RESULTTYPE = new Qualifier();
-
-	static {
-		PUBLICATION_DEFAULT_RESULTTYPE.setClassid(PUBLICATION_RESULTTYPE_CLASSID);
-		PUBLICATION_DEFAULT_RESULTTYPE.setClassname(PUBLICATION_RESULTTYPE_CLASSID);
-		PUBLICATION_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
-		PUBLICATION_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
-
-		DATASET_DEFAULT_RESULTTYPE.setClassid(DATASET_RESULTTYPE_CLASSID);
-		DATASET_DEFAULT_RESULTTYPE.setClassname(DATASET_RESULTTYPE_CLASSID);
-		DATASET_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
-		DATASET_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
-
-		SOFTWARE_DEFAULT_RESULTTYPE.setClassid(SOFTWARE_RESULTTYPE_CLASSID);
-		SOFTWARE_DEFAULT_RESULTTYPE.setClassname(SOFTWARE_RESULTTYPE_CLASSID);
-		SOFTWARE_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
-		SOFTWARE_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
-
-		ORP_DEFAULT_RESULTTYPE.setClassid(ORP_RESULTTYPE_CLASSID);
-		ORP_DEFAULT_RESULTTYPE.setClassname(ORP_RESULTTYPE_CLASSID);
-		ORP_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
-		ORP_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
-	}
+	public static final String RESULT_RESULT = "resultResult";
+	public static final String PUBLICATION_DATASET = "publicationDataset";
+	public static final String IS_RELATED_TO = "isRelatedTo";
+	public static final String SUPPLEMENT = "supplement";
+	public static final String IS_SUPPLEMENT_TO = "isSupplementTo";
+	public static final String IS_SUPPLEMENTED_BY = "isSupplementedBy";
+	public static final String PART = "part";
+	public static final String IS_PART_OF = "IsPartOf";
+	public static final String HAS_PARTS = "HasParts";
+	public static final String RELATIONSHIP = "relationship";
+
+	public static final String RESULT_PROJECT = "resultProject";
+	public static final String OUTCOME = "outcome";
+	public static final String IS_PRODUCED_BY = "isProducedBy";
+	public static final String PRODUCES = "produces";
+
+	public static final String DATASOURCE_ORGANIZATION = "datasourceOrganization";
+	public static final String PROVISION = "provision";
+	public static final String IS_PROVIDED_BY = "isProvidedBy";
+	public static final String PROVIDES = "provides";
+
+	public static final String PROJECT_ORGANIZATION = "projectOrganization";
+	public static final String PARTICIPATION = "participation";
+	public static final String HAS_PARTICIPANT = "hasParticipant";
+	public static final String IS_PARTICIPANT = "isParticipant";
+
+	public static final Qualifier PUBLICATION_DEFAULT_RESULTTYPE = qualifier(
+		PUBLICATION_RESULTTYPE_CLASSID, PUBLICATION_RESULTTYPE_CLASSID,
+		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
+
+	public static final Qualifier DATASET_DEFAULT_RESULTTYPE = qualifier(
+		DATASET_RESULTTYPE_CLASSID, DATASET_RESULTTYPE_CLASSID,
+		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
+
+	public static final Qualifier SOFTWARE_DEFAULT_RESULTTYPE = qualifier(
+		SOFTWARE_RESULTTYPE_CLASSID, SOFTWARE_RESULTTYPE_CLASSID,
+		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
+
+	public static final Qualifier ORP_DEFAULT_RESULTTYPE = qualifier(
+		ORP_RESULTTYPE_CLASSID, ORP_RESULTTYPE_CLASSID,
+		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);
+
+	public static final Qualifier REPOSITORY_PROVENANCE_ACTIONS = qualifier(
+		SYSIMPORT_CROSSWALK_REPOSITORY, SYSIMPORT_CROSSWALK_REPOSITORY,
+		DNET_PROVENANCE_ACTIONS, DNET_PROVENANCE_ACTIONS);
+
+	public static final Qualifier ENTITYREGISTRY_PROVENANCE_ACTION = qualifier(
+		SYSIMPORT_CROSSWALK_ENTITYREGISTRY, SYSIMPORT_CROSSWALK_ENTITYREGISTRY,
+		DNET_PROVENANCE_ACTIONS, DNET_PROVENANCE_ACTIONS);
+
+	private static Qualifier qualifier(
+		final String classid,
+		final String classname,
+		final String schemeid,
+		final String schemename) {
+		final Qualifier q = new Qualifier();
+		q.setClassid(classid);
+		q.setClassname(classname);
+		q.setSchemeid(schemeid);
+		q.setSchemename(schemename);
+		return q;
+	}
 }
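The refactoring above replaces mutable `Qualifier` statics patched by a `static` initializer with `final` constants built through a private `qualifier(...)` factory. A self-contained sketch of that pattern (the `QualifierSketch` class and its inner `Qualifier` are minimal stand-ins for the schema types):

```java
public class QualifierSketch {

	// minimal stand-in for eu.dnetlib.dhp.schema.oaf.Qualifier
	public static class Qualifier {
		private String classid, classname, schemeid, schemename;

		public String getClassid() {
			return classid;
		}

		public void setClassid(String v) {
			classid = v;
		}

		public void setClassname(String v) {
			classname = v;
		}

		public void setSchemeid(String v) {
			schemeid = v;
		}

		public void setSchemename(String v) {
			schemename = v;
		}
	}

	public static final String DNET_RESULT_TYPOLOGIES = "dnet:result_typologies";

	// the constant is final and fully built at class-load time,
	// instead of a mutable field mutated afterwards by a static block
	public static final Qualifier PUBLICATION_DEFAULT_RESULTTYPE = qualifier(
		"publication", "publication", DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);

	private static Qualifier qualifier(String classid, String classname,
		String schemeid, String schemename) {
		Qualifier q = new Qualifier();
		q.setClassid(classid);
		q.setClassname(classname);
		q.setSchemeid(schemeid);
		q.setSchemename(schemename);
		return q;
	}

	public static void main(String[] args) {
		System.out.println(PUBLICATION_DEFAULT_RESULTTYPE.getClassid()); // prints "publication"
	}
}
```

The factory keeps each constant's four fields on one declaration site, so a new qualifier cannot be half-initialised the way a forgotten setter in the old static block could leave it.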
@@ -13,7 +13,7 @@ import eu.dnetlib.dhp.schema.oaf.*;
 public class ModelSupport {
 
 	/** Defines the mapping between the actual entity type and the main entity type */
-	private static Map<EntityType, MainEntityType> entityMapping = Maps.newHashMap();
+	private static final Map<EntityType, MainEntityType> entityMapping = Maps.newHashMap();
 
 	static {
 		entityMapping.put(EntityType.publication, MainEntityType.result);
@ -53,232 +53,6 @@ public class ModelSupport {
|
||||||
oafTypes.put("relation", Relation.class);
|
oafTypes.put("relation", Relation.class);
|
||||||
}
|
}
|
||||||
|
|
||||||
public static final Map<String, String> entityIdPrefix = Maps.newHashMap();
|
|
||||||
|
|
||||||
static {
|
|
||||||
entityIdPrefix.put("datasource", "10");
|
|
||||||
entityIdPrefix.put("organization", "20");
|
|
||||||
entityIdPrefix.put("project", "40");
|
|
||||||
entityIdPrefix.put("result", "50");
|
|
||||||
}
|
|
||||||
|
|
||||||
public static final Map<String, RelationInverse> relationInverseMap = Maps.newHashMap();
|
|
||||||
|
|
||||||
static {
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"personResult_authorship_isAuthorOf", new RelationInverse()
|
|
||||||
.setRelation("isAuthorOf")
|
|
||||||
.setInverse("hasAuthor")
|
|
||||||
.setRelType("personResult")
|
|
||||||
.setSubReltype("authorship"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"personResult_authorship_hasAuthor", new RelationInverse()
|
|
||||||
.setInverse("isAuthorOf")
|
|
||||||
.setRelation("hasAuthor")
|
|
||||||
.setRelType("personResult")
|
|
||||||
.setSubReltype("authorship"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"projectOrganization_participation_isParticipant", new RelationInverse()
|
|
||||||
.setRelation("isParticipant")
|
|
||||||
.setInverse("hasParticipant")
|
|
||||||
.setRelType("projectOrganization")
|
|
||||||
.setSubReltype("participation"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"projectOrganization_participation_hasParticipant", new RelationInverse()
|
|
||||||
.setInverse("isParticipant")
|
|
||||||
.setRelation("hasParticipant")
|
|
||||||
.setRelType("projectOrganization")
|
|
||||||
.setSubReltype("participation"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultOrganization_affiliation_hasAuthorInstitution", new RelationInverse()
|
|
||||||
.setRelation("hasAuthorInstitution")
|
|
||||||
.setInverse("isAuthorInstitutionOf")
|
|
||||||
.setRelType("resultOrganization")
|
|
||||||
.setSubReltype("affiliation"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultOrganization_affiliation_isAuthorInstitutionOf", new RelationInverse()
|
|
||||||
.setInverse("hasAuthorInstitution")
|
|
||||||
.setRelation("isAuthorInstitutionOf")
|
|
||||||
.setRelType("resultOrganization")
|
|
||||||
.setSubReltype("affiliation"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"organizationOrganization_dedup_merges", new RelationInverse()
|
|
||||||
.setRelation("merges")
|
|
||||||
.setInverse("isMergedIn")
|
|
||||||
.setRelType("organizationOrganization")
|
|
||||||
.setSubReltype("dedup"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"organizationOrganization_dedup_isMergedIn", new RelationInverse()
|
|
||||||
.setInverse("merges")
|
|
||||||
.setRelation("isMergedIn")
|
|
||||||
.setRelType("organizationOrganization")
|
|
||||||
.setSubReltype("dedup"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"organizationOrganization_dedupSimilarity_isSimilarTo", new RelationInverse()
|
|
||||||
.setInverse("isSimilarTo")
|
|
||||||
.setRelation("isSimilarTo")
|
|
||||||
.setRelType("organizationOrganization")
|
|
||||||
.setSubReltype("dedupSimilarity"));
|
|
||||||
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultProject_outcome_isProducedBy", new RelationInverse()
|
|
||||||
.setRelation("isProducedBy")
|
|
||||||
.setInverse("produces")
|
|
||||||
.setRelType("resultProject")
|
|
||||||
.setSubReltype("outcome"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultProject_outcome_produces", new RelationInverse()
|
|
||||||
.setInverse("isProducedBy")
|
|
||||||
.setRelation("produces")
|
|
||||||
.setRelType("resultProject")
|
|
||||||
.setSubReltype("outcome"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"projectPerson_contactPerson_isContact", new RelationInverse()
|
|
||||||
.setRelation("isContact")
|
|
||||||
.setInverse("hasContact")
|
|
||||||
.setRelType("projectPerson")
|
|
||||||
.setSubReltype("contactPerson"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"projectPerson_contactPerson_hasContact", new RelationInverse()
|
|
||||||
.setInverse("isContact")
|
|
||||||
.setRelation("hasContact")
|
|
||||||
.setRelType("personPerson")
|
|
||||||
.setSubReltype("coAuthorship"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"personPerson_coAuthorship_isCoauthorOf", new RelationInverse()
|
|
||||||
.setInverse("isCoAuthorOf")
|
|
||||||
.setRelation("isCoAuthorOf")
|
|
||||||
.setRelType("personPerson")
|
|
||||||
.setSubReltype("coAuthorship"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"personPerson_dedup_merges", new RelationInverse()
|
|
||||||
.setInverse("isMergedIn")
|
|
||||||
.setRelation("merges")
|
|
||||||
.setRelType("personPerson")
|
|
||||||
.setSubReltype("dedup"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"personPerson_dedup_isMergedIn", new RelationInverse()
|
|
||||||
.setInverse("merges")
|
|
||||||
.setRelation("isMergedIn")
|
|
||||||
.setRelType("personPerson")
|
|
||||||
.setSubReltype("dedup"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"personPerson_dedupSimilarity_isSimilarTo", new RelationInverse()
|
|
||||||
.setInverse("isSimilarTo")
|
|
||||||
.setRelation("isSimilarTo")
|
|
||||||
.setRelType("personPerson")
|
|
||||||
.setSubReltype("dedupSimilarity"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"datasourceOrganization_provision_isProvidedBy", new RelationInverse()
|
|
||||||
.setInverse("provides")
|
|
||||||
.setRelation("isProvidedBy")
|
|
||||||
.setRelType("datasourceOrganization")
|
|
||||||
.setSubReltype("provision"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"datasourceOrganization_provision_provides", new RelationInverse()
|
|
||||||
.setInverse("isProvidedBy")
|
|
||||||
.setRelation("provides")
|
|
||||||
.setRelType("datasourceOrganization")
|
|
||||||
.setSubReltype("provision"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultResult_similarity_hasAmongTopNSimilarDocuments", new RelationInverse()
|
|
||||||
.setInverse("isAmongTopNSimilarDocuments")
|
|
||||||
.setRelation("hasAmongTopNSimilarDocuments")
|
|
||||||
.setRelType("resultResult")
|
|
||||||
.setSubReltype("similarity"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultResult_similarity_isAmongTopNSimilarDocuments", new RelationInverse()
|
|
||||||
.setInverse("hasAmongTopNSimilarDocuments")
|
|
||||||
.setRelation("isAmongTopNSimilarDocuments")
|
|
||||||
.setRelType("resultResult")
|
|
||||||
.setSubReltype("similarity"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultResult_relationship_isRelatedTo", new RelationInverse()
|
|
||||||
.setInverse("isRelatedTo")
|
|
||||||
.setRelation("isRelatedTo")
|
|
||||||
.setRelType("resultResult")
|
|
||||||
.setSubReltype("relationship"));
|
|
||||||
relationInverseMap
|
|
||||||
.put(
|
|
||||||
"resultResult_similarity_isAmongTopNSimilarDocuments", new RelationInverse()
|
|
||||||
				.setInverse("hasAmongTopNSimilarDocuments")
				.setRelation("isAmongTopNSimilarDocuments")
				.setRelType("resultResult")
				.setSubReltype("similarity"));
		relationInverseMap
			.put(
				"resultResult_supplement_isSupplementTo", new RelationInverse()
					.setInverse("isSupplementedBy")
					.setRelation("isSupplementTo")
					.setRelType("resultResult")
					.setSubReltype("supplement"));
		relationInverseMap
			.put(
				"resultResult_supplement_isSupplementedBy", new RelationInverse()
					.setInverse("isSupplementTo")
					.setRelation("isSupplementedBy")
					.setRelType("resultResult")
					.setSubReltype("supplement"));
		relationInverseMap
			.put(
				"resultResult_part_isPartOf", new RelationInverse()
					.setInverse("hasPart")
					.setRelation("isPartOf")
					.setRelType("resultResult")
					.setSubReltype("part"));
		relationInverseMap
			.put(
				"resultResult_part_hasPart", new RelationInverse()
					.setInverse("isPartOf")
					.setRelation("hasPart")
					.setRelType("resultResult")
					.setSubReltype("part"));
		relationInverseMap
			.put(
				"resultResult_dedup_merges", new RelationInverse()
					.setInverse("isMergedIn")
					.setRelation("merges")
					.setRelType("resultResult")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"resultResult_dedup_isMergedIn", new RelationInverse()
					.setInverse("merges")
					.setRelation("isMergedIn")
					.setRelType("resultResult")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"resultResult_dedupSimilarity_isSimilarTo", new RelationInverse()
					.setInverse("isSimilarTo")
					.setRelation("isSimilarTo")
					.setRelType("resultResult")
					.setSubReltype("dedupSimilarity"));

	}
	private static final String schemeTemplate = "dnet:%s_%s_relations";

	private ModelSupport() {
@@ -428,4 +202,5 @@ public class ModelSupport {

	private static <T extends Oaf> String idFnForOafEntity(T t) {
		return ((OafEntity) t).getId();
	}
}
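The `relationInverseMap` built above lets callers resolve the inverse term of any relation key. A minimal sketch of that lookup idea, using a simplified stand-in for the project's `RelationInverse` bean (class and method names here are illustrative, not the dnet-hadoop API):

```java
import java.util.HashMap;
import java.util.Map;

public class RelationInverseSketch {

	// Simplified stand-in for the RelationInverse bean used in ModelSupport.
	static class RelationInverse {
		String relation;
		String inverse;

		RelationInverse setRelation(String relation) {
			this.relation = relation;
			return this;
		}

		RelationInverse setInverse(String inverse) {
			this.inverse = inverse;
			return this;
		}
	}

	static final Map<String, RelationInverse> relationInverseMap = new HashMap<>();

	static {
		// Two of the entries registered above, enough to show the pattern.
		relationInverseMap
			.put(
				"resultResult_part_isPartOf", new RelationInverse()
					.setRelation("isPartOf")
					.setInverse("hasPart"));
		relationInverseMap
			.put(
				"resultResult_part_hasPart", new RelationInverse()
					.setRelation("hasPart")
					.setInverse("isPartOf"));
	}

	// Resolve the inverse relation term for a given key, or null if unknown.
	public static String inverseOf(String key) {
		RelationInverse ri = relationInverseMap.get(key);
		return ri == null ? null : ri.inverse;
	}

	public static void main(String[] args) {
		System.out.println(inverseOf("resultResult_part_isPartOf")); // hasPart
	}
}
```

Registering both directions explicitly, as the map above does, keeps the lookup a single `get` with no string parsing.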
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -5,92 +6,84 @@ import java.util.*;
public class Author implements Serializable {

	private String fullname;

	private String name;

	private String surname;

	private Integer rank;

	private List<StructuredProperty> pid;

	private List<Field<String>> affiliation;

	public String getFullname() {
		return fullname;
	}

	public void setFullname(String fullname) {
		this.fullname = fullname;
	}

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public String getSurname() {
		return surname;
	}

	public void setSurname(String surname) {
		this.surname = surname;
	}

	public Integer getRank() {
		return rank;
	}

	public void setRank(Integer rank) {
		this.rank = rank;
	}

	public List<StructuredProperty> getPid() {
		return pid;
	}

	public void setPid(List<StructuredProperty> pid) {
		this.pid = pid;
	}

	public List<Field<String>> getAffiliation() {
		return affiliation;
	}

	public void setAffiliation(List<Field<String>> affiliation) {
		this.affiliation = affiliation;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		Author author = (Author) o;
		return Objects.equals(fullname, author.fullname)
			&& Objects.equals(name, author.name)
			&& Objects.equals(surname, author.surname)
			&& Objects.equals(rank, author.rank)
			&& Objects.equals(pid, author.pid)
			&& Objects.equals(affiliation, author.affiliation);
	}

	@Override
	public int hashCode() {
		return Objects.hash(fullname, name, surname, rank, pid, affiliation);
	}

	public void addPid(StructuredProperty pid) {

		if (pid == null)
			return;

		if (this.pid == null) {
			// Arrays.asList returns a fixed-size list; wrap it so later add() calls succeed
			this.pid = new ArrayList<>(Arrays.asList(pid));
		} else {
			this.pid.add(pid);
		}
	}
}
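`addPid` lazily initialises the `pid` list on first use. One subtlety worth noting: `Arrays.asList` returns a fixed-size view, so the first element must be copied into a growable list or every subsequent `add()` throws `UnsupportedOperationException`. A minimal sketch of that lazy-init pattern with simplified types (this is not the project's `Author` class):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PidListSketch {

	private List<String> pid;

	// Lazily initialise with a growable list so later add() calls succeed;
	// a bare Arrays.asList(p) would be fixed-size and throw on add().
	public void addPid(String p) {
		if (p == null)
			return;
		if (this.pid == null) {
			this.pid = new ArrayList<>(Arrays.asList(p));
		} else {
			this.pid.add(p);
		}
	}

	public List<String> getPid() {
		return pid;
	}

	public static void main(String[] args) {
		PidListSketch a = new PidListSketch();
		a.addPid("doi:10.1/x");
		a.addPid("orcid:0000-0000"); // second add works only on a growable list
		a.addPid(null); // nulls are silently ignored
		System.out.println(a.getPid().size()); // 2
	}
}
```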
@@ -1,42 +1,46 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
import java.util.List;
public class Context implements Serializable {
	private String id;

	private List<DataInfo> dataInfo;

	public String getId() {
		return id;
	}

	public void setId(String id) {
		this.id = id;
	}

	public List<DataInfo> getDataInfo() {
		return dataInfo;
	}

	public void setDataInfo(List<DataInfo> dataInfo) {
		this.dataInfo = dataInfo;
	}

	@Override
	public int hashCode() {
		return id == null ? 0 : id.hashCode();
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;

		Context other = (Context) obj;

		return id.equals(other.getId());
	}
}
@@ -1,30 +1,34 @@

package eu.dnetlib.dhp.schema.oaf;

import java.util.Objects;
public class Country extends Qualifier {

	private DataInfo dataInfo;

	public DataInfo getDataInfo() {
		return dataInfo;
	}

	public void setDataInfo(DataInfo dataInfo) {
		this.dataInfo = dataInfo;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		if (!super.equals(o))
			return false;
		Country country = (Country) o;
		return Objects.equals(dataInfo, country.dataInfo);
	}

	@Override
	public int hashCode() {
		return Objects.hash(super.hashCode(), dataInfo);
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -5,77 +6,80 @@ import java.util.Objects;
public class DataInfo implements Serializable {

	private Boolean invisible = false;
	private Boolean inferred;
	private Boolean deletedbyinference;
	private String trust;
	private String inferenceprovenance;
	private Qualifier provenanceaction;

	public Boolean getInvisible() {
		return invisible;
	}

	public void setInvisible(Boolean invisible) {
		this.invisible = invisible;
	}

	public Boolean getInferred() {
		return inferred;
	}

	public void setInferred(Boolean inferred) {
		this.inferred = inferred;
	}

	public Boolean getDeletedbyinference() {
		return deletedbyinference;
	}

	public void setDeletedbyinference(Boolean deletedbyinference) {
		this.deletedbyinference = deletedbyinference;
	}

	public String getTrust() {
		return trust;
	}

	public void setTrust(String trust) {
		this.trust = trust;
	}

	public String getInferenceprovenance() {
		return inferenceprovenance;
	}

	public void setInferenceprovenance(String inferenceprovenance) {
		this.inferenceprovenance = inferenceprovenance;
	}

	public Qualifier getProvenanceaction() {
		return provenanceaction;
	}

	public void setProvenanceaction(Qualifier provenanceaction) {
		this.provenanceaction = provenanceaction;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		DataInfo dataInfo = (DataInfo) o;
		return Objects.equals(invisible, dataInfo.invisible)
			&& Objects.equals(inferred, dataInfo.inferred)
			&& Objects.equals(deletedbyinference, dataInfo.deletedbyinference)
			&& Objects.equals(trust, dataInfo.trust)
			&& Objects.equals(inferenceprovenance, dataInfo.inferenceprovenance)
			&& Objects.equals(provenanceaction, dataInfo.provenanceaction);
	}

	@Override
	public int hashCode() {
		return Objects
			.hash(
				invisible, inferred, deletedbyinference, trust, inferenceprovenance, provenanceaction);
	}
}
@@ -1,116 +1,115 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
import java.util.List;

import eu.dnetlib.dhp.schema.common.ModelConstants;
public class Dataset extends Result implements Serializable {

	private Field<String> storagedate;

	private Field<String> device;

	private Field<String> size;

	private Field<String> version;

	private Field<String> lastmetadataupdate;

	private Field<String> metadataversionnumber;

	private List<GeoLocation> geolocation;

	public Dataset() {
		setResulttype(ModelConstants.DATASET_DEFAULT_RESULTTYPE);
	}

	public Field<String> getStoragedate() {
		return storagedate;
	}

	public void setStoragedate(Field<String> storagedate) {
		this.storagedate = storagedate;
	}

	public Field<String> getDevice() {
		return device;
	}

	public void setDevice(Field<String> device) {
		this.device = device;
	}

	public Field<String> getSize() {
		return size;
	}

	public void setSize(Field<String> size) {
		this.size = size;
	}

	public Field<String> getVersion() {
		return version;
	}

	public void setVersion(Field<String> version) {
		this.version = version;
	}

	public Field<String> getLastmetadataupdate() {
		return lastmetadataupdate;
	}

	public void setLastmetadataupdate(Field<String> lastmetadataupdate) {
		this.lastmetadataupdate = lastmetadataupdate;
	}

	public Field<String> getMetadataversionnumber() {
		return metadataversionnumber;
	}

	public void setMetadataversionnumber(Field<String> metadataversionnumber) {
		this.metadataversionnumber = metadataversionnumber;
	}

	public List<GeoLocation> getGeolocation() {
		return geolocation;
	}

	public void setGeolocation(List<GeoLocation> geolocation) {
		this.geolocation = geolocation;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!Dataset.class.isAssignableFrom(e.getClass())) {
			return;
		}

		final Dataset d = (Dataset) e;

		storagedate = d.getStoragedate() != null && compareTrust(this, e) < 0 ? d.getStoragedate() : storagedate;

		device = d.getDevice() != null && compareTrust(this, e) < 0 ? d.getDevice() : device;

		size = d.getSize() != null && compareTrust(this, e) < 0 ? d.getSize() : size;

		version = d.getVersion() != null && compareTrust(this, e) < 0 ? d.getVersion() : version;

		lastmetadataupdate = d.getLastmetadataupdate() != null && compareTrust(this, e) < 0
			? d.getLastmetadataupdate()
			: lastmetadataupdate;

		metadataversionnumber = d.getMetadataversionnumber() != null && compareTrust(this, e) < 0
			? d.getMetadataversionnumber()
			: metadataversionnumber;

		geolocation = mergeLists(geolocation, d.getGeolocation());

		mergeOAFDataInfo(d);
	}
}
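`Dataset.mergeFrom` applies one rule per field: the incoming value wins only when it is non-null and the incoming record carries strictly higher trust. That rule can be sketched generically; `compareTrust` is replaced here by plain trust scores, since the real implementation lives in the project's `Oaf` hierarchy (names below are illustrative):

```java
public class TrustMergeSketch {

	// Keep 'theirs' only when it is present and carries strictly higher trust;
	// otherwise retain the value we already have.
	public static <T> T merge(T mine, T theirs, double myTrust, double theirTrust) {
		return theirs != null && myTrust < theirTrust ? theirs : mine;
	}

	public static void main(String[] args) {
		System.out.println(merge("v1", "v2", 0.8, 0.9)); // v2: incoming is more trusted
		System.out.println(merge("v1", null, 0.8, 0.9)); // v1: null never overwrites
		System.out.println(merge("v1", "v2", 0.9, 0.8)); // v1: lower-trust value loses
	}
}
```

The null guard matters: without it, a sparsely populated but highly trusted record would erase fields the current record already has.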
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -5,494 +6,467 @@ import java.util.List;
public class Datasource extends OafEntity implements Serializable {

	private Qualifier datasourcetype;

	private Qualifier openairecompatibility;

	private Field<String> officialname;

	private Field<String> englishname;

	private Field<String> websiteurl;

	private Field<String> logourl;

	private Field<String> contactemail;

	private Field<String> namespaceprefix;

	private Field<String> latitude;

	private Field<String> longitude;

	private Field<String> dateofvalidation;

	private Field<String> description;

	private List<StructuredProperty> subjects;

	// opendoar specific fields (od*)
	private Field<String> odnumberofitems;

	private Field<String> odnumberofitemsdate;

	private Field<String> odpolicies;

	private List<Field<String>> odlanguages;

	private List<Field<String>> odcontenttypes;

	private List<Field<String>> accessinfopackage;

	// re3data fields
	private Field<String> releasestartdate;

	private Field<String> releaseenddate;

	private Field<String> missionstatementurl;

	private Field<Boolean> dataprovider;

	private Field<Boolean> serviceprovider;

	// {open, restricted or closed}
	private Field<String> databaseaccesstype;

	// {open, restricted or closed}
	private Field<String> datauploadtype;

	// {feeRequired, registration, other}
	private Field<String> databaseaccessrestriction;

	// {feeRequired, registration, other}
	private Field<String> datauploadrestriction;

	private Field<Boolean> versioning;

	private Field<String> citationguidelineurl;

	// {yes, no, unknown}
	private Field<String> qualitymanagementkind;

	private Field<String> pidsystems;

	private Field<String> certificates;

	private List<KeyValue> policies;

	private Journal journal;

	public Qualifier getDatasourcetype() {
		return datasourcetype;
	}

	public void setDatasourcetype(Qualifier datasourcetype) {
		this.datasourcetype = datasourcetype;
	}

	public Qualifier getOpenairecompatibility() {
		return openairecompatibility;
	}

	public void setOpenairecompatibility(Qualifier openairecompatibility) {
		this.openairecompatibility = openairecompatibility;
	}

	public Field<String> getOfficialname() {
		return officialname;
	}

	public void setOfficialname(Field<String> officialname) {
		this.officialname = officialname;
	}

	public Field<String> getEnglishname() {
		return englishname;
	}

	public void setEnglishname(Field<String> englishname) {
		this.englishname = englishname;
	}

	public Field<String> getWebsiteurl() {
		return websiteurl;
	}

	public void setWebsiteurl(Field<String> websiteurl) {
		this.websiteurl = websiteurl;
	}

	public Field<String> getLogourl() {
		return logourl;
	}

	public void setLogourl(Field<String> logourl) {
		this.logourl = logourl;
	}

	public Field<String> getContactemail() {
		return contactemail;
	}

	public void setContactemail(Field<String> contactemail) {
		this.contactemail = contactemail;
	}

	public Field<String> getNamespaceprefix() {
		return namespaceprefix;
	}

	public void setNamespaceprefix(Field<String> namespaceprefix) {
		this.namespaceprefix = namespaceprefix;
	}

	public Field<String> getLatitude() {
		return latitude;
	}

	public void setLatitude(Field<String> latitude) {
		this.latitude = latitude;
	}

	public Field<String> getLongitude() {
		return longitude;
	}

	public void setLongitude(Field<String> longitude) {
		this.longitude = longitude;
	}

	public Field<String> getDateofvalidation() {
		return dateofvalidation;
	}

	public void setDateofvalidation(Field<String> dateofvalidation) {
		this.dateofvalidation = dateofvalidation;
	}

	public Field<String> getDescription() {
		return description;
	}

	public void setDescription(Field<String> description) {
		this.description = description;
	}

	public List<StructuredProperty> getSubjects() {
		return subjects;
	}

	public void setSubjects(List<StructuredProperty> subjects) {
		this.subjects = subjects;
	}

	public Field<String> getOdnumberofitems() {
		return odnumberofitems;
	}

	public void setOdnumberofitems(Field<String> odnumberofitems) {
		this.odnumberofitems = odnumberofitems;
	}

	public Field<String> getOdnumberofitemsdate() {
		return odnumberofitemsdate;
	}

	public void setOdnumberofitemsdate(Field<String> odnumberofitemsdate) {
		this.odnumberofitemsdate = odnumberofitemsdate;
	}

	public Field<String> getOdpolicies() {
		return odpolicies;
	}

	public void setOdpolicies(Field<String> odpolicies) {
		this.odpolicies = odpolicies;
	}

	public List<Field<String>> getOdlanguages() {
		return odlanguages;
	}

	public void setOdlanguages(List<Field<String>> odlanguages) {
		this.odlanguages = odlanguages;
	}

	public List<Field<String>> getOdcontenttypes() {
		return odcontenttypes;
	}

	public void setOdcontenttypes(List<Field<String>> odcontenttypes) {
		this.odcontenttypes = odcontenttypes;
	}

	public List<Field<String>> getAccessinfopackage() {
		return accessinfopackage;
	}

	public void setAccessinfopackage(List<Field<String>> accessinfopackage) {
		this.accessinfopackage = accessinfopackage;
	}

	public Field<String> getReleasestartdate() {
		return releasestartdate;
	}

	public void setReleasestartdate(Field<String> releasestartdate) {
		this.releasestartdate = releasestartdate;
	}

	public Field<String> getReleaseenddate() {
		return releaseenddate;
	}

	public void setReleaseenddate(Field<String> releaseenddate) {
		this.releaseenddate = releaseenddate;
	}

	public Field<String> getMissionstatementurl() {
		return missionstatementurl;
	}

	public void setMissionstatementurl(Field<String> missionstatementurl) {
		this.missionstatementurl = missionstatementurl;
	}

	public Field<Boolean> getDataprovider() {
		return dataprovider;
	}

	public void setDataprovider(Field<Boolean> dataprovider) {
		this.dataprovider = dataprovider;
	}

	public Field<Boolean> getServiceprovider() {
		return serviceprovider;
	}

	public void setServiceprovider(Field<Boolean> serviceprovider) {
		this.serviceprovider = serviceprovider;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getDatabaseaccesstype() {
|
public Field<String> getDatabaseaccesstype() {
|
||||||
return databaseaccesstype;
|
return databaseaccesstype;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setDatabaseaccesstype(Field<String> databaseaccesstype) {
|
public void setDatabaseaccesstype(Field<String> databaseaccesstype) {
|
||||||
this.databaseaccesstype = databaseaccesstype;
|
this.databaseaccesstype = databaseaccesstype;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getDatauploadtype() {
|
public Field<String> getDatauploadtype() {
|
||||||
return datauploadtype;
|
return datauploadtype;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setDatauploadtype(Field<String> datauploadtype) {
|
public void setDatauploadtype(Field<String> datauploadtype) {
|
||||||
this.datauploadtype = datauploadtype;
|
this.datauploadtype = datauploadtype;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getDatabaseaccessrestriction() {
|
public Field<String> getDatabaseaccessrestriction() {
|
||||||
return databaseaccessrestriction;
|
return databaseaccessrestriction;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setDatabaseaccessrestriction(Field<String> databaseaccessrestriction) {
|
public void setDatabaseaccessrestriction(Field<String> databaseaccessrestriction) {
|
||||||
this.databaseaccessrestriction = databaseaccessrestriction;
|
this.databaseaccessrestriction = databaseaccessrestriction;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getDatauploadrestriction() {
|
public Field<String> getDatauploadrestriction() {
|
||||||
return datauploadrestriction;
|
return datauploadrestriction;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setDatauploadrestriction(Field<String> datauploadrestriction) {
|
public void setDatauploadrestriction(Field<String> datauploadrestriction) {
|
||||||
this.datauploadrestriction = datauploadrestriction;
|
this.datauploadrestriction = datauploadrestriction;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<Boolean> getVersioning() {
|
public Field<Boolean> getVersioning() {
|
||||||
return versioning;
|
return versioning;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setVersioning(Field<Boolean> versioning) {
|
public void setVersioning(Field<Boolean> versioning) {
|
||||||
this.versioning = versioning;
|
this.versioning = versioning;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getCitationguidelineurl() {
|
public Field<String> getCitationguidelineurl() {
|
||||||
return citationguidelineurl;
|
return citationguidelineurl;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setCitationguidelineurl(Field<String> citationguidelineurl) {
|
public void setCitationguidelineurl(Field<String> citationguidelineurl) {
|
||||||
this.citationguidelineurl = citationguidelineurl;
|
this.citationguidelineurl = citationguidelineurl;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getQualitymanagementkind() {
|
public Field<String> getQualitymanagementkind() {
|
||||||
return qualitymanagementkind;
|
return qualitymanagementkind;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setQualitymanagementkind(Field<String> qualitymanagementkind) {
|
public void setQualitymanagementkind(Field<String> qualitymanagementkind) {
|
||||||
this.qualitymanagementkind = qualitymanagementkind;
|
this.qualitymanagementkind = qualitymanagementkind;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getPidsystems() {
|
public Field<String> getPidsystems() {
|
||||||
return pidsystems;
|
return pidsystems;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setPidsystems(Field<String> pidsystems) {
|
public void setPidsystems(Field<String> pidsystems) {
|
||||||
this.pidsystems = pidsystems;
|
this.pidsystems = pidsystems;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Field<String> getCertificates() {
|
public Field<String> getCertificates() {
|
||||||
return certificates;
|
return certificates;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setCertificates(Field<String> certificates) {
|
public void setCertificates(Field<String> certificates) {
|
||||||
this.certificates = certificates;
|
this.certificates = certificates;
|
||||||
}
|
}
|
||||||
|
|
||||||
public List<KeyValue> getPolicies() {
|
public List<KeyValue> getPolicies() {
|
||||||
return policies;
|
return policies;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setPolicies(List<KeyValue> policies) {
|
public void setPolicies(List<KeyValue> policies) {
|
||||||
this.policies = policies;
|
this.policies = policies;
|
||||||
}
|
}
|
||||||
|
|
||||||
public Journal getJournal() {
|
public Journal getJournal() {
|
||||||
return journal;
|
return journal;
|
||||||
}
|
}
|
||||||
|
|
||||||
public void setJournal(Journal journal) {
|
public void setJournal(Journal journal) {
|
||||||
this.journal = journal;
|
this.journal = journal;
|
||||||
}
|
}
|
||||||
|
|
||||||
@Override
|
@Override
|
||||||
public void mergeFrom(OafEntity e) {
|
public void mergeFrom(OafEntity e) {
|
||||||
super.mergeFrom(e);
|
super.mergeFrom(e);
|
||||||
|
|
||||||
if (!Datasource.class.isAssignableFrom(e.getClass())) {
|
if (!Datasource.class.isAssignableFrom(e.getClass())) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
Datasource d = (Datasource) e;
|
Datasource d = (Datasource) e;
|
||||||
|
|
||||||
datasourcetype =
|
datasourcetype = d.getDatasourcetype() != null && compareTrust(this, e) < 0
|
||||||
d.getDatasourcetype() != null && compareTrust(this, e) < 0
|
? d.getDatasourcetype()
|
||||||
? d.getDatasourcetype()
|
: datasourcetype;
|
||||||
: datasourcetype;
|
openairecompatibility = d.getOpenairecompatibility() != null && compareTrust(this, e) < 0
|
||||||
openairecompatibility =
|
? d.getOpenairecompatibility()
|
||||||
d.getOpenairecompatibility() != null && compareTrust(this, e) < 0
|
: openairecompatibility;
|
||||||
? d.getOpenairecompatibility()
|
officialname = d.getOfficialname() != null && compareTrust(this, e) < 0
|
||||||
: openairecompatibility;
|
? d.getOfficialname()
|
||||||
officialname =
|
: officialname;
|
||||||
d.getOfficialname() != null && compareTrust(this, e) < 0
|
englishname = d.getEnglishname() != null && compareTrust(this, e) < 0 ? d.getEnglishname() : officialname;
|
||||||
? d.getOfficialname()
|
websiteurl = d.getWebsiteurl() != null && compareTrust(this, e) < 0 ? d.getWebsiteurl() : websiteurl;
|
||||||
: officialname;
|
logourl = d.getLogourl() != null && compareTrust(this, e) < 0 ? d.getLogourl() : getLogourl();
|
||||||
englishname =
|
contactemail = d.getContactemail() != null && compareTrust(this, e) < 0
|
||||||
d.getEnglishname() != null && compareTrust(this, e) < 0 ? d.getEnglishname() : officialname;
|
? d.getContactemail()
|
||||||
websiteurl =
|
: contactemail;
|
||||||
d.getWebsiteurl() != null && compareTrust(this, e) < 0 ? d.getWebsiteurl() : websiteurl;
|
namespaceprefix = d.getNamespaceprefix() != null && compareTrust(this, e) < 0
|
||||||
logourl = d.getLogourl() != null && compareTrust(this, e) < 0 ? d.getLogourl() : getLogourl();
|
? d.getNamespaceprefix()
|
||||||
contactemail =
|
: namespaceprefix;
|
||||||
d.getContactemail() != null && compareTrust(this, e) < 0
|
latitude = d.getLatitude() != null && compareTrust(this, e) < 0 ? d.getLatitude() : latitude;
|
||||||
? d.getContactemail()
|
longitude = d.getLongitude() != null && compareTrust(this, e) < 0 ? d.getLongitude() : longitude;
|
||||||
: contactemail;
|
dateofvalidation = d.getDateofvalidation() != null && compareTrust(this, e) < 0
|
||||||
namespaceprefix =
|
? d.getDateofvalidation()
|
||||||
d.getNamespaceprefix() != null && compareTrust(this, e) < 0
|
: dateofvalidation;
|
||||||
? d.getNamespaceprefix()
|
description = d.getDescription() != null && compareTrust(this, e) < 0 ? d.getDescription() : description;
|
||||||
: namespaceprefix;
|
subjects = mergeLists(subjects, d.getSubjects());
|
||||||
latitude = d.getLatitude() != null && compareTrust(this, e) < 0 ? d.getLatitude() : latitude;
|
|
||||||
longitude =
|
// opendoar specific fields (od*)
|
||||||
d.getLongitude() != null && compareTrust(this, e) < 0 ? d.getLongitude() : longitude;
|
odnumberofitems = d.getOdnumberofitems() != null && compareTrust(this, e) < 0
|
||||||
dateofvalidation =
|
? d.getOdnumberofitems()
|
||||||
d.getDateofvalidation() != null && compareTrust(this, e) < 0
|
: odnumberofitems;
|
||||||
? d.getDateofvalidation()
|
odnumberofitemsdate = d.getOdnumberofitemsdate() != null && compareTrust(this, e) < 0
|
||||||
: dateofvalidation;
|
? d.getOdnumberofitemsdate()
|
||||||
description =
|
: odnumberofitemsdate;
|
||||||
d.getDescription() != null && compareTrust(this, e) < 0 ? d.getDescription() : description;
|
odpolicies = d.getOdpolicies() != null && compareTrust(this, e) < 0 ? d.getOdpolicies() : odpolicies;
|
||||||
subjects = mergeLists(subjects, d.getSubjects());
|
odlanguages = mergeLists(odlanguages, d.getOdlanguages());
|
||||||
|
odcontenttypes = mergeLists(odcontenttypes, d.getOdcontenttypes());
|
||||||
// opendoar specific fields (od*)
|
accessinfopackage = mergeLists(accessinfopackage, d.getAccessinfopackage());
|
||||||
odnumberofitems =
|
|
||||||
d.getOdnumberofitems() != null && compareTrust(this, e) < 0
|
// re3data fields
|
||||||
? d.getOdnumberofitems()
|
releasestartdate = d.getReleasestartdate() != null && compareTrust(this, e) < 0
|
||||||
: odnumberofitems;
|
? d.getReleasestartdate()
|
||||||
odnumberofitemsdate =
|
: releasestartdate;
|
||||||
d.getOdnumberofitemsdate() != null && compareTrust(this, e) < 0
|
releaseenddate = d.getReleaseenddate() != null && compareTrust(this, e) < 0
|
||||||
? d.getOdnumberofitemsdate()
|
? d.getReleaseenddate()
|
||||||
: odnumberofitemsdate;
|
: releaseenddate;
|
||||||
odpolicies =
|
missionstatementurl = d.getMissionstatementurl() != null && compareTrust(this, e) < 0
|
||||||
d.getOdpolicies() != null && compareTrust(this, e) < 0 ? d.getOdpolicies() : odpolicies;
|
? d.getMissionstatementurl()
|
||||||
odlanguages = mergeLists(odlanguages, d.getOdlanguages());
|
: missionstatementurl;
|
||||||
odcontenttypes = mergeLists(odcontenttypes, d.getOdcontenttypes());
|
dataprovider = d.getDataprovider() != null && compareTrust(this, e) < 0
|
||||||
accessinfopackage = mergeLists(accessinfopackage, d.getAccessinfopackage());
|
? d.getDataprovider()
|
||||||
|
: dataprovider;
|
||||||
// re3data fields
|
serviceprovider = d.getServiceprovider() != null && compareTrust(this, e) < 0
|
||||||
releasestartdate =
|
? d.getServiceprovider()
|
||||||
d.getReleasestartdate() != null && compareTrust(this, e) < 0
|
: serviceprovider;
|
||||||
? d.getReleasestartdate()
|
|
||||||
: releasestartdate;
|
// {open, restricted or closed}
|
||||||
releaseenddate =
|
databaseaccesstype = d.getDatabaseaccesstype() != null && compareTrust(this, e) < 0
|
||||||
d.getReleaseenddate() != null && compareTrust(this, e) < 0
|
? d.getDatabaseaccesstype()
|
||||||
? d.getReleaseenddate()
|
: databaseaccesstype;
|
||||||
: releaseenddate;
|
|
||||||
missionstatementurl =
|
// {open, restricted or closed}
|
||||||
d.getMissionstatementurl() != null && compareTrust(this, e) < 0
|
datauploadtype = d.getDatauploadtype() != null && compareTrust(this, e) < 0
|
||||||
? d.getMissionstatementurl()
|
? d.getDatauploadtype()
|
||||||
: missionstatementurl;
|
: datauploadtype;
|
||||||
dataprovider =
|
|
||||||
d.getDataprovider() != null && compareTrust(this, e) < 0
|
// {feeRequired, registration, other}
|
||||||
? d.getDataprovider()
|
databaseaccessrestriction = d.getDatabaseaccessrestriction() != null && compareTrust(this, e) < 0
|
||||||
: dataprovider;
|
? d.getDatabaseaccessrestriction()
|
||||||
serviceprovider =
|
: databaseaccessrestriction;
|
||||||
d.getServiceprovider() != null && compareTrust(this, e) < 0
|
|
||||||
? d.getServiceprovider()
|
// {feeRequired, registration, other}
|
||||||
: serviceprovider;
|
datauploadrestriction = d.getDatauploadrestriction() != null && compareTrust(this, e) < 0
|
||||||
|
? d.getDatauploadrestriction()
|
||||||
// {open, restricted or closed}
|
: datauploadrestriction;
|
||||||
databaseaccesstype =
|
|
||||||
d.getDatabaseaccesstype() != null && compareTrust(this, e) < 0
|
versioning = d.getVersioning() != null && compareTrust(this, e) < 0 ? d.getVersioning() : versioning;
|
||||||
? d.getDatabaseaccesstype()
|
citationguidelineurl = d.getCitationguidelineurl() != null && compareTrust(this, e) < 0
|
||||||
: databaseaccesstype;
|
? d.getCitationguidelineurl()
|
||||||
|
: citationguidelineurl;
|
||||||
// {open, restricted or closed}
|
|
||||||
datauploadtype =
|
// {yes, no, unknown}
|
||||||
d.getDatauploadtype() != null && compareTrust(this, e) < 0
|
qualitymanagementkind = d.getQualitymanagementkind() != null && compareTrust(this, e) < 0
|
||||||
? d.getDatauploadtype()
|
? d.getQualitymanagementkind()
|
||||||
: datauploadtype;
|
: qualitymanagementkind;
|
||||||
|
pidsystems = d.getPidsystems() != null && compareTrust(this, e) < 0 ? d.getPidsystems() : pidsystems;
|
||||||
// {feeRequired, registration, other}
|
|
||||||
databaseaccessrestriction =
|
certificates = d.getCertificates() != null && compareTrust(this, e) < 0
|
||||||
d.getDatabaseaccessrestriction() != null && compareTrust(this, e) < 0
|
? d.getCertificates()
|
||||||
? d.getDatabaseaccessrestriction()
|
: certificates;
|
||||||
: databaseaccessrestriction;
|
|
||||||
|
policies = mergeLists(policies, d.getPolicies());
|
||||||
// {feeRequired, registration, other}
|
|
||||||
datauploadrestriction =
|
journal = d.getJournal() != null && compareTrust(this, e) < 0 ? d.getJournal() : journal;
|
||||||
d.getDatauploadrestriction() != null && compareTrust(this, e) < 0
|
|
||||||
? d.getDatauploadrestriction()
|
mergeOAFDataInfo(e);
|
||||||
: datauploadrestriction;
|
}
|
||||||
|
|
||||||
versioning =
|
|
||||||
d.getVersioning() != null && compareTrust(this, e) < 0 ? d.getVersioning() : versioning;
|
|
||||||
citationguidelineurl =
|
|
||||||
d.getCitationguidelineurl() != null && compareTrust(this, e) < 0
|
|
||||||
? d.getCitationguidelineurl()
|
|
||||||
: citationguidelineurl;
|
|
||||||
|
|
||||||
// {yes, no, unknown}
|
|
||||||
qualitymanagementkind =
|
|
||||||
d.getQualitymanagementkind() != null && compareTrust(this, e) < 0
|
|
||||||
? d.getQualitymanagementkind()
|
|
||||||
: qualitymanagementkind;
|
|
||||||
pidsystems =
|
|
||||||
d.getPidsystems() != null && compareTrust(this, e) < 0 ? d.getPidsystems() : pidsystems;
|
|
||||||
|
|
||||||
certificates =
|
|
||||||
d.getCertificates() != null && compareTrust(this, e) < 0
|
|
||||||
? d.getCertificates()
|
|
||||||
: certificates;
|
|
||||||
|
|
||||||
policies = mergeLists(policies, d.getPolicies());
|
|
||||||
|
|
||||||
journal = d.getJournal() != null && compareTrust(this, e) < 0 ? d.getJournal() : journal;
|
|
||||||
|
|
||||||
mergeOAFDataInfo(e);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
|
@@ -1,115 +1,119 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
import java.util.Objects;

public class ExternalReference implements Serializable {
    // source
    private String sitename;

    // title
    private String label;

    // text()
    private String url;

    // ?? not mapped yet ??
    private String description;

    // type
    private Qualifier qualifier;

    // site internal identifier
    private String refidentifier;

    // maps the oaf:reference/@query attribute
    private String query;

    // ExternalReferences might also be inferred
    private DataInfo dataInfo;

    public String getSitename() {
        return sitename;
    }

    public void setSitename(String sitename) {
        this.sitename = sitename;
    }

    public String getLabel() {
        return label;
    }

    public void setLabel(String label) {
        this.label = label;
    }

    public String getUrl() {
        return url;
    }

    public void setUrl(String url) {
        this.url = url;
    }

    public String getDescription() {
        return description;
    }

    public void setDescription(String description) {
        this.description = description;
    }

    public Qualifier getQualifier() {
        return qualifier;
    }

    public void setQualifier(Qualifier qualifier) {
        this.qualifier = qualifier;
    }

    public String getRefidentifier() {
        return refidentifier;
    }

    public void setRefidentifier(String refidentifier) {
        this.refidentifier = refidentifier;
    }

    public String getQuery() {
        return query;
    }

    public void setQuery(String query) {
        this.query = query;
    }

    public DataInfo getDataInfo() {
        return dataInfo;
    }

    public void setDataInfo(DataInfo dataInfo) {
        this.dataInfo = dataInfo;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o)
            return true;
        if (o == null || getClass() != o.getClass())
            return false;
        ExternalReference that = (ExternalReference) o;
        return Objects.equals(sitename, that.sitename)
            && Objects.equals(label, that.label)
            && Objects.equals(url, that.url)
            && Objects.equals(description, that.description)
            && Objects.equals(qualifier, that.qualifier)
            && Objects.equals(refidentifier, that.refidentifier)
            && Objects.equals(query, that.query)
            && Objects.equals(dataInfo, that.dataInfo);
    }

    @Override
    public int hashCode() {
        return Objects
            .hash(
                sitename, label, url, description, qualifier, refidentifier, query, dataInfo);
    }
}
@@ -1,74 +1,77 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
import java.util.Objects;

public class ExtraInfo implements Serializable {
    private String name;

    private String typology;

    private String provenance;

    private String trust;

    // json containing a Citation or Statistics
    private String value;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getTypology() {
        return typology;
    }

    public void setTypology(String typology) {
        this.typology = typology;
    }

    public String getProvenance() {
        return provenance;
    }

    public void setProvenance(String provenance) {
        this.provenance = provenance;
    }

    public String getTrust() {
        return trust;
    }

    public void setTrust(String trust) {
        this.trust = trust;
    }

    public String getValue() {
        return value;
    }

    public void setValue(String value) {
        this.value = value;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o)
            return true;
        if (o == null || getClass() != o.getClass())
            return false;
        ExtraInfo extraInfo = (ExtraInfo) o;
        return Objects.equals(name, extraInfo.name)
            && Objects.equals(typology, extraInfo.typology)
            && Objects.equals(provenance, extraInfo.provenance)
            && Objects.equals(trust, extraInfo.trust)
            && Objects.equals(value, extraInfo.value);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, typology, provenance, trust, value);
    }
}
@@ -1,40 +1,44 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

public class Field<T> implements Serializable {

    private T value;

    private DataInfo dataInfo;

    public T getValue() {
        return value;
    }

    public void setValue(T value) {
        this.value = value;
    }

    public DataInfo getDataInfo() {
        return dataInfo;
    }

    public void setDataInfo(DataInfo dataInfo) {
        this.dataInfo = dataInfo;
    }

    @Override
    public int hashCode() {
        return getValue() == null ? 0 : getValue().hashCode();
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        Field<T> other = (Field<T>) obj;
        return getValue().equals(other.getValue());
    }
}
@@ -1,69 +1,76 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

import org.apache.commons.lang3.StringUtils;

import com.fasterxml.jackson.annotation.JsonIgnore;

public class GeoLocation implements Serializable {

    private String point;

    private String box;

    private String place;

    public String getPoint() {
        return point;
    }

    public void setPoint(String point) {
        this.point = point;
    }

    public String getBox() {
        return box;
    }

    public void setBox(String box) {
        this.box = box;
    }

    public String getPlace() {
        return place;
    }

    public void setPlace(String place) {
        this.place = place;
    }

    @JsonIgnore
    public boolean isBlank() {
        return StringUtils.isBlank(point) && StringUtils.isBlank(box) && StringUtils.isBlank(place);
    }

    public String toComparableString() {
        return isBlank()
            ? ""
            : String
                .format(
                    "%s::%s%s",
                    point != null ? point.toLowerCase() : "",
                    box != null ? box.toLowerCase() : "",
                    place != null ? place.toLowerCase() : "");
    }

    @Override
    public int hashCode() {
        return toComparableString().hashCode();
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;

        GeoLocation other = (GeoLocation) obj;

        return toComparableString().equals(other.toComparableString());
    }
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -5,143 +6,147 @@ import java.util.List;

public class Instance implements Serializable {

	private Field<String> license;

	private Qualifier accessright;

	private Qualifier instancetype;

	private KeyValue hostedby;

	private List<String> url;

	// other research products specific
	private String distributionlocation;

	private KeyValue collectedfrom;

	private Field<String> dateofacceptance;

	// ( article | book ) processing charges. Defined here to cope with possible wrongly typed
	// results
	private Field<String> processingchargeamount;

	// currency - alphabetic code described in ISO-4217. Defined here to cope with possible wrongly
	// typed results
	private Field<String> processingchargecurrency;

	private Field<String> refereed; // peer-review status

	public Field<String> getLicense() {
		return license;
	}

	public void setLicense(Field<String> license) {
		this.license = license;
	}

	public Qualifier getAccessright() {
		return accessright;
	}

	public void setAccessright(Qualifier accessright) {
		this.accessright = accessright;
	}

	public Qualifier getInstancetype() {
		return instancetype;
	}

	public void setInstancetype(Qualifier instancetype) {
		this.instancetype = instancetype;
	}

	public KeyValue getHostedby() {
		return hostedby;
	}

	public void setHostedby(KeyValue hostedby) {
		this.hostedby = hostedby;
	}

	public List<String> getUrl() {
		return url;
	}

	public void setUrl(List<String> url) {
		this.url = url;
	}

	public String getDistributionlocation() {
		return distributionlocation;
	}

	public void setDistributionlocation(String distributionlocation) {
		this.distributionlocation = distributionlocation;
	}

	public KeyValue getCollectedfrom() {
		return collectedfrom;
	}

	public void setCollectedfrom(KeyValue collectedfrom) {
		this.collectedfrom = collectedfrom;
	}

	public Field<String> getDateofacceptance() {
		return dateofacceptance;
	}

	public void setDateofacceptance(Field<String> dateofacceptance) {
		this.dateofacceptance = dateofacceptance;
	}

	public Field<String> getProcessingchargeamount() {
		return processingchargeamount;
	}

	public void setProcessingchargeamount(Field<String> processingchargeamount) {
		this.processingchargeamount = processingchargeamount;
	}

	public Field<String> getProcessingchargecurrency() {
		return processingchargecurrency;
	}

	public void setProcessingchargecurrency(Field<String> processingchargecurrency) {
		this.processingchargecurrency = processingchargecurrency;
	}

	public Field<String> getRefereed() {
		return refereed;
	}

	public void setRefereed(Field<String> refereed) {
		this.refereed = refereed;
	}

	public String toComparableString() {
		return String
			.format(
				"%s::%s::%s::%s",
				hostedby != null && hostedby.getKey() != null ? hostedby.getKey().toLowerCase() : "",
				accessright != null && accessright.getClassid() != null ? accessright.getClassid() : "",
				instancetype != null && instancetype.getClassid() != null ? instancetype.getClassid() : "",
				url != null ? url : "");
	}

	@Override
	public int hashCode() {
		return toComparableString().hashCode();
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;

		Instance other = (Instance) obj;

		return toComparableString().equals(other.toComparableString());
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -5,159 +6,162 @@ import java.util.Objects;

public class Journal implements Serializable {

	private String name;

	private String issnPrinted;

	private String issnOnline;

	private String issnLinking;

	private String ep;

	private String iss;

	private String sp;

	private String vol;

	private String edition;

	private String conferenceplace;

	private String conferencedate;

	private DataInfo dataInfo;

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public String getIssnPrinted() {
		return issnPrinted;
	}

	public void setIssnPrinted(String issnPrinted) {
		this.issnPrinted = issnPrinted;
	}

	public String getIssnOnline() {
		return issnOnline;
	}

	public void setIssnOnline(String issnOnline) {
		this.issnOnline = issnOnline;
	}

	public String getIssnLinking() {
		return issnLinking;
	}

	public void setIssnLinking(String issnLinking) {
		this.issnLinking = issnLinking;
	}

	public String getEp() {
		return ep;
	}

	public void setEp(String ep) {
		this.ep = ep;
	}

	public String getIss() {
		return iss;
	}

	public void setIss(String iss) {
		this.iss = iss;
	}

	public String getSp() {
		return sp;
	}

	public void setSp(String sp) {
		this.sp = sp;
	}

	public String getVol() {
		return vol;
	}

	public void setVol(String vol) {
		this.vol = vol;
	}

	public String getEdition() {
		return edition;
	}

	public void setEdition(String edition) {
		this.edition = edition;
	}

	public String getConferenceplace() {
		return conferenceplace;
	}

	public void setConferenceplace(String conferenceplace) {
		this.conferenceplace = conferenceplace;
	}

	public String getConferencedate() {
		return conferencedate;
	}

	public void setConferencedate(String conferencedate) {
		this.conferencedate = conferencedate;
	}

	public DataInfo getDataInfo() {
		return dataInfo;
	}

	public void setDataInfo(DataInfo dataInfo) {
		this.dataInfo = dataInfo;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		Journal journal = (Journal) o;
		return Objects.equals(name, journal.name)
			&& Objects.equals(issnPrinted, journal.issnPrinted)
			&& Objects.equals(issnOnline, journal.issnOnline)
			&& Objects.equals(issnLinking, journal.issnLinking)
			&& Objects.equals(ep, journal.ep)
			&& Objects.equals(iss, journal.iss)
			&& Objects.equals(sp, journal.sp)
			&& Objects.equals(vol, journal.vol)
			&& Objects.equals(edition, journal.edition)
			&& Objects.equals(conferenceplace, journal.conferenceplace)
			&& Objects.equals(conferencedate, journal.conferencedate)
			&& Objects.equals(dataInfo, journal.dataInfo);
	}

	@Override
	public int hashCode() {
		return Objects
			.hash(
				name,
				issnPrinted,
				issnOnline,
				issnLinking,
				ep,
				iss,
				sp,
				vol,
				edition,
				conferenceplace,
				conferencedate,
				dataInfo);
	}
}
@@ -1,67 +1,74 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

import org.apache.commons.lang3.StringUtils;

import com.fasterxml.jackson.annotation.JsonIgnore;

public class KeyValue implements Serializable {

	private String key;

	private String value;

	private DataInfo dataInfo;

	public String getKey() {
		return key;
	}

	public void setKey(String key) {
		this.key = key;
	}

	public String getValue() {
		return value;
	}

	public void setValue(String value) {
		this.value = value;
	}

	public DataInfo getDataInfo() {
		return dataInfo;
	}

	public void setDataInfo(DataInfo dataInfo) {
		this.dataInfo = dataInfo;
	}

	public String toComparableString() {
		return isBlank()
			? ""
			: String
				.format(
					"%s::%s",
					key != null ? key.toLowerCase() : "", value != null ? value.toLowerCase() : "");
	}

	@JsonIgnore
	public boolean isBlank() {
		return StringUtils.isBlank(key) && StringUtils.isBlank(value);
	}

	@Override
	public int hashCode() {
		return toComparableString().hashCode();
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;

		KeyValue other = (KeyValue) obj;

		return toComparableString().equals(other.toComparableString());
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -5,26 +6,28 @@ import java.util.Objects;

public class OAIProvenance implements Serializable {

	private OriginDescription originDescription;

	public OriginDescription getOriginDescription() {
		return originDescription;
	}

	public void setOriginDescription(OriginDescription originDescription) {
		this.originDescription = originDescription;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		OAIProvenance that = (OAIProvenance) o;
		return Objects.equals(originDescription, that.originDescription);
	}

	@Override
	public int hashCode() {
		return Objects.hash(originDescription);
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -6,60 +7,67 @@ import java.util.Objects;

public abstract class Oaf implements Serializable {

	/**
	 * The list of datasource id/name pairs providing this relationship.
	 */
	protected List<KeyValue> collectedfrom;

	private DataInfo dataInfo;

	private Long lastupdatetimestamp;

	public List<KeyValue> getCollectedfrom() {
		return collectedfrom;
	}

	public void setCollectedfrom(List<KeyValue> collectedfrom) {
		this.collectedfrom = collectedfrom;
	}

	public DataInfo getDataInfo() {
		return dataInfo;
	}

	public void setDataInfo(DataInfo dataInfo) {
		this.dataInfo = dataInfo;
	}

	public Long getLastupdatetimestamp() {
		return lastupdatetimestamp;
	}

	public void setLastupdatetimestamp(Long lastupdatetimestamp) {
		this.lastupdatetimestamp = lastupdatetimestamp;
	}

	public void mergeOAFDataInfo(Oaf e) {
		if (e.getDataInfo() != null && compareTrust(this, e) < 0)
			dataInfo = e.getDataInfo();
	}

	protected String extractTrust(Oaf e) {
		if (e == null || e.getDataInfo() == null || e.getDataInfo().getTrust() == null)
			return "0.0";
		return e.getDataInfo().getTrust();
	}

	protected int compareTrust(Oaf a, Oaf b) {
		return extractTrust(a).compareTo(extractTrust(b));
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		Oaf oaf = (Oaf) o;
		return Objects.equals(dataInfo, oaf.dataInfo)
			&& Objects.equals(lastupdatetimestamp, oaf.lastupdatetimestamp);
	}

	@Override
	public int hashCode() {
		return Objects.hash(dataInfo, lastupdatetimestamp);
	}
}
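In `Oaf`, `mergeOAFDataInfo` keeps the `DataInfo` of whichever entity carries the higher trust, with missing trust defaulting to `"0.0"` and trust values compared as strings. A hedged standalone sketch of that mechanism (`TrustDemo` and its plain string field are illustrative stand-ins for `Oaf`/`DataInfo`):

```java
// Illustrative stand-in for the Oaf trust comparison: trust values are
// strings, a null trust defaults to "0.0", and comparison is lexicographic
// (which matches numeric order for same-scale decimals like "0.5" vs "0.9").
public class TrustDemo {

	private String trust; // may be null, like DataInfo.getTrust()

	public TrustDemo(String trust) {
		this.trust = trust;
	}

	public String getTrust() {
		return trust;
	}

	protected static String extractTrust(TrustDemo e) {
		if (e == null || e.getTrust() == null)
			return "0.0";
		return e.getTrust();
	}

	protected static int compareTrust(TrustDemo a, TrustDemo b) {
		return extractTrust(a).compareTo(extractTrust(b));
	}

	// Adopt the other entity's trust only when it is strictly higher,
	// mirroring the shape of mergeOAFDataInfo above.
	public void merge(TrustDemo e) {
		if (e.getTrust() != null && compareTrust(this, e) < 0)
			trust = e.getTrust();
	}
}
```

Note the lexicographic comparison only tracks numeric order while trust values share the same format; that is a property of the data, not enforced by the code.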
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -6,118 +7,123 @@ import java.util.stream.Collectors;

public abstract class OafEntity extends Oaf implements Serializable {

	private String id;

	private List<String> originalId;

	private List<StructuredProperty> pid;

	private String dateofcollection;

	private String dateoftransformation;

	private List<ExtraInfo> extraInfo;

	private OAIProvenance oaiprovenance;

	public String getId() {
		return id;
	}

	public void setId(String id) {
		this.id = id;
	}

	public List<String> getOriginalId() {
		return originalId;
	}

	public void setOriginalId(List<String> originalId) {
		this.originalId = originalId;
	}

	public List<StructuredProperty> getPid() {
		return pid;
	}

	public void setPid(List<StructuredProperty> pid) {
		this.pid = pid;
	}

	public String getDateofcollection() {
		return dateofcollection;
	}

	public void setDateofcollection(String dateofcollection) {
		this.dateofcollection = dateofcollection;
	}

	public String getDateoftransformation() {
		return dateoftransformation;
	}

	public void setDateoftransformation(String dateoftransformation) {
		this.dateoftransformation = dateoftransformation;
	}

	public List<ExtraInfo> getExtraInfo() {
		return extraInfo;
	}

	public void setExtraInfo(List<ExtraInfo> extraInfo) {
		this.extraInfo = extraInfo;
	}

	public OAIProvenance getOaiprovenance() {
		return oaiprovenance;
	}

	public void setOaiprovenance(OAIProvenance oaiprovenance) {
		this.oaiprovenance = oaiprovenance;
	}

	public void mergeFrom(OafEntity e) {

		if (e == null)
			return;

		originalId = mergeLists(originalId, e.getOriginalId());

		collectedfrom = mergeLists(collectedfrom, e.getCollectedfrom());

		pid = mergeLists(pid, e.getPid());

		if (e.getDateofcollection() != null && compareTrust(this, e) < 0)
			dateofcollection = e.getDateofcollection();

		if (e.getDateoftransformation() != null && compareTrust(this, e) < 0)
			dateoftransformation = e.getDateoftransformation();

		extraInfo = mergeLists(extraInfo, e.getExtraInfo());

		if (e.getOaiprovenance() != null && compareTrust(this, e) < 0)
			oaiprovenance = e.getOaiprovenance();
	}

	protected <T> List<T> mergeLists(final List<T>... lists) {

		return Arrays
			.stream(lists)
			.filter(Objects::nonNull)
			.flatMap(List::stream)
			.distinct()
			.collect(Collectors.toList());
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		if (!super.equals(o))
			return false;
		OafEntity oafEntity = (OafEntity) o;
		return Objects.equals(id, oafEntity.id);
	}

	@Override
	public int hashCode() {
		return Objects.hash(super.hashCode(), id);
	}
}
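`OafEntity.mergeLists` concatenates any number of lists, skipping `null` lists and dropping duplicates by `equals()`. A self-contained copy of the idiom (the class name here is illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

// Self-contained copy of the mergeLists idiom from OafEntity: null lists are
// filtered out before flattening, and distinct() removes duplicates while
// preserving first-occurrence order.
public class MergeListsDemo {

	@SafeVarargs
	public static <T> List<T> mergeLists(final List<T>... lists) {
		return Arrays
			.stream(lists)
			.filter(Objects::nonNull)
			.flatMap(List::stream)
			.distinct()
			.collect(Collectors.toList());
	}
}
```

Because `distinct()` relies on `equals()`, merged schema objects deduplicate by their comparable-string identity rather than by reference.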
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
@@ -5,221 +6,209 @@ import java.util.List;

public class Organization extends OafEntity implements Serializable {

	private Field<String> legalshortname;

	private Field<String> legalname;

	private List<Field<String>> alternativeNames;

	private Field<String> websiteurl;

	private Field<String> logourl;

	private Field<String> eclegalbody;

	private Field<String> eclegalperson;

	private Field<String> ecnonprofit;

	private Field<String> ecresearchorganization;

	private Field<String> echighereducation;

	private Field<String> ecinternationalorganizationeurinterests;

	private Field<String> ecinternationalorganization;

	private Field<String> ecenterprise;

	private Field<String> ecsmevalidated;

	private Field<String> ecnutscode;

	private Qualifier country;

	public Field<String> getLegalshortname() {
		return legalshortname;
	}

	public void setLegalshortname(Field<String> legalshortname) {
		this.legalshortname = legalshortname;
	}

	public Field<String> getLegalname() {
		return legalname;
	}

	public void setLegalname(Field<String> legalname) {
		this.legalname = legalname;
	}

	public List<Field<String>> getAlternativeNames() {
		return alternativeNames;
	}

	public void setAlternativeNames(List<Field<String>> alternativeNames) {
		this.alternativeNames = alternativeNames;
	}

	public Field<String> getWebsiteurl() {
		return websiteurl;
	}

	public void setWebsiteurl(Field<String> websiteurl) {
		this.websiteurl = websiteurl;
	}

	public Field<String> getLogourl() {
		return logourl;
	}

	public void setLogourl(Field<String> logourl) {
		this.logourl = logourl;
	}

	public Field<String> getEclegalbody() {
		return eclegalbody;
	}

	public void setEclegalbody(Field<String> eclegalbody) {
		this.eclegalbody = eclegalbody;
	}

	public Field<String> getEclegalperson() {
		return eclegalperson;
	}

	public void setEclegalperson(Field<String> eclegalperson) {
		this.eclegalperson = eclegalperson;
	}

	public Field<String> getEcnonprofit() {
		return ecnonprofit;
	}

	public void setEcnonprofit(Field<String> ecnonprofit) {
		this.ecnonprofit = ecnonprofit;
	}

	public Field<String> getEcresearchorganization() {
		return ecresearchorganization;
	}

	public void setEcresearchorganization(Field<String> ecresearchorganization) {
		this.ecresearchorganization = ecresearchorganization;
	}

	public Field<String> getEchighereducation() {
		return echighereducation;
	}

	public void setEchighereducation(Field<String> echighereducation) {
		this.echighereducation = echighereducation;
	}

	public Field<String> getEcinternationalorganizationeurinterests() {
		return ecinternationalorganizationeurinterests;
	}

	public void setEcinternationalorganizationeurinterests(
		Field<String> ecinternationalorganizationeurinterests) {
		this.ecinternationalorganizationeurinterests = ecinternationalorganizationeurinterests;
	}

	public Field<String> getEcinternationalorganization() {
		return ecinternationalorganization;
	}

	public void setEcinternationalorganization(Field<String> ecinternationalorganization) {
		this.ecinternationalorganization = ecinternationalorganization;
	}

	public Field<String> getEcenterprise() {
		return ecenterprise;
	}

	public void setEcenterprise(Field<String> ecenterprise) {
		this.ecenterprise = ecenterprise;
	}

	public Field<String> getEcsmevalidated() {
		return ecsmevalidated;
	}

	public void setEcsmevalidated(Field<String> ecsmevalidated) {
		this.ecsmevalidated = ecsmevalidated;
	}

	public Field<String> getEcnutscode() {
		return ecnutscode;
	}

	public void setEcnutscode(Field<String> ecnutscode) {
		this.ecnutscode = ecnutscode;
	}

	public Qualifier getCountry() {
		return country;
	}

	public void setCountry(Qualifier country) {
		this.country = country;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!Organization.class.isAssignableFrom(e.getClass())) {
			return;
		}

		final Organization o = (Organization) e;
		legalshortname = o.getLegalshortname() != null && compareTrust(this, e) < 0
			? o.getLegalshortname()
			: legalshortname;
		legalname = o.getLegalname() != null && compareTrust(this, e) < 0 ? o.getLegalname() : legalname;
		alternativeNames = mergeLists(o.getAlternativeNames(), alternativeNames);
		websiteurl = o.getWebsiteurl() != null && compareTrust(this, e) < 0 ? o.getWebsiteurl() : websiteurl;
		logourl = o.getLogourl() != null && compareTrust(this, e) < 0 ? o.getLogourl() : logourl;
		eclegalbody = o.getEclegalbody() != null && compareTrust(this, e) < 0 ? o.getEclegalbody() : eclegalbody;
		eclegalperson = o.getEclegalperson() != null && compareTrust(this, e) < 0
			? o.getEclegalperson()
			: eclegalperson;
		ecnonprofit = o.getEcnonprofit() != null && compareTrust(this, e) < 0 ? o.getEcnonprofit() : ecnonprofit;
		ecresearchorganization = o.getEcresearchorganization() != null && compareTrust(this, e) < 0
			? o.getEcresearchorganization()
			: ecresearchorganization;
		echighereducation = o.getEchighereducation() != null && compareTrust(this, e) < 0
			? o.getEchighereducation()
			: echighereducation;
		ecinternationalorganizationeurinterests = o.getEcinternationalorganizationeurinterests() != null
			&& compareTrust(this, e) < 0
				? o.getEcinternationalorganizationeurinterests()
				: ecinternationalorganizationeurinterests;
		ecinternationalorganization = o.getEcinternationalorganization() != null && compareTrust(this, e) < 0
			? o.getEcinternationalorganization()
			: ecinternationalorganization;
		ecenterprise = o.getEcenterprise() != null && compareTrust(this, e) < 0
			? o.getEcenterprise()
			: ecenterprise;
		ecsmevalidated = o.getEcsmevalidated() != null && compareTrust(this, e) < 0
			? o.getEcsmevalidated()
			: ecsmevalidated;
		ecnutscode = o.getEcnutscode() != null && compareTrust(this, e) < 0 ? o.getEcnutscode() : ecnutscode;
		country = o.getCountry() != null && compareTrust(this, e) < 0 ? o.getCountry() : country;
		mergeOAFDataInfo(o);
	}
}
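The ternaries in `Organization.mergeFrom` all apply one rule: take the incoming entity's field only when it is non-null and the incoming record carries higher trust (`compareTrust(this, e) < 0`). A standalone sketch of that rule; `mergeField` and the numeric trust values are illustrative helpers invented here, not part of the dnet-hadoop API:

```java
// Standalone sketch of the field-merge rule used throughout mergeFrom:
// prefer the incoming value only when it exists AND comes from a more trusted record.
public class TrustMergeSketch {

	// Hypothetical helper mirroring `x = o.getX() != null && compareTrust(this, e) < 0 ? o.getX() : x`.
	// compareTrust(this, e) < 0 is modeled here as currentTrust < incomingTrust.
	static <T> T mergeField(T current, T incoming, double currentTrust, double incomingTrust) {
		return incoming != null && currentTrust < incomingTrust ? incoming : current;
	}

	public static void main(String[] args) {
		System.out.println(mergeField("old", "new", 0.5, 0.9)); // incoming wins: new
		System.out.println(mergeField("old", "new", 0.9, 0.5)); // current more trusted: old
		System.out.println(mergeField("old", null, 0.5, 0.9)); // incoming missing: old
	}
}
```

Note that a null incoming field never overwrites an existing value, even from a more trusted record, which is why every assignment guards with `!= null` before consulting trust.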
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
@@ -5,81 +6,83 @@ import java.util.Objects;

public class OriginDescription implements Serializable {

	private String harvestDate;

	private Boolean altered = true;

	private String baseURL;

	private String identifier;

	private String datestamp;

	private String metadataNamespace;

	public String getHarvestDate() {
		return harvestDate;
	}

	public void setHarvestDate(String harvestDate) {
		this.harvestDate = harvestDate;
	}

	public Boolean getAltered() {
		return altered;
	}

	public void setAltered(Boolean altered) {
		this.altered = altered;
	}

	public String getBaseURL() {
		return baseURL;
	}

	public void setBaseURL(String baseURL) {
		this.baseURL = baseURL;
	}

	public String getIdentifier() {
		return identifier;
	}

	public void setIdentifier(String identifier) {
		this.identifier = identifier;
	}

	public String getDatestamp() {
		return datestamp;
	}

	public void setDatestamp(String datestamp) {
		this.datestamp = datestamp;
	}

	public String getMetadataNamespace() {
		return metadataNamespace;
	}

	public void setMetadataNamespace(String metadataNamespace) {
		this.metadataNamespace = metadataNamespace;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		OriginDescription that = (OriginDescription) o;
		return Objects.equals(harvestDate, that.harvestDate)
			&& Objects.equals(altered, that.altered)
			&& Objects.equals(baseURL, that.baseURL)
			&& Objects.equals(identifier, that.identifier)
			&& Objects.equals(datestamp, that.datestamp)
			&& Objects.equals(metadataNamespace, that.metadataNamespace);
	}

	@Override
	public int hashCode() {
		return Objects.hash(harvestDate, altered, baseURL, identifier, datestamp, metadataNamespace);
	}
}
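`OriginDescription` implements the standard equals/hashCode contract over all six fields via `Objects.equals` and `Objects.hash`: two instances that compare equal must also hash equal. A two-field reduction of the same pattern; the `Probe` class is invented purely for illustration:

```java
import java.util.Objects;

// Two-field reduction of the equals/hashCode pattern used by OriginDescription.
public class Probe {
	private final String identifier;
	private final String datestamp;

	public Probe(String identifier, String datestamp) {
		this.identifier = identifier;
		this.datestamp = datestamp;
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		Probe that = (Probe) o;
		// Objects.equals is null-safe, so null fields compare correctly.
		return Objects.equals(identifier, that.identifier)
			&& Objects.equals(datestamp, that.datestamp);
	}

	@Override
	public int hashCode() {
		// Must hash over the same fields that equals compares.
		return Objects.hash(identifier, datestamp);
	}
}
```

Using `getClass() != o.getClass()` rather than `instanceof` keeps equality symmetric across subclasses, the same choice the diff makes for `OriginDescription`.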
@@ -1,58 +1,60 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
import java.util.List;

import eu.dnetlib.dhp.schema.common.ModelConstants;

public class OtherResearchProduct extends Result implements Serializable {

	private List<Field<String>> contactperson;

	private List<Field<String>> contactgroup;

	private List<Field<String>> tool;

	public OtherResearchProduct() {
		setResulttype(ModelConstants.ORP_DEFAULT_RESULTTYPE);
	}

	public List<Field<String>> getContactperson() {
		return contactperson;
	}

	public void setContactperson(List<Field<String>> contactperson) {
		this.contactperson = contactperson;
	}

	public List<Field<String>> getContactgroup() {
		return contactgroup;
	}

	public void setContactgroup(List<Field<String>> contactgroup) {
		this.contactgroup = contactgroup;
	}

	public List<Field<String>> getTool() {
		return tool;
	}

	public void setTool(List<Field<String>> tool) {
		this.tool = tool;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!OtherResearchProduct.class.isAssignableFrom(e.getClass())) {
			return;
		}

		OtherResearchProduct o = (OtherResearchProduct) e;

		contactperson = mergeLists(contactperson, o.getContactperson());
		contactgroup = mergeLists(contactgroup, o.getContactgroup());
		tool = mergeLists(tool, o.getTool());
		mergeOAFDataInfo(e);
	}
}
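Unlike the scalar fields, list-valued fields (`contactperson`, `contactgroup`, `tool`, and `alternativeNames` above) are merged with `mergeLists`, which combines both sides rather than picking one by trust. The real helper lives in `OafEntity` and is not shown in this diff; the sketch below is an assumed null-safe, duplicate-free union, written only to illustrate the behaviour the call sites rely on:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

// Assumed behaviour of OafEntity.mergeLists: a null-safe, duplicate-free union
// of the input lists, preserving first-seen order. Illustrative only.
public class MergeListsSketch {

	@SafeVarargs
	static <T> List<T> mergeLists(List<T>... lists) {
		return Arrays
			.stream(lists)
			.filter(Objects::nonNull) // a missing list contributes nothing
			.flatMap(List::stream)
			.filter(Objects::nonNull) // drop null elements as well
			.distinct() // keep each value once
			.collect(Collectors.toList());
	}

	public static void main(String[] args) {
		// "b" appears in both inputs but survives only once.
		System.out.println(mergeLists(Arrays.asList("a", "b"), null, Arrays.asList("b", "c")));
	}
}
```

Because the union is symmetric, `mergeLists(contactperson, o.getContactperson())` never discards values from the less trusted record, which is the intended difference from the trust-gated scalar assignments.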
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
@@ -5,335 +6,320 @@ import java.util.List;

public class Project extends OafEntity implements Serializable {

	private Field<String> websiteurl;

	private Field<String> code;

	private Field<String> acronym;

	private Field<String> title;

	private Field<String> startdate;

	private Field<String> enddate;

	private Field<String> callidentifier;

	private Field<String> keywords;

	private Field<String> duration;

	private Field<String> ecsc39;

	private Field<String> oamandatepublications;

	private Field<String> ecarticle29_3;

	private List<StructuredProperty> subjects;

	private List<Field<String>> fundingtree;

	private Qualifier contracttype;

	private Field<String> optional1;

	private Field<String> optional2;

	private Field<String> jsonextrainfo;

	private Field<String> contactfullname;

	private Field<String> contactfax;

	private Field<String> contactphone;

	private Field<String> contactemail;

	private Field<String> summary;

	private Field<String> currency;

	private Float totalcost;

	private Float fundedamount;

	public Field<String> getWebsiteurl() {
		return websiteurl;
	}

	public void setWebsiteurl(Field<String> websiteurl) {
		this.websiteurl = websiteurl;
	}

	public Field<String> getCode() {
		return code;
	}

	public void setCode(Field<String> code) {
		this.code = code;
	}

	public Field<String> getAcronym() {
		return acronym;
	}

	public void setAcronym(Field<String> acronym) {
		this.acronym = acronym;
	}

	public Field<String> getTitle() {
		return title;
	}

	public void setTitle(Field<String> title) {
		this.title = title;
	}

	public Field<String> getStartdate() {
		return startdate;
	}

	public void setStartdate(Field<String> startdate) {
		this.startdate = startdate;
	}

	public Field<String> getEnddate() {
		return enddate;
	}

	public void setEnddate(Field<String> enddate) {
		this.enddate = enddate;
	}

	public Field<String> getCallidentifier() {
		return callidentifier;
	}

	public void setCallidentifier(Field<String> callidentifier) {
		this.callidentifier = callidentifier;
	}

	public Field<String> getKeywords() {
		return keywords;
	}

	public void setKeywords(Field<String> keywords) {
		this.keywords = keywords;
	}

	public Field<String> getDuration() {
		return duration;
	}

	public void setDuration(Field<String> duration) {
		this.duration = duration;
	}

	public Field<String> getEcsc39() {
		return ecsc39;
	}

	public void setEcsc39(Field<String> ecsc39) {
		this.ecsc39 = ecsc39;
	}

	public Field<String> getOamandatepublications() {
		return oamandatepublications;
	}

	public void setOamandatepublications(Field<String> oamandatepublications) {
		this.oamandatepublications = oamandatepublications;
	}

	public Field<String> getEcarticle29_3() {
		return ecarticle29_3;
	}

	public void setEcarticle29_3(Field<String> ecarticle29_3) {
		this.ecarticle29_3 = ecarticle29_3;
	}

	public List<StructuredProperty> getSubjects() {
		return subjects;
	}

	public void setSubjects(List<StructuredProperty> subjects) {
		this.subjects = subjects;
	}

	public List<Field<String>> getFundingtree() {
		return fundingtree;
	}

	public void setFundingtree(List<Field<String>> fundingtree) {
		this.fundingtree = fundingtree;
	}

	public Qualifier getContracttype() {
		return contracttype;
	}

	public void setContracttype(Qualifier contracttype) {
		this.contracttype = contracttype;
	}

	public Field<String> getOptional1() {
		return optional1;
	}

	public void setOptional1(Field<String> optional1) {
		this.optional1 = optional1;
	}

	public Field<String> getOptional2() {
		return optional2;
	}

	public void setOptional2(Field<String> optional2) {
		this.optional2 = optional2;
	}

	public Field<String> getJsonextrainfo() {
		return jsonextrainfo;
	}

	public void setJsonextrainfo(Field<String> jsonextrainfo) {
		this.jsonextrainfo = jsonextrainfo;
	}

	public Field<String> getContactfullname() {
		return contactfullname;
	}

	public void setContactfullname(Field<String> contactfullname) {
		this.contactfullname = contactfullname;
	}

	public Field<String> getContactfax() {
		return contactfax;
	}

	public void setContactfax(Field<String> contactfax) {
		this.contactfax = contactfax;
	}

	public Field<String> getContactphone() {
		return contactphone;
	}

	public void setContactphone(Field<String> contactphone) {
		this.contactphone = contactphone;
	}

	public Field<String> getContactemail() {
		return contactemail;
	}

	public void setContactemail(Field<String> contactemail) {
		this.contactemail = contactemail;
	}

	public Field<String> getSummary() {
		return summary;
	}

	public void setSummary(Field<String> summary) {
		this.summary = summary;
	}

	public Field<String> getCurrency() {
		return currency;
	}

	public void setCurrency(Field<String> currency) {
		this.currency = currency;
	}

	public Float getTotalcost() {
		return totalcost;
	}

	public void setTotalcost(Float totalcost) {
		this.totalcost = totalcost;
	}

	public Float getFundedamount() {
		return fundedamount;
	}

	public void setFundedamount(Float fundedamount) {
		this.fundedamount = fundedamount;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!Project.class.isAssignableFrom(e.getClass())) {
			return;
		}

		Project p = (Project) e;

		websiteurl = p.getWebsiteurl() != null && compareTrust(this, e) < 0 ? p.getWebsiteurl() : websiteurl;
		code = p.getCode() != null && compareTrust(this, e) < 0 ? p.getCode() : code;
		acronym = p.getAcronym() != null && compareTrust(this, e) < 0 ? p.getAcronym() : acronym;
		title = p.getTitle() != null && compareTrust(this, e) < 0 ? p.getTitle() : title;
		startdate = p.getStartdate() != null && compareTrust(this, e) < 0 ? p.getStartdate() : startdate;
		enddate = p.getEnddate() != null && compareTrust(this, e) < 0 ? p.getEnddate() : enddate;
		callidentifier = p.getCallidentifier() != null && compareTrust(this, e) < 0
enddate = p.getEnddate() != null && compareTrust(this, e) < 0 ? p.getEnddate() : enddate;
|
? p.getCallidentifier()
|
||||||
callidentifier =
|
: callidentifier;
|
||||||
p.getCallidentifier() != null && compareTrust(this, e) < 0
|
keywords = p.getKeywords() != null && compareTrust(this, e) < 0 ? p.getKeywords() : keywords;
|
||||||
? p.getCallidentifier()
|
duration = p.getDuration() != null && compareTrust(this, e) < 0 ? p.getDuration() : duration;
|
||||||
: callidentifier;
|
ecsc39 = p.getEcsc39() != null && compareTrust(this, e) < 0 ? p.getEcsc39() : ecsc39;
|
||||||
keywords = p.getKeywords() != null && compareTrust(this, e) < 0 ? p.getKeywords() : keywords;
|
oamandatepublications = p.getOamandatepublications() != null && compareTrust(this, e) < 0
|
||||||
duration = p.getDuration() != null && compareTrust(this, e) < 0 ? p.getDuration() : duration;
|
? p.getOamandatepublications()
|
||||||
ecsc39 = p.getEcsc39() != null && compareTrust(this, e) < 0 ? p.getEcsc39() : ecsc39;
|
: oamandatepublications;
|
||||||
oamandatepublications =
|
ecarticle29_3 = p.getEcarticle29_3() != null && compareTrust(this, e) < 0
|
||||||
p.getOamandatepublications() != null && compareTrust(this, e) < 0
|
? p.getEcarticle29_3()
|
||||||
? p.getOamandatepublications()
|
: ecarticle29_3;
|
||||||
: oamandatepublications;
|
subjects = mergeLists(subjects, p.getSubjects());
|
||||||
ecarticle29_3 =
|
fundingtree = mergeLists(fundingtree, p.getFundingtree());
|
||||||
p.getEcarticle29_3() != null && compareTrust(this, e) < 0
|
contracttype = p.getContracttype() != null && compareTrust(this, e) < 0
|
||||||
? p.getEcarticle29_3()
|
? p.getContracttype()
|
||||||
: ecarticle29_3;
|
: contracttype;
|
||||||
subjects = mergeLists(subjects, p.getSubjects());
|
optional1 = p.getOptional1() != null && compareTrust(this, e) < 0 ? p.getOptional1() : optional1;
|
||||||
fundingtree = mergeLists(fundingtree, p.getFundingtree());
|
optional2 = p.getOptional2() != null && compareTrust(this, e) < 0 ? p.getOptional2() : optional2;
|
||||||
contracttype =
|
jsonextrainfo = p.getJsonextrainfo() != null && compareTrust(this, e) < 0
|
||||||
p.getContracttype() != null && compareTrust(this, e) < 0
|
? p.getJsonextrainfo()
|
||||||
? p.getContracttype()
|
: jsonextrainfo;
|
||||||
: contracttype;
|
contactfullname = p.getContactfullname() != null && compareTrust(this, e) < 0
|
||||||
optional1 =
|
? p.getContactfullname()
|
||||||
p.getOptional1() != null && compareTrust(this, e) < 0 ? p.getOptional1() : optional1;
|
: contactfullname;
|
||||||
optional2 =
|
contactfax = p.getContactfax() != null && compareTrust(this, e) < 0 ? p.getContactfax() : contactfax;
|
||||||
p.getOptional2() != null && compareTrust(this, e) < 0 ? p.getOptional2() : optional2;
|
contactphone = p.getContactphone() != null && compareTrust(this, e) < 0
|
||||||
jsonextrainfo =
|
? p.getContactphone()
|
||||||
p.getJsonextrainfo() != null && compareTrust(this, e) < 0
|
: contactphone;
|
||||||
? p.getJsonextrainfo()
|
contactemail = p.getContactemail() != null && compareTrust(this, e) < 0
|
||||||
: jsonextrainfo;
|
? p.getContactemail()
|
||||||
contactfullname =
|
: contactemail;
|
||||||
p.getContactfullname() != null && compareTrust(this, e) < 0
|
summary = p.getSummary() != null && compareTrust(this, e) < 0 ? p.getSummary() : summary;
|
||||||
? p.getContactfullname()
|
currency = p.getCurrency() != null && compareTrust(this, e) < 0 ? p.getCurrency() : currency;
|
||||||
: contactfullname;
|
totalcost = p.getTotalcost() != null && compareTrust(this, e) < 0 ? p.getTotalcost() : totalcost;
|
||||||
contactfax =
|
fundedamount = p.getFundedamount() != null && compareTrust(this, e) < 0
|
||||||
p.getContactfax() != null && compareTrust(this, e) < 0 ? p.getContactfax() : contactfax;
|
? p.getFundedamount()
|
||||||
contactphone =
|
: fundedamount;
|
||||||
p.getContactphone() != null && compareTrust(this, e) < 0
|
mergeOAFDataInfo(e);
|
||||||
? p.getContactphone()
|
}
|
||||||
: contactphone;
|
|
||||||
contactemail =
|
|
||||||
p.getContactemail() != null && compareTrust(this, e) < 0
|
|
||||||
? p.getContactemail()
|
|
||||||
: contactemail;
|
|
||||||
summary = p.getSummary() != null && compareTrust(this, e) < 0 ? p.getSummary() : summary;
|
|
||||||
currency = p.getCurrency() != null && compareTrust(this, e) < 0 ? p.getCurrency() : currency;
|
|
||||||
totalcost =
|
|
||||||
p.getTotalcost() != null && compareTrust(this, e) < 0 ? p.getTotalcost() : totalcost;
|
|
||||||
fundedamount =
|
|
||||||
p.getFundedamount() != null && compareTrust(this, e) < 0
|
|
||||||
? p.getFundedamount()
|
|
||||||
: fundedamount;
|
|
||||||
mergeOAFDataInfo(e);
|
|
||||||
}
|
|
||||||
}
|
}
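Every scalar field in `Project.mergeFrom` follows the same rule: overwrite only when the other record has a non-null value and strictly higher trust. A minimal stand-alone sketch of that rule follows; `TrustMergeSketch`, `mergeField`, and the raw `double` trust values are hypothetical stand-ins for the real `compareTrust`/`OafEntity` machinery:

```java
// Hypothetical stand-in for the trust-based merge rule used in mergeFrom:
// a field is overwritten only when the other record carries a non-null
// value AND has strictly higher trust than the current one.
public class TrustMergeSketch {

	static String mergeField(String current, String other, double currentTrust, double otherTrust) {
		// mirrors: field = p.getX() != null && compareTrust(this, e) < 0 ? p.getX() : field;
		return other != null && Double.compare(currentTrust, otherTrust) < 0 ? other : current;
	}

	public static void main(String[] args) {
		System.out.println(mergeField("old", "new", 0.4, 0.9)); // higher trust wins -> new
		System.out.println(mergeField("old", null, 0.4, 0.9)); // null never overwrites -> old
		System.out.println(mergeField("old", "new", 0.9, 0.4)); // lower trust keeps current -> old
	}
}
```

Note the asymmetry: a more trusted record can still never erase a value with a null, which is why every ternary guards on non-null first.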
@ -1,36 +1,39 @@
package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

import eu.dnetlib.dhp.schema.common.ModelConstants;

public class Publication extends Result implements Serializable {

	// publication specific
	private Journal journal;

	public Publication() {
		setResulttype(ModelConstants.PUBLICATION_DEFAULT_RESULTTYPE);
	}

	public Journal getJournal() {
		return journal;
	}

	public void setJournal(Journal journal) {
		this.journal = journal;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!Publication.class.isAssignableFrom(e.getClass())) {
			return;
		}

		Publication p = (Publication) e;

		if (p.getJournal() != null && compareTrust(this, e) < 0)
			journal = p.getJournal();
		mergeOAFDataInfo(e);
	}
}
@ -1,80 +1,87 @@
package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

import org.apache.commons.lang3.StringUtils;

import com.fasterxml.jackson.annotation.JsonIgnore;

public class Qualifier implements Serializable {

	private String classid;
	private String classname;
	private String schemeid;
	private String schemename;

	public String getClassid() {
		return classid;
	}

	public void setClassid(String classid) {
		this.classid = classid;
	}

	public String getClassname() {
		return classname;
	}

	public void setClassname(String classname) {
		this.classname = classname;
	}

	public String getSchemeid() {
		return schemeid;
	}

	public void setSchemeid(String schemeid) {
		this.schemeid = schemeid;
	}

	public String getSchemename() {
		return schemename;
	}

	public void setSchemename(String schemename) {
		this.schemename = schemename;
	}

	public String toComparableString() {
		return isBlank()
			? ""
			: String
				.format(
					"%s::%s::%s::%s",
					classid != null ? classid : "",
					classname != null ? classname : "",
					schemeid != null ? schemeid : "",
					schemename != null ? schemename : "");
	}

	@JsonIgnore
	public boolean isBlank() {
		return StringUtils.isBlank(classid)
			&& StringUtils.isBlank(classname)
			&& StringUtils.isBlank(schemeid)
			&& StringUtils.isBlank(schemename);
	}

	@Override
	public int hashCode() {
		return toComparableString().hashCode();
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;

		Qualifier other = (Qualifier) obj;

		return toComparableString().equals(other.toComparableString());
	}
}
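`Qualifier` derives both `equals` and `hashCode` from a single canonical string, so any two qualifiers with the same four components collapse to one identity. A dependency-free sketch of that rule follows; `QualifierIdentitySketch` is hypothetical, and its `isBlank` is only a stand-in for `StringUtils.isBlank`:

```java
// Hypothetical, dependency-free sketch of Qualifier's identity rule:
// equality reduces to the "classid::classname::schemeid::schemename" string,
// with nulls printed as "" and fully blank qualifiers collapsing to "".
public class QualifierIdentitySketch {

	static String toComparableString(String classid, String classname, String schemeid, String schemename) {
		return isBlank(classid) && isBlank(classname) && isBlank(schemeid) && isBlank(schemename)
			? ""
			: String.format("%s::%s::%s::%s",
				classid != null ? classid : "",
				classname != null ? classname : "",
				schemeid != null ? schemeid : "",
				schemename != null ? schemename : "");
	}

	// stand-in for org.apache.commons.lang3.StringUtils.isBlank
	static boolean isBlank(String s) {
		return s == null || s.trim().isEmpty();
	}

	public static void main(String[] args) {
		System.out.println(toComparableString("main", "Main title", "dnet:title", "dnet:title"));
		System.out.println(toComparableString(null, null, " ", null).isEmpty()); // blank collapses to ""
	}
}
```

One consequence worth noting: a qualifier with all-blank components and one with all-null components compare equal, since both canonicalise to the empty string.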
@ -1,3 +1,4 @@
package eu.dnetlib.dhp.schema.oaf;

import static com.google.common.base.Preconditions.checkArgument;

@ -6,93 +7,120 @@ import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;

/**
 * Relation models any edge between two nodes in the OpenAIRE graph. It has a source id and a target id pointing to
 * graph node identifiers and it is further characterised by the semantic of the link through the fields relType,
 * subRelType and relClass. Provenance information is modeled according to the dataInfo element and collectedFrom, while
 * individual relationship types can provide extra information via the properties field.
 */
public class Relation extends Oaf {

	/**
	 * Main relationship classifier, values include 'resultResult', 'resultProject', 'resultOrganization', etc.
	 */
	private String relType;

	/**
	 * Further classifies a relationship, values include 'affiliation', 'similarity', 'supplement', etc.
	 */
	private String subRelType;

	/**
	 * Indicates the direction of the relationship, values include 'isSupplementTo', 'isSupplementedBy', 'merges',
	 * 'isMergedIn'.
	 */
	private String relClass;

	/**
	 * The source entity id.
	 */
	private String source;

	/**
	 * The target entity id.
	 */
	private String target;

	public String getRelType() {
		return relType;
	}

	public void setRelType(final String relType) {
		this.relType = relType;
	}

	public String getSubRelType() {
		return subRelType;
	}

	public void setSubRelType(final String subRelType) {
		this.subRelType = subRelType;
	}

	public String getRelClass() {
		return relClass;
	}

	public void setRelClass(final String relClass) {
		this.relClass = relClass;
	}

	public String getSource() {
		return source;
	}

	public void setSource(final String source) {
		this.source = source;
	}

	public String getTarget() {
		return target;
	}

	public void setTarget(final String target) {
		this.target = target;
	}

	public void mergeFrom(final Relation r) {

		checkArgument(Objects.equals(getSource(), r.getSource()), "source ids must be equal");
		checkArgument(Objects.equals(getTarget(), r.getTarget()), "target ids must be equal");
		checkArgument(Objects.equals(getRelType(), r.getRelType()), "relType(s) must be equal");
		checkArgument(
			Objects.equals(getSubRelType(), r.getSubRelType()), "subRelType(s) must be equal");
		checkArgument(Objects.equals(getRelClass(), r.getRelClass()), "relClass(es) must be equal");

		setCollectedfrom(
			Stream
				.concat(
					Optional
						.ofNullable(getCollectedfrom())
						.map(Collection::stream)
						.orElse(Stream.empty()),
					Optional
						.ofNullable(r.getCollectedfrom())
						.map(Collection::stream)
						.orElse(Stream.empty()))
				.distinct() // relies on KeyValue.equals
				.collect(Collectors.toList()));
	}

	@Override
	public boolean equals(Object o) {
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		Relation relation = (Relation) o;
		return relType.equals(relation.relType)
			&& subRelType.equals(relation.subRelType)
			&& relClass.equals(relation.relClass)
			&& source.equals(relation.source)
			&& target.equals(relation.target);
	}

	@Override
	public int hashCode() {
		return Objects.hash(relType, subRelType, relClass, source, target, collectedfrom);
	}
}
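`Relation.mergeFrom` only merges provenance: after asserting the two edges are semantically identical, it takes the null-safe union of the two collectedfrom lists. A stand-alone sketch of that union follows; `CollectedFromMergeSketch` is hypothetical and plain Strings stand in for the real `KeyValue` entries (whose `equals()` the `distinct()` step relies on):

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Sketch of the collectedfrom union in Relation.mergeFrom: either list may be
// null, so each side is lifted through Optional before the streams are
// concatenated, and distinct() drops duplicates.
public class CollectedFromMergeSketch {

	static List<String> merge(List<String> a, List<String> b) {
		return Stream
			.concat(
				Optional.ofNullable(a).map(Collection::stream).orElse(Stream.empty()),
				Optional.ofNullable(b).map(Collection::stream).orElse(Stream.empty()))
			.distinct()
			.collect(Collectors.toList());
	}

	public static void main(String[] args) {
		System.out.println(merge(Arrays.asList("ds1", "ds2"), Arrays.asList("ds2", "ds3"))); // [ds1, ds2, ds3]
		System.out.println(merge(null, Arrays.asList("ds1"))); // [ds1]
	}
}
```

The Optional lift keeps the method total over null inputs without scattering explicit null checks, which matters here because collectedfrom is frequently absent on freshly harvested relations.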
@ -1,3 +1,4 @@
package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@ -6,286 +7,291 @@ import java.util.List;

public class Result extends OafEntity implements Serializable {

	private List<Author> author;

	// resulttype allows subclassing results into publications | datasets | software
	private Qualifier resulttype;

	// common fields
	private Qualifier language;

	private List<Country> country;

	private List<StructuredProperty> subject;

	private List<StructuredProperty> title;

	private List<StructuredProperty> relevantdate;

	private List<Field<String>> description;

	private Field<String> dateofacceptance;

	private Field<String> publisher;

	private Field<String> embargoenddate;

	private List<Field<String>> source;

	private List<Field<String>> fulltext; // remove candidate

	private List<Field<String>> format;

	private List<Field<String>> contributor;

	private Qualifier resourcetype;

	private List<Field<String>> coverage;

	private Qualifier bestaccessright;

	private List<Context> context;

	private List<ExternalReference> externalReference;

	private List<Instance> instance;

	public List<Author> getAuthor() {
		return author;
	}

	public void setAuthor(List<Author> author) {
		this.author = author;
	}

	public Qualifier getResulttype() {
		return resulttype;
	}

	public void setResulttype(Qualifier resulttype) {
		this.resulttype = resulttype;
	}

	public Qualifier getLanguage() {
		return language;
	}

	public void setLanguage(Qualifier language) {
		this.language = language;
	}

	public List<Country> getCountry() {
		return country;
	}

	public void setCountry(List<Country> country) {
		this.country = country;
	}

	public List<StructuredProperty> getSubject() {
		return subject;
	}

	public void setSubject(List<StructuredProperty> subject) {
		this.subject = subject;
	}

	public List<StructuredProperty> getTitle() {
		return title;
	}

	public void setTitle(List<StructuredProperty> title) {
		this.title = title;
	}

	public List<StructuredProperty> getRelevantdate() {
		return relevantdate;
	}

	public void setRelevantdate(List<StructuredProperty> relevantdate) {
		this.relevantdate = relevantdate;
	}

	public List<Field<String>> getDescription() {
		return description;
	}

	public void setDescription(List<Field<String>> description) {
		this.description = description;
	}

	public Field<String> getDateofacceptance() {
		return dateofacceptance;
	}

	public void setDateofacceptance(Field<String> dateofacceptance) {
		this.dateofacceptance = dateofacceptance;
	}

	public Field<String> getPublisher() {
		return publisher;
	}

	public void setPublisher(Field<String> publisher) {
		this.publisher = publisher;
	}

	public Field<String> getEmbargoenddate() {
		return embargoenddate;
	}

	public void setEmbargoenddate(Field<String> embargoenddate) {
		this.embargoenddate = embargoenddate;
	}

	public List<Field<String>> getSource() {
		return source;
	}

	public void setSource(List<Field<String>> source) {
		this.source = source;
	}

	public List<Field<String>> getFulltext() {
		return fulltext;
	}

	public void setFulltext(List<Field<String>> fulltext) {
		this.fulltext = fulltext;
	}

	public List<Field<String>> getFormat() {
		return format;
	}

	public void setFormat(List<Field<String>> format) {
		this.format = format;
	}

	public List<Field<String>> getContributor() {
		return contributor;
	}

	public void setContributor(List<Field<String>> contributor) {
		this.contributor = contributor;
	}

	public Qualifier getResourcetype() {
		return resourcetype;
	}

	public void setResourcetype(Qualifier resourcetype) {
		this.resourcetype = resourcetype;
	}

	public List<Field<String>> getCoverage() {
		return coverage;
	}

	public void setCoverage(List<Field<String>> coverage) {
		this.coverage = coverage;
	}

	public Qualifier getBestaccessright() {
		return bestaccessright;
	}

	public void setBestaccessright(Qualifier bestaccessright) {
		this.bestaccessright = bestaccessright;
	}

	public List<Context> getContext() {
		return context;
	}

	public void setContext(List<Context> context) {
		this.context = context;
	}

	public List<ExternalReference> getExternalReference() {
		return externalReference;
	}

	public void setExternalReference(List<ExternalReference> externalReference) {
		this.externalReference = externalReference;
	}

	public List<Instance> getInstance() {
		return instance;
	}

	public void setInstance(List<Instance> instance) {
		this.instance = instance;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!Result.class.isAssignableFrom(e.getClass())) {
			return;
		}

		Result r = (Result) e;

		instance = mergeLists(instance, r.getInstance());

		if (r.getBestaccessright() != null && compareTrust(this, r) < 0)
			bestaccessright = r.getBestaccessright();

		if (r.getResulttype() != null && compareTrust(this, r) < 0)
			resulttype = r.getResulttype();

		if (r.getLanguage() != null && compareTrust(this, r) < 0)
			language = r.getLanguage();

		country = mergeLists(country, r.getCountry());

		subject = mergeLists(subject, r.getSubject());

		title = mergeLists(title, r.getTitle());

		relevantdate = mergeLists(relevantdate, r.getRelevantdate());

		description = longestLists(description, r.getDescription());

		if (r.getPublisher() != null && compareTrust(this, r) < 0)
			publisher = r.getPublisher();

		if (r.getEmbargoenddate() != null && compareTrust(this, r) < 0)
			embargoenddate = r.getEmbargoenddate();

		source = mergeLists(source, r.getSource());

		fulltext = mergeLists(fulltext, r.getFulltext());

		format = mergeLists(format, r.getFormat());

		contributor = mergeLists(contributor, r.getContributor());

		if (r.getResourcetype() != null)
			resourcetype = r.getResourcetype();

		coverage = mergeLists(coverage, r.getCoverage());

		context = mergeLists(context, r.getContext());

		externalReference = mergeLists(externalReference, r.getExternalReference());
	}

	private List<Field<String>> longestLists(List<Field<String>> a, List<Field<String>> b) {
		if (a == null || b == null)
			return a == null ? b : a;
		if (a.size() == b.size()) {
			int msa = a
				.stream()
				.filter(i -> i.getValue() != null)
|
.stream()
|
||||||
.map(i -> i.getValue().length())
|
.filter(i -> i.getValue() != null)
|
||||||
.max(Comparator.naturalOrder())
|
.map(i -> i.getValue().length())
|
||||||
.orElse(0);
|
.max(Comparator.naturalOrder())
|
||||||
int msb =
|
.orElse(0);
|
||||||
b.stream()
|
int msb = b
|
||||||
.filter(i -> i.getValue() != null)
|
.stream()
|
||||||
.map(i -> i.getValue().length())
|
.filter(i -> i.getValue() != null)
|
||||||
.max(Comparator.naturalOrder())
|
.map(i -> i.getValue().length())
|
||||||
.orElse(0);
|
.max(Comparator.naturalOrder())
|
||||||
return msa > msb ? a : b;
|
.orElse(0);
|
||||||
}
|
return msa > msb ? a : b;
|
||||||
return a.size() > b.size() ? a : b;
|
}
|
||||||
}
|
return a.size() > b.size() ? a : b;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
|
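The selection rule in `longestLists` can be illustrated with a standalone sketch that uses plain strings in place of `Field<String>`; the class and method names here are illustrative, not part of the schema module. When the two description lists have the same size, the list containing the single longest value wins; otherwise the longer list wins outright.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class LongestListsSketch {

	// Same rule as longestLists above, but over plain strings:
	// null-safe, then longest-single-value on a size tie, else larger list.
	static List<String> longestLists(List<String> a, List<String> b) {
		if (a == null || b == null)
			return a == null ? b : a;
		if (a.size() == b.size()) {
			int msa = a.stream()
				.filter(v -> v != null)
				.map(String::length)
				.max(Comparator.naturalOrder())
				.orElse(0);
			int msb = b.stream()
				.filter(v -> v != null)
				.map(String::length)
				.max(Comparator.naturalOrder())
				.orElse(0);
			return msa > msb ? a : b;
		}
		return a.size() > b.size() ? a : b;
	}

	public static void main(String[] args) {
		// Same size: the list holding the longest single value wins.
		List<String> a = Arrays.asList("short", "a much longer description");
		List<String> b = Arrays.asList("tiny", "mid");
		System.out.println(longestLists(a, b) == a); // true

		// Different sizes: the longer list wins regardless of value lengths.
		List<String> c = Arrays.asList("x", "y", "z");
		System.out.println(longestLists(b, c) == c); // true
	}
}
```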
@ -1,78 +1,78 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;
import java.util.List;

import eu.dnetlib.dhp.schema.common.ModelConstants;

public class Software extends Result implements Serializable {

	private List<Field<String>> documentationUrl;

	private List<StructuredProperty> license;

	private Field<String> codeRepositoryUrl;

	private Qualifier programmingLanguage;

	public Software() {
		setResulttype(ModelConstants.SOFTWARE_DEFAULT_RESULTTYPE);
	}

	public List<Field<String>> getDocumentationUrl() {
		return documentationUrl;
	}

	public void setDocumentationUrl(List<Field<String>> documentationUrl) {
		this.documentationUrl = documentationUrl;
	}

	public List<StructuredProperty> getLicense() {
		return license;
	}

	public void setLicense(List<StructuredProperty> license) {
		this.license = license;
	}

	public Field<String> getCodeRepositoryUrl() {
		return codeRepositoryUrl;
	}

	public void setCodeRepositoryUrl(Field<String> codeRepositoryUrl) {
		this.codeRepositoryUrl = codeRepositoryUrl;
	}

	public Qualifier getProgrammingLanguage() {
		return programmingLanguage;
	}

	public void setProgrammingLanguage(Qualifier programmingLanguage) {
		this.programmingLanguage = programmingLanguage;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);

		if (!Software.class.isAssignableFrom(e.getClass())) {
			return;
		}

		final Software s = (Software) e;
		documentationUrl = mergeLists(documentationUrl, s.getDocumentationUrl());

		license = mergeLists(license, s.getLicense());

		codeRepositoryUrl = s.getCodeRepositoryUrl() != null && compareTrust(this, s) < 0
			? s.getCodeRepositoryUrl()
			: codeRepositoryUrl;

		programmingLanguage = s.getProgrammingLanguage() != null && compareTrust(this, s) < 0
			? s.getProgrammingLanguage()
			: programmingLanguage;

		mergeOAFDataInfo(e);
	}
}
@ -1,56 +1,60 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

public class StructuredProperty implements Serializable {

	private String value;

	private Qualifier qualifier;

	private DataInfo dataInfo;

	public String getValue() {
		return value;
	}

	public void setValue(String value) {
		this.value = value;
	}

	public Qualifier getQualifier() {
		return qualifier;
	}

	public void setQualifier(Qualifier qualifier) {
		this.qualifier = qualifier;
	}

	public DataInfo getDataInfo() {
		return dataInfo;
	}

	public void setDataInfo(DataInfo dataInfo) {
		this.dataInfo = dataInfo;
	}

	public String toComparableString() {
		return value != null ? value.toLowerCase() : "";
	}

	@Override
	public int hashCode() {
		return toComparableString().hashCode();
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;

		StructuredProperty other = (StructuredProperty) obj;

		return toComparableString().equals(other.toComparableString());
	}
}
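The equality contract of `StructuredProperty` (equals and hashCode both delegate to a lower-cased view of `value`) can be sketched standalone; the class below is a hypothetical stand-in, not the schema class itself.

```java
public class ComparableValueSketch {

	private final String value;

	public ComparableValueSketch(String value) {
		this.value = value;
	}

	// Same normalization as StructuredProperty.toComparableString():
	// null maps to "", everything else to lower case.
	public String toComparableString() {
		return value != null ? value.toLowerCase() : "";
	}

	@Override
	public int hashCode() {
		return toComparableString().hashCode();
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj)
			return true;
		if (obj == null || getClass() != obj.getClass())
			return false;
		return toComparableString().equals(((ComparableValueSketch) obj).toComparableString());
	}

	public static void main(String[] args) {
		ComparableValueSketch a = new ComparableValueSketch("DOI");
		ComparableValueSketch b = new ComparableValueSketch("doi");
		System.out.println(a.equals(b));                  // true: case-insensitive
		System.out.println(a.hashCode() == b.hashCode()); // true: consistent with equals
	}
}
```

Basing hashCode on the same normalized string keeps the hashCode/equals contract intact, so instances differing only in case collapse to one entry in hash-based collections.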
@ -1,83 +1,89 @@

package eu.dnetlib.dhp.schema.scholexplorer;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.commons.lang3.StringUtils;

import eu.dnetlib.dhp.schema.oaf.Dataset;
import eu.dnetlib.dhp.schema.oaf.OafEntity;

public class DLIDataset extends Dataset {

	private String originalObjIdentifier;

	private List<ProvenaceInfo> dlicollectedfrom;

	private String completionStatus;

	public String getCompletionStatus() {
		return completionStatus;
	}

	public void setCompletionStatus(String completionStatus) {
		this.completionStatus = completionStatus;
	}

	public List<ProvenaceInfo> getDlicollectedfrom() {
		return dlicollectedfrom;
	}

	public void setDlicollectedfrom(List<ProvenaceInfo> dlicollectedfrom) {
		this.dlicollectedfrom = dlicollectedfrom;
	}

	public String getOriginalObjIdentifier() {
		return originalObjIdentifier;
	}

	public void setOriginalObjIdentifier(String originalObjIdentifier) {
		this.originalObjIdentifier = originalObjIdentifier;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);
		DLIDataset p = (DLIDataset) e;
		if (StringUtils.isBlank(completionStatus) && StringUtils.isNotBlank(p.completionStatus))
			completionStatus = p.completionStatus;
		if ("complete".equalsIgnoreCase(p.completionStatus))
			completionStatus = "complete";
		dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
	}

	private List<ProvenaceInfo> mergeProvenance(
		final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
		Map<String, ProvenaceInfo> result = new HashMap<>();
		if (a != null)
			a
				.forEach(
					p -> {
						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
								&& StringUtils.isNotBlank(p.getCompletionStatus())) {
								result.put(p.getId(), p);
							}
						} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
							result.put(p.getId(), p);
					});
		if (b != null)
			b
				.forEach(
					p -> {
						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
								&& StringUtils.isNotBlank(p.getCompletionStatus())) {
								result.put(p.getId(), p);
							}
						} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
							result.put(p.getId(), p);
					});

		return new ArrayList<>(result.values());
	}
}
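The deduplication rule in `mergeProvenance` (entries keyed by id, with an already-collected "incomplete" entry overwritten by a later one carrying a non-blank completion status) can be sketched standalone; `Prov` below is a hypothetical stand-in for `ProvenaceInfo`, without the Commons Lang dependency.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MergeProvenanceSketch {

	static class Prov {
		final String id;
		final String completionStatus;

		Prov(String id, String completionStatus) {
			this.id = id;
			this.completionStatus = completionStatus;
		}
	}

	private static boolean isBlank(String s) {
		return s == null || s.trim().isEmpty();
	}

	// Fold one list into the id-keyed map, applying the upgrade rule.
	private static void accumulate(Map<String, Prov> result, List<Prov> list) {
		if (list == null)
			return;
		list.forEach(p -> {
			if (p == null || isBlank(p.id))
				return;
			Prov seen = result.get(p.id);
			if (seen == null) {
				result.put(p.id, p); // first occurrence of this id
			} else if ("incomplete".equalsIgnoreCase(seen.completionStatus)
				&& !isBlank(p.completionStatus)) {
				result.put(p.id, p); // upgrade an incomplete entry
			}
		});
	}

	static List<Prov> mergeProvenance(List<Prov> a, List<Prov> b) {
		Map<String, Prov> result = new HashMap<>();
		accumulate(result, a);
		accumulate(result, b);
		return new ArrayList<>(result.values());
	}

	public static void main(String[] args) {
		List<Prov> a = new ArrayList<>();
		a.add(new Prov("ds1", "incomplete"));
		List<Prov> b = new ArrayList<>();
		b.add(new Prov("ds1", "complete"));
		b.add(new Prov("ds2", "complete"));

		List<Prov> merged = mergeProvenance(a, b);
		System.out.println(merged.size()); // 2: ds1 upgraded to complete, ds2 added
	}
}
```

The same merge logic is duplicated verbatim in DLIDataset, DLIPublication, and DLIUnknown below; hoisting it into a shared helper would be a natural follow-up refactoring.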
@ -1,81 +1,87 @@

package eu.dnetlib.dhp.schema.scholexplorer;

import java.io.Serializable;
import java.util.*;

import org.apache.commons.lang3.StringUtils;

import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Publication;

public class DLIPublication extends Publication implements Serializable {

	private String originalObjIdentifier;

	private List<ProvenaceInfo> dlicollectedfrom;

	private String completionStatus;

	public String getCompletionStatus() {
		return completionStatus;
	}

	public void setCompletionStatus(String completionStatus) {
		this.completionStatus = completionStatus;
	}

	public List<ProvenaceInfo> getDlicollectedfrom() {
		return dlicollectedfrom;
	}

	public void setDlicollectedfrom(List<ProvenaceInfo> dlicollectedfrom) {
		this.dlicollectedfrom = dlicollectedfrom;
	}

	public String getOriginalObjIdentifier() {
		return originalObjIdentifier;
	}

	public void setOriginalObjIdentifier(String originalObjIdentifier) {
		this.originalObjIdentifier = originalObjIdentifier;
	}

	@Override
	public void mergeFrom(OafEntity e) {
		super.mergeFrom(e);
		DLIPublication p = (DLIPublication) e;
		if (StringUtils.isBlank(completionStatus) && StringUtils.isNotBlank(p.completionStatus))
			completionStatus = p.completionStatus;
		if ("complete".equalsIgnoreCase(p.completionStatus))
			completionStatus = "complete";
		dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
	}

	private List<ProvenaceInfo> mergeProvenance(
		final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
		Map<String, ProvenaceInfo> result = new HashMap<>();
		if (a != null)
			a
				.forEach(
					p -> {
						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
								&& StringUtils.isNotBlank(p.getCompletionStatus())) {
								result.put(p.getId(), p);
							}
						} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
							result.put(p.getId(), p);
					});
		if (b != null)
			b
				.forEach(
					p -> {
						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
								&& StringUtils.isNotBlank(p.getCompletionStatus())) {
								result.put(p.getId(), p);
							}
						} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
							result.put(p.getId(), p);
					});

		return new ArrayList<>(result.values());
	}
}
@ -1,15 +1,16 @@

package eu.dnetlib.dhp.schema.scholexplorer;

import eu.dnetlib.dhp.schema.oaf.Relation;

public class DLIRelation extends Relation {

	private String dateOfCollection;

	public String getDateOfCollection() {
		return dateOfCollection;
	}

	public void setDateOfCollection(String dateOfCollection) {
		this.dateOfCollection = dateOfCollection;
	}
}
@ -1,109 +1,115 @@

package eu.dnetlib.dhp.schema.scholexplorer;

import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.commons.lang3.StringUtils;

import eu.dnetlib.dhp.schema.oaf.Oaf;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class DLIUnknown extends Oaf implements Serializable {

	private String id;

	private List<StructuredProperty> pid;

	private String dateofcollection;

	private String dateoftransformation;

	private List<ProvenaceInfo> dlicollectedfrom;

	private String completionStatus = "incomplete";

	public String getCompletionStatus() {
		return completionStatus;
	}

	public void setCompletionStatus(String completionStatus) {
		this.completionStatus = completionStatus;
	}

	public List<ProvenaceInfo> getDlicollectedfrom() {
		return dlicollectedfrom;
	}

	public void setDlicollectedfrom(List<ProvenaceInfo> dlicollectedfrom) {
		this.dlicollectedfrom = dlicollectedfrom;
	}

	public String getId() {
		return id;
	}

	public void setId(String id) {
		this.id = id;
	}

	public List<StructuredProperty> getPid() {
		return pid;
	}

	public void setPid(List<StructuredProperty> pid) {
		this.pid = pid;
	}

	public String getDateofcollection() {
		return dateofcollection;
	}

	public void setDateofcollection(String dateofcollection) {
		this.dateofcollection = dateofcollection;
	}

	public String getDateoftransformation() {
		return dateoftransformation;
	}

	public void setDateoftransformation(String dateoftransformation) {
		this.dateoftransformation = dateoftransformation;
	}

	public void mergeFrom(DLIUnknown p) {
		if ("complete".equalsIgnoreCase(p.completionStatus))
			completionStatus = "complete";
		dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
	}

	private List<ProvenaceInfo> mergeProvenance(
		final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
		Map<String, ProvenaceInfo> result = new HashMap<>();
		if (a != null)
			a
				.forEach(
					p -> {
						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
								&& StringUtils.isNotBlank(p.getCompletionStatus())) {
								result.put(p.getId(), p);
							}
						} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
							result.put(p.getId(), p);
					});
		if (b != null)
			b
				.forEach(
					p -> {
						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())
								&& StringUtils.isNotBlank(p.getCompletionStatus())) {
								result.put(p.getId(), p);
							}
						} else if (p != null && p.getId() != null && !result.containsKey(p.getId()))
							result.put(p.getId(), p);
					});

		return new ArrayList<>(result.values());
	}
}
@ -1,46 +1,47 @@

package eu.dnetlib.dhp.schema.scholexplorer;

import java.io.Serializable;

public class ProvenaceInfo implements Serializable {

	private String id;

	private String name;

	private String completionStatus;

	private String collectionMode = "collected";

	public String getId() {
		return id;
	}

	public void setId(String id) {
		this.id = id;
	}

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public String getCompletionStatus() {
		return completionStatus;
	}

	public void setCompletionStatus(String completionStatus) {
		this.completionStatus = completionStatus;
	}

	public String getCollectionMode() {
		return collectionMode;
	}

	public void setCollectionMode(String collectionMode) {
		this.collectionMode = collectionMode;
	}
}
@ -1,36 +1,40 @@

package eu.dnetlib.dhp.schema.action;

import static org.junit.jupiter.api.Assertions.*;

import java.io.IOException;

import org.apache.commons.lang3.StringUtils;
import org.junit.jupiter.api.Test;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.schema.oaf.Relation;

/** @author claudio.atzori */
public class AtomicActionTest {

	@Test
	public void serializationTest() throws IOException {

		Relation rel = new Relation();
		rel.setSource("1");
		rel.setTarget("2");
		rel.setRelType("resultResult");
		rel.setSubRelType("dedup");
		rel.setRelClass("merges");

		AtomicAction aa1 = new AtomicAction(Relation.class, rel);

		final ObjectMapper mapper = new ObjectMapper();
		String json = mapper.writeValueAsString(aa1);

		assertTrue(StringUtils.isNotBlank(json));

		AtomicAction aa2 = mapper.readValue(json, AtomicAction.class);

		assertEquals(aa1.getClazz(), aa2.getClazz());
		assertEquals(aa1.getPayload(), aa2.getPayload());
	}
}
@@ -1,35 +1,37 @@

package eu.dnetlib.dhp.schema.common;

import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;

import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Relation;
import eu.dnetlib.dhp.schema.oaf.Result;

public class ModelSupportTest {

	@Nested
	class IsSubClass {

		@Test
		public void shouldReturnFalseWhenSubClassDoesNotExtendSuperClass() {
			// when
			Boolean result = ModelSupport.isSubClass(Relation.class, OafEntity.class);

			// then
			assertFalse(result);
		}

		@Test
		public void shouldReturnTrueWhenSubClassExtendsSuperClass() {
			// when
			Boolean result = ModelSupport.isSubClass(Result.class, OafEntity.class);

			// then
			assertTrue(result);
		}
	}
}

@@ -1,86 +1,88 @@

package eu.dnetlib.dhp.schema.oaf;

import static org.junit.jupiter.api.Assertions.*;

import java.util.Arrays;
import java.util.List;

import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

public class MergeTest {

	OafEntity oaf;

	@BeforeEach
	public void setUp() {
		oaf = new Publication();
	}

	@Test
	public void mergeListsTest() {

		// string list merge test
		List<String> a = Arrays.asList("a", "b", "c", "e");
		List<String> b = Arrays.asList("a", "b", "c", "d");
		List<String> c = null;

		System.out.println("merge result 1 = " + oaf.mergeLists(a, b));

		System.out.println("merge result 2 = " + oaf.mergeLists(a, c));

		System.out.println("merge result 3 = " + oaf.mergeLists(c, c));
	}

	@Test
	public void mergePublicationCollectedFromTest() {

		Publication a = new Publication();
		Publication b = new Publication();

		a.setCollectedfrom(Arrays.asList(setKV("a", "open"), setKV("b", "closed")));
		b.setCollectedfrom(Arrays.asList(setKV("A", "open"), setKV("b", "Open")));

		a.mergeFrom(b);

		assertNotNull(a.getCollectedfrom());
		assertEquals(3, a.getCollectedfrom().size());
	}

	@Test
	public void mergePublicationSubjectTest() {

		Publication a = new Publication();
		Publication b = new Publication();

		a.setSubject(Arrays.asList(setSP("a", "open", "classe"), setSP("b", "open", "classe")));
		b.setSubject(Arrays.asList(setSP("A", "open", "classe"), setSP("c", "open", "classe")));

		a.mergeFrom(b);

		assertNotNull(a.getSubject());
		assertEquals(3, a.getSubject().size());
	}

	private KeyValue setKV(final String key, final String value) {

		KeyValue k = new KeyValue();

		k.setKey(key);
		k.setValue(value);

		return k;
	}

	private StructuredProperty setSP(
		final String value, final String schema, final String classname) {
		StructuredProperty s = new StructuredProperty();
		s.setValue(value);
		Qualifier q = new Qualifier();
		q.setClassname(classname);
		q.setClassid(classname);
		q.setSchemename(schema);
		q.setSchemeid(schema);
		s.setQualifier(q);
		return s;
	}
}

@@ -1,76 +1,83 @@

package eu.dnetlib.dhp.schema.scholexplorer;

import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;

import org.junit.jupiter.api.Test;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;

import eu.dnetlib.dhp.schema.oaf.Qualifier;
import eu.dnetlib.dhp.schema.oaf.StructuredProperty;

public class DLItest {

	@Test
	public void testMergePublication() throws JsonProcessingException {
		DLIPublication a1 = new DLIPublication();
		a1.setPid(Arrays.asList(createSP("123456", "pdb", "dnet:pid_types")));
		a1.setTitle(Collections.singletonList(createSP("Un Titolo", "title", "dnetTitle")));
		a1.setDlicollectedfrom(Arrays.asList(createCollectedFrom("znd", "Zenodo", "complete")));
		a1.setCompletionStatus("complete");

		DLIPublication a = new DLIPublication();
		a
			.setPid(
				Arrays
					.asList(
						createSP("10.11", "doi", "dnet:pid_types"),
						createSP("123456", "pdb", "dnet:pid_types")));
		a.setTitle(Collections.singletonList(createSP("A Title", "title", "dnetTitle")));
		a
			.setDlicollectedfrom(
				Arrays
					.asList(
						createCollectedFrom("dct", "datacite", "complete"),
						createCollectedFrom("dct", "datacite", "incomplete")));
		a.setCompletionStatus("incomplete");

		a.mergeFrom(a1);

		ObjectMapper mapper = new ObjectMapper();
		System.out.println(mapper.writeValueAsString(a));
	}

	@Test
	public void testDeserialization() throws IOException {

		final String json = "{\"dataInfo\":{\"invisible\":false,\"inferred\":null,\"deletedbyinference\":false,\"trust\":\"0.9\",\"inferenceprovenance\":null,\"provenanceaction\":null},\"lastupdatetimestamp\":null,\"id\":\"60|bd9352547098929a394655ad1a44a479\",\"originalId\":[\"bd9352547098929a394655ad1a44a479\"],\"collectedfrom\":[{\"key\":\"dli_________::datacite\",\"value\":\"Datasets in Datacite\",\"dataInfo\":null,\"blank\":false}],\"pid\":[{\"value\":\"10.7925/DRS1.DUCHAS_5078760\",\"qualifier\":{\"classid\":\"doi\",\"classname\":\"doi\",\"schemeid\":\"dnet:pid_types\",\"schemename\":\"dnet:pid_types\",\"blank\":false},\"dataInfo\":null}],\"dateofcollection\":\"2020-01-09T08:29:31.885Z\",\"dateoftransformation\":null,\"extraInfo\":null,\"oaiprovenance\":null,\"author\":[{\"fullname\":\"Cathail, S. Ó\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Donnell, Breda Mc\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Ireland. Department of Arts, Culture, and the Gaeltacht\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"University College Dublin\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"National Folklore Foundation\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Cathail, S. Ó\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Donnell, Breda Mc\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null}],\"resulttype\":null,\"language\":null,\"country\":null,\"subject\":[{\"value\":\"Recreation\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null},{\"value\":\"Entertainments and recreational activities\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null},{\"value\":\"Siamsaíocht agus caitheamh aimsire\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null}],\"title\":[{\"value\":\"Games We Play\",\"qualifier\":null,\"dataInfo\":null}],\"relevantdate\":[{\"value\":\"1938-09-28\",\"qualifier\":{\"classid\":\"date\",\"classname\":\"date\",\"schemeid\":\"dnet::date\",\"schemename\":\"dnet::date\",\"blank\":false},\"dataInfo\":null}],\"description\":[{\"value\":\"Story collected by Breda Mc Donnell, a student at Tenure school (Tinure, Co. Louth) (no informant identified).\",\"dataInfo\":null}],\"dateofacceptance\":null,\"publisher\":{\"value\":\"University College Dublin\",\"dataInfo\":null},\"embargoenddate\":null,\"source\":null,\"fulltext\":null,\"format\":null,\"contributor\":null,\"resourcetype\":null,\"coverage\":null,\"refereed\":null,\"context\":null,\"processingchargeamount\":null,\"processingchargecurrency\":null,\"externalReference\":null,\"instance\":[],\"storagedate\":null,\"device\":null,\"size\":null,\"version\":null,\"lastmetadataupdate\":null,\"metadataversionnumber\":null,\"geolocation\":null,\"dlicollectedfrom\":[{\"id\":\"dli_________::datacite\",\"name\":\"Datasets in Datacite\",\"completionStatus\":\"complete\",\"collectionMode\":\"resolved\"}],\"completionStatus\":\"complete\"}";

		ObjectMapper mapper = new ObjectMapper();
		mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
		DLIDataset dliDataset = mapper.readValue(json, DLIDataset.class);
		mapper.enable(SerializationFeature.INDENT_OUTPUT);
		System.out.println(mapper.writeValueAsString(dliDataset));
	}

	private ProvenaceInfo createCollectedFrom(
		final String id, final String name, final String completionStatus) {
		ProvenaceInfo p = new ProvenaceInfo();
		p.setId(id);
		p.setName(name);
		p.setCompletionStatus(completionStatus);
		return p;
	}

	private StructuredProperty createSP(
		final String value, final String className, final String schemeName) {
		StructuredProperty p = new StructuredProperty();
		p.setValue(value);
		Qualifier schema = new Qualifier();
		schema.setClassname(className);
		schema.setClassid(className);
		schema.setSchemename(schemeName);
		schema.setSchemeid(schemeName);
		p.setQualifier(schema);
		return p;
	}
}

@@ -1,8 +1,23 @@

package eu.dnetlib.dhp.actionmanager;

import java.io.Serializable;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.stream.Collectors;

import org.dom4j.Document;
import org.dom4j.Element;
import org.dom4j.io.SAXReader;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Splitter;
import com.google.common.collect.Iterables;
import com.google.common.collect.Lists;

import eu.dnetlib.actionmanager.rmi.ActionManagerException;
import eu.dnetlib.actionmanager.set.ActionManagerSet;
import eu.dnetlib.actionmanager.set.ActionManagerSet.ImpactTypes;
@@ -10,130 +25,120 @@ import eu.dnetlib.dhp.actionmanager.partition.PartitionActionSetsByPayloadTypeJo
import eu.dnetlib.dhp.utils.ISLookupClientFactory;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;

public class ISClient implements Serializable {

	private static final Logger log = LoggerFactory.getLogger(PartitionActionSetsByPayloadTypeJob.class);

	private static final String INPUT_ACTION_SET_ID_SEPARATOR = ",";

	private final ISLookUpService isLookup;

	public ISClient(String isLookupUrl) {
		isLookup = ISLookupClientFactory.getLookUpService(isLookupUrl);
	}

	public List<String> getLatestRawsetPaths(String setIds) {

		List<String> ids = Lists
			.newArrayList(
				Splitter
					.on(INPUT_ACTION_SET_ID_SEPARATOR)
					.omitEmptyStrings()
					.trimResults()
					.split(setIds));

		return ids
			.stream()
			.map(id -> getSet(isLookup, id))
			.map(as -> as.getPathToLatest())
			.collect(Collectors.toCollection(ArrayList::new));
	}

	private ActionManagerSet getSet(ISLookUpService isLookup, final String setId) {

		final String q = "for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') "
			+ "where $x//SET/@id = '"
			+ setId
			+ "' return $x";

		try {
			final String basePath = getBasePathHDFS(isLookup);
			final String setProfile = isLookup.getResourceProfileByQuery(q);
			return getActionManagerSet(basePath, setProfile);
		} catch (ISLookUpException | ActionManagerException e) {
			throw new RuntimeException("Error accessing Sets, using query: " + q);
		}
	}

	private ActionManagerSet getActionManagerSet(final String basePath, final String profile)
		throws ActionManagerException {
		final SAXReader reader = new SAXReader();
		final ActionManagerSet set = new ActionManagerSet();

		try {
			final Document doc = reader.read(new StringReader(profile));

			set.setId(doc.valueOf("//SET/@id").trim());
			set.setName(doc.valueOf("//SET").trim());
			set.setImpact(ImpactTypes.valueOf(doc.valueOf("//IMPACT").trim()));
			set
				.setLatest(
					doc.valueOf("//RAW_SETS/LATEST/@id"),
					doc.valueOf("//RAW_SETS/LATEST/@creationDate"),
					doc.valueOf("//RAW_SETS/LATEST/@lastUpdate"));
			set.setDirectory(doc.valueOf("//SET/@directory"));
			final List expiredNodes = doc.selectNodes("//RAW_SETS/EXPIRED");
			if (expiredNodes != null) {
				for (int i = 0; i < expiredNodes.size(); i++) {
					Element ex = (Element) expiredNodes.get(i);
					set
						.addExpired(
							ex.attributeValue("id"),
							ex.attributeValue("creationDate"),
							ex.attributeValue("lastUpdate"));
				}
			}

			final StringBuilder sb = new StringBuilder();
			sb.append(basePath);
			sb.append("/");
			sb.append(doc.valueOf("//SET/@directory"));
			sb.append("/");
			sb.append(doc.valueOf("//RAW_SETS/LATEST/@id"));
			set.setPathToLatest(sb.toString());

			return set;
		} catch (Exception e) {
			throw new ActionManagerException("Error creating set from profile: " + profile, e);
		}
	}

	private String getBasePathHDFS(ISLookUpService isLookup) throws ActionManagerException {
		return queryServiceProperty(isLookup, "basePath");
	}

	private String queryServiceProperty(ISLookUpService isLookup, final String propertyName)
		throws ActionManagerException {
		final String q = "for $x in /RESOURCE_PROFILE[.//RESOURCE_TYPE/@value='ActionManagerServiceResourceType'] return $x//SERVICE_PROPERTIES/PROPERTY[./@ key='"
			+ propertyName
			+ "']/@value/string()";
		log.debug("quering for service property: " + q);
		try {
			final List<String> value = isLookup.quickSearchProfile(q);
			return Iterables.getOnlyElement(value);
		} catch (ISLookUpException e) {
			String msg = "Error accessing service profile, using query: " + q;
			log.error(msg, e);
			throw new ActionManagerException(msg, e);
		} catch (NoSuchElementException e) {
			String msg = "missing service property: " + propertyName;
			log.error(msg, e);
			throw new ActionManagerException(msg, e);
		} catch (IllegalArgumentException e) {
			String msg = "found more than one service property: " + propertyName;
			log.error(msg, e);
			throw new ActionManagerException(msg, e);
		}
	}
}

@@ -1,47 +1,69 @@

package eu.dnetlib.dhp.actionmanager.migration;

import java.util.Comparator;

import eu.dnetlib.data.proto.FieldTypeProtos.Qualifier;

public class LicenseComparator implements Comparator<Qualifier> {

	@Override
	public int compare(Qualifier left, Qualifier right) {

		if (left == null && right == null)
			return 0;
		if (left == null)
			return 1;
		if (right == null)
			return -1;

		String lClass = left.getClassid();
		String rClass = right.getClassid();

		if (lClass.equals(rClass))
			return 0;

		if (lClass.equals("OPEN SOURCE"))
			return -1;
		if (rClass.equals("OPEN SOURCE"))
			return 1;

		if (lClass.equals("OPEN"))
			return -1;
		if (rClass.equals("OPEN"))
			return 1;

		if (lClass.equals("6MONTHS"))
			return -1;
		if (rClass.equals("6MONTHS"))
			return 1;

		if (lClass.equals("12MONTHS"))
			return -1;
		if (rClass.equals("12MONTHS"))
			return 1;

		if (lClass.equals("EMBARGO"))
			return -1;
		if (rClass.equals("EMBARGO"))
			return 1;

		if (lClass.equals("RESTRICTED"))
			return -1;
		if (rClass.equals("RESTRICTED"))
			return 1;

		if (lClass.equals("CLOSED"))
			return -1;
		if (rClass.equals("CLOSED"))
			return 1;

		if (lClass.equals("UNKNOWN"))
			return -1;
		if (rClass.equals("UNKNOWN"))
			return 1;

		// Else (but unlikely), lexicographical ordering will do.
		return lClass.compareTo(rClass);
	}
}

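The if-chain in `LicenseComparator` encodes a fixed openness precedence over access-right class ids. The same ordering can be sketched as a rank lookup over plain strings (a hypothetical simplification for illustration: protobuf `Qualifier` and the null checks are dropped, only the classid precedence is kept):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Sketch of the same ordering as the LicenseComparator above, expressed
// as a rank lookup over classid strings: known classes sort by their
// position in the openness list, a known class beats an unknown one,
// and two unknown classes fall back to lexicographic order.
public class ClassidOrder implements Comparator<String> {
	private static final List<String> RANK = Arrays
		.asList("OPEN SOURCE", "OPEN", "6MONTHS", "12MONTHS", "EMBARGO", "RESTRICTED", "CLOSED", "UNKNOWN");

	@Override
	public int compare(String left, String right) {
		int l = RANK.indexOf(left);
		int r = RANK.indexOf(right);
		if (l >= 0 && r >= 0)
			return Integer.compare(l, r);
		if (l >= 0)
			return -1; // a known class sorts before an unknown one
		if (r >= 0)
			return 1;
		return left.compareTo(right); // lexicographic fallback
	}
}
```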
@@ -1,11 +1,6 @@

package eu.dnetlib.dhp.actionmanager.migration;

import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
@@ -14,6 +9,7 @@ import java.util.LinkedList;
import java.util.List;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.hadoop.conf.Configuration;
@@ -25,164 +21,174 @@ import org.apache.hadoop.util.ToolRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Splitter;
import com.google.common.collect.Lists;

import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.utils.ISLookupClientFactory;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;

public class MigrateActionSet {

	private static final Logger log = LoggerFactory.getLogger(MigrateActionSet.class);

	private static final String SEPARATOR = "/";
	private static final String TARGET_PATHS = "target_paths";
	private static final String RAWSET_PREFIX = "rawset_";

	public static void main(String[] args) throws Exception {
		final ArgumentApplicationParser parser = new ArgumentApplicationParser(
			IOUtils
				.toString(
					MigrateActionSet.class
						.getResourceAsStream(
							"/eu/dnetlib/dhp/actionmanager/migration/migrate_actionsets_parameters.json")));
		parser.parseArgument(args);

		new MigrateActionSet().run(parser);
	}

	private void run(ArgumentApplicationParser parser) throws Exception {

		final String isLookupUrl = parser.get("isLookupUrl");
		final String sourceNN = parser.get("sourceNameNode");
		final String targetNN = parser.get("targetNameNode");
		final String workDir = parser.get("workingDirectory");
		final Integer distcp_num_maps = Integer.parseInt(parser.get("distcp_num_maps"));

		final String distcp_memory_mb = parser.get("distcp_memory_mb");
		final String distcp_task_timeout = parser.get("distcp_task_timeout");

		final String transform_only_s = parser.get("transform_only");

		log.info("transform only param: {}", transform_only_s);

		final Boolean transformOnly = Boolean.valueOf(parser.get("transform_only"));

		log.info("transform only: {}", transformOnly);

		ISLookUpService isLookUp = ISLookupClientFactory.getLookUpService(isLookupUrl);

		Configuration conf = getConfiguration(distcp_task_timeout, distcp_memory_mb, distcp_num_maps);
		FileSystem targetFS = FileSystem.get(conf);

		Configuration sourceConf = getConfiguration(distcp_task_timeout, distcp_memory_mb, distcp_num_maps);
		sourceConf.set(FileSystem.FS_DEFAULT_NAME_KEY, sourceNN);
		FileSystem sourceFS = FileSystem.get(sourceConf);

		Properties props = new Properties();

		List<Path> targetPaths = new ArrayList<>();

		final List<Path> sourcePaths = getSourcePaths(sourceNN, isLookUp);
		log
			.info(
				"paths to process:\n{}",
				sourcePaths.stream().map(p -> p.toString()).collect(Collectors.joining("\n")));
		for (Path source : sourcePaths) {

			if (!sourceFS.exists(source)) {
				log.warn("skipping unexisting path: {}", source);
			} else {

				LinkedList<String> pathQ = Lists.newLinkedList(Splitter.on(SEPARATOR).split(source.toUri().getPath()));

				final String rawSet = pathQ.pollLast();
				log.info("got RAWSET: {}", rawSet);

				if (StringUtils.isNotBlank(rawSet) && rawSet.startsWith(RAWSET_PREFIX)) {

					final String actionSetDirectory = pathQ.pollLast();

					final Path targetPath = new Path(
						targetNN + workDir + SEPARATOR + actionSetDirectory + SEPARATOR + rawSet);

					log.info("using TARGET PATH: {}", targetPath);

					if (!transformOnly) {
						if (targetFS.exists(targetPath)) {
							targetFS.delete(targetPath, true);
						}
						runDistcp(
							distcp_num_maps, distcp_memory_mb, distcp_task_timeout, conf, source, targetPath);
					}

					targetPaths.add(targetPath);
				}
			}
		}

		props
			.setProperty(
				TARGET_PATHS, targetPaths.stream().map(p -> p.toString()).collect(Collectors.joining(",")));
		File file = new File(System.getProperty("oozie.action.output.properties"));

		try (OutputStream os = new FileOutputStream(file)) {
			props.store(os, "");
		}
		System.out.println(file.getAbsolutePath());
	}

	private void runDistcp(
		Integer distcp_num_maps,
		String distcp_memory_mb,
		String distcp_task_timeout,
		Configuration conf,
		Path source,
		Path targetPath)
		throws Exception {

		final DistCpOptions op = new DistCpOptions(source, targetPath);
		op.setMaxMaps(distcp_num_maps);
		op.preserve(DistCpOptions.FileAttribute.BLOCKSIZE);
		op.preserve(DistCpOptions.FileAttribute.REPLICATION);
		op.preserve(DistCpOptions.FileAttribute.CHECKSUMTYPE);

		int res = ToolRunner
			.run(
				new DistCp(conf, op),
				new String[] {
					"-Dmapred.task.timeout=" + distcp_task_timeout,
					"-Dmapreduce.map.memory.mb=" + distcp_memory_mb,
					"-pb",
					"-m " + distcp_num_maps,
					source.toString(),
					targetPath.toString()
				});

		if (res != 0) {
			throw new RuntimeException(String.format("distcp exited with code %s", res));
		}
	}

	private Configuration getConfiguration(
		String distcp_task_timeout, String distcp_memory_mb, Integer distcp_num_maps) {
		final Configuration conf = new Configuration();
		conf.set("dfs.webhdfs.socket.connect-timeout", distcp_task_timeout);
		conf.set("dfs.webhdfs.socket.read-timeout", distcp_task_timeout);
		conf.set("dfs.http.client.retry.policy.enabled", "true");
		conf.set("mapred.task.timeout", distcp_task_timeout);
		conf.set("mapreduce.map.memory.mb", distcp_memory_mb);
		conf.set("mapred.map.tasks", String.valueOf(distcp_num_maps));
		return conf;
	}

	private List<Path> getSourcePaths(String sourceNN, ISLookUpService isLookUp)
		throws ISLookUpException {
		String XQUERY = "distinct-values(\n"
			+ "let $basePath := collection('/db/DRIVER/ServiceResources/ActionManagerServiceResourceType')//SERVICE_PROPERTIES/PROPERTY[@key = 'basePath']/@value/string()\n"
			+ "for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') \n"
			+ "let $setDir := $x//SET/@directory/string()\n"
			+ "let $rawSet := $x//RAW_SETS/LATEST/@id/string()\n"
			+ "return concat($basePath, '/', $setDir, '/', $rawSet))";

		log.info(String.format("running xquery:\n%s", XQUERY));
		return isLookUp
			.quickSearchProfile(XQUERY)
			.stream()
			.map(p -> sourceNN + p)
			.map(Path::new)
			.collect(Collectors.toList());
	}
}

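`MigrateActionSet.run` recovers the rawset name and its parent action-set directory by splitting the source path on `/` and polling the two trailing segments off the end. A self-contained sketch of that step, using a plain `Deque` instead of Guava's `Splitter`/`Lists` (class name and example path are illustrative only):

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;

// Sketch of how MigrateActionSet derives the rawset name and the action
// set directory from a source path: split on '/' and pop the two
// trailing segments off the end of the queue.
public class PathParts {
	public static String[] rawsetAndDirectory(String path) {
		Deque<String> pathQ = new ArrayDeque<>(Arrays.asList(path.split("/")));
		String rawSet = pathQ.pollLast(); // last segment, e.g. "rawset_123"
		String actionSetDirectory = pathQ.pollLast(); // its parent directory
		return new String[] { actionSetDirectory, rawSet };
	}
}
```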
File diff suppressed because it is too large
@@ -1,23 +1,14 @@

package eu.dnetlib.dhp.actionmanager.migration;

import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;

import java.io.IOException;
import java.io.Serializable;
import java.util.LinkedList;
import java.util.Objects;
import java.util.Optional;

import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.hadoop.fs.FileSystem;
@@ -29,136 +20,153 @@ import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.base.Splitter;
import com.google.common.collect.Lists;
import com.google.protobuf.InvalidProtocolBufferException;

import eu.dnetlib.data.proto.OafProtos;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.schema.action.AtomicAction;
import eu.dnetlib.dhp.schema.oaf.*;
import eu.dnetlib.dhp.utils.ISLookupClientFactory;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
import scala.Tuple2;

public class TransformActions implements Serializable {

	private static final Logger log = LoggerFactory.getLogger(TransformActions.class);

	private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

	private static final String SEPARATOR = "/";

	public static void main(String[] args) throws Exception {
		final ArgumentApplicationParser parser = new ArgumentApplicationParser(
			IOUtils
				.toString(
					MigrateActionSet.class
						.getResourceAsStream(
							"/eu/dnetlib/dhp/actionmanager/migration/transform_actionsets_parameters.json")));
		parser.parseArgument(args);

		Boolean isSparkSessionManaged = Optional
			.ofNullable(parser.get("isSparkSessionManaged"))
			.map(Boolean::valueOf)
			.orElse(Boolean.TRUE);
		log.info("isSparkSessionManaged: {}", isSparkSessionManaged);

		final String isLookupUrl = parser.get("isLookupUrl");
		log.info("isLookupUrl: {}", isLookupUrl);

		final String inputPaths = parser.get("inputPaths");

		if (StringUtils.isBlank(inputPaths)) {
			throw new RuntimeException("empty inputPaths");
		}
		log.info("inputPaths: {}", inputPaths);

		final String targetBaseDir = getTargetBaseDir(isLookupUrl);

		SparkConf conf = new SparkConf();

		runWithSparkSession(
			conf, isSparkSessionManaged, spark -> transformActions(inputPaths, targetBaseDir, spark));
	}

	private static void transformActions(String inputPaths, String targetBaseDir, SparkSession spark)
		throws IOException {
		final JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());
		final FileSystem fs = FileSystem.get(spark.sparkContext().hadoopConfiguration());

		for (String sourcePath : Lists.newArrayList(Splitter.on(",").split(inputPaths))) {

			LinkedList<String> pathQ = Lists.newLinkedList(Splitter.on(SEPARATOR).split(sourcePath));

			final String rawset = pathQ.pollLast();
			final String actionSetDirectory = pathQ.pollLast();

			final Path targetDirectory = new Path(targetBaseDir + SEPARATOR + actionSetDirectory + SEPARATOR + rawset);

			if (fs.exists(targetDirectory)) {
				log.info("found target directory '{}", targetDirectory);
				fs.delete(targetDirectory, true);
				log.info("deleted target directory '{}", targetDirectory);
			}

			log.info("transforming actions from '{}' to '{}'", sourcePath, targetDirectory);

			sc
				.sequenceFile(sourcePath, Text.class, Text.class)
				.map(a -> eu.dnetlib.actionmanager.actions.AtomicAction.fromJSON(a._2().toString()))
				.map(TransformActions::doTransform)
				.filter(Objects::nonNull)
				.mapToPair(
					a -> new Tuple2<>(a.getClazz().toString(), OBJECT_MAPPER.writeValueAsString(a)))
				.mapToPair(t -> new Tuple2(new Text(t._1()), new Text(t._2())))
				.saveAsNewAPIHadoopFile(
					targetDirectory.toString(),
					Text.class,
					Text.class,
					SequenceFileOutputFormat.class,
					sc.hadoopConfiguration());
		}
	}

	private static AtomicAction doTransform(eu.dnetlib.actionmanager.actions.AtomicAction aa)
		throws InvalidProtocolBufferException {

		// dedup similarity relations had empty target value, don't migrate them
		if (aa.getTargetValue().length == 0) {
			return null;
		}
		final OafProtos.Oaf proto_oaf = OafProtos.Oaf.parseFrom(aa.getTargetValue());
		final Oaf oaf = ProtoConverter.convert(proto_oaf);
		switch (proto_oaf.getKind()) {
			case entity:
				switch (proto_oaf.getEntity().getType()) {
					case datasource:
						return new AtomicAction<>(Datasource.class, (Datasource) oaf);
					case organization:
						return new AtomicAction<>(Organization.class, (Organization) oaf);
					case project:
						return new AtomicAction<>(Project.class, (Project) oaf);
					case result:
						final String resulttypeid = proto_oaf
							.getEntity()
							.getResult()
							.getMetadata()
							.getResulttype()
							.getClassid();
						switch (resulttypeid) {
							case "publication":
								return new AtomicAction<>(Publication.class, (Publication) oaf);
							case "software":
								return new AtomicAction<>(Software.class, (Software) oaf);
							case "other":
								return new AtomicAction<>(OtherResearchProduct.class, (OtherResearchProduct) oaf);
							case "dataset":
								return new AtomicAction<>(Dataset.class, (Dataset) oaf);
							default:
								// can be an update, where the resulttype is not specified
								return new AtomicAction<>(Result.class, (Result) oaf);
						}
					default:
						throw new IllegalArgumentException(
							"invalid entity type: " + proto_oaf.getEntity().getType());
				}
			case relation:
				return new AtomicAction<>(Relation.class, (Relation) oaf);
			default:
				throw new IllegalArgumentException("invalid kind: " + proto_oaf.getKind());
		}
	}

	private static String getTargetBaseDir(String isLookupUrl) throws ISLookUpException {
		ISLookUpService isLookUp = ISLookupClientFactory.getLookUpService(isLookupUrl);
		String XQUERY = "collection('/db/DRIVER/ServiceResources/ActionManagerServiceResourceType')//SERVICE_PROPERTIES/PROPERTY[@key = 'basePath']/@value/string()";
		return isLookUp.getResourceProfileByQuery(XQUERY);
	}
}

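`TransformActions.doTransform` dispatches each decoded payload to a typed `AtomicAction` based on its kind and, for results, on the `resulttypeid` classid, falling back to the generic `Result` when the resulttype is unspecified. A reduced sketch of that dispatch over plain strings (the real code wraps the converted Oaf entity; the class and return values here are illustrative only):

```java
// Sketch of the resulttype dispatch in doTransform: map a resulttypeid
// classid onto the name of the target payload class, with the generic
// "Result" as the fallback for updates where the resulttype is not set.
public class ResultTypeDispatch {
	public static String targetClassFor(String resulttypeid) {
		switch (resulttypeid) {
			case "publication":
				return "Publication";
			case "software":
				return "Software";
			case "other":
				return "OtherResearchProduct";
			case "dataset":
				return "Dataset";
			default:
				// can be an update, where the resulttype is not specified
				return "Result";
		}
	}
}
```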
@@ -1,15 +1,13 @@

package eu.dnetlib.dhp.actionmanager.partition;

import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import static org.apache.spark.sql.functions.*;

import java.util.Arrays;
import java.util.List;
import java.util.Optional;

import org.apache.commons.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.spark.SparkConf;

@@ -20,117 +18,127 @@ import org.apache.spark.sql.types.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import eu.dnetlib.dhp.actionmanager.ISClient;
import eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJob;
import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.HdfsSupport;

/** Partitions given set of action sets by payload type. */
public class PartitionActionSetsByPayloadTypeJob {

	private static final Logger logger = LoggerFactory.getLogger(PartitionActionSetsByPayloadTypeJob.class);

	private static final StructType KV_SCHEMA = StructType$.MODULE$
		.apply(
			Arrays
				.asList(
					StructField$.MODULE$.apply("key", DataTypes.StringType, false, Metadata.empty()),
					StructField$.MODULE$.apply("value", DataTypes.StringType, false, Metadata.empty())));

	private static final StructType ATOMIC_ACTION_SCHEMA = StructType$.MODULE$
		.apply(
			Arrays
				.asList(
					StructField$.MODULE$.apply("clazz", DataTypes.StringType, false, Metadata.empty()),
					StructField$.MODULE$
						.apply(
							"payload", DataTypes.StringType, false, Metadata.empty())));

	private ISClient isClient;

	public PartitionActionSetsByPayloadTypeJob(String isLookupUrl) {
		this.isClient = new ISClient(isLookupUrl);
	}

	public PartitionActionSetsByPayloadTypeJob() {
	}

	public static void main(String[] args) throws Exception {
		String jsonConfiguration = IOUtils
			.toString(
				PromoteActionPayloadForGraphTableJob.class
					.getResourceAsStream(
						"/eu/dnetlib/dhp/actionmanager/partition/partition_action_sets_by_payload_type_input_parameters.json"));
		final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
		parser.parseArgument(args);

		Boolean isSparkSessionManaged = Optional
			.ofNullable(parser.get("isSparkSessionManaged"))
			.map(Boolean::valueOf)
			.orElse(Boolean.TRUE);
		logger.info("isSparkSessionManaged: {}", isSparkSessionManaged);

		String inputActionSetIds = parser.get("inputActionSetIds");
		logger.info("inputActionSetIds: {}", inputActionSetIds);

		String outputPath = parser.get("outputPath");
		logger.info("outputPath: {}", outputPath);

		String isLookupUrl = parser.get("isLookupUrl");
		logger.info("isLookupUrl: {}", isLookupUrl);

		new PartitionActionSetsByPayloadTypeJob(isLookupUrl)
			.run(isSparkSessionManaged, inputActionSetIds, outputPath);
	}

	protected void run(Boolean isSparkSessionManaged, String inputActionSetIds, String outputPath) {
		List<String> inputActionSetPaths = getIsClient().getLatestRawsetPaths(inputActionSetIds);
		logger.info("inputActionSetPaths: {}", String.join(",", inputActionSetPaths));

		SparkConf conf = new SparkConf();
		conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");

		runWithSparkSession(
			conf,
			isSparkSessionManaged,
			spark -> {
				removeOutputDir(spark, outputPath);
				readAndWriteActionSetsFromPaths(spark, inputActionSetPaths, outputPath);
			});
	}

	private static void removeOutputDir(SparkSession spark, String path) {
		HdfsSupport.remove(path, spark.sparkContext().hadoopConfiguration());
	}

	private static void readAndWriteActionSetsFromPaths(
		SparkSession spark, List<String> inputActionSetPaths, String outputPath) {
		inputActionSetPaths
			.stream()
			.filter(path -> HdfsSupport.exists(path, spark.sparkContext().hadoopConfiguration()))
			.forEach(
				inputActionSetPath -> {
					Dataset<Row> actionDS = readActionSetFromPath(spark, inputActionSetPath);
					saveActions(actionDS, outputPath);
				});
	}

	private static Dataset<Row> readActionSetFromPath(SparkSession spark, String path) {
		logger.info("Reading actions from path: {}", path);

		JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());

		JavaRDD<Row> rdd = sc
			.sequenceFile(path, Text.class, Text.class)
			.map(x -> RowFactory.create(x._1().toString(), x._2().toString()));

		return spark
			.createDataFrame(rdd, KV_SCHEMA)
			.withColumn("atomic_action", from_json(col("value"), ATOMIC_ACTION_SCHEMA))
			.select(expr("atomic_action.*"));
	}

	private static void saveActions(Dataset<Row> actionDS, String path) {
		logger.info("Saving actions to path: {}", path);
		actionDS.write().partitionBy("clazz").mode(SaveMode.Append).parquet(path);
	}

	public ISClient getIsClient() {
		return isClient;
	}

	public void setIsClient(ISClient isClient) {
		this.isClient = isClient;
	}
}

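The jobs in this commit all resolve the optional `isSparkSessionManaged` flag with the same `Optional.ofNullable(...).map(Boolean::valueOf).orElse(Boolean.TRUE)` chain. As a minimal, self-contained sketch of that pattern (the class and method names here are illustrative, not part of the repository):

```java
import java.util.Optional;

class FlagParsingSketch {

	// Mirrors the flag resolution used by the jobs: a missing (null) argument
	// defaults to TRUE, any present value is parsed with Boolean.valueOf.
	static Boolean isSparkSessionManaged(String rawValue) {
		return Optional
			.ofNullable(rawValue)
			.map(Boolean::valueOf)
			.orElse(Boolean.TRUE);
	}

	public static void main(String[] args) {
		System.out.println(isSparkSessionManaged(null)); // true: absent flag means managed session
		System.out.println(isSparkSessionManaged("false")); // false
	}
}
```

Note that `Boolean.valueOf` treats anything other than a case-insensitive `"true"` as `false`, so a malformed flag value silently disables the managed session rather than failing.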
@@ -1,82 +1,87 @@

package eu.dnetlib.dhp.actionmanager.promote;

import static eu.dnetlib.dhp.schema.common.ModelSupport.isSubClass;

import java.util.function.BiFunction;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import eu.dnetlib.dhp.schema.oaf.OafEntity;
import eu.dnetlib.dhp.schema.oaf.Relation;

/** OAF model merging support. */
public class MergeAndGet {

	private MergeAndGet() {
	}

	/**
	 * Strategy for merging OAF model objects.
	 * <p>
	 * MERGE_FROM_AND_GET: use OAF 'mergeFrom' method SELECT_NEWER_AND_GET: use last update timestamp to return newer
	 * instance
	 */
	public enum Strategy {
		MERGE_FROM_AND_GET, SELECT_NEWER_AND_GET
	}

	/**
	 * Returns a function for merging OAF model objects.
	 *
	 * @param strategy Strategy to be used to merge objects
	 * @param <G> Graph table type
	 * @param <A> Action payload type
	 * @return BiFunction to be used to merge OAF objects
	 */
	public static <G extends Oaf, A extends Oaf> SerializableSupplier<BiFunction<G, A, G>> functionFor(
		Strategy strategy) {
		switch (strategy) {
			case MERGE_FROM_AND_GET:
				return () -> MergeAndGet::mergeFromAndGet;
			case SELECT_NEWER_AND_GET:
				return () -> MergeAndGet::selectNewerAndGet;
		}
		throw new RuntimeException();
	}

	private static <G extends Oaf, A extends Oaf> G mergeFromAndGet(G x, A y) {
		if (isSubClass(x, Relation.class) && isSubClass(y, Relation.class)) {
			((Relation) x).mergeFrom((Relation) y);
			return x;
		} else if (isSubClass(x, OafEntity.class)
			&& isSubClass(y, OafEntity.class)
			&& isSubClass(x, y)) {
			((OafEntity) x).mergeFrom((OafEntity) y);
			return x;
		}
		throw new RuntimeException(
			String
				.format(
					"MERGE_FROM_AND_GET incompatible types: %s, %s",
					x.getClass().getCanonicalName(), y.getClass().getCanonicalName()));
	}

	private static <G extends Oaf, A extends Oaf> G selectNewerAndGet(G x, A y) {
		if (x.getClass().equals(y.getClass())
			&& x.getLastupdatetimestamp() > y.getLastupdatetimestamp()) {
			return x;
		} else if (x.getClass().equals(y.getClass())
			&& x.getLastupdatetimestamp() < y.getLastupdatetimestamp()) {
			return (G) y;
		} else if (isSubClass(x, y) && x.getLastupdatetimestamp() > y.getLastupdatetimestamp()) {
			return x;
		} else if (isSubClass(x, y) && x.getLastupdatetimestamp() < y.getLastupdatetimestamp()) {
			throw new RuntimeException(
				String
					.format(
						"SELECT_NEWER_AND_GET cannot return right type when it is not the same as left type: %s, %s",
						x.getClass().getCanonicalName(), y.getClass().getCanonicalName()));
		}
		throw new RuntimeException(
			String
				.format(
					"SELECT_NEWER_AND_GET cannot be used when left is not subtype of right: %s, %s",
					x.getClass().getCanonicalName(), y.getClass().getCanonicalName()));
	}
}

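The core idea of `MergeAndGet.functionFor` is a switch over an enum that hands back a merge `BiFunction`. A stripped-down, self-contained sketch of that dispatch pattern (using a hypothetical `Rec` record with just a timestamp in place of the real Oaf model, and eliding the type-compatibility checks and tie handling of the real code):

```java
import java.util.function.BiFunction;

class MergeStrategySketch {

	// Minimal stand-in for an Oaf record: only a last-update timestamp.
	static final class Rec {
		final String id;
		final long lastUpdate;

		Rec(String id, long lastUpdate) {
			this.id = id;
			this.lastUpdate = lastUpdate;
		}
	}

	enum Strategy {
		MERGE_FROM_AND_GET, SELECT_NEWER_AND_GET
	}

	// Mirrors the shape of MergeAndGet.functionFor: a switch over the
	// strategy returning the merge function to apply to a pair of records.
	static BiFunction<Rec, Rec, Rec> functionFor(Strategy strategy) {
		switch (strategy) {
			case MERGE_FROM_AND_GET:
				// The real code mutates and returns the left (graph) object;
				// here we simply keep the left record as a placeholder.
				return (x, y) -> x;
			case SELECT_NEWER_AND_GET:
				return (x, y) -> x.lastUpdate >= y.lastUpdate ? x : y;
		}
		throw new IllegalArgumentException("unknown strategy: " + strategy);
	}

	public static void main(String[] args) {
		Rec older = new Rec("r1", 100L);
		Rec newer = new Rec("r1", 200L);
		System.out.println(functionFor(Strategy.SELECT_NEWER_AND_GET).apply(older, newer).lastUpdate); // prints 200
	}
}
```

Returning a function rather than merging eagerly is what lets the promote job plug the chosen strategy into Spark as a serializable supplier.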
@@ -1,18 +1,14 @@

package eu.dnetlib.dhp.actionmanager.promote;

import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
import static eu.dnetlib.dhp.schema.common.ModelSupport.isSubClass;

import java.util.Objects;
import java.util.Optional;
import java.util.function.BiFunction;
import java.util.function.Function;

import org.apache.commons.io.IOUtils;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.MapFunction;

@@ -23,204 +19,207 @@ import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
import eu.dnetlib.dhp.common.HdfsSupport;
import eu.dnetlib.dhp.schema.common.ModelSupport;
import eu.dnetlib.dhp.schema.oaf.*;

/** Applies a given action payload file to graph table of compatible type. */
public class PromoteActionPayloadForGraphTableJob {
	private static final Logger logger = LoggerFactory.getLogger(PromoteActionPayloadForGraphTableJob.class);

	private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

	public static void main(String[] args) throws Exception {
		String jsonConfiguration = IOUtils
			.toString(
				PromoteActionPayloadForGraphTableJob.class
					.getResourceAsStream(
						"/eu/dnetlib/dhp/actionmanager/promote/promote_action_payload_for_graph_table_input_parameters.json"));
		final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
		parser.parseArgument(args);

		Boolean isSparkSessionManaged = Optional
			.ofNullable(parser.get("isSparkSessionManaged"))
			.map(Boolean::valueOf)
			.orElse(Boolean.TRUE);
		logger.info("isSparkSessionManaged: {}", isSparkSessionManaged);

		String inputGraphTablePath = parser.get("inputGraphTablePath");
		logger.info("inputGraphTablePath: {}", inputGraphTablePath);

		String graphTableClassName = parser.get("graphTableClassName");
		logger.info("graphTableClassName: {}", graphTableClassName);

		String inputActionPayloadPath = parser.get("inputActionPayloadPath");
		logger.info("inputActionPayloadPath: {}", inputActionPayloadPath);

		String actionPayloadClassName = parser.get("actionPayloadClassName");
		logger.info("actionPayloadClassName: {}", actionPayloadClassName);

		String outputGraphTablePath = parser.get("outputGraphTablePath");
		logger.info("outputGraphTablePath: {}", outputGraphTablePath);

		MergeAndGet.Strategy strategy = MergeAndGet.Strategy.valueOf(parser.get("mergeAndGetStrategy").toUpperCase());
		logger.info("strategy: {}", strategy);

		Class<? extends Oaf> rowClazz = (Class<? extends Oaf>) Class.forName(graphTableClassName);
		Class<? extends Oaf> actionPayloadClazz = (Class<? extends Oaf>) Class.forName(actionPayloadClassName);

		throwIfGraphTableClassIsNotSubClassOfActionPayloadClass(rowClazz, actionPayloadClazz);

		SparkConf conf = new SparkConf();
		conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
		conf.registerKryoClasses(ModelSupport.getOafModelClasses());

		runWithSparkSession(
			conf,
			isSparkSessionManaged,
			spark -> {
				removeOutputDir(spark, outputGraphTablePath);
				promoteActionPayloadForGraphTable(
					spark,
					inputGraphTablePath,
					inputActionPayloadPath,
					outputGraphTablePath,
					strategy,
					rowClazz,
					actionPayloadClazz);
			});
	}

	private static void throwIfGraphTableClassIsNotSubClassOfActionPayloadClass(
		Class<? extends Oaf> rowClazz, Class<? extends Oaf> actionPayloadClazz) {
		if (!isSubClass(rowClazz, actionPayloadClazz)) {
			String msg = String
				.format(
					"graph table class is not a subclass of action payload class: graph=%s, action=%s",
					rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());
			throw new RuntimeException(msg);
		}
	}

	private static void removeOutputDir(SparkSession spark, String path) {
		HdfsSupport.remove(path, spark.sparkContext().hadoopConfiguration());
	}

	private static <G extends Oaf, A extends Oaf> void promoteActionPayloadForGraphTable(
		SparkSession spark,
		String inputGraphTablePath,
		String inputActionPayloadPath,
		String outputGraphTablePath,
		MergeAndGet.Strategy strategy,
		Class<G> rowClazz,
		Class<A> actionPayloadClazz) {
		Dataset<G> rowDS = readGraphTable(spark, inputGraphTablePath, rowClazz);
		Dataset<A> actionPayloadDS = readActionPayload(spark, inputActionPayloadPath, actionPayloadClazz);

		Dataset<G> result = promoteActionPayloadForGraphTable(
			rowDS, actionPayloadDS, strategy, rowClazz, actionPayloadClazz)
				.map((MapFunction<G, G>) value -> value, Encoders.bean(rowClazz));

		saveGraphTable(result, outputGraphTablePath);
	}

	private static <G extends Oaf> Dataset<G> readGraphTable(
		SparkSession spark, String path, Class<G> rowClazz) {
		logger.info("Reading graph table from path: {}", path);

		return spark
			.read()
			.textFile(path)
			.map(
				(MapFunction<String, G>) value -> OBJECT_MAPPER.readValue(value, rowClazz),
				Encoders.bean(rowClazz));

		/*
		 * return spark .read() .parquet(path) .as(Encoders.bean(rowClazz));
		 */
	}

	private static <A extends Oaf> Dataset<A> readActionPayload(
		SparkSession spark, String path, Class<A> actionPayloadClazz) {
		logger.info("Reading action payload from path: {}", path);
		return spark
			.read()
			.parquet(path)
			.map(
				(MapFunction<Row, A>) value -> OBJECT_MAPPER
					.readValue(value.<String> getAs("payload"), actionPayloadClazz),
				Encoders.bean(actionPayloadClazz));
	}

	private static <G extends Oaf, A extends Oaf> Dataset<G> promoteActionPayloadForGraphTable(
		Dataset<G> rowDS,
		Dataset<A> actionPayloadDS,
		MergeAndGet.Strategy strategy,
		Class<G> rowClazz,
		Class<A> actionPayloadClazz) {
		logger
			.info(
				"Promoting action payload for graph table: payload={}, table={}",
				actionPayloadClazz.getSimpleName(),
				rowClazz.getSimpleName());

		SerializableSupplier<Function<G, String>> rowIdFn = ModelSupport::idFn;
		SerializableSupplier<Function<A, String>> actionPayloadIdFn = ModelSupport::idFn;
		SerializableSupplier<BiFunction<G, A, G>> mergeRowWithActionPayloadAndGetFn = MergeAndGet.functionFor(strategy);
		SerializableSupplier<BiFunction<G, G, G>> mergeRowsAndGetFn = MergeAndGet.functionFor(strategy);
		SerializableSupplier<G> zeroFn = zeroFn(rowClazz);
		SerializableSupplier<Function<G, Boolean>> isNotZeroFn = PromoteActionPayloadForGraphTableJob::isNotZeroFnUsingIdOrSource;

		Dataset<G> joinedAndMerged = PromoteActionPayloadFunctions
			.joinGraphTableWithActionPayloadAndMerge(
				rowDS,
				actionPayloadDS,
				rowIdFn,
				actionPayloadIdFn,
				mergeRowWithActionPayloadAndGetFn,
				rowClazz,
				actionPayloadClazz);

		return PromoteActionPayloadFunctions
			.groupGraphTableByIdAndMerge(
				joinedAndMerged, rowIdFn, mergeRowsAndGetFn, zeroFn, isNotZeroFn, rowClazz);
	}

	private static <T extends Oaf> SerializableSupplier<T> zeroFn(Class<T> clazz) {
		switch (clazz.getCanonicalName()) {
			case "eu.dnetlib.dhp.schema.oaf.Dataset":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Dataset());
			case "eu.dnetlib.dhp.schema.oaf.Datasource":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Datasource());
			case "eu.dnetlib.dhp.schema.oaf.Organization":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Organization());
			case "eu.dnetlib.dhp.schema.oaf.OtherResearchProduct":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.OtherResearchProduct());
			case "eu.dnetlib.dhp.schema.oaf.Project":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Project());
			case "eu.dnetlib.dhp.schema.oaf.Publication":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Publication());
			case "eu.dnetlib.dhp.schema.oaf.Relation":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Relation());
			case "eu.dnetlib.dhp.schema.oaf.Software":
				return () -> clazz.cast(new eu.dnetlib.dhp.schema.oaf.Software());
			default:
				throw new RuntimeException("unknown class: " + clazz.getCanonicalName());
		}
	}

	private static <T extends Oaf> Function<T, Boolean> isNotZeroFnUsingIdOrSource() {
		return t -> {
			if (isSubClass(t, Relation.class)) {
||||||
return Objects.nonNull(((Relation) t).getSource());
|
return Objects.nonNull(((Relation) t).getSource());
|
||||||
}
|
}
|
||||||
return Objects.nonNull(((OafEntity) t).getId());
|
return Objects.nonNull(((OafEntity) t).getId());
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
private static <G extends Oaf> void saveGraphTable(Dataset<G> result, String path) {
|
private static <G extends Oaf> void saveGraphTable(Dataset<G> result, String path) {
|
||||||
logger.info("Saving graph table to path: {}", path);
|
logger.info("Saving graph table to path: {}", path);
|
||||||
result.toJSON().write().option("compression", "gzip").text(path);
|
result.toJSON().write().option("compression", "gzip").text(path);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
|
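The `zeroFn`/`isNotZeroFnUsingIdOrSource` pair above gives the aggregation an identity element (a freshly constructed entity with no id) and a way to detect it. The following is a minimal plain-Java sketch of that idea; `Entity` and `ZeroFnSketch` are illustrative names, and the reflective constructor call stands in for the explicit per-class `switch` used in the real `zeroFn`.

```java
import java.util.Objects;
import java.util.function.Function;
import java.util.function.Supplier;

public class ZeroFnSketch {

	// Illustrative stand-in for the Oaf entity hierarchy.
	static class Entity {
		String id; // null in a freshly created "zero" instance

		Entity() {
		}

		Entity(String id) {
			this.id = id;
		}
	}

	// Mirrors zeroFn: supply an empty ("zero") instance for a given class.
	// Reflection replaces the explicit switch of the real code, for brevity.
	static <T extends Entity> Supplier<T> zeroFn(Class<T> clazz) {
		return () -> {
			try {
				return clazz.getDeclaredConstructor().newInstance();
			} catch (ReflectiveOperationException e) {
				throw new RuntimeException("unknown class: " + clazz.getCanonicalName(), e);
			}
		};
	}

	// Mirrors isNotZeroFnUsingIdOrSource: a row is "not zero" when its id is set.
	static Function<Entity, Boolean> isNotZeroFn() {
		return e -> Objects.nonNull(e.id);
	}

	public static void main(String[] args) {
		Entity zero = zeroFn(Entity.class).get();
		System.out.println(isNotZeroFn().apply(zero)); // false
		System.out.println(isNotZeroFn().apply(new Entity("id-1"))); // true
	}
}
```

The zero instance only exists to satisfy the `Aggregator` contract; anything it would contribute is filtered out by the non-zero check during the merge.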
@@ -1,13 +1,13 @@

package eu.dnetlib.dhp.actionmanager.promote;

import static eu.dnetlib.dhp.schema.common.ModelSupport.isSubClass;

import java.util.Objects;
import java.util.Optional;
import java.util.function.BiFunction;
import java.util.function.Function;

import org.apache.spark.api.java.function.FilterFunction;
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
@@ -15,171 +15,170 @@ import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.TypedColumn;
import org.apache.spark.sql.expressions.Aggregator;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import scala.Tuple2;

/** Promote action payload functions. */
public class PromoteActionPayloadFunctions {

	private PromoteActionPayloadFunctions() {
	}

	/**
	 * Joins dataset representing graph table with dataset representing action payload using supplied functions.
	 *
	 * @param rowDS Dataset representing graph table
	 * @param actionPayloadDS Dataset representing action payload
	 * @param rowIdFn Function used to get the id of graph table row
	 * @param actionPayloadIdFn Function used to get id of action payload instance
	 * @param mergeAndGetFn Function used to merge graph table row and action payload instance
	 * @param rowClazz Class of graph table
	 * @param actionPayloadClazz Class of action payload
	 * @param <G> Type of graph table row
	 * @param <A> Type of action payload instance
	 * @return Dataset of merged graph table rows and action payload instances
	 */
	public static <G extends Oaf, A extends Oaf> Dataset<G> joinGraphTableWithActionPayloadAndMerge(
		Dataset<G> rowDS,
		Dataset<A> actionPayloadDS,
		SerializableSupplier<Function<G, String>> rowIdFn,
		SerializableSupplier<Function<A, String>> actionPayloadIdFn,
		SerializableSupplier<BiFunction<G, A, G>> mergeAndGetFn,
		Class<G> rowClazz,
		Class<A> actionPayloadClazz) {
		if (!isSubClass(rowClazz, actionPayloadClazz)) {
			throw new RuntimeException(
				"action payload type must be the same or be a super type of table row type");
		}

		Dataset<Tuple2<String, G>> rowWithIdDS = mapToTupleWithId(rowDS, rowIdFn, rowClazz);
		Dataset<Tuple2<String, A>> actionPayloadWithIdDS = mapToTupleWithId(
			actionPayloadDS, actionPayloadIdFn, actionPayloadClazz);

		return rowWithIdDS
			.joinWith(
				actionPayloadWithIdDS,
				rowWithIdDS.col("_1").equalTo(actionPayloadWithIdDS.col("_1")),
				"full_outer")
			.map(
				(MapFunction<Tuple2<Tuple2<String, G>, Tuple2<String, A>>, G>) value -> {
					Optional<G> rowOpt = Optional.ofNullable(value._1()).map(Tuple2::_2);
					Optional<A> actionPayloadOpt = Optional.ofNullable(value._2()).map(Tuple2::_2);
					return rowOpt
						.map(
							row -> actionPayloadOpt
								.map(
									actionPayload -> mergeAndGetFn.get().apply(row, actionPayload))
								.orElse(row))
						.orElseGet(
							() -> actionPayloadOpt
								.filter(
									actionPayload -> actionPayload.getClass().equals(rowClazz))
								.map(rowClazz::cast)
								.orElse(null));
				},
				Encoders.kryo(rowClazz))
			.filter((FilterFunction<G>) Objects::nonNull);
	}

	private static <T extends Oaf> Dataset<Tuple2<String, T>> mapToTupleWithId(
		Dataset<T> ds, SerializableSupplier<Function<T, String>> idFn, Class<T> clazz) {
		return ds
			.map(
				(MapFunction<T, Tuple2<String, T>>) value -> new Tuple2<>(idFn.get().apply(value), value),
				Encoders.tuple(Encoders.STRING(), Encoders.kryo(clazz)));
	}

	/**
	 * Groups graph table by id and aggregates using supplied functions.
	 *
	 * @param rowDS Dataset representing graph table
	 * @param rowIdFn Function used to get the id of graph table row
	 * @param mergeAndGetFn Function used to merge graph table rows
	 * @param zeroFn Function to create a zero/empty instance of graph table row
	 * @param isNotZeroFn Function to check if graph table row is not zero/empty
	 * @param rowClazz Class of graph table
	 * @param <G> Type of graph table row
	 * @return Dataset of aggregated graph table rows
	 */
	public static <G extends Oaf> Dataset<G> groupGraphTableByIdAndMerge(
		Dataset<G> rowDS,
		SerializableSupplier<Function<G, String>> rowIdFn,
		SerializableSupplier<BiFunction<G, G, G>> mergeAndGetFn,
		SerializableSupplier<G> zeroFn,
		SerializableSupplier<Function<G, Boolean>> isNotZeroFn,
		Class<G> rowClazz) {
		TypedColumn<G, G> aggregator = new TableAggregator<>(zeroFn, mergeAndGetFn, isNotZeroFn, rowClazz).toColumn();
		return rowDS
			.groupByKey((MapFunction<G, String>) x -> rowIdFn.get().apply(x), Encoders.STRING())
			.agg(aggregator)
			.map((MapFunction<Tuple2<String, G>, G>) Tuple2::_2, Encoders.kryo(rowClazz));
	}

	/**
	 * Aggregator to be used for aggregating graph table rows during grouping.
	 *
	 * @param <G> Type of graph table row
	 */
	public static class TableAggregator<G extends Oaf> extends Aggregator<G, G, G> {
		private final SerializableSupplier<G> zeroFn;
		private final SerializableSupplier<BiFunction<G, G, G>> mergeAndGetFn;
		private final SerializableSupplier<Function<G, Boolean>> isNotZeroFn;
		private final Class<G> rowClazz;

		public TableAggregator(
			SerializableSupplier<G> zeroFn,
			SerializableSupplier<BiFunction<G, G, G>> mergeAndGetFn,
			SerializableSupplier<Function<G, Boolean>> isNotZeroFn,
			Class<G> rowClazz) {
			this.zeroFn = zeroFn;
			this.mergeAndGetFn = mergeAndGetFn;
			this.isNotZeroFn = isNotZeroFn;
			this.rowClazz = rowClazz;
		}

		@Override
		public G zero() {
			return zeroFn.get();
		}

		@Override
		public G reduce(G b, G a) {
			return zeroSafeMergeAndGet(b, a);
		}

		@Override
		public G merge(G b1, G b2) {
			return zeroSafeMergeAndGet(b1, b2);
		}

		private G zeroSafeMergeAndGet(G left, G right) {
			Function<G, Boolean> isNotZero = isNotZeroFn.get();
			if (isNotZero.apply(left) && isNotZero.apply(right)) {
				return mergeAndGetFn.get().apply(left, right);
			} else if (isNotZero.apply(left) && !isNotZero.apply(right)) {
				return left;
			} else if (!isNotZero.apply(left) && isNotZero.apply(right)) {
				return right;
			}
			throw new RuntimeException("internal aggregation error: left and right objects are zero");
		}

		@Override
		public G finish(G reduction) {
			return reduction;
		}

		@Override
		public Encoder<G> bufferEncoder() {
			return Encoders.kryo(rowClazz);
		}

		@Override
		public Encoder<G> outputEncoder() {
			return Encoders.kryo(rowClazz);
		}
	}
}
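The heart of `TableAggregator` is `zeroSafeMergeAndGet`: merge only when both sides are real rows, otherwise keep whichever side is not the zero element. Below is a plain-Java sketch of just that decision logic, with the Spark machinery stripped away; the `String`-based usage in `main` is purely illustrative.

```java
import java.util.function.BiFunction;
import java.util.function.Function;

public class ZeroSafeMergeSketch {

	// Same branching as TableAggregator.zeroSafeMergeAndGet, parameterized
	// directly by the merge function and the non-zero predicate.
	static <G> G zeroSafeMergeAndGet(
		G left, G right,
		BiFunction<G, G, G> mergeAndGet,
		Function<G, Boolean> isNotZero) {
		if (isNotZero.apply(left) && isNotZero.apply(right)) {
			return mergeAndGet.apply(left, right); // both real rows: merge them
		} else if (isNotZero.apply(left) && !isNotZero.apply(right)) {
			return left; // right is the zero element
		} else if (!isNotZero.apply(left) && isNotZero.apply(right)) {
			return right; // left is the zero element
		}
		throw new RuntimeException("internal aggregation error: left and right objects are zero");
	}

	public static void main(String[] args) {
		// "" plays the role of the zero instance; merge concatenates.
		Function<String, Boolean> isNotZero = s -> !s.isEmpty();
		BiFunction<String, String, String> merge = (a, b) -> a + "+" + b;

		System.out.println(zeroSafeMergeAndGet("row", "payload", merge, isNotZero)); // row+payload
		System.out.println(zeroSafeMergeAndGet("row", "", merge, isNotZero)); // row
		System.out.println(zeroSafeMergeAndGet("", "payload", merge, isNotZero)); // payload
	}
}
```

Throwing when both sides are zero is deliberate: Spark only feeds the zero element in as a buffer seed, so two zeros meeting indicates a broken aggregation rather than a valid input.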
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.actionmanager.partition;

import static eu.dnetlib.dhp.common.ThrowingSupport.rethrowAsRuntimeException;

@@ -5,16 +6,13 @@ import static org.apache.spark.sql.functions.*;
import static org.junit.jupiter.api.Assertions.assertIterableEquals;
import static scala.collection.JavaConversions.mutableSeqAsJavaList;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.*;
import java.util.stream.Collectors;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
@@ -32,197 +30,212 @@ import org.junit.jupiter.api.io.TempDir;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.junit.jupiter.MockitoExtension;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.actionmanager.ISClient;
import eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest;
import eu.dnetlib.dhp.schema.oaf.*;
import scala.Tuple2;
import scala.collection.mutable.Seq;

@ExtendWith(MockitoExtension.class)
public class PartitionActionSetsByPayloadTypeJobTest {
	private static final ClassLoader cl = PartitionActionSetsByPayloadTypeJobTest.class.getClassLoader();

	private static Configuration configuration;
	private static SparkSession spark;

	private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

	private static final StructType ATOMIC_ACTION_SCHEMA = StructType$.MODULE$
		.apply(
			Arrays
				.asList(
					StructField$.MODULE$.apply("clazz", DataTypes.StringType, false, Metadata.empty()),
					StructField$.MODULE$
						.apply(
							"payload", DataTypes.StringType, false, Metadata.empty())));

	@BeforeAll
	public static void beforeAll() throws IOException {
		configuration = Job.getInstance().getConfiguration();
		SparkConf conf = new SparkConf();
		conf.setAppName(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
		conf.setMaster("local");
		conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
		spark = SparkSession.builder().config(conf).getOrCreate();
	}

	@AfterAll
	public static void afterAll() {
		spark.stop();
	}

	@DisplayName("Job")
	@Nested
	class Main {

		@Mock
		private ISClient isClient;

		@Test
		public void shouldPartitionActionSetsByPayloadType(@TempDir Path workingDir) throws Exception {
			// given
			Path inputActionSetsBaseDir = workingDir.resolve("input").resolve("action_sets");
			Path outputDir = workingDir.resolve("output");

			Map<String, List<String>> oafsByClassName = createActionSets(inputActionSetsBaseDir);

			List<String> inputActionSetsPaths = resolveInputActionSetPaths(inputActionSetsBaseDir);

			// when
			Mockito
				.when(isClient.getLatestRawsetPaths(Mockito.anyString()))
				.thenReturn(inputActionSetsPaths);

			PartitionActionSetsByPayloadTypeJob job = new PartitionActionSetsByPayloadTypeJob();
			job.setIsClient(isClient);
			job
				.run(
					Boolean.FALSE,
					"", // it can be empty, we're mocking the response from isClient to resolve the paths
					outputDir.toString());

			// then
			Files.exists(outputDir);

			assertForOafType(outputDir, oafsByClassName, eu.dnetlib.dhp.schema.oaf.Dataset.class);
			assertForOafType(outputDir, oafsByClassName, Datasource.class);
			assertForOafType(outputDir, oafsByClassName, Organization.class);
			assertForOafType(outputDir, oafsByClassName, OtherResearchProduct.class);
			assertForOafType(outputDir, oafsByClassName, Project.class);
			assertForOafType(outputDir, oafsByClassName, Publication.class);
			assertForOafType(outputDir, oafsByClassName, Result.class);
			assertForOafType(outputDir, oafsByClassName, Relation.class);
			assertForOafType(outputDir, oafsByClassName, Software.class);
		}
	}

	private List<String> resolveInputActionSetPaths(Path inputActionSetsBaseDir) throws IOException {
		Path inputActionSetJsonDumpsDir = getInputActionSetJsonDumpsDir();
		return Files
			.list(inputActionSetJsonDumpsDir)
			.map(
				path -> {
					String inputActionSetId = path.getFileName().toString();
					return inputActionSetsBaseDir.resolve(inputActionSetId).toString();
				})
			.collect(Collectors.toCollection(ArrayList::new));
	}

	private static Map<String, List<String>> createActionSets(Path inputActionSetsDir)
		throws IOException {
		Path inputActionSetJsonDumpsDir = getInputActionSetJsonDumpsDir();

		Map<String, List<String>> oafsByType = new HashMap<>();
		Files
			.list(inputActionSetJsonDumpsDir)
			.forEach(
				inputActionSetJsonDumpFile -> {
					String inputActionSetId = inputActionSetJsonDumpFile.getFileName().toString();
					Path inputActionSetDir = inputActionSetsDir.resolve(inputActionSetId);

					Dataset<String> actionDS = readActionsFromJsonDump(inputActionSetJsonDumpFile.toString()).cache();

					writeActionsAsJobInput(actionDS, inputActionSetId, inputActionSetDir.toString());

					Map<String, List<String>> actionSetOafsByType = actionDS
						.withColumn("atomic_action", from_json(col("value"), ATOMIC_ACTION_SCHEMA))
						.select(expr("atomic_action.*"))
						.groupBy(col("clazz"))
						.agg(collect_list(col("payload")).as("payload_list"))
						.collectAsList()
						.stream()
						.map(
							row -> new AbstractMap.SimpleEntry<>(
								row.<String> getAs("clazz"),
								mutableSeqAsJavaList(row.<Seq<String>> getAs("payload_list"))))
						.collect(
							Collectors
								.toMap(
									AbstractMap.SimpleEntry::getKey, AbstractMap.SimpleEntry::getValue));

					actionSetOafsByType
						.keySet()
						.forEach(
							x -> {
								if (oafsByType.containsKey(x)) {
									List<String> collected = new ArrayList<>();
									collected.addAll(oafsByType.get(x));
									collected.addAll(actionSetOafsByType.get(x));
									oafsByType.put(x, collected);
								} else {
									oafsByType.put(x, actionSetOafsByType.get(x));
								}
							});
				});

		return oafsByType;
	}

	private static Path getInputActionSetJsonDumpsDir() {
		return Paths
			.get(
				Objects
					.requireNonNull(cl.getResource("eu/dnetlib/dhp/actionmanager/partition/input/"))
					.getFile());
	}

	private static Dataset<String> readActionsFromJsonDump(String path) {
		return spark.read().textFile(path);
	}

	private static void writeActionsAsJobInput(
		Dataset<String> actionDS, String inputActionSetId, String path) {
		actionDS
			.javaRDD()
			.mapToPair(json -> new Tuple2<>(new Text(inputActionSetId), new Text(json)))
			.saveAsNewAPIHadoopFile(
				path, Text.class, Text.class, SequenceFileOutputFormat.class, configuration);
	}

	private static <T extends Oaf> void assertForOafType(
		Path outputDir, Map<String, List<String>> oafsByClassName, Class<T> clazz) {
		Path outputDatasetDir = outputDir.resolve(String.format("clazz=%s", clazz.getCanonicalName()));
		Files.exists(outputDatasetDir);

		List<T> actuals = readActionPayloadFromJobOutput(outputDatasetDir.toString(), clazz).collectAsList();
		actuals.sort(Comparator.comparingInt(Object::hashCode));

		List<T> expecteds = oafsByClassName
			.get(clazz.getCanonicalName())
			.stream()
			.map(json -> mapToOaf(json, clazz))
			.sorted(Comparator.comparingInt(Object::hashCode))
			.collect(Collectors.toList());

		assertIterableEquals(expecteds, actuals);
	}

	private static <T extends Oaf> Dataset<T> readActionPayloadFromJobOutput(
		String path, Class<T> clazz) {
		return spark
			.read()
			.parquet(path)
			.map(
				(MapFunction<Row, T>) value -> OBJECT_MAPPER.readValue(value.<String> getAs("payload"), clazz),
				Encoders.bean(clazz));
	}

	private static <T extends Oaf> T mapToOaf(String json, Class<T> clazz) {
		return rethrowAsRuntimeException(
			() -> OBJECT_MAPPER.readValue(json, clazz),
			String
				.format(
					"failed to map json to class: json=%s, class=%s", json, clazz.getCanonicalName()));
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.actionmanager.promote;

import static eu.dnetlib.dhp.actionmanager.promote.MergeAndGet.Strategy;
@@ -5,254 +6,252 @@ import static eu.dnetlib.dhp.actionmanager.promote.MergeAndGet.functionFor;
import static org.junit.jupiter.api.Assertions.*;
import static org.mockito.Mockito.*;

import java.util.function.BiFunction;

import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
import eu.dnetlib.dhp.schema.oaf.*;

public class MergeAndGetTest {

	@Nested
	class MergeFromAndGetStrategy {

		@Test
		public void shouldThrowForOafAndOaf() {
			// given
			Oaf a = mock(Oaf.class);
			Oaf b = mock(Oaf.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForOafAndRelation() {
			// given
			Oaf a = mock(Oaf.class);
			Relation b = mock(Relation.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForOafAndOafEntity() {
			// given
			Oaf a = mock(Oaf.class);
			OafEntity b = mock(OafEntity.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForRelationAndOaf() {
			// given
			Relation a = mock(Relation.class);
			Oaf b = mock(Oaf.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForRelationAndOafEntity() {
			// given
			Relation a = mock(Relation.class);
			OafEntity b = mock(OafEntity.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldBehaveProperlyForRelationAndRelation() {
			// given
			Relation a = mock(Relation.class);
			Relation b = mock(Relation.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			Oaf x = fn.get().apply(a, b);
			assertTrue(Relation.class.isAssignableFrom(x.getClass()));
			verify(a).mergeFrom(b);
			assertEquals(a, x);
		}

		@Test
		public void shouldThrowForOafEntityAndOaf() {
			// given
			OafEntity a = mock(OafEntity.class);
			Oaf b = mock(Oaf.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForOafEntityAndRelation() {
			// given
			OafEntity a = mock(OafEntity.class);
			Relation b = mock(Relation.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForOafEntityAndOafEntityButNotSubclasses() {
			// given
			class OafEntitySub1 extends OafEntity {
			}
			class OafEntitySub2 extends OafEntity {
			}

			OafEntitySub1 a = mock(OafEntitySub1.class);
			OafEntitySub2 b = mock(OafEntitySub2.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldBehaveProperlyForOafEntityAndOafEntity() {
			// given
			OafEntity a = mock(OafEntity.class);
			OafEntity b = mock(OafEntity.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.MERGE_FROM_AND_GET);

			// then
			Oaf x = fn.get().apply(a, b);
			assertTrue(OafEntity.class.isAssignableFrom(x.getClass()));
			verify(a).mergeFrom(b);
			assertEquals(a, x);
		}
	}

	@Nested
	class SelectNewerAndGetStrategy {

		@Test
		public void shouldThrowForOafEntityAndRelation() {
			// given
			OafEntity a = mock(OafEntity.class);
			Relation b = mock(Relation.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForRelationAndOafEntity() {
			// given
			Relation a = mock(Relation.class);
			OafEntity b = mock(OafEntity.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowForOafEntityAndResult() {
			// given
			OafEntity a = mock(OafEntity.class);
			Result b = mock(Result.class);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldThrowWhenSuperTypeIsNewerForResultAndOafEntity() {
			// given
			// real types must be used because subclass-superclass resolution does not work for
			// mocks
			Dataset a = new Dataset();
			a.setLastupdatetimestamp(1L);
			Result b = new Result();
			b.setLastupdatetimestamp(2L);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);

			// then
			assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));
		}

		@Test
		public void shouldShouldReturnLeftForOafEntityAndOafEntity() {
			// given
			OafEntity a = mock(OafEntity.class);
			when(a.getLastupdatetimestamp()).thenReturn(1L);
			OafEntity b = mock(OafEntity.class);
			when(b.getLastupdatetimestamp()).thenReturn(2L);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);

			// then
			Oaf x = fn.get().apply(a, b);
			assertTrue(OafEntity.class.isAssignableFrom(x.getClass()));
			assertEquals(b, x);
		}

		@Test
		public void shouldShouldReturnRightForOafEntityAndOafEntity() {
			// given
			OafEntity a = mock(OafEntity.class);
			when(a.getLastupdatetimestamp()).thenReturn(2L);
			OafEntity b = mock(OafEntity.class);
			when(b.getLastupdatetimestamp()).thenReturn(1L);

			// when
			SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);

			// then
			Oaf x = fn.get().apply(a, b);
			assertTrue(OafEntity.class.isAssignableFrom(x.getClass()));
			assertEquals(a, x);
		}
	}
}

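The two strategies exercised by the tests above can be reduced to a small self-contained sketch. The `Oaf` stand-in classes below are hypothetical simplifications, not the real dnet-hadoop types: `MERGE_FROM_AND_GET` rejects operand pairs whose runtime types are incompatible and otherwise keeps the (merged) left operand, while `SELECT_NEWER_AND_GET` keeps whichever operand carries the larger `lastupdatetimestamp`:

```java
import java.util.function.BiFunction;

// Hypothetical, simplified stand-ins for the Oaf hierarchy used by the
// tests; the real dnet-hadoop classes are considerably richer.
class Oaf {
	long lastupdatetimestamp;
}

class OafEntity extends Oaf {
}

class Relation extends Oaf {
}

public class MergeSketch {

	// MERGE_FROM_AND_GET: the left operand absorbs the right one, which only
	// makes sense when the right operand's type is compatible with the left's.
	static BiFunction<Oaf, Oaf, Oaf> mergeFromAndGet() {
		return (x, y) -> {
			if (!x.getClass().isAssignableFrom(y.getClass())) {
				throw new RuntimeException("incompatible types for merge");
			}
			// the real implementation calls x.mergeFrom(y) before returning x
			return x;
		};
	}

	// SELECT_NEWER_AND_GET: keep whichever operand was updated more recently.
	static BiFunction<Oaf, Oaf, Oaf> selectNewerAndGet() {
		return (x, y) -> x.lastupdatetimestamp >= y.lastupdatetimestamp ? x : y;
	}

	public static void main(String[] args) {
		OafEntity a = new OafEntity();
		a.lastupdatetimestamp = 2L;
		OafEntity b = new OafEntity();
		b.lastupdatetimestamp = 1L;
		// a is newer, so SELECT_NEWER_AND_GET keeps it
		System.out.println(selectNewerAndGet().apply(a, b) == a);
	}
}
```

This mirrors why the tests expect a `RuntimeException` for mixed `Relation`/`OafEntity` pairs and an identity result for same-type pairs; the real strategy also rejects a newer supertype, which this sketch omits.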
@ -1,11 +1,9 @@
|
||||||
|
|
||||||
package eu.dnetlib.dhp.actionmanager.promote;
|
package eu.dnetlib.dhp.actionmanager.promote;
|
||||||
|
|
||||||
import static org.junit.jupiter.api.Assertions.*;
|
import static org.junit.jupiter.api.Assertions.*;
|
||||||
import static org.junit.jupiter.params.provider.Arguments.arguments;
|
import static org.junit.jupiter.params.provider.Arguments.arguments;
|
||||||
|
|
||||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
|
||||||
import eu.dnetlib.dhp.schema.common.ModelSupport;
|
|
||||||
import eu.dnetlib.dhp.schema.oaf.*;
|
|
||||||
import java.io.IOException;
|
import java.io.IOException;
|
||||||
import java.nio.file.Files;
|
import java.nio.file.Files;
|
||||||
import java.nio.file.Path;
|
import java.nio.file.Path;
|
||||||
|
@ -15,6 +13,7 @@ import java.util.List;
|
||||||
import java.util.Objects;
|
import java.util.Objects;
|
||||||
import java.util.stream.Collectors;
|
import java.util.stream.Collectors;
|
||||||
import java.util.stream.Stream;
|
import java.util.stream.Stream;
|
||||||
|
|
||||||
import org.apache.commons.io.FileUtils;
|
import org.apache.commons.io.FileUtils;
|
||||||
import org.apache.spark.SparkConf;
|
import org.apache.spark.SparkConf;
|
||||||
import org.apache.spark.api.java.function.MapFunction;
|
import org.apache.spark.api.java.function.MapFunction;
|
||||||
|
@ -26,253 +25,256 @@ import org.junit.jupiter.params.ParameterizedTest;
|
||||||
import org.junit.jupiter.params.provider.Arguments;
|
import org.junit.jupiter.params.provider.Arguments;
|
||||||
import org.junit.jupiter.params.provider.MethodSource;
|
import org.junit.jupiter.params.provider.MethodSource;
|
||||||
|
|
||||||
|
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||||
|
|
||||||
|
import eu.dnetlib.dhp.schema.common.ModelSupport;
|
||||||
|
import eu.dnetlib.dhp.schema.oaf.*;
|
||||||
|
|
||||||
public class PromoteActionPayloadForGraphTableJobTest {
|
public class PromoteActionPayloadForGraphTableJobTest {
|
||||||
private static final ClassLoader cl =
|
private static final ClassLoader cl = PromoteActionPayloadForGraphTableJobTest.class.getClassLoader();
|
||||||
PromoteActionPayloadForGraphTableJobTest.class.getClassLoader();
|
|
||||||
|
|
||||||
private static SparkSession spark;
|
private static SparkSession spark;
|
||||||
|
|
||||||
private Path workingDir;
|
private Path workingDir;
|
||||||
private Path inputDir;
|
private Path inputDir;
|
||||||
private Path inputGraphRootDir;
|
private Path inputGraphRootDir;
|
||||||
private Path inputActionPayloadRootDir;
|
private Path inputActionPayloadRootDir;
|
||||||
private Path outputDir;
|
private Path outputDir;
|
||||||
|
|
||||||
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
|
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
|
||||||
|
|
||||||
@BeforeAll
|
@BeforeAll
|
||||||
public static void beforeAll() {
|
public static void beforeAll() {
|
||||||
SparkConf conf = new SparkConf();
|
SparkConf conf = new SparkConf();
|
||||||
conf.setAppName(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
|
conf.setAppName(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
|
||||||
conf.setMaster("local");
|
conf.setMaster("local");
|
||||||
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
|
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
|
||||||
conf.registerKryoClasses(ModelSupport.getOafModelClasses());
|
conf.registerKryoClasses(ModelSupport.getOafModelClasses());
|
||||||
spark = SparkSession.builder().config(conf).getOrCreate();
|
spark = SparkSession.builder().config(conf).getOrCreate();
|
||||||
}
|
}
|
||||||
|
|
||||||
@BeforeEach
|
@BeforeEach
|
||||||
public void beforeEach() throws IOException {
|
public void beforeEach() throws IOException {
|
||||||
workingDir =
|
workingDir = Files.createTempDirectory(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
|
||||||
Files.createTempDirectory(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
|
inputDir = workingDir.resolve("input");
|
||||||
inputDir = workingDir.resolve("input");
|
inputGraphRootDir = inputDir.resolve("graph");
|
||||||
inputGraphRootDir = inputDir.resolve("graph");
|
inputActionPayloadRootDir = inputDir.resolve("action_payload");
|
||||||
inputActionPayloadRootDir = inputDir.resolve("action_payload");
|
outputDir = workingDir.resolve("output");
|
||||||
outputDir = workingDir.resolve("output");
|
}
|
||||||
}
|
|
||||||
|
|
||||||
@AfterEach
|
@AfterEach
|
||||||
public void afterEach() throws IOException {
|
public void afterEach() throws IOException {
|
||||||
FileUtils.deleteDirectory(workingDir.toFile());
|
FileUtils.deleteDirectory(workingDir.toFile());
|
||||||
}
|
}
|
||||||
|
|
||||||
@AfterAll
|
@AfterAll
|
||||||
public static void afterAll() {
|
public static void afterAll() {
|
||||||
spark.stop();
|
spark.stop();
|
||||||
}
|
}
|
||||||
|
|
||||||
@DisplayName("Job")
|
@DisplayName("Job")
|
||||||
@Nested
|
@Nested
|
||||||
class Main {
|
class Main {
|
||||||
|
|
||||||
@Test
|
@Test
|
||||||
public void shouldThrowWhenGraphTableClassIsNotASubClassOfActionPayloadClass() {
|
public void shouldThrowWhenGraphTableClassIsNotASubClassOfActionPayloadClass() {
|
||||||
// given
|
// given
|
||||||
Class<Relation> rowClazz = Relation.class;
|
Class<Relation> rowClazz = Relation.class;
|
||||||
Class<OafEntity> actionPayloadClazz = OafEntity.class;
|
Class<OafEntity> actionPayloadClazz = OafEntity.class;
|
||||||
|
|
||||||
// when
|
// when
|
||||||
RuntimeException exception =
|
RuntimeException exception = assertThrows(
|
||||||
assertThrows(
|
RuntimeException.class,
|
||||||
RuntimeException.class,
|
() -> PromoteActionPayloadForGraphTableJob
|
||||||
() ->
|
.main(
|
||||||
PromoteActionPayloadForGraphTableJob.main(
|
new String[] {
|
||||||
new String[] {
|
"-isSparkSessionManaged",
|
||||||
"-isSparkSessionManaged",
|
Boolean.FALSE.toString(),
|
||||||
Boolean.FALSE.toString(),
|
"-inputGraphTablePath",
|
||||||
"-inputGraphTablePath",
|
"",
|
||||||
"",
|
"-graphTableClassName",
|
||||||
"-graphTableClassName",
|
rowClazz.getCanonicalName(),
|
||||||
rowClazz.getCanonicalName(),
|
"-inputActionPayloadPath",
|
||||||
"-inputActionPayloadPath",
|
"",
|
||||||
"",
|
"-actionPayloadClassName",
|
||||||
"-actionPayloadClassName",
|
actionPayloadClazz.getCanonicalName(),
|
||||||
actionPayloadClazz.getCanonicalName(),
|
"-outputGraphTablePath",
|
||||||
"-outputGraphTablePath",
|
"",
|
||||||
"",
|
"-mergeAndGetStrategy",
|
||||||
"-mergeAndGetStrategy",
|
MergeAndGet.Strategy.SELECT_NEWER_AND_GET.name()
|
||||||
MergeAndGet.Strategy.SELECT_NEWER_AND_GET.name()
|
}));
|
||||||
}));
|
|
||||||
|
|
||||||
// then
|
// then
|
||||||
String msg =
|
String msg = String
|
||||||
String.format(
|
.format(
|
||||||
"graph table class is not a subclass of action payload class: graph=%s, action=%s",
|
"graph table class is not a subclass of action payload class: graph=%s, action=%s",
|
||||||
rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());
|
rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());
|
||||||
assertTrue(exception.getMessage().contains(msg));
|
assertTrue(exception.getMessage().contains(msg));
|
||||||
}
|
}
|
||||||
|
|
||||||
@ParameterizedTest(name = "strategy: {0}, graph table: {1}, action payload: {2}")
|
@ParameterizedTest(name = "strategy: {0}, graph table: {1}, action payload: {2}")
|
||||||
@MethodSource(
|
@MethodSource("eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest#promoteJobTestParams")
|
||||||
"eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest#promoteJobTestParams")
|
public void shouldPromoteActionPayloadForGraphTable(
|
||||||
public void shouldPromoteActionPayloadForGraphTable(
|
MergeAndGet.Strategy strategy,
|
||||||
MergeAndGet.Strategy strategy,
|
Class<? extends Oaf> rowClazz,
|
||||||
Class<? extends Oaf> rowClazz,
|
Class<? extends Oaf> actionPayloadClazz)
|
||||||
Class<? extends Oaf> actionPayloadClazz)
|
throws Exception {
|
||||||
throws Exception {
|
// given
|
||||||
// given
|
Path inputGraphTableDir = createGraphTable(inputGraphRootDir, rowClazz);
|
||||||
Path inputGraphTableDir = createGraphTable(inputGraphRootDir, rowClazz);
|
Path inputActionPayloadDir = createActionPayload(inputActionPayloadRootDir, rowClazz, actionPayloadClazz);
|
||||||
Path inputActionPayloadDir =
|
Path outputGraphTableDir = outputDir.resolve("graph").resolve(rowClazz.getSimpleName().toLowerCase());
|
||||||
createActionPayload(inputActionPayloadRootDir, rowClazz, actionPayloadClazz);
|
|
||||||
Path outputGraphTableDir =
|
|
||||||
outputDir.resolve("graph").resolve(rowClazz.getSimpleName().toLowerCase());
|
|
||||||
|
|
||||||
// when
|
// when
|
||||||
PromoteActionPayloadForGraphTableJob.main(
|
PromoteActionPayloadForGraphTableJob
|
||||||
new String[] {
|
.main(
|
||||||
"-isSparkSessionManaged",
|
new String[] {
|
||||||
Boolean.FALSE.toString(),
|
"-isSparkSessionManaged",
|
||||||
"-inputGraphTablePath",
|
Boolean.FALSE.toString(),
|
||||||
inputGraphTableDir.toString(),
|
"-inputGraphTablePath",
|
||||||
"-graphTableClassName",
|
inputGraphTableDir.toString(),
|
||||||
rowClazz.getCanonicalName(),
|
"-graphTableClassName",
|
||||||
"-inputActionPayloadPath",
|
rowClazz.getCanonicalName(),
|
||||||
inputActionPayloadDir.toString(),
|
"-inputActionPayloadPath",
|
||||||
"-actionPayloadClassName",
|
inputActionPayloadDir.toString(),
|
||||||
actionPayloadClazz.getCanonicalName(),
|
"-actionPayloadClassName",
|
||||||
"-outputGraphTablePath",
|
actionPayloadClazz.getCanonicalName(),
|
||||||
outputGraphTableDir.toString(),
|
"-outputGraphTablePath",
|
||||||
"-mergeAndGetStrategy",
|
outputGraphTableDir.toString(),
|
||||||
strategy.name()
|
"-mergeAndGetStrategy",
|
||||||
});
|
strategy.name()
|
||||||
|
});
|
||||||
|
|
||||||
// then
|
// then
|
||||||
assertTrue(Files.exists(outputGraphTableDir));
|
assertTrue(Files.exists(outputGraphTableDir));
|
||||||
|
|
||||||
List<? extends Oaf> actualOutputRows =
|
List<? extends Oaf> actualOutputRows = readGraphTableFromJobOutput(outputGraphTableDir.toString(), rowClazz)
|
||||||
readGraphTableFromJobOutput(outputGraphTableDir.toString(), rowClazz).collectAsList()
|
.collectAsList()
|
||||||
.stream()
|
.stream()
|
||||||
.sorted(Comparator.comparingInt(Object::hashCode))
|
.sorted(Comparator.comparingInt(Object::hashCode))
|
||||||
.collect(Collectors.toList());
|
.collect(Collectors.toList());
|
||||||
String expectedOutputGraphTableJsonDumpPath =
|
String expectedOutputGraphTableJsonDumpPath = resultFileLocation(strategy, rowClazz, actionPayloadClazz);
|
||||||
resultFileLocation(strategy, rowClazz, actionPayloadClazz);
|
Path expectedOutputGraphTableJsonDumpFile = Paths
|
||||||
Path expectedOutputGraphTableJsonDumpFile =
|
.get(
|
||||||
Paths.get(
|
Objects
|
||||||
Objects.requireNonNull(cl.getResource(expectedOutputGraphTableJsonDumpPath))
|
.requireNonNull(cl.getResource(expectedOutputGraphTableJsonDumpPath))
|
||||||
.getFile());
|
.getFile());
|
||||||
List<? extends Oaf> expectedOutputRows =
|
List<? extends Oaf> expectedOutputRows = readGraphTableFromJsonDump(
|
||||||
readGraphTableFromJsonDump(expectedOutputGraphTableJsonDumpFile.toString(), rowClazz)
|
expectedOutputGraphTableJsonDumpFile.toString(), rowClazz)
|
||||||
.collectAsList().stream()
|
.collectAsList()
|
||||||
.sorted(Comparator.comparingInt(Object::hashCode))
|
.stream()
|
||||||
.collect(Collectors.toList());
|
.sorted(Comparator.comparingInt(Object::hashCode))
|
||||||
assertIterableEquals(expectedOutputRows, actualOutputRows);
|
.collect(Collectors.toList());
|
||||||
}
|
assertIterableEquals(expectedOutputRows, actualOutputRows);
|
||||||
}
|
}
|
||||||
|
}
|
||||||
|
|
||||||
public static Stream<Arguments> promoteJobTestParams() {
|
public static Stream<Arguments> promoteJobTestParams() {
|
||||||
return Stream.of(
|
return Stream
|
||||||
arguments(
|
.of(
|
||||||
MergeAndGet.Strategy.MERGE_FROM_AND_GET,
|
arguments(
|
||||||
eu.dnetlib.dhp.schema.oaf.Dataset.class,
|
MergeAndGet.Strategy.MERGE_FROM_AND_GET,
|
||||||
eu.dnetlib.dhp.schema.oaf.Dataset.class),
|
eu.dnetlib.dhp.schema.oaf.Dataset.class,
|
||||||
arguments(
|
eu.dnetlib.dhp.schema.oaf.Dataset.class),
|
||||||
MergeAndGet.Strategy.MERGE_FROM_AND_GET,
|
					arguments(
						MergeAndGet.Strategy.MERGE_FROM_AND_GET,
						eu.dnetlib.dhp.schema.oaf.Dataset.class,
						eu.dnetlib.dhp.schema.oaf.Result.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Datasource.class, Datasource.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Organization.class, Organization.class),
					arguments(
						MergeAndGet.Strategy.MERGE_FROM_AND_GET,
						OtherResearchProduct.class,
						OtherResearchProduct.class),
					arguments(
						MergeAndGet.Strategy.MERGE_FROM_AND_GET, OtherResearchProduct.class, Result.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Project.class, Project.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Publication.class, Publication.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Publication.class, Result.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Relation.class, Relation.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Software.class, Software.class),
					arguments(MergeAndGet.Strategy.MERGE_FROM_AND_GET, Software.class, Result.class));
	}

	private static <G extends Oaf> Path createGraphTable(Path inputGraphRootDir, Class<G> rowClazz) {
		String inputGraphTableJsonDumpPath = inputGraphTableJsonDumpLocation(rowClazz);
		Path inputGraphTableJsonDumpFile = Paths
			.get(Objects.requireNonNull(cl.getResource(inputGraphTableJsonDumpPath)).getFile());
		Dataset<G> rowDS = readGraphTableFromJsonDump(inputGraphTableJsonDumpFile.toString(), rowClazz);
		String inputGraphTableName = rowClazz.getSimpleName().toLowerCase();
		Path inputGraphTableDir = inputGraphRootDir.resolve(inputGraphTableName);
		writeGraphTableAsJobInput(rowDS, inputGraphTableDir.toString());
		return inputGraphTableDir;
	}

	private static String inputGraphTableJsonDumpLocation(Class<? extends Oaf> rowClazz) {
		return String
			.format(
				"%s/%s.json",
				"eu/dnetlib/dhp/actionmanager/promote/input/graph", rowClazz.getSimpleName().toLowerCase());
	}

	private static <G extends Oaf> Dataset<G> readGraphTableFromJsonDump(
		String path, Class<G> rowClazz) {
		return spark
			.read()
			.textFile(path)
			.map(
				(MapFunction<String, G>) json -> OBJECT_MAPPER.readValue(json, rowClazz),
				Encoders.bean(rowClazz));
	}

	private static <G extends Oaf> void writeGraphTableAsJobInput(Dataset<G> rowDS, String path) {
		rowDS.write().option("compression", "gzip").json(path);
	}

	private static <G extends Oaf, A extends Oaf> Path createActionPayload(
		Path inputActionPayloadRootDir, Class<G> rowClazz, Class<A> actionPayloadClazz) {
		String inputActionPayloadJsonDumpPath = inputActionPayloadJsonDumpLocation(rowClazz, actionPayloadClazz);
		Path inputActionPayloadJsonDumpFile = Paths
			.get(Objects.requireNonNull(cl.getResource(inputActionPayloadJsonDumpPath)).getFile());
		Dataset<String> actionPayloadDS = readActionPayloadFromJsonDump(inputActionPayloadJsonDumpFile.toString());
		Path inputActionPayloadDir = inputActionPayloadRootDir
			.resolve(actionPayloadClazz.getSimpleName().toLowerCase());
		writeActionPayloadAsJobInput(actionPayloadDS, inputActionPayloadDir.toString());
		return inputActionPayloadDir;
	}

	private static String inputActionPayloadJsonDumpLocation(
		Class<? extends Oaf> rowClazz, Class<? extends Oaf> actionPayloadClazz) {

		return String
			.format(
				"eu/dnetlib/dhp/actionmanager/promote/input/action_payload/%s_table/%s.json",
				rowClazz.getSimpleName().toLowerCase(), actionPayloadClazz.getSimpleName().toLowerCase());
	}

	private static Dataset<String> readActionPayloadFromJsonDump(String path) {
		return spark.read().textFile(path);
	}

	private static void writeActionPayloadAsJobInput(Dataset<String> actionPayloadDS, String path) {
		actionPayloadDS.withColumnRenamed("value", "payload").write().parquet(path);
	}

	private static <G extends Oaf> Dataset<G> readGraphTableFromJobOutput(
		String path, Class<G> rowClazz) {
		return spark
			.read()
			.textFile(path)
			.map(
				(MapFunction<String, G>) json -> OBJECT_MAPPER.readValue(json, rowClazz),
				Encoders.bean(rowClazz));
	}

	private static String resultFileLocation(
		MergeAndGet.Strategy strategy,
		Class<? extends Oaf> rowClazz,
		Class<? extends Oaf> actionPayloadClazz) {
		return String
			.format(
				"eu/dnetlib/dhp/actionmanager/promote/output/graph/%s/%s/%s_action_payload/result.json",
				strategy.name().toLowerCase(),
				rowClazz.getSimpleName().toLowerCase(),
				actionPayloadClazz.getSimpleName().toLowerCase());
	}
}
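The helpers above treat a graph table dump as one JSON object per line and map each line to a typed row (`OBJECT_MAPPER.readValue` per line under `Encoders.bean`). A minimal stdlib-only sketch of that line-to-object pattern, with a hypothetical hand-rolled `parseId` standing in for Jackson (assumption: records are flat `{"id":"..."}` objects, which the real dumps are not):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JsonDumpSketch {

	// hypothetical stand-in for OBJECT_MAPPER.readValue(json, rowClazz):
	// pulls the first quoted value out of a one-line JSON object
	static String parseId(String jsonLine) {
		int start = jsonLine.indexOf(":\"") + 2;
		return jsonLine.substring(start, jsonLine.indexOf('"', start));
	}

	public static void main(String[] args) {
		// one JSON object per line, as in the graph table dumps
		Stream<String> lines = Stream.of("{\"id\":\"id1\"}", "{\"id\":\"id2\"}");
		List<String> ids = lines.map(JsonDumpSketch::parseId).collect(Collectors.toList());
		System.out.println(ids); // [id1, id2]
	}
}
```

The same shape scales up in Spark: `textFile` gives the line stream, and the `MapFunction` plus a bean encoder give the typed dataset.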
|
@ -1,15 +1,15 @@

package eu.dnetlib.dhp.actionmanager.promote;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.function.BiFunction;
import java.util.function.Function;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
@ -19,314 +19,311 @@ import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
import eu.dnetlib.dhp.schema.oaf.Oaf;

public class PromoteActionPayloadFunctionsTest {

	private static SparkSession spark;

	@BeforeAll
	public static void beforeAll() {
		SparkConf conf = new SparkConf();
		conf.setMaster("local");
		conf.setAppName(PromoteActionPayloadFunctionsTest.class.getSimpleName());
		conf.set("spark.driver.host", "localhost");
		spark = SparkSession.builder().config(conf).getOrCreate();
	}

	@AfterAll
	public static void afterAll() {
		spark.stop();
	}

	@Nested
	class JoinTableWithActionPayloadAndMerge {

		@Test
		public void shouldThrowWhenTableTypeIsNotSubtypeOfActionPayloadType() {
			// given
			class OafImpl extends Oaf {
			}

			// when
			assertThrows(
				RuntimeException.class,
				() -> PromoteActionPayloadFunctions
					.joinGraphTableWithActionPayloadAndMerge(
						null, null, null, null, null, OafImplSubSub.class, OafImpl.class));
		}

		@Test
		public void shouldRunProperlyWhenActionPayloadTypeAndTableTypeAreTheSame() {
			// given
			String id0 = "id0";
			String id1 = "id1";
			String id2 = "id2";
			String id3 = "id3";
			String id4 = "id4";
			List<OafImplSubSub> rowData = Arrays
				.asList(
					createOafImplSubSub(id0),
					createOafImplSubSub(id1),
					createOafImplSubSub(id2),
					createOafImplSubSub(id3));
			Dataset<OafImplSubSub> rowDS = spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));

			List<OafImplSubSub> actionPayloadData = Arrays
				.asList(
					createOafImplSubSub(id1),
					createOafImplSubSub(id2),
					createOafImplSubSub(id2),
					createOafImplSubSub(id3),
					createOafImplSubSub(id3),
					createOafImplSubSub(id3),
					createOafImplSubSub(id4),
					createOafImplSubSub(id4),
					createOafImplSubSub(id4),
					createOafImplSubSub(id4));
			Dataset<OafImplSubSub> actionPayloadDS = spark
				.createDataset(actionPayloadData, Encoders.bean(OafImplSubSub.class));

			SerializableSupplier<Function<OafImplSubSub, String>> rowIdFn = () -> OafImplRoot::getId;
			SerializableSupplier<Function<OafImplSubSub, String>> actionPayloadIdFn = () -> OafImplRoot::getId;
			SerializableSupplier<BiFunction<OafImplSubSub, OafImplSubSub, OafImplSubSub>> mergeAndGetFn = () -> (x,
				y) -> {
				x.merge(y);
				return x;
			};

			// when
			List<OafImplSubSub> results = PromoteActionPayloadFunctions
				.joinGraphTableWithActionPayloadAndMerge(
					rowDS,
					actionPayloadDS,
					rowIdFn,
					actionPayloadIdFn,
					mergeAndGetFn,
					OafImplSubSub.class,
					OafImplSubSub.class)
				.collectAsList();

			// then
			assertEquals(11, results.size());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id0)).count());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id1)).count());
			assertEquals(2, results.stream().filter(x -> x.getId().equals(id2)).count());
			assertEquals(3, results.stream().filter(x -> x.getId().equals(id3)).count());
			assertEquals(4, results.stream().filter(x -> x.getId().equals(id4)).count());

			results
				.forEach(
					result -> {
						switch (result.getId()) {
							case "id0":
								assertEquals(1, result.getMerged());
								break;
							case "id1":
							case "id2":
							case "id3":
								assertEquals(2, result.getMerged());
								break;
							case "id4":
								assertEquals(1, result.getMerged());
								break;
							default:
								throw new RuntimeException();
						}
					});
		}

		@Test
		public void shouldRunProperlyWhenActionPayloadTypeIsSuperTypeOfTableType() {
			// given
			String id0 = "id0";
			String id1 = "id1";
			String id2 = "id2";
			String id3 = "id3";
			String id4 = "id4";
			List<OafImplSubSub> rowData = Arrays
				.asList(
					createOafImplSubSub(id0),
					createOafImplSubSub(id1),
					createOafImplSubSub(id2),
					createOafImplSubSub(id3));
			Dataset<OafImplSubSub> rowDS = spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));

			List<OafImplSub> actionPayloadData = Arrays
				.asList(
					createOafImplSub(id1),
					createOafImplSub(id2),
					createOafImplSub(id2),
					createOafImplSub(id3),
					createOafImplSub(id3),
					createOafImplSub(id3),
					createOafImplSub(id4),
					createOafImplSub(id4),
					createOafImplSub(id4),
					createOafImplSub(id4));
			Dataset<OafImplSub> actionPayloadDS = spark
				.createDataset(actionPayloadData, Encoders.bean(OafImplSub.class));

			SerializableSupplier<Function<OafImplSubSub, String>> rowIdFn = () -> OafImplRoot::getId;
			SerializableSupplier<Function<OafImplSub, String>> actionPayloadIdFn = () -> OafImplRoot::getId;
			SerializableSupplier<BiFunction<OafImplSubSub, OafImplSub, OafImplSubSub>> mergeAndGetFn = () -> (x, y) -> {
				x.merge(y);
				return x;
			};

			// when
			List<OafImplSubSub> results = PromoteActionPayloadFunctions
				.joinGraphTableWithActionPayloadAndMerge(
					rowDS,
					actionPayloadDS,
					rowIdFn,
					actionPayloadIdFn,
					mergeAndGetFn,
					OafImplSubSub.class,
					OafImplSub.class)
				.collectAsList();

			// then
			assertEquals(7, results.size());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id0)).count());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id1)).count());
			assertEquals(2, results.stream().filter(x -> x.getId().equals(id2)).count());
			assertEquals(3, results.stream().filter(x -> x.getId().equals(id3)).count());
			assertEquals(0, results.stream().filter(x -> x.getId().equals(id4)).count());

			results
				.forEach(
					result -> {
						switch (result.getId()) {
							case "id0":
								assertEquals(1, result.getMerged());
								break;
							case "id1":
							case "id2":
							case "id3":
								assertEquals(2, result.getMerged());
								break;
							default:
								throw new RuntimeException();
						}
					});
		}
	}

	@Nested
	class GroupTableByIdAndMerge {

		@Test
		public void shouldRunProperly() {
			// given
			String id1 = "id1";
			String id2 = "id2";
			String id3 = "id3";
			List<OafImplSubSub> rowData = Arrays
				.asList(
					createOafImplSubSub(id1),
					createOafImplSubSub(id2),
					createOafImplSubSub(id2),
					createOafImplSubSub(id3),
					createOafImplSubSub(id3),
					createOafImplSubSub(id3));
			Dataset<OafImplSubSub> rowDS = spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));

			SerializableSupplier<Function<OafImplSubSub, String>> rowIdFn = () -> OafImplRoot::getId;
			SerializableSupplier<BiFunction<OafImplSubSub, OafImplSubSub, OafImplSubSub>> mergeAndGetFn = () -> (x,
				y) -> {
				x.merge(y);
				return x;
			};
			SerializableSupplier<OafImplSubSub> zeroFn = OafImplSubSub::new;
			SerializableSupplier<Function<OafImplSubSub, Boolean>> isNotZeroFn = () -> x -> Objects.nonNull(x.getId());

			// when
			List<OafImplSubSub> results = PromoteActionPayloadFunctions
				.groupGraphTableByIdAndMerge(
					rowDS, rowIdFn, mergeAndGetFn, zeroFn, isNotZeroFn, OafImplSubSub.class)
				.collectAsList();

			// then
			assertEquals(3, results.size());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id1)).count());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id2)).count());
			assertEquals(1, results.stream().filter(x -> x.getId().equals(id3)).count());

			results
				.forEach(
					result -> {
						switch (result.getId()) {
							case "id1":
								assertEquals(1, result.getMerged());
								break;
							case "id2":
								assertEquals(2, result.getMerged());
								break;
							case "id3":
								assertEquals(3, result.getMerged());
								break;
							default:
								throw new RuntimeException();
						}
					});
		}
	}

	public static class OafImplRoot extends Oaf {
		private String id;
		private int merged = 1;

		public void merge(OafImplRoot e) {
			merged += e.merged;
		}

		public String getId() {
			return id;
		}

		public void setId(String id) {
			this.id = id;
		}

		public int getMerged() {
			return merged;
		}

		public void setMerged(int merged) {
			this.merged = merged;
		}
	}

	public static class OafImplSub extends OafImplRoot {

		@Override
		public void merge(OafImplRoot e) {
			super.merge(e);
		}
	}

	private static OafImplSub createOafImplSub(String id) {
		OafImplSub x = new OafImplSub();
		x.setId(id);
		return x;
	}

	public static class OafImplSubSub extends OafImplSub {

		@Override
		public void merge(OafImplRoot e) {
			super.merge(e);
		}
	}

	private static OafImplSubSub createOafImplSubSub(String id) {
		OafImplSubSub x = new OafImplSubSub();
		x.setId(id);
		return x;
	}
}
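The fixture classes above count merges: every OafImplRoot starts with `merged = 1`, and `merge` adds the other side's counter, which is exactly what the assertions on `getMerged()` check. The grouping semantics that `groupGraphTableByIdAndMerge` is tested for can be sketched without Spark as a fold over a map (a stdlib-only illustration, not the actual implementation; the `Row` class is a hypothetical stand-in for the Oaf fixtures):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GroupAndMergeSketch {

	// hypothetical stand-in for the OafImplRoot fixture: merged starts at 1,
	// and merging adds the other row's counter
	static class Row {
		final String id;
		int merged = 1;

		Row(String id) {
			this.id = id;
		}

		void merge(Row other) {
			merged += other.merged;
		}
	}

	// same fixture shape as the GroupTableByIdAndMerge test:
	// id1 once, id2 twice, id3 three times
	static String mergedCounts() {
		List<Row> rows = Arrays
			.asList(
				new Row("id1"),
				new Row("id2"), new Row("id2"),
				new Row("id3"), new Row("id3"), new Row("id3"));
		Map<String, Row> byId = new LinkedHashMap<>();
		for (Row r : rows) {
			// first occurrence is kept as-is; later ones fold into the accumulator
			byId.merge(r.id, r, (acc, next) -> {
				acc.merge(next);
				return acc;
			});
		}
		return byId.get("id1").merged + " " + byId.get("id2").merged + " " + byId.get("id3").merged;
	}

	public static void main(String[] args) {
		System.out.println(mergedCounts()); // 1 2 3
	}
}
```

The printed counters match the test's expectations: a row merged with itself n-1 times ends with `merged == n`.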
@ -1,22 +1,21 @@

package eu.dnetlib.dhp.collection;

import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;

import org.apache.commons.cli.*;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
@ -28,128 +27,149 @@ import org.apache.spark.util.LongAccumulator;
import org.dom4j.Document;
import org.dom4j.Node;
import org.dom4j.io.SAXReader;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.application.ArgumentApplicationParser;
import eu.dnetlib.dhp.model.mdstore.MetadataRecord;
import eu.dnetlib.dhp.model.mdstore.Provenance;
import eu.dnetlib.message.Message;
import eu.dnetlib.message.MessageManager;
import eu.dnetlib.message.MessageType;

public class GenerateNativeStoreSparkJob {

	private static final Logger log = LoggerFactory.getLogger(GenerateNativeStoreSparkJob.class);

	public static MetadataRecord parseRecord(
		final String input,
		final String xpath,
		final String encoding,
		final Provenance provenance,
		final Long dateOfCollection,
		final LongAccumulator totalItems,
		final LongAccumulator invalidRecords) {

		if (totalItems != null)
			totalItems.add(1);
		try {
			SAXReader reader = new SAXReader();
			Document document = reader.read(new ByteArrayInputStream(input.getBytes(StandardCharsets.UTF_8)));
			Node node = document.selectSingleNode(xpath);
			final String originalIdentifier = node.getText();
			if (StringUtils.isBlank(originalIdentifier)) {
				if (invalidRecords != null)
					invalidRecords.add(1);
				return null;
			}
			return new MetadataRecord(originalIdentifier, encoding, provenance, input, dateOfCollection);
		} catch (Throwable e) {
			if (invalidRecords != null)
				invalidRecords.add(1);
			e.printStackTrace();
			return null;
		}
	}

	public static void main(String[] args) throws Exception {

		final ArgumentApplicationParser parser = new ArgumentApplicationParser(
			IOUtils
				.toString(
					GenerateNativeStoreSparkJob.class
						.getResourceAsStream(
							"/eu/dnetlib/dhp/collection/collection_input_parameters.json")));
		parser.parseArgument(args);
		final ObjectMapper jsonMapper = new ObjectMapper();
		final Provenance provenance = jsonMapper.readValue(parser.get("provenance"), Provenance.class);
		final long dateOfCollection = new Long(parser.get("dateOfCollection"));

		Boolean isSparkSessionManaged = Optional
|
.ofNullable(parser.get("isSparkSessionManaged"))
|
||||||
|
.map(Boolean::valueOf)
|
||||||
|
.orElse(Boolean.TRUE);
|
||||||
|
log.info("isSparkSessionManaged: {}", isSparkSessionManaged);
|
||||||
|
|
||||||
final boolean test =
|
final Map<String, String> ongoingMap = new HashMap<>();
|
||||||
parser.get("isTest") == null ? false : Boolean.valueOf(parser.get("isTest"));
|
final Map<String, String> reportMap = new HashMap<>();
|
||||||
|
|
||||||
final JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());
|
final boolean test = parser.get("isTest") == null ? false : Boolean.valueOf(parser.get("isTest"));
|
||||||
|
|
||||||
final JavaPairRDD<IntWritable, Text> inputRDD =
|
SparkConf conf = new SparkConf();
|
||||||
sc.sequenceFile(parser.get("input"), IntWritable.class, Text.class);
|
runWithSparkSession(
|
||||||
|
conf,
|
||||||
|
isSparkSessionManaged,
|
||||||
|
spark -> {
|
||||||
|
final JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());
|
||||||
|
|
||||||
final LongAccumulator totalItems = sc.sc().longAccumulator("TotalItems");
|
final JavaPairRDD<IntWritable, Text> inputRDD = sc
|
||||||
|
.sequenceFile(parser.get("input"), IntWritable.class, Text.class);
|
||||||
|
|
||||||
final LongAccumulator invalidRecords = sc.sc().longAccumulator("InvalidRecords");
|
final LongAccumulator totalItems = sc.sc().longAccumulator("TotalItems");
|
||||||
|
final LongAccumulator invalidRecords = sc.sc().longAccumulator("InvalidRecords");
|
||||||
|
|
||||||
final MessageManager manager =
|
final MessageManager manager = new MessageManager(
|
||||||
new MessageManager(
|
parser.get("rabbitHost"),
|
||||||
parser.get("rabbitHost"),
|
parser.get("rabbitUser"),
|
||||||
parser.get("rabbitUser"),
|
parser.get("rabbitPassword"),
|
||||||
parser.get("rabbitPassword"),
|
false,
|
||||||
false,
|
false,
|
||||||
false,
|
null);
|
||||||
null);
|
|
||||||
|
|
||||||
final JavaRDD<MetadataRecord> mappeRDD =
|
final JavaRDD<MetadataRecord> mappeRDD = inputRDD
|
||||||
inputRDD
|
.map(
|
||||||
.map(
|
item -> parseRecord(
|
||||||
item ->
|
item._2().toString(),
|
||||||
parseRecord(
|
parser.get("xpath"),
|
||||||
item._2().toString(),
|
parser.get("encoding"),
|
||||||
parser.get("xpath"),
|
provenance,
|
||||||
parser.get("encoding"),
|
dateOfCollection,
|
||||||
provenance,
|
totalItems,
|
||||||
dateOfCollection,
|
invalidRecords))
|
||||||
totalItems,
|
.filter(Objects::nonNull)
|
||||||
invalidRecords))
|
.distinct();
|
||||||
.filter(Objects::nonNull)
|
|
||||||
.distinct();
|
|
||||||
|
|
||||||
ongoingMap.put("ongoing", "0");
|
ongoingMap.put("ongoing", "0");
|
||||||
if (!test) {
|
if (!test) {
|
||||||
manager.sendMessage(
|
manager
|
||||||
new Message(
|
.sendMessage(
|
||||||
parser.get("workflowId"), "DataFrameCreation", MessageType.ONGOING, ongoingMap),
|
new Message(
|
||||||
parser.get("rabbitOngoingQueue"),
|
parser.get("workflowId"), "DataFrameCreation", MessageType.ONGOING, ongoingMap),
|
||||||
true,
|
parser.get("rabbitOngoingQueue"),
|
||||||
false);
|
true,
|
||||||
}
|
false);
|
||||||
|
}
|
||||||
|
|
||||||
final Encoder<MetadataRecord> encoder = Encoders.bean(MetadataRecord.class);
|
final Encoder<MetadataRecord> encoder = Encoders.bean(MetadataRecord.class);
|
||||||
final Dataset<MetadataRecord> mdstore = spark.createDataset(mappeRDD.rdd(), encoder);
|
final Dataset<MetadataRecord> mdstore = spark.createDataset(mappeRDD.rdd(), encoder);
|
||||||
final LongAccumulator mdStoreRecords = sc.sc().longAccumulator("MDStoreRecords");
|
final LongAccumulator mdStoreRecords = sc.sc().longAccumulator("MDStoreRecords");
|
||||||
mdStoreRecords.add(mdstore.count());
|
mdStoreRecords.add(mdstore.count());
|
||||||
ongoingMap.put("ongoing", "" + totalItems.value());
|
ongoingMap.put("ongoing", "" + totalItems.value());
|
||||||
if (!test) {
|
if (!test) {
|
||||||
manager.sendMessage(
|
manager
|
||||||
new Message(
|
.sendMessage(
|
||||||
parser.get("workflowId"), "DataFrameCreation", MessageType.ONGOING, ongoingMap),
|
new Message(
|
||||||
parser.get("rabbitOngoingQueue"),
|
parser.get("workflowId"), "DataFrameCreation", MessageType.ONGOING, ongoingMap),
|
||||||
true,
|
parser.get("rabbitOngoingQueue"),
|
||||||
false);
|
true,
|
||||||
}
|
false);
|
||||||
mdstore.write().format("parquet").save(parser.get("output"));
|
}
|
||||||
reportMap.put("inputItem", "" + totalItems.value());
|
mdstore.write().format("parquet").save(parser.get("output"));
|
||||||
reportMap.put("invalidRecords", "" + invalidRecords.value());
|
reportMap.put("inputItem", "" + totalItems.value());
|
||||||
reportMap.put("mdStoreSize", "" + mdStoreRecords.value());
|
reportMap.put("invalidRecords", "" + invalidRecords.value());
|
||||||
if (!test) {
|
reportMap.put("mdStoreSize", "" + mdStoreRecords.value());
|
||||||
manager.sendMessage(
|
if (!test) {
|
||||||
new Message(parser.get("workflowId"), "Collection", MessageType.REPORT, reportMap),
|
manager
|
||||||
parser.get("rabbitReportQueue"),
|
.sendMessage(
|
||||||
true,
|
new Message(parser.get("workflowId"), "Collection", MessageType.REPORT, reportMap),
|
||||||
false);
|
parser.get("rabbitReportQueue"),
|
||||||
manager.close();
|
true,
|
||||||
}
|
false);
|
||||||
}
|
manager.close();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|