forked from D-Net/dnet-hadoop

commit 4c94231cad: merge with master fork
@@ -0,0 +1,661 @@
                    GNU AFFERO GENERAL PUBLIC LICENSE
                       Version 3, 19 November 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

  A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

  The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

  An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU Affero General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Remote Network Interaction; Use with the GNU General Public License.

  Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version
3 of the GNU General Public License.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU Affero General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU Affero General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU Affero General Public License, you may choose any version ever published
by the Free Software Foundation.

  If the Program specifies that a proxy can decide which future
versions of the GNU Affero General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

  Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

  15. Disclaimer of Warranty.

  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

  16. Limitation of Liability.

  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

  17. Interpretation of Sections 15 and 16.

  If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

                     END OF TERMS AND CONDITIONS

            How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

  To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year> <name of author>

    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU Affero General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    GNU Affero General Public License for more details.

    You should have received a copy of the GNU Affero General Public License
    along with this program. If not, see <http://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

  If your software can interact with users remotely through a computer
network, you should also make sure that it provides a way for users to
get its source. For example, if your program is a web application, its
interface could display a "Source" link that leads users to an archive
of the code. There are many ways you could offer source, and different
solutions will be better for different programs; see section 13 for the
specific requirements.

  You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see
<http://www.gnu.org/licenses/>.
@@ -12,6 +12,8 @@
	<artifactId>dhp-build-assembly-resources</artifactId>
	<packaging>jar</packaging>

+	<description>This module contains a set of scripts supporting the build lifecycle for the dnet-hadoop project</description>
+
	<build>
		<plugins>
			<plugin>
@@ -12,22 +12,29 @@
	<artifactId>dhp-build-properties-maven-plugin</artifactId>
	<packaging>maven-plugin</packaging>

+	<description>This module is a maven plugin implementing custom properties substitutions in the build lifecycle</description>
+
	<dependencies>
		<dependency>
			<groupId>org.apache.maven</groupId>
			<artifactId>maven-plugin-api</artifactId>
-			<version>2.0</version>
+			<version>3.6.3</version>
		</dependency>
		<dependency>
			<groupId>org.apache.maven</groupId>
			<artifactId>maven-project</artifactId>
-			<version>2.0</version>
+			<version>2.2.1</version>
		</dependency>
+		<dependency>
+			<groupId>org.apache.maven</groupId>
+			<artifactId>maven-artifact</artifactId>
+			<version>2.2.1</version>
+		</dependency>

		<dependency>
			<groupId>org.kuali.maven.plugins</groupId>
			<artifactId>properties-maven-plugin</artifactId>
-			<version>1.3.2</version>
+			<version>${properties.maven.plugin.version}</version>
		</dependency>
		<dependency>
			<groupId>com.google.code.findbugs</groupId>
@@ -73,44 +80,10 @@
				<artifactId>maven-javadoc-plugin</artifactId>
				<configuration>
-					<detectLinks>true</detectLinks>
+					<doclint>none</doclint>
				</configuration>
			</plugin>
		</plugins>
-		<pluginManagement>
-			<plugins>
-				<!--This plugin's configuration is used to store Eclipse m2e settings only. It has no influence on the Maven build itself.-->
-				<plugin>
-					<groupId>org.eclipse.m2e</groupId>
-					<artifactId>lifecycle-mapping</artifactId>
-					<version>1.0.0</version>
-					<configuration>
-						<lifecycleMappingMetadata>
-							<pluginExecutions>
-								<pluginExecution>
-									<pluginExecutionFilter>
-										<groupId>
-											org.apache.maven.plugins
-										</groupId>
-										<artifactId>
-											maven-plugin-plugin
-										</artifactId>
-										<versionRange>
-											[3.2,)
-										</versionRange>
-										<goals>
-											<goal>descriptor</goal>
-										</goals>
-									</pluginExecutionFilter>
-									<action>
-										<ignore />
-									</action>
-								</pluginExecution>
-							</pluginExecutions>
-						</lifecycleMappingMetadata>
-					</configuration>
-				</plugin>
-			</plugins>
-		</pluginManagement>
	</build>

</project>
@@ -1,8 +1,10 @@
+
package eu.dnetlib.maven.plugin.properties;

import java.io.File;
import java.util.ArrayList;
import java.util.List;
+
import org.apache.commons.lang.ArrayUtils;
import org.apache.commons.lang.StringUtils;
import org.apache.maven.plugin.AbstractMojo;
@@ -20,18 +22,21 @@ public class GenerateOoziePropertiesMojo extends AbstractMojo {
	public static final String PROPERTY_NAME_WF_SOURCE_DIR = "workflow.source.dir";
	public static final String PROPERTY_NAME_SANDBOX_NAME = "sandboxName";

-	private final String[] limiters = {"dhp", "dnetlib", "eu"};
+	private final String[] limiters = {
+		"dhp", "dnetlib", "eu"
+	};

	@Override
	public void execute() throws MojoExecutionException, MojoFailureException {
		if (System.getProperties().containsKey(PROPERTY_NAME_WF_SOURCE_DIR)
			&& !System.getProperties().containsKey(PROPERTY_NAME_SANDBOX_NAME)) {
-			String generatedSandboxName =
-				generateSandboxName(System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
+			String generatedSandboxName = generateSandboxName(
+				System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
			if (generatedSandboxName != null) {
				System.getProperties().setProperty(PROPERTY_NAME_SANDBOX_NAME, generatedSandboxName);
			} else {
-				System.out.println(
+				System.out
+					.println(
						"unable to generate sandbox name from path: "
							+ System.getProperties().getProperty(PROPERTY_NAME_WF_SOURCE_DIR));
			}
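For context: the hunk above derives the Oozie sandbox name from the workflow.source.dir path, using the limiters array ("dhp", "dnetlib", "eu") as path-segment delimiters. The generateSandboxName helper itself is outside this hunk, so the following is only a hedged sketch of that behavior; the method name suffix and the exact traversal rule are assumptions, not the project's actual code. It reuses only imports already visible in this file (ArrayUtils, StringUtils, ArrayList, List).

	// Sketch only: the real generateSandboxName(...) lies outside this hunk.
	// Assumption: the sandbox name is built from the path segments that follow
	// the last limiter segment; null is returned when nothing remains.
	private String generateSandboxNameSketch(String wfSourceDir) {
		List<String> parts = new ArrayList<>();
		for (String token : StringUtils.split(wfSourceDir, '/')) {
			if (ArrayUtils.contains(limiters, token)) {
				parts.clear(); // restart collection after each limiter segment
			} else {
				parts.add(token);
			}
		}
		return parts.isEmpty() ? null : StringUtils.join(parts, '/');
	}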
@@ -9,9 +9,9 @@
 * express or implied. See the License for the specific language governing permissions and
 * limitations under the License.
 */
+
package eu.dnetlib.maven.plugin.properties;

-import edu.umd.cs.findbugs.annotations.SuppressFBWarnings;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
@@ -24,6 +24,7 @@ import java.util.List;
import java.util.Map.Entry;
import java.util.Properties;
import java.util.Set;
+
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang.StringUtils;
@@ -35,9 +36,11 @@ import org.springframework.core.io.DefaultResourceLoader;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;

+import edu.umd.cs.findbugs.annotations.SuppressFBWarnings;
+
/**
 * Writes project properties for the keys listed in specified properties files. Based on:
- * http://site.kuali.org/maven/plugins/properties-maven-plugin/1.3.2/write-project-properties-mojo.html
+ * http://site.kuali.org/maven/plugins/properties-maven-plugin/2.0.1/write-project-properties-mojo.html
 *
 * @author mhorst
 * @goal write-project-properties
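The javadoc above summarizes the mojo's job: write the project's properties to a file, filtered down to the keys listed in the given properties files. A minimal usage sketch, driving it the way the unit tests later in this diff do; only setQuiet, setIncludePropertyKeysFromFiles and execute() appear in these hunks, and the MavenProject and output file are assumed to be wired through the @parameter fields shown below.

	// Hedged usage sketch, mirroring the tests in this diff; the classpath
	// resource path is the one the tests themselves reference.
	WritePredefinedProjectProperties mojo = new WritePredefinedProjectProperties();
	mojo.setQuiet(true); // tolerate a missing key file instead of failing the build
	mojo.setIncludePropertyKeysFromFiles(new String[] {
		"/eu/dnetlib/maven/plugin/properties/included.properties"
	});
	mojo.execute(); // writes the filtered properties to the configured outputFile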
@@ -70,33 +73,31 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
	protected File outputFile;

	/**
-	 * If true, the plugin will silently ignore any non-existent properties files, and the build will
-	 * continue
+	 * If true, the plugin will silently ignore any non-existent properties files, and the build will continue
	 *
	 * @parameter property="properties.quiet" default-value="true"
	 */
	private boolean quiet;

	/**
-	 * Comma separated list of characters to escape when writing property values. cr=carriage return,
-	 * lf=linefeed, tab=tab. Any other values are taken literally.
+	 * Comma separated list of characters to escape when writing property values. cr=carriage return, lf=linefeed,
+	 * tab=tab. Any other values are taken literally.
	 *
	 * @parameter default-value="cr,lf,tab" property="properties.escapeChars"
	 */
	private String escapeChars;

	/**
-	 * If true, the plugin will include system properties when writing the properties file. System
-	 * properties override both environment variables and project properties.
+	 * If true, the plugin will include system properties when writing the properties file. System properties override
+	 * both environment variables and project properties.
	 *
	 * @parameter default-value="false" property="properties.includeSystemProperties"
	 */
	private boolean includeSystemProperties;

	/**
-	 * If true, the plugin will include environment variables when writing the properties file.
-	 * Environment variables are prefixed with "env". Environment variables override project
-	 * properties.
+	 * If true, the plugin will include environment variables when writing the properties file. Environment variables
+	 * are prefixed with "env". Environment variables override project properties.
	 *
	 * @parameter default-value="false" property="properties.includeEnvironmentVariables"
	 */
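The escapeChars contract above is a tiny token language: cr, lf and tab name control characters, and any other token stands for itself. The following is only an illustrative sketch of that documented rule, not the plugin's actual parsing code.

	// Illustrative only: expands the documented escapeChars tokens.
	private static List<String> parseEscapeCharsSketch(String escapeChars) {
		List<String> result = new ArrayList<>();
		for (String token : StringUtils.split(StringUtils.defaultString(escapeChars), ',')) {
			String trimmed = token.trim();
			if ("cr".equals(trimmed)) {
				result.add("\r"); // carriage return
			} else if ("lf".equals(trimmed)) {
				result.add("\n"); // linefeed
			} else if ("tab".equals(trimmed)) {
				result.add("\t"); // tab
			} else {
				result.add(trimmed); // any other value is taken literally
			}
		}
		return result;
	}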
@@ -110,8 +111,8 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
	private String exclude;

	/**
-	 * Comma separated set of properties to write to the properties file. If provided, only the
-	 * properties matching those supplied here will be written to the properties file.
+	 * Comma separated set of properties to write to the properties file. If provided, only the properties matching
+	 * those supplied here will be written to the properties file.
	 *
	 * @parameter property="properties.include"
	 */
@@ -122,7 +123,9 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
	 * @see org.apache.maven.plugin.AbstractMojo#execute()
	 */
	@Override
-	@SuppressFBWarnings({"NP_UNWRITTEN_FIELD", "UWF_UNWRITTEN_FIELD"})
+	@SuppressFBWarnings({
+		"NP_UNWRITTEN_FIELD", "UWF_UNWRITTEN_FIELD"
+	})
	public void execute() throws MojoExecutionException, MojoFailureException {
		Properties properties = new Properties();
		// Add project properties
@@ -437,8 +440,8 @@ public class WritePredefinedProjectProperties extends AbstractMojo {
	 */
	public void setIncludePropertyKeysFromFiles(String[] includePropertyKeysFromFiles) {
		if (includePropertyKeysFromFiles != null) {
-			this.includePropertyKeysFromFiles =
-				Arrays.copyOf(includePropertyKeysFromFiles, includePropertyKeysFromFiles.length);
+			this.includePropertyKeysFromFiles = Arrays
+				.copyOf(includePropertyKeysFromFiles, includePropertyKeysFromFiles.length);
		}
	}
}
@@ -1,3 +1,4 @@
+
package eu.dnetlib.maven.plugin.properties;

import static eu.dnetlib.maven.plugin.properties.GenerateOoziePropertiesMojo.PROPERTY_NAME_SANDBOX_NAME;
@@ -10,7 +11,7 @@ import org.junit.jupiter.api.Test;
/** @author mhorst, claudio.atzori */
public class GenerateOoziePropertiesMojoTest {

-	private GenerateOoziePropertiesMojo mojo = new GenerateOoziePropertiesMojo();
+	private final GenerateOoziePropertiesMojo mojo = new GenerateOoziePropertiesMojo();

	@BeforeEach
	public void clearSystemProperties() {
@@ -1,3 +1,4 @@
+
package eu.dnetlib.maven.plugin.properties;

import static eu.dnetlib.maven.plugin.properties.WritePredefinedProjectProperties.PROPERTY_PREFIX_ENV;
@@ -7,6 +8,7 @@ import static org.mockito.Mockito.lenient;

import java.io.*;
import java.util.Properties;
+
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.project.MavenProject;
import org.junit.jupiter.api.*;
@@ -20,7 +22,8 @@ import org.mockito.junit.jupiter.MockitoExtension;
@ExtendWith(MockitoExtension.class)
public class WritePredefinedProjectPropertiesTest {

-	@Mock private MavenProject mavenProject;
+	@Mock
+	private MavenProject mavenProject;

	private WritePredefinedProjectProperties mojo;

@@ -145,7 +148,9 @@ public class WritePredefinedProjectPropertiesTest {
		includedProperties.setProperty(includedKey, "irrelevantValue");
		includedProperties.store(new FileWriter(includedPropertiesFile), null);

-		mojo.setIncludePropertyKeysFromFiles(new String[] {includedPropertiesFile.getAbsolutePath()});
+		mojo.setIncludePropertyKeysFromFiles(new String[] {
+			includedPropertiesFile.getAbsolutePath()
+		});

		// execute
		mojo.execute();
@@ -171,8 +176,11 @@ public class WritePredefinedProjectPropertiesTest {
		projectProperties.setProperty(includedKey, includedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();

-		mojo.setIncludePropertyKeysFromFiles(
-			new String[] {"/eu/dnetlib/maven/plugin/properties/included.properties"});
+		mojo
+			.setIncludePropertyKeysFromFiles(
+				new String[] {
+					"/eu/dnetlib/maven/plugin/properties/included.properties"
+				});

		// execute
		mojo.execute();
@@ -197,7 +205,9 @@ public class WritePredefinedProjectPropertiesTest {
		projectProperties.setProperty(includedKey, includedValue);
		doReturn(projectProperties).when(mavenProject).getProperties();

-		mojo.setIncludePropertyKeysFromFiles(new String[] {""});
+		mojo.setIncludePropertyKeysFromFiles(new String[] {
+			""
+		});

		// execute
		Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
@@ -221,7 +231,9 @@ public class WritePredefinedProjectPropertiesTest {
		includedProperties.setProperty(includedKey, "irrelevantValue");
		includedProperties.storeToXML(new FileOutputStream(includedPropertiesFile), null);

-		mojo.setIncludePropertyKeysFromFiles(new String[] {includedPropertiesFile.getAbsolutePath()});
+		mojo.setIncludePropertyKeysFromFiles(new String[] {
+			includedPropertiesFile.getAbsolutePath()
+		});

		// execute
		mojo.execute();
@@ -252,7 +264,9 @@ public class WritePredefinedProjectPropertiesTest {
		includedProperties.setProperty(includedKey, "irrelevantValue");
		includedProperties.store(new FileOutputStream(includedPropertiesFile), null);

-		mojo.setIncludePropertyKeysFromFiles(new String[] {includedPropertiesFile.getAbsolutePath()});
+		mojo.setIncludePropertyKeysFromFiles(new String[] {
+			includedPropertiesFile.getAbsolutePath()
+		});

		// execute
		Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
@@ -262,7 +276,9 @@ public class WritePredefinedProjectPropertiesTest {
	public void testExecuteWithQuietModeOn(@TempDir File testFolder) throws Exception {
		// given
		mojo.setQuiet(true);
-		mojo.setIncludePropertyKeysFromFiles(new String[] {"invalid location"});
+		mojo.setIncludePropertyKeysFromFiles(new String[] {
+			"invalid location"
+		});

		// execute
		mojo.execute();
@@ -276,7 +292,9 @@ public class WritePredefinedProjectPropertiesTest {
	@Test
	public void testExecuteIncludingPropertyKeysFromInvalidFile() {
		// given
-		mojo.setIncludePropertyKeysFromFiles(new String[] {"invalid location"});
+		mojo.setIncludePropertyKeysFromFiles(new String[] {
+			"invalid location"
+		});

		// execute
		Assertions.assertThrows(MojoExecutionException.class, () -> mojo.execute());
@@ -348,7 +366,7 @@ public class WritePredefinedProjectPropertiesTest {
	}

	private Properties getStoredProperties(File testFolder)
-		throws FileNotFoundException, IOException {
+		throws IOException {
		Properties properties = new Properties();
		properties.load(new FileInputStream(getPropertiesFileLocation(testFolder)));
		return properties;
@@ -11,6 +11,38 @@

	<packaging>jar</packaging>

+	<description>This module contains resources supporting common code style conventions</description>
+
+	<distributionManagement>
+		<snapshotRepository>
+			<id>dnet45-snapshots</id>
+			<name>DNet45 Snapshots</name>
+			<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-snapshots</url>
+			<layout>default</layout>
+		</snapshotRepository>
+		<repository>
+			<id>dnet45-releases</id>
+			<url>http://maven.research-infrastructures.eu/nexus/content/repositories/dnet45-releases</url>
+		</repository>
+	</distributionManagement>
+
+	<build>
+		<pluginManagement>
+			<plugins>
+				<plugin>
+					<groupId>org.apache.maven.plugins</groupId>
+					<artifactId>maven-project-info-reports-plugin</artifactId>
+					<version>3.0.0</version>
+				</plugin>
+				<plugin>
+					<groupId>org.apache.maven.plugins</groupId>
+					<artifactId>maven-site-plugin</artifactId>
+					<version>3.7.1</version>
+				</plugin>
+			</plugins>
+		</pluginManagement>
+	</build>
+
	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
	</properties>
@@ -8,6 +8,9 @@
	</parent>
	<artifactId>dhp-build</artifactId>
	<packaging>pom</packaging>
+
+	<description>This module is a container for the build tools used in dnet-hadoop</description>
+
	<modules>
		<module>dhp-code-style</module>
		<module>dhp-build-assembly-resources</module>
@@ -12,6 +12,8 @@
	<artifactId>dhp-common</artifactId>
	<packaging>jar</packaging>

+	<description>This module contains common utilities meant to be used across the dnet-hadoop submodules</description>
+
	<dependencies>

		<dependency>
@@ -1,3 +1,4 @@
+
package eu.dnetlib.collector.worker.model;

import java.util.HashMap;
@@ -1,7 +1,9 @@
+
package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;
import java.util.UUID;
+
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -1,6 +1,8 @@
+
package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;
+
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -1,7 +1,9 @@
+
package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;
import java.util.Date;
+
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -1,7 +1,9 @@
+
package eu.dnetlib.data.mdstore.manager.common.model;

import java.io.Serializable;
import java.util.Date;
+
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
@@ -1,6 +1,6 @@
+
package eu.dnetlib.dhp.application;

-import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.Serializable;
@@ -8,10 +8,13 @@ import java.io.StringWriter;
import java.util.*;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
+
import org.apache.commons.cli.*;
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.io.IOUtils;
+
+import com.fasterxml.jackson.databind.ObjectMapper;

public class ArgumentApplicationParser implements Serializable {

	private final Options options = new Options();
@ -21,8 +24,7 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
|
||||
public ArgumentApplicationParser(final String json_configuration) throws Exception {
|
||||
final ObjectMapper mapper = new ObjectMapper();
|
||||
final OptionsParameter[] configuration =
|
||||
mapper.readValue(json_configuration, OptionsParameter[].class);
|
||||
final OptionsParameter[] configuration = mapper.readValue(json_configuration, OptionsParameter[].class);
|
||||
createOptionMap(configuration);
|
||||
}
|
||||
|
||||
|
@ -32,7 +34,8 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
|
||||
private void createOptionMap(final OptionsParameter[] configuration) {
|
||||
|
||||
Arrays.stream(configuration)
|
||||
Arrays
|
||||
.stream(configuration)
|
||||
.map(
|
||||
conf -> {
|
||||
final Option o = new Option(conf.getParamName(), true, conf.getParamDescription());
|
||||
|
@ -74,10 +77,11 @@ public class ArgumentApplicationParser implements Serializable {
|
|||
public void parseArgument(final String[] args) throws Exception {
|
||||
CommandLineParser parser = new BasicParser();
|
||||
CommandLine cmd = parser.parse(options, args);
|
||||
Arrays.stream(cmd.getOptions())
|
||||
Arrays
|
||||
.stream(cmd.getOptions())
|
||||
.forEach(
|
||||
it ->
|
||||
objectMap.put(
|
||||
it -> objectMap
|
||||
.put(
|
||||
it.getLongOpt(),
|
||||
compressedValues.contains(it.getLongOpt())
|
||||
? decompressValue(it.getValue())
|
||||
|
|
|
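Note: the ArgumentApplicationParser hunks above are pure reformatting; behavior is unchanged. For orientation, a minimal usage sketch (the parameter JSON, option names, and the get() accessor are illustrative assumptions, since they are not shown in this diff):

	// Hypothetical parameter descriptor; field names mirror OptionsParameter below.
	final String params = "[{\"paramName\":\"p\", \"paramLongName\":\"path\", \"paramDescription\":\"input path\", \"paramRequired\":true}]";
	final ArgumentApplicationParser parser = new ArgumentApplicationParser(params);
	parser.parseArgument(new String[] { "-p", "/tmp/input" });
	// Values are keyed by the long option name, decompressed when flagged as compressed.
	final String path = parser.get("path");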
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.application;

public class OptionsParameter {

@@ -8,7 +9,8 @@ public class OptionsParameter {
	private boolean paramRequired;
	private boolean compressed;

	public OptionsParameter() {}
	public OptionsParameter() {
	}

	public String getParamName() {
		return paramName;
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import java.io.Serializable;

@@ -6,16 +7,18 @@ import java.util.function.Supplier;
/** Provides serializable and throwing extensions to standard functional interfaces. */
public class FunctionalInterfaceSupport {

	private FunctionalInterfaceSupport() {}
	private FunctionalInterfaceSupport() {
	}

	/**
	 * Serializable supplier of any kind of objects. To be used within spark processing pipelines
	 * when supplying functions externally.
	 * Serializable supplier of any kind of objects. To be used within spark processing pipelines when supplying
	 * functions externally.
	 *
	 * @param <T>
	 */
	@FunctionalInterface
	public interface SerializableSupplier<T> extends Supplier<T>, Serializable {}
	public interface SerializableSupplier<T> extends Supplier<T>, Serializable {
	}

	/**
	 * Extension of consumer accepting functions throwing an exception.
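Why SerializableSupplier matters, in one hedged sketch (assumed usage, not part of this diff): Spark serializes the closures it ships to executors, so any supplier captured inside a job must itself be Serializable.

	// A method reference satisfies the functional interface and serializes cleanly.
	SerializableSupplier<ObjectMapper> mapperSupplier = ObjectMapper::new;
	// ... later, inside a Spark closure: mapperSupplier.get().writeValueAsString(record)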
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import static eu.dnetlib.dhp.common.ThrowingSupport.rethrowAsRuntimeException;

@@ -5,6 +6,7 @@ import static eu.dnetlib.dhp.common.ThrowingSupport.rethrowAsRuntimeException;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;

@@ -16,7 +18,8 @@ import org.slf4j.LoggerFactory;
public class HdfsSupport {
	private static final Logger logger = LoggerFactory.getLogger(HdfsSupport.class);

	private HdfsSupport() {}
	private HdfsSupport() {
	}

	/**
	 * Checks a path (file or dir) exists on HDFS.

@@ -62,8 +65,8 @@ public class HdfsSupport {
	public static List<String> listFiles(String path, Configuration configuration) {
		logger.info("Listing files in path: {}", path);
		return rethrowAsRuntimeException(
			() ->
				Arrays.stream(FileSystem.get(configuration).listStatus(new Path(path)))
			() -> Arrays
				.stream(FileSystem.get(configuration).listStatus(new Path(path)))
				.filter(FileStatus::isDirectory)
				.map(x -> x.getPath().toString())
				.collect(Collectors.toList()));
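A short usage sketch for listFiles (the path is an illustrative assumption). Note that despite its name the method keeps only directory entries, as the FileStatus::isDirectory filter above shows:

	Configuration conf = new Configuration();
	// Returns the sub-directories of /data/mdstore as fully qualified path strings.
	List<String> partitions = HdfsSupport.listFiles("/data/mdstore", conf);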
@@ -1,20 +1,23 @@

package eu.dnetlib.dhp.common;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;
import java.util.Objects;
import java.util.function.Function;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;

/** SparkSession utility methods. */
public class SparkSessionSupport {

	private SparkSessionSupport() {}
	private SparkSessionSupport() {
	}

	/**
	 * Runs a given function using SparkSession created using default builder and supplied SparkConf.
	 * Stops SparkSession when SparkSession is managed. Allows to reuse SparkSession created
	 * externally.
	 * Runs a given function using SparkSession created using default builder and supplied SparkConf. Stops SparkSession
	 * when SparkSession is managed. Allows to reuse SparkSession created externally.
	 *
	 * @param conf SparkConf instance
	 * @param isSparkSessionManaged When true will stop SparkSession

@@ -27,9 +30,8 @@ public class SparkSessionSupport {
	}

	/**
	 * Runs a given function using SparkSession created with hive support and using default builder
	 * and supplied SparkConf. Stops SparkSession when SparkSession is managed. Allows to reuse
	 * SparkSession created externally.
	 * Runs a given function using SparkSession created with hive support and using default builder and supplied
	 * SparkConf. Stops SparkSession when SparkSession is managed. Allows to reuse SparkSession created externally.
	 *
	 * @param conf SparkConf instance
	 * @param isSparkSessionManaged When true will stop SparkSession

@@ -45,9 +47,8 @@ public class SparkSessionSupport {
	}

	/**
	 * Runs a given function using SparkSession created using supplied builder and supplied SparkConf.
	 * Stops SparkSession when SparkSession is managed. Allows to reuse SparkSession created
	 * externally.
	 * Runs a given function using SparkSession created using supplied builder and supplied SparkConf. Stops
	 * SparkSession when SparkSession is managed. Allows to reuse SparkSession created externally.
	 *
	 * @param sparkSessionBuilder Builder of SparkSession
	 * @param conf SparkConf instance
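The javadoc above describes the runWithSparkSession pattern; a minimal sketch of the assumed call shape (method name and argument order inferred from the javadoc, the body is illustrative):

	SparkConf conf = new SparkConf();
	// When the second argument is true the session is stopped once the consumer returns.
	SparkSessionSupport.runWithSparkSession(conf, true, spark -> {
		spark.read().textFile("/tmp/input").show();
	});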
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingRunnable;

@@ -6,7 +7,8 @@ import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingSupplier;
/** Exception handling utility methods. */
public class ThrowingSupport {

	private ThrowingSupport() {}
	private ThrowingSupport() {
	}

	/**
	 * Executes given runnable and rethrows any exceptions as RuntimeException.
@@ -1,8 +1,10 @@

package eu.dnetlib.dhp.model.mdstore;

import eu.dnetlib.dhp.utils.DHPUtils;
import java.io.Serializable;

import eu.dnetlib.dhp.utils.DHPUtils;

/** This class models a record inside the new Metadata store collection on HDFS. */
public class MetadataRecord implements Serializable {

@@ -16,8 +18,7 @@ public class MetadataRecord implements Serializable {
	private String encoding;

	/**
	 * The information about the provenance of the record, see @{@link Provenance} for the model of
	 * this information
	 * The information about the provenance of the record, see @{@link Provenance} for the model of this information
	 */
	private Provenance provenance;
@@ -1,11 +1,13 @@

package eu.dnetlib.dhp.model.mdstore;

import java.io.Serializable;

/**
 * @author Sandro La Bruzzo
 *     <p>Provenance class models the provenance of the record in the metadataStore. It contains the
 *     identifier and the name of the datasource that gives the record
 * <p>
 * Provenance class models the provenance of the record in the metadataStore. It contains the identifier and the
 * name of the datasource that gives the record
 */
public class Provenance implements Serializable {

@@ -15,7 +17,8 @@ public class Provenance implements Serializable {

	private String nsPrefix;

	public Provenance() {}
	public Provenance() {
	}

	public Provenance(String datasourceId, String datasourceName, String nsPrefix) {
		this.datasourceId = datasourceId;
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.parser.utility;

public class VtdException extends Exception {
@@ -1,12 +1,14 @@

package eu.dnetlib.dhp.parser.utility;

import com.ximpleware.AutoPilot;
import com.ximpleware.VTDNav;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.ximpleware.AutoPilot;
import com.ximpleware.VTDNav;

/** Created by sandro on 9/29/16. */
public class VtdUtilityParser {

@@ -36,7 +38,8 @@ public class VtdUtilityParser {
	final Map<String, String> currentAttributes = new HashMap<>();
	if (attributes != null) {

		attributes.forEach(
		attributes
			.forEach(
				attributeKey -> {
					try {
						int attr = vn.getAttrVal(attributeKey);

@@ -58,7 +61,8 @@ public class VtdUtilityParser {
	ap.selectXPath(xpath);
	while (ap.evalXPath() != -1) {
		int t = vn.getText();
		if (t > -1) results.add(vn.toNormalizedString(t));
		if (t > -1)
			results.add(vn.toNormalizedString(t));
	}
	return results;
} catch (Exception e) {

@@ -72,7 +76,8 @@ public class VtdUtilityParser {
	ap.selectXPath(xpath);
	while (ap.evalXPath() != -1) {
		int it = nav.getText();
		if (it > -1) return nav.toNormalizedString(it);
		if (it > -1)
			return nav.toNormalizedString(it);
	}
	return null;
} catch (Exception e) {
@@ -1,23 +1,27 @@

package eu.dnetlib.dhp.utils;

import com.jayway.jsonpath.JsonPath;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import net.minidev.json.JSONArray;

import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.binary.Base64OutputStream;
import org.apache.commons.codec.binary.Hex;

import com.jayway.jsonpath.JsonPath;

import net.minidev.json.JSONArray;

public class DHPUtils {

	public static String md5(final String s) {
		try {
			final MessageDigest md = MessageDigest.getInstance("MD5");
			md.update(s.getBytes("UTF-8"));
			md.update(s.getBytes(StandardCharsets.UTF_8));
			return new String(Hex.encodeHex(md.digest()));
		} catch (final Exception e) {
			System.err.println("Error creating id");

@@ -59,7 +63,8 @@ public class DHPUtils {
	public static String getJPathString(final String jsonPath, final String json) {
		try {
			Object o = JsonPath.read(json, jsonPath);
			if (o instanceof String) return (String) o;
			if (o instanceof String)
				return (String) o;
			if (o instanceof JSONArray && ((JSONArray) o).size() > 0)
				return (String) ((JSONArray) o).get(0);
			return o.toString();
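One small behavioral note: the md5 change swaps the "UTF-8" string literal for StandardCharsets.UTF_8, removing a checked-exception path without altering the digest. A quick sketch of both helpers (inputs are illustrative):

	String id = DHPUtils.md5("some-identifier");                                     // lowercase hex MD5
	String title = DHPUtils.getJPathString("$.title", "{\"title\":\"On Testing\"}"); // "On Testing"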
@@ -1,10 +1,12 @@

package eu.dnetlib.dhp.utils;

import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;

import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;

public class ISLookupClientFactory {

	private static final Log log = LogFactory.getLog(ISLookupClientFactory.class);
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.utils.saxon;

import net.sf.saxon.expr.XPathContext;

@@ -9,8 +10,7 @@ import net.sf.saxon.trans.XPathException;

public abstract class AbstractExtensionFunction extends ExtensionFunctionDefinition {

	public static String DEFAULT_SAXON_EXT_NS_URI =
		"http://www.d-net.research-infrastructures.eu/saxon-extension";
	public static String DEFAULT_SAXON_EXT_NS_URI = "http://www.d-net.research-infrastructures.eu/saxon-extension";

	public abstract String getName();
@@ -1,9 +1,11 @@

package eu.dnetlib.dhp.utils.saxon;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;

import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Item;
import net.sf.saxon.om.Sequence;

@@ -13,7 +15,9 @@ import net.sf.saxon.value.StringValue;

public class ExtractYear extends AbstractExtensionFunction {

	private static final String[] dateFormats = {"yyyy-MM-dd", "yyyy/MM/dd"};
	private static final String[] dateFormats = {
		"yyyy-MM-dd", "yyyy/MM/dd"
	};

	@Override
	public String getName() {

@@ -44,7 +48,9 @@ public class ExtractYear extends AbstractExtensionFunction {

	@Override
	public SequenceType[] getArgumentTypes() {
		return new SequenceType[] {SequenceType.OPTIONAL_ITEM};
		return new SequenceType[] {
			SequenceType.OPTIONAL_ITEM
		};
	}

	@Override
@@ -1,8 +1,10 @@

package eu.dnetlib.dhp.utils.saxon;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Sequence;
import net.sf.saxon.trans.XPathException;

@@ -15,7 +17,7 @@ public class NormalizeDate extends AbstractExtensionFunction {
		"yyyy-MM-dd'T'hh:mm:ss", "yyyy-MM-dd", "yyyy/MM/dd", "yyyy"
	};

	private static final String normalizeOutFormat = new String("yyyy-MM-dd'T'hh:mm:ss'Z'");
	private static final String normalizeOutFormat = "yyyy-MM-dd'T'hh:mm:ss'Z'";

	@Override
	public String getName() {

@@ -43,7 +45,9 @@ public class NormalizeDate extends AbstractExtensionFunction {

	@Override
	public SequenceType[] getArgumentTypes() {
		return new SequenceType[] {SequenceType.OPTIONAL_ITEM};
		return new SequenceType[] {
			SequenceType.OPTIONAL_ITEM
		};
	}

	@Override
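Dropping the redundant new String(...) wrapper is a no-op cleanup. Functionally, NormalizeDate tries each input pattern and re-emits the first successful parse in the output format; a standalone sketch of that logic (not the Saxon wiring, which lives in the method bodies omitted from this diff):

	String[] inputFormats = { "yyyy-MM-dd'T'hh:mm:ss", "yyyy-MM-dd", "yyyy/MM/dd", "yyyy" };
	String normalized = null;
	for (String f : inputFormats) {
		try {
			Date d = new SimpleDateFormat(f).parse("2014/05/07");
			normalized = new SimpleDateFormat("yyyy-MM-dd'T'hh:mm:ss'Z'").format(d);
			break; // first pattern that parses wins
		} catch (ParseException e) {
			// fall through to the next pattern
		}
	}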
@@ -1,12 +1,14 @@

package eu.dnetlib.dhp.utils.saxon;

import org.apache.commons.lang3.StringUtils;

import net.sf.saxon.expr.XPathContext;
import net.sf.saxon.om.Item;
import net.sf.saxon.om.Sequence;
import net.sf.saxon.trans.XPathException;
import net.sf.saxon.value.SequenceType;
import net.sf.saxon.value.StringValue;
import org.apache.commons.lang3.StringUtils;

public class PickFirst extends AbstractExtensionFunction {

@@ -49,7 +51,9 @@ public class PickFirst extends AbstractExtensionFunction {

	@Override
	public SequenceType[] getArgumentTypes() {
		return new SequenceType[] {SequenceType.OPTIONAL_ITEM};
		return new SequenceType[] {
			SequenceType.OPTIONAL_ITEM
		};
	}

	@Override
@@ -1,9 +1,12 @@

package eu.dnetlib.dhp.utils.saxon;

import java.io.StringReader;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.stream.StreamSource;

import net.sf.saxon.Configuration;
import net.sf.saxon.TransformerFactoryImpl;
@@ -1,9 +1,11 @@

package eu.dnetlib.message;

import java.io.IOException;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.IOException;
import java.util.Map;

public class Message {

@@ -20,7 +22,8 @@ public class Message {
		return jsonMapper.readValue(json, Message.class);
	}

	public Message() {}
	public Message() {
	}

	public Message(String workflowId, String jobName, MessageType type, Map<String, String> body) {
		this.workflowId = workflowId;
@@ -1,12 +1,14 @@

package eu.dnetlib.message;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.LinkedBlockingQueue;

import com.rabbitmq.client.AMQP;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.DefaultConsumer;
import com.rabbitmq.client.Envelope;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.LinkedBlockingQueue;

public class MessageConsumer extends DefaultConsumer {
@@ -1,14 +1,16 @@

package eu.dnetlib.message;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeoutException;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class MessageManager {

	private final String messageHost;

@@ -19,7 +21,7 @@ public class MessageManager {

	private Connection connection;

	private Map<String, Channel> channels = new HashMap<>();
	private final Map<String, Channel> channels = new HashMap<>();

	private boolean durable;
@@ -1,6 +1,6 @@

package eu.dnetlib.message;

public enum MessageType {
	ONGOING,
	REPORT
	ONGOING, REPORT
}
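The message classes above change only in layout. For orientation, a hedged sketch of the round trip that the MessageTest below asserts (the toString-produces-JSON behavior is inferred from that test, not shown here):

	Map<String, String> body = new HashMap<>();
	body.put("ExecutionTime", "30s");
	body.put("parsedItem", "300");
	Message m = new Message("wId", "Collection", MessageType.ONGOING, body);
	String json = m.toString(); // expected to match the JSON literal asserted in MessageTest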
@@ -1,3 +1,4 @@

package eu.dnetlib.scholexplorer.relation;

import java.io.Serializable;
@@ -1,16 +1,18 @@

package eu.dnetlib.scholexplorer.relation;

import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.Serializable;
import java.util.HashMap;

import org.apache.commons.io.IOUtils;

import com.fasterxml.jackson.databind.ObjectMapper;

public class RelationMapper extends HashMap<String, RelInfo> implements Serializable {

	public static RelationMapper load() throws Exception {

		final String json =
			IOUtils.toString(RelationMapper.class.getResourceAsStream("relations.json"));
		final String json = IOUtils.toString(RelationMapper.class.getResourceAsStream("relations.json"));

		ObjectMapper mapper = new ObjectMapper();
		return mapper.readValue(json, RelationMapper.class);
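RelationMapper is just a HashMap keyed by relation name and backed by the bundled relations.json. A minimal consuming sketch (the lookup key is an assumption about that file's content):

	RelationMapper relMapper = RelationMapper.load();
	RelInfo info = relMapper.get("issupplementto"); // hypothetical key from relations.json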
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.application;

import static org.junit.jupiter.api.Assertions.assertEquals;

@@ -10,12 +11,13 @@ public class ArgumentApplicationParserTest {

	@Test
	public void testParseParameter() throws Exception {
		final String jsonConfiguration =
			IOUtils.toString(
		final String jsonConfiguration = IOUtils
			.toString(
				this.getClass().getResourceAsStream("/eu/dnetlib/application/parameters.json"));
		assertNotNull(jsonConfiguration);
		ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
		parser.parseArgument(
		parser
			.parseArgument(
				new String[] {
					"-p",
					"value0",
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.common;

import static org.junit.jupiter.api.Assertions.*;

@@ -8,6 +9,7 @@ import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

import org.apache.hadoop.conf.Configuration;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;

@@ -65,8 +67,9 @@ public class HdfsSupportTest {

	// then
	assertEquals(2, paths.size());
	List<String> expecteds =
		Arrays.stream(new String[] {subDir1.toString(), subDir2.toString()})
	List<String> expecteds = Arrays.stream(new String[] {
		subDir1.toString(), subDir2.toString()
	})
		.sorted()
		.collect(Collectors.toList());
	List<String> actuals = paths.stream().sorted().collect(Collectors.toList());
@@ -1,14 +1,17 @@

package eu.dnetlib.dhp.common;

import static org.mockito.Mockito.*;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;
import java.util.function.Function;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;

import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.ThrowingConsumer;

public class SparkSessionSupportTest {

	@Nested
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.model.mdstore;

import static org.junit.jupiter.api.Assertions.assertTrue;
@@ -1,3 +1,4 @@

package eu.dnetlib.message;

import static org.junit.jupiter.api.Assertions.*;

@@ -5,6 +6,7 @@ import static org.junit.jupiter.api.Assertions.*;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.junit.jupiter.api.Test;

public class MessageTest {

@@ -33,8 +35,7 @@ public class MessageTest {

	@Test
	public void toStringTest() {
		final String expectedJson =
			"{\"workflowId\":\"wId\",\"jobName\":\"Collection\",\"type\":\"ONGOING\",\"body\":{\"ExecutionTime\":\"30s\",\"parsedItem\":\"300\"}}";
		final String expectedJson = "{\"workflowId\":\"wId\",\"jobName\":\"Collection\",\"type\":\"ONGOING\",\"body\":{\"ExecutionTime\":\"30s\",\"parsedItem\":\"300\"}}";
		Message m = new Message();
		m.setWorkflowId("wId");
		m.setType(MessageType.ONGOING);
@@ -1,3 +1,4 @@

package eu.dnetlib.scholexplorer.relation;

import org.junit.jupiter.api.Test;
@@ -12,7 +12,7 @@
	<artifactId>dhp-schemas</artifactId>
	<packaging>jar</packaging>


	<description>This module contains common schema classes meant to be used across the dnet-hadoop submodules</description>

	<dependencies>
@@ -1,9 +1,12 @@

package eu.dnetlib.dhp.schema.action;

import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import eu.dnetlib.dhp.schema.oaf.Oaf;
import java.io.Serializable;

import com.fasterxml.jackson.databind.annotation.JsonDeserialize;

import eu.dnetlib.dhp.schema.oaf.Oaf;

@JsonDeserialize(using = AtomicActionDeserializer.class)
public class AtomicAction<T extends Oaf> implements Serializable {

@@ -11,7 +14,8 @@ public class AtomicAction<T extends Oaf> implements Serializable {

	private T payload;

	public AtomicAction() {}
	public AtomicAction() {
	}

	public AtomicAction(Class<T> clazz, T payload) {
		this.clazz = clazz;
@@ -1,19 +1,22 @@

package eu.dnetlib.dhp.schema.action;

import java.io.IOException;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import eu.dnetlib.dhp.schema.oaf.Oaf;
import java.io.IOException;

public class AtomicActionDeserializer extends JsonDeserializer {

	@Override
	public Object deserialize(JsonParser jp, DeserializationContext ctxt)
		throws IOException, JsonProcessingException {
		throws IOException {
		JsonNode node = jp.getCodec().readTree(jp);
		String classTag = node.get("clazz").asText();
		JsonNode payload = node.get("payload");
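Signature-wise, dropping JsonProcessingException is harmless since it extends IOException. A hedged sketch of the round trip this deserializer enables (the Relation payload type is illustrative):

	AtomicAction<Relation> aa = new AtomicAction<>(Relation.class, new Relation());
	ObjectMapper mapper = new ObjectMapper();
	String json = mapper.writeValueAsString(aa);                         // {"clazz":"...","payload":{...}}
	AtomicAction restored = mapper.readValue(json, AtomicAction.class);  // routed through this deserializer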
@@ -1,16 +1,11 @@

package eu.dnetlib.dhp.schema.common;

import eu.dnetlib.dhp.schema.oaf.OafEntity;

/** Actual entity types in the Graph */
public enum EntityType {
	publication,
	dataset,
	otherresearchproduct,
	software,
	datasource,
	organization,
	project;
	publication, dataset, otherresearchproduct, software, datasource, organization, project;

	/**
	 * Resolves the EntityType, given the relative class name
@@ -1,9 +1,7 @@

package eu.dnetlib.dhp.schema.common;

/** Main entity types in the Graph */
public enum MainEntityType {
	result,
	datasource,
	organization,
	project
	result, datasource, organization, project
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.common;

import eu.dnetlib.dhp.schema.oaf.Qualifier;

@@ -5,36 +6,83 @@ import eu.dnetlib.dhp.schema.oaf.Qualifier;
public class ModelConstants {

	public static final String DNET_RESULT_TYPOLOGIES = "dnet:result_typologies";
	public static final String DNET_PUBLICATION_RESOURCE = "dnet:publication_resource";
	public static final String DNET_ACCESS_MODES = "dnet:access_modes";
	public static final String DNET_LANGUAGES = "dnet:languages";
	public static final String DNET_PID_TYPES = "dnet:pid_types";
	public static final String DNET_DATA_CITE_DATE = "dnet:dataCite_date";
	public static final String DNET_DATA_CITE_RESOURCE = "dnet:dataCite_resource";
	public static final String DNET_PROVENANCE_ACTIONS = "dnet:provenanceActions";

	public static final String SYSIMPORT_CROSSWALK_REPOSITORY = "sysimport:crosswalk:repository";
	public static final String SYSIMPORT_CROSSWALK_ENTITYREGISTRY = "sysimport:crosswalk:entityregistry";
	public static final String USER_CLAIM = "user:claim";

	public static final String DATASET_RESULTTYPE_CLASSID = "dataset";
	public static final String PUBLICATION_RESULTTYPE_CLASSID = "publication";
	public static final String SOFTWARE_RESULTTYPE_CLASSID = "software";
	public static final String ORP_RESULTTYPE_CLASSID = "other";

	public static Qualifier PUBLICATION_DEFAULT_RESULTTYPE = new Qualifier();
	public static Qualifier DATASET_DEFAULT_RESULTTYPE = new Qualifier();
	public static Qualifier SOFTWARE_DEFAULT_RESULTTYPE = new Qualifier();
	public static Qualifier ORP_DEFAULT_RESULTTYPE = new Qualifier();
	public static final String RESULT_RESULT = "resultResult";
	public static final String PUBLICATION_DATASET = "publicationDataset";
	public static final String IS_RELATED_TO = "isRelatedTo";
	public static final String SUPPLEMENT = "supplement";
	public static final String IS_SUPPLEMENT_TO = "isSupplementTo";
	public static final String IS_SUPPLEMENTED_BY = "isSupplementedBy";
	public static final String PART = "part";
	public static final String IS_PART_OF = "IsPartOf";
	public static final String HAS_PARTS = "HasParts";
	public static final String RELATIONSHIP = "relationship";

	static {
		PUBLICATION_DEFAULT_RESULTTYPE.setClassid(PUBLICATION_RESULTTYPE_CLASSID);
		PUBLICATION_DEFAULT_RESULTTYPE.setClassname(PUBLICATION_RESULTTYPE_CLASSID);
		PUBLICATION_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
		PUBLICATION_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
	public static final String RESULT_PROJECT = "resultProject";
	public static final String OUTCOME = "outcome";
	public static final String IS_PRODUCED_BY = "isProducedBy";
	public static final String PRODUCES = "produces";

		DATASET_DEFAULT_RESULTTYPE.setClassid(DATASET_RESULTTYPE_CLASSID);
		DATASET_DEFAULT_RESULTTYPE.setClassname(DATASET_RESULTTYPE_CLASSID);
		DATASET_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
		DATASET_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
	public static final String DATASOURCE_ORGANIZATION = "datasourceOrganization";
	public static final String PROVISION = "provision";
	public static final String IS_PROVIDED_BY = "isProvidedBy";
	public static final String PROVIDES = "provides";

		SOFTWARE_DEFAULT_RESULTTYPE.setClassid(SOFTWARE_RESULTTYPE_CLASSID);
		SOFTWARE_DEFAULT_RESULTTYPE.setClassname(SOFTWARE_RESULTTYPE_CLASSID);
		SOFTWARE_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
		SOFTWARE_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
	public static final String PROJECT_ORGANIZATION = "projectOrganization";
	public static final String PARTICIPATION = "participation";
	public static final String HAS_PARTICIPANT = "hasParticipant";
	public static final String IS_PARTICIPANT = "isParticipant";

		ORP_DEFAULT_RESULTTYPE.setClassid(ORP_RESULTTYPE_CLASSID);
		ORP_DEFAULT_RESULTTYPE.setClassname(ORP_RESULTTYPE_CLASSID);
		ORP_DEFAULT_RESULTTYPE.setSchemeid(DNET_RESULT_TYPOLOGIES);
		ORP_DEFAULT_RESULTTYPE.setSchemename(DNET_RESULT_TYPOLOGIES);
	public static final Qualifier PUBLICATION_DEFAULT_RESULTTYPE = qualifier(
		PUBLICATION_RESULTTYPE_CLASSID, PUBLICATION_RESULTTYPE_CLASSID,
		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);

	public static final Qualifier DATASET_DEFAULT_RESULTTYPE = qualifier(
		DATASET_RESULTTYPE_CLASSID, DATASET_RESULTTYPE_CLASSID,
		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);

	public static final Qualifier SOFTWARE_DEFAULT_RESULTTYPE = qualifier(
		SOFTWARE_RESULTTYPE_CLASSID, SOFTWARE_RESULTTYPE_CLASSID,
		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);

	public static final Qualifier ORP_DEFAULT_RESULTTYPE = qualifier(
		ORP_RESULTTYPE_CLASSID, ORP_RESULTTYPE_CLASSID,
		DNET_RESULT_TYPOLOGIES, DNET_RESULT_TYPOLOGIES);

	public static final Qualifier REPOSITORY_PROVENANCE_ACTIONS = qualifier(
		SYSIMPORT_CROSSWALK_REPOSITORY, SYSIMPORT_CROSSWALK_REPOSITORY,
		DNET_PROVENANCE_ACTIONS, DNET_PROVENANCE_ACTIONS);

	public static final Qualifier ENTITYREGISTRY_PROVENANCE_ACTION = qualifier(
		SYSIMPORT_CROSSWALK_ENTITYREGISTRY, SYSIMPORT_CROSSWALK_ENTITYREGISTRY,
		DNET_PROVENANCE_ACTIONS, DNET_PROVENANCE_ACTIONS);

	private static Qualifier qualifier(
		final String classid,
		final String classname,
		final String schemeid,
		final String schemename) {
		final Qualifier q = new Qualifier();
		q.setClassid(classid);
		q.setClassname(classname);
		q.setSchemeid(schemeid);
		q.setSchemename(schemename);
		return q;
	}
}
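This ModelConstants hunk replaces mutable public Qualifier fields initialized in a static block with public static final constants built by the private qualifier(...) factory shown above, and adds the relation-type string constants. A tiny consuming sketch (the getter on Qualifier is an assumption, matching the setters used above):

	Qualifier q = ModelConstants.PUBLICATION_DEFAULT_RESULTTYPE;
	// q.getClassid() == "publication"; schemeid and schemename == "dnet:result_typologies"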
@@ -13,7 +13,7 @@ import eu.dnetlib.dhp.schema.oaf.*;
public class ModelSupport {

	/** Defines the mapping between the actual entity type and the main entity type */
	private static Map<EntityType, MainEntityType> entityMapping = Maps.newHashMap();
	private static final Map<EntityType, MainEntityType> entityMapping = Maps.newHashMap();

	static {
		entityMapping.put(EntityType.publication, MainEntityType.result);

@@ -53,232 +53,6 @@ public class ModelSupport {
		oafTypes.put("relation", Relation.class);
	}

	public static final Map<String, String> entityIdPrefix = Maps.newHashMap();

	static {
		entityIdPrefix.put("datasource", "10");
		entityIdPrefix.put("organization", "20");
		entityIdPrefix.put("project", "40");
		entityIdPrefix.put("result", "50");
	}

	public static final Map<String, RelationInverse> relationInverseMap = Maps.newHashMap();

	static {
		relationInverseMap
			.put(
				"personResult_authorship_isAuthorOf", new RelationInverse()
					.setRelation("isAuthorOf")
					.setInverse("hasAuthor")
					.setRelType("personResult")
					.setSubReltype("authorship"));
		relationInverseMap
			.put(
				"personResult_authorship_hasAuthor", new RelationInverse()
					.setInverse("isAuthorOf")
					.setRelation("hasAuthor")
					.setRelType("personResult")
					.setSubReltype("authorship"));
		relationInverseMap
			.put(
				"projectOrganization_participation_isParticipant", new RelationInverse()
					.setRelation("isParticipant")
					.setInverse("hasParticipant")
					.setRelType("projectOrganization")
					.setSubReltype("participation"));
		relationInverseMap
			.put(
				"projectOrganization_participation_hasParticipant", new RelationInverse()
					.setInverse("isParticipant")
					.setRelation("hasParticipant")
					.setRelType("projectOrganization")
					.setSubReltype("participation"));
		relationInverseMap
			.put(
				"resultOrganization_affiliation_hasAuthorInstitution", new RelationInverse()
					.setRelation("hasAuthorInstitution")
					.setInverse("isAuthorInstitutionOf")
					.setRelType("resultOrganization")
					.setSubReltype("affiliation"));
		relationInverseMap
			.put(
				"resultOrganization_affiliation_isAuthorInstitutionOf", new RelationInverse()
					.setInverse("hasAuthorInstitution")
					.setRelation("isAuthorInstitutionOf")
					.setRelType("resultOrganization")
					.setSubReltype("affiliation"));
		relationInverseMap
			.put(
				"organizationOrganization_dedup_merges", new RelationInverse()
					.setRelation("merges")
					.setInverse("isMergedIn")
					.setRelType("organizationOrganization")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"organizationOrganization_dedup_isMergedIn", new RelationInverse()
					.setInverse("merges")
					.setRelation("isMergedIn")
					.setRelType("organizationOrganization")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"organizationOrganization_dedupSimilarity_isSimilarTo", new RelationInverse()
					.setInverse("isSimilarTo")
					.setRelation("isSimilarTo")
					.setRelType("organizationOrganization")
					.setSubReltype("dedupSimilarity"));

		relationInverseMap
			.put(
				"resultProject_outcome_isProducedBy", new RelationInverse()
					.setRelation("isProducedBy")
					.setInverse("produces")
					.setRelType("resultProject")
					.setSubReltype("outcome"));
		relationInverseMap
			.put(
				"resultProject_outcome_produces", new RelationInverse()
					.setInverse("isProducedBy")
					.setRelation("produces")
					.setRelType("resultProject")
					.setSubReltype("outcome"));
		relationInverseMap
			.put(
				"projectPerson_contactPerson_isContact", new RelationInverse()
					.setRelation("isContact")
					.setInverse("hasContact")
					.setRelType("projectPerson")
					.setSubReltype("contactPerson"));
		relationInverseMap
			.put(
				"projectPerson_contactPerson_hasContact", new RelationInverse()
					.setInverse("isContact")
					.setRelation("hasContact")
					.setRelType("personPerson")
					.setSubReltype("coAuthorship"));
		relationInverseMap
			.put(
				"personPerson_coAuthorship_isCoauthorOf", new RelationInverse()
					.setInverse("isCoAuthorOf")
					.setRelation("isCoAuthorOf")
					.setRelType("personPerson")
					.setSubReltype("coAuthorship"));
		relationInverseMap
			.put(
				"personPerson_dedup_merges", new RelationInverse()
					.setInverse("isMergedIn")
					.setRelation("merges")
					.setRelType("personPerson")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"personPerson_dedup_isMergedIn", new RelationInverse()
					.setInverse("merges")
					.setRelation("isMergedIn")
					.setRelType("personPerson")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"personPerson_dedupSimilarity_isSimilarTo", new RelationInverse()
					.setInverse("isSimilarTo")
					.setRelation("isSimilarTo")
					.setRelType("personPerson")
					.setSubReltype("dedupSimilarity"));
		relationInverseMap
			.put(
				"datasourceOrganization_provision_isProvidedBy", new RelationInverse()
					.setInverse("provides")
					.setRelation("isProvidedBy")
					.setRelType("datasourceOrganization")
					.setSubReltype("provision"));
		relationInverseMap
			.put(
				"datasourceOrganization_provision_provides", new RelationInverse()
					.setInverse("isProvidedBy")
					.setRelation("provides")
					.setRelType("datasourceOrganization")
					.setSubReltype("provision"));
		relationInverseMap
			.put(
				"resultResult_similarity_hasAmongTopNSimilarDocuments", new RelationInverse()
					.setInverse("isAmongTopNSimilarDocuments")
					.setRelation("hasAmongTopNSimilarDocuments")
					.setRelType("resultResult")
					.setSubReltype("similarity"));
		relationInverseMap
			.put(
				"resultResult_similarity_isAmongTopNSimilarDocuments", new RelationInverse()
					.setInverse("hasAmongTopNSimilarDocuments")
					.setRelation("isAmongTopNSimilarDocuments")
					.setRelType("resultResult")
					.setSubReltype("similarity"));
		relationInverseMap
			.put(
				"resultResult_relationship_isRelatedTo", new RelationInverse()
					.setInverse("isRelatedTo")
					.setRelation("isRelatedTo")
					.setRelType("resultResult")
					.setSubReltype("relationship"));
		relationInverseMap
			.put(
				"resultResult_similarity_isAmongTopNSimilarDocuments", new RelationInverse()
					.setInverse("hasAmongTopNSimilarDocuments")
					.setRelation("isAmongTopNSimilarDocuments")
					.setRelType("resultResult")
					.setSubReltype("similarity"));
		relationInverseMap
			.put(
				"resultResult_supplement_isSupplementTo", new RelationInverse()
					.setInverse("isSupplementedBy")
					.setRelation("isSupplementTo")
					.setRelType("resultResult")
					.setSubReltype("supplement"));
		relationInverseMap
			.put(
				"resultResult_supplement_isSupplementedBy", new RelationInverse()
					.setInverse("isSupplementTo")
					.setRelation("isSupplementedBy")
					.setRelType("resultResult")
					.setSubReltype("supplement"));
		relationInverseMap
			.put(
				"resultResult_part_isPartOf", new RelationInverse()
					.setInverse("hasPart")
					.setRelation("isPartOf")
					.setRelType("resultResult")
					.setSubReltype("part"));
		relationInverseMap
			.put(
				"resultResult_part_hasPart", new RelationInverse()
					.setInverse("isPartOf")
					.setRelation("hasPart")
					.setRelType("resultResult")
					.setSubReltype("part"));
		relationInverseMap
			.put(
				"resultResult_dedup_merges", new RelationInverse()
					.setInverse("isMergedIn")
					.setRelation("merges")
					.setRelType("resultResult")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"resultResult_dedup_isMergedIn", new RelationInverse()
					.setInverse("merges")
					.setRelation("isMergedIn")
					.setRelType("resultResult")
					.setSubReltype("dedup"));
		relationInverseMap
			.put(
				"resultResult_dedupSimilarity_isSimilarTo", new RelationInverse()
					.setInverse("isSimilarTo")
					.setRelation("isSimilarTo")
					.setRelType("resultResult")
					.setSubReltype("dedupSimilarity"));

	}

	private static final String schemeTemplate = "dnet:%s_%s_relations";

	private ModelSupport() {

@@ -428,4 +202,5 @@ public class ModelSupport {
	private static <T extends Oaf> String idFnForOafEntity(T t) {
		return ((OafEntity) t).getId();
	}

}
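The `@@ -53,232 +53,6 @@` hunk above records the removal of the entityIdPrefix and relationInverseMap blocks from ModelSupport (232 old lines collapse to 6). For reference, the removed map was presumably consumed roughly as follows (a sketch assuming getter counterparts of the fluent setters shown):

	RelationInverse ri = ModelSupport.relationInverseMap.get("resultProject_outcome_isProducedBy");
	String inverse = ri.getInverse(); // "produces"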
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -67,8 +68,10 @@ public class Author implements Serializable {

	@Override
	public boolean equals(Object o) {
		if (this == o) return true;
		if (o == null || getClass() != o.getClass()) return false;
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		Author author = (Author) o;
		return Objects.equals(fullname, author.fullname)
			&& Objects.equals(name, author.name)

@@ -83,14 +86,4 @@ public class Author implements Serializable {
		return Objects.hash(fullname, name, surname, rank, pid, affiliation);
	}

	public void addPid(StructuredProperty pid) {

		if (pid == null) return;

		if (this.pid == null) {
			this.pid = Arrays.asList(pid);
		} else {
			this.pid.add(pid);
		}
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -31,9 +32,12 @@ public class Context implements Serializable {

	@Override
	public boolean equals(Object obj) {
		if (this == obj) return true;
		if (obj == null) return false;
		if (getClass() != obj.getClass()) return false;
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;

		Context other = (Context) obj;
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.util.Objects;

@@ -16,9 +17,12 @@ public class Country extends Qualifier {

	@Override
	public boolean equals(Object o) {
		if (this == o) return true;
		if (o == null || getClass() != o.getClass()) return false;
		if (!super.equals(o)) return false;
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		if (!super.equals(o))
			return false;
		Country country = (Country) o;
		return Objects.equals(dataInfo, country.dataInfo);
	}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -62,8 +63,10 @@ public class DataInfo implements Serializable {

	@Override
	public boolean equals(Object o) {
		if (this == o) return true;
		if (o == null || getClass() != o.getClass()) return false;
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		DataInfo dataInfo = (DataInfo) o;
		return Objects.equals(invisible, dataInfo.invisible)
			&& Objects.equals(inferred, dataInfo.inferred)

@@ -75,7 +78,8 @@ public class DataInfo implements Serializable {

	@Override
	public int hashCode() {
		return Objects.hash(
		return Objects
			.hash(
				invisible, inferred, deletedbyinference, trust, inferenceprovenance, provenanceaction);
	}
}
@@ -1,9 +1,11 @@

package eu.dnetlib.dhp.schema.oaf;

import eu.dnetlib.dhp.schema.common.ModelConstants;
import java.io.Serializable;
import java.util.List;

import eu.dnetlib.dhp.schema.common.ModelConstants;

public class Dataset extends Result implements Serializable {

	private Field<String> storagedate;

@@ -90,8 +92,7 @@ public class Dataset extends Result implements Serializable {

	final Dataset d = (Dataset) e;

	storagedate =
		d.getStoragedate() != null && compareTrust(this, e) < 0 ? d.getStoragedate() : storagedate;
	storagedate = d.getStoragedate() != null && compareTrust(this, e) < 0 ? d.getStoragedate() : storagedate;

	device = d.getDevice() != null && compareTrust(this, e) < 0 ? d.getDevice() : device;

@@ -99,13 +100,11 @@ public class Dataset extends Result implements Serializable {

	version = d.getVersion() != null && compareTrust(this, e) < 0 ? d.getVersion() : version;

	lastmetadataupdate =
		d.getLastmetadataupdate() != null && compareTrust(this, e) < 0
	lastmetadataupdate = d.getLastmetadataupdate() != null && compareTrust(this, e) < 0
		? d.getLastmetadataupdate()
		: lastmetadataupdate;

	metadataversionnumber =
		d.getMetadataversionnumber() != null && compareTrust(this, e) < 0
	metadataversionnumber = d.getMetadataversionnumber() != null && compareTrust(this, e) < 0
		? d.getMetadataversionnumber()
		: metadataversionnumber;
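Every field in these mergeFrom hunks (here and in the long Datasource hunk below) follows one idiom, only re-wrapped by the formatter: keep the incoming value when it is non-null and the current record is less trusted. Isolated as a generic helper it would read (illustrative, not in the codebase):

	// incoming wins only when present and when compareTrust(this, other) < 0
	static <T> T mergeByTrust(T current, T incoming, int trustComparison) {
		return incoming != null && trustComparison < 0 ? incoming : current;
	}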
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -372,120 +373,93 @@ public class Datasource extends OafEntity implements Serializable {

	Datasource d = (Datasource) e;

	datasourcetype =
		d.getDatasourcetype() != null && compareTrust(this, e) < 0
	datasourcetype = d.getDatasourcetype() != null && compareTrust(this, e) < 0
		? d.getDatasourcetype()
		: datasourcetype;
	openairecompatibility =
		d.getOpenairecompatibility() != null && compareTrust(this, e) < 0
	openairecompatibility = d.getOpenairecompatibility() != null && compareTrust(this, e) < 0
		? d.getOpenairecompatibility()
		: openairecompatibility;
	officialname =
		d.getOfficialname() != null && compareTrust(this, e) < 0
	officialname = d.getOfficialname() != null && compareTrust(this, e) < 0
		? d.getOfficialname()
		: officialname;
	englishname =
		d.getEnglishname() != null && compareTrust(this, e) < 0 ? d.getEnglishname() : officialname;
	websiteurl =
		d.getWebsiteurl() != null && compareTrust(this, e) < 0 ? d.getWebsiteurl() : websiteurl;
	englishname = d.getEnglishname() != null && compareTrust(this, e) < 0 ? d.getEnglishname() : officialname;
	websiteurl = d.getWebsiteurl() != null && compareTrust(this, e) < 0 ? d.getWebsiteurl() : websiteurl;
	logourl = d.getLogourl() != null && compareTrust(this, e) < 0 ? d.getLogourl() : getLogourl();
	contactemail =
		d.getContactemail() != null && compareTrust(this, e) < 0
	contactemail = d.getContactemail() != null && compareTrust(this, e) < 0
		? d.getContactemail()
		: contactemail;
	namespaceprefix =
		d.getNamespaceprefix() != null && compareTrust(this, e) < 0
	namespaceprefix = d.getNamespaceprefix() != null && compareTrust(this, e) < 0
		? d.getNamespaceprefix()
		: namespaceprefix;
	latitude = d.getLatitude() != null && compareTrust(this, e) < 0 ? d.getLatitude() : latitude;
	longitude =
		d.getLongitude() != null && compareTrust(this, e) < 0 ? d.getLongitude() : longitude;
	dateofvalidation =
		d.getDateofvalidation() != null && compareTrust(this, e) < 0
	longitude = d.getLongitude() != null && compareTrust(this, e) < 0 ? d.getLongitude() : longitude;
	dateofvalidation = d.getDateofvalidation() != null && compareTrust(this, e) < 0
		? d.getDateofvalidation()
		: dateofvalidation;
	description =
		d.getDescription() != null && compareTrust(this, e) < 0 ? d.getDescription() : description;
	description = d.getDescription() != null && compareTrust(this, e) < 0 ? d.getDescription() : description;
	subjects = mergeLists(subjects, d.getSubjects());

	// opendoar specific fields (od*)
	odnumberofitems =
		d.getOdnumberofitems() != null && compareTrust(this, e) < 0
	odnumberofitems = d.getOdnumberofitems() != null && compareTrust(this, e) < 0
		? d.getOdnumberofitems()
		: odnumberofitems;
	odnumberofitemsdate =
		d.getOdnumberofitemsdate() != null && compareTrust(this, e) < 0
	odnumberofitemsdate = d.getOdnumberofitemsdate() != null && compareTrust(this, e) < 0
		? d.getOdnumberofitemsdate()
		: odnumberofitemsdate;
	odpolicies =
		d.getOdpolicies() != null && compareTrust(this, e) < 0 ? d.getOdpolicies() : odpolicies;
	odpolicies = d.getOdpolicies() != null && compareTrust(this, e) < 0 ? d.getOdpolicies() : odpolicies;
	odlanguages = mergeLists(odlanguages, d.getOdlanguages());
	odcontenttypes = mergeLists(odcontenttypes, d.getOdcontenttypes());
	accessinfopackage = mergeLists(accessinfopackage, d.getAccessinfopackage());

	// re3data fields
	releasestartdate =
		d.getReleasestartdate() != null && compareTrust(this, e) < 0
	releasestartdate = d.getReleasestartdate() != null && compareTrust(this, e) < 0
		? d.getReleasestartdate()
		: releasestartdate;
	releaseenddate =
		d.getReleaseenddate() != null && compareTrust(this, e) < 0
	releaseenddate = d.getReleaseenddate() != null && compareTrust(this, e) < 0
		? d.getReleaseenddate()
		: releaseenddate;
	missionstatementurl =
		d.getMissionstatementurl() != null && compareTrust(this, e) < 0
	missionstatementurl = d.getMissionstatementurl() != null && compareTrust(this, e) < 0
		? d.getMissionstatementurl()
		: missionstatementurl;
	dataprovider =
		d.getDataprovider() != null && compareTrust(this, e) < 0
	dataprovider = d.getDataprovider() != null && compareTrust(this, e) < 0
		? d.getDataprovider()
		: dataprovider;
	serviceprovider =
		d.getServiceprovider() != null && compareTrust(this, e) < 0
	serviceprovider = d.getServiceprovider() != null && compareTrust(this, e) < 0
		? d.getServiceprovider()
		: serviceprovider;

	// {open, restricted or closed}
	databaseaccesstype =
		d.getDatabaseaccesstype() != null && compareTrust(this, e) < 0
	databaseaccesstype = d.getDatabaseaccesstype() != null && compareTrust(this, e) < 0
		? d.getDatabaseaccesstype()
		: databaseaccesstype;

	// {open, restricted or closed}
	datauploadtype =
		d.getDatauploadtype() != null && compareTrust(this, e) < 0
	datauploadtype = d.getDatauploadtype() != null && compareTrust(this, e) < 0
		? d.getDatauploadtype()
		: datauploadtype;

	// {feeRequired, registration, other}
	databaseaccessrestriction =
		d.getDatabaseaccessrestriction() != null && compareTrust(this, e) < 0
	databaseaccessrestriction = d.getDatabaseaccessrestriction() != null && compareTrust(this, e) < 0
		? d.getDatabaseaccessrestriction()
		: databaseaccessrestriction;

	// {feeRequired, registration, other}
	datauploadrestriction =
		d.getDatauploadrestriction() != null && compareTrust(this, e) < 0
	datauploadrestriction = d.getDatauploadrestriction() != null && compareTrust(this, e) < 0
		? d.getDatauploadrestriction()
		: datauploadrestriction;

	versioning =
		d.getVersioning() != null && compareTrust(this, e) < 0 ? d.getVersioning() : versioning;
	citationguidelineurl =
		d.getCitationguidelineurl() != null && compareTrust(this, e) < 0
	versioning = d.getVersioning() != null && compareTrust(this, e) < 0 ? d.getVersioning() : versioning;
	citationguidelineurl = d.getCitationguidelineurl() != null && compareTrust(this, e) < 0
		? d.getCitationguidelineurl()
		: citationguidelineurl;

	// {yes, no, unknown}
	qualitymanagementkind =
		d.getQualitymanagementkind() != null && compareTrust(this, e) < 0
	qualitymanagementkind = d.getQualitymanagementkind() != null && compareTrust(this, e) < 0
		? d.getQualitymanagementkind()
		: qualitymanagementkind;
	pidsystems =
		d.getPidsystems() != null && compareTrust(this, e) < 0 ? d.getPidsystems() : pidsystems;
	pidsystems = d.getPidsystems() != null && compareTrust(this, e) < 0 ? d.getPidsystems() : pidsystems;

	certificates =
		d.getCertificates() != null && compareTrust(this, e) < 0
	certificates = d.getCertificates() != null && compareTrust(this, e) < 0
		? d.getCertificates()
		: certificates;
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -94,8 +95,10 @@ public class ExternalReference implements Serializable {

	@Override
	public boolean equals(Object o) {
		if (this == o) return true;
		if (o == null || getClass() != o.getClass()) return false;
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		ExternalReference that = (ExternalReference) o;
		return Objects.equals(sitename, that.sitename)
			&& Objects.equals(label, that.label)

@@ -109,7 +112,8 @@ public class ExternalReference implements Serializable {

	@Override
	public int hashCode() {
		return Objects.hash(
		return Objects
			.hash(
				sitename, label, url, description, qualifier, refidentifier, query, dataInfo);
	}
}
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -57,8 +58,10 @@ public class ExtraInfo implements Serializable {

	@Override
	public boolean equals(Object o) {
		if (this == o) return true;
		if (o == null || getClass() != o.getClass()) return false;
		if (this == o)
			return true;
		if (o == null || getClass() != o.getClass())
			return false;
		ExtraInfo extraInfo = (ExtraInfo) o;
		return Objects.equals(name, extraInfo.name)
			&& Objects.equals(typology, extraInfo.typology)
@@ -1,3 +1,4 @@

package eu.dnetlib.dhp.schema.oaf;

import java.io.Serializable;

@@ -31,9 +32,12 @@ public class Field<T> implements Serializable {

	@Override
	public boolean equals(Object obj) {
		if (this == obj) return true;
		if (obj == null) return false;
		if (getClass() != obj.getClass()) return false;
		if (this == obj)
			return true;
		if (obj == null)
			return false;
		if (getClass() != obj.getClass())
			return false;
		Field<T> other = (Field<T>) obj;
		return getValue().equals(other.getValue());
	}
@@ -1,9 +1,12 @@
+
 package eu.dnetlib.dhp.schema.oaf;

-import com.fasterxml.jackson.annotation.JsonIgnore;
 import java.io.Serializable;

 import org.apache.commons.lang3.StringUtils;

+import com.fasterxml.jackson.annotation.JsonIgnore;
+
 public class GeoLocation implements Serializable {

 	private String point;

@@ -44,7 +47,8 @@ public class GeoLocation implements Serializable {
 	public String toComparableString() {
 		return isBlank()
 			? ""
-			: String.format(
+			: String
+				.format(
 					"%s::%s%s",
 					point != null ? point.toLowerCase() : "",
 					box != null ? box.toLowerCase() : "",

@@ -58,9 +62,12 @@ public class GeoLocation implements Serializable {
 	@Override
 	public boolean equals(Object obj) {
-		if (this == obj) return true;
-		if (obj == null) return false;
-		if (getClass() != obj.getClass()) return false;
+		if (this == obj)
+			return true;
+		if (obj == null)
+			return false;
+		if (getClass() != obj.getClass())
+			return false;

 		GeoLocation other = (GeoLocation) obj;

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -121,7 +122,8 @@ public class Instance implements Serializable {
 	}

 	public String toComparableString() {
-		return String.format(
+		return String
+			.format(
 				"%s::%s::%s::%s",
 				hostedby != null && hostedby.getKey() != null ? hostedby.getKey().toLowerCase() : "",
 				accessright != null && accessright.getClassid() != null ? accessright.getClassid() : "",

@@ -136,9 +138,12 @@ public class Instance implements Serializable {
 	@Override
 	public boolean equals(Object obj) {
-		if (this == obj) return true;
-		if (obj == null) return false;
-		if (getClass() != obj.getClass()) return false;
+		if (this == obj)
+			return true;
+		if (obj == null)
+			return false;
+		if (getClass() != obj.getClass())
+			return false;

 		Instance other = (Instance) obj;

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -127,8 +128,10 @@ public class Journal implements Serializable {
 	@Override
 	public boolean equals(Object o) {
-		if (this == o) return true;
-		if (o == null || getClass() != o.getClass()) return false;
+		if (this == o)
+			return true;
+		if (o == null || getClass() != o.getClass())
+			return false;
 		Journal journal = (Journal) o;
 		return Objects.equals(name, journal.name)
 			&& Objects.equals(issnPrinted, journal.issnPrinted)

@@ -146,7 +149,8 @@ public class Journal implements Serializable {
 	@Override
 	public int hashCode() {
-		return Objects.hash(
+		return Objects
+			.hash(
 				name,
 				issnPrinted,
 				issnOnline,

@@ -1,9 +1,12 @@
+
 package eu.dnetlib.dhp.schema.oaf;

-import com.fasterxml.jackson.annotation.JsonIgnore;
 import java.io.Serializable;

 import org.apache.commons.lang3.StringUtils;

+import com.fasterxml.jackson.annotation.JsonIgnore;
+
 public class KeyValue implements Serializable {

 	private String key;

@@ -39,7 +42,8 @@ public class KeyValue implements Serializable {
 	public String toComparableString() {
 		return isBlank()
 			? ""
-			: String.format(
+			: String
+				.format(
 					"%s::%s",
 					key != null ? key.toLowerCase() : "", value != null ? value.toLowerCase() : "");
 	}

@@ -56,9 +60,12 @@ public class KeyValue implements Serializable {
 	@Override
 	public boolean equals(Object obj) {
-		if (this == obj) return true;
-		if (obj == null) return false;
-		if (getClass() != obj.getClass()) return false;
+		if (this == obj)
+			return true;
+		if (obj == null)
+			return false;
+		if (getClass() != obj.getClass())
+			return false;

 		KeyValue other = (KeyValue) obj;

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -17,8 +18,10 @@ public class OAIProvenance implements Serializable {
 	@Override
 	public boolean equals(Object o) {
-		if (this == o) return true;
-		if (o == null || getClass() != o.getClass()) return false;
+		if (this == o)
+			return true;
+		if (o == null || getClass() != o.getClass())
+			return false;
 		OAIProvenance that = (OAIProvenance) o;
 		return Objects.equals(originDescription, that.originDescription);
 	}

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -6,6 +7,9 @@ import java.util.Objects;
 public abstract class Oaf implements Serializable {

+	/**
+	 * The list of datasource id/name pairs providing this relationship.
+	 */
 	protected List<KeyValue> collectedfrom;

 	private DataInfo dataInfo;

@@ -37,11 +41,13 @@ public abstract class Oaf implements Serializable {
 	}

 	public void mergeOAFDataInfo(Oaf e) {
-		if (e.getDataInfo() != null && compareTrust(this, e) < 0) dataInfo = e.getDataInfo();
+		if (e.getDataInfo() != null && compareTrust(this, e) < 0)
+			dataInfo = e.getDataInfo();
 	}

 	protected String extractTrust(Oaf e) {
-		if (e == null || e.getDataInfo() == null || e.getDataInfo().getTrust() == null) return "0.0";
+		if (e == null || e.getDataInfo() == null || e.getDataInfo().getTrust() == null)
+			return "0.0";
 		return e.getDataInfo().getTrust();
 	}

@@ -51,8 +57,10 @@ public abstract class Oaf implements Serializable {
 	@Override
 	public boolean equals(Object o) {
-		if (this == o) return true;
-		if (o == null || getClass() != o.getClass()) return false;
+		if (this == o)
+			return true;
+		if (o == null || getClass() != o.getClass())
+			return false;
 		Oaf oaf = (Oaf) o;
 		return Objects.equals(dataInfo, oaf.dataInfo)
 			&& Objects.equals(lastupdatetimestamp, oaf.lastupdatetimestamp);

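The Oaf hunks above carry the merge rule used throughout these classes: a field is overwritten only when the other record has a non-null value and a strictly higher trust, with a missing trust defaulting to "0.0". A minimal self-contained sketch of that idiom, assuming trust is serialized as a decimal string (the helper names here are illustrative, not the project's API):

public class TrustMergeDemo {

	// trust values are compared numerically after parsing the serialized strings
	static int compareTrust(String thisTrust, String thatTrust) {
		return Double.compare(Double.parseDouble(thisTrust), Double.parseDouble(thatTrust));
	}

	// keep 'candidate' only when it is non-null and belongs to the more trusted record
	static <T> T mergeByTrust(T current, T candidate, String thisTrust, String thatTrust) {
		return candidate != null && compareTrust(thisTrust, thatTrust) < 0 ? candidate : current;
	}

	public static void main(String[] args) {
		System.out.println(mergeByTrust("old title", "new title", "0.8", "0.9")); // new title
		System.out.println(mergeByTrust("old title", null, "0.8", "0.9")); // old title
	}
}
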
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -78,7 +79,8 @@ public abstract class OafEntity extends Oaf implements Serializable {
 	public void mergeFrom(OafEntity e) {

-		if (e == null) return;
+		if (e == null)
+			return;

 		originalId = mergeLists(originalId, e.getOriginalId());

@@ -100,7 +102,8 @@ public abstract class OafEntity extends Oaf implements Serializable {
 	protected <T> List<T> mergeLists(final List<T>... lists) {

-		return Arrays.stream(lists)
+		return Arrays
+			.stream(lists)
 			.filter(Objects::nonNull)
 			.flatMap(List::stream)
 			.distinct()

@@ -109,9 +112,12 @@ public abstract class OafEntity extends Oaf implements Serializable {
 	@Override
 	public boolean equals(Object o) {
-		if (this == o) return true;
-		if (o == null || getClass() != o.getClass()) return false;
-		if (!super.equals(o)) return false;
+		if (this == o)
+			return true;
+		if (o == null || getClass() != o.getClass())
+			return false;
+		if (!super.equals(o))
+			return false;
 		OafEntity oafEntity = (OafEntity) o;
 		return Objects.equals(id, oafEntity.id);
 	}

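mergeLists, reformatted in the hunk above, builds the distinct union of any number of possibly-null lists. A runnable sketch of the same stream pipeline (distinct() relies on the elements' value-based equals/hashCode):

import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class MergeListsDemo {

	@SafeVarargs
	static <T> List<T> mergeLists(final List<T>... lists) {
		return Arrays
			.stream(lists)
			.filter(Objects::nonNull) // null lists contribute nothing
			.flatMap(List::stream)
			.distinct() // de-duplicates via equals/hashCode
			.collect(Collectors.toList());
	}

	public static void main(String[] args) {
		System.out.println(mergeLists(Arrays.asList("a", "b"), null, Arrays.asList("b", "c")));
		// prints [a, b, c]
	}
}
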
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -175,50 +176,38 @@ public class Organization extends OafEntity implements Serializable {
 		}

 		final Organization o = (Organization) e;
-		legalshortname =
-			o.getLegalshortname() != null && compareTrust(this, e) < 0
+		legalshortname = o.getLegalshortname() != null && compareTrust(this, e) < 0
 			? o.getLegalshortname()
 			: legalshortname;
-		legalname =
-			o.getLegalname() != null && compareTrust(this, e) < 0 ? o.getLegalname() : legalname;
+		legalname = o.getLegalname() != null && compareTrust(this, e) < 0 ? o.getLegalname() : legalname;
 		alternativeNames = mergeLists(o.getAlternativeNames(), alternativeNames);
-		websiteurl =
-			o.getWebsiteurl() != null && compareTrust(this, e) < 0 ? o.getWebsiteurl() : websiteurl;
+		websiteurl = o.getWebsiteurl() != null && compareTrust(this, e) < 0 ? o.getWebsiteurl() : websiteurl;
 		logourl = o.getLogourl() != null && compareTrust(this, e) < 0 ? o.getLogourl() : logourl;
-		eclegalbody =
-			o.getEclegalbody() != null && compareTrust(this, e) < 0 ? o.getEclegalbody() : eclegalbody;
-		eclegalperson =
-			o.getEclegalperson() != null && compareTrust(this, e) < 0
+		eclegalbody = o.getEclegalbody() != null && compareTrust(this, e) < 0 ? o.getEclegalbody() : eclegalbody;
+		eclegalperson = o.getEclegalperson() != null && compareTrust(this, e) < 0
 			? o.getEclegalperson()
 			: eclegalperson;
-		ecnonprofit =
-			o.getEcnonprofit() != null && compareTrust(this, e) < 0 ? o.getEcnonprofit() : ecnonprofit;
-		ecresearchorganization =
-			o.getEcresearchorganization() != null && compareTrust(this, e) < 0
+		ecnonprofit = o.getEcnonprofit() != null && compareTrust(this, e) < 0 ? o.getEcnonprofit() : ecnonprofit;
+		ecresearchorganization = o.getEcresearchorganization() != null && compareTrust(this, e) < 0
 			? o.getEcresearchorganization()
 			: ecresearchorganization;
-		echighereducation =
-			o.getEchighereducation() != null && compareTrust(this, e) < 0
+		echighereducation = o.getEchighereducation() != null && compareTrust(this, e) < 0
 			? o.getEchighereducation()
 			: echighereducation;
-		ecinternationalorganizationeurinterests =
-			o.getEcinternationalorganizationeurinterests() != null && compareTrust(this, e) < 0
+		ecinternationalorganizationeurinterests = o.getEcinternationalorganizationeurinterests() != null
+			&& compareTrust(this, e) < 0
 			? o.getEcinternationalorganizationeurinterests()
 			: ecinternationalorganizationeurinterests;
-		ecinternationalorganization =
-			o.getEcinternationalorganization() != null && compareTrust(this, e) < 0
+		ecinternationalorganization = o.getEcinternationalorganization() != null && compareTrust(this, e) < 0
 			? o.getEcinternationalorganization()
 			: ecinternationalorganization;
-		ecenterprise =
-			o.getEcenterprise() != null && compareTrust(this, e) < 0
+		ecenterprise = o.getEcenterprise() != null && compareTrust(this, e) < 0
 			? o.getEcenterprise()
 			: ecenterprise;
-		ecsmevalidated =
-			o.getEcsmevalidated() != null && compareTrust(this, e) < 0
+		ecsmevalidated = o.getEcsmevalidated() != null && compareTrust(this, e) < 0
 			? o.getEcsmevalidated()
 			: ecsmevalidated;
-		ecnutscode =
-			o.getEcnutscode() != null && compareTrust(this, e) < 0 ? o.getEcnutscode() : ecnutscode;
+		ecnutscode = o.getEcnutscode() != null && compareTrust(this, e) < 0 ? o.getEcnutscode() : ecnutscode;
 		country = o.getCountry() != null && compareTrust(this, e) < 0 ? o.getCountry() : country;
 		mergeOAFDataInfo(o);
 	}

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -67,8 +68,10 @@ public class OriginDescription implements Serializable {
 	@Override
 	public boolean equals(Object o) {
-		if (this == o) return true;
-		if (o == null || getClass() != o.getClass()) return false;
+		if (this == o)
+			return true;
+		if (o == null || getClass() != o.getClass())
+			return false;
 		OriginDescription that = (OriginDescription) o;
 		return Objects.equals(harvestDate, that.harvestDate)
 			&& Objects.equals(altered, that.altered)

@@ -1,9 +1,11 @@
+
 package eu.dnetlib.dhp.schema.oaf;

-import eu.dnetlib.dhp.schema.common.ModelConstants;
 import java.io.Serializable;
 import java.util.List;

+import eu.dnetlib.dhp.schema.common.ModelConstants;
+
 public class OtherResearchProduct extends Result implements Serializable {

 	private List<Field<String>> contactperson;

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -275,63 +276,48 @@ public class Project extends OafEntity implements Serializable {
 		Project p = (Project) e;

-		websiteurl =
-			p.getWebsiteurl() != null && compareTrust(this, e) < 0 ? p.getWebsiteurl() : websiteurl;
+		websiteurl = p.getWebsiteurl() != null && compareTrust(this, e) < 0 ? p.getWebsiteurl() : websiteurl;
 		code = p.getCode() != null && compareTrust(this, e) < 0 ? p.getCode() : code;
 		acronym = p.getAcronym() != null && compareTrust(this, e) < 0 ? p.getAcronym() : acronym;
 		title = p.getTitle() != null && compareTrust(this, e) < 0 ? p.getTitle() : title;
-		startdate =
-			p.getStartdate() != null && compareTrust(this, e) < 0 ? p.getStartdate() : startdate;
+		startdate = p.getStartdate() != null && compareTrust(this, e) < 0 ? p.getStartdate() : startdate;
 		enddate = p.getEnddate() != null && compareTrust(this, e) < 0 ? p.getEnddate() : enddate;
-		callidentifier =
-			p.getCallidentifier() != null && compareTrust(this, e) < 0
+		callidentifier = p.getCallidentifier() != null && compareTrust(this, e) < 0
 			? p.getCallidentifier()
 			: callidentifier;
 		keywords = p.getKeywords() != null && compareTrust(this, e) < 0 ? p.getKeywords() : keywords;
 		duration = p.getDuration() != null && compareTrust(this, e) < 0 ? p.getDuration() : duration;
 		ecsc39 = p.getEcsc39() != null && compareTrust(this, e) < 0 ? p.getEcsc39() : ecsc39;
-		oamandatepublications =
-			p.getOamandatepublications() != null && compareTrust(this, e) < 0
+		oamandatepublications = p.getOamandatepublications() != null && compareTrust(this, e) < 0
 			? p.getOamandatepublications()
 			: oamandatepublications;
-		ecarticle29_3 =
-			p.getEcarticle29_3() != null && compareTrust(this, e) < 0
+		ecarticle29_3 = p.getEcarticle29_3() != null && compareTrust(this, e) < 0
 			? p.getEcarticle29_3()
 			: ecarticle29_3;
 		subjects = mergeLists(subjects, p.getSubjects());
 		fundingtree = mergeLists(fundingtree, p.getFundingtree());
-		contracttype =
-			p.getContracttype() != null && compareTrust(this, e) < 0
+		contracttype = p.getContracttype() != null && compareTrust(this, e) < 0
 			? p.getContracttype()
 			: contracttype;
-		optional1 =
-			p.getOptional1() != null && compareTrust(this, e) < 0 ? p.getOptional1() : optional1;
-		optional2 =
-			p.getOptional2() != null && compareTrust(this, e) < 0 ? p.getOptional2() : optional2;
-		jsonextrainfo =
-			p.getJsonextrainfo() != null && compareTrust(this, e) < 0
+		optional1 = p.getOptional1() != null && compareTrust(this, e) < 0 ? p.getOptional1() : optional1;
+		optional2 = p.getOptional2() != null && compareTrust(this, e) < 0 ? p.getOptional2() : optional2;
+		jsonextrainfo = p.getJsonextrainfo() != null && compareTrust(this, e) < 0
 			? p.getJsonextrainfo()
 			: jsonextrainfo;
-		contactfullname =
-			p.getContactfullname() != null && compareTrust(this, e) < 0
+		contactfullname = p.getContactfullname() != null && compareTrust(this, e) < 0
 			? p.getContactfullname()
 			: contactfullname;
-		contactfax =
-			p.getContactfax() != null && compareTrust(this, e) < 0 ? p.getContactfax() : contactfax;
-		contactphone =
-			p.getContactphone() != null && compareTrust(this, e) < 0
+		contactfax = p.getContactfax() != null && compareTrust(this, e) < 0 ? p.getContactfax() : contactfax;
+		contactphone = p.getContactphone() != null && compareTrust(this, e) < 0
 			? p.getContactphone()
 			: contactphone;
-		contactemail =
-			p.getContactemail() != null && compareTrust(this, e) < 0
+		contactemail = p.getContactemail() != null && compareTrust(this, e) < 0
 			? p.getContactemail()
 			: contactemail;
 		summary = p.getSummary() != null && compareTrust(this, e) < 0 ? p.getSummary() : summary;
 		currency = p.getCurrency() != null && compareTrust(this, e) < 0 ? p.getCurrency() : currency;
-		totalcost =
-			p.getTotalcost() != null && compareTrust(this, e) < 0 ? p.getTotalcost() : totalcost;
-		fundedamount =
-			p.getFundedamount() != null && compareTrust(this, e) < 0
+		totalcost = p.getTotalcost() != null && compareTrust(this, e) < 0 ? p.getTotalcost() : totalcost;
+		fundedamount = p.getFundedamount() != null && compareTrust(this, e) < 0
 			? p.getFundedamount()
 			: fundedamount;
 		mergeOAFDataInfo(e);

@@ -1,8 +1,10 @@
+
 package eu.dnetlib.dhp.schema.oaf;

-import eu.dnetlib.dhp.schema.common.ModelConstants;
 import java.io.Serializable;

+import eu.dnetlib.dhp.schema.common.ModelConstants;
+
 public class Publication extends Result implements Serializable {

 	// publication specific

@@ -30,7 +32,8 @@ public class Publication extends Result implements Serializable {
 		Publication p = (Publication) e;

-		if (p.getJournal() != null && compareTrust(this, e) < 0) journal = p.getJournal();
+		if (p.getJournal() != null && compareTrust(this, e) < 0)
+			journal = p.getJournal();
 		mergeOAFDataInfo(e);
 	}
 }

@@ -1,9 +1,12 @@
+
 package eu.dnetlib.dhp.schema.oaf;

-import com.fasterxml.jackson.annotation.JsonIgnore;
 import java.io.Serializable;

 import org.apache.commons.lang3.StringUtils;

+import com.fasterxml.jackson.annotation.JsonIgnore;
+
 public class Qualifier implements Serializable {

 	private String classid;

@@ -46,7 +49,8 @@ public class Qualifier implements Serializable {
 	public String toComparableString() {
 		return isBlank()
 			? ""
-			: String.format(
+			: String
+				.format(
 					"%s::%s::%s::%s",
 					classid != null ? classid : "",
 					classname != null ? classname : "",

@@ -69,9 +73,12 @@ public class Qualifier implements Serializable {
 	@Override
 	public boolean equals(Object obj) {
-		if (this == obj) return true;
-		if (obj == null) return false;
-		if (getClass() != obj.getClass()) return false;
+		if (this == obj)
+			return true;
+		if (obj == null)
+			return false;
+		if (getClass() != obj.getClass())
+			return false;

 		Qualifier other = (Qualifier) obj;

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import static com.google.common.base.Preconditions.checkArgument;

@@ -6,16 +7,38 @@ import java.util.*;
 import java.util.stream.Collectors;
 import java.util.stream.Stream;

+/**
+ * Relation models any edge between two nodes in the OpenAIRE graph. It has a source id and a target id pointing to
+ * graph node identifiers and it is further characterised by the semantic of the link through the fields relType,
+ * subRelType and relClass. Provenance information is modeled according to the dataInfo element and collectedFrom, while
+ * individual relationship types can provide extra information via the properties field.
+ */
 public class Relation extends Oaf {

+	/**
+	 * Main relationship classifier, values include 'resultResult', 'resultProject', 'resultOrganization', etc.
+	 */
 	private String relType;

+	/**
+	 * Further classifies a relationship, values include 'affiliation', 'similarity', 'supplement', etc.
+	 */
 	private String subRelType;

+	/**
+	 * Indicates the direction of the relationship, values include 'isSupplementTo', 'isSupplementedBy', 'merges',
+	 * 'isMergedIn'.
+	 */
 	private String relClass;

+	/**
+	 * The source entity id.
+	 */
 	private String source;

+	/**
+	 * The target entity id.
+	 */
 	private String target;

 	public String getRelType() {

@@ -68,11 +91,14 @@ public class Relation extends Oaf {
 		checkArgument(Objects.equals(getRelClass(), r.getRelClass()), "relClass(es) must be equal");

 		setCollectedfrom(
-			Stream.concat(
-					Optional.ofNullable(getCollectedfrom())
+			Stream
+				.concat(
+					Optional
+						.ofNullable(getCollectedfrom())
 						.map(Collection::stream)
 						.orElse(Stream.empty()),
-					Optional.ofNullable(r.getCollectedfrom())
+					Optional
+						.ofNullable(r.getCollectedfrom())
 						.map(Collection::stream)
 						.orElse(Stream.empty()))
 				.distinct() // relies on KeyValue.equals

@@ -81,8 +107,10 @@ public class Relation extends Oaf {
 	@Override
 	public boolean equals(Object o) {
-		if (this == o) return true;
-		if (o == null || getClass() != o.getClass()) return false;
+		if (this == o)
+			return true;
+		if (o == null || getClass() != o.getClass())
+			return false;
 		Relation relation = (Relation) o;
 		return relType.equals(relation.relType)
 			&& subRelType.equals(relation.subRelType)

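Per the javadoc added above, a Relation is a typed, provenance-annotated edge, and the merge routine shown unions the collectedfrom lists of two otherwise equal edges. A hedged usage sketch, assuming the merge method is Relation#mergeFrom, the usual setters exist for the fields in the hunk, and the identifiers below are shortened, made-up examples:

// Hypothetical: the same edge harvested from two different datasources.
Relation r1 = new Relation();
r1.setRelType("resultProject");
r1.setSubRelType("outcome");
r1.setRelClass("isProducedBy");
r1.setSource("50|doi_24________::ab12"); // illustrative ids only
r1.setTarget("40|corda_______::cd34");

Relation r2 = new Relation();
r2.setRelType("resultProject");
r2.setSubRelType("outcome");
r2.setRelClass("isProducedBy");
r2.setSource("50|doi_24________::ab12");
r2.setTarget("40|corda_______::cd34");

// checkArgument verifies the two edges agree on relType/subRelType/relClass,
// then collectedfrom becomes the distinct union of both provenance lists.
r1.mergeFrom(r2);
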
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -233,9 +234,11 @@ public class Result extends OafEntity implements Serializable {
 		if (r.getBestaccessright() != null && compareTrust(this, r) < 0)
 			bestaccessright = r.getBestaccessright();

-		if (r.getResulttype() != null && compareTrust(this, r) < 0) resulttype = r.getResulttype();
+		if (r.getResulttype() != null && compareTrust(this, r) < 0)
+			resulttype = r.getResulttype();

-		if (r.getLanguage() != null && compareTrust(this, r) < 0) language = r.getLanguage();
+		if (r.getLanguage() != null && compareTrust(this, r) < 0)
+			language = r.getLanguage();

 		country = mergeLists(country, r.getCountry());

@@ -247,7 +250,8 @@ public class Result extends OafEntity implements Serializable {
 		description = longestLists(description, r.getDescription());

-		if (r.getPublisher() != null && compareTrust(this, r) < 0) publisher = r.getPublisher();
+		if (r.getPublisher() != null && compareTrust(this, r) < 0)
+			publisher = r.getPublisher();

 		if (r.getEmbargoenddate() != null && compareTrust(this, r) < 0)
 			embargoenddate = r.getEmbargoenddate();

@@ -260,7 +264,8 @@ public class Result extends OafEntity implements Serializable {
 		contributor = mergeLists(contributor, r.getContributor());

-		if (r.getResourcetype() != null) resourcetype = r.getResourcetype();
+		if (r.getResourcetype() != null)
+			resourcetype = r.getResourcetype();

 		coverage = mergeLists(coverage, r.getCoverage());

@@ -270,16 +275,17 @@ public class Result extends OafEntity implements Serializable {
 	}

 	private List<Field<String>> longestLists(List<Field<String>> a, List<Field<String>> b) {
-		if (a == null || b == null) return a == null ? b : a;
+		if (a == null || b == null)
+			return a == null ? b : a;
 		if (a.size() == b.size()) {
-			int msa =
-				a.stream()
+			int msa = a
+				.stream()
 					.filter(i -> i.getValue() != null)
 					.map(i -> i.getValue().length())
 					.max(Comparator.naturalOrder())
 					.orElse(0);
-			int msb =
-				b.stream()
+			int msb = b
+				.stream()
 					.filter(i -> i.getValue() != null)
 					.map(i -> i.getValue().length())
 					.max(Comparator.naturalOrder())

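longestLists, shown above, keeps the longer of two lists and breaks equal-size ties by the longest contained value. A simplified, self-contained sketch on plain strings (the real method works on Field<String>, and the final tie-break return is an assumption, since the hunk is cut off here):

import java.util.Comparator;
import java.util.List;
import java.util.Objects;

public class LongestListsDemo {

	static List<String> longestLists(List<String> a, List<String> b) {
		if (a == null || b == null)
			return a == null ? b : a;
		if (a.size() == b.size()) {
			// same number of entries: compare the longest single value in each list
			int msa = a.stream().filter(Objects::nonNull).map(String::length)
				.max(Comparator.naturalOrder()).orElse(0);
			int msb = b.stream().filter(Objects::nonNull).map(String::length)
				.max(Comparator.naturalOrder()).orElse(0);
			return msa > msb ? a : b; // assumed tie-break, mirroring the visible logic
		}
		return a.size() > b.size() ? a : b;
	}
}
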
@@ -1,9 +1,11 @@
+
 package eu.dnetlib.dhp.schema.oaf;

-import eu.dnetlib.dhp.schema.common.ModelConstants;
 import java.io.Serializable;
 import java.util.List;

+import eu.dnetlib.dhp.schema.common.ModelConstants;
+
 public class Software extends Result implements Serializable {

 	private List<Field<String>> documentationUrl;

@@ -63,13 +65,11 @@ public class Software extends Result implements Serializable {
 		license = mergeLists(license, s.getLicense());

-		codeRepositoryUrl =
-			s.getCodeRepositoryUrl() != null && compareTrust(this, s) < 0
+		codeRepositoryUrl = s.getCodeRepositoryUrl() != null && compareTrust(this, s) < 0
 			? s.getCodeRepositoryUrl()
 			: codeRepositoryUrl;

-		programmingLanguage =
-			s.getProgrammingLanguage() != null && compareTrust(this, s) < 0
+		programmingLanguage = s.getProgrammingLanguage() != null && compareTrust(this, s) < 0
 			? s.getProgrammingLanguage()
 			: programmingLanguage;

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import java.io.Serializable;

@@ -45,9 +46,12 @@ public class StructuredProperty implements Serializable {
 	@Override
 	public boolean equals(Object obj) {
-		if (this == obj) return true;
-		if (obj == null) return false;
-		if (getClass() != obj.getClass()) return false;
+		if (this == obj)
+			return true;
+		if (obj == null)
+			return false;
+		if (getClass() != obj.getClass())
+			return false;

 		StructuredProperty other = (StructuredProperty) obj;

@@ -1,13 +1,16 @@
+
 package eu.dnetlib.dhp.schema.scholexplorer;

-import eu.dnetlib.dhp.schema.oaf.Dataset;
-import eu.dnetlib.dhp.schema.oaf.OafEntity;
 import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;

 import org.apache.commons.lang3.StringUtils;

+import eu.dnetlib.dhp.schema.oaf.Dataset;
+import eu.dnetlib.dhp.schema.oaf.OafEntity;
+
 public class DLIDataset extends Dataset {

 	private String originalObjIdentifier;

@@ -46,7 +49,8 @@ public class DLIDataset extends Dataset {
 		DLIDataset p = (DLIDataset) e;
 		if (StringUtils.isBlank(completionStatus) && StringUtils.isNotBlank(p.completionStatus))
 			completionStatus = p.completionStatus;
-		if ("complete".equalsIgnoreCase(p.completionStatus)) completionStatus = "complete";
+		if ("complete".equalsIgnoreCase(p.completionStatus))
+			completionStatus = "complete";
 		dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
 	}

@@ -54,7 +58,8 @@ public class DLIDataset extends Dataset {
 		final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
 		Map<String, ProvenaceInfo> result = new HashMap<>();
 		if (a != null)
-			a.forEach(
+			a
+				.forEach(
 					p -> {
 						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
 							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())

@@ -66,7 +71,8 @@ public class DLIDataset extends Dataset {
 						result.put(p.getId(), p);
 					});
 		if (b != null)
-			b.forEach(
+			b
+				.forEach(
 					p -> {
 						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
 							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())

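DLIDataset's mergeProvenance (and the identical copies in DLIPublication and DLIUnknown below) indexes provenance entries by id and lets a 'complete' entry replace an 'incomplete' one for the same id. A hedged, self-contained sketch of that upgrade rule with a simplified value type standing in for ProvenaceInfo:

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MergeProvenanceDemo {

	// simplified stand-in for ProvenaceInfo: just an id and a completion status
	static class Prov {
		final String id;
		final String completionStatus;

		Prov(String id, String completionStatus) {
			this.id = id;
			this.completionStatus = completionStatus;
		}
	}

	static Map<String, Prov> mergeProvenance(List<Prov> a, List<Prov> b) {
		Map<String, Prov> result = new HashMap<>();
		if (a != null)
			a.forEach(p -> put(result, p));
		if (b != null)
			b.forEach(p -> put(result, p));
		return result;
	}

	private static void put(Map<String, Prov> result, Prov p) {
		if (p == null || p.id == null)
			return;
		Prov existing = result.get(p.id);
		// a 'complete' entry wins over an 'incomplete' one with the same id
		if (existing == null || ("incomplete".equalsIgnoreCase(existing.completionStatus)
			&& "complete".equalsIgnoreCase(p.completionStatus)))
			result.put(p.id, p);
	}
}
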
@@ -1,10 +1,13 @@
+
 package eu.dnetlib.dhp.schema.scholexplorer;

 import java.io.Serializable;
 import java.util.*;

 import org.apache.commons.lang3.StringUtils;

 import eu.dnetlib.dhp.schema.oaf.OafEntity;
 import eu.dnetlib.dhp.schema.oaf.Publication;
-import java.io.Serializable;
-import java.util.*;
-import org.apache.commons.lang3.StringUtils;

 public class DLIPublication extends Publication implements Serializable {

@@ -44,7 +47,8 @@ public class DLIPublication extends Publication implements Serializable {
 		DLIPublication p = (DLIPublication) e;
 		if (StringUtils.isBlank(completionStatus) && StringUtils.isNotBlank(p.completionStatus))
 			completionStatus = p.completionStatus;
-		if ("complete".equalsIgnoreCase(p.completionStatus)) completionStatus = "complete";
+		if ("complete".equalsIgnoreCase(p.completionStatus))
+			completionStatus = "complete";
 		dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
 	}

@@ -52,7 +56,8 @@ public class DLIPublication extends Publication implements Serializable {
 		final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
 		Map<String, ProvenaceInfo> result = new HashMap<>();
 		if (a != null)
-			a.forEach(
+			a
+				.forEach(
 					p -> {
 						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
 							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())

@@ -64,7 +69,8 @@ public class DLIPublication extends Publication implements Serializable {
 						result.put(p.getId(), p);
 					});
 		if (b != null)
-			b.forEach(
+			b
+				.forEach(
 					p -> {
 						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
 							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.scholexplorer;

 import eu.dnetlib.dhp.schema.oaf.Relation;

@@ -1,14 +1,17 @@
+
 package eu.dnetlib.dhp.schema.scholexplorer;

-import eu.dnetlib.dhp.schema.oaf.Oaf;
-import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
 import java.io.Serializable;
 import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;

 import org.apache.commons.lang3.StringUtils;

+import eu.dnetlib.dhp.schema.oaf.Oaf;
+import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
+
 public class DLIUnknown extends Oaf implements Serializable {

 	private String id;

@@ -72,7 +75,8 @@ public class DLIUnknown extends Oaf implements Serializable {
 	}

 	public void mergeFrom(DLIUnknown p) {
-		if ("complete".equalsIgnoreCase(p.completionStatus)) completionStatus = "complete";
+		if ("complete".equalsIgnoreCase(p.completionStatus))
+			completionStatus = "complete";
 		dlicollectedfrom = mergeProvenance(dlicollectedfrom, p.getDlicollectedfrom());
 	}

@@ -80,7 +84,8 @@ public class DLIUnknown extends Oaf implements Serializable {
 		final List<ProvenaceInfo> a, final List<ProvenaceInfo> b) {
 		Map<String, ProvenaceInfo> result = new HashMap<>();
 		if (a != null)
-			a.forEach(
+			a
+				.forEach(
 					p -> {
 						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
 							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())

@@ -92,7 +97,8 @@ public class DLIUnknown extends Oaf implements Serializable {
 						result.put(p.getId(), p);
 					});
 		if (b != null)
-			b.forEach(
+			b
+				.forEach(
 					p -> {
 						if (p != null && StringUtils.isNotBlank(p.getId()) && result.containsKey(p.getId())) {
 							if ("incomplete".equalsIgnoreCase(result.get(p.getId()).getCompletionStatus())

@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.schema.scholexplorer;

 import java.io.Serializable;

@@ -1,13 +1,17 @@
+
 package eu.dnetlib.dhp.schema.action;

 import static org.junit.jupiter.api.Assertions.*;

-import com.fasterxml.jackson.databind.ObjectMapper;
-import eu.dnetlib.dhp.schema.oaf.Relation;
 import java.io.IOException;

 import org.apache.commons.lang3.StringUtils;
 import org.junit.jupiter.api.Test;

+import com.fasterxml.jackson.databind.ObjectMapper;
+
+import eu.dnetlib.dhp.schema.oaf.Relation;
+
 /** @author claudio.atzori */
 public class AtomicActionTest {

@@ -1,13 +1,15 @@
+
 package eu.dnetlib.dhp.schema.common;

 import static org.junit.jupiter.api.Assertions.assertFalse;
 import static org.junit.jupiter.api.Assertions.assertTrue;

+import org.junit.jupiter.api.Nested;
+import org.junit.jupiter.api.Test;
+
 import eu.dnetlib.dhp.schema.oaf.OafEntity;
 import eu.dnetlib.dhp.schema.oaf.Relation;
 import eu.dnetlib.dhp.schema.oaf.Result;
-import org.junit.jupiter.api.Nested;
-import org.junit.jupiter.api.Test;

 public class ModelSupportTest {

@@ -1,9 +1,11 @@
+
 package eu.dnetlib.dhp.schema.oaf;

 import static org.junit.jupiter.api.Assertions.*;

 import java.util.Arrays;
 import java.util.List;
+
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;

@@ -1,15 +1,19 @@
+
 package eu.dnetlib.dhp.schema.scholexplorer;

 import java.io.IOException;
 import java.util.Arrays;
 import java.util.Collections;

 import org.junit.jupiter.api.Test;

 import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.DeserializationFeature;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.fasterxml.jackson.databind.SerializationFeature;

 import eu.dnetlib.dhp.schema.oaf.Qualifier;
 import eu.dnetlib.dhp.schema.oaf.StructuredProperty;
-import java.io.IOException;
-import java.util.Arrays;
-import java.util.Collections;
-import org.junit.jupiter.api.Test;

 public class DLItest {

@@ -22,13 +26,17 @@ public class DLItest {
 		a1.setCompletionStatus("complete");

 		DLIPublication a = new DLIPublication();
-		a.setPid(
-			Arrays.asList(
+		a
+			.setPid(
+				Arrays
+					.asList(
 						createSP("10.11", "doi", "dnet:pid_types"),
 						createSP("123456", "pdb", "dnet:pid_types")));
 		a.setTitle(Collections.singletonList(createSP("A Title", "title", "dnetTitle")));
-		a.setDlicollectedfrom(
-			Arrays.asList(
+		a
+			.setDlicollectedfrom(
+				Arrays
+					.asList(
 						createCollectedFrom("dct", "datacite", "complete"),
 						createCollectedFrom("dct", "datacite", "incomplete")));
 		a.setCompletionStatus("incomplete");

@@ -42,8 +50,7 @@ public class DLItest {
 	@Test
 	public void testDeserialization() throws IOException {

-		final String json =
-			"..."; // same JSON fixture as on the following line, previously wrapped after '='
+		final String json = "{\"dataInfo\":{\"invisible\":false,\"inferred\":null,\"deletedbyinference\":false,\"trust\":\"0.9\",\"inferenceprovenance\":null,\"provenanceaction\":null},\"lastupdatetimestamp\":null,\"id\":\"60|bd9352547098929a394655ad1a44a479\",\"originalId\":[\"bd9352547098929a394655ad1a44a479\"],\"collectedfrom\":[{\"key\":\"dli_________::datacite\",\"value\":\"Datasets in Datacite\",\"dataInfo\":null,\"blank\":false}],\"pid\":[{\"value\":\"10.7925/DRS1.DUCHAS_5078760\",\"qualifier\":{\"classid\":\"doi\",\"classname\":\"doi\",\"schemeid\":\"dnet:pid_types\",\"schemename\":\"dnet:pid_types\",\"blank\":false},\"dataInfo\":null}],\"dateofcollection\":\"2020-01-09T08:29:31.885Z\",\"dateoftransformation\":null,\"extraInfo\":null,\"oaiprovenance\":null,\"author\":[{\"fullname\":\"Cathail, S. Ó\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Donnell, Breda Mc\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Ireland. Department of Arts, Culture, and the Gaeltacht\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"University College Dublin\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"National Folklore Foundation\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Cathail, S. Ó\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null},{\"fullname\":\"Donnell, Breda Mc\",\"name\":null,\"surname\":null,\"rank\":null,\"pid\":null,\"affiliation\":null}],\"resulttype\":null,\"language\":null,\"country\":null,\"subject\":[{\"value\":\"Recreation\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null},{\"value\":\"Entertainments and recreational activities\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null},{\"value\":\"Siamsaíocht agus caitheamh aimsire\",\"qualifier\":{\"classid\":\"dnet:subject\",\"classname\":\"dnet:subject\",\"schemeid\":\"unknown\",\"schemename\":\"unknown\",\"blank\":false},\"dataInfo\":null}],\"title\":[{\"value\":\"Games We Play\",\"qualifier\":null,\"dataInfo\":null}],\"relevantdate\":[{\"value\":\"1938-09-28\",\"qualifier\":{\"classid\":\"date\",\"classname\":\"date\",\"schemeid\":\"dnet::date\",\"schemename\":\"dnet::date\",\"blank\":false},\"dataInfo\":null}],\"description\":[{\"value\":\"Story collected by Breda Mc Donnell, a student at Tenure school (Tinure, Co. Louth) (no informant identified).\",\"dataInfo\":null}],\"dateofacceptance\":null,\"publisher\":{\"value\":\"University College Dublin\",\"dataInfo\":null},\"embargoenddate\":null,\"source\":null,\"fulltext\":null,\"format\":null,\"contributor\":null,\"resourcetype\":null,\"coverage\":null,\"refereed\":null,\"context\":null,\"processingchargeamount\":null,\"processingchargecurrency\":null,\"externalReference\":null,\"instance\":[],\"storagedate\":null,\"device\":null,\"size\":null,\"version\":null,\"lastmetadataupdate\":null,\"metadataversionnumber\":null,\"geolocation\":null,\"dlicollectedfrom\":[{\"id\":\"dli_________::datacite\",\"name\":\"Datasets in Datacite\",\"completionStatus\":\"complete\",\"collectionMode\":\"resolved\"}],\"completionStatus\":\"complete\"}";

 		ObjectMapper mapper = new ObjectMapper();
 		mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

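The deserialization test above disables Jackson's FAIL_ON_UNKNOWN_PROPERTIES so that JSON carrying extra fields still maps onto the schema classes. A minimal standalone illustration of the same switch (the Point target type is hypothetical):

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LenientMapperDemo {

	static class Point {
		public int x;
	}

	public static void main(String[] args) throws Exception {
		ObjectMapper mapper = new ObjectMapper();
		// unknown JSON properties ("y" here) are ignored instead of raising an exception
		mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
		Point p = mapper.readValue("{\"x\":1,\"y\":2}", Point.class);
		System.out.println(p.x); // 1
	}
}
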
@@ -1,8 +1,23 @@
+
 package eu.dnetlib.dhp.actionmanager;

+import java.io.Serializable;
+import java.io.StringReader;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.NoSuchElementException;
+import java.util.stream.Collectors;
+
+import org.dom4j.Document;
+import org.dom4j.Element;
+import org.dom4j.io.SAXReader;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.common.base.Splitter;
+import com.google.common.collect.Iterables;
+import com.google.common.collect.Lists;
+
+import eu.dnetlib.actionmanager.rmi.ActionManagerException;
+import eu.dnetlib.actionmanager.set.ActionManagerSet;
+import eu.dnetlib.actionmanager.set.ActionManagerSet.ImpactTypes;

@@ -10,26 +25,14 @@ import eu.dnetlib.dhp.actionmanager.partition.PartitionActionSetsByPayloadTypeJo
 import eu.dnetlib.dhp.utils.ISLookupClientFactory;
 import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
 import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
-import java.io.Serializable;
-import java.io.StringReader;
-import java.util.ArrayList;
-import java.util.List;
-import java.util.NoSuchElementException;
-import java.util.stream.Collectors;
-import org.dom4j.Document;
-import org.dom4j.Element;
-import org.dom4j.io.SAXReader;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;

 public class ISClient implements Serializable {

-	private static final Logger log =
-		LoggerFactory.getLogger(PartitionActionSetsByPayloadTypeJob.class);
+	private static final Logger log = LoggerFactory.getLogger(PartitionActionSetsByPayloadTypeJob.class);

 	private static final String INPUT_ACTION_SET_ID_SEPARATOR = ",";

-	private ISLookUpService isLookup;
+	private final ISLookUpService isLookup;

 	public ISClient(String isLookupUrl) {
 		isLookup = ISLookupClientFactory.getLookUpService(isLookupUrl);

@@ -37,14 +40,16 @@ public class ISClient implements Serializable {
 	public List<String> getLatestRawsetPaths(String setIds) {

-		List<String> ids =
-			Lists.newArrayList(
-				Splitter.on(INPUT_ACTION_SET_ID_SEPARATOR)
+		List<String> ids = Lists
+			.newArrayList(
+				Splitter
+					.on(INPUT_ACTION_SET_ID_SEPARATOR)
 					.omitEmptyStrings()
 					.trimResults()
 					.split(setIds));

-		return ids.stream()
+		return ids
+			.stream()
 			.map(id -> getSet(isLookup, id))
 			.map(as -> as.getPathToLatest())
 			.collect(Collectors.toCollection(ArrayList::new));

@@ -52,8 +57,7 @@ public class ISClient implements Serializable {
 	private ActionManagerSet getSet(ISLookUpService isLookup, final String setId) {

-		final String q =
-			"for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') "
+		final String q = "for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') "
 			+ "where $x//SET/@id = '"
 			+ setId
 			+ "' return $x";

@@ -78,7 +82,8 @@ public class ISClient implements Serializable {
 		set.setId(doc.valueOf("//SET/@id").trim());
 		set.setName(doc.valueOf("//SET").trim());
 		set.setImpact(ImpactTypes.valueOf(doc.valueOf("//IMPACT").trim()));
-		set.setLatest(
+		set
+			.setLatest(
 				doc.valueOf("//RAW_SETS/LATEST/@id"),
 				doc.valueOf("//RAW_SETS/LATEST/@creationDate"),
 				doc.valueOf("//RAW_SETS/LATEST/@lastUpdate"));

@@ -87,7 +92,8 @@ public class ISClient implements Serializable {
 		if (expiredNodes != null) {
 			for (int i = 0; i < expiredNodes.size(); i++) {
 				Element ex = (Element) expiredNodes.get(i);
-				set.addExpired(
+				set
+					.addExpired(
 						ex.attributeValue("id"),
 						ex.attributeValue("creationDate"),
 						ex.attributeValue("lastUpdate"));

@@ -114,8 +120,7 @@ public class ISClient implements Serializable {
 	private String queryServiceProperty(ISLookUpService isLookup, final String propertyName)
 		throws ActionManagerException {
-		final String q =
-			"for $x in /RESOURCE_PROFILE[.//RESOURCE_TYPE/@value='ActionManagerServiceResourceType'] return $x//SERVICE_PROPERTIES/PROPERTY[./@ key='"
+		final String q = "for $x in /RESOURCE_PROFILE[.//RESOURCE_TYPE/@value='ActionManagerServiceResourceType'] return $x//SERVICE_PROPERTIES/PROPERTY[./@ key='"
 			+ propertyName
 			+ "']/@value/string()";
 		log.debug("quering for service property: " + q);

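getLatestRawsetPaths, reformatted above, first splits the comma-separated action-set ids with Guava's Splitter, dropping empty chunks and trimming whitespace, before resolving each id against the lookup service. The parsing step in isolation:

import java.util.List;

import com.google.common.base.Splitter;
import com.google.common.collect.Lists;

public class SplitterDemo {

	public static void main(String[] args) {
		// " a,,b " -> [a, b]: empty chunks dropped, surrounding whitespace trimmed
		List<String> ids = Lists.newArrayList(
			Splitter.on(",").omitEmptyStrings().trimResults().split(" a,,b "));
		System.out.println(ids);
	}
}
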
@@ -1,45 +1,67 @@
+
 package eu.dnetlib.dhp.actionmanager.migration;

-import eu.dnetlib.data.proto.FieldTypeProtos.Qualifier;
 import java.util.Comparator;

+import eu.dnetlib.data.proto.FieldTypeProtos.Qualifier;
+
 public class LicenseComparator implements Comparator<Qualifier> {

 	@Override
 	public int compare(Qualifier left, Qualifier right) {

-		if (left == null && right == null) return 0;
-		if (left == null) return 1;
-		if (right == null) return -1;
+		if (left == null && right == null)
+			return 0;
+		if (left == null)
+			return 1;
+		if (right == null)
+			return -1;

 		String lClass = left.getClassid();
 		String rClass = right.getClassid();

-		if (lClass.equals(rClass)) return 0;
+		if (lClass.equals(rClass))
+			return 0;

-		if (lClass.equals("OPEN SOURCE")) return -1;
-		if (rClass.equals("OPEN SOURCE")) return 1;
+		if (lClass.equals("OPEN SOURCE"))
+			return -1;
+		if (rClass.equals("OPEN SOURCE"))
+			return 1;

-		if (lClass.equals("OPEN")) return -1;
-		if (rClass.equals("OPEN")) return 1;
+		if (lClass.equals("OPEN"))
+			return -1;
+		if (rClass.equals("OPEN"))
+			return 1;

-		if (lClass.equals("6MONTHS")) return -1;
-		if (rClass.equals("6MONTHS")) return 1;
+		if (lClass.equals("6MONTHS"))
+			return -1;
+		if (rClass.equals("6MONTHS"))
+			return 1;

-		if (lClass.equals("12MONTHS")) return -1;
-		if (rClass.equals("12MONTHS")) return 1;
+		if (lClass.equals("12MONTHS"))
+			return -1;
+		if (rClass.equals("12MONTHS"))
+			return 1;

-		if (lClass.equals("EMBARGO")) return -1;
-		if (rClass.equals("EMBARGO")) return 1;
+		if (lClass.equals("EMBARGO"))
+			return -1;
+		if (rClass.equals("EMBARGO"))
+			return 1;

-		if (lClass.equals("RESTRICTED")) return -1;
-		if (rClass.equals("RESTRICTED")) return 1;
+		if (lClass.equals("RESTRICTED"))
+			return -1;
+		if (rClass.equals("RESTRICTED"))
+			return 1;

-		if (lClass.equals("CLOSED")) return -1;
-		if (rClass.equals("CLOSED")) return 1;
+		if (lClass.equals("CLOSED"))
+			return -1;
+		if (rClass.equals("CLOSED"))
+			return 1;

-		if (lClass.equals("UNKNOWN")) return -1;
-		if (rClass.equals("UNKNOWN")) return 1;
+		if (lClass.equals("UNKNOWN"))
+			return -1;
+		if (rClass.equals("UNKNOWN"))
+			return 1;

 		// Else (but unlikely), lexicographical ordering will do.
 		return lClass.compareTo(rClass);

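LicenseComparator orders access-right qualifiers from most to least open, so the most permissive one can be selected with a plain min(). A hedged usage sketch, where 'qualifiers' is an assumed java.util.List<Qualifier> built elsewhere:

// Hypothetical usage: choose the most open access right among a record's qualifiers.
// Qualifier is eu.dnetlib.data.proto.FieldTypeProtos.Qualifier, as in the class above.
Qualifier best = qualifiers
	.stream()
	.min(new LicenseComparator()) // smallest = most open: OPEN SOURCE before OPEN ... before UNKNOWN
	.orElse(null);
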
@@ -1,11 +1,6 @@
+
 package eu.dnetlib.dhp.actionmanager.migration;

-import com.google.common.base.Splitter;
-import com.google.common.collect.Lists;
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-import eu.dnetlib.dhp.utils.ISLookupClientFactory;
-import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
-import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
 import java.io.File;
 import java.io.FileOutputStream;
 import java.io.OutputStream;

@@ -14,6 +9,7 @@ import java.util.LinkedList;
 import java.util.List;
 import java.util.Properties;
 import java.util.stream.Collectors;
+
 import org.apache.commons.io.IOUtils;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.hadoop.conf.Configuration;

@@ -25,6 +21,14 @@ import org.apache.hadoop.util.ToolRunner;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

+import com.google.common.base.Splitter;
+import com.google.common.collect.Lists;
+
+import eu.dnetlib.dhp.application.ArgumentApplicationParser;
+import eu.dnetlib.dhp.utils.ISLookupClientFactory;
+import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
+import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
+
 public class MigrateActionSet {

 	private static final Logger log = LoggerFactory.getLogger(MigrateActionSet.class);

@@ -34,10 +38,11 @@ public class MigrateActionSet {
 	private static final String RAWSET_PREFIX = "rawset_";

 	public static void main(String[] args) throws Exception {
-		final ArgumentApplicationParser parser =
-			new ArgumentApplicationParser(
-				IOUtils.toString(
-					MigrateActionSet.class.getResourceAsStream(
+		final ArgumentApplicationParser parser = new ArgumentApplicationParser(
+			IOUtils
+				.toString(
+					MigrateActionSet.class
+						.getResourceAsStream(
 							"/eu/dnetlib/dhp/actionmanager/migration/migrate_actionsets_parameters.json")));
 		parser.parseArgument(args);

@@ -68,8 +73,7 @@ public class MigrateActionSet {
 		Configuration conf = getConfiguration(distcp_task_timeout, distcp_memory_mb, distcp_num_maps);
 		FileSystem targetFS = FileSystem.get(conf);

-		Configuration sourceConf =
-			getConfiguration(distcp_task_timeout, distcp_memory_mb, distcp_num_maps);
+		Configuration sourceConf = getConfiguration(distcp_task_timeout, distcp_memory_mb, distcp_num_maps);
 		sourceConf.set(FileSystem.FS_DEFAULT_NAME_KEY, sourceNN);
 		FileSystem sourceFS = FileSystem.get(sourceConf);

@@ -78,7 +82,8 @@ public class MigrateActionSet {
 		List<Path> targetPaths = new ArrayList<>();

 		final List<Path> sourcePaths = getSourcePaths(sourceNN, isLookUp);
-		log.info(
+		log
+			.info(
 				"paths to process:\n{}",
 				sourcePaths.stream().map(p -> p.toString()).collect(Collectors.joining("\n")));
 		for (Path source : sourcePaths) {

@@ -87,8 +92,7 @@ public class MigrateActionSet {
 			log.warn("skipping unexisting path: {}", source);
 		} else {

-			LinkedList<String> pathQ =
-				Lists.newLinkedList(Splitter.on(SEPARATOR).split(source.toUri().getPath()));
+			LinkedList<String> pathQ = Lists.newLinkedList(Splitter.on(SEPARATOR).split(source.toUri().getPath()));

 			final String rawSet = pathQ.pollLast();
 			log.info("got RAWSET: {}", rawSet);

@@ -97,8 +101,8 @@ public class MigrateActionSet {
 			final String actionSetDirectory = pathQ.pollLast();

-			final Path targetPath =
-				new Path(targetNN + workDir + SEPARATOR + actionSetDirectory + SEPARATOR + rawSet);
+			final Path targetPath = new Path(
+				targetNN + workDir + SEPARATOR + actionSetDirectory + SEPARATOR + rawSet);

 			log.info("using TARGET PATH: {}", targetPath);

@@ -115,7 +119,8 @@ public class MigrateActionSet {
 			}
 		}

-		props.setProperty(
+		props
+			.setProperty(
 				TARGET_PATHS, targetPaths.stream().map(p -> p.toString()).collect(Collectors.joining(",")));
 		File file = new File(System.getProperty("oozie.action.output.properties"));

@@ -140,8 +145,8 @@ public class MigrateActionSet {
 		op.preserve(DistCpOptions.FileAttribute.REPLICATION);
 		op.preserve(DistCpOptions.FileAttribute.CHECKSUMTYPE);

-		int res =
-			ToolRunner.run(
+		int res = ToolRunner
+			.run(
 				new DistCp(conf, op),
 				new String[] {
 					"-Dmapred.task.timeout=" + distcp_task_timeout,

@@ -171,8 +176,7 @@ public class MigrateActionSet {
 	private List<Path> getSourcePaths(String sourceNN, ISLookUpService isLookUp)
 		throws ISLookUpException {
-		String XQUERY =
-			"distinct-values(\n"
+		String XQUERY = "distinct-values(\n"
 			+ "let $basePath := collection('/db/DRIVER/ServiceResources/ActionManagerServiceResourceType')//SERVICE_PROPERTIES/PROPERTY[@key = 'basePath']/@value/string()\n"
 			+ "for $x in collection('/db/DRIVER/ActionManagerSetDSResources/ActionManagerSetDSResourceType') \n"
 			+ "let $setDir := $x//SET/@directory/string()\n"

@@ -180,7 +184,9 @@ public class MigrateActionSet {
 			+ "return concat($basePath, '/', $setDir, '/', $rawSet))";

 		log.info(String.format("running xquery:\n%s", XQUERY));
-		return isLookUp.quickSearchProfile(XQUERY).stream()
+		return isLookUp
+			.quickSearchProfile(XQUERY)
+			.stream()
 			.map(p -> sourceNN + p)
 			.map(Path::new)
 			.collect(Collectors.toList());

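MigrateActionSet, above, peels the rawset name and its action-set directory off each source path by treating a LinkedList of path segments as a deque. A small sketch of that step, assuming '/' as SEPARATOR and a made-up path (the real job performs additional checks between the two pollLast calls):

import java.util.LinkedList;

import com.google.common.base.Splitter;
import com.google.common.collect.Lists;

public class PathSegmentsDemo {

	public static void main(String[] args) {
		LinkedList<String> pathQ = Lists.newLinkedList(
			Splitter.on("/").split("/var/actionsets/my_set/rawset_123"));
		String rawSet = pathQ.pollLast(); // "rawset_123"
		String actionSetDirectory = pathQ.pollLast(); // "my_set"
		System.out.println(actionSetDirectory + " / " + rawSet);
	}
}
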
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.actionmanager.migration;
 
 import static eu.dnetlib.data.proto.KindProtos.Kind.entity;

@@ -5,16 +6,19 @@ import static eu.dnetlib.data.proto.KindProtos.Kind.relation;
 import static eu.dnetlib.data.proto.TypeProtos.*;
 import static eu.dnetlib.data.proto.TypeProtos.Type.*;
 
-import com.google.common.collect.Lists;
-import com.googlecode.protobuf.format.JsonFormat;
-import eu.dnetlib.data.proto.*;
-import eu.dnetlib.dhp.schema.oaf.*;
 import java.io.Serializable;
 import java.util.List;
 import java.util.Optional;
 import java.util.stream.Collectors;
+
 import org.apache.commons.lang3.StringUtils;
+
+import com.google.common.collect.Lists;
+import com.googlecode.protobuf.format.JsonFormat;
+
+import eu.dnetlib.data.proto.*;
+import eu.dnetlib.dhp.schema.oaf.*;
 
 public class ProtoConverter implements Serializable {
 
 public static final String UNKNOWN = "UNKNOWN";

@@ -46,7 +50,8 @@ public class ProtoConverter implements Serializable {
 rel.setRelType(r.getRelType().toString());
 rel.setSubRelType(r.getSubRelType().toString());
 rel.setRelClass(r.getRelClass());
-rel.setCollectedfrom(
+rel
+.setCollectedfrom(
 r.getCollectedfromCount() > 0
 ? r.getCollectedfromList().stream().map(kv -> mapKV(kv)).collect(Collectors.toList())
 : null);

@@ -97,14 +102,16 @@ public class ProtoConverter implements Serializable {
 }
 
 private static Organization convertOrganization(OafProtos.Oaf oaf) {
-final OrganizationProtos.Organization.Metadata m =
-oaf.getEntity().getOrganization().getMetadata();
+final OrganizationProtos.Organization.Metadata m = oaf.getEntity().getOrganization().getMetadata();
 final Organization org = setOaf(new Organization(), oaf);
 setEntity(org, oaf);
 org.setLegalshortname(mapStringField(m.getLegalshortname()));
 org.setLegalname(mapStringField(m.getLegalname()));
-org.setAlternativeNames(
-m.getAlternativeNamesList().stream()
+org
+.setAlternativeNames(
+m
+.getAlternativeNamesList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
 org.setWebsiteurl(mapStringField(m.getWebsiteurl()));

@@ -114,7 +121,8 @@ public class ProtoConverter implements Serializable {
 org.setEcnonprofit(mapStringField(m.getEcnonprofit()));
 org.setEcresearchorganization(mapStringField(m.getEcresearchorganization()));
 org.setEchighereducation(mapStringField(m.getEchighereducation()));
-org.setEcinternationalorganizationeurinterests(
+org
+.setEcinternationalorganizationeurinterests(
 mapStringField(m.getEcinternationalorganizationeurinterests()));
 org.setEcinternationalorganization(mapStringField(m.getEcinternationalorganization()));
 org.setEcenterprise(mapStringField(m.getEcenterprise()));

@@ -129,8 +137,11 @@ public class ProtoConverter implements Serializable {
 final DatasourceProtos.Datasource.Metadata m = oaf.getEntity().getDatasource().getMetadata();
 final Datasource datasource = setOaf(new Datasource(), oaf);
 setEntity(datasource, oaf);
-datasource.setAccessinfopackage(
-m.getAccessinfopackageList().stream()
+datasource
+.setAccessinfopackage(
+m
+.getAccessinfopackageList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
 datasource.setCertificates(mapStringField(m.getCertificates()));

@@ -151,12 +162,18 @@ public class ProtoConverter implements Serializable {
 datasource.setLogourl(mapStringField(m.getLogourl()));
 datasource.setMissionstatementurl(mapStringField(m.getMissionstatementurl()));
 datasource.setNamespaceprefix(mapStringField(m.getNamespaceprefix()));
-datasource.setOdcontenttypes(
-m.getOdcontenttypesList().stream()
+datasource
+.setOdcontenttypes(
+m
+.getOdcontenttypesList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-datasource.setOdlanguages(
-m.getOdlanguagesList().stream()
+datasource
+.setOdlanguages(
+m
+.getOdlanguagesList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
 datasource.setOdnumberofitems(mapStringField(m.getOdnumberofitems()));

@@ -165,14 +182,18 @@ public class ProtoConverter implements Serializable {
 datasource.setOfficialname(mapStringField(m.getOfficialname()));
 datasource.setOpenairecompatibility(mapQualifier(m.getOpenairecompatibility()));
 datasource.setPidsystems(mapStringField(m.getPidsystems()));
-datasource.setPolicies(
+datasource
+.setPolicies(
 m.getPoliciesList().stream().map(ProtoConverter::mapKV).collect(Collectors.toList()));
 datasource.setQualitymanagementkind(mapStringField(m.getQualitymanagementkind()));
 datasource.setReleaseenddate(mapStringField(m.getReleaseenddate()));
 datasource.setServiceprovider(mapBoolField(m.getServiceprovider()));
 datasource.setReleasestartdate(mapStringField(m.getReleasestartdate()));
-datasource.setSubjects(
-m.getSubjectsList().stream()
+datasource
+.setSubjects(
+m
+.getSubjectsList()
+.stream()
 .map(ProtoConverter::mapStructuredProperty)
 .collect(Collectors.toList()));
 datasource.setVersioning(mapBoolField(m.getVersioning()));

@@ -204,13 +225,17 @@ public class ProtoConverter implements Serializable {
 project.setFundedamount(m.getFundedamount());
 project.setTotalcost(m.getTotalcost());
 project.setKeywords(mapStringField(m.getKeywords()));
-project.setSubjects(
-m.getSubjectsList().stream()
+project
+.setSubjects(
+m
+.getSubjectsList()
+.stream()
 .map(sp -> mapStructuredProperty(sp))
 .collect(Collectors.toList()));
 project.setTitle(mapStringField(m.getTitle()));
 project.setWebsiteurl(mapStringField(m.getWebsiteurl()));
-project.setFundingtree(
+project
+.setFundingtree(
 m.getFundingtreeList().stream().map(f -> mapStringField(f)).collect(Collectors.toList()));
 project.setJsonextrainfo(mapStringField(m.getJsonextrainfo()));
 project.setSummary(mapStringField(m.getSummary()));

@@ -242,12 +267,18 @@ public class ProtoConverter implements Serializable {
 setEntity(software, oaf);
 setResult(software, oaf);
 
-software.setDocumentationUrl(
-m.getDocumentationUrlList().stream()
+software
+.setDocumentationUrl(
+m
+.getDocumentationUrlList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-software.setLicense(
-m.getLicenseList().stream()
+software
+.setLicense(
+m
+.getLicenseList()
+.stream()
 .map(ProtoConverter::mapStructuredProperty)
 .collect(Collectors.toList()));
 software.setCodeRepositoryUrl(mapStringField(m.getCodeRepositoryUrl()));

@@ -260,15 +291,22 @@ public class ProtoConverter implements Serializable {
 OtherResearchProduct otherResearchProducts = setOaf(new OtherResearchProduct(), oaf);
 setEntity(otherResearchProducts, oaf);
 setResult(otherResearchProducts, oaf);
-otherResearchProducts.setContactperson(
-m.getContactpersonList().stream()
+otherResearchProducts
+.setContactperson(
+m
+.getContactpersonList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-otherResearchProducts.setContactgroup(
-m.getContactgroupList().stream()
+otherResearchProducts
+.setContactgroup(
+m
+.getContactgroupList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-otherResearchProducts.setTool(
+otherResearchProducts
+.setTool(
 m.getToolList().stream().map(ProtoConverter::mapStringField).collect(Collectors.toList()));
 
 return otherResearchProducts;

@@ -296,8 +334,11 @@ public class ProtoConverter implements Serializable {
 dataset.setVersion(mapStringField(m.getVersion()));
 dataset.setLastmetadataupdate(mapStringField(m.getLastmetadataupdate()));
 dataset.setMetadataversionnumber(mapStringField(m.getMetadataversionnumber()));
-dataset.setGeolocation(
-m.getGeolocationList().stream()
+dataset
+.setGeolocation(
+m
+.getGeolocationList()
+.stream()
 .map(ProtoConverter::mapGeolocation)
 .collect(Collectors.toList()));
 return dataset;

@@ -314,16 +355,23 @@ public class ProtoConverter implements Serializable {
 final OafProtos.OafEntity e = oaf.getEntity();
 entity.setId(e.getId());
 entity.setOriginalId(e.getOriginalIdList());
-entity.setCollectedfrom(
+entity
+.setCollectedfrom(
 e.getCollectedfromList().stream().map(ProtoConverter::mapKV).collect(Collectors.toList()));
-entity.setPid(
-e.getPidList().stream()
+entity
+.setPid(
+e
+.getPidList()
+.stream()
 .map(ProtoConverter::mapStructuredProperty)
 .collect(Collectors.toList()));
 entity.setDateofcollection(e.getDateofcollection());
 entity.setDateoftransformation(e.getDateoftransformation());
-entity.setExtraInfo(
-e.getExtraInfoList().stream()
+entity
+.setExtraInfo(
+e
+.getExtraInfoList()
+.stream()
 .map(ProtoConverter::mapExtraInfo)
 .collect(Collectors.toList()));
 return entity;

@@ -332,55 +380,87 @@ public class ProtoConverter implements Serializable {
 public static <T extends Result> T setResult(T entity, OafProtos.Oaf oaf) {
 // setting Entity fields
 final ResultProtos.Result.Metadata m = oaf.getEntity().getResult().getMetadata();
-entity.setAuthor(
+entity
+.setAuthor(
 m.getAuthorList().stream().map(ProtoConverter::mapAuthor).collect(Collectors.toList()));
 entity.setResulttype(mapQualifier(m.getResulttype()));
 entity.setLanguage(mapQualifier(m.getLanguage()));
-entity.setCountry(
-m.getCountryList().stream()
+entity
+.setCountry(
+m
+.getCountryList()
+.stream()
 .map(ProtoConverter::mapQualifierAsCountry)
 .collect(Collectors.toList()));
-entity.setSubject(
-m.getSubjectList().stream()
+entity
+.setSubject(
+m
+.getSubjectList()
+.stream()
 .map(ProtoConverter::mapStructuredProperty)
 .collect(Collectors.toList()));
-entity.setTitle(
-m.getTitleList().stream()
+entity
+.setTitle(
+m
+.getTitleList()
+.stream()
 .map(ProtoConverter::mapStructuredProperty)
 .collect(Collectors.toList()));
-entity.setRelevantdate(
-m.getRelevantdateList().stream()
+entity
+.setRelevantdate(
+m
+.getRelevantdateList()
+.stream()
 .map(ProtoConverter::mapStructuredProperty)
 .collect(Collectors.toList()));
-entity.setDescription(
-m.getDescriptionList().stream()
+entity
+.setDescription(
+m
+.getDescriptionList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
 entity.setDateofacceptance(mapStringField(m.getDateofacceptance()));
 entity.setPublisher(mapStringField(m.getPublisher()));
 entity.setEmbargoenddate(mapStringField(m.getEmbargoenddate()));
-entity.setSource(
-m.getSourceList().stream()
+entity
+.setSource(
+m
+.getSourceList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-entity.setFulltext(
-m.getFulltextList().stream()
+entity
+.setFulltext(
+m
+.getFulltextList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-entity.setFormat(
-m.getFormatList().stream()
+entity
+.setFormat(
+m
+.getFormatList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-entity.setContributor(
-m.getContributorList().stream()
+entity
+.setContributor(
+m
+.getContributorList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
 entity.setResourcetype(mapQualifier(m.getResourcetype()));
-entity.setCoverage(
-m.getCoverageList().stream()
+entity
+.setCoverage(
+m
+.getCoverageList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
-entity.setContext(
+entity
+.setContext(
 m.getContextList().stream().map(ProtoConverter::mapContext).collect(Collectors.toList()));
 
 entity.setBestaccessright(getBestAccessRights(oaf.getEntity().getResult().getInstanceList()));

@@ -390,8 +470,10 @@ public class ProtoConverter implements Serializable {
 
 private static Qualifier getBestAccessRights(List<ResultProtos.Result.Instance> instanceList) {
 if (instanceList != null) {
-final Optional<FieldTypeProtos.Qualifier> min =
-instanceList.stream().map(i -> i.getAccessright()).min(new LicenseComparator());
+final Optional<FieldTypeProtos.Qualifier> min = instanceList
+.stream()
+.map(i -> i.getAccessright())
+.min(new LicenseComparator());
 
 final Qualifier rights = min.isPresent() ? mapQualifier(min.get()) : new Qualifier();
 

@@ -418,8 +500,11 @@ public class ProtoConverter implements Serializable {
 
 final Context entity = new Context();
 entity.setId(context.getId());
-entity.setDataInfo(
-context.getDataInfoList().stream()
+entity
+.setDataInfo(
+context
+.getDataInfoList()
+.stream()
 .map(ProtoConverter::mapDataInfo)
 .collect(Collectors.toList()));
 return entity;

@@ -543,8 +628,11 @@ public class ProtoConverter implements Serializable {
 entity.setName(author.getName());
 entity.setSurname(author.getSurname());
 entity.setRank(author.getRank());
-entity.setPid(
-author.getPidList().stream()
+entity
+.setPid(
+author
+.getPidList()
+.stream()
 .map(
 kv -> {
 final StructuredProperty sp = new StructuredProperty();

@@ -556,8 +644,11 @@ public class ProtoConverter implements Serializable {
 return sp;
 })
 .collect(Collectors.toList()));
-entity.setAffiliation(
-author.getAffiliationList().stream()
+entity
+.setAffiliation(
+author
+.getAffiliationList()
+.stream()
 .map(ProtoConverter::mapStringField)
 .collect(Collectors.toList()));
 return entity;
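Each ProtoConverter hunk above rewraps one recurring idiom: stream a protobuf repeated field, map every element through a field converter, and collect the result into a list. A sketch of the idiom in isolation; mapList is a hypothetical helper, not a method of ProtoConverter:

    import java.util.List;
    import java.util.function.Function;
    import java.util.stream.Collectors;

    public final class MapListSketch {
        private MapListSketch() {
        }

        // Generic form of the stream-map-collect chain rewrapped in every hunk.
        public static <P, T> List<T> mapList(List<P> protos, Function<P, T> mapper) {
            return protos
                .stream()
                .map(mapper)
                .collect(Collectors.toList());
        }
    }

With such a helper, a call like org.setAlternativeNames(mapList(m.getAlternativeNamesList(), ProtoConverter::mapStringField)) would collapse one of the chains above into a single line.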
@@ -1,23 +1,14 @@
+
 package eu.dnetlib.dhp.actionmanager.migration;
 
 import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
 
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.google.common.base.Splitter;
-import com.google.common.collect.Lists;
-import com.google.protobuf.InvalidProtocolBufferException;
-import eu.dnetlib.data.proto.OafProtos;
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-import eu.dnetlib.dhp.schema.action.AtomicAction;
-import eu.dnetlib.dhp.schema.oaf.*;
-import eu.dnetlib.dhp.utils.ISLookupClientFactory;
-import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
-import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
 import java.io.IOException;
 import java.io.Serializable;
 import java.util.LinkedList;
 import java.util.Objects;
 import java.util.Optional;
+
 import org.apache.commons.io.IOUtils;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.hadoop.fs.FileSystem;

@@ -29,6 +20,19 @@ import org.apache.spark.api.java.JavaSparkContext;
 import org.apache.spark.sql.SparkSession;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.google.common.base.Splitter;
+import com.google.common.collect.Lists;
+import com.google.protobuf.InvalidProtocolBufferException;
+
+import eu.dnetlib.data.proto.OafProtos;
+import eu.dnetlib.dhp.application.ArgumentApplicationParser;
+import eu.dnetlib.dhp.schema.action.AtomicAction;
+import eu.dnetlib.dhp.schema.oaf.*;
+import eu.dnetlib.dhp.utils.ISLookupClientFactory;
+import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpException;
+import eu.dnetlib.enabling.is.lookup.rmi.ISLookUpService;
 import scala.Tuple2;
 
 public class TransformActions implements Serializable {

@@ -40,15 +44,16 @@ public class TransformActions implements Serializable {
 private static final String SEPARATOR = "/";
 
 public static void main(String[] args) throws Exception {
-final ArgumentApplicationParser parser =
-new ArgumentApplicationParser(
-IOUtils.toString(
-MigrateActionSet.class.getResourceAsStream(
+final ArgumentApplicationParser parser = new ArgumentApplicationParser(
+IOUtils
+.toString(
+MigrateActionSet.class
+.getResourceAsStream(
 "/eu/dnetlib/dhp/actionmanager/migration/transform_actionsets_parameters.json")));
 parser.parseArgument(args);
 
-Boolean isSparkSessionManaged =
-Optional.ofNullable(parser.get("isSparkSessionManaged"))
+Boolean isSparkSessionManaged = Optional
+.ofNullable(parser.get("isSparkSessionManaged"))
 .map(Boolean::valueOf)
 .orElse(Boolean.TRUE);
 log.info("isSparkSessionManaged: {}", isSparkSessionManaged);

@@ -83,8 +88,7 @@ public class TransformActions implements Serializable {
 final String rawset = pathQ.pollLast();
 final String actionSetDirectory = pathQ.pollLast();
 
-final Path targetDirectory =
-new Path(targetBaseDir + SEPARATOR + actionSetDirectory + SEPARATOR + rawset);
+final Path targetDirectory = new Path(targetBaseDir + SEPARATOR + actionSetDirectory + SEPARATOR + rawset);
 
 if (fs.exists(targetDirectory)) {
 log.info("found target directory '{}", targetDirectory);

@@ -94,7 +98,8 @@ public class TransformActions implements Serializable {
 
 log.info("transforming actions from '{}' to '{}'", sourcePath, targetDirectory);
 
-sc.sequenceFile(sourcePath, Text.class, Text.class)
+sc
+.sequenceFile(sourcePath, Text.class, Text.class)
 .map(a -> eu.dnetlib.actionmanager.actions.AtomicAction.fromJSON(a._2().toString()))
 .map(TransformActions::doTransform)
 .filter(Objects::nonNull)

@@ -129,8 +134,12 @@ public class TransformActions implements Serializable {
 case project:
 return new AtomicAction<>(Project.class, (Project) oaf);
 case result:
-final String resulttypeid =
-proto_oaf.getEntity().getResult().getMetadata().getResulttype().getClassid();
+final String resulttypeid = proto_oaf
+.getEntity()
+.getResult()
+.getMetadata()
+.getResulttype()
+.getClassid();
 switch (resulttypeid) {
 case "publication":
 return new AtomicAction<>(Publication.class, (Publication) oaf);

@@ -157,8 +166,7 @@ public class TransformActions implements Serializable {
 
 private static String getTargetBaseDir(String isLookupUrl) throws ISLookUpException {
 ISLookUpService isLookUp = ISLookupClientFactory.getLookUpService(isLookupUrl);
-String XQUERY =
-"collection('/db/DRIVER/ServiceResources/ActionManagerServiceResourceType')//SERVICE_PROPERTIES/PROPERTY[@key = 'basePath']/@value/string()";
+String XQUERY = "collection('/db/DRIVER/ServiceResources/ActionManagerServiceResourceType')//SERVICE_PROPERTIES/PROPERTY[@key = 'basePath']/@value/string()";
 return isLookUp.getResourceProfileByQuery(XQUERY);
 }
 }
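For orientation, the restyled TransformActions stage reads an action set as a Hadoop SequenceFile, parses each value into the legacy AtomicAction model, converts it to the new schema, and drops records that fail conversion. The fragment below restates the chain from the hunk with comments; sc, sourcePath and doTransform come from the surrounding class:

    // sc is the JavaSparkContext created earlier in the method.
    sc
        .sequenceFile(sourcePath, Text.class, Text.class)
        // the SequenceFile value holds the legacy action serialised as JSON
        .map(a -> eu.dnetlib.actionmanager.actions.AtomicAction.fromJSON(a._2().toString()))
        // convert to the new eu.dnetlib.dhp.schema.action.AtomicAction model
        .map(TransformActions::doTransform)
        // doTransform returns null for payloads it cannot map; drop those
        .filter(Objects::nonNull);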
@@ -1,15 +1,13 @@
+
 package eu.dnetlib.dhp.actionmanager.partition;
 
 import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
 import static org.apache.spark.sql.functions.*;
 
-import eu.dnetlib.dhp.actionmanager.ISClient;
-import eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJob;
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-import eu.dnetlib.dhp.common.HdfsSupport;
 import java.util.Arrays;
 import java.util.List;
 import java.util.Optional;
+
 import org.apache.commons.io.IOUtils;
 import org.apache.hadoop.io.Text;
 import org.apache.spark.SparkConf;

@@ -20,23 +18,30 @@ import org.apache.spark.sql.types.*;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import eu.dnetlib.dhp.actionmanager.ISClient;
+import eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJob;
+import eu.dnetlib.dhp.application.ArgumentApplicationParser;
+import eu.dnetlib.dhp.common.HdfsSupport;
+
 /** Partitions given set of action sets by payload type. */
 public class PartitionActionSetsByPayloadTypeJob {
 
-private static final Logger logger =
-LoggerFactory.getLogger(PartitionActionSetsByPayloadTypeJob.class);
+private static final Logger logger = LoggerFactory.getLogger(PartitionActionSetsByPayloadTypeJob.class);
 
-private static final StructType KV_SCHEMA =
-StructType$.MODULE$.apply(
-Arrays.asList(
+private static final StructType KV_SCHEMA = StructType$.MODULE$
+.apply(
+Arrays
+.asList(
 StructField$.MODULE$.apply("key", DataTypes.StringType, false, Metadata.empty()),
 StructField$.MODULE$.apply("value", DataTypes.StringType, false, Metadata.empty())));
 
-private static final StructType ATOMIC_ACTION_SCHEMA =
-StructType$.MODULE$.apply(
-Arrays.asList(
+private static final StructType ATOMIC_ACTION_SCHEMA = StructType$.MODULE$
+.apply(
+Arrays
+.asList(
 StructField$.MODULE$.apply("clazz", DataTypes.StringType, false, Metadata.empty()),
-StructField$.MODULE$.apply(
+StructField$.MODULE$
+.apply(
 "payload", DataTypes.StringType, false, Metadata.empty())));
 
 private ISClient isClient;

@@ -45,18 +50,20 @@ public class PartitionActionSetsByPayloadTypeJob {
 this.isClient = new ISClient(isLookupUrl);
 }
 
-public PartitionActionSetsByPayloadTypeJob() {}
+public PartitionActionSetsByPayloadTypeJob() {
+}
 
 public static void main(String[] args) throws Exception {
-String jsonConfiguration =
-IOUtils.toString(
-PromoteActionPayloadForGraphTableJob.class.getResourceAsStream(
+String jsonConfiguration = IOUtils
+.toString(
+PromoteActionPayloadForGraphTableJob.class
+.getResourceAsStream(
 "/eu/dnetlib/dhp/actionmanager/partition/partition_action_sets_by_payload_type_input_parameters.json"));
 final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
 parser.parseArgument(args);
 
-Boolean isSparkSessionManaged =
-Optional.ofNullable(parser.get("isSparkSessionManaged"))
+Boolean isSparkSessionManaged = Optional
+.ofNullable(parser.get("isSparkSessionManaged"))
 .map(Boolean::valueOf)
 .orElse(Boolean.TRUE);
 logger.info("isSparkSessionManaged: {}", isSparkSessionManaged);

@@ -97,7 +104,8 @@ public class PartitionActionSetsByPayloadTypeJob {
 
 private static void readAndWriteActionSetsFromPaths(
 SparkSession spark, List<String> inputActionSetPaths, String outputPath) {
-inputActionSetPaths.stream()
+inputActionSetPaths
+.stream()
 .filter(path -> HdfsSupport.exists(path, spark.sparkContext().hadoopConfiguration()))
 .forEach(
 inputActionSetPath -> {

@@ -111,8 +119,8 @@ public class PartitionActionSetsByPayloadTypeJob {
 
 JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());
 
-JavaRDD<Row> rdd =
-sc.sequenceFile(path, Text.class, Text.class)
+JavaRDD<Row> rdd = sc
+.sequenceFile(path, Text.class, Text.class)
 .map(x -> RowFactory.create(x._1().toString(), x._2().toString()));
 
 return spark
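The last hunk stops right after the rdd assignment. Given the KV_SCHEMA declared earlier in this file, the natural continuation is to wrap that RDD in a DataFrame; the createDataFrame call below is an assumption, not part of the visible diff:

    // Read one action set as key/value strings and expose it as a DataFrame
    // matching KV_SCHEMA ("key", "value"), both non-nullable strings.
    JavaRDD<Row> rdd = sc
        .sequenceFile(path, Text.class, Text.class)
        .map(x -> RowFactory.create(x._1().toString(), x._2().toString()));
    Dataset<Row> df = spark.createDataFrame(rdd, KV_SCHEMA); // assumed continuation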
@@ -1,27 +1,29 @@
+
 package eu.dnetlib.dhp.actionmanager.promote;
 
 import static eu.dnetlib.dhp.schema.common.ModelSupport.isSubClass;
 
+import java.util.function.BiFunction;
+
 import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
 import eu.dnetlib.dhp.schema.oaf.Oaf;
 import eu.dnetlib.dhp.schema.oaf.OafEntity;
 import eu.dnetlib.dhp.schema.oaf.Relation;
-import java.util.function.BiFunction;
 
 /** OAF model merging support. */
 public class MergeAndGet {
 
-private MergeAndGet() {}
+private MergeAndGet() {
+}
 
 /**
 * Strategy for merging OAF model objects.
 *
-* <p>MERGE_FROM_AND_GET: use OAF 'mergeFrom' method SELECT_NEWER_AND_GET: use last update
-* timestamp to return newer instance
+* <p>
+* MERGE_FROM_AND_GET: use OAF 'mergeFrom' method SELECT_NEWER_AND_GET: use last update timestamp to return newer
+* instance
 */
 public enum Strategy {
-MERGE_FROM_AND_GET,
-SELECT_NEWER_AND_GET
+MERGE_FROM_AND_GET, SELECT_NEWER_AND_GET
 }
 
 /**

@@ -32,8 +34,8 @@ public class MergeAndGet {
 * @param <A> Action payload type
 * @return BiFunction to be used to merge OAF objects
 */
-public static <G extends Oaf, A extends Oaf>
-SerializableSupplier<BiFunction<G, A, G>> functionFor(Strategy strategy) {
+public static <G extends Oaf, A extends Oaf> SerializableSupplier<BiFunction<G, A, G>> functionFor(
+Strategy strategy) {
 switch (strategy) {
 case MERGE_FROM_AND_GET:
 return () -> MergeAndGet::mergeFromAndGet;

@@ -54,7 +56,8 @@ public class MergeAndGet {
 return x;
 }
 throw new RuntimeException(
-String.format(
+String
+.format(
 "MERGE_FROM_AND_GET incompatible types: %s, %s",
 x.getClass().getCanonicalName(), y.getClass().getCanonicalName()));
 }

@@ -70,12 +73,14 @@ public class MergeAndGet {
 return x;
 } else if (isSubClass(x, y) && x.getLastupdatetimestamp() < y.getLastupdatetimestamp()) {
 throw new RuntimeException(
-String.format(
+String
+.format(
 "SELECT_NEWER_AND_GET cannot return right type when it is not the same as left type: %s, %s",
 x.getClass().getCanonicalName(), y.getClass().getCanonicalName()));
 }
 throw new RuntimeException(
-String.format(
+String
+.format(
 "SELECT_NEWER_AND_GET cannot be used when left is not subtype of right: %s, %s",
 x.getClass().getCanonicalName(), y.getClass().getCanonicalName()));
 }
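For reference, a caller obtains a merge function from the enum above and applies it; the supplier indirection keeps the function serializable so Spark can ship it to executors. A minimal sketch, where existing and incoming are hypothetical Relation instances:

    SerializableSupplier<BiFunction<Relation, Relation, Relation>> fn =
        MergeAndGet.functionFor(MergeAndGet.Strategy.MERGE_FROM_AND_GET);
    // MERGE_FROM_AND_GET applies the OAF mergeFrom method and returns the left argument;
    // SELECT_NEWER_AND_GET would instead compare lastupdatetimestamp values.
    Relation merged = fn.get().apply(existing, incoming);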
@@ -1,18 +1,14 @@
+
 package eu.dnetlib.dhp.actionmanager.promote;
 
 import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
 import static eu.dnetlib.dhp.schema.common.ModelSupport.isSubClass;
 
-import com.fasterxml.jackson.databind.ObjectMapper;
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
-import eu.dnetlib.dhp.common.HdfsSupport;
-import eu.dnetlib.dhp.schema.common.ModelSupport;
-import eu.dnetlib.dhp.schema.oaf.*;
 import java.util.Objects;
 import java.util.Optional;
 import java.util.function.BiFunction;
 import java.util.function.Function;
+
 import org.apache.commons.io.IOUtils;
 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.function.MapFunction;

@@ -23,23 +19,31 @@ import org.apache.spark.sql.SparkSession;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import com.fasterxml.jackson.databind.ObjectMapper;
+
+import eu.dnetlib.dhp.application.ArgumentApplicationParser;
+import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
+import eu.dnetlib.dhp.common.HdfsSupport;
+import eu.dnetlib.dhp.schema.common.ModelSupport;
+import eu.dnetlib.dhp.schema.oaf.*;
+
 /** Applies a given action payload file to graph table of compatible type. */
 public class PromoteActionPayloadForGraphTableJob {
-private static final Logger logger =
-LoggerFactory.getLogger(PromoteActionPayloadForGraphTableJob.class);
+private static final Logger logger = LoggerFactory.getLogger(PromoteActionPayloadForGraphTableJob.class);
 
 private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
 
 public static void main(String[] args) throws Exception {
-String jsonConfiguration =
-IOUtils.toString(
-PromoteActionPayloadForGraphTableJob.class.getResourceAsStream(
+String jsonConfiguration = IOUtils
+.toString(
+PromoteActionPayloadForGraphTableJob.class
+.getResourceAsStream(
 "/eu/dnetlib/dhp/actionmanager/promote/promote_action_payload_for_graph_table_input_parameters.json"));
 final ArgumentApplicationParser parser = new ArgumentApplicationParser(jsonConfiguration);
 parser.parseArgument(args);
 
-Boolean isSparkSessionManaged =
-Optional.ofNullable(parser.get("isSparkSessionManaged"))
+Boolean isSparkSessionManaged = Optional
+.ofNullable(parser.get("isSparkSessionManaged"))
 .map(Boolean::valueOf)
 .orElse(Boolean.TRUE);
 logger.info("isSparkSessionManaged: {}", isSparkSessionManaged);

@@ -59,13 +63,11 @@ public class PromoteActionPayloadForGraphTableJob {
 String outputGraphTablePath = parser.get("outputGraphTablePath");
 logger.info("outputGraphTablePath: {}", outputGraphTablePath);
 
-MergeAndGet.Strategy strategy =
-MergeAndGet.Strategy.valueOf(parser.get("mergeAndGetStrategy").toUpperCase());
+MergeAndGet.Strategy strategy = MergeAndGet.Strategy.valueOf(parser.get("mergeAndGetStrategy").toUpperCase());
 logger.info("strategy: {}", strategy);
 
 Class<? extends Oaf> rowClazz = (Class<? extends Oaf>) Class.forName(graphTableClassName);
-Class<? extends Oaf> actionPayloadClazz =
-(Class<? extends Oaf>) Class.forName(actionPayloadClassName);
+Class<? extends Oaf> actionPayloadClazz = (Class<? extends Oaf>) Class.forName(actionPayloadClassName);
 
 throwIfGraphTableClassIsNotSubClassOfActionPayloadClass(rowClazz, actionPayloadClazz);
 

@@ -92,8 +94,8 @@ public class PromoteActionPayloadForGraphTableJob {
 private static void throwIfGraphTableClassIsNotSubClassOfActionPayloadClass(
 Class<? extends Oaf> rowClazz, Class<? extends Oaf> actionPayloadClazz) {
 if (!isSubClass(rowClazz, actionPayloadClazz)) {
-String msg =
-String.format(
+String msg = String
+.format(
 "graph table class is not a subclass of action payload class: graph=%s, action=%s",
 rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());
 throw new RuntimeException(msg);

@@ -113,11 +115,9 @@ public class PromoteActionPayloadForGraphTableJob {
 Class<G> rowClazz,
 Class<A> actionPayloadClazz) {
 Dataset<G> rowDS = readGraphTable(spark, inputGraphTablePath, rowClazz);
-Dataset<A> actionPayloadDS =
-readActionPayload(spark, inputActionPayloadPath, actionPayloadClazz);
+Dataset<A> actionPayloadDS = readActionPayload(spark, inputActionPayloadPath, actionPayloadClazz);
 
-Dataset<G> result =
-promoteActionPayloadForGraphTable(
+Dataset<G> result = promoteActionPayloadForGraphTable(
 rowDS, actionPayloadDS, strategy, rowClazz, actionPayloadClazz)
 .map((MapFunction<G, G>) value -> value, Encoders.bean(rowClazz));
 

@@ -147,9 +147,8 @@ public class PromoteActionPayloadForGraphTableJob {
 .read()
 .parquet(path)
 .map(
-(MapFunction<Row, A>)
-value ->
-OBJECT_MAPPER.readValue(value.<String>getAs("payload"), actionPayloadClazz),
+(MapFunction<Row, A>) value -> OBJECT_MAPPER
+.readValue(value.<String> getAs("payload"), actionPayloadClazz),
 Encoders.bean(actionPayloadClazz));
 }
 

@@ -159,22 +158,21 @@ public class PromoteActionPayloadForGraphTableJob {
 MergeAndGet.Strategy strategy,
 Class<G> rowClazz,
 Class<A> actionPayloadClazz) {
-logger.info(
+logger
+.info(
 "Promoting action payload for graph table: payload={}, table={}",
 actionPayloadClazz.getSimpleName(),
 rowClazz.getSimpleName());
 
 SerializableSupplier<Function<G, String>> rowIdFn = ModelSupport::idFn;
 SerializableSupplier<Function<A, String>> actionPayloadIdFn = ModelSupport::idFn;
-SerializableSupplier<BiFunction<G, A, G>> mergeRowWithActionPayloadAndGetFn =
-MergeAndGet.functionFor(strategy);
+SerializableSupplier<BiFunction<G, A, G>> mergeRowWithActionPayloadAndGetFn = MergeAndGet.functionFor(strategy);
 SerializableSupplier<BiFunction<G, G, G>> mergeRowsAndGetFn = MergeAndGet.functionFor(strategy);
 SerializableSupplier<G> zeroFn = zeroFn(rowClazz);
-SerializableSupplier<Function<G, Boolean>> isNotZeroFn =
-PromoteActionPayloadForGraphTableJob::isNotZeroFnUsingIdOrSource;
+SerializableSupplier<Function<G, Boolean>> isNotZeroFn = PromoteActionPayloadForGraphTableJob::isNotZeroFnUsingIdOrSource;
 
-Dataset<G> joinedAndMerged =
-PromoteActionPayloadFunctions.joinGraphTableWithActionPayloadAndMerge(
+Dataset<G> joinedAndMerged = PromoteActionPayloadFunctions
+.joinGraphTableWithActionPayloadAndMerge(
 rowDS,
 actionPayloadDS,
 rowIdFn,

@@ -183,7 +181,8 @@ public class PromoteActionPayloadForGraphTableJob {
 rowClazz,
 actionPayloadClazz);
 
-return PromoteActionPayloadFunctions.groupGraphTableByIdAndMerge(
+return PromoteActionPayloadFunctions
+.groupGraphTableByIdAndMerge(
 joinedAndMerged, rowIdFn, mergeRowsAndGetFn, zeroFn, isNotZeroFn, rowClazz);
 }
 
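Read together, the promote flow is a two-step reduction: join graph rows with action payloads by id and merge each matched pair, then group the joined result by id and reduce duplicates with the same strategy. The sketch keeps the names from the diff; the middle arguments of the join call are inferred, since not all of them are visible in these hunks:

    // 1. full-outer join on the row id, merging matched pairs with the strategy
    Dataset<G> joinedAndMerged = PromoteActionPayloadFunctions
        .joinGraphTableWithActionPayloadAndMerge(
            rowDS, actionPayloadDS, rowIdFn, actionPayloadIdFn, // order partly inferred
            mergeRowWithActionPayloadAndGetFn, rowClazz, actionPayloadClazz);
    // 2. collapse rows sharing an id down to a single merged row
    return PromoteActionPayloadFunctions
        .groupGraphTableByIdAndMerge(
            joinedAndMerged, rowIdFn, mergeRowsAndGetFn, zeroFn, isNotZeroFn, rowClazz);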
@@ -1,13 +1,13 @@
+
 package eu.dnetlib.dhp.actionmanager.promote;
 
 import static eu.dnetlib.dhp.schema.common.ModelSupport.isSubClass;
 
-import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
-import eu.dnetlib.dhp.schema.oaf.Oaf;
 import java.util.Objects;
 import java.util.Optional;
 import java.util.function.BiFunction;
 import java.util.function.Function;
+
 import org.apache.spark.api.java.function.FilterFunction;
 import org.apache.spark.api.java.function.MapFunction;
 import org.apache.spark.sql.Dataset;

@@ -15,16 +15,19 @@ import org.apache.spark.sql.Encoder;
 import org.apache.spark.sql.Encoders;
 import org.apache.spark.sql.TypedColumn;
 import org.apache.spark.sql.expressions.Aggregator;
+
+import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
+import eu.dnetlib.dhp.schema.oaf.Oaf;
 import scala.Tuple2;
 
 /** Promote action payload functions. */
 public class PromoteActionPayloadFunctions {
 
-private PromoteActionPayloadFunctions() {}
+private PromoteActionPayloadFunctions() {
+}
 
 /**
-* Joins dataset representing graph table with dataset representing action payload using supplied
-* functions.
+* Joins dataset representing graph table with dataset representing action payload using supplied functions.
 *
 * @param rowDS Dataset representing graph table
 * @param actionPayloadDS Dataset representing action payload

@@ -51,8 +54,8 @@ public class PromoteActionPayloadFunctions {
 }
 
 Dataset<Tuple2<String, G>> rowWithIdDS = mapToTupleWithId(rowDS, rowIdFn, rowClazz);
-Dataset<Tuple2<String, A>> actionPayloadWithIdDS =
-mapToTupleWithId(actionPayloadDS, actionPayloadIdFn, actionPayloadClazz);
+Dataset<Tuple2<String, A>> actionPayloadWithIdDS = mapToTupleWithId(
+actionPayloadDS, actionPayloadIdFn, actionPayloadClazz);
 
 return rowWithIdDS
 .joinWith(

@@ -60,21 +63,17 @@ public class PromoteActionPayloadFunctions {
 rowWithIdDS.col("_1").equalTo(actionPayloadWithIdDS.col("_1")),
 "full_outer")
 .map(
-(MapFunction<Tuple2<Tuple2<String, G>, Tuple2<String, A>>, G>)
-value -> {
+(MapFunction<Tuple2<Tuple2<String, G>, Tuple2<String, A>>, G>) value -> {
 Optional<G> rowOpt = Optional.ofNullable(value._1()).map(Tuple2::_2);
 Optional<A> actionPayloadOpt = Optional.ofNullable(value._2()).map(Tuple2::_2);
 return rowOpt
 .map(
-row ->
-actionPayloadOpt
+row -> actionPayloadOpt
 .map(
-actionPayload ->
-mergeAndGetFn.get().apply(row, actionPayload))
+actionPayload -> mergeAndGetFn.get().apply(row, actionPayload))
 .orElse(row))
 .orElseGet(
-() ->
-actionPayloadOpt
+() -> actionPayloadOpt
 .filter(
 actionPayload -> actionPayload.getClass().equals(rowClazz))
 .map(rowClazz::cast)

@@ -86,7 +85,8 @@ public class PromoteActionPayloadFunctions {
 
 private static <T extends Oaf> Dataset<Tuple2<String, T>> mapToTupleWithId(
 Dataset<T> ds, SerializableSupplier<Function<T, String>> idFn, Class<T> clazz) {
-return ds.map(
+return ds
+.map(
 (MapFunction<T, Tuple2<String, T>>) value -> new Tuple2<>(idFn.get().apply(value), value),
 Encoders.tuple(Encoders.STRING(), Encoders.kryo(clazz)));
 }

@@ -110,8 +110,7 @@ public class PromoteActionPayloadFunctions {
 SerializableSupplier<G> zeroFn,
 SerializableSupplier<Function<G, Boolean>> isNotZeroFn,
 Class<G> rowClazz) {
-TypedColumn<G, G> aggregator =
-new TableAggregator<>(zeroFn, mergeAndGetFn, isNotZeroFn, rowClazz).toColumn();
+TypedColumn<G, G> aggregator = new TableAggregator<>(zeroFn, mergeAndGetFn, isNotZeroFn, rowClazz).toColumn();
 return rowDS
 .groupByKey((MapFunction<G, String>) x -> rowIdFn.get().apply(x), Encoders.STRING())
 .agg(aggregator)

@@ -124,10 +123,10 @@ public class PromoteActionPayloadFunctions {
 * @param <G> Type of graph table row
 */
 public static class TableAggregator<G extends Oaf> extends Aggregator<G, G, G> {
-private SerializableSupplier<G> zeroFn;
-private SerializableSupplier<BiFunction<G, G, G>> mergeAndGetFn;
-private SerializableSupplier<Function<G, Boolean>> isNotZeroFn;
-private Class<G> rowClazz;
+private final SerializableSupplier<G> zeroFn;
+private final SerializableSupplier<BiFunction<G, G, G>> mergeAndGetFn;
+private final SerializableSupplier<Function<G, Boolean>> isNotZeroFn;
+private final Class<G> rowClazz;
 
 public TableAggregator(
 SerializableSupplier<G> zeroFn,
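TableAggregator is a standard Spark typed Aggregator, so the grouping above folds every id group through the configured merge function. A sketch of the full round trip; the trailing map from Tuple2 back to the row type is assumed, because the hunk ends at .agg:

    TypedColumn<G, G> aggregator = new TableAggregator<>(zeroFn, mergeAndGetFn, isNotZeroFn, rowClazz).toColumn();
    Dataset<G> merged = rowDS
        // key every row by its id
        .groupByKey((MapFunction<G, String>) x -> rowIdFn.get().apply(x), Encoders.STRING())
        // fold each group through the aggregator's zero/merge/finish cycle
        .agg(aggregator)
        // agg yields (id, row) pairs; keep only the merged row (assumed step)
        .map((MapFunction<Tuple2<String, G>, G>) Tuple2::_2, Encoders.kryo(rowClazz));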
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.actionmanager.partition;
 
 import static eu.dnetlib.dhp.common.ThrowingSupport.rethrowAsRuntimeException;

@@ -5,16 +6,13 @@ import static org.apache.spark.sql.functions.*;
 import static org.junit.jupiter.api.Assertions.assertIterableEquals;
 import static scala.collection.JavaConversions.mutableSeqAsJavaList;
 
-import com.fasterxml.jackson.databind.ObjectMapper;
-import eu.dnetlib.dhp.actionmanager.ISClient;
-import eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest;
-import eu.dnetlib.dhp.schema.oaf.*;
 import java.io.IOException;
 import java.nio.file.Files;
 import java.nio.file.Path;
 import java.nio.file.Paths;
 import java.util.*;
 import java.util.stream.Collectors;
+
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.mapreduce.Job;

@@ -32,24 +30,31 @@ import org.junit.jupiter.api.io.TempDir;
 import org.mockito.Mock;
 import org.mockito.Mockito;
 import org.mockito.junit.jupiter.MockitoExtension;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+
+import eu.dnetlib.dhp.actionmanager.ISClient;
+import eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest;
+import eu.dnetlib.dhp.schema.oaf.*;
 import scala.Tuple2;
 import scala.collection.mutable.Seq;
 
 @ExtendWith(MockitoExtension.class)
 public class PartitionActionSetsByPayloadTypeJobTest {
-private static final ClassLoader cl =
-PartitionActionSetsByPayloadTypeJobTest.class.getClassLoader();
+private static final ClassLoader cl = PartitionActionSetsByPayloadTypeJobTest.class.getClassLoader();
 
 private static Configuration configuration;
 private static SparkSession spark;
 
 private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
 
-private static final StructType ATOMIC_ACTION_SCHEMA =
-StructType$.MODULE$.apply(
-Arrays.asList(
+private static final StructType ATOMIC_ACTION_SCHEMA = StructType$.MODULE$
+.apply(
+Arrays
+.asList(
 StructField$.MODULE$.apply("clazz", DataTypes.StringType, false, Metadata.empty()),
-StructField$.MODULE$.apply(
+StructField$.MODULE$
+.apply(
 "payload", DataTypes.StringType, false, Metadata.empty())));
 
 @BeforeAll

@@ -71,7 +76,8 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 @Nested
 class Main {
 
-@Mock private ISClient isClient;
+@Mock
+private ISClient isClient;
 
 @Test
 public void shouldPartitionActionSetsByPayloadType(@TempDir Path workingDir) throws Exception {

@@ -84,12 +90,14 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 List<String> inputActionSetsPaths = resolveInputActionSetPaths(inputActionSetsBaseDir);
 
 // when
-Mockito.when(isClient.getLatestRawsetPaths(Mockito.anyString()))
+Mockito
+.when(isClient.getLatestRawsetPaths(Mockito.anyString()))
 .thenReturn(inputActionSetsPaths);
 
 PartitionActionSetsByPayloadTypeJob job = new PartitionActionSetsByPayloadTypeJob();
 job.setIsClient(isClient);
-job.run(
+job
+.run(
 Boolean.FALSE,
 "", // it can be empty we're mocking the response from isClient
 // to

@@ -114,7 +122,8 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 
 private List<String> resolveInputActionSetPaths(Path inputActionSetsBaseDir) throws IOException {
 Path inputActionSetJsonDumpsDir = getInputActionSetJsonDumpsDir();
-return Files.list(inputActionSetJsonDumpsDir)
+return Files
+.list(inputActionSetJsonDumpsDir)
 .map(
 path -> {
 String inputActionSetId = path.getFileName().toString();

@@ -128,29 +137,31 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 Path inputActionSetJsonDumpsDir = getInputActionSetJsonDumpsDir();
 
 Map<String, List<String>> oafsByType = new HashMap<>();
-Files.list(inputActionSetJsonDumpsDir)
+Files
+.list(inputActionSetJsonDumpsDir)
 .forEach(
 inputActionSetJsonDumpFile -> {
 String inputActionSetId = inputActionSetJsonDumpFile.getFileName().toString();
 Path inputActionSetDir = inputActionSetsDir.resolve(inputActionSetId);
 
-Dataset<String> actionDS =
-readActionsFromJsonDump(inputActionSetJsonDumpFile.toString()).cache();
+Dataset<String> actionDS = readActionsFromJsonDump(inputActionSetJsonDumpFile.toString()).cache();
 
 writeActionsAsJobInput(actionDS, inputActionSetId, inputActionSetDir.toString());
 
-Map<String, List<String>> actionSetOafsByType =
-actionDS
+Map<String, List<String>> actionSetOafsByType = actionDS
 .withColumn("atomic_action", from_json(col("value"), ATOMIC_ACTION_SCHEMA))
-.select(expr("atomic_action.*")).groupBy(col("clazz"))
-.agg(collect_list(col("payload")).as("payload_list")).collectAsList().stream()
+.select(expr("atomic_action.*"))
+.groupBy(col("clazz"))
+.agg(collect_list(col("payload")).as("payload_list"))
+.collectAsList()
+.stream()
 .map(
-row ->
-new AbstractMap.SimpleEntry<>(
-row.<String>getAs("clazz"),
-mutableSeqAsJavaList(row.<Seq<String>>getAs("payload_list"))))
+row -> new AbstractMap.SimpleEntry<>(
+row.<String> getAs("clazz"),
+mutableSeqAsJavaList(row.<Seq<String>> getAs("payload_list"))))
 .collect(
-Collectors.toMap(
+Collectors
+.toMap(
 AbstractMap.SimpleEntry::getKey, AbstractMap.SimpleEntry::getValue));
 
 actionSetOafsByType

@@ -172,8 +183,10 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 }
 
 private static Path getInputActionSetJsonDumpsDir() {
-return Paths.get(
-Objects.requireNonNull(cl.getResource("eu/dnetlib/dhp/actionmanager/partition/input/"))
+return Paths
+.get(
+Objects
+.requireNonNull(cl.getResource("eu/dnetlib/dhp/actionmanager/partition/input/"))
 .getFile());
 }
 

@@ -195,12 +208,12 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 Path outputDatasetDir = outputDir.resolve(String.format("clazz=%s", clazz.getCanonicalName()));
 Files.exists(outputDatasetDir);
 
-List<T> actuals =
-readActionPayloadFromJobOutput(outputDatasetDir.toString(), clazz).collectAsList();
+List<T> actuals = readActionPayloadFromJobOutput(outputDatasetDir.toString(), clazz).collectAsList();
 actuals.sort(Comparator.comparingInt(Object::hashCode));
 
-List<T> expecteds =
-oafsByClassName.get(clazz.getCanonicalName()).stream()
+List<T> expecteds = oafsByClassName
+.get(clazz.getCanonicalName())
+.stream()
 .map(json -> mapToOaf(json, clazz))
 .sorted(Comparator.comparingInt(Object::hashCode))
 .collect(Collectors.toList());

@@ -214,15 +227,15 @@ public class PartitionActionSetsByPayloadTypeJobTest {
 .read()
 .parquet(path)
 .map(
-(MapFunction<Row, T>)
-value -> OBJECT_MAPPER.readValue(value.<String>getAs("payload"), clazz),
+(MapFunction<Row, T>) value -> OBJECT_MAPPER.readValue(value.<String> getAs("payload"), clazz),
 Encoders.bean(clazz));
 }
 
 private static <T extends Oaf> T mapToOaf(String json, Class<T> clazz) {
 return rethrowAsRuntimeException(
 () -> OBJECT_MAPPER.readValue(json, clazz),
-String.format(
+String
+.format(
 "failed to map json to class: json=%s, class=%s", json, clazz.getCanonicalName()));
 }
 }
@@ -1,3 +1,4 @@
+
 package eu.dnetlib.dhp.actionmanager.promote;
 
 import static eu.dnetlib.dhp.actionmanager.promote.MergeAndGet.Strategy;

@@ -5,12 +6,14 @@ import static eu.dnetlib.dhp.actionmanager.promote.MergeAndGet.functionFor;
 import static org.junit.jupiter.api.Assertions.*;
 import static org.mockito.Mockito.*;
 
-import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
-import eu.dnetlib.dhp.schema.oaf.*;
 import java.util.function.BiFunction;
+
 import org.junit.jupiter.api.Nested;
 import org.junit.jupiter.api.Test;
+
+import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
+import eu.dnetlib.dhp.schema.oaf.*;
 
 public class MergeAndGetTest {
 
 @Nested

@@ -126,8 +129,10 @@ public class MergeAndGetTest {
 @Test
 public void shouldThrowForOafEntityAndOafEntityButNotSubclasses() {
 // given
-class OafEntitySub1 extends OafEntity {}
-class OafEntitySub2 extends OafEntity {}
+class OafEntitySub1 extends OafEntity {
+}
+class OafEntitySub2 extends OafEntity {
+}
 
 OafEntitySub1 a = mock(OafEntitySub1.class);
 OafEntitySub2 b = mock(OafEntitySub2.class);

@@ -166,8 +171,7 @@ public class MergeAndGetTest {
 Relation b = mock(Relation.class);
 
 // when
-SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn =
-functionFor(Strategy.SELECT_NEWER_AND_GET);
+SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);
 
 // then
 assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));

@@ -180,8 +184,7 @@ public class MergeAndGetTest {
 OafEntity b = mock(OafEntity.class);
 
 // when
-SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn =
-functionFor(Strategy.SELECT_NEWER_AND_GET);
+SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);
 
 // then
 assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));

@@ -194,8 +197,7 @@ public class MergeAndGetTest {
 Result b = mock(Result.class);
 
 // when
-SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn =
-functionFor(Strategy.SELECT_NEWER_AND_GET);
+SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);
 
 // then
 assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));

@@ -212,8 +214,7 @@ public class MergeAndGetTest {
 b.setLastupdatetimestamp(2L);
 
 // when
-SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn =
-functionFor(Strategy.SELECT_NEWER_AND_GET);
+SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);
 
 // then
 assertThrows(RuntimeException.class, () -> fn.get().apply(a, b));

@@ -228,8 +229,7 @@ public class MergeAndGetTest {
 when(b.getLastupdatetimestamp()).thenReturn(2L);
 
 // when
-SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn =
-functionFor(Strategy.SELECT_NEWER_AND_GET);
+SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);
 
 // then
 Oaf x = fn.get().apply(a, b);

@@ -246,8 +246,7 @@ public class MergeAndGetTest {
 when(b.getLastupdatetimestamp()).thenReturn(1L);
 
 // when
-SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn =
-functionFor(Strategy.SELECT_NEWER_AND_GET);
+SerializableSupplier<BiFunction<Oaf, Oaf, Oaf>> fn = functionFor(Strategy.SELECT_NEWER_AND_GET);
 
 // then
 Oaf x = fn.get().apply(a, b);
@@ -1,11 +1,9 @@
+
 package eu.dnetlib.dhp.actionmanager.promote;
 
 import static org.junit.jupiter.api.Assertions.*;
 import static org.junit.jupiter.params.provider.Arguments.arguments;
 
-import com.fasterxml.jackson.databind.ObjectMapper;
-import eu.dnetlib.dhp.schema.common.ModelSupport;
-import eu.dnetlib.dhp.schema.oaf.*;
 import java.io.IOException;
 import java.nio.file.Files;
 import java.nio.file.Path;

@@ -15,6 +13,7 @@ import java.util.List;
 import java.util.Objects;
 import java.util.stream.Collectors;
 import java.util.stream.Stream;
+
 import org.apache.commons.io.FileUtils;
 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.function.MapFunction;

@@ -26,9 +25,13 @@ import org.junit.jupiter.params.ParameterizedTest;
 import org.junit.jupiter.params.provider.Arguments;
 import org.junit.jupiter.params.provider.MethodSource;
 
+import com.fasterxml.jackson.databind.ObjectMapper;
+
+import eu.dnetlib.dhp.schema.common.ModelSupport;
+import eu.dnetlib.dhp.schema.oaf.*;
+
 public class PromoteActionPayloadForGraphTableJobTest {
-private static final ClassLoader cl =
-PromoteActionPayloadForGraphTableJobTest.class.getClassLoader();
+private static final ClassLoader cl = PromoteActionPayloadForGraphTableJobTest.class.getClassLoader();
 
 private static SparkSession spark;
 

@@ -52,8 +55,7 @@ public class PromoteActionPayloadForGraphTableJobTest {
 
 @BeforeEach
 public void beforeEach() throws IOException {
-workingDir =
-Files.createTempDirectory(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
+workingDir = Files.createTempDirectory(PromoteActionPayloadForGraphTableJobTest.class.getSimpleName());
 inputDir = workingDir.resolve("input");
 inputGraphRootDir = inputDir.resolve("graph");
 inputActionPayloadRootDir = inputDir.resolve("action_payload");

@@ -81,11 +83,10 @@ public class PromoteActionPayloadForGraphTableJobTest {
 Class<OafEntity> actionPayloadClazz = OafEntity.class;
 
 // when
-RuntimeException exception =
-assertThrows(
+RuntimeException exception = assertThrows(
 RuntimeException.class,
-() ->
-PromoteActionPayloadForGraphTableJob.main(
+() -> PromoteActionPayloadForGraphTableJob
+.main(
 new String[] {
 "-isSparkSessionManaged",
 Boolean.FALSE.toString(),

@@ -104,16 +105,15 @@ public class PromoteActionPayloadForGraphTableJobTest {
 }));
 
 // then
-String msg =
-String.format(
+String msg = String
+.format(
 "graph table class is not a subclass of action payload class: graph=%s, action=%s",
 rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());
 assertTrue(exception.getMessage().contains(msg));
 }
 
 @ParameterizedTest(name = "strategy: {0}, graph table: {1}, action payload: {2}")
-@MethodSource(
-"eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest#promoteJobTestParams")
+@MethodSource("eu.dnetlib.dhp.actionmanager.promote.PromoteActionPayloadForGraphTableJobTest#promoteJobTestParams")
 public void shouldPromoteActionPayloadForGraphTable(
 MergeAndGet.Strategy strategy,
 Class<? extends Oaf> rowClazz,

@@ -121,13 +121,12 @@ public class PromoteActionPayloadForGraphTableJobTest {
 throws Exception {
 // given
 Path inputGraphTableDir = createGraphTable(inputGraphRootDir, rowClazz);
-Path inputActionPayloadDir =
-createActionPayload(inputActionPayloadRootDir, rowClazz, actionPayloadClazz);
-Path outputGraphTableDir =
-outputDir.resolve("graph").resolve(rowClazz.getSimpleName().toLowerCase());
+Path inputActionPayloadDir = createActionPayload(inputActionPayloadRootDir, rowClazz, actionPayloadClazz);
+Path outputGraphTableDir = outputDir.resolve("graph").resolve(rowClazz.getSimpleName().toLowerCase());
 
 // when
-PromoteActionPayloadForGraphTableJob.main(
+PromoteActionPayloadForGraphTableJob
+.main(
 new String[] {
 "-isSparkSessionManaged",
 Boolean.FALSE.toString(),

@@ -148,20 +147,21 @@ public class PromoteActionPayloadForGraphTableJobTest {
 // then
 assertTrue(Files.exists(outputGraphTableDir));
 
-List<? extends Oaf> actualOutputRows =
-readGraphTableFromJobOutput(outputGraphTableDir.toString(), rowClazz).collectAsList()
+List<? extends Oaf> actualOutputRows = readGraphTableFromJobOutput(outputGraphTableDir.toString(), rowClazz)
+.collectAsList()
 .stream()
 .sorted(Comparator.comparingInt(Object::hashCode))
 .collect(Collectors.toList());
-String expectedOutputGraphTableJsonDumpPath =
-resultFileLocation(strategy, rowClazz, actionPayloadClazz);
-Path expectedOutputGraphTableJsonDumpFile =
-Paths.get(
-Objects.requireNonNull(cl.getResource(expectedOutputGraphTableJsonDumpPath))
+String expectedOutputGraphTableJsonDumpPath = resultFileLocation(strategy, rowClazz, actionPayloadClazz);
+Path expectedOutputGraphTableJsonDumpFile = Paths
+.get(
+Objects
+.requireNonNull(cl.getResource(expectedOutputGraphTableJsonDumpPath))
 .getFile());
-List<? extends Oaf> expectedOutputRows =
-readGraphTableFromJsonDump(expectedOutputGraphTableJsonDumpFile.toString(), rowClazz)
-.collectAsList().stream()
+List<? extends Oaf> expectedOutputRows = readGraphTableFromJsonDump(
+expectedOutputGraphTableJsonDumpFile.toString(), rowClazz)
+.collectAsList()
+.stream()
 .sorted(Comparator.comparingInt(Object::hashCode))
 .collect(Collectors.toList());
 assertIterableEquals(expectedOutputRows, actualOutputRows);

@@ -169,7 +169,8 @@ public class PromoteActionPayloadForGraphTableJobTest {
 }
 
 public static Stream<Arguments> promoteJobTestParams() {
-return Stream.of(
+return Stream
+.of(
 arguments(
 MergeAndGet.Strategy.MERGE_FROM_AND_GET,
 eu.dnetlib.dhp.schema.oaf.Dataset.class,

@@ -196,8 +197,8 @@ public class PromoteActionPayloadForGraphTableJobTest {
 
 private static <G extends Oaf> Path createGraphTable(Path inputGraphRootDir, Class<G> rowClazz) {
 String inputGraphTableJsonDumpPath = inputGraphTableJsonDumpLocation(rowClazz);
-Path inputGraphTableJsonDumpFile =
-Paths.get(Objects.requireNonNull(cl.getResource(inputGraphTableJsonDumpPath)).getFile());
+Path inputGraphTableJsonDumpFile = Paths
+.get(Objects.requireNonNull(cl.getResource(inputGraphTableJsonDumpPath)).getFile());
 Dataset<G> rowDS = readGraphTableFromJsonDump(inputGraphTableJsonDumpFile.toString(), rowClazz);
 String inputGraphTableName = rowClazz.getSimpleName().toLowerCase();
 Path inputGraphTableDir = inputGraphRootDir.resolve(inputGraphTableName);

@@ -206,7 +207,8 @@ public class PromoteActionPayloadForGraphTableJobTest {
 }
 
 private static String inputGraphTableJsonDumpLocation(Class<? extends Oaf> rowClazz) {
-return String.format(
+return String
+.format(
 "%s/%s.json",
 "eu/dnetlib/dhp/actionmanager/promote/input/graph", rowClazz.getSimpleName().toLowerCase());
 }

@@ -227,14 +229,12 @@ public class PromoteActionPayloadForGraphTableJobTest {
 
 private static <G extends Oaf, A extends Oaf> Path createActionPayload(
 Path inputActionPayloadRootDir, Class<G> rowClazz, Class<A> actionPayloadClazz) {
-String inputActionPayloadJsonDumpPath =
-inputActionPayloadJsonDumpLocation(rowClazz, actionPayloadClazz);
-Path inputActionPayloadJsonDumpFile =
-Paths.get(Objects.requireNonNull(cl.getResource(inputActionPayloadJsonDumpPath)).getFile());
-Dataset<String> actionPayloadDS =
-readActionPayloadFromJsonDump(inputActionPayloadJsonDumpFile.toString());
-Path inputActionPayloadDir =
-inputActionPayloadRootDir.resolve(actionPayloadClazz.getSimpleName().toLowerCase());
+String inputActionPayloadJsonDumpPath = inputActionPayloadJsonDumpLocation(rowClazz, actionPayloadClazz);
+Path inputActionPayloadJsonDumpFile = Paths
+.get(Objects.requireNonNull(cl.getResource(inputActionPayloadJsonDumpPath)).getFile());
+Dataset<String> actionPayloadDS = readActionPayloadFromJsonDump(inputActionPayloadJsonDumpFile.toString());
+Path inputActionPayloadDir = inputActionPayloadRootDir
+.resolve(actionPayloadClazz.getSimpleName().toLowerCase());
 writeActionPayloadAsJobInput(actionPayloadDS, inputActionPayloadDir.toString());
|
||||
return inputActionPayloadDir;
|
||||
}
|
||||
|
@ -242,7 +242,8 @@ public class PromoteActionPayloadForGraphTableJobTest {
|
|||
private static String inputActionPayloadJsonDumpLocation(
|
||||
Class<? extends Oaf> rowClazz, Class<? extends Oaf> actionPayloadClazz) {
|
||||
|
||||
return String.format(
|
||||
return String
|
||||
.format(
|
||||
"eu/dnetlib/dhp/actionmanager/promote/input/action_payload/%s_table/%s.json",
|
||||
rowClazz.getSimpleName().toLowerCase(), actionPayloadClazz.getSimpleName().toLowerCase());
|
||||
}
|
||||
|
@ -269,7 +270,8 @@ public class PromoteActionPayloadForGraphTableJobTest {
|
|||
MergeAndGet.Strategy strategy,
|
||||
Class<? extends Oaf> rowClazz,
|
||||
Class<? extends Oaf> actionPayloadClazz) {
|
||||
return String.format(
|
||||
return String
|
||||
.format(
|
||||
"eu/dnetlib/dhp/actionmanager/promote/output/graph/%s/%s/%s_action_payload/result.json",
|
||||
strategy.name().toLowerCase(),
|
||||
rowClazz.getSimpleName().toLowerCase(),
|
||||
|
|
|
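Note: every hunk in PromoteActionPayloadForGraphTableJobTest above applies the same mechanical change: wrapped call chains move from a break-after-"=" style (as produced by formatters such as google-java-format) to a leading-dot continuation style. A minimal before/after sketch of the pattern, reusing the String.format call from the test:

    // Removed wrapping style: the expression breaks after the assignment.
    String msg =
        String.format(
            "graph table class is not a subclass of action payload class: graph=%s, action=%s",
            rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());

    // Added wrapping style: the receiver stays on the assignment line and
    // the method name moves to a continuation line starting with the dot.
    String msg = String
        .format(
            "graph table class is not a subclass of action payload class: graph=%s, action=%s",
            rowClazz.getCanonicalName(), actionPayloadClazz.getCanonicalName());

Behaviour is identical in both forms; only the line wrapping differs.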
@@ -1,15 +1,15 @@

 package eu.dnetlib.dhp.actionmanager.promote;

 import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertThrows;

-import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
-import eu.dnetlib.dhp.schema.oaf.Oaf;
 import java.util.Arrays;
 import java.util.List;
 import java.util.Objects;
 import java.util.function.BiFunction;
 import java.util.function.Function;
+
 import org.apache.spark.SparkConf;
 import org.apache.spark.sql.Dataset;
 import org.apache.spark.sql.Encoders;
@@ -19,6 +19,9 @@ import org.junit.jupiter.api.BeforeAll;
 import org.junit.jupiter.api.Nested;
 import org.junit.jupiter.api.Test;

+import eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier;
+import eu.dnetlib.dhp.schema.oaf.Oaf;
+
 public class PromoteActionPayloadFunctionsTest {

     private static SparkSession spark;
@@ -43,13 +46,14 @@ public class PromoteActionPayloadFunctionsTest {
     @Test
     public void shouldThrowWhenTableTypeIsNotSubtypeOfActionPayloadType() {
         // given
-        class OafImpl extends Oaf {}
+        class OafImpl extends Oaf {
+        }

         // when
         assertThrows(
             RuntimeException.class,
-            () ->
-                PromoteActionPayloadFunctions.joinGraphTableWithActionPayloadAndMerge(
+            () -> PromoteActionPayloadFunctions
+                .joinGraphTableWithActionPayloadAndMerge(
                 null, null, null, null, null, OafImplSubSub.class, OafImpl.class));
     }

@@ -61,17 +65,16 @@ public class PromoteActionPayloadFunctionsTest {
         String id2 = "id2";
         String id3 = "id3";
         String id4 = "id4";
-        List<OafImplSubSub> rowData =
-            Arrays.asList(
+        List<OafImplSubSub> rowData = Arrays
+            .asList(
                 createOafImplSubSub(id0),
                 createOafImplSubSub(id1),
                 createOafImplSubSub(id2),
                 createOafImplSubSub(id3));
-        Dataset<OafImplSubSub> rowDS =
-            spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));
+        Dataset<OafImplSubSub> rowDS = spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));

-        List<OafImplSubSub> actionPayloadData =
-            Arrays.asList(
+        List<OafImplSubSub> actionPayloadData = Arrays
+            .asList(
                 createOafImplSubSub(id1),
                 createOafImplSubSub(id2),
                 createOafImplSubSub(id2),
@@ -82,22 +85,20 @@ public class PromoteActionPayloadFunctionsTest {
                 createOafImplSubSub(id4),
                 createOafImplSubSub(id4),
                 createOafImplSubSub(id4));
-        Dataset<OafImplSubSub> actionPayloadDS =
-            spark.createDataset(actionPayloadData, Encoders.bean(OafImplSubSub.class));
+        Dataset<OafImplSubSub> actionPayloadDS = spark
+            .createDataset(actionPayloadData, Encoders.bean(OafImplSubSub.class));

         SerializableSupplier<Function<OafImplSubSub, String>> rowIdFn = () -> OafImplRoot::getId;
-        SerializableSupplier<Function<OafImplSubSub, String>> actionPayloadIdFn =
-            () -> OafImplRoot::getId;
-        SerializableSupplier<BiFunction<OafImplSubSub, OafImplSubSub, OafImplSubSub>> mergeAndGetFn =
-            () ->
-                (x, y) -> {
+        SerializableSupplier<Function<OafImplSubSub, String>> actionPayloadIdFn = () -> OafImplRoot::getId;
+        SerializableSupplier<BiFunction<OafImplSubSub, OafImplSubSub, OafImplSubSub>> mergeAndGetFn = () -> (x,
+            y) -> {
             x.merge(y);
             return x;
         };

         // when
-        List<OafImplSubSub> results =
-            PromoteActionPayloadFunctions.joinGraphTableWithActionPayloadAndMerge(
+        List<OafImplSubSub> results = PromoteActionPayloadFunctions
+            .joinGraphTableWithActionPayloadAndMerge(
             rowDS,
             actionPayloadDS,
             rowIdFn,
@@ -115,7 +116,8 @@ public class PromoteActionPayloadFunctionsTest {
         assertEquals(3, results.stream().filter(x -> x.getId().equals(id3)).count());
         assertEquals(4, results.stream().filter(x -> x.getId().equals(id4)).count());

-        results.forEach(
+        results
+            .forEach(
             result -> {
                 switch (result.getId()) {
                     case "id0":
@@ -143,17 +145,16 @@ public class PromoteActionPayloadFunctionsTest {
         String id2 = "id2";
         String id3 = "id3";
         String id4 = "id4";
-        List<OafImplSubSub> rowData =
-            Arrays.asList(
+        List<OafImplSubSub> rowData = Arrays
+            .asList(
                 createOafImplSubSub(id0),
                 createOafImplSubSub(id1),
                 createOafImplSubSub(id2),
                 createOafImplSubSub(id3));
-        Dataset<OafImplSubSub> rowDS =
-            spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));
+        Dataset<OafImplSubSub> rowDS = spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));

-        List<OafImplSub> actionPayloadData =
-            Arrays.asList(
+        List<OafImplSub> actionPayloadData = Arrays
+            .asList(
                 createOafImplSub(id1),
                 createOafImplSub(id2),
                 createOafImplSub(id2),
@@ -164,22 +165,19 @@ public class PromoteActionPayloadFunctionsTest {
                 createOafImplSub(id4),
                 createOafImplSub(id4),
                 createOafImplSub(id4));
-        Dataset<OafImplSub> actionPayloadDS =
-            spark.createDataset(actionPayloadData, Encoders.bean(OafImplSub.class));
+        Dataset<OafImplSub> actionPayloadDS = spark
+            .createDataset(actionPayloadData, Encoders.bean(OafImplSub.class));

         SerializableSupplier<Function<OafImplSubSub, String>> rowIdFn = () -> OafImplRoot::getId;
-        SerializableSupplier<Function<OafImplSub, String>> actionPayloadIdFn =
-            () -> OafImplRoot::getId;
-        SerializableSupplier<BiFunction<OafImplSubSub, OafImplSub, OafImplSubSub>> mergeAndGetFn =
-            () ->
-                (x, y) -> {
+        SerializableSupplier<Function<OafImplSub, String>> actionPayloadIdFn = () -> OafImplRoot::getId;
+        SerializableSupplier<BiFunction<OafImplSubSub, OafImplSub, OafImplSubSub>> mergeAndGetFn = () -> (x, y) -> {
             x.merge(y);
             return x;
         };

         // when
-        List<OafImplSubSub> results =
-            PromoteActionPayloadFunctions.joinGraphTableWithActionPayloadAndMerge(
+        List<OafImplSubSub> results = PromoteActionPayloadFunctions
+            .joinGraphTableWithActionPayloadAndMerge(
             rowDS,
             actionPayloadDS,
             rowIdFn,
@@ -197,7 +195,8 @@ public class PromoteActionPayloadFunctionsTest {
         assertEquals(3, results.stream().filter(x -> x.getId().equals(id3)).count());
         assertEquals(0, results.stream().filter(x -> x.getId().equals(id4)).count());

-        results.forEach(
+        results
+            .forEach(
             result -> {
                 switch (result.getId()) {
                     case "id0":
@@ -224,31 +223,28 @@ public class PromoteActionPayloadFunctionsTest {
         String id1 = "id1";
         String id2 = "id2";
         String id3 = "id3";
-        List<OafImplSubSub> rowData =
-            Arrays.asList(
+        List<OafImplSubSub> rowData = Arrays
+            .asList(
                 createOafImplSubSub(id1),
                 createOafImplSubSub(id2),
                 createOafImplSubSub(id2),
                 createOafImplSubSub(id3),
                 createOafImplSubSub(id3),
                 createOafImplSubSub(id3));
-        Dataset<OafImplSubSub> rowDS =
-            spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));
+        Dataset<OafImplSubSub> rowDS = spark.createDataset(rowData, Encoders.bean(OafImplSubSub.class));

         SerializableSupplier<Function<OafImplSubSub, String>> rowIdFn = () -> OafImplRoot::getId;
-        SerializableSupplier<BiFunction<OafImplSubSub, OafImplSubSub, OafImplSubSub>> mergeAndGetFn =
-            () ->
-                (x, y) -> {
+        SerializableSupplier<BiFunction<OafImplSubSub, OafImplSubSub, OafImplSubSub>> mergeAndGetFn = () -> (x,
+            y) -> {
             x.merge(y);
             return x;
         };
         SerializableSupplier<OafImplSubSub> zeroFn = OafImplSubSub::new;
-        SerializableSupplier<Function<OafImplSubSub, Boolean>> isNotZeroFn =
-            () -> x -> Objects.nonNull(x.getId());
+        SerializableSupplier<Function<OafImplSubSub, Boolean>> isNotZeroFn = () -> x -> Objects.nonNull(x.getId());

         // when
-        List<OafImplSubSub> results =
-            PromoteActionPayloadFunctions.groupGraphTableByIdAndMerge(
+        List<OafImplSubSub> results = PromoteActionPayloadFunctions
+            .groupGraphTableByIdAndMerge(
             rowDS, rowIdFn, mergeAndGetFn, zeroFn, isNotZeroFn, OafImplSubSub.class)
             .collectAsList();

@@ -258,7 +254,8 @@ public class PromoteActionPayloadFunctionsTest {
         assertEquals(1, results.stream().filter(x -> x.getId().equals(id2)).count());
         assertEquals(1, results.stream().filter(x -> x.getId().equals(id3)).count());

-        results.forEach(
+        results
+            .forEach(
             result -> {
                 switch (result.getId()) {
                     case "id1":
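Note: the mergeAndGetFn suppliers rewrapped throughout PromoteActionPayloadFunctionsTest all encode the same idea: a serializable supplier producing a BiFunction that merges the action payload into the graph row and returns the merged row. A self-contained sketch of that shape, where Row, its merge semantics, and the nested SerializableSupplier are hypothetical stand-ins for the test's Oaf beans and for eu.dnetlib.dhp.common.FunctionalInterfaceSupport.SerializableSupplier:

    import java.io.Serializable;
    import java.util.function.BiFunction;
    import java.util.function.Supplier;

    public class MergeAndGetSketch {

        // Stand-in for FunctionalInterfaceSupport.SerializableSupplier.
        interface SerializableSupplier<T> extends Supplier<T>, Serializable {
        }

        // Hypothetical row type; the merge semantics are invented for illustration.
        static class Row {
            String id;

            void merge(Row other) {
                if (other.id != null) {
                    this.id = other.id;
                }
            }
        }

        public static void main(String[] args) {
            // Same shape as the test's mergeAndGetFn: merge payload y into row x,
            // then return the merged row.
            SerializableSupplier<BiFunction<Row, Row, Row>> mergeAndGetFn = () -> (x, y) -> {
                x.merge(y);
                return x;
            };

            Row row = new Row();
            row.id = "id1";
            Row payload = new Row();
            payload.id = "id1-updated";
            System.out.println(mergeAndGetFn.get().apply(row, payload).id); // id1-updated
        }
    }

Wrapping the lambda in a supplier keeps the function serialization-friendly for Spark, which is presumably why the tests pass suppliers rather than bare lambdas.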
@@ -1,22 +1,21 @@

 package eu.dnetlib.dhp.collection;

-import com.fasterxml.jackson.databind.ObjectMapper;
-import eu.dnetlib.dhp.application.ArgumentApplicationParser;
-import eu.dnetlib.dhp.model.mdstore.MetadataRecord;
-import eu.dnetlib.dhp.model.mdstore.Provenance;
-import eu.dnetlib.message.Message;
-import eu.dnetlib.message.MessageManager;
-import eu.dnetlib.message.MessageType;
+import static eu.dnetlib.dhp.common.SparkSessionSupport.runWithSparkSession;
+
 import java.io.ByteArrayInputStream;
 import java.nio.charset.StandardCharsets;
 import java.util.HashMap;
 import java.util.Map;
 import java.util.Objects;
+import java.util.Optional;
+
 import org.apache.commons.cli.*;
 import org.apache.commons.io.IOUtils;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.Text;
 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaPairRDD;
 import org.apache.spark.api.java.JavaRDD;
 import org.apache.spark.api.java.JavaSparkContext;
@@ -28,9 +27,22 @@ import org.apache.spark.util.LongAccumulator;
 import org.dom4j.Document;
 import org.dom4j.Node;
 import org.dom4j.io.SAXReader;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

+import com.fasterxml.jackson.databind.ObjectMapper;
+
+import eu.dnetlib.dhp.application.ArgumentApplicationParser;
+import eu.dnetlib.dhp.model.mdstore.MetadataRecord;
+import eu.dnetlib.dhp.model.mdstore.Provenance;
+import eu.dnetlib.message.Message;
+import eu.dnetlib.message.MessageManager;
+import eu.dnetlib.message.MessageType;
+
 public class GenerateNativeStoreSparkJob {

     private static final Logger log = LoggerFactory.getLogger(GenerateNativeStoreSparkJob.class);

     public static MetadataRecord parseRecord(
         final String input,
         final String xpath,
@@ -40,20 +52,22 @@ public class GenerateNativeStoreSparkJob {
         final LongAccumulator totalItems,
         final LongAccumulator invalidRecords) {

-        if (totalItems != null) totalItems.add(1);
+        if (totalItems != null)
+            totalItems.add(1);
         try {
             SAXReader reader = new SAXReader();
-            Document document =
-                reader.read(new ByteArrayInputStream(input.getBytes(StandardCharsets.UTF_8)));
+            Document document = reader.read(new ByteArrayInputStream(input.getBytes(StandardCharsets.UTF_8)));
             Node node = document.selectSingleNode(xpath);
             final String originalIdentifier = node.getText();
             if (StringUtils.isBlank(originalIdentifier)) {
-                if (invalidRecords != null) invalidRecords.add(1);
+                if (invalidRecords != null)
+                    invalidRecords.add(1);
                 return null;
             }
             return new MetadataRecord(originalIdentifier, encoding, provenance, input, dateOfCollection);
         } catch (Throwable e) {
-            if (invalidRecords != null) invalidRecords.add(1);
+            if (invalidRecords != null)
+                invalidRecords.add(1);
             e.printStackTrace();
             return null;
         }
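Note: the parseRecord hunk above only rewraps the dom4j parsing code; the logic is unchanged. For reference, a self-contained sketch of the identifier-extraction step it performs (the sample record and xpath here are invented for illustration):

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;

    import org.dom4j.Document;
    import org.dom4j.Node;
    import org.dom4j.io.SAXReader;

    public class XPathIdentifierSketch {
        public static void main(String[] args) throws Exception {
            String input = "<record><header><identifier>oai:example:42</identifier></header></record>";
            SAXReader reader = new SAXReader();
            Document document = reader.read(new ByteArrayInputStream(input.getBytes(StandardCharsets.UTF_8)));
            // A non-matching xpath yields null here; in the job that surfaces as a
            // blank identifier or an exception, and the record is counted as invalid.
            Node node = document.selectSingleNode("//identifier");
            System.out.println(node == null ? "invalid record" : node.getText());
        }
    }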
@@ -61,39 +75,42 @@ public class GenerateNativeStoreSparkJob {

     public static void main(String[] args) throws Exception {

-        final ArgumentApplicationParser parser =
-            new ArgumentApplicationParser(
-                IOUtils.toString(
-                    GenerateNativeStoreSparkJob.class.getResourceAsStream(
+        final ArgumentApplicationParser parser = new ArgumentApplicationParser(
+            IOUtils
+                .toString(
+                    GenerateNativeStoreSparkJob.class
+                        .getResourceAsStream(
                         "/eu/dnetlib/dhp/collection/collection_input_parameters.json")));
         parser.parseArgument(args);
         final ObjectMapper jsonMapper = new ObjectMapper();
         final Provenance provenance = jsonMapper.readValue(parser.get("provenance"), Provenance.class);
         final long dateOfCollection = new Long(parser.get("dateOfCollection"));

-        final SparkSession spark =
-            SparkSession.builder()
-                .appName("GenerateNativeStoreSparkJob")
-                .master(parser.get("master"))
-                .getOrCreate();
+        Boolean isSparkSessionManaged = Optional
+            .ofNullable(parser.get("isSparkSessionManaged"))
+            .map(Boolean::valueOf)
+            .orElse(Boolean.TRUE);
+        log.info("isSparkSessionManaged: {}", isSparkSessionManaged);

         final Map<String, String> ongoingMap = new HashMap<>();
         final Map<String, String> reportMap = new HashMap<>();

-        final boolean test =
-            parser.get("isTest") == null ? false : Boolean.valueOf(parser.get("isTest"));
+        final boolean test = parser.get("isTest") == null ? false : Boolean.valueOf(parser.get("isTest"));

-        final JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());
+        SparkConf conf = new SparkConf();
+        runWithSparkSession(
+            conf,
+            isSparkSessionManaged,
+            spark -> {
+                final JavaSparkContext sc = JavaSparkContext.fromSparkContext(spark.sparkContext());

-        final JavaPairRDD<IntWritable, Text> inputRDD =
-            sc.sequenceFile(parser.get("input"), IntWritable.class, Text.class);
+                final JavaPairRDD<IntWritable, Text> inputRDD = sc
+                    .sequenceFile(parser.get("input"), IntWritable.class, Text.class);

                 final LongAccumulator totalItems = sc.sc().longAccumulator("TotalItems");

                 final LongAccumulator invalidRecords = sc.sc().longAccumulator("InvalidRecords");

-                final MessageManager manager =
-                    new MessageManager(
+                final MessageManager manager = new MessageManager(
                     parser.get("rabbitHost"),
                     parser.get("rabbitUser"),
                     parser.get("rabbitPassword"),
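Note: the main-method hunk above replaces the directly built SparkSession with the runWithSparkSession helper, gated by an isSparkSessionManaged flag. The null-safe defaulting idiom it introduces is worth calling out; a self-contained sketch, with a plain Map standing in for ArgumentApplicationParser:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    public class SessionFlagSketch {
        public static void main(String[] args) {
            // Hypothetical stand-in for parser.get("isSparkSessionManaged"),
            // which may return null when the argument is absent.
            Map<String, String> parsed = new HashMap<>();

            // Same idiom as in the job: an absent flag defaults to a managed session.
            Boolean isSparkSessionManaged = Optional
                .ofNullable(parsed.get("isSparkSessionManaged"))
                .map(Boolean::valueOf)
                .orElse(Boolean.TRUE);

            System.out.println("isSparkSessionManaged: " + isSparkSessionManaged); // true
        }
    }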
@@ -101,11 +118,9 @@ public class GenerateNativeStoreSparkJob {
                     false,
                     null);

-        final JavaRDD<MetadataRecord> mappeRDD =
-            inputRDD
+                final JavaRDD<MetadataRecord> mappeRDD = inputRDD
                     .map(
-                item ->
-                    parseRecord(
+                        item -> parseRecord(
                             item._2().toString(),
                             parser.get("xpath"),
                             parser.get("encoding"),
@@ -118,7 +133,8 @@ public class GenerateNativeStoreSparkJob {

                 ongoingMap.put("ongoing", "0");
                 if (!test) {
-                    manager.sendMessage(
+                    manager
+                        .sendMessage(
                         new Message(
                             parser.get("workflowId"), "DataFrameCreation", MessageType.ONGOING, ongoingMap),
                         parser.get("rabbitOngoingQueue"),
@@ -132,7 +148,8 @@ public class GenerateNativeStoreSparkJob {
                 mdStoreRecords.add(mdstore.count());
                 ongoingMap.put("ongoing", "" + totalItems.value());
                 if (!test) {
-                    manager.sendMessage(
+                    manager
+                        .sendMessage(
                         new Message(
                             parser.get("workflowId"), "DataFrameCreation", MessageType.ONGOING, ongoingMap),
                         parser.get("rabbitOngoingQueue"),
@@ -144,12 +161,15 @@ public class GenerateNativeStoreSparkJob {
                 reportMap.put("invalidRecords", "" + invalidRecords.value());
                 reportMap.put("mdStoreSize", "" + mdStoreRecords.value());
                 if (!test) {
-                    manager.sendMessage(
+                    manager
+                        .sendMessage(
                         new Message(parser.get("workflowId"), "Collection", MessageType.REPORT, reportMap),
                         parser.get("rabbitReportQueue"),
                         true,
                         false);
                     manager.close();
                 }
+            });

     }
 }
Some files were not shown because too many files have changed in this diff.