Compare commits


174 Commits

Author SHA1 Message Date
Giancarlo Panichi 9501585f09 ref 19081: DM Pool Manager uses a deprecated social networking api
https://support.d4science.org/issues/19081

Updated to new Social Networking API
2020-04-16 16:18:04 +02:00
Giancarlo Panichi bd981214c6 Updated project configuration 2020-04-16 10:26:16 +02:00
Giancarlo Panichi 2c476927c6 gCube Release 4.17 2019-12-12 10:12:10 +01:00
Giancarlo Panichi 572a2a3341 ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-12 10:10:28 +01:00
Giancarlo Panichi 66980b42ca gCube Release 4.17 2019-12-11 17:38:00 +01:00
Giancarlo Panichi d773ede89a ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-11 17:35:11 +01:00
Giancarlo Panichi b9b4665376 ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-11 17:33:09 +01:00
Giancarlo Panichi 5e0b6c22cb ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-11 17:30:35 +01:00
Giancarlo Panichi 3c71238354 ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-11 17:30:11 +01:00
Giancarlo Panichi 92dcac5d97 ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-11 17:20:27 +01:00
Giancarlo Panichi a51d1f528a gCube Release 4.17 2019-12-11 17:03:23 +01:00
Giancarlo Panichi 3f5fc26b7c gCube Release 4.17 2019-12-11 15:40:10 +01:00
Giancarlo Panichi 3edd8d6475 ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1
2019-12-11 15:31:12 +01:00
Giancarlo Panichi d2f309405e ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@183317 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-12-11 13:34:16 +00:00
Giancarlo Panichi 925cadb84a ref 18190: Configure SAI and DM to support KNIME 4.1
https://support.d4science.org/issues/18190

Updated to support Knime-Workflow 4.1

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@183316 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-12-11 13:30:47 +00:00
Giancarlo Panichi 209ad868de Fixed default admins
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@179193 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-04-30 10:31:13 +00:00
Giancarlo Panichi 8cc3ae6919 ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173964 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-11-05 16:29:54 +00:00
Giancarlo Panichi 4baf6149c5 ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added


git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173907 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-31 11:41:20 +00:00
Giancarlo Panichi e7acd9aa21 ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added


git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173906 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-31 11:36:17 +00:00
Giancarlo Panichi c3cf33ed1c ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added


git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173905 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-31 11:32:03 +00:00
Giancarlo Panichi dd4578a596 ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173874 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-30 10:04:46 +00:00
Giancarlo Panichi 568ce76968 ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173835 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-26 08:51:59 +00:00
Giancarlo Panichi 4ad0ad749f ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173834 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-26 08:50:25 +00:00
Giancarlo Panichi bda80bbb9c ref 12742: DataMiner - Support Python 3.6
https://support.d4science.org/issues/12742

Python3.6 added

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@173833 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-26 08:46:20 +00:00
Ciro Formisano 060f8953fe git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@171363 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-09-10 09:43:01 +00:00
Ciro Formisano 43bc5d5748 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@169817 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-07-18 09:21:17 +00:00
Ciro Formisano 1d58cc013a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@162010 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-01-09 13:44:26 +00:00
Ciro Formisano 53e0a314b5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@162002 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-01-09 09:45:06 +00:00
Ciro Formisano f1eb9b8e1e git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@160699 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-12-20 11:01:01 +00:00
Ciro Formisano 1460808417 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@160581 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-12-15 15:46:26 +00:00
Ciro Formisano 0ec3e13515 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@160557 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-12-15 12:01:42 +00:00
Ciro Formisano b51f57304f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@160082 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-12-05 15:33:33 +00:00
Ciro Formisano 3ef6fce80d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@160080 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-12-05 15:31:41 +00:00
Ciro Formisano 9315e25d09 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@160078 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-12-05 15:21:18 +00:00
Ciro Formisano 0a6a83698f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@158895 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-11-29 15:29:01 +00:00
Ciro Formisano 23a9d3ab45 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@158301 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-11-08 15:16:16 +00:00
Ciro Formisano d72243f4df git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@158300 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-11-08 15:13:14 +00:00
Ciro Formisano 20535e60ac git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@158299 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-11-08 15:12:38 +00:00
Nunzio Andrea Galante e81ec8230a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@158031 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-30 21:38:11 +00:00
Nunzio Andrea Galante 563f70cfca git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@157810 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-27 21:30:20 +00:00
Nunzio Andrea Galante 1d7e77a3c2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@157787 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-27 17:18:42 +00:00
Nunzio Andrea Galante 3057697992 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@157627 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-26 15:20:30 +00:00
Nunzio Andrea Galante a24185bf03 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@155104 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-11 10:18:36 +00:00
Nunzio Andrea Galante 0206dad592 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@155087 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-10 14:25:11 +00:00
Nunzio Andrea Galante ebcf13638c git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@155028 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-09 16:14:23 +00:00
Nunzio Andrea Galante ed9aceb679 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@154948 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-06 14:07:09 +00:00
Nunzio Andrea Galante 5da8482541 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@154851 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-04 13:13:05 +00:00
Nunzio Andrea Galante a88a8bbc27 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@154813 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-04 10:42:31 +00:00
Nunzio Andrea Galante cf024842dc git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@154792 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-10-03 16:45:57 +00:00
Nunzio Andrea Galante a887165d0a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@154358 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-09-21 09:08:14 +00:00
Nunzio Andrea Galante c29cfe7ee9 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@154344 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-09-20 16:51:21 +00:00
Nunzio Andrea Galante 88759aebd2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@152900 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-09-07 16:46:26 +00:00
Nunzio Andrea Galante be9b269716 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@152628 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-09-01 16:59:47 +00:00
Nunzio Andrea Galante e192e4b1be git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@152606 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-09-01 15:50:01 +00:00
Nunzio Andrea Galante 1d285e5972 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@152602 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-09-01 15:24:42 +00:00
Nunzio Andrea Galante 87dbf00750 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@152535 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-08-29 15:43:31 +00:00
Nunzio Andrea Galante 65623f449e git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151456 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-31 11:21:47 +00:00
Nunzio Andrea Galante b20411fa33 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151410 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-28 10:47:58 +00:00
Nunzio Andrea Galante 10ad01b9cd git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151384 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-27 13:57:53 +00:00
Nunzio Andrea Galante fd29e223c7 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151381 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-27 13:32:36 +00:00
Nunzio Andrea Galante 4a13d73fe8 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151376 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-27 13:03:22 +00:00
Nunzio Andrea Galante b1594d72c5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151372 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-27 09:48:12 +00:00
Nunzio Andrea Galante 23b767226a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151362 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-26 21:46:07 +00:00
Nunzio Andrea Galante 7b62d8354a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151354 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-26 19:35:08 +00:00
Nunzio Andrea Galante 2c1ac7cf2f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151353 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-26 19:21:07 +00:00
Nunzio Andrea Galante 9c1b3668f7 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151319 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-26 12:05:12 +00:00
Nunzio Andrea Galante 28afad7e94 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151308 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-26 09:39:02 +00:00
Nunzio Andrea Galante 8b055202c8 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151268 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-25 13:51:25 +00:00
Nunzio Andrea Galante 3eb0ef1065 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151257 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-25 12:17:52 +00:00
Nunzio Andrea Galante 7a074d742d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151252 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-24 17:07:02 +00:00
Nunzio Andrea Galante 97ceef4387 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151248 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-24 16:51:47 +00:00
Nunzio Andrea Galante 057c37a38d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151242 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-24 15:57:53 +00:00
Nunzio Andrea Galante 6fff1630d4 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151238 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-24 15:47:47 +00:00
Nunzio Andrea Galante abf2ae0d3e git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151228 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-24 14:50:38 +00:00
Nunzio Andrea Galante b4e78ca39a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151222 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-24 10:32:18 +00:00
Nunzio Andrea Galante 23f461784b git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151136 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-18 13:22:29 +00:00
Nunzio Andrea Galante c55230a213 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151131 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-18 12:58:56 +00:00
Nunzio Andrea Galante 57b7c75981 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151119 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-17 21:55:35 +00:00
Nunzio Andrea Galante a22193cf4d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151118 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-17 21:52:54 +00:00
Nunzio Andrea Galante 8ae8c4becf git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151108 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-17 14:43:21 +00:00
Nunzio Andrea Galante e5d4b41372 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151073 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-13 16:49:43 +00:00
Nunzio Andrea Galante 473f8ce2e2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151070 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-13 16:38:03 +00:00
Nunzio Andrea Galante 1255938fa2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@151067 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-13 16:13:18 +00:00
Nunzio Andrea Galante 2d037f21a3 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150919 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-07 13:57:46 +00:00
Nunzio Andrea Galante ae5f8df1a7 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150907 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-07 12:52:56 +00:00
Nunzio Andrea Galante 910857b0bc git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150856 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-06 16:44:03 +00:00
Nunzio Andrea Galante 7500d118b1 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150848 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-06 15:34:45 +00:00
Nunzio Andrea Galante 73d4602510 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150838 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-06 12:54:04 +00:00
Nunzio Andrea Galante ac7fe0f1ef git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150801 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-07-05 15:50:42 +00:00
Nunzio Andrea Galante de161f8153 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150673 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-06-30 13:49:04 +00:00
Nunzio Andrea Galante 58bae45a76 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@150435 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-06-19 16:34:44 +00:00
Gabriele Giammatteo fba1a32a23 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148815 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-17 16:28:44 +00:00
Gabriele Giammatteo 34b2844c72 updated SVNUpdater
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148811 82a268e6-3cf1-43bd-a215-b396298e98cf
2017-05-17 16:16:25 +00:00
Nunzio Andrea Galante a8af13ea2d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148734 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-16 14:50:07 +00:00
Nunzio Andrea Galante a831271e75 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148713 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-16 13:30:57 +00:00
Nunzio Andrea Galante d59a41956d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148704 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-15 16:44:37 +00:00
Nunzio Andrea Galante cece02a666 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148697 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-15 15:51:20 +00:00
Nunzio Andrea Galante 13350b2d34 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148644 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-12 17:11:12 +00:00
Nunzio Andrea Galante 044acabb73 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148632 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-12 16:26:58 +00:00
Nunzio Andrea Galante 18e7158de6 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148627 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-12 15:54:49 +00:00
Nunzio Andrea Galante 85a5d22809 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148626 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-12 15:52:23 +00:00
Nunzio Andrea Galante 407a097b50 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148620 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-12 15:32:14 +00:00
Nunzio Andrea Galante 08188449c1 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148491 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-11 16:08:17 +00:00
Nunzio Andrea Galante 0eb07b35d1 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148485 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-11 15:42:53 +00:00
Nunzio Andrea Galante d428b52929 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148478 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-11 15:03:59 +00:00
Nunzio Andrea Galante 3331ec29b2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148467 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-11 10:58:38 +00:00
Nunzio Andrea Galante 9d3cf1deb6 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148431 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-10 09:31:28 +00:00
Gabriele Giammatteo 745a01d0d1 adding some todo
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148428 82a268e6-3cf1-43bd-a215-b396298e98cf
2017-05-10 09:11:17 +00:00
Gabriele Giammatteo 86433db2a2 refactoring (part 3)
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148427 82a268e6-3cf1-43bd-a215-b396298e98cf
2017-05-10 09:04:04 +00:00
Gabriele Giammatteo 422bd3598e refactoring (part 2)
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148423 82a268e6-3cf1-43bd-a215-b396298e98cf
2017-05-09 16:46:52 +00:00
Gabriele Giammatteo f8959752cc refactoring
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148420 82a268e6-3cf1-43bd-a215-b396298e98cf
2017-05-09 16:38:15 +00:00
Nunzio Andrea Galante c546a46269 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@148412 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-05-09 15:10:03 +00:00
Nunzio Andrea Galante 67951c80e5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147248 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-28 14:26:40 +00:00
Nunzio Andrea Galante 3035a8ee2f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147247 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-28 13:26:47 +00:00
Nunzio Andrea Galante 24c7b210a6 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147195 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-26 15:39:24 +00:00
Nunzio Andrea Galante c381af3416 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147118 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-22 17:24:33 +00:00
Nunzio Andrea Galante 1bf5f76518 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147061 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-21 12:26:25 +00:00
Nunzio Andrea Galante ac561756ab git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147060 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-21 12:24:08 +00:00
Nunzio Andrea Galante 071b477e6d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147059 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-21 12:13:00 +00:00
Nunzio Andrea Galante 68f1f0eada git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147044 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-21 10:47:21 +00:00
Nunzio Andrea Galante 27336e38f2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147011 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-20 13:11:32 +00:00
Nunzio Andrea Galante 6c304b39c6 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@147001 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-20 12:18:05 +00:00
Nunzio Andrea Galante 51124d8b73 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146977 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-20 09:15:11 +00:00
Nunzio Andrea Galante 77ab004ea2 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146971 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-19 16:20:29 +00:00
Nunzio Andrea Galante 61829e2530 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146967 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-19 15:06:24 +00:00
Nunzio Andrea Galante bc31c95bf5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146966 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-19 15:01:21 +00:00
Nunzio Andrea Galante 1a3ccdc31b git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146965 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-19 15:01:13 +00:00
Nunzio Andrea Galante f7488f8b0e git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146964 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-19 14:59:36 +00:00
Nunzio Andrea Galante a7215f24a3 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146962 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-19 14:21:39 +00:00
Nunzio Andrea Galante 1d004af37d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146689 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-07 13:56:40 +00:00
Nunzio Andrea Galante ae1019f4f0 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146658 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-06 13:17:50 +00:00
Nunzio Andrea Galante 3ad395846a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146605 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-06 09:58:51 +00:00
Nunzio Andrea Galante 6ebbad413c git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146601 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-06 08:59:46 +00:00
Nunzio Andrea Galante f41d1cb5d5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146587 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-05 15:14:18 +00:00
Nunzio Andrea Galante e99eccffb3 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146567 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-04 17:32:53 +00:00
Nunzio Andrea Galante 7958aeecfa git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146518 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-04-03 12:35:03 +00:00
Nunzio Andrea Galante b35083da4f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146433 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-29 14:45:22 +00:00
Nunzio Andrea Galante d065e09593 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146432 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-29 14:20:20 +00:00
Nunzio Andrea Galante 486e808d8c git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@146429 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-29 14:06:46 +00:00
Nunzio Andrea Galante 7462519751 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144670 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-03 18:13:52 +00:00
Nunzio Andrea Galante 5ca721c6a6 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144669 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-03 18:10:02 +00:00
Nunzio Andrea Galante ad7006ef2d Share project "dataminer-pool-manager" into "http://svn.research-infrastructures.eu/d4science/gcube"
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144668 82a268e6-3cf1-43bd-a215-b396298e98cf
2017-03-03 18:07:54 +00:00
Nunzio Andrea Galante 1ec0af6eb3 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144535 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-02 13:49:01 +00:00
Nunzio Andrea Galante 5ae31bfffd git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144495 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-01 16:39:12 +00:00
Nunzio Andrea Galante efdbfe0dad git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144487 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-01 15:22:44 +00:00
Nunzio Andrea Galante 4881cb6d16 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144486 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-03-01 15:21:40 +00:00
Nunzio Andrea Galante ae47b4745d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144447 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-28 16:51:04 +00:00
Nunzio Andrea Galante f9a7b0d70c git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144252 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-24 11:03:17 +00:00
Nunzio Andrea Galante 7986056560 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144250 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-24 11:01:02 +00:00
Nunzio Andrea Galante c2a5b0c9e5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144197 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-23 14:28:31 +00:00
Nunzio Andrea Galante c5851b3366 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144143 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-22 16:26:11 +00:00
Nunzio Andrea Galante b8b1038653 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144016 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-21 13:43:30 +00:00
Nunzio Andrea Galante b7fec5b2eb git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@144015 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-21 13:42:26 +00:00
Nunzio Andrea Galante 6c282d1754 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142822 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-21 09:59:25 +00:00
Nunzio Andrea Galante 98c84e35aa git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142773 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-20 13:55:04 +00:00
Nunzio Andrea Galante 5fc7368666 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142772 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-20 13:54:55 +00:00
Nunzio Andrea Galante 009724df5c git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142771 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-20 13:54:40 +00:00
Nunzio Andrea Galante 89c233207a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142770 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-20 13:54:27 +00:00
Nunzio Andrea Galante 8ea1015b34 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142769 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-20 13:54:11 +00:00
Nunzio Andrea Galante 098269242f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142767 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-20 13:52:25 +00:00
Paolo Fabriani d431298f80 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142568 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-15 14:45:50 +00:00
Paolo Fabriani e222dfc70c git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142567 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-15 14:45:07 +00:00
Paolo Fabriani 28433c640f git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@142566 82a268e6-3cf1-43bd-a215-b396298e98cf 2017-02-15 14:43:40 +00:00
Paolo Fabriani bb84400e0d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134766 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-25 10:52:54 +00:00
Paolo Fabriani b562e47e53 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134610 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-23 16:33:26 +00:00
Paolo Fabriani f09b9a2d18 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134609 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-23 16:33:17 +00:00
Paolo Fabriani 57762a41f5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134567 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-23 08:42:15 +00:00
Paolo Fabriani 9ccb883cb6 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134331 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 16:16:35 +00:00
Paolo Fabriani caec5bdf5d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134325 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 14:32:56 +00:00
Paolo Fabriani c0d6c5d286 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134324 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 14:32:22 +00:00
Paolo Fabriani 2431bdce34 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134309 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 11:40:17 +00:00
Paolo Fabriani ebc68aefa5 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134307 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 11:36:42 +00:00
Paolo Fabriani de9f4fa4f3 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134306 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 11:34:53 +00:00
Paolo Fabriani 705dd23458 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/dataminer-pool-manager@134304 82a268e6-3cf1-43bd-a215-b396298e98cf 2016-11-17 11:34:06 +00:00
120 changed files with 7648 additions and 0 deletions

.classpath Executable file

@@ -0,0 +1,37 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" output="target/classes" path="src/main/java">
<attributes>
<attribute name="optional" value="true"/>
<attribute name="maven.pomderived" value="true"/>
</attributes>
</classpathentry>
<classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
<attributes>
<attribute name="maven.pomderived" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="src" output="target/test-classes" path="src/test/java">
<attributes>
<attribute name="optional" value="true"/>
<attribute name="maven.pomderived" value="true"/>
</attributes>
</classpathentry>
<classpathentry excluding="**" kind="src" output="target/test-classes" path="src/test/resources">
<attributes>
<attribute name="maven.pomderived" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8">
<attributes>
<attribute name="maven.pomderived" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
<attributes>
<attribute name="maven.pomderived" value="true"/>
<attribute name="org.eclipse.jst.component.dependency" value="/WEB-INF/lib"/>
</attributes>
</classpathentry>
<classpathentry kind="output" path="target/classes"/>
</classpath>

.gitignore vendored Normal file

@@ -0,0 +1 @@
/target/

.project Executable file

@@ -0,0 +1,37 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>dataminer-pool-manager</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.validation.validationbuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.m2e.core.maven2Builder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jem.workbench.JavaEMFNature</nature>
<nature>org.eclipse.wst.common.modulecore.ModuleCoreNature</nature>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.m2e.core.maven2Nature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

.settings/.jsdtscope Executable file

@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src/main/webapp"/>
<classpathentry excluding="**/bower_components/*|**/node_modules/*|**/*.min.js" kind="src" path="target/m2e-wtp/web-resources"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

@@ -0,0 +1,6 @@
eclipse.preferences.version=1
encoding//src/main/java=UTF-8
encoding//src/main/resources=UTF-8
encoding//src/test/java=UTF-8
encoding//src/test/resources=UTF-8
encoding/<project>=UTF-8

@@ -0,0 +1,8 @@
eclipse.preferences.version=1
org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8
org.eclipse.jdt.core.compiler.compliance=1.8
org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
org.eclipse.jdt.core.compiler.source=1.8

@@ -0,0 +1,4 @@
activeProfiles=
eclipse.preferences.version=1
resolveWorkspaceProjects=true
version=1

@@ -0,0 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?><project-modules id="moduleCoreId" project-version="1.5.0">
<wb-module deploy-name="dataminer-pool-manager-2.7.0-SNAPSHOT">
<wb-resource deploy-path="/" source-path="/target/m2e-wtp/web-resources"/>
<wb-resource deploy-path="/" source-path="/src/main/webapp" tag="defaultRootSource"/>
<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/main/java"/>
<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/main/resources"/>
<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/test/resources"/>
<property name="context-root" value="dataminer-pool-manager"/>
<property name="java-output-path" value="/dataminer-pool-manager/target/classes"/>
</wb-module>
</project-modules>

@@ -0,0 +1,7 @@
<root>
<facet id="jst.jaxrs">
<node name="libprov">
<attribute name="provider-id" value="jaxrs-no-op-library-provider"/>
</node>
</facet>
</root>

@@ -0,0 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<faceted-project>
<fixed facet="wst.jsdt.web"/>
<installed facet="java" version="1.8"/>
<installed facet="jst.web" version="2.3"/>
<installed facet="wst.jsdt.web" version="1.0"/>
<installed facet="jst.jaxrs" version="2.0"/>
</faceted-project>

@@ -0,0 +1 @@
org.eclipse.wst.jsdt.launching.baseBrowserLibrary

@@ -0,0 +1 @@
Window

@@ -0,0 +1,2 @@
disabled=06target
eclipse.preferences.version=1

LICENSE.md Normal file

@@ -0,0 +1,311 @@
#European Union Public Licence V.1.1
##*EUPL © the European Community 2007*
This **European Union Public Licence** (the **“EUPL”**) applies to the Work or Software
(as defined below) which is provided under the terms of this Licence. Any use of
the Work, other than as authorised under this Licence is prohibited (to the
extent such use is covered by a right of the copyright holder of the Work).
The Original Work is provided under the terms of this Licence when the Licensor
(as defined below) has placed the following notice immediately following the
copyright notice for the Original Work:
**Licensed under the EUPL V.1.1**
or has expressed by any other mean his willingness to license under the EUPL.
##1. Definitions
In this Licence, the following terms have the following meaning:
- The Licence: this Licence.
- The Original Work or the Software: the software distributed and/or
communicated by the Licensor under this Licence, available as Source Code and
also as Executable Code as the case may be.
- Derivative Works: the works or software that could be created by the Licensee,
based upon the Original Work or modifications thereof. This Licence does not
define the extent of modification or dependence on the Original Work required
in order to classify a work as a Derivative Work; this extent is determined by
copyright law applicable in the country mentioned in Article 15.
- The Work: the Original Work and/or its Derivative Works.
- The Source Code: the human-readable form of the Work which is the most
convenient for people to study and modify.
- The Executable Code: any code which has generally been compiled and which is
meant to be interpreted by a computer as a program.
- The Licensor: the natural or legal person that distributes and/or communicates
the Work under the Licence.
- Contributor(s): any natural or legal person who modifies the Work under the
Licence, or otherwise contributes to the creation of a Derivative Work.
- The Licensee or “You”: any natural or legal person who makes any usage of the
Software under the terms of the Licence.
- Distribution and/or Communication: any act of selling, giving, lending,
renting, distributing, communicating, transmitting, or otherwise making
available, on-line or off-line, copies of the Work or providing access to its
essential functionalities at the disposal of any other natural or legal
person.
##2. Scope of the rights granted by the Licence
The Licensor hereby grants You a world-wide, royalty-free, non-exclusive,
sub-licensable licence to do the following, for the duration of copyright vested
in the Original Work:
- use the Work in any circumstance and for all usage, reproduce the Work, modify
- the Original Work, and make Derivative Works based upon the Work, communicate
- to the public, including the right to make available or display the Work or
- copies thereof to the public and perform publicly, as the case may be, the
- Work, distribute the Work or copies thereof, lend and rent the Work or copies
- thereof, sub-license rights in the Work or copies thereof.
Those rights can be exercised on any media, supports and formats, whether now
known or later invented, as far as the applicable law permits so.
In the countries where moral rights apply, the Licensor waives his right to
exercise his moral right to the extent allowed by law in order to make effective
the licence of the economic rights here above listed.
The Licensor grants to the Licensee royalty-free, non exclusive usage rights to
any patents held by the Licensor, to the extent necessary to make use of the
rights granted on the Work under this Licence.
##3. Communication of the Source Code
The Licensor may provide the Work either in its Source Code form, or as
Executable Code. If the Work is provided as Executable Code, the Licensor
provides in addition a machine-readable copy of the Source Code of the Work
along with each copy of the Work that the Licensor distributes or indicates, in
a notice following the copyright notice attached to the Work, a repository where
the Source Code is easily and freely accessible for as long as the Licensor
continues to distribute and/or communicate the Work.
##4. Limitations on copyright
Nothing in this Licence is intended to deprive the Licensee of the benefits from
any exception or limitation to the exclusive rights of the rights owners in the
Original Work or Software, of the exhaustion of those rights or of other
applicable limitations thereto.
##5. Obligations of the Licensee
The grant of the rights mentioned above is subject to some restrictions and
obligations imposed on the Licensee. Those obligations are the following:
Attribution right: the Licensee shall keep intact all copyright, patent or
trademarks notices and all notices that refer to the Licence and to the
disclaimer of warranties. The Licensee must include a copy of such notices and a
copy of the Licence with every copy of the Work he/she distributes and/or
communicates. The Licensee must cause any Derivative Work to carry prominent
notices stating that the Work has been modified and the date of modification.
Copyleft clause: If the Licensee distributes and/or communicates copies of the
Original Works or Derivative Works based upon the Original Work, this
Distribution and/or Communication will be done under the terms of this Licence
or of a later version of this Licence unless the Original Work is expressly
distributed only under this version of the Licence. The Licensee (becoming
Licensor) cannot offer or impose any additional terms or conditions on the Work
or Derivative Work that alter or restrict the terms of the Licence.
Compatibility clause: If the Licensee Distributes and/or Communicates Derivative
Works or copies thereof based upon both the Original Work and another work
licensed under a Compatible Licence, this Distribution and/or Communication can
be done under the terms of this Compatible Licence. For the sake of this clause,
“Compatible Licence” refers to the licences listed in the appendix attached to
this Licence. Should the Licensees obligations under the Compatible Licence
conflict with his/her obligations under this Licence, the obligations of the
Compatible Licence shall prevail.
Provision of Source Code: When distributing and/or communicating copies of the
Work, the Licensee will provide a machine-readable copy of the Source Code or
indicate a repository where this Source will be easily and freely available for
as long as the Licensee continues to distribute and/or communicate the Work.
Legal Protection: This Licence does not grant permission to use the trade names,
trademarks, service marks, or names of the Licensor, except as required for
reasonable and customary use in describing the origin of the Work and
reproducing the content of the copyright notice.
##6. Chain of Authorship
The original Licensor warrants that the copyright in the Original Work granted
hereunder is owned by him/her or licensed to him/her and that he/she has the
power and authority to grant the Licence.
Each Contributor warrants that the copyright in the modifications he/she brings
to the Work are owned by him/her or licensed to him/her and that he/she has the
power and authority to grant the Licence.
Each time You accept the Licence, the original Licensor and subsequent
Contributors grant You a licence to their contributions to the Work, under the
terms of this Licence.
##7. Disclaimer of Warranty
The Work is a work in progress, which is continuously improved by numerous
contributors. It is not a finished work and may therefore contain defects or
“bugs” inherent to this type of software development.
For the above reason, the Work is provided under the Licence on an “as is” basis
and without warranties of any kind concerning the Work, including without
limitation merchantability, fitness for a particular purpose, absence of defects
or errors, accuracy, non-infringement of intellectual property rights other than
copyright as stated in Article 6 of this Licence.
This disclaimer of warranty is an essential part of the Licence and a condition
for the grant of any rights to the Work.
##8. Disclaimer of Liability
Except in the cases of wilful misconduct or damages directly caused to natural
persons, the Licensor will in no event be liable for any direct or indirect,
material or moral, damages of any kind, arising out of the Licence or of the use
of the Work, including without limitation, damages for loss of goodwill, work
stoppage, computer failure or malfunction, loss of data or any commercial
damage, even if the Licensor has been advised of the possibility of such
damage. However, the Licensor will be liable under statutory product liability
laws as far such laws apply to the Work.
##9. Additional agreements
While distributing the Original Work or Derivative Works, You may choose to
conclude an additional agreement to offer, and charge a fee for, acceptance of
support, warranty, indemnity, or other liability obligations and/or services
consistent with this Licence. However, in accepting such obligations, You may
act only on your own behalf and on your sole responsibility, not on behalf of
the original Licensor or any other Contributor, and only if You agree to
indemnify, defend, and hold each Contributor harmless for any liability incurred
by, or claims asserted against such Contributor by the fact You have accepted
any such warranty or additional liability.
##10. Acceptance of the Licence
The provisions of this Licence can be accepted by clicking on an icon “I agree”
placed under the bottom of a window displaying the text of this Licence or by
affirming consent in any other similar way, in accordance with the rules of
applicable law. Clicking on that icon indicates your clear and irrevocable
acceptance of this Licence and all of its terms and conditions.
Similarly, you irrevocably accept this Licence and all of its terms and
conditions by exercising any rights granted to You by Article 2 of this Licence,
such as the use of the Work, the creation by You of a Derivative Work or the
Distribution and/or Communication by You of the Work or copies thereof.
##11. Information to the public
In case of any Distribution and/or Communication of the Work by means of
electronic communication by You (for example, by offering to download the Work
from a remote location) the distribution channel or media (for example, a
website) must at least provide to the public the information requested by the
applicable law regarding the Licensor, the Licence and the way it may be
accessible, concluded, stored and reproduced by the Licensee.
##12. Termination of the Licence
The Licence and the rights granted hereunder will terminate automatically upon
any breach by the Licensee of the terms of the Licence.
Such a termination will not terminate the licences of any person who has
received the Work from the Licensee under the Licence, provided such persons
remain in full compliance with the Licence.
##13. Miscellaneous
Without prejudice of Article 9 above, the Licence represents the complete
agreement between the Parties as to the Work licensed hereunder.
If any provision of the Licence is invalid or unenforceable under applicable
law, this will not affect the validity or enforceability of the Licence as a
whole. Such provision will be construed and/or reformed so as necessary to make
it valid and enforceable.
The European Commission may publish other linguistic versions and/or new
versions of this Licence, so far this is required and reasonable, without
reducing the scope of the rights granted by the Licence. New versions of the
Licence will be published with a unique version number.
All linguistic versions of this Licence, approved by the European Commission,
have identical value. Parties can take advantage of the linguistic version of
their choice.
##14. Jurisdiction
Any litigation resulting from the interpretation of this License, arising
between the European Commission, as a Licensor, and any Licensee, will be
subject to the jurisdiction of the Court of Justice of the European Communities,
as laid down in article 238 of the Treaty establishing the European Community.
Any litigation arising between Parties, other than the European Commission, and
resulting from the interpretation of this License, will be subject to the
exclusive jurisdiction of the competent court where the Licensor resides or
conducts its primary business.
##15. Applicable Law
This Licence shall be governed by the law of the European Union country where
the Licensor resides or has his registered office.
This licence shall be governed by the Belgian law if:
- a litigation arises between the European Commission, as a Licensor, and any
- Licensee; the Licensor, other than the European Commission, has no residence
- or registered office inside a European Union country.
---
##Appendix
**“Compatible Licences”** according to article 5 EUPL are:
- GNU General Public License (GNU GPL) v. 2
- Open Software License (OSL) v. 2.1, v. 3.0
- Common Public License v. 1.0
- Eclipse Public License v. 1.0
- Cecill v. 2.0

README.md Normal file

@@ -0,0 +1,48 @@
# DataMiner Pool Manager
DataMiner Pool Manager is a service to support the integration of algorithms in D4Science Infrastructure.
## Structure of the project
* The source code is present in the src folder.
## Built With
* [OpenJDK](https://openjdk.java.net/) - The JDK used
* [Maven](https://maven.apache.org/) - Dependency Management
## Documentation
* Use of this service is described on the [Wiki](https://wiki.gcube-system.org/gcube/How_to_use_the_DataMiner_Pool_Manager).
## Change log
See [Releases](https://code-repo.d4science.org/gCubeSystem/dataminer-pool-manager/releases).
## Authors
* **Paolo Fabriani** - [Engineering Ingegneria Informatica S.p.A., Italy](https://www.eng.it/)
* **Nunzio Andrea Galante** - [Engineering Ingegneria Informatica S.p.A., Italy](https://www.eng.it/)
* **Ciro Formisano** - [Engineering Ingegneria Informatica S.p.A., Italy](https://www.eng.it/)
## License
This project is licensed under the EUPL V.1.1 License - see the [LICENSE.md](LICENSE.md) file for details.
## About the gCube Framework
This software is part of the [gCubeFramework](https://www.gcube-system.org/ "gCubeFramework"): an
open-source software toolkit used for building and operating Hybrid Data
Infrastructures enabling the dynamic deployment of Virtual Research Environments
by favouring the realisation of reuse oriented policies.
The projects leading to this software have received funding from a series of European Union programmes including:
- the Sixth Framework Programme for Research and Technological Development
- DILIGENT (grant no. 004260);
- the Seventh Framework Programme for research, technological development and demonstration
- D4Science (grant no. 212488), D4Science-II (grant no. 239019), ENVRI (grant no. 283465), EUBrazilOpenBio (grant no. 288754), iMarine (grant no. 283644);
- the H2020 research and innovation programme
- BlueBRIDGE (grant no. 675680), EGIEngage (grant no. 654142), ENVRIplus (grant no. 654182), Parthenos (grant no. 654119), SoBigData (grant no. 654024), DESIRA (grant no. 818194), ARIADNEplus (grant no. 823914), RISIS2 (grant no. 824091), PerformFish (grant no. 727610), AGINFRAplus (grant no. 731001);

changelog.xml Normal file

@@ -0,0 +1,33 @@
<ReleaseNotes>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-7-0" date="2020-04-16">
<Change>Updated to new Social Networking API [ticket #19081]</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-6-0" date="2019-12-11">
<Change>Updated to Git and Jenkins</Change>
<Change>Knime 4.1 added [ticket #18190]</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-5-0" date="2019-01-11">
<Change>SVN parameters retrieved from the IS</Change>
<Change>Python3.6 added [ticket #12742]</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-4-0" date="2018-11-01">
<Change>Notifies the 'conflicts' on SVN</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-3-0" date="2018-08-01">
<Change>Log information also if the verification of SVN dependencies fails</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-2-0" date="2018-06-01">
<Change>Improvements and bugs fixed/SVN-UTF8 compliant/Installation in Production ghost added</Change>
<Change>Dynamic per-VRE configuration through the information system</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-1-0" date="2018-05-01">
<Change>New configuration file added</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-0-0" date="2018-01-01">
<Change>Second Release</Change>
</Changeset>
<Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.1-0-0" date="2017-11-01">
<Change>First Release</Change>
</Changeset>
</ReleaseNotes>
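The changelog above follows a simple `ReleaseNotes`/`Changeset`/`Change` schema, with the component coordinates and date carried as attributes. As an illustration only (not part of the project), a few lines of Python can extract that history; the embedded sample reuses two of the changesets shown above:

```python
# Illustrative only: parse a gCube-style changelog.xml (ReleaseNotes schema).
import xml.etree.ElementTree as ET

CHANGELOG = """<ReleaseNotes>
  <Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-7-0" date="2020-04-16">
    <Change>Updated to new Social Networking API [ticket #19081]</Change>
  </Changeset>
  <Changeset component="org.gcube.dataanalysis.dataminer-pool-manager.2-6-0" date="2019-12-11">
    <Change>Updated to Git and Jenkins</Change>
    <Change>Knime 4.1 added [ticket #18190]</Change>
  </Changeset>
</ReleaseNotes>"""

def changesets(xml_text):
    """Yield (component, date, [changes]) for each Changeset element."""
    root = ET.fromstring(xml_text)
    for cs in root.findall("Changeset"):
        yield (cs.get("component"), cs.get("date"),
               [c.text for c in cs.findall("Change")])

for component, date, changes in changesets(CHANGELOG):
    print(date, component, "-", "; ".join(changes))
```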

descriptor.xml Normal file

@@ -0,0 +1,31 @@
<assembly
xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
<id>servicearchive</id>
<formats>
<format>tar.gz</format>
</formats>
<baseDirectory>/</baseDirectory>
<fileSets>
<fileSet>
<outputDirectory>/</outputDirectory>
<useDefaultExcludes>true</useDefaultExcludes>
<includes>
<include>README.md</include>
<include>LICENSE.md</include>
<include>changelog.xml</include>
<include>profile.xml</include>
<include>gcube-app.xml</include>
</includes>
<fileMode>755</fileMode>
<filtered>true</filtered>
</fileSet>
</fileSets>
<files>
<file>
<source>target/${build.finalName}.${project.packaging}</source>
<outputDirectory>/${artifactId}</outputDirectory>
</file>
</files>
</assembly>
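The descriptor above tells the Maven Assembly Plugin to produce a `servicearchive` tar.gz containing the listed metadata files at the archive root (mode 755, filtered) plus the built war under a directory named after the artifact. A rough Python sketch of the resulting layout, with illustrative file contents and names rather than the actual Maven build output:

```python
# Sketch of what the servicearchive assembly produces: a tar.gz holding the
# listed metadata files plus the war under /<artifactId>. Contents are
# placeholders, not real build artifacts.
import io
import tarfile

def build_servicearchive(files):
    """files: {archive_path: bytes}; returns the tar.gz archive as bytes."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for path, data in files.items():
            info = tarfile.TarInfo(path)
            info.size = len(data)
            info.mode = 0o755  # matches <fileMode>755</fileMode> in the descriptor
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

archive = build_servicearchive({
    "README.md": b"# DataMiner Pool Manager\n",
    "LICENSE.md": b"EUPL V.1.1\n",
    "changelog.xml": b"<ReleaseNotes/>\n",
    "dataminer-pool-manager/dataminer-pool-manager-2.7.0-SNAPSHOT.war": b"",
})
```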

gcube-app.xml Normal file

@@ -0,0 +1,9 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE xml>
<application mode="online">
<name>${serviceName}</name>
<group>${serviceClass}</group>
<version>${version}</version>
<description>${description}</description>
<exclude>/api/swagger.*</exclude>
</application>
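The `${serviceName}`, `${serviceClass}`, `${version}`, and `${description}` tokens in gcube-app.xml are filled in at build time by Maven resource filtering (`<filtered>true</filtered>` in the descriptor, and `<filtering>true</filtering>` in the pom's maven-resources-plugin execution). A minimal sketch of that substitution mechanism, with example values taken from the pom's `<properties>` section:

```python
# Minimal sketch of Maven-style ${property} filtering as applied to
# gcube-app.xml; property values here are examples, not the real build.
import re

def filter_resource(text, props):
    """Replace ${key} tokens with values from props; unknown keys are left as-is."""
    return re.sub(r"\$\{([^}]+)\}",
                  lambda m: str(props.get(m.group(1), m.group(0))),
                  text)

template = "<name>${serviceName}</name><group>${serviceClass}</group>"
props = {"serviceName": "dataminer-pool-manager", "serviceClass": "DataAnalysis"}
print(filter_resource(template, props))
# -> <name>dataminer-pool-manager</name><group>DataAnalysis</group>
```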

pom.xml Normal file

@@ -0,0 +1,215 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>maven-parent</artifactId>
<groupId>org.gcube.tools</groupId>
<version>1.1.0</version>
<relativePath />
</parent>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>dataminer-pool-manager</artifactId>
<packaging>war</packaging>
<version>2.7.0-SNAPSHOT</version>
<name>dataminer-pool-manager</name>
<description>DataMiner Pool Manager is a service to support the integration of algorithms in D4Science Infrastructure</description>
<scm>
<connection>scm:git:https://code-repo.d4science.org/gCubeSystem/${project.artifactId}.git</connection>
<developerConnection>scm:git:https://code-repo.d4science.org/gCubeSystem/${project.artifactId}.git</developerConnection>
<url>https://code-repo.d4science.org/gCubeSystem/${project.artifactId}</url>
</scm>
<properties>
<serviceName>dataminer-pool-manager</serviceName>
<serviceClass>DataAnalysis</serviceClass>
<webappDirectory>${project.basedir}/src/main/webapp/WEB-INF</webappDirectory>
<distroDirectory>distro</distroDirectory>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<version.swagger>1.5.13</version.swagger>
<version.jersey>2.25.1</version.jersey>
</properties>
<dependencies>
<dependency>
<groupId>org.gcube.core</groupId>
<artifactId>common-smartgears</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.gcube.core</groupId>
<artifactId>common-smartgears-app</artifactId>
</dependency>
<dependency>
<groupId>org.gcube.resources.discovery</groupId>
<artifactId>ic-client</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.3.4</version>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>[2.5.0,2.6.0)</version>
</dependency>
<dependency>
<groupId>org.antlr</groupId>
<artifactId>stringtemplate</artifactId>
<version>[4.0.0, 4.1.0)</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20090211</version>
</dependency>
<dependency>
<groupId>org.tmatesoft.svnkit</groupId>
<artifactId>svnkit</artifactId>
<version>1.8.5</version>
</dependency>
<dependency>
<groupId>commons-configuration</groupId>
<artifactId>commons-configuration</artifactId>
<version>1.10</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.jcraft/jsch -->
<dependency>
<groupId>com.jcraft</groupId>
<artifactId>jsch</artifactId>
<version>0.1.53</version>
</dependency>
<dependency>
<groupId>net.sf.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>2.3</version>
</dependency>
<dependency>
<groupId>org.yaml</groupId>
<artifactId>snakeyaml</artifactId>
<version>1.16</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-json-jackson</artifactId>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-servlet</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>provided</scope>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.gcube.distribution</groupId>
<artifactId>gcube-smartgears-bom</artifactId>
<version>1.1.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>copy-profile</id>
<goals>
<goal>copy-resources</goal>
</goals>
<phase>process-resources</phase>
<configuration>
<outputDirectory>${webappDirectory}</outputDirectory>
<resources>
<resource>
<directory>${project.basedir}</directory>
<filtering>true</filtering>
<excludes>
<exclude>src</exclude>
</excludes>
<includes>
<include>LICENSE.md</include>
<include>README.md</include>
<include>gcube-app.xml</include>
<include>changelog.xml</include>
<include>profile.xml</include>
</includes>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptors>
<descriptor>descriptor.xml</descriptor>
</descriptors>
</configuration>
<executions>
<execution>
<id>servicearchive</id>
<phase>install</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>

profile.xml Normal file

@ -0,0 +1,25 @@
<?xml version="1.0" encoding="UTF-8"?>
<Resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<ID />
<Type>Service</Type>
<Profile>
<Description>${description}</Description>
<Class>DataminerPoolManager</Class>
<Name>${artifactId}</Name>
<Version>1.0.0</Version>
<Packages>
<Software>
<Name>${artifactId}</Name>
<Version>${version}</Version>
<MavenCoordinates>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
</MavenCoordinates>
<Files>
<File>${build.finalName}.jar</File>
</Files>
</Software>
</Packages>
</Profile>
</Resource>


@ -0,0 +1,140 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.PrintStream;
import java.util.Scanner;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Inventory;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Playbook;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleSerializeHelper;
import org.tmatesoft.svn.core.SVNException;
/**
* This class is responsible for the interface with Ansible: it serializes
* the inventory, playbook and roles, runs ansible-playbook and collects its
* log. It is not supposed to access templates or static files, and it does
* not know the service data model.
*
* @author paolo
*
*/
public class AnsibleWorker {
/**
* The name of the inventory file
*/
private static final String INVENTORY_NAME = "inventory.yaml";
/**
* The directory containing roles
*/
private static final String ROLES_DIR = "roles";
/**
* The name of the playbook file
*/
private static final String PLAYBOOK_NAME = "playbook.yaml";
/**
* The root of the worker. This corresponds to a standard ansible working dir.
*/
private File workerRoot;
public AnsibleWorker(File root) {
this.workerRoot = root;
this.ensureWorkStructure();
}
// public File getWorkdir() {
// return this.workerRoot;
// }
public File getRolesDir() {
return new File(this.workerRoot, ROLES_DIR);
}
public String getWorkerId() {
return this.workerRoot.getName();
}
public void ensureWorkStructure() {
// generate root
this.workerRoot.mkdirs();
}
public void removeWorkStructure() {
// remove the working dir; File.delete() cannot remove a non-empty
// directory, so delete recursively via commons-io (already a dependency)
try {
org.apache.commons.io.FileUtils.deleteDirectory(this.workerRoot);
} catch (IOException e) {
e.printStackTrace();
}
}
public File getPlaybookFile() {
return new File(this.workerRoot, PLAYBOOK_NAME);
}
public File getInventoryFile() {
return new File(this.workerRoot, INVENTORY_NAME);
}
public void setInventory(Inventory inventory) throws IOException {
// serialize the string to the 'inventory' file
AnsibleSerializeHelper.serialize(inventory, this.getInventoryFile());
}
public void setPlaybook(Playbook playbook) throws IOException {
// serialize the string to the 'playbook' file
AnsibleSerializeHelper.serialize(playbook, this.getPlaybookFile());
}
public void addRole(Role r) throws IOException {
// Serialize role in the workdir
AnsibleSerializeHelper.serializeRole(r, this.getRolesDir());
}
public int execute(PrintStream ps)
throws IOException, InterruptedException, SVNException {
System.out.println(this.workerRoot);
try {
Process p = Runtime.getRuntime().exec("ansible-playbook -v -i " + this.getInventoryFile().getAbsolutePath()
+ " " + this.getPlaybookFile().getAbsolutePath());
inheritIO(p.getInputStream(), ps);
inheritIO(p.getErrorStream(), ps);
// writer.println(this.getStatus(p.waitFor()));
// writer.close();
return p.waitFor();
} catch (IOException e) {
e.printStackTrace();
}
return -1;
}
private static void inheritIO(final InputStream src, final PrintStream dest) {
new Thread(new Runnable() {
public void run() {
Scanner sc = new Scanner(src);
while (sc.hasNextLine()) {
dest.println(sc.nextLine());
}
sc.close();
}
}).start();
}
/**
* Destroy the worker:
* - remove the working dir
*/
public void destroy() {
this.removeWorkStructure();
}
}
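The `execute` method above pumps the subprocess's stdout and stderr to a log stream via `inheritIO`. That pattern can be exercised on its own with plain JDK streams; the class name and sample text below are illustrative only, not part of the service:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.PrintStream;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class StreamPumpDemo {

    // Copy every line from src to dest on a background thread, as
    // AnsibleWorker.inheritIO does for the ansible-playbook process streams.
    static Thread pump(final InputStream src, final PrintStream dest) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                Scanner sc = new Scanner(src);
                while (sc.hasNextLine()) {
                    dest.println(sc.nextLine());
                }
                sc.close();
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws InterruptedException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        PrintStream dest = new PrintStream(sink);
        InputStream src = new ByteArrayInputStream(
                "ok: [dm1]\nPLAY RECAP\n".getBytes(StandardCharsets.UTF_8));
        pump(src, dest).join(); // join only to make the demo deterministic
        dest.flush();
        System.out.print(sink.toString());
    }
}
```

Joining the pump thread is for the demo only; `execute` instead relies on `p.waitFor()` and lets the pump threads drain the remaining output.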


@ -0,0 +1,19 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible.model;
public class AnsibleHost {
private String name;
public AnsibleHost(String name) {
this.name = name;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}


@ -0,0 +1,29 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible.model;
import java.util.Collection;
import java.util.Vector;
public class HostGroup {
private String name;
private Collection<AnsibleHost> hosts;
public HostGroup(String name) {
this.name = name;
this.hosts = new Vector<>();
}
public void addHost(AnsibleHost h) {
this.hosts.add(h);
}
public String getName() {
return this.name;
}
public Collection<AnsibleHost> getHosts() {
return new Vector<>(this.hosts);
}
}


@ -0,0 +1,37 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible.model;
import java.util.Collection;
import java.util.Vector;
public class Inventory {
private Collection<HostGroup> groups;
public Inventory() {
this.groups = new Vector<>();
}
public void addGroup(HostGroup group) {
this.groups.add(group);
}
public void addHost(AnsibleHost h, String groupName) {
this.getGroup(groupName).addHost(h);
}
private HostGroup getGroup(String groupName) {
for (HostGroup hg : this.groups) {
if (groupName.equals(hg.getName())) {
return hg;
}
}
HostGroup hg = new HostGroup(groupName);
this.groups.add(hg);
return hg;
}
public Collection<HostGroup> getHostGroups() {
return new Vector<>(this.groups);
}
}


@ -0,0 +1,50 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible.model;
import java.util.List;
import java.util.Vector;
public class Playbook {
private String hostGroupName;
private List<String> roles;
private String remote_user;
public Playbook() {
this.roles = new Vector<>();
}
public void addRole(String role) {
roles.add(role);
}
public void applyTo(String hostGroupName) {
this.hostGroupName = hostGroupName;
}
public String getHostGroupName() {
return hostGroupName;
}
public List<String> getRoles() {
return new Vector<>(roles);
}
public String getRemote_user() {
return remote_user;
}
public void setRemote_user(String remote_user) {
this.remote_user = remote_user;
}
public void setHostGroupName(String hostGroupName) {
this.hostGroupName = hostGroupName;
}
public void setRoles(List<String> roles) {
this.roles = roles;
}
}
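A `Playbook` carries just enough state for `AnsibleSerializeHelper` to emit a one-play YAML document. The sketch below reproduces that shape with plain string building; the two-space indentation and the sample role name are assumptions (the listing's leading whitespace has been lost):

```java
import java.util.Arrays;
import java.util.List;

public class PlaybookFormatDemo {

    // Render the single-play YAML document the pool manager writes for a
    // Playbook: host group, remote user, role list and a fixed vars section.
    static String render(String hostGroup, String remoteUser, List<String> roles) {
        StringBuilder out = new StringBuilder();
        out.append("- hosts: ").append(hostGroup).append('\n');
        out.append("  remote_user: ").append(remoteUser).append('\n');
        out.append("  roles:\n");
        for (String role : roles) {
            out.append("    - ").append(role).append('\n');
        }
        out.append("  vars:\n");
        out.append("    os_package_state: present\n");
        return out.toString().trim();
    }

    public static void main(String[] args) {
        // hypothetical role name, for illustration only
        System.out.println(render("universe", "gcube",
                Arrays.asList("gcube-algorithm-Example")));
    }
}
```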


@ -0,0 +1,51 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible.model;
import java.util.Collection;
import java.util.Vector;
public class Role {
/**
* The name of the role
*/
private String name;
private Collection<RoleFile> tasks;
private Collection<RoleFile> meta;
public Role() {
this.tasks = new Vector<>();
this.meta = new Vector<>();
}
public Role(String name) {
this();
this.name = name;
}
public void addTaskFile(RoleFile tf) {
this.tasks.add(tf);
}
public void addMeta(RoleFile tf) {
this.meta.add(tf);
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Collection<RoleFile> getTaskFiles() {
return new Vector<>(this.tasks);
}
public Collection<RoleFile> getMeta() {
return new Vector<>(this.meta);
}
}


@ -0,0 +1,54 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansible.model;
public class RoleFile {
/**
* The path to the file, starting from the role root
*/
private String path;
/**
* The name of the task file
*/
private String name;
/**
* The content of the task file
*/
private String content;
public RoleFile() {
}
public RoleFile(String name, String content) {
this();
this.setName(name);
this.setContent(content);
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getContent() {
return content;
}
public void setContent(String content) {
this.content = content;
}
public String getPath() {
return path;
}
public void setPath(String path) {
this.path = path;
}
}


@ -0,0 +1,324 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge;
import java.io.File;
import java.io.IOException;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import java.util.UUID;
import java.util.Vector;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.AnsibleWorker;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.AnsibleHost;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Inventory;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Playbook;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.AlgorithmPackage;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.CranDependencyPackage;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.CustomDependencyPackage;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.CustomRoleManager;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.OSDependencyPackage;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.StaticRoleManager;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template.TemplateManager;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.AlgorithmSet;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Host;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.comparator.HostComparator;
public class AnsibleBridge {
// private static final org.slf4j.Logger LOGGER =
// LoggerFactory.getLogger(AnsibleBridge.class);
private String dpmRoot;
// public AnsibleBridge() {
// this(System.getProperty("user.home")+File.separator+"dataminer-pool-manager");
// //this(System.getProperty("/home/gcube/dataminer-pool-manager"));
//
// }
public AnsibleBridge(String root) {
this.dpmRoot = root;
this.ensureServiceRoot();
}
private void ensureServiceRoot() {
// generate root
new File(dpmRoot).mkdirs();
// 'template' is for template roles
// this.getTemplatesDir().mkdirs();
// 'static' is for custom roles
// this.getCustomDir().mkdirs();
// 'work' is for temporary working directories
this.getWorkDir().mkdirs();
}
private File getWorkDir() {
return new File(this.dpmRoot, "work");
}
// private String getTemplatesDir() {
// String input = null;
// input =
// AnsibleBridge.class.getClassLoader().getResource("templates").getPath();
// return input;
// }
//
//
// private String getCustomDir() {
// String input = null;
// input =
// AnsibleBridge.class.getClassLoader().getResource("custom").getPath();
// return input;
// }
public AnsibleWorker createWorker(Algorithm algorithm, Cluster dataminerCluster,
boolean includeAlgorithmDependencies, String user) throws IOException {
File workerRoot = new File(this.getWorkDir(), UUID.randomUUID().toString());
AnsibleWorker worker = new AnsibleWorker(workerRoot);
List<Role> algoRoles = new Vector<>();
// add algorithms and dependencies to the worker
for (Role r : this.generateRoles(algorithm, includeAlgorithmDependencies)) {
algoRoles.add(r);
worker.addRole(r);
}
// skip dependency roles when only the algorithm itself is being installed
if (includeAlgorithmDependencies) {
for (Dependency d : algorithm.getDependencies()) {
for (Role r : this.generateRoles(d)) {
worker.addRole(r);
}
}
}
// add static roles
for (Role r : this.getStaticRoleManager().getStaticRoles()) {
worker.addRole(r);
}
// generate the inventory
Inventory inventory = new Inventory();
for (Host h : dataminerCluster.getHosts()) {
AnsibleHost ah = new AnsibleHost(h.getName());
inventory.addHost(ah, "universe");
inventory.addHost(ah, "d4science");
}
worker.setInventory(inventory);
// generate the playbook
Playbook playbook = new Playbook();
playbook.setRemote_user(user);
playbook.applyTo("universe");
for (Role r : algoRoles) {
// add only 'add' roles
if (!r.getName().endsWith("remove")) {
playbook.addRole(r.getName());
}
}
worker.setPlaybook(playbook);
return worker;
}
public void printInventoryByDomainAndSets(Collection<Cluster> clusters) {
Map<String, Set<Host>> inventory = new TreeMap<>();
for (Cluster cluster : clusters) {
for (AlgorithmSet as : cluster.getAlgorithmSets()) {
String asName = as.getName();
for (Host h : cluster.getHosts()) {
String domain = h.getDomain().getName();
String key = String.format("[%s@%s]", asName, domain);
Set<Host> hosts = inventory.get(key);
if (hosts == null) {
hosts = new TreeSet<>(new HostComparator());
inventory.put(key, hosts);
}
hosts.add(h);
}
}
}
for (String key : inventory.keySet()) {
System.out.println(key);
Collection<Host> hosts = inventory.get(key);
for (Host h : hosts) {
System.out.println(h.getName() + "." + h.getDomain().getName());
}
System.out.println();
}
}
public void printInventoryBySets(Collection<Cluster> clusters) {
Map<String, Set<Host>> inventory = new TreeMap<>();
for (Cluster cluster : clusters) {
for (AlgorithmSet as : cluster.getAlgorithmSets()) {
String asName = as.getName();
for (Host h : cluster.getHosts()) {
String key = String.format("[%s]", asName);
Set<Host> hosts = inventory.get(key);
if (hosts == null) {
hosts = new TreeSet<>(new HostComparator());
inventory.put(key, hosts);
}
hosts.add(h);
}
}
}
for (String key : inventory.keySet()) {
System.out.println(key);
Collection<Host> hosts = inventory.get(key);
for (Host h : hosts) {
System.out.println(h.getName() + "." + h.getDomain().getName());
}
System.out.println();
}
}
// public AnsibleWorker applyAlgorithmSetToCluster(AlgorithmSet as, Cluster
// cluster, /*boolean updateSVN,*/ boolean test) throws IOException,
// InterruptedException, SVNException {
//
//
// return applyAlgorithmSetToCluster
// (as,cluster,UUID.randomUUID().toString(),/*updateSVN,*/ test);
// }
// public AnsibleWorker applyAlgorithmSetToCluster(AlgorithmSet as, Cluster
// cluster,String uuid, /*boolean updateSVN,*/ boolean test) throws
// IOException, InterruptedException, SVNException {
// AnsibleWorker worker = new AnsibleWorker(new File(this.getWorkDir(),
// uuid));
//
//
// List<Role> algoRoles = new Vector<>();
//
// // add algorithms and dependencies to the worker
// for (Algorithm a : as.getAlgorithms()) {
// for (Role r : this.generateRoles(a)) {
// algoRoles.add(r);
// worker.addRole(r);
// }
// //to comment the for in case of just install algo
// if(test){
// for (Dependency d : a.getDependencies()) {
// for (Role r : this.generateRoles(d)) {
// worker.addRole(r);
// }
// }
// }
// }
//
// // add static roles
// for(Role r:this.getStaticRoleManager().getStaticRoles()) {
// worker.addRole(r);
// }
//
// // generate the inventory
// Inventory inventory = new Inventory();
// for (Host h : cluster.getHosts()) {
// AnsibleHost ah = new AnsibleHost(h.getName());
// inventory.addHost(ah, "universe");
// inventory.addHost(ah, "d4science");
// }
// worker.setInventory(inventory);
//
// // generate the playbook
// Playbook playbook = new Playbook();
// if(test){
// playbook.setRemote_user("root");}
// playbook.setRemote_user("gcube");
// playbook.applyTo("universe");
// for(Role r:algoRoles) {
// // add only 'add' roles
// if(!r.getName().endsWith("remove")) {
// playbook.addRole(r.getName());
// }
// }
//
// worker.setPlaybook(playbook);
//
// // execute and save log locally
// //PrintStream console = System.out;
// File path = new File(worker.getWorkdir() + File.separator + "logs");
// path.mkdirs();
// File n = new File(path + File.separator + worker.getWorkerId());
// FileOutputStream fos = new FileOutputStream(n);
// PrintStream ps = new PrintStream(fos);
//
// //System.setErr(console);
//
// worker.apply(as,ps,test);
// //System.setOut(console);
// //worker.apply();
// System.out.println("Log stored to to " + n.getAbsolutePath());
//
// // destroy the worker
// worker.destroy();
// return worker;
// }
private TemplateManager getTemplateManager() {
return new TemplateManager();
}
private CustomRoleManager getCustomRoleManager() {
return new CustomRoleManager();
}
private StaticRoleManager getStaticRoleManager() {
return new StaticRoleManager();
}
/**
* Generate all roles for this dependency
*
* @param dep
* Dependency
* @return Collection of Roles
*/
public Collection<Role> generateRoles(Dependency dep) {
Collection<Role> roles = new Vector<>();
if ("os".equalsIgnoreCase(dep.getType())) {
OSDependencyPackage pkg = new OSDependencyPackage(dep);
roles.addAll(pkg.getRoles(this.getTemplateManager()));
} else if ("custom".equalsIgnoreCase(dep.getType())) {
CustomDependencyPackage pkg = new CustomDependencyPackage(dep);
roles.addAll(pkg.getRoles(this.getCustomRoleManager()));
} else if ("github".equalsIgnoreCase(dep.getType()) || "cran".equalsIgnoreCase(dep.getType())) {
// 'github' and 'cran' dependencies share the same CRAN-style template
CranDependencyPackage pkg = new CranDependencyPackage(dep);
roles.addAll(pkg.getRoles(this.getTemplateManager()));
}
return roles;
}
public Collection<Role> generateRoles(Algorithm a, boolean includeAlgorithmDependencies) {
AlgorithmPackage pkg = new AlgorithmPackage(a, includeAlgorithmDependencies);
return pkg.getRoles(this.getTemplateManager());
}
}


@ -0,0 +1,119 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.PrintWriter;
import org.apache.commons.io.IOUtils;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.AnsibleHost;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.HostGroup;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Inventory;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Playbook;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.RoleFile;
public class AnsibleSerializeHelper {
public static void serialize(Inventory inventory, File inventoryFile) throws IOException {
String out = "";
for(HostGroup hg:inventory.getHostGroups()) {
out+=String.format("[%s]\n", hg.getName());
for(AnsibleHost h:hg.getHosts()) {
out+=h.getName()+"\n";
}
out+="\n";
}
out = out.trim();
serialize(out, inventoryFile);
}
public static void serialize(Playbook playbook, File playbookFile) throws IOException {
String out = "- hosts: " + playbook.getHostGroupName() + "\n";
out += " remote_user: "+playbook.getRemote_user()+"\n";
out+=" roles:\n";
for(String r:playbook.getRoles()) {
out+=" - " + r+"\n";
}
out+=" vars:\n";
out+=" os_package_state: present\n";
out = out.trim();
serialize(out, playbookFile);
}
public static void serializeRole(Role r, File dir) throws IOException {
// create root
File root = new File(dir, r.getName());
root.mkdirs();
// create tasks
if(r.getTaskFiles().size()>0) {
File tasks = new File(root, "tasks");
tasks.mkdirs();
for(RoleFile tf: r.getTaskFiles()) {
serializeTask(tf, tasks);
}
}
// create meta
if(r.getMeta().size()>0) {
File meta = new File(root, "meta");
meta.mkdirs();
for(RoleFile tf: r.getMeta()) {
serializeTask(tf, meta);
}
}
}
public static void serializeTask(RoleFile tf, File dir) throws IOException {
File f = new File(dir, tf.getName());
serialize(tf.getContent().trim(), f);
}
public static void serialize(String s, File f) throws IOException {
PrintWriter out = new PrintWriter(f);
out.println(s);
out.close();
}
public static Role deserializeRoleFromFilesystem(File roleDir) throws IOException {
Role out = new Role();
out.setName(roleDir.getName());
if(!roleDir.exists()) {
throw new FileNotFoundException();
}
try {
File tasksDir = new File(roleDir, "tasks");
if(tasksDir.exists()) {
for(File main:tasksDir.listFiles()) {
String content = IOUtils.toString(new FileInputStream(main), "UTF-8");
RoleFile tf = new RoleFile(main.getName(), content);
tf.setPath(main.getAbsolutePath().substring(roleDir.getAbsolutePath().length()+1));
out.addTaskFile(tf);
}
}
} catch(FileNotFoundException e) {
e.printStackTrace();
}
try {
File metaDir = new File(roleDir, "meta");
if(metaDir.exists()) {
for(File main:metaDir.listFiles()) {
String content = IOUtils.toString(new FileInputStream(main), "UTF-8");
RoleFile tf = new RoleFile(main.getName(), content);
tf.setPath(main.getAbsolutePath().substring(roleDir.getAbsolutePath().length()+1));
out.addMeta(tf);
}
}
} catch(FileNotFoundException e) {
e.printStackTrace();
}
return out;
}
}
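`serialize(Inventory, File)` writes the classic INI-style Ansible inventory: a `[group]` header followed by one host name per line, with groups separated by a blank line and the result trimmed. A self-contained rendering of the same format (the host names here are made up):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class InventoryFormatDemo {

    // Render groups of hosts in the INI-style layout produced by
    // AnsibleSerializeHelper.serialize(Inventory, File).
    static String render(Map<String, List<String>> groups) {
        StringBuilder out = new StringBuilder();
        for (Map.Entry<String, List<String>> group : groups.entrySet()) {
            out.append('[').append(group.getKey()).append("]\n");
            for (String host : group.getValue()) {
                out.append(host).append('\n');
            }
            out.append('\n');
        }
        return out.toString().trim();
    }

    public static void main(String[] args) {
        Map<String, List<String>> groups = new LinkedHashMap<>();
        groups.put("universe", Arrays.asList("dm1.example.org", "dm2.example.org"));
        groups.put("d4science", Arrays.asList("dm1.example.org"));
        System.out.println(render(groups));
    }
}
```

Note that `AnsibleBridge.createWorker` adds every host to both the `universe` and `d4science` groups, so a host name can legitimately appear under more than one header, as above.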


@ -0,0 +1,82 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Vector;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class AlgorithmPackage {
private Algorithm algorithm;
private boolean includeAlgorithmDependencies;
private Logger logger;
public AlgorithmPackage(Algorithm a,boolean includeAlgorithmDependencies)
{
this.logger = LoggerFactory.getLogger(AlgorithmPackage.class);
this.algorithm = a;
this.includeAlgorithmDependencies = includeAlgorithmDependencies;
}
protected Map<String, String> getDictionary(Algorithm a) {
Map<String, String> out = new HashMap<String, String>();
out.put("name", a.getName());
out.put("category", a.getCategory());
out.put("class", a.getClazz());
out.put("atype", a.getAlgorithmType());
out.put("skipjava", a.getSkipJava());
out.put("vre", ScopeProvider.instance.get());
//out.put("vre", "FAKE_VRE");
out.put("packageurl", a.getPackageURL());
out.put("description", a.getDescription());
String deps = "";
if(includeAlgorithmDependencies){
for(Dependency d:a.getDependencies()) {
deps+=String.format("- { role: %s }\n", d.getType()+"-"+d.getName().replaceAll("/", "-"));
}
}
deps = deps.trim();
out.put("dependencies", deps);
return out;
}
protected Algorithm getAlgorithm() {
return this.algorithm;
}
public Collection<Role> getRoles(TemplateManager tm) {
Collection<Role> out = new Vector<>();
for(String mode:new String[]{"add"}) { // "remove", "update"
String roleName = "gcube-algorithm-"+this.getAlgorithm().getName()+("add".equals(mode) ? "" : "-"+mode);
try {
// find template
Role template = tm.getRoleTemplate("gcube-algorithm-" + mode);
//
if(template!=null) {
Map<String, String> dictionary = this.getDictionary(this.getAlgorithm());
Role r = tm.fillRoleTemplate(template, dictionary);
r.setName(roleName);
out.add(r);
} else {
this.logger.warn("WARNING: template is null");
}
} catch (NoSuchElementException e) {
// e.printStackTrace();
this.logger.warn("WARNING: no template found for " + roleName);
}
}
return out;
}
}


@ -0,0 +1,11 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
public class CranDependencyPackage extends DependencyPackage {
public CranDependencyPackage(Dependency d) {
super(d);
}
}


@ -0,0 +1,69 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import java.util.Collection;
import java.util.NoSuchElementException;
import java.util.Vector;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
import org.slf4j.Logger;
public class CustomDependencyPackage extends DependencyPackage {
// initialize the logger (previously left null, which made the warning
// in getRoles() throw a NullPointerException)
private Logger logger = org.slf4j.LoggerFactory.getLogger(CustomDependencyPackage.class);
public CustomDependencyPackage(Dependency dependency)
{
super(dependency);
}
// private String getCustomRepositoryLocation(String ansibleRoot) {
// return ansibleRoot+"/custom";
// }
/*
public void serializeTo(String ansibleRoot) {
for(String mode:new String[]{"add", "remove", "update"}) {
// look for roles in the 'custom' repository
try {
// role name
String roleName = this.getDependency().getType()+"-"+this.getDependency().getName()+("add".equals(mode) ? "" : "-"+mode);
// look for the custom role
File src = new File(this.getCustomRepositoryLocation(ansibleRoot)+"/"+roleName);
System.out.println("** CUSTOM ** " + src);
if(src.exists()) {
// do copy
System.out.println("copying CUSTOM role");
File dest = new File(ansibleRoot+"/work/"+roleName);
FileUtils.copyDirectory(src, dest);
}
} catch(IOException e) {
e.printStackTrace();
}
}
}
*/
public Collection<Role> getRoles(CustomRoleManager crm) {
Collection<Role> out = new Vector<>();
// for(String mode:new String[]{"add", "remove", "update"}) {
for(String mode:new String[]{"add"}) { // "remove", "update"
// role name
String roleName = this.getDependency().getType()+"-"+this.getDependency().getName()+("add".equals(mode) ? "" : "-"+mode);
try {
// look for custom role
Role role = crm.getRole(roleName);
if(role!=null) {
out.add(role);
}
} catch (NoSuchElementException e) {
// e.printStackTrace();
this.logger.warn("WARNING: no custom role found for " + roleName);
}
}
return out;
}
}


@ -0,0 +1,31 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import java.io.File;
import java.io.IOException;
import java.util.NoSuchElementException;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleBridge;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleSerializeHelper;
public class CustomRoleManager {
public String getRoot() {
String input = AnsibleBridge.class.getClassLoader().getResource("custom").getPath();
return input;
}
public Role getRole(String roleName) throws NoSuchElementException {
File f = new File(this.getRoot(), roleName);
try {
return AnsibleSerializeHelper.deserializeRoleFromFilesystem(f);
} catch (IOException e) {
// e.printStackTrace();
throw new NoSuchElementException("unable to find " + roleName);
}
}
}


@ -0,0 +1,60 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Vector;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class DependencyPackage {
private Logger logger;
private Dependency dependency;
public DependencyPackage(Dependency dependency) {
this.dependency = dependency;
this.logger = LoggerFactory.getLogger(DependencyPackage.class);
}
protected Map<String, String> getDictionary(Dependency d) {
Map<String, String> out = new HashMap<String, String>();
out.put("name", d.getName());
out.put("type", d.getType());
return out;
}
protected Dependency getDependency() {
return this.dependency;
}
public Collection<Role> getRoles(TemplateManager tm) {
Collection<Role> out = new Vector<>();
for(String mode:new String[]{"add"}) { // "remove", "update"
String roleName = this.getDependency().getType()+"-"+this.getDependency().getName().replaceAll("/", "-")+("add".equals(mode) ? "" : "-"+mode);
try {
// find template
Role template = tm.getRoleTemplate(this.getDependency().getType()+"-package-"+mode);
//
if(template!=null) {
Map<String, String> dictionary = this.getDictionary(this.getDependency());
Role r = tm.fillRoleTemplate(template, dictionary);
r.setName(roleName);
out.add(r);
} else {
this.logger.warn("WARNING: template is null");
}
} catch (NoSuchElementException e) {
// e.printStackTrace();
this.logger.warn("WARNING: no template found for " + roleName);
}
}
return out;
}
}


@ -0,0 +1,11 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
public class OSDependencyPackage extends DependencyPackage {
public OSDependencyPackage(Dependency dependency) {
super(dependency);
}
}


@ -0,0 +1,38 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import java.io.File;
import java.io.IOException;
import java.util.Collection;
import java.util.Vector;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleBridge;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleSerializeHelper;
public class StaticRoleManager {
public StaticRoleManager() {
}
public String getRoot() {
String input = AnsibleBridge.class.getClassLoader().getResource("static").getPath();
return input;
}
public Collection<Role> getStaticRoles() {
Collection<Role> out = new Vector<>();
for(File f: new File(this.getRoot()).listFiles()) {
try {
out.add(AnsibleSerializeHelper.deserializeRoleFromFilesystem(f));
} catch(IOException e) {
e.printStackTrace();
}
}
return out;
}
}


@ -0,0 +1,85 @@
package org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.template;
import java.io.File;
import java.io.IOException;
import java.util.Map;
import java.util.NoSuchElementException;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.Role;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.model.RoleFile;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleBridge;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleSerializeHelper;
import org.stringtemplate.v4.ST;
public class TemplateManager {
public TemplateManager() {
}
public String getTemplateRoot() {
return AnsibleBridge.class.getClassLoader().getResource("templates").getPath();
}
// private String readTemplate(String templateName) throws IOException {
// File templateFile = new File(this.getTemplateRoot(), templateName + ".yaml");
// System.out.println("looking for file " + templateFile.getName());
// String out = IOUtils.toString(new FileInputStream(templateFile), "UTF-8");
// return out;
// }
// public String getTemplate(String templateName) throws NoSuchElementException {
// String template = null;
// try {
// template = this.readTemplate(templateName);
// } catch (IOException e) {
// throw new NoSuchElementException();
// }
// return template;
// }
public Role fillRoleTemplate(Role template, Map<String, String> dictionary) {
Role out = new Role();
out.setName(template.getName());
for(RoleFile tf:template.getTaskFiles()) {
out.addTaskFile(this.fillTaskTemplate(tf, dictionary));
}
for(RoleFile tf:template.getMeta()) {
out.addMeta(this.fillTaskTemplate(tf, dictionary));
}
return out;
}
private RoleFile fillTaskTemplate(RoleFile template, Map<String, String> dictionary) {
RoleFile out = new RoleFile();
out.setName(template.getName());
out.setContent(this.fillTemplate(template.getContent(), dictionary));
return out;
}
private String fillTemplate(String template, Map<String, String> dictionary) {
if (template == null) {
return null;
}
ST t = new ST(template);
for (Map.Entry<String, String> entry : dictionary.entrySet()) {
t.add(entry.getKey(), entry.getValue());
}
return t.render();
}
public Role getRoleTemplate(String roleName) throws NoSuchElementException {
File f = new File(this.getTemplateRoot(), roleName);
try {
return AnsibleSerializeHelper.deserializeRoleFromFilesystem(f);
} catch (IOException e) {
// e.printStackTrace();
throw new NoSuchElementException("unable to find " + roleName);
}
}
}


@ -0,0 +1,167 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.LinkedList;
import java.util.List;
import org.gcube.common.authorization.client.exceptions.ObjectNotFound;
import org.gcube.common.authorization.library.provider.SecurityTokenProvider;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Host;
import org.gcube.dataanalysis.dataminer.poolmanager.util.CheckPermission;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import au.com.bytecode.opencsv.CSVReader;
public class HAProxy {
private Logger logger;
private CSVReader reader;
public HAProxy() {
this.logger = LoggerFactory.getLogger(HAProxy.class);
}
// public Cluster getClusterByHProxy() throws IOException {
// Cluster cl = new Cluster();
// String HProxy = ISClient.getHProxy();
// URL stockURL = new URL("http://data.d4science.org/Yk4zSFF6V3JOSytNd3JkRDlnRFpDUUR5TnRJZEw2QjRHbWJQNStIS0N6Yz0");
// BufferedReader in = new BufferedReader(new InputStreamReader(stockURL.openStream()));
// reader = new CSVReader(in);
// String[] nextLine;
// while ((nextLine = reader.readNext()) != null) {
// // rules to add
// if (HProxy.contains(nextLine[0])) {
// cl.setName(nextLine[0]);
// }
// }
// return cl;
//
// }
public Cluster MapCluster() throws IOException {
Cluster cl = new Cluster();
String HProxy = ISClient.getHProxy();
// Prod HAProxy
if (HProxy.equals("dataminer-cluster1.d4science.org")) {
cl.setName("dataminer_cluster1");
} else if (HProxy.equals("dataminer-bigdata.d4science.org")) {
cl.setName("bigdata");
} else if (HProxy.equals("dataminer-cloud1.d4science.org")) {
cl.setName("dataminer_cloud1");
} else if (HProxy.equals("dataminer-prototypes.d4science.org")) {
cl.setName("prototypes");
} else if (HProxy.equals("dataminer.d4science.org")) {
cl.setName("gcubeapps");
} else if (HProxy.equals("dataminer-genericworkers.d4science.org")) {
cl.setName("genericworkers");
} else if (HProxy.equals("dataminer-genericworkers-proto.d4science.org")) {
cl.setName("genericworkers_proto");
}
// Dev HAProxy
else if (HProxy.equals("dataminer-d-workers.d4science.org") || HProxy.equals("dataminer-d-d4s.d4science.org")) {
cl.setName("devnext_backend");
}
// PreProd HAProxy
// else if (HProxy.equals("dataminer1-pre.d4science.org")) {
// cl.setName("dataminer1-pre.d4science.org");
// }
return cl;
}
public List<Host> listDataMinersByCluster(String targetVREToken,String targetVRE) throws IOException {
SecurityTokenProvider.instance.set(targetVREToken);
ScopeProvider.instance.set(targetVRE);
// next op to use when Cluster info available in the IS
// Cluster cluster = this.getClusterByHProxy();
Cluster cluster = this.MapCluster();
List<Host> out = new LinkedList<Host>();
Host a = new Host();
//no proxy dataminer (preprod)
if (cluster.getName() == null){
a.setName(ISClient.getHProxy());
out.add(a);
}
// if preprod, just one dm available
// if (cluster.getName().equals("dataminer1-pre.d4science.org")) {
// a.setName("dataminer1-pre.d4science.org");
// out.add(a);
//}
else {
// prod
//URL stockURL = new
//URL("http://data.d4science.org/Yk4zSFF6V3JOSytNd3JkRDlnRFpDUUR5TnRJZEw2QjRHbWJQNStIS0N6Yz0");
URL stockURL = new URL("http://"+ ISClient.getHProxy() +":8880/;csv");
//URL stockURL = new URL("http://data.d4science.org/c29KTUluTkZnRlB0WXE5NVNaZnRoR0dtYThUSmNTVlhHbWJQNStIS0N6Yz0");
//System.out.println(stockURL);
// dev
//URL stockURL = new
//URL("http://data.d4science.org/c29KTUluTkZnRlB0WXE5NVNaZnRoR0dtYThUSmNTVlhHbWJQNStIS0N6Yz0");
BufferedReader in = new BufferedReader(new InputStreamReader(stockURL.openStream()));
reader = new CSVReader(in, ',');
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
if (nextLine[1].equals("BACKEND") || (nextLine[1].equals("FRONTEND"))) {
continue;
}
if (nextLine[0].equals(cluster.getName())) {
Host b = new Host();
b.setName(nextLine[1]);
out.add(b);
this.logger.info(b.getFullyQualifiedName());
}
}
}
this.logger.info(out.toString());
return out;
}
public static void main(String[] args) throws ObjectNotFound, Exception {
HAProxy a = new HAProxy();
//ScopeProvider.instance.set("/gcube/devNext/NextNext");
//ScopeProvider.instance.set("/d4science.research-infrastructures.eu/gCubeApps/RPrototypingLab");
SecurityTokenProvider.instance.set("3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462");
CheckPermission.apply("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548", "/gcube/devNext/NextNext");
//ScopeProvider.instance.set("/d4science.research-infrastructures.eu/gCubeApps/RPrototypingLab");
// System.out.println(a.getHProxy());
// System.out.println(a.MapCluster());
//System.out.println(a.listDataMinersByCluster("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548","/gcube/devNext/NextNext"));
// System.out.println(a.listDataMinersByCluster());
// List<Dependency> list = new LinkedList<Dependency>();
// Dependency aa = new Dependency();
// aa.setName("testnunzio");
// aa.setType("cran:");
// list.add(aa);
// a.checkSVNdep();
// System.out.println(a.getDataminer("dataminer1-devnext.d4science.org").getDomain());
// System.out.println(a.listDataminersInVRE());
}
}


@ -0,0 +1,249 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients;
import static org.gcube.resources.discovery.icclient.ICFactory.clientFor;
import static org.gcube.resources.discovery.icclient.ICFactory.queryFor;
import java.io.IOException;
import java.io.StringWriter;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.Vector;
import org.gcube.common.authorization.library.provider.SecurityTokenProvider;
import org.gcube.common.resources.gcore.GenericResource;
import org.gcube.common.resources.gcore.Resources;
import org.gcube.common.resources.gcore.ServiceEndpoint;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Host;
import org.gcube.informationsystem.publisher.AdvancedScopedPublisher;
import org.gcube.informationsystem.publisher.RegistryPublisherFactory;
import org.gcube.informationsystem.publisher.ScopedPublisher;
import org.gcube.informationsystem.publisher.exception.RegistryNotFoundException;
import org.gcube.resources.discovery.client.api.DiscoveryClient;
import org.gcube.resources.discovery.client.queries.api.SimpleQuery;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNException;
public class ISClient {
private Logger logger;
public ISClient() {
this.logger = LoggerFactory.getLogger(ISClient.class);
}
public Host getDataminer(String hostname) {
Host h = new Host();
boolean remote2 = true;
if (!remote2) {
h.setName("dataminer1-devnext.d4science.org");
return h;
} else {
//SimpleQuery query = queryFor(ServiceEndpoint.class);
//query.addCondition("$resource/Profile/RunTime/HostedOn/text() eq '" + hostname + "'");
//DiscoveryClient<ServiceEndpoint> client = clientFor(ServiceEndpoint.class);
//List<ServiceEndpoint> resources = client.submit(query);
//ServiceEndpoint a = resources.get(0);
//h.setName(a.profile().runtime().hostedOn());
h.setName(hostname);
}
return h;
}
// return the HProxy hostname in the VRE
public static String getHProxy() {
SimpleQuery query = queryFor(ServiceEndpoint.class);
query.addCondition("$resource/Profile/Name/text() eq 'DataMiner'");
DiscoveryClient<ServiceEndpoint> client = clientFor(ServiceEndpoint.class);
List<ServiceEndpoint> resources = client.submit(query);
return resources.get(0).profile().runtime().hostedOn();
}
public Collection<Host> listDataminersInVRE() {
boolean remote = false;
if (!remote) {
Collection<Host> out = new Vector<>();
Host h = new Host();
// h.setName("bb-dataminer.res.eng.it");
// h.setName("vm101.ui.savba.sk");
h.setName("dataminer1-devnext.d4science.org");
out.add(h);
return out;
} else {
SimpleQuery query = queryFor(ServiceEndpoint.class);
// old version
// query.addCondition("$resource/Profile/Category/text() eq
// 'DataAnalysis'")
// .addCondition("$resource/Profile/Name/text() eq 'DataMiner'");
query.addCondition("$resource/Profile/Platform/Name/text() eq 'DataMiner'");
DiscoveryClient<ServiceEndpoint> client = clientFor(ServiceEndpoint.class);
List<ServiceEndpoint> resources = client.submit(query);
Collection<Host> out = new Vector<>();
for (ServiceEndpoint r : resources) {
Host h = new Host();
h.setName(r.profile().runtime().hostedOn());
out.add(h);
}
return out;
}
}
public void updateAlg(Algorithm algo) {
ScopedPublisher scopedPublisher = RegistryPublisherFactory.scopedPublisher();
SimpleQuery query = queryFor(GenericResource.class);
query.addCondition("$resource/Profile/Name/text() eq '" + algo.getName() + "'").setResult("$resource");
DiscoveryClient<GenericResource> client = clientFor(GenericResource.class);
List<GenericResource> ds = client.submit(query);
if (ds.isEmpty()) {
return;
}
GenericResource a = ds.get(0);
a.profile().newBody(this.getAlgoBody(algo));
try {
scopedPublisher.update(a);
} catch (RegistryNotFoundException e) {
e.printStackTrace();
}
}
private String getAlgoBody(Algorithm algo) {
return "<category>" + algo.getCategory() + "</category>" + "\n" + "<clazz>" + algo.getClazz() + "</clazz>"
+ "\n" + "<algorithmType>" + algo.getAlgorithmType() + "</algorithmType>" + "\n" + "<skipJava>"
+ algo.getSkipJava() + "</skipJava>" + "\n" + "<packageURL>" + algo.getPackageURL() + "</packageURL>"
+ "\n" + "<dependencies>" + algo.getDependencies() + "</dependencies>";
}
// public void addAlgToIs(Algorithm algo) {
// GenericResource a = new GenericResource();
// a.newProfile().name(algo.getName()).type("StatisticalManagerAlgorithm").description(algo.getDescription());
// a.profile().newBody(this.getAlgoBody(algo));
// try {
// publishScopedResource(a, Arrays.asList(new String[] { ScopeProvider.instance.get() }));
// } catch (Exception e) {
// e.printStackTrace();
// }
// }
public void addAlgToIs(Algorithm algo, String token) {
GenericResource a = new GenericResource();
a.newProfile().name(algo.getName()).type("StatisticalManagerAlgorithm").description(algo.getDescription());
a.profile().newBody(this.getAlgoBody(algo));
try {
SecurityTokenProvider.instance.set(token);
publishScopedResource(a, Arrays.asList(new String[] { ScopeProvider.instance.get() }));
} catch (Exception e) {
e.printStackTrace();
}
}
public void unPublishScopedResource(GenericResource resource) throws RegistryNotFoundException, Exception {
ScopedPublisher scopedPublisher = RegistryPublisherFactory.scopedPublisher();
AdvancedScopedPublisher advancedScopedPublisher = new AdvancedScopedPublisher(scopedPublisher);
String id = resource.id();
this.logger.info("Trying to remove {} with ID {} from {}", resource.getClass().getSimpleName(), id,
ScopeProvider.instance.get());
// scopedPublisher.remove(resource, scopes);
advancedScopedPublisher.forceRemove(resource);
this.logger.info("{} with ID {} removed successfully", resource.getClass().getSimpleName(), id);
}
public void publishScopedResource(GenericResource a, List<String> scopes)
throws RegistryNotFoundException, Exception {
StringWriter stringWriter = new StringWriter();
Resources.marshal(a, stringWriter);
ScopedPublisher scopedPublisher = RegistryPublisherFactory.scopedPublisher();
try {
this.logger.debug(scopes.toString());
this.logger.debug(stringWriter.toString());
scopedPublisher.create(a, scopes);
} catch (RegistryNotFoundException e) {
this.logger.error("Registry not found",e);
throw e;
}
}
// public Set<Algorithm> getAlgoFromIs() {
// // TODO Auto-generated method stub
//
// Set<Algorithm> out = new HashSet<Algorithm>();
// SimpleQuery query = queryFor(GenericResource.class);
// query.addCondition("$resource/Profile/SecondaryType/text() eq 'StatisticalManagerAlgorithm'")
// .setResult("$resource");
// DiscoveryClient<GenericResource> client = clientFor(GenericResource.class);
// List<GenericResource> ds = client.submit(query);
// for (GenericResource a : ds) {
// out.add(this.convertAlgo(a));
// }
// return out;
// }
// private Algorithm convertAlgo(GenericResource a) {
// Algorithm out = new Algorithm();
//
// // out.setId(a.profile().body().getElementsByTagName("id").item(0).getTextContent());
// out.setAlgorithmType(a.profile().body().getElementsByTagName("algorithmType").item(0).getTextContent());
// out.setCategory(a.profile().body().getElementsByTagName("category").item(0).getTextContent());
// out.setClazz(a.profile().body().getElementsByTagName("clazz").item(0).getTextContent());
// out.setName(a.profile().name());
// out.setPackageURL(a.profile().body().getElementsByTagName("packageURL").item(0).getTextContent());
// out.setSkipJava(a.profile().body().getElementsByTagName("skipJava").item(0).getTextContent());
// out.setDescription(a.profile().description());
//
// Set<org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency> deps = new HashSet<org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency>();
// for (int i = 0; i < a.profile().body().getElementsByTagName("dependencies").getLength(); i++) {
// org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency d1 = new org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency();
// d1.setName(a.profile().body().getElementsByTagName("dependencies").item(i).getTextContent());
// deps.add(d1);
// }
// out.setDependencies(deps);
// return out;
// }
public static void main(String[] args) throws IOException, SVNException {
ISClient a = new ISClient();
ScopeProvider.instance.set("/gcube/devNext/NextNext");
// System.out.println(a.getHProxy());
// System.out.println(a.MapCluster());
// System.out.println(a.listDataMinersByCluster());
// List<Dependency> list = new LinkedList<Dependency>();
// Dependency aa = new Dependency();
// aa.setName("testnunzio");
// aa.setType("cran:");
// list.add(aa);
// a.checkSVNdep();
//System.out.println(a.getDataminer("dataminer1-d-d4s.d4science.org").getDomain());
// System.out.println(a.listDataminersInVRE());
}
}


@ -0,0 +1,18 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients;
import java.util.HashMap;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.ClientConfigurationCache;
public class ScopedCacheMap extends HashMap<String, ClientConfigurationCache> {
private static final long serialVersionUID = 1L;
}


@ -0,0 +1,130 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration;
import static org.gcube.resources.discovery.icclient.ICFactory.clientFor;
import static org.gcube.resources.discovery.icclient.ICFactory.queryFor;
import java.util.Date;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.gcube.common.resources.gcore.GenericResource;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.ConfigurationImpl.CONFIGURATIONS;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations.AbstractConfiguration;
import org.gcube.resources.discovery.client.api.DiscoveryClient;
import org.gcube.resources.discovery.client.queries.api.SimpleQuery;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ClientConfigurationCache
{
private Logger logger;
private SVNRepository svnRepository;
private Map<String, AbstractConfiguration> configurations;
private long svnRepositoryTimeout;
private Map<String, Long> configurationsTimeouts;
private final long duration = 120000; //2 minutes
ClientConfigurationCache ()
{
this.logger = LoggerFactory.getLogger(ClientConfigurationCache.class);
this.svnRepository = null;
this.configurations = new HashMap<>();
this.svnRepositoryTimeout = 0;
this.configurationsTimeouts = new HashMap<>();
}
public AbstractConfiguration getConfiguration (CONFIGURATIONS configuration)
{
Long time = this.configurationsTimeouts.get(configuration.toString());
long currentTime = new Date().getTime();
if (time == null || currentTime > time+this.duration)
{
this.configurations.put(configuration.toString(), getConfiguration (configuration.getType()));
this.configurationsTimeouts.put(configuration.toString(), currentTime);
}
return this.configurations.get(configuration.toString());
}
public SVNRepository getSVNRepository ()
{
long currentTime = new Date().getTime();
if (this.svnRepositoryTimeout == 0 || currentTime > this.svnRepositoryTimeout+this.duration)
{
this.svnRepository = queryForRepository();
this.svnRepositoryTimeout = currentTime;
}
return this.svnRepository;
}
private SVNRepository queryForRepository()
{
SimpleQuery query = queryFor(GenericResource.class);
query.addCondition("$resource/Profile/SecondaryType/text() eq 'DMPMConfigurator'").setResult("$resource");
DiscoveryClient<GenericResource> client = clientFor(GenericResource.class);
List<GenericResource> ds = client.submit(query);
Iterator<GenericResource> resourcesIterator = ds.iterator();
SVNRepository response = null;
while (resourcesIterator.hasNext() && response == null)
{
GenericResource resource = resourcesIterator.next();
String repositoryURL = resource.profile().body().getElementsByTagName(SVNRepository.REPOSITORY_URL).item(0).getTextContent();
if (repositoryURL != null)
{
String repositoryPath = resource.profile().body().getElementsByTagName(SVNRepository.REPOSITORY_PATH).item(0).getTextContent();
String repositoryUsername = null;
String repositoryPassword = null;
try
{
repositoryUsername = resource.profile().body().getElementsByTagName(SVNRepository.REPOSITORY_USERNAME).item(0).getTextContent();
repositoryPassword = resource.profile().body().getElementsByTagName(SVNRepository.REPOSITORY_PASSWORD).item(0).getTextContent();
if (repositoryUsername != null && repositoryUsername.trim().isEmpty()) repositoryUsername = null;
if (repositoryPassword != null && repositoryPassword.trim().isEmpty()) repositoryPassword = null;
this.logger.debug("Repository username "+repositoryUsername);
this.logger.debug("Repository password "+(repositoryPassword == null ? "null" : "***"));
} catch (Exception e)
{
this.logger.debug("SVN Username and password not present");
}
this.logger.debug("SVN Repository URL: "+repositoryURL);
this.logger.debug("SVN Repository path: "+repositoryPath);
response = new SVNRepository(repositoryURL, repositoryPath,repositoryUsername, repositoryPassword);
}
}
return response;
}
private AbstractConfiguration getConfiguration (AbstractConfiguration type)
{
SimpleQuery query = queryFor(GenericResource.class);
query.addCondition("$resource/Profile/SecondaryType/text() eq 'DMPMConfigurator'").setResult(type.getXMLModel());
DiscoveryClient<? extends AbstractConfiguration> client = clientFor(type.getClass());
List<? extends AbstractConfiguration> configurations = client.submit(query);
if (configurations != null && !configurations.isEmpty()) return configurations.get(0);
else return null;
}
}


@ -0,0 +1,38 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration;
public interface Configuration {
public String getHost();
public String getSVNAlgorithmsList();
public String getRepository();
public String getSVNLinuxCompiledDepsList();
public String getSVNPreInstalledDepsList();
public String getSVNRBDepsList();
public String getSVNCRANDepsList();
public String getSVNJavaDepsList();
public String getSVNKWDepsList();
public String getSVNKW4_1DepsList();
public String getSVNOctaveDepsList();
public String getSVNPythonDepsList();
public String getSVNPython3_6DepsList();
public String getSVNWCDepsList();
public SVNRepository getSVNRepository();
public String getGhostAlgoDirectory();
}


@ -0,0 +1,158 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations.AbstractConfiguration;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations.Prod;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations.Stage;
public class ConfigurationImpl implements Configuration {
enum CONFIGURATIONS {
STAGE (new Stage ()),
PROD (new Prod ());
private AbstractConfiguration type;
private CONFIGURATIONS(AbstractConfiguration type) {
this.type = type;
}
@Override
public String toString() {
return this.type.getType();
}
public AbstractConfiguration getType ()
{
return this.type;
}
}
// enum REPOSITORIES {
// REPO ("svn.repository"),
// MAIN_ALGO ("svn.algo.main.repo");
//
// private String type;
//
// private REPOSITORIES(String type) {
// this.type = type;
// }
//
// @Override
// public String toString() {
// return this.type;
// }
// }
private CONFIGURATIONS type;
private ClientConfigurationCache cache;
public ConfigurationImpl(CONFIGURATIONS type,ClientConfigurationCache cache) {
this.type = type;
this.cache = cache;
}
@Override
public String getHost() {
return this.cache.getConfiguration(this.type).getHost ();
}
@Override
public String getSVNAlgorithmsList() {
return this.cache.getConfiguration(this.type).getAlgorithmsList();
}
@Override
public String getRepository() {
return this.cache.getConfiguration(this.type).getSoftwareRepo();
}
@Override
public String getSVNLinuxCompiledDepsList()
{
return this.cache.getConfiguration(this.type).getDepsLinuxCompiled();
}
@Override
public String getSVNPreInstalledDepsList() {
return this.cache.getConfiguration(this.type).getDepsPreInstalled();
}
@Override
public String getSVNRBDepsList()
{
return this.cache.getConfiguration(this.type).getDepsRBlackbox();
}
@Override
public String getSVNCRANDepsList() {
return this.cache.getConfiguration(this.type).getDepsR();
}
@Override
public String getSVNJavaDepsList() {
return this.cache.getConfiguration(this.type).getDepsJava();
}
@Override
public String getSVNKWDepsList() {
return this.cache.getConfiguration(this.type).getDepsKnimeWorkflow();
}
@Override
public String getSVNKW4_1DepsList() {
return this.cache.getConfiguration(this.type).getDepsKnimeWorkflow4_1();
}
@Override
public String getSVNOctaveDepsList() {
return this.cache.getConfiguration(this.type).getDepsOctave();
}
@Override
public String getSVNPythonDepsList() {
return this.cache.getConfiguration(this.type).getDepsPython();
}
@Override
public String getSVNPython3_6DepsList() {
return this.cache.getConfiguration(this.type).getDepsPython3_6();
}
@Override
public String getSVNWCDepsList() {
return this.cache.getConfiguration(this.type).getDepsWindowsCompiled();
}
@Override
public SVNRepository getSVNRepository()
{
return this.cache.getSVNRepository();
}
@Override
public String getGhostAlgoDirectory() {
return this.cache.getConfiguration(this.type).getGhostRepo();
}
}


@ -0,0 +1,102 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Properties;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.ScopedCacheMap;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.ConfigurationImpl.CONFIGURATIONS;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNException;
public class DMPMClientConfiguratorManager {
private final Logger logger;
private Properties defaultAdmins;
private static DMPMClientConfiguratorManager instance;
private ScopedCacheMap cacheMap;
private DMPMClientConfiguratorManager() {
this.cacheMap = new ScopedCacheMap();
this.logger = LoggerFactory.getLogger(DMPMClientConfiguratorManager.class);
this.defaultAdmins = new Properties();
try {
this.defaultAdmins.load(this.getClass().getResourceAsStream("/default.admins"));
this.logger.debug("Default users successfully loaded");
} catch (Exception e) {
this.logger.error("Unable to get default users", e);
}
}
private ClientConfigurationCache getCurrentCache() {
String currentScope = ScopeProvider.instance.get();
this.logger.debug("Current scope = " + currentScope);
this.logger.debug("Getting current configuration cache");
ClientConfigurationCache cache = this.cacheMap.get(currentScope);
if (cache == null) {
this.logger.debug("Cache not created yet, creating...");
cache = new ClientConfigurationCache();
this.cacheMap.put(currentScope, cache);
}
return cache;
}
public static synchronized DMPMClientConfiguratorManager getInstance() {
if (instance == null)
instance = new DMPMClientConfiguratorManager();
return instance;
}
public Configuration getProductionConfiguration() {
return new ConfigurationImpl(CONFIGURATIONS.PROD, getCurrentCache());
}
public Configuration getStagingConfiguration() {
return new ConfigurationImpl(CONFIGURATIONS.STAGE, getCurrentCache());
}
public List<String> getDefaultAdmins() {
List<String> admins = new ArrayList<String>();
if (defaultAdmins == null || defaultAdmins.isEmpty()) {
admins.add("statistical.manager");
} else {
for (String key : this.defaultAdmins.stringPropertyNames()) {
admins.add(this.defaultAdmins.getProperty(key));
}
}
this.logger.debug("Default admins list: " + admins);
return admins;
}
public static void main(String[] args) throws IOException, SVNException {
DMPMClientConfiguratorManager a = new DMPMClientConfiguratorManager();
ScopeProvider.instance.set("/gcube/devNext/NextNext");
// SecurityTokenProvider.instance.set("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
System.out.println("RESULT 1 " + a.getStagingConfiguration().getSVNCRANDepsList());
System.out.println("RESULT 2 " + a.getProductionConfiguration().getRepository());
System.out.println("RESULT 3 " + a.getStagingConfiguration().getSVNRepository().getPath());
// System.out.println(a.getRepo());
// System.out.println(a.getAlgoRepo());
// System.out.println(a.getSVNRepo());
}
}


@ -0,0 +1,53 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration;
public class SVNRepository
{
static final String REPOSITORY_URL = "svn.repository",
REPOSITORY_PATH = "svn.algo.main.repo",
REPOSITORY_USERNAME = "svn.repository.username",
REPOSITORY_PASSWORD = "svn.repository.password";
private String baseUrl,
path,
username,
password;
SVNRepository(String baseUrl, String path, String username, String password) {
this.baseUrl = baseUrl;
this.path = path;
this.username = username;
this.password = password;
}
SVNRepository(String baseUrl, String path) {
this (baseUrl, path, null, null);
}
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getBaseUrl() {
return baseUrl;
}
public String getPath() {
return path;
}
}


@ -0,0 +1,177 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations;
import javax.xml.bind.annotation.XmlElement;
public abstract class AbstractConfiguration
{
private String host;
private String algorithmsList;
private String softwareRepo;
private String ghostRepo;
private String depsLinuxCompiled;
private String depsPreInstalled;
private String depsRBlackbox;
private String depsR;
private String depsJava;
private String depsKnimeWorkflow;
private String depsKnimeWorkflow4_1;
private String depsOctave;
private String depsPython;
private String depsPython3_6;
private String depsWindowsCompiled;
@XmlElement (name="host")
public String getHost() {
return host;
}
public void setHost(String host) {
this.host = host;
}
@XmlElement (name="algorithms-list")
public String getAlgorithmsList() {
return algorithmsList;
}
public void setAlgorithmsList(String algorithmsList) {
this.algorithmsList = algorithmsList;
}
@XmlElement (name="software-repo")
public String getSoftwareRepo() {
return softwareRepo;
}
public void setSoftwareRepo(String softwareRepo) {
this.softwareRepo = softwareRepo;
}
@XmlElement (name="ghost-repo")
public String getGhostRepo() {
return ghostRepo;
}
public void setGhostRepo(String ghostRepo) {
this.ghostRepo = ghostRepo;
}
@XmlElement (name="deps-linux-compiled")
public String getDepsLinuxCompiled() {
return depsLinuxCompiled;
}
public void setDepsLinuxCompiled(String depsLinuxCompiled) {
this.depsLinuxCompiled = depsLinuxCompiled;
}
@XmlElement (name="deps-pre-installed")
public String getDepsPreInstalled() {
return depsPreInstalled;
}
public void setDepsPreInstalled(String depsPreInstalled) {
this.depsPreInstalled = depsPreInstalled;
}
@XmlElement (name="deps-r-blackbox")
public String getDepsRBlackbox() {
return depsRBlackbox;
}
public void setDepsRBlackbox(String depsRBlackbox) {
this.depsRBlackbox = depsRBlackbox;
}
@XmlElement (name="deps-r")
public String getDepsR() {
return depsR;
}
public void setDepsR(String depsR) {
this.depsR = depsR;
}
@XmlElement (name="deps-java")
public String getDepsJava() {
return depsJava;
}
public void setDepsJava(String depsJava) {
this.depsJava = depsJava;
}
@XmlElement (name="deps-knime-workflow")
public String getDepsKnimeWorkflow() {
return depsKnimeWorkflow;
}
public void setDepsKnimeWorkflow(String depsKnimeWorkflow) {
this.depsKnimeWorkflow = depsKnimeWorkflow;
}
@XmlElement (name="deps-knime-workflow4_1")
public String getDepsKnimeWorkflow4_1() {
return depsKnimeWorkflow4_1;
}
public void setDepsKnimeWorkflow4_1(String depsKnimeWorkflow4_1) {
this.depsKnimeWorkflow4_1 = depsKnimeWorkflow4_1;
}
@XmlElement (name="deps-octave")
public String getDepsOctave() {
return depsOctave;
}
public void setDepsOctave(String depsOctave) {
this.depsOctave = depsOctave;
}
@XmlElement (name="deps-python")
public String getDepsPython() {
return depsPython;
}
public void setDepsPython(String depsPython) {
this.depsPython = depsPython;
}
@XmlElement (name="deps-python3_6")
public String getDepsPython3_6() {
return depsPython3_6;
}
public void setDepsPython3_6(String depsPython3_6) {
this.depsPython3_6 = depsPython3_6;
}
@XmlElement (name="deps-windows-compiled")
public String getDepsWindowsCompiled() {
return depsWindowsCompiled;
}
public void setDepsWindowsCompiled(String depsWindowsCompiled) {
this.depsWindowsCompiled = depsWindowsCompiled;
}
protected String getXML (String type)
{
return "<"+type+"><host>{$resource/Profile/Body/"+type+"/ghost/text()}</host>"+
"<algorithms-list>{$resource/Profile/Body/"+type+"/algorithms-list/text()}</algorithms-list>"+
"<software-repo>{$resource/Profile/Body/"+type+"/software.repo/text()}</software-repo>"+
"<ghost-repo>{$resource/Profile/Body/"+type+"/algo.ghost.repo/text()}</ghost-repo>"+
"<deps-linux-compiled>{$resource/Profile/Body/"+type+"/deps-linux-compiled/text()}</deps-linux-compiled>"+
"<deps-pre-installed>{$resource/Profile/Body/"+type+"/deps-pre-installed/text()}</deps-pre-installed>"+
"<deps-r-blackbox>{$resource/Profile/Body/"+type+"/deps-r-blackbox/text()}</deps-r-blackbox>"+
"<deps-r>{$resource/Profile/Body/"+type+"/deps-r/text()}</deps-r>"+
"<deps-java>{$resource/Profile/Body/"+type+"/deps-java/text()}</deps-java>"+
"<deps-knime-workflow>{$resource/Profile/Body/"+type+"/deps-knime-workflow/text()}</deps-knime-workflow>"+
"<deps-knime-workflow4_1>{$resource/Profile/Body/"+type+"/deps-knime-workflow4_1/text()}</deps-knime-workflow4_1>"+
"<deps-octave>{$resource/Profile/Body/"+type+"/deps-octave/text()}</deps-octave>"+
"<deps-python>{$resource/Profile/Body/"+type+"/deps-python/text()}</deps-python>"+
"<deps-python3_6>{$resource/Profile/Body/"+type+"/deps-python3_6/text()}</deps-python3_6>"+
"<deps-windows-compiled>{$resource/Profile/Body/"+type+"/deps-windows-compiled/text()}</deps-windows-compiled></"+type+">";
}
abstract public String getXMLModel ();
abstract public String getType ();
}
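The getXML method above assembles one XQuery projection per profile field and wraps them in an element named after the configuration type. A minimal standalone sketch of the same concatenation pattern (TemplateSketch and buildTemplate are illustrative names, not part of the project, and only a subset of the fields is reproduced):

```java
// Illustrative sketch of the XQuery return-template assembly performed by
// AbstractConfiguration.getXML. Only three fields are shown; the class and
// method names here are hypothetical.
public class TemplateSketch {

    static String buildTemplate(String type) {
        return "<" + type + "><host>{$resource/Profile/Body/" + type + "/ghost/text()}</host>"
                + "<software-repo>{$resource/Profile/Body/" + type + "/software.repo/text()}</software-repo>"
                + "<deps-r>{$resource/Profile/Body/" + type + "/deps-r/text()}</deps-r>"
                + "</" + type + ">";
    }

    public static void main(String[] args) {
        // A "prod" configuration yields a template rooted at <prod>
        System.out.println(buildTemplate("prod"));
    }
}
```

Each concrete configuration (Prod, Stage) only supplies its type string; the template shape itself is shared by all of them.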

@@ -0,0 +1,23 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations;
import javax.xml.bind.annotation.XmlRootElement;
@XmlRootElement(name="prod")
public class Prod extends AbstractConfiguration {
private final String TYPE = "prod";
@Override
public String getXMLModel ()
{
return super.getXML(TYPE);
}
@Override
public String getType() {
return TYPE;
}
}

@@ -0,0 +1,23 @@
package org.gcube.dataanalysis.dataminer.poolmanager.clients.configurations;
import javax.xml.bind.annotation.XmlRootElement;
@XmlRootElement(name="stage")
public class Stage extends AbstractConfiguration {
private final String TYPE = "stage";
@Override
public String getXMLModel ()
{
return super.getXML(TYPE);
}
@Override
public String getType() {
return TYPE;
}
}

@@ -0,0 +1,33 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
public class Action {
private String name;
private String description;
private String script;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public String getScript() {
return script;
}
public void setScript(String script) {
this.script = script;
}
}

@@ -0,0 +1,22 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
public class AlgoResource {
protected String id;
public AlgoResource() {
}
public AlgoResource(String id) {
this.id = id;
}
public String getId() {
return this.id;
}
public void setId(String id) {
this.id = id;
}
}

@@ -0,0 +1,188 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;
import java.util.Vector;
public class Algorithm {
private String username;
private String fullname;
private String email;
private String language;
private String name;
private String description;
private String clazz;
private String category;
private String algorithmType;
private String skipJava;
private String packageURL;
private Collection<Action> actions;
private Collection<Dependency> dependencies;
public Algorithm() {
this.actions = new Vector<>();
this.dependencies = new Vector<>();
// Dependency p = new Dependency();
//init with default values
this.skipJava = "N";
this.algorithmType = "transducerers";
}
public void addDependency(Dependency dep) {
this.dependencies.add(dep);
}
public void addAction(Action action) {
this.actions.add(action);
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description.replace(": ", " ");
}
public String getCategory() {
return category;
}
public void setCategory(String category) {
this.category = category;
}
public Collection<Action> getActions() {
return actions;
}
public Collection<Dependency> getDependencies() {
return dependencies;
}
public Collection<Dependency> getOSDependencies(){
Set<Dependency> deps = new HashSet<>();
for(Dependency d: this.getDependencies()){
if(d.getType().equals("os")){
deps.add(d);
}
}
return deps;
}
public Collection<Dependency> getCranDependencies(){
Set<Dependency> deps = new HashSet<>();
for(Dependency d: this.getDependencies()){
if(d.getType().equals("cran")){
deps.add(d);
}
}
return deps;
}
public Collection<Dependency> getGitHubDependencies(){
Set<Dependency> deps = new HashSet<>();
for(Dependency d: this.getDependencies()){
if(d.getType().equals("github")){
deps.add(d);
}
}
return deps;
}
public void setDependencies(Collection<Dependency> deps) {
this.dependencies = deps;
}
public String toString() {
String out = "Algorithm: " + this.getName()+"\n";
out+=" Class Name: " + this.getClazz()+"\n";
out+=" Description: " + this.getDescription()+"\n";
out+=" Dependencies: " + this.getDependencies()+"\n";
return out;
}
public String getClazz() {
return clazz;
}
public void setClazz(String clazz) {
this.clazz = clazz;
}
public String getPackageURL() {
return packageURL;
}
public void setPackageURL(String packageURL) {
this.packageURL = packageURL;
}
public String getAlgorithmType() {
return algorithmType;
}
public void setAlgorithmType(String algorithmType) {
this.algorithmType = algorithmType;
}
public String getSkipJava() {
return skipJava;
}
public void setSkipJava(String skipJava) {
this.skipJava = skipJava;
}
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getFullname() {
return fullname;
}
public void setFullname(String fullname) {
this.fullname = fullname;
}
public String getEmail() {
return email;
}
public void setEmail(String email) {
this.email = email;
}
public String getLanguage() {
return language;
}
public void setLanguage(String language) {
this.language = language;
}
public void setActions(Collection<Action> actions) {
this.actions = actions;
}
}

@@ -0,0 +1,56 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
import java.util.Collection;
import java.util.Vector;
public class AlgorithmSet {
private String name;
private Collection<Algorithm> algorithms;
public AlgorithmSet()
{
this.algorithms = new Vector<>();
}
public String getName()
{
return name;
}
public void setName(String name)
{
this.name = name;
}
public Collection<Algorithm> getAlgorithms()
{
return new Vector<>(algorithms);
}
public void addAlgorithm(Algorithm algorithm)
{
this.algorithms.add(algorithm);
}
public boolean hasAlgorithm(Algorithm algorithm)
{
for (Algorithm a : this.algorithms) {
if (a.getName().equals(algorithm.getName()))
{
return true;
}
}
return false;
}
public String toString()
{
String out = "ALGOSET: " + this.name + "\n";
for(Algorithm a:this.algorithms) {
out+=a+"\n";
}
return out;
}
}

@@ -0,0 +1,82 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
import java.util.Collection;
import java.util.Vector;
public class Cluster {
/**
* The set of hosts belonging to the cluster.
*/
private Collection<Host> hosts;
/**
* A name for this cluster.
*/
private String name;
/**
* A description of this cluster.
*/
private String description;
/**
* The set of algorithms deployed on this cluster (i.e. on all its hosts)
*/
private Collection<AlgorithmSet> algoSets;
public Cluster()
{
this.hosts = new Vector<>();
this.algoSets = new Vector<>();
}
public void addAlgorithmSet(AlgorithmSet set)
{
this.algoSets.add(set);
}
public void addHost(Host host)
{
this.hosts.add(host);
}
public Collection<Host> getHosts()
{
return hosts;
}
public String getName()
{
return name;
}
public void setName(String name)
{
this.name = name;
}
public String getDescription()
{
return description;
}
public void setDescription(String description)
{
this.description = description;
}
public Collection<AlgorithmSet> getAlgorithmSets()
{
return algoSets;
}
public String toString() {
String out = "Cluster: "+this.name+"\n";
for(Host h:this.getHosts()) {
out+=" "+h+"\n";
}
return out;
}
}

@@ -0,0 +1,29 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
public class Dependency {
private String name;
private String type;
public String getName()
{
return name;
}
public void setName(String name) {
this.name = name;
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public String toString() {
return this.type+":"+this.name;
}
}

@@ -0,0 +1,15 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
public class Domain {
private String name;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}

@@ -0,0 +1,48 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel;
public class Host {
private String name;
private Domain domain;
public Host(String hostname) {
this.setName(hostname);
}
public Host() {
}
public String getFullyQualifiedName() {
if(this.domain!=null && this.domain.getName()!=null)
return this.getName()+"."+this.getDomain().getName();
else
return this.getName();
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Domain getDomain() {
return domain;
}
public void setDomain(Domain domain) {
this.domain = domain;
}
// public String toString() {
// return this.name + "@" + this.domain;
// }
public String toString() {
return this.name;
}
}

@@ -0,0 +1,15 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel.comparator;
import java.util.Comparator;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
public class AlgorithmComparator implements Comparator<Algorithm> {
@Override
public int compare(Algorithm a1, Algorithm a2) {
return a1.getName().compareTo(a2.getName());
}
}

@@ -0,0 +1,18 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel.comparator;
import java.util.Comparator;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
public class DependencyComparator implements Comparator<Dependency> {
@Override
public int compare(Dependency a1, Dependency a2) {
int out = a1.getType().compareTo(a2.getType());
if(out!=0)
return out;
return a1.getName().compareTo(a2.getName());
}
}

@@ -0,0 +1,17 @@
package org.gcube.dataanalysis.dataminer.poolmanager.datamodel.comparator;
import java.util.Comparator;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Host;
public class HostComparator implements Comparator<Host> {
@Override
public int compare(Host h1, Host h2) {
int out = h1.getDomain().getName().compareTo(h2.getDomain().getName());
if(out!=0)
return out;
return h1.getName().compareTo(h2.getName());
}
}
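The three comparators above implement their two-level orderings by hand. Since Java 8 the same Dependency ordering (type first, then name) can be composed with Comparator.comparing; a sketch using a stand-in Dep class rather than the project's Dependency type:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Sketch: DependencyComparator's ordering (type, then name) expressed with
// Comparator.comparing/thenComparing. "Dep" is a stand-in, not the project's
// Dependency class.
public class ComparatorSketch {

    static final class Dep {
        final String type;
        final String name;
        Dep(String type, String name) { this.type = type; this.name = name; }
    }

    public static void main(String[] args) {
        List<Dep> deps = Arrays.asList(
                new Dep("os", "zlib"), new Dep("cran", "sqldf"), new Dep("cran", "DBI"));
        deps.sort(Comparator.comparing((Dep d) -> d.type).thenComparing(d -> d.name));
        // "cran" sorts before "os"; within "cran", "DBI" before "sqldf"
        System.out.println(deps.get(0).type + ":" + deps.get(0).name);
    }
}
```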

@@ -0,0 +1,103 @@
package org.gcube.dataanalysis.dataminer.poolmanager.process;
import java.util.StringTokenizer;
public class AddAlgorithmCommand {
private String command;
private String name;
private String category;
private String clazz;
private String scope;
private String algorithmType;
private String skipJava;
private String url;
private String description;
public AddAlgorithmCommand(String cmd) {
StringTokenizer st = new StringTokenizer(cmd, " ");
if (st.hasMoreElements())
command = st.nextToken();
if (st.hasMoreElements())
name = st.nextToken();
if (st.hasMoreElements())
category = st.nextToken();
if (st.hasMoreElements())
clazz = st.nextToken();
if (st.hasMoreElements())
scope = st.nextToken();
if (st.hasMoreElements())
algorithmType = st.nextToken();
if (st.hasMoreElements())
skipJava = st.nextToken();
if (st.hasMoreElements())
url = st.nextToken();
String d = "";
while (st.hasMoreElements())
d = d + st.nextToken() + " ";
this.setDescription(d);
}
public void setDescription(String d) {
if(d!=null) {
d = d.trim();
if(d.startsWith("\"") && d.endsWith("\"")) {
d = d.substring(1, d.length()-1).trim();
}
}
this.description = d;
}
public String getCommand() {
return command;
}
public String getName() {
return name;
}
public String getCategory() {
return category;
}
public String getClazz() {
return clazz;
}
public String getVRE() {
return scope;
}
public String getAlgorithmType() {
return algorithmType;
}
public String getSkipjava() {
return skipJava;
}
public String getUrl() {
return url;
}
public String getDescription() {
return description;
}
public String toString() {
String out = "";
out += String.format("%-12s: %s\n", "command", command);
out += String.format("%-12s: %s\n", "algo name", name);
out += String.format("%-12s: %s\n", "category", category);
out += String.format("%-12s: %s\n", "class", clazz);
out += String.format("%-12s: %s\n", "scope", scope);
out += String.format("%-12s: %s\n", "algo type", algorithmType);
out += String.format("%-12s: %s\n", "skip java", skipJava);
out += String.format("%-12s: %s\n", "url", url);
out += String.format("%-12s: %s\n", "description", this.description);
return out;
}
}
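AddAlgorithmCommand treats the first eight whitespace-separated tokens as positional fields and everything after them as the free-text description. The tokenization rule can be sketched in isolation (CommandParseSketch and the sample command string are illustrative, not a real invocation):

```java
import java.util.StringTokenizer;

// Sketch of the positional parsing done by AddAlgorithmCommand: the first
// eight tokens are command, name, category, class, scope, algorithm type,
// skip-java flag and url; the remaining tokens form the description.
public class CommandParseSketch {

    static String[] parseFields(String cmd) {
        StringTokenizer st = new StringTokenizer(cmd, " ");
        String[] fields = new String[8];
        for (int i = 0; i < 8 && st.hasMoreTokens(); i++) {
            fields[i] = st.nextToken();
        }
        return fields;
    }

    public static void main(String[] args) {
        String cmd = "add MYALGO BLACK_BOX my.pkg.MyAlgo /gcube/devNext"
                + " transducerers N http://example.org/algo.zip a demo algorithm";
        String[] f = parseFields(cmd);
        System.out.println(f[1] + " / " + f[7]); // name and url fields
    }
}
```

Note that the real class additionally strips surrounding double quotes from the description in setDescription.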

@@ -0,0 +1,317 @@
package org.gcube.dataanalysis.dataminer.poolmanager.process;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Vector;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class AlgorithmPackageParser {
/**
* The name of the file containing algorithm metadata. Expected in the root
* directory of the package.
*/
private final String METADATA_FILE_NAME = "Info.txt",
METADATA_USERNAME = "Username",
METADATA_FULLNAME = "Full Name",
METADATA_EMAIL = "Email",
METADATA_LANGUAGE = "Language",
METADATA_CATEGORY = "Algorithm Category",
METADATA_ALGORITHM_NAME = "Algorithm Name",
METADATA_ALGORITHM_DESCRIPTION = "Algorithm Description",
METADATA_CLASS_NAME = "Class Name",
// private static final String METADATA_PACKAGES = "Packages";
METADATA_KEY_VALUE_SEPARATOR = ":";
private final int BUFFER_SIZE = 4096;
private Logger logger;
public AlgorithmPackageParser() {
this.logger = LoggerFactory.getLogger(AlgorithmPackageParser.class);
}
public Algorithm parsePackage(String url) throws IOException {
String packageMetadata = this.getPackageMetadata(url);
if (packageMetadata == null) {
this.logger.warn("No metadata found for {}", url);
return null;
} else {
Map<String, List<String>> parsedMetadata = this.parseMetadata(packageMetadata);
Algorithm a = this.createAlgorithm(parsedMetadata);
a.setPackageURL(url);
return a;
}
}
private String getPackageMetadata(String url) throws IOException {
// try-with-resources closes both streams even if reading fails
try (InputStream is = new URL(url).openStream();
ZipInputStream zipIs = new ZipInputStream(is)) {
ZipEntry entry = zipIs.getNextEntry();
while (entry != null) {
if (METADATA_FILE_NAME.equalsIgnoreCase(entry.getName())) {
return this.getEntryContent(zipIs);
}
entry = zipIs.getNextEntry();
}
return null;
}
}
private String getEntryContent(ZipInputStream zipIn) throws IOException {
// Buffer the raw bytes and decode once at the end: decoding each chunk
// separately could split a multi-byte character across two reads.
java.io.ByteArrayOutputStream bytes = new java.io.ByteArrayOutputStream();
byte[] buffer = new byte[BUFFER_SIZE];
int read;
while ((read = zipIn.read(buffer)) != -1) {
bytes.write(buffer, 0, read);
}
return bytes.toString("UTF-8");
}
private Map<String, List<String>> parseMetadata(String metadata) {
Map<String, List<String>> out = new HashMap<String, List<String>>();
String[] lines = metadata.split("\n");
String key = null;
String value = null;
for (String line : lines) {
// skip empty lines
if (line.trim().isEmpty()) {
continue;
}
// scan lines one by one, looking for key and values
String[] parts = line.split(METADATA_KEY_VALUE_SEPARATOR);
if (parts.length > 1) {
// key and value on the same line
key = parts[0].trim();
value = line.substring(parts[0].length() + 1).trim();
} else if (parts.length == 1) {
// either a key or a value
if (line.trim().endsWith(METADATA_KEY_VALUE_SEPARATOR)) {
// key
key = parts[0].trim();
value = null;
} else {
// value
value = line.trim();
}
}
// add key+value to the map
if (key != null && value != null) {
List<String> values = out.get(key);
if (values == null) {
values = new Vector<>();
out.put(key, values);
}
values.add(value);
this.logger.debug(key + METADATA_KEY_VALUE_SEPARATOR + " " + values);
}
}
return out;
}
// private Algorithm createAlgorithm(Map<String, List<String>> metadata) {
// Algorithm out = new Algorithm();
// out.setName(extractSingleValue(metadata, METADATA_ALGORITHM_NAME));
// out.setDescription(extractSingleValue(metadata, METADATA_ALGORITHM_DESCRIPTION));
// out.setClazz(extractSingleValue(metadata, METADATA_CLASS_NAME));
// List<String> dependencies = extractMultipleValues(metadata, METADATA_PACKAGES);
// if (dependencies != null) {
// for (String pkg : dependencies) {
// Dependency dep = new Dependency();
// dep.setName(pkg);
// dep.setType("os");
// out.addDependency(dep);
// }
// }
// return out;
// }
private Algorithm createAlgorithm(Map<String, List<String>> metadata) {
Algorithm out = new Algorithm();
out.setName(extractSingleValue(metadata, METADATA_ALGORITHM_NAME));
out.setDescription(extractSingleValue(metadata, METADATA_ALGORITHM_DESCRIPTION));
out.setClazz(extractSingleValue(metadata, METADATA_CLASS_NAME));
out.setEmail(extractSingleValue(metadata, METADATA_EMAIL));
out.setFullname(extractSingleValue(metadata, METADATA_FULLNAME));
out.setUsername(extractSingleValue(metadata, METADATA_USERNAME));
out.setLanguage(extractSingleValue(metadata, METADATA_LANGUAGE));
out.setCategory(extractSingleValue(metadata, METADATA_CATEGORY));
List<String> dependencies = extractMultipleValues(metadata, "Package Name");
if (dependencies != null) {
for (String pkg : dependencies) {
Dependency dep = new Dependency();
dep.setName(pkg);
out.addDependency(dep);
}
}
// List<String> rdependencies = extractMultipleValues(metadata, "cran");
// if (rdependencies != null) {
// for (String pkg : rdependencies) {
// Dependency dep = new Dependency();
//
// //if (pkg.startsWith("os:")){
// dep.setName(pkg);
// dep.setType("cran");
// out.addDependency(dep);
// }
// }
//
//
// List<String> defdependencies = extractMultipleValues(metadata, "Packages");
// if (defdependencies != null) {
// for (String pkg : defdependencies) {
// Dependency dep = new Dependency();
//
// //if (pkg.startsWith("os:")){
// dep.setName(pkg);
// dep.setType("os");
// out.addDependency(dep);
// }
// }
//
// List<String> osdependencies = extractMultipleValues(metadata, "os");
// if (osdependencies != null) {
// for (String pkg : osdependencies) {
// Dependency dep = new Dependency();
//
// //if (pkg.startsWith("os:")){
// dep.setName(pkg);
// dep.setType("os");
// out.addDependency(dep);
// }
// }
//
//
//
// List<String> gitdependencies = extractMultipleValues(metadata, "github");
// if (gitdependencies != null) {
// for (String pkg : gitdependencies) {
// Dependency dep = new Dependency();
//
// //if (pkg.startsWith("os:")){
// dep.setName(pkg);
// dep.setType("github");
// out.addDependency(dep);
// }
// }
//
//
//
// List<String> cdependencies = extractMultipleValues(metadata, "custom");
// if (cdependencies != null) {
// for (String pkg : cdependencies) {
// Dependency dep = new Dependency();
//
// //if (pkg.startsWith("os:")){
// dep.setName(pkg);
// dep.setType("custom");
// out.addDependency(dep);
// }
// }
// if (pkg.startsWith("r:")){
// //String results = StringEscapeUtils.escapeJava(pkg);
// dep.setName(pkg);
// dep.setType("cran");
// }
// if (pkg.startsWith("custom:")){
// dep.setName(pkg);
// dep.setType("custom");
// }
// if (!pkg.startsWith("os:")&&!pkg.startsWith("r:")&&!pkg.startsWith("custom:")){
// dep.setName(pkg);
// dep.setType("os");
// }
return out;
}
private static String extractSingleValue(Map<String, List<String>> metadata,
String key) {
List<String> l = metadata.get(key);
if (l != null && l.size() == 1) {
return l.get(0);
} else {
return null;
}
}
private static List<String> extractMultipleValues(
Map<String, List<String>> metadata, String key) {
List<String> l = metadata.get(key);
if (l != null) {
return new Vector<>(l);
} else {
return null;
}
}
public static void main(String[] args) {
AlgorithmPackageParser ap = new AlgorithmPackageParser();
String txt =
"Username: giancarlo.panichi\n"+
"Full Name: Giancarlo Panichi\n"+
"Email: g.panichi@isti.cnr.it\n"+
"Language: R\n"+
"Algorithm Name: RBLACKBOX\n"+
"Class Name: org.gcube.dataanalysis.executor.rscripts.RBlackBox\n"+
"Algorithm Description: RBlackBox\n"+
"Algorithm Category: BLACK_BOX\n"+
"Interpreter Version: 3.2.1\n"+
"Packages:\n"+
"Package Name: DBI\n"+
"Package Name: RPostgreSQL\n"+
"Package Name: raster\n"+
"Package Name: maptools\n"+
"Package Name: sqldf\n"+
"Package Name: RJSONIO\n"+
"Package Name: httr \n"+
"Package Name: data.table";
ap.parseMetadata(txt);
}
}
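parseMetadata accumulates repeated keys (such as "Package Name") into a list, which is what lets createAlgorithm collect one Dependency per package line of Info.txt. A simplified standalone sketch that handles only the key: value-on-the-same-line case (MetadataSketch is illustrative; the real method also accepts a value on the line after a bare "key:" line):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simplified sketch of the Info.txt parsing rule: split each line on the
// first ':' and accumulate repeated keys into a list. Bare "key:" lines and
// value-only continuation lines, which the real parser supports, are skipped.
public class MetadataSketch {

    static Map<String, List<String>> parse(String metadata) {
        Map<String, List<String>> out = new LinkedHashMap<>();
        for (String line : metadata.split("\n")) {
            int sep = line.indexOf(':');
            if (line.trim().isEmpty() || sep < 0) continue;
            String key = line.substring(0, sep).trim();
            String value = line.substring(sep + 1).trim();
            if (!value.isEmpty()) {
                out.computeIfAbsent(key, k -> new ArrayList<>()).add(value);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, List<String>> m =
                parse("Language: R\nPackage Name: DBI\nPackage Name: sqldf");
        System.out.println(m.get("Package Name")); // [DBI, sqldf]
    }
}
```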

@@ -0,0 +1,67 @@
//package org.gcube.dataanalysis.dataminer.poolmanager.rest;
//
//import java.io.IOException;
//import java.net.MalformedURLException;
//import java.net.URL;
//import java.net.UnknownHostException;
//
//import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
//
//public interface PoolManager {
//
// String addAlgorithmToVRE(Algorithm algo, String vre) throws IOException, InterruptedException;
//
// Algorithm extractAlgorithm(String url) throws IOException;
//
// String getLogById(String logId) throws IOException;
//
// void getLogId(Algorithm algo, String vre);
//
// String getScriptFromURL(URL logId) throws IOException;
//
// URL getURLfromWorkerLog(String logUrl) throws MalformedURLException, UnknownHostException;
//
//}
package org.gcube.dataanalysis.dataminer.poolmanager.rest;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.UnknownHostException;
import java.util.List;
import java.util.Set;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.tmatesoft.svn.core.SVNException;
public interface PoolManager
{
String addAlgorithmToVRE(Algorithm algo, String vre, boolean test ) throws IOException, InterruptedException;
String addAlgorithmToHost(Algorithm algo, String host,boolean test) throws IOException, InterruptedException;
String stageAlgorithm(String algorithmPackageURL) throws IOException, InterruptedException;
String publishAlgorithm(String algorithmPackageURL, String targetVREToken, String targetVRE) throws IOException, InterruptedException;
Algorithm extractAlgorithm(String url) throws IOException;
String getLogById(String logId) throws IOException;
void getLogId(Algorithm algo, String vre);
String getScriptFromURL(URL logId) throws IOException;
URL getURLfromWorkerLog(String logUrl) throws MalformedURLException, UnknownHostException;
void addAlgToIs(Algorithm algo);
Set<Algorithm> getAlgoFromIs();
List<String> updateSVN(String file, List<String> ldep) throws SVNException, IOException;
}

@@ -0,0 +1,245 @@
package org.gcube.dataanalysis.dataminer.poolmanager.rest;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.UnknownHostException;
import java.util.List;
import java.util.Set;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import org.gcube.common.authorization.client.exceptions.ObjectNotFound;
import org.gcube.common.authorization.library.provider.SecurityTokenProvider;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.service.DataminerPoolManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.AlgorithmBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNException;
@Path("/")
public class RestPoolManager implements PoolManager
{
//@Context ServletContext context;
private final Logger logger;
private DataminerPoolManager service;
//@Context
//private ApplicationContext context = ContextProvider.get();
public RestPoolManager()
{
this.logger = LoggerFactory.getLogger(RestPoolManager.class);
this.service = new DataminerPoolManager();
}
@GET
@Path("/algorithm/stage")
@Produces("text/plain")
public String stageAlgorithm(
@QueryParam("algorithmPackageURL") String algorithmPackageURL,
@QueryParam("targetVRE") String targetVRE,
@QueryParam("category") String category,
@QueryParam("algorithm_type") String algorithm_type) throws IOException, InterruptedException {
this.logger.debug("Stage algorithm method called");
Algorithm algo = AlgorithmBuilder.create(algorithmPackageURL);
//String env = context.application().getInitParameter("Environment");
return this.service.stageAlgorithm(algo,targetVRE,category,algorithm_type/*,env*/);
}
@GET
@Path("/algorithm/add")
@Produces("text/plain")
public String publishAlgorithm(
@QueryParam("algorithmPackageURL") String algorithmPackageURL,
//@QueryParam("targetVREToken") String targetVREToken,
@QueryParam("targetVRE") String targetVRE,
@QueryParam("category") String category,
@QueryParam("algorithm_type") String algorithm_type) throws IOException, InterruptedException {
this.logger.debug("Publish algorithm method called");
Algorithm algo = AlgorithmBuilder.create(algorithmPackageURL);
//String env = context.application().getInitParameter("Environment");
return this.service.publishAlgorithm(algo, /*targetVREToken,*/ targetVRE,category,algorithm_type/*,env*/);
}
/*
* /scopes/<scope> POST // add an algorithm to all dataminers in the scope
* /hosts/<hostname> POST // add an algorithm to the given host
*/
@GET
@Path("/log")
@Produces("text/plain")
public String getLogById(@QueryParam("logUrl") String logUrl) throws IOException {
this.logger.debug("Get log by id method called");
this.logger.debug("Returning Log =" + logUrl);
return service.getLogById(logUrl);
}
@GET
@Path("/monitor")
@Produces("text/plain")
public String getMonitorById(@QueryParam("logUrl") String logUrl) throws IOException {
this.logger.debug("Get monitor by id method called");
this.logger.debug("Returning Log =" + logUrl);
return service.getMonitorById(logUrl);
}
@Override
public Algorithm extractAlgorithm(String url) throws IOException {
// TODO Auto-generated method stub
return null;
}
public static void main(String[] args) throws Exception {
// System.out.println(System.getProperty("user.home")+File.separator+"/gcube/dataminer-pool-manager");
// // ProxySelector.setDefault(new
// // PropertiesBasedProxySelector("/home/ngalante/.proxy-settings"));
//
// ScopeProvider.instance.set("/d4science.research-infrastructures.eu/gCubeApps/RPrototypingLab");
// SecurityTokenProvider.instance.set("3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462");
ScopeProvider.instance.set("/gcube/devNext");
SecurityTokenProvider.instance.set("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
// AuthorizationEntry entry = authorizationService().get("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
// System.out.println(entry.getContext());
RestPoolManager a = new RestPoolManager();
a.stageAlgorithm("http://data-d.d4science.org/TSt3cUpDTG1teUJMemxpcXplVXYzV1lBelVHTTdsYjlHbWJQNStIS0N6Yz0");
// //a.publishAlgorithm("http://data.d4science.org/MnovRjZIdGV5WlB0WXE5NVNaZnRoRVg0SU8xZWpWQlFHbWJQNStIS0N6Yz0", "708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548","/gcube/devNext/NextNext");
// // PoolManager aa = new DataminerPoolManager();
//
}
//Production Testing
/*
stageAlgorithm(Rproto caller token, package, category)
http://node2-d-d4s.d4science.org:8080/dataminer-pool-manager-1.0.0-SNAPSHOT/rest/algorithm/stage?gcube-token=3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462&algorithmPackageURL=http://data.d4science.org/dENQTTMxdjNZcGRpK0NHd2pvU0owMFFzN0VWemw3Zy9HbWJQNStIS0N6Yz0&category=ICHTHYOP_MODEL
publishAlgorithm(Rproto caller token, package, category, target token, target prod vre)
node2-d-d4s.d4science.org:8080/dataminer-pool-manager-1.0.0-SNAPSHOT/rest/algorithm/add?gcube-token=708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548&algorithmPackageURL=http://data.d4science.org/dENQTTMxdjNZcGRpK0NHd2pvU0owMFFzN0VWemw3Zy9HbWJQNStIS0N6Yz0&category=ICHTHYOP_MODEL&targetVREToken=3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462&targetVRE=/d4science.research-infrastructures.eu/gCubeApps/RPrototypingLab
getLogById(Rproto caller token, logid)
http://node2-d-d4s.d4science.org:8080/dataminer-pool-manager-1.0.0-SNAPSHOT/rest/log?gcube-token=3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462&logUrl=
*/
//dev Testing
/*
stageAlgorithm(dev_caller_vre_token, package, category)
http://node2-d-d4s.d4science.org:8080/dataminer-pool-manager-1.0.0-SNAPSHOT/rest/algorithm/stage?gcube-token=708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548&algorithmPackageURL=http://data.d4science.org/dENQTTMxdjNZcGRpK0NHd2pvU0owMFFzN0VWemw3Zy9HbWJQNStIS0N6Yz0&category=ICHTHYOP_MODEL
publishAlgorithm(dev_caller_vre_token, package, category, target token, target prod vre)
http://node2-d-d4s.d4science.org:8080/dataminer-pool-manager-1.0.0-SNAPSHOT/rest/log?gcube-token=708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548&logUrl=450bb7f9-9e38-4bde-8f4d-f3296f95deba
getLogById(dev_caller_vre_token, logid)
http://node2-d-d4s.d4science.org:8080/dataminer-pool-manager-1.0.0-SNAPSHOT/rest/log?gcube-token=708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548&logUrl=426c8e35-a624-4710-b612-c90929c32c27 */
@Override
public void getLogId(Algorithm algo, String vre) {
// TODO Auto-generated method stub
}
@Override
public String getScriptFromURL(URL logId) throws IOException {
// TODO Auto-generated method stub
return null;
}
@Override
public URL getURLfromWorkerLog(String logUrl) throws MalformedURLException, UnknownHostException {
// TODO Auto-generated method stub
return null;
}
@Override
public void addAlgToIs(Algorithm algo) {
// TODO Auto-generated method stub
}
@Override
public Set<Algorithm> getAlgoFromIs() {
// TODO Auto-generated method stub
return null;
}
@Override
public List<String> updateSVN(String file, List<String> ldep) throws SVNException {
// TODO Auto-generated method stub
return null;
}
@Override
public String addAlgorithmToHost(Algorithm algo, String host, boolean test)
throws IOException, InterruptedException {
// TODO Auto-generated method stub
return null;
}
@Override
public String addAlgorithmToVRE(Algorithm algo, String vre, boolean test)
throws IOException, InterruptedException {
// TODO Auto-generated method stub
return null;
}
@Override
public String stageAlgorithm(String algorithmPackageURL) throws IOException, InterruptedException {
// TODO Auto-generated method stub
return null;
}
@Override
public String publishAlgorithm(String algorithmPackageURL, String targetVREToken, String targetVRE)
throws IOException, InterruptedException {
// TODO Auto-generated method stub
return null;
}
}

@@ -0,0 +1,273 @@
package org.gcube.dataanalysis.dataminer.poolmanager.service;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PrintStream;
import java.io.PrintWriter;
import java.util.Collection;
import java.util.UUID;
import org.gcube.common.authorization.library.provider.SecurityTokenProvider;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.ansible.AnsibleWorker;
import org.gcube.dataanalysis.dataminer.poolmanager.ansiblebridge.AnsibleBridge;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.Configuration;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.service.exceptions.AnsibleException;
import org.gcube.dataanalysis.dataminer.poolmanager.service.exceptions.UndefinedDependenciesException;
import org.gcube.dataanalysis.dataminer.poolmanager.util.CheckMethod;
import org.gcube.dataanalysis.dataminer.poolmanager.util.NotificationHelper;
import org.gcube.dataanalysis.dataminer.poolmanager.util.SVNUpdater;
import org.gcube.dataanalysis.dataminer.poolmanager.util.SendMail;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.DMPMException;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.EMailException;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.GenericException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public abstract class DMPMJob {
private Configuration configuration;
private String dmpmHomeDirectory;
private SVNUpdater svnUpdater;
private File jobLogs;
private String id;
private Algorithm algorithm;
private Cluster cluster;
private String vREName;
private String category;
private String algorithm_type;
private Logger logger;
private enum STATUS
{
PROGRESS ("IN PROGRESS"),
COMPLETED ("COMPLETED"),
FAILED ("FAILED");
private String status;
STATUS (String status)
{
this.status = status;
}
}
public DMPMJob(SVNUpdater svnUpdater,Configuration configuration,Algorithm algorithm, Cluster cluster,String vREName,
String category, String algorithm_type){
this.logger = LoggerFactory.getLogger(DMPMJob.class);
this.configuration = configuration;
this.algorithm = algorithm;
this.cluster = cluster;
this.vREName = vREName;
this.category = category;
this.algorithm_type = algorithm_type;
this.svnUpdater = svnUpdater;
this.dmpmHomeDirectory = System.getProperty("user.home")+File.separator+"dataminer-pool-manager";
this.id = UUID.randomUUID().toString();
//TODO: dmpm work directory should be loaded from configuration file
this.jobLogs = new File(this.dmpmHomeDirectory+File.separator+"jobs");
this.jobLogs.mkdirs();
}
public String start()
{
setStatusInformation(STATUS.PROGRESS);
new Thread(new Runnable() {
@Override
public void run() {
try {
execute();
} catch (Exception e) {
logger.error("Job execution failed", e);
}
}
}).start();
return this.id;
}
protected AnsibleWorker createWorker(Algorithm algo,
Cluster dataminerCluster,
boolean includeAlgorithmDependencies,
String user){
AnsibleBridge ansibleBridge = new AnsibleBridge(this.dmpmHomeDirectory);
try {
return ansibleBridge.createWorker(algo, dataminerCluster, includeAlgorithmDependencies, user);
} catch (IOException e) {
this.logger.error("Unable to create the Ansible worker", e);
}
return null;
}
public void setStatusInformation(STATUS exitStatus) {
try
{
File statusFile = new File (this.jobLogs,this.id + "_exitStatus");
// try-with-resources guarantees the writer is closed even on failure
try (PrintWriter writer = new PrintWriter(statusFile, "UTF-8")) {
writer.println(exitStatus.status);
}
} catch (Exception e)
{
this.logger.error ("Unable to update exit status file with status "+exitStatus.status,e);
}
}
private void updateLogFile (File logFile, String message)
{
try
{
// note: this replaces the log file content with the given message
try (PrintWriter writer = new PrintWriter(logFile,"UTF-8")) {
writer.print(message);
}
} catch (Exception e)
{
this.logger.error("Unable to log the error message: "+message,e);
}
}
protected abstract void execute ();
private void preInstallation (SendMail sm,NotificationHelper nh, File logFile ) throws GenericException, EMailException,UndefinedDependenciesException
{
this.logger.debug("Checking dependencies...");
Collection<String> undefinedDependencies = this.svnUpdater.getUndefinedDependencies(
this.svnUpdater.getDependencyFile(this.algorithm.getLanguage()),
this.algorithm.getDependencies());
if (!undefinedDependencies.isEmpty())
{
this.logger.debug("Some dependencies are not defined");
throw new UndefinedDependenciesException(undefinedDependencies);
}
}
private String installation (SendMail sm,NotificationHelper nh,CheckMethod methodChecker,File logFile ) throws DMPMException
{
this.logger.debug("Installation process started");
methodChecker.deleteFiles(this.algorithm/*, env*/);
int ret = this.executeAnsibleWorker(createWorker(this.algorithm, this.cluster, false, "root"),logFile);
this.logger.debug("Return code= "+ret);
if (ret != 0) throw new AnsibleException(ret);
else
{
this.logger.debug("Operation completed");
//this.setStatusInformation(STATUS.PROGRESS);
this.logger.debug("Checking the method...");
methodChecker.checkMethod(this.configuration.getHost(), SecurityTokenProvider.instance.get());
methodChecker.copyAlgorithms(this.algorithm);
this.logger.debug("Method OK and algo exists");
this.logger.debug("Interface check ok!");
this.logger.debug("Both the files exist at the correct path!");
boolean algorithmListResult = this.svnUpdater.updateSVNAlgorithmList(this.algorithm, this.vREName,this.category, this.algorithm_type,
this.algorithm.getFullname());
this.setStatusInformation(STATUS.COMPLETED);
return algorithmListResult ?"":"\nWARNING: algorithm list could not be updated on SVN";
}
}
protected void execute(NotificationHelper nh, CheckMethod methodChecker)
{
SendMail sm = new SendMail();
File logFile = new File(this.jobLogs,this.id);
try
{
try {
this.logger.debug("Pre installation operations");
preInstallation(sm, nh, logFile);
this.logger.debug("Pre installation operation completed");
this.logger.debug("Installation...");
String warning = installation(sm, nh, methodChecker, logFile);
this.logger.debug("Installation completed");
this.logger.debug("Warning message "+warning);
this.setStatusInformation(STATUS.COMPLETED);
String bodyResponse = NotificationHelper.getSuccessBody(warning+"\n\n"+this.buildInfo());
sm.sendNotification(nh.getSuccessSubject() + " for "+this.algorithm.getName()+ " algorithm", bodyResponse);
} catch (DMPMException dmpme)
{
this.logger.error("Operation failed: "+dmpme.getMessage());
this.logger.error("Exception: ",dmpme);
this.setStatusInformation(STATUS.FAILED);
String errorMessage = "\n"+NotificationHelper.getFailedBody(dmpme.getErrorMessage()+"\n\n"+this.buildInfo());
this.updateLogFile(logFile, errorMessage);
sm.sendNotification(nh.getFailedSubject() +" for "+this.algorithm.getName()+ " algorithm", errorMessage);
}
} catch (EMailException eme)
{
this.logger.error("Unable to send notification email",eme);
}
}
protected int executeAnsibleWorker(AnsibleWorker worker, File logFile) throws GenericException
{
try (PrintStream ps = new PrintStream(new FileOutputStream(logFile, true)))
{
return worker.execute(ps);
} catch (Exception e)
{
throw new GenericException(e);
}
}
public String buildInfo() {
return
"\n"+
"Algorithm details:\n"+"\n"+
"User: "+this.algorithm.getFullname()+"\n"+
"Algorithm name: "+this.algorithm.getName()+"\n"+
"Staging DataMiner Host: "+ this.configuration.getHost()+"\n"+
"Caller VRE: "+ScopeProvider.instance.get()+"\n"+
"Target VRE: "+this.vREName+"\n";
}
}
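Each job records its lifecycle state in a per-job `<id>_exitStatus` file under the jobs directory, which `getMonitorById` later reads back. A minimal standalone sketch of that write/read contract follows; the class and helper names are hypothetical, not part of the project:

```java
import java.io.File;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.util.UUID;

public class StatusFileSketch {
    // Mirrors DMPMJob.setStatusInformation: one small status file per job id.
    static void writeStatus(File dir, String id, String status) throws Exception {
        File statusFile = new File(dir, id + "_exitStatus");
        try (PrintWriter writer = new PrintWriter(statusFile, "UTF-8")) {
            writer.println(status);
        }
    }

    // Mirrors DataminerPoolManager.getMonitorById: read the whole file back.
    static String readStatus(File dir, String id) throws Exception {
        File statusFile = new File(dir, id + "_exitStatus");
        return new String(Files.readAllBytes(statusFile.toPath()), "UTF-8").trim();
    }

    public static void main(String[] args) throws Exception {
        File dir = Files.createTempDirectory("jobs").toFile();
        String id = UUID.randomUUID().toString();
        writeStatus(dir, id, "IN PROGRESS");
        System.out.println(readStatus(dir, id)); // prints IN PROGRESS
    }
}
```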

View File

@ -0,0 +1,97 @@
package org.gcube.dataanalysis.dataminer.poolmanager.service;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.Scanner;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.util.ClusterBuilder;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.ClusterBuilderProduction;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.ClusterBuilderStaging;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.SVNUpdaterProduction;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.SVNUpdaterStaging;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNException;
public class DataminerPoolManager {
private Logger logger;
private SVNUpdaterStaging svnUpdaterStaging;
private SVNUpdaterProduction svnUpdaterProduction;
public DataminerPoolManager() {
this.logger = LoggerFactory.getLogger(this.getClass());
try {
//TODO: read this from configuration
this.svnUpdaterStaging = new SVNUpdaterStaging();
this.svnUpdaterProduction = new SVNUpdaterProduction();
} catch (SVNException e) {
this.logger.error("SVN Exception",e);
}
}
public String stageAlgorithm(Algorithm algo,String targetVRE, String category, String algorithm_type/*,String env*/) throws IOException, InterruptedException
{
this.logger.debug("Stage algorithm");
this.logger.debug("Algo "+algo);
this.logger.debug("Category "+category);
this.logger.debug("Algo type "+algorithm_type);
ClusterBuilder stagingClusterBuilder = new ClusterBuilderStaging();
Cluster stagingCluster = stagingClusterBuilder.getDataminerCluster();
//Cluster rProtoCluster = ClusterBuilder.getRProtoCluster();
DMPMJob job = new StagingJob(this.svnUpdaterStaging, algo, stagingCluster, /*rProtoCluster,*/ targetVRE, category, algorithm_type/*,env*/);
String id = job.start();
return id;
}
public String publishAlgorithm(Algorithm algo, String targetVRE, String category, String algorithm_type/*, String env*/) throws IOException, InterruptedException
{
this.logger.debug("publish algorithm");
this.logger.debug("Algo "+algo);
this.logger.debug("Category "+category);
this.logger.debug("Algo type "+algorithm_type);
ClusterBuilder productionClusterBuilder = new ClusterBuilderProduction();
Cluster prodCluster = productionClusterBuilder.getDataminerCluster();
DMPMJob job = new ProductionPublishingJob(this.svnUpdaterProduction, algo, prodCluster, targetVRE, category, algorithm_type/*,env*/);
String id = job.start();
return id;
}
public String getLogById(String id) throws FileNotFoundException{
//TODO: load dir from configuration file
this.logger.debug("Getting log by id "+id);
File path = new File(System.getProperty("user.home") + File.separator + "dataminer-pool-manager" + File.separator + "jobs" + File.separator + id);
try (Scanner scanner = new Scanner(path)) {
String response = scanner.useDelimiter("\\Z").next();
this.logger.debug("Response "+response);
return response;
}
}
public String getMonitorById(String id) throws FileNotFoundException{
this.logger.debug("Getting monitor by id "+id);
//TODO: load dir from configuration file
File path = new File(System.getProperty("user.home") + File.separator + "dataminer-pool-manager" + File.separator + "jobs" + File.separator + id + "_exitStatus");
try (Scanner scanner = new Scanner(path)) {
String response = scanner.useDelimiter("\\Z").next();
this.logger.debug("Response "+response);
return response;
}
}
}

View File

@ -0,0 +1,45 @@
package org.gcube.dataanalysis.dataminer.poolmanager.service;
import java.io.FileNotFoundException;
import java.io.UnsupportedEncodingException;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.util.SVNUpdater;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.CheckMethodProduction;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.NotificationHelperProduction;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ProductionPublishingJob extends DMPMJob {
//private String targetVREToken;
//private String env;
private Logger logger;
public ProductionPublishingJob(SVNUpdater svnUpdater, Algorithm algorithm,
Cluster prodCluster, String targetVREName, String category,String algorithm_type/*, String env*/) throws FileNotFoundException, UnsupportedEncodingException {
super(svnUpdater,DMPMClientConfiguratorManager.getInstance().getProductionConfiguration(),algorithm,prodCluster,targetVREName,category,algorithm_type);
this.logger = LoggerFactory.getLogger(ProductionPublishingJob.class);
}
@Override
protected void execute() {
this.logger.debug("Executing production publishing job...");
super.execute(new NotificationHelperProduction(), new CheckMethodProduction());
}
}

View File

@ -0,0 +1,43 @@
package org.gcube.dataanalysis.dataminer.poolmanager.service;
import java.io.FileNotFoundException;
import java.io.UnsupportedEncodingException;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.util.SVNUpdater;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.CheckMethodStaging;
import org.gcube.dataanalysis.dataminer.poolmanager.util.impl.NotificationHelperStaging;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class StagingJob extends DMPMJob {
private Logger logger;
public StagingJob(SVNUpdater svnUpdater, Algorithm algorithm,
Cluster stagingCluster, /* Cluster rProtoCluster, */
String rProtoVREName, String category, String algorithm_type/*, String env*/) throws FileNotFoundException, UnsupportedEncodingException {
super(svnUpdater,DMPMClientConfiguratorManager.getInstance().getStagingConfiguration(),algorithm,stagingCluster,rProtoVREName,category,algorithm_type);
this.logger = LoggerFactory.getLogger(StagingJob.class);
}
@Override
protected void execute() {
this.logger.debug("Executing staging job...");
super.execute(new NotificationHelperStaging(), new CheckMethodStaging());
}
}

View File

@ -0,0 +1,25 @@
package org.gcube.dataanalysis.dataminer.poolmanager.service.exceptions;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.DMPMException;
public class AnsibleException extends DMPMException {
private static final long serialVersionUID = 6772009633547404120L;
private int returnCode;
public AnsibleException(int returnCode) {
super ("Ansible work failed");
this.returnCode =returnCode;
}
@Override
public String getErrorMessage() {
return "Installation failed. Return code=" + this.returnCode;
}
}

View File

@ -0,0 +1,29 @@
package org.gcube.dataanalysis.dataminer.poolmanager.service.exceptions;
import java.util.Collection;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.DMPMException;
public class UndefinedDependenciesException extends DMPMException {
private String message;
private static final long serialVersionUID = 4504593796352609191L;
public UndefinedDependenciesException(Collection<String> undefinedDependencies) {
super ("Some dependencies are not defined");
this.message = "The following dependencies are not defined:\n";
for (String n : undefinedDependencies) {
message += "\n" + n +"\n";
}
}
@Override
public String getErrorMessage() {
return this.message;
}
}

View File

@ -0,0 +1,47 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.process.AlgorithmPackageParser;
import java.io.IOException;
/**
* Created by ggiammat on 5/9/17.
*/
public class AlgorithmBuilder {
public static Algorithm create(String algorithmPackageURL) throws IOException, InterruptedException {
return create(algorithmPackageURL, null, null, null, null, null, null, null);
}
public static Algorithm create(String algorithmPackageURL, String vre, String hostname, String name, String description,
String category, String algorithmType, String skipJava) throws IOException, InterruptedException {
Algorithm algo = new AlgorithmPackageParser().parsePackage(algorithmPackageURL);
if(category != null){
algo.setCategory(category);
}
if(algorithmType != null){
algo.setAlgorithmType(algorithmType);
}
if(skipJava != null){
algo.setSkipJava(skipJava);
}
if(name != null){
algo.setName(name);
}
if(description != null){
algo.setDescription(description);
}
return algo;
}
}

View File

@ -0,0 +1,305 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileWriter;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
import java.util.Properties;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.Configuration;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.AlgorithmException;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.GenericException;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.SVNCommitException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNException;
import com.jcraft.jsch.Channel;
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.JSchException;
import com.jcraft.jsch.Session;
import com.jcraft.jsch.SftpException;
public abstract class CheckMethod {
private Logger logger;
private Configuration configuration;
private final String KNOWN_HOSTS= "~/.ssh/known_hosts",
PRIVATE_KEY = "~/.ssh/id_rsa",
SSH_USER = "root",
SFTP_PROTOCOL = "sftp",
TEMP_DIRECTORY = "tmp";
private final Properties sshConfig;
public CheckMethod(Configuration configuration)
{
this.logger = LoggerFactory.getLogger(CheckMethod.class);
this.configuration = configuration;
sshConfig = new Properties();
sshConfig.put("StrictHostKeyChecking", "no");
}
public void checkMethod(String machine, String token) throws AlgorithmException {
try {
this.logger.debug("Checking method for machine "+machine);
this.logger.debug("By using token "+token);
String request = "http://" + machine
+ "/wps/WebProcessingService?Request=GetCapabilities&Service=WPS&gcube-token=" + token;
String response = machine + "___" + token + ".xml";
String baseDescriptionRequest = "http://" + machine
+ "/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0" + "&gcube-token="
+ token + "&Identifier=";
URL requestURL = new URL(request);
this.logger.debug("Request url "+requestURL.toString());
BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(requestURL.openStream()));
FileWriter fileWriter = new FileWriter(response);
String line;
boolean flag = true;
this.logger.debug("Writing file");
while (flag && (line = bufferedReader.readLine()) != null) {
this.logger.debug(line);
fileWriter.write(line);
fileWriter.write(System.lineSeparator());
if (line.contains("ows:Identifier"))
{
this.logger.debug("Identifier found");
String operatorName = line.substring(line.indexOf(">") + 1);
operatorName = operatorName.substring(0, operatorName.indexOf("<"));
this.logger.debug("Operator "+operatorName);
URL innerRequestURL = new URL(baseDescriptionRequest + operatorName);
BufferedReader innerBufferedReader = new BufferedReader(
new InputStreamReader(innerRequestURL.openStream()));
String innerLine = innerBufferedReader.readLine();
this.logger.debug("Inner line "+innerLine);
boolean innerFlag = true;
while (innerFlag && (innerLine = innerBufferedReader.readLine()) != null)
{
if (innerLine.contains("ows:Abstract"))
{
this.logger.debug("Abstract found");
String operatorDescription = innerLine.substring(innerLine.indexOf(">") + 1);
operatorDescription = operatorDescription.substring(0, operatorDescription.indexOf("<"));
this.logger.debug("Operator description "+operatorDescription);
innerFlag = false;
} else if (innerLine.contains("ows:ExceptionText"))
{
this.logger.debug("Exception found");
this.logger.debug(" " + "error retrieving operator description");
innerFlag = false;
flag = false;
} else
{
// the while condition already advances the reader;
// reading again here would skip every other line
this.logger.debug("Inner line processed "+innerLine);
}
}
}
}
this.logger.debug("Operation successful");
fileWriter.close();
} catch (Exception e) {
throw new AlgorithmException("Error "+e.getMessage(),e);
}
}
public void copyAlgorithms(Algorithm algo/*, String env*/) throws SVNCommitException, GenericException, AlgorithmException
{
this.logger.debug("Looking if algo "+algo.getName()+ " exists");
File file = new File(this.configuration.getGhostAlgoDirectory()+"/"+algo.getName()+".jar");
File file2 = new File(this.configuration.getGhostAlgoDirectory()+"/"+algo.getName()+"_interface.jar");
this.logger.debug("Looking for files "+file.getPath()+" "+file2.getPath());
boolean fileExists = false;
try
{
fileExists = (this.doesExist(file.getPath()/*,env*/)) && (this.doesExist(file2.getPath()/*,env*/));
} catch (Exception e)
{
throw new GenericException(e);
}
if (fileExists)
{
try
{
this.logger.debug("Files found");
this.copyFromDmToSVN(file/*,env*/);
this.copyFromDmToSVN(file2/*,env*/);
this.logger.debug("Files have been copied to SVN");
} catch (Exception e)
{
throw new GenericException(e);
}
}
else
{
this.logger.debug("Files not found");
this.logger.debug("Algorithm "+algo.getName()+".jar"+ " and "+algo.getName()+"_interface.jar files are not present at the expected path");
throw new AlgorithmException("Algorithm "+algo.getName()+".jar"+ " and "+algo.getName()+"_interface.jar files are not present at the expected path");
}
}
public void deleteFiles(Algorithm a/*,String env*/) throws GenericException
{
try
{
Session session = generateSession();
this.logger.debug("Checking file existence on host: "+this.configuration.getHost());
File file = new File(this.configuration.getGhostAlgoDirectory()+"/"+a.getName()+".jar");
File file2 = new File(this.configuration.getGhostAlgoDirectory()+"/"+a.getName()+"_interface.jar");
this.logger.debug("First file is located at: "+file.getPath());
this.logger.debug("Second file is located at: "+file2.getPath());
// generateSession() already sets the ssh config; just connect
session.connect();
Channel channel = session.openChannel(SFTP_PROTOCOL);
channel.connect();
this.logger.debug("sftp channel connected");
ChannelSftp c = (ChannelSftp) channel;
if(doesExist(file.getPath()/*,env*/)&&(doesExist(file2.getPath()/*,env*/))){
c.rm(file.getPath());
c.rm(file2.getPath());
this.logger.debug("Both the files have been deleted");
}
else this.logger.debug("Files not found");
channel.disconnect();
c.disconnect();
session.disconnect();
} catch (Exception e)
{
throw new GenericException(e);
}
}
public boolean doesExist(String path/*, String env*/) throws Exception {
Session session = generateSession();
boolean success = false;
session.connect();
Channel channel = session.openChannel(SFTP_PROTOCOL);
channel.connect();
this.logger.debug("sftp channel connected");
ChannelSftp c = (ChannelSftp) channel;
this.logger.debug(path);
try {
c.lstat(path);
success = true;
} catch (SftpException e) {
if (e.id == ChannelSftp.SSH_FX_NO_SUCH_FILE) {
// file doesn't exist
success = false;
}
//success = true; // something else went wrong
}
channel.disconnect();
c.disconnect();
session.disconnect();
this.logger.debug("Operation result "+success);
return success;
}
protected abstract void copyFromDmToSVN(File a) throws SVNCommitException, Exception;
protected void copyFromDmToSVN(File algorithmsFile/*,String env*/,SVNUpdater svnUpdater) throws SVNException, SVNCommitException, JSchException, SftpException {
this.logger.debug("Copying algorithm file from Data Miner to SVN");
String fileName = algorithmsFile.getName();
this.logger.debug("File name "+fileName);
Session session = generateSession();
session.connect();
Channel channel = session.openChannel(SFTP_PROTOCOL);
channel.connect();
ChannelSftp sftp = (ChannelSftp) channel;
sftp.cd(this.configuration.getGhostAlgoDirectory());
// remote sftp paths always use '/', independent of the local platform separator
String remoteFile = this.configuration.getGhostAlgoDirectory() + "/" + fileName;
this.logger.debug("Remote file "+remoteFile);
String localFile = File.separator + TEMP_DIRECTORY + File.separator + fileName;
this.logger.debug("Local file "+localFile);
sftp.get(remoteFile,localFile);
channel.disconnect();
session.disconnect();
File f = new File(localFile);
svnUpdater.updateAlgorithmFiles(f);
f.delete();
}
private Session generateSession () throws JSchException
{
JSch jsch = new JSch();
jsch.setKnownHosts(KNOWN_HOSTS);
jsch.addIdentity(PRIVATE_KEY);
this.logger.debug("Private Key Added.");
Session session = jsch.getSession(SSH_USER, this.configuration.getHost());
this.logger.debug("session created.");
session.setConfig(this.sshConfig);
return session;
}
public static List<String> getFiles(String a){
String[] array = a.split(",");
ArrayList<String> list = new ArrayList<>(Arrays.asList(array));
List<String> ls = new LinkedList<String>();
for (String s: list){
ls.add(s.trim());
}
return ls;
}
}
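The `getFiles` helper above simply splits a comma-separated list and trims each entry. A self-contained sketch of the same behaviour (the class name here is hypothetical):

```java
import java.util.LinkedList;
import java.util.List;

public class SplitFilesSketch {
    // Same behaviour as CheckMethod.getFiles: split on commas, trim each entry.
    static List<String> getFiles(String a) {
        List<String> ls = new LinkedList<>();
        for (String s : a.split(",")) {
            ls.add(s.trim());
        }
        return ls;
    }

    public static void main(String[] args) {
        System.out.println(getFiles("algo.jar, algo_interface.jar"));
        // prints [algo.jar, algo_interface.jar]
    }
}
```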

View File

@ -0,0 +1,28 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import static org.gcube.common.authorization.client.Constants.authorizationService;
import org.gcube.common.authorization.client.exceptions.ObjectNotFound;
import org.gcube.common.authorization.library.AuthorizationEntry;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class CheckPermission {
private static final Logger logger = LoggerFactory.getLogger(CheckPermission.class);
public static boolean apply(String VREToken, String vre) throws ObjectNotFound, Exception
{
AuthorizationEntry entry = authorizationService().get(VREToken);
if (entry.getContext().equals(vre)) {
logger.info("Authorization OK!");
return true;
}
logger.info("Token not valid for the VRE: "+vre);
return false;
}
}

View File

@ -0,0 +1,58 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import java.io.FileNotFoundException;
import java.io.IOException;
import org.gcube.common.authorization.library.provider.SecurityTokenProvider;
import org.gcube.common.scope.api.ScopeProvider;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.HAProxy;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.Configuration;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Cluster;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Host;
public abstract class ClusterBuilder {
private Configuration configuration;
public ClusterBuilder (Configuration configuration)
{
this.configuration = configuration;
}
//1. to complete
public Cluster getDataminerCluster() throws FileNotFoundException{
Cluster cluster = new Cluster();
Host h = new Host();
h.setName(this.configuration.getHost());
cluster.addHost(h);
return cluster;
}
public Cluster getVRECluster(String targetVREToken, String targetVRE) throws IOException{
Cluster cluster = new Cluster();
for (Host h : new HAProxy().listDataMinersByCluster(targetVREToken,targetVRE)) {
cluster.addHost(h);
}
return cluster;
}
public Cluster getRProtoCluster() throws IOException{
//Assumes the service is running in RPrototypingLab
String token = SecurityTokenProvider.instance.get();
String targetVRE = ScopeProvider.instance.get();
return this.getVRECluster(token, targetVRE);
}
}

View File

@ -0,0 +1,79 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
public abstract class NotificationHelper {
private String subjectHeader;
protected NotificationHelper (String subjectHeader)
{
this.subjectHeader = subjectHeader;
}
// private Exception executionException;
// private boolean isError() {
// return this.executionException!=null;
// }
// public void setExecutionException(Exception executionException) {
// this.executionException = executionException;
// }
public String getSuccessSubject() {
return this.subjectHeader+" is SUCCESS";
}
public String getFailedSubject() {
return this.subjectHeader+" is FAILED";
}
public static String getSuccessBody(String info) {
String message = "The installation of the algorithm is completed successfully.";
message+="\n\nYou can retrieve experiment results under the '/DataMiner' e-Infrastructure Workspace folder or from the DataMiner interface.\n\n"+ info;
return message;
}
public static String getFailedBody(String message) {
String body = "An error occurred while deploying your algorithm";
body+= "\n\nHere are the error details:\n\n" + message;
return body;
}
// public String getSuccessBodyRelease(String info) {
// String message = String.format("SVN REPOSITORY CORRECTLY UPDATED.");
// message+="\n\n The CRON job will install the algorithm in the target VRE \n\n"+ info;
// return message;
// }
//
// public String getFailedBodyRelease(String info) {
// String message = String.format("SVN REPOSITORY UPDATE FAILED.");
// message+="\n\n The CRON job will NOT be able to install the algorithm in the target VRE \n\n"+ info;
// return message;
// }
// public String getSubject() {
// if(this.isError()) {
// return this.getFailedSubject();
// } else {
// return this.getSuccessSubject();
// }
// }
//
// public String getBody() {
// if(this.isError()) {
// return this.getFailedBody();
// } else {
// return this.getSuccessBody();
// }
// }
}

View File

@ -0,0 +1,145 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import java.io.IOException;
import java.net.Authenticator;
import java.net.InetSocketAddress;
import java.net.PasswordAuthentication;
import java.net.Proxy;
import java.net.ProxySelector;
import java.net.SocketAddress;
import java.net.URI;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.apache.commons.configuration.reloading.FileChangedReloadingStrategy;
interface NetworkConfiguration {
public String getProxyHost();
public String getProxyPort();
public String getProxyUser();
public String getProxyPassword();
public String getNonProxyHosts();
}
class FileBasedProxyConfiguration implements NetworkConfiguration {
private static PropertiesConfiguration configuration;
public FileBasedProxyConfiguration(String path) {
try {
// load the configuration
configuration = new PropertiesConfiguration(path);
// set the reloading strategy to enable hot-configuration
FileChangedReloadingStrategy fcrs = new FileChangedReloadingStrategy();
configuration.setReloadingStrategy(fcrs);
} catch (ConfigurationException e) {
e.printStackTrace();
}
}
@Override
public String getProxyHost() {
return configuration.getString("proxyHost");
}
@Override
public String getProxyPort() {
return configuration.getString("proxyPort");
}
@Override
public String getProxyUser() {
return configuration.getString("proxyUser");
}
@Override
public String getProxyPassword() {
return configuration.getString("proxyPassword");
}
@Override
public String getNonProxyHosts() {
return configuration.getString("nonProxyHosts");
}
}
public class PropertiesBasedProxySelector extends ProxySelector {
List<Proxy> proxies = null;
List<String> nonProxyHosts = null;
public PropertiesBasedProxySelector(String proxySettingsPath) {
this(new FileBasedProxyConfiguration(proxySettingsPath));
}
public PropertiesBasedProxySelector(NetworkConfiguration config) {
if (config == null || config.getProxyHost() == null) {
this.proxies = null;
return;
}
String host = config.getProxyHost();
int port = 80;
if (config.getProxyPort() != null) {
port = Integer.valueOf(config.getProxyPort());
}
if (config.getNonProxyHosts() != null) {
this.nonProxyHosts = Arrays
.asList(config.getNonProxyHosts().split("\\|"));
}
this.proxies = new ArrayList<Proxy>();
this.proxies.add(new Proxy(Proxy.Type.HTTP, new InetSocketAddress(host,
port)));
if (config.getProxyUser() != null && config.getProxyPassword() != null) {
final String username = config.getProxyUser();
final String password = config.getProxyPassword();
// register default proxy credentials (guarding against a null password)
Authenticator.setDefault(new Authenticator() {
@Override
protected PasswordAuthentication getPasswordAuthentication() {
return new PasswordAuthentication(username, password.toCharArray());
}
});
}
}
@Override
public List<Proxy> select(URI uri) {
// no proxy configured at all: always connect directly
if (this.proxies == null) {
return Arrays.asList(Proxy.NO_PROXY);
}
if (this.nonProxyHosts != null && uri.getHost() != null) {
for (String entry : this.nonProxyHosts) {
entry = entry.trim();
// wildcard entries such as "*.d4science.org" match any sub-domain
if (entry.startsWith("*") && uri.getHost().endsWith(entry.substring(1))) {
return Arrays.asList(Proxy.NO_PROXY);
}
if (uri.getHost().equals(entry)) {
return Arrays.asList(Proxy.NO_PROXY);
}
}
}
return this.proxies;
}
@Override
public void connectFailed(URI uri, SocketAddress socketAddress, IOException e) {
}
}
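A self-contained sketch of how a selector like `PropertiesBasedProxySelector` is installed and behaves. The proxy host/port and the non-proxy entries below are illustrative assumptions, hard-coded instead of read from a properties file:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.ProxySelector;
import java.net.SocketAddress;
import java.net.URI;
import java.util.Arrays;
import java.util.List;

public class Main {

    // illustrative values; the real class reads these from proxy properties
    static final List<String> NON_PROXY_HOSTS = Arrays.asList("localhost", "*.d4science.org");
    static final List<Proxy> PROXIES = Arrays.asList(
            new Proxy(Proxy.Type.HTTP, InetSocketAddress.createUnresolved("proxy.example.org", 8080)));

    static final ProxySelector SELECTOR = new ProxySelector() {
        @Override
        public List<Proxy> select(URI uri) {
            String host = uri.getHost();
            for (String entry : NON_PROXY_HOSTS) {
                entry = entry.trim();
                // wildcard entries such as "*.d4science.org" match any sub-domain
                if (entry.startsWith("*") && host.endsWith(entry.substring(1))) {
                    return Arrays.asList(Proxy.NO_PROXY);
                }
                if (host.equals(entry)) {
                    return Arrays.asList(Proxy.NO_PROXY);
                }
            }
            return PROXIES;
        }

        @Override
        public void connectFailed(URI uri, SocketAddress sa, IOException e) {
            // nothing to do: callers fall back to the next proxy or fail
        }
    };

    public static void main(String[] args) {
        // install JVM-wide, exactly as the file-based selector would be
        ProxySelector.setDefault(SELECTOR);
        System.out.println(SELECTOR.select(URI.create("http://localhost/test")).get(0));      // DIRECT
        System.out.println(SELECTOR.select(URI.create("http://www.d4science.org/")).get(0));  // DIRECT
        System.out.println(SELECTOR.select(URI.create("http://example.com/")).get(0).type()); // HTTP
    }
}
```

Once `ProxySelector.setDefault(...)` runs, every `URL.openConnection()` in the JVM consults the selector; per-connection overrides are still possible via `url.openConnection(proxy)`.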


@ -0,0 +1,56 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.Configuration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNException;
import org.tmatesoft.svn.core.SVNURL;
import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
import org.tmatesoft.svn.core.io.SVNRepository;
import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
import org.tmatesoft.svn.core.wc.SVNWCUtil;
public class SVNRepositoryManager {
private SVNRepository svnRepository;
private static SVNRepositoryManager instance;
private Logger logger;
private SVNRepositoryManager (Configuration configuration) throws SVNException
{
this.logger = LoggerFactory.getLogger(SVNRepositoryManager.class);
org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.SVNRepository repository = configuration.getSVNRepository();
this.svnRepository = SVNRepositoryFactory.create(SVNURL.parseURIEncoded(repository.getBaseUrl()));
ISVNAuthenticationManager authManager = null;
if (repository.getUsername() == null)
{
this.logger.debug("Using SVN default credentials");
authManager = SVNWCUtil.createDefaultAuthenticationManager();
}
else
{
this.logger.debug("Using IS credentials");
authManager = SVNWCUtil.createDefaultAuthenticationManager(repository.getUsername(),repository.getPassword());
}
this.svnRepository.setAuthenticationManager(authManager);
}
// synchronized: the lazily-initialized singleton may be requested from multiple threads
public static synchronized SVNRepositoryManager getInstance(Configuration configuration) throws SVNException
{
if (instance == null) instance = new SVNRepositoryManager(configuration);
return instance;
}
public SVNRepository getSvnRepository() {
return svnRepository;
}
}


@ -0,0 +1,579 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.HashSet;
import java.util.LinkedList;
import java.util.List;
import java.util.Set;
import java.util.TimeZone;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.Configuration;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Algorithm;
import org.gcube.dataanalysis.dataminer.poolmanager.datamodel.Dependency;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.GenericException;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.SVNCommitException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.tmatesoft.svn.core.SVNCommitInfo;
import org.tmatesoft.svn.core.SVNErrorMessage;
import org.tmatesoft.svn.core.SVNException;
import org.tmatesoft.svn.core.SVNNodeKind;
import org.tmatesoft.svn.core.internal.wc.SVNFileUtil;
import org.tmatesoft.svn.core.internal.wc.admin.SVNChecksumInputStream;
import org.tmatesoft.svn.core.io.ISVNEditor;
import org.tmatesoft.svn.core.io.SVNRepository;
import org.tmatesoft.svn.core.io.diff.SVNDeltaGenerator;
/**
* Created by ggiammat on 5/9/17.
*/
public abstract class SVNUpdater {
private SVNRepository svnRepository;
private Configuration configuration;
private Logger logger;
public SVNUpdater(Configuration configuration) throws SVNException {
this.configuration = configuration;
this.svnRepository = SVNRepositoryManager.getInstance(configuration).getSvnRepository();
this.logger = LoggerFactory.getLogger(SVNUpdater.class);
}
// public void updateRPRotoDeps(Algorithm algorithm) {
// this.updateSVN(this.configuration.getSVNRProtoOSDepsList(), algorithm.getOSDependencies());
// this.updateSVN(this.configuration.getSVNRProtoCRANDepsList(), algorithm.getCranDependencies());
// this.updateSVN(this.configuration.getSVNRProtoGitHubDepsList(), algorithm.getGitHubDependencies());
// }
public String getDependencyFile(String language/*, String env*/)
{
return getDependencyFile(this.configuration,language);
}
private String getDependencyFile (Configuration configuration, String language)
{
this.logger.debug("Getting dependency file for language "+language);
switch (language)
{
case "R":
return configuration.getSVNCRANDepsList();
case "R-blackbox":
return configuration.getSVNRBDepsList();
case "Java":
return configuration.getSVNJavaDepsList();
case "Knime-Workflow":
return configuration.getSVNKWDepsList();
case "Knime-Workflow4.1":
return configuration.getSVNKW4_1DepsList();
case "Linux-compiled":
return configuration.getSVNLinuxCompiledDepsList();
case "Octave":
return configuration.getSVNOctaveDepsList();
case "Python":
return configuration.getSVNPythonDepsList();
case "Python3.6":
return configuration.getSVNPython3_6DepsList();
case "Pre-Installed":
return configuration.getSVNPreInstalledDepsList();
case "Windows-compiled":
return configuration.getSVNWCDepsList();
default:
return null;
}
}
public boolean updateSVNAlgorithmList(Algorithm algorithm, String targetVRE, String category, String algorithm_type, String user/*, String env*/)
{
return this.updateSVNAlgorithmList(this.configuration.getSVNAlgorithmsList(), algorithm, targetVRE, category, algorithm_type, user);
}
public void updateAlgorithmFiles(File a) throws SVNException, SVNCommitException{
//this.updateAlgorithmList(this.configuration.getSVNMainAlgoRepo(), a);
this.updateAlgorithmList(this.configuration.getRepository(), a);
}
private void updateAlgorithmList(String svnMainAlgoRepo, File algorithmsFile) throws SVNException, SVNCommitException
{
this.logger.debug("Adding .jar file: " + algorithmsFile + " to repository " + svnMainAlgoRepo);
try
{
// SVN paths always use '/' as separator, regardless of the local OS
if (fileExists(svnMainAlgoRepo + "/" + algorithmsFile.getName(), -1))
{
this.updateFile(new FileInputStream(algorithmsFile), svnMainAlgoRepo, algorithmsFile.getName());
}
else this.putFile(new FileInputStream(algorithmsFile), svnMainAlgoRepo, algorithmsFile.getName());
}
catch (FileNotFoundException e)
{
this.logger.error("Temporary algorithm file not found: this exception should not happen",e);
}
finally
{
this.svnRepository.closeSession();
}
}
public void putFile(FileInputStream fileInputStream, String destinationFolder, String fileName) throws SVNException, SVNCommitException
{
this.logger.debug("Putting new file on the SVN repository");
final ISVNEditor commitEditor = svnRepository.getCommitEditor("Add algorithm to list", null);
commitEditor.openRoot(-1);
commitEditor.openDir(destinationFolder, -1);
String filePath = destinationFolder + "/" + fileName;
commitEditor.addFile(filePath, null, -1);
commitEditor.applyTextDelta(filePath, null);
SVNDeltaGenerator deltaGenerator = new SVNDeltaGenerator();
String checksum = deltaGenerator.sendDelta(filePath, fileInputStream, commitEditor, true);
commitEditor.closeFile(filePath, checksum);
commitEditor.closeDir();
commitEditor.closeDir();
SVNCommitInfo info = commitEditor.closeEdit();
SVNErrorMessage errorMessage = info.getErrorMessage();
if (errorMessage != null)
{
this.logger.error("Operation failed: "+errorMessage.getFullMessage());
throw new SVNCommitException(errorMessage,fileName);
}
this.logger.debug("Operation completed");
}
public void updateFile(FileInputStream fileInputStream, String destinationFolder, String fileName) throws SVNException, SVNCommitException {
this.logger.debug("Updating existing file on the SVN repository");
final ISVNEditor commitEditor = svnRepository.getCommitEditor("Updating algorithm", null);
commitEditor.openRoot(-1);
commitEditor.openDir(destinationFolder, -1);
String filePath = destinationFolder + "/" + fileName;
// if (fileExists(filePath, -1)) { // updating existing file
commitEditor.openFile(filePath, -1);
//} else { // creating new file
//commitEditor.addFile(filePath, null, -1);
//}
commitEditor.applyTextDelta(filePath, null);
SVNDeltaGenerator deltaGenerator = new SVNDeltaGenerator();
String checksum = deltaGenerator.sendDelta(filePath, fileInputStream, commitEditor, true);
commitEditor.closeFile(filePath, checksum);
commitEditor.closeDir();
commitEditor.closeDir();
SVNCommitInfo info = commitEditor.closeEdit();
SVNErrorMessage errorMessage = info.getErrorMessage();
if (errorMessage != null)
{
this.logger.error("Operation failed: "+errorMessage.getFullMessage());
throw new SVNCommitException(errorMessage,fileName+" to be updated");
}
this.logger.debug("Operation completed");
}
public boolean fileExists(String path, long revision) throws SVNException {
SVNNodeKind kind = svnRepository.checkPath(path, revision);
return kind == SVNNodeKind.FILE;
}
private boolean updateSVNAlgorithmList(String file, Algorithm algorithm, String targetVRE, String category, String algorithm_type, String user/*, String env*/)
{
boolean response = false;
try {
this.logger.debug("Updating algorithm list: " + file);
final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
svnRepository.getFile(file, SVNRepository.INVALID_REVISION, null, byteArrayOutputStream);
String lines[] = byteArrayOutputStream.toString().split("\\r?\\n");
List<String> newContent = new LinkedList<>(Arrays.asList(lines));
// check if the algorithm is already in the list (match the class name) and delete the content
for (String l : lines) {
if (l.contains(algorithm.getClazz())) {
newContent.remove(l);
//System.out.println("Not updating algorithm list beacuse already present");
//return;
}
}
// the algorithm is either not in the list or must be overwritten because it was modified: add it
newContent.add(this.generateAlgorithmEntry(algorithm, targetVRE, category,algorithm_type/*, env*/));
// Collections.sort(newContent);
final SVNDeltaGenerator deltaGenerator = new SVNDeltaGenerator();
byte[] originalContents = byteArrayOutputStream.toByteArray();
final ISVNEditor commitEditor = svnRepository.getCommitEditor("update algorithm list", null);
commitEditor.openRoot(-1);
commitEditor.openFile(file, -1);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
for (String line : newContent) {
baos.write(line.getBytes());
baos.write("\n".getBytes());
}
byte[] bytes = baos.toByteArray();
commitEditor.applyTextDelta(file, md5(originalContents));
final String checksum = deltaGenerator.sendDelta(file, new ByteArrayInputStream(originalContents), 0,
new ByteArrayInputStream(bytes), commitEditor, true);
commitEditor.closeFile(file, checksum);
SVNCommitInfo info = commitEditor.closeEdit();
SVNErrorMessage errorMessage = info.getErrorMessage();
if (errorMessage != null)
{
this.logger.error("Operation failed: "+errorMessage.getFullMessage());
response = false;
}
else response = true;
}
catch (Exception ex)
{
this.logger.error("Unable to commit algorithm list",ex);
response = false;
}
finally
{
svnRepository.closeSession();
}
return response;
}
public String generateAlgorithmEntry(Algorithm algorithm, String targetVRE, String category, String algorithm_type/*,String env*/) throws ParseException {
//Timestamp timestamp = new Timestamp(System.currentTimeMillis());
//long unixTime = System.currentTimeMillis() / 1000L;
StringBuilder sb = new StringBuilder("| ");
sb.append(algorithm.getName() + " | ");
sb.append(algorithm.getFullname() + " | ");
sb.append(category + " | ");
sb.append("DataMinerPoolManager | ");
sb.append("<notextile>./addAlgorithm.sh " + algorithm.getName() + " " + algorithm.getCategory() + " "
+ algorithm.getClazz() + " " + targetVRE + " " + algorithm_type + " N "
+ algorithm.getPackageURL() + " \"" + algorithm.getDescription() + "\" </notextile> | ");
sb.append("none | ");
sb.append(this.getTimeZone() + " | ");
this.logger.info("Algo details: "+sb.toString());
return sb.toString();
}
public Collection<String> getUndefinedDependencies(String file, Collection<Dependency> deps) throws GenericException
{
try
{
// SendMail sm = new SendMail();
// NotificationHelper nh = new NotificationHelper();
List<String> undefined = new LinkedList<String>();
// TODO (next release): if no dependency file is configured for this language
// in service.properties, skip the check and return an empty list
if (file == null || file.isEmpty()) {
return undefined;
}
this.logger.debug("Checking dependencies list: " + file);
List<String> validDependencies = new LinkedList<String>();
for (String singlefile : CheckMethod.getFiles(file)) {
final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
svnRepository.getFile(singlefile, SVNRepository.INVALID_REVISION, null, byteArrayOutputStream);
for (String l : byteArrayOutputStream.toString().split("\\r?\\n")) {
validDependencies.add(l.trim());
}
}
this.logger.debug("Valid dependencies are: "+validDependencies);
for(Dependency d: deps){
String depName = d.getName();
if(!validDependencies.contains(depName)){
undefined.add(depName);
}
}
return undefined;
} catch (SVNException e)
{
throw new GenericException(e);
}
}
// Checks that every dependency in deps is listed in the given SVN file.
// (Method name kept as-is, typo included, to preserve the existing API.)
public boolean checkIfAvaialable(String file, Collection<Dependency> deps) throws SVNException {
boolean check = true;
try {
this.logger.info("Checking dependencies list: " + file);
final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
svnRepository.getFile(file, SVNRepository.INVALID_REVISION, null, byteArrayOutputStream);
List<String> lines = Arrays.asList(byteArrayOutputStream.toString().split("\\r?\\n"));
for (Dependency d : deps) {
if (lines.contains(d.getName())) {
this.logger.debug("The following dependency is correctly written: " + d.getName());
} else {
this.logger.warn("The following dependency is not in the list: " + d.getName());
check = false;
}
}
} catch (Exception a)
{
this.logger.error(a.getMessage(), a);
check = false;
}
return check;
}
public void updateSVN(String file, Collection<Dependency> deps) {
try {
this.logger.info("Updating dependencies list: " + file);
final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
svnRepository.getFile(file, SVNRepository.INVALID_REVISION, null, byteArrayOutputStream);
String lines[] = byteArrayOutputStream.toString().split("\\r?\\n");
List<String> ldep = new LinkedList<>();
for (Dependency d : deps) {
ldep.add(d.getName());
}
List<String> merged = this.checkMatch(lines, ldep);
Collections.sort(merged);
final SVNDeltaGenerator deltaGenerator = new SVNDeltaGenerator();
byte[] originalContents = byteArrayOutputStream.toByteArray();
final ISVNEditor commitEditor = svnRepository.getCommitEditor("update dependencies", null);
commitEditor.openRoot(-1);
commitEditor.openFile(file, -1);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
for (String line : merged) {
baos.write(line.getBytes());
baos.write("\n".getBytes());
}
byte[] bytes = baos.toByteArray();
commitEditor.applyTextDelta(file, md5(originalContents));
final String checksum = deltaGenerator.sendDelta(file, new ByteArrayInputStream(originalContents), 0,
new ByteArrayInputStream(bytes), commitEditor, true);
commitEditor.closeFile(file, checksum);
commitEditor.closeEdit();
} catch (Exception ex) {
this.logger.error("Unable to update dependencies list: " + file, ex);
}
finally {
svnRepository.closeSession();
}
}
public static String md5(byte[] contents) {
final byte[] tmp = new byte[1024];
final SVNChecksumInputStream checksumStream = new SVNChecksumInputStream(new ByteArrayInputStream(contents),
"md5");
try {
while (checksumStream.read(tmp) > 0) {
//
}
return checksumStream.getDigest();
} catch (IOException e) {
// never happens
e.printStackTrace();
return null;
} finally {
SVNFileUtil.closeFile(checksumStream);
}
}
public List<String> checkMatch(String[] lines, List<String> ls) {
Set<String> ss = new HashSet<String>(ls);
ss.addAll(Arrays.asList(lines));
return new ArrayList<>(ss);
}
// Despite its name, this returns the current timestamp formatted in UTC
// (e.g. "Thu Apr 16 14:18:04 UTC 2020")
public String getTimeZone() throws ParseException {
DateFormat formatter = new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy");
Date now = formatter.parse(Calendar.getInstance().getTime().toString());
formatter.setTimeZone(TimeZone.getTimeZone("UTC"));
this.logger.info(formatter.format(now));
return formatter.format(now);
}
public static void main(String[] args) throws SVNException, ParseException {
Calendar cal = Calendar.getInstance();
cal.getTime();
DateFormat formatter = new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy");
Date fromDate = formatter.parse(cal.getTime().toString());
TimeZone central = TimeZone.getTimeZone("UTC");
formatter.setTimeZone(central);
System.out.println(formatter.format(fromDate));
}
}


@ -0,0 +1,255 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util;
import static org.gcube.common.authorization.client.Constants.authorizationService;
import static org.gcube.resources.discovery.icclient.ICFactory.clientFor;
import static org.gcube.resources.discovery.icclient.ICFactory.queryFor;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;
import java.util.ArrayList;
import java.util.List;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.util.EntityUtils;
import org.gcube.common.authorization.client.exceptions.ObjectNotFound;
import org.gcube.common.authorization.library.AuthorizationEntry;
import org.gcube.common.authorization.library.provider.SecurityTokenProvider;
import org.gcube.common.resources.gcore.GCoreEndpoint;
import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.EMailException;
import org.gcube.resources.discovery.client.api.DiscoveryClient;
import org.gcube.resources.discovery.client.queries.api.SimpleQuery;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class SendMail {
private Logger logger = LoggerFactory.getLogger(SendMail.class);
private final String WRITE_MESSAGE_ADDRESS_PATH = "2/messages/write-message?gcube-token=",
USER_ROLES_ADDRESS_PATH = "2/users/get-usernames-by-role?role-name=DataMiner-Manager&gcube-token=",
SOCIAL_SERVICE_QUERY_CONDITION = "$resource/Profile/ServiceName/text() eq 'SocialNetworking'",
SOCIAL_SERVICE_URI = "jersey-servlet", JSON_MIME_TYPE = "application/json";
private String socialServiceAddress;
public SendMail() {
}
public void sendNotification(String subject, String body) throws EMailException {
logger.debug("SendNotification");
logger.debug("Notification Subject: " + subject);
logger.debug("Notification Body: " + body);
retrieveSocialService();
String postBody = createPostBody(subject, body);
String requestForMessage = getRequestMessage(WRITE_MESSAGE_ADDRESS_PATH);
sendPostRequest(requestForMessage, postBody);
}
private String createPostBody(String subject, String body) throws EMailException {
try {
List<String> recipientsList = getRecipients();
if (recipientsList == null || recipientsList.isEmpty()) {
logger.error("Invalid recipient list: " + recipientsList);
throw new EMailException("Unable to send email notification. Invalid recipient list:" + recipientsList);
}
// {"subject": "subject-content", "body": "body-content",
// "recipients":[{"id":"userid"}]}
JSONObject data = new JSONObject();
data.put("subject", subject);
data.put("body", body);
JSONArray recipients = new JSONArray();
for (String recipient : recipientsList) {
JSONObject d = new JSONObject();
d.put("id", recipient);
recipients.put(d);
}
data.put("recipients", recipients);
logger.info("Post Body: " + data);
return data.toString();
} catch (EMailException e) {
throw e;
} catch (Throwable e) {
logger.error("Error creating the notification body: " + e.getLocalizedMessage(), e);
throw new EMailException(e);
}
}
private void retrieveSocialService() throws EMailException {
try {
SimpleQuery query = queryFor(GCoreEndpoint.class);
query.addCondition(SOCIAL_SERVICE_QUERY_CONDITION);
DiscoveryClient<GCoreEndpoint> client = clientFor(GCoreEndpoint.class);
List<GCoreEndpoint> resources = client.submit(query);
// check the result before dereferencing: resources.get(0) on an empty list
// would throw an unhelpful IndexOutOfBoundsException
if (resources == null || resources.isEmpty()) {
throw new EMailException(
"Unable to send email notification. No SocialNetworking GCoreEndpoint found on the IS.");
}
socialServiceAddress = resources.get(0).profile().endpointMap().get(SOCIAL_SERVICE_URI).uri().toString();
logger.info("Retrieved Social Service Address: " + socialServiceAddress);
if (socialServiceAddress == null || socialServiceAddress.isEmpty()) {
throw new EMailException(
"Unable to send email notification. Invalid address in GCoreEndpoint resource on IS.");
}
} catch (EMailException e) {
logger.error(e.getLocalizedMessage(), e);
throw e;
} catch (Throwable e) {
logger.error(e.getLocalizedMessage(), e);
throw new EMailException(e);
}
}
private String getRequestMessage(String addressPath) {
StringBuilder requestMessageBuilder = new StringBuilder(socialServiceAddress);
if (!socialServiceAddress.endsWith("/"))
requestMessageBuilder.append('/');
requestMessageBuilder.append(addressPath).append(SecurityTokenProvider.instance.get());
String requestForMessage = requestMessageBuilder.toString();
logger.debug("Request " + requestForMessage);
return requestForMessage;
}
private String username(String token) throws ObjectNotFound, Exception {
AuthorizationEntry entry = authorizationService().get(token);
logger.debug(entry.getClientInfo().getId());
return entry.getClientInfo().getId();
}
private void sendPostRequest(String endpoint, String postBody) throws EMailException {
logger.info("Execute Post:" + endpoint);
logger.info("Post Body:" + postBody);
try {
// Send the request
URL url = new URL(endpoint);
URLConnection conn = url.openConnection();
conn.setRequestProperty("Accept", JSON_MIME_TYPE);
conn.setRequestProperty("Content-Type", JSON_MIME_TYPE);
conn.setDoOutput(true);
OutputStreamWriter writer = new OutputStreamWriter(conn.getOutputStream());
writer.write(postBody);
writer.flush();
// Get the response
StringBuffer answer = new StringBuffer();
BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
String line;
while ((line = reader.readLine()) != null) {
answer.append(line);
}
writer.close();
reader.close();
logger.debug("Operation completed");
String response = answer.toString();
logger.info("Notification Response: " + response);
checkResponse(response);
} catch (EMailException e) {
throw e;
} catch (MalformedURLException e) {
logger.error("Invalid URL: " + e.getLocalizedMessage(), e);
throw new EMailException(e);
} catch (IOException e) {
logger.error("Error in the IO process: " + e.getLocalizedMessage(), e);
throw new EMailException(e);
} catch (Throwable e) {
logger.error("Error executing post:" + e.getLocalizedMessage(), e);
throw new EMailException(e);
}
}
private void checkResponse(String response) throws EMailException {
if (response == null) {
logger.error("Invalid notification response: " + response);
throw new EMailException();// TODO
} else {
try {
JSONObject res = new JSONObject(response);
boolean success = res.getBoolean("success");
if (!success) {
String message = res.getString("message");
logger.error("Error sending email notification: " + message);
throw new EMailException("Error sending email notification: " + message);
}
} catch (JSONException e) {
logger.error("Invalid notification response: " + response);
throw new EMailException(e);
}
}
}
private List<String> getRecipients() {
try {
List<String> recipients = new ArrayList<String>();
String dataMinerManagers = retrieveDataMinerManagers();
logger.debug("Retrieved DataMiner Managers: " + dataMinerManagers);
if (dataMinerManagers != null && !dataMinerManagers.isEmpty()) {
JSONObject obj = new JSONObject(dataMinerManagers);
JSONArray data = obj.getJSONArray("result");
if (data != null) {
for (int i = 0; i < data.length(); i++) {
recipients.add(data.getString(i));
}
}
} else {
logger.info("Using the default admins as a workaround");
List<String> defaultManagers = DMPMClientConfiguratorManager.getInstance().getDefaultAdmins();
recipients.addAll(defaultManagers);
}
recipients.add(this.username(SecurityTokenProvider.instance.get()));
logger.info("Retrieved Recipients: " + recipients);
return recipients;
} catch (Exception e) {
logger.error("Error retrieving recipients: " + e.getLocalizedMessage(), e);
logger.info("Using the default admins as a workaround");
return DMPMClientConfiguratorManager.getInstance().getDefaultAdmins();
}
}
private String retrieveDataMinerManagers() throws Exception {
// Retrieves a url like:
// https://api.d4science.org/social-networking-library-ws/rest/2/users/get-usernames-by-role?role-name=DataMiner-Manager&gcube-token=xxx-xxxx-xxxx-xxx
String requestAdminsUrl = getRequestMessage(USER_ROLES_ADDRESS_PATH);
logger.info("Request Admins Url: " + requestAdminsUrl);
CloseableHttpClient client = HttpClientBuilder.create().build();
HttpGet getReq = new HttpGet(requestAdminsUrl);
getReq.setHeader("accept", JSON_MIME_TYPE);
getReq.setHeader("content-type", JSON_MIME_TYPE);
// execute the request only once and reuse the response body
String response = EntityUtils.toString(client.execute(getReq).getEntity());
logger.info("Response: " + response);
return response;
}
}


@ -0,0 +1,35 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.exception;
public class AlgorithmException extends DMPMException{
/**
*
*/
private static final long serialVersionUID = -5678597187512954288L;
private String algorithmName;
public AlgorithmException (String algorithmName)
{
super ("Algorithm exception");
this.algorithmName = algorithmName;
}
public AlgorithmException (String algorithmName, Throwable cause)
{
super ("Algorithm exception", cause);
this.algorithmName = algorithmName;
}
@Override
public String getErrorMessage() {
return "Installation completed but DataMiner Interface not working correctly, or files "
+ this.algorithmName + ".jar and " + this.algorithmName
+ "_interface.jar not available at the expected path";
}
}


@ -0,0 +1,20 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.exception;
public abstract class DMPMException extends Exception{
/**
*
*/
private static final long serialVersionUID = 1L;
public DMPMException (String errorMessage)
{
super (errorMessage);
}
public DMPMException(String errorMessage,Throwable cause) {
super (errorMessage,cause);
}
public abstract String getErrorMessage ();
}


@ -0,0 +1,28 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.exception;
public class EMailException extends Exception {
private static final String MESSAGE = "Unable to send email notification";
/**
*
*/
private static final long serialVersionUID = 1L;
public EMailException() {
super(MESSAGE);
}
public EMailException(String message) {
super(message);
}
public EMailException(String message, Throwable e) {
super(message, e);
}
public EMailException(Throwable e) {
super(MESSAGE, e);
}
}


@ -0,0 +1,26 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.exception;
public class GenericException extends DMPMException {
/**
*
*/
private static final long serialVersionUID = 6772009633547404120L;
public GenericException(Throwable cause) {
super ("Generic exception",cause);
}
@Override
public String getErrorMessage() {
return this.getCause().getMessage();
}
}


@ -0,0 +1,44 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.exception;
import org.tmatesoft.svn.core.SVNErrorMessage;
public class SVNCommitException extends DMPMException {
/**
*
*/
private static final long serialVersionUID = -5225403308313619585L;
private SVNErrorMessage svnErrorMessage;
private String fileName;
public SVNCommitException(SVNErrorMessage errorMessage, String fileName) {
super ("Unable to commit");
this.svnErrorMessage = errorMessage;
this.fileName = fileName;
}
public SVNCommitException(String message,SVNErrorMessage errorMessage,String fileName) {
super (message);
this.svnErrorMessage = errorMessage;
this.fileName = fileName;
}
public SVNErrorMessage getSvnErrorMessage() {
return svnErrorMessage;
}
@Override
public String getErrorMessage() {
return "Commit operation failed for " + this.fileName
+ "; the message of the SVN server is the following:\n" + this.svnErrorMessage.getMessage();
}
}


@ -0,0 +1,105 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import java.io.File;

import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.CheckMethod;
import org.gcube.dataanalysis.dataminer.poolmanager.util.exception.SVNCommitException;
import org.tmatesoft.svn.core.SVNException;

import com.jcraft.jsch.JSchException;
import com.jcraft.jsch.SftpException;

public class CheckMethodProduction extends CheckMethod {

    public CheckMethodProduction() {
        super(DMPMClientConfiguratorManager.getInstance().getProductionConfiguration());
    }

    @Override
    protected void copyFromDmToSVN(File a) throws SVNException, SVNCommitException, JSchException, SftpException {
        super.copyFromDmToSVN(a, new SVNUpdaterProduction());
    }

    public static void main(String[] args) throws Exception {
        // ServiceConfiguration a = new ServiceConfiguration();
        // System.out.println(a.getStagingHost());
        CheckMethodProduction a = new CheckMethodProduction();
        // a.getFiles("/trunk/data-analysis/RConfiguration/RPackagesManagement/r_deb_pkgs.txt, /trunk/data-analysis/RConfiguration/RPackagesManagement/r_cran_pkgs.txt, /trunk/data-analysis/RConfiguration/RPackagesManagement/r_github_pkgs.txt");
        // File aa = new File("OCTAVEBLACKBOX.jar");
        // System.out.println(aa.getName());
        // System.out.println(aa.getPath());
        // a.copyFromDmToSVN(aa);
        // if (a.checkMethod("dataminer-ghost-d.dev.d4science.org", "708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548")) {
        //     System.out.println("AAA");
        // }
        // if (a.doesExist("/home/gcube/wps_algorithms/algorithms/WINDOWS_BLACK_BOX_EXAMPLE.jar")) {
        //     System.out.println("BBBB");
        // }
        // if (a.doesExist("/home/gcube/wps_algorithms/algorithms/WINDOWS_BLACK_BOX_EXAMPLE_interface.jar")) {
        //     System.out.println("CCCC");
        // }
        // File aa = new File("/home/gcube/wps_algorithms/algorithms/RBLACKBOX_interface.jar");
        // a.copyFromDmToSVN(aa, "Dev");
        // System.out.println(a.checkMethod("dataminer-ghost-t.pre.d4science.org",
        //         "2eceaf27-0e22-4dbe-8075-e09eff199bf9-98187548"));
        // System.out.println(a.checkMethod("dataminer-proto-ghost.d4science.org",
        //         "3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462"));
        try {
            a.checkMethod("dataminer-ghost-d.dev.d4science.org",
                    "708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Algorithm aa = new Algorithm();
        // aa.setName("UDPIPE_WRAPPER");
        // System.out.println(a.algoExists(aa));
        // ServiceConfiguration bp = new ServiceConfiguration();
        // SecurityTokenProvider.instance.set("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
        // if (a.checkMethod(bp.getStagingHost(), SecurityTokenProvider.instance.get()) && a.algoExists(aa)) {
        //     System.out.println("ciao");
        // }
        // Algorithm al = new Algorithm();
        // al.setName("UDPIPE_WRAPPER");
        // a.deleteFiles(al);
    }

}


@@ -0,0 +1,97 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import java.io.File;

import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.CheckMethod;

public class CheckMethodStaging extends CheckMethod {

    public CheckMethodStaging() {
        super(DMPMClientConfiguratorManager.getInstance().getStagingConfiguration());
    }

    @Override
    protected void copyFromDmToSVN(File a) throws Exception {
        super.copyFromDmToSVN(a, new SVNUpdaterStaging());
    }

    public static void main(String[] args) throws Exception {
        // ServiceConfiguration a = new ServiceConfiguration();
        // System.out.println(a.getStagingHost());
        CheckMethodStaging a = new CheckMethodStaging();
        // a.getFiles("/trunk/data-analysis/RConfiguration/RPackagesManagement/r_deb_pkgs.txt, /trunk/data-analysis/RConfiguration/RPackagesManagement/r_cran_pkgs.txt, /trunk/data-analysis/RConfiguration/RPackagesManagement/r_github_pkgs.txt");
        // File aa = new File("OCTAVEBLACKBOX.jar");
        // System.out.println(aa.getName());
        // System.out.println(aa.getPath());
        // a.copyFromDmToSVN(aa);
        // if (a.checkMethod("dataminer-ghost-d.dev.d4science.org", "708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548")) {
        //     System.out.println("AAA");
        // }
        // if (a.doesExist("/home/gcube/wps_algorithms/algorithms/WINDOWS_BLACK_BOX_EXAMPLE.jar")) {
        //     System.out.println("BBBB");
        // }
        // if (a.doesExist("/home/gcube/wps_algorithms/algorithms/WINDOWS_BLACK_BOX_EXAMPLE_interface.jar")) {
        //     System.out.println("CCCC");
        // }
        // File aa = new File("/home/gcube/wps_algorithms/algorithms/RBLACKBOX_interface.jar");
        // a.copyFromDmToSVN(aa, "Dev");
        // System.out.println(a.checkMethod("dataminer-ghost-t.pre.d4science.org",
        //         "2eceaf27-0e22-4dbe-8075-e09eff199bf9-98187548"));
        // System.out.println(a.checkMethod("dataminer-proto-ghost.d4science.org",
        //         "3a23bfa4-4dfe-44fc-988f-194b91071dd2-843339462"));
        try {
            a.checkMethod("dataminer-ghost-d.dev.d4science.org",
                    "708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Algorithm aa = new Algorithm();
        // aa.setName("UDPIPE_WRAPPER");
        // System.out.println(a.algoExists(aa));
        // ServiceConfiguration bp = new ServiceConfiguration();
        // SecurityTokenProvider.instance.set("708e7eb8-11a7-4e9a-816b-c9ed7e7e99fe-98187548");
        // if (a.checkMethod(bp.getStagingHost(), SecurityTokenProvider.instance.get()) && a.algoExists(aa)) {
        //     System.out.println("ciao");
        // }
        // Algorithm al = new Algorithm();
        // al.setName("UDPIPE_WRAPPER");
        // a.deleteFiles(al);
    }

}


@@ -0,0 +1,14 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.ClusterBuilder;

public class ClusterBuilderProduction extends ClusterBuilder {

    public ClusterBuilderProduction() {
        super(DMPMClientConfiguratorManager.getInstance().getProductionConfiguration());
    }

}


@@ -0,0 +1,13 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.ClusterBuilder;

public class ClusterBuilderStaging extends ClusterBuilder {

    public ClusterBuilderStaging() {
        super(DMPMClientConfiguratorManager.getInstance().getStagingConfiguration());
    }

}


@@ -0,0 +1,19 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import org.gcube.dataanalysis.dataminer.poolmanager.util.NotificationHelper;

public class NotificationHelperProduction extends NotificationHelper {

    public NotificationHelperProduction() {
        super("[DataMinerGhostProductionInstallationRequestReport]");
    }

}


@@ -0,0 +1,18 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import org.gcube.dataanalysis.dataminer.poolmanager.util.NotificationHelper;

public class NotificationHelperStaging extends NotificationHelper {

    public NotificationHelperStaging() {
        super("[DataMinerGhostStagingInstallationRequestReport]");
    }

}


@@ -0,0 +1,21 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.SVNUpdater;
import org.tmatesoft.svn.core.SVNException;

/**
 * Created by ggiammat on 5/9/17.
 */
public class SVNUpdaterProduction extends SVNUpdater {

    public SVNUpdaterProduction() throws SVNException {
        super(DMPMClientConfiguratorManager.getInstance().getProductionConfiguration());
    }

}


@@ -0,0 +1,21 @@
package org.gcube.dataanalysis.dataminer.poolmanager.util.impl;

import org.gcube.dataanalysis.dataminer.poolmanager.clients.configuration.DMPMClientConfiguratorManager;
import org.gcube.dataanalysis.dataminer.poolmanager.util.SVNUpdater;
import org.tmatesoft.svn.core.SVNException;

/**
 * Created by ggiammat on 5/9/17.
 */
public class SVNUpdaterStaging extends SVNUpdater {

    public SVNUpdaterStaging() throws SVNException {
        super(DMPMClientConfiguratorManager.getInstance().getStagingConfiguration());
    }

}
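Every `*Staging`/`*Production` pair in this changeset follows the same pattern: the subclass adds no behavior and exists only to bind one configuration in its constructor. A self-contained sketch of that pattern — `EnvConfiguration`, `ConfiguratorManager`, and `Updater` are hypothetical stand-ins for the gCube types, not the real API:

```java
// Hypothetical stand-in for the configuration type returned by
// DMPMClientConfiguratorManager in the real code.
interface EnvConfiguration {
    String algorithmsList();
}

// Stand-in for the singleton that hands out per-environment configurations.
final class ConfiguratorManager {
    static EnvConfiguration stagingConfiguration() {
        return () -> "dev/algorithms";
    }

    static EnvConfiguration productionConfiguration() {
        return () -> "prod/algorithms";
    }
}

// Shared superclass: all environment-independent logic lives here.
abstract class Updater {
    private final EnvConfiguration configuration;

    Updater(EnvConfiguration configuration) {
        this.configuration = configuration;
    }

    String targetAlgorithmsList() {
        return configuration.algorithmsList();
    }
}

// The subclass only fixes which configuration the superclass receives.
final class UpdaterStaging extends Updater {
    UpdaterStaging() {
        super(ConfiguratorManager.stagingConfiguration());
    }
}

public class EnvSelectionDemo {
    public static void main(String[] args) {
        System.out.println(new UpdaterStaging().targetAlgorithmsList());
    }
}
```

The design keeps environment selection out of the shared logic: callers instantiate the staging or production subclass and everything downstream reads from the bound configuration.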


@@ -0,0 +1,4 @@
#---
#dependencies:
# <dependencies>
# - { role: digest }


@@ -0,0 +1,4 @@
# tasks file for r
---
- name: Install custom algorithm <name>
command: Rscript --slave --no-save --no-restore-history -e "require(devtools); require(methods); install_github('<name>');"


@@ -0,0 +1,4 @@
1=lucio.lelii
2=roberto.cirillo
3=gianpaolo.coro
4=giancarlo.panichi


@@ -0,0 +1,25 @@
#YML node file
SVN_REPO = https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/RConfiguration/RPackagesManagement/
svn.repository = https://svn.d4science.research-infrastructures.eu/gcube
svn.algo.main.repo = /trunk/data-analysis/DataMinerConfiguration/algorithms
#STAGE
STAGE_GHOST = dataminer-ghost-d.dev.d4science.org
svn.stage.algorithms-list = /trunk/data-analysis/DataMinerConfiguration/algorithms/dev/algorithms
svn.stage.deps-linux-compiled = /trunk/data-analysis/RConfiguration/RPackagesManagement/r_deb_pkgs.txt
svn.stage.deps-pre-installed = /trunk/data-analysis/RConfiguration/RPackagesManagement/r_deb_pkgs.txt
svn.stage.deps-r-blackbox = /trunk/data-analysis/RConfiguration/RPackagesManagement/r_cran_pkgs.txt
svn.stage.deps-r = /trunk/data-analysis/RConfiguration/RPackagesManagement/r_cran_pkgs.txt
svn.stage.deps-java =
svn.stage.deps-knime-workflow =
svn.stage.deps-knime-workflow4_1 =
svn.stage.deps-octave =
svn.stage.deps-python =
svn.stage.deps-python3_6 =
svn.stage.deps-windows-compiled =
#PROD
svn.prod.algorithms-list = /trunk/data-analysis/DataMinerConfiguration/algorithms/prod/algorithms
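Keys in this file follow standard `java.util.Properties` syntax, so the service can read them with `Properties.load` (whitespace around `=` is ignored, and keys left empty, like `svn.stage.deps-java`, load as empty strings). A sketch with two keys from the file inlined; in the service they would presumably be loaded from the deployed file rather than a string:

```java
import java.io.StringReader;
import java.util.Properties;

public class ServicePropsDemo {
    public static void main(String[] args) throws Exception {
        // Two keys copied from the configuration file above.
        String cfg =
            "svn.repository = https://svn.d4science.research-infrastructures.eu/gcube\n" +
            "svn.stage.algorithms-list = /trunk/data-analysis/DataMinerConfiguration/algorithms/dev/algorithms\n";

        Properties props = new Properties();
        props.load(new StringReader(cfg));

        // Whitespace around '=' is stripped by Properties.load.
        System.out.println(props.getProperty("svn.repository"));
    }
}
```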


@@ -0,0 +1,4 @@
dependencies:
# - { role: gcube-dataminer }
# - { role: os-unzip }
# - { role: os-java-1.7.0 }


@@ -0,0 +1,32 @@
#---
#- name: remove previous installer (if any)
# file:
# path: /home/gcube/algorithmInstaller
# state: absent
#- name: remove previous installer.zip (if any)
# file:
# path: /home/dpm/algorithmInstaller.zip
# state: absent
#- name: download the installer zip
# get_url:
# url: https://svn.research-infrastructures.eu/public/d4science/gcube/trunk/data-analysis/DataminerAlgorithmsInstaller/package/algorithmInstaller.zip
# dest: /home/dpm/algorithmInstaller.zip
# validate_certs: no
#- name: download and unzip the package
# unarchive:
# src: https://svn.research-infrastructures.eu/public/d4science/gcube/trunk/data-analysis/DataminerAlgorithmsInstaller/package/algorithmInstaller1_1.zip
## src: http://maven.research-infrastructures.eu:8081/nexus/service/local/artifact/maven/redirect?r=gcube-snapshots&g=org.gcube.dataanalysis&a=dataminer-algorithms-importer&v=1.1.1-SNAPSHOT&e=tar.gz
# dest: /home/gcube
# remote_src: yes
# validate_certs: no
#- name: change flags
# file:
# path: /home/gcube/algorithmInstaller
# mode: 0777
# state: directory
# recurse: yes


@@ -0,0 +1,2 @@
dependencies:
- { role: gcube-ghn }


@@ -0,0 +1,4 @@
---
- name: Install the 'DATAMINER' package
shell: echo 'installing DATAMINER'


@@ -0,0 +1,7 @@
---
- name: Install the 'GHN' package
shell: echo 'installing GHN'
# TODO: handler to start container
# TODO: handler to stop container

Some files were not shown because too many files have changed in this diff.