Compare commits

...

81 Commits

Author SHA1 Message Date
Giancarlo Panichi f771d4464f Updated Changelog 2022-04-01 10:25:36 +02:00
Giancarlo Panichi 4e22245042 Changed DatastapaceManager logs 2022-03-31 16:18:26 +02:00
Giancarlo Panichi 81c260646d Updated write exclusion behavior 2022-03-25 12:47:26 +01:00
Giancarlo Panichi 6d74cf7fc3 Updated to fix protocol parameters when persistence is disabled 2022-03-21 15:57:29 +01:00
Giancarlo Panichi 5f1fee5ca9 Minor updated 2022-03-15 17:02:12 +01:00
Giancarlo Panichi a308f08c93 Updated bom version for release 2022-03-15 17:01:33 +01:00
Giancarlo Panichi 5e3252ef5b Fixed pom for release 2022-03-15 11:12:16 +01:00
Giancarlo Panichi e76593263b Merge pull request 'feature/22700' (!1) from feature/22700 into master
Reviewed-on: #1
2022-03-15 11:09:05 +01:00
Giancarlo Panichi 6ecfcc9a13 Updated for release 2022-03-15 11:04:40 +01:00
Giancarlo Panichi 71361eb435 Updated gcube-bom to 2.1.0-SNAPSHOT for storagehub 2.0.0 2022-01-27 18:33:07 +01:00
Giancarlo Panichi 5f9681048f ref 22700: DataMiner - Check max computations limit
Updated max computations parameter check.
2022-01-24 18:27:56 +01:00
Roberto Cirillo fbade3e930 Update 'CHANGELOG.md'
snapshot removed from CHANGELOG
2021-10-08 17:30:01 +02:00
Roberto Cirillo 9e26a0680a Update 'pom.xml'
snapshot removed from pom
2021-10-08 17:29:36 +02:00
Roberto Cirillo 106a03dab4 Update 'CHANGELOG.md'
add SNAPSHOT to 1.7.1 version
2021-10-06 12:04:25 +02:00
Roberto Cirillo 403b9c2b2e Update 'pom.xml'
add SNAPSHOT to the version
2021-10-06 12:03:56 +02:00
Giancarlo Panichi 84be7e4fe7 Updated for Release Next 2021-06-15 15:40:52 +02:00
Giancarlo Panichi 193885a1a3 Updated for accounting-lib 2021-06-15 15:31:18 +02:00
Giancarlo Panichi be510cbda0 Updated for Release Next 2021-06-15 15:20:40 +02:00
Giancarlo Panichi 1e0b6b4d02 Rebuild SNAPSHOT 2021-06-15 15:14:36 +02:00
Giancarlo Panichi dbb87e55f7 ref 20971: ShortLink - Check for obsolete short urls
https://support.d4science.org/issues/20971

Fixed ShortLink urls.
2021-05-25 11:14:58 +02:00
Giancarlo Panichi 2d982938ce ref 20971: ShortLink - Check for obsolete short urls
https://support.d4science.org/issues/20971

Fixed ShortLink urls.
2021-05-25 10:50:17 +02:00
Giancarlo Panichi 86f5e9de17 ref 20971: ShortLink - Check for obsolete short urls
https://support.d4science.org/issues/20971

Fixed ShortLink urls.
2021-05-25 10:24:59 +02:00
Giancarlo Panichi 33b08ac966 ref 20971: ShortLink - Check for obsolete short urls
https://support.d4science.org/issues/20971

Fixed ShortLink urls.
2021-05-24 18:45:35 +02:00
Giancarlo Panichi a24383b37e ref 20971: ShortLink - Check for obsolete short urls
https://support.d4science.org/issues/20971

Fixed ShortLink urls.
2021-05-24 18:22:21 +02:00
Giancarlo Panichi e0ad80592d ref 20971: ShortLink - Check for obsolete short urls
https://support.d4science.org/issues/20971

Fixed ShortLink urls.
2021-05-24 18:13:29 +02:00
Roberto Cirillo 6334bb4224 Update 'pom.xml'
removed snapshot
2021-01-20 15:59:21 +01:00
Roberto Cirillo d1f61f1693 Update 'pom.xml'
added SNAPSHOT
2021-01-19 17:02:01 +01:00
Roberto Cirillo a1ae32f437 Update 'pom.xml'
update ecological-engine-smart-executor lower bound range
2021-01-19 16:50:15 +01:00
Roberto Cirillo cde6155c81 Update 'pom.xml'
update ecological-engine-external-algorithms lower bound range
2021-01-19 09:46:31 +01:00
lucio.lelii 2b719a7633 pom updated to remove old maven repositories 2021-01-18 20:01:26 +01:00
lucio.lelii d42448591d pom updated 2020-11-30 17:48:03 +01:00
lucio.lelii d9a6eb21be - EnviromentalVariableManager reverted
- import range changed for ecological-engine libraries
2020-11-30 17:39:25 +01:00
user1 279535c13a config Path can be set from EnvManager 2020-11-20 16:25:19 +01:00
Giancarlo Panichi 0a1de08b27 Updated descriptor.xml 2020-06-11 14:30:40 +02:00
Giancarlo Panichi a6e006439e ref 19423: DataMiner - Update DataMiner Service in Dev for support https
https://support.d4science.org/issues/19423

 Updated for support https protocol
2020-06-10 16:53:51 +02:00
Giancarlo Panichi 806effbd6f Renamed CHANGELOG.md 2020-06-10 15:21:32 +02:00
Giancarlo Panichi 6621bb20d9 Renamed CHANGELOG.md 2020-06-10 14:54:38 +02:00
Giancarlo Panichi 45ecff015d Renamed CHANGELOG.md 2020-06-10 14:44:17 +02:00
Giancarlo Panichi a24ecf41cb Renamed CHANGELOG.md 2020-06-10 14:42:03 +02:00
roberto cirillo 577a564b4d update to version 1.6.0:
removed snapshot, edit changelog.md
2020-05-14 17:13:57 +02:00
roberto cirillo cc1cece20e Merge branch 'master' of https://code-repo.d4science.org/gCubeSystem/dataminer.git 2020-05-12 15:23:53 +02:00
roberto cirillo c573360e2a added shub retry on InputManager class, getLocalFile method 2020-05-12 15:23:26 +02:00
roberto cirillo fdffbd6063 reverting retry in order to push another retry feature
Revert "Added Retry in Input Parameter URL retrieve for fix StorageHub sync."

This reverts commit 4d6318df92.
2020-05-12 15:18:23 +02:00
Giancarlo Panichi 4d6318df92 Added Retry in Input Parameter URL retrieve for fix StorageHub sync. 2020-05-12 11:24:31 +02:00
Giancarlo Panichi 34083f1357 ref 18289: the latest wps doesn't work
https://support.d4science.org/issues/18289

Rebuild Snapshot
2020-04-23 11:22:24 +02:00
Giancarlo Panichi a5223ecb43 Updated to Git and Jenkins 2019-11-28 18:50:01 +01:00
Giancarlo Panichi 08a56c02dd Updated to Git and Jenkins 2019-11-28 11:09:46 +01:00
Giancarlo Panichi c2acc48494 Updated to Git and Jenkins 2019-11-26 15:55:00 +01:00
Giancarlo Panichi 9360f1eaa5 Updated to support Git and Jenkins 2019-11-26 15:13:55 +01:00
Giancarlo Panichi a11cb3647f Updated to support Git and Jenkins 2019-11-26 15:11:49 +01:00
Giancarlo Panichi ea88169e41 Updated to Git and Jenkins 2019-11-25 16:15:04 +01:00
Giancarlo Panichi 6b8d28873b Updated to Git and Jenkins 2019-11-25 15:52:32 +01:00
Giancarlo Panichi f16dbd2f71 Updated to Git and Jenkins 2019-11-25 11:58:58 +01:00
Giancarlo Panichi 81ddc263d1 Updated to Git and Jenkins 2019-11-21 17:32:37 +01:00
Giancarlo Panichi 88bc383f3c Updated Pom to build the SNAPSHOT 2019-11-21 14:23:41 +01:00
Giancarlo Panichi 8d0d481e14 Updated to support Git and Jenkins 2019-11-20 17:23:42 +01:00
Giancarlo Panichi 9891b206a5 Updated to support Git and Jenkins 2019-11-20 17:19:23 +01:00
Giancarlo Panichi 790cff989c ref 18096: DataMiner - HTML file does not open in view widget but in a new browser window
https://support.d4science.org/issues/18096

Fixed Content-Type in output files

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@182280 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-11-20 15:43:27 +00:00
Giancarlo Panichi 4d653662ce ref 17659: DataMiner - DataMiner Service - DataMiner service must create https links instead of http links for output parameters to avoid problems with the portal that uses https.
https://support.d4science.org/issues/17659

Fixed https link for output parameter

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@181938 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-10-03 13:26:30 +00:00
Giancarlo Panichi 020c621a34 ref 13024: DataMiner - The service must support the https protocol
https://support.d4science.org/issues/13024

Updated DataMiner now support https

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@181937 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-10-03 13:14:53 +00:00
Lucio Lelii f47693f27a git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@179537 82a268e6-3cf1-43bd-a215-b396298e98cf 2019-05-23 14:48:31 +00:00
Lucio Lelii ed556a9960 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@179518 82a268e6-3cf1-43bd-a215-b396298e98cf 2019-05-22 15:45:21 +00:00
Lucio Lelii 9d6794ffde git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@179069 82a268e6-3cf1-43bd-a215-b396298e98cf 2019-04-17 15:36:38 +00:00
Lucio Lelii 0adfa5b959 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@178855 82a268e6-3cf1-43bd-a215-b396298e98cf 2019-04-04 16:48:29 +00:00
Lucio Lelii fb6c980623 Added possibility to exclude single or all user to write to workspace
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@178710 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-03-27 17:34:46 +00:00
Giancarlo Panichi fc5e616101 Fixed Jersey version in dependencies
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@176633 82a268e6-3cf1-43bd-a215-b396298e98cf
2019-01-17 11:11:46 +00:00
Giancarlo Panichi d3080c4052 ref 13024: DataMiner - The service must support the https protocol
https://support.d4science.org/issues/13024

Updated DataMiner now support https

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@174936 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-12-17 15:07:07 +00:00
Giancarlo Panichi 34d131b900 Aligned Trunk to Branch
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@174213 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-11-19 09:03:18 +00:00
Lucio Lelii 60ccac1784 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@174113 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-11-12 17:16:15 +00:00
Giancarlo Panichi 26ad1e8cc9 ref 12703: Public pages visualised by data miner services and proxies
https://support.d4science.org/issues/12703

Updated DataMiner service metadata info

git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173657 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-18 15:38:14 +00:00
Lucio Lelii bf7e31697d git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173645 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-18 08:00:09 +00:00
Lucio Lelii 3d509ae807 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173523 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-16 13:57:23 +00:00
Lucio Lelii 9013721e12 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173518 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-16 13:27:01 +00:00
Lucio Lelii 67fbb1f724 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173508 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-16 12:35:16 +00:00
Lucio Lelii fdda6ce838 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173507 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-16 12:11:26 +00:00
Lucio Lelii f6b3253459 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173502 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-16 09:57:30 +00:00
Lucio Lelii 8e7edbb075 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173499 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-15 17:41:21 +00:00
Lucio Lelii 30749c37e3 git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173393 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-10-12 14:19:28 +00:00
Lucio Lelii 908e7e57f4 EnvironmentVariableManager added
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173383 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-12 13:53:36 +00:00
Lucio Lelii 607f49125c Removed special char "["and "]" from output omputation name
git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@173249 82a268e6-3cf1-43bd-a215-b396298e98cf
2018-10-11 16:57:04 +00:00
Lucio Lelii c6676795ce git-svn-id: https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner@169188 82a268e6-3cf1-43bd-a215-b396298e98cf 2018-06-14 09:29:15 +00:00
76 changed files with 1686 additions and 925 deletions

.classpath (3 changed lines) Normal file → Executable file

@@ -14,17 +14,20 @@
<classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
<attributes>
<attribute name="maven.pomderived" value="true"/>
<attribute name="org.eclipse.jst.component.nondependency" value=""/>
</attributes>
</classpathentry>
<classpathentry kind="src" output="target/test-classes" path="src/test/java">
<attributes>
<attribute name="optional" value="true"/>
<attribute name="maven.pomderived" value="true"/>
<attribute name="test" value="true"/>
</attributes>
</classpathentry>
<classpathentry excluding="**" kind="src" output="target/test-classes" path="src/test/resources">
<attributes>
<attribute name="maven.pomderived" value="true"/>
<attribute name="test" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8">

.gitignore (1 changed line) vendored Executable file

@@ -0,0 +1 @@
/target/

.project (13 changed lines) Normal file → Executable file

@@ -5,11 +5,21 @@
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.validation.validationbuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.m2e.core.maven2Builder</name>
<arguments>
@@ -17,7 +27,10 @@
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jem.workbench.JavaEMFNature</nature>
<nature>org.eclipse.wst.common.modulecore.ModuleCoreNature</nature>
<nature>org.eclipse.m2e.core.maven2Nature</nature>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
</natures>
</projectDescription>

.settings/.gitignore (2 changed lines) vendored Normal file

@@ -0,0 +1,2 @@
/org.eclipse.ltk.core.refactoring.prefs
/org.eclipse.wst.xsl.core.prefs

.settings/org.eclipse.core.resources.prefs (0 changed lines) Normal file → Executable file

.settings/org.eclipse.jdt.core.prefs (97 changed lines) Normal file → Executable file

@@ -1,12 +1,109 @@
eclipse.preferences.version=1
org.eclipse.jdt.core.compiler.annotation.inheritNullAnnotations=disabled
org.eclipse.jdt.core.compiler.annotation.missingNonNullByDefaultAnnotation=ignore
org.eclipse.jdt.core.compiler.annotation.nonnull=org.eclipse.jdt.annotation.NonNull
org.eclipse.jdt.core.compiler.annotation.nonnull.secondary=
org.eclipse.jdt.core.compiler.annotation.nonnullbydefault=org.eclipse.jdt.annotation.NonNullByDefault
org.eclipse.jdt.core.compiler.annotation.nonnullbydefault.secondary=
org.eclipse.jdt.core.compiler.annotation.nullable=org.eclipse.jdt.annotation.Nullable
org.eclipse.jdt.core.compiler.annotation.nullable.secondary=
org.eclipse.jdt.core.compiler.annotation.nullanalysis=disabled
org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
org.eclipse.jdt.core.compiler.codegen.methodParameters=do not generate
org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8
org.eclipse.jdt.core.compiler.codegen.unusedLocal=preserve
org.eclipse.jdt.core.compiler.compliance=1.8
org.eclipse.jdt.core.compiler.debug.lineNumber=generate
org.eclipse.jdt.core.compiler.debug.localVariable=generate
org.eclipse.jdt.core.compiler.debug.sourceFile=generate
org.eclipse.jdt.core.compiler.problem.annotationSuperInterface=warning
org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
org.eclipse.jdt.core.compiler.problem.autoboxing=ignore
org.eclipse.jdt.core.compiler.problem.comparingIdentical=warning
org.eclipse.jdt.core.compiler.problem.deadCode=warning
org.eclipse.jdt.core.compiler.problem.deprecation=warning
org.eclipse.jdt.core.compiler.problem.deprecationInDeprecatedCode=disabled
org.eclipse.jdt.core.compiler.problem.deprecationWhenOverridingDeprecatedMethod=disabled
org.eclipse.jdt.core.compiler.problem.discouragedReference=warning
org.eclipse.jdt.core.compiler.problem.emptyStatement=ignore
org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
org.eclipse.jdt.core.compiler.problem.explicitlyClosedAutoCloseable=ignore
org.eclipse.jdt.core.compiler.problem.fallthroughCase=ignore
org.eclipse.jdt.core.compiler.problem.fatalOptionalError=disabled
org.eclipse.jdt.core.compiler.problem.fieldHiding=ignore
org.eclipse.jdt.core.compiler.problem.finalParameterBound=warning
org.eclipse.jdt.core.compiler.problem.finallyBlockNotCompletingNormally=warning
org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
org.eclipse.jdt.core.compiler.problem.hiddenCatchBlock=warning
org.eclipse.jdt.core.compiler.problem.includeNullInfoFromAsserts=disabled
org.eclipse.jdt.core.compiler.problem.incompatibleNonInheritedInterfaceMethod=warning
org.eclipse.jdt.core.compiler.problem.incompleteEnumSwitch=warning
org.eclipse.jdt.core.compiler.problem.indirectStaticAccess=ignore
org.eclipse.jdt.core.compiler.problem.localVariableHiding=ignore
org.eclipse.jdt.core.compiler.problem.methodWithConstructorName=warning
org.eclipse.jdt.core.compiler.problem.missingDefaultCase=ignore
org.eclipse.jdt.core.compiler.problem.missingDeprecatedAnnotation=ignore
org.eclipse.jdt.core.compiler.problem.missingEnumCaseDespiteDefault=disabled
org.eclipse.jdt.core.compiler.problem.missingHashCodeMethod=ignore
org.eclipse.jdt.core.compiler.problem.missingOverrideAnnotation=ignore
org.eclipse.jdt.core.compiler.problem.missingOverrideAnnotationForInterfaceMethodImplementation=enabled
org.eclipse.jdt.core.compiler.problem.missingSerialVersion=warning
org.eclipse.jdt.core.compiler.problem.missingSynchronizedOnInheritedMethod=ignore
org.eclipse.jdt.core.compiler.problem.noEffectAssignment=warning
org.eclipse.jdt.core.compiler.problem.noImplicitStringConversion=warning
org.eclipse.jdt.core.compiler.problem.nonExternalizedStringLiteral=ignore
org.eclipse.jdt.core.compiler.problem.nonnullParameterAnnotationDropped=warning
org.eclipse.jdt.core.compiler.problem.nonnullTypeVariableFromLegacyInvocation=warning
org.eclipse.jdt.core.compiler.problem.nullAnnotationInferenceConflict=error
org.eclipse.jdt.core.compiler.problem.nullReference=warning
org.eclipse.jdt.core.compiler.problem.nullSpecViolation=error
org.eclipse.jdt.core.compiler.problem.nullUncheckedConversion=warning
org.eclipse.jdt.core.compiler.problem.overridingPackageDefaultMethod=warning
org.eclipse.jdt.core.compiler.problem.parameterAssignment=ignore
org.eclipse.jdt.core.compiler.problem.pessimisticNullAnalysisForFreeTypeVariables=warning
org.eclipse.jdt.core.compiler.problem.possibleAccidentalBooleanAssignment=ignore
org.eclipse.jdt.core.compiler.problem.potentialNullReference=ignore
org.eclipse.jdt.core.compiler.problem.potentiallyUnclosedCloseable=ignore
org.eclipse.jdt.core.compiler.problem.rawTypeReference=warning
org.eclipse.jdt.core.compiler.problem.redundantNullAnnotation=warning
org.eclipse.jdt.core.compiler.problem.redundantNullCheck=ignore
org.eclipse.jdt.core.compiler.problem.redundantSpecificationOfTypeArguments=ignore
org.eclipse.jdt.core.compiler.problem.redundantSuperinterface=ignore
org.eclipse.jdt.core.compiler.problem.reportMethodCanBePotentiallyStatic=ignore
org.eclipse.jdt.core.compiler.problem.reportMethodCanBeStatic=ignore
org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=ignore
org.eclipse.jdt.core.compiler.problem.specialParameterHidingField=disabled
org.eclipse.jdt.core.compiler.problem.staticAccessReceiver=warning
org.eclipse.jdt.core.compiler.problem.suppressOptionalErrors=disabled
org.eclipse.jdt.core.compiler.problem.suppressWarnings=enabled
org.eclipse.jdt.core.compiler.problem.syntacticNullAnalysisForFields=disabled
org.eclipse.jdt.core.compiler.problem.syntheticAccessEmulation=ignore
org.eclipse.jdt.core.compiler.problem.typeParameterHiding=warning
org.eclipse.jdt.core.compiler.problem.unavoidableGenericTypeProblems=enabled
org.eclipse.jdt.core.compiler.problem.uncheckedTypeOperation=warning
org.eclipse.jdt.core.compiler.problem.unclosedCloseable=warning
org.eclipse.jdt.core.compiler.problem.undocumentedEmptyBlock=ignore
org.eclipse.jdt.core.compiler.problem.unhandledWarningToken=warning
org.eclipse.jdt.core.compiler.problem.unnecessaryElse=ignore
org.eclipse.jdt.core.compiler.problem.unnecessaryTypeCheck=ignore
org.eclipse.jdt.core.compiler.problem.unqualifiedFieldAccess=ignore
org.eclipse.jdt.core.compiler.problem.unusedDeclaredThrownException=ignore
org.eclipse.jdt.core.compiler.problem.unusedDeclaredThrownExceptionExemptExceptionAndThrowable=enabled
org.eclipse.jdt.core.compiler.problem.unusedDeclaredThrownExceptionIncludeDocCommentReference=enabled
org.eclipse.jdt.core.compiler.problem.unusedDeclaredThrownExceptionWhenOverriding=disabled
org.eclipse.jdt.core.compiler.problem.unusedExceptionParameter=ignore
org.eclipse.jdt.core.compiler.problem.unusedImport=warning
org.eclipse.jdt.core.compiler.problem.unusedLabel=warning
org.eclipse.jdt.core.compiler.problem.unusedLocal=warning
org.eclipse.jdt.core.compiler.problem.unusedObjectAllocation=ignore
org.eclipse.jdt.core.compiler.problem.unusedParameter=ignore
org.eclipse.jdt.core.compiler.problem.unusedParameterIncludeDocCommentReference=enabled
org.eclipse.jdt.core.compiler.problem.unusedParameterWhenImplementingAbstract=disabled
org.eclipse.jdt.core.compiler.problem.unusedParameterWhenOverridingConcrete=disabled
org.eclipse.jdt.core.compiler.problem.unusedPrivateMember=warning
org.eclipse.jdt.core.compiler.problem.unusedTypeParameter=ignore
org.eclipse.jdt.core.compiler.problem.unusedWarningToken=warning
org.eclipse.jdt.core.compiler.problem.varargsArgumentNeedCast=warning
org.eclipse.jdt.core.compiler.release=disabled
org.eclipse.jdt.core.compiler.source=1.8

.settings/org.eclipse.m2e.core.prefs (0 changed lines) Normal file → Executable file


@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?><project-modules id="moduleCoreId" project-version="1.5.0">
<wb-module deploy-name="dataminer">
<wb-resource deploy-path="/" source-path="/src/main/java"/>
<wb-resource deploy-path="/" source-path="/src/main/resources"/>
</wb-module>
</project-modules>


@@ -0,0 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<faceted-project>
<installed facet="java" version="1.8"/>
<installed facet="jst.utility" version="1.0"/>
</faceted-project>


@@ -0,0 +1,2 @@
disabled=06target
eclipse.preferences.version=1

CHANGELOG.md (67 changed lines) Executable file

@@ -0,0 +1,67 @@
# Changelog for "dataminer"
## [v1.8.1-SNAPSHOT] - 2022-03-21
- Update wps service to support not writing of the computation status to the user's workspace [#23054]
- Fixed protocol parameter when persistence is disabled
## [v1.8.0] - 2022-01-24
- Fixed max computations support [#22700]
## [v1.7.1] - 2021-05-24
- Fixed obsolete short urls [#20971]
## [v1.7.0] - 2020-11-20
- import range modified to resolve old repositories invalid url
## [v1.6.0] - 2020-05-12
- Added storagehub retry in InputsManager class, getLocalFile method [#19253]
## [v1.5.9] - 2019-11-20
- Fixed Content-Type support for files in the results of computations [#18096]
## [v1.5.8] - 2019-10-01
- Fixed https link for output parameter [#17659]
## [v1.5.7] - 2019-03-01
- Updated https support [#13024]
## [v1.5.2] - 2017-12-13
- added the right extension on output file
- lock file created on execution
## [v1.5.1] - 2017-09-14
- added accounting on algorithm execution
## [v1.5.0] - 2017-07-31
- service interface classes moved to wps project
## [v1.1.0] - 2016-10-03
- First Release
This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

FUNDING.md (26 changed lines) Executable file

@@ -0,0 +1,26 @@
# Acknowledgments
The projects leading to this software have received funding from a series of European Union programmes including:
- the Sixth Framework Programme for Research and Technological Development
- [DILIGENT](https://cordis.europa.eu/project/id/004260) (grant no. 004260).
- the Seventh Framework Programme for research, technological development and demonstration
- [D4Science](https://cordis.europa.eu/project/id/212488) (grant no. 212488);
- [D4Science-II](https://cordis.europa.eu/project/id/239019) (grant no.239019);
- [ENVRI](https://cordis.europa.eu/project/id/283465) (grant no. 283465);
- [iMarine](https://cordis.europa.eu/project/id/283644) (grant no. 283644);
- [EUBrazilOpenBio](https://cordis.europa.eu/project/id/288754) (grant no. 288754).
- the H2020 research and innovation programme
- [SoBigData](https://cordis.europa.eu/project/id/654024) (grant no. 654024);
- [PARTHENOS](https://cordis.europa.eu/project/id/654119) (grant no. 654119);
- [EGI-Engage](https://cordis.europa.eu/project/id/654142) (grant no. 654142);
- [ENVRI PLUS](https://cordis.europa.eu/project/id/654182) (grant no. 654182);
- [BlueBRIDGE](https://cordis.europa.eu/project/id/675680) (grant no. 675680);
- [PerformFISH](https://cordis.europa.eu/project/id/727610) (grant no. 727610);
- [AGINFRA PLUS](https://cordis.europa.eu/project/id/731001) (grant no. 731001);
- [DESIRA](https://cordis.europa.eu/project/id/818194) (grant no. 818194);
- [ARIADNEplus](https://cordis.europa.eu/project/id/823914) (grant no. 823914);
- [RISIS 2](https://cordis.europa.eu/project/id/824091) (grant no. 824091);
- [EOSC-Pillar](https://cordis.europa.eu/project/id/857650) (grant no. 857650);
- [Blue Cloud](https://cordis.europa.eu/project/id/862409) (grant no. 862409);
- [SoBigData-PlusPlus](https://cordis.europa.eu/project/id/871042) (grant no. 871042);

LICENSE.md (311 changed lines) Executable file

@@ -0,0 +1,311 @@
#European Union Public Licence V.1.1
##*EUPL © the European Community 2007*
This **European Union Public Licence** (the **“EUPL”**) applies to the Work or Software
(as defined below) which is provided under the terms of this Licence. Any use of
the Work, other than as authorised under this Licence is prohibited (to the
extent such use is covered by a right of the copyright holder of the Work).
The Original Work is provided under the terms of this Licence when the Licensor
(as defined below) has placed the following notice immediately following the
copyright notice for the Original Work:
**Licensed under the EUPL V.1.1**
or has expressed by any other mean his willingness to license under the EUPL.
##1. Definitions
In this Licence, the following terms have the following meaning:
- The Licence: this Licence.
- The Original Work or the Software: the software distributed and/or
communicated by the Licensor under this Licence, available as Source Code and
also as Executable Code as the case may be.
- Derivative Works: the works or software that could be created by the Licensee,
based upon the Original Work or modifications thereof. This Licence does not
define the extent of modification or dependence on the Original Work required
in order to classify a work as a Derivative Work; this extent is determined by
copyright law applicable in the country mentioned in Article 15.
- The Work: the Original Work and/or its Derivative Works.
- The Source Code: the human-readable form of the Work which is the most
convenient for people to study and modify.
- The Executable Code: any code which has generally been compiled and which is
meant to be interpreted by a computer as a program.
- The Licensor: the natural or legal person that distributes and/or communicates
the Work under the Licence.
- Contributor(s): any natural or legal person who modifies the Work under the
Licence, or otherwise contributes to the creation of a Derivative Work.
- The Licensee or “You”: any natural or legal person who makes any usage of the
Software under the terms of the Licence.
- Distribution and/or Communication: any act of selling, giving, lending,
renting, distributing, communicating, transmitting, or otherwise making
available, on-line or off-line, copies of the Work or providing access to its
essential functionalities at the disposal of any other natural or legal
person.
##2. Scope of the rights granted by the Licence
The Licensor hereby grants You a world-wide, royalty-free, non-exclusive,
sub-licensable licence to do the following, for the duration of copyright vested
in the Original Work:
- use the Work in any circumstance and for all usage, reproduce the Work, modify
- the Original Work, and make Derivative Works based upon the Work, communicate
- to the public, including the right to make available or display the Work or
- copies thereof to the public and perform publicly, as the case may be, the
- Work, distribute the Work or copies thereof, lend and rent the Work or copies
- thereof, sub-license rights in the Work or copies thereof.
Those rights can be exercised on any media, supports and formats, whether now
known or later invented, as far as the applicable law permits so.
In the countries where moral rights apply, the Licensor waives his right to
exercise his moral right to the extent allowed by law in order to make effective
the licence of the economic rights here above listed.
The Licensor grants to the Licensee royalty-free, non exclusive usage rights to
any patents held by the Licensor, to the extent necessary to make use of the
rights granted on the Work under this Licence.
##3. Communication of the Source Code
The Licensor may provide the Work either in its Source Code form, or as
Executable Code. If the Work is provided as Executable Code, the Licensor
provides in addition a machine-readable copy of the Source Code of the Work
along with each copy of the Work that the Licensor distributes or indicates, in
a notice following the copyright notice attached to the Work, a repository where
the Source Code is easily and freely accessible for as long as the Licensor
continues to distribute and/or communicate the Work.
##4. Limitations on copyright
Nothing in this Licence is intended to deprive the Licensee of the benefits from
any exception or limitation to the exclusive rights of the rights owners in the
Original Work or Software, of the exhaustion of those rights or of other
applicable limitations thereto.
##5. Obligations of the Licensee
The grant of the rights mentioned above is subject to some restrictions and
obligations imposed on the Licensee. Those obligations are the following:
Attribution right: the Licensee shall keep intact all copyright, patent or
trademarks notices and all notices that refer to the Licence and to the
disclaimer of warranties. The Licensee must include a copy of such notices and a
copy of the Licence with every copy of the Work he/she distributes and/or
communicates. The Licensee must cause any Derivative Work to carry prominent
notices stating that the Work has been modified and the date of modification.
Copyleft clause: If the Licensee distributes and/or communicates copies of the
Original Works or Derivative Works based upon the Original Work, this
Distribution and/or Communication will be done under the terms of this Licence
or of a later version of this Licence unless the Original Work is expressly
distributed only under this version of the Licence. The Licensee (becoming
Licensor) cannot offer or impose any additional terms or conditions on the Work
or Derivative Work that alter or restrict the terms of the Licence.
Compatibility clause: If the Licensee Distributes and/or Communicates Derivative
Works or copies thereof based upon both the Original Work and another work
licensed under a Compatible Licence, this Distribution and/or Communication can
be done under the terms of this Compatible Licence. For the sake of this clause,
“Compatible Licence” refers to the licences listed in the appendix attached to
this Licence. Should the Licensees obligations under the Compatible Licence
conflict with his/her obligations under this Licence, the obligations of the
Compatible Licence shall prevail.
Provision of Source Code: When distributing and/or communicating copies of the
Work, the Licensee will provide a machine-readable copy of the Source Code or
indicate a repository where this Source will be easily and freely available for
as long as the Licensee continues to distribute and/or communicate the Work.
Legal Protection: This Licence does not grant permission to use the trade names,
trademarks, service marks, or names of the Licensor, except as required for
reasonable and customary use in describing the origin of the Work and
reproducing the content of the copyright notice.
##6. Chain of Authorship
The original Licensor warrants that the copyright in the Original Work granted
hereunder is owned by him/her or licensed to him/her and that he/she has the
power and authority to grant the Licence.
Each Contributor warrants that the copyright in the modifications he/she brings
to the Work are owned by him/her or licensed to him/her and that he/she has the
power and authority to grant the Licence.
Each time You accept the Licence, the original Licensor and subsequent
Contributors grant You a licence to their contributions to the Work, under the
terms of this Licence.
##7. Disclaimer of Warranty
The Work is a work in progress, which is continuously improved by numerous
contributors. It is not a finished work and may therefore contain defects or
“bugs” inherent to this type of software development.
For the above reason, the Work is provided under the Licence on an “as is” basis
and without warranties of any kind concerning the Work, including without
limitation merchantability, fitness for a particular purpose, absence of defects
or errors, accuracy, non-infringement of intellectual property rights other than
copyright as stated in Article 6 of this Licence.
This disclaimer of warranty is an essential part of the Licence and a condition
for the grant of any rights to the Work.
##8. Disclaimer of Liability
Except in the cases of wilful misconduct or damages directly caused to natural
persons, the Licensor will in no event be liable for any direct or indirect,
material or moral, damages of any kind, arising out of the Licence or of the use
of the Work, including without limitation, damages for loss of goodwill, work
stoppage, computer failure or malfunction, loss of data or any commercial
damage, even if the Licensor has been advised of the possibility of such
damage. However, the Licensor will be liable under statutory product liability
laws as far such laws apply to the Work.
##9. Additional agreements
While distributing the Original Work or Derivative Works, You may choose to
conclude an additional agreement to offer, and charge a fee for, acceptance of
support, warranty, indemnity, or other liability obligations and/or services
consistent with this Licence. However, in accepting such obligations, You may
act only on your own behalf and on your sole responsibility, not on behalf of
the original Licensor or any other Contributor, and only if You agree to
indemnify, defend, and hold each Contributor harmless for any liability incurred
by, or claims asserted against such Contributor by the fact You have accepted
any such warranty or additional liability.
##10. Acceptance of the Licence
The provisions of this Licence can be accepted by clicking on an icon “I agree”
placed under the bottom of a window displaying the text of this Licence or by
affirming consent in any other similar way, in accordance with the rules of
applicable law. Clicking on that icon indicates your clear and irrevocable
acceptance of this Licence and all of its terms and conditions.
Similarly, you irrevocably accept this Licence and all of its terms and
conditions by exercising any rights granted to You by Article 2 of this Licence,
such as the use of the Work, the creation by You of a Derivative Work or the
Distribution and/or Communication by You of the Work or copies thereof.
##11. Information to the public
In case of any Distribution and/or Communication of the Work by means of
electronic communication by You (for example, by offering to download the Work
from a remote location) the distribution channel or media (for example, a
website) must at least provide to the public the information requested by the
applicable law regarding the Licensor, the Licence and the way it may be
accessible, concluded, stored and reproduced by the Licensee.
##12. Termination of the Licence
The Licence and the rights granted hereunder will terminate automatically upon
any breach by the Licensee of the terms of the Licence.
Such a termination will not terminate the licences of any person who has
received the Work from the Licensee under the Licence, provided such persons
remain in full compliance with the Licence.
##13. Miscellaneous
Without prejudice of Article 9 above, the Licence represents the complete
agreement between the Parties as to the Work licensed hereunder.
If any provision of the Licence is invalid or unenforceable under applicable
law, this will not affect the validity or enforceability of the Licence as a
whole. Such provision will be construed and/or reformed so as necessary to make
it valid and enforceable.
The European Commission may publish other linguistic versions and/or new
versions of this Licence, so far this is required and reasonable, without
reducing the scope of the rights granted by the Licence. New versions of the
Licence will be published with a unique version number.
All linguistic versions of this Licence, approved by the European Commission,
have identical value. Parties can take advantage of the linguistic version of
their choice.
##14. Jurisdiction
Any litigation resulting from the interpretation of this License, arising
between the European Commission, as a Licensor, and any Licensee, will be
subject to the jurisdiction of the Court of Justice of the European Communities,
as laid down in article 238 of the Treaty establishing the European Community.
Any litigation arising between Parties, other than the European Commission, and
resulting from the interpretation of this License, will be subject to the
exclusive jurisdiction of the competent court where the Licensor resides or
conducts its primary business.
##15. Applicable Law
This Licence shall be governed by the law of the European Union country where
the Licensor resides or has his registered office.
This licence shall be governed by the Belgian law if:
- a litigation arises between the European Commission, as a Licensor, and any Licensee;
- the Licensor, other than the European Commission, has no residence or registered office inside a European Union country.
---
##Appendix
**“Compatible Licences”** according to article 5 EUPL are:
- GNU General Public License (GNU GPL) v. 2
- Open Software License (OSL) v. 2.1, v. 3.0
- Common Public License v. 1.0
- Eclipse Public License v. 1.0
- Cecill v. 2.0

README.md Executable file
@@ -0,0 +1,40 @@
# DataMiner
DataMiner is a library for integrating the 52North WPS into the D4Science Infrastructure.
## Structure of the project
* The source code is present in the src folder.
## Built With
* [OpenJDK](https://openjdk.java.net/) - The JDK used
* [Maven](https://maven.apache.org/) - Dependency Management
## Documentation
* Use of this library is described on [Wiki](https://wiki.gcube-system.org/gcube/DataMiner_Installation).
## Change log
See [Releases](https://code-repo.d4science.org/gCubeSystem/dataminer/releases).
## Authors
* **Gianpaolo Coro** ([ORCID]()) - [ISTI-CNR Infrascience Group](http://nemis.isti.cnr.it/groups/infrascience)
* **Lucio Lelii** ([ORCID]()) - [ISTI-CNR Infrascience Group](http://nemis.isti.cnr.it/groups/infrascience)
* **Giancarlo Panichi** ([ORCID](http://orcid.org/0000-0001-8375-6644)) - [ISTI-CNR Infrascience Group](http://nemis.isti.cnr.it/groups/infrascience)
## License
This project is licensed under the EUPL V.1.1 License - see the [LICENSE.md](LICENSE.md) file for details.
## About the gCube Framework
This software is part of the [gCubeFramework](https://www.gcube-system.org/ "gCubeFramework"): an
open-source software toolkit used for building and operating Hybrid Data
Infrastructures enabling the dynamic deployment of Virtual Research Environments
by favouring the realisation of reuse oriented policies.
The projects leading to this software have received funding from a series of European Union programmes; see [FUNDING.md](FUNDING.md).

distro/descriptor.xml → descriptor.xml Normal file → Executable file
@@ -9,13 +9,13 @@
<baseDirectory>/</baseDirectory>
<fileSets>
<fileSet>
<directory>${distroDirectory}</directory>
<outputDirectory>/</outputDirectory>
<useDefaultExcludes>true</useDefaultExcludes>
<includes>
<include>README</include>
<include>LICENSE</include>
<include>changelog.xml</include>
<include>README.md</include>
<include>LICENSE.md</include>
<include>CHANGELOG.md</include>
<include>FUNDING.md</include>
<include>profile.xml</include>
</includes>
<fileMode>755</fileMode>

@@ -1 +0,0 @@
${gcube.license}

@@ -1,69 +0,0 @@
The gCube System - ${name}
--------------------------------------------------
${description}
${gcube.description}
${gcube.funding}
Version
--------------------------------------------------
${version} (${buildDate})
Please see the file named "changelog.xml" in this directory for the release notes.
Authors
--------------------------------------------------
* Gianpaolo Coro (gianpaolo.coro-AT-isti.cnr.it),
Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo" CNR, Pisa IT
Maintainers
-----------
* Gianpaolo Coro (gianpaolo.coro-AT-isti.cnr.it),
Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo" CNR, Pisa IT
Download information
--------------------------------------------------
Source code is available from SVN:
${scm.url}
Binaries can be downloaded from the gCube website:
${gcube.website}
Installation
--------------------------------------------------
Installation documentation is available on-line in the gCube Wiki:
https://wiki.gcube-system.org/gcube/DataMiner_Installation
Documentation
--------------------------------------------------
Documentation is available on-line in the gCube Wiki:
https://wiki.gcube-system.org/gcube/DataMiner_Installation
Support
--------------------------------------------------
Bugs and support requests can be reported in the gCube issue tracking tool:
${gcube.issueTracking}
Licensing
--------------------------------------------------
This software is licensed under the terms you may find in the file named "LICENSE" in this directory.

@@ -1,15 +0,0 @@
<ReleaseNotes>
<Changeset component="${groupId}.${artifactId}.1-5-2" date="2017-12-13">
<Change>added the right extension on output file</Change>
<Change>lock file created on execution</Change>
</Changeset>
<Changeset component="${groupId}.${artifactId}.1-5-1" date="2017-09-14">
<Change>added accounting on algorithm execution</Change>
</Changeset>
<Changeset component="${groupId}.${artifactId}.1-5-0" date="2017-07-31">
<Change>service interface classes moved to wps project</Change>
</Changeset>
<Changeset component="${groupId}.${artifactId}.1-1-0" date="2016-10-03">
<Change>First Release</Change>
</Changeset>
</ReleaseNotes>

pom.xml Normal file → Executable file
@@ -1,210 +1,211 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>maven-parent</artifactId>
<groupId>org.gcube.tools</groupId>
<version>1.0.0</version>
<relativePath />
</parent>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>dataminer</artifactId>
<version>1.5.3-SNAPSHOT</version>
<name>dataminer</name>
<description>An e-Infrastructure service providing state-of-the art DataMining algorithms and ecological modelling approaches under the Web Processing Service (WPS) standard.</description>
<scm>
<url>https://svn.d4science.research-infrastructures.eu/gcube/trunk/data-analysis/DataMiner</url>
</scm>
<developers>
<developer>
<name>Gianpaolo Coro</name>
<email>gianpaolo.coro@isti.cnr.it</email>
<organization>CNR Pisa, Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo"</organization>
<roles>
<role>architect</role>
<role>developer</role>
</roles>
</developer>
</developers>
<properties>
<webappDirectory>${project.build.directory}/${project.build.finalName}</webappDirectory>
<distroDirectory>distro</distroDirectory>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.reflections/reflections-maven -->
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.10</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>org.n52.wps</groupId>
<artifactId>52n-wps-io</artifactId>
<version>3.6.1</version>
</dependency>
<dependency>
<groupId>org.n52.wps</groupId>
<artifactId>52n-wps-io-impl</artifactId>
<version>3.6.1</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>52n-wps-algorithm-gcube</artifactId>
<version>[3.6.1-SNAPSHOT,3.7.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>52n-wps-server-gcube</artifactId>
<version>[3.6.1-SNAPSHOT, 3.7.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine</artifactId>
<version>[1.8.5-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-wps-extension</artifactId>
<version>[1.0.2-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-geospatial-extensions</artifactId>
<version>[1.3.2-SNAPSHOT,2.0.0-SNAPSHOT)</version>
<exclusions>
<exclusion>
<artifactId>log4j</artifactId>
<groupId>log4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-external-algorithms</artifactId>
<version>[1.1.5-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-smart-executor</artifactId>
<version>[1.0.0-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-nop</artifactId>
<version>1.7.10</version>
<exclusions>
<exclusion>
<artifactId>slf4j-api</artifactId>
<groupId>org.slf4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<!-- <dependency> <groupId>org.gcube.common</groupId> <artifactId>common-authorization</artifactId>
</dependency> <dependency> <groupId>org.gcube.core</groupId> <artifactId>common-scope</artifactId>
</dependency> -->
<dependency>
<groupId>javassist</groupId>
<artifactId>javassist</artifactId>
<version>3.12.1.GA</version>
</dependency>
<dependency>
<groupId>org.gcube.common</groupId>
<artifactId>home-library-jcr</artifactId>
<version>[2.0.0-SNAPSHOT,3.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.common</groupId>
<artifactId>home-library</artifactId>
<version>[2.0.0-SNAPSHOT,3.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>xerces</groupId>
<artifactId>xercesImpl</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.gcube.accounting</groupId>
<artifactId>accounting-lib</artifactId>
<version>[3.0.0-SNAPSHOT,4.0.0-SNAPSHOT)</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptors>
<descriptor>${distroDirectory}/descriptor.xml</descriptor>
</descriptors>
</configuration>
<executions>
<execution>
<id>servicearchive</id>
<phase>install</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>n52-releases</id>
<name>52n Releases</name>
<url>http://52north.org/maven/repo/releases</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>maven-parent</artifactId>
<groupId>org.gcube.tools</groupId>
<version>1.1.0</version>
<relativePath />
</parent>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>dataminer</artifactId>
<version>1.8.1-SNAPSHOT</version>
<name>dataminer</name>
<description>An e-Infrastructure service providing state-of-the art DataMining algorithms and ecological modelling approaches under the Web Processing Service (WPS) standard.</description>
<scm>
<connection>scm:git:https://code-repo.d4science.org/gCubeSystem/${project.artifactId}.git</connection>
<developerConnection>scm:git:https://code-repo.d4science.org/gCubeSystem/${project.artifactId}.git</developerConnection>
<url>https://code-repo.d4science.org/gCubeSystem/${project.artifactId}</url>
</scm>
<developers>
<developer>
<name>Gianpaolo Coro</name>
<email>gianpaolo.coro@isti.cnr.it</email>
<organization>CNR Pisa, Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo"</organization>
<roles>
<role>architect</role>
<role>developer</role>
</roles>
</developer>
<developer>
<name>Lucio Lelii</name>
<email>lucio.lelii@isti.cnr.it</email>
<organization>CNR Pisa, Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo"</organization>
<roles>
<role>architect</role>
<role>developer</role>
</roles>
</developer>
</developers>
<properties>
<webappDirectory>${project.build.directory}/${project.build.finalName}</webappDirectory>
<distroDirectory>distro</distroDirectory>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
<dependencyManagement>
<!-- Old solution <dependencies> <dependency> <groupId>org.gcube.distribution</groupId>
<artifactId>maven-smartgears-bom</artifactId> <version>2.1.0</version> <type>pom</type>
<scope>import</scope> </dependency> </dependencies> -->
<dependencies>
<dependency>
<groupId>org.gcube.distribution</groupId>
<artifactId>gcube-bom</artifactId>
<version>2.1.0-SNAPSHOT</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.reflections/reflections-maven -->
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.10</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>org.n52.wps</groupId>
<artifactId>52n-wps-io</artifactId>
<version>3.6.1</version>
</dependency>
<dependency>
<groupId>org.n52.wps</groupId>
<artifactId>52n-wps-io-impl</artifactId>
<version>3.6.1</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>52n-wps-algorithm-gcube</artifactId>
<version>[3.6.1,3.7.0)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>52n-wps-server-gcube</artifactId>
<version>[3.6.1, 3.7.0)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-wps-extension</artifactId>
<version>[1.0.5,2.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-geospatial-extensions</artifactId>
<version>[1.5.2-SNAPSHOT,2.0.0-SNAPSHOT)</version>
<exclusions>
<exclusion>
<artifactId>log4j</artifactId>
<groupId>log4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-external-algorithms</artifactId>
<version>[1.2.2-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.dataanalysis</groupId>
<artifactId>ecological-engine-smart-executor</artifactId>
<version>[1.6.5-SNAPSHOT,2.0.0-SNAPSHOT)</version>
<exclusions>
<exclusion>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
</exclusion>
<exclusion>
<groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-json-jackson</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-nop</artifactId>
<version>1.7.10</version>
<exclusions>
<exclusion>
<artifactId>slf4j-api</artifactId>
<groupId>org.slf4j</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javassist</groupId>
<artifactId>javassist</artifactId>
<version>3.12.1.GA</version>
</dependency>
<dependency>
<groupId>org.gcube.common</groupId>
<artifactId>storagehub-client-library</artifactId>
</dependency>
<dependency>
<groupId>org.gcube.common</groupId>
<artifactId>storagehub-model</artifactId>
</dependency>
<dependency>
<groupId>xerces</groupId>
<artifactId>xercesImpl</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.gcube.accounting</groupId>
<artifactId>accounting-lib</artifactId>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>n52-releases</id>
<name>52n Releases</name>
<url>http://52north.org/maven/repo/releases</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
</project>

distro/profile.xml → profile.xml Normal file → Executable file
@@ -30,7 +30,7 @@ public class InfrastructureDialoguer {
public DatabaseInfo getDatabaseInfo(String resourceName) throws Exception{
DatabaseInfo dbi = new DatabaseInfo();
LOGGER.debug("Searching for Database "+resourceName+" in scope "+scope);
LOGGER.debug("Searching for Database {} in scope {}", resourceName, scope);
SimpleQuery query = queryFor(ServiceEndpoint.class);
// query.addCondition("$resource/Profile/Category/text() eq 'Database' and $resource/Profile/Name eq 'StatisticalManagerDataBase' ");
// query.addCondition("$resource/Profile/Category/text() eq 'Database' and $resource/Profile/Name eq '"+resourceName+"' ");
@@ -51,7 +51,7 @@ public class InfrastructureDialoguer {
dbi.driver = property.value();
}
LOGGER.debug("Found Database : "+dbi);
LOGGER.debug("Found Database : {}",dbi);
}
if (dbi.url == null)

@@ -48,10 +48,16 @@ import org.n52.wps.server.AbstractAnnotatedAlgorithm;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm implements Observable, Cancellable{
public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm implements Observable, Cancellable {
private static final int COMPUTATION_WAIT_FOR_RUN_REQUEST = 20000;
/**
* Deploying procedure: 1 - modify configuration files 2 - modify resource file: resources/templates/setup.cfg 3 - generate classes with ClassGenerator 4 - add new classes in the wps_config.xml on the wps web app config folder 5 - produce the Jar file of this project 6 - copy the jar file in the lib folder of the wps web app change the server parameters in the wps_config.xml file
* Deploying procedure: 1 - modify configuration files 2 - modify resource file:
* resources/templates/setup.cfg 3 - generate classes with ClassGenerator 4 -
* add new classes in the wps_config.xml on the wps web app config folder 5 -
* produce the Jar file of this project 6 - copy the jar file in the lib folder
* of the wps web app change the server parameters in the wps_config.xml file
*/
private static final Logger LOGGER = LoggerFactory.getLogger(AbstractEcologicalEngineMapper.class);
@@ -59,9 +65,11 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
private Observer observer = null;
private boolean cancelled = false;
private TokenManager tokenm = null;
private EnvironmentVariableManager env = null;
// inputs and outputs
public LinkedHashMap<String, Object> inputs = new LinkedHashMap<String, Object>();
public LinkedHashMap<String, Object> outputs = new LinkedHashMap<String, Object>();
@@ -78,30 +86,41 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
this.wpsExternalID = wpsExternalID;
}
public static synchronized void addComputation(String session, String user) {
private static synchronized void addComputation(String session, String user) {
runningcomputations.put(session, user);
}
public static synchronized void removeComputation(String session) {
private static synchronized void removeComputation(String session) {
runningcomputations.remove(session);
}
public static synchronized int getRuningComputations() {
private static synchronized int getRuningComputations() {
return runningcomputations.size();
}
public static synchronized String displayRunningComputations() {
private static synchronized String displayRunningComputations() {
return runningcomputations.toString();
}
public void waitForResources() throws Exception {
while (getRuningComputations() > ConfigurationManager.getMaxComputations()) {
Thread.sleep(20000);
private void waitForResources(String computationSession, String username, String scope) throws Exception {
while (waitCondition(computationSession, username, scope)) {
Thread.sleep(COMPUTATION_WAIT_FOR_RUN_REQUEST);
LOGGER.debug("Waiting for resources to be available: " + displayRunningComputations());
}
}
private static synchronized boolean waitCondition(String computationSession, String username, String scope) {
if (getRuningComputations() >= ConfigurationManager.getMaxComputations()) {
return true;
} else {
// add the computation to the global list of computations
LOGGER.debug("Add computation to run: {}", computationSession);
addComputation(computationSession, username + ":" + scope);
return false;
}
}
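The refactoring above folds the capacity check and the registration of the computation into a single `synchronized` method, so two concurrent requests can no longer both observe a free slot and start together; it also tightens the limit check from `>` to `>=`. A minimal, self-contained sketch of the same check-and-register pattern (the class name, limit, and poll interval here are hypothetical, not the service's actual values):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the check-and-register guard: the size test and the
// put() happen inside one synchronized method, closing the race
// in which two callers both pass the limit check before either
// one registers its computation.
public class ComputationGate {
    private static final Map<String, String> RUNNING = new LinkedHashMap<>();
    private static final int MAX_COMPUTATIONS = 2; // hypothetical limit

    // true = caller must keep waiting; false = slot taken atomically
    private static synchronized boolean waitCondition(String session, String owner) {
        if (RUNNING.size() >= MAX_COMPUTATIONS) {
            return true;
        }
        RUNNING.put(session, owner); // register while still holding the lock
        return false;
    }

    public static void waitForSlot(String session, String owner) throws InterruptedException {
        while (waitCondition(session, owner)) {
            Thread.sleep(50); // the service polls every 20 s between retries
        }
    }

    public static synchronized void release(String session) {
        RUNNING.remove(session);
    }

    public static synchronized int running() {
        return RUNNING.size();
    }
}
```

With `MAX_COMPUTATIONS = 2`, a third `waitForSlot` call blocks until `release` frees a slot, mirroring the sleep-and-retry loop in `waitForResources`.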
// inner objects
public AlgorithmConfiguration config;
public InfrastructureDialoguer infrastructureDialoguer;
@@ -203,7 +222,7 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
LOGGER.debug("Could not drop Temporary Table: " + table + " table is null");
}
} catch (Exception e) {
LOGGER.error("error deleting temporary table",e);
LOGGER.error("error deleting temporary table", e);
} finally {
DatabaseUtils.closeDBConnection(dbConnection);
}
@@ -246,14 +265,16 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
float previousStatus = -3;
String host = WPSConfig.getInstance().getWPSConfig().getServer().getHostname();
public void updateStatus(float status) {
public void updateStatus(float status, boolean canWrite) {
if (agent != null) {
if (status != previousStatus) {
LOGGER.debug("STATUS update to: {} ", status );
LOGGER.debug("STATUS update to: {} ", status);
previousStatus = status;
super.update(new Integer((int) status));
try {
updateComputationOnWS(status, null);
if (canWrite)
updateComputationOnWS(status, null);
} catch (Exception e) {
LOGGER.warn("error updating compution on WS");
}
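The new `canWrite` flag decouples client-side progress reporting from workspace persistence: the WPS status update is always sent, while `updateComputationOnWS` runs only when the user is authorized to write. A minimal sketch of that guard (class and method names are hypothetical, and a string trace stands in for the real client notification and workspace write):

```java
// Hypothetical sketch: status is always propagated to the client,
// while persisting the computation record is skipped for users
// without write authorization.
public class StatusReporter {
    private float previousStatus = -3; // sentinel: no status sent yet
    private final boolean canWrite;
    private final StringBuilder log = new StringBuilder();

    public StatusReporter(boolean canWrite) {
        this.canWrite = canWrite;
    }

    public void updateStatus(float status) {
        if (status == previousStatus) {
            return; // avoid duplicate notifications, as in the service
        }
        previousStatus = status;
        log.append("client:").append((int) status).append(';');
        if (canWrite) {
            // in the service this is updateComputationOnWS(status, null)
            log.append("ws:").append((int) status).append(';');
        }
    }

    public String trace() {
        return log.toString();
    }
}
```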
@@ -262,16 +283,21 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
}
public void setEnvironmentVariableManager(EnvironmentVariableManager env) {
this.env = env;
}
public void updateComputationOnWS(float status, String exception) {
updateComputationOnWS(status, exception, null, null);
}
class RunDataspaceManager implements Runnable{
class RunDataspaceManager implements Runnable {
List<StoredData> inputData;
List<File> generatedData;
public RunDataspaceManager(List<StoredData> inputData, List<File> generatedData){
this.inputData=inputData;
this.generatedData=generatedData;
public RunDataspaceManager(List<StoredData> inputData, List<File> generatedData) {
this.inputData = inputData;
this.generatedData = generatedData;
}
public void run() {
@@ -280,34 +306,35 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
LOGGER.debug("Dataspace->Status updater->Writing computational info on the WS asyncronously");
manager.writeRunningComputationData();
} catch (Exception ez) {
LOGGER.error("Dataspace->Status updater->Impossible to write computation information on the Workspace",ez);
LOGGER.error("Dataspace->Status updater->Impossible to write computation information on the Workspace",
ez);
}
}
};
public void updateComputationOnWS(float status, String exception, List<StoredData> inputData, List<File> generatedData) {
public void updateComputationOnWS(float status, String exception, List<StoredData> inputData,
List<File> generatedData) {
if (currentComputation != null) {
currentComputation.setStatus("" + status);
if (exception != null && exception.length() > 0)
currentComputation.setException(exception);
RunDataspaceManager rundm = new RunDataspaceManager(inputData,generatedData);
LOGGER.debug("RunDataspaceManager: [inputData=" + inputData + ", generatedData=" + generatedData + "]");
RunDataspaceManager rundm = new RunDataspaceManager(inputData, generatedData);
rundm.run();
/*
Thread t = new Thread(rundm);
t.start();
* Thread t = new Thread(rundm); t.start();
*/
}
}
@Execute
public void run() throws Exception {
if (observer!=null)
if (observer != null)
observer.isStarted(this);
LOGGER.info("classloader context in this thread is {}",Thread.currentThread().getContextClassLoader());
LOGGER.info("classloader context in this thread is {}", Thread.currentThread().getContextClassLoader());
long startTimeLong = System.currentTimeMillis();
OperationResult operationResult = null;
@@ -316,7 +343,8 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
List<String> generatedInputTables = null;
List<String> generatedOutputTables = null;
List<File> generatedFiles = null;
//String date = new java.text.SimpleDateFormat("dd_MM_yyyy_HH:mm:ss").format(System.currentTimeMillis());
// String date = new
// java.text.SimpleDateFormat("dd_MM_yyyy_HH:mm:ss").format(System.currentTimeMillis());
String computationSession = this.getAlgorithmClass().getSimpleName() + "_ID_" + UUID.randomUUID().toString();
if (wpsExternalID != null) {
LOGGER.info("Using wps External ID " + wpsExternalID);
@@ -324,13 +352,19 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
} else
LOGGER.info("Wps External ID not set");
InputsManager inputsManager = null;
ConfigurationManager configManager = new ConfigurationManager(); // initializes parameters from file
ConfigurationManager configManager = new ConfigurationManager(this.env); // initializes
// parameters
// from
// web.xml
manageUserToken();
boolean canWriteOnShub = checkWriteAuthorization(tokenm.getUserName());
Path dir = Paths.get(System.getProperty("java.io.tmpdir"), "dmlocks");
if (!Files.exists(dir))
dir = Files.createDirectory(dir);
Path lockFile = Files.createTempFile(dir, "dm", ".lck");
LOGGER.info("lock file created {}",lockFile.toUri().toURL());
LOGGER.info("lock file created {}", lockFile.toUri().toURL());
try {
// wait for server resources to be available
@ -344,17 +378,17 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
config = configManager.getConfig();
LOGGER.info("Configured algorithm with session " + computationSession);
time("Configuration");
waitForResources(computationSession, configManager.getUsername(), configManager.getScope());
LOGGER.info("Running algorithm with session " + computationSession);
time("Waiting time for resources to be free");
// add the computation to the global list of computations
addComputation(computationSession, configManager.getUsername() + ":" + configManager.getScope());
String scope = configManager.getScope();
String username = configManager.getUsername();
LOGGER.info("1 - Algorithm environment initialized in scope " + scope + " with user name " + username
+ " and session " + computationSession);
LOGGER.info("Max allowed computations " + ConfigurationManager.getMaxComputations() + " using storage "
+ ConfigurationManager.useStorage());
// init the infrastructure dialoguer
LOGGER.info("2 - Initializing connection to the e-Infrastructure");
infrastructureDialoguer = new InfrastructureDialoguer(scope);
@ -380,15 +414,19 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
time("Ecological Engine Algorithm selection");
// adding service parameters to the configuration
LOGGER.info("5 - Adding Service parameters to the configuration");
List<StatisticalType> dataminerInputParameters = getInputParameters(algorithm);
LOGGER.debug("Dataminer Algo Default InputParameters: " + dataminerInputParameters);
inputsManager.addInputServiceParameters(dataminerInputParameters, infrastructureDialoguer);
time("Service parameters added to the algorithm");
// merging wps with ecological engine parameters - modifies the
// config
LOGGER.info("6 - Translating WPS Inputs into Ecological Engine Inputs");
LOGGER.debug("Operator class is " + this.getClass().getCanonicalName());
// build computation Data
currentComputation = new ComputationData(config.getTaskID(), config.getAgent(), "", "", startTime, "-", "0",
config.getTaskID(), configManager.getUsername(), config.getGcubeScope(),
this.getClass().getCanonicalName());
inputsManager.mergeWpsAndEcologicalInputs(supportDatabaseInfo, dataminerInputParameters);
generatedInputTables = inputsManager.getGeneratedTables();
generatedFiles = inputsManager.getGeneratedInputFiles();
time("Setup and download of input parameters with tables creation");
@ -412,11 +450,11 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
LOGGER.info("9 - Running the computation and updater");
LOGGER.info("Initializing the WPS status of the computation");
updateStatus(0, canWriteOnShub);
LOGGER.info("Initializing the computation");
agent.init();
LOGGER.info("Updating status");
runStatusUpdater(canWriteOnShub);
LOGGER.info("Running the computation");
agent.compute();
LOGGER.info("The computation has finished. Retrieving output");
@ -440,13 +478,15 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
outputmanager.shutdown();
// delete all temporary tables
LOGGER.info("12 - Deleting possible generated temporary tables");
LOGGER.debug("12 - Final Computation Output");
LOGGER.debug("Outputs: " + outputs);
endTime = new java.text.SimpleDateFormat("dd/MM/yyyy HH:mm:ss").format(System.currentTimeMillis());
if (!isCancelled()) {
LOGGER.debug("Save Computation Data");
if (canWriteOnShub)
saveComputationOnWS(inputsManager.getProvenanceData(), outputmanager.getProvenanceData(), agent,
generatedFiles);
} else {
LOGGER.debug("Computation interrupted - no update");
throw new Exception("Computation cancelled");
@ -455,21 +495,26 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
operationResult = OperationResult.SUCCESS;
} catch (Exception e) {
operationResult = OperationResult.FAILED;
LOGGER.error("Error execution Algorithm {}", algorithm, e);
int exitstatus = -2;
if (isCancelled())
exitstatus = -1;
if (inputsManager != null) {
if (canWriteOnShub)
updateComputationOnWS(exitstatus, e.getMessage(), inputsManager.getProvenanceData(),
generatedFiles);
} else if (canWriteOnShub)
updateComputationOnWS(exitstatus, e.getMessage());
if (isCancelled())
throw new Exception("Computation cancelled");
else
throw e;
} finally {
LOGGER.debug("accounting algorithm");
if (operationResult == null) {
operationResult = OperationResult.FAILED;
}
accountAlgorithmExecution(startTimeLong, System.currentTimeMillis(), operationResult);
LOGGER.debug("Deleting Input Tables");
deleteTemporaryTables(generatedInputTables);
@ -483,60 +528,98 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
time("Cleaning of resources");
displayTimes();
cleanResources();
if (observer != null)
observer.isFinished(this);
LOGGER.debug("All done - Computation Finished");
Files.deleteIfExists(lockFile);
}
}
private boolean checkWriteAuthorization(String username) {
if (env != null) {
if (env.getShubUsersExcluded() != null) {
if (env.getShubUsersExcluded().isEmpty()) {
// all users write
return true;
}
if (env.getShubUsersExcluded().contains(username)) {
return false;
} else {
// username write
return true;
}
} else {
// This is the "*" case: no users write.
return false;
}
} else {
return false;
}
}
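The three-way semantics of the exclusion list (null, empty, populated) can be condensed into a small self-contained sketch. Class and method names here are illustrative only, not part of the service code:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class WriteAuthorizationSketch {

    // excluded == null   -> the "*" case: no user can write
    // excluded.isEmpty() -> no exclusions: every user can write
    // otherwise          -> only users not in the list can write
    public static boolean canWrite(List<String> excluded, String username) {
        if (excluded == null)
            return false;
        if (excluded.isEmpty())
            return true;
        return !excluded.contains(username);
    }

    public static void main(String[] args) {
        System.out.println(canWrite(null, "alice"));                   // false
        System.out.println(canWrite(Collections.emptyList(), "alice")); // true
        System.out.println(canWrite(Arrays.asList("alice"), "alice"));  // false
        System.out.println(canWrite(Arrays.asList("alice"), "bob"));    // true
    }
}
```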
private void accountAlgorithmExecution(long start, long end, OperationResult result) {
try {
JobUsageRecord jobUsageRecord = new JobUsageRecord();
jobUsageRecord.setJobName(this.getAlgorithmClass().getSimpleName());
jobUsageRecord.setConsumerId(tokenm.getUserName());
jobUsageRecord.setDuration(end - start);
jobUsageRecord.setOperationResult(result);
jobUsageRecord.setServiceName("DataMiner");
jobUsageRecord.setServiceClass("WPS");
jobUsageRecord.setHost(WPSConfig.getInstance().getWPSConfig().getServer().getHostname());
jobUsageRecord.setCallerQualifier(tokenm.getTokenQualifier());
AccountingPersistence accountingPersistence = AccountingPersistenceFactory.getPersistence();
accountingPersistence.account(jobUsageRecord);
} catch (Throwable e) {
LOGGER.error("error accounting algorithm execution", e);
}
}
public class StatusUpdater implements Runnable {
private boolean canWrite = true;
public StatusUpdater(boolean canWrite) {
this.canWrite = canWrite;
}
@Override
public void run() {
while (agent != null && !isCancelled() && agent.getStatus() < 100) {
try {
updateStatus(agent.getStatus(), canWrite);
Thread.sleep(10000);
} catch (InterruptedException e) {
}
}
LOGGER.info("Status updater terminated");
}
}
private void runStatusUpdater(boolean canWrite) {
StatusUpdater updater = new StatusUpdater(canWrite);
Thread t = new Thread(updater);
t.start();
LOGGER.debug("Status updater running");
}
private void saveComputationOnWS(List<StoredData> inputData, List<StoredData> outputData, ComputationalAgent agent,
List<File> generatedFiles) {
LOGGER.debug("Save Computation On WS");
LOGGER.debug("InputData: " + inputData);
LOGGER.debug("OutputData: " + outputData);
LOGGER.debug("Agent: " + agent);
LOGGER.debug("Generated files: " + generatedFiles);
LOGGER.debug("Provenance manager started for operator " + this.getClass().getCanonicalName());
ComputationData computation = new ComputationData(config.getTaskID(), config.getAgent(), agent.getDescription(),
agent.getInfrastructure().name(), startTime, endTime, "100", config.getTaskID(),
config.getParam(ConfigurationManager.serviceUserNameParameterVariable), config.getGcubeScope(),
this.getClass().getCanonicalName());
// post on WS
DataspaceManager manager = new DataspaceManager(config, computation, inputData, outputData, generatedFiles);
@ -601,7 +684,8 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
public static List<String> command(final String cmdline, final String directory) {
try {
Process process = new ProcessBuilder(new String[] { "bash", "-c", cmdline }).redirectErrorStream(true)
.directory(new File(directory)).start();
List<String> output = new ArrayList<String>();
BufferedReader br = new BufferedReader(new InputStreamReader(process.getInputStream()));
@ -622,16 +706,16 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
@Override
public void setObserver(Observer o) {
LOGGER.debug("setting observer in {} ", wpsExternalID);
this.observer = o;
}
@Override
public synchronized boolean cancel() {
if (!cancelled) {
LOGGER.debug("COMPUTATION INTERRUPTED! ({})", wpsExternalID);
try {
if (agent != null) {
agent.shutdown();
agent = null;
}
@ -644,12 +728,12 @@ public class AbstractEcologicalEngineMapper extends AbstractAnnotatedAlgorithm i
}
System.gc();
cancelled = true;
} catch (Exception e) {
LOGGER.warn("error cancelling computation with id {}", wpsExternalID);
return false;
}
} else {
LOGGER.debug("COMPUTATION ALREADY INTERRUPTED! ({})", wpsExternalID);
return false;
}
return true;


@ -14,7 +14,7 @@ import org.slf4j.LoggerFactory;
public class ConfigurationManager {
private static final Logger logger = LoggerFactory.getLogger(ConfigurationManager.class);
public static String serviceUserNameParameterVariable = "ServiceUserName";
public static String processingSessionVariable = "Session";
public static String webpathVariable = "WebPath";
@ -22,26 +22,33 @@ public class ConfigurationManager {
public static String usernameParameter = "user.name";
public static String scopeParameter = "scope";
public static String tokenParameter = "usertoken";
public static String defaultScope = "/gcube/devsec";
public static String defaultUsername = "statistical.wps";
private static Integer maxComputations = null;
private static Boolean useStorage = null;
static boolean simulationMode = false;
EnvironmentVariableManager env = null;
public static synchronized Integer getMaxComputations() {
return maxComputations;
}
public static synchronized Boolean useStorage() {
return useStorage;
}
public static synchronized Boolean isSimulationMode() {
return simulationMode;
}
@Deprecated
public void getInitializationProperties() {
}
private void inizializePropertiesUsingTemplateFile() {
try {
if (maxComputations == null) {
Properties options = new Properties();
@ -49,11 +56,12 @@ public class ConfigurationManager {
options.load(is);
is.close();
maxComputations = Integer.parseInt(options.getProperty("maxcomputations"));
logger.info("setting max computation to {}", maxComputations);
useStorage = Boolean.parseBoolean(options.getProperty("saveond4sstorage"));
simulationMode = Boolean.parseBoolean(options.getProperty("simulationMode"));
}
} catch (Exception e) {
logger.error("error initializing properties", e);
}
}
@ -70,67 +78,74 @@ public class ConfigurationManager {
return username;
}
public ConfigurationManager(EnvironmentVariableManager env) {
if (env == null)
inizializePropertiesUsingTemplateFile();
else {
maxComputations = env.getMaxComputation();
useStorage = env.isSaveOnStorage();
simulationMode = env.isSimulationMode();
}
}
public AlgorithmConfiguration getConfig() {
return config;
}
public void setComputationId(String computationId) {
config.setTaskID(computationId);
}
public void configAlgorithmEnvironment(LinkedHashMap<String, Object> inputs) throws Exception {
// set config container
config = new AlgorithmConfiguration();
config.setAlgorithmClassLoader(Thread.currentThread().getContextClassLoader());
String webperspath = WPSConfig.getConfigDir() + "../persistence/";
// selecting persistence path
// String persistencePath = File.createTempFile("wpsstatcheck", ".sm").getParent() + "/../cfg/";
// TODO: remove this workaround (the persistence must be the persistence dir of the webapp)
String persistencePath = WPSConfig.getConfigDir() + "../ecocfg/";
String configPath = persistencePath;
if (!new File(configPath).isDirectory()) {
configPath = "./cfg/";
persistencePath = "./";
}
logger.debug("Taking configuration from {}", configPath);
// + " and persistence in " + persistencePath);
// setting configuration and logger
config.setPersistencePath(configPath);
config.setConfigPath(configPath);
config.setNumberOfResources(1);
// setting application paths
String protocol = WPSConfig.getInstance().getWPSConfig().getServer().getProtocol();
String webapp = WPSConfig.getInstance().getWPSConfig().getServer().getWebappPath();
String host = WPSConfig.getInstance().getWPSConfig().getServer().getHostname();
String port = WPSConfig.getInstance().getWPSConfig().getServer().getHostport();
logger.debug("Protocol: {} , Host: {} , Port: {} , Webapp: {} ", protocol, host, port, webapp);
logger.info("Web Path Persistence: {}", webperspath);
String webPath = protocol + "://" + host + ":" + port + "/" + webapp + "/persistence/";
// logger.debug("Env Vars: \n"+System.getenv());
logger.info("Web Path Persistence Url: {}", webPath);
// retrieving scope
scope = (String) inputs.get(scopeParameter);
logger.debug("Retrieved scope: {} ", scope);
if (scope == null)
throw new Exception(
"Error: scope parameter (scope) not set! This violates e-Infrastructure security policies");
if (!scope.startsWith("/"))
scope = "/" + scope;
username = (String) inputs.get(usernameParameter);
token = (String) inputs.get(tokenParameter);
logger.debug("User name used by the client: {}", username);
logger.debug("User token used by the client: {}", token);
if (username == null || username.trim().length() == 0)
throw new Exception(
"Error: user name parameter (user.name) not set! This violates e-Infrastructure security policies");
if (token == null || token.trim().length() == 0)
throw new Exception("Error: token parameter not set! This violates e-Infrastructure security policies");
@ -139,7 +154,7 @@ public class ConfigurationManager {
config.setGcubeToken(token);
// DONE get username from request
config.setParam(serviceUserNameParameterVariable, username);
config.setParam(processingSessionVariable, UUID.randomUUID().toString());
config.setParam(webpathVariable, webPath);
config.setParam(webPersistencePathVariable, webperspath);


@ -0,0 +1,38 @@
package org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mapping;
import java.util.List;
public class EnvironmentVariableManager {
public EnvironmentVariableManager(int maxComputation, boolean saveOnStorage, boolean simulationMode, List<String> shubUsersExcluded) {
super();
this.maxComputation = maxComputation;
this.saveOnStorage = saveOnStorage;
this.simulationMode = simulationMode;
this.shubUsersExcluded = shubUsersExcluded;
}
private int maxComputation;
private boolean saveOnStorage;
private boolean simulationMode;
// null: no user will write on SHub (the "*" case)
// empty: all users will write on SHub
// filled: the users reported will not write on SHub
private List<String> shubUsersExcluded;
public int getMaxComputation() {
return maxComputation;
}
public boolean isSaveOnStorage() {
return saveOnStorage;
}
public boolean isSimulationMode() {
return simulationMode;
}
public List<String> getShubUsersExcluded() {
return shubUsersExcluded;
}
}


@ -5,6 +5,7 @@ import java.io.BufferedReader;
import java.io.File;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
@ -41,7 +42,9 @@ import org.slf4j.LoggerFactory;
public class InputsManager {
private static final Logger LOGGER = LoggerFactory.getLogger(InputsManager.class);
private static final long SHUB_RETRY_MILLIS = 2000;
LinkedHashMap<String, Object> inputs;
List<String> generatedTables;
@ -51,11 +54,11 @@ public class InputsManager {
String computationId;
List<StoredData> provenanceData = new ArrayList<StoredData>();
public List<StoredData> getProvenanceData() {
return provenanceData;
}
public static String inputsSeparator = "\\|";
public AlgorithmConfiguration getConfig() {
@ -92,30 +95,35 @@ public class InputsManager {
config.setParam("DatabaseURL", supportDatabaseInfo.url);
}
public void mergeWpsAndEcologicalInputs(DatabaseInfo supportDatabaseInfo,
List<StatisticalType> dataminerInputParameters) throws Exception {
LOGGER.debug("Merge WPS And Ecological Inputs");
// browse input parameters from WPS
for (String inputName : inputs.keySet()) {
Object input = inputs.get(inputName);
LOGGER.debug("Managing Input Parameter with Name " + inputName);
// case of simple input
if (input instanceof String) {
LOGGER.debug("Simple Input: " + input);
// manage lists
String inputAlgoOrig = ((String) input).trim();
String inputAlgo = ((String) input).trim().replaceAll(inputsSeparator,
AlgorithmConfiguration.listSeparator);
LOGGER.debug("Simple Input Transformed: " + inputAlgo);
config.setParam(inputName, inputAlgo);
saveInputData(inputName, inputName, inputAlgoOrig);
}
// case of Complex Input
else if (input instanceof GenericFileData) {
LOGGER.debug("Complex Input");
// retrieve payload
GenericFileData files = ((GenericFileData) input);
LOGGER.debug("GenericFileData: [fileExtension=" + files.getFileExtension() + ", mimeType="
+ files.getMimeType() + "]");
List<File> localfiles = getLocalFiles(files, inputName, dataminerInputParameters);
String inputtables = "";
int nfiles = localfiles.size();
StringBuffer sb = new StringBuffer();
@ -128,10 +136,11 @@ public class InputsManager {
if (inputTableTemplates.get(inputName) != null) {
LOGGER.debug("Creating table: " + tableName);
createTable(tableName, tableFile, config, supportDatabaseInfo,
inputTableTemplates.get(inputName));
generatedTables.add(tableName);
}
// case of non-table input file, e.g. FFANN
else
tableName = tableFile.getAbsolutePath();
if (i > 0)
@ -140,13 +149,13 @@ public class InputsManager {
inputtables += tableName;
saveInputData(tableFile.getName(), inputName, tableFile.getAbsolutePath());
if (i > 0)
sb.append("|");
sb.append(tableFile.getName());
}
sb.append("|");
if (nfiles > 0)
saveInputData(inputName, inputName, sb.toString());
// the only possible complex input is a table - check the WPS
@ -157,15 +166,15 @@ public class InputsManager {
}
public boolean isXML(String fileContent) {
if (fileContent.startsWith("&lt;"))
return true;
else
return false;
}
public String readOneLine(String filename) {
try {
BufferedReader in = new BufferedReader(new FileReader(new File(filename)));
@ -173,7 +182,7 @@ public class InputsManager {
String vud = "";
while ((line = in.readLine()) != null) {
if (line.trim().length() > 0) {
vud = line.trim();
break;
}
@ -183,64 +192,85 @@ public class InputsManager {
} catch (Exception e) {
LOGGER.error("error reading file " + filename, e);
return null;
}
}
public String inputNameFromHttpHeader(String url) throws Exception {
LOGGER.debug("Search filename in http header from: " + url);
URL obj = new URL(url);
URLConnection conn = obj.openConnection();
String filename = null;
// get all headers
Map<String, List<String>> map = conn.getHeaderFields();
LOGGER.debug("Getting file name from http header");
for (Map.Entry<String, List<String>> entry : map.entrySet()) {
String value = entry.getValue().toString();
LOGGER.debug("Header value: " + value);
if (value.toLowerCase().contains("filename")) {
LOGGER.debug("Searching in http header: found file name in header value {}", value);
filename = value.substring(value.indexOf("=") + 1);
filename = filename.replace("\"", "").replace("]", "");
LOGGER.debug("Searching in http header: retrieved file name {}", filename);
break;
}
}
LOGGER.debug("Filename retrieved from http header: " + filename);
return filename;
}
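The substring/replace parsing above is easy to isolate and test on its own. This sketch (class and method names are hypothetical) mirrors the logic applied to a header value rendered by `List.toString()`, e.g. `[attachment; filename="table.csv"]`:

```java
public class HeaderFilenameSketch {

    // Mirrors the parsing above: take everything after the first '=',
    // then strip quotes and the trailing ']' that List.toString() adds.
    public static String filenameFromHeaderValue(String value) {
        if (!value.toLowerCase().contains("filename"))
            return null;
        String filename = value.substring(value.indexOf("=") + 1);
        return filename.replace("\"", "").replace("]", "");
    }

    public static void main(String[] args) {
        String header = "[attachment; filename=\"table.csv\"]";
        System.out.println(filenameFromHeaderValue(header)); // table.csv
    }
}
```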
public List<File> getLocalFiles(GenericFileData files, String inputName,
List<StatisticalType> dataminerInputParameters) throws Exception {
LOGGER.debug("GetLocalFiles: [files: " + files + ", inputName: " + inputName + "]");
// download input
List<File> filesList = new ArrayList<File>();
File f = files.getBaseFile(false);
LOGGER.debug("Retrieving local files: " + f.getAbsolutePath());
// TODO DO NOT READ FILE INTO MEMORY
// read file content
String fileLink = readOneLine(f.getAbsolutePath());
LOGGER.debug("Check File is link: {} ...",
fileLink == null ? null : fileLink.substring(0, Math.min(fileLink.length(), 10)));
String fileName = "";
// case of a http link
if (fileLink != null
&& (fileLink.toLowerCase().startsWith("http:") || fileLink.toLowerCase().startsWith("https:"))) {
// manage the case of multiple files
LOGGER.debug("Complex Input payload is link");
LOGGER.debug("Retrieving files from url: " + fileLink);
String[] remotefiles = fileLink.split(inputsSeparator);
for (String subfilelink : remotefiles) {
subfilelink = subfilelink.trim();
LOGGER.debug("Managing link: {}", subfilelink);
if (subfilelink.length() == 0)
continue;
InputStream is = null;
HttpURLConnection urlConnection = null;
URL url = new URL(subfilelink);
try {
urlConnection = (HttpURLConnection) url.openConnection();
is = new BufferedInputStream(urlConnection.getInputStream());
} catch (IOException e) {
LOGGER.warn("download from storagehub failed. Retry ongoing...");
LOGGER.debug("waiting " + SHUB_RETRY_MILLIS + " millis");
Thread.sleep(SHUB_RETRY_MILLIS);
urlConnection = (HttpURLConnection) url.openConnection();
is = new BufferedInputStream(urlConnection.getInputStream());
LOGGER.debug("retry success");
}
// retrieve payload: for test purpose only
String fileNameTemp = inputNameFromHttpHeader(subfilelink);
LOGGER.debug("the fileNameTemp is {}", fileNameTemp);
if (fileNameTemp != null && !fileNameTemp.isEmpty()) {
fileName = String.format("%s_(%s).%s", inputName, computationId,
FilenameUtils.getExtension(fileNameTemp));
} else {
fileName = String.format("%s_(%s).%s", inputName, computationId,
FilenameUtils.getExtension(inputName));
}
LOGGER.debug("the name of the generated file is {}", fileName);
LOGGER.debug("Creating local temp file: " + fileName);
File of = new File(config.getPersistencePath(), fileName);
FileOutputStream fos = new FileOutputStream(of);
IOUtils.copy(is, fos);
@ -249,32 +279,57 @@ public class InputsManager {
fos.close();
urlConnection.disconnect();
filesList.add(of);
LOGGER.debug("Created local file: {}", of.getAbsolutePath());
}
} else {
LOGGER.debug("Complex Input payload is file");
fileName = f.getName();
LOGGER.debug("Retrieving local input from file: {}", fileName);
String fileExt = null;
if (isXML(fileLink)) {
String xmlFile = f.getAbsolutePath();
String csvFile = xmlFile + ".csv";
LOGGER.debug("Transforming XML file into a csv: {} ", csvFile);
GML2CSV.parseGML(xmlFile, csvFile);
LOGGER.debug("GML Parsed: {} [..]", readOneLine(csvFile));
f = new File(csvFile);
fileExt = "csv";
} else {
LOGGER.debug("The file is a csv: {}", f.getAbsolutePath());
fileExt = FilenameUtils.getExtension(fileName);
}
LOGGER.debug("Retrieve default extension");
String fileDefaultValue = null;
for (StatisticalType defaultInputParameter : dataminerInputParameters) {
if (defaultInputParameter.getName().compareTo(inputName) == 0) {
fileDefaultValue = defaultInputParameter.getDefaultValue();
break;
}
}
LOGGER.debug("Parameter default value retrieved: " + fileDefaultValue);
if (fileDefaultValue != null && !fileDefaultValue.isEmpty()) {
int lastPointIndex = fileDefaultValue.lastIndexOf(".");
if (lastPointIndex > -1 && lastPointIndex < (fileDefaultValue.length() - 1)) {
fileExt = fileDefaultValue.substring(lastPointIndex + 1);
LOGGER.debug("Default Extension retrieved: " + fileExt);
}
}
LOGGER.debug("Use extension: " + fileExt);
String absFile = new File(f.getParent(), String.format("%s_(%s).%s", inputName, computationId, fileExt))
.getAbsolutePath();
LOGGER.debug("Renaming to: " + absFile);
System.gc();
boolean renamed = f.renameTo(new File(absFile));
if (renamed)
f = new File(absFile);
LOGGER.debug("The file has been renamed as : {} - {}", f.getAbsolutePath(), renamed);
filesList.add(f);
}
@ -282,7 +337,8 @@ public class InputsManager {
return filesList;
}
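The `lastIndexOf` logic above, which derives a file extension from a parameter's default value, is easy to get wrong at the edges (no dot, trailing dot). A self-contained sketch with a hypothetical helper name:

```java
public class DefaultExtensionSketch {

    // Mirrors the block above: take the text after the last '.' of the
    // default value, or fall back when there is no usable extension.
    public static String extensionFromDefault(String fileDefaultValue, String fallback) {
        if (fileDefaultValue != null && !fileDefaultValue.isEmpty()) {
            int lastPointIndex = fileDefaultValue.lastIndexOf(".");
            if (lastPointIndex > -1 && lastPointIndex < (fileDefaultValue.length() - 1))
                return fileDefaultValue.substring(lastPointIndex + 1);
        }
        return fallback;
    }

    public static void main(String[] args) {
        System.out.println(extensionFromDefault("table.csv", "dat")); // csv
        System.out.println(extensionFromDefault("nodot", "csv"));     // csv
        System.out.println(extensionFromDefault("trailing.", "csv")); // csv
    }
}
```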
public void createTable(String tableName, File tableFile, AlgorithmConfiguration config,
DatabaseInfo supportDatabaseInfo, String inputTableTemplate) throws Exception {
// creating table
LOGGER.debug("Complex Input size after download: " + tableFile.length());
@ -308,11 +364,13 @@ public class InputsManager {
TableTemplatesMapper mapper = new TableTemplatesMapper();
String createstatement = mapper.generateCreateStatement(tableName, templatename, tableStructure);
LOGGER.debug("Creating table: " + tableName);
DatabaseUtils.createBigTable(true, tableName, supportDatabaseInfo.driver, supportDatabaseInfo.username,
supportDatabaseInfo.password, supportDatabaseInfo.url, createstatement, dbConnection);
DatabaseUtils.createRemoteTableFromFile(tableFile.getAbsolutePath(), tableName, ",", true,
supportDatabaseInfo.username, supportDatabaseInfo.password, supportDatabaseInfo.url);
} catch (Exception e) {
LOGGER.error("Error in database transaction", e);
throw new Exception("Error in creating the table for " + tableName + ": " + e.getLocalizedMessage());
} finally {
DatabaseUtils.closeDBConnection(dbConnection);
@ -368,7 +426,8 @@ public class InputsManager {
return structure.toString();
}
public void addInputServiceParameters(List<StatisticalType> agentInputs,
InfrastructureDialoguer infrastructureDialoguer) throws Exception {
// check and fullfil additional parameters
DatabaseInfo dbinfo = null;
@@ -376,10 +435,10 @@
for (StatisticalType type : agentInputs) {
if (type instanceof PrimitiveType) {
if (((PrimitiveType) type).getType()==PrimitiveTypes.CONSTANT){
String constant = ""+((PrimitiveType) type).getDefaultValue();
if (((PrimitiveType) type).getType() == PrimitiveTypes.CONSTANT) {
String constant = "" + ((PrimitiveType) type).getDefaultValue();
config.setParam(type.getName(), constant);
LOGGER.debug("Constant parameter: "+constant);
LOGGER.debug("Constant parameter: " + constant);
}
}
if (type instanceof ServiceType) {
@@ -392,10 +451,10 @@
String value = "";
if (sp == ServiceParameters.RANDOMSTRING)
value = "stat" + UUID.randomUUID().toString().replace("-", "");
else if (sp == ServiceParameters.USERNAME){
else if (sp == ServiceParameters.USERNAME) {
value = (String) inputs.get(ConfigurationManager.usernameParameter);
LOGGER.debug("User name used by the client: "+value);
LOGGER.debug("User name used by the client: " + value);
}
LOGGER.debug("ServiceType Adding: (" + name + "," + value + ")");
config.setParam(name, value);
@@ -436,8 +495,8 @@
}
private void saveInputData(String name, String description, String payload){
private void saveInputData(String name, String description, String payload) {
LOGGER.debug("SaveInputData [name="+name+", description="+description+", payload="+payload+"]");
String id = name;
DataProvenance provenance = DataProvenance.IMPORTED;
String creationDate = new java.text.SimpleDateFormat("dd/MM/yyyy HH:mm:ss").format(System.currentTimeMillis());
@@ -445,18 +504,17 @@
String type = "text/plain";
if (payload != null && (new File (payload).exists())) {
if (payload != null && (new File(payload).exists())) {
if (payload.toLowerCase().endsWith(".csv") || payload.toLowerCase().endsWith(".txt")) {
type = "text/csv";
} else
type = "application/d4science";
}
StoredData data = new StoredData(name, description, id, provenance, creationDate, operator, computationId, type,payload,config.getGcubeScope());
StoredData data = new StoredData(name, description, id, provenance, creationDate, operator, computationId, type,
payload, config.getGcubeScope());
provenanceData.add(data);
}
}


@@ -21,7 +21,7 @@ import org.slf4j.LoggerFactory;
public class OutputsManager {
private static Logger LOGGER = LoggerFactory.getLogger(OutputsManager.class);
private AlgorithmConfiguration config;
private List<File> generatedFiles = new ArrayList<File>();
@@ -29,7 +29,7 @@ public class OutputsManager {
private IClient storageclient;
private String computationsession;
private List<StoredData> provenanceData = new ArrayList<StoredData>();
public List<StoredData> getProvenanceData() {
return provenanceData;
}
@@ -37,7 +37,7 @@ public class OutputsManager {
public List<File> getGeneratedData() {
return generatedFiles;
}
public List<File> getGeneratedFiles() {
return generatedFiles;
}
@@ -46,12 +46,13 @@
return generatedTables;
}
public OutputsManager(AlgorithmConfiguration config,String computationsession) {
public OutputsManager(AlgorithmConfiguration config, String computationsession) {
this.config = config;
this.computationsession=computationsession;
this.computationsession = computationsession;
}
public LinkedHashMap<String, Object> createOutput(StatisticalType prioroutput, StatisticalType posterioroutput) throws Exception {
public LinkedHashMap<String, Object> createOutput(StatisticalType prioroutput, StatisticalType posterioroutput)
throws Exception {
LinkedHashMap<String, Object> outputs = new LinkedHashMap<String, Object>();
@@ -66,10 +67,10 @@
StatisticalTypeToWPSType postconverter = new StatisticalTypeToWPSType();
postconverter.convert2WPSType(posterioroutput, false, config);
generatedFiles.addAll(postconverter.getGeneratedFiles());
LOGGER.debug("Generated Files "+generatedFiles);
LOGGER.debug("Generated Files " + generatedFiles);
generatedTables.addAll(postconverter.getGeneratedTables());
LOGGER.debug("Generated Tables "+generatedFiles);
LOGGER.debug("Generated Tables " + generatedFiles);
LinkedHashMap<String, IOWPSInformation> postOutput = postconverter.outputSet;
LinkedHashMap<String, IOWPSInformation> ndoutput = new LinkedHashMap<String, IOWPSInformation>();
@@ -102,21 +103,23 @@
if (ConfigurationManager.useStorage()) {
if (postInfo.getLocalMachineContent() != null) {
// return the url from storage manager
String storageurl = uploadFileOnStorage(postInfo.getLocalMachineContent(), postInfo.getMimetype());
String storageurl = uploadFileOnStorage(postInfo.getLocalMachineContent(),
postInfo.getMimetype());
postInfo.setContent(storageurl);
}
}
/*
else if (postInfo.getLocalMachineContent() != null) {
String url = "<wps:Reference mimeType=\""+postInfo.getMimetype()+"\" xlink:href=\""+postInfo.getContent()+"\" method=\"GET\"/>";
LOGGER.debug("Reference URL: " + url);
outputs.put(okey, url);
}
else*/
* else if (postInfo.getLocalMachineContent() != null) { String
* url = "<wps:Reference mimeType=\""+postInfo.getMimetype()
* +"\" xlink:href=\""+postInfo.getContent()
* +"\" method=\"GET\"/>"; LOGGER.debug("Reference URL: " +
* url); outputs.put(okey, url); } else
*/
if (info != null) {
LOGGER.debug("Found a corresponding output: " + okey);
outputs.put(okey, postInfo.getContent());
//add link to the file also among the non deterministic output
// add link to the file also among the non deterministic
// output
if (postInfo.getLocalMachineContent() != null) {
ndoutput.put(okey, postInfo);
}
@@ -129,23 +132,23 @@
System.gc();
}
XmlObject ndxml = generateNonDeterministicOutput(ndoutput);
outputs.put("non_deterministic_output", ndxml);
//safety check for declared output, i.e. a priori output
for (String pkey:priorOutput.keySet()){
if (outputs.get(pkey)==null){
LOGGER.debug("Safety check: adding empty string for " + pkey+ " of type "+priorOutput.get(pkey).getClassname());
// safety check for declared output, i.e. a priori output
for (String pkey : priorOutput.keySet()) {
if (outputs.get(pkey) == null) {
LOGGER.debug("Safety check: adding empty string for " + pkey + " of type "
+ priorOutput.get(pkey).getClassname());
outputs.put(pkey, "");
}
}
LOGGER.debug("OutputsManager outputs " + outputs);
return outputs;
}
private void saveProvenanceData(IOWPSInformation info){
private void saveProvenanceData(IOWPSInformation info) {
String name = info.getName();
String id = info.getName();
DataProvenance provenance = DataProvenance.COMPUTED;
@@ -153,59 +156,97 @@
String operator = config.getAgent();
String computationId = computationsession;
String type = info.getMimetype();
/* if (info.getLocalMachineContent() != null) {
type = StoredType.DATA;
}
*/
/*
* if (info.getLocalMachineContent() != null) { type = StoredType.DATA;
* }
*/
String payload = info.getContent();
StoredData data = new StoredData(name, info.getAbstractStr(),id, provenance, creationDate, operator, computationId, type,payload,config.getGcubeScope());
StoredData data = new StoredData(name, info.getAbstractStr(), id, provenance, creationDate, operator,
computationId, type, payload, config.getGcubeScope());
provenanceData.add(data);
}
private void prepareForStoring() {
LOGGER.debug("Preparing storage client");
//String scope = config.getGcubeScope();
//ScopeProvider.instance.set(scope);
// String scope = config.getGcubeScope();
// ScopeProvider.instance.set(scope);
String serviceClass = "WPS";
String serviceName = "wps.synch";
String owner = config.getParam(ConfigurationManager.serviceUserNameParameterVariable);
storageclient = new StorageClient(serviceClass, serviceName, owner, AccessType.SHARED, MemoryType.VOLATILE).getClient();
storageclient = new StorageClient(serviceClass, serviceName, owner, AccessType.SHARED, MemoryType.VOLATILE)
.getClient();
LOGGER.debug("Storage client ready");
}
private String uploadFileOnStorage(String localfile, String mimetype) throws Exception {
LOGGER.debug("Storing->Start uploading on storage the following file: " + localfile);
File localFile = new File(localfile);
String remotef = "/wps_synch_output/" +config.getAgent()+"/"+computationsession+"/"+ localFile.getName();
storageclient.put(true).LFile(localfile).RFile(remotef);
String url = storageclient.getHttpUrl().RFile(remotef);
String remotef = "/wps_synch_output/" + config.getAgent() + "/" + computationsession + "/"
+ localFile.getName();
String contentType=retrieveContentType(localfile);
LOGGER.debug("Retrieved Content-Type: "+contentType);
if(contentType==null||contentType.isEmpty()){
storageclient.put(true).LFile(localfile).RFile(remotef);
} else {
storageclient.put(true,contentType).LFile(localfile).RFile(remotef);
}
String url = storageclient.getHttpsUrl().RFile(remotef);
/*
if (config.getGcubeScope().startsWith("/gcube"))
url = "http://data-d.d4science.org/uri-resolver/smp?smp-uri=" + url + "&fileName=" + localFile.getName() + "&contentType=" + mimetype;
else
url = "http://data.d4science.org/uri-resolver/smp?smp-uri=" + url+ "&fileName=" + localFile.getName() + "&contentType=" + mimetype;
*/
* if (config.getGcubeScope().startsWith("/gcube")) url =
* "http://data-d.d4science.org/uri-resolver/smp?smp-uri=" + url +
* "&fileName=" + localFile.getName() + "&contentType=" + mimetype; else
* url = "http://data.d4science.org/uri-resolver/smp?smp-uri=" + url+
* "&fileName=" + localFile.getName() + "&contentType=" + mimetype;
*/
LOGGER.info("Storing->Uploading finished - URL: " + url);
return url;
}
private String retrieveContentType(String fileName) {
String contentType=null;
if (fileName != null && !fileName.isEmpty()) {
String fileNameLowerCase = fileName.toLowerCase();
if (fileNameLowerCase.endsWith(".html") || fileNameLowerCase.endsWith(".htm")) {
contentType="text/html";
} else {
if (fileNameLowerCase.endsWith(".pdf")) {
contentType="application/pdf";
} else {
if (fileNameLowerCase.endsWith(".log") || fileNameLowerCase.endsWith(".txt")) {
contentType="text/plain";
} else {
if (fileNameLowerCase.endsWith(".json")) {
contentType="application/json";
} else {
}
}
}
}
}
return contentType;
}
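The nested if/else chain in `retrieveContentType` reduces to a suffix-to-MIME lookup table. A minimal standalone sketch of the same mapping (this is an illustrative rewrite, not the service's code; the class name is hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ContentTypes {
    // Same extension set as retrieveContentType above.
    private static final Map<String, String> BY_EXTENSION = new LinkedHashMap<>();
    static {
        BY_EXTENSION.put(".html", "text/html");
        BY_EXTENSION.put(".htm", "text/html");
        BY_EXTENSION.put(".pdf", "application/pdf");
        BY_EXTENSION.put(".log", "text/plain");
        BY_EXTENSION.put(".txt", "text/plain");
        BY_EXTENSION.put(".json", "application/json");
    }

    /** Returns the mapped content type, or null when the extension is unknown. */
    public static String retrieve(String fileName) {
        if (fileName == null || fileName.isEmpty())
            return null;
        String lower = fileName.toLowerCase();
        for (Map.Entry<String, String> e : BY_EXTENSION.entrySet())
            if (lower.endsWith(e.getKey()))
                return e.getValue();
        return null; // unknown: caller falls back to the untyped put(...)
    }
}
```

A null return here drives the branch in `uploadFileOnStorage` that calls `put(true)` without an explicit content type.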
public String cleanTagString(String tag) {
return tag.replace(" ", "_").replaceAll("[\\]\\[!\"#$%&'()*+,\\./:;<=>?@\\^`{|}~-]", "");
}
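`cleanTagString` turns arbitrary output keys into safe `d4science:k_...` element-name suffixes: spaces become underscores, then XML-hostile punctuation is stripped. A self-contained check of the same two expressions:

```java
public class TagCleaner {
    // Same two-step sanitization as cleanTagString above:
    // spaces -> underscores, then strip bracket/punctuation characters.
    public static String clean(String tag) {
        return tag.replace(" ", "_").replaceAll("[\\]\\[!\"#$%&'()*+,\\./:;<=>?@\\^`{|}~-]", "");
    }

    public static void main(String[] args) {
        System.out.println(clean("my output (v1.2)!"));
    }
}
```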
public XmlObject generateNonDeterministicOutputPlain(LinkedHashMap<String, IOWPSInformation> ndoutput) throws Exception {
String XMLString = "<gml:featureMember xmlns:gml=\"http://www.opengis.net/gml\" xmlns:d4science=\"http://www.d4science.org\">\n" + " <d4science:output fid=\"outputcollection\">\n";
public XmlObject generateNonDeterministicOutputPlain(LinkedHashMap<String, IOWPSInformation> ndoutput)
throws Exception {
String XMLString = "<gml:featureMember xmlns:gml=\"http://www.opengis.net/gml\" xmlns:d4science=\"http://www.d4science.org\">\n"
+ " <d4science:output fid=\"outputcollection\">\n";
for (String key : ndoutput.keySet()) {
IOWPSInformation info = ndoutput.get(key);
String payload = info.getContent();
String mimetype = info.getMimetype();
XMLString += " <d4science:k_" + cleanTagString(key) + ">" + " <d4science:Data><![CDATA[" + payload + "]]></d4science:Data>\n" + " <d4science:Description><![CDATA[" + (info.getAbstractStr() != null ? info.getAbstractStr() : "") + "]]></d4science:Description>\n" + " <d4science:MimeType>" + mimetype + "</d4science:MimeType>\n" + " </d4science:k_" + cleanTagString(key) + ">\n";
XMLString += " <d4science:k_" + cleanTagString(key) + ">" + " <d4science:Data><![CDATA["
+ payload + "]]></d4science:Data>\n" + " <d4science:Description><![CDATA["
+ (info.getAbstractStr() != null ? info.getAbstractStr() : "") + "]]></d4science:Description>\n"
+ " <d4science:MimeType>" + mimetype + "</d4science:MimeType>\n" + " </d4science:k_"
+ cleanTagString(key) + ">\n";
}
XMLString += " </d4science:output>\n" + "</gml:featureMember>\n";
@@ -219,16 +260,21 @@
return xmlData;
}
public XmlObject generateNonDeterministicOutputCollection(LinkedHashMap<String, IOWPSInformation> ndoutput) throws Exception {
String XMLString = "<ogr:FeatureCollection xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocation=\"http://ogr.maptools.org/ result_8751.xsd\" xmlns:ogr=\"http://ogr.maptools.org/\" xmlns:gml=\"http://www.opengis.net/gml\" xmlns:d4science=\"http://www.d4science.org\">" +
"\n<gml:featureMember>\n" + " <ogr:Result fid=\"F0\">\n" +
" <d4science:output fid=\"outputcollection\">\n";
public XmlObject generateNonDeterministicOutputCollection(LinkedHashMap<String, IOWPSInformation> ndoutput)
throws Exception {
String XMLString = "<ogr:FeatureCollection xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocation=\"http://ogr.maptools.org/ result_8751.xsd\" xmlns:ogr=\"http://ogr.maptools.org/\" xmlns:gml=\"http://www.opengis.net/gml\" xmlns:d4science=\"http://www.d4science.org\">"
+ "\n<gml:featureMember>\n" + " <ogr:Result fid=\"F0\">\n"
+ " <d4science:output fid=\"outputcollection\">\n";
for (String key : ndoutput.keySet()) {
IOWPSInformation info = ndoutput.get(key);
String payload = info.getContent();
String mimetype = info.getMimetype();
XMLString += " <d4science:k_" + cleanTagString(key) + ">" + " <d4science:Data><![CDATA[" + payload + "]]></d4science:Data>\n" + " <d4science:Description><![CDATA[" + (info.getAbstractStr() != null ? info.getAbstractStr() : "") + "]]></d4science:Description>\n" + " <d4science:MimeType>" + mimetype + "</d4science:MimeType>\n" + " </d4science:k_" + cleanTagString(key) + ">\n";
XMLString += " <d4science:k_" + cleanTagString(key) + ">" + " <d4science:Data><![CDATA["
+ payload + "]]></d4science:Data>\n" + " <d4science:Description><![CDATA["
+ (info.getAbstractStr() != null ? info.getAbstractStr() : "") + "]]></d4science:Description>\n"
+ " <d4science:MimeType>" + mimetype + "</d4science:MimeType>\n" + " </d4science:k_"
+ cleanTagString(key) + ">\n";
}
XMLString += " </d4science:output>\n" + " </ogr:Result>\n</gml:featureMember>\n</ogr:FeatureCollection>";
@@ -241,40 +287,46 @@
return xmlData;
}
public XmlObject generateNonDeterministicOutput(LinkedHashMap<String, IOWPSInformation> ndoutput) throws Exception {
if (ndoutput.size()==0)
if (ndoutput.size() == 0)
return null;
String XMLString = "<ogr:FeatureCollection xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocation=\"http://ogr.maptools.org/ result_8751.xsd\" xmlns:ogr=\"http://ogr.maptools.org/\" xmlns:gml=\"http://www.opengis.net/gml\" xmlns:d4science=\"http://www.d4science.org\">" +
"\n<gml:featureMember>\n";
String XMLString = "<ogr:FeatureCollection xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocation=\"http://ogr.maptools.org/ result_8751.xsd\" xmlns:ogr=\"http://ogr.maptools.org/\" xmlns:gml=\"http://www.opengis.net/gml\" xmlns:d4science=\"http://www.d4science.org\">"
+ "\n<gml:featureMember>\n";
int count = 0;
for (String key : ndoutput.keySet()) {
IOWPSInformation info = ndoutput.get(key);
String payload = info.getContent();
String mimetype = info.getMimetype();
String abstractStr = info.getAbstractStr();
LOGGER.debug("IOWPS Information: " + "name "+info.getName()+","
+"abstr "+info.getAbstractStr()+","
+"content "+info.getContent()+","
+"def "+info.getDefaultVal()+",");
if ((abstractStr==null || abstractStr.trim().length()==0) && (payload!= null && payload.trim().length()>0))
LOGGER.debug("IOWPS Information [name=" + info.getName() + ", abstr=" + info.getAbstractStr() + ", content="
+ info.getContent() + ", def=" + info.getDefaultVal() + "]");
if ((abstractStr == null || abstractStr.trim().length() == 0)
&& (payload != null && payload.trim().length() > 0))
abstractStr = info.getName();
else if (abstractStr == null)
abstractStr = "";
//geospatialized
// XMLString += " <ogr:Result fid=\"F" + count+ "\">" + "<ogr:geometryProperty><gml:Point><gml:coordinates>0,0</gml:coordinates></gml:Point></ogr:geometryProperty>"+ " <d4science:Data><![CDATA[" + payload + "]]></d4science:Data>\n" + " <d4science:Description><![CDATA[" + (info.getAbstractStr() != null ? info.getAbstractStr() : "") + "]]></d4science:Description>\n" + " <d4science:MimeType>" + mimetype + "</d4science:MimeType>\n" + " </ogr:Result>\n";
XMLString += " <ogr:Result fid=\"F" + count+ "\">" + " <d4science:Data><![CDATA[" + payload + "]]></d4science:Data>\n" + " <d4science:Description><![CDATA[" + abstractStr + "]]></d4science:Description>\n" + " <d4science:MimeType>" + mimetype + "</d4science:MimeType>\n" + " </ogr:Result>\n";
// geospatialized
// XMLString += " <ogr:Result fid=\"F" + count+ "\">" +
// "<ogr:geometryProperty><gml:Point><gml:coordinates>0,0</gml:coordinates></gml:Point></ogr:geometryProperty>"+
// " <d4science:Data><![CDATA[" + payload + "]]></d4science:Data>\n"
// + " <d4science:Description><![CDATA[" + (info.getAbstractStr() !=
// null ? info.getAbstractStr() : "") +
// "]]></d4science:Description>\n" + " <d4science:MimeType>" +
// mimetype + "</d4science:MimeType>\n" + " </ogr:Result>\n";
XMLString += " <ogr:Result fid=\"F" + count + "\">" + " <d4science:Data><![CDATA[" + payload
+ "]]></d4science:Data>\n" + " <d4science:Description><![CDATA[" + abstractStr
+ "]]></d4science:Description>\n" + " <d4science:MimeType>" + mimetype
+ "</d4science:MimeType>\n" + " </ogr:Result>\n";
count++;
}
XMLString += " </gml:featureMember>\n</ogr:FeatureCollection>";
LOGGER.debug("Non deterministic output: " + XMLString);
XmlObject xmlData = XmlObject.Factory.newInstance();
@@ -284,12 +336,12 @@
return xmlData;
}
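Each non-deterministic result is emitted as an `<ogr:Result>` fragment whose payload and description sit inside CDATA sections, so values may themselves contain markup. A hedged sketch of one such fragment built with a `StringBuilder` instead of chained `+` (same shape as the generator above; the helper name is hypothetical):

```java
public class ResultXml {
    /** Builds one <ogr:Result> fragment mirroring generateNonDeterministicOutput. */
    public static String resultEntry(int fid, String payload, String description, String mimeType) {
        StringBuilder sb = new StringBuilder();
        sb.append("      <ogr:Result fid=\"F").append(fid).append("\">")
          .append("        <d4science:Data><![CDATA[").append(payload).append("]]></d4science:Data>\n")
          .append("        <d4science:Description><![CDATA[").append(description)
          .append("]]></d4science:Description>\n")
          .append("        <d4science:MimeType>").append(mimeType).append("</d4science:MimeType>\n")
          .append("      </ogr:Result>\n");
        return sb.toString();
    }
}
```

Note that CDATA only protects against markup in the payload; a payload containing the literal sequence `]]>` would still break the fragment, which the original code does not guard against either.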
public void shutdown(){
try{
storageclient.close();
}catch(Exception e){
public void shutdown() {
try {
storageclient.close();
} catch (Exception e) {
}
}
}


@@ -34,16 +34,16 @@ public class TableTemplatesMapper {
public void linksMapping(){
linksMap = new HashMap<String, String>();
linksMap.put(TableTemplates.HSPEN.name(), "(HSPEN) http://goo.gl/4zDiAK");
linksMap.put(TableTemplates.HCAF.name(), "(HCAF) http://goo.gl/SZG9uM");
linksMap.put(TableTemplates.HSPEC.name(),"(HSPEC) http://goo.gl/OvKa1h");
linksMap.put(TableTemplates.OCCURRENCE_AQUAMAPS.name(), "(OCCURRENCE_AQUAMAPS) http://goo.gl/vHil5T");
linksMap.put(TableTemplates.OCCURRENCE_SPECIES.name(), "(OCCURRENCE_SPECIES) http://goo.gl/4ExuR5");
linksMap.put(TableTemplates.MINMAXLAT.name(), "(MINMAXLAT) http://goo.gl/cRzwgN");
linksMap.put(TableTemplates.TRAININGSET.name(), "(TRAININGSET) http://goo.gl/Br44UQ");
linksMap.put(TableTemplates.TESTSET.name(), "(TESTSET) http://goo.gl/LZHNXt");
linksMap.put(TableTemplates.CLUSTER.name(), "(CLUSTER) http://goo.gl/PnKhhb");
linksMap.put(TableTemplates.TIMESERIES.name(), "(TIMESERIES) http://goo.gl/DoW6fg");
linksMap.put(TableTemplates.HSPEN.name(), "(HSPEN) https://data.d4science.net/kLeQ");
linksMap.put(TableTemplates.HCAF.name(), "(HCAF) https://data.d4science.net/AhwE");
linksMap.put(TableTemplates.HSPEC.name(),"(HSPEC) https://data.d4science.net/TNCR");
linksMap.put(TableTemplates.OCCURRENCE_AQUAMAPS.name(), "(OCCURRENCE_AQUAMAPS) https://data.d4science.net/Un3H");
linksMap.put(TableTemplates.OCCURRENCE_SPECIES.name(), "(OCCURRENCE_SPECIES) https://data.d4science.net/wKRW");
linksMap.put(TableTemplates.MINMAXLAT.name(), "(MINMAXLAT) https://data.d4science.net/fPFQ");
linksMap.put(TableTemplates.TRAININGSET.name(), "(TRAININGSET) https://data.d4science.net/NgvU");
linksMap.put(TableTemplates.TESTSET.name(), "(TESTSET) https://data.d4science.net/EpjE");
linksMap.put(TableTemplates.CLUSTER.name(), "(CLUSTER) https://data.d4science.net/XN7z");
linksMap.put(TableTemplates.TIMESERIES.name(), "(TIMESERIES) https://data.d4science.net/5g4F");
linksMap.put(TableTemplates.GENERIC.name(), "(GENERIC) A generic comma separated csv file in UTF-8 encoding");
}


@@ -13,16 +13,15 @@ import java.util.List;
import java.util.Map;
import java.util.UUID;
import org.gcube.common.homelibrary.home.Home;
import org.gcube.common.homelibrary.home.HomeLibrary;
import org.gcube.common.homelibrary.home.HomeManager;
import org.gcube.common.homelibrary.home.HomeManagerFactory;
import org.gcube.common.homelibrary.home.User;
import org.gcube.common.homelibrary.home.workspace.Workspace;
import org.gcube.common.homelibrary.home.workspace.WorkspaceFolder;
import org.gcube.common.homelibrary.home.workspace.WorkspaceItem;
import org.gcube.common.homelibrary.home.workspace.folder.FolderItem;
import org.gcube.common.homelibrary.util.WorkspaceUtil;
import org.gcube.common.authorization.library.provider.AuthorizationProvider;
import org.gcube.common.storagehub.client.dsl.FileContainer;
import org.gcube.common.storagehub.client.dsl.FolderContainer;
import org.gcube.common.storagehub.client.dsl.ItemContainer;
import org.gcube.common.storagehub.client.dsl.StorageHubClient;
import org.gcube.common.storagehub.model.Metadata;
import org.gcube.common.storagehub.model.exceptions.ItemLockedException;
import org.gcube.common.storagehub.model.items.GCubeItem;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.contentmanagement.lexicalmatcher.utils.FileTools;
import org.gcube.dataanalysis.ecoengine.configuration.AlgorithmConfiguration;
import org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mapping.AbstractEcologicalEngineMapper;
@@ -64,12 +63,20 @@ public class DataspaceManager implements Runnable {
public static String operator = "operator_name";
public static String payload = "payload";
public DataspaceManager(AlgorithmConfiguration config, ComputationData computation, List<StoredData> inputData, List<StoredData> outputData, List<File> generatedFiles) {
private String statusComputationName;
private static final String STATUS_POSTFIX = "-STATUS";
public DataspaceManager(AlgorithmConfiguration config, ComputationData computation, List<StoredData> inputData,
List<StoredData> outputData, List<File> generatedFiles) {
this.config = config;
this.computation = computation;
this.inputData = inputData;
this.outputData = outputData;
this.generatedFiles = generatedFiles;
this.statusComputationName = this.computation.id + STATUS_POSTFIX;
LOGGER.debug("DataspaceManager [config=" + config + ", computation=" + computation + ", inputData=" + inputData
+ ", outputData=" + outputData + ", generatedFiles=" + generatedFiles + "]");
}
public void run() {
@@ -88,78 +95,102 @@
}
public void createFoldersNetwork(Workspace ws, WorkspaceFolder root) throws Exception {
public FolderContainer createFoldersNetwork() throws Exception {
LOGGER.debug("Dataspace->Creating folders for DataMiner");
// manage folders: create the folders network
if (!ws.exists(dataminerFolder, root.getId())) {
LOGGER.debug("Dataspace->Creating DataMiner main folder");
root.createFolder(dataminerFolder, "A folder collecting DataMiner experiments data and computation information");
((WorkspaceFolder) root.find(dataminerFolder)).setSystemFolder(true);
}
WorkspaceFolder dataminerFolderWS = (WorkspaceFolder) root.find(dataminerFolder);
StorageHubClient shc = new StorageHubClient();
if (!ws.exists(importedDataFolder, dataminerFolderWS.getId())) {
FolderContainer root = shc.getWSRoot();
List<ItemContainer<? extends Item>> dataminerItems = root.findByName(dataminerFolder).getContainers();
FolderContainer dataminerFolderContainer;
// manage folders: create the folders network
if (dataminerItems.isEmpty()) {
LOGGER.debug("Dataspace->Creating DataMiner main folder");
dataminerFolderContainer = root.newFolder(dataminerFolder,
"A folder collecting DataMiner experiments data and computation information");
// ((WorkspaceFolder)
// root.find(dataminerFolder)).setSystemFolder(true);
} else if (dataminerItems.size() > 1)
throw new Exception("found more than one dataminer folder (impossible!!!)");
else
dataminerFolderContainer = (FolderContainer) dataminerItems.get(0);
if (dataminerFolderContainer.findByName(importedDataFolder).getContainers().isEmpty()) {
LOGGER.debug("Dataspace->Creating DataMiner imported data folder");
dataminerFolderWS.createFolder(importedDataFolder, "A folder collecting DataMiner imported data");
dataminerFolderContainer.newFolder(importedDataFolder, "A folder collecting DataMiner imported data");
}
if (!ws.exists(computedDataFolder, dataminerFolderWS.getId())) {
if (dataminerFolderContainer.findByName(computedDataFolder).getContainers().isEmpty()) {
LOGGER.debug("Dataspace->Creating DataMiner computed data folder");
dataminerFolderWS.createFolder(computedDataFolder, "A folder collecting DataMiner computed data");
dataminerFolderContainer.newFolder(computedDataFolder, "A folder collecting DataMiner computed data");
}
if (!ws.exists(computationsFolder, dataminerFolderWS.getId())) {
if (dataminerFolderContainer.findByName(computationsFolder).getContainers().isEmpty()) {
LOGGER.debug("Dataspace->Creating DataMiner computations folder");
dataminerFolderWS.createFolder(computationsFolder, "A folder collecting DataMiner computations information");
dataminerFolderContainer.newFolder(computationsFolder,
"A folder collecting DataMiner computations information");
}
return dataminerFolderContainer;
}
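The rewritten `createFoldersNetwork` is a find-or-create walk: each folder is looked up by name under its parent and created only when the lookup comes back empty, so repeated runs are idempotent. The control flow, abstracted away from the StorageHub client types (the `Folder` stand-in below is purely illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FolderNetwork {
    /** Minimal stand-in for a remote folder: named children plus a description. */
    public static class Folder {
        public final Map<String, Folder> children = new LinkedHashMap<>();
        public final String description;
        public Folder(String description) { this.description = description; }
    }

    /** Find-or-create, mirroring the empty-containers check before newFolder above. */
    public static Folder ensureChild(Folder parent, String name, String description) {
        Folder existing = parent.children.get(name);
        if (existing != null)
            return existing;                 // already provisioned: reuse it
        Folder created = new Folder(description);
        parent.children.put(name, created);  // create exactly once
        return created;
    }

    public static void main(String[] args) {
        Folder root = new Folder("workspace root");
        Folder dm = ensureChild(root, "DataMiner", "experiments data and computation information");
        ensureChild(dm, "ImportedData", "imported data");
        ensureChild(dm, "ComputedData", "computed data");
        ensureChild(dm, "Computations", "computations information");
    }
}
```

The real method additionally fails fast when the by-name lookup returns more than one match, since duplicate top-level folders would make the walk ambiguous.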
public String uploadData(StoredData data, WorkspaceFolder wsFolder) throws Exception {
return uploadData(data, wsFolder, true);
public String uploadData(StoredData data, FolderContainer destinationFolder) throws Exception {
return uploadData(data, destinationFolder, true);
}
public String uploadData(StoredData data, WorkspaceFolder wsFolder, boolean changename) throws Exception {
LOGGER.debug("Dataspace->Analysing " + data);
// String filenameonwsString = WorkspaceUtil.getUniqueName(data.name, wsFolder);
String filenameonwsString = data.name ;
public String uploadData(StoredData data, FolderContainer destinationFolder, boolean changename) throws Exception {
LOGGER.debug("Dataspace->Uploaddata:"
+ " [data={},destinationFolder={}, changename={}] ", data, destinationFolder,
changename);
// String filenameonwsString = WorkspaceUtil.getUniqueName(data.name,
// wsFolder);
String filenameonwsString = data.name;
if (changename)
filenameonwsString = data.name + "_[" + data.computationId + "]"+getExtension(data.payload, data.type);// ("_"+UUID.randomUUID()).replace("-", "");
filenameonwsString = String.format("%s_(%s)%s", data.name, data.computationId, getExtension(data.payload));
InputStream in = null;
String url = "";
try {
long size = 0;
if (data.type.equals("text/csv")||data.type.equals("application/d4science")||data.type.equals("image/png")) {
// long size = 0;
if (data.type.equals("text/csv") || data.type.equals("application/d4science")
|| data.type.equals("image/png")) {
if (new File(data.payload).exists() || !data.payload.startsWith("http")) {
LOGGER.debug("Dataspace->Uploading file " + data.payload);
LOGGER.debug("Dataspace->Uploading file {}", data.payload);
in = new FileInputStream(new File(data.payload));
size = new File(data.payload).length();
// size = new File(data.payload).length();
} else {
LOGGER.debug("Dataspace->Uploading via URL " + data.payload);
LOGGER.debug("Dataspace->Uploading via URL {}", data.payload);
int tries = 10;
for (int i=0;i<tries;i++){
for (int i = 0; i < tries; i++) {
try {
URL urlc = new URL(data.payload);
url = urlc.toString();
HttpURLConnection urlConnection = (HttpURLConnection) urlc.openConnection();
urlConnection.setConnectTimeout(10000);
urlConnection.setReadTimeout(10000);
in = new BufferedInputStream(urlConnection.getInputStream());
}catch(Exception ee){
LOGGER.warn("Dataspace->Retrying connection to {} number {} ",data.payload,(i+1),ee);
in =null;
} catch (Exception ee) {
LOGGER.warn("Dataspace->Retrying connection to {} number {} ", data.payload, (i + 1), ee);
in = null;
}
if (in!=null)
if (in != null)
break;
else
Thread.sleep(10000);
}
}
if (in==null)
throw new Exception("Impossible to open stream from "+data.payload);
if (in == null)
throw new Exception("Impossible to open stream from " + data.payload);
// LOGGER.debug("Dataspace->final file name on ws " + data.name+" description "+data.description);
LOGGER.debug("Dataspace->WS OP saving the following file on the WS " + filenameonwsString);
LinkedHashMap<String, String> properties = new LinkedHashMap<String, String>();
// LOGGER.debug("Dataspace->final file name on ws " +
// data.name+" description "+data.description);
LOGGER.debug("Dataspace->WS OP saving the following file on the WS: " + filenameonwsString);
Map<String, Object> properties = new LinkedHashMap<String, Object>();
properties.put(computation_id, data.computationId);
properties.put(hostname, WPSConfig.getInstance().getWPSConfig().getServer().getHostname());
@@ -172,238 +203,231 @@
properties.put(data_type, data.type);
properties.put(payload, url);
FolderItem fileItem = WorkspaceUtil.createExternalFile(wsFolder, filenameonwsString, data.description, in,properties,data.type,size);
//fileItem.getProperties().addProperties(properties);
LOGGER.debug("Dataspace->WS OP file saved on the WS " + filenameonwsString);
FileContainer fileContainer = destinationFolder.uploadFile(in, filenameonwsString, data.description);
LOGGER.debug("Dataspace->WS OP file uploaded on WS: " + filenameonwsString);
Metadata metadata = new Metadata(properties);
fileContainer.setMetadata(metadata);
url = fileItem.getPublicLink(false);
LOGGER.debug("Dataspace->WS OP file set metadata: " + metadata);
url = fileContainer.getPublicLink().toString();
LOGGER.debug("Dataspace->WS OP url produced for the file " + url);
data.payload = url;
try {
in.close();
} catch (Exception e) {
LOGGER.debug("Dataspace->Error creating file " + e.getMessage());
//LOGGER.debug(e);
LOGGER.debug("Dataspace->Error creating file {}", e.getMessage());
// LOGGER.debug(e);
}
LOGGER.debug("Dataspace->File created " + filenameonwsString);
LOGGER.debug("Dataspace->File created {}", filenameonwsString);
} else {
LOGGER.debug("Dataspace->String parameter " + data.payload);
LOGGER.debug("Dataspace->String parameter {}", data.payload);
url = data.payload;
}
} catch (Throwable e) {
LOGGER.error("Dataspace->Could not retrieve input payload {} ",data.payload,e);
//LOGGER.debug(e);
LOGGER.error("Dataspace->Could not retrieve input payload {} ", data.payload, e);
// LOGGER.debug(e);
url = "payload was not made available for this dataset";
data.payload = url;
}
return url;
}
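The URL branch of `uploadData` retries the HTTP connection up to 10 times, sleeping 10 s between failed attempts, and only throws after the loop is exhausted. That control flow can be isolated into a small generic helper (a sketch of the same pattern; the real code inlines it):

```java
import java.util.concurrent.Callable;

public class Retry {
    /** Runs the action up to maxTries times, pausing pauseMillis between failures. */
    public static <T> T withRetries(Callable<T> action, int maxTries, long pauseMillis)
            throws Exception {
        Exception last = null;
        for (int i = 0; i < maxTries; i++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e;                          // remember the failure and retry
                if (i < maxTries - 1)
                    Thread.sleep(pauseMillis);     // back off before the next attempt
            }
        }
        throw new Exception("Impossible to open stream", last);
    }
}
```

With this helper the original loop would read as a single call, e.g. `withRetries(openStream, 10, 10000)`, keeping the "warn, null the stream, sleep, retry" bookkeeping out of the upload logic.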
public List<String> uploadInputData(List<StoredData> inputData, FolderContainer dataminerFolder) throws Exception {
LOGGER.debug("Dataspace->uploading input data; Number of data: {}", inputData.size());
FolderContainer destinationFolder = (FolderContainer) dataminerFolder.findByName(importedDataFolder)
.getContainers().get(0);
List<String> urls = new ArrayList<String>();
for (StoredData input : inputData) {
List<ItemContainer<? extends Item>> items = null;
if (input.type.equals("text/csv") || input.type.equals("application/d4science")
|| input.type.equals("image/png"))
items = destinationFolder.findByName(input.name).getContainers();
if (items == null || items.isEmpty()) {
String url = uploadData(input, destinationFolder, false);
LOGGER.debug("Dataspace->returning property {}", url);
urls.add(url);
} else {
FileContainer item = (FileContainer) items.get(0);
LOGGER.debug("Dataspace->Input item {} is already available in the input folder", input.name);
String url = item.getPublicLink().toString();
LOGGER.debug("Dataspace->returning WS url {}", url);
urls.add(url);
}
}
LOGGER.debug("Dataspace->finished uploading input data");
return urls;
}
public List<String> uploadOutputData(List<StoredData> outputData, FolderContainer dataminerFolder)
throws Exception {
LOGGER.debug("Dataspace->uploading output data; Number of data: {}", outputData.size());
FolderContainer destinationFolder = (FolderContainer) dataminerFolder.findByName(computedDataFolder)
.getContainers().get(0);
List<String> urls = new ArrayList<String>();
for (StoredData output : outputData) {
String url = uploadData(output, destinationFolder);
urls.add(url);
}
LOGGER.debug("Dataspace->finished uploading output data");
return urls;
}
public void uploadComputationData(ComputationData computation, List<StoredData> inputData,
List<StoredData> outputData, FolderContainer dataminerFolder) throws Exception {
LOGGER.debug("Dataspace->uploading computation data");
FolderContainer computationContainer = (FolderContainer) dataminerFolder.findByName(computationsFolder)
.getContainers().get(0);
// create a folder in here
LOGGER.debug("Dataspace->Creating computation folder " + computation.id);
String cfoldername = computation.id;
FolderContainer newcomputationFolder = null;
try {
newcomputationFolder = computationContainer.newFolder(cfoldername, computation.operatorDescription);
} catch (java.lang.ClassCastException e) {
LOGGER.debug("Dataspace->concurrency exception - deleting remaining item");
deleteRunningComputationData();
newcomputationFolder = computationContainer.newFolder(cfoldername, computation.operatorDescription);
}
// String itemType = "COMPUTATION";
// create IO folders
LOGGER.debug("Dataspace->creating IO folders under " + cfoldername);
newcomputationFolder.newFolder(importedDataFolder, importedDataFolder);
newcomputationFolder.newFolder(computedDataFolder, computedDataFolder);
// copy IO in those folders
LOGGER.debug("Dataspace->*****uploading inputs in IO folder*****");
List<String> inputurls = uploadInputData(inputData, newcomputationFolder);
LOGGER.debug("Dataspace->*****uploading outputs in IO folder*****");
List<String> outputurls = uploadOutputData(outputData, newcomputationFolder);
LOGGER.debug("Dataspace->*****adding properties to the folder*****");
LOGGER.debug("Dataspace->creating Folder Properties");
// write a computation item for the computation
Map<String, Object> properties = new LinkedHashMap<String, Object>();
properties.put(computation_id, computation.id);
properties.put(hostname, WPSConfig.getInstance().getWPSConfig().getServer().getHostname());
properties.put(vre, computation.vre);
properties.put(operator_name, config.getAgent());
properties.put(operator_id, computation.operatorId);
properties.put(operator_description, computation.operatorDescription);
properties.put(start_date, computation.startDate);
properties.put(end_date, computation.endDate);
properties.put(status, getStatus(computation.status));
properties.put(execution_platform, computation.infrastructure);
int ninput = inputurls.size();
int noutput = outputurls.size();
LOGGER.debug("Dataspace->Adding input properties for " + ninput + " inputs");
for (int i = 1; i <= ninput; i++) {
StoredData input = inputData.get(i - 1);
if (input.payload.contains("|")) {
String payload = input.payload;
LOGGER.debug("Dataspace->Managing complex input {} : {}", input.name, payload);
// delete the names that are not useful
for (StoredData subinput : inputData) {
if (input.description.equals(subinput.description)) {
payload = payload.replace(subinput.name, subinput.payload);
subinput.name = null;
}
}
input.name = null;
// delete last pipe character
if (payload.endsWith("|"))
payload = payload.substring(0, payload.length() - 1);
LOGGER.debug("Dataspace->Complex input after processing " + payload);
properties.put("input" + i + "_" + input.description, payload);
input.payload = payload;
}
}
for (int i = 1; i <= ninput; i++) {
StoredData input = inputData.get(i - 1);
if (input.name != null) {
properties.put(String.format("input%d_%s", i, input.name), inputurls.get(i - 1));
}
}
LOGGER.debug("Dataspace->Adding output properties for " + noutput + " outputs");
for (int i = 1; i <= noutput; i++) {
properties.put(String.format("output%d_%s", i, outputData.get(i - 1).name), outputurls.get(i - 1));
}
LOGGER.debug("Dataspace->Properties of the folder: {} ", properties);
LOGGER.debug("Dataspace->Saving properties to ProvO XML file {} outputs", noutput);
/*
* XStream xstream = new XStream(); String xmlproperties =
* xstream.toXML(properties);
*/
try {
String xmlproperties = ProvOGenerator.toProvO(computation, inputData, outputData);
File xmltosave = new File(config.getPersistencePath(), "prov_o_" + UUID.randomUUID());
FileTools.saveString(xmltosave.getAbsolutePath(), xmlproperties, true, "UTF-8");
try (InputStream sis = new FileInputStream(xmltosave)) {
newcomputationFolder.uploadFile(sis, computation.id + ".xml", computation.operatorDescription);
}
xmltosave.delete();
} catch (Exception e) {
LOGGER.error("Dataspace->Failed creating ProvO XML file ", e);
}
/*
* List<String> scopes = new ArrayList<String>();
* scopes.add(config.getGcubeScope()); ws.createGcubeItem(computation.id,
* computation.operatorDescription, scopes, computation.user, itemType,
* properties, newcomputationFolder.getId());
*/
newcomputationFolder.setMetadata(new Metadata(properties));
LOGGER.debug("Dataspace->finished uploading computation data");
}
public String buildCompositePayload(List<StoredData> inputData, String payload, String inputName) {
for (StoredData input : inputData) {
if (inputName.equals(input.description)) {
payload = payload.replace(input.name, input.payload);
}
}
return payload;
}
public void writeProvenance(ComputationData computation, List<StoredData> inputData, List<StoredData> outputData)
throws Exception {
LOGGER.debug("Dataspace->create folders network");
FolderContainer dataminerFolder = createFoldersNetwork();
LOGGER.debug("Dataspace->****uploading input files****");
uploadInputData(inputData, dataminerFolder);
LOGGER.debug("Dataspace->****uploading output files****");
uploadOutputData(outputData, dataminerFolder);
LOGGER.debug("Dataspace->****uploading computation files****");
uploadComputationData(computation, inputData, outputData, dataminerFolder);
LOGGER.debug("Dataspace->provenance management finished");
LOGGER.debug("Dataspace->deleting generated files");
AbstractEcologicalEngineMapper.deleteGeneratedFiles(generatedFiles);
try {
deleteRunningComputationData();
} catch (Exception e) {
LOGGER.debug("Dataspace->impossible to delete running computation : {} ", e.getMessage());
}
// LOGGER.debug("Dataspace->updating computation status");
// LOGGER.debug("Dataspace->connecting to Workspace");
FolderContainer folderContainer = createFoldersNetwork();
FolderContainer computationsContainer = (FolderContainer) folderContainer.findByName(computationsFolder)
.getContainers().get(0);
// LOGGER.debug("Dataspace->Creating computation item " +
// computation.id+" with status"+computation.status);
String itemType = "COMPUTATION";
// write a computation item for the computation
Map<String, Object> properties = new LinkedHashMap<String, Object>();
properties.put(computation_id, computation.id);
properties.put(hostname, WPSConfig.getInstance().getWPSConfig().getServer().getHostname());
properties.put(vre, computation.vre);
List<String> scopes = new ArrayList<String>();
scopes.add(config.getGcubeScope());
// TODO: update gcubeItem not recreate it...
GCubeItem gcubeItem = new GCubeItem();
gcubeItem.setName(this.statusComputationName);
gcubeItem.setDescription(computation.operatorDescription);
gcubeItem.setScopes(scopes.toArray(new String[scopes.size()]));
gcubeItem.setItemType(itemType);
gcubeItem.setMetadata(new Metadata(properties));
gcubeItem.setCreator(AuthorizationProvider.instance.get().getClient().getId());
computationsContainer.newGcubeItem(gcubeItem);
LOGGER.debug("Dataspace->finished uploading computation data");
}
public String getStatus(String status) {
double statusD = 0;
try {
statusD = Double.parseDouble(status);
} catch (Exception e) {
return status;
}
if (statusD == 100)
return "completed";
else if (statusD == -2)
return "error";
else if (statusD == -1)
return "cancelled";
else
return status;
LOGGER.debug("Dataspace->deleting computation item");
LOGGER.debug("Dataspace->connecting to Workspace");
StorageHubClient shc = new StorageHubClient();
FolderContainer dataminerContainer = (FolderContainer) shc.getWSRoot().findByName(dataminerFolder)
.getContainers().get(0);
FolderContainer computationContainer = (FolderContainer) dataminerContainer.findByName(computationsFolder)
.getContainers().get(0);
LOGGER.debug("Dataspace->removing computation data");
List<ItemContainer<? extends Item>> wi = computationContainer.findByName(this.statusComputationName)
.getContainers();
if (!wi.isEmpty()) {
for (ItemContainer<? extends Item> container : wi) {
boolean retry = false;
do {
try {
container.forceDelete();
retry = false;
} catch (ItemLockedException e) {
LOGGER.warn("item locked, retrying");
Thread.sleep(1000);
retry = true;
}
} while (retry);
}
} else
LOGGER.debug("Dataspace->Warning Could not find {} under {}", this.statusComputationName,
computationContainer.get().getName());
/*
 * TODO: ASK GIANPAOLO int maxtries = 3; int i =1; while
 * (ws.exists(computation.id,computationsFolderWs.getId()) && i<maxtries){
 * LOGGER.debug("Dataspace->computation data still exist... retrying "+i );
 * Thread.sleep(1000); computationsFolderWs.find(computation.id).remove(); i++;
 * }
 */
LOGGER.debug("Dataspace->finished removing computation data ");
}
// TODO
public static String getExtension(String payload) {
LOGGER.debug("DataSpace->Get Extension from: " + payload);
String extension = "";
if (payload.toLowerCase().startsWith("http")) {
try {
URL obj = new URL(payload);
URLConnection conn = obj.openConnection();
// get all headers
Map<String, List<String>> map = conn.getHeaderFields();
for (Map.Entry<String, List<String>> entry : map.entrySet()) {
String value = entry.getValue().toString();
LOGGER.debug("Header value: " + value);
if (value.toLowerCase().contains("filename")) {
LOGGER.debug("DataSpace->Searching in http header: found " + value);
extension = value.substring(value.lastIndexOf("."), value.lastIndexOf("\""));
break;
}
}
conn.getInputStream().close();
} catch (Exception e) {
LOGGER.warn("DataSpace->Error in the payload http link ", e);
}
} else {
File paylFile = new File(payload);
if (paylFile.exists()) {
String paylname = paylFile.getName();
extension = paylname.substring(paylname.lastIndexOf("."));
}
}
LOGGER.debug("DataSpace->Extension retrieved: " + extension);
return extension;
}
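The forceDelete loop in deleteComputationData above retries for as long as the item stays locked; a minimal, self-contained sketch of the same retry-on-lock pattern with a bounded number of attempts (the ItemLockedException type and all names below are redeclared here for illustration only and are not the storagehub-client API):

```java
// Illustrative re-implementation of the retry-on-lock deletion pattern;
// the exception type is declared locally, not imported from storagehub.
public class RetryOnLock {
    static class ItemLockedException extends Exception {}

    interface Deletion {
        void run() throws ItemLockedException;
    }

    // Retries the deletion until it succeeds or maxTries is exhausted,
    // sleeping between attempts as the original loop does.
    static boolean deleteWithRetry(Deletion d, int maxTries, long waitMillis)
            throws InterruptedException {
        for (int i = 0; i < maxTries; i++) {
            try {
                d.run();
                return true;
            } catch (ItemLockedException e) {
                Thread.sleep(waitMillis);
            }
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulate an item that stays locked for the first two attempts.
        int[] calls = {0};
        boolean ok = deleteWithRetry(() -> {
            if (calls[0]++ < 2) throw new ItemLockedException();
        }, 5, 1);
        System.out.println(ok + " after " + calls[0] + " attempts"); // prints: true after 3 attempts
    }
}
```

Bounding the attempts avoids the unbounded spin the original do/while allows if an item never unlocks.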


import java.io.StringReader;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;

src/main/resources/templates/classtemplate.properties Normal file → Executable file

src/main/resources/templates/setup.cfg Normal file → Executable file

maxcomputations=1
saveond4sstorage=true
simulationMode=false
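The setup.cfg template above is a plain key=value file; presumably it is read with java.util.Properties at startup. A minimal sketch under that assumption (the SetupConfig class and its method names are hypothetical, not part of this repository):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Hypothetical reader for a setup.cfg fragment like the template above.
public class SetupConfig {

    // Parses key=value lines (the java.util.Properties format setup.cfg uses).
    public static Properties parse(String cfg) throws IOException {
        Properties p = new Properties();
        p.load(new StringReader(cfg));
        return p;
    }

    // Falls back to 1, the value shipped in the template, when the key is absent.
    public static int maxComputations(Properties p) {
        return Integer.parseInt(p.getProperty("maxcomputations", "1"));
    }

    public static void main(String[] args) throws IOException {
        Properties p = parse("maxcomputations=1\nsaveond4sstorage=true\nsimulationMode=false\n");
        System.out.println(maxComputations(p)); // prints 1
    }
}
```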


xsi:schemaLocation="http://www.opengis.net/wps/1.0.0 http://schemas.opengis.net/wps/1.0.0/wpsGetCapabilities_response.xsd"
updateSequence="1">
<ows:ServiceIdentification>
<ows:Title>D4Science DataMiner Service</ows:Title>
<ows:Abstract>Service based on the 52°North implementation of WPS 1.0.0</ows:Abstract>
<ows:Keywords>
<ows:Keyword>WPS</ows:Keyword>
<ows:AccessConstraints>NONE</ows:AccessConstraints>
</ows:ServiceIdentification>
<ows:ServiceProvider>
<ows:ProviderName>D4Science</ows:ProviderName>
<ows:ProviderSite xlink:href="https://www.d4science.org" />
<ows:ServiceContact>
<ows:IndividualName>Gianpaolo Coro</ows:IndividualName>
<ows:PositionName>Researcher</ows:PositionName>
<ows:AdministrativeArea>Istituto di Scienza e Tecnologie dell'Informazione A. Faedo</ows:AdministrativeArea>
<ows:PostalCode>56124</ows:PostalCode>
<ows:Country>Italy</ows:Country>
<ows:ElectronicMailAddress>dataminer-managers@d4science.org</ows:ElectronicMailAddress>
</ows:Address>
</ows:ContactInfo>
</ows:ServiceContact>
<ows:Operation name="GetCapabilities">
<ows:DCP>
<ows:HTTP>
<ows:Get xlink:href="#PROTOCOL#://#HOST#:#PORT#/wps/WebProcessingService?" />
<ows:Post xlink:href="#PROTOCOL#://#HOST#:#PORT#/wps/WebProcessingService?" />
</ows:HTTP>
</ows:DCP>
</ows:Operation>
<ows:Operation name="DescribeProcess">
<ows:DCP>
<ows:HTTP>
<ows:Get xlink:href="#PROTOCOL#://#HOST#:#PORT#/wps/WebProcessingService?" />
<ows:Post xlink:href="#PROTOCOL#://#HOST#:#PORT#/wps/WebProcessingService?" />
</ows:HTTP>
</ows:DCP>
</ows:Operation>
<ows:Operation name="Execute">
<ows:DCP>
<ows:HTTP>
<ows:Get xlink:href="#PROTOCOL#://#HOST#:#PORT#/wps/WebProcessingService?" />
<ows:Post xlink:href="#PROTOCOL#://#HOST#:#PORT#/wps/WebProcessingService?" />
</ows:HTTP>
</ows:DCP>
</ows:Operation>
</ows:OperationsMetadata>
<wps:ProcessOfferings>#PROCESSES#</wps:ProcessOfferings>
<wps:Languages>
<wps:Default>
<ows:Language>en-GB</ows:Language>


public class AlgorithmTest {
List<String> executeOnly = Arrays.asList("#BIONYM", "#AQUAMAPS_SUITABLE", "#AQUAMAPS_SUITABLE 21 sp", "#BIONYM1024", "#CMSY2");
@Test
public void executeAlgorithmsFromFile() throws Exception{
String env = "dev";
/*Properties prop = new Properties();
prop.load(AlgorithmTest.class.getResourceAsStream("/test_params.properties"));
*/
String protocol = "http";
String hostname = "dataminer-genericworkers.d4science.org";
String token = "257800d8-24bf-4bae-83cd-ea99369e7dd6-843339462";
String layerID = "08ee0d70-4d8b-4f42-8b06-d709482bca95";
Iterator<String> uris = getUrisIterator();
HttpClient client = new HttpClient();
if (nextLine.startsWith("#"))
algorithmName = nextLine;
else{
if (executeOnly.contains(algorithmName)) continue;
String callUrl = nextLine.replace("{PROTOCOL}", protocol).replace("{HOST}", hostname).replace("{TOKEN}", token).replace("{LAYERID}", layerID);
try{
long start = System.currentTimeMillis();


public class MultiThreadingCalls {
// http://statistical-manager-new.d4science.org:8080/wps/WebProcessingService?request=Execute&service=WPS&version=1.0.0&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIONYM_LOCAL&DataInputs=scope=/gcube/devsec/devVRE;user.name=tester;SpeciesAuthorName=Gadus%20morhua;Taxa_Authority_File=ASFIS;Parser_Name=SIMPLE;Activate_Preparsing_Processing=true;Use_Stemmed_Genus_and_Species=false;Accuracy_vs_Speed=MAX_ACCURACY;Matcher_1=GSAy;Threshold_1=0.6;MaxResults_1=10
// final URL urlToCall = new URL("http://"+host+":8080/wps/WebProcessingService?request=Execute&service=WPS&version=1.0.0&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.modellers.FEED_FORWARD_ANN&DataInputs=" +
// URLEncoder.encode("scope=/gcube/devsec;user.name=test.user;LayersNeurons=10|10;LearningThreshold=0.01;MaxIterations=100;ModelName=wps_ann;Reference=1;TargetColumn=depthmean;TrainingColumns=depthmin|depthmax;TrainingDataSet=http://goo.gl/juNsCK@MimeType=text/csv","UTF-8"));
/*
final URL urlToCall = new URL("http://"+host+"/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token=4ccc2c35-60c9-4c9b-9800-616538d5d48b&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.XMEANS&DataInputs=" +
URLEncoder.encode("OccurrencePointsClusterLabel=OccClustersTest;min_points=1;maxIterations=100;minClusters=1;maxClusters=3;OccurrencePointsTable=http://goo.gl/VDzpch;FeaturesColumnNames=depthmean|sstmnmax|salinitymean;","UTF-8"));
*/
final URL urlToCall = new URL("http://dataminer1-devnext.d4science.org/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token=f9d49d76-cd60-48ed-9f8e-036bcc1fc045-98187548&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.SUBMITQUERY&DataInputs=" +
URLEncoder.encode("DatabaseName=fishbase;Query=select * from food limit 100;Apply Smart Correction=false;Language=POSTGRES;ResourceName=FishBase;Read-Only Query=true;","UTF-8"));


public class TestMappedEvaluators {
public void testQualityAnalysis() throws Exception{
HRS algorithm = new HRS();
// http://localhost:8080/wps/WebProcessingService?request=Execute&service=WPS&version=1.0.0&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.QUALITY_ANALYSIS&DataInputs=user.name=test.user;scope=/gcube/devsec;NegativeCasesTableKeyColumn=csquarecode;DistributionTableProbabilityColumn=probability;PositiveCasesTableKeyColumn=csquarecode;PositiveThreshold=0.8;PositiveCasesTable=http://goo.gl/DEYAbT;DistributionTableKeyColumn=csquarecode;DistributionTable=http://goo.gl/DEYAbT;NegativeThreshold=0.3;NegativeCasesTable=http://goo.gl/DEYAbT;
// algorithm.setScope("/gcube/devsec");

src/test/resources/AlgorithmTestURIs.txt Normal file → Executable file

{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ESRI_GRID_EXTRACTION&DataInputs=Layer={LAYERID};YResolution=0.5;XResolution=0.5;BBox_LowerLeftLong=-50;BBox_UpperRightLat=60;BBox_LowerLeftLat=-60;BBox_UpperRightLong=50;Z=0;TimeIndex=0;
#DBSCAN
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.DBSCAN
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.DBSCAN&DataInputs=OccurrencePointsClusterLabel=OccClustersTest;epsilon=10;min_points=1;OccurrencePointsTable={PROTOCOL}://brokenlink/VDzpch;FeaturesColumnNames=depthmean|sstmnmax|salinitymean;
#KMEANS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.KMEANS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.KMEANS&DataInputs=OccurrencePointsClusterLabel=OccClustersTest;k=3;max_runs=100;min_points=1;max_optimization_steps=10;OccurrencePointsTable={PROTOCOL}://brokenlink/VDzpch;FeaturesColumnNames=depthmean|sstmnmax|salinitymean;
#LOF
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.LOF
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.LOF&DataInputs=FeaturesColumnNames=depthmean|sstmnmax|salinitymean;PointsClusterLabel=OccClustersTest;minimal_points_lower_bound=2;PointsTable={PROTOCOL}://brokenlink/VDzpch;minimal_points_upper_bound=10;distance_function=euclidian distance;lof_threshold=2;
#XMEANS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.XMEANS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.XMEANS&DataInputs=OccurrencePointsClusterLabel=OccClustersTest;min_points=1;maxIterations=100;minClusters=1;maxClusters=3;OccurrencePointsTable={PROTOCOL}://goo.gl/VDzpch;FeaturesColumnNames=depthmean|sstmnmax|salinitymean;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.clusterers.XMEANS&DataInputs=OccurrencePointsClusterLabel=OccClustersTest;min_points=1;maxIterations=100;minClusters=1;maxClusters=3;OccurrencePointsTable={PROTOCOL}://brokenlink/VDzpch;FeaturesColumnNames=depthmean|sstmnmax|salinitymean;
#BIONYM
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.generators.BIONYM
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.generators.BIONYM&DataInputs=Matcher_1=LEVENSHTEIN;Matcher_4=NONE;Matcher_5=NONE;Matcher_2=NONE;Matcher_3=NONE;Threshold_1=0.6;RawTaxaNamesTable={PROTOCOL}://goo.gl/N9e3pC;Threshold_2=0.6;Accuracy_vs_Speed=MAX_ACCURACY;MaxResults_2=10;MaxResults_1=10;Threshold_3=0.4;RawNamesColumn=species;Taxa_Authority_File=FISHBASE;Parser_Name=SIMPLE;OutputTableLabel=bionymwps;MaxResults_4=0;Threshold_4=0;MaxResults_3=0;MaxResults_5=0;Threshold_5=0;Use_Stemmed_Genus_and_Species=false;Activate_Preparsing_Processing=true;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.generators.BIONYM&DataInputs=Matcher_1=LEVENSHTEIN;Matcher_4=NONE;Matcher_5=NONE;Matcher_2=NONE;Matcher_3=NONE;Threshold_1=0.6;RawTaxaNamesTable={PROTOCOL}://brokenlink/N9e3pC;Threshold_2=0.6;Accuracy_vs_Speed=MAX_ACCURACY;MaxResults_2=10;MaxResults_1=10;Threshold_3=0.4;RawNamesColumn=species;Taxa_Authority_File=FISHBASE;Parser_Name=SIMPLE;OutputTableLabel=bionymwps;MaxResults_4=0;Threshold_4=0;MaxResults_3=0;MaxResults_5=0;Threshold_5=0;Use_Stemmed_Genus_and_Species=false;Activate_Preparsing_Processing=true;
#BIONYM_LOCAL
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIONYM_LOCAL
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIONYM_LOCAL&DataInputs=Matcher_1=LEVENSHTEIN;Matcher_4=NONE;Matcher_5=NONE;Matcher_2=NONE;Matcher_3=NONE;Threshold_1=0.6;Threshold_2=0.6;Accuracy_vs_Speed=MAX_ACCURACY;MaxResults_2=10;MaxResults_1=10;Threshold_3=0.4;Taxa_Authority_File=FISHBASE;Parser_Name=SIMPLE;MaxResults_4=0;Threshold_4=0;MaxResults_3=0;MaxResults_5=0;Threshold_5=0;Use_Stemmed_Genus_and_Species=false;Activate_Preparsing_Processing=true;SpeciesAuthorName=Gadus morhua
#ABSENCE_CELLS_FROM_AQUAMAPS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ABSENCE_CELLS_FROM_AQUAMAPS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ABSENCE_CELLS_FROM_AQUAMAPS&DataInputs=Number_of_Points=20;Table_Label=hcaf_filtered_wps;Aquamaps_HSPEC={PROTOCOL}://goo.gl/24XrmE;Take_Randomly=true;Species_Code=Fis-30189;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ABSENCE_CELLS_FROM_AQUAMAPS&DataInputs=Number_of_Points=20;Table_Label=hcaf_filtered_wps;Aquamaps_HSPEC={PROTOCOL}://brokenlink/24XrmE;Take_Randomly=true;Species_Code=Fis-30189;
#HCAF_FILTER
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.HCAF_FILTER
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.HCAF_FILTER&DataInputs=B_Box_Left_Lower_Lat=-17;B_Box_Right_Upper_Long=147;B_Box_Right_Upper_Lat=25;B_Box_Left_Lower_Long=89;Table_Label=wps_hcaf_filter;
#MAX_ENT_NICHE_MODELLING
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.MAX_ENT_NICHE_MODELLING
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.MAX_ENT_NICHE_MODELLING&DataInputs=LongitudeColumn=decimallongitude;LatitudeColumn=decimallatitude;Z=0;Layers={LAYERID};TimeIndex=0;MaxIterations=100;SpeciesName=Latimeria%20chalumnae;DefaultPrevalence=0.5;YResolution=0.5;OccurrencesTable={PROTOCOL}://goo.gl/5cnKKp;XResolution=0.5;OutputTableLabel=wps_maxent;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.MAX_ENT_NICHE_MODELLING&DataInputs=LongitudeColumn=decimallongitude;LatitudeColumn=decimallatitude;Z=0;Layers={LAYERID};TimeIndex=0;MaxIterations=100;SpeciesName=Latimeria%20chalumnae;DefaultPrevalence=0.5;YResolution=0.5;OccurrencesTable={PROTOCOL}://brokenlink/5cnKKp;XResolution=0.5;OutputTableLabel=wps_maxent;
#OCCURRENCE_ENRICHMENT
#{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.OCCURRENCE_ENRICHMENT
#{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.OCCURRENCE_ENRICHMENT&DataInputs=OptionalFilter=%20;OutputTableName=wps_enriched;FeaturesNames=temp;OccurrenceTable={PROTOCOL}://goo.gl/ZfFcfE;LongitudeColumn=decimallongitude;LatitudeColumn=decimallatitude;ScientificNameColumn=scientificname;Layers={LAYERID};TimeColumn=eventdate;Resolution=0.5;
#{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.OCCURRENCE_ENRICHMENT&DataInputs=OptionalFilter=%20;OutputTableName=wps_enriched;FeaturesNames=temp;OccurrenceTable={PROTOCOL}://brokenlink/ZfFcfE;LongitudeColumn=decimallongitude;LatitudeColumn=decimallatitude;ScientificNameColumn=scientificname;Layers={LAYERID};TimeColumn=eventdate;Resolution=0.5;
#PRESENCE_CELLS_GENERATION
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.PRESENCE_CELLS_GENERATION
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.PRESENCE_CELLS_GENERATION&DataInputs=Number_of_Points=20;Table_Label=hcaf_filtered_wps;Species_Code=Fis-30189;
#FAO_OCEAN_AREA_COLUMN_CREATOR
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.FAO_OCEAN_AREA_COLUMN_CREATOR
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.FAO_OCEAN_AREA_COLUMN_CREATOR&DataInputs=Resolution=5;Latitude_Column=decimallatitude;InputTable={PROTOCOL}://goo.gl/sdlD5a;Longitude_Column=decimallongitude;OutputTableName=wps_fao_area_column;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.FAO_OCEAN_AREA_COLUMN_CREATOR&DataInputs=Resolution=5;Latitude_Column=decimallatitude;InputTable={PROTOCOL}://brokenlink/sdlD5a;Longitude_Column=decimallongitude;OutputTableName=wps_fao_area_column;
#FAO_OCEAN_AREA_COLUMN_CREATOR_FROM_QUADRANT
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.FAO_OCEAN_AREA_COLUMN_CREATOR_FROM_QUADRANT
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.FAO_OCEAN_AREA_COLUMN_CREATOR_FROM_QUADRANT&DataInputs=Resolution=5;InputTable={PROTOCOL}://goo.gl/yJTIBZ;Longitude_Column=centerlong;Quadrant_Column=quadrant;OutputTableName=wps_fao_quadrant;Latitude_Column=centerlat;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.FAO_OCEAN_AREA_COLUMN_CREATOR_FROM_QUADRANT&DataInputs=Resolution=5;InputTable={PROTOCOL}://brokenlink/yJTIBZ;Longitude_Column=centerlong;Quadrant_Column=quadrant;OutputTableName=wps_fao_quadrant;Latitude_Column=centerlat;
#CSQUARE_COLUMN_CREATOR
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.CSQUARE_COLUMN_CREATOR
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.CSQUARE_COLUMN_CREATOR&DataInputs=CSquare_Resolution=0.1;Latitude_Column=decimallatitude;InputTable={PROTOCOL}://goo.gl/sdlD5a;Longitude_Column=decimallongitude;OutputTableName=wps_csquare_column;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.CSQUARE_COLUMN_CREATOR&DataInputs=CSquare_Resolution=0.1;Latitude_Column=decimallatitude;InputTable={PROTOCOL}://brokenlink/sdlD5a;Longitude_Column=decimallongitude;OutputTableName=wps_csquare_column;
#GENERIC_CHARTS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.GENERIC_CHARTS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.GENERIC_CHARTS&DataInputs=Quantities=fvalue;InputTable={PROTOCOL}://goo.gl/lWTvcw;TopElementsNumber=10;Attributes=x|y
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.GENERIC_CHARTS&DataInputs=Quantities=fvalue;InputTable={PROTOCOL}://brokenlink/lWTvcw;TopElementsNumber=10;Attributes=x|y
#GEO_CHART
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.GEO_CHART
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.GEO_CHART&DataInputs=Latitude=y;Quantities=x;Longitude=x;InputTable={PROTOCOL}://goo.gl/lWTvcw
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.GEO_CHART&DataInputs=Latitude=y;Quantities=x;Longitude=x;InputTable={PROTOCOL}://brokenlink/lWTvcw
#TIME_GEO_CHART
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_GEO_CHART
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_GEO_CHART&DataInputs=Time=time;Latitude=x;Longitude=y;Quantities=fvalue;InputTable={PROTOCOL}://goo.gl/lWTvcw;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_GEO_CHART&DataInputs=Time=time;Latitude=x;Longitude=y;Quantities=fvalue;InputTable={PROTOCOL}://brokenlink/lWTvcw;
#TIME_SERIES_CHARTS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_SERIES_CHARTS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_SERIES_CHARTS&DataInputs=Attributes=x|y|z;Quantities=fvalue;InputTable={PROTOCOL}://goo.gl/lWTvcw;Time=time
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_SERIES_CHARTS&DataInputs=Attributes=x|y|z;Quantities=fvalue;InputTable={PROTOCOL}://brokenlink/lWTvcw;Time=time
#SPECIES_OBSERVATION_MEOW_AREA_PER_YEAR
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.SPECIES_OBSERVATION_MEOW_AREA_PER_YEAR
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.SPECIES_OBSERVATION_MEOW_AREA_PER_YEAR&DataInputs=Selected species=Gadus morhua;End_year=2015;Start_year=2000;Area_type=NORTH SEA;
@@ -80,16 +80,16 @@
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.MOST_OBSERVED_TAXA&DataInputs=Level=GENUS;Taxa_number=10;End_year=2015;Start_year=2000;
#TIME_SERIES_ANALYSIS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_SERIES_ANALYSIS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_SERIES_ANALYSIS&DataInputs=FFT_Window_Samples=12;SSA_Points_to_Forecast=3;AggregationFunction=SUM;TimeSeriesTable={PROTOCOL}://goo.gl/lWTvcw;Sensitivity=LOW;SSA_Window_in_Samples=12;SSA_EigenvaluesThreshold=0.7;ValueColum=fvalue
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIME_SERIES_ANALYSIS&DataInputs=FFT_Window_Samples=12;SSA_Points_to_Forecast=3;AggregationFunction=SUM;TimeSeriesTable={PROTOCOL}://brokenlink/lWTvcw;Sensitivity=LOW;SSA_Window_in_Samples=12;SSA_EigenvaluesThreshold=0.7;ValueColum=fvalue
#MAPS_COMPARISON
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.MAPS_COMPARISON
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.MAPS_COMPARISON&DataInputs=TimeIndex_1=0;ValuesComparisonThreshold=0.1;TimeIndex_2=0;Z=0;KThreshold=0.5;Layer_1=483a4a32-729e-422b-b5e4-49f27ba93ec2;Layer_2=483a4a32-729e-422b-b5e4-49f27ba93ec2;
#QUALITY_ANALYSIS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.QUALITY_ANALYSIS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.QUALITY_ANALYSIS&DataInputs=NegativeCasesTableKeyColumn=csquarecode;DistributionTableProbabilityColumn=probability;PositiveCasesTableKeyColumn=csquarecode;PositiveThreshold=0.8;PositiveCasesTable={PROTOCOL}://goo.gl/8zWU7u;DistributionTableKeyColumn=csquarecode;DistributionTable={PROTOCOL}://goo.gl/cXbg2n;NegativeThreshold=0.3;NegativeCasesTable={PROTOCOL}://goo.gl/8zWU7u;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.QUALITY_ANALYSIS&DataInputs=NegativeCasesTableKeyColumn=csquarecode;DistributionTableProbabilityColumn=probability;PositiveCasesTableKeyColumn=csquarecode;PositiveThreshold=0.8;PositiveCasesTable={PROTOCOL}://brokenlink/8zWU7u;DistributionTableKeyColumn=csquarecode;DistributionTable={PROTOCOL}://brokenlink/cXbg2n;NegativeThreshold=0.3;NegativeCasesTable={PROTOCOL}://brokenlink/8zWU7u;
#DISCREPANCY_ANALYSIS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.DISCREPANCY_ANALYSIS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.DISCREPANCY_ANALYSIS&DataInputs=ComparisonThreshold=0.1;SecondTable={PROTOCOL}://goo.gl/cXbg2n;FirstTable={PROTOCOL}://goo.gl/BBk8iB;KThreshold=0.5;MaxSamples=10000;FirstTableProbabilityColumn=probability;SecondTableProbabilityColumn=probability;FirstTableCsquareColumn=csquarecode;SecondTableCsquareColumn=csquarecode
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.DISCREPANCY_ANALYSIS&DataInputs=ComparisonThreshold=0.1;SecondTable={PROTOCOL}://brokenlink/cXbg2n;FirstTable={PROTOCOL}://brokenlink/BBk8iB;KThreshold=0.5;MaxSamples=10000;FirstTableProbabilityColumn=probability;SecondTableProbabilityColumn=probability;FirstTableCsquareColumn=csquarecode;SecondTableCsquareColumn=csquarecode
#XYEXTRACTOR
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.XYEXTRACTOR
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.XYEXTRACTOR&DataInputs=OutputTableLabel=wps_xy_extractor;Layer={LAYERID};YResolution=0.5;XResolution=0.5;BBox_LowerLeftLong=-50;BBox_UpperRightLat=60;BBox_LowerLeftLat=-60;BBox_UpperRightLong=50;Z=0;TimeIndex=0;
@@ -101,22 +101,22 @@
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ZEXTRACTION&DataInputs=OutputTableLabel=wps_z_extractor;Layer={LAYERID};Resolution=100;Y=38;TimeIndex=0;X=28
#XYEXTRACTOR_TABLE
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.XYEXTRACTOR_TABLE
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.XYEXTRACTOR_TABLE&DataInputs=TimeIndex=0;Z=0;filter= ;zColumn=lme;timeColumn=lme;xColumn=centerlong;yColumn=centerlat;geoReferencedTableName={PROTOCOL}://goo.gl/KjWYQG;valueColumn=oceanarea;XResolution=0.5;YResolution=0.5;BBox_UpperRightLong=50;BBox_LowerLeftLat=-60;BBox_LowerLeftLong=-50;BBox_UpperRightLat=60;OutputTableLabel=wps_xy_extr_table;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.XYEXTRACTOR_TABLE&DataInputs=TimeIndex=0;Z=0;filter= ;zColumn=lme;timeColumn=lme;xColumn=centerlong;yColumn=centerlat;geoReferencedTableName={PROTOCOL}://brokenlink/KjWYQG;valueColumn=oceanarea;XResolution=0.5;YResolution=0.5;BBox_UpperRightLong=50;BBox_LowerLeftLat=-60;BBox_LowerLeftLong=-50;BBox_UpperRightLat=60;OutputTableLabel=wps_xy_extr_table;
#TIMEEXTRACTION_TABLE
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIMEEXTRACTION_TABLE
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIMEEXTRACTION_TABLE&DataInputs=Z=0;Resolution=0.5;filter= ;zColumn=lme;timeColumn=lme;xColumn=centerlong;yColumn=centerlat;Y=3.75;X=102.25;geoReferencedTableName={PROTOCOL}://goo.gl/VDzpch;valueColumn=centerlong;SamplingFreq=-1;OutputTableLabel=wps_time_extr_table;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.TIMEEXTRACTION_TABLE&DataInputs=Z=0;Resolution=0.5;filter= ;zColumn=lme;timeColumn=lme;xColumn=centerlong;yColumn=centerlat;Y=3.75;X=102.25;geoReferencedTableName={PROTOCOL}://brokenlink/VDzpch;valueColumn=centerlong;SamplingFreq=-1;OutputTableLabel=wps_time_extr_table;
#ZEXTRACTION_TABLE
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ZEXTRACTION_TABLE
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ZEXTRACTION_TABLE&DataInputs=TimeIndex=0;Resolution=1;filter= ;zColumn=centerlong;xColumn=centerlong;yColumn=centerlat;Y=0.25;X=0.25;geoReferencedTableName={PROTOCOL}://goo.gl/VDzpch;valueColumn=oceanarea;OutputTableLabel=wps_z_table;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ZEXTRACTION_TABLE&DataInputs=TimeIndex=0;Resolution=1;filter= ;zColumn=centerlong;xColumn=centerlong;yColumn=centerlat;Y=0.25;X=0.25;geoReferencedTableName={PROTOCOL}://brokenlink/VDzpch;valueColumn=oceanarea;OutputTableLabel=wps_z_table;
#HRS
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.HRS
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.HRS&DataInputs=PositiveCasesTable={PROTOCOL}://goo.gl/VDzpch;NegativeCasesTable={PROTOCOL}://goo.gl/VDzpch;OptionalCondition= ;ProjectingAreaTable={PROTOCOL}://goo.gl/VDzpch;FeaturesColumns=depthmin|depthmax;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.evaluators.HRS&DataInputs=PositiveCasesTable={PROTOCOL}://brokenlink/VDzpch;NegativeCasesTable={PROTOCOL}://brokenlink/VDzpch;OptionalCondition= ;ProjectingAreaTable={PROTOCOL}://brokenlink/VDzpch;FeaturesColumns=depthmin|depthmax;
#SGVM_INTERPOLATION
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.SGVM_INTERPOLATION
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.SGVM_INTERPOLATION&DataInputs=headingAdjustment=0;maxspeedThr=6;minspeedThr=2;fm=0.5;margin=10;distscale=20;res=100;sigline=0.2;interval=120;equalDist=true;InputFile={PROTOCOL}://goo.gl/i16kPw;npoints=10;method=cHs;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.SGVM_INTERPOLATION&DataInputs=headingAdjustment=0;maxspeedThr=6;minspeedThr=2;fm=0.5;margin=10;distscale=20;res=100;sigline=0.2;interval=120;equalDist=true;InputFile={PROTOCOL}://brokenlink/i16kPw;npoints=10;method=cHs;
#BIOCLIMATE_HCAF
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIOCLIMATE_HCAF
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIOCLIMATE_HCAF&DataInputs=HCAF_Table_List={PROTOCOL}://goo.gl/LTqufC|{PROTOCOL}://goo.gl/LTqufC;HCAF_Table_Names=h1|h2
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIOCLIMATE_HCAF&DataInputs=HCAF_Table_List={PROTOCOL}://brokenlink/LTqufC|{PROTOCOL}://brokenlink/LTqufC;HCAF_Table_Names=h1|h2
#BIOCLIMATE_HSPEN
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIOCLIMATE_HSPEN
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.BIOCLIMATE_HSPEN&DataInputs=HSPEN_Table_List={PROTOCOL}://data-d.d4science.org/dDRpendoOEpnUE5nRzM4WHQ3RWVlVUorb3lwa2wzNWJHbWJQNStIS0N6Yz0|{PROTOCOL}://data-d.d4science.org/dDRpendoOEpnUE5nRzM4WHQ3RWVlVUorb3lwa2wzNWJHbWJQNStIS0N6Yz0;HSPEN_Table_Names=h1|h2;
@@ -131,7 +131,7 @@
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.modellers.FEED_FORWARD_ANN&DataInputs=TrainingDataSet={PROTOCOL}://data-d.d4science.org/UndsT051bEZwbEpnRzM4WHQ3RWVlVDQ0eHNITWgzRXdHbWJQNStIS0N6Yz0;TrainingColumns=a|b;TargetColumn=t;LayersNeurons=1;Reference=1;LearningThreshold=0.001;MaxIterations=1000;ModelName=neuralnet_t;
#FEED_FORWARD_A_N_N_DISTRIBUTION
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.generators.FEED_FORWARD_A_N_N_DISTRIBUTION
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.generators.FEED_FORWARD_A_N_N_DISTRIBUTION&DataInputs=FeaturesTable={PROTOCOL}://data-d.d4science.org/UndsT051bEZwbEpnRzM4WHQ3RWVlVDQ0eHNITWgzRXdHbWJQNStIS0N6Yz0;FeaturesColumnNames=a|b;FinalTableLabel=Distrib_t;ModelName={PROTOCOL}://goo.gl/ggiPyX;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.generators.FEED_FORWARD_A_N_N_DISTRIBUTION&DataInputs=FeaturesTable={PROTOCOL}://data-d.d4science.org/UndsT051bEZwbEpnRzM4WHQ3RWVlVDQ0eHNITWgzRXdHbWJQNStIS0N6Yz0;FeaturesColumnNames=a|b;FinalTableLabel=Distrib_t;ModelName={PROTOCOL}://brokenlink/ggiPyX;
#OCCURRENCES_DUPLICATES_DELETER
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.OCCURRENCES_DUPLICATES_DELETER
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.OCCURRENCES_DUPLICATES_DELETER&DataInputs=final_Table_Name=DeletedOcc_;OccurrencePointsTableName={PROTOCOL}://data-d.d4science.org/KzI1TmN5TCtJT2hnRzM4WHQ3RWVlZlZLdCttTThpUnRHbWJQNStIS0N6Yz0;longitudeColumn=decimalLongitude;latitudeColumn=decimalLatitude;recordedByColumn=recordedBy;scientificNameColumn=scientificName;eventDateColumn=eventDate;lastModificationColumn=modified;spatialTolerance=0.5;confidence=80;
@@ -170,7 +170,7 @@
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.RASTER_DATA_PUBLISHER&DataInputs=PublicationLevel=PUBLIC;DatasetAbstract=Abstract;DatasetTitle=Generic Raster Layer Test3;RasterFile={PROTOCOL}://data.d4science.org/QTVNbXp5cmI0MG52TTE0K2paNzhXZWlCTHhweU8rUCtHbWJQNStIS0N6Yz0;InnerLayerName=analyzed_field;FileNameOnInfra=raster-1465493226242.nc;Topics=analyzed_field;SpatialResolution=-1;
#WEB_APP_PUBLISHER
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.WEB_APP_PUBLISHER
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.WEB_APP_PUBLISHER&DataInputs=ZipFile={PROTOCOL}://goo.gl/dYQ089;
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.WEB_APP_PUBLISHER&DataInputs=ZipFile={PROTOCOL}://brokenlink/dYQ089;
#ECOPATH_WITH_ECOSIM
{PROTOCOL}://{HOST}/wps/WebProcessingService?Request=DescribeProcess&Service=WPS&Version=1.0.0&gcube-token={TOKEN}&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ECOPATH_WITH_ECOSIM
{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS&Version=1.0.0&gcube-token={TOKEN}&lang=en-US&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver.mappedclasses.transducerers.ECOPATH_WITH_ECOSIM&DataInputs=Model File={PROTOCOL}://data.d4science.org/eHFkNmhoSUwxMVpmcElhcUlmQUpWaWRGSjQzNkFXNElHbWJQNStIS0N6Yz0;Config File={PROTOCOL}://data.d4science.org/ZGFWaGc4NjUrQmRmcElhcUlmQUpWbTNVQjhqdUV3OWdHbWJQNStIS0N6Yz0;
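The templates above can be instantiated by substituting the `{PROTOCOL}`, `{HOST}`, `{TOKEN}`, and `{LAYERID}` placeholders before issuing the request. A minimal sketch of that substitution, using a hypothetical host and token (the real values come from your D4Science infrastructure and gcube-token):

```python
# Minimal sketch: fill the placeholders used throughout this test list.
# PROTOCOL/HOST/TOKEN below are hypothetical values, not a real endpoint.
PROTOCOL = "https"
HOST = "dataminer.example.org"
TOKEN = "YOUR-GCUBE-TOKEN"

def fill_template(template: str, layer_id: str = "") -> str:
    """Replace the {PROTOCOL}/{HOST}/{TOKEN}/{LAYERID} placeholders in a template URL."""
    return (template
            .replace("{PROTOCOL}", PROTOCOL)
            .replace("{HOST}", HOST)
            .replace("{TOKEN}", TOKEN)
            .replace("{LAYERID}", layer_id))

# One of the templates from this list (HCAF_FILTER Execute request).
template = ("{PROTOCOL}://{HOST}/wps/WebProcessingService?request=Execute&service=WPS"
            "&Version=1.0.0&gcube-token={TOKEN}&lang=en-US"
            "&Identifier=org.gcube.dataanalysis.wps.statisticalmanager.synchserver"
            ".mappedclasses.transducerers.HCAF_FILTER"
            "&DataInputs=B_Box_Left_Lower_Lat=-17;B_Box_Right_Upper_Long=147;"
            "B_Box_Right_Upper_Lat=25;B_Box_Left_Lower_Long=89;Table_Label=wps_hcaf_filter;")

url = fill_template(template)
print(url)
```

The resulting URL can then be fetched with any HTTP client; the `brokenlink` variants in this list deliberately reference unreachable inputs to exercise the service's error handling.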