Compare commits

...

271 Commits

Author SHA1 Message Date
Biagio Peccerillo 0e6142caac Standardized project structure
Details:
- Documentation folders refactored to mirror 'standard' organization.
- DocsGenerator removed to let the default Tomcat servlet serve the
  documentation
- Added DocsRedirectFilter to redirect all GETs to /docs and
  /api-docs to their index.html pages
- Added CHANGELOG file
2024-12-19 12:12:35 +01:00
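The redirect behaviour described in the commit above can be sketched as a small path-mapping rule. The class and method names below are illustrative, not the actual DocsRedirectFilter code:

```java
public class DocsRedirect {
    /** Returns the redirect location for a request path, or null when no redirect applies. */
    static String redirectTarget(String path) {
        // Per the commit message: GETs to /docs and /api-docs go to their index.html pages.
        if ("/docs".equals(path) || "/api-docs".equals(path)) {
            return path + "/index.html";
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(redirectTarget("/docs"));     // /docs/index.html
        System.out.println(redirectTarget("/api-docs")); // /api-docs/index.html
        System.out.println(redirectTarget("/other"));    // null
    }
}
```

In the real filter this mapping would be applied inside a servlet Filter before the request reaches the default servlet.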
Biagio Peccerillo a1fad9b9f7 Solved Incident #28539
The Amazon S3 backend has problems processing non-ASCII characters.
This behaviour arose when uploading files with such characters in their
filename from the workspace, because the filename is used as the
"title" meta-tag on the S3 backend. The solution was to URL-encode the
title before passing it to the backend.
2024-12-17 11:16:29 +01:00
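The fix described above amounts to URL-encoding the filename before storing it in the ASCII-only "title" meta-tag. A minimal sketch using the standard java.net.URLEncoder; the helper name encodeTitle is hypothetical, not the project's actual API:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class TitleEncoder {
    // Hypothetical helper: percent-encode a filename so it survives
    // an ASCII-only S3 metadata field such as "title".
    static String encodeTitle(String filename) {
        return URLEncoder.encode(filename, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Non-ASCII characters are encoded as percent-escaped UTF-8 bytes.
        System.out.println(encodeTitle("città.pdf")); // citt%C3%A0.pdf
    }
}
```

The backend then stores only ASCII, and the original name can be recovered by URL-decoding on read.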
Biagio Peccerillo c54b64568a Removed obsolete 'version' attribute from docker-compose files 2024-12-12 12:54:34 +01:00
Biagio Peccerillo b0fbd4c385 Added Sphinx documentation
Details:
- Added index and intro pages
- Added sphinx-maven plugin in pom.xml
2024-12-12 12:54:34 +01:00
Biagio Peccerillo f2b5b9e857 Version set to 2.0.1-SNAPSHOT 2024-12-12 12:54:34 +01:00
Biagio Peccerillo 3f413c8391 Improved analysis tools for containerized application
Details:
- Added a file appender for the logs
- Installed vim to easily navigate logs
2024-12-12 12:53:59 +01:00
Biagio Peccerillo 2b54a98760 Improved documentation and removed useless constraint
Details:
- Improved UserManager and GroupManager REST API docs
- removed content-type constraint on group admin DELETE
2024-12-10 11:41:34 +01:00
Biagio Peccerillo 3f45dce6f7 Merge branch 'master' into feature/28427 2024-12-09 16:32:11 +01:00
Lucio Lelii 078516a971 solved an error on type definition 2024-12-09 15:28:09 +01:00
Biagio Peccerillo 589e38cd55 Improved AuthorizationControl documentation
For methods with Authorization Control limitations, a hint has been added
under the method description indicating which roles are necessary to make
the corresponding REST API call
2024-12-06 15:25:33 +01:00
Biagio Peccerillo 868dbabcff Completed REST API documentation
Additions:
- method descriptions
- parameter descriptions
- request/response examples
- return value descriptions
- response codes
Affected classes:
- {ACL,Group,Items,Script,Storage,User,Workspace}Manager, ItemsCreator, ItemsSharing
- DocsGenerator and MessageManager ignored altogether
2024-12-06 15:25:33 +01:00
Biagio Peccerillo bf55883511 Added end boundary string to multipart/form-data examples 2024-12-06 15:25:33 +01:00
Biagio Peccerillo a34eef4617 Merged with master 2024-12-06 15:25:20 +01:00
Biagio Peccerillo 0ee16386f0 disabled useless modules in enunciate.xml 2024-12-06 15:23:43 +01:00
Biagio Peccerillo d0fac83c50 ItemsCreator documented
Addition:
- method description
- request example
- response example
- parameter description
- return value description
- response codes
2024-12-06 15:23:40 +01:00
Biagio Peccerillo 1d3cc6c073 Rendered "https://dev.d4science.org/how-to-access-resources" as link in Enunciate docs 2024-12-06 15:23:37 +01:00
Biagio Peccerillo 65b9f611dc Improved Enunciate support
Changes:
- added "enunciate-lombok" module to support Lombok-annotated classes
- added some source-paths to locate external classes
- disabled two useless enunciate modules
2024-12-06 15:23:30 +01:00
Biagio Peccerillo deb1d315de Corrected Dockerfiles
Changed "d4science" to "harbor.d4science.org/gcube" to clarify the source registry
2024-12-06 15:23:20 +01:00
Biagio Peccerillo eedf6ef667 Improved enunciate documentation
Changes:
- switched to enunciate 2.18 (from 2.17)
- improved storagehub description in the index page (still improvable)
- added missing dependencies from storagehub-model
2024-12-06 15:23:05 +01:00
Lucio Lelii 1b724a8d8b commit for release 2024-11-22 11:33:37 +01:00
Lucio Lelii 34e7820514 solved issue on upload Archive functionality 2024-11-22 11:28:38 +01:00
Lucio Lelii 47a4564ffa commit for release 2024-11-19 09:34:28 +01:00
Lucio Lelii c324cf2507 removed old methods for count and size 2024-11-19 09:33:02 +01:00
Lucio Lelii 8cd39a77de commit for release 2024-11-18 15:39:12 +01:00
Lucio Lelii 41448b297a bug on folder size fixed 2024-11-18 15:36:57 +01:00
Lucio Lelii 17118d0869 commit for release 2024-11-18 10:34:37 +01:00
Lucio Lelii 6652cba390 solved test issue 28321 2024-11-18 10:32:36 +01:00
Lucio Lelii 10baca947a commit for release 2024-11-15 11:51:46 +01:00
Lucio Lelii 81cb8994ae solved issue on copy 2024-11-15 11:33:05 +01:00
lucio a41f4a564d commit for release 2024-11-13 11:35:36 +01:00
lucio 1bcd502f25 is alive bug solved 2024-11-13 11:34:44 +01:00
Massimiliano Assante 8dd4df456b Update src/main/java/org/gcube/data/access/storagehub/services/ItemsCreator.java
changed param name from stream to file in method createFileItem to fix the docs
2024-11-12 15:25:27 +01:00
Massimiliano Assante 2e5ffc2d5f Update src/main/java/org/gcube/data/access/storagehub/services/ItemsCreator.java
changed param name from stream to file in method createFileItem to fix the docs
2024-11-12 15:18:28 +01:00
lucio 6850309b27 commit for release 2024-11-12 10:54:23 +01:00
lucio 63b4916520 storage version changed 2024-11-12 10:51:43 +01:00
lucio 57cf44ade2 test removed 2024-10-29 11:55:06 +01:00
lucio 244a13845b maven parent updated 2024-10-29 11:17:06 +01:00
lucio 8f56d9f276 commit for release 2024-10-29 11:14:09 +01:00
lucio 6e1a62a2ab storage-manager version downgraded 2024-10-29 11:06:06 +01:00
lucio 749dc7e603 Node file updated 2024-10-23 15:18:25 +02:00
Lucio Lelii 19a902c737 removed SNAPSHOT from bom 2024-10-22 08:21:26 +02:00
Lucio Lelii 29728077ae commit for release 2024-10-22 08:20:09 +02:00
lucio 6384ce5a34 updated 2024-10-21 17:09:52 +02:00
lucio b097ad8a84 ignoring hidden files 2024-10-21 17:08:11 +02:00
Lucio Lelii 743e06dbb9 ignored 2024-10-21 17:03:43 +02:00
lucio c36f1764e9 moved to version 2, messages reintroduced 2024-10-21 16:59:55 +02:00
lucio e9ebebdbbf updated 2024-10-21 15:01:21 +02:00
lucio 7f87628687 javadoc plugin added for java17 2024-10-21 15:01:21 +02:00
lucio 42137b24a0 aspectj plugin version downgraded to be compatible with maven 3.6
installed on jenkins
2024-10-21 15:01:21 +02:00
lucio 413b5cdfdf changed log level 2024-09-11 15:59:21 +02:00
lucio 0f59bc8d8c get info added on all folders 2024-09-10 14:46:01 +02:00
lucio afbaceffa9 added WS info call 2024-09-10 12:49:49 +02:00
lucio e36f4eb776 moved storage constant names 2024-08-16 15:50:42 +02:00
lucio 4955cdcad6 changed class 2024-08-16 15:02:47 +02:00
lucio 4fd099c7cf added role control to create external folder 2024-08-16 15:02:29 +02:00
lucio a9cc80f424 creation of folder with different backend fixed 2024-08-16 13:02:55 +02:00
lucio 50b2c80a1a using TransferManager for files greater than 100 MB to use multipart
upload
2024-08-12 14:10:20 +02:00
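The commit above switches to multipart upload past a size threshold. A minimal sketch of that decision rule (the constant and method names are illustrative; per the commit message, the real code delegates the multipart transfer itself to the AWS SDK TransferManager):

```java
public class UploadStrategy {
    // Assumed threshold from the commit message: files over 100 MB
    // go through multipart upload instead of a single PUT.
    static final long MULTIPART_THRESHOLD = 100L * 1024 * 1024;

    static boolean useMultipart(long sizeBytes) {
        return sizeBytes > MULTIPART_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(useMultipart(10L * 1024 * 1024));  // false
        System.out.println(useMultipart(200L * 1024 * 1024)); // true
    }
}
```

Multipart upload lets large objects be sent in parallel chunks and retried per chunk, which is why a size cutoff like this is common.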
lucio 3584bdaf29 result added to status 2024-08-06 10:46:11 +02:00
lucio 98ee8ea8e9 update for libraries changes 2024-08-01 12:09:02 +02:00
lucio b805869feb added a predicate to exclude some types of item from listing 2024-08-01 12:08:41 +02:00
lucio 16f61a0a61 some update 2024-07-25 11:40:07 +02:00
lucio cf78d06b64 added draining of input stream to avoid S3 warning 2024-07-25 11:39:47 +02:00
lucio 1736417057 added Export functionalities 2024-07-17 11:41:33 +02:00
lucio 994608da26 solved bug on session from cached node 2024-07-10 18:30:50 +02:00
lucio 76639ed5f1 solved issue on export 2024-07-10 13:14:31 +02:00
lucio 38874c9529 repository related classes updated 2024-04-05 21:11:04 +02:00
lucio 32b35e4060 changes on handlers 2024-04-01 19:29:23 +02:00
lucio 8df806bf49 bug on getUser solved 2024-03-28 08:35:15 +01:00
lucio 26bc4c93ac Handlers for user and group extracted from managers 2024-03-27 22:23:30 +01:00
lucio 335204d3ee docker info updated 2024-03-27 12:04:49 +01:00
lucio 4878f8bb18 updated 2024-03-25 16:37:40 +01:00
lucio eefa46cbf6 warnings removed 2024-03-15 15:14:26 +01:00
lucio e11b2147da porting to jakarta 2024-03-15 14:26:05 +01:00
lucio 0becded125 checks updated 2024-01-24 16:32:18 +01:00
lucio c4d5cffe02 moved to new ceph storage 2024-01-19 12:20:19 +01:00
lucio f75f7d86d9 jdbc driver updated 2023-12-01 13:04:07 +01:00
lucio 2452a25349 changelog updated 2023-06-01 19:06:05 +02:00
lucio 87cc8a3ff9 - managing of vre folder specific backend 2023-06-01 19:00:42 +02:00
lucio b34ad84baf updated 2023-05-30 11:08:31 +02:00
lucio 7e875a5dfb added Docs to root application 2023-05-30 10:46:38 +02:00
lucio 10982ea64d added correct exclude 2023-05-30 10:37:02 +02:00
lucio a0cd2e8ccf enunciate improved 2023-05-30 10:18:47 +02:00
lucio e5dda6bb8b enunciate docs added 2023-05-22 14:44:51 +02:00
lucio d1d45a8056 script utils version updated 2023-05-22 13:19:46 +02:00
lucio 56f8ffb838 updated 2023-05-22 11:02:21 +02:00
Lucio Lelii d216459747 acl control added 2023-05-14 12:57:32 +02:00
lucio 4d118372f3 solved bug on archive upload 2023-05-09 15:46:34 +02:00
lucio 0cc0949698 updated 2023-05-05 22:01:42 +02:00
lucio 884f40b759 app configuration updated 2023-05-05 21:53:27 +02:00
lucio a1b69aee6a update 2023-05-05 15:58:13 +02:00
lucio 09879535d2 added application.yaml 2023-02-13 11:16:44 +01:00
lucio 7d96327512 updated for changes on smartgears 2023-01-20 13:36:28 +01:00
Lucio Lelii 443d9cabd4 version of jackrabbit moved to the latest stable 2022-12-23 15:19:29 +01:00
Lucio Lelii 2629c5c387 updated 2022-12-21 19:17:06 +01:00
Lucio Lelii 3a7aa8b8e3 test on container added 2022-12-19 17:47:39 +01:00
Lucio Lelii d0a7197c5c solved issue on duration 2022-12-15 15:36:12 +01:00
Lucio Lelii 0420e2ba3e added fields to ScriptStatus 2022-12-15 15:24:30 +01:00
Lucio Lelii 2033a4b79f added status for script execution 2022-12-15 12:08:39 +01:00
Lucio Lelii bca553aa5f changes 2022-12-14 14:52:27 +01:00
Lucio Lelii 4bd37f8963 remove specific version added 2022-12-07 11:59:02 +01:00
Lucio Lelii b3913ba9c1 added method for version removal 2022-12-06 16:49:41 +01:00
Lucio Lelii 1f6329c38e added possibility to set owner on backends 2022-12-02 14:39:37 +01:00
Lucio Lelii 4083b7c120 TODO for accounting 2022-11-30 17:51:50 +01:00
Lucio Lelii a500df61a1 excluded VREFolders from renaming 2022-11-30 11:16:02 +01:00
Lucio Lelii 7019740af7 enabled renaming of SharedFolder 2022-11-30 11:06:49 +01:00
Lucio Lelii 2012500de8 added check on id exists 2022-11-24 14:29:41 +01:00
Lucio Lelii 8b235da142 changes on StorageBackend interface 2022-11-22 14:05:31 +01:00
Lucio Lelii 0649acb8a9 solved a bug on internal file creation 2022-11-22 11:52:40 +01:00
Lucio Lelii 2e7fc876cf added method for scripts 2022-11-21 16:15:32 +01:00
Lucio Lelii 9b568a09ec Content handler modified 2022-11-18 17:24:25 +01:00
Lucio Lelii 55b6d8e09a removed unused files 2022-11-18 09:29:23 +01:00
Lucio Lelii e60a07abe9 ignore update 2022-11-18 09:27:23 +01:00
Lucio Lelii 1525afef9e removed unused class 2022-11-18 09:26:04 +01:00
Lucio Lelii fad2e7ffb9 removed old files 2022-11-18 09:24:54 +01:00
Lucio Lelii b625fafcc8 improve upload speed 2022-11-17 17:14:51 +01:00
Lucio Lelii 80d15ccef7 changes 2022-11-16 17:50:00 +01:00
Lucio Lelii 6d72896662 pom updated 2022-11-16 09:28:12 +01:00
Lucio Lelii a87d6ab3da update 2022-11-15 17:55:31 +01:00
Lucio Lelii 3b5686e705 update aspectj plugin dependency 2022-11-15 10:37:11 +01:00
Lucio Lelii d36a3314ba issue on volatile links solved 2022-11-11 16:09:50 +01:00
Lucio Lelii 6dd371070e updated pom to include latest tika 2022-11-11 11:34:20 +01:00
Lucio Lelii 6af9fce70f Merge branch 'multipleStorageBackends' of https://code-repo.d4science.org/gCubeSystem/storagehub.git into multipleStorageBackends 2022-11-04 15:51:57 +01:00
Lucio Lelii ac2ca4c360 updated tika library 2022-11-04 15:24:17 +01:00
Lucio Lelii b5b3669af5 bug on public link solved 2022-10-17 15:27:40 +02:00
Lucio Lelii 88406a3bf2 solved bug on home update 2022-10-04 13:44:04 +02:00
Lucio Lelii 6756c2890c Merge branch 'multipleStorageBackends' of https://code-repo.d4science.org/gCubeSystem/storagehub into multipleStorageBackends 2022-10-04 11:41:30 +02:00
Lucio Lelii 4d38cc6e72 update user created 2022-10-03 17:24:53 +02:00
Lucio Lelii e1db5df7c9 ScriptUtil updated 2022-09-28 19:21:49 +02:00
Lucio Lelii 25105ca041 adding enunciate 2022-09-22 16:00:30 +02:00
Lucio Lelii 5de8dee586 added notification client to AppManager 2022-09-12 16:56:11 +02:00
Lucio Lelii 3e6e203f36 update 2022-09-12 15:00:23 +02:00
Lucio Lelii bfa702bf0f download folder modified 2022-09-06 17:11:27 +02:00
Lucio Lelii 50124d8a49 volatile area with public link added 2022-09-01 17:15:03 +02:00
Lucio Lelii 9dea04e74e Merge branch 'multipleStorageBackends' of https://code-repo.d4science.org/gCubeSystem/storagehub.git into multipleStorageBackends 2022-08-30 16:38:34 +02:00
Lucio Lelii 28044da030 pom updated 2022-08-30 16:38:05 +02:00
Lucio Lelii b0141e6b6f storage manager libraries updated 2022-08-05 14:16:19 +02:00
Lucio Lelii 14a71d4aa7 docker folder updated 2022-07-22 10:34:01 +02:00
Lucio Lelii 3b0bb084b6 java melody removed 2022-07-22 10:33:12 +02:00
Lucio Lelii c4ea5bb05c solved bug on initialization 2022-06-27 15:34:24 +02:00
Lucio Lelii ce071c1f7e conf updated 2022-06-22 18:51:18 +02:00
Lucio Lelii 805b72155d Merge branch 'multipleStorageBackends' of
https://code-repo.d4science.org/gCubeSystem/storagehub.git into
multipleStorageBackends

Conflicts:
	docker-compose.yml
	src/main/webapp/WEB-INF/README
	src/test/java/org/gcube/data/access/fs/container/CreateUsers.java
	src/test/resources/compose-test.yml
2022-06-16 12:35:19 +02:00
Lucio Lelii e43faf6f92 changes 2022-06-16 12:17:21 +02:00
Lucio Lelii 492873bd7e porting to smartgears 4 2022-06-15 17:49:58 +02:00
Lucio Lelii d2b3151edc update 2022-03-28 18:27:18 +02:00
Lucio Lelii 9d3bd619bd added init script if repository is not yet initialized 2022-02-05 11:42:41 +01:00
Lucio Lelii 6d3e9394c4 docker files added 2022-02-02 11:27:52 +01:00
Lucio Lelii d672386824 added docker files for tests 2022-02-02 11:27:00 +01:00
Lucio Lelii 57e0113216 new feature for PayloadBackend added 2021-12-23 17:26:30 +01:00
Lucio Lelii 4f87677674 Minio integration 2021-12-03 16:55:54 +01:00
Lucio Lelii e11bb536a7 a part of s3StorageIntegration 2021-11-26 17:49:35 +01:00
lucio.lelii af9290cbca check for folder connector with same name 2021-10-28 15:50:56 +02:00
lucio.lelii db30621608 success set on users add to group 2021-10-26 14:47:55 +02:00
Lucio Lelii 70391906e2 update 2021-10-25 17:39:12 +02:00
Lucio Lelii ca23f94e09 ManageBy info added 2021-10-25 16:16:26 +02:00
Lucio Lelii 8d11063f6b Added some fix 2021-10-18 16:11:49 +02:00
lucio.lelii 9facccdf46 upload by url added 2021-10-15 19:51:57 +02:00
lucio.lelii da7385f62f pom update 2021-10-15 13:20:06 +02:00
lucio.lelii 29b728b057 reverted last commit 2021-10-15 12:00:09 +02:00
lucio.lelii 7ee17adeac upload archive session is not saved until finished 2021-10-15 11:51:27 +02:00
lucio.lelii 7f88e20a88 first message shown twice for new user issue solved 2021-10-13 13:48:37 +02:00
lucio.lelii 08f0160d8d changed vre recents parameter 2021-10-13 11:48:03 +02:00
lucio.lelii 2a46ac3aa2 solved error on vre recents 2021-10-13 11:35:19 +02:00
lucio.lelii 882f849e2f pom and changelog updated 2021-10-12 14:01:21 +02:00
lucio.lelii 8c934a138e Merge remote-tracking branch 'origin/vreQueryRemoval'
Conflicts:
	CHANGELOG.md
	pom.xml
2021-10-12 13:40:06 +02:00
lucio.lelii 20390e3147 incident https://support.d4science.org/issues/22184 solved 2021-10-12 13:12:02 +02:00
lucio.lelii 47bf9b57d2 pom and changelog update 2021-10-07 10:38:35 +02:00
lucio.lelii 27c6f2d3e1 scriptManager accept different names 2021-10-06 19:41:18 +02:00
lucio.lelii 586d9df939 removed a part of query to retrieve recents 2021-10-06 19:18:34 +02:00
lucio.lelii 7ebd1ae629 fix updated 2021-10-06 11:56:57 +02:00
lucio.lelii 51cbd0776a solved bug https://support.d4science.org/issues/22147 2021-10-06 11:44:23 +02:00
lucio.lelii f30029c052 - using groupHandler method to retrieve VRE
- added logs to ScriptManager error
2021-10-06 10:37:14 +02:00
Lucio Lelii 68eb65e168 removing query for retrieving VRE root Folder 2021-10-04 17:11:43 +02:00
lucio.lelii 8fd9bb5ac2 Merge branch 'master' of https://code-repo.d4science.org/gCubeSystem/storagehub.git 2021-09-28 13:37:32 +02:00
lucio.lelii 147e8a8961 solved bug on instantiation error for types that are not in the model 2021-09-28 13:36:56 +02:00
Roberto Cirillo 933009f8ab Update 'pom.xml' 2021-09-28 12:53:47 +02:00
Roberto Cirillo 0222d27a5e Update 'CHANGELOG.md' 2021-09-28 12:53:29 +02:00
Roberto Cirillo 082b49ebc6 Update 'CHANGELOG.md' 2021-09-28 12:40:39 +02:00
Roberto Cirillo e255a2ff5d Update 'pom.xml'
update to 1.3.2-SNAPSHOT
2021-09-28 12:39:36 +02:00
lucio.lelii 24576cd30f solved bug on messages attachments 2021-09-10 10:59:07 +02:00
Lucio Lelii f68588e05c solved issue on specific version download 2021-07-26 16:07:40 +02:00
Lucio Lelii cb609cf51c solved issue on deleting version content on empty trash 2021-07-26 10:40:05 +02:00
Lucio Lelii 8929ff579c storagebackends takes ids on delete 2021-07-21 17:36:27 +02:00
Lucio Lelii aeb434dd06 solved issue executing force delete on folder 2021-07-21 15:31:23 +02:00
lucio.lelii 470cc28035 solved issue on attachments 2021-07-02 13:17:25 +02:00
lucio.lelii 13016497f1 fixed a bug on Excludes 2021-05-04 18:26:30 +02:00
lucio.lelii eab7dc19b4 commit for release 2021-05-04 12:35:39 +02:00
lucio.lelii 3723ac730c script utils dep added 2021-05-04 12:30:03 +02:00
lucio.lelii 2aadb9cce7 - External Folder improvements - Inbox and Outbox creation 2021-05-04 11:42:02 +02:00
lucio.lelii bcbe97f547 - ACLManager Delegate Added 2021-04-07 12:38:18 +02:00
lucio.lelii a2613dc1a7 - possibility to impersonate people added 2021-03-31 14:49:47 +02:00
lucio.lelii 64952442ed unshareAll fixed 2021-03-16 15:37:18 +01:00
lucio.lelii e271e9fe78 removing also ACL on already unshared folder 2021-03-16 13:54:34 +01:00
lucio.lelii 7566a3cf9f issue on home folder removal fixed 2021-03-16 13:03:27 +01:00
lucio.lelii 8c677df64e NodeAdmin resource added 2021-03-16 10:01:07 +01:00
lucio.lelii 61c84fbb11 reverted change 2021-03-16 00:04:54 +01:00
lucio.lelii 31751ca11e solved an issue on versioning 2021-03-15 16:52:19 +01:00
lucio.lelii 7591536a69 removing old not versioned node 2021-03-15 16:01:14 +01:00
lucio.lelii 6b690caf56 users list ordering added 2021-03-15 11:57:46 +01:00
lucio.lelii 288e8d4254 smartgear bom version fixed 2021-03-12 11:22:46 +01:00
lucio.lelii b68d30cd53 removed snapshot for release 2021-03-12 10:27:38 +01:00
lucio.lelii 7df7afecd3 removed use of a query (indexes are not working well in Jackrabbit) to
retrieve shared folders on user deletion
2021-03-12 10:24:45 +01:00
lucio.lelii 1e916b21b6 added method to check user existence 2021-02-08 12:30:58 +01:00
lucio.lelii 56e01e91f8 Merge branch 'master' of https://code-repo.d4science.org/gCubeSystem/storagehub.git 2021-01-13 10:48:00 +01:00
lucio.lelii 94cb6e4cd3 added catch of all exception on unshare during delete 2021-01-13 10:46:58 +01:00
Roberto Cirillo 5760a44220 Update 'pom.xml'
removed snapshot from gcube-smartgears-bom version
2021-01-12 09:36:07 +01:00
lucio.lelii cbb77864c5 mimetype set also on non-parsable pdf
method add and remove admin on a VREFolder enabled also for admin role

fixed bug on unsharing folder on user removal
2021-01-11 18:07:57 +01:00
user1 effbb513a7 ObjectMapper import changed to the gcube embedded jackson 2020-11-10 15:18:11 +01:00
user1 953cec03ee removed an unused session 2020-10-13 16:27:31 +02:00
user1 0d499d6c88 acl error message changed 2020-10-13 15:53:27 +02:00
user1 79f755dc13 pom cleaned from warnings 2020-10-08 12:02:11 +02:00
user1 8b494440af changelog modified 2020-10-07 18:30:46 +02:00
user1 d57647a714 mimetype set also on non-parsable pdf 2020-10-07 18:28:34 +02:00
Fabio Sinibaldi cb57512174 adoption of gcube-smartgears-bom.2.0.0-SNAPSHOT 2020-10-07 16:34:00 +02:00
Lucio Lelii 2732b05426 SNAPSHOT removed from version 2020-09-16 12:28:38 +02:00
Lucio Lelii 432859d9bb descriptor removed 2020-09-16 12:27:20 +02:00
Lucio Lelii dcf1a614b4 commit for release 2020-09-16 12:11:52 +02:00
Lucio Lelii 84cec8aa76 changelog added 2020-05-26 16:31:23 +02:00
Lucio Lelii 52df23f2a9 changelog added 2020-05-26 16:30:50 +02:00
Lucio Lelii ce8a1e744f commit for release 4.23
resolves a bug on Archive upload
2020-05-22 17:31:40 +02:00
Lucio Lelii 70ae49e28d update for test issue 2020-04-23 16:09:13 +02:00
Lucio Lelii f2742ce0e0 solved bug with getAnchestors and public folders 2020-04-17 17:03:06 +02:00
Lucio Lelii 7ed01ecc4e ItemAction for restore set to MOVED 2020-04-15 17:40:02 +02:00
Lucio Lelii 1db74cc4df solved bugs on restore item and authorization 2020-04-15 17:36:11 +02:00
Lucio Lelii aae84a27d6 pom updated for release 4.22 2020-04-15 10:42:02 +02:00
Lucio Lelii b80f1ccb0d changelog updated 2020-04-15 10:31:09 +02:00
Lucio Lelii ea438888c8 added a todo 2020-04-14 19:49:04 +02:00
Lucio Lelii 01b74ae117 added a servlet for administration 2020-04-14 19:48:28 +02:00
Lucio Lelii 94d9307b4c refactoring of some classes 2020-04-14 18:52:59 +02:00
Lucio Lelii e82d695bbf setting owner of a fully unshared folder to caller 2020-04-14 18:52:20 +02:00
Lucio Lelii 458b8a72ea allows moving shared item outside the shared folder 2020-04-14 18:46:44 +02:00
Lucio Lelii 487eae33e2 refactoring 2020-04-09 13:31:17 +02:00
Lucio Lelii 08351b2005 public folder check added on Authorization Checker 2020-04-09 09:44:40 +02:00
Lucio Lelii 4c13f4098e moved classes into different packages 2020-04-08 21:11:43 +02:00
lucio 6e69de91d0 solved an error on authorization for deleted user 2020-04-05 11:37:15 +02:00
lucio 62fe5a77a0 removes duplicate nodes in search 2020-03-17 13:58:37 +01:00
lucio acbd780dff pom updated for release 2020-03-17 13:41:46 +01:00
lucio a7ee9afb76 search excludes not authorized node from the results 2020-03-17 13:40:41 +01:00
lucio 9e3b5f08e0 pom updated for release 2020-03-16 16:56:57 +01:00
lucio 133d71f14f search excludes hidden 2020-03-16 16:55:26 +01:00
Lucio Lelii 9d895f0adf pom updated 2020-03-11 15:14:20 +01:00
lucio 177888e1b4 - openByPath added
- replaced some warning logs with debug
2020-03-11 15:00:40 +01:00
lucio 868eadfdaa complete remove of file on user and group deletion 2020-03-05 15:39:34 +01:00
lucio 6c9aaa9489 pom updated 2020-03-05 15:18:47 +01:00
lucio 774e2b4bfb getByPath method added 2020-03-05 15:18:20 +01:00
lucio fae5173b17 displayName bug with root scope containing dashes fixed 2020-01-31 15:22:54 +01:00
lucio ddbac93245 VRE ACL can be changed using setAcl method 2020-01-30 17:14:46 +01:00
lucio 8a84c93c18 fixed bug on administrator check 2020-01-30 17:02:13 +01:00
lucio 2033af4572 solved bug on vre folder displayName for folders containing '-' in the name 2020-01-30 16:13:23 +01:00
lucio 0711d8a702 control on invalid group id and user added on addAdmin 2020-01-30 11:18:35 +01:00
lucio b9d62994f9 bug on group creation fixed 2020-01-29 17:14:44 +01:00
lucio 8f725d46c0 group creation logs added 2020-01-29 15:25:24 +01:00
lucio eb3daa0a26 added vre folder to group creator workspace 2020-01-29 12:24:30 +01:00
lucio e08984af23 On group creation the creating user is automatically added to the group 2020-01-29 11:37:50 +01:00
lucio 067f487f8b search made case insensitive 2020-01-24 12:30:05 +01:00
lucio df1956c08d accounting on search method fixed 2020-01-24 11:59:01 +01:00
Lucio Lelii 4ebc3ce222 Update pom.xml 2020-01-23 15:37:30 +01:00
Lucio Lelii abc2e16cde SNAPSHOT removed on dependency ranges 2020-01-23 12:51:05 +01:00
Lucio Lelii 0311d9782d Update pom.xml 2020-01-23 12:26:11 +01:00
lucio 1bcaa47d48 Merge remote-tracking branch 'origin/rolesmanaging'
Conflicts:
	src/main/webapp/WEB-INF/README
	src/main/webapp/WEB-INF/gcube-app.xml
2020-01-22 17:14:22 +01:00
lucio 0f645fdde7 commit for release 2020-01-22 16:41:12 +01:00
lucio 8ac4752ca7 pom committed 2020-01-22 12:45:19 +01:00
lucio a4dc3cff54 My ApplicationListener added 2020-01-22 12:40:35 +01:00
lucio de91f86daf version changed for release 2020-01-22 12:40:01 +01:00
lucio 188d11ff70 Application Listener add to correctly shutdown the jackrabbit repository 2020-01-22 12:32:49 +01:00
lucio c1ab8333b8 solved error on group creation 2020-01-16 18:11:23 +01:00
lucio 67fe556a4f method for administrator management added 2020-01-15 19:11:07 +01:00
lucio 2dc02aa194 unshared accounting added 2020-01-07 18:25:52 +01:00
lucio 4802a0542e Corrected an Authorization message error 2020-01-07 17:05:51 +01:00
lucio 0f156c6637 gcube-app file updated 2019-10-25 14:58:09 +02:00
lucio 0508aa0e3a added inner method name to GroupManager 2019-10-10 19:09:30 +02:00
lucio 18cba6c067 added client method for User and Group management 2019-10-10 18:15:38 +02:00
166 changed files with 12674 additions and 4298 deletions

.classpath

@@ -1,38 +1,39 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <classpath>
+	<classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+		<attributes>
+			<attribute name="maven.pomderived" value="true"/>
+			<attribute name="optional" value="true"/>
+		</attributes>
+	</classpathentry>
+	<classpathentry excluding="**" kind="src" output="target/test-classes" path="src/test/resources">
+		<attributes>
+			<attribute name="test" value="true"/>
+			<attribute name="maven.pomderived" value="true"/>
+			<attribute name="optional" value="true"/>
+		</attributes>
+	</classpathentry>
+	<classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+		<attributes>
+			<attribute name="maven.pomderived" value="true"/>
+		</attributes>
+	</classpathentry>
+	<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-17">
+		<attributes>
+			<attribute name="maven.pomderived" value="true"/>
+		</attributes>
+	</classpathentry>
 	<classpathentry kind="src" output="target/classes" path="src/main/java">
 		<attributes>
 			<attribute name="optional" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
 		</attributes>
 	</classpathentry>
-	<classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
-		<attributes>
-			<attribute name="maven.pomderived" value="true"/>
-		</attributes>
-	</classpathentry>
 	<classpathentry kind="src" output="target/test-classes" path="src/test/java">
 		<attributes>
-			<attribute name="test" value="true"/>
 			<attribute name="optional" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
+			<attribute name="test" value="true"/>
 		</attributes>
 	</classpathentry>
-	<classpathentry excluding="**" kind="src" output="target/test-classes" path="src/test/resources">
-		<attributes>
-			<attribute name="maven.pomderived" value="true"/>
-			<attribute name="test" value="true"/>
-		</attributes>
-	</classpathentry>
-	<classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
-		<attributes>
-			<attribute name="maven.pomderived" value="true"/>
-			<attribute name="org.eclipse.jst.component.dependency" value="/WEB-INF/lib"/>
-		</attributes>
-	</classpathentry>
-	<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8">
-		<attributes>
-			<attribute name="maven.pomderived" value="true"/>
-		</attributes>
-	</classpathentry>
 	<classpathentry kind="output" path="target/classes"/>

.gitignore

@@ -1 +1,8 @@
 target
+/Storagehub-TODO
+/postgres-data/
+.classpath
+.settings/org.eclipse.jdt.core.prefs
+/.project
+/.externalToolBuilders/
+/.settings/

.project

@@ -6,8 +6,13 @@
 	</projects>
 	<buildSpec>
 		<buildCommand>
-			<name>org.eclipse.wst.common.project.facet.core.builder</name>
+			<name>org.eclipse.ui.externaltools.ExternalToolBuilder</name>
+			<triggers>full,incremental,</triggers>
 			<arguments>
+				<dictionary>
+					<key>LaunchConfigHandle</key>
+					<value>&lt;project&gt;/.externalToolBuilders/org.eclipse.wst.common.project.facet.core.builder.launch</value>
+				</dictionary>
 			</arguments>
 		</buildCommand>
 		<buildCommand>
@@ -16,8 +21,13 @@
 			</arguments>
 		</buildCommand>
 		<buildCommand>
-			<name>org.eclipse.wst.validation.validationbuilder</name>
+			<name>org.eclipse.ui.externaltools.ExternalToolBuilder</name>
+			<triggers>full,incremental,</triggers>
 			<arguments>
+				<dictionary>
+					<key>LaunchConfigHandle</key>
+					<value>&lt;project&gt;/.externalToolBuilders/org.eclipse.wst.validation.validationbuilder.launch</value>
+				</dictionary>
 			</arguments>
 		</buildCommand>
 		<buildCommand>

.settings/org.eclipse.jdt.core.prefs

@@ -1,14 +1,16 @@
 eclipse.preferences.version=1
 org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
 org.eclipse.jdt.core.compiler.codegen.methodParameters=do not generate
-org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=17
 org.eclipse.jdt.core.compiler.codegen.unusedLocal=preserve
-org.eclipse.jdt.core.compiler.compliance=1.8
+org.eclipse.jdt.core.compiler.compliance=17
 org.eclipse.jdt.core.compiler.debug.lineNumber=generate
 org.eclipse.jdt.core.compiler.debug.localVariable=generate
 org.eclipse.jdt.core.compiler.debug.sourceFile=generate
 org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
 org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
 org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
 org.eclipse.jdt.core.compiler.release=disabled
-org.eclipse.jdt.core.compiler.source=1.8
+org.eclipse.jdt.core.compiler.source=17

.settings/org.eclipse.wst.common.component

@@ -1,16 +1,343 @@
 <?xml version="1.0" encoding="UTF-8"?><project-modules id="moduleCoreId" project-version="1.5.0">
 	<wb-module deploy-name="storagehub">
 		<wb-resource deploy-path="/" source-path="/target/m2e-wtp/web-resources"/>
 		<wb-resource deploy-path="/" source-path="/src/main/webapp" tag="defaultRootSource"/>
-		<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/main/java"/>
 		<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/main/resources"/>
-		<dependent-module archiveName="authorization-control-library-1.0.0-SNAPSHOT.jar" deploy-path="/WEB-INF/lib" handle="module:/resource/authorization-control-library/authorization-control-library">
+		<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/main/java"/>
+		<wb-resource deploy-path="/WEB-INF/classes" source-path="/src/test/java"/>
+		<dependent-module archiveName="common-smartgears-app-3.0.1-SNAPSHOT.jar" deploy-path="/WEB-INF/lib" handle="module:/resource/common-smartgears-app/common-smartgears-app">
 			<dependency-type>uses</dependency-type>
 		</dependent-module>
-		<dependent-module archiveName="storagehub-model-1.0.5.jar" deploy-path="/WEB-INF/lib" handle="module:/resource/storagehub-model/storagehub-model">
+		<dependent-module archiveName="authorization-control-library-2.0.0-SNAPSHOT.jar" deploy-path="/WEB-INF/lib" handle="module:/resource/authorization-control-library/authorization-control-library">
 			<dependency-type>uses</dependency-type>
 		</dependent-module>
+		<dependent-module archiveName="storagehub-model-2.0.0-SNAPSHOT.jar" deploy-path="/WEB-INF/lib" handle="module:/resource/storagehub-model/storagehub-model">
+			<dependency-type>uses</dependency-type>
+		</dependent-module>
+		<dependent-module archiveName="storagehub-script-utils-2.0.0-SNAPSHOT.jar" deploy-path="/WEB-INF/lib" handle="module:/resource/storagehub-scripting-util/storagehub-scripting-util">
+			<dependency-type>uses</dependency-type>
+		</dependent-module>
 		<property name="context-root" value="storagehub"/>
 		<property name="java-output-path" value="/storagehub-webapp_BRANCH/target/classes"/>
 	</wb-module>
 </project-modules>


@ -1,8 +1,8 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <faceted-project>
   <fixed facet="wst.jsdt.web"/>
-  <installed facet="java" version="1.8"/>
-  <installed facet="jst.web" version="3.0"/>
   <installed facet="jst.jaxrs" version="2.0"/>
   <installed facet="wst.jsdt.web" version="1.0"/>
+  <installed facet="jst.web" version="4.0"/>
+  <installed facet="java" version="17"/>
 </faceted-project>

CHANGELOG.md (new file, 78 lines)

@ -0,0 +1,78 @@
# Changelog for "storagehub"
All notable changes to this project will be documented in this file.
This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [v2.0.1-SNAPSHOT]
- improved enunciate api-docs
- added sphinx docs
- incident solved [#28539]
- @Produces added/improved in some cases
- removed content-type constraint on group admin delete
- removed DocsManager, docs are now served by the default Tomcat servlet
- project structure refactored with docs in "standard" locations
## [v2.0.0]
- ceph as default storage
- vre folders can define specific bucket as backend
- enunciate docs
- dockerization of the service
## [v1.4.0] 2021-10-07
- slow query removed from VRE retrieving and recents
- incident solved [#22184]
## [v1.3.2] - [2021-09-28]
- fix 22087
## [v1.3.1] - [2021-09-08]
- solved bug on attachment rights
## [v1.3.0] - [2021-03-31]
possibility to impersonate people added
## [v1.2.5] - [2021-03-11]
removed use of a query (indexes do not work well in Jackrabbit) to retrieve shared folders on user deletion
## [v1.2.4] - [2021-01-11]
mimetype set also on non parsable pdf
method add and remove admin on a VREFolder enabled also for admin role
fixed bug on unsharing folder on user removal
added method exist user
## [v1.2.2] - [2020-10-12]
method for description update added
## [v1.2.1] - [2020-06-20]
bug on Archive uploader solved
## [v1.2.0] - [2020-04-15]
trash items changes owner on restore
restore with new destination folder added
move between shared and private or different shared folder enabled
## [v1.0.8] - [2019-09-20]
Bug on unshare owner fixed
## [v1.0.5] - [2019-04-04]
Active wait for lock in case of item creation added
## [v1.0.0] - [2015-07-01]
First commit
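The v2.0.1 entry above references incident [#28539]: non-ASCII characters in filenames broke uploads because the filename is stored verbatim as the S3 "title" meta-tag, so the fix URL-encodes the title before handing it to the backend. A minimal sketch of that idea follows — in Python for brevity (the service itself is Java, and the function names here are illustrative, not the service's API):

```python
from urllib.parse import quote, unquote

def encode_s3_title(filename: str) -> str:
    """Percent-encode a filename so it is safe to store as an
    ASCII-only S3 user-metadata value such as the "title" tag."""
    return quote(filename, safe="")

def decode_s3_title(encoded: str) -> str:
    """Recover the original filename when reading the tag back."""
    return unquote(encoded)

title = encode_s3_title("relazione-è.pdf")
print(title)  # relazione-%C3%A8.pdf
assert title.isascii()
assert decode_s3_title(title) == "relazione-è.pdf"
```

Encoding everything (safe="") keeps the stored value ASCII-only regardless of the original alphabet, at the cost of having to decode on every read.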

DOCKER-INSTRUCTION.md (new file, 22 lines)

@ -0,0 +1,22 @@
# Docker Instruction
Instructions to generate and run the storagehub Docker image.
## Dockerfile - DockerCompose
This image is ready to be deployed in a new environment.
The Dockerfile generates an image without configurations (container, service, etc.); all the configurations must be provided when the image starts.
The docker-compose.yml therefore requires three environment variables to be set at start time: APP_PORT, JACKRABBIT_FOLDER and SMARTGEARS_CONFIG_FOLDER.
```
APP_PORT=8080 JACKRABBIT_FOLDER=/data/jackrabbit SMARTGEARS_CONFIG_FOLDER=/etc/smartgears-config docker compose up
```
## Dockerfile - DockerCompose standalone
The image generated from Dockerfile-standalone contains all the configuration needed to run in a fully isolated container.
The docker-compose-standalone.yml declares all the required services (postgres and minio), taking their configuration from the local ./docker folder:
```
docker compose -f docker-compose-standalone.yml up
```

Dockerfile (new file, 3 lines)

@ -0,0 +1,3 @@
FROM harbor.d4science.org/gcube/smartgears-distribution:4.0.1-SNAPSHOT-java17-tomcat10.1.19
COPY ./target/storagehub.war /tomcat/webapps/
COPY ./docker/storagehub.xml /tomcat/conf/Catalina/localhost/

Dockerfile-local (new file, 13 lines)

@ -0,0 +1,13 @@
FROM harbor.d4science.org/gcube/smartgears-distribution:4.0.0-SNAPSHOT-java17-tomcat10.1.19
#install unzip
RUN apt-get update && apt-get install -y unzip
COPY ./target/storagehub-test-storages.war /tomcat/webapps/storagehub.war
COPY ./docker/jackrabbit /app/jackrabbit
COPY ./docker/storagehub.xml /tomcat/conf/Catalina/localhost/
COPY ./docker/logback.xml /etc/
COPY ./docker/local/container.ini /etc/
RUN unzip /tomcat/webapps/storagehub.war -d /tomcat/webapps/storagehub
RUN rm /tomcat/webapps/storagehub.war
#COPY ./docker/local/storage-settings.properties /tomcat/webapps/storagehub/WEB-INF/classes/

Dockerfile-standalone (new file, 11 lines)

@ -0,0 +1,11 @@
FROM harbor.d4science.org/gcube/smartgears-distribution:4.0.0-SNAPSHOT-java17-tomcat10.1.19
# Install vim
RUN apt update && apt install -y vim
RUN mkdir -p /etc/config/storagehub
COPY ./target/storagehub.war /tomcat/webapps/
COPY ./docker/jackrabbit /app/jackrabbit
COPY ./docker/storagehub.xml /tomcat/conf/Catalina/localhost/
COPY ./docker/logback.xml /etc/
COPY ./docker/container.ini /etc/
COPY ./docker/storage-settings.properties /etc/config/storagehub/

FUNDING.md (new file, 26 lines)

@ -0,0 +1,26 @@
# Acknowledgments
The projects leading to this software have received funding from a series of European Union programmes including:
- the Sixth Framework Programme for Research and Technological Development
- [DILIGENT](https://cordis.europa.eu/project/id/004260) (grant no. 004260).
- the Seventh Framework Programme for research, technological development and demonstration
- [D4Science](https://cordis.europa.eu/project/id/212488) (grant no. 212488);
  - [D4Science-II](https://cordis.europa.eu/project/id/239019) (grant no. 239019);
- [ENVRI](https://cordis.europa.eu/project/id/283465) (grant no. 283465);
- [iMarine](https://cordis.europa.eu/project/id/283644) (grant no. 283644);
- [EUBrazilOpenBio](https://cordis.europa.eu/project/id/288754) (grant no. 288754).
- the H2020 research and innovation programme
- [SoBigData](https://cordis.europa.eu/project/id/654024) (grant no. 654024);
- [PARTHENOS](https://cordis.europa.eu/project/id/654119) (grant no. 654119);
- [EGI-Engage](https://cordis.europa.eu/project/id/654142) (grant no. 654142);
- [ENVRI PLUS](https://cordis.europa.eu/project/id/654182) (grant no. 654182);
- [BlueBRIDGE](https://cordis.europa.eu/project/id/675680) (grant no. 675680);
- [PerformFISH](https://cordis.europa.eu/project/id/727610) (grant no. 727610);
- [AGINFRA PLUS](https://cordis.europa.eu/project/id/731001) (grant no. 731001);
- [DESIRA](https://cordis.europa.eu/project/id/818194) (grant no. 818194);
- [ARIADNEplus](https://cordis.europa.eu/project/id/823914) (grant no. 823914);
- [RISIS 2](https://cordis.europa.eu/project/id/824091) (grant no. 824091);
- [EOSC-Pillar](https://cordis.europa.eu/project/id/857650) (grant no. 857650);
- [Blue Cloud](https://cordis.europa.eu/project/id/862409) (grant no. 862409);
- [SoBigData-PlusPlus](https://cordis.europa.eu/project/id/871042) (grant no. 871042);


@ -1,4 +1,4 @@
-#European Union Public Licence V.1.1
+#European Union Public Licence V.1.2
 ##*EUPL © the European Community 2007*
@ -12,7 +12,7 @@ The Original Work is provided under the terms of this Licence when the Licensor
 (as defined below) has placed the following notice immediately following the
 copyright notice for the Original Work:
-**Licensed under the EUPL V.1.1**
+**Licensed under the EUPL V.1.2**
 or has expressed by any other mean his willingness to license under the EUPL.

NODES-TO-REMOVE.txt (new file, 155 lines)

@ -0,0 +1,155 @@
// TO REMOVE
[nthl:ExternalLink] > nthl:workspaceLeafItem
- hl:value (String) mandatory
[nthl:externalUrl] > nthl:workspaceLeafItem
[nthl:query] > nthl:workspaceLeafItem
[nthl:aquamapsItem] > nthl:workspaceLeafItem
[nthl:timeSeriesItem] > nthl:workspaceLeafItem
[nthl:report] > nthl:workspaceLeafItem
[nthl:reportTemplate] > nthl:workspaceLeafItem
[nthl:workflowReport] > nthl:workspaceLeafItem
[nthl:workflowTemplate] > nthl:workspaceLeafItem
[nthl:gCubeMetadata] > nthl:workspaceLeafItem
[nthl:gCubeDocument] > nthl:workspaceLeafItem
[nthl:gCubeDocumentLink] > nthl:workspaceLeafItem
[nthl:gCubeImageDocumentLink] > nthl:workspaceLeafItem
[nthl:gCubePDFDocumentLink] > nthl:workspaceLeafItem
[nthl:gCubeImageDocument] > nthl:workspaceLeafItem
[nthl:gCubePDFDocument] > nthl:workspaceLeafItem
[nthl:gCubeURLDocument] > nthl:workspaceLeafItem
[nthl:gCubeAnnotation] > nthl:workspaceLeafItem
[nthl:externalResourceLink] > nthl:workspaceLeafItem
[nthl:tabularDataLink] > nthl:workspaceLeafItem
[nthl:documentAlternativeLink] > nt:base
- hl:parentUri (String) mandatory
- hl:uri (String) mandatory
- hl:name (String) mandatory
- hl:mimeType (String) mandatory
[nthl:documentPartLink] > nthl:documentAlternativeLink
[nthl:documentItemContent] > nthl:workspaceLeafItemContent
- hl:collectionName (String) mandatory
- hl:oid (String) mandatory
+ hl:metadata (nt:unstructured)
= nt:unstructured
mandatory autocreated
+ hl:annotations (nt:unstructured)
= nt:unstructured
mandatory autocreated
+ hl:alternatives (nt:unstructured)
= nt:unstructured
mandatory autocreated
+ hl:parts (nt:unstructured)
= nt:unstructured
mandatory autocreated
[nthl:metadataItemContent] > nthl:workspaceLeafItemContent, nthl:file
- hl:schema (String) mandatory
- hl:language (String) mandatory
- hl:collectionName (String) mandatory
- hl:oid (String) mandatory
[nthl:annotationItemContet] > nthl:workspaceLeafItemContent
- hl:oid (String) mandatory
+ hl:annotations (nt:unstructured)
= nt:unstructured
mandatory autocreated
[nthl:queryItemContent] > nthl:workspaceLeafItemContent
- hl:query (String) mandatory
- hl:queryType (String) mandatory
[nthl:aquamapsItemContent] > nthl:workspaceLeafItemContent, nthl:file
- hl:mapName (String) mandatory
- hl:mapType (String) mandatory
- hl:author (String) mandatory
- hl:numberOfSpecies (Long) mandatory
- hl:boundingBox (String) mandatory
- hl:PSOThreshold (Double) mandatory
- hl:numberOfImages (Long) mandatory
+ hl:images(nt:unstructured)
= nt:unstructured
mandatory autocreated
[nthl:timeSeriesItemContent] > nthl:workspaceLeafItemContent, nthl:file
- hl:id (String) mandatory
- hl:title (String) mandatory
- hl:description (String) mandatory
- hl:creator (String) mandatory
- hl:created (String) mandatory
- hl:publisher (String) mandatory
- hl:sourceId (String) mandatory
- hl:sourceName (String) mandatory
- hl:rights (String) mandatory
- hl:dimension (Long) mandatory
- hl:headerLabels (String)
[nthl:reportItemContent] > nthl:workspaceLeafItemContent, nthl:file
- hl:created (Date) mandatory
- hl:lastEdit (Date) mandatory
- hl:author (String) mandatory
- hl:lastEditBy (String) mandatory
- hl:templateName (String) mandatory
- hl:numberOfSection (Long) mandatory
- hl:status (String) mandatory
[nthl:reportTemplateContent] > nthl:workspaceLeafItemContent, nthl:file
- hl:created (Date) mandatory
- hl:lastEdit (Date) mandatory
- hl:author (String) mandatory
- hl:lastEditBy (String) mandatory
- hl:numberOfSection (Long) mandatory
- hl:status (String) mandatory
[nthl:externalResourceLinkContent] > nthl:workspaceLeafItemContent
- hl:mimeType (String)
- hl:size (long) mandatory
- hl:resourceId (String) mandatory
- hl:servicePlugin (String) mandatory
[nthl:tabularDataLinkContent] > nthl:workspaceLeafItemContent
- hl:tableID (String) mandatory
- hl:tableTemplateID (String) mandatory
- hl:provenance (String) mandatory
- hl:runtimeResourceID (String) mandatory
- hl:operator (String)
[nthl:smartFolderContent] > nt:base
- hl:query (String) mandatory
- hl:folderId (String)
[nthl:folderBulkCreator] > nt:base
- hl:folderId (String) mandatory
- hl:status (Long)
= '0'
mandatory autocreated
- hl:failures (Long)
= '0'
mandatory autocreated
- hl:requests (Long) mandatory
[nthl:rootFolderBulkCreator] > nt:folder
+ * (nthl:folderBulkCreator)
= nthl:folderBulkCreator

ToRemoveOnImport (new file, 3 lines)

@ -0,0 +1,3 @@
nodeType to remove on new import from a backup:
externalUrl

buildImageAndStart.sh (new executable file, 15 lines)

@ -0,0 +1,15 @@
#!/bin/bash
set -e
NAME=storagehub
PORT=8100
DEBUG_PORT=5005
debug=false
compile=false
mvn clean package
docker compose -f docker-compose-standalone.yml build
docker compose -f docker-compose-standalone.yml up


@ -1,14 +0,0 @@
<ReleaseNotes>
<Changeset component="org.gcube.data-access.storagehub-webapp.1.0.8"
date="2019-09-20">
<Change>Bug on ushare owner fixed</Change>
</Changeset>
<Changeset component="org.gcube.data-access.storagehub-webapp.1.0.5"
date="2019-04-04">
<Change>Active wait for lock in case of item creation added</Change>
</Changeset>
<Changeset component="org.gcube.data-access.storagehub-webapp.1.0.0"
date="2015-07-01">
<Change>First commit</Change>
</Changeset>
</ReleaseNotes>


@ -1,32 +0,0 @@
<assembly
xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
<id>servicearchive</id>
<formats>
<format>tar.gz</format>
</formats>
<baseDirectory>/</baseDirectory>
<fileSets>
<fileSet>
<directory>.</directory>
<outputDirectory>/</outputDirectory>
<useDefaultExcludes>true</useDefaultExcludes>
<includes>
<include>README.md</include>
<include>LICENSE.md</include>
<include>changelog.xml</include>
<include>profile.xml</include>
</includes>
<fileMode>755</fileMode>
<filtered>true</filtered>
</fileSet>
</fileSets>
<files>
<file>
<source>target/${build.finalName}.war</source>
<outputDirectory>/${artifactId}</outputDirectory>
</file>
</files>
</assembly>


@ -1 +0,0 @@
${gcube.license}


@ -1,66 +0,0 @@
The gCube System - ${name}
--------------------------------------------------
${description}
${gcube.description}
${gcube.funding}
Version
--------------------------------------------------
${version} (${buildDate})
Please see the file named "changelog.xml" in this directory for the release notes.
Authors
--------------------------------------------------
* Lucio Lelii (lucio.lelii-AT-isti.cnr.it), CNR Pisa,
Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo".
Maintainers
-----------
* Lucio Lelii (lucio.lelii-AT-isti.cnr.it), CNR Pisa,
Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo".
Download information
--------------------------------------------------
Source code is available from SVN:
${scm.url}
Binaries can be downloaded from the gCube website:
${gcube.website}
Installation
--------------------------------------------------
Installation documentation is available on-line in the gCube Wiki:
${gcube.wikiRoot}/Home_Library_2.0_API_Framework_Specification
Documentation
--------------------------------------------------
Documentation is available on-line in the gCube Wiki:
${gcube.wikiRoot}/StorageHub_API_Framework_Specification
Support
--------------------------------------------------
Bugs and support requests can be reported in the gCube issue tracking tool:
${gcube.issueTracking}
Licensing
--------------------------------------------------
This software is licensed under the terms you may find in the file named "LICENSE" in this directory.


@ -1,7 +0,0 @@
<application mode='online'>
<name>StorageHub</name>
<group>DataAccess</group>
<version>1.0.0-SNAPSHOT</version>
<description>Storage Hub webapp</description>
<local-persistence location='target' />
</application>


docker-compose-local.yml (new file, 49 lines)

@ -0,0 +1,49 @@
services:
elb:
image: haproxy
ports:
- "8100:8100"
volumes:
- ./docker/haproxy:/usr/local/etc/haproxy
postgres:
image: postgres:10.5
restart: always
environment:
- POSTGRES_DB=workspace-db
- POSTGRES_USER=ws-db-user
- POSTGRES_PASSWORD=dbPwd
logging:
options:
max-size: 10m
max-file: "3"
ports:
- '5423:5432'
volumes:
- ./postgres-data:/var/lib/postgresql/data
- ./sql/create_tables.sql:/docker-entrypoint-initdb.d/create_tables.sql
storagehub:
build:
dockerfile: ./Dockerfile-local
ports:
- '8080:8080'
- '4954:4954'
environment:
- ADMINISTRATION_PORT_ENABLED=true
- DOMAIN_NAME=docker_domain
- JPDA_OPTS="-agentlib:jdwp=transport=dt_socket,address=0.0.0.0:4954,server=y,suspend=n"
- JPDA_ADDRESS=*:4954
minio:
image: minio/minio
ports:
- "9000:9000"
- "9001:9001"
volumes:
- minio_storage:/data
environment:
MINIO_ROOT_USER: SHUBTEST
MINIO_ROOT_PASSWORD: wJalrXUtnFEMI/K7MDENG/bPxRfiCY
command: server --console-address ":9001" /data
volumes:
minio_storage: {}


@ -0,0 +1,46 @@
services:
postgres:
image: postgres:16.2
restart: always
environment:
- POSTGRES_DB=workspace-db
- POSTGRES_USER=ws-db-user
- POSTGRES_PASSWORD=dbPwd
logging:
options:
max-size: 10m
max-file: "3"
ports:
- '5423:5432'
volumes:
- ./data/postgres-data:/var/lib/postgresql/data
      # copy the sql script to create tables
- ./data/sql/create_tables.sql:/docker-entrypoint-initdb.d/create_tables.sql
storagehub:
build:
dockerfile: Dockerfile-standalone
environment:
_JAVA_OPTIONS:
-Xdebug
-agentlib:jdwp=transport=dt_socket,server=y,suspend=${SUSPEND:-n},address=*:5005
ports:
- '8081:8080'
- '5005:5005'
volumes:
- /tmp:/tomcat/temp
minio:
image: minio/minio
ports:
- "9000:9000"
- "9001:9001"
volumes:
- minio_storage:/data
environment:
MINIO_ROOT_USER: SHUBTEST
MINIO_ROOT_PASSWORD: wJalrXUtnFEMI/K7MDENG/bPxRfiCY
command: server --console-address ":9001" /data
volumes:
minio_storage: {}

docker-compose.yml (new file, 9 lines)

@ -0,0 +1,9 @@
services:
storagehub:
image: d4science/storagehub:latest
ports:
- '${APP_PORT}:8080'
volumes:
- ${JACKRABBIT_FOLDER}:/app/jackrabbit
- ${SMARTGEARS_CONFIG_FOLDER}:/etc

docker/container.ini (new file, 23 lines)

@ -0,0 +1,23 @@
[node]
mode = offline
hostname = dlib29.isti.cnr.it
protocol= http
port = 8080
infrastructure = gcube
authorizeChildrenContext = true
publicationFrequencyInSeconds = 60
[properties]
SmartGearsDistribution = 4.0.0-SNAPSHOT
SmartGearsDistributionBundle = UnBundled
[site]
country = it
location = pisa
[authorization]
factory = org.gcube.smartgears.security.defaults.DefaultAuthorizationProviderFactory
factory.endpoint = https://accounts.cloud-dev.d4science.org/auth/realms/d4science/protocol/openid-connect/token
credentials.class = org.gcube.smartgears.security.SimpleCredentials
credentials.clientID = node-whn-test-uno-d-d4s.d4science.org
credentials.secret = 979bd3bc-5cc4-11ec-bf63-0242ac130002


@ -0,0 +1,18 @@
# haproxy.cfg
global
lua-load /usr/local/etc/haproxy/send_to_all_whnmanager.lua
frontend http
bind *:8100
mode http
timeout client 10s
http-request use-service lua.broadcast_to_nodes if { path /whn-manager meth PUT DELETE }
use_backend all
backend all
mode http
option httpchk
http-check send meth GET uri /storagehub/gcube/resource/health
http-check expect status 200
server s1 storagehub:8080 check


@ -0,0 +1,21 @@
local function broadcast_to_nodes(req)
-- Get all servers in the backend
local servers = pxn.get_servers("all")
for _, server in ipairs(servers) do
-- Forward the request to each server
pxn.http_request({
"PUT", -- Method
server["address"], -- Address of the server
tonumber(server["port"]), -- Port of the server
req.method, -- HTTP method
req.uri, -- URI
req.headers, -- Headers
req.body -- Body
})
end
end
core.register_service("broadcast_to_nodes", "http", broadcast_to_nodes)
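The Lua service above fans a single management request out to every server in the `all` backend. The same fan-out logic, sketched in Python with a pluggable `send` callable so it can be exercised without a real HTTP client (all names here are illustrative, not part of the service):

```python
def broadcast(servers, request, send):
    """Forward one management request (e.g. PUT/DELETE on /whn-manager)
    to every backend server rather than to a single one.

    servers: iterable of (address, port) pairs
    request: dict with "method", "uri", "headers" and "body"
    send:    callable performing the actual HTTP call
    """
    results = []
    for address, port in servers:
        results.append(send(address, int(port), request["method"],
                            request["uri"], request["headers"], request["body"]))
    return results

# try it with a recording stub instead of a real HTTP client
calls = []
def stub(address, port, method, uri, headers, body):
    calls.append((address, port, method))
    return "ok"

broadcast([("10.0.0.1", 8080), ("10.0.0.2", 8080)],
          {"method": "PUT", "uri": "/whn-manager", "headers": {}, "body": ""},
          stub)
print(calls)  # [('10.0.0.1', 8080, 'PUT'), ('10.0.0.2', 8080, 'PUT')]
```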


@ -0,0 +1,11 @@
#bootstrap properties for the repository startup servlet.
#Fri Jul 21 05:19:29 CEST 2017
java.naming.factory.initial=org.apache.jackrabbit.core.jndi.provider.DummyInit$
repository.home=jackrabbit
rmi.enabled=true
repository.config=jackrabbit/repository.xml
repository.name=jackrabbit.repository
rmi.host=localhost
java.naming.provider.url=http\://www.apache.org/jackrabbit
jndi.enabled=true
rmi.port=0


@ -0,0 +1,111 @@
<?xml version="1.0" encoding="UTF-8"?>
<!-- Licensed to the Apache Software Foundation (ASF) under one or more contributor
license agreements. See the NOTICE file distributed with this work for additional
information regarding copyright ownership. The ASF licenses this file to
You under the Apache License, Version 2.0 (the "License"); you may not use
this file except in compliance with the License. You may obtain a copy of
the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required
by applicable law or agreed to in writing, software distributed under the
License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS
OF ANY KIND, either express or implied. See the License for the specific
language governing permissions and limitations under the License. -->
<!DOCTYPE Repository PUBLIC "-//The Apache Software Foundation//DTD Jackrabbit 2.0//EN" "http://jackrabbit.apache.org/dtd/repository-2.0.dtd">
<Repository>
<!-- virtual file system where the repository stores global state (e.g.
registered namespaces, custom node types, etc.) -->
<FileSystem class="org.apache.jackrabbit.core.fs.db.DbFileSystem">
<param name="driver" value="org.postgresql.Driver" />
<param name="schema" value="postgresql" />
<param name="url" value="jdbc:postgresql://postgres:5432/workspace-db" />
<param name="user" value="ws-db-user" />
<param name="password" value="dbPwd" />
<param name="schemaObjectPrefix" value="rep_" />
</FileSystem>
<!-- data store configuration -->
<DataStore class="org.apache.jackrabbit.core.data.db.DbDataStore">
<param name="driver" value="org.postgresql.Driver" />
<param name="databaseType" value="postgresql" />
<param name="url" value="jdbc:postgresql://postgres:5432/workspace-db" />
<param name="user" value="ws-db-user" />
<param name="password" value="dbPwd" />
<param name="minRecordLength" value="1024" />
<param name="maxConnections" value="3" />
<param name="copyWhenReading" value="true" />
<param name="tablePrefix" value="datastore_" />
<param name="schemaObjectPrefix" value="" />
</DataStore>
<!-- security configuration -->
<Security appName="Jackrabbit">
<SecurityManager class="org.apache.jackrabbit.core.DefaultSecurityManager" />
<AccessManager class="org.apache.jackrabbit.core.security.DefaultAccessManager" />
<LoginModule class="org.apache.jackrabbit.core.security.authentication.DefaultLoginModule">
<param name="adminId" value="admin" />
<param name="adminPassword" value="admin" />
</LoginModule>
</Security>
<!-- location of workspaces root directory and name of default workspace -->
<Workspaces rootPath="${rep.home}/workspaces" defaultWorkspace="default" />
<Workspace name="${wsp.name}">
<FileSystem class="org.apache.jackrabbit.core.fs.local.LocalFileSystem">
<param name="path" value="${wsp.home}" />
</FileSystem>
<PersistenceManager class="org.apache.jackrabbit.core.persistence.pool.PostgreSQLPersistenceManager">
<param name="driver" value="org.postgresql.Driver" />
<param name="url" value="jdbc:postgresql://postgres:5432/workspace-db" />
<param name="schema" value="postgresql" />
<param name="user" value="ws-db-user" />
<param name="password" value="dbPwd" />
<param name="schemaObjectPrefix" value="pm_${wsp.name}_" />
<param name="bundleCacheSize" value="600" />
<param name="errorHandling" value="IGNORE_MISSING_BLOBS" />
<param name="consistencyFix" value="false" />
<param name="consistencyCheck" value="false" />
</PersistenceManager>
<!-- Search index and the file system it uses. class: FQN of class implementing
the QueryHandler interface -->
<SearchIndex class="org.apache.jackrabbit.core.query.lucene.SearchIndex">
<param name="path" value="${wsp.home}/index" />
<param name="supportHighlighting" value="true" />
<param name="autoRepair" value="true" />
<param name="onWorkspaceInconsistency" value="log" />
<param name="indexingConfiguration" value="${rep.home}/indexing_configuration.xml" />
<param name="resultFetchSize" value="50" />
<param name="cacheSize" value="100000" />
<param name="enableConsistencyCheck" value="false" />
<param name="forceConsistencyCheck" value="false" />
</SearchIndex>
</Workspace>
<!-- Configures the versioning -->
<Versioning rootPath="${rep.home}/version">
<!-- Configures the filesystem to use for versioning for the respective
persistence manager -->
<FileSystem class="org.apache.jackrabbit.core.fs.local.LocalFileSystem">
<param name="path" value="${rep.home}/version" />
</FileSystem>
<PersistenceManager class="org.apache.jackrabbit.core.persistence.pool.PostgreSQLPersistenceManager">
<param name="driver" value="org.postgresql.Driver" />
<param name="url" value="jdbc:postgresql://postgres:5432/workspace-db" />
<param name="schema" value="postgresql" />
<param name="user" value="ws-db-user" />
<param name="password" value="dbPwd" />
<param name="schemaObjectPrefix" value="pm_version_" />
<param name="bundleCacheSize" value="600" />
<param name="consistencyFix" value="false" />
<param name="consistencyCheck" value="false" />
</PersistenceManager>
</Versioning>
<!-- Cluster configuration -->
<!-- Cluster id="storagehub1.d4science.org" syncDelay="2000">
<Journal class="org.apache.jackrabbit.core.journal.DatabaseJournal">
<param name="driver" value="org.postgresql.Driver" />
<param name="url" value="jdbc:postgresql://postgres/workspace-db" />
<param name="databaseType" value="postgresql" />
<param name="schemaObjectPrefix" value="journal_" />
<param name="user" value="ws-db-user" />
<param name="password" value="dbPwd" />
<param name="revision" value="${rep.home}/revision.log" />
<param name="janitorEnabled" value="false"/>
<set to true if you want to daily clean the journal table https://wiki.apache.org/jackrabbit/Clustering#Removing_Old_Revisions>
</Journal>
</Cluster > -->
</Repository>


@ -0,0 +1,23 @@
[node]
mode = offline
hostname = dlib29.isti.cnr.it
protocol= http
port = 8080
infrastructure = gcube
authorizeChildrenContext = true
publicationFrequencyInSeconds = 60
[properties]
SmartGearsDistribution = 4.0.0-SNAPSHOT
SmartGearsDistributionBundle = UnBundled
[site]
country = it
location = pisa
[authorization]
factory = org.gcube.smartgears.security.defaults.DefaultAuthorizationProviderFactory
factory.endpoint = https://accounts.dev.d4science.org/auth/realms/d4science/protocol/openid-connect/token
credentials.class = org.gcube.smartgears.security.SimpleCredentials
credentials.clientID = node-whn-test-uno-d-d4s.d4science.org
credentials.secret = 979bd3bc-5cc4-11ec-bf63-0242ac130002


@ -0,0 +1,10 @@
default.bucketName = storagehub-dev
default.key = SHUBTEST
default.secret = wJalrXUtnFEMI/K7MDENG/bPxRfiCY
default.url = minio:9000
default.createBucket = true
volatile.bucketName = storagehub-volatile-dev
volatile.key = SHUBTEST
volatile.secret = wJalrXUtnFEMI/K7MDENG/bPxRfiCY
volatile.url = minio:9000
volatile.createBucket = true

docker/logback.xml (new file, 31 lines)

@ -0,0 +1,31 @@
<configuration scan="true" debug="true">
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
<file>storagehub.log</file>
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<logger name="org.gcube" level="DEBUG" />
<logger name="org.gcube.smartgears" level="TRACE" />
<logger name="org.gcube.smartgears.handlers" level="TRACE" />
<logger name="org.gcube.common.events" level="WARN" />
<logger name="org.gcube.data.publishing" level="ERROR" />
<logger name="org.gcube.documentstore" level="ERROR" />
<logger name="org.gcube.common.core.publisher.is.legacy" level="TRACE" />
<logger name="org.gcube.data.access" level="TRACE" />
<logger name="org.gcube.data.access.storagehub.handlers" level="DEBUG" />
<root level="WARN">
<appender-ref ref="STDOUT" />
<appender-ref ref="FILE" />
</root>
</configuration>

docker/properties (new file, 6 lines)

@ -0,0 +1,6 @@
${{adminId}}=workspace
${{adminPwd}}=gcube
${{db-host}}=postgres
${{ws-db}}=workspace-db
${{dbUser}}=ws-db-user
${{dbPwd}}=dbPwd
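The `${{name}}=value` pairs above look like substitution variables for the configuration templates shipped with the image (e.g. repository.xml). A possible substitution pass, sketched in Python — the token syntax is taken from the file above, but the mechanism itself is an assumption about how these properties are consumed:

```python
import re

def apply_properties(template: str, props: dict) -> str:
    """Replace every ${{name}} token in a template string with its value."""
    return re.sub(r"\$\{\{([\w-]+)\}\}", lambda m: props[m.group(1)], template)

props = {"db-host": "postgres", "ws-db": "workspace-db"}
line = "url=jdbc:postgresql://${{db-host}}:5432/${{ws-db}}"
print(apply_properties(line, props))  # url=jdbc:postgresql://postgres:5432/workspace-db
```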


@ -0,0 +1,10 @@
default.bucketName=storagehub-dev
default.key=SHUBTEST
default.secret=wJalrXUtnFEMI/K7MDENG/bPxRfiCY
default.url=http://minio:9000/
default.createBucket=true
volatile.bucketName=storagehub-volatile-dev
volatile.key=SHUBTEST
volatile.secret=wJalrXUtnFEMI/K7MDENG/bPxRfiCY
volatile.url=http://minio:9000/
volatile.createBucket=true

docker/storagehub.xml (new file, 10 lines)

@ -0,0 +1,10 @@
<Context path="/storagehub">
<Resource
name="jcr/repository"
auth="Container"
type="javax.jcr.Repository"
factory="org.apache.jackrabbit.core.jndi.BindableRepositoryFactory"
configFilePath="/app/jackrabbit/repository.xml"
repHomeDir="/app/jackrabbit/workspaces"
/>
</Context>


@ -0,0 +1,41 @@
<?xml version="1.0" encoding="UTF-8"?>
<enunciate
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://enunciate.webcohesion.com/schemas/enunciate-2.18.0.xsd">
<description>
<![CDATA[
<h1>StorageHUB</h1>
<p>StorageHUB is the service implementing the gCube Workspace feature.</p>
<p>It provides an intermediate layer between the storage and the services
willing to access it.</p>
]]>
</description>
<code-license>This project is licensed under the EUPL V.1.2 License - see the LICENSE.md file for details.</code-license>
<modules>
<gwt-json-overlay disabled="true" />
<php-json-client disabled="true" />
<ruby-json-client disabled="true" />
<java-json-client disabled="true" />
<javascript-client disabled="true" />
<java-xml-client disabled="true" />
<jaxb disabled="true" />
<jaxws disabled="true" />
<c-xml-client disabled="true" />
<csharp-xml-client disabled="true" />
<obj-c-xml-client disabled="true" />
<php-xml-client disabled="true" />
<spring-web disabled="true" />
<jaxrs groupBy="class" disableExamples="false" path-sort-strategy="depth_first" />
<swagger basePath="/workspace" />
<docs docsDir="${project.build.directory}" docsSubdir="api-docs" />
<docs
freemarkerTemplate="${project.basedir}/src/main/resources/META-INF/enunciate/d4science_docs.fmt">
<additional-css file="css/d4science_enunciate_custom.css" />
</docs>
</modules>
</enunciate>


@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)


@ -0,0 +1,58 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
# -- Project information -----------------------------------------------------
project = 'StorageHub'
copyright = '2024, Lucio Lelii, Biagio Peccerillo'
author = 'Lucio Lelii, Biagio Peccerillo'
# The full version, including alpha/beta/rc tags
release = '2.0.1'
# -- General configuration ---------------------------------------------------
source_suffix = {
'.rst': 'restructuredtext',
}
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = []
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'sphinx_rtd_theme'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']


@ -0,0 +1,8 @@
Welcome to StorageHub's documentation!
======================================
.. toctree::
:maxdepth: 2
:caption: Contents:
intro.rst


@ -0,0 +1,98 @@
Introduction
============
StorageHub is a versatile service designed to provide seamless access
to various storage resources, ensuring data persistence and management. It acts
as an intermediary layer that can interface with any underlying storage
solution, such as Amazon S3 or MongoDB, offering a unified and flexible approach
to storage management.
Base URL
--------
In the production environment, the base URL of the service is https://api.d4science.org/
Key Features
------------
Flexibility and Integration
~~~~~~~~~~~~~~~~~~~~~~~~~~~
StorageHub is designed to be highly flexible, allowing it to serve as an
intermediate layer for diverse storage solutions. This flexibility ensures that
it can adapt to different storage backends without requiring significant changes
to the applications that rely on it.
RESTful Interface
~~~~~~~~~~~~~~~~~
StorageHub exposes a RESTful API, which allows any application capable of making
HTTP requests to access it. This REST interface provides a standardized way to
interact with the storage resources, enabling easy integration with various
applications and services. See the available REST-API on `StorageHub API docs
<../api-docs/index.html>`_.
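As a sketch of what such a call looks like, the snippet below builds a GET
request against the production base URL with a standard ``urllib`` client.
The ``/workspace`` path and the placeholder token are illustrative assumptions,
not an exact endpoint from the API docs:

```python
import urllib.request

BASE_URL = "https://api.d4science.org"

def build_request(path, token):
    """Build (but do not send) a GET request for a StorageHub resource."""
    req = urllib.request.Request(f"{BASE_URL}{path}")
    # Every call must carry the JWT obtained from the D4Science IAM
    # (see the Authorization section below for how tokens are issued).
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/json")
    return req

req = build_request("/workspace", "<your-JWT-here>")
print(req.full_url)  # https://api.d4science.org/workspace
```

Sending the request (``urllib.request.urlopen(req)``) then returns the JSON
response body; any HTTP client in any language follows the same pattern.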
Metadata Management
~~~~~~~~~~~~~~~~~~~
StorageHub leverages a Jackrabbit-based object store to manage all metadata
associated with the stored data. This ensures that metadata is efficiently
organized and easily retrievable, enhancing the overall data management
capabilities of the service.
Direct Payload Storage
~~~~~~~~~~~~~~~~~~~~~~
While metadata is handled by Jackrabbit, the actual data payloads are stored
directly on the underlying storage solutions. This approach optimizes storage
efficiency and performance, ensuring that large data payloads are managed
effectively.
Primary Use Cases
-----------------
Workspace
~~~~~~~~~
The main application that interacts with StorageHub is the Workspace portlet,
which is easily accessible from the Virtual Research Environments (VREs). The
Workspace provides a "standard" HTML interface where users can perform all the
common operations available in a file system, such as creating, reading,
updating, and deleting files and directories.
In addition to these standard file system operations, the Workspace offers
features that are specific to VREs. These include publishing on the Catalogue,
sharing resources with other users, and managing versions of files. These
capabilities make the Workspace a versatile tool for managing data within the
VREs, leveraging the services provided by StorageHub.
Java Client
~~~~~~~~~~~
The methods of the Web Service can be called by writing your own REST client
application or by using already existing REST client plugins.
In case of a Java client, we provide the StorageHub Client Library, which is a
Java library designed to facilitate seamless interaction with StorageHub. It
abstracts the complexities of the REST API, providing a more intuitive and
convenient interface for Java developers.
The StorageHub Client Library allows developers to easily integrate StorageHub's
capabilities into their applications without dealing with the intricacies of
HTTP requests and responses. The library handles all the necessary communication
with StorageHub, allowing developers to focus on their application's core
functionality.
.. tip:: If you're coding in Java, it is recommended that you include the
   StorageHub Client Library in your project.
Authorization
-------------
D4Science adopts state-of-the-art industry standards for authentication and
authorization. Specifically, the implementation fully adopts `OIDC (OpenID
Connect) <https://openid.net/connect>`_ for authentication and UMA 2 (User
Managed Authorization) for authorization flows. `JSON Web Token (JWT) access
tokens <https://jwt.io/>`_ are used for both authentication and authorization.
Obtain your Bearer token here: https://dev.d4science.org/how-to-access-resources
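A JWT is three base64url segments (``header.payload.signature``); the claims in
the payload carry the identity and expiry the services act on. The sketch below
decodes the payload of a toy token without verifying its signature, just to show
the structure (the claim names and values are illustrative, not a real
D4Science credential):

```python
import base64
import json

def jwt_payload(token):
    """Return the decoded (unverified) payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    # JWT segments use base64url without padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Assemble a toy token for demonstration purposes only.
header = base64.urlsafe_b64encode(b'{"alg":"none"}').rstrip(b"=").decode()
claims = base64.urlsafe_b64encode(b'{"sub":"jane","exp":1735689600}').rstrip(b"=").decode()
token = f"{header}.{claims}."

print(jwt_payload(token)["sub"])  # jane
```

In real use the token obtained from the IAM is passed as-is in the
``Authorization: Bearer <token>`` header; signature verification is the
server's job, not the client's.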


@ -0,0 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd


@ -0,0 +1,7 @@
name: StorageHub
group: DataAccess
version: ${version}
description: ${description}
excludes:
- path: /workspace/api-docs/*
- path: /workspace/docs/*


@ -0,0 +1,18 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2013, 2022 Oracle and/or its affiliates. All rights reserved.
This program and the accompanying materials are made available under the
terms of the Eclipse Distribution License v. 1.0, which is available at
http://www.eclipse.org/org/documents/edl-v10.php.
SPDX-License-Identifier: BSD-3-Clause
-->
<beans xmlns="https://jakarta.ee/xml/ns/jakartaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="https://jakarta.ee/xml/ns/jakartaee https://jakarta.ee/xml/ns/jakartaee/beans_4_0.xsd"
version="4.0" bean-discovery-mode="all">
</beans>


@ -0,0 +1,39 @@
<?xml version="1.0" encoding="UTF-8"?>
<web-app>
<context-param>
<param-name>admin-username</param-name>
<param-value>admin</param-value>
</context-param>
<context-param>
<param-name>resolver-basepath</param-name>
<param-value>https://data-d.d4science.org/shub</param-value>
</context-param>
<servlet>
<servlet-name>org.gcube.data.access.storagehub.StorageHub</servlet-name>
</servlet>
<servlet-mapping>
<servlet-name>default</servlet-name>
<url-pattern>/workspace/docs/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>default</servlet-name>
<url-pattern>/workspace/api-docs/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>org.gcube.data.access.storagehub.StorageHub</servlet-name>
<url-pattern>/workspace/*</url-pattern>
</servlet-mapping>
<filter>
<filter-name>DocsRedirectFilter</filter-name>
<filter-class>org.gcube.data.access.storagehub.DocsRedirectFilter</filter-class>
</filter>
<filter-mapping>
<filter-name>DocsRedirectFilter</filter-name>
<url-pattern>/workspace/docs</url-pattern>
<url-pattern>/workspace/docs/</url-pattern>
<url-pattern>/workspace/api-docs</url-pattern>
<url-pattern>/workspace/api-docs/</url-pattern>
</filter-mapping>
</web-app>

pom.xml

@ -1,139 +1,121 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" <project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent> <parent>
<artifactId>maven-parent</artifactId> <artifactId>maven-parent</artifactId>
<groupId>org.gcube.tools</groupId> <groupId>org.gcube.tools</groupId>
<version>1.1.0</version> <version>1.2.0</version>
<relativePath /> <relativePath />
</parent> </parent>
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<groupId>org.gcube.data.access</groupId> <groupId>org.gcube.data.access</groupId>
<artifactId>storagehub</artifactId> <artifactId>storagehub</artifactId>
<version>1.0.8</version> <version>2.0.1-SNAPSHOT</version>
<name>storagehub</name> <name>storagehub</name>
<scm> <scm>
<connection>scm:git:https://code-repo.d4science.org/gCubeSystem/storagehub.git</connection> <connection>
<developerConnection>scm:git:https://code-repo.d4science.org/gCubeSystem/storagehub.git</developerConnection> scm:git:https://code-repo.d4science.org/gCubeSystem/storagehub.git</connection>
<developerConnection>
scm:git:https://code-repo.d4science.org/gCubeSystem/storagehub.git</developerConnection>
<url>https://code-repo.d4science.org/gCubeSystem/storagehub</url> <url>https://code-repo.d4science.org/gCubeSystem/storagehub</url>
</scm> </scm>
<packaging>war</packaging> <packaging>war</packaging>
<properties> <properties>
<webappDirectory>${project.basedir}/src/main/webapp/WEB-INF</webappDirectory> <webappDirectory>${project.basedir}/src/main/webapp/WEB-INF</webappDirectory>
<jackrabbit.version>2.16.0</jackrabbit.version> <jackrabbit.version>2.22.0</jackrabbit.version>
<tomcat.version>7.0.40</tomcat.version> <jackson.version>2.15.3</jackson.version>
<jetty.version>6.1.26</jetty.version> <slf4j.version>2.0.12</slf4j.version>
<tika.version>1.21</tika.version> <tika.version>2.6.0</tika.version>
<slf4j.api.version>1.6.6</slf4j.api.version> <aspectj-plugin.version>1.14.0</aspectj-plugin.version>
<slf4j.version>1.7.4</slf4j.version> <!-- sync with logback version -->
<logback.version>1.0.12</logback.version>
<distroDirectory>${project.basedir}/distro</distroDirectory> <distroDirectory>${project.basedir}/distro</distroDirectory>
<description>REST web service for Jackrabbit</description> <description>REST web service for Jackrabbit</description>
<warname>storagehub</warname> <warname>storagehub</warname>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding> <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<enunciate.version>2.18.1</enunciate.version>
<maven.compiler.source>17</maven.compiler.source>
<maven.compiler.target>17</maven.compiler.target>
<java_version>17</java_version>
</properties> </properties>
<dependencyManagement> <dependencyManagement>
<dependencies> <dependencies>
<dependency> <dependency>
<groupId>org.gcube.distribution</groupId> <groupId>org.gcube.distribution</groupId>
<artifactId>maven-smartgears-bom</artifactId> <artifactId>gcube-smartgears-bom</artifactId>
<version>LATEST</version> <version>4.0.1-SNAPSHOT</version>
<type>pom</type> <type>pom</type>
<scope>import</scope> <scope>import</scope>
</dependency> </dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>1.8.2</version>
</dependency>
</dependencies> </dependencies>
<!-- https://mvnrepository.com/artifact/org.apache.tika/tika-parsers -->
</dependencyManagement> </dependencyManagement>
<dependencies> <dependencies>
<dependency> <dependency>
<groupId>org.aspectj</groupId> <groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId> <artifactId>aspectjrt</artifactId>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.gcube.core</groupId> <groupId>org.gcube.core</groupId>
<artifactId>common-smartgears</artifactId> <artifactId>common-smartgears</artifactId>
</dependency> </dependency>
<dependency>
<groupId>org.gcube.core</groupId>
<artifactId>common-smartgears-app</artifactId>
</dependency>
<dependency> <dependency>
<groupId>org.gcube.common</groupId> <groupId>org.gcube.common</groupId>
<artifactId>authorization-control-library</artifactId> <artifactId>authorization-control-library</artifactId>
<version>[1.0.0-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.gcube.common</groupId> <groupId>org.gcube.common</groupId>
<artifactId>common-authorization</artifactId> <artifactId>common-authorization</artifactId>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.gcube.core</groupId> <groupId>org.gcube.core</groupId>
<artifactId>common-encryption</artifactId> <artifactId>common-encryption</artifactId>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.gcube.core</groupId> <groupId>org.gcube.core</groupId>
<artifactId>common-scope-maps</artifactId> <artifactId>common-scope-maps</artifactId>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.gcube.core</groupId> <groupId>org.gcube.core</groupId>
<artifactId>common-scope</artifactId> <artifactId>common-scope</artifactId>
</dependency> </dependency>
<dependency>
<groupId>org.gcube.core</groupId>
<artifactId>common-encryption</artifactId>
</dependency>
<dependency> <dependency>
<groupId>org.gcube.common</groupId> <groupId>org.gcube.common</groupId>
<artifactId>storagehub-model</artifactId> <artifactId>storagehub-model</artifactId>
<version>[1.0.0-SNAPSHOT,2.0.0-SNAPSHOT)</version>
</dependency> </dependency>
<dependency>
<groupId>org.gcube.data.access</groupId>
<artifactId>storagehub-script-utils</artifactId>
<version>[2.0.0-SNAPSHOT,3.0.0)</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
</dependency>
<dependency> <dependency>
<groupId>com.itextpdf</groupId> <groupId>com.itextpdf</groupId>
<artifactId>itextpdf</artifactId> <artifactId>itextpdf</artifactId>
<version>5.5.13.1</version> <version>5.5.13.2</version>
</dependency> </dependency>
<!-- https://mvnrepository.com/artifact/org.bouncycastle/bcprov-jdk15on --> <!-- https://mvnrepository.com/artifact/org.bouncycastle/bcprov-jdk15on -->
<dependency> <dependency>
<groupId>org.bouncycastle</groupId> <groupId>org.bouncycastle</groupId>
<artifactId>bcprov-jdk15on</artifactId> <artifactId>bcprov-jdk15on</artifactId>
<version>1.62</version> <version>1.62</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.slf4j</groupId> <groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId> <artifactId>slf4j-api</artifactId>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.gcube.common</groupId> <groupId>org.gcube.common</groupId>
<artifactId>gxJRS</artifactId> <artifactId>gxJRS</artifactId>
<version>[1.0.0-SNAPSHOT, 2.0.0-SNAPSHOT)</version>
</dependency> </dependency>
<!-- JCR dependencies --> <!-- JCR dependencies -->
<dependency> <dependency>
<groupId>javax.jcr</groupId> <groupId>javax.jcr</groupId>
<artifactId>jcr</artifactId> <artifactId>jcr</artifactId>
@ -142,151 +124,159 @@
<dependency> <dependency>
<groupId>org.apache.jackrabbit</groupId> <groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-api</artifactId> <artifactId>jackrabbit-api</artifactId>
<version>${jackrabbit.version}</version> <version>2.19.3</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.apache.jackrabbit</groupId> <groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-core</artifactId> <artifactId>jackrabbit-core</artifactId>
<version>${jackrabbit.version}</version> <version>${jackrabbit.version}</version>
</dependency> </dependency>
<dependency>
<groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-jcr-server</artifactId>
<version>${jackrabbit.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.tika/tika-parsers -->
<dependency>
<groupId>org.apache.tika</groupId>
<artifactId>tika-parsers</artifactId>
<version>1.21</version>
</dependency>
<!-- <dependency> <groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-jcr-server</artifactId>
<version>${jackrabbit.version}</version> </dependency> -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.tika/tika-core --> <!-- https://mvnrepository.com/artifact/org.apache.tika/tika-core -->
<dependency> <dependency>
<groupId>org.apache.tika</groupId> <groupId>org.apache.tika</groupId>
<artifactId>tika-core</artifactId> <artifactId>tika-core</artifactId>
<version>1.21</version> <version>${tika.version}</version>
</dependency> </dependency>
<!-- jersey -->
<dependency> <dependency>
<groupId>javax.ws.rs</groupId> <groupId>org.apache.tika</groupId>
<artifactId>javax.ws.rs-api</artifactId> <artifactId>tika-parsers-standard-package</artifactId>
<version>2.0</version> <version>${tika.version}</version>
</dependency>
<!-- needed to manage strange image types -->
<dependency>
<groupId>com.twelvemonkeys.imageio</groupId>
<artifactId>imageio-jpeg</artifactId>
<version>3.3.2</version>
</dependency>
<dependency>
<groupId>com.twelvemonkeys.imageio</groupId>
<artifactId>imageio-bmp</artifactId>
<version>3.3.2</version>
</dependency>
<dependency>
<groupId>com.twelvemonkeys.imageio</groupId>
<artifactId>imageio-core</artifactId>
<version>3.3.2</version>
</dependency>
<!-- jersey & weld -->
<dependency>
<groupId>jakarta.servlet</groupId>
<artifactId>jakarta.servlet-api</artifactId>
</dependency>
<!--
https://mvnrepository.com/artifact/javax.interceptor/javax.interceptor-api -->
<dependency>
<groupId>jakarta.interceptor</groupId>
<artifactId>jakarta.interceptor-api</artifactId>
<version>2.0.1</version>
</dependency>
<dependency>
<groupId>jakarta.enterprise</groupId>
<artifactId>jakarta.enterprise.cdi-api</artifactId>
<version>4.0.1</version>
</dependency>
<dependency>
<groupId>jakarta.ws.rs</groupId>
<artifactId>jakarta.ws.rs-api</artifactId>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.glassfish.jersey.containers</groupId> <groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-servlet</artifactId> <artifactId>jersey-container-servlet</artifactId>
<version>2.13</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.glassfish.jersey.containers.glassfish</groupId> <groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-gf-cdi</artifactId> <artifactId>jersey-container-servlet-core</artifactId>
<version>2.13</version>
</dependency> </dependency>
<!--
https://mvnrepository.com/artifact/org.glassfish.jersey.inject/jersey-hk2 -->
<dependency> <dependency>
<groupId>javax.transaction</groupId> <groupId>org.glassfish.jersey.inject</groupId>
<artifactId>javax.transaction-api</artifactId> <artifactId>jersey-hk2</artifactId>
<version>1.2</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>javax.servlet</groupId> <groupId>org.glassfish.jersey.core</groupId>
<artifactId>javax.servlet-api</artifactId> <artifactId>jersey-server</artifactId>
<version>3.0.1</version>
<scope>provided</scope>
</dependency> </dependency>
<!-- weld -->
<dependency> <dependency>
<groupId>javax.enterprise</groupId> <groupId>org.glassfish.jersey.ext.cdi</groupId>
<artifactId>cdi-api</artifactId> <artifactId>jersey-cdi1x</artifactId>
<version>1.1</version>
</dependency> </dependency>
<dependency>
<groupId>org.glassfish.jersey.ext.cdi</groupId>
<artifactId>jersey-cdi1x-servlet</artifactId>
</dependency>
<!--
https://mvnrepository.com/artifact/org.jboss.weld.servlet/weld-servlet-core -->
<dependency> <dependency>
<groupId>org.jboss.weld.servlet</groupId> <groupId>org.jboss.weld.servlet</groupId>
<artifactId>weld-servlet</artifactId> <artifactId>weld-servlet-core</artifactId>
<version>2.2.10.Final</version> <version>5.1.2.Final</version>
</dependency> </dependency>
<!--
https://mvnrepository.com/artifact/org.glassfish.jersey.core/jersey-common -->
<dependency> <dependency>
<groupId>org.jboss</groupId> <groupId>org.glassfish.jersey.core</groupId>
<artifactId>jandex</artifactId> <artifactId>jersey-common</artifactId>
<version>1.2.2.Final</version>
</dependency> </dependency>
<dependency>
<groupId>com.fasterxml.jackson.jaxrs</groupId>
<artifactId>jackson-jaxrs-json-provider</artifactId>
<version>2.3.0</version>
</dependency>
<dependency> <dependency>
<groupId>org.glassfish.jersey.media</groupId> <groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-json-jackson</artifactId> <artifactId>jersey-media-json-jackson</artifactId>
<version>2.13</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.glassfish.jersey.media</groupId> <groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-multipart</artifactId> <artifactId>jersey-media-multipart</artifactId>
<version>2.13</version>
</dependency> </dependency>
<dependency>
<groupId>postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.1-901.jdbc4</version>
<scope>runtime</scope>
</dependency>
<!-- Storage dependencies -->
<dependency>
<groupId>org.gcube.contentmanagement</groupId>
<artifactId>storage-manager-core</artifactId>
<version>[2.0.0-SNAPSHOT,3.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.contentmanagement</groupId>
<artifactId>storage-manager-wrapper</artifactId>
<version>[2.0.0-SNAPSHOT,3.0.0-SNAPSHOT)</version>
</dependency>
<dependency> <dependency>
<groupId>org.reflections</groupId> <groupId>org.reflections</groupId>
<artifactId>reflections</artifactId> <artifactId>reflections</artifactId>
<version>0.9.10</version>
</dependency> </dependency>
<!-- https://mvnrepository.com/artifact/org.postgresql/postgresql -->
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.7.0</version>
</dependency>
<dependency> <dependency>
<groupId>com.google.guava</groupId> <groupId>com.google.guava</groupId>
<artifactId>guava</artifactId> <artifactId>guava</artifactId>
<version>16.0</version> <version>16.0</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.apache.commons</groupId> <groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId> <artifactId>commons-compress</artifactId>
<version>1.17</version> <version>1.22</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.tukaani</groupId> <groupId>org.tukaani</groupId>
<artifactId>xz</artifactId> <artifactId>xz</artifactId>
<version>1.5</version> <version>1.5</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.glassfish.jersey.test-framework.providers</groupId> <groupId>org.glassfish.jersey.test-framework.providers</groupId>
<artifactId>jersey-test-framework-provider-simple</artifactId> <artifactId>jersey-test-framework-provider-simple</artifactId>
<version>2.13</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.apache.derby</groupId> <groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId> <artifactId>derby</artifactId>
@ -299,115 +289,103 @@
<version>10.8.2.2</version> <version>10.8.2.2</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency>
<groupId>com.googlecode.jeeunit</groupId>
<artifactId>jeeunit</artifactId>
<version>1.0.0</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.code.gson/gson --> <!-- https://mvnrepository.com/artifact/com.google.code.gson/gson -->
<dependency> <dependency>
<groupId>com.google.code.gson</groupId> <groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId> <artifactId>gson</artifactId>
<version>2.7</version> <version>2.7</version>
</dependency> </dependency>
<dependency> <dependency>
<groupId>ch.qos.logback</groupId> <groupId>com.vlkan.rfos</groupId>
<artifactId>logback-classic</artifactId> <artifactId>rotating-fos</artifactId>
<version>1.0.13</version> <version>0.9.2</version>
<scope>test</scope>
</dependency> </dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.mockito/mockito-all -->
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.9.5</version>
<scope>test</scope>
</dependency>
<dependency> <dependency>
<groupId>org.slf4j</groupId> <groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId> <artifactId>jul-to-slf4j</artifactId>
<version>${slf4j.version}</version> <version>${slf4j.version}</version>
</dependency> </dependency>
<!--
https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-to-slf4j -->
<!-- https://mvnrepository.com/artifact/org.slf4j/log4j-over-slf4j -->
<dependency> <dependency>
<groupId>org.jboss.weld.se</groupId> <groupId>org.slf4j</groupId>
<artifactId>weld-se</artifactId> <artifactId>log4j-over-slf4j</artifactId>
<version>2.2.10.Final</version> <version>2.0.7</version>
</dependency>
<!-- enunciate deps -->
<dependency>
<groupId>com.webcohesion.enunciate</groupId>
<artifactId>enunciate-core-annotations</artifactId>
<version>${enunciate.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.webcohesion.enunciate</groupId>
<artifactId>enunciate-rt-util</artifactId>
<version>${enunciate.version}</version>
<scope>provided</scope>
</dependency>
<!-- Storage dependencies -->
<dependency>
<groupId>org.gcube.contentmanagement</groupId>
<artifactId>storage-manager-core</artifactId>
<version>[4.0.0-SNAPSHOT,5.0.0-SNAPSHOT)</version>
</dependency>
<dependency>
<groupId>org.gcube.contentmanagement</groupId>
<artifactId>storage-manager-wrapper</artifactId>
<version>[4.0.0-SNAPSHOT,5.0.0-SNAPSHOT)</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-s3 -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
<version>1.12.763</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.minio/minio
<dependency>
<groupId>io.minio</groupId>
<artifactId>minio</artifactId>
<version>8.3.3</version>
</dependency> -->
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers</artifactId>
<version>1.16.3</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<!-- https://mvnrepository.com/artifact/junit/junit -->
<dependency> <dependency>
<groupId>junit</groupId> <groupId>junit</groupId>
<artifactId>junit</artifactId> <artifactId>junit</artifactId>
<version>4.11</version> <version>4.13.2</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.glassfish.jersey.test-framework</groupId> <groupId>ch.qos.logback</groupId>
<artifactId>jersey-test-framework-core</artifactId> <artifactId>logback-classic</artifactId>
<version>2.13</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency>
<groupId>org.glassfish.jersey.test-framework.providers</groupId>
<artifactId>jersey-test-framework-provider-grizzly2</artifactId>
<version>2.13</version>
<scope>test</scope>
</dependency>
</dependencies> </dependencies>
<build> <build>
<finalName>${artifactId}</finalName> <finalName>${project.artifactId}</finalName>
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
</resources>
<pluginManagement> <pluginManagement>
<plugins> <plugins>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<versionRange>[1.0,)</versionRange>
<goals>
<goal>test-compile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
<plugin> <plugin>
<groupId>org.codehaus.mojo</groupId> <groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId> <artifactId>aspectj-maven-plugin</artifactId>
<version>1.7</version> <version>${aspectj-plugin.version}</version>
<configuration> <configuration>
-<complianceLevel>1.8</complianceLevel>
+<complianceLevel>17</complianceLevel>
-<source>1.8</source>
+<source>17</source>
-<target>1.8</target>
+<target>17</target>
 <aspectLibraries>
 <aspectLibrary>
 <groupId>org.gcube.common</groupId>
@@ -415,10 +393,18 @@
 </aspectLibrary>
 </aspectLibraries>
 </configuration>
+<dependencies>
+<dependency>
+<groupId>org.aspectj</groupId>
+<artifactId>aspectjtools</artifactId>
+<version>1.9.21.1</version>
+</dependency>
+</dependencies>
 <executions>
 <execution>
 <goals>
-<goal>compile</goal>
+<goal>compile</goal> <!-- use this goal to weave all your main classes -->
 </goals>
 </execution>
 </executions>
@@ -433,53 +419,127 @@
 <plugin>
 <groupId>org.apache.maven.plugins</groupId>
 <artifactId>maven-war-plugin</artifactId>
-<version>2.4</version>
 <configuration>
 <failOnMissingWebXml>false</failOnMissingWebXml>
 </configuration>
 </plugin>
+<!-- Enunciate Maven plugin -->
+<plugin>
+<groupId>com.webcohesion.enunciate</groupId>
+<artifactId>enunciate-maven-plugin</artifactId>
+<version>${enunciate.version}</version>
+<dependencies>
+<dependency>
+<groupId>com.webcohesion.enunciate</groupId>
+<artifactId>enunciate-lombok</artifactId>
+<version>2.9.1</version>
+</dependency>
+</dependencies>
+<configuration>
+<configFile>${project.basedir}/documentation/enunciate/enunciate.xml</configFile>
+<sourcepath-includes>
+<!-- Include storagehub classes -->
+<sourcepath-include>
+<groupId>org.gcube.common</groupId>
+<artifactId>storagehub</artifactId>
+</sourcepath-include>
+<!-- Include storagehub-model classes -->
+<sourcepath-include>
+<groupId>org.gcube.common</groupId>
+<artifactId>storagehub-model</artifactId>
+</sourcepath-include>
+<!-- Include jersey media classes -->
+<sourcepath-include>
+<groupId>org.glassfish.jersey.media</groupId>
+<artifactId>jersey-media-multipart</artifactId>
+</sourcepath-include>
+</sourcepath-includes>
+</configuration>
+<executions>
+<execution>
+<id>assemble</id>
+<goals>
+<goal>assemble</goal>
+</goals>
+</execution>
+</executions>
+</plugin>
+<!-- Copy Enunciate Documentation from your-application/docs to your-application.war -->
 <plugin>
 <groupId>org.apache.maven.plugins</groupId>
 <artifactId>maven-resources-plugin</artifactId>
-<version>2.6</version>
 <executions>
 <execution>
-<id>copy-profile</id>
+<id>copy-enunciate-docs</id>
+<phase>process-resources</phase>
 <goals>
 <goal>copy-resources</goal>
 </goals>
-<phase>process-resources</phase>
 <configuration>
-<outputDirectory>${webappDirectory}</outputDirectory>
+<outputDirectory>target</outputDirectory>
 <resources>
 <resource>
-<directory>${distroDirectory}</directory>
+<targetPath>${project.build.directory}/${project.artifactId}/workspace/api-docs</targetPath>
+<directory>${project.build.directory}/api-docs</directory>
 <filtering>true</filtering>
 </resource>
 </resources>
 </configuration>
 </execution>
 </executions>
 </plugin>
+<!-- SPHINX PLUGIN triggered at 'compile' -->
 <plugin>
-<groupId>org.apache.maven.plugins</groupId>
-<artifactId>maven-assembly-plugin</artifactId>
+<groupId>kr.motd.maven</groupId>
+<artifactId>sphinx-maven-plugin</artifactId>
+<version>2.10.0</version>
 <configuration>
-<descriptors>
-<descriptor>descriptor.xml</descriptor>
-</descriptors>
+<outputDirectory>${project.build.directory}/${project.build.finalName}/workspace/docs</outputDirectory>
+<builder>html</builder>
+<configDirectory>${project.basedir}/documentation/sphinx</configDirectory>
+<sourceDirectory>${project.basedir}/documentation/sphinx</sourceDirectory>
 </configuration>
 <executions>
 <execution>
-<id>servicearchive</id>
-<phase>install</phase>
+<phase>process-resources</phase>
 <goals>
-<goal>single</goal>
+<goal>generate</goal>
 </goals>
 </execution>
 </executions>
 </plugin>
 </plugins>
 </build>
+<profiles>
+<profile>
+<id>integration</id>
+<build>
+<finalName>storagehub-test-storages</finalName>
+<plugins>
+<plugin>
+<artifactId>maven-antrun-plugin</artifactId>
+<version>1.7</version>
+<executions>
+<execution>
+<phase>process-test-classes</phase>
+<configuration>
+<target>
+<copy todir="${basedir}/target/classes">
+<fileset dir="${basedir}/target/test-classes" includes="org/gcube/data/access/storages/**/*" />
+</copy>
+</target>
+</configuration>
+<goals>
+<goal>run</goal>
+</goals>
+</execution>
+</executions>
+</plugin>
+</plugins>
+</build>
+</profile>
+</profiles>
 </project>


@@ -1,25 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<Resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<ID></ID>
<Type>Service</Type>
<Profile>
<Description>Storage Hub Webapp</Description>
<Class>DataAccess</Class>
<Name>${artifactId}</Name>
<Version>1.0.0</Version>
<Packages>
<Software>
<Name>${artifactId}</Name>
<Version>${version}</Version>
<MavenCoordinates>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
</MavenCoordinates>
<Files>
<File>${build.finalName}.jar</File>
</Files>
</Software>
</Packages>
</Profile>
</Resource>


@@ -1,34 +1,39 @@
 package org.gcube.data.access.storagehub;
-import org.apache.jackrabbit.api.security.user.Group;
-import javax.inject.Inject;
-import javax.inject.Singleton;
+import java.util.List;
+import jakarta.inject.Inject;
+import jakarta.inject.Singleton;
 import javax.jcr.Node;
 import javax.jcr.RepositoryException;
 import javax.jcr.Session;
-import javax.jcr.security.AccessControlEntry;
-import javax.jcr.security.Privilege;
 import org.apache.jackrabbit.api.JackrabbitSession;
-import org.apache.jackrabbit.api.security.JackrabbitAccessControlList;
 import org.apache.jackrabbit.api.security.user.Authorizable;
-import org.apache.jackrabbit.commons.jackrabbit.authorization.AccessControlUtils;
-import org.gcube.common.authorization.library.provider.AuthorizationProvider;
+import org.apache.jackrabbit.api.security.user.Group;
+import org.apache.jackrabbit.api.security.user.UserManager;
 import org.gcube.common.storagehub.model.Excludes;
+import org.gcube.common.storagehub.model.acls.ACL;
 import org.gcube.common.storagehub.model.acls.AccessType;
 import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
 import org.gcube.common.storagehub.model.exceptions.InvalidCallParameters;
 import org.gcube.common.storagehub.model.exceptions.UserNotAuthorizedException;
+import org.gcube.common.storagehub.model.items.FolderItem;
 import org.gcube.common.storagehub.model.items.Item;
 import org.gcube.common.storagehub.model.items.SharedFolder;
-import org.gcube.data.access.storagehub.handlers.Node2ItemConverter;
+import org.gcube.common.storagehub.model.items.TrashItem;
+import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
+import org.gcube.data.access.storagehub.services.interfaces.ACLManagerInterface;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import lombok.extern.java.Log;
-import lombok.extern.log4j.Log4j;
+/**
+ *
+ *
+ * the caller must be authorized, so i'm not passing the login also if it works on behalf of an user
+ *
+ *
+ */
 @Singleton
 public class AuthorizationChecker {
@@ -37,76 +42,113 @@ public class AuthorizationChecker {
 @Inject
 Node2ItemConverter node2Item;
+@Inject
+PathUtil pathUtil;
+@Inject
+ACLManagerInterface aclManager;
-public void checkReadAuthorizationControl(Session session, String id) throws UserNotAuthorizedException , BackendGenericError, RepositoryException{
+public void checkReadAuthorizationControl(Session session, String userToCheck, String id) throws UserNotAuthorizedException , BackendGenericError, RepositoryException{
 Node node = session.getNodeByIdentifier(id);
-String login = AuthorizationProvider.instance.get().getClient().getId();
 Item item = node2Item.getItem(node, Excludes.ALL);
-if (item==null) throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+" to read node with id "+id+": it's not a valid StorageHub node");
+if (item==null) throw new UserNotAuthorizedException("Insufficent Privileges for user "+userToCheck+" to read node with id "+id+": it's not a valid StorageHub node");
+//checking if the item is in the owner trash folder
+if(item instanceof TrashItem && item.getParentPath().equals(pathUtil.getTrashPath(userToCheck, session).toPath()))
+return;
+if (!item.isShared() && item.getOwner()!=null && item.getOwner().equals(userToCheck)) return;
+if (hasParentPublicFolder(session, item)) return;
+//TODO: remove when messages will be passed to a new system
+String parentPath = item.getParentPath();
+if (parentPath.endsWith("hl:attachments") && (parentPath.contains("/OutBox/") || parentPath.contains("/InBox/"))) {
+return ;
+}
 if (item.isShared()) {
-SharedFolder parentShared = node2Item.getItem(retrieveSharedFolderParent(node, session), Excludes.EXCLUDE_ACCOUNTING);
-if (parentShared.getUsers().getMap().keySet().contains(login)) return;
+//SharedFolder parentShared = node2Item.getItem(retrieveSharedFolderParent(node, session), Excludes.EXCLUDE_ACCOUNTING);
+//if (parentShared.getUsers().getMap().keySet().contains(userToCheck)) return;
 //CHECKING ACL FOR VREFOLDER AND SHARED FOLDER
-JackrabbitAccessControlList accessControlList = AccessControlUtils.getAccessControlList(session, parentShared.getPath());
-AccessControlEntry[] entries = accessControlList.getAccessControlEntries();
-Authorizable userAuthorizable = ((JackrabbitSession) session).getUserManager().getAuthorizable(login);
-for (AccessControlEntry entry: entries) {
-log.debug("checking access right for {} with compared with {}",login, entry.getPrincipal());
-Authorizable authorizable = ((JackrabbitSession) session).getUserManager().getAuthorizable(entry.getPrincipal());
-if (!authorizable.isGroup() && entry.getPrincipal().getName().equals(login)) return;
-if (authorizable.isGroup() && ((Group) authorizable).isMember(userAuthorizable)) return;
+List<ACL> acls = aclManager.getByItem(item, session);
+UserManager userManager = ((JackrabbitSession) session).getUserManager();
+Authorizable userAuthorizable = userManager.getAuthorizable(userToCheck);
+for (ACL entry: acls) {
+log.debug("checking access right for {} with compared with {}",userToCheck, entry.getPrincipal());
+Authorizable authorizable = userManager.getAuthorizable(entry.getPrincipal());
+//TODO; check why sometimes the next line gets a nullpointer
+if (authorizable==null) {
+log.warn("{} doesn't have a correspondant auhtorizable object, check it ", entry.getPrincipal());
+continue;
+}
+try {
+if (!authorizable.isGroup() && entry.getPrincipal().equals(userToCheck)) return;
+if (authorizable.isGroup() && ((Group) authorizable).isMember(userAuthorizable)) return;
+}catch (Throwable e) {
+log.warn("someting went wrong checking authorizations",e);
+}
 }
-throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+" to read node with id "+id);
-} else if (item.getOwner()==null || !item.getOwner().equals(login))
-throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+" to read node with id "+id);
+}
+throw new UserNotAuthorizedException("Insufficent Privileges for user "+userToCheck+" to read node with id "+id);
 }
-private Node retrieveSharedFolderParent(Node node, Session session) throws BackendGenericError, RepositoryException{
-if (node2Item.checkNodeType(node, SharedFolder.class)) return node;
-else
-return retrieveSharedFolderParent(node.getParent(), session);
+private boolean hasParentPublicFolder(Session session, Item item) {
+if(item==null || item.getParentPath()==null) return false;
+if (item.getParentPath().replaceAll("/Home/[^/]*/"+Constants.WORKSPACE_ROOT_FOLDER_NAME,"").isEmpty() || item.getParentPath().replaceAll(Constants.SHARED_FOLDER_PATH, "").isEmpty()) {
+if (item instanceof FolderItem folder)
+return folder.isPublicItem();
+else return false;
+} else {
+if (item instanceof FolderItem folder)
+try {
+return (folder.isPublicItem() || hasParentPublicFolder(session, node2Item.getItem(folder.getParentId(), session, Excludes.ALL)));
+}catch (Throwable e) {
+log.warn("error checking public parents",e);
+return false;
+}
+else
+try {
+return hasParentPublicFolder(session, node2Item.getItem(item.getParentId(), session, Excludes.ALL));
+}catch (Throwable e) {
+log.warn("error checking public parents",e);
+return false;
+}
+}
 }
-public void checkWriteAuthorizationControl(Session session, String id, boolean isNewItem) throws UserNotAuthorizedException, BackendGenericError, RepositoryException {
-//in case of newItem the id is the parent otherwise the old node to replace
-Node node = session.getNodeByIdentifier(id);
-Item item = node2Item.getItem(node, Excludes.ALL);
-String login = AuthorizationProvider.instance.get().getClient().getId();
-if (item==null) throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+" to write into node with id "+id+": it's not a valid StorageHub node");
+//newItem means that a new item will be created and id is the destination directory
+public void checkWriteAuthorizationControl(Session session, String userToCheck, Item item, Node node, boolean isNewItem) throws UserNotAuthorizedException, BackendGenericError, RepositoryException {
+if (item==null) throw new UserNotAuthorizedException("Not valid StorageHub node");
 if (Constants.WRITE_PROTECTED_FOLDER.contains(item.getName()) || Constants.WRITE_PROTECTED_FOLDER.contains(item.getTitle()))
-throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+" to write into node with id "+id+": it's a protected folder");
+throw new UserNotAuthorizedException("Insufficent Privileges for user "+userToCheck+" to write into node with id "+item.getId()+": it's a protected folder");
 if (item.isShared()) {
-Node parentSharedNode = retrieveSharedFolderParent(node, session);
-JackrabbitAccessControlList accessControlList = AccessControlUtils.getAccessControlList(session, parentSharedNode.getPath());
-AccessControlEntry[] entries = accessControlList.getAccessControlEntries();
-Authorizable UserAuthorizable = ((JackrabbitSession) session).getUserManager().getAuthorizable(login);
+//CHECKING ACL FOR VREFOLDER AND SHARED FOLDER
+List<ACL> acls = aclManager.getByItem(item, session);
+UserManager userManager = ((JackrabbitSession) session).getUserManager();
+Authorizable UserAuthorizable = userManager.getAuthorizable(userToCheck);
 //put it in a different method
-for (AccessControlEntry entry: entries) {
-Authorizable authorizable = ((JackrabbitSession) session).getUserManager().getAuthorizable(entry.getPrincipal());
-if ((!authorizable.isGroup() && entry.getPrincipal().getName().equals(login)) || (authorizable.isGroup() && ((Group) authorizable).isMember(UserAuthorizable))){
-for (Privilege privilege : entry.getPrivileges()){
-AccessType access = AccessType.fromValue(privilege.getName());
-if (isNewItem && access!=AccessType.READ_ONLY)
+for (ACL entry: acls) {
+Authorizable authorizable = userManager.getAuthorizable(entry.getPrincipal());
+if ((!authorizable.isGroup() && entry.getPrincipal().equals(userToCheck)) || (authorizable.isGroup() && ((Group) authorizable).isMember(UserAuthorizable))){
+for (AccessType privilege : entry.getAccessTypes()){
+if (isNewItem && privilege!=AccessType.READ_ONLY)
 return;
 else
 if (!isNewItem &&
-(access==AccessType.ADMINISTRATOR || access==AccessType.WRITE_ALL || (access==AccessType.WRITE_OWNER && item.getOwner().equals(login))))
+(privilege==AccessType.ADMINISTRATOR || privilege==AccessType.WRITE_ALL || (privilege==AccessType.WRITE_OWNER && item.getOwner().equals(userToCheck))))
 return;
 }
@@ -114,72 +156,56 @@ public class AuthorizationChecker {
 }
 }
 } else
-if(item.getOwner().equals(login))
+if(item.getOwner().equals(userToCheck))
 return;
-throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+" to write into node with id "+id);
+throw new UserNotAuthorizedException("Insufficent Privileges for user "+userToCheck+" to write into node with id "+item.getId());
+}
+//newItem means that a new item will be created and id is the destination directory
+public void checkWriteAuthorizationControl(Session session, String userToCheck , String id, boolean isNewItem) throws UserNotAuthorizedException, BackendGenericError, RepositoryException {
+//in case of newItem the id is the parent otherwise the old node to replace
+Node node = session.getNodeByIdentifier(id);
+Item item = node2Item.getItem(node, Excludes.ALL);
+checkWriteAuthorizationControl(session, userToCheck, item, node, isNewItem);
 }
+/**
+ *
+ * checks if item with {id} can be moved
+ *
+ */
 public void checkMoveOpsForProtectedFolders(Session session, String id) throws InvalidCallParameters, BackendGenericError, RepositoryException {
 Node node = session.getNodeByIdentifier(id);
 Item item = node2Item.getItem(node, Excludes.ALL);
 if (Constants.PROTECTED_FOLDER.contains(item.getName()) || Constants.PROTECTED_FOLDER.contains(item.getTitle()))
 throw new InvalidCallParameters("protected folder cannot be moved or deleted");
 }
-public void checkAdministratorControl(Session session, SharedFolder item) throws UserNotAuthorizedException, BackendGenericError, RepositoryException {
-//TODO: riguardare questo pezzo di codice
-String login = AuthorizationProvider.instance.get().getClient().getId();
-if (item==null) throw new UserNotAuthorizedException("Insufficent Provileges for user "+login+": it's not a valid StorageHub node");
-Node node = session.getNodeByIdentifier(item.getId());
+/**
+ *
+ * checks if {userToCheck} is an admin for {item}
+ *
+ */
+public void checkAdministratorControl(Session session, String userToCheck, SharedFolder item) throws UserNotAuthorizedException, BackendGenericError, RepositoryException {
+if (item==null) throw new UserNotAuthorizedException("Insufficent Privileges for user "+userToCheck+": it's not a valid StorageHub node");
 if (item.isShared()) {
-Node parentSharedNode = retrieveSharedFolderParent(node, session);
-JackrabbitAccessControlList accessControlList = AccessControlUtils.getAccessControlList(session, parentSharedNode.getPath());
-AccessControlEntry[] entries = accessControlList.getAccessControlEntries();
-//put it in a different method
-SharedFolder parentShared = node2Item.getItem(parentSharedNode, Excludes.EXCLUDE_ACCOUNTING);
-for (AccessControlEntry entry: entries) {
-if (entry.getPrincipal().getName().equals(login) || (parentShared.isVreFolder() && entry.getPrincipal().getName().equals(parentShared.getTitle()))) {
-for (Privilege privilege : entry.getPrivileges()){
-AccessType access = AccessType.fromValue(privilege.getName());
-if (access==AccessType.ADMINISTRATOR)
+List<ACL> acls = aclManager.getByItem(item, session);
+for (ACL entry: acls) {
+if (entry.getPrincipal().equals(userToCheck)) {
+for (AccessType privilege : entry.getAccessTypes()){
+if (privilege==AccessType.ADMINISTRATOR)
 return;
 }
-throw new UserNotAuthorizedException("The user "+login+" is not an administrator of node with id "+item.getId());
 }
 }
 }
-throw new UserNotAuthorizedException("The user "+login+" is not an administrator of node with id "+item.getId());
+throw new UserNotAuthorizedException("The user "+userToCheck+" is not an administrator of node with id "+item.getId());
 }
-/*
-private String retrieveOwner(Node node) {
-Node nodeOwner;
-//get Owner
-try{
-return node.getProperty(NodeProperty.PORTAL_LOGIN.toString()).getString();
-}catch (Exception e) {
-try {
-nodeOwner = node.getNode(NodeProperty.OWNER.toString());
-return nodeOwner.getProperty(NodeProperty.PORTAL_LOGIN.toString()).getString();
-// this.userId = nodeOwner.getProperty(USER_ID).getString();
-// this.portalLogin = nodeOwner.getProperty(PORTAL_LOGIN).getString();
-// node.getSession().save();
-} catch (Exception e1) {
-throw new RuntimeException(e1);
-}
-}
-}
-*/
 }


@@ -3,9 +3,23 @@ package org.gcube.data.access.storagehub;
 import java.util.Arrays;
 import java.util.List;
-import javax.jcr.SimpleCredentials;
 public class Constants {
-public static final String VRE_FOLDER_PARENT_NAME = "MySpecialFolders";
+public static final String OLD_VRE_FOLDER_PARENT_NAME = "MySpecialFolders";
+public static final String PERSONAL_VRES_FOLDER_PARENT_NAME = "VREs";
+public static final String INBOX_FOLDER_NAME = "InBox";
+public static final String OUTBOX_FOLDER_NAME = "OutBox";
+public static final String ATTACHMENTNODE_NAME = "hl:attachments";
+public static final String SHARED_WITH_ME_PARENT_NAME = "SharedWithMe";
+//public static final String MYSHARED_PARENT_NAME = "MyShared";
 public static final String SHARED_FOLDER_PATH = "/Share";
@@ -15,13 +29,16 @@ public class Constants {
 public static final String QUERY_LANGUAGE ="JCR-SQL2";
-public static final String ADMIN_PARAM_NAME ="admin-username";
-public static final String ADMIN_PARAM_PWD ="admin-pwd";
-public static final List<String> FOLDERS_TO_EXLUDE = Arrays.asList(Constants.VRE_FOLDER_PARENT_NAME, Constants.TRASH_ROOT_FOLDER_NAME);
-public static final List<String> WRITE_PROTECTED_FOLDER = Arrays.asList(Constants.VRE_FOLDER_PARENT_NAME, Constants.TRASH_ROOT_FOLDER_NAME);
-public static final String ADMIN_USER ="admin";
-public static final SimpleCredentials JCR_CREDENTIALS = new SimpleCredentials(ADMIN_USER,"admin".toCharArray());
-public static final List<String> PROTECTED_FOLDER = Arrays.asList(Constants.WORKSPACE_ROOT_FOLDER_NAME, Constants.VRE_FOLDER_PARENT_NAME, Constants.TRASH_ROOT_FOLDER_NAME);
+public static final String HOME_VERSION_PROP = "hl:version";
+public static final List<String> FOLDERS_TO_EXLUDE = Arrays.asList(Constants.OLD_VRE_FOLDER_PARENT_NAME, Constants.TRASH_ROOT_FOLDER_NAME);
+public static final List<String> WRITE_PROTECTED_FOLDER = Arrays.asList(Constants.OLD_VRE_FOLDER_PARENT_NAME, Constants.TRASH_ROOT_FOLDER_NAME);
+public static final List<String> PROTECTED_FOLDER = Arrays.asList(Constants.WORKSPACE_ROOT_FOLDER_NAME, Constants.OLD_VRE_FOLDER_PARENT_NAME, Constants.TRASH_ROOT_FOLDER_NAME);
 }


@@ -0,0 +1,40 @@
package org.gcube.data.access.storagehub;
import java.io.IOException;
import jakarta.servlet.Filter;
import jakarta.servlet.FilterChain;
import jakarta.servlet.FilterConfig;
import jakarta.servlet.ServletException;
import jakarta.servlet.ServletRequest;
import jakarta.servlet.ServletResponse;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
public class DocsRedirectFilter implements Filter {
@Override
public void init(FilterConfig filterConfig) throws ServletException {
// Initialization code, if needed
}
@Override
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
throws IOException, ServletException {
HttpServletRequest httpRequest = (HttpServletRequest) request;
HttpServletResponse httpResponse = (HttpServletResponse) response;
String path = httpRequest.getRequestURI();
if (path.endsWith("/docs") || path.endsWith("/api-docs")) {
httpResponse.sendRedirect(path + "/index.html");
} else if (path.endsWith("/docs/") || path.endsWith("/api-docs/")) {
httpResponse.sendRedirect(path + "index.html");
} else {
chain.doFilter(request, response);
}
}
@Override
public void destroy() {
// Cleanup code, if needed
}
}
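The redirect handling in the DocsRedirectFilter above reduces to a small pure function over the request path. The sketch below (class and method names are illustrative, not part of the service) mirrors its three branches, returning the redirect target or null when the request should pass through the chain:

```java
// Hypothetical condensed form of DocsRedirectFilter's path logic.
public class DocsRedirectLogic {
    // Returns the redirect target, or null when the filter chain should continue.
    public static String resolveDocsRedirect(String path) {
        if (path.endsWith("/docs") || path.endsWith("/api-docs")) {
            return path + "/index.html";   // no trailing slash: add "/index.html"
        } else if (path.endsWith("/docs/") || path.endsWith("/api-docs/")) {
            return path + "index.html";    // trailing slash already present
        }
        return null;                        // not a docs URL: no redirect
    }

    public static void main(String[] args) {
        System.out.println(resolveDocsRedirect("/storagehub/docs"));   // /storagehub/docs/index.html
        System.out.println(resolveDocsRedirect("/storagehub/items"));  // null
    }
}
```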


@@ -1,140 +0,0 @@
package org.gcube.data.access.storagehub;
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class MultipleOutputStream {
private Logger logger = LoggerFactory.getLogger(MultipleOutputStream.class);
private MyPipedInputStream[] pipedInStreams;
private InputStream is;
private MyPipedOututStream[] pipedOutStreams;
private int index=0;
public MultipleOutputStream(InputStream is, int number) throws IOException{
this.is = is;
logger.debug("requested {} piped streams ",number);
pipedInStreams = new MyPipedInputStream[number];
pipedOutStreams = new MyPipedOututStream[number];
for (int i =0; i<number; i++) {
pipedOutStreams[i] = new MyPipedOututStream();
pipedInStreams[i] = new MyPipedInputStream(pipedOutStreams[i]);
}
}
public void startWriting() throws IOException{
BufferedInputStream bis = new BufferedInputStream(is);
byte[] buf = new byte[1024*64];
int read=-1;
int writeTot = 0;
while ((read =bis.read(buf))!=-1){
for (int i=0; i< pipedInStreams.length; i++) {
if (!pipedInStreams[i].isClosed()) {
pipedOutStreams[i].write(buf, 0, read);
}
}
writeTot+= read;
if (allOutStreamClosed())
break;
}
for (int i=0; i< pipedOutStreams.length; i++) {
if (!pipedOutStreams[i].isClosed()) {
logger.debug("closing outputstream {}",i);
pipedOutStreams[i].close();
}
}
logger.debug("total written {} ",writeTot);
}
private boolean allOutStreamClosed() {
for (int i=0; i<pipedOutStreams.length; i++) {
if (!pipedOutStreams[i].isClosed())
return false;
}
return true;
}
public synchronized InputStream get() {
logger.debug("requesting piped streams {}",index);
if (index>=pipedInStreams.length) return null;
return pipedInStreams[index++];
}
public class MyPipedOututStream extends PipedOutputStream{
boolean close = false;
@Override
public void close() throws IOException {
this.close = true;
super.close();
}
/**
* @return the close
*/
public boolean isClosed() {
return close;
}
@Override
public void write(byte[] b, int off, int len) throws IOException {
try{
super.write(b, off, len);
}catch(IOException io){
this.close = true;
}
}
}
public class MyPipedInputStream extends PipedInputStream{
boolean close = false;
public MyPipedInputStream(PipedOutputStream src) throws IOException {
super(src);
}
@Override
public void close() throws IOException {
this.close = true;
logger.debug(Thread.currentThread().getName()+" close MyPipedInputStream");
super.close();
}
/**
* @return the close
*/
public boolean isClosed() {
return close;
}
}
}
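The removed MultipleOutputStream implemented a fan-out over java.io piped streams: each chunk read from the source is written to every still-open pipe, so several consumers can stream the same upload without buffering it fully in memory. A minimal sketch of the same idea follows (names are illustrative; this simplified synchronous version is only safe for payloads smaller than the default pipe buffer, whereas the original served concurrent readers and tracked closed pipes):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.io.UncheckedIOException;

// Sketch of the fan-out pattern: duplicate one payload into n piped readers.
public class FanOutSketch {
    public static String[] fanOut(byte[] source, int n) {
        try {
            PipedOutputStream[] outs = new PipedOutputStream[n];
            PipedInputStream[] ins = new PipedInputStream[n];
            for (int i = 0; i < n; i++) {
                outs[i] = new PipedOutputStream();
                ins[i] = new PipedInputStream(outs[i]);
            }
            for (PipedOutputStream out : outs) { // write the payload to every pipe
                out.write(source);
                out.close();
            }
            String[] results = new String[n];
            for (int i = 0; i < n; i++) {        // each consumer drains its own pipe
                ByteArrayOutputStream read = new ByteArrayOutputStream();
                int b;
                while ((b = ins[i].read()) != -1) read.write(b);
                results[i] = read.toString();
            }
            return results;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```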


@@ -0,0 +1,38 @@
package org.gcube.data.access.storagehub;
import org.gcube.data.access.storagehub.repository.StoragehubRepository;
import org.glassfish.jersey.server.monitoring.ApplicationEvent;
import org.glassfish.jersey.server.monitoring.ApplicationEventListener;
import org.glassfish.jersey.server.monitoring.RequestEvent;
import org.glassfish.jersey.server.monitoring.RequestEventListener;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import jakarta.ws.rs.ext.Provider;
@Provider
public class MyApplicationListener implements ApplicationEventListener {
private static final Logger log = LoggerFactory.getLogger(MyApplicationListener.class);
StoragehubRepository repository = StoragehubRepository.repository;
@Override
public void onEvent(ApplicationEvent event) {
switch (event.getType()) {
case DESTROY_FINISHED:
log.info("Destroying application storageHub");
repository.shutdown();
log.info("Jackrabbit repository stopped");
default:
break;
}
}
@Override
public RequestEventListener onRequest(RequestEvent requestEvent) {
return null;
}
}


@@ -0,0 +1,51 @@
package org.gcube.data.access.storagehub;
import java.util.Iterator;
import javax.jcr.Node;
import javax.jcr.NodeIterator;
import javax.jcr.RepositoryException;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.data.access.storagehub.handlers.ClassHandler;
public class NodeChildrenFilterIterator implements Iterator<Node> {
private NodeIterator it;
public NodeChildrenFilterIterator(Node node) throws BackendGenericError{
super();
try {
it = node.getNodes();
} catch (RepositoryException e) {
throw new BackendGenericError(e);
}
}
public NodeChildrenFilterIterator(NodeIterator iterator) throws BackendGenericError{
it = iterator;
}
private Node currentNode = null;
@Override
public boolean hasNext() {
try {
while (it.hasNext()) {
currentNode=it.nextNode();
if (ClassHandler.instance().get(currentNode.getPrimaryNodeType().getName())!=null)
return true;
}
return false;
}catch (RepositoryException e) {
throw new RuntimeException(e);
}
}
@Override
public Node next() {
return currentNode;
}
}


@@ -0,0 +1,75 @@
package org.gcube.data.access.storagehub;
import jakarta.inject.Singleton;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import org.gcube.common.storagehub.model.Path;
import org.gcube.common.storagehub.model.Paths;
@Singleton
public class PathUtil {
public Path getWorkspacePath(String login){
return Paths.getPath(String.format("/Home/%s/%s",login,Constants.WORKSPACE_ROOT_FOLDER_NAME));
}
public Path getHome(String login){
return Paths.getPath(String.format("/Home/%s",login));
}
public Path getInboxPath(String login) {
return Paths.append(getHome(login),Constants.INBOX_FOLDER_NAME);
}
public Path getOutboxPath(String login) {
return Paths.append(getHome(login),Constants.OUTBOX_FOLDER_NAME);
}
@Deprecated
private Path getOldTrashPath(String login){
return Paths.append(getWorkspacePath(login),Constants.TRASH_ROOT_FOLDER_NAME);
}
private Path getNewTrashPath(String login){
return Paths.append(getHome(login), Constants.TRASH_ROOT_FOLDER_NAME);
}
@Deprecated
private Path getOldVREsPath(String login){
return Paths.append(getWorkspacePath(login),Constants.OLD_VRE_FOLDER_PARENT_NAME);
}
private Path getNewVREsPath(String login){
return Paths.append(getHome(login),Constants.PERSONAL_VRES_FOLDER_PARENT_NAME);
}
public Path getSharedWithMePath(String login){
return Paths.append(getHome(login),Constants.SHARED_WITH_ME_PARENT_NAME);
}
/*
public Path getMySharedPath(String login){
return Paths.append(getHome(login),Constants.MYSHARED_PARENT_NAME);
}*/
public Path getVREsPath(String login, Session session) throws RepositoryException {
Path home = getHome(login);
Node node = session.getNode(home.toPath());
if (node.hasProperty(Constants.HOME_VERSION_PROP) && node.getProperty(Constants.HOME_VERSION_PROP).getLong()>0)
return getNewVREsPath(login);
else
return getOldVREsPath(login);
}
public Path getTrashPath(String login, Session session) throws RepositoryException {
Path home = getHome(login);
Node node = session.getNode(home.toPath());
if (node.hasProperty(Constants.HOME_VERSION_PROP) && node.getProperty(Constants.HOME_VERSION_PROP).getLong()>0)
return getNewTrashPath(login);
else
return getOldTrashPath(login);
}
}
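PathUtil above selects the old or new Trash/VREs location by checking the hl:version property on the user's home node: migrated homes (version greater than 0) keep these folders directly under the home, older homes keep them under the workspace root. A condensed sketch of that branch (the folder names here are illustrative placeholders, not the actual values of the project's Constants):

```java
// Hypothetical condensed version of PathUtil.getTrashPath's version check.
public class HomeLayout {
    public static String trashPath(String login, long homeVersion) {
        String home = "/Home/" + login;
        return homeVersion > 0
                ? home + "/Trash"             // new layout: directly under the home
                : home + "/Workspace/Trash";  // old layout: under the workspace root
    }
}
```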


@@ -1,31 +0,0 @@
package org.gcube.data.access.storagehub;
import javax.inject.Singleton;
import javax.jcr.Repository;
import javax.naming.Context;
import javax.naming.InitialContext;
import org.gcube.data.access.storagehub.services.RepositoryInitializer;
@Singleton
public class RepositoryInitializerImpl implements RepositoryInitializer{
private Repository repository;
@Override
public Repository getRepository(){
return repository;
}
public RepositoryInitializerImpl() throws Exception{
InitialContext context = new InitialContext();
Context environment = (Context) context.lookup("java:comp/env");
repository = (Repository) environment.lookup("jcr/repository");
}
}


@@ -0,0 +1,7 @@
package org.gcube.data.access.storagehub;
public class Roles {
public static final String VREMANAGER_ROLE = "VRE-Manager";
public static final String INFRASTRUCTURE_MANAGER_ROLE = "Infrastructure-Manager";
}


@@ -4,8 +4,8 @@ import java.io.IOException;
 import java.io.InputStream;
 import java.io.OutputStream;
-import javax.ws.rs.WebApplicationException;
-import javax.ws.rs.core.StreamingOutput;
+import jakarta.ws.rs.WebApplicationException;
+import jakarta.ws.rs.core.StreamingOutput;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;


@@ -1,36 +0,0 @@
package org.gcube.data.access.storagehub;
import java.util.Map;
import java.util.WeakHashMap;
import org.gcube.common.authorization.library.provider.AuthorizationProvider;
import org.gcube.contentmanagement.blobstorage.service.IClient;
import org.gcube.contentmanager.storageclient.model.protocol.smp.Handler;
import org.gcube.contentmanager.storageclient.wrapper.AccessType;
import org.gcube.contentmanager.storageclient.wrapper.MemoryType;
import org.gcube.contentmanager.storageclient.wrapper.StorageClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class StorageFactory {
public final static String SERVICE_NAME = "home-library";
public final static String SERVICE_CLASS = "org.gcube.portlets.user";
private static Map<String, IClient> clientUserMap = new WeakHashMap<String, IClient>();
private static Logger log = LoggerFactory.getLogger(StorageFactory.class);
public static IClient getGcubeStorage(){
String login = AuthorizationProvider.instance.get().getClient().getId();
if (!clientUserMap.containsKey(login)){
IClient storage = new StorageClient(SERVICE_CLASS, SERVICE_NAME,
login, AccessType.SHARED, MemoryType.PERSISTENT).getClient();
log.info("******* Storage activateProtocol for Storage **********");
Handler.activateProtocol();
clientUserMap.put(login, storage);
return storage;
} else return clientUserMap.get(login);
}
}

View File

@@ -3,17 +3,21 @@ package org.gcube.data.access.storagehub;
 import java.util.HashSet;
 import java.util.Set;
-import javax.ws.rs.Path;
-import javax.ws.rs.core.Application;
+import jakarta.ws.rs.Path;
+import jakarta.ws.rs.core.Application;
 import org.gcube.common.gxrest.response.entity.SerializableErrorEntityTextWriter;
 import org.gcube.data.access.storagehub.services.ACLManager;
 import org.gcube.data.access.storagehub.services.GroupManager;
+import org.gcube.data.access.storagehub.services.Impersonable;
 import org.gcube.data.access.storagehub.services.ItemSharing;
 import org.gcube.data.access.storagehub.services.ItemsCreator;
 import org.gcube.data.access.storagehub.services.ItemsManager;
+import org.gcube.data.access.storagehub.services.MessageManager;
+import org.gcube.data.access.storagehub.services.StorageManager;
 import org.gcube.data.access.storagehub.services.UserManager;
 import org.gcube.data.access.storagehub.services.WorkspaceManager;
+import org.gcube.data.access.storagehub.services.admin.ScriptManager;
 import org.glassfish.jersey.media.multipart.MultiPartFeature;
@@ -23,7 +27,7 @@ public class StorageHub extends Application {
     public Set<Class<?>> getClasses() {
         final Set<Class<?>> classes = new HashSet<Class<?>>();
         // register resources and features
-        classes.add(MultiPartFeature.class);
+        classes.add(Impersonable.class);
         classes.add(WorkspaceManager.class);
         classes.add(ItemsManager.class);
         classes.add(ItemsCreator.class);
@@ -31,7 +35,12 @@ public class StorageHub extends Application {
         classes.add(ItemSharing.class);
         classes.add(UserManager.class);
         classes.add(GroupManager.class);
+        classes.add(ScriptManager.class);
+        classes.add(MessageManager.class);
+        classes.add(StorageManager.class);
+        classes.add(MultiPartFeature.class);
         classes.add(SerializableErrorEntityTextWriter.class);
+        classes.add(MyApplicationListener.class);
         return classes;
     }

View File

@@ -0,0 +1,97 @@
package org.gcube.data.access.storagehub;
import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.jcr.RepositoryException;
import org.apache.jackrabbit.api.JackrabbitSession;
import org.gcube.com.fasterxml.jackson.databind.ObjectMapper;
import org.gcube.common.storagehub.model.exporter.DumpData;
import org.gcube.data.access.storagehub.handlers.DataHandler;
import org.gcube.data.access.storagehub.repository.StoragehubRepository;
import org.gcube.smartgears.ApplicationManager;
import org.gcube.smartgears.ContextProvider;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class StorageHubApplicationManager implements ApplicationManager {
private static Logger logger = LoggerFactory.getLogger(StorageHubApplicationManager.class);
private boolean alreadyShutDown = false;
private boolean alreadyInit = false;
private StoragehubRepository repository;
// private static NotificationClient notificationClient;
/*
* public static NotificationClient getNotificationClient() { return
* notificationClient; }
*/
public StorageHubApplicationManager() {
repository = StoragehubRepository.repository;
}
@Override
public synchronized void onInit() {
logger.info("onInit Called on storagehub");
try {
if (!alreadyInit) {
logger.info("jackrabbit initialization started");
repository.initContainerAtFirstStart();
DataHandler dh = new DataHandler();
Path shubSpecificConf = ContextProvider.get().appSpecificConfigurationFolder();
Path importPath = Paths.get(shubSpecificConf.toString(), "import");
Path mainFileImportPath = Paths.get(importPath.toString(), "data.json");
if (importPath.toFile().exists() && mainFileImportPath.toFile().exists()) {
JackrabbitSession session = null;
try {
ObjectMapper mapper = new ObjectMapper();
DumpData data = mapper.readValue(mainFileImportPath.toFile(), DumpData.class);
session = (JackrabbitSession) repository.getRepository().login(Constants.JCR_CREDENTIALS);
dh.importData((JackrabbitSession) session, data);
session.save();
} catch (RepositoryException e) {
logger.error("error importing data", e);
} catch (IOException je) {
logger.error("error parsing json data, invalid schema file", je);
} finally {
if (session != null)
session.logout();
}
}
alreadyInit = true;
}
} catch (Throwable e) {
logger.error("unexpected error initializing storagehub", e);
}
}
@Override
public synchronized void onShutdown() {
if (!alreadyShutDown)
try {
logger.info("jackrabbit is shutting down");
repository.shutdown();
alreadyShutDown = true;
} catch (Exception e) {
logger.warn("the database was not shutdown properly", e);
}
}
}
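`StorageHubApplicationManager` above guards `onInit`/`onShutdown` with `synchronized` methods and boolean flags so that repeated container callbacks run the expensive work (repository start, one-time data import) only once. The guard pattern in isolation (a sketch; `Lifecycle` and its boolean return values are hypothetical, not part of the service):

```java
public class Lifecycle {
    private boolean initialized = false;
    private boolean shutDown = false;

    // Returns true only on the call that actually performs initialization,
    // mirroring the alreadyInit guard in StorageHubApplicationManager.onInit().
    public synchronized boolean onInit() {
        if (initialized) return false;
        // ... open the repository, run the one-time data import ...
        initialized = true;
        return true;
    }

    // Returns true only on the call that actually performs shutdown,
    // mirroring the alreadyShutDown guard in onShutdown().
    public synchronized boolean onShutdown() {
        if (shutDown) return false;
        // ... shut the repository down ...
        shutDown = true;
        return true;
    }
}
```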

View File

@@ -1,19 +1,17 @@
 package org.gcube.data.access.storagehub;
 
-import java.io.BufferedInputStream;
 import java.io.IOException;
 import java.io.InputStream;
 import java.io.OutputStream;
 import java.net.URL;
 import java.security.MessageDigest;
 import java.util.ArrayList;
-import java.util.Arrays;
 import java.util.Calendar;
-import java.util.Deque;
-import java.util.LinkedList;
+import java.util.HashSet;
 import java.util.List;
 import java.util.Set;
-import java.util.zip.ZipEntry;
-import java.util.zip.ZipOutputStream;
+import java.util.function.Predicate;
 import javax.jcr.Node;
 import javax.jcr.NodeIterator;
@@ -22,39 +20,42 @@ import javax.jcr.RepositoryException;
 import javax.jcr.Session;
 import javax.jcr.lock.Lock;
 import javax.jcr.lock.LockException;
-import javax.jcr.version.Version;
+import javax.jcr.query.Query;
 import org.apache.commons.io.FilenameUtils;
+import org.apache.jackrabbit.util.ISO9075;
 import org.apache.jackrabbit.util.Text;
-import org.gcube.common.authorization.library.provider.AuthorizationProvider;
 import org.gcube.common.storagehub.model.Excludes;
-import org.gcube.common.storagehub.model.Paths;
 import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
+import org.gcube.common.storagehub.model.exceptions.IdNotFoundException;
 import org.gcube.common.storagehub.model.exceptions.ItemLockedException;
+import org.gcube.common.storagehub.model.exceptions.StorageHubException;
+import org.gcube.common.storagehub.model.exceptions.UserNotAuthorizedException;
 import org.gcube.common.storagehub.model.items.AbstractFileItem;
 import org.gcube.common.storagehub.model.items.ExternalLink;
 import org.gcube.common.storagehub.model.items.FolderItem;
 import org.gcube.common.storagehub.model.items.GCubeItem;
-import org.gcube.common.storagehub.model.items.GenericFileItem;
 import org.gcube.common.storagehub.model.items.Item;
 import org.gcube.common.storagehub.model.items.SharedFolder;
+import org.gcube.common.storagehub.model.storages.MetaInfo;
 import org.gcube.common.storagehub.model.types.ItemAction;
 import org.gcube.common.storagehub.model.types.NodeProperty;
-import org.gcube.contentmanager.storageclient.wrapper.AccessType;
-import org.gcube.contentmanager.storageclient.wrapper.MemoryType;
-import org.gcube.contentmanager.storageclient.wrapper.StorageClient;
+import org.gcube.common.storagehub.model.types.FolderInfoType;
 import org.gcube.data.access.storagehub.accounting.AccountingHandler;
-import org.gcube.data.access.storagehub.handlers.Item2NodeConverter;
-import org.gcube.data.access.storagehub.handlers.Node2ItemConverter;
-import org.gcube.data.access.storagehub.handlers.StorageBackendHandler;
-import org.gcube.data.access.storagehub.handlers.VersionHandler;
-import org.gcube.data.access.storagehub.storage.backend.impl.GCubeStorageBackend;
+import org.gcube.data.access.storagehub.handlers.items.Item2NodeConverter;
+import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
+import org.gcube.data.access.storagehub.handlers.items.builders.FolderCreationParameters;
+import org.gcube.data.access.storagehub.handlers.plugins.StorageBackendHandler;
+import org.gcube.data.access.storagehub.predicates.IncludeTypePredicate;
+import org.gcube.data.access.storagehub.predicates.ItemTypePredicate;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 public class Utils {
 
     public final static String SERVICE_NAME = "home-library";
     public final static String SERVICE_CLASS = "org.gcube.portlets.user";
 
     private static final String FOLDERS_TYPE = "nthl:workspaceItem";
 
     private static final Logger logger = LoggerFactory.getLogger(Utils.class);
@@ -78,8 +79,22 @@ public class Utils {
         return digest;
     }
 
-    public static long getItemCount(Node parent, boolean showHidden, Class<? extends Item> nodeType) throws RepositoryException, BackendGenericError{
-        return getItemList(parent, Excludes.ALL, null, showHidden, nodeType).size();
+    public static long getItemCount(Node parent, boolean showHidden, ItemTypePredicate itemTypePredicate) throws RepositoryException, BackendGenericError{
+        return getItemList(parent, Excludes.ALL, null, showHidden, itemTypePredicate).size();
+    }
+
+    public static FolderInfoType getFolderInfo(Node parent) throws RepositoryException, BackendGenericError{
+        FolderInfoType info = new FolderInfoType(0, 0);
+        List<Item> items = getItemList(parent, Excludes.GET_ONLY_CONTENT, null, false, null);
+        for (Item item: items)
+            if (item instanceof FolderItem) {
+                FolderInfoType fit = getFolderInfo((Node) item.getRelatedNode());
+                info.setCount(info.getCount() + fit.getCount());
+                info.setSize(info.getSize() + fit.getSize());
+            } else if (item instanceof AbstractFileItem gfi)
+                info.setSize(info.getSize() + gfi.getContent().getSize());
+        info.setCount(info.getCount() + items.size());
+        return info;
     }
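The new `getFolderInfo` accumulates total content size and an item count that includes every entry at every depth (each recursion level adds its own children count). The same traversal over a plain in-memory tree (a sketch; `Entry` and `info` are hypothetical stand-ins for the JCR nodes and `FolderInfoType`):

```java
import java.util.ArrayList;
import java.util.List;

public class Entry {
    final List<Entry> children = new ArrayList<>();
    final long size;       // bytes for a file, 0 for a folder
    final boolean folder;

    Entry(long size) { this.size = size; this.folder = false; }                 // file
    Entry(List<Entry> kids) { this.children.addAll(kids); this.size = 0; this.folder = true; } // folder

    // Mirrors Utils.getFolderInfo: returns {count, size} where size sums all
    // file sizes and count includes every item at every nesting level.
    static long[] info(Entry dir) {
        long count = 0, size = 0;
        for (Entry child : dir.children) {
            if (child.folder) {
                long[] sub = info(child);
                count += sub[0];
                size += sub[1];
            } else {
                size += child.size;
            }
        }
        count += dir.children.size();
        return new long[] { count, size };
    }
}
```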
@@ -107,36 +122,82 @@
     }
 
-    public static <T extends Item> List<T> getItemList(Node parent, List<String> excludes, Range range, boolean showHidden, Class<? extends Item> nodeTypeToInclude) throws RepositoryException, BackendGenericError{
-        logger.debug("getting children of node {}", parent.getIdentifier());
-        List<T> returnList = new ArrayList<T>();
-        long start = System.currentTimeMillis();
-        NodeIterator iterator = parent.getNodes();
-        logger.trace("time to get iterator {}",(System.currentTimeMillis()-start));
-        logger.trace("nodeType is {}",nodeTypeToInclude);
-        int count =0;
-        logger.trace("selected range is {}", range);
-        Node2ItemConverter node2Item= new Node2ItemConverter();
-        while (iterator.hasNext()){
-            Node current = iterator.nextNode();
-            logger.debug("current node "+current.getName());
-            if (isToExclude(current, showHidden))
-                continue;
-            logger.debug("current node not excluded "+current.getName());
-            if (range==null || (count>=range.getStart() && returnList.size()<range.getLimit())) {
-                T item = node2Item.getFilteredItem(current, excludes, nodeTypeToInclude);
-                if (item==null) continue;
-                returnList.add(item);
-            }
-            count++;
-        }
-        return returnList;
-    }
+    public static <T extends Item> List<T> searchByNameOnFolder(Session ses, String user, AuthorizationChecker authChecker, Node parent, List<String> excludes, Range range, boolean showHidden, boolean excludeTrashed, ItemTypePredicate itemTypePredicate, String nameParam) throws RepositoryException, BackendGenericError{
+        String xpath = String.format("/jcr:root%s//element(*,nthl:workspaceItem)[jcr:like(fn:lower-case(@jcr:title), '%s')]",ISO9075.encodePath(parent.getPath()), nameParam.toLowerCase());
+        //String query = String.format("SELECT * FROM [nthl:workspaceLeafItem] AS node WHERE ISDESCENDANTNODE('%s') ORDER BY node.[jcr:lastModified] DESC ",vreFolder.getPath());
+        logger.trace("query for search is {}",xpath);
+        long start = System.currentTimeMillis();
+        @SuppressWarnings("deprecation")
+        Query jcrQuery = ses.getWorkspace().getQueryManager().createQuery(xpath, Query.XPATH);
+        NodeChildrenFilterIterator iterator = new NodeChildrenFilterIterator(jcrQuery.execute().getNodes());
+        logger.trace("[SEARCH] real search took {} millis",(System.currentTimeMillis()-start));
+        Predicate<Node> checker = new Predicate<Node>() {
+            @Override
+            public boolean test(Node t) {
+                try {
+                    authChecker.checkReadAuthorizationControl(t.getSession(), user, t.getIdentifier());
+                    return true;
+                } catch (UserNotAuthorizedException | BackendGenericError | RepositoryException e) {
+                    return false;
+                }
+            }
+        };
+        return getItemListFromNodeIterator(checker, iterator , excludes, range, showHidden, excludeTrashed, itemTypePredicate);
+    }
+
+    public static <T extends Item> List<T> getItemList(Node parent, List<String> excludes, Range range, boolean showHidden, ItemTypePredicate itemTypePredicate) throws RepositoryException, BackendGenericError{
+        return getItemList(null, parent, excludes, range, showHidden, itemTypePredicate);
+    }
+
+    public static <T extends Item> List<T> getItemList(Predicate<Node> checker, Node parent, List<String> excludes, Range range, boolean showHidden, ItemTypePredicate itemTypePredicate) throws RepositoryException, BackendGenericError{
+        logger.trace("getting children of node {}", parent.getIdentifier());
+        long start = System.currentTimeMillis();
+        NodeChildrenFilterIterator iterator = new NodeChildrenFilterIterator(parent);
+        logger.trace("time to get iterator {}",(System.currentTimeMillis()-start));
+        return getItemListFromNodeIterator(checker, iterator, excludes, range, showHidden, false, itemTypePredicate);
+    }
+
+    private static <T extends Item> List<T> getItemListFromNodeIterator(Predicate<Node> checker, NodeChildrenFilterIterator iterator, List<String> excludes, Range range, boolean showHidden, boolean excludeTrashed, ItemTypePredicate itemTypePredicate) throws RepositoryException, BackendGenericError{
+        List<T> returnList = new ArrayList<T>();
+        int count =0;
+        logger.trace("selected range is {}", range);
+        Node2ItemConverter node2Item= new Node2ItemConverter();
+        Set<String> duplicateId = new HashSet<String>();
+        while (iterator.hasNext()){
+            Node current = iterator.next();
+            logger.trace("[SEARCH] evaluating node {} ",current.hasProperty(NodeProperty.TITLE.toString())? current.getProperty(NodeProperty.TITLE.toString()):current.getName());
+            //REMOVE duplicate nodes, in case the indexes are not working
+            if (duplicateId.contains(current.getIdentifier())) {
+                logger.warn("duplicated node found");
+                continue;
+            }
+            //EXCLUDES node from predicate
+            if (checker!=null && !checker.test(current))
+                continue;
+            if (isToExclude(current, showHidden))
+                continue;
+            logger.trace("[SEARCH] current node not excluded {} ",current.hasProperty(NodeProperty.TITLE.toString())? current.getProperty(NodeProperty.TITLE.toString()):current.getName());
+            if (range==null || (count>=range.getStart() && returnList.size()<range.getLimit())) {
+                T item = node2Item.getFilteredItem(current, excludes, itemTypePredicate);
+                if (item==null || (item.isTrashed() && excludeTrashed)) continue;
+                returnList.add(item);
+            }
+            count++;
+            duplicateId.add(current.getIdentifier());
+        }
+        return returnList;
+    }
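The paging plus duplicate-id filtering added in `getItemListFromNodeIterator` can be isolated like this (a pure-JDK sketch; `PagedDedup` and `page` are hypothetical names). Note that duplicates are skipped before the range check, so they never consume a slot in the page:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class PagedDedup {
    // Mirrors the loop in getItemListFromNodeIterator: skip ids already seen,
    // then keep items whose running position falls in [start, start+limit).
    static List<String> page(List<String> ids, int start, int limit) {
        List<String> out = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        int count = 0;
        for (String id : ids) {
            if (!seen.add(id)) continue;                       // duplicate: skip entirely
            if (count >= start && out.size() < limit) out.add(id);
            count++;
        }
        return out;
    }
}
```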
@@ -147,83 +208,10 @@
                (node.getPrimaryNodeType().getName().equals(FOLDERS_TYPE) && Constants.FOLDERS_TO_EXLUDE.contains(node.getName())));
     }
 
-    public static org.gcube.common.storagehub.model.Path getWorkspacePath(){
-        return Paths.getPath(String.format("/Home/%s/Workspace",AuthorizationProvider.instance.get().getClient().getId()));
-    }
-
-    public static org.gcube.common.storagehub.model.Path getWorkspacePath(String login){
-        return Paths.getPath(String.format("/Home/%s/Workspace",login));
-    }
-
-    public static org.gcube.common.storagehub.model.Path getHome(String login){
-        return Paths.getPath(String.format("/Home/%s",login));
-    }
-
-    public static Deque<Item> getAllNodesForZip(FolderItem directory, Session session, AccountingHandler accountingHandler, List<String> excludes) throws RepositoryException, BackendGenericError{
-        Deque<Item> queue = new LinkedList<Item>();
-        Node currentNode = session.getNodeByIdentifier(directory.getId());
-        queue.push(directory);
-        Deque<Item> tempQueue = new LinkedList<Item>();
-        logger.debug("adding directory {}",currentNode.getPath());
-        for (Item item : Utils.getItemList(currentNode,Excludes.GET_ONLY_CONTENT, null, false, null)){
-            if (excludes.contains(item.getId())) continue;
-            if (item instanceof FolderItem)
-                tempQueue.addAll(getAllNodesForZip((FolderItem) item, session, accountingHandler, excludes));
-            else if (item instanceof AbstractFileItem){
-                logger.debug("adding file {}",item.getPath());
-                AbstractFileItem fileItem = (AbstractFileItem) item;
-                accountingHandler.createReadObj(fileItem.getTitle(), session, session.getNodeByIdentifier(item.getId()), false);
-                queue.addLast(item);
-            }
-        }
-        queue.addAll(tempQueue);
-        return queue;
-    }
-
-    public static void zipNode(ZipOutputStream zos, Deque<Item> queue, String login, org.gcube.common.storagehub.model.Path originalPath, StorageBackendHandler storageHandler) throws Exception{
-        logger.trace("originalPath is {}",originalPath.toPath());
-        org.gcube.common.storagehub.model.Path actualPath = Paths.getPath("");
-        while (!queue.isEmpty()) {
-            Item item = queue.pop();
-            if (item instanceof FolderItem) {
-                actualPath = Paths.getPath(item.getPath());
-                logger.debug("actualPath is {}",actualPath.toPath());
-                String name = Paths.remove(actualPath, originalPath).toPath().replaceFirst("/", "");
-                logger.debug("writing dir {}",name);
-                if (name.isEmpty()) continue;
-                try {
-                    zos.putNextEntry(new ZipEntry(name));
-                }finally {
-                    zos.closeEntry();
-                }
-            } else if (item instanceof AbstractFileItem){
-                try {
-                    InputStream streamToWrite = storageHandler.download(((AbstractFileItem)item).getContent().getStorageId());
-                    if (streamToWrite == null){
-                        logger.warn("discarding item {} ",item.getName());
-                        continue;
-                    }
-                    try(BufferedInputStream is = new BufferedInputStream(streamToWrite)){
-                        String name = (Paths.remove(actualPath, originalPath).toPath()+item.getName()).replaceFirst("/", "");
-                        logger.debug("writing file {}",name);
-                        zos.putNextEntry(new ZipEntry(name));
-                        copyStream(is, zos);
-                    }catch (Exception e) {
-                        logger.warn("error writing item {}", item.getName(),e);
-                    } finally{
-                        zos.closeEntry();
-                    }
-                    zos.flush();
-                }catch (Throwable e) {
-                    logger.warn("error reading content for item {}", item.getPath(),e);
-                }
-            }
-        }
-        zos.close();
-    }
-
-    private static void copyStream(InputStream in, OutputStream out) throws IOException {
+    public static void copyStream(InputStream in, OutputStream out) throws IOException {
         byte[] buffer = new byte[2048];
         int readcount = 0;
         while ((readcount=in.read(buffer))!=-1) {
@@ -234,56 +222,34 @@
     public static boolean hasSharedChildren(Node node) throws RepositoryException, BackendGenericError{
         Node2ItemConverter node2Item = new Node2ItemConverter();
-        NodeIterator children = node.getNodes();
+        NodeChildrenFilterIterator children = new NodeChildrenFilterIterator(node);
         while (children.hasNext()) {
-            Node child= children.nextNode();
+            Node child= children.next();
             if (node2Item.checkNodeType(child, SharedFolder.class)) return true;
             if (node2Item.checkNodeType(child, FolderItem.class) && hasSharedChildren(child)) return true;
         }
         return false;
     }
 
-    public static void getAllContentIds(Session ses, Set<String> idsToDelete, Item itemToDelete, VersionHandler versionHandler) throws Exception{
-        if (itemToDelete instanceof AbstractFileItem) {
-            List<Version> versions = versionHandler.getContentVersionHistory(ses.getNodeByIdentifier(itemToDelete.getId()), ses);
-            versions.forEach(v -> {
-                try {
-                    String storageId =v.getFrozenNode().getProperty(NodeProperty.STORAGE_ID.toString()).getString();
-                    idsToDelete.add(storageId);
-                    logger.info("retrieved StorageId {} for version {}", storageId, v.getName());
-                } catch (Exception e) {
-                    logger.warn("error retreiving sotrageId version for item with id {}",itemToDelete.getId(),e);
-                }
-            });
-            idsToDelete.add(((AbstractFileItem) itemToDelete).getContent().getStorageId());
-        }else if (itemToDelete instanceof FolderItem) {
-            List<Item> items = Utils.getItemList(ses.getNodeByIdentifier(itemToDelete.getId()), Excludes.GET_ONLY_CONTENT , null, true, null);
-            for (Item item: items)
-                getAllContentIds(ses, idsToDelete, item, versionHandler);
-        }
-    }
-
     public static String checkExistanceAndGetUniqueName(Session ses, Node destination, String name) throws BackendGenericError{
+        String escapedName = Text.escapeIllegalJcrChars(name);
         try {
-            destination.getNode(name);
+            destination.getNode(escapedName);
         }catch(PathNotFoundException pnf) {
-            return Text.escapeIllegalJcrChars(name);
+            return escapedName;
         } catch (Exception e) {
             throw new BackendGenericError(e);
         }
         try {
-            String filename = FilenameUtils.getBaseName(name);
-            String ext = FilenameUtils.getExtension(name);
+            String filename = FilenameUtils.getBaseName(escapedName);
+            String ext = FilenameUtils.getExtension(escapedName);
             String nameTocheck = ext.isEmpty()? String.format("%s(*)",filename): String.format("%s(*).%s",filename, ext);
-            logger.debug("filename is {}, extension is {} , and name to check is {}", filename, ext, nameTocheck);
+            logger.trace("filename is {}, extension is {} , and name to check is {}", filename, ext, nameTocheck);
             NodeIterator ni = destination.getNodes(nameTocheck);
             int maxval = 0;
@@ -296,35 +262,66 @@
             String newName = ext.isEmpty()? String.format("%s(%d)", filename,maxval+1) : String.format("%s(%d).%s", filename,maxval+1, ext) ;
-            return Text.escapeIllegalJcrChars(newName);
+            return newName;
         } catch (Exception e) {
             throw new BackendGenericError(e);
         }
     }
 
-    public static Node createFolderInternally(Session ses, Node destinationNode, String name, String description, boolean hidden, String login, AccountingHandler accountingHandler) throws BackendGenericError {
-        String uniqueName = Utils.checkExistanceAndGetUniqueName(ses, destinationNode, name);
+    public static Node createFolderInternally(FolderCreationParameters params, AccountingHandler accountingHandler, boolean isInternalWSFolder) throws StorageHubException {
+        logger.debug("creating folder {} in {}", params.getName(), params.getParentId());
+        Node destinationNode;
+        FolderItem destinationItem;
+        try {
+            destinationNode = params.getSession().getNodeByIdentifier(params.getParentId());
+            destinationItem = (FolderItem) new Node2ItemConverter().getItem(destinationNode, Excludes.ALL);
+        }catch (RepositoryException e) {
+            logger.error("id not found",e);
+            throw new IdNotFoundException(params.getParentId());
+        }
+        String uniqueName = Utils.checkExistanceAndGetUniqueName(params.getSession(), destinationNode, params.getName());
         FolderItem item = new FolderItem();
         Calendar now = Calendar.getInstance();
         item.setName(uniqueName);
         item.setTitle(uniqueName);
-        item.setDescription(description);
+        item.setDescription(params.getDescription());
+        if (isInternalWSFolder) {
+            item.setBackend(StorageBackendHandler.getDefaultPayloadForFolder());
+        } else {
+            if (params.getBackend() != null)
+                item.setBackend(params.getBackend());
+            else {
+                if (destinationItem.getBackend() != null)
+                    item.setBackend(destinationItem.getBackend());
+                else
+                    item.setBackend(StorageBackendHandler.getDefaultPayloadForFolder());
+            }
+        }
+        //TODO: item.setExternalStorage();
         //item.setCreationTime(now);
-        item.setHidden(hidden);
+        boolean hiddenDestNode= false;
+        try {
+            hiddenDestNode = destinationNode.getProperty(NodeProperty.HIDDEN.toString()).getBoolean();
+        }catch (Throwable e) {}
+        item.setHidden(params.isHidden() || hiddenDestNode);
         item.setLastAction(ItemAction.CREATED);
         item.setLastModificationTime(now);
-        item.setLastModifiedBy(login);
-        item.setOwner(login);
+        item.setLastModifiedBy(params.getUser());
+        item.setOwner(params.getUser());
         item.setPublicItem(false);
-        //to inherit hidden property
-        //item.setHidden(destinationItem.isHidden());
         Node newNode = new Item2NodeConverter().getNode(destinationNode, item);
-        if (accountingHandler!=null)
-            accountingHandler.createFolderAddObj(name, item.getClass().getSimpleName(), null, ses, newNode, false);
+        if (accountingHandler!=null) {
+            accountingHandler.createFolderAddObj(uniqueName, item.getClass().getSimpleName(), null, params.getSession(), params.getUser(), destinationNode, false);
+            accountingHandler.createEntryCreate(item.getTitle(), params.getSession(), newNode, params.getUser(), false);
+        }
         return newNode;
     }
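The name-uniquing scheme used by `checkExistanceAndGetUniqueName` above (try `name`, else `base(n).ext` with `n` one past the largest suffix in use) can be sketched over a plain set of names (a hypothetical pure-JDK re-implementation; `UniqueName` and `unique` are not part of the service, and JCR character escaping is omitted):

```java
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class UniqueName {
    // If "name" is free, keep it; otherwise return "base(n).ext" where n is
    // one greater than the largest numeric suffix already present.
    static String unique(String name, Set<String> existing) {
        if (!existing.contains(name)) return name;
        int dot = name.lastIndexOf('.');
        String base = dot > 0 ? name.substring(0, dot) : name;
        String ext = dot > 0 ? name.substring(dot) : "";
        Pattern p = Pattern.compile(Pattern.quote(base) + "\\((\\d+)\\)" + Pattern.quote(ext));
        int max = 0;
        for (String e : existing) {
            Matcher m = p.matcher(e);
            if (m.matches()) max = Math.max(max, Integer.parseInt(m.group(1)));
        }
        return base + "(" + (max + 1) + ")" + ext;
    }
}
```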
@@ -345,12 +342,20 @@
         item.setPublicItem(false);
         item.setValue(value);
-        //to inherit hidden property
-        //item.setHidden(destinationItem.isHidden());
+        try {
+            item.setHidden(destinationNode.getProperty(NodeProperty.HIDDEN.toString()).getBoolean());
+        } catch (Throwable e) {
+            item.setHidden(false);
+        }
         Node newNode = new Item2NodeConverter().getNode(destinationNode, item);
-        if (accountingHandler!=null)
-            accountingHandler.createFolderAddObj(name, item.getClass().getSimpleName(), null, ses, newNode, false);
+        if (accountingHandler!=null) {
+            accountingHandler.createFolderAddObj(name, item.getClass().getSimpleName(), null, ses, login, destinationNode, false);
+            accountingHandler.createEntryCreate(item.getTitle(), ses, newNode,login, false);
+        }
         return newNode;
     }
@@ -380,4 +385,12 @@
         node.setProperty(NodeProperty.LAST_MODIFIED_BY.toString(), login);
         node.setProperty(NodeProperty.LAST_ACTION.toString(), action.name());
     }
+
+    public static void setContentFromMetaInfo(AbstractFileItem item, MetaInfo contentInfo) {
+        item.getContent().setSize(contentInfo.getSize());
+        item.getContent().setRemotePath(contentInfo.getRemotePath());
+        item.getContent().setSize(contentInfo.getSize());
+        item.getContent().setPayloadBackend(contentInfo.getPayloadBackend());
+    }
 }

View File

@@ -4,22 +4,17 @@ import java.util.Calendar;
 import java.util.Set;
 import java.util.UUID;
 
-import javax.inject.Singleton;
 import javax.jcr.Node;
 import javax.jcr.RepositoryException;
 import javax.jcr.Session;
-import javax.jcr.UnsupportedRepositoryOperationException;
-import javax.jcr.version.Version;
-import javax.jcr.version.VersionHistory;
-import javax.jcr.version.VersionIterator;
-import javax.jcr.version.VersionManager;
-import org.gcube.common.authorization.library.provider.AuthorizationProvider;
 import org.gcube.common.storagehub.model.items.nodes.accounting.AccountingEntryType;
 import org.gcube.common.storagehub.model.types.NodeProperty;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+
+import jakarta.inject.Singleton;
 
 @Singleton
 public class AccountingHandler {
@@ -33,11 +28,12 @@ public class AccountingHandler {
     private static final String OLD_ITEM_NAME = "hl:oldItemName";
     private static final String NEW_ITEM_NAME = "hl:newItemName";
+    private static final String BASE_VERSION ="1.0";
 
     private static final Logger logger = LoggerFactory.getLogger(AccountingHandler.class);
 
-    public void createReadObj(String title, Session ses, Node node, boolean saveHistory ) {
+    public void createReadObj(String title, String version, Session ses, Node node, String login, boolean saveHistory ) {
         try {
             if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){
@ -47,30 +43,39 @@ public class AccountingHandler {
Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.READ.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.READ.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title); accountingNode.setProperty(ITEM_NAME, title);
accountingNode.setProperty(VERSION_ACCOUNTING, version!=null?version:BASE_VERSION);
try {
VersionManager vManager = ses.getWorkspace().getVersionManager();
VersionHistory history = vManager.getVersionHistory(node.getNode("jcr:content").getPath());
VersionIterator versions = history.getAllVersions();
Version version= null;
while (versions.hasNext()) {
version = versions.nextVersion();
}
if (version!=null)
accountingNode.setProperty(VERSION_ACCOUNTING, version.getName());
}catch(UnsupportedRepositoryOperationException uropex) {
logger.warn("version cannot be retrieved", uropex);
}
if (saveHistory) ses.save(); if (saveHistory) ses.save();
} catch (RepositoryException e) { } catch (RepositoryException e) {
logger.warn("error trying to retrieve accountign node",e); logger.warn("error trying to retrieve accountign node",e);
} }
} }
public void createFileUpdated(String title, Session ses, Node node, boolean saveHistory ) { public void createEntryCreate(String title, Session ses, Node node, String login, boolean saveHistory ) {
try {
if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){
node.addNode(NodeProperty.ACCOUNTING.toString(), NodeProperty.NT_ACCOUNTING.toString());
}
Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.CREATE.getNodeTypeDefinition());
accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title);
accountingNode.setProperty(VERSION_ACCOUNTING, BASE_VERSION);
if (saveHistory) ses.save();
} catch (RepositoryException e) {
logger.warn("error trying to retrieve accountign node",e);
}
}
public void createFileUpdated(String title, String version, Session ses, Node node, String login, boolean saveHistory ) {
try { try {
if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){ if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){
@ -80,24 +85,33 @@ public class AccountingHandler {
Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.UPDATE.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.UPDATE.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title);
accountingNode.setProperty(VERSION_ACCOUNTING, version);
if (saveHistory) ses.save();
} catch (RepositoryException e) {
logger.warn("error trying to retrieve accountign node",e);
}
}
public void createVersionDeleted(String title, String version, Session ses, Node node, String login, boolean saveHistory ) {
try {
if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){
node.addNode(NodeProperty.ACCOUNTING.toString(), NodeProperty.NT_ACCOUNTING.toString());
}
Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.DELETE.getNodeTypeDefinition());
accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title); accountingNode.setProperty(ITEM_NAME, title);
try { accountingNode.setProperty(VERSION_ACCOUNTING, version);
VersionManager vManager = ses.getWorkspace().getVersionManager();
VersionHistory history = vManager.getVersionHistory(node.getNode("jcr:content").getPath());
VersionIterator versions = history.getAllVersions();
Version version= null;
while (versions.hasNext()) {
version = versions.nextVersion();
}
if (version!=null)
accountingNode.setProperty(VERSION_ACCOUNTING, version.getName());
}catch(UnsupportedRepositoryOperationException uropex) {
logger.warn("version cannot be retrieved", uropex);
}
if (saveHistory) ses.save(); if (saveHistory) ses.save();
@ -107,17 +121,17 @@ public class AccountingHandler {
} }
public void createFolderAddObj(String title, String itemType, String mimeType, Session ses, Node node, boolean saveHistory ) { public void createFolderAddObj(String title, String itemType, String mimeType, Session ses, String login, Node parentNode, boolean saveHistory ) {
try { try {
Node directoryNode = node.getParent();
if (!directoryNode.hasNode(NodeProperty.ACCOUNTING.toString())){
directoryNode.addNode(NodeProperty.ACCOUNTING.toString(), NodeProperty.NT_ACCOUNTING.toString()); if (!parentNode.hasNode(NodeProperty.ACCOUNTING.toString())){
parentNode.addNode(NodeProperty.ACCOUNTING.toString(), NodeProperty.NT_ACCOUNTING.toString());
} }
Node accountingNodeParent = directoryNode.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = parentNode.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.ADD.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.ADD.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title); accountingNode.setProperty(ITEM_NAME, title);
accountingNode.setProperty(ITEM_TYPE, itemType); accountingNode.setProperty(ITEM_TYPE, itemType);
@ -130,7 +144,7 @@ public class AccountingHandler {
} }
} }
public void createFolderRemoveObj(String title, String itemType, String mimeType, Session ses, Node parentNode, boolean saveHistory ) { public void createFolderRemoveObj(String title, String itemType, String mimeType, Session ses, String login, Node parentNode, boolean saveHistory ) {
try { try {
if (!parentNode.hasNode(NodeProperty.ACCOUNTING.toString())){ if (!parentNode.hasNode(NodeProperty.ACCOUNTING.toString())){
@ -139,7 +153,7 @@ public class AccountingHandler {
Node accountingNodeParent = parentNode.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = parentNode.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.REMOVAL.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.REMOVAL.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title); accountingNode.setProperty(ITEM_NAME, title);
accountingNode.setProperty(ITEM_TYPE, itemType); accountingNode.setProperty(ITEM_TYPE, itemType);
@ -152,7 +166,7 @@ public class AccountingHandler {
} }
} }
public void createShareFolder(String title, Set<String> users, Session ses, Node sharedNode, boolean saveHistory ) { public void createShareFolder(String title, Set<String> users, Session ses, Node sharedNode, String login, boolean saveHistory ) {
try { try {
if (!sharedNode.hasNode(NodeProperty.ACCOUNTING.toString())){ if (!sharedNode.hasNode(NodeProperty.ACCOUNTING.toString())){
@ -161,7 +175,7 @@ public class AccountingHandler {
Node accountingNodeParent = sharedNode.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = sharedNode.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.SHARE.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.SHARE.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title); accountingNode.setProperty(ITEM_NAME, title);
accountingNode.setProperty(MEMBERS, users.toArray(new String[users.size()])); accountingNode.setProperty(MEMBERS, users.toArray(new String[users.size()]));
@ -172,7 +186,7 @@ public class AccountingHandler {
} }
} }
public void createUnshareFolder(String title, Session ses, Node sharedNode, boolean saveHistory ) { public void createUnshareFolder(String title, Session ses, String user, Node sharedNode, boolean saveHistory ) {
try { try {
if (!sharedNode.hasNode(NodeProperty.ACCOUNTING.toString())){ if (!sharedNode.hasNode(NodeProperty.ACCOUNTING.toString())){
@ -180,8 +194,8 @@ public class AccountingHandler {
} }
Node accountingNodeParent = sharedNode.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = sharedNode.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.SHARE.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.UNSHARE.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, user);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(ITEM_NAME, title); accountingNode.setProperty(ITEM_NAME, title);
@ -191,7 +205,7 @@ public class AccountingHandler {
} }
} }
public void createRename(String oldTitle, String newTitle, Node node, Session ses, boolean saveHistory ) { public void createRename(String oldTitle, String newTitle, Node node, String login, Session ses, boolean saveHistory ) {
try { try {
if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){ if (!node.hasNode(NodeProperty.ACCOUNTING.toString())){
@ -200,7 +214,7 @@ public class AccountingHandler {
Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString()); Node accountingNodeParent = node.getNode(NodeProperty.ACCOUNTING.toString());
Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.RENAMING.getNodeTypeDefinition()); Node accountingNode = accountingNodeParent.addNode(UUID.randomUUID().toString(),AccountingEntryType.RENAMING.getNodeTypeDefinition());
accountingNode.setProperty(USER, AuthorizationProvider.instance.get().getClient().getId()); accountingNode.setProperty(USER, login);
accountingNode.setProperty(DATE, Calendar.getInstance()); accountingNode.setProperty(DATE, Calendar.getInstance());
accountingNode.setProperty(OLD_ITEM_NAME, oldTitle); accountingNode.setProperty(OLD_ITEM_NAME, oldTitle);
accountingNode.setProperty(NEW_ITEM_NAME, newTitle); accountingNode.setProperty(NEW_ITEM_NAME, newTitle);
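Reviewer note: every method in this diff follows the same shape (ensure the accounting container exists, append a UUID-keyed entry, stamp user/date/version), and the change replaces the thread-local AuthorizationProvider lookup with explicit `login`/`version` parameters. A minimal sketch of that pattern, using plain maps instead of real JCR nodes (all names here are stand-ins, not the actual StorageHub types):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;

// Simplified model of an accounting entry: a map of properties keyed by a
// random UUID, appended to the parent node's "hl:accounting" child list.
public class AccountingSketch {
    static final String BASE_VERSION = "1.0"; // same fallback as createReadObj

    // Build one entry; the caller now supplies user and version explicitly
    // instead of reading them from a thread-local authorization context.
    static Map<String, Object> newEntry(String type, String user, String title, String version) {
        Map<String, Object> entry = new HashMap<>();
        entry.put("id", UUID.randomUUID().toString());
        entry.put("type", type);
        entry.put("hl:user", user);
        entry.put("hl:itemName", title);
        entry.put("hl:version", version != null ? version : BASE_VERSION);
        return entry;
    }

    // Equivalent of "if (!node.hasNode(ACCOUNTING)) node.addNode(ACCOUNTING, ...)"
    static List<Map<String, Object>> accounting(Map<String, List<Map<String, Object>>> node) {
        return node.computeIfAbsent("hl:accounting", k -> new ArrayList<>());
    }
}
```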


@@ -1,16 +0,0 @@
package org.gcube.data.access.storagehub.exception;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.Response.Status;
public class MyAuthException extends WebApplicationException {
/**
*
*/
private static final long serialVersionUID = 1L;
public MyAuthException(Throwable cause) {
super(cause, Status.FORBIDDEN);
}
}


@@ -0,0 +1,5 @@
package org.gcube.data.access.storagehub.handlers;
public class ACLHandler {
}


@@ -1,11 +1,14 @@
 package org.gcube.data.access.storagehub.handlers;

+import java.util.Arrays;
 import java.util.HashMap;
+import java.util.List;
 import java.util.Map;
 import java.util.Set;

 import org.gcube.common.storagehub.model.annotations.RootNode;
 import org.gcube.common.storagehub.model.items.Item;
+import org.gcube.common.storagehub.model.items.RootItem;
 import org.reflections.Reflections;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -24,31 +27,38 @@ public class ClassHandler {
 	private Reflections reflection = new Reflections();

-	private Map<String, Class<? extends Item>> classMap = new HashMap<String, Class<? extends Item>>();
-	private Map<Class<? extends Item>, String> typeMap = new HashMap<Class<? extends Item>, String>();
+	private List<String> deprecatedNode = Arrays.asList("nthl:query", "nthl:aquamapsItem", "nthl:timeSeriesItem", "nthl:report", "nthl:reportTemplate", "nthl:workflowReport",
+			"nthl:workflowTemplate", "nthl:gCubeMetadata", "nthl:gCubeDocument", "nthl:gCubeDocumentLink", "nthl:gCubeImageDocumentLink", "nthl:gCubePDFDocumentLink",
+			"nthl:gCubeImageDocument", "nthl:gCubePDFDocument", "nthl:gCubeURLDocument", "nthl:gCubeAnnotation", "nthl:externalResourceLink", "nthl:tabularDataLink");
+
+	private Map<String, Class<? extends RootItem>> classMap = new HashMap<String, Class<? extends RootItem>>();
+	private Map<Class<? extends RootItem>, String> typeMap = new HashMap<Class<? extends RootItem>, String>();

+	@SuppressWarnings("unchecked")
 	private ClassHandler() {
 		Set<Class<?>> classesAnnotated = reflection.getTypesAnnotatedWith(RootNode.class);
 		for (Class<?> clazz: classesAnnotated ){
-			if (Item.class.isAssignableFrom(clazz)) {
-				String value = clazz.getAnnotation(RootNode.class).value();
-				log.debug("loading class {} with value {} ", clazz, value );
-				classMap.put(value, (Class<? extends Item>) clazz);
-				typeMap.put((Class<? extends Item>) clazz, value);
+			if (RootItem.class.isAssignableFrom(clazz) && clazz.isAnnotationPresent(RootNode.class)) {
+				String[] values = clazz.getAnnotation(RootNode.class).value();
+				log.debug("loading class {} with values {} ", clazz, values );
+				for (String value: values)
+					classMap.put(value, (Class<? extends RootItem>) clazz);
+				typeMap.put((Class<? extends RootItem>) clazz, values[0]);
 			}
 		}
 	}

-	public Class<? extends Item> get(String nodeType){
+	public Class<? extends RootItem> get(String nodeType){
 		if (classMap.containsKey(nodeType)) return classMap.get(nodeType);
-		else return Item.class;
+		if (deprecatedNode.contains(nodeType)) return Item.class;
+		return null;
 		//throw new RuntimeException("mapping not found for nodetype "+ nodeType);
 	}

-	public String getNodeType(Class<? extends Item> clazz){
+	public String getNodeType(Class<? extends RootItem> clazz){
 		if (typeMap.containsKey(clazz)) return typeMap.get(clazz);
 		throw new RuntimeException("mapping not found for nodetype "+ clazz.getSimpleName());
 	}
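Reviewer note: the key change in ClassHandler is that the `@RootNode` annotation now carries an array of node-type values, every value resolves to the class, but the reverse map keeps only the first (primary) value. A self-contained sketch of that registry pattern, with a stand-in annotation instead of the real gCube `@RootNode`:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.HashMap;
import java.util.Map;

// Multi-alias type registry: several node-type names map to one class,
// while the class maps back to its primary (first) name only.
public class RegistrySketch {
    @Retention(RetentionPolicy.RUNTIME)
    @interface NodeTypes { String[] value(); }   // stand-in for @RootNode

    @NodeTypes({"nthl:externalFile", "nthl:legacyFile"})
    static class FileItem {}

    static final Map<String, Class<?>> classMap = new HashMap<>();
    static final Map<Class<?>, String> typeMap = new HashMap<>();

    static void register(Class<?> clazz) {
        String[] values = clazz.getAnnotation(NodeTypes.class).value();
        for (String value : values)
            classMap.put(value, clazz);   // every alias resolves to the class
        typeMap.put(clazz, values[0]);    // reverse lookup keeps the primary type
    }
}
```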


@@ -0,0 +1,116 @@
package org.gcube.data.access.storagehub.handlers;
import java.io.BufferedInputStream;
import java.io.InputStream;
import java.util.Deque;
import java.util.LinkedList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import jakarta.inject.Inject;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import javax.jcr.version.Version;
import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.Path;
import org.gcube.common.storagehub.model.Paths;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.items.AbstractFileItem;
import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.storages.StorageBackend;
import org.gcube.common.storagehub.model.storages.StorageBackendFactory;
import org.gcube.data.access.storagehub.Utils;
import org.gcube.data.access.storagehub.accounting.AccountingHandler;
import org.gcube.data.access.storagehub.handlers.plugins.StorageBackendHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class CompressHandler {
private Logger logger = LoggerFactory.getLogger(CompressHandler.class);
@Inject
StorageBackendHandler storageBackendHandler;
@Inject
VersionHandler versionHandler;
public Deque<Item> getAllNodesForZip(FolderItem directory, String login, Session session, AccountingHandler accountingHandler, List<String> excludes) throws RepositoryException, BackendGenericError{
Deque<Item> queue = new LinkedList<Item>();
Node currentNode = session.getNodeByIdentifier(directory.getId());
queue.push(directory);
Deque<Item> tempQueue = new LinkedList<Item>();
logger.trace("adding directory {}",currentNode.getPath());
for (Item item : Utils.getItemList(currentNode,Excludes.GET_ONLY_CONTENT, null, false, null)){
if (excludes.contains(item.getId())) continue;
if (item instanceof FolderItem)
tempQueue.addAll(getAllNodesForZip((FolderItem) item, login, session, accountingHandler, excludes));
else if (item instanceof AbstractFileItem fileItem){
logger.trace("adding file {}",item.getPath());
String versionName = null;
try {
Version version = versionHandler.getCurrentVersion((Node) item.getRelatedNode());
versionName = version.getName();
}catch(RepositoryException e) {
logger.warn("current version of {} cannot be retreived", item.getId());
}
accountingHandler.createReadObj(fileItem.getTitle(), versionName, session, (Node) item.getRelatedNode(), login, false);
queue.addLast(item);
}
}
queue.addAll(tempQueue);
return queue;
}
public void zipNode(ZipOutputStream zos, Deque<Item> queue, Path originalPath) throws Exception{
logger.trace("originalPath is {}",originalPath.toPath());
Path actualPath = Paths.getPath("");
while (!queue.isEmpty()) {
Item item = queue.pop();
if (item instanceof FolderItem) {
actualPath = Paths.getPath(item.getPath());
logger.trace("actualPath is {}",actualPath.toPath());
String name = Paths.remove(actualPath, originalPath).toPath().replaceFirst("/", "");
logger.trace("writing dir {}",name);
if (name.isEmpty()) continue;
try {
zos.putNextEntry(new ZipEntry(name));
}finally {
zos.closeEntry();
}
} else if (item instanceof AbstractFileItem fileItem){
try {
StorageBackendFactory sbf = storageBackendHandler.get(fileItem.getContent().getPayloadBackend());
StorageBackend sb = sbf.create(fileItem.getContent().getPayloadBackend());
InputStream streamToWrite = sb.download(fileItem.getContent());
if (streamToWrite == null){
logger.warn("discarding item {} ",item.getName());
continue;
}
try(BufferedInputStream is = new BufferedInputStream(streamToWrite)){
String name = (Paths.remove(actualPath, originalPath).toPath()+item.getName()).replaceFirst("/", "");
logger.trace("writing file {}",name);
zos.putNextEntry(new ZipEntry(name));
Utils.copyStream(is, zos);
}catch (Exception e) {
logger.warn("error writing item {}", item.getName(),e);
} finally{
zos.closeEntry();
}
zos.flush();
}catch (Throwable e) {
logger.warn("error reading content for item {}", item.getPath(),e);
}
}
}
zos.close();
}
}
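Reviewer note: `zipNode` above writes one zip entry per queued item and always closes the entry, even when writing fails. A reduced, self-contained sketch of that structure using only `java.util.zip` (the map-based input is a stand-in for the item queue; a `null` payload plays the role of a folder entry):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Map;
import java.util.zip.Deflater;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Stream a set of named payloads into a zip, mirroring zipNode's
// putNextEntry / write / closeEntry-in-finally pattern.
public class ZipSketch {
    public static byte[] zip(Map<String, byte[]> items) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            zos.setLevel(Deflater.BEST_COMPRESSION); // as set in downloadFolderItem
            for (Map.Entry<String, byte[]> e : items.entrySet()) {
                try {
                    zos.putNextEntry(new ZipEntry(e.getKey()));
                    if (e.getValue() != null)        // null payload = directory-style entry
                        new ByteArrayInputStream(e.getValue()).transferTo(zos);
                } finally {
                    zos.closeEntry();                // entry is closed even on failure
                }
            }
        }
        return bos.toByteArray();
    }
}
```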


@@ -1,18 +0,0 @@
package org.gcube.data.access.storagehub.handlers;
import javax.jcr.SimpleCredentials;
import javax.servlet.ServletContext;
import org.gcube.data.access.storagehub.Constants;
public class CredentialHandler {
private static SimpleCredentials credentials;
public static SimpleCredentials getAdminCredentials(ServletContext context) {
if (credentials==null)
credentials = new SimpleCredentials(context.getInitParameter(Constants.ADMIN_PARAM_NAME),context.getInitParameter(Constants.ADMIN_PARAM_PWD).toCharArray());
return credentials;
}
}


@@ -0,0 +1,150 @@
package org.gcube.data.access.storagehub.handlers;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import org.apache.jackrabbit.api.JackrabbitSession;
import org.gcube.common.storagehub.model.acls.ACL;
import org.gcube.common.storagehub.model.acls.AccessType;
import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.common.storagehub.model.exporter.DumpData;
import org.gcube.common.storagehub.model.exporter.GroupData;
import org.gcube.common.storagehub.model.exporter.UserData;
import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.types.SHUBUser;
import org.gcube.data.access.storagehub.PathUtil;
import org.gcube.data.access.storagehub.Utils;
import org.gcube.data.access.storagehub.handlers.vres.VREManager;
import org.gcube.data.access.storagehub.services.delegates.ACLManagerDelegate;
import org.gcube.data.access.storagehub.services.delegates.GroupManagerDelegate;
import org.gcube.data.access.storagehub.services.delegates.UserManagerDelegate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
@Singleton
public class DataHandler {
public static final Logger logger = LoggerFactory.getLogger(DataHandler.class);
@Inject
UserManagerDelegate userHandler;
@Inject
GroupManagerDelegate groupHandler;
@Inject
ACLManagerDelegate aclHandler;
@Inject
PathUtil pathUtil;
@Inject
VREManager vreManager;
public DumpData exportData(JackrabbitSession session) throws RepositoryException,StorageHubException {
DumpData data = new DumpData();
List<SHUBUser> users = userHandler.getAllUsers(session);
List<UserData> usersData = users.stream().map(u -> new UserData(u.getUserName()))
.collect(Collectors.toList());
List<String> groups = groupHandler.getGroups(session);
logger.debug("found users {} ",usersData);
logger.debug("found groups {} ",groups);
List<GroupData> groupsData = new ArrayList<GroupData>(groups.size());
for (String group : groups) {
logger.debug("searching for group {}",group);
Item vreFolderItem = vreManager.getVreFolderItemByGroupName(session, group, null).getVreFolder();
String owner = vreFolderItem.getOwner();
List<ACL> acls = aclHandler.getByItem(vreFolderItem, session);
AccessType accessType = AccessType.READ_ONLY;
List<ACL> otherAccess = new ArrayList<ACL>(acls.size() - 1);
for (ACL acl : acls)
if (acl.getPrincipal().equals(group))
accessType = acl.getAccessTypes().get(0);
else
otherAccess.add(acl);
List<String> members = groupHandler.getUsersBelongingToGroup(session, group);
groupsData.add(new GroupData(group, owner, members, accessType, otherAccess));
}
Map<String, List<Item>> itemsPerUser = new HashMap<String, List<Item>>();
for (SHUBUser user : users ) {
logger.debug("getting all the files in {} workspace folder ",user.getUserName());
String homePath = pathUtil.getWorkspacePath(user.getUserName()).toPath();
Node homeNode = session.getNode(homePath);
List<Item> items = Utils.getItemList(homeNode, Collections.emptyList(), null, true, null);
items.forEach(i -> i.setParentId(null));
itemsPerUser.put(user.getUserName(), retrieveSubItems(items));
}
data.setUsers(usersData);
data.setGroups(groupsData);
data.setItemPerUser(itemsPerUser);
return data;
}
List<Item> retrieveSubItems(List<Item> items) throws StorageHubException, RepositoryException {
List<Item> toReturn = new ArrayList<Item>();
for (Item item: items )
if (item instanceof FolderItem f) {
if (f.isShared()) continue;
toReturn.addAll(retrieveSubItems(Utils.getItemList((Node) f.getRelatedNode(), Collections.emptyList(), null, true, null)));
} else
toReturn.add(item);
return toReturn;
}
public void importData(JackrabbitSession session, DumpData data) {
data.getUsers().forEach(u -> {
try {
userHandler.createUser(session, u.getUserName(), "pwd");
} catch (RepositoryException | StorageHubException e) {
logger.warn("error importing user {} ", u.getUserName(), e);
}
});
data.getGroups().forEach(g -> {
try {
groupHandler.createGroup(session, g.getName(), g.getAccessType(), g.getFolderOwner(), true);
for (String member : g.getMembers())
try {
groupHandler.addUserToGroup(session, member, g.getName());
}catch(Throwable t ) {
logger.warn("error adding user {} to group {}", member, g.getName(), t);
}
Item vreFolderItem = vreManager.getVreFolderItemByGroupName(session, g.getName(), null).getVreFolder();
for (ACL acl : g.getAcls()) {
aclHandler.update(acl.getPrincipal(), (Node)vreFolderItem.getRelatedNode(), acl.getAccessTypes().get(0), session);
}
} catch (Throwable e) {
logger.warn("error importing group {} ", g.getName(), e);
}
});
}
}
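Reviewer note: in `exportData`, the ACL list of each VRE folder is split in one pass: the entry whose principal is the group itself becomes the folder's `accessType` (defaulting to `READ_ONLY`), and all other entries go to `otherAccess`. A tiny sketch of that split with strings in place of the gCube ACL types (names here are illustrative only):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Split a principal->access map: the group's own grant becomes the access
// type, the remaining principals are collected separately.
public class AclSplitSketch {
    public static Map.Entry<String, List<String>> split(String group, Map<String, String> acls) {
        String accessType = "READ_ONLY";            // default, as in exportData
        List<String> otherPrincipals = new ArrayList<>();
        for (Map.Entry<String, String> acl : acls.entrySet())
            if (acl.getKey().equals(group))
                accessType = acl.getValue();
            else
                otherPrincipals.add(acl.getKey());
        return Map.entry(accessType, otherPrincipals);
    }
}
```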


@@ -0,0 +1,212 @@
package org.gcube.data.access.storagehub.handlers;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Deque;
import java.util.List;
import java.util.Map;
import java.util.zip.Deflater;
import java.util.zip.ZipOutputStream;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import javax.jcr.version.Version;
import org.apache.commons.io.FilenameUtils;
import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.Paths;
import org.gcube.common.storagehub.model.exceptions.InvalidItemException;
import org.gcube.common.storagehub.model.exceptions.PluginInitializationException;
import org.gcube.common.storagehub.model.exceptions.PluginNotFoundException;
import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.common.storagehub.model.exceptions.StorageIdNotFoundException;
import org.gcube.common.storagehub.model.items.AbstractFileItem;
import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.items.nodes.Content;
import org.gcube.common.storagehub.model.items.nodes.PayloadBackend;
import org.gcube.common.storagehub.model.storages.StorageBackend;
import org.gcube.common.storagehub.model.storages.StorageBackendFactory;
import org.gcube.common.storagehub.model.storages.StorageNames;
import org.gcube.data.access.storagehub.SingleFileStreamingOutput;
import org.gcube.data.access.storagehub.accounting.AccountingHandler;
import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
import org.gcube.data.access.storagehub.handlers.plugins.StorageBackendHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import jakarta.ws.rs.core.Response;
import jakarta.ws.rs.core.StreamingOutput;
@Singleton
public class DownloadHandler {
private static final Logger log = LoggerFactory.getLogger(DownloadHandler.class);
@Inject
private AccountingHandler accountingHandler;
@Inject
private StorageBackendHandler storageBackendHandler;
@Inject
private CompressHandler compressHandler;
@Inject
private VersionHandler versionHandler;
@Inject
private Node2ItemConverter node2Item;
public Response downloadFolderItem(Session ses, String login, FolderItem item, boolean withAccounting ) throws StorageHubException, RepositoryException {
try {
final Deque<Item> allNodes = compressHandler.getAllNodesForZip((FolderItem)item, login, ses, accountingHandler, Excludes.GET_ONLY_CONTENT);
final org.gcube.common.storagehub.model.Path originalPath = Paths.getPath(item.getParentPath());
StreamingOutput so = new StreamingOutput() {
@Override
public void write(OutputStream os) {
try(ZipOutputStream zos = new ZipOutputStream(os)){
long start = System.currentTimeMillis();
zos.setLevel(Deflater.BEST_COMPRESSION);
log.debug("writing StreamOutput");
compressHandler.zipNode(zos, allNodes, originalPath);
log.debug("StreamOutput written in {}",(System.currentTimeMillis()-start));
} catch (Exception e) {
log.error("error writing stream",e);
}
}
};
Response response = Response
.ok(so)
.header("content-disposition","attachment; filename = "+item.getTitle()+".zip")
.header("Content-Type", "application/zip")
.header("Content-Length", -1l)
.build();
if (withAccounting)
accountingHandler.createReadObj(item.getTitle(), null, ses, (Node) item.getRelatedNode(), login, false);
return response;
}finally {
if (ses!=null) ses.save();
}
}
public Response downloadFileItem(Session ses, AbstractFileItem fileItem, String login, boolean withAccounting) throws RepositoryException, PluginInitializationException, PluginNotFoundException, StorageHubException {
Content content = fileItem.getContent();
StorageBackendFactory sbf = storageBackendHandler.get(content.getPayloadBackend());
StorageBackend sb = sbf.create(content.getPayloadBackend());
InputStream streamToWrite = sb.download(content);
if (withAccounting) {
String versionName = null;
try {
Version version = versionHandler.getCurrentVersion((Node) fileItem.getRelatedNode());
versionName = version.getName();
}catch(RepositoryException e) {
log.warn("current version of {} cannot be retreived", fileItem.getId());
}
accountingHandler.createReadObj(fileItem.getTitle(), versionName, ses, (Node) fileItem.getRelatedNode(), login, true);
}
StreamingOutput so = new SingleFileStreamingOutput(streamToWrite);
return Response
.ok(so)
.header("content-disposition","attachment; filename = "+fileItem.getName())
.header("Content-Length", fileItem.getContent().getSize())
.header("Content-Type", fileItem.getContent().getMimeType())
.build();
}
public Response downloadVersionedItem(Session ses, String login, AbstractFileItem currentItem, String versionName, boolean withAccounting) throws RepositoryException, StorageHubException{
List<Version> jcrVersions = versionHandler.getContentVersionHistory((Node)currentItem.getRelatedNode());
for (Version version: jcrVersions) {
log.debug("retrieved version id {}, name {}", version.getIdentifier(), version.getName());
if (version.getName().equals(versionName)) {
Content content = node2Item.getContentFromVersion(version);
StorageBackendFactory sbf = storageBackendHandler.get(content.getPayloadBackend());
StorageBackend sb = sbf.create(content.getPayloadBackend());
InputStream streamToWrite = null;
try {
streamToWrite = sb.download(content);
}catch (StorageIdNotFoundException e) {
//TODO: temporary code, it will last until the MINIO porting will not finish
if (sbf.getName().equals(StorageNames.MONGO_STORAGE)) {
sbf = storageBackendHandler.get(StorageNames.DEFAULT_S3_STORAGE);
sbf.create(new PayloadBackend(StorageNames.DEFAULT_S3_STORAGE, null));
} else
throw e;
}
log.debug("retrieved storage id is {} with storageBackend {} (stream is null? {})",content.getStorageId(), sbf.getName(), streamToWrite==null );
String oldfilename = FilenameUtils.getBaseName(currentItem.getTitle());
String ext = FilenameUtils.getExtension(currentItem.getTitle());
String fileName = String.format("%s_v%s.%s", oldfilename, version.getName(), ext);
if (withAccounting)
accountingHandler.createReadObj(currentItem.getTitle(), versionName, ses, (Node) currentItem.getRelatedNode(), login, true);
StreamingOutput so = new SingleFileStreamingOutput(streamToWrite);
return Response
.ok(so)
.header("content-disposition","attachment; filename = "+fileName)
.header("Content-Length", content.getSize())
.header("Content-Type", content.getMimeType())
.build();
}
}
throw new InvalidItemException("the version is not valid");
}
public Response downloadFileFromStorageBackend(String storageId, String storageName) throws RepositoryException, PluginInitializationException, PluginNotFoundException, StorageHubException {
StorageBackendFactory sbf = storageBackendHandler.get(storageName);
StorageBackend sb = sbf.create(new PayloadBackend(storageName, null));
InputStream streamToWrite = sb.download(storageId);
Map<String, String> userMetadata = sb.getFileMetadata(storageId);
log.info("returned metadata from storageBackend are: {}", userMetadata);
long size = Long.parseLong(userMetadata.get("size"));
String title = userMetadata.get("title");
String contentType = userMetadata.get("content-type");
StreamingOutput so = new SingleFileStreamingOutput(streamToWrite);
return Response
.ok(so)
.header("content-disposition","attachment; filename = "+title)
.header("Content-Length", size)
.header("Content-Type", contentType)
.build();
}
}


@ -1,333 +0,0 @@
package org.gcube.data.access.storagehub.handlers;
import java.lang.reflect.Field;
import java.net.URI;
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import javax.inject.Singleton;
import javax.jcr.Node;
import javax.jcr.NodeIterator;
import javax.jcr.PathNotFoundException;
import javax.jcr.Property;
import javax.jcr.PropertyIterator;
import javax.jcr.PropertyType;
import javax.jcr.RepositoryException;
import javax.jcr.Value;
import org.apache.commons.io.IOUtils;
import org.apache.jackrabbit.util.Text;
import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.annotations.Attribute;
import org.gcube.common.storagehub.model.annotations.AttributeRootNode;
import org.gcube.common.storagehub.model.annotations.ListNodes;
import org.gcube.common.storagehub.model.annotations.MapAttribute;
import org.gcube.common.storagehub.model.annotations.NodeAttribute;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.items.SharedFolder;
import org.gcube.common.storagehub.model.items.TrashItem;
import org.gcube.data.access.storagehub.Utils;
import org.reflections.Configuration;
import org.reflections.Reflections;
import org.reflections.util.ConfigurationBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Singleton
public class Node2ItemConverter {
private static final Logger logger = LoggerFactory.getLogger(Node2ItemConverter.class);
private static HashMap<Class, Map<String, Class>> typeToSubtypeMap = new HashMap<>();
public <T extends Item> T getFilteredItem(Node node, List<String> excludes, Class<? extends Item> nodeTypeToInclude) throws RepositoryException, BackendGenericError{
@SuppressWarnings("unchecked")
Class<T> classToHandle = (Class<T>)ClassHandler.instance().get(node.getPrimaryNodeType().getName());
if (nodeTypeToInclude!=null && !(nodeTypeToInclude.isAssignableFrom(classToHandle))) return null;
else return retrieveItem(node, excludes, classToHandle);
}
public <T extends Item> T getItem(Node node, List<String> excludes) throws RepositoryException, BackendGenericError{
@SuppressWarnings("unchecked")
Class<T> classToHandle = (Class<T>)ClassHandler.instance().get(node.getPrimaryNodeType().getName());
/*Node nodeToRetrieve= node;
if (SharedFolder.class.isAssignableFrom(classToHandle)) {
NodeIterator it= node.getSharedSet();
while (it.hasNext()) {
Node sharedNode = it.nextNode();
if (sharedNode.getPath().startsWith(Utils.getWorkspacePath().toPath())) {
nodeToRetrieve = sharedNode;
}
}
}*/
return retrieveItem(node, excludes, classToHandle);
}
private <T extends Item> T retrieveItem(Node node, List<String> excludes, Class<T> classToHandle) throws RepositoryException, BackendGenericError{
T item;
try {
item = classToHandle.newInstance();
}catch (Exception e) {
throw new BackendGenericError(e);
}
item.setId(node.getIdentifier());
item.setName(Text.unescapeIllegalJcrChars(node.getName()));
item.setPath(Text.unescapeIllegalJcrChars(node.getPath()));
item.setLocked(node.isLocked());
item.setPrimaryType(node.getPrimaryNodeType().getName());
Item parent = null ;
if (item instanceof SharedFolder) {
logger.trace("I'm in a Shared Folder");
item.setShared(true);
}else {
try {
parent = getItem(node.getParent(), Excludes.ALL);
item.setShared(parent.isShared());
} catch(Exception e) {
item.setShared(false);
}
}
if (item instanceof TrashItem)
item.setTrashed(true);
else {
try {
if (parent==null)
parent = getItem(node.getParent(), Excludes.ALL);
item.setTrashed(parent.isTrashed());
} catch(Exception e) {
item.setTrashed(false);
}
}
try{
item.setParentId(node.getParent().getIdentifier());
item.setParentPath(node.getParent().getPath());
}catch (Throwable e) {
logger.trace("Root node doesn't have a parent");
}
for (Field field : retrieveAllFields(classToHandle)){
if (field.isAnnotationPresent(Attribute.class)){
Attribute attribute = field.getAnnotation(Attribute.class);
field.setAccessible(true);
try{
Class<?> returnType = field.getType();
field.set(item, getPropertyValue(returnType, node.getProperty(attribute.value())));
logger.debug("retrieve item - added field {}",field.getName());
}catch(PathNotFoundException e){
logger.trace("the current node doesn't contain {} property",attribute.value());
} catch (Exception e ) {
logger.warn("error setting value for property {} ",attribute.value());
}
} else if (field.isAnnotationPresent(NodeAttribute.class)){
String fieldNodeName = field.getAnnotation(NodeAttribute.class).value();
//for now it excludes only first level node
if (excludes!=null && excludes.contains(fieldNodeName)) continue;
logger.trace("retrieving field node "+field.getName());
field.setAccessible(true);
try{
Node fieldNode = node.getNode(fieldNodeName);
logger.trace("looking in node {} searched with {}",fieldNode.getName(),fieldNodeName);
field.set(item, iterateNodeAttributeFields(field.getType(), fieldNode));
}catch(PathNotFoundException e){
logger.trace("the current node doesn't contain {} node",fieldNodeName);
} catch (Exception e ) {
logger.warn("error setting value",e);
}
}
}
return item;
}
private <T> T iterateNodeAttributeFields(Class<T> clazz, Node node) throws Exception{
T obj = clazz.newInstance();
for (Field field : retrieveAllFields(clazz)){
if (field.isAnnotationPresent(Attribute.class)){
Attribute attribute = field.getAnnotation(Attribute.class);
field.setAccessible(true);
try{
@SuppressWarnings("rawtypes")
Class returnType = field.getType();
field.set(obj, getPropertyValue(returnType, node.getProperty(attribute.value())));
}catch(PathNotFoundException e){
logger.trace("the current node doesn't contain {} property",attribute.value());
} catch (Exception e ) {
logger.warn("error setting value {}",e.getMessage());
}
} else if (field.isAnnotationPresent(MapAttribute.class)){
logger.trace("found field {} of type annotated as MapAttribute in class {} and node name {}", field.getName(), clazz.getName(), node.getName());
field.setAccessible(true);
String exclude = field.getAnnotation(MapAttribute.class).excludeStartWith();
Map<String, Object> mapToset = new HashMap<String, Object>();
PropertyIterator iterator = node.getProperties();
if (iterator!=null) {
while (iterator.hasNext()){
Property prop = iterator.nextProperty();
if (!exclude.isEmpty() && prop.getName().startsWith(exclude)) continue;
try{
logger.trace("adding {} in the map",prop.getName());
mapToset.put(prop.getName(), getPropertyValue(prop));
}catch(PathNotFoundException e){
logger.warn("the property {} is not mapped",prop.getName());
} catch (Exception e ) {
logger.warn("error setting value {}",e.getMessage());
}
}
}
field.set(obj, mapToset);
} else if (field.isAnnotationPresent(ListNodes.class)){
logger.trace("found field {} of type annotated as ListNodes in class {} on node {}", field.getName(), clazz.getName(), node.getName());
field.setAccessible(true);
String exclude = field.getAnnotation(ListNodes.class).excludeTypeStartWith();
String include = field.getAnnotation(ListNodes.class).includeTypeStartWith();
Class listType = field.getAnnotation(ListNodes.class).listClass();
Map<String, Class> subTypesMap = Collections.emptyMap();
if (!typeToSubtypeMap.containsKey(listType)) {
Configuration config = new ConfigurationBuilder().forPackages(listType.getPackage().getName());
Reflections reflections = new Reflections(config);
Set<Class> subTypes = reflections.getSubTypesOf(listType);
if (subTypes.size()>0) {
subTypesMap = new HashMap<>();
for (Class subtype: subTypes)
if (subtype.isAnnotationPresent(AttributeRootNode.class)) {
AttributeRootNode attributeRootNode = (AttributeRootNode)subtype.getAnnotation(AttributeRootNode.class);
subTypesMap.put(attributeRootNode.value(), subtype);
}
} else logger.trace("no subtypes found for {}",listType.getName());
typeToSubtypeMap.put(listType, subTypesMap);
} else {
logger.info("subtypes already found in cache");
subTypesMap = typeToSubtypeMap.get(listType);
}
List<Object> toSetList = new ArrayList<>();
NodeIterator iterator = node.getNodes();
while (iterator.hasNext()){
Node currentNode = iterator.nextNode();
String primaryType = currentNode.getPrimaryNodeType().getName();
logger.trace("the current node {} has a list",currentNode.getName());
if (!include.isEmpty() && !primaryType.startsWith(include))
continue;
if (!exclude.isEmpty() && primaryType.startsWith(exclude))
continue;
if (subTypesMap.containsKey(primaryType))
toSetList.add(iterateNodeAttributeFields(subTypesMap.get(primaryType), currentNode));
else toSetList.add(iterateNodeAttributeFields(listType, currentNode));
}
if (toSetList.size()!=0) field.set(obj, toSetList);
}
}
return obj;
}
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPropertyValue(Class returnType, Property prop) throws Exception{
if (returnType.equals(String.class)) return prop.getString();
if (returnType.isEnum()) return Enum.valueOf(returnType, prop.getString());
if (returnType.equals(Calendar.class)) return prop.getDate();
if (returnType.equals(URL.class)) return new URI(prop.getString()).toURL();
if (returnType.equals(Boolean.class) || returnType.equals(boolean.class)) return prop.getBoolean();
if (returnType.equals(Long.class) || returnType.equals(long.class)) return prop.getLong();
if (returnType.equals(Integer.class) || returnType.equals(int.class)) return (int) prop.getLong();
if (returnType.isArray()) {
if (prop.getType()==PropertyType.BINARY) {
byte[] bytes = IOUtils.toByteArray(prop.getBinary().getStream());
return bytes;
} else {
Object[] ret= getArrayValue(prop);
return Arrays.copyOf(ret, ret.length, returnType);
}
}
throw new Exception(String.format("class %s not recognized",returnType.getName()));
}
private static Set<Field> retrieveAllFields(Class<?> clazz){
Set<Field> fields = new HashSet<Field>();
Class<?> currentClass = clazz;
do{
List<Field> fieldsFound = Arrays.asList(currentClass.getDeclaredFields());
fields.addAll(fieldsFound);
}while ((currentClass =currentClass.getSuperclass())!=null);
return fields;
}
private Object getPropertyValue(Property prop) throws Exception{
if (prop.isMultiple()){
Object[] values = new Object[prop.getValues().length];
int i = 0;
for (Value value : prop.getValues())
values[i++] = getSingleValue(value);
return values;
} else
return getSingleValue(prop.getValue());
}
private Object getSingleValue(Value value) throws Exception{
switch (value.getType()) {
case PropertyType.DATE:
return value.getDate();
case PropertyType.BOOLEAN:
return value.getBoolean();
case PropertyType.LONG:
return value.getLong();
default:
return value.getString();
}
}
private Object[] getArrayValue(Property prop) throws Exception{
Object[] values = new Object[prop.getValues().length];
int i = 0;
for (Value value : prop.getValues())
values[i++] = getSingleValue(value);
return values;
}
public boolean checkNodeType(Node node, Class<? extends Item> classToCompare) throws BackendGenericError{
try {
logger.info("class from nodetype is {} and class to compare is {}",ClassHandler.instance().get(node.getPrimaryNodeType().getName()), classToCompare);
return classToCompare.isAssignableFrom(ClassHandler.instance().get(node.getPrimaryNodeType().getName()));
//(node.isNodeType(ClassHandler.instance().getNodeType(classToCompare)));
}catch (Throwable e) {
throw new BackendGenericError(e);
}
}
}


@ -0,0 +1,143 @@
package org.gcube.data.access.storagehub.handlers;
import static org.gcube.common.storagehub.model.Constants.enchriptedPrefix;
import static org.gcube.common.storagehub.model.Constants.enchriptedVolatile;
import static org.gcube.common.storagehub.model.Constants.versionPrefix;
import java.util.Base64;
import jakarta.inject.Singleton;
import jakarta.servlet.ServletContext;
import org.gcube.common.encryption.encrypter.StringEncrypter;
import org.gcube.common.security.AuthorizedTasks;
import org.gcube.common.security.secrets.Secret;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.data.access.storagehub.types.LinkType;
import org.gcube.data.access.storagehub.types.PublicLink;
import org.gcube.smartgears.ContextProvider;
@Singleton
public class PublicLinkHandler {
public String getForItem(String itemId, ServletContext context) throws BackendGenericError{
return getUrl(itemId, enchriptedPrefix, context);
}
public String getForVersionedItem(String itemId, String version, ServletContext context) throws BackendGenericError {
return getUrl(String.format("%s%s%s",itemId, versionPrefix, version), enchriptedPrefix, context);
}
public String getForVolatile(String fileId, String storageName, ServletContext context) throws BackendGenericError {
return getUrl(String.format("%s_%s",fileId, storageName), enchriptedVolatile, context);
}
public PublicLink resolveEnchriptedId(String enchriptedId) throws StorageHubException {
String complexId = enchriptedId;
boolean isVolatile = false;
if (enchriptedId.startsWith(enchriptedPrefix) || enchriptedId.startsWith(enchriptedVolatile) ) {
final String enchriptedValue = enchriptedId.startsWith(enchriptedPrefix) ? enchriptedPrefix : enchriptedVolatile;
isVolatile = enchriptedId.startsWith(enchriptedVolatile);
try {
String infraContext = String.format("/%s", ContextProvider.get().container().configuration().infrastructure());
Secret infraSecret = ContextProvider.get().container().authorizationProvider().getSecretForContext(infraContext);
complexId = AuthorizedTasks.executeSafely(() -> {
return StringEncrypter.getEncrypter().decrypt(
new String(Base64.getUrlDecoder().decode(enchriptedId.replace(enchriptedValue, ""))));
}, infraSecret);
}catch(Throwable e){
throw new BackendGenericError("invalid public url",e);
}
}
if (isVolatile) {
String[] volatileIdSplit = complexId.split("_");
return new VolatilePublicLink(volatileIdSplit[0], volatileIdSplit[1]);
}else {
if (complexId.contains(versionPrefix)) {
String[] split = complexId.split(versionPrefix);
String itemId = split[0];
String versionName = split[1];
return new ItemPublicLink(itemId, versionName);
} else
return new ItemPublicLink(complexId);
}
}
private String getUrl(String toEnchript, String prefix, ServletContext context) throws BackendGenericError{
String infraContext = String.format("/%s", ContextProvider.get().container().configuration().infrastructure());
Secret infraSecret = ContextProvider.get().container().authorizationProvider().getSecretForContext(infraContext);
try {
String enchriptedQueryString = AuthorizedTasks.executeSafely(
() -> {return StringEncrypter.getEncrypter().encrypt(toEnchript);},infraSecret);
String basepath = context.getInitParameter("resolver-basepath");
String filePublicUrl = String.format("%s/%s%s",basepath, prefix, Base64.getUrlEncoder().encodeToString(enchriptedQueryString.getBytes()));
return filePublicUrl;
}catch (Throwable e) {
throw new BackendGenericError("error encrypting item id",e );
}
}
public static class VolatilePublicLink implements PublicLink {
private String storageKey;
private String storageName;
protected VolatilePublicLink(String storageKey, String storageName){
this.storageKey = storageKey;
this.storageName = storageName;
}
@Override
public LinkType getType() {return LinkType.VOLATILE;}
@Override
public String getId() { return storageKey; }
@Override
public String getStorageName() { return storageName; }
}
public static class ItemPublicLink implements PublicLink {
private String itemId;
private String version;
private LinkType type;
protected ItemPublicLink(String itemId){
this.itemId = itemId;
this.type = LinkType.STANDARD;
}
protected ItemPublicLink(String itemId, String version){
this.itemId = itemId;
this.version = version;
this.type = LinkType.VERSIONED;
}
@Override
public LinkType getType() {return type;}
@Override
public String getId() { return itemId; }
@Override
public String getVersion() { return version; }
}
}


@ -1,51 +0,0 @@
package org.gcube.data.access.storagehub.handlers;
import java.io.InputStream;
import javax.inject.Inject;
import javax.inject.Singleton;
import org.gcube.common.storagehub.model.items.AbstractFileItem;
import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.storages.MetaInfo;
import org.gcube.data.access.storagehub.storage.backend.impl.GCubeStorageBackend;
@Singleton
public class StorageBackendHandler {
@Inject
private GCubeStorageBackend defaultBackend;
public String move(Item item, FolderItem destination) {
//if item is a folder we have to move everything
return defaultBackend.move(((AbstractFileItem) item).getContent().getStorageId());
}
public String copy(AbstractFileItem item) {
return defaultBackend.copy(item.getContent().getStorageId(), item.getContent().getRemotePath());
}
public MetaInfo upload(InputStream stream, String itemPath) {
return defaultBackend.upload(stream, itemPath);
}
public InputStream download(String id) {
return defaultBackend.getContent(id);
}
public void delete(String id) {
defaultBackend.delete(id);
}
public String getTotalVolume() {
return defaultBackend.getTotalSizeStored();
}
public String getTotalItemsCount() {
return defaultBackend.getTotalItemsCount();
}
}


@ -1,35 +1,46 @@
package org.gcube.data.access.storagehub.handlers; package org.gcube.data.access.storagehub.handlers;
import java.util.Calendar; import java.util.Calendar;
import java.util.Collection;
import java.util.HashSet; import java.util.HashSet;
import java.util.List; import java.util.List;
import java.util.Set; import java.util.Set;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;
import javax.inject.Inject; import jakarta.inject.Inject;
import javax.inject.Singleton; import jakarta.inject.Singleton;
import javax.jcr.ItemNotFoundException;
import javax.jcr.Node; import javax.jcr.Node;
import javax.jcr.RepositoryException; import javax.jcr.RepositoryException;
import javax.jcr.Session; import javax.jcr.Session;
import javax.jcr.lock.LockException; import javax.jcr.lock.LockException;
import javax.jcr.version.Version;
import org.gcube.common.authorization.library.AuthorizedTasks; import org.gcube.common.security.AuthorizedTasks;
import org.gcube.common.authorization.library.provider.AuthorizationProvider;
import org.gcube.common.storagehub.model.Excludes; import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.Paths; import org.gcube.common.storagehub.model.Paths;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError; import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.exceptions.InvalidCallParameters;
import org.gcube.common.storagehub.model.exceptions.ItemLockedException; import org.gcube.common.storagehub.model.exceptions.ItemLockedException;
import org.gcube.common.storagehub.model.exceptions.StorageHubException; import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.common.storagehub.model.exceptions.UserNotAuthorizedException;
import org.gcube.common.storagehub.model.items.AbstractFileItem; import org.gcube.common.storagehub.model.items.AbstractFileItem;
import org.gcube.common.storagehub.model.items.FolderItem; import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.items.Item; import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.items.TrashItem; import org.gcube.common.storagehub.model.items.TrashItem;
import org.gcube.common.storagehub.model.items.nodes.Content;
import org.gcube.common.storagehub.model.storages.StorageBackend;
import org.gcube.common.storagehub.model.storages.StorageBackendFactory;
import org.gcube.common.storagehub.model.types.ItemAction; import org.gcube.common.storagehub.model.types.ItemAction;
import org.gcube.contentmanagement.blobstorage.service.IClient;
import org.gcube.data.access.storagehub.AuthorizationChecker; import org.gcube.data.access.storagehub.AuthorizationChecker;
import org.gcube.data.access.storagehub.Constants; import org.gcube.data.access.storagehub.PathUtil;
import org.gcube.data.access.storagehub.Utils; import org.gcube.data.access.storagehub.Utils;
import org.gcube.data.access.storagehub.accounting.AccountingHandler; import org.gcube.data.access.storagehub.accounting.AccountingHandler;
import org.gcube.data.access.storagehub.handlers.items.Item2NodeConverter;
import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
import org.gcube.data.access.storagehub.handlers.plugins.StorageBackendHandler;
import org.gcube.data.access.storagehub.types.ContentPair;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
@ -38,65 +49,131 @@ public class TrashHandler {
private static Logger log = LoggerFactory.getLogger(TrashHandler.class); private static Logger log = LoggerFactory.getLogger(TrashHandler.class);
@Inject ExecutorService executor = Executors.newFixedThreadPool(100);
VersionHandler versionHandler;
@Inject @Inject
AccountingHandler accountingHandler; VersionHandler versionHandler;
@Inject @Inject
AuthorizationChecker authChecker; AuthorizationChecker authChecker;
@Inject
AccountingHandler accountingHandler;
@Inject @Inject
Item2NodeConverter item2Node; Item2NodeConverter item2Node;
@Inject @Inject
StorageBackendHandler storageHandler; Node2ItemConverter node2Item;
@Inject
PathUtil pathUtil;
@Inject
StorageBackendHandler storageBackendHandler;
public void removeNodes(Session ses, List<Item> itemsToDelete) throws RepositoryException, StorageHubException{ public void removeNodes(Session ses, List<Item> itemsToDelete) throws RepositoryException, StorageHubException{
log.debug("defnitively removing nodes with ids {}",itemsToDelete); log.debug("defnitively removing nodes with ids {}",itemsToDelete);
for (Item item: itemsToDelete) { for (Item item: itemsToDelete) {
removeNodesInternally(ses, item); removeNodesInternally(ses, item, false);
} }
} }
private void removeNodesInternally(Session ses, Item itemToDelete) throws RepositoryException, StorageHubException { public void removeOnlyNodesContent(Session ses, List<Item> itemsToDelete) throws RepositoryException, StorageHubException{
log.debug("defnitively removing nodes with ids {}",itemsToDelete);
for (Item item: itemsToDelete) {
removeNodesInternally(ses, item, true);
}
}
private void retrieveItemsToDelete(Set<AbstractFileItem> itemsToDelete, Item itemToDelete) throws Exception{
if (itemToDelete instanceof AbstractFileItem) {
itemsToDelete.add(((AbstractFileItem) itemToDelete));
}else if (itemToDelete instanceof FolderItem) {
//only to be sure to not delete shared content
if (itemToDelete.isShared()) return;
List<Item> items = Utils.getItemList((Node) itemToDelete.getRelatedNode(), Excludes.GET_ONLY_CONTENT , null, true, null);
for (Item item: items)
retrieveItemsToDelete(itemsToDelete, item);
}
}
private Set<ContentPair> retrieveContentToDelete(Collection<AbstractFileItem> itemsToDelete) {
Set<ContentPair> contentSet = new HashSet<ContentPair>();
for (AbstractFileItem item: itemsToDelete) {
if (item.getContent()== null || item.getContent().getStorageId()==null) {
log.warn("item with id {} contains null content",item.getId());
continue;
}
try {
StorageBackendFactory sbf = storageBackendHandler.get(item.getContent().getPayloadBackend());
StorageBackend sb = sbf.create(item.getContent().getPayloadBackend());
contentSet.add(new ContentPair(item.getContent(), sb));
List<Version> versions = versionHandler.getContentVersionHistory((Node)item.getRelatedNode());
for (Version version: versions) {
try {
Content content = node2Item.getContentFromVersion(version);
if (content!= null && content.getStorageId()!=null)
contentSet.add(new ContentPair(content, sb));
else log.warn("invalid version {}",version.getName());
}catch (Throwable t) {
log.warn("error retrieving version content for {}",version.getName(),t);
}
}
}catch (Exception e) {
log.warn("item with id {} cannot be deleted",item.getId(),e);
}
}
return contentSet;
}
private void removeNodesInternally(Session ses, Item itemToDelete, boolean onlyContent) throws RepositoryException, StorageHubException {
try { try {
Set<String> contentIdsToDelete = new HashSet<>(); Set<AbstractFileItem> itemsToDelete = new HashSet<>();
Node nodeToDelete = (Node) itemToDelete.getRelatedNode();
Node nodeToDelete = ses.getNodeByIdentifier(itemToDelete.getId());
if (itemToDelete instanceof TrashItem) { if (itemToDelete instanceof TrashItem) {
List<Item> trashChildren = Utils.getItemList(nodeToDelete, Excludes.GET_ONLY_CONTENT, null, true, null); List<Item> trashChildren = Utils.getItemList(nodeToDelete, Excludes.GET_ONLY_CONTENT, null, true, null);
for (Item itemContentToRetrieve: trashChildren) for (Item itemContentToRetrieve: trashChildren)
Utils.getAllContentIds(ses, contentIdsToDelete, itemContentToRetrieve, versionHandler); retrieveItemsToDelete(itemsToDelete, itemContentToRetrieve);
} else { } else
Utils.getAllContentIds(ses, contentIdsToDelete, itemToDelete, versionHandler); retrieveItemsToDelete(itemsToDelete, itemToDelete);
}
nodeToDelete.remove();
log.debug("content ids to remove are {}",contentIdsToDelete); if (!onlyContent)
nodeToDelete.remove();
String ids = itemsToDelete.stream().map((i) -> i.getId()).collect(Collectors.joining(","));
log.debug("content ids to remove are {}",ids);
Set<ContentPair> contentToDelete = retrieveContentToDelete(itemsToDelete);
String user = AuthorizationProvider.instance.get().getClient().getId();
Runnable deleteFromStorageRunnable = AuthorizedTasks.bind(new Runnable() { Runnable deleteFromStorageRunnable = AuthorizedTasks.bind(new Runnable() {
@Override @Override
public void run() { public void run() {
for (String id: contentIdsToDelete) { for (ContentPair cp: contentToDelete ) {
try { try {
storageHandler.delete(id); cp.getStorageBackend().delete(cp.getContent().getStorageId());
log.debug("file with id {} correctly removed on storage",id); log.debug("file with id {} correctly removed from storage {}",cp.getContent().getStorageId(),cp.getStorageBackend().getClass().getSimpleName());
}catch(Throwable t) { }catch(Throwable t) {
log.warn("error removing file on storage with id {}",id, t); log.warn("error removing file with id {} from storage {}",cp.getContent().getStorageId(), cp.getStorageBackend().getClass().getSimpleName(), t);
} }
} }
} }
}); });
new Thread(deleteFromStorageRunnable).start(); executor.execute(deleteFromStorageRunnable);
if (!onlyContent)
ses.save(); ses.save();
}catch (LockException e) { }catch (LockException e) {
throw new ItemLockedException("the selected node or his parent is locked", e); throw new ItemLockedException("the selected node or his parent is locked", e);
}catch (Exception e) { }catch (Exception e) {
@ -104,20 +181,20 @@ public class TrashHandler {
} }
} }
public void moveToTrash(Session ses, Node nodeToDelete, Item item) throws RepositoryException, BackendGenericError{
public void moveToTrash(Session ses, Node nodeToDelete, Item item, String login) throws RepositoryException, BackendGenericError{
log.debug("moving node {} to trash ",item.getId()); log.debug("moving node {} to trash ",item.getId());
final Node trashFolder = ses.getNode(Paths.append(Utils.getWorkspacePath(),Constants.TRASH_ROOT_FOLDER_NAME).toPath());
final String login = AuthorizationProvider.instance.get().getClient().getId();
final Node trashFolder = ses.getNode(pathUtil.getTrashPath(login, ses).toPath());
try { try {
ses.getWorkspace().getLockManager().lock(trashFolder.getPath(), true, true, 0,login); ses.getWorkspace().getLockManager().lock(trashFolder.getPath(), true, true, 0,login);
ses.getWorkspace().getLockManager().lock(nodeToDelete.getPath(), true, true, 0,login); ses.getWorkspace().getLockManager().lock(nodeToDelete.getPath(), true, true, 0,login);
log.debug("preparing thrash item"); log.debug("preparing thrash item");
TrashItem trashItem = new TrashItem(); TrashItem trashItem = new TrashItem();
trashItem.setDeletedBy(AuthorizationProvider.instance.get().getClient().getId()); trashItem.setDeletedBy(login);
trashItem.setDeletedFrom(nodeToDelete.getParent().getPath()); trashItem.setDeletedFrom(nodeToDelete.getParent().getPath());
Calendar now = Calendar.getInstance(); Calendar now = Calendar.getInstance();
trashItem.setDeletedTime(now); trashItem.setDeletedTime(now);
@ -131,7 +208,7 @@ public class TrashHandler {
trashItem.setName(item.getId()); trashItem.setName(item.getId());
trashItem.setOriginalParentId(nodeToDelete.getParent().getIdentifier()); trashItem.setOriginalParentId(nodeToDelete.getParent().getIdentifier());
trashItem.setOwner(item.getOwner()); trashItem.setOwner(login);
trashItem.setLastModificationTime(item.getLastModificationTime()); trashItem.setLastModificationTime(item.getLastModificationTime());
trashItem.setLastModifiedBy(item.getLastModifiedBy()); trashItem.setLastModifiedBy(item.getLastModifiedBy());
@ -139,8 +216,7 @@ public class TrashHandler {
if (item instanceof FolderItem) { if (item instanceof FolderItem) {
trashItem.setFolder(true); trashItem.setFolder(true);
}else if (item instanceof AbstractFileItem ) { }else if (item instanceof AbstractFileItem file) {
AbstractFileItem file = (AbstractFileItem) item;
if (file.getContent()!=null) { if (file.getContent()!=null) {
trashItem.setMimeType(file.getContent().getMimeType()); trashItem.setMimeType(file.getContent().getMimeType());
trashItem.setLenght(file.getContent().getSize()); trashItem.setLenght(file.getContent().getSize());
@@ -156,12 +232,12 @@ public class TrashHandler {
 log.debug("calling jcr move");
 ses.getWorkspace().move(nodeToDelete.getPath(), Paths.append(Paths.getPath(newTrashItemNode.getPath()),nodeToDelete.getName()).toPath());
 String mimetype = null;
-if (item instanceof AbstractFileItem) {
-    if (((AbstractFileItem)item).getContent()!=null)
-        mimetype = ((AbstractFileItem) item).getContent().getMimeType();
-    else log.warn("the AbstractFileItem with id {} has no content (check it!!)", item.getId());
+if (item instanceof AbstractFileItem file) {
+    if (file.getContent()!=null)
+        mimetype = file.getContent().getMimeType();
+    else log.warn("the AbstractFileItem with id {} has no content (check it!!)", file.getId());
 }
-accountingHandler.createFolderRemoveObj(item.getName(), item.getClass().getSimpleName(), mimetype, ses, ses.getNodeByIdentifier(item.getParentId()), true);
+accountingHandler.createFolderRemoveObj(item.getName(), item.getClass().getSimpleName(), mimetype, ses, login, (Node) item.getRelatedNode(), true);
 }catch(Throwable t) {
     log.error("error exceuting move to trash",t);
     throw new BackendGenericError(t);
@@ -172,27 +248,69 @@ public class TrashHandler {
 }
-public String restoreItem(Session ses, TrashItem item) throws RepositoryException, BackendGenericError, UserNotAuthorizedException{
-    log.debug("restoring node from trash");
-    final String login = AuthorizationProvider.instance.get().getClient().getId();
+public String restoreItem(Session ses, TrashItem item, FolderItem destination, String login) throws RepositoryException, StorageHubException, BackendGenericError{
+    log.debug("restoring node from trash with user ");
     //final Node trashFolder = ses.getNode(Paths.append(Utils.getHomePath(),Constants.TRASH_ROOT_FOLDER_NAME).toPath());
-    Node originalParent = ses.getNodeByIdentifier(item.getOriginalParentId());
-    authChecker.checkWriteAuthorizationControl(ses, originalParent.getIdentifier(), false );
-    ses.getWorkspace().getLockManager().lock(originalParent.getPath(), true, true, 0,login);
-    List<Item> items = Utils.getItemList(ses.getNodeByIdentifier(item.getId()), Excludes.ALL, null, false, null);
-    if (items.size()!=1) {
-        log.warn("a problem occurred restoring item from trash");
-        throw new BackendGenericError("An error occurred on trash item");
-    }
-    Item itemToMove = items.get(0);
-    String newNodePath = Paths.append(Paths.getPath(originalParent.getPath()), itemToMove.getName()).toPath();
-    ses.move(itemToMove.getPath(), newNodePath);
-    Utils.setPropertyOnChangeNode(ses.getNode(newNodePath), login, ItemAction.MOVED);
-    ses.removeItem(item.getPath());
-    ses.save();
-    return ses.getNode(newNodePath).getIdentifier();
+    Node destinationNode= null;
+    if (destination==null) {
+        boolean originalParentExists = true;
+        boolean originalParentTrashed = false;
+        Node originalParent = null;
+        try {
+            originalParent = ses.getNodeByIdentifier(item.getOriginalParentId());
+        }catch (ItemNotFoundException e) {
+            originalParentExists = false;
+        }
+        Item originalParentItem = null;
+        if (originalParentExists) {
+            originalParentItem = node2Item.getItem(originalParent, Excludes.ALL);
+            originalParentTrashed = originalParentItem.isTrashed();
+        }
+        if(originalParentExists && !originalParentTrashed) {
+            destinationNode = originalParent;
+        }else {
+            String homeWS = pathUtil.getWorkspacePath(login).toPath();
+            Node node = ses.getNode(homeWS);
+            destinationNode = node;
+        }
+    } else {
+        Node node = (Node)destination.getRelatedNode();
+        if (!node2Item.checkNodeType(node, FolderItem.class))
+            throw new InvalidCallParameters("destination Node is not a folder");
+        destinationNode = node;
+    }
+    authChecker.checkWriteAuthorizationControl(ses, login, destinationNode.getIdentifier(), true );
+    ses.getWorkspace().getLockManager().lock(destinationNode.getPath(), true, true, 0,login);
+    String newNodePath = null;
+    try {
+        //the real node is a child of the Trash node
+        List<Item> items = Utils.getItemList(ses.getNodeByIdentifier(item.getId()), Excludes.ALL, null, false, null);
+        if (items.size()!=1) {
+            log.warn("a problem occurred restoring item from trash");
+            throw new BackendGenericError("An error occurred on trash item");
+        }
+        Item itemToMove = items.get(0);
+        item2Node.updateOwnerOnSubTree(ses.getNodeByIdentifier(itemToMove.getId()), login);
+        String uniqueName = Utils.checkExistanceAndGetUniqueName(ses, destinationNode, itemToMove.getName());
+        newNodePath = Paths.append(Paths.getPath(destinationNode.getPath()),uniqueName).toPath();
+        ses.move(itemToMove.getPath(), newNodePath);
+        Utils.setPropertyOnChangeNode(ses.getNode(newNodePath), login, ItemAction.MOVED);
+        ses.removeItem(item.getPath());
+        ses.save();
+    }catch (Exception e) {
+        if (ses.getWorkspace().getLockManager().isLocked(destinationNode.getPath()))
+            ses.getWorkspace().getLockManager().unlock(destinationNode.getPath());
+    }
+    return ses.getNode(newNodePath).getIdentifier();
 }
 }
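The restore hunk above introduces a destination-fallback rule: an explicit destination wins; otherwise the original parent is used only if it still exists and is not itself in the trash, else the item lands in the caller's home workspace. A minimal sketch of just that decision, with illustrative names and JCR nodes reduced to plain strings (this is not the service's API):

```java
// Illustrative reduction of the restore-destination rule in the diff above.
public class RestoreDestinationRule {
    public static String choose(String explicitDestination,
                                String originalParent,
                                boolean parentExists,
                                boolean parentTrashed,
                                String userHome) {
        if (explicitDestination != null)
            return explicitDestination;          // caller picked a folder
        if (parentExists && !parentTrashed)
            return originalParent;               // restore in place
        return userHome;                         // fall back to the home workspace
    }
}
```

The same three-way choice is what drives which node gets write-checked and locked before the `ses.move`.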


@@ -5,9 +5,10 @@ import java.util.HashSet;
 import java.util.List;
 import java.util.Set;
-import javax.inject.Inject;
-import javax.inject.Singleton;
+import jakarta.inject.Inject;
+import jakarta.inject.Singleton;
 import javax.jcr.Node;
+import javax.jcr.NodeIterator;
 import javax.jcr.RepositoryException;
 import javax.jcr.Session;
 import javax.jcr.lock.LockException;
@@ -27,9 +28,13 @@ import org.gcube.common.storagehub.model.items.FolderItem;
 import org.gcube.common.storagehub.model.items.Item;
 import org.gcube.common.storagehub.model.items.SharedFolder;
 import org.gcube.common.storagehub.model.types.ItemAction;
+import org.gcube.common.storagehub.model.types.NodeProperty;
 import org.gcube.data.access.storagehub.AuthorizationChecker;
+import org.gcube.data.access.storagehub.PathUtil;
 import org.gcube.data.access.storagehub.Utils;
 import org.gcube.data.access.storagehub.accounting.AccountingHandler;
+import org.gcube.data.access.storagehub.handlers.items.Item2NodeConverter;
+import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -48,25 +53,38 @@ public class UnshareHandler {
 @Inject
 AuthorizationChecker authChecker;
+@Inject
+PathUtil pathUtil;
 @Inject
 Item2NodeConverter item2Node;
 public String unshare(Session ses, Set<String> users, Node sharedNode, String login) throws RepositoryException, StorageHubException{
+    return _unshare(ses, users, sharedNode, login, true);
+}
+public String unshareForRemoval(Session ses, Set<String> users, Node sharedNode, String login) throws RepositoryException, StorageHubException{
+    return _unshare(ses, users, sharedNode, login, false);
+}
+private String _unshare(Session ses, Set<String> users, Node sharedNode, String login, boolean withCopyOnUnshare) throws RepositoryException, StorageHubException{
     Item item = node2Item.getItem(sharedNode, Excludes.ALL);
-    if (!(item instanceof FolderItem) || !((FolderItem) item).isShared() || ((SharedFolder) item).isVreFolder())
+    if (!(item instanceof FolderItem) || !((FolderItem) item).isShared() || ((SharedFolder) item).isVreFolder()) {
+        log.warn("this item type cannot be unshared {}",item.getClass().getSimpleName());
         return null;
+    }
     SharedFolder sharedItem =(SharedFolder) item;
     Set<String> usersInSharedFolder = new HashSet<>(sharedItem.getUsers().getMap().keySet());
     usersInSharedFolder.removeAll(users);
     if (users==null || users.size()==0)
-        return unshareAll(login, ses, sharedItem);
+        return unshareAll(login, ses, sharedItem, withCopyOnUnshare);
     if (usersInSharedFolder.size()<=1) {
         if (users.size()==1 && users.contains(login))
-            return unshareAll(sharedItem.getOwner(), ses , sharedItem);
-        else return unshareAll(login, ses, sharedItem);
+            return unshareAll(sharedItem.getOwner(), ses , sharedItem, withCopyOnUnshare);
+        else return unshareAll(login, ses, sharedItem, withCopyOnUnshare);
     }
 try {
@@ -85,7 +103,8 @@ public class UnshareHandler {
 }
-private String unshareAll(String login, Session ses, SharedFolder item) throws StorageHubException, BackendGenericError, RepositoryException{
+private String unshareAll(String login, Session ses, SharedFolder item, boolean withCopyCreation) throws StorageHubException, BackendGenericError, RepositoryException{
 log.info("unshare all called");
 if (!login.equals(item.getOwner()))
@@ -99,37 +118,66 @@ public class UnshareHandler {
     throw new ItemLockedException(e);
 }
-Node unsharedNode;
+String unsharedNodeIdentifier =null;
 try {
     log.debug("user list is empty, I'm going to remove also the shared dir");
-    //take the admin folder and remove his clone then move the shared folder from share to the user home and change the folder type
-    String adminDirPath = (String)item.getUsers().getMap().get(login);
-    String[] splitString = adminDirPath.split("/");
-    String parentDirectoryId = splitString[0];
-    String directoryName = splitString[1];
-    Node parentNode = ses.getNodeByIdentifier(parentDirectoryId);
-    log.debug("parent node path is {}/{}",parentNode.getPath(), directoryName);
-    Node adminNode = ses.getNode(String.format("%s/%s",parentNode.getPath(), directoryName));
-    adminNode.removeShare();
-    unsharedNode = createUnsharedFolder(ses, parentNode, directoryName, item.getDescription(), login);
-    List<Item> itemsToCopy = Utils.getItemList(sharedItemNode, Excludes.ALL, null, true, null);
-    for (Item itemCopy: itemsToCopy) {
-        Node itemToCopyNode = ses.getNodeByIdentifier(itemCopy.getId());
-        log.debug("copying {} to {}", itemToCopyNode.getPath(), unsharedNode.getPath());
-        ses.move(itemToCopyNode.getPath(), String.format("%s/%s",unsharedNode.getPath(), itemToCopyNode.getName()));
-    }
-    ses.save();
+    if (withCopyCreation) {
+        Node sharedOwnerNode = null;
+        NodeIterator it = sharedItemNode.getSharedSet();
+        while(it.hasNext()) {
+            Node node = it.nextNode();
+            log.info("[UNSHARE] checking node {} starts with {} ",node.getPath(),pathUtil.getHome(login).toPath());
+            if (node.getPath().startsWith(pathUtil.getHome(login).toPath())) {
+                sharedOwnerNode =node;
+                break;
+            }
+        }
+        Node shareParent = sharedOwnerNode.getParent();
+        sharedOwnerNode.removeShare();
+        Node unsharedNode = createUnsharedFolder(ses, shareParent , item.getTitle() , item.getDescription(), login);
+        List<Item> itemsToCopy = Utils.getItemList(sharedItemNode, Excludes.ALL, null, true, null);
+        for (Item itemCopy: itemsToCopy) {
+            Node itemToCopyNode = ses.getNodeByIdentifier(itemCopy.getId());
+            log.debug("copying {} to {}", itemToCopyNode.getPath(), unsharedNode.getPath());
+            ses.move(itemToCopyNode.getPath(), String.format("%s/%s",unsharedNode.getPath(), itemToCopyNode.getName()));
+        }
+        unsharedNode.getNode(NodeProperty.ACCOUNTING.toString()).remove();
+        ses.move(sharedItemNode.getNode(NodeProperty.ACCOUNTING.toString()).getPath(), String.format("%s/%s",unsharedNode.getPath(), NodeProperty.ACCOUNTING.toString()));
+        //set owner of all the unshared items to the caller
+        item2Node.updateOwnerOnSubTree(unsharedNode, login);
+        accountingHandler.createUnshareFolder(sharedItemNode.getProperty(NodeProperty.TITLE.toString()).getString(), ses, "ALL", unsharedNode, false);
+        unsharedNodeIdentifier = unsharedNode.getIdentifier();
+        log.info("[UNSHARE] unshared node id {}",unsharedNodeIdentifier);
+        ses.save();
+    } //TODO: else shoud I removove all the fiel content ?
+    log.debug("all the users have been removed, the folder is totally unshared");
+}catch(Throwable t) {
+    log.error("erro unsharing all",t);
+    throw t;
 }finally {
     ses.getWorkspace().getLockManager().unlock(sharedItemNode.getPath());
 }
 sharedItemNode.removeSharedSet();
 ses.save();
-log.debug("all the users have been removed, the folder is totally unshared");
-return unsharedNode.getIdentifier();
+return unsharedNodeIdentifier;
 }
@@ -137,16 +185,29 @@ public class UnshareHandler {
 private String unshareCaller(String login, Session ses, SharedFolder item) throws StorageHubException, RepositoryException{
+    log.info("unshare caller");
     if (login.equals(item.getOwner()))
         throw new InvalidCallParameters("the caller is the owner, the folder cannot be unshared");
-    if (item.getUsers().getMap().get(login)==null)
-        throw new InvalidCallParameters("the folder is not shared with user "+login);
     Node sharedFolderNode =ses.getNodeByIdentifier(item.getId());
-    String parentId = removeSharingForUser(login, ses, item);
+    if (item.getUsers().getMap().get(login)!=null) {
+        Node usersNode = sharedFolderNode.getNode(NodeConstants.USERS_NAME);
+        usersNode.remove();
+        Node newUsersNode = sharedFolderNode.addNode(NodeConstants.USERS_NAME);
+        item.getUsers().getMap().entrySet().stream().filter(entry -> !entry.getKey().equals(login)).forEach(entry-> {try {
+            newUsersNode.setProperty(entry.getKey(), (String)entry.getValue());
+        } catch (Exception e) {
+            log.error("error adding property to shared node users node under {}",item.getId());
+        }});
+    }
+    Node shareNode = getUserSharingNode(login, ses, item);
+    String parentId = shareNode.getParent().getIdentifier();
+    //not returning an error to correct all the old ACL
+    if (shareNode != null)
+        shareNode.removeShare();
     AccessControlManager acm = ses.getAccessControlManager();
     JackrabbitAccessControlList acls = AccessControlUtils.getAccessControlList(acm, sharedFolderNode.getPath());
@@ -162,20 +223,12 @@ public class UnshareHandler {
 if (entryToDelete!=null)
     acls.removeAccessControlEntry(entryToDelete);
-log.debug("removed Access control entry for user {}",login);
-Node sharedItemNode = ses.getNodeByIdentifier(item.getId());
-Node usersNode = sharedItemNode.getNode(NodeConstants.USERS_NAME);
-usersNode.remove();
-Node newUsersNode = sharedItemNode.addNode(NodeConstants.USERS_NAME);
-item.getUsers().getMap().entrySet().stream().filter(entry -> !entry.getKey().equals(login)).forEach(entry-> {try {
-    newUsersNode.setProperty(entry.getKey(), (String)entry.getValue());
-} catch (Exception e) {
-    log.error("error adding property to shared node users node under {}",item.getId());
-}});
 acm.setPolicy(sharedFolderNode.getPath(), acls);
+log.debug("removed Access control entry for user {}",login);
+accountingHandler.createUnshareFolder(item.getTitle(), ses, login, sharedFolderNode, false);
 ses.save();
 return parentId;
@@ -183,10 +236,9 @@ public class UnshareHandler {
 }
 private String unsharePartial(Set<String> usersToUnshare, String login, Session ses, SharedFolder item) throws StorageHubException, RepositoryException {
-    authChecker.checkAdministratorControl(ses, (SharedFolder)item);
+    log.info("unshare partial");
+    authChecker.checkAdministratorControl(ses, login, (SharedFolder)item);
     if (usersToUnshare.contains(item.getOwner()))
         throw new UserNotAuthorizedException("user "+login+" not authorized to unshare owner");
@@ -196,7 +248,11 @@ public class UnshareHandler {
 JackrabbitAccessControlList acls = AccessControlUtils.getAccessControlList(acm, sharedFolderNode.getPath());
 for (String user : usersToUnshare) {
-    removeSharingForUser(user, ses, item);
+    Node userShareNode = getUserSharingNode(user, ses, item);
+    //not returning an error to correct all the old ACL
+    if (userShareNode != null)
+        userShareNode.removeShare();
     AccessControlEntry entryToDelete= null;
     for (AccessControlEntry ace :acls.getAccessControlEntries()) {
@@ -226,6 +282,10 @@ public class UnshareHandler {
 acm.setPolicy(sharedFolderNode.getPath(), acls);
+for (String user: usersToUnshare) {
+    accountingHandler.createUnshareFolder(sharedItemNode.getProperty(NodeProperty.TITLE.toString()).getString(), ses, user, sharedItemNode, false);
+}
 ses.save();
 return item.getId();
@@ -233,18 +293,34 @@ public class UnshareHandler {
 }
-private String removeSharingForUser(String user, Session ses, SharedFolder item) throws RepositoryException {
-    String userDirPath = (String)item.getUsers().getMap().get(user);
-    if (userDirPath==null) return null;
-    String[] splitString = userDirPath.split("/");
-    String parentDirectoryId = splitString[0];
-    String directoryName = splitString[1];
-    Node parentNode = ses.getNodeByIdentifier(parentDirectoryId);
-    Node userNode = ses.getNode(String.format("%s/%s",parentNode.getPath(), directoryName));
-    userNode.removeShare();
-    accountingHandler.createUnshareFolder(directoryName, ses, parentNode, false);
-    log.debug("directory removed for user {}",user);
-    return parentDirectoryId;
+private Node getUserSharingNode(String user, Session ses, SharedFolder item) throws RepositoryException {
+    Node shareNode = null;
+    try {
+        String userDirPath = (String)item.getUsers().getMap().get(user);
+        if (userDirPath==null) return null;
+        String[] splitString = userDirPath.split("/");
+        String parentDirectoryId = splitString[0];
+        String directoryName = splitString[1];
+        Node parentNode = null;
+        parentNode = ses.getNodeByIdentifier(parentDirectoryId);
+        shareNode = ses.getNode(String.format("%s/%s",parentNode.getPath(), directoryName));
+    }catch (Throwable e) {
+        log.warn("users map is not containing a valid value");
+    }
+    Node sharedFolderNode = ses.getNodeByIdentifier(item.getId());
+    if (shareNode==null) {
+        NodeIterator it = sharedFolderNode.getSharedSet();
+        while(it.hasNext()) {
+            Node node = it.nextNode();
+            if (node.getPath().startsWith(pathUtil.getHome(user).toPath())) {
+                shareNode =node;
+                break;
+            }
+        }
+    }
+    return shareNode;
 }
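The new `getUserSharingNode` above is a two-step lookup: first resolve the path recorded in the shared folder's users map, and only if that is missing or unresolvable fall back to scanning the JCR shared set for the node rooted under the user's home. A minimal sketch of that lookup order, with JCR nodes reduced to plain path strings for illustration (not the real types):

```java
// Illustrative reduction of the users-map-first, shared-set-fallback lookup.
public class SharingLookup {
    public static String resolve(String mappedPath, String[] sharedSetPaths, String userHome) {
        if (mappedPath != null)
            return mappedPath;                    // the users map is authoritative when valid
        for (String path : sharedSetPaths)        // otherwise repair from the shared set
            if (path.startsWith(userHome))
                return path;
        return null;                              // no share found for this user
    }
}
```

The fallback is what lets the handler "correct all the old ACL" entries whose users-map values had gone stale.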


@@ -1,59 +0,0 @@
package org.gcube.data.access.storagehub.handlers;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.inject.Inject;
import javax.inject.Singleton;
import javax.jcr.SimpleCredentials;
import javax.servlet.ServletContext;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.data.access.storagehub.Constants;
import org.gcube.data.access.storagehub.services.RepositoryInitializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Singleton
public class VREManager {
private static final Logger logger = LoggerFactory.getLogger(VREManager.class);
private Map<String, VRE> vreMap = new HashMap<>();
@Inject
RepositoryInitializer repository;
ExecutorService executor = Executors.newFixedThreadPool(5);
SimpleCredentials credentials;
@Inject
public VREManager(ServletContext context) {
credentials = new SimpleCredentials(context.getInitParameter(Constants.ADMIN_PARAM_NAME),context.getInitParameter(Constants.ADMIN_PARAM_PWD).toCharArray());
}
public synchronized VRE getVRE(String completeName) {
logger.trace("requesting VRE {}",completeName);
if (vreMap.containsKey(completeName))
return vreMap.get(completeName);
else
return null;
}
public synchronized VRE putVRE(Item vreFolder) {
logger.trace("inserting VRE {}",vreFolder.getTitle());
if (vreMap.containsKey(vreFolder.getTitle())) throw new RuntimeException("something went wrong (vre already present in the map)");
else {
VRE toReturn = new VRE(vreFolder, repository.getRepository(), credentials, executor);
vreMap.put(vreFolder.getTitle(), toReturn);
return toReturn;
}
}
}


@@ -1,192 +0,0 @@
package org.gcube.data.access.storagehub.handlers;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.concurrent.Callable;
import javax.jcr.Credentials;
import javax.jcr.Node;
import javax.jcr.NodeIterator;
import javax.jcr.Property;
import javax.jcr.Repository;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import javax.jcr.observation.Event;
import javax.jcr.observation.EventJournal;
import javax.jcr.query.Query;
import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.NodeConstants;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.data.access.storagehub.Constants;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class VREQueryRetriever implements Callable<List<Item>> {
private static final Logger logger = LoggerFactory.getLogger(VREQueryRetriever.class);
private static final int CACHE_DIMENSION = 50;
private Repository repository;
private Credentials credentials;
private Item vreFolder;
List<Item> cachedList = new ArrayList<>(CACHE_DIMENSION);
long lastTimestamp =0;
private Node2ItemConverter node2Item = new Node2ItemConverter();
public VREQueryRetriever(Repository repository, Credentials credentials, Item vreFolder) {
super();
this.repository = repository;
this.credentials = credentials;
this.vreFolder = vreFolder;
}
public List<Item> call() {
logger.trace("executing recents task");
Session ses = null;
if (lastTimestamp==0) {
try {
long start = System.currentTimeMillis();
ses = repository.login(credentials);
String query = String.format("SELECT * FROM [nthl:workspaceLeafItem] AS node WHERE ISDESCENDANTNODE('%s') ORDER BY node.[jcr:lastModified] DESC ",vreFolder.getPath());
logger.trace("query for recents is {}",query);
Query jcrQuery = ses.getWorkspace().getQueryManager().createQuery(query, Constants.QUERY_LANGUAGE);
jcrQuery.setLimit(CACHE_DIMENSION);
lastTimestamp = System.currentTimeMillis();
NodeIterator it = jcrQuery.execute().getNodes();
logger.trace("query for recents took {}",System.currentTimeMillis()-start);
while (it.hasNext()) {
Node node = it.nextNode();
Item item =node2Item.getItem(node, Excludes.EXCLUDE_ACCOUNTING);
cachedList.add(item);
logger.trace("adding item {} with node {}",item.getTitle(), node.getName());
}
logger.trace("creating objects took {}",System.currentTimeMillis()-start);
if (cachedList.size()<=10) return cachedList;
else return cachedList.subList(0, 10);
} catch (Exception e) {
logger.error("error querying vre {}",vreFolder.getTitle(),e);
throw new RuntimeException(e);
}finally{
if (ses!=null)
ses.logout();
logger.trace("recents task finished");
}
} else {
try {
long timestampToUse = lastTimestamp;
lastTimestamp = System.currentTimeMillis();
long start = System.currentTimeMillis();
ses = repository.login(credentials);
final String[] types = { "nthl:workspaceLeafItem", "nthl:workspaceItem"};
EventJournal journalChanged = ses.getWorkspace().getObservationManager().getEventJournal(Event.PROPERTY_CHANGED^Event.NODE_REMOVED^Event.NODE_MOVED^Event.NODE_ADDED, vreFolder.getPath(), true, null, types);
journalChanged.skipTo(timestampToUse);
logger.trace("getting the journal took {}",System.currentTimeMillis()-start);
int events = 0;
while (journalChanged.hasNext()) {
events++;
Event event = journalChanged.nextEvent();
switch(event.getType()) {
case Event.NODE_ADDED:
if (ses.nodeExists(event.getPath())) {
Node nodeAdded = ses.getNode(event.getPath());
if (nodeAdded.isNodeType("nthl:workspaceLeafItem")) {
logger.trace("node added event received with name {}", nodeAdded.getName());
Item item = node2Item.getItem(nodeAdded, Arrays.asList(NodeConstants.ACCOUNTING_NAME));
insertItemInTheRightPlace(item);
}
}
break;
case Event.PROPERTY_CHANGED:
if (ses.propertyExists(event.getPath())) {
Property property = ses.getProperty(event.getPath());
if (property.getName().equalsIgnoreCase("jcr:lastModified")) {
logger.trace("event property changed on {} with value {} and parent {}",property.getName(), property.getValue().getString(), property.getParent().getPath());
String identifier = property.getParent().getIdentifier();
cachedList.removeIf(i -> i.getId().equals(identifier));
Item item = node2Item.getItem(property.getParent(), Excludes.EXCLUDE_ACCOUNTING);
insertItemInTheRightPlace(item);
}
}
break;
case Event.NODE_REMOVED:
logger.trace("node removed event received with type {}", event.getIdentifier());
cachedList.removeIf(i -> {
try {
return i.getId().equals(event.getIdentifier()) && i.getLastModificationTime().getTime().getTime()<event.getDate();
} catch (RepositoryException e) {
return false;
}
});
break;
case Event.NODE_MOVED:
Node nodeMoved = ses.getNode(event.getPath());
logger.trace("node moved event received with type {}", nodeMoved.getPrimaryNodeType());
if (nodeMoved.isNodeType("nthl:workspaceLeafItem")) {
logger.trace("event node moved on {} with path {}",nodeMoved.getName(), nodeMoved.getPath());
String identifier = nodeMoved.getIdentifier();
cachedList.removeIf(i -> i.getId().equals(identifier) && !i.getPath().startsWith(vreFolder.getPath()));
}
break;
default:
throw new Exception("error in event handling");
}
}
if (cachedList.size()>CACHE_DIMENSION)
cachedList.subList(51, cachedList.size()).clear();
logger.trace("retrieving event took {} with {} events",System.currentTimeMillis()-start, events);
if (cachedList.size()<=10) return cachedList;
else return cachedList.subList(0, 10);
} catch (Exception e) {
logger.error("error getting events for vre {}",vreFolder.getTitle(),e);
throw new RuntimeException(e);
}finally{
if (ses!=null)
ses.logout();
}
}
}
private void insertItemInTheRightPlace(Item item) {
Iterator<Item> it = cachedList.iterator();
int index =0;
while (it.hasNext()) {
Item inListItem = it.next();
if (item.getLastModificationTime().getTime().getTime()>=inListItem.getLastModificationTime().getTime().getTime()) break;
index++;
}
if (index<CACHE_DIMENSION)
cachedList.add(index, item);
}
/* @Override
public void onEvent(EventIterator events) {
logger.trace("on event called");
while (events.hasNext()) {
Event event = events.nextEvent();
try {
logger.trace("new event received of type {} on node {}",event.getType(),event.getIdentifier());
} catch (RepositoryException e) {
logger.error("error reading event",e);
}
}
}*/
}


@@ -4,8 +4,9 @@ import java.util.ArrayList;
 import java.util.Collections;
 import java.util.List;
-import javax.inject.Singleton;
+import jakarta.inject.Singleton;
 import javax.jcr.Node;
+import javax.jcr.RepositoryException;
 import javax.jcr.Session;
 import javax.jcr.version.Version;
 import javax.jcr.version.VersionHistory;
@@ -22,7 +23,7 @@ public class VersionHandler {
 private static final Logger logger = LoggerFactory.getLogger(VersionHandler.class);
-public void makeVersionableContent(Node node, Session session){
+public void makeVersionableContent(Node node){
     try {
         Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
         contentNode.addMixin(JcrConstants.MIX_VERSIONABLE);
@@ -31,8 +32,9 @@ public class VersionHandler {
     }
 }
-public void checkinContentNode(Node node, Session session){
+public void checkinContentNode(Node node){
     try {
+        Session session = node.getSession();
         Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
         VersionManager versionManager = session.getWorkspace().getVersionManager();
         versionManager.checkin(contentNode.getPath());
@@ -41,8 +43,9 @@ public class VersionHandler {
     }
 }
-public void checkoutContentNode(Node node, Session session){
+public void checkoutContentNode(Node node){
     try {
+        Session session = node.getSession();
         Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
         VersionManager versionManager = session.getWorkspace().getVersionManager();
         versionManager.checkout(contentNode.getPath());
@@ -51,8 +54,16 @@ public class VersionHandler {
     }
 }
-public List<Version> getContentVersionHistory(Node node, Session session) {
+public Version getCurrentVersion(Node node) throws RepositoryException{
+    Session session = node.getSession();
+    Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
+    VersionManager versionManager = session.getWorkspace().getVersionManager();
+    return versionManager.getBaseVersion(contentNode.getPath());
+}
+public List<Version> getContentVersionHistory(Node node) {
     try {
+        Session session = node.getSession();
         Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
         VersionManager versionManager = session.getWorkspace().getVersionManager();
         VersionHistory history = versionManager.getVersionHistory(contentNode.getPath());
@@ -65,10 +76,17 @@ public class VersionHandler {
         logger.debug("version name {} with nodeType {}",version.getName(),version.getPrimaryNodeType().getName());
     }
     return versions;
-}catch(Exception e ) {
+}catch(Throwable e ) {
     logger.warn("cannot get version history content node",e);
     return Collections.emptyList();
 }
 }
+public void removeContentVersion(Node node, String versionName) throws RepositoryException{
+    Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
+    VersionHistory history = contentNode.getSession().getWorkspace().getVersionManager().getVersionHistory(contentNode.getPath());
+    history.removeVersion(versionName);
+}
 }


@@ -7,7 +7,15 @@ import org.gcube.common.storagehub.model.items.nodes.Content;
 public interface ContentHandler {
-    void initiliseSpecificContent(InputStream is, String fileName, String mimeType) throws Exception;
+    boolean requiresInputStream();
+
+    default void initiliseSpecificContent(InputStream is, String fileName, String mimeType, long size) throws Exception{
+        throw new UnsupportedOperationException();
+    }
+
+    default void initiliseSpecificContent(String fileName, String mimeType) throws Exception {
+        throw new UnsupportedOperationException();
+    }
 
     Content getContent();
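The revised contract above lets a handler declare whether it needs the payload stream and override only the matching initialisation overload, with the other default throwing. A self-contained re-creation for illustration (`MiniContentHandler` and its names are invented here; the real interface lives in `org.gcube.common.storagehub`):

```java
import java.io.InputStream;

// Hypothetical miniature of the revised ContentHandler contract:
// requiresInputStream() steers the caller to the right overload, and the
// unimplemented default overload fails fast with UnsupportedOperationException.
interface MiniContentHandler {
    boolean requiresInputStream();
    default void initialise(InputStream is, String fileName, String mimeType, long size) throws Exception {
        throw new UnsupportedOperationException();
    }
    default void initialise(String fileName, String mimeType) {
        throw new UnsupportedOperationException();
    }
}

// A stream-less handler (analogous to GenericFileHandler below): it only
// records the mime type, so it overrides the no-stream overload.
class MiniGenericHandler implements MiniContentHandler {
    private String mimeType;
    @Override public boolean requiresInputStream() { return false; }
    @Override public void initialise(String fileName, String mimeType) { this.mimeType = mimeType; }
    String mimeType() { return mimeType; }
}
```

The payoff of this shape is that handlers which never touch the bytes (plain files) no longer force the service to materialise an `InputStream` before initialisation.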


@@ -3,7 +3,8 @@ package org.gcube.data.access.storagehub.handlers.content;
 import java.util.Arrays;
 import java.util.HashMap;
 import java.util.Set;
-import javax.inject.Singleton;
+import jakarta.inject.Singleton;
 import org.gcube.common.storagehub.model.annotations.MimeTypeHandler;
 import org.reflections.Reflections;
@ -41,9 +42,9 @@ public class ContentHandlerFactory {
public ContentHandler create(String mimetype) throws Exception{ public ContentHandler create(String mimetype) throws Exception{
Class<? extends ContentHandler> handlerClass = handlerMap.get(mimetype); Class<? extends ContentHandler> handlerClass = handlerMap.get(mimetype);
if (handlerClass!=null) if (handlerClass!=null)
return handlerClass.newInstance(); return handlerClass.getDeclaredConstructor().newInstance();
else else
return defaultHandler.newInstance(); return defaultHandler.getDeclaredConstructor().newInstance();
} }
} }
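The factory change replaces `Class.newInstance()`, deprecated since Java 9 because it requires an accessible no-arg constructor and rethrows any checked exception from that constructor unchecked. `getDeclaredConstructor().newInstance()` instead wraps constructor failures in `InvocationTargetException`. A minimal sketch of the replacement pattern (the `Reflect` helper is illustrative, not project code):

```java
class Reflect {
    // Replacement for the deprecated clazz.newInstance(): constructor
    // exceptions arrive wrapped in InvocationTargetException, and all
    // failure modes are subclasses of ReflectiveOperationException.
    static <T> T instantiate(Class<T> clazz) throws ReflectiveOperationException {
        return clazz.getDeclaredConstructor().newInstance();
    }
}
```

This is a drop-in swap whenever the target type has a no-arg constructor, as the content handlers here do.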

View File

@@ -1,6 +1,5 @@
 package org.gcube.data.access.storagehub.handlers.content;
-import java.io.InputStream;
 import java.util.Calendar;
 import org.gcube.common.storagehub.model.items.GenericFileItem;
@@ -11,12 +10,17 @@ public class GenericFileHandler implements ContentHandler{
 	Content content = new Content();
+	@Override
+	public boolean requiresInputStream() {
+		return false;
+	}
+
 	@Override
-	public void initiliseSpecificContent(InputStream is, String filename, String mimeType) throws Exception {
+	public void initiliseSpecificContent(String filename, String mimeType) throws Exception {
 		content.setMimeType(mimeType);
 	}
 	@Override
 	public Content getContent() {
 		return content;

View File

@@ -28,28 +28,38 @@ public class ImageHandler implements ContentHandler{
 	private static final Logger logger = LoggerFactory.getLogger(ImageHandler.class);
 	@Override
-	public void initiliseSpecificContent(InputStream is, String fileName, String mimeType) throws Exception {
-		Image image = javax.imageio.ImageIO.read(is);
-		int width = image.getWidth(null);
-		int height = image.getHeight(null);
-		content.setWidth(Long.valueOf(width));
-		content.setHeight(Long.valueOf(height));
-		try {
-			int[] dimension = getThumbnailDimension(width, height);
-			byte[] buf = transform(image, fileName, dimension[0], dimension[1]).toByteArray();
-			content.setThumbnailHeight(Long.valueOf(dimension[1]));
-			content.setThumbnailWidth(Long.valueOf(dimension[0]));
-			content.setThumbnailData(buf);
-		} catch (Throwable t) {
-			logger.warn("thumbnail for file {} cannot be created ", fileName, t);
-		}
+	public boolean requiresInputStream() {
+		return true;
+	}
+
+	@Override
+	public void initiliseSpecificContent(InputStream is, String fileName, String mimeType, long size) throws Exception {
+		if (size < 5242880) {
+			Image image = javax.imageio.ImageIO.read(is);
+			int width = image.getWidth(null);
+			int height = image.getHeight(null);
+			content.setWidth(Long.valueOf(width));
+			content.setHeight(Long.valueOf(height));
+			try {
+				int[] dimension = getThumbnailDimension(width, height);
+				byte[] buf = transform(image, fileName, dimension[0], dimension[1]).toByteArray();
+				content.setThumbnailHeight(Long.valueOf(dimension[1]));
+				content.setThumbnailWidth(Long.valueOf(dimension[0]));
+				content.setThumbnailData(buf);
+			} catch (Throwable t) {
+				logger.warn("thumbnail for file {} cannot be created ", fileName, t);
+			}
+		}
 		content.setMimeType(mimeType);
 	}
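The new guard skips image parsing entirely for files of 5242880 bytes (5 MiB) or more, so oversized uploads no longer pay the `ImageIO.read` cost. The `getThumbnailDimension` helper is not shown in this diff; the sketch below is a hypothetical version of what such a helper typically does (scale the longer edge down to an assumed maximum while preserving aspect ratio), not the project's actual rule.

```java
class Thumbs {
    static final int MAX_EDGE = 200; // assumed thumbnail edge limit

    // Hypothetical getThumbnailDimension-style helper: returns {width, height}
    // scaled so the longer edge fits MAX_EDGE, keeping the aspect ratio.
    static int[] thumbnailDimension(int width, int height) {
        if (width <= MAX_EDGE && height <= MAX_EDGE)
            return new int[] { width, height };
        double scale = (double) MAX_EDGE / Math.max(width, height);
        return new int[] { (int) Math.round(width * scale), (int) Math.round(height * scale) };
    }
}
```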

View File

@@ -7,7 +7,6 @@ import org.apache.tika.metadata.Metadata;
 import org.apache.tika.parser.ParseContext;
 import org.apache.tika.parser.microsoft.ooxml.OOXMLParser;
 import org.apache.tika.sax.BodyContentHandler;
-import org.gcube.common.storagehub.model.annotations.MimeTypeHandler;
 import org.gcube.common.storagehub.model.items.GenericFileItem;
 import org.gcube.common.storagehub.model.items.nodes.Content;
 import org.gcube.common.storagehub.model.types.ItemAction;
@@ -18,7 +17,12 @@ public class OfficeAppHandler implements ContentHandler{
 	Content content = new Content();
 	@Override
-	public void initiliseSpecificContent(InputStream is, String filename, String mimeType) throws Exception {
+	public boolean requiresInputStream() {
+		return true;
+	}
+
+	@Override
+	public void initiliseSpecificContent(InputStream is, String filename, String mimeType, long size) throws Exception {
 		//detecting the file type
 		BodyContentHandler handler = new BodyContentHandler();
 		Metadata metadata = new Metadata();

View File

@@ -8,6 +8,8 @@ import org.gcube.common.storagehub.model.annotations.MimeTypeHandler;
 import org.gcube.common.storagehub.model.items.PDFFileItem;
 import org.gcube.common.storagehub.model.items.nodes.PDFContent;
 import org.gcube.common.storagehub.model.types.ItemAction;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import com.itextpdf.text.pdf.PdfReader;
@@ -22,20 +24,28 @@ public class PdfHandler implements ContentHandler {
 	PDFContent content = new PDFContent();
+	private static final Logger logger = LoggerFactory.getLogger(PdfHandler.class);
 	@Override
-	public void initiliseSpecificContent(InputStream is, String fileName, String mimeType) throws Exception {
-		PdfReader reader = new PdfReader(is);
-		content.setNumberOfPages(Long.valueOf(reader.getNumberOfPages()));
-		content.setVersion(String.valueOf(reader.getPdfVersion()));
-		HashMap<String, String> fileInfo = reader.getInfo();
-		content.setAuthor(fileInfo.containsKey(AUTHOR)?fileInfo.get(AUTHOR):"n/a");
-		content.setProducer(fileInfo.containsKey(PRODUCER)?fileInfo.get(PRODUCER):"n/a");
-		content.setTitle(fileInfo.containsKey(TITLE)?fileInfo.get(TITLE):"n/a");
-		content.setMimeType(mimeType);
+	public boolean requiresInputStream() {
+		return true;
 	}
 	@Override
+	public void initiliseSpecificContent(InputStream is, String fileName, String mimeType, long size) throws Exception {
+		try {
+			PdfReader reader = new PdfReader(is);
+			content.setNumberOfPages(Long.valueOf(reader.getNumberOfPages()));
+			content.setVersion(String.valueOf(reader.getPdfVersion()));
+			HashMap<String, String> fileInfo = reader.getInfo();
+			content.setAuthor(fileInfo.containsKey(AUTHOR)?fileInfo.get(AUTHOR):"n/a");
+			content.setProducer(fileInfo.containsKey(PRODUCER)?fileInfo.get(PRODUCER):"n/a");
+			content.setTitle(fileInfo.containsKey(TITLE)?fileInfo.get(TITLE):"n/a");
+		} catch (Exception e) {
+			logger.warn("{} is not a valid pdf", fileName, e);
+		}
+		content.setMimeType(mimeType);
+	}
+	@Override
 	public PDFContent getContent() {
 		return content;
 	}

View File

@@ -1,4 +1,4 @@
-package org.gcube.data.access.storagehub.handlers;
+package org.gcube.data.access.storagehub.handlers.items;
 
 import java.lang.reflect.Field;
 import java.net.URL;
@@ -10,13 +10,14 @@ import java.util.Map;
 import java.util.Map.Entry;
 import java.util.Set;
-import javax.inject.Singleton;
+import jakarta.inject.Singleton;
 import javax.jcr.ItemExistsException;
 import javax.jcr.Node;
 import javax.jcr.PathNotFoundException;
 import javax.jcr.RepositoryException;
 import javax.jcr.Value;
+import org.apache.jackrabbit.core.NodeImpl;
 import org.apache.jackrabbit.util.Text;
 import org.apache.jackrabbit.value.BinaryValue;
 import org.apache.jackrabbit.value.BooleanValue;
@@ -31,11 +32,16 @@ import org.gcube.common.storagehub.model.annotations.ListNodes;
 import org.gcube.common.storagehub.model.annotations.MapAttribute;
 import org.gcube.common.storagehub.model.annotations.NodeAttribute;
 import org.gcube.common.storagehub.model.annotations.RootNode;
+import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
 import org.gcube.common.storagehub.model.items.AbstractFileItem;
+import org.gcube.common.storagehub.model.items.FolderItem;
 import org.gcube.common.storagehub.model.items.Item;
+import org.gcube.common.storagehub.model.items.RootItem;
 import org.gcube.common.storagehub.model.types.ItemAction;
 import org.gcube.common.storagehub.model.types.NodeProperty;
+import org.gcube.data.access.storagehub.NodeChildrenFilterIterator;
 import org.gcube.data.access.storagehub.Utils;
+import org.gcube.data.access.storagehub.handlers.ClassHandler;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -44,50 +50,60 @@ public class Item2NodeConverter {
 	private static final Logger logger = LoggerFactory.getLogger(Item2NodeConverter.class);
-	public <T extends Item> Node getNode(Node parentNode, T item){
+	public <T extends RootItem> Node getNode(Node parentNode, T item, String uuid){
 		try {
 			String primaryType = ClassHandler.instance().getNodeType(item.getClass());
-			Node newNode = parentNode.addNode(Text.escapeIllegalJcrChars(item.getName()), primaryType);
-			//newNode.setPrimaryType(primaryType);
-			for (Field field : retrieveAllFields(item.getClass())){
-				if (field.isAnnotationPresent(Attribute.class)){
-					Attribute attribute = field.getAnnotation(Attribute.class);
-					if (attribute.isReadOnly()) continue;
-					field.setAccessible(true);
-					try{
-						//Class<?> returnType = field.getType();
-						logger.debug("creating node - added field {}",field.getName());
-						Values values = getObjectValue(field.getType(), field.get(item));
-						if (values.isMulti()) newNode.setProperty(attribute.value(), values.getValues());
-						else newNode.setProperty(attribute.value(), values.getValue());
-					} catch (Exception e ) {
-						logger.warn("error setting value for attribute {}: {}",attribute.value(), e.getMessage());
-					}
-				} else if (field.isAnnotationPresent(NodeAttribute.class)){
-					NodeAttribute nodeAttribute = field.getAnnotation(NodeAttribute.class);
-					if (nodeAttribute.isReadOnly()) continue;
-					String nodeName = nodeAttribute.value();
-					logger.debug("retrieving field node "+field.getName());
-					field.setAccessible(true);
-					try{
-						Object obj = field.get(item);
-						if (obj!=null)
-							iterateItemNodeAttributeFields(obj, newNode, nodeName);
-					} catch (Exception e ) {
-						logger.warn("error setting value",e);
-					}
-				}
-			}
+			Node newNode = null;
+			if (uuid==null)
+				newNode = parentNode.addNode(Text.escapeIllegalJcrChars(item.getName()), primaryType);
+			else ((NodeImpl)parentNode).addNodeWithUuid(item.getName(), primaryType, uuid);
+			fillNode(newNode, item);
 			return newNode;
 		} catch (RepositoryException e) {
 			logger.error("error writing repository",e);
 			throw new RuntimeException(e);
 		}
 	}
+
+	public <T extends RootItem> Node getNode(Node parentNode, T item){
+		return getNode(parentNode, item, null);
+	}
+
+	private <T extends RootItem> void fillNode(Node newNode, T item) throws RepositoryException {
+		for (Field field : retrieveAllFields(item.getClass())){
+			if (field.isAnnotationPresent(Attribute.class)){
+				Attribute attribute = field.getAnnotation(Attribute.class);
+				if (attribute.isReadOnly()) continue;
+				field.setAccessible(true);
+				try{
+					//Class<?> returnType = field.getType();
+					logger.trace("creating node - added field {}",field.getName());
+					Values values = getObjectValue(field.getType(), field.get(item));
+					if (values.isMulti()) newNode.setProperty(attribute.value(), values.getValues());
+					else newNode.setProperty(attribute.value(), values.getValue());
+				} catch (Exception e ) {
+					logger.warn("error setting value for attribute {}: {}",attribute.value(), e.getMessage());
+				}
+			} else if (field.isAnnotationPresent(NodeAttribute.class)){
+				NodeAttribute nodeAttribute = field.getAnnotation(NodeAttribute.class);
+				if (nodeAttribute.isReadOnly()) continue;
+				String nodeName = nodeAttribute.value();
+				logger.trace("retrieving field node "+field.getName());
+				field.setAccessible(true);
+				try{
+					Object obj = field.get(item);
+					if (obj!=null)
+						iterateItemNodeAttributeFields(obj, newNode, nodeName);
+				} catch (Exception e ) {
+					logger.debug("error setting value",e);
+				}
+			}
+		}
+	}
+
+	@SuppressWarnings("unchecked")
 	private void iterateItemNodeAttributeFields(Object object, Node parentNode, String nodeName) throws Exception{
 		AttributeRootNode attributeRootNode = object.getClass().getAnnotation(AttributeRootNode.class);
@@ -110,10 +126,11 @@ public class Item2NodeConverter {
 				@SuppressWarnings("rawtypes")
 				Class returnType = field.getType();
 				Values values = getObjectValue(returnType, field.get(object));
+				if (values == null) continue;
 				if (values.isMulti()) newNode.setProperty(attribute.value(), values.getValues());
 				else newNode.setProperty(attribute.value(), values.getValue());
 			} catch (Exception e ) {
-				logger.warn("error setting value",e);
+				logger.debug("error setting value",e);
 			}
 		} else if (field.isAnnotationPresent(MapAttribute.class)){
 			//logger.debug("found field {} of type annotated as MapAttribute in class {}", field.getName(), clazz.getName());
@@ -125,11 +142,11 @@ public class Item2NodeConverter {
 				if (values.isMulti()) newNode.setProperty(entry.getKey(), values.getValues());
 				else newNode.setProperty(entry.getKey(), values.getValue());
 			} catch (Exception e ) {
-				logger.warn("error setting value",e);
+				logger.debug("error setting value",e);
 			}
 		} else if (field.isAnnotationPresent(ListNodes.class)){
-			logger.debug("found field {} of type annotated as ListNodes in class {} on node {}", field.getName(), object.getClass().getName(), newNode.getName());
+			logger.trace("found field {} of type annotated as ListNodes in class {} on node {}", field.getName(), object.getClass().getName(), newNode.getName());
 			field.setAccessible(true);
 			List<Object> toSetList = (List<Object>) field.get(object);
@@ -140,13 +157,28 @@ public class Item2NodeConverter {
 				iterateItemNodeAttributeFields(obj,newNode, field.getName()+(i++));
 			}
+		} else if (field.isAnnotationPresent(NodeAttribute.class)){
+			NodeAttribute nodeAttribute = field.getAnnotation(NodeAttribute.class);
+			if (nodeAttribute.isReadOnly()) continue;
+			String subNodeName = nodeAttribute.value();
+			logger.trace("retrieving field node "+field.getName());
+			field.setAccessible(true);
+			try{
+				Object obj = field.get(object);
+				if (obj!=null)
+					iterateItemNodeAttributeFields(obj, newNode, subNodeName);
+			} catch (Exception e ) {
+				logger.debug("error setting value",e);
+			}
 		}
 		}
 	}
 
 	@SuppressWarnings({ "rawtypes" })
-	private Values getObjectValue(Class returnType, Object value) throws Exception{
+	public static Values getObjectValue(Class returnType, Object value) throws Exception{
+		if (value == null) return null;
 		if (returnType.equals(String.class)) return new Values(new StringValue((String) value));
 		if (returnType.isEnum()) return new Values(new StringValue(((Enum) value).toString()));
 		if (returnType.equals(Calendar.class)) return new Values(new DateValue((Calendar) value));
@@ -184,7 +216,8 @@ public class Item2NodeConverter {
 	public <F extends AbstractFileItem> void replaceContent(Node node, F item, ItemAction action){
 		try {
-			node.setPrimaryType(item.getClass().getAnnotation(RootNode.class).value());
+			String primaryType = item.getClass().getAnnotation(RootNode.class).value()[0];
+			node.setPrimaryType(primaryType);
 			Node contentNode = node.getNode(NodeConstants.CONTENT_NAME);
 			contentNode.setPrimaryType(item.getContent().getClass().getAnnotation(AttributeRootNode.class).value());
@@ -192,22 +225,9 @@ public class Item2NodeConverter {
 			node.setProperty(NodeProperty.LAST_MODIFIED_BY.toString(), item.getLastModifiedBy());
 			node.setProperty(NodeProperty.LAST_ACTION.toString(), action.name());
-			for (Field field : retrieveAllFields(item.getContent().getClass())){
-				if (field.isAnnotationPresent(Attribute.class)){
-					Attribute attribute = field.getAnnotation(Attribute.class);
-					if (attribute.isReadOnly()) continue;
-					field.setAccessible(true);
-					try{
-						//Class<?> returnType = field.getType();
-						Values values = getObjectValue(field.getType(), field.get(item.getContent()));
-						if (values.isMulti()) contentNode.setProperty(attribute.value(), values.getValues());
-						else contentNode.setProperty(attribute.value(), values.getValue());
-					} catch (Exception e ) {
-						logger.warn("error setting value for attribute "+attribute.value(),e);
-					}
-				}
-			}
+			replaceContentNodeInternal(contentNode, item.getContent().getClass(), item.getContent());
 		} catch (RepositoryException e) {
 			logger.error("error writing repository",e);
@@ -216,11 +236,76 @@ public class Item2NodeConverter {
 		}
 	}
+	//VALID ONLY FOR CONTENT
+	public void replaceContentNodeInternal(Node node, Class<?> clazz, Object instance) {
+		for (Field field : retrieveAllFields(clazz)){
+			if (field.isAnnotationPresent(Attribute.class)){
+				Attribute attribute = field.getAnnotation(Attribute.class);
+				if (attribute.isReadOnly()) continue;
+				field.setAccessible(true);
+				try{
+					//Class<?> returnType = field.getType();
+					Values values = getObjectValue(field.getType(), field.get(instance));
+					if (values.isMulti()) node.setProperty(attribute.value(), values.getValues());
+					else node.setProperty(attribute.value(), values.getValue());
+				} catch (Exception e ) {
+					logger.debug("error setting value for attribute "+attribute.value(),e);
+				}
+			} else if (field.isAnnotationPresent(NodeAttribute.class)){
+				NodeAttribute nodeAttribute = field.getAnnotation(NodeAttribute.class);
+				if (nodeAttribute.isReadOnly()) continue;
+				String subNodeName = nodeAttribute.value();
+				logger.trace("retrieving field node "+field.getName());
+				field.setAccessible(true);
+				try{
+					Object obj = field.get(instance);
+					if (obj!=null)
+						iterateItemNodeAttributeFields(obj, node, subNodeName);
+				} catch (Exception e ) {
+					logger.debug("error setting value",e);
+				}
+			}
+		}
+	}
+
+	public void updateHidden(Node node, Boolean hidden, String login) throws RepositoryException {
+		Utils.setPropertyOnChangeNode(node, login, ItemAction.UPDATED);
+		node.setProperty(NodeProperty.HIDDEN.toString(), hidden);
+	}
+
+	public void updateDescription(Node node, String description, String login) throws RepositoryException {
+		Utils.setPropertyOnChangeNode(node, login, ItemAction.UPDATED);
+		node.setProperty(NodeProperty.DESCRIPTION.toString(), description);
+	}
+
+	@SuppressWarnings("unchecked")
+	public void updateOwnerOnSubTree(Node node, String owner) throws RepositoryException, BackendGenericError {
+		Class<? extends Item> classToHandle = (Class<? extends Item>) ClassHandler.instance().get(node.getPrimaryNodeType().getName());
+		if (classToHandle==null) return;
+		if (classToHandle.isAssignableFrom(FolderItem.class)) {
+			NodeChildrenFilterIterator iterator = new NodeChildrenFilterIterator(node);
+			while (iterator.hasNext()) {
+				Node nextNode = iterator.next();
+				updateOwnerOnSubTree(nextNode, owner);
+			}
+		}
+		updateOwner(node, owner);
+	}
+
+	public void updateOwner(Node node, String owner) throws RepositoryException {
+		//Utils.setPropertyOnChangeNode(node, login, ItemAction.UPDATED);
+		if (node.hasProperty(NodeProperty.PORTAL_LOGIN.toString()))
+			node.setProperty(NodeProperty.PORTAL_LOGIN.toString(), owner);
+		else logger.debug("cannot set new owner to {}", node.getPath());
+	}
 	public <I extends Item> void updateMetadataNode(Node node, Map<String, Object> meta, String login){
 		try {
 			//TODO: make a method to update item not only metadata, check if the new metadata has an intersection with the old one to remove properties not needed
 			Utils.setPropertyOnChangeNode(node, login, ItemAction.UPDATED);
 			Node metadataNode;
@@ -246,7 +331,7 @@ public class Item2NodeConverter {
 			}
 		} catch (Exception e ) {
-			logger.warn("error setting value",e);
+			logger.debug("error setting value",e);
 		}
 	}
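The converter's core idea throughout these hunks is annotation-driven mapping: walk the item's fields by reflection, read the `@Attribute` annotation, and write each non-null value as a node property (the new `if (value == null) return null;` in `getObjectValue` plus the `if (values == null) continue;` check skip null fields). A minimal self-contained stand-in for that pattern, using a hypothetical `@Attr` annotation and a plain map in place of JCR properties:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the project's @Attribute annotation.
@Retention(RetentionPolicy.RUNTIME)
@interface Attr { String value(); }

class MiniConverter {
    // Walks the object's declared fields and copies each annotated,
    // non-null value into a property map (the real converter writes
    // JCR node properties instead).
    static Map<String, Object> toProperties(Object item) throws IllegalAccessException {
        Map<String, Object> props = new HashMap<>();
        for (Field f : item.getClass().getDeclaredFields()) {
            Attr a = f.getAnnotation(Attr.class);
            if (a == null) continue;          // unannotated fields are ignored
            f.setAccessible(true);
            Object v = f.get(item);
            if (v == null) continue;          // mirrors the new null-skip logic
            props.put(a.value(), v);
        }
        return props;
    }
}

class SampleItem {
    @Attr("jcr:title") String title = "report.pdf";
    @Attr("hl:owner")  String owner = null;   // null: skipped
    String internal = "not mapped";           // no annotation: skipped
}
```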

View File

@@ -0,0 +1,416 @@
package org.gcube.data.access.storagehub.handlers.items;
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import javax.jcr.lock.LockException;
import javax.jcr.version.Version;
import org.apache.commons.compress.archivers.ArchiveEntry;
import org.apache.commons.compress.archivers.ArchiveInputStream;
import org.apache.commons.compress.archivers.ArchiveStreamFactory;
import org.apache.tika.config.TikaConfig;
import org.apache.tika.detect.Detector;
import org.apache.tika.io.TikaInputStream;
import org.apache.tika.metadata.Metadata;
import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.NodeConstants;
import org.gcube.common.storagehub.model.Paths;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.exceptions.IdNotFoundException;
import org.gcube.common.storagehub.model.exceptions.InvalidCallParameters;
import org.gcube.common.storagehub.model.exceptions.InvalidItemException;
import org.gcube.common.storagehub.model.exceptions.ItemLockedException;
import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.common.storagehub.model.items.AbstractFileItem;
import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.storages.MetaInfo;
import org.gcube.common.storagehub.model.storages.StorageBackend;
import org.gcube.common.storagehub.model.storages.StorageBackendFactory;
import org.gcube.common.storagehub.model.types.ItemAction;
import org.gcube.data.access.storagehub.AuthorizationChecker;
import org.gcube.data.access.storagehub.Utils;
import org.gcube.data.access.storagehub.accounting.AccountingHandler;
import org.gcube.data.access.storagehub.handlers.VersionHandler;
import org.gcube.data.access.storagehub.handlers.content.ContentHandler;
import org.gcube.data.access.storagehub.handlers.content.ContentHandlerFactory;
import org.gcube.data.access.storagehub.handlers.items.builders.ArchiveStructureCreationParameter;
import org.gcube.data.access.storagehub.handlers.items.builders.CreateParameters;
import org.gcube.data.access.storagehub.handlers.items.builders.FileCreationParameters;
import org.gcube.data.access.storagehub.handlers.items.builders.FolderCreationParameters;
import org.gcube.data.access.storagehub.handlers.items.builders.GCubeItemCreationParameters;
import org.gcube.data.access.storagehub.handlers.items.builders.URLCreationParameters;
import org.gcube.data.access.storagehub.handlers.plugins.StorageBackendHandler;
import org.glassfish.jersey.media.multipart.FormDataContentDisposition;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Singleton
public class ItemHandler {
@Inject
AccountingHandler accountingHandler;
@Inject
ContentHandlerFactory contenthandlerFactory;
@Inject
AuthorizationChecker authChecker;
@Inject
VersionHandler versionHandler;
@Inject
StorageBackendHandler storageBackendHandler;
// private static ExecutorService executor = Executors.newFixedThreadPool(100);
@Inject
Node2ItemConverter node2Item;
@Inject
Item2NodeConverter item2Node;
private static Logger log = LoggerFactory.getLogger(ItemHandler.class);
// TODO: accounting
// provider URI host
// resourceOwner user
// consumer write
// in caso di versione - update con -delta
public <T extends CreateParameters> String create(T parameters) throws Exception {
Session ses = parameters.getSession();
Node destination;
try {
destination = ses.getNodeByIdentifier(parameters.getParentId());
} catch (RepositoryException inf) {
throw new IdNotFoundException(parameters.getParentId());
}
if (!node2Item.checkNodeType(destination, FolderItem.class))
throw new InvalidItemException("the destination item is not a folder");
authChecker.checkWriteAuthorizationControl(ses, parameters.getUser(), destination.getIdentifier(), true);
try {
Node newNode = switch (parameters.getMangedType()) {
case FILE -> create((FileCreationParameters) parameters, destination);
case FOLDER -> create((FolderCreationParameters) parameters, destination);
case ARCHIVE -> create((ArchiveStructureCreationParameter) parameters, destination);
case URL -> create((URLCreationParameters) parameters, destination);
case GCUBEITEM -> create((GCubeItemCreationParameters) parameters, destination);
default -> throw new InvalidCallParameters("Item not supported");
};
log.debug("item with id {} correctly created", newNode.getIdentifier());
return newNode.getIdentifier();
} finally {
if (parameters.getSession().getWorkspace().getLockManager().isLocked(destination.getPath()))
parameters.getSession().getWorkspace().getLockManager().unlock(destination.getPath());
}
}
private Node create(FolderCreationParameters params, Node destination) throws Exception {
Utils.acquireLockWithWait(params.getSession(), destination.getPath(), false, params.getUser(), 10);
Node newNode = Utils.createFolderInternally(params, accountingHandler, false);
params.getSession().save();
return newNode;
}
private Node create(FileCreationParameters params, Node destination) throws Exception {
Node newNode = createFileItemInternally(params.getSession(), destination, params.getStream(), params.getName(),
params.getDescription(), params.getFileDetails(), params.getUser(), true);
params.getSession().save();
versionHandler.checkinContentNode(newNode);
return newNode;
}
private Node create(URLCreationParameters params, Node destination) throws Exception {
Utils.acquireLockWithWait(params.getSession(), destination.getPath(), false, params.getUser(), 10);
Node newNode = Utils.createURLInternally(params.getSession(), destination, params.getName(), params.getUrl(),
params.getDescription(), params.getUser(), accountingHandler);
params.getSession().save();
return newNode;
}
private Node create(ArchiveStructureCreationParameter params, Node destination) throws Exception {
Utils.acquireLockWithWait(params.getSession(), destination.getPath(), false, params.getUser(), 10);
FolderCreationParameters folderParameters = FolderCreationParameters.builder()
.name(params.getParentFolderName()).author(params.getUser()).on(destination.getIdentifier())
.with(params.getSession()).build();
Node parentDirectoryNode = Utils.createFolderInternally(folderParameters, accountingHandler, false);
params.getSession().save();
try {
if (params.getSession().getWorkspace().getLockManager().isLocked(destination.getPath()))
params.getSession().getWorkspace().getLockManager().unlock(destination.getPath());
} catch (Throwable t) {
log.warn("error unlocking {}", destination.getPath(), t);
}
Set<Node> fileNodes = new HashSet<>();
HashMap<String, Node> directoryNodeMap = new HashMap<>();
ArchiveInputStream input = new ArchiveStreamFactory()
.createArchiveInputStream(new BufferedInputStream(params.getStream()));
ArchiveEntry entry;
while ((entry = input.getNextEntry()) != null) {
String entirePath = entry.getName();
log.debug("reading new entry ------> {} ", entirePath);
if (entry.isDirectory()) {
log.debug("creating directory with entire path {} ", entirePath);
createPath(entirePath, directoryNodeMap, parentDirectoryNode, params.getSession(), params.getUser());
} else
try {
String name = entirePath.replaceAll("([^/]*/)*(.*)", "$2");
String parentPath = entirePath.replaceAll("(([^/]*/)*)(.*)", "$1");
log.debug("creating file with entire path {}, name {}, parentPath {} ", entirePath, name,
parentPath);
Node fileNode = null;
long fileSize = entry.getSize();
FormDataContentDisposition fileDetail = FormDataContentDisposition.name(name).size(fileSize)
.build();
//this wrapper works around a bug in the s3 client (v1) that closes the underlying stream
InputStream notClosableIS = new BufferedInputStream(input) {
@Override
public void close() throws IOException { }
};
if (parentPath.isEmpty()) {
fileNode = createFileItemInternally(params.getSession(), parentDirectoryNode, notClosableIS, name, "",
fileDetail, params.getUser(), false);
} else {
Node parentNode = directoryNodeMap.get(parentPath);
if (parentNode == null)
parentNode = createPath(parentPath, directoryNodeMap, parentDirectoryNode,
params.getSession(), params.getUser());
fileNode = createFileItemInternally(params.getSession(), parentNode, notClosableIS, name, "",
fileDetail, params.getUser(), false);
}
fileNodes.add(fileNode);
} catch (Throwable e) {
log.warn("error getting file {}", entry.getName(), e);
}
}
log.info("archive {} uploading finished ", params.getParentFolderName());
params.getSession().save();
for (Node node : fileNodes)
versionHandler.checkinContentNode(node);
return parentDirectoryNode;
}
private Node createPath(String parentPath, Map<String, Node> directoryNodeMap, Node rootNode, Session ses,
String user) throws StorageHubException, RepositoryException {
String[] parentPathSplit = parentPath.split("/");
String name = parentPathSplit[parentPathSplit.length - 1];
StringBuilder relParentPath = new StringBuilder();
for (int i = 0; i <= parentPathSplit.length - 2; i++)
relParentPath.append(parentPathSplit[i]).append("/");
if (relParentPath.toString().isEmpty()) {
FolderCreationParameters folderParameters = FolderCreationParameters.builder().name(name).author(user)
.on(rootNode.getIdentifier()).with(ses).build();
Node createdNode = Utils.createFolderInternally(folderParameters, accountingHandler, false);
directoryNodeMap.put(name + "/", createdNode);
return createdNode;
} else {
Node relParentNode = directoryNodeMap.get(relParentPath.toString());
if (relParentNode == null) {
relParentNode = createPath(relParentPath.toString(), directoryNodeMap, rootNode, ses, user);
}
FolderCreationParameters folderParameters = FolderCreationParameters.builder().name(name).author(user)
.on(relParentNode.getIdentifier()).with(ses).build();
Node createdNode = Utils.createFolderInternally(folderParameters, accountingHandler, false);
directoryNodeMap.put(relParentPath.append(name).append("/").toString(), createdNode);
return createdNode;
}
}
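createPath keys directoryNodeMap with paths that always carry a trailing slash, splitting off the last segment as the folder name. That splitting logic in isolation (a hypothetical helper for illustration, with a plain pair standing in for the JCR nodes):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

public class PathKeySketch {
    // splits "a/b/c/" into (relParentPath "a/b/", name "c"), mirroring createPath
    static Map.Entry<String, String> split(String parentPath) {
        String[] parts = parentPath.split("/");
        String name = parts[parts.length - 1];
        StringBuilder relParent = new StringBuilder();
        for (int i = 0; i <= parts.length - 2; i++)
            relParent.append(parts[i]).append("/");
        return new SimpleEntry<>(relParent.toString(), name);
    }

    public static void main(String[] args) {
        // a top-level folder has an empty relative parent, hitting the base case
        System.out.println(split("a/b/c/")); // a/b/=c
        System.out.println(split("top/"));   // =top
    }
}
```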
private Node create(GCubeItemCreationParameters params, Node destination) throws Exception {
Utils.acquireLockWithWait(params.getSession(), destination.getPath(), false, params.getUser(), 10);
Node newNode = Utils.createGcubeItemInternally(params.getSession(), destination, params.getItem().getName(),
params.getItem().getDescription(), params.getUser(), params.getItem(), accountingHandler);
params.getSession().save();
return newNode;
}
private Node createFileItemInternally(Session ses, Node destinationNode, InputStream stream, String name,
String description, FormDataContentDisposition fileDetails, String login, boolean withLock)
throws RepositoryException, StorageHubException {
log.trace("UPLOAD: starting preparing file");
Node newNode;
FolderItem destinationItem = node2Item.getItem(destinationNode, Excludes.ALL);
StorageBackendFactory sbf = storageBackendHandler.get(destinationItem.getBackend());
StorageBackend sb = sbf.create(destinationItem.getBackend());
String relativePath = destinationNode.getPath();
String newNodePath = Paths.append(Paths.getPath(destinationNode.getPath()), name).toPath();
log.info("new node path is {}", newNodePath);
if (ses.nodeExists(newNodePath)) {
newNode = ses.getNode(newNodePath);
authChecker.checkWriteAuthorizationControl(ses, login, newNode.getIdentifier(), false);
AbstractFileItem item = fillItemWithContent(stream, sb, name, description, fileDetails, relativePath,
login);
if (withLock) {
try {
ses.getWorkspace().getLockManager().lock(newNode.getPath(), true, true, 0, login);
} catch (LockException le) {
throw new ItemLockedException(le);
}
}
try {
versionHandler.checkoutContentNode(newNode);
log.trace("replacing content of class {}", item.getContent().getClass());
item2Node.replaceContent(newNode, item, ItemAction.UPDATED);
String versionName = null;
try {
Version version = versionHandler.getCurrentVersion(newNode);
versionName = version.getName();
} catch (RepositoryException e) {
log.warn("current version of {} cannot be retrieved", item.getId());
}
accountingHandler.createFileUpdated(item.getTitle(), versionName, ses, newNode, login, false);
ses.save();
} catch (Throwable t) {
log.error("error saving item", t);
} finally {
if (withLock) {
if (ses != null && ses.hasPendingChanges())
ses.save();
ses.getWorkspace().getLockManager().unlock(newNode.getPath());
}
}
} else {
authChecker.checkWriteAuthorizationControl(ses, login, destinationNode.getIdentifier(), true);
AbstractFileItem item = fillItemWithContent(stream, sb, name, description, fileDetails, relativePath,
login);
if (withLock) {
try {
log.debug("trying to acquire lock");
Utils.acquireLockWithWait(ses, destinationNode.getPath(), false, login, 10);
} catch (LockException le) {
throw new ItemLockedException(le);
}
}
try {
newNode = item2Node.getNode(destinationNode, item);
accountingHandler.createEntryCreate(item.getTitle(), ses, newNode, login, false);
ses.save();
} catch (Throwable t) {
log.error("error saving item", t);
throw new BackendGenericError(t);
} finally {
if (withLock)
ses.getWorkspace().getLockManager().unlock(destinationNode.getPath());
}
versionHandler.makeVersionableContent(newNode);
accountingHandler.createFolderAddObj(name, item.getClass().getSimpleName(), item.getContent().getMimeType(),
ses, login, destinationNode, false);
}
// TODO: Utils.updateParentSize()
return newNode;
}
private AbstractFileItem fillItemWithContent(InputStream stream, StorageBackend storageBackend, String name,
String description, FormDataContentDisposition fileDetails, String relPath, String login)
throws BackendGenericError {
log.trace("UPLOAD: filling content");
ContentHandler handler = getContentHandler(stream, storageBackend, name, fileDetails, relPath, login);
return handler.buildItem(name, description, login);
}
private ContentHandler getContentHandler(InputStream stream, StorageBackend storageBackend, String name,
FormDataContentDisposition fileDetails, String relPath, String login) throws BackendGenericError {
log.trace("UPLOAD: handling content");
long start = System.currentTimeMillis();
log.trace("UPLOAD: writing the stream - start");
try {
MetaInfo info = null;
try {
log.debug("UPLOAD: upload on {} - start", storageBackend.getClass());
if (fileDetails != null && fileDetails.getSize() > 0) {
log.debug("UPLOAD: file size is {} bytes", fileDetails.getSize());
info = storageBackend.upload(stream, relPath, name, fileDetails.getSize(), login);
} else
info = storageBackend.upload(stream, relPath, name, login);
log.debug("UPLOAD: upload on storage - stop");
} catch (Throwable e) {
log.error("error writing content", e);
throw e;
}
ContentHandler handler = null;
String mimeType;
log.debug("UPLOAD: reading the mimetype - start");
try (InputStream is1 = new BufferedInputStream(storageBackend.download(info.getStorageId()))) {
org.apache.tika.mime.MediaType mediaType = null;
TikaConfig config = TikaConfig.getDefaultConfig();
Detector detector = config.getDetector();
TikaInputStream tikastream = TikaInputStream.get(is1);
Metadata metadata = new Metadata();
mediaType = detector.detect(tikastream, metadata);
mimeType = mediaType.getBaseType().toString();
handler = contenthandlerFactory.create(mimeType);
log.debug("UPLOAD: reading the mimetype {} - finished in {}", mimeType,
System.currentTimeMillis() - start);
} catch (Throwable e) {
log.error("error retrieving mimeType", e);
throw new RuntimeException(e);
}
if (handler.requiresInputStream())
try (InputStream is1 = new BufferedInputStream(storageBackend.download(info.getStorageId()))) {
log.debug("UPLOAD: the file type requires an input stream");
handler.initiliseSpecificContent(is1, name, mimeType, info.getSize());
}
else {
log.debug("UPLOAD: the file type doesn't require an input stream");
handler.initiliseSpecificContent(name, mimeType);
}
log.debug("UPLOAD: writing the stream - finished in {}", System.currentTimeMillis() - start);
handler.getContent().setData(NodeConstants.CONTENT_NAME);
handler.getContent().setStorageId(info.getStorageId());
handler.getContent().setSize(info.getSize());
handler.getContent().setRemotePath(info.getRemotePath());
handler.getContent().setPayloadBackend(info.getPayloadBackend());
log.debug("UPLOAD: content payload set as {} ", handler.getContent().getPayloadBackend());
return handler;
} catch (Throwable e) {
log.error("error writing file", e);
throw new BackendGenericError(e);
}
}
}
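getContentHandler detects the MIME type with Apache Tika by re-downloading the stored payload. The same idea, sketched with only the JDK's simpler signature-based sniffing (a rough stand-in, not equivalent to Tika's detector):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URLConnection;

public class MimeSniffSketch {
    // guesses a MIME type from the leading bytes of a mark-supporting stream
    static String sniff(byte[] leadingBytes) throws IOException {
        InputStream is = new ByteArrayInputStream(leadingBytes);
        String type = URLConnection.guessContentTypeFromStream(is);
        // fall back to a generic type when the signature is unknown
        return type != null ? type : "application/octet-stream";
    }

    public static void main(String[] args) throws IOException {
        byte[] pngMagic = { (byte) 0x89, 'P', 'N', 'G', 0x0d, 0x0a, 0x1a, 0x0a };
        System.out.println(sniff(pngMagic)); // image/png
    }
}
```

As in the original code, the stream used for detection must be separate from (or resettable relative to) the one whose content is persisted, since sniffing consumes leading bytes.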


@ -0,0 +1,417 @@
package org.gcube.data.access.storagehub.handlers.items;
import java.lang.reflect.Field;
import java.net.URI;
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import javax.jcr.Node;
import javax.jcr.NodeIterator;
import javax.jcr.PathNotFoundException;
import javax.jcr.Property;
import javax.jcr.PropertyIterator;
import javax.jcr.PropertyType;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import javax.jcr.Value;
import javax.jcr.version.Version;
import org.apache.commons.io.IOUtils;
import org.apache.jackrabbit.util.Text;
import org.gcube.common.storagehub.model.annotations.Attribute;
import org.gcube.common.storagehub.model.annotations.AttributeRootNode;
import org.gcube.common.storagehub.model.annotations.ListNodes;
import org.gcube.common.storagehub.model.annotations.MapAttribute;
import org.gcube.common.storagehub.model.annotations.NodeAttribute;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.items.ExternalFolder;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.common.storagehub.model.items.RootItem;
import org.gcube.common.storagehub.model.items.SharedFolder;
import org.gcube.common.storagehub.model.items.TrashItem;
import org.gcube.common.storagehub.model.items.nodes.Content;
import org.gcube.common.storagehub.model.messages.Message;
import org.gcube.data.access.storagehub.Constants;
import org.gcube.data.access.storagehub.handlers.ClassHandler;
import org.gcube.data.access.storagehub.predicates.ItemTypePredicate;
import org.reflections.Configuration;
import org.reflections.Reflections;
import org.reflections.util.ConfigurationBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import jakarta.inject.Singleton;
@Singleton
@SuppressWarnings("rawtypes")
public class Node2ItemConverter {
private static final Logger logger = LoggerFactory.getLogger(Node2ItemConverter.class);
private static HashMap<Class<?>, Map<String, Class>> typeToSubtypeMap = new HashMap<>();
public <T extends Item> T getFilteredItem(Node node, List<String> excludes, ItemTypePredicate itemTypePredicate) throws RepositoryException, BackendGenericError{
@SuppressWarnings("unchecked")
Class<T> classToHandle = (Class<T>)ClassHandler.instance().get(node.getPrimaryNodeType().getName());
if (classToHandle==null) return null;
if (itemTypePredicate != null && !itemTypePredicate.test(classToHandle)) return null;
else return retrieveItem(node, excludes, classToHandle);
}
public <T extends Item> T getItem(String nodeIdentifier, Session session, List<String> excludes) throws RepositoryException, BackendGenericError{
Node node = session.getNodeByIdentifier(nodeIdentifier);
return getItem(node, excludes);
}
public <T extends Item> T getItem(Node node, List<String> excludes) throws RepositoryException, BackendGenericError{
@SuppressWarnings("unchecked")
Class<T> classToHandle = (Class<T>)ClassHandler.instance().get(node.getPrimaryNodeType().getName());
if (classToHandle==null) return null;
T item = retrieveItem(node, excludes, classToHandle);
return item;
}
public Content getContentFromVersion(Version node) throws RepositoryException, BackendGenericError{
Content content = new Content();
setGenericFields(node.getFrozenNode(), Content.class, null, content);
return content;
}
public Message getMessageItem(Node node) throws RepositoryException{
if (!(node.getPrimaryNodeType().getName().equals("nthl:itemSentRequest")
|| node.getPrimaryNodeType().getName().equals("nthl:itemSentRequestSH")))
return null;
Message msg = new Message();
try {
Node attachmentNode = node.getNode(Constants.ATTACHMENTNODE_NAME);
msg.setWithAttachments(attachmentNode.hasNodes());
}catch (Throwable e) {
msg.setWithAttachments(false);
}
setRootItemCommonFields(node, Collections.emptyList(), Message.class, msg);
return msg;
}
private <T extends Item> T retrieveItem(Node node, List<String> excludes, Class<T> classToHandle) throws RepositoryException, BackendGenericError{
T item;
try {
item = classToHandle.getDeclaredConstructor().newInstance();
}catch (Exception e) {
throw new BackendGenericError(e);
}
try {
item.setShared(SharedFolder.class.isInstance(item) ||
hasTypedParent(node, SharedFolder.class));
}catch (Exception e) {
item.setShared(false);
}
try {
item.setTrashed(TrashItem.class.isInstance(item) ||
hasTypedParent(node, TrashItem.class));
}catch (Exception e) {
item.setTrashed(false);
}
try {
item.setExternalManaged(hasTypedParent(node, ExternalFolder.class));
}catch (Exception e) {
item.setExternalManaged(false);
}
item.setLocked(node.isLocked());
setRootItemCommonFields(node, excludes, classToHandle, item);
return item;
}
private <T extends Item> boolean hasTypedParent(Node node, Class<T> parentType) throws BackendGenericError, RepositoryException{
if(node==null) return false;
return checkNodeType(node, parentType) || hasTypedParent(node.getParent(), parentType);
}
private <T extends RootItem> void setRootItemCommonFields(Node node, List<String> excludes, Class<T> classToHandle, T instance) throws RepositoryException{
try{
instance.setParentId(node.getParent().getIdentifier());
instance.setParentPath(node.getParent().getPath());
}catch (Throwable e) {
logger.trace("Root node doesn't have a parent");
}
instance.setRelatedNode(node);
instance.setId(node.getIdentifier());
instance.setName(Text.unescapeIllegalJcrChars(node.getName()));
instance.setPath(Text.unescapeIllegalJcrChars(node.getPath()));
instance.setPrimaryType(node.getPrimaryNodeType().getName());
setGenericFields(node, classToHandle, excludes, instance);
}
private <T> void setGenericFields(Node node, Class<T> classToHandle,List<String> excludes, T instance){
for (Field field : retrieveAllFields(classToHandle)){
if (field.isAnnotationPresent(Attribute.class)){
setAttributeFieldCheckingDefault(field, instance, node);
} else if (field.isAnnotationPresent(NodeAttribute.class)){
String fieldNodeName = field.getAnnotation(NodeAttribute.class).value();
//for now, only first-level nodes are excluded
if (excludes != null && excludes.contains(fieldNodeName)) continue;
logger.trace("retrieving field node {}", field.getName());
field.setAccessible(true);
try{
Node fieldNode = node.getNode(fieldNodeName);
logger.trace("looking in node {} searched with {}",fieldNode.getName(),fieldNodeName);
field.set(instance, iterateNodeAttributeFields(field.getType(), fieldNode));
}catch(PathNotFoundException e){
logger.trace("the current node doesn't contain a {} node", fieldNodeName);
} catch (Exception e ) {
logger.trace("error setting value",e);
}
}
}
}
private <T> T iterateNodeAttributeFields(Class<T> clazz, Node node) throws Exception{
T obj = clazz.getDeclaredConstructor().newInstance();
for (Field field : retrieveAllFields(clazz)){
if (field.isAnnotationPresent(Attribute.class)){
setAttributeFieldCheckingDefault(field, obj, node);
} else if (field.isAnnotationPresent(MapAttribute.class)){
logger.trace("found field {} of type annotated as MapAttribute in class {} and node name {}", field.getName(), clazz.getName(), node.getName());
setMapAttribute(field, obj, node);
} else if (field.isAnnotationPresent(ListNodes.class)){
logger.trace("found field {} of type annotated as ListNodes in class {} on node {}", field.getName(), clazz.getName(), node.getName());
setListNode(field, obj, node);
} else if (field.isAnnotationPresent(NodeAttribute.class)){
logger.trace("found field {} of type annotated as NodeAttribute in class {} on node {}", field.getName(), clazz.getName(), node.getName());
String fieldNodeName = field.getAnnotation(NodeAttribute.class).value();
//for now, only first-level nodes are excluded
//if (excludes!=null && excludes.contains(fieldNodeName)) continue;
logger.trace("retrieving field node {} on field {}", fieldNodeName, field.getName());
field.setAccessible(true);
try{
Node fieldNode = node.getNode(fieldNodeName);
logger.trace("looking in node {} searched with {}",fieldNode.getName(),fieldNodeName);
field.set(obj, iterateNodeAttributeFields(field.getType(), fieldNode));
}catch(PathNotFoundException e){
logger.trace("the current node doesn't contain a {} node", fieldNodeName);
} catch (Exception e ) {
logger.trace("error setting value",e);
}
}
}
return obj;
}
private <T> void setMapAttribute(Field field, T instance, Node node) throws Exception{
field.setAccessible(true);
String exclude = field.getAnnotation(MapAttribute.class).excludeStartWith();
Map<String, Object> mapToset = new HashMap<String, Object>();
PropertyIterator iterator = node.getProperties();
if (iterator!=null) {
while (iterator.hasNext()){
Property prop = iterator.nextProperty();
if (!exclude.isEmpty() && prop.getName().startsWith(exclude)) continue;
try{
logger.trace("adding {} in the map",prop.getName());
mapToset.put(prop.getName(), getPropertyValue(prop));
}catch(PathNotFoundException e){
logger.trace("the property {} is not mapped",prop.getName());
} catch (Exception e ) {
logger.trace("error setting value", e);
}
}
}
field.set(instance, mapToset);
}
@SuppressWarnings("unchecked")
private <T> void setListNode(Field field, T instance, Node node) throws Exception{
field.setAccessible(true);
String exclude = field.getAnnotation(ListNodes.class).excludeTypeStartWith();
String include = field.getAnnotation(ListNodes.class).includeTypeStartWith();
Class listType = field.getAnnotation(ListNodes.class).listClass();
Map<String, Class> subTypesMap = Collections.emptyMap();
if (!typeToSubtypeMap.containsKey(listType)) {
Configuration config = new ConfigurationBuilder().forPackages(listType.getPackage().getName());
Reflections reflections = new Reflections(config);
Set<Class> subTypes = reflections.getSubTypesOf(listType);
if (subTypes.size()>0) {
subTypesMap = new HashMap<>();
for (Class subtype: subTypes)
if (subtype.isAnnotationPresent(AttributeRootNode.class)) {
AttributeRootNode attributeRootNode = (AttributeRootNode)subtype.getAnnotation(AttributeRootNode.class);
subTypesMap.put(attributeRootNode.value(), subtype);
}
} else logger.trace("no subtypes found for {}",listType.getName());
typeToSubtypeMap.put(listType, subTypesMap);
} else {
logger.debug("subtypes already found in cache");
subTypesMap = typeToSubtypeMap.get(listType);
}
List<Object> toSetList = new ArrayList<>();
NodeIterator iterator = node.getNodes();
while (iterator.hasNext()){
Node currentNode = iterator.nextNode();
String primaryType = currentNode.getPrimaryNodeType().getName();
logger.trace("iterating list child node {}", currentNode.getName());
if (!include.isEmpty() && !primaryType.startsWith(include))
continue;
if (!exclude.isEmpty() && primaryType.startsWith(exclude))
continue;
if (subTypesMap.containsKey(primaryType))
toSetList.add(iterateNodeAttributeFields(subTypesMap.get(primaryType), currentNode));
else toSetList.add(iterateNodeAttributeFields(listType, currentNode));
}
if (toSetList.size()!=0) field.set(instance, toSetList);
}
private <T> void setAttributeFieldCheckingDefault(Field field, T instance, Node node) {
Attribute attribute = field.getAnnotation(Attribute.class);
field.setAccessible(true);
try{
Object propValue;
Class<?> returnType = field.getType();
if (node.hasProperty(attribute.value())) {
propValue = getPropertyValue(returnType, node.getProperty(attribute.value()));
if (!attribute.defaultValue().isEmpty() && propValue==null )
propValue = returnType.cast(attribute.defaultValue());
field.set(instance, propValue);
logger.trace("retrieve item - added field {}",field.getName());
} else if (!attribute.defaultValue().isEmpty()){
propValue = returnType.cast(attribute.defaultValue());
field.set(instance, propValue);
logger.trace("retrieve item - setting default for field {}",field.getName());
} else
logger.trace("property not found for field {}",field.getName());
} catch (Exception e ) {
logger.debug("error setting value for property {} ",attribute.value());
}
}
@SuppressWarnings({ "unchecked" })
private Object getPropertyValue(Class returnType, Property prop) throws Exception{
if (returnType.equals(String.class)) return prop.getString();
if (returnType.isEnum()) return Enum.valueOf(returnType, prop.getString());
if (returnType.equals(Calendar.class)) return prop.getDate();
if (returnType.equals(URL.class)) return new URI(prop.getString()).toURL();
if (returnType.equals(Boolean.class) || returnType.equals(boolean.class)) return prop.getBoolean();
if (returnType.equals(Long.class) || returnType.equals(long.class)) return prop.getLong();
if (returnType.equals(Integer.class) || returnType.equals(int.class)) return (int) prop.getLong();
if (returnType.isArray()) {
if (prop.getType()==PropertyType.BINARY) {
byte[] bytes = IOUtils.toByteArray(prop.getBinary().getStream());
return bytes;
} else {
Object[] ret= getArrayValue(prop);
return Arrays.copyOf(ret, ret.length, returnType);
}
}
throw new Exception(String.format("class %s not recognized",returnType.getName()));
}
private static Set<Field> retrieveAllFields(Class<?> clazz){
Set<Field> fields = new HashSet<Field>();
Class<?> currentClass = clazz;
do{
List<Field> fieldsFound = Arrays.asList(currentClass.getDeclaredFields());
fields.addAll(fieldsFound);
}while ((currentClass =currentClass.getSuperclass())!=null);
return fields;
}
private Object getPropertyValue(Property prop) throws Exception{
if (prop.isMultiple()){
Object[] values = new Object[prop.getValues().length];
int i = 0;
for (Value value : prop.getValues())
values[i++] = getSingleValue(value);
return values;
} else
return getSingleValue(prop.getValue());
}
private Object getSingleValue(Value value) throws Exception{
switch (value.getType()) {
case PropertyType.DATE:
return value.getDate();
case PropertyType.BOOLEAN:
return value.getBoolean();
case PropertyType.LONG:
return value.getLong();
default:
return value.getString();
}
}
private Object[] getArrayValue(Property prop) throws Exception{
Object[] values = new Object[prop.getValues().length];
int i = 0;
for (Value value : prop.getValues())
values[i++] = getSingleValue(value);
return values;
}
//Checks if a node is a subtype of classToCompare
public boolean checkNodeType(Node node, Class<? extends Item> classToCompare) throws BackendGenericError{
try {
logger.trace("class from nodetype is {} and class to compare is {}",ClassHandler.instance().get(node.getPrimaryNodeType().getName()), classToCompare);
return classToCompare.isAssignableFrom(ClassHandler.instance().get(node.getPrimaryNodeType().getName()));
//(node.isNodeType(ClassHandler.instance().getNodeType(classToCompare)));
}catch (Throwable e) {
throw new BackendGenericError(e);
}
}
}
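Node2ItemConverter populates items by reflecting over annotated fields and applying declared defaults. A self-contained miniature of that pattern (with a hypothetical `Attr` annotation and a plain `Map` standing in for the JCR node):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.Map;

public class AttributeMappingSketch {
    @Retention(RetentionPolicy.RUNTIME)
    @interface Attr { String value(); String defaultValue() default ""; }

    static class Item {
        @Attr("jcr:title") String title;
        @Attr(value = "jcr:lang", defaultValue = "en") String lang;
    }

    // copies map entries into annotated fields, applying defaults when a key is missing
    static <T> T fill(T instance, Map<String, Object> props) throws Exception {
        for (Field f : instance.getClass().getDeclaredFields()) {
            Attr a = f.getAnnotation(Attr.class);
            if (a == null) continue;
            f.setAccessible(true);
            Object v = props.get(a.value());
            if (v == null && !a.defaultValue().isEmpty()) v = a.defaultValue();
            if (v != null) f.set(instance, v);
        }
        return instance;
    }

    public static void main(String[] args) throws Exception {
        Item i = fill(new Item(), Map.of("jcr:title", "report.pdf"));
        System.out.println(i.title + " / " + i.lang); // report.pdf / en
    }
}
```

The real converter additionally walks the superclass chain (`retrieveAllFields`) and recurses into child nodes for `@NodeAttribute`, `@MapAttribute`, and `@ListNodes` fields.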


@ -0,0 +1,75 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import java.io.InputStream;
import java.util.Objects;
import org.glassfish.jersey.media.multipart.FormDataContentDisposition;
public class ArchiveStructureCreationParameter extends CreateParameters {
String parentFolderName;
FormDataContentDisposition fileDetails;
InputStream stream;
protected ArchiveStructureCreationParameter() {}
public String getParentFolderName() {
return parentFolderName;
}
public FormDataContentDisposition getFileDetails() {
return fileDetails;
}
public InputStream getStream() {
return stream;
}
@Override
protected boolean isValid() {
return Objects.nonNull(parentFolderName) && Objects.nonNull(stream);
}
@Override
public String toString() {
return "ArchiveStructureCreationParameter [parentFolder=" + parentFolderName + ", fileDetails=" + fileDetails
+ ", parentId=" + parentId + ", user=" + user + "]";
}
public static ArchiveCreationBuilder builder() {
return new ArchiveCreationBuilder();
}
public static class ArchiveCreationBuilder extends ItemsParameterBuilder<ArchiveStructureCreationParameter>{
private ArchiveCreationBuilder() {
super(new ArchiveStructureCreationParameter());
}
public ArchiveCreationBuilder parentName(String name) {
parameters.parentFolderName = name;
return this;
}
public ArchiveCreationBuilder stream(InputStream stream) {
parameters.stream = stream;
return this;
}
public ArchiveCreationBuilder fileDetails(FormDataContentDisposition fileDetails) {
parameters.fileDetails = fileDetails;
return this;
}
}
@Override
public ManagedType getMangedType() {
return ManagedType.ARCHIVE;
}
}


@ -0,0 +1,38 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import javax.jcr.Session;
public abstract class CreateParameters {
public enum ManagedType {
FILE,
FOLDER,
ARCHIVE,
URL,
GCUBEITEM;
}
String parentId;
Session session;
String user;
protected CreateParameters() {}
public abstract ManagedType getMangedType();
public String getParentId() {
return parentId;
}
public Session getSession() {
return session;
}
public String getUser() {
return user;
}
protected abstract boolean isValid();
public abstract String toString();
}


@ -0,0 +1,82 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import java.io.InputStream;
import java.util.Objects;
import org.glassfish.jersey.media.multipart.FormDataContentDisposition;
public class FileCreationParameters extends CreateParameters {
String name;
String description;
FormDataContentDisposition fileDetails;
InputStream stream;
protected FileCreationParameters() {}
public String getName() {
return name;
}
public String getDescription() {
return description;
}
public FormDataContentDisposition getFileDetails() {
return fileDetails;
}
public InputStream getStream() {
return stream;
}
@Override
protected boolean isValid() {
return Objects.nonNull(name) && Objects.nonNull(description) && Objects.nonNull(stream);
}
@Override
public String toString() {
return "FileCreationParameters [name=" + name + ", description=" + description + ", fileDetails=" + fileDetails
+ ", parentId=" + parentId + ", user=" + user + "]";
}
public static FileCreationBuilder builder() {
return new FileCreationBuilder();
}
public static class FileCreationBuilder extends ItemsParameterBuilder<FileCreationParameters>{
private FileCreationBuilder() {
super(new FileCreationParameters());
}
public FileCreationBuilder name(String name) {
parameters.name = name;
return this;
}
public FileCreationBuilder description(String description) {
parameters.description = description;
return this;
}
public FileCreationBuilder stream(InputStream stream) {
parameters.stream = stream;
return this;
}
public FileCreationBuilder fileDetails(FormDataContentDisposition fileDetails) {
parameters.fileDetails = fileDetails;
return this;
}
}
@Override
public ManagedType getMangedType() {
return ManagedType.FILE;
}
}


@ -0,0 +1,107 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import java.util.Map;
import java.util.Objects;
import org.gcube.common.storagehub.model.Metadata;
import org.gcube.common.storagehub.model.items.nodes.PayloadBackend;
import org.gcube.common.storagehub.model.plugins.PluginParameters;
public class FolderCreationParameters extends CreateParameters {
private String name;
private String description="";
private boolean hidden = false;
private PayloadBackend backend;
protected FolderCreationParameters() {}
public String getName() {
return name;
}
public String getDescription() {
return description;
}
public boolean isHidden() {
return hidden;
}
public PayloadBackend getBackend() {
return backend;
}
@Override
protected boolean isValid() {
return Objects.nonNull(name) && Objects.nonNull(description);
}
@Override
public String toString() {
return "FolderCreationParameters [name=" + name + ", description=" + description + ", hidden=" + hidden
+ ", parentId=" + parentId + ", session=" + session + ", user=" + user + "]";
}
public static FolderCreationBuilder builder() {
return new FolderCreationBuilder();
}
public static class FolderCreationBuilder extends ItemsParameterBuilder<FolderCreationParameters>{
private FolderCreationBuilder() {
super(new FolderCreationParameters());
}
public FolderCreationBuilder name(String name) {
parameters.name = name;
return this;
}
public FolderCreationBuilder description(String description) {
parameters.description = description;
return this;
}
public BackendCreationBuilder onRepository(String pluginName) {
return new BackendCreationBuilder(pluginName, this);
}
public FolderCreationBuilder hidden(boolean hidden) {
parameters.hidden = hidden;
return this;
}
}
public static class BackendCreationBuilder {
FolderCreationBuilder cb;
String plugin;
protected BackendCreationBuilder(String pluginName, FolderCreationBuilder cb) {
this.cb = cb;
this.plugin = pluginName;
}
public FolderCreationBuilder withParameters(Map<String, Object> params){
this.cb.parameters.backend = new PayloadBackend(plugin, new Metadata(params));
return this.cb;
}
public FolderCreationBuilder withoutParameters(){
this.cb.parameters.backend = new PayloadBackend(plugin, null);
return this.cb;
}
}
@Override
public ManagedType getMangedType() {
return ManagedType.FOLDER;
}
}


@ -0,0 +1,49 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import java.util.Objects;
import org.gcube.common.storagehub.model.items.GCubeItem;
public class GCubeItemCreationParameters extends CreateParameters{
GCubeItem item;
protected GCubeItemCreationParameters() {}
public GCubeItem getItem() {
return item;
}
@Override
protected boolean isValid() {
return Objects.nonNull(item);
}
@Override
public String toString() {
return "GCubeItemCreationParameters [item=" + item + ", parentId=" + parentId + ", user=" + user + "]";
}
public static GCubeItemCreationBuilder builder() {
return new GCubeItemCreationBuilder();
}
public static class GCubeItemCreationBuilder extends ItemsParameterBuilder<GCubeItemCreationParameters>{
private GCubeItemCreationBuilder() {
super(new GCubeItemCreationParameters());
}
public GCubeItemCreationBuilder item(GCubeItem item) {
parameters.item = item;
return this;
}
}
@Override
public ManagedType getMangedType() {
return ManagedType.GCUBEITEM;
}
}


@ -0,0 +1,39 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import java.util.Objects;
import javax.jcr.Session;
import org.gcube.common.storagehub.model.exceptions.InvalidCallParameters;
public abstract class ItemsParameterBuilder<T extends CreateParameters> {
T parameters;
protected ItemsParameterBuilder(T parameters) {
super();
this.parameters = parameters;
}
public ItemsParameterBuilder<T> on(String parentId) {
parameters.parentId = parentId;
return this;
}
public ItemsParameterBuilder<T> author(String author) {
parameters.user = author;
return this;
}
public ItemsParameterBuilder<T> with(Session session){
parameters.session = session;
return this;
}
public T build() throws InvalidCallParameters {
if (!(parameters.isValid() && Objects.nonNull(parameters.parentId)))
throw new InvalidCallParameters("invalid call");
return parameters;
}
}
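A quirk of ItemsParameterBuilder is that the shared methods (on, author, with) return the base builder type, so subtype-specific setters such as name or stream must be chained before them. A minimal stand-alone illustration of that constraint (hypothetical Params/FileBuilder names, not the real classes):

```java
public class BuilderOrderSketch {
    static class Params { String name, parentId; }

    // mirrors ItemsParameterBuilder: shared methods return the base builder type
    static class BaseBuilder<T extends Params> {
        T p;
        BaseBuilder(T p) { this.p = p; }
        BaseBuilder<T> on(String parentId) { p.parentId = parentId; return this; }
        T build() {
            // like the real build(): a missing parent id is an invalid call
            if (p.parentId == null) throw new IllegalStateException("invalid call");
            return p;
        }
    }

    static class FileBuilder extends BaseBuilder<Params> {
        FileBuilder() { super(new Params()); }
        FileBuilder name(String n) { p.name = n; return this; }
    }

    public static void main(String[] args) {
        // subtype-specific setters must come before the shared ones:
        Params ok = new FileBuilder().name("a.txt").on("parent-id").build();
        System.out.println(ok.name + " on " + ok.parentId);
        // new FileBuilder().on("parent-id").name("a.txt")  // would not compile
    }
}
```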


@ -0,0 +1,69 @@
package org.gcube.data.access.storagehub.handlers.items.builders;
import java.net.URL;
import java.util.Objects;
public class URLCreationParameters extends CreateParameters {
String name;
String description;
URL url;
protected URLCreationParameters() {}
public String getName() {
return name;
}
public String getDescription() {
return description;
}
public URL getUrl() {
return url;
}
@Override
protected boolean isValid() {
return Objects.nonNull(name) && Objects.nonNull(description) && Objects.nonNull(url);
}
@Override
public String toString() {
return "URLCreationParameters [name=" + name + ", description=" + description + ", url=" + url + ", parentId="
+ parentId + ", user=" + user + "]";
}
public static URLCreationBuilder builder() {
return new URLCreationBuilder();
}
public static class URLCreationBuilder extends ItemsParameterBuilder<URLCreationParameters>{
private URLCreationBuilder() {
super(new URLCreationParameters());
}
public URLCreationBuilder name(String name) {
parameters.name = name;
return this;
}
public URLCreationBuilder description(String description) {
parameters.description = description;
return this;
}
public URLCreationBuilder url(URL url) {
parameters.url = url;
return this;
}
}
@Override
public ManagedType getMangedType() {
return ManagedType.URL;
}
}


@ -0,0 +1,61 @@
package org.gcube.data.access.storagehub.handlers.plugins;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import org.gcube.common.storagehub.model.exceptions.PluginNotFoundException;
import org.gcube.common.storagehub.model.items.nodes.PayloadBackend;
import org.gcube.common.storagehub.model.storages.StorageBackendFactory;
import org.gcube.common.storagehub.model.storages.StorageNames;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import jakarta.annotation.PostConstruct;
import jakarta.enterprise.inject.Instance;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
@Singleton
public class StorageBackendHandler {
private static Logger log = LoggerFactory.getLogger(StorageBackendHandler.class);
public static PayloadBackend getDefaultPayloadForFolder() {
return new PayloadBackend(StorageNames.DEFAULT_S3_STORAGE, null);
}
@Inject
Instance<StorageBackendFactory> factories;
Map<String, StorageBackendFactory> storagebackendMap = new HashMap<String, StorageBackendFactory>();
@PostConstruct
void init() {
if (factories != null)
for (StorageBackendFactory connector : factories) {
if (storagebackendMap.containsKey(connector.getName())) {
log.error("multiple storage backend with the same name");
throw new RuntimeException("multiple storage backend with the same name");
}
storagebackendMap.put(connector.getName(), connector);
}
else
throw new RuntimeException("storage backend implementation not found");
}
public StorageBackendFactory get(PayloadBackend payload) throws PluginNotFoundException {
if (payload == null)
throw new PluginNotFoundException("no payload backend specified");
return get(payload.getStorageName());
}
public StorageBackendFactory get(String storageName) throws PluginNotFoundException {
if (!storagebackendMap.containsKey(storageName))
throw new PluginNotFoundException(
String.format("implementation for storage %s not found", storageName));
return storagebackendMap.get(storageName);
}
public Collection<StorageBackendFactory> getAllImplementations() {
return storagebackendMap.values();
}
}
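`StorageBackendHandler` is essentially a CDI-populated registry: every discovered `StorageBackendFactory` is indexed by name at startup, and a duplicate name aborts initialization. A standalone sketch of that indexing step (simplified types, not the real gCube interfaces):

```java
// Standalone sketch of the registry pattern used by StorageBackendHandler:
// factories are indexed by name once, duplicates are rejected eagerly.
// (Simplified types; not the actual gCube interfaces.)
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RegistryDemo {
    interface Factory { String getName(); }

    static Map<String, Factory> index(List<Factory> factories) {
        Map<String, Factory> map = new HashMap<>();
        for (Factory f : factories) {
            // putIfAbsent returns the previous mapping, i.e. non-null on a clash
            if (map.putIfAbsent(f.getName(), f) != null)
                throw new RuntimeException("multiple storage backend with the same name: " + f.getName());
        }
        return map;
    }

    public static void main(String[] args) {
        Factory s3 = () -> "default-s3-storage";
        Factory fs = () -> "filesystem";
        System.out.println(index(List.of(s3, fs)).keySet());
    }
}
```

Failing fast at `@PostConstruct` time turns a configuration error into a deployment error instead of a runtime lookup surprise.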


@@ -0,0 +1,51 @@
package org.gcube.data.access.storagehub.handlers.plugins;
import java.io.InputStream;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.common.storagehub.model.items.nodes.Content;
import org.gcube.common.storagehub.model.items.nodes.PayloadBackend;
import org.gcube.common.storagehub.model.storages.MetaInfo;
import org.gcube.common.storagehub.model.storages.StorageBackend;
import org.gcube.common.storagehub.model.storages.StorageBackendFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Singleton
public class StorageOperationMediator {
Logger log = LoggerFactory.getLogger(StorageOperationMediator.class);
@Inject
StorageBackendHandler storageBackendHandler;
public MetaInfo copy(Content source, PayloadBackend destination, String newName, String newParentPath, String login) throws StorageHubException{
log.info("creating Storages for source {} and destination {}", source.getPayloadBackend(), destination.getStorageName());
StorageBackendFactory sourceSBF = storageBackendHandler.get(source.getPayloadBackend());
//TODO: add metadata taken from content node
StorageBackend sourceSB = sourceSBF.create(source.getPayloadBackend());
StorageBackendFactory destSBF = storageBackendHandler.get(destination);
StorageBackend destSB = destSBF.create(destination);
log.info("source factory is {} and destination factory is {}", sourceSBF.getName(), destSBF.getName());
if (sourceSB.equals(destSB)) {
log.info("source and destination are the same storage");
return sourceSB.onCopy(source, newParentPath, newName);
}else {
log.info("source and destination are different storage");
InputStream stream = sourceSB.download(source);
return destSB.upload(stream, newParentPath, newName, source.getSize(), login);
}
}
public boolean move(){
return true;
}
}
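The mediator's `copy` branches on whether source and destination resolve to the same storage: the same backend allows a cheap native copy, while different backends require streaming the payload through the service. A standalone sketch of that decision (simplified, hypothetical interfaces — not the actual gCube API):

```java
// Standalone sketch of the copy decision in StorageOperationMediator:
// same backend -> delegate to the backend's native copy;
// different backends -> download from one and upload to the other.
// (Simplified interfaces; not the actual gCube API.)
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class CopyDemo {
    interface Backend {
        String name();
        default String nativeCopy(String src) { return "native-copy:" + src + "@" + name(); }
        default InputStream download(String src) { return new ByteArrayInputStream(src.getBytes()); }
        default String upload(InputStream in) throws Exception { return "uploaded:" + new String(in.readAllBytes()); }
    }

    static String copy(Backend source, Backend dest, String payload) throws Exception {
        if (source.name().equals(dest.name()))
            return source.nativeCopy(payload);          // same storage: server-side copy
        return dest.upload(source.download(payload));   // different storages: stream through
    }

    public static void main(String[] args) throws Exception {
        Backend s3 = () -> "s3";
        Backend fs = () -> "filesystem";
        System.out.println(copy(s3, s3, "file.txt"));
        System.out.println(copy(s3, fs, "file.txt"));
    }
}
```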


@@ -1,4 +1,4 @@
-package org.gcube.data.access.storagehub.handlers;
+package org.gcube.data.access.storagehub.handlers.vres;
 import java.util.List;
 import java.util.concurrent.ExecutorService;
@@ -9,6 +9,7 @@ import javax.jcr.SimpleCredentials;
 import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
 import org.gcube.common.storagehub.model.items.Item;
+import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -21,11 +22,11 @@ public class VRE {
 private VREQueryRetriever vreQueryRetriever;
 private ExecutorService executor;
-public VRE(Item item, Repository repository, SimpleCredentials credentials, ExecutorService executor) {
+protected VRE(Item item, Repository repository, SimpleCredentials credentials, Node2ItemConverter node2Item, ExecutorService executor) {
 super();
 this.vreFolder = item;
 this.executor = executor;
-vreQueryRetriever = new VREQueryRetriever(repository, credentials, vreFolder);
+vreQueryRetriever = new VREQueryRetriever(repository, credentials, node2Item, vreFolder);
 result = executor.submit(vreQueryRetriever);
 }


@@ -0,0 +1,89 @@
package org.gcube.data.access.storagehub.handlers.vres;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import org.apache.jackrabbit.api.JackrabbitSession;
import org.gcube.common.security.ContextBean;
import org.gcube.common.security.ContextBean.Type;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.exceptions.StorageHubException;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.data.access.storagehub.Constants;
import org.gcube.data.access.storagehub.PathUtil;
import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
import org.gcube.data.access.storagehub.repository.StoragehubRepository;
import org.gcube.data.access.storagehub.services.delegates.GroupManagerDelegate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
@Singleton
public class VREManager {
private static final Logger logger = LoggerFactory.getLogger(VREManager.class);
private Map<String, VRE> vreMap = new HashMap<>();
StoragehubRepository repository = StoragehubRepository.repository;
@Inject
Node2ItemConverter node2Item;
@Inject
PathUtil pathUtil;
@Inject
GroupManagerDelegate groupHandler;
ExecutorService executor = Executors.newFixedThreadPool(5);
private synchronized VRE getVRE(String completeName) {
logger.trace("requesting VRE {}",completeName);
if (vreMap.containsKey(completeName))
return vreMap.get(completeName);
else
return null;
}
private synchronized VRE putVRE(Item vreFolder) {
logger.trace("inserting VRE {}",vreFolder.getTitle());
if (vreMap.containsKey(vreFolder.getTitle())) throw new RuntimeException("something went wrong (vre already present in the map)");
else {
VRE toReturn = new VRE(vreFolder, repository.getRepository(), Constants.JCR_CREDENTIALS, node2Item, executor);
vreMap.put(vreFolder.getTitle(), toReturn);
return toReturn;
}
}
public String retrieveGroupNameFromContext(String context) throws StorageHubException{
ContextBean bean = new ContextBean(context);
if (!bean.is(Type.VRE)) throw new BackendGenericError("the current scope is not a VRE");
String entireScopeName= bean.toString().replaceAll("^/(.*)/?$", "$1").replaceAll("/", "-");
return entireScopeName;
}
public synchronized VRE getVreFolderItemByGroupName(JackrabbitSession ses, String groupName, List<String> excludes ) throws RepositoryException, StorageHubException{
VRE vre = this.getVRE(groupName);
if (vre!=null) return vre;
else {
Node vreFolderNode = groupHandler.getFolderNodeRelatedToGroup(ses, groupName);
Item vreFolder = node2Item.getItem(vreFolderNode, excludes);
return this.putVRE(vreFolder);
}
}
}


@@ -0,0 +1,274 @@
package org.gcube.data.access.storagehub.handlers.vres;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Collections;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.stream.Collectors;
import javax.jcr.Credentials;
import javax.jcr.Node;
import javax.jcr.NodeIterator;
import javax.jcr.Property;
import javax.jcr.Repository;
import javax.jcr.RepositoryException;
import javax.jcr.Session;
import javax.jcr.observation.Event;
import javax.jcr.observation.EventJournal;
import org.gcube.common.storagehub.model.Excludes;
import org.gcube.common.storagehub.model.exceptions.BackendGenericError;
import org.gcube.common.storagehub.model.items.AbstractFileItem;
import org.gcube.common.storagehub.model.items.FolderItem;
import org.gcube.common.storagehub.model.items.Item;
import org.gcube.data.access.storagehub.handlers.items.Node2ItemConverter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class VREQueryRetriever implements Callable<List<Item>> {
private static final Logger logger = LoggerFactory.getLogger(VREQueryRetriever.class);
private static final int CACHE_DIMENSION = 50;
private Repository repository;
private Credentials credentials;
private Item vreFolder;
Map<String, Long> cachedMap = new HashMap<>(CACHE_DIMENSION);
long lastTimestamp =0;
long doTime= 0;
boolean underRedo = false;
Node2ItemConverter node2Item;
private static int TIME_MEASURE = Calendar.MINUTE;
private static int VALUE_TIME_BEFORE = 30;
public VREQueryRetriever(Repository repository, Credentials credentials, Node2ItemConverter node2Item, Item vreFolder) {
super();
this.repository = repository;
this.credentials = credentials;
this.vreFolder = vreFolder;
this.node2Item = node2Item;
}
public List<Item> call() {
Session ses = null;
try {
ses = repository.login(credentials);
Calendar now = Calendar.getInstance();
now.add(TIME_MEASURE, -1*Math.abs(VALUE_TIME_BEFORE));
if (doTime> now.getTimeInMillis() || underRedo) {
logger.debug("executing recents task for {} (cached result)",vreFolder.getTitle());
return correctValues(ses, cachedMap);
}else {
logger.debug("executing recents task for {} (redoing it)",vreFolder.getTitle());
List<Item> toReturn = redo(ses);
doTime = System.currentTimeMillis();
return toReturn;
}
}catch (Exception e) {
logger.error("error preparing recents for folder {}", vreFolder.getTitle(),e);
return Collections.emptyList();
}finally{
if (ses!=null)
ses.logout();
logger.debug("recents task finished");
}
}
public synchronized List<Item> redo(Session ses) {
underRedo = true;
try {
Map<String, Long> tempCachedMap = new HashMap<>(cachedMap);
if (cachedMap.isEmpty())
init(ses, tempCachedMap);
else {
logger.debug("redoing recents for {}",vreFolder.getTitle());
update(ses, tempCachedMap);
}
cachedMap = tempCachedMap;
return correctValues(ses, tempCachedMap);
}finally{
underRedo = false;
}
}
private List<Item> correctValues(Session ses, Map<String, Long> tempCachedMap){
logger.debug("preparing returning values for {}",vreFolder.getTitle());
long start = System.currentTimeMillis();
List<Map.Entry<String, Long>> list = new LinkedList<>(tempCachedMap.entrySet());
list.sort((c1, c2) -> c1.getValue().compareTo(c2.getValue())*-1);
if (list.size()>CACHE_DIMENSION)
for (int index = CACHE_DIMENSION; index < list.size(); index++)
tempCachedMap.remove(list.get(index).getKey());
List<String> cachedIds = list.stream().map(m -> m.getKey()).collect(Collectors.toList());
if (cachedIds.size()>10)
cachedIds = cachedIds.subList(0, 10);
List<Item> result = new ArrayList<>(10);
for (String id: cachedIds) {
try {
Item item = node2Item.getItem(id, ses, Excludes.EXCLUDE_ACCOUNTING);
if (item!=null)
result.add(item);
else logger.warn("item with id {} is null",id);
} catch (BackendGenericError | RepositoryException e) { /* skip items that cannot be converted */ }
}
logger.debug("returning values prepared in {} for {}",System.currentTimeMillis()-start,vreFolder.getTitle());
return result;
}
private void insertItemInTheRightPlace(Item item, Map<String, Long> tempCachedMap) {
long lastModifiedTime = item.getLastModificationTime().getTimeInMillis();
if (tempCachedMap.size()<CACHE_DIMENSION || lastModifiedTime>Collections.min(tempCachedMap.values()))
tempCachedMap.put(item.getId(), lastModifiedTime);
}
private void init(Session ses,Map<String, Long> tempCachedMap){
try {
long start = System.currentTimeMillis();
lastTimestamp = System.currentTimeMillis();
Node vreFolderNode = ses.getNodeByIdentifier(vreFolder.getId());
logger.debug("starting visiting children for {}",vreFolder.getTitle());
visitChildren(vreFolderNode, tempCachedMap);
logger.debug("initializing recents for {} took {}",vreFolder.getTitle(),System.currentTimeMillis()-start);
} catch (Exception e) {
logger.error("error querying vre {}",vreFolder.getTitle(),e);
throw new RuntimeException(e);
}
}
private void update(Session ses, Map<String, Long> tempCachedMap) {
try {
long timestampToUse = lastTimestamp;
lastTimestamp = System.currentTimeMillis();
long start = System.currentTimeMillis();
final String[] types = { "nthl:workspaceLeafItem", "nthl:workspaceItem"};
EventJournal journalChanged = ses.getWorkspace().getObservationManager().getEventJournal(Event.PROPERTY_CHANGED | Event.NODE_REMOVED | Event.NODE_MOVED | Event.NODE_ADDED, vreFolder.getPath(), true, null, types);
journalChanged.skipTo(timestampToUse);
logger.debug("getting the journal took {}",System.currentTimeMillis()-start);
int events = 0;
//TODO: manage hidden nodes
while (journalChanged.hasNext()) {
events++;
Event event = journalChanged.nextEvent();
try {
switch(event.getType()) {
case Event.NODE_ADDED:
if (ses.nodeExists(event.getPath())) {
Node nodeAdded = ses.getNode(event.getPath());
if (nodeAdded.isNodeType("nthl:workspaceLeafItem")) {
logger.debug("node added event received with name {}", nodeAdded.getName());
Item item = node2Item.getItem(nodeAdded, Excludes.ALL);
if (tempCachedMap.get(event.getIdentifier())!=null)
tempCachedMap.remove(event.getIdentifier());
if (item.isHidden())
break;
insertItemInTheRightPlace(item,tempCachedMap);
}
}
break;
case Event.PROPERTY_CHANGED:
if (ses.propertyExists(event.getPath())) {
Property property = ses.getProperty(event.getPath());
if (property.getName().equalsIgnoreCase("jcr:lastModified")) {
logger.debug("event property changed on {} with value {} and parent {}",property.getName(), property.getValue().getString(), property.getParent().getPath());
String identifier = property.getParent().getIdentifier();
tempCachedMap.remove(identifier);
Item item = node2Item.getItem(property.getParent(), Excludes.ALL);
if (item.isHidden())
break;
insertItemInTheRightPlace(item, tempCachedMap);
}
}
break;
case Event.NODE_REMOVED:
logger.trace("node removed event received for id {}", event.getIdentifier());
if (tempCachedMap.get(event.getIdentifier())!=null &&
tempCachedMap.get(event.getIdentifier())<event.getDate())
tempCachedMap.remove(event.getIdentifier());
break;
case Event.NODE_MOVED:
Node nodeMoved = ses.getNode(event.getPath());
logger.trace("node moved event received with type {}", nodeMoved.getPrimaryNodeType());
if (nodeMoved.isNodeType("nthl:workspaceLeafItem")) {
logger.debug("event node moved on {} with path {}",nodeMoved.getName(), nodeMoved.getPath());
String identifier = nodeMoved.getIdentifier();
String nodePath = ses.getNode(identifier).getPath();
if (tempCachedMap.get(event.getIdentifier())!=null &&
!nodePath.startsWith(vreFolder.getPath()))
tempCachedMap.remove(event.getIdentifier());
}
break;
default:
break;
}
}catch (Exception e) {
logger.warn("error handling event {}",event.getType(),e);
}
}
logger.trace("retrieving event took {} with {} events",System.currentTimeMillis()-start, events);
} catch (Exception e) {
logger.error("error getting events for vre {}",vreFolder.getTitle(),e);
throw new RuntimeException(e);
}
}
private void visitChildren(Node node, Map<String, Long> tempCachedMap) throws Exception{
NodeIterator nodeIt = node.getNodes();
while(nodeIt.hasNext()) {
Node child = nodeIt.nextNode();
Item item = node2Item.getItem(child, Excludes.ALL);
if (item==null || item.isHidden()) continue;
if (item instanceof FolderItem)
visitChildren(child,tempCachedMap);
else if(item instanceof AbstractFileItem)
insertItemInTheRightPlace(item,tempCachedMap);
}
}
/* @Override
public void onEvent(EventIterator events) {
logger.trace("on event called");
while (events.hasNext()) {
Event event = events.nextEvent();
try {
logger.trace("new event received of type {} on node {}",event.getType(),event.getIdentifier());
} catch (RepositoryException e) {
logger.error("error reading event",e);
}
}
}*/
}
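`VREQueryRetriever` keeps its "recents" as a bounded map from item id to last-modification timestamp: an item is admitted only if the map has room or it is newer than the current minimum, and after sorting the map is trimmed back to the cache size. A self-contained sketch of that eviction logic (simplified, hypothetical names; a tiny cache size is used for illustration):

```java
// Standalone sketch of the bounded "recents" cache kept by VREQueryRetriever:
// admit an item if there is room or it beats the oldest entry, then trim the
// map back to CACHE_DIMENSION after sorting newest-first.
import java.util.*;

public class RecentsCacheDemo {
    static final int CACHE_DIMENSION = 3;  // 50 in the real class

    static void insert(Map<String, Long> cache, String id, long lastModified) {
        // only worth caching if there is room or it is newer than the minimum
        if (cache.size() < CACHE_DIMENSION || lastModified > Collections.min(cache.values()))
            cache.put(id, lastModified);
    }

    static List<String> mostRecent(Map<String, Long> cache) {
        List<Map.Entry<String, Long>> list = new LinkedList<>(cache.entrySet());
        list.sort((c1, c2) -> c2.getValue().compareTo(c1.getValue()));  // newest first
        while (list.size() > CACHE_DIMENSION)                           // evict the oldest
            cache.remove(list.remove(list.size() - 1).getKey());
        List<String> ids = new ArrayList<>();
        for (Map.Entry<String, Long> e : list) ids.add(e.getKey());
        return ids;
    }

    public static void main(String[] args) {
        Map<String, Long> cache = new HashMap<>();
        insert(cache, "a", 10); insert(cache, "b", 20);
        insert(cache, "c", 30); insert(cache, "d", 40);  // "d" beats the minimum
        System.out.println(mostRecent(cache));
    }
}
```

The real class additionally replays the JCR event journal to keep the map fresh instead of re-walking the whole VRE folder.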


@@ -0,0 +1,41 @@
package org.gcube.data.access.storagehub.health;
import org.gcube.common.health.api.HealthCheck;
import org.gcube.common.health.api.ReadinessChecker;
import org.gcube.common.health.api.response.HealthCheckResponse;
import org.gcube.common.storagehub.model.items.nodes.PayloadBackend;
import org.gcube.data.access.storagehub.handlers.plugins.StorageBackendHandler;
import org.gcube.data.access.storagehub.storage.backend.impl.GcubeDefaultS3StorageBackendFactory;
import org.gcube.data.access.storagehub.storage.backend.impl.S3Backend;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@ReadinessChecker
public class DefaultStorageCheck implements HealthCheck{
private static Logger log = LoggerFactory.getLogger(DefaultStorageCheck.class);
PayloadBackend defaultPayload = StorageBackendHandler.getDefaultPayloadForFolder();
@Override
public String getName() {
return String.format("default storage (%s)",defaultPayload.getStorageName());
}
@Override
public HealthCheckResponse check() {
try {
GcubeDefaultS3StorageBackendFactory storageFactory =new GcubeDefaultS3StorageBackendFactory();
storageFactory.init();
if (((S3Backend)storageFactory.create(defaultPayload)).isAlive())
return HealthCheckResponse.builder(getName()).up().build();
else
return HealthCheckResponse.builder(getName()).down().error("error contacting storage").build();
} catch (Exception e) {
log.error("error checking defaultStorage",e);
return HealthCheckResponse.builder(getName()).down().error(e.getMessage()).build();
}
}
}


@@ -0,0 +1,35 @@
package org.gcube.data.access.storagehub.health;
import javax.jcr.LoginException;
import javax.jcr.Session;
import org.gcube.common.health.api.HealthCheck;
import org.gcube.common.health.api.ReadinessChecker;
import org.gcube.common.health.api.response.HealthCheckResponse;
import org.gcube.data.access.storagehub.repository.StoragehubRepository;
@ReadinessChecker
public class JCRRepositoryCheck implements HealthCheck{
@Override
public String getName() {
return "Jackrabbit repository";
}
@Override
public HealthCheckResponse check() {
try {
Session session = StoragehubRepository.repository.getRepository().login();
if (session != null) session.logout();
} catch (LoginException e) {
// anonymous login can be refused while the repository is still reachable
} catch (Throwable ex) {
return HealthCheckResponse.builder(getName()).down().error(ex.getMessage()).build();
}
return HealthCheckResponse.builder(getName()).up().build();
}
}

Some files were not shown because too many files have changed in this diff