Description of the Module

This module (dhp-aggregation) defines a collector worker application that runs on Hadoop.

It is responsible for harvesting metadata using different plugins.

The collector worker uses a message queue to report the progress of the harvesting action (sending ONGOING messages). At the end of the job it also reports some information about the status of the collection, e.g. the number of records collected (sending REPORT messages).
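For illustration only, the sketch below shows how such a progress message could be published to one of these queues with the RabbitMQ Java client (assuming amqp-client 5.x). The queue name, the JSON payload, and the class name are assumptions: the worker's actual message classes and payload format are not described in this README.

```java
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import java.nio.charset.StandardCharsets;

public class OngoingMessageSketch {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("rabbit.example.org");  // rabbitHost
        factory.setUsername("collector");       // rabbitUser
        factory.setPassword("secret");          // rabbitPassWord

        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {
            String queue = "ongoing_queue";     // rabbitOngoingQueue
            channel.queueDeclare(queue, false, false, false, null);
            // hypothetical payload: the real message format is defined by the worker
            String body = "{\"workflowId\":\"wf-123\",\"status\":\"ONGOING\",\"count\":500}";
            channel.basicPublish("", queue, null, body.getBytes(StandardCharsets.UTF_8));
        }
    }
}
```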

To run, the collector worker needs the following parameters:

  • hdfsPath: the path where the sequence file is stored
  • apidescriptor: the JSON encoding of the API Descriptor
  • namenode: the Name Node URI
  • userHDFS: the user that creates the HDFS sequence file
  • rabbitUser: the user used to connect to RabbitMQ for messaging
  • rabbitPassWord: the password used to connect to RabbitMQ for messaging
  • rabbitHost: the host of the RabbitMQ server
  • rabbitOngoingQueue: the name of the ongoing queue
  • rabbitReportQueue: the name of the report queue
  • workflowId: the identifier of the dnet Workflow
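As an illustration of how hdfsPath and namenode come into play, the following sketch appends (identifier, metadata) pairs to an HDFS sequence file using the plain Hadoop API. The paths, hostnames, and record layout are assumptions for the example, not the worker's actual implementation.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode.example.org:8020"); // namenode
        Path target = new Path("/tmp/collection/records.seq");        // hdfsPath

        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(target),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(Text.class))) {
            // each harvested record stored as (identifier, metadata) -- illustrative only
            writer.append(new Text("oai:example.org:1"), new Text("<record>...</record>"));
        }
    }
}
```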

Plugins

  • OAI Plugin
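As a rough sketch of what an OAI plugin does, the example below issues a single OAI-PMH ListRecords request (assuming Java 11+). The endpoint and metadataPrefix are hypothetical; in practice they would come from the apidescriptor, and the real plugin also follows resumption tokens to page through the full result set.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OaiListRecordsSketch {
    public static void main(String[] args) throws Exception {
        String baseUrl = "https://repository.example.org/oai"; // hypothetical endpoint
        URI uri = URI.create(baseUrl + "?verb=ListRecords&metadataPrefix=oai_dc");

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // the response body is an XML document containing <record> elements
        // (and possibly a <resumptionToken> pointing to the next page)
        System.out.println(response.body());
    }
}
```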

Usage

TODO