UrlsController

The Controller app of the PDF Aggregation Service.

The Controller application receives requests from the Workers, constructs an assignments-list with data retrieved from a database, and returns the list to the Workers.
Then, it receives the "WorkerReports", requests the full-text files from the Workers in batches, and uploads them to the S3 Object-Store. Finally, it writes the related reports, along with the updated file-locations, into the database.

It can also process Bulk-Import requests from compatible data sources; in that case, it receives the full-text files directly, without offloading crawling jobs to the Workers.
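
As a rough illustration of the data exchanged in this workflow, the sketch below uses simplified, hypothetical stand-ins for the assignment and worker-report payloads; the project's actual DTO classes, field names and endpoints may differ.

    // Hypothetical, simplified stand-ins for the exchanged payloads; they are
    // illustrative only and do not mirror the project's actual classes.
    import java.util.List;

    record Assignment(long id, String originalUrl, String datasourceId) { }

    record Payload(String recordId, String fileLocation, String fileHash, long fileSize) { }

    record WorkerReport(String workerId, long assignmentsRequestCounter, List<Payload> payloads) { }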

For interacting with the database we use Impala.
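
A minimal sketch of such an interaction is shown below; it assumes the Impala JDBC driver is available on the classpath and uses a hypothetical host, port, database and table, none of which are taken from the project's configuration.

    // Minimal JDBC sketch; the URL, database and table names are illustrative
    // assumptions, and the Impala JDBC driver must be on the classpath.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ImpalaQueryExample {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:impala://impala-host:21050/pdfaggregation_db"; // hypothetical
            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT count(id) FROM payload")) { // hypothetical table
                if (rs.next())
                    System.out.println("Number of payloads: " + rs.getLong(1));
            }
        }
    }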

Statistics API:

  • "getNumberOfAllPayloads" endpoint: http://<IP>:/api/stats/getNumberOfAllPayloads
    This endpoint returns the total number of payloads existing in the database, independently of the way they were aggregated. This includes the payloads created by other pieces of software, before the PDF-Aggregation-Service was created.
  • "getNumberOfPayloadsAggregatedByService" endpoint: http://<IP>:/api/stats/getNumberOfPayloadsAggregatedByService
    This endpoint returns the number of payloads aggregated by the PDF-Aggregated-Service itself. It excludes the payloads aggregated by other methods, by applying a Date-filter for the records created in 2021 or later.
  • "getNumberOfPayloadsForDatasource" endpoint: http://<IP>:/api/stats/getNumberOfPayloadsForDatasource?datasourceId=<givenDatasourceId>
    This endpoint returns the number of payloads which belong to the datasource specified by the given datasourceID.
  • "getNumberOfRecordsInspected" endpoint: http://<IP>:/api/stats/getNumberOfRecordsInspected
    This endpoint returns the number of records inspected by the PDF-Aggregation-Service.
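
For example, the statistics endpoints can be queried programmatically; the sketch below uses Java's built-in HttpClient and a placeholder host and port:

    // Queries the "getNumberOfAllPayloads" endpoint; "localhost:1880" is a
    // placeholder and should be replaced with the Controller's actual IP and port.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class StatsClientExample {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:1880/api/stats/getNumberOfAllPayloads"))
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Total payloads: " + response.body());
        }
    }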

To install and run the application:

  • Run git clone and then cd UrlsController.
  • Set the preferred values inside the application.properties file.
  • Execute the installAndRun.sh script, which builds and runs the app.
    If you want to just run the app, run the script with the argument "1": ./installAndRun.sh 1.
    If you want to build and run the app in a Docker container, run the script with the argument "0", followed by the argument "1": ./installAndRun.sh 0 1.

Implementation notes:

  • For transferring the full-text files, we use Facebook's Zstandard compression algorithm, which offers significant gains in both compression ratio and speed (a minimal usage sketch follows this list).
  • The uploaded full-text files follow this naming-scheme: "datasourceID/recordID::fileHash.pdf"
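
Below is a minimal sketch of such a compression round-trip; it assumes the zstd-jni bindings (com.github.luben:zstd-jni), which may differ from the Zstandard binding actually used by the project.

    // Compresses and decompresses a byte array with Zstandard, via the assumed
    // zstd-jni bindings; shown only to illustrate the compression step.
    import com.github.luben.zstd.Zstd;
    import java.nio.charset.StandardCharsets;

    public class ZstdExample {
        public static void main(String[] args) {
            byte[] original = "some full-text content".getBytes(StandardCharsets.UTF_8);

            byte[] compressed = Zstd.compress(original);
            byte[] restored = Zstd.decompress(compressed, original.length);

            System.out.println("original: " + original.length + " bytes, compressed: " + compressed.length + " bytes");
            System.out.println(new String(restored, StandardCharsets.UTF_8));
        }
    }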