orcid-no-doi #43
@@ -128,10 +128,7 @@ public class ActivitiesDumpReader {
				}
			}
		} catch (Exception e) {
			Log
				.warn(
					"Parsing work from tar archive and xml work: " + filename + " " + e.getMessage());
			// Log.warn(e);
			throw new Exception(filename, e);
		}

		if ((counter % XML_WORKS_PARSED_COUNTER_LOG_INTERVAL) == 0) {

@@ -143,7 +140,7 @@ public class ActivitiesDumpReader {
			}
		}
	} catch (IOException e) {
	} catch (Exception e) {
		Log.warn("Parsing work from gzip archive: " + e.getMessage());
		Log.warn(e);
		throw new RuntimeException(e);
What is the reason for neither handling this exception nor letting it propagate? I imagine a malformed entry in the tar file could cause it, but in that case we should interrupt the procedure and investigate further to spot the error. As it stands, the error would likely go unnoticed while causing a drop in the number of output records.
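To make the concern concrete, here is a minimal sketch of the two alternatives, assuming the loop structure suggested by the diff context (iteration over tar entries, a per-entry try/catch, counters, and the mortbay Log used elsewhere in the class). The class and method names (WorksTarReaderSketch, parseWorkEntry, readWorks) and the counters are hypothetical illustrations, not code from the PR.

```java
import java.io.IOException;
import java.io.InputStream;

import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.mortbay.log.Log;

public class WorksTarReaderSketch {

	// Hypothetical stand-in for the per-entry XML parsing done by the reader.
	private static void parseWorkEntry(TarArchiveInputStream tais, String filename) throws Exception {
		// ... parse the XML work record contained in this entry ...
	}

	public static void readWorks(InputStream gzipInputStream) throws IOException {
		int parsed = 0;
		int failed = 0;
		try (TarArchiveInputStream tais = new TarArchiveInputStream(gzipInputStream)) {
			TarArchiveEntry entry;
			while ((entry = tais.getNextTarEntry()) != null) {
				String filename = entry.getName();
				try {
					parseWorkEntry(tais, filename);
					parsed++;
				} catch (Exception e) {
					// Option A: fail fast, so a malformed entry interrupts the procedure
					// and can be analysed, instead of silently reducing the output.
					throw new RuntimeException("Malformed work entry: " + filename, e);

					// Option B (if skipping bad entries is acceptable): make the loss explicit.
					// failed++;
					// Log.warn("Skipping malformed work entry: " + filename + " " + e.getMessage());
					// Log.warn(e);
				}
			}
		}
		Log.info("Works parsed: " + parsed + ", failed: " + failed);
	}
}
```

Either way, a malformed archive entry no longer disappears silently: option A interrupts the run so the input can be inspected, while option B at least makes the number of dropped records visible in the logs.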