updated documentation on how to use

This commit is contained in:
Sandro La Bruzzo 2022-10-21 11:12:24 +02:00
parent f36f9c4fda
commit 1977b9e732
1 changed file with 11 additions and 0 deletions


@@ -25,4 +25,15 @@ This script does the following:
- upload all the dependency JARs, found by checking the pom for dependencies preceded by the comment `<!-- JAR NEED -->`
- submit the Spark job, so you can watch the output log directly on your machine
## Class convention
To create a new Scala Spark class, extend _AbstractScalaApplication_ in the package `com.sandro.app`.
Implement the method `run`, where the Spark context is already initialized.
Then define a singleton object NameOfTheClass and, in its main method, run the code:
`new YOURCLASS(args,logger).initialize().run()`
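For example, a minimal sketch of such a class might look like the following. The class name `WordCountApp`, the constructor signature of _AbstractScalaApplication_, and the `spark` field it exposes are assumptions inferred from the convention above, not the actual API of this repository:

```scala
package com.sandro.app

import org.apache.spark.sql.SparkSession
import org.slf4j.{Logger, LoggerFactory}

// Hypothetical example: the constructor signature and the `spark` field
// are assumed from the invocation convention `new YOURCLASS(args, logger).initialize().run()`.
class WordCountApp(args: Array[String], logger: Logger)
    extends AbstractScalaApplication(args, logger) {

  // `run` is called after `initialize()`, with the Spark session already set up
  // (assumed here to be available as `spark` from AbstractScalaApplication).
  override def run(): Unit = {
    val input = spark.read.textFile(args(0))
    val counts = input.rdd
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1L))
      .reduceByKey(_ + _)
    counts.take(10).foreach { case (word, n) => logger.info(s"$word -> $n") }
  }
}

// Companion singleton object whose main method follows the convention above.
object WordCountApp {
  val logger: Logger = LoggerFactory.getLogger(getClass)

  def main(args: Array[String]): Unit = {
    new WordCountApp(args, logger).initialize().run()
  }
}
```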
That's AWESOME!