updated documentation on how to use

commit `1977b9e732`, parent `f36f9c4fda`, 11 lines changed in `README.md`

@@ -25,4 +25,15 @@ This script does the following:
- upload all the dependency jars, by checking in the pom for those preceded by the comment `<!-- JAR NEED -->`
- submit the spark-job; you can then watch the output log directly on your machine
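For illustration, the `<!-- JAR NEED -->` marking convention above might look like this in the `pom.xml` (the `spark-sql` artifact and version here are just example values, not taken from this project):

```xml
<dependencies>
  <!-- JAR NEED -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.5.1</version>
  </dependency>
  <!-- dependencies without the marker comment are not uploaded -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>2.0.13</version>
  </dependency>
</dependencies>
```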
## Class convention
To create a new Scala Spark class, extend _AbstractScalaApplication_ in the package `com.sandro.app`.

Implement the method `run`; the Spark context is already initialized when it is called.
Then define a singleton object `NameOfTheClass` and, in its `main` method, run:
`new YOURCLASS(args,logger).initialize().run()`
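Putting the convention together, a minimal sketch might look like the following. Note this is illustrative only: the real `AbstractScalaApplication` lives in `com.sandro.app`, and its constructor and internals are assumptions here; `WordCountApp` is a hypothetical job name.

```scala
package com.sandro.app

import org.apache.spark.sql.SparkSession
import org.slf4j.{Logger, LoggerFactory}

// Assumed shape of the base class: initialize() wires up the SparkSession
// and returns `this`, so it can be chained with run().
abstract class AbstractScalaApplication(args: Array[String], logger: Logger) {
  protected var spark: SparkSession = _

  def initialize(): this.type = {
    spark = SparkSession.builder().appName(getClass.getSimpleName).getOrCreate()
    this
  }

  // Implement the job here; `spark` is already initialized when run() is called.
  def run(): Unit
}

// A concrete Spark job following the convention (hypothetical example).
class WordCountApp(args: Array[String], logger: Logger)
    extends AbstractScalaApplication(args, logger) {
  override def run(): Unit = {
    val lines = spark.read.textFile(args(0))
    logger.info(s"Lines read: ${lines.count()}")
  }
}

// Singleton object whose main method boots the job, as described above.
object WordCountApp {
  def main(args: Array[String]): Unit = {
    val logger = LoggerFactory.getLogger(classOf[WordCountApp])
    new WordCountApp(args, logger).initialize().run()
  }
}
```

The `initialize().run()` chain works because `initialize()` returns `this.type`, so the concrete class's `run` is invoked on a fully set-up application.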
That's AWESOME!