Use SparkSQL in place of Hive for executing step16-createIndicatorsTables.sql of stats update wf #386

Merged
claudio.atzori merged 2 commits from stats_with_spark_sql into beta 3 months ago
No description was provided for this pull request.
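The title indicates that the statements in step16-createIndicatorsTables.sql are now executed through Spark SQL rather than through a Hive action. The PR itself does not show the implementation here, so the following Scala sketch is only an illustration of the general approach, assuming a hypothetical SqlScriptRunner that reads the script and submits each statement to a Hive-enabled SparkSession; the class name, app name, and the naive statement splitting are all assumptions, not the actual code of this change.

import org.apache.spark.sql.SparkSession
import scala.io.Source

// Hypothetical sketch: run the statements of a Hive SQL script through Spark SQL
// instead of submitting the file to HiveServer2.
object SqlScriptRunner {

  def main(args: Array[String]): Unit = {
    val scriptPath = args(0) // e.g. the path to step16-createIndicatorsTables.sql

    val spark = SparkSession.builder()
      .appName("step16-createIndicatorsTables")
      .enableHiveSupport() // keep access to the tables registered in the Hive metastore
      .getOrCreate()

    // Split the script on ';' and execute each non-empty statement in order.
    // This is a simplification: it ignores semicolons inside string literals or comments.
    val source = Source.fromFile(scriptPath)
    try {
      source.mkString
        .split(";")
        .map(_.trim)
        .filter(_.nonEmpty)
        .foreach(stmt => spark.sql(stmt))
    } finally {
      source.close()
    }

    spark.stop()
  }
}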
antonis.lempesis was assigned by giambattista.bloisi 3 months ago
giambattista.bloisi added 1 commit 3 months ago
giambattista.bloisi requested review from claudio.atzori 3 months ago
claudio.atzori added 1 commit 3 months ago
claudio.atzori merged commit f804c58bc7 into beta 3 months ago
claudio.atzori deleted branch stats_with_spark_sql 3 months ago

Reviewers

claudio.atzori was requested for review 3 months ago
The pull request has been merged as f804c58bc7. The equivalent command line steps are listed below.

Step 1:

From your project repository, check out a new branch and test the changes.
git checkout -b stats_with_spark_sql beta
git pull origin stats_with_spark_sql

Step 2:

Merge the changes and update on Gitea.
git checkout beta
git merge --no-ff stats_with_spark_sql
git push origin beta

Reference: D-Net/dnet-hadoop#386