Commit d17e5f2

dongjoon-hyun authored and shivaram committed
[SPARK-16233][R][TEST] ORC test should be enabled only when HiveContext is available.
## What changes were proposed in this pull request?

ORC test should be enabled only when HiveContext is available.

## How was this patch tested?

Manual.

```
$ R/run-tests.sh
...
1. create DataFrame from RDD (test_sparkSQL.R#200) - Hive is not build with SparkSQL, skipped
2. test HiveContext (test_sparkSQL.R#1021) - Hive is not build with SparkSQL, skipped
3. read/write ORC files (test_sparkSQL.R#1728) - Hive is not build with SparkSQL, skipped
4. enableHiveSupport on SparkSession (test_sparkSQL.R#2448) - Hive is not build with SparkSQL, skipped
5. sparkJars tag in SparkContext (test_Windows.R#21) - This test is only for Windows, skipped
DONE ===========================================================================
Tests passed.
```

Author: Dongjoon Hyun <[email protected]>

Closes apache#14019 from dongjoon-hyun/SPARK-16233.
1 parent d601894 commit d17e5f2

File tree

1 file changed, +2 -0 lines changed

R/pkg/inst/tests/testthat/test_sparkSQL.R

```diff
@@ -1725,6 +1725,7 @@ test_that("mutate(), transform(), rename() and names()", {
 })

 test_that("read/write ORC files", {
+  setHiveContext(sc)
   df <- read.df(jsonPath, "json")

   # Test write.df and read.df
@@ -1741,6 +1742,7 @@ test_that("read/write ORC files", {
   expect_equal(count(orcDF), count(df))

   unlink(orcPath2)
+  unsetHiveContext()
 })

 test_that("read/write Parquet files", {
```

0 commit comments