Binary file added docs/source/contributor-guide/img.png
10 changes: 9 additions & 1 deletion docs/source/contributor-guide/spark-sql-tests.md
@@ -65,7 +65,7 @@ ENABLE_COMET=true build/sbt "hive/testOnly * -- -l org.apache.spark.tags.ExtendedHiveTest"
ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.ExtendedHiveTest"
ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.SlowHiveTest"
```
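For reference, the ScalaTest `-n` flag runs only tests carrying the named tag, while `-l` excludes them. Below is a minimal sketch of how such a tag attaches to a test, assuming a stand-alone ScalaTest suite; the suite and tag object names are illustrative, only the tag string comes from the commands above, and Spark itself typically applies these tags as annotations on whole suites.
```scala
import org.scalatest.Tag
import org.scalatest.funsuite.AnyFunSuite

// Illustrative stand-in for the tag name referenced by the commands above.
object SlowHive extends Tag("org.apache.spark.tags.SlowHiveTest")

class TaggedExampleSuite extends AnyFunSuite {
  // Selected by `-n org.apache.spark.tags.SlowHiveTest`,
  // skipped by `-l org.apache.spark.tags.SlowHiveTest`.
  test("a slow Hive test", SlowHive) {
    assert(1 + 1 == 2)
  }

  // Untagged: included under `-l`, excluded when `-n` selects only the tag above.
  test("a fast test") {
    assert(Seq(1, 2, 3).sum == 6)
  }
}
```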
#### Steps to run individual test suites through SBT
1. Open SBT with Comet enabled
```sbt
ENABLE_COMET=true sbt -Dspark.test.includeSlowTests=true
@@ -74,6 +74,14 @@ ENABLE_COMET=true sbt -Dspark.test.includeSlowTests=true
```sbt
sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"
```
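The `-z` option narrows the run to test names matching the given substring. Under the same SBT session, the whole suite, or every suite matching a wildcard, can also be run with standard `testOnly` usage (suite name taken from the step above):
```sbt
sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn
sql/testOnly *DynamicPartitionPruning*
```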
#### Steps to run individual test suites in IntelliJ IDEA
1. Add the configuration below to the VM options of the run configuration for your test case (in the apache-spark repository)
```sbt
-Dspark.comet.enabled=true -Dspark.comet.debug.enabled=true -Dspark.plugins=org.apache.spark.CometPlugin -Xmx4096m -Dspark.executor.heartbeatInterval=20000 -Dspark.network.timeout=10000 --add-exports=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED
```
2. Set `ENABLE_COMET=true` in the run configuration's environment variables
![img.png](img.png)
3. Once the above configuration is in place, Spark tests can be run with debugging enabled across both Spark and Comet code. Note that Comet is pulled in as a dependency, so its classes are read-only while debugging from Spark; any new changes to Comet must be rebuilt and installed locally with `PROFILES="-Pspark-3.4" make release`. A small sanity-check sketch follows this list.
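As referenced above, here is a minimal sanity-check sketch, assuming the run configuration from the previous steps; the object name is hypothetical and the code only reads back the settings passed through the VM options and environment variables:
```scala
import org.apache.spark.sql.SparkSession

// Hypothetical helper, not part of Spark or Comet: echoes the Comet-related
// settings so a misconfigured IntelliJ run configuration is easy to spot.
object CometRunConfigCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("comet-run-config-check")
      .getOrCreate()
    try {
      // These mirror the -D flags in the VM options above; SparkConf picks up
      // system properties that start with "spark." automatically.
      println("spark.plugins       = " + spark.conf.getOption("spark.plugins"))
      println("spark.comet.enabled = " + spark.conf.getOption("spark.comet.enabled"))
      println("ENABLE_COMET (env)  = " + sys.env.getOrElse("ENABLE_COMET", "<unset>"))
    } finally {
      spark.stop()
    }
  }
}
```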

## Creating a diff file for a new Spark version
