---
layout: page
title: Testing
nav_order: 1
parent: Developer Overview
---

# RAPIDS Accelerator for Apache Spark Testing

We have a stand-alone example that you can run in the integration tests. The example is based on the mortgage dataset, which you can download here, and the code is in the `com.nvidia.spark.rapids.tests.mortgage` package.

## Unit Tests

Unit tests exist in the `tests` directory. This is unconventional, but it lets us run the tests against the final shaded version of the plugin and simplifies how we collect code coverage.

Use Maven to run the unit tests via `mvn test`.

To run targeted Scala tests, append `-DwildcardSuites=<comma separated list of wildcard suite names to execute>` to the above command, as sketched below.
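
As a minimal sketch, the two invocations look like this (the suite names are only examples; substitute the suites you care about):

```bash
# Run the full unit test suite with the default Spark profile.
mvn test

# Run only the suites whose names match the given wildcard patterns.
mvn test -DwildcardSuites="com.nvidia.spark.rapids.ParquetWriterSuite"
```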

For more information about using ScalaTest with Maven, please refer to the ScalaTest documentation.

### Running Unit Tests Against Specific Apache Spark Versions

You can run the unit tests against different versions of Spark using different Maven profiles. The default profile runs against Spark 3.1.1; to run against a specific version, use one of the following profiles:

- `-Pspark311tests` (Spark 3.1.1)
- `-Pspark312tests` (Spark 3.1.2)
- `-Pspark313tests` (Spark 3.1.3)

Please refer to the `tests` project POM to see the list of supported test profiles. Apache Spark-specific configurations can be passed in by setting the `SPARK_CONF` environment variable.

Examples:

- To run tests against Apache Spark 3.1.1: `mvn -P spark311tests test`
- To pass the Apache Spark configs `--conf spark.dynamicAllocation.enabled=false --conf spark.task.cpus=1`, do something like: `SPARK_CONF="spark.dynamicAllocation.enabled=false,spark.task.cpus=1" mvn ...`
- To run the test `ParquetWriterSuite` in package `com.nvidia.spark.rapids`, issue `mvn test -DwildcardSuites="com.nvidia.spark.rapids.ParquetWriterSuite"`
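
Putting these pieces together, a single command can select a Spark version, pass Spark configs, and narrow the run to one suite. This is only a sketch; the profile, configs, and suite name are taken from the examples above:

```bash
# Run ParquetWriterSuite against Spark 3.1.2 with dynamic allocation disabled
# and one CPU per task (all values here mirror the examples above).
SPARK_CONF="spark.dynamicAllocation.enabled=false,spark.task.cpus=1" \
  mvn -P spark312tests test \
  -DwildcardSuites="com.nvidia.spark.rapids.ParquetWriterSuite"
```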

## Integration Tests

Please refer to the integration-tests README.