Prepare 3.3.0 release
jtgrabowski committed Jan 13, 2023
1 parent 9eb9e81 commit 7f554ca
Showing 6 changed files with 27 additions and 17 deletions.
4 changes: 4 additions & 0 deletions CHANGES.txt
@@ -1,5 +1,9 @@
3.3.0
+ * Spark 3.3.x support (SPARKC-693)
+ * Materialized View read support fix (SPARKC-653)
+ * Fix Direct Join projection collapse (SPARKC-695)
+ * Fix mixed case column names in Direct Join (SPARKC-682)

3.2.0
* Spark 3.2.x support (SPARKC-670)
* Fix: Cassandra Direct Join doesn't quote keyspace and table names (SPARKC-667)
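For readers unfamiliar with the Direct Join feature referenced in the entries above, the following is a hedged sketch (not taken from this commit) of the kind of query it affects: with `CassandraSparkExtensions` enabled, an equi-join on a table's full partition key can be executed as per-key Cassandra reads instead of a full table scan. The keyspace `test`, table `kv`, and key column `k` are illustrative assumptions.

```scala
// Sketch only: assumes a spark-shell session where `spark` is predefined,
// CassandraSparkExtensions is enabled, and spark.cassandra.connection.host is set
// (as shown in the docs changed later in this commit).

// DataFrame backed by a Cassandra table (assumed keyspace/table names)
val cassandraDf = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "kv"))
  .load()

// A small DataFrame of lookup keys; joining on the partition key column lets the
// connector plan a Cassandra Direct Join rather than scanning the whole table.
val keys = spark.range(1, 100).selectExpr("cast(id as string) as k")
keys.join(cassandraDf, Seq("k")).explain()  // the plan should mention a direct join
```

SPARKC-695 and SPARKC-682 above concern how such plans handle collapsed projections and mixed-case column names.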
20 changes: 12 additions & 8 deletions README.md
@@ -7,8 +7,8 @@
| What | Where |
| ---------- |---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Community | Chat with us at [Datastax and Cassandra Q&A](https://community.datastax.com/index.html) |
- | Scala Docs | Most Recent Release (3.2.0): [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.2.0/connector/com/datastax/spark/connector/index.html), [Spark-Cassandra-Connector-Driver](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.2.0/driver/com/datastax/spark/connector/index.html) |
- | Latest Production Release | [3.2.0](https://search.maven.org/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.2.0/jar) |
+ | Scala Docs | Most Recent Release (3.3.0): [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.3.0/connector/com/datastax/spark/connector/index.html), [Spark-Cassandra-Connector-Driver](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.3.0/driver/com/datastax/spark/connector/index.html) |
+ | Latest Production Release | [3.3.0](https://search.maven.org/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.3.0/jar) |

## Features

@@ -19,7 +19,7 @@ Spark RDDs and Datasets/DataFrames to Cassandra tables, and execute arbitrary CQL
in your Spark applications.

- Compatible with Apache Cassandra version 2.1 or higher (see table below)
- - Compatible with Apache Spark 1.0 through 3.2 ([see table below](#version-compatibility))
+ - Compatible with Apache Spark 1.0 through 3.3 ([see table below](#version-compatibility))
- Compatible with Scala 2.11 and 2.12
- Exposes Cassandra tables as Spark RDDs and Datasets/DataFrames
- Maps table rows to CassandraRow objects or tuples
@@ -45,14 +45,15 @@ corresponds to the 1.6 release. The "master" branch will normally contain
development for the next connector release in progress.

Currently, the following branches are actively supported:
- 3.2.x ([master](https://github.com/datastax/spark-cassandra-connector/tree/master)),
+ 3.3.x ([master](https://github.com/datastax/spark-cassandra-connector/tree/master)),
+ 3.2.x ([b3.2](https://github.com/datastax/spark-cassandra-connector/tree/b3.2)),
3.1.x ([b3.1](https://github.com/datastax/spark-cassandra-connector/tree/b3.1)),
3.0.x ([b3.0](https://github.com/datastax/spark-cassandra-connector/tree/b3.0)) and
2.5.x ([b2.5](https://github.com/datastax/spark-cassandra-connector/tree/b2.5)).

- | Connector | Spark | Cassandra | Cassandra Java Driver | Minimum Java Version | Supported Scala Versions |
- | --------- | ------------- | --------- | --------------------- | -------------------- | ----------------------- |
- | 3.3 | 3.3 | 2.1.5*, 2.2, 3.x, 4.0 | 4.13 | 8 | 2.12 |
+ | Connector | Spark | Cassandra | Cassandra Java Driver | Minimum Java Version | Supported Scala Versions |
+ |-----------|---------------|-----------------------| --------------------- | -------------------- | ----------------------- |
+ | 3.3 | 3.3 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12 |
| 3.2 | 3.2 | 2.1.5*, 2.2, 3.x, 4.0 | 4.13 | 8 | 2.12 |
| 3.1 | 3.1 | 2.1.5*, 2.2, 3.x, 4.0 | 4.12 | 8 | 2.12 |
| 3.0 | 3.0 | 2.1.5*, 2.2, 3.x, 4.0 | 4.12 | 8 | 2.12 |
@@ -74,6 +75,9 @@ Currently, the following branches are actively supported:
## Hosted API Docs
API documentation for the Scala and Java interfaces are available online:

+ ### 3.3.0
+ * [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.3.0/connector/com/datastax/spark/connector/index.html)

### 3.2.0
* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.2.0/connector/com/datastax/spark/connector/index.html)

@@ -96,7 +100,7 @@ This project is available on the Maven Central Repository.
For SBT to download the connector binaries, sources and javadoc, put this in your project
SBT config:

- libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.2.0"
+ libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.3.0"

* The default Scala version for Spark 3.0+ is 2.12; please choose the appropriate build. See the
[FAQ](doc/FAQ.md) for more information.
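A hedged expansion of the single `libraryDependencies` line above into a minimal `build.sbt` might look like the sketch below; the project name, Scala 2.12 patch version, and Spark artifact/version are assumptions for illustration, not part of this commit.

```scala
// Minimal build.sbt sketch; all versions shown are illustrative assumptions.
name := "connector-example"
scalaVersion := "2.12.17"

libraryDependencies ++= Seq(
  // Spark itself is typically "provided" by spark-submit / the cluster
  "org.apache.spark" %% "spark-sql" % "3.3.0" % "provided",
  // The connector artifact published for Scala 2.12
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.3.0"
)
```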
8 changes: 4 additions & 4 deletions doc/0_quick_start.md
@@ -15,14 +15,14 @@ Configure a new Scala project with the Apache Spark dependency.

The dependencies are easily retrieved via Maven Central

- libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.2.0"
+ libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.3.0"

The spark-packages libraries can also be used with spark-submit and spark shell; these
commands will place the connector and all of its dependencies on the path of the
Spark Driver and all Spark Executors.

- $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0
- $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0
+ $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.3.0
+ $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.3.0

For the list of available versions, see:
- https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.12/
@@ -42,7 +42,7 @@ and *all* of its dependencies on the Spark Class Path. To configure
the default Spark Configuration pass key value pairs with `--conf`

$SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 \
- --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0
+ --packages com.datastax.spark:spark-cassandra-connector_2.12:3.3.0
--conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions

This command would set the Spark Cassandra Connector parameter
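To complement the `--conf` flags above, here is a hedged sketch of the equivalent in-code configuration followed by a DataFrame read; the contact point and the keyspace/table names (`test`/`kv`) are assumptions for illustration, not part of this commit.

```scala
// Sketch: configure the same two settings programmatically, then read a table.
// Keyspace/table names and the contact point are illustrative assumptions.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("quickstart-sketch")
  .config("spark.cassandra.connection.host", "127.0.0.1")
  .config("spark.sql.extensions", "com.datastax.spark.connector.CassandraSparkExtensions")
  .getOrCreate()

val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "kv"))
  .load()

df.printSchema()
df.show(10)
```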
2 changes: 1 addition & 1 deletion doc/13_spark_shell.md
@@ -18,7 +18,7 @@ Find additional versions at [Spark Packages](https://repo1.maven.org/maven2/com/
```bash
cd spark/install/dir
#Include the --master if you want to run against a spark cluster and not local mode
- ./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0 --conf spark.cassandra.connection.host=yourCassandraClusterIp
+ ./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.3.0 --conf spark.cassandra.connection.host=yourCassandraClusterIp
```

By default spark will log everything to the console and this may be a bit of an overload. To change this, copy and modify the `log4j.properties` template file
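After launching the shell with the `--packages` coordinate above, a session might proceed roughly as in the hedged sketch below; `sc` is the SparkContext that spark-shell predefines, and the keyspace/table/column names are assumptions for illustration.

```scala
// Typed at the spark-shell prompt started above; names are illustrative assumptions.
import com.datastax.spark.connector._

// Expose a Cassandra table as an RDD of CassandraRow
val rows = sc.cassandraTable("test", "kv")
println(rows.count())
println(rows.first())

// Write a few rows back, mapping tuple fields onto the assumed (k, v) columns
val toSave = sc.parallelize(Seq(("key1", 1), ("key2", 2)))
toSave.saveToCassandra("test", "kv", SomeColumns("k", "v"))
```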
2 changes: 1 addition & 1 deletion doc/15_python.md
@@ -14,7 +14,7 @@ shell similarly to how the spark shell is started. The preferred method is now t

```bash
./bin/pyspark \
- --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0 \
+ --packages com.datastax.spark:spark-cassandra-connector_2.12:3.3.0 \
--conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
```

8 changes: 5 additions & 3 deletions doc/developers.md
@@ -31,11 +31,11 @@ Cassandra and Spark nodes and are the core of our test coverage.

### Merge Path

- b2.5 => b3.0 => b3.1 => master
+ b2.5 => b3.0 => b3.1 => b3.2 => master

New features can be considered for 2.5 as long as they do not break APIs.
Once a feature is ready for b2.5, create a feature branch for b3.0 and merge
- b2.5 feature branch to b3.0 feature branch. Repeat for b3.1 and master.
+ b2.5 feature branch to b3.0 feature branch. Repeat for b3.1, b3.2 and master.

Example for imaginary SPARKC-9999.

@@ -69,10 +69,12 @@ git merge SPARKC-9999-b3.0
# Resolve conflict, if any
# Push the new feature branch:
git push origin SPARKC-9999-b3.1

+ # Repeat for b3.2

# Forward merge on the next version:
git checkout -b SPARKC-9999-master datastax/master
- git merge SPARKC-9999-b3.1
+ git merge SPARKC-9999-b3.2
# Resolve conflict, if any
# Push the new feature branch:
git push origin SPARKC-9999-master
