
Commit

Prepare for 2.0.0 release
This patch prepares `spark-redshift` for its 2.0.0 release by updating the build to use `spark-avro` 3.0.0 and updating the version numbers in the README.

Author: Josh Rosen <[email protected]>

Closes #250 from JoshRosen/update-for-2.0-release.
JoshRosen committed Aug 2, 2016
1 parent 30b10e7 commit 7b2142a
Showing 3 changed files with 7 additions and 7 deletions.
4 changes: 2 additions & 2 deletions .travis.yml
@@ -10,11 +10,11 @@ matrix:
   # Scala 2.10.5 tests:
   - jdk: openjdk7
     scala: 2.10.5
-    env: HADOOP_VERSION="2.2.0" SPARK_VERSION="2.0.0" SPARK_AVRO_VERSION="3.0.0-preview2"
+    env: HADOOP_VERSION="2.2.0" SPARK_VERSION="2.0.0" SPARK_AVRO_VERSION="3.0.0"
   # Scala 2.11 tests:
   - jdk: openjdk7
     scala: 2.11.7
-    env: HADOOP_VERSION="2.2.0" SPARK_VERSION="2.0.0" SPARK_AVRO_VERSION="3.0.0-preview2"
+    env: HADOOP_VERSION="2.2.0" SPARK_VERSION="2.0.0" SPARK_AVRO_VERSION="3.0.0"
 env:
   global:
     # AWS_REDSHIFT_JDBC_URL
6 changes: 3 additions & 3 deletions README.md
@@ -37,14 +37,14 @@ You may use this library in your applications with the following dependency info
 ```
 groupId: com.databricks
 artifactId: spark-redshift_2.10
-version: 2.0.0-preview1
+version: 2.0.0
 ```

 **Scala 2.11**
 ```
 groupId: com.databricks
 artifactId: spark-redshift_2.11
-version: 2.0.0-preview1
+version: 2.0.0
 ```

 You will also need to provide a JDBC driver that is compatible with Redshift. Amazon recommend that you use [their driver](http://docs.aws.amazon.com/redshift/latest/mgmt/configure-jdbc-connection.html), which is distributed as a JAR that is hosted on Amazon's website. This library has also been successfully tested using the Postgres JDBC driver.
@@ -514,4 +514,4 @@ If the deprecated `usestagingtable` setting is set to `false` then this library

 ## Migration Guide

-- Version 2.0 removed a number of deprecated APIs; for details, see https://github.com/databricks/spark-redshift/pull/239
+- Version 2.0 removed a number of deprecated APIs; for details, see https://github.com/databricks/spark-redshift/pull/239
4 changes: 2 additions & 2 deletions project/SparkRedshiftBuild.scala
@@ -46,7 +46,7 @@ object SparkRedshiftBuild extends Build {
     crossScalaVersions := Seq("2.10.5", "2.11.7"),
     sparkVersion := "2.0.0",
     testSparkVersion := sys.props.get("spark.testVersion").getOrElse(sparkVersion.value),
-    testSparkAvroVersion := sys.props.get("sparkAvro.testVersion").getOrElse("3.0.0-preview2"),
+    testSparkAvroVersion := sys.props.get("sparkAvro.testVersion").getOrElse("3.0.0"),
     testHadoopVersion := sys.props.get("hadoop.testVersion").getOrElse("2.2.0"),
     spName := "databricks/spark-redshift",
     sparkComponents ++= Seq("sql", "hive"),
@@ -73,7 +73,7 @@
     "com.amazonaws" % "aws-java-sdk-sts" % "1.10.22" % "test" exclude("com.fasterxml.jackson.core", "jackson-databind"),
     // We require spark-avro, but avro-mapred must be provided to match Hadoop version.
     // In most cases, avro-mapred will be provided as part of the Spark assembly JAR.
-    "com.databricks" %% "spark-avro" % "3.0.0-preview2",
+    "com.databricks" %% "spark-avro" % "3.0.0",
     if (testHadoopVersion.value.startsWith("1")) {
       "org.apache.avro" % "avro-mapred" % "1.7.7" % "provided" classifier "hadoop1" exclude("org.mortbay.jetty", "servlet-api")
     } else {
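
For applications consuming this release, the README coordinates above translate into an sbt dependency declaration along these lines; a minimal sketch, assuming the standard `%%` convention so sbt appends the Scala binary suffix (`_2.10` or `_2.11`) automatically:

```scala
// build.sbt sketch: depend on the released 2.0.0 artifacts instead of the
// 2.0.0-preview1 / 3.0.0-preview2 builds used before this commit.
libraryDependencies ++= Seq(
  "com.databricks" %% "spark-redshift" % "2.0.0",
  "com.databricks" %% "spark-avro"     % "3.0.0"
)
```

As the build file above notes, `avro-mapred` is expected to be provided by the Spark distribution itself, so only the two `com.databricks` artifacts need to be declared here.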
