
Bump Dropwizard to version 3.0.0 (#246)
joschi authored Apr 3, 2023
1 parent cc0d495 commit cd69ec5
Showing 11 changed files with 43 additions and 46 deletions.
36 changes: 19 additions & 17 deletions README.md
@@ -3,10 +3,10 @@
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=dropwizard_dropwizard-kafka&metric=alert_status)](https://sonarcloud.io/dashboard?id=dropwizard_dropwizard-kafka)
[![Maven Central](https://maven-badges.herokuapp.com/maven-central/io.dropwizard.modules/dropwizard-kafka/badge.svg)](https://maven-badges.herokuapp.com/maven-central/io.dropwizard.modules/dropwizard-kafka/)

Provides easy integration for Dropwizard applications with the Apache Kafka client.

This bundle comes with out-of-the-box support for:
* YAML Configuration integration
* Producer and Consumer lifecycle management
* Producer and Cluster connection health checks
* Metrics integration for the Kafka client
@@ -16,14 +16,16 @@ This bundle comes with out-of-the-box support for:
For more information on Kafka, take a look at the official documentation here: http://kafka.apache.org/documentation/

## Dropwizard Version Support Matrix
-dropwizard-kafka | Dropwizard v1.3.x | Dropwizard v2.0.x | Dropwizard v2.1.x
------------------------ | ------------------ | ------------------ | ------------------
-v1.3.x | :white_check_mark: | :white_check_mark: | :white_check_mark:
-v1.4.x | :white_check_mark: | :white_check_mark: | :white_check_mark:
-v1.5.x | :white_check_mark: | :white_check_mark: | :white_check_mark:
-v1.6.x | | :white_check_mark: | :white_check_mark:
-v1.7.x | | :white_check_mark: | :white_check_mark:
-v1.8.x | | | :white_check_mark:
+| dropwizard-kafka | Dropwizard v1.3.x | Dropwizard v2.0.x | Dropwizard v2.1.x | Dropwizard v3.0.x | Dropwizard v4.0.x |
+|------------------|--------------------|--------------------|--------------------|--------------------|--------------------|
+| v1.3.x | :white_check_mark: | :white_check_mark: | :white_check_mark: | :x: | :x: |
+| v1.4.x | :white_check_mark: | :white_check_mark: | :white_check_mark: | :x: | :x: |
+| v1.5.x | :white_check_mark: | :white_check_mark: | :white_check_mark: | :x: | :x: |
+| v1.6.x | :x: | :white_check_mark: | :white_check_mark: | :x: | :x: |
+| v1.7.x | :x: | :white_check_mark: | :white_check_mark: | :x: | :x: |
+| v1.8.x | :x: | :x: | :white_check_mark: | :x: | :x: |
+| v3.0.x | :x: | :x: | :x: | :white_check_mark: | :x: |
+| v4.0.x | :x: | :x: | :x: | :x: | :white_check_mark: |

## Usage
Add dependency on library.
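The dependency snippet itself sits in a stretch of the README this diff does not expand. As a rough sketch — assuming the Maven coordinates shown in the Maven Central badge above, and a `dropwizard-kafka.version` property that is only illustrative — it would look something like:

```xml
<dependency>
    <groupId>io.dropwizard.modules</groupId>
    <artifactId>dropwizard-kafka</artifactId>
    <!-- hypothetical property: pick the dropwizard-kafka release line that matches
         your Dropwizard version per the support matrix above -->
    <version>${dropwizard-kafka.version}</version>
</dependency>
```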
@@ -131,14 +133,14 @@ public void run(ExampleConfiguration config, Environment environment) {
Configure your factory in your `config.yml` file:

```yaml
consumer:
  type: basic
  bootstrapServers:
    - 127.0.0.1:9092
    - 127.0.0.1:9093
    - 127.0.0.1:9094
  consumerGroupId: consumer1
  name: consumerNameToBeUsedInMetrics
  keyDeserializer:
    type: string
  valueDeserializer:
@@ -168,7 +170,7 @@ For example, say you would like to use version `1.1.1` of the Kafka client. One
<version>${dropwizard.version}</version>
</dependency>
</dependencies>
```

## Adding support for additional serializers and/or deserializers
In order to support additional serializers or deserializers, you'll need to create a new factory:
@@ -179,11 +181,11 @@ public class MySerializerFactory extends SerializerFactory {
    @NotNull
    @JsonProperty
    private String someConfig;

    public String getSomeConfig() {
        return someConfig;
    }

    public void setSomeConfig(final String someConfig) {
        this.someConfig = someConfig;
    }
@@ -196,14 +198,14 @@ public class MySerializerFactory extends SerializerFactory {
}
```

Then you will need to add the following files to your `src/main/resources/META-INF/services` directory in order to support Jackson
polymorphic serialization:

File named `io.dropwizard.jackson.Discoverable`:

```
io.dropwizard.kafka.serializer.SerializerFactory
```

File named `io.dropwizard.kafka.serializer.SerializerFactory`:
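The diff view cuts off before showing this file's contents. Following the same service-loader convention as the `Discoverable` file above, it would list the fully-qualified name of your factory implementation; the package and class name below are purely illustrative:

```
com.example.MySerializerFactory
```

With both files on the classpath, the new factory should be selectable from YAML by its Jackson type name, assuming `MySerializerFactory` declares one with `@JsonTypeName` (say `@JsonTypeName("my-serializer")`, a hypothetical name) and that the producer configuration exposes `keySerializer`/`valueSerializer` fields by analogy with the consumer's `keyDeserializer` shown earlier:

```yaml
keySerializer:
  type: my-serializer
```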

6 changes: 3 additions & 3 deletions pom.xml
@@ -73,7 +73,7 @@
<sonar.host.url>https://sonarcloud.io</sonar.host.url>

<brave.version>5.15.0</brave.version>
-<dropwizard.version>2.1.6</dropwizard.version>
+<dropwizard.version>3.0.0</dropwizard.version>
<kafka-client.version>3.4.0</kafka-client.version>
<spring-kafka.version>2.9.7</spring-kafka.version>
</properties>
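Bumping this single property upgrades the whole Dropwizard stack because the version is normally fed into a BOM import elsewhere in the pom. A hedged sketch of what such an import typically looks like — illustrative only, since this diff does not show how the project actually consumes the property:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>io.dropwizard</groupId>
            <artifactId>dropwizard-bom</artifactId>
            <version>${dropwizard.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```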
@@ -160,8 +160,8 @@

<!-- test dependencies -->
<dependency>
-<groupId>junit</groupId>
-<artifactId>junit</artifactId>
+<groupId>org.junit.vintage</groupId>
+<artifactId>junit-vintage-engine</artifactId>
<scope>test</scope>
</dependency>
<dependency>
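The test-dependency swap is intentionally drop-in: `junit-vintage-engine` runs existing JUnit 4-style tests on the JUnit 5 platform and pulls in `junit:junit` transitively, so test sources keep their `org.junit` imports. A minimal sketch of such a test — the class is hypothetical, not part of this repository:

```java
import org.junit.Test;

import static org.junit.Assert.assertEquals;

// Hypothetical JUnit 4-style test; with junit-vintage-engine on the test
// classpath it is discovered and run by the JUnit Platform unchanged.
public class ExampleVintageTest {

    @Test
    public void additionStillWorks() {
        assertEquals(4, 2 + 2);
    }
}
```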
@@ -9,6 +9,7 @@
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
+import org.checkerframework.checker.nullness.qual.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@@ -17,8 +18,6 @@
import java.util.Map;
import java.util.Optional;

-import javax.annotation.Nullable;

import static java.util.Objects.requireNonNull;

@JsonTypeName("basic")
Expand Down
@@ -7,6 +7,7 @@
import io.dropwizard.lifecycle.setup.LifecycleEnvironment;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.producer.Producer;
+import org.checkerframework.checker.nullness.qual.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@@ -16,8 +17,6 @@
import java.util.Map;
import java.util.Optional;

-import javax.annotation.Nullable;

import static java.util.Objects.requireNonNull;

@JsonTypeName("basic")
Expand Down
11 changes: 5 additions & 6 deletions src/main/java/io/dropwizard/kafka/KafkaAdminClientBundle.java
@@ -1,20 +1,19 @@
package io.dropwizard.kafka;

import brave.Tracing;
-import io.dropwizard.Configuration;
-import io.dropwizard.ConfiguredBundle;
-import io.dropwizard.setup.Bootstrap;
-import io.dropwizard.setup.Environment;
+import io.dropwizard.core.Configuration;
+import io.dropwizard.core.ConfiguredBundle;
+import io.dropwizard.core.setup.Bootstrap;
+import io.dropwizard.core.setup.Environment;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;
+import org.checkerframework.checker.nullness.qual.Nullable;

import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Objects;

-import javax.annotation.Nullable;

import static java.util.Objects.requireNonNull;

public abstract class KafkaAdminClientBundle<T extends Configuration> implements ConfiguredBundle<T> {
11 changes: 5 additions & 6 deletions src/main/java/io/dropwizard/kafka/KafkaConsumerBundle.java
@@ -1,20 +1,19 @@
package io.dropwizard.kafka;

import brave.Tracing;
-import io.dropwizard.Configuration;
-import io.dropwizard.ConfiguredBundle;
-import io.dropwizard.setup.Bootstrap;
-import io.dropwizard.setup.Environment;
+import io.dropwizard.core.Configuration;
+import io.dropwizard.core.ConfiguredBundle;
+import io.dropwizard.core.setup.Bootstrap;
+import io.dropwizard.core.setup.Environment;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
+import org.checkerframework.checker.nullness.qual.Nullable;

import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Objects;

-import javax.annotation.Nullable;

import static java.util.Objects.requireNonNull;

public abstract class KafkaConsumerBundle<K, V, T extends Configuration> implements ConfiguredBundle<T> {
@@ -16,12 +16,12 @@
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.KafkaConsumer;
+import org.checkerframework.checker.nullness.qual.Nullable;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

-import javax.annotation.Nullable;
import javax.validation.Valid;
import javax.validation.constraints.Min;
import javax.validation.constraints.NotEmpty;
11 changes: 5 additions & 6 deletions src/main/java/io/dropwizard/kafka/KafkaProducerBundle.java
@@ -1,18 +1,17 @@
package io.dropwizard.kafka;

import brave.Tracing;
-import io.dropwizard.Configuration;
-import io.dropwizard.ConfiguredBundle;
-import io.dropwizard.setup.Bootstrap;
-import io.dropwizard.setup.Environment;
+import io.dropwizard.core.Configuration;
+import io.dropwizard.core.ConfiguredBundle;
+import io.dropwizard.core.setup.Bootstrap;
+import io.dropwizard.core.setup.Environment;
import org.apache.kafka.clients.producer.Producer;
+import org.checkerframework.checker.nullness.qual.Nullable;

import java.util.Collection;
import java.util.Collections;
import java.util.Map;

-import javax.annotation.Nullable;

import static java.util.Objects.requireNonNull;

public abstract class KafkaProducerBundle<K, V, T extends Configuration> implements ConfiguredBundle<T> {
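The three bundle classes (admin client, consumer, producer) change in the same way: the `Configuration`, `ConfiguredBundle`, `Bootstrap`, and `Environment` types they compile against move to the `io.dropwizard.core` packages introduced in Dropwizard 3, and `javax.annotation.Nullable` gives way to the Checker Framework annotation. For a downstream application the visible effect is the same import move. A minimal sketch of a Dropwizard 3 application skeleton using the relocated packages — class names are hypothetical, and the bundle registration is only indicated in a comment because the concrete bundle subclass is application-specific:

```java
package com.example;

import io.dropwizard.core.Application;
import io.dropwizard.core.Configuration;
import io.dropwizard.core.setup.Bootstrap;
import io.dropwizard.core.setup.Environment;

// Hypothetical configuration class; a real one would carry a
// KafkaProducerFactory/KafkaConsumerFactory field bound from config.yml.
class ExampleConfiguration extends Configuration {
}

public class ExampleApplication extends Application<ExampleConfiguration> {

    @Override
    public void initialize(final Bootstrap<ExampleConfiguration> bootstrap) {
        // bootstrap.addBundle(...) would register your KafkaProducerBundle /
        // KafkaConsumerBundle subclass here; compared to Dropwizard 2.x only
        // the io.dropwizard.core imports differ.
    }

    @Override
    public void run(final ExampleConfiguration config, final Environment environment) {
        // register resources that use the Kafka producer/consumer built by the bundle
    }

    public static void main(final String[] args) throws Exception {
        new ExampleApplication().run(args);
    }
}
```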
@@ -18,8 +18,8 @@
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.record.CompressionType;
+import org.checkerframework.checker.nullness.qual.Nullable;

-import javax.annotation.Nullable;
import javax.validation.Valid;
import javax.validation.constraints.Min;
import javax.validation.constraints.NotEmpty;
@@ -8,8 +8,8 @@
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.MockConsumer;
import org.apache.kafka.clients.consumer.OffsetResetStrategy;
+import org.checkerframework.checker.nullness.qual.Nullable;

-import javax.annotation.Nullable;
import java.util.Map;

@JsonTypeName("mock")
@@ -6,8 +6,8 @@
import io.dropwizard.lifecycle.setup.LifecycleEnvironment;
import org.apache.kafka.clients.producer.MockProducer;
import org.apache.kafka.clients.producer.Producer;
+import org.checkerframework.checker.nullness.qual.Nullable;

-import javax.annotation.Nullable;
import java.util.Collection;
import java.util.Map;

