examples #2

Closed
manang opened this issue Sep 8, 2017 · 15 comments

@manang

manang commented Sep 8, 2017

Hi,
I'm trying to use your examples, in particular serdes-tool.
I don't understand what the "-s" parameter is, because if I set it to the path of my JSON schema the error is:
FATAL: Failed to get schema: REST request failed (code -1): HTTP request failed: Couldn't connect to server

Could you explain how I should set my input?
Thank you very much
Angelo

@edenhill
Contributor

edenhill commented Sep 8, 2017

Hello,

-s takes the schema name. The schema definition itself should be passed with the -S option, which takes a JSON string rather than a file containing JSON (but we can use the shell to help with that).

To register a new schema you do something like:
serdes-tool -r http://your-schema-registry-url -s your_schema_name -S "$(cat schemadef.json)"
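
For reference, schemadef.json is just a plain Avro schema definition. A minimal, purely hypothetical example (the field names are arbitrary) would be:

{
  "type": "record",
  "name": "example_event",
  "namespace": "example",
  "fields": [
    {"name": "timestamp", "type": "long"},
    {"name": "payload", "type": "string"}
  ]
}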

@manang
Author

manang commented Sep 8, 2017

Thank you for the answer.
Do you have a JSON schema I can use to try your application?
Thank you

@edenhill
Contributor

edenhill commented Sep 8, 2017

There are small example schemas in the schema-registry quickstart:
https://github.com/confluentinc/schema-registry#quickstart
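
(If I remember right, the quickstart registers a trivial schema along the lines of {"type": "string"}, which is enough to try things out.)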

@manang
Author

manang commented Sep 14, 2017

Thanks for your reply.
I'm trying to use librdkafka, in particular the "rdkafka_example.cpp" example.
I want to put an Avro-encoded item into the Kafka topic.
Could you explain how I can create the Avro object and send it?
I've used the validating.cpp example from the Avro repo.
Thank you very much, and sorry, but I'm new to this kind of problem.

Angelo

@edenhill
Contributor

The kafka-serdes-avro-console-producer.cpp example converts a JSON document to an Avro object (also called GenericDatum) here:
https://github.com/confluentinc/libserdes/blob/master/examples/kafka-serdes-avro-console-producer.cpp#L81

which is then serialized and produced here:
https://github.com/confluentinc/libserdes/blob/master/examples/kafka-serdes-avro-console-producer.cpp#L373

For other ways of instantiating Avro objects I suggest looking at the avrocpp documentation here:
https://avro.apache.org/docs/1.8.0/api/cpp/html/index.html
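
If you want to build the GenericDatum programmatically rather than converting from JSON, a rough sketch could look like the following. This is only an illustration: the serialize_event() helper and the dummy field values are made up here, it targets the "event" record schema that comes up later in this thread, and the Serdes::Avro::serialize() call follows the console-producer example linked above, so check the exact signature against your serdescpp-avro.h.

#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

#include <avro/Compiler.hh>
#include <avro/GenericDatum.hh>
#include <avro/ValidSchema.hh>

#include "serdescpp-avro.h"

/* Build a GenericDatum for a record schema by hand and serialize it with
 * libserdes, which prepends the magic byte + schema id so the resulting
 * buffer can be passed straight to RdKafka::Producer::produce().
 * Illustrative sketch only. */
static int serialize_event (Serdes::Avro *serdes, Serdes::Schema *schema,
                            const std::string &schema_json,
                            std::vector<char> &out) {
  std::string errstr;

  /* Compile the Avro schema and create a datum bound to it. */
  avro::ValidSchema avro_schema =
      avro::compileJsonSchemaFromString(schema_json);
  avro::GenericDatum datum(avro_schema);

  /* Fill in the record fields by name. */
  avro::GenericRecord &rec = datum.value<avro::GenericRecord>();
  rec.setFieldAt(rec.fieldIndex("timestamp"),
                 avro::GenericDatum(static_cast<int64_t>(1234)));
  rec.setFieldAt(rec.fieldIndex("src"),
                 avro::GenericDatum(std::string("mysrc")));
  rec.setFieldAt(rec.fieldIndex("host_ip"),
                 avro::GenericDatum(std::string("10.0.0.1")));
  rec.setFieldAt(rec.fieldIndex("rawdata"),
                 avro::GenericDatum(std::vector<uint8_t>{0x01, 0x02, 0x03}));

  /* Serialize to Confluent wire format. */
  if (serdes->serialize(schema, &datum, out, errstr) == -1) {
    std::cerr << "Serialization failed: " << errstr << std::endl;
    return -1;
  }
  return 0;
}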

@manang
Author

manang commented Oct 16, 2017

Hi, what I don't understand is: is it necessary to use the schema registry, or can I use kafka-serdes-avro-console-producer.cpp without it?
I'm trying this command:

manang@hicn-virtual:~/libserdes/examples$ ./kafka-serdes-avro-console-producer -b 192.168.252.173:9092 -t prova.prova -s my_schema -S "$(cat /home/manang/avro/lang/c++/examples/dataplatform-raw.avsc)"
% Register new schema: my_schema: {"namespace": "pnda.entity", "type": "record", "name": "event", "fields": [ {"name": "timestamp", "type": "long"}, {"name": "src", "type": "string"}, {"name": "host_ip", "type": "string"}, {"name": "rawdata", "type": "bytes"} ] }
% FATAL: Failed to register schema my_schema: REST request failed (code -1): HTTP request failed: Couldn't connect to server

Thank you very much
Angelo

@manang
Author

manang commented Oct 16, 2017

Hi, sorry for this long list of messages.
I'm trying with the prebuilt server; when I launch the example I get:

manang@hicn-virtual:~/libserdes$ ./examples/kafka-serdes-avro-console-producer -b 192.168.252.173:9092 -t prova.prova -s my_schema -S "$(cat /home/manang/avro/lang/c++/examples/dataplatform-raw.avsc)"
% Register new schema: my_schema: {"namespace": "pnda.entity", "type": "record", "name": "event", "fields": [ {"name": "timestamp", "type": "long"}, {"name": "src", "type": "string"}, {"name": "host_ip", "type": "string"}, {"name": "rawdata", "type": "bytes"} ] }
% Registered schema my_schema with id 1
{"timestamp": 1234, "src":"asdads", "host_ip":"asdasd", "rawdata","asdasd"}
% JSON to Avro transformation failed: Unexpected character in json

Why do I get this error?
Thank you very much
Angelo

@edenhill
Contributor

The JSON message you are trying to produce is invalid:
{"timestamp": 1234, "src":"asdads", "host_ip":"asdasd", "rawdata","asdasd"}

rawdata should be followed by ":", not ",", like so:
{"timestamp": 1234, "src":"asdads", "host_ip":"asdasd", "rawdata":"asdasd"}

@asardaes

asardaes commented Feb 5, 2018

I'm also testing the examples, and I get the following:

$ ./kafka-serdes-avro-console-producer -t temp.temp -s test
% Query schema: by name "test" or id -1
% Schema "test" id 1: { "type": "string" }

If I then type

{"type": "test_type"}

I get JSON to Avro transformation failed: Incorrect token in the stream. Expected: String, found Object start

@edenhill
Contributor

edenhill commented Feb 5, 2018

This schema: { "type": "string" } defines a single string, so you should pass it just that: a JSON string, e.g. "a json string".
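
In other words, reusing the command from above, a hypothetical session would look like:

$ ./kafka-serdes-avro-console-producer -t temp.temp -s test
"just a plain json string"

(the surrounding quotes are part of the JSON value).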

@passiondjc

Hi, I use the command below to run the example:
./serdes-kafka-avro-client -P -b 10.202.8.11:40301 -t testschema -p 1 -r http://localhost:8081 -s testvertica -S '{"type":"record","name""testvertica","namespace":"test","fields":[{"name":"f1","type":"string"},{"name":"f2","type":"string"}]}'

enter:str: {"f1":"value2", "f2":"value3"}

./serdes-kafka-avro-client -C -b 10.202.8.11:40301 -t testschema -p 1 -r http://localhost:8081
It shows this error message:
Failed to read avro value: Cannot read string value: Cannot read string length: Cannot read 1 bytes from memory bufferCannot read string value: Cannot read string length: Cannot read 1 bytes from memory buffer

@Lnixdb

Lnixdb commented Jan 16, 2019

Hi, I use the command below to run the example:
./serdes-kafka-avro-client -P -b 10.202.8.11:40301 -t testschema -p 1 -r http://localhost:8081 -s testvertica -S '{"type":"record","name""testvertica","namespace":"test","fields":[{"name":"f1","type":"string"},{"name":"f2","type":"string"}]}'

enter:str: {"f1":"value2", "f2":"value3"}

./serdes-kafka-avro-client -C -b 10.202.8.11:40301 -t testschema -p 1 -r http://localhost:8081
It shows this error message:
Failed to read avro value: Cannot read string value: Cannot read string length: Cannot read 1 bytes from memory bufferCannot read string value: Cannot read string length: Cannot read 1 bytes from memory buffer

Did you solve the problem? I'm getting the same error.

@aamirrashid

I'm seeing the same issue as reported by Lnixdb and passiondjc above. It's been ~2 years since it was first reported, and I don't see a resolution yet. Is this still an active GitHub thread? If so, can someone please provide a resolution?

@xmcqueen

I think the last two questions are answered a bit higher up, where @edenhill indicates that you should pass just the value, not the entire object.

The schema:

{ "type": "string" }

the input string:

"a json string"

@aamirrashid

See my response in #32
