kcat fails to deserialize Avro using schema registry, but confluent_kafka_python succeeds #356
Comments
I have a very similar issue: I have a JavaScript client that can successfully tail a topic while kcat cannot.
I know there's a magic byte because I'm pulling it off the raw message, and I know it is encoded in Avro.
Each message must be prefixed by the schema-registry-specific framing, which is one byte for magic (value 0), then the schema ID in big-endian (4 bytes), followed by the serialized payload (e.g., serialized Avro).
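As a sketch of that framing in Python (the function name and error handling here are illustrative, not from the thread):

```python
import struct

def split_wire_format(raw: bytes):
    """Split a Confluent wire-format message into (schema_id, payload).

    Framing: one magic byte (must be 0), a 4-byte big-endian
    schema ID, then the serialized body (e.g., Avro).
    """
    magic, schema_id = struct.unpack('>bI', raw[:5])
    if magic != 0:
        raise ValueError(f'not Confluent wire format, magic byte = {magic}')
    return schema_id, raw[5:]
```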
oooh, my issue was staring me in the face and I didn't notice... "key deserialization". Only my values are serialized, so I changed my config to deserialize only the value. I also had to pass my schema registry password unencoded, where I could have sworn I read that it should be URL-encoded.
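In kcat terms, that is the difference between `-s avro` (Avro applied to both key and value) and `-s value=avro` (values only). Broker, topic, and registry URL below are placeholders:

```bash
# Tries to Avro-decode the key as well; fails when keys are not wire-format framed:
kcat -C -b localhost:9092 -t my-topic -s avro -r http://localhost:8081

# Runs only the value through the Avro deserializer:
kcat -C -b localhost:9092 -t my-topic -s value=avro -r http://localhost:8081
```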
@edenhill You can confirm that from the raw bytes. This is saying that I have a message with a schema ID of 1, yet I still see the above error from kcat.
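A quick way to make that check, assuming one raw message has been dumped to a file (msg.bin is a placeholder name):

```python
# The first five bytes are the framing: 00 00 00 00 01 means
# magic byte 0 and schema ID 1.
with open('msg.bin', 'rb') as f:
    print(f.read(5).hex())
```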
I am trying to use kcat to read Confluent Wire Format-encoded messages. They're Avro-encoded, and I have a Schema Registry. This little Python script works with confluent_kafka_python v1.7.0:
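A minimal sketch of such a consumer, assuming confluent_kafka 1.7.0 and placeholder broker, registry, topic, and group names:

```python
from confluent_kafka import DeserializingConsumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer

# Placeholders: point these at your broker and Schema Registry.
schema_registry = SchemaRegistryClient({'url': 'http://localhost:8081'})

consumer = DeserializingConsumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'avro-debug',
    'auto.offset.reset': 'earliest',
    # Only the value is wire-format Avro; the key is left as raw bytes.
    'value.deserializer': AvroDeserializer(schema_registry),
})
consumer.subscribe(['my-topic'])

msg = consumer.poll(30.0)
if msg is not None and msg.error() is None:
    print(msg.value())  # the decoded Avro record as a dict
consumer.close()
```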
But this kcat invocation fails, even though it should be doing the same thing, and even when running on the same messages:
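The shape of the failing invocation would be something like this (broker, topic, and registry URL are placeholders):

```bash
kcat -C -b localhost:9092 -t my-topic \
     -s value=avro -r http://localhost:8081 -e
```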
None of this data is sensitive, so I can upload the schema and message here too: message_and_schema.zip
That zip has a msg.avro, which is the Kafka message (including the 5-byte Confluent Wire Format header), and a schema.json, which is the Avro schema used.
Be warned: this is a big, complex schema (about 40KB), and the Avro message is 580KiB of scientific simulation data.
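Since both files are attached, one way to sanity-check them outside Kafka entirely, assuming fastavro is installed, is to strip the 5-byte header and decode the rest against the attached schema:

```python
import io
import json
import struct

from fastavro import parse_schema, schemaless_reader

# The attached files: the Avro schema and one raw Kafka message value.
with open('schema.json') as f:
    schema = parse_schema(json.load(f))
with open('msg.avro', 'rb') as f:
    raw = f.read()

# Confluent wire format: 1 magic byte (0) + 4-byte big-endian schema ID.
magic, schema_id = struct.unpack('>bI', raw[:5])
assert magic == 0, f'unexpected magic byte: {magic}'
print(f'schema ID: {schema_id}')

# Decode the payload that follows the 5-byte header.
record = schemaless_reader(io.BytesIO(raw[5:]), schema)
print(type(record))
```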