argoproj-labs/old-argo-dataflow

Quickstart does not seem to work with examples

Closed this issue · 6 comments

I am trying to play around with Dataflow to understand it a bit better, but the quick start deployment doesn't seem to work correctly. The input pod that gets created, which appears to be responsible for pushing messages to Kafka, fails. It looks like some other pod is missing: the input pod uses curl to hit a testapi, but nothing like that is deployed by any of the quick start instructions or manifests. Am I missing something, or is there more to the setup that is missing from the instructions?

Let me take a look.

Please let me know if there is anything I can help with. Happy to help out if I can.

No problem. You could help verify the changes. I'm going to remove the input pod completely. What I'm not clear on is whether we should continue to write messages to the input-topic automatically, or just provide instructions on how to do it, e.g. using a console producer.
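If we go the instructions route, one option is to produce test messages from inside the cluster, which sidesteps advertised-listener resolution issues entirely. A rough sketch, assuming the quick start's `argo-dataflow-system` namespace and a `kafka-broker` workload exposing the standard Kafka CLI tools (names are assumptions; adjust to the actual manifests):

```shell
# Sketch: write one test message to input-topic from inside the Kafka pod.
# deploy/kafka-broker and the tool path are assumptions based on the quick start.
kubectl -n argo-dataflow-system exec -i deploy/kafka-broker -- \
  sh -c 'echo "hello" | kafka-console-producer.sh \
           --topic input-topic \
           --bootstrap-server localhost:9092'
```

Running the producer inside the pod means it connects over `localhost` within the broker's own network namespace, so no port-forwarding or hosts-file changes are needed.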

I was just trying to write some messages to it using kafka-console-producer.sh while port-forwarding svc/kafka-broker, but I'm running into some errors. I don't have a ton of experience with Kafka, so I'm not sure whether the configuration is set up to allow that.

kubectl -n argo-dataflow-system port-forward svc/kafka-broker 9092:9092
/kafka-console-producer.sh --topic input-topic --bootstrap-server localhost:9092
[2021-10-08 12:24:39,481] WARN [AdminClient clientId=adminclient-1] Error connecting to node kafka-broker:9092 (id: 0 rack: null) (org.apache.kafka.clients.NetworkClient)
java.net.UnknownHostException: kafka-broker

Can you add 127.0.0.1 kafka-broker to your /etc/hosts?
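For context on why this helps: the port-forward makes the broker reachable on localhost:9092, but the broker's advertised listener hands the client back the in-cluster name kafka-broker, which the local machine can't resolve, hence the UnknownHostException. A minimal sketch of the workaround (assuming the quick start's `kafka-broker` service name):

```shell
# Map the broker's advertised hostname to the locally forwarded port.
# "kafka-broker" is the service name used by the quick start manifests.
echo "127.0.0.1 kafka-broker" | sudo tee -a /etc/hosts

# With the port-forward from above still running, the producer should
# now be able to resolve the broker:
#   kubectl -n argo-dataflow-system port-forward svc/kafka-broker 9092:9092
#   ./kafka-console-producer.sh --topic input-topic --bootstrap-server localhost:9092
```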

Stale issue message