SNOW-956924: Snowpark Container Creation Fails with valid spec yaml file
Closed this issue · 2 comments
sfc-gh-cconner commented
SnowCLI version
1.2.1
Python version
3.10.12
Platform
macOS-14.0-arm64-arm-64bit
What happened
I have a YAML file that specifies a custom command for the container. The YAML is valid and works if I create the Snowpark container through the UI, but it fails with YAML parsing errors from the Snow CLI.
Console output
snow snowpark services create --name kafka2 --compute-pool CCONNER_KAFKA_POOL_2 --spec-path kafka_snowpark_container.yaml --num-instances 1 --connection cconnerprod3
An unexpected exception occurred. Use --debug option to see the traceback.
Exception message:
395019 (22023): Failed to parse 'service spec' as YAML. Error: while parsing a flow sequence
in 'reader', line 2, column 1800:
... atagen:0.6.0-7.3.0", "command": ["bash", "-c", "echo "Installing ...
^
expected ',' or ']', but got <scalar>
in 'reader', line 2, column 1822:
... command": ["bash", "-c", "echo "Installing connector plugins"
^
at [Source: (StringReader); line: 2, column: 1822]
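The error excerpt suggests the spec was re-serialized into single-line flow style without escaping the inner double quotes from the echo commands. The sketch below is a hypothetical reconstruction of that failure class, not the CLI's actual code; since flow-style YAML is JSON-compatible, Python's stdlib json reproduces the same "expected ',' or ']'" class of parse error:

```python
import json

# Hypothetical reconstruction of the failure mode: naively interpolating
# the command list into a flow-style mapping leaves the inner double
# quotes from the echo arguments unescaped.
command = ["bash", "-c", 'echo "Installing connector plugins"']

naive = '{"command": [%s]}' % ", ".join('"%s"' % c for c in command)
# The unescaped inner quote terminates the scalar early, so the parser
# sees a stray token where it expects ',' or ']'.

correct = json.dumps({"command": command})  # inner quotes escaped as \"

def parses(s: str) -> bool:
    try:
        json.loads(s)
        return True
    except json.JSONDecodeError:
        return False

print(parses(naive))    # False
print(parses(correct))  # True
```

A serializer that escapes quoted scalars (or emits block scalars, as the spec file itself does with `|`) avoids this entirely.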
How to reproduce
Try to create a Snowpark container with this YAML spec:
spec:
  container:
  - name: connect
    image: sfcsupport-cconnerprod3.registry.snowflakecomputing.com/cconner_db/public/images/cp-server-connect-datagen:0.6.0-7.3.0
    command:
    - bash
    - -c
    - |
      echo "Installing connector plugins"
      if [[ -z $(ls /usr/share/confluent-hub-components | grep snowflake-kafka-connector-2.0.1.jar) ]]
      then
        confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:2.0.1
      fi
      echo "Launching Kafka Connect worker"
      /etc/confluent/docker/run
    env:
      CONNECT_BOOTSTRAP_SERVERS: 'localhost:29092'
      CONNECT_REST_ADVERTISED_HOST_NAME: connect
      CONNECT_GROUP_ID: compose-connect-group
      CONNECT_CONFIG_STORAGE_TOPIC: docker-connect-configs
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_FLUSH_INTERVAL_MS: 10000
      CONNECT_OFFSET_STORAGE_TOPIC: docker-connect-offsets
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_TOPIC: docker-connect-status
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://localhost:8081
      # CLASSPATH required due to CC-2422
      CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-7.3.0.jar
      CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
      CONNECT_CONSUMER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor"
      CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"
      CONNECT_LOG4J_LOGGERS: org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR,org.reflections=ERROR
  endpoint:
  - name: connect
    port: 8083
  networkPolicyConfig:
    allowInternetEgress: true
sfc-gh-pjob commented
I updated the issue description to correct the YAML and bash formatting.
sfc-gh-pjob commented
It should be fixed on the main branch, and we plan to include the fix in the next 1.X release as well.