higanworks/knife-zero

data_bag support

Closed this issue · 2 comments

I really love knife-zero, thank you so much for this great tool!

What I have:

  • nodes provisioned with knife-zero, with Berkshelf as the cookbook provider
  • working with my own recipes and cookbooks from the Supermarket

Now one of my cookbooks uses data_bags, and I am wondering how to integrate them with knife-zero.
I put my data_bags under /var/chef/data_bags/mydatabaga/item.json, but during the converge they cannot be found. Update: I put them on the node, not on the machine I actually converge from; I guess that is expected, right?

My /etc/chef/client.rb:

log_location     STDOUT
chef_server_url  "chefzero://localhost:8889"
validation_client_name "chef-validator"
# Using default node name (fqdn)
automatic_attribute_whitelist ["fqdn", "os", "os_version", "hostname", "ipaddress", "roles", "recipes", "platform", "platform_version", "cloud", "cloud_v2", "chef_packages"]

On the machine I converge from, the config looks like this:

local_mode true
chef_repo_path   File.expand_path('../', __FILE__)

knife[:ssh_attribute] = 'knife_zero.host'
knife[:ssh_user] = 'kw'
knife[:use_sudo] = true

knife[:automatic_attribute_whitelist] = %w[
  fqdn
  os
  os_version
  hostname
  ipaddress
  roles
  platform
  platform_version
  cloud
  cloud_v2
  chef_packages
  recipes
]

I run berks vendor cookbooks to put all of my cookbooks under the cookbooks directory in that folder.
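For context, the Berksfile behind that follows the usual pattern; the cookbook names below are only placeholders, not my real ones:

# Berksfile (illustrative names only)
source 'https://supermarket.chef.io'

cookbook 'my_app', path: 'site-cookbooks/my_app'  # own recipes
cookbook 'ntp'                                    # pulled from the Supermarket

berks vendor cookbooks then resolves this and copies everything into ./cookbooks.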

Do I need to configure something else, or did I do something wrong?

Are you storing data_bags on the remote node?

The data_bags directory is expected in the current directory (the chef-repo) on the local node that executes knife-zero, not on the remote node.
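In other words, with chef_repo_path pointing at the directory of that config file, the bags belong next to it on the workstation side. A rough sketch, with placeholder bag and item names:

# chef-repo on the machine knife-zero is run from
#   (the config file with local_mode / chef_repo_path shown above)
#   cookbooks/      <- filled by berks vendor cookbooks
#   nodes/          <- node objects knife-zero writes back after a converge
#   data_bags/
#     mydatabag/
#       item.json

# only needed if the bags live somewhere other than <chef_repo_path>/data_bags:
data_bag_path File.expand_path('../data_bags', __FILE__)

A quick sanity check is knife data bag list -z (local mode), which should list the bag from that directory before you converge.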

Sorry, I cleaned up my mess. I totally got data_bags wrong: I thought they were non-cookbook but node-centric, yet they are not node-centric either; they are server-centric, i.e. valid for all nodes. Obviously they must then go into the knife-zero repo.
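For completeness: once the bag lives in that data_bags directory, any node's recipes can read it in the usual way. A minimal sketch, with placeholder names:

# in a recipe; 'mydatabag' and 'item' are placeholder names
creds = data_bag_item('mydatabag', 'item')  # served by chef-zero from data_bags/mydatabag/item.json
log 'data_bag check' do
  message "loaded item #{creds['id']}"
end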