Clarification Question: Is InfluxDB really replacing "local" storage - or "doubled" data storage?
Ashcora opened this issue · 6 comments
Problem/Motivation
Is InfluxDB (in a separate Docker container) really storing ALL the data written by Home Assistant, instead of a local "Home Assistant DB"?
Expected behavior
As soon as the integration between Home Assistant and InfluxDB is activated, I expect that Home Assistant switches the "data writing" to InfluxDB instead of the "Home Assistant DB"
Actual behavior
Unknown - doubled data storage in worst case?
Steps to reproduce
Add the InfluxDB add-on, configured to write all data (no filtering) to InfluxDB.
Proposed changes
As soon as the integration between Home Assistant and InfluxDB is activated with no filter option, all data should be written to InfluxDB only.
There hasn't been any activity on this issue recently, so we clean up some of the older and inactive issues.
Please make sure to update to the latest version and check if that solves the issue. Let us know if that works for you by leaving a comment 👍
This issue has now been marked as stale and will be closed if no further activity occurs. Thanks!
Question is not resolved: In case of using another InfluxDB instance (e.g. on the same host or on a different host), will the data be stored twice?
Hi,
I'm not an expert on InfluxDB in HA, but I understand it like this:
After installing the InfluxDB add-on in Home Assistant, an InfluxDB instance is running.
You can then configure a database and log in to it via the InfluxDB GUI.
After you have done this, to store Home Assistant data in the configured database, go to the Home Assistant configuration.yaml, put the InfluxDB connection parameters into that file, and restart Home Assistant.
It should now send data to InfluxDB as soon as it arrives in HA.
It is possible to configure filters for entities etc. so that not all data is sent to InfluxDB.
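As a rough sketch, the connection parameters and filters described above go under the `influxdb:` key in configuration.yaml. Host, database name, and credentials below are placeholders, and the include/exclude lists are just illustrative:

```yaml
# configuration.yaml — InfluxDB (v1) integration, hypothetical values
influxdb:
  host: 192.168.1.10        # placeholder: your InfluxDB host
  port: 8086
  database: homeassistant   # placeholder: database created via the InfluxDB GUI
  username: ha_user
  password: !secret influxdb_password
  include:
    domains:
      - sensor
      - binary_sensor
  exclude:
    entities:
      - sensor.date         # example of an entity you might not want to record
```

With an `include` section present, only the listed domains/entities are sent, minus anything in `exclude`; omitting both sections sends everything.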
The default Home Assistant database also stores the data, but in the standard configuration most data is purged after 10 days.
Not all of it, though: long-term statistics are kept longer.
So yes, if you push all data to InfluxDB and make no changes to the purge period, you will have data stored twice, in two databases.
If you want, you can lower the retention period for the normal database.
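The retention period mentioned above is the recorder's purge setting; a minimal sketch (the value 3 is just an example, the default is 10):

```yaml
# configuration.yaml — shorten how long the local recorder database keeps data
recorder:
  purge_keep_days: 3   # example value; default is 10 days
```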
I suggest you read the Home Assistant documentation about the standard database, and also some docs about InfluxDB, to get an idea of the data storage processes in HA.
It helped me a lot. I set up InfluxDB in Home Assistant a couple of weeks ago.
I also worked on a script to get historical data from the SQLite HA database into InfluxDB; my purge setting was set to 365 days before, because I wanted to keep some data.
Now I'll be able to set it back to 10 days, shrink the SQLite database file, and work on dashboards for InfluxDB.
I hope this information helps you a little and that I haven't written any nonsense :-)
@provi1: Thank you for helping me out here and furthermore for your extended reply - as well as for your explanation of how you understood the Influx DB topic in Home Assistant.
The background of my question is that I am running an InfluxDB instance in Docker on my Intel NUC server. For Home Assistant, however, I wanted to use my Raspberry Pi.
The problem is that, depending on the amount of data written all day long, this might wear out my SD card prematurely. In addition, I do not want to have the same data stored twice (which would be a kind of "unwanted backup/redundancy" 📦).
Maybe it would be best to run this in a test environment: install an empty Home Assistant instance, install InfluxDB, establish the connection between both as proposed in the documentation, and monitor both the Home Assistant database in my local folder and the InfluxDB database. Then I will also try to attach a USB stick or an SSD via USB to my Raspberry Pi and test it in this constellation.
Hi,
a test environment is a good idea.
but you will store data twice for a short period, no matter what you do.
That's how it works right now.
The internal Home Assistant database stores operational data.
InfluxDB is more like a data warehouse, meant to store data over a long period.
I think storing data twice for a short period is not such a big deal.
If you are worried about your SD card, you would have to change your standard database to MySQL, for example, and host it on your NUC as well. But if your network connection fails, HA will fail too.
And even with all databases on another device, HA will still write log file entries to the SD card.
Maybe you should think about using an SSD, connected via USB cable, as the hard disk for the system.
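Pointing the recorder at an external MySQL/MariaDB server, as suggested above, is done via the recorder's `db_url`. A minimal sketch with placeholder host and credentials (this also assumes a reachable MySQL/MariaDB server and the matching Python driver available to HA):

```yaml
# configuration.yaml — recorder using external MySQL/MariaDB (hypothetical values)
recorder:
  db_url: mysql://ha_user:ha_password@192.168.1.10/homeassistant?charset=utf8mb4
```

Note that if this server or the network is down when HA starts, the recorder (and history) will not work, which is the failure mode described above.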
That's my setup; I switched from SD to SSD 4 weeks ago.
I found an article about running a Raspberry Pi from NFS with the filesystem in memory, but I haven't tested it,
and I'm not sure it's a good solution; the network may not be stable enough.
If you would like to discuss this further, you can send me a message. I'll answer you.