Cannot login after switch from internal to external MongoDB
alexanderharm opened this issue · 2 comments
As recommended in the logs, I tried to migrate to an external MongoDB. I stopped and removed my current unifi container and tried to bring up the new containers with this Docker Compose file:
```yaml
services:
  mongo:
    image: mongo:3.6
    container_name: unifi-db
    restart: unless-stopped
    volumes:
      - /volume1/docker/unifi/data/db:/data/db
  unifi:
    image: goofball222/unifi
    container_name: unifi
    restart: unless-stopped
    ports:
      - 3478:3478/udp
      - 8080:8080
      - 8443:8443
      - 8880:8880
      - 8843:8843
      - 6789:6789
      - 10001:10001/udp
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /volume1/docker/unifi/data:/usr/lib/unifi/data
      - /volume1/docker/unifi/logs:/usr/lib/unifi/logs
    environment:
      - DB_MONGO_LOCAL=true
      - UNIFI_DB_NAME=unifi
      - DB_MONGO_URI=mongodb://mongo:27017/unifi
      - STATDB_MONGO_URI=mongodb://mongo:27017/unifi_stat
      - TZ=Europe/Berlin
    depends_on:
      - mongo
```
Everything comes up nicely and reports as healthy. However, I'm unable to log in to the UniFi network controller: it reports that the username/password combination is wrong, and there is nothing in the logs. If I switch back to the previous container, I can log in just fine.
Any idea?
PS: It would be nice if the docker-compose example were a working config. Your Docker images use MongoDB v3.6; maybe that could be added to the example?
Is there anything in the Mongo logs when this happens? Depending on how old the database you're trying to switch to external is, you may have to change its Mongo compatibility version setting.
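For reference, here is a rough sketch of how you could check (and, if needed, raise) the feature compatibility version from inside the Mongo container, using the `unifi-db` container name from the compose file above. These are standard MongoDB admin commands, but verify the right FCV value for your upgrade path before running the second one:

```shell
# Check the current feature compatibility version (FCV) of the database
docker exec unifi-db mongo --quiet --eval \
  'db.adminCommand({ getParameter: 1, featureCompatibilityVersion: 1 })'

# If the data came from an older Mongo, raise the FCV to match the 3.6 server
docker exec unifi-db mongo --quiet --eval \
  'db.adminCommand({ setFeatureCompatibilityVersion: "3.6" })'
```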
I also see in your compose file that you have DB_MONGO_LOCAL=true; this should be set to DB_MONGO_LOCAL=false for an externalized DB.
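With that change, the environment section of the compose file above would look like this (a sketch based on the posted config, not a tested configuration):

```yaml
    environment:
      - DB_MONGO_LOCAL=false                            # external DB: must be false
      - UNIFI_DB_NAME=unifi
      - DB_MONGO_URI=mongodb://mongo:27017/unifi        # "mongo" = compose service name
      - STATDB_MONGO_URI=mongodb://mongo:27017/unifi_stat
      - TZ=Europe/Berlin
```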
This issue has had no activity for the last 90 days.
Do you still see this issue with the latest release?
Please add a reply within 14 days or this issue will be automatically closed.
To keep a confirmed issue open we can also add a "bug confirmed" tag.
Disclaimer: This is an open community project with limited resources.
Any skilled member of the community may jump in at any time to fix this issue.
That can take a while depending on our busy lives, so please be patient
and take advantage of other resources to help solve the issue.