Docker installation error

I did the Docker installation per the instructions on a fresh Ubuntu machine. Unfortunately, it always hangs here:

celery-export-worker_1  | [2022-02-11 15:32:16,328: INFO/MainProcess] Task baserow.core.trash.tasks.permanently_delete_marked_trash[e1095dad-60cf-4f8f-ae07-e79bf8e7ddec] received
celery-export-worker_1  | [2022-02-11 15:32:16,329: INFO/ForkPoolWorker-2] Cleaning up 0 old jobs
celery-export-worker_1  | [2022-02-11 15:32:16,333: INFO/ForkPoolWorker-2] Task baserow.contrib.database.export.tasks.clean_up_old_jobs[abb43af8-7b9b-496b-922c-182b3eb088a0] succeeded in 0.022469437004474457s: None
celery-export-worker_1  | [2022-02-11 15:32:16,354: INFO/ForkPoolWorker-1] Successfully marked 0 old trash items for deletion as they were older than 72 hours.
celery-export-worker_1  | [2022-02-11 15:32:16,354: INFO/ForkPoolWorker-1] Task baserow.core.trash.tasks.mark_old_trash_for_permanent_deletion[812dd946-465f-4fcf-ac5d-5f7a0b3963f2] succeeded in 0.02271961599763017s: None
celery-export-worker_1  | [2022-02-11 15:32:16,360: INFO/ForkPoolWorker-2] Successfully deleted 0 trash entries and their associated trashed items.
celery-export-worker_1  | [2022-02-11 15:32:16,362: INFO/ForkPoolWorker-2] Task baserow.core.trash.tasks.permanently_delete_marked_trash[e1095dad-60cf-4f8f-ae07-e79bf8e7ddec] succeeded in 0.01974878699547844s: None
redis_1                 | 1:M 11 Feb 2022 15:32:16.420 * 100 changes in 300 seconds. Saving...
redis_1                 | 1:M 11 Feb 2022 15:32:16.422 * Background saving started by pid 75
redis_1                 | 75:C 11 Feb 2022 15:32:16.618 * DB saved on disk
redis_1                 | 75:C 11 Feb 2022 15:32:16.620 * RDB: 0 MB of memory used by copy-on-write
redis_1                 | 1:M 11 Feb 2022 15:32:16.623 * Background saving terminated with success
celery-beat-worker_1    | [2022-02-11 15:37:16,324: INFO/MainProcess] Scheduler: Sending due task baserow.contrib.database.export.tasks.clean_up_old_jobs() (baserow.contrib.database.export.tasks.clean_up_old_jobs)
celery-beat-worker_1    | [2022-02-11 15:37:16,329: INFO/MainProcess] Scheduler: Sending due task baserow.core.trash.tasks.mark_old_trash_for_permanent_deletion() (baserow.core.trash.tasks.mark_old_trash_for_permanent_deletion)
celery-export-worker_1  | [2022-02-11 15:37:16,332: INFO/MainProcess] Task baserow.contrib.database.export.tasks.clean_up_old_jobs[415dd954-7fae-4d1f-ac7e-5a116b8daee0] received
celery-export-worker_1  | [2022-02-11 15:37:16,343: INFO/MainProcess] Task baserow.core.trash.tasks.mark_old_trash_for_permanent_deletion[6059f7c2-033e-4395-889c-22581c81f4f4] received
celery-beat-worker_1    | [2022-02-11 15:37:16,335: INFO/MainProcess] Scheduler: Sending due task baserow.core.trash.tasks.permanently_delete_marked_trash() (baserow.core.trash.tasks.permanently_delete_marked_trash)
celery-export-worker_1  | [2022-02-11 15:37:16,351: INFO/MainProcess] Task baserow.core.trash.tasks.permanently_delete_marked_trash[4e8ebf76-c0a6-49d6-83c2-e274e899c8a5] received
celery-export-worker_1  | [2022-02-11 15:37:16,363: INFO/ForkPoolWorker-2] Cleaning up 0 old jobs
celery-export-worker_1  | [2022-02-11 15:37:16,366: INFO/ForkPoolWorker-1] Successfully marked 0 old trash items for deletion as they were older than 72 hours.
celery-export-worker_1  | [2022-02-11 15:37:16,367: INFO/ForkPoolWorker-1] Task baserow.core.trash.tasks.mark_old_trash_for_permanent_deletion[6059f7c2-033e-4395-889c-22581c81f4f4] succeeded in 0.018752650001260918s: None
celery-export-worker_1  | [2022-02-11 15:37:16,374: INFO/ForkPoolWorker-2] Task baserow.contrib.database.export.tasks.clean_up_old_jobs[415dd954-7fae-4d1f-ac7e-5a116b8daee0] succeeded in 0.02856763899762882s: None
celery-export-worker_1  | [2022-02-11 15:37:16,390: INFO/ForkPoolWorker-1] Successfully deleted 0 trash entries and their associated trashed items.
celery-export-worker_1  | [2022-02-11 15:37:16,390: INFO/ForkPoolWorker-1] Task baserow.core.trash.tasks.permanently_delete_marked_trash[4e8ebf76-c0a6-49d6-83c2-e274e899c8a5] succeeded in 0.019071100003202446s: None

Hi @Magicnetworks ,

Are you sure it is hanging? By default, docker-compose up attaches to all the processes and streams their logs, so the terminal staying busy is expected. What happens when you try to go to localhost:3000 on the same machine?
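One quick way to tell a hung install apart from a normally attached log stream is to probe the port from the same machine. This is just a sketch, assuming the default port 3000 from the install instructions:

```shell
# Probe the Baserow frontend; -f makes curl exit non-zero on HTTP errors,
# and -w prints the status code so you can see what the server answered.
curl -sSf -o /dev/null -w "HTTP %{http_code}\n" http://localhost:3000
```

If this prints HTTP 200, the stack is up and the terminal is simply attached to the logs.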

docker-compose up -d will launch the containers in the background without attaching to the logs. After that you can run commands like docker-compose ps to inspect the state of the services and docker-compose logs to view the logs.
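The detached workflow described above looks like this in practice (the backend service name is an example; run docker-compose ps to see the actual names in your stack):

```shell
# Start the stack detached so your shell is not tied to the log stream
docker-compose up -d

# Check that every service reports an "Up" state
docker-compose ps

# Follow the combined logs of all services (Ctrl+C stops following,
# not the containers)
docker-compose logs -f

# Or inspect a single service by name
docker-compose logs backend
```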