Describe the bug
https://discord.com/channels/1029853095527727165/1139658820533104791/1299009470218768436
Hi! I'm reporting an issue with the self-hosting single-port solution's Dockerfile and Caddyfile.
I did my initial implementation about 6-8 months ago, and back then the GitHub repo https://github.com/reflex-dev/reflex/tree/main/docker-example/simple-one-port contained a different solution.
What's the difference for me? Changing only the Dockerfile and Caddyfile, I've reliably reproduced a bug where, with the new Dockerfile and Caddyfile combination, the live deployment disconnects from its websocket and then either stays disconnected or starts aggressively reconnecting and disconnecting again. This only happens after roughly 15 minutes of uptime, never before!
What is the result? Well, it's ugly. If the frontend loses its websocket connection to the backend, all State variables are reset for the duration of the disconnect. That means the user starts to see blinking forms, variables and error messages if you have validation and invalid-input rules! As a result the Reflex application is unusable, and the only quick fix was to redeploy, which buys another window of roughly 15 minutes during which the websocket connection stays alive.
This does not happen locally, only when building and deploying the single-port Dockerfile solution.
I only found the workaround by accident: "downgrading" the Dockerfile to the previous example, where the Caddyfile is inserted as a string into the Dockerfile.
To me it looks like the reverse-proxy setup dies, due to an internal bug with either the formatting of the file or something else, and that renders the whole application unusable.
Hope this helps! I will provide more info in the thread of this message! ❤️
To Reproduce
So the current new example in the GitHub repo is:
Caddyfile:
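For reference, a single-port Caddyfile for a Reflex app generally looks something like the sketch below. The static root and backend port are my assumptions and may differ from the actual file in the repo; the part that matters for this bug is the /_event route, since that is where the websocket connects.

# Sketch only - not necessarily the repo's exact file. Assumes the
# exported frontend is served from /srv and the Reflex backend listens
# on 127.0.0.1:8000.
:{$PORT}

encode gzip

root * /srv
file_server

# Websocket/event, ping and upload traffic must reach the backend
# instead of the static file server.
reverse_proxy /_event/* 127.0.0.1:8000
reverse_proxy /ping 127.0.0.1:8000
reverse_proxy /_upload/* 127.0.0.1:8000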
Old Dockerfile that works:
Logs that might be relevant:
2024-10-24 12:30:28.045
Successfully started Caddy (pid=332) - Caddy is running in the background
2024-10-24 12:30:28.042
{
"level": "info",
"ts": 1729762228.0412948,
"logger": "tls",
"msg": "cleaning storage unit",
"description": "FileStorage:/root/.local/share/caddy"
}
2024-10-24 12:30:28.040
{
"level": "info",
"ts": 1729762228.0389044,
"msg": "autosaved config (load with --resume flag)",
"file": "/root/.config/caddy/autosave.json"
}
2024-10-24 12:30:28.035
{
"level": "warn",
"ts": 1729762228.031918,
"msg": "Caddyfile input is not formatted; run the 'caddy fmt' command to fix inconsistencies",
"adapter": "caddyfile",
"file": "Caddyfile",
"line": 14
4-10-24 09:49:49.536
{
"level": "info",
"ts": 1729752589.536447,
"logger": "tls",
"msg": "cleaning storage unit",
"description": "FileStorage:/root/.local/share/caddy"
}
(Fly.io log-viewer fields for the entry above: event.provider app, fly.app.instance 1781350b42d738, fly.app.name beamline-portal, fly.region fra, log.level info)
2024-10-24 09:49:49.534
{
"level": "info",
"ts": 1729752589.5346453,
"msg": "autosaved config (load with --resume flag)",
"file": "/root/.config/caddy/autosave.json"
}
2024-10-24 09:49:49.529
{
"level": "warn",
"ts": 1729752589.5291812,
"msg": "Caddyfile input is not formatted; run the 'caddy fmt' command to fix inconsistencies",
"adapter": "caddyfile",
"file": "Caddyfile",
"line": 14
}
2024-10-24 09:49:49.227
INFO Preparing to run: `/bin/sh -c [ -d alembic ] && reflex db migrate; caddy start && redis-server --daemonize yes && exec reflex run --env prod --backend-only` as root
2024-10-24 09:48:53.255
Successfully started Caddy (pid=333) - Caddy is running in the background
2024-10-24 09:48:53.250
{
"level": "info",
"ts": 1729752533.2503912,
"logger": "tls",
"msg": "cleaning storage unit",
"description": "FileStorage:/root/.local/share/caddy"
}
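Side note: the "Caddyfile input is not formatted" warnings in the logs above are, as far as I know, only cosmetic, but they can be silenced by normalizing the file during the image build. A minimal sketch, assuming the Caddyfile is copied to /app/Caddyfile and the caddy binary is already available at that point in the build:

# Hypothetical build steps, not taken from the repo example:
COPY Caddyfile /app/Caddyfile
# Rewrite the file with canonical whitespace so Caddy stops warning
# about inconsistent formatting.
RUN caddy fmt --overwrite /app/Caddyfile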
What I do see with the old solution is that there are a lot of pings and pongs going on in the background; I'm assuming that is what keeps the websocket connection alive, for some odd reason.
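In case it helps with debugging: if it turns out the proxy is dropping idle websocket connections, one experiment (purely a guess on my side, not a confirmed fix) would be to raise Caddy's idle timeout via the global options block at the top of the Caddyfile:

{
	# Hypothetical tweak: lengthen the idle timeout to test whether the
	# ~15 minute disconnects track the proxy rather than the backend.
	servers {
		timeouts {
			idle 30m
		}
	}
}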
message-17.txt
Expected behavior
Frontend and backend remain connected.
Specifics (please complete the following information):