Ease development with Docker #57
base: master
Conversation
@aaronjwood Very exciting! I am testing this out now! FYI we'll have to keep the original process guide in the README (moved to the bottom instead of replaced), since it documents how production is actually running (Linux systemd). Hopefully I work up the courage to switch production to the Docker container.
Sounds good, I'll adjust the README when I get some time in a few days. When I got everything up locally and fixed some crashing around the test data parsing, I found that the UI didn't show the test data that was loaded into the DB anywhere, and the UI was stuck on December 1969. Are you aware of this being an existing issue? I'm guessing it's specific to the local dev env, since things work for me on your live deployment with my PGE data, but I didn't dig in very much to see exactly why it wasn't working. The test data seems to be from 2019, but the front end won't navigate anywhere besides 1969.
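For reference, a UI stuck on December 1969 usually means a zero or missing Unix timestamp is being rendered: epoch 0 is 1970-01-01 UTC, which displays as Dec 31 1969 in US-local timezones. A minimal illustration (not the project's actual code):

```python
from datetime import datetime, timedelta, timezone

# Unix epoch (timestamp 0) rendered in UTC
epoch_utc = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch_utc.isoformat())  # 1970-01-01T00:00:00+00:00

# The same instant in a US timezone (UTC-8) lands on Dec 31 1969,
# which is why a defaulted/zero timestamp shows up as "December 1969"
pacific = timezone(timedelta(hours=-8))
print(datetime.fromtimestamp(0, tz=pacific).isoformat())  # 1969-12-31T16:00:00-08:00
```

So a likely culprit is the front end receiving no (or zero) timestamps for the local test data and falling back to epoch 0.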
```dockerfile
WORKDIR /frontend
RUN npm ci && npm run build

FROM python:3.8-slim
```
@JPHutchins what do you think about moving to PyPy for the JIT sweetness?
@aaronjwood Unfortunately I am in a "how is this even working" sorta situation with the MQ and Celery tasks on the live server... The Docker container works for me up to the point of queuing the async jobs. LMK if this flow is working for you in the Docker container: https://github.com/JPHutchins/open-energy-view#example-account-setup Here's a description of what is supposed to be happening.
As I mentioned, in production these are all running from systemd. I've inspected my config and it does not seem to differ from what you have set up in the Docker container. LMK what you find when you run that flow. It's critical for development to be able to mock the PGE request/response in the development environment so that we have an efficient way to test data parsing, fetching, etc. Thank you for your help! EDIT: just confirmed that the "fake fetch" is working in production.
EDIT2: if it's not clear, the architecturally f*cky thing here is that the insert_to_db task needs the "Flask application context" in order to set up the SQL ORM (SQLAlchemy).
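The shape of that constraint, sketched with stdlib stand-ins (in the real app the `App` class below would be the Flask app with its `app_context()`, and the wrapper would be a Celery `Task` subclass; none of these names are the project's actual code):

```python
import contextlib

class App:
    """Stand-in for the Flask app; tracks whether a context is active."""
    def __init__(self):
        self.context_active = False

    @contextlib.contextmanager
    def app_context(self):
        self.context_active = True
        try:
            yield self
        finally:
            self.context_active = False

def make_context_task(app, func):
    """Wrap a task so its body always runs inside app.app_context()."""
    def task(*args, **kwargs):
        with app.app_context():
            return func(*args, **kwargs)
    return task

app = App()

def insert_to_db():
    # In the real code this is where SQLAlchemy needs the Flask context.
    return app.context_active

insert_task = make_context_task(app, insert_to_db)
print(insert_task())  # True: the task body saw an active context
```

The point is that the Celery worker process never handles a request, so the context has to be entered explicitly around every task body.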
JFC there is some embarrassing code in here:

```python
finally:
    pass
```
Is this still being worked on? If not, I think a good approach would be to at least make Docker optional, so as not to disturb the original flow. That way people like me can use the Docker container and others can run the app straight up. For things like

we can convert it to

same thing with base paths
I never got Docker working 😬. If you're interested in hosting a project like this, I think this repo can serve as a proof of concept, but a new website that implements Green Button Connect My Data for users should start from scratch. Something like Celery shouldn't even be needed; the architecture can be simplified.
Yeah, I was mostly thinking about self-hosting this if it can provide good information/dashboards on my PG&E usage. I assumed this was only a self-hosted project repo since https://www.openenergyview.com/ seems to be down.
This is the repository for openenergyview.com, but I don't have time to maintain it. It uses PGE Share My Data OAuth to get Green Button data from every user that registers (with OEV). OEV stores the data in a DB to make it easy for users to see their data.

The original goal was to pipe that data to Home Assistant users so that they could have energy dashboards for free, then use the on/off states of smart home devices to determine individual devices' power consumption without needing any additional sensors. Further, utility companies are required to implement Green Button, so this is scalable to the entire US; auth and APIs would be handled uniquely for each utility company. Needless to say, it fell short.

If you are only trying to get your own data, then the companion SMD repo might work for you. You would still need to run a web server, unless PGE has updated the APIs to avoid that 🤣.
One command and you're good to go :)