fix(parsers, CO): adopt use_proxy, fix parsing, add tests #7690
VIKTORVAV99 merged 5 commits into electricitymaps:master
Conversation
Force-pushed from 45cb72f to a0d7277
@VIKTORVAV99 I got this parser functional with this PR, but it builds on #7686 currently, so I've made it a draft until that PR resolves one way or another.
Force-pushed from a0d7277 to 93c52b6
Force-pushed from 1513212 to c5f8eb7
Learned more about pandas doing this work! It now works, and tests are added to help future maintenance of this parser.
```diff
 @use_proxy(country_code="CO", monkeypatch_for_pydataxm=True)
 def fetch_consumption(
-    zone_key: ZoneKey,
+    zone_key: ZoneKey = ZoneKey("CO"),
```
While I think this makes sense, I need to double check a few things internally before we can merge it: specifically, how much bandwidth/data we have available, since they don't yet support pay-as-you-go billing, and whether we need to increase it so that not all parsers that use the proxy go down.
Hopefully I can get this done tomorrow or on Friday; otherwise it will be early next week.
I'll fix the merge conflict tonight!
Force-pushed from c5f8eb7 to 60682fb
A rebase resolved everything; a commit in this PR was dropped because it had already been applied via #7741.
Force-pushed from 60682fb to 388b0d9
I've been monitoring the current proxy bandwidth usage, and it seems like we have some room to add more parsers to it, so this should be fine to merge. I'll take a quick final look tomorrow and then merge it!
Please, how do we use the proxy? How do you get the proxy username and password?
We set up our own proxy; you'll have to do the same if you want to use this approach.
Do I have to build a proxy from scratch? Is there a simpler approach that doesn't require creating a proxy? |
You can if you wish, but there are plenty of solutions out there already. Sadly, we did not find a simpler approach, but if you do, please share it.
Okay, thank you. Another question: does the API have historical data on energy and emissions from power plants in Australia?
Hey @Mbryan30, if you're talking about the ElectricityMaps API, it does not contain energy and emissions at the power plant level, only at the region level. If you're trying to use our API, you can find all the information on our website: https://app.electricitymaps.com/dashboard. Let us know if you have any issues. If you're talking about the API we get the Australian data from, it might have per-power-plant energy and emissions data; I don't remember off the top of my head. Since this issue is related to Colombia, I suggest opening another issue if your question is about data parsers, or trying our API and contacting us via Intercom if you have trouble using it.
Issue
Description
CO proxy needed and added with monkeypatching
The CO parser now needs to be proxied from CO. However, since the CO parser uses the pydataxm library to make requests, and that library can't be configured to use a proxy, I've temporarily monkeypatched `aiohttp.ClientSession.__init__` and `requests.post` so that traffic is proxied anyway when they are used by pydataxm. An upstream issue about this is EquipoAnaliticaXM/API_XM#37.
Live consumption parser fix
The API returned a datetime string with sub-second precision, so I remove the sub-seconds before parsing with `datetime.fromisoformat`, which in Python <= 3.10 doesn't handle that.
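A minimal sketch of that cleanup, assuming the problematic strings are naive timestamps like `2024-05-01T12:30:00.1234567` with no UTC offset after the fraction (the function name is hypothetical):

```python
from datetime import datetime

def parse_api_datetime(raw: str) -> datetime:
    # On Python <= 3.10, fromisoformat() only accepts 3- or 6-digit
    # fractional seconds, so drop the sub-second part entirely before
    # parsing. Assumes no timezone offset follows the fraction.
    return datetime.fromisoformat(raw.split(".")[0])
```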
Historical parsers fixes
While `pydataxm.ReadDB().request_data()` has a function signature with `start_date` and `end_date`, it turns out that sometimes these aren't respected and data for additional dates is returned. This seems to happen when requesting data from less recent dates. I've updated the consumption, production, and price parsers, which all had this issue, to handle receiving multiple dates. The production parser didn't raise an error, but I think it generated incorrect data instead.
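One defensive way to handle this, assuming the response lands in a pandas DataFrame with a date column (the column name `Date`, the function name, and the exact comparison semantics are assumptions for illustration), is to filter back to the requested window before doing anything else:

```python
import pandas as pd

def filter_to_requested_range(df: pd.DataFrame, start_date, end_date,
                              date_col: str = "Date") -> pd.DataFrame:
    # request_data() can return rows outside the requested window,
    # so keep only rows whose date falls within [start_date, end_date].
    # date_col="Date" is a hypothetical column name.
    dates = pd.to_datetime(df[date_col])
    mask = (dates >= pd.Timestamp(start_date)) & (dates <= pd.Timestamp(end_date))
    return df.loc[mask]
```

Filtering up front means the per-date grouping and aggregation downstream only ever sees the dates the caller asked for.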
Double check
- I have run `poetry run test_parser "zone_key"`.
- I have run `pnpx prettier@2 --write .` and `poetry run format` in the top level directory to format my changes.