I'm trying to import data (a few large .csv files) into the database. I'm using the Importer from inside a Docker container. With some of my files, I have a problem with the number of characters per column. Can I change the CSV parser configuration somehow? I'd like to go the easiest possible way...
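If the importer happens to be built on the univocity-parsers library (an assumption — check the project's actual dependencies), the per-column character limit is controlled by `CsvParserSettings.setMaxCharsPerColumn`, where `-1` removes the limit. A minimal sketch:

```java
import com.univocity.parsers.csv.CsvParser;
import com.univocity.parsers.csv.CsvParserSettings;

import java.io.StringReader;
import java.util.List;

public class MaxCharsExample {
    public static void main(final String[] args) {
        final CsvParserSettings settings = new CsvParserSettings();
        // univocity's default limit is 4096 chars per column;
        // -1 disables the limit (use a concrete cap for untrusted input).
        settings.setMaxCharsPerColumn(-1);

        final CsvParser parser = new CsvParser(settings);
        final List<String[]> rows = parser.parseAll(new StringReader("a,b\n1,2\n"));
        System.out.println(rows.size());
    }
}
```

Whether this setting is actually exposed through the Importer's own configuration is a separate question for the maintainers.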
@robfrank We should refactor the parser architecture to allow passing an InputStream as input, so we can write tests much faster by just creating the CSV in memory in the test.
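The refactor described above could look roughly like this — a sketch with hypothetical names (`CsvParserSketch`, `parse`), showing how accepting an `InputStream` lets a test feed an in-memory CSV via `ByteArrayInputStream` instead of a temp file:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class CsvParserSketch {
    // Hypothetical parser entry point: takes any InputStream rather than a
    // file path, so callers (and tests) decide where the bytes come from.
    public static List<String[]> parse(final InputStream in) throws Exception {
        final List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null)
                rows.add(line.split(",", -1)); // naive split; a real parser handles quoting
        }
        return rows;
    }

    public static void main(final String[] args) throws Exception {
        // The "test": build the CSV entirely in memory, no filesystem involved.
        final String csv = "id,name\n1,alice\n2,bob\n";
        final List<String[]> rows =
            parse(new ByteArrayInputStream(csv.getBytes(StandardCharsets.UTF_8)));
        System.out.println(rows.size());
        System.out.println(rows.get(1)[1]);
    }
}
```

The design point is that the file-handling concern (opening, decoding, closing) moves to the caller, which also makes the parser reusable for network streams or compressed sources.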
This issue was discussed on the Discord channel (the question is quoted above).