Hello,


I’m working on a project that involves converting a local PostgreSQL 
database (which uses SQLAlchemy as the ORM) into a web application, in which 
I upload Excel sheets, read them, do some light cleaning, and then load 
selected data into a PostgreSQL database using Django’s ORM. The idea is to 
keep the data on a server instead of on every user’s machine. 

Everything works, but data loading is taking too long. I suspect it’s 
because I am using pandas DataFrames to structure, read, and save the data; 
in the local version of the library I used plain lists and it was much faster.


I don’t know if it’s related to SQLAlchemy, Django, lists, or DataFrames. 
Any suggestions on how to read spreadsheet data and upload it into a 
PostgreSQL database using Django?
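For reference, here is roughly the pattern I mean, with the per-row save I 
suspect is the bottleneck sketched in comments. The `Measurement` model and 
the column names are placeholders, not my real schema:

```python
# Sketch of the flow described above; `Measurement` and the column
# names are placeholders, not the real schema.
import pandas as pd

def clean_rows(df):
    """Drop incomplete rows and return plain dicts, ready for the ORM."""
    df = df.dropna(subset=["name", "value"])
    return df.to_dict(orient="records")

def load_sheet(path):
    df = pd.read_excel(path)  # needs openpyxl installed for .xlsx files
    rows = clean_rows(df)
    # Slow version: one INSERT (and query round-trip) per row.
    #     for row in rows:
    #         Measurement.objects.create(**row)
    # Faster: build objects in memory, then insert in batches.
    #     objs = [Measurement(**row) for row in rows]
    #     Measurement.objects.bulk_create(objs, batch_size=1000)
    return rows
```

(The Django calls are left as comments since they need a configured project 
to run; `bulk_create` is the standard ORM way to avoid per-row inserts.)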


Thanks a lot.

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/django-users.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/django-users/c43db26f-45fd-4769-b8ce-7f34cc7ca594%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.