Description
Hi All,
This problem has been haunting me for quite some time, and I need to resolve it asap. When I try to batch insert data from a DataFrame, I get an error saying the array size of ... is too large. These are my current lines for the executemany call:
# convert each DataFrame row to a tuple, then insert all rows in a single executemany call
x = [tuple(i) for i in df.values]
cursor.executemany(insert_sql, x)
I have tried some of the resolutions found on the internet, but nothing works. I am aware that I could split the DataFrame into chunks, but I cannot find any proper code that works. From what I have read in the cx_Oracle documentation, the data size limit for executemany is 2 GB. Here are my questions:
- Can the 2 GB data size limit for executemany be increased?
- If the limit cannot be increased, how can I properly slice my one large dataset into chunks and insert them chunk by chunk using executemany?
--EDIT--
Let me rephrase the question:
How should I chunk my large dataset so I can insert it into my Oracle database table using executemany?
Environment: Windows 10 - Python 3.7.9 - cx_Oracle 8.1 - Oracle Client 19.10
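For reference, the sketch below is roughly the kind of chunked insert I have in mind (this is not working code from my project: the DataFrame contents, connection details, table name my_table, column names, and insert statement are placeholders I made up):

import cx_Oracle
import pandas as pd

# stand-in DataFrame; in my case df comes from elsewhere and is much larger
df = pd.DataFrame({"col1": [1, 2, 3], "col2": ["a", "b", "c"]})

# placeholder connection details and target table
connection = cx_Oracle.connect("user", "password", "host/service_name")
cursor = connection.cursor()
insert_sql = "INSERT INTO my_table (col1, col2) VALUES (:1, :2)"

# rows per executemany call; intended to keep each batch well under the 2 GB limit
batch_size = 50_000

rows = [tuple(r) for r in df.values]
for start in range(0, len(rows), batch_size):
    # insert one slice of the data at a time
    cursor.executemany(insert_sql, rows[start:start + batch_size])

connection.commit()
cursor.close()
connection.close()

Is slicing the list of row tuples by a fixed batch_size like this the right approach, or is there a better way to do it?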
--END--
Thanks everyone!