Cannot insert large data sets in Oracle table using executemany() #548

Closed
@megurikun08

Description

Hi All,

This problem has been haunting me for quite some time, and I need to resolve it ASAP. When I try to batch insert data from a DataFrame, I get an error saying "array size of ... is too large". This is my current code for the executemany() call:

x = [tuple(i) for i in df.values]  # convert each DataFrame row to a plain tuple
cursor.executemany(insert_sql, x)  # attempt to insert every row in a single batch

I have tried several solutions found on the internet, but nothing works. I am aware that I can split the DataFrame into chunks, but I have not found any working example code. From the cx_Oracle documentation, I understand that the data size limit for executemany() is 2 GB. Here are my questions:

  1. Can the 2 GB data size limit for executemany() be raised?
  2. If not, how can I properly slice my one large dataset into chunks and insert them chunk by chunk using executemany()? (A sketch of one approach follows this list.)
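
A minimal sketch of one way to chunk the insert, assuming `df`, `insert_sql`, and an open `connection` from the snippet above; BATCH_SIZE is an arbitrary starting point to tune for your row width:

# assumptions: `df` is the pandas DataFrame, `insert_sql` the INSERT statement,
# and `connection` an open cx_Oracle connection
BATCH_SIZE = 50_000  # arbitrary; tune so each batch stays well under the limit

rows = list(df.itertuples(index=False, name=None))  # rows as plain tuples

with connection.cursor() as cursor:
    for start in range(0, len(rows), BATCH_SIZE):
        # each call sends at most BATCH_SIZE rows, keeping the array size small
        cursor.executemany(insert_sql, rows[start:start + BATCH_SIZE])
connection.commit()  # one commit after all batches succeed

Note that itertuples(index=False, name=None) yields plain tuples directly, which also avoids the dtype coercion that df.values can introduce on mixed-type frames.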

--EDIT--
Let me rephrase the question:

How should I chunk my large dataset so that I can insert it into my Oracle database table using executemany()?

Windows 10, Python 3.7.9, cx_Oracle 8.1, Oracle Client 19.10
--END--

Thanks everyone!
