
I have the following case:

import psycopg2 as ps

conn = ps.connect("dbname='xxxx' user='xxxx' password='xxxx'")
crs = conn.cursor()
statement = """SELECT * FROM aaaa;"""
crs.execute(statement)
# get the name of the cars
var = crs.fetchall()

But given the large size of the table, Ubuntu kills the process because of memory issues.

How can I fetch N rows at a time in a loop? For instance, something like this:

N = 1000
for i in range(0, 10):
    var = crs.fetchmany(N)

3 Answers


fetchmany(n) returns up to the next n rows, and an empty list once there are none left, so you can loop until it comes back empty:

rows = crs.fetchmany(1000)
while rows:
    # do stuff with this batch of rows
    rows = crs.fetchmany(1000)
# done
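A self-contained sketch of this batching loop, using Python's built-in sqlite3 module as a stand-in for psycopg2 (both follow the DB-API, so fetchmany behaves the same way; the table name and demo data are invented for the example):

```python
import sqlite3

# In-memory demo database standing in for the PostgreSQL table
conn = sqlite3.connect(":memory:")
crs = conn.cursor()
crs.execute("CREATE TABLE aaaa (id INTEGER, name TEXT)")
crs.executemany("INSERT INTO aaaa VALUES (?, ?)",
                [(i, "car%d" % i) for i in range(2500)])

crs.execute("SELECT * FROM aaaa")

total = 0
rows = crs.fetchmany(1000)
while rows:                      # empty list is falsy, so the loop ends
    total += len(rows)           # process this batch here
    rows = crs.fetchmany(1000)

print(total)                     # 2500 rows seen, at most 1000 held at once
conn.close()
```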



If you are using psycopg2:

cur.fetchmany(2)

The cursor class allows you to fetchone, fetchmany, or fetchall.
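Note that a plain client-side psycopg2 cursor still transfers the whole result set on execute(); to keep memory bounded you can create a server-side cursor by giving it a name, e.g. conn.cursor(name='my_cursor'), and then fetchmany pulls rows from the server in batches. The fetch methods themselves are part of the DB-API, which the standard-library sqlite3 module also implements, so their behavior can be sketched without a PostgreSQL server (table and data here are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (x INTEGER)")
cur.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(5)])

cur.execute("SELECT x FROM t ORDER BY x")
first = cur.fetchone()       # a single row, as a tuple
pair = cur.fetchmany(2)      # up to 2 more rows, as a list of tuples
rest = cur.fetchall()        # whatever is left

print(first, pair, rest)     # (0,) [(1,), (2,)] [(3,), (4,)]
conn.close()
```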



I solved it in the following way, by paging through the table with LIMIT/OFFSET:

query = """SELECT COUNT(*) FROM xxxxx;"""
crs.execute(query)
N = crs.fetchone()[0]        # total number of rows
step = 1000
for n in range(0, N, step):
    statement = "SELECT * FROM xxxxx LIMIT %d OFFSET %d;" % (step, n)
    crs.execute(statement)
    var = crs.fetchall()     # at most `step` rows held in memory
    print(n, n + len(var))
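The same LIMIT/OFFSET paging can be sketched end to end with the standard-library sqlite3 module as a stand-in for psycopg2 (table name and data are made up; an ORDER BY is added because paging without a stable row order can skip or repeat rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
crs = conn.cursor()
crs.execute("CREATE TABLE xxxxx (id INTEGER)")
crs.executemany("INSERT INTO xxxxx VALUES (?)", [(i,) for i in range(2500)])

crs.execute("SELECT COUNT(*) FROM xxxxx")
total = crs.fetchone()[0]

step = 1000
fetched = 0
for offset in range(0, total, step):
    crs.execute("SELECT * FROM xxxxx ORDER BY id LIMIT ? OFFSET ?",
                (step, offset))
    batch = crs.fetchall()   # at most `step` rows in memory per iteration
    fetched += len(batch)

print(fetched)               # 2500
conn.close()
```

Note that each OFFSET query re-scans the skipped rows on the server, so for very large tables a server-side cursor (or keyset pagination on an indexed column) is usually cheaper.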

1 Comment

I tried to give an answer. Thanks for your comments.
