
This is a simple script that extracts data from an API and writes it to a CSV file, but it always gets killed. There is no error description, just a "Killed" message.

I'm guessing it's due to some memory issue. Is there a way to optimize this so that it doesn't eat up so much system memory? The data being processed is only around 100k.

import csv

with open('test4.csv', 'w', encoding='utf-8') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=ticket_info, extrasaction='ignore')
    writer.writeheader()

    for ticket in zenpy_client.search_export(updated_between=[start_date, end_date],
                                             type='ticket', form='1900000029568'):
        writer.writerow(ticket.to_dict())
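(The write loop itself already streams one row at a time, so if memory grows it is likely the export call buffering the full result set. A common workaround, sketched below under the assumption that each `search_export` call for a narrower date window holds less in memory at once, is to split `[start_date, end_date]` into smaller windows. The `date_windows` helper is hypothetical, not part of zenpy.)

```python
from datetime import datetime, timedelta

def date_windows(start, end, step_days=1):
    """Yield (window_start, window_end) pairs covering [start, end)."""
    cursor = start
    while cursor < end:
        nxt = min(cursor + timedelta(days=step_days), end)
        yield cursor, nxt
        cursor = nxt

# Hypothetical usage with the zenpy client from the question:
# for w_start, w_end in date_windows(start_date, end_date):
#     for ticket in zenpy_client.search_export(
#             updated_between=[w_start, w_end], type='ticket',
#             form='1900000029568'):
#         writer.writerow(ticket.to_dict())
```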

Edit: I tried to monitor the process and noticed that it gets killed when it consumes more than 85% of memory.

 PID   USER      PR  NI  VIRT  RES  SHR  S %CPU %MEM    TIME+  COMMAND
 28789 root      20   0  7004m 6.7g 3584 R 50.0 85.2   2:48.29  python
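(To narrow down where the growth happens, one option is to log the process's own peak memory from inside the loop rather than watching `top` externally. A minimal sketch using the standard-library `resource` module, which is Unix-only; the logging placement is illustrative.)

```python
import resource

def peak_rss_mb():
    """Peak resident set size of this process, in MiB.
    ru_maxrss is reported in kilobytes on Linux (bytes on macOS)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

# e.g. inside the ticket loop, log every 10k rows:
# if rows_written % 10000 == 0:
#     print(f"rows={rows_written} peak_rss={peak_rss_mb():.1f} MiB")
```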
  • "Data being processed is just around 100k": 100k what? Rows? Bytes? Commented Nov 29, 2022 at 14:33
  • @PranavHosangadi rows and around 4mb total. Commented Nov 29, 2022 at 14:35
  • That's not a whole lot. Why do you think this is a memory problem? Have you done any debugging, such as stepping through your program with a debugger to isolate the problem? Are you running this on your computer or on a remote server? Does it work anywhere else? Commented Nov 29, 2022 at 14:37
  • it runs on our remote server. it runs fine on my local pc. Commented Nov 29, 2022 at 14:40
