This is a simple script that extracts data from an API and writes it to a CSV file, but the process always gets killed. There is no error traceback, just a "Killed" message.
I'm guessing it's a memory issue. Is there a way to optimize this so it doesn't eat up so much system memory? The data being processed is only around 100k records.
import csv

# zenpy_client, ticket_info, start_date and end_date are defined earlier in the script
with open('test4.csv', 'w', newline='', encoding='utf-8') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=ticket_info, extrasaction='ignore')
    writer.writeheader()
    for ticket in zenpy_client.search_export(updated_between=[start_date, end_date],
                                             type='ticket', form='1900000029568'):
        writer.writerow(ticket.to_dict())
# no explicit csvfile.close() needed -- the with-block closes the file on exit
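For reference, here is a minimal self-contained sketch of why writing rows as they arrive should keep peak memory flat, while materializing all results into a list first does not. `fake_tickets` below is a hypothetical stand-in for `zenpy_client.search_export(...)`, which yields results one at a time:

```python
import csv
import os
import tracemalloc

def fake_tickets(n):
    # stand-in generator for zenpy_client.search_export(...):
    # yields one small dict per ticket instead of real API objects
    for i in range(n):
        yield {"id": i, "subject": f"ticket {i}"}

def dump(rows, path=os.devnull):
    # stream rows straight to disk, one at a time
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "subject"], extrasaction="ignore")
        writer.writeheader()
        for row in rows:
            writer.writerow(row)

tracemalloc.start()
dump(fake_tickets(100_000))                # streamed: only one row alive at a time
streamed = tracemalloc.get_traced_memory()[1]
tracemalloc.reset_peak()                   # requires Python 3.9+
dump(list(fake_tickets(100_000)))          # materialized: all 100k dicts in RAM at once
listed = tracemalloc.get_traced_memory()[1]
print(f"streamed peak: {streamed:,} bytes; list peak: {listed:,} bytes")
```

Since the script above already streams rows to the writer, the growth more likely comes from objects retained elsewhere (for example, client-side caching of every result object), which this sketch does not model.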
Edit: I tried to monitor the process and noticed that it gets killed when it consumes more than 85% of memory.
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
28789 root 20 0 7004m 6.7g 3584 R 50.0 85.2 2:48.29 python