arrays - Creating Really Big Python Lists


I am trying to create an image database compatible with cPickle. The list starts empty, and the data of each image in the directory is added as a new row. The images are 224x224 and average about 8 KB on disk. After loading around 10,000 images the PC hangs: there is no mouse movement, nothing responds, and it needs a restart. Below is the code snippet for this.

import csv
from PIL import Image

cr = csv.reader(open(csv_file, "rb"))
for row in cr:
    print row[0], row[1]
    try:
        image = Image.open(row[0] + '.jpg').convert('LA')
        # keep only the luminance value of each (L, A) pixel tuple
        pixels = [f[0] for f in list(image.getdata())]
        dataset.append(pixels)
        labels.append(row[1])
        del image
    except IOError:
        print("image not found")

I tried reducing the size of the images to 28x28 and it works, but I don't want to reduce the size of the images. I am using the 64-bit Python executable, with 4 GB of RAM, on Ubuntu 14.04. I suspect this is happening due to limited stack space, with the list taking more than the available stack space. If so, how do I create such a huge list? Is there a workaround for this issue? My end goal is to create a numpy array with the pixel data as rows, by converting the list to a numpy array. Is there a solution to this problem?
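As an aside on the numbers involved (not part of the original post): a rough sketch of why a list of per-pixel Python ints dwarfs the raw image size, computed for a single 224x224 image:

```python
import sys

# a 224x224 grayscale image kept as a plain Python list costs far
# more than the ~50 KB of raw pixel data, since the list stores a
# pointer per element and each element is a full int object
pixels = [0] * (224 * 224)
list_overhead = sys.getsizeof(pixels)   # the pointer array alone
raw_bytes = 224 * 224                   # 1 byte per pixel as uint8
print(list_overhead, raw_bytes)
```

Multiplied by 10,000 images, that overhead easily exhausts 4 GB of RAM, which points at heap memory rather than stack space as the culprit.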

If your data is going to end up in a numpy array anyway, maybe try using numpy.memmap. It works like a "normal" numpy array; the difference is that the data is actually stored on disk in binary form. Only the requested chunks of the array are put in RAM, which may get rid of your problem.
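A minimal sketch of the idea, assuming 10,000 grayscale 224x224 images stored as uint8 (the file name 'dataset.dat' is made up for the example):

```python
import numpy as np

# create a disk-backed array; only the chunks actually accessed are
# paged into RAM ('dataset.dat' is a stand-in backing file)
n_images, h, w = 10000, 224, 224
dataset = np.memmap('dataset.dat', dtype=np.uint8, mode='w+',
                    shape=(n_images, h * w))

# write one row; it goes to the backing file, not the Python heap
dataset[0] = 255
dataset.flush()
```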

If the size of the data array is already determined, you just need to set the correct dimensions when creating the memmap object. If not, check out numpy.memmap.resize, and you should be able to create it anyway.
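When the number of images is not known up front, one simple alternative to resizing is a first pass over the CSV just to count the rows, then creating the memmap with that shape. A sketch (the file names are placeholders; the small CSV is generated so the example is self-contained):

```python
import csv
import numpy as np

# make a small stand-in CSV so the sketch is self-contained
with open('labels.csv', 'w') as f:
    csv.writer(f).writerows([['img%d' % i, i % 2] for i in range(5)])

# pass 1: count the rows to fix the memmap's first dimension
with open('labels.csv') as f:
    n_rows = sum(1 for _ in csv.reader(f))

# pass 2 would then fill this array one image row at a time
pixels = np.memmap('pixels.dat', dtype=np.uint8, mode='w+',
                   shape=(n_rows, 224 * 224))
```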

Oh, and there are other solutions, such as PyTables.
Good luck!

