Out of memory (OOM) using TensorFlow GradientTape, but only when I append to a list
Question: I've been working on a data set of shape (1000, 3253) using a CNN. I'm computing gradients with `tf.GradientTape`, but the script keeps running out of memory. Yet if I remove the line that appends a gradient calculation to a list, the script runs through all the epochs. I'm not entirely sure why this happens, and I'm also new to TensorFlow and the use of GradientTape. Any advice or input would be appreciated.

```python
# create a batch loop
for x, y_true in train_dataset:
    # create a tape to record actions
    with tf
```
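A likely cause (stated here as a guess, not a confirmed diagnosis): each tensor returned by `tape.gradient(...)` stays referenced as long as it sits in the Python list, so the memory from every batch accumulates instead of being freed. The sketch below simulates this with NumPy arrays as stand-ins for gradient tensors; the function name `run_epochs` and the scalar-summary workaround (e.g. keeping only a norm, or calling `.numpy()` to copy off the device in real TF code) are illustrative, not from the original script.

```python
import numpy as np

def run_epochs(n_batches, keep_full_grads):
    """Simulate a training loop where each batch yields a large 'gradient'.

    If keep_full_grads is True, the full array is appended to a list
    (mirroring `grads_list.append(tape.gradient(...))`), so every batch's
    data stays referenced and memory grows linearly with batch count.
    Otherwise only a small scalar summary (the gradient norm) is kept.
    """
    history = []
    for _ in range(n_batches):
        grad = np.random.randn(1000, 3253)  # same shape as the data set
        if keep_full_grads:
            history.append(grad)  # keeps ~26 MB per batch alive
        else:
            history.append(float(np.linalg.norm(grad)))  # a few bytes
    return history

# Keeping a scalar per batch stays tiny no matter how many epochs run.
small = run_epochs(5, keep_full_grads=False)
big = run_epochs(5, keep_full_grads=True)
```

In real TensorFlow code, the analogous fix would be to append a detached copy (`grad.numpy()`) or a reduced statistic instead of the live tensor, so the tape's buffers can be released after each batch.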