Google Colaboratory: misleading information about its GPU (only 5% RAM available to some users)

Backend · Unresolved · 9 answers · 617 views

Asked by 滥情空心 on 2020-12-02 03:40

Update: this question is related to Google Colab's "Notebook settings: Hardware accelerator: GPU". This question was written before the "TPU" option was added.

9 Answers
  • 2020-12-02 04:14

    I believe this happens when you have multiple notebooks open: just closing a notebook doesn't actually stop its process. I haven't figured out how to stop it cleanly, but I used top to find the PID of the python3 process that had been running longest and was using the most memory, and killed it. Everything is back to normal now. (See the sketch below.)
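
    A minimal Python sketch of that cleanup, assuming the psutil package is available (it typically is on Colab); check the printed list before uncommenting the kill line:

    import os
    import signal
    import psutil

    me = os.getpid()

    # List other python processes, largest resident memory first.
    procs = [p for p in psutil.process_iter(['pid', 'name', 'memory_info'])
             if p.info['name'] and 'python' in p.info['name'] and p.info['pid'] != me]
    procs.sort(key=lambda p: p.info['memory_info'].rss, reverse=True)

    for p in procs:
        print(p.info['pid'], p.info['memory_info'].rss // 2**20, 'MiB')

    # After confirming which PID is the stale run, kill the worst offender:
    # os.kill(procs[0].info['pid'], signal.SIGKILL)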

  • 2020-12-02 04:15

    Google Colab's resource allocation is dynamic and based on each user's past usage. If one user has been consuming a lot of resources recently while another uses Colab less frequently, the less frequent user will be given relatively more preference in resource allocation.

    Hence, to get the most out of Colab, close all your other Colab tabs and active sessions, then reset the runtime of the one you want to use. You should get a better GPU allocation; you can check what you were given with the query below.
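
    One quick way to see which GPU you were actually given is to query nvidia-smi, which is present on Colab GPU runtimes (a hedged sketch; the exact figures will vary by allocation):

    import subprocess

    # Ask the driver for the GPU model plus total and free memory.
    print(subprocess.check_output(
        ['nvidia-smi', '--query-gpu=name,memory.total,memory.free',
         '--format=csv'], text=True))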

  • 2020-12-02 04:24

    Just give Google Colab a heavy task and it will offer to switch you to a 25 GB RAM runtime. The model below is deliberately oversized: its stacked Dense(25600) layers alone hold roughly two billion parameters, enough to exhaust the standard runtime's memory.

    For example, run this code twice:

    from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
    from keras.models import Sequential
    from keras.datasets import cifar10

    # CIFAR-10: 50,000 training images of shape (32, 32, 3).
    (train_features, train_labels), (test_features, test_labels) = cifar10.load_data()

    model = Sequential()

    # Three conv/pool stages reduce the 32x32 input down to 4x4x64.
    model.add(Conv2D(filters=16, kernel_size=(2, 2), padding="same", activation="relu", input_shape=train_features.shape[1:]))
    model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))

    model.add(Conv2D(filters=32, kernel_size=(3, 3), padding="same", activation="relu"))
    model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))

    model.add(Conv2D(filters=64, kernel_size=(4, 4), padding="same", activation="relu"))
    model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))

    model.add(Flatten())

    # Deliberately oversized dense stack: each 25600 -> 25600 connection
    # alone has ~655 million weights, which is what exhausts the RAM.
    model.add(Dense(25600, activation="relu"))
    model.add(Dense(25600, activation="relu"))
    model.add(Dense(25600, activation="relu"))
    model.add(Dense(25600, activation="relu"))
    model.add(Dense(10, activation="softmax"))

    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

    model.fit(train_features, train_labels, validation_split=0.2, epochs=10, batch_size=128, verbose=1)
    

    Then, when the runtime crashes with an out-of-memory error, click on "Get more RAM". :)
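
    For a rough sense of why this model blows past the standard runtime's memory, here is a back-of-the-envelope count of the dominant dense-layer parameters (plain arithmetic, not Colab-specific):

    # The three 25600 -> 25600 dense connections dominate the parameter count.
    dense_params = 3 * (25600 * 25600 + 25600)        # weights + biases
    print(f"{dense_params:,} parameters")             # ~1.97 billion
    print(f"~{dense_params * 4 / 2**30:.1f} GiB as float32 weights alone")
    # Adam also keeps two extra slots per weight (plus gradients), so the
    # working set is several times larger, which triggers the OOM crash.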
