random-seed

Why is numpy random seed not remaining fixed but RandomState is when run in parallel?

廉价感情. Submitted on 2020-06-08 11:07:40
Question: I am running a Monte Carlo simulation in parallel using joblib. I noticed, however, that although my seeds were fixed, my results kept changing. When I ran the process serially, the results stayed constant, as I expected. Below I implement a small example, simulating the mean of a normal distribution with a relatively high variance.

Load libraries and define the function:

    import numpy as np
    from joblib import Parallel, delayed

    def _estimate_mean():
        np.random.seed(0)
        x = np.random.normal(0, 2, size=100)
        return np
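
The excerpt above is cut off mid-snippet. Below is a minimal completed sketch, not the poster's original code (the sample-mean return value is assumed from context): passing an explicit seed into each task and drawing from a private np.random.RandomState, instead of the global np.random.seed state, keeps the parallel results reproducible.

    import numpy as np
    from joblib import Parallel, delayed

    def _estimate_mean(seed):
        rng = np.random.RandomState(seed)   # private generator, not the global state
        x = rng.normal(0, 2, size=100)      # same draw for a given seed, any backend
        return x.mean()

    # Identical seeds give identical results whether run serially or in parallel.
    serial = [_estimate_mean(s) for s in range(4)]
    parallel = Parallel(n_jobs=2)(delayed(_estimate_mean)(s) for s in range(4))
    assert serial == parallel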

Boost Graph Library, Erdos Renyi Generator. Graphs always have same number of edges

岁酱吖の Submitted on 2020-05-16 02:42:07
Question: I'm trying to generate Erdős–Rényi graphs using the Boost Graph Library. In the code below, which is taken from the Boost 1.72 documentation, the networks always have the same number of edges (they should not, for particular p values). I have tried using different random seeds, to no avail. Thanks for any help.

    #include <boost/graph/adjacency_list.hpp>
    #include <boost/graph/erdos_renyi_generator.hpp>
    #include <boost/random/linear_congruential.hpp>
    #include <iostream>

    using namespace std;
    typedef
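
The snippet above is C++; purely as a language-agnostic illustration of the expected behaviour (networkx here is an assumed stand-in, not part of the question's Boost code), Erdős–Rényi generators given different seeds should produce graphs with varying edge counts:

    import networkx as nx

    # Sketch: G(n, p) sampled with different seeds should differ in edge count.
    for seed in (1, 2, 3):
        g = nx.gnp_random_graph(100, 0.05, seed=seed)
        print(seed, g.number_of_edges())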

Does setting the seed in tf.random.set_seed also set the seed used by the glorot_uniform kernel_initializer when using a conv2D layer in keras?

点点圈 Submitted on 2020-05-15 02:58:45
Question: I'm currently training a convolutional neural network using a Conv2D layer defined like this:

    conv1 = tf.keras.layers.Conv2D(filters=64, kernel_size=(3,3), padding='SAME', activation='relu')(inputs)

My understanding is that the default kernel_initializer is glorot_uniform, which has a default seed of None:

    tf.keras.layers.Conv2D(
        filters, kernel_size, strides=(1, 1), padding='valid', data_format=None,
        dilation_rate=(1, 1), activation=None, use_bias=True,
        kernel_initializer='glorot_uniform',
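
One way to remove any doubt, sketched below under the assumption of a TF 2.x setup (the input shape is hypothetical), is to give the initializer its own explicit seed in addition to the global one, rather than relying on tf.random.set_seed reaching glorot_uniform:

    import tensorflow as tf

    tf.random.set_seed(42)                              # global seed
    init = tf.keras.initializers.GlorotUniform(seed=7)  # explicit initializer seed

    inputs = tf.keras.Input(shape=(32, 32, 3))          # hypothetical input shape
    conv1 = tf.keras.layers.Conv2D(
        filters=64, kernel_size=(3, 3), padding='same',
        activation='relu', kernel_initializer=init)(inputs)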

TensorFlow: How to apply the same image distortion to multiple images

只愿长相守 Submitted on 2020-05-11 05:40:36
Question: Starting from the TensorFlow CNN example, I'm trying to modify the model to take multiple images as input (so that the input has not just 3 channels, but a multiple of 3, obtained by stacking images). To augment the input, I use the random image operations provided in TensorFlow, such as flipping, contrast, and brightness. My current solution for applying the same random distortion to all input images is to use a fixed seed value for these operations:

    def distort_image(image):
        flipped_image = tf
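
For comparison, here is a short sketch of an alternative, not the poster's approach, and it assumes a newer TF 2.x where the stateless image ops are available: the tf.image.stateless_random_* functions are pure functions of their seed, so feeding the same seed to every image guarantees the identical distortion.

    import tensorflow as tf

    def distort_same(images, seed=(1, 2)):
        # Same seed -> same flip decision and same brightness delta for each image.
        return [
            tf.image.stateless_random_brightness(
                tf.image.stateless_random_flip_left_right(img, seed=seed),
                max_delta=0.3, seed=seed)
            for img in images
        ]

    imgs = [tf.zeros([64, 64, 3]), tf.ones([64, 64, 3])]
    distorted = distort_same(imgs)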

How to generate uniformly distributed random numbers between 0 and 1 in a C code using OpenMP?

坚强是说给别人听的谎言 Submitted on 2020-04-16 05:47:20
Question: I am trying to write OpenMP code in which each thread works on large arrays of uniformly distributed random numbers between 0 and 1. Each thread needs its own different, independent random number sequence. In addition, the sequences need to be different every time the code is called. This is what I am using right now. Does this always guarantee that each thread has its own, different random number sequence? Will the sequences be different every time the code is
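
The question is about C with OpenMP; purely as an analogue of the same idea (this is a Python/numpy sketch, not the poster's code), one documented pattern is to draw a fresh root seed from OS entropy on every run and spawn an independent child stream for each worker:

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def fill_uniform(child_seed, n):
        rng = np.random.default_rng(child_seed)  # independent stream per worker
        return rng.random(n)                     # uniform numbers in [0, 1)

    if __name__ == "__main__":                   # guard needed for process pools on spawn platforms
        root = np.random.SeedSequence()          # fresh OS entropy on every run
        children = root.spawn(4)                 # statistically independent child streams
        with ProcessPoolExecutor(max_workers=4) as ex:
            arrays = list(ex.map(fill_uniform, children, [100000] * 4))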