amazon-sagemaker

TensorBoard without callbacks for Keras docker image in SageMaker

拥有回忆 submitted on 2020-05-15 21:19:15
Question: I'm trying to add TensorBoard functionality to this SageMaker example: https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/keras_bring_your_own/hpo_bring_your_own_keras_container.ipynb The issue is that SageMaker's Estimator.fit() does not seem to support Keras models compiled with callbacks. A GitHub issue describes what I need to do for TensorBoard functionality: "You need your code inside the container to save checkpoints to
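
Since Estimator.fit() only launches the training job, the usual place to hook in TensorBoard is the training code inside the container itself. Below is a minimal sketch, not the notebook's exact code; the log directory and training arguments are assumptions for illustration.

# A minimal sketch: inside the container's training script, attach a
# TensorBoard callback so Keras writes event files to a local directory.
# The path below is an assumption; point a local TensorBoard (or an S3
# sync job) at it to visualize training.
import tensorflow as tf

def train(model, x_train, y_train, log_dir="/opt/ml/output/tensorboard"):
    tb_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir)
    model.fit(x_train, y_train, epochs=10, callbacks=[tb_callback])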

Error installing RODBC or ODBC on a Sagemaker Jupyter NoteBook Instance

梦想的初衷 submitted on 2020-04-12 04:55:32
Question: I have been trying to establish a connection to Teradata from a SageMaker Jupyter Notebook instance. I was trying to do it the way I would through RStudio, but whenever I try to install the package in the instance I get a non-zero exit status error. I have tried installing the following ways: remotes::install_github() or devtools::install_github(), and also: install.packages('odbc', repo="https://cran.rstudio.com/") I tried the same with RODBC, and I get the same warnings or errors. Any

Is it possible to predict in SageMaker without using S3?

北城余情 submitted on 2020-02-04 01:41:48
Question: I have a .pkl which I would like to put into production. I would like to run a daily query against my SQL Server and make predictions on about 1000 rows. The documentation implies I have to load the daily data into S3. Is there a way around this? The data should fit in memory without a problem. The answer to "Is there some kind of persistent local storage in AWS SageMaker model training?" says that "The notebook instance is coming with a local EBS (5GB) that you can use to copy some data into it
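
For a batch this small, one option is a real-time endpoint: the rows can be sent directly in the request body, so the inference data never touches S3. A minimal sketch, assuming an endpoint named "my-endpoint" that accepts CSV input; the endpoint name and payload format are assumptions, not from the question.

# Send ~1000 rows pulled from SQL Server straight to an existing endpoint.
import boto3
import pandas as pd

runtime = boto3.client("sagemaker-runtime")

def predict(df: pd.DataFrame, endpoint_name: str = "my-endpoint") -> str:
    # Serialize the rows to CSV and put them directly in the request body --
    # no S3 upload is involved for inference.
    payload = df.to_csv(header=False, index=False)
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="text/csv",
        Body=payload,
    )
    return response["Body"].read().decode("utf-8")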

Is it possible to hit endpoints hosted in SageMaker besides /invocations?

こ雲淡風輕ζ submitted on 2020-01-25 09:38:05
Question: Using the AWS SageMaker CLI tools it's possible to invoke endpoints hosted in SageMaker using a command like: aws sagemaker-runtime invoke-endpoint --body file://container/local_test/payload.json --endpoint-name $(DEPLOYMENT_NAME)-staging --content-type application/json --accept application/json output.json By default, this command goes to the /invocations endpoint. Is it possible to go to a different endpoint? For example, if I implemented a health-report endpoint? It's
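
For context, the hosting runtime only forwards invoke-endpoint traffic to POST /invocations and health checks to GET /ping; any extra route added inside the container is not reachable through invoke-endpoint. A minimal sketch of a container-side serving app, assuming Flask as in the bring-your-own-container examples:

from flask import Flask, Response, request

app = Flask(__name__)

@app.route("/ping", methods=["GET"])
def ping():
    # SageMaker calls this to decide whether the container is healthy.
    return Response(status=200)

@app.route("/invocations", methods=["POST"])
def invocations():
    # Every invoke-endpoint request lands here; a "health-report" feature
    # would have to be dispatched from the request body inside this handler.
    payload = request.get_data()
    return Response(response=payload, status=200, mimetype="application/json")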

What is the SageMaker TensorFlow SavedModel export format?

烈酒焚心 submitted on 2020-01-15 09:35:11
Question: I'm new to AWS SageMaker TensorFlow 1.11.0 script mode. I've been looking around the documentation for a while and can't seem to find the folder structure the model should be exported in after training. All I know is that I'm supposed to export to the directory specified in the env variable "SM_MODEL_DIR" and that the format is SavedModel. I've used tf.saved_model.simple_save to export in the following folder structure: - model.tar.gz -- saved_model.pb -- variables --- variables.index --- variables.data
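
TensorFlow Serving generally expects the SavedModel under a numbered version directory rather than at the top level of the artifact, so a common pattern is to export into a versioned subfolder of SM_MODEL_DIR. A minimal TF 1.x sketch using the same tf.saved_model.simple_save call as the question; the "1" version folder and tensor names are assumptions, so check the serving container's docs for the exact layout it expects.

import os
import tensorflow as tf

def export_model(session, input_tensor, output_tensor):
    model_dir = os.environ.get("SM_MODEL_DIR", "/opt/ml/model")
    export_dir = os.path.join(model_dir, "1")  # "1" is the model version folder
    tf.saved_model.simple_save(
        session,
        export_dir,
        inputs={"inputs": input_tensor},
        outputs={"outputs": output_tensor},
    )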

How to run a pre-trained model in AWS SageMaker?

荒凉一梦 submitted on 2020-01-14 05:42:06
Question: I have a model.pkl file which is pre-trained, along with all the other files related to the ML model, and I want to deploy it on AWS SageMaker. But how do I deploy it to AWS SageMaker without training? The fit() method in AWS SageMaker runs the train command and pushes the model.tar.gz to an S3 location, and when the deploy method is used it uses that same S3 location to deploy the model. We don't manually create that location in S3; it is created by the AWS model and its name is given by using some
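
Since fit() only exists to produce a model.tar.gz in S3, one way around it is to package and upload the artifact yourself and point a Model object at it. A minimal sketch, assuming scikit-learn and the SageMaker Python SDK v2; the bucket, role, entry_point script, and framework_version are assumptions.

import tarfile
import boto3
from sagemaker.sklearn.model import SKLearnModel

# 1. Package the pre-trained model the same way fit() would have.
with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("model.pkl")

# 2. Upload the artifact to an S3 location of your choosing.
boto3.client("s3").upload_file("model.tar.gz", "my-bucket", "my-model/model.tar.gz")

# 3. Create a Model from that artifact and deploy it -- no training job involved.
model = SKLearnModel(
    model_data="s3://my-bucket/my-model/model.tar.gz",
    role="my-sagemaker-execution-role",  # assumption
    entry_point="inference.py",          # your model_fn / predict_fn script
    framework_version="0.23-1",          # assumption
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.large")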

How to run a Python file inside AWS SageMaker using a Dockerfile

本秂侑毒 submitted on 2020-01-14 05:33:07
Question: I have Python code and a pre-trained model, with a model.pkl file in the same directory as the code. Now I have to run or deploy this on AWS SageMaker, but I'm not finding any solution for this, as AWS SageMaker only supports the two commands train and serve, for training and deploying respectively. Currently I am running the program with the command "python filename.py" and it runs successfully; I want the same to run on AWS SageMaker. Any solution? I tried the
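
Because SageMaker starts a training job by running the image with the argument "train", one approach is to ship an executable named train (no .py extension) on the container's PATH that simply runs the existing script. A minimal sketch under that assumption; "filename.py" stands in for the actual script name and /opt/program is an assumed location inside the image.

#!/usr/bin/env python
# Wrapper executable named "train": run the existing script unchanged and
# propagate its exit code so SageMaker can tell whether training succeeded.
import subprocess
import sys

if __name__ == "__main__":
    result = subprocess.run([sys.executable, "/opt/program/filename.py"])
    sys.exit(result.returncode)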

Early Stopping and Callbacks with Keras when using SageMaker

主宰稳场 submitted on 2020-01-13 04:59:06
Question: I am using SageMaker to train a Keras model and I need to implement an early stopping approach when training the model. Is there a way to pass callbacks such as EarlyStopping, History, etc.? In the traditional way, we would pass these as a parameter to Keras's fit function: results = model.fit(train_x_trim, train_y_trim, validation_data=(test_x, test_y), epochs=FLAGS.epoch, verbose=0, callbacks=[tboard, checkpointer, early_stopping, history]) However, if using SageMaker, we need to call SageMaker's
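
In script mode the callbacks live inside the entry-point training script, not in the SageMaker Estimator's fit(), which only launches the job. A minimal sketch of the inside of such a script; the checkpoint path and hyperparameters are placeholders.

import tensorflow as tf

def train(model, train_x, train_y, test_x, test_y, epochs):
    # Early stopping and checkpointing are wired into Keras's own fit()
    # inside the training script that runs in the container.
    early_stopping = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True
    )
    checkpointer = tf.keras.callbacks.ModelCheckpoint(
        filepath="/opt/ml/checkpoints/model.h5", save_best_only=True
    )
    return model.fit(
        train_x, train_y,
        validation_data=(test_x, test_y),
        epochs=epochs,
        callbacks=[early_stopping, checkpointer],
    )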