amazon-ec2

How to Copy files from one EBS to Another EBS

99封情书 submitted on 2021-02-06 11:01:44
Question: The problem is simple. I need to copy files from one EBS volume to another without passing them through my local machine. Is this possible? If so, how? Answer 1: In order to copy files from one EBS volume to another, both volumes will (at some point) need to be attached to an instance, though not necessarily the same instance. There are many ways to do this if you allow for multiple instances and/or temporarily storing the files on a third storage option, but without constraints,
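A minimal sketch of the single-instance variant of this approach, assuming both volumes are in the same Availability Zone as the instance and already contain filesystems; the volume IDs, instance ID, device names, and mount points below are placeholders, not values from the question:

    # Attach both EBS volumes to one running instance (IDs and devices are hypothetical)
    aws ec2 attach-volume --volume-id vol-0aaa111 --instance-id i-0123456789abcdef0 --device /dev/sdf
    aws ec2 attach-volume --volume-id vol-0bbb222 --instance-id i-0123456789abcdef0 --device /dev/sdg

    # On the instance: mount both volumes (many Linux AMIs expose /dev/sdf as /dev/xvdf)
    sudo mkdir -p /mnt/source /mnt/dest
    sudo mount /dev/xvdf /mnt/source
    sudo mount /dev/xvdg /mnt/dest

    # Copy the files volume-to-volume, never touching your local machine
    sudo rsync -a /mnt/source/ /mnt/dest/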

Minimal IAM policy for ec2:RunInstances

痴心易碎 submitted on 2021-02-06 09:54:08
Question: I'm trying to narrow down the minimal policy needed to run a predefined machine image. The image is based on two snapshots, and I only want "m1.medium" instance types to be launched. Based on that, and with the help of this page and this article, I worked out the following policy: { "Version": "2012-10-17", "Statement": [ { "Sid": "Stmt1385026304010", "Effect": "Allow", "Action": [ "ec2:RunInstances" ], "Condition": { "StringEquals": { "ec2:InstanceType": "m1.medium" } }, "Resource": [ "arn:aws:ec2:us
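For illustration only, a hedged sketch of how such a policy is commonly completed: the region, account ID, AMI, snapshot IDs, and policy name are placeholders, not values from the question. The sketch uses two statements because the ec2:InstanceType condition key applies to the instance resource, while RunInstances also needs access to supporting resources that do not carry that key:

    # Write the policy document (all ARNs below are hypothetical placeholders)
    cat > run-instances-policy.json <<'EOF'
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "LaunchOnlyMediumInstances",
          "Effect": "Allow",
          "Action": "ec2:RunInstances",
          "Resource": "arn:aws:ec2:us-east-1:123456789012:instance/*",
          "Condition": { "StringEquals": { "ec2:InstanceType": "m1.medium" } }
        },
        {
          "Sid": "AllowSupportingResources",
          "Effect": "Allow",
          "Action": "ec2:RunInstances",
          "Resource": [
            "arn:aws:ec2:us-east-1::image/ami-11111111",
            "arn:aws:ec2:us-east-1::snapshot/snap-22222222",
            "arn:aws:ec2:us-east-1::snapshot/snap-33333333",
            "arn:aws:ec2:us-east-1:123456789012:volume/*",
            "arn:aws:ec2:us-east-1:123456789012:network-interface/*",
            "arn:aws:ec2:us-east-1:123456789012:security-group/*",
            "arn:aws:ec2:us-east-1:123456789012:subnet/*",
            "arn:aws:ec2:us-east-1:123456789012:key-pair/*"
          ]
        }
      ]
    }
    EOF

    # Register the policy (name is a placeholder)
    aws iam create-policy --policy-name RunMediumInstances --policy-document file://run-instances-policy.json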

Unable to connect to AWS Documentdb using MongoDB Compass. No option to pass sslInvalidHostName

蹲街弑〆低调 submitted on 2021-02-06 08:58:11
Question: AWS DocumentDB is a relatively new service we're trying to migrate to. To connect from outside of the VPC, you have to create a tunnel to an existing instance. For example:

    ssh -i "ec2Access.pem" -L 27017:sample-cluster.cluster-cu52jq5kfddg.us-east-1.docdb.amazonaws.com:27017 ubuntu@ec2-34-229-221-164.compute-1.amazonaws.com -N

And then you can connect from the mongo shell with:

    mongo --sslAllowInvalidHostnames --ssl --sslCAFile rds-combined-ca-bundle.pem --username <yourUsername> --password
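A hedged sketch of the tunnel-based workaround for Compass, reusing the endpoint and hostnames from the question as placeholders; the connection-string options are an assumption based on standard MongoDB URI parameters, not something stated in the question:

    # Forward a local port to the DocumentDB cluster through the EC2 bastion (placeholders throughout)
    ssh -i "ec2Access.pem" -N \
        -L 27017:sample-cluster.cluster-cu52jq5kfddg.us-east-1.docdb.amazonaws.com:27017 \
        ubuntu@ec2-34-229-221-164.compute-1.amazonaws.com

    # In Compass, connect to the tunnel with a URI that tolerates the hostname mismatch,
    # e.g. (standard MongoDB URI options, assumed to be honored by recent Compass versions):
    # mongodb://<yourUsername>:<yourPassword>@localhost:27017/?tls=true&tlsCAFile=rds-combined-ca-bundle.pem&tlsAllowInvalidHostnames=true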

AWS SSL on EC2 instance without Load Balancer - NodeJS

孤者浪人 submitted on 2021-02-06 08:39:39
Question: Is it possible to have an EC2 instance running, listening on port 443, without a load balancer? I'm trying right now in my Node.js app, but it doesn't work when I call the page using https://. However, if I set it to port 80, everything works fine with http://. I had it working earlier with a load balancer and Route 53, but I don't want to pay $18/mo for an ELB anymore, especially when I only have one server running. Thanks for the help. Answer 1: You're right, if it's only the one instance and
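A hedged sketch of how a single instance can serve HTTPS directly, assuming a certificate from Let's Encrypt and a Node.js process that cannot bind privileged ports by default; the domain and alternate port are placeholders:

    # Obtain a free certificate on the instance (assumes the domain already resolves to its public IP)
    sudo certbot certonly --standalone -d example.com

    # Option A: allow the node binary to bind privileged ports such as 443 directly
    sudo setcap 'cap_net_bind_service=+ep' "$(readlink -f "$(which node)")"

    # Option B: keep the app on an unprivileged port (e.g. 8443) and redirect inbound 443 to it
    sudo iptables -t nat -A PREROUTING -p tcp --dport 443 -j REDIRECT --to-port 8443

    # Either way, the instance's security group must allow inbound TCP 443.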

How to increase vm.max_map_count?

两盒软妹~` submitted on 2021-02-05 20:21:21
Question: I'm trying to run Elasticsearch on an Ubuntu EC2 machine (t2.medium), but I'm getting the message: max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]. How can I increase vm.max_map_count? Answer 1: To make it persistent, you can add the line vm.max_map_count=262144 to your /etc/sysctl.conf and run $ sudo sysctl -p to reload the configuration with the new value. Answer 2: I use # sysctl -w vm.max_map_count=262144 And for the persistent configuration # echo "vm.max
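Combining the two answers into one sequence (standard sysctl usage, nothing specific to this machine):

    # Raise the limit immediately for the running kernel
    sudo sysctl -w vm.max_map_count=262144

    # Persist it across reboots, then reload the configuration
    echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf
    sudo sysctl -p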

SparkContext Error - File not found /tmp/spark-events does not exist

狂风中的少年 submitted on 2021-02-05 20:19:06
Question: Running a Python Spark application via an API call. On submitting the application, the response is: Failed. SSH into the worker: my Python application exists at /root/spark/work/driver-id/wordcount.py and the error can be found in /root/spark/work/driver-id/stderr, which shows the following:

    Traceback (most recent call last):
      File "/root/wordcount.py", line 34, in <module>
        main()
      File "/root/wordcount.py", line 18, in main
        sc = SparkContext(conf=conf)
      File "/root/spark/python/lib/pyspark.zip/pyspark/context.py
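Per the entry title, the truncated trace ends with /tmp/spark-events not existing, which is the default spark.eventLog.dir when event logging is enabled; a hedged sketch of the usual fix, assuming that is indeed the failing step (the alternative directory below is a placeholder):

    # Create the default event-log directory on the machine that runs the driver
    mkdir -p /tmp/spark-events

    # Or point Spark at a directory that does exist, e.g. in conf/spark-defaults.conf:
    # spark.eventLog.enabled          true
    # spark.eventLog.dir              file:///var/log/spark-events
    # spark.history.fs.logDirectory   file:///var/log/spark-events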
