Question
I'm trying to connect from a Datalab notebook to a PostgreSQL database hosted on Google Cloud SQL. I have tried both the direct-IP and the instance-connection approach, but both raise an exception.
Direct-connection URI:
"{engine}://{user}:{password}@{host}:{port}/{database}"
Instance-connection URI (Unix socket via the instance connection name):
"{engine}://{user}:{password}@/{database}?host=/cloudsql/{instance_connection_name}"
Both give this exception:
OperationalError: (psycopg2.OperationalError) could not connect to
server: Connection timed out
Is the server running on host "***.***.***.***" and accepting
TCP/IP connections on port ****?
Any idea whether this needs the Cloud SQL Proxy, as in the Colab proxy connection? And if it is needed, how do I set it up with the Datalab libraries?
Answer 1:
I finally got it working.
Since the Datalab VM is already authenticated with gcloud, I used cloud_sql_proxy without the Python auth commands that appear in the Colab proxy connection example, and fixed the error that still appeared by creating the missing directory. So I ended up with this:
!wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
!mkdir -p /cloudsql
!chmod +x cloud_sql_proxy
Then, in a separate cell, start the proxy:
!./cloud_sql_proxy --instances=project-id:europe-west1:posty --dir /cloudsql
As with the Colab solution, you need to leave this notebook running in a separate window to keep the proxy alive. Other notebooks on the same machine can then access the database.
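With the proxy running, a connection from another notebook might look like this (a sketch assuming SQLAlchemy and psycopg2, with a hypothetical user, password and database, and the instance connection name from the proxy command above):
# Sketch: connect through the Unix socket that cloud_sql_proxy creates under /cloudsql.
# SQLAlchemy + psycopg2 assumed; the credentials and database name are hypothetical.
from sqlalchemy import create_engine, text

engine = create_engine(
    "postgresql+psycopg2://postgres:secret@/postgres"
    "?host=/cloudsql/project-id:europe-west1:posty")

with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())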
Note: a better solution would probably be to edit the Docker image of the Datalab machines to include this behaviour, as noted here.
Answer 2:
It may be that your VM's IP isn't whitelisted on the database.
You can view the authorized networks list and add new IPs in the Google Cloud console under SQL > your_database > Authorization.
Check this link for details: https://cloud.google.com/sql/docs/mysql/connect-external-app?hl=en_US&_ga=2.178999533.-851571953.1521816449#appaccessIP
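If you go the whitelisting route, the same change can also be made from a notebook cell with gcloud. A sketch, where the instance name and CIDR are hypothetical placeholders; note that --authorized-networks replaces the existing list, so include any ranges you want to keep:
# Hypothetical example: authorize the VM's external IP on a Cloud SQL instance named "posty".
!gcloud sql instances patch posty --authorized-networks=203.0.113.7/32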
Source: https://stackoverflow.com/questions/50473308/how-to-connect-datalab-with-google-cloud-sql