How to connect to a cluster in Amazon Redshift using SQLAlchemy?

Asked by 既然无缘 on 2021-01-12 06:17

In Amazon Redshift's Getting Started Guide, it's mentioned that you can use SQL client tools that are compatible with PostgreSQL to connect to your Amazon Redshift cluster. How do I connect to the cluster using SQLAlchemy?

5 Answers
  • 2021-01-12 06:22

    I don't think SQLAlchemy "natively" knows about Redshift. You need to change the JDBC "URL" string to use postgres:

    jdbc:postgres://shippy.cx6x1vnxlk55.us-west-2.redshift.amazonaws.com:5439/shippy
    

    Alternatively, you may want to try sqlalchemy-redshift, following the instructions they provide.
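    A minimal sketch of the connection URL that sqlalchemy-redshift expects (the host, database, and credentials below are placeholders; the package registers the `redshift+psycopg2` dialect, so `sqlalchemy-redshift` and `psycopg2` are assumed to be installed before `create_engine` is called):

```python
# Placeholder cluster details -- substitute your own values.
user = "myuser"
password = "mypassword"
host = "examplecluster.abc123xyz789.us-west-2.redshift.amazonaws.com"
port = 5439
database = "dev"

# sqlalchemy-redshift registers the "redshift+psycopg2" dialect, so this
# URL differs from a plain PostgreSQL URL only in its scheme.
url = f"redshift+psycopg2://{user}:{password}@{host}:{port}/{database}"
print(url)
# The URL is then passed to sqlalchemy.create_engine(url).
```

    With those packages installed, `sqlalchemy.create_engine(url).connect()` behaves like a regular PostgreSQL connection.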

  • 2021-01-12 06:25

    The following works for me from Databricks with all kinds of SQL statements:

      import sqlalchemy as sa
      import psycopg2  # driver loaded by the redshift+psycopg2 dialect

      host = 'your_host_url'
      username = 'your_user'
      password = 'your_passw'
      database = 'your_db'  # was undefined in the original snippet
      port = 5439

      url = "{d}+{driver}://{u}:{p}@{h}:{port}/{db}".format(
                d="redshift",
                driver='psycopg2',
                u=username,
                p=password,
                h=host,
                port=port,
                db=database)

      engine = sa.create_engine(url)
      cnn = engine.connect()

      strSQL = "your_SQL ..."
      cnn.execute(strSQL)
    
  • 2021-01-12 06:28
    import sqlalchemy as db
    engine = db.create_engine('postgres://username:password@url:5439/db_name')
    

    This worked for me
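    One caveat with URLs like this (not from the answer above, just a common pitfall): if the password contains URL-reserved characters such as @, /, or :, it must be percent-escaped before being embedded in the URL, or the URL parser will split it in the wrong place. A stdlib sketch with a made-up password:

```python
from urllib.parse import quote_plus

# Hypothetical password containing URL-reserved characters.
password = "p@ss/w:rd"
safe = quote_plus(password)
print(safe)  # -> p%40ss%2Fw%3Ard

# The escaped form can then be embedded in the connection URL.
url = f"postgres://username:{safe}@url:5439/db_name"
```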

  • 2021-01-12 06:39

    I was running into the exact same issue, and then I remembered to include my Redshift credentials:

    eng = create_engine('postgres://[LOGIN]:[PWORD]@shippy.cx6x1vnxlk55.us-west-2.redshift.amazonaws.com:5439/shippy')
    
  • 2021-01-12 06:41

    sqlalchemy-redshift works for me, but only after a few days of researching packages (Python 3.4):

    SQLAlchemy==1.0.14
    sqlalchemy-redshift==0.5.0
    psycopg2==2.6.2

    First of all, I checked that my query was working in SQL Workbench (http://www.sql-workbench.net), then I got it to work in SQLAlchemy (this answer, https://stackoverflow.com/a/33438115/2837890, helped me see that autocommit or session.commit() is required):

    from sqlalchemy import create_engine, text

    # config is assumed to be loaded elsewhere (e.g. from a settings file).
    db_credentials = (
        'redshift+psycopg2://{p[redshift_user]}:{p[redshift_password]}@{p[redshift_host]}:{p[redshift_port]}/{p[redshift_database]}'
        .format(p=config['Amazon_Redshift_parameters']))
    engine = create_engine(db_credentials, connect_args={'sslmode': 'prefer'})
    connection = engine.connect()
    result = connection.execute(text(
        "COPY assets FROM 's3://xx/xx/hello.csv' WITH CREDENTIALS "
        "'aws_access_key_id=xxx_id;aws_secret_access_key=xxx'"
        " FORMAT csv DELIMITER ',' IGNOREHEADER 1 ENCODING UTF8;").execution_options(autocommit=True))
    result = connection.execute("select * from assets;")
    print(result, type(result))
    print(result.rowcount)
    connection.close()
    

    After that, I got sqlalchemy_redshift's CopyCommand to work, perhaps in a bad way; it looks a little tricky:

    import sqlalchemy as sa
    from sqlalchemy_redshift import dialect as dialect_rs

    tbl = sa.Table('assets', sa.MetaData())
    copy = dialect_rs.CopyCommand(
        tbl,
        data_location='s3://xx/xx/hello.csv',
        access_key_id=access_key_id,
        secret_access_key=secret_access_key,
        truncate_columns=True,
        delimiter=',',
        format='CSV',
        ignore_header=1,
        # empty_as_null=True,
        # blanks_as_null=True,
    )

    print(str(copy.compile(dialect=dialect_rs.RedshiftDialect(),
                           compile_kwargs={'literal_binds': True})))
    print(dir(copy))  # inspect the command's attributes
    connection = engine.connect()
    connection.execute(copy.execution_options(autocommit=True))
    connection.close()
    

    This does just what I did above with plain SQLAlchemy, executing the query, except that the query is composed by CopyCommand. I haven't seen much benefit from it :(.
