snowflake-cloud-data-platform

Identify missing hours - find the gaps in time

Submitted by 强颜欢笑 on 2021-02-08 10:12:27
Question: I have a table with hours, but there are gaps. I need to find which hours are missing.

    select datehour from stored_hours order by 1;

The gaps in this timeline are easy to find:

    select lag(datehour) over (order by datehour) since,
           datehour until,
           timestampdiff(hour, lag(datehour) over (order by datehour), datehour) - 1 missing
    from stored_hours
    qualify missing > 0;

How can I create a list of the missing hours during these days (with Snowflake and SQL)?

Answer 1: To create a list/table of the
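The answer above is cut off; it presumably continues by generating a complete hour range and anti-joining it against the stored hours. A minimal sketch of that approach, using the question's table and column names (the generator row count is an assumption and must be large enough to span the range):

```sql
-- Generate every hour from the earliest stored hour onward,
-- then keep only those not present in stored_hours.
with bounds as (
    select min(datehour) as start_hour, max(datehour) as end_hour
    from stored_hours
),
all_hours as (
    select dateadd(hour, seq4(), (select start_hour from bounds)) as datehour
    from table(generator(rowcount => 10000))  -- assumption: 10000 rows cover the span
)
select a.datehour as missing_hour
from all_hours a
left join stored_hours s
  on a.datehour = s.datehour
where a.datehour <= (select end_hour from bounds)
  and s.datehour is null
order by 1;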

how can I add datetime stamp to zip file when unload data from snowflake to s3?

Submitted by 一世执手 on 2021-02-04 08:25:52
Question: I want to be able to add a timestamp to the filename I'm writing to S3. So far I've been able to write files to AWS S3 using the example below. Can someone guide me on how to put a datetime stamp in the file name?

    copy into @s3bucket/something.csv.gz
    from (select * from mytable)
    file_format = (type=csv FIELD_OPTIONALLY_ENCLOSED_BY = '"' compression='gzip')
    single=true header=TRUE;

Thanks in advance.

Answer 1: The syntax for defining a path inside of a stage or location portion of the
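The answer is truncated; one common workaround (an assumption on my part, not necessarily what the original answer proposed) is to build the COPY statement as a string so the current timestamp can be concatenated into the stage path, since the location itself cannot contain an expression:

```sql
-- Capture a timestamp once, then run the COPY dynamically with it in the path.
set file_ts = (select to_varchar(current_timestamp(), 'YYYYMMDD_HH24MISS'));

execute immediate
    'copy into @s3bucket/something_' || $file_ts || '.csv.gz
     from (select * from mytable)
     file_format = (type=csv field_optionally_enclosed_by=''"'' compression=''gzip'')
     single=true header=true';
```

The same string-building approach works from a stored procedure or from client-side code that issues the statement.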

is there an alternative to query with DELETE snowflake SQL statement with CTE?

Submitted by 試著忘記壹切 on 2021-01-29 19:42:49
Question: On Snowflake, is there an alternative to a DELETE SQL statement with a CTE? It seems it is not possible.

    with t as (
        select * from "SNOWFLAKE_SAMPLE_DATA"."TPCDS_SF100TCL"."CALL_CENTER"
    ), p as (
        select t.CC_REC_END_DATE, t.CC_CALL_CENTER_ID, t.CC_REC_START_DATE
        from t
        where 1=1 and t.CC_REC_START_DATE > '2000-01-01'
    )
    delete from p

For example: if we use a SELECT, we get some results, but if I use a DELETE, it shows a syntax error.

Answer 1: The problem with this thinking is that SELECT returns
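The answer is cut off, but the underlying point is that DELETE must target a real table, not a CTE. A sketch of the usual rewrite, assuming a writable copy of the table named my_call_center (hypothetical, since the sample database itself is read-only):

```sql
-- The CTE's filter moves into a USING subquery (or simply into the
-- WHERE clause when no join against other rows is needed).
delete from my_call_center
using (
    select cc_call_center_id
    from my_call_center
    where cc_rec_start_date > '2000-01-01'
) p
where my_call_center.cc_call_center_id = p.cc_call_center_id;
```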

Parse JSON Multiple values into Rows

Submitted by 孤人 on 2021-01-29 15:31:34
Question: Requirement: parse JSON values into rows.

JSON:

    {
      "uid": "2EDA9DC1D4",
      "m_lg_loc": "ml_0_49_2965_12990434_1450,ml_0_49_2965_12991888_1450,ml_0_49_2965_12997254_682,ml_0_49_2965_12997940_453",
      "codec": "PMMMU,G726-32,PMMMA,A729a,tel",
      "trv_dev": "1,10,2",
      "geoipp": {
        "area_code": 703,
        "location": [ -77.2223, 38.94990000014 ]
      }
    }

Expected output: the multiple m_lg_loc values as rows:

    ml_0_49_2965_12990434_1450
    ml_0_49_2965_12991888_1450
    ml_0_49_2965_12997254_682
    ml_0_49_2965_12997940_453
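No answer survives in this excerpt. In Snowflake this is typically done by splitting the comma-separated string into an array and flattening it. A sketch, assuming the JSON sits in a VARIANT column named raw in a table my_json (both names hypothetical):

```sql
-- SPLIT turns the comma-separated string into an ARRAY;
-- LATERAL FLATTEN emits one row per array element.
select f.value::string as m_lg_loc
from my_json,
     lateral flatten(input => split(raw:m_lg_loc::string, ',')) f;
```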

Can you have permanent IP address with AWS Glue so that it can be whitelisted in Snowflake?

Submitted by 你离开我真会死。 on 2021-01-29 10:00:29
Question: The scenario is this: our Snowflake will only be accessible from whitelisted IP addresses. If we plan to use AWS Glue, what IP address can we use so that it can connect to Snowflake? I need a way to identify this AWS Glue job by IP address (endpoint) so that it can be identified in Snowflake. I want to use AWS Glue because it is a serverless orchestration tool. Thanks, D.

Answer 1: AWS has published the IP ranges of several services and regions, but Glue is currently not listed

Flatten nested JSON in snowflake

Submitted by 别来无恙 on 2021-01-29 07:39:10
Question: This is an example of a JSON (it can have more, or fewer, types and/or values). I want to end up with (order not important):

    Countries, IC
    Countries, ES
    Countries, SE
    Countries, GB
    Countries, US
    Categories, film-chat

JSON:

    {
      "list": [
        {
          "element": {
            "comparison": "anyOf",
            "logical": "and",
            "type": "Countries",
            "value": {
              "list": [
                { "element": "IC" },
                { "element": "ES" },
                { "element": "SE" },
                { "element": "GB" },
                { "element": "US" }
              ]
            }
          }
        },
        {
          "element": {
            "comparison": "anyOf",
            "logical": "and",
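No answer is captured here; the nested lists call for two chained FLATTENs, one over the outer list (to get each type) and one over each element's inner value.list (to get the values). A sketch, assuming the document sits in a VARIANT column raw of a table my_json (both names hypothetical):

```sql
-- Outer flatten yields one row per typed element; the inner flatten
-- then pairs that element's type with each value in its value.list.
select o.value:element:type::string as type,
       i.value:element::string     as value
from my_json,
     lateral flatten(input => raw:list) o,
     lateral flatten(input => o.value:element:value:list) i;
```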

How to call snowsql client from python

Submitted by 房东的猫 on 2021-01-29 06:05:20
Question: I'm calling the snowsql client from a shell script. I import a properties file with source, then invoke the snowsql client. How can I do the same in Python? Any help would be highly appreciated. Shell script that calls the snowsql client:

    source /opt/data/airflow/config/cyrus_de/snowflake_config.txt
    sudo /home/user/snowsql -c $connection --config=/home/user/.snowsql/config \
        -w $warehouse \
        --variable database_name=$dbname \
        --variable stage=$stagename \
        --variable env=$env \
        -o exit_on_error=true -o variable
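No answer is captured above. One straightforward approach (a sketch, not necessarily what the original answer proposed; the paths and variable names are taken from the question's shell script) is to parse the key=value config file in Python and pass the values to snowsql via subprocess:

```python
import shlex
import subprocess


def load_config(path):
    """Parse a shell-style key=value properties file into a dict."""
    config = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip optional shell quoting from the value, as `source` would.
            config[key.strip()] = shlex.split(value)[0] if value.strip() else ""
    return config


def build_snowsql_command(cfg):
    """Assemble the snowsql argument list from the parsed config."""
    return [
        "/home/user/snowsql",
        "-c", cfg["connection"],
        "--config=/home/user/.snowsql/config",
        "-w", cfg["warehouse"],
        "--variable", "database_name=" + cfg["dbname"],
        "--variable", "stage=" + cfg["stagename"],
        "--variable", "env=" + cfg["env"],
        "-o", "exit_on_error=true",
    ]


# Usage (requires snowsql installed and the config file present):
#     cfg = load_config("/opt/data/airflow/config/cyrus_de/snowflake_config.txt")
#     subprocess.run(build_snowsql_command(cfg), check=True)
```

An alternative worth considering is skipping snowsql entirely and using the snowflake-connector-python package, which avoids shelling out altogether.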

Create JSON from a subquery in snowflake

Submitted by 怎甘沉沦 on 2021-01-29 02:29:57
Question: I want to create a JSON string from a list of values, but I've never worked with JSON before. Please see the image below for my two tables and what I want to create on the right. I tried this, but it doesn't work (excuse my naivety, but I thought this would be the logical implementation of it):

    select a.property_key,
           to_JSON(select application_ID
                   from tableB
                   where a.property_key = b.property_key) as application_list
    from tableA a

I would appreciate the help. I tried googling, but I find the
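The answer is not captured here; the usual pattern (a sketch using the question's table and column names) is to aggregate the matching IDs into an array and serialize that, since TO_JSON takes a value rather than a subquery:

```sql
-- Collect each property's application IDs into an ARRAY, then render
-- the array as a JSON string.
select a.property_key,
       to_json(array_agg(b.application_id)) as application_list
from tableA a
join tableB b
  on a.property_key = b.property_key
group by a.property_key;
```

Swapping the join for a LEFT JOIN would also keep property keys that have no applications, with an empty list.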

List all columns wtih datatype in specific table in Snowflake

Submitted by 旧街凉风 on 2021-01-28 14:09:20
Question: I am looking for a programmatic way to get the schema of a Snowflake table. Is there a way to do that?

Answer 1: You can use a SHOW command to list table columns, e.g.:

    show columns in table MYSCHEMA.MYTABLE;

Note that SHOW commands have the advantage that they do not require a running warehouse to execute, so they are zero-cost queries.

Answer 2: Use this query:

    select ordinal_position as position,
           column_name,
           data_type,
           case when character_maximum_length is not null
                then character_maximum_length
                else numeric
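The second answer is truncated mid-CASE; it is presumably falling back to the numeric precision for non-character columns. A self-contained variant of the same idea against INFORMATION_SCHEMA.COLUMNS (schema and table names are placeholders):

```sql
-- COALESCE stands in for the truncated CASE: character length for
-- string columns, numeric precision otherwise.
select ordinal_position as position,
       column_name,
       data_type,
       coalesce(character_maximum_length, numeric_precision) as max_length_or_precision
from information_schema.columns
where table_schema = 'MYSCHEMA'
  and table_name   = 'MYTABLE'
order by ordinal_position;
```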