dbt

Create JSON from a subquery in Snowflake

Submitted by 怎甘沉沦 on 2021-01-29 02:29:57
Question: I want to create a JSON string from a list of values, but I've never worked with JSON before. Please see the image below for my two tables, and what I want to create on the right. I tried this, but it doesn't work (excuse my naivety... but I thought this would be the logical implementation of it):

select a.property_key
     , to_json( select application_ID from tableB b where a.property_key = b.property_key ) as application_list
from tableA a

I would appreciate the help. I tried googling, but I find the
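A hedged sketch of one common Snowflake approach, reusing the table and column names from the question: a correlated scalar subquery cannot return a list, but ARRAY_AGG can collapse the matching application_IDs into an array per property_key, which TO_JSON then serializes:

select a.property_key
     , to_json(array_agg(b.application_ID)) as application_list  -- e.g. '[101,102]'
from tableA a
join tableB b
  on a.property_key = b.property_key
group by a.property_key;

If the desired output is a keyed object rather than a bare array, OBJECT_CONSTRUCT can wrap the aggregate, but without the question's image the exact target shape is an assumption.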

Running DBT within Airflow through the Docker Operator

Submitted by 混江龙づ霸主 on 2021-01-25 07:34:30
Question: Building on my question How to run DBT in Airflow without copying our repo, I am currently running Airflow and syncing the DAGs via git. I am considering different options for including DBT within my workflow. One suggestion by louis_guitton is to Dockerize the DBT project and run it in Airflow via the Docker Operator. I have no prior experience using the Docker Operator in Airflow, or with DBT generally. I am wondering if anyone has tried this or can provide some insights about their experience
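A minimal sketch of what such a DockerOperator task could look like, assuming Airflow 2 with the Docker provider installed and a self-built image (here called my-dbt-image) with the DBT project and profiles baked in; the image name, profiles path, and socket URL are assumptions, not details from the question:

from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="dbt_docker",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = DockerOperator(
        task_id="dbt_run",
        image="my-dbt-image:latest",              # hypothetical image with the DBT repo copied in
        command="dbt run --profiles-dir /dbt",    # profiles dir inside the image (assumption)
        docker_url="unix://var/run/docker.sock",  # talk to the host's Docker daemon
        auto_remove=True,                         # clean up the container after each run
        network_mode="bridge",
    )

One appeal of this pattern is that the DBT repo lives only in the image, so Airflow's git-synced DAG folder never needs a copy of it.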

How to run DBT in Airflow without copying our repo

Submitted by 耗尽温柔 on 2021-01-21 11:20:01
Question: We use DBT with GCP and BigQuery for transformations in BigQuery, and the simplest approach to scheduling our daily dbt run seems to be a BashOperator in Airflow. Currently we have two separate directories / GitHub projects, one for DBT and another for Airflow. To schedule DBT to run with Airflow, it seems like our entire DBT project would need to be nested inside our Airflow project so that we can point to it for our dbt run bash command. Is it possible to trigger our dbt run and dbt
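For reference, a hedged sketch of the BashOperator approach the question describes; /opt/dbt_project is a placeholder for wherever the DBT repo would be checked out on the Airflow workers:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # cd into the checked-out DBT repo before invoking dbt (path is a placeholder)
        bash_command="cd /opt/dbt_project && dbt run",
    )

The limitation the question is driving at follows directly: bash_command runs on the worker's filesystem, so the DBT project has to be present there, which is what forces the nesting or a separate sync step.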

How can I avoid `Permission denied` errors when mounting a container into my deployment?

Submitted by 我的未来我决定 on 2021-01-04 06:38:49
Question: Background: I am currently deploying Apache Airflow using Helm (using this chart). I am using a git-sync sidecar to mount the SQL and Python files that Airflow needs access to in order to execute scripts. What doesn't work: once I am done deploying my container, it seems that my Airflow user is unable to use the files that have been mounted by the git-sync sidecar, and it exits with an error (this error happens for all files that have been mounted, not only target): [Errno 13
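A hedged sketch of the usual fix for this class of error: set the pod's fsGroup so the volume git-sync writes into is group-readable by the Airflow user. The exact values.yaml key depends on the chart in use, and 65533 (git-sync's default user/group) is an assumption here:

# hypothetical values.yaml override; the key path varies between Airflow charts
airflow:
  securityContext:
    fsGroup: 65533   # match git-sync's group (assumption) so Airflow can read the synced files

Kubernetes applies fsGroup to the mounted volume, which is why this can resolve Errno 13 (Permission denied) without changing the git-sync image itself.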

A Ransomware Self-Rescue Guide

Submitted by 北城以北 on 2020-11-26 01:42:45
People often ask: I've been hit by ransomware, what should I do, can my files be decrypted?

The first time I encountered ransomware was several years ago. A customer's website was behaving abnormally, so I connected remotely to investigate. After logging into the server, I found that the extensions of all the script files and attachments in the site directory had been tampered with, and every folder contained a file that, when opened, displayed a ransom note, which is the telltale sign of ransomware. Out of professional habit, I archived some of the encrypted file samples and the ransom note for the record. Just today I uploaded those samples again, and they still cannot be decrypted.

As security engineers rather than professional malware analysts, we can lean on the capabilities of the major security vendors to look for decryption tools. This article puts together a ransomware self-rescue guide: look up the ransomware in a ransomware search engine, then decrypt with the free decryption tools the security companies provide. Of course, whether decryption succeeds is entirely down to luck, so in daily operations keep patching and keep backups.

Ransomware search engines

Enter the virus name, the ransom email address, or the extension of the encrypted files into a ransomware search engine, or directly upload an encrypted file or the ransom note, and you can quickly find the virus details and a decryption tool. Their decryption coverage is continuously updated; these are several ransomware tool sites worth bookmarking.

[360] Ransomware search engine, covering more than 800 common ransomware strains: http://lesuobingdu.360.cn
[Tencent] Ransomware search engine, covering more than 300 common ransomware strains: https://guanjia.qq.com/pr/ls/
[Venustech] VenusEye ransomware search engine, more than 300 ransomware families: https:

Oracle 12c 12.2.0 Silent Installation and Basic Usage

Submitted by 核能气质少年 on 2020-08-20 08:21:25
Contents:
- Installing Oracle 12c
  - Install CentOS 7.5
  - Edit the configuration files
    - Change the hostname: vi /etc/hostname
    - Configure the network: vi ifcfg-ens33
    - VMware notes
  - Configure the yum repository
  - Installation commands
  - Disable SELinux
  - Disable the firewall
  - Oracle-related configuration
    - Install the Oracle dependency packages
    - Tune the kernel parameters
    - Raise the user limits
    - Create the oracle account and groups
    - Create the database directories
    - Set the oracle environment variables
  - Download and unpack the installation archive
  - Copy the response-file template
  - Edit the database-creation configuration file
  - Install the database
  - Run the post-install scripts
  - Configure the listener silently
  - Create the new database silently
  - Check the Oracle process status
  - Log in to the database
- Creating and logging in as an ordinary Oracle 12c user
  - Log in to the database
  - Connect to the database
  - View the databases
  - View the current instance
  - View all containers (PDBs)
  - Switch the session to the desired database
  - View the current instance
  - Create a user
  - Configure the server listener
  - Restart the database
  - Enable database startup on boot

Installing Oracle 12c

Install CentOS 7.5. I used a virtual machine: at least 1 GB of memory (with 1.5 GB it is not as prone to sluggishness) and no less than 40 GB of disk, 50 GB or more recommended.

Edit the configuration files. Change the hostname: vi /etc/hostname (I set mine to orcl). Configure the network: vi
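For orientation, a hedged sketch of the three silent-mode invocations the outline refers to; every path below is a placeholder, and the response files are the edited copies of the templates shipped with the installer:

# run the installer non-interactively against an edited response file
./runInstaller -silent -responseFile /home/oracle/db_install.rsp

# configure the listener silently
netca -silent -responseFile $ORACLE_HOME/assistants/netca/netca.rsp

# create the new database silently
dbca -silent -createDatabase -responseFile /home/oracle/dbca.rsp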