google-cloud-bigtable

Connect to Cloud Bigtable from Google App Engine

南笙酒味 submitted on 2019-12-11 11:06:48
Question: It appears that I cannot create a connection from a Java class running on App Engine. I use the following library/dependency:

    <dependency>
      <groupId>com.google.cloud.bigtable</groupId>
      <artifactId>bigtable-hbase-1.1</artifactId>
      <version>0.1.9</version>
    </dependency>

and the following lines of code:

    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.*;

    Configuration conf = HBaseConfiguration.create();
    connection = ConnectionFactory.createConnection(conf);

It
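For reference, the bigtable-hbase artifact provides BigtableConfiguration, which fills in the Bigtable-specific HBase settings rather than leaving a bare HBaseConfiguration.create(); a minimal sketch, with placeholder project, instance, and table IDs:

    // Hedged sketch, not the asker's code: BigtableConfiguration assembles
    // the HBase Configuration (connection implementation, project, instance)
    // for you. All IDs below are placeholders.
    import com.google.cloud.bigtable.hbase.BigtableConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.Table;

    public class BigtableConnectExample {
        public static void main(String[] args) throws Exception {
            try (Connection connection =
                     BigtableConfiguration.connect("my-project-id", "my-instance-id")) {
                Table table = connection.getTable(TableName.valueOf("my-table"));
                System.out.println("Connected, table: " + table.getName());
            }
        }
    }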

Read And Modify on same transaction - Bigtable

爱⌒轻易说出口 submitted on 2019-12-11 07:34:20
Question: I am building a coupon system and I use Bigtable. My schema has two columns: Customer ID and Coupon code. I would like to query the table to check whether the customer already exists; if so, return the code, and if not, write the customer ID into the cell and then return the code. I saw there is an option to do this in Bigtable with the ReadModifyWriteRow operator or with CheckAndMutateRow, but I have not found any references.

Answer 1: Google has API documentation for Bigtable and Python available here.
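With the Java HBase client used elsewhere on this page, the check-and-set half of this maps to Table.checkAndPut, which the Bigtable adapter implements on top of CheckAndMutateRow. A hedged sketch in which the row key, family, and qualifier are placeholder assumptions: write the customer ID only if the cell is still empty, and report whether the claim succeeded:

    // Hedged sketch: atomically write the customer ID only if the cell is
    // still unset (passing value == null means "cell must not exist").
    // Table, family, and qualifier names are placeholders, not from the
    // question.
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class CouponCheckAndPut {
        static boolean claimCoupon(Connection connection, String couponCode,
                                   String customerId) throws Exception {
            Table table = connection.getTable(TableName.valueOf("coupons"));
            Put put = new Put(Bytes.toBytes(couponCode));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("customer_id"),
                          Bytes.toBytes(customerId));
            // Returns true only if cf:customer_id was empty and the put applied.
            return table.checkAndPut(Bytes.toBytes(couponCode),
                                     Bytes.toBytes("cf"),
                                     Bytes.toBytes("customer_id"),
                                     null, put);
        }
    }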

Can't connect to Bigtable from a Spring Boot application

情到浓时终转凉″ submitted on 2019-12-11 07:04:20
Question: I have a standalone application that works fine with Bigtable when creating a connection like this:

    Connection connection = BigtableConfiguration.connect(PROJECT_ID, INSTANCE_ID);

and using the following dependencies:

    <dependency>
      <groupId>com.google.apis</groupId>
      <artifactId>google-api-services-storage</artifactId>
      <version>v1-rev78-1.22.0</version>
    </dependency>
    <dependency>
      <groupId>com.google.apis</groupId>
      <artifactId>google-api-services-pubsub</artifactId>
      <version>v1-rev11-1.22.0<
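In a Spring Boot application, the same connect call is typically wrapped in a singleton bean so it runs once and is injected wherever needed; a minimal sketch, assuming placeholder project and instance IDs:

    // Hedged sketch of the usual Spring Boot wiring, assuming the standalone
    // connect call works: expose the Bigtable connection as a singleton bean
    // so the rest of the application injects it instead of reconnecting.
    import com.google.cloud.bigtable.hbase.BigtableConfiguration;
    import org.apache.hadoop.hbase.client.Connection;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class BigtableConfig {
        @Bean(destroyMethod = "close")  // close the connection on shutdown
        public Connection bigtableConnection() {
            return BigtableConfiguration.connect("my-project-id", "my-instance-id");
        }
    }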

Can't connect to Bigtable to scan HTable data due to hardcoded managed=true in hbase client jars

时光毁灭记忆、已成空白 submitted on 2019-12-11 06:37:43
Question: I'm working on a custom load function to load data from Bigtable using Pig on Dataproc. I compile my Java code using the following list of jar files grabbed from Dataproc. When I run the following Pig script, it fails when it tries to establish a connection with Bigtable. The error message is:

    Bigtable does not support managed connections.

Questions: Is there a workaround for this problem? Is this a known issue, and is there a plan to fix or adjust it? Is there a different way of implementing
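The error names a known limitation: the Bigtable HBase adapter rejects the "managed" connections that older HBase code paths request internally. A hedged sketch of the usual workaround, creating the connection explicitly through ConnectionFactory so the managed code path is never taken (project and instance IDs are placeholders, and the exact behavior depends on the client version in use):

    // Hedged sketch: build a Bigtable-aware Configuration and create the
    // connection yourself via ConnectionFactory, which uses the unmanaged
    // path. IDs below are placeholders.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import com.google.cloud.bigtable.hbase.BigtableConfiguration;

    public class ExplicitBigtableConnection {
        public static Connection open() throws Exception {
            // configure() sets hbase.client.connection.impl and the
            // google.bigtable.* keys on the returned Configuration.
            Configuration conf =
                BigtableConfiguration.configure("my-project-id", "my-instance-id");
            return ConnectionFactory.createConnection(conf);
        }
    }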

Google Cloud Bigtable Java Client - tcnative errors

时光毁灭记忆、已成空白 submitted on 2019-12-11 06:29:56
Question: I am trying to connect to Cloud Bigtable; however, I'm getting issues with netty-tcnative not being found. Maven dependencies:

    <dependency>
      <groupId>com.google.cloud.bigtable</groupId>
      <artifactId>bigtable-hbase-1.2</artifactId>
      <version>0.9.2</version>
    </dependency>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-tcnative-boringssl-static</artifactId>
      <version>1.1.33.Fork19</version>
    </dependency>

Error output:

    ERROR 2016-09-09 22:26:00,969 [main] com.google.cloud.bigtable.grpc
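When tcnative cannot be found, a common culprit is a netty-tcnative version that does not match the netty the Bigtable client brings in. A hedged diagnostic sketch using Netty's own OpenSsl probe to see whether, and why, the native library failed to load:

    // Hedged diagnostic sketch: Netty's OpenSsl class reports whether the
    // tcnative native library actually loaded, and the cause if it did not.
    import io.netty.handler.ssl.OpenSsl;

    public class TcnativeCheck {
        public static void main(String[] args) {
            System.out.println("tcnative available: " + OpenSsl.isAvailable());
            if (!OpenSsl.isAvailable()) {
                // unavailabilityCause() returns the Throwable explaining
                // the failed native-library load.
                OpenSsl.unavailabilityCause().printStackTrace();
            }
        }
    }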

Can Bloom Filters in BigTable be used to filter based only on row ID?

落花浮王杯 submitted on 2019-12-11 03:32:37
Question: BigTable uses Bloom filters to let point reads skip SSTables that contain no data for a given row-column pair. Can these Bloom filters also be used to skip SSTables when the query specifies only the row ID and no column ID? BigTable inserts row-column pairs as the keys of its Bloom filters, which means a query can consult the filters for a point read that specifies a full row-column pair. Now, suppose we have a query to get all columns of a row based only
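A toy illustration of the point, using Guava's BloomFilter rather than Bigtable's actual implementation: because each inserted key is a row-column pair, a membership probe needs both halves, so a row-only lookup cannot use the filter without enumerating columns.

    // Hedged illustration only (Guava, not Bigtable internals). Keys are
    // "row:column" strings, mimicking row-column pair keying.
    import com.google.common.hash.BloomFilter;
    import com.google.common.hash.Funnels;
    import java.nio.charset.StandardCharsets;

    public class RowColumnBloomDemo {
        public static void main(String[] args) {
            BloomFilter<String> filter = BloomFilter.create(
                Funnels.stringFunnel(StandardCharsets.UTF_8), 10_000, 0.01);

            filter.put("row42:name");   // keyed on the row-column pair
            filter.put("row42:email");

            // Point read with a full pair: the filter is usable.
            System.out.println(filter.mightContain("row42:name"));  // true

            // Row-only probe: "row42" was never inserted as a key, so the
            // filter cannot answer "does this SSTable hold any column of
            // row42?" without probing every possible column.
            System.out.println(filter.mightContain("row42"));  // almost surely false
        }
    }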

Error exporting data from Google Cloud Bigtable

对着背影说爱祢 submitted on 2019-12-11 00:23:26
Question: While going through the Google docs, I'm getting the stack trace below on the final export command (executed from the master instance with the appropriate env variables set):

    ${HADOOP_HOME}/bin/hadoop jar ${HADOOP_BIGTABLE_JAR} export-table -libjars ${HADOOP_BIGTABLE_JAR} <table-name> <gs://bucket>

    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/hadoop/hbase-install/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found

How to get filtered data from Bigtable using Python?

房东的猫 submitted on 2019-12-10 10:33:57
Question: I am using the Bigtable emulator and have successfully added a table to it; now I need to get filtered data. The table is as follows:

    arc_record_id | record_id | batch_id
    1             | 624       | 86
    2             | 625       | 86
    3             | 626       | 86

and so on, up to arc_record_id 10. I have tried the Python code below:

    visit_dt_filter = ValueRangeFilter(start_value = "1".encode('utf-8'),
                                       end_value = "2".encode('utf-8'))
    col1_filter = ColumnQualifierRegexFilter(b'arc_record_id')
    chain1 = RowFilterChain(filters=[col1_filter, visit_dt
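The question uses the Python client; for consistency with the other sketches on this page, here is the same chained-filter idea expressed with the Java HBase client (table, family, and filter values are placeholder assumptions, not from the question):

    // Hedged Java rendering of the same idea: restrict a scan to the
    // arc_record_id qualifier and to values in the range ["1", "2"].
    // MUST_PASS_ALL plays the role of the Python RowFilterChain.
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.BinaryComparator;
    import org.apache.hadoop.hbase.filter.CompareFilter;
    import org.apache.hadoop.hbase.filter.Filter;
    import org.apache.hadoop.hbase.filter.FilterList;
    import org.apache.hadoop.hbase.filter.QualifierFilter;
    import org.apache.hadoop.hbase.filter.ValueFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class FilteredScanExample {
        public static Scan buildScan() {
            Filter qualifier = new QualifierFilter(
                CompareFilter.CompareOp.EQUAL,
                new BinaryComparator(Bytes.toBytes("arc_record_id")));
            Filter lower = new ValueFilter(
                CompareFilter.CompareOp.GREATER_OR_EQUAL,
                new BinaryComparator(Bytes.toBytes("1")));
            Filter upper = new ValueFilter(
                CompareFilter.CompareOp.LESS_OR_EQUAL,
                new BinaryComparator(Bytes.toBytes("2")));
            return new Scan().setFilter(new FilterList(
                FilterList.Operator.MUST_PASS_ALL, qualifier, lower, upper));
        }
    }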

Sharing BigTable Connection object among DataFlow DoFn sub-classes

荒凉一梦 submitted on 2019-12-08 10:34:24
Question: I am setting up a Java pipeline in Dataflow to read a .csv file and create a bunch of Bigtable rows based on the contents of the file. I see in the Bigtable documentation the note that connecting to Bigtable is an 'expensive' operation and that it's a good idea to do it only once and share the connection among the functions that need it. However, if I declare the Connection object as a public static variable in the main class and first connect to Bigtable in the main function, I get the
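The usual fix, sketched here under the assumption of the Beam/Dataflow Java SDK: a static Connection initialized in main() lives only on the launcher JVM and is not serialized to the workers. Instead, open one connection per DoFn instance in @Setup and close it in @Teardown:

    // Hedged sketch: the connection is opened once per DoFn instance on the
    // worker, reused across bundles, and closed on teardown. Project and
    // instance IDs are placeholders.
    import com.google.cloud.bigtable.hbase.BigtableConfiguration;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.hadoop.hbase.client.Connection;

    public class WriteRowFn extends DoFn<String, Void> {
        private transient Connection connection;  // not serialized with the DoFn

        @Setup
        public void setup() {
            connection = BigtableConfiguration.connect("my-project-id", "my-instance-id");
        }

        @ProcessElement
        public void processElement(ProcessContext c) throws Exception {
            // ... build a Put from c.element() and write it via `connection` ...
        }

        @Teardown
        public void teardown() throws Exception {
            if (connection != null) {
                connection.close();
            }
        }
    }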

Unable to connect to Google Bigtable using HBase REST api

跟風遠走 submitted on 2019-12-08 09:03:13
Question: Following this example, running the test script "python put_get_with_client.py" results in a 400 error (Bad Request):

    Bad request
    java.lang.ClassCastException: org.apache.hadoop.hbase.client.BigtableConnection cannot be cast to org.apache.hadoop.hbase.client.ClusterConnection
        at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:410)
        at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:370)
        at org.apache.hadoop.hbase