data-import

Import dump with SQLFILE parameter not returning the data inside the table

余生颓废 submitted on 2019-12-02 12:15:10
I am trying to turn a dump file into a .sql file using the SQLFILE parameter. I used the command:

impdp username/password DIRECTORY=dir DUMPFILE=sample.dmp SQLFILE=sample.sql LOGFILE=sample.log

I expected this to produce a SQL file with the contents of the tables, but it created a SQL file containing only DDL statements. For the export I used:

expdp username/password DIRECTORY=dir DUMPFILE=sample.dmp LOGFILE=sample.log FULL=y

The dump file size is 130 GB, so I believe the export ran correctly. Am I missing something in the import command? Is there another parameter I should use to get the table contents?
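
This is expected behavior rather than a broken dump: SQLFILE writes out only the DDL that the import job would have executed, and Data Pump stores row data in a binary format that is never rendered as SQL INSERT statements, so no SQLFILE run will produce the data. To load the rows you have to run an actual import. A minimal sketch, assuming a hypothetical target schema scott whose table definitions already exist:

impdp username/password DIRECTORY=dir DUMPFILE=sample.dmp LOGFILE=imp.log SCHEMAS=scott CONTENT=DATA_ONLY

CONTENT=DATA_ONLY loads only the rows; drop it (or use CONTENT=ALL) to create the objects and load the data in one pass.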

Import selected columns from a CSV file to a SQL Server table

只愿长相守 submitted on 2019-12-02 07:08:00
I am trying to import data from a CSV file into a SQL Server 2008 table. The data upload works, but I want to import only selected columns, not all of them, and add them to a new table with the same number of columns, using the wizard. The wizard, however, keeps selecting all the columns. Is it possible to import only selected columns using the wizard?

Answer 1: If you are using the Import/Export wizard, when you get to Select Source Tables and Views, click the "Edit Mappings" button at the bottom left of the screen. That opens the column mappings screen; in the Destination column, select <ignore> for each column you want to exclude from the import.
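
If the wizard stays inflexible, a common workaround is to bulk-load the whole file into a staging table and copy across only the columns you want. A minimal T-SQL sketch, with placeholder table names, column names, and file path:

-- Staging table mirrors every CSV column; the real target gets only a subset.
CREATE TABLE dbo.Staging (col1 VARCHAR(100), col2 VARCHAR(100), col3 VARCHAR(100));

BULK INSERT dbo.Staging
FROM 'C:\data\sample.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Keep just the wanted columns.
INSERT INTO dbo.Target (col1, col3)
SELECT col1, col3
FROM dbo.Staging;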

How can I index XML files stored on another server in Solr 4

痞子三分冷 submitted on 2019-12-02 06:40:08
Question: All my XML files are stored on one server, and I have installed and configured Solr on a different server. How can I index those XML files into Solr? I have looked at Nutch, but its main purpose is to crawl HTML pages and index them, and I don't need to crawl: all the files sit at a specific path on the other server. I just need to index those XML files in Solr. I have installed and configured Solr 4. If anyone has done something like this, please let me know how.
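
One straightforward route, sketched here under two assumptions (the remote directory can be mounted or copied onto the Solr machine, and the files are already in Solr's <add><doc> update-XML format), is to push the files at the update handler with the post.jar tool that ships with the Solr 4 examples; the hostname and mount point below are hypothetical:

java -Durl=http://solr-host:8983/solr/update -jar post.jar /mnt/xmlfiles/*.xml

If the XML is in some arbitrary schema instead, it first has to be transformed into update format (or imported through DataImportHandler with an XPathEntityProcessor) before Solr can index it.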

Variable 'sql_mode' can't be set to the value of 'NO_AUTO_CREATE_USER'

家住魔仙堡 submitted on 2019-11-30 12:46:03
I am using MySQL Workbench 8.0. I am trying to dump test data to a database, including all the tables, stored procedures, and views, with data. When I try to import it, the import finishes with one error:

Variable 'sql_mode' can't be set to the value of 'NO_AUTO_CREATE_USER'
Operation failed with exitcode 1

Also, after importing, if I check the database, only the tables have come through; there are no stored procedures at all. How would one fix this?

Answer 1: I recently had this problem as well, after exporting my database from MySQL Workbench 6.1 CE and then trying to import it into a newer version of MySQL Workbench.
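
The underlying cause: the NO_AUTO_CREATE_USER mode was removed in MySQL 8.0, but dumps made from older servers restore the session sql_mode (including that flag) around every stored-routine definition. The import dies at the first such statement, which is also why the tables arrive but the procedures never do. One fix is to strip the flag out of the dump file before importing; a sketch with a placeholder file name (the first substitution handles the flag with a trailing comma, the second the case where it is last in the list):

sed -i.bak 's/NO_AUTO_CREATE_USER,//g; s/NO_AUTO_CREATE_USER//g' dump.sql

Re-exporting with an 8.0-series client is the cleaner long-term fix, since it simply omits the removed mode.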

Reading text data from a CSV file in MATLAB

泄露秘密 submitted on 2019-11-29 15:59:33
My data is in the following form:

days of week    date          time (hrs)    visitors
mon             jan 2 2010    900           501
mon             jan 2 2010    1000          449
mon             jan 2 2010    1100          612

and likewise for every day of the entire year. I need to create a matrix of the days of the week, as shown below:

A = [mon mon mon]

Answer 1: Here is how I would read the tab-separated values and parse the dates:

%# read and parse file
fid = fopen('data.csv','rt');
C = textscan(fid, '%s %s %s %d', 'Delimiter','\t', 'HeaderLines',1, ...
    'MultipleDelimsAsOne',true, 'CollectOutput',false);
fclose(fid);

%# get date and number of visitors
dt = datenum(strcat(C{2}, {' '}, C{3}), 'mmm dd yyyy HHMM');  %# format string completed to match the sample rows
visitors = C{4};
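
Since the question actually asks for the day-of-week matrix, note that the first parsed column already holds it, and it can also be re-derived from the parsed dates; a short sketch using the variables above:

%# days of week as a cell array of strings, e.g. {'mon';'mon';'mon'}
A = C{1};

%# or derive the abbreviations from the parsed date numbers
A = datestr(dt, 'ddd');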

How to index and search two different tables in the same data source using a single Solr instance, or: Solr template fields not working properly

女生的网名这么多〃 submitted on 2019-11-29 10:07:14
I want to index and search two different entities. File name: db-data-config.xml

<dataConfig>
  <dataSource name="myindex" driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
              url="jdbc:sqlserver://test-pc:1433;DatabaseName=SampleDB"
              user="username" password="password" />
  <document>
    <entity name="Employees" query="select * from employee"
            transformer="TemplateTransformer" dataSource="myindex">
      <field column="id" name="singlekey" />
      <field column="eId" name="eid" />
      <field column="eName" name="ename" />
      <field column="entity" template="Employee" name="entity" />
    </entity>
    <entity name="Products"
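
The pattern in this config, as far as it is visible, is sound: TemplateTransformer stamps the literal value "Employee" into a shared entity field for each document, and the second entity block can do the same with a "Product" template, so both tables land in one index but stay distinguishable. At query time you then filter on that field with a standard filter query; a hypothetical example against a local Solr:

http://localhost:8983/solr/select?q=*:*&fq=entity:Employee

For this to work, the entity field must be declared in schema.xml (indexed, and stored if needed), and each document needs a key that is unique across both tables, since two rows sharing the same unique key would overwrite each other in the index.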

Fastest way to import CSV files in MATLAB

淺唱寂寞╮ submitted on 2019-11-29 04:32:48
I've written a script that saves its output to a CSV file for later reference, but the second script, which imports the data, takes an ungainly amount of time to read it back in. The data is in the following format:

Item1,val1,val2,val3
Item2,val4,val5,val6,val7
Item3,val8,val9

where the headers are in the left-most column and the data values take up the remainder of each row. One major difficulty is that the arrays of data values can have different lengths for each test item. I'd save it as a structure, but I need to be able to edit the file outside the MATLAB environment.
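
One approach that is usually much faster than reading field by field is to pull the whole file in with a single textscan call and split the ragged rows afterwards. A minimal sketch, assuming the layout above, a hypothetical file name data.csv, and item names that are valid MATLAB identifiers:

%# read the entire file in one pass, one string per line
fid = fopen('data.csv','rt');
C = textscan(fid, '%s', 'Delimiter','\n');
fclose(fid);
lines = C{1};

%# split each line on commas; rows may have different lengths
data = struct();
for i = 1:numel(lines)
    parts = regexp(lines{i}, ',', 'split');
    %# first token is the item name, the rest are its numeric values
    data.(parts{1}) = str2double(parts(2:end));
end

The result keeps the variable-length rows as differently sized numeric vectors, while the file on disk stays a plain CSV that can be edited anywhere.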

Importing multiple text files using VBA Macro

≯℡__Kan透↙ submitted on 2019-11-28 12:57:39
Question: I have a daily dump of two different text files (in the same folder) that get overwritten every day. I would like to import both of them into the active spreadsheet, tab-delimited, at the same time with VBA code. I would really appreciate the help! I am using Excel 2016. Recording my manual import of one of the text files gives this code, which is how I would like BOTH text files to be imported (formatting preserved):

With ActiveSheet.QueryTables.Add(Connection:= _
    "TEXT;C:\Users
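
A sketch of how the recorded snippet can be generalized to both files: loop over an array of paths and run the same QueryTables.Add import for each, appending each one below the previous. The paths are placeholders, and only the tab-delimiter settings implied by the recording are assumed here:

Sub ImportBothTextFiles()
    ' Placeholder paths: point these at the two daily dump files.
    Dim files As Variant
    Dim f As Variant
    Dim dest As Range

    files = Array("C:\Dumps\file1.txt", "C:\Dumps\file2.txt")

    For Each f In files
        ' Land each import one row below the last used cell in column A.
        Set dest = ActiveSheet.Cells(ActiveSheet.Rows.Count, 1).End(xlUp).Offset(1, 0)
        With ActiveSheet.QueryTables.Add(Connection:="TEXT;" & f, Destination:=dest)
            .TextFileParseType = xlDelimited
            .TextFileTabDelimiter = True
            .RefreshStyle = xlInsertDeleteCells
            .Refresh BackgroundQuery:=False
        End With
    Next f
End Sub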

App Engine BadValueError On Bulk Data Upload - TextProperty being construed as StringProperty

倾然丶 夕夏残阳落幕 submitted on 2019-11-28 09:30:43
bulkloader.yaml:

transformers:
- kind: ExampleModel
  connector: csv
  property_map:
  - property: __key__
    external_name: key
    export_transform: transform.key_id_or_name_as_string
  - property: data
    external_name: data
  - property: type
    external_name: type

model.py:

class ExampleModel(db.Model):
    data = db.TextProperty(required=True)
    type = db.StringProperty(required=True)

Everything seems to be fine, yet when I upload I get this error:

BadValueError: Property data is 24788 bytes long; it must be 500 or less. Consider Text instead, which can store strings of any length.

For some reason, it thinks data is a StringProperty rather than a TextProperty.
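
A plausible explanation: without an import_transform, the bulkloader hands the raw CSV string straight to the datastore, and a bare Python string longer than 500 bytes fails string validation before it is ever wrapped as a db.Text value. A sketch of the adjusted property_map entry, assuming google.appengine.ext.db is imported in the loader's python_preamble:

- property: data
  external_name: data
  import_transform: db.Text  # coerce the raw string into a Text value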