data-import

Bulk Insert Failed “Bulk load data conversion error (truncation)”

删除回忆录丶 submitted on 2019-12-22 21:26:32
Question: I've done data imports with SQL Server's BULK INSERT task hundreds of times, but this time I'm receiving an unfamiliar error that I've tried troubleshooting, to no avail, with Google. Below is the code I use with a comma-delimited file where new rows are indicated by newline characters:

BULK INSERT MyTable
FROM 'C:\myflatfile.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '/n'
)
GO

It consistently works, yet now, on a simple file with a date and a rate, it's failing with
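A detail worth double-checking in the statement above: the row terminator is written as '/n' (a forward slash) rather than '\n'. The sketch below shows the same load with the conventional terminators; the FIRSTROW option is an assumption in case the file carries a header line, and truncation errors of this kind also commonly mean a target column is narrower than the incoming text, so widening the column (or staging into wide VARCHAR columns first) is another avenue.

BULK INSERT MyTable
FROM 'C:\myflatfile.txt'
WITH (
    FIELDTERMINATOR = ',',   -- comma between columns
    ROWTERMINATOR = '\n',    -- backslash-n (newline), not '/n'
    FIRSTROW = 2             -- skip a header row if one exists (assumption)
);
GO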

How to import/read data from an XML file?

不问归期 submitted on 2019-12-22 12:26:41
Question: How do I access an XML file in C#? How do I count the number of nodes in that XML file? How am I supposed to access each and every node in that XML file? I have two XML files. One of them is dev.xml, which contains:

<Devanagri_to_itrans>
  <mapping>
    <character>अ</character>
    <itrans>a</itrans>
  </mapping>
  ...
</Devanagri_to_itrans>

The second file is guj.xml (with a very similar structure):

<Gujrathi_to_itrans>
  <mapping>
    <character>અ</character>
    <itrans>a</itrans>
  <mapping>
  ...
</Gujrathi_to_itrans
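A minimal sketch of one common approach, System.Xml's XmlDocument, assuming the dev.xml file from the question and that every character/itrans pair sits inside a <mapping> element:

using System;
using System.Xml;

class XmlImportDemo
{
    static void Main()
    {
        var doc = new XmlDocument();
        doc.Load("dev.xml");                                  // file name taken from the question

        // Count every <mapping> node in the document.
        XmlNodeList mappings = doc.SelectNodes("//mapping");
        Console.WriteLine("mappings: " + mappings.Count);

        // Visit each node and read its child elements.
        foreach (XmlNode mapping in mappings)
        {
            string character = mapping.SelectSingleNode("character")?.InnerText;
            string itrans = mapping.SelectSingleNode("itrans")?.InnerText;
            Console.WriteLine(character + " -> " + itrans);
        }
    }
}

LINQ to XML (XDocument) is the other common route and reads a little more naturally when the queries get nested.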

Is there a way other than below to load a json.rows file into RStudio?

五迷三道 submitted on 2019-12-19 10:32:19
Question: I have a json.rows file, instances.json.rows, with approximately 223k rows. I tried using jsonlite and came up with

instancesfile <- fromJSON("instances.json.rows")

but I kept getting an error:

Error in parse_con(txt, bigint_as_char) : parse error: trailing garbage
          kcBy-cs", "time_type": "in"} {"cluster_ids": ["Bz4SOc6zZn0"]
                     (right here) ------^

Here is an image of the data from the first row of my file. Apologies if my question is not clear enough. Let me know in the comments and I will edit
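The "trailing garbage" message usually means the file is newline-delimited JSON (one object per line) rather than a single JSON document, which fromJSON cannot parse in one go. A minimal sketch assuming that layout and the file name from the question, using jsonlite's streaming reader:

library(jsonlite)

# stream_in() reads newline-delimited JSON (one object per line)
# and flattens the records into a data frame.
instancesfile <- stream_in(file("instances.json.rows"))

str(instancesfile)   # inspect nested columns such as cluster_ids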

Matlab Importdata Precision

亡梦爱人 submitted on 2019-12-12 14:07:39
Question: I'm trying to use importdata for several data files containing data with a precision of up to 11 digits after the decimal, but Matlab seems to think I am only interested in the first 5 digits when using importdata. Is there an alternative method I could use to load my data, or a method to define the precision to which I want my data loaded?

Answer 1: First try:

format long g

Also, can you paste some of the data you are trying to load?

Source: https://stackoverflow.com/questions/9621027/matlab-importdata
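The answer's point is that importdata normally keeps full double precision; the command window simply displays about 5 significant digits under the default format short. A small sketch, with the file name mydata.txt as an assumption:

format long g                    % show full precision in the command window
A = importdata('mydata.txt');    % for a plain numeric file this returns a matrix
disp(A(1))                       % stored values keep roughly 15-16 significant digits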

Import json data with null values

ぃ、小莉子 submitted on 2019-12-12 12:37:06
Question: From the import documentation of BigQuery: "Note: Null values are not allowed." So I assume null is not allowed in JSON-formatted data for a BigQuery import. However, null values are actually very common in a regular ETL task (due to missing data). What would be a good solution for importing such JSON source files? Note that my data contains nested structures, so I would rather not convert to CSV and use ,, to represent a null value. One way I think I can do this is to replace all null values with default
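For newline-delimited JSON loads, the usual approach is simply to omit a field when its value is missing; BigQuery stores NULL for any nullable column that is absent from a record, including nested ones. A sketch with hypothetical field names:

{"user_id": "a1", "country": "DE", "purchases": [{"sku": "x9", "price": 3.5}]}
{"user_id": "b2", "purchases": []}

In the second record, country is omitted rather than written out as null, and it comes back as NULL when queried, provided the schema marks the column NULLABLE.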

Quotation marks when zoo to xts using as.xts() in R

淺唱寂寞╮ submitted on 2019-12-12 05:47:49
Question: When transforming the data below (class "zoo") into xts, I use returns.xts <- as.xts(returns). The effect is to add quotation marks around the new data, which then becomes unusable in functions. Why is this?

class "zoo":

             UK.EQUITY   EUR.EQUITY    NA.EQUITY  ASIA.EQUITY   JPN.EQUITY    EM.EQUITY  WORLD.EQUITY.EX.UK
2006-04-30  0.010552982 -0.003337229 -0.033739353  0.025092643 -0.020920633  0.020016007 -0.021165353
2006-05-31 -0.048962517 -0.071924029 -0.059684763 -0.102475485 -0.098121902 -0.141877111 -0
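Quotation marks in the printed xts usually mean the underlying core data is a character matrix rather than numeric, which typically happens when the zoo object was built from a data frame containing a character or factor column. A minimal sketch of one way to check and coerce, assuming the returns object from the question:

library(zoo)
library(xts)

storage.mode(coredata(returns))   # "character" here would explain the quotes

# Coerce the underlying data to numeric while keeping the zoo index, then convert.
returns.num <- zoo(apply(coredata(returns), 2, as.numeric), order.by = index(returns))
returns.xts <- as.xts(returns.num)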

Neo4j jexp/batch-import weird error: java.lang.NumberFormatException

与世无争的帅哥 submitted on 2019-12-11 12:07:30
Question: I'm trying to import around 6M nodes using Michael Hunger's batch importer, but I'm getting this weird error:

java.lang.NumberFormatException: For input string: "78rftark42lp5f8nadc63l62r3"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)

It is weird because 78rftark42lp5f8nadc63l62r3 is the very first value of the big CSV file that I'm trying to import, and its datatype is set to string. These are the first three lines of that file:

name:string:sessions labels

solr delta import handler timestamp not specific enough

♀尐吖头ヾ submitted on 2019-12-10 11:54:59
Question: I am new to Solr and I have a quite basic question about delta-imports. Several new records arrive per second in my MySQL DB. So when I start an import at second x, it is quite possible that new records land in the DB during that very same second, after the import has started; but the next time I start a delta-import, it will check the "last_index_time" in dataimport.properties and import only the records changed after second x. So I will lose all records which have been changed
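One common workaround is to make the deltaQuery overlap the previous window, selecting rows modified at or slightly before the stored last_index_time, and to rely on Solr's uniqueKey so re-imported rows simply overwrite themselves. A sketch of a data-import-handler entity, with the table and column names as assumptions:

<entity name="record" pk="id"
        query="SELECT * FROM records"
        deltaQuery="SELECT id FROM records
                    WHERE last_modified &gt;= DATE_SUB('${dataimporter.last_index_time}', INTERVAL 5 SECOND)"
        deltaImportQuery="SELECT * FROM records WHERE id = '${dataimporter.delta.id}'"/>

Because the overlap re-selects a few already-indexed rows, the uniqueKey field is what makes the re-import idempotent.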

How to extract metatags from HTML files and index them in SOLR and TIKA

筅森魡賤 submitted on 2019-12-10 00:24:50
Question: I am trying to extract the meta tags of HTML files and index them into Solr with the Tika integration. I am not able to extract those meta tags with Tika, and they do not show up in Solr. My HTML file looks like this:

<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta name="product_id" content="11"/>
<meta name="assetid" content="10001"/>
<meta name="title" content="title of the article"/>
<meta name="type" content="0xyzb"/>
<meta name="category" content="article category"
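With Solr's ExtractingRequestHandler (Solr Cell), Tika exposes HTML <meta> names as document metadata, which can then be mapped onto schema fields with fmap, or caught under a prefix with uprefix. A sketch of a post to the /update/extract handler; the core name, document id, file name, and target field names are assumptions, and exactly which meta names Tika surfaces can vary by version:

curl "http://localhost:8983/solr/mycore/update/extract?literal.id=doc1&uprefix=attr_&fmap.product_id=product_id&fmap.assetid=assetid&commit=true" \
     -F "myfile=@article.html;type=text/html"

The uprefix=attr_ catch-all only helps if the schema defines a matching dynamic field such as attr_*.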

import txt files using excel interop in C# (QueryTables.Add)

早过忘川 submitted on 2019-12-08 21:42:41
I am trying to insert text files into an Excel worksheet using QueryTables.Add; there is no error, but the worksheet is empty, except for the single cell I set through the Value2 property. I already used the macro recorder to find the objects involved. Can you help me with this? (I am using VS2008, C#, and Excel 2003 and 2007; both show an empty worksheet.) Below is my code; thanks for your help.

Application application = new ApplicationClass();
try {
    object misValue = Missing.Value;
    wbDoc = application.Workbooks.Open(flnmDoc, misValue, misValue, misValue, misValue, misValue,
        misValue, misValue, misValue, misValue, misValue, misValue,
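The snippet above is cut off, but with QueryTables the imported data only lands on the sheet once Refresh() is called on the query table; omitting that call is the classic cause of a silently empty worksheet. A minimal sketch of the usual interop pattern, with the file path, delimiter, and destination cell as assumptions:

// Assumes: using Microsoft.Office.Interop.Excel; and an already-open Worksheet `sheet`.
Range destination = sheet.get_Range("A1", Type.Missing);
QueryTable qt = sheet.QueryTables.Add(
    "TEXT;C:\\data\\myfile.txt",   // "TEXT;" prefix marks a text-file import (path assumed)
    destination,
    Type.Missing);

qt.TextFileParseType = XlTextParsingType.xlDelimited;
qt.TextFileCommaDelimiter = true;   // assuming a comma-separated file
qt.Refresh(false);                  // without Refresh() the worksheet stays empty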