flat-file

Transforming flat file to XML using XSLT-like technology

寵の児 submitted on 2019-12-08 05:19:07
Question: I'm designing a system that receives data from a number of partners in the form of CSV files. The files may differ in the number and ordering of columns. For the most part, I will want to choose a subset of the columns, maybe reorder them, and hand them off to a parser. I would obviously prefer to transform the incoming data into some canonical format so as to make the parser as simple as possible. Ideally, I would like to be able to generate a transformation for each incoming
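This kind of column select-and-reorder doesn't strictly need XSLT-like machinery: if each partner's file has a header row, a per-file column mapping gets you to a canonical layout. A minimal sketch in Python (the canonical column names here are hypothetical placeholders, not from the question):

```python
import csv
import io

# Canonical column order the downstream parser expects (names are hypothetical)
CANONICAL = ["id", "name", "amount"]

def to_canonical(reader):
    """Select and reorder each row's columns into the canonical order,
    keyed on the header names rather than on column positions."""
    for row in reader:
        yield [row[col] for col in CANONICAL]

# A partner file with an extra column and a different column order
raw = "name,junk,amount,id\nwidget,x,9.99,42\n"
rows = list(to_canonical(csv.DictReader(io.StringIO(raw))))
```

Because the mapping is driven by header names, a new partner format only needs a new header row (or a small name-translation table), not a new transformation program.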

Replace tabs (“\t”) in flat file with “Unit Separator” (0x1f) in C#

自古美人都是妖i submitted on 2019-12-08 03:28:28
Question: I have been having trouble finding the metacharacter for the 'Unit Separator' to replace the tabs in a flat file. So far I have this: File.WriteAllLines(outputFile, File.ReadLines(inputFile) .Select(t => t.Replace("\t", "\0x1f"))); //this does not work I have also tried: File.WriteAllLines(outputFile, File.ReadLines(inputFile) .Select(t => t.Replace("\t", "\u"))); //also doesn't work AND File.WriteAllLines(outputFile, File.ReadLines(inputFile) .Select(t => t.Replace("\t", 0x1f))); //also
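The escape the asker is reaching for in C# is "\u001f" (a four-hex-digit Unicode escape): "\0x1f" is actually a NUL character followed by the literal text x1f, and a bare "\u" is not a valid escape at all. The same replacement, shown here in Python where the equivalent escape is "\x1f":

```python
US = "\x1f"  # ASCII Unit Separator, code point 0x1f

def tabs_to_unit_sep(line):
    """Swap every tab for the Unit Separator control character."""
    return line.replace("\t", US)

converted = tabs_to_unit_sep("a\tb\tc")
```

The control character is invisible in most editors, so checking the result with a hex dump (or ord() on the characters) is the reliable way to confirm the replacement happened.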

Can't Separate Text File By Delimiter |

怎甘沉沦 submitted on 2019-12-07 17:11:02
Question: I am using C#. I am trying to pull a text file into an object. I am using an ODBC connection, and it looks like this: Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\Users\Owner\Desktop\IR\IR_Files\Absolute;Extensions=asc,csv,tab,txt; I am able to make the connection, but I can't get my columns separated. I'm using a schema.ini file, but it isn't working. Here is my schema file. [MyTextFile.CSV] Format=Delimited(|) ColNameHeader=False Col1=fullstockn Text col2=FULLINFO Text MaxScanRows=0
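For reference, a cleaned-up schema.ini for a pipe-delimited file follows the documented shape below. The two usual gotchas with the Microsoft Text Driver are that schema.ini must live in the same folder as the data file (the Dbq directory) and that the [section] name must match the queried file name exactly. This is a sketch of the documented format, not a guaranteed fix for the asker's specific setup:

```ini
; schema.ini must sit in the Dbq folder, next to MyTextFile.CSV,
; and the section name must match the file name exactly.
[MyTextFile.CSV]
Format=Delimited(|)
ColNameHeader=False
MaxScanRows=0
Col1=fullstockn Text
Col2=FULLINFO Text
```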

Help With PHP Pagination Script For Flat File Database

半世苍凉 submitted on 2019-12-06 14:57:38
Question: I have a few questions regarding a PHP pagination script for a flat-file database that I found. I have posted the script below. <?php echo '<html><body>'; // Data, normally from a flat file or some other source $data = "Item1|Item2|Item3|Item4|Item5|Item6|Item7|Item8|Item9|Item10"; // Put our data into an array $dataArray = explode('|', $data); // Get the current page $currentPage = trim($_REQUEST[page]); // Pagination settings $perPage = 3; $numPages = ceil(count($dataArray) / $perPage); if(!
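One small PHP note first: $_REQUEST[page] should be $_REQUEST['page'] (an unquoted array key triggers a notice). Beyond that, the pagination itself is plain arithmetic, the same in any language: total pages = ceil(count / perPage), and page n covers the slice starting at (n - 1) * perPage. A sketch of that arithmetic in Python, using the question's sample data:

```python
import math

data = "Item1|Item2|Item3|Item4|Item5|Item6|Item7|Item8|Item9|Item10".split("|")
PER_PAGE = 3

def page(items, number, per_page=PER_PAGE):
    """Return (slice_for_page, total_pages) for a 1-based page number,
    clamping out-of-range requests into the valid range."""
    total = math.ceil(len(items) / per_page)
    number = max(1, min(number, total))
    start = (number - 1) * per_page
    return items[start:start + per_page], total

chunk, total_pages = page(data, 4)
```

Ten items at three per page gives four pages, the last holding only Item10; the clamp means a hand-edited ?page=999 still lands on a valid page instead of an empty one.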

iOS Implementation Theory

谁都会走 submitted on 2019-12-06 09:43:35
We (the IT department at work) are looking to build an iPad app that will take numeric IDs and provide a simple lookup in a table. It is essentially a primary-key search on a single table, displaying a field after minor processing. The caveat: there are 4.5 million rows in this table, and it needs a lookup time of at most 1 second. It will not have an internet connection, so the lookup has to happen on the device. We have a few ideas, but which makes the most sense? SQLite: Will it stand up to such abuse? Can it handle that many rows, and will it do it well? Flat file search: we can loop over the file
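SQLite with an index (or INTEGER PRIMARY KEY) on the ID column should comfortably answer a single-key lookup over 4.5 million rows in well under a second on-device; that is exactly the workload it is built for. A flat file only competes if you avoid looping over it: with sorted, fixed-length records you can binary-search by offset arithmetic in O(log n) seeks. A sketch of that second idea in Python, with an invented 16-byte record layout (8-byte big-endian ID + 8-byte payload):

```python
RECORD_LEN = 16  # hypothetical fixed record size: 8-byte id + 8-byte payload

def lookup(buf, key):
    """Binary-search a sorted, fixed-length record buffer for an id.
    On a real file, buf[i:j] slicing becomes seek() + read()."""
    lo, hi = 0, len(buf) // RECORD_LEN
    while lo < hi:
        mid = (lo + hi) // 2
        rec = buf[mid * RECORD_LEN:(mid + 1) * RECORD_LEN]
        rec_id = int.from_bytes(rec[:8], "big")
        if rec_id == key:
            return rec[8:]
        if rec_id < key:
            lo = mid + 1
        else:
            hi = mid
    return None

# A tiny sorted "file" of (id, payload) records
records = b"".join(i.to_bytes(8, "big") + ("val%05d" % i).encode() for i in (3, 7, 42))
```

For 4.5 million records that is at most about 23 probes per lookup, so either approach fits the 1-second budget; SQLite just does the bookkeeping for you.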

Spring Batch: How to process multi-line log files

徘徊边缘 submitted on 2019-12-06 06:58:11
Question: I am trying to import the contents of a log file into a database using Spring Batch. I am currently using a FlatFileItemReader, but there are unfortunately many log entries that it doesn't catch. The two main problems are: Lines that contain multi-line JSON strings: 2012-03-22 11:47:35,307 DEBUG main someMethod(SomeClass.java:56): Do Something(18,true,null,null,null): my.json = '{ "Foo":"FooValue", "Bar":"BarValue", ... etc }' Lines that contain stack traces 2012-03-22 11:47:50,596 ERROR main
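In Spring Batch terms this is typically handled with a custom RecordSeparatorPolicy (or a reader wrapper) so that a "record" spans lines until the next real entry begins. The grouping rule itself is simple: a line starts a new record only if it begins with the timestamp pattern; everything else (JSON bodies, stack-trace frames) is a continuation of the previous entry. A language-neutral sketch of that rule, shown in Python with sample lines adapted from the excerpt:

```python
import re

# A new log entry starts with "YYYY-MM-DD HH:MM:SS,mmm "
TS = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3} ")

def group_records(lines):
    """Merge continuation lines (multi-line JSON, stack traces) into the
    log entry that started them."""
    record = []
    for line in lines:
        if TS.match(line) and record:
            yield "\n".join(record)
            record = []
        record.append(line)
    if record:
        yield "\n".join(record)

log = [
    "2012-03-22 11:47:35,307 DEBUG main someMethod(SomeClass.java:56): my.json = '{",
    '  "Foo":"FooValue",',
    "}'",
    "2012-03-22 11:47:50,596 ERROR main boom",
]
records = list(group_records(log))
```

The same predicate ("does this line open a new entry?") is what a RecordSeparatorPolicy implementation would express, answering whether the record accumulated so far is complete.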

Using FlatFileItemReader with a TaskExecutor (Thread Safety)

房东的猫 submitted on 2019-12-05 22:20:31
Question: There are a lot of examples that use FlatFileItemReader along with a TaskExecutor. I provide samples below (both with XML and Java config): Using Oracle Coherence with Spring Batch Spring Batch Multithreading Example I have used it myself with XML configuration for large CSVs (GB size), writing to a database with the out-of-the-box JpaItemWriter. There seem to be no issues even without setting save-state = false or any other kind of special handling. Now, FlatFileItemReader is documented as

SSIS is dropping a record on flat file source import

血红的双手。 submitted on 2019-12-05 02:49:46
I am experiencing a very strange issue in SSIS (2008). The basic workflow is as follows: using a flat file source (CSV), bring the data into SSIS and push it into SQL. When the process is run on the dev environment, everything works perfectly. When the dtsx package is placed in production, using the exact same flat file source, the last record in the file is dropped by the time it gets to the start of the SQL proc. I have gone over everything I can possibly think of, including line delimiters, column delimiters, and rebuilding the flat file source connection. Has anyone seen anything like this before? The CSV file contains 10

Is there a smart way to write a fixed-length flat file?

主宰稳场 submitted on 2019-12-05 01:13:52
Is there any framework/library to help write fixed-length flat files in Java? I want to write a collection of beans/entities to a flat file without worrying about conversions, padding, alignment, fillers, etc. For example, I'd like to serialize a bean like: public class Entity{ String name = "name"; // length = 10; align left; fill with spaces Integer id = 123; // length = 5; align left; fill with spaces Integer serial = 321 // length = 5; align to right; fill with '0' Date register = new Date();// length = 8; convert to yyyyMMdd } ... into ... name 123 0032120110505 mikhas 5000 0122120110504
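In Java, libraries such as BeanIO or uniVocity-parsers map annotated beans to fixed-width records in exactly this style. The underlying formatting rules from the question's comments (left-pad with spaces, right-align with '0', yyyyMMdd dates) are compact enough to sketch directly; here is the record layout expressed with Python's format specifiers, using the field widths from the question:

```python
from datetime import date

def format_record(name, id_, serial, register):
    """name: 10 chars, left-aligned, space-filled; id: 5 left/space;
    serial: 5 right-aligned, zero-filled; register: yyyyMMdd."""
    return "{:<10}{:<5}{:0>5}{:%Y%m%d}".format(name, id_, serial, register)

line = format_record("name", 123, 321, date(2011, 5, 5))
```

Each field's (width, alignment, fill, conversion) tuple is all a fixed-width framework really needs per field, which is why annotation-driven libraries can generate the writer from the bean declaration alone.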

processing text from a non-flat file (to extract information as if it *were* a flat file)

牧云@^-^@ submitted on 2019-12-05 00:51:39
Question: I have a longitudinal data set generated by a computer simulation that can be represented by the following tables ('var' are variables): time subject var1 var2 var3 t1 subjectA ... t2 subjectB ... and subject name subjectA nameA subjectB nameB However, the file generated writes a data file in a format similar to the following: time t1 description subjectA nameA var1 var2 var3 subjectB nameB var1 var2 var3 time t2 description subjectA nameA var1 var2 var3 subjectB nameB var1 var2 var3 ...(and
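The file is block-structured: a "time" header line opens a block, and each following line carries subject, name, and the variable values for that time point. Flattening it into (time, subject, name, vars) rows is a short stateful loop. A sketch in Python, with the token layout inferred from the excerpt (real files may need more robust splitting):

```python
def flatten(lines):
    """Turn the time-block format into flat (time, subject, name, vars) rows,
    carrying the current time header down to each subject line."""
    rows, current_time = [], None
    for line in lines:
        tokens = line.split()
        if not tokens:
            continue
        if tokens[0] == "time":
            current_time = tokens[1]   # e.g. "time t1 description"
        else:
            subject, name, *var_values = tokens
            rows.append((current_time, subject, name, var_values))
    return rows

sample = [
    "time t1 description",
    "subjectA nameA 1 2 3",
    "subjectB nameB 4 5 6",
    "time t2 description",
    "subjectA nameA 7 8 9",
]
rows = flatten(sample)
```

Once in this flat form, the rows load directly into the two tables the question describes: the (time, subject, var...) table, plus a deduplicated (subject, name) table.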