dbf

C# - Accessing a DBF file in a Stream

Posted by 筅森魡賤 on 2019-12-12 01:07:29

Question: I have a WebMethod which I use to allow a user to upload a database file (*.DBF). I know I can access the *.DBF file using an OleDbConnection with the appropriate connection string. But I don't really want to save the file on the server; my WebMethod has the file contents in a stream. Is there any way I can access the *.DBF file while it is in the stream? Source: https://stackoverflow.com/questions/14810147/c-sharp-accessing-a-dbf-file-in-a-stream

DBF Large Char Field

Posted by 。_饼干妹妹 on 2019-12-11 13:07:41

Question: I have a database file that I believe was created with Clipper, but I can't say for sure (I have .ntx files for indexes, which I understand is what Clipper uses). I am trying to create a C# application that will read this database using the System.Data.OleDb namespace. For the most part I can successfully read the contents of the tables; however, there is one field that I cannot. This field, called CTRLNUMS, is defined as a CHAR(750). I have read various articles found through Google searches that…

Convert java Date to dbf FoxPro datetime format

Posted by 那年仲夏 on 2019-12-11 12:18:39

Question: I'm trying to create a DBF file to the FoxPro spec, but I don't know how to insert a date. I don't know how to convert a java Date to this: FoxPro's field is two 32-bit integers, stored in reverse byte order: one stores the date, the other stores the time. The date integer stores the number of days from 1/1/4712 BC. The time integer stores the number of milliseconds from 00:00:00. It's easy to get the days and milliseconds with Joda-Time: DateTime start = new DateTime(-4713, 1, 1, 0, 0, 0, 0); DateTime end = new DateTime…
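The two-integer layout described above can be sketched in Python (used here for illustration rather than Java). The JDN_OFFSET constant, which maps Python's proleptic-Gregorian ordinal onto the Julian day count FoxPro expects, is an assumption worth verifying against a field written by FoxPro itself:

```python
import struct
from datetime import datetime

# Assumed offset between Python's date.toordinal() (where 0001-01-01 = 1)
# and the Julian day number FoxPro stores (days counted from 4712 BC).
# Verify against a real FoxPro table before relying on it.
JDN_OFFSET = 1721425

def to_foxpro_datetime(dt: datetime) -> bytes:
    """Pack a datetime into FoxPro's 8-byte DateTime layout:
    two little-endian uint32s - Julian day, then ms since midnight."""
    days = dt.date().toordinal() + JDN_OFFSET
    ms = (dt.hour * 3600 + dt.minute * 60 + dt.second) * 1000 \
         + dt.microsecond // 1000
    return struct.pack('<II', days, ms)
```

The same arithmetic translates directly to Java: compute the day count from a fixed epoch, the milliseconds from midnight, and write both as little-endian 32-bit integers.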

Speed up read.dbf in R (problems with importing large dbf file)

Posted by 我是研究僧i on 2019-12-11 11:59:45

Question: I have a dataset given in .dbf format and need to import it into R. I haven't worked with this extension previously, so I have no idea how to export a dbf file with multiple tables into a different format. A simple read.dbf has been running for hours with still no results. I tried to look into speeding up R performance, but I'm not sure that's the issue; I think the problem is reading the large dbf file itself (it weighs ~1.5 GB), i.e. the command itself must not be efficient at all. However, I don't…
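Before committing to an hours-long import, it can help to read just the fixed-size DBF header, which records the row count and record width (and therefore the true data volume). A minimal stdlib sketch in Python, with field offsets taken from the dBase/xBase header layout:

```python
import struct

def dbf_stats(raw: bytes):
    """Parse a dBase/xBase header prefix.

    Bytes 4-7 hold the record count (little-endian uint32),
    bytes 8-9 the header length, bytes 10-11 the bytes per record
    (both little-endian uint16). Returns (n_records, header_len, record_len).
    """
    return struct.unpack_from('<IHH', raw, 4)
```

Reading the first 32 bytes of the file (e.g. `open('big.dbf', 'rb').read(32)`) and passing them to `dbf_stats` tells you how many rows a full import will have to materialize, which is often the real bottleneck with a ~1.5 GB table.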

Foxbase to postrgresql data transfer. (dbf files reader)

Posted by 元气小坏坏 on 2019-12-11 10:59:35

Question: I am rewriting a program based on an old Foxbase database consisting of .dbf files. I need a tool that would read these files and help transfer the data to PostgreSQL. Do you perhaps know of a tool of this type? Answer 1: pgdbf.sourceforge.net - has worked for all the DBF I've fed it. Quoting the site description: PgDBF is a program for converting XBase databases - particularly FoxPro tables with memo files - into a format that PostgreSQL can directly import. It's a compact C project with no…

Trouble with Insert Into .dbf file

Posted by ℡╲_俬逩灬. on 2019-12-11 06:44:14

Question: This code does not save any data in the dbf file. What is wrong here? Here is the code with the recommended changes. Thank you.

string connectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\TEMP;Extended Properties=dBase IV";
using (OleDbConnection connection = new OleDbConnection(connectionString))
using (OleDbCommand command = connection.CreateCommand())
{
    connection.Open();
    command.CommandText = @"CREATE TABLE TEST (Id Text, Changed Text, Tacos Text)";
    command.ExecuteNonQuery…

Batch convert visual foxpro dbf tables to csv

Posted by 不打扰是莪最后的温柔 on 2019-12-11 03:49:03

Question: I have a huge collection of Visual FoxPro dbf files that I would like to convert to csv. (If you like, you can download some of the data here. Click on the 2011 link for Transaction Data, and prepare to wait a long time...) I can open each table with DBF View Plus (an awesome freeware utility), but exporting them to csv takes a few hours per file, and I have several dozen files to work with. Is there a program like DBF View Plus that will allow me to set up a batch of dbf-to-csv conversions…

Visual Fox Pro and Python

Posted by 浪子不回头ぞ on 2019-12-10 21:45:39

Question: I'm working with a Visual FoxPro database (.dbf file) and I'm using the dbf Python module. Here's an example: myDb = VfpTable('table.dbf'); Now I can exclude deleted items by doing the following: myDb._use_deleted = None; My question is: is there an easier way to do this? Maybe a function? I hate accessing "private" variables. Also, without setting this property, how can I determine whether a row has been deleted? The rows are still technically in the database, so is there a flag? A…
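There is indeed a flag at the file level: in the xBase record layout, every fixed-width record begins with one status byte, a space (0x20) for live rows and an asterisk (0x2A) for soft-deleted ones. This sketch checks that byte directly on raw record bytes; it describes the on-disk format, not the dbf module's own API:

```python
DELETED = 0x2A  # b'*' - record has been soft-deleted
ACTIVE = 0x20   # b' ' - record is live

def is_deleted(record: bytes) -> bool:
    """xBase soft-deletes a row by overwriting its first byte with '*';
    the rest of the record stays in place until the table is packed."""
    return record[0] == DELETED
```

This is why deleted rows are "still technically in the database": deletion only flips the status byte, and the space is reclaimed later by a PACK operation.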

The Relationship Between Oracle's SCN and Checkpoint_Change#

Posted by 非 Y 不嫁゛ on 2019-12-10 16:59:16

We know that Oracle has an SCN (System Change Number) and a Checkpoint_Change#, so what is the relationship between the two? In fact, Checkpoint_Change# is derived from the SCN: the SCN changes constantly, while Checkpoint_Change# only changes when a checkpoint occurs, at which point it takes its value from the current SCN. The following example illustrates this.

1. Get the current SCN

SQL> select dbms_flashback.get_system_change_number() from dual;

DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER()
-----------------------------------------
                                  1275075

2. Trigger a checkpoint

SQL> alter system checkpoint;

System altered.

3. Check the checkpoint SCN recorded in the data files and file headers

SQL> column name format a50;
SQL> select name,checkpoint_change# from v$datafile;

NAME                                               CHECKPOINT_CHANGE#
-------------------------------------------------- ------------------
E:\APP…

Changing the Path of Oracle Data Files

Posted by 筅森魡賤 on 2019-12-09 13:54:04

1. Connect to the database

SQL> sqlplus / as sysdba

2. Check the data file locations

SQL> select name from v$datafile;

FILE_NAME
------------------------------------------------------------------------
/oradata/orcl/users01.dbf
/oradata/orcl/undotbs01.dbf
/oradata/orcl/sysaux01.dbf
/oradata/orcl/system01.dbf
/oradata/orcl/work.dbf

3. Shut down the database

SQL> shutdown immediate;

4. Move the files to the new location (for example, /home/data)

cd /oradata/orcl/
mv users01.dbf undotbs01.dbf sysaux01.dbf system01.dbf work.dbf /home/data

5. Start the database in mount mode and rename the files

SQL> startup mount;
SQL> alter database rename file '/oradata/orcl/users01.dbf' to '/home/data/users01.dbf';