I have plenty of tables in MySQL which contain the zero date 0000-00-00 00:00:00 in DATETIME columns.
Using some sort of admin setting, is it possible to replace them all with a valid date?
As this is for a migration, I would suggest that you simply wrap your tables in views which do the conversion as you export the data. I have used the concept below when moving data from MySQL to PostgreSQL, which has the same problem.
Each table should be proxied by something like this:
CREATE VIEW migration_mytable AS
SELECT field1, field2,
       CASE field3
           WHEN '0000-00-00 00:00:00'
           THEN '1900-01-01 00:00:00'
           ELSE field3
       END AS field3
FROM mytable;
You should be able to write a script which generates these views for you from the catalog, in case you have a great deal of tables to take care of.
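As a sketch of such a generator, a single query against information_schema can emit the view DDL itself. This assumes the zero dates live in DATETIME columns (wrap DATE and TIMESTAMP columns analogously if you have them); the emitted statements still need to be reviewed and executed afterwards:

```sql
-- Emits one CREATE VIEW statement per base table; DATETIME columns get
-- the zero-date translation, all other columns pass through unchanged.
-- Note: wide tables may need a larger group_concat_max_len.
SELECT CONCAT(
         'CREATE VIEW migration_', t.table_name, ' AS SELECT ',
         GROUP_CONCAT(
           CASE WHEN c.data_type = 'datetime'
                THEN CONCAT('CASE ', c.column_name,
                            ' WHEN ''0000-00-00 00:00:00''',
                            ' THEN ''1900-01-01 00:00:00''',
                            ' ELSE ', c.column_name,
                            ' END AS ', c.column_name)
                ELSE c.column_name
           END
           ORDER BY c.ordinal_position SEPARATOR ', '),
         ' FROM ', t.table_name, ';') AS ddl
FROM information_schema.tables t
JOIN information_schema.columns c
  ON c.table_schema = t.table_schema AND c.table_name = t.table_name
WHERE t.table_schema = DATABASE() AND t.table_type = 'BASE TABLE'
GROUP BY t.table_name;
```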
You should then be able to import the data into your SQL Server table (using a bridge between the two servers), simply running a query like:
INSERT INTO sqlserver.mytable SELECT * FROM mysql.migration_mytable;
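One possible bridge is a linked server configured on the SQL Server side; with one in place (the name MYSQL_LINK and the column list below are hypothetical), the copy boils down to something like:

```sql
-- T-SQL, executed on SQL Server: pull the cleaned rows through the
-- linked server and insert them into the target table.
INSERT INTO dbo.mytable (field1, field2, field3)
SELECT field1, field2, field3
FROM OPENQUERY(MYSQL_LINK,
     'SELECT field1, field2, field3 FROM migration_mytable');
```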
Alter your table as follows:
ALTER TABLE `test_table`
CHANGE COLUMN `created_dt` `created_dt` date NOT NULL DEFAULT '1900-01-01';
But before altering the table you need to update the existing values, as juergen d said:
update test_table
set created_dt= '1900-01-01'
where created_dt= '0000-00-00'
You can change existing values by running this query:
update your_table
set date_column = '1900-01-01'
where date_column = '0000-00-00'
And you can change the definition of your table to a specific default value or NULL, like this:
ALTER TABLE your_table
CHANGE date_column date_column date NOT NULL DEFAULT '1900-01-01'
To change existing values you could use a query like this:
UPDATE tablename SET date_column = '1900-01-01' WHERE date_column = '0000-00-00';
If you want to automate the UPDATE query you can use a prepared statement (here _schema, _table and _column are the variables filled in by the cursor loop further down):
SET @sql_update=CONCAT_WS(' ', 'UPDATE', CONCAT(_schema, '.', _table),
'SET', _column, '=', '\'1900-01-01\'',
'WHERE', _column, '=', '\'0000-00-00\'');
PREPARE stmt FROM @sql_update;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
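For hypothetical values _schema = `mydb`, _table = `mytable` and _column = `created_dt`, the CONCAT_WS call builds the following statement in @sql_update:

```sql
UPDATE `mydb`.`mytable` SET `created_dt` = '1900-01-01' WHERE `created_dt` = '0000-00-00'
```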
And you can loop through all columns in all tables of the current schema that are declared as DATE or DATETIME:
SELECT
table_schema,
table_name,
column_name
FROM
information_schema.columns
WHERE
table_schema=DATABASE() AND data_type LIKE 'date%'
To loop through all columns you could use a stored procedure:
DELIMITER //
CREATE PROCEDURE update_all_tables()
BEGIN
    DECLARE done BOOLEAN DEFAULT FALSE;
    DECLARE _schema VARCHAR(255);
    DECLARE _table VARCHAR(255);
    DECLARE _column VARCHAR(255);
    DECLARE cur CURSOR FOR
        SELECT
            CONCAT('`', REPLACE(table_schema, '`', '``'), '`'),
            CONCAT('`', REPLACE(table_name, '`', '``'), '`'),
            CONCAT('`', REPLACE(column_name, '`', '``'), '`')
        FROM information_schema.columns
        WHERE table_schema = DATABASE() AND data_type LIKE 'date%';
    DECLARE CONTINUE HANDLER FOR NOT FOUND SET done := TRUE;
    OPEN cur;
    columnsLoop: LOOP
        FETCH cur INTO _schema, _table, _column;
        IF done THEN
            LEAVE columnsLoop;
        END IF;
        SET @sql_update = CONCAT_WS(' ', 'UPDATE', CONCAT(_schema, '.', _table),
                                    'SET', _column, '=', '\'1900-01-01\'',
                                    'WHERE', _column, '=', '\'0000-00-00\'');
        PREPARE stmt FROM @sql_update;
        EXECUTE stmt;
        DEALLOCATE PREPARE stmt;
    END LOOP columnsLoop;
    CLOSE cur;
END//
DELIMITER ;
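Once the procedure is defined, it can be invoked (and cleaned up) like this:

```sql
CALL update_all_tables();

-- Optional: drop the helper once the migration is done.
DROP PROCEDURE update_all_tables;
```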
In my opinion, the simplest way is to generate all the updates:
select
concat('UPDATE ',TABLE_NAME,' SET ',COLUMN_NAME,'=NULL WHERE ',COLUMN_NAME,'=0;')
from information_schema.COLUMNS
where TABLE_SCHEMA = 'DATABASE_NAME' and DATA_TYPE in ('datetime', 'date', 'time');
Just replace DATABASE_NAME with your DB name, and execute all the generated updates.
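For a hypothetical table orders with a DATETIME column created_at, the SELECT above would emit a line like:

```sql
UPDATE orders SET created_at=NULL WHERE created_at=0;
```

Note that this variant sets the zero dates to NULL rather than to a sentinel date (the = 0 comparison matches zero dates in MySQL), so the columns must allow NULL.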
Preface: you might want to check the concept of ETL in data warehousing; there are tools which can do such simple conversions for you, even open-source ones like Kettle/Pentaho.
But this one is easy when you use any programming language capable of composing SQL queries. I have made an example in Perl, but PHP or Java would also do the job:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $user = 'geheim';
my $pass = 'secret';
my $dbh = DBI->connect("dbi:mysql:host=localhost:database=to_convert:port=3306",
                       $user, $pass) or die $DBI::errstr;

# Prints out all the statements needed; they can be reviewed before being executed
my @tables = @{ $dbh->selectall_arrayref("show tables") };
foreach my $tableh (@tables) {
    my $tabname = $tableh->[0];
    my $sth = $dbh->prepare("explain $tabname");
    $sth->execute();
    while (my $colinfo = $sth->fetchrow_hashref) {
        if ($colinfo->{'Type'} =~ /date/i && $colinfo->{'Null'} =~ /yes/i) {
            print "update \`$tabname\` set \`" . $colinfo->{'Field'} . "\` = '1990-01-01' where \`" . $colinfo->{'Field'} . "\` IS NULL;\n";
            print "alter table \`$tabname\` change column \`" . $colinfo->{'Field'} . "\` \`" . $colinfo->{'Field'} . "\` " . $colinfo->{'Type'} . " not null default '1990-01-01';\n";
        }
    }
}
This does not change anything by itself, but when the database has tables like:
localmysql [localhost]> explain dt;
+-------+------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------+------+------+-----+---------+-------+
| a | date | YES | | NULL | |
+-------+------+------+-----+---------+-------+
1 row in set (0.00 sec)
localmysql [localhost]> explain tst;
+-------+----------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------+----------+------+-----+---------+-------+
| id | int(11) | YES | | NULL | |
| atime | datetime | YES | | NULL | |
+-------+----------+------+-----+---------+-------+
2 rows in set (0.00 sec)
it produces the statements:
update `dt` set `a` = '1990-01-01' where `a` IS NULL;
alter table `dt` change column `a` `a` date not null default '1990-01-01';
update `tst` set `atime` = '1990-01-01' where `atime` IS NULL;
alter table `tst` change column `atime` `atime` datetime not null default '1990-01-01';
This list can then be reviewed and executed as Statements.
Hope that helps!