Pentaho Data Integration: import large dataset from DB
Question

I'm trying to import a large set of data from one DB to another (MSSQL to MySQL). The transformation does this: it gets a subset of the data, checks whether each row is an insert or an update by comparing hashes, maps the data, and inserts it into the MySQL DB through an API call.

For the moment the subset part is strictly manual. Is there a way to set up Pentaho to do it for me, as some kind of iteration? The query I'm using to get the subset is:

    SELECT t1.*
    FROM (
        SELECT *, ROW_NUMBER() OVER (ORDER BY id) AS RowNum
        FROM mytable
    ) t1
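For context, a minimal sketch of how the paging could be parameterized so that a looping Pentaho job drives it instead of manual edits: a WHERE clause bounds the row window, and ${START_ROW} and ${END_ROW} are hypothetical variable names (not from the original post) that a parent job would set and increment on each pass, e.g. with a Set Variables step looping back to the transformation. The Table input step needs "Replace variables in script?" enabled for the substitution to occur.

    -- Page of rows bounded by Pentaho variables (hypothetical names),
    -- substituted by PDI before the query is sent to SQL Server.
    SELECT t1.*
    FROM (
        SELECT *, ROW_NUMBER() OVER (ORDER BY id) AS RowNum
        FROM mytable
    ) t1
    WHERE t1.RowNum BETWEEN ${START_ROW} AND ${END_ROW}
    ORDER BY t1.RowNum;

On SQL Server 2012 or later, the same window can also be expressed more directly with ORDER BY id OFFSET ... ROWS FETCH NEXT ... ROWS ONLY.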