I'd probably do something like this:
var $target = $("#targetT");
$("#sourceT tr").each(function() {
    var $tds = $(this).children(),
        $row = $("<tr></tr>");
    $row.append($tds.eq(0).clone()).append($tds.eq(1).clone()).appendTo($target);
});
Demo: http://jsfiddle.net/HwzQg/
That is, loop through each row of the source table and just copy the required columns. This way it doesn't matter if the required columns are adjacent, and it is easy to change the code to copy more columns if your requirement changes. In fact you could easily encapsulate it in a function that takes the source and target tables as parameters along with a list of which column numbers to copy:
function copyColumns(srcTableId, targetTableId) {
    var colNos = [].slice.call(arguments, 2),
        $target = $("#" + targetTableId);
    $("#" + srcTableId + " tr").each(function() {
        var $tds = $(this).children(),
            $row = $("<tr></tr>");
        for (var i = 0; i < colNos.length; i++)
            $row.append($tds.eq(colNos[i]).clone());
        $row.appendTo($target);
    });
}

copyColumns("sourceT", "targetT", 0, 1);
// NOTE that this allows you to easily re-order the columns as you copy them:
copyColumns("sourceT", "targetT", 1, 0, 2);
This uses the arguments object to let you pass any number of column numbers as separate arguments, but of course you could modify it to accept an array of column numbers instead. Whatever works for you.
Demo: http://jsfiddle.net/HwzQg/1/
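To see what that [].slice.call(arguments, 2) line is actually doing, here is the mechanics in isolation (collectColNos is a hypothetical name, used only for illustration — it is not part of the answer's code):

```javascript
// arguments is array-like but not a real array, so it has no slice
// method of its own. Borrowing Array.prototype.slice via
// [].slice.call(arguments, 2) converts it to a real array while
// skipping the first two entries (the two table ids), leaving only
// the trailing column numbers.
function collectColNos(srcTableId, targetTableId) {
    return [].slice.call(arguments, 2);
}

collectColNos("sourceT", "targetT", 1, 0, 2); // → [1, 0, 2]
```

Switching the function to take an array instead would just mean replacing that line with a plain colNos parameter.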
"I am not trying to loop all the tr and td's from source table. Coz, my source table is going to be more than thousands rows and more than 50 cols."
I wouldn't worry about the size of the source table. Write code that gets the result you need first, and then optimise it if the performance turns out to be poor. The code you showed implicitly loops through the original table twice anyway, once with td:nth-child(1) and again with td:nth-child(2).
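If the row-by-row append ever does become a bottleneck at thousands of rows, a common optimisation is to build all of the copied rows as one HTML string and append it to the target table in a single DOM operation. A rough sketch of that idea (buildRowsHtml is a hypothetical helper, and whether it is actually faster for your data should be measured, not assumed):

```javascript
// rows is an array of rows, each row an array of cell-content strings;
// colNos lists which columns to copy, in the desired output order.
// Returns one HTML string containing all the <tr> elements, so the
// caller touches the DOM only once.
function buildRowsHtml(rows, colNos) {
    var html = "";
    for (var i = 0; i < rows.length; i++) {
        html += "<tr>";
        for (var j = 0; j < colNos.length; j++) {
            html += "<td>" + rows[i][colNos[j]] + "</td>";
        }
        html += "</tr>";
    }
    return html;
}

// e.g. $("#targetT").append(buildRowsHtml(rows, [0, 1]));
```
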