Yes, the declared length of a varchar column affects the optimizer's estimates for the query, the memory granted for internal operations (for example, sorting), and as a consequence CPU and memory usage. Because SQL Server does not know the actual data length at compile time, it derives the estimated row size from the declared length (roughly half of it for a varchar column). You can reproduce this with the following simple example.
1. Create two tables:
create table varLenTest1
(
    a varchar(100)
);

create table varLenTest2
(
    a varchar(8000)
);
2. Fill both of them with identical data (36-character GUID strings), so only the declared column lengths differ:

declare @i int = 20000;
while (@i > 0)
begin
    insert into varLenTest1 (a) values (cast(NEWID() as varchar(36)));
    insert into varLenTest2 (a) values (cast(NEWID() as varchar(36)));
    set @i = @i - 1;
end
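As a quicker, set-based alternative to the loop (a sketch using the same table names; it draws rows from a cross join of a system catalog, whose row count varies by instance but comfortably exceeds 20,000):

insert into varLenTest1 (a)
select top (20000) cast(NEWID() as varchar(36))
from sys.all_objects o1 cross join sys.all_objects o2;

-- copy the same values so both tables hold identical data
insert into varLenTest2 (a)
select a from varLenTest1;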
3. Execute the following queries with "Include Actual Execution Plan" enabled:
select a from varLenTest1 order by a OPTION (MAXDOP 1) ;
select a from varLenTest2 order by a OPTION (MAXDOP 1) ;
If you inspect the execution plans of these queries, you can see that the estimated I/O cost and estimated CPU cost are very different, even though both tables contain identical data: the wider declared length inflates the estimated row size, and with it the cost and memory grant of the sort.
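To see the effect on memory rather than just on cost estimates, you can also compare the actual memory grants. One way (a sketch; `sys.dm_exec_query_memory_grants` is a standard DMV, but it only shows rows while a query with a grant is executing, so run it from a second session while one of the ORDER BY queries runs, or use a larger row count to make the sort last longer):

select session_id,
       requested_memory_kb,
       granted_memory_kb,
       used_memory_kb
from sys.dm_exec_query_memory_grants;

The query against varLenTest2 requests a noticeably larger grant than the one against varLenTest1, because the grant is sized from the estimated row width, not from the data actually stored.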