In the pre-.NET world I always assumed that int is faster than byte, since that is how the processor works.
Now it's become a matter of habit to use int even when a byte would do.
Your pre-.NET assumption was faulty -- there have always been plenty of computer systems around that, while nominally "byte-addressable", have to set a single byte by reading a full word, masking it to alter just that one byte, and writing the whole word back -- slower than simply setting a full word. It depends on the internals of how the processor and memory are connected, not on the programmer-visible architecture.
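To make that read-modify-write sequence concrete, here is a minimal sketch of what a single-byte store can effectively cost on such a word-oriented memory system. The StoreByte helper, the uint[] standing in for memory, and the little-endian byte layout are all illustrative assumptions, not any real memory controller's behavior:

    // Hypothetical sketch: what "store one byte" can amount to when the
    // memory system only moves full 32-bit words.
    static void StoreByte(uint[] memory, int byteIndex, byte value)
    {
        int wordIndex = byteIndex / 4;        // which 32-bit word holds the byte
        int shift     = (byteIndex % 4) * 8;  // byte's position within that word (little-endian)

        uint word = memory[wordIndex];        // 1. read the full word
        word &= ~(0xFFu << shift);            // 2. mask out the old byte
        word |= (uint)value << shift;         // 3. merge in the new byte
        memory[wordIndex] = word;             // 4. write the full word back
    }

Four memory-touching steps instead of one, which is exactly why a "smaller" type is not automatically a faster one.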
Whether in .NET or native code, focus first on using the data types that are semantically correct for your application, not on trying to second-guess the computer system's architecture -- "Premature optimization is the root of all evil in programming", to quote Knuth quoting Hoare.
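And if the choice ever does matter in a hot spot, measure rather than guess. Below is a minimal, self-contained sketch using System.Diagnostics.Stopwatch to compare summing a byte[] against an int[]; the array size and loop shape are arbitrary assumptions, and the results will vary by runtime, JIT, and hardware -- which is rather the point. (Note that C# widens byte to int for arithmetic anyway, so byte mostly buys you storage density, not faster math.)

    using System;
    using System.Diagnostics;

    class ByteVsInt
    {
        const int N = 10_000_000;

        static void Main()
        {
            var bytes = new byte[N];
            var ints  = new int[N];

            // Warm up the JIT so we time steady-state code, not compilation.
            SumBytes(bytes); SumInts(ints);

            var sw = Stopwatch.StartNew();
            long b = SumBytes(bytes);
            sw.Stop();
            Console.WriteLine($"byte[]: {sw.ElapsedMilliseconds} ms (sum {b})");

            sw.Restart();
            long i = SumInts(ints);
            sw.Stop();
            Console.WriteLine($"int[]:  {sw.ElapsedMilliseconds} ms (sum {i})");
        }

        static long SumBytes(byte[] a)
        {
            long sum = 0;
            foreach (byte x in a) sum += x;  // each byte is widened to int before the add
            return sum;
        }

        static long SumInts(int[] a)
        {
            long sum = 0;
            foreach (int x in a) sum += x;
            return sum;
        }
    }

On typical hardware the difference is small either way, and byte[] can even win on large data sets simply because it touches a quarter of the memory -- cache behavior, not operand width, tends to dominate.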