I have come to understand that bit banging is horrible practice when it comes to SPI/I2C over GPIO. Why so?
Bit-banging carries a software overhead, consuming CPU cycles that could otherwise be used for other purposes. This may have a noticeable effect on system responsiveness to other events, and in a hard real-time system it may significantly impact the system's ability to meet real-time deadlines.
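To illustrate the per-bit software cost, here is a rough sketch of a bit-banged SPI mode-0 master transfer of one byte. `gpio_write()`, `gpio_read()` and `delay_half_period()` are placeholder helpers rather than any particular vendor API; the point is that the CPU does all the clocking and shifting itself, busy-waiting between edges.

```c
#include <stdint.h>

#define PIN_SCK  0
#define PIN_MOSI 1
#define PIN_MISO 2

/* Hypothetical platform helpers, assumed for the sketch. */
extern void gpio_write(int pin, int level);
extern int  gpio_read(int pin);
extern void delay_half_period(void);   /* busy-wait: CPU does no useful work here */

uint8_t spi_bitbang_transfer(uint8_t out)
{
    uint8_t in = 0;

    for (int bit = 7; bit >= 0; bit--) {
        gpio_write(PIN_MOSI, (out >> bit) & 1);   /* present next data bit */
        delay_half_period();                      /* hold data before the clock edge */
        gpio_write(PIN_SCK, 1);                   /* rising edge: slave samples MOSI */
        in = (uint8_t)((in << 1) | gpio_read(PIN_MISO));  /* sample MISO */
        delay_half_period();
        gpio_write(PIN_SCK, 0);                   /* falling edge, ready for next bit */
    }
    return in;
}
```

Every byte costs sixteen clock-edge toggles plus two busy-wait delays per bit, all executed by the CPU, which is exactly the overhead a hardware peripheral would absorb for free.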
If the bit-banged interface is not to have a detrimental effect on real-time performance, then it must be given a low priority, and it will then itself be non-deterministic in terms of data throughput and latency.
The most CPU-efficient transfer is achieved by using a hardware interface with DMA to minimise the software overhead. Bit-banging is at the opposite extreme of that.
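For contrast, a hardware-plus-DMA transfer looks something like the sketch below. `hal_spi_dma_start()` and `hal_spi_dma_done()` are hypothetical HAL calls, not from any specific vendor library; the point is that the CPU's involvement shrinks to starting the transfer and noticing when it has finished.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical HAL functions, assumed for the sketch. */
extern void hal_spi_dma_start(const uint8_t *tx, uint8_t *rx, size_t len);
extern int  hal_spi_dma_done(void);

void send_block(const uint8_t *tx, uint8_t *rx, size_t len)
{
    hal_spi_dma_start(tx, rx, len);   /* hardware shifts the bits, DMA moves the data */

    while (!hal_spi_dma_done()) {
        /* The CPU is free to do other work, run other tasks, or sleep here,
           instead of toggling pins and busy-waiting for every single bit. */
    }
}
```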
I would not say it is horrible. If your application can still meet its responsiveness and real-time constraints, and bit-banging reduces the cost of the part needed or allows you to use existing hardware, for example, then it may be entirely justified.