Yes, Redis is a good fit for that. Broadly, there are two approaches to caching. Depending on whether (and which) framework you use, the first option may be available out of the box or through a plug-in:
- Cache database queries, that is - selected queries and their results are kept in Redis for quicker access, either for a given time or until the cache is cleared (useful after updating the database). In this case you can use MySQL's built-in query caching, which is simpler than adding a separate key-value store, or you can override the default database integration with your own cache-aware class (for example http://pythonhosted.org/johnny-cache/).
- Custom caching, that is creating your own structures to keep in the cache and refilling them periodically, or on demand, with data fetched from the database. This is more flexible and potentially more powerful, because you can use built-in Redis features such as lists or sorted sets, which keep the update overhead small. It requires a bit more coding, but usually gives better results, since it is tailored to your application. A good example is keeping the top articles as a Redis list of ids, and storing each serialized article in Redis under its id as well. You can keep that article denormalized - i.e. the serialized object can contain the user name as well as the user id - so that the overhead of additional queries stays minimal.
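To make the first approach concrete, here is a minimal sketch of caching query results in Redis with a TTL. The function name `cached_query`, the key scheme, and the `run_query` callback are my own illustration, not part of any framework; `cache` is anything with redis-py style `get`/`setex` (e.g. `redis.Redis()`), and the `FakeCache` stub below only exists so the demo runs without a server:

```python
import hashlib
import json

def cached_query(cache, run_query, sql, params=(), ttl=300):
    """Return rows for (sql, params), hitting the database only on a cache miss.

    `cache` is any client exposing redis-py style get/setex.
    """
    # derive a stable key from the query text and parameters
    key = "q:" + hashlib.sha1(repr((sql, params)).encode()).hexdigest()
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)            # served from cache
    rows = run_query(sql, params)         # actually query the database
    cache.setex(key, ttl, json.dumps(rows))  # expires automatically after ttl
    return rows

class FakeCache:
    """Minimal dict-backed stand-in for redis.Redis, for demo purposes only."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def setex(self, key, ttl, value):
        self.store[key] = value           # real Redis would also set the TTL

# demo: the second identical call never reaches the "database"
calls = []
def run_query(sql, params):
    calls.append(sql)
    return [{"id": 1, "title": "hello"}]

cache = FakeCache()
first = cached_query(cache, run_query, "SELECT * FROM articles", ttl=60)
second = cached_query(cache, run_query, "SELECT * FROM articles", ttl=60)
```

With a real client you would just pass `redis.Redis()` as `cache`; `setex` then handles the expiry for you, which is what gives you the "for a given time" behaviour with no extra bookkeeping.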
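The second approach, as in the top-articles example above, might look like the sketch below: a Redis list holds the hottest article ids, and each article is stored serialized and denormalized (user name embedded) under its own key. Function names and the key scheme are illustrative; the calls mirror redis-py (`set`, `get`, `lpush`, `ltrim`, `lrange`), and the `FakeRedis` stub is only there so the demo is runnable without a server:

```python
import json

TOP_KEY = "articles:top"   # Redis list of article ids, newest/hottest first

def cache_article(r, article):
    """Store a denormalized article and push its id onto the top list."""
    # denormalized: user_name is embedded, so reading it needs no extra query
    r.set("article:%d" % article["id"], json.dumps(article))
    r.lpush(TOP_KEY, article["id"])
    r.ltrim(TOP_KEY, 0, 99)   # cap the list at the 100 hottest ids

def top_articles(r, n=10):
    """Fetch the n top articles straight from the cache, no database access."""
    ids = r.lrange(TOP_KEY, 0, n - 1)
    # int() also handles the bytes that a real redis client returns
    return [json.loads(r.get("article:%d" % int(i))) for i in ids]

class FakeRedis:
    """Minimal stand-in for redis.Redis covering only the calls used above."""
    def __init__(self):
        self.kv, self.lists = {}, {}
    def set(self, key, value):
        self.kv[key] = value
    def get(self, key):
        return self.kv.get(key)
    def lpush(self, key, value):
        self.lists.setdefault(key, []).insert(0, str(value))
    def ltrim(self, key, start, stop):
        self.lists[key] = self.lists[key][start:stop + 1]
    def lrange(self, key, start, stop):
        return self.lists.get(key, [])[start:stop + 1]

# demo
r = FakeRedis()
cache_article(r, {"id": 1, "title": "First", "user_id": 7, "user_name": "alice"})
cache_article(r, {"id": 2, "title": "Second", "user_id": 8, "user_name": "bob"})
top = top_articles(r, n=2)   # newest first: ids 2, then 1
```

If ordering should follow a score (views, votes) rather than insertion time, a sorted set (`zadd`/`zrevrange`) is the natural swap-in for the list; the read side stays the same.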
Which approach to take is up to you; I personally almost always go with the second. But, of course, everything depends on how much time you have and what the application is supposed to do - you might as well start with MySQL query caching and, if the results are not good enough, move to Redis and custom caching.