I am currently doing an AndAR project in a group of 3. I'm the person in charge of streaming video to the Android phone. I got us a D-Link DCS-920 IP camera.
MJPEG is a terribly inefficient way to deliver motion video to a mobile device, because each frame is compressed as its own independent picture. For an application which doesn't need video (someone was asking about a camera watching waiting lines last week), your solution of pushing a static frame every second or so sounds good.
If you need motion video, I would recommend transcoding on your web server from MJPEG to a supported video format that uses frame-to-frame compression. This results in far less data to push, both over the user's 3G connection and from your server to all of its clients. You should only need to run one transcoding engine to support all clients, and you'll be able to use the same one for Android and iPhone devices, though you may also want a higher-resolution output for tablets and PCs if your camera output is good enough to justify it.
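As a rough illustration of the server side, you could wrap ffmpeg to pull the camera's MJPEG feed and repackage it as H.264/HLS that mobile clients fetch over HTTP. This is only a minimal sketch, not the authors' setup; the camera URL, output path, and encoder settings are assumptions you would tune for your own network and web root.

```java
import java.io.IOException;

public class MjpegTranscoder {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical camera URL and output path -- adjust for your DCS-920 and server.
        String cameraUrl = "http://192.168.0.20/video.mjpg";
        String output = "/var/www/stream/live.m3u8";

        // One ffmpeg process re-encodes the MJPEG stream into H.264 HLS segments,
        // which any number of Android/iPhone clients can then download over HTTP.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-i", cameraUrl,
                "-c:v", "libx264",   // codec with frame-to-frame (inter-frame) compression
                "-preset", "veryfast",
                "-f", "hls",
                "-hls_time", "2",    // 2-second segments
                output);
        pb.inheritIO();
        Process ffmpeg = pb.start();
        ffmpeg.waitFor();
    }
}
```

The point of this design is that the expensive re-encode happens once per camera, while every client just pulls small, already-compressed segments.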
Instead of an Arduino you could use a Raspberry Pi; it should have enough CPU power to control the vehicle and convert the video stream at the same time. Of course, you'd need to port all of your Arduino software to the Raspberry Pi...
On Android, decoding a JPEG on the CPU takes roughly 40-100 ms per frame. If we want to play MJPEG at 15-30 fps, we need a hardware JPEG decoder.
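To check the per-frame cost on your own device, you can time a plain CPU decode with BitmapFactory. A minimal sketch, assuming you already have one JPEG frame extracted from the MJPEG stream as a byte array:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Log;

public class JpegDecodeBenchmark {
    // Times a single CPU JPEG decode; feed it one frame pulled from the
    // MJPEG stream to see whether 15-30 fps is realistic on your phone.
    public static Bitmap decodeAndTime(byte[] jpegFrame) {
        long start = System.nanoTime();
        Bitmap frame = BitmapFactory.decodeByteArray(jpegFrame, 0, jpegFrame.length);
        long elapsedMs = (System.nanoTime() - start) / 1000000L;
        Log.d("JpegDecodeBenchmark", "CPU decode took " + elapsedMs + " ms");
        return frame;
    }
}
```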
You can use the MjpegView class to display an MJPEG stream directly: https://code.google.com/p/android-camera-axis/source/browse/trunk/serealisation/src/de/mjpegsample/MjpegView/MjpegView.java?r=33
You'll have to wrap the stream setup in an AsyncTask for this class to work properly, since the network connection must not be opened on the UI thread.
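A minimal sketch of that, assuming the MjpegView/MjpegInputStream classes from the linked sample (their setSource(), setDisplayMode(), and static MjpegInputStream.read() methods); the camera URL is a placeholder:

```java
import android.app.Activity;
import android.os.AsyncTask;
import android.os.Bundle;
import de.mjpegsample.MjpegView.MjpegInputStream;
import de.mjpegsample.MjpegView.MjpegView;

public class StreamActivity extends Activity {
    private MjpegView mv;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mv = new MjpegView(this);
        setContentView(mv);
        // Open the HTTP connection off the UI thread to avoid NetworkOnMainThreadException.
        new ConnectTask().execute("http://192.168.0.20/video.mjpg"); // placeholder camera URL
    }

    private class ConnectTask extends AsyncTask<String, Void, MjpegInputStream> {
        @Override
        protected MjpegInputStream doInBackground(String... url) {
            return MjpegInputStream.read(url[0]); // blocks while connecting to the camera
        }

        @Override
        protected void onPostExecute(MjpegInputStream stream) {
            if (stream != null) {
                mv.setSource(stream);                       // hand the stream to the view
                mv.setDisplayMode(MjpegView.SIZE_BEST_FIT); // scale to fit the screen
                mv.showFps(true);
            }
        }
    }
}
```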
Good luck
There was a useful previous SO discussion, and this great one with code. Would you try them and let us know if that works for you?