Trying to stream video through the following chain: h264/mp4 file on local instance storage (AWS) -> ffmpeg -> RTP -> Janus on the same instance -> WebRTC playback (Chrome/macOS). The resulting video is choppy.
ffmpeg is optimized for writing output in chunks, not for emitting each coded frame as soon as it is ready. The muxer (the rtp muxer, in your case) normally buffers data before flushing it to the output, so out of the box ffmpeg is not well suited to real-time streaming that needs more or less frame-by-frame delivery. WebRTC, however, expects frames to arrive in real time; if frames are sent in bursts, it may discard the "late" ones, hence the choppiness.
However, ffmpeg has an option that sets the muxer's buffering delay to 0, and it works nicely:
-max_delay 0
Also, for WebRTC you want to disable B-frames and repeat the SPS/PPS before every keyframe:
-bf 0 -flags +global_header -bsf:v "dump_extra=freq=keyframe"
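Putting it together, a full command might look like the sketch below. The input filename, the RTP destination (127.0.0.1:8004), and payload type 96 are assumptions; match them to the RTP mountpoint configured in your Janus streaming plugin. Note that -bf 0 only takes effect when re-encoding, so this sketch transcodes with libx264 rather than copying the existing H.264 stream:

```shell
# Hypothetical example: stream a local H.264/MP4 file to a Janus RTP
# mountpoint. Port 8004 and payload type 96 are assumptions; adjust to
# your Janus streaming-plugin configuration.
ffmpeg -re -i input.mp4 \
    -an \
    -c:v libx264 -profile:v baseline \
    -bf 0 -flags +global_header -bsf:v "dump_extra=freq=keyframe" \
    -max_delay 0 \
    -f rtp -payload_type 96 rtp://127.0.0.1:8004
```

-re reads the input at its native frame rate, which you need when streaming from a file rather than a live source; the baseline profile is chosen here because it forbids B-frames anyway and is widely supported by WebRTC clients.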