Last Updated: April 19, 2018 · mani

How to extract images from a rosbag file and convert them to video

UPDATE (June 2014): the bag_tools ROS package includes a node called make_video.py that converts the images inside a bag file into a video, similar to the manual workflow described in this post.

Prerequisites

The methods described below have been tested with ROS Fuerte on a 64-bit Ubuntu 12.04 machine; however, they should also work on ROS Electric and other Linux distributions.

  • A rosbag file with an image topic inside it
  • MJPEG Tools [Ubuntu: sudo apt-get install mjpegtools]
  • ffmpeg [Ubuntu: sudo apt-get install ffmpeg], or
  • mencoder [Ubuntu: sudo apt-get install mencoder] (preferred)

Extract the images

  1. Create a folder for the extracted images and cd into it.
  2. Execute

rosrun image_view extract_images _sec_per_frame:=0.01 image:=<IMAGETOPICINBAGFILE>

  3. In another terminal window, run rosbag play <BAGFILE>
  4. A sequence of images will be created in the current folder (a full example session is shown below).
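For example, a complete extraction session might look like the following. The topic /camera/image_raw and the file mybag.bag are hypothetical placeholders; substitute the topic and bag names reported by rosbag info.

# Terminal 1: create a folder for the frames and start the extraction node
# (this assumes a roscore is already running)
mkdir extracted_frames && cd extracted_frames
rosrun image_view extract_images _sec_per_frame:=0.01 image:=/camera/image_raw

# Terminal 2: play back the bag so the extraction node receives the images
rosbag play mybag.bag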

You can check whether the number of extracted frames matches the number of messages in the .bag file using the rosbag info command, as shown in the example below. If fewer frames were created, decrease the _sec_per_frame value.

Note: The images are named using the printf-style pattern frame%04d.jpg
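For instance, assuming the bag is named mybag.bag and the current folder contains the extracted frames, the two counts can be compared like this:

rosbag info mybag.bag   # note the message count of the image topic
ls frame*.jpg | wc -l   # number of frames that were actually extracted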

Making the video file (Mencoder)

  1. Determine the fps of the rosbag file. Use rosbag info, then divide the number of messages on the image topic by the duration (in seconds). This step is very important.

  2. Run the following mencoder command in the folder where you stored the images:

mencoder -nosound mf://*.jpg -mf w=<WIDTH>:h=<HEIGHT>:type=jpg:fps=<FPS> \
  -ovc lavc \
  -lavcopts vcodec=mpeg4:vbitrate=<BITRATE>:mbd=2:keyint=132:v4mv:vqmin=3:lumi_mask=0.07:dark_mask=0.2:mpeg_quant:scplx_mask=0.1:tcplx_mask=0.1:naq \
  -o <OUTPUT>.avi

For <FPS> use the number obtained in step 1, and for <BITRATE> see the Notes section below; a filled-in example follows.
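As a worked example, suppose rosbag info reports 900 messages on the image topic and a duration of 30 seconds: 900 / 30 = 30 fps. Assuming hypothetical 640x480 frames, a bitrate of 1800 kbit/s, and out.avi as the output name, the command becomes:

mencoder -nosound mf://*.jpg -mf w=640:h=480:type=jpg:fps=30 \
  -ovc lavc \
  -lavcopts vcodec=mpeg4:vbitrate=1800:mbd=2:keyint=132:v4mv:vqmin=3:lumi_mask=0.07:dark_mask=0.2:mpeg_quant:scplx_mask=0.1:tcplx_mask=0.1:naq \
  -o out.avi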

Making the video file (FFMPEG)

  1. Determine the FPS of the rosbag file. Use rosbag info, then divide the number of messages on the image topic by the duration (in seconds). This step is very important.

  2. Run the following ffmpeg command in the folder where you stored the images:

ffmpeg -r <FPS> -b <BITRATE> -i frame%04d.jpg <OUTPUT>.avi

For <FPS> use the number obtained in step 1, and for <BITRATE> see the Notes section below; a filled-in example follows.
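Continuing the same hypothetical example (30 fps, out.avi as the output name) and taking the 300 KB/s figure from the Notes below as the bitrate value, the command would look like this:

ffmpeg -r 30 -b 300 -i frame%04d.jpg out.avi
# note: recent ffmpeg releases spell the bitrate option -b:v and expect a unit
# suffix (e.g. -b:v 1800k), so adjust accordingly if the command above is rejected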

Notes

  • BITRATE determines the quality. For ffmpeg the value is in kilobytes per second, while for mencoder it is in kilobits per second (e.g. 300 KB/s vs. 2400 kbit/s). Values above 1800 kbit/s usually work fine.

  • The resulting video will be an MPEG-4 encoded AVI file.


5 Responses

It's mjpegtools, not mjpeg-tools.

over 1 year ago

@ecuzzillo Thanks, edited.

over 1 year ago

I am using ROS Hydro on Ubuntu 12.04 and I'm getting an error when running "ffmpeg -r <FPS> -b <BITRATE> -i frame%04d.jpg <OUTPUT>.avi". The thing is, I am not seeing the sequence of images. I ran rosbag info <BAGFILE> to obtain the topic in the bag file, but when I run "rosrun rqt_graph rqt_graph" the /extract_images node is there all by itself. It's getting messy. What am I doing wrong?

over 1 year ago

@leyonce Does the extract_images step work? I will test the method on Hydro/12.04 soon and will update this guide if needed.

over 1 year ago

Hi,
I am getting the error below, and I am using ROS Kinetic.
terminate called after throwing an instance of 'image_transport::TransportLoadException'
  what():  Unable to load plugin for transport '/home/user/Videos/Recorded_video/image2.bag', error string:
According to the loaded plugin descriptions the class image_transport//home/user/Videos/Recorded_video/image2.bag_sub with base class type image_transport::SubscriberPlugin does not exist. Declared types are image_transport/compressedDepth_sub image_transport/compressed_sub image_transport/raw_sub image_transport/theora_sub
Aborted (core dumped)

over 1 year ago