I have previously programmed a USB webcam, where the sole aim was to get live frames from the camera and display them in a window. I used cvCaptureFromCAM for that purpose, which worked fine for a USB camera (see code below).
I want to know how to capture frames from a Gigabit Ethernet camera. I guess I need to capture frames from some default IP address using some API. Can someone point me in the right direction?
I will be using C++ with OpenCV on Windows 7 on an Intel i3 processor.
#include "cv.h"
#include "highgui.h"
#include <stdio.h>

// A simple camera capture framework
int main() {
    CvCapture* capture = cvCaptureFromCAM( CV_CAP_ANY );
    if ( !capture ) {
        fprintf( stderr, "ERROR: capture is NULL\n" );
        getchar();
        return -1;
    }
    // Create a window in which the captured images will be presented
    cvNamedWindow( "mywindow", CV_WINDOW_AUTOSIZE );
    // Show the image captured from the camera in the window and repeat
    while ( 1 ) {
        // Get one frame
        IplImage* frame = cvQueryFrame( capture );
        if ( !frame ) {
            fprintf( stderr, "ERROR: frame is null...\n" );
            getchar();
            break;
        }
        cvShowImage( "mywindow", frame );
        // Do not release the frame!
        // If the ESC key is pressed (Key = 0x10001B under OpenCV 0.9.7, Linux version),
        // remove the higher bits with the AND operator before comparing
        if ( (cvWaitKey(10) & 255) == 27 ) break;
    }
    // Release the capture device and destroy the window
    cvReleaseCapture( &capture );
    cvDestroyWindow( "mywindow" );
    return 0;
}
Update
So now I am able to display the live images in the vendor-provided software GUI. But I still want to display the images (and possibly video) using the IP address of the camera.
When I know the IP address of the camera, why can't I access the data (images) it sends and display them in a browser? I tried typing the camera's IP address (i.e. 192.169.2.3) into the browser on my PC (192.169.2.4), but it says "page not found". What does that mean?
You can do this by using the GenICam API. GenICam is a generic interface for cameras (USB, GigE, Camera Link, etc.). It consists of multiple modules, but the one you are most concerned with is GenTL (the transport layer). You can read more in the GenTL documentation HERE. To make the process easier, I recommend using either the Basler API or the Baumer API, which are GenTL consumers (producers and consumers are described in the GenTL documentation). I used the Baumer API, but both will work.
NOTE: I'm using a Baumer HXG20 mono camera.
THINGS TO DOWNLOAD AND INSTALL
- Visual Studio Community edition (I used 2015 LINK)
- Baumer GAPI SDK, LINK
- OpenCV (here is a YouTube tutorial on building OpenCV 3 for C++) HERE
TEST CAMERA WITH CAMERA EXPLORER
It's a good idea to check that your network interface card (NIC) and GigE camera are working, and to play around with the camera using the Camera Explorer program. You may need to enable jumbo packets on your NIC. You can also configure the camera's IP address with the IPconfig program. I use the DHCP setting, but you can also assign a static IP to your camera and NIC.
SETUP VISUAL STUDIOS
The steps for setting your system environment variables and configuring Visual Studio are described in the Baumer GAPI SDK programmer's guide (chapter 4), which is located in the following directory:
C:\Program Files\Baumer\Baumer GAPI SDK\Docs\Programmers_Guide
Check that you have the following system variable (if using the 64-bit version), or create it if needed (refer to section 4.3.1 in the programmer's guide).
- name = GENICAM_GENTL64_PATH
- value = C:\Program Files\Baumer\Baumer GAPI SDK\Components\Bin\x64\
In Visual Studio, create a new C++ project and update the following properties (refer to section 4.4.1 in the programmer's guide).
- C/C++ > General > Additional Include Directories =
C:\Program Files\Baumer\Baumer GAPI SDK\Components\Dev\C++\Inc
- Linker > General > Additional Library Directories =
C:\Program Files\Baumer\Baumer GAPI SDK\Components\Dev\C++\Lib\x64
- Linker > Input > Additional Dependencies =
bgapi2_genicam.lib
- Build Events > Post-Build Event > Command Line =
copy "C:\Program Files\Baumer\Baumer GAPI SDK\Components\Bin\x64"\*.* .\
CREATE A .CPP FILE TO DISPLAY IMAGE STREAM IN AN OPENCV WINDOW
The simplest way to get started is to use one of the example programs provided in the Baumer GAPI SDK and modify it to add the OpenCV functionality. The example to use is 005_PixelTransformation, which is located here:
C:\Program Files\Baumer\Baumer GAPI SDK\Components\Examples\C++\src\0_Common\005_PixelTransformation
Copy and paste this .cpp file into your project source directory and make sure you can build and compile. It should capture 8 images and print out the first 6 pixel values from the first 6 lines for each image.
Add these #include statements to the .cpp source file:
#include <opencv2\core\core.hpp>
#include <opencv2\highgui\highgui.hpp>
#include <opencv2\video\video.hpp>
Add these variable declarations at the beginning of the main() function. (Note: the declarations below only open the VideoWriter; to actually save video, call cvVideoCreator.write(openCvImage) on each iteration of the grab loop.)
// OPENCV VARIABLE DECLARATIONS
cv::VideoWriter cvVideoCreator; // Create OpenCV video creator
cv::Mat openCvImage; // create an OpenCV image
cv::String videoFileName = "openCvVideo.avi"; // Define video filename
cv::Size frameSize = cv::Size(2048, 1088); // Define video frame size (frame width x height)
cvVideoCreator.open(videoFileName, CV_FOURCC('D', 'I', 'V', 'X'), 20, frameSize, true); // set the codec type and frame rate
In the original 005_PixelTransformation.cpp file, line 569 has a for loop that iterates over 8 images: for(int i = 0; i < 8; i++). We want this to run continuously, so change it to a while loop:
while (pDataStream->GetIsGrabbing())
Within our new while loop there's an if statement that checks whether the pixel format is 'Mono' (greyscale) or color. In the original file it starts at line 619 and ends at line 692. Directly after the closing braces of the if and else statements, and before the pImage->Release(); statement, we need to add the OpenCV portion that displays the images in a window. Add the following lines of code:
} // This is the closing brace of the 'else' (color) branch

// OPENCV STUFF
openCvImage = cv::Mat(pTransformImage->GetHeight(), pTransformImage->GetWidth(), CV_8U, (int*)pTransformImage->GetBuffer());

// create the OpenCV window (use the exact same name in imshow below)
cv::namedWindow("OpenCV window: Cam", CV_WINDOW_NORMAL);

// display the current image in the window
cv::imshow("OpenCV window: Cam", openCvImage);
cv::waitKey(1);
One thing to note is the pixel format in the openCvImage object. My camera is mono 8-bit, so I need to specify CV_8U. If your camera delivers RGB or 10-bit pixels, you need to provide the correct format (see the OpenCV documentation HERE).
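As a sketch of that mapping (the helper name and the set of format strings handled here are my own choices, not part of the Baumer SDK), the GenICam pixel-format string can be translated into a bytes-per-pixel count before choosing the cv::Mat type:

```cpp
#include <stdexcept>
#include <string>

// Hypothetical helper: map a GenICam pixel-format name to the number of
// bytes each pixel occupies in the transport buffer. Mono10/Mono12 are
// normally delivered in 16-bit words, so they take 2 bytes per pixel
// (use CV_16UC1 for those, CV_8UC1 for Mono8, CV_8UC3 for 8-bit color).
int bytesPerPixel(const std::string& pixelFormat)
{
    if (pixelFormat == "Mono8")  return 1; // -> CV_8UC1
    if (pixelFormat == "Mono10") return 2; // -> CV_16UC1
    if (pixelFormat == "Mono12") return 2; // -> CV_16UC1
    if (pixelFormat == "BGR8")   return 3; // -> CV_8UC3
    if (pixelFormat == "RGB8")   return 3; // -> CV_8UC3
    throw std::runtime_error("unhandled pixel format: " + pixelFormat);
}
```

Passing the wrong type silently misinterprets the buffer (e.g. a Mono10 frame read as CV_8U shows as interleaved garbage), so it is worth checking GetPixelFormat() before constructing the Mat.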
You can refer to the other examples for adjusting camera parameters.
Now, once you build and compile, you should have an OpenCV window open that displays the camera images!
You will not be able to access images on the camera if it doesn't have a web server running (check its documentation). Try typing this at a command prompt:
telnet 192.169.2.3 80
If telnet times out, your camera is not running a server on the default port 80.
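If telnet isn't available (it is disabled by default on recent Windows versions), the same reachability check can be sketched in a few lines of C++ with plain sockets. This is a POSIX sketch; on Windows you would use the Winsock equivalents after WSAStartup, and the host/port are whatever your camera uses:

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstring>

// Returns true if a TCP connection to host:port succeeds,
// i.e. something (e.g. a web server) is listening there.
bool isPortOpen(const char* host, int port)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) return false;

    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(static_cast<unsigned short>(port));
    if (inet_pton(AF_INET, host, &addr.sin_addr) != 1) { close(fd); return false; }

    bool open = (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0);
    close(fd);
    return open;
}
```

Note that most GigE Vision cameras stream over their own UDP-based protocol rather than HTTP, so a closed port 80 does not mean the camera is broken, only that it has no web interface.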
Also see this question: C++ code Capturing image from IP / Ethernet Cameras (AXIS Cam)
To add to mark jay's answer (which I can confirm works on Win7 x64 inside a Win32 program, using Baumer GAPI2 2.8.1, the VC10 compiler and a Baumer TXG06 camera): if the camera is set up to grab Mono8 and you intend to grab an image of the same format (CV_8UC1), then in the 005_PixelTransformation.cpp example you can avoid creating BGAPI2::Image* pTransformImage and BGAPI2::Image* pImage altogether, and just build the cv::Mat from the buffer memory pointer, as in the following excerpt from my GigE_cam class:
bool GigE_cam::operator>>(cv::Mat& out_mat)
{
    bool success(false);
    try
    {
        _p_buffer_filled = _p_data_stream->GetFilledBuffer(static_cast<bo_uint64>(_timeout_ms));
        if (_p_buffer_filled != 0)
        {
            if (_p_buffer_filled->GetIsIncomplete())
            {
                _p_buffer_filled->QueueBuffer();
            }
            else
            {
                if (_p_buffer_filled->GetPixelFormat() == "Mono8")
                {
                    // Wrap the driver buffer without copying, then deep-copy to the caller
                    _image_out_buffer = cv::Mat(static_cast<int>(_p_buffer_filled->GetHeight()),
                                                static_cast<int>(_p_buffer_filled->GetWidth()),
                                                CV_8UC1,
                                                static_cast<uchar*>(_p_buffer_filled->GetMemPtr()));
                    if (_image_out_buffer.data)
                    {
                        _image_out_buffer.copyTo(out_mat);
                        success = true;
                    }
                }
                else if (_p_buffer_filled->GetPixelFormat() == "Mono10")
                {
                    // TODO: transform to BGR8 etc. - not implemented
                }
                _p_buffer_filled->QueueBuffer(); // Queue buffer after use
            }
        }
    }
    catch (BGAPI2::Exceptions::IException& ex)
    {
        _last_BGAPI2_error_str = ex.GetType();
    }
    return success;
}
This code gets full frames from the camera (776 × 582 px) at 66.5 fps, where even the datasheet only claims 64.0 fps. I am curious to see whether their API behaves the same on Debian.
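For what it's worth, a frame-rate figure like that can be measured with a few lines of std::chrono around the grab call. The harness below is my own sketch (the grab callback stands in for something like cam >> mat; nothing here is SDK code):

```cpp
#include <chrono>
#include <functional>

// Hypothetical helper: invoke `grab` n times and return the achieved
// frames per second, measured with a steady (monotonic) clock.
double measureFps(const std::function<bool()>& grab, int n)
{
    using clock = std::chrono::steady_clock;
    const auto t0 = clock::now();
    int ok = 0;
    for (int i = 0; i < n; ++i)
        if (grab())           // e.g. [&]{ return cam >> frame; }
            ++ok;
    const std::chrono::duration<double> elapsed = clock::now() - t0;
    return elapsed.count() > 0.0 ? ok / elapsed.count() : 0.0;
}
```

Counting only successful grabs matters here, because incomplete buffers (which the operator>> above re-queues and reports as failure) would otherwise inflate the figure.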
Source: https://stackoverflow.com/questions/11009452/opencv-how-to-capture-frames-from-an-ethernet-camera