5 votes

I can see my webcam stream with OpenCV's imshow using this simple code:

#include <opencv2/opencv.hpp>

using namespace cv;

int main(int, char**)
{
    VideoCapture cap(0);            // open the default camera
    Mat edges;
    namedWindow("webcam", 1);
    while (true)
    {
        Mat frame;
        cap >> frame;               // grab a new frame from the camera
        imshow("webcam", frame);
        if (waitKey(30) >= 0) break;
    }
    return 0;
}

Now what I want is to show the OpenCV image as a QImage inside a widget in Qt. Here is a conversion from cv::Mat to QImage:

QImage Mat2QImage(cv::Mat const& src)
{
    cv::Mat temp;
    cv::cvtColor(src, temp, cv::COLOR_BGR2RGB);   // CV_BGR2RGB on older OpenCV versions
    QImage dest((const uchar *)temp.data, temp.cols, temp.rows, temp.step, QImage::Format_RGB888);
    dest.bits();   // enforce a deep copy, see the documentation
                   // of QImage::QImage(const uchar *data, int width, int height, Format format)
    return dest;
}
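
As an aside, this conversion assumes an 8-bit, 3-channel BGR frame, which is what VideoCapture normally delivers. A small guard (the helper name Mat2QImageSafe below is made up for illustration) avoids surprises when the camera returns an empty or differently typed Mat:

    // Sketch only: wrap Mat2QImage so unexpected frames become a null QImage
    // instead of a misinterpreted one. Assumes Mat2QImage from above.
    QImage Mat2QImageSafe(cv::Mat const& src)
    {
        if (src.empty() || src.type() != CV_8UC3)
            return QImage();        // not an 8-bit BGR frame, refuse to convert
        return Mat2QImage(src);
    }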

And here is a small program that shows an image with QImage in Qt:

#include <QApplication>
#include <QImage>
#include <QLabel>
#include <QPixmap>

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);

    QImage myImage;
    myImage.load("a.png");

    QLabel myLabel;
    myLabel.setPixmap(QPixmap::fromImage(myImage));
    myLabel.show();

    return a.exec();
}

I tried to combine them in this way, but no luck:

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    VideoCapture cap(0);

    QImage myImage;
    QLabel myLabel;
    while (true)
    {
        Mat frame;
        cap >> frame; // get a new frame from camera

        myImage = Mat2QImage(frame);
        myLabel.setPixmap(QPixmap::fromImage(myImage));
    }


    myLabel.show();

    return a.exec();
}
This is not how it works. You enter an infinite loop: how could you see anything, since myLabel.show() is after the loop? – Boiethios
Streaming from a device, along with any processing of the image data, should be done in a separate thread. If you do what you're doing, sooner or later you will have to rewrite your code. Check a couple of videos I've made on integrating OpenCV with Qt: youtube.com/… Note that you can also use a custom QThread; you don't have to stick to the worker pattern I used in the tutorial. – rbaleksandar
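
To illustrate the comment above, here is a rough sketch of the worker approach, reusing Mat2QImage from the question; the class and signal names (CaptureWorker, frameReady) are invented for the example, and stop handling is deliberately simplified:

    // Sketch only: grab frames on a worker thread and hand them to the GUI
    // thread as QImage via a signal.
    #include <QObject>
    #include <QImage>
    #include <opencv2/opencv.hpp>

    class CaptureWorker : public QObject
    {
        Q_OBJECT
    public slots:
        void run()
        {
            cv::VideoCapture cap(0);
            cv::Mat frame;
            while (cap.isOpened())
            {
                cap >> frame;
                if (!frame.empty())
                    emit frameReady(Mat2QImage(frame));  // the QImage owns its data, safe to cross threads
            }
        }
    signals:
        void frameReady(const QImage &img);
    };

    // In the GUI code, roughly:
    //   QThread *thread = new QThread;
    //   CaptureWorker *worker = new CaptureWorker;
    //   worker->moveToThread(thread);
    //   connect(thread, SIGNAL(started()), worker, SLOT(run()));
    //   connect(worker, SIGNAL(frameReady(QImage)), someWindow, SLOT(onFrame(QImage)));
    //   thread->start();
    // where onFrame() does label->setPixmap(QPixmap::fromImage(img)) on the GUI thread.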

2 Answers

2 votes

You have to create a window class that inherits from QMainWindow and owns a QTimer. In the constructor, connect the timer to a method of the window. Put your OpenCV code into this timeout slot, which will be called every X milliseconds:

class Window : public QMainWindow
{
    Q_OBJECT

    QTimer _timer;

private slots:
    void on_timeout()
    {
        // put your OpenCV code in it
    }

public:
    Window() :
        QMainWindow(), _timer(this)
    {
        connect(&_timer, SIGNAL(timeout()), this, SLOT(on_timeout()));
        // populate your window with images, labels, etc. here
        _timer.start(10 /*call the timer every 10 ms*/);
    }
};

Then show your Window in main:

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    Window win;
    win.show();
    return a.exec();
}

If you use Qt Creator, it is simpler to develop with Qt: think about it.
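
For reference, here is one possible way to fill in the on_timeout() slot, assuming the window owns the capture device and a label (the _cap and _label members are additions for this sketch, not part of the code above) and reusing Mat2QImage from the question:

    // Sketch only: one frame per timer tick, no blocking loop.
    #include <QLabel>
    #include <QMainWindow>
    #include <QPixmap>
    #include <QTimer>
    #include <opencv2/opencv.hpp>

    class Window : public QMainWindow
    {
        Q_OBJECT

        QTimer _timer;
        cv::VideoCapture _cap;   // opened once, reused on every tick
        QLabel *_label;

    private slots:
        void on_timeout()
        {
            cv::Mat frame;
            _cap >> frame;       // grab exactly one frame, then return to the event loop
            if (!frame.empty())
                _label->setPixmap(QPixmap::fromImage(Mat2QImage(frame)));
        }

    public:
        Window() :
            QMainWindow(), _timer(this), _cap(0), _label(new QLabel(this))
        {
            setCentralWidget(_label);
            connect(&_timer, SIGNAL(timeout()), this, SLOT(on_timeout()));
            _timer.start(30);    // one tick every 30 ms, roughly 30 fps
        }
    };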

-1 votes

Thank you @Boiethios for your response. This is the final code; I put it in mainwindow.cpp:

MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
}

class Window : public QMainWindow
{
    Q_OBJECT

    QTimer _timer;

private slots:
    void on_timeout()
    {
        VideoCapture cap(0);

        Mat edges;
        namedWindow("edges", 1);
        while (true)
        {
            Mat frame;
            cap >> frame;
            myImage = Mat2QImage(frame);
            myLabel.setPixmap(QPixmap::fromImage(myImage));
            myLabel.show();
        }
    }

public:
    QImage myImage;
    QLabel myLabel;

    Window() :
        QMainWindow(), _timer(this)
    {
        connect(&_timer, SIGNAL(timeout()), this, SLOT(on_timeout()));
        // populate your window with images, labels, etc. here
        _timer.start(10 /*call the timer every 10 ms*/);
    }
};

It compiles and executes fine, but nothing happens, just a blank window.