Question
I'm building a C++/Qt 5.1 app that uses QProcess to launch another program and then wait for the result. Every time I run this code, valgrind reports that memory is lost on the second line (the start() call).
QProcess command(this);
command.start(commandpath, myParameters);
if (command.waitForStarted(waitToStart)) {
    command.write(myStdIn.toLatin1());
    command.closeWriteChannel();
    if (command.waitForFinished(waitToFinish)) {
        myStdOut = command.readAllStandardOutput();
        myStdErr = command.readAllStandardError();
    }
}
command.deleteLater();
I added the deleteLater() line but it doesn't help. (Note that the memory loss only occurs if the 'commandpath' program doesn't run successfully - for example, when I try to run a non-existent program.)
Can someone explain why, and how to resolve this memory loss?
Here's some valgrind output if that helps:
16 bytes in 1 blocks are definitely lost in loss record 57 of 678
in RunProcessWorker::run(RunProcessWorker::EMutex, QString, QString, QString, bool, QString, QStringList, QStringList, QString, QString&, QString&, unsigned int, unsigned int, unsigned long long&, RunProcessWorker::EResultCodes&, QProcess::ProcessError&, int&) in /mnt/lserver2/data/development/haast/src/systemcommands/runprocessworker.cpp:249
1: operator new[](unsigned long) in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so
2: /opt/Qt5.1.0/5.1.0/gcc_64/lib/libQt5Core.so.5.1.0
3: QProcess::open(QFlags<QIODevice::OpenModeFlag>) in /opt/Qt5.1.0/5.1.0/gcc_64/lib/libQt5Core.so.5.1.0
4: QProcess::start(QString const&, QStringList const&, QFlags<QIODevice::OpenModeFlag>) in /opt/Qt5.1.0/5.1.0/gcc_64/lib/libQt5Core.so.5.1.0
5: RunProcessWorker::run(RunProcessWorker::EMutex, QString, QString, QString, bool, QString, QStringList, QStringList, QString, QString&, QString&, unsigned int, unsigned int, unsigned long long&, RunProcessWorker::EResultCodes&, QProcess::ProcessError&, int&) in /mnt/lserver2/data/development/haast/src/systemcommands/runprocessworker.cpp:249
Answer 1:
Not all findings reported by valgrind are "real" memory leaks, or leaks you should care about. As long as the memory "leak" comes from a library and does not grow even when you repeat the failing operation many times, you don't need to worry about it.
Even though it is considered bad practice in applications, libraries may allocate things on the heap that are never freed. The library could register an exit handler to release them, but that would slow down program exit for no real gain, since the OS reclaims those resources in one big chunk anyway.
For this reason, valgrind supports suppressing errors. The easiest way to do this with Qt is to run valgrind from within Qt Creator, which ships with a suitable suppression file for the Qt libraries by default.
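If you run valgrind from the command line instead, you can generate and apply your own suppression file. A minimal sketch, assuming the leak stack shown above (the file name qt_qprocess.supp and the rule name are made up for illustration; the exact entry is best produced by valgrind itself with --gen-suppressions):

# Let valgrind print ready-to-use suppression entries for each finding:
valgrind --leak-check=full --gen-suppressions=all ./myapp

# Paste the generated entry into a file, e.g. qt_qprocess.supp:
{
   qt_qprocess_start_oneoff
   Memcheck:Leak
   fun:_Znam
   obj:*/libQt5Core.so.5.1.0
   ...
}

# Re-run with the suppression applied:
valgrind --leak-check=full --suppressions=qt_qprocess.supp ./myapp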
If you are worried that this is actually a Qt bug, write code that performs the leaky operation in a loop a million times. If the size of the leak increases with the iteration count, it is a real leak, and you should probably file a bug report with code that reproduces it. Even if it is a one-time leak in an uncommon code path, having it fixed would still be good, instead of leaving a useless allocation cluttering the heap.
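A minimal sketch of such a stress test, assuming a standalone program (the file name leakloop.cpp, the iteration count, and the bogus program path are arbitrary choices, not from the question): run it under valgrind with different iteration counts and check whether the "definitely lost" total grows.

// leakloop.cpp -- standalone stress test, not the asker's RunProcessWorker code.
// Build against Qt Core; a real leak makes valgrind's "definitely lost" total
// scale with the number of iterations.
#include <QCoreApplication>
#include <QProcess>
#include <QStringList>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    const int iterations = (argc > 1) ? QString(argv[1]).toInt() : 10000;

    for (int i = 0; i < iterations; ++i) {
        QProcess command;                                   // stack object, destroyed each pass
        command.start("/no/such/program", QStringList());   // deliberately fails to start
        command.waitForStarted(100);                        // returns false quickly
        command.waitForFinished(100);
    }
    return 0;
}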
Source: https://stackoverflow.com/questions/25540603/qprocess-causes-memory-leak