Question
I have an OS X 10.6 Mac I'm using as my dev machine. The program I wrote works perfectly on the dev machine. However, when I tried to run it on an OS X 10.5 (not sure if that's relevant) test machine, it crashes on launch.
This is the error I'm getting:
Process: MyApp[25908]
Path: /Applications/MyApp.app/Contents/MacOS/MyApp
Identifier: MyApp
Version: ??? (???)
Code Type: X86 (Native)
Parent Process: launchd [109]
Interval Since Last Report: 17392106 sec
Crashes Since Last Report: 735
Per-App Interval Since Last Report: 0 sec
Per-App Crashes Since Last Report: 8
Date/Time: 2010-08-14 07:50:09.768 -0700
OS Version: Mac OS X 10.5.8 (9L31a)
Report Version: 6
Anonymous UUID: 1BF30470-ACF2-46C7-B6D5-4514380965C8
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000002, 0x0000000000000000
Crashed Thread: 0
Dyld Error Message:
Symbol not found: __ZSt16__ostream_insertIcSt11char_traitsIcEERSt13basic_ostreamIT_T0_ES6_PKS3_i
Referenced from: /Applications/MyApp.app/Contents/MacOS/MyApp
Expected in: /usr/lib/libstdc++.6.dylib
So it looks like it's crashing because it's loading an incompatible version of the dynamic library libstdc++.6. Is this sort of thing common? A Google search doesn't turn up many other programs with this problem. What should I be doing when I compile to prevent this from happening? Do I need to somehow include libstdc++ inside my application bundle?
Answer 1:
The solution to this problem is to add the following code to one of your source files:
// Workarounds for symbols that are missing from Leopard's libstdc++.dylib.
// <istream> and <ostream> provide the declarations (and the _GLIBCXX_* config
// macros) that these explicit instantiations rely on.
#include <istream>
#include <ostream>

_GLIBCXX_BEGIN_NAMESPACE(std)
// From ostream_insert.h
template ostream& __ostream_insert(ostream&, const char*, streamsize);
#ifdef _GLIBCXX_USE_WCHAR_T
template wostream& __ostream_insert(wostream&, const wchar_t*, streamsize);
#endif
// From ostream.tcc
template ostream& ostream::_M_insert(long);
template ostream& ostream::_M_insert(unsigned long);
template ostream& ostream::_M_insert(bool);
#ifdef _GLIBCXX_USE_LONG_LONG
template ostream& ostream::_M_insert(long long);
template ostream& ostream::_M_insert(unsigned long long);
#endif
template ostream& ostream::_M_insert(double);
template ostream& ostream::_M_insert(long double);
template ostream& ostream::_M_insert(const void*);
#ifdef _GLIBCXX_USE_WCHAR_T
template wostream& wostream::_M_insert(long);
template wostream& wostream::_M_insert(unsigned long);
template wostream& wostream::_M_insert(bool);
#ifdef _GLIBCXX_USE_LONG_LONG
template wostream& wostream::_M_insert(long long);
template wostream& wostream::_M_insert(unsigned long long);
#endif
template wostream& wostream::_M_insert(double);
template wostream& wostream::_M_insert(long double);
template wostream& wostream::_M_insert(const void*);
#endif
// From istream.tcc
template istream& istream::_M_extract(unsigned short&);
template istream& istream::_M_extract(unsigned int&);
template istream& istream::_M_extract(long&);
template istream& istream::_M_extract(unsigned long&);
template istream& istream::_M_extract(bool&);
#ifdef _GLIBCXX_USE_LONG_LONG
template istream& istream::_M_extract(long long&);
template istream& istream::_M_extract(unsigned long long&);
#endif
template istream& istream::_M_extract(float&);
template istream& istream::_M_extract(double&);
template istream& istream::_M_extract(long double&);
template istream& istream::_M_extract(void*&);
#ifdef _GLIBCXX_USE_WCHAR_T
template wistream& wistream::_M_extract(unsigned short&);
template wistream& wistream::_M_extract(unsigned int&);
template wistream& wistream::_M_extract(long&);
template wistream& wistream::_M_extract(unsigned long&);
template wistream& wistream::_M_extract(bool&);
#ifdef _GLIBCXX_USE_LONG_LONG
template wistream& wistream::_M_extract(long long&);
template wistream& wistream::_M_extract(unsigned long long&);
#endif
template wistream& wistream::_M_extract(float&);
template wistream& wistream::_M_extract(double&);
template wistream& wistream::_M_extract(long double&);
template wistream& wistream::_M_extract(void*&);
#endif
_GLIBCXX_END_NAMESPACE
The underlying issue is that several templates are declared as extern templates in the libstdc++ headers. Their instantiations are provided by the libstdc++ that ships with 10.6 and later, but not by the libstdc++ on 10.5. So when you use these templates, linking against the 10.6 SDK succeeds, but the resulting binary depends on symbols the 10.5 OS doesn't provide, and dyld craps out at launch. By providing the instantiations yourself, you ensure your code will also load on Leopard.
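For context, even a trivial use of the stream operators is enough to create a reference to one of these extern-template instantiations; a minimal sketch (the mangled name it produces is the same __ostream_insert symbol shown in the crash log above):

#include <iostream>

int main() {
    // Inserting a C string goes through std::__ostream_insert<char, char_traits<char> >,
    // which the libstdc++ headers declare as an extern template. Built against the
    // 10.6 SDK, that reference is expected to resolve in libstdc++.6.dylib, but the
    // 10.5 copy of the library doesn't export it -- hence "Symbol not found" at launch.
    std::cout << "hello" << std::endl;
    return 0;
}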
Alternatively, you can
#define _GLIBCXX_EXTERN_TEMPLATE 0
in your prefix file, but doing so will cause template code bloat.
Answer 2:
There are a few points I can think of:
Did you compile it as a "release" build? A debug build might not run on machines other than the one on which it was compiled.
Which SDK did you use? Which minimum OS version did you specify in the build settings? If you want to run it on 10.5, you need to build against the 10.5 SDK and/or set the deployment target to 10.5. See this Apple document on building for multiple OS versions (there's a quick sanity-check sketch after this list).
Did the target machine have DYLD_LIBRARY_PATH set to something non-empty? If that variable isn't set carefully, it can confuse dyld.
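If you want to sanity-check the SDK/deployment-target and DYLD_LIBRARY_PATH points, here is a small sketch (just illustrative; it assumes you're building against an SDK that provides AvailabilityMacros.h):

#include <AvailabilityMacros.h>
#include <cstdio>
#include <cstdlib>

// Compile-time check: MAC_OS_X_VERSION_MIN_REQUIRED tracks the deployment target
// (MACOSX_DEPLOYMENT_TARGET / -mmacosx-version-min), so this fails the build if
// the target is newer than 10.5.
#if MAC_OS_X_VERSION_MIN_REQUIRED > MAC_OS_X_VERSION_10_5
#error "Deployment target is newer than 10.5; the binary may not launch on Leopard."
#endif

int main() {
    // Run-time check: print DYLD_LIBRARY_PATH so you can see whether the test
    // machine is redirecting dyld to a different libstdc++.6.dylib.
    const char* path = std::getenv("DYLD_LIBRARY_PATH");
    std::printf("DYLD_LIBRARY_PATH = %s\n", path ? path : "(unset)");
    return 0;
}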
One way to distinguish the various possibilities is to run your app on the dev machine under a separate account with no admin privileges (not your dev account); that way you can test whether it runs on a 10.6 box other than your development setup.
Answer 3:
I ran into the same issue (building with GCC 4.2 made my code fail to launch on OS X 10.5 because of dyld errors involving libstdc++.6.dylib).
The solution proposed by Ben Artin works. Alternatively, you can define _GLIBCXX_EXTERN_TEMPLATE to zero before including any headers (if you are using precompiled headers, make sure they are compiled with the define set correctly).
Source: https://stackoverflow.com/questions/3484043/os-x-program-runs-on-dev-machine-crashing-horribly-on-others