globals broken with alchemy?

Posted by 安稳与你 on 2019-12-20 02:09:07

Question


It seems that Adobe Alchemy isn't running global constructors. Here's some simple test code:

#include <stdio.h>

class TestClass {
public:
    TestClass(const char message[]) {
        printf("hello %s \n", message);
    }
};

TestClass global("global");

int main(int argc, char **argv) {
    TestClass local("local");
    printf("in main\n");
    return 0;
}

When compiled with native gcc it outputs:

hello global
hello local
in main

When compiled with alchemy gcc it outputs:

hello local
in main

This problem breaks lots of code, notably UnitTest++ (which depends on globals getting initialized to make its auto test-list functionality work).

I'd really like to get to the bottom of this. Is it a bug or a feature that just didn't get implemented in time for the release? Is it possible to workaround?

EDIT: A relevant post on the Adobe Forums is here.


Answer 1:


I've run into the same problem. As far as I could tell, this seems to be the case:

Every single static and global variable of class type will silently fail to be initialized if even a single class attempts dynamic allocation at any point during its initialization. Presumably this is because the ByteBuffer used for dynamic memory isn't yet available. I wish Alchemy were clearer with its error messages; at the moment it's like a strand of old-timey Christmas lights, where one dead bulb shuts off the entire strand.
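For illustration, the problematic pattern looks something like this (a hypothetical `Logger` class, not from the original question, whose constructor performs dynamic allocation):

```cpp
// Hypothetical Logger class illustrating the problematic pattern: its
// constructor allocates dynamically. Under Alchemy, a global of this type
// (and, per the above, every other global of class type) would silently
// fail to initialize; on native gcc it works fine.
class Logger {
public:
    Logger() : buffer(new char[1024]) {}  // dynamic allocation during construction
    ~Logger() { delete[] buffer; }
    bool initialized() const { return buffer != 0; }
private:
    char *buffer;
};

Logger global_logger;  // this global is what breaks under Alchemy
```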

For a workaround, once you've discovered the offending object, you'll need to somehow defer its initialization to runtime. The three techniques that come to mind are pointers, lazy-initialization functions, and references to buffers initialized with placement new.

Pointers

// `global` is now a pointer
TestClass *global;

// all global variable initialization is found here now
void init_globals() {
  global = new TestClass("global");
}

int main(int argc, char **argv) {
  // this needs to be called explicitly at the start, under Alchemy
  init_globals();
  // ... rest of main
}
You'll then need to refactor your code, changing every occurrence of global to (*global).
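To make the call-site change concrete, here's a sketch assuming a hypothetical `greet` method on `TestClass` (the original class has none; the name is for illustration only):

```cpp
#include <string>

// Hypothetical TestClass with a method, to show how call sites change
// once `global` becomes a pointer.
class TestClass {
public:
    TestClass(const char *name) : name_(name) {}
    std::string greet() const { return "hello " + name_; }
private:
    std::string name_;
};

TestClass *global;  // was: TestClass global("global");

void init_globals() {
    global = new TestClass("global");
}

// was: global.greet() — now dereference through the pointer:
std::string use_global() {
    return (*global).greet();  // or equivalently: global->greet()
}
```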

Function

// `global` is now a function
TestClass& global() {
  // static locals are initialized when their functions are first called
  static TestClass global_("global");
  return global_;
}

Now you need to replace every occurrence of global with global(). Notably, this is the only one of the three techniques that doesn't require an explicit init_globals call. I recommend this approach unless renaming global to global() is troublesome for some reason... in which case:
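A small demonstration of why this works, independent of Alchemy: a function-local static is constructed exactly once, on the first call, no matter how many times the function runs (the `Tracked` struct here is a made-up example):

```cpp
// Tracked counts how many times its constructor runs, to show that a
// function-local static is initialized only on the first call.
struct Tracked {
    Tracked() { ++constructions; }
    static int constructions;
};
int Tracked::constructions = 0;

Tracked& instance() {
    static Tracked t;  // constructor runs only the first time through
    return t;
}
```

(Before C++11 this initialization isn't guaranteed thread-safe, which is unlikely to matter in Alchemy's single-threaded environment.)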

Placement new

// a memory buffer is created large enough to hold a TestClass object
unsigned char global_mem[sizeof(TestClass)];
// `global` is now a reference.  
TestClass& global = *(TestClass*)(void*)global_mem;

void init_globals() {
  // this initializes a new TestClass object inside our memory buffer
  new (global_mem) TestClass("global");
}

int main(int argc, char **argv) {
  init_globals();
  // ... rest of main
}
The advantage of this approach is you don't need to change any other code, as global is still just called global. Unfortunately, maintaining an init_globals function can be troublesome.
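One caveat worth noting with this technique: objects created with placement new are never destroyed automatically, so if cleanup matters you also need a matching teardown function that calls the destructor explicitly. A minimal self-contained sketch (the `Widget` type and `destroy_globals` name are hypothetical, and the plain char array assumes adequate alignment, which stricter types may not tolerate):

```cpp
#include <new>  // placement new

struct Widget {
    int value;
    Widget(int v) : value(v) {}
};

// Raw storage large enough for one Widget (alignment assumed adequate
// for this sketch).
unsigned char widget_mem[sizeof(Widget)];
Widget& widget = *(Widget*)(void*)widget_mem;

void init_globals() {
    new (widget_mem) Widget(42);  // construct the object in place
}

void destroy_globals() {
    widget.~Widget();  // placement-new objects need an explicit destructor call
}
```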


Edit:
As discovered in a later question, in addition to dynamic allocation, functions containing static locals also cannot be called during Alchemy's initialization.



Source: https://stackoverflow.com/questions/4169892/globals-broken-with-alchemy
