DataNucleus, JDO and executable jar - how to do it?


I am developing a desktop app with DataNucleus and JDO for an embedded H2 database. It all works fine when I run it from Eclipse, but it stops working when I try to make an executable jar.

4 Answers
  • 2020-12-06 08:25

    Adding to DataNucleus's answer.
    To achieve what you need, use the maven-dependency-plugin
    and add the following to your pom.xml:

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <version>2.4</version>
                <executions>
                    <execution>
                        <id>copy-dependencies</id>
                        <phase>package</phase>
                        <goals>
                            <goal>copy-dependencies</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${project.build.directory}/jars</outputDirectory>
                            <overWriteReleases>false</overWriteReleases>
                            <overWriteSnapshots>false</overWriteSnapshots>
                            <overWriteIfNewer>true</overWriteIfNewer>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    

    The dependencies will then be copied into the target/jars directory.

    To execute your app, use this command:

    Windows:
    java -cp "yourFile.jar;jars/*" package.className

    Linux:
    java -cp "yourFile.jar:jars/*" package.className

    NOTE: do not use jars/*.jar; the classpath wildcard must be a bare *, which already expands to the .jar files in that directory.
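
    If you would rather launch with java -jar instead of spelling out the classpath, the jar's manifest can carry it instead. Here is a minimal sketch using the standard maven-jar-plugin; the main class is a placeholder taken from the commands above, so substitute your real entry point:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <configuration>
            <archive>
                <manifest>
                    <!-- placeholder: use your actual main class -->
                    <mainClass>package.className</mainClass>
                    <!-- writes each dependency jar onto the manifest Class-Path -->
                    <addClasspath>true</addClasspath>
                    <!-- must match the outputDirectory above (target/jars) -->
                    <classpathPrefix>jars/</classpathPrefix>
                </manifest>
            </archive>
        </configuration>
    </plugin>

    Since the manifest Class-Path names every jar individually, this also sidesteps the wildcard pitfall in the note above.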

  • 2020-12-06 08:28

    In order to use DataNucleus 4.x in an Apache Storm topology, which requires a single jar, I had to do two hacks to keep its PluginRegistry machinery working. The issue is that the DataNucleus core tries to load modules as OSGi bundles, even when it is not running in an OSGi container. This works fine as long as the jars are not merged (I would prefer not to merge my dependencies, but that is not an option for me).

    First, I merged all the plugin.xml files into the datanucleus-core plugin.xml. The trick is that extension-point ids are relative to their parent plugin's id, so if any of the modules you are using (e.g. datanucleus-rdbms) define new extension-points, you have to rewrite those ids so they are relative to their new parent plugin, as sketched below.
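
    As a purely illustrative before/after of that rewriting (the extension-point and its attributes are hypothetical; check the real plugin.xml files inside your DataNucleus jars for the actual ids):

    <!-- Before: in datanucleus-rdbms's plugin.xml the parent plugin id is
         "org.datanucleus.store.rdbms", so this resolves to the fully
         qualified id "org.datanucleus.store.rdbms.connectionpool" -->
    <extension-point id="connectionpool" name="ConnectionPool" schema="..."/>

    <!-- After: merged into datanucleus-core's plugin.xml, whose plugin id is
         "org.datanucleus", the id must be rewritten so the fully qualified
         id stays the same -->
    <extension-point id="store.rdbms.connectionpool" name="ConnectionPool" schema="..."/>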

    Second, I added the following entries to our jar's MANIFEST.MF:

    Premain-Class: org.datanucleus.enhancer.DataNucleusClassFileTransformer
    Bundle-SymbolicName: org.datanucleus;singleton:=true
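
    If your single jar is built with maven-shade-plugin (an assumption on my part; Storm topologies are commonly shaded, so adapt this to whatever tool you use), those entries can be injected with the stock ManifestResourceTransformer:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
                <configuration>
                    <transformers>
                        <!-- adds the two entries above to the shaded jar's MANIFEST.MF -->
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                            <manifestEntries>
                                <Premain-Class>org.datanucleus.enhancer.DataNucleusClassFileTransformer</Premain-Class>
                                <Bundle-SymbolicName>org.datanucleus;singleton:=true</Bundle-SymbolicName>
                            </manifestEntries>
                        </transformer>
                    </transformers>
                </configuration>
            </execution>
        </executions>
    </plugin>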
    

    This solution is not ideal as our application is essentially pretending to be the DataNucleus core OSGi bundle. However, this is what I ended up with after a few days of smashing my head on my desk.

    It might be possible to provide a different PluginRegistry implementation, but I have not looked into this.

  • 2020-12-06 08:30

    DataNucleus jars are all OSGi-enabled and use a plugin mechanism to identify capabilities, so they contain plugin.xml and META-INF/MANIFEST.MF files. These need to be in the same locations as in the original DN jars (relative to the root of the jar). If you unpack the jars and re-jar them into one, you will need to merge the plugin.xml and META-INF/MANIFEST.MF files from the DN jars: ALL of the information there, not just some of it.
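
    If you are shading with Maven, one option (my suggestion, not something this answer prescribes) is maven-shade-plugin's stock XmlAppendingTransformer, which concatenates the root children of every plugin.xml it encounters into a single file. Be aware that it does not rewrite the relative extension-point ids discussed in another answer here, so those may still need fixing by hand:

    <!-- inside maven-shade-plugin's <transformers> section -->
    <transformer implementation="org.apache.maven.plugins.shade.resource.XmlAppendingTransformer">
        <resource>plugin.xml</resource>
    </transformer>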

  • 2020-12-06 08:39

    For anyone else struggling to merge the DataNucleus plugin.xml files, the following command helped me. Feed it the three separate plugin.xml files and it will print the extension points that appear in more than one file and therefore explicitly need merging:

    cat plugin_core.xml plugin_rdbms.xml plugin_api.xml | grep -h "extension point" | tr -d "[:blank:]" | sort | uniq -d

    More details are in a separate post.
