Joining the Eclipse Project Management Committee

I’m honored to join the Eclipse Project Management Committee (PMC) for the Eclipse Top-Level Project. See
Eclipse project charter for their responsibilities.

As an Eclipse committer and project lead for Platform UI and e4, my main goals are:

  • Attract and win new contributors and committers
  • Improve stability and performance of the Eclipse IDE
  • Enhance the Eclipse RCP programming model

To help achieve this, my main work items are:

  • Clean up the Eclipse code base
    Update the Eclipse code to new framework versions and Java versions
  • Simplify and automate the committer workflow
  • Review Gerrit contributions as much as possible
  • Coach potential new committers
  • Simplify and enhance the use of the UI and the platform API

I have always felt that the existing PMC members supported this work. Joining them is a great honor, and I hope to help enhance the Eclipse IDE further.

Posted in Eclipse, Lars Vogel

Run an Eclipse 32-bit application from a 64-bit Eclipse IDE

Typically the development environment should not depend on the target environment the application will run on. For an Eclipse RCP application using SWT, this is not as trivial as it seems, because the SWT implementation is packaged in platform-dependent bundle fragments. But it is possible to set up the workspace to make this work, which I will show in this blog post.

Use Case

You need to maintain an Eclipse RCP application that makes use of 32-bit Windows native libraries. Since you got a brand-new laptop or PC that runs 64-bit Windows, you installed the 64-bit version of Eclipse. Because you know you need to execute the application in a 32-bit JVM, you add a 32-bit JDK via Window -> Preferences -> Java -> Installed JREs and configure that JDK as the default for the JavaSE-1.8 execution environment.

execution_environment

At development time you want to start your application from the IDE, e.g. via .product file -> Launch an Eclipse application. But you get the following error:

java.lang.UnsatisfiedLinkError: Cannot load 64-bit SWT libraries on 32-bit JVM

Solution

The reason for this is clear: you installed a 64-bit Eclipse, so your installation only contains the 64-bit SWT bundle fragment. But you need the 32-bit SWT fragment. This can be solved easily by configuring the target platform appropriately.

  • Create a new Target Platform
  • Switch to the Environment tab in the PDE Target Editor
  • Change the Architecture to x86 (the resulting entry in the .target file is shown after this list)
    target_editor_environment
  • Switch to the Definition tab
  • Click Reload (this is important to retrieve the x86 fragments!)
  • Switch to the Content tab and check if the correct fragment is now part of the target platform (check for the org.eclipse.swt.win32.win32.x86 fragment)
  • Activate the target platform via Set as Target Platform
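
For reference, the Environment settings above end up in the target definition (.target) file roughly as follows; a minimal sketch for the 32-bit Windows case:

<environment>
    <os>win32</os>
    <ws>win32</ws>
    <arch>x86</arch>
</environment>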

Now it is possible to execute a 32-bit application from a 64-bit Eclipse IDE via .product file -> Launch an Eclipse application.

Note: Remember to start via the .product file and not via an existing run configuration, because an existing run configuration would need to be updated manually to pick up the new environment settings.

Posted in Dirk Fauth, Eclipse

OSGi – bundles / fragments / dependencies

In the last weeks I needed to look at several issues regarding OSGi dependencies in different products. A lot of these issues were IMHO related to incorrect usage of OSGi bundle fragments. As I had to search for various solutions, I will publish my results and my opinion on the usage of fragments in this post, partly also as a reminder to myself for the future.

What is a fragment?

As explained in the OSGi Wiki, a fragment is a bundle that makes its contents available to another bundle. And most importantly, a fragment and its host bundle share the same classloader.

Looking at this from a more abstract point of view, a fragment is an extension to an existing bundle. This might be a simplified statement, but keeping it in mind helped me solve several issues.

What are fragments used for?

I have seen a lot of different usage scenarios for fragments. Considering the above statement, some of them were wrong by design. But before explaining when not to use fragments, let's look at when they are the right choice. Basically, fragments need to be used whenever a resource needs to be accessible by the classloader of the host bundle. There are several use cases for that, most of them relying on technologies and patterns based on standard Java. For example:

  • Add configuration files to a third-party-plugin
    e.g. provide the logging configuration (log4j.xml for the org.apache.log4j bundle); a manifest sketch for such a fragment follows after this list
  • Add new language files for a resource bundle
    e.g. a properties file for locale fr_FR that needs to be located next to the other properties files by specification
  • Add classes that need to be dynamically loaded by a framework
    e.g. provide a custom logging appender
  • Provide native code
    This can be done in several ways, but more on that shortly.
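
For illustration, a fragment attaches itself to its host via the Fragment-Host manifest header. A minimal manifest sketch for a configuration fragment like in the first use case above (the bundle name org.fipro.log4j.config is just an example):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: org.fipro.log4j.config
Bundle-Version: 1.0.0
Fragment-Host: org.apache.log4j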

In short: fragments are used to customize a bundle

When are fragments the wrong choice?

To explain this we will look at the different ways to provide native code as an example.

One way is to use the Bundle-NativeCode manifest header. This way the native code for all environments is packaged in the same bundle. So no fragments here, but it is sometimes not easy to set up. At least I struggled with this approach some years ago.
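
For illustration, such a header could look like the following sketch (library paths and selection attributes are made-up examples):

Bundle-NativeCode: lib/win32/x86/native.dll; osname=Win32; processor=x86,
 lib/linux/x86_64/libnative.so; osname=Linux; processor=x86-64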

A more common approach is to use fragments. For every supported platform there is a corresponding fragment that contains the platform-specific native library. The host bundle on the other side typically contains the Java code that loads the native library and provides the interface to access it (e.g. via JNI). This scenario is IMHO a good example for using fragments to provide native code. The fragments only extend the host bundle without exposing anything publicly.

Another approach is the SWT approach. The difference to the above scenario is that the host bundle org.eclipse.swt is an almost empty bundle that only contains the OSGi meta-information in the MANIFEST.MF. The native libraries as well as the corresponding Java code are supplied via platform-dependent fragments. Although SWT is often referred to as the reference for dealing with native libraries in OSGi, I think that approach is wrong.

To elaborate why I think the approach org.eclipse.swt is using is wrong, we will have a look at a small example.

  1. Create a host bundle in Eclipse via File -> New -> Plug-in Project and name it org.fipro.host. Make sure not to create an Activator or anything else.
  2. Create a fragment for that host bundle via File -> New -> Other -> Plug-in Development -> Fragment Project and name it org.fipro.host.fragment. Specify the host bundle org.fipro.host on the second wizard page.
  3. Create the package org.fipro.host in the fragment project.
  4. Create the following simple class (yes, it has nothing to do with native code in fragments, but it demonstrates the issues just as well).
    package org.fipro.host;
    
    public class MyHelper {
    	public static void doSomething() {
    		System.out.println("do something");
    	}
    }
    

So far, so good. Now let’s consume the helper class.

  1. Create a new bundle via File -> New -> Plug-in Project and name it org.fipro.consumer. This time let the wizard create an Activator.
  2. In Activator#start(BundleContext) try to call MyHelper#doSomething()
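    A minimal sketch of what the consumer Activator with this call could look like:
    package org.fipro.consumer;
    
    import org.fipro.host.MyHelper;
    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;
    
    public class Activator implements BundleActivator {
    
    	public void start(BundleContext context) throws Exception {
    		// call the helper class contributed by the fragment
    		MyHelper.doSomething();
    	}
    
    	public void stop(BundleContext context) throws Exception {
    	}
    }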

Now the fun begins. Of course MyHelper cannot be resolved at this time. We first need to make the package consumable in OSGi. This can be done in the fragment or the host bundle. I personally tend to configure Export-Package in the bundle/fragment where the package is located, so we add the Export-Package manifest header to the fragment. To do this, open the file org.fipro.host.fragment/META-INF/MANIFEST.MF, switch to the Runtime tab and click Add… to add the package org.fipro.host.
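
The relevant part of the fragment's MANIFEST.MF then looks roughly like this (other headers omitted):

Bundle-SymbolicName: org.fipro.host.fragment
Fragment-Host: org.fipro.host
Export-Package: org.fipro.host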

Note: As a fragment is an extension to a bundle, you can also specify the Export-Package header for org.fipro.host in the host bundle org.fipro.host; org.eclipse.swt is configured this way. But note that the fragment packages are not automatically resolved by the PDE Manifest Editor, so you need to add the manifest header manually.

After that the package org.fipro.host can be consumed by other bundles. Open the file org.fipro.consumer/META-INF/MANIFEST.MF and switch to the Dependencies tab. At this point it doesn't matter whether you use Required Plug-ins or Imported Packages, although Import-Package should always be the preferred way, as we will see shortly.

Although the manifest headers are configured correctly, the MyHelper class cannot be resolved. The reason for this is the PDE tooling: it needs additional information to construct proper class paths for building. This can be done by adding the following line to the manifest file of org.fipro.host:

Eclipse-ExtensibleAPI: true

After this additional header is added, the compilation errors are gone.

Note: This additional manifest header is not necessary and not used at runtime. At runtime a fragment is always allowed to add additional packages, classes and resources to the API of the host.

Now that the compilation errors are gone in our workspace and the application runs fine, let's try to build it using Maven Tycho. I don't want to walk through the whole process of setting up a Tycho build, so let's simply assume you have a running Tycho build and include the three projects in that build. Using POM-less Tycho this simply means adding the three projects to the modules section of the build.

You can find further information on Tycho here:
Eclipse Tycho for building Eclipse Plug-ins and RCP applications
POM-less Tycho builds for structured environments

Running the build will fail because of a Compilation failure. The Activator class does not compile because the import org.fipro.host cannot be resolved. Similar to PDE, Tycho is not aware of the build dependency to the fragment. This can be solved by adding an extra. entry to the build.properties of the org.fipro.consumer project.

extra.. = platform:/fragment/org.fipro.host.fragment

See the Plug-in Development Environment Guide for further information about build configuration.

After that entry was added to the build.properties of the consumer bundle, the Tycho build succeeds as well.

What is wrong with the above?

At first sight it is quite obvious what is wrong with the above solution: you need to configure the tooling in several places to make the compilation and the build work. These workarounds even introduce dependencies where there shouldn't be any. In the above example this might not be a big issue, but think about platform-dependent fragments. Do you really want to configure a build dependency to a win32.win32.x86 fragment on the consumer side?

The above scenario even introduces issues for installations with p2. Using the empty host with implementations in the fragments forces you to ensure that at least (or exactly) one fragment is installed together with the host, which is another workaround in my opinion (see Bug 361901 for further information).

OSGi purists will say that the main issue lies in the PDE tooling and Tycho, because the build dependencies are kept as close as possible to the runtime dependencies (see for example here), and that using tools like Bndtools you don't need these workarounds. In principle I agree with that. But unfortunately it is not possible (or at least hard) to use Bndtools for Eclipse application development, mainly because Eclipse features, applications and products are not known in plain OSGi. Therefore the feature-based update mechanism of p2 is not usable either. But I don't want to start the PDE vs. Bndtools discussion; that is worth another post (or series of posts).

In my opinion the real issue in the above scenario, and therefore also in org.eclipse.swt, is the wrong usage of fragments. Why is there a host bundle that only contains the OSGi meta-information? After thinking about this for a while, I realized that the only reason can be laziness! Users want to use Require-Bundle instead of configuring the several needed Import-Package entries. IMHO this is the only reason why the org.eclipse.swt bundle with the multiple platform-dependent fragments exists.

Let's think about possible changes. Make every platform-dependent fragment a bundle and configure the Export-Package manifest header for every bundle. That's it on the provider side. If you wonder about the Eclipse-PlatformFilter manifest header: it works for bundles as well as for fragments, so we don't lose anything here. On the consumer side we need to ensure that Import-Package is used instead of Require-Bundle. This way we declare dependencies on the functionality, not on the bundle where the functionality originated. That's all! Using this approach, the workarounds mentioned above can be removed, and PDE and Tycho work as intended, as they can simply resolve bundle dependencies. I have to admit that I'm not sure about p2 regarding the platform-dependent bundles; that would need to be checked separately.

Conclusion

Having a look at the two initial statements about fragments

  • a fragment is an extension to an existing bundle
  • fragments are used to customize a bundle

it is IMHO wrong to make API publicly available from a fragment. These statements could even be sharpened to the following:

  • a fragment is an optional extension to an existing bundle

With that statement in mind, things get even clearer when thinking about fragments. Here is another example to strengthen my point. Suppose you have a host bundle that already exports a package org.fipro.host. Now you have a fragment that adds an additional public class to that package, and a consumer bundle uses that class. Using Bndtools or the workarounds for PDE and Tycho shown above, this will compile and build fine. But what if the fragment is not deployed at runtime? Since there is no constraint on the consumer bundle that would identify the missing fragment, the consumer bundle will still start, and you will get a ClassNotFoundException at runtime.

Personally I think that every time a direct dependency to a fragment is introduced, something is wrong.

There might be exceptions to that rule. One could be a custom logging appender that needs to be accessible in other places, e.g. for programmatic configuration. As the logging appender needs to be loaded by the same classloader as the logging framework (e.g. org.apache.log4j), it needs to be provided via a fragment. And to access it programmatically, a direct dependency to the fragment is needed. But honestly, even in such a case a direct dependency to the fragment can be avoided with a good module design. Such a design could, for example, make the appender an OSGi service: the service interface would be defined in a separate API bundle and the programmatic access would be implemented against the service interface. Therefore no direct dependency to the fragment would be necessary.
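
To make that design more concrete, here is a minimal sketch. Assume an interface org.fipro.logging.api.ConfigurableAppender with a setThreshold(String) method, defined in a separate API bundle (the package, interface and method names are invented for illustration); the appender in the fragment implements it and is registered as an OSGi service. A consumer then only needs the API package:

import org.fipro.logging.api.ConfigurableAppender; // hypothetical API bundle package
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;

public class AppenderConfigurator {

    // "context" is the BundleContext of the consuming bundle; the consumer
    // only imports the API package and has no dependency to the fragment
    public void configure(BundleContext context) {
        ServiceReference<ConfigurableAppender> ref =
                context.getServiceReference(ConfigurableAppender.class);
        if (ref != null) {
            try {
                context.getService(ref).setThreshold("DEBUG");
            } finally {
                context.ungetService(ref);
            }
        }
    }
}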

As I struggled for several days searching for solutions to fragment dependency issues, I hope this post can help others solve such issues. Basically my solution is to get rid of all fragments that export API and either make them separate bundles or let them provide their API via services.

If someone with deeper knowledge of OSGi ever comes across this post and has comments or remarks about my statements, please let me know. I'm always happy to learn something new or gain new insights.

Posted in Dirk Fauth, Eclipse, OSGi

POM-less Tycho builds for structured environments

With Tycho 0.24, POM-less Tycho builds were introduced. That approach uses convention-over-configuration to reduce the amount of redundant information needed for setting up a Tycho build. In short, this means you don't need to create and maintain pom.xml files for bundle, feature and test projects anymore, as the whole information can be extracted from the already existing information in MANIFEST.MF or feature.xml.

Lars Vogel shows in his Tycho Tutorial a recommended folder structure, which is also widely used in Eclipse projects.

recommended_folder_structure

The meaning of that folder structure is:

  • bundles
    contains all plug-in projects
  • features
    contains all feature projects
  • products
    contains all product projects
  • releng
    contains projects related to the release engineering, like

    • the project containing the parent POM
    • the aggregator project that contains the aggregator POM which defines the modules of a build and is also the starting point of a Tycho build
    • the target definition project that contains the target definition for the Tycho build
  • tests
    contains all test plug-in/fragment projects

This structure helps in organizing the project. But there is one convention for POM-less Tycho builds that does not work out-of-the-box with the given folder structure: “The parent pom for features and plugins must reside in the parent directory”. Knowing the Maven mechanics, this convention can be satisfied easily by introducing some POM files that simply connect to the real parent POM. I call them POM-less parent POM files. These POM-less parents need to be put into the base directories bundles, features and tests, and they do nothing else than specify the real parent POM of the project (which is located in a sub-directory of releng).

The following snippet shows a POM-less parent example for the bundles folder:

<project xmlns="http://maven.apache.org/POM/4.0.0" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
    http://maven.apache.org/maven-v4_0_0.xsd"> 

    <modelVersion>4.0.0</modelVersion>

    <artifactId>org.fipro.example.bundles</artifactId>
 
    <packaging>pom</packaging>

    <parent>
        <groupId>org.fipro.example</groupId>
        <artifactId>org.fipro.example.parent</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <relativePath>../releng/org.fipro.example.parent</relativePath>
    </parent>
</project>

For the features and tests folder you simply need to modify the artifactId accordingly.

Note that you don’t need to reference the POM-less parent POM files in the modules section of the aggregator POM.

Following the best practices regarding the folder structure of a project and the conventions for POM-less Tycho builds, you will have at least seven pom.xml files in your project.

  • parent POM
    the main build configuration
  • aggregator POM
    the collection of modules to build
  • target-definition POM
    the eclipse target definition build
  • product POM
    the eclipse product build
  • POM-less parent POM for bundles
    the connection to the real parent POM for POM-less plug-in builds
  • POM-less parent POM for features
    the connection to the real parent POM for POM-less feature builds
  • POM-less parent POM for tests
    the connection to the real parent POM for POM-less test plug-in builds

Of course there will be more if you provide a multi-product environment or if you need to customize the build of a plug-in for example.

With the necessary Maven extension descriptor for enabling the POM-less Tycho build (see POM-less Tycho builds), the folder structure will look similar to the following screenshot:

pom-less_folder_structure
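
For reference, that extension descriptor is a .mvn/extensions.xml file in the root directory of the build; a minimal sketch (the version is only an example and should match the Tycho version of your build):

<?xml version="1.0" encoding="UTF-8"?>
<extensions>
    <extension>
        <groupId>org.eclipse.tycho.extras</groupId>
        <artifactId>tycho-pomless</artifactId>
        <version>0.24.0</version>
    </extension>
</extensions>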

I hope this blog post will help people set up Tycho builds for their products more easily using POM-less Tycho.

Posted in Dirk Fauth, Eclipse

Generic Eclipse 3.x views, editors and handlers with DI – by René Brandstetter

In this blog entry I will show you three independent features provided by Eclipse and OSGi which can work together to create views, editors or handlers via the Eclipse ExtensionRegistry while still using the dependency injection mechanism of Eclipse 4.

Supersede the ExtensionRegistry creation process of an object (an almost forgotten hidden feature)

The Eclipse ExtensionRegistry allows you as an extension provider not only to contribute your custom implementation but also to be part of the “creation & initialization” process. Whenever an ExtensionPoint needs to create an instance of a specific class (e.g.: <view name="View" class="com.example.legacy3x.View" id="com.example.legacy3x.view"/>), it uses the method IConfigurationElement#createExecutableExtension(String attributeName). This method is responsible for creating an instance of the class via the bundle which provides the extension and has the following workflow:

  1. Retrieve the class name from the extension
  2. Create an instance of the class
  3. If the instance implements IExecutableExtension, call its setInitializationData() method with the initialization data from the provided extension (see next paragraph for details on how this is done)
  4. Now check if the instance implements IExecutableExtensionFactory:
    • If so, return the value object retrieved from its create() method
    • If it doesn’t implement this interface, it just returns the already created instance from step no. 2

With these two interfaces you have the following possibilities:

  • neither implement IExecutableExtension nor IExecutableExtensionFactory – a new instance of the named class is returned (this is more or less the most used scenario)
  • only implement IExecutableExtension – an instance of the named class is created and its setInitializationData() method is called (can be seen as an @PostConstruct)
  • only implement IExecutableExtensionFactory – an instance of the mentioned IExecutableExtensionFactory is created, but the object created via IExecutableExtensionFactory#create() is returned and not the IExecutableExtensionFactory instance (as the name implies, this is the Factory pattern)
  • both IExecutableExtension and IExecutableExtensionFactory are implemented – an instance of the mentioned IExecutableExtensionFactory is created, configured via IExecutableExtension#setInitializationData(), and afterwards the object created via IExecutableExtensionFactory#create() is returned (can be seen as a configurable factory)

Now that we know how the creation process can be influenced, let's take a look at how we can specify the initialization data.

IExecutableExtension defines the method “void setInitializationData(IConfigurationElement config, String propertyName, Object data) throws CoreException;” which gets:

  • config – the contributed extension on which the IConfigurationElement#createExecutableExtension(String attributeName) was called
  • propertyName – the name of the attribute which caused the creation (= attributeName of the method IConfigurationElement#createExecutableExtension(String attributeName))
  • data – the additional initialization data you are able to configure

The additional configured initialization data is an object and can have two different types, depending on how the initialization data is specified.

  • data instanceof String: the initialization data was directly specified in the attribute by separating the class name and the initialization data with a colon (e.g.: class="com.example.legacy3x.View:hereYouCanSpecifyAllInitializationDataAsAString"; data would be "hereYouCanSpecifyAllInitializationDataAsAString")
  • data instanceof Map: the initialization data was defined as key/value pairs in a separate tag named exactly the same as the attribute name (e.g.:
    <view name="View"
      id="com.example.legacy3x.view">
      <class class="com.example.legacy3x.View">
        <parameter name="key 1" value="value 1" />
        <parameter name="key 2" value="value 2" />
      </class>
    </view>

    ). For the transfer of the attribute name to the tag name there is no UI, so you have to do this in the XML source directly. Just remove the attribute, create a tag named like the attribute (in our example this is “class“), and provide an attribute named “class” on it which holds the fully qualified name of the real class. All <parameter> sub-tags of your tag will become the key/value pairs of the provided Map.

    BecomingATag

Get an appropriate IEclipseContext

In the Eclipse ExtensionRegistry or in a legacy 3.x Eclipse which uses the “org.eclipse.e4.tools.compat” bundle, you probably won't have an E4 ApplicationModel object from which to retrieve an IEclipseContext. So how can you retrieve it in these situations?

Keep in mind that the IEclipseContext is organized in a hierarchical structure which reflects more or less the structure of the UI.

e4_IEclipseContext_hierarchy

This also represents more or less the structure of an Eclipse 3.x application.

e3_UI_hierarchy

A closer look at the IWorkbench* classes above shows that all of them extend the IServiceLocator interface, so we can call the IServiceLocator#getService(Class) method on them. The good news about this method is that it will return the IEclipseContext instance associated with the IWorkbench* object whenever you invoke it with IEclipseContext.class as the argument. This means retrieving the IEclipseContext of the currently active window is done by the following method call:

PlatformUI.getWorkbench().getActiveWorkbenchWindow().getService(IEclipseContext.class)

or, to retrieve it from the application:

PlatformUI.getWorkbench().getService(IEclipseContext.class)

Load classes from specific bundles

In OSGi it's a bad habit to load classes yourself, but sometimes (hopefully in really rare cases) you need to provide functionality which has to load a specific class from another bundle that your code isn't aware of (meaning it doesn't know the bundle and therefore can't define an Import-Package entry in the MANIFEST.MF). But as Eclipse 4.x shows, there must be a way of doing this. To load a class from a bundle only known at runtime, your code has to know the following:

  1. the symbolic name of the bundle which holds the class
  2. the full qualified name of the class to load
  3. this would be nice to have but is sadly always ignored because of easier handling: the version or version range of the bundle which holds the class

How can this information be put into a String format? The best and easiest way is to put it into a URI, which has a standardized format, and this is how it's done in e4 with the “bundleclass://“ references. For example, in “bundleclass://com.example.legacy3x.view/com.example.legacy3x.View“, the host part of the URI is the bundle symbolic name (=com.example.legacy3x.view) and the fully qualified name of the class is held in the first path element (=com.example.legacy3x.View).

Now that we know how to represent this information, we can ask the OSGi environment to give us the required bundle and, with the help of this bundle, load the class. To retrieve the bundle from a pure OSGi environment you have three possibilities, depending on the OSGi version you're using:

  1. OSGi Version < r4v43 where PackageAdmin isn’t deprecated: retrieve the PackageAdmin service from the OSGi service registry and call org.osgi.service.packageadmin.PackageAdmin.getBundles(String symbolicName, String versionRange)
  2. OSGi Version > r4v43 and < r6 or if you want to be OSGi version independent: use a BundleTracker implementation similar to the one in org.eclipse.e4.ui.internal.workbench.BundleFinder
  3. OSGi Version >= r6: here a new method exists FrameworkWiring#findProviders(Requirement) which can be used to query the requirement/capabilities framework of OSGi

In the following example I will show you only the latest way of retrieving a bundle from OSGi, via FrameworkWiring#findProviders(Requirement), because the other two options are self-explanatory.

// ... unimportant imports and null checks omitted for simplicity ...
import org.osgi.framework.Constants;
import org.osgi.framework.namespace.IdentityNamespace;
import org.osgi.framework.wiring.BundleCapability;
import org.osgi.framework.wiring.FrameworkWiring;
import org.osgi.resource.Namespace;
import org.osgi.resource.Requirement;
import org.osgi.resource.Resource;
// ... other imports ...
 
static <T> Class<T> loadClass(final String bundleName, String className) throws ClassNotFoundException {
      // retrieve the FrameworkWiring: according to the OSGi specification only the system bundle can be adapted to this
      FrameworkWiring frameworkWiring = context.getBundle(Constants.SYSTEM_BUNDLE_ID).adapt(FrameworkWiring.class);
          
      // build FrameworkWiring search criteria to find a specific bundle with a given name
      Requirement requirement = new Requirement() {
            @Override
            public Resource getResource() {
              return null; // requirement is synthesized, hence null
            }
            
            @Override
            public String getNamespace() {
              return IdentityNamespace.IDENTITY_NAMESPACE;
            }
            
            @Override
            public Map<String, String> getDirectives() {
              return Collections.<String, String> singletonMap(Namespace.REQUIREMENT_FILTER_DIRECTIVE, "(" + IdentityNamespace.IDENTITY_NAMESPACE + "=" + bundleName + ")");
            }
            
            @Override
            public Map<String, Object> getAttributes() {
              return Collections.<String, Object>emptyMap();
            }
      };
      Collection<BundleCapability> capabilities = frameworkWiring.findProviders(requirement);  // query the OSGi environment for a bundle with the given symbolic name
          
      // we didn't specify a version for simplicity so we take the bundle with the highest version
      Bundle bundle = null;
      for( BundleCapability capability : capabilities ){
        Bundle b = capability.getRevision().getBundle();
        if( bundle == null || b.getVersion().compareTo(bundle.getVersion()) > 0 ){
          bundle = b;
        }
      }
      // in case the bundle exists, the bundle with the given symbolic name and the highest version is held in the "bundle" variable
      return (Class<T>)bundle.loadClass(className);
  }

The for-loop over the retrieved capabilities only exists because bundle versions are not taken into account in the example. If you enrich your URI with a version (e.g.: bundleclass://com.example.legacy3x.view/com.example.legacy3x.View?v=1.0.0), you could enrich the filter provided in the directives with this information too. In that case the filter should look like: "(&(" + IdentityNamespace.IDENTITY_NAMESPACE + "=" + bundleName + ")(" + IdentityNamespace.CAPABILITY_VERSION_ATTRIBUTE + "=" + bundleVersion + "))"

Hint: If you stick to Eclipse as the OSGi environment and use the org.eclipse.core.runtime bundle, you could also use the static methods getBundle(String symbolicName) or getBundles(String symbolicName, String version) of org.eclipse.core.runtime.Platform.

Plumbing everything together for a generic DIViewPart

Now that we know how to

  • supersede the ExtensionRegistry class creation mechanism,
  • retrieve an IEclipseContext from different Workbench elements and
  • load a class from another “unknown” bundle

we can use this information to extend the DIViewPart, DIEditorPart and DIHandler classes of the org.eclipse.e4.tools.compat bundle to be more generic than they are now.

For this you have to create a simple bundle which uses the above mentioned hints to (a combined sketch follows after this list):

  • provide a configurable IExecutableExtensionFactory
    • Example: public class DIFactoryForE3Elements implements IExecutableExtension, IExecutableExtensionFactory{ … }
    • Hint: The package of this factory should also be exported so it can be used in other bundles.
  • in its setInitializationData() method retrieve the required information to load the POJO class
    • Example: the data object could be a bundleclass://-URI
      URI bundleClass = new URI((String)data);
      String bundleName = bundleClass.getHost();
      String className = bundleClass.getPath();
  • in its create() method instantiate, depending on the used extension point (handler, view or editor), the appropriate DI*-class of the org.eclipse.e4.tools.compat bundle
    • Example: return new DIHandler<>(pojoClass);
    • Hint: The DI*-classes internally use the shown techniques to retrieve the appropriate IEclipseContext.
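
The following is a minimal sketch of how such a factory could look. The class and plug-in id, the error handling and the use of Platform.getBundle() for class loading are assumptions for illustration, not the original implementation; the DIHandler import assumes the org.eclipse.e4.tools.compat.parts package.

import java.net.URI;
import java.net.URISyntaxException;

import org.eclipse.core.runtime.CoreException;
import org.eclipse.core.runtime.IConfigurationElement;
import org.eclipse.core.runtime.IExecutableExtension;
import org.eclipse.core.runtime.IExecutableExtensionFactory;
import org.eclipse.core.runtime.IStatus;
import org.eclipse.core.runtime.Platform;
import org.eclipse.core.runtime.Status;
import org.eclipse.e4.tools.compat.parts.DIHandler;
import org.osgi.framework.Bundle;

public class DIFactoryForE3Elements
        implements IExecutableExtension, IExecutableExtensionFactory {

    private String bundleName;
    private String className;

    @Override
    public void setInitializationData(IConfigurationElement config,
            String propertyName, Object data) throws CoreException {
        try {
            // the initialization data is expected to be a bundleclass:// URI
            URI bundleClass = new URI((String) data);
            this.bundleName = bundleClass.getHost();
            // strip the leading '/' from the path to get the class name
            this.className = bundleClass.getPath().substring(1);
        } catch (URISyntaxException e) {
            throw new CoreException(new Status(IStatus.ERROR,
                    "com.example.e3.with.di", "Invalid bundleclass URI", e));
        }
    }

    @Override
    public Object create() throws CoreException {
        try {
            // for brevity the class is loaded via Platform.getBundle();
            // the FrameworkWiring approach shown above works as well
            Bundle bundle = Platform.getBundle(this.bundleName);
            Class<?> pojoClass = bundle.loadClass(this.className);
            // the handler case mirrors the example above; for a view or
            // editor you would return a DIViewPart/DIEditorPart subclass,
            // depending on the used extension point
            return new DIHandler<>(pojoClass);
        } catch (ClassNotFoundException e) {
            throw new CoreException(new Status(IStatus.ERROR,
                    "com.example.e3.with.di",
                    "Cannot load class " + this.className, e));
        }
    }
}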

After you've created this bundle, you can use the created IExecutableExtensionFactory in all bundles which want to provide an “e4-like POJO-based” view, editor or handler. To do so, you just have to:

  • write a POJO class with the appropriate e4-annotations (@Inject, @PostConstruct, …)
  • import the package which holds your IExecutableExtensionFactory implementation
  • create the appropriate extension entry and use your IExecutableExtensionFactory
    • Example:
      <extension point="org.eclipse.ui.views">
        <view allowMultiple="true"
           class="com.example.e3.with.di.DIFactoryForE3Elements:bundleclass://com.example.e3.with.di.usage/com.example.e3.with.di.usage.PojoView"
           id="com.example.e3.with.di.usage.pojoview"
           name="DI enabled View"
           restorable="true">
        </view>
      </extension>

Final Words

If my explanations are a little bit confusing or if you just want to see it in action, take a look at the following GitHub links:

 

Many thanks to:

Lars Vogel for once again motivating me to write this blog entry

and

Tom Schindl, the mastermind behind the “how to make the DI*-classes generic” idea, for allowing me to present it here.

And as usual, sorry for being too chatty and for all the typos and errors: comment on them or keep them ;-).

Posted in Eclipse, Lars Vogel

Honored to become Eclipse PDE committer

I'm happy to announce that I was elected as a PDE committer. This made the integration of the new png icons into PDE much easier. I plan to work on the PDE templates and improve the PDE tooling for the IDE and PDE extensions in the Eclipse Neon release.

Posted in Eclipse, Lars Vogel

Honored to become Platform UI co-lead

It is a great honor to serve in the role of Platform UI co-lead together with Paul Webster. Paul has been a great mentor for me in the past and I'm very happy to continue to work with him.

We at the vogella GmbH follow three goals, two of which in my opinion also apply to our platform work:

  • Improvement over initial perfection
  • Automation over manual work

As a platform.ui committer I have already been contributing a lot to the Platform UI project for a while, and I'm happy to extend this activity. Here are some of the work items which I think are important for the Eclipse Platform UI project:

  • Simplify the contribution process to the Eclipse platform
  • Clean up the platform code to ease maintenance in the future
  • Enable the existing tests during our Tycho build to ensure the quality of the code
  • Ensure that the Eclipse IDE remains the best IDE out there
  • Make it easier to use the Eclipse 4 API for IDE plug-in development
  • Improve our CSS story
  • Improve the stability and the performance of the platform

The overall goal is to increase the number of contributors and committers in the Eclipse platform project.

Thanks a lot to the former co-lead Daniel Rolka for his great work. I'm looking forward to helping the Platform UI project.

Posted in Eclipse, Lars Vogel

NatTable context menus with Eclipse menus

In this blog post I will explain how to add context menus to a NatTable instance and how to combine menus configured via the Eclipse Application Model with NatTable menus. It is based on the enhancements added in NatTable 1.2.0, but I will also explain the differences to prior NatTable versions and the issues that led to the enhancements.

I will also show some basics and the usage with Eclipse 3.x menus.

Basics

In SWT it is possible to attach a menu to a control by calling Control#setMenu(Menu). This menu will be shown when performing a right click on the rendered control.

[code source=”java”]
Text input = new Text(parent, SWT.BORDER);

// create a simple Menu for the input text field
Menu menu = new Menu(input);
MenuItem item = new MenuItem(menu, SWT.PUSH);
item.setText("Open dialog");
item.addSelectionListener(new SelectionAdapter() {
    @Override
    public void widgetSelected(SelectionEvent e) {
        MessageDialog.openInformation(
            null, "Information", "Some information dialog");
    }
});

// set the menu to the input Text control
input.setMenu(menu);
[/code]

Using the SWT default mechanism for registering a menu with a NatTable instance would cause the menu to be shown in every region of a NatTable composition. For example, a menu that should only be attached to the column header would also open on a right click on any other region in a grid composition (column header, row header, corner, body). Because of that, and because NatTable comes with a lot of built-in commands and a special label/region-based concept to determine a context, it provides its own mechanism to register menus.

Via the IUiBindingRegistry it is possible to register a binding in NatTable to perform an action on user interaction, e.g. open a menu on a right click in a configured grid region. The PopupMenuBuilder is a builder in NatTable to create a menu with menu items that perform NatTable commands. It has several methods for adding such menu items and initializes and returns the menu on calling PopupMenuBuilder#build().

To create a menu with NatTable commands you need to perform the following steps:

  1. Create an IConfiguration for the menu by extending AbstractUiBindingConfiguration
  2. Create a menu using the NatTable PopupMenuBuilder helper class
  3. Register a PopupMenuAction binding using the created menu
  4. Add the IConfiguration to the NatTable instance

The following code shows the DebugMenuConfiguration that is shipped with NatTable to add debugging capabilities to a rendered NatTable.

[code source=”java”]
// [1] IConfiguration for registering a UI binding to open a menu
public class DebugMenuConfiguration
        extends AbstractUiBindingConfiguration {

    private final Menu debugMenu;

    public DebugMenuConfiguration(NatTable natTable) {
        // [2] create the menu using the PopupMenuBuilder
        this.debugMenu = new PopupMenuBuilder(natTable)
            .withInspectLabelsMenuItem()
            .build();
    }

    @Override
    public void configureUiBindings(
            UiBindingRegistry uiBindingRegistry) {
        // [3] bind the PopupMenuAction to a right click
        // using GridRegion.COLUMN_HEADER instead of null would
        // for example open the menu only on performing a right
        // click on the column header instead of any region
        uiBindingRegistry.registerMouseDownBinding(
            new MouseEventMatcher(
                SWT.NONE,
                null,
                MouseEventMatcher.RIGHT_BUTTON),
            new PopupMenuAction(this.debugMenu));
    }

}
[/code]

[code source=”java”]
// [4] add the menu configuration to a NatTable instance
natTable.addConfiguration(new DebugMenuConfiguration(natTable));
[/code]

In NatTable versions before 1.2.0, you also need to ensure that the created menu is disposed when the NatTable instance is disposed; otherwise you run into a memory leak. The reason is that the menu is not connected to the NatTable control as an SWT menu, so when the NatTable instance gets disposed, the connected menu is not known and therefore not disposed with the control. This can be solved by adding a DisposeListener to the NatTable instance as shown below.

[code source=”java”]
public DebugMenuConfiguration(NatTable natTable) {
    // [2] create the menu using the PopupMenuBuilder
    this.debugMenu = new PopupMenuBuilder(natTable)
        .withInspectLabelsMenuItem()
        .build();

    // ensure the created menu gets disposed
    // only necessary for NatTable < 1.2.0
    natTable.addDisposeListener(new DisposeListener() {
        public void widgetDisposed(DisposeEvent e) {
            if (debugMenu != null && !debugMenu.isDisposed())
                debugMenu.dispose();
        }
    });
}
[/code]

Since NatTable 1.2.0 the necessary DisposeListener is added automatically when calling PopupMenuBuilder#build().

Combination with Eclipse 3.x menus

Using Eclipse it is possible to specify a menu declaratively. In Eclipse 3.x this is done using the extension point org.eclipse.ui.menus. Create a new menuContribution for that extension point and set the locationURI to a value that starts with popup:, e.g. popup:myMenu.

Eclipse3menu

This menu can then be used as a NatTable menu by creating a MenuManager, registering the menu via its id together with the MenuManager to the site, and creating a PopupMenuBuilder using the MenuManager instance.

[code source=”java”]
// somewhere in the ViewPart, e.g. createPartControl(Composite)
MenuManager mgr = new MenuManager();
getSite().registerContextMenu("myMenu", mgr, null);
[/code]

[code source=”java”]
// in the menu configuration
public DebugMenuConfiguration(NatTable natTable, MenuManager mgr) {
// extend the declarative menu provided by the MenuManager
this.debugMenu = new PopupMenuBuilder(natTable, mgr)
.withInspectLabelsMenuItem()
.build();
}
[/code]

Since NatTable 1.2.0 it is possible to create the PopupMenuBuilder using a MenuManager instance. By using the MenuManager it is possible to configure visibility and enablement constraints in the plugin.xml and to extend the declarative menu with NatTable commands using the PopupMenuBuilder. Prior to 1.2.0 it was only possible to create the PopupMenuBuilder using the Menu instance created via MenuManager#createContextMenu(Control). Menu items that are added via the PopupMenuBuilder are discarded that way, since they are not of type IContributionItem.

Combination with Eclipse 4.x menus

In Eclipse 4.x you declare a popup menu for an SWT control in the application model, e.g. in the menus section of a part.

Eclipse4menu

The declared menu can then be retrieved via the EMenuService. For this, get the EMenuService via injection and call EMenuService#registerContextMenu(Object, String).

[code source=”java”]
@Inject
EMenuService menuService;
[/code]

[code source=”java”]
menuService.registerContextMenu(
natTable,
"com.vogella.nebula.nattable.popupmenu.0");
[/code]

Further information on popup menus with Eclipse 4.x can be found here.

Using the EMenuService directly registers the menu as an SWT menu on the NatTable control, which violates the NatTable menu concepts. As explained above, the menu will be shown for the whole NatTable, and it is not possible to distinguish between regions in a grid, for example.

Since it is not possible to retrieve the menu without registering it directly on the widget, we need to retrieve and unregister it accordingly. After that we are able to create the PopupMenuBuilder instance using the retrieved menu.

[code source=”java”]
// get the menu registered by EMenuService
final Menu e4Menu = natTable.getMenu();

// remove the menu reference from NatTable instance
natTable.setMenu(null);

natTable.addConfiguration(
    new AbstractUiBindingConfiguration() {

        @Override
        public void configureUiBindings(
                UiBindingRegistry uiBindingRegistry) {
            // add NatTable menu items
            // and register the DisposeListener
            new PopupMenuBuilder(natTable, e4Menu)
                .withInspectLabelsMenuItem()
                .build();

            // register the UI binding
            uiBindingRegistry.registerMouseDownBinding(
                new MouseEventMatcher(
                    SWT.NONE,
                    GridRegion.BODY,
                    MouseEventMatcher.RIGHT_BUTTON),
                new PopupMenuAction(e4Menu));
        }
    });
[/code]

As explained before, using the Menu instance directly would prevent adding additional menu items, because the MenuManager that created the Menu only handles IContributionItems correctly and is called on filling the menu. Fortunately, using Eclipse 4, the Menu that is created via a MenuManager knows the MenuManager that created it. It can be retrieved via Menu#getData(), which is set by the MenuManagerRenderer. The PopupMenuBuilder checks that value and keeps the reference to the MenuManager in order to support extending the menu and adding visibility and enablement states, as explained in a minute.

The MenuManager itself also adds a reference to itself to the created Menu via setData(). It uses the key org.eclipse.jface.action.MenuManager.managerKey, which is defined as a constant in MenuManager. Unfortunately this constant is private and therefore not accessible from any other code, which seems to be the reason why the MenuManagerRenderer adds the reference via setData() without a key.

State configuration of menu items

Another advantage that comes with NatTable 1.2.0 is the support for visibility and enablement states. As explained above, the PopupMenuBuilder is now able to work with a MenuManager instead of the plain Menu instance. This way the visibility constraints via core expressions will also work correctly.

For NatTable commands it is also possible to specify a visible or enabled state via IMenuItemState. This can be done using the methods withEnabledState(String, IMenuItemState) and withVisibleState(String, IMenuItemState) on the PopupMenuBuilder when building the menu. The addition of visible and enabled states for NatTable menu items makes it possible, for example, to disable a menu item for specific columns or for a special state, like disabling the hide-column menu item if it would lead to an empty table (hide last column issue). For this, every default menu item can be identified via a unique id, which is specified as a constant in the PopupMenuBuilder.

The following code will extend the Eclipse 4 menu from above with the debug menu item and disable it for the first column in the body region of the grid.

[code source=”java”]
new PopupMenuBuilder(natTable, e4Menu)
    .withInspectLabelsMenuItem()
    .withEnabledState(
        PopupMenuBuilder.INSPECT_LABEL_MENU_ITEM_ID,
        new IMenuItemState() {

            @Override
            public boolean isActive(NatEventData natEventData) {
                return natEventData.getColumnPosition() > 1;
            }
        })
    .build();
[/code]

In order to be able to configure a visible or enabled state for custom menu items, the menu items need to be added with an id. This can be done using PopupMenuBuilder#withMenuItemProvider(String, IMenuItemProvider). Using the id that was used to add the menu item, it is then possible to configure an IMenuItemState for that menu item.
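
For illustration, a sketch of how a custom item could be added with an id and then get a visible state configured (the id string and the menu item label are made-up examples):

[code source="java"]
new PopupMenuBuilder(natTable, e4Menu)
    .withMenuItemProvider(
        "com.example.myMenuItem",
        new IMenuItemProvider() {

            @Override
            public void addMenuItem(NatTable natTable, Menu popupMenu) {
                MenuItem item = new MenuItem(popupMenu, SWT.PUSH);
                item.setText("My custom action");
            }
        })
    .withVisibleState(
        "com.example.myMenuItem",
        new IMenuItemState() {

            @Override
            public boolean isActive(NatEventData natEventData) {
                // example condition: hide the item for the first column
                return natEventData.getColumnPosition() > 1;
            }
        })
    .build();
[/code]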

Posted in Dirk Fauth, Eclipse

NatTable with custom scrollbars

When talking about styling an SWT control via CSS, one issue is raised quite early: the scrollbars cannot be styled! Looking at a dark theme, the importance of that issue becomes obvious, as you can see in the following screenshot.

NatTable_dark_default_scrollbars

In NatTable the scrolling capabilities are provided via the ViewportLayer. With NatTable 1.1 the possibility was added to set custom scrollbars on the ViewportLayer. This enables for example having multiple ViewportLayers in a layer composition (split viewport) or creating UI layouts with special scrolling interactions.

With the possibility to use a custom scrollbar implementation, it is possible to style a NatTable completely with a dark theme. As an example for a stylable scrollbar we use the FlatScrollBar from Code Affine.

Since the scrollbars of the Canvas, which is the base class of NatTable, can't be exchanged directly, we need to create a wrapper composite for the NatTable. This way the scrollbars can be attached next to the NatTable instead of being part of the NatTable itself.

NatTable_wrapper

To create the above layout, a GridLayout with two columns can be used, where the NatTable will take all the available space.

[code source=”java”]
// NatTable and scrollbar container
Composite container = new Composite(parent, SWT.NONE);
GridLayoutFactory
.swtDefaults()
.numColumns(2)
.margins(0, 0)
.spacing(0, 0)
.applyTo(container);

// NatTable as main control
NatTable natTable = new NatTable(container, viewportLayer);
GridDataFactory
.fillDefaults()
.grab(true, true)
.applyTo(natTable);
[/code]

The vertical scrollbar is attached to the right, and the horizontal scrollbar is attached to the bottom. To ensure that the layout doesn’t break, the FlatScrollBar is wrapped into a Composite. This way we are also able to set a fixed width/height, while telling the FlatScrollBar to fill the available space.

[code source=”java”]
// vertical scrollbar wrapped in another composite for layout
Composite verticalComposite =
new Composite(container, SWT.NONE);
GridLayoutFactory
.swtDefaults()
.margins(0, 0)
.spacing(0, 0)
.applyTo(verticalComposite);
GridData verticalData = GridDataFactory
.swtDefaults()
.hint(14, SWT.DEFAULT)
.align(SWT.BEGINNING, SWT.FILL)
.grab(false, true)
.create();
verticalComposite.setLayoutData(verticalData);

FlatScrollBar vertical =
new FlatScrollBar(verticalComposite, SWT.VERTICAL);
GridDataFactory
.fillDefaults()
.grab(true, true)
.applyTo(vertical);

// horizontal scrollbar wrapped in another composite for layout
Composite horizontalComposite =
new Composite(container, SWT.NONE);
GridLayoutFactory
.swtDefaults()
.margins(0, 0)
.spacing(0, 0)
.applyTo(horizontalComposite);
GridData horizontalData = GridDataFactory
.swtDefaults()
.hint(SWT.DEFAULT, 14)
.align(SWT.FILL, SWT.BEGINNING)
.grab(true, false)
.create();
horizontalComposite.setLayoutData(horizontalData);

FlatScrollBar horizontal =
new FlatScrollBar(horizontalComposite, SWT.HORIZONTAL);
GridDataFactory
.fillDefaults()
.grab(true, true)
.applyTo(horizontal);
[/code]

To be independent of the scrollbar implementation, the IScroller<T> interface was introduced in NatTable. The two default implementations ScrollBarScroller and SliderScroller are shipped with NatTable Core to be able to set custom scrollbars using SWT default implementations. Using this abstraction it is also possible to use another scrollbar implementation, like the FlatScrollBar. The following code shows the implementation of a FlatScrollBarScroller.

[code source=”java”]
class FlatScrollBarScroller
implements IScroller<FlatScrollBar> {

private FlatScrollBar scrollBar;

public FlatScrollBarScroller(FlatScrollBar scrollBar) {
this.scrollBar = scrollBar;
}

@Override
public FlatScrollBar getUnderlying() {
return scrollBar;
}

@Override
public boolean isDisposed() {
return scrollBar.isDisposed();
}

@Override
public void addListener(int eventType, Listener listener) {
scrollBar.addListener(eventType, listener);
}

@Override
public void removeListener(int eventType, Listener listener) {
scrollBar.removeListener(eventType, listener);
}

@Override
public int getSelection() {
return scrollBar.getSelection();
}

@Override
public void setSelection(int value) {
scrollBar.setSelection(value);
}

@Override
public int getMaximum() {
return scrollBar.getMaximum();
}

@Override
public void setMaximum(int value) {
scrollBar.setMaximum(value);
}

@Override
public int getPageIncrement() {
return scrollBar.getPageIncrement();
}

@Override
public void setPageIncrement(int value) {
scrollBar.setPageIncrement(value);
}

@Override
public int getThumb() {
return scrollBar.getThumb();
}

@Override
public void setThumb(int value) {
scrollBar.setThumb(value);
}

@Override
public int getIncrement() {
return scrollBar.getIncrement();
}

@Override
public void setIncrement(int value) {
scrollBar.setIncrement(value);
}

@Override
public boolean getEnabled() {
return scrollBar.getEnabled();
}

@Override
public void setEnabled(boolean b) {
scrollBar.setEnabled(b);
}

@Override
public boolean getVisible() {
return scrollBar.getVisible();
}

@Override
public void setVisible(boolean b) {
scrollBar.setVisible(b);
}

}
[/code]

Using the above FlatScrollBarScroller, the created FlatScrollBar instances can be set to the ViewportLayer.

As the layout would otherwise always reserve the space for the scroller via the GridData instances above, we need to register a listener that hides the wrapper Composites of the FlatScrollBar instances in case the FlatScrollBar is hidden, and a listener that shows the Composites again in case the FlatScrollBar becomes visible again. This is done by setting a GridData with a matching exclude flag.

[code source=”java”]
// create the vertical scroller
FlatScrollBarScroller verticalScroller =
new FlatScrollBarScroller(vertical);

// register the hide/show listener
verticalScroller.addListener(SWT.Hide, new Listener() {
@Override
public void handleEvent(Event event) {
GridDataFactory
.createFrom(verticalData)
.exclude(true)
.applyTo(verticalComposite);
GridDataFactory
.createFrom(horizontalData)
.span(2, 1)
.applyTo(horizontalComposite);
}
});
verticalScroller.addListener(SWT.Show, new Listener() {
@Override
public void handleEvent(Event event) {
verticalComposite.setLayoutData(verticalData);
horizontalComposite.setLayoutData(horizontalData);
}
});

// create the horizontal scroller
FlatScrollBarScroller horizontalScroller =
new FlatScrollBarScroller(horizontal);

// register the hide/show listener
horizontalScroller.addListener(SWT.Hide, new Listener() {
@Override
public void handleEvent(Event event) {
GridDataFactory
.createFrom(verticalData)
.span(1, 2)
.applyTo(verticalComposite);
GridDataFactory
.createFrom(horizontalData)
.exclude(true)
.applyTo(horizontalComposite);
}
});
horizontalScroller.addListener(SWT.Show, new Listener() {
@Override
public void handleEvent(Event event) {
verticalComposite.setLayoutData(verticalData);
horizontalComposite.setLayoutData(horizontalData);
}
});

// set the custom IScroller to the ViewportLayer
viewportLayer.setVerticalScroller(verticalScroller);
viewportLayer.setHorizontalScroller(horizontalScroller);
[/code]

The last part is to set the style information to the NatTable and the FlatScrollBar instances.

[code source=”java”]
// set a dark background to the wrapper container
container.setBackground(GUIHelper.COLOR_BLACK);

// set a dark styling to the scrollbars
vertical.setBackground(GUIHelper.COLOR_BLACK);
vertical.setPageIncrementColor(GUIHelper.COLOR_BLACK);
vertical.setThumbColor(GUIHelper.COLOR_DARK_GRAY);

horizontal.setBackground(GUIHelper.COLOR_BLACK);
horizontal.setPageIncrementColor(GUIHelper.COLOR_BLACK);
horizontal.setThumbColor(GUIHelper.COLOR_DARK_GRAY);

// set a dark styling to NatTable
natTable.setBackground(GUIHelper.COLOR_BLACK);
natTable.setTheme(new DarkNatTableThemeConfiguration());
[/code]

Following the steps described above, it is possible to create a completely dark-themed NatTable using custom scrollbars, as shown in the picture below.

NatTable_dark_custom_scrollbars

At the time of writing this blog post, there is no wrapper or adapter implementation in NatTable for creating a NatTable with custom scrollbars. But it might be added in the future, based on the above explanations.

The full example code is available here.

Posted in Dirk Fauth, Eclipse

Preferences Spy for Eclipse IDE and RCP

Motivation

Figuring out which preference is changed when editing a certain preference in the user interface (i.e. the Preference dialog) may be difficult.

Let the Preference Spy help you with this issue.

Preferences in Eclipse

The preferences in Eclipse are stored hierarchically. There is one root node, and the other preferences are usually stored per plug-in, which may be identified by the nodepath.

The nodepath usually contains the Bundle-SymbolicName of the plug-in which defines a certain preference, but you may also define a custom nodepath.
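
For illustration, this is how a nodepath maps to a preference node via the Eclipse preferences API; a minimal sketch, where the node "org.eclipse.ui.editors" and the key "lineNumberRuler" are only examples:

[code language="java"]
import org.eclipse.core.runtime.preferences.IEclipsePreferences;
import org.eclipse.core.runtime.preferences.InstanceScope;

public class PreferenceAccessExample {

    // the nodepath is typically the Bundle-SymbolicName of the defining plug-in
    public boolean isLineNumberRulerEnabled() {
        IEclipsePreferences node =
                InstanceScope.INSTANCE.getNode("org.eclipse.ui.editors");
        return node.getBoolean("lineNumberRuler", false);
    }
}
[/code]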

In the following image the preferences are shown hierarchically. At the top of the tree you can see recently changed preferences in bold font, similar to how recently changed values are highlighted in the about:config of the Firefox web browser. The non-bold preferences are shown by pressing the second toolbar button, which is used to show all currently set preferences.

Explanation of the toolbar buttons of the spy:

  1. Toggle between hierarchical and flat layout
  2. Show all preferences with at least one value
  3. Toggle tracing of recently changed preferences
  4. Expand or collapse all preferences in case hierarchical layout is chosen
  5. Remove certain or all entries from the preference spy part

Events fired by the Preference Spy

The PreferenceSpyEventTopics class contains several topics which are consumed by the Preference Spy. Especially the “TOPIC_PREFERENCESPY/PREFERENCE/CHANGED” topic may also be interesting for other developers, since it notifies you if any preference is changed. You may receive this event like this:

[code language=”java”]
@Inject
@Optional
public void preferenceChanged(
        @UIEventTopic(PreferenceSpyEventTopics.PREFERENCESPY_PREFERENCE_CHANGED) PreferenceChangeEvent event) {

    // print new property value to console
    System.out.println(String.valueOf(event.getNewValue()));
}
[/code]

 Further Plans

First of all, I would like to mention that the project is hosted on GitHub [1] and we really appreciate feedback.

You can also get the preference spy from the following update site [2] and install it directly. The newest version will always be available on this update site.

Another aim of this project is for the Preference Spy to become an official spy of the E4 tools project [3].

[1] https://github.com/vogellacompany/PreferencesSpy

[2] http://dl.bintray.com/vogellacompany/eclipse-preference-spy/

[3] https://git.eclipse.org/r/#/admin/projects/e4/org.eclipse.e4.tools

Posted in Simon Scholz