Run an Eclipse 32-bit application from a 64-bit Eclipse IDE

Typically the development environment should not depend on the target environment the application will run on. For an Eclipse RCP application using SWT, this is not as trivial as it sounds, because the SWT implementation is packaged in platform-dependent bundle fragments. But it is possible to set up the workspace so that this works, which I will show in this blog post.

Use Case

You need to maintain an Eclipse RCP application that makes use of 32-bit Windows native libraries. Since your brand new laptop or PC runs a 64-bit Windows, you install the 64-bit version of Eclipse. Because the application has to be executed in a 32-bit JVM, you add a 32-bit JDK via Window -> Preferences -> Java -> Installed JREs and configure it as the default for the JavaSE-1.8 Execution Environment.

[Screenshot: Execution Environments preference page with the 32-bit JDK configured]

At development time you want to start your application from the IDE, e.g. via .product file -> Launch an Eclipse application. But you get the following error:

java.lang.UnsatisfiedLinkError: Cannot load 64-bit SWT libraries on 32-bit JVM

Solution

The reason for this is clear: you installed a 64-bit Eclipse, so only the 64-bit SWT bundle fragment is part of your installation. But you need the 32-bit SWT fragment. This can be solved easily by configuring the target platform appropriately.

  • Create a new Target Platform
  • Switch to the Environment tab in the PDE Target Editor
  • Change the Architecture to x86 (the resulting target definition is sketched after this list)
    [Screenshot: Environment tab of the PDE Target Editor]
  • Switch to the Definition tab
  • Click Reload (this is important to retrieve the x86 fragments!)
  • Switch to the Content tab and check if the correct fragment is now part of the target platform (check for the org.eclipse.swt.win32.win32.x86 fragment)
  • Activate the target platform via Set as Target Platform
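For reference, the environment section in the resulting .target file looks similar to the following sketch (the location entries are omitted, as they depend on your setup):

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<target name="win32-x86">
    <locations>
        <!-- software site / directory locations of your target -->
    </locations>
    <environment>
        <os>win32</os>
        <ws>win32</ws>
        <arch>x86</arch>
    </environment>
</target>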

Now it is possible to execute a 32-bit application from a 64-bit Eclipse IDE via .product file -> Launch an Eclipse application.

Note: Remember to start via the .product file and not via an existing run configuration, because an existing run configuration still contains the old environment settings and would need to be updated first.

Posted in Dirk Fauth, Eclipse | 2 Comments

OSGi – bundles / fragments / dependencies

In the last weeks I had to look at several issues regarding OSGi dependencies in different products. A lot of these issues were IMHO related to the incorrect usage of OSGi bundle fragments. As I had to search for various solutions, I will publish my results and my opinion on the usage of fragments in this post. Partly also as a reminder to myself for the future.

What is a fragment?

As explained in the OSGi Wiki, a fragment is a bundle that makes its contents available to another bundle. And most importantly, a fragment and its host bundle share the same classloader.

Looking at this from a more abstract point of view, a fragment is an extension to an existing bundle. This might be a simplified statement, but keeping it in mind helped me solve several issues.

What are fragments used for?

I have seen a lot of different usage scenarios for fragments. Considering the above statement, some of them were wrong by design. But before explaining when not to use fragments, let’s look at when they are the right choice. Basically fragments need to be used whenever a resource needs to be accessible by the classloader of the host bundle. There are several use cases for that, most of them relying on technologies and patterns that are based on standard Java. For example:

  • Add configuration files to a third-party-plugin
    e.g. provide the logging configuration (log4j.xml for the org.apache.log4j bundle)
  • Add new language files for a resource bundle
    e.g. a properties file for locale fr_FR that needs to be located next to the other properties files by specification
  • Add classes that need to be dynamically loaded by a framework
    e.g. provide a custom logging appender
  • Provide native code
    This can be done in several ways, but more on that shortly.

In short: fragments are used to customize a bundle

When are fragments the wrong choice?

To explain this we will look at the different ways to provide native code as an example.

One way is to use the Bundle-NativeCode manifest header. This way the native code for all supported environments is packaged in the same bundle. So no fragments here, but it is sometimes not easy to set up. At least I struggled with this approach some years ago.
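Just to give an idea, such a header could look like the following sketch (library names and paths are made up for illustration):

Bundle-NativeCode: lib/win32/x86/example.dll; osname=Win32; processor=x86,
 lib/linux/x86_64/libexample.so; osname=Linux; processor=x86-64

Each comma-separated clause lists the native libraries for one platform, and the framework selects the matching clause when the bundle is resolved.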

A more common approach is to use fragments. For every supported platform there is a corresponding fragment that contains the platform-specific native library. The host bundle, on the other side, typically contains the Java code that loads the native library and provides the interface to access it (e.g. via JNI). This scenario is IMHO a good example for using fragments to provide native code. The fragments only extend the host bundle without exposing anything public.

Another approach is the SWT approach. The difference to the above scenario is that the host bundle org.eclipse.swt is an almost empty bundle that only contains the OSGi meta-information in the MANIFEST.MF. The native libraries as well as the corresponding Java code are supplied via platform-dependent fragments. Although SWT is often referred to as the reference for dealing with native libraries in OSGi, I think that approach is wrong.

To elaborate why I think the approach org.eclipse.swt is using is wrong, we will have a look at a small example.

  1. Create a host bundle in Eclipse via File -> New -> Plug-in Project and name it org.fipro.host. Make sure not to create an Activator or anything else.
  2. Create a fragment for that host bundle via File -> New -> Other -> Plug-in Development -> Fragment Project and name it org.fipro.host.fragment. Specify the host bundle org.fipro.host on the second wizard page.
  3. Create the package org.fipro.host in the fragment project.
  4. Create the following simple class (yes, it has nothing to do with native code in fragments, but it is sufficient to show the issues).
    package org.fipro.host;
    
    public class MyHelper {
    	public static void doSomething() {
    		System.out.println("do something");
    	}
    }
    

So far, so good. Now let’s consume the helper class.

  1. Create a new bundle via File -> New -> Plug-in Project and name it org.fipro.consumer. This time let the wizard create an Activator.
  2. In Activator#start(BundleContext) try to call MyHelper#doSomething() (see the sketch after this list)
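A minimal sketch of the Activator could look like this (the PDE wizard generates the skeleton; only the call to MyHelper is added):

package org.fipro.consumer;

import org.fipro.host.MyHelper;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

	@Override
	public void start(BundleContext context) throws Exception {
		// call the helper class that is contributed via the fragment
		MyHelper.doSomething();
	}

	@Override
	public void stop(BundleContext context) throws Exception {
	}
}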

Now the fun begins. Of course MyHelper cannot be resolved at this time. We first need to make the package consumable in OSGi. This can be done in the fragment or in the host bundle. I personally tend to configure Export-Package in the bundle/fragment where the package is located, so we add the Export-Package manifest header to the fragment. To do this open the file org.fipro.host.fragment/META-INF/MANIFEST.MF, switch to the Runtime tab and click Add… to add the package org.fipro.host.
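The MANIFEST.MF of the fragment then contains something like the following sketch (names and versions may differ in your workspace):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Host Fragment
Bundle-SymbolicName: org.fipro.host.fragment
Bundle-Version: 1.0.0.qualifier
Fragment-Host: org.fipro.host
Export-Package: org.fipro.host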

Note: As a fragment is an extension to a bundle, you can also specify the Export-Package header for org.fipro.host in the host bundle org.fipro.host. org.eclipse.swt is configured this way. But notice that the fragment packages are not automatically resolved using the PDE Manifest Editor and you need to add the manifest header manually.

After that the package org.fipro.host can be consumed by other bundles. Open the file org.fipro.consumer/META-INF/MANIFEST.MF and switch to the Dependencies tab. At this point it doesn’t matter whether you use Required Plug-ins or Imported Packages, although Import-Package should always be the preferred way, as we will see shortly.

Although the manifest headers are configured correctly, the MyHelper class cannot be resolved. The reason for this is the PDE tooling: it needs additional information to construct proper classpaths for building. This can be done by adding the following line to the manifest file of the host bundle org.fipro.host:

Eclipse-ExtensibleAPI: true

After this additional header is added, the compilation errors are gone.

Note: This additional manifest header is not necessary and not used at runtime. At runtime a fragment is always allowed to add additional packages, classes and resources to the API of the host.

After the compilation errors are gone in our workspace and the application runs fine, let’s try to build it using Maven Tycho. I don’t want to walk through the whole process of setting up a Tycho build, so let’s simply assume you have a running Tycho build and include the three projects in that build. Using POM-less Tycho this simply means adding the three projects to the modules section of the build, as sketched below.
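Assuming the three example projects are located directly next to the aggregator POM, the corresponding modules section could look like the following sketch (adjust the relative paths to your actual layout):

<modules>
    <module>org.fipro.host</module>
    <module>org.fipro.host.fragment</module>
    <module>org.fipro.consumer</module>
</modules>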

You can find further information on Tycho here:
Eclipse Tycho for building Eclipse Plug-ins and RCP applications
POM-less Tycho builds for structured environments

Running the build will fail with a compilation failure. The Activator class does not compile because the import org.fipro.host cannot be resolved. Similar to PDE, Tycho is not aware of the build dependency to the fragment. This can be solved by adding an extra. entry to the build.properties of the org.fipro.consumer project.

extra.. = platform:/fragment/org.fipro.host.fragment

See the Plug-in Development Environment Guide for further information about build configuration.

After that entry was added to the build.properties of the consumer bundle, the Tycho build succeeds as well.

What is wrong with the above?

At first sight it is quite obvious what is wrong with the above solution. You need to configure the tooling in several places to make the compilation and the build work. These workarounds even introduce dependencies where there shouldn’t be any. In the above example this might not be a big issue, but think about platform-dependent fragments. Do you really want to configure a build dependency to a win32.win32.x86 fragment on the consumer side?

The above scenario even introduces issues for installations with p2. Using the empty host with implementations in the fragments forces you to ensure that at least (or exactly) one fragment is installed together with the host, which requires yet another workaround in my opinion (see Bug 361901 for further information).

OSGi purists will say that the main issue is located in the PDE tooling and Tycho, because there the build dependencies are kept as close as possible to the runtime dependencies (see for example here), and with tools like Bndtools you don’t need these workarounds. In the first place I agree with that. But unfortunately it is not possible (or at least hard to achieve) to use Bndtools for Eclipse application development, mainly because plain OSGi does not know about Eclipse features, applications and products. Therefore the feature-based update mechanism of p2 is not usable either. But I don’t want to start the PDE vs. Bndtools discussion; that is worth another post (or series of posts).

In my opinion the real issue in the above scenario, and therefore also in org.eclipse.swt, is the wrong usage of fragments. Why is there a host bundle that only contains the OSGi meta-information? After thinking about this for a while, I realized that the only reason can be laziness! Users want to use Require-Bundle instead of configuring the several needed Import-Package entries. IMHO this is the only reason that the org.eclipse.swt bundle with its multiple platform-dependent fragments exists.

Let’s try to think about possible changes. Make every platform-dependent fragment a bundle and configure the Export-Package manifest header for every bundle. That’s it on the provider side. If you wonder about the Eclipse-PlatformFilter manifest header, that works for bundles as well as for fragments, so we don’t lose anything here. On the consumer side we need to ensure that Import-Package is used instead of Require-Bundle. This way we declare dependencies on the functionality, not on the bundle where the functionality originated. That’s all! Using this approach, the workarounds mentioned above can be removed, and PDE and Tycho work as intended, as they can simply resolve bundle dependencies. I have to admit that I’m not sure about p2 regarding the platform-dependent bundles; that would need to be checked separately.
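To illustrate the idea with a made-up sketch, the former fragment becomes a platform-dependent bundle with a manifest like this:

Bundle-SymbolicName: org.fipro.host.win32.win32.x86
Bundle-Version: 1.0.0.qualifier
Eclipse-PlatformFilter: (& (osgi.ws=win32) (osgi.os=win32) (osgi.arch=x86))
Export-Package: org.fipro.host

The consumer only declares

Import-Package: org.fipro.host

and therefore resolves against whichever bundle exports that package for the current platform.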

Conclusion

Having a look at the two initial statements about fragments

  • a fragment is an extension to an existing bundle
  • fragments are used to customize a bundle

it is IMHO wrong to make API publicly available from a fragment. These statements could even be combined into the following:

  • a fragment is an optional extension to an existing bundle

Having that statement in mind, things get even clearer when thinking about fragments. Here is another example to strengthen my statement. Suppose you have a host bundle that already exports a package org.fipro.host. Now you have a fragment that adds an additional public class to that package, and a consumer bundle uses that class. Using Bndtools or the workarounds for PDE and Tycho shown above, this should compile and build fine. But what if the fragment is not deployed (and therefore not attached to the host) at runtime? Since there is no constraint for the consumer bundle that would identify the missing fragment, the consumer bundle would still resolve and start, and you will get a ClassNotFoundException at runtime.

Personally I think that every time a direct dependency to a fragment is introduced, there is something wrong.

There might be exceptions to that rule. One could be a custom logging appender that needs to be accessible in other places, e.g. for programmatic configuration. As the logging appender needs to be in the same classloader as the logging framework (e.g. org.apache.log4j), it needs to be provided via a fragment. And to access it programmatically, a direct dependency to the fragment is needed. But honestly, even in such a case a direct dependency to the fragment can be avoided with a good module design. Such a design could be, for example, to make the appender an OSGi service: the service interface would be defined in a separate API bundle and the programmatic access would be implemented against the service interface. Therefore no direct dependency to the fragment would be necessary.
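To sketch such a design (all names below are made up for illustration, and how the appender implementation gets registered as a service depends on the concrete setup): the service interface lives in a separate API bundle, and consumers program only against that interface, e.g. via a ServiceTracker.

// API bundle org.fipro.logging.api
package org.fipro.logging.api;

public interface AppenderConfigurer {
	void setThreshold(String level);
}

// consumer bundle, no dependency to the fragment needed
package org.fipro.consumer.logging;

import org.fipro.logging.api.AppenderConfigurer;
import org.osgi.framework.BundleContext;
import org.osgi.util.tracker.ServiceTracker;

public class LoggingConfigurator {

	public void configure(BundleContext context) {
		ServiceTracker<AppenderConfigurer, AppenderConfigurer> tracker =
				new ServiceTracker<>(context, AppenderConfigurer.class, null);
		tracker.open();
		AppenderConfigurer configurer = tracker.getService();
		if (configurer != null) {
			// programmatic configuration against the service interface only
			configurer.setThreshold("DEBUG");
		}
		tracker.close();
	}
}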

As I struggled for several days searching for solutions to fragment dependency issues, I hope this post can help others solve such issues. Basically my solution is to get rid of all fragments that export API, and to either make them separate bundles or let them provide their API via services.

If someone with a deeper knowledge of OSGi ever comes across this post and has some comments or remarks about my statements, please let me know. I’m always happy to learn something new or to get new insights.

Posted in Dirk Fauth, Eclipse, OSGi | Comments Off on OSGi – bundles / fragments / dependencies

Substring code completion in Eclipse JDT

As of yesterday’s Eclipse 4.6 integration build, we offer substring code completion by default in Eclipse JDT.

[Screenshot: substring code completion]

This brings a feature known from the IntelliJ IDE and the Eclipse Code Recommenders project to JDT users and helps to further enhance the Java Development Tools in Eclipse.

This feature was originally developed within a Google Summer of Code project by Gábor Kövesdán, with Noopur Gupta and myself as mentors. After the project finished, the JDT team polished this development quite a bit and activated it yesterday, including improved highlighting in the code proposals.

You can find the latest and greatest integration build (I20160112-1800) on http://download.eclipse.org/eclipse/downloads/ if you want to try it out.

Posted in Eclipse, Lars Vogel | 5 Comments

POM-less Tycho builds for structured environments

With Tycho 0.24 POM-less Tycho builds were introduced. That approach uses convention over configuration to reduce the amount of redundant information needed to set up a Tycho build. In short, that means you don’t need to create and maintain pom.xml files for bundle, feature and test projects anymore, as the required information can be extracted from the already existing MANIFEST.MF or feature.xml.

Lars Vogel shows in his Tycho Tutorial a recommended folder structure that is also widely used in Eclipse projects.

[Screenshot: recommended folder structure]

The meaning of that folder structure is:

  • bundles
    contains all plug-in projects
  • features
    contains all feature projects
  • products
    contains all product projects
  • releng
    contains projects related to the release engineering, like

    • the project containing the parent POM
    • the aggregator project that contains the aggregator POM which defines the modules of a build and is also the starting point of a Tycho build
    • the target definition project that contains the target definition for the Tycho build
  • tests
    contains all test plug-in/fragment projects

This structure helps in organizing the project. But there is one convention for POM-less Tycho builds that does not work out-of-the-box with the given folder structure: “The parent pom for features and plugins must reside in the parent directory”. Knowing the Maven mechanics, this convention can also be satisfied easily by introducing some POM files that simply connect to the real parent POM. I call them POM-less parent POM files. These POM-less parents need to be put into the base directories bundles, features and tests. They do nothing else than specify the real parent POM of the project (which is itself located in a sub-directory of releng).

The following snippet shows a POM-less parent example for the bundles folder:

<project xmlns="http://maven.apache.org/POM/4.0.0" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
    http://maven.apache.org/maven-v4_0_0.xsd"> 

    <modelVersion>4.0.0</modelVersion>

    <artifactId>org.fipro.example.bundles</artifactId>
 
    <packaging>pom</packaging>

    <parent>
        <groupId>org.fipro.example</groupId>
        <artifactId>org.fipro.example.parent</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <relativePath>../releng/org.fipro.example.parent</relativePath>
    </parent>
</project>

For the features and tests folder you simply need to modify the artifactId accordingly.

Note that you don’t need to reference the POM-less parent POM files in the modules section of the aggregator POM.

Following the best practices regarding the folder structures of a project and the conventions for POM-less Tycho builds, you will have at least 7 pom.xml files in your project.

  • parent POM
    the main build configuration
  • aggregator POM
    the collection of modules to build
  • target-definition POM
    the eclipse target definition build
  • product POM
    the eclipse product build
  • POM-less parent POM for bundles
    the connection to the real parent POM for POM-less plug-in builds
  • POM-less parent POM for features
    the connection to the real parent POM for POM-less feature builds
  • POM-less parent POM for tests
    the connection to the real parent POM for POM-less test plug-in builds

Of course there will be more if you provide a multi-product environment or if you need to customize the build of a plug-in for example.

With the necessary Maven extension descriptor for enabling the POM-less Tycho build (see POM-less Tycho builds), the folder structure will look similar to the following screenshot:

[Screenshot: POM-less folder structure]
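For completeness, this extension descriptor is the file .mvn/extensions.xml in the root directory of the build, which for Tycho 0.24 looks similar to the following:

<?xml version="1.0" encoding="UTF-8"?>
<extensions>
    <extension>
        <groupId>org.eclipse.tycho.extras</groupId>
        <artifactId>tycho-pomless</artifactId>
        <version>0.24.0</version>
    </extension>
</extensions>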

I hope this blog post helps people set up Tycho builds for their products more easily using POM-less Tycho.

Posted in Dirk Fauth, Eclipse | 6 Comments

Make Retrofit ready for usage in OSGi

Retrofit is a really great library for accessing REST APIs. It is often used for Android apps, because it is really lightweight and easy to use.

I’d also love to use this library in my Eclipse 4 RCP applications, so let’s make use of Retrofit here as well.

So just download the Retrofit artifacts and use them. But wait! For Eclipse applications we need OSGi bundles rather than plain Java artifacts, and when looking at the MANIFEST.MF file of the Retrofit jar archive there is no OSGi bundle metadata.

Fortunately there are many tools out there to convert plain Java artifacts into OSGi bundles, e.g., p2-maven-plugin (Maven) or bnd-platform (Gradle).

Since I am involved in the Buildship development (Gradle tooling for Eclipse) and we now also offer Gradle trainings besides our Maven trainings, I chose the bnd-platform plugin for Gradle.

The build.gradle file then looks like this:

buildscript {
	repositories {
		mavenCentral()
	}
	dependencies {
		classpath 'org.standardout:bnd-platform:1.2.0'
	}
}

apply plugin: 'org.standardout.bnd-platform'

repositories {
	mavenCentral()
}

platform {
	bundle 'com.squareup.retrofit:retrofit:2.0.0-beta2'

	bundle 'com.squareup.retrofit:converter-gson:2.0.0-beta2'
}

When Gradle has been set up properly, the desired bundles can be converted with the bundles task from the org.standardout.bnd-platform plugin:

/retrofit-osgi-convert$ ./gradlew bundles

By running the bundles task, Retrofit, a JSON converter (in this case Gson) and the transitive dependencies are made available as converted OSGi bundles in the /retrofit-osgi-convert/build/plugins folder.

See the Gradle tutorial for further information.

When adding these converted bundles to the target platform of an Eclipse RCP application, they should usually work out of the box.

But …! After adding the Retrofit and Gson converter bundles as dependencies to my plug-in’s MANIFEST.MF file I still get compile errors. :-(

So what went wrong? Basically two things! The first problem is obvious: when looking into the generated MANIFEST.MF metadata of Retrofit, there is an import for the android.os package. This import was added automatically during the conversion. The readme of the bnd-platform plugin explains how to configure such imports.

The second thing is that Retrofit and its converter bundles have split packages, which is fine for plain Java projects, but not for OSGi bundles. So the split package problem also has to be resolved. See https://github.com/SimonScholz/retrofit-osgi#make-use-of-retrofit-in-osgi

Fortunately this can also be configured in the build.gradle file:

platform {

	// Convert the retrofit artifact to OSGi, make android.os optional and handle the split package problems in OSGi
	bundle('com.squareup.retrofit:retrofit:2.0.0-beta2'){
		bnd {
			optionalImport 'android.os'
			instruction 'Export-Package', 'retrofit;com.squareup.retrofit=split;mandatory:=com.squareup.retrofit, retrofit.http'
	    	}
	}

	// Convert the retrofit gson converter artifact to OSGi and handle the split package problems in OSGi
	bundle('com.squareup.retrofit:converter-gson:2.0.0-beta2') {
		bnd{
			instruction 'Require-Bundle', 'com.squareup.retrofit'
			instruction 'Export-Package', 'retrofit;com.squareup.retrofit.converter-gson=split;mandatory:=com.squareup.retrofit.converter-gson'
		}
	}

	// You can add other converters similar to the gson converter above...
}

The actual build.gradle file can be found on GitHub.

After resolving these problems, no compile errors are left. :-) But when running the application, a java.lang.IllegalAccessError: tried to access class retrofit.Utils from class retrofit.GsonResponseBodyConverter occurred.

The cause for this are the modifiers used in the retrofit.Utils class: the class itself is package private and so is its closeQuietly method. So even with the split package rules applied, these package private access rules prohibit the usage of the closeQuietly method from the converter bundles (Gson, Jackson etc.).

Now comes the part why I love open source so much. I checked out the Retrofit sources, made some changes, built Retrofit locally, tried my local OSGi version with my changes and finally provided a fix for this. See https://github.com/square/retrofit/pull/1266. Thanks a lot to @JakeWharton for merging my pull request that fast.

Retrofit and its Gson converter can already be obtained from Bintray as a p2 update site: https://dl.bintray.com/simon-scholz/retrofit-osgi/

For further information and a complete example please refer to https://github.com/SimonScholz/retrofit-osgi

This repository contains the conversion script for creating OSGi bundles from the Retrofit artifacts and a sample application that shows how to make use of Retrofit in an Eclipse 4 RCP application. Just clone the repository into an Eclipse workspace, activate the target platform from the retrofit-osgi-target project and start the product in the de.simonscholz.retrofit.product project.
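To give a rough idea of what the usage looks like, here is a minimal sketch based on the Retrofit 2 builder API (the REST interface and model types are made up for illustration, and package and method names differ slightly between the beta versions and the final 2.0 release; see the repository for the real code):

// hypothetical REST interface
public interface GitHubApi {
	@GET("users/{user}/repos")
	Call<List<Repository>> listRepos(@Path("user") String user);
}

// create the service via the Retrofit builder and the Gson converter
Retrofit retrofit = new Retrofit.Builder()
		.baseUrl("https://api.github.com/")
		.addConverterFactory(GsonConverterFactory.create())
		.build();

GitHubApi api = retrofit.create(GitHubApi.class);
List<Repository> repos = api.listRepos("SimonScholz").execute().body();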

[Screenshot: Retrofit and Eclipse 4 sample application]

Feedback is highly appreciated.

Happy retrofitting in your OSGi applications 😉

Posted in Eclipse, Java, OSGi, Simon Scholz | Comments Off on Make Retrofit ready for usage in OSGi

Contributing to the Eclipse IDE – Second edition available as paper book and also as free download

I’m happy to announce that I finished the second edition of the Contributing to the Eclipse IDE book.

[Photo: Contributing to the Eclipse IDE book, second edition]

To help with the contribution process, I decided to also release this edition as a free download.

Posted in Eclipse, Lars Vogel | Comments Off on Contributing to the Eclipse IDE – Second edition available as paper book and also as free download

Git statistics from Eclipse Platform.UI in September 2015

Last month our cleanup hero was Sopot! He deleted the 2.0 compatibility layer plug-in.

Developers with the most changesets
Lars Vogel 42 (40.4%)
Markus Keller 11 (10.6%)
Simon Scholz 8 (7.7%)
Dani Megert 6 (5.8%)
Brian de Alwis 6 (5.8%)
Dirk Fauth 5 (4.8%)
Jonas Helming 5 (4.8%)
Stefan Xenos 4 (3.8%)
Sopot Cela 3 (2.9%)
Matthias Becker 3 (2.9%)
Alexander Kurtakov 3 (2.9%)
Patrik Suzzi 2 (1.9%)
Sergey Prigogin 1 (1.0%)
Andrey Loskutov 1 (1.0%)
Dariusz Stefanowicz 1 (1.0%)
Christian Radspieler 1 (1.0%)
Christian Georgi 1 (1.0%)
Daniel Haftstein 1 (1.0%)

Developers with the most changed lines
Sopot Cela 9226 (40.0%)
Lars Vogel 5510 (23.9%)
Stefan Xenos 3266 (14.1%)
Markus Keller 1728 (7.5%)
Dirk Fauth 1107 (4.8%)
Simon Scholz 861 (3.7%)
Jonas Helming 504 (2.2%)
Brian de Alwis 439 (1.9%)
Alexander Kurtakov 228 (1.0%)
Matthias Becker 139 (0.6%)
Patrik Suzzi 37 (0.2%)
Andrey Loskutov 20 (0.1%)
Dani Megert 9 (0.0%)
Sergey Prigogin 6 (0.0%)
Christian Georgi 6 (0.0%)
Daniel Haftstein 2 (0.0%)
Dariusz Stefanowicz 1 (0.0%)
Christian Radspieler 1 (0.0%)

Developers with the most lines removed
Sopot Cela 9213 (43.6%)
Lars Vogel 2244 (10.6%)
Dirk Fauth 352 (1.7%)
Markus Keller 201 (1.0%)
Simon Scholz 171 (0.8%)
Alexander Kurtakov 80 (0.4%)
Daniel Haftstein 1 (0.0%)

Posted in Eclipse, Lars Vogel | 2 Comments

Eclipse Neon and Saneclipse adding full screen mode for Eclipse

In the Eclipse Neon build from yesterday you have the option to trigger a “Full Screen mode”. We added it simultaneously to Saneclipse. Saneclipse is a set of plug-ins to improve the user experience of the Eclipse IDE and can be installed into Eclipse Mars.

You can use the “Toggle Visibility of all Toolbars” command (via Quick Access) to hide all currently visible toolbars. Selecting the command again reveals these toolbars. This allows the developer to maximize the space available for editors and views. If you minimize a stack after you have selected this command, the minimized stack will be visible until you trigger the command to hide the toolbars again. This allows the developer to decide which minimized stacks are useful for them.

The following is a screenshot of the IDE with a maximized Java editor and several toolbars visible.

[Screenshot: Eclipse IDE with a maximized Java editor and several toolbars visible]

The next screenshot shows the same maximized editor but with hidden toolbars.

[Screenshot: Eclipse IDE with hidden toolbars]

The shortcut to trigger this is Ctrl+Alt+M on Windows or Linux and Command+Alt+M on Mac.

Please note that we do not hide the main menu; we think that would be too confusing for users who trigger this accidentally. This command will also soon be available via the Window menu.

Combined with the upcoming shortcuts for increasing and decreasing the font size of the current editor, this should also make Eclipse easier to use for people who use it for presentations.

Posted in Eclipse, Lars Vogel | 5 Comments

vogella GmbH joins the Eclipse Foundation

On behalf of vogella GmbH (http://www.vogella.com/company/) I’m happy to announce that we joined the Eclipse Foundation. While our employees have already been heavily involved with the Eclipse open source framework for several years, this step makes our company commitment official.

We work on almost all Eclipse Platform projects and we are happy to currently be the second strongest contributor to the Eclipse Platform.

[Screenshot: contribution statistics for the Eclipse Platform]

We are also leading the Eclipse Platform UI project, and it is great to see more and more new people coming in and helping with the project. Still, we remain very active in this project and it is great for us to see that we currently do more than 50% of the commits in Platform UI.

[Screenshot: contribution statistics for Platform UI]

Many of our customers are using the plug-in development tools, so we are also heavily involved in them.

[Screenshot: contribution statistics for PDE]

But of course that is not all. We believe that it is important to also help other projects to improve the Eclipse IDE experience, so we contribute to various projects. Our vogella GmbH company page at Eclipse lists the projects we are involved in; here is a current snapshot.

[Screenshot: overview of the projects the company contributes to]

We definitely love the Eclipse Platform and are very impressed with the skills in the Platform team, and we are looking forward to working closer together with the Eclipse Foundation. We also hope to continue and intensify our efforts in improving the Eclipse IDE and Platform experience, and we are looking for qualified developers located in Germany, full time or part time (students).

Posted in Eclipse, Lars Vogel | 3 Comments

Saneclipse now with keybindings to increase / decrease editor font size

Based on the work of Mickael Istria we added shortcuts for increasing and decreasing the editor font size to Saneclipse.

Install the latest version from https://dl.bintray.com/vogellacompany/saneclipse/ and enjoy Ctrl++ or Ctrl+- for changing the font size.

We hope to integrate Mickael’s change into Neon.

Posted in Eclipse, Lars Vogel | Comments Off on Saneclipse now with keybindings to increase / decrease editor font size