Noesis GUI Integration

Hi all,

I have a game built in Unity, UNION Spaceship Command, which uses Noesis GUI, a vector-based UI framework that is based on (and mostly compatible with) WPF.

You can see an example of a Noesis based UI in UNION here:

I am in the process of porting UNION to Xenko, but I haven’t had any success in integrating Noesis’ C# API into the engine. They have a MonoGame integration on GitHub, but it will not work with the Xenko binary release, as the SharpDX.Direct3D11 namespace and handle are not available.

I have a lot of experience with game systems and networking, but none with graphics programming.

I would really appreciate any help you could give with integrating Noesis.


Mark Aherne



We also use SharpDX internally, so I was wondering if simply referencing Xenko’s SharpDX.Direct3D11 solves your problem, or if more work is necessary.

Xenko’s SharpDX is available in %SiliconStudioXenkoDir%\GamePackages\Xenko.1.6.1-beta\Bin\Windows-Direct3D11\SharpDX.Direct3D11.dll

I suppose part of MonoGameNoesisGUIWrapper.cs needs to be rewritten for Xenko as well.
You can see how EmptyKeys (another external UI framework with Xenko integration) does it:

Let us know if you need more info/help!

Excellent, thanks for the info. I’ll let you know how I get on.

Also, please note that going for this approach will work only for D3D11 (or whatever graphics platform they support).
However, the benefit is that you can reuse the existing NoesisGUI SDK D3D11 renderer as-is to get started quickly, and it should work fine if your only target is the D3D11 API (Windows).

Ideally, adding a new NoesisGUI SDK renderer that directly uses Xenko’s cross-platform rendering API would be better, as it would work on all targets Xenko supports (i.e. mobile, OpenGL, etc.).

Thank you for the advice.

Noesis do support a wide variety of renderers, but they are currently rewriting their render architecture (to improve the API and make it more accessible). Any integration I do now will have to be rewritten in a few months when Noesis releases its next major version. At that point I will look at doing a more complete integration with Xenko (with multiple renderers).

I have the DX11 integration working now (thank you), so that will allow me to complete the port of UNION to Xenko and continue its development.

Happy to hear you could easily leverage the existing DX11 solution.

Please keep us posted if you hit any other problem!


We have just completed our first spaceships for UNION, and I’m really looking forward to getting them up and running in Xenko.

Here they are in a concept art piece (by Isaac Hannaford):


Looking forward to it too!
Please keep us posted with screenshots/progress!


I’ve run into an issue with Noesis on Xenko. Their renderer requires MultiSampling (MSAA), which is currently disabled in Xenko. Without MSAA the UIs don’t render correctly, as shown in this comparison:

On the top is with MSAA, bottom is without.

I have looked at the Xenko source but cannot find any way of enabling multisampling in the binaries (I am using the binaries). There is a GraphicsDeviceManager.PreferMultiSampling boolean, but it does not seem to be used anywhere in the source. Is it possible to enable multisampling, and if so, what is the cost/side effect of doing so (I assume it is disabled for a reason)?

I really appreciate any help you give me here.



You have two choices there:

  • Render your UI in an MSAA render target, and then blend it on top of your game. This is probably the recommended approach in most cases, as you won’t pay the MSAA price for your whole rendering. This should already be possible: simply pass your MSAA level when creating the render target texture.
  • Render your UI directly into the main frame buffer (you will need to wait for PreferMultiSampling to be properly forwarded; I am looking into it now).
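To illustrate the first option, creating the off-screen MSAA render target could look roughly like this. This is only a sketch: the exact TextureDescription members and overloads vary between Xenko versions, and the MultiSampleLevel/MSAALevel names are assumptions here, not a verified API.

```csharp
using SiliconStudio.Xenko.Graphics;

// Sketch: create an off-screen MSAA render target for the UI.
// MultiSampleLevel / MSAALevel are assumed member names and may
// differ in your Xenko version.
var desc = TextureDescription.New2D(
    1280, 720,
    PixelFormat.R8G8B8A8_UNorm,
    TextureFlags.RenderTarget | TextureFlags.ShaderResource);
desc.MultiSampleLevel = MSAALevel.X4; // 4x MSAA

var uiRenderTarget = Texture.New(GraphicsDevice, desc);
// Render the UI into uiRenderTarget, then blend it over the main target.
```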

The RasterizerState also needs to have multisampling enabled.
It is possible now, but a little bit hackish (using RootEffectRenderFeature.PostProcessPipelineState). We are looking into providing an easier entry point for that.

Additional note: if you run any kind of post effect after the UI, it means that your UI is not rendered into the swapchain backbuffer anyway. In this case, the first approach is your only choice.

When/where is the UI rendered?

Hi xen2,

Thank you for the response.

I have looked at the source code, trying to find out how to create a render target view. I found the RenderTargetView class, but in order to create one I need to use GraphicsDevice.NativeDevice and Texture.NativeResource, neither of which is accessible from the binaries (they are marked internal). Maybe I’m looking in the wrong place; could you guide me on how and where to create a render target?

At the moment the integration is very hacky. I pass the SharpDX D3D11 Device NativePointer to the Noesis library at initialisation, and then call its render function after Xenko’s Draw method has finished.
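Roughly, the hack looks like this. The NoesisWrapper calls are simplified stand-ins (not the actual Noesis API), and the reflection trick assumes NativeDevice is an internal property on GraphicsDevice, which may not hold in every version:

```csharp
using System.Reflection;
using SiliconStudio.Xenko.Engine;
using SiliconStudio.Xenko.Games;
using SiliconStudio.Xenko.Graphics;

public class UnionGame : Game
{
    protected override void Initialize()
    {
        base.Initialize();

        // GraphicsDevice.NativeDevice is internal in the binary release,
        // so pull it out via reflection for now (assumed member name).
        var nativeDevice = (SharpDX.Direct3D11.Device)typeof(GraphicsDevice)
            .GetProperty("NativeDevice", BindingFlags.Instance | BindingFlags.NonPublic)
            .GetValue(GraphicsDevice);

        // Stand-in for the actual Noesis initialisation call.
        NoesisWrapper.Init(nativeDevice.NativePointer);
    }

    protected override void Draw(GameTime gameTime)
    {
        base.Draw(gameTime);
        NoesisWrapper.Render(); // stand-in for the Noesis render call
    }
}
```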

There are two barriers to this integration. The first is that Noesis is closed source, so I have very little flexibility there; the second is my lack of experience/knowledge of graphics programming/DirectX. However, the Noesis devs have been helpful, and I am working on the second issue.

I already have a lot of my UIs built using Noesis, and it provides many features which aren’t available elsewhere (it has a robust implementation of the WPF standard, can render SVG data (all of my UIs are vector based), and uses XAML files, which allows my UIs to be easily extended/modded).

I appreciate your help, thank you.

I suppose Noesis currently renders to the main render target.

Fixes that will be included in 1.6.3 will allow you to enable MSAA on the main render target, so it should work for your case. Note that if you don’t use post effects, it will force most rendering to happen with MSAA as well.

We will check into exposing SharpDX interop, which would allow you to do things such as:

  • Create a SharpDX Texture and RenderTarget with MSAA
  • Set it as render target and run NoesisGUI
  • Resolve it (Copy MSAA to non-MSAA)
  • Render it on top of main render target
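The steps above could be sketched like this in SharpDX. This is a rough outline only: `device` and `context` stand for the native D3D11 device and immediate context (which the interop would expose), and the Noesis render call is a placeholder:

```csharp
using SharpDX.Direct3D11;
using SharpDX.DXGI;

// 1. Create an MSAA texture and render target view (4x MSAA assumed).
var msaaDesc = new Texture2DDescription
{
    Width = 1280,
    Height = 720,
    MipLevels = 1,
    ArraySize = 1,
    Format = Format.R8G8B8A8_UNorm,
    SampleDescription = new SampleDescription(4, 0), // count, quality
    Usage = ResourceUsage.Default,
    BindFlags = BindFlags.RenderTarget
};
var msaaTexture = new Texture2D(device, msaaDesc);
var msaaRtv = new RenderTargetView(device, msaaTexture);

// A non-MSAA texture to resolve into (shader-visible so it can be drawn later).
var resolvedDesc = msaaDesc; // struct copy
resolvedDesc.SampleDescription = new SampleDescription(1, 0);
resolvedDesc.BindFlags = BindFlags.ShaderResource;
var resolvedTexture = new Texture2D(device, resolvedDesc);

// 2. Bind the MSAA target and let Noesis render into it.
context.OutputMerger.SetRenderTargets(msaaRtv);
// ... NoesisGUI render call goes here ...

// 3. Resolve MSAA to the plain texture (destination first, as in native D3D11).
context.ResolveSubresource(resolvedTexture, 0, msaaTexture, 0, Format.R8G8B8A8_UNorm);

// 4. resolvedTexture can now be drawn as a full-screen quad over the main target.
```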

That is excellent, thank you.

How did you manage to integrate it into the engine? I would be very grateful for any response.