Writing Java2D BufferedImage Data to an OpenGL Texture Buffer

I am writing a HUD for a project, and because I want to do some complex drawing I want to use a powerful 2D rendering library (like Java2D). My idea is to draw into a BufferedImage whenever the UI needs to update, then take the pixel data from the BufferedImage and upload it into an OpenGL texture. I can then draw that texture on top of my game every frame.

Am I insane? -> yes. But is it really any crazier than drawing 2D stuff in OpenGL?

The part I specifically need help with is transferring the BufferedImage data into the texture buffer. I am using LWJGL.

AWT (and therefore Java2D) can interfere with LWJGL/GLFW when you use its non-headless features (such as creating a window/drawable/container), since both toolkits want to pump the window thread's message queue for window events.
Read this: http://forum.lwjgl.org/index.php?topic=6310.msg33650#msg33650
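One common mitigation (my suggestion, not something claimed in the linked thread) is to run AWT in headless mode: Java2D can still rasterize into an off-screen BufferedImage, but AWT never opens a display connection or creates windows, so GLFW keeps sole ownership of the native event queue. A minimal sketch (the class name is hypothetical):

```java
import java.awt.Graphics2D;
import java.awt.GraphicsEnvironment;
import java.awt.image.BufferedImage;

// Hypothetical setup class; the name is mine, not from the thread.
public class HeadlessJava2D {
    public static void main(String[] args) {
        // Must run before the first AWT class initializes its display
        // connection, so put it at the very top of main
        // (or pass -Djava.awt.headless=true on the command line).
        System.setProperty("java.awt.headless", "true");

        // Off-screen Java2D rendering still works in headless mode:
        BufferedImage img = new BufferedImage(256, 256, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = img.createGraphics();
        g.fillRect(0, 0, 128, 128);
        g.dispose();

        System.out.println(GraphicsEnvironment.isHeadless()); // prints: true
    }
}
```

Headless mode only forbids windows, dialogs, and other on-screen components; everything you need for a HUD (fonts, shapes, compositing into a BufferedImage) remains available.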

LWJGL 3 provides several alternatives, such as Nuklear for UI widgets/layout and NanoVG for vector drawing.
If you only need layout, LWJGL 3 also has bindings for Facebook's Yoga library, which implements Flexbox layout. For font/text rendering and image loading there is stb.

If you only want to look at one of them, it should definitely be NanoVG. It also supports text rendering (via stb) and image loading (also via stb), and competes feature-wise pretty well with Java2D.

Yeah, it is. OpenGL is used for 2D rendering all the time. GPUs can really only draw 2D stuff in the first place: your 3D triangles are projected to 2D triangles before rasterization, so it makes a lot of sense to use the GPU to accelerate 2D work as well.
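As for the actual transfer: the main work is repacking the BufferedImage's 0xAARRGGBB ints into the tightly packed RGBA byte layout that glTexImage2D expects with GL_RGBA / GL_UNSIGNED_BYTE. A minimal sketch, assuming a TYPE_INT_ARGB image (the class and method names here are mine, not from any library):

```java
import java.awt.image.BufferedImage;
import java.nio.ByteBuffer;

// Hypothetical helper: converts a BufferedImage into a direct RGBA
// byte buffer suitable for glTexImage2D(GL_RGBA, GL_UNSIGNED_BYTE).
public class TextureUpload {

    public static ByteBuffer toRgbaBuffer(BufferedImage image) {
        int w = image.getWidth();
        int h = image.getHeight();
        // LWJGL's GL functions require a direct NIO buffer.
        ByteBuffer buf = ByteBuffer.allocateDirect(w * h * 4);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int argb = image.getRGB(x, y);         // packed 0xAARRGGBB
                buf.put((byte) ((argb >> 16) & 0xFF)); // R
                buf.put((byte) ((argb >> 8)  & 0xFF)); // G
                buf.put((byte) ( argb        & 0xFF)); // B
                buf.put((byte) ((argb >> 24) & 0xFF)); // A
            }
        }
        buf.flip();
        return buf;
    }
}
```

With the buffer in hand, the upload on the GL thread is the usual LWJGL GL11 call: `glBindTexture(GL_TEXTURE_2D, tex)` followed by `glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf)` (or `glTexSubImage2D` when re-uploading a HUD of unchanged size each update). Note that per-pixel `getRGB` is slow; for a frequently updated HUD you would grab the whole `int[]` from the raster in one call instead.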