
I was wondering how easy it is to interface existing Java programs/GUIs with a game loop. Sorry if my question is a bit vague. I'll try to explain.

I am currently designing a simulation environment for a workstation. We have the actual programs running on the real computers (which have a Java GUI and run on Linux virtual machines). We also have 3D models of the room, chairs, monitors, etc., which we can render in Unreal Engine. Currently, I have hard-coded very simple monitor textures that represent the programs running on the computers, but ideally the player should have the same experience as being in the actual control room, meaning that he should be able to use the programs in the same, or almost the same, manner.

My question is whether it is possible to interface the I/O of the actual programs with the simulation. More specifically, whether it is possible to render the Java AWT output as textures for the monitors, as well as to capture the player's actions and feed them to the programs as input. The actual program code may run inside or outside the game loop (preferably outside, in its native virtual machine).
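To illustrate what I mean on the Java side, here is a minimal sketch (my own, not from the actual programs) of the "outside the engine" half: a Swing component is painted into an offscreen `BufferedImage` whose pixels could be shipped to the engine as a monitor texture, and a synthetic key event is dispatched to it the way translated player input might be. The `JTextField` is just a hypothetical stand-in for the real GUI.

```java
import java.awt.event.KeyEvent;
import java.awt.image.BufferedImage;
import javax.swing.JTextField;
import javax.swing.SwingUtilities;

public class OffscreenGui {
    public static void main(String[] args) throws Exception {
        // Offscreen painting works without a display server; Swing
        // only needs fonts here, not a real window.
        System.setProperty("java.awt.headless", "true");

        SwingUtilities.invokeAndWait(() -> {
            // Hypothetical stand-in for the real application's GUI.
            JTextField field = new JTextField();
            field.setSize(200, 30);

            // Inject a synthetic key event, the way an engine-side
            // bridge could translate the player's keyboard input.
            field.dispatchEvent(new KeyEvent(field, KeyEvent.KEY_TYPED,
                    System.currentTimeMillis(), 0,
                    KeyEvent.VK_UNDEFINED, 'A'));

            // Paint the component into an offscreen image; this pixel
            // buffer is what would be uploaded as the monitor texture.
            BufferedImage frame = new BufferedImage(200, 30,
                    BufferedImage.TYPE_INT_ARGB);
            field.paint(frame.createGraphics());

            System.out.println(frame.getWidth() + "x" + frame.getHeight()
                    + " text=" + field.getText());
        });
    }
}
```

The missing piece, and the part I am unsure about, is the engine-side bridge that would run this per frame and move the pixels and input events across.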

Can you please point me in the right direction, since I am only remotely familiar with game development? Thank you in advance!

Edit: Just to clarify one thing: running the program on an external machine (virtual or physical) and feeding that machine's display to a local texture is acceptable, but I was wondering whether it would be possible to pipe the output of Java's GUI library directly into the game loop (instead of going through the host machine). I suppose this has to be supported by the game engine (e.g. having support for an in-game desktop environment/window manager). Unreal Engine was just chosen for testing. Any engine recommendation is welcome.
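Whichever route the frames take (direct offscreen rendering or a capture of the external machine's display), I assume the engine side ultimately needs a flat pixel buffer to fill a dynamic texture with. A small sketch of that conversion, assuming the frame arrives as an ARGB `BufferedImage`; the helper name `toArgbPixels` is mine:

```java
import java.awt.image.BufferedImage;

public class TexturePixels {
    // Convert an ARGB BufferedImage into a tightly packed int[]
    // (one 0xAARRGGBB value per pixel, row-major) that an engine-side
    // plugin could copy into a GPU texture each frame.
    static int[] toArgbPixels(BufferedImage img) {
        int[] px = new int[img.getWidth() * img.getHeight()];
        img.getRGB(0, 0, img.getWidth(), img.getHeight(),
                px, 0, img.getWidth());
        return px;
    }

    public static void main(String[] args) {
        // Tiny 4x2 test frame with one opaque green pixel at (1, 0).
        BufferedImage img = new BufferedImage(4, 2,
                BufferedImage.TYPE_INT_ARGB);
        img.setRGB(1, 0, 0xFF00FF00);

        int[] px = toArgbPixels(img);
        System.out.println(px.length + " " + Integer.toHexString(px[1]));
    }
}
```

In Unreal this buffer would presumably end up in a dynamic `UTexture2D` updated each frame, but I have not verified that part.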

  • What you're looking for is to map those external screen video feeds to local textures on your models in Unreal, correct? While that is difficult, it is possible. My question to you is: why? What are you using this Unreal game for? Commented Jul 18, 2016 at 7:42
  • I think this is correct. The program shall be used with VR hardware for remote training of operators who do not have access to the room itself. Commented Jul 18, 2016 at 7:53
  • What activities do you want to train with VR? Is the screen capture absolutely vital to it? Commented Jul 18, 2016 at 7:54
  • No, it is not vital, but it will help the operators get used to the actual software during their training within the simulated environment. They are expected to be able to move around the room and perform various tasks using some input and/or mock-up devices. Edit: for example, imagine a guy with a keyboard and a VR headset typing commands on the keyboard; while staring at a blank wall, he sees the monitor with the commands being typed/executed within a particular application. Commented Jul 18, 2016 at 7:59
  • Perhaps, but there is no way you'll get a feed with low enough latency to capture video from real hardware, map that feed as a texture onto an object in Unreal, and actually be able to distinguish anything on the screen through VR hardware (even the best out there right now, like the Vive or Oculus Rift), let alone interact with that system. Commented Jul 18, 2016 at 8:02
