The “Unreal Development Kit” (UDK) is the editor for Epic’s Unreal game engine, the software behind many first-person games. The company has released the full version of the editor for use by developers and various other interested parties. It is free for non-commercial applications and must be licensed when used for generating income.
For real-world design: I recently discovered UDK while working on a yacht project, where interior views and angles were more important than just looking from the outside in, and I wanted to convey the physical feeling of the space without having to go to a full-size mock-up. Geometry was imported from Rhino (through some connecting programs) into the UDK editor to build an application (a stand-alone game) allowing walk-arounds and walk-throughs of the model.
What this allows is real-time walk-throughs of a large-scale design: very engaging, very “virtually physical”. Turn right or left, look up or down, re-examine, etc.
My question is whether anyone else is using this tool for design visualization, or whether there are similar tools available. Although the UDK is not friendly to use coming from a traditional CAD background, the resulting surfaces, materials, and lighting are at the level of modern game environments, which these days is nice and high. It seems a perfect fit for architects, or for designing large-scale objects that involve human interaction.
Maybe this also belongs in the Star Trek thread.
The Rift opens up (pun intended) some great remote viewing design possibilities for development and collaboration.
Yacht design, tradeshow planning, retail layout, design object presented in environment.
Time to start the overlap between design studios and game developers.
Very interesting idea. Does it take tons of extra work to get something useful out of the kit, or do you feel it is something that could gradually integrate into your workflow as you get more familiar with it? Too bad I’m not working on anything large-scale enough to justify trying this out. I guess I’ll just go back to playing video games.
I have used Unity 3D for a similar application. We have also built some augmented reality apps through the program. I believe the company used iPads with ‘markers’ on the table so that you could rotate around the products and see all the views. One of the problems we had was texture baking and lighting confusion due to static reflections across the object. In the end I had to light-map the object with a blend of baked-in texture and the program’s lighting and material system.
We had a Rift here last week and it was something new and refreshing to see that sort of development. I can see architects and space designers using this to gain a better understanding of the area around them. It will be interesting to see what Facebook does with it. Apparently Sony also has one in the works for PS4.
Unity is a much more accessible tool; I’m using it as we speak for some tech demos with the Rift.
There is a lot of aftermarket community support for Unity; there is actually a great set of more “ID”-friendly shaders that let you use HDR lighting for more realistic reflections and effects.
Crazybump is another great tool for generating normal maps quickly from 2D bump maps.
The new Rift DK2 should be out in a few months, which is exciting since it’s clear how big a leap it will be, with higher resolution, reduced motion blur, and proper head tracking.
I will look into Unity the next time such a project arises. The demo material never particularly impressed me when I looked years ago. But there certainly is nothing user-friendly about the UDK from a Rhino user’s perspective. The final built environment, however, allowed us to tromp through knee-deep water and look at the outside of the yacht, as well as board the deck and walk around. And it was full of eye candy and was a fairly immersive experiment. Years ago we used a Quake engine to model a snowboard factory pre-build, also an impressive tool for the time.
This is cool. I would like to see standard virtual environments set up where you could hold a real-world proxy object in hand, rotate it around, and look at the virtual design object in situ. Hand rotation vs. head rotation for object vs. environment.
It is not an easy jump; the tools are specialized and have little in common with CAD and the usual workflow. I have not tried the Unity editor or UDK 4 yet. I personally imagine handing off the 3D model to a specialist for the virtual environment and cinematics building. It took a deep rabbit-hole dive to get the yacht simulated.
Unity is much friendlier, and they’ve made a whole lot of visual improvements since the older versions; version 5 should be out fairly soon.
Ultimately it’s a lot more work to get your assets looking killer than it is to have the engine render them. UDK is great for teams who have 15 different guys just to do the sculpting for normal maps and get everything to a super tight level of polish.
It’s almost an Alias vs. Rhino kind of thing. Unity has a much softer learning curve and can still get you 90% of the way there, even if some of its ultra-high-end lighting features aren’t quite up to par. I’ve had good luck with CAD translation as well, using Maya as a middle man to export FBX files. It’s always nifty to drop one of your products into 3D at 100x scale and walk around on it like an amusement park ride.
For virtual object manipulation, I’m waiting to break down a Razer Hydra (similar to the Wii Remote) and build some new housings, because it gives you an object that is accurately tracked (rotation, angle, and position) via accelerometers/gyros/magnetic tracking. You could almost break out the PCB and stick it inside any dummy 3D-printed object and have the complete tactile sensation of using the real product.
Looks like a great platform, like the idea of embedding it into a printed proxy.
I’ve played with IMUs for jetpack control systems. The response from hand tilt or turn to on-screen movement is super fast. You don’t get absolute positioning, but the 9 degrees of measurement are good for instantaneous on-screen feedback.
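For anyone curious how that kind of 9-DOF feedback typically works under the hood: a common trick is a complementary filter that blends fast-but-drifting gyro integration with noisy-but-stable accelerometer tilt. This is a minimal illustrative sketch in Python, not code from any of the products mentioned; the alpha value and sensor readings are made up for the example.

```python
import math

def accel_tilt(ax, az):
    """Tilt angle (radians) about one axis, derived from the
    accelerometer's gravity components."""
    return math.atan2(ax, az)

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend gyro integration (fast, drifts over time) with the
    accelerometer tilt (noisy, but stable long-term).
    alpha weights the gyro term; (1 - alpha) pulls toward accel."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Example: device held still and level, so the accelerometer reads
# gravity straight down (tilt = 0). Start from a wrong estimate and
# watch the filter pull it back toward the accelerometer reading.
angle = 0.5  # deliberately bad initial estimate, radians
for _ in range(200):
    angle = complementary_filter(angle,
                                 gyro_rate=0.0,
                                 accel_angle=accel_tilt(0.0, 9.81),
                                 dt=0.01)
```

The gyro term is why the response feels instantaneous, while the accelerometer term quietly corrects the drift in the background.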
Someone also wrote an Arduino plugin for Unity. I haven’t had a chance to try it yet, but that could be a pretty awesome way of achieving some crazy augmented/VR experiences with real-world feedback, e.g. spinning a fan when the in-game wind is blowing, or any kind of crazy servo-mounted contraption.
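The fan idea boils down to mapping an in-engine value to a command a microcontroller can act on. This is a hypothetical sketch of that mapping; the one-byte frame format and the 0xFF header are invented for illustration, not part of any actual Arduino-Unity plugin.

```python
def encode_fan_command(wind_speed):
    """Map a normalized in-engine wind speed (0.0 to 1.0) to a
    one-byte PWM duty cycle, clamped, with a hypothetical frame
    header so the microcontroller can find message boundaries."""
    duty = max(0, min(255, round(wind_speed * 255)))
    return bytes([0xFF, duty])  # 0xFF = made-up start-of-frame marker

# With a serial connection open (e.g. via pyserial), each frame
# would be sent to the board as the engine's wind value changes:
# ser.write(encode_fan_command(current_wind))
```

On the Arduino side the loop would read two bytes, check the header, and feed the duty value to analogWrite on the fan pin.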
I’m currently using Unreal Engine 4. It’s excellent. But any game engine is a whole new set of skills to learn; most ID folk aren’t going to be used to the idea of needing to “tell” the engine to give an object reflections, for example.
We had previously tried CryEngine 3, and the results were incredible, but it’s the shaders that made us switch to Unreal. The latest version has a BSDF-style system of material creation, which is closer to conventional rendering, so I can easily recreate materials I’ve already made in other programs. To me, CryEngine’s gaming-style system is much more difficult.