Full Disclosure: I did recently join this company. But before I came here, I was an industrial designer at a small consultancy in New Jersey. I joined the company because I saw a lot of potential in the software and the platform.
In short, it's free software that lets you upload your models and create photorealistic renders in your browser. The coolest thing about it, though, is what it lets you do when you want to share your work with others (clients, professors, colleagues). You can create a live link that allows a remote user to interact with your render - so rather than sending stills, you can send a hyperlink that lets the recipient interact with the scene live.
Check it out:
Click once to load the screen, then click the green picture window to start the render.
I hope this isn’t against the forum rules - I think this is relevant to the needs of designers here.
I like the idea, but the noise does not seem to clear up well and leaves JPEG-like artifacts. Is that just for testing purposes?
Most people in viz and ID already have up-to-date workstations they can use for rendering, but for more complex geometry, animations, materials, and scenes, I think this could come in handy if a lot of compute power can be used at a reasonable price.
What are your experiences with render software in general?
The longer you let the render sit, the more refined it gets - the noise will disappear over time.
The JPEG-like artifact is JPEG compression, which can be controlled in real time as a parameter in the scene. The image itself isn't really a JPEG, though - you can save it out as an uncompressed EXR or PNG, and of course as a JPEG as well.
A big benefit that I personally see is for students, who would be able to work from any machine, or professionals who want to share work with clients in an interactive manner.
I’m a bit of an odd duck when it comes to rendering in the ID field - I taught myself VFX first, so I use Luxology’s modo (in addition to Lagoa) for my rendering needs and bring a lot of that specialization into the field (normal mapping, specular maps, etc. are things I’m very familiar with). I’ve also used Cinema 4D and Keyshot/Hypershot.
Students and schools can definitely profit from this, though it should feel snappy and understandable, I think. But of course that’s easier said than done. I tested it a few months ago, and after some “getting the hang of it” it worked pretty well! It ray-traces really well!
I have looked into Lagoa a little bit out of curiosity. I was curious what sort of control you have over the settings. How is the environment controlled? Can you have image-based lighting? Given that the noise steadily goes away, is this an unbiased system? Are you using ray tracing/path tracing?
What sort of machines are being used for rendering? I noticed that you charge by the hour of render time, but that seems highly dependent upon the machines that are working on the image.
You have pretty solid control over settings in Lagoa, especially relating to materials - index of refraction, multi-layer materials, etc. are well implemented and exposed to the user. The camera works like a real-world camera, so you have that level of control over it (DOF, blades, blade rotation, zoom, etc.).
The environment is controlled via dome lights, as well as a physical sun and sky option. We do support image-based lighting (applied to the dome light). Yes, it is an unbiased system - we are a path-tracing engine. More information can be found on our FAQ page:
http://support.lagoa.com/faq/
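For anyone curious why the noise "clears up" the longer you leave it: that behaviour falls out of Monte Carlo averaging, which unbiased path tracers are built on. Here's a toy sketch (not Lagoa's actual code - the fixed "true brightness" of 0.5 and the uniform samples are made up purely for illustration):

```python
import random

def render_pixel(num_samples, seed=0):
    """Average many noisy one-sample estimates of a pixel's brightness.

    Each 'sample' stands in for one traced light path: individually
    noisy, but unbiased (its expected value is the true brightness,
    here 0.5). Averaging more of them shrinks the noise.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        total += rng.random()  # one noisy, unbiased path sample
    return total / num_samples

# Error shrinks roughly as 1/sqrt(N) - more render time, less grain.
for n in (10, 1_000, 100_000):
    print(n, abs(render_pixel(n) - 0.5))
```

The key property of an unbiased engine is that the estimate is centered on the true answer at every sample count; letting the render sit longer only reduces the variance, which is why the grain fades rather than converging to a subtly wrong image.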
We use CPU-based rendering - the machines are owned by a separate company that is a major player in the cloud hardware space.