Considering rebuilding Cosmos Laundromat for VR



Message boards : General talk : Considering rebuilding Cosmos Laundromat for VR

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14029 - Posted: 8 Aug 2015, 10:48:06 UTC
Last modified: 8 Aug 2015, 10:48:36 UTC

Hi everyone!

After some practice and research making 180-degree stereoscopic content for Oculus Rift, Gear VR, HTC Vive and other head-mounted displays, I am considering rebuilding the newly released Cosmos Laundromat Pilot as a cinematic VR experience.
Having looked through a couple of the source .blends, I can see that it really is not an easy task. I will try to rebuild much of the scenery that lies outside the director's frame crop and add further detail where needed.
It will for sure take up a lot of rendering time too.

Having a truly great cinematic VR experience ready for the upcoming consumer HMD race will certainly benefit the reputation of Blender and the Gooseberry project.

Would you guys be interested in me starting to feed some work units to BURP for rendering?

Keep in mind that I am still evaluating whether it is even possible as I look through the source material. I will post some general hardware requirement estimates soon.

Profile noderaser
Project donor
Avatar
Send message
Joined: 28 Mar 06
Posts: 507
Credit: 1,549,461
RAC: 95
Message 14030 - Posted: 8 Aug 2015, 15:03:32 UTC

More work? Why would we want a steady flow of work?

:P

I'm always open to a challenge, and keeping up the activity around here is a great thing.

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14034 - Posted: 10 Aug 2015, 18:40:09 UTC
Last modified: 10 Aug 2015, 18:53:20 UTC

Yeah, one thing is for sure - it will bring a crazy amount of work to BURP to have this rendered.

The Pilot was just released (2048x858, 24 fps) on YouTube - you can find it here: https://youtu.be/Y-rmzh0PI3c

I have looked through a couple of the scenes, and from some test rendering I estimate most of the work units will need around 16 GB of memory. On my 8-core 4.3 GHz machine one frame takes about 9 hours (extrapolated from a 40-minute partial render) at 2048x2048 with 512 samples.
Work unit frames will probably be split up into 4 parts each.

The entire pilot has just about 12,000 frames altogether. For stereoscopic (left/right) video that is 24,000 frames.

That makes around 96,000 individual work units that would each take 2 hours to render with my system.

If I were to render this 24 hours a day by myself, it would take almost 22 years.



Approximately.









And that's without error-detection re-renders. I can probably optimize it somehow :S
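
Roughly, the arithmetic behind those numbers works out like this (a quick sanity-check script; the 2 hours per work unit is just my own estimate from the test render):

frames = 12000          # frames in the pilot
eyes = 2                # left + right for stereoscopic
parts = 4               # each frame split into 4 work-unit parts
hours_per_unit = 2      # estimated render time per part on my machine

work_units = frames * eyes * parts              # 96,000
total_hours = work_units * hours_per_unit       # 192,000
print(total_hours / (24 * 365))                 # ~21.9 years on a single machine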

Profile Janus
Volunteer moderator
Project administrator
Avatar
Send message
Joined: 16 Jun 04
Posts: 4478
Credit: 2,094,806
RAC: 0
Message 14040 - Posted: 12 Aug 2015, 14:11:49 UTC
Last modified: 12 Aug 2015, 14:20:27 UTC

Sounds like a pretty cool project! Open source rendering projects are always welcome here - especially the Blender Foundation ones.
16 GB is a lot, though. Most of the farm has less RAM than that =)

There are some "hidden" features on the farm that may be of use if you decide to re-render "Cosmos Laundromat - First Cycle":
1) Library linking. You can prepare an entire library of resources in one or more .zip files that will be unpacked on the client before rendering. These .zip files can then be re-used across multiple sessions. This feature is not yet officially available but can be enabled on request for very specific cases (like very big projects).
2) HDRI and multilayer EXR. Again, available only on request.
3) On-farm postprocessing, a.k.a. output linking. When a session is rendered with split-frame rendering, postprocessing usually cannot be done on the individual render nodes because they don't have the entire image. It is, however, possible to create a second session that runs after the first one and performs the postprocessing on the full images by reading the output of the first session. Of course not all sessions need to be split like that - some can run as full-frame, which is a lot easier.

None of these features has a web UI yet and they are pretty much completely undocumented, but we can probably figure something out. We may have to do a few test runs first to get the right setup.

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14043 - Posted: 13 Aug 2015, 11:55:19 UTC - in response to Message 14040.

Sounds interesting.

I have decided to try to remake Caminandes instead, since we are looking at about a twentieth (if not less) of the rendering time compared to the newly released pilot.
I would like to get both the color/render pass output and the Z-depth pass - could this be output in a multilayer EXR?
I have the first title scene ready to upload.

Profile Janus
Volunteer moderator
Project administrator
Avatar
Send message
Joined: 16 Jun 04
Posts: 4478
Credit: 2,094,806
RAC: 0
Message 14044 - Posted: 13 Aug 2015, 12:52:28 UTC
Last modified: 13 Aug 2015, 13:22:16 UTC

I would like to get both the color/render pass output and the Z-depth pass - could this be output in a multilayer EXR?

I think you can do that with just regular EXR by enabling "Z Buffer" under the output options and connecting the relevant pin in the compositing nodes. There's no way to select EXR as the output format during upload (and if you do nothing, the BURP system will render as PNG regardless of what was selected in the scene), but you can write in the description if you would like to try out the experimental EXR or multilayer EXR rendering.
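
If you would rather flip those switches from the Python console than the UI, it is roughly this (a sketch against the Blender 2.7x Python API; the render layer is assumed to have the default name "RenderLayer"):

import bpy

scene = bpy.context.scene

# Enable the Z pass on the render layer (default layer name assumed)
scene.render.layers["RenderLayer"].use_pass_z = True

# Output EXR with the Z buffer embedded
scene.render.image_settings.file_format = 'OPEN_EXR'  # or 'OPEN_EXR_MULTILAYER'
scene.render.image_settings.use_zbuffer = True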

As for the other stuff:
Well, I'll try to make this as general as possible so that it may be used in some of the documentation later. Let's start with the library feature since that is probably the most important thing:

For long productions with multiple scenes you typically have some resources that are re-used across all these scenes. This may be anything like textures, models, even entire sets. For Big Buck Bunny a lot of the stones and trees in the background were props that were re-used across many scenes and they were simply linked into those scenes from libraries.
A production typically organizes its content in directories - it could be like this:
./ (the project root directory)
+---./textures/
|      +---./textures/some-rocky-texture.png
|      +---./textures/some-other-texture.jpg
+---./models/
|      +---./models/a-rock.blend
|      +---./models/main-character.blend
+---./scenes/
|      +---./scenes/intro.blend
|      +---./scenes/scene1.blend
|      +---./scenes/scene2A.blend
+---./stuff-not-used-for-rendering/
       +---./stuff-not-used-for-rendering/post-it-note-about-cats.txt
       +---./stuff-not-used-for-rendering/sound-effects.wav

The "textures" and "models" directory contents are re-used across all the scenes in "scenes".

In order to use this together with BURP you simply put the "textures" and "models" directories into a .zip-file and upload it*. Notice that the "scenes" directory and the "stuff-not-used-for-rendering" directory were both left out of the .zip since they are not to be shared.
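
Just to illustrate the packing step, something like this would do it (plain Python, run from the project root; directory names taken from the example layout above):

import os
import zipfile

# Pack only the shared directories; ./scenes/ and
# ./stuff-not-used-for-rendering/ stay out of the library
shared_dirs = ["textures", "models"]

with zipfile.ZipFile("library.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for directory in shared_dirs:
        for root, _dirs, files in os.walk(directory):
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, arcname=path)  # keep the relative path inside the .zip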

Then, for every scene you want to render you do this:
1) Open the scene ./scenes/sceneX.blend in Blender
2a) BURP requires the rendered scene to reside in the project root and all libraries to be next to it or below it in the directory structure. So select File->External Data->Make All Paths Relative to prepare a move of the scene file.
2b) Use File->Save As to save the scene in the project root directory as ./sceneX.blend
Very important: Make sure that "Compress" and especially "Remap Relative" are selected under the "Save as Blender File" property panel in order to correctly locate the libraries later.
2c) Hit "Save" to move the file.
3) Upload the modified scene to BURP and attach the library you uploaded earlier to the session using the super nice and awesome online library browser that lists your recently uploaded libraries**

That's it really.
The scene file may contain textures, models and animations specific to just that one scene - as long as they are packed into the .blend when uploaded and no unresolved external references remain.
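
If you have a lot of scenes, steps 2a-2c can also be scripted; roughly like this (a sketch using the standard Blender operators - the output path is just a placeholder for your project root):

import bpy

# 2a) Make all external references relative (File->External Data->Make All Paths Relative)
bpy.ops.file.make_paths_relative()

# 2b/2c) Save into the project root; relative_remap matches "Remap Relative"
# and compress matches "Compress" in the Save As panel
bpy.ops.wm.save_as_mainfile(
    filepath="/path/to/project-root/sceneX.blend",  # placeholder path
    compress=True,
    relative_remap=True,
)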

When using libraries - and especially right now while this is a very experimental feature - please allow some extra time for session reviews, as every single file inside the library must be individually reviewed the first time the library is used.



* You can't actually do that through the website yet, so please just post it somewhere and I'll manually upload it.
** You can't do that either yet, just write in the description what library should be attached.

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14045 - Posted: 13 Aug 2015, 13:13:48 UTC - in response to Message 14044.
Last modified: 13 Aug 2015, 13:14:03 UTC

I understand that assets can be shared by different sessions, but I have only edited the things needed in the first intro sequence - I could probably get the whole project ready and then upload it for shared session material, but that would probably take a lot of time.

So far I have edited pretty much all of the used textures and skyboxes and extended the scene layouts to fit the 180-degree view, though I am not sure if my edits work in every scene of the entire animation.

I can take a look at it and see if I can make it happen.

Profile Janus
Volunteer moderator
Project administrator
Avatar
Send message
Joined: 16 Jun 04
Posts: 4478
Credit: 2,094,806
RAC: 0
Message 14046 - Posted: 13 Aug 2015, 13:59:12 UTC - in response to Message 14045.
Last modified: 13 Aug 2015, 14:19:06 UTC

but I have only edited the things needed in the first intro sequence

Very good point. It is sometimes the case that there are things which are used only in part of a production - like a character that dies half-way through a movie and no longer appears after that, or a set of assets that are only used rarely but could still be shared when they are. It also happens that a production is a work in progress and isn't finished at the time when the shared assets are to be prepared (like in your case).
That is why it is possible to attach not just one .zip-file but several. This way you can divide your project into different groups of assets - for instance those that you believe to be complete, those that are only used in scenes 8 through 14, those that are related to a specific place etc. etc. etc.
By including some, or all, of those libraries in a specific order as well as utilizing the assets bundled inside the .blend-files you can then control what the render nodes "see" as the complete asset tree.

For obvious reasons it would be really cool to have everything prepared in advance so that the best selection of what goes where could be made right away; but if you want to start your test now what I would recommend would be the following:
A) Build a library containing all the files that you know are used in the movie and you don't intend to modify. This could be stuff like the main characters' models, skeleton information, textures relating to things that are not affected by VR (eyes, fur, wood, that kind of stuff). Call it "base.zip" - you know in advance that this will eventually be re-used all over the place. Keep in mind that libraries can have files from many different directories - they will unpack to the same directory structure on the render nodes.
B) Build another library of things that you have changed for the intro scene but you expect are re-used in other scenes. Let's call it "vr-patch1.zip".
C) For all the stuff that is only used in the intro scene, pack it into the .blend.

Now with the combined files from (A), (B) and (C) we can render the intro scene and check that it all went fine (there's probably going to be some odd bug that needs fixing but, yeah).
Next week, when you've prepared scenes 1-8, you will have a better idea about what is shared, and you can then go back and check whether the stuff in vr-patch1.zip is still relevant.
- If it is all still good and you have nothing new to add, then fine, attach base.zip and vr-patch1.zip to those sessions.
- If you have something to add, then either use the .blends (for single-session stuff) or, if you know it will also be used later, add a vr-patch2.zip and attach base.zip, vr-patch1.zip and vr-patch2.zip to the sessions.
- If something changed that was in vr-patch1.zip you can either ditch that .zip and create an entirely new one OR you can overwrite single files from it by also including those files in a new library.

The last one probably requires a bit more explanation:
Let's say that you added painted 2D backgrounds to base.zip because you thought they would work for the entire production. A few weeks into the production you find out that the backgrounds simply are not wide enough. You can either start over with a new 800MB base2.zip that everyone has to download including all the models etc. and go through the review again or you can create a new "overlay", which is another .zip that gets unpacked on top of the old one including only the new backgrounds (and optionally other additional stuff) - they simply overwrite the old files where there is a name clash. That way people only have to download the new files (a few MB) to continue rendering.
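
In effect, what happens on the client is just this (a simplified sketch - the overlay file name is made up for the example, and the real unpacking is handled by the BURP client):

import zipfile

# Libraries are unpacked in the order they are attached; files in
# later .zips overwrite earlier ones whenever the names clash.
for library in ["base.zip", "backgrounds-overlay1.zip"]:
    with zipfile.ZipFile(library) as zf:
        zf.extractall(".")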

There's a limit to how many .zips you can attach to a single session, so if you are continuously changing things, then you may have to consolidate a lot of small .zips into one or two bigger ones. I think the limit is somewhere around 8-10 .zips.
In general, try to keep the number of zips down and try to re-use them across as many sessions as possible - preferably all of them.

It is completely up to you how you name or organize the .zips. For Big Buck Bunny 3D the libraries were split into types of content instead:
chars.zip
envs.zip
mattes.zip
props.zip
sets.zip
shared.zip
These were used for most of the sessions, plus an extra .zip for scenes in the same chapter that shared some additional assets specific to that chapter.
For sessions without mattes we could leave out mattes.zip, and likewise chars.zip for sessions without any characters in them.
Back then we didn't have to overlay the .zips since all the assets were static during production. If we suddenly had to change some of the mattes, we could have done a mattes-overlay1.zip.

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14047 - Posted: 13 Aug 2015, 15:53:08 UTC - in response to Message 14046.
Last modified: 13 Aug 2015, 15:53:45 UTC

It is completely up to you how you name or organize the .zips.


When I make resource libraries I put everything in shared .blend files, which I link to from my main scene setup. Does BURP support setups like this too?

Like koro.blend with all character resources such as textures and mesh data packed, and props.blend, etc.

I can start restructuring everything right away.

Profile Janus
Volunteer moderator
Project administrator
Avatar
Send message
Joined: 16 Jun 04
Posts: 4478
Credit: 2,094,806
RAC: 0
Message 14048 - Posted: 13 Aug 2015, 16:22:12 UTC - in response to Message 14047.
Last modified: 13 Aug 2015, 16:25:15 UTC

Whether or not it makes sense to put everything into .blend-files inside the .zips depends a lot on what you plan to change later. If everything is packed like that, you would have to replace the entire .blend-file with a new one even for a tiny change to a single texture.
You can actually do the same thing with unpacked data and keep the (now smaller) shared .blend around for easily tying it all together into a nice package.

I guess the answer to your question is: yes, you can put one or more .blend-files in the library .zips. Those .blend-files can have packed data or external data; it doesn't matter, but external data is more flexible to update later.
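
If you later want to turn a fully packed .blend into one that references external files (or the other way around), Blender can do that for you; a rough sketch with the standard operators:

import bpy

# Write all packed images/sounds/etc. out as files next to the .blend,
# so individual assets can later be replaced via an overlay .zip
bpy.ops.file.unpack_all(method='USE_LOCAL')

# ...or embed everything back into the .blend instead
# bpy.ops.file.pack_all()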

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14049 - Posted: 13 Aug 2015, 18:06:54 UTC

In case anyone is wondering, this is the animation I am currently working on getting on BURP: https://youtu.be/Z4C82eyhwgU

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14052 - Posted: 17 Aug 2015, 9:51:30 UTC - in response to Message 14049.
Last modified: 17 Aug 2015, 10:11:08 UTC

Blender 2.75 has gained some new stereoscopy features, with built-in stereoscopic support when rendering.

When could BURP support these features?

In this video (https://youtu.be/uxZLyoDgg4U?t=1m21s), the feature of outputting several render passes into individual files (and using them in the node editor) is used to make either one multilayer OpenEXR or two PNGs.

If possible for a future Caminandes render I would like to have the passes Color_R, Color_L, Zdepth_R and Zdepth_L.
The scenes are definitely lightweight enough that rendering all these passes would be possible in a single work unit (it would take around 1 hour on my system).

I will use the Zdepth passes for experimental foveated depth-of-field simulation.

Can BURP output several PNGs for one frame, or would OpenEXR be better for this?

Profile Janus
Volunteer moderator
Project administrator
Avatar
Send message
Joined: 16 Jun 04
Posts: 4478
Credit: 2,094,806
RAC: 0
Message 14053 - Posted: 17 Aug 2015, 11:30:35 UTC - in response to Message 14052.
Last modified: 17 Aug 2015, 12:55:41 UTC

No, BURP cannot officially output multiple files per frame at the moment, but it supports it unofficially and very experimentally through other means.

For BBB3D we used side-by-side or above/below stereo in the compositing steps. It works surprisingly well, since many third-party tools can view and work with that format too, but it may be a bit tricky to set up.*
You should, however, be able to use multilayer EXR with the standard colour layers (renaming them may confuse the server when showing previews, so try to keep at least one standard RGB(A)(Z) layer as the first layer and then X other layers - like for instance another RGB(A)(Z) layer for the other eye). Keep in mind that EXR will take a long time to download and work with since the files are very large. For BBB3D at some point we used around 35 TB of intermediate storage - even though that is not the actual file footprint (there was redundancy), it's still a lot of downloading even if you have a gigabit connection.

* Ok, apparently it is now super easy to set up using just the render output and camera settings. Same thing for native 3D support in EXR (multilayer not even required). Cool to see that these features are going mainstream like that.
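
For reference, the render-output side of that setup looks roughly like this from Python (a sketch against the Blender 2.75 multiview API; names may shift in later releases):

import bpy

render = bpy.context.scene.render

# Turn on the new multiview/stereoscopy pipeline (Blender 2.75+)
render.use_multiview = True
render.views_format = 'STEREO_3D'  # one left view + one right view

# Write both eyes side by side into a single image...
render.image_settings.views_format = 'STEREO_3D'
render.image_settings.stereo_3d_format.display_mode = 'SIDEBYSIDE'

# ...or keep them as separate per-eye files / EXR views instead
# render.image_settings.views_format = 'INDIVIDUAL'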

Profile mStuff
Send message
Joined: 29 Oct 09
Posts: 26
Credit: 67,721
RAC: 0
Message 14054 - Posted: 17 Aug 2015, 13:00:12 UTC - in response to Message 14053.
Last modified: 17 Aug 2015, 13:03:19 UTC

Oh wow, 35 TB - so that was uncompressed stereoscopic 4K, 5 channels (RGB-A-Z), per frame? Since I am just using 2048x2048 with 2x4 channels (RGB-Z) and the animation is only around 3,000 frames long, I hope I can keep the storage under 1 TB, since I don't have much more than that free on my 2 TB storage drive.
Meh, I have been thinking about getting a new 8 TB one anyway.
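
My rough storage arithmetic, in case anyone wants to check it (assuming uncompressed half-float EXR; a full-float Z channel or extra passes would push it up a bit):

width = height = 2048
channels = 4           # RGB + Z per eye
eyes = 2
frames = 3000
bytes_per_value = 2    # half-float, uncompressed

total_bytes = width * height * channels * bytes_per_value * eyes * frames
print(total_bytes / 1024**3)  # ~187.5 GiB, comfortably under 1 TB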

I am a bit puzzled - has this stereoscopic feature been part of Blender for long and only just been updated? I thought it was just an add-on for the longest time.

I gotta find some resources to read on the EXR format; I am a bit lost about what it supports - I've never used it before, and I'm not even sure if it opens in Photoshop or Premiere Pro :P


