Sunflower A.04.03.R

Description

The butterfly effect
This is the right eye view of the stereo video.
See Session 1194 for more information.

Message boards : Comments and discussion : 1195

Speedy

Joined: 25 May 06
Posts: 208
Credit: 676,104
RAC: 0
Message 10998 - Posted: 29 Jul 2011, 8:02:54 UTC

Out of interest, how fast is the computer that is used to do the encoding on sessions after they have been rendered?
Have a crunching good day
ID: 10998
DoctorNow
Project donor

Joined: 11 Apr 05
Posts: 403
Credit: 2,189,214
RAC: 7
Message 10999 - Posted: 29 Jul 2011, 16:34:45 UTC - in response to Message 10998.  

Yeah, looks like something got stuck here.
The left eye view "only" took about 8 hours for encoding; this one is already at 2 days...
Life is Science, and Science rules. To the universe and beyond
Proud member of BOINC@Heidelberg
My BOINC-Stats
ID: 10999
Janus
Volunteer moderator
Project administrator

Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 11000 - Posted: 29 Jul 2011, 19:47:31 UTC - in response to Message 10998.  

The computer doing the encoding is actually quite fast - and it's not one but two. Right now, however, they are mostly idling because I'm in the middle of switching to a new version of the BURP backend. The storage system has already been switched, and next up are things like the encoder service, the workunit queue, the validator, etc.

I'm trying to do this as transparently as possible, but it may cause some things to look odd for a little while over the next couple of days.
ID: 11000
Janus
Volunteer moderator
Project administrator

Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 11001 - Posted: 30 Jul 2011, 21:36:47 UTC - in response to Message 10999.  
Last modified: 30 Jul 2011, 21:54:37 UTC

The left eye view "only" took about 8 hours for encoding

Forgot to comment a bit on this. With the regular PNG sessions encoding is a simple and straightforward process. All we have to do is:
1) Read the PNGs from storage
2) Pipe them over the network to the encoder
3) Apply alpha (convert the 32bpp PNGs to 24bpp) and convert them to YUV422 (which is what h264 usually works with)
4) Feed them into x264 inside a Mencoder process (while also rescaling to preview size)
5) Pack the video stream into an mp4 container
6) Pipe it back to the storage server
7) Send it to storage and register it with the session

In this scenario (1) and (2) are really fast, (3) and (4) can happen in a multithreaded way, and (5), (6) and (7) are relatively fast as well. In other words there aren't really any bottlenecks - everything runs smoothly.
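The alpha-flattening and colour-space conversion in step (3) can be sketched in a few lines of numpy. This is only an illustration of the per-pixel maths - the real pipeline does this inside Mencoder/x264; the BT.601 coefficients, the white compositing background, and the lack of chroma subsampling (this produces YUV444, not YUV422) are assumptions for the sketch:

```python
import numpy as np

def flatten_alpha(rgba, background=255):
    """Composite a 32bpp RGBA frame over a solid background -> 24bpp RGB."""
    rgb = rgba[..., :3].astype(np.float32)
    alpha = rgba[..., 3:4].astype(np.float32) / 255.0
    return (rgb * alpha + background * (1.0 - alpha)).astype(np.uint8)

def rgb_to_yuv(rgb):
    """Full-range BT.601 RGB -> YUV (chroma subsampling omitted here)."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]], dtype=np.float32)
    yuv = rgb.astype(np.float32) @ m.T
    yuv[..., 1:] += 128.0  # centre the two chroma channels around 128
    return np.clip(yuv, 0, 255).astype(np.uint8)

# A fully opaque black test frame: Y should be 0, U and V neutral (128).
frame = np.zeros((2, 2, 4), dtype=np.uint8)
frame[..., 3] = 255
yuv = rgb_to_yuv(flatten_alpha(frame))
```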

With EXR files (like the ones from the Sunflower event) we add another few steps before the other ones:
-2) Read EXRs from storage
-1) Convert EXRs to 32bpp PNGs
0) Send PNGs to storage

Here (-2) is a big read; typically the EXRs are about 10 times the size of their PNG equivalents. (0) is a small write (writes are slightly more expensive than reads, but this isn't too bad).
The big resource eater is the conversion in (-1), because it involves decoding the EXR data, transforming every pixel in every colour channel to a different colour space, and then constructing a new PNG image from that information.
It is not really feasible to multithread this, since the conversion takes a relatively large amount of memory: some for the input EXR structures and some for the output PNG while it is being constructed. The server already runs a similar conversion process concurrently for website previews, and at most 3GB of total memory is currently allocated to this kind of work (the more conversions running concurrently in this space, the slower each of them will be). Also, it actually runs fairly fast for what it is doing.
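The per-pixel work in (-1) is essentially a transfer-curve conversion: EXR stores linear-light floats, PNG stores gamma-encoded 8-bit integers. A rough numpy sketch of just that step (the sRGB curve is an assumption - the converter's actual target colour space isn't stated here, and real EXR decoding is done by a library, not shown):

```python
import numpy as np

def linear_to_srgb(c):
    """Standard sRGB transfer curve: linear-light [0,1] -> display gamma."""
    c = np.clip(c, 0.0, 1.0)  # EXR values are floats and may exceed 1.0
    return np.where(c <= 0.0031308,
                    12.92 * c,
                    1.055 * np.power(c, 1.0 / 2.4) - 0.055)

def exr_to_png_pixels(linear_rgba):
    """Map a float RGBA frame (as decoded from an EXR) to 32bpp PNG data."""
    out = np.empty_like(linear_rgba)
    out[..., :3] = linear_to_srgb(linear_rgba[..., :3])
    out[..., 3] = np.clip(linear_rgba[..., 3], 0.0, 1.0)  # alpha stays linear
    return np.round(out * 255.0).astype(np.uint8)

# Black and white pixels should land exactly on 0 and 255.
pixels = exr_to_png_pixels(np.array([[[0.0, 0.0, 0.0, 1.0],
                                      [1.0, 1.0, 1.0, 1.0]]]))
```

Note that the float input and the integer output are both held in memory at once, which is exactly the memory pressure that makes multithreading unattractive.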

For the left eye session, 1170 frames were encoded in 8 hours (let's just assume that it wasn't waiting for the encoder and started right away). That works out to an average of about 24 realtime seconds per input frame. In this time each 4K frame goes through
a) (-2), (-1) and (0) one time
b) (1), (2), (3) and (4) four times
c) and finally (5), (6), (7) twice.
So although at first it appears to be slow, it is actually doing lots of interesting stuff in those 24 seconds - and the end result is two very nice, streamable preview videos. My guess is that most of the time is simply spent reading the EXRs (but I have no timing info available).
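The back-of-the-envelope arithmetic behind the 24-second figure:

```python
frames = 1170
wall_seconds = 8 * 3600            # 8 hours of encoding for the left eye view
per_frame = wall_seconds / frames  # wall time spent per input frame
print(f"{per_frame:.1f} s/frame")  # 24.6 s/frame
```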

I'm looking at the encode of the session that this thread is about, and it is running at 0.4 fps in step (b) right now.
Oh yes, that's right, the encoder is now running on the new backend =)

[Edit:] I probably should also add that whenever development of the encoding path offers a choice between quality and speed, the answer is always "quality". These files are going to be here forever, so the system may as well do a proper job of ensuring that they've been encoded in the best possible way.
ID: 11001
