Blender (GPU)

zombie67 [MM]
Project donor
Joined: 9 Dec 06
Posts: 93
Credit: 2,492,267
RAC: 649
Message 14620 - Posted: 15 Jul 2016, 16:41:01 UTC

Does this use much double precision (DP)? I have a TITAN and have the option of increasing DP performance at the cost of some clock speed. But if much DP is not needed, then I will leave it at the faster clock speed with reduced DP performance.
Reno, NV
Team: SETI.USA

Janus
Volunteer moderator
Project administrator
Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 14621 - Posted: 15 Jul 2016, 16:52:26 UTC
Last modified: 15 Jul 2016, 16:52:52 UTC

The Blender Cycles rendering engine (our only GPU client so far) does not use DP at all as far as I know, so there would be no point.

DoctorNow
Project donor
Joined: 11 Apr 05
Posts: 403
Credit: 2,189,214
RAC: 7
Message 14633 - Posted: 21 Jul 2016, 7:55:09 UTC
Last modified: 21 Jul 2016, 8:08:43 UTC

Just found out that there were some long-running GPU WUs on both of my hosts during the last day, each taking several hours:
9794911, 9794411, 9793459
I'm no longer satisfied with that outcome, so I'm going to stop crunching them until the sessions are over.
Life is Science, and Science rules. To the universe and beyond
Proud member of BOINC@Heidelberg
My BOINC-Stats

Janus
Volunteer moderator
Project administrator
Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 14635 - Posted: 21 Jul 2016, 17:13:30 UTC
Last modified: 21 Jul 2016, 17:43:28 UTC

Perfectly fine with me - you get to pick and choose in the GPU lottery. Any WUs run on the GPUs right now are, however, very much appreciated, since we are building up a database of statistical data similar to the one we have for the CPU clients. Payday probably won't be able to run as well as it does for CPU, but maybe one day we will at least be able to do better than static credit.

I agree, however, that we should probably put up a different target than the 100 credits. In terms of time, what we are targeting is somewhere around 1-4 hours of average compute time. I'm having quite a lot of difficulty finding out just how much credit that is for an average GPU; the numbers from other BOINC projects seem to be all over the place.

Any bids?
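
For anyone who wants to put a number behind a bid: a rough back-of-envelope figure can be derived from BOINC's own credit definition (one credit, or cobblestone, is 1/200 of a day of computing at 1 GFLOPS). A small sketch with made-up numbers - the sustained GFLOPS figure below is a placeholder, not a measured BURP value:

```python
# Rough credit estimate from BOINC's cobblestone definition:
# a machine sustaining 1 GFLOPS for a full day earns ~200 credits.

def estimated_credit(sustained_gflops: float, hours: float) -> float:
    """Credit a device would earn under purely FLOPS-based crediting."""
    gflops_days = sustained_gflops * hours / 24.0
    return 200.0 * gflops_days

# Placeholder example: a GPU sustaining ~300 GFLOPS on this workload
# for a 2-hour session would land around 5,000 credits.
print(estimated_credit(300.0, 2.0))  # -> 5000.0
```

Sustained throughput on real renders is far below a card's peak spec, which is part of why the per-hour numbers vary so much between projects.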

DoctorNow
Project donor
Joined: 11 Apr 05
Posts: 403
Credit: 2,189,214
RAC: 7
Message 14642 - Posted: 22 Jul 2016, 8:31:06 UTC - in response to Message 14635.  

I'm having quite a lot of difficulty finding out just how much credit that is for an average GPU; the numbers from other BOINC projects seem to be all over the place.

Yeah, they do. The range on other GPU projects goes from a few hundred credits to over 10,000 per hour, and GPU performance can shift this even further. I've tested them all and have my notes from them.
Since there is no real "guideline" for granting GPU credits you can basically give anything, but it should be at least ten times the CPU value, since GPUs normally do far more calculations than CPUs in the same amount of time.
If you want a 1-4 hour average, and since you can currently only grant a fixed amount, my suggestion would be 4,000 to 5,000 credits per WU. That would still be a bit less than most of the other projects, but it would be a start.
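
As a quick consistency check on that suggestion (plain arithmetic, nothing project-specific), a fixed award spread over the 1-4 hour target implies an hourly rate in the lower end of the range mentioned above:

```python
# Implied hourly rate of a fixed per-WU award over the 1-4 hour target.
fixed_credit = 4500  # middle of the suggested 4,000-5,000 range
for hours in (1, 2, 3, 4):
    print(f"{hours} h -> {fixed_credit / hours:.0f} credits/hour")
# 1 h -> 4500, 2 h -> 2250, 3 h -> 1500, 4 h -> 1125 credits/hour
```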
Life is Science, and Science rules. To the universe and beyond
Proud member of BOINC@Heidelberg
My BOINC-Stats

zombie67 [MM]
Project donor
Joined: 9 Dec 06
Posts: 93
Credit: 2,492,267
RAC: 649
Message 14671 - Posted: 14 Aug 2016, 22:13:07 UTC

Did we ever get a definitive answer on GPU memory requirements?

Also, the tasks go immediately into panic mode, so I can never get more than one at a time. This means downtime while the large uploads happen. Is this only me, or are others seeing the same panic-mode situation? I guess this could be caused either by over-estimated crunch times or by too-short deadlines. Actual crunch times are relatively short, around 6 minutes. Can anything be done about this?
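
For context on the panic-mode part: the BOINC client switches a task to high-priority mode when it predicts a deadline miss from its estimated remaining runtime, so an inflated estimate or a tight deadline can trigger it even when the real crunch time is only a few minutes. A deliberately simplified illustration of that decision (not the actual client scheduler; the safety factor is a made-up parameter):

```python
from datetime import datetime, timedelta

def would_panic(estimated_remaining: timedelta,
                deadline: datetime,
                now: datetime,
                safety_factor: float = 1.5) -> bool:
    """Simplified deadline-miss check: run in high-priority ('panic')
    mode if the padded estimate does not fit comfortably before the deadline."""
    return now + estimated_remaining * safety_factor > deadline

# A task over-estimated at 8 hours with a 10-hour deadline trips the check,
# even if its real runtime is ~6 minutes.
now = datetime.now()
print(would_panic(timedelta(hours=8), now + timedelta(hours=10), now))  # True
```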
Reno, NV
Team: SETI.USA

Janus
Volunteer moderator
Project administrator
Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 14672 - Posted: 15 Aug 2016, 16:13:10 UTC
Last modified: 15 Aug 2016, 16:33:21 UTC

GPU memory handling is not at all complete yet; the requirement shown is the CPU memory requirement. GPU memory requirements will be shown separately when (and if) they are implemented.

In more technical terms: for this early testing phase we are using the built-in BOINC plan classes for CUDA, together with some special glue in Glue3 and our launcher script in Blender, to use the regular client as a GPU client. There are still some serious issues to iron out before it works properly. It works, and it renders very well, but it is still in the early testing phase, so do not expect wonders. Some of the issues may be BOINC-wide issues like the credit granting problems.
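
For the curious, the general shape of such a launcher is straightforward: BOINC tells the app which GPU it has been assigned (for example via a --device N command-line argument), and the wrapper starts Blender in background mode with Cycles pointed at the CUDA backend. The following is a hypothetical sketch under those assumptions, not BURP's actual launcher script; the scene file name and output path are placeholders:

```python
import os
import subprocess
import sys

# Hypothetical launcher sketch (not the real BURP script).
# Pick up the GPU index BOINC assigned via "--device N" and restrict the
# process to that card through CUDA_VISIBLE_DEVICES.
device = "0"
args = sys.argv[1:]
if "--device" in args:
    device = args[args.index("--device") + 1]

env = dict(os.environ, CUDA_VISIBLE_DEVICES=device)
subprocess.check_call([
    "./blender", "--background", "scene.blend",   # placeholder scene file
    "--render-output", "//frame_####",            # placeholder output pattern
    "--render-frame", "1",
    "--", "--cycles-device", "CUDA",              # ask Cycles to render on the GPU
], env=env)
# A real BOINC GPU app would also report progress through the BOINC API,
# handle checkpoints, and deal with the issues mentioned above.
```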

DoctorNow
Project donor
Joined: 11 Apr 05
Posts: 403
Credit: 2,189,214
RAC: 7
Message 14673 - Posted: 15 Aug 2016, 17:55:07 UTC - in response to Message 14672.  

Some of the issues may be BOINC-wide issues like the credit granting problems.

Like this?

Task    | WU      | Host  | Sent                      | Reported                  | Status                  | Run time (s) | CPU time (s) | Credit | Application
9943419 | 2981297 | 46758 | 15 Aug 2016, 15:14:41 UTC | 15 Aug 2016, 15:28:02 UTC | Completed and validated | 607.75       | 377.47       | 96.82  | Blender (GPU) v5.06 (cuda_fermi)
9943281 | 2981243 | 46758 | 15 Aug 2016, 14:59:01 UTC | 15 Aug 2016, 15:14:10 UTC | Completed and validated | 853.04       | 363.06       | 88.04  | Blender (GPU) v5.06 (cuda_fermi)
9940524 | 2980155 | 46758 | 15 Aug 2016, 10:00:51 UTC | 15 Aug 2016, 10:10:54 UTC | Completed and validated | 580.28       | 339.63       | 84.13  | Blender (GPU) v5.06 (cuda_fermi)
9940388 | 2980101 | 46758 | 15 Aug 2016, 9:48:39 UTC  | 15 Aug 2016, 10:00:51 UTC | Completed and validated | 708.83       | 358.09       | 72.73  | Blender (GPU) v5.06 (cuda_fermi)

It was quite a surprise when I looked through my tasks and saw that a lot of them weren't at the fixed value of 100...
Life is Science, and Science rules. To the universe and beyond
Proud member of BOINC@Heidelberg
My BOINC-Stats

zombie67 [MM]
Project donor
Joined: 9 Dec 06
Posts: 93
Credit: 2,492,267
RAC: 649
Message 14674 - Posted: 15 Aug 2016, 20:53:29 UTC

Let me ask the question more directly: what is the minimum amount of GPU memory required for a task to be issued to a host and run successfully? Is 256 MB enough? 512 MB? Or some higher number?
Reno, NV
Team: SETI.USA

Janus
Volunteer moderator
Project administrator
Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 14675 - Posted: 16 Aug 2016, 14:43:43 UTC
Last modified: 16 Aug 2016, 15:06:19 UTC

There simply isn't a good answer at the moment.
It depends on your OS, the task, your open programs and your window manager. Typically the answer is that 1 GB of dedicated GPU memory will be enough for now, but the client will try and fail (or sometimes succeed) regardless.

@Credits: Talk about a 4th-generation credit system for BOINC has begun, this time more openly than with CreditNew. If you are interested you should check out the BOINC mailing lists.

zombie67 [MM]
Project donor
Joined: 9 Dec 06
Posts: 93
Credit: 2,492,267
RAC: 649
Message 14677 - Posted: 17 Aug 2016, 3:31:10 UTC - in response to Message 14675.  

There simply isn't a good answer at the moment.
It depends on your OS, the task, your open programs and your window manager. Typically the answer is that 1 GB of dedicated GPU memory will be enough for now, but the client will try and fail (or sometimes succeed) regardless.


Thanks. A general target is appreciated. This tells me that I should not try using any of my 256 MB or 512 MB cards for now.
Reno, NV
Team: SETI.USA

Janus
Volunteer moderator
Project administrator
Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 14727 - Posted: 16 Sep 2016, 15:37:11 UTC
Last modified: 16 Sep 2016, 15:39:39 UTC

Just a quick heads-up about a few changes to the GPU sessions:
Firstly, the credit granted will be greatly increased (this applies to Payday credit; CreditNew is as random as ever) by a factor of around 40, to better match the kind of sessions going to the GPU farm as discussed earlier in the thread - roughly 40 x 100 = 4,000 credits, in line with the suggestion above. Secondly, credit granted for old GPU sessions will likewise be retroactively increased, along with a bonus for the people who took part in the very early GPU sessions.
These changes are scheduled to take place after bulk2 of the current Payday run, but you will see some of it influence CreditNew's initial credit starting already today.

Darrell
Joined: 24 Mar 05
Posts: 16
Credit: 39,356
RAC: 0
Message 15142 - Posted: 30 Apr 2017, 21:35:11 UTC

Back to work rendering BURP. I did a few units occasionally on my old processor, an Athlon II X2 250 running at 3 GHz, crunching the tasks on my AMD HD 5850. Since the Athlon only had two cores, I could only run BURP sporadically. I have upgraded the processor to a Phenom II X6 1100T running at 3.6 GHz and the GPU to an MSI Armor RX 480 8 GB OC. Now I have plenty of cores to play with and can run BURP with no problems. Just wondering if there might be an AMD OpenCL app anytime soon?

Janus
Volunteer moderator
Project administrator
Joined: 16 Jun 04
Posts: 4571
Credit: 2,100,463
RAC: 8
Message 15143 - Posted: 1 May 2017, 8:00:36 UTC - in response to Message 15142.  
Last modified: 1 May 2017, 8:12:25 UTC

It is likely that there will be an OpenCL client at some point - the Cycles rendering engine has been maturing quite quickly in that area. Just recently there was a massive AMD-sponsored rewrite of the Cycles rendering kernel, and the results look very promising.