4internetanonymity wrote: That folding at home and various number crunching work in biology is a huge waste of time.
I think yours is a very pessimistic view of the project and of scientific research in general. Admittedly I have some bias here, but I am not in the same field as F@H.
... - fraud, despite being widely reported, is a very rare occurrence (so thinking it is widespread is mistaken statistics, thanks to reporting bias) and tends to be limited to smaller research teams led by single individuals. "Mistaken statistics", or unreproducible results (which I think is a more accurate description of the problem), is an issue throughout science,
Sorry for adding negativity, but as a former scientist, I have to say that fraud in official science is massive. And the problem is mass deception and various intricate systems of control from the top down, not fraudulent results obtained by individual laboratories.
I've had neurological problems for some years now that were caused by medically induced Parkinson's back when I was 11 or so. I don't make much, so this is a nice way to give back. How do I join the team, and what is the internet data usage like? Can this run on both my server and my laptop from the same place?
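For anyone with the same join question: it mostly comes down to setting your user name and the team number in the client config. A minimal sketch, assuming the AUR foldingathome package (config path may differ) and that 45032 is still the Arch Linux team number - verify both against the Arch Wiki and the team stats page before relying on them:

```xml
<!-- /etc/foldingathome/config.xml (path assumed from the AUR package) -->
<config>
  <!-- Name to fold under and the team to credit points to -->
  <user v='YourName'/>
  <team v='45032'/>  <!-- assumed Arch Linux team number; confirm on the team stats page -->
  <!-- Optional: a passkey ties points to you and enables bonus points -->
  <passkey v='your-passkey-here'/>
  <!-- One CPU folding slot -->
  <slot id='0' type='CPU'/>
</config>
```

Each machine runs its own client, so putting the same user/team on both the server and the laptop credits the same account. Data usage is modest - work units are typically on the order of a few megabytes each.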
That folding at home and various number crunching work in biology is a huge waste of time.
I think yours is a very pessimistic view of the project and of scientific research in general. Admittedly I have some bias here, but I am not in the same field as F@H.
The scientific output of F@H is substantial. In this project, I am not concerned about fraudulent results - fraud, despite being widely reported, is a very rare occurrence (so thinking it is widespread is mistaken statistics, thanks to reporting bias) and tends to be limited to smaller research teams led by single individuals. "Mistaken statistics", or unreproducible results (which I think is a more accurate description of the problem), is an issue throughout science, but part of F@H is comparing results obtained from multiple different sources, which ensures reproducibility of the analysis, at least while using the same software. There is also comparison of F@H results to known 3D protein structures showing the results are good, but the project is generating statistical models, and they do need to be validated in vivo. These results provide strong prior hypotheses to direct future research.
I think the best use of time for Arch enthusiasts is to contribute back to Arch and Linux - and to learn about the many security problems we have today (especially on the lower levels) and how to protect against them.
You do realise that people are not required to do anything actively to contribute to the F@H project? Once installed, the ongoing time investment is essentially nothing, so people can still contribute back to Arch and Linux while F@H runs in the background.
I love Arch Linux, but I do feel I should represent the non-mainstream view on here... That folding at home and various number crunching work in biology is a huge waste of time. I'll leave this here for anyone questioning... Causality in biology is a hugely entrenched paradigm. From the central dogma of Watson and Crick to the many modern articles - that I hope you've heard of - about the fraudulent and mistaken statistics found in most peer-reviewed studies today. In topics as reductionist as protein folding, it is hard to represent in one short post the connections with larger mechanisms and biological paradigms. Anyway, just a thought... Because I don't want to just leave criticism without anything positive, I will say: I think the best use of time for Arch enthusiasts is to contribute back to Arch and Linux - and to learn about the many security problems we have today (especially on the lower levels) and how to protect against them.
I fully support your right to have an opinion; however, I fail to see the point in coming to this thread just to denounce the efforts of others: https://wiki.archlinux.org/index.php/Co … ther_users
If this isn't for you, then just move on to another thread please.
On client "local" 127.0.0.1:36330: Option 'gpu-index' has no default and is not set.
Diesel1.
UPDATE - A complete reinstall has solved the issue.
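Before resorting to a full reinstall, it may be worth checking whether the GPU slot simply has no index assigned. A hedged sketch, assuming the v7 client's config.xml slot syntax (option name taken from the error message; the numbering of gpu-index is assumed to follow the client's own device enumeration, which may not match nvidia-smi's order):

```xml
<!-- Fragment of config.xml: pin the GPU slot to the first enumerated GPU -->
<slot id='1' type='GPU'>
  <gpu-index v='0'/>
</slot>
```

After editing, restart the client so the slot configuration is re-read.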
22:51:56:WU00:FS00:Starting
22:51:56:WU00:FS00:Removing old file '/opt/fah/work/00/logfile_01-20180209-222043.txt'
22:51:56:WU00:FS00:Running FahCore: /opt/fah/FAHCoreWrapper /opt/fah/cores/fahwebx.stanford.edu/cores/Linux/AMD64/Core_a4.fah/FahCore_a4 -dir 00 -suffix 01 -version 704 -lifeline 502 -checkpoint 15 -np 8
22:51:56:WU00:FS00:Started FahCore on PID 7115
22:51:56:WU00:FS00:Core PID:7119
22:51:56:WU00:FS00:FahCore 0xa4 started
22:51:56:WU00:FS00:0xa4:
22:51:56:WU00:FS00:0xa4:*------------------------------*
22:51:56:WU00:FS00:0xa4:Folding@Home Gromacs GB Core
22:51:56:WU00:FS00:0xa4:Version 2.27 (Dec. 15, 2010)
22:51:56:WU00:FS00:0xa4:
22:51:56:WU00:FS00:0xa4:Preparing to commence simulation
22:51:56:WU00:FS00:0xa4:- Ensuring status. Please wait.
22:52:05:WU00:FS00:0xa4:- Looking at optimizations...
22:52:05:WU00:FS00:0xa4:- Working with standard loops on this execution.
22:52:05:WU00:FS00:0xa4:Examination of work files indicates 8 consecutive improper terminations of core.
22:52:05:WU00:FS00:0xa4:- Expanded 738934 -> 1919960 (decompressed 259.8 percent)
22:52:05:WU00:FS00:0xa4:Called DecompressByteArray: compressed_data_size=738934 data_size=1919960, decompressed_data_size=1919960 diff=0
22:52:05:WU00:FS00:0xa4:- Digital signature verified
22:52:05:WU00:FS00:0xa4:
22:52:05:WU00:FS00:0xa4:Project: 14017 (Run 0, Clone 260, Gen 27)
22:52:05:WU00:FS00:0xa4:
22:52:05:WU00:FS00:0xa4:Entering M.D.
22:52:12:WU00:FS00:FahCore returned: INTERRUPTED (102 = 0x66)
I've tried removing/clearing work units, removing the CPU slot, rebooting, and re-adding the CPU slot.
Works on the 4.14 LTS kernel.
I just decided to stop by today and see how the team was doing. WOW WOW WOW. The Arch Linux Folding@Home team broke the top 100 and is currently ranked 77th with over 2.5 billion points! It was a goal of mine to see the team break the top 100; I didn't get to see it happen, but I am so glad you guys & gals GOT 'ER DONE!
imatechguy has well over 400 million points total and Buddlespit is producing over 730 thousand points AN HOUR. These stats are unbelievable. The clients and hardware have obviously made remarkable improvements.
Thanks to everyone who folded before me, folded with me, and most importantly those who folded after me. Well done.
Pudge
How are your F@H packages running? Are you seeing the expected gains?
Diesel1.
I'm back!
For some reason it started folding on the GPU after a re-install of the driver and CUDA parts.
Diesel1.
@skeevy420
I was under the impression that bigadv was discontinued quite a while back. Maybe what you were seeing was a combination of the boost provided by going from 15 to 16 threads in combination with a better-than-average WU? See for example this: https://foldingforum.org/viewtopic.php?f=96&t=29824.
Haven't had my main rigs folding for a while now (290X and 390X do provide nice PPD, but also quite a bit of heat and power draw). Might just sell them and get something more power efficient.
As it stands, it would seem I need either a major downgrade of the NVIDIA driver or a miracle (I am a devout (sic) atheist!), so the downgrade seems most likely - just how to achieve it, and is it worth it?
Is there a config option to only download 0x18 work units?
Diesel1.