Message Subject: The Ultimate Spy. How to design software that reads and records a human consciousness.
Poster Handle: Anonymous Coward
Post Content:
My estimate, after running a few numbers, is that it would take about 70 quadrillion petaflops to even come close to what you're describing, and even then it wouldn't be perfect and would run into a great deal of problems. On top of this, running a computer at its maximum benchmarked speed for a long duration would be VERY difficult, if not impossible, with current technology.

With all of that said, I can see us reaching 70-100 quadrillion petaflops some time in the next 5 years, given the discoveries that have recently been made in this area.

Once again, however, these are benchmarked speeds, not "average" speeds.
 Quoting: Anonymous Coward 6549406


And you came up with these calculations how?
 Quoting: 2342


Let me first explain something: in reality, it would probably be even more than a million calculations per person, possibly billions. So before I respond, I should say that my numbers aren't just conservative; they're on the extreme low end.

I'm a high-level software engineer with many years of experience. For every "one" micro-decision you handle, several more are required behind the scenes to make it happen. Even running a few bytes of machine code takes numerous calculations. Combine that with the billions, if not trillions, of rules that govern our minds, and a single micro-decision in our own heads ends up requiring many, many calculations.
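
To make that concrete, here's a tiny illustration (a Python sketch; the dis module just shows the several bytecode operations sitting behind one trivial addition, each of which expands further at the machine level):

----

import dis

# One "simple" micro-decision: add 1 to a number.
def micro_decision(x):
    return x + 1

# Disassembling shows several bytecode operations behind a single addition;
# each of these in turn expands into multiple machine instructions.
dis.dis(micro_decision)

----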

But so we don't lose track of your question, which is important: you can get an IDEA of the number by doing the following. I've become more realistic about the math involved, and it ended up producing a number beyond my ability to calculate precisely.


----

[NCS] Number of calculations per person, per scenario = [100+ million?]

[NSC] Number of scenarios needed to calculate a good possible decision = [5,000+?]

[NCD] Number of calculations to actually determine the scenario was the best for the micro decision = [Another 100+ million?]

----

You would then multiply these numbers together to get an idea of just how many calculations would be needed (sketched below). And let's not forget that in many cases recursive functionality like this grows the processing requirements exponentially, which isn't even considered in the calculation.
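
As a rough back-of-the-envelope sketch (Python; the values are the placeholder guesses from the list above, not measurements):

----

# Placeholder figures from the list above (guesses, not measurements).
NCS = 100e6   # calculations per person, per scenario (100+ million)
NSC = 5e3     # scenarios weighed per micro-decision (5,000+)
NCD = 100e6   # calculations to confirm the best scenario (another 100+ million)

# Multiply them for a per-micro-decision total (recursive blow-up ignored):
per_decision = NCS * NSC * NCD
print(f"{per_decision:.1e} calculations per micro-decision")  # prints 5.0e+19

----

That's on the order of 5 x 10^19 calculations for ONE micro-decision, before any recursion.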

It's fair to say that we don't have anywhere near the capacity we'd need to think for 7 billion people; the sketch below puts some numbers on that. With that being said, I agree we're headed there, and I share your sense of urgency on the topic.
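
Scaling the estimate up (another sketch; the ~1 exaflop figure for today's fastest machines is my assumption, while the 70-quadrillion-petaflops figure is the one quoted above):

----

# Scale the per-decision estimate (5e19, from the sketch above) to everyone.
per_decision = 100e6 * 5e3 * 100e6    # = 5e19 calculations
all_people   = per_decision * 7e9     # one micro-decision for every person: ~3.5e29

target  = 70e15 * 1e15   # "70 quadrillion petaflops" = 7e31 FLOPS, benchmark peak
current = 1e18           # assumed ballpark: today's fastest machines, roughly 1 exaflop

print(f"at the quoted peak:  {all_people / target:.0e} s per round of decisions")
print(f"at ~1 exaflop today: {all_people / current:.0e} s per round (thousands of years)")

----

In other words, at today's speeds a single round of micro-decisions for everyone would take thousands of years, while at the benchmark peak quoted above it drops to milliseconds. Which is exactly why I agree we're headed there.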

Name's Jesse by the way. I think I may join soon - this place is great :)
 