
Oled/displays as alternative/supplement to quantum computing!?


Arugela


There are no real random numbers. Real life is not random either; it's completely predictable. It's just a matter of how much you personally understand. A quantum device would just produce another form of "random", assuming it's even different. To use anything in a computer you have to be able to predict it; that is literally how you implement it, by definition. Therefore it's no different than a normal computer, just a different form of circuitry.

You and a lot of people here seem not to fully understand the concepts you are using, and are using fantasy as science. It doesn't work out.

Everything comes down to inputs and outputs. Anything that equals something equals it; the means used to produce it doesn't matter. But modern computers and software are extremely pathetic and not well hashed out, or at least not developed to a very advanced state for even basic things.

Quantum computers are not probabilistic, or your understanding of that is wrong. It's predictable just like everything else; that is how it is used as a computer component. It might offer a higher range of values, but it's always predictable, even if it's part of a more complex, varying set of circumstances. It's not different. At that point you are merely talking about complexity and range. But that doesn't necessarily work as well as you think when it comes to things like encryption and whatnot. It will still have the same issues that have to be dealt with, just the same if not more so. (If you don't understand something and implement it anyway, someone else will figure it out and use the weakness.) It will eventually be understood completely (and already is by those making it) and it will be the same as now, just different. I will go gag and die for repeating that sentence now. Thank you for making me spew the most disgusting thing I have ever heard.

BTW, larger and more complex does not directly equate to better. In fact it can be worse (and likely will be in practice), especially against the human brain. And it can lead the user to use it without fully understanding it, leading to more openings. Everything works out the same in the end for the most part. Or, if you don't understand it completely, it potentially doesn't matter much.

As a test: what is the supposed range of data or functional uses for a quantum computer? Set that to a bit rate or computation need and look at how much it would take to overcome it with normal computing.

BTW, that level of inflated funding and propagandising is not a good sign. "Probabilistic" is just a catch phrase to over-inflate its actual use and sell the idea. It's extremely pathetic. It also shouldn't be needed, which means one of two things. Either there are endless idiots involved in its development, financially or otherwise, which will reduce its implementation; or it's not as good as they are making out, potentially meaning it will be phased out by other hardware and they know it! Or at least in its currently expected implementations. Either way, money and morons are involved. The M&M's of development! 8) Melts in your mouth, not in your hands! AKA after you eat it, not before! (Or more the further you get into its usage.) Hopefully not, though; new computing devices would be nice. (FYI, psychologically, the amount of advertising you see is directly proportional to the perceived or actual gap in an item's practical value. We always respond the same way. So the more something is advertised, the more you need to examine it, unless it's purposely advertised less by someone who knows this. Someone feels it's lacking, to push it so much. And because our brains work on the existing data in our heads and we don't have the full answer, the severity is always beyond what you are imagining. And more than you can imagine imagining.)

Edit: Another fun thing with light-based computation: as long as nothing is out of sync time-wise, you can add as many devices and manipulations along the path of the bus as you like and still get the same time to compute, although that would depend on implementation. It would mean a slight guaranteed delay overall, but if it increases computational power beyond the norm it could be worth it. That means you can use a string of light-manipulating devices along a path, and mirrors and whatnot, to do the same with high-bandwidth data. A rough sketch of that trade-off is below.
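A minimal sketch of that latency/throughput trade-off (all numbers are made up for illustration): each added stage lengthens the one-time latency, but once the pipeline is full, results still come out once per clock period, so throughput holds.

```python
# Minimal pipeline sketch: synchronized stages along a light path add latency
# but leave steady-state throughput unchanged, because a new value enters the
# path every clock period. Stage and clock numbers are assumed, not measured.

STAGE_DELAY_NS = 1.0    # assumed delay per optical device on the path
CLOCK_PERIOD_NS = 1.0   # assumed rate at which new values enter the bus

def pipeline_stats(num_stages: int, num_items: int) -> tuple[float, float]:
    """Return (fill latency in ns, throughput in items/ns)."""
    latency = num_stages * STAGE_DELAY_NS
    total_time = latency + (num_items - 1) * CLOCK_PERIOD_NS
    return latency, num_items / total_time

for stages in (1, 8, 64):
    lat, thr = pipeline_stats(stages, num_items=1_000_000)
    print(f"{stages:2d} stages: latency {lat:5.1f} ns, throughput {thr:.4f} items/ns")
```

However many stages you chain, the throughput stays pinned near one item per clock; only the fixed up-front delay grows.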

 

To put it simply, the point is to treat an x-bit value as a full range of 0s and 1s representing the entire potential range of that bit combo. Each color variant or combo represents the entire data string in each potential combination. Then you can represent data in a very compact way compared to now, or whatever else that leads to, which should be quite a bit. Pun intended.

Examples:

32-bit data x4. Each potential bit value is a color; each color is a data string. Maybe it's correct to look at it as a bit-by-bit table or something like that. It's doable regardless of implementation: it's turning small bit widths into large ones. Ciphers can be used if needed, etc., including actively changeable ciphers. Which means it's good that the one device had much faster gate control... then you just need a bunch of separate read sensors synced up. A toy version of the mapping follows the example below.

0 = 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000

1 = 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000001
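A toy version of that mapping, in Python. The identity codebook below (color N stands for the bit string with value N) is an assumption for illustration; a cipher, as mentioned above, would just be a different permutation of the same table.

```python
# Toy sketch: treat each 32-bit color code as a symbol standing for one
# specific long bit string. Note a 32-bit color can only address 2^32 of the
# 2^128 possible strings; the codebook here is the identity mapping.

def color_to_bitstring(color: int, width_bits: int = 128) -> str:
    """Expand a 32-bit color code into the wide bit string it stands for."""
    if not 0 <= color < 2**32:
        raise ValueError("color must fit in 32 bits")
    return format(color, f"0{width_bits}b")

def bitstring_to_color(bits: str) -> int:
    """Recover the color code from its bit string."""
    return int(bits, 2)

print(color_to_bitstring(0)[-16:])    # ...0000000000000000
print(color_to_bitstring(1)[-16:])    # ...0000000000000001
assert bitstring_to_color(color_to_bitstring(12345)) == 12345
```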

So that means 1920x1080 at 32 bits gives 1920*1080*32 = 6.63552e7 bits per nanosecond, which at microLED speeds is something like 6.63552e16 bits per second, or about 66.4 petabits per second (58.9 pebibits per second if you divide by 2^50)... Am I doing that right?

Not too bad compared to current CPUs.

And 24-bit is still 1920x1080x24 = 4.97664e7 bits per nanosecond = 4.97664e16 bits per second, or about 49.8 petabits (44.2 pebibits) per second.

And binary light is 1920x1080 = 2.0736e6 bits per nanosecond, for 2.0736e15 bits per second, or about 2.07 petabits (1.84 pebibits) per second max.

OLED says 0.0001 ms? I'm not sure what is meant by this?! Taken literally, 0.0001 ms is 0.1 microseconds, i.e. 100 nanoseconds, but they may have meant 0.1 milliseconds...

That would make OLED 2.0736e13 bits per second, about 20.7 terabits (18.9 tebibits) per second, or as low as 2.0736e10 for about 20.7 gigabits (19.3 gibibits) per second. The arithmetic for all of these is collected in the sketch below.
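Collecting the arithmetic from the last few paragraphs in one place; the assumption throughout is that every pixel can be rewritten once per response time, and both decimal (1 Pbit = 1e15 bits) and binary (1 Pibit = 2^50 bits) prefixes are printed, since the figures above mix the two.

```python
# Re-check of the screen-as-bus throughput figures quoted above.

PIXELS = 1920 * 1080

def throughput(bits_per_pixel: int, response_time_s: float) -> float:
    """Bits per second, assuming one full rewrite per response time."""
    return PIXELS * bits_per_pixel / response_time_s

cases = [
    ("microLED 32-bit, 1 ns", 32, 1e-9),
    ("microLED 24-bit, 1 ns", 24, 1e-9),
    ("binary light,    1 ns",  1, 1e-9),
    ("OLED binary,   100 ns",  1, 1e-7),
    ("OLED binary,   0.1 ms",  1, 1e-4),
]
for label, bpp, t in cases:
    bps = throughput(bpp, t)
    print(f"{label}: {bps:.4e} bit/s = {bps/1e15:.2f} Pbit/s = {bps/2**50:.2f} Pibit/s")
```

The 32-bit case lands at 6.63552e16 bit/s, i.e. 66.4 petabits or 58.9 pebibits per second, which is where the 58.9 figure above comes from.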

Yes, I was wrong about the 32-bit being read as half a gig without additional work. But you could do that if you wanted, or turn it into other amounts, or just use it as-is. It depends on how you can manipulate it: multiple nodes being read in one way. And increasing the bit count doesn't change the overall throughput. I was hoping I wasn't missing something on that one.

Not sure who would win the console vs desktop vs handheld wars then... It might get weird.

https://en.wikipedia.org/wiki/Deterministic_system

BTW, if this is the definition of deterministic you are using: everything is deterministic. The very basis of reason and all man-made theories is determinism. Without it we have no standing logic or ideas. Anything beyond it is impossible, as it is the definition of something existing and actual. All things going into something are what make it; the common term at the moment is causation. That is the only condition under which reason or deduction is possible; outside it our brains, effectively deduction machines (via sensory data), cannot figure anything out. Everything people take as beyond it just means they haven't defined it yet. Likewise, any device requiring deduction (i.e. anything mechanical, logical, or physical) can't figure out anything past that, nor could anything it is made from give it that ability. We do not know, nor do we have a way of figuring, beyond it. It is literally the definition of the non-existent. If you see something beyond it, there is a logical fallacy in your base assumptions or definitions.

https://quantumcomputingreport.com/our-take/how-to-explain-quantum-to-your-classical-friends/quantum-computers-may-not-be-fully-deterministic-but-that-is-ok/

In this case they are simply idiots and should be removed from all research and anything touching it (I hate these sorts of websites). If they cannot predict it, it means they have not yet measured its cause. The points where we use it are the points where we can measure it (which is just an inefficiency, using raw speed and what I hope/assume is a currently limited set of completely predictable outcomes, even if it's across multiple instances). That is what defines a computer and how you implement one; it's the only definition. Unpredictability is only caused by a lack of knowledge, not a lack of cause. There is nothing, and will never be anything, that acts that way. It literally cannot happen. It's literal fantasy! These people are completely uneducated. They are just making money off gullible people, who have become dangerously ignorant over the last several decades. This is just junk science. Maybe tabloid science is a better term.

 

Quote

In addition, even if you can get multiple answers, there is always the option of repeating a calculation multiple times and using voting to choose the one that comes up most often.   Although this may not seem efficient initially, if you have a quantum computer that is millions of times faster, you can run a calculation hundreds or thousands of time and achieve the correct answer with extremely high probability and still enjoy an enormous speedup.

That should sum it up for you. Edit: And millions of times faster isn't that much more in terms of binary values. The voting scheme they describe is simple enough to sketch:
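The quoted scheme is plain majority voting over repeated runs. A sketch, with a made-up noisy oracle standing in for the quantum machine (the 60% accuracy and the answer 42 are arbitrary):

```python
# Repeat-and-vote from the quote: run a probabilistic computation many times
# and keep the most frequent answer. The oracle here is a stand-in that is
# right only 60% of the time; voting still converges on the answer quickly.

import random
from collections import Counter

def noisy_oracle(correct: int = 42, p_correct: float = 0.6) -> int:
    if random.random() < p_correct:
        return correct
    return random.randrange(100)  # some wrong (or accidentally right) answer

def vote(runs: int) -> int:
    counts = Counter(noisy_oracle() for _ in range(runs))
    return counts.most_common(1)[0][0]

print(vote(101))  # almost certainly 42
```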

Deterministic and probabilistic are fantasy terms for the gullible (or probabilistic is). They are just implementing half-assed designs they don't fully understand, or are shortcutting, or are lying about outright. The entire thing could be fraud at that point, no other information withstanding. At least from the standpoint of that article's writer.

 

Quote

So even though solving a problem on a quantum computer will require a different mindset it can still be of great value even if it is not fully deterministic.

This is a load of crap and the person writing it is full of excrement.

All you can do is take a really fast device and use predictive logic to get an answer (which has to be completely deterministic); otherwise it's unusable as a computational device. Or it will be done on purpose for massive fraud and theft. That may be something to worry about when and if this is applied for research purposes. You may see a new age of theft and conmen. And if the device has the potential for it and they can hide it in any way, it will be the case without question. It could be the reason for its development on some levels. These things will always be used to their fullest, and always as soon as possible.

 

I wonder if weird bit translation could be used to produce more correct geometric calculations, or even ones with unreal numbers, faster. Geometric processors are also needed; those would basically be unreal processors. Then they could make a true Unreal Engine. The differences between 32-bit and 2^32 are odd values. I wonder if any come close to, or can be used in, ultra-realistic versions of stuff.

And you can still use the color outcome to represent a complex set of outcomes, spitting a single result at the program to translate as fast as possible to get your total outcomes. This allows a screen to display many outcomes. Or use multiple pixels to effectively deal with a higher bit rate regardless; then it's like a D&D chart with millions of options. And you can use tricks to get extended tables and other things. Then you can get the results for each screen section if wanted. The trick would be combining outcomes.

I guess the only way to get that would be a really fast write... although I think the one device had picosecond response times. If you could sync up reads, staggering them to match the write speed, you could just do it quickly. Something like the sketch below.
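A sketch of that staggering, with made-up timings (1 ps writes, 8 ps reads): N offset sensors together cover every write slot that no single sensor could keep up with.

```python
# Staggered-read sketch: slow sensors sampling at offset phases jointly match
# a writer faster than any one of them. All timings here are illustrative.

WRITE_PERIOD_PS = 1   # assumed: source changes every picosecond
READ_PERIOD_PS = 8    # assumed: each sensor needs 8 ps between samples
NUM_SENSORS = READ_PERIOD_PS // WRITE_PERIOD_PS  # 8 sensors cover all slots

def sample_times(sensor_id: int, duration_ps: int) -> list[int]:
    """Times at which one sensor samples, offset by its id."""
    return list(range(sensor_id, duration_ps, READ_PERIOD_PS))

covered = sorted(t for s in range(NUM_SENSORS) for t in sample_times(s, 32))
print(covered)  # every 1 ps slot from 0 to 31 is read exactly once
```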

https://www.extremetech.com/extreme/223671-heres-why-we-dont-have-light-based-computing-just-yet

Well, the bus idea was right: it could make a good bus system. So why not move to light-based buses on mobos in general? Then you can intermix light-based and electrical processing units. Plus, with light-based, if you can get a non-soldered connection for the type of stuff I'm describing, it's removable and upgradable. So a GPU with a light-based bus translator, or a more advanced unpacking device, could do advanced work toward the bus or even out to the display. Optics would be convenient as a display connector, I would imagine.

One of the tricks I'm referring to is what he calls a photonic logic array. I wonder if you can't do the same with an abundance of different read heads in different locations, or with different read abilities.

Quote

It just has to do with how photons diffuse into matter as opposed to electrons. The former is so much faster in translucent and transparent objects that it's as if the electrons are standing still. A certain electric charge is required for a doped object to undergo a difference in behaviour. For an object filled with light, however, the idea is that if you add another colour, the 2 colours mix and the resultant colour that comes out could be detected as a specific addition of 2 frequencies. If, say, the object (prism) reacts differently to different wavelengths, then you could essentially instantly detect the answer you're looking for just by going to the correct output stream in a complex photonic logic array. Photonic logic arrays are fairly static and their complexity depends on their size, really. Regardless, they're still comparably much faster than any conventional computer we have today for the things they can process.

32^6 x 4 is the same as the 2^32 I was using (32^6 = 2^30, which is also one gigabit). So if you could get multiple nodes to work together, in effect it would be like the static RAM of light-based computing, and hopefully even more powerful with better devices. If anything, supplementary. Even if quantum computing overtook it, it could be used for things that quantum computing struggles with because of speed factors, basically slower things. Or as an early version of quantum computer output for similar computational tasks before quantum is common. Assuming it's useful before then.
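The identity is easy to check: 32^6 = (2^5)^6 = 2^30, which is one gibibit, so 32^6 x 4 = 2^32.

```python
assert 32**6 == 2**30       # (2^5)^6 = 2^30, i.e. one gibibit of states
assert 32**6 * 4 == 2**32   # so 32^6 x 4 matches the 2^32 used above
```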

If you used it to store massive game worlds, you could use it to speed up loading, aside from issues like hacking or things requiring a server to keep track of. It could still be used, with a fast enough write or equivalent size, to help deal with that ahead of time, I guess, for load times. Assuming you can't just remove sensitive data like enemy locations. Terrain wouldn't matter unless there is fog of war to overcome.

Someone also mentioned the potential advantage of making very large arrays instead of multiple servers. This could save space and overall heat in the end, for server rooms and whatnot.

Quote

What about using a holographic chip with guided air as a medium for computing.

And the 3D version of this idea is holographic 3D computing arrays. The visuals can be the computations or the results. You could have cool swirling holograms doing the computation for your system. Or does that just make the same computation, since you have to read the output from the other sides... I guess you could use 5 sides for compute and a 6th for read. Or all sides for compute, in a massive set of 6 arrays, with a bunch of external reads via either another light-with-read setup or something else. I guess that is not limited.

You could also use natural light as a source... It would kill gaming and anything done late night, but...

BTW, how many detectable light colors or variances are there in natural light, maximum? This includes UV and other ranges. How many could be used as unique values for a computer, hypothetically? Also including all detectable qualities of light. One crude way to bound it is sketched below.
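One crude bound works the way wavelength-division multiplexing does: divide the usable optical bandwidth by the finest frequency spacing a detector can resolve. Both numbers below are assumptions, not measurements, so treat the result as an order-of-magnitude guess.

```python
# Rough upper bound on "how many distinguishable colors" a detector could
# treat as unique symbols. Band and spacing values are assumed.

import math

USABLE_BAND_HZ = 600e12      # assumed: visible plus near-UV, ~400-1000 THz
CHANNEL_SPACING_HZ = 50e9    # assumed: DWDM-style 50 GHz detector resolution

channels = int(USABLE_BAND_HZ / CHANNEL_SPACING_HZ)
print(channels)              # 12000 distinguishable channels
print(math.log2(channels))   # ~13.6 bits if each color encodes one symbol
```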

http://sciencenordic.com/optical-computers-light-horizon

This is interesting:

Quote

Yes, they are. Or, at least, they were. But interestingly, silicon chips can be adapted to include transmitters and receivers for light, alongside the transistors.

I would have thought that was too hard for modern CPUs. I'm assuming it's a larger transistor size.

https://news.berkeley.edu/2015/12/23/electronic-photonic-microprocessor-chip/


10 hours ago, Arugela said:

There are no real random numbers. Real life is not random either; it's completely predictable.

If you can predict when a radioactive nucleus will decay by some means superior to taking its half-life and plugging it into a Poisson distribution, there's a Nobel Prize in it for you*.  There are even devices that generate random numbers this way for cryptography.
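To make that concrete: given a half-life, physics pins down the decay-time *distribution*, and all anyone can do is sample it. A minimal sketch (the 1-second half-life is a made-up isotope):

```python
# Two identically prepared nuclei, two different decay times: that gap is
# exactly the randomness the hardware RNGs mentioned above harvest.

import math
import random

def sample_decay_time(half_life_s: float) -> float:
    """Draw one decay time from the exponential (Poisson-process) law."""
    decay_const = math.log(2) / half_life_s  # lambda
    return random.expovariate(decay_const)

print(sample_decay_time(1.0))  # hypothetical isotope, 1 s half-life
print(sample_decay_time(1.0))  # same preparation, different answer
```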

Here's a link to an example of the difference between classical and quantum computing: https://en.wikipedia.org/wiki/Shor's_algorithm

Note there is another class of "quantum computers" out there made by D-Wave.  I have no idea if they have solved a problem faster than traditional machines, but it looks likely.

You might want to change the name of the thread if you want to discuss optical and/or analog computing.

* I'm aware that Einstein went to his grave convinced that quantum physics (at least without a hidden variable) was simply wrong. But even he couldn't find a way to explain the universe any better than quantum mechanics. Then quantum electrodynamics improved on that, and more quantum theory beyond that. And assuming that IBM managed to do exactly what they said they did in 2001 (by factoring 15 into 5 and 3), quantum physics appears to operate close enough to what quantum computer designers expect it to do (i.e. probabilistically and not deterministically).


11 hours ago, Arugela said:

Quantum computers are not probabilistic, or your understanding of that is wrong. It's predictable just like everything else.

Really, that single sentence tells me that this entire discussion is pointless. You don't seem to believe that anything could be probabilistic. Well, you're wrong. Everything is. Determinism is a mere illusion brought about by the limits of human senses, which don't work at the quantum level. It's a useful approximation sometimes, but once you get to small enough scales, it completely breaks down.

One cannot discuss quantum physics without first unshackling one's mind from determinism. Indeed, according to the quantum theory nothing can have all its properties determined exactly, nor can it even have any single of its properties determined exactly. And it derives from  surprisingly simple mathematics, of a kind that you can do on a blackboard (it has to, because quantum theory is actually quite old and predates widespread use of computers). You do need a primer in matrix algebra and calculus to fully grasp it, but a good lecturer can have you neck-deep in basic quantum mechanics before you even know it. It's really not so hard. Of course, some are mentally incapable of grasping it (as proven every year at exam time :) ). It's good to at least try, however...

Also, I wonder when a moderator catches on to some of the language employed in some of those walls of text of yours. Given their length, probably by the time KSP 1.9 comes out. :) 


That means it's a partial theory. And what you are saying is illogical and unscientific. The theory of probability doesn't actually say what you claim it does, nor does any serious scientific theory, or any theory used by the people making quantum computers or any other hardware.

The only way not to figure out its data is if you can't under the circumstances. What you are saying has nothing to do with real theories. All you are doing is mystifying theories and turning them into your personal fantasy, like a lot of very ignorant people today. And everything I said is true. And you are one of those ignorant people. Trust me, I've seen this develop many times in the past. We used to deal with this in forums like swatting flies, over and over again. Your generation just takes longer, like everything else with your generation. Slow as pudding, in every way. 8)

2 hours ago, wumpus said:

If you can predict when a radioactive nucleus will decay by some means superior to taking its half-life and plugging it into a Poisson distribution, there's a Nobel Prize in it for you*.  There are even devices that generate random numbers this way for cryptography.

Here's a link to an example of the difference between classical and quantum computing: https://en.wikipedia.org/wiki/Shor's_algorithm

Note there is another class of "quantum computers" out there made by D-Wave.  I have no idea if they have solved a problem faster than traditional machines, but it looks likely.

You might want to change the name of the thread if you want to discuss optical and/or analog computing.

* I'm aware that Einstein went to his grave convinced that quantum physics (at least without a hidden variable) was simply wrong. But even he couldn't find a way to explain the universe any better than quantum mechanics. Then quantum electrodynamics improved on that, and more quantum theory beyond that. And assuming that IBM managed to do exactly what they said they did in 2001 (by factoring 15 into 5 and 3), quantum physics appears to operate close enough to what quantum computer designers expect it to do (i.e. probabilistically and not deterministically).

Your ability to figure it out has nothing to do with whether it can be figured out. You don't have the slightest grasp on anything you are talking about. There is no such thing as probabilistic except having to repeat calculations for accuracy, which still has to be done in a way that ensures an outcome. And only those parts will be used for calculations. That is how actual computing works, and everything else. You and a lot of people need to get a serious grasp on reality. Computing only happens when you can predict something. That is how quantum computers work too.

1 hour ago, Dragon01 said:

Really, that single sentence tells me that this entire discussion is pointless. You don't seem to believe that anything could be probabilistic. Well, you're wrong. Everything is. Determinism is a mere illusion brought about by the limits of human senses, which don't work at the quantum level. It's a useful approximation sometimes, but once you get to small enough scales, it completely breaks down.

One cannot discuss quantum physics without first unshackling one's mind from determinism. Indeed, according to the quantum theory nothing can have all its properties determined exactly, nor can it even have any single of its properties determined exactly. And it derives from  surprisingly simple mathematics, of a kind that you can do on a blackboard (it has to, because quantum theory is actually quite old and predates widespread use of computers). You do need a primer in matrix algebra and calculus to fully grasp it, but a good lecturer can have you neck-deep in basic quantum mechanics before you even know it. It's really not so hard. Of course, some are mentally incapable of grasping it (as proven every year at exam time :) ). It's good to at least try, however...

Also, I wonder when a moderator catches on to some of the language employed in some of those walls of text of yours. Given their length, probably by the time KSP 1.9 comes out. :) 

Again, the ability to personally figure it out from the senses has nothing to do with whether it's actually possible to figure something out. We don't use senses to pick up things outside our senses; we make equipment to do so. You are confused about basically everything on a fundamental level. There is no probabilistic outside of the determinism you are talking about. Even if our physical biological senses can't pick something up, that has nothing at all to do with whether it's deterministic or not. That is literally a matter of cause and effect, which is outside of that... That is merely how our senses work, because that is how everything in the universe works by definition. You are literally saying to stop using reason and thought to figure something out. You do not understand what you are saying. This is why the root of 'deterministic' is 'determine'...


1 hour ago, Arugela said:

Again, the ability to personally figure it out from the senses has nothing to do with whether it's actually possible to figure something out. We don't use senses to pick up things outside our senses; we make equipment to do so. You are confused about basically everything on a fundamental level. There is no probabilistic outside of the determinism you are talking about. Even if our physical biological senses can't pick something up, that has nothing at all to do with whether it's deterministic or not. That is literally a matter of cause and effect, which is outside of that... That is merely how our senses work, because that is how everything in the universe works by definition. You are literally saying to stop using reason and thought to figure something out. You do not understand what you are saying. This is why the root of 'deterministic' is 'determine'...

Well, it might be useful for you to know you're talking to a (bio)physicist. Unless you have better credentials (I'm guessing you're an IT guy, or doing something related. That, or a student of philosophy), I don't think you get to tell me that I'm confused about anything. You're the one posting giant, rambling walls of text. Which tend to be completely wrong, too. From both philosophical and scientific standpoints.

Determinism is a much stronger concept than cause and effect. Specifically, it posits that for a given cause, the effect is always the same. So that if you know the state of a system at any point in time, and the rules, you can determine the exact state of the system at any other point in time. And this is wrong. Two quantum events can, despite being exactly identical (and being "identical" is easier to achieve in QM than you may think), give completely different results, according to an appropriate probability distribution. This has been measured and is an objective, experimental fact. And no, the uncertainty principles (there are several) are not just restrictions on our measurement; there's a whole slew of quantum effects in which this causes measurable effects.
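A toy version of that point: prepare the same qubit state over and over, measure it the same way, and the outcomes still scatter according to the Born rule. The amplitudes below are an arbitrary example state, 0.6|0> + 0.8|1>.

```python
# Identical preparations, identical measurements, different outcomes.

import random

P_ZERO = 0.36  # |alpha|^2 for the example state 0.6|0> + 0.8|1>

def measure() -> int:
    """One projective measurement of the identically prepared qubit."""
    return 0 if random.random() < P_ZERO else 1

outcomes = [measure() for _ in range(10_000)]
print(outcomes[:10])                  # e.g. [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]
print(sum(outcomes) / len(outcomes))  # ~0.64, as the Born rule predicts
```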

Yes, I'm arguing we stop using "reason and thought". The only way science has gotten anywhere is by using experiments and mathematics. Obviously, it involves a lot of thinking and reasoning, but this is not the thing scientific theories are built around, just the tools used in making them. Philosophers are the ones who make things up based purely on what makes sense to them. The last time those methods actually mattered was in Ancient Greece. There's only so far pure reasoning is capable of taking you, and we've gone past that somewhere around Isaac Newton's time. Sorry, you can't figure out how the world works just by thinking about it, because our minds are adapted for dealing with "normal" situations.


The fun part is that, if the one description is accurate and quantum computers are only millions of times faster, that puts some estimates of potential calculations for light-based stuff at around e15 in speed at 1920x1080 resolution. That is right between gigahertz and the supposed quantum computer speeds. And it's scalable. So I'll assume it's a matter of cost. I've always liked the idea of light-based computing. It's potentially endlessly scalable, which could be useful. Even though I'm sure not in my lifetime.

And I think scaling it to double the linear size (4x the pixels) at 3840x2160 puts it at roughly 2.7e17, approaching the e18 range quoted for quantum computer calculation speeds (see the sketch below). So that is doable with modern video card display sizes. That is a pretty nice potential setup. We are not far off. Some stuff was saying light-based CPUs in a decade. Not that that usually works out that conveniently; maybe in a decade, if I'm lucky, I will have a light-based system. Not to mention the combination stuff involving light-based parts on a current CPU. I think they meant in the die with the transistors and not just as the bus. That would be interesting.
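The 4K scaling, under the same one-rewrite-per-nanosecond assumption as the 1080p figures:

```python
bits_per_frame = 3840 * 2160 * 32   # about 2.65e8 bits per rewrite
per_second = bits_per_frame * 1e9   # one rewrite per nanosecond (assumed)
print(f"{per_second:.3e} bit/s")    # ~2.654e17, approaching the e18 range
```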

So yes, it can be supplemental, or potentially even a replacement. It's possible it could outdo quantum computers. They don't seem to be far off, especially as light-based is potentially infinitely scalable. Although I'm assuming quantum is even more power efficient; no idea how scalable it is. I missed that if it was in the stuff I read. I'll assume light-based systems and quantum will go together at some point, as the lack of lag will probably be helpful. Unless there is something better out there to speed up copper.

Also, I wasn't sure where in the nanosecond range microLED response time was supposed to be. That could throw it off a bit. And I think the one article said someone got a picosecond signal to the gate. I don't remember what it was; I think it was to get the signal. (Or before the signal could be redone? Is that the same with the fluorescent gate? Not sure if it means an electrical signal, or the time before the fluid goes back to a state where it can be read again, or the sensor.)

https://etd.ohiolink.edu/!etd.send_file?accession=miami1375882251&disposition=inline

https://www.chromedia.org/chromedia?waxtrapp=mkqjtbEsHiemBpdmBlIEcCArB&subNav=cczbdbEsHiemBpdmBlIEcCArBP

https://www.ncbi.nlm.nih.gov/pubmed/18699476

 

